Sample records for handling large web

  1. AdaFF: Adaptive Failure-Handling Framework for Composite Web Services

    NASA Astrophysics Data System (ADS)

    Kim, Yuna; Lee, Wan Yeon; Kim, Kyong Hoon; Kim, Jong

    In this paper, we propose a novel Web service composition framework which dynamically accommodates various failure-recovery requirements. In the proposed framework, called the Adaptive Failure-handling Framework (AdaFF), failure-handling submodules are prepared during the design of a composite service, and some of them are systematically selected and automatically combined with the composite Web service at service instantiation, in accordance with the requirements of individual users. In contrast, existing frameworks cannot adapt their failure-handling behaviors to users' requirements. AdaFF rapidly delivers a composite service supporting requirement-matched failure handling without manual development, and contributes to flexible composite Web service design in that service architects need not concern themselves with failure handling or the varying requirements of users. As a proof of concept, we implement a prototype system of AdaFF, which automatically generates a composite service instance in Web Services Business Process Execution Language (WS-BPEL) according to the user's requirements specified in XML format and executes the generated instance on the ActiveBPEL engine.
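
    The selection-and-combination step can be pictured with a short sketch. The following is a minimal illustration (not AdaFF's actual code) of choosing prepared failure-handling submodules from a user requirement written in XML; the element names and BPEL fragments are hypothetical.

    ```python
    # Minimal sketch of AdaFF-style submodule selection (hypothetical names).
    import xml.etree.ElementTree as ET

    # Hypothetical library of prepared failure-handling submodules,
    # keyed by recovery policy.
    SUBMODULES = {
        "retry":      "<bpel:scope name='retryHandler'>...</bpel:scope>",
        "compensate": "<bpel:scope name='compensationHandler'>...</bpel:scope>",
        "substitute": "<bpel:scope name='substituteService'>...</bpel:scope>",
    }

    def select_submodules(requirement_xml: str) -> list[str]:
        """Pick the prepared submodules that match the user's XML requirement."""
        root = ET.fromstring(requirement_xml)
        policies = [e.text.strip() for e in root.findall(".//recoveryPolicy")]
        return [SUBMODULES[p] for p in policies if p in SUBMODULES]

    req = """<failureHandling>
               <recoveryPolicy>retry</recoveryPolicy>
               <recoveryPolicy>compensate</recoveryPolicy>
             </failureHandling>"""
    for fragment in select_submodules(req):
        print(fragment)  # fragments would be merged into the WS-BPEL process
    ```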

  2. Web-based visualization of very large scientific astronomy imagery

    NASA Astrophysics Data System (ADS)

    Bertin, E.; Pillay, R.; Marmo, C.

    2015-04-01

    Visualizing and navigating through large astronomy images from a remote location with current astronomy display tools can be a frustrating experience in terms of speed and ergonomics, especially on mobile devices. In this paper, we present a high-performance, versatile and robust client-server system for remote visualization and analysis of extremely large scientific images. Applications of this work include survey image quality control, interactive data query and exploration, citizen science, as well as public outreach. The proposed software is entirely open source and is designed to be generic and applicable to a variety of datasets. It provides access to floating-point data at terabyte scales, with the ability to precisely adjust image settings in real time. The proposed clients are lightweight, platform-independent web applications built on standard HTML5 web technologies and compatible with both touch and mouse-based devices. We assess the system's performance and show that a single server can comfortably handle more than a hundred simultaneous users accessing full-precision 32-bit astronomy data.
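
    The tile-based access pattern at the heart of such a client-server design can be sketched briefly. The example below assumes an IIPImage-style tile endpoint (the kind of server technology such systems build on); the host and image names are made up.

    ```python
    # A minimal sketch of tile-by-tile access: the client only ever downloads
    # the small region currently on screen, never the full terabyte image.
    import requests

    SERVER = "https://example.org/iipsrv.fcgi"   # hypothetical host
    IMAGE = "survey_field_0042.tif"              # hypothetical image

    def fetch_tile(zoom: int, tile_index: int) -> bytes:
        """Request one tile; JTL=<zoom>,<index> is the IIP tile command."""
        params = {"FIF": IMAGE, "JTL": f"{zoom},{tile_index}"}
        r = requests.get(SERVER, params=params, timeout=10)
        r.raise_for_status()
        return r.content  # JPEG-encoded tile bytes

    tile = fetch_tile(zoom=4, tile_index=0)
    print(f"received {len(tile)} bytes")
    ```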

  3. Electronic sensor and actuator webs for large-area complex geometry cardiac mapping and therapy

    PubMed Central

    Kim, Dae-Hyeong; Ghaffari, Roozbeh; Lu, Nanshu; Wang, Shuodao; Lee, Stephen P.; Keum, Hohyun; D’Angelo, Robert; Klinker, Lauren; Su, Yewang; Lu, Chaofeng; Kim, Yun-Soung; Ameen, Abid; Li, Yuhang; Zhang, Yihui; de Graff, Bassel; Hsu, Yung-Yu; Liu, ZhuangJian; Ruskin, Jeremy; Xu, Lizhi; Lu, Chi; Omenetto, Fiorenzo G.; Huang, Yonggang; Mansour, Moussa; Slepian, Marvin J.; Rogers, John A.

    2012-01-01

    Curved surfaces, complex geometries, and time-dynamic deformations of the heart create challenges in establishing intimate, nonconstraining interfaces between cardiac structures and medical devices or surgical tools, particularly over large areas. We constructed large-area designs for diagnostic and therapeutic stretchable sensor and actuator webs that conformally wrap the epicardium, establishing robust contact without sutures, mechanical fixtures, tapes, or surgical adhesives. These multifunctional web devices exploit open, mesh layouts and mount on thin, bio-resorbable sheets of silk to facilitate handling in a way that yields, after dissolution, exceptionally low mechanical moduli and thicknesses. In vivo studies in rabbit and pig animal models demonstrate the effectiveness of these device webs for measuring and spatially mapping temperature, electrophysiological signals, strain, and physical contact in sheet and balloon-based systems that also have the potential to deliver energy to perform localized tissue ablation. PMID:23150574

  4. Large orb-webs adapted to maximise total biomass not rare, large prey

    PubMed Central

    Harmer, Aaron M. T.; Clausen, Philip D.; Wroe, Stephen; Madin, Joshua S.

    2015-01-01

    Spider orb-webs are the ultimate anti-ballistic devices, capable of dissipating the relatively massive kinetic energy of flying prey. Increased web size and prey stopping capacity have co-evolved in a number of orb-web taxa, but the selective forces driving web size and performance increases are under debate. The rare, large prey hypothesis maintains that the energetic benefits of rare, very large prey are so much greater than the gains from smaller, more common prey that smaller prey are irrelevant for reproduction. Here, we integrate biophysical and ecological data and models to test a major prediction of the rare, large prey hypothesis: that selection should favour webs with increased stopping capacity and that large prey should comprise a significant proportion of prey stopped by a web. We find that larger webs indeed have a greater capacity to stop large prey. However, based on prey ecology, we also find that these large prey make up a tiny fraction of the total biomass (=energy) potentially captured. We conclude that large webs are adapted to stop more total biomass, and that the capacity to stop rare, but very large, prey is an incidental consequence of the longer radial silks that scale with web size. PMID:26374379

  5. A Query Language for Handling Big Observation Data Sets in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Autermann, Christian; Stasch, Christoph; Jirka, Simon; Koppe, Roland

    2017-04-01

    The Sensor Web provides a framework for the standardized Web-based sharing of environmental observations and sensor metadata. While the issue of varying data formats and protocols is addressed by these standards, the fast-growing size of observational data is imposing new challenges for their application. Most solutions for handling big observational datasets currently focus on remote sensing applications, while big in-situ datasets relying on vector features still lack a solid approach. Conventional Sensor Web technologies may not be adequate, as the sheer size of the data transmitted and the amount of metadata accumulated may render traditional OGC Sensor Observation Services (SOS) unusable. Besides novel approaches to storing and processing observation data in place, e.g. by harnessing big data technologies from mainstream IT, the access layer has to be amended to utilize and integrate these large observational data archives into applications and to enable analysis. To this end, an extension to the SOS will be discussed that establishes a query language to dynamically process and filter observations at the storage level, similar to the OGC Web Coverage Service (WCS) and its Web Coverage Processing Service (WCPS) extension. This will enable applications to request, for example, spatially or temporally aggregated data sets at the resolution they can display or require. The approach will be developed and implemented in cooperation with the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, whose data catalogue comprises marine observations of physical, chemical and biological phenomena from a wide variety of sensors, both mobile (research vessels, aircraft, underwater vehicles) and stationary (buoys, research stations). Observations are made at high temporal resolution, and the resulting time series may span multiple decades.
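
    To make the idea concrete, the sketch below shows what such a storage-level aggregation request might look like: standard SOS 2.0 GetObservation key-value parameters plus hypothetical aggregation parameters representing the proposed query-language extension. The endpoint and the 'aggregation'/'interval' parameters are assumptions; the rest follows SOS 2.0 KVP conventions.

    ```python
    # A hedged sketch of a GetObservation request extended with a
    # hypothetical aggregation clause evaluated at storage level.
    import requests

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "observedProperty": "sea_water_temperature",
        "temporalFilter": "om:phenomenonTime,2010-01-01/2015-12-31",
        # hypothetical query-language additions:
        "aggregation": "mean",
        "interval": "P1D",   # daily means instead of raw high-resolution samples
    }
    response = requests.get("https://example.org/sos", params=params, timeout=30)
    print(response.status_code)
    ```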

  6. Handling Qualities of Large Flexible Aircraft. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Poopaka, S.

    1980-01-01

    The effects on handling qualities of elastic-mode interaction with the rigid-body dynamics of a large flexible aircraft are studied by mathematical computer simulation. An analytical method is developed to predict pilot ratings when there is severe mode interaction. This is done by extending the optimal control model of the human pilot response to include a mode decomposition mechanism. The handling qualities are determined for a longitudinal tracking task using a large flexible aircraft, with parametric variations in the undamped natural frequencies of the two lowest-frequency symmetric elastic modes made to induce varying amounts of mode interaction.

  7. Proposal for a Web Encoding Service (WES) for Spatial Data Transactions

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    Web service utilization in Spatial Data Infrastructures (SDI) has been well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial data volumes for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS), as sketched below. The integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.
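
    Since WES is a proposal, any concrete interface is speculative. The sketch below merely illustrates the intended data flow, with a client asking a hypothetical WES endpoint to re-encode a CityGML dataset into a web-friendly 3D format; the operation and parameter names are assumptions, not part of any published specification.

    ```python
    # A purely illustrative sketch of the proposed WES data flow: post a
    # dataset reference plus a target encoding, receive the encoded document
    # (possibly after server-side chaining through WPS or W3DS).
    import requests

    payload = {
        "request": "Encode",             # hypothetical WES operation
        "inputFormat": "CityGML",
        "outputFormat": "glTF",          # e.g. a web-friendly 3D encoding
        "datasetId": "city_block_17",    # hypothetical dataset reference
    }
    r = requests.post("https://example.org/wes", json=payload, timeout=60)
    print(r.status_code)
    ```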

  8. Food-web dynamics in a large river discontinuum

    USGS Publications Warehouse

    Cross, Wyatt F.; Baxter, Colden V.; Rosi-Marshall, Emma J.; Hall, Robert O.; Kennedy, Theodore A.; Donner, Kevin C.; Kelly, Holly A. Wellard; Seegert, Sarah E.Z.; Behn, Kathrine E.; Yard, Michael D.

    2013-01-01

    Nearly all ecosystems have been altered by human activities, and most communities are now composed of interacting species that have not co-evolved. These changes may modify species interactions, energy and material flows, and food-web stability. Although structural changes to ecosystems have been widely reported, few studies have linked such changes to dynamic food-web attributes and patterns of energy flow. Moreover, there have been few tests of food-web stability theory in highly disturbed and intensely managed freshwater ecosystems. Such synthetic approaches are needed for predicting the future trajectory of ecosystems, including how they may respond to natural or anthropogenic perturbations. We constructed flow food webs at six locations along a 386-km segment of the Colorado River in Grand Canyon (Arizona, USA) for three years. We characterized food-web structure and production, trophic basis of production, energy efficiencies, and interaction-strength distributions across a spatial gradient of perturbation (i.e., distance from Glen Canyon Dam), as well as before and after an experimental flood. We found pronounced longitudinal patterns in food-web characteristics that correlated strongly with the spatial position of large tributaries. Above tributaries, food webs were dominated by nonnative New Zealand mudsnails (62% of production) and nonnative rainbow trout (100% of fish production). The simple structure of these food webs led to few dominant energy pathways (diatoms to few invertebrate taxa to rainbow trout) and large energy inefficiencies (i.e., …). Below large tributaries, invertebrate production declined ∼18-fold, while fish production remained similar to upstream sites and comprised predominantly native taxa (80–100% of production). Sites below large tributaries had increasingly reticulate and detritus-based food webs with a higher prevalence of omnivory, as well as interaction-strength distributions more typical of theoretically stable food webs…

  9. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily K.; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  10. WebCIS: large scale deployment of a Web-based clinical information system.

    PubMed

    Hripcsak, G; Cimino, J J; Sengupta, S

    1999-01-01

    WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax-based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application timeouts. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users who have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system.

  11. Handling Qualities of Large Rotorcraft in Hover and Low Speed

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos; Theodore, Colin R.; Lawrence, Ben; Blanken, Chris L.

    2015-01-01

    According to a number of system studies, large capacity advanced rotorcraft with a capability of high cruise speeds (approx. 350 mph) as well as vertical and/or short take-off and landing (V/STOL) flight could alleviate anticipated air transportation capacity issues by making use of non-primary runways, taxiways, and aprons. These advanced aircraft pose a number of design challenges, as well as unknown issues in the flight control and handling qualities domains. A series of piloted simulation experiments have been conducted on the NASA Ames Research Center Vertical Motion Simulator (VMS) in recent years to systematically investigate the fundamental flight control and handling qualities issues associated with the characteristics of large rotorcraft, including tiltrotors, in hover and low-speed maneuvering.

  12. WebViz: A Web-Based Collaborative Interactive Visualization System for Large-Scale Data Sets

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota’s Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily in the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to market, including the use of general-purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data ‘on the fly’, wherever they may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE’s custom hierarchical volume rendering software provides high-resolution visualizations on the order of 15 million pixels and has been employed for visualizing data from simulations ranging from astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. Features in the current version include the ability for users to (1) securely log in, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface.

  13. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using WebGL and the Canvas2D API. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets, and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https…

  14. An interactive web-based system using cloud for large-scale visual analytics

    NASA Astrophysics Data System (ADS)

    Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

    Network cameras have been growing rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and its Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze (a sketch of this workflow follows below). The system handles the heterogeneity of the geographically distributed cameras, e.g. different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
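
    The following sketch illustrates the single-frame starting point the paper describes. The analysis function is an ordinary OpenCV example; the submission client in the trailing comments is hypothetical, not the system's real API.

    ```python
    # The user's existing per-frame program, unchanged: here, simple edge
    # detection as a crude measure of scene detail.
    import cv2
    import numpy as np

    def analyze_frame(frame):
        """Analyze one frame; the platform would call this per camera frame."""
        edges = cv2.Canny(frame, threshold1=100, threshold2=200)
        return float(edges.mean())

    # Stand-in frame; in the real system the platform fetches frames from the
    # selected cameras and runs the analysis on cloud resources it manages.
    frame = (np.random.rand(240, 320) * 255).astype(np.uint8)
    print(analyze_frame(frame))

    # Hypothetical submission step (names are not the real API):
    # client = CameraAnalysisClient(api_key="...")
    # client.submit(analysis=analyze_frame, cameras=["cam-00123", "cam-04567"])
    ```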

  15. An Investigation of Large Tilt-Rotor Hover and Low Speed Handling Qualities

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos A.; Decker, William A.; Theodore, Colin R.; Lindsey, James E.; Lawrence, Ben; Blanken, Chris L.

    2011-01-01

    A piloted simulation experiment conducted on the NASA-Ames Vertical Motion Simulator evaluated the hover and low speed handling qualities of a large tilt-rotor concept, with particular emphasis on longitudinal and lateral position control. Ten experimental test pilots evaluated different combinations of Attitude Command-Attitude Hold (ACAH) and Translational Rate Command (TRC) response types, nacelle conversion actuator authority limits and inceptor choices. Pilots performed evaluations in revised versions of the ADS-33 Hover, Lateral Reposition and Depart/Abort MTEs under moderate turbulence conditions. Level 2 handling qualities ratings were primarily recorded using the ACAH response type in all three of the evaluation maneuvers. The baseline TRC conferred Level 1 handling qualities in the Hover MTE, but there was a tendency to enter a pilot-induced oscillation (PIO), associated with nacelle actuator rate limiting, when employing large, aggressive control inputs. Interestingly, increasing the rate limits also led to a reduction in the handling qualities ratings. This led to the identification of a coupling effect between nacelle rate and rotor longitudinal flapping that induced undesired pitching motions proportional to the allowable amount of nacelle rate. A modification that counteracted this effect significantly improved the handling qualities. Evaluation of the different response type variants showed that inclusion of the TRC response could provide Level 1 handling qualities in the Lateral Reposition maneuver by reducing the coupled pitch and heave off-axis responses that otherwise manifest with ACAH. Finally, evaluations in the Depart/Abort maneuver showed that uncertainty about commanded nacelle position and the ensuing aircraft response, when manually controlling the nacelle, demanded high levels of attention from the pilot. An additional requirement to maintain pitch attitude within 5 deg compounded the workload.

  16. Interactive effects of fire and large herbivores on web-building spiders.

    PubMed

    Foster, C N; Barton, P S; Wood, J T; Lindenmayer, D B

    2015-09-01

    Altered disturbance regimes are a major driver of biodiversity loss worldwide. Maintaining or re-creating natural disturbance regimes is therefore the focus of many conservation programmes. A key challenge, however, is to understand how co-occurring disturbances interact to affect biodiversity. We experimentally tested for the interactive effects of prescribed fire and large macropod herbivores on the web-building spider assemblage of a eucalypt forest understorey and investigated the role of vegetation in mediating these effects using path analysis. Fire had strong negative effects on the density of web-building spiders, which were partly mediated by effects on vegetation structure, while negative effects of large herbivores on web density were not related to changes in vegetation. Fire amplified the effects of large herbivores on spiders, both via vegetation-mediated pathways and by increasing herbivore activity. The importance of vegetation-mediated pathways and fire-herbivore interactions differed for web density and richness and also differed between web types. Our results demonstrate that for some groups of web-building spiders, the effects of co-occurring disturbance drivers may be mostly additive, whereas for other groups, interactions between drivers can amplify disturbance effects. In our study system, the use of prescribed fire in the presence of high densities of herbivores could lead to reduced densities and altered composition of web-building spiders, with potential cascading effects through the arthropod food web. Our study highlights the importance of considering both the independent and interactive effects of disturbances, as well as the mechanisms driving their effects, in the management of disturbance regimes.

  17. Large-Scale Multiobjective Static Test Generation for Web-Based Testing with Integer Programming

    ERIC Educational Resources Information Center

    Nguyen, M. L.; Hui, Siu Cheung; Fong, A. C. M.

    2013-01-01

    Web-based testing has become a ubiquitous self-assessment method for online learning. One useful feature that is missing from today's web-based testing systems is the reliable capability to fulfill different assessment requirements of students based on a large-scale question data set. A promising approach for supporting large-scale web-based…

  18. Storage and Retrieval of Large RDF Graph Using Hadoop and MapReduce

    NASA Astrophysics Data System (ADS)

    Farhan Husain, Mohammad; Doshi, Pankil; Khan, Latifur; Thuraisingham, Bhavani

    Handling huge amounts of data scalably has long been a matter of concern, and the same is true for semantic web data; current semantic web frameworks lack this ability. In this paper, we describe a framework that we built using Hadoop to store and retrieve large numbers of RDF triples. We describe our schema for storing RDF data in the Hadoop Distributed File System, and we present our algorithms to answer a SPARQL query, using Hadoop's MapReduce framework to actually answer the queries (a simplified sketch follows below). Our results reveal that we can store huge amounts of semantic web data in Hadoop clusters built mostly from cheap commodity-class hardware and still answer queries quickly enough. We conclude that ours is a scalable framework, able to handle large amounts of RDF data efficiently.
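
    A simplified sketch of the MapReduce query-answering idea (not the authors' actual schema or algorithms): a Hadoop-streaming-style mapper scans N-Triples lines for one SPARQL triple pattern, and a reducer deduplicates the bindings. The predicate and class URIs are invented.

    ```python
    # Hadoop-streaming-style job for the pattern: ?s <type> <Professor>
    # Run the same script as mapper ("map") and reducer ("reduce").
    import sys

    PREDICATE = "<http://example.org/type>"       # hypothetical URIs
    OBJECT = "<http://example.org/Professor>"

    def mapper(lines):
        for line in lines:
            parts = line.strip().rstrip(" .").split(None, 2)
            if len(parts) == 3:
                s, p, o = parts
                if p == PREDICATE and o == OBJECT:
                    print(f"{s}\t1")              # emit matching subject

    def reducer(lines):
        seen = set()
        for line in lines:
            subject = line.split("\t")[0]
            if subject not in seen:               # deduplicate bindings
                seen.add(subject)
                print(subject)

    if __name__ == "__main__":
        role = sys.argv[1] if len(sys.argv) > 1 else "map"
        (mapper if role == "map" else reducer)(sys.stdin)
    ```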

  19. Web tools for large-scale 3D biological images and atlases

    PubMed Central

    2012-01-01

    Background Large-scale volumetric biomedical image data of three or more dimensions are a significant challenge for distributed browsing and visualisation. Many images now exceed 10GB, which for most users is too large to handle in terms of computer RAM and network bandwidth. This is aggravated when users need to access tens or hundreds of such images from an archive. Here we solve the problem for 2D section views through archive data, delivering compressed tiled images that enable users to browse very large volume data within a standard web browser. The system provides an interactive visualisation for grey-level and colour 3D images, including multiple image layers and spatial-data overlay. Results The standard Internet Imaging Protocol (IIP) has been extended to enable arbitrary 2D sectioning of 3D data as well as multi-layered images and indexed overlays. The extended protocol is termed IIP3D, and we have implemented a matching server to deliver the protocol and a series of Ajax/JavaScript client codes that will run in an Internet browser. We have tested the server software on a low-cost Linux-based server for image volumes up to 135GB and 64 simultaneous users. The section views are delivered with response times independent of scale and orientation. The exemplar client provided multi-layer image views with user-controlled colour-filtering and overlays. Conclusions Interactive browsing of arbitrary sections through large biomedical-image volumes is made possible by use of an extended internet protocol and efficient server-based image tiling. The tools open the possibility of enabling fast access to large image archives without the requirement of whole-image download and client computers with very large memory configurations. The system was demonstrated using a range of medical and biomedical image data extending up to 135GB for a single image volume. PMID:22676296
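
    A request for an arbitrary section through a volume might look as follows. FIF, WID and CVT are standard IIP parameters; the section-plane parameters and the endpoint are assumptions modeled on the paper's description of the IIP3D extension, not a verified part of the protocol.

    ```python
    # A hedged sketch of fetching one 2D section through a 3D volume from an
    # IIP3D-style server; only the small rendered section crosses the network.
    import requests

    params = {
        "FIF": "embryo_volume.wlz",  # hypothetical 3D image object
        "PIT": 30,                   # section-plane pitch in degrees (assumed name)
        "YAW": 45,                   # section-plane yaw in degrees (assumed name)
        "DST": 120,                  # distance along the plane normal (assumed name)
        "WID": 512,                  # width of the returned view in pixels
        "CVT": "jpeg",               # convert the section to JPEG for the browser
    }
    r = requests.get("https://example.org/iip3dsrv", params=params, timeout=30)
    print(r.status_code, len(r.content))
    ```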

  20. Differential seed handling by two African primates affects seed fate and establishment of large-seeded trees

    NASA Astrophysics Data System (ADS)

    Gross-Camp, Nicole D.; Kaplin, Beth A.

    2011-11-01

    We examined the influence of seed handling by two semi-terrestrial African forest primates, chimpanzees (Pan troglodytes) and l'Hoest's monkeys (Cercopithecus lhoesti), on the fate of large-seeded tree species in an afromontane forest. Chimpanzees and l'Hoest's monkeys dispersed eleven seed species over one year, with the quantity and quality of dispersal varying through time. The primates differed in their seed-handling behaviors, with chimpanzees defecating large seeds (>0.5 cm) significantly more often than l'Hoest's monkeys. Furthermore, they exhibited different oral-processing techniques, with chimpanzees discarding wadges containing many seeds and l'Hoest's monkeys spitting single seeds. A principal components analysis (PCA) examined the relationship between microhabitat characteristics and the sites where primates deposited seeds. The first two components explained almost half of the observed variation. Microhabitat characteristics associated with sites where seeds were defecated had little overlap with those describing where spit seeds arrived, suggesting that seed handling in part determines where seeds are deposited. We monitored a total of 552 seed depositions through time, recording seed persistence, germination, and establishment. Defecated seeds were deposited significantly farther from an adult conspecific than orally-discarded seeds, where they experienced the greatest persistence but poorest establishment. In contrast, spit seeds were deposited closest to an adult conspecific but experienced the highest seed establishment rates. We used experimental plots to examine the relationship between seed handling, deposition site, and seed fate. We found a significant difference in seed handling and fate, with undispersed seeds in whole fruits experiencing the lowest establishment rates. Seed germination differed by habitat type, with open forest experiencing the highest rates of germination. Our results highlight the relationship between primate seed handling and deposition site and seed…

  1. Crawling Robots on Large Web in Rocket Experiment on Furoshiki Deployment

    NASA Astrophysics Data System (ADS)

    Kaya, N.; Iwashita, M.; Nakasuka, S.; Summerer, L.; Mankins, J.

    Developing the technology to construct huge transmitting antennas in space, such as for the Solar Power Satellite, is one of the most important and critical issues. Huge antennas have many useful applications in space, for example telecommunication antennas for cellular phones, radars for remote sensing, and navigation and observation. We propose to apply the Furoshiki satellite with robots to construct such huge structures. After a large web is deployed by the Furoshiki satellite at the size of the huge antenna, the antenna elements crawl on the web with their own legs toward their allocated locations in order to realize a huge antenna. A micro-gravity experiment using an ISAS sounding rocket is planned in order to demonstrate the feasibility of deploying the large web and the phased-array performance. Three daughter satellites are separated from the mother satellite with weak springs, and the daughter satellites deploy the Furoshiki web into a triangular shape about 20-40 m in size. The dynamics of the daughter satellites and the web are observed by several cameras installed on the mother and daughter satellites during the deployment, while the performance of the phased-array antenna using the retrodirective method is simultaneously measured at the ground station. Finally, two micro robots crawl from the mother satellite to certain points on the web to demonstrate one promising way to construct RF transmitter panels. The robots are being developed internationally by NASA, ESTEC, and Kobe University, and there are various ideas for how robots might crawl on the web in micro-gravity; each organization is independently developing a different type of robot. Kobe University is developing wheels that run on the web by pinching its strings; these run successfully, though they have been found to tangle the strings.

  2. NGL Viewer: Web-based molecular graphics for large complexes.

    PubMed

    Rose, Alexander S; Bradley, Anthony R; Valasatava, Yana; Duarte, Jose M; Prlic, Andreas; Rose, Peter W

    2018-05-29

    The interactive visualization of very large macromolecular complexes on the web is becoming a challenging problem as experimental techniques advance at an unprecedented rate and deliver structures of increasing size. We have tackled this problem by developing highly memory-efficient and scalable extensions for the NGL WebGL-based molecular viewer and by using MMTF, a binary and compressed Macromolecular Transmission Format. These enable NGL to download and render molecular complexes with millions of atoms interactively on desktop computers and smartphones alike, making it a tool of choice for web-based molecular visualization in research and education. The source code is freely available under the MIT license at github.com/arose/ngl and distributed on NPM (npmjs.com/package/ngl). MMTF-JavaScript encoders and decoders are available at github.com/rcsb/mmtf-javascript. Contact: asr.moin@gmail.com.

  3. An Investigation of Large Aircraft Handling Qualities

    NASA Astrophysics Data System (ADS)

    Joyce, Richard D.

    An analytical technique for investigating transport aircraft handling qualities is exercised in a study using models of two such vehicles, a Boeing 747 and a Lockheed C-5A. Two flight conditions are employed for climb and directional tasks, and a third is included for a flare task. The analysis technique is based upon a "structural model" of the human pilot developed by Hess. The associated analysis procedure has been discussed previously in the literature, but centered almost exclusively on the characteristics of high-performance fighter aircraft. The handling qualities rating level (HQRL) and pilot-induced oscillation tendencies rating level (PIORL) are predicted for nominal configurations of the aircraft and for "damaged" configurations where actuator rate limits are introduced as nonlinearities. It is demonstrated that the analysis can accommodate nonlinear pilot/vehicle behavior and do so in the context of specific flight tasks, yielding estimates of handling qualities, pilot-induced oscillation tendencies and upper limits of task performance. A brief human-in-the-loop tracking study was performed to provide a limited validation of the pilot model employed.

  4. Handling qualities of large flexible control-configured aircraft

    NASA Technical Reports Server (NTRS)

    Swaim, R. L.

    1980-01-01

    The effects on handling qualities of low-frequency symmetric elastic-mode interaction with the rigid-body dynamics of a large flexible aircraft were analyzed by use of a mathematical pilot-modeling computer simulation. An extension of the optimal control model for a human pilot was made so that the mode-interaction effects on the pilot's control task could be assessed. Pilot ratings were determined for a longitudinal tracking task with parametric variations in the undamped natural frequencies of the two lowest-frequency symmetric elastic modes made to induce varying amounts of mode interaction. Relating numerical performance index values associated with the frequency variations used in several dynamic cases to a numerical Cooper-Harper pilot rating has proved successful in discriminating when the mathematical pilot can or cannot separate rigid from elastic response in the tracking task.

  5. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation.

    PubMed

    Lee, Jae H; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T; Seo, Youngho

    2014-11-01

    The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum-likelihood expectation-maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge of computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal of eventually making it usable in a clinical setting.
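
    The MLEM update itself is compact, which is what makes it a natural candidate for automated parallelization. Below is a generic, standalone NumPy sketch of the iteration on toy data; the paper's contribution, running this kind of computation on Spark/GraphX, is not shown here.

    ```python
    # Generic MLEM iteration: x <- x * A^T(y / Ax) / (A^T 1), elementwise.
    import numpy as np

    rng = np.random.default_rng(0)
    n_pixels, n_bins = 64, 96
    A = rng.random((n_bins, n_pixels))          # toy system matrix
    x_true = rng.random(n_pixels)
    y = rng.poisson(A @ x_true * 50)            # noisy projection data

    x = np.ones(n_pixels)                       # MLEM needs a positive start
    sensitivity = A.T @ np.ones(n_bins)         # A^T 1, the normalization term
    for _ in range(50):
        expected = A @ x                        # forward projection
        ratio = y / np.maximum(expected, 1e-12) # compare data with prediction
        x *= (A.T @ ratio) / sensitivity        # multiplicative MLEM update

    print(float(np.corrcoef(x, x_true)[0, 1]))  # rough sanity check
    ```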

  6. Handling Qualities of a Large Civil Tiltrotor in Hover using Translational Rate Command

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos A.; Theodore, Colin R.; Lawrence, Ben; Lindsey, James; Blanken, Chris

    2012-01-01

    A Translational Rate Command (TRC) control law has been developed to enable low speed maneuvering of a large civil tiltrotor with minimal pitch changes by means of automatic nacelle angle deflections for longitudinal velocity control. The nacelle actuator bandwidth required to achieve Level 1 handling qualities in hover and the feasibility of additional longitudinal cyclic control to augment low bandwidth nacelle actuation were investigated. A frequency-domain handling qualities criterion characterizing TRC response in terms of bandwidth and phase delay was proposed and validated against a piloted simulation conducted on the NASA-Ames Vertical Motion Simulator. Seven experimental test pilots completed evaluations in the ADS-33E-PRF Hover Mission Task Element (MTE) for a matrix of nacelle actuator bandwidths, equivalent rise times and control response sensitivities, and longitudinal cyclic control allocations. Evaluated against this task, longitudinal phase delay shows the Level 1 boundary is around 0.4-0.5 s. Accordingly, Level 1 handling qualities were achieved either with a nacelle actuator bandwidth greater than 4 rad/s, or by employing longitudinal cyclic control to augment low bandwidth nacelle actuation.

  7. Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.

    2012-12-01

    Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.

  8. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation

    PubMed Central

    Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho

    2014-01-01

    The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum-likelihood expectation-maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge of computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal of eventually making it usable in a clinical setting. PMID:27081299

  9. Safe Patient Handling and Mobility: Development and Implementation of a Large-Scale Education Program.

    PubMed

    Lee, Corinne; Knight, Suzanne W; Smith, Sharon L; Nagle, Dorothy J; DeVries, Lori

    This article addresses the development, implementation, and evaluation of an education program for safe patient handling and mobility at a large academic medical center. The ultimate goal of the program was to increase safety during patient mobility/transfer and reduce nursing staff injury from lifting/pulling. This comprehensive program was designed on the basis of the principles of prework, application, and support at the point of care. A combination of online learning, demonstration, skill evaluation, and coaching at the point of care was used to achieve the goal. Specific roles and responsibilities were developed to facilitate implementation. It took 17 master trainers, 88 certified trainers, 176 unit-based trainers, and 98 coaches to put 3706 nurses and nursing assistants through the program. Evaluations indicated both an increase in knowledge about safe patient handling and an increased ability to safely mobilize patients. The challenge now is sustainability of safe patient-handling practices and the growth and development of trainers and coaches.

  10. The Effectiveness of Lecture-Integrated, Web-Supported Case Studies in Large Group Teaching

    ERIC Educational Resources Information Center

    Azzawi, May; Dawson, Maureen M.

    2007-01-01

    The effectiveness of lecture-integrated and web-supported case studies in supporting a large and academically diverse group of undergraduate students was evaluated in the present study. Case studies and resource (web)-based learning were incorporated as two complementary interactive learning strategies into the traditional curriculum. A truncated…

  11. Ergonomic material-handling device

    DOEpatents

    Barsnick, Lance E.; Zalk, David M.; Perry, Catherine M.; Biggs, Terry; Tageson, Robert E.

    2004-08-24

    A hand-held ergonomic material-handling device capable of moving heavy objects, such as large waste containers and other large objects requiring mechanical assistance. The ergonomic material-handling device can be used with neutral postures of the back, shoulders, wrists and knees, thereby reducing potential injury to the user. The device involves two key features: 1) it gives the user the ability to adjust the height of the handles to ergonomically fit the needs of the user's back, wrists and shoulders; and 2) its rounded handlebar shape, together with the size and configuration of the handles, keeps the user's wrists in a neutral posture during manipulation of the device.

  12. An Investigation of Large Tilt-Rotor Short-Term Attitude Response Handling Qualities Requirements in Hover

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos; Decker, William A.; Theodore, Colin R.; Blanken, Christopher L.; Berger, Tom

    2010-01-01

    A piloted simulation investigation was conducted using the NASA Ames Vertical Motion Simulator to study the impact of pitch, roll and yaw attitude bandwidth and phase delay on handling qualities of large tilt-rotor aircraft. Multiple bandwidth and phase delay pairs were investigated for each axis. The simulation also investigated the effect that the pilot offset from the center of gravity has on handling qualities. While pilot offset does not change the dynamics of the vehicle, it does affect the proprioceptive and visual cues and it can have an impact on handling qualities. The experiment concentrated on two primary evaluation tasks: a precision hover task and a simple hover pedal turn. Six pilots flew over 1400 data runs with evaluation comments and objective performance data recorded. The paper will describe the experiment design and methodology, discuss the results of the experiment and summarize the findings.

  13. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable visitors to volunteer their computing resources toward running advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment (a minimal server-side sketch follows below).
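
    The sketch below shows the server side of such a platform under a simple HTTP polling design: volunteers' browsers fetch work units and post back partial results. The real system uses client-side JavaScript for the computation and a relational database for queue management; the Flask endpoints and in-memory queue here are illustrative stand-ins.

    ```python
    # A minimal task-distribution server, not the authors' implementation.
    from queue import Queue
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    tasks: Queue = Queue()
    results = {}

    # Pre-load toy work units, e.g. one sub-basin of a hydrologic model each.
    for i in range(100):
        tasks.put({"task_id": i, "cells": list(range(i * 10, i * 10 + 10))})

    @app.route("/task")
    def get_task():
        """A volunteer's browser polls this endpoint for its next work unit."""
        if tasks.empty():
            return jsonify({"done": True})
        return jsonify(tasks.get())

    @app.route("/result", methods=["POST"])
    def post_result():
        """The browser posts its computed partial result back."""
        payload = request.get_json()
        results[payload["task_id"]] = payload["value"]
        return jsonify({"ok": True})

    if __name__ == "__main__":
        app.run(port=8000)
    ```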

  14. Web service discovery among large service pools utilising semantic similarity and clustering

    NASA Astrophysics Data System (ADS)

    Chen, Fuzan; Li, Minqiang; Wu, Harris; Xie, Lingli

    2017-03-01

    With the rapid development of electronic business, Web services have attracted much attention in recent years. Enterprises can combine individual Web services to provide new value-added services. An emerging challenge is the timely discovery of close matches to service requests among large service pools. In this study, we first define a new semantic similarity measure combining functional similarity and process similarity. We then present a service discovery mechanism that utilises the new semantic similarity measure for service matching. All published Web services are pre-grouped into functional clusters prior to the matching process. For a user's service request, the discovery mechanism first identifies matching service clusters and then identifies the best-matching Web services within those clusters. Experimental results show that the proposed semantic discovery mechanism performs better than a conventional lexical similarity-based mechanism.
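
    The two-stage matching idea can be sketched with off-the-shelf tools. In the toy example below, TF-IDF cosine similarity stands in for the paper's combined functional/process similarity measure (a loose approximation only): services are pre-clustered offline with k-means, and a request is ranked only against its nearest cluster.

    ```python
    # Cluster-then-match sketch of two-stage service discovery.
    from sklearn.cluster import KMeans
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    services = [
        "convert currency exchange rate",      # toy service descriptions
        "currency conversion euro dollar",
        "book hotel room reservation",
        "hotel booking travel reservation",
        "weather forecast temperature city",
    ]

    vec = TfidfVectorizer()
    X = vec.fit_transform(services)
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)  # offline step

    request = "exchange dollar to euro currency"
    q = vec.transform([request])
    cluster = km.predict(q)[0]                         # 1) find matching cluster
    members = [i for i, c in enumerate(km.labels_) if c == cluster]
    scores = cosine_similarity(q, X[members]).ravel()  # 2) rank within cluster
    best = members[scores.argmax()]
    print(services[best])
    ```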

  15. Extending SME to Handle Large-Scale Cognitive Modeling.

    PubMed

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2017-07-01

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time, O(n^2 log n); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these enable it to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.

  16. The Challenge of Handling Big Data Sets in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Autermann, Christian; Stasch, Christoph; Jirka, Simon

    2016-04-01

    More and more Sensor Web components are deployed in different domains, such as hydrology, oceanography or air quality, in order to make observation data accessible via the Web. However, besides the variability of data formats and protocols in environmental applications, the fast-growing volume of data with high temporal and spatial resolution is imposing new challenges for Sensor Web technologies when sharing observation data and metadata about sensors. Variability, volume and velocity are the core issues that are addressed by Big Data concepts and technologies. Most solutions in the geospatial sector focus on remote sensing and raster data, whereas big in-situ observation data sets relying on vector features require novel approaches. Hence, in order to deal with big data sets in infrastructures for observational data, the following questions need to be answered: 1. How can big heterogeneous spatio-temporal datasets be organized, managed, and provided to Sensor Web applications? 2. How can views on big data sets and derived information products be made accessible in the Sensor Web? 3. How can big observation data sets be processed efficiently? We illustrate these challenges with examples from the marine domain and outline how we address them. We thereby show how big data approaches from mainstream IT can be re-used and applied to Sensor Web application scenarios.

  17. Conducting real-time multiplayer experiments on the web.

    PubMed

    Hawkins, Robert X D

    2015-12-01

    Group behavior experiments require potentially large numbers of participants to interact in real time with perfect information about one another. In this paper, we address the methodological challenge of developing and conducting such experiments on the web, thereby broadening access to online labor markets as well as allowing for participation through mobile devices. In particular, we combine a set of recent web development technologies, including Node.js with the Socket.io module, HTML5 canvas, and jQuery, to provide a secure platform for pedagogical demonstrations and scalable, unsupervised experiment administration. Template code is provided for an example real-time behavioral game theory experiment which automatically pairs participants into dyads and places them into a virtual world. In total, this treatment is intended to allow those with a background in non-web-based programming to modify the template, which handles the technical server-client networking details, for their own experiments.

  18. Utilizing Web 2.0 Technologies for Library Web Tutorials: An Examination of Instruction on Community College Libraries' Websites Serving Large Student Bodies

    ERIC Educational Resources Information Center

    Blummer, Barbara; Kenton, Jeffrey M.

    2015-01-01

    This is the second part of a series on Web 2.0 tools available from community college libraries' Websites. The first article appeared in an earlier volume of this journal and it illustrated the wide variety of Web 2.0 tools on community college libraries' Websites serving large student bodies (Blummer and Kenton 2014). The research found many of…

  19. River Food Web Response to Large-Scale Riparian Zone Manipulations

    PubMed Central

    Wootton, J. Timothy

    2012-01-01

    Conservation programs often focus on select species, leading to management plans based on the autecology of the focal species, but multiple ecosystem components can be affected both by the environmental factors impacting, and the management targeting, focal species. These broader effects can have indirect impacts on target species through the web of interactions within ecosystems. For example, human activity can strongly alter riparian vegetation, potentially impacting both economically-important salmonids and their associated river food web. In an Olympic Peninsula river, Washington state, USA, replicated large-scale riparian vegetation manipulations implemented with the long-term (>40 yr) goal of improving salmon habitat did not affect water temperature, nutrient limitation or habitat characteristics, but reduced canopy cover, causing reduced energy input via leaf litter, increased incident solar radiation (UV and PAR) and increased algal production compared to controls. In response, benthic algae, most insect taxa, and juvenile salmonids increased in manipulated areas. Stable isotope analysis revealed a predominant contribution of algal-derived energy to salmonid diets in manipulated reaches. The experiment demonstrates that riparian management targeting salmonids strongly affects river food webs via changes in the energy base, illustrates how species-based management strategies can have unanticipated indirect effects on the target species via the associated food web, and supports ecosystem-based management approaches for restoring depleted salmonid stocks. PMID:23284786

  20. FRASS: the web-server for RNA structural comparison

    PubMed Central

    2010-01-01

    Background The impressive increase in novel RNA structures during the past few years demands automated methods for structure comparison. While many algorithms handle only small motifs, a few techniques developed in recent years (ARTS, DIAL, SARA, SARSA, and LaJolla) are available for the structural comparison of large and intact RNA molecules. Results The FRASS web-server represents an RNA chain with its Gauss integrals and allows one to compare structures of RNA chains and to find similar entries in a database derived from the Protein Data Bank. We observed that FRASS scores correlate well with the ARTS and LaJolla similarity scores. Moreover, the web-server can also reproduce satisfactorily the DARTS classification of RNA 3D structures and the classification of SCOR functions that was obtained by the SARA method. Conclusions The FRASS web-server can be easily used to detect relationships among RNA molecules and to scan efficiently the rapidly enlarging structural databases. PMID:20553602

  1. Uvf - Unified Volume Format: A General System for Efficient Handling of Large Volumetric Datasets.

    PubMed

    Krüger, Jens; Potter, Kristin; Macleod, Rob S; Johnson, Christopher

    2008-01-01

    With the continual increase in computing power, volumetric datasets with sizes ranging from only a few megabytes to petascale are generated thousands of times per day. Such data may come from an ordinary source such as simple everyday medical imaging procedures, while larger datasets may be generated from cluster-based scientific simulations or measurements of large-scale experiments. In computer science an incredible amount of work worldwide is put into the efficient visualization of these datasets. As researchers in the field of scientific visualization, we often face the task of handling very large data from various sources, and this data usually comes in many different formats. In medical imaging, the DICOM standard is well established; however, most research labs use their own data formats to store and process data. To simplify the task of reading the many different formats used with all of the different visualization programs, we present a system for the efficient handling of many types of large scientific datasets (see Figure 1 for just a few examples). While primarily targeted at structured volumetric data, UVF can store just about any type of structured and unstructured data. The system is composed of a file format specification with a reference implementation of a reader. It is not only a common, easy-to-implement format but also allows for efficient rendering of most datasets without the need to convert the data in memory.
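    To make concrete what a reference reader for a structured-volume container does at its simplest, here is a sketch that parses a small fixed header before reading the payload. The layout (magic string, dimensions, voxel spacing) is entirely hypothetical and far simpler than the actual UVF specification.

    ```python
    import struct

    # Hypothetical fixed-size header: 8-byte magic, 3 x uint32 dims, 3 x float64 spacing.
    HEADER_FMT = "<8sIIIddd"
    HEADER_SIZE = struct.calcsize(HEADER_FMT)

    def read_volume_header(path):
        """Parse the (made-up) header of a raw structured volume file."""
        with open(path, "rb") as f:
            magic, nx, ny, nz, sx, sy, sz = struct.unpack(HEADER_FMT, f.read(HEADER_SIZE))
        if magic != b"MYVOL001":
            raise ValueError("not a recognized volume file")
        # The voxel payload (nx*ny*nz values) follows the header and could be memory-mapped.
        return {"dims": (nx, ny, nz), "spacing": (sx, sy, sz), "offset": HEADER_SIZE}
    ```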

  2. Web Server Security on Open Source Environments

    NASA Astrophysics Data System (ADS)

    Gkoutzelis, Dimitrios X.; Sardis, Manolis S.

    Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks in the hands of their attackers. Until recently this kind of defense was a privilege of the few; under-budgeted, low-cost solutions left defenders vulnerable to the rise of innovative attack methods. Luckily, the digital revolution of the past decade left its mark, changing the way we face security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never have imagined fifteen years ago. Online security of large corporations, military and government bodies is more and more handled by open source applications, thus driving the technological trend of the 21st century in adopting open solutions to e-commerce and privacy issues. This paper describes substantial security precautions for facing privacy and authentication issues in a totally open source web environment. Our goal is to identify the best-known problems in data handling and to propose the most appealing techniques to face these challenges through an open solution.

  3. Safe gas handling and system design for the large scale production of amorphous silicon based solar cells

    NASA Astrophysics Data System (ADS)

    Fortmann, C. M.; Farley, M. V.; Smoot, M. A.; Fieselmann, B. F.

    1988-07-01

    Solarex is one of the leaders in amorphous silicon based photovoltaic production and research. The large scale production environment presents unique safety concerns related to the quantity of dangerous materials as well as the number of personnel handling these materials. The safety measures explored by this work include gas detection systems, training, and failure-resistant gas handling systems. Our experiences with flow-restricting orifices in the CGA connections and the use of steel cylinders are reviewed. The hazards and efficiency of wet scrubbers for silane exhausts are examined. We have found it useful to provide the scrubber with temperature alarms.

  4. High-performance web viewer for cardiac images

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcelo; Furuie, Sergio S.

    2004-04-01

    With the advent of digital devices for medical diagnosis, the use of conventional film in radiology has decreased. Thus, the management and handling of medical images in digital format has become an important and critical task. In cardiology, for example, the main difficulty is displaying dynamic images with the color palette and frame rate used during acquisition by Cath, Angio and Echo systems. Another difficulty is handling large images in memory on any existing personal computer, including thin clients. In this work we present a web-based application that carries out these tasks with robustness and excellent performance, without burdening the server and network. This application provides near-diagnostic-quality display of cardiac images stored as DICOM 3.0 files via a web browser and provides a set of resources that allows the viewing of still and dynamic images. It can access image files from local disks or over a network connection. Its features include real-time playback, dynamic thumbnail viewing during loading, access to patient database information, image processing tools, linear and angular measurements, on-screen annotations, image printing, exporting DICOM images to other image formats, and many others, all within a pleasant, user-friendly interface inside a Web browser by means of a Java application. This approach offers several advantages over most medical image viewers, such as ease of installation, integration with other systems by means of public and standardized interfaces, platform independence, and efficient manipulation and display of medical images, all with high performance.

  5. AtomicChargeCalculator: interactive web-based calculation of atomic charges in large biomolecular complexes and drug-like molecules.

    PubMed

    Ionescu, Crina-Maria; Sehnal, David; Falginella, Francesco L; Pant, Purbaj; Pravda, Lukáš; Bouchal, Tomáš; Svobodová Vařeková, Radka; Geidl, Stanislav; Koča, Jaroslav

    2015-01-01

    Partial atomic charges are a well-established concept, useful in understanding and modeling the chemical behavior of molecules, from simple compounds, to large biomolecular complexes with many reactive sites. This paper introduces AtomicChargeCalculator (ACC), a web-based application for the calculation and analysis of atomic charges which respond to changes in molecular conformation and chemical environment. ACC relies on an empirical method to rapidly compute atomic charges with accuracy comparable to quantum mechanical approaches. Due to its efficient implementation, ACC can handle any type of molecular system, regardless of size and chemical complexity, from drug-like molecules to biomacromolecular complexes with hundreds of thousands of atoms. ACC writes out atomic charges into common molecular structure files, and offers interactive facilities for statistical analysis and comparison of the results, in both tabular and graphical form. Due to high customizability and speed, easy streamlining and the unified platform for calculation and analysis, ACC caters to all fields of life sciences, from drug design to nanocarriers. ACC is freely available via the Internet at http://ncbr.muni.cz/ACC.
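    ACC's "empirical method" is in the electronegativity equalization (EEM) family, where charges come from one small linear solve. A minimal sketch under that assumption follows; the A/B parameters and the scaling constant are placeholders, not ACC's fitted sets.

    ```python
    import numpy as np

    def eem_charges(coords, A, B, total_charge=0.0, kappa=0.529):
        """Solve the EEM linear system for partial atomic charges.

        coords: (N, 3) positions in Angstrom; A, B: per-atom electronegativity
        and hardness parameters (placeholder values, not fitted ones).
        """
        n = len(coords)
        dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        M = np.zeros((n + 1, n + 1))
        rhs = np.zeros(n + 1)
        for i in range(n):
            M[i, :n] = kappa / np.where(dist[i] > 0, dist[i], np.inf)  # Coulomb terms
            M[i, i] = B[i]              # atomic hardness on the diagonal
            M[i, n] = -1.0              # common (unknown) molecular electronegativity
            rhs[i] = -A[i]
        M[n, :n] = 1.0                  # charge conservation row
        rhs[n] = total_charge
        return np.linalg.solve(M, rhs)[:n]
    ```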

  6. An Architecture for Autonomic Web Service Process Planning

    NASA Astrophysics Data System (ADS)

    Moore, Colm; Xue Wang, Ming; Pahl, Claus

    Web service composition is a technology that has received considerable attention in recent years. Languages and tools to aid in the process of creating composite Web services have received specific attention. Web service composition is the process of linking single Web services together in order to accomplish more complex tasks. One area of Web service composition that has not received as much attention is dynamic error handling and re-planning, enabling autonomic composition. Given a repository of service descriptions and a task to complete, it is possible for AI planners to automatically create a plan that will achieve this goal. If, however, a service in the plan is unavailable or erroneous, the plan will fail. Motivated by this problem, this paper suggests autonomous re-planning as a means to overcome dynamic problems. Our solution involves automatically recovering from faults and creating a context-dependent alternate plan. We present an architecture that serves as a basis for the central activities of autonomous composition, monitoring and fault handling.
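    The plan-monitor-replan cycle at the heart of such an architecture can be sketched abstractly. Everything below (names, the fault model, the planner callable) is illustrative, not the paper's actual components.

    ```python
    def execute_with_replanning(goal, plan_fn, invoke, services, max_attempts=3):
        """Plan, execute, and re-plan around Web services that fail at run time.

        plan_fn(goal, available) -> ordered list of service ids, or None.
        invoke(service_id) raises RuntimeError to signal a service fault.
        """
        available = set(services)
        for _ in range(max_attempts):
            plan = plan_fn(goal, available)
            if plan is None:
                raise RuntimeError("goal unreachable with remaining services")
            for step in plan:
                try:
                    invoke(step)
                except RuntimeError:
                    available.discard(step)   # drop the faulty service and re-plan
                    break
            else:
                return plan                   # every step succeeded
        raise RuntimeError("exceeded re-planning attempts")
    ```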

  7. Mooring and ground handling rigid airships

    NASA Technical Reports Server (NTRS)

    Walker, H., Jr.

    1975-01-01

    The problems of mooring and ground handling rigid airships are discussed. A brief history of mooring and ground handling of rigid airships from July 2, 1900 through September 1, 1939 is included, along with a brief history of ground handling developments with large U.S. Navy nonrigid airships between September 1, 1939 and August 31, 1962, whose equipment and techniques appear applicable to future large rigid airships. Finally, recommendations are made pertaining to equipment and procedures which appear desirable and feasible for future rigid airship programs.

  8. Alumina Handling Dustiness

    NASA Astrophysics Data System (ADS)

    Authier-Martin, Monique

    Dustiness of calcined alumina is a major concern, causing undesirable working conditions and serious alumina losses. These losses occur primarily during unloading and handling or pot loading and crust breaking. The handling side of the problem is first addressed. The Perra pulvimeter constitutes a simple and reproducible tool to quantify handling dustiness and yields results in agreement with plant experience. Attempts are made to correlate dustiness with bulk properties (particle size, attrition index, …) for a large number of diverse aluminas. The characterization of the dust generated with the Perra pulvimeter is most revealing. The effect of the addition of E.S.P. dust is also reported.

  9. Data partitioning enables the use of standard SOAP Web Services in genome-scale workflows.

    PubMed

    Sztromwasser, Pawel; Puntervoll, Pål; Petersen, Kjell

    2011-07-26

    Biological databases and computational biology tools are provided by research groups around the world and made accessible on the Web. Combining these resources is a common practice in bioinformatics, but integration of heterogeneous and often distributed tools and datasets can be challenging. To date, this challenge has been commonly addressed in a pragmatic way, by tedious and error-prone scripting. Recently, however, a more reliable technique has been identified and proposed as the platform to tie together bioinformatics resources: Web Services. In the last decade Web Services have spread widely in bioinformatics and earned the title of recommended technology. However, in the era of high-throughput experimentation, a major concern regarding Web Services is their ability to handle large-scale data traffic. We propose a stream-like communication pattern for standard SOAP Web Services that enables efficient flow of large data traffic between a workflow orchestrator and Web Services. We evaluated the data-partitioning strategy by comparing it with typical communication patterns on an example pipeline for genomic sequence annotation. The results show that data partitioning lowers the resource demands of services and increases their throughput, which in consequence makes it possible to execute in silico experiments at genome scale using standard SOAP Web Services and workflows. As a proof of principle we annotated an RNA-seq dataset using a plain BPEL workflow engine.
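    The essence of the pattern is replacing one monolithic SOAP payload with a sequence of bounded requests whose partial results are concatenated. A language-neutral sketch follows; the call_service callable stands in for a real SOAP client stub generated from the service's WSDL.

    ```python
    def partition(records, chunk_size):
        """Yield successive fixed-size batches of records."""
        for i in range(0, len(records), chunk_size):
            yield records[i:i + chunk_size]

    def annotate_in_chunks(records, call_service, chunk_size=500):
        """Stream bounded batches to a service instead of one huge payload."""
        results = []
        for batch in partition(records, chunk_size):
            results.extend(call_service(batch))   # one bounded request per batch
        return results
    ```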

  10. RCrawler: An R package for parallel web crawling and scraping

    NASA Astrophysics Data System (ADS)

    Khalil, Salim; Fakir, Mohamed

    RCrawler is a contributed R package for domain-based web crawling and content scraping. As the first implementation of a parallel web crawler in the R environment, RCrawler can crawl, parse, store pages, extract contents, and produce data that can be directly employed for web content mining applications. However, it is also flexible, and could be adapted to other applications. The main features of RCrawler are multi-threaded crawling, content extraction, and duplicate content detection. In addition, it includes functionalities such as URL and content-type filtering, depth-level control, and a robots.txt parser. Our crawler has a highly optimized system, and can download a large number of pages per second while being robust against certain crashes and spider traps. In this paper, we describe the design and functionality of RCrawler, and report on our experience of implementing it in an R environment, including different optimizations that handle the limitations of R. Finally, we discuss our experimental results.
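    RCrawler itself is an R package; purely to illustrate the architecture the abstract describes (worker pool, frontier queue, robots.txt compliance, duplicate-URL filtering), here is a toy Python equivalent using the third-party requests and beautifulsoup4 packages. Error handling is deliberately minimal.

    ```python
    import urllib.robotparser
    from concurrent.futures import ThreadPoolExecutor
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    def crawl(seed, max_pages=100, workers=8):
        """Toy multi-threaded, domain-bound crawler (illustrative, not RCrawler)."""
        domain = urlparse(seed).netloc
        robots = urllib.robotparser.RobotFileParser(f"https://{domain}/robots.txt")
        robots.read()
        seen, frontier, pages = {seed}, [seed], {}

        def fetch(url):
            try:
                return url, requests.get(url, timeout=10).text
            except requests.RequestException:
                return url, None

        with ThreadPoolExecutor(max_workers=workers) as pool:
            while frontier and len(pages) < max_pages:
                batch, frontier = frontier[:workers], frontier[workers:]
                for url, html in pool.map(fetch, batch):
                    if html is None:
                        continue
                    pages[url] = html
                    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                        link = urljoin(url, a["href"]).split("#")[0]
                        if (urlparse(link).netloc == domain and link not in seen
                                and robots.can_fetch("*", link)):
                            seen.add(link)        # duplicate-URL filter
                            frontier.append(link)
        return pages
    ```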

  11. Handling qualities of large flexible control-configured aircraft

    NASA Technical Reports Server (NTRS)

    Swaim, R. L.

    1979-01-01

    The approach to an analytical study of flexible-airplane longitudinal handling qualities was to parametrically vary the natural frequencies of two symmetric elastic modes to induce mode interactions with the rigid-body dynamics. Since the structure of the pilot model was unknown for such dynamic interactions, the optimal control pilot modeling method was applied in conjunction with a pilot rating method.

  12. Visualizing Ecosystem Energy Flow in Complex Food Web Networks: A Comparison of Three Alaskan Large Marine Ecosystems

    NASA Astrophysics Data System (ADS)

    Kearney, K.; Aydin, K.

    2016-02-01

    Oceanic food webs are often depicted as network graphs, with the major organisms or functional groups displayed as nodes and the fluxes between them as the edges. However, the large number of nodes and edges and the high connectance of many management-oriented food webs, coupled with graph layout algorithms poorly suited to certain desired characteristics of food web visualizations, often lead to hopelessly tangled diagrams that convey little information other than, "It's complex." Here, I combine several new graph visualization techniques, including a new node layout algorithm based on trophic similarity (a quantification of shared predators and prey) and trophic level, divided edge bundling for edge routing, and intelligent automated placement of labels, to create a much clearer visualization of the important fluxes through a food web. The technique will be used to highlight the differences in energy flow within three Alaskan Large Marine Ecosystems (the Bering Sea, Gulf of Alaska, and Aleutian Islands) that include very similar functional groups but unique energy pathways.
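    The trophic-level coordinate that anchors such a vertical layout is straightforward to compute. The sketch below solves the standard linear definition TL = 1 + D·TL for a diet-fraction matrix D; the toy matrix and names are ours.

    ```python
    import numpy as np

    def trophic_levels(diet):
        """Compute trophic levels from a diet-fraction matrix.

        diet[i, j] = fraction of consumer i's diet coming from prey j
        (rows sum to 1 for consumers, 0 for primary producers).
        """
        n = diet.shape[0]
        return np.linalg.solve(np.eye(n) - diet, np.ones(n))

    # Tiny example: producer -> herbivore -> predator
    diet = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.2, 0.8, 0.0]])
    print(trophic_levels(diet))   # ~ [1.0, 2.0, 2.8]
    ```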

  13. Updates to FuncLab, a Matlab based GUI for handling receiver functions

    NASA Astrophysics Data System (ADS)

    Porritt, Robert W.; Miller, Meghan S.

    2018-02-01

    Receiver functions are a versatile tool commonly used in seismic imaging. Depending on how they are processed, they can be used to image discontinuity structure within the crust or mantle, or they can be inverted for seismic velocity either directly or jointly with complementary datasets. However, modern studies generally require large datasets which can be challenging to handle; therefore, FuncLab was originally written as an interactive Matlab GUI to assist in handling these large datasets. This software uses a project database to allow interactive trace editing, data visualization, H-κ stacking for crustal thickness and Vp/Vs ratio, and common conversion point stacking while minimizing computational costs. Since its initial release, significant advances have been made in the implementation of web services, and changes in the underlying Matlab platform have necessitated a significant revision to the software. Here, we present revisions to the software, including new features such as data downloading via irisFetch.m, receiver function calculations via processRFmatlab, on-the-fly cross-section tools, interface picking, and more. In describing the tools, we present the software's application to a test dataset in Michigan, Wisconsin, and neighboring areas following the passage of the USArray Transportable Array. The software is made available online at https://robporritt.wordpress.com/software.
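    For readers unfamiliar with H-κ stacking, a minimal grid-search version can be written in a few lines. FuncLab itself is Matlab; the Python sketch below uses illustrative weights, grids, and an assumed crustal Vp, and is not FuncLab's implementation.

    ```python
    import numpy as np

    def hk_stack(rf, time, p, vp=6.5, weights=(0.7, 0.2, 0.1),
                 H_grid=np.linspace(20, 60, 81), k_grid=np.linspace(1.6, 1.9, 61)):
        """Grid-search crustal thickness H (km) and Vp/Vs ratio kappa by stacking
        receiver-function amplitudes at predicted Ps, PpPs and PpSs+PsPs times.

        rf: (ntraces, nsamples); time: (nsamples,) in s; p: ray parameters (s/km).
        """
        w1, w2, w3 = weights
        stack = np.zeros((len(H_grid), len(k_grid)))
        for it, trace in enumerate(rf):
            eta_p = np.sqrt(1.0 / vp**2 - p[it]**2)
            for i, H in enumerate(H_grid):
                for j, k in enumerate(k_grid):
                    eta_s = np.sqrt(k**2 / vp**2 - p[it]**2)
                    t1 = H * (eta_s - eta_p)        # Ps conversion
                    t2 = H * (eta_s + eta_p)        # PpPs multiple
                    t3 = 2.0 * H * eta_s            # PpSs + PsPs multiple
                    amp = np.interp([t1, t2, t3], time, trace)
                    stack[i, j] += w1 * amp[0] + w2 * amp[1] - w3 * amp[2]
        best = np.unravel_index(np.argmax(stack), stack.shape)
        return H_grid[best[0]], k_grid[best[1]], stack
    ```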

  14. The Management Challenge: Handling Exams Involving Large Quantities of Students, on and off Campus--A Design Concept

    ERIC Educational Resources Information Center

    Larsson, Ken

    2014-01-01

    This paper looks at the process of managing large numbers of exams efficiently and secure with the use of a dedicated IT support. The system integrates regulations on different levels, from national to local, (even down to departments) and ensures that the rules are employed in all stages of handling the exams. The system has a proven record of…

  15. On-demand server-side image processing for web-based DICOM image display

    NASA Astrophysics Data System (ADS)

    Sakusabe, Takaya; Kimura, Michio; Onogi, Yuzo

    2000-04-01

    Low-cost image delivery is needed in modern networked hospitals. If a hospital has hundreds of clients, the cost of client systems is a big problem. Naturally, a Web-based system is the most effective solution. But a Web browser alone cannot display medical images with certain image processing, such as a lookup table transformation. We developed a Web-based medical image display system using a Web browser and on-demand server-side image processing. All images displayed on a Web page are generated from DICOM files on a server and delivered on demand. User interaction on the Web page is handled by a client-side scripting technology such as JavaScript. This combination gives the look and feel of an imaging workstation, not only in functionality but also in speed. Real-time update of images while tracing mouse motion is achieved in the Web browser without any client-side image processing, which would otherwise require plug-in technology such as Java Applets or ActiveX. We tested the performance of the system in three cases: a single client, a small number of clients on a fast network, and a large number of clients on a normal-speed network. The results show that there is very slight communication overhead and that the system scales well with the number of clients.
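    The central server-side step, applying a lookup-table (window) transform and shipping the browser a plain 8-bit image, can be sketched as follows; numpy and Pillow stand in here for whatever the authors actually used.

    ```python
    import numpy as np
    from PIL import Image

    def window_to_png(pixels, center, width, out_path):
        """Apply a linear window (lookup-table) transform to 16-bit pixel data
        and save an 8-bit PNG that any browser can display directly.
        """
        lo, hi = center - width / 2.0, center + width / 2.0
        scaled = np.clip((pixels.astype(np.float32) - lo) / (hi - lo), 0.0, 1.0)
        Image.fromarray((scaled * 255).astype(np.uint8)).save(out_path)

    # e.g. pixels = pydicom.dcmread("slice.dcm").pixel_array for a DICOM file
    ```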

  16. Design and Development of a Framework Based on OGC Web Services for the Visualization of Three-Dimensional Large-Scale Geospatial Data Over the Web

    NASA Astrophysics Data System (ADS)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGISes are traditionally bound to a two-dimensional, simplified representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, a lot of effort is still needed to reach desktop GIS features, such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, and therefore allows traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.
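    Although the framework targets 3D content, the OGC plumbing underneath is the familiar key-value-pair style. For example, a standard WMS 1.3.0 GetMap request looks like this (endpoint and layer name are placeholders; note that with EPSG:4326 in WMS 1.3.0 the bbox axis order is lat/lon):

    ```python
    import requests

    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": "city:buildings",      # hypothetical layer
        "styles": "",
        "crs": "EPSG:4326",
        "bbox": "45.0,11.0,45.5,11.5",   # minLat,minLon,maxLat,maxLon
        "width": 1024,
        "height": 1024,
        "format": "image/png",
    }
    tile = requests.get("https://example.org/geoserver/wms", params=params, timeout=30)
    with open("tile.png", "wb") as f:
        f.write(tile.content)
    ```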

  17. Phylo.io: Interactive Viewing and Comparison of Large Phylogenetic Trees on the Web.

    PubMed

    Robinson, Oscar; Dylus, David; Dessimoz, Christophe

    2016-08-01

    Phylogenetic trees are pervasively used to depict evolutionary relationships. Increasingly, researchers need to visualize large trees and compare multiple large trees inferred for the same set of taxa (reflecting uncertainty in the tree inference or genuine discordance among the loci analyzed). Existing tree visualization tools are however not well suited to these tasks. In particular, side-by-side comparison of trees can prove challenging beyond a few dozen taxa. Here, we introduce Phylo.io, a web application to visualize and compare phylogenetic trees side-by-side. Its distinctive features are: highlighting of similarities and differences between two trees, automatic identification of the best matching rooting and leaf order, scalability to large trees, high usability, multiplatform support via standard HTML5 implementation, and possibility to store and share visualizations. The tool can be freely accessed at http://phylo.io and can easily be embedded in other web servers. The code for the associated JavaScript library is available at https://github.com/DessimozLab/phylo-io under an MIT open source license. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  18. Mynodbcsv: lightweight zero-config database solution for handling very large CSV files.

    PubMed

    Adaszewski, Stanisław

    2014-01-01

    Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, their format is often the first obstacle. Lack of standardized ways of exploring different data layouts requires an effort each time to solve the problem from scratch. Possibility to access data in a rich, uniform manner, e.g. using Structured Query Language (SQL) would offer expressiveness and user-friendliness. Comma-separated values (CSV) are one of the most common data storage formats. Despite its simplicity, with growing file size handling it becomes non-trivial. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if its horizontal dimension reaches thousands of columns. Most databases are optimized for handling large number of rows rather than columns, therefore, performance for datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It's characterized by: "no copy" approach--data stay mostly in the CSV files; "zero configuration"--no need to specify database schema; written in C++, with boost [1], SQLite [2] and Qt [3], doesn't require installation and has very small size; query rewriting, dynamic creation of indices for appropriate columns and static data retrieval directly from CSV files ensure efficient plan execution; effortless support for millions of columns; due to per-value typing, using mixed text/numbers data is easy; very simple network protocol provides efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware along with educational videos on its website [4]. It doesn't need any prerequisites to run, as all of the libraries are included in the distribution package. I test it against existing database solutions using a battery of

  19. Mynodbcsv: Lightweight Zero-Config Database Solution for Handling Very Large CSV Files

    PubMed Central

    Adaszewski, Stanisław

    2014-01-01

    Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, their format is often the first obstacle. Lack of standardized ways of exploring different data layouts requires an effort each time to solve the problem from scratch. Possibility to access data in a rich, uniform manner, e.g. using Structured Query Language (SQL) would offer expressiveness and user-friendliness. Comma-separated values (CSV) are one of the most common data storage formats. Despite its simplicity, with growing file size handling it becomes non-trivial. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if its horizontal dimension reaches thousands of columns. Most databases are optimized for handling large number of rows rather than columns, therefore, performance for datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It's characterized by: “no copy” approach – data stay mostly in the CSV files; “zero configuration” – no need to specify database schema; written in C++, with boost [1], SQLite [2] and Qt [3], doesn't require installation and has very small size; query rewriting, dynamic creation of indices for appropriate columns and static data retrieval directly from CSV files ensure efficient plan execution; effortless support for millions of columns; due to per-value typing, using mixed text/numbers data is easy; very simple network protocol provides efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware along with educational videos on its website [4]. It doesn't need any prerequisites to run, as all of the libraries are included in the distribution package. I test it against existing database solutions using
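    The two records above index the same paper, and both abstracts are truncated in the source listing. For contrast with Mynodbcsv's no-copy design, the conventional import-then-query workflow it avoids can be sketched with Python's standard library:

    ```python
    import csv
    import sqlite3

    def query_csv(path, sql, table="t"):
        """Load a CSV into an in-memory SQLite table and run a query against it.

        This is the import-first approach Mynodbcsv is designed to avoid,
        but the SQL-over-CSV workflow looks the same from the user's side.
        """
        with open(path, newline="") as f:
            rows = list(csv.reader(f))
        header, data = rows[0], rows[1:]
        con = sqlite3.connect(":memory:")
        cols = ", ".join(f'"{c}"' for c in header)
        con.execute(f"CREATE TABLE {table} ({cols})")
        con.executemany(
            f"INSERT INTO {table} VALUES ({','.join('?' * len(header))})", data)
        return con.execute(sql).fetchall()

    # e.g. query_csv("measurements.csv", "SELECT COUNT(*) FROM t WHERE subject = '42'")
    ```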

  20. Handling Internet-Based Health Information: Improving Health Information Web Site Literacy Among Undergraduate Nursing Students.

    PubMed

    Wang, Weiwen; Sun, Ran; Mulvehill, Alice M; Gilson, Courtney C; Huang, Linda L

    2017-02-01

    Patient care problems arise when health care consumers and professionals find health information on the Internet because that information is often inaccurate. To mitigate this problem, nurses can develop Web literacy and share that skill with health care consumers. This study evaluated a Web-literacy intervention for undergraduate nursing students to find reliable Web-based health information. A pre- and postsurvey queried undergraduate nursing students in an informatics course; the intervention comprised lecture, in-class practice, and assignments about health Web site evaluation tools. Data were analyzed using Wilcoxon and ANOVA signed-rank tests. Pre-intervention, 75.9% of participants reported using Web sites to obtain health information. Postintervention, 87.9% displayed confidence in using an evaluation tool. Both the ability to critique health Web sites (p = .005) and confidence in finding reliable Internet-based health information (p = .058) increased. Web-literacy education guides nursing students to find, evaluate, and use reliable Web sites, which improves their ability to deliver safer patient care. [J Nurs Educ. 2017;56(2):110-114.]. Copyright 2017, SLACK Incorporated.

  1. On the Nets. Comparing Web Browsers: Mosaic, Cello, Netscape, WinWeb and InternetWorks Life.

    ERIC Educational Resources Information Center

    Notess, Greg R.

    1995-01-01

    World Wide Web browsers are compared by speed; setup; hypertext transfer protocol (HTTP) handling; management of file transfer protocol (FTP), telnet, gopher, and wide area information server (WAIS); bookmark options; and communication functions. Netscape has the most features, the fastest retrieval, and sophisticated bookmark capabilities. (JMV)

  2. Large area sheet task: Advanced Dendritic Web Growth Development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.

    1981-01-01

    A melt level control system was implemented to provide stepless silicon feed rates from zero to rates exactly matching the silicon consumed during web growth. Bench tests of the unit were successfully completed and the system mounted in a web furnace for operational verification. Tests of long term temperature drift correction techniques were made; web width monitoring seems most appropriate for feedback purposes. A system to program the initiation of the web growth cycle was successfully tested. A low cost temperature controller was tested which functions as well as units four times as expensive.

  3. Life as an emergent phenomenon: studies from a large-scale boid simulation and web data.

    PubMed

    Ikegami, Takashi; Mototake, Yoh-Ichi; Kobori, Shintaro; Oka, Mizuki; Hashimoto, Yasuhiro

    2017-12-28

    A large group with a special structure can become the mother of emergence. We discuss this hypothesis in relation to large-scale boid simulations and web data. In the boid swarm simulations, the nucleation, organization and collapse dynamics were found to be more diverse in larger flocks than in smaller flocks. In the second analysis, large web data, consisting of shared photos with descriptive tags, tended to group together users with similar tendencies, allowing the network to develop a core-periphery structure. We show that the generation rate of novel tags and their usage frequencies are high in the higher-order cliques. In this case, novelty is not considered to arise randomly; rather, it is generated as a result of a large and structured network. We contextualize these results in terms of adjacent possible theory and as a new way to understand collective intelligence. We argue that excessive information and material flow can become a source of innovation. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).

  4. Life as an emergent phenomenon: studies from a large-scale boid simulation and web data

    NASA Astrophysics Data System (ADS)

    Ikegami, Takashi; Mototake, Yoh-ichi; Kobori, Shintaro; Oka, Mizuki; Hashimoto, Yasuhiro

    2017-11-01

    A large group with a special structure can become the mother of emergence. We discuss this hypothesis in relation to large-scale boid simulations and web data. In the boid swarm simulations, the nucleation, organization and collapse dynamics were found to be more diverse in larger flocks than in smaller flocks. In the second analysis, large web data, consisting of shared photos with descriptive tags, tended to group together users with similar tendencies, allowing the network to develop a core-periphery structure. We show that the generation rate of novel tags and their usage frequencies are high in the higher-order cliques. In this case, novelty is not considered to arise randomly; rather, it is generated as a result of a large and structured network. We contextualize these results in terms of adjacent possible theory and as a new way to understand collective intelligence. We argue that excessive information and material flow can become a source of innovation. This article is part of the themed issue 'Reconceptualizing the origins of life'.
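    These two records likewise describe the same study. For readers unfamiliar with the underlying model, one update of the three classic boid rules (cohesion, alignment, separation) is sketched below; parameter values are illustrative, and the study's simulations are far larger and richer.

    ```python
    import numpy as np

    def boid_step(pos, vel, dt=0.1, r_neigh=1.0,
                  w_coh=0.05, w_align=0.1, w_sep=0.2):
        """One synchronous update of the classic boid rules for all agents.

        pos, vel: (N, 2) arrays of positions and velocities.
        """
        n = len(pos)
        new_vel = vel.copy()
        for i in range(n):
            d = pos - pos[i]                       # vectors toward all other boids
            dist = np.linalg.norm(d, axis=1)
            mask = (dist > 0) & (dist < r_neigh)   # neighbours within radius
            if mask.any():
                new_vel[i] += w_coh * d[mask].mean(axis=0)                  # cohesion
                new_vel[i] += w_align * (vel[mask].mean(axis=0) - vel[i])   # alignment
                new_vel[i] -= w_sep * (d[mask] / dist[mask, None]**2).sum(axis=0)  # separation
        return pos + dt * new_vel, new_vel
    ```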

  5. HAND TRUCK FOR HANDLING EQUIPMENT

    DOEpatents

    King, D.W.

    1959-02-24

    A truck is described for the handling of large and relatively heavy pieces of equipment and particularly for the handling of ion source units for use in calutrons. The truck includes a chassis and a frame pivoted to the chassis so as to be operable to swing in the manner of a boom. The frame has spaced members so arranged that the device to be handled can be suspended between or passed between these spaced members and also rotated with respect to the frame when the device is secured to the spaced members.

  6. A Comprehensive Web-Based Patient Information Environment

    DTIC Science & Technology

    2001-10-25

    The paper describes a new type of medical information environment which is fully web-enabled: a comprehensive, patient-centric medical information system called PiRiLiS. The system is clinically focused and can handle any type of medical data; it was found to reduce time spent on medical administration, providing the ability to view the entire patient record at any time, anywhere in hospitals.

  7. A Semantics-Based Information Distribution Framework for Large Web-Based Course Forum System

    ERIC Educational Resources Information Center

    Chim, Hung; Deng, Xiaotie

    2008-01-01

    We propose a novel data distribution framework for developing a large Web-based course forum system. In the distributed architectural design, each forum server is fully equipped with the ability to support some course forums independently. The forum servers collaborating with each other constitute the whole forum system. Therefore, the workload of…

  8. Large-area sheet task advanced dendritic web growth development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D. L.; Schruben, J.

    1982-01-01

    Thermal models were developed that accurately predict the thermally generated stresses in the web crystal which, if too high, cause the crystal to degenerate. The application of the modeling results to the design of low-stress experimental growth configurations will allow the growth of wider web crystals at higher growth velocities. A new experimental web growth machine was constructed. This facility includes all the features necessary for carrying out growth experiments under steady thermal conditions. Programmed growth initiation was developed to give reproducible crystal starts. Width control permits the growth of long ribbons at constant width. Melt level is controlled to 0.1 mm or better. Thus, the capability exists to grow long web crystals of constant width and thickness with little operator intervention, and web growth experiments can now be performed with growth variables controlled to a degree not previously possible.

  9. Spatial variations in food web structures with alternative stable states: evidence from stable isotope analysis in a large eutrophic lake

    NASA Astrophysics Data System (ADS)

    Li, Yunkai; Zhang, Yuying; Xu, Jun; Zhang, Shuo

    2018-03-01

    Food web structures are well known to vary widely among ecosystems. Moreover, many food web studies of lakes have generally attempted to characterize the overall food web structure and have largely ignored internal spatial and environmental variations. In this study, we hypothesize that there is a high degree of spatial heterogeneity within an ecosystem and such heterogeneity may lead to strong variations in environmental conditions and resource availability, in turn resulting in different trophic pathways. Stable carbon and nitrogen isotopes were employed for the whole food web to describe the structure of the food web in different sub-basins within Taihu Lake. This lake is a large eutrophic freshwater lake that has been intensively managed and highly influenced by human activities for more than 50 years. The results show significant isotopic differences between basins with different environmental characteristics. Such differences likely result from isotopic baseline differences combining with a shift in food web structure. Both are related to local spatial heterogeneity in nutrient loading in waters. Such variation should be explicitly considered in future food web studies and ecosystem-based management in this lake ecosystem.

  10. Panoptes: web-based exploration of large scale genome variation data.

    PubMed

    Vauterin, Paul; Jeffery, Ben; Miles, Alistair; Amato, Roberto; Hart, Lee; Wright, Ian; Kwiatkowski, Dominic

    2017-10-15

    The size and complexity of modern large-scale genome variation studies demand novel approaches for exploring and sharing the data. In order to unlock the potential of these data for a broad audience of scientists with various areas of expertise, a unified exploration framework is required that is accessible, coherent and user-friendly. Panoptes is an open-source software framework for collaborative visual exploration of large-scale genome variation data and associated metadata in a web browser. It relies on technology choices that allow it to operate in near real-time on very large datasets. It can be used to browse rich, hybrid content in a coherent way, and offers interactive visual analytics approaches to assist the exploration. We illustrate its application using genome variation data of Anopheles gambiae, Plasmodium falciparum and Plasmodium vivax. Freely available at https://github.com/cggh/panoptes, under the GNU Affero General Public License. paul.vauterin@gmail.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  11. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system allows processing and analysis of large archives of geophysical data obtained both from observations and from modeling. Accumulated experience in developing information-computational web-systems providing computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al, 2007; Okladnikov et al, 2008; Titov et al, 2009). The functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing and visualization of data. At present five data archives are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES 20th Century Global Reanalysis Version I. To provide data processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. Currently a set of computational modules for climate change indices approved by WMO is available, together with a special module providing visualization of results and export to Encapsulated PostScript, GeoTIFF and ESRI shape files. As a technological basis for representation of cartographical information on the Internet, the GeoServer software conforming to OpenGIS standards is used. Integration of GIS functionality with web-portal software has been performed to provide a basis for the web-portal's development as part of the geoinformation web-system. Such a geoinformation web-system is a next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities for performing reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming

  12. Hydrology and grazing jointly control a large-river food web.

    PubMed

    Strayer, David L; Pace, Michael L; Caraco, Nina F; Cole, Jonathan J; Findlay, Stuart E G

    2008-01-01

    Inputs of fresh water and grazing both can control aquatic food webs, but little is known about the relative strengths of and interactions between these controls. We use long-term data on the food web of the freshwater Hudson River estuary to investigate the importance of, and interactions between, inputs of fresh water and grazing by the invasive zebra mussel (Dreissena polymorpha). Both freshwater inputs and zebra mussel grazing have strong, pervasive effects on the Hudson River food web. High flow tended to reduce population size in most parts of the food web. High grazing also reduced populations in the planktonic food web, but increased populations in the littoral food web, probably as a result of increases in water clarity. The influences of flow and zebra mussel grazing were roughly equal (i.e., within a factor of 2) for many variables over the period of our study. Zebra mussel grazing made phytoplankton less sensitive to freshwater inputs, but water clarity and the littoral food web more sensitive to freshwater inputs, showing that interactions between these two controlling factors can be strong and varied.

  13. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2009-04-01

    Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open Geospatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known to be advantageous from databases: declarativeness (describe results rather than the algorithms), safety in evaluation (no request can keep a server busy indefinitely), and optimizability (enable the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not allow for semantic interoperability: a function is identified only by its name and parameters, while the semantics is encoded in the (only human-readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning a la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which was adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it
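    The record above is truncated in the source listing. To make the declarative style concrete, here is a hedged sketch of posting a WCPS query over HTTP; the endpoint, coverage name, and request parameters are placeholders, and conventions vary between server implementations.

    ```python
    import requests

    # Declarative WCPS: describe the subset and encoding, not the algorithm.
    query = """
    for $c in (AvgTemperature)
    return encode($c[Lat(40:45), Long(10:15), ansi("2008-01")], "image/png")
    """

    resp = requests.post(
        "https://example.org/ows",                     # hypothetical endpoint
        data={"service": "WCS", "version": "2.0.1",
              "request": "ProcessCoverages", "query": query},
        timeout=60)
    with open("subset.png", "wb") as f:
        f.write(resp.content)
    ```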

  14. Practice and effectiveness of web-based problem-based learning approach in a large class-size system: A comparative study.

    PubMed

    Ding, Yongxia; Zhang, Peili

    2018-06-12

    Problem-based learning (PBL) is an effective and highly efficient teaching approach that is extensively applied in education systems across a variety of countries. This study aimed to investigate the effectiveness of web-based PBL teaching pedagogies in large classes. The cluster sampling method was used to separate two college-level nursing student classes (graduating class of 2013) into two groups. The experimental group (n = 162) was taught using a web-based PBL teaching approach, while the control group (n = 166) was taught using conventional teaching methods. We then assessed the experimental group's satisfaction with the web-based PBL teaching mode, after comparing teaching outcomes pertaining to exams and self-learning capacity between the two groups. Examination scores and self-learning capabilities were significantly higher in the experimental group than in the control group (P < 0.01). In addition, 92.6% of students in the experimental group expressed satisfaction with the new web-based PBL teaching approach. In a large class-size teaching environment, the web-based PBL teaching approach appears to be more effective than traditional teaching methods. These results demonstrate the effectiveness of web-based teaching technologies in problem-based learning. Copyright © 2018. Published by Elsevier Ltd.

  15. Knowledge Management of Web Financial Reporting in Human-Computer Interactive Perspective

    ERIC Educational Resources Information Center

    Wang, Dong; Chen, Yujing; Xu, Jing

    2017-01-01

    Handling and analyzing web financial data is becoming a challenging issue in knowledge management and education for accounting practitioners. eXtensible Business Reporting Language (XBRL), a form of web financial reporting, describes and identifies financial items by tagging them with metadata. The goal is to make it possible for financial reports…

  16. The pepATTRACT web server for blind, large-scale peptide-protein docking.

    PubMed

    de Vries, Sjoerd J; Rey, Julien; Schindler, Christina E M; Zacharias, Martin; Tuffery, Pierre

    2017-07-03

    Peptide-protein interactions are ubiquitous in the cell and form an important part of the interactome. Computational docking methods can complement experimental characterization of these complexes, but current protocols are not applicable on the proteome scale. pepATTRACT is a novel docking protocol that is fully blind, i.e. it does not require any information about the binding site. In various stages of its development, pepATTRACT has participated in CAPRI, making successful predictions for five out of seven protein-peptide targets. Its performance is similar or better than state-of-the-art local docking protocols that do require binding site information. Here we present a novel web server that carries out the rigid-body stage of pepATTRACT. On the peptiDB benchmark, the web server generates a correct model in the top 50 in 34% of the cases. Compared to the full pepATTRACT protocol, this leads to some loss of performance, but the computation time is reduced from ∼18 h to ∼10 min. Combined with the fact that it is fully blind, this makes the web server well-suited for large-scale in silico protein-peptide docking experiments. The rigid-body pepATTRACT server is freely available at http://bioserv.rpbs.univ-paris-diderot.fr/services/pepATTRACT. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. Development of longitudinal handling qualities criteria for large advanced supersonic aircraft

    NASA Technical Reports Server (NTRS)

    Sudderth, R. W.; Bohn, J. G.; Caniff, M. A.; Bennett, G. R.

    1975-01-01

    Longitudinal handling qualities criteria in terms of airplane response characteristics were developed. The criteria cover high speed cruise maneuvering, landing approach, and stall recovery. Data substantiating the study results are reported.

  18. Web Video Event Recognition by Semantic Analysis From Ubiquitous Documents.

    PubMed

    Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng Tao

    2016-12-01

    In recent years, the task of event recognition from videos has attracted increasing interest in the multimedia area. While most existing research has focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyze video content without any prior knowledge. Therefore, synthesizing both visual and semantic analysis is a natural way to approach video event understanding. In this paper, we study the problem of Web video event recognition, where Web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from Web videos. In order to compensate for the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous Web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model, which explores the intrinsic correlation between the visual and textual cues of the videos to learn reliable classifiers. Extensive experiments on two real-world video data sets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of Web video event recognition.

  19. Using the Web to Encourage Student-generated Questions in Large-Format Introductory Biology Classes

    PubMed Central

    Olson, Joanne K.; Clough, Michael P.

    2007-01-01

    Students rarely ask questions related to course content in large-format introductory classes. The use of a Web-based forum devoted to student-generated questions was explored in a second-semester introductory biology course. Approximately 80% of the enrolled students asked at least one question about course content during each of three semesters during which this approach was implemented. About 95% of the students who posted questions reported reading the instructor's response to their questions. Although doing so did not contribute to their grade in the course, approximately 75% of the students reported reading questions posted by other students in the class. Approximately 60% of the students reported that the Web-based question-asking activity contributed to their learning of biology. PMID:17339393

  20. A Data Management System Integrating Web-based Training and Randomized Trials: Requirements, Experiences and Recommendations.

    PubMed

    Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D

    2011-01-01

    This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance of web-course-trained participants (intervention group) and printed-manual-trained participants (comparison group) to determine the effectiveness of the web-course in teaching CBT skills. A single DMS was needed to support all aspects of the study: web-course delivery and management, as well as randomized trial management. The authors briefly reviewed several other systems that were described as built either to handle randomized trials or to deliver and evaluate web-based training. However it was clear that these systems fell short of meeting our needs for simultaneous, coordinated management of the web-course and the randomized trial. New England Research Institute's (NERI) proprietary Advanced Data Entry and Protocol Tracking (ADEPT) system was coupled with the web-programmed course and customized for our purposes. This article highlights the requirements for a DMS that operates at the intersection of web-based course management systems and randomized clinical trial systems, and the extent to which the coupled, customized ADEPT satisfied those requirements. Recommendations are included for institutions and individuals considering conducting randomized trials and web-based training programs, and seeking a DMS that can meet similar requirements.

  1. Large-area sheet task: Advanced dendritic-web-growth development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Schruben, J.

    1983-01-01

    Thermally generated stresses in the growing web crystal were reduced. These stresses, which if too high cause the ribbon to degenerate, were reduced by a factor of three, resulting in the demonstrated growth of high-quality web crystals to widths of 5.4 cm. This progress was brought about chiefly by the application of thermal models to the development of low-stress growth configurations. A new temperature model was developed which can analyze the thermal effects of much more complex lid and top shield configurations than was possible with the old lumped shield model. Growth experiments which supplied input data such as actual shield temperature and melt levels were used to verify the modeling results. Desirable modifications in the melt level-sensing circuitry were made in the new experimental web growth furnace, and this furnace has been used to carry out growth experiments under steady-state conditions. New growth configurations were tested in long growth runs at Westinghouse AESD which produced wider, lower stress and higher quality web crystals than designs previously used.

  2. Neuroimaging, Genetics, and Clinical Data Sharing in Python Using the CubicWeb Framework

    PubMed Central

    Grigis, Antoine; Goyard, David; Cherbonnier, Robin; Gareau, Thomas; Papadopoulos Orfanos, Dimitri; Chauvat, Nicolas; Di Mascio, Adrien; Schumann, Gunter; Spooren, Will; Murphy, Declan; Frouin, Vincent

    2017-01-01

    In neurosciences or psychiatry, the emergence of large multi-center population imaging studies raises numerous technological challenges. From distributed data collection, across different institutions and countries, to final data publication service, one must handle the massive, heterogeneous, and complex data from genetics, imaging, demographics, or clinical scores. These data must be both efficiently obtained and downloadable. We present a Python solution, based on the CubicWeb open-source semantic framework, aimed at building population imaging study repositories. In addition, we focus on the tools developed around this framework to overcome the challenges associated with data sharing and collaborative requirements. We describe a set of three highly adaptive web services that transform the CubicWeb framework into a (1) multi-center upload platform, (2) collaborative quality assessment platform, and (3) publication platform endowed with massive-download capabilities. Two major European projects, IMAGEN and EU-AIMS, are currently supported by the described framework. We also present a Python package that enables end users to remotely query neuroimaging, genetics, and clinical data from scripts. PMID:28360851

  3. Neuroimaging, Genetics, and Clinical Data Sharing in Python Using the CubicWeb Framework.

    PubMed

    Grigis, Antoine; Goyard, David; Cherbonnier, Robin; Gareau, Thomas; Papadopoulos Orfanos, Dimitri; Chauvat, Nicolas; Di Mascio, Adrien; Schumann, Gunter; Spooren, Will; Murphy, Declan; Frouin, Vincent

    2017-01-01

    In neurosciences or psychiatry, the emergence of large multi-center population imaging studies raises numerous technological challenges. From distributed data collection, across different institutions and countries, to final data publication service, one must handle the massive, heterogeneous, and complex data from genetics, imaging, demographics, or clinical scores. These data must be both efficiently obtained and downloadable. We present a Python solution, based on the CubicWeb open-source semantic framework, aimed at building population imaging study repositories. In addition, we focus on the tools developed around this framework to overcome the challenges associated with data sharing and collaborative requirements. We describe a set of three highly adaptive web services that transform the CubicWeb framework into a (1) multi-center upload platform, (2) collaborative quality assessment platform, and (3) publication platform endowed with massive-download capabilities. Two major European projects, IMAGEN and EU-AIMS, are currently supported by the described framework. We also present a Python package that enables end users to remotely query neuroimaging, genetics, and clinical data from scripts.

  4. Fabrication and Test of Large Area Spider-Web Bolometers for CMB Measurements

    NASA Astrophysics Data System (ADS)

    Biasotti, M.; Ceriale, V.; Corsini, D.; De Gerone, M.; Gatti, F.; Orlando, A.; Pizzigoni, G.

    2016-08-01

    Detecting the primordial 'B-mode' polarization of the cosmic microwave background is one of the major challenges of modern observational cosmology. Microwave telescopes need sensitive cryogenic bolometers with an overall equivalent noise temperature in the nK range. In this paper, we present the development status of a large-area (about 1 cm²) spider-web bolometer, which implies additional fabrication challenges. The spider-web is a suspended Si3N4 membrane, 1 μm thick and 8 mm in diameter, with a mesh size of 250 μm. The thermally sensitive element is a superconducting transition edge sensor (TES) at the center of the bolometer. The first prototype is a Ti-Au TES with transition temperature tuned around 350 mK; new devices will be a Mo-Au bilayer tuned to have a transition temperature of 500 mK. We present the fabrication process, based on micro-machining techniques applied to a silicon wafer covered with SiO2 and Si3N4 CVD films (0.3 and 1 μm thick, respectively), and preliminary tests.

  5. Macroscopic characterisations of Web accessibility

    NASA Astrophysics Data System (ADS)

    Lopes, Rui; Carriço, Luis

    2010-12-01

    The Web Science framework poses fundamental questions on the analysis of the Web, by focusing on how microscopic properties (e.g. at the level of a Web page or Web site) emerge into macroscopic properties and phenomena. One research topic on the analysis of the Web is Web accessibility evaluation, which centres on understanding how accessible a Web page is for people with disabilities. However, when framing Web accessibility evaluation on Web Science, we have found that existing research stays at the microscopic level. This article presents an experimental study on framing Web accessibility evaluation into Web Science's goals. This study resulted in novel accessibility properties of the Web not found at microscopic levels, as well as of Web accessibility evaluation processes themselves. We observed at large scale some of the empirical knowledge on how accessibility is perceived by designers and developers, such as the disparity of interpretations of accessibility evaluation tools warnings. We also found a direct relation between accessibility quality and Web page complexity. We provide a set of guidelines for designing Web pages, education on Web accessibility, as well as on the computational limits of large-scale Web accessibility evaluations.

  6. Large-area sheet task advanced dendritic web growth development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.

    1984-01-01

    The thermal models used for analyzing dendritic web growth and calculating the thermal stress were reexamined to establish the validity limits imposed by the assumptions of the models. Also, the effects of thermal conduction through the gas phase were evaluated and found to be small. New growth designs, both static and dynamic, were generated using the modeling results. Residual stress effects in dendritic web were examined. In the laboratory, new techniques for the control of temperature distributions in three dimensions were developed. A new maximum undeformed web width of 5.8 cm was achieved. A 58% increase in growth velocity was achieved for web of 150 micrometers thickness using dynamic hardware. The area throughput goals for transient growth of 30 and 35 sq cm/min were exceeded.

  7. The impact of web services at the IRIS DMC

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Trabant, C. M.; Ahern, T. K.; Stults, M.; Suleiman, Y. Y.; Van Fossen, M.; Weertman, B.

    2015-12-01

    The IRIS Data Management Center (DMC) has served the seismological community for nearly 25 years. In that time we have offered data and information from our archive using a variety of mechanisms, ranging from email-based to desktop applications to web applications and web services. Of these, web services have quickly become the primary method for data extraction at the DMC. In 2011, the first full year of operation, web services accounted for over 40% of the data shipped from the DMC. In 2014, approximately 450 TB of data were delivered directly to users through web services, representing nearly 70% of all shipments from the DMC that year. In addition to handling requests directly from users, the DMC switched all data extraction methods to use web services in 2014. On average the DMC now handles between 10 and 20 million requests per day submitted to web service interfaces. The rapid adoption of web services is attributed to the many advantages they bring. For users, they provide on-demand data using an interface technology, HTTP, that is widely supported in nearly every computing environment and language. These characteristics, combined with human-readable documentation and existing tools, make integration of data access into existing workflows relatively easy. For the DMC, the web services provide an abstraction layer to internal repositories, allowing for concentrated optimization of extraction workflow and easier evolution of those repositories. Lending further support to the DMC's push in this direction, the core web services for station metadata, time-series data and event parameters were adopted as standards by the International Federation of Digital Seismograph Networks (FDSN). We expect to continue enhancing existing services and building new capabilities for this platform. For example, the DMC has created a federation system and tools allowing researchers to discover and collect seismic data from data centers running the FDSN-standardized services. A future capability
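
    As an illustration of the service style this record describes, a minimal sketch that requests miniSEED time-series data from the FDSN dataselect web service over plain HTTP; the station and time window are illustrative.

        import requests

        # The FDSN dataselect service returns raw miniSEED over plain HTTP;
        # the request form follows the FDSN web service specification.
        params = {
            "net": "IU", "sta": "ANMO", "loc": "00", "cha": "BHZ",
            "start": "2010-02-27T06:30:00", "end": "2010-02-27T06:40:00",
        }
        resp = requests.get("https://service.iris.edu/fdsnws/dataselect/1/query",
                            params=params, timeout=120)
        resp.raise_for_status()
        with open("ANMO.mseed", "wb") as f:
            f.write(resp.content)  # miniSEED bytes, readable with e.g. ObsPy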

  8. Design of a Web-tool for diagnostic clinical trials handling medical imaging research.

    PubMed

    Baltasar Sánchez, Alicia; González-Sistal, Angel

    2011-04-01

    New clinical studies in medicine are based on patients and controls using different imaging diagnostic modalities. Medical information systems are not designed for clinical trials employing clinical imaging. Although commercial software and communication systems focus on storage of image data, they are not suitable for storage and mining of new types of quantitative data. We sought to design a Web-tool to support diagnostic clinical trials involving different experts and hospitals or research centres. The image analysis of this project is based on skeletal X-ray imaging. It involves a computerised image method using quantitative analysis of regions of interest in healthy bone and skeletal metastases. The database is implemented with ASP.NET 3.5 and C# technologies for our Web-based application. For data storage, we chose MySQL v.5.0, one of the most popular open source databases. User logins were necessary, and access to patient data was logged for auditing. For security, all data transmissions were carried over encrypted connections. This Web-tool is available to users scattered at different locations; it allows an efficient organisation and storage of data (case report form) and images and allows each user to know precisely what his task is. The advantages of our Web-tool are as follows: (1) sustainability is guaranteed; (2) network locations for collection of data are secured; (3) all clinical information is stored together with the original images and the results derived from processed images and statistical analysis that enable us to perform retrospective studies; (4) changes are easily incorporated because of the modular architecture; and (5) assessment of trial data collected at different sites is centralised to reduce statistical variance.

  9. Prey interception drives web invasion and spider size determines successful web takeover in nocturnal orb-web spiders.

    PubMed

    Gan, Wenjin; Liu, Shengjie; Yang, Xiaodong; Li, Daiqin; Lei, Chaoliang

    2015-09-24

    A striking feature of web-building spiders is the use of silk to make webs, mainly for prey capture. However, building a web is energetically expensive and increases the risk of predation. To reduce such costs and still have access to abundant prey, some web-building spiders have evolved web invasion behaviour. In general, no consistent patterns of web invasion have emerged and the factors determining web invasion remain largely unexplored. Here we report web invasion among conspecifics in seven nocturnal species of orb-web spiders and examine the factors determining the probability that a web would be invaded and taken over by conspecifics. About 36% of webs were invaded by conspecifics, and 25% of invaded webs were taken over by the invaders. A web that was built higher and intercepted more prey was more likely to be invaded. Once a web was invaded, the smaller the size of the resident spider, the more likely its web would be taken over by the invader. This study suggests that web invasion, as a possible way of reducing costs, may be widespread in nocturnal orb-web spiders. © 2015. Published by The Company of Biologists Ltd.

  10. Prey interception drives web invasion and spider size determines successful web takeover in nocturnal orb-web spiders

    PubMed Central

    Gan, Wenjin; Liu, Shengjie; Yang, Xiaodong; Li, Daiqin; Lei, Chaoliang

    2015-01-01

    A striking feature of web-building spiders is the use of silk to make webs, mainly for prey capture. However, building a web is energetically expensive and increases the risk of predation. To reduce such costs and still have access to abundant prey, some web-building spiders have evolved web invasion behaviour. In general, no consistent patterns of web invasion have emerged and the factors determining web invasion remain largely unexplored. Here we report web invasion among conspecifics in seven nocturnal species of orb-web spiders and examine the factors determining the probability that a web would be invaded and taken over by conspecifics. About 36% of webs were invaded by conspecifics, and 25% of invaded webs were taken over by the invaders. A web that was built higher and intercepted more prey was more likely to be invaded. Once a web was invaded, the smaller the size of the resident spider, the more likely its web would be taken over by the invader. This study suggests that web invasion, as a possible way of reducing costs, may be widespread in nocturnal orb-web spiders. PMID:26405048

  11. Large-area sheet task advanced dendritic web growth development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.

    1982-01-01

    The thermal stress model was used to generate the design of a low stress lid and shield configuration, which was fabricated and tested experimentally. In preliminary tests, the New Experimental Web Growth Facility performed as designed, producing web on the first run. These experiments suggested desirable design modifications in the melt level sensing system to improve further its performance, and these are being implemented.

  12. WebEAV

    PubMed Central

    Nadkarni, Prakash M.; Brandt, Cynthia M.; Marenco, Luis

    2000-01-01

    The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples. PMID:10887163

  13. Large area sheet task: Advanced dendritic web growth development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.

    1981-01-01

    The growth of silicon dendritic web for photovoltaic applications was investigated. The application of a thermal model for calculating buckling stresses as a function of temperature profile in the web is discussed. Lid and shield concepts were evaluated to provide the data base for enhancing growth velocity. An experimental web growth machine which embodies in one unit the mechanical and electronic features developed in previous work was developed. In addition, evaluation of a melt level control system was begun, along with preliminary tests of an elongated crucible design. The economic analysis was also updated to incorporate some minor cost changes. The initial applications of the thermal model to a specific configuration gave results consistent with experimental observation in terms of the initiation of buckling vs. width for a given crystal thickness.

  14. Technobabble: Photoshop 6 Converges Web, Print Photograph-Editing Capabilities.

    ERIC Educational Resources Information Center

    Communication: Journalism Education Today, 2001

    2001-01-01

    Discusses the newly-released Adobe Photoshop 6, and its use in student publications. Notes its refined text-handling capabilities, a more user-friendly interface, integrated vector functions, easier preparation of Web images, and new and more powerful layer functions. (SR)

  15. Developing a Hadoop-based Middleware for Handling Multi-dimensional NetCDF

    NASA Astrophysics Data System (ADS)

    Li, Z.; Yang, C. P.; Schnase, J. L.; Duffy, D.; Lee, T. J.

    2014-12-01

    Climate observations and model simulations generate vast amounts of climate data, and these data are accumulating at a rapid pace. Effectively managing and analyzing these data is essential for climate change studies. Hadoop, a distributed storage and processing framework for large data sets, has attracted increasing attention for dealing with the Big Data challenge. The maturity of Infrastructure as a Service (IaaS) cloud computing further accelerates the adoption of Hadoop for solving Big Data problems. However, Hadoop is designed to process unstructured data such as texts, documents and web pages, and cannot effectively handle scientific data formats such as array-based NetCDF files and other binary formats. In this paper, we propose to build a Hadoop-based middleware for transparently handling big NetCDF data by 1) designing a distributed climate data storage mechanism based on a POSIX-enabled parallel file system to enable parallel big data processing with MapReduce, as well as to support data access by other systems; 2) modifying the Hadoop framework to transparently process NetCDF data in parallel without sequencing or converting the data into other file formats, or loading them into HDFS; and 3) seamlessly integrating Hadoop, cloud computing and climate data in a highly scalable and fault-tolerant framework.
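
    A minimal sketch of the array-based splitting idea described above, assuming the netCDF4 Python package; the file name and variable name are hypothetical. Each index range over the leading dimension plays the role of one MapReduce input split, the array-oriented analogue of Hadoop's byte-oriented splits.

        from netCDF4 import Dataset  # pip install netCDF4

        def array_splits(shape, rows_per_split):
            """Partition the leading (time) dimension into index ranges, the
            array-based analogue of Hadoop's byte-oriented input splits."""
            for start in range(0, shape[0], rows_per_split):
                yield start, min(start + rows_per_split, shape[0])

        # 'climate.nc' and the variable name 'tas' are illustrative.
        with Dataset("climate.nc") as nc:
            tas = nc.variables["tas"]               # e.g. (time, lat, lon)
            for i0, i1 in array_splits(tas.shape, rows_per_split=100):
                chunk = tas[i0:i1]                  # one "map" input: a sub-array
                print(i0, i1, float(chunk.mean()))  # stand-in for a map task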

  16. Large area sheet task. Advanced dendritic web growth development. [silicon films

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Frantti, E.; Schruben, J.

    1981-01-01

    The development of a silicon dendritic web growth machine is discussed. Several refinements to the sensing and control equipment for melt replenishment during web growth are described and several areas for cost reduction in the components of the prototype automated web growth furnace are identified. A circuit designed to eliminate the sensitivity of the detector signal to the intensity of the reflected laser beam used to measure melt level is also described. A variable speed motor for the silicon feeder is discussed which allows pellet feeding to be accomplished at a rate programmed to match exactly the silicon removed by web growth.

  17. SLIDE - a web-based tool for interactive visualization of large-scale -omics data.

    PubMed

    Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon

    2018-06-28

    Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow real-time customization of graphics through user interaction. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data at multiple resolutions on a single screen. SLIDE is publicly available under the BSD license, both as an online version and as a stand-alone version, at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.

  18. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
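
    The abstract mentions but does not show the NCCV caching code, so the following is a generic sketch of the map-caching idea it describes: key each rendered map by every parameter that affects it, and pay the rendering cost only on a cache miss. All names are hypothetical.

        import hashlib
        from pathlib import Path

        CACHE_DIR = Path("map_cache")
        CACHE_DIR.mkdir(exist_ok=True)

        def cached_map(model, scenario, variable, month, render):
            """Return rendered map bytes, calling `render` only on a cache miss.
            The key covers every parameter that changes the output image."""
            key = hashlib.sha1(
                f"{model}|{scenario}|{variable}|{month}".encode()).hexdigest()
            path = CACHE_DIR / f"{key}.png"
            if not path.exists():              # miss: pay the rendering cost once
                path.write_bytes(render(model, scenario, variable, month))
            return path.read_bytes()           # hit: a file read instead of a render

        # First call renders; later calls with the same key are served from disk.
        png = cached_map("CCSM4", "rcp85", "tasmax", 7,
                         render=lambda *args: b"\x89PNG...")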

  19. Predator-prey size relationships in an African large-mammal food web.

    PubMed

    Owen-Smith, Norman; Mills, M G L

    2008-01-01

    1. Size relationships are central in structuring trophic linkages within food webs, leading to suggestions that the dietary niche of smaller carnivores is nested within that of larger species. However, past analyses have not taken into account the differing selection shown by carnivores for specific size ranges of prey, nor the extent to which the greater carcass mass of larger prey outweighs the greater numerical representation of smaller prey species in the predator diet. Furthermore, the top-down impact that predation has on prey abundance cannot be assessed simply in terms of the number of predator species involved. 2. Records of found carcasses and cause of death assembled over 46 years in the Kruger National Park, South Africa, corrected for under-recording of smaller species, enabled a definitive assessment of size relationships between large mammalian carnivores and their ungulate prey. Five carnivore species were considered, including lion (Panthera leo), leopard (Panthera pardus), cheetah (Acinonyx jubatus), African wild dog (Lycaon pictus) and spotted hyena (Crocuta crocuta), and 22 herbivore prey species larger than 10 kg in adult body mass. 3. These carnivores selectively favoured prey species approximately half to twice their mass, within a total prey size range from an order of magnitude below to an order of magnitude above the body mass of the predator. The three smallest carnivores, i.e. leopard, cheetah and wild dog, showed high similarity in prey species favoured. Despite overlap in prey size range, each carnivore showed a distinct dietary preference. 4. Almost all mortality was through the agency of a predator for ungulate species up to the size of a giraffe (800-1200 kg). Ungulates larger than twice the mass of the predator contributed substantially to the dietary intake of lions, despite the low proportional mortality inflicted by predation on these species. Only for megaherbivores substantially exceeding 1000 kg in adult body mass did

  20. WebGIS based on semantic grid model and web services

    NASA Astrophysics Data System (ADS)

    Zhang, WangFei; Yue, CaiRong; Gao, JianGuo

    2009-10-01

    As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Owing to the constraints of the Web and the characteristics of GIS, traditional WebGIS has some prominent problems: it cannot achieve interoperability across heterogeneous spatial databases, and it cannot provide cross-platform data access. With the appearance of Web Service and Grid technology, great changes have come to the field of WebGIS. Web Services provide an interface that gives sites the ability to share data and communicate with one another. The goal of Grid technology is to turn the Internet into one large supercomputer that can efficiently implement the overall sharing of computing resources, storage resources, data resources, information resources, knowledge resources and expert resources. For WebGIS, however, this only achieves the physical connection of data and information, which is far from enough. Because of different understandings of the world, different professional conventions, different policies and different habits, experts in different fields reach different conclusions when observing the same geographic phenomenon, and semantic heterogeneity arises; the same concept can therefore differ greatly between fields. If we build WebGIS without considering this semantic heterogeneity, users' questions will be answered wrongly or not at all. To solve this problem, this paper puts forward, and reports experience with, an effective method of combining the semantic grid and Web Services technology to develop WebGIS. We studied methods for constructing ontologies and for combining Grid technology with Web Services and, with a detailed analysis of the computing characteristics and application model of data distribution, we designed a WebGIS query system driven by

  1. Webmail: an Automated Web Publishing System

    NASA Astrophysics Data System (ADS)

    Bell, David

    A system for publishing frequently updated information to the World Wide Web will be described. Many documents now hosted by the NOAO Web server require timely posting and frequent updates, but need only minor changes in markup or are in a standard format requiring only conversion to HTML. These include information from outside the organization, such as electronic bulletins, and a number of internal reports, both human and machine generated. Webmail uses procmail and Perl scripts to process incoming email messages in a variety of ways. This processing may include wrapping or conversion to HTML, posting to the Web or internal newsgroups, updating search indices or links on related pages, and sending email notification of the new pages to interested parties. The Webmail system has been in use at NOAO since early 1997 and has steadily grown to include fourteen recipes that together handle about fifty messages per week.
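
    Webmail itself is built from procmail and Perl scripts; purely as a rough Python analogue of the dispatch step, a sketch with a hypothetical recipe table that converts matching messages to HTML. The matching rule and handler below are illustrative, not NOAO's actual recipes.

        import email
        import email.policy
        import html
        import sys

        def wrap_plain_text(msg):
            """Wrap a plain-text message body in minimal HTML."""
            part = msg.get_body(preferencelist=("plain",))
            body = part.get_content() if part else ""
            return f"<html><body><pre>{html.escape(body)}</pre></body></html>"

        # Each "recipe" pairs a matching rule with a handler, mirroring the
        # procmail/Perl recipes; the rule below is illustrative.
        RECIPES = [
            (lambda m: "bulletin" in (m["Subject"] or "").lower(), wrap_plain_text),
        ]

        def dispatch(raw_bytes):
            msg = email.message_from_bytes(raw_bytes, policy=email.policy.default)
            for matches, handler in RECIPES:
                if matches(msg):
                    return handler(msg)
            return None  # unmatched mail is left for a human to review

        if __name__ == "__main__":
            page = dispatch(sys.stdin.buffer.read())
            if page:
                print(page)  # in production: write under the web root, update indices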

  2. WebAlchemist: a Web transcoding system for mobile Web access in handheld devices

    NASA Astrophysics Data System (ADS)

    Whang, Yonghyun; Jung, Changwoo; Kim, Jihong; Chung, Sungkwon

    2001-11-01

    In this paper, we describe the design and implementation of WebAlchemist, a prototype web transcoding system, which automatically converts a given HTML page into a sequence of equivalent HTML pages that can be properly displayed on a hand-held device. The WebAlchemist system is based on a set of HTML transcoding heuristics managed by the Transcoding Manager (TM) module. In order to tackle difficult-to-transcode pages such as ones with large or complex table structures, we have developed several new transcoding heuristics that extract partial semantics from syntactic information such as the table width, font size and cascading style sheet. Subjective evaluation results using popular HTML pages (such as the CNN home page) show that WebAlchemist generates readable, structure-preserving transcoded pages, which can be properly displayed on hand-held devices.
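
    A simplified sketch of one such table-transcoding heuristic: splitting a large HTML table into a sequence of screen-sized pages. The regex-based parsing and the rows-per-page threshold are illustrative, not WebAlchemist's actual code.

        import re

        def split_table_rows(table_html, rows_per_page=10):
            """Split one large HTML table into several small ones, a simplified
            version of the table-transcoding heuristic described above."""
            rows = re.findall(r"<tr\b.*?</tr>", table_html, re.S | re.I)
            pages = []
            for i in range(0, len(rows), rows_per_page):
                pages.append("<table>%s</table>" % "".join(rows[i:i + rows_per_page]))
            return pages  # each page is small enough for a handheld screen

        pages = split_table_rows("<table>" + "<tr><td>cell</td></tr>" * 35 + "</table>")
        print(len(pages))  # -> 4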

  3. A web portal for hydrodynamical, cosmological simulations

    NASA Astrophysics Data System (ADS)

    Ragagnin, A.; Dolag, K.; Biffi, V.; Cadolle Bel, M.; Hammer, N. J.; Krukau, A.; Petkova, M.; Steinborn, D.

    2017-07-01

    This article describes a data centre hosting a web portal for accessing and sharing the output of large, cosmological, hydro-dynamical simulations with a broad scientific community. It also allows users to receive related scientific data products by directly processing the raw simulation data on a remote computing cluster. The data centre has a multi-layer structure: a web portal, a job control layer, a computing cluster and an HPC storage system. The outer layer enables users to choose an object from the simulations. Objects can be selected by visually inspecting 2D maps of the simulation data, by performing highly compounded and elaborated queries, or graphically by plotting arbitrary combinations of properties. The user can then run analysis tools on a chosen object; these tools operate directly on the raw simulation data. The job control layer is responsible for handling and performing the analysis jobs, which are executed on a computing cluster. The innermost layer is formed by an HPC storage system which hosts the large, raw simulation data. The following services are available for the users: (I) CLUSTERINSPECT visualizes properties of member galaxies of a selected galaxy cluster; (II) SIMCUT returns the raw data of a sub-volume around a selected object from a simulation, containing all the original, hydro-dynamical quantities; (III) SMAC creates idealized 2D maps of various, physical quantities and observables of a selected object; (IV) PHOX generates virtual X-ray observations with specifications of various current and upcoming instruments.

  4. ESAP plus: a web-based server for EST-SSR marker development.

    PubMed

    Ponyared, Piyarat; Ponsawat, Jiradej; Tongsima, Sissades; Seresangtakul, Pusadee; Akkasaeng, Chutipong; Tantisuwichwong, Nathpapat

    2016-12-22

    download all the results through the web interface. ESAP Plus is a comprehensive and convenient web-based bioinformatic tool for SSR marker development. ESAP Plus offers all necessary EST-SSR development processes with various adjustable options that users can easily use to identify SSR markers from a large EST collection. With familiar web interface, users can upload the raw EST using the data submission page and visualize/download the corresponding EST-SSR information from within ESAP Plus. ESAP Plus can handle considerably large EST datasets. This EST-SSR discovery tool can be accessed directly from: http://gbp.kku.ac.th/esap_plus/ .
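
    A minimal sketch of the SSR-detection step underlying such tools, assuming the common definition of a microsatellite as a short motif tandemly repeated; the 2-6 bp motif length and five-repeat threshold are illustrative, as real tools make these configurable.

        import re

        # A microsatellite here is a 2-6 bp motif tandemly repeated at least
        # five times; thresholds vary between tools and are illustrative.
        SSR_RE = re.compile(r"(([ACGT]{2,6}?)\2{4,})")

        def find_ssrs(seq):
            """Yield (motif, repeat_count, start) for each SSR in an EST sequence."""
            for match in SSR_RE.finditer(seq.upper()):
                whole, motif = match.group(1), match.group(2)
                yield motif, len(whole) // len(motif), match.start()

        for motif, n, pos in find_ssrs("ccgtAGAGAGAGAGAGttcaATTATTATTATTATTgg"):
            print(motif, n, pos)   # -> AG 6 4, then ATT 5 20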

  5. Depth gradients in food-web processes linking habitats in large lakes: Lake Superior as an exemplar ecosystem

    USGS Publications Warehouse

    Sierszen, Michael E.; Hrabik, Thomas R.; Stockwell, Jason D.; Cotter, Anne M; Hoffman, Joel C.; Yule, Daniel L.

    2014-01-01

    Support of whole-lake food webs through trophic linkages among pelagic, profundal and littoral habitats appears to be integral to the functioning of large lakes. These linkages can be disrupted though ecosystem disturbance such as eutrophication or the effects of invasive species and should be considered in native species restoration efforts.

  6. A Look at Technologies Vis-a-vis Information Handling Techniques.

    ERIC Educational Resources Information Center

    Swanson, Rowena W.

    The paper examines several ideas for information handling implemented with new technologies that suggest directions for future development. These are grouped under the topic headings: Handling Large Data Banks, Providing Personalized Information Packages, Providing Information Specialist Services, and Expanding Man-Machine Interaction. Guides in…

  7. Wilber 3: A Python-Django Web Application For Acquiring Large-scale Event-oriented Seismic Data

    NASA Astrophysics Data System (ADS)

    Newman, R. L.; Clark, A.; Trabant, C. M.; Karstens, R.; Hutko, A. R.; Casey, R. E.; Ahern, T. K.

    2013-12-01

    Since 2001, the IRIS Data Management Center (DMC) WILBER II system has provided a convenient web-based interface for locating seismic data related to a particular event, and requesting a subset of that data for download. Since its launch, both the scale of available data and the technology of web-based applications have developed significantly. Wilber 3 is a ground-up redesign that leverages a number of public and open-source projects to provide an event-oriented data request interface with a high level of interactivity and scalability for multiple data types. Wilber 3 uses the IRIS/Federation of Digital Seismic Networks (FDSN) web services for event data, metadata, and time-series data. Combining a carefully optimized Google Map with the highly scalable SlickGrid data API, the Wilber 3 client-side interface can load tens of thousands of events or networks/stations in a single request, and provide instantly responsive browsing, sorting, and filtering of event and metadata in the web browser, without further reliance on the data service. The server side of Wilber 3 is a Python-Django application, one of over a dozen developed in the last year at IRIS, whose common framework, components, and administrative overhead represent massive savings in developer resources. Requests for assembled datasets, which may include thousands of data channels and gigabytes of data, are queued and executed using the Celery distributed Python task scheduler, giving Wilber 3 the ability to operate in parallel across a large number of nodes.
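
    A minimal sketch of the queued-request pattern described above, using the Celery API the abstract names; the broker URL, task body, and helper are hypothetical stand-ins for Wilber 3's actual tasks.

        from celery import Celery

        # Broker/backend URLs are illustrative; any supported broker works.
        app = Celery("wilber", broker="redis://localhost:6379/0",
                     backend="redis://localhost:6379/1")

        def fetch_channel(event_id, channel):
            # Stand-in for the real time-series retrieval.
            return f"{event_id}/{channel}.mseed"

        @app.task
        def assemble_dataset(event_id, channels):
            """Fetch and package the requested channels for one event.
            Executed on a worker node, so large requests are queued across
            the cluster instead of blocking the Django web process."""
            return [fetch_channel(event_id, c) for c in channels]

        # The web layer enqueues and returns immediately; the client polls:
        # async_result = assemble_dataset.delay("evt-1234", ["IU.ANMO.00.BHZ"])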

  8. Overlay accuracy on a flexible web with a roll printing process based on a roll-to-roll system.

    PubMed

    Chang, Jaehyuk; Lee, Sunggun; Lee, Ki Beom; Lee, Seungjun; Cho, Young Tae; Seo, Jungwoo; Lee, Sukwon; Jo, Gugrae; Lee, Ki-yong; Kong, Hyang-Shik; Kwon, Sin

    2015-05-01

    For high-quality flexible devices made by printing processes based on Roll-to-Roll (R2R) systems, overlay alignment during the patterning of each functional layer poses a major challenge. The reason is that flexible substrates have a relatively low stiffness compared with rigid substrates, and they are easily deformed during web handling in the R2R system. To achieve high overlay accuracy on a flexible substrate, it is important not only to develop web handling modules (such as web guiding, tension control, winding, and unwinding) and a precise printing tool, but also to control the synchronization of every unit in the total system. An R2R web handling system and a reverse offset printing process were developed in this work, and an overlay accuracy between the first and second layers of ±5 μm on a 500 mm-wide film was achieved at a σ level of 2.4 and 2.8 (in the x and y directions, respectively) in a continuous R2R printing process. This paper presents the components and mechanisms used in reverse offset printing based on an R2R system and the printing results, including positioning accuracy and overlay alignment accuracy.

  9. WeBIAS: a web server for publishing bioinformatics applications.

    PubMed

    Daniluk, Paweł; Wilczyński, Bartek; Lesyng, Bogdan

    2015-11-02

    One of the requirements for a successful scientific tool is its availability. Developing a functional web service, however, is usually considered a mundane and ungratifying task, and is quite often neglected. When publishing bioinformatic applications, such an attitude puts an additional burden on reviewers, who have to cope with poorly designed interfaces in order to assess the quality of the presented methods, and it impairs the actual usefulness of the tool to the scientific community at large. In this note we present WeBIAS, a simple, self-contained solution for making command-line programs accessible through web forms. It comprises a web portal capable of serving several applications and backend schedulers which carry out computations. The server handles user registration and authentication, stores queries and results, and provides a convenient administrator interface. WeBIAS is implemented in Python and available under the GNU Affero General Public License. It has been developed and tested on GNU/Linux-compatible platforms, which cover a vast majority of operational WWW servers. Since it is written in pure Python, it should be easy to deploy on all other platforms supporting Python (e.g. Windows, Mac OS X). Documentation and source code, as well as a demonstration site, are available at http://bioinfo.imdik.pan.pl/webias . WeBIAS has been designed specifically with ease of installation and deployment of services in mind. Setting up a simple application requires minimal effort, yet it is possible to create visually appealing, feature-rich interfaces for query submission and presentation of results.
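
    WeBIAS's own code is linked above; purely as an illustration of the wrap-a-CLI-behind-a-web-form idea, a minimal Flask-based sketch. Flask is not what WeBIAS uses, and 'wc -c' stands in for a real bioinformatics tool.

        import subprocess
        from flask import Flask, request

        app = Flask(__name__)

        # One hard-coded form; WeBIAS generates such forms from configuration.
        FORM = """<form method="post" action="/run">
                    <textarea name="sequence"></textarea>
                    <input type="submit" value="Submit">
                  </form>"""

        @app.route("/")
        def index():
            return FORM

        @app.route("/run", methods=["POST"])
        def run():
            # Pass the form field to the wrapped program on stdin; never build
            # a shell string from user input.
            proc = subprocess.run(["wc", "-c"], input=request.form["sequence"],
                                  capture_output=True, text=True, timeout=60)
            return f"<pre>{proc.stdout}</pre>"

        if __name__ == "__main__":
            app.run()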

  10. web cellHTS2: a web-application for the analysis of high-throughput screening data.

    PubMed

    Pelz, Oliver; Gilsdorf, Moritz; Boutros, Michael

    2010-04-12

    The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
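
    As an illustration of the kind of normalization step such a pipeline applies, a Python sketch of per-plate median scaling, one common option for plate-based screens; cellHTS2's actual implementation is in R and offers several further methods.

        from statistics import median

        def normalize_plates(plates):
            """Per-plate median scaling: divide each raw well value by its
            plate median, so plates measured at different overall intensities
            become comparable and hits stand out identically."""
            normalized = []
            for plate in plates:                  # plate: list of raw well values
                m = median(plate)
                normalized.append([v / m for v in plate])
            return normalized

        raw = [[200, 220, 180, 2000],             # plate 1, one strong hit
               [100, 110, 90, 1000]]              # plate 2, dimmer overall
        for plate in normalize_plates(raw):
            print([round(v, 2) for v in plate])   # the hit scores ~9.5 on both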

  11. Invasive Mussels Alter the Littoral Food Web of a Large Lake: Stable Isotopes Reveal Drastic Shifts in Sources and Flow of Energy

    PubMed Central

    Ozersky, Ted; Evans, David O.; Barton, David R.

    2012-01-01

    We investigated how establishment of invasive dreissenid mussels impacted the structure and energy sources of the littoral benthic food web of a large temperate lake. We combined information about pre- and postdreissenid abundance, biomass, and secondary production of the littoral benthos with results of carbon and nitrogen stable isotope analysis of archival (predreissenid) and recent (postdreissenid) samples of all common benthic taxa. This approach enabled us to determine the importance of benthic and sestonic carbon to the littoral food web before, and more than a decade after dreissenid establishment. Long term dreissenid presence was associated with a 32-fold increase in abundance, 6-fold increase in biomass, and 14-fold increase in secondary production of the littoral benthos. Dreissenids comprised a large portion of the post-invasion benthos, making up 13, 38, and 56% of total abundance, biomass, and secondary production, respectively. The predreissenid food web was supported primarily by benthic primary production, while sestonic material was relatively more important to the postdreissenid food web. The absolute importance of both sestonic material and benthic primary production to the littoral benthos increased considerably following dreissenid establishment. Our results show drastic alterations to food web structure and suggest that dreissenid mussels redirect energy and material from the water column to the littoral benthos both through biodeposition of sestonic material as well as stimulation of benthic primary production. PMID:23284673

  12. Web server for priority ordered multimedia services

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

    In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions for CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content-addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load-balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for improved disk access and higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority-ordered buffering of the retrieved Web pages and CM data streams that are fed into an auto-regressive moving average (ARMA) based traffic-shaping circuitry before being transmitted through the network.
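
    A minimal sketch of the priority ordering described above, using a binary heap; the request payloads are illustrative, and the tie-breaking counter keeps equal-priority requests in FIFO order.

        import heapq
        import itertools

        # Priority order taken from the abstract: lower number = served first.
        PRIORITY = {"admin_rw": 0, "hot_cm_web_multicast": 1, "cm_read": 2,
                    "web_read": 3, "cm_write": 4, "web_write": 5}

        _queue, _counter = [], itertools.count()

        def submit(kind, request_data):
            # The counter breaks ties so equal-priority requests stay FIFO.
            heapq.heappush(_queue, (PRIORITY[kind], next(_counter), request_data))

        def next_request():
            return heapq.heappop(_queue)[2] if _queue else None

        submit("web_read", "GET /index.html")
        submit("admin_rw", "PUT /config")
        submit("cm_read", "STREAM movie-42")
        print(next_request())   # -> PUT /config (admin first)
        print(next_request())   # -> STREAM movie-42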

  13. Reducing Mouse Anxiety during Handling: Effect of Experience with Handling Tunnels

    PubMed Central

    Gouveia, Kelly; Hurst, Jane L.

    2013-01-01

    Handling stress is a well-recognised source of variation in animal studies that can also compromise the welfare of research animals. To reduce background variation and maximise welfare, methods that minimise handling stress should be developed and used wherever possible. Recent evidence has shown that handling mice by a familiar tunnel that is present in their home cage can minimise anxiety compared with standard tail handling. As yet, it is unclear whether a tunnel is required in each home cage to improve response to handling. We investigated the influence of prior experience with home tunnels among two common strains of laboratory mice: ICR(CD-1) and C57BL/6. We compared willingness to approach the handler and anxiety in an elevated plus maze test among mice picked up by the tail, by a home cage tunnel or by an external tunnel shared between cages. Willingness to interact with the handler was much greater for mice handled by a tunnel, even when this was unfamiliar, compared to mice picked up by the tail. Once habituated to handling, C57BL/6 mice were most interactive towards a familiar home tunnel, whereas the ICR strain showed strong interaction with all tunnel handling regardless of any experience of a home cage tunnel. Mice handled by a home cage or external tunnel showed less anxiety in an elevated plus maze than those picked up by the tail. This study shows that using a tunnel for routine handling reduces anxiety among mice compared to tail handling regardless of prior familiarity with tunnels. However, as home cage tunnels can further improve response to handling in some mice, we recommend that mice are handled with a tunnel provided in their home cage where possible as a simple practical method to minimise handling stress. PMID:23840458

  14. Reducing mouse anxiety during handling: effect of experience with handling tunnels.

    PubMed

    Gouveia, Kelly; Hurst, Jane L

    2013-01-01

    Handling stress is a well-recognised source of variation in animal studies that can also compromise the welfare of research animals. To reduce background variation and maximise welfare, methods that minimise handling stress should be developed and used wherever possible. Recent evidence has shown that handling mice by a familiar tunnel that is present in their home cage can minimise anxiety compared with standard tail handling. As yet, it is unclear whether a tunnel is required in each home cage to improve response to handling. We investigated the influence of prior experience with home tunnels among two common strains of laboratory mice: ICR(CD-1) and C57BL/6. We compared willingness to approach the handler and anxiety in an elevated plus maze test among mice picked up by the tail, by a home cage tunnel or by an external tunnel shared between cages. Willingness to interact with the handler was much greater for mice handled by a tunnel, even when this was unfamiliar, compared to mice picked up by the tail. Once habituated to handling, C57BL/6 mice were most interactive towards a familiar home tunnel, whereas the ICR strain showed strong interaction with all tunnel handling regardless of any experience of a home cage tunnel. Mice handled by a home cage or external tunnel showed less anxiety in an elevated plus maze than those picked up by the tail. This study shows that using a tunnel for routine handling reduces anxiety among mice compared to tail handling regardless of prior familiarity with tunnels. However, as home cage tunnels can further improve response to handling in some mice, we recommend that mice are handled with a tunnel provided in their home cage where possible as a simple practical method to minimise handling stress.

  15. WebGIVI: a web-based gene enrichment analysis and visualization tool.

    PubMed

    Sun, Liang; Zhu, Yongnan; Mahmood, A S M Ashique; Tudor, Catalina O; Ren, Jia; Vijay-Shanker, K; Chen, Jian; Schmidt, Carl J

    2017-05-04

    A major challenge of high-throughput transcriptome studies is presenting the data to researchers in an interpretable format. In many cases, the outputs of such studies are gene lists which are then examined for enriched biological concepts. One approach to help the researcher interpret large gene datasets is to associate genes and informative terms (iTerms) that are obtained from the biomedical literature using the eGIFT text-mining system. However, examining large lists of iTerm and gene pairs is a daunting task. We have developed WebGIVI, an interactive web-based visualization tool (http://raven.anr.udel.edu/webgivi/) to explore gene:iTerm pairs. WebGIVI was built with the Cytoscape and Data-Driven Documents JavaScript libraries and can be used to relate genes to iTerms and then visualize gene and iTerm pairs. WebGIVI can accept a gene list that is used to retrieve the gene symbols and the corresponding iTerm list. This list can be submitted to visualize the gene:iTerm pairs using two distinct methods: a Concept Map or a Cytoscape Network Map. In addition, WebGIVI also supports uploading and visualization of any two-column tab-separated data. WebGIVI provides an interactive and integrated network graph of genes and iTerms that allows filtering, sorting, and grouping, which can aid biologists in developing hypotheses based on the input gene lists. In addition, WebGIVI can visualize hundreds of nodes and generate a high-resolution image that is important for most research publications. The source code can be freely downloaded at https://github.com/sunliang3361/WebGIVI . The WebGIVI tutorial is available at http://raven.anr.udel.edu/webgivi/tutorial.php .

  16. BPELPower—A BPEL execution engine for geospatial web services

    NASA Astrophysics Data System (ADS)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.

  17. Implementation of a web-based medication tracking system in a large academic medical center.

    PubMed

    Calabrese, Sam V; Williams, Jonathan P

    2012-10-01

    Pharmacy workflow efficiencies achieved through the use of an electronic medication-tracking system are described. Medication dispensing turnaround times at the inpatient pharmacy of a large hospital were evaluated before and after transition from manual medication tracking to a Web-based tracking process involving sequential bar-code scanning and real-time monitoring of medication status. The transition was carried out in three phases: (1) a workflow analysis, including the identification of optimal points for medication scanning with hand-held wireless devices, (2) the phased implementation of an automated solution and associated hardware at a central dispensing pharmacy and three satellite locations, and (3) postimplementation data collection to evaluate the impact of the new tracking system and areas for improvement. Relative to the manual tracking method, electronic medication tracking allowed the capture of far more data points, enabling the pharmacy team to delineate the time required for each step of the medication dispensing process and to identify the steps most likely to involve delays. A comparison of baseline and postimplementation data showed substantial reductions in overall medication turnaround times with the use of the Web-based tracking system (time reductions of 45% and 22% at the central and satellite sites, respectively). In addition to more accurate projections and documentation of turnaround times, the Web-based tracking system has facilitated quality-improvement initiatives. Implementation of an electronic tracking system for monitoring the delivery of medications provided a comprehensive mechanism for calculating turnaround times and allowed the pharmacy to identify bottlenecks within the medication distribution system. Altering processes removed these bottlenecks and decreased delivery turnaround times.
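
    A minimal sketch of the per-step turnaround calculation that sequential bar-code scanning makes possible; the step names and timestamps are illustrative, not the hospital's actual data.

        from datetime import datetime

        # Each dispensing step is recorded by a bar-code scan.
        scans = [
            ("order_verified", "2012-03-01T08:00:00"),
            ("filled",         "2012-03-01T08:14:00"),
            ("checked",        "2012-03-01T08:21:00"),
            ("delivered",      "2012-03-01T08:47:00"),
        ]

        def step_durations(scans):
            """Minutes spent in each step, the per-step breakdown that the
            manual tracking process could not capture."""
            times = [(s, datetime.fromisoformat(t)) for s, t in scans]
            return [(b[0], (b[1] - a[1]).total_seconds() / 60)
                    for a, b in zip(times, times[1:])]

        for step, minutes in step_durations(scans):
            print(f"{step}: {minutes:.0f} min")   # the delivery leg is the bottleneck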

  18. FUn: a framework for interactive visualizations of large, high-dimensional datasets on the web.

    PubMed

    Probst, Daniel; Reymond, Jean-Louis

    2018-04-15

    During the past decade, big data have become a major tool in scientific endeavors. Although statistical methods and algorithms are well-suited for analyzing and summarizing enormous amounts of data, the results do not allow for a visual inspection of the entire dataset. Current scientific software, including R packages and Python libraries such as ggplot2, matplotlib and plot.ly, does not support interactive visualizations of datasets exceeding 100 000 data points on the web. Other solutions enable the web-based visualization of big data only through data reduction or statistical representations. However, recent hardware developments, especially advancements in graphical processing units, allow for the rendering of millions of data points on a wide range of consumer hardware such as laptops, tablets and mobile phones. Similar to the challenges and opportunities brought to virtually every scientific field by big data, both the visualization of and interaction with copious amounts of data are demanding and hold great promise. Here we present FUn, a framework consisting of a client (Faerun) and server (Underdark) module, facilitating the creation of web-based, interactive 3D visualizations of large datasets and enabling record-level visual inspection. We also introduce a reference implementation providing access to SureChEMBL, a database containing patent information on more than 17 million chemical compounds. The source code and the most recent builds of Faerun and Underdark, Lore.js and the data preprocessing toolchain used in the reference implementation, are available on the project website (http://doc.gdb.tools/fun/). daniel.probst@dcb.unibe.ch or jean-louis.reymond@dcb.unibe.ch.
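
    A minimal sketch of the kind of compact binary serialization that makes record-level WebGL rendering of millions of points feasible; the 16-byte point layout below is illustrative, not Underdark's actual wire format.

        import struct

        def pack_points(points):
            """Serialize (x, y, z, r, g, b) records into a compact binary layout
            a WebGL client can upload directly: three little-endian float32
            coordinates plus an RGBA byte quadruple, 16 bytes per point (the
            padding byte keeps the stride 4-byte aligned for typed-array views)."""
            buf = bytearray()
            for x, y, z, r, g, b in points:
                buf += struct.pack("<fff4B", x, y, z, r, g, b, 255)
            return bytes(buf)

        blob = pack_points([(0.0, 1.5, -2.0, 255, 0, 0),
                            (3.1, 0.2, 0.7, 0, 128, 255)])
        print(len(blob))  # 16 bytes per point -> 32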

  19. Solar cells and modules from dendritic web silicon

    NASA Technical Reports Server (NTRS)

    Campbell, R. B.; Rohatgi, A.; Seman, E. J.; Davis, J. R.; Rai-Choudhury, P.; Gallagher, B. D.

    1980-01-01

    Some of the noteworthy features of the processes developed in the fabrication of solar cell modules are the handling of long lengths of web, the use of cost-effective dip coating of photoresist and antireflection coatings, selective electroplating of the grid pattern, and ultrasonic bonding of the cell interconnect. Data on the cells are obtained by means of dark I-V analysis and deep-level transient spectroscopy. A histogram of over 100 dendritic web solar cells fabricated in a number of runs using different web crystals shows an average efficiency of over 13%, with some efficiencies running above 15%. Lower cell efficiency is generally associated with low minority carrier lifetime due to recombination centers sometimes present in the bulk silicon. A cost analysis of the process sequence using a 25 MW production line indicates a selling price of $0.75/peak watt in 1986. It is concluded that the efficiency of dendritic web cells approaches that of float-zone silicon cells, reduced somewhat by the lower bulk lifetime of the former.

  20. Sprag Handle Wrenches

    NASA Technical Reports Server (NTRS)

    Vranish, John M.

    2010-01-01

    Sprag handle wrenches have been proposed for general applications in which conventional pawl-and-ratchet wrenches and sprag and cam "clickless" wrenches are now used. Sprag handle wrenches are so named because they would include components that would function both as parts of handles and as sprags (roller locking/unlocking components). In comparison with all of the aforementioned conventional wrenches, properly designed sprag handle wrenches could operate with much less backlash; in comparison with the conventional clickless wrenches, sprag handle wrenches could be stronger and less expensive (because the sprags would be larger and more easily controllable than are conventional sprags and cams).

  1. Transportation and handling loads

    NASA Technical Reports Server (NTRS)

    Ostrem, F. E.

    1971-01-01

    Criteria and recommended practices are presented for the prediction and verification of transportation and handling loads for the space vehicle structure and for monitoring these loads during transportation and handling of the vehicle or major vehicle segments. Elements of the transportation and handling systems, and the forcing functions and associated loads are described. The forcing functions for common carriers and typical handling devices are assessed, and emphasis is given to the assessment of loads at the points where the space vehicle is supported during transportation and handling. Factors which must be considered when predicting the loads include the transportation and handling medium; type of handling fixture; transport vehicle speed; types of terrain; weather (changes in pressure or temperature, wind, etc.); and dynamics of the transportation modes or handling devices (acceleration, deceleration, and rotations of the transporter or handling device).

  2. GoWeb: a semantic search engine for the life science web.

    PubMed

    Dietze, Heiko; Schroeder, Michael

    2009-10-01

    Current search engines are keyword-based. Semantic technologies promise a next generation of semantic search engines, which will be able to answer questions. Current approaches either apply natural language processing to unstructured text or they assume the existence of structured statements over which they can reason. Here, we introduce a third approach, GoWeb, which combines classical keyword-based Web search with text-mining and ontologies to navigate large results sets and facilitate question answering. We evaluate GoWeb on three benchmarks of questions on genes and functions, on symptoms and diseases, and on proteins and diseases. The first benchmark is based on the BioCreAtivE 1 Task 2 and links 457 gene names with 1352 functions. GoWeb finds 58% of the functional GeneOntology annotations. The second benchmark is based on 26 case reports and links symptoms with diseases. GoWeb achieves 77% success rate improving an existing approach by nearly 20%. The third benchmark is based on 28 questions in the TREC genomics challenge and links proteins to diseases. GoWeb achieves a success rate of 79%. GoWeb's combination of classical Web search with text-mining and ontologies is a first step towards answering questions in the biomedical domain. GoWeb is online at: http://www.gopubmed.org/goweb.
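
    A toy sketch of the keyword-plus-ontology idea: tag each search-result snippet with ontology terms via a synonym table, so result sets can be grouped and filtered semantically rather than only by keyword. The two-term ontology fragment is illustrative; GoWeb itself draws on resources such as the Gene Ontology.

        # A toy ontology fragment: term -> synonyms (illustrative).
        ONTOLOGY = {
            "GO:0006915 apoptosis": ["apoptosis", "programmed cell death"],
            "GO:0008283 cell proliferation": ["cell proliferation"],
        }

        def annotate(snippets):
            """Attach ontology terms to each search-result snippet so large
            result sets can be navigated and filtered semantically."""
            tagged = []
            for text in snippets:
                low = text.lower()
                terms = [t for t, syns in ONTOLOGY.items()
                         if any(s in low for s in syns)]
                tagged.append((text, terms))
            return tagged

        hits = annotate(["TP53 triggers programmed cell death in tumour cells.",
                         "EGF stimulates cell proliferation."])
        for text, terms in hits:
            print(terms, "<-", text[:40])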

  3. WebEAV: automatic metadata-driven generation of web interfaces to entity-attribute-value databases.

    PubMed

    Nadkarni, P M; Brandt, C M; Marenco, L

    2000-01-01

    The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples.

  4. Development, implementation and pilot evaluation of a Web-based Virtual Patient Case Simulation environment--Web-SP.

    PubMed

    Zary, Nabil; Johnson, Gunilla; Boberg, Jonas; Fors, Uno G H

    2006-02-21

    The Web-based Simulation of Patients (Web-SP) project was initiated in order to facilitate the use of realistic and interactive virtual patients (VP) in medicine and healthcare education. Web-SP focuses on moving beyond the technology-savvy teachers, when integrating simulation-based education into health sciences curricula, by making the creation and use of virtual patients easier. The project strives to provide a common generic platform for design/creation, management, evaluation and sharing of web-based virtual patients. The aim of this study was to evaluate whether it was possible to develop a web-based virtual patient case simulation environment where the entire case authoring process might be handled by teachers, and which would be flexible enough to be used in different healthcare disciplines. The Web-SP system was constructed to support easy authoring, management and presentation of virtual patient cases. The case authoring environment was found to make it easy for teachers to create full-fledged patient cases without the assistance of computer specialists. Web-SP was successfully implemented at several universities by taking into account key factors such as cost, access, security, scalability and flexibility. Pilot evaluations in medical, dentistry and pharmacy courses show that students regarded Web-SP as easy to use, engaging and of educational value. Cases adapted for all three disciplines were judged to be of significant educational value by the course leaders. The Web-SP system seems to fulfil the aim of providing a common generic platform for creation, management and evaluation of web-based virtual patient cases. The responses regarding the authoring environment indicated that the system might be user-friendly enough to appeal to a majority of the academic staff. In terms of implementation strengths, Web-SP seems to fulfil most needs from course directors and teachers from various educational institutions and disciplines. The system is currently in

  5. Modelling of Tethered Space-Web Structures

    NASA Astrophysics Data System (ADS)

    McKenzie, D. J.; Cartmell, M. P.

    Large structures in space are an essential milestone on the path of many projects, from solar power collectors to space stations. In space, as on Earth, these large projects may be split into more manageable sections, dividing the task into multiple replicable parts. Specially constructed spider robots could assemble these structures piece by piece over a membrane or space-web, giving a method for building a structure while on orbit. The modelling and applications of these space-webs are discussed, along with the derivation of the equations of motion of the structure. Preliminary results from the solution of these equations show that space-webs can take a variety of different forms, and give some guidelines for configuring the space-web system.

  6. Photosynthesis and the web: 2001.

    PubMed

    Orr, L

    2001-01-01

    First, a brief history of the Internet and the World Wide Web is presented. This is followed by relevant information on photosynthesis-related web sites grouped into several categories: (1) large group sites, (2) comprehensive overview sites, (3) specific subject sites, (4) individual researcher sites, (5) kindergarten through high school (K-12) educational sites, (6) books and journals, and (7) other useful sites. A section on searching the Web is also included. Finally, we have included an appendix with all of the web sites discussed herein, as well as other web sites that space did not allow us to include. Readers are requested to send comments, corrections and additions to gov@uiuc.edu.

  7. 78 FR 23673 - Marketing Order Regulating the Handling of Spearmint Oil Produced in the Far West; Revision of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-22

    ... following Web site: http://www.ams.usda.gov/MarketingOrdersSmallBusinessGuide ; or by contacting Jeffrey... DEPARTMENT OF AGRICULTURE Agricultural Marketing Service 7 CFR Part 985 [Doc. Nos. AMS-FV-11-0088; FV12-985-1A FIR] Marketing Order Regulating the Handling of Spearmint Oil Produced in the Far West...

  8. Rapid EHR development and implementation using web and cloud-based architecture in a large home health and hospice organization.

    PubMed

    Weaver, Charlotte A; Teenier, Pamela

    2014-01-01

    Health care organizations have long been limited to a small number of major vendors in their selection of an electronic health record (EHR) system in the national and international marketplace. These major EHR vendors have in common base systems that are decades old, are built in antiquated programming languages, use outdated server architecture, and are based on inflexible data models [1,2]. The option to upgrade their technology to keep pace with the power of new web-based architecture, programming tools and cloud servers is not easily undertaken due to large client bases, development costs and risk [3]. This paper presents the decade-long efforts of a large national provider of home health and hospice care first to select an EHR product, then, failing that, to build its own, and, when that initiative also failed, to go back into the market in 2012. The decade of delay had allowed new technologies and more nimble vendors to enter the market. Partnering with a start-up company building web- and cloud-based architecture for the home health and hospice market made it possible to build, test and implement an operational and point-of-care system in 264 home health locations across 40 states and three time zones in the United States. This option of "starting over" with the new web and cloud technologies may be ushering in a next generation of EHR vendors, retelling in healthcare the story of the BlackBerry's replacement by the iPhone.

  9. World Wide Web Metaphors for Search Mission Data

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey S.; Wallick, Michael N.; Joswig, Joseph C.; Powell, Mark W.; Torres, Recaredo J.; Mittman, David S.; Abramyan, Lucy; Crockett, Thomas M.; Shams, Khawaja S.; Fox, Jason M.

    2010-01-01

    A software program that searches and browses mission data emulates a Web browser, containing standard metaphors for Web browsing. By taking advantage of back-end URLs, users may save and share search states. Also, since a Web interface is familiar to users, training time is reduced. Familiar back and forward buttons move through a local search history. A refresh/reload button regenerates a query, and loads in any new data. URLs can be constructed to save search results. Adding context to the current search is also handled through a familiar Web metaphor. The query is constructed by clicking on hyperlinks that represent new components to the search query. The selection of a link appears to the user as a page change; the choice of links changes to represent the updated search and the results are filtered by the new criteria. Selecting a navigation link changes the current query and also the URL that is associated with it. The back button can be used to return to the previous search state. This software is part of the MSLICE release, which was written in Java. It will run on any current Windows, Macintosh, or Linux system.

  10. Development, implementation and pilot evaluation of a Web-based Virtual Patient Case Simulation environment – Web-SP

    PubMed Central

    Zary, Nabil; Johnson, Gunilla; Boberg, Jonas; Fors, Uno GH

    2006-01-01

    Background The Web-based Simulation of Patients (Web-SP) project was initiated in order to facilitate the use of realistic and interactive virtual patients (VP) in medicine and healthcare education. Web-SP focuses on moving beyond the technology-savvy teachers when integrating simulation-based education into health sciences curricula, by making the creation and use of virtual patients easier. The project strives to provide a common generic platform for design/creation, management, evaluation and sharing of web-based virtual patients. The aim of this study was to evaluate whether it was possible to develop a web-based virtual patient case simulation environment in which the entire case authoring process could be handled by teachers, and which would be flexible enough to be used in different healthcare disciplines. Results The Web-SP system was constructed to support easy authoring, management and presentation of virtual patient cases. The case authoring environment was found to enable teachers to create full-fledged patient cases without the assistance of computer specialists. Web-SP was successfully implemented at several universities by taking into account key factors such as cost, access, security, scalability and flexibility. Pilot evaluations in medical, dentistry and pharmacy courses show that students regarded Web-SP as easy to use, engaging and of educational value. Cases adapted for all three disciplines were judged to be of significant educational value by the course leaders. Conclusion The Web-SP system seems to fulfil the aim of providing a common generic platform for creation, management and evaluation of web-based virtual patient cases. The responses regarding the authoring environment indicated that the system might be user-friendly enough to appeal to a majority of the academic staff. In terms of implementation strengths, Web-SP seems to fulfil most needs of course directors and teachers from various educational institutions and disciplines

  11. Handling and restraint.

    PubMed

    Donovan, John; Brown, Patricia

    2006-07-01

    For the safety of the handler and the animal, proper methods for handling and restraining laboratory animals should be followed. Improper handling can result in increased stress and injury to the animal. In addition, the handler risks injury from bite wounds or scratches inflicted when the animal becomes fearful or anxious. By using sure, direct movements with a determined attitude, the animal can be easily handled and restrained. Animals can be restrained either manually or in a plastic restrainer. The protocols in this unit describe handling and manual restraint of mice, rats, hamsters, and rabbits. Alternate protocols describe restraint using the plastic restrainer.

  12. Handling and restraint.

    PubMed

    Donovan, John; Brown, Patricia

    2004-09-01

    For the safety of the handler and the animal, proper methods for handling and restraining laboratory animals should be followed. Improper handling can result in increased stress and injury to the animal. In addition, the handler risks injury from bite wounds or scratches inflicted when the animal becomes fearful or anxious. By using sure, direct movements with a determined attitude, the animal can be easily handled and restrained. Animals can be restrained either manually or in a plastic restrainer. The protocols in this unit describe handling and manual restraint of mice, rats, hamsters, and rabbits. Alternate protocols describe restraint using the plastic restrainer.

  13. BEAM web server: a tool for structural RNA motif discovery.

    PubMed

    Pietrosanto, Marco; Adinolfi, Marta; Casula, Riccardo; Ausiello, Gabriele; Ferrè, Fabrizio; Helmer-Citterich, Manuela

    2018-03-15

    RNA structural motif finding is a relevant problem that becomes computationally hard when working on high-throughput data (e.g. eCLIP, PAR-CLIP), often represented by thousands of RNA molecules. Currently, the BEAM server is the only web tool capable of handling tens of thousands of RNAs in input with a motif discovery procedure that is limited only by current secondary structure prediction accuracies. The recently developed method BEAM (BEAr Motifs finder) can analyze tens of thousands of RNA molecules and identify RNA secondary structure motifs associated with a measure of their statistical significance. BEAM is extremely fast thanks to the BEAR encoding, which transforms each RNA secondary structure into a string of characters. BEAM also exploits the evolutionary knowledge contained in a substitution matrix of secondary structure elements, extracted from the RFAM database of families of homologous RNAs. The BEAM web server has been designed to streamline data pre-processing by automatically handling folding and encoding of RNA sequences, giving users a choice of the preferred folding program. The server provides an intuitive and informative results page with the list of secondary structure motifs identified, the logo of each motif, its significance, a graphic representation and information about its position in the RNA molecules sharing it. The web server is freely available at http://beam.uniroma2.it/ and is implemented in NodeJS and Python, with all major browsers supported. Contact: marco.pietrosanto@uniroma2.it. Supplementary data are available at Bioinformatics online.
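
    To make the encoding idea above concrete, here is a toy sketch (Python) of turning a dot-bracket secondary structure into a string over a small structural alphabet, so that motif search reduces to fast string matching. The character classes are purely illustrative; the real BEAR alphabet is much richer and also encodes element types and lengths.

        def encode_structure(dot_bracket: str) -> str:
            """Map each position of a dot-bracket string to a coarse structural class."""
            encoded = []
            for i, c in enumerate(dot_bracket):
                if c == "(":
                    encoded.append("a")  # paired base, opening strand of a stem
                elif c == ")":
                    encoded.append("b")  # paired base, closing strand of a stem
                else:
                    # unpaired: treat as a loop if enclosed by a pair, else external
                    enclosed = (dot_bracket.rfind("(", 0, i) != -1
                                and dot_bracket.find(")", i) != -1)
                    encoded.append("h" if enclosed else "e")
            return "".join(encoded)

        print(encode_structure("((((...)))).."))  # -> aaaahhhbbbbee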

  14. Large-area sheet task advanced dendritic web growth development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.

    1983-01-01

    Modeling in the development of low stress configurations for wide web growth is presented. Parametric sensitivity to identify design features which can be used for dynamic trimming of the furnace element was studied. Temperature measurements of experimental growth behavior led to modification in the growth system to improve lateral temperature distributions.

  15. Accounting Data to Web Interface Using PERL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hargeaves, C

    2001-08-13

    This document will explain the process to create a web interface for the accounting information generated by the High Performance Storage Systems (HPSS) accounting report feature. The accounting report contains useful data but it is not easily accessed in a meaningful way. The accounting report is the only way to see summarized storage usage information. The first step is to take the accounting data, make it meaningful and store the modified data in persistent databases. The second step is to generate the various user interfaces, HTML pages, that will be used to access the data. The third step is to transfer all required files to the web server. The web pages pass parameters to Common Gateway Interface (CGI) scripts that generate dynamic web pages and graphs. The end result is a web page with specific information presented in text with or without graphs. The accounting report has a specific format that allows the use of regular expressions to verify if a line is storage data. Each storage data line is stored in a detailed database file with a name that includes the run date. The detailed database is used to create a summarized database file that also uses run date in its name. The summarized database is used to create the group.html web page that includes a list of all storage users. Scripts that query the database folder to build a list of available databases generate two additional web pages. A master script that is run monthly as part of a cron job, after the accounting report has completed, manages all of these individual scripts. All scripts are written in the PERL programming language. Whenever possible data manipulation scripts are written as filters. All scripts are written to be single source, which means they will function properly on both the open and closed networks at LLNL. The master script handles the command line inputs for all scripts, file transfers to the web server and records run information in a log file. The rest of the scripts
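
    The record above describes the scripts as regular-expression-driven filters; the sketch below illustrates that parsing step in Python rather than the PERL of the original work, and the line layout it assumes (account name, file count, byte count) is hypothetical, since the HPSS report format is not given here.

        import re
        import sys

        # Accept only lines that look like storage data; headers and footers fail the match.
        STORAGE_LINE = re.compile(r"^\s*(\w+)\s+(\d+)\s+(\d+)\s*$")

        for line in sys.stdin:  # written as a filter: report on stdin, CSV on stdout
            m = STORAGE_LINE.match(line)
            if m:
                account, files, nbytes = m.groups()
                print(f"{account},{files},{nbytes}")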

  16. Ride quality sensitivity to SAS control law and to handling quality variations

    NASA Technical Reports Server (NTRS)

    Roberts, P. A.; Schmidt, D. K.; Swaim, R. L.

    1976-01-01

    The RQ trends which large flexible aircraft exhibit under various parameterizations of control laws and handling qualities are discussed. A summary of the assumptions and solution technique, a control law parameterization review, a discussion of ride sensitivity to handling qualities, and the RQ effects generated by implementing relaxed static stability configurations are included.

  17. MilxXplore: a web-based system to explore large imaging datasets

    PubMed Central

    Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J

    2013-01-01

    Objective As large-scale medical imaging studies are becoming more common, there is an increasing reliance on automated software to extract quantitative information from these images. As the size of the cohorts keeps increasing with large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. Materials and methods MilxXplore is an open source visualization platform, which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user friendly, collaborative and efficient way. Discussion Compared to existing software solutions that often provide an overview of the results at the subject's level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparing the results against the rest of the population. Conclusions MilxXplore is fast, flexible and allows remote quality checks of processed imaging data, facilitating data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important to share and publish results of imaging analysis. PMID:23775173

  18. MilxXplore: a web-based system to explore large imaging datasets.

    PubMed

    Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J

    2013-01-01

    As large-scale medical imaging studies are becoming more common, there is an increasing reliance on automated software to extract quantitative information from these images. As the size of the cohorts keeps increasing with large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. MilxXplore is an open source visualization platform, which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user friendly, collaborative and efficient way. Compared to existing software solutions that often provide an overview of the results at the subject's level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparing the results against the rest of the population. MilxXplore is fast, flexible and allows remote quality checks of processed imaging data, facilitating data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important to share and publish results of imaging analysis.

  19. Ergonomics and comfort in lawn mower handle positioning: An evaluation of handle geometry.

    PubMed

    Lowndes, Bethany R; Heald, Elizabeth A; Hallbeck, M Susan

    2015-11-01

    Hand operation accompanied by any combination of large forces, awkward positions and repetition may lead to upper limb injury or illness, and may be exacerbated by vibration. Commercial lawn mowers expose operators to these factors during actuation of hand controls and are therefore a potential health concern. A nontraditional lawn mower control system may decrease upper limb illnesses and injuries through more neutral hand and body positioning. This study compared maximum grip strength in twelve different orientations (3 grip spans and 4 positions) and evaluated self-described comfortable handle positions. The results showed force differences between the nontraditional (X) and both the vertical (V) and pistol (P) positions (p < 0.0001) and among the different grip spans (p < 0.0001). Based on these results, recommended designs should incorporate a tilt between 45 and 70°, handle rotations between 48 and 78°, and reduced force requirements or decreased grip spans to improve user health and comfort. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  20. Semantic Web-Driven LMS Architecture towards a Holistic Learning Process Model Focused on Personalization

    ERIC Educational Resources Information Center

    Kerkiri, Tania

    2010-01-01

    A comprehensive presentation is here made on the modular architecture of an e-learning platform with a distinctive emphasis on content personalization, combining advantages from semantic web technology, collaborative filtering and recommendation systems. Modules of this architecture handle information about both the domain-specific didactic…

  1. LACIE data-handling techniques

    NASA Technical Reports Server (NTRS)

    Waits, G. H. (Principal Investigator)

    1979-01-01

    Techniques implemented to facilitate processing of LANDSAT multispectral data between 1975 and 1978 are described. The data that were handled during the large area crop inventory experiment and the storage mechanisms used for the various types of data are defined. The overall data flow, from the placing of the LANDSAT orders through the actual analysis of the data set, is discussed. An overview is provided of the status and tracking system that was developed and of the data base maintenance and operational task. The archiving of the LACIE data is explained.

  2. NaviCell: a web-based environment for navigation, curation and maintenance of large molecular interaction maps

    PubMed Central

    2013-01-01

    Background Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. There exist an increasing number of maps of molecular interactions containing detailed and step-wise descriptions of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content and to maintain them. Several efforts were recently made to combine these capabilities together in one environment, and NaviCell is one of them. Results NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of detail or of abstraction of the map and (3) an integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of detailed molecular mechanisms represented on the map and a more abstract view of the map up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating of the comprehensive maps of molecular interactions in an interactive and user-friendly fashion thanks to an embedded blogging system. Conclusions NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming, which is used for navigating geographical maps, is adopted for molecular maps in NaviCell, making any level of visualization readable. In addition, NaviCell provides a framework for community-based curation of maps

  3. NaviCell: a web-based environment for navigation, curation and maintenance of large molecular interaction maps.

    PubMed

    Kuperstein, Inna; Cohen, David P A; Pook, Stuart; Viara, Eric; Calzone, Laurence; Barillot, Emmanuel; Zinovyev, Andrei

    2013-10-07

    Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. There exist an increasing number of maps of molecular interactions containing detailed and step-wise descriptions of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content and to maintain them. Several efforts were recently made to combine these capabilities together in one environment, and NaviCell is one of them. NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of detail or of abstraction of the map and (3) an integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of detailed molecular mechanisms represented on the map and a more abstract view of the map up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating of the comprehensive maps of molecular interactions in an interactive and user-friendly fashion thanks to an embedded blogging system. NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming, which is used for navigating geographical maps, is adopted for molecular maps in NaviCell, making any level of visualization readable. In addition, NaviCell provides a framework for community-based curation of maps.

  4. Compression-based aggregation model for medical web services.

    PubMed

    Al-Shammary, Dhiah; Khalil, Ibrahim

    2010-01-01

    Many organizations, such as hospitals, have adopted cloud Web services to deliver their network services without investing heavily in computing infrastructure. SOAP (Simple Object Access Protocol), an XML-based protocol, is the basic communication protocol of cloud Web services. Web services often suffer congestion and bottlenecks as a result of the high network traffic caused by the large XML overhead; at the same time, the massive load on cloud Web services, in terms of the large volume of client requests, has compounded the problem. In this paper, two XML-aware aggregation techniques based on compression concepts are proposed in order to aggregate medical Web messages and achieve greater message size reduction.

  5. Occupational health and safety aspects of animal handling in dairy production.

    PubMed

    Lindahl, Cecilia; Lundqvist, Peter; Hagevoort, G Robert; Lunner Kolstrup, Christina; Douphrate, David I; Pinzke, Stefan; Grandin, Temple

    2013-01-01

    Livestock handling in dairy production is associated with a number of health and safety issues. A large number of fatal and nonfatal injuries still occur when handling livestock. The many animal handling tasks on a dairy farm include moving cattle between different locations, vaccination, administration of medication, hoof care, artificial insemination, ear tagging, milking, and loading onto trucks. There are particular problems with bulls, which continue to cause considerable numbers of injuries and fatalities in dairy production. In order to reduce the number of injuries during animal handling on dairy farms, it is important to understand the key factors in human-animal interactions. These include handler attitudes and behavior, animal behavior, and fear in cows. Care when in close proximity to the animal is the key for safe handling, including knowledge of the flight zone, and use of the right types of tools and suitable restraint equipment. Thus, in order to create safe working conditions during livestock handling, it is important to provide handlers with adequate training and to establish sound safety management procedures on the farm.

  6. Development of Handling Qualities Criteria for Rotorcraft with Externally Slung Loads

    NASA Technical Reports Server (NTRS)

    Hoh, Roger H.; Heffley, Robert K.; Mitchell, David G.

    2006-01-01

    Piloted simulations were performed on the NASA-Ames Vertical Motion Simulator (VMS) to explore handling qualities issues for large cargo helicopters, particularly focusing on external slung load operations. This work was motivated by the need to include handling qualities criteria for cargo helicopters in an upgrade to the U.S. Army's rotorcraft handling qualities specification, Aeronautical Design Standard-33 (ADS-33E-PRF). From the VMS results, handling qualities criteria were developed for cargo helicopters carrying external slung loads in the degraded visual environment (DVE). If satisfied, these criteria provide assurance that the handling quality rating (HQR) will be 4 or better for operations in the DVE with a load mass ratio of 0.33 or less. For lighter loads, flying qualities were found to be less dependent on the load geometry, and the significance of the criteria is therefore smaller. For heavier loads, meeting the criteria ensures the best possible handling qualities, albeit Level 2 for load mass ratios greater than 0.33.

  7. Secure web-based invocation of large-scale plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Dimitrov, D. A.; Busby, R.; Exby, J.; Bruhwiler, D. L.; Cary, J. R.

    2004-12-01

    We present our design and initial implementation of a web-based system for running, both in parallel and serial, Particle-In-Cell (PIC) codes for plasma simulations with automatic post processing and generation of visual diagnostics.

  8. Handling a Large Collection of PDF Documents

    EPA Pesticide Factsheets

    You have several options for making a large collection of PDF documents more accessible to your audience: avoid uploading altogether, use multiple document pages, and use document IDs as anchors for direct links within a document page.

  9. WImpiBLAST: web interface for mpiBLAST to help biologists perform large-scale annotation using high performance computing.

    PubMed

    Sharma, Parichit; Mantri, Shrikant S

    2014-01-01

    The function of a newly sequenced gene can be discovered by determining its sequence homology with known proteins. BLAST is the most extensively used sequence analysis program for sequence similarity search in large databases of sequences. With the advent of next generation sequencing technologies it has now become possible to study genes and their expression at a genome-wide scale through RNA-seq and metagenome sequencing experiments. Functional annotation of all the genes is done by sequence similarity search against multiple protein databases. This annotation task is computationally very intensive and can take days to obtain complete results. The program mpiBLAST, an open-source parallelization of BLAST that achieves superlinear speedup, can be used to accelerate large-scale annotation by using supercomputers and high performance computing (HPC) clusters. Although many parallel bioinformatics applications using the Message Passing Interface (MPI) are available in the public domain, researchers are reluctant to use them due to lack of expertise in the Linux command line and relevant programming experience. With these limitations, it becomes difficult for biologists to use mpiBLAST for accelerating annotation. No web interface is available in the open-source domain for mpiBLAST. We have developed WImpiBLAST, a user-friendly open-source web interface for parallel BLAST searches. It is implemented in Struts 1.3 using a Java backbone and runs atop the open-source Apache Tomcat Server. WImpiBLAST supports script creation and job submission features and also provides a robust job management interface for system administrators. It combines script creation and modification features with job monitoring and management through the Torque resource manager on a Linux-based HPC cluster. Use case information highlights the acceleration of annotation analysis achieved by using WImpiBLAST. Here, we describe the WImpiBLAST web interface features and architecture, explain design
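
    As a rough illustration of the script-creation step described above, the sketch below emits a Torque/PBS job script around an mpiBLAST run, which would then be submitted with qsub. The queue name, resource request and file names are placeholders, not values from the paper.

        import textwrap

        # Generate a minimal Torque/PBS job script for an mpiBLAST search.
        job_script = textwrap.dedent("""\
            #!/bin/bash
            #PBS -N mpiblast_annotation
            #PBS -l nodes=4:ppn=8
            #PBS -q batch
            cd $PBS_O_WORKDIR
            mpirun -np 32 mpiblast -p blastx -d nr -i transcripts.fa -o hits.txt
            """)

        with open("mpiblast.pbs", "w") as fh:
            fh.write(job_script)
        # The web interface would then submit it, e.g.: qsub mpiblast.pbs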

  10. WImpiBLAST: Web Interface for mpiBLAST to Help Biologists Perform Large-Scale Annotation Using High Performance Computing

    PubMed Central

    Sharma, Parichit; Mantri, Shrikant S.

    2014-01-01

    The function of a newly sequenced gene can be discovered by determining its sequence homology with known proteins. BLAST is the most extensively used sequence analysis program for sequence similarity search in large databases of sequences. With the advent of next generation sequencing technologies it has now become possible to study genes and their expression at a genome-wide scale through RNA-seq and metagenome sequencing experiments. Functional annotation of all the genes is done by sequence similarity search against multiple protein databases. This annotation task is computationally very intensive and can take days to obtain complete results. The program mpiBLAST, an open-source parallelization of BLAST that achieves superlinear speedup, can be used to accelerate large-scale annotation by using supercomputers and high performance computing (HPC) clusters. Although many parallel bioinformatics applications using the Message Passing Interface (MPI) are available in the public domain, researchers are reluctant to use them due to lack of expertise in the Linux command line and relevant programming experience. With these limitations, it becomes difficult for biologists to use mpiBLAST for accelerating annotation. No web interface is available in the open-source domain for mpiBLAST. We have developed WImpiBLAST, a user-friendly open-source web interface for parallel BLAST searches. It is implemented in Struts 1.3 using a Java backbone and runs atop the open-source Apache Tomcat Server. WImpiBLAST supports script creation and job submission features and also provides a robust job management interface for system administrators. It combines script creation and modification features with job monitoring and management through the Torque resource manager on a Linux-based HPC cluster. Use case information highlights the acceleration of annotation analysis achieved by using WImpiBLAST. Here, we describe the WImpiBLAST web interface features and architecture, explain design

  11. Web-Based Designed Activities for Young People in Health Education: A Constructivist Approach

    ERIC Educational Resources Information Center

    Goldman, Juliette D. G.

    2006-01-01

    Modern Health Education in primary schools is increasingly using computer technologies in a variety of ways to enhance teaching and learning. Here, a Constructivist approach for a web-based educational activity for Grade 7 is discussed using an example of designing a healthy Food Handling Manual in the food industry. The Constructivist principles…

  12. CImbinator: a web-based tool for drug synergy analysis in small- and large-scale datasets.

    PubMed

    Flobak, Åsmund; Vazquez, Miguel; Lægreid, Astrid; Valencia, Alfonso

    2017-08-01

    Drug synergies are sought to identify combinations of drugs that are particularly beneficial. User-friendly software solutions that can assist analysis of large-scale datasets are required. CImbinator is a web service that can aid in batch-wise and in-depth analyses of data from small-scale and large-scale drug combination screens. CImbinator offers to quantify drug combination effects, using both the commonly employed median-effect equation and advanced experimental mathematical models describing dose-response relationships. CImbinator is written in Ruby and R. It uses the R package drc for advanced drug response modeling. CImbinator is available at http://cimbinator.bioinfo.cnio.es, the source code is open and available at https://github.com/Rbbt-Workflows/combination_index. A Docker image is also available at https://hub.docker.com/r/mikisvaz/rbbt-ci_mbinator/. Contact: asmund.flobak@ntnu.no or miguel.vazquez@cnio.es. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
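
    For reference, a minimal sketch of the median-effect calculation mentioned above, in the Chou-Talalay formulation: for a single drug, fa/fu = (D/Dm)^m, so the dose producing effect fa is Dx = Dm*(fa/(1-fa))^(1/m), and the combination index of two drugs at doses d1, d2 with joint effect fa is CI = d1/Dx1 + d2/Dx2 (CI < 1 suggests synergy, CI > 1 antagonism). The parameter values below are illustrative, not from the paper.

        def required_dose(fa: float, dm: float, m: float) -> float:
            """Single-drug dose needed for effect fraction fa (median-effect equation)."""
            return dm * (fa / (1 - fa)) ** (1 / m)

        def combination_index(d1, d2, fa, dm1, m1, dm2, m2):
            """Chou-Talalay combination index for two drugs at joint effect level fa."""
            return d1 / required_dose(fa, dm1, m1) + d2 / required_dose(fa, dm2, m2)

        # 50% effect reached with 2 + 3 dose units; single-agent Dm of 10 and 8.
        print(combination_index(2, 3, 0.5, dm1=10, m1=1.0, dm2=8, m2=1.2))  # -> 0.575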

  13. jsPsych: a JavaScript library for creating behavioral experiments in a Web browser.

    PubMed

    de Leeuw, Joshua R

    2015-03-01

    Online experiments are growing in popularity, and the increasing sophistication of Web technology has made it possible to run complex behavioral experiments online using only a Web browser. Unlike with offline laboratory experiments, however, few tools exist to aid in the development of browser-based experiments. This makes the process of creating an experiment slow and challenging, particularly for researchers who lack a Web development background. This article introduces jsPsych, a JavaScript library for the development of Web-based experiments. jsPsych formalizes a way of describing experiments that is much simpler than writing the entire experiment from scratch. jsPsych then executes these descriptions automatically, handling the flow from one task to another. The jsPsych library is open-source and designed to be expanded by the research community. The project is available online at www.jspsych.org.

  14. Evaluation of trapping-web designs

    USGS Publications Warehouse

    Lukacs, P.M.; Anderson, D.R.; Burnham, K.P.

    2005-01-01

    The trapping web is a method for estimating the density and abundance of animal populations. A Monte Carlo simulation study is performed to explore performance of the trapping web for estimating animal density under a variety of web designs and animal behaviours. The trapping web performs well when animals have home ranges, even if the home ranges are large relative to trap spacing. Webs should contain at least 90 traps. Trapping should continue for 5-7 occasions. Movement rates have little impact on density estimates when animals are confined to home ranges. Estimation is poor when animals do not have home ranges and movement rates are rapid. The trapping web is useful for estimating the density of animals that are hard to detect and occur at potentially low densities. © CSIRO 2005.

  15. Recognition of pornographic web pages by classifying texts and images.

    PubMed

    Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve

    2007-06-01

    With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages.
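
    The fusion step lends itself to a short sketch: if the text and image classifiers are assumed conditionally independent given the page class, their individual posteriors combine through Bayes' theorem as below. This is a generic formulation for illustration; the paper's exact fusion rule may differ.

        def fuse(p_text: float, p_image: float, prior: float = 0.5) -> float:
            """Combine two classifiers' posteriors into one posterior probability."""
            odds = prior / (1 - prior)
            # Recover each classifier's likelihood ratio, then multiply them
            # under the conditional-independence assumption.
            lr_text = (p_text / (1 - p_text)) / odds
            lr_image = (p_image / (1 - p_image)) / odds
            post_odds = odds * lr_text * lr_image
            return post_odds / (1 + post_odds)

        print(fuse(0.9, 0.7))  # -> ~0.955: agreeing classifiers strengthen the verdict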

  16. Web-based education in anesthesiology: a critical overview.

    PubMed

    Doyle, D John

    2008-12-01

    The purpose of this review is to discuss the rise of web-based educational resources available to the anesthesiology community. Recent developments of particular importance include the growth of 'Web 2.0' resources, the development of the concepts of 'open access' and 'information philanthropy', and the expansion of web-based medical simulation software products. In addition, peer review of online educational resources has now come of age. The worldwide web has made available a large variety of valuable medical information and education resources only dreamed of two decades ago. To a large extent, these developments represent a shift in the focus of medical education resources to emphasize free access to materials and to encourage collaborative development efforts.

  17. Perceived comfort level of medical students and residents in handling clinical ethics issues.

    PubMed

    Silverman, Henry J; Dagenais, Julien; Gordon-Lipkin, Eliza; Caputo, Laura; Christian, Matthew W; Maidment, Bert W; Binstock, Anna; Oyalowo, Akinbowale; Moni, Malini

    2013-01-01

    Studies have shown that medical students and residents believe that their ethics preparation has been inadequate for handling ethical conflicts. The objective of this study was to determine the self-perceived comfort level of medical students and residents in confronting clinical ethics issues. Clinical medical students and residents at the University of Maryland School of Medicine completed a web-based survey between September 2009 and February 2010. The survey consisted of a demographic section, questions regarding the respondents' sense of comfort in handling a variety of clinical ethics issues, and a set of knowledge-type questions in ethics. Survey respondents included 129 medical students (response rate of 40.7%) and 207 residents (response rate of 52.7%). There were only a few clinical ethics issues with which more than 70% of the respondents felt comfortable in addressing. Only a slight majority (60.8%) felt prepared, in general, to handle clinical situations involving ethics issues, and only 44.1% and 53.2% agreed that medical school and residency training, respectively, helped prepare them to handle such issues. Prior ethics training was not associated with these responses, but there was an association between the level of training (medical students vs residents) and the comfort level with many of the clinical ethics issues. Medical educators should include ethics educational methods within the context of real-time exposure to medical ethics dilemmas experienced by physicians-in-training.

  18. Truke, a web tool to check for and handle excel misidentified gene symbols.

    PubMed

    Mallona, Izaskun; Peinado, Miguel A

    2017-03-21

    Genomic datasets accompanying scientific publications show a surprisingly high rate of gene name corruption. This error is generated when files and tables are imported into Microsoft Excel and certain gene symbols are automatically converted into dates. We have developed Truke, a flexible Web tool to detect, tag and fix, if possible, such misconversions. In addition, Truke is language and regional locale-aware, providing file format customization (decimal symbol, field separator, etc.) following the user's preferences. Truke is a data format conversion tool with a unique corrupted gene symbol detection utility. Truke is freely available without registration at http://maplab.cat/truke.
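
    The misconversion Truke targets is easy to reproduce: Excel silently turns symbols such as SEPT2 or MARCH1 into the dates "2-Sep" and "1-Mar". A minimal detection sketch is shown below; the pattern is illustrative only, since Truke itself is locale-aware and covers more date formats.

        import re

        # Cells of the form "<day>-<month abbreviation>" are suspect gene-symbol dates.
        EXCEL_DATE = re.compile(
            r"^\d{1,2}-(Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)$",
            re.IGNORECASE,
        )

        def looks_misconverted(cell: str) -> bool:
            return bool(EXCEL_DATE.match(cell.strip()))

        for cell in ["2-Sep", "MARCH1", "1-Mar", "TP53"]:
            print(cell, looks_misconverted(cell))  # True, False, True, False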

  19. Spider webs designed for rare but life-saving catches

    PubMed Central

    Venner, Samuel; Casas, Jérôme

    2005-01-01

    The impact of rare but positive events on the design of organisms has been largely ignored, probably due to the paucity of recordings of such events and to the difficulty of estimating their impact on lifetime reproductive success. In this respect, we investigated the size of spider webs in relation to rare but large prey catches. First, we collected field data on a short time-scale using the common orb-weaving spider Zygiella x-notata to determine the distribution of the size of prey caught and to quantify the relationship between web size and daily capture success. Second, we explored, with an energetic model, the consequences of an increase in web size on spider fitness. Our results showed that (i) the great majority of prey caught are quite small (body length less than 2 mm) while large prey (length greater than 10 mm) are rare, (ii) spiders cannot survive or produce eggs without catching these large but rare prey and (iii) increasing web size increases the daily number of prey caught and thus long-term survival and fecundity. Spider webs seem, therefore, designed for making the best of the rare but crucial event of catching large prey. PMID:16048774

  20. Improving Memory Error Handling Using Linux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlton, Michael Andrew; Blanchard, Sean P.; Debardeleben, Nathan A.

    As supercomputers continue to get faster and more powerful in the future, they will also have more nodes. If nothing is done, then the amount of memory in supercomputer clusters will soon grow large enough that memory failures will be unmanageable to deal with by manually replacing memory DIMMs. "Improving Memory Error Handling Using Linux" is a process-oriented method to solve this problem by using the Linux kernel to disable (offline) faulty memory pages containing bad addresses, preventing them from being used again by a process. The process of offlining memory pages simplifies error handling and results in reducing both hardware and manpower costs required to run Los Alamos National Laboratory (LANL) clusters. This process will be necessary for the future of supercomputing to allow the development of exascale computers. It will not be feasible without memory error handling to manually replace the number of DIMMs that will fail daily on a machine consisting of 32-128 petabytes of memory. Testing reveals the process of offlining memory pages works and is relatively simple to use. As more and more testing is conducted, the entire process will be automated within the high-performance computing (HPC) monitoring software, Zenoss, at LANL.
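
    On kernels built with memory-failure support, page offlining is exposed through sysfs; the sketch below soft-offlines a single page by physical address (requires root, and the address is a placeholder). Soft offlining asks the kernel to migrate the page's contents and stop handing that page out again.

        PHYS_ADDR = 0x2f54e000  # hypothetical address of a page reporting correctable errors

        # Writing a physical address to this file requests a soft offline of its page.
        with open("/sys/devices/system/memory/soft_offline_page", "w") as fh:
            fh.write(f"0x{PHYS_ADDR:x}")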

  1. Trust estimation of the semantic web using semantic web clustering

    NASA Astrophysics Data System (ADS)

    Shirgahi, Hossein; Mohsenzadeh, Mehran; Haj Seyyed Javadi, Hamid

    2017-05-01

    Development of the semantic web and social networks is undeniable in today's Internet world. The widespread nature of the semantic web makes assessing trust in this field very challenging, and in recent years extensive research has been done to estimate the trust of the semantic web. Since trust of the semantic web is a multidimensional problem, in this paper we used the parameters of social network authority, the value of page-link authority and semantic authority to assess trust. Due to the large space of the semantic network, we restricted the problem scope to clusters of semantic subnetworks, obtained the trust of each cluster's elements locally, and calculated the trust of outside resources according to their local trusts and the trust of the clusters in each other. According to the experimental results, the proposed method shows an F-score of more than 79%, which is on average about 11.9% higher than the Eigen, Tidal and centralised trust methods. The mean error of the proposed method is 12.936, which is on average 9.75% lower than the Eigen and Tidal trust methods.

  2. Refining the Use of the Web (and Web Search) as a Language Teaching and Learning Resource

    ERIC Educational Resources Information Center

    Wu, Shaoqun; Franken, Margaret; Witten, Ian H.

    2009-01-01

    The web is a potentially useful corpus for language study because it provides examples of language that are contextualized and authentic, and is large and easily searchable. However, web contents are heterogeneous in the extreme, uncontrolled and hence "dirty," and exhibit features different from the written and spoken texts in other linguistic…

  3. New Generation Sensor Web Enablement

    PubMed Central

    Bröring, Arne; Echterhoff, Johannes; Jirka, Simon; Simonis, Ingo; Everding, Thomas; Stasch, Christoph; Liang, Steve; Lemmens, Rob

    2011-01-01

    Many sensor networks have been deployed to monitor Earth’s environment, and more will follow in the future. Environmental sensors have improved continuously by becoming smaller, cheaper, and more intelligent. Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. The concept of the Sensor Web reflects such a kind of infrastructure for sharing, finding, and accessing sensors and their data across different applications. It hides the heterogeneous sensor hardware and communication protocols from the applications built on top of it. The Sensor Web Enablement initiative of the Open Geospatial Consortium standardizes web service interfaces and data encodings which can be used as building blocks for a Sensor Web. This article illustrates and analyzes the recent developments of the new generation of the Sensor Web Enablement specification framework. Further, we relate the Sensor Web to other emerging concepts such as the Web of Things and point out challenges and resulting future work topics for research on Sensor Web Enablement. PMID:22163760

  4. Using NetCloak to develop server-side Web-based experiments without writing CGI programs.

    PubMed

    Wolfe, Christopher R; Reyna, Valerie F

    2002-05-01

    Server-side experiments use the Web server, rather than the participant's browser, to handle tasks such as random assignment, eliminating inconsistencies with JAVA and other client-side applications. Heretofore, experimenters wishing to create server-side experiments have had to write programs to create common gateway interface (CGI) scripts in programming languages such as Perl and C++. NetCloak uses simple, HTML-like commands to create CGIs. We used NetCloak to implement an experiment on probability estimation. Measurements of time on task and participants' IP addresses assisted quality control. Without prior training, in less than 1 month, we were able to use NetCloak to design and create a Web-based experiment and to help graduate students create three Web-based experiments of their own.

  5. Applying Semantic Web Services and Wireless Sensor Networks for System Integration

    NASA Astrophysics Data System (ADS)

    Berkenbrock, Gian Ricardo; Hirata, Celso Massaki; de Oliveira Júnior, Frederico Guilherme Álvares; de Oliveira, José Maria Parente

    In environments like factories, buildings, and homes, automation services tend to change often during their lifetime. Changes concern business rules, process optimization, cost reduction, and so on. It is important to provide a smooth and straightforward way to deal with these changes so that they can be handled in a faster and lower-cost manner. Some prominent solutions use the flexibility of Wireless Sensor Networks and the meaningful descriptions of Semantic Web Services to provide service integration. In this work, we give an overview of current solutions for machinery integration that combine both technologies, as well as a discussion of some perspectives and open issues in applying Wireless Sensor Networks and Semantic Web Services to automation service integration.

  6. Effect of bit wear on hammer drill handle vibration and productivity.

    PubMed

    Antonucci, Andrea; Barr, Alan; Martin, Bernard; Rempel, David

    2017-08-01

    The use of large electric hammer drills exposes construction workers to high levels of hand vibration that may lead to hand-arm vibration syndrome and other musculoskeletal disorders. The aim of this laboratory study was to investigate the effect of bit wear on drill handle vibration and drilling productivity (e.g., drilling time per hole). A laboratory test bench system was used with an 8.3 kg electric hammer drill and a 1.9 cm concrete bit (a typical drill and bit used in commercial construction). The system automatically advanced the active drill into aged concrete block under feed force control to a depth of 7.6 cm while handle vibration was measured according to ISO standards (ISO 5349 and 28927). Bits were worn to 4 levels by consecutive hole drilling to 4 cumulative drilling depths: 0; 1,900; 5,700; and 7,600 cm. Z-axis handle vibration increased significantly (p < 0.05) from 4.8 to 5.1 m/s² (ISO weighted) and from 42.7 to 47.6 m/s² (unweighted) when comparing a new bit to a bit worn to 1,900 cm of cumulative drilling depth. Handle vibration did not increase further with bits worn beyond 1,900 cm of cumulative drilling depth. Neither x- nor y-axis handle vibration was affected by bit wear. The time to drill a hole increased by 58% for the bit with 5,700 cm of cumulative drilling depth compared to a new bit. Bit wear led to a small but significant increase in both ISO-weighted and unweighted z-axis handle vibration. Perhaps more important, bit wear had a large effect on productivity, which will influence a worker's allowable daily drilling time if exposure to drill handle vibration is near the ACGIH Threshold Limit Value [1]. Construction contractors should implement a bit replacement program based on these findings.
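
    To put the reported levels in exposure terms, ISO 5349-1 normalizes a vibration magnitude a to an 8-hour equivalent, A(8) = a·sqrt(T/8 h). The sketch below uses the z-axis weighted values as if they were the full three-axis total, and the 2.5 m/s² threshold is the EU exposure action value rather than the ACGIH TLV the article cites, so the numbers are illustrative only.

        from math import sqrt

        def a8(a: float, hours: float) -> float:
            """Energy-equivalent 8-hour exposure for vibration level a (m/s^2)."""
            return a * sqrt(hours / 8.0)

        for a in (4.8, 5.1):  # new bit vs. bit worn to 1,900 cm of drilling
            t_action = 8.0 * (2.5 / a) ** 2  # drilling hours until A(8) reaches 2.5 m/s^2
            print(f"a={a}: A(8) after 1 h = {a8(a, 1.0):.2f} m/s^2, "
                  f"action value reached after {t_action:.1f} h")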

  7. Adaptation of a web-based, open source electronic medical record system platform to support a large study of tuberculosis epidemiology

    PubMed Central

    2012-01-01

    Background In 2006, we were funded by the US National Institutes of Health to implement a study of tuberculosis epidemiology in Peru. The study required a secure information system to manage data from a target goal of 16,000 subjects who needed to be followed for at least one year. With previous experience in the development and deployment of web-based medical record systems for TB treatment in Peru, we chose to use the OpenMRS open source electronic medical record system platform to develop the study information system. Supported by a core technical and management team and a large and growing worldwide community, OpenMRS is now being used in more than 40 developing countries. We adapted the OpenMRS platform to better support foreign languages. We added a new module to support double data entry, linkage to an existing laboratory information system, automatic upload of GPS data from handheld devices, and better security and auditing of data changes. We added new reports for study managers, and developed data extraction tools for research staff and statisticians. Further adaptation to handle direct entry of laboratory data occurred after the study was launched. Results Data collection in the OpenMRS system began in September 2009. By August 2011 a total of 9,256 participants had been enrolled, 102,274 forms and 13,829 laboratory results had been entered, and there were 208 users. The system is now entirely supported by the Peruvian study staff and programmers. Conclusions The information system served the study objectives well despite requiring some significant adaptations mid-stream. OpenMRS has more tools and capabilities than it did in 2008, and requires less adaptations for future projects. OpenMRS can be an effective research data system in resource poor environments, especially for organizations using or considering it for clinical care as well as research. PMID:23131180

  8. Quantifying the effect of editor-author relations on manuscript handling times.

    PubMed

    Sarigöl, Emre; Garcia, David; Scholtes, Ingo; Schweitzer, Frank

    2017-01-01

    In this article we study to what extent the academic peer review process is influenced by social relations between the authors of a manuscript and the editor handling the manuscript. Taking the open access journal PLoS ONE as a case study, our analysis is based on a data set of more than 100,000 articles published between 2007 and 2015. Using available data on handling editor, submission and acceptance time of manuscripts, we study the question whether co-authorship relations between authors and the handling editor affect the manuscript handling time, i.e. the time taken between the submission and acceptance of a manuscript. Our analysis reveals (1) that editors handle papers co-authored by previous collaborators significantly more often than expected at random, and (2) that such prior co-author relations are significantly related to faster manuscript handling. Addressing the question whether these shorter manuscript handling times can be explained by the quality of publications, we study the number of citations and downloads which accepted papers eventually accumulate. Moreover, we consider the influence of additional (social) factors, such as the editor's experience, the topical similarity between authors and editors, as well as reciprocal citation relations between authors and editors. Our findings show that, even when correcting for other factors like time, experience, and performance, prior co-authorship relations have a large and significant influence on manuscript handling times, speeding up the editorial decision on average by 19 days.

  9. Web Services and Handle Infrastructure - WDCC's Contributions to International Projects

    NASA Astrophysics Data System (ADS)

    Föll, G.; Weigelt, T.; Kindermann, S.; Lautenschlager, M.; Toussaint, F.

    2012-04-01

    Climate science demands on data management are growing rapidly as climate models grow in the precision with which they depict spatial structures and in the completeness with which they describe a vast range of physical processes. The ExArch project is exploring the challenges of developing a software management infrastructure which will scale to the multi-exabyte archives of climate data which are likely to be crucial to major policy decisions by the end of the decade. The ExArch approach to the future integration of exascale climate archives is based, on the one hand, on a distributed web service architecture providing data analysis and quality control functionality across archives and, on the other, on a consistent persistent identifier infrastructure deployed to support distributed data management and data replication. Distributed data analysis functionality is based on the CDO climate data operators package. The CDO tool is used for processing of the archived data and metadata. CDO is a collection of command line Operators to manipulate and analyse Climate and forecast model Data. A range of formats is supported and over 500 operators are provided. CDO is presently designed to work in a scripting environment with local files. ExArch will extend the tool to support efficient usage in an exascale archive with distributed data and computational resources by providing flexible scheduling capabilities. Quality control will become increasingly important in an exascale computing context. Researchers will be dealing with millions of data files from multiple sources and will need to know whether the files satisfy a range of basic quality criteria. Hence ExArch will provide a flexible and extensible quality control system. The data will be held at more than 30 computing centres and data archives around the world, but for users it will appear as a single archive thanks to a standardized ExArch Web Processing Service. Data infrastructures such as the one built by ExArch can greatly
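
    CDO is a command-line tool, so a scheduler of the kind ExArch envisages would ultimately shell out to it; a minimal sketch follows. The file names are placeholders, and timmean is a standard CDO operator that averages a dataset over its time axis.

        import subprocess

        def time_mean(infile: str, outfile: str) -> None:
            """Run one CDO operator on a local file; a scheduler would queue many of these."""
            subprocess.run(["cdo", "timmean", infile, outfile], check=True)

        time_mean("tas_monthly.nc", "tas_climatology.nc")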

  10. Semantic Annotations and Querying of Web Data Sources

    NASA Astrophysics Data System (ADS)

    Hornung, Thomas; May, Wolfgang

    A large part of the Web, actually holding a significant portion of the useful information throughout the Web, consists of views on hidden databases, provided by numerous heterogeneous interfaces that are partly human-oriented via Web forms ("Deep Web"), and partly based on Web Services (only machine accessible). In this paper we present an approach for annotating these sources in a way that makes them citizens of the Semantic Web. We illustrate how queries can be stated in terms of the ontology, and how the annotations are used to select and access appropriate sources and to answer the queries.
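
    A minimal sketch of the idea, using the rdflib library: once a source's contents are annotated as RDF, a query can be stated against the shared vocabulary rather than the source's form interface. The vocabulary and data here are invented for illustration, not the paper's framework.

      # Illustrative only: invented ontology terms and toy data.
      from rdflib import Graph

      g = Graph()
      g.parse(data="""
      @prefix ex: <http://example.org/shop#> .
      ex:book1 ex:title "Semantic Web Primer" ; ex:price 42 .
      ex:book2 ex:title "Deep Web Basics" ; ex:price 19 .
      """, format="turtle")

      # The query is phrased in ontology terms, not per-source field names.
      q = ("PREFIX ex: <http://example.org/shop#> "
           "SELECT ?t ?p WHERE { ?b ex:title ?t ; ex:price ?p } ORDER BY ?p")
      for title, price in g.query(q):
          print(title, price)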

  11. Social Networking on the Semantic Web

    ERIC Educational Resources Information Center

    Finin, Tim; Ding, Li; Zhou, Lina; Joshi, Anupam

    2005-01-01

    Purpose: Aims to investigate the way that the semantic web is being used to represent and process social network information. Design/methodology/approach: The Swoogle semantic web search engine was used to construct several large data sets of Resource Description Framework (RDF) documents with social network information that were encoded using the…

  12. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry now have more than 15 years of history. Powered by advances in Internet technologies, the current generation of web systems is expanding into areas traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  13. A CityGML Extension for Handling Very Large Tins

    NASA Astrophysics Data System (ADS)

    Kumar, K.; Ledoux, H.; Stoter, J.

    2016-10-01

    In addition to buildings, the terrain forms an important part of a 3D city model. Although in GIS terrains are usually represented with 2D grids, TINs are also increasingly being used in practice. One example is 3DTOP10NL, the 3D city model covering the whole of the Netherlands, which stores the relief with a constrained TIN containing more than 1 billion triangles. Due to the massive size of such datasets, the main problem that arises is how to store and maintain them efficiently. While CityGML supports the storage of TINs, we argue in this paper that the current solution is not adequate. For instance, the 1 billion+ triangles of 3DTOP10NL require 686 GB of storage space with CityGML. Furthermore, the current solution does not store the topological relationships of the triangles, and there are no clear mechanisms to handle several LODs. We propose in this paper a CityGML extension for the compact representation of terrains. We describe our abstract and implementation specifications (modelled in UML), and our prototype implementation to convert TINs to our CityGML structure. It makes more of the topological relationships explicitly represented, and in our experiments with massive real-world terrains (more than 1 billion triangles) it compresses storage by a factor of up to ∼25.
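
    As a toy illustration of the storage idea (our sketch, not the extension's actual encoding), a TIN can share vertices and record triangle adjacency explicitly:

      # Toy TIN with shared vertices and explicit topological adjacency.
      vertices = [(0.0, 0.0, 1.2), (1.0, 0.0, 1.5), (0.0, 1.0, 1.1), (1.0, 1.0, 1.8)]

      # Each triangle: three vertex indices plus the indices of its three
      # neighbouring triangles (-1 where there is no neighbour across that edge).
      triangles = [
          {"v": (0, 1, 2), "adj": (1, -1, -1)},
          {"v": (1, 3, 2), "adj": (-1, 0, -1)},
      ]

      def neighbours(t: int):
          """Triangles sharing an edge with triangle t."""
          return [a for a in triangles[t]["adj"] if a != -1]

      print(neighbours(0))  # -> [1]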

  14. New weight-handling device for commercial oil pressure balances

    NASA Astrophysics Data System (ADS)

    Woo, S. Y.; Choi, I. M.; Kim, B. S.

    2005-12-01

    This paper presents a new device to automatically handle a large number of weights for the calibration of a pressure gauge. This newly invented weight-handling device is made for use in conjunction with a commercial oil pressure balance. Although the pressure balance is essential as a calibration tool, its use has been generally tedious and labour intensive for a long time. In particular, the process of loading a different combination of weights on the top of a piston requires repetitious manual handling for every new measurement. This inevitably leaves the operator fatigued, and sometimes causes damage to the weights due to careless handling. The newly invented automatic weight-handling device can eliminate such tedious, error-prone and wear-inducing manual weight manipulation. The device consists of a stepping motor, a drive belt, a solenoid valve, three weight-lifting assemblies and three linear-motion guide assemblies. The weight-lifting assembly is composed of a pneumatic actuator, a solid-state switch and a metal finger. It has many advantages compared with the commercial automatic weight-handling device. Firstly, it is not necessary to lift all the weights off the piston in the weight selection process, as it is in the case of the commercial device. Thus it can prevent a permanent deformation of the weight carrier. Secondly, this new device can handle a larger number of weights than the commercial one. This is because the new device adopts a different method in retaining the remaining weights in place. Another advantage of this new device is that there is no possibility of the fingers touching the surface of the weights due to the oscillation of weights. Moreover it uses the general technology of a stepping motor, and is also made up of components that are easily obtainable in the market, thereby being very economical.

  15. Data handling and representation of freeform surfaces

    NASA Astrophysics Data System (ADS)

    Steinkopf, Ralf; Dick, Lars; Kopf, Tino; Gebhardt, Andreas; Risse, Stefan; Eberhardt, Ramona

    2011-10-01

    Freeform surfaces enable innovative optics. They are not limited by axis symmetry and hence they are almost free in design. They are used to reduce the installation space and enhance the performance of optical elements. State-of-the-art optical design tools use powerful algorithms to simulate freeform surfaces. Even new mathematical approaches are under development /1/. In consequence, new optical designs /2/ are pushing the development of manufacturing processes, and novel types of datasets have to proceed through the process chain /3/. The complexity of these data poses a huge challenge for data handling. Because of the asymmetrical, 3-dimensional surfaces of freeforms, large data volumes have to be created, trimmed, extended and fitted. All these processes must be performed without losing the accuracy of the original design data. Additionally, manifold types of geometry result in different mathematical representations of freeform surfaces, and the CAD/CAM tools used deal with a set of spatial transport formats. These are all reasons why manufacture-oriented approaches to freeform data handling are not yet sufficiently developed. This paper suggests a classification of freeform surfaces based on the manufacturing methods offered by diamond machining. The different manufacturing technologies, ranging from servo-turning to shaping, require a differentiated approach to the data handling process. The use of analytical descriptions in the form of splines and polynomials, as well as discrete descriptions such as point clouds, is shown in relation to this classification. Advantages and disadvantages of freeform representations are discussed. Aspects of data handling between different process steps are pointed out and suitable exchange formats for freeform data are proposed. The described approach offers the possibility for efficient data handling from optical
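
    The contrast between analytical and discrete descriptions can be made concrete with a small sketch; the polynomial and its coefficients below are arbitrary illustrative choices, not a real optical design.

      # Analytical description (XY polynomial sag) vs. a discrete point cloud
      # sampled from it. Coefficients are invented for illustration.
      import numpy as np

      def sag(x, y, c=(0.0, 1e-3, 2e-3, -5e-4)):
          """z = c0 + c1*x^2 + c2*y^2 + c3*x*y  (a minimal XY polynomial)."""
          c0, c1, c2, c3 = c
          return c0 + c1 * x**2 + c2 * y**2 + c3 * x * y

      # Discrete representation: sample the analytical surface on a grid.
      xs, ys = np.meshgrid(np.linspace(-10, 10, 5), np.linspace(-10, 10, 5))
      point_cloud = np.column_stack([xs.ravel(), ys.ravel(), sag(xs, ys).ravel()])
      print(point_cloud.shape)  # (25, 3): x, y, z per point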

  16. The Internet as a research site: establishment of a web-based longitudinal study of the nursing and midwifery workforce in three countries.

    PubMed

    Huntington, Annette; Gilmour, Jean; Schluter, Philip; Tuckett, Anthony; Bogossian, Fiona; Turner, Catherine

    2009-06-01

    The aim of this paper is to describe the development of a web-based longitudinal research project, The Nurses and Midwives e-cohort Study. The Internet has only recently been used for health research. However, web-based methodologies are increasingly discussed as significant and inevitable developments in research as Internet access and use increase rapidly worldwide. In 2006, a longitudinal web-based study of nurses' and midwives' workforce participation patterns, health and wellbeing, and lifestyle choices was established. Participating countries are Australia, New Zealand and the United Kingdom. Data collection is handled through a dedicated website using a range of standardized tools combined into one comprehensive questionnaire. Internet-specific data collection and a range of recruitment and retention strategies have been developed for this study. Internet-based technology can support the maintenance of cohorts across multiple countries and jurisdictions to explore factors influencing workforce participation. However, barriers to widespread adoption of web-based approaches include website development costs, the need for fast broadband connections for large data collection instruments, and varying degrees of Internet and computer literacy in the nursing and midwifery workforce. Many of the issues reported in this paper are transitional in nature at a time of rapid technological development. The development of on-line methods and tools is a major and exciting development in the world of research. Research via the world-wide web can support international collaborations across borders and cultures.

  17. WebMeV | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Web MeV (Multiple-experiment Viewer) is a web/cloud-based tool for genomic data analysis. Web MeV is being built to meet the challenge of exploring large public genomic data sets with an intuitive graphical interface providing access to state-of-the-art analytical tools.

  18. CARRIER/CASK HANDLING SYSTEM DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E.F. Loros

    2000-06-23

    The Carrier/Cask Handling System receives casks on railcars and legal-weight trucks (LWTs) (transporters) that transport loaded casks and empty overpacks to the Monitored Geologic Repository (MGR) from the Carrier/Cask Transport System. Casks that come to the MGR on heavy-haul trucks (HHTs) are transferred onto railcars before being brought into the Carrier/Cask Handling System. The system is the interfacing system between the railcars and LWTs and the Assembly Transfer System (ATS) and Canister Transfer System (CTS). The Carrier/Cask Handling System removes loaded casks from the cask transporters and transfers the casks to a transfer cart for either the ATS or CTS, as appropriate, based on cask contents. The Carrier/Cask Handling System receives the returned empty casks from the ATS and CTS and mounts the casks back onto the transporters for reshipment. If necessary, the Carrier/Cask Handling System can also mount loaded casks back onto the transporters and remove empty casks from the transporters. The Carrier/Cask Handling System receives overpacks from the ATS loaded with canisters that have been cut open and emptied and mounts the overpacks back onto the transporters for disposal. If necessary, the Carrier/Cask Handling System can also mount empty overpacks back onto the transporters and remove loaded overpacks from them. The Carrier/Cask Handling System is located within the Carrier Bay of the Waste Handling Building System. The system consists of cranes, hoists, manipulators, and supporting equipment. The Carrier/Cask Handling System is designed with the tooling and fixtures necessary for handling a variety of casks. The Carrier/Cask Handling System performance and reliability are sufficient to support the shipping and emplacement schedules for the MGR. The Carrier/Cask Handling System interfaces with the Carrier/Cask Transport System, ATS, and CTS as noted above. The Carrier/Cask Handling System interfaces with the Waste Handling Building System for

  19. Flow Webs: Mechanism and Architecture for the Implementation of Sensor Webs

    NASA Astrophysics Data System (ADS)

    Gorlick, M. M.; Peng, G. S.; Gasster, S. D.; McAtee, M. D.

    2006-12-01

    The sensor web is a distributed, federated infrastructure much like its predecessors, the internet and the world wide web. It will be a federation of many sensor webs, large and small, under many distinct spans of control, that loosely cooperate and share information for many purposes. Realistically, it will grow piecemeal as distinct, individual systems are developed and deployed, some expressly built for a sensor web while many others were created for other purposes. Therefore, the architecture of the sensor web is of fundamental import, and architectural strictures that inhibit innovation, experimentation, sharing or scaling may prove fatal. Drawing upon the architectural lessons of the world wide web, we offer a novel system architecture, the flow web, that elevates flows, sequences of messages over a domain of interest and constrained in both time and space, to a position of primacy as a dynamic, real-time medium of information exchange for computational services. The flow web captures, in a single, uniform architectural style, the conflicting demands of the sensor web, including dynamic adaptations to changing conditions, ease of experimentation, rapid recovery from the failures of sensors and models, automated command and control, and incremental development and deployment, with integration at multiple levels—in many cases, at different times. Our conception of sensor webs—dynamic amalgamations of sensor webs each constructed within a flow web infrastructure—holds substantial promise for earth science missions in general, and for weather, air quality, and disaster management in particular. Flow webs are, by philosophy, design and implementation, a dynamic infrastructure that permits massive adaptation in real time. Flows may be attached to and detached from services at will, even while information is in transit through the flow. This concept, flow mobility, permits dynamic integration of earth science products and modeling resources in response to real
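
    A conceptual sketch of flow mobility (our illustration, not the authors' implementation): messages buffer in the flow, and delivery follows whichever service is currently attached, even for messages already in transit.

      # Toy flow that can be re-attached to a new consumer mid-stream.
      from collections import deque

      class Flow:
          def __init__(self):
              self.buffer = deque()
              self.consumer = None          # the service currently attached

          def publish(self, message):
              self.buffer.append(message)
              self.drain()

          def attach(self, consumer):
              self.consumer = consumer      # re-routing is just re-attachment
              self.drain()

          def drain(self):
              while self.consumer and self.buffer:
                  self.consumer(self.buffer.popleft())

      flow = Flow()
      flow.publish("obs-1")                           # buffered: nothing attached yet
      flow.attach(lambda m: print("model A got", m))  # delivers obs-1
      flow.attach(lambda m: print("model B got", m))  # later messages go to B
      flow.publish("obs-2")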

  20. Interactive Visualization of Large-Scale Hydrological Data using Emerging Technologies in Web Systems and Parallel Programming

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2013-12-01

    As geoscientists are confronted with increasingly massive datasets from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate the understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize and share large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data, and modify the parameters to create custom views of the data to gain insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component of comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization, and demonstrates information visualization and communication tools developed in the light of these challenges.

  1. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  2. A Web of Learning: Beyond "Itsy Bitsy Spider," Preschool Students Learn Science Content Naturally

    ERIC Educational Resources Information Center

    Evitt, Marie Faust

    2011-01-01

    One of the author's biggest challenges as a preschool teacher is helping children in a group see and touch and do. Hands-on explorations are important for everyone, but essential for young children. How can young children do hands-on explorations of spiders and their webs? Teachers do not want children handling all sorts of spiders. They worry…

  3. ODI - Portal, Pipeline, and Archive (ODI-PPA): a web-based astronomical compute archive, visualization, and analysis service

    NASA Astrophysics Data System (ADS)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Harbeck, Daniel R.; Boroson, Todd; Liu, Wilson; Kotulla, Ralf; Shaw, Richard; Henschel, Robert; Rajagopal, Jayadev; Stobie, Elizabeth; Knezek, Patricia; Martin, R. Pierre; Archbold, Kevin

    2014-07-01

    The One Degree Imager-Portal, Pipeline, and Archive (ODI-PPA) is a web science gateway that provides astronomers a modern web interface that acts as a single point of access to their data, and rich computational and visualization capabilities. Its goal is to support scientists in handling complex data sets, and to enhance WIYN Observatory's scientific productivity beyond data acquisition on its 3.5m telescope. ODI-PPA is designed, with periodic user feedback, to be a compute archive that has built-in frameworks including: (1) Collections that allow an astronomer to create logical collations of data products intended for publication, further research, instructional purposes, or to execute data processing tasks (2) Image Explorer and Source Explorer, which together enable real-time interactive visual analysis of massive astronomical data products within an HTML5 capable web browser, and overlaid standard catalog and Source Extractor-generated source markers (3) Workflow framework which enables rapid integration of data processing pipelines on an associated compute cluster and users to request such pipelines to be executed on their data via custom user interfaces. ODI-PPA is made up of several light-weight services connected by a message bus; the web portal built using Twitter/Bootstrap, AngularJS and jQuery JavaScript libraries, and backend services written in PHP (using the Zend framework) and Python; it leverages supercomputing and storage resources at Indiana University. ODI-PPA is designed to be reconfigurable for use in other science domains with large and complex datasets, including an ongoing offshoot project for electron microscopy data.

  4. A web-based endpoint adjudication system for interim analyses in clinical trials.

    PubMed

    Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis

    2009-02-01

    A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was required. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues arose initially. However
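
    The comparison-and-tiebreaker step can be pictured with a small hypothetical sketch (not the actual WebEAS code):

      # Hypothetical adjudication logic: concordant reviews stand; a
      # discordant pair requires a third, tiebreaking review.
      def adjudicate(review_a: str, review_b: str, tiebreaker=None) -> str:
          if review_a == review_b:
              return review_a                # concordant reviews stand
          if tiebreaker is None:
              raise ValueError("discordant reviews: assign a tiebreaker review")
          return tiebreaker                  # the tiebreaker decides

      print(adjudicate("endpoint met", "endpoint met"))
      print(adjudicate("endpoint met", "endpoint not met", tiebreaker="endpoint met"))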

  5. A web-based platform for virtual screening.

    PubMed

    Watson, Paul; Verdonk, Marcel; Hartshorn, Michael J

    2003-09-01

    A fully integrated, web-based, virtual screening platform has been developed to allow rapid virtual screening of large numbers of compounds. ORACLE is used to store information at all stages of the process. The system includes ATLAS, a large database of historical compounds from high-throughput screening (HTS) chemical suppliers, containing over 3.1 million unique compounds with their associated physicochemical properties (ClogP, MW, etc.). The database can be screened using a web-based interface to produce compound subsets for virtual screening or virtual library (VL) enumeration. In order to carry out the latter task within ORACLE, a reaction data cartridge has been developed. Virtual libraries can be enumerated rapidly using the web-based interface to the cartridge. The compound subsets can be seamlessly submitted for virtual screening experiments, and the results can be viewed via another web-based interface allowing ad hoc querying of the virtual screening data stored in ORACLE.

  6. How Public Is the Web?: Robots, Access, and Scholarly Communication.

    ERIC Educational Resources Information Center

    Snyder, Herbert; Rosenbaum, Howard

    1998-01-01

    Examines the use of the Robot Exclusion Protocol (REP) to restrict the access of search engine robots to 10 major United States university Web sites. An analysis of Web site searching and interviews with Web server administrators show that the decision to use this procedure is largely technical and is typically made by the Web server administrator.…
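
    For context, the mechanism the article examines is easy to demonstrate with Python's standard library; the rules below are an invented example of the kind of exclusions a university site might publish.

      # Checking robots.txt rules with the standard-library parser.
      from urllib.robotparser import RobotFileParser

      rules = """
      User-agent: *
      Disallow: /private/
      Disallow: /staff-directory/
      """.splitlines()

      rp = RobotFileParser()
      rp.parse(rules)
      print(rp.can_fetch("*", "https://www.example.edu/courses/"))        # True
      print(rp.can_fetch("*", "https://www.example.edu/private/report"))  # False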

  7. Designing a web site for high school geoscience teaching in Iceland

    NASA Astrophysics Data System (ADS)

    Douglas, George R.

    1998-08-01

    The need to construct an earth science teaching site on the web prompted a survey of existing sites which, in spite of containing much of value, revealed many weaknesses in basic design, particularly as regards the organisation of links to information resources. Few web sites take into consideration the particular pedagogic needs of the high school science student and there has, as yet, been little serious attempt to exploit and organise the more outstanding advantages offered by the internet to science teaching, such as accessing real-time data. A web site has been constructed which, through basic design, enables students to access relevant information resources over a wide range of subjects and topics easily and rapidly, while at the same time performing an instructional role in how to handle both on-line and off-line resources. Key elements in the design are selection and monitoring by the teacher, task oriented pages and the use of the Dewey decimal classification system. The intention is to increase gradually the extent to which most teaching tasks are carried out via the web pages, in the belief that they can become an efficient central point for all the earth science curriculum.

  8. Collaboration tools and techniques for large model datasets

    USGS Publications Warehouse

    Signell, R.P.; Carniel, S.; Chiggiato, J.; Janekovic, I.; Pullen, J.; Sherwood, C.R.

    2008-01-01

    In MREA and many other marine applications, it is common to have multiple models running with different grids, run by different institutions. Techniques and tools are described for low-bandwidth delivery of data from large multidimensional datasets, such as those from meteorological and oceanographic models, directly into generic analysis and visualization tools. Output is stored using the NetCDF CF Metadata Conventions, and then delivered to collaborators over the web via OPeNDAP. OPeNDAP datasets served by different institutions are then organized via THREDDS catalogs. Tools and procedures are then used which enable scientists to explore data on the original model grids using tools they are familiar with. It is also low-bandwidth, enabling users to extract just the data they require, an important feature for access from ships or remote areas. The entire implementation is simple enough to be handled by modelers working with their webmasters - no advanced programming support is necessary. © 2007 Elsevier B.V. All rights reserved.
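
    The access pattern described here can be sketched with the xarray library; the OPeNDAP URL and variable names below are placeholders, not a real endpoint from the paper.

      # Open a remote dataset over OPeNDAP and pull only a small subset.
      import xarray as xr

      url = "http://example.org/thredds/dodsC/ocean_model/output.nc"  # placeholder
      ds = xr.open_dataset(url)                   # lazy: no bulk download

      # Only this slice is actually requested from the server.
      sst = ds["temp"].isel(time=-1, depth=0)
      print(sst.shape)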

  9. An introduction to the Semantic Web for health sciences librarians.

    PubMed

    Robu, Ioana; Robu, Valentin; Thirion, Benoit

    2006-04-01

    The paper (1) introduces health sciences librarians to the main concepts and principles of the Semantic Web (SW) and (2) briefly reviews a number of projects on the handling of biomedical information that use SW technology. The paper is structured into two main parts. "Semantic Web Technology" provides a high-level description, with examples, of the main standards and concepts: extensible markup language (XML), Resource Description Framework (RDF), RDF Schema (RDFS), ontologies, and their utility in information retrieval, concluding with mention of more advanced SW languages and their characteristics. "Semantic Web Applications and Research Projects in the Biomedical Field" is a brief review of the Unified Medical Language System (UMLS), Generalised Architecture for Languages, Encyclopedias and Nomenclatures in Medicine (GALEN), HealthCyberMap, LinkBase, and the thesaurus of the National Cancer Institute (NCI). The paper also mentions other benefits and by-products of the SW, citing projects related to them. Some of the problems facing the SW vision are presented, especially the ways in which the librarians' expertise in organizing knowledge and in structuring information may contribute to SW projects.

  10. [Development of quality assurance/quality control web system in radiotherapy].

    PubMed

    Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun

    2013-12-01

    Our purpose is to develop a QA/QC (quality assurance/quality control) web system using a server-side script language such as HTML (HyperText Markup Language) and PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can be easily built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) to review the results of QA/QC for a radiotherapy machine, manuals, and reports necessary for routinely performing radiotherapy through this system; by disclosing the results, transparency can be maintained; (ii) to reveal a protocol for QA/QC in one's own institute using pictures and movies relating to QA/QC for simplicity's sake, which can also be used as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators, but also all staff involved in radiotherapy, can obtain information about the conditions and accuracy of treatment machines through the QA/QC web system.

  11. The Availability of Web 2.0 Tools from Community College Libraries' Websites Serving Large Student Bodies

    ERIC Educational Resources Information Center

    Blummer, Barbara; Kenton, Jeffrey M.

    2014-01-01

    Web 2.0 tools offer academic libraries new avenues for delivering services and resources to students. In this research we report on a content analysis of 100 US community college libraries' Websites for the availability of Web 2.0 applications. We found Web 2.0 tools utilized by 97% of our sample population and many of these sites contained more…

  12. Evaluation in industry of a draft code of practice for manual handling.

    PubMed

    Ashby, Liz; Tappin, David; Bentley, Tim

    2004-05-01

    This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.

  13. Large scale healthcare data integration and analysis using the semantic web.

    PubMed

    Timm, John; Renly, Sondra; Farkash, Ariel

    2011-01-01

    Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives of interoperability requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We applied our approach in Hypergenes, an EU funded project, where we apply our method to the Essential Hypertension disease model using a CDA template. Our domain expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration thus improving patient care and clinical research outcomes.

  14. bioWeb3D: an online webGL 3D data visualisation tool

    PubMed Central

    2013-01-01

    Background: Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for untrained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all necessary functionalities to represent and manipulate biological 3D datasets, very few are easily accessible (browser based), cross platform and accessible to non-expert users. Results: An online HTML5/WebGL based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js written in Javascript, bioWeb3D allows the simultaneous visualisation of multiple large datasets inputted via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Conclusions: Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets. PMID:23758781

  15. bioWeb3D: an online webGL 3D data visualisation tool.

    PubMed

    Pettit, Jean-Baptiste; Marioni, John C

    2013-06-07

    Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for untrained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all necessary functionalities to represent and manipulate biological 3D datasets, very few are easily accessible (browser based), cross platform and accessible to non-expert users. An online HTML5/WebGL based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js written in Javascript, bioWeb3D allows the simultaneous visualisation of multiple large datasets inputted via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets.

  16. Collaborative Working for Large Digitisation Projects

    ERIC Educational Resources Information Center

    Yeates, Robin; Guy, Damon

    2006-01-01

    Purpose: To explore the effectiveness of large-scale consortia for disseminating local heritage via the web. To describe the creation of a large geographically based cultural heritage consortium in the South East of England and management lessons resulting from a major web site digitisation project. To encourage the improved sharing of experience…

  17. Large-area sheet task advanced dendritic web growth development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.

    1982-01-01

    The computer code for calculating web temperature distribution was expanded to provide a graphics output in addition to numerical and punch card output. The new code was used to examine various modifications of the J419 configuration and, on the basis of the results, a new growth geometry was designed. Additionally, several mathematically defined temperature profiles were evaluated for the effects of the free boundary (growth front) on the thermal stress generation. Experimental growth runs were made with modified J419 configurations to complement the modeling work. A modified J435 configuration was evaluated.

  18. LigoDV-web: Providing easy, secure and universal access to a large distributed scientific data store for the LIGO scientific collaboration

    NASA Astrophysics Data System (ADS)

    Areeda, J. S.; Smith, J. R.; Lundgren, A. P.; Maros, E.; Macleod, D. M.; Zweizig, J.

    2017-01-01

    Gravitational-wave observatories around the world, including the Laser Interferometer Gravitational-Wave Observatory (LIGO), record a large volume of gravitational-wave output data and auxiliary data about the instruments and their environments. These data are stored at the observatory sites and distributed to computing clusters for data analysis. LigoDV-web is a web-based data viewer that provides access to data recorded at the LIGO Hanford, LIGO Livingston and GEO600 observatories, and the 40 m prototype interferometer at Caltech. The challenge addressed by this project is to provide meaningful visualizations of small data sets to anyone in the collaboration in a fast, secure and reliable manner with minimal software, hardware and training required of the end users. LigoDV-web is implemented as a Java Enterprise Application, with Shibboleth Single Sign On for authentication and authorization, and a proprietary network protocol used for data access on the back end. Collaboration members with proper credentials can request data be displayed in any of several general formats from any Internet appliance that supports a modern browser with Javascript and minimal HTML5 support, including personal computers, smartphones, and tablets. Since its inception in 2012, 634 unique users have visited the LigoDV-web website in a total of 33,861 sessions and generated a total of 139,875 plots. This infrastructure has been helpful in many analyses within the collaboration including follow-up of the data surrounding the first gravitational-wave events observed by LIGO in 2015.

  19. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  20. Evaluation of a Web-based Error Reporting Surveillance System in a Large Iranian Hospital.

    PubMed

    Askarian, Mehrdad; Ghoreishi, Mahboobeh; Akbari Haghighinejad, Hourvash; Palenik, Charles John; Ghodsi, Maryam

    2017-08-01

    Proper reporting of medical errors helps healthcare providers learn from adverse incidents and improve patient safety. A well-designed and functioning confidential reporting system is an essential component of this process. There are many error reporting methods; however, web-based systems are often preferred because they can provide comprehensive and more easily analyzed information. This study addresses the use of a web-based error reporting system. This interventional study involved the application of an in-house designed "voluntary web-based medical error reporting system." The system has been used since July 2014 in Nemazee Hospital, Shiraz University of Medical Sciences. The rate and severity of errors reported during the year prior to and the year after system launch were compared. The slope of the error report trend line was steep during the first 12 months (B = 105.727, P = 0.00). However, it slowed following launch of the web-based reporting system and was no longer statistically significant (B = 15.27, P = 0.81) by the end of the second year. Most recorded errors were no-harm laboratory types and were due to inattention. Usually, they were reported by nurses and other permanent employees. Most reported errors occurred during morning shifts. Using a standardized web-based error reporting system can be beneficial. This study reports on the performance of an in-house designed reporting system, which appeared to properly detect and analyze medical errors. The system also generated follow-up reports in a timely and accurate manner. Detection of near-miss errors could play a significant role in identifying areas of system defects.

  1. Students' Strategies for Exception Handling

    ERIC Educational Resources Information Center

    Rashkovits, Rami; Lavy, Ilana

    2011-01-01

    This study discusses and presents various strategies employed by novice programmers concerning exception handling. The main contributions of this paper are as follows: we provide an analysis tool to measure the level of assimilation of exception handling mechanism; we present and analyse strategies to handle exceptions; we present and analyse…
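
    Two of the contrasting strategies such studies examine can be illustrated with a short Python sketch (our example, not the paper's materials):

      # Contrast: swallowing every exception vs. handling the expected case
      # deliberately and letting the unexpected propagate.
      def fragile_read(path):
          try:
              with open(path) as f:
                  return f.read()
          except Exception:
              return ""    # poor strategy: every failure is silently hidden

      def robust_read(path):
          try:
              with open(path) as f:
                  return f.read()
          except FileNotFoundError:
              return ""    # expected case, handled deliberately
          # any other exception propagates to a caller that can deal with it

      print(robust_read("missing.txt"))  # "" rather than a crash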

  2. HOPE: An On-Line Piloted Handling Qualities Experiment Data Book

    NASA Technical Reports Server (NTRS)

    Jackson, E. B.; Proffitt, Melissa S.

    2010-01-01

    A novel on-line database for capturing most of the information obtained during piloted handling qualities experiments (either flight or simulated) is described. The Hyperlinked Overview of Piloted Evaluations (HOPE) web application is based on an open-source object-oriented Web-based front end (Ruby-on-Rails) that can be used with a variety of back-end relational database engines. The hyperlinked, on-line data book approach allows an easily-traversed way of looking at a variety of collected data, including pilot ratings, pilot information, vehicle and configuration characteristics, test maneuvers, and individual flight test cards and repeat runs. It allows for on-line retrieval of pilot comments, both audio and transcribed, as well as time history data retrieval and video playback. Pilot questionnaires are recorded as are pilot biographies. Simple statistics are calculated for each selected group of pilot ratings, allowing multiple ways to aggregate the data set (by pilot, by task, or by vehicle configuration, for example). Any number of per-run or per-task metrics can be captured in the database. The entire run metrics dataset can be downloaded in comma-separated text for further analysis off-line. It is expected that this tool will be made available upon request.

  3. Extracting knowledge from the World Wide Web

    PubMed Central

    Henzinger, Monika; Lawrence, Steve

    2004-01-01

    The World Wide Web provides an unprecedented opportunity to automatically analyze a large sample of interests and activity in the world. We discuss methods for extracting knowledge from the web by randomly sampling and analyzing hosts and pages, and by analyzing the link structure of the web and how links accumulate over time. A variety of interesting and valuable information can be extracted, such as the distribution of web pages over domains, the distribution of interest in different areas, communities related to different topics, the nature of competition in different categories of sites, and the degree of communication between different communities or countries. PMID:14745041
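
    A toy version of such link-structure analysis, using the networkx library on an invented miniature web graph:

      # Count inbound links and compute PageRank-style scores on a toy graph.
      import networkx as nx

      web = nx.DiGraph()
      web.add_edges_from([
          ("a.com", "b.org"), ("c.net", "b.org"),
          ("b.org", "d.edu"), ("a.com", "d.edu"), ("d.edu", "a.com"),
      ])

      # Pages accumulating the most inbound links, a simple popularity signal.
      print(sorted(web.in_degree(), key=lambda kv: kv[1], reverse=True))
      print(nx.pagerank(web))   # link structure also drives PageRank scores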

  4. Microfluidics on liquid handling stations (μF-on-LHS): a new industry-compatible microfluidic platform

    NASA Astrophysics Data System (ADS)

    Kittelmann, Jörg; Radtke, Carsten P.; Waldbaur, Ansgar; Neumann, Christiane; Hubbuch, Jürgen; Rapp, Bastian E.

    2014-03-01

    Since its early days, microfluidics as a scientific discipline has been an interdisciplinary research field with a wide scope of potential applications. Besides tailored assays for point-of-care (PoC) diagnostics, microfluidics has been an important tool for large-scale screening of reagents and building blocks in organic chemistry, pharmaceutics and medical engineering. Furthermore, numerous potential marketable products have been described over the years. However, especially in industrial applications, microfluidics is often considered only an alternative technology for fluid handling, a field which, in industry, is mostly dominated by large-scale numerically controlled fluid and liquid handling stations. Numerous noteworthy products have dominated this field in the last decade and have inhibited the widespread application of microfluidics technology. However, automated liquid handling stations and microfluidics do not have to be considered mutually exclusive approaches. We have recently introduced a hybrid fluidic platform combining an industrially established liquid handling station and a generic microfluidic interfacing module that allows probing a microfluidic system (such as an assay or a synthesis array) using the instrumentation provided by the liquid handling station. We term this technology "Microfluidics on Liquid Handling Stations (μF-on-LHS)" - a classical "best of both worlds" approach that allows combining the highly evolved, automated and industry-proven LHS systems with any type of microfluidic assay. In this paper we show, to the best of our knowledge, the first droplet microfluidics application on an industrial LHS using the μF-on-LHS concept.

  5. Handle grip span for optimising finger-specific force capability as a function of hand size.

    PubMed

    Lee, Soo-Jin; Kong, Yong-Ku; Lowe, Brian D; Song, Seongho

    2009-05-01

    Five grip spans (45 to 65 mm) were tested to evaluate the effects of handle grip span and user's hand size on maximum grip strength, individual finger force and subjective ratings of comfort, using a computerised digital dynamometer with independent finger force sensors. Forty-six males participated and were assigned to three hand size groups (small, medium, large) according to their hand length. In general, results showed the 55- and 50-mm grip spans were rated as the most comfortable sizes and showed the largest grip strength (433.6 N and 430.8 N, respectively), whereas the 65-mm grip span handle was rated as the least comfortable size and showed the least grip strength. With regard to the interaction effect of grip span and hand size, small- and medium-hand participants rated the 50- to 55-mm grip spans as the most preferred and the 65-mm grip span as the least preferred, whereas large-hand participants rated the 55- to 60-mm grip spans as the most preferred and the 45-mm grip span as the least preferred. Normalised grip span (NGS) ratios (29% and 27%), the ratio of handle grip span to the user's hand length, were obtained and applied to suggest handle grip spans that maximise subjective comfort as well as gripping force according to users' hand sizes. In the analysis of individual finger force, the middle finger showed the highest contribution (37.5%) to the total finger force, followed by the ring (28.7%), index (20.2%) and little (13.6%) fingers. In addition, each finger was observed to have a different optimal grip span for exerting maximum force, suggesting a bow-contoured handle (the grip span at the centre of the handle is larger than at the ends) for two-handle hand tools. Thus, based on this study, the grip spans of two-handle hand tools may be designed according to users' hand/finger anthropometrics to maximise subjective ratings and performance. Results obtained in this study will provide guidelines
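
    Read as span-to-hand-length ratios, the NGS figures lend themselves to a back-of-envelope calculation; the function below is our illustration of that reading, not a formula from the paper.

      # Pick a grip span as a fixed fraction (~27-29%) of hand length.
      def suggested_grip_span_mm(hand_length_mm: float, ratio: float = 0.28) -> float:
          return ratio * hand_length_mm

      for hand in (170, 185, 200):   # illustrative small / medium / large hands (mm)
          print(hand, "->", round(suggested_grip_span_mm(hand)), "mm grip span")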

  6. Opal web services for biomedical applications.

    PubMed

    Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W

    2010-07-01

    Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has grown since to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web forms-based access and programmatic access of all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we have successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.

  7. 46 CFR 111.106-13 - Cargo handling devices or cargo pump rooms handling flammable or combustible cargoes.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    This regulation (46 CFR 111.106-13, 2014-10-01 edition) addresses cargo handling devices or cargo pump rooms handling flammable or combustible cargoes on OSVs. The excerpted provision allows bulkheads to be pierced by fixed lights, drive shafts, and pump-engine control rods, provided that the shafts and rods are…

  8. Low cost silicon solar array project large area silicon sheet task: Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.

    1977-01-01

    Growth configurations were developed which produced crystals having low residual stress levels. The properties of a 106 mm diameter round crucible were evaluated and it was found that this design had greatly enhanced temperature fluctuations arising from convection in the melt. Thermal modeling efforts were directed to developing finite element models of the 106 mm round crucible and an elongated susceptor/crucible configuration. Also, the thermal model for the heat loss modes from the dendritic web was examined for guidance in reducing the thermal stress in the web. An economic analysis was prepared to evaluate the silicon web process in relation to price goals.

  9. Development of liquid handling techniques in microgravity

    NASA Technical Reports Server (NTRS)

    Antar, Basil N.

    1995-01-01

    A large number of experiments dealing with protein crystal growth and also with growth of crystals from solution require complicated fluid handling procedures, including filling of empty containers with liquids, mixing of solutions, and stirring of liquids. Such procedures are accomplished in a straightforward manner when performed under terrestrial conditions in the laboratory. However, in the low gravity environment of space, such as on board the Space Shuttle or an Earth-orbiting space station, these procedures sometimes produced entirely undesirable results. Under terrestrial conditions, liquids usually separate completely from the gas due to the buoyancy effects of Earth's gravity. Consequently, any gas pockets that are entrained into the liquid during a fluid handling procedure will eventually migrate towards the top of the vessel, where they can be removed. In a low gravity environment, any entrained gas bubble will remain within the liquid bulk indefinitely, at a location that is not known a priori, resulting in a mixture of liquid and vapor.

  10. Internet Technology in Magnetic Resonance: A Common Gateway Interface Program for the World-Wide Web NMR Spectrometer

    NASA Astrophysics Data System (ADS)

    Buszko, Marian L.; Buszko, Dominik; Wang, Daniel C.

    1998-04-01

    A custom-written Common Gateway Interface (CGI) program for remote control of an NMR spectrometer using a World Wide Web browser has been described. The program, running on a UNIX workstation, uses multiple processes to handle concurrent tasks of interacting with the user and with the spectrometer. The program's parent process communicates with the browser and sends out commands to the spectrometer; the child process is mainly responsible for data acquisition. Communication between the processes is via the shared memory mechanism. The WWW pages that have been developed for the system make use of the frames feature of web browsers. The CGI program provides an intuitive user interface to the NMR spectrometer, making, in effect, a complex system an easy-to-use Web appliance.
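
    A present-day Python analogue of the two-process, shared-memory design (the spectrometer is simulated; this is not the original CGI code):

      # Child process "acquires data" while the parent stays free to serve
      # requests; the latest sample is shared via a shared-memory value.
      import time
      from multiprocessing import Process, Value

      def acquire(latest):                 # child: data acquisition loop
          for i in range(5):
              time.sleep(0.1)
              with latest.get_lock():
                  latest.value = float(i)  # write the newest simulated sample

      if __name__ == "__main__":
          latest = Value("d", -1.0)        # shared double, like a shared segment
          child = Process(target=acquire, args=(latest,))
          child.start()
          time.sleep(0.35)                 # parent keeps handling other work
          with latest.get_lock():
              print("latest sample seen by parent:", latest.value)
          child.join()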

  11. A Method for Transforming Existing Web Service Descriptions into an Enhanced Semantic Web Service Framework

    NASA Astrophysics Data System (ADS)

    Du, Xiaofeng; Song, William; Munro, Malcolm

    Web Services, as a new distributed system technology, have been widely adopted by industry in areas such as enterprise application integration (EAI), business process management (BPM), and virtual organisation (VO). However, the lack of semantics in the current Web Service standards has been a major barrier to service discovery and composition. In this chapter, we propose an enhanced context-based semantic service description framework (CbSSDF+) that tackles the problem and improves the flexibility of service discovery and the correctness of generated composite services. We also provide an agile transformation method to demonstrate how the various formats of Web Service descriptions on the Web can be managed and renovated step by step into CbSSDF+ based service descriptions without a large amount of engineering work. At the end of the chapter, we evaluate the applicability of the transformation method and the effectiveness of CbSSDF+ through a series of experiments.

  12. Handling Missing Data With Multilevel Structural Equation Modeling and Full Information Maximum Likelihood Techniques.

    PubMed

    Schminkey, Donna L; von Oertzen, Timo; Bullock, Linda

    2016-08-01

    With increasing access to population-based data and electronic health records for secondary analysis, missing data are common. In the social and behavioral sciences, missing data frequently are handled with multiple imputation methods or full information maximum likelihood (FIML) techniques, but healthcare researchers have not embraced these methodologies to the same extent and more often use either traditional imputation techniques or complete case analysis, which can compromise power and introduce unintended bias. This article is a review of options for handling missing data, concluding with a case study demonstrating the utility of multilevel structural equation modeling using full information maximum likelihood (MSEM with FIML) to handle large amounts of missing data. MSEM with FIML is a parsimonious and hypothesis-driven strategy to cope with large amounts of missing data without compromising power or introducing bias. This technique is relevant for nurse researchers faced with ever-increasing amounts of electronic data and decreasing research budgets. © 2016 Wiley Periodicals, Inc.
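    The core idea behind FIML can be illustrated compactly: every case contributes the likelihood of just the variables it actually observed, so no case is deleted or imputed. A minimal sketch, assuming multivariate normality and evaluating a fixed mean and covariance (real MSEM software optimizes the model-implied ones):

```python
# Sketch of the idea behind FIML: each case contributes the likelihood of
# its *observed* variables only (NaN marks missing values).
import numpy as np
from scipy.stats import multivariate_normal

def fiml_loglik(data, mu, sigma):
    """Observed-data log-likelihood under N(mu, sigma)."""
    total = 0.0
    for row in data:
        obs = ~np.isnan(row)           # which variables this case observed
        if not obs.any():
            continue                   # a fully missing case adds nothing
        total += multivariate_normal.logpdf(
            row[obs], mean=mu[obs], cov=sigma[np.ix_(obs, obs)])
    return total

data = np.array([[1.0, 2.0], [0.5, np.nan], [np.nan, 1.5]])
mu, sigma = np.zeros(2), np.eye(2)
print(fiml_loglik(data, mu, sigma))
```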

  13. Musculoskeletal injuries resulting from patient handling tasks among hospital workers.

    PubMed

    Pompeii, Lisa A; Lipscomb, Hester J; Schoenfisch, Ashley L; Dement, John M

    2009-07-01

    The purpose of this study was to evaluate musculoskeletal injuries and disorders resulting from patient handling prior to the implementation of a "minimal manual lift" policy at a large tertiary care medical center. We sought to define the circumstances surrounding patient handling injuries and to identify potential preventive measures. Human resources data were used to define the cohort and their time at work. Workers' compensation records (1997-2003) were utilized to identify work-related musculoskeletal claims, while the workers' description of injury was used to identify those that resulted from patient handling. Adjusted rate ratios were generated using Poisson regression. One-third (n = 876) of all musculoskeletal injuries resulted from patient handling activities. Most (83%) of the injury burden was incurred by inpatient nurses, nurses' aides and radiology technicians, while injury rates were highest for nurses' aides (8.8/100 full-time equivalents, FTEs) and smaller workgroups including emergency medical technicians (10.3/100 FTEs), patient transporters (4.3/100 FTEs), operating room technicians (3.1/100 FTEs), and morgue technicians (2.2/100 FTEs). Forty percent of injuries due to lifting/transferring patients may have been prevented through the use of mechanical lift equipment, while 32% of injuries resulting from repositioning/turning patients, pulling patients up in bed, or catching falling patients may not have been prevented by the use of lift equipment. The use of mechanical lift equipment could significantly reduce the risk of some patient handling injuries, but additional interventions need to be considered that address other patient handling tasks. Smaller high-risk workgroups should not be neglected in prevention efforts.
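    The adjusted rate ratios mentioned above come from Poisson regression of injury counts with time at work as the exposure. A hedged sketch of that style of analysis, with invented counts and FTE totals and a single illustrative workgroup indicator:

```python
# Hedged sketch: Poisson regression of injury counts with log(FTEs) as an
# offset, so exp(coefficient) is a rate ratio. Data values are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "injuries": [88, 21, 9, 6, 4],
    "ftes":     [1000, 204, 290, 194, 182],
    "aide":     [1, 0, 0, 0, 0],   # indicator: nurses' aide workgroup
})
X = sm.add_constant(df[["aide"]])
model = sm.GLM(df["injuries"], X,
               family=sm.families.Poisson(),
               offset=np.log(df["ftes"])).fit()
print(np.exp(model.params))        # rate ratio vs. the reference workgroups
```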

  14. Helicopter Handling Qualities

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Helicopters are used by the military and civilian communities for a variety of tasks and must be capable of operating in poor weather conditions and at night. Accompanying extended helicopter operations is a significant increase in pilot workload and a need for better handling qualities. An overview of the status and problems in the development and specification of helicopter handling-qualities criteria is presented. Topics for future research efforts by government and industry are highlighted.

  15. Ergonomics and patient handling.

    PubMed

    McCoskey, Kelsey L

    2007-11-01

    This study aimed to describe patient-handling demands in inpatient units during a 24-hour period at a military health care facility. A 1-day total population survey described the diverse nature and impact of patient-handling tasks relative to a variety of nursing care units, patient characteristics, and transfer equipment. Productivity baselines were established based on patient dependency, physical exertion, type of transfer, and time spent performing the transfer. Descriptions of the physiological effect of transfers on staff based on patient, transfer, and staff characteristics were developed. Nursing staff response to surveys demonstrated how patient-handling demands are impacted by the staff's physical exertion and level of patient dependency. The findings of this study describe the types of transfers occurring in these inpatient units and the physical exertion and time requirements for these transfers. This description may guide selection of the most appropriate and cost-effective patient-handling equipment required for specific units and patients.

  16. New web technologies for astronomy

    NASA Astrophysics Data System (ADS)

    Sprimont, P.-G.; Ricci, D.; Nicastro, L.

    2014-12-01

    Thanks to the new HTML5 capabilities and the huge improvements of the JavaScript language, it is now possible to design very complex and interactive web user interfaces. On top of that, the once monolithic and file-server-oriented web servers are evolving into easily programmable server applications capable of coping with the complex interactions made possible by the new generation of browsers. We believe that the whole community of amateur and professional astronomers can benefit from the potential of these new technologies. New web interfaces can be designed to provide the user with a wealth of much more intuitive and interactive tools. Accessing astronomical data archives; scheduling, controlling, and monitoring observatories, in particular robotic telescopes; and supervising data reduction pipelines are all capabilities that can now be implemented in a JavaScript web application. In this paper we describe the Sadira package, which we are implementing to exactly this aim.

  17. Security and Efficiency Concerns With Distributed Collaborative Networking Environments

    DTIC Science & Technology

    2003-09-01

    have the ability to access Web communications services of the WebEx MediaTone Network from a single login. [24] WebEx provides a range of secure...Web. WebEx services enable secure data, voice and video communications through the browser and are supported by the WebEx MediaTone Network, a global...designed to host large-scale, structured events and conferences, featuring a Q&A Manager that allows multiple moderators to handle questions while

  18. An introduction to the Semantic Web for health sciences librarians*

    PubMed Central

    Robu, Ioana; Robu, Valentin; Thirion, Benoit

    2006-01-01

    Objectives: The paper (1) introduces health sciences librarians to the main concepts and principles of the Semantic Web (SW) and (2) briefly reviews a number of projects on the handling of biomedical information that use SW technology. Methodology: The paper is structured into two main parts. “Semantic Web Technology” provides a high-level description, with examples, of the main standards and concepts: extensible markup language (XML), Resource Description Framework (RDF), RDF Schema (RDFS), ontologies, and their utility in information retrieval, concluding with mention of more advanced SW languages and their characteristics. “Semantic Web Applications and Research Projects in the Biomedical Field” is a brief review of the Unified Medical Language System (UMLS), Generalised Architecture for Languages, Encyclopedias and Nomenclatures in Medicine (GALEN), HealthCyberMap, LinkBase, and the thesaurus of the National Cancer Institute (NCI). The paper also mentions other benefits and by-products of the SW, citing projects related to them. Discussion and Conclusions: Some of the problems facing the SW vision are presented, especially the ways in which the librarians' expertise in organizing knowledge and in structuring information may contribute to SW projects. PMID:16636713
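    As a concrete flavor of the RDF layer the paper introduces, the sketch below builds and serializes a tiny graph with the rdflib library; the namespace and terms are invented, not drawn from UMLS, GALEN, or the other projects cited:

```python
# Minimal RDF example: three triples about an invented biomedical resource,
# serialized as Turtle using rdflib.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/biomed#")   # illustrative namespace
g = Graph()
g.bind("ex", EX)

g.add((EX.Aspirin, RDF.type, EX.Drug))         # Aspirin is a Drug
g.add((EX.Aspirin, RDFS.label, Literal("aspirin")))
g.add((EX.Aspirin, EX.treats, EX.Headache))    # invented predicate

print(g.serialize(format="turtle"))
```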

  19. Complaint handling in healthcare: expectation gaps between physicians and the public; results of a survey study.

    PubMed

    Friele, R D; Reitsma, P M; de Jong, J D

    2015-10-01

    Patients who submit complaints about the healthcare they have received are often dissatisfied with the response to their complaints. This is usually attributed to the failure of physicians to respond adequately to what complainants want, e.g. an apology or an explanation. However, expectations of complaint handling among the public may colour how they evaluate the way their own complaint is handled. This descriptive study assesses expectations of complaint handling in healthcare among the public and physicians. Negative public expectations, and the gap between these expectations and those of physicians, may explain patients' dissatisfaction with complaints procedures. We held two surveys: one among physicians, using a panel of 3366 physicians (response rate 57%; the panel contained all kinds of physicians, such as GPs, medical specialists, and physicians working in nursing homes), and one among the public, using the Dutch Healthcare Consumer Panel (n = 1422, response rate 68%). We asked both panels identical questions about their expectations of how complaints are handled in healthcare. Differences in expectation scores between the public and the physicians were tested using non-parametric tests. The public have negative expectations about how complaints are handled, whereas physicians' expectations are far more positive, demonstrating large expectation gaps between physicians and the public. The large expectation gap means that when physicians and the public meet because of a complaint, they are likely to start off with opposite expectations of the situation. This is not a favourable condition for a positive outcome of a complaints procedure. The negative public preconceptions about the way a complaint will be handled will prove hard to change during the process of complaints handling: people tend to see what they thought would happen, almost inevitably leading to a negative judgement about how their complaint was handled.

  20. SIP: A Web-Based Astronomical Image Processing Program

    NASA Astrophysics Data System (ADS)

    Simonetti, J. H.

    1999-12-01

    I have written an astronomical image processing and analysis program designed to run over the internet in a Java-compatible web browser. The program, Sky Image Processor (SIP), is accessible at the SIP webpage (http://www.phys.vt.edu/SIP). Since nothing is installed on the user's machine, there is no need to download upgrades; the latest version of the program is always instantly available. Furthermore, the Java programming language is designed to work on any computer platform (any machine and operating system). The program could be used with students in web-based instruction or in a computer laboratory setting; it may also be of use in some research or outreach applications. While SIP is similar to other image processing programs, it is unique in some important respects. For example, SIP can load images from the user's machine or from the Web. An instructor can put images on a web server for students to load and analyze on their own personal computers, or the instructor can inform the students of images to load from any other web server. Furthermore, since SIP was written with students in mind, the philosophy is to present the user with the most basic tools necessary to process and analyze astronomical images. Images can be combined (by addition, subtraction, multiplication, or division), multiplied by a constant, smoothed, cropped, flipped, rotated, and so on. Statistics can be gathered for pixels within a box drawn by the user. Basic tools are available for gathering data from an image which can be used for performing simple differential photometry or astrometry. Therefore, students can learn how astronomical image processing works. Since SIP is not part of a commercial CCD camera package, the program is written to handle the most common-denominator image file format, FITS.
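    SIP itself is a Java applet, but the image operations listed above are easy to illustrate. A rough Python equivalent of the combine/scale/crop/statistics workflow on a FITS file (file names invented), using astropy:

```python
# Sketch of the kinds of operations SIP offers, applied to a FITS image;
# file names are illustrative. Uses astropy + numpy, not SIP's Java code.
import numpy as np
from astropy.io import fits

with fits.open("example.fits") as hdul:      # hypothetical input file
    img = hdul[0].data.astype(float)

dark = np.full_like(img, 2.0)                # stand-in "dark frame"
calibrated = (img - dark) / 1.5              # subtract, then scale by constant
cropped = calibrated[10:110, 20:120]         # crop a region

# statistics within a user-drawn box
box = calibrated[30:60, 40:80]
print(box.mean(), box.std(), box.min(), box.max())

fits.writeto("processed.fits", cropped, overwrite=True)
```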

  1. 7 CFR 1210.307 - Handle.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE WATERMELON RESEARCH AND PROMOTION PLAN Watermelon Research and Promotion Plan Definitions § 1210.307 Handle. Handle means to grade, pack...

  2. 7 CFR 1210.307 - Handle.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE WATERMELON RESEARCH AND PROMOTION PLAN Watermelon Research and Promotion Plan Definitions § 1210.307 Handle. Handle means to grade, pack...

  3. 7 CFR 1210.307 - Handle.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE WATERMELON RESEARCH AND PROMOTION PLAN Watermelon Research and Promotion Plan Definitions § 1210.307 Handle. Handle means to grade, pack...

  4. 7 CFR 1210.307 - Handle.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE WATERMELON RESEARCH AND PROMOTION PLAN Watermelon Research and Promotion Plan Definitions § 1210.307 Handle. Handle means to grade, pack...

  5. Web mapping system for complex processing and visualization of environmental geospatial datasets

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, big dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. It should be noted that modern web mapping systems as integrated geoportal applications are developed based on the SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In the report a complex web mapping system including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive is presented. There are three basic tiers of the GIS web client:

    1. A tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format;
    2. A tier of JavaScript objects implementing methods handling: NetCDF metadata; the Task XML object for configuring user calculations, input and output formats; and OGC WMS/WFS cartographical services;
    3. A graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic.

    The metadata tier consists of a number of JSON objects containing technical information describing geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc). The middleware tier of JavaScript objects implementing methods for handling geospatial
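    For a flavor of how such a thin client talks to the cartographic tier, here is a hedged sketch of an OGC WMS GetMap request; the endpoint URL and layer name are invented, and the parameter set is the generic WMS 1.3.0 one rather than this system's actual configuration:

```python
# Hedged sketch: fetching a rendered map image from a WMS endpoint.
import requests

params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "air_temperature",      # illustrative layer name
    "CRS": "EPSG:4326",
    "BBOX": "50,60,70,90",            # bounds in WMS 1.3.0 axis order
    "WIDTH": 800, "HEIGHT": 400,
    "FORMAT": "image/png",
}
resp = requests.get("http://example.org/wms", params=params, timeout=30)
resp.raise_for_status()
with open("map.png", "wb") as f:
    f.write(resp.content)
```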

  6. Grain Handling and Storage.

    ERIC Educational Resources Information Center

    Harris, Troy G.; Minor, John

    This text for a secondary- or postecondary-level course in grain handling and storage contains ten chapters. Chapter titles are (1) Introduction to Grain Handling and Storage, (2) Elevator Safety, (3) Grain Grading and Seed Identification, (4) Moisture Control, (5) Insect and Rodent Control, (6) Grain Inventory Control, (7) Elevator Maintenance,…

  7. New implementation of OGC Web Processing Service in Python programming language. PyWPS-4 and issues we are facing with processing of large raster data using OGC WPS

    NASA Astrophysics Data System (ADS)

    Čepický, Jáchym; Moreira de Sousa, Luís

    2016-06-01

    The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes, client discovery of processes, and binding to those processes in workflows. Data required by a WPS can be delivered across a network or can be available at the server. PyWPS was one of the first implementations of OGC WPS on the server side. It is written in the Python programming language and it tries to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible, so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
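    For illustration, a rough sketch of a PyWPS-4 process class following the pattern shown in the library's documentation; the identifier and the trivial computation are ours, not part of the record, and the exact constructor arguments should be checked against the PyWPS version in use:

```python
# Rough sketch of a PyWPS-4 process (assumed API, modeled on the
# documented "say_hello" demo process).
from pywps import Process, LiteralInput, LiteralOutput

class Echo(Process):
    def __init__(self):
        inputs = [LiteralInput("name", "Input name", data_type="string")]
        outputs = [LiteralOutput("response", "Greeting", data_type="string")]
        super(Echo, self).__init__(
            self._handler,
            identifier="echo", title="Echo process", version="1.0",
            inputs=inputs, outputs=outputs,
            store_supported=True, status_supported=True)

    def _handler(self, request, response):
        # request.inputs maps each identifier to a list; take the first value
        response.outputs["response"].data = \
            "Hello " + request.inputs["name"][0].data
        return response
```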

  8. Index Compression and Efficient Query Processing in Large Web Search Engines

    ERIC Educational Resources Information Center

    Ding, Shuai

    2013-01-01

    The inverted index is the main data structure used by all the major search engines. Search engines build an inverted index on their collection to speed up query processing. As the size of the web grows, the length of the inverted list structures, which can easily grow to hundreds of MBs or even GBs for common terms (roughly linear in the size of…
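    The standard remedy studied in this line of work is to store the gaps between sorted document IDs and compress them, for example with variable-byte coding. A toy illustration (not the dissertation's code):

```python
# Toy index compression: delta-encode a posting list, then compress the
# gaps with variable-byte coding (stop bit set on the last byte of each gap).
def vbyte_encode(gaps):
    out = bytearray()
    for g in gaps:
        chunk = []
        while True:
            chunk.append(g % 128)     # low-order 7 bits first
            g //= 128
            if g == 0:
                break
        chunk[0] += 128               # stop bit on the low-order byte
        out.extend(reversed(chunk))   # emit high-order bytes first
    return bytes(out)

postings = [3, 7, 21, 22, 150]        # sorted document IDs for one term
gaps = [postings[0]] + [b - a for a, b in zip(postings, postings[1:])]
encoded = vbyte_encode(gaps)
print(len(encoded), "bytes for", len(postings), "postings")
```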

  9. Internet Technology in Magnetic Resonance: A Common Gateway Interface Program for the World-Wide Web NMR Spectrometer

    PubMed

    Buszko; Buszko; Wang

    1998-04-01

    A custom-written Common Gateway Interface (CGI) program for remote control of an NMR spectrometer using a World Wide Web browser is described. The program, running on a UNIX workstation, uses multiple processes to handle the concurrent tasks of interacting with the user and with the spectrometer. The program's parent process communicates with the browser and sends out commands to the spectrometer; the child process is mainly responsible for data acquisition. Communication between the processes is via the shared memory mechanism. The WWW pages that have been developed for the system make use of the frames feature of web browsers. The CGI program provides an intuitive user interface to the NMR spectrometer, making, in effect, a complex system an easy-to-use Web appliance. Copyright 1998 Academic Press.

  10. Finger doses for staff handling radiopharmaceuticals in nuclear medicine.

    PubMed

    Pant, Gauri S; Sharma, Sanjay K; Rath, Gaura K

    2006-09-01

    Radiation doses to the fingers of occupational workers handling 99mTc-labeled compounds and 131I for diagnostic and therapeutic procedures in nuclear medicine were measured by thermoluminescence dosimetry. The doses were measured at the base of the ring finger and the index finger of both hands in 2 groups of workers. Group 1 (7 workers) handled 99mTc-labeled radiopharmaceuticals, and group 2 (6 workers) handled 131I for diagnosis and therapy. Radiation doses to the fingertips of 3 workers also were measured. Two were from group 1, and 1 was from group 2. The doses to the base of the fingers for the radiopharmacy staff and physicians from group 1 were observed to be 17+/-7.5 (mean+/-SD) and 13.4+/-6.5 microSv/GBq, respectively. Similarly, the dose to the base of the fingers for the 3 physicians in group 2 was estimated to be 82.0+/-13.8 microSv/GBq. Finger doses for the technologists in both groups could not be calculated per unit of activity because they did not handle the radiopharmaceuticals directly. Their doses were reported in millisieverts that accumulated in 1 wk. The doses to the fingertips of the radiopharmacy worker and the physician in group 1 were 74.3+/-19.8 and 53.5+/-21.9 microSv/GBq, respectively. The dose to the fingertips of the physician in group 2 was 469.9+/-267 microSv/GBq. The radiation doses to the fingers of nuclear medicine staff at our center were measured. The maximum expected annual dose to the extremities appeared to be less than the annual limit (500 mSv/y), except for a physician who handled large quantities of 131I for treatment. Because all of these workers are on rotation and do not constantly handle radioactivity throughout the year, the doses to the base of the fingers or the fingertips should not exceed the prescribed annual limit of 500 mSv.

  11. WEB-IS2: Next Generation Web Services Using Amira Visualization Package

    NASA Astrophysics Data System (ADS)

    Yang, X.; Wang, Y.; Bollig, E. F.; Kadlec, B. J.; Garbow, Z. A.; Yuen, D. A.; Erlebacher, G.

    2003-12-01

    Amira (www.amiravis.com) is a powerful 3-D visualization package and has been employed recently by the science and engineering communities to gain insight into their data. We present a new web-based interface to Amira, packaged in a Java applet. We have developed a module called WEB-IS/Amira (WEB-IS2), which provides web-based access to Amira. This tool allows earth scientists to manipulate Amira controls remotely and to analyze, render and view large datasets over the internet, without regard for time or location. This could have important ramifications for GRID computing. The design of our implementation will soon allow multiple users to visually collaborate by manipulating a single dataset through a variety of client devices. These clients will only require a browser capable of displaying Java applets. As the deluge of data continues, innovative solutions that maximize ease of use without sacrificing efficiency or flexibility will continue to gain in importance, particularly in the Earth sciences. Major initiatives, such as Earthscope (http://www.earthscope.org), which will generate at least a terabyte of data daily, stand to profit enormously from a system such as WEB-IS/Amira (WEB-IS2). We discuss our use of SOAP (Livingston, D., Advanced SOAP for Web development, Prentice Hall, 2002), a novel 2-way communication protocol, as a means of providing remote commands, and efficient point-to-point transfer of binary image data. We will present our initial experiences with the use of Naradabrokering (www.naradabrokering.org) as a means to decouple clients and servers. Information is submitted to the system as a published item, while it is retrieved through a subscription mechanism, via what are known as "topics". These topic headers, their contents, and the list of subscribers are automatically tracked by Naradabrokering. This novel approach promises a high degree of fault tolerance, flexibility with respect to client diversity, and language independence for the
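    The topic-based publish/subscribe decoupling described above can be sketched in a few lines; this is a generic illustration, not the NaradaBrokering API:

```python
# Generic topic-based publish/subscribe sketch: producers publish items to
# named topics, and the broker forwards them to whoever subscribed.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, item):
        for cb in self.subscribers[topic]:     # clients never see each other
            cb(item)

broker = Broker()
broker.subscribe("images/rendered", lambda item: print("client got:", item))
broker.publish("images/rendered", b"\x89PNG...")   # e.g. binary image data
```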

  12. Table Extraction from Web Pages Using Conditional Random Fields to Extract Toponym Related Data

    NASA Astrophysics Data System (ADS)

    Luthfi Hanifah, Hayyu'; Akbar, Saiful

    2017-01-01

    Tables are one of the ways to visualize information on web pages. The abundant number of web pages that compose the World Wide Web has motivated research in information extraction and information retrieval, including research on table extraction. Besides, there is a need for a system designed specifically to handle location-related information. Against this background, this research provides a way to extract location-related data from web tables so that it can be used in the development of a Geographic Information Retrieval (GIR) system. Location-related data are identified by toponym (location name). In this research, a rule-based approach with a gazetteer is used to recognize toponyms in web tables. Meanwhile, to extract data from a table, a combination of a rule-based approach and a statistical approach is used. In the statistical approach, a Conditional Random Fields (CRF) model is used to understand the schema of the table. The result of table extraction is presented in JSON format. If a web table contains toponyms, a field is added to the JSON document to store the toponym values. This field can be used to index the table data in accordance with the toponym, which can then be used in the development of a GIR system.
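    A hedged sketch of the JSON output format described, with invented field names and data, showing the extra field that stores toponym values:

```python
# Illustrative-only JSON document for one extracted table; the abstract
# specifies a toponym field but not these exact field names or values.
import json

extracted = {
    "source_url": "http://example.org/page.html",
    "header": ["City", "Population"],
    "rows": [["Bandung", "2500000"], ["Jakarta", "10600000"]],
    "toponyms": ["Bandung", "Jakarta"],   # added only when a table has them
}
print(json.dumps(extracted, indent=2))
```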

  13. Non-visual Web Browsing: Beyond Web Accessibility

    PubMed Central

    Ramakrishnan, I.V.; Ashok, Vikas

    2017-01-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant contents. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability. PMID:29202137

  14. Non-visual Web Browsing: Beyond Web Accessibility.

    PubMed

    Ramakrishnan, I V; Ashok, Vikas; Billah, Syed Masum

    2017-07-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant contents. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability.

  15. 7 CFR 905.9 - Handle or ship.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Handle or ship. 905.9 Section 905.9 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... TANGELOS GROWN IN FLORIDA Order Regulating Handling Definitions § 905.9 Handle or ship. Handle or ship...

  16. 7 CFR 948.8 - Handle or ship.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Handle or ship. 948.8 Section 948.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Order Regulating Handling Definitions § 948.8 Handle or ship. Handle or ship means to transport, sell...

  17. 7 CFR 905.9 - Handle or ship.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 8 2014-01-01 2014-01-01 false Handle or ship. 905.9 Section 905.9 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... TANGELOS GROWN IN FLORIDA Order Regulating Handling Definitions § 905.9 Handle or ship. Handle or ship...

  18. 7 CFR 905.9 - Handle or ship.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Handle or ship. 905.9 Section 905.9 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... TANGELOS GROWN IN FLORIDA Order Regulating Handling Definitions § 905.9 Handle or ship. Handle or ship...

  19. 7 CFR 948.8 - Handle or ship.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Handle or ship. 948.8 Section 948.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Order Regulating Handling Definitions § 948.8 Handle or ship. Handle or ship means to transport, sell...

  20. 7 CFR 905.9 - Handle or ship.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Handle or ship. 905.9 Section 905.9 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... TANGELOS GROWN IN FLORIDA Order Regulating Handling Definitions § 905.9 Handle or ship. Handle or ship...

  1. 7 CFR 948.8 - Handle or ship.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 8 2014-01-01 2014-01-01 false Handle or ship. 948.8 Section 948.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Order Regulating Handling Definitions § 948.8 Handle or ship. Handle or ship means to transport, sell...

  2. 7 CFR 948.8 - Handle or ship.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Handle or ship. 948.8 Section 948.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Order Regulating Handling Definitions § 948.8 Handle or ship. Handle or ship means to transport, sell...

  3. FUEL HANDLING MECHANISM

    DOEpatents

    Koch, L.J.; Hutter, E.

    1960-02-01

    A remotely operable handling device specifically adapted for handling vertically disposed fuel rods in a nuclear reactor was developed. The device consists essentially of an elongated tubular member having at its lower end a gripping device of the pivoted-jaw type, adapted to grip an enlarged head on the upper end of the workpiece. The device includes a sensing element which engages the enlarged head and is displaced to remotely indicate when the workpiece is in the proper position to be engaged by the jaws.

  4. Sodium Handling Technology and Engineering Design of the Madison Dynamo Experiment.

    NASA Astrophysics Data System (ADS)

    Kendrick, R.; Forest, C. B.; O'Connell, R.; Wright, A.; Robinson, K.

    1998-11-01

    A new liquid metal MHD experiment is being constructed at the University of Wisconsin to test several key predictions of dynamo theory: magnetic instabilities driven by sheared flow, the effects of turbulence on current generation, and the back-reaction of the self-generated magnetic field on the fluid motion, which brings saturation. This presentation describes the engineering design of the experiment: a 0.5 m radius spherical vessel filled with liquid sodium at 150 degrees Celsius. The experiment is designed to achieve a magnetic Reynolds number in excess of 100, which requires approximately 80 hp of mechanical drive, producing flow velocities in the sodium of 15 m/s through impellers. Handling liquid sodium poses a number of technical challenges, but routine techniques for safely handling large quantities have been developed over the past several decades for the fast breeder reactor. The handling strategy is discussed, technical details concerning seals and pressurization are presented, and safety elements are highlighted.
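    A back-of-envelope check (not from the abstract) of why those numbers hang together: with the quoted speed and radius, and an assumed liquid-sodium conductivity of about 1e7 S/m, the magnetic Reynolds number comes out near 100.

```python
# Rm = v * L / eta, with eta = 1/(mu0 * sigma) the magnetic diffusivity.
# The sodium conductivity value below is an assumption, not from the record.
import math

mu0 = 4 * math.pi * 1e-7      # vacuum permeability, H/m
sigma = 1.0e7                 # assumed conductivity of hot liquid sodium, S/m
eta = 1.0 / (mu0 * sigma)     # magnetic diffusivity, ~0.08 m^2/s

v, L = 15.0, 0.5              # quoted flow speed (m/s) and vessel radius (m)
print("Rm ~", v * L / eta)    # ~94, of order the stated goal of Rm > 100
```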

  5. Tracing the cosmic web

    NASA Astrophysics Data System (ADS)

    Libeskind, Noam I.; van de Weygaert, Rien; Cautun, Marius; Falck, Bridget; Tempel, Elmo; Abel, Tom; Alpaslan, Mehmet; Aragón-Calvo, Miguel A.; Forero-Romero, Jaime E.; Gonzalez, Roberto; Gottlöber, Stefan; Hahn, Oliver; Hellwing, Wojciech A.; Hoffman, Yehuda; Jones, Bernard J. T.; Kitaura, Francisco; Knebe, Alexander; Manti, Serena; Neyrinck, Mark; Nuza, Sebastián E.; Padilla, Nelson; Platen, Erwin; Ramachandra, Nesar; Robotham, Aaron; Saar, Enn; Shandarin, Sergei; Steinmetz, Matthias; Stoica, Radu S.; Sousbie, Thierry; Yepes, Gustavo

    2018-01-01

    The cosmic web is one of the most striking features of the distribution of galaxies and dark matter on the largest scales in the Universe. It is composed of dense regions packed full of galaxies, long filamentary bridges, flattened sheets and vast low-density voids. The study of the cosmic web has focused primarily on the identification of such features, and on understanding the environmental effects on galaxy formation and halo assembly. As such, a variety of different methods have been devised to classify the cosmic web - depending on the data at hand, be it numerical simulations, large sky surveys or other. In this paper, we bring 12 of these methods together and apply them to the same data set in order to understand how they compare. In general, these cosmic-web classifiers have been designed with different cosmological goals in mind, and to study different questions. Therefore, one would not a priori expect agreement between different techniques; however, many of these methods do converge on the identification of specific features. In this paper, we study the agreements and disparities of the different methods. For example, each method finds that knots inhabit higher density regions than filaments, etc. and that voids have the lowest densities. For a given web environment, we find a substantial overlap in the density range assigned by each web classification scheme. We also compare classifications on a halo-by-halo basis; for example, we find that 9 of 12 methods classify around a third of group-mass haloes (i.e. Mhalo ∼ 1013.5 h-1 M⊙) as being in filaments. Lastly, so that any future cosmic-web classification scheme can be compared to the 12 methods used here, we have made all the data used in this paper public.

  6. A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence.

    PubMed

    Alphy, Anna; Prabakaran, S

    2015-01-01

    To enrich e-business, modern websites are personalized for each user by understanding the user's interests and behavior. The main challenges posed by online usage data are information overload and its dynamic nature. In this paper, to address these issues, we propose WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide users with dynamic recommendations that can be used for customizing a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of the bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than the traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in the recommendations.
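    For reference, the evaluation metrics named above can be computed as in the generic sketch below; the definitions follow common recommender-system usage, since the abstract does not spell out its exact protocol:

```python
# Generic precision/coverage/F1 helper for one user's recommendation list;
# definitions are the standard ones, not necessarily the paper's protocol.
def precision_coverage_f1(recommended, relevant, catalog_size):
    rec, rel = set(recommended), set(relevant)
    precision = len(rec & rel) / len(rec) if rec else 0.0
    recall = len(rec & rel) / len(rel) if rel else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    coverage = len(rec) / catalog_size   # fraction of the catalog ever shown
    return precision, coverage, f1

print(precision_coverage_f1(["a", "b", "c"], ["b", "c", "d"], catalog_size=10))
```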

  7. A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence

    PubMed Central

    Alphy, Anna; Prabakaran, S.

    2015-01-01

    To enrich e-business, modern websites are personalized for each user by understanding the user's interests and behavior. The main challenges posed by online usage data are information overload and its dynamic nature. In this paper, to address these issues, we propose WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide users with dynamic recommendations that can be used for customizing a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of the bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than the traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in the recommendations. PMID:26229978

  8. Proposition and Organization of an Adaptive Learning Domain Based on Fusion from the Web

    ERIC Educational Resources Information Center

    Chaoui, Mohammed; Laskri, Mohamed Tayeb

    2013-01-01

    The Web allows self-navigated education through interaction with large amounts of Web resources. While enjoying the flexibility of Web tools, authors may struggle with searching for and filtering Web resources when they face various resource formats and complex structures. An adaptation of extracted Web resources must be assured by authors, to give…

  9. 7 CFR 58.443 - Whey handling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Whey handling. 58.443 Section 58.443 Agriculture... Procedures § 58.443 Whey handling. (a) Adequate sanitary facilities shall be provided for the handling of whey. If outside, necessary precautions shall be taken to minimize flies, insects and development of...

  10. NGL Viewer: a web application for molecular visualization

    PubMed Central

    Rose, Alexander S.; Hildebrand, Peter W.

    2015-01-01

    The NGL Viewer (http://proteinformatics.charite.de/ngl) is a web application for the visualization of macromolecular structures. By fully adopting capabilities of modern web browsers, such as WebGL, for molecular graphics, the viewer can interactively display large molecular complexes and is also unaffected by the retirement of third-party plug-ins like Flash and Java Applets. Generally, the web application offers comprehensive molecular visualization through a graphical user interface so that life scientists can easily access and profit from available structural data. It supports common structural file-formats (e.g. PDB, mmCIF) and a variety of molecular representations (e.g. ‘cartoon, spacefill, licorice’). Moreover, the viewer can be embedded in other web sites to provide specialized visualizations of entries in structural databases or results of structure-related calculations. PMID:25925569

  11. 7 CFR 945.9 - Ship or handle.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Ship or handle. 945.9 Section 945.9 Agriculture... DESIGNATED COUNTIES IN IDAHO, AND MALHEUR COUNTY, OREGON Order Regulating Handling Definitions § 945.9 Ship or handle. Ship or handle means to pack, sell, consign, transport or in any other way to place...

  12. 7 CFR 927.8 - Ship or handle.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Ship or handle. 927.8 Section 927.8 Agriculture... Order Regulating Handling Definitions § 927.8 Ship or handle. Ship or handle means to sell, deliver, consign, transport or ship pears within the production area or between the production area and any point...

  13. 7 CFR 927.8 - Ship or handle.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Ship or handle. 927.8 Section 927.8 Agriculture... Order Regulating Handling Definitions § 927.8 Ship or handle. Ship or handle means to sell, deliver, consign, transport or ship pears within the production area or between the production area and any point...

  14. 7 CFR 945.9 - Ship or handle.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Ship or handle. 945.9 Section 945.9 Agriculture... DESIGNATED COUNTIES IN IDAHO, AND MALHEUR COUNTY, OREGON Order Regulating Handling Definitions § 945.9 Ship or handle. Ship or handle means to pack, sell, consign, transport or in any other way to place...

  15. 7 CFR 927.8 - Ship or handle.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Ship or handle. 927.8 Section 927.8 Agriculture... Order Regulating Handling Definitions § 927.8 Ship or handle. Ship or handle means to sell, deliver, consign, transport or ship pears within the production area or between the production area and any point...

  16. 7 CFR 945.9 - Ship or handle.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Ship or handle. 945.9 Section 945.9 Agriculture... DESIGNATED COUNTIES IN IDAHO, AND MALHEUR COUNTY, OREGON Order Regulating Handling Definitions § 945.9 Ship or handle. Ship or handle means to pack, sell, consign, transport or in any other way to place...

  17. 7 CFR 927.8 - Ship or handle.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Ship or handle. 927.8 Section 927.8 Agriculture... Order Regulating Handling Definitions § 927.8 Ship or handle. Ship or handle means to sell, deliver, consign, transport or ship pears within the production area or between the production area and any point...

  18. 7 CFR 927.8 - Ship or handle.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 8 2014-01-01 2014-01-01 false Ship or handle. 927.8 Section 927.8 Agriculture... Order Regulating Handling Definitions § 927.8 Ship or handle. Ship or handle means to sell, deliver, consign, transport or ship pears within the production area or between the production area and any point...

  19. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo

    2010-05-01

    In recent decades two main paradigms for resource sharing have emerged and reached maturity: the Web and the Grid. Both have proved suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications); however, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS) most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disasters Management, Environmental Sciences, etc. On the other hand, several application areas need to run complex models that require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies might be a valuable approach to enabling advanced ESS applications. Currently both geo-information and Grid technologies have reached a high level of maturity, allowing to build such an

  20. Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.

    PubMed

    Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L

    2016-02-09

    Cluster randomized trials (CRTs) randomize participants in groups, rather than as individuals and are key tools used to assess interventions in health research where treatment contamination is likely or if individual randomization is not feasible. Two potential major pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) trials reported some missing outcome data. Of those reporting missing data, the median percent of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling missing data in practice remains suboptimal. Researchers and applied statisticians should carry out appropriate missing data methods, which are valid under plausible assumptions in order to increase statistical power in trials and reduce the possibility of bias. Sensitivity analysis should be
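    Of the approaches the review tallies, mixed models both use all available outcome data (valid under a missing-at-random assumption) and account for clustering. A hedged sketch with an invented data frame:

```python
# Hedged sketch of a cluster-aware primary analysis: a linear mixed model
# with a random intercept per cluster. Data and variable names are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "outcome": [5.1, 4.8, 6.0, 5.5, 4.2, 4.9, 5.8, 6.1],
    "arm":     [0, 0, 0, 0, 1, 1, 1, 1],        # trial arm
    "cluster": ["a", "a", "b", "b", "c", "c", "d", "d"],
})
# Fixed effect of arm, random intercept for cluster; cases with missing
# outcomes would simply drop out of the likelihood under MAR.
fit = smf.mixedlm("outcome ~ arm", df, groups=df["cluster"]).fit()
print(fit.summary())
```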

  1. Hypothenar hammer syndrome from ice hockey stick-handling.

    PubMed

    Zayed, Mohamed A; McDonald, Joey; Tittley, Jacques G

    2013-11-01

    Ulnar artery thrombosis and hypothenar hammer syndrome are rare vascular complications that could potentially occur with repeated blows or trauma to the hand. Although initially reported as an occupational hazard among laborers and craftsmen, it has been observed more recently among recreationalists and athletes. Until now, it has never been reported as a complication in ice hockey players. In this case report, a 26-year-old Canadian professional ice hockey player presented with acute dominant right hand paleness, coolness, and pain with hand use. The patient used a wooden hockey stick with a large knob of tape at the end of the handle, which he regularly gripped in the palm of his right hand to help with face-offs and general stick-handling. Sonographic evaluation demonstrated no arterial flow in the distal right ulnar artery distribution, and ulnar artery occlusion with no aneurysmal degeneration was confirmed by magnetic resonance angiogram. Intraarterial thrombolytic therapy was initiated, and subsequent serial angiograms demonstrated significant improvement in distal ulnar artery flow as well as recanalization of right hand deep palmar arch and digital arteries. The patient's symptoms resolved, and he was maintained on therapeutic anticoagulation for 3 months prior to returning to playing ice hockey professionally, but with a padded glove and no tape knob at the handle tip. This case highlights a unique presentation of hockey stick-handling causing ulnar artery thrombosis that was likely from repeated palmar hypothenar trauma. Appropriate diagnostic imaging, early intraarterial thrombolysis, and postoperative surveillance and follow-up were crucial for the successful outcome in this patient. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Web-Based Course Delivery and Administration Using Scheme.

    ERIC Educational Resources Information Center

    Salustri, Filippo A.

    This paper discusses the use at the University of Windsor (Ontario) of a small World Wide Web-based tool for course delivery and administration called HAL (HTML-based Administrative Lackey), written in the Scheme programming language. This tool was developed by the author to provide Web-based services for a large first-year undergraduate course in…

  3. The Gaia On-Board Scientific Data Handling

    NASA Astrophysics Data System (ADS)

    Arenou, F.; Babusiaux, C.; Chéreau, F.; Mignot, S.

    2005-01-01

    Because Gaia will perform a continuous all-sky survey at medium (Spectro) or very high (Astro) angular resolution, the on-board processing needs to cope with a wide variety of objects and densities, which calls for generic and adaptive algorithms at the detection level and beyond. Consequently, the Pyxis scientific algorithms developed for the on-board data handling cover a large range of applications: detection and confirmation of astronomical objects, background sky estimation, classification of detected objects, on-board detection of Near-Earth Objects, and window selection and positioning. Very dense fields, where the real-time computing requirements must remain within fixed bounds, are particularly challenging. Another constraint stems from the limited telemetry bandwidth, and an additional compromise has to be found between scientific requirements and constraints in terms of the mass, volume and power budgets of the satellite. The rationale for the on-board data handling procedure is described here, together with the developed algorithms, the main issues and the expected scientific performances in the Astro and Spectro instruments.

  4. Design and deployment of a large brain-image database for clinical and nonclinical research

    NASA Astrophysics Data System (ADS)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

    An efficient database is an essential component of organizing diverse information on image metadata and patient information for research in medical imaging. This paper describes the design, development and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical research projects. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading and sharing as well as database querying and management, with security and data anonymization concerns well taken care of. The database structure is a multi-tier client-server architecture comprising a Relational Database Management System, a Security Layer, an Application Layer and a User Interface. An image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to handle. We have used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information, which can be used effectively in research from the clinicians' point of view. The database is suitable for validation of algorithms on large populations of cases, and medical images for processing can be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency, and large image repositories can be managed more effectively. The prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.
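    As a purely illustrative companion to the description above, a minimal relational schema of the patient/pathology/image kind could look like the following; every table and column name here is invented, since the abstract does not give the actual schema:

```python
# Illustrative-only schema sketch for a patient/image repository; not the
# system's real schema, which the abstract does not disclose.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patient (
    patient_id INTEGER PRIMARY KEY,
    anon_code  TEXT UNIQUE NOT NULL        -- anonymized identifier
);
CREATE TABLE image (
    image_id   INTEGER PRIMARY KEY,
    patient_id INTEGER REFERENCES patient(patient_id),
    pathology  TEXT,
    modality   TEXT,                       -- e.g. MR, CT
    file_path  TEXT NOT NULL
);
CREATE INDEX idx_image_pathology ON image(pathology);
""")
con.execute("INSERT INTO patient (anon_code) VALUES ('P0001')")
print(con.execute("SELECT * FROM patient").fetchall())
```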

  5. Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Resseguie, David R

    There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks, Sensorpedia (an ad-hoc Internet-scale sensor network), our framework for integrating robots with Sensorpedia, two applications which illustrate our framework's ability to support general web-based robotic control, and offer experimental results that illustrate our framework's scalability, feasibility, and resource requirements.

  6. Programmatic access to data and information at the IRIS DMC via web services

    NASA Astrophysics Data System (ADS)

    Weertman, B. R.; Trabant, C.; Karstens, R.; Suleiman, Y. Y.; Ahern, T. K.; Casey, R.; Benson, R. B.

    2011-12-01

    The IRIS Data Management Center (DMC) has developed a suite of web services that provide access to the DMC's time series holdings, their related metadata and earthquake catalogs. In addition, services are available to perform simple, on-demand time series processing at the DMC before data are shipped to the user. The primary goal is to provide programmatic access to data and processing services in a manner usable by and useful to the research community. The web services are relatively simple to understand and use, and they will form the foundation on which future DMC access tools will be built. Based on standard Web technologies, they can be accessed programmatically with a wide range of programming languages (e.g. Perl, Python, Java), with command line utilities such as wget and curl, or with any web browser. We anticipate these services being used for everything from simple command line access, through use in shell scripts and higher programming languages, to integration within complex data processing software. In addition to improving access to our data by the seismological community, the web services will also make our data more accessible to other disciplines. The web services available from the DMC include ws-bulkdataselect for the retrieval of large volumes of miniSEED data, ws-timeseries for the retrieval of individual segments of time series data in a variety of formats (miniSEED, SAC, ASCII, audio WAVE, and PNG plots) with optional signal processing, ws-station for station metadata in StationXML format, ws-resp for the retrieval of instrument response in RESP format, ws-sacpz for the retrieval of sensor response in the SAC poles and zeros convention and ws-event for the retrieval of earthquake catalogs. To make the services even easier to use, the DMC is developing a library that allows Java programmers to seamlessly retrieve and integrate DMC information into their own programs. The library will handle all aspects of dealing with the services and will parse the returned
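    Scripted access of the kind described might look like the sketch below; the endpoint URL is a placeholder (the DMC's service names and URLs have evolved since this abstract), so treat the path and parameters as illustrative:

```python
# Hedged sketch of programmatic access to a time-series web service; the
# endpoint and parameter names are placeholders, not the DMC's current API.
import requests

params = {
    "net": "IU", "sta": "ANMO", "loc": "00", "cha": "BHZ",
    "start": "2010-02-27T06:30:00", "end": "2010-02-27T07:30:00",
}
resp = requests.get("http://service.example.org/ws-timeseries/query",
                    params=params, timeout=60)
resp.raise_for_status()
with open("trace.mseed", "wb") as f:
    f.write(resp.content)             # miniSEED payload, per the record
```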

  7. A Study and Taxonomy of Vulnerabilities in Web Based Animation and Interactivity Software

    DTIC Science & Technology

    2010-12-01

    Flash Player is available as a plugin for most common Web browsers (Firefox, Mozilla, Netscape, Opera) and as an ActiveX control for Internet... script or HTML via (1) a swf file that uses the asfunction: protocol or (2) the navigateToURL function when used with the Flash Player ActiveX... malicious page or open a malicious file. 2. Coding an Exploit: The specific flaw exists in the Flash Player ActiveX Control's handling of the

  8. Secure Web-based Ground System User Interfaces over the Open Internet

    NASA Technical Reports Server (NTRS)

    Langston, James H.; Murray, Henry L.; Hunt, Gary R.

    1998-01-01

    A prototype has been developed which makes use of commercially available products in conjunction with the Java programming language to provide a secure user interface for command and control over the open Internet. This paper reports successful demonstration of: (1) Security over the Internet, including encryption and certification; (2) Integration of Java applets with a COTS command and control product; (3) Remote spacecraft commanding using the Internet. The Java-based Spacecraft Web Interface to Telemetry and Command Handling (Jswitch) ground system prototype provides these capabilities. This activity demonstrates the use and integration of current technologies to enable a spacecraft engineer or flight operator to monitor and control a spacecraft from a user interface communicating over the open Internet using standard World Wide Web (WWW) protocols and commercial off-the-shelf (COTS) products. The core command and control functions are provided by the COTS Epoch 2000 product. The standard WWW tools and browsers are used in conjunction with the Java programming technology. Security is provided with the current encryption and certification technology. This system prototype is a step in the direction of giving scientists and flight operators Web-based access to instrument, payload, and spacecraft data.

  9. Guidelines for safe handling of hazardous drugs: A systematic review

    PubMed Central

    Bernabeu-Martínez, Mari A.; Ramos Merino, Mateo; Santos Gago, Juan M.; Álvarez Sabucedo, Luis M.; Wanden-Berghe, Carmina

    2018-01-01

    Objective To review the scientific literature related to the safe handling of hazardous drugs (HDs). Method Critical analysis of works retrieved from MEDLINE, the Cochrane Library, Scopus, CINAHL, Web of Science and LILACS using the terms "Hazardous Substances", "Antineoplastic Agents" and "Cytostatic Agents", applying "Humans" and "Guidelines" as filters. Date of search: January 2017. Results In total, 1100 references were retrieved, and from those, 61 documents were selected based on the inclusion and exclusion criteria: 24 (39.3%) documents related to recommendations about HDs; 27 (44.3%) about antineoplastic agents, and 10 (33.3%) about other types of substances (monoclonal antibodies, gene medicine and other chemical and biological agents). In 14 (23.3%) guides, all the stages in the manipulation process involving a risk due to exposure were considered. Only one guide addressed all stages of the handling process of HDs (including stages with and without the risk of exposure). The most described stages were drug preparation (41 guides, 67.2%), staff training and/or patient education (38 guides, 62.3%), and administration (37 guides, 60.7%). No standardized informatics system was found that ensured quality management, traceability and minimization of the risks associated with these drugs. Conclusions Most of the analysed guidelines limit their recommendations to the manipulation of antineoplastics. The most frequently described activities were preparation, training, and administration. It would be convenient to apply ICTs (Information and Communications Technologies) to manage processes involving HDs in a more complete and simpler fashion. PMID:29750798

  10. Guidelines for safe handling of hazardous drugs: A systematic review.

    PubMed

    Bernabeu-Martínez, Mari A; Ramos Merino, Mateo; Santos Gago, Juan M; Álvarez Sabucedo, Luis M; Wanden-Berghe, Carmina; Sanz-Valero, Javier

    2018-01-01

    To review the scientific literature related to the safe handling of hazardous drugs (HDs). Critical analysis of works retrieved from MEDLINE, the Cochrane Library, Scopus, CINAHL, Web of Science and LILACS using the terms "Hazardous Substances", "Antineoplastic Agents" and "Cytostatic Agents", applying "Humans" and "Guidelines" as filters. Date of search: January 2017. In total, 1100 references were retrieved, and from those, 61 documents were selected based on the inclusion and exclusion criteria: 24 (39.3%) documents related to recommendations about HDs; 27 (44.3%) about antineoplastic agents, and 10 (33.3%) about other types of substances (monoclonal antibodies, gene medicine and other chemical and biological agents). In 14 (23.3%) guides, all the stages in the manipulation process involving a risk due to exposure were considered. Only one guide addressed all stages of the handling process of HDs (including stages with and without the risk of exposure). The most described stages were drug preparation (41 guides, 67.2%), staff training and/or patient education (38 guides, 62.3%), and administration (37 guides, 60.7%). No standardized informatics system was found that ensured quality management, traceability and minimization of the risks associated with these drugs. Most of the analysed guidelines limit their recommendations to the manipulation of antineoplastics. The most frequently described activities were preparation, training, and administration. It would be convenient to apply ICTs (Information and Communications Technologies) to manage processes involving HDs in a more complete and simpler fashion.

  11. Strong regularities in world wide web surfing

    PubMed

    Huberman; Pirolli; Pitkow; Lukose

    1998-04-03

    One of the most common modes of accessing information in the World Wide Web is surfing from one document to another along hyperlinks. Several large empirical studies have revealed common patterns of surfing behavior. A model that assumes that users make a sequence of decisions to proceed to another page, continuing as long as the value of the current page exceeds some threshold, yields the probability distribution for the number of pages that a user visits within a given Web site. This model was verified by comparing its predictions with detailed measurements of surfing patterns. The model also explains the observed Zipf-like distributions in page hits observed at Web sites.
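
    The decision model lends itself to a few lines of simulation: treat the perceived value of successive pages as a random walk and stop surfing once it falls below a threshold. The parameters below are illustrative and not fitted to the paper's measurements.

```python
# A minimal simulation of the threshold model, assuming the value of each
# successive page follows a Gaussian random walk; parameters are illustrative.
import random

def pages_visited(start_value=1.0, threshold=0.0, sigma=0.5, max_pages=10_000):
    value, pages = start_value, 1
    while value > threshold and pages < max_pages:
        value += random.gauss(0.0, sigma)  # perceived value of the next page
        pages += 1
    return pages

counts = [pages_visited() for _ in range(100_000)]
# The distribution of pages per visit comes out heavy-tailed
# (inverse-Gaussian-like), in line with the Zipf-like page hits noted above.
print("mean pages per visit:", sum(counts) / len(counts))
```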

  12. Orientation of cosmic web filaments with respect to the underlying velocity field

    NASA Astrophysics Data System (ADS)

    Tempel, E.; Libeskind, N. I.; Hoffman, Y.; Liivamägi, L. J.; Tamm, A.

    2014-01-01

    The large-scale structure of the Universe is characterized by a web-like structure made of voids, sheets, filaments and knots. The structure of this so-called cosmic web is dictated by the local velocity shear tensor. In particular, the local direction of a filament should be strongly aligned with $\hat{e}_3$, the eigenvector associated with the smallest eigenvalue of the tensor. That conjecture is tested here on the basis of a cosmological simulation. The cosmic web delineated by the halo distribution is probed by a marked point process with interactions (the Bisous model), detecting filaments directly from the halo distribution (P-web). The detected P-web filaments are found to be strongly aligned with the local $\hat{e}_3$: the alignment is within 30° for ~80 per cent of the elements. This indicates that large-scale filaments defined purely from the distribution of haloes carry more than just morphological information, although the Bisous model does not make any prior assumption on the underlying shear tensor. The P-web filaments are also compared to the structure revealed from the velocity shear tensor itself (V-web). In the densest regions, the P- and V-web filaments overlap well (90 per cent), whereas in lower density regions, the P-web filaments preferentially mark sheets in the V-web.
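
    The alignment test itself reduces to an eigendecomposition and an angle. A minimal numpy sketch, using a randomly generated shear tensor purely for illustration:

```python
# Sketch of the alignment measure, assuming a random symmetric tensor in
# place of a simulated velocity shear field.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
shear = (A + A.T) / 2                 # symmetric velocity shear tensor
vals, vecs = np.linalg.eigh(shear)    # eigh sorts eigenvalues ascending
e3 = vecs[:, 0]                       # eigenvector of the smallest eigenvalue

filament = rng.normal(size=3)         # stand-in for a detected filament axis
filament /= np.linalg.norm(filament)

# A filament is an axis, not a vector, so fold the angle with abs().
angle = np.degrees(np.arccos(abs(filament @ e3)))
print(f"alignment angle: {angle:.1f} deg (paper: within 30 deg for ~80%)")
```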

  13. Finding Specification Pages from the Web

    NASA Astrophysics Data System (ADS)

    Yoshinaga, Naoki; Torisawa, Kentaro

    This paper presents a method of finding a specification page on the Web for a given object (e.g., "Ch. d'Yquem") and its class label (e.g., "wine"). A specification page for an object is a Web page which gives concise attribute-value information about the object (e.g., "county"-"Sauternes") in well-formatted structures. A simple unsupervised method using layout and symbolic decoration cues was applied to a large number of Web pages to acquire candidate attributes for each class (e.g., "county" for the class "wine"). We then filter out irrelevant words from the putative attributes through an author-aware scoring function that we call site frequency. We used the acquired attributes to select a representative specification page for a given object from the Web pages retrieved by a normal search engine. Experimental results revealed that our system greatly outperformed the normal search engine in terms of specification retrieval.
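
    The site-frequency idea can be shown in a few lines: score a candidate attribute by the number of distinct sites whose pages mention it, so that one author's boilerplate scores low however often it repeats. The data below is invented and the scoring is a simplification of the paper's function:

```python
# Toy "site frequency": count distinct hosts mentioning an attribute, so a
# word repeated across one author's pages is not over-counted. Data invented.
from urllib.parse import urlparse

pages = [
    ("http://wines-a.example/yquem",   ["county", "vintage", "menu"]),
    ("http://wines-a.example/margaux", ["county", "vintage", "menu"]),
    ("http://wines-b.example/yquem",   ["county", "grape"]),
]

def site_frequency(attribute):
    return len({urlparse(url).netloc for url, attrs in pages
                if attribute in attrs})

for attr in ("county", "vintage", "menu", "grape"):
    print(attr, site_frequency(attr))
# "county" scores 2 (two sites); "menu" scores 1 despite appearing twice.
```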

  14. Human dynamics revealed through Web analytics

    NASA Astrophysics Data System (ADS)

    Gonçalves, Bruno; Ramasco, José J.

    2008-08-01

    The increasing ubiquity of Internet access and the frequency with which people interact with it raise the possibility of using the Web to better observe, understand, and monitor several aspects of human social behavior. Web sites with large numbers of frequently returning users are ideal for this task. If these sites belong to companies or universities, their usage patterns can furnish information about the working habits of entire populations. In this work, we analyze the properly anonymized logs detailing the access history to Emory University’s Web site. Emory is a medium-sized university located in Atlanta, Georgia. We find interesting structure in the activity patterns of the domain and study in a systematic way the main forces behind the dynamics of the traffic. In particular, we find that linear preferential linking, priority-based queuing, and the decay of interest for the contents of the pages are the essential ingredients to understand the way users navigate the Web.

  15. Lagrangian methods of cosmic web classification

    NASA Astrophysics Data System (ADS)

    Fisher, J. D.; Faltenbacher, A.; Johnson, M. S. T.

    2016-05-01

    The cosmic web defines the large-scale distribution of matter we see in the Universe today. Classifying the cosmic web into voids, sheets, filaments and nodes allows one to explore structure formation and the role environmental factors have on halo and galaxy properties. While existing studies of cosmic web classification concentrate on grid-based methods, this work explores a Lagrangian approach where the V-web algorithm proposed by Hoffman et al. is implemented with techniques borrowed from smoothed particle hydrodynamics. The Lagrangian approach allows one to classify individual objects (e.g. particles or haloes) based on properties of their nearest neighbours in an adaptive manner. It can be applied directly to a halo sample which dramatically reduces computational cost and potentially allows an application of this classification scheme to observed galaxy samples. Finally, the Lagrangian nature admits a straightforward inclusion of the Hubble flow negating the necessity of a visually defined threshold value which is commonly employed by grid-based classification methods.
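
    The underlying V-web rule is compact: count how many eigenvalues of the local velocity shear tensor exceed a threshold and map that count to a web type. A minimal sketch, with a random tensor and a zero threshold as illustrative assumptions:

```python
# Minimal V-web style rule, assuming a random symmetric tensor and a zero
# threshold; real applications derive the tensor from the velocity field.
import numpy as np

LABELS = {0: "void", 1: "sheet", 2: "filament", 3: "knot"}

def classify(shear, threshold=0.0):
    eigenvalues = np.linalg.eigvalsh(shear)          # symmetric tensor
    return LABELS[int((eigenvalues > threshold).sum())]

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3))
print(classify((A + A.T) / 2))
```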

  16. Replacement of SSE with NASA's POWER Project GIS-enabled Web Data Portal

    Atmospheric Science Data Center

    2018-04-30

    Replacement of SSE (Release 6) with NASA's Prediction of Worldwide Energy Resource (POWER) Project GIS-enabled Web Data Portal. The POWER project is funded largely by the NASA Earth Applied Sciences program.

  17. Handling an Asthma Flare-Up

    MedlinePlus

    KidsHealth / For Kids / Handling an Asthma Flare-Up (also available in Spanish as "Cómo controlar las crisis asmáticas"). What's an Asthma Flare-Up? If you have asthma, you probably ...

  18. 7 CFR 1205.312 - Handle.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE COTTON RESEARCH AND PROMOTION Cotton Research and Promotion Order Definitions § 1205.312 Handle. Handle means to harvest, gin, warehouse, compress, purchase, market, transport, or otherwise acquire ownership or control of cotton. [31 FR 16758...

  19. 7 CFR 983.14 - Handle.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE PISTACHIOS GROWN IN CALIFORNIA, ARIZONA, AND NEW MEXICO Definitions § 983.14 Handle. Handle means to engage in: (a) Receiving pistachios; (b) Hulling and drying pistachios; (c) Further preparing pistachios by sorting, sizing, shelling, roasting...

  20. 7 CFR 1219.11 - Handle.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE HASS AVOCADO PROMOTION, RESEARCH, AND INFORMATION Hass Avocado Promotion, Research, and Information Order Definitions § 1219.11 Handle. Handle means to pack, process, transport, purchase, or in any other way to place or cause Hass avocados...

  1. 7 CFR 1219.11 - Handle.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE HASS AVOCADO PROMOTION, RESEARCH, AND INFORMATION Hass Avocado Promotion, Research, and Information Order Definitions § 1219.11 Handle. Handle means to pack, process, transport, purchase, or in any other way to place or cause Hass avocados...

  2. 7 CFR 1219.11 - Handle.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE HASS AVOCADO PROMOTION, RESEARCH, AND INFORMATION Hass Avocado Promotion, Research, and Information Order Definitions § 1219.11 Handle. Handle means to pack, process, transport, purchase, or in any other way to place or cause Hass avocados...

  3. 7 CFR 1219.11 - Handle.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE HASS AVOCADO PROMOTION, RESEARCH, AND INFORMATION Hass Avocado Promotion, Research, and Information Order Definitions § 1219.11 Handle. Handle means to pack, process, transport, purchase, or in any other way to place or cause Hass avocados...

  4. MyLabStocks: a web-application to manage molecular biology materials

    PubMed Central

    Chuffart, Florent; Yvert, Gaël

    2014-01-01

    Laboratory stocks are the hardware of research. They must be stored and managed with minimum loss of material and information. Plasmids, oligonucleotides and strains are regularly exchanged between collaborators within and between laboratories. Managing and sharing information about every item is crucial for retrieval of reagents, for planning experiments and for reproducing past experimental results. We have developed a web-based application to manage stocks commonly used in a molecular biology laboratory. Its functionalities include user-defined privileges, visualization of plasmid maps directly from their sequence, and the capacity to search items from fields of annotation or directly from a query sequence using BLAST. It is designed to handle records of plasmids, oligonucleotides, yeast strains, antibodies, pipettes and notebooks. Based on PHP/MySQL, it can easily be extended to handle other types of stocks and it can be installed on any server architecture. MyLabStocks is freely available from: https://forge.cbp.ens-lyon.fr/redmine/projects/mylabstocks under an open source licence. PMID:24643870
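
    MyLabStocks itself is built on PHP/MySQL; purely to illustrate the underlying record-keeping idea, here is a standard-library Python sketch of searching stock records by annotation fields. The table and field names are invented:

```python
# Not MyLabStocks code; a stdlib sketch of annotation-field search over
# stock records, with invented table and field names.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE plasmids (
    name TEXT, sequence TEXT, annotation TEXT, owner TEXT)""")
db.execute("INSERT INTO plasmids VALUES (?, ?, ?, ?)",
           ("pXY101", "ATGGCA...", "GFP reporter, AmpR", "demo-lab"))

# Search items from fields of annotation, as the web application allows.
rows = db.execute("SELECT name, owner FROM plasmids WHERE annotation LIKE ?",
                  ("%GFP%",)).fetchall()
print(rows)  # [('pXY101', 'demo-lab')]
```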

  5. EarthServer - 3D Visualization on the Web

    NASA Astrophysics Data System (ADS)

    Wagner, Sebastian; Herzig, Pasquale; Bockholt, Ulrich; Jung, Yvonne; Behr, Johannes

    2013-04-01

    EarthServer (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, is a project to enable the management, access and exploration of massive, multi-dimensional datasets using Open GeoSpatial Consortium (OGC) query and processing language standards like WCS 2.0 and WCPS. To this end, a server/client architecture designed to handle Petabyte/Exabyte volumes of multi-dimensional data is being developed and deployed. As an important part of the EarthServer project, six Lighthouse Applications, major scientific data exploitation initiatives, are being established to make cross-domain, Earth Sciences related data repositories available in an open and unified manner, as service endpoints based on solutions and infrastructure developed within the project. Client technology developed and deployed in EarthServer ranges from mobile and web clients to immersive virtual reality systems, all designed to interact with a physically and logically distributed server infrastructure using exclusively OGC standards. In this contribution, we would like to present our work on a web-based 3D visualization and interaction client for Earth Sciences data using only technology found in standard web browsers, without requiring the user to install plugins or addons. Additionally, we are able to run the earth data visualization client on a wide range of platforms with very different software and hardware requirements, such as smartphones (e.g. iOS, Android) and different desktop systems. High-quality, hardware-accelerated visualization of 3D and 4D content in standard web browsers can be realized now, and we believe it will become more and more common to use this fast, lightweight and ubiquitous platform to provide insights into big datasets without requiring the user to set up a specialized client first. With that in mind, we will also point out some of the limitations we encountered using current web technologies. Underlying the EarthServer web client
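
    Since access in EarthServer goes exclusively through OGC standards, a request can be assembled by hand. The sketch below builds a WCS 2.0 GetCoverage URL; the endpoint and coverage identifier are placeholders, and only the general key-value request form follows the standard:

```python
# Hand-built OGC WCS 2.0 GetCoverage request; endpoint and coverageId are
# placeholders, and only the general KVP form follows the standard.
import urllib.parse

endpoint = "https://earthserver.example/ows"        # placeholder endpoint
query = urllib.parse.urlencode([
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("request", "GetCoverage"),
    ("coverageId", "temperature_4d"),               # hypothetical coverage
    ("subset", "Lat(40,50)"),                       # trim to a region
    ("subset", "Long(0,10)"),
    ("format", "application/netcdf"),
])
print(endpoint + "?" + query)  # usable from any HTTP client or web browser
```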

  6. Performance of the Magnetospheric Multiscale central instrument data handling

    NASA Astrophysics Data System (ADS)

    Klar, Robert A.; Miller, Scott A.; Brysch, Michael L.; Bertrand, Allison R.

    In order to study the fundamental physical processes of magnetic reconnection, particle acceleration and turbulence, the Magnetospheric Multiscale (MMS) mission employs a constellation of four identically configured observatories, each with a suite of complementary science instruments. Southwest Research Institute® (SwRI®) developed the Central Instrument Data Processor (CIDP) to handle the large data volume associated with these instruments. The CIDP is an integrated access point between the instruments and the spacecraft. It provides synchronization pulses, relays telecommands, and gathers instrument housekeeping telemetry. It collects science data from the instruments and stores it to a mass memory for later playback to a ground station. This paper retrospectively examines the data handling performance realized by the CIDP implementation. It elaborates on some of the constraints on the hardware and software designs and the resulting effects on performance. For the hardware, it discusses the limitations of the front-end electronics input/output (I/O) architecture and associated mass memory buffering. For the software, it discusses the limitations of the Consultative Committee for Space Data Systems (CCSDS) File Delivery Protocol (CFDP) implementation and the data structure choices for file management. It also describes design changes that improve data handling performance in newer designs.

  7. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe impact on southern Taiwan awakened public awareness of large-scale landslide disasters. Such disasters produce large quantities of sediment, which degrades the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and extensive archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but are also of great value for further processing. The study defined basic data formats and standards for the various types of data collected about these reservoirs, and then provided a management platform based on these formats and standards. Meanwhile, for practicality and convenience, the large-scale landslide disaster database system is built with both information-providing and information-receiving capabilities, so users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system may be out of date at any time. In order to provide long-term service, the system therefore reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design technology, which lets users easily operate and extend this large-scale landslide disaster database system.

  8. Dwarf Galaxies and the Cosmic Web

    NASA Astrophysics Data System (ADS)

    Benítez-Llambay, Alejandro; Navarro, Julio F.; Abadi, Mario G.; Gottlöber, Stefan; Yepes, Gustavo; Hoffman, Yehuda; Steinmetz, Matthias

    2013-02-01

    We use a cosmological simulation of the formation of the Local Group of Galaxies to identify a mechanism that enables the removal of baryons from low-mass halos without appealing to feedback or reionization. As the Local Group forms, matter bound to it develops a network of filaments and pancakes. This moving web of gas and dark matter drifts and sweeps a large volume, overtaking many halos in the process. The dark matter content of these halos is unaffected but their gas can be efficiently removed by ram pressure. The loss of gas is especially pronounced in low-mass halos due to their lower binding energy and has a dramatic effect on the star formation history of affected systems. This "cosmic web stripping" may help to explain the scarcity of dwarf galaxies compared with the numerous low-mass halos expected in ΛCDM and the large diversity of star formation histories and morphologies characteristic of faint galaxies. Although our results are based on a single high-resolution simulation, it is likely that the hydrodynamical interaction of dwarf galaxies with the cosmic web is a crucial ingredient so far missing from galaxy formation models.

  9. DWARF GALAXIES AND THE COSMIC WEB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benitez-Llambay, Alejandro; Abadi, Mario G.; Navarro, Julio F.

    2013-02-01

    We use a cosmological simulation of the formation of the Local Group of Galaxies to identify a mechanism that enables the removal of baryons from low-mass halos without appealing to feedback or reionization. As the Local Group forms, matter bound to it develops a network of filaments and pancakes. This moving web of gas and dark matter drifts and sweeps a large volume, overtaking many halos in the process. The dark matter content of these halos is unaffected but their gas can be efficiently removed by ram pressure. The loss of gas is especially pronounced in low-mass halos due to their lower binding energy and has a dramatic effect on the star formation history of affected systems. This 'cosmic web stripping' may help to explain the scarcity of dwarf galaxies compared with the numerous low-mass halos expected in ΛCDM and the large diversity of star formation histories and morphologies characteristic of faint galaxies. Although our results are based on a single high-resolution simulation, it is likely that the hydrodynamical interaction of dwarf galaxies with the cosmic web is a crucial ingredient so far missing from galaxy formation models.

  10. 7 CFR 1216.12 - Handle.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE PEANUT PROMOTION, RESEARCH, AND INFORMATION ORDER Peanut Promotion, Research, and Information Order Definitions § 1216.12 Handle. Handle means... peanuts and in the shipment (except as a common or contract carrier of peanuts owned by another) or sale...

  11. 7 CFR 1216.12 - Handle.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE PEANUT PROMOTION, RESEARCH, AND INFORMATION ORDER Peanut Promotion, Research, and Information Order Definitions § 1216.12 Handle. Handle means... peanuts and in the shipment (except as a common or contract carrier of peanuts owned by another) or sale...

  12. 7 CFR 926.9 - Handle.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements and... REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY MARKETING ORDER § 926.9 Handle. Handle... cranberries or processed cranberries up to, but not including, the retail level. Effective Date Note: At 71 FR...

  13. Handling qualities criteria for the space shuttle orbiter during the terminal phase of flight

    NASA Technical Reports Server (NTRS)

    Stapleford, R. L.; Klein, R. H.; Hoh, R. H.

    1972-01-01

    It was found that large portions of the military handling qualities specification are directly applicable. However, a number of additional and substitute criteria are recommended for areas not covered or inadequately covered in the military specification. Supporting pilot/vehicle analyses and simulation experiments were conducted and are described. Results are also presented of analytical and simulator evaluations of three specific interim Orbiter designs which provided a test of the proposed handling qualities criteria. The correlations between the analytical and experimental evaluations were generally excellent.

  14. Using Open Web APIs in Teaching Web Mining

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju

    2009-01-01

    With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…

  15. From Sensor to Observation Web with environmental enablers in the Future Internet.

    PubMed

    Havlik, Denis; Schade, Sven; Sabeur, Zoheir A; Mazzetti, Paolo; Watson, Kym; Berre, Arne J; Mon, Jose Lorenzo

    2011-01-01

    This paper outlines the grand challenges in global sustainability research and the objectives of the FP7 Future Internet PPP program within the Digital Agenda for Europe. Large user communities are generating significant amounts of valuable environmental observations at local and regional scales using the devices and services of the Future Internet. These communities' environmental observations represent a wealth of information which is currently hardly used, or used only in isolation, and is therefore in need of integration with other information sources. Indeed, this very integration will lead to a paradigm shift from a mere Sensor Web to an Observation Web with semantically enriched content emanating from sensors, environmental simulations and citizens. The paper also describes the research challenges to realize the Observation Web and the associated environmental enablers for the Future Internet. Such an environmental enabler could for instance be an electronic sensing device, a web-service application, or even a social networking group affording or facilitating the capability of the Future Internet applications to consume, produce, and use environmental observations in cross-domain applications. The term "envirofied" Future Internet is coined to describe this overall target that forms a cornerstone of work in the Environmental Usage Area within the Future Internet PPP program. Relevant trends described in the paper are the usage of ubiquitous sensors (anywhere), the provision and generation of information by citizens, and the convergence of real and virtual realities to convey understanding of environmental observations. The paper addresses the technical challenges in the Environmental Usage Area and the need for designing a multi-style service-oriented architecture. Key topics are the mapping of requirements to capabilities, providing scalability and robustness, and implementing context-aware information retrieval. Another essential research topic is handling data

  16. From Sensor to Observation Web with Environmental Enablers in the Future Internet

    PubMed Central

    Havlik, Denis; Schade, Sven; Sabeur, Zoheir A.; Mazzetti, Paolo; Watson, Kym; Berre, Arne J.; Mon, Jose Lorenzo

    2011-01-01

    This paper outlines the grand challenges in global sustainability research and the objectives of the FP7 Future Internet PPP program within the Digital Agenda for Europe. Large user communities are generating significant amounts of valuable environmental observations at local and regional scales using the devices and services of the Future Internet. These communities’ environmental observations represent a wealth of information which is currently hardly used, or used only in isolation, and is therefore in need of integration with other information sources. Indeed, this very integration will lead to a paradigm shift from a mere Sensor Web to an Observation Web with semantically enriched content emanating from sensors, environmental simulations and citizens. The paper also describes the research challenges to realize the Observation Web and the associated environmental enablers for the Future Internet. Such an environmental enabler could for instance be an electronic sensing device, a web-service application, or even a social networking group affording or facilitating the capability of the Future Internet applications to consume, produce, and use environmental observations in cross-domain applications. The term “envirofied” Future Internet is coined to describe this overall target that forms a cornerstone of work in the Environmental Usage Area within the Future Internet PPP program. Relevant trends described in the paper are the usage of ubiquitous sensors (anywhere), the provision and generation of information by citizens, and the convergence of real and virtual realities to convey understanding of environmental observations. The paper addresses the technical challenges in the Environmental Usage Area and the need for designing a multi-style service-oriented architecture. Key topics are the mapping of requirements to capabilities, providing scalability and robustness, and implementing context-aware information retrieval. Another essential research topic is handling

  17. The natural angle between the hand and handle and the effect of handle orientation on wrist radial/ulnar deviation during maximal push exertions.

    PubMed

    Young, Justin G; Lin, Jia-Hua; Chang, Chien-Chi; McGorry, Raymond W

    2013-01-01

    The purpose of this experiment was to quantify the natural angle between the hand and a handle, and to investigate three design factors: handle rotation, handle tilt and between-handle width on the natural angle as well as resultant wrist radial/ulnar deviation ('RUD') for pushing tasks. Photographs taken of the right upper limb of 31 participants (14 women and 17 men) performing maximal seated push exertions on different handles were analysed. Natural hand/handle angle and RUD were assessed. It was found that all of the three design factors significantly affected natural handle angle and wrist RUD, but participant gender did not. The natural angle between the hand and the cylindrical handle was 65 ± 7°. Wrist deviation was reduced for handles that were rotated 0° (horizontal) and at the narrow width (31 cm). Handles that were tilted forward 15° reduced radial deviation consistently (12-13°) across handle conditions. Manual materials handling (MMH) tasks involving pushing have been related to increased risk of musculoskeletal injury. This study shows that handle orientation influences hand and wrist posture during pushing, and suggests that the design of push handles on carts and other MMH aids can be improved by adjusting their orientation to fit the natural interface between the hand and handle.

  18. The emergent discipline of health web science.

    PubMed

    Luciano, Joanne S; Cumming, Grant P; Wilkinson, Mark D; Kahana, Eva

    2013-08-22

    The transformative power of the Internet on all aspects of daily life, including health care, has been widely recognized both in the scientific literature and in public discourse. Viewed through the various lenses of diverse academic disciplines, these transformations reveal opportunities realized, the promise of future advances, and even potential problems created by the penetration of the World Wide Web for both individuals and for society at large. Discussions about the clinical and health research implications of the widespread adoption of information technologies, including the Internet, have been subsumed under the disciplinary label of Medicine 2.0. More recently, however, multi-disciplinary research has emerged that is focused on the achievement and promise of the Web itself, as it relates to healthcare issues. In this paper, we explore and interrogate the contributions of the burgeoning field of Web Science in relation to health maintenance, health care, and health policy. From this, we introduce Health Web Science as a subdiscipline of Web Science, distinct from but overlapping with Medicine 2.0. This paper builds on the presentations and subsequent interdisciplinary dialogue that developed among Web-oriented investigators present at the 2012 Medicine 2.0 Conference in Boston, Massachusetts.

  19. The Emergent Discipline of Health Web Science

    PubMed Central

    2013-01-01

    The transformative power of the Internet on all aspects of daily life, including health care, has been widely recognized both in the scientific literature and in public discourse. Viewed through the various lenses of diverse academic disciplines, these transformations reveal opportunities realized, the promise of future advances, and even potential problems created by the penetration of the World Wide Web for both individuals and for society at large. Discussions about the clinical and health research implications of the widespread adoption of information technologies, including the Internet, have been subsumed under the disciplinary label of Medicine 2.0. More recently, however, multi-disciplinary research has emerged that is focused on the achievement and promise of the Web itself, as it relates to healthcare issues. In this paper, we explore and interrogate the contributions of the burgeoning field of Web Science in relation to health maintenance, health care, and health policy. From this, we introduce Health Web Science as a subdiscipline of Web Science, distinct from but overlapping with Medicine 2.0. This paper builds on the presentations and subsequent interdisciplinary dialogue that developed among Web-oriented investigators present at the 2012 Medicine 2.0 Conference in Boston, Massachusetts. PMID:23968998

  20. [A web-based integrated clinical database for laryngeal cancer].

    PubMed

    E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu

    2014-08-01

    To establish an integrated database for laryngeal cancer and to provide an information platform for clinical and fundamental research on laryngeal cancer, meeting the needs of both clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards and Apache+PHP+MySQL technology, incorporating laryngeal cancer specialist characteristics and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma was developed. The database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system utilizes clinical data standards and exchanges information with the existing electronic medical record system to avoid information silos. Furthermore, the database forms integrate laryngeal cancer specialist characteristics and tumor genetic information. The web-based integrated clinical database for laryngeal carcinoma offers comprehensive specialist information, strong expandability and high technical feasibility, and conforms to the clinical characteristics of the laryngeal cancer specialty. By using clinical data standards and structured handling of clinical data, the database can better meet the needs of scientific research and facilitate information exchange, and the information collected about tumor patients is highly informative. In addition, users can access and manipulate the database conveniently and swiftly over the Internet.

  1. Genome Partitioner: A web tool for multi-level partitioning of large-scale DNA constructs for synthetic biology applications

    PubMed Central

    Del Medico, Luca; Christen, Heinz; Christen, Beat

    2017-01-01

    Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remains a labor-intensive process. Given the complexity, computer assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult to synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner for reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable to translate DNA designs into ready to order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner. PMID:28531174

  2. Genome Partitioner: A web tool for multi-level partitioning of large-scale DNA constructs for synthetic biology applications.

    PubMed

    Christen, Matthias; Del Medico, Luca; Christen, Heinz; Christen, Beat

    2017-01-01

    Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remains a labor-intensive process. Given the complexity, computer assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult to synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner for reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable to translate DNA designs into ready to order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner.
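
    The core partitioning step can be pictured as a two-level split of a long design into segments and synthesizable subblocks. This sketch uses the 20 kb and 1 kb figures from the abstract but ignores the overlaps, homology arms and synthesis constraints the real pipeline optimizes:

```python
# Two-level partitioning sketch; block sizes follow the abstract, everything
# else (no overlaps, no synthesis constraints) is a simplification.
def partition(sequence, segment_size=20_000, subblock_size=1_000):
    plan = []
    for i in range(0, len(sequence), segment_size):
        segment = sequence[i:i + segment_size]
        plan.append([segment[j:j + subblock_size]
                     for j in range(0, len(segment), subblock_size)])
    return plan

design = "ACGT" * 200_000   # stand-in for a ~800 kb genome-scale design
plan = partition(design)
print(len(plan), "segments;", len(plan[0]), "subblocks in the first segment")
```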

  3. 7 CFR 1207.307 - Handle.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan Definitions § 1207.307 Handle. Handle means to grade, pack, process, sell, transport, purchase, or in any other way to place potatoes or cause potatoes to be placed in the...

  4. 7 CFR 1219.11 - Handle.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Handle. 1219.11 Section 1219.11 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS.... Handle means to pack, process, transport, purchase, or in any other way to place or cause Hass avocados...

  5. 21 CFR 820.140 - Handling.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Handling. 820.140 Section 820.140 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES..., contamination, or other adverse effects to product do not occur during handling. ...

  6. 9 CFR 3.118 - Handling.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Handling. 3.118 Section 3.118 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE STANDARDS Specifications for the Humane Handling, Care, Treatment, and Transportation of Marine...

  7. 9 CFR 3.118 - Handling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Handling. 3.118 Section 3.118 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE STANDARDS Specifications for the Humane Handling, Care, Treatment, and Transportation of Marine...

  8. 9 CFR 3.118 - Handling.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Handling. 3.118 Section 3.118 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE STANDARDS Specifications for the Humane Handling, Care, Treatment, and Transportation of Marine...

  9. 9 CFR 3.118 - Handling.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Handling. 3.118 Section 3.118 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE STANDARDS Specifications for the Humane Handling, Care, Treatment, and Transportation of Marine...

  10. 9 CFR 3.118 - Handling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Handling. 3.118 Section 3.118 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE STANDARDS Specifications for the Humane Handling, Care, Treatment, and Transportation of Marine...

  11. 7 CFR 1207.307 - Handle.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan Definitions § 1207.307 Handle. Handle means to grade, pack, process, sell, transport, purchase, or in any other way to place potatoes or cause potatoes to be placed in the...

  12. 7 CFR 1207.307 - Handle.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan Definitions § 1207.307 Handle. Handle means to grade, pack, process, sell, transport, purchase, or in any other way to place potatoes or cause potatoes to be placed in the...

  13. 7 CFR 1207.307 - Handle.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan Definitions § 1207.307 Handle. Handle means to grade, pack, process, sell, transport, purchase, or in any other way to place potatoes or cause potatoes to be placed in the...

  14. 7 CFR 1207.307 - Handle.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan Definitions § 1207.307 Handle. Handle means to grade, pack, process, sell, transport, purchase, or in any other way to place potatoes or cause potatoes to be placed in the...

  15. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies

    PubMed Central

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of the box to process Illumina RNA-Seq datasets. PMID:25937948

  16. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies.

    PubMed

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of the box to process Illumina RNA-Seq datasets.
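
    Stormbow's map-style parallelism over samples can be mimicked locally with the standard library. In the sketch below, align_and_count is a placeholder for the real mapping and quantification pipeline, which in Stormbow runs on Amazon Web Services:

```python
# Local stand-in for Stormbow's per-sample parallelism; align_and_count is
# a placeholder, not the actual pipeline.
from concurrent.futures import ProcessPoolExecutor

def align_and_count(sample):
    # Placeholder for read mapping + expression quantification of one sample.
    return sample, f"{sample}.counts"

samples = [f"sample_{i:03d}" for i in range(178)]  # 178 samples, as in the test

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        for sample, result in pool.map(align_and_count, samples):
            print(sample, "->", result)
```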

  17. Trophic groups and modules: two levels of group detection in food webs

    PubMed Central

    Gauzens, Benoit; Thébault, Elisa; Lacroix, Gérard; Legendre, Stéphane

    2015-01-01

    Within food webs, species can be partitioned into groups according to various criteria. Two notions have received particular attention: trophic groups (TGs), which have been used for decades in the ecological literature, and more recently, modules. The relationship between these two group concepts remains unknown in empirical food webs. While recent developments in network theory have led to efficient methods for detecting modules in food webs, the determination of TGs (groups of species that are functionally similar) is largely based on subjective expert knowledge. We develop a novel algorithm for TG detection. We apply this method to empirical food webs and show that aggregation into TGs allows for the simplification of food webs while preserving their information content. Furthermore, we reveal a two-level hierarchical structure where modules partition food webs into large bottom–top trophic pathways, whereas TGs further partition these pathways into groups of species with similar trophic connections. This provides new perspectives for the study of dynamical and functional consequences of food-web structure, bridging topological and dynamical analysis. TGs have a clear ecological meaning and are found to provide a trade-off between network complexity and information loss. PMID:25878127
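
    As a toy illustration of the trophic-group notion, species can be grouped when they share similar sets of prey and predators. The Jaccard-threshold rule and the miniature food web below are stand-ins for the paper's algorithm and data:

```python
# Group species with similar trophic connections; the 0.8 Jaccard threshold
# and the tiny food web are invented for illustration.
links = {("fox", "rabbit"), ("hawk", "rabbit"), ("fox", "mouse"),
         ("hawk", "mouse"), ("rabbit", "grass"), ("mouse", "grass")}
species = {s for pair in links for s in pair}

def profile(s):
    # A species' trophic profile: the sets of things it eats and is eaten by.
    prey = frozenset(b for a, b in links if a == s)
    predators = frozenset(a for a, b in links if b == s)
    return prey, predators

def jaccard(x, y):
    return len(x & y) / len(x | y) if x | y else 1.0

groups = []
for s in sorted(species):
    for g in groups:
        rp, rq = profile(g[0])
        sp, sq = profile(s)
        if jaccard(rp, sp) >= 0.8 and jaccard(rq, sq) >= 0.8:
            g.append(s)
            break
    else:
        groups.append([s])
print(groups)  # rabbit and mouse share prey and predators -> one group
```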

  18. Web-based scoring of the dicentric assay, a collaborative biodosimetric scoring strategy for population triage in large scale radiation accidents.

    PubMed

    Romm, H; Ainsbury, E; Bajinskis, A; Barnard, S; Barquinero, J F; Barrios, L; Beinke, C; Puig-Casanovas, R; Deperas-Kaminska, M; Gregoire, E; Oestreicher, U; Lindholm, C; Moquet, J; Rothkamm, K; Sommer, S; Thierens, H; Vral, A; Vandersickel, V; Wojcik, A

    2014-05-01

    In the case of a large scale radiation accident high throughput methods of biological dosimetry for population triage are needed to identify individuals requiring clinical treatment. The dicentric assay performed in web-based scoring mode may be a very suitable technique. Within the MULTIBIODOSE EU FP7 project a network is being established of 8 laboratories with expertise in dose estimations based on the dicentric assay. Here, the manual dicentric assay was tested in a web-based scoring mode. More than 23,000 high resolution images of metaphase spreads (only first mitosis) were captured by four laboratories and established as image galleries on the internet (cloud). The galleries included images of a complete dose effect curve (0-5.0 Gy) and three types of irradiation scenarios simulating acute whole body, partial body and protracted exposure. The blood samples had been irradiated in vitro with gamma rays at the University of Ghent, Belgium. Two laboratories provided image galleries from Fluorescence plus Giemsa stained slides (3 h colcemid) and the image galleries from the other two laboratories contained images from Giemsa stained preparations (24 h colcemid). Each of the 8 participating laboratories analysed 3 dose points of the dose effect curve (scoring 100 cells for each point) and 3 unknown dose points (50 cells) for each of the 3 simulated irradiation scenarios. At first all analyses were performed in a QuickScan Mode without scoring individual chromosomes, followed by conventional scoring (only complete cells, 46 centromeres). The calibration curves obtained using these two scoring methods were very similar, with no significant difference in the linear-quadratic curve coefficients. Analysis of variance showed a significant effect of dose on the yield of dicentrics, but no significant effect of the laboratories, different methods of slide preparation or different incubation times used for colcemid. The results obtained to date within the MULTIBIODOSE
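
    The calibration curves described follow the standard linear-quadratic model for dicentric yield, Y = c + αD + βD². A minimal fitting sketch with scipy; the data points are invented for illustration, and only the model form comes from the text:

```python
# Linear-quadratic fit Y = c + alpha*D + beta*D**2; dose-yield pairs below
# are invented for illustration, only the model form comes from the text.
import numpy as np
from scipy.optimize import curve_fit

def lq(dose, c, alpha, beta):
    return c + alpha * dose + beta * dose ** 2

dose = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 5.0])               # Gy
dicentrics = np.array([0.001, 0.03, 0.09, 0.29, 0.57, 1.40])  # per cell

(c, alpha, beta), _ = curve_fit(lq, dose, dicentrics)
print(f"c={c:.4f}, alpha={alpha:.3f}/Gy, beta={beta:.3f}/Gy^2")
```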

  19. Automatic sequential fluid handling with multilayer microfluidic sample isolated pumping

    PubMed Central

    Liu, Jixiao; Fu, Hai; Yang, Tianhang; Li, Songjing

    2015-01-01

    To sequentially handle fluids is of great significance in quantitative biology, analytical chemistry, and bioassays. However, the technological options are limited when building such microfluidic sequential processing systems, and one of the encountered challenges is the need for reliable, efficient, and mass-production available microfluidic pumping methods. Herein, we present a bubble-free and pumping-control unified liquid handling method that is compatible with large-scale manufacture, termed multilayer microfluidic sample isolated pumping (mμSIP). The core part of the mμSIP is the selective permeable membrane that isolates the fluidic layer from the pneumatic layer. The air diffusion from the fluidic channel network into the degassing pneumatic channel network leads to fluidic channel pressure variation, which further results in consistent bubble-free liquid pumping into the channels and the dead-end chambers. We characterize the mμSIP by comparing the fluidic actuation processes with different parameters and a flow rate range of 0.013 μl/s to 0.097 μl/s is observed in the experiments. As the proof of concept, we demonstrate an automatic sequential fluid handling system aiming at digital assays and immunoassays, which further proves the unified pumping-control and suggests that the mμSIP is suitable for functional microfluidic assays with minimal operations. We believe that the mμSIP technology and demonstrated automatic sequential fluid handling system would enrich the microfluidic toolbox and benefit further inventions. PMID:26487904

  20. Handle with care: the impact of using Java applets in Web-based studies on dropout and sample composition.

    PubMed

    Stieger, Stefan; Göritz, Anja S; Voracek, Martin

    2011-05-01

    In Web-based studies, Web browsers are used to display online questionnaires. If an online questionnaire relies on non-standard technologies (e.g., Java applets), it is often necessary to install a particular browser plug-in. This can lead to technically induced dropout because some participants lack the technological know-how or the willingness to install the plug-in. In two thematically identical online studies conducted across two time points in two different participant pools (N = 1,527 and 805), we analyzed whether using a Java applet produces dropout and distortion of demographics in the final sample. Dropout was significantly higher on the Java applet questionnaire page than on the preceding and subsequent questionnaire pages. Age-specific effects were found only in one sample (i.e., dropouts were older), whereas sex-specific effects were found in both samples (i.e., women dropped out more frequently than men on the Java applet page). These results additionally support the recommendation that using additional technologies (e.g., Java applets) can be dangerous in producing a sample that is biased toward both younger and male respondents.

  1. NGL Viewer: a web application for molecular visualization.

    PubMed

    Rose, Alexander S; Hildebrand, Peter W

    2015-07-01

    The NGL Viewer (http://proteinformatics.charite.de/ngl) is a web application for the visualization of macromolecular structures. By fully adopting capabilities of modern web browsers, such as WebGL, for molecular graphics, the viewer can interactively display large molecular complexes and is also unaffected by the retirement of third-party plug-ins like Flash and Java Applets. Generally, the web application offers comprehensive molecular visualization through a graphical user interface so that life scientists can easily access and profit from available structural data. It supports common structural file-formats (e.g. PDB, mmCIF) and a variety of molecular representations (e.g. 'cartoon, spacefill, licorice'). Moreover, the viewer can be embedded in other web sites to provide specialized visualizations of entries in structural databases or results of structure-related calculations.

  2. Web3D Technologies in Learning, Education and Training: Motivations, Issues, Opportunities

    ERIC Educational Resources Information Center

    Chittaro, Luca; Ranon, Roberto

    2007-01-01

    Web3D open standards allow the delivery of interactive 3D virtual learning environments through the Internet, reaching potentially large numbers of learners worldwide, at any time. This paper introduces the educational use of virtual reality based on Web3D technologies. After briefly presenting the main Web3D technologies, we summarize the…

  3. Online data analysis using Web GDL

    NASA Astrophysics Data System (ADS)

    Jaffey, A.; Cheung, M.; Kobashi, A.

    2008-12-01

    The ever improving capability of modern astronomical instruments to capture data at high spatial resolution and cadence is opening up unprecedented opportunities for scientific discovery. When data sets become so large that they cannot be easily transferred over the internet, the researcher must find alternative ways to perform data analysis. One strategy is to bring the data analysis code to where the data resides. We present Web GDL, an implementation of GDL (GNU Data Language, open source incremental compiler compatible with IDL) that allows users to perform interactive data analysis within a web browser.
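
    A minimal sketch of the "bring the analysis to the data" idea described above: a server-side endpoint runs a whitelisted computation on data that stays on the server, returning only the small result. Flask, the endpoint name, and the toy dataset are illustrative assumptions, not part of Web GDL itself.

    ```python
    # Server-side analysis: the client chooses an analysis by name and never
    # downloads the (potentially huge) dataset or uploads code of its own.
    from flask import Flask, jsonify, request
    import statistics

    app = Flask(__name__)
    DATASET = [4.2, 4.7, 3.9, 5.1, 4.4]  # stand-in for a large server-resident dataset

    ANALYSES = {"mean": statistics.mean, "stdev": statistics.stdev}  # whitelist

    @app.route("/analyze")
    def analyze():
        op = request.args.get("op", "mean")
        if op not in ANALYSES:
            return jsonify(error="unknown analysis"), 400
        return jsonify(op=op, result=ANALYSES[op](DATASET))

    if __name__ == "__main__":
        app.run()
    ```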

  4. WASTE HANDLING BUILDING ELECTRICAL SYSTEM DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S.C. Khamamkar

    2000-06-23

    The Waste Handling Building Electrical System performs the function of receiving, distributing, transforming, monitoring, and controlling AC and DC power to all waste handling building electrical loads. The system distributes normal electrical power to support all loads that are within the Waste Handling Building (WHB). The system also generates and distributes emergency power to support designated emergency loads within the WHB within specified time limits. The system provides the capability to transfer between normal and emergency power. The system provides emergency power via independent and physically separated distribution feeds from the normal supply. The designated emergency electrical equipment will be designed to operate during and after design basis events (DBEs). The system also provides lighting, grounding, and lightning protection for the Waste Handling Building. The system is located in the Waste Handling Building System. The system consists of a diesel generator, power distribution cables, transformers, switchgear, motor controllers, power panel boards, lighting panel boards, lighting equipment, lightning protection equipment, control cabling, and a grounding system. Emergency power is generated with a diesel generator located in a QL-2 structure and connected to the QL-2 bus. The Waste Handling Building Electrical System distributes and controls primary power to acceptable industry standards, and with a dependability compatible with waste handling building reliability objectives for non-safety electrical loads. It also generates and distributes emergency power to the designated emergency loads. The Waste Handling Building Electrical System receives power from the Site Electrical Power System. The primary material handling power interfaces include the Carrier/Cask Handling System, Canister Transfer System, Assembly Transfer System, Waste Package Remediation System, and Disposal Container Handling Systems. The system interfaces with the MGR

  5. 50 CFR 14.111 - Handling.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and Fisheries UNITED STATES FISH AND WILDLIFE SERVICE, DEPARTMENT OF THE INTERIOR TAKING, POSSESSION..., EXPORTATION, AND TRANSPORTATION OF WILDLIFE Standards for the Humane and Healthful Transport of Wild Mammals and Birds to the United States § 14.111 Handling. (a) Care shall be exercised to avoid handling the...

  6. Fluid handling equipment: A compilation

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Devices and techniques used in fluid-handling and vacuum systems are described. Section 1 presents several articles on fluid lines and tubing. Section 2 describes a number of components such as valves, filters, and regulators. The last section contains descriptions of a number of innovative fluid-handling systems.

  7. SLUG HANDLING DEVICES

    DOEpatents

    Gentry, J.R.

    1958-09-16

    A device is described for handling fuel elements of a neutronic reactor. The device consists of two concentric telescoped containers that may fit about the fuel element. A number of ratchet members, equally spaced about the entrance to the containers, are pivoted on the inner container and spring-biased to the outer container so that they are forced to bear against and hold the fuel element, the weight of which tends to force the ratchets tighter against the fuel element. The ratchets are released from their hold by raising the inner container relative to the outer member. This device reduces the radiation hazard to the personnel handling the fuel elements.

  8. Web-based Data Exploration, Exploitation and Visualization Tools for Satellite Sensor VIS/IR Calibration Applications

    NASA Astrophysics Data System (ADS)

    Gopalan, A.; Doelling, D. R.; Scarino, B. R.; Chee, T.; Haney, C.; Bhatt, R.

    2016-12-01

    The CERES calibration group at NASA/LaRC has developed and deployed a suite of online data exploration and visualization tools targeted towards a range of spaceborne VIS/IR imager calibration applications for the Earth Science community. These web-based tools are driven by the open-source R (Language for Statistical Computing and Visualization) with a web interface for the user to customize the results according to their application. The tool contains a library of geostationary and sun-synchronous imager spectral response functions (SRF), incoming solar spectra, SCIAMACHY and Hyperion Earth-reflected visible hyper-spectral data, and IASI IR hyper-spectral data. The suite of six specific web-based tools was designed to provide critical information necessary for sensor cross-calibration. One of the challenges of sensor cross-calibration is accounting for spectral band differences, which may introduce biases if not handled properly. The spectral band adjustment factors (SBAF) are a function of the earth target, the atmospheric and cloud conditions or scene type, and the angular conditions under which the sensor radiance pairs are obtained. The SBAF therefore needs to be customized for each inter-calibration target and sensor pair. The advantages of having a community open-source tool are: 1) only one archive of the SCIAMACHY, Hyperion, and IASI datasets, on the order of 50 TB, needs to be maintained; 2) the framework allows easy incorporation of new satellite SRFs and hyper-spectral datasets and associated coincident atmospheric and cloud properties, such as precipitable water (PW); 3) web tool or SBAF algorithm improvements and suggestions, once incorporated, benefit the community at large; and 4) the customization effort rests with the user rather than the host. In this paper we discuss each of these tools in detail and explore the variety of advanced options that can be used to constrain the results, along with specific use cases that highlight the value added by these datasets.
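
    A hedged sketch of the core SBAF computation as it is commonly defined: band-average a hyper-spectral scene spectrum with each sensor's SRF and take the ratio of the band-averaged radiances. The wavelength grid, the scene spectrum, and the triangular SRFs below are made-up toy data, not the tool's library.

    ```python
    # SBAF as a ratio of band-averaged radiances computed by convolving a
    # hyper-spectral scene spectrum (e.g., from SCIAMACHY) with two SRFs.
    import numpy as np

    wavelengths = np.linspace(400.0, 900.0, 501)  # nm, toy grid
    scene_radiance = 100.0 * np.exp(-((wavelengths - 650.0) / 200.0) ** 2)  # toy spectrum

    def triangular_srf(center_nm, half_width_nm):
        srf = 1.0 - np.abs(wavelengths - center_nm) / half_width_nm
        return np.clip(srf, 0.0, None)

    srf_reference = triangular_srf(640.0, 40.0)  # hypothetical reference imager band
    srf_target = triangular_srf(660.0, 50.0)     # hypothetical geostationary imager band

    def band_radiance(srf):
        return np.trapz(scene_radiance * srf, wavelengths) / np.trapz(srf, wavelengths)

    sbaf = band_radiance(srf_target) / band_radiance(srf_reference)
    print(f"SBAF (target/reference) = {sbaf:.4f}")
    ```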

  9. Googling DNA sequences on the World Wide Web.

    PubMed

    Hajibabaei, Mehrdad; Singer, Gregory A C

    2009-11-10

    New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm, and implemented it, for searching species-specific genomic sequences (DNA barcodes) using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and the query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages, developing pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data and provides a robust and efficient solution for sequence search on the web. The integration of our search method with large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
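
    A minimal sketch of the word-based, alignment-free idea, assuming a toy library and word length: split barcode sequences and the query into fixed-length words and rank library entries by shared words. The paper delegates the actual word search to conventional tools such as Google Desktop Search; this sketch only illustrates the tokenization-and-match step.

    ```python
    # Alignment-free matching: sequences become sets of fixed-length "words",
    # and the best match is the library entry sharing the most words.
    WORD = 8  # word length (an assumption for illustration)

    def words(seq, k=WORD):
        seq = seq.upper()
        return {seq[i:i + k] for i in range(0, len(seq) - k + 1, k)}  # non-overlapping words

    library = {  # toy DNA barcode library
        "speciesA": "ACGTACGTGGCCTTAAACGTACGTGGCCTTAA",
        "speciesB": "TTGGCCAACGTGCATGCATTGGCCAACGTGCA",
    }

    def best_match(query):
        q = words(query)
        scores = {name: len(q & words(seq)) for name, seq in library.items()}
        return max(scores.items(), key=lambda kv: kv[1])

    print(best_match("ACGTACGTGGCCTTAAACGTACGTGGCCTTAA"))  # -> ('speciesA', 2)
    ```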

  10. Analysis of multiple activity manual materials handling tasks using A Guide to Manual Materials Handling.

    PubMed

    Mital, A

    1999-01-01

    Manual handling of materials continues to be a hazardous activity, leading to a very significant number of severe overexertion injuries. Designing jobs that are within the physical capabilities of workers is one approach ergonomists have adopted to redress this problem. As a result, several job design procedures have been developed over the years. However, these procedures are limited to designing or evaluating only pure lifting jobs or only the lifting aspect of a materials handling job. This paper describes a general procedure that may be used to design or analyse materials handling jobs that involve several different kinds of activities (e.g., lifting, lowering, carrying, and pushing). The job design/analysis procedure utilizes an elemental approach (breaking the job into elements) and relies on databases provided in A Guide to Manual Materials Handling to compute associated risk factors. The use of the procedure is demonstrated with the help of two case studies.

  11. Information Handling is the Problem

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2001-01-01

    This slide presentation reviews the concerns surrounding the automation of information handling. There are two types of decision support software that support most Space Station Flight Controllers: one is very simple, and the other is very complex. A middle ground is sought, which is the reason for the Human Centered Autonomous and Assistant Systems Testbed (HCAAST) Project. The aim is to study flight controllers at work and in the bigger picture, with particular attention to how they handle information and how coordination across multiple teams is performed. The focus of the project is on intelligent assistants that help flight controllers handle information.

  12. Robotic liquid handling and automation in epigenetics.

    PubMed

    Gaisford, Wendy

    2012-10-01

    Automated liquid-handling robots and high-throughput screening (HTS) are widely used in the pharmaceutical industry for the screening of large compound libraries, small molecules for activity against disease-relevant target pathways, or proteins. HTS robots capable of low-volume dispensing reduce assay setup times and provide highly accurate and reproducible dispensing, minimizing variation between sample replicates and eliminating the potential for manual error. Low-volume automated nanoliter dispensers ensure accuracy of pipetting within volume ranges that are difficult to achieve manually. In addition, they can potentially expand the range of screening conditions drawn from often limited amounts of valuable sample, as well as reduce the usage of expensive reagents. The ability to accurately dispense lower volumes provides the potential to extract more information than could otherwise be achieved using manual dispensing technology. With the emergence of the field of epigenetics, an increasing number of drug discovery companies are beginning to screen compound libraries against a range of epigenetic targets. This review discusses the potential for the use of low-volume liquid-handling robots for molecular biological applications such as quantitative PCR and epigenetics.

  13. 7 CFR 926.9 - Handle.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY MARKETING ORDER § 926.9 Handle. Handle... contract carrier of cranberries owned by another person) fresh or processed cranberries produced within or outside the United States or in any other way to place fresh or processed cranberries into the current of...

  14. 7 CFR 926.9 - Handle.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY MARKETING ORDER § 926.9 Handle. Handle... contract carrier of cranberries owned by another person) fresh or processed cranberries produced within or outside the United States or in any other way to place fresh or processed cranberries into the current of...

  15. 7 CFR 926.9 - Handle.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY MARKETING ORDER § 926.9 Handle. Handle... contract carrier of cranberries owned by another person) fresh or processed cranberries produced within or outside the United States or in any other way to place fresh or processed cranberries into the current of...

  16. 7 CFR 926.9 - Handle.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY MARKETING ORDER § 926.9 Handle. Handle... contract carrier of cranberries owned by another person) fresh or processed cranberries produced within or outside the United States or in any other way to place fresh or processed cranberries into the current of...

  17. Automatic liquid handling for life science: a critical review of the current state of the art.

    PubMed

    Kong, Fanwei; Yuan, Liang; Zheng, Yuan F; Chen, Weidong

    2012-06-01

    Liquid handling plays a pivotal role in life science laboratories. In experiments such as gene sequencing, protein crystallization, antibody testing, and drug screening, liquid biosamples frequently must be transferred between containers of varying sizes and/or dispensed onto substrates of varying types. The sample volumes are usually small, at the micro- or nanoliter level, and the number of transferred samples can be huge when investigating large-scope combinatorial conditions. Under these conditions, liquid handling by hand is tedious, time-consuming, and impractical. Consequently, there is a strong demand for automated liquid-handling methods such as sensor-integrated robotic systems. In this article, we survey the current state of the art in automatic liquid handling, including technologies developed by both industry and research institutions. We focus on methods for dealing with small volumes at high throughput and point out challenges for future advancements.

  18. Web Content Management and One EPA Web Factsheet

    EPA Pesticide Factsheets

    One EPA Web is a multi-year project to improve EPA’s website to better meet the needs of our Web visitors. Content is developed and managed in the WebCMS which supports One EPA Web goals by standardizing how we create and publish content.

  19. 7 CFR 996.4 - Handle.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... DOMESTIC AND IMPORTED PEANUTS MARKETED IN THE UNITED STATES Definitions § 996.4 Handle. Handle means to... imported peanuts and in the shipment (except as a common or contract carrier of peanuts owned by another) or sale of cleaned-inshell or shelled peanuts or other activity causing peanuts to enter into human...

  20. Consumer Shell Egg Consumption and Handling Practices: Results from a National Survey.

    PubMed

    Kosa, Katherine M; Cates, Sheryl C; Bradley, Samantha; Godwin, Sandria; Chambers, Delores

    2015-07-01

    Numerous cases and outbreaks of Salmonella infection are attributable to shell eggs each year in the United States. Safe handling and consumption of shell eggs at home can help reduce foodborne illness attributable to shell eggs. A nationally representative Web survey of 1,504 U.S. adult grocery shoppers was conducted to describe consumer handling practices and consumption of shell eggs at home. Based on self-reported survey data, most respondents purchase shell eggs from a grocery store (89.5%), and these eggs were kept refrigerated (not at room temperature; 98.5%). As recommended, most consumers stored shell eggs in the refrigerator (99%) for no more than 3 to 5 weeks (97.6%). After cracking eggs, 48.1% of respondents washed their hands with soap and water. More than half of respondents who fry and/or poach eggs cooked them so that the whites and/or the yolks were still soft or runny, a potentially unsafe practice. Among respondents who owned a food thermometer (62.0%), only 5.2% used it to check the doneness of baked egg dishes when they prepared such a dish. Consumers generally followed two of the four core "Safe Food Families" food safety messages ("separate" and "chill") when handling shell eggs at home. To prevent Salmonella infection associated with shell eggs, consumers should improve their practices related to the messages "clean" (i.e., wash hands after cracking eggs) and "cook" (i.e., cook until yolks and whites are firm and use a food thermometer to check doneness of baked egg dishes) when preparing shell eggs at home. These findings will be used to inform the development of science-based consumer education materials that can help reduce foodborne illness from Salmonella infection.

  1. A new approach to handling incoming verifications.

    PubMed

    Luizzo, Anthony; Roy, Bill; Luizzo, Philip

    2016-10-01

    Outside requests for data on current or former employees are handled in different ways by healthcare organizations and present considerable liability risks if a corporate policy for handling such risks is not in place. In this article, the authors present a strategy for responsible handling of sensitive information.

  2. 9 CFR 114.11 - Storage and handling.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Storage and handling. 114.11 Section... BIOLOGICAL PRODUCTS § 114.11 Storage and handling. Biological products at licensed establishments shall be protected at all times against improper storage and handling. Completed product shall be kept under...

  3. 9 CFR 114.11 - Storage and handling.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Storage and handling. 114.11 Section... BIOLOGICAL PRODUCTS § 114.11 Storage and handling. Biological products at licensed establishments shall be protected at all times against improper storage and handling. Completed product shall be kept under...

  4. 9 CFR 114.11 - Storage and handling.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Storage and handling. 114.11 Section... BIOLOGICAL PRODUCTS § 114.11 Storage and handling. Biological products at licensed establishments shall be protected at all times against improper storage and handling. Completed product shall be kept under...

  5. 14 CFR 25.489 - Ground handling conditions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Ground handling conditions. 25.489 Section... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Structure Ground Loads § 25.489 Ground handling conditions... ground handling conditions). No wing lift may be considered. The shock absorbers and tires may be assumed...

  6. 14 CFR 25.489 - Ground handling conditions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Ground handling conditions. 25.489 Section... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Structure Ground Loads § 25.489 Ground handling conditions... ground handling conditions). No wing lift may be considered. The shock absorbers and tires may be assumed...

  7. Visualizing multiattribute Web transactions using a freeze technique

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Cotting, Daniel; Dayal, Umeshwar; Machiraju, Vijay; Garg, Pankaj

    2003-05-01

    Web transactions are multidimensional and have a number of attributes: client, URL, response times, and numbers of messages. One of the key questions is how to simultaneously lay out in a graph the multiple relationships, such as the relationships between the web client response times and URLs in a web access application. In this paper, we describe a freeze technique to enhance a physics-based visualization system for web transactions. The idea is to freeze one set of objects before laying out the next set of objects during the construction of the graph. As a result, we substantially reduce the force computation time. This technique consists of three steps: automated classification, a freeze operation, and a graph layout. These three steps are iterated until the final graph is generated. This iterated-freeze technique has been prototyped in several e-service applications at Hewlett Packard Laboratories. It has been used to visually analyze large volumes of service and sales transactions at online web sites.
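
    A toy sketch of the iterated-freeze idea under stated assumptions: nodes already placed are frozen, and force computation runs only for the newly added set against the frozen coordinates. The node names, constants, and the simple spring/repulsion model are illustrative inventions, not the authors' implementation.

    ```python
    # Force-directed placement of new nodes against a frozen layout: only the
    # new nodes move, so per-iteration force cost scales with the new set.
    import random

    def layout(new_nodes, frozen_pos, edges, iters=200, k=0.05, rep=0.5):
        pos = {n: [random.random(), random.random()] for n in new_nodes}
        for _ in range(iters):
            for n in new_nodes:
                fx = fy = 0.0
                for other, q in list(pos.items()) + list(frozen_pos.items()):
                    if other == n:
                        continue
                    dx, dy = pos[n][0] - q[0], pos[n][1] - q[1]
                    d2 = dx * dx + dy * dy + 1e-9
                    fx += rep * dx / d2  # repulsion from every node, frozen or not
                    fy += rep * dy / d2
                for a, b in edges:  # spring attraction along transaction edges
                    if n in (a, b):
                        other = b if a == n else a
                        q = pos.get(other) or frozen_pos[other]
                        fx += k * (q[0] - pos[n][0])
                        fy += k * (q[1] - pos[n][1])
                pos[n][0] += 0.01 * fx
                pos[n][1] += 0.01 * fy
        return pos

    frozen = {"urlA": (0.0, 0.0), "urlB": (1.0, 0.0)}  # already laid out, then frozen
    edges = [("client1", "urlA"), ("client1", "urlB"), ("client2", "urlB")]
    print(layout(["client1", "client2"], frozen, edges))
    ```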

  8. Determination of PCDDs in spider webs: preliminary studies

    NASA Astrophysics Data System (ADS)

    Rybak, Justyna; Rutkowski, Radosław

    2018-01-01

    The application of spider webs for the determination of polychlorinated dibenzo-para-dioxins (PCDDs) has been studied for the first time. The aim of the study was to find out whether spider webs are suitable for such examinations, since previous research has shown them to be excellent indicators of air pollutants. Spiders are ubiquitous, so collection of samples is easy and non-invasive. Studies were conducted within the city of Wrocław and its surroundings, one of the largest and most heavily polluted urban areas in Poland. Five research sites were chosen, where spider webs were collected after 60 days of continuous exposure. Webs of two species, Tegenaria sylvestris and Tegenaria ferruginea (family Agelenidae), were chosen because they are large and very dense and therefore well suited to such examinations. The webs were found to retain dioxins, probably mainly through external exposure. These promising results should be confirmed and expanded in future research.

  9. Large scale rigidity-based flexibility analysis of biomolecules

    PubMed Central

    Streinu, Ileana

    2016-01-01

    KINematics And RIgidity (KINARI) is an on-going project for in silico flexibility analysis of proteins. The new version of the software, Kinari-2, extends the functionality of our free web server KinariWeb, incorporates advanced web technologies, emphasizes the reproducibility of its experiments, and makes substantially improved tools available to the user. It is designed specifically for large-scale experiments, in particular for (a) very large molecules, including bioassemblies with a high degree of symmetry such as viruses and crystals, (b) large collections of related biomolecules, such as those obtained through simulated dilutions, mutations, or conformational changes from various types of dynamics simulations, and (c) the large, idiosyncratic, publicly available repository of biomolecules, the Protein Data Bank, with which it is intended to work as seamlessly as possible. We describe the system design, along with the main data processing, computational, mathematical, and validation challenges underlying this phase of the KINARI project. PMID:26958583

  10. Llama handling and training.

    PubMed

    McGee, M

    1994-07-01

    This article offers insights into the relationship of llama owners to their animals and the role of veterinarians as part of the animal care team. The effect of human behavior and handling techniques on llama behavior and marketability are discussed. Progressive ideas for nonforceful llama handling equipment, procedures, and training ideas are outlined in detail. Included are specific training plans for routine herd management chores such as injections and toenail trimming. This article is useful for both veterinarians and llama owners.

  11. istar: a web platform for large-scale protein-ligand docking.

    PubMed

    Li, Hongjian; Leung, Kwong-Sak; Ballester, Pedro J; Wong, Man-Hon

    2014-01-01

    Protein-ligand docking is a key computational method in the design of starting points for the drug discovery process. We are motivated by the desire to automate large-scale docking using our popular docking engine idock and have therefore developed a publicly accessible web platform called istar. Without tedious software installation, users can submit jobs using our website. Our istar website supports 1) filtering ligands by desired molecular properties and previewing the number of ligands to dock, 2) monitoring job progress in real time, and 3) visualizing ligand conformations and outputting free energy and ligand efficiency predicted by idock, binding affinity predicted by RF-Score, putative hydrogen bonds, and supplier information for easy purchase, three useful features commonly lacking from other online docking platforms such as DOCK Blaster or iScreen. We have collected 17,224,424 ligands from the All Clean subset of the ZINC database, and revamped our docking engine idock to version 2.0, further improving docking speed and accuracy and integrating RF-Score as an alternative rescoring function. To compare idock 2.0 with the state-of-the-art AutoDock Vina 1.1.2, we have carried out a rescoring benchmark and a redocking benchmark on the 2,897 and 343 protein-ligand complexes of the PDBbind v2012 refined set and the CSAR NRC HiQ Set 24Sept2010, respectively, and an execution time benchmark on 12 diverse proteins and 3,000 ligands of different molecular weight. Results show that, under various scenarios, idock achieves comparable success rates while outperforming AutoDock Vina in terms of docking speed by at least 8.69 times and at most 37.51 times. When evaluated on the PDBbind v2012 core set, our istar platform combined with RF-Score manages to reproduce a Pearson's correlation coefficient and a Spearman's correlation coefficient of as high as 0.855 and 0.859, respectively, between the experimental binding affinity and the predicted binding affinity of the docked conformation. istar
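
    A small sketch of one quantity the platform reports, ligand efficiency, under the common convention LE = -ΔG / N_heavy (kcal/mol per heavy atom). Whether idock uses exactly this convention is an assumption, and the numbers below are made up.

    ```python
    # Ligand efficiency: binding free energy normalized by heavy-atom count,
    # so that small and large ligands can be ranked on a common footing.
    def ligand_efficiency(delta_g_kcal_per_mol, heavy_atoms):
        if heavy_atoms <= 0:
            raise ValueError("heavy_atoms must be positive")
        return -delta_g_kcal_per_mol / heavy_atoms

    print(ligand_efficiency(-9.2, 28))  # toy docked pose: -9.2 kcal/mol, 28 heavy atoms
    ```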

  12. Next generation of weather generators on web service framework

    NASA Astrophysics Data System (ADS)

    Chinnachodteeranun, R.; Hung, N. D.; Honda, K.; Ines, A. V. M.

    2016-12-01

    A weather generator is a statistical model that synthesizes possible realizations of long-term historical weather for the future. It stochastically generates several tens to hundreds of realizations based on statistical analysis. Realizations are essential inputs to crop models for simulating crop growth and yield. Moreover, they can contribute to analyzing the uncertainty that weather introduces into crop development stages and to decision support systems for, e.g., water management and fertilizer management. Performing crop modeling requires multidisciplinary skills, which limits the usage of a weather generator to the research group that developed it and raises a barrier for newcomers. To improve the procedures for running weather generators, as well as the methodology for acquiring realizations in a standard way, we implemented a framework that provides weather generators as web services supporting service interoperability. Legacy weather generator programs were wrapped in the web service framework. The service interfaces were implemented based on an international standard, the Sensor Observation Service (SOS) defined by the Open Geospatial Consortium (OGC). Clients can request realizations generated by the model through the SOS web service. The hierarchical data preparation processes required by the weather generator are also implemented as web services and seamlessly wired together. Analysts and applications can easily invoke the services over a network. The services facilitate the development of agricultural applications, reduce the workload of analysts in iterative data preparation, and handle legacy weather generator programs. This architectural design and implementation can serve as a prototype for constructing further services on top of an interoperable sensor network system. The framework opens an opportunity for other sectors, such as application developers and scientists in other fields, to utilize weather generators.
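
    A hedged sketch of how a client might request realizations through an SOS 2.0 KVP GetObservation call. The endpoint URL and the offering/property names are hypothetical placeholders; only the general KVP parameter layout follows the OGC SOS standard named above.

    ```python
    # Requesting weather-generator realizations over a (hypothetical) SOS endpoint.
    import requests

    SOS_ENDPOINT = "http://example.org/weathergen/sos"  # hypothetical service URL

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "weather_generator_realizations",  # hypothetical offering name
        "observedProperty": "daily_rainfall",           # hypothetical property name
        "responseFormat": "http://www.opengis.net/om/2.0",
    }

    response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
    response.raise_for_status()
    print(response.text[:500])  # O&M XML document carrying the realizations
    ```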

  13. Design and simulation of integration system between automated material handling system and manufacturing layout in the automotive assembly line

    NASA Astrophysics Data System (ADS)

    Seha, S.; Zamberi, J.; Fairu, A. J.

    2017-10-01

    The material handling system (MHS) is an important contributor to plant productivity and is recognized as an integral part of today's manufacturing systems, and MHS technology and equipment types have grown tremendously. Observations from the case study show that issues involving the material handling system contribute to reduced production efficiency. This paper proposes a new design integrating the material handling system with the manufacturing layout by investigating the influence of both. An approach using Delmia Quest simulation software is introduced, and the simulation results are used to assess the influence of integrating the material handling system and the manufacturing layout on the performance of an automotive assembly line. The results show that assembly line output increases by more than 31% over the current system. The average source throughput rate rises to 252 units per working hour in model 3, demonstrating the effectiveness of the pick-to-light system as efficient storage equipment. Overall, the results show that the application of AGVs and the pick-to-light system has a large, significant effect on the automotive assembly line. Moreover, the change of layout also yields a large, significant improvement in performance.

  14. A large Great Britain-wide outbreak of STEC O157 phage type 8 linked to handling of raw leeks and potatoes.

    PubMed

    Launders, N; Locking, M E; Hanson, M; Willshaw, G; Charlett, A; Salmon, R; Cowden, J; Harker, K S; Adak, G K

    2016-01-01

    Between December 2010 and July 2011, 252 cases of STEC O157 PT8 stx1 + 2 infection were reported in England, Scotland and Wales. This was the largest outbreak of STEC reported in England and the second largest in the UK to date. Eighty cases were hospitalized, with two cases of haemolytic uraemic syndrome and one death reported. Routine investigative data were used to generate a hypothesis but the subsequent case-control study was inconclusive. A second, more detailed, hypothesis generation exercise identified consumption or handling of vegetables as a potential mode of transmission. A second case-control study demonstrated that cases were more likely than controls to live in households whose members handled or prepared leeks bought unwrapped [odds ratio (OR) 40, 95% confidence interval (CI) 2·08-769·4], and potatoes bought in sacks (OR 13·13, 95% CI 1·19-145·3). This appears to be the first outbreak of STEC O157 infection linked to the handling of leeks.

  15. Ergonomics of disposable handles for minimally invasive surgery.

    PubMed

    Büchel, D; Mårvik, R; Hallabrin, B; Matern, U

    2010-05-01

    The ergonomic deficiencies of currently available minimally invasive surgery (MIS) instrument handles have been addressed in many studies. In this study, a new ergonomic pistol handle concept, realized as a prototype, and two disposable ring handles were investigated according to ergonomic properties set by new European standards. In this study, 25 volunteers performed four practical tasks to evaluate the ergonomics of the handles used in standard operating procedures (e.g., measuring a suture and cutting to length, precise maneuvering and targeting, and dissection of a gallbladder). Moreover, 20 participants underwent electromyography (EMG) tests to measure the muscle strain they experienced while carrying out the basic functions (grasp, rotate, and maneuver) in the x, y, and z axes. The data measured included the number of errors, the time required for task completion, perception of pressure areas, and EMG data. The values for usability in the test were effectiveness, efficiency, and user satisfaction. Surveys relating to the subjective rating were completed after each task for each of the three handles tested. Each handle except the new prototype caused pressure areas and pain. Extreme differences in muscle strain could not be observed for any of the three handles. Experienced surgeons worked more quickly with the prototype when measuring and cutting a suture (approximately 20%) and during precise maneuvering and targeting (approximately 20%). On the other hand, they completed the dissection task faster with the handle manufactured by Ethicon. Fewer errors were made with the prototype in dissection of the gallbladder. In contrast to the handles available on the market, the prototype was always rated as positive by the volunteers in the subjective surveys. None of the handles could fulfil all of the requirements with top scores. Each handle had its advantages and disadvantages. In contrast to the ring handles, the volunteers could fulfil most of the tasks more

  16. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    PubMed

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver large amount of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  17. Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leclercq, Florent; Wandelt, Benjamin; Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: jasche@iap.fr, E-mail: wandelt@iap.fr

    Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments, and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis of the origin and growth of the early traces of the cosmic web present in the initial density field and of the evolution of global quantities such as the volume and mass filling fractions of different structures. For the problem of web-type classification, the results described in this work constitute the first connection between theory and observations at non-linear scales including a physical model of structure formation and the demonstrated capability of uncertainty quantification. A connection between cosmology and information theory using real data also naturally emerges from our probabilistic approach. Our results constitute quantitative chrono-cosmography of the complex web-like patterns underlying the observed galaxy distribution.

  18. Composition of web services using Markov decision processes and dynamic programming.

    PubMed

    Uc-Cetina, Víctor; Moo-Mena, Francisco; Hernandez-Ucan, Rafael

    2015-01-01

    We propose a Markov decision process model for solving the Web service composition (WSC) problem. Iterative policy evaluation, value iteration, and policy iteration algorithms are used to experimentally validate our approach, with artificial and real data. The experimental results show the reliability of the model and the methods employed, with policy iteration being the best one in terms of the minimum number of iterations needed to estimate an optimal policy, with the highest Quality of Service attributes. Our experimental work shows how the solution of a WSC problem involving a set of 100,000 individual Web services and where a valid composition requiring the selection of 1,000 services from the available set can be computed in the worst case in less than 200 seconds, using an Intel Core i5 computer with 6 GB RAM. Moreover, a real WSC problem involving only 7 individual Web services requires less than 0.08 seconds, using the same computational power. Finally, a comparison with two popular reinforcement learning algorithms, sarsa and Q-learning, shows that these algorithms require one or two orders of magnitude and more time than policy iteration, iterative policy evaluation, and value iteration to handle WSC problems of the same complexity.
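
    A minimal value-iteration sketch on an invented, WSC-flavoured MDP: states stand for stages of a partial composition, actions pick a candidate service, and rewards stand in for QoS. The tiny transition table is a toy assumption, not the paper's benchmark setup.

    ```python
    # Value iteration, then greedy policy extraction, on a toy composition MDP.
    GAMMA = 0.95
    # transitions[state][action] = list of (probability, next_state, reward)
    transitions = {
        "start": {"svcA": [(1.0, "mid", 0.8)], "svcB": [(1.0, "mid", 0.5)]},
        "mid": {"svcC": [(0.9, "done", 1.0), (0.1, "start", -0.2)]},
        "done": {},  # terminal: composition complete
    }

    V = {s: 0.0 for s in transitions}
    for _ in range(100):  # value-iteration sweeps until (practically) converged
        for s, actions in transitions.items():
            if actions:
                V[s] = max(
                    sum(p * (r + GAMMA * V[s2]) for p, s2, r in outcomes)
                    for outcomes in actions.values()
                )

    policy = {  # pick, per state, the service with the best expected value
        s: max(
            actions,
            key=lambda a: sum(p * (r + GAMMA * V[s2]) for p, s2, r in actions[a]),
        )
        for s, actions in transitions.items() if actions
    }
    print(V, policy)
    ```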

  19. Composition of Web Services Using Markov Decision Processes and Dynamic Programming

    PubMed Central

    Uc-Cetina, Víctor; Moo-Mena, Francisco; Hernandez-Ucan, Rafael

    2015-01-01

    We propose a Markov decision process model for solving the Web service composition (WSC) problem. Iterative policy evaluation, value iteration, and policy iteration algorithms are used to experimentally validate our approach, with artificial and real data. The experimental results show the reliability of the model and the methods employed, with policy iteration being the best one in terms of the minimum number of iterations needed to estimate an optimal policy, with the highest Quality of Service attributes. Our experimental work shows how the solution of a WSC problem involving a set of 100,000 individual Web services and where a valid composition requiring the selection of 1,000 services from the available set can be computed in the worst case in less than 200 seconds, using an Intel Core i5 computer with 6 GB RAM. Moreover, a real WSC problem involving only 7 individual Web services requires less than 0.08 seconds, using the same computational power. Finally, a comparison with two popular reinforcement learning algorithms, sarsa and Q-learning, shows that these algorithms require one or two orders of magnitude and more time than policy iteration, iterative policy evaluation, and value iteration to handle WSC problems of the same complexity. PMID:25874247

  20. Semantic Search of Web Services

    ERIC Educational Resources Information Center

    Hao, Ke

    2013-01-01

    This dissertation addresses semantic search of Web services using natural language processing. We first survey various existing approaches, focusing on the fact that the expensive costs of current semantic annotation frameworks result in limited use of semantic search for large scale applications. We then propose a vector space model based service…

  1. Graph-Based Semantic Web Service Composition for Healthcare Data Integration.

    PubMed

    Arch-Int, Ngamnij; Arch-Int, Somjit; Sonsilphong, Suphachoke; Wanchai, Paweena

    2017-01-01

    Among the numerous and heterogeneous web services offered by different sources, automatic web service composition is the most convenient method for building complex business processes that require the invocation of multiple existing atomic services. Current solutions in functional web service composition lack autonomous querying of semantic matches between the parameters of web services, which is necessary for composing large sets of related services. In this paper, we propose a graph-based Semantic Web Service composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation, in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering potential web services and nonredundant web service compositions for a user's query using a graph-based search algorithm. The proposed approach was applied to healthcare data integration across different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement.
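
    A toy sketch of searching a service dependency graph under stated assumptions: each service maps required input concepts to produced output concepts, and a breadth-first search returns a non-redundant chain from the user's inputs to the requested outputs. The service names and concepts are invented; the paper's own matchmaking rules are richer than this exact-concept matching.

    ```python
    # BFS over a service dependency graph: grow the set of known concepts by
    # applying services whose inputs are satisfied, until the goal is covered.
    from collections import deque

    services = {  # name: (required input concepts, produced output concepts)
        "getPatientID": ({"name"}, {"patient_id"}),
        "getRecord": ({"patient_id"}, {"record"}),
        "getAllergies": ({"record"}, {"allergies"}),
    }

    def compose(have, want):
        queue = deque([(frozenset(have), [])])
        seen = {frozenset(have)}
        while queue:
            known, chain = queue.popleft()
            if want <= known:
                return chain
            for name, (req, out) in services.items():
                if req <= known and not out <= known:  # applicable and non-redundant
                    nxt = frozenset(known | out)
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, chain + [name]))
        return None

    print(compose({"name"}, {"allergies"}))
    # -> ['getPatientID', 'getRecord', 'getAllergies']
    ```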

  2. Trophic groups and modules: two levels of group detection in food webs.

    PubMed

    Gauzens, Benoit; Thébault, Elisa; Lacroix, Gérard; Legendre, Stéphane

    2015-05-06

    Within food webs, species can be partitioned into groups according to various criteria. Two notions have received particular attention: trophic groups (TGs), which have been used for decades in the ecological literature, and more recently, modules. The relationship between these two group concepts remains unknown in empirical food webs. While recent developments in network theory have led to efficient methods for detecting modules in food webs, the determination of TGs (groups of species that are functionally similar) is largely based on subjective expert knowledge. We develop a novel algorithm for TG detection. We apply this method to empirical food webs and show that aggregation into TGs allows for the simplification of food webs while preserving their information content. Furthermore, we reveal a two-level hierarchical structure where modules partition food webs into large bottom-top trophic pathways, whereas TGs further partition these pathways into groups of species with similar trophic connections. This provides new perspectives for the study of dynamical and functional consequences of food-web structure, bridging topological and dynamical analysis. TGs have a clear ecological meaning and are found to provide a trade-off between network complexity and information loss. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
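
    A toy sketch of the intuition behind trophic groups, assuming an invented web: species with identical sets of prey and predators are functionally interchangeable and fall into one group. The paper's actual TG-detection algorithm is more general than this exact-match grouping.

    ```python
    # Group species by identical trophic neighbourhoods (same prey, same predators).
    links = [("plantA", "herb1"), ("plantA", "herb2"),
             ("plantB", "herb1"), ("plantB", "herb2"),
             ("herb1", "pred"), ("herb2", "pred")]  # (resource, consumer) pairs

    species = {s for link in links for s in link}
    prey = {s: frozenset(a for a, b in links if b == s) for s in species}
    predators = {s: frozenset(b for a, b in links if a == s) for s in species}

    groups = {}
    for s in species:
        groups.setdefault((prey[s], predators[s]), []).append(s)
    print([sorted(g) for g in groups.values()])
    # e.g. ['plantA', 'plantB'] and ['herb1', 'herb2'] each form one group
    ```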

  3. Focused Crawling of the Deep Web Using Service Class Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rocco, D; Liu, L; Critchlow, T

    2004-06-21

    Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  4. Standards should be applied in the prevention and handling of missing data for patient-centered outcomes research: a systematic review and expert consensus.

    PubMed

    Li, Tianjing; Hutfless, Susan; Scharfstein, Daniel O; Daniels, Michael J; Hogan, Joseph W; Little, Roderick J A; Roy, Jason A; Law, Andrew H; Dickersin, Kay

    2014-01-01

    To recommend methodological standards in the prevention and handling of missing data for primary patient-centered outcomes research (PCOR). We searched National Library of Medicine Bookshelf and Catalog as well as regulatory agencies' and organizations' Web sites in January 2012 for guidance documents that had formal recommendations regarding missing data. We extracted the characteristics of included guidance documents and recommendations. Using a two-round modified Delphi survey, a multidisciplinary panel proposed mandatory standards on the prevention and handling of missing data for PCOR. We identified 1,790 records and assessed 30 as having relevant recommendations. We proposed 10 standards as mandatory, covering three domains. First, the single best approach is to prospectively prevent missing data occurrence. Second, use of valid statistical methods that properly reflect multiple sources of uncertainty is critical when analyzing missing data. Third, transparent and thorough reporting of missing data allows readers to judge the validity of the findings. We urge researchers to adopt rigorous methodology and promote good science by applying best practices to the prevention and handling of missing data. Developing guidance on the prevention and handling of missing data for observational studies and studies that use existing records is a priority for future research. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Using Web Server Logs in Evaluating Instructional Web Sites.

    ERIC Educational Resources Information Center

    Ingram, Albert L.

    2000-01-01

    Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…
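
    A minimal sketch of the kind of analysis described: parsing Apache common-log-format lines and counting successful hits per page. The sample lines are invented, and real instructional-site logs carry referrer and user-agent fields worth analyzing as well.

    ```python
    # Parse common-log-format lines and count which pages get used, and how often.
    import re
    from collections import Counter

    LOG_RE = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                        r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+')

    sample_log = [  # invented example entries
        '192.0.2.1 - - [10/Oct/2000:13:55:36 -0700] "GET /lesson1.html HTTP/1.0" 200 2326',
        '192.0.2.2 - - [10/Oct/2000:13:56:01 -0700] "GET /lesson2.html HTTP/1.0" 200 4821',
        '192.0.2.1 - - [10/Oct/2000:14:02:12 -0700] "GET /lesson1.html HTTP/1.0" 200 2326',
    ]

    hits = Counter()
    for line in sample_log:
        m = LOG_RE.match(line)
        if m and m.group("status") == "200":
            hits[m.group("path")] += 1
    print(hits.most_common())  # per-page usage counts for the instructional site
    ```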

  6. 14 CFR 158.49 - Handling of PFC's.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Handling of PFC's. 158.49 Section 158.49... PASSENGER FACILITY CHARGES (PFC'S) Collection, Handling, and Remittance of PFC's § 158.49 Handling of PFC's... amount of PFC revenue in the covered air carrier's account at the time the bankruptcy petition is filed...

  7. 14 CFR 158.49 - Handling of PFC's.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Handling of PFC's. 158.49 Section 158.49... PASSENGER FACILITY CHARGES (PFC'S) Collection, Handling, and Remittance of PFC's § 158.49 Handling of PFC's... amount of PFC revenue in the covered air carrier's account at the time the bankruptcy petition is filed...

  8. 14 CFR 158.49 - Handling of PFC's.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Handling of PFC's. 158.49 Section 158.49... PASSENGER FACILITY CHARGES (PFC'S) Collection, Handling, and Remittance of PFC's § 158.49 Handling of PFC's... amount of PFC revenue in the covered air carrier's account at the time the bankruptcy petition is filed...

  9. 14 CFR 158.49 - Handling of PFC's.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Handling of PFC's. 158.49 Section 158.49... PASSENGER FACILITY CHARGES (PFC'S) Collection, Handling, and Remittance of PFC's § 158.49 Handling of PFC's... amount of PFC revenue in the covered air carrier's account at the time the bankruptcy petition is filed...

  10. 14 CFR 158.49 - Handling of PFC's.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Handling of PFC's. 158.49 Section 158.49... PASSENGER FACILITY CHARGES (PFC'S) Collection, Handling, and Remittance of PFC's § 158.49 Handling of PFC's... amount of PFC revenue in the covered air carrier's account at the time the bankruptcy petition is filed...

  11. Toward Exposing Timing-Based Probing Attacks in Web Applications.

    PubMed

    Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai

    2017-02-25

    Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users' browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach.
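
    A hedged sketch of the anomaly-flagging idea, not the paper's detector: collect baseline timings for a browser operation and flag samples whose z-score exceeds a threshold. The timings and the threshold are invented for illustration.

    ```python
    # Flag timing samples that deviate sharply from a baseline distribution.
    import statistics

    baseline_ms = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 12.2, 12.0]  # invented baseline
    mu = statistics.mean(baseline_ms)
    sigma = statistics.stdev(baseline_ms)

    def is_anomalous(sample_ms, threshold=3.0):
        # z-score test: large deviations may indicate probing behaviour
        return abs(sample_ms - mu) / sigma > threshold

    for t in (12.1, 19.7):
        print(t, "anomalous" if is_anomalous(t) else "normal")
    ```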

  12. Ground data handling for LANDSAT-D

    NASA Technical Reports Server (NTRS)

    Lynch, T. J.

    1976-01-01

    The present plans for the LANDSAT D ground data handling are described in relationship to the mission objectives and the planned spacecraft system. The end to end data system is presented with particular emphasis on the data handling plans for the new instrument, the Thematic Mapper. This instrument generates ten times the amount of data per scene as the present Multispectral Scanner, and this resulting data rate and volume are discussed as well as possible new data techniques to handle them such as image compression.

  13. Uncertainty visualisation in the Model Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only few tools for visualisation of data in a standardised way exist. Furthermore, they are usually realised as thick clients and lack the functionality to handle data coming from web services, as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data, and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered by the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, and statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool

  14. Firefly: embracing future web technologies

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Goldina, T.; Joliet, E.; Ly, L.; Mi, W.; Wang, C.; Zhang, Lijun; Ciardi, D.; Dubois-Felsmann, G.

    2016-07-01

    At IPAC/Caltech, we have developed the Firefly web archive and visualization system. Used in production for the last eight years in many missions, Firefly gives scientists significant capabilities to study data. Firefly provided the first completely web-based FITS viewer as well as a growing set of tabular and plotting visualizers. Further, it will be used for the science user interface of the LSST telescope, which goes online in 2021. Firefly must meet the needs of archive access and visualization for the 2021 LSST telescope and must serve astronomers beyond the year 2030. Recently, our team has faced the fact that the technology behind the Firefly software was becoming obsolete. We were searching for ways to utilize current breakthroughs in maintaining the stability, testability, speed, and reliability of large web applications, which Firefly exemplifies. In the last year, we have ported Firefly to cutting-edge web technologies. Embarking on this massive overhaul is no small feat, to say the least. Choosing the technologies that will maintain a forward trajectory in a future development project is always hard and often overwhelming. When a team must port 150,000 lines of code for a production-level product, there is little room for poor choices. This paper will give an overview of the most modern web technologies and lessons learned in our conversion from a GWT-based system to a React/Redux-based system.

  15. Rotorcraft handling-qualities design criteria development

    NASA Technical Reports Server (NTRS)

    Aiken, Edwin W.; Lebacqz, J. Victor; Chen, Robert T. N.; Key, David L.

    1988-01-01

    Joint NASA/Army efforts at the Ames Research Center to develop rotorcraft handling-qualities design criteria began in earnest in 1975. Notable results were the UH-1H VSTOLAND variable stability helicopter, the VFA-2 camera-and-terrain-board simulator visual system, and the generic helicopter real-time mathematical model, ARMCOP. An initial series of handling-qualities studies was conducted to assess the effects of rotor design parameters, interaxis coupling, and various levels of stability and control augmentation. The ability to conduct in-flight handling-qualities research was enhanced by the development of the NASA/Army CH-47 variable-stability helicopter. Research programs conducted using this vehicle include vertical-response investigations, hover augmentation systems, and the effects of control-force characteristics. The handling-qualities data base was judged to be sufficient to allow an update of the military helicopter handling-qualities specification, MIL-H-8501. These efforts included not only the in-house experimental work but also contracted research and collaborative programs performed under the auspices of various international agreements. The report concludes by reviewing the topics that are currently most in need of work, and the plans for addressing these topics.

  16. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    PubMed Central

    Liang, Steve H.L.; Huang, Chih-Yuan

    2013-01-01

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the sensor web long-tail issue, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, by demonstrating three real-world sensor web applications powered by GeoCENS, we show that the GeoCENS architecture can successfully address the sensor web long-tail issue and consequently realize the world-wide sensor web vision. PMID:24152921

  17. Development of a metal-clad advanced composite shear web design concept

    NASA Technical Reports Server (NTRS)

    Laakso, J. H.

    1974-01-01

    An advanced composite web concept was developed for potential application to the Space Shuttle Orbiter main engine thrust structure. The program consisted of design synthesis, analysis, detail design, element testing, and large scale component testing. A concept was sought that offered significant weight savings through the use of Boron/Epoxy (B/E) reinforced titanium plate structure. The desired concept was one that was practical and that used metal to efficiently improve structural reliability. The resulting development of a unique titanium-clad B/E shear web design concept is described: a titanium-clad plus or minus 45 deg B/E web laminate stiffened with vertical B/E reinforced aluminum stiffeners. Three large scale components were fabricated and tested to demonstrate the performance of the concept.

  18. Ergonomics: safe patient handling and mobility.

    PubMed

    Hallmark, Beth; Mechan, Patricia; Shores, Lynne

    2015-03-01

    This article reviews and investigates the issues surrounding ergonomics, with a specific focus on safe patient handling and mobility. The health care worker of today faces many challenges, one of which is related to the safety of patients. Safe patient handling and mobility is on the forefront of the movement to improve patient safety. This article reviews the risks associated with patient handling and mobility, and informs the reader of current evidence-based practice relevant to this area of care. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Exposing the structure of an Arctic food web.

    PubMed

    Wirta, Helena K; Vesterinen, Eero J; Hambäck, Peter A; Weingartner, Elisabeth; Rasmussen, Claus; Reneerkens, Jeroen; Schmidt, Niels M; Gilg, Olivier; Roslin, Tomas

    2015-09-01

    How food webs are structured has major implications for their stability and dynamics. While poorly studied to date, arctic food webs are commonly assumed to be simple in structure, with few links per species. If this is the case, then different parts of the web may be weakly connected to each other, with populations and species united by only a low number of links. We provide the first highly resolved description of trophic link structure for a large part of a high-arctic food web. For this purpose, we apply a combination of recent techniques to describing the links between three predator guilds (insectivorous birds, spiders, and lepidopteran parasitoids) and their two dominant prey orders (Diptera and Lepidoptera). The resultant web shows a dense link structure and no compartmentalization or modularity across the three predator guilds. Thus, both individual predators and predator guilds tap heavily into the prey community of each other, offering versatile scope for indirect interactions across different parts of the web. The current description of a first but single arctic web may serve as a benchmark toward which to gauge future webs resolved by similar techniques. Targeting an unusual breadth of predator guilds, and relying on techniques with a high resolution, it suggests that species in this web are closely connected. Thus, our findings call for similar explorations of link structure across multiple guilds in both arctic and other webs. From an applied perspective, our description of an arctic web suggests new avenues for understanding how arctic food webs are built and function and of how they respond to current climate change. It suggests that to comprehend the community-level consequences of rapid arctic warming, we should turn from analyses of populations, population pairs, and isolated predator-prey interactions to considering the full set of interacting species.

  20. Method and apparatus for measuring web material wound on a reel

    NASA Technical Reports Server (NTRS)

    Muller, R. M. (Inventor)

    1977-01-01

    A method and apparatus for measuring the number of layers of a web material of known thickness wound on a storage or take-up reel are presented. They are based on the principle that, at a relatively large radius, the layers of a thin web wound on the reel approximate a family of concentric circles whose radii successively increase by the web thickness. Tachometer pulses are generated in response to linear movement of the web, and reset pulses are generated in response to rotation of the reel. A digital circuit, responsive to the tachometer and reset pulses, generates data indicating the layer number of any layer of the web and the position of the web within the layer, without requiring numerical interpolation.
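
    The principle lends itself to a short worked example. In the sketch below, the tachometer pulses counted during one reel revolution give the pay-out length (one circumference), from which the current radius and hence the layer number follow; all constants are assumed for illustration and do not come from the patent.

```typescript
// A minimal sketch of the layer-counting principle described above.
// All names and constants are illustrative assumptions, not taken from the patent.

const HUB_RADIUS_MM = 50;      // assumed empty-reel (hub) radius
const WEB_THICKNESS_MM = 0.1;  // assumed known web thickness
const MM_PER_TACH_PULSE = 1;   // assumed linear web travel per tachometer pulse

/**
 * Estimate the current layer number from the tachometer pulses counted
 * between two successive reset pulses (i.e. during one full reel revolution).
 * One revolution pays out one circumference, 2*pi*r, so the pulse count
 * gives the current radius, and (r - r0) / thickness gives the layer.
 */
function layerFromPulsesPerRevolution(pulsesPerRev: number): number {
  const circumferenceMm = pulsesPerRev * MM_PER_TACH_PULSE;
  const radiusMm = circumferenceMm / (2 * Math.PI);
  return Math.round((radiusMm - HUB_RADIUS_MM) / WEB_THICKNESS_MM);
}

// Example: ~330 mm of web paid out per revolution -> radius ~52.5 mm -> layer ~25.
console.log(layerFromPulsesPerRevolution(330));
```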

  1. Exception handling for sensor fusion

    NASA Astrophysics Data System (ADS)

    Chavez, G. T.; Murphy, Robin R.

    1993-08-01

    This paper presents a control scheme for handling sensing failures (sensor malfunctions, significant degradations in performance due to changes in the environment, and errant expectations) in sensor fusion for autonomous mobile robots. The advantages of the exception handling mechanism are that it emphasizes a fast response to sensing failures, is able to use only a partial causal model of sensing failure, and leads to a graceful degradation of sensing if the sensing failure cannot be compensated for. The exception handling mechanism consists of two modules: error classification and error recovery. The error classification module in the exception handler attempts to classify the type and source(s) of the error using a modified generate-and-test procedure. If the source of the error is isolated, the error recovery module examines its cache of recovery schemes, which either repair or replace the current sensing configuration. If the failure is due to an error in expectation or cannot be identified, the planner is alerted. Experiments using actual sensor data collected by the CSM Mobile Robotics/Machine Perception Laboratory's Denning mobile robot demonstrate the operation of the exception handling mechanism.
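
    A minimal sketch of the two-module scheme, classification followed by a cached recovery scheme, with the planner alerted when no recovery applies, might look as follows in TypeScript. The failure classes, tests, and recovery actions are illustrative assumptions, not the paper's actual model.

```typescript
// A minimal sketch of the two-module exception-handling scheme described
// above: error classification followed by error recovery. The failure
// classes and recovery cache contents are illustrative assumptions.

type SensingFailure = "sensor-malfunction" | "environment-change" | "unknown";

interface SensingConfig { activeSensors: string[] }

/** Classify the failure with a (stub) generate-and-test procedure. */
function classifyFailure(reading: number, expected: number): SensingFailure {
  // Generate candidate hypotheses and test each against the evidence;
  // here two crude tests stand in for the real procedure.
  if (Number.isNaN(reading)) return "sensor-malfunction";
  if (Math.abs(reading - expected) > 10) return "environment-change";
  return "unknown";
}

/** Cache of recovery schemes that repair or replace the sensing configuration. */
const recoverySchemes: Partial<Record<SensingFailure, (c: SensingConfig) => SensingConfig>> = {
  "sensor-malfunction": (c) => ({ activeSensors: c.activeSensors.filter(s => s !== "camera") }),
  "environment-change": (c) => ({ activeSensors: [...c.activeSensors, "sonar"] }),
};

function handleSensingFailure(reading: number, expected: number, config: SensingConfig): SensingConfig {
  const failure = classifyFailure(reading, expected);
  const recover = recoverySchemes[failure];
  if (recover) return recover(config);        // repair or replace the configuration
  console.warn("alerting planner:", failure); // unidentified or expectation error
  return config;                              // graceful degradation: keep going
}

// Example: a wildly deviating reading triggers the environment-change recovery.
console.log(handleSensingFailure(35, 20, { activeSensors: ["camera"] })); // adds "sonar"
```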

  2. Differences among nursing homes in outcomes of a safe resident handling program

    PubMed Central

    Kurotvski, Alicia; Gore, Rebecca; Buchholz, Bryan; Punnett, Laura

    2018-01-01

    A large nursing home corporation implemented a safe resident handling program (SRHP) in 2004–2007. We evaluated its efficacy over a 2-year period by examining differences among 5 centers in program outcomes and potential predictors of those differences. We observed nursing assistants (NAs), recording activities and body postures at 60-second intervals on personal digital assistants at baseline and at 3-month, 12-month, and 24-month follow-ups. The two outcomes computed were change in equipment use during resident handling and change in a physical workload index that estimated spinal loading due to body postures and handled loads. Potential explanatory factors were extracted from post-observation interviews, investigator surveys of the workforce, administrative data, and employee satisfaction surveys. The facility with the most positive outcome measures was associated with many positive changes in explanatory factors, and the facility with the fewest positive outcome measures experienced negative changes in the same factors. These findings suggest greater SRHP benefits where there was lower NA turnover and agency staffing; less time pressure; and better teamwork, staff communication, and supervisory support. PMID:22833329

  3. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    Backdoors and information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, enhancing server security and avoiding the damage of illegal access. Firstly, a system for discovering patterns of information leakage in CGI scripts from Web log data is proposed. Secondly, these patterns are provided to system administrators so they can modify their code and enhance Web site security. Two aspects are described: the first combines the Web application log with the Web log to extract more information, so that Web data mining can discover information that firewalls and intrusion detection systems cannot find; the second proposes an operational module for the Web site to enhance its security. For clustering server sessions, a density-based clustering technique is used to reduce resource cost and improve efficiency.

  4. Web Mining: Machine Learning for Web Applications.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Chau, Michael

    2004-01-01

    Presents an overview of machine learning research and reviews methods used for evaluating machine learning systems. Ways that machine-learning algorithms were used in traditional information retrieval systems in the "pre-Web" era are described, and the field of Web mining and how machine learning has been used in different Web mining…

  5. MyLabStocks: a web-application to manage molecular biology materials.

    PubMed

    Chuffart, Florent; Yvert, Gaël

    2014-05-01

    Laboratory stocks are the hardware of research. They must be stored and managed with minimum loss of material and information. Plasmids, oligonucleotides and strains are regularly exchanged between collaborators within and between laboratories. Managing and sharing information about every item is crucial for retrieval of reagents, for planning experiments and for reproducing past experimental results. We have developed a web-based application to manage stocks commonly used in a molecular biology laboratory. Its functionalities include user-defined privileges, visualization of plasmid maps directly from their sequence and the capacity to search items from fields of annotation or directly from a query sequence using BLAST. It is designed to handle records of plasmids, oligonucleotides, yeast strains, antibodies, pipettes and notebooks. Based on PHP/MySQL, it can easily be extended to handle other types of stocks and it can be installed on any server architecture. MyLabStocks is freely available from: https://forge.cbp.ens-lyon.fr/redmine/projects/mylabstocks under an open source licence. © 2014 Laboratoire de Biologie Moleculaire de la Cellule CNRS. Yeast published by John Wiley & Sons, Ltd.

  6. Spider orb webs rely on radial threads to absorb prey kinetic energy

    PubMed Central

    Sensenig, Andrew T.; Lorentz, Kimberly A.; Kelly, Sean P.; Blackledge, Todd A.

    2012-01-01

    The kinetic energy of flying insect prey is a formidable challenge for orb-weaving spiders. These spiders construct two-dimensional, round webs from a combination of stiff, strong radial silk and highly elastic, glue-coated capture spirals. Orb webs must first stop the flight of insect prey and then retain those insects long enough to be subdued by the spiders. Consequently, spider silks rank among the toughest known biomaterials. The large number of silk threads composing a web suggests that aerodynamic dissipation may also play an important role in stopping prey. Here, we quantify energy dissipation in orb webs spun by diverse species of spiders using data derived from high-speed videos of web deformation under prey impact. By integrating video data with material testing of silks, we compare the relative contributions of radial silk, the capture spiral and aerodynamic dissipation. Radial silk dominated energy absorption in all webs, with the potential to account for approximately 100 per cent of the work of stopping prey in larger webs. The most generous estimates for the roles of capture spirals and aerodynamic dissipation show that they rarely contribute more than 30 per cent and 10 per cent of the total work of stopping prey, respectively, and then only for smaller orb webs. The reliance of spider orb webs upon internal energy absorption by radial threads for prey capture suggests that the material properties of the capture spirals are largely unconstrained by the selective pressures of stopping prey and can instead evolve freely in response to alternative functional constraints such as adhering to prey. PMID:22431738

  7. Spider orb webs rely on radial threads to absorb prey kinetic energy.

    PubMed

    Sensenig, Andrew T; Lorentz, Kimberly A; Kelly, Sean P; Blackledge, Todd A

    2012-08-07

    The kinetic energy of flying insect prey is a formidable challenge for orb-weaving spiders. These spiders construct two-dimensional, round webs from a combination of stiff, strong radial silk and highly elastic, glue-coated capture spirals. Orb webs must first stop the flight of insect prey and then retain those insects long enough to be subdued by the spiders. Consequently, spider silks rank among the toughest known biomaterials. The large number of silk threads composing a web suggests that aerodynamic dissipation may also play an important role in stopping prey. Here, we quantify energy dissipation in orb webs spun by diverse species of spiders using data derived from high-speed videos of web deformation under prey impact. By integrating video data with material testing of silks, we compare the relative contributions of radial silk, the capture spiral and aerodynamic dissipation. Radial silk dominated energy absorption in all webs, with the potential to account for approximately 100 per cent of the work of stopping prey in larger webs. The most generous estimates for the roles of capture spirals and aerodynamic dissipation show that they rarely contribute more than 30 per cent and 10 per cent of the total work of stopping prey, respectively, and then only for smaller orb webs. The reliance of spider orb webs upon internal energy absorption by radial threads for prey capture suggests that the material properties of the capture spirals are largely unconstrained by the selective pressures of stopping prey and can instead evolve freely in response to alternative functional constraints such as adhering to prey.

  8. Silicon web process development. [for low cost solar cells

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Hopkins, R. H.; Seidensticker, R. G.; Mchugh, J. P.; Hill, F. E.; Heimlich, M. E.; Driggers, J. M.

    1979-01-01

    Silicon dendritic web, a single crystal ribbon shaped during growth by crystallographic forces and surface tension (rather than dies), is a highly promising base material for efficient low cost solar cells. The form of the product, smooth flexible strips 100 to 200 microns thick, conserves expensive silicon and facilitates automation of crystal growth and the subsequent manufacturing of solar cells. These characteristics, coupled with the highest demonstrated ribbon solar cell efficiency (15.5%), make silicon web a leading candidate to meet, or better, the 1986 Low Cost Solar Array (LSA) Project cost objective of 50 cents per peak watt of photovoltaic output power. The main objectives of the Web Program, to develop technology that significantly increases web output rate and to show the feasibility of simultaneous melt replenishment and growth, have largely been accomplished. Recently, web output rates of 23.6 sq cm/min, nearly three times the 8 sq cm/min maximum rate of a year ago, were achieved. Webs 4 cm wide or greater were grown on a number of occasions.

  9. Bioaccumulation and trophic transfer of pharmaceuticals in food webs from a large freshwater lake.

    PubMed

    Xie, Zhengxin; Lu, Guanghua; Yan, Zhenhua; Liu, Jianchao; Wang, Peifang; Wang, Yonghua

    2017-03-01

    Pharmaceuticals are increasingly detected in environmental matrices, but information on their trophic transfer in aquatic food webs is insufficient. This study investigated the bioaccumulation and trophic transfer of 23 pharmaceuticals in Taihu Lake, China. Pharmaceutical concentrations were analyzed in surface water, sediments and 14 aquatic species, including plankton, invertebrates and fish collected from the lake. The median concentrations of the detected pharmaceuticals ranged from not detected (ND) to 49 ng/L in water, ND to 49 ng/g dry weight (dw) in sediments, and from ND to 130 ng/g dw in biota. Higher concentrations of pharmaceuticals were found in zoobenthos relative to plankton, shrimp and fish muscle. In fish tissues, the observed pharmaceutical contents in the liver and brain were generally higher than those in the gills and muscle. Both bioaccumulation factors (median BAFs: 19-2008 L/kg) and biota-sediment accumulation factors (median BSAFs: 0.0010-0.037) indicated a low bioaccumulation potential for the target pharmaceuticals. For eight of the most frequently detected pharmaceuticals in food webs, the trophic magnification factors (TMFs) were analyzed from two different regions of Taihu Lake. The TMFs for roxithromycin, propranolol, diclofenac, ibuprofen, ofloxacin, norfloxacin, ciprofloxacin and tetracycline in the two food webs ranged from 0.28 to 1.25, suggesting that none of these pharmaceuticals experienced trophic magnification. In addition, the pharmaceutical TMFs did not differ significantly between the two regions in Taihu Lake. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Reducing variation in a rabbit vaccine safety study with particular emphasis on housing conditions and handling.

    PubMed

    Verwer, Cynthia M; van der Ark, Arno; van Amerongen, Geert; van den Bos, Ruud; Hendriksen, Coenraad F M

    2009-04-01

    This paper describes the results of a study of the effects of modified housing conditions and of conditioning and habituation to humans, using a rabbit model for monitoring whole-cell pertussis vaccine (pWCV)-induced adverse effects. The study has been performed with reference to previous vaccine safety studies of pWCV in rabbits in which results were difficult to interpret due to the large variation in experimental outcome, especially in the key parameter deep-body temperature (T(b)). Certain stressful laboratory conditions, as well as procedures involving humans, e.g. blood sampling, inoculation and cage-cleaning, were hypothesized to cause this large variation. The results of this study show that under modified housing conditions rabbits have normal circadian body temperatures. This allowed discrimination of pWCV-induced adverse effects, in which handled rabbits tended to show a dose-related increase in temperature after inoculation with little variance, whereas non-handled rabbits did not. Effects of experimental and routine procedures on body temperature were significantly reduced under modified conditions and were within the normal T(b) range. Handled animals reacted less strongly and with less variance to experimental procedures, such as blood sampling, injection and cage-cleaning, than non-handled rabbits. Overall, handling had a positive effect on the behaviour of the animals. Data show that the housing modifications have provided a more robust model for monitoring pWCV adverse effects. Furthermore, conditioning and habituation of rabbits to humans reduce the variation in experimental outcome, which might allow for a reduction in the number of animals used. In addition, this also reduces distress and thus contributes to refining this animal model.

  11. Countering Insider Threats - Handling Insider Threats Using Dynamic, Run-Time Forensics

    DTIC Science & Technology

    2007-10-01

    able to handle the security policy requirements of a large organization containing many decentralized and diverse users, while being easily managed... contained in the TIF folder. Searching for any text string and sorting is also supported. The cache index file of Internet Explorer is not changed... containing thousands of malware software signatures. Separate datasets can be created for various classifications of malware such as encryption software

  12. A web-based solution for 3D medical image visualization

    NASA Astrophysics Data System (ADS)

    Hou, Xiaoshuai; Sun, Jianyong; Zhang, Jianguo

    2015-03-01

    In this presentation, we present a web-based 3D medical image visualization solution that enables interactive processing and visualization of large medical image data over the web. To improve the efficiency of our solution, we adopt GPU-accelerated techniques to process images on the server side while rapidly transferring images to the HTML5-supported web browser on the client side. Compared to traditional local visualization solutions, our solution does not require users to install extra software or download the whole volume dataset from the PACS server. With this web-based solution, users can access the 3D medical image visualization service wherever internet access is available.
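
    On the client side, such a design reduces to requesting server-rendered views and drawing them into an HTML5 canvas. The sketch below shows this round trip in TypeScript; the endpoint URL and its query parameters are assumptions for illustration.

```typescript
// A minimal sketch of the client side of such a solution: request a
// server-rendered view of the volume and draw it in an HTML5 canvas.
// The endpoint URL and query parameters are illustrative assumptions.

async function showRenderedView(canvas: HTMLCanvasElement, azimuth: number, elevation: number): Promise<void> {
  // The (GPU-accelerated) server renders the requested view and returns an image.
  const response = await fetch(`/render?az=${azimuth}&el=${elevation}&w=${canvas.width}&h=${canvas.height}`);
  const bitmap = await createImageBitmap(await response.blob());
  const ctx = canvas.getContext("2d");
  if (ctx) ctx.drawImage(bitmap, 0, 0);
}

// Example: re-render whenever the user drags to rotate the volume.
// showRenderedView(document.querySelector("canvas")!, 45, 30);
```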

  13. SITE GENERATED RADIOLOGICAL WASTE HANDLING SYSTEM DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. C. Khamankar

    2000-06-20

    The Site Generated Radiological Waste Handling System handles radioactive waste products that are generated at the geologic repository operations area. The waste is collected, treated if required, packaged for shipment, and shipped to a disposal site. Waste streams include low-level waste (LLW) in solid and liquid forms, as well as mixed waste that contains hazardous and radioactive constituents. Liquid LLW is segregated into two streams, non-recyclable and recyclable. The non-recyclable stream may contain detergents or other non-hazardous cleaning agents and is packaged for shipment. The recyclable stream is treated to recycle a large portion of the water while the remaining concentrated waste is packaged for shipment; this greatly reduces the volume of waste requiring disposal. There will be no liquid LLW discharge. Solid LLW consists of wet solids such as ion exchange resins and filter cartridges, as well as dry active waste such as tools, protective clothing, and poly bags. Solids will be sorted, volume reduced, and packaged for shipment. The generation of mixed waste at the Monitored Geologic Repository (MGR) is not planned; however, if it does come into existence, it will be collected and packaged for disposal at its point of occurrence, temporarily staged, then shipped to government-approved off-site facilities for disposal. The Site Generated Radiological Waste Handling System has equipment located in both the Waste Treatment Building (WTB) and in the Waste Handling Building (WHB). All types of liquid and solid LLW are processed in the WTB, while wet solid waste from the Pool Water Treatment and Cooling System is packaged where received in the WHB. There is no installed hardware for mixed waste. The Site Generated Radiological Waste Handling System receives waste from locations where water is used for decontamination functions. In most cases the water is piped back to the WTB for processing. The WTB and WHB provide staging areas for storing and

  14. Cost of Information Handling in Hospitals

    PubMed Central

    Jydstrup, Ronald A.; Gross, Malvern J.

    1966-01-01

    Cost of information handling (noncomputerized) in hospitals was studied in detail from an industrial engineering point of view at Rochester General, Highland, and Geneva General hospitals. Activities were observed, personnel questioned, and time studies carried out. It was found that information handling comprises about one fourth of the hospitals' operating cost—a finding strongly recommending revision and streamlining of both forms and inefficient operations. In an Appendix to this study are presented 15 items that would improve information handling in one area of the hospital, nursing units, where this activity is greater than in any other in a hospital. PMID:5971636

  15. Web-based platform for collaborative medical imaging research

    NASA Astrophysics Data System (ADS)

    Rittner, Leticia; Bento, Mariana P.; Costa, André L.; Souza, Roberto M.; Machado, Rubens C.; Lotufo, Roberto A.

    2015-03-01

    Medical imaging research depends fundamentally on the availability of large image collections, image processing and analysis algorithms, hardware, and a multidisciplinary research team. It must be reproducible, free of errors, fast, accessible through a large variety of devices spread across research centers, and conducted simultaneously by a multidisciplinary team. Therefore, we propose a collaborative research environment, named Adessowiki, where tools and datasets are integrated and readily available over the Internet through a web browser. Moreover, processing history and all intermediate results are stored and displayed in automatically generated web pages for each object in the research project or clinical study. It requires no installation or configuration on the client side and offers centralized tools and specialized hardware resources, since processing takes place in the cloud.

  16. Space webs based on rotating tethered formations

    NASA Astrophysics Data System (ADS)

    Palmerini, Giovanni B.; Sgubini, Silvano; Sabatini, Marco

    2009-07-01

    Several on-going studies indicate interest in large, light orbiting structures shaped like fish nets or webs: along the ropes of the web, small spacecraft can move like spiders to position and re-locate, at will, pieces of hardware devoted to specific missions. The concept can be considered an intermediate solution between a large monolithic structure, heavy and expensive to realize but easy to control, and a formation of satellites, where all system members are completely free and must manoeuvre to acquire a desired configuration. The advantage of having a "hard-but-light" link among the different grids lies in the partition of tasks among system components and in a possible overall reduction of control system complexity and cost. Unfortunately, there is no stable configuration for an orbiting, two-dimensional web made of light, flexible tethers, which cannot support compression forces. A possible solution is to use centrifugal forces to pull the net taut, with a reduced number of simple thrusters located at the tips of the tethers to initially acquire the required spin. In this paper a dynamic analysis of a simplified rotating web is performed, in order to evaluate the spinning velocity able to satisfy the requirement for system stability. The adopted model superimposes simpler elements, each given by a tether (made up of a number of linear finite elements) connecting two end bodies accommodating the spinning thrusters. The combination of these "diameter-like" elements forms the web, shaped according to specific requirements. The net is primarily considered subject to Keplerian attraction and J2 and drag perturbations only, but its behaviour under thermal inputs is also investigated.

  17. Big data in wildlife research: remote web-based monitoring of hibernating black bears.

    PubMed

    Laske, Timothy G; Garshelis, David L; Iaizzo, Paul A

    2014-12-11

    Numerous innovations for the management and collection of "big data" have arisen in the field of medicine, including implantable computers and sensors, wireless data transmission, and web-based repositories for collecting and organizing information. Recently, human clinical devices have been deployed in captive and free-ranging wildlife to aid in the characterization of both normal physiology and the interaction of animals with their environment, including reactions to humans. Although these devices have had a significant impact on the types and quantities of information that can be collected, their utility has been limited by internal memory capacities, the efforts required to extract and analyze information, and by the necessity to handle the animals in order to retrieve stored data. We surgically implanted miniaturized cardiac monitors (1.2 cc, Reveal LINQ™, Medtronic Inc.), a newly developed human clinical system, into hibernating wild American black bears (N = 6). These devices include wireless capabilities, which enabled frequent transmissions of detailed physiological data from bears in their remote den sites to a web-based data storage and management system. Solar and battery powered telemetry stations transmitted detailed physiological data over the cellular network during the winter months. The system provided the transfer of large quantities of data in near-real time. Observations included changes in heart rhythms associated with birthing and caring for cubs, and in all bears, long periods without heart beats (up to 16 seconds) occurred during each respiratory cycle. For the first time, detailed physiological data were successfully transferred from an animal in the wild to a web-based data collection and management system, overcoming previous limitations on the quantities of data that could be transferred. The system provides an opportunity to detect unusual events as they are occurring, enabling investigation of the animal and site shortly

  18. Graph-Based Semantic Web Service Composition for Healthcare Data Integration

    PubMed Central

    2017-01-01

    Among the numerous and heterogeneous web services offered by different sources, automatic web service composition is the most convenient method for building complex business processes that invoke multiple existing atomic services. Current solutions for functional web service composition lack autonomous querying of semantic matches between web service parameters, which is necessary when composing large sets of related services. In this paper, we propose a graph-based Semantic Web Services composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation, in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering potential web services and a non-redundant web service composition for a user's query using a graph-based search algorithm. The proposed approach was applied to healthcare data integration across different health organizations and was evaluated on two aspects: execution time measurement and correctness measurement. PMID:29065602
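
    The run-time search can be pictured as forward chaining over the dependency graph: starting from the user's inputs, fire any service whose inputs are all satisfied until the requested outputs appear. The TypeScript sketch below shows this idea with invented service names; a real system, like the one described, would additionally prune redundant services from the result.

```typescript
// A minimal sketch of run-time composition over a dependency graph, in the
// spirit of the approach described above. Service names and parameters are
// illustrative assumptions, and no redundancy pruning is attempted here.

interface Service { name: string; inputs: string[]; outputs: string[] }

function composeServices(registry: Service[], provided: string[], requested: string[]): string[] {
  const known = new Set(provided);
  const plan: string[] = [];
  let progress = true;
  while (progress && !requested.every(r => known.has(r))) {
    progress = false;
    for (const svc of registry) {
      if (!plan.includes(svc.name) && svc.inputs.every(i => known.has(i))) {
        plan.push(svc.name);                   // invoke this service next
        svc.outputs.forEach(o => known.add(o));
        progress = true;
      }
    }
  }
  return requested.every(r => known.has(r)) ? plan : []; // [] = no composition found
}

// Example: patientId -> demographics -> insurance summary.
const registry: Service[] = [
  { name: "getDemographics", inputs: ["patientId"], outputs: ["demographics"] },
  { name: "getInsurance", inputs: ["demographics"], outputs: ["insuranceSummary"] },
];
console.log(composeServices(registry, ["patientId"], ["insuranceSummary"]));
// -> ["getDemographics", "getInsurance"]
```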

  19. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
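
    A client of such a platform talks plain HTTP and JSON. The TypeScript sketch below shows what fetching a calculation record might look like; the endpoint path and document shape are assumptions for illustration, not the platform's actual schema.

```typescript
// A minimal sketch of consuming a RESTful, JSON-speaking API of this kind
// from a web client. The endpoint path and the shape of the returned
// document are illustrative assumptions, not the platform's actual schema.

interface CalculationRecord {
  id: string;
  molecule: { formula: string; atoms: { element: string; xyz: [number, number, number] }[] };
  properties?: Record<string, number>; // e.g. computed energies
}

async function fetchCalculation(baseUrl: string, id: string): Promise<CalculationRecord> {
  const res = await fetch(`${baseUrl}/api/calculations/${id}`, {
    headers: { Accept: "application/json" },
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return (await res.json()) as CalculationRecord;
}

// Example (hypothetical host and id):
// fetchCalculation("https://example.org", "42").then(c => console.log(c.molecule.formula));
```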

  20. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  1. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  2. Soybean Knowledge Base (SoyKB): a Web Resource for Soybean Translational Genomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Trupti; Patil, Kapil; Fitzpatrick, Michael R.

    2012-01-17

    Background: Soybean Knowledge Base (SoyKB) is a comprehensive all-inclusive web resource for soybean translational genomics. SoyKB is designed to handle the management and integration of soybean genomics, transcriptomics, proteomics and metabolomics data along with annotation of gene function and biological pathway. It contains information on four entities, namely genes, microRNAs, metabolites and single nucleotide polymorphisms (SNPs). Methods: SoyKB has many useful tools such as Affymetrix probe ID search, gene family search, multiple gene/metabolite search supporting co-expression analysis, and protein 3D structure viewer as well as download and upload capacity for experimental data and annotations. It has four tiers of registration, which control different levels of access to public and private data. It allows users of certain levels to share their expertise by adding comments to the data. It has a user-friendly web interface together with genome browser and pathway viewer, which display data in an intuitive manner to the soybean researchers, producers and consumers. Conclusions: SoyKB addresses the increasing need of the soybean research community to have a one-stop-shop functional and translational omics web resource for information retrieval and analysis in a user-friendly way. SoyKB can be publicly accessed at http://soykb.org/.

  3. CNA web server: rigidity theory-based thermal unfolding simulations of proteins for linking structure, (thermo-)stability, and function.

    PubMed

    Krüger, Dennis M; Rathi, Prakash Chandra; Pfleger, Christopher; Gohlke, Holger

    2013-07-01

    The Constraint Network Analysis (CNA) web server provides a user-friendly interface to the CNA approach developed in our laboratory for linking results from rigidity analyses to biologically relevant characteristics of a biomolecular structure. The CNA web server provides a refined modeling of thermal unfolding simulations that considers the temperature dependence of hydrophobic tethers and computes a set of global and local indices for quantifying biomacromolecular stability. From the global indices, phase transition points are identified where the structure switches from a rigid to a floppy state; these phase transition points can be related to a protein's (thermo-)stability. Structural weak spots (unfolding nuclei) are automatically identified, too; this knowledge can be exploited in data-driven protein engineering. The local indices are useful for linking flexibility and function and for understanding the impact of ligand binding on protein flexibility. The CNA web server robustly handles small-molecule ligands in general. To overcome issues of sensitivity with respect to the input structure, the CNA web server allows performing two ensemble-based variants of thermal unfolding simulations. The web server output is provided as raw data, plots and/or Jmol representations. The CNA web server, accessible at http://cpclab.uni-duesseldorf.de/cna or http://www.cnanalysis.de, is free and open to all users with no login requirement.
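
    As a toy illustration of how a phase transition point can be read off a global index, the sketch below scans a rigidity index sampled over temperature and reports the temperature of the steepest drop. This is an assumed simplification for illustration, not the CNA server's actual algorithm, and the data are made up.

```typescript
// A toy sketch of picking a phase transition point off a global stability
// index from a thermal unfolding simulation: report the temperature at
// which the index drops most steeply. The data and the "steepest drop"
// rule are illustrative assumptions, not the CNA server's algorithm.

function phaseTransitionTemp(temps: number[], rigidityIndex: number[]): number {
  let bestIdx = 1;
  let steepest = 0;
  for (let i = 1; i < rigidityIndex.length; i++) {
    const drop = rigidityIndex[i - 1] - rigidityIndex[i];
    if (drop > steepest) { steepest = drop; bestIdx = i; }
  }
  return temps[bestIdx]; // temperature where the network switches rigid -> floppy
}

// Example with made-up data: the index collapses between 340 K and 350 K.
console.log(phaseTransitionTemp([300, 310, 320, 330, 340, 350, 360],
                                [0.9, 0.88, 0.86, 0.84, 0.80, 0.35, 0.30])); // 350
```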

  4. CNA web server: rigidity theory-based thermal unfolding simulations of proteins for linking structure, (thermo-)stability, and function

    PubMed Central

    Krüger, Dennis M.; Rathi, Prakash Chandra; Pfleger, Christopher; Gohlke, Holger

    2013-01-01

    The Constraint Network Analysis (CNA) web server provides a user-friendly interface to the CNA approach developed in our laboratory for linking results from rigidity analyses to biologically relevant characteristics of a biomolecular structure. The CNA web server provides a refined modeling of thermal unfolding simulations that considers the temperature dependence of hydrophobic tethers and computes a set of global and local indices for quantifying biomacromolecular stability. From the global indices, phase transition points are identified where the structure switches from a rigid to a floppy state; these phase transition points can be related to a protein’s (thermo-)stability. Structural weak spots (unfolding nuclei) are automatically identified, too; this knowledge can be exploited in data-driven protein engineering. The local indices are useful for linking flexibility and function and for understanding the impact of ligand binding on protein flexibility. The CNA web server robustly handles small-molecule ligands in general. To overcome issues of sensitivity with respect to the input structure, the CNA web server allows performing two ensemble-based variants of thermal unfolding simulations. The web server output is provided as raw data, plots and/or Jmol representations. The CNA web server, accessible at http://cpclab.uni-duesseldorf.de/cna or http://www.cnanalysis.de, is free and open to all users with no login requirement. PMID:23609541

  5. Early handling modulates outcome of neonatal dexamethasone exposure.

    PubMed

    Claessens, Sanne E F; Daskalakis, Nikolaos P; Oitzl, Melly S; de Kloet, E Ronald

    2012-09-01

    Synthetic glucocorticoids such as dexamethasone (DEX) are used to prevent or treat respiratory disorders in prematurely born infants. Besides the short-term benefit on lung development, numerous human and animal studies have reported adverse neurodevelopmental side effects. In contrast, maternal care is known to exert a positive influence on neurodevelopmental outcome in rodents. The aim of the current study was therefore to investigate whether neonatal handling (days 1-21), known to induce maternal care, might serve as an intervention strategy modulating the adverse effects of DEX treatment (days 1-3). For this purpose we have measured the outcome of these early-life manipulations on development as well as adult endocrine and behavioral phenotype of male rats. Maternal care was observed during the first week of life and indeed enhanced in response to handling. Eye opening was accelerated and body weight reduced in DEX-treated animals. In adulthood, we report that handling ameliorated impaired spatial learning observed in DEX treated non-handled animals in the T-maze. Additionally, handling reduced susceptibility to the impact of DEX treatment in the water maze. Although DEX treatment and handling both resulted in enhanced negative feedback of the stress-induced corticosterone response and both reduced startle reactivity, the acquisition of fear was only reduced by handling, without effect of DEX. Interestingly, handling had a beneficial effect on pre-pulse inhibition, which was diminished after DEX treatment. In conclusion, these findings indicate that handling of the neonate enhances maternal care and attenuates specific DEX-induced alterations in the adult behavioral phenotype. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. 30 CFR 75.833 - Handling high-voltage trailing cables.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Handling high-voltage trailing cables. 75.833... High-Voltage Longwalls § 75.833 Handling high-voltage trailing cables. (a) Cable handling. (1) Miners must not handle energized trailing cables unless they are wearing high-voltage insulating gloves, which...

  7. Airborne microorganisms associated with grain handling.

    PubMed

    Swan, J R; Crook, B

    1998-01-01

    There is substantial evidence that workers handling grain develop allergic respiratory symptoms. Microbiological contaminants are likely to be a significant contributing factor. Workers' exposure to microorganisms contaminating grain dust in the UK was therefore examined. Aerobiological studies were made when grain was being handled on farms and also during bulk handling of grain in dockside terminals. A quantitative and qualitative microbiological examination of the airborne grain dust was carried out. Samples of airborne grain dust were collected and viable bacteria, fungi and actinomycetes were grown, isolated and identified. It was found that workers handling grain or working close to grain at farms and docks were frequently exposed to more than 1 million bacteria and fungi per m3 air, and that airborne bacteria and fungi exceeded 10^4 per m3 air in all areas sampled. The qualitative examination of the samples showed that the predominant microorganisms present differed between freshly harvested grain and stored grain, but not between different types of grain.

  8. Compilation and network analyses of cambrian food webs.

    PubMed

    Dunne, Jennifer A; Williams, Richard J; Martinez, Neo D; Wood, Rachel A; Erwin, Douglas H

    2008-04-29

    A rich body of empirically grounded theory has developed about food webs--the networks of feeding relationships among species within habitats. However, detailed food-web data and analyses are lacking for ancient ecosystems, largely because of the low resolution of taxa coupled with uncertain and incomplete information about feeding interactions. These impediments appear insurmountable for most fossil assemblages; however, a few assemblages with excellent soft-body preservation across trophic levels are candidates for food-web data compilation and topological analysis. Here we present plausible, detailed food webs for the Chengjiang and Burgess Shale assemblages from the Cambrian Period. Analyses of degree distributions and other structural network properties, including sensitivity analyses of the effects of uncertainty associated with Cambrian diet designations, suggest that these early Paleozoic communities share remarkably similar topology with modern food webs. Observed regularities reflect a systematic dependence of structure on the numbers of taxa and links in a web. Most aspects of Cambrian food-web structure are well-characterized by a simple "niche model," which was developed for modern food webs and takes into account this scale dependence. However, a few aspects of topology differ between the ancient and recent webs: longer path lengths between species and more species in feeding loops in the earlier Chengjiang web, and higher variability in the number of links per species for both Cambrian webs. Our results are relatively insensitive to the exclusion of low-certainty or random links. The many similarities between Cambrian and recent food webs point toward surprisingly strong and enduring constraints on the organization of complex feeding interactions among metazoan species. The few differences could reflect a transition to more strongly integrated and constrained trophic organization within ecosystems following the rapid diversification of species, body

  9. Human Handling Promotes Compliant Behavior in Adult Laboratory Rabbits

    PubMed Central

    Swennes, Alton G; Alworth, Leanne C; Harvey, Stephen B; Jones, Carolyn A; King, Christopher S; Crowell-Davis, Sharon L

    2011-01-01

    Routine laboratory procedures can be stressful for laboratory animals. We wanted to determine whether human handling of adult rabbits could induce a degree of habituation, reducing stress and facilitating research-related manipulation. To this end, adult New Zealand white rabbits were handled either frequently or minimally. After being handled over 3 wk, these rabbits were evaluated by novel personnel and compared with minimally handled controls. Evaluators subjectively scored the rabbits for their relative compliance or resistance to being scruffed and removed from their cages, being transported to a treatment room, and their behavior at all stages of the exercise. Upon evaluation, handled rabbits scored significantly more compliant than nontreated controls. During evaluation, behaviors that the rabbits displayed when they were approached in their cages and while being handled outside their cages were recorded and compared between study groups. Handled rabbits displayed behavior consistent with a reduction in human-directed fear. This study illustrates the potential for handling to improve compliance in laboratory procedures and reduce fear-related behavior in laboratory rabbits. Such handling could be used to improve rabbit welfare through the reduction of stress and exposure to novel stimuli. PMID:21333162

  10. Evolution of the cosmic web

    NASA Astrophysics Data System (ADS)

    Cautun, Marius; van de Weygaert, Rien; Jones, Bernard J. T.; Frenk, Carlos S.

    2014-07-01

    The cosmic web is the largest scale manifestation of the anisotropic gravitational collapse of matter. It represents the transitional stage between linear and non-linear structures and contains easily accessible information about the early phases of structure formation processes. Here we investigate the characteristics and the time evolution of morphological components. Our analysis involves the application of the NEXUS Multiscale Morphology Filter technique, predominantly its NEXUS+ version, to high resolution and large volume cosmological simulations. We quantify the cosmic web components in terms of their mass and volume content, their density distribution and halo populations. We employ new analysis techniques to determine the spatial extent of filaments and sheets, such as their total length and local width. This analysis identifies clusters and filaments as the most prominent components of the web. In contrast, while voids and sheets take most of the volume, they correspond to underdense environments and are devoid of group-sized and more massive haloes. At early times the cosmos is dominated by tenuous filaments and sheets, which, during subsequent evolution, merge together, such that the present-day web is dominated by fewer, but much more massive, structures. The analysis of the mass transport between environments clearly shows how matter flows from voids into walls, and then via filaments into cluster regions, which form the nodes of the cosmic web. We also study the properties of individual filamentary branches, to find long, almost straight, filaments extending to distances larger than 100 h-1 Mpc. These constitute the bridges between massive clusters, which seem to form along approximately straight lines.

  11. Evaluation of a metal shear web selectively reinforced with filamentary composites for space shuttle application. Phase 2: summary report: Shear web component fabrication

    NASA Technical Reports Server (NTRS)

    Laakso, J. H.; Smith, D. D.; Zimmerman, D. K.

    1973-01-01

    The fabrication of two shear web test elements and three large scale shear web test components is reported. In addition, the fabrication of test fixtures for the elements and components is described. The center-loaded beam test fixtures were configured to have a test side and a dummy or permanent side. The test fixtures were fabricated from standard extruded aluminum sections and plates and were designed to be reusable.

  12. Protecting Database Centric Web Services against SQL/XPath Injection Attacks

    NASA Astrophysics Data System (ADS)

    Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique

    Web services represent a powerful interface for back-end database systems and are increasingly being used in business critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks, by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark and proved to be 100% effective in stopping attacks, non-intrusive, and very easy to use.
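
    One simple way to picture such transparent detection is to learn the structural skeletons of the SQL statements a service legitimately issues and abort any invocation whose statement deviates from them, as injected code does. The TypeScript sketch below illustrates the idea with a crude literal-masking normalizer; the real mechanism described in the paper is considerably more thorough.

```typescript
// A minimal sketch of the detection idea: learn the structural "skeleton"
// of the SQL a service legitimately issues, then abort any invocation whose
// statement deviates from a known skeleton (as injected code would). The
// normalization rule is an illustrative assumption, far simpler than the
// paper's mechanism.

/** Reduce a SQL statement to its structure by masking string and numeric literals. */
function skeleton(sql: string): string {
  return sql
    .replace(/'(?:[^']|'')*'/g, "?") // mask string literals
    .replace(/\b\d+\b/g, "?")        // mask numeric literals
    .trim();
}

const learnedSkeletons = new Set<string>([
  skeleton("SELECT * FROM users WHERE id = 42"), // collected during a learning phase
]);

function executeGuarded(sql: string): void {
  if (!learnedSkeletons.has(skeleton(sql))) {
    throw new Error("Aborted: SQL structure deviates from learned profile");
  }
  // ...hand the statement to the database driver here...
}

// A classic injection changes the structure, so it is rejected:
// executeGuarded("SELECT * FROM users WHERE id = 42 OR 1=1"); // throws
```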

  13. Collaborative Writing among Second Language Learners in Academic Web-Based Projects

    ERIC Educational Resources Information Center

    Kessler, Greg; Bikowski, Dawn; Boggs, Jordan

    2012-01-01

    This study investigates Web-based, project oriented, many-to-many collaborative writing for academic purposes. Thirty-eight Fulbright scholars in an orientation program at a large Midwestern university used a Web-based word processing tool to collaboratively plan and report on a research project. The purpose of this study is to explore and…

  14. Web2Quests: Updating a Popular Web-Based Inquiry-Oriented Activity

    ERIC Educational Resources Information Center

    Kurt, Serhat

    2009-01-01

    WebQuest is a popular inquiry-oriented activity in which learners use Web resources. Since the creation of the innovation, almost 15 years ago, the Web has changed significantly, while the WebQuest technique has changed little. This article examines possible applications of new Web trends on WebQuest instructional strategy. Some possible…

  15. SCHeMA web-based observation data information system

    NASA Astrophysics Data System (ADS)

    Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise

    2016-04-01

    SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing them to manage and configure the sensor probes, view the stored observations and metadata, and handle alarms. The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration; development of a core profile interface for data access via the OGC standard; external services such as web services, WMS, and WFS; and data download and query management) will be presented and illustrated with examples of ongoing tests in coastal and open sea.

  16. Biotool2Web: creating simple Web interfaces for bioinformatics applications.

    PubMed

    Shahid, Mohammad; Alam, Intikhab; Fuellen, Georg

    2006-01-01

    Currently there are many bioinformatics applications being developed, but there is no easy way to publish them on the World Wide Web. We have developed a Perl script, called Biotool2Web, which makes the task of creating web interfaces for simple ('home-made') bioinformatics applications quick and easy. Biotool2Web uses an XML document containing the parameters to run the tool on the Web, and generates the corresponding HTML and common gateway interface (CGI) files ready to be published on a web server. This tool is available for download at URL http://www.uni-muenster.de/Bioinformatics/services/biotool2web/ Georg Fuellen (fuellen@alum.mit.edu).

  17. Handling of thermal paper: Implications for dermal exposure to bisphenol A and its alternatives

    PubMed Central

    Bernier, Meghan R.

    2017-01-01

    Bisphenol A (BPA) is an endocrine disrupting chemical used in a wide range of consumer products including photoactive dyes used in thermal paper. Recent studies have shown that dermal absorption of BPA can occur when handling these papers. Yet, regulatory agencies have largely dismissed thermal paper as a major source of BPA exposure. Exposure estimates provided by agencies such as the European Food Safety Authority (EFSA) are based on assumptions about how humans interact with this material, stating that ‘typical’ exposures for adults involve only one handling per day for short periods of time (<1 minute), with limited exposure surfaces (three fingertips). The objective of this study was to determine how individuals handle thermal paper in one common setting: a cafeteria providing short-order meals. We observed thermal paper handling in a college-aged population (n = 698 subjects) at the University of Massachusetts’ dining facility. We find that in this setting, individuals handle receipts for an average of 11.5 min, that >30% of individuals hold thermal paper with more than three fingertips, and >60% allow the paper to touch their palm. Only 11% of the participants we observed were consistent with the EFSA model for time of contact and dermal surface area. Mathematical modeling based on handling times we measured and previously published transfer coefficients, concentrations of BPA in paper, and absorption factors indicate the most conservative estimated intake from handling thermal paper in this population is 51.1 ng/kg/day, similar to EFSA’s estimates of 59 ng/kg/day from dermal exposures. Less conservative estimates, using published data on concentrations in thermal paper and transfer rates to skin, indicate that exposures are likely significantly higher. Based on our observational data, we propose that the current models for estimating dermal BPA exposures are not consistent with normal human behavior and should be reevaluated. PMID:28570582
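
    The intake estimate follows a simple dose model: handling time times a paper-to-skin transfer rate times an absorbed fraction, normalized by body weight. The sketch below shows the arithmetic; only the 11.5 min handling time comes from the study, while the transfer rate, absorption fraction, and body weight are placeholder assumptions, so the result deliberately does not reproduce the paper's 51.1 ng/kg/day.

```typescript
// A sketch of the dermal intake model outlined above. Only the average
// handling time (11.5 min) comes from the study; every other number here
// is an illustrative placeholder, not a value from the paper.

const handlingMinPerDay = 11.5;   // observed average handling time
const transferNgPerMin = 25;      // assumed BPA transfer from paper to skin
const absorptionFraction = 0.3;   // assumed fraction absorbed through skin
const bodyWeightKg = 70;          // assumed adult body weight

// Daily dermal intake, normalized to body weight (ng/kg/day).
const intakeNgPerKgDay = (handlingMinPerDay * transferNgPerMin * absorptionFraction) / bodyWeightKg;
console.log(intakeNgPerKgDay.toFixed(1)); // ~1.2 with these placeholder numbers
```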

  18. Handling of thermal paper: Implications for dermal exposure to bisphenol A and its alternatives.

    PubMed

    Bernier, Meghan R; Vandenberg, Laura N

    2017-01-01

    Bisphenol A (BPA) is an endocrine disrupting chemical used in a wide range of consumer products including photoactive dyes used in thermal paper. Recent studies have shown that dermal absorption of BPA can occur when handling these papers. Yet, regulatory agencies have largely dismissed thermal paper as a major source of BPA exposure. Exposure estimates provided by agencies such as the European Food Safety Authority (EFSA) are based on assumptions about how humans interact with this material, stating that 'typical' exposures for adults involve only one handling per day for short periods of time (<1 minute), with limited exposure surfaces (three fingertips). The objective of this study was to determine how individuals handle thermal paper in one common setting: a cafeteria providing short-order meals. We observed thermal paper handling in a college-aged population (n = 698 subjects) at the University of Massachusetts' dining facility. We find that in this setting, individuals handle receipts for an average of 11.5 min, that >30% of individuals hold thermal paper with more than three fingertips, and >60% allow the paper to touch their palm. Only 11% of the participants we observed were consistent with the EFSA model for time of contact and dermal surface area. Mathematical modeling based on handling times we measured and previously published transfer coefficients, concentrations of BPA in paper, and absorption factors indicate the most conservative estimated intake from handling thermal paper in this population is 51.1 ng/kg/day, similar to EFSA's estimates of 59 ng/kg/day from dermal exposures. Less conservative estimates, using published data on concentrations in thermal paper and transfer rates to skin, indicate that exposures are likely significantly higher. Based on our observational data, we propose that the current models for estimating dermal BPA exposures are not consistent with normal human behavior and should be reevaluated.
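
    The intake model described in these two records is, at its core, a product of handling time, a transfer coefficient, an absorption factor, and body mass. The sketch below shows the arithmetic shape of such an estimate in Python; the parameter values are placeholders, not the study's published coefficients.

    ```python
    def dermal_intake_ng_per_kg_day(handling_min, transfer_ng_per_min,
                                    absorption_fraction, body_mass_kg):
        """Daily dermal BPA intake estimate from handling thermal paper."""
        return handling_min * transfer_ng_per_min * absorption_fraction / body_mass_kg

    # 11.5 min/day is the observed average handling time; the transfer rate
    # (300 ng/min), absorption (10%), and body mass (70 kg) are illustrative only.
    print(dermal_intake_ng_per_kg_day(11.5, 300.0, 0.10, 70.0))  # ~4.9 ng/kg/day
    ```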

  19. 7 CFR 959.53 - Handling for special purposes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Handling for special purposes. 959.53 Section 959.53 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing... Regulating Handling Regulations § 959.53 Handling for special purposes. Regulations in effect pursuant to...

  20. SPSmart: adapting population based SNP genotype databases for fast and comprehensive web access.

    PubMed

    Amigo, Jorge; Salas, Antonio; Phillips, Christopher; Carracedo, Angel

    2008-10-10

    In the last five years large online resources of human variability have appeared, notably HapMap, Perlegen and the CEPH foundation. These databases of genotypes with population information act as catalogues of human diversity, and are widely used as reference sources for population genetics studies. Although many useful conclusions may be extracted by querying databases individually, the lack of flexibility for combining data from within and between each database does not allow the calculation of key population variability statistics. We have developed a novel tool for accessing and combining large-scale genomic databases of single nucleotide polymorphisms (SNPs) in widespread use in human population genetics: SPSmart (SNPs for Population Studies). A fast pipeline creates and maintains a data mart from the most commonly accessed databases of genotypes containing population information: data is mined, summarized into the standard statistical reference indices, and stored into a relational database that currently handles as many as 4 × 10^9 genotypes and that can be easily extended to new database initiatives. We have also built a web interface to the data mart that allows the browsing of underlying data indexed by population and the combining of populations, allowing intuitive and straightforward comparison of population groups. All the information served is optimized for web display, and most of the computations are already pre-processed in the data mart to speed up the data browsing and any computational treatment requested. In practice, SPSmart allows populations to be combined into user-defined groups, while multiple databases can be accessed and compared in a few simple steps from a single query. It performs the queries rapidly and gives straightforward graphical summaries of SNP population variability through visual inspection of allele frequencies outlined in standard pie-chart format. In addition, full numerical description of the data is output in
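
    As an illustration of the population-combining step, the sketch below pools per-population allele counts into a single frequency for a user-defined group. The population names and counts are invented; SPSmart's internal data mart schema is not described in the abstract.

    ```python
    def pooled_allele_frequency(populations):
        """populations: iterable of (allele_count, total_chromosomes) pairs."""
        alleles = sum(count for count, _ in populations)
        total = sum(n for _, n in populations)
        return alleles / total if total else float("nan")

    # Hypothetical counts for one SNP in two HapMap populations
    ceu = (48, 120)
    tsi = (62, 176)
    print(pooled_allele_frequency([ceu, tsi]))  # frequency in the combined group
    ```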

  1. Handle-shaped Prominence

    NASA Image and Video Library

    2001-02-17

    The NASA Extreme Ultraviolet Imaging Telescope aboard ESA's SOHO spacecraft took this image of a huge, handle-shaped prominence in 1999. Prominences are huge clouds of relatively cool, dense plasma suspended in the Sun's hot, thin corona.

  2. E-Mail Molecules—Individualizing the Large Lecture Class

    NASA Astrophysics Data System (ADS)

    Wamser, Carl C.

    2003-11-01

    All students in the organic chemistry class are assigned a unique set of nine molecules to report on as optional extra credit assignments. The molecules are taken from a list containing over 200 molecules on the class Web site; they represent an assortment of biologically relevant compounds, from acetaminophen to yohimbine. Once a week, students may submit information about one of the molecules for two points extra credit (where the course includes a total of over 600 points from traditional quizzes and exams). The information requested about the molecules varies slightly each term as student expertise grows, for example, molecular formula, hybridizations, functional groups, or number of stereocenters, but always includes biological relevance and sources of information. Initially students submitted data directly to the instructor by e-mail, but submissions now are handled by a Web-based course management system (WebCT). The goal is to give students individualized assignments that are relatively realistic in light of their future careers in health sciences. Nearly all of the students do some of the molecules, and many students do all of them. About 30–40% of the students who do the assignments regularly gain a grade benefit. Student responses to the exercise have been positive.

  3. Handling Qualities Optimization for Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben; Theodore, Colin R.; Berger, Tom

    2016-01-01

    Over the past decade, NASA, under a succession of rotary-wing programs, has been moving towards coupling multiple discipline analyses in a rigorous, consistent manner to evaluate rotorcraft conceptual designs. Handling qualities is one of the component analyses to be included in a future NASA Multidisciplinary Analysis and Optimization framework for conceptual design of VTOL aircraft. Similarly, the future vision for the capability of the Concept Design and Assessment Technology Area (CD&A-TA) of the U.S. Army Aviation Development Directorate also includes a handling qualities component. SIMPLI-FLYD is a tool jointly developed by NASA and the U.S. Army to perform modeling and analysis for the assessment of flight dynamics and control aspects of the handling qualities of rotorcraft conceptual designs. An exploration of handling qualities analysis has been carried out using SIMPLI-FLYD in illustrative scenarios of a tiltrotor in forward flight and a single-main-rotor helicopter at hover. Using SIMPLI-FLYD and the conceptual design tool NDARC integrated into a single process, the effects of variations of design parameters such as tail or rotor size were evaluated in the form of margins to fixed- and rotary-wing handling qualities metrics as well as the vehicle empty weight. The handling qualities design margins are shown to vary across the flight envelope due to both changing flight dynamic and control characteristics and changing handling qualities specification requirements. The current SIMPLI-FLYD capability and future developments are discussed in the context of an overall rotorcraft conceptual design process.

  4. Parallax handling of image stitching using dominant-plane homography

    NASA Astrophysics Data System (ADS)

    Pang, Zhaofeng; Li, Cheng; Zhao, Baojun; Tang, Linbo

    2015-10-01

    In this paper, we present a novel image stitching method to handle parallax in practical applications. For images with a significant amount of parallax, the more effective approach is to align the overlapping regions roughly and globally, and then apply a seam-cutting method to composite naturally stitched images. It is well known that images can be modeled by multiple planes resulting from projective parallax under non-ideal imaging conditions. The dominant-plane homography has the important advantages of warping an image globally and avoiding some local distortions. The proposed method primarily addresses the large-parallax problem through two steps: (1) selecting matching point pairs located on the dominant plane, by clustering matching correspondences and then measuring the cost of each cluster; and (2) modifying the standard seam-cutting method to incorporate edge maps of the overlapping area, in order to obtain a plausible seam. Furthermore, our approach is demonstrated to achieve reliable parallax handling through extensive experimental comparisons with state-of-the-art methods.
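
    For readers unfamiliar with the alignment step, the sketch below shows the standard OpenCV recipe for estimating a homography from matched keypoints with RANSAC and warping one image onto the other; the RANSAC inlier set plays the role of the dominant plane. This illustrates the general technique only, not the authors' cluster-and-cost selection of dominant-plane matches.

    ```python
    import cv2
    import numpy as np

    def align_to_dominant_plane(img_src, pts_src, pts_dst, dst_shape):
        """Warp img_src onto the destination frame via a RANSAC homography."""
        src = np.float32(pts_src).reshape(-1, 1, 2)
        dst = np.float32(pts_dst).reshape(-1, 1, 2)
        # Inliers of the RANSAC fit approximate the dominant plane's matches
        H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        warped = cv2.warpPerspective(img_src, H, (dst_shape[1], dst_shape[0]))
        return warped, inlier_mask
    ```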

  5. The 'last mile' of data handling: Fermilab's IFDH tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, Adam L.; Mengel, Marc W.

    2014-01-01

    IFDH (Intensity Frontier Data Handling), is a suite of tools for data movement tasks for Fermilab experiments and is an important part of the FIFE[2] (Fabric for Intensity Frontier [1] Experiments) initiative described at this conference. IFDH encompasses moving input data from caches or storage elements to compute nodes (the 'last mile' of data movement) and moving output data potentially to those caches as part of the journey back to the user. IFDH also involves throttling and locking to ensure that large numbers of jobs do not cause data movement bottlenecks. IFDH is realized as an easy to use layermore » that users call in their job scripts (e.g. 'ifdh cp'), hiding the low level data movement tools. One advantage of this layer is that the underlying low level tools can be selected or changed without the need for the user to alter their scripts. Logging and performance monitoring can also be added easily. This system will be presented in detail as well as its impact on the ease of data handling at Fermilab experiments.« less

  6. Characterizing stroke lesions using digital templates and lesion quantification tools in a web-based imaging informatics system for a large-scale stroke rehabilitation clinical trial

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Edwardson, Matthew; Dromerick, Alexander; Winstein, Carolee; Wang, Jing; Liu, Brent

    2015-03-01

    Previously, we presented an Interdisciplinary Comprehensive Arm Rehabilitation Evaluation (ICARE) imaging informatics system that supports a large-scale phase III stroke rehabilitation trial. The ePR system is capable of displaying anonymized patient imaging studies and reports, and the system is accessible to multiple clinical trial sites and users across the United States via the web. However, prior multicenter stroke rehabilitation trials lacked any significant neuroimaging analysis infrastructure. In stroke-related clinical trials, identification of the stroke lesion characteristics can be meaningful, as recent research shows that lesion characteristics are related to stroke scale and functional recovery after stroke. To facilitate stroke clinical trials, we hope to gain insight into specific lesion characteristics, such as vascular territory, for patients enrolled into large stroke rehabilitation trials. To enhance the system's capability for data analysis and data reporting, we have integrated new features with the system: a digital brain template display, a lesion quantification tool and a digital case report form. The digital brain templates are compiled from published vascular territory templates at each of 5 angles of incidence. These templates were updated to include territories in the brainstem using a vascular territory atlas and the Medical Image Processing, Analysis and Visualization (MIPAV) tool. The digital templates are displayed for side-by-side comparisons and transparent template overlay onto patients' images in the image viewer. The lesion quantification tool quantifies planimetric lesion area from a user-defined contour. The digital case report form stores user input into a database, then displays contents in the interface to allow for reviewing, editing, and new inputs. In sum, the newly integrated system features provide the user with readily-accessible web-based tools to identify the vascular territory involved, estimate lesion area
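
    A planimetric area from a user-defined contour can be computed with the shoelace formula; the sketch below is a minimal Python version, with a hypothetical pixel-spacing parameter to convert pixel units to millimetres. It illustrates the general computation, not the ICARE tool's implementation.

    ```python
    def lesion_area_mm2(contour, spacing_mm=(1.0, 1.0)):
        """contour: list of (x, y) pixel coordinates tracing the lesion outline."""
        n = len(contour)
        acc = 0.0
        for i in range(n):
            x1, y1 = contour[i]
            x2, y2 = contour[(i + 1) % n]  # wrap around to close the polygon
            acc += x1 * y2 - x2 * y1
        return abs(acc) / 2.0 * spacing_mm[0] * spacing_mm[1]

    # 10x10-pixel square at 1 mm/pixel -> 100 mm^2
    print(lesion_area_mm2([(0, 0), (10, 0), (10, 10), (0, 10)]))
    ```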

  7. 16 CFR 1207.10 - Handling, storage, and marking.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Handling, storage, and marking. 1207.10... REGULATIONS SAFETY STANDARD FOR SWIMMING POOL SLIDES § 1207.10 Handling, storage, and marking. (a) Marking... identification of the manufacturer. (b) Shipping, handling, and storage. The slide shall be designed, constructed...

  8. 16 CFR 1207.10 - Handling, storage, and marking.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Handling, storage, and marking. 1207.10... REGULATIONS SAFETY STANDARD FOR SWIMMING POOL SLIDES § 1207.10 Handling, storage, and marking. (a) Marking... identification of the manufacturer. (b) Shipping, handling, and storage. The slide shall be designed, constructed...

  9. 16 CFR 1207.10 - Handling, storage, and marking.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Handling, storage, and marking. 1207.10... REGULATIONS SAFETY STANDARD FOR SWIMMING POOL SLIDES § 1207.10 Handling, storage, and marking. (a) Marking... identification of the manufacturer. (b) Shipping, handling, and storage. The slide shall be designed, constructed...

  10. 16 CFR 1207.10 - Handling, storage, and marking.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Handling, storage, and marking. 1207.10... REGULATIONS SAFETY STANDARD FOR SWIMMING POOL SLIDES § 1207.10 Handling, storage, and marking. (a) Marking... identification of the manufacturer. (b) Shipping, handling, and storage. The slide shall be designed, constructed...

  11. A dynamical classification of the cosmic web

    NASA Astrophysics Data System (ADS)

    Forero-Romero, J. E.; Hoffman, Y.; Gottlöber, S.; Klypin, A.; Yepes, G.

    2009-07-01

    In this paper, we propose a new dynamical classification of the cosmic web. Each point in space is classified as one of four possible web types: voids, sheets, filaments and knots. The classification is based on the evaluation of the deformation tensor (i.e. the Hessian of the gravitational potential) on a grid, counting the number of eigenvalues above a certain threshold, λth, at each grid point, where the case of zero, one, two or three such eigenvalues corresponds to a void, sheet, filament, or knot grid point. The collection of neighbouring grid points, friends of friends, of the same web type constitutes voids, sheets, filaments and knots as extended web objects. A simple dynamical consideration of the emergence of the web suggests that the threshold should not be null, as in previous implementations of the algorithm. A detailed dynamical analysis would have found different threshold values for the collapse of sheets, filaments and knots. Short of such an analysis, a phenomenological approach has been adopted, looking for a single threshold to be determined by analysing numerical simulations. Our cosmic web classification has been applied and tested against a suite of large (dark matter only) cosmological N-body simulations. In particular, the dependence of the volume and mass filling fractions on λth and on the resolution has been calculated for the four web types. We also study the percolation properties of voids and filaments. Our main findings are as follows. (i) Already at λth = 0.1 the resulting web classification reproduces the visual impression of the cosmic web. (ii) Between 0.2 ≲ λth ≲ 0.4, a system of percolated voids coexists with a net of interconnected filaments. This suggests a reasonable choice for λth as the parameter that defines the cosmic web. (iii) The dynamical nature of the suggested classification provides a robust framework for incorporating environmental information into galaxy formation models.
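
    The eigenvalue-counting rule is simple to express in code. The sketch below, assuming the deformation tensor has already been evaluated on a grid, classifies each point by counting eigenvalues above λth with NumPy.

    ```python
    import numpy as np

    WEB_TYPES = {0: "void", 1: "sheet", 2: "filament", 3: "knot"}

    def classify_cosmic_web(hessian, lam_th=0.3):
        """hessian: array of shape (..., 3, 3), symmetric at each grid point."""
        eigvals = np.linalg.eigvalsh(hessian)   # ascending eigenvalues, shape (..., 3)
        return (eigvals > lam_th).sum(axis=-1)  # 0..3 -> void..knot

    # Toy single-point example: two eigenvalues above threshold -> filament
    T = np.diag([0.8, 0.5, -0.2])
    print(WEB_TYPES[int(classify_cosmic_web(T))])
    ```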

  12. Toward Exposing Timing-Based Probing Attacks in Web Applications

    PubMed Central

    Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai

    2017-01-01

    Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users’ browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose timing-based probing attacks in web applications. It monitors browser behaviors and identifies anomalous timing patterns to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach. PMID:28245610
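
    As a toy illustration of the detection idea (the actual system instruments Chrome internals), the sketch below flags a script that reads a high-resolution timer many times at near-constant intervals, a pattern typical of timing probes. The thresholds are invented for illustration.

    ```python
    import statistics

    def looks_like_timing_probe(timer_calls_s, min_calls=200, max_cv=0.05):
        """timer_calls_s: timestamps (seconds) at which a script read the timer."""
        if len(timer_calls_s) < min_calls:
            return False
        gaps = [b - a for a, b in zip(timer_calls_s, timer_calls_s[1:])]
        mean = statistics.fmean(gaps)
        if mean <= 0:
            return False
        # Low relative jitter across many reads suggests a tight probing loop
        return statistics.pstdev(gaps) / mean < max_cv

    probe = [i * 0.001 for i in range(500)]  # a tight 1 ms polling loop
    print(looks_like_timing_probe(probe))    # True
    ```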

  13. Adaptations in a hierarchical food web of southeastern Lake Michigan

    USGS Publications Warehouse

    Krause, Ann E.; Frank, Ken A.; Jones, Michael L.; Nalepa, Thomas F.; Barbiero, Richard P.; Madenjian, Charles P.; Agy, Megan; Evans, Marlene S.; Taylor, William W.; Mason, Doran M.; Léonard, Nancy J.

    2009-01-01

    Two issues in ecological network theory are: (1) how to construct an ecological network model and (2) how do entire networks (as opposed to individual species) adapt to changing conditions? We present a novel method for constructing an ecological network model for the food web of southeastern Lake Michigan (USA) and we identify changes in key system properties that are large relative to their uncertainty as this ecological network adapts from one time point to a second time point in response to multiple perturbations. To construct our food web for southeastern Lake Michigan, we followed the list of seven recommendations outlined in Cohen et al. [Cohen, J.E., et al., 1993. Improving food webs. Ecology 74, 252–258] for improving food webs. We explored two inter-related extensions of hierarchical system theory with our food web; the first one was that subsystems react to perturbations independently in the short-term and the second one was that a system's properties change at a slower rate than its subsystems’ properties. We used Shannon's equations to provide quantitative versions of the basic food web properties: number of prey, number of predators, number of feeding links, and connectance (or density). We then compared these properties between the two time-periods by developing distributions of each property for each time period that took uncertainty about the property into account. We compared these distributions, and concluded that non-overlapping distributions indicated changes in these properties that were large relative to their uncertainty. Two subsystems were identified within our food web system structure (p < 0.001). One subsystem had more non-overlapping distributions in food web properties between Time 1 and Time 2 than the other subsystem. The overall system had all overlapping distributions in food web properties between Time 1 and Time 2. These results supported both extensions of hierarchical systems theory. Interestingly, the subsystem with more
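
    For reference, the basic food web properties named above can be computed from a binary predation matrix as in the sketch below; these are the standard count-based definitions, whereas the study used Shannon-based quantitative versions.

    ```python
    import numpy as np

    def food_web_properties(adj):
        """adj[i, j] = 1 if predator j feeds on prey i."""
        S = adj.shape[0]
        L = int(adj.sum())
        return {
            "species": S,
            "links": L,
            "n_prey": int((adj.sum(axis=1) > 0).sum()),       # eaten by someone
            "n_predators": int((adj.sum(axis=0) > 0).sum()),  # eat someone
            "connectance": L / S**2,
        }

    web = np.array([[0, 1, 1],
                    [0, 0, 1],
                    [0, 0, 0]])
    print(food_web_properties(web))  # 3 links, connectance 3/9
    ```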

  14. The Ensembl Web Site: Mechanics of a Genome Browser

    PubMed Central

    Stalker, James; Gibbins, Brian; Meidl, Patrick; Smith, James; Spooner, William; Hotz, Hans-Rudolf; Cox, Antony V.

    2004-01-01

    The Ensembl Web site (http://www.ensembl.org/) is the principal user interface to the data of the Ensembl project, and currently serves >500,000 pages (∼2.5 million hits) per week, providing access to >80 GB (gigabyte) of data to users in more than 80 countries. Built atop an open-source platform comprising Apache/mod_perl and the MySQL relational database management system, it is modular, extensible, and freely available. It is being actively reused and extended in several different projects, and has been downloaded and installed in companies and academic institutions worldwide. Here, we describe some of the technical features of the site, with particular reference to its dynamic configuration that enables it to handle disparate data from multiple species. PMID:15123591

  15. The Ensembl Web site: mechanics of a genome browser.

    PubMed

    Stalker, James; Gibbins, Brian; Meidl, Patrick; Smith, James; Spooner, William; Hotz, Hans-Rudolf; Cox, Antony V

    2004-05-01

    The Ensembl Web site (http://www.ensembl.org/) is the principal user interface to the data of the Ensembl project, and currently serves >500,000 pages (approximately 2.5 million hits) per week, providing access to >80 GB (gigabyte) of data to users in more than 80 countries. Built atop an open-source platform comprising Apache/mod_perl and the MySQL relational database management system, it is modular, extensible, and freely available. It is being actively reused and extended in several different projects, and has been downloaded and installed in companies and academic institutions worldwide. Here, we describe some of the technical features of the site, with particular reference to its dynamic configuration that enables it to handle disparate data from multiple species.

  16. WebMedSA: a web-based framework for segmenting and annotating medical images using biomedical ontologies

    NASA Astrophysics Data System (ADS)

    Vega, Francisco; Pérez, Wilson; Tello, Andrés; Saquicela, Victor; Espinoza, Mauricio; Solano-Quinde, Lizandro; Vidal, Maria-Esther; La Cruz, Alexandra

    2015-12-01

    Advances in medical imaging have fostered medical diagnosis based on digital images. Consequently, the number of studies based on medical image diagnosis is increasing; thus, collaborative work and tele-radiology systems are required to scale up effectively to this diagnosis trend. We tackle the problem of collaborative access to medical images, and present WebMedSA, a framework to manage large datasets of medical images. WebMedSA relies on a PACS and supports the ontological annotation, as well as segmentation and visualization of the images based on their semantic description. Ontological annotations can be performed directly on the volumetric image or at different image planes (e.g., axial, coronal, or sagittal); furthermore, annotations can be complemented after applying a segmentation technique. WebMedSA is based on three main steps: (1) an RDF-ization process that extracts, anonymizes, and serializes metadata comprised in DICOM medical images into RDF/XML; (2) integration of different biomedical ontologies (using the L-MOM library), making the approach ontology-independent; and (3) segmentation and visualization of annotated data, which is further used to generate new annotations according to expert knowledge, and for validation. Initial user evaluations suggest that WebMedSA facilitates the exchange of knowledge between radiologists, and provides the basis for collaborative work among them.
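
    A minimal sketch of the RDF-ization step, using the pydicom and rdflib packages: read a DICOM header without pixel data, skip identifying fields, and serialize selected metadata as RDF/XML. The namespace and predicate names are hypothetical, not WebMedSA's vocabulary.

    ```python
    import pydicom
    from rdflib import Graph, Literal, Namespace, RDF, URIRef

    DCM = Namespace("http://example.org/dicom#")  # hypothetical vocabulary

    def dicom_to_rdf(path, image_uri):
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        g = Graph()
        img = URIRef(image_uri)
        g.add((img, RDF.type, DCM.MedicalImage))
        # Anonymized: only non-identifying header keywords are exported
        for keyword in ("Modality", "StudyDate", "BodyPartExamined"):
            value = getattr(ds, keyword, None)
            if value:
                g.add((img, DCM[keyword], Literal(str(value))))
        return g.serialize(format="xml")
    ```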

  17. Procedures to handle inventory cluster plots that straddle two or more conditions

    Treesearch

    Jerold T. Hahn; Colin D. MacLean; Stanford L. Arner; William A. Bechtold

    1995-01-01

    We review the relative merits and field procedures for four basic plot designs to handle forest inventory plots that straddle two or more conditions, given that subplots will not be moved. A cluster design is recommended that combines fixed-area subplots and variable-radius plot (VRP) sampling. Each subplot in a cluster consists of a large fixed-area subplot for...

  18. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069

  19. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  20. WebSat--a web software for microsatellite marker development.

    PubMed

    Martins, Wellington Santos; Lucas, Divino César Soares; Neves, Kelligton Fabricio de Souza; Bertioli, David John

    2009-01-01

    Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed simple-to-use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites, and the design of primers suitable for their amplification. The program allows full control of parameters and the easy export of the resulting data, thus facilitating the development of microsatellite markers. The web tool may be accessed at http://purl.oclc.org/NET/websat/
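
    Microsatellite detection of the kind WebSat performs can be approximated with a single regular expression over the sequence; the sketch below finds tandem repeats of 1-6 bp motifs in Python. The repeat threshold is illustrative, and primer design is omitted.

    ```python
    import re

    SSR = re.compile(r"([ACGT]{1,6}?)\1{3,}")  # motif repeated at least 4 times

    def find_ssrs(sequence):
        """Return (start, motif, repeat_count) for each tandem repeat found."""
        seq = sequence.upper()
        return [(m.start(), m.group(1), len(m.group(0)) // len(m.group(1)))
                for m in SSR.finditer(seq)]

    print(find_ssrs("ttgacacacacacagg"))  # [(3, 'AC', 5)] -- an (AC)n repeat
    ```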

  1. 30 CFR 77.606 - Energized trailing cables; handling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Energized trailing cables; handling. 77.606... COAL MINES Trailing Cables § 77.606 Energized trailing cables; handling. Energized medium- and high-voltage trailing cables shall be handled only by persons wearing protective rubber gloves (see § 77.606-1...

  2. Materials Handling. Module SH-01. Safety and Health.

    ERIC Educational Resources Information Center

    Center for Occupational Research and Development, Inc., Waco, TX.

    This student module on materials handling is one of 50 modules concerned with job safety and health. It presents the procedures for safe materials handling. Discussed are manual handling methods (lifting and carrying by hand) and mechanical lifting (lifting by powered trucks, cranes or conveyors). Following the introduction, 15 objectives (each…

  3. Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben

    2014-01-01

    This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for the exploration of how to feed back handling qualities results to a conceptual design process are proposed for future work.

  4. Web Mining for Web Image Retrieval.

    ERIC Educational Resources Information Center

    Chen, Zheng; Wenyin, Liu; Zhang, Feng; Li, Mingjing; Zhang, Hongjiang

    2001-01-01

    Presents a prototype system for image retrieval from the Internet using Web mining. Discusses the architecture of the Web image retrieval prototype; document space modeling; user log mining; and image retrieval experiments to evaluate the proposed system. (AEF)

  5. Dynamic Web Pages: Performance Impact on Web Servers.

    ERIC Educational Resources Information Center

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
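
    A multivariate linear model of this kind can be fit directly with a least-squares solve; the sketch below uses NumPy with invented request features and timings, purely to show the shape of the analysis, not the study's actual data or variables.

    ```python
    import numpy as np

    # Hypothetical features per request: payload size (KB), DB rows touched
    X = np.array([[1.0, 10], [2.0, 20], [3.0, 30], [4.0, 45]])
    y = np.array([12.0, 19.5, 27.0, 38.0])  # measured response times (ms)

    A = np.hstack([X, np.ones((X.shape[0], 1))])  # add intercept column
    coef, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)

    predict = lambda x: np.append(x, 1.0) @ coef  # predicted response time
    print(coef, predict([2.5, 25]))
    ```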

  6. A tool for improving the Web accessibility of visually handicapped persons.

    PubMed

    Fujiki, Tadayoshi; Hanada, Eisuke; Yamada, Tomomi; Noda, Yoshihiro; Antoku, Yasuaki; Nakashima, Naoki; Nose, Yoshiaki

    2006-04-01

    Much has been written concerning the difficulties faced by visually handicapped persons when they access the internet. To solve some of the problems and to make web pages more accessible, we developed a tool we call the "Easy Bar," which works as a toolbar on the web browser. The functions of the Easy Bar are to change the size of web texts and images, to adjust the color, and to clear cached data that is automatically saved by the web browser. These functions are executed with ease by clicking buttons and operating a pull-down list. Since the icons built into Easy Bar are quite large, it is not necessary for the user to deal with delicate operations. The functions of Easy Bar run on any web page without increasing the processing time. For the visually handicapped, Easy Bar would contribute greatly to improved web accessibility to medical information.

  7. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  8. WebVR: an interactive web browser for virtual environments

    NASA Astrophysics Data System (ADS)

    Barsoum, Emad; Kuester, Falko

    2005-03-01

    The pervasive nature of web-based content has led to the development of applications and user interfaces that port between a broad range of operating systems and databases, while providing intuitive access to static and time-varying information. However, the integration of this vast resource into virtual environments has remained elusive. In this paper we present an implementation of a 3D Web Browser (WebVR) that enables the user to search the internet for arbitrary information and to seamlessly augment this information into virtual environments. WebVR provides access to the standard data input and query mechanisms offered by conventional web browsers, with the difference that it generates active texture-skins of the web contents that can be mapped onto arbitrary surfaces within the environment. Once mapped, the corresponding texture functions as a fully integrated web-browser that will respond to traditional events such as the selection of links or text input. As a result, any surface within the environment can be turned into a web-enabled resource that provides access to user-definable data. In order to leverage from the continuous advancement of browser technology and to support both static as well as streamed content, WebVR uses ActiveX controls to extract the desired texture skin from industry strength browsers, providing a unique mechanism for data fusion and extensibility.

  9. Planktonic food webs revisited: Reanalysis of results from the linear inverse approach

    NASA Astrophysics Data System (ADS)

    Hlaili, Asma Sakka; Niquil, Nathalie; Legendre, Louis

    2014-01-01

    Identification of the trophic pathway that dominates a given planktonic assemblage is generally based on the distribution of biomasses among food-web compartments, or better, the flows of materials or energy among compartments. These flows are obtained by field observations and a posteriori analyses, including the linear inverse approach. In the present study, we re-analysed carbon flows obtained by inverse analysis at 32 stations in the global ocean and one large lake. Our results do not support two "classical" views of plankton ecology, i.e. that the herbivorous food web is dominated by mesozooplankton grazing on large phytoplankton, and the microbial food web is based on microzooplankton significantly consuming bacteria; our results suggest instead that phytoplankton are generally grazed by microzooplankton, of which they are the main food source. Furthermore, we identified the "phyto-microbial food web", where microzooplankton largely feed on phytoplankton, in addition to the already known "poly-microbial food web", where microzooplankton consume more or less equally various types of food. These unexpected results led to a (re)definition of the conceptual models corresponding to the four trophic pathways we found to exist in plankton, i.e. the herbivorous, multivorous, and two types of microbial food web. We illustrated the conceptual trophic pathways using carbon flows that were actually observed at representative stations. The latter can be calibrated to correspond to any field situation. Our study also provides researchers and managers with operational criteria for identifying the dominant trophic pathway in a planktonic assemblage, these criteria being based on the values of two carbon ratios that could be calculated from flow values that are relatively easy to estimate in the field.

  10. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    PubMed

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the knowledge of the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. For generating a robust and reliable SWM, a large amount of training data is required. However, training data collected by physically recording surgery operations is often limited and data collection is time-consuming and labor-intensive, severely influencing knowledge scalability of the surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost, labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for the robotic cholecystectomy surgery. The generated workflow was evaluated by 4 web-retrieved videos and 4 operation-room-recorded videos, respectively. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. The satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
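
    As a toy version of the video-selection idea, the sketch below ranks candidate videos by a weighted combination of topic-relevance and sentiment scores and keeps those above a threshold. The weights, threshold, and scores are invented; the paper's actual quality analysis is more involved.

    ```python
    def video_quality(topic_relevance, sentiment, w_topic=0.7, w_sentiment=0.3):
        """Both inputs assumed normalized to [0, 1]."""
        return w_topic * topic_relevance + w_sentiment * sentiment

    # Hypothetical (topic_relevance, sentiment) scores per candidate video
    candidates = {"video_a": (0.92, 0.80), "video_b": (0.35, 0.95)}
    selected = [name for name, (t, s) in candidates.items()
                if video_quality(t, s) >= 0.6]
    print(selected)  # ['video_a']
    ```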

  11. How to Handle Impasses in Bargaining.

    ERIC Educational Resources Information Center

    Durrant, Robert E.

    Guidelines in an outline format are presented to school board members and administrators on how to handle impasses in bargaining. The following two rules are given: there sometimes may be strikes, but there always will be settlements; and on the way to settlements, there always will be impasses. Suggestions for handling impasses are listed under…

  12. Neonatal handling and reproductive function in female rats.

    PubMed

    Gomes, C M; Raineki, C; Ramos de Paula, P; Severino, G S; Helena, C V V; Anselmo-Franci, J A; Franci, C R; Sanvitto, G L; Lucion, A B

    2005-02-01

    Neonatal handling induces anovulatory estrous cycles and decreases sexual receptivity in female rats. The synchronous secretion of hormones from the gonads (estradiol (E2) and progesterone (P)), pituitary (luteinizing (LH) and follicle-stimulating (FSH) hormones) and hypothalamus (LH-releasing hormone (LHRH)) are essential for the reproductive functions in female rats. The present study aimed to describe the plasma levels of E2 and P throughout the estrous cycle and LH, FSH and prolactin (PRL) in the afternoon of the proestrus, and the LHRH content in the medial preoptic area (MPOA), median eminence (ME) and medial septal area (MSA) in the proestrus, in the neonatal handled rats. Wistar pup rats were handled for 1 min during the first 10 days after delivery (neonatal handled group) or left undisturbed (nonhandled group). When they reached adulthood, blood samples were collected through a jugular cannula and the MPOA, ME and MSA were microdissected. Plasma levels of the hormones and the content of LHRH were determined by RIA. The number of oocytes counted in the morning of the estrus day in the handled rats was significantly lower than in the nonhandled ones. Neonatal handling reduces E2 levels only on the proestrus day while P levels decreased in metestrus and estrus. Handled females also showed reduced plasma levels of LH, FSH and PRL in the afternoon of the proestrus. The LHRH content in the MPOA was significantly higher than in the nonhandled group. The reduced secretion of E2, LH, FSH and LHRH on the proestrus day may explain the anovulatory estrous cycle in neonatal handled rats. The reduced secretion of PRL in the proestrus may be related to the decreased sexual receptiveness in handled females. In conclusion, early-life environmental stimulation can induce long-lasting effects on the hypothalamus-pituitary-gonad axis.

  13. An experimental test of a fundamental food web motif.

    PubMed

    Rip, Jason M K; McCann, Kevin S; Lynn, Denis H; Fawcett, Sonia

    2010-06-07

    Large-scale changes to the world's ecosystem are resulting in the deterioration of biostructure: the complex web of species interactions that make up ecological communities. A difficult, yet crucial task is to identify food web structures, or food web motifs, that are the building blocks of this baroque network of interactions. Once identified, these food web motifs can then be examined through experiments and theory to provide mechanistic explanations for how structure governs ecosystem stability. Here, we synthesize recent ecological research to show that generalist consumers coupling resources with different interaction strengths is one such motif. This motif amazingly occurs across an enormous range of spatial scales, and so acts to distribute coupled weak and strong interactions throughout food webs. We then perform an experiment that illustrates the importance of this motif to ecological stability. We find that weak interactions coupled to strong interactions by generalist consumers dampen strong interaction strengths and increase community stability. This study takes a critical step by isolating a common food web motif and, through clear experimental manipulation, identifying the fundamental stabilizing consequences of this structure for ecological communities.

  14. 75 FR 27986 - Electronic Filing System-Web (EFS-Web) Contingency Option

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-19

    ...] Electronic Filing System--Web (EFS-Web) Contingency Option AGENCY: United States Patent and Trademark Office... availability of its patent electronic filing system, Electronic Filing System--Web (EFS-Web) by providing a new contingency option when the primary portal to EFS-Web has an unscheduled outage. Previously, the entire EFS...

  15. Designing Effective Web Forms for Older Web Users

    ERIC Educational Resources Information Center

    Li, Hui; Rau, Pei-Luen Patrick; Fujimura, Kaori; Gao, Qin; Wang, Lin

    2012-01-01

    This research aims to provide insight for web form design for older users. The effects of task complexity and information structure of web forms on older users' performance were examined. Forty-eight older participants with abundant computer and web experience were recruited. The results showed significant differences in task time and error rate…

  16. Differences among nursing homes in outcomes of a safe resident handling program.

    PubMed

    Kurowski, Alicia; Gore, Rebecca; Buchholz, Bryan; Punnett, Laura

    2012-01-01

    A large nursing home corporation implemented a safe resident handling program (SRHP) in 2004-2007. We evaluated its efficacy over a 2-year period by examining differences among 5 centers in program outcomes and potential predictors of those differences. We observed nursing assistants (NAs), recording activities and body postures at 60-second intervals on personal digital assistants at baseline and at 3-month, 12-month, and 24-month follow-ups. The two outcomes computed were change in equipment use during resident handling and change in a physical workload index that estimated spinal loading due to body postures and handled loads. Potential explanatory factors were extracted from post-observation interviews, investigator surveys of the workforce, administrative data, and employee satisfaction surveys. The facility with the most positive outcome measures was associated with many positive changes in explanatory factors, and the facility with the fewest positive outcome measures experienced negative changes in the same factors. These findings suggest greater SRHP benefits where there was lower NA turnover and agency staffing; less time pressure; and better teamwork, staff communication, and supervisory support. © 2012 American Society for Healthcare Risk Management of the American Hospital Association.

  17. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    William J. Schroeder

    2011-11-13

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling, at Kitware Inc. in collaboration with the Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data. The solutions we proposed address the typical problems faced by geographically and organizationally separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally

  18. Implementation of a near-real time cross-border web-mapping platform on airborne particulate matter (PM) concentration with open-source software

    NASA Astrophysics Data System (ADS)

    Knörchen, Achim; Ketzler, Gunnar; Schneider, Christoph

    2015-01-01

    Although Europe has been growing together for the past decades, cross-border information platforms on environmental issues are still scarce. With regard to the establishment of a web-mapping tool on airborne particulate matter (PM) concentration for the Euregio Meuse-Rhine, located in the border region of Belgium, Germany and the Netherlands, this article describes the methodical and technical background of implementing such a platform. An open-source solution was selected for presenting the data in a Web GIS (OpenLayers/GeoExt; both JavaScript-based), applying other free tools for data handling (Python), data management (PostgreSQL), geo-statistical modelling (Octave), geoprocessing (GRASS GIS/GDAL) and web mapping (MapServer). The multilingual, made-to-order online platform provides access to near-real time data on PM concentration as well as additional background information. In an open data section, commented configuration files for the Web GIS client are being made available for download. Furthermore, all geodata generated by the project is being published in the public domain and can be retrieved in various formats or integrated into Desktop GIS as Web Map Services (WMS).
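
    On the data-management side, near-real time observations of this kind are typically appended to a PostgreSQL table from Python; the sketch below uses psycopg2 with a hypothetical table layout, station identifier, and connection string, since the project's actual schema is not given.

    ```python
    import psycopg2

    # Hypothetical database name, user, and table; adjust for a real deployment
    conn = psycopg2.connect("dbname=pm_euregio user=pm_loader")
    with conn, conn.cursor() as cur:
        cur.execute("""CREATE TABLE IF NOT EXISTS pm_observation (
                           station TEXT,
                           observed_at TIMESTAMPTZ,
                           pm10_ugm3 REAL)""")
        cur.execute("INSERT INTO pm_observation VALUES (%s, %s, %s)",
                    ("STATION_01", "2015-01-15T12:00:00+01:00", 23.5))
    conn.close()
    ```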

  19. Deep Web video

    ScienceCinema

    None Available

    2018-02-06

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  20. Quinone-induced protein handling changes: Implications for major protein handling systems in quinone-mediated toxicity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Rui; Siegel, David; Ross, David, E-mail: david.ross@ucdenver.edu

    2014-10-15

    Para-quinones such as 1,4-benzoquinone (BQ) and menadione (MD), and ortho-quinones including the oxidation products of catecholamines, are derived from xenobiotics as well as endogenous molecules. The effects of quinones on the major protein handling systems in cells (the 20/26S proteasome, the ER stress response, autophagy, chaperone proteins, and aggresome formation) have not been investigated in a systematic manner. Both BQ and aminochrome (AC) inhibited proteasomal activity and activated the ER stress response and autophagy in rat dopaminergic N27 cells. AC also induced aggresome formation, while MD had little effect on any protein handling system in N27 cells. The effect of NQO1 on quinone-induced protein handling changes and toxicity was examined using N27 cells stably transfected with NQO1 to generate an isogenic NQO1-overexpressing line. NQO1 protected against BQ-induced apoptosis but led to a potentiation of AC- and MD-induced apoptosis. Modulation of quinone-induced apoptosis in N27 and NQO1-overexpressing cells correlated only with changes in the ER stress response and not with changes in other protein handling systems. These data suggested that NQO1 modulated the ER stress response to potentiate the toxicity of AC and MD, but protected against BQ toxicity. We further demonstrated that NQO1-mediated reduction to unstable hydroquinones and subsequent redox cycling was important for the activation of the ER stress response and toxicity for both AC and MD. In summary, our data demonstrate that quinone-specific changes in protein handling are evident in N27 cells and that the induction of the ER stress response is associated with quinone-mediated toxicity. Highlights: Unstable hydroquinones contributed to quinone-induced ER stress and toxicity.

  1. The influence of handling qualities on safety and survivability

    NASA Technical Reports Server (NTRS)

    Anderson, S. B.

    1977-01-01

    The relationship of handling qualities to safety and survivability of military aircraft is examined which includes the following: (1) a brief discussion of the philosophy used in the military specifications for treatment of degraded handling qualities, (2) an examination of several example handling qualities problem areas which influence safety and survivability; and (3) a movie illustrating the potential dangers of inadequate handling qualities features.

  2. Handling Kids in Crisis with Care

    ERIC Educational Resources Information Center

    Bushinski, Cari

    2018-01-01

    The Handle with Care program helps schools help students who experience trauma. While at the scene of an event like a domestic violence call, drug raid, or car accident, law enforcement personnel determine the names and school of any children present. They notify that child's school to "handle ___ with care" the next day, and the school…

  3. Ground-Handling Forces on a 1/40-scale Model of the U. S. Airship "Akron."

    NASA Technical Reports Server (NTRS)

    Silverstein, Abe; Gulick, B G

    1937-01-01

    This report presents the results of full-scale wind tunnel tests conducted to determine the ground-handling forces on a 1/40-scale model of the U. S. Airship "Akron." Ground-handling conditions were simulated by establishing a velocity gradient above a special ground board in the tunnel comparable with that encountered over a landing field. The tests were conducted at Reynolds numbers ranging from 5,000,000 to 19,000,000 at each of six angles of yaw between 0 and 180 degrees and at four heights of the model above the ground board. The ground-handling forces vary greatly with the angle of yaw and reach large values at appreciable angles of yaw. Small changes in height, pitch, or roll did not critically affect the forces on the model. In the range of Reynolds numbers tested, no significant variation of the forces with the scale was disclosed.

  4. Personality in cyberspace: personal Web sites as media for personality expressions and impressions.

    PubMed

    Marcus, Bernd; Machilek, Franz; Schütz, Astrid

    2006-06-01

    This research examined the personality of owners of personal Web sites based on self-reports, visitors' ratings, and the content of the Web sites. The authors compared a large sample of Web site owners with population-wide samples on the Big Five dimensions of personality. Controlling for demographic differences, the average Web site owner reported being slightly less extraverted and more open to experience. Compared with various other samples, Web site owners did not generally differ on narcissism, self-monitoring, or self-esteem, but gender differences on these traits were often smaller in Web site owners. Self-other agreement was highest with Openness to Experience, but valid judgments of all Big Five dimensions were derived from Web sites providing rich information. Visitors made use of quantifiable features of the Web site to infer personality, and the cues they utilized partly corresponded to self-reported traits. Copyright 2006 APA, all rights reserved.

  5. Postmarket Drug Surveillance Without Trial Costs: Discovery of Adverse Drug Reactions Through Large-Scale Analysis of Web Search Queries

    PubMed Central

    Gabrilovich, Evgeniy

    2013-01-01

    Background: Postmarket drug safety surveillance largely depends on spontaneous reports by patients and health care providers; hence, less common adverse drug reactions—especially those caused by long-term exposure, multidrug treatments, or those specific to special populations—often elude discovery. Objective: Here we propose a low cost, fully automated method for continuous monitoring of adverse drug reactions in single drugs and in combinations thereof, and demonstrate the discovery of heretofore-unknown ones. Methods: We used aggregated search data of large populations of Internet users to extract information related to drugs and adverse reactions to them, and correlated these data over time. We further extended our method to identify adverse reactions to combinations of drugs. Results: We validated our method by showing high correlations of our findings with known adverse drug reactions (ADRs). However, although acute early-onset drug reactions are more likely to be reported to regulatory agencies, we show that less acute later-onset ones are better captured in Web search queries. Conclusions: Our method is advantageous in identifying previously unknown adverse drug reactions. These ADRs should be considered as candidates for further scrutiny by medical regulatory authorities, for example, through phase 4 trials. PMID:23778053
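
    The core signal is a correlation between query-volume time series. The sketch below computes a lagged Pearson correlation between a drug's weekly query counts and a candidate reaction's counts with NumPy; the series values are invented, and the published method adds substantial preprocessing.

    ```python
    import numpy as np

    def lagged_correlation(drug, symptom, lag_weeks=0):
        """Pearson correlation, optionally shifting the symptom series by lag_weeks."""
        d = np.asarray(drug, dtype=float)
        s = np.asarray(symptom, dtype=float)
        if lag_weeks:
            d, s = d[:-lag_weeks], s[lag_weeks:]
        return np.corrcoef(d, s)[0, 1]

    # Hypothetical weekly query volumes for a drug and a candidate reaction
    drug_q = [120, 135, 160, 180, 210, 260]
    symptom_q = [30, 33, 41, 52, 60, 75]
    print(lagged_correlation(drug_q, symptom_q, lag_weeks=1))
    ```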

  6. Environmental controls on food web regimes: A fluvial perspective

    NASA Astrophysics Data System (ADS)

    Power, Mary E.

    2006-02-01

    Because food web regimes control the biomass of primary producers (e.g., plants or algae), intermediate consumers (e.g., invertebrates), and large top predators (tuna, killer whales), they are of societal as well as academic interest. Some controls over food web regimes may be internal, but many are mediated by conditions or fluxes over large spatial scales. To understand locally observed changes in food webs, we must learn more about how environmental gradients and boundaries affect the fluxes of energy, materials, or organisms through landscapes or seascapes that influence local species interactions. Marine biologists and oceanographers have overcome formidable challenges of fieldwork on the high seas to make remarkable progress towards this goal. In river drainage networks, we have opportunities to address similar questions at smaller spatial scales, in ecosystems with clear physical structure and organization. Despite these advantages, we still have much to learn about linkages between fluxes from watershed landscapes and local food webs in river networks. Longitudinal (downstream) gradients in productivity, disturbance regimes, and habitat structure exert strong effects on the organisms and energy sources of river food webs, but their effects on species interactions are just beginning to be explored. In fluid ecosystems with less obvious physical structure, like the open ocean, discerning features that control the movement of organisms and affect food web dynamics is even more challenging. In both habitats, new sensing, tracing and mapping technologies have revealed how landscape or seascape features (e.g., watershed divides, ocean fronts or circulation cells) channel, contain or concentrate organisms, energy and materials. Field experiments and direct in situ observations of basic natural history, however, remain as vital as ever in interpreting the responses of biota to these features. We need field data that quantify the many spatial and temporal scales of

  7. Contact angle control of sessile drops on a tensioned web

    NASA Astrophysics Data System (ADS)

    Park, Janghoon; Kim, Dongguk; Lee, Changwoo

    2018-04-01

    In this study, the influence of the tension applied to a flexible, thin web substrate on the contact angle of a sessile drop in a roll-to-roll system was investigated. Graphene oxide and deionized water solutions were used in the experiments. The tension was set to 29, 49, and 69 N; the casting distance between the micropipette and the material to 10, 20, and 40 mm; and the droplet volume to 10, 20, and 30 μL. Statistical analysis of the three variables using the analysis-of-variance methodology showed that the casting distance was the most significant factor for the contact angle, and the tension variable also had a notable effect. The change in tension caused the maximum contact angle to change by 5.5°. The tension was not uniform in the width direction: when droplets were applied at the same positions across the width direction, the tension unevenness was confirmed to influence the contact angle by up to 11°. Finally, the casting distance, which has a large effect on the contact angle, was calibrated across the width direction to reduce the width-direction contact angle deviation to 1%. This study can be applied to fine-patterning research using continuous inkjet printing and aerosol jet printing, which are roll-to-roll processes based on droplet handling.
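
    The factorial analysis of variance described above can be sketched in a few lines. The data frame is hypothetical (made-up contact angles, one observation per run), and statsmodels stands in for whatever statistics package the authors used:

        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        # Hypothetical contact-angle measurements per (tension, distance, volume) setting
        df = pd.DataFrame({
            "tension":  [29, 29, 49, 49, 69, 69, 29, 49, 69],
            "distance": [10, 20, 10, 40, 20, 40, 40, 20, 10],
            "volume":   [10, 20, 30, 10, 20, 30, 20, 30, 10],
            "angle":    [42.1, 40.3, 44.0, 37.5, 41.2, 38.8, 39.0, 42.7, 45.1],
        })
        model = ols("angle ~ C(tension) + C(distance) + C(volume)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))  # which factor explains the most variance?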

  8. webpic: A flexible web application for collecting distance and count measurements from images

    PubMed Central

    2018-01-01

    Despite our increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time-consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows more high-quality data to be collected in a shorter amount of time, is therefore a high priority. The open-source web application, webpic, implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592
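
    At its core, a linear measurement from an image is a pixel distance rescaled by a known reference. A minimal sketch, with hypothetical click coordinates and an assumed 10 mm scale bar:

        import math

        def pixel_distance(p1, p2):
            # Euclidean distance between two clicked points, in pixels
            return math.dist(p1, p2)

        scale_px = pixel_distance((10, 10), (10, 210))        # scale bar spans 200 px
        mm_per_px = 10.0 / scale_px                           # assume the bar is 10 mm long
        leaf_length = pixel_distance((40, 55), (480, 300)) * mm_per_px
        print(f"leaf length: {leaf_length:.1f} mm")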

  9. WebGLORE: a web service for Grid LOgistic REgression.

    PubMed

    Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2013-12-15

    WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation.
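
    The GLORE-style computation behind the service can be sketched as a Newton-Raphson iteration in which each site contributes only aggregated statistics (gradient and Hessian), never row-level records. This is an illustrative reimplementation, not the WebGLORE source:

        import numpy as np

        def local_statistics(X, y, beta):
            # Each participant shares only these aggregates, not patient records
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            gradient = X.T @ (y - p)
            hessian = -(X * (p * (1 - p))[:, None]).T @ X
            return gradient, hessian

        def global_logistic(sites, dim, iterations=25):
            beta = np.zeros(dim)
            for _ in range(iterations):
                g, H = np.zeros(dim), np.zeros((dim, dim))
                for X, y in sites:                       # in practice, sent over HTTPS
                    gi, Hi = local_statistics(X, y, beta)
                    g, H = g + gi, H + Hi
                beta = beta - np.linalg.solve(H, g)      # Newton step on pooled statistics
            return beta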

  10. Dielectric Elastomer Actuators for Soft Wave-Handling Systems.

    PubMed

    Wang, Tao; Zhang, Jinhua; Hong, Jun; Wang, Michael Yu

    2017-03-01

    This article presents a soft handling system inspired by the principle of the natural wave (named the Wave-Handling system), aiming to offer a soft solution for delicately transporting and sorting fragile items such as fruits, vegetables, and biological tissues in the food and biological industries. The system consists of an array of hydrostatically coupled dielectric elastomer actuators (HCDEAs). Due to the electrostriction property of dielectric elastomers, the handling system can be controlled by electric voltage rather than a cumbersome pneumatic system. To study the working performance of the Wave-Handling system and how it can be improved, the basic properties of the HCDEA are investigated through experiments. We find that the HCDEA exhibits delay and hysteresis when activated by periodic voltage, and that these characteristics are also influenced by frequency and external force; all of this affects the performance of the Wave-Handling system. However, the electric control, simple structure, light weight, and low cost of the soft handling system show great potential for moving from the laboratory to practical application. As a proof of the design concept, a simply made prototype of the handling system is controlled to generate a parallel moving wave that manipulates a ball. Based on the experimental results, improvements and future work are discussed; we believe this work will provide inspiration for soft robotic engineering.

  11. Epidemic model for information diffusion in web forums: experiments in marketing exchange and political dialog.

    PubMed

    Woo, Jiyoung; Chen, Hsinchun

    2016-01-01

    As social media has become more prevalent, its influence on business, politics, and society has become significant. Due to easy access and interaction between large numbers of users, information diffuses in an epidemic style on the web. Understanding the mechanisms of information diffusion through these new publication methods is important for political and marketing purposes. Among social media, web forums, where people in online communities disseminate and receive information, provide a good environment for examining information diffusion. In this paper, we model topic diffusion in web forums using the epidemiology model, the susceptible-infected-recovered (SIR) model, frequently used in previous research to analyze both disease outbreaks and knowledge diffusion. The model was evaluated on a large longitudinal dataset from the web forum of a major retail company and from a general political discussion forum. The fitting results showed that the SIR model is a plausible model to describe the diffusion process of a topic. This research shows that epidemic models can expand their application areas to topic discussion on the web, particularly social media such as web forums.
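
    As a sketch of the fitting target (population size and rate parameters are illustrative, not taken from the paper), the SIR equations can be integrated and the infected curve compared against observed topic participation:

        import numpy as np
        from scipy.integrate import odeint

        def sir(y, t, beta, gamma):
            S, I, R = y
            N = S + I + R
            return (-beta * S * I / N,             # susceptible forum members
                    beta * S * I / N - gamma * I,  # currently discussing the topic
                    gamma * I)                     # no longer posting about it

        t = np.linspace(0, 60, 61)                         # days since the topic appeared
        S, I, R = odeint(sir, (9990, 10, 0), t, args=(0.4, 0.1)).T
        print(f"peak participation: {I.max():.0f} users on day {I.argmax()}")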

  12. Pulsed flows, tributary inputs, and food web structure in a highly regulated river

    USGS Publications Warehouse

    Sabo, John; Caron, Melanie; Doucett, Richard R.; Dibble, Kimberly L.; Ruhi, Albert; Marks, Jane; Hungate, Bruce; Kennedy, Theodore A.

    2018-01-01

    1. Dams disrupt the river continuum, altering hydrology, biodiversity, and energy flow. Although research indicates that tributary inputs have the potential to dilute these effects, knowledge at the food web level is still scarce. 2. Here we examined the riverine food web structure of the Colorado River below Glen Canyon Dam, focusing on organic matter sources, trophic diversity, and food chain length. We asked how these components respond to pulsed flows from tributaries following monsoon thunderstorms that seasonally increase streamflow in the American Southwest. 3. Tributaries increased the relative importance of terrestrial organic matter, particularly during the wet season below junctures of key tributaries. This contrasted with the algal-based food web present immediately below Glen Canyon Dam. 4. Tributary inputs during the monsoon also increased trophic diversity and food chain length: food chain length peaked below the confluence with the largest tributary (by discharge) in Grand Canyon, increasing by >1 trophic level over a 4-5 kilometre reach, possibly due to aquatic prey being flushed into the mainstem during heavy rain events. 5. Our results illustrate that large tributaries can create seasonal discontinuities, influencing riverine food web structure in terms of allochthony, food web diversity, and food chain length. 6. Synthesis and applications. Pulsed flows from unregulated tributaries following seasonal monsoon rains increase the importance of terrestrially-derived organic matter in large, regulated river food webs, increasing food chain length and trophic diversity downstream of tributary inputs. Protecting unregulated tributaries within hydropower cascades may be important if we are to mitigate food web structure alteration due to flow regulation by large dams. This is critical in the light of global hydropower development, especially in megadiverse, developing countries where dam placement (including completed and planned structures) is in tributaries.

  13. Web data mining

    NASA Astrophysics Data System (ADS)

    Wibonele, Kasanda J.; Zhang, Yanqing

    2002-03-01

    A web data mining system using granular computing and ASP programming is proposed. This is a web-based application that allows web users to submit survey data for many different companies. The survey is a collection of questions that helps these companies develop and improve their business and customer service with their clients by analyzing the survey data. The web application allows users to submit data from anywhere. All the survey data are collected into a database for further analysis. An administrator of the web application can log in to the system and view all the submitted data. The web application resides on a web server, and the database resides on the MS SQL server.

  14. Nonlinear material behaviour of spider silk yields robust webs.

    PubMed

    Cranford, Steven W; Tarakanova, Anna; Pugno, Nicola M; Buehler, Markus J

    2012-02-01

    Natural materials are renowned for exquisite designs that optimize function, as illustrated by the elasticity of blood vessels, the toughness of bone and the protection offered by nacre. Particularly intriguing are spider silks, with studies having explored properties ranging from their protein sequence to the geometry of a web. This material system, highly adapted to meet a spider's many needs, has superior mechanical properties. In spite of much research into the molecular design underpinning the outstanding performance of silk fibres, and into the mechanical characteristics of web-like structures, it remains unknown how the mechanical characteristics of spider silk contribute to the integrity and performance of a spider web. Here we report web deformation experiments and simulations that identify the nonlinear response of silk threads to stress--involving softening at a yield point and substantial stiffening at large strain until failure--as being crucial to localize load-induced deformation and resulting in mechanically robust spider webs. Control simulations confirmed that a nonlinear stress response results in superior resistance to structural defects in the web compared to linear elastic or elastic-plastic (softening) material behaviour. We also show that under distributed loads, such as those exerted by wind, the stiff behaviour of silk under small deformation, before the yield point, is essential in maintaining the web's structural integrity. The superior performance of silk in webs is therefore not due merely to its exceptional ultimate strength and strain, but arises from the nonlinear response of silk threads to strain and their geometrical arrangement in a web.
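
    The reported constitutive behaviour (stiff initial response, softening past a yield point, then strong stiffening before failure) can be caricatured with a piecewise-linear stress-strain law. All moduli and break points below are made-up illustrative values, not the paper's fitted silk parameters:

        def silk_stress(strain, E0=8.0, e_yield=0.02, E_soft=1.5, e_stiff=0.30, E_stiff=25.0):
            # Piecewise-linear caricature: stiff -> softened -> re-stiffened
            if strain <= e_yield:
                return E0 * strain
            if strain <= e_stiff:
                return E0 * e_yield + E_soft * (strain - e_yield)
            return E0 * e_yield + E_soft * (e_stiff - e_yield) + E_stiff * (strain - e_stiff)

        for e in (0.01, 0.10, 0.50):
            print(f"strain {e:.2f} -> stress {silk_stress(e):.2f} (arbitrary units)")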

  15. Multimedia data repository for the World Wide Web

    NASA Astrophysics Data System (ADS)

    Chen, Ken; Lu, Dajin; Xu, Duanyi

    1998-08-01

    This paper introduces the design and implementation of a Multimedia Data Repository serving as a multimedia information system, which provides users a Web-accessible, platform-independent interface to query, browse, and retrieve multimedia data such as images, graphics, audio, and video from a large multimedia data repository. By integrating the multimedia DBMS, in which the textual information and samples of the multimedia data are organized and stored, and the Web server into the Microsoft ActiveX Server Framework, users can access the DBMS and query the information simply by using a Web browser at the client side. The original multimedia data can then be located and transmitted through the Internet from the tertiary storage device, a 400-CDROM optical jukebox at the server side, to the client side for further use.

  16. New Insights into Handling Missing Values in Environmental Epidemiological Studies

    PubMed Central

    Roda, Célina; Nicolis, Ioannis; Momas, Isabelle; Guihenneuc, Chantal

    2014-01-01

    Missing data are unavoidable in environmental epidemiologic surveys. The aim of this study was to compare methods for handling large amounts of missing values: omission of missing values, single and multiple imputations (through linear regression or partial least squares regression), and a fully Bayesian approach. These methods were applied to the PARIS birth cohort, where indoor domestic pollutant measurements were performed in a random sample of babies' dwellings. A simulation study was conducted to assess the performances of the different approaches with a high proportion of missing values (from 50% to 95%). Different simulation scenarios were carried out, controlling the true value of the association (odds ratio of 1.0, 1.2, and 1.4) and varying the health outcome prevalence. When a large amount of data is missing, omitting these missing data reduced statistical power and inflated standard errors, which affected the significance of the association. Single imputation underestimated the variability and considerably increased the risk of type I error. All approaches were conservative, except the Bayesian joint model. In the case of a common health outcome, the fully Bayesian approach is the most efficient (low root mean square error, reasonable type I error, and high statistical power). Nevertheless, for a less prevalent event, the type I error is increased and the statistical power reduced. The estimated posterior distribution of the OR is useful for refining the conclusion. Among the methods for handling missing values, no approach is absolutely the best, but when the usual approaches (e.g., single imputation) are not sufficient, jointly modelling the missingness process and the health association is more efficient when large amounts of data are missing. PMID:25226278
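
    The multiple-imputation idea under heavy missingness can be sketched as follows; scikit-learn's IterativeImputer is a stand-in here (the study itself used linear and partial least squares regression imputation and a Bayesian joint model):

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 4))
        X_missing = X.copy()
        X_missing[rng.random(X.shape) < 0.5] = np.nan   # 50% missing, as in the simulations

        # Several stochastic imputations; downstream estimates would then be pooled
        # across completed datasets (Rubin's rules)
        imputations = [
            IterativeImputer(sample_posterior=True, random_state=s).fit_transform(X_missing)
            for s in range(5)
        ]
        print(np.mean([m.mean() for m in imputations]))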

  17. Working with WebQuests: Making the Web Accessible to Students with Disabilities.

    ERIC Educational Resources Information Center

    Kelly, Rebecca

    2000-01-01

    This article describes how students with disabilities in regular classes are using the WebQuest lesson format to access the Internet. It explains essential WebQuest principles, creating a draft Web page, and WebQuest components. It offers an example of a WebQuest about salvaging the sunken ships, Titanic and Lusitania. A WebQuest planning form is…

  18. QuickEval: a web application for psychometric scaling experiments

    NASA Astrophysics Data System (ADS)

    Van Ngo, Khai; Storvik, Jehans J.; Dokkeberg, Christopher A.; Farup, Ivar; Pedersen, Marius

    2015-01-01

    QuickEval is a web application for carrying out psychometric scaling experiments. It offers the possibility of running controlled experiments in a laboratory, or large-scale experiments over the web for people all over the world. To the best of our knowledge, it is the first software in the image quality field to support the three most common scaling methods: paired comparison, rank order, and category judgement. Hopefully, a side effect of this new software is that it will lower the threshold for performing psychometric experiments, improve the quality of the experiments being carried out, make it easier to reproduce experiments, and increase research on image quality in both academia and industry. The web application is available at www.colourlab.no/quickeval.
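
    QuickEval collects the judgements; converting a paired-comparison matrix into scale values is then typically done offline, for example with Thurstone's Case V. A sketch with hypothetical preference counts (this analysis is not part of QuickEval itself):

        import numpy as np
        from scipy.stats import norm

        # wins[i, j] = times stimulus i was preferred over stimulus j (hypothetical)
        wins = np.array([[0, 36, 43],
                         [14, 0, 29],
                         [7, 21, 0]], dtype=float)
        trials = wins + wins.T
        P = np.divide(wins, trials, out=np.full_like(wins, 0.5), where=trials > 0)
        P = P.clip(0.01, 0.99)                    # keep z-scores finite
        scale_values = norm.ppf(P).mean(axis=1)   # Thurstone Case V scale per stimulus
        print(scale_values)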

  19. Overseas Absentee Ballot Handling in DOD

    DTIC Science & Technology

    2001-06-22

    Overseas Absentee Ballot Handling in DOD, Report No. D-2001-145, June 22, 2001, Office of the Inspector General. The review covered pertinent laws, policies, and guidance dated from May 1980 through January 2000 related to the absentee ballot process.

  20. Mfold web server for nucleic acid folding and hybridization prediction

    PubMed Central

    Zuker, Michael

    2003-01-01

    The abbreviated name, ‘mfold web server’, describes a number of closely related software applications available on the World Wide Web (WWW) for the prediction of the secondary structure of single stranded nucleic acids. The objective of this web server is to provide easy access to RNA and DNA folding and hybridization software to the scientific community at large. By making use of universally available web GUIs (Graphical User Interfaces), the server circumvents the problem of portability of this software. Detailed output, in the form of structure plots with or without reliability information, single strand frequency plots and ‘energy dot plots’, are available for the folding of single sequences. A variety of ‘bulk’ servers give less information, but in a shorter time and for up to hundreds of sequences at once. The portal for the mfold web server is http://www.bioinfo.rpi.edu/applications/mfold. This URL will be referred to as ‘MFOLDROOT’. PMID:12824337
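
    Programmatic folding of the kind the server automates can be sketched with the ViennaRNA Python bindings, a different package from mfold used here purely as an illustration; the sequence is arbitrary:

        import RNA  # ViennaRNA package Python bindings (not mfold itself)

        seq = "GGGAAAUCCCGAAAGGGAUUUCCC"       # arbitrary example sequence
        structure, mfe = RNA.fold(seq)         # minimum-free-energy secondary structure
        print(structure)                       # dot-bracket notation
        print(f"MFE: {mfe:.2f} kcal/mol")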

  1. Mfold web server for nucleic acid folding and hybridization prediction.

    PubMed

    Zuker, Michael

    2003-07-01

    The abbreviated name, 'mfold web server', describes a number of closely related software applications available on the World Wide Web (WWW) for the prediction of the secondary structure of single stranded nucleic acids. The objective of this web server is to provide easy access to RNA and DNA folding and hybridization software to the scientific community at large. By making use of universally available web GUIs (Graphical User Interfaces), the server circumvents the problem of portability of this software. Detailed output, in the form of structure plots with or without reliability information, single strand frequency plots and 'energy dot plots', are available for the folding of single sequences. A variety of 'bulk' servers give less information, but in a shorter time and for up to hundreds of sequences at once. The portal for the mfold web server is http://www.bioinfo.rpi.edu/applications/mfold. This URL will be referred to as 'MFOLDROOT'.

  2. Global change in the trophic functioning of marine food webs.

    PubMed

    Maureaud, Aurore; Gascuel, Didier; Colléter, Mathieu; Palomares, Maria L D; Du Pontavice, Hubert; Pauly, Daniel; Cheung, William W L

    2017-01-01

    The development of fisheries in the oceans, and other human drivers such as climate warming, have led to changes in species abundance, assemblages, trophic interactions, and ultimately in the functioning of marine food webs. Here, using a trophodynamic approach and global databases of catches and life history traits of marine species, we tested the hypothesis that anthropogenic ecological impacts may have led to changes in the global parameters defining the transfers of biomass within the food web. First, we developed two indicators to assess such changes: the Time Cumulated Indicator (TCI) measuring the residence time of biomass within the food web, and the Efficiency Cumulated Indicator (ECI) quantifying the fraction of secondary production reaching the top of the trophic chain. Then, we assessed, at the large marine ecosystem scale, the worldwide change of these two indicators over the 1950-2010 time period. Global trends were identified and cluster analyses were used to characterize the variability of trends between ecosystems. Results showed that the most common pattern over the study period is a global decrease in TCI, while the ECI indicator tends to increase. Thus, changes in species assemblages would induce faster and apparently more efficient biomass transfers in marine food webs. Results also suggested that the main driver of change over that period has been the large increase in fishing pressure. The largest changes occurred in ecosystems where 'fishing down the marine food web' is most intensive.
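
    The intuition behind the ECI can be illustrated with the textbook rule that the fraction of production reaching the top of a chain is the product of the per-level transfer efficiencies. The values below are hypothetical, and this is not the paper's formal indicator definition:

        import numpy as np

        transfer_efficiencies = [0.12, 0.10, 0.09, 0.11]  # hypothetical, one per trophic step
        fraction_at_top = np.prod(transfer_efficiencies)
        print(f"{fraction_at_top:.2e} of secondary production reaches the top predators")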

  3. Accelerated Creep Testing of High Strength Aramid Webbing

    NASA Technical Reports Server (NTRS)

    Jones, Thomas C.; Doggett, William R.; Stnfield, Clarence E.; Valverde, Omar

    2012-01-01

    A series of preliminary accelerated creep tests were performed on four variants of 12K and 24K lbf rated Vectran webbing to help develop an accelerated creep test methodology and analysis capability for high strength aramid webbings. The variants included pristine, aged, folded and stitched samples. This class of webbings is used in the restraint layer of habitable, inflatable space structures, for which the lifetime properties are currently not well characterized. The Stepped Isothermal Method was used to accelerate the creep life of the webbings and a novel stereo photogrammetry system was used to measure the full-field strains. A custom MATLAB code is described, and used to reduce the strain data to produce master creep curves for the test samples. Initial results show good correlation between replicates; however, it is clear that a larger number of samples are needed to build confidence in the consistency of the results. It is noted that local fiber breaks affect the creep response in a similar manner to increasing the load, thus raising the creep rate and reducing the time to creep failure. The stitched webbings produced the highest variance between replicates, due to the combination of higher local stresses and thread-on-fiber damage. Large variability in the strength of the webbings is also shown to have an impact on the range of predicted creep life.
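
    The Stepped Isothermal Method accelerates creep testing by stepping temperature and then shifting each segment along the log-time axis to assemble a master curve. A toy sketch of the shifting step, with synthetic segments and made-up shift factors:

        import numpy as np

        times = [np.linspace(0.1, 10, 50)] * 3                      # hours, one array per step
        strains = [0.50 + 0.05 * np.log10(t) + 0.02 * i for i, t in enumerate(times)]
        shift_factors = [1.0, 30.0, 900.0]                          # hypothetical a_T per step

        master_t = np.concatenate([t * a for t, a in zip(times, shift_factors)])
        master_e = np.concatenate(strains)
        order = np.argsort(master_t)
        master_t, master_e = master_t[order], master_e[order]       # creep master curve
        print(f"master curve spans {master_t.min():.1f}-{master_t.max():.0f} h")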

  4. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the

  5. A novel architecture for information retrieval system based on semantic web

    NASA Astrophysics Data System (ADS)

    Zhang, Hui

    2011-12-01

    Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so that the web now faces a new challenge: information overload. The challenge before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats suitable for presentation, but machines cannot understand the meaning of the documents. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines. It provides new possibilities for automatic web information processing. A main problem for semantic web information retrieval is that when there is not enough knowledge in the information retrieval system, the system returns a large number of meaningless results to users because of the huge amount of information. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
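
    The routing role of the inference engine reduces to a simple decision: pose the query to the semantic search engine only when the ontology can interpret it, otherwise fall back to keyword search. Everything below (the term set, the function) is a hypothetical illustration:

        ONTOLOGY_TERMS = {"protein", "enzyme", "gene", "pathway"}  # stand-in for a real ontology

        def route(query: str) -> str:
            # Pose the query to the semantic engine only if the ontology covers it
            if set(query.lower().split()) & ONTOLOGY_TERMS:
                return "semantic_search_engine"
            return "keyword_search_engine"

        print(route("enzyme kinetics database"))   # -> semantic_search_engine
        print(route("cheap flight tickets"))       # -> keyword_search_engine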

  6. WebGLORE: a Web service for Grid LOgistic REgression

    PubMed Central

    Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2013-01-01

    WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. Availability and implementation: http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation. Contact: x1jiang@ucsd.edu PMID:24072732

  7. shinyheatmap: Ultra fast low memory heatmap web interface for big data genomics.

    PubMed

    Khomtchouk, Bohdan B; Hennessy, James R; Wahlestedt, Claes

    2017-01-01

    Transcriptomics, metabolomics, metagenomics, and other various next-generation sequencing (-omics) fields are known for their production of large datasets, especially across single-cell sequencing studies. Visualizing such big data has posed technical challenges in biology, both in terms of available computational resources as well as programming acumen. Since heatmaps are used to depict high-dimensional numerical data as a colored grid of cells, efficiency and speed have often proven to be critical considerations in the process of successfully converting data into graphics. For example, rendering interactive heatmaps from large input datasets (e.g., 100k+ rows) has been computationally infeasible on both desktop computers and web browsers. In addition to memory requirements, programming skills and knowledge have frequently been barriers to entry for creating highly customizable heatmaps. We propose shinyheatmap: an advanced user-friendly heatmap software suite capable of efficiently creating highly customizable static and interactive biological heatmaps in a web browser. shinyheatmap is a low memory footprint program, making it particularly well-suited for the interactive visualization of extremely large datasets that cannot typically be computed in-memory due to size restrictions. Also, shinyheatmap features a built-in high performance web plug-in, fastheatmap, for rapidly plotting interactive heatmaps of datasets as large as 10^5-10^7 rows within seconds, effectively shattering previous performance benchmarks of heatmap rendering speed. shinyheatmap is hosted online as a freely available web server with an intuitive graphical user interface: http://shinyheatmap.com. The methods are implemented in R, and are available as part of the shinyheatmap project at: https://github.com/Bohdan-Khomtchouk/shinyheatmap. Users can access fastheatmap directly from within the shinyheatmap web interface, and all source code has been made publicly available on Github: https://github.com/Bohdan-Khomtchouk/fastheatmap.
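
    shinyheatmap and fastheatmap are implemented in R; the idea that makes 10^5+ rows drawable, aggregating rows into bins before rendering, can nonetheless be sketched in a few lines of Python, purely as an illustration:

        import numpy as np
        import matplotlib.pyplot as plt

        data = np.random.rand(200_000, 50)             # far too many rows to draw directly
        rows_out = 1000
        factor = data.shape[0] // rows_out
        binned = data[: factor * rows_out].reshape(rows_out, factor, -1).mean(axis=1)

        plt.imshow(binned, aspect="auto", cmap="viridis")  # a 1000x50 grid renders instantly
        plt.savefig("heatmap.png", dpi=150)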

  8. Improving Performance in Constructing specific Web Directory using Focused Crawler: An Experiment on Botany Domain

    NASA Astrophysics Data System (ADS)

    Khalilian, Madjid; Boroujeni, Farsad Zamani; Mustapha, Norwati

    Nowadays the growth of the web makes it difficult to search and browse for useful information, especially in specific domains. However, some portions of the web remain largely underdeveloped, as shown by a lack of high-quality content. An example is the botany-specific web directory, where the lack of well-structured web directories has limited users' ability to browse for required information. In this research we propose an improved framework for constructing a specific web directory. In this framework we use an anchor directory as the foundation for the primary web directory. This web directory is then completed with information gathered by an automatic component and filtered by experts. We conduct an experiment to evaluate effectiveness, efficiency, and satisfaction.
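
    A minimal focused-crawler sketch conveys the core idea: expand only pages whose text scores as on-topic. The keyword list, threshold, and libraries are assumptions; the actual framework additionally uses an anchor directory and expert filtering:

        import re
        from collections import deque

        import requests
        from bs4 import BeautifulSoup

        KEYWORDS = {"botany", "plant", "taxonomy", "flora", "species"}

        def relevance(text):
            # Fraction of words that hit the topic vocabulary
            words = re.findall(r"[a-z]+", text.lower())
            return sum(w in KEYWORDS for w in words) / max(len(words), 1)

        def crawl(seed, max_pages=50, threshold=0.01):
            frontier, seen, directory = deque([seed]), {seed}, []
            while frontier and len(directory) < max_pages:
                url = frontier.popleft()
                try:
                    html = requests.get(url, timeout=5).text
                except requests.RequestException:
                    continue
                soup = BeautifulSoup(html, "html.parser")
                if relevance(soup.get_text()) >= threshold:   # expand only on-topic pages
                    directory.append(url)
                    for a in soup.find_all("a", href=True):
                        link = a["href"]
                        if link.startswith("http") and link not in seen:
                            seen.add(link)
                            frontier.append(link)
            return directory  # candidate entries for the specific web directory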

  9. Participant profiles according to recruitment source in a large Web-based prospective study: experience from the Nutrinet-Santé study.

    PubMed

    Kesse-Guyot, Emmanuelle; Andreeva, Valentina; Castetbon, Katia; Vernay, Michel; Touvier, Mathilde; Méjean, Caroline; Julia, Chantal; Galan, Pilar; Hercberg, Serge

    2013-09-13

    Interest in Internet-based epidemiologic research is growing given the logistic and cost advantages. Cohort recruitment to maximally diversify the sociodemographic profiles of participants, however, remains a contentious issue. The aim of the study was to characterize the sociodemographic profiles according to the recruitment mode of adult volunteers enrolled in a Web-based cohort. The French NutriNet-Santé Web-based cohort was launched in 2009. Recruitment is ongoing and largely relies on recurrent multimedia campaigns. One month after enrollment, participants are asked how they learned about the study (eg, general newscast or a health program on television, radio newscast, newspaper articles, Internet, personal advice, leaflet/flyers). The sociodemographic profiles of participants recruited through operative communication channels (radio, print media, Internet, advice) were compared with the profiles of those informed through television by using polytomous logistic regression. Among the 88,238 participants enrolled through the end of 2011, 30,401 (34.45%), 16,751 (18.98%), and 14,309 (16.22%) learned about the study from television, Internet, and radio newscasts, respectively. Sociodemographic profiles varied, with 14,541 (16.5%) aged ≥60 years, 20,166 (22.9%) aged <30 years, 27,766 (32.1%) without postsecondary education, 15,397 (19.7%) with household income <€1200/month, and 8258 (10.6%) with household income >€3700/month. Compared to employed individuals, unemployed and retired participants were less likely to be informed about the study through other sources than through television (adjusted ORs 0.56-0.83, P<.001). Participants reporting up to secondary education were also less likely to have learned about the study through radio newscasts, newspaper articles, Internet, and advice than through television (adjusted ORs 0.60-0.77, P<.001). Television broadcasts appear to permit the recruitment of e-cohort participants with diverse sociodemographic

  10. Ondex Web: web-based visualization and exploration of heterogeneous biological networks.

    PubMed

    Taubert, Jan; Hassani-Pak, Keywan; Castells-Brooke, Nathalie; Rawlings, Christopher J

    2014-04-01

    Ondex Web is a new web-based implementation of the network visualization and exploration tools from the Ondex data integration platform. New features such as context-sensitive menus and annotation tools provide users with intuitive ways to explore and manipulate the appearance of heterogeneous biological networks. Ondex Web is open source, written in Java and can be easily embedded into Web sites as an applet. Ondex Web supports loading data from a variety of network formats, such as XGMML, NWB, Pajek and OXL. http://ondex.rothamsted.ac.uk/OndexWeb.

  11. Web-of-Objects (WoO)-Based Context Aware Emergency Fire Management Systems for the Internet of Things

    PubMed Central

    Shamszaman, Zia Ush; Ara, Safina Showkat; Chong, Ilyoung; Jeong, Youn Kwae

    2014-01-01

    Recent advancements in the Internet of Things (IoT) and the Web of Things (WoT) accompany a smart life where real-world objects, including sensing devices, are interconnected with each other. The Web representation of smart objects empowers innovative applications and services for various domains. To accelerate this approach, Web of Objects (WoO) focuses on the implementation aspects of bringing the assorted real-world objects to Web applications. In this paper, we propose an emergency fire management system in the WoO infrastructure. Consequently, we integrate the formation and management of Virtual Objects (ViO), which are derived from real-world physical objects and are virtually connected with each other, into the semantic ontology model. The charm of using the semantic ontology is that it allows information reusability, extensibility and interoperability, which enable ViOs to uphold orchestration, federation, collaboration and harmonization. Our system is context aware, as it receives contextual environmental information from distributed sensors and detects emergency situations. To handle a fire emergency, we present a decision support tool for the emergency fire management team. The previous fire incident log is the basis of the decision support system. A log repository collects all the emergency fire incident logs from ViOs and stores them in a repository. PMID:24531299
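
    Reduced to its essence, the context-aware detection step fuses readings from distributed (virtual) sensor objects into an emergency decision. The thresholds and field names below are hypothetical:

        def detect_fire(readings, temp_threshold_c=60.0, smoke_threshold_ppm=300.0):
            # Raise an event only on combined evidence from both sensor types
            hot = readings.get("temperature", 0.0) > temp_threshold_c
            smoky = readings.get("smoke", 0.0) > smoke_threshold_ppm
            return hot and smoky

        print(detect_fire({"temperature": 72.4, "smoke": 410.0}))  # True -> notify the team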

  12. Web-of-Objects (WoO)-based context aware emergency fire management systems for the Internet of Things.

    PubMed

    Shamszaman, Zia Ush; Ara, Safina Showkat; Chong, Ilyoung; Jeong, Youn Kwae

    2014-02-13

    Recent advancements in the Internet of Things (IoT) and the Web of Things (WoT) accompany a smart life where real-world objects, including sensing devices, are interconnected with each other. The Web representation of smart objects empowers innovative applications and services for various domains. To accelerate this approach, Web of Objects (WoO) focuses on the implementation aspects of bringing the assorted real-world objects to Web applications. In this paper, we propose an emergency fire management system in the WoO infrastructure. Consequently, we integrate the formation and management of Virtual Objects (ViO), which are derived from real-world physical objects and are virtually connected with each other, into the semantic ontology model. The charm of using the semantic ontology is that it allows information reusability, extensibility and interoperability, which enable ViOs to uphold orchestration, federation, collaboration and harmonization. Our system is context aware, as it receives contextual environmental information from distributed sensors and detects emergency situations. To handle a fire emergency, we present a decision support tool for the emergency fire management team. The previous fire incident log is the basis of the decision support system. A log repository collects all the emergency fire incident logs from ViOs and stores them in a repository.

  13. Power-Tool Adapter For T-Handle Screws

    NASA Technical Reports Server (NTRS)

    Deloach, Stephen R.

    1992-01-01

    Proposed adapter enables use of pneumatic drill, electric drill, electric screwdriver, or similar power tool to tighten or loosen T-handled screws. Notched tube with perpendicular rod welded to it inserted in chuck of tool. Notched end of tube slipped over screw handle.

  14. WebSat ‐ A web software for microsatellite marker development

    PubMed Central

    Martins, Wellington Santos; Soares Lucas, Divino César; de Souza Neves, Kelligton Fabricio; Bertioli, David John

    2009-01-01

    Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple-to-use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, the visualization of microsatellites, and the design of primers suitable for their amplification. The program allows full control of parameters and easy export of the resulting data, thus facilitating the development of microsatellite markers. Availability: The web tool may be accessed at http://purl.oclc.org/NET/websat/ PMID:19255650
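
    Microsatellite detection is, at its simplest, a tandem-repeat search; a sketch with illustrative thresholds (not WebSat's defaults):

        import re

        def find_ssrs(seq, min_repeats=5):
            # Motifs of 1-6 bp repeated at least min_repeats times in tandem
            pattern = re.compile(r"(([ACGT]{1,6}?)\2{%d,})" % (min_repeats - 1))
            return [(m.start(), m.group(2), len(m.group(1)) // len(m.group(2)))
                    for m in pattern.finditer(seq.upper())]

        print(find_ssrs("TTAGCACACACACACACAGGGTCTCTCTCTCTA"))
        # [(start, motif, repeat_count), ...]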

  15. Direct handling of equality constraints in multilevel optimization

    NASA Technical Reports Server (NTRS)

    Renaud, John E.; Gabriele, Gary A.

    1990-01-01

    In recent years there have been several hierarchic multilevel optimization algorithms proposed and implemented in design studies. Equality constraints are often imposed between levels in these multilevel optimizations to maintain system and subsystem variable continuity. Equality constraints of this nature will be referred to as coupling equality constraints. In many implementation studies these coupling equality constraints have been handled indirectly. This indirect handling has been accomplished using the coupling equality constraints' explicit functional relations to eliminate design variables (generally at the subsystem level), with the resulting optimization taking place in a reduced design space. In one multilevel optimization study where the coupling equality constraints were handled directly, the researchers encountered numerical difficulties which prevented their multilevel optimization from reaching the same minimum found in conventional single level solutions. The researchers did not explain the exact nature of the numerical difficulties other than to associate them with the direct handling of the coupling equality constraints. The coupling equality constraints are handled directly, by employing the Generalized Reduced Gradient (GRG) method as the optimizer within a multilevel linear decomposition scheme based on the Sobieski hierarchic algorithm. Two engineering design examples are solved using this approach. The results show that the direct handling of coupling equality constraints in a multilevel optimization does not introduce any problems when the GRG method is employed as the internal optimizer. The optimums achieved are comparable to those achieved in single level solutions and in multilevel studies where the equality constraints have been handled indirectly.
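
    The idea of handling a coupling equality constraint directly, rather than using it to eliminate variables and optimize in a reduced design space, can be sketched with SciPy's SLSQP standing in for the GRG optimizer used in the study; the objective and constraint are illustrative:

        import numpy as np
        from scipy.optimize import minimize

        objective = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

        # Coupling equality constraint kept explicit instead of being used to
        # eliminate a design variable
        coupling = {"type": "eq", "fun": lambda x: x[0] + 2.0 * x[1] - 4.0}

        res = minimize(objective, x0=np.zeros(2), method="SLSQP", constraints=[coupling])
        print(res.x, res.fun)   # optimum found in the full design space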

  16. Redefining NHS complaint handling--the real challenge.

    PubMed

    Seelos, L; Adamson, C

    1994-01-01

    More and more organizations find that a constructive and open dialogue with their customers can be an effective strategy for building long-term customer relations. In this context, it has been recognized that effective complaint-contact handling can make a significant contribution to organizations' attempts to maximize customer satisfaction and loyalty. Within the NHS, an intellectual awareness exists that effective complaint/contact handling can contribute to making services more efficient and cost-effective by developing customer-oriented improvement initiatives. Recent efforts have focused on redefining NHS complaint-handling procedures to make them more user-friendly and effective for both NHS employees and customers. Discusses the challenges associated with opening up the NHS to customer feedback. Highlights potential weaknesses in the current approach and argues that the real challenge is for NHS managers to facilitate a culture change that moves the NHS away from a long-established defensive complaint handling practice.

  17. Promoting Teachers' Positive Attitude towards Web Use: A Study in Web Site Development

    ERIC Educational Resources Information Center

    Akpinar, Yavuz; Bayramoglu, Yusuf

    2008-01-01

    The purpose of the study was to examine the effects of a compact training for developing web sites on teachers' web attitude, composed of web self-efficacy, perceived web enjoyment, perceived web usefulness, and behavioral intention to use the web. To measure the related constructs, the Web Attitude Scale was adapted into Turkish and tested with a…

  18. Web3DMol: interactive protein structure visualization based on WebGL.

    PubMed

    Shi, Maoxiang; Gao, Juntao; Zhang, Michael Q

    2017-07-03

    A growing number of web-based databases and tools for protein research are being developed. There is now a widespread need for visualization tools to present the three-dimensional (3D) structure of proteins in web browsers. Here, we introduce our 3D modeling program, Web3DMol, a web application focusing on protein structure visualization in modern web browsers. Users submit a PDB identification code or select a PDB archive from their local disk, and Web3DMol will display and allow interactive manipulation of the 3D structure. Featured functions, such as sequence plot, fragment segmentation, measure tool and meta-information display, are offered for users to gain a better understanding of protein structure. Easy-to-use APIs are available for developers to reuse and extend Web3DMol. Web3DMol can be freely accessed at http://web3dmol.duapp.com/, and the source code is distributed under the MIT license. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. A Ubiquitous Sensor Network Platform for Integrating Smart Devices into the Semantic Sensor Web

    PubMed Central

    de Vera, David Díaz Pardo; Izquierdo, Álvaro Sigüenza; Vercher, Jesús Bernat; Gómez, Luis Alfonso Hernández

    2014-01-01

    Ongoing Sensor Web developments make a growing amount of heterogeneous sensor data available to smart devices. This is generating an increasing demand for homogeneous mechanisms to access, publish and share real-world information. This paper discusses, first, an architectural solution based on Next Generation Networks: a pilot Telco Ubiquitous Sensor Network (USN) Platform that embeds several OGC® Sensor Web services. This platform has already been deployed in large scale projects. Second, the USN-Platform is extended to explore a first approach to Semantic Sensor Web principles and technologies, so that smart devices can access Sensor Web data, allowing them also to share richer (semantically interpreted) information. An experimental scenario is presented: a smart car that consumes and produces real-world information which is integrated into the Semantic Sensor Web through a Telco USN-Platform. Performance tests revealed that observation publishing times with our experimental system were well within limits compatible with the adequate operation of smart safety assistance systems in vehicles. On the other hand, response times for complex queries on large repositories may be inappropriate for rapid reaction needs. PMID:24945678

  20. A ubiquitous sensor network platform for integrating smart devices into the semantic sensor web.

    PubMed

    de Vera, David Díaz Pardo; Izquierdo, Alvaro Sigüenza; Vercher, Jesús Bernat; Hernández Gómez, Luis Alfonso

    2014-06-18

    Ongoing Sensor Web developments make a growing amount of heterogeneous sensor data available to smart devices. This is generating an increasing demand for homogeneous mechanisms to access, publish and share real-world information. This paper discusses, first, an architectural solution based on Next Generation Networks: a pilot Telco Ubiquitous Sensor Network (USN) Platform that embeds several OGC® Sensor Web services. This platform has already been deployed in large scale projects. Second, the USN-Platform is extended to explore a first approach to Semantic Sensor Web principles and technologies, so that smart devices can access Sensor Web data, allowing them also to share richer (semantically interpreted) information. An experimental scenario is presented: a smart car that consumes and produces real-world information which is integrated into the Semantic Sensor Web through a Telco USN-Platform. Performance tests revealed that observation publishing times with our experimental system were well within limits compatible with the adequate operation of smart safety assistance systems in vehicles. On the other hand, response times for complex queries on large repositories may be inappropriate for rapid reaction needs.