Sample records for web transfer systems

  1. Web-based home telemedicine system for orthopedics

    NASA Astrophysics Data System (ADS)

    Lau, Christopher; Churchill, Sean; Kim, Janice; Matsen, Frederick A., III; Kim, Yongmin

    2001-05-01

    Traditionally, telemedicine systems have been designed to improve access to care by allowing physicians to consult a specialist about a case without sending the patient to another location, which may be difficult or time-consuming to reach. The cost of the equipment and network bandwidth needed for this consultation has restricted telemedicine use to contact between physicians instead of between patients and physicians. Recently, however, the wide availability of Internet connectivity and client and server software for e-mail, the world wide web, and conferencing has made low-cost telemedicine applications feasible. In this work, we present a web-based system for asynchronous multimedia messaging between shoulder replacement surgery patients at home and their surgeons. A web browser plug-in was developed to simplify the process of capturing video and transferring it to a web site. The video capture plug-in can be used as a template to construct a plug-in that captures and transfers any type of data to a web server. For example, readings from home biosensor instruments (e.g., blood glucose meters and spirometers) that can be connected to a computing platform can be transferred to a home telemedicine web site. Both patients and doctors can access this web site to monitor progress longitudinally. The system has been tested with 3 subjects for the past 7 weeks, and we plan to continue testing in the foreseeable future.
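    The plug-in idea above generalizes to any structured reading pushed over HTTP. A minimal Python sketch, assuming a hypothetical upload endpoint and JSON payload format (neither is from the paper), packages one biosensor reading and builds, but does not send, the POST request:

```python
import json
from datetime import datetime, timezone
from urllib.request import Request

def encode_reading(device: str, value: float, unit: str) -> bytes:
    """Package one home-biosensor reading as a JSON payload for upload."""
    payload = {
        "device": device,  # e.g. a blood glucose meter or spirometer
        "value": value,
        "unit": unit,
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload).encode("utf-8")

def build_upload_request(url: str, body: bytes) -> Request:
    """Build (but do not send) the HTTP POST carrying the reading."""
    return Request(url, data=body, method="POST",
                   headers={"Content-Type": "application/json"})

body = encode_reading("glucose-meter", 5.4, "mmol/L")
req = build_upload_request("https://example.org/telemed/upload", body)
```

    Sending `req` with `urllib.request.urlopen` would complete the transfer; the payload schema is only a stand-in for whatever the real site expects.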

  2. Using business intelligence for efficient inter-facility patient transfer.

    PubMed

    Haque, Waqar; Derksen, Beth Ann; Calado, Devin; Foster, Lee

    2015-01-01

    In the context of inter-facility patient transfer, a transfer operator must be able to objectively identify a destination which meets the needs of a patient, while keeping in mind each facility's limitations. We propose a solution which uses Business Intelligence (BI) techniques to analyze data related to healthcare infrastructure and services, and provides a web based system to identify optimal destination(s). The proposed inter-facility transfer system uses a single data warehouse with an Online Analytical Processing (OLAP) cube built on top that supplies analytical data to multiple reports embedded in web pages. The data visualization tool includes map based navigation of the health authority as well as an interactive filtering mechanism which finds facilities meeting the selected criteria. The data visualization is backed by an intuitive data entry web form which safely constrains the data, ensuring consistency and a single version of truth. The overall time required to identify the destination for inter-facility transfers is reduced from hours to a few minutes with this interactive solution.
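    The destination-matching step described above can be sketched as a simple filter-and-rank over facility records; the field names and toy data below are hypothetical, not the authors' schema:

```python
def find_destinations(facilities, required_services, min_beds=1):
    """Return facilities offering every required service with free capacity,
    nearest first -- a toy analogue of the interactive filtering described."""
    needed = set(required_services)
    matches = [f for f in facilities
               if needed <= set(f["services"]) and f["free_beds"] >= min_beds]
    return sorted(matches, key=lambda f: f["distance_km"])

facilities = [
    {"name": "A", "services": {"ICU", "dialysis"}, "free_beds": 2, "distance_km": 40},
    {"name": "B", "services": {"ICU"},             "free_beds": 5, "distance_km": 10},
    {"name": "C", "services": {"ICU", "dialysis"}, "free_beds": 0, "distance_km": 5},
]
best = find_destinations(facilities, {"ICU", "dialysis"})  # only "A" qualifies
```

    In the actual system this filtering runs as OLAP queries behind web reports, but the selection logic is the same: intersect capabilities, enforce capacity, rank candidates.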

  3. Cassini Archive Tracking System

    NASA Technical Reports Server (NTRS)

    Conner, Diane; Sayfi, Elias; Tinio, Adrian

    2006-01-01

    The Cassini Archive Tracking System (CATS) is a computer program that enables tracking of scientific data transfers from originators to the Planetary Data System (PDS) archives. Without CATS, there is no systematic means of locating products in the archive process or ensuring their completeness. By keeping a database of transfer communications and status, CATS enables the Cassini Project and the PDS to efficiently and accurately report on archive status. More importantly, problem areas are easily identified through customized reports that can be generated on the fly from any Web-enabled computer. A Web-browser interface and clearly defined authorization scheme provide safe distributed access to the system, where users can perform functions such as create customized reports, record a transfer, and respond to a transfer. CATS ensures that Cassini provides complete science archives to the PDS on schedule and that those archives are available to the science community by the PDS. The three-tier architecture is loosely coupled and designed for simple adaptation to multimission use. Written in the Java programming language, it is portable and can be run on any Java-enabled Web server.
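    A toy analogue of the CATS transfer-status database can be sketched with SQLite; the table layout and product names here are invented for illustration:

```python
import sqlite3

# In-memory database standing in for the CATS transfer-status store.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE transfers (
    product TEXT, originator TEXT, status TEXT)""")
db.executemany("INSERT INTO transfers VALUES (?, ?, ?)", [
    ("ISS_cruise_v1", "RADAR team", "delivered"),
    ("VIMS_sat_v2",   "VIMS team",  "in review"),
    ("UVIS_v1",       "UVIS team",  "delivered"),
])

# An on-the-fly report of problem areas: products not yet delivered.
pending = db.execute(
    "SELECT product FROM transfers WHERE status != 'delivered'").fetchall()
```

    The point is the pattern, recording each transfer communication and querying status on demand, rather than the specific schema.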

  4. 78 FR 57375 - Toutant Hydro Power, Inc.; Energy System, LLC.; Notice of Application for Transfer of License...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-18

    ... viewed or printed on the eLibrary link of Commission's Web site at http://www.ferc.gov/docs-filing... Power, Inc.; Energy System, LLC.; Notice of Application for Transfer of License, and Soliciting Comments... System, LLC (transferee) filed an application for transfer of license for the M.S.C. Power Project, FERC...

  5. Electronic Transfer of School Records.

    ERIC Educational Resources Information Center

    Yeagley, Raymond

    2001-01-01

    Describes the electronic transfer of student records, notably the use of a Web-server named CHARLOTTE sponsored by the National Forum on Education Statistics and an Electronic Data Exchange system named SPEEDE/ExPRESS. (PKP)

  6. The impact of a telehealth web-based solution on neurosurgery triage and consultation.

    PubMed

    Moya, Monica; Valdez, Jessica; Yonas, Howard; Alverson, Dale C

    2010-11-01

    To enhance the quality of neurosurgery consultations, triage, and transport decisions between a Level I trauma service neurosurgery program at the University of New Mexico Hospital and referring hospitals, a secure Health Insurance Portability and Accountability Act (HIPAA)-compliant Web-based system was developed, to which digital neurological images could be sent for review by a neurosurgeon for consultation or patient transfer. Based upon prior experience of the neurosurgery service, it was predicted that 25% of transfer requests would be avoided if the neurosurgeons reviewed the computerized tomography scans at the time of a transfer request. In addition, it was predicted that in 25% of the cases changes in management recommendations would take place independent of the transfer decision. The program was designed to allow referring hospitals to transmit digital images to the Web site, providing consulting doctors with additional patient information. This project analyzed the neurosurgeons' responses to questions designed to determine if transport or management decisions were altered when using this telehealth program in response to a request for consultation or transfer from a rural facility. Analysis of the responses of the consulting neurosurgeons revealed that, after viewing the images, 44% of the potential transfers were avoided and 44% of consulted cases resulted in management recommendation changes independent of the transfer decision. Use of the system resulted in improved triage and changes in transfer or management recommendations. A significant number of potential transfers were avoided, resulting in transport cost avoidance, more effective use of resources, and more appropriate use of the neurosurgery service as well as improved patient preparation.

  7. 75 FR 6019 - NewPage Wisconsin System Inc., Kaukauna Utilities; Notice of Application for Transfer of License...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... project can be viewed or printed on the eLibrary link of the Commission's Web site at http://www.ferc.gov... Wisconsin System Inc., Kaukauna Utilities; Notice of Application for Transfer of License and Soliciting Comments and Motions To Intervene January 29, 2010. On January 25, 2010, NewPage Wisconsin System Inc...

  8. Ship to Shore Data Communication and Prioritization

    DTIC Science & Technology

    2011-12-01

    ...First Out; FTP File Transfer Protocol; GCCS-M Global Command and Control System Maritime; HAIPE High Assurance Internet Protocol Encryptor; HTTP Hypertext Transfer Protocol (world wide web protocol); IBS Integrated Bar Code System; IDEF0 Integration Definition; IER Information Exchange Requirements; INTEL Intelligence; IP Internet Protocol; IPT Integrated Product Team; ISEA In-Service Engineering Agent; ISNS Integrated Shipboard Network System; IT...

  9. Guide to the Internet. The world wide web.

    PubMed Central

    Pallen, M.

    1995-01-01

    The world wide web provides a uniform, user friendly interface to the Internet. Web pages can contain text and pictures and are interconnected by hypertext links. The addresses of web pages are recorded as uniform resource locators (URLs), transmitted by hypertext transfer protocol (HTTP), and written in hypertext markup language (HTML). Programs that allow you to use the web are available for most operating systems. Powerful on line search engines make it relatively easy to find information on the web. Browsing through the web--"net surfing"--is both easy and enjoyable. Contributing to the web is not difficult, and the web opens up new possibilities for electronic publishing and electronic journals. PMID:8520402
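    The URL/HTTP/HTML anatomy the abstract describes can be illustrated by dissecting a web address with Python's standard library (the URL itself is a made-up example):

```python
from urllib.parse import urlparse

# A URL names the transfer protocol (scheme), the server (netloc),
# and the path to the HTML document on that server.
url = "http://www.example.org/guide/www.html"
parts = urlparse(url)
```

    Here `parts.scheme` is the protocol used to fetch the page, `parts.netloc` the host, and `parts.path` the document location, exactly the three ingredients of a uniform resource locator.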

  10. Zinc in an ultraoligotrophic lake food web.

    PubMed

    Montañez, Juan Cruz; Arribére, María A; Rizzo, Andrea; Arcagni, Marina; Campbell, Linda; Ribeiro Guevara, Sergio

    2018-06-01

    Zinc (Zn) bioaccumulation and trophic transfer were analyzed in the food web of Lake Nahuel Huapi, a deep, unpolluted ultraoligotrophic system in North Patagonia. Benthic macroinvertebrates, plankton, and native and introduced fish were collected at three sites. The effect of pyroclastic inputs on Zn levels in lacustrine food webs was assessed by studying the impact of the eruption of the Puyehue-Cordón Caulle volcanic complex (PCCVC) in 2011, by performing three sampling campaigns immediately before and after the PCCVC eruption, and after 2 years of recovery of the ecosystem. Zinc trophodynamics in the L. Nahuel Huapi food web was assessed using nitrogen stable isotopes (δ15N). There was no significant increase of Zn concentrations ([Zn]) in L. Nahuel Huapi biota after the PCCVC eruption, despite the evidence of [Zn] increase in lake water that could be associated with volcanic ash leaching. The organisms studied exhibited [Zn] above the threshold level considered for dietary deficiency, regulating Zn adequately even under a catastrophic situation like the PCCVC 2011 eruption. Zinc concentrations exhibited a biodilution pattern in the lake's food web. To the best of our knowledge, the present research is the first report of Zn biodilution in lacustrine systems, and the first to study Zn transfer in a freshwater food web including both pelagic and benthic compartments.

  11. Navy Controls for Invoice, Receipt, Acceptance, and Property Transfer System Need Improvement

    DTIC Science & Technology

    2016-02-25

    iRAPT as a web-based system to electronically invoice, receipt, and accept services and products from its contractors and vendors. The iRAPT system...electronically shares documents between DoD and its contractors and vendors to eliminate redundant data entry, increase data accuracy, and reduce...The iRAPT system allows contractors to submit and track invoices and receipt and acceptance documents over the web and allows government personnel to

  12. Web Service Architecture Framework for Embedded Devices

    ERIC Educational Resources Information Center

    Yanzick, Paul David

    2009-01-01

    The use of Service Oriented Architectures, namely web services, has become a widely adopted method for transfer of data between systems across the Internet as well as the Enterprise. Adopting a similar approach to embedded devices is also starting to emerge as personal devices and sensor networks are becoming more common in the industry. This…

  13. Investigating the Efficacy of Web-Based Transfer Training on Independent Wheelchair Transfers Through Randomized Controlled Trials.

    PubMed

    Worobey, Lynn A; Rigot, Stephanie K; Hogaboom, Nathan S; Venus, Chris; Boninger, Michael L

    2018-01-01

    To determine the efficacy of a web-based transfer training module at improving transfer technique across 3 groups: web-based training, in-person training (current standard of practice), and a waitlist control group (WLCG); and secondarily, to determine subject factors that can be used to predict improvements in transfer ability after training. Randomized controlled trials. Summer and winter sporting events for disabled veterans. A convenience sample (N=71) of manual and power wheelchair users who could transfer independently. An individualized, in-person transfer training session or a web-based transfer training module. The WLCG received the web training at their follow-up visit. Transfer Assessment Instrument (TAI) part 1 score was used to assess transfers at baseline, skill acquisition immediately posttraining, and skill retention after a 1- to 2-day follow-up period. The in-person and web-based training groups improved their median (interquartile range) TAI scores from 7.98 (7.18-8.46) to 9.13 (8.57-9.58; P<.01), and from 7.14 (6.15-7.86) to 9.23 (8.46-9.82; P<.01), respectively, compared with the WLCG that had a median score of 7.69 for both assessments (baseline, 6.15-8.46; follow-up control, 5.83-8.46). Participants retained improvements at follow-up (P>.05). A lower initial TAI score was found to be the only significant predictor of a larger percent change in TAI score after receiving training. Transfer training can improve technique with changes retained within a short follow-up window, even among experienced wheelchair users. Web-based transfer training demonstrated comparable improvements to in-person training. With almost half of the United States population consulting online resources before a health care professional, web-based training may be an effective method to increase knowledge translation. Copyright © 2017 American Congress of Rehabilitation Medicine. All rights reserved.

  14. A Web-Based Database for Nurse Led Outreach Teams (NLOT) in Toronto.

    PubMed

    Li, Shirley; Kuo, Mu-Hsing; Ryan, David

    2016-01-01

    A web-based system can provide access to real-time data and information. Healthcare is moving towards digitizing patients' medical information and securely exchanging it through web-based systems. In one of Ontario's health regions, Nurse Led Outreach Teams (NLOT) provide emergency mobile nursing services to help reduce unnecessary transfers from long-term care homes to emergency departments. Currently the NLOT team uses a Microsoft Access database to keep track of the health information on the residents that they serve. The Access database lacks scalability, portability, and interoperability. The objective of this study is the development of a web-based database using Oracle Application Express that is easily accessible from mobile devices. The web-based database will allow NLOT nurses to enter and access resident information anytime and from anywhere.

  15. A Web-based telemedicine system for diabetic retinopathy screening using digital fundus photography.

    PubMed

    Wei, Jack C; Valentino, Daniel J; Bell, Douglas S; Baker, Richard S

    2006-02-01

    The purpose was to design and implement a Web-based telemedicine system for diabetic retinopathy screening using digital fundus cameras, and to make the software publicly available through Open Source release. The process of retinal imaging and case reviewing was modeled to optimize workflow and implement use of the computer system. The Web-based system was built on Java Servlet and Java Server Pages (JSP) technologies. Apache Tomcat was chosen as the JSP engine, while MySQL was used as the main database and the Laboratory of Neuro Imaging (LONI) Image Storage Architecture, from LONI-UCLA, as the platform for image storage. For security, all data transmissions were carried over encrypted Internet connections such as Secure Socket Layer (SSL) and HyperText Transfer Protocol over SSL (HTTPS). User logins were required and access to patient data was logged for auditing. The system was deployed at Hubert H. Humphrey Comprehensive Health Center and Martin Luther King/Drew Medical Center of Los Angeles County Department of Health Services. Within 4 months, 1500 images of more than 650 patients were taken at Humphrey's Eye Clinic and successfully transferred to King/Drew's Department of Ophthalmology. This study demonstrates an effective architecture for remote diabetic retinopathy screening.
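    The SSL/HTTPS protections the system relied on correspond, in modern Python, to a default client TLS context, which verifies server certificates and checks hostnames; this is a present-day sketch of the same idea, not the 2006 implementation:

```python
import ssl

# A default client-side TLS context: server certificates are verified
# against the system trust store and hostnames are checked, the baseline
# that encrypted transfers of patient images depend on.
ctx = ssl.create_default_context()
```

    Passing such a context to an HTTPS client (e.g. `urllib.request.urlopen(url, context=ctx)`) ensures image transfers are both encrypted and authenticated.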

  16. Big data in wildlife research: remote web-based monitoring of hibernating black bears.

    PubMed

    Laske, Timothy G; Garshelis, David L; Iaizzo, Paul A

    2014-12-11

    Numerous innovations for the management and collection of "big data" have arisen in the field of medicine, including implantable computers and sensors, wireless data transmission, and web-based repositories for collecting and organizing information. Recently, human clinical devices have been deployed in captive and free-ranging wildlife to aid in the characterization of both normal physiology and the interaction of animals with their environment, including reactions to humans. Although these devices have had a significant impact on the types and quantities of information that can be collected, their utility has been limited by internal memory capacities, the efforts required to extract and analyze information, and by the necessity to handle the animals in order to retrieve stored data. We surgically implanted miniaturized cardiac monitors (1.2 cc, Reveal LINQ™, Medtronic Inc.), a newly developed human clinical system, into hibernating wild American black bears (N = 6). These devices include wireless capabilities, which enabled frequent transmissions of detailed physiological data from bears in their remote den sites to a web-based data storage and management system. Solar and battery powered telemetry stations transmitted detailed physiological data over the cellular network during the winter months. The system provided the transfer of large quantities of data in near-real time. Observations included changes in heart rhythms associated with birthing and caring for cubs, and in all bears, long periods without heart beats (up to 16 seconds) occurred during each respiratory cycle. For the first time, detailed physiological data were successfully transferred from an animal in the wild to a web-based data collection and management system, overcoming previous limitations on the quantities of data that could be transferred. 
The system provides an opportunity to detect unusual events as they are occurring, enabling investigation of the animal and site shortly afterwards. Although the current study was limited to bears in winter dens, we anticipate that future systems will transmit data from implantable monitors to wearable transmitters, allowing for big data transfer on non-stationary animals.

  17. Food webs of two intermittently open estuaries receiving 15N-enriched sewage effluent

    NASA Astrophysics Data System (ADS)

    Hadwen, Wade L.; Arthington, Angela H.

    2007-01-01

    Carbon and nitrogen stable isotope signatures were used to assess the response of food webs to sewage effluent discharged into two small intermittently open estuaries in northern New South Wales, Australia. One of these systems, Tallows Creek, has a history of direct sewage inputs, whilst the other, Belongil Creek, receives wastewater via an extensive wetland treatment system. The food webs of both systems were driven by algal sources of carbon, reflecting high autotrophic productivity in response to the nutrients entering the system from sewage effluent. All aquatic biota collected from Tallows Creek had significantly enriched δ15N signatures relative to their conspecifics from Belongil Creek, indicating that sewage nitrogen had been assimilated and transferred throughout the Tallows Creek food web. These δ15N values were higher than those reported from studies in permanently open estuaries receiving sewage effluent. We suggest that these enriched signatures and the transfer of nitrogen throughout the entire food web reflect differences in hydrology and associated nitrogen cycling processes between permanently open and intermittently open estuaries. Although all organisms in Tallows Creek were generally 15N-enriched, isotopically light (less 15N-enriched) individuals of estuary perchlet ( Ambassis marianus) and sea mullet ( Mugil cephalus) were also collected. These individuals were most likely recent immigrants into Tallows Creek, as this system had only recently been opened to the ocean. This isotopic discrimination between resident (enriched) and immigrant (significantly less enriched) individuals can provide information on fish movement patterns and the role of heavily polluted intermittently open estuaries in supporting commercially and recreationally valuable estuarine species.

  18. Integrating DXplain into a clinical information system using the World Wide Web.

    PubMed

    Elhanan, G; Socratous, S A; Cimino, J J

    1996-01-01

    The World Wide Web (WWW) offers a cross-platform environment and standard protocols that enable integration of various applications available on the Internet. The authors use the Web to facilitate interaction between their Web-based Clinical Information System and a decision-support system, DXplain, at the Massachusetts General Hospital, using local architecture and Common Gateway Interface programs. The current application translates patients' laboratory test results into DXplain's terms to generate diagnostic hypotheses. Two different access methods are utilized for this model: Hypertext Transfer Protocol (HTTP) and TCP/IP function calls. While clinical aspects cannot be evaluated as yet, the model demonstrates the potential of Web-based applications for interaction and integration, and how local architecture, with a controlled vocabulary server, can further facilitate such integration. This model serves to demonstrate some of the limitations of current WWW technology and identifies issues such as control over Web resources and their utilization, as well as liability issues, as possible obstacles to further integration.

  19. OC ToGo: bed site image integration into OpenClinica with mobile devices

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Gehlen, Johan; Jonas, Stephan; Deserno, Thomas M.

    2014-03-01

    Imaging and image-based measurements nowadays play an essential role in controlled clinical trials, but electronic data capture (EDC) systems insufficiently support integration of images captured by mobile devices (e.g. smartphones and tablets). The web application OpenClinica has become established as one of the world's leading EDC systems and is used to collect, manage and store data of clinical trials in electronic case report forms (eCRFs). In this paper, we present a mobile application for instantaneous integration of images into OpenClinica directly during examination at the patient's bedside. The communication between the Android application and OpenClinica is based on the simple object access protocol (SOAP) and representational state transfer (REST) web services for metadata, and the secure file transfer protocol (SFTP) for image transfer. OpenClinica's web services are used to query context information (e.g. existing studies, events and subjects) and to import data into the eCRF, as well as to export eCRF metadata and structural information. A stable image transfer is ensured and progress information (e.g. remaining time) is visualized to the user. The workflow is demonstrated for a European multi-center registry in which patients with calciphylaxis disease are included. Our approach improves the EDC workflow, saves time, and reduces costs. Furthermore, data privacy is enhanced, since storage of private health data on the imaging devices becomes obsolete.
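    The SOAP messages exchanged with OpenClinica's web services are XML envelopes; a minimal sketch of building one follows, with element names that are illustrative rather than OpenClinica's actual schema:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_envelope(body_tag: str, fields: dict) -> bytes:
    """Wrap a request in a minimal SOAP 1.1 envelope: an Envelope element
    containing a Body, which carries the application-specific request.
    The request element and field names here are hypothetical."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    req = ET.SubElement(body, body_tag)
    for name, value in fields.items():
        ET.SubElement(req, name).text = value
    return ET.tostring(env)

msg = soap_envelope("listStudiesRequest", {"site": "registry-01"})
```

    A client would POST `msg` to the service endpoint; REST metadata calls and SFTP image uploads sit alongside this in the workflow the paper describes.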

  20. A web-based institutional DICOM distribution system with the integration of the Clinical Trial Processor (CTP).

    PubMed

    Aryanto, K Y E; Broekema, A; Langenhuysen, R G A; Oudkerk, M; van Ooijen, P M A

    2015-05-01

    To develop and test a fast and easy rule-based web environment with optional de-identification of imaging data to facilitate data distribution within a hospital environment. A web interface was built using Hypertext Preprocessor (PHP), an open-source scripting language for web development, and Java, with SQL Server to handle the database. The system allows for the selection of patient data and for de-identifying these when necessary. Using the services provided by the RSNA Clinical Trial Processor (CTP), the selected images were pushed to the appropriate services using a protocol based on the module created for the associated task. Five pipelines, each performing a different task, were set up in the server. In a 75-month period, more than 2,000,000 images were transferred and de-identified in a proper manner, while 20,000,000 images were moved from one node to another without de-identification. While maintaining a high level of security and stability, the proposed system is easy to set up, integrates well with our clinical and research practice, and provides a fast and accurate vendor-neutral process of transferring, de-identifying, and storing DICOM images. Its ability to run different de-identification processes in parallel pipelines is a major advantage in both clinical and research settings.
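    CTP's anonymizer applies per-attribute rules to DICOM headers; a much-simplified Python analogue, with illustrative tag choices and replacement values, conveys the idea:

```python
# Each rule maps a DICOM attribute to an action: keep it, remove it,
# or replace it with a fixed value. Unlisted attributes are dropped.
RULES = {
    "PatientName":      ("replace", "ANON"),
    "PatientBirthDate": ("remove", None),
    "PatientID":        ("replace", "SUBJ-0001"),
    "Modality":         ("keep", None),
    "StudyDate":        ("keep", None),
}

def deidentify(header: dict) -> dict:
    out = {}
    for attr, value in header.items():
        action, repl = RULES.get(attr, ("remove", None))
        if action == "keep":
            out[attr] = value
        elif action == "replace":
            out[attr] = repl
    return out

clean = deidentify({"PatientName": "DOE^JANE", "PatientBirthDate": "19550412",
                    "PatientID": "12345", "Modality": "CT", "StudyDate": "20140101"})
```

    Running several such rule sets side by side mirrors the parallel pipelines the paper highlights: one pipeline de-identifies for research export, another passes images through untouched for clinical routing.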

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, R.

    RESTful (REpresentational State Transfer) web services are an alternative implementation to SOAP/RPC web services in a client/server model. BNL's IT Division has started deploying RESTful web services for enterprise data retrieval and manipulation. Data is currently used by system administrators for tracking configuration information and, as the system is expanded, will be used by Cyber Security for vulnerability management and as an aid to cyber investigations. This talk will describe the implementation and outstanding issues, as well as some of the reasons for choosing RESTful over SOAP/RPC, and future directions.
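    The RESTful style contrasted here with SOAP/RPC keeps state in URL-addressed resources and lets the HTTP verb alone select the operation, with no envelope or per-method contract. A minimal in-memory sketch (the configuration-tracking resource is hypothetical):

```python
# Resources live at paths; PUT creates/updates, GET reads, DELETE removes.
store = {}

def handle(method: str, path: str, body=None):
    """Dispatch purely on HTTP verb + resource path, REST-style."""
    if method == "PUT":
        store[path] = body
        return 201, body
    if method == "GET":
        return (200, store[path]) if path in store else (404, None)
    if method == "DELETE":
        return 204, store.pop(path, None)
    return 405, None  # verb not allowed

handle("PUT", "/hosts/web01", {"os": "linux"})
status, host = handle("GET", "/hosts/web01")
```

    In a SOAP/RPC design the same operations would be named method calls inside an envelope; here the uniform interface is the four verbs, which is much of REST's appeal for simple data retrieval and manipulation.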

  2. 78 FR 14528 - Mayo Hydropower, LLC, Avalon Hydropower, LLC; Notice of Application for Transfer of License, and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-06

    ... this project can be viewed or printed on the eLibrary link of Commission's Web site at http://www.ferc...) and the instructions on the Commission's Web site under http://www.ferc.gov/docs-filing/efiling.asp... system at http://www.ferc.gov/docs-filing/ecomment.asp . You must include your name and contact...

  3. 78 FR 60271 - Hollow Dam Power Company; Ampersand Hollow Dam Hydro, LLC; Notice of Application for Transfer of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-01

    ...Library link of Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the docket...) and the instructions on the Commission's Web site under http://www.ferc.gov/docs-filing/efiling.asp... system at http://www.ferc.gov/docs-filing/ecomment.asp . You must include your name and contact...

  4. A Markov-Based Recommendation Model for Exploring the Transfer of Learning on the Web

    ERIC Educational Resources Information Center

    Huang, Yueh-Min; Huang, Tien-Chi; Wang, Kun-Te; Hwang, Wu-Yuin

    2009-01-01

    The ability to apply existing knowledge in new situations and settings is clearly a vital skill that all students need to develop. Nowhere is this truer than in the rapidly developing world of Web-based learning, which is characterized by non-sequential courses and the absence of an effective cross-subject guidance system. As a result, questions…

  5. Application of World Wide Web (W3) Technologies in Payload Operations

    NASA Technical Reports Server (NTRS)

    Sun, Charles; Windrem, May; Picinich, Lou

    1996-01-01

    World Wide Web (W3) technologies are considered in relation to their application to space missions. It is considered that such technologies, including the hypertext transfer protocol and the Java object-oriented language, offer a powerful and relatively inexpensive framework for distributed application software development. The suitability of these technologies for payload monitoring systems development is discussed, and the experience gained from the development of an insect habitat monitoring system based on W3 technologies is reported.

  6. 78 FR 12049 - Marlborough Hydro Associates; Ashuelot River Hydro, Inc.; Notice of Application for Transfer of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-21

    ... information about this project can be viewed or printed on the eLibrary link of Commission's Web site at http... CFR 385.2001(a)(1) and the instructions on the Commission's Web site under http://www.ferc.gov/docs... registration, using the eComment system at http://www.ferc.gov/docs-filing/ecomment.asp . You must include your...

  7. Access to the NCAR Research Data Archive via the Globus Data Transfer Service

    NASA Astrophysics Data System (ADS)

    Cram, T.; Schuster, D.; Ji, Z.; Worley, S. J.

    2014-12-01

    The NCAR Research Data Archive (RDA; http://rda.ucar.edu) contains a large and diverse collection of meteorological and oceanographic observations, operational and reanalysis outputs, and remote sensing datasets to support atmospheric and geoscience research. The RDA contains greater than 600 dataset collections which support the varying needs of a diverse user community. The number of RDA users is increasing annually, and the most popular method used to access the RDA data holdings is through web-based protocols, such as wget and cURL based scripts. In the year 2013, 10,000 unique users downloaded greater than 820 terabytes of data from the RDA, and customized data products were prepared for more than 29,000 user-driven requests. In order to further support this increase in web download usage, the RDA is implementing the Globus data transfer service (www.globus.org) to provide a GridFTP data transfer option for the user community. The Globus service is broadly scalable, has an easy-to-install client, is sustainably supported, and provides a robust, efficient, and reliable data transfer option for RDA users. This paper highlights the main functionality and usefulness of the Globus data transfer service for accessing the RDA holdings. The Globus data transfer service, developed and supported by the Computation Institute at The University of Chicago and Argonne National Laboratory, uses GridFTP as a fast, secure, and reliable method for transferring data between two endpoints. A Globus user account is required to use this service, and data transfer endpoints are defined on the Globus web interface. In the RDA use cases, the access endpoint is created on the RDA data server at NCAR. The data user defines the receiving endpoint for the data transfer, which can be the main file system at a host institution, a personal work station, or laptop.
Once initiated, the data transfer runs as an unattended background process by Globus, and Globus ensures that the transfer is accurately fulfilled. Users can monitor the data transfer progress on the Globus web interface and optionally receive an email notification once it is complete. Globus also provides a command-line interface to support scripted transfers, which can be useful when embedded in data processing workflows.
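    A Globus transfer is submitted as a task document naming the two endpoints and the files to move; the sketch below assembles such a document, with field names that follow the public Transfer API but should be treated as illustrative here (the endpoint names and paths are invented):

```python
def build_transfer_task(src_endpoint, dst_endpoint, items, label=""):
    """Assemble a Globus-style transfer task: source and destination
    endpoints plus one transfer item per file to move."""
    return {
        "DATA_TYPE": "transfer",
        "source_endpoint": src_endpoint,
        "destination_endpoint": dst_endpoint,
        "label": label,
        "DATA": [
            {"DATA_TYPE": "transfer_item",
             "source_path": src, "destination_path": dst}
            for src, dst in items
        ],
    }

task = build_transfer_task("rda#data", "my-laptop",
                           [("/ds083.2/2013/file.grb", "/scratch/file.grb")],
                           label="RDA pull")
```

    Once such a task is submitted, Globus runs it as the unattended background process described above, retrying until the transfer is accurately fulfilled.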

  8. The Service Environment for Enhanced Knowledge and Research (SEEKR) Framework

    NASA Astrophysics Data System (ADS)

    King, T. A.; Walker, R. J.; Weigel, R. S.; Narock, T. W.; McGuire, R. E.; Candey, R. M.

    2011-12-01

    The Service Environment for Enhanced Knowledge and Research (SEEKR) Framework is a configurable service oriented framework to enable the discovery, access and analysis of data shared in a community. The SEEKR framework integrates many existing independent services through the use of web technologies and standard metadata. Services are hosted on systems by using an application server and are callable by using REpresentational State Transfer (REST) protocols. Messages and metadata are transferred with eXtensible Markup Language (XML) encoding which conform to a published XML schema. Space Physics Archive Search and Extract (SPASE) metadata is central to utilizing the services. Resources (data, documents, software, etc.) are described with SPASE and the associated Resource Identifier is used to access and exchange resources. The configurable options for the service can be set by using a web interface. Services are packaged as web application resource (WAR) files for direct deployment on application services such as Tomcat or Jetty. We discuss the composition of the SEEKR framework, how new services can be integrated and the steps necessary to deploying the framework. The SEEKR Framework emerged from NASA's Virtual Magnetospheric Observatory (VMO) and other systems and we present an overview of these systems from a SEEKR Framework perspective.
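    SPASE resources are XML records keyed by a Resource Identifier; a pared-down example of extracting one with the standard library follows (real SPASE documents are richer and namespace-qualified, so this structure is illustrative):

```python
import xml.etree.ElementTree as ET

# A stripped-down SPASE-like record; the ResourceID is the handle used
# to access and exchange the described resource.
doc = """<Spase>
  <NumericalData>
    <ResourceID>spase://VMO/NumericalData/Example/PT1M</ResourceID>
    <ResourceHeader><ResourceName>Example 1-min data</ResourceName></ResourceHeader>
  </NumericalData>
</Spase>"""

root = ET.fromstring(doc)
resource_id = root.findtext("NumericalData/ResourceID")
```

    A framework service would resolve this identifier to the underlying data, document, or software, which is exactly how SEEKR's services exchange resources.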

  9. Earth Science Mining Web Services

    NASA Astrophysics Data System (ADS)

    Pham, L. B.; Lynnes, C. S.; Hegde, M.; Graves, S.; Ramachandran, R.; Maskey, M.; Keiser, K.

    2008-12-01

    To allow scientists further capabilities in the area of data mining and web services, the Goddard Earth Sciences Data and Information Services Center (GES DISC) and researchers at the University of Alabama in Huntsville (UAH) have developed a system to mine data at the source without the need of network transfers. The system has been constructed by linking together several pre-existing technologies: the Simple Scalable Script-based Science Processor for Measurements (S4PM), a processing engine at the GES DISC; the Algorithm Development and Mining (ADaM) system, a data mining toolkit from UAH that can be configured in a variety of ways to create customized mining processes; ActiveBPEL, a workflow execution engine based on BPEL (Business Process Execution Language); XBaya, a graphical workflow composer; and the EOS Clearinghouse (ECHO). XBaya is used to construct an analysis workflow at UAH using ADaM components, which are also installed remotely at the GES DISC, wrapped as Web Services. The S4PM processing engine searches ECHO for data using space-time criteria and stages them to cache, allowing the ActiveBPEL engine to remotely orchestrate the processing workflow within S4PM. As mining is completed, the output is placed in an FTP holding area for the end user. The goals are to give users control over the data they want to process, while mining data at the data source using the server's resources rather than transferring the full volume over the internet. These diverse technologies have been infused into a functioning, distributed system with only minor changes to the underlying technologies. The key to this infusion is the loosely coupled, Web-Services-based architecture: all of the participating components are accessible (one way or another) through SOAP (Simple Object Access Protocol)-based Web Services.
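The SOAP-based wrapping described above can be sketched as a minimal envelope builder; the service namespace, operation name, and parameters below are illustrative, not the actual ADaM service interface:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_envelope(operation, params, service_ns="http://example.org/adam"):
    """Build a minimal SOAP 1.1 envelope; operation and namespace are illustrative."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{service_ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{service_ns}}}{name}").text = str(value)
    return ET.tostring(env, encoding="unicode")

print(build_envelope("RunClassifier", {"dataset": "MOD08_D3", "algorithm": "kmeans"}))
```

In practice a SOAP toolkit would generate such envelopes from a WSDL description rather than by hand.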

  10. Earth Science Mining Web Services

    NASA Technical Reports Server (NTRS)

    Pham, Long; Lynnes, Christopher; Hegde, Mahabaleshwa; Graves, Sara; Ramachandran, Rahul; Maskey, Manil; Keiser, Ken

    2008-01-01

    To allow scientists further capabilities in the area of data mining and web services, the Goddard Earth Sciences Data and Information Services Center (GES DISC) and researchers at the University of Alabama in Huntsville (UAH) have developed a system to mine data at the source without the need of network transfers. The system has been constructed by linking together several pre-existing technologies: the Simple Scalable Script-based Science Processor for Measurements (S4PM), a processing engine at the GES DISC; the Algorithm Development and Mining (ADaM) system, a data mining toolkit from UAH that can be configured in a variety of ways to create customized mining processes; ActiveBPEL, a workflow execution engine based on BPEL (Business Process Execution Language); XBaya, a graphical workflow composer; and the EOS Clearinghouse (ECHO). XBaya is used to construct an analysis workflow at UAH using ADaM components, which are also installed remotely at the GES DISC, wrapped as Web Services. The S4PM processing engine searches ECHO for data using space-time criteria and stages them to cache, allowing the ActiveBPEL engine to remotely orchestrate the processing workflow within S4PM. As mining is completed, the output is placed in an FTP holding area for the end user. The goals are to give users control over the data they want to process, while mining data at the data source using the server's resources rather than transferring the full volume over the internet. These diverse technologies have been infused into a functioning, distributed system with only minor changes to the underlying technologies. The key to this infusion is the loosely coupled, Web-Services-based architecture: all of the participating components are accessible (one way or another) through SOAP (Simple Object Access Protocol)-based Web Services.

  11. The World Wide Web and Technology Transfer at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bianco, David J.

    1994-01-01

    NASA Langley Research Center (LaRC) began using the World Wide Web (WWW) in the summer of 1993, becoming the first NASA installation to provide a Center-wide home page. This coincided with a reorganization of LaRC to provide a more concentrated focus on technology transfer to both aerospace and non-aerospace industry. Use of the WWW and NCSA Mosaic not only provides automated information dissemination, but also allows for the implementation, evolution and integration of many technology transfer applications. This paper describes several of these innovative applications, including the on-line presentation of the entire Technology Opportunities Showcase (TOPS), an industrial partnering showcase that exists on the Web long after the actual 3-day event ended. During its first year on the Web, LaRC also developed several WWW-based information repositories. The Langley Technical Report Server (LTRS), a technical paper delivery system with integrated searching and retrieval, has proved to be quite popular. The NASA Technical Report Server (NTRS), an outgrowth of LTRS, provides uniform access to many logically similar, yet physically distributed NASA report servers. WWW is also the foundation of the Langley Software Server (LSS), an experimental software distribution system which will distribute LaRC-developed software with the possible phase-out of NASA's COSMIC program. In addition to the more formal technology distribution projects, WWW has been successful in connecting people with technologies and people with other people. With the completion of the LaRC reorganization, the Technology Applications Group, charged with interfacing with non-aerospace companies, opened for business with a popular home page.

  12. Interaction between birds and macrofauna within food webs of six intertidal habitats of the Wadden Sea.

    PubMed

    Horn, Sabine; de la Vega, Camille; Asmus, Ragnhild; Schwemmer, Philipp; Enners, Leonie; Garthe, Stefan; Binder, Kirsten; Asmus, Harald

    2017-01-01

    Determining food web structures with Ecological Network Analysis (ENA) is a helpful way to gain insight into complex ecosystem processes. The intertidal area of the Wadden Sea is structured into diverse habitat types which differ in their ecological functioning. In the present study, six different intertidal habitats (i.e. cockle field, razor clam field, mud flat, mussel bank, sand flat and seagrass meadow) were analyzed using ENA to determine similarities and characteristic differences in the food web structure of the systems. All six systems were well balanced between their degree of organization and their robustness. However, they differed in their detailed features. The cockle field and the mussel bank exhibited a strong dependency on external imports. The razor clam field appeared to be a rather small system with low energy transfer. In the mud flat, microphytobenthos was used as a main food source and the system appeared to be sensitive to perturbations. Bird predation was most pronounced in the sand flat and the seagrass meadow and led to an increase in energy transfer and parallel trophic cycles in these habitats. Habitat diversity appears to be an important trait for the Wadden Sea as each subsystem seems to have a specific role in the overall functioning of the entire ecosystem.

  13. Interaction between birds and macrofauna within food webs of six intertidal habitats of the Wadden Sea

    PubMed Central

    Horn, Sabine; de la Vega, Camille; Asmus, Ragnhild; Schwemmer, Philipp; Enners, Leonie; Garthe, Stefan; Binder, Kirsten; Asmus, Harald

    2017-01-01

    Determining food web structures with Ecological Network Analysis (ENA) is a helpful way to gain insight into complex ecosystem processes. The intertidal area of the Wadden Sea is structured into diverse habitat types which differ in their ecological functioning. In the present study, six different intertidal habitats (i.e. cockle field, razor clam field, mud flat, mussel bank, sand flat and seagrass meadow) were analyzed using ENA to determine similarities and characteristic differences in the food web structure of the systems. All six systems were well balanced between their degree of organization and their robustness. However, they differed in their detailed features. The cockle field and the mussel bank exhibited a strong dependency on external imports. The razor clam field appeared to be a rather small system with low energy transfer. In the mud flat, microphytobenthos was used as a main food source and the system appeared to be sensitive to perturbations. Bird predation was most pronounced in the sand flat and the seagrass meadow and led to an increase in energy transfer and parallel trophic cycles in these habitats. Habitat diversity appears to be an important trait for the Wadden Sea as each subsystem seems to have a specific role in the overall functioning of the entire ecosystem. PMID:28489869

  14. WebGLORE: a web service for Grid LOgistic REgression.

    PubMed

    Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2013-12-15

    WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed, sensitive datasets. It transfers only aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, Java Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of the GNU General Public License as published by the Free Software Foundation.
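The aggregation pattern described above, in which only per-site summary statistics (not patient-level records) travel to the server, can be sketched as below. GLORE is reported to use a Newton-Raphson scheme; this stand-in uses plain gradient ascent on toy data purely to show the message flow:

```python
import math

def local_gradient(beta, X, y):
    """Per-site gradient of the logistic log-likelihood: the only statistic shared."""
    g = [0.0] * len(beta)
    for xi, yi in zip(X, y):
        z = sum(b * x for b, x in zip(beta, xi))
        z = max(-30.0, min(30.0, z))  # clamp to avoid overflow in exp()
        p = 1.0 / (1.0 + math.exp(-z))
        for j, xj in enumerate(xi):
            g[j] += (yi - p) * xj
    return g

def fit_global(sites, n_features, lr=0.1, iters=200):
    """Server side: sum the aggregated gradients from each site, then update beta."""
    beta = [0.0] * n_features
    for _ in range(iters):
        total = [0.0] * n_features
        for X, y in sites:
            g = local_gradient(beta, X, y)  # in WebGLORE this would arrive over HTTPS
            total = [t + gi for t, gi in zip(total, g)]
        beta = [b + lr * t for b, t in zip(beta, total)]
    return beta

# Two hypothetical sites; each row is (intercept, x), labels are 0/1.
site_a = ([[1, 0.0], [1, 1.0], [1, 2.0]], [0, 1, 0])
site_b = ([[1, 3.0], [1, 4.0]], [1, 1])
beta = fit_global([site_a, site_b], 2)
```

The raw rows in `site_a` and `site_b` never leave their sites; only the gradient vectors do.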

  15. Soil life in reconstructed ecosystems: Initial soil food web responses after rebuilding a forest soil profile for a climate change experiment

    EPA Science Inventory

    Disrupting ecosystem components, while transferring and reconstructing them for experiments can produce myriad responses. Establishing the extent of these biological responses as the system approaches a new equilibrium allows us more reliably to emulate comparable native systems....

  16. Plugin free remote visualization in the browser

    NASA Astrophysics Data System (ADS)

    Tamm, Georg; Slusallek, Philipp

    2015-01-01

    Today, users access information and rich media from anywhere using the web browser on their desktop computers, tablets or smartphones. But the web evolves beyond media delivery. Interactive graphics applications like visualization or gaming become feasible as browsers advance in the functionality they provide. However, to deliver large-scale visualization to thin clients like mobile devices, a dedicated server component is necessary. Ideally, the client runs directly within the browser the user is accustomed to, requiring no installation of a plugin or native application. In this paper, we present the state-of-the-art of technologies which enable plugin free remote rendering in the browser. Further, we describe a remote visualization system unifying these technologies. The system transfers rendering results to the client as images or as a video stream. We utilize the upcoming World Wide Web Consortium (W3C) conform Web Real-Time Communication (WebRTC) standard, and the Native Client (NaCl) technology built into Chrome, to deliver video with low latency.

  17. The QuakeSim Project: Web Services for Managing Geophysical Data and Applications

    NASA Astrophysics Data System (ADS)

    Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet

    2008-04-01

    We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.

  18. WebGLORE: a Web service for Grid LOgistic REgression

    PubMed Central

    Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2013-01-01

    WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed, sensitive datasets. It transfers only aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, Java Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. Availability and implementation: http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of the GNU General Public License as published by the Free Software Foundation. Contact: x1jiang@ucsd.edu PMID:24072732

  19. Guidelines for Transferring Residential Courses into Web

    ERIC Educational Resources Information Center

    Tüzün, Hakan; Çinar, Murat

    2016-01-01

    This study shared unique design experiences by examining the process of transferring residential courses to the Web, and proposed a design model for individuals who want to transfer their courses into this environment. The formative research method was used in the study, and two project teams' processes of putting courses, which were being taught…

  20. Limitations of STIRAP-like population transfer in extended systems: the three-level system embedded in a web of background states.

    PubMed

    Jakubetz, Werner

    2012-12-14

    This paper presents a systematic numerical investigation of background state participation in STIRAP (stimulated Raman adiabatic passage) population transfer among vibrational states, focusing on the consequences for the robustness of the method. The simulations, which are performed over extended grids in the parameter space of the Stokes and pump pulses (frequencies, field strengths, and pulse lengths), involve hierarchies of (3 + N)-level systems of increasing complexity, ranging from the standard three-level STIRAP setup (N = 0) in Λ-configuration up to N = 446. A strongly coupled three-level core system is selected from the full Hamiltonian of the double-well HCN∕HNC system, and the couplings connecting this core system to the remaining states are (re-)parameterized in different ways, from very weak to very strong. The systems so obtained represent a three-level system embedded in various ways in webs of cross-linked vibrational background states and incorporate typical molecular properties. We first summarize essential properties of population transfer in the standard three-level system and quantify the robustness of the method and its dependence on the pulse parameters. Against these reference results, we present results obtained for four (3 + 446)-level systems and several subsystems. For pulse lengths of at most a few picoseconds, the intrinsic robustness of STIRAP with respect to variations in the field strength disappears as soon as the largest core-background couplings exceed about one tenth of the STIRAP couplings. In such cases robustness with respect to variations in the field strength is entirely lost, since at higher field strengths, except for irregularly spaced narrow frequency ranges, transfer probabilities are strongly reduced. STIRAP-like population transfer is maintained, with some restrictions, at low field strengths near the onset of adiabatic transfer. The suppression of STIRAP is traced back to different mechanisms based on a plenitude of single- and multiphoton transitions to background states, which at the high field strengths characteristic of STIRAP proceed readily even along weakly coupled pathways.

  1. A Workshop on UNIX, Workstations, and Internet Connections.

    ERIC Educational Resources Information Center

    Vierheller, Timothy R.

    1997-01-01

    Describes a workshop that introduces participants to the UNIX operating system. Provides an overview of how to access information on the Internet and gain familiarity with Web browsers, file transfer programs, telnet sessions, newsreaders, and Gopher services. (DDR)

  2. Development of wide area environment accelerator operation and diagnostics method

    NASA Astrophysics Data System (ADS)

    Uchiyama, Akito; Furukawa, Kazuro

    2015-08-01

    Remote operation and diagnostic systems for particle accelerators have been developed for beam operation and maintenance in various situations. Even though fully remote experiments are not necessary, remote diagnosis and maintenance of the accelerator are required. Considering remote-operation operator interfaces (OPIs), the use of standard protocols such as the hypertext transfer protocol (HTTP) is advantageous, because system-dependent protocols are unnecessary between the remote client and the on-site server. Here, we have developed a client system based on WebSocket, a new protocol provided by the Internet Engineering Task Force for Web-based systems, as a next-generation Web-based OPI using the Experimental Physics and Industrial Control System (EPICS) Channel Access protocol. As a result of this implementation, WebSocket-based client systems have become available for remote operation. In practical application, the remote operation of an accelerator via a wide area network (WAN) faces a number of challenges; for example, the accelerator has the characteristics of both an experimental device and a radiation generator, and any error in remote control system operation could result in an immediate breakdown. Therefore, we propose the implementation of an operator intervention system for remote accelerator diagnostics and support that can obviate any differences between the local control room and remote locations. Here, remote-operation Web-based OPIs, which resolve security issues, are developed.
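A WebSocket-based OPI of this kind typically exchanges small structured messages between browser clients and a gateway to Channel Access. The sketch below shows one hypothetical JSON message format; the field names and the `caget`/`caput` actions are illustrative, not the authors' actual protocol:

```python
import json

def make_request(action, pv_name, value=None):
    """Encode a client request for a hypothetical WebSocket/Channel Access gateway."""
    msg = {"action": action, "pv": pv_name}
    if value is not None:
        msg["value"] = value
    return json.dumps(msg)

def handle_message(text):
    """Decode a message on the gateway side and validate it (dispatch stubbed out)."""
    msg = json.loads(text)
    if msg["action"] == "caput" and "value" not in msg:
        raise ValueError("caput requires a value")
    return msg

print(handle_message(make_request("caget", "SR:BPM01:X")))
```

Keeping the wire format to plain JSON over WebSocket means the browser client needs no system-dependent protocol support.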

  3. Chronic Mycobacterium infection of first dorsal web space after accidental Bacilli Calmette-Guérin injection in a health worker: case report.

    PubMed

    Vigler, Mordechai; Mulett, Hanan; Hausman, Michael R

    2008-11-01

    We present a case of inoculation of the first dorsal web space by a nurse practitioner who accidentally stuck herself while preparing Bacilli Calmette-Guérin vaccine for treatment of bladder tumor. We report the evolution and management of this resistant chronic Mycobacterium infection that ultimately required use of a vacuum wound management system followed by a microvascular free tissue transfer.

  4. Addressing mental health epidemic among university students via web-based, self-screening, and referral system: a preliminary study.

    PubMed

    Kim, Eung-Hun; Coumar, Anil; Lober, William B; Kim, Yongmin

    2011-03-01

    The prevalence and severity of mental health problems in college and university communities are alarming. However, the majority of students with mental disorders do not seek help from professionals. To help students assess their mental conditions and encourage them to take an active role in seeking care, we developed a web-based self-screening, referral, and secure communication system and evaluated it at the University of Washington for 17 months. The system handled more than 1000 screenings during the study period. Of the subjects who used the system, 75% noted that the system helped them to make a decision to receive help from professionals. The system was able to provide outreach to students with mental health concerns effectively, allow them to self-screen their conditions, and encourage them to receive professional assistance. The system provided students with 24/7 web-based access to the clinic, and more than 50% of the system use was made during off-hours. The system was well received by patients, referral managers, and care providers, and it was transferred to the clinic for daily clinical use. We believe that a web-based system like ours could be used as one way to tackle the growing epidemic of mental health problems among college and university students.

  5. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications

    PubMed Central

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-01-01

    Background: Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. Results: The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. Conclusion: The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. 
Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools. PMID:18021453

  6. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications.

    PubMed

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-11-19

    Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. 
Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools.

  7. Autonomous Satellite Command and Control through the World Wide Web: Phase 3

    NASA Technical Reports Server (NTRS)

    Cantwell, Brian; Twiggs, Robert

    1998-01-01

    NASA's New Millennium Program (NMP) has identified a variety of revolutionary technologies that will support orders-of-magnitude improvements in the capabilities of spacecraft missions. This program's Autonomy team has focused on science and engineering automation technologies. In doing so, it has established a clear development roadmap specifying the experiments and demonstrations required to mature these technologies. The primary developmental thrusts of this roadmap are in the areas of remote agents, PI/operator interfaces, planning/scheduling, fault management, and smart execution architectures. Phases 1 and 2 of the ASSET Project (previously known as the WebSat project) have focused on establishing World Wide Web-based commanding and telemetry services as an advanced means of interfacing a spacecraft system with the PI and operators. Current automated capabilities include Web-based command submission, limited contact scheduling, command list generation and transfer to the ground station, spacecraft support for demonstration experiments, data transfer from the ground station back to the ASSET system, data archiving, and Web-based telemetry distribution. Phase 2 was finished in December 1996. During January-December 1997, work commenced on Phase 3 of the ASSET Project. Phase 3 is the subject of this report. This phase permitted SSDL and its project partners to expand the ASSET system in a variety of ways. These added capabilities included the advancement of ground station capabilities, the adaptation of spacecraft on-board software, and the expansion of capabilities of the ASSET management algorithms. 
Specific goals of Phase 3 were: (1) Extend Web-based goal-level commanding for both the payload PI and the spacecraft engineer; (2) Support prioritized handling of multiple PIs as well as associated payload experimenters; (3) Expand the number and types of experiments supported by the ASSET system and its associated spacecraft; (4) Implement more advanced resource management, modeling and fault management capabilities that integrate the space and ground segments of the space system hardware; (5) Implement a beacon monitoring test; (6) Implement an experimental blackboard controller for space system management; (7) Further define typical ground station developments required for Internet-based remote control and for full system automation of the PI-to-spacecraft link. Each of those goals is examined in the next section. Significant sections of this report were also published as a conference paper.

  8. Soil life in reconstructed ecosystems: initial soil food web responses after rebuilding a forest soil profile for a climate change experiment

    Treesearch

    Paul T. Rygiewicz; Vicente J. Monleon; Elaine R. Ingham; Kendall J. Martin; Mark G. Johnson

    2010-01-01

    Disrupting ecosystem components, while transferring and reconstructing them for experiments can produce myriad responses. Establishing the extent of these biological responses as the system approaches a new equilibrium allows us more reliably to emulate comparable native systems. That is, the sensitivity of analyzing ecosystem processes in a reconstructed system is...

  9. Vehicle Dynamics Monitoring and Tracking System (VDMTS): Monitoring Mission Impacts in Support of Installation Land Management

    DTIC Science & Technology

    2012-06-01

    This report will be made accessible through the World Wide Web (WWW) at URLs: http://www.cecer.army.mil and http://libweb.erdc.usace.army.mil. ... conditions (e.g., wetter or dryer conditions). Using the same live event tracking data, predictions can be made for vegetation loss in wet soils, even ...

  10. Web Program for Development of GUIs for Cluster Computers

    NASA Technical Reports Server (NTRS)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.
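The XML-to-GUI mapping at the heart of WIGLAF can be sketched as follows; the element names, attribute names, and widget mapping below are hypothetical, since the actual WIGLAF schema is not given here:

```python
import xml.etree.ElementTree as ET

# A hypothetical application description of the kind WIGLAF might hold in XML.
SPEC = """<application name="solver">
  <parameter name="iterations" type="int" default="100"/>
  <parameter name="tolerance" type="float" default="1e-6"/>
  <parameter name="mesh" type="file" default="grid.msh"/>
</application>"""

def widgets_from_spec(xml_text):
    """Map each <parameter> element to a GUI widget descriptor (mapping assumed)."""
    widget_for = {"int": "spinbox", "float": "textfield", "file": "filechooser"}
    root = ET.fromstring(xml_text)
    return [
        {"label": p.get("name"),
         "widget": widget_for.get(p.get("type"), "textfield"),
         "value": p.get("default")}
        for p in root.findall("parameter")
    ]

for w in widgets_from_spec(SPEC):
    print(w)
```

Because the GUI is derived from the XML description at run time, the same generator can front any application whose parameters are described this way.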

  11. Development of a web database portfolio system with PACS connectivity for undergraduate health education and continuing professional development.

    PubMed

    Ng, Curtise K C; White, Peter; McKay, Janice C

    2009-04-01

    Increasingly, the use of web database portfolio systems is noted in medical and health education, and for continuing professional development (CPD). However, the functions of existing systems are not always aligned with the corresponding pedagogy, and hence reflection is often lost. This paper presents the development of a tailored web database portfolio system with Picture Archiving and Communication System (PACS) connectivity, which is based on the portfolio pedagogy. Following a pre-determined portfolio framework, a system model is proposed with the components of web, database and mail servers, server-side scripts, and a Query/Retrieve (Q/R) broker for conversion between Hypertext Transfer Protocol (HTTP) requests and the Q/R service class of the Digital Imaging and Communications in Medicine (DICOM) standard. The system was piloted with seventy-seven volunteers. A tailored web database portfolio system (http://radep.hti.polyu.edu.hk) was developed. Technological arrangements for reinforcing the portfolio pedagogy include popup windows (reminders) with guidelines and probing questions for 'collect', 'select' and 'reflect' on evidence of development/experience, a limit on the number of files (evidence) to be uploaded, an 'Evidence Insertion' function to link individual uploaded artifacts with reflective writing, the capability to accommodate a diversity of contents, and convenient interfaces for reviewing portfolios and communication. Evidence to date suggests the system supports users in building their portfolios with sound hypertext reflection under a facilitator's guidance, and enables reviewers to monitor students' progress and provide feedback and comments online in a programme-wide setting.
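The role of the Q/R broker, translating HTTP requests into DICOM query attributes, can be sketched as a simple keyword mapping; the HTTP parameter names below are hypothetical, while `PatientID`, `Modality`, and `StudyDate` are standard DICOM attribute keywords:

```python
from urllib.parse import parse_qs

# Hypothetical mapping from HTTP query parameters to DICOM C-FIND attribute keywords.
HTTP_TO_DICOM = {
    "patient": "PatientID",
    "modality": "Modality",
    "date": "StudyDate",
}

def to_find_keys(query_string):
    """Translate an HTTP query string into a C-FIND identifier (keyword -> value)."""
    params = parse_qs(query_string)
    keys = {}
    for http_name, dicom_keyword in HTTP_TO_DICOM.items():
        if http_name in params:
            keys[dicom_keyword] = params[http_name][0]
    return keys

print(to_find_keys("patient=HTI0042&modality=CR&date=20081215"))
```

A real broker would pass the resulting identifier to a DICOM toolkit to issue the C-FIND against the PACS and relay the matches back over HTTP.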

  12. Remote vibration monitoring system using wireless internet data transfer

    NASA Astrophysics Data System (ADS)

    Lemke, John

    2000-06-01

    Vibrations from construction activities can affect infrastructure projects in several ways. Within the general vicinity of a construction site, vibrations can result in damage to existing structures, disturbance to people, damage to sensitive machinery, and degraded performance of precision instrumentation or motion sensitive equipment. Current practice for monitoring vibrations in the vicinity of construction sites commonly consists of measuring free field or structural motions using velocity transducers connected to a portable data acquisition unit via cables. This paper describes an innovative way to collect, process, transmit, and analyze vibration measurements obtained at construction sites. The system described measures vibration at the sensor location, performs necessary signal conditioning and digitization, and sends data to a Web server using wireless data transmission and Internet protocols. A Servlet program running on the Web server accepts the transmitted data and incorporates it into a project database. Two-way interaction between the Web-client and the Web server is accomplished through the use of a Servlet program and a Java Applet running inside a browser located on the Web client's computer. Advantages of this system over conventional vibration data logging systems include continuous unattended monitoring, reduced costs associated with field data collection, instant access to data files and graphs by project team members, and the ability to remotely modify data sampling schemes.
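The server-side flow above (a field unit digitizes samples and transmits them; a Servlet accepts the data and adds it to the project database) can be sketched with Python's stdlib HTTP machinery standing in for the Java Servlet. The path and JSON payload shape are invented for illustration.

```python
# Sketch of the paper's server-side idea: a handler accepts transmitted
# vibration records over HTTP and appends them to a project data store.
# Python's http.server stands in for the Java Servlet; the payload is invented.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.client import HTTPConnection

PROJECT_DB = []  # stand-in for the project database

class VibrationHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        sample = json.loads(self.rfile.read(length))
        PROJECT_DB.append(sample)          # incorporate into the database
        self.send_response(204)
        self.end_headers()
    def log_message(self, *args):          # keep the sketch quiet
        pass

server = HTTPServer(("127.0.0.1", 0), VibrationHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Field-unit side: after signal conditioning and digitization, transmit a record.
conn = HTTPConnection("127.0.0.1", port)
conn.request("POST", "/vibration", json.dumps({"sensor": "V1", "ppv_mm_s": 2.3}))
status = conn.getresponse().status
```

The two-way interaction the paper adds (an applet querying the Servlet for graphs) would be a matching `do_GET` reading back from the same store.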

  13. Development and Integration of WWW-Based Services in an Existing University Environment.

    ERIC Educational Resources Information Center

    Garofalakis, John; Kappos, Panagiotis; Tsakalidis, Athanasios; Tsaknakis, John; Tzimas, Giannis; Vassiliadis, Vassilios

    This paper describes the experience and the problems solved in the process of developing and integrating advanced World Wide Web-based services into the University of Patras (Greece) system. In addition to basic network services (e.g., e-mail, file transfer protocol), the final system will integrate the following set of advanced services: a…

  14. A Web of Things-Based Emerging Sensor Network Architecture for Smart Control Systems.

    PubMed

    Khan, Murad; Silva, Bhagya Nathali; Han, Kijun

    2017-02-09

    The Web of Things (WoT) plays an important role in the representation of the objects connected to the Internet of Things in a more transparent and effective way. Thus, it enables seamless and ubiquitous web communication between users and smart things. Considering the importance of WoT, we propose a WoT-based emerging sensor network (WoT-ESN), which collects data from sensors, routes sensor data to the web, and integrates smart things into the web employing a representational state transfer (REST) architecture. A smart home scenario is introduced to evaluate the proposed WoT-ESN architecture. The smart home scenario is tested through computer simulation of the energy consumption of various household appliances, device discovery, and response time performance. The simulation results show that the proposed scheme significantly optimizes the energy consumption of the household appliances and the response time of the appliances.
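The REST idea in WoT-ESN, where each smart thing is exposed as a web resource addressed by a URI and manipulated through uniform verbs, can be sketched as a small dispatcher. The routing scheme and device names below are invented, not the paper's interface.

```python
# Hedged sketch of the REST architecture: smart things as web resources,
# read with GET and updated with PUT. Device names and paths are illustrative.
THINGS = {"lamp": {"state": "off"}, "heater": {"state": "off"}}

def handle(method, path, body=None):
    """Dispatch a (method, path) pair to the matching thing resource."""
    name = path.strip("/").split("/")[-1]
    if name not in THINGS:
        return 404, None
    if method == "GET":                       # read the representation
        return 200, dict(THINGS[name])
    if method == "PUT" and body is not None:  # replace the representation
        THINGS[name].update(body)
        return 200, dict(THINGS[name])
    return 405, None

status, rep = handle("PUT", "/things/lamp", {"state": "on"})
```

In the paper's smart-home scenario this uniform interface is what lets appliances be discovered and controlled over ordinary web communication.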

  15. A Web of Things-Based Emerging Sensor Network Architecture for Smart Control Systems

    PubMed Central

    Khan, Murad; Silva, Bhagya Nathali; Han, Kijun

    2017-01-01

    The Web of Things (WoT) plays an important role in the representation of the objects connected to the Internet of Things in a more transparent and effective way. Thus, it enables seamless and ubiquitous web communication between users and smart things. Considering the importance of WoT, we propose a WoT-based emerging sensor network (WoT-ESN), which collects data from sensors, routes sensor data to the web, and integrates smart things into the web employing a representational state transfer (REST) architecture. A smart home scenario is introduced to evaluate the proposed WoT-ESN architecture. The smart home scenario is tested through computer simulation of the energy consumption of various household appliances, device discovery, and response time performance. The simulation results show that the proposed scheme significantly optimizes the energy consumption of the household appliances and the response time of the appliances.  PMID:28208787

  16. Assessing ecosystem effects of reservoir operations using food web-energy transfer and water quality models

    USGS Publications Warehouse

    Saito, L.; Johnson, B.M.; Bartholow, J.; Hanna, R.B.

    2001-01-01

    We investigated the effects on the reservoir food web of a new temperature control device (TCD) on the dam at Shasta Lake, California. We followed a linked modeling approach that used a specialized reservoir water quality model to forecast operation-induced changes in phytoplankton production. A food web–energy transfer model was also applied to propagate predicted changes in phytoplankton up through the food web to the predators and sport fishes of interest. The food web–energy transfer model employed a 10% trophic transfer efficiency through a food web that was mapped using carbon and nitrogen stable isotope analysis. Stable isotope analysis provided an efficient and comprehensive means of estimating the structure of the reservoir's food web with minimal sampling and background data. We used an optimization procedure to estimate the diet proportions of all food web components simultaneously from their isotopic signatures. Some consumers were estimated to be much more sensitive than others to perturbations to phytoplankton supply. The linked modeling approach demonstrated that interdisciplinary efforts enhance the value of information obtained from studies of managed ecosystems. The approach exploited the strengths of engineering and ecological modeling methods to address concerns that neither of the models could have addressed alone: (a) the water quality model could not have addressed quantitatively the possible impacts to fish, and (b) the food web model could not have examined how phytoplankton availability might change due to reservoir operations.
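The 10% trophic transfer efficiency and diet-proportion mapping described above lend themselves to a back-of-envelope illustration: propagate primary production up a food web and see how each consumer responds to the phytoplankton supply. The web and diet fractions below are wholly invented; only the 10% efficiency figure comes from the abstract.

```python
# Illustrative miniature of the food web-energy transfer model: propagate
# phytoplankton production upward using diet proportions and a 10% trophic
# transfer efficiency. The web and diets are invented for this sketch.
EFFICIENCY = 0.10
# Fraction of each consumer's diet drawn from each food (hypothetical web,
# listed from lower to higher trophic levels).
DIETS = {
    "zooplankton": {"phytoplankton": 1.0},
    "planktivore": {"zooplankton": 0.8, "phytoplankton": 0.2},
    "sport_fish":  {"planktivore": 1.0},
}

def production(supply):
    """supply: primary production by source; returns production per consumer."""
    out = dict(supply)
    for consumer, diet in DIETS.items():  # insertion order follows the web
        out[consumer] = EFFICIENCY * sum(
            frac * out[food] for food, frac in diet.items()
        )
    return out

base = production({"phytoplankton": 1000.0})
```

Re-running `production` with a perturbed phytoplankton supply shows the sensitivity propagation the paper describes: a consumer two links removed sees the change damped through two 10% transfers.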

  17. Evaluating and Implementing Learning Environments: A United Kingdom Experience.

    ERIC Educational Resources Information Center

    Ingraham, Bruce; Watson, Barbara; McDowell, Liz; Brockett, Adrian; Fitzpatrick, Simon

    2002-01-01

    Reports on ongoing work at five universities in northeastern England that have been evaluating and implementing online learning environments known as virtual learning environments (VLEs) or managed learning environments (MLEs). Discusses do-it-yourself versus commercial systems; transferability; Web-based versus client-server; integration with…

  18. Implementing an SIG based platform of application and service for city spatial information in Shanghai

    NASA Astrophysics Data System (ADS)

    Yu, Bailang; Wu, Jianping

    2006-10-01

    Spatial Information Grid (SIG) is an infrastructure that provides spatial information services according to users' needs by collecting, sharing, organizing and processing massive distributed spatial information resources. This paper presents the architecture, technologies and implementation of the Shanghai City Spatial Information Application and Service System, an SIG-based integrated platform that serves the administration, planning, construction and development of the city. The System covers ten categories of spatial information resources, including city planning, land use, real estate, river systems, transportation, municipal facility construction, environmental protection, sanitation, urban afforestation and basic geographic information data. In addition, spatial information processing services are offered as GIS Web Services. The resources and services are distributed across different web-based nodes. A single database stores the metadata of all the spatial information, and a portal site serves as the main user interface of the System. The portal site has three main functions. First, users can search the metadata and then acquire the distributed data from the search results. Second, spatial processing web applications developed with GIS Web Services, such as file format conversion, spatial coordinate transformation, cartographic generalization and spatial analysis, are available for use. Third, GIS Web Services currently available in the System can be searched and new ones can be registered. The System has been operating efficiently in the Shanghai Government Network since 2005.

  19. Accessing NASA Technology with the World Wide Web

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bianco, David J.

    1995-01-01

    NASA Langley Research Center (LaRC) began using the World Wide Web (WWW) in the summer of 1993, becoming the first NASA installation to provide a Center-wide home page. This coincided with a reorganization of LaRC to provide a more concentrated focus on technology transfer to both aerospace and non-aerospace industry. Use of WWW and NCSA Mosaic not only provides automated information dissemination, but also allows for the implementation, evolution and integration of many technology transfer and technology awareness applications. This paper describes several of these innovative applications, including the on-line presentation of the entire Technology OPportunities Showcase (TOPS), an industrial partnering showcase that exists on the Web long after the actual 3-day event ended. The NASA Technical Report Server (NTRS) provides uniform access to many logically similar, yet physically distributed NASA report servers. WWW is also the foundation of the Langley Software Server (LSS), an experimental software distribution system which will distribute LaRC-developed software. In addition to the more formal technology distribution projects, WWW has been successful in connecting people with technologies and people with other people.

  20. Accounting Data to Web Interface Using PERL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hargeaves, C

    2001-08-13

    This document will explain the process to create a web interface for the accounting information generated by the High Performance Storage Systems (HPSS) accounting report feature. The accounting report contains useful data but it is not easily accessed in a meaningful way. The accounting report is the only way to see summarized storage usage information. The first step is to take the accounting data, make it meaningful and store the modified data in persistent databases. The second step is to generate the various user interfaces, HTML pages, that will be used to access the data. The third step is to transfer all required files to the web server. The web pages pass parameters to Common Gateway Interface (CGI) scripts that generate dynamic web pages and graphs. The end result is a web page with specific information presented in text with or without graphs. The accounting report has a specific format that allows the use of regular expressions to verify if a line is storage data. Each storage data line is stored in a detailed database file with a name that includes the run date. The detailed database is used to create a summarized database file that also uses run date in its name. The summarized database is used to create the group.html web page that includes a list of all storage users. Scripts that query the database folder to build a list of available databases generate two additional web pages. A master script that is run monthly as part of a cron job, after the accounting report has completed, manages all of these individual scripts. All scripts are written in the PERL programming language. Whenever possible data manipulation scripts are written as filters. All scripts are written to be single source, which means they will function properly on both the open and closed networks at LLNL. The master script handles the command line inputs for all scripts, file transfers to the web server and records run information in a log file.
The rest of the scripts manipulate the accounting data or use the files created to generate HTML pages. Each script will be described in detail herein. The following is a brief description of HPSS taken directly from an HPSS web site. "HPSS is a major development project, which began in 1993 as a Cooperative Research and Development Agreement (CRADA) between government and industry. The primary objective of HPSS is to move very large data objects between high performance computers, workstation clusters, and storage libraries at speeds many times faster than is possible with today's software systems. For example, HPSS can manage parallel data transfers from multiple network-connected disk arrays at rates greater than 1 Gbyte per second, making it possible to access high definition digitized video in real time." The HPSS accounting report is a canned report whose format is controlled by the HPSS developers.
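The parsing step described above (regular expressions verify which report lines are storage data, which are then summarized per user) can be sketched in Python rather than the scripts' PERL. The report layout below is invented, since the actual HPSS accounting format is not shown in the record.

```python
# Sketch of the filter-and-summarize step: a regular expression picks
# storage-data lines out of the canned report; matching lines are rolled up
# into a per-user summary. The report layout here is invented.
import re
from collections import defaultdict

# Hypothetical accounting-report lines: account, file count, bytes stored.
REPORT = """\
HPSS Accounting Report 2001-08-13
acct alice   120   5368709120
acct bob      30   1073741824
acct alice    10    536870912
End of report
"""

LINE_RE = re.compile(r"^acct\s+(\w+)\s+(\d+)\s+(\d+)$")

summary = defaultdict(lambda: {"files": 0, "bytes": 0})
for line in REPORT.splitlines():
    m = LINE_RE.match(line)
    if not m:   # skip headers and anything that is not storage data
        continue
    user, files, nbytes = m.group(1), int(m.group(2)), int(m.group(3))
    summary[user]["files"] += files
    summary[user]["bytes"] += nbytes
```

In the document's pipeline, the detailed records and this summary would each be written to dated database files, which the CGI scripts then render as group.html and graphs.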

  1. Integrating information from disparate sources: the Walter Reed National Surgical Quality Improvement Program Data Transfer Project.

    PubMed

    Nelson, Victoria; Nelson, Victoria Ruth; Li, Fiona; Green, Susan; Tamura, Tomoyoshi; Liu, Jun-Min; Class, Margaret

    2008-11-06

    The Walter Reed National Surgical Quality Improvement Program Data Transfer web module integrates with medical and surgical information systems, and leverages outside standards, such as the National Library of Medicine's RxNorm, to process surgical and risk assessment data. Key components of the project included a needs assessment with nurse reviewers and a data analysis for federated (standards were locally controlled) data sources. The resulting interface streamlines nurse reviewer workflow by integrating related tasks and data.

  2. 48 CFR 252.232-7006 - Wide Area WorkFlow Payment Instructions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...— (1) Have a designated electronic business point of contact in the System for Award Management at... submission. Document submissions may be via Web entry, Electronic Data Interchange, or File Transfer Protocol... that uniquely identifies a unit, activity, or organization. Document type means the type of payment...

  3. 48 CFR 252.232-7006 - Wide Area WorkFlow Payment Instructions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...— (1) Have a designated electronic business point of contact in the System for Award Management at... submission. Document submissions may be via Web entry, Electronic Data Interchange, or File Transfer Protocol... that uniquely identifies a unit, activity, or organization. Document type means the type of payment...

  4. Variable nutrient stoichiometry (carbon:nitrogen:phosphorus) across trophic levels determines community and ecosystem properties in an oligotrophic mangrove system.

    PubMed

    Scharler, U M; Ulanowicz, R E; Fogel, M L; Wooller, M J; Jacobson-Meyers, M E; Lovelock, C E; Feller, I C; Frischer, M; Lee, R; McKee, K; Romero, I C; Schmit, J P; Shearer, C

    2015-11-01

    Our study investigated the carbon:nitrogen:phosphorus (C:N:P) stoichiometry of mangrove islands of the Mesoamerican Barrier Reef (Twin Cays, Belize). The C:N:P of abiotic and biotic components of this oligotrophic ecosystem was measured and served to build networks of nutrient flows for three distinct mangrove forest zones (tall seaward fringing forest, inland dwarf forests and a transitional zone). Between forest zones, the stoichiometry of primary producers, heterotrophs and abiotic components did not change significantly, but there was a significant difference in C:N:P, and C, N, and P biomass, between the functional groups mangrove trees, other primary producers, heterotrophs, and abiotic components. C:N:P decreased with increasing trophic level. Nutrient recycling in the food webs was highest for P, and high transfer efficiencies between trophic levels of P and N also indicated an overall shortage of these nutrients when compared to C. Heterotrophs were sometimes, but not always, limited by the same nutrient as the primary producers. Mangrove trees and the primary tree consumers were P limited, whereas the invertebrates consuming leaf litter and detritus were N limited. Most compartments were limited by P or N (not by C), and the relative depletion rate of food sources was fastest for P. P transfers thus constituted a bottleneck of nutrient transfer on Twin Cays. This is the first comprehensive ecosystem study of nutrient transfers in a mangrove ecosystem, illustrating some mechanisms (e.g. recycling rates, transfer efficiencies) which oligotrophic systems use in order to build up biomass and food webs spanning various trophic levels.

  5. Variable nutrient stoichiometry (carbon:nitrogen:phosphorus) across trophic levels determines community and ecosystem properties in an oligotrophic mangrove system

    USGS Publications Warehouse

    Scharler, U.M.; Ulanowicz, Robert E.; Fogel, M.L.; Wooller, M.J.; Jacobson-Meyers, M.E.; Lovelock, C.E.; Feller, I.C.; Frischer, M.; Lee, R.; Mckee, Karen L.; Romero, I.C.; Schmit, J.P.; Shearer, C.

    2015-01-01

    Our study investigated the carbon:nitrogen:phosphorus (C:N:P) stoichiometry of mangrove islands of the Mesoamerican Barrier Reef (Twin Cays, Belize). The C:N:P of abiotic and biotic components of this oligotrophic ecosystem was measured and served to build networks of nutrient flows for three distinct mangrove forest zones (tall seaward fringing forest, inland dwarf forests and a transitional zone). Between forest zones, the stoichiometry of primary producers, heterotrophs and abiotic components did not change significantly, but there was a significant difference in C:N:P, and C, N, and P biomass, between the functional groups mangrove trees, other primary producers, heterotrophs, and abiotic components. C:N:P decreased with increasing trophic level. Nutrient recycling in the food webs was highest for P, and high transfer efficiencies between trophic levels of P and N also indicated an overall shortage of these nutrients when compared to C. Heterotrophs were sometimes, but not always, limited by the same nutrient as the primary producers. Mangrove trees and the primary tree consumers were P limited, whereas the invertebrates consuming leaf litter and detritus were N limited. Most compartments were limited by P or N (not by C), and the relative depletion rate of food sources was fastest for P. P transfers thus constituted a bottleneck of nutrient transfer on Twin Cays. This is the first comprehensive ecosystem study of nutrient transfers in a mangrove ecosystem, illustrating some mechanisms (e.g. recycling rates, transfer efficiencies) which oligotrophic systems use in order to build up biomass and food webs spanning various trophic levels.

  6. Secure web book to store structural genomics research data.

    PubMed

    Manjasetty, Babu A; Höppner, Klaus; Mueller, Uwe; Heinemann, Udo

    2003-01-01

    Recently established collaborative structural genomics programs aim at significantly accelerating the crystal structure analysis of proteins. These large-scale projects require efficient data management systems to ensure seamless collaboration between different groups of scientists working towards the same goal. Within the Berlin-based Protein Structure Factory, the synchrotron X-ray data collection and the subsequent crystal structure analysis tasks are located at BESSY, a third-generation synchrotron source. To organize file-based communication and data transfer at the BESSY site of the Protein Structure Factory, we have developed the web-based BCLIMS, the BESSY Crystallography Laboratory Information Management System. BCLIMS is a relational data management system powered by MySQL as the database engine and the Apache HTTP Server as the web server. The database interface routines are written in the Python programming language. The software is freely available to academic users. Here we describe the storage, retrieval and manipulation of laboratory information, mainly pertaining to the synchrotron X-ray diffraction experiments and the subsequent protein structure analysis, using BCLIMS.
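A minimal sketch of the storage-and-retrieval idea behind such a LIMS: a relational table of diffraction data sets queried through a thin interface. BCLIMS uses MySQL behind an Apache web server; sqlite3 stands in here, and the schema, project IDs, and column names are assumptions for illustration.

```python
# Hedged sketch of a LIMS-style relational store for diffraction experiments.
# sqlite3 stands in for BCLIMS's MySQL backend; the schema is invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE dataset (
    project TEXT, crystal TEXT, wavelength_a REAL, resolution_a REAL)""")
rows = [
    ("PSF-0042", "xtal-1", 0.9795, 2.1),
    ("PSF-0042", "xtal-2", 0.9795, 1.8),
]
db.executemany("INSERT INTO dataset VALUES (?, ?, ?, ?)", rows)

def best_resolution(project):
    """Retrieve the highest-resolution data set recorded for a project."""
    cur = db.execute(
        "SELECT crystal, MIN(resolution_a) FROM dataset WHERE project = ?",
        (project,))
    return cur.fetchone()

best = best_resolution("PSF-0042")
```

In BCLIMS the equivalent queries sit behind Python server-side routines and are reached through the web interface rather than called directly.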

  7. 78 FR 42775 - CGI Federal, Inc., and Custom Applications Management; Transfer of Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-17

    ... develop applications, Web sites, Web pages, web-based applications and databases, in accordance with EPA policies and related Federal standards and procedures. The Contractor will provide

  8. An easy-to-build remote laboratory with data transfer using the Internet School Experimental System

    NASA Astrophysics Data System (ADS)

    Schauer, František; Lustig, František; Dvořák, Jiří; Ožvoldová, Miroslava

    2008-07-01

    The present state of information and communication technology makes it possible to devise and run computer-based e-laboratories accessible to any user with an Internet connection, equipped with very simple technical means and making full use of web services. Thus, the way is open for a new strategy of physics education with strongly global features, based on experiment and experimentation. We name this strategy integrated e-learning, and remote experiments across the Internet are its foundation. We present both pedagogical and technical reasoning for the remote experiments and outline a simple system based on a server-client approach, web services and Java applets. We give here an outline of the prospective remote laboratory system with data transfer using the Internet School Experimental System (ISES) as hardware and the ISES WEB Control kit as software. This approach enables the simple construction of remote experiments without building any hardware and with virtually no programming, using a copy-and-paste approach with typical prebuilt blocks such as a camera view, controls, graphs, displays, etc. We have set up, and at present operate, seven experiments running round the clock, with more than 12 000 connections since 2005. The experiments are widely used in practical teaching of physics at both university and secondary level. The recording of the detailed steps the experimenter takes during a measurement enables detailed study of the psychological aspects of running the experiments. The system is ready for a network of universities to start covering the basic set of physics experiments. In conclusion, we summarize the results achieved and our experience of using remote experiments built on the ISES hardware system.

  9. Web Transfer Over Satellites Being Improved

    NASA Technical Reports Server (NTRS)

    Allman, Mark

    1999-01-01

    Extensive research conducted by NASA Lewis Research Center's Satellite Networks and Architectures Branch and Ohio University has demonstrated performance improvements in World Wide Web transfers over satellite-based networks. The use of a new version of the Hypertext Transfer Protocol (HTTP) reduced the time required to load web pages over a single Transmission Control Protocol (TCP) connection traversing a satellite channel. However, an older technique of simultaneously making multiple requests of a given server has been shown to provide even faster transfer times. Unfortunately, the use of multiple simultaneous requests has been shown to be harmful to the network in general. Therefore, we are developing new mechanisms for the HTTP protocol which may allow a single request at any given time to perform as well as, or better than, multiple simultaneous requests. In the course of this study, we also demonstrated that the time for web pages to load is at least as short via a satellite link as it is via a standard 28.8-kbps dialup modem channel. This demonstrates that satellites are a viable means of accessing the Internet.

  10. Accelerating Cancer Systems Biology Research through Semantic Web Technology

    PubMed Central

    Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S.

    2012-01-01

    Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute’s caBIG®, so users can not only interact with the DMR through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers’ intellectual property. PMID:23188758

  11. Accelerating cancer systems biology research through Semantic Web technology.

    PubMed

    Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S

    2013-01-01

    Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute's caBIG, so users can interact with the DMR not only through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers' intellectual property. Copyright © 2012 Wiley Periodicals, Inc.

  12. Systems for the Intermodal Routing of Spent Nuclear Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Steven K; Liu, Cheng

    The safe and secure movement of spent nuclear fuel from shutdown and active reactor facilities to intermediate or long term storage sites may, in some instances, require the use of several modes of transportation to accomplish the move. To that end, a fully operable multi-modal routing system is being developed within Oak Ridge National Laboratory's (ORNL) WebTRAGIS (Transportation Routing Analysis Geographic Information System). This study aims to provide an overview of multi-modal routing, the existing state of the TRAGIS networks, the source data needs, and the requirements for developing structural relationships between various modes to create a suitable system for modeling the transport of spent nuclear fuel via a multimodal network. Modern transportation systems are composed of interconnected, yet separate, modal networks. Efficient transportation networks rely upon the smooth transfer of cargoes at junction points that serve as connectors between modes. A key logistical impediment to the shipment of spent nuclear fuel is the absence of identified or designated transfer locations between transport modes. Understanding the potential network impacts on intermodal transportation of spent nuclear fuel is vital for planning transportation routes from origin to destination. By identifying key locations where modes intersect, routing decisions can be made to prioritize cost savings, optimize transport times and minimize potential risks to the population and environment. In order to facilitate such a process, ORNL began the development of a base intermodal network and associated routing code. The network was developed using previous intermodal networks and information from publicly available data sources to construct a database of potential intermodal transfer locations with likely capability to handle spent nuclear fuel casks.
The coding development focused on modifying the existing WebTRAGIS routing code to accommodate intermodal transfers and the selection of prioritization constraints and modifiers to determine route selection. The limitations of the current model and future directions for development are discussed, including the current state of information on possible intermodal transfer locations for spent fuel.

  13. A radiology department intranet: development and applications.

    PubMed

    Willing, S J; Berland, L L

    1999-01-01

    An intranet is a "private Internet" that uses the protocols of the World Wide Web to share information resources within a company or with the company's business partners and clients. The hardware requirements for an intranet begin with a dedicated Web server permanently connected to the departmental network. The heart of a Web server is the hypertext transfer protocol (HTTP) service, which receives a page request from a client's browser and transmits the page back to the client. Although knowledge of hypertext markup language (HTML) is not essential for authoring a Web page, a working familiarity with HTML is useful, as is knowledge of programming and database management. Security can be ensured by using scripts to write information in hidden fields or by means of "cookies." Interfacing databases and database management systems with the Web server and conforming the user interface to HTML syntax can be achieved by means of the common gateway interface (CGI), Active Server Pages (ASP), or other methods. An intranet in a radiology department could include the following types of content: on-call schedules, work schedules and a calendar, a personnel directory, resident resources, memorandums and discussion groups, software for a radiology information system, and databases.
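The CGI mechanism mentioned above, in miniature: a script reads request parameters and writes back a dynamically generated HTML page backed by a database. The personnel lookup, field names, and data below are invented for illustration; a real departmental intranet would query its directory database instead.

```python
# Sketch of CGI-style dynamic page generation: request parameters in,
# HTML document out. The personnel "database" and field names are invented.
from urllib.parse import parse_qs
from html import escape

PERSONNEL = {"bjones": "B. Jones, MD", "lsmith": "L. Smith, RT"}

def render(query_string):
    """What a CGI script might write back for GET /cgi-bin/person?id=..."""
    params = parse_qs(query_string)
    uid = params.get("id", [""])[0]
    name = PERSONNEL.get(uid, "not found")
    return (
        "Content-Type: text/html\r\n\r\n"
        f"<html><body><p>{escape(name)}</p></body></html>"
    )

page = render("id=bjones")
```

ASP and other server-side approaches the article lists do the same job with different plumbing: map a request plus a database row onto HTML.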

  14. An indicator-based evaluation of Black Sea food web dynamics during 1960-2000

    NASA Astrophysics Data System (ADS)

    Akoglu, Ekin; Salihoglu, Baris; Libralato, Simone; Oguz, Temel; Solidoro, Cosimo

    2014-06-01

    Four Ecopath mass-balance models were implemented for evaluating the structure and function of the Black Sea ecosystem using several ecological indicators during four distinctive periods (1960s, 1980-1987, 1988-1994 and 1995-2000). The results exemplify how the Black Sea ecosystem structure started to change after the 1960s as a result of a series of trophic transformations, i.e., shifts in the energy flow pathways through the food web. These transformations were initiated by anthropogenic factors, such as eutrophication and overfishing, that led to the transfer of large quantities of energy to the trophic dead-end species, which had no natural predators in the ecosystem, i.e., jellyfish whose biomass increased from 0.03 g C m⁻² in 1960-1969 to 0.933 g C m⁻² in 1988-1994. Concurrently, an alternative short pathway for energy transfer was formed that converted significant amounts of system production back to detritus. This decreased the transfer efficiency of energy flow from the primary producers to the higher trophic levels from 9% in the 1960s to 3% between 1980 and 1987. We conclude that the anchovy stock collapse and successful establishment of the alien comb-jelly Mnemiopsis in 1989 were rooted in the trophic interactions in the food web, all of which were exacerbated because of the long-term establishment of a combination of anthropogenic stressors.
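The quoted efficiencies (9% falling to 3%) are ratios of energy flow between successive trophic levels. A small numerical illustration with invented flows shows the calculation; the figures below are not the study's data.

```python
# Illustrative calculation of mean trophic transfer efficiency from energy
# flows between successive levels. The flow values are invented, chosen only
# to reproduce the 9% and 3% figures quoted in the abstract.
def transfer_efficiency(flows):
    """flows[i] = energy entering trophic level i+1 (arbitrary units),
    starting from primary production. Returns mean per-step efficiency."""
    steps = [flows[i + 1] / flows[i] for i in range(len(flows) - 1)]
    return sum(steps) / len(steps)

eff_1960s = transfer_efficiency([1000.0, 90.0, 8.1])   # ~9% per step
eff_1980s = transfer_efficiency([1000.0, 30.0, 0.9])   # ~3% per step
```

A short detritus loop, as the abstract describes, lowers these ratios by diverting production out of the upward chain before it reaches the next level.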

  15. Clinicians' expectations of Web 2.0 as a mechanism for knowledge transfer of stroke best practices.

    PubMed

    David, Isabelle; Poissant, Lise; Rochette, Annie

    2012-09-13

    Health professionals are increasingly encouraged to adopt an evidence-based practice to ensure greater efficiency of their services. To promote this practice, several strategies exist: distribution of educational materials, local consensus processes, educational outreach visits, local opinion leaders, and reminders. Despite these strategies, gaps continue to be observed between practice and scientific evidence. It is therefore important to implement innovative knowledge transfer strategies that will change health professionals' practices. Through their interactive capacities, Web 2.0 applications are worth exploring; virtual communities of practice, for example, have already begun to influence professional practice. This study was initially developed to help design a Web 2.0 platform for health professionals working with stroke patients. The aim was to gain a better understanding of professionals' perceptions of Web 2.0 before the development of the platform. A qualitative study following a phenomenological approach was chosen. We conducted individual semi-structured interviews with clinicians and managers, and interview transcripts were subjected to a content analysis. Twenty-four female clinicians and managers in Quebec, Canada, aged 28-66, participated. Most participants identified knowledge transfer as the most useful outcome of a Web 2.0 platform. Respondents also expressed their need for a user-friendly platform. Accessibility to a computer and the Internet, features of the Web 2.0 platform, user support, technology skills, and previous technological experience were found to influence perceived ease of use and usefulness. Our results show that health professionals' perceived lack of time influences their behavioral intention to use such a platform, despite favorable perceptions of its usefulness. In conclusion, female health professionals in Quebec believe that Web 2.0 may be a useful mechanism for knowledge transfer. However, lack of time and lack of technological skills may limit their use of a future Web 2.0 platform. Further studies are required with other populations and in other regions to confirm these findings.

  16. Increasing efficiency of information dissemination and collection through the World Wide Web

    Treesearch

    Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott

    2000-01-01

    Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...

  17. Reliable file sharing in distributed operating system using WebRTC

    NASA Astrophysics Data System (ADS)

    Dukiya, Rajesh

    2017-12-01

    Since the evolution of distributed operating systems, the distributed file system has become an important part of the operating system, and P2P is a reliable approach to file sharing in such systems. Introduced in 1999, P2P later became a topic of high research interest. A peer-to-peer network is one in which peers share the network workload and related tasks; it can be as temporary as a group of computers connected through USB (Universal Serial Bus) ports to enable disk or file sharing. Currently, P2P requires a special network designed in a P2P fashion. Browsers, meanwhile, now have a large influence on everyday computing. In this project we study file-sharing mechanisms for distributed operating systems in web browsers, identify performance bottlenecks, and aim to improve the performance and scalability of file sharing in distributed file systems. Additionally, we discuss the scope of WebTorrent file sharing and free-riding in peer-to-peer networks.
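WebRTC data channels cap individual message sizes (commonly near 16 KiB in practice), so browser file sharing must split a file into ordered chunks and reassemble them on the receiving peer. The chunking logic, independent of the transport, can be sketched like this; the chunk size and mock payload are assumptions:

```python
CHUNK_SIZE = 16 * 1024  # assumed per-message cap typical of WebRTC data channels

def chunk(data: bytes, size: int = CHUNK_SIZE):
    """Split a file's bytes into ordered chunks suitable for a data channel."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def reassemble(chunks):
    """Rejoin chunks received in order into the original file bytes."""
    return b"".join(chunks)

payload = bytes(range(256)) * 200       # 51,200-byte mock "file"
parts = chunk(payload)
assert len(parts) == 4                  # ceil(51200 / 16384) chunks
assert reassemble(parts) == payload     # lossless round trip
```

A real implementation would also number chunks and acknowledge receipt, since ordering and flow control are where the reliability questions the abstract raises actually arise.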

  18. Drivers of nitrogen transfer in stream food webs across continents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norman, Beth C.; Whiles, Matt R.; Collins, Sarah M.

    Studies of trophic-level material and energy transfers are central to ecology. The use of isotopic tracers has now made it possible to measure trophic transfer efficiencies of important nutrients and to better understand how these materials move through food webs. We analyzed data from thirteen ¹⁵N-ammonium tracer addition experiments to quantify N transfer from basal resources to animals in headwater streams with varying physical, chemical, and biological features. N transfer efficiencies from primary uptake compartments (PUCs; heterotrophic microorganisms and primary producers) to primary consumers were lower (mean: 11.5%, range: <1% to 43%) than N transfer efficiencies from primary consumers to predators (mean: 80%, range: 5% to >100%). Total N transferred (as a rate) was greater in streams with open canopies than in those with closed canopies, and overall N transfer efficiency generally followed a similar pattern, although the difference was not statistically significant. We used principal component analysis to condense a suite of site characteristics into two environmental components. Total N uptake rates among trophic levels were best predicted by the component that was correlated with latitude, DIN:SRP, GPP:ER, and % canopy cover. N transfer efficiency did not respond consistently to environmental variables. Our results suggest that canopy cover influences N movement through stream food webs because light availability and primary production facilitate N transfer to higher trophic levels.

  19. Drivers of nitrogen transfer in stream food webs across continents

    DOE PAGES

    Norman, Beth C.; Whiles, Matt R.; Collins, Sarah M.; ...

    2017-10-25

    Studies of trophic-level material and energy transfers are central to ecology. The use of isotopic tracers has now made it possible to measure trophic transfer efficiencies of important nutrients and to better understand how these materials move through food webs. We analyzed data from thirteen ¹⁵N-ammonium tracer addition experiments to quantify N transfer from basal resources to animals in headwater streams with varying physical, chemical, and biological features. N transfer efficiencies from primary uptake compartments (PUCs; heterotrophic microorganisms and primary producers) to primary consumers were lower (mean: 11.5%, range: <1% to 43%) than N transfer efficiencies from primary consumers to predators (mean: 80%, range: 5% to >100%). Total N transferred (as a rate) was greater in streams with open canopies than in those with closed canopies, and overall N transfer efficiency generally followed a similar pattern, although the difference was not statistically significant. We used principal component analysis to condense a suite of site characteristics into two environmental components. Total N uptake rates among trophic levels were best predicted by the component that was correlated with latitude, DIN:SRP, GPP:ER, and % canopy cover. N transfer efficiency did not respond consistently to environmental variables. Our results suggest that canopy cover influences N movement through stream food webs because light availability and primary production facilitate N transfer to higher trophic levels.

  20. Drivers of nitrogen transfer in stream food webs across continents.

    PubMed

    Norman, Beth C; Whiles, Matt R; Collins, Sarah M; Flecker, Alexander S; Hamilton, Steve K; Johnson, Sherri L; Rosi, Emma J; Ashkenas, Linda R; Bowden, William B; Crenshaw, Chelsea L; Crowl, Todd; Dodds, Walter K; Hall, Robert O; El-Sabaawi, Rana; Griffiths, Natalie A; Marti, Eugènia; McDowell, William H; Peterson, Scot D; Rantala, Heidi M; Riis, Tenna; Simon, Kevin S; Tank, Jennifer L; Thomas, Steven A; von Schiller, Daniel; Webster, Jackson R

    2017-12-01

    Studies of trophic-level material and energy transfers are central to ecology. The use of isotopic tracers has now made it possible to measure trophic transfer efficiencies of important nutrients and to better understand how these materials move through food webs. We analyzed data from thirteen ¹⁵N-ammonium tracer addition experiments to quantify N transfer from basal resources to animals in headwater streams with varying physical, chemical, and biological features. N transfer efficiencies from primary uptake compartments (PUCs; heterotrophic microorganisms and primary producers) to primary consumers were lower (mean 11.5%, range <1% to 43%) than N transfer efficiencies from primary consumers to predators (mean 80%, range 5% to >100%). Total N transferred (as a rate) was greater in streams with open canopies than in those with closed canopies, and overall N transfer efficiency generally followed a similar pattern, although the difference was not statistically significant. We used principal component analysis to condense a suite of site characteristics into two environmental components. Total N uptake rates among trophic levels were best predicted by the component that was correlated with latitude, DIN:SRP, GPP:ER, and percent canopy cover. N transfer efficiency did not respond consistently to environmental variables. Our results suggest that canopy cover influences N movement through stream food webs because light availability and primary production facilitate N transfer to higher trophic levels. © 2017 by the Ecological Society of America.
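The principal component step mentioned above, condensing a matrix of site characteristics into two environmental components, can be sketched with a plain SVD-based PCA. The site matrix here is random mock data standing in for the study's measurements:

```python
import numpy as np

def pca_components(X: np.ndarray, k: int = 2) -> np.ndarray:
    """Project each site onto the first k principal components."""
    Xc = X - X.mean(axis=0)                        # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # scores on the first k components

# Mock matrix: rows = streams, cols = latitude, DIN:SRP, GPP:ER, % canopy cover
rng = np.random.default_rng(0)
X = rng.normal(size=(13, 4))                       # 13 experiments, 4 characteristics
scores = pca_components(X, k=2)
assert scores.shape == (13, 2)
```

Each stream's two scores could then serve as predictors for N uptake rates, which is how the abstract describes the components being used.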

  1. [Development of a secure and cost-effective infrastructure for the access of arbitrary web-based image distribution systems].

    PubMed

    Hackländer, T; Kleber, K; Schneider, H; Demabre, N; Cramer, B M

    2004-08-01

    To build an infrastructure that gives radiologists on call and external users teleradiological access via the Internet to the HTML-based image distribution system inside the hospital. In addition, no investment costs should arise on the user side, and the image data should be pseudonymized and transferred using cryptographic techniques. A pure HTML-based system manages the image distribution inside the hospital, and an open-source project extends this system through a secure gateway outside the hospital's firewall. The gateway handles the communication between the external users and the HTML server within the hospital network. A second firewall is installed between the gateway and the external users and builds up a virtual private network (VPN). A connection between the gateway and an external user is established only if the computers involved authenticate each other via certificates and the external user authenticates via a multi-stage password system. All data are transferred encrypted. External users are granted access only to images that have previously been renamed to a pseudonym by automated processing. With an ADSL Internet connection, external users achieve an image load rate of 0.4 CT images per second. More than 90% of the delay during image transfer results from security checks within the firewalls; data passing the gateway incur no measurable delay. The project goals were realized by means of an infrastructure that works vendor-independently with any HTML-based image distribution system. The data security requirements were met using state-of-the-art web techniques. Adequate access and transfer speed have led to widespread acceptance of the system among external users.
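The automated renaming step can be sketched with a keyed hash: the same patient always maps to the same pseudonym, but the mapping cannot be reversed without the key. The paper does not specify its pseudonymization algorithm, so the HMAC scheme, key, and ID format below are assumptions:

```python
import hashlib
import hmac

SECRET = b"site-key"  # hypothetical per-site key, kept inside the hospital network

def pseudonym(patient_id: str) -> str:
    """Derive a stable, non-reversible pseudonym for an image's patient ID."""
    return hmac.new(SECRET, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

p1 = pseudonym("DOE^JOHN^1965")
assert p1 == pseudonym("DOE^JOHN^1965")   # deterministic: same patient, same alias
assert p1 != pseudonym("DOE^JANE^1970")   # distinct patients stay distinct
```

Keyed hashing rather than a plain hash matters here: without the key, an attacker who can guess candidate patient names cannot confirm them against the pseudonyms.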

  2. WEB-BASED MODELING OF A FERTILIZER SOLUTION SPILL IN THE OHIO RIVER

    EPA Science Inventory

    Environmental computer models are usually desktop models. Some web-enabled models are beginning to appear where the user can use a browser to run the models on a central web server. Several issues arise when a desktop model is transferred to a web architecture. This paper discuss...

  3. RPM-WEBBSYS: A web-based computer system to apply the rational polynomial method for estimating static formation temperatures of petroleum and geothermal wells

    NASA Astrophysics Data System (ADS)

    Wong-Loya, J. A.; Santoyo, E.; Andaverde, J. A.; Quiroz-Ruiz, A.

    2015-12-01

    A Web-Based Computer System (RPM-WEBBSYS) has been developed for applying the Rational Polynomial Method (RPM) to estimate static formation temperatures (SFT) of geothermal and petroleum wells. The system is also capable of reproducing the full thermal recovery process that occurs during well completion. RPM-WEBBSYS has been programmed using advances in information technology to compute SFT more efficiently. It can be readily executed from any computing device with Internet access and a web browser (e.g., personal computers and portable devices such as tablets or smartphones). The computer system was validated using bottomhole temperature (BHT) measurements logged in a synthetic heat transfer experiment, where good agreement between predicted and true SFT was achieved. RPM-WEBBSYS was finally applied to BHT logs collected during well drilling and shut-in operations, where the typical under- and over-estimation of SFT exhibited by most existing analytical methods was effectively corrected.
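The idea behind a rational polynomial fit of a thermal recovery log can be sketched with a first-order model T(t) ≈ (a0 + a1·t)/(1 + b1·t), whose asymptote a1/b1 as shut-in time grows estimates the SFT. The paper's actual RPM formulation is more general; the model order, coefficients, and synthetic log below are invented for illustration:

```python
import numpy as np

def fit_rational(t, T):
    """Fit T(t) ~ (a0 + a1*t) / (1 + b1*t) by linearized least squares.

    Multiplying out gives T = a0 + a1*t - b1*t*T, which is linear in
    (a0, a1, b1). As t -> infinity the model tends to a1/b1, the SFT estimate.
    """
    A = np.column_stack([np.ones_like(t), t, -t * T])
    a0, a1, b1 = np.linalg.lstsq(A, T, rcond=None)[0]
    return a0, a1, b1

# Synthetic BHT recovery log approaching an SFT of 120 °C
t = np.linspace(1.0, 30.0, 12)                   # hours after circulation stops
T = (80.0 + 120.0 * 0.3 * t) / (1.0 + 0.3 * t)   # exact rational curve, a1/b1 = 120
a0, a1, b1 = fit_rational(t, T)
assert abs(a1 / b1 - 120.0) < 1e-3               # recovered SFT
```

Because the curve extrapolates to its asymptote, the method can estimate SFT from early shut-in measurements without waiting for full thermal recovery.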

  4. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    PubMed

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

    nmrshiftdb2 supports with its laboratory information management system the integration of an electronic lab administration and management into academic NMR facilities. Also, it offers the setup of a local database, while full access to nmrshiftdb2's World Wide Web database is granted. This freely available system allows on the one hand the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via web interface, as well as the integrated access to prediction, search, and assignment tools of the NMR database for lab users. On the other hand, for the staff and lab administration, flow of all orders can be supervised; administrative tools also include user and hardware management, a statistic functionality for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. Laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  5. Real-Time Payload Control and Monitoring on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)

    1998-01-01

    World Wide Web (W3) technologies such as the Hypertext Transfer Protocol (HTTP) and the Java object-oriented programming environment offer a powerful, yet relatively inexpensive, framework for distributed application software development. This paper describes the design of a real-time payload control and monitoring system that was developed with W3 technologies at NASA Ames Research Center. Based on the Java Development Kit (JDK) 1.1, the system uses an event-driven "publish and subscribe" approach to inter-process communication and graphical user-interface construction. A C Language Integrated Production System (CLIPS) compatible inference engine provides the back-end intelligent data processing capability, while the Oracle Relational Database Management System (RDBMS) provides the data management function. Preliminary evaluation shows acceptable performance for some classes of payloads, with Java's portability and multimedia support identified as the most significant benefits.
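The event-driven "publish and subscribe" pattern the abstract mentions decouples telemetry producers from the user-interface components that display them. A minimal sketch (in Python rather than the system's Java; the topic name and payload are invented):

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe hub: producers and consumers never meet."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to be invoked for every event on `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        """Deliver `payload` to every subscriber of `topic`."""
        for callback in self._subscribers[topic]:
            callback(payload)

bus = EventBus()
readings = []
bus.subscribe("payload/temperature", readings.append)  # e.g. a GUI gauge widget
bus.publish("payload/temperature", 21.7)               # telemetry arrives
assert readings == [21.7]
```

The benefit for GUI construction is that new displays subscribe to existing topics without any change to the telemetry-producing code.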

  6. Bringing simulation to engineers in the field: a Web 2.0 approach.

    PubMed

    Haines, Robert; Khan, Kashif; Brooke, John

    2009-07-13

    Field engineers working on water distribution systems have to implement day-to-day operational decisions. Since pipe networks are highly interconnected, the effects of such decisions are correlated with hydraulic and water quality conditions elsewhere in the network. This makes the provision of predictive decision support tools (DSTs) for field engineers critical to optimizing the engineering work on the network. We describe how we created DSTs to run on lightweight mobile devices by using the Web 2.0 technique known as Software as a Service. We designed our system following the architectural style of representational state transfer. The system not only displays static geographical information system data for pipe networks, but also dynamic information and prediction of network state, by invoking and displaying the results of simulations running on more powerful remote resources.

  7. Higher Education Web Portals: Serving State and Student Transfer Needs. Research Brief

    ERIC Educational Resources Information Center

    McGill, Mollie

    2010-01-01

    Students need access to web-based resources where they can easily learn about the transfer options available to them--so they can save time and money as they strive to meet their educational goals. This study, conducted by the Western Interstate Commission for Higher Education Cooperative for Educational Technologies (WCET), is one component of…

  8. PLEs in Higher Education: Exploring the Transference of Web 2.0 Social Affordances

    ERIC Educational Resources Information Center

    Casquero, Oskar; Portillo, Javier; Ovelar, Ramón; Romo, Jesús; Benito, Manuel

    2013-01-01

    Knowing whether Personal Learning Environments (PLEs) could transfer Web 2.0 affordances, which have been focused on the non-educational or recreational sphere, to the institutional sphere is important to move the research agenda beyond "cool uses" and to understand how the learning process is affected when students use this new type of…

  9. Globe Teachers Guide and Photographic Data on the Web

    NASA Technical Reports Server (NTRS)

    Kowal, Dan

    2004-01-01

    The task of managing the GLOBE Online Teacher's Guide during this time period focused on transforming the technology behind the delivery system of this document. The web application was transformed from a flat-file retrieval system to a dynamic database access approach. The new methodology utilizes Java Server Pages (JSP) on the front end and an Oracle relational database on the back end. This new approach allows users of the web site, mainly teachers, to access content efficiently by grade level and/or by investigation or educational concept area. Moreover, teachers can gain easier access to data sheets and lab and field guides. The new online guide also included updated content for all GLOBE protocols. The GLOBE web management team was given documentation for maintaining the new application. Instructions for modifying the JSP templates and managing database content were included in this document. It was delivered to the team by the end of October, 2003. The National Geophysical Data Center (NGDC) continued to manage the school study site photos on the GLOBE website. 333 study site photo images were added to the GLOBE database and posted on the web during this same time period for 64 schools. Documentation for processing study site photos was also delivered to the new GLOBE web management team. Lastly, assistance was provided in transferring reference applications such as the Cloud and LandSat quizzes and Earth Systems Online Poster from NGDC servers to GLOBE servers, along with documentation for maintaining these applications.

  10. Enhanced transfer of organic matter to higher trophic levels caused by ocean acidification and its implications for export production: A mass balance approach.

    PubMed

    Boxhammer, Tim; Taucher, Jan; Bach, Lennart T; Achterberg, Eric P; Algueró-Muñiz, María; Bellworthy, Jessica; Czerny, Jan; Esposito, Mario; Haunost, Mathias; Hellemann, Dana; Ludwig, Andrea; Yong, Jaw C; Zark, Maren; Riebesell, Ulf; Anderson, Leif G

    2018-01-01

    Ongoing acidification of the ocean through uptake of anthropogenic CO2 is known to affect marine biota and ecosystems with largely unknown consequences for marine food webs. Changes in food web structure have the potential to alter trophic transfer, partitioning, and biogeochemical cycling of elements in the ocean. Here we investigated the impact of realistic end-of-the-century CO2 concentrations on the development and partitioning of the carbon, nitrogen, phosphorus, and silica pools in a coastal pelagic ecosystem (Gullmar Fjord, Sweden). We covered the entire winter-to-summer plankton succession (100 days) in two sets of five pelagic mesocosms, with one set being CO2 enriched (~760 μatm pCO2) and the other one left at ambient CO2 concentrations. Elemental mass balances were calculated and we highlight important challenges and uncertainties we have faced in the closed mesocosm system. Our key observations under high CO2 were: (1) A significantly amplified transfer of carbon, nitrogen, and phosphorus from primary producers to higher trophic levels, during times of regenerated primary production. (2) A prolonged retention of all three elements in the pelagic food web that significantly reduced nitrogen and phosphorus sedimentation by about 11 and 9%, respectively. (3) A positive trend in carbon fixation (relative to nitrogen) that appeared in the particulate matter pool as well as the downward particle flux. This excess carbon counteracted a potential reduction in carbon sedimentation that could have been expected from patterns of nitrogen and phosphorus fluxes. Our findings highlight the potential for ocean acidification to alter partitioning and cycling of carbon and nutrients in the surface ocean, but also show that impacts are temporally variable and likely depend upon the structure of the plankton food web.

  11. Water Quality Exchange Web Template User Guide

    EPA Pesticide Factsheets

    This is a step by step guide to using the WQX Web Monitoring Data Entry Template for Physical/Chemical data to prepare your data for import into the WQX Web tool, and subsequent transfer to the STORET Data Warehouse.

  12. Innovating the Standard Procurement System Utilizing Intelligent Agent Technologies

    DTIC Science & Technology

    1999-12-01

    Excerpt from the report's table of contents: Standard Procurement System (overview, SPS functions, SPS advantages, SPS disadvantages, SPS summary); procurement process innovation results; Intelligent Agent (IA) technology (overview, advantages, disadvantages); and electronic commerce tools including the Electronic Mall (EMALL), GSA Advantage, web invoicing, Electronic Funds Transfer (EFT), and the International Merchant Purchase Authorization Card (IMPAC).

  13. Pathfinder. Volume 8, Number 6, November/December 2010

    DTIC Science & Technology

    2010-12-01

    Excerpts: "... transferring information between multiple systems. Nevertheless, without an end-to-end TCPED process and the associated standards, policies and equipment in ... products with partners whose information technology systems vary and are not compatible with those of the NSG, NGA and the U.S. Department of ... Pacific. ARF DReaMS is based on Web service technology, where traditional maps, data and any relevant geospatial information are made available ..."

  14. Organic Carbon Sources and their Transfer in a Gulf of Mexico Coral Reef Ecosystem under River Influence

    NASA Astrophysics Data System (ADS)

    Parrish, C.; Carreón-Palau, L.; del Ángel-Rodríguez, J.; Pérez-España, H.; Aguíñiga-García, S.

    2016-02-01

    To assess the degree to which coral reefs in a marine protected area have been influenced by terrestrial and anthropogenic organic carbon inputs, we used C and N stable isotopes and lipid biomarkers in the Coral Reef System of Veracruz in the southwest Gulf of Mexico. A C and N stable isotope mixing model and a calculated fatty acid (FA) retention factor revealed the primary producer sources that fuel the coral reef food web. Lipid classes and FA and sterol biomarkers then determined the production of terrestrial and marine biogenic material of nutritional quality to pelagic and benthic organisms. Finally, coprostanol determined pollutant loading from sewage in the suspended particulate matter. Results indicate that phytoplankton is the major source of essential FA for fish and that dietary energy from terrestrial sources such as mangroves is transferred to juvenile fish, while sea grass non-essential FA are transferred to the entire food web. Sea urchins may be the main consumers of brown macroalgae, while surgeonfish prefer red algae. C and N isotopic values and the C:N ratio suggest that fertilizer is the principal source of nitrogen to macroalgae. This nitrogen supply also favored phytoplankton and sea grass growth, leading to a better nutritional condition and high retention of organic carbon among food web members during the rainy season, when river influence increases. However, the nutritional condition of the great star coral Montastrea cavernosa decreased significantly. The river nearest to the Reef System was polluted in the dry season; a dilution effect was detected in the rainy season, when some coral reefs were contaminated. In 2013, a new treatment plant started working in the area. We suggest monitoring δ15N and the C:N ratio in macroalgae as indicators of nitrogen input, and coprostanol as an indicator of human fecal pollution, in order to verify the efficiency of the new treatment plant as part of the management program of the Reef System.

  15. Design and development of a tele-healthcare information system based on web services and HL7 standards.

    PubMed

    Huang, Ean-Wen; Hung, Rui-Suan; Chiou, Shwu-Fen; Liu, Fei-Ying; Liou, Der-Ming

    2011-01-01

    Information and communication technologies progress rapidly and many novel applications have been developed in many domains of human life. In recent years, the demand for healthcare services has been growing because of the increase in the elderly population. Consequently, a number of healthcare institutions have focused on creating technologies to reduce extraneous work and improve the quality of service. In this study, an information platform for tele-healthcare services was implemented. The architecture of the platform included a web-based application server and a client system. The client system was able to retrieve the blood pressure and glucose levels of a patient stored in measurement instruments through Bluetooth wireless transmission. The web application server assisted staff and clients in analyzing the health conditions of patients. In addition, the server provided face-to-face communications and instructions through remote video devices. The platform deployed a service-oriented architecture, which consisted of HL7 standard messages and web service components. The platform could transfer health records into the HL7 Clinical Document Architecture (CDA) standard for data exchange with other organizations. The prototype system was pretested and evaluated in a hospital homecare department and a community management center for chronic disease monitoring. Based on the results of this study, this system is expected to improve the quality of healthcare services.
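Transferring a device reading into an exchangeable XML document, the role HL7 CDA plays above, can be sketched with the standard library. The element and attribute names below are illustrative only; a real HL7 CDA document must follow the published schema and templates:

```python
import xml.etree.ElementTree as ET

def blood_pressure_fragment(systolic: int, diastolic: int) -> str:
    """Build a simplified, CDA-like XML fragment for one blood-pressure reading.

    Element names are invented for illustration; they are not the real
    CDA vocabulary.
    """
    obs = ET.Element("observation")
    ET.SubElement(obs, "code", displayName="Blood pressure")
    ET.SubElement(obs, "systolic", unit="mm[Hg]").text = str(systolic)
    ET.SubElement(obs, "diastolic", unit="mm[Hg]").text = str(diastolic)
    return ET.tostring(obs, encoding="unicode")

fragment = blood_pressure_fragment(120, 80)
assert '<systolic unit="mm[Hg]">120</systolic>' in fragment
```

The point of standardizing the document shape is the last step in the abstract: another organization's system can parse the reading without knowing anything about the measuring device.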

  16. Web-based data collection: detailed methods of a questionnaire and data gathering tool

    PubMed Central

    Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R

    2006-01-01

    There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and to describe the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes data instrument design, data entry and management, and data tables needed to store the results that attempt to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556

  18. Interrogating Host-virus Interactions and Elemental Transfer Using NanoSIMS

    NASA Astrophysics Data System (ADS)

    Pasulka, A.; Thamatrakoln, K.; Poulos, B.; Bidle, K. D.; Sullivan, M. B.; Orphan, V. J.

    2016-02-01

    Marine viruses (bacteriophage and eukaryotic viruses) impact microbial food webs by influencing microbial community structure, carbon and nutrient flow, and serving as agents of gene transfer. While the collective impact of viral activity has become more apparent over the last decade, there is a growing need for single-cell and single-virus level measurements of the associated carbon and nitrogen transfer, which ultimately shape the biogeochemical impact of viruses in the upper ocean. Stable isotopes have been used extensively for understanding trophic relationships and elemental cycling in marine food webs. While single-cell isotope approaches such as nanoscale secondary ion mass spectrometry (nanoSIMS) have been more readily used to study trophic interactions between microorganisms, isotopic enrichment in viruses has not been described. Here we used nanoSIMS to quantify the transfer of stable isotopes (¹³C and ¹⁵N) from host to individual viral particles in two distinct unicellular algal-virus model systems. These model systems represent a eukaryotic phytoplankton (Emiliania huxleyi strain CCMP374) and its 200 nm coccolithovirus (EhV207), as well as a cyanobacterial phytoplankton (Synechococcus WH8101) and its 80 nm virus (Syn1). Host cells were grown on labeled media for multiple generations, subjected to viral infection, and then viruses were harvested after lysis. In both cases, nanoSIMS measurements were able to detect ¹³C and ¹⁵N in the resulting viral particles significantly above the background noise. The isotopic enrichment in the viral particles mirrored that of the host. Through use of these laboratory model systems, we quantified the sensitivity (ion counts), spatial resolution, and reproducibility, including sources of methodological and biological variability, in stable isotope incorporation into viral particles. Our findings suggest that nanoSIMS can be successfully employed to directly probe virus-host interactions at the resolution of individual viral particles and quantify the amount of carbon and nitrogen transferred into viruses during infection of autotrophic phytoplankton.

  19. Benthic and Pelagic Pathways of Methylmercury Bioaccumulation in Estuarine Food Webs of the Northeast United States

    PubMed Central

    Chen, Celia Y.; Borsuk, Mark E.; Bugge, Deenie M.; Hollweg, Terill; Balcom, Prentiss H.; Ward, Darren M.; Williams, Jason; Mason, Robert P.

    2014-01-01

    Methylmercury (MeHg) is a contaminant of global concern that bioaccumulates and biomagnifies in marine food webs. Lower trophic level fauna are important conduits of MeHg from sediment and water to estuarine and coastal fish harvested for human consumption. However, the sources and pathways of MeHg to these coastal fisheries are poorly known, particularly the potential for transfer of MeHg from the sediment to biotic compartments. Across a broad gradient of human land impacts, we analyzed MeHg concentrations in food webs at ten estuarine sites in the Northeast US (from the Hackensack Meadowlands, NJ, to the Gulf of Maine). MeHg concentrations in water column particulate material, but not in sediments, were predictive of MeHg concentrations in fish (killifish and Atlantic silversides). Moreover, MeHg concentrations were higher in pelagic fauna than in benthic-feeding fauna, suggesting that MeHg delivery to the water column from methylation sites within or outside the estuary may be an important driver of MeHg bioaccumulation in estuarine pelagic food webs. In contrast, bulk sediment MeHg concentrations were predictive only of MeHg concentrations in infaunal worms. Our results across a broad gradient of sites demonstrate that the pathways of MeHg to lower trophic level estuarine organisms are distinctly different for benthic deposit feeders and forage fish. Thus, even in systems with contaminated sediments, transfer of MeHg into estuarine food webs may be driven more by the efficiency of the processes that determine MeHg input and bioavailability in the water column. PMID:24558491

  20. Systematic plan of building Web geographic information system based on ActiveX control

    NASA Astrophysics Data System (ADS)

    Zhang, Xia; Li, Deren; Zhu, Xinyan; Chen, Nengcheng

    2003-03-01

    A systematic plan for building a Web Geographic Information System (WebGIS) using ActiveX technology is proposed in this paper. In the proposed plan, ActiveX control technology is adopted for the client-side application, and two different schemas are introduced to implement communication between the controls in users' browsers and the middle application server. One is based on the Distributed Component Object Model (DCOM); the other is based on sockets. In the former schema, the middle service application is developed as a DCOM object that communicates with the ActiveX control through Object Remote Procedure Call (ORPC) and accesses data in the GIS Data Server through Open Database Connectivity (ODBC). In the latter, the middle service application is developed in Java; it communicates with the ActiveX control through sockets based on TCP/IP and accesses data in the GIS Data Server through Java Database Connectivity (JDBC). The first schema is usually implemented in C/C++ and is difficult to develop and deploy. The second is relatively easy to develop, but its data-transfer performance depends on Web bandwidth. A sample application was developed using the latter schema, and its performance proved better, to some degree, than that of some other WebGIS applications.
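    The socket-based schema can be illustrated with a minimal request/response loop between a client and a middle-tier server. The sketch below uses Python rather than the paper's C++/Java stack, and the JSON message format, field names, and empty feature list are invented for illustration; a real middle tier would query the GIS Data Server (e.g. via JDBC) before replying.

```python
# Minimal sketch of the client <-> middle-tier socket exchange described above.
# Protocol (JSON over TCP) and field names are illustrative assumptions.
import json
import socket
import threading


def serve_once(host="127.0.0.1"):
    """Start a one-shot middle-tier server on an ephemeral port; return the port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))
    srv.listen(1)
    port = srv.getsockname()[1]

    def handler():
        conn, _ = srv.accept()
        request = json.loads(conn.recv(4096).decode())
        # A real middle tier would fetch features for request["layer"]
        # from the GIS Data Server here.
        reply = {"layer": request["layer"], "features": []}
        conn.sendall(json.dumps(reply).encode())
        conn.close()
        srv.close()

    threading.Thread(target=handler, daemon=True).start()
    return port


def fetch_layer(port, layer):
    """Client side: request one map layer and decode the JSON reply."""
    cli = socket.create_connection(("127.0.0.1", port))
    cli.sendall(json.dumps({"layer": layer}).encode())
    data = json.loads(cli.recv(4096).decode())
    cli.close()
    return data
```

    In the paper's design the client role is played by an ActiveX control in the browser, but the request/reply pattern over TCP/IP is the same.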

  1. XML Style Guide

    DTIC Science & Technology

    2015-07-01

    Acronyms ASCII American Standard Code for Information Interchange DAU data acquisition unit DDML data display markup language IHAL...Transfer Standard URI uniform resource identifier W3C World Wide Web Consortium XML extensible markup language XSD XML schema definition XML Style...Style Guide, RCC 125-15, July 2015 1 Introduction The next generation of telemetry systems will rely heavily on extensible markup language (XML

  2. 75 FR 19956 - Miller and Miller, Waterpower LLC; Notice of Application for Transfer of License and Soliciting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ...) filed an application for transfer of license of the Worthville Dam Project No. 3156, located on the Deep.... See 18 CFR 385.2001(a)(1)(iii)(2008) and the instructions on the Commission's Web site under the ``e... filings please go to the Commission's Web site located at http://www.ferc.gov/filing-comments.asp . More...

  3. Job submission and management through web services: the experience with the CREAM service

    NASA Astrophysics Data System (ADS)

    Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Fina, S. D.; Ronco, S. D.; Dorigo, A.; Gianelle, A.; Marzolla, M.; Mazzucato, M.; Sgaravatto, M.; Verlato, M.; Zangrando, L.; Corvo, M.; Miccio, V.; Sciaba, A.; Cesini, D.; Dongiovanni, D.; Grandi, C.

    2008-07-01

    Modern Grid middleware is built around components providing basic functionality, such as data storage, authentication, security, job management, resource monitoring and reservation. In this paper we describe the Computing Resource Execution and Management (CREAM) service. CREAM provides a Web service-based job execution and management capability for Grid systems; in particular, it is being used within the gLite middleware. CREAM exposes a Web service interface allowing conforming clients to submit computational jobs to a Local Resource Management System and to manage them. We developed a special component, called ICE (Interface to CREAM Environment), to integrate CREAM in gLite. ICE transfers job submissions and cancellations from the Workload Management System, allowing users to manage CREAM jobs from the gLite User Interface. This paper describes some recent studies aimed at assessing the performance and reliability of CREAM and ICE; those tests were performed as part of the acceptance tests for integration of CREAM and ICE in gLite. We also discuss recent work towards enhancing CREAM with a BES- and JSDL-compliant interface.

  4. Clinicians’ Expectations of Web 2.0 as a Mechanism for Knowledge Transfer of Stroke Best Practices

    PubMed Central

    David, Isabelle; Rochette, Annie

    2012-01-01

    Background Health professionals are increasingly encouraged to adopt an evidence-based practice to ensure greater efficiency of their services. To promote this practice, several strategies exist: distribution of educational materials, local consensus processes, educational outreach visits, local opinion leaders, and reminders. Despite these strategies, gaps continue to be observed between practice and scientific evidence. Therefore, it is important to implement innovative knowledge transfer strategies that will change health professionals’ practices. Through their interactive capabilities, Web 2.0 applications are worth exploring. As an example, virtual communities of practice have already begun to influence professional practice. Objective This study was initially developed to help design a Web 2.0 platform for health professionals working with stroke patients. The aim was to gain a better understanding of professionals’ perceptions of Web 2.0 before the development of the platform. Methods A qualitative study following a phenomenological approach was chosen. We conducted individual semi-structured interviews with clinicians and managers. Interview transcripts were subjected to a content analysis. Results Twenty-four female clinicians and managers in Quebec, Canada, aged 28-66 participated. Most participants identified knowledge transfer as the most useful outcome of a Web 2.0 platform. Respondents also expressed their need for a user-friendly platform. Accessibility to a computer and the Internet, features of the Web 2.0 platform, user support, technology skills, and previous technological experience were found to influence perceived ease of use and usefulness. Our results show that health professionals’ perceived lack of time influences their behavioral intention to use a Web 2.0 platform, despite favorable perceptions of its usefulness. 
Conclusions In conclusion, female health professionals in Quebec believe that Web 2.0 may be a useful mechanism for knowledge transfer. However, lack of time and lack of technological skills may limit their use of a future Web 2.0 platform. Further studies are required with other populations and in other regions to confirm these findings. PMID:23195753

  5. Ingestion and transfer of microplastics in the planktonic food web.

    PubMed

    Setälä, Outi; Fleming-Lehtinen, Vivi; Lehtiniemi, Maiju

    2014-02-01

    Experiments were carried out with different Baltic Sea zooplankton taxa to assess their potential to ingest plastics. Mysid shrimps, copepods, cladocerans, rotifers, polychaete larvae and ciliates were exposed to 10 μm fluorescent polystyrene microspheres. These experiments showed ingestion of microspheres in all taxa studied. The highest percentage of individuals with ingested spheres was found in pelagic polychaete larvae, Marenzelleria spp. Experiments with the copepod Eurytemora affinis and the mysid shrimp Neomysis integer showed egestion of microspheres within 12 h. Food web transfer experiments were done by offering zooplankton labelled with ingested microspheres to mysid shrimps. Microscopy observations of mysid intestine showed the presence of zooplankton prey and microspheres after 3 h incubation. This study shows for the first time the potential of plastic microparticle transfer via planktonic organisms from one trophic level (mesozooplankton) to a higher level (macrozooplankton). The impacts of plastic transfer and possible accumulation in the food web need further investigations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Assimilation of Diazotrophic Nitrogen into Pelagic Food Webs

    PubMed Central

    Woodland, Ryan J.; Holland, Daryl P.; Beardall, John; Smith, Jonathan; Scicluna, Todd; Cook, Perran L. M.

    2013-01-01

    The fate of diazotrophic nitrogen (ND) fixed by planktonic cyanobacteria in pelagic food webs remains unresolved, particularly for toxic cyanophytes that are selectively avoided by most herbivorous zooplankton. Current theory suggests that ND fixed during cyanobacterial blooms can enter planktonic food webs contemporaneously with peak bloom biomass via direct grazing of zooplankton on cyanobacteria or via the uptake of bioavailable ND (exuded from viable cyanobacterial cells) by palatable phytoplankton or microbial consortia. Alternatively, ND can enter planktonic food webs post-bloom following the remineralization of bloom detritus. Although the relative contribution of these processes to planktonic nutrient cycles is unknown, we hypothesized that assimilation of bioavailable ND (e.g., nitrate, ammonium) by palatable phytoplankton and subsequent grazing by zooplankton (either during or after the cyanobacterial bloom) would be the primary pathway by which ND was incorporated into the planktonic food web. Instead, in situ stable isotope measurements and grazing experiments clearly documented that the assimilation of ND by zooplankton outpaced assimilation by palatable phytoplankton during a bloom of toxic Nodularia spumigena Mertens. We identified two distinct temporal phases in the trophic transfer of ND from N. spumigena to the plankton community. The first phase was a highly dynamic transfer of ND to zooplankton with rates that covaried with bloom biomass while bypassing other phytoplankton taxa; a trophic transfer that we infer was routed through bloom-associated bacteria. The second phase was a slowly accelerating assimilation of the dissolved-ND pool by phytoplankton that was decoupled from contemporaneous variability in N. spumigena concentrations. 
These findings provide empirical evidence that ND can be assimilated and transferred rapidly throughout natural plankton communities and yield insights into the specific processes underlying the propagation of ND through pelagic food webs. PMID:23840744

  7. Assimilation of diazotrophic nitrogen into pelagic food webs.

    PubMed

    Woodland, Ryan J; Holland, Daryl P; Beardall, John; Smith, Jonathan; Scicluna, Todd; Cook, Perran L M

    2013-01-01

    The fate of diazotrophic nitrogen (N(D)) fixed by planktonic cyanobacteria in pelagic food webs remains unresolved, particularly for toxic cyanophytes that are selectively avoided by most herbivorous zooplankton. Current theory suggests that N(D) fixed during cyanobacterial blooms can enter planktonic food webs contemporaneously with peak bloom biomass via direct grazing of zooplankton on cyanobacteria or via the uptake of bioavailable N(D) (exuded from viable cyanobacterial cells) by palatable phytoplankton or microbial consortia. Alternatively, N(D) can enter planktonic food webs post-bloom following the remineralization of bloom detritus. Although the relative contribution of these processes to planktonic nutrient cycles is unknown, we hypothesized that assimilation of bioavailable N(D) (e.g., nitrate, ammonium) by palatable phytoplankton and subsequent grazing by zooplankton (either during or after the cyanobacterial bloom) would be the primary pathway by which N(D) was incorporated into the planktonic food web. Instead, in situ stable isotope measurements and grazing experiments clearly documented that the assimilation of N(D) by zooplankton outpaced assimilation by palatable phytoplankton during a bloom of toxic Nodularia spumigena Mertens. We identified two distinct temporal phases in the trophic transfer of N(D) from N. spumigena to the plankton community. The first phase was a highly dynamic transfer of N(D) to zooplankton with rates that covaried with bloom biomass while bypassing other phytoplankton taxa; a trophic transfer that we infer was routed through bloom-associated bacteria. The second phase was a slowly accelerating assimilation of the dissolved-N(D) pool by phytoplankton that was decoupled from contemporaneous variability in N. spumigena concentrations. 
These findings provide empirical evidence that N(D) can be assimilated and transferred rapidly throughout natural plankton communities and yield insights into the specific processes underlying the propagation of N(D) through pelagic food webs.

  8. Infrared-thermography imaging system multiapplications for manufacturing

    NASA Astrophysics Data System (ADS)

    Stern, Sharon A.

    1990-03-01

    Imaging systems technology has traditionally been utilized for diagnosing structural envelope or insulation problems in the general thermographic community. Industrially, new applications of thermal imaging technology have been developed in predictive/preventive maintenance and product monitoring procedures at Eastman Kodak Company, the largest photographic manufacturer in the world. In the manufacturing processes used at Eastman Kodak Company, new applications of thermal imaging include: (1) fluid transfer line insulation, (2) web coating drying uniformity, (3) web slitter knives, (4) heating/cooling coils, (5) overheated tail bearings, and (6) electrical phase imbalance. The substantial cost benefits gained from these applications of infrared thermography substantiate the practicality of this approach and indicate the desirability of researching further applications.

  9. A Prototype Web-based system for GOES-R Space Weather Data

    NASA Astrophysics Data System (ADS)

    Sundaravel, A.; Wilkinson, D. C.

    2010-12-01

    The Geostationary Operational Environmental Satellite-R Series (GOES-R) makes use of advanced instruments and technologies to monitor the Earth's surface and provide accurate space weather data. The first GOES-R series satellite is scheduled to be launched in 2015. Data from the satellite will be widely used by scientists for space weather modeling and prediction. This project examines how these datasets can be made available to scientists on the Web to assist them in their research. We are developing a prototype web-based system that allows users to browse, search and download these data. The GOES-R datasets will be archived in NetCDF (Network Common Data Form) and CSV (Comma Separated Values) formats. NetCDF is a self-describing data format that contains both the metadata and the data, stored in an array-oriented fashion. The web-based system will offer services in two ways: via a web application (portal) and via web services. Using the web application, users can download data in NetCDF or CSV format and can also plot a graph of the data. The web page displays the various categories of data and the time intervals for which the data are available. The web application (client) sends the user query to the server, which then connects to the data sources to retrieve the data and delivers it to the users. Data access will also be provided via SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) web services. These provide functions that other applications can use to fetch data for further processing. To build the prototype system, we are making use of proxy data from existing GOES and POES space weather datasets. Java is the programming language used to develop the tools that format data to NetCDF and CSV. For the web technology we have chosen Grails to develop both the web application and the services. 
Grails is an open source web application framework based on the Groovy language. We are also making use of the THREDDS (Thematic Realtime Environmental Distributed Data Services) server to publish and access the NetCDF files. We have completed software tools to generate NetCDF and CSV data files, as well as tools to translate NetCDF to CSV. The current phase of the project involves designing and developing the web interface.
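    The CSV half of such a delivery pipeline is straightforward to sketch. The snippet below parses a small space-weather-style CSV payload, such as a REST endpoint might return; the column names and sample values are invented for illustration and are not the actual GOES-R variable names.

```python
# Toy sketch: parsing a CSV payload from a hypothetical space-weather endpoint.
# Column names ("time", "proton_flux") and values are illustrative assumptions.
import csv
import io

SAMPLE = """time,proton_flux
2010-12-01T00:00Z,1.2
2010-12-01T00:05Z,1.5
"""


def read_records(text):
    """Parse CSV text into a list of typed records."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {"time": row["time"], "proton_flux": float(row["proton_flux"])}
        for row in reader
    ]
```

    A NetCDF-to-CSV translator, like the one the authors describe, would essentially emit rows in this shape, one per time step and variable.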

  10. Web-based remote sensing of building energy performance

    NASA Astrophysics Data System (ADS)

    Martin, William; Nassiopoulos, Alexandre; Le Cam, Vincent; Kuate, Raphaël; Bourquin, Frédéric

    2013-04-01

    The present paper describes the design and deployment of an instrumentation system enabling the energy monitoring of a building in a smart-grid context. The system is based on a network of wireless low-power IPv6 sensors. Ambient temperature and electrical power for heating are measured. The management, storage, visualisation and treatment of the data are handled through a web-based application that can be deployed as an online web service. The same web-based framework enables the acquisition of remotely measured data, such as those coming from a nearby weather station. On-site sensor and weather station data are then treated using inverse identification methods. The algorithms determine the parameters of a numerical model suitable for short-horizon prediction of indoor climate. The model is based on standard multi-zone modelling assumptions and takes into account solar, airflow and conductive transfers. It was specially designed to accurately render the inertia effects that are exploited in a demand-response strategy. All the hardware and software technologies used in the system are open and low cost, so that they comply with the constraints of on-site deployment in buildings. The measured data as well as the model predictions can be accessed ubiquitously through the web. This feature makes it possible to consider a wide range of energy management applications at the district, city or national level. The entire system has been deployed and tested in an experimental office building in Angers, France. It demonstrates the potential of ICT technologies to enable remotely controlled monitoring and surveillance in real time.
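    The inertia-aware thermal models used in such systems are often lumped-parameter (RC) models. The sketch below is a minimal single-zone version with one explicit-Euler step, not the authors' multi-zone model; the thermal resistance, capacitance, and time-step values are invented for illustration.

```python
# Toy single-zone RC thermal model (illustrative parameters, not the paper's model):
#   dT/dt = (T_out - T) / (R * C) + P_heat / C
def step_temperature(T, T_out, P_heat, R=2.0, C=1.0e4, dt=60.0):
    """Advance the indoor temperature T by one explicit-Euler step of dt seconds.

    T, T_out in degrees C; P_heat in W; R in K/W; C in J/K.
    """
    return T + dt * ((T_out - T) / (R * C) + P_heat / C)
```

    Inverse identification, as described above, amounts to fitting R and C (and their multi-zone counterparts) so that the model's predictions match the measured temperature and heating-power series.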

  11. An Easy-to-Build Remote Laboratory with Data Transfer Using the Internet School Experimental System

    ERIC Educational Resources Information Center

    Schauer, Frantisek; Lustig, Frantisek; Dvorak, Jiri; Ozvoldova, Miroslava

    2008-01-01

    The present state of information communication technology makes it possible to devise and run computer-based e-laboratories accessible to any user with a connection to the Internet, equipped with very simple technical means and making full use of web services. Thus, the way is open for a new strategy of physics education with strongly global…

  12. Dissolved organic carbon modulates mercury concentrations in insect subsidies from streams to terrestrial consumers

    PubMed Central

    Chaves-Ulloa, Ramsa; Taylor, Brad W.; Broadley, Hannah J.; Cottingham, Kathryn L.; Baer, Nicholas A.; Weathers, Kathleen C.; Ewing, Holly A.; Chen, Celia Y.

    2016-01-01

    Mercury (Hg) concentrations in aquatic environments have increased globally, exposing consumers of aquatic organisms to high Hg levels. For both aquatic and terrestrial consumers, exposure to Hg depends on their food sources as well as environmental factors influencing Hg bioavailability. The majority of the research on the transfer of methylmercury (MeHg), a toxic and bioaccumulating form of Hg, between aquatic and terrestrial food webs has focused on terrestrial piscivores. However, a gap exists in our understanding of the factors regulating MeHg bioaccumulation by non-piscivorous terrestrial predators, specifically consumers of adult aquatic insects. Because dissolved organic carbon (DOC) binds tightly to MeHg, affecting its transport and availability in aquatic food webs, we hypothesized that DOC affects MeHg transfer from stream food webs to terrestrial predators feeding on emerging adult insects. We tested this hypothesis by collecting data over two years from 10 low-order streams spanning a broad DOC gradient in the Lake Sunapee watershed in New Hampshire. We found that streamwater MeHg concentration increased linearly with DOC concentration. However, streams with the highest DOC concentrations had emerging stream prey and spiders with lower MeHg concentrations than streams with intermediate DOC concentrations; a pattern that is similar to fish and larval aquatic insects. Furthermore, high MeHg concentrations found in spiders show that MeHg transfer in adult aquatic insects is an overlooked but potentially significant pathway of MeHg bioaccumulation in terrestrial food webs. Our results suggest that although MeHg in water increases with DOC, MeHg concentrations in stream and terrestrial consumers did not consistently increase with increases in streamwater MeHg concentrations. 
In fact, there was a change from a positive to a negative relationship between aqueous exposure and bioaccumulation at streamwater MeHg concentrations associated with DOC above around 5 mg/L. Thus, our study highlights the importance of stream DOC for MeHg dynamics beyond stream boundaries, and shows that factors modulating MeHg bioavailability in aquatic systems can affect the transfer of MeHg to terrestrial predators via aquatic subsidies. PMID:27755696

  13. Dissolved organic carbon modulates mercury concentrations in insect subsidies from streams to terrestrial consumers.

    PubMed

    Chaves-Ulloa, Ramsa; Taylor, Brad W; Broadley, Hannah J; Cottingham, Kathryn L; Baer, Nicholas A; Weathers, Kathleen C; Ewing, Holly A; Chen, Celia Y

    2016-09-01

    Mercury (Hg) concentrations in aquatic environments have increased globally, exposing consumers of aquatic organisms to high Hg levels. For both aquatic and terrestrial consumers, exposure to Hg depends on their food sources as well as environmental factors influencing Hg bioavailability. The majority of the research on the transfer of methylmercury (MeHg), a toxic and bioaccumulating form of Hg, between aquatic and terrestrial food webs has focused on terrestrial piscivores. However, a gap exists in our understanding of the factors regulating MeHg bioaccumulation by non-piscivorous terrestrial predators, specifically consumers of adult aquatic insects. Because dissolved organic carbon (DOC) binds tightly to MeHg, affecting its transport and availability in aquatic food webs, we hypothesized that DOC affects MeHg transfer from stream food webs to terrestrial predators feeding on emerging adult insects. We tested this hypothesis by collecting data over 2 years from 10 low-order streams spanning a broad DOC gradient in the Lake Sunapee watershed in New Hampshire, USA. We found that streamwater MeHg concentration increased linearly with DOC concentration. However, streams with the highest DOC concentrations had emerging stream prey and spiders with lower MeHg concentrations than streams with intermediate DOC concentrations; a pattern that is similar to fish and larval aquatic insects. Furthermore, high MeHg concentrations found in spiders show that MeHg transfer in adult aquatic insects is an overlooked but potentially significant pathway of MeHg bioaccumulation in terrestrial food webs. Our results suggest that although MeHg in water increases with DOC, MeHg concentrations in stream and terrestrial consumers did not consistently increase with increases in streamwater MeHg concentrations. 
In fact, there was a change from a positive to a negative relationship between aqueous exposure and bioaccumulation at streamwater MeHg concentrations associated with DOC above ~5 mg/L. Thus, our study highlights the importance of stream DOC for MeHg dynamics beyond stream boundaries, and shows that factors modulating MeHg bioavailability in aquatic systems can affect the transfer of MeHg to terrestrial predators via aquatic subsidies. © 2016 by the Ecological Society of America.

  14. Ecological-network models link diversity, structure and function in the plankton food-web

    NASA Astrophysics Data System (ADS)

    D'Alelio, Domenico; Libralato, Simone; Wyatt, Timothy; Ribera D'Alcalà, Maurizio

    2016-02-01

    A planktonic food-web model including sixty-three functional nodes (representing auto-, mixo- and heterotrophs) was developed to integrate most of the trophic diversity present in the plankton. The model was implemented in two variants - which we named ‘green’ and ‘blue’ - characterized by opposite amounts of phytoplankton biomass and representing, respectively, bloom and non-bloom states of the system. The taxonomically disaggregated food webs described herein allowed us to shed light on how components of the plankton community changed their trophic behavior under the two conditions and thereby modified the overall functioning of the plankton food web. The green and blue food-webs showed distinct organizations in terms of the trophic roles of the nodes and the carbon fluxes between them. This re-organization stemmed from switches in selective grazing by both metazoan and protozoan consumers. Switches in food-web structure resulted in relatively small differences in the efficiency of material transfer towards higher trophic levels. For instance, from green to blue states, a seven-fold decrease in phytoplankton biomass translated into only a two-fold decrease in potential planktivorous fish biomass. By linking diversity, structure and function in the plankton food-web, we discuss the role of internal mechanisms, relying on species-specific functionalities, in driving the ‘adaptive’ responses of plankton communities to perturbations.

  15. oriTfinder: a web-based tool for the identification of origin of transfers in DNA sequences of bacterial mobile genetic elements.

    PubMed

    Li, Xiaobin; Xie, Yingzhou; Liu, Meng; Tai, Cui; Sun, Jingyong; Deng, Zixin; Ou, Hong-Yu

    2018-05-04

    oriTfinder is a web server that facilitates the rapid identification of the origin of transfer site (oriT) of a conjugative plasmid or chromosome-borne integrative and conjugative element. The back-end database, oriTDB, was built upon more than one thousand known oriT regions of bacterial mobile genetic elements (MGEs), as well as the known MGE-encoded relaxases and type IV coupling proteins (T4CP). Combining similarity searches against the oriTDB-archived oriT nucleotide sequences with the co-localization of flanking relaxase homologous genes, oriTfinder can predict the oriT region with high accuracy in the DNA sequence of a bacterial plasmid or chromosome in minutes. The server also detects other transfer-related modules, including the potential relaxase gene, T4CP gene and type IV secretion system gene cluster, and the putative genes coding for virulence factors and acquired antibiotic resistance determinants. oriTfinder may help meet the increasing demands for re-annotation of bacterial conjugative, mobilizable or non-transferable elements and aid in the rapid risk assessment of disease-relevant trait dissemination in pathogenic bacteria of interest. oriTfinder is freely available to all users without any login requirement at http://bioinfo-mml.sjtu.edu.cn/oriTfinder.

  16. Engineering web maps with gradual content zoom based on streaming vector data

    NASA Astrophysics Data System (ADS)

    Huang, Lina; Meijers, Martijn; Šuba, Radan; van Oosterom, Peter

    2016-04-01

    Vario-scale data structures have been designed to support gradual content zoom and the progressive transfer of vector data, for use with arbitrary map scales. The focus to date has been on the server side, especially on how to convert geographic data into the proposed vario-scale structures by means of automated generalisation. This paper contributes to the ongoing vario-scale research by focusing on the client side and communication, particularly on how this works in a web-services setting. It is claimed that these functionalities are urgently needed, as many web-based applications, both desktop and mobile, require gradual content zoom, progressive transfer and a high performance level. The web-client prototypes developed in this paper make it possible to assess the behaviour of vario-scale data and to determine how users will actually see the interactions. Several different web-service communication architectures are possible in a vario-scale setting. These options are analysed and tested with various web-client prototypes, with respect to functionality, ease of implementation and performance (amount of transmitted data and response times). We show that the vario-scale data structure can fit in with current web-based architectures and efforts to standardise map distribution on the internet. However, to maximise the benefits of vario-scale data, a client needs to be aware of this structure. When a client needs a map to be refined (by means of a gradual content zoom operation), only the 'missing' data will be requested. This data will be sent incrementally to the client from a server. In this way, the amount of data transferred at one time is reduced, shortening the transmission time. In addition to these conceptual architecture aspects, there are many implementation and tooling design decisions at play. These will also be elaborated on in this paper. 
Based on the experiments conducted, we conclude that the vario-scale approach indeed supports gradual content zoom and the progressive web transfer of vector data. This is a big step forward in making vector data at arbitrary map scales available to larger user groups.
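    The "request only the missing data" pattern described above can be sketched as a toy client/server pair. The class names and the rank-based protocol below are invented for illustration; the real vario-scale structure orders features by an importance measure derived from automated generalisation.

```python
# Toy sketch of incremental (progressive) vector transfer: the client caches
# features it already holds and requests only the missing range when zooming in.
# Class names and the rank-based protocol are illustrative assumptions.

class ToyServer:
    """Serves features ordered by importance rank; counts what it sends."""
    def __init__(self, features):
        self.features = features
        self.sent = 0  # total features transmitted, to show the saving

    def fetch(self, lo, hi):
        self.sent += hi - lo
        return self.features[lo:hi]


class VarioScaleClient:
    """Keeps received features and asks the server only for the increment."""
    def __init__(self, server):
        self.server = server
        self.have = 0
        self.features = []

    def zoom_to(self, n_needed):
        if n_needed > self.have:  # request only the 'missing' data
            self.features += self.server.fetch(self.have, n_needed)
            self.have = n_needed
        return self.features[:n_needed]
```

    Zooming from a coarse to a finer scale thus transfers each feature at most once, which is the source of the reduced transmission time claimed above.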

  17. Differential mercury transfer in the aquatic food web of a double basined lake associated with selenium and habitat.

    PubMed

    Arcagni, Marina; Campbell, Linda; Arribére, María A; Marvin-Dipasquale, Mark; Rizzo, Andrea; Ribeiro Guevara, Sergio

    2013-06-01

    Food web trophodynamics of total mercury (THg) and selenium (Se) were assessed for the double-basined ultraoligotrophic system of Lake Moreno, Patagonia. Each basin has differing proportions of littoral and pelagic habitats, thereby providing an opportunity to assess the importance of habitat (e.g. food web structure or benthic MeHg production) in the transfer of Hg and Se to top trophic fish species. Pelagic plankton, analyzed in three size classes (10-53, 53-200, and >200 μm), had very high [THg], exceeding 200 μg g(-1) dry weight (DW) in the smallest, and a low ratio of MeHg to THg (0.1 to 3%). In contrast, [THg] in littoral macroinvertebrates showed lower values (0.3 to 1.8 μg g(-1) DW). Juvenile and small fish species feeding upon plankton had higher [THg] (0.2 to 8 μg g(-1) muscle DW) compared to large piscivore fish species (0.1 to 1.6 μg g(-1) muscle DW). Selenium concentrations exhibited a much narrower variation range than THg in the food web, varying from 0.5 to 2.7 μg g(-1) DW. Molar Se:Hg ratios exceeded 1 for the majority of organisms in both basins, with most ratios exceeding 10. Using stable nitrogen isotopes as indicator of trophic level, no significant correlations were found with [THg], [Se] or Se:Hg. The apparent lack of biomagnification trends was attributed to elevated [THg] in plankton in the inorganic form mostly, as well as the possibility of consistent Se supply reducing the biomagnification in the food web of the organic portion of THg. Copyright © 2013 Elsevier B.V. All rights reserved.
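    The molar Se:Hg ratios reported above are obtained from mass concentrations by dividing each by its molar mass. The short sketch below shows the conversion; the sample values are taken from the concentration ranges in the abstract, and the function name is illustrative.

```python
# Molar Se:Hg ratio from mass concentrations (ug/g dry weight):
#   (Se / M_Se) / (Hg / M_Hg)
M_SE = 78.97   # molar mass of Se, g/mol
M_HG = 200.59  # molar mass of Hg, g/mol


def molar_se_hg(se_ug_g, hg_ug_g):
    """Molar Se:Hg ratio given mass concentrations in ug per g."""
    return (se_ug_g / M_SE) / (hg_ug_g / M_HG)
```

    For example, an organism near the upper ends of the reported ranges (2.7 ug g(-1) Se, 1.6 ug g(-1) THg) has a molar ratio of about 4.3, consistent with the finding that most ratios exceeded 1.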

  18. Differential mercury transfer in the aquatic food web of a double basined lake associated with selenium and habitat

    USGS Publications Warehouse

    Arcagni, Marina; Campbell, Linda; Arribére, María A.; Marvin-DiPasquale, Mark; Rizzo, Andrea; Guevara, Sergio Ribeiro

    2013-01-01

Food web trophodynamics of total mercury (THg) and selenium (Se) were assessed for the double-basined ultraoligotrophic system of Lake Moreno, Patagonia. Each basin has differing proportions of littoral and pelagic habitats, thereby providing an opportunity to assess the importance of habitat (e.g. food web structure or benthic MeHg production) in the transfer of Hg and Se to top trophic fish species. Pelagic plankton, analyzed in three size classes (10–53, 53–200, and >200 μm), had very high [THg], exceeding 200 μg g−1 dry weight (DW) in the smallest class, and a low ratio of MeHg to THg (0.1 to 3%). In contrast, [THg] in littoral macroinvertebrates was lower (0.3 to 1.8 μg g−1 DW). Juvenile and small fish species feeding upon plankton had higher [THg] (0.2 to 8 μg g−1 muscle DW) compared to large piscivorous fish species (0.1 to 1.6 μg g−1 muscle DW). Selenium concentrations exhibited a much narrower range than THg in the food web, varying from 0.5 to 2.7 μg g−1 DW. Molar Se:Hg ratios exceeded 1 for the majority of organisms in both basins, with most exceeding 10. Using stable nitrogen isotopes as an indicator of trophic level, no significant correlations were found with [THg], [Se] or Se:Hg. The apparent lack of biomagnification trends was attributed mostly to the elevated [THg] in plankton being in inorganic form, as well as to a possibly consistent Se supply reducing biomagnification of the organic portion of THg in the food web.

  19. Design Insights and Inspiration from the Tate: What Museum Web Sites Can Offer Us

    ERIC Educational Resources Information Center

    Riley-Huff, Debra A.

    2009-01-01

    There are many similarities between museums and academic libraries as public service institutions. This article is an examination of museum Web site practices and concepts that might also be transferable to academic library Web sites. It explores the digital manifestations of design and information presentation, user engagement, interactivity, and…

  20. On the Nets. Comparing Web Browsers: Mosaic, Cello, Netscape, WinWeb and InternetWorks Life.

    ERIC Educational Resources Information Center

    Notess, Greg R.

    1995-01-01

World Wide Web browsers are compared by speed, setup, hypertext transfer protocol (HTTP) handling, management of file transfer protocol (FTP), telnet, gopher, and wide area information server (WAIS); bookmark options; and communication functions. Netscape has the most features, the fastest retrieval, and sophisticated bookmark capabilities. (JMV)

  1. THE CONTRIBUTION OF MICROARTHROPODS TO ABOVE GROUND FOOD WEBS: A REVIEW AND MODEL OF BELOW GROUND TRANSFER IN A CONIFEROUS FOREST

    EPA Science Inventory

Although belowground food webs have received much attention, studies concerning microarthropods in nondetrital food webs are scarce. Because adult oribatid mites often number between 250,000 and 500,000/m2 in coniferous forests, microarthropods are a potential food resource for mic...

  2. A resource-oriented architecture for a Geospatial Web

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

In this presentation we discuss architectural issues in the design of a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about whether the same approach can be adopted for systems dedicated to sharing more specific resources, such as geospatial information, i.e. information characterized by a spatial/temporal reference. This requires an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources. The Web was born in the early 1990s to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML); over the last two decades, however, many other technologies and specifications have been introduced to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic distributed computing infrastructure. While these efforts were successful in enabling service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then preserve the scalability, extensibility and low entry barrier of the original Web. 
Conversely, systems that use the same Web technologies and specifications but follow a different architectural style, however useful, should not be considered part of the Web. If the REST style captures the significant characteristics of the Web, then building a Geospatial Web requires an architecture that satisfies all the REST constraints. One of them is of particular importance: the adoption of a uniform interface. It prescribes that all geospatial resources be accessed through the same interface; moreover, according to the REST style this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations that are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems). By restricting the scope to a subset of resources, it would be possible to identify other generic actions that are meaningful for all of them. For geospatial resources, for example, subsetting, resampling, interpolation and coordinate reference system transformations are candidate functionalities for a uniform interface. However, an investigation is needed to clarify the semantics of these actions for different resources, and consequently whether they can really rise to the role of generic interface operations. Concerning point a), identification of resources, every resource addressable in the Geospatial Web must have its own identifier (e.g. a URI). This allows citation and re-use of resources simply by providing the URI. 
OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for this. Concerning point b), manipulation of resources through representations, the Geospatial Web poses several issues. While the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured according to several possible data models (e.g. point series, gridded coverages, trajectories) and encodings. One possibility would be to simplify the interchange formats by supporting only a subset of data models and formats. This is in fact what the Web designers did in defining a common format for hypermedia (HTML) while keeping the underlying protocol generic. Concerning point c), self-descriptive messages, the exchanged messages should describe themselves and their content. This would not be a major issue, considering the effort put into geospatial metadata models and specifications in recent years. Point d), hypermedia as the engine of application state, is where the Geospatial Web would differ most from existing geospatial information sharing systems. Existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, by contrast, applications should be built by following paths between interconnected resources, and the links between resources should be made explicit as hyperlinks. Adopting Semantic Web solutions would make it possible to define not only the existence of a link between two resources but also its nature. Implementing a Geospatial Web would produce an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. 
This would lower the barrier to geospatial applications for non-specialists (cf. the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated or replicated in the Geospatial Web. The main drawbacks would be the following: • The uniform interface simplifies the overall system architecture (e.g. no service registry or service descriptors are required) but moves the complexity to the data representation. Moreover, since the interface must stay generic, it is necessarily very simple, and complex interactions would therefore require several transfers. • In the geospatial domain, some of the most valuable resources are processes (e.g. environmental models); how they can be modeled as resources accessed through the common interface is an open issue. Weighing advantages and drawbacks, a Geospatial Web appears useful, but its use would be limited to specific use-cases and would not cover all possible applications. The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD dissertation, Dept. of Information and Computer Science, University of California, Irvine.
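The uniform-interface idea this record builds on can be sketched in a few lines: every resource is addressed by a URI and manipulated through the same small verb set (the CRUD pattern), with a geospatial operation such as subsetting expressed in the resource identifier rather than as a bespoke RPC call. All URIs and names below are hypothetical illustrations, not part of the presentation itself.

```python
# Minimal sketch of a REST uniform interface: one handler, a small fixed
# verb set, resources identified by URI. Hypothetical example only.
store = {}  # URI -> representation (a dict standing in for a coverage)

def handle(verb: str, uri: str, representation=None):
    """The same few verbs apply uniformly to every resource."""
    if verb == "PUT":       # Create / Update
        store[uri] = representation
        return 201
    if verb == "GET":       # Retrieve
        return store.get(uri, 404)
    if verb == "DELETE":    # Delete
        return 204 if store.pop(uri, None) is not None else 404
    return 405              # verb outside the uniform interface

# A gridded coverage, then a spatial subset addressed by its own URI:
handle("PUT", "/coverages/sst", {"bbox": (-180, -90, 180, 90), "var": "sst"})
handle("PUT", "/coverages/sst?bbox=-10,35,5,45",
       {"bbox": (-10, 35, 5, 45), "var": "sst"})
print(handle("GET", "/coverages/sst?bbox=-10,35,5,45")["bbox"])
```

Note how the subset is a first-class resource with its own URI, so it can be cited and re-used by link alone, which is exactly the "identification of resources" constraint discussed above.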

  3. Web-GIS oriented systems viability for municipal solid waste selective collection optimization in developed and transient economies.

    PubMed

    Rada, E C; Ragazzi, M; Fedrizzi, P

    2013-04-01

Municipal solid waste management is a multidisciplinary activity that includes generation, source separation, storage, collection, transfer and transport, processing and recovery, and, last but not least, disposal. The optimization of waste collection through source separation is compulsory wherever landfill-based management must be overcome. In this paper, a few aspects related to the implementation of a Web-GIS based system are analyzed. The approach is critically analyzed with reference to two Italian case studies and two additional extra-European case studies. The first case is one of the best examples of selective collection optimization in Italy, with very high efficiency: 80% of waste is source-separated for recycling purposes. In the second reference case, the local administration is about to face the optimization of waste collection through Web-GIS oriented technologies for the first time, starting from a scenario far from an optimized management of municipal solid waste. The last two case studies concern pilot experiences in China and Malaysia. Each step of the Web-GIS oriented strategy is discussed comparatively with reference to typical scenarios of developed and transient economies. The main result is that transient economies are ready to move toward Web-oriented tools for MSW management, but this opportunity is not yet well exploited in the sector.

  4. Reconciling fisheries catch and ocean productivity

    PubMed Central

    Stock, Charles A.; Asch, Rebecca G.; Cheung, William W. L.; Dunne, John P.; Friedland, Kevin D.; Lam, Vicky W. Y.; Sarmiento, Jorge L.; Watson, Reg A.

    2017-01-01

    Photosynthesis fuels marine food webs, yet differences in fish catch across globally distributed marine ecosystems far exceed differences in net primary production (NPP). We consider the hypothesis that ecosystem-level variations in pelagic and benthic energy flows from phytoplankton to fish, trophic transfer efficiencies, and fishing effort can quantitatively reconcile this contrast in an energetically consistent manner. To test this hypothesis, we enlist global fish catch data that include previously neglected contributions from small-scale fisheries, a synthesis of global fishing effort, and plankton food web energy flux estimates from a prototype high-resolution global earth system model (ESM). After removing a small number of lightly fished ecosystems, stark interregional differences in fish catch per unit area can be explained (r = 0.79) with an energy-based model that (i) considers dynamic interregional differences in benthic and pelagic energy pathways connecting phytoplankton and fish, (ii) depresses trophic transfer efficiencies in the tropics and, less critically, (iii) associates elevated trophic transfer efficiencies with benthic-predominant systems. Model catch estimates are generally within a factor of 2 of values spanning two orders of magnitude. Climate change projections show that the same macroecological patterns explaining dramatic regional catch differences in the contemporary ocean amplify catch trends, producing changes that may exceed 50% in some regions by the end of the 21st century under high-emissions scenarios. Models failing to resolve these trophodynamic patterns may significantly underestimate regional fisheries catch trends and hinder adaptation to climate change. PMID:28115722

  5. Reconciling fisheries catch and ocean productivity.

    PubMed

    Stock, Charles A; John, Jasmin G; Rykaczewski, Ryan R; Asch, Rebecca G; Cheung, William W L; Dunne, John P; Friedland, Kevin D; Lam, Vicky W Y; Sarmiento, Jorge L; Watson, Reg A

    2017-02-21

Photosynthesis fuels marine food webs, yet differences in fish catch across globally distributed marine ecosystems far exceed differences in net primary production (NPP). We consider the hypothesis that ecosystem-level variations in pelagic and benthic energy flows from phytoplankton to fish, trophic transfer efficiencies, and fishing effort can quantitatively reconcile this contrast in an energetically consistent manner. To test this hypothesis, we enlist global fish catch data that include previously neglected contributions from small-scale fisheries, a synthesis of global fishing effort, and plankton food web energy flux estimates from a prototype high-resolution global earth system model (ESM). After removing a small number of lightly fished ecosystems, stark interregional differences in fish catch per unit area can be explained (r = 0.79) with an energy-based model that (i) considers dynamic interregional differences in benthic and pelagic energy pathways connecting phytoplankton and fish, (ii) depresses trophic transfer efficiencies in the tropics and, less critically, (iii) associates elevated trophic transfer efficiencies with benthic-predominant systems. Model catch estimates are generally within a factor of 2 of values spanning two orders of magnitude. Climate change projections show that the same macroecological patterns explaining dramatic regional catch differences in the contemporary ocean amplify catch trends, producing changes that may exceed 50% in some regions by the end of the 21st century under high-emissions scenarios. Models failing to resolve these trophodynamic patterns may significantly underestimate regional fisheries catch trends and hinder adaptation to climate change.

  6. Connecting geoscience systems and data using Linked Open Data in the Web of Data

    NASA Astrophysics Data System (ADS)

    Ritschel, Bernd; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; Galkin, Ivan; King, Todd; Fung, Shing F.; Hughes, Steve; Habermann, Ted; Hapgood, Mike; Belehaki, Anna

    2014-05-01

Linked Data, or Linked Open Data (LOD) in the realm of freely and publicly accessible data, is one of the most promising and most widely used semantic Web frameworks for connecting various types of data and vocabularies, including those of geoscience and related domains. The semantic Web extension of the commonly used World Wide Web is based on the meaning of entities and relationships - in other words, the classes and properties used for data in a global data and information space, the Web of Data. LOD data is referenced and mashed up via URIs and is retrievable using simple parameter-controlled HTTP requests, yielding results that are human-understandable or machine-readable. Furthermore, the publishing and mash-up of data in the semantic Web realm rely on specific Web standards defined for the Web of Data, such as RDF, RDFS, OWL and SPARQL. Semantic-Web-based mash-up is the Web method for aggregating and reusing content from different sources, such as using FOAF as a model and vocabulary for describing persons and organizations - in our case, those related to geoscience projects, instruments, observations, data and so on. Using the example of three different geoscience data and information management systems - ESPAS, IUGONET and GFZ ISDC - and their associated science data and related metadata (better called context data), this publication describes the concept of mashing up systems and data using the semantic Web approach and the Linked Open Data framework. Because the three systems are based on different data models, data storage structures and technical implementations, an extra semantic Web layer on top of the existing interfaces is used for the mash-up solutions. To satisfy the semantic Web standards, data transition processes are necessary, such as transferring content stored in relational databases, or mapped in XML documents, into SPARQL-capable databases or endpoints using D2R or XSLT. 
In addition, the use of mapped and/or merged domain-specific and cross-domain vocabularies, in the sense of terminological ontologies, is the foundation for virtually unified data retrieval and access across the IUGONET, ESPAS and GFZ ISDC data management systems. SPARQL endpoints, realized either by native RDF databases such as Virtuoso or by virtual SPARQL endpoints such as D2R services, enable a purely Web-standards-based mash-up of domain-specific systems and data - in this case the space weather and geomagnetism domains - as well as cross-domain connections within LOD to data and vocabularies, e.g. those related to NASA's VxOs, particularly the VWO, or NASA's PDS data system. Abbreviations: LOD - Linked Open Data; RDF - Resource Description Framework; RDFS - RDF Schema; OWL - Web Ontology Language; SPARQL - SPARQL Protocol and RDF Query Language; FOAF - Friend of a Friend ontology; ESPAS - Near Earth Space Data Infrastructure for e-Science (project); IUGONET - Inter-university Upper Atmosphere Global Observation Network (project); GFZ ISDC - German Research Centre for Geosciences Information System and Data Center; XML - Extensible Markup Language; D2R - (relational) Database to RDF (transformation); XSLT - Extensible Stylesheet Language Transformation; Virtuoso - OpenLink Virtuoso Universal Server (including RDF data management); NASA - National Aeronautics and Space Administration; VxO - Virtual Observatory; VWO - Virtual Wave Observatory; PDS - Planetary Data System
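The parameter-controlled HTTP retrieval described in this record follows the SPARQL Protocol, which can be sketched with the Python standard library alone. The endpoint URL and the FOAF query below are hypothetical placeholders, not actual ESPAS, IUGONET or GFZ ISDC endpoints.

```python
# Sketch of a SPARQL Protocol request: the query travels as a 'query' URL
# parameter and JSON results are requested via the Accept header.
# Endpoint and query are hypothetical.
import urllib.parse
import urllib.request

ENDPOINT = "https://example.org/sparql"  # placeholder SPARQL endpoint

query = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
SELECT ?name WHERE { ?person foaf:name ?name } LIMIT 5
"""

url = ENDPOINT + "?" + urllib.parse.urlencode({"query": query})
request = urllib.request.Request(
    url, headers={"Accept": "application/sparql-results+json"})

print(request.get_full_url().startswith(ENDPOINT))  # request is ready to send
# urllib.request.urlopen(request) would return the JSON result bindings;
# the network call is omitted in this sketch.
```

Because the whole interaction is just a URL plus standard headers, the same request works identically against a native RDF store like Virtuoso or a virtual D2R endpoint, which is what makes the mash-up purely Web-standards-based.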

  7. Enhanced transfer of organic matter to higher trophic levels caused by ocean acidification and its implications for export production: A mass balance approach

    PubMed Central

    Taucher, Jan; Bach, Lennart T.; Achterberg, Eric P.; Algueró-Muñiz, María; Bellworthy, Jessica; Czerny, Jan; Esposito, Mario; Haunost, Mathias; Hellemann, Dana; Ludwig, Andrea; Yong, Jaw C.; Zark, Maren; Riebesell, Ulf; Anderson, Leif G.

    2018-01-01

Ongoing acidification of the ocean through uptake of anthropogenic CO2 is known to affect marine biota and ecosystems, with largely unknown consequences for marine food webs. Changes in food web structure have the potential to alter trophic transfer, partitioning, and biogeochemical cycling of elements in the ocean. Here we investigated the impact of realistic end-of-the-century CO2 concentrations on the development and partitioning of the carbon, nitrogen, phosphorus, and silica pools in a coastal pelagic ecosystem (Gullmar Fjord, Sweden). We covered the entire winter-to-summer plankton succession (100 days) in two sets of five pelagic mesocosms, with one set CO2-enriched (~760 μatm pCO2) and the other left at ambient CO2 concentrations. Elemental mass balances were calculated, and we highlight important challenges and uncertainties we faced in the closed mesocosm system. Our key observations under high CO2 were: (1) a significantly amplified transfer of carbon, nitrogen, and phosphorus from primary producers to higher trophic levels during times of regenerated primary production; (2) a prolonged retention of all three elements in the pelagic food web that significantly reduced nitrogen and phosphorus sedimentation by about 11 and 9%, respectively; (3) a positive trend in carbon fixation (relative to nitrogen) that appeared in the particulate matter pool as well as the downward particle flux. This excess carbon counteracted a potential reduction in carbon sedimentation that could have been expected from the patterns of nitrogen and phosphorus fluxes. Our findings highlight the potential for ocean acidification to alter the partitioning and cycling of carbon and nutrients in the surface ocean, but also show that impacts are temporally variable and likely depend upon the structure of the plankton food web. PMID:29799856

  8. Development of a medical module for disaster information systems.

    PubMed

    Calik, Elif; Atilla, Rıdvan; Kaya, Hilal; Aribaş, Alirıza; Cengiz, Hakan; Dicle, Oğuz

    2014-01-01

This study aims to develop a medical module that provides a real-time flow of medical information about pre-hospital processes for delivering health care in disasters, with records transferred, stored and processed electronically over the internet as part of a disaster information system. To support information flow among professionals in a disaster, coordination of the healthcare team, and the transfer of complete information to designated people in real time, a Microsoft Access database and the SQL query language were used for the database applications. The system was built on the Microsoft .NET platform using the C# language. The disaster information system medical module was designed for use in the disaster area, field hospitals, nearby hospitals, temporary inhabiting areas such as tent cities, and dispatch vehicles, and to provide information flow between medical officials and data centres. For fast recording of disaster victim data, access to the database used by healthcare professionals was provided on the basis of analysed process steps and minimal datasets. Database fields were created so as to allow both entry of new data and searching of data recorded before the disaster. A web application provides access to the database, such as data entry and searching through the designed interfaces, according to the access level of the login credentials. The homepage and user interfaces built on the database as a result of the system analysis were made available to users at the www.afmedinfo.com web site. This study offers a recommendation on how disaster-oriented information systems can be used in the field of health, and raises awareness that a disaster information system should not be perceived only as an early warning system. 
The contents and distinctive features of the health care practices of disaster information systems are presented, and a web application was developed linking the user and the database for data entry and data query through the developed interfaces.

  9. 10th Annual Systems Engineering Conference - Focusing on Improving Performance of Defense Systems Programs. Volume 1. Tuesday Presentations

    DTIC Science & Technology

    2007-10-25


  10. Time and Frequency Transfer Activities at NIST

    DTIC Science & Technology

    2008-12-01

    differences. The graph shows data from MJD 54466 to MJD 54763 (January 1, 2008 to October 24, 2008). II.E. The Sistema Interamericano de...Metrologia (SIM) Time Network The Sistema Interamericano de Metrologia (SIM) consists of national metrology institutes (NMIs) located in the 34...designed to mitigate multipath signals. All SIM systems are connected to the Internet and upload their measurement results to Internet Web servers

  11. 77 FR 21761 - Alice Falls Corporation, Alice Falls Hydro, LLC; Notice of Application for Transfer of License...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-11

... under http://www.ferc.gov/docs-filing/efiling.asp. Commenters can submit brief comments up to 6,000 characters, without prior registration, using the eComment system at http://www.ferc.gov... printed on the eLibrary link of Commission's Web site at http://www.ferc.gov/docs-filing/elibrary...

  12. Bioaccumulation and trophic transfer of cyclic volatile methylsiloxanes (cVMS) in the aquatic marine food webs of the Oslofjord, Norway.

    PubMed

    Powell, David E; Schøyen, Merete; Øxnevad, Sigurd; Gerhards, Reinhard; Böhmer, Thomas; Koerner, Martin; Durham, Jeremy; Huff, Darren W

    2018-05-01

The trophic transfer of cyclic volatile methylsiloxanes (cVMS) in aquatic ecosystems is an important criterion for assessing bioaccumulation and ecological risk. Bioaccumulation and trophic transfer of cVMS, specifically octamethylcyclotetrasiloxane (D4), decamethylcyclopentasiloxane (D5), and dodecamethylcyclohexasiloxane (D6), were evaluated for the marine food webs of the Inner and Outer Oslofjord, Norway. The sampled food webs included zooplankton, benthic macroinvertebrates, shellfish, and finfish species. Zooplankton, benthic macroinvertebrates, and shellfish occupied the lowest trophic levels (TL ≈2 to 3); northern shrimp (Pandalus borealis) and Atlantic herring (Clupea harengus) occupied the middle trophic levels (TL ≈3 to 4); and Atlantic cod (Gadus morhua) occupied the highest trophic level (TL >4.0). Trophic dynamics in the Oslofjord were best described as a compressed food web with demersal and pelagic components confounded by the diversity of prey organisms and feeding relationships. Lipid-normalized concentrations of D4, D5, and D6 were greatest at the lowest trophic levels and decreased significantly up the food web, with the lowest concentrations observed in the highest trophic level species. Trophic magnification factors (TMF) for D4, D5, and D6 were <1.0 (range 0.3 to 0.9) and were consistent between the Inner and Outer Oslofjord, indicating that exposure did not affect TMF across the marine food web. There was no evidence of biomagnification of cVMS in the Oslofjord. Rather, the results indicated that trophic dilution of cVMS, not trophic magnification, occurred across the sampled food webs.
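A trophic magnification factor such as those reported above is conventionally derived as 10 raised to the slope of a regression of log10(concentration) on trophic level. The sketch below uses invented numbers chosen to show trophic dilution (TMF < 1); they are not the study's measurements.

```python
# Hypothetical TMF calculation: ordinary least-squares slope of
# log10(concentration) vs trophic level, then TMF = 10**slope.
# Data are invented to illustrate trophic dilution.
import math

trophic_level = [2.0, 2.5, 3.0, 3.5, 4.0, 4.5]          # zooplankton -> cod
conc_ng_per_g = [120.0, 85.0, 60.0, 43.0, 30.0, 21.0]   # lipid-normalized

y = [math.log10(c) for c in conc_ng_per_g]
n = len(trophic_level)
mean_x = sum(trophic_level) / n
mean_y = sum(y) / n
slope = (sum((x - mean_x) * (yi - mean_y) for x, yi in zip(trophic_level, y))
         / sum((x - mean_x) ** 2 for x in trophic_level))

tmf = 10 ** slope
print(tmf < 1.0)  # concentrations decline up the food web: trophic dilution
```

A TMF above 1 would indicate biomagnification (concentrations increasing per trophic level), while the declining concentrations here give a negative slope and hence a TMF below 1, the trophic-dilution pattern the abstract reports.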

  13. Power Plants, Steam and Gas Turbines WebQuest

    ERIC Educational Resources Information Center

    Ulloa, Carlos; Rey, Guillermo D.; Sánchez, Ángel; Cancela, Ángeles

    2012-01-01

    A WebQuest is an Internet-based and inquiry-oriented learning activity. The aim of this work is to outline the creation of a WebQuest entitled "Power Generation Plants: Steam and Gas Turbines." This is one of the topics covered in the course "Thermodynamics and Heat Transfer," which is offered in the second year of Mechanical…

  14. Providing Knowledge Recommendations: An Approach for Informal Electronic Mentoring

    ERIC Educational Resources Information Center

    Colomo-Palacios, Ricardo; Casado-Lumbreras, Cristina; Soto-Acosta, Pedro; Misra, Sanjay

    2014-01-01

    The use of Web 2.0 technologies for knowledge management is invading the corporate sphere. The Web 2.0 is the most adopted knowledge transfer tool within knowledge intensive firms and is starting to be used for mentoring. This paper presents IM-TAG, a Web 2.0 tool, based on semantic technologies, for informal mentoring. The tool offers…

  15. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

Web 2.0 tools and technologies, or second-generation tools, help districts save time and money and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, abilities that fall under 21st-century skills. The second-generation tools are growing in popularity…

  16. Role of cellular compartmentalization in the trophic transfer of mercury species in a freshwater plant-crustacean food chain.

    PubMed

    Beauvais-Flück, Rebecca; Chaumot, Arnaud; Gimbert, Frédéric; Quéau, Hervé; Geffard, Olivier; Slaveykova, Vera I; Cosio, Claudia

    2016-12-15

Mercury (Hg) poses an important risk to human health through contamination of food webs. Macrophytes bioaccumulate Hg and play a role in its transfer to food webs in shallow aquatic ecosystems. Nevertheless, the compartmentalization of Hg within macrophytes, notably its major accumulation in the cell wall, and the impact of this on trophic transfer to primary consumers are overlooked. The present work focuses on the trophic transfer of inorganic Hg (IHg) and monomethyl-Hg (MMHg) from the intracellular and cell wall compartments of the macrophyte Elodea nuttallii - considered a good candidate for phytoremediation - to the crustacean Gammarus fossarum. The results demonstrated that Hg accumulated in both compartments was trophically bioavailable to gammarids. IHg from both compartments was transferred similarly to G. fossarum, while MMHg uptake rates were ~2.5-fold higher in G. fossarum fed the cell wall compartment rather than the intracellular compartment. During the depuration phase, Hg concentrations in G. fossarum varied insignificantly, suggesting that both IHg and MMHg were strongly bound to biological ligands in the crustacean. Our data imply that cell walls must be considered an important source of Hg to consumers in freshwater food webs when developing procedures for protecting aquatic environments during phytoremediation programs.

  17. Integrated technologies for solid waste bin monitoring system.

    PubMed

    Arebey, Maher; Hannan, M A; Basri, Hassan; Begum, R A; Abdullah, Huda

    2011-06-01

    Communication technologies such as radio frequency identification (RFID), the global positioning system (GPS), general packet radio service (GPRS), and a geographic information system (GIS) are integrated with a camera to construct a solid waste monitoring system. The aim is to improve the way customer inquiries and emergency cases are handled and to estimate the amount of solid waste without any involvement of the truck driver. The proposed system consists of an RFID tag mounted on the bin, an RFID reader in the truck, GPRS/GSM as the link to the web server, and GIS as the map server, database server, and control server. The tracking devices mounted in the trucks collect location information in real time via GPS. This information is transferred continuously through GPRS to a central database. Users can view the current location of each truck during the collection stage via a web-based application and thereby manage the fleet. The trucks' positions and trash bin information are displayed on a digital map, which is made available by a map server. Thus, both the solid waste in the bins and the trucks are monitored using the developed system.
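    The GPS-to-GPRS-to-central-database flow described above can be sketched as a minimal position log that a fleet-management web page would query. The table layout, field names, and truck ID below are assumptions for illustration, not the authors' schema:

    ```python
    import sqlite3
    import time

    # Hypothetical central database receiving real-time truck positions over GPRS.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE positions (truck_id TEXT, lat REAL, lon REAL, ts REAL)")

    def store_position(conn, truck_id, lat, lon):
        """Record one GPS fix as the tracking device reports it."""
        conn.execute(
            "INSERT INTO positions (truck_id, lat, lon, ts) VALUES (?, ?, ?, ?)",
            (truck_id, lat, lon, time.time()),
        )

    def current_location(conn, truck_id):
        """Latest known fix, as the web-based fleet view would request it."""
        return conn.execute(
            "SELECT lat, lon FROM positions WHERE truck_id = ? "
            "ORDER BY rowid DESC LIMIT 1",
            (truck_id,),
        ).fetchone()

    store_position(conn, "truck-07", 2.9264, 101.7760)
    store_position(conn, "truck-07", 2.9301, 101.7802)
    ```

    In a deployed system the insert would arrive via a GPRS-backed HTTP endpoint rather than a local call; the query side is the same either way.
    
    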

  18. Measuring transferring similarity via local information

    NASA Astrophysics Data System (ADS)

    Yin, Likang; Deng, Yong

    2018-05-01

    Recommender systems have developed along with web science, and how to measure the similarity between users is crucial for collaborative filtering recommendation. Many efficient models have been proposed (e.g., the Pearson coefficient) to measure direct correlation. However, direct correlation measures are greatly affected by the sparsity of the dataset: they present an unreliable similarity if two users have very few commonly selected objects. Transferring similarity overcomes this drawback by considering their common neighbors (i.e., the intermediates). Yet transferring similarity has its own drawback, since it can only provide an interval of similarity. To overcome these limitations, we propose the Belief Transferring Similarity (BTS) model. The contributions of the BTS model are: (1) it addresses the sparsity of the dataset by considering high-order similarity; (2) it transforms the uncertain interval to a certain state based on fuzzy systems theory; (3) it is able to combine the transferring similarity of different intermediates using an information fusion method. Finally, we compare the BTS model with nine different link prediction methods in nine different networks, and we also illustrate the convergence property and efficiency of the BTS model.
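    The sparsity problem and the common-neighbor workaround can be illustrated with a toy sketch. This is not the authors' BTS model (the fuzzy-interval and information-fusion steps are omitted); it only shows why a direct Pearson similarity fails with no co-rated items and how an intermediate user bridges the gap:

    ```python
    import math

    def pearson(u, v):
        """Pearson correlation over co-rated items; None if fewer than 2 co-ratings."""
        common = set(u) & set(v)
        if len(common) < 2:
            return None  # too sparse for a direct correlation
        mu = sum(u[i] for i in common) / len(common)
        mv = sum(v[i] for i in common) / len(common)
        num = sum((u[i] - mu) * (v[i] - mv) for i in common)
        den = math.sqrt(sum((u[i] - mu) ** 2 for i in common)) * \
              math.sqrt(sum((v[i] - mv) ** 2 for i in common))
        return num / den if den else 0.0

    def transferred_similarity(u, w, intermediates):
        """Route similarity through common neighbors when direct overlap is too sparse."""
        paths = []
        for v in intermediates:
            s_uv, s_vw = pearson(u, v), pearson(v, w)
            if s_uv is not None and s_vw is not None:
                paths.append(s_uv * s_vw)
        return sum(paths) / len(paths) if paths else None

    # Two users with no co-rated items, linked only by a common neighbor v.
    u = {"a": 1, "b": 2, "c": 3}
    w = {"d": 2, "e": 4}
    v = {"a": 2, "b": 3, "c": 4, "d": 1, "e": 2}
    ```

    Here `pearson(u, w)` is undefined (no common objects), while the transferred similarity through `v` yields a usable estimate.
    
    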

  19. Contributions of Traditional Web 1.0 Tools e.g. Email and Web 2.0 Tools e.g. Weblog towards Knowledge Management

    ERIC Educational Resources Information Center

    Dehinbo, Johnson

    2010-01-01

    The use of email utilizes the power of Web 1.0 to enable users to access their email from any computer or mobile device connected to the Internet, making email valuable for acquiring and transferring knowledge. But the advent of Web 2.0 and social networking seems to indicate certain limitations of email. The use of social networking seems…

  20. A Web Geographic Information System to share data and explorative analysis tools: The application to West Nile disease in the Mediterranean basin.

    PubMed

    Savini, Lara; Tora, Susanna; Di Lorenzo, Alessio; Cioci, Daniela; Monaco, Federica; Polci, Andrea; Orsini, Massimiliano; Calistri, Paolo; Conte, Annamaria

    2018-01-01

    In recent decades an increasing number of West Nile disease cases has been observed in equines and humans in the Mediterranean basin, and surveillance systems have been set up in numerous countries to manage and control the disease. The collection, storage, and distribution of information on the spread of the disease become important for a shared intervention and control strategy. To this end, a Web Geographic Information System has been developed, through which disease data, climatic and environmental remotely sensed data, and full genome sequences of selected isolated strains are made available. This paper describes the Disease Monitoring Dashboard (DMD) web system application, the tools available for preliminary analysis of climatic and environmental factors, and the other interactive tools for epidemiological analysis. WNV occurrence data are collected from multiple official and unofficial sources. Whole genome sequences and metadata of WNV strains are retrieved from public databases or generated in the framework of the Italian surveillance activities. Climatic and environmental data are provided by the NASA website. The Geographical Information System is composed of an Oracle 10g database and ESRI ArcGIS Server 10.03; the web mapping client application is developed with the ArcGIS API for JavaScript and the Phylocanvas library to facilitate and optimize the mash-up approach. ESRI ArcSDE 10.1 has been used to store spatial data. The DMD application is accessible through a generic web browser at https://netmed.izs.it/networkMediterraneo/. The system collects data through on-line forms and automated procedures and visualizes data as interactive graphs, maps, and tables. The spatial and temporal dynamic visualization of disease events is managed by a time slider that returns results on both the map and the epidemiological curve. Climatic and environmental data can be associated with cases through Python procedures and downloaded as Excel files.
The system compiles multiple datasets through user-friendly web tools; it integrates entomological, veterinary, and human surveillance, molecular information on pathogens, and environmental and climatic data. The principal result of the DMD development is the transfer and dissemination of knowledge and technologies to develop strategies for integrated prevention and control measures for animal and human diseases.

  1. NASA Records Database

    NASA Technical Reports Server (NTRS)

    Callac, Christopher; Lunsford, Michelle

    2005-01-01

    The NASA Records Database, comprising a Web-based application program and a database, is used to administer an archive of paper records at Stennis Space Center. The system begins with an electronic form, into which a user enters information about records that the user is sending to the archive. The form is "smart": it provides instructions for entering information correctly and prompts the user to enter all required information. Once complete, the form is digitally signed and submitted to the database. The system determines which storage locations are not in use, assigns the user's boxes of records to some of them, and enters these assignments in the database. Thereafter, the software tracks the boxes and can be used to locate them. By use of the search capabilities of the software, specific records can be sought by box storage locations, accession numbers, record dates, submitting organizations, or details of the records themselves. Boxes can be marked with such statuses as checked out, lost, transferred, and destroyed. The system can generate reports showing boxes awaiting destruction or transfer. When boxes are transferred to the National Archives and Records Administration (NARA), the system can automatically fill out NARA records-transfer forms. Currently, several other NASA Centers are considering deploying the NASA Records Database to help automate their records archives.

  2. Completing the One EPA Web Transformation (Email Message)

    EPA Pesticide Factsheets

    Deputy Administrator Stan Meiburg urged other administrators to review their web transformation progress and to make sure they had requested extensions and planned to temporarily transfer content to the www3 server before the September 30, 2015 deadline.

  3. QNAP 1263U Network Attached Storage (NAS)/ Storage Area Network (SAN) Device Users Guide

    DTIC Science & Technology

    2016-11-01

    standard Ethernet network. Operating either a NAS or SAN is vital for the integrity of the data stored on the drives found in the device. Redundant...speed of the network itself. Many standards are in place for transferring data, including more standard ones such as File Transfer Protocol and Server ...following are the procedures for connecting to the NAS administrative web page: 1) Open a web browser and browse to 192.168.40.8:8080. 2) Enter the

  4. ViDI: Virtual Diagnostics Interface. Volume 2; Unified File Format and Web Services as Applied to Seamless Data Transfer

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A. (Technical Monitor); Schwartz, Richard J.

    2004-01-01

    The desire to revolutionize the aircraft design cycle from its currently lethargic pace to a fast turn-around operation enabling the optimization of non-traditional configurations is a critical challenge facing the aeronautics industry. In response, a large-scale effort is underway not only to advance the state of the art in wind tunnel testing, computational modeling, and information technology, but to unify these often disparate elements into a cohesive design resource. This paper addresses Seamless Data Transfer, the critical central nervous system that will enable a wide variety of components to work together.

  5. The OGC Sensor Web Enablement framework

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Botts, M.

    2006-12-01

    Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data. A key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) initiative is developing open standards for web interfaces for the discovery, exchange, and processing of sensor observations, and the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces and standard encodings for the messages transferred between services. The SWE interfaces include: Sensor Observation Service (SOS), for parameterized observation requests (by observation time, feature of interest, property, sensor); Sensor Planning Service (SPS), for tasking a sensor system to undertake future observations; and Sensor Alert Service (SAS), for subscription to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and a service follows a standard sequence of requests and responses: the first obtains a general description of the service capabilities, the next obtains the detail required to formulate a data request, and the final one requests a data instance or stream. These may be implemented in a stateless "REST" idiom or using conventional web-services (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by catalogue, data (WFS), and portrayal (WMS) services, as well as authentication and rights management. 
The standard SWE data formats are Observations and Measurements (O&M), which encodes observation metadata and results; Sensor Model Language (SensorML), which describes sensor systems; Transducer Model Language (TML), which covers low-level data streams; and domain-specific GML application schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds, based around emergency management, security, contamination, and environmental monitoring scenarios.
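    The capabilities-then-request sequence described above can be sketched as KVP URL construction. The endpoint, offering, and property values below are hypothetical, though the parameter names follow the SOS 1.0.0 key-value-pair convention:

    ```python
    from urllib.parse import urlencode, urlparse, parse_qs

    # Hypothetical SOS endpoint for illustration.
    SOS_ENDPOINT = "http://example.org/sos"

    def sos_request(request, **params):
        """Build one KVP request URL in the GetCapabilities -> GetObservation sequence."""
        query = {"service": "SOS", "version": "1.0.0", "request": request, **params}
        return SOS_ENDPOINT + "?" + urlencode(query)

    # Step 1: discover what the service offers.
    capabilities_url = sos_request("GetCapabilities")

    # Step 2: request observations filtered by offering, property, and time window.
    observation_url = sos_request(
        "GetObservation",
        offering="TEMPERATURE",
        observedProperty="urn:ogc:def:property:temperature",
        eventTime="2006-07-01T00:00:00Z/2006-07-02T00:00:00Z",
    )
    ```

    A real client would issue these URLs over HTTP and parse the O&M response; only the request-building step is shown here.
    
    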

  6. Architecture for Improving Terrestrial Logistics Based on the Web of Things

    PubMed Central

    Castro, Miguel; Jara, Antonio J.; Skarmeta, Antonio

    2012-01-01

    Technological advances for improving supply chain efficiency present three key challenges for managing goods: tracking, tracing, and monitoring (TTM). These are needed to satisfy requirements for products such as perishable goods, which European legislation requires to be shipped within a prescribed temperature range to ensure freshness and suitability for consumption. The proposed system integrates RFID for tracking and tracing, through a distributed architecture developed for heavy goods vehicles, with the sensors embedded in the SunSPOT platform for monitoring the transported goods, based on the concept of the Internet of Things. This paper presents how the Internet of Things is integrated to improve terrestrial logistics, offering a comprehensive and flexible architecture with high scalability, according to the specific needs for reaching an item-level continuous monitoring solution. The major contribution of this work is the optimization of Embedded Web Services based on RESTful principles (the Web of Things) for access to TTM services at any time during the transportation of goods. Specifically, monitoring patterns such as observe and blockwise transfer have been extended to meet the requirements of continuous conditional monitoring and of the transfer of full and partial inventories based on conditional queries. In short, this work presents an evolution of previous TTM solutions, which were limited to trailer identification and environment monitoring, to a solution able to provide the exhaustive item-level monitoring required by several use cases. This exhaustive monitoring has required new communication capabilities through the Web of Things, which has been optimized with the use and improvement of a set of communication patterns. PMID:22778657

  7. Architecture for improving terrestrial logistics based on the Web of Things.

    PubMed

    Castro, Miguel; Jara, Antonio J; Skarmeta, Antonio

    2012-01-01

    Technological advances for improving supply chain efficiency present three key challenges for managing goods: tracking, tracing, and monitoring (TTM). These are needed to satisfy requirements for products such as perishable goods, which European legislation requires to be shipped within a prescribed temperature range to ensure freshness and suitability for consumption. The proposed system integrates RFID for tracking and tracing, through a distributed architecture developed for heavy goods vehicles, with the sensors embedded in the SunSPOT platform for monitoring the transported goods, based on the concept of the Internet of Things. This paper presents how the Internet of Things is integrated to improve terrestrial logistics, offering a comprehensive and flexible architecture with high scalability, according to the specific needs for reaching an item-level continuous monitoring solution. The major contribution of this work is the optimization of Embedded Web Services based on RESTful principles (the Web of Things) for access to TTM services at any time during the transportation of goods. Specifically, monitoring patterns such as observe and blockwise transfer have been extended to meet the requirements of continuous conditional monitoring and of the transfer of full and partial inventories based on conditional queries. In short, this work presents an evolution of previous TTM solutions, which were limited to trailer identification and environment monitoring, to a solution able to provide the exhaustive item-level monitoring required by several use cases. This exhaustive monitoring has required new communication capabilities through the Web of Things, which has been optimized with the use and improvement of a set of communication patterns.
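    The conditional-query and observe patterns mentioned in the abstract can be illustrated with a minimal sketch. The item fields, temperature limit, and reading values are invented for illustration; the paper's actual Web of Things messages are not reproduced here:

    ```python
    def partial_inventory(items, max_temp_c):
        """Conditional query: return only the items whose reading breaches the limit,
        instead of transferring the full inventory."""
        return [item["id"] for item in items if item["temp_c"] > max_temp_c]

    def observe(readings, notify):
        """Observe pattern: push a notification only when the monitored value changes,
        instead of polling every reading."""
        last = None
        for value in readings:
            if value != last:
                notify(value)
                last = value

    items = [
        {"id": "pallet-1", "temp_c": 4.2},
        {"id": "pallet-2", "temp_c": 9.1},  # outside the assumed prescribed range
        {"id": "pallet-3", "temp_c": 3.8},
    ]
    alerts = partial_inventory(items, max_temp_c=8.0)

    notifications = []
    observe([4.0, 4.0, 9.1, 9.1, 4.0], notifications.append)
    ```

    The point of both patterns is the same: transfer only the deltas or breaches, which matters on the constrained links used in vehicle-mounted deployments.
    
    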

  8. Efficient Data Transfer Rate and Speed of Secured Ethernet Interface System.

    PubMed

    Ghanti, Shaila; Naik, G M

    2016-01-01

    Embedded systems are extensively used in home automation systems, small office systems, vehicle communication systems, and health service systems. The services provided by these systems are available on the Internet, and these services need to be protected. Security features like IP filtering, UDP protection, or TCP protection need to be implemented depending on the specific application used by the device. Every device on the Internet must have a network interface. This paper proposes the design of an embedded Secured Ethernet Interface System to protect services available on the Internet against the SYN flood attack. In this experimental study, the Secured Ethernet Interface System is customized to protect a web service against the SYN flood attack. The Secured Ethernet Interface System is implemented on an ALTERA Stratix IV FPGA as a system on chip and uses a modified SYN flood attack protection method. The experimental results indicate an increase in the number of genuine clients getting service from the server, a considerable improvement in the data transfer rate, and better response time during a SYN flood attack.
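    The idea behind SYN flood protection, limiting half-open connections so one source cannot exhaust the server's connection table, can be sketched in software. This is a toy analogue under an assumed per-source limit, not the paper's modified FPGA-based method:

    ```python
    from collections import defaultdict

    class SynGuard:
        """Toy SYN-flood guard: cap half-open connections per source address.
        The limit is an assumption for illustration."""

        def __init__(self, limit=3):
            self.limit = limit
            self.half_open = defaultdict(int)

        def on_syn(self, src_ip):
            if self.half_open[src_ip] >= self.limit:
                return "DROP"          # suspected flood source
            self.half_open[src_ip] += 1
            return "SYN-ACK"           # proceed with the handshake

        def on_ack(self, src_ip):
            # Handshake completed: the connection is no longer half-open.
            if self.half_open[src_ip] > 0:
                self.half_open[src_ip] -= 1

    guard = SynGuard(limit=2)
    results = [guard.on_syn("10.0.0.9") for _ in range(3)]  # flood from one host
    guard.on_ack("10.0.0.9")                                # one handshake completes
    after_ack = guard.on_syn("10.0.0.9")
    ```

    Genuine clients complete the handshake and free their slot, so they keep getting service while a flooder that never ACKs is dropped, which is the behavior the experimental results describe.
    
    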

  9. Efficient Data Transfer Rate and Speed of Secured Ethernet Interface System

    PubMed Central

    Ghanti, Shaila

    2016-01-01

    Embedded systems are extensively used in home automation systems, small office systems, vehicle communication systems, and health service systems. The services provided by these systems are available on the Internet, and these services need to be protected. Security features like IP filtering, UDP protection, or TCP protection need to be implemented depending on the specific application used by the device. Every device on the Internet must have a network interface. This paper proposes the design of an embedded Secured Ethernet Interface System to protect services available on the Internet against the SYN flood attack. In this experimental study, the Secured Ethernet Interface System is customized to protect a web service against the SYN flood attack. The Secured Ethernet Interface System is implemented on an ALTERA Stratix IV FPGA as a system on chip and uses a modified SYN flood attack protection method. The experimental results indicate an increase in the number of genuine clients getting service from the server, a considerable improvement in the data transfer rate, and better response time during a SYN flood attack. PMID:28116350

  10. Trophic transference of microplastics under a low exposure scenario: Insights on the likelihood of particle cascading along marine food-webs.

    PubMed

    Santana, M F M; Moreira, F T; Turra, A

    2017-08-15

    Microplastics are emergent pollutants in marine environments, whose risks along food webs still need to be understood. Within this knowledge gap, microplastic transference and persistence along trophic levels are key processes. We assessed the potential occurrence of these processes considering a less extreme exposure scenario than used previously, with microplastics present only in the hemolymph of prey (the mussel Perna perna) and absent in the gut cavity. Predators were the crab Callinectes ornatus and the puffer fish Spheoeroides greeleyi. Transference of microplastics occurred from prey to predators, but without evidence of particle persistence in their tissues after 10 days of exposure. This suggests a reduced likelihood of trophic cascading of particles and, consequently, a reduced risk of direct impacts of microplastics on higher trophic levels. However, contact with microplastics along food webs is still concerning, modulated by the concentration of particles in prey and by the predators' depuration capacity and rate. Copyright © 2017. Published by Elsevier Ltd.

  11. Interactive web-based format vs conventional brochure material for information transfer to children and parents: a randomized controlled trial regarding preoperative information.

    PubMed

    Lööf, Gunilla; Liljeberg, Cecilia; Eksborg, Staffan; Lönnqvist, Per-Arne

    2017-06-01

    Information transfer to patients is an integral part of modern medicine, and Internet-based alternatives represent a new and attractive way to deliver it. The study used a prospective observer-blinded design. Children (3-12 years) and parents were instructed to obtain further preoperative information either through an interactive web-based platform, the Anaesthesia-Web, or through conventional brochure material until the day of outpatient surgery. On the day of surgery, children and parents were separately asked six different questions. The primary end-point was to compare the total question score of children between the two information options (maximum score = 36). Secondary aims were the total question score for parents and the influence of age, sex, and the time between the preoperative visit and the day of surgery. A total of 125 children were recruited, of which 103 were included in the final analysis (Anaesthesia-Web group, n = 49; brochure material group, n = 54). At the predetermined interim analysis, the total question score in children was found to be substantially higher in the Anaesthesia-Web group than in the brochure material group (median score: 27; IQR: 16.5-33 and median score: 19.5; IQR: 11.25-27.75, respectively; P = 0.0076). The median difference in score was 6 (95% CI: 0-9). The total question score in parents was also higher in the Anaesthesia-Web group than in the brochure material group. Increasing child age was associated with a higher total question score in both groups. Sex did not influence the total question score in the Anaesthesia-Web group, whereas girls scored better than boys in the brochure material group. Children aged 3-12 years, as well as their parents, attain preoperative information better from an interactive web-based platform than from conventional brochure material. © 2017 John Wiley & Sons Ltd.

  12. 36 CFR 1233.16 - How does an agency transfer records to the National Personnel Records Center (NPRC)?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-1800, Web site http://www.opm.gov/feddata/recguide2008.pdf, for the OPM publication “The Guide to... Records Center (NPRC), St. Louis, MO” section of the NARA Federal Records Centers Program Web site (http... assistance consult the NPRC Web site (http://www.archives.gov/facilities/mo/st7_louis.html). ...

  13. The Contribution of Microarthropods to Aboveground Food Webs: A Review and Model of Belowground Transfer in a Coniferous Forest

    Treesearch

    John M. Johnston

    1999-01-01

    Although belowground food webs have received much attention, studies concerning microarthropods in nondetrital food webs are scarce. Because adult oribatid mites often number between 250,000 and 500,000/m2 in coniferous forests, microarthropods are a potential food resource for macroarthropod and vertebrate predators of the forest floor. Although…

  14. 36 CFR § 1233.16 - How does an agency transfer records to the National Personnel Records Center (NPRC)?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-1800, Web site http://www.opm.gov/feddata/recguide2008.pdf, for the OPM publication “The Guide to... Records Center (NPRC), St. Louis, MO” section of the NARA Federal Records Centers Program Web site (http... assistance consult the NPRC Web site (http://www.archives.gov/facilities/mo/st_louis.html). ...

  15. 36 CFR 1233.16 - How does an agency transfer records to the National Personnel Records Center (NPRC)?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...-1800, Web site http://www.opm.gov/feddata/recguide2008.pdf, for the OPM publication “The Guide to... Records Center (NPRC), St. Louis, MO” section of the NARA Federal Records Centers Program Web site (http... assistance consult the NPRC Web site (http://www.archives.gov/facilities/mo/st_louis.html). ...

  16. Representing Value as Digital Object: A Discussion of Transferability and Anonymity; Digital Library Initiatives of the Deutsche Forschungsgemeinschaft; CrossRef Turns One; Fermi National Accelerator Laboratory (Fermilab).

    ERIC Educational Resources Information Center

    Kahn, Robert E.; Lyons, Patrice A.; Brahms, Ewald; Brand, Amy; van den Bergen, Mieke

    2001-01-01

    Includes four articles that discuss the use of digital objects to represent value in a network environment; digital library initiatives at the central public funding organization for academic research in Germany; an application of the Digital Object Identifier System; and the Web site of the Fermi National Accelerator Laboratory. (LRW)

  17. 48 CFR 301.607-74 - Certification transfers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....607-74 Certification transfers. (a) HHS recognizes and accepts FAC-P/PM certifications issued by other... under FAC-P/PM. See FAI's Web site, and Chapter 3, Federal Acquisition Certification—Program and Project... certification transfer should not be initiated when an individual, who holds a current FAC-P/PM certification...

  18. 48 CFR 301.607-74 - Certification transfers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....607-74 Certification transfers. (a) HHS recognizes and accepts FAC-P/PM certifications issued by other... under FAC-P/PM. See FAI's Web site, and Chapter 3, Federal Acquisition Certification—Program and Project... certification transfer should not be initiated when an individual, who holds a current FAC-P/PM certification...

  19. 48 CFR 301.607-74 - Certification transfers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....607-74 Certification transfers. (a) HHS recognizes and accepts FAC-P/PM certifications issued by other... under FAC-P/PM. See FAI's Web site, and Chapter 3, Federal Acquisition Certification—Program and Project... certification transfer should not be initiated when an individual, who holds a current FAC-P/PM certification...

  20. 48 CFR 301.607-74 - Certification transfers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....607-74 Certification transfers. (a) HHS recognizes and accepts FAC-P/PM certifications issued by other... under FAC-P/PM. See FAI's Web site, and Chapter 3, Federal Acquisition Certification—Program and Project... certification transfer should not be initiated when an individual, who holds a current FAC-P/PM certification...

  1. Development of a Web-Based Distributed Interactive Simulation (DIS) Environment Using JavaScript

    DTIC Science & Technology

    2014-09-01

    scripting that let users change or interact with web content depending on user input, which is in contrast with server-side scripts such as PHP, Java and...transfer, DIS usually broadcasts or multicasts its PDUs based on UDP socket. 3. JavaScript JavaScript is the scripting language of the web, and all...IDE) for developing desktop, mobile and web applications with JAVA , C++, HTML5, JavaScript and more. b. Framework The DIS implementation of

  2. BrainIACS: a system for web-based medical image processing

    NASA Astrophysics Data System (ADS)

    Kishore, Bhaskar; Bazin, Pierre-Louis; Pham, Dzung L.

    2009-02-01

    We describe BrainIACS, a web-based medical image processing system that enables algorithm developers to quickly create extensible user interfaces for their algorithms. Designed to address the challenges faced by algorithm developers in providing user-friendly graphical interfaces, BrainIACS is implemented entirely with freely available, open-source software. The system, which is based on a client-server architecture, utilizes an AJAX front-end written using the Google Web Toolkit (GWT) and Java Servlets running on Apache Tomcat as its back-end. To enable developers to quickly and simply create user interfaces for configuring their algorithms, the interfaces are described using XML and are parsed by our system to create the corresponding user interface elements. Most commonly found elements, such as check boxes, drop-down lists, input boxes, radio buttons, tab panels, and group boxes, are supported. Some elements, such as the input box, support input validation. Changes to the user interface, such as addition and deletion of elements, are performed by editing the XML file or by using the system's user interface creator. In addition to user interface generation, the system also provides its own interfaces for data transfer, previewing of input and output files, and algorithm queuing. As the system is programmed in Java (compiled to JavaScript for the front-end code), it is platform independent, the only requirements being that a Servlet implementation is available and that the processing algorithms can execute on the server platform.
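    The XML-described user interface generation can be sketched as follows. The element names and attributes in this snippet are invented for illustration, not BrainIACS's actual schema:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical XML description of an algorithm's configuration interface.
    SPEC = """
    <interface>
      <checkbox name="smooth" label="Apply smoothing"/>
      <dropdown name="atlas" label="Atlas">
        <option>adult</option><option>pediatric</option>
      </dropdown>
      <inputbox name="iterations" label="Iterations" validate="int"/>
    </interface>
    """

    def parse_ui(xml_text):
        """Parse the XML spec into widget descriptions that a front-end
        could map onto concrete UI elements."""
        widgets = []
        for el in ET.fromstring(xml_text):
            widgets.append({
                "type": el.tag,
                "name": el.get("name"),
                "label": el.get("label"),
                "validate": el.get("validate"),
                "options": [opt.text for opt in el.findall("option")],
            })
        return widgets

    widgets = parse_ui(SPEC)
    ```

    The attraction of this design is that adding or removing a widget is an edit to the XML file, with no front-end code change.
    
    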

  3. Research and design on system of asset management based on RFID

    NASA Astrophysics Data System (ADS)

    Guan, Peng; Du, HuaiChang; Jing, Hua; Zhang, MengYue; Zhang, Meng; Xu, GuiXian

    2011-10-01

    By analyzing the problems in current asset management, this paper proposes applying RFID technology to asset management in order to improve the level of automation and information management. The paper describes the design of equipment identification based on a 433 MHz RFID tag and reader, building on a detailed study of RFID tag and card reader circuits, and it also illustrates the asset management system. An RS232-to-Ethernet converter is used to transfer data to the PC monitoring software, and the asset management system is implemented with web techniques (PHP and MySQL).

  4. A Web-based home welfare and care services support system using a pen type image sensor.

    PubMed

    Ogawa, Hidekuni; Yonezawa, Yoshiharu; Maki, Hiromichi; Sato, Haruhiko; Hahn, Allen W; Caldwell, W Morton

    2003-01-01

    A long-term care insurance law for elderly persons was put into force two years ago in Japan. Home Helpers, who are employed by hospitals, care companies, or the welfare office, provide home welfare and care services for the elderly, such as cooking, bathing, washing, cleaning, and shopping. We developed a web-based home welfare and care services support system using wireless Internet mobile phones and Internet client computers, which employs a pen-type image sensor. The pen-type image sensor is used by elderly people as the entry device for their care requests. The client computer sends the requests to the server computer in the Home Helper central office, and the server computer then automatically transfers them to the Home Helper's mobile phone. This newly developed system is easily operated by elderly persons and enables Home Helpers to save a significant amount of time and extra travel.

  5. What do international pharmacoeconomic guidelines say about economic data transferability?

    PubMed

    Barbieri, Marco; Drummond, Michael; Rutten, Frans; Cook, John; Glick, Henry A; Lis, Joanna; Reed, Shelby D; Sculpher, Mark; Severens, Johan L

    2010-12-01

    The objectives of this article were to assess the positions of the various national pharmacoeconomic guidelines on the transferability (or lack of transferability) of clinical and economic data and to review the methods suggested in the guidelines for addressing issues of transferability. A review of existing national pharmacoeconomic guidelines was conducted to assess recommendations on the transferability of clinical and economic data, whether there are important differences between countries, and whether common methodologies have been suggested to address key transferability issues. Pharmacoeconomic guidelines were initially identified through the ISPOR Web site. In addition, those national guidelines not included in the ISPOR Web site, but known to us, were also considered. Across 27 sets of guidelines, baseline risk and unit costs were uniformly considered to be of low transferability, while treatment effect was classified as highly transferable. Results were more variable for resource use and utilities, which were considered to have low transferability in 63% and 45% of cases, respectively. There were some differences between older and more recent guidelines in the treatment of transferability issues. A growing number of jurisdictions are using guidelines for the economic evaluation of pharmaceuticals. The recommendations in existing guidelines regarding the transferability of clinical and economic data are quite diverse. There is a case for standardization in dealing with transferability issues. One important step would be to update guidelines more frequently. © 2010, International Society for Pharmacoeconomics and Outcomes Research (ISPOR).

  6. Wood Utilization Research Dissemination on the World Wide Web: A Case Study

    Treesearch

    Daniel L. Schmoldt; Matthew F. Winn; Philip A. Araman

    1997-01-01

    Because many research products are informational rather than tangible, emerging information technologies, such as the multi-media format of the World Wide Web, provide an open and easily accessible mechanism for transferring research to user groups. We have found steady, increasing use of our Web site over the first 6-1/2 months of operation; almost one-third of the...

  7. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    DTIC Science & Technology

    2017-02-01

    Image Processing Web Server Administration ... Fig. 18 Microsoft ASP.NET MVC 4 installation ... algorithms are made into client applications that can be accessed from an image processing web service developed following Representational State Transfer (REST) standards by a mobile app, laptop PC, and other devices. Similarly, weather tweets can be accessed via the Weather Digest Web Service

  8. Effectiveness of a Web-Based Simulation in Improving Nurses' Workplace Practice With Deteriorating Ward Patients: A Pre- and Postintervention Study.

    PubMed

    Liaw, Sok Ying; Wong, Lai Fun; Lim, Eunice Ya Ping; Ang, Sophia Bee Leng; Mujumdar, Sandhya; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Emily Neo Kim

    2016-02-19

    Nurses play an important role in detecting patients with clinical deterioration. However, the problem of nurses failing to trigger deteriorating ward patients still persists despite the implementation of a patient safety initiative, the Rapid Response System. A Web-based simulation was developed to enhance nurses' role in recognizing and responding to deteriorating patients. While studies have evaluated the effectiveness of the Web-based simulation on nurses' clinical performance in a simulated environment, no study has examined its impact on nurses' actual practice in the clinical setting. The objective of this study was to evaluate the impact of Web-based simulation on nurses' recognition of and response to deteriorating patients in clinical settings. The outcomes were measured across all levels of Kirkpatrick's 4-level evaluation model with clinical outcome on triggering rates of deteriorating patients as the primary outcome measure. A before-and-after study was conducted on two general wards at an acute care tertiary hospital over a 14-month period. All nurses from the two study wards who undertook the Web-based simulation as part of their continuing nursing education were invited to complete questionnaires at various time points to measure their motivational reaction, knowledge, and perceived transfer of learning. Clinical records on cases triggered by ward nurses from the two study wards were evaluated for frequency and types of triggers over a period of 6 months pre- and 6 months postintervention. The number of deteriorating patients triggered by ward nurses in a medical general ward increased significantly (P<.001) from pre- (84/937, 8.96%) to postintervention (91/624, 14.58%). The nurses reported positively on the transfer of learning (mean 3.89, SD 0.49) from the Web-based simulation to clinical practice. A significant increase (P<.001) on knowledge posttest score from pretest score was also reported. 
The nurses also perceived positively their motivation (mean 3.78, SD 0.56) to engage in the Web-based simulation. This study provides evidence on the effectiveness of Web-based simulation in improving nursing practice when recognizing and responding to deteriorating patients. This educational tool could be implemented by nurse educators worldwide to address the educational needs of a large group of hospital nurses responsible for patients in clinical deterioration.

  9. Tracking the autochthonous carbon transfer in stream biofilm food webs.

    PubMed

    Risse-Buhl, Ute; Trefzger, Nicolai; Seifert, Anne-Gret; Schönborn, Wilfried; Gleixner, Gerd; Küsel, Kirsten

    2012-01-01

    Food webs in the rhithral zone rely mainly on allochthonous carbon from the riparian vegetation. However, autochthonous carbon might be more important in open canopy streams. In streams, most of the microbial activity occurs in biofilms associated with the streambed. We followed the autochthonous carbon transfer toward bacteria and grazing protozoa within a stream biofilm food web. Biofilms that developed in a second-order stream (Thuringia, Germany) were incubated in flow channels under climate-controlled conditions. Six-week-old biofilms received either ¹³C- or ¹²C-labeled CO₂, and uptake into phospholipid fatty acids was followed. The dissolved inorganic carbon of the flow channel water became immediately labeled. In biofilms grown under 8-h light/16-h dark conditions, more than 50% of the labeled carbon was incorporated into biofilm algae, mainly filamentous cyanobacteria, pennate diatoms, and nonfilamentous green algae. A mean of 29% of the labeled carbon reached protozoan grazers. The testate amoeba Pseudodifflugia horrida was highly abundant in biofilms and seemed to be the most important grazer on biofilm bacteria and algae. Hence, stream biofilms dominated by cyanobacteria and algae seem to play an important role in the uptake of CO₂ and transfer of autochthonous carbon through the microbial food web.

  10. Web Service Execution and Monitoring in Integrated Applications in Support of Business Communities

    NASA Astrophysics Data System (ADS)

    Chiriacescu, Rares M.; SzóKe, Alexandru; Portase, Sorin; Florea, Monica

    Emerging technology is one of the key factors driving the business world toward faster adaptation, quicker reaction, and shorter communication paths. Building upon such technologies, business communities emerge, geared toward high flexibility in their offerings and collaboration: business-to-customer and business-to-business collaborations. To adapt to market requirements, companies must address several technical challenges that arise from the main requirements of the systems they have to introduce: a high degree of flexibility, heterogeneous system collaboration, and security of the transferred data.

  11. The Physiology Constant Database of Teen-Agers in Beijing

    PubMed Central

    Wei-Qi, Wei; Guang-Jin, Zhu; Cheng-Li, Xu; Shao-Mei, Han; Bao-Shen, Qi; Li, Chen; Shu-Yu, Zu; Xiao-Mei, Zhou; Wen-Feng, Hu; Zheng-Guo, Zhang

    2004-01-01

    Physiology constants of adolescents are important for understanding growing living systems and are a useful reference in clinical and epidemiological research. Until recently, physiology constants were not available in China, and therefore most physiologists, physicians, and nutritionists had to rely on data from abroad for reference. However, differences between Eastern and Western populations cast doubt on the applicability of overseas data. We have therefore created a database system to provide a repository for the storage of physiology constants of teen-agers in Beijing. The several thousand data items are divided into hematological biochemistry, lung function, and cardiac function, with all data manually checked before being transferred into the database. The system was built from a web interface, scripts, and a relational database. The physiology data were integrated into the relational database system to provide flexible query facilities using combinations of various terms and parameters. A web browser interface was designed to facilitate searching. The database is available on the web. Statistical tables, scatter diagrams, and histograms of the data are available to both anonymous and registered users according to their queries, while only registered users can access details, including downloading data and performing advanced searches. PMID:15258669
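    The kind of relational storage and combinable query terms the abstract describes can be sketched with an in-memory SQLite table. The schema, categories, and values here are illustrative assumptions, not the actual Beijing database.

```python
# Illustrative sketch of parameterized search over a relational table of
# physiology constants, of the kind a web query form might issue.
# Schema and values are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE physiology (
    subject_id INTEGER, category TEXT, parameter TEXT, value REAL)""")
rows = [
    (1, "hematological biochemistry", "hemoglobin", 13.5),
    (1, "lung function", "FVC", 3.2),
    (2, "cardiac function", "heart rate", 72.0),
]
conn.executemany("INSERT INTO physiology VALUES (?, ?, ?, ?)", rows)

# Combine category and parameter terms, as the web interface might.
cur = conn.execute(
    "SELECT subject_id, value FROM physiology "
    "WHERE category = ? AND parameter = ?",
    ("lung function", "FVC"))
result = cur.fetchall()
```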

  12. Framework for ReSTful Web Services in OSGi

    NASA Technical Reports Server (NTRS)

    Shams, Khawaja S.; Norris, Jeffrey S.; Powell, Mark W.; Crockett, Thomas M.; Mittman, David S.; Fox, Jason M.; Joswig, Joseph C.; Wallick, Michael N.; Torres, Recaredo J.; Rabe, Kenneth

    2009-01-01

    Ensemble ReST is a software system that eases the development, deployment, and maintenance of server-side application programs to perform functions that would otherwise be performed by client software. Ensemble ReST takes advantage of the proven disciplines of ReST (Representational State Transfer). ReST leverages the standardized HTTP protocol to enable developers to offer services to a diverse variety of clients, from shell scripts to sophisticated Java application suites.
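    The ReST discipline mentioned above maps standard HTTP verbs onto resources addressed by URL paths, which is what lets clients as different as shell scripts and Java suites share one service. A minimal in-memory dispatcher illustrating the idea (a sketch, not Ensemble ReST's actual interface; the resource paths are hypothetical):

```python
# Toy ReST-style dispatcher: each resource is a URL path, and the standard
# HTTP verbs GET/PUT/DELETE read, replace, and remove it. Not a real server;
# it just shows the uniform-interface idea.

resources = {"/targets/1": {"name": "rock-7", "status": "queued"}}

def handle(method, path, body=None):
    if method == "GET":
        return 200, resources.get(path)
    if method == "PUT":
        resources[path] = body          # create or replace the resource
        return 200, body
    if method == "DELETE":
        return (204, None) if resources.pop(path, None) is not None else (404, None)
    return 405, None                    # verb not allowed

status, doc = handle("GET", "/targets/1")
```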

  13. Appendix E : FIB-63 tests.

    DOT National Transportation Integrated Search

    2013-03-01

    Web splitting cracks (Figure 1) typically form during prestress transfer, or in the days and weeks following transfer. They occur due to tensile stresses that are induced as prestressing forces in the bottom flange are distributed through the cro...

  14. Storage and distribution of pathology digital images using integrated web-based viewing systems.

    PubMed

    Marchevsky, Alberto M; Dulbandzhyan, Ronda; Seely, Kevin; Carey, Steve; Duncan, Raymond G

    2002-05-01

    Health care providers have expressed increasing interest in incorporating digital images of gross pathology specimens and photomicrographs in routine pathology reports. To describe the multiple technical and logistical challenges involved in the integration of the various components needed for the development of a system for integrated Web-based viewing, storage, and distribution of digital images in a large health system. An Oracle version 8.1.6 database was developed to store, index, and deploy pathology digital photographs via our Intranet. The database allows for retrieval of images by patient demographics or by SNOMED code information. The Intranet of a large health system accessible from multiple computers located within the medical center and at distant private physician offices. The images can be viewed using any of the workstations of the health system that have authorized access to our Intranet, using a standard browser or a browser configured with an external viewer or inexpensive plug-in software, such as Prizm 2.0. The images can be printed on paper or transferred to film using a digital film recorder. Digital images can also be displayed at pathology conferences by using wireless local area network (LAN) and secure remote technologies. The standardization of technologies and the adoption of a Web interface for all our computer systems allows us to distribute digital images from a pathology database to a potentially large group of users distributed in multiple locations throughout a large medical center.

  15. Internet, World Wide Web, and Creativity.

    ERIC Educational Resources Information Center

    Siau, Keng

    1999-01-01

    This article presents the services available on the Internet for creativity and discusses their applicability to electronic brainstorming. Services include bulletin boards, electronic mail and listservs, chat groups, file transfers, and remote login. Opportunities provided by the World Wide Web are discussed, along with tools available to…

  16. Bioaccumulation of per- and polyfluorinated alkyl substances (PFAS) in selected species from the Barents Sea food web.

    PubMed

    Haukås, Marianne; Berger, Urs; Hop, Haakon; Gulliksen, Bjørn; Gabrielsen, Geir W

    2007-07-01

    The present study reports concentrations and biomagnification potential of per- and polyfluorinated alkyl substances (PFAS) in species from the Barents Sea food web. The examined species included sea ice amphipod (Gammarus wilkitzkii), polar cod (Boreogadus saida), black guillemot (Cepphus grylle) and glaucous gull (Larus hyperboreus). These were analyzed for PFAS, polychlorinated biphenyls (PCBs), dichlorodiphenyltrichloroethanes (DDTs) and polybrominated diphenyl ethers (PBDEs). Perfluorooctane sulfonate (PFOS) was the predominant compound among the detected PFAS. Trophic levels and food web transfer of PFAS were determined using stable nitrogen isotopes (δ15N). No correlation was found between PFOS concentrations and trophic level within species. However, a non-linear relationship was established when the entire food web was analyzed. Biomagnification factors displayed values >1 for perfluorohexane sulfonate (PFHxS), perfluorononanoic acid (PFNA), PFOS and ΣPFAS₇. Multivariate analyses showed that the degree of trophic transfer of PFAS is similar to that of PCB, DDT and PBDE, despite their accumulation through different pathways.

  17. Bioaccumulation and Trophic Transfer of Mercury and Selenium in African Sub-Tropical Fluvial Reservoirs Food Webs (Burkina Faso)

    PubMed Central

    Ouédraogo, Ousséni; Chételat, John; Amyot, Marc

    2015-01-01

    The bioaccumulation and biomagnification of mercury (Hg) and selenium (Se) were investigated in sub-tropical freshwater food webs from Burkina Faso, West Africa, a region where very few ecosystem studies on contaminants have been performed. During the 2010 rainy season, samples of water, sediment, fish, zooplankton, and mollusks were collected from three water reservoirs and analysed for total Hg (THg), methylmercury (MeHg), and total Se (TSe). Ratios of δ13C and δ15N were measured to determine food web structures and patterns of contaminant accumulation and transfer to fish. Food chain lengths (FCLs) were calculated using mean δ15N of all primary consumer taxa collected as the site-specific baseline. We report relatively low concentrations of THg and TSe in most fish. We also found in all studied reservoirs short food chain lengths, ranging from 3.3 to 3.7, with most fish relying on a mixture of pelagic and littoral sources for their diet. Mercury was biomagnified in fish food webs with an enrichment factor ranging from 2.9 to 6.5 for THg and from 2.9 to 6.6 for MeHg. However, there was no evidence of selenium biomagnification in these food webs. An inverse relationship was observed between adjusted δ15N and log-transformed Se:Hg ratios, indicating that Se has a lesser protective effect in top predators, which are also the most contaminated animals with respect to MeHg. Trophic position, carbon source, and fish total length were the factors best explaining Hg concentration in fish. In a broader comparison of our study sites with literature data for other African lakes, the THg biomagnification rate was positively correlated with FCL. We conclude that these reservoir systems from tropical Western Africa have low Hg biomagnification associated with short food chains. This finding may partly explain low concentrations of Hg commonly reported in fish from this area. PMID:25875292
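    Food chain lengths and trophic positions of the kind reported above are commonly derived from consumer δ15N relative to a primary-consumer baseline. A sketch, using the widely cited ~3.4‰ per-level enrichment and a baseline fixed at trophic level 2 (the authors' exact parameters may differ, and the isotope values below are purely illustrative):

```python
# Standard trophic-position formulation from nitrogen stable isotopes:
#   TL = 2 + (d15N_consumer - d15N_baseline) / enrichment
# with ~3.4 per-mil enrichment per trophic level and primary consumers
# (the site-specific baseline) assigned level 2. Values are illustrative.

def trophic_level(d15n_consumer, d15n_baseline, enrichment=3.4):
    return 2.0 + (d15n_consumer - d15n_baseline) / enrichment

# A hypothetical top fish 5.1 per-mil above the primary-consumer baseline
# lands within the 3.3-3.7 food-chain-length range reported above.
fcl = trophic_level(d15n_consumer=12.1, d15n_baseline=7.0)
```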

  18. Empirical correspondence between trophic transfer efficiency in freshwater food webs and the slope of their size spectra.

    PubMed

    Mehner, Thomas; Lischke, Betty; Scharnweber, Kristin; Attermeyer, Katrin; Brothers, Soren; Gaedke, Ursula; Hilt, Sabine; Brucet, Sandra

    2018-06-01

    The density of organisms declines with size, because larger organisms need more energy than smaller ones and energetic losses occur when larger organisms feed on smaller ones. One potential expression of density-size distributions is the Normalized Biomass Size Spectrum (NBSS), which plots the logarithm of biomass independent of taxonomy within bins of logarithmic organismal size, divided by the bin width. Theoretically, the NBSS slope of multi-trophic communities is exactly -1.0 if the trophic transfer efficiency (TTE, ratio of production rates between adjacent trophic levels) is 10% and the predator-prey mass ratio (PPMR) is fixed at 10^4. Here we provide evidence from four multi-trophic lake food webs that empirically estimated TTEs correspond to empirically estimated slopes of the respective community NBSS. Each of the NBSS considered pelagic and benthic organisms spanning size ranges from bacteria to fish, all sampled over three seasons in 1 yr. The four NBSS slopes were significantly steeper than -1.0 (range -1.14 to -1.19, with 95% CIs excluding -1). The corresponding average TTEs were substantially lower than 10% in each of the four food webs (range 1.0% to 3.6%, mean 1.85%). The overall slope merging all biomass-size data pairs from the four systems (-1.17) was almost identical to the slope predicted from the arithmetic mean TTE of the four food webs (-1.18) assuming a constant PPMR of 10^4. Accordingly, our empirical data confirm the theoretically predicted quantitative relationship between TTE and the slope of the biomass-size distribution. Furthermore, we show that benthic and pelagic organisms can be merged into a community NBSS, but future studies have yet to explore potential differences in habitat-specific TTEs and PPMRs. We suggest that community NBSS may provide valuable information on the structure of food webs and their energetic pathways, and can result in improved accuracy of TTE-estimates. © 2018 by the Ecological Society of America.
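    The predicted slopes quoted above follow directly from TTE and PPMR. One common formulation consistent with the numbers in the abstract is slope = log10(TTE)/log10(PPMR) - 0.75, the -0.75 reflecting metabolic scaling of production with body mass; the paper's exact derivation may differ, but the arithmetic can be checked:

```python
# Check that the TTE-slope relationship reproduces the abstract's numbers:
# TTE = 10% at PPMR = 1e4 gives slope -1.0, and the mean measured
# TTE of 1.85% gives roughly -1.18.
import math

def nbss_slope(tte, ppmr=1e4):
    # slope = log10(TTE)/log10(PPMR) - 0.75 (one common formulation)
    return math.log10(tte) / math.log10(ppmr) - 0.75

reference = nbss_slope(0.10)
observed = nbss_slope(0.0185)
```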

  19. Indonesian drought monitoring from space. A report of SAFE activity: Assessment of drought impact on rice production in Indonesia by satellite remote sensing and dissemination with web-GIS

    NASA Astrophysics Data System (ADS)

    Shofiyati, Rizatus; Takeuchi, Wataru; Sofan, Parwati; Darmawan, Soni; Awaluddin; Supriatna, Wahyu

    2014-06-01

    Long droughts experienced in Indonesia in the past have been identified as one of the main factors in the failure of rice production. In this regard, special attention to monitoring the condition is encouraged to reduce the damage. Currently, various satellite data and approaches can yield valuable information for monitoring and anticipating drought hazards. Two types of drought, meteorological and agricultural, have been assessed. During the last 10 years, daily and monthly rainfall data derived from TRMM and GSMaP, together with MTSAT and AMSR-E data, have been analyzed to identify meteorological drought. Agricultural drought has been studied by observing the behavior of several indices (EVI, VCI, VHI, LST, and NDVI) from sixteen-day and monthly MODIS data over a period of 5 years (2009 - 2013). A network for data transfer has been built between LAPAN (data provider), ICALRD (implementer), IAARD Cloud Computing, and the University of Tokyo (technical supporter). A Web-GIS based Drought Monitoring Information System has been developed to disseminate the information to end users. This paper describes the implementation of the remote sensing drought monitoring model and the development of the Web-GIS and satellite based information system.

  20. BioServices: a common Python package to access biological Web Services programmatically.

    PubMed

    Cokelaer, Thomas; Pultz, Dennis; Harder, Lea M; Serra-Musach, Jordi; Saez-Rodriguez, Julio

    2013-12-15

    Web interfaces provide access to numerous biological databases, and many can be accessed programmatically thanks to Web Services. Building applications that combine several of them would benefit from a single framework. BioServices is a comprehensive Python framework that provides programmatic access to major bioinformatics Web Services (e.g. KEGG, UniProt, BioModels, ChEMBLdb). Wrapping additional Web Services based on either Representational State Transfer or Simple Object Access Protocol/Web Services Description Language technologies is eased by the use of object-oriented programming. BioServices releases and documentation are available at http://pypi.python.org/pypi/bioservices under a GPL-v3 license.
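    The object-oriented wrapping the abstract mentions can be illustrated with a toy class hierarchy in which each service hides its URL construction behind methods. This is a sketch of the pattern only, not BioServices' actual API; the base URL, endpoint, and class names are hypothetical.

```python
# Sketch of object-oriented wrapping of a REST Web Service: a base class
# handles URL/query construction, and each service subclasses it with
# domain-specific methods. Endpoint and names are hypothetical.
from urllib.parse import urlencode

class RESTService:
    def __init__(self, base_url):
        self.base_url = base_url.rstrip("/")

    def request_url(self, resource, **params):
        query = ("?" + urlencode(params)) if params else ""
        return f"{self.base_url}/{resource}{query}"

class UniProtLike(RESTService):
    """Hypothetical wrapper for a UniProt-style search endpoint."""
    def search_url(self, query, fmt="tab"):
        return self.request_url("search", query=query, format=fmt)

svc = UniProtLike("https://example.org/uniprot")
url = svc.search_url("zap70")
```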

  1. Web Services for Astronomical Databases: Connecting AIPS++ to the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Douthit, M. C.

    2002-12-01

    In the year 2010, the NRAO will be operating four of the world's most powerful radio telescopes: GBT, EVLA, VLBA, and ALMA (with international partnership). Multi-terabyte data sets will quickly accumulate, with ALMA and EVLA each generating twenty-five to fifty megabytes of data per second. It will be imperative for scientists to possess software capable of automated data reduction, image synthesis, and archiving. With the evolution of AIPS++ and the recently developed concepts of the image pipeline, the participation of the NRAO in the virtual observatories of the future is now on the horizon, creating a need for fast archive access and web service development in AIPS++. When the software package began over 10 years ago, it was not designed for data transfer via the web. In response to the demands of the NVO, we have designed and implemented an application layer that allows our system to communicate with others. Sponsored by the NRAO and California State University, San Marcos.

  2. Transfer of gold nanoparticles from the water column to the estuarine food web

    NASA Astrophysics Data System (ADS)

    Ferry, John L.; Craig, Preston; Hexel, Cole; Sisco, Patrick; Frey, Rebecca; Pennington, Paul L.; Fulton, Michael H.; Scott, I. Geoff; Decho, Alan W.; Kashiwada, Shosaku; Murphy, Catherine J.; Shaw, Timothy J.

    2009-07-01

    Within the next five years the manufacture of large quantities of nanomaterials may lead to unintended contamination of terrestrial and aquatic ecosystems. The unique physical, chemical and electronic properties of nanomaterials allow new modes of interaction with environmental systems that can have unexpected impacts. Here, we show that gold nanorods can readily pass from the water column to the marine food web in three laboratory-constructed estuarine mesocosms containing sea water, sediment, sea grass, microbes, biofilms, snails, clams, shrimp and fish. A single dose of gold nanorods (65 nm length × 15 nm diameter) was added to each mesocosm and their distribution in the aqueous and sediment phases monitored over 12 days. Nanorods partitioned between biofilms, sediments, plants, animals and sea water with a recovery of 84.4%. Clams and biofilms accumulated the most nanoparticles on a per mass basis, suggesting that gold nanorods can readily pass from the water column to the marine food web.

  3. VISIBIOweb: visualization and layout services for BioPAX pathway models

    PubMed Central

    Dilek, Alptug; Belviranli, Mehmet E.; Dogrusoz, Ugur

    2010-01-01

    With recent advancements in techniques for cellular data acquisition, information on cellular processes has been increasing at a dramatic rate. Visualization is critical to analyzing and interpreting complex information; representing cellular processes or pathways is no exception. VISIBIOweb is a free, open-source, web-based pathway visualization and layout service for pathway models in BioPAX format. With VISIBIOweb, one can obtain well-laid-out views of pathway models using the standard notation of the Systems Biology Graphical Notation (SBGN), and can embed such views within one's web pages as desired. Pathway views may be navigated using zoom and scroll tools; pathway object properties, including any external database references available in the data, may be inspected interactively. The automatic layout component of VISIBIOweb may also be accessed programmatically from other tools using Hypertext Transfer Protocol (HTTP). The web site is free and open to all users and there is no login requirement. It is available at: http://visibioweb.patika.org. PMID:20460470

  4. Flexible server-side processing of climate archives

    NASA Astrophysics Data System (ADS)

    Juckes, Martin; Stephens, Ag; Damasio da Costa, Eduardo

    2014-05-01

    The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
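    The load-balancing policy described above (short jobs run in near real-time, longer jobs queued with their status exposed to the dashboard) can be sketched as a simple dispatch rule. The threshold value and job names are illustrative assumptions, not the CEDA WPS configuration.

```python
# Sketch of the short-job/long-job dispatch policy: jobs under an assumed
# runtime threshold execute immediately; longer ones go to a batch queue
# and report "queued" status to the dashboard.
from collections import deque

QUEUE_THRESHOLD_S = 30          # assumed cutoff for "short" jobs
queue = deque()                 # deferred jobs, in submission order
status = {}                     # job_id -> status shown on the dashboard

def submit(job_id, estimated_seconds):
    if estimated_seconds <= QUEUE_THRESHOLD_S:
        status[job_id] = "running"     # near real-time execution
    else:
        queue.append(job_id)           # deferred to the batch queue
        status[job_id] = "queued"
    return status[job_id]

submit("subset-tas", 5)
submit("regrid-cmip5-ensemble", 3600)
```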

  5. Flexible server-side processing of climate archives

    NASA Astrophysics Data System (ADS)

    Juckes, M. N.; Stephens, A.; da Costa, E. D.

    2013-12-01

    The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.

  6. The National Institutes of Health Clinical Center Digital Imaging Network, Picture Archival and Communication System, and Radiology Information System.

    PubMed

    Goldszal, A F; Brown, G K; McDonald, H J; Vucich, J J; Staab, E V

    2001-06-01

    In this work, we describe the digital imaging network (DIN), picture archival and communication system (PACS), and radiology information system (RIS) currently being implemented at the Clinical Center, National Institutes of Health (NIH). These systems are presently in clinical operation. The DIN is a redundant meshed network designed to address gigabit density and expected high bandwidth requirements for image transfer and server aggregation. The PACS projected workload is 5.0 TB of new imaging data per year. Its architecture consists of a central, high-throughput Digital Imaging and Communications in Medicine (DICOM) data repository and distributed redundant array of inexpensive disks (RAID) servers employing Fibre Channel technology for immediate delivery of imaging data. On-demand distribution of images and reports to clinicians and researchers is accomplished via a clustered web server. The RIS follows a client-server model and provides tools to order exams, schedule resources, retrieve and review results, and generate management reports. The RIS-hospital information system (HIS) interfaces include admissions, discharges, and transfers (ADTs)/demographics, orders, appointment notifications, doctors update, and results.

  7. Adolescents' and young adults' transition experiences when transferring from paediatric to adult care: a qualitative metasynthesis.

    PubMed

    Fegran, Liv; Hall, Elisabeth O C; Uhrenfeldt, Lisbeth; Aagaard, Hanne; Ludvigsen, Mette Spliid

    2014-01-01

    The objective of this study was to synthesize qualitative studies of how adolescents and young adults with chronic diseases experience the transition from paediatric to adult hospital care. The review is designed as a qualitative metasynthesis and follows Sandelowski and Barroso's guidelines for synthesizing qualitative research. Literature searches were conducted in the databases PubMed, Ovid, Scopus, Cumulative Index to Nursing and Allied Health Literature (CINAHL), ISI Web of Science, and Nordic and German databases covering the period from 1999 to November 2010. In addition, forward citation snowball searching was conducted in the databases Ovid, CINAHL, ISI Web of Science, Scopus and Google Scholar. Of the 1143 records screened, 18 studies were included. Inclusion criteria were qualitative studies in English, German or Nordic languages on adolescents' and young adults' transition experiences when transferring from paediatric to adult care. There was no age limit, provided the focus was on the actual transfer process and participants had a chronic somatic disease. The studies were appraised as suitable for inclusion using a published appraisal tool. Data were analyzed into metasummaries and a metasynthesis according to established guidelines for synthesis of qualitative research. Four themes illustrating experiences of loss of familiar surroundings and relationships, combined with insecurity and a feeling of being unprepared for what was ahead, were identified: facing changes in significant relationships, moving from a familiar to an unknown ward culture, being prepared for transfer and achieving responsibility. Young adults' transition experiences seem to be comparable across diagnoses. Feelings of not belonging and of being redundant during the transfer process are striking. 
Health care professionals' appreciation of young adults' need to be acknowledged and valued as competent collaborators in their own transfer is crucial, and may protect them from additional health problems during a vulnerable phase. Further research including participants across various cultures and health care systems is needed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. 76 FR 43960 - NARA Records Reproduction Fees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    ... transferred to NARA and maintain its fee schedule on NARA's Web site http://www.archives.gov . The proposed... document is faint or too dark, it requires additional time to obtain a readable image. In TABLE 1 below... our Web site ( http://www.archives.gov ) annually when announcing that records reproduction fees will...

  9. WEB-IS2: Next Generation Web Services Using Amira Visualization Package

    NASA Astrophysics Data System (ADS)

    Yang, X.; Wang, Y.; Bollig, E. F.; Kadlec, B. J.; Garbow, Z. A.; Yuen, D. A.; Erlebacher, G.

    2003-12-01

    Amira (www.amiravis.com) is a powerful 3-D visualization package and has been employed recently by the science and engineering communities to gain insight into their data. We present a new web-based interface to Amira, packaged in a Java applet. We have developed a module called WEB-IS/Amira (WEB-IS2), which provides web-based access to Amira. This tool allows earth scientists to manipulate Amira controls remotely and to analyze, render and view large datasets over the internet, without regard for time or location. This could have important ramifications for GRID computing. The design of our implementation will soon allow multiple users to visually collaborate by manipulating a single dataset through a variety of client devices. These clients will only require a browser capable of displaying Java applets. As the deluge of data continues, innovative solutions that maximize ease of use without sacrificing efficiency or flexibility will continue to gain in importance, particularly in the Earth sciences. Major initiatives, such as Earthscope (http://www.earthscope.org), which will generate at least a terabyte of data daily, stand to profit enormously from a system such as WEB-IS/Amira (WEB-IS2). We discuss our use of SOAP (Livingston, D., Advanced SOAP for Web development, Prentice Hall, 2002), a novel 2-way communication protocol, as a means of providing remote commands, and efficient point-to-point transfer of binary image data. We will present our initial experiences with the use of Naradabrokering (www.naradabrokering.org) as a means to decouple clients and servers. Information is submitted to the system as a published item, while it is retrieved through a subscription mechanism, via what are known as "topics". These topic headers, their contents, and the list of subscribers are automatically tracked by Naradabrokering. 
This novel approach promises a high degree of fault tolerance, flexibility with respect to client diversity, and language independence for the services (Erlebacher, G., Yuen, D.A., and F. Dubuffet, Current trends and demands in visualization in the geosciences, Electron. Geosciences, 4, 2001).
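The topic-based decoupling described above can be sketched as a minimal in-process broker; `TopicBroker` and the topic name below are illustrative stand-ins, not NaradaBrokering's actual API.

```python
from collections import defaultdict

class TopicBroker:
    """Minimal in-process sketch of topic-based publish/subscribe.

    Hypothetical stand-in for a brokering service: producers publish
    items under a topic header, consumers receive them via subscriptions,
    and neither side needs to know about the other.
    """

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, item):
        # Deliver the published item to every subscriber of this topic
        for callback in self._subscribers[topic]:
            callback(item)

broker = TopicBroker()
received = []
broker.subscribe("visualization/frames", received.append)
broker.publish("visualization/frames", b"\x89PNG...")  # e.g. rendered image data
print(len(received))  # 1
```

In a real brokered system the callbacks would run in other processes or on other hosts, which is what buys the fault tolerance and client diversity the abstract mentions.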

  10. Reuse of the Cloud Analytics and Collaboration Environment within Tactical Applications (TacApps): A Feasibility Analysis

    DTIC Science & Technology

    2016-03-01

    Representational state transfer  Java messaging service  Java application programming interface (API)  Internet relay chat (IRC)/extensible messaging and...JBoss application server or an Apache Tomcat servlet container instance. The relational database management system can be either PostgreSQL or MySQL ... Java library called direct web remoting. This library has been part of the core CACE architecture for quite some time; however, there have not been

  11. Applications of the U.S. Geological Survey's global land cover product

    USGS Publications Warehouse

    Reed, B.

    1997-01-01

The U.S. Geological Survey (USGS), in partnership with several international agencies and universities, has produced a global land cover characteristics database. The land cover data were created using multitemporal analysis of advanced very high resolution radiometer satellite images in conjunction with other existing geographic data. A translation table permits the conversion of the land cover classes into several conventional land cover schemes that are used by ecosystem modelers, climate modelers, land management agencies, and other user groups. The alternative classification schemes include Global Ecosystems, the Biosphere Atmosphere Transfer Scheme, the Simple Biosphere, the USGS Anderson Level 2, and the International Geosphere Biosphere Programme. The distribution system for these data is through the World Wide Web (the web site address is: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html) or by magnetic media upon special request. The availability of the data over the World Wide Web, in conjunction with the flexible database structure, allows easy data access to a wide range of users. The web site contains a user registration form that allows analysis of the diverse applications of large-area land cover data. Currently, applications are divided among mapping (20 percent), conservation (30 percent), and modeling (35 percent).

  12. Design of Remote GPRS-based Gas Data Monitoring System

    NASA Astrophysics Data System (ADS)

    Yan, Xiyue; Yang, Jianhua; Lu, Wei

    2018-01-01

In order to solve the problem of remote data transmission from gas flowmeters, and to realize unattended operation on site, an unattended remote monitoring system for gas data based on GPRS is designed in this paper. The slave computer of this system adopts an embedded microprocessor to read data from the gas flowmeter through the RS-232 bus and transfer it to the host computer through a DTU. In the host computer, the VB program dynamically binds the Winsock control to receive and parse the data. By using dynamic data exchange, the Kingview configuration software realizes historical trend curves, real-time trend curves, alarms, printing, web browsing, and other functions.
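The host computer's receive-and-parse step can be sketched as follows; the frame format is an assumption, since the paper does not specify the wire protocol used between the DTU and the host.

```python
def parse_flowmeter_frame(frame: bytes) -> dict:
    """Parse one hypothetical ASCII frame received from the DTU over TCP.

    The paper does not give the wire format; this sketch assumes a simple
    comma-separated frame such as b"METER01,42.7,m3/h\r\n" carrying a
    station identifier, a flow reading, and its unit.
    """
    station_id, value, unit = frame.strip().decode("ascii").split(",")
    return {"station": station_id, "flow": float(value), "unit": unit}

print(parse_flowmeter_frame(b"METER01,42.7,m3/h\r\n"))
# {'station': 'METER01', 'flow': 42.7, 'unit': 'm3/h'}
```

In the deployed system this parsing would sit inside the socket receive loop, with each parsed record handed on to the trend-curve and alarm logic.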

  13. Spatial Data Services for Interdisciplinary Applications from the NASA Socioeconomic Data and Applications Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.

    2016-12-01

The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve.
An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
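A PES-style polygon query amounts to aggregating gridded population counts within a user-supplied boundary. Below is a toy sketch of that server-side idea, using a simple ray-casting point-in-polygon test; the actual PES interface and GPWv4 data handling are not shown here, and the grid-cell values are made up.

```python
def point_in_polygon(lon, lat, ring):
    """Ray-casting test: is (lon, lat) inside the polygon `ring`
    (a list of (lon, lat) vertices)?"""
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge straddles the test latitude
            if lon < (x2 - x1) * (lat - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def estimate_population(ring, grid):
    """Sum counts of grid cells whose centers fall inside the polygon.

    `grid` is a list of (lon, lat, population) cell centers, a stand-in
    for a gridded population raster such as GPWv4.
    """
    return sum(pop for lon, lat, pop in grid if point_in_polygon(lon, lat, ring))

ring = [(0, 0), (2, 0), (2, 2), (0, 2)]
grid = [(0.5, 0.5, 100), (1.5, 1.5, 250), (3.0, 0.5, 999)]
print(estimate_population(ring, grid))  # 350
```

A production service would additionally handle partial cell overlap, projections, and the accuracy measures the abstract mentions.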

  14. Globus Identity, Access, and Data Management: Platform Services for Collaborative Science

    NASA Astrophysics Data System (ADS)

    Ananthakrishnan, R.; Foster, I.; Wagner, R.

    2016-12-01

Globus is software-as-a-service for research data management, developed at, and operated by, the University of Chicago. Globus, accessible at www.globus.org, provides high speed, secure file transfer; file sharing directly from existing storage systems; and data publication to institutional repositories. 40,000 registered users have used Globus to transfer tens of billions of files totaling hundreds of petabytes between more than 10,000 storage systems within campuses and national laboratories in the US and internationally. Web, command line, and REST interfaces support both interactive use and integration into applications and infrastructures. An important component of the Globus system is its foundational identity and access management (IAM) platform service, Globus Auth. Both Globus research data management and other applications use Globus Auth for brokering authentication and authorization interactions between end-users, identity providers, resource servers (services), and a range of clients, including web, mobile, and desktop applications, and other services. Compliant with important standards such as OAuth, OpenID, and SAML, Globus Auth provides mechanisms required for an extensible, integrated ecosystem of services and clients for the research and education community. It underpins projects such as the US National Science Foundation's XSEDE system, NCAR's Research Data Archive, and the DOE Systems Biology Knowledge Base. Current work is extending Globus services to be compliant with FedRAMP standards for security assessment, authorization, and monitoring for cloud services. We will present Globus IAM solutions and give examples of Globus use in various projects for federated access to resources. We will also describe how Globus Auth and Globus research data management capabilities enable rapid development and low-cost operations of secure data sharing platforms that leverage Globus services and integrate them with local policy and security.
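The brokering Globus Auth performs rests on standard OAuth 2.0 flows. As a rough illustration, an authorization-code request URL can be assembled as below; the endpoint, client ID, and scope names are placeholders, not Globus Auth's documented values.

```python
import secrets
from urllib.parse import urlencode, urlparse, parse_qs

def build_authorize_url(base, client_id, redirect_uri, scopes):
    """Build a standard OAuth 2.0 authorization-code request URL.

    `base` and `scopes` are hypothetical here; a real client would use
    the provider's documented authorization endpoint and scope strings.
    """
    state = secrets.token_urlsafe(16)  # CSRF token, verified on redirect
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
        "state": state,
    }
    return f"{base}?{urlencode(params)}", state

url, state = build_authorize_url(
    "https://auth.example.org/v2/oauth2/authorize",  # placeholder endpoint
    "my-client-id",
    "https://app.example.org/callback",
    ["openid", "transfer"],
)
print(urlparse(url).path)  # /v2/oauth2/authorize
```

After the user authenticates, the client exchanges the returned code for tokens at the provider's token endpoint, which is the step that lets resource servers accept requests on the user's behalf.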

  15. The National Water Data Exchange-capabilities and trends in the dissemination and exchange of water data

    USGS Publications Warehouse

    Burton, J.S.

    1998-01-01

This paper discusses the programmes of the National Water Data Exchange (NAWDEX) in providing access to US Geological Survey (USGS) water data and water-related information. NAWDEX disseminates water data and water-related information by working cooperatively through a network of 68 Assistance Centers to more than 430 member organizations. In addition, NAWDEX provides access to the USGS Water Data Storage System (WATSTORE) and the US Environmental Protection Agency's Storage and Retrieval System (STORET). Recently, the trend has been to make water resources data available over the World Wide Web on the Internet. The NAWDEX homepage, located at Uniform Resource Locator http://h2o.er.usgs.gov/public/nawdex/nawdex.html, provides links to (a) Selected Water Resources Abstracts; (b) National Water Conditions Report; (c) historical streamflow data; and (d) real-time streamflow conditions. NAWDEX also transfers data to users over the Internet through the file transfer protocol (FTP).

  16. Working More Productively: Tools for Administrative Data

    PubMed Central

    Roos, Leslie L; Soodeen, Ruth-Ann; Bond, Ruth; Burchill, Charles

    2003-01-01

    Objective This paper describes a web-based resource () that contains a series of tools for working with administrative data. This work in knowledge management represents an effort to document, find, and transfer concepts and techniques, both within the local research group and to a more broadly defined user community. Concepts and associated computer programs are made as “modular” as possible to facilitate easy transfer from one project to another. Study Setting/Data Sources Tools to work with a registry, longitudinal administrative data, and special files (survey and clinical) from the Province of Manitoba, Canada in the 1990–2003 period. Data Collection Literature review and analyses of web site utilization were used to generate the findings. Principal Findings The Internet-based Concept Dictionary and SAS macros developed in Manitoba are being used in a growing number of research centers. Nearly 32,000 hits from more than 10,200 hosts in a recent month demonstrate broad interest in the Concept Dictionary. Conclusions The tools, taken together, make up a knowledge repository and research production system that aid local work and have great potential internationally. Modular software provides considerable efficiency. The merging of documentation and researcher-to-researcher dissemination keeps costs manageable. PMID:14596394

  17. eHealth System for Collecting and Utilizing Patient Reported Outcome Measures for Personalized Treatment and Care (PROMPT-Care) Among Cancer Patients: Mixed Methods Approach to Evaluate Feasibility and Acceptability.

    PubMed

    Girgis, Afaf; Durcinoska, Ivana; Levesque, Janelle V; Gerges, Martha; Sandell, Tiffany; Arnold, Anthony; Delaney, Geoff P

    2017-10-02

    Despite accumulating evidence indicating that collecting patient-reported outcomes (PROs) and transferring results to the treating health professional in real time has the potential to improve patient well-being and cancer outcomes, this practice is not widespread. The aim of this study was to test the feasibility and acceptability of PROMPT-Care (Patient Reported Outcome Measures for Personalized Treatment and Care), a newly developed electronic health (eHealth) system that facilitates PRO data capture from cancer patients, data linkage and retrieval to support clinical decisions and patient self-management, and data retrieval to support ongoing evaluation and innovative research. We developed an eHealth system in consultation with content-specific expert advisory groups and tested it with patients receiving treatment or follow-up care in two hospitals in New South Wales, Australia, over a 3-month period. Participants were recruited in clinic and completed self-report Web-based assessments either just before their upcoming clinical consultation or every 4 weeks if in follow-up care. A mixed methods approach was used to evaluate feasibility and acceptability of PROMPT-Care; data collected throughout the study informed the accuracy and completeness of data transfer procedures, and extent of missing data was determined from participants' assessments. Patients participated in cognitive interviews while completing their first assessment and completed evaluation surveys and interviews at study-end to assess system acceptability and usefulness of patient self-management resources, and oncology staff were interviewed at study-end to determine the acceptability and perceived usefulness of real-time PRO reporting. A total of 42 patients consented to the study; 7 patients were withdrawn before starting the intervention primarily because of changes in eligibility. Overall, 35 patients (13 on treatment and 22 in follow-up) completed 67 assessments during the study period. 
Mean completeness of patient-reported data was 93%, with 100% accuracy of data transfer. Ten patients completed cognitive interviews, 28 completed evaluation surveys, and 14 completed evaluation interviews at study-end. PROMPT-Care patient acceptability was high: 100% (28/28) reported the time to complete the Web-based assessments (average 15 min) as about right, most were willing to answer more questions (79%, 22/28 yes), 96% (27/28) found the Web-based assessment easier than or the same as completing a paper copy, and they valued the self-management resources. Oncology staff (n=5) also reported high acceptability and potential feasibility of the system. Patients and oncology staff found the PROMPT-Care system to be highly acceptable, and the results suggest that it would be feasible to implement it in an oncology setting. Suggested modifications to the patient assessment survey, clinician access to the reports, and system requirements will be made as part of the next stage of large-scale testing and future implementation of the system as part of routine care. Australian New Zealand Clinical Trials Registry ACTRN1261500135294; https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=369299&isReview=true (Archived by WebCite at http://www.webcitation.org/6lzylG5A0). ©Afaf Girgis, Ivana Durcinoska, Janelle V Levesque, Martha Gerges, Tiffany Sandell, Anthony Arnold, Geoff P Delaney, The PROMPT-Care Program Group. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 02.10.2017.

  18. Modeling selenium bioaccumulation through arthropod food webs in San Francisco Bay, California, USA.

    PubMed

    Schlekat, Christian E; Purkerson, David G; Luoma, Samuel N

    2004-12-01

Trophic transfer is the main process by which upper trophic level wildlife are exposed to selenium. Transfers through lower levels of a predator's food web thus can be instrumental in determining the threat of selenium in an ecosystem. Little is known about Se transfer through pelagic, zooplankton-based food webs in San Francisco Bay ([SFB], CA, USA), which serve as an energy source for important predators such as striped bass. A dynamic multipathway bioaccumulation model was used to model Se transfer from phytoplankton to pelagic copepods to carnivorous mysids (Neomysis mercedis). Uptake rates of dissolved Se, depuration rates, and assimilation efficiencies (AE) for the model were determined for copepods and mysids in the laboratory. Small (73-250 microm) and large (250-500 microm) herbivorous zooplankton collected from SFB (Oithona/Limnoithona and Acartia sp.) assimilated Se with similar efficiencies (41-52%) from phytoplankton. Mysids assimilated 73% of Se from small herbivorous zooplankton; Se AE from larger herbivorous zooplankton was significantly lower (61%). Selenium depuration rates were high for both zooplankton and mysids (12-25% d(-1)), especially compared to bivalves (2-3% d(-1)). The model predicted steady state Se concentrations in mysids similar to those observed in the field. The predicted concentration range (1.5-5.4 microg g(-1)) was lower than concentrations of 4.5 to 24 microg g(-1) observed in bivalves from the bay. Differences in efflux between mysids and bivalves were the best explanation for the differences in uptake. The results suggest that the risk of selenium toxicity to predators feeding on N. mercedis would be less than the risk to predators feeding on bivalves. Management of selenium contamination should include food web analyses to focus on the most important exposure pathways identified for a given watershed.
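The dynamic multipathway bioaccumulation model referred to above follows the general biodynamic form: uptake from water plus uptake from diet, minus efflux and growth dilution. A minimal sketch of its steady-state solution follows; the parameter values are purely illustrative, not the study's measured values.

```python
def steady_state_concentration(ku, Cw, AE, IR, Cf, ke, g):
    """Steady-state tissue concentration from the biodynamic model

        dC/dt = ku*Cw + AE*IR*Cf - (ke + g)*C,

    i.e. Css = (ku*Cw + AE*IR*Cf) / (ke + g) when dC/dt = 0.
    """
    return (ku * Cw + AE * IR * Cf) / (ke + g)

# Illustrative values only (not the paper's parameters):
# ku: uptake rate from water (L g^-1 d^-1), Cw: dissolved Se (ug L^-1),
# AE: assimilation efficiency, IR: ingestion rate (g g^-1 d^-1),
# Cf: Se in food (ug g^-1), ke: efflux rate (d^-1), g: growth rate (d^-1)
css = steady_state_concentration(ku=0.002, Cw=0.2, AE=0.73,
                                 IR=0.5, Cf=1.0, ke=0.2, g=0.05)
print(round(css, 3))
```

The model's qualitative point is visible in the formula: because Css scales inversely with (ke + g), the high depuration rates measured for mysids drive their predicted concentrations well below those of slow-effluxing bivalves.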

  19. Modeling selenium bioaccumulation through arthropod food webs in San Francisco Bay, California, USA

    USGS Publications Warehouse

    Schlekat, C.E.; Purkerson, D.G.; Luoma, S.N.

    2004-01-01

Trophic transfer is the main process by which upper trophic level wildlife are exposed to selenium. Transfers through lower levels of a predator's food web thus can be instrumental in determining the threat of selenium in an ecosystem. Little is known about Se transfer through pelagic, zooplankton-based food webs in San Francisco Bay ([SFB], CA, USA), which serve as an energy source for important predators such as striped bass. A dynamic multipathway bioaccumulation model was used to model Se transfer from phytoplankton to pelagic copepods to carnivorous mysids (Neomysis mercedis). Uptake rates of dissolved Se, depuration rates, and assimilation efficiencies (AE) for the model were determined for copepods and mysids in the laboratory. Small (73-250 µm) and large (250-500 µm) herbivorous zooplankton collected from SFB (Oithona/Limnoithona and Acartia sp.) assimilated Se with similar efficiencies (41-52%) from phytoplankton. Mysids assimilated 73% of Se from small herbivorous zooplankton; Se AE from larger herbivorous zooplankton was significantly lower (61%). Selenium depuration rates were high for both zooplankton and mysids (12-25% d-1), especially compared to bivalves (2-3% d-1). The model predicted steady state Se concentrations in mysids similar to those observed in the field. The predicted concentration range (1.5-5.4 µg g-1) was lower than concentrations of 4.5 to 24 µg g-1 observed in bivalves from the bay. Differences in efflux between mysids and bivalves were the best explanation for the differences in uptake. The results suggest that the risk of selenium toxicity to predators feeding on N. mercedis would be less than the risk to predators feeding on bivalves. Management of selenium contamination should include food web analyses to focus on the most important exposure pathways identified for a given watershed.

  20. Global change in the trophic functioning of marine food webs.

    PubMed

    Maureaud, Aurore; Gascuel, Didier; Colléter, Mathieu; Palomares, Maria L D; Du Pontavice, Hubert; Pauly, Daniel; Cheung, William W L

    2017-01-01

The development of fisheries in the oceans, and other human drivers such as climate warming, have led to changes in species abundance, assemblages, trophic interactions, and ultimately in the functioning of marine food webs. Here, using a trophodynamic approach and global databases of catches and life history traits of marine species, we tested the hypothesis that anthropogenic ecological impacts may have led to changes in the global parameters defining the transfers of biomass within the food web. First, we developed two indicators to assess such changes: the Time Cumulated Indicator (TCI) measuring the residence time of biomass within the food web, and the Efficiency Cumulated Indicator (ECI) quantifying the fraction of secondary production reaching the top of the trophic chain. Then, we assessed, at the large marine ecosystem scale, the worldwide change of these two indicators over the 1950-2010 time period. Global trends were identified and cluster analyses were used to characterize the variability of trends between ecosystems. Results showed that the most common pattern over the study period is a global decrease in TCI, while the ECI indicator tends to increase. Thus, changes in species assemblages would induce faster and apparently more efficient biomass transfers in marine food webs. Results also suggested that the main driver of change over that period had been the large increase in fishing pressure. The largest changes occurred in ecosystems where 'fishing down the marine food web' is most intensive.

  1. COTS technologies for telemedicine applications.

    PubMed

    Triunfo, Riccardo; Tumbarello, Roberto; Sulis, Alessandro; Zanetti, Gianluigi; Lianas, Luca; Meloni, Vittorio; Frexia, Francesca

    2010-01-01

    To demonstrate a simple low-cost system for tele-echocardiology, focused on paediatric cardiology applications. The system was realized using open-source software and COTS technologies. It is based on the transmission of two simultaneous video streams, obtained by direct digitization of the output of an ultrasound machine and by a netcam showing the examination that is taking place. These streams are then embedded into a web page so they are accessible, together with basic video controls, via a standard web browser. The system can also record video streams on a server for further use. The system was tested on a small group of neonatal cases with suspected cardiopathies for a preliminary assessment of its features and diagnostic capabilities. Both the clinical and technological results were encouraging and are leading the way for further experimentation. The presented system can transfer clinical images and videos in an efficient way and in real time. It can be used in the same hospital to support internal consultancy requests, in remote areas using Internet connections and for didactic purposes using low cost COTS appliances and simple interfaces for end users. The solution proposed can be extended to control different medical appliances in those remote hospitals.

  2. Communication and collaboration technologies.

    PubMed

    Cheeseman, Susan E

    2012-01-01

This is the third in a series of columns exploring health information technology (HIT) in the neonatal intensive care unit (NICU). The first column provided background information on the implementation of information technology throughout the health care delivery system, as well as the requisite informatics competencies needed for nurses to fully engage in the digital era of health care. The second column focused on information and resources to master basic computer competencies described by the TIGER initiative (Technology Informatics Guiding Education Reform) as learning about computers, computer networks, and the transfer of data.1 This column will provide additional information related to basic computer competencies, focusing on communication and collaboration technologies. Computers and the Internet have transformed the way we communicate and collaborate. Electronic communication is the ability to exchange information through the use of computer equipment and software.2 Broadly defined, any technology that facilitates linking one or more individuals together is a collaborative tool. Collaboration using technology encompasses an extensive range of applications that enable groups of individuals to work together including e-mail, instant messaging (IM), and several web applications collectively referred to as Web 2.0 technologies. The term Web 2.0 refers to web applications where users interact and collaborate with each other in a collective exchange of ideas generating content in a virtual community. Examples of Web 2.0 technologies include social networking sites, blogs, wikis, video sharing sites, and mashups. Many organizations are developing collaborative strategies and tools for employees to connect and interact using web-based social media technologies.3

  3. Climate change could drive marine food web collapse through altered trophic flows and cyanobacterial proliferation

    PubMed Central

    Ullah, Hadayet; Goldenberg, Silvan U.; Fordham, Damien A.

    2018-01-01

    Global warming and ocean acidification are forecast to exert significant impacts on marine ecosystems worldwide. However, most of these projections are based on ecological proxies or experiments on single species or simplified food webs. How energy fluxes are likely to change in marine food webs in response to future climates remains unclear, hampering forecasts of ecosystem functioning. Using a sophisticated mesocosm experiment, we model energy flows through a species-rich multilevel food web, with live habitats, natural abiotic variability, and the potential for intra- and intergenerational adaptation. We show experimentally that the combined stress of acidification and warming reduced energy flows from the first trophic level (primary producers and detritus) to the second (herbivores), and from the second to the third trophic level (carnivores). Warming in isolation also reduced the energy flow from herbivores to carnivores, the efficiency of energy transfer from primary producers and detritus to herbivores and detritivores, and the living biomass of detritivores, herbivores, and carnivores. Whilst warming and acidification jointly boosted primary producer biomass through an expansion of cyanobacteria, this biomass was converted to detritus rather than to biomass at higher trophic levels—i.e., production was constrained to the base of the food web. In contrast, ocean acidification affected the food web positively by enhancing trophic flow from detritus and primary producers to herbivores, and by increasing the biomass of carnivores. Our results show how future climate change can potentially weaken marine food webs through reduced energy flow to higher trophic levels and a shift towards a more detritus-based system, leading to food web simplification and altered producer–consumer dynamics, both of which have important implications for the structuring of benthic communities. PMID:29315309

  4. The development of a national surveillance system for monitoring blood use and inventory levels at sentinel hospitals in South Korea.

    PubMed

    Lim, Y A; Kim, H H; Joung, U S; Kim, C Y; Shin, Y H; Lee, S W; Kim, H J

    2010-04-01

We developed a web-based program for a national surveillance system to determine baseline data regarding the supply and demand of blood products at sentinel hospitals in South Korea. Sentinel hospitals were invited to participate in a 1-month pilot test. The data for receipts and exports of blood from each hospital information system were converted into comma-separated value files according to a specific conversion rule. The daily data from the sites could be transferred to the web-based program server using a semi-automated submission procedure: pressing a key allowed the program to automatically compute the blood inventory level as well as other indices including the minimal inventory ratio (MIR), ideal inventory ratio (IIR), supply index (SI) and utilisation index (UI). The national surveillance system was referred to as the Korean Blood Inventory Monitoring System (KBIMS) and the web-based program for KBIMS was referred to as the Blood Inventory Monitoring System (BMS). A total of 30 256 red blood cell (RBC) units were submitted as receipt data; however, only 83% of the receipt data were submitted to the BMS server as export data (25 093 RBC units). Median values were 2.67 for MIR, 1.08 for IIR, 1.00 for SI, 0.88 for UI and 5.33 for the ideal inventory day. The BMS program was easy to use and is expected to provide a useful tool for monitoring hospital inventory levels. This information will provide baseline data regarding the supply and demand of blood products in South Korea.
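The receipt/export CSV files described above lend themselves to a simple aggregation step when computing inventory levels. A minimal sketch, assuming a hypothetical "blood_type,units" column layout; the actual KBIMS conversion rule is not given in the abstract.

```python
import csv
import io

def inventory_by_type(receipts_csv: str, exports_csv: str) -> dict:
    """Compute per-blood-type inventory as receipts minus exports.

    The CSV layout ("blood_type,units") is an assumption made for this
    sketch, not the system's documented conversion rule.
    """
    def totals(text):
        counts = {}
        for row in csv.DictReader(io.StringIO(text)):
            counts[row["blood_type"]] = counts.get(row["blood_type"], 0) + int(row["units"])
        return counts

    received, exported = totals(receipts_csv), totals(exports_csv)
    # Inventory = units received minus units exported, per blood type
    return {bt: received.get(bt, 0) - exported.get(bt, 0)
            for bt in set(received) | set(exported)}

receipts = "blood_type,units\nA+,120\nO-,40\nA+,30\n"
exports = "blood_type,units\nA+,100\nO-,10\n"
print(sorted(inventory_by_type(receipts, exports).items()))  # [('A+', 50), ('O-', 30)]
```

Ratios such as MIR and IIR would then divide these inventory levels by per-hospital minimal and ideal stock targets; their exact formulas are not stated in the abstract.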

  5. The Impacts of a Web-Aided Instructional Simulation on Science Learning.

    ERIC Educational Resources Information Center

    Hsu, Ying-Shao; Thomas, Rex A.

    2002-01-01

    Investigates the effects of selected characteristics of a web-aided instructional simulation on students' conceptual change, problem solving, and transfer abilities. Conducts a two-pronged research study with (n=117) students enrolled in a beginning meteorology course at Iowa State University. Compares three groups--with-log group, without-log…

  6. The New Frontier: Conquering the World Wide Web by Mule.

    ERIC Educational Resources Information Center

    Gresham, Morgan

    1999-01-01

    Examines effects of teaching hypertext markup language on students' perceptions of class goals in a networked composition classroom. Suggests sending documents via file transfer protocol by command line and viewing the Web with a textual browser shifted emphasis from writing to coding. Argues that helping students identify a balance between…

  7. Teaching Programming via the Web: A Time-Tested Methodology

    ERIC Educational Resources Information Center

    Karsten, Rex; Kaparthi, Shashidhar; Roth, Roberta M.

    2005-01-01

    Advances in information and communication technologies give us the ability to reach out beyond the time and place limitations of the traditional classroom. However, effective online teaching is more than just transferring traditional courses to the World Wide Web (WWW). We describe how we have used "off the shelf" software and the infrastructure…

  8. Project Management for Web-Based Course Development

    ERIC Educational Resources Information Center

    Li, Dong; Shearer, Rick

    2004-01-01

    Transferring face-to-face courses into Web-based courses is a trend in higher education. Whether this course transition is for distance education or for resident instruction, faculty members play a critical role in the process. Faculty members not only provide lesson content, but important insights into how content has been best presented in…

  9. 76 FR 75890 - Agency Information Collection Activities: Solicitation of Proposal Information for Award of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

... burden. DHS S&T uses a secure Web site through which the public can propose SBIR research topics and submit... submit SBIR research topics and submit responses to DHS SBIR solicitations. Additionally, electronic web... Innovative Research (SBIR) and Small Business Technology Transfer (STTR) programs 15 U.S.C. 628. For...

  10. Neutron Scattering Web

    Science.gov Websites

Neutron Scattering Home Page A new portal for neutron scattering has just been established at neutronsources.org. The information contained here in the Neutron Scattering Web has been transferred to the new site. We will leave the current content here for archival purposes but no new content will be added. We

  11. 10 CFR 2.1303 - Availability of documents.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... NUCLEAR REGULATORY COMMISSION RULES OF PRACTICE FOR DOMESTIC LICENSING PROCEEDINGS AND ISSUANCE OF ORDERS Procedures for Hearings on License Transfer Applications § 2.1303 Availability of documents. Unless exempt... for a license transfer requiring Commission approval will be placed at the NRC Web site, http://www...

  12. 10 CFR 2.1303 - Availability of documents.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... NUCLEAR REGULATORY COMMISSION RULES OF PRACTICE FOR DOMESTIC LICENSING PROCEEDINGS AND ISSUANCE OF ORDERS Procedures for Hearings on License Transfer Applications § 2.1303 Availability of documents. Unless exempt... for a license transfer requiring Commission approval will be placed at the NRC Web site, http://www...

  13. A web-based solution for 3D medical image visualization

    NASA Astrophysics Data System (ADS)

    Hou, Xiaoshuai; Sun, Jianyong; Zhang, Jianguo

    2015-03-01

In this presentation, we present a web-based 3D medical image visualization solution which enables interactive processing and visualization of large medical image data over the web platform. To improve the efficiency of our solution, we adopt GPU-accelerated techniques to process images on the server side while rapidly transferring images to the HTML5-supported web browser on the client side. Compared to a traditional local visualization solution, our solution does not require users to install extra software or download the whole volume dataset from the PACS server. This web-based design makes it feasible for users to access the 3D medical image visualization service wherever the internet is available.
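The key saving in such a solution is that only the requested views travel over the network, not the whole volume. A toy sketch of the server-side idea, extracting and optionally downsampling one axial slice; the volume layout and `step` parameter are illustrative, not the paper's implementation.

```python
def axial_slice(volume, z, step=1):
    """Return slice z of a 3-D volume (a list of 2-D slices), optionally
    downsampled by `step` along both in-plane axes before transfer.

    Toy stand-in for the server side of a web viewer: the client asks
    for one view at a time instead of downloading the dataset from PACS.
    """
    plane = volume[z]
    # Subsample rows and columns for a lower-bandwidth preview
    return [row[::step] for row in plane[::step]]

# 2 slices of 4x4 voxels; each value encodes (z, y, x) for easy checking
volume = [[[(z, y, x) for x in range(4)] for y in range(4)] for z in range(2)]
sub = axial_slice(volume, z=1, step=2)
print(sub)  # [[(1, 0, 0), (1, 0, 2)], [(1, 2, 0), (1, 2, 2)]]
```

A real server would render on the GPU and stream compressed images; the request/response granularity is the part this sketch illustrates.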

  14. Web-based, virtual course units as a didactic concept for medical teaching.

    PubMed

    Schultze-Mosgau, Stefan; Zielinski, Thomas; Lochner, Jürgen

    2004-06-01

    The objective was to develop a web-based, virtual series of lectures for evidence-based, standardized knowledge transfer independent of location and time with possibilities for interactive participation and a concluding web-based online examination. Within the framework of a research project, specific Intranet and Internet capable course modules were developed together with a concluding examination. The concept of integrating digital and analogue course units supported by sound was based on FlashCam (Nexus Concepts), Flash MX (Macromedia), HTML and JavaScript. A Web server/SGI Indigo Unix server was used as a platform by the course provider. A variety of independent formats (swf, avi, mpeg, DivX, etc.) were integrated in the individual swf modules. An online examination was developed to monitor the learning effect. The examination papers are automatically forwarded by email after completion. The results are also returned to the user automatically after they have been processed by a key program and an evaluation program. The system requirements for the user PC have deliberately been kept low (Internet Explorer 5.0, Flash-Player 6, 56 kbit/s modem, 200 MHz PC). Navigation is intuitive. Users were provided with a technical online introduction and a FAQ list. Eighty-two students of dentistry in their 3rd to 5th years of study completed a questionnaire to assess the course content and the user friendliness (SPSS V11) with grades 1 to 6 (1 = 'excellent' and 6 = 'unsatisfactory'). The course units can be viewed under the URL: http://giga.rrze.uni-erlangen.de/movies/MKG/trailer and URL: http://giga.rrze.uni-erlangen.de/movies/MKG/demo/index. Some 89% of the students gave grades 1 (excellent) and 2 (good) for accessibility independent of time and 83% for access independent of location. 
Grades 1 and 2 were given for the objectivization of knowledge transfer by 67% of the students, and for the use of video sequences to demonstrate surgical techniques by 91%. The course units were used as an optional method of studying by 87% of the students; 76% made use of this facility from home; 83% used Internet Explorer as their browser; 60% preferred online streaming and 35% downloading as the method of data transfer. The course units contribute to an evidence-based objectivization of multimedia knowledge transfer independent of time and location. Online examinations permit automatic monitoring and evaluation of the learning effect. The modular structure permits easy updating of course contents, and hyperlinks to literature sources facilitate study.

  15. The GBT Dynamic Scheduling System: Development and Testing

    NASA Astrophysics Data System (ADS)

    McCarty, M.; Clark, M.; Marganian, P.; O'Neil, K.; Shelton, A.; Sessoms, E.

    2009-09-01

    During the summer trimester of 2008, all observations on the Robert C. Byrd Green Bank Telescope (GBT) were scheduled using the new Dynamic Scheduling System (DSS). Beta testing exercised the policies, algorithms, and software developed for the DSS project. Since observers are located all over the world, the DSS was implemented as a web application. Technologies such as iCalendar, Really Simple Syndication (RSS) feeds, email, and instant messaging are used to transfer as much or as little information to observers as they request. We discuss the software engineering challenges leading to our implementation such as information distribution and building rich user interfaces in the web browser. We also relate our adaptation of agile development practices to design and develop the DSS. Additionally, we describe handling differences in expected versus actual initial conditions in the pool of project proposals for the 08B trimester. We then identify lessons learned from beta testing and present statistics on how the DSS was used during the trimester.
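    The DSS abstract mentions iCalendar among the channels used to push schedule information to observers. As an illustration only (the event values are invented, not taken from the DSS), a minimal RFC 5545 calendar entry can be assembled like this:

```python
def observing_event(uid, start_utc, end_utc, summary):
    """Build a minimal iCalendar document with one VEVENT, the kind of
    feed item a scheduler could publish to observers' calendar clients.
    Times are 'YYYYMMDDTHHMMSSZ' UTC stamps per RFC 5545; lines are
    CRLF-separated as the spec requires."""
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "BEGIN:VEVENT",
        f"UID:{uid}",
        f"DTSTART:{start_utc}",
        f"DTEND:{end_utc}",
        f"SUMMARY:{summary}",
        "END:VEVENT",
        "END:VCALENDAR",
    ])
```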

  16. Enabling Mobile Air Quality App Development with an AirNow API

    NASA Astrophysics Data System (ADS)

    Dye, T.; White, J. E.; Ludewig, S. A.; Dickerson, P.; Healy, A. N.; West, J. W.; Prince, L. A.

    2013-12-01

    The U.S. Environmental Protection Agency's (EPA) AirNow program works with over 130 participating state, local, and federal air quality agencies to obtain, quality control, and store real-time air quality observations and forecasts. From these data, the AirNow system generates thousands of maps and products each hour. Each day, information from AirNow is published online and in other media to assist the public in making health-based decisions related to air quality. However, an increasing number of people use mobile devices as their primary tool for obtaining information, and AirNow has responded to this trend by publishing an easy-to-use Web API that is useful for mobile app developers. This presentation will describe the various features of the AirNow application programming interface (API), including Representational State Transfer (REST)-type web services, file outputs, and RSS feeds. In addition, a web portal for the AirNow API will be shown, including documentation on use of the system, a query tool for configuring and running web services, and general information about the air quality data and forecasts available. Data published via the AirNow API includes corresponding Air Quality Index (AQI) levels for each pollutant. We will highlight examples of mobile apps that are using the AirNow API to provide location-based, real-time air quality information. Examples will include mobile apps developed for Minnesota ('Minnesota Air') and Washington, D.C. ('Clean Air Partners Air Quality'), and an app developed by EPA ('EPA AirNow').
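    A mobile client of a REST-type web service like the one described mostly just assembles parameterized URLs. The sketch below is a hedged illustration: the base URL and parameter names are placeholders, not the documented AirNow API schema, which developers should take from the API portal mentioned above.

```python
from urllib.parse import urlencode

def build_airnow_query(base_url, zip_code, api_key, fmt="application/json"):
    """Assemble a query URL for a REST air-quality service.
    Parameter names here are illustrative stand-ins, not the
    actual AirNow API contract."""
    params = {"format": fmt, "zipCode": zip_code, "API_KEY": api_key}
    return f"{base_url}?{urlencode(params)}"
```

An app would fetch this URL each time the user's location or the refresh timer fires, then map the returned AQI value to a color band.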

  17. Online plot services for paleomagnetism and rock magnetism

    NASA Astrophysics Data System (ADS)

    Hatakeyama, T.

    2017-12-01

    In paleomagnetism and rock magnetism, many types of specialized plots are used to present measurement data. Researchers in paleomagnetism often use not only general-purpose plotting programs such as Microsoft Excel but also single-purpose tools. A large benefit of the latter is that they produce high-quality figures tailored to the data. However, such programs require a specific operating environment: a particular hardware platform, a particular operating system and version, runtime libraries, and so on. It is therefore difficult to share results and graphics among collaborators whose PCs differ. The best solution is thus a program that runs in an environment everyone already has, and the most widely available environment is, as we all know, the web: almost every current operating system ships with a web browser, and everyone uses one regularly. We now provide a web-based service that plots paleomagnetic results easily. We developed original programs with a command-line (non-GUI) interface, and prepared web pages for entering the measured data and options, together with a wrapper script that passes the entered values to the programs. The analyzed values and plotted graphs produced by the programs are shown in an HTML page and are downloadable. Our plot services are available at http://mage-p.org/mageplot/. In this talk, we introduce the programs and services and discuss their philosophy and efficiency.
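    The service described above wraps command-line plotting programs behind web forms. One safe way to do that translation, sketched here with hypothetical tool and option names, is to map validated form fields to an argv list, so that no shell ever interprets user-supplied input:

```python
def build_plot_command(tool, datafile, options):
    """Translate validated web-form fields into an argv list for a
    command-line plotting program (tool and option names are
    hypothetical). Passing a list to subprocess.run, rather than a
    shell string, avoids shell injection from user input."""
    cmd = [tool, datafile]
    for key, value in sorted(options.items()):
        cmd += [f"--{key}", str(value)]
    return cmd
```

The wrapper would then run the command with `subprocess.run(cmd, check=True)` and hand the generated image file back to the HTML page.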

  18. Introducing the PRIDE Archive RESTful web services.

    PubMed

    Reisinger, Florian; del-Toro, Noemi; Ternent, Tobias; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-07-01

    The PRIDE (PRoteomics IDEntifications) database is one of the world-leading public repositories of mass spectrometry (MS)-based proteomics data and it is a founding member of the ProteomeXchange Consortium of proteomics resources. In the original PRIDE database system, users could access data programmatically by accessing the web services provided by the PRIDE BioMart interface. New REST (REpresentational State Transfer) web services have been developed to serve the most popular functionality provided by BioMart (now discontinued due to data scalability issues) and address the data access requirements of the newly developed PRIDE Archive. Using the API (Application Programming Interface) it is now possible to programmatically query for and retrieve peptide and protein identifications, project and assay metadata and the originally submitted files. Searching and filtering is also possible by metadata information, such as sample details (e.g. species and tissues), instrumentation (mass spectrometer), keywords and other provided annotations. The PRIDE Archive web services were first made available in April 2014. The API has already been adopted by a few applications and standalone tools such as PeptideShaker, PRIDE Inspector, the Unipept web application and the Python-based BioServices package. This application is free and open to all users with no login requirement and can be accessed at http://www.ebi.ac.uk/pride/ws/archive/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
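    Programmatic access to a REST API like this usually reduces to constructing parameterized URLs and parsing JSON responses. The sketch below builds a project-search URL against the PRIDE Archive base path quoted in the abstract; the endpoint path and parameter names are illustrative assumptions, and the real contract should be taken from the PRIDE web service documentation.

```python
from urllib.parse import urlencode

# Base path taken from the abstract above.
BASE = "http://www.ebi.ac.uk/pride/ws/archive"

def project_search_url(query, species=None, page_size=10):
    """Build a hypothetical project-search URL. The '/project/list'
    path and the parameter names are illustrative, not the
    documented PRIDE API schema."""
    params = {"query": query, "show": page_size}
    if species:
        params["speciesFilter"] = species
    return f"{BASE}/project/list?{urlencode(params)}"
```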

  19. Mash-up of techniques between data crawling/transfer, data preservation/stewardship and data processing/visualization technologies on a science cloud system designed for Earth and space science: a report of successful operation and science projects of the NICT Science Cloud

    NASA Astrophysics Data System (ADS)

    Murata, K. T.

    2014-12-01

    Data-intensive, or data-centric, science is the fourth paradigm, following observational and/or experimental science (the first paradigm), theoretical science (the second), and numerical science (the third). A science cloud is an infrastructure for this fourth methodology. The NICT Science Cloud is designed for big-data sciences of Earth, space, and other fields, based on modern informatics and information technologies [1]. Data flow on the cloud through three classes of techniques: (1) data crawling and transfer, (2) data preservation and stewardship, and (3) data processing and visualization. Original tools and applications for these techniques have been designed and implemented, and we mash them up on the NICT Science Cloud to build customized systems for each project. In this paper, we discuss science data processing through these three steps. For big-data science, the deployment of data files on a distributed storage system must be well designed to save storage cost and transfer time. For crawling and transfer we developed a high-bandwidth virtual remote storage system (HbVRS) and the data crawling tools NICTY/DLA and the Wide-area Observation Network Monitoring (WONM) system. Data files are saved on the cloud storage system according to both the data preservation policy and the data processing plan. The storage system is built on distributed file system middleware (Gfarm: Grid Datafarm), which is effective because disaster recovery (DR) and parallel data processing are carried out simultaneously, without moving the big data from storage to storage. Data files are managed in our web application, WSDBank (World Science Data Bank). Big data on the cloud are processed via Pwrake, a workflow tool with high I/O bandwidth. Several visualization tools run on the cloud: VirtualAurora for the magnetosphere and ionosphere, VDVGE for Google Earth, STICKER for urban environment data, and STARStouch for multi-disciplinary data.
There are 30 projects running on the NICT Science Cloud for Earth and space science. In 2003, 56 refereed papers were published. Finally, we introduce a couple of successful Earth and space science results obtained with these three techniques on the NICT Science Cloud. [1] http://sc-web.nict.go.jp

  20. Web-GIS oriented systems viability for municipal solid waste selective collection optimization in developed and transient economies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rada, E.C., E-mail: Elena.Rada@ing.unitn.it; Ragazzi, M.; Fedrizzi, P.

    Highlights: ► As an appropriate solution for MSW management in developed and transient countries. ► As an option to increase the efficiency of MSW selective collection. ► As an opportunity to integrate MSW management needs and services inventories. ► As a tool to develop Urban Mining actions. - Abstract: Municipal solid waste management is a multidisciplinary activity that includes generation, source separation, storage, collection, transfer and transport, processing and recovery, and, last but not least, disposal. The optimization of waste collection, through source separation, is compulsory where a landfill based management must be overcome. In this paper, a few aspects related to the implementation of a Web-GIS based system are analyzed. This approach is critically analyzed referring to the experience of two Italian case studies and two additional extra-European case studies. The first case is one of the best examples of selective collection optimization in Italy. The obtained efficiency is very high: 80% of waste is source separated for recycling purposes. In the second reference case, the local administration is going to be faced with the optimization of waste collection through Web-GIS oriented technologies for the first time. The starting scenario is far from an optimized management of municipal solid waste. The last two case studies concern pilot experiences in China and Malaysia. Each step of the Web-GIS oriented strategy is comparatively discussed referring to typical scenarios of developed and transient economies. The main result is that transient economies are ready to move toward Web oriented tools for MSW management, but this opportunity is not yet well exploited in the sector.

  1. Bioaccumulation and trophic transfer of pharmaceuticals in food webs from a large freshwater lake.

    PubMed

    Xie, Zhengxin; Lu, Guanghua; Yan, Zhenhua; Liu, Jianchao; Wang, Peifang; Wang, Yonghua

    2017-03-01

    Pharmaceuticals are increasingly detected in environmental matrices, but information on their trophic transfer in aquatic food webs is insufficient. This study investigated the bioaccumulation and trophic transfer of 23 pharmaceuticals in Taihu Lake, China. Pharmaceutical concentrations were analyzed in surface water, sediments and 14 aquatic species, including plankton, invertebrates and fish collected from the lake. The median concentrations of the detected pharmaceuticals ranged from not detected (ND) to 49 ng/L in water, ND to 49 ng/g dry weight (dw) in sediments, and from ND to 130 ng/g dw in biota. Higher concentrations of pharmaceuticals were found in zoobenthos relative to plankton, shrimp and fish muscle. In fish tissues, the observed pharmaceutical contents in the liver and brain were generally higher than those in the gills and muscle. Both bioaccumulation factors (median BAFs: 19-2008 L/kg) and biota-sediment accumulation factors (median BSAFs: 0.0010-0.037) indicated a low bioaccumulation potential for the target pharmaceuticals. For eight of the most frequently detected pharmaceuticals in food webs, the trophic magnification factors (TMFs) were analyzed from two different regions of Taihu Lake. The TMFs for roxithromycin, propranolol, diclofenac, ibuprofen, ofloxacin, norfloxacin, ciprofloxacin and tetracycline in the two food webs ranged from 0.28 to 1.25, suggesting that none of these pharmaceuticals experienced trophic magnification. In addition, the pharmaceutical TMFs did not differ significantly between the two regions in Taihu Lake. Copyright © 2016 Elsevier Ltd. All rights reserved.
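    Trophic magnification factors such as those reported above are conventionally derived by regressing log10 concentration on trophic level and exponentiating the slope (TMF = 10^slope, with TMF > 1 indicating biomagnification). A minimal implementation of that standard calculation:

```python
import math

def trophic_magnification_factor(trophic_levels, concentrations):
    """TMF from the standard least-squares regression of
    log10(concentration) on trophic level: TMF = 10**slope.
    TMF > 1 indicates biomagnification; values of 0.28-1.25 like
    those in the study imply slopes near or below zero."""
    logs = [math.log10(c) for c in concentrations]
    n = len(trophic_levels)
    mean_tl = sum(trophic_levels) / n
    mean_lc = sum(logs) / n
    slope = (sum((t - mean_tl) * (l - mean_lc)
                 for t, l in zip(trophic_levels, logs))
             / sum((t - mean_tl) ** 2 for t in trophic_levels))
    return 10 ** slope
```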

  2. Trophic transfer of microplastics and mixed contaminants in the marine food web and implications for human health.

    PubMed

    Carbery, Maddison; O'Connor, Wayne; Palanisami, Thavamani

    2018-06-01

    Plastic litter has become one of the most serious threats to the marine environment. Over 690 marine species have been impacted by plastic debris with small plastic particles being observed in the digestive tract of organisms from different trophic levels. The physical and chemical properties of microplastics facilitate the sorption of contaminants to the particle surface, serving as a vector of contaminants to organisms following ingestion. Bioaccumulation factors for higher trophic organisms and impacts on wider marine food webs remain unknown. The main objectives of this review were to discuss the factors influencing microplastic ingestion; describe the biological impacts of associated chemical contaminants; highlight evidence for the trophic transfer of microplastics and contaminants within marine food webs and outline the future research priorities to address potential human health concerns. Controlled laboratory studies looking at the effects of microplastics and contaminants on model organisms employ nominal concentrations and consequently have little relevance to the real environment. Few studies have attempted to track the fate of microplastics and mixed contaminants through a complex marine food web using environmentally relevant concentrations to identify the real level of risk. To our knowledge, there has been no attempt to understand the transfer of microplastics and associated contaminants from seafood to humans and the implications for human health. Research is needed to determine bioaccumulation factors for popular seafood items in order to identify the potential impacts on human health. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Communication of Career Pathways Through Associate Degree Program Web Sites: A Baseline Assessment.

    PubMed

    Becker, Ellen A; Vargas, Jenny

    2018-05-08

    The American Association for Respiratory Care sponsored a series of conferences that addressed the competency of the future workforce of respiratory therapists (RTs). Based upon the findings from those conferences, several initiatives emerged that support RTs earning a baccalaureate (or bachelor's) degree. The objective of this study was to identify the ways that associate degree programs communicate career pathways toward a baccalaureate degree through their Web sites. This cross-sectional observational study used a random sample of 100 of the 362 associate degree programs approved by the Commission on Accreditation for Respiratory Care. Data were collected from 3 specific categories: demographic data, baccalaureate completion information, and the Web page location for the program. The presence of statements related to any pathway toward a bachelor's degree, transfer credits, articulation agreements, and links for baccalaureate completion were recorded. The descriptive statistics in this study were reported as total numbers and percentages. Of the 100 programs in the random sample, only 89 were included in the study. Only 39 (44%) programs had links on their program Web site that had any content related to bachelor's degrees, 16 (18%) identified college transfer courses toward a bachelor's degree, and 26 (29%) programs included baccalaureate articulation agreements on their Web site. A minority of associate degree programs communicated career pathway information to their prospective and current students through program Web sites. An informative Web site would make the path more transparent for entry-level students to meet their future educational needs as their careers progress. Copyright © 2018 by Daedalus Enterprises.

  4. Global Access to Library of Congress' Digital Resources: National Digital Library and Internet Resources.

    ERIC Educational Resources Information Center

    Chen, Ching-chih

    1996-01-01

    Summarizes how the Library of Congress' digital library collections can be accessed globally via the Internet and World Wide Web. Outlines the resources found in each of the various access points: gopher, online catalog, library and legislative Web sites, legal and copyright databases, and FTP (file transfer protocol) sites. (LAM)

  5. Supporting Students' Knowledge Transfer in Modeling Activities

    ERIC Educational Resources Information Center

    Piksööt, Jaanika; Sarapuu, Tago

    2014-01-01

    This study investigates ways to enhance secondary school students' knowledge transfer in complex science domains by implementing question prompts. Two samples of students applied two web-based models to study molecular genetics--the model of genetic code (n = 258) and translation (n = 245). For each model, the samples were randomly divided into…

  6. A Distributed Web-based Solution for Ionospheric Model Real-time Management, Monitoring, and Short-term Prediction

    NASA Astrophysics Data System (ADS)

    Kulchitsky, A.; Maurits, S.; Watkins, B.

    2006-12-01

    With the widespread availability of the Internet today, many people can monitor various scientific research activities. It is important to accommodate this interest by providing online access to dynamic and illustrative web resources that demonstrate different aspects of ongoing research. It is especially important to explain these research activities to high school and undergraduate students, giving them more information for decisions about their future studies. Such web resources also help clarify scientific research for the general public, building awareness of research progress in various fields. Particularly rewarding is the dissemination of information about ongoing projects within universities and research centers to their local communities. The benefits of this kind of outreach are mutual, since the development of automatic web-based systems is a prerequisite for many research projects that target real-time monitoring and/or modeling of natural conditions; continuous operation of such systems also provides ongoing opportunities for statistically massive validation of the models. We have developed a web-based system to run the University of Alaska Fairbanks Polar Ionospheric Model in real time, using networking and computational resources at the Arctic Region Supercomputing Center. The system was designed to be portable among operating systems and computational resources; its components can be installed across different computers, separating the web servers from the computational engines. The core of the system is a Real-Time Management module (RMM) written in Python, which coordinates remote input data transfers, the ionospheric model runs, MySQL database updates, and the PHP scripts that prepare the web pages. The RMM downloads current geophysical inputs as soon as they become available at various online repositories.
This information is processed into inputs for the next ionospheric model time step and stored in a MySQL database as the first part of the time-specific record. The RMM then synchronizes the input times with the current model time, decides whether to initialize the next model time step, and monitors its execution. As soon as the model completes a time step, the RMM renders the current model output into various short-term (about 1-2 hour) forecast products and compares prior results with available ionospheric measurements. The RMM places the prepared images into the MySQL database, which can reside on a different computer node, and then proceeds to the next time interval, continuing the time loop. The upper-level interface of this real-time system is a PHP-based Web site (http://www.arsc.edu/SpaceWeather/new). The site provides general information about the Earth's polar and adjacent mid-latitude ionosphere, allows monitoring of current developments and short-term forecasts, and provides access to the archive of comparisons stored in the database.
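    The management cycle described above (fetch inputs, run the model step, store the record, render forecast products) can be sketched as one orchestration function. The function names below are hypothetical stand-ins for the RMM's components, injected as callables so the control flow stays visible:

```python
def run_cycle(fetch_inputs, model_step, store, visualize, state):
    """One pass of a real-time management loop, sketched after the
    RMM described above. All four callables are hypothetical
    stand-ins: fetch_inputs returns None while the geophysical
    inputs are not yet published, in which case the cycle waits."""
    inputs = fetch_inputs()
    if inputs is None:          # inputs not yet available upstream
        return state
    state = model_step(state, inputs)
    store(state)                # in the real system: a MySQL record
    visualize(state)            # forecast images for the web front end
    return state
```

A scheduler would call `run_cycle` once per model time interval, carrying the returned state forward.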

  7. A web-based knowledge management system integrating Western and Traditional Chinese Medicine for relational medical diagnosis.

    PubMed

    Herrera-Hernandez, Maria C; Lai-Yuen, Susana K; Piegl, Les A; Zhang, Xiao

    2016-10-26

    This article presents the design of a web-based knowledge management system as a training and research tool for the exploration of key relationships between Western and Traditional Chinese Medicine, in order to facilitate relational medical diagnosis integrating these mainstream healing modalities. The main goal of this system is to facilitate decision-making processes, while developing skills and creating new medical knowledge. Traditional Chinese Medicine can be considered as an ancient relational knowledge-based approach, focusing on balancing interrelated human functions to reach a healthy state. Western Medicine focuses on specialties and body systems and has achieved advanced methods to evaluate the impact of a health disorder on the body functions. Identifying key relationships between Traditional Chinese and Western Medicine opens new approaches for health care practices and can increase the understanding of human medical conditions. Our knowledge management system was designed from initial datasets of symptoms, known diagnosis and treatments, collected from both medicines. The datasets were subjected to process-oriented analysis, hierarchical knowledge representation and relational database interconnection. Web technology was implemented to develop a user-friendly interface, for easy navigation, training and research. Our system was prototyped with a case study on chronic prostatitis. This trial presented the system's capability for users to learn the correlation approach, connecting knowledge in Western and Traditional Chinese Medicine by querying the database, mapping validated medical information, accessing complementary information from official sites, and creating new knowledge as part of the learning process. 
By addressing the challenging tasks of data acquisition and modeling, organization, storage and transfer, the proposed web-based knowledge management system is presented as a tool for users in medical training and research to explore, learn and update relational information for the practice of integrated medical diagnosis. This educational proposal has the potential to enable further creation of medical knowledge from both Traditional Chinese and Western Medicine for improved care provision. The system improves information visualization, the learning process and knowledge sharing, supporting training, the development of new diagnostic and treatment skills, and a better understanding of medical diseases. © IMechE 2016.

  8. Wireless Integrated Microelectronic Vacuum Sensor System

    NASA Technical Reports Server (NTRS)

    Krug, Eric; Philpot, Brian; Trott, Aaron; Lawrence, Shaun

    2013-01-01

    NASA Stennis Space Center's (SSC's) large rocket engine test facility requires the use of liquid propellants, including the use of cryogenic fluids like liquid hydrogen as fuel, and liquid oxygen as an oxidizer (gases which have been liquefied at very low temperatures). These fluids require special handling, storage, and transfer technology. The biggest problem associated with transferring cryogenic liquids is product loss due to heat transfer. Vacuum jacketed piping is specifically designed to maintain high thermal efficiency so that cryogenic liquids can be transferred with minimal heat transfer. A vacuum jacketed pipe is essentially two pipes in one. There is an inner carrier pipe, in which the cryogenic liquid is actually transferred, and an outer jacket pipe that supports and seals the vacuum insulation, forming the "vacuum jacket." The integrity of the vacuum jacketed transmission lines that transfer the cryogenic fluid from delivery barges to the test stand must be maintained prior to and during engine testing. To monitor the vacuum in these vacuum jacketed transmission lines, vacuum gauge readings are used. At SSC, vacuum gauge measurements are done on a manual rotation basis with two technicians, each using a handheld instrument. Manual collection of vacuum data is labor intensive and uses valuable personnel time. Additionally, there are times when personnel cannot collect the data in a timely fashion (i.e., when a leak is detected, measurements must be taken more often). Furthermore, distribution of this data to all interested parties can be cumbersome. To simplify the vacuum-gauge data collection process, automate the data collection, and decrease the labor costs associated with acquiring these measurements, an automated system that monitors the existing gauges was developed by Invocon, Inc.
For this project, Invocon developed a Wireless Integrated Microelectronic Vacuum Sensor System (WIMVSS) that provides the ability to gather vacuum-gauge measurements automatically and wirelessly, in near-real time, using a low-maintenance, low-power sensor mesh network. The WIMVSS operates by using a self-configuring mesh network of wireless sensor units. Mesh networking is a type of networking where each sensor or node can capture and disseminate its own data, but also serve as a relay to receive and transmit data from other sensors. Each sensor node can synchronize with adjacent sensors, and propagate data from one sensor to the next, until the destination is reached. In this case, the destination is a Network Interface Unit (NIU). The WIMVSS sensors are mounted on the existing vacuum gauges. Information gathered by the sensors is sent to the NIU. Because of the mesh networking, if a sensor cannot directly send the data to the NIU, it can be propagated through the network of sensors. The NIU requires antenna access to the sensor units, AC power, and an Ethernet connection. The NIU bridges the sensor network to a WIMVSS server via an Ethernet connection. The server is configured with a database, a Web server, and proprietary interface software that makes it possible for the vacuum measurements from vacuum jacketed fluid lines to be saved, retrieved, and then displayed from any Web-enabled PC that has access to the Internet. Authorized users can then simply access the data from any PC with Internet connection. Commands can also be sent directly from the Web interface for control and maintenance of the sensor network.
The technology enabled by the WIMVSS decreases labor required for gathering vacuum measurements, increases access to vacuum data by making it available on any computer with access to the Internet, increases the frequency with which data points can be acquired for evaluating the system, and decreases the recurring cost of the sensors by using off-the-shelf components and integrating these with heritage vacuum gauges.
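    Mesh networking, as described above, means each sensor can relay readings for its neighbors until the NIU is reached. A minimal sketch of that multi-hop routing idea, as a breadth-first search over an assumed link table (real mesh protocols are considerably more involved):

```python
from collections import deque

def relay_path(links, source, sink="NIU"):
    """Shortest multi-hop route a reading could take through a sensor
    mesh to the Network Interface Unit. 'links' maps each node to the
    neighbors it can reach; returns None if the node is cut off."""
    frontier, seen = deque([[source]]), {source}
    while frontier:
        path = frontier.popleft()
        if path[-1] == sink:
            return path
        for nxt in links.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None
```

Breadth-first search returns the fewest-hop route, which is one reasonable proxy for the "propagate until the destination is reached" behavior the abstract describes.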

  9. Expediting the transfer of evidence into practice: building clinical partnerships*

    PubMed Central

    Rader, Tamara; Gagnon, Anita J.

    2000-01-01

    A librarian/clinician partnership was fostered in one hospital through the formation of the Evidence-based Practice Committee, with an ulterior goal of facilitating the transfer of evidence into practice. The paper will describe barriers to evidence-based practice and outline the committee's strategies for overcoming these barriers, including the development and promotion of a Web-based guide to evidence-based practice specifically designed for clinicians (health professionals). Educational strategies for use of the Web-based guide will also be addressed. Advantages of this partnership are that the skills of librarians in meeting the needs of clinicians are maximized. The evidence-based practice skills of clinicians are honed and librarians make a valuable contribution to the knowledgebase of the clinical staff. The knowledge acquired through the partnership by both clinicians and librarians will increase the sophistication of the dialogue between the two groups and in turn will expedite the transfer of evidence into practice. PMID:10928710

  10. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network

    PubMed Central

    Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-01-01

    Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1%; 99% of the data transferred consistently using the data dictionary, and 1% needed human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596

  11. BioSWR – Semantic Web Services Registry for Bioinformatics

    PubMed Central

    Repchevsky, Dmitry; Gelpi, Josep Ll.

    2014-01-01

    Despite the variety of available Web services registries aimed specifically at the Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones adhere more closely to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages; eventually, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web service descriptions along with the traditional WSDL-based ones. The registry provides a Web-based interface for Web service registration, querying, and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or via the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license. PMID:25233118
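As a sketch of the programmatic SPARQL access described above, the following prepares an HTTP GET request in the shape defined by the SPARQL 1.1 Protocol (query passed as a URL-encoded `query` parameter). The endpoint path and the WSDL/RDF vocabulary are assumptions for illustration, not taken from the BioSWR documentation.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Endpoint path is hypothetical; consult the BioSWR site for the actual one.
SPARQL_ENDPOINT = "http://inb.bsc.es/BioSWR/sparql"

# Vocabulary prefix assumed from the W3C WSDL 2.0 RDF mapping.
QUERY = """
PREFIX wsdl: <http://www.w3.org/ns/wsdl-rdf#>
SELECT ?service WHERE { ?service a wsdl:Service } LIMIT 10
"""

def build_sparql_request(endpoint, query):
    """Prepare a SPARQL 1.1 Protocol GET request without sending it,
    asking for JSON-formatted results."""
    url = endpoint + "?" + urlencode({"query": query})
    return Request(url, headers={"Accept": "application/sparql-results+json"})

req = build_sparql_request(SPARQL_ENDPOINT, QUERY)
print(req.get_method())  # GET
```

Sending the request (e.g. with `urllib.request.urlopen`) would return a SPARQL results document listing registered services, assuming the endpoint exists as guessed.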

  12. BioSWR--semantic web services registry for bioinformatics.

    PubMed

    Repchevsky, Dmitry; Gelpi, Josep Ll

    2014-01-01

    Despite the variety of available Web services registries aimed specifically at the Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones adhere more closely to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages; eventually, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web service descriptions along with the traditional WSDL-based ones. The registry provides a Web-based interface for Web service registration, querying, and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or via the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license.

  13. Web and Desktop Applications for ALMA Science Verification Data

    NASA Astrophysics Data System (ADS)

    Shirasaki, Y.; Kawasaki, W.; Eguchi, S.; Komiya, Y.; Kosugi, G.; Ohishi, M.; Mizumoto, Y.

    2013-10-01

    ALMA is the largest radio telescope operating in Chile, and it is expected to produce 200 TB of data every year. Even a data cube obtained for a single source can exceed 1 TB. It is therefore crucial to reduce the size of data transmitted through the Internet by cutting out a part of a data cube and/or reducing the spatial/frequency resolution before transferring the data. To specify the cutout region or the required resolution, one needs to overview the whole data set without transferring the large data cube. For this purpose, we developed two applications for quick-look inspection of ALMA data cubes: ALMA Web QL and the desktop viewer Vissage.

  14. ForistomApp a Web application for scientific and technological information management of Forsitom foundation

    NASA Astrophysics Data System (ADS)

    Saavedra-Duarte, L. A.; Angarita-Jerardino, A.; Ruiz, P. A.; Dulce-Moreno, H. J.; Vera-Rivera, F. H.; V-Niño, E. D.

    2017-12-01

    Information and Communication Technologies (ICT) are essential in the transfer of knowledge, and Web tools, as part of ICT, are important for institutions seeking greater visibility for the products developed by their researchers. For this reason, we implemented an application that supports information management for the FORISTOM Foundation (Foundation of Researchers in Science and Technology of Materials). The application shows a detailed description not only of all its members but also of all the scientific production they carry out, such as technological developments, research projects, articles, and presentations, among others. This application can be adopted by other entities committed to scientific dissemination and the transfer of technology and knowledge.

  15. WebMail versus WebApp: Comparing Problem-Based Learning Methods in a Business Research Methods Course

    ERIC Educational Resources Information Center

    Williams van Rooij, Shahron

    2007-01-01

    This study examined the impact of two Problem-Based Learning (PBL) approaches on knowledge transfer, problem-solving self-efficacy, and perceived learning gains among four intact classes of adult learners engaged in a group project in an online undergraduate business research methods course. With two of the classes using a text-only PBL workbook…

  16. Design on the MUVE: Synergizing Online Design Education with Multi-User Virtual Environments (MUVE)

    ERIC Educational Resources Information Center

    Sakalli, Isinsu; Chung, WonJoon

    2015-01-01

    The world is becoming increasingly virtual. Since the invention of the World Wide Web, information and human interaction has been transferring to the web at a rapid rate. Education is one of the many institutions that is taking advantage of accessing large numbers of people globally through computers. While this can be a simpler task for…

  17. Webly-Supervised Fine-Grained Visual Categorization via Deep Domain Adaptation.

    PubMed

    Xu, Zhe; Huang, Shaoli; Zhang, Ya; Tao, Dacheng

    2018-05-01

    Learning visual representations from web data has recently attracted attention for object recognition. Previous studies have mainly focused on overcoming label noise and data bias and have shown promising results by learning directly from web data. However, we argue that it might be better to transfer knowledge from existing human labeling resources to improve performance at nearly no additional cost. In this paper, we propose a new semi-supervised method for learning via web data. Our method has the unique design of exploiting strong supervision, i.e., in addition to standard image-level labels, our method also utilizes detailed annotations including object bounding boxes and part landmarks. By transferring as much knowledge as possible from existing strongly supervised datasets to weakly supervised web images, our method can benefit from sophisticated object recognition algorithms and overcome several typical problems found in webly-supervised learning. We consider the problem of fine-grained visual categorization, in which existing training resources are scarce, as our main research objective. Comprehensive experimentation and extensive analysis demonstrate encouraging performance of the proposed approach, which, at the same time, delivers a new pipeline for fine-grained visual categorization that is likely to be highly effective for real-world applications.

  18. Network and User-Perceived Performance of Web Page Retrievals

    NASA Technical Reports Server (NTRS)

    Kruse, Hans; Allman, Mark; Mallasch, Paul

    1998-01-01

    The development of the HTTP protocol has been driven by the need to improve the network performance of the protocol by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the perceived performance experienced by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/1.1 specification, we first suggest a client-side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/1.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.
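The perceived-performance trade-off above can be sketched with a toy timing model: under sequential retrieval, later page elements cannot begin arriving (and rendering progressively) until earlier ones finish, while interleaving lets every element start immediately without changing the total transfer time. The element sizes and bandwidth-sharing assumptions are illustrative, not the paper's simulation.

```python
def perceived_metrics(sizes):
    """Compare when each page element first starts arriving under
    sequential vs. interleaved retrieval over one connection.
    Total transfer time is identical in both cases, so raw
    efficiency is unchanged; only perceived progress differs."""
    total = sum(sizes)
    sequential_starts = []
    t = 0.0
    for s in sizes:
        sequential_starts.append(t)  # element waits for all earlier ones
        t += s
    interleaved_starts = [0.0] * len(sizes)  # all elements share the link
    return total, sequential_starts, interleaved_starts

total, seq, inter = perceived_metrics([4.0, 2.0, 6.0])
print(total)  # 12.0 either way: data transfer efficiency is unchanged
print(seq)    # [0.0, 4.0, 6.0]: the last element waits 6 time units
print(inter)  # [0.0, 0.0, 0.0]: every element can start rendering at once
```

This is the intuition behind the proposed interleaving extension: perceived speed improves because all elements show progress early, even though the byte count on the wire is the same.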

  19. Online data analysis using Web GDL

    NASA Astrophysics Data System (ADS)

    Jaffey, A.; Cheung, M.; Kobashi, A.

    2008-12-01

    The ever improving capability of modern astronomical instruments to capture data at high spatial resolution and cadence is opening up unprecedented opportunities for scientific discovery. When data sets become so large that they cannot be easily transferred over the internet, the researcher must find alternative ways to perform data analysis. One strategy is to bring the data analysis code to where the data resides. We present Web GDL, an implementation of GDL (GNU Data Language, open source incremental compiler compatible with IDL) that allows users to perform interactive data analysis within a web browser.

  20. St. Regis Paper Mill: Architectural and Environmental Survey

    DTIC Science & Technology

    2010-02-01

    ERDC/CERL TR-10-4. Mode of Technology Transfer: This report will be made accessible through the World Wide Web (WWW).

  1. Improving Internet Archive Service through Proxy Cache.

    ERIC Educational Resources Information Center

    Yu, Hsiang-Fu; Chen, Yi-Ming; Wang, Shih-Yong; Tseng, Li-Ming

    2003-01-01

    Discusses file transfer protocol (FTP) servers for downloading archives (files with particular file extensions), and the change to HTTP (Hypertext transfer protocol) with increased Web use. Topics include the Archie server; proxy cache servers; and how to improve the hit rate of archives by a combination of caching and better searching mechanisms.…

  2. 40 CFR 63.822 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... roller that transfers material to a raised image (type or art) on a plate cylinder. The material is then transferred from the image on the plate cylinder to the web or sheet to be printed. A flexographic print... press means any press which prints only non-saleable items used to check the quality of image formation...

  3. 40 CFR 63.822 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... roller that transfers material to a raised image (type or art) on a plate cylinder. The material is then transferred from the image on the plate cylinder to the web or sheet to be printed. A flexographic print... press means any press which prints only non-saleable items used to check the quality of image formation...

  4. 40 CFR 63.822 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... roller that transfers material to a raised image (type or art) on a plate cylinder. The material is then transferred from the image on the plate cylinder to the web or sheet to be printed. A flexographic print... press means any press which prints only non-saleable items used to check the quality of image formation...

  5. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    PubMed

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions of a second to over a minute per article. We present a description of the challenge and a summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ © The Author(s) 2014. Published by Oxford University Press.
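The balanced F-score reported above is the harmonic mean of precision and recall. A minimal sketch; the precision/recall inputs below are illustrative only, since the abstract reports F-scores rather than the underlying pairs.

```python
def balanced_f(precision, recall):
    """Balanced (F1) score: 2PR / (P + R), the harmonic mean of
    precision and recall; defined as 0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Illustrative values, not from the challenge results.
print(round(balanced_f(0.80, 0.60), 3))  # 0.686
```

Because the harmonic mean is dominated by the smaller operand, a system cannot buy a high balanced F-score with precision alone or recall alone.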

  6. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification

    PubMed Central

    Wiegers, Thomas C.; Davis, Allan Peter; Mattingly, Carolyn J.

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions of a second to over a minute per article. We present a description of the challenge and a summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ PMID:24919658

  7. Real Time Integration of Field Data Into a GIS Platform for the Management of Hydrological Emergencies

    NASA Astrophysics Data System (ADS)

    Mangiameli, M.; Mussumeci, G.

    2013-01-01

    A wide range of events requires the immediate availability of information and field data for decision-makers. One example is the need to quickly transfer information acquired from monitoring and alerting sensors, or reconnaissance data on damage after a disastrous event, to an Emergency Operations Center. To this purpose, we developed an integrated GIS and WebGIS system to dynamically create and populate, via the Web, a database with spatial features. In particular, this work concerns the gathering and transmission of spatial data and related information to the desktop GIS so that they can be displayed and analyzed in real time to characterize the operational scenario and decide on rescue interventions. As basic software we used only free and open-source tools: QuantumGIS and GRASS as desktop GIS, MapServer with the PMapper application for WebGIS functionality, and PostgreSQL/PostGIS as the database management system (DBMS). The approach was designed, developed, and successfully tested in the management of GIS-based navigation of an autonomous robot, both to map its trajectories and to assign optimal paths. This paper presents the application of our system to a simulated hydrological event that could affect the province of Catania, in Sicily. In particular, assuming that several teams draw up an inventory of the damage, we highlight the benefits of real-time transmission of the information collected in the field to headquarters.

  8. Business logic for geoprocessing of distributed geodata

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian

    2006-12-01

    This paper describes the development of a business-logic component for the geoprocessing of distributed geodata. The business logic acts as a mediator between the data and the user, therefore playing a central role in any spatial information system. The component is used in service-oriented architectures to foster the reuse of existing geodata inventories. Based on a geoscientific case study of groundwater vulnerability assessment and mapping, the demands for such architectures are identified with special regard to software engineering tasks. Methods are derived from the field of applied Geosciences (Hydrogeology), Geoinformatics, and Software Engineering. In addition to the development of a business logic component, a forthcoming Open Geospatial Consortium (OGC) specification is introduced: the OGC Web Processing Service (WPS) specification. A sample application is introduced to demonstrate the potential of WPS for future information systems. The sample application Geoservice Groundwater Vulnerability is described in detail to provide insight into the business logic component, and demonstrate how information can be generated out of distributed geodata. This has the potential to significantly accelerate the assessment and mapping of groundwater vulnerability. The presented concept is easily transferable to other geoscientific use cases dealing with distributed data inventories. Potential application fields include web-based geoinformation systems operating on distributed data (e.g. environmental planning systems, cadastral information systems, and others).
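As a sketch of how a client would first contact a service like the one described above, the OGC WPS specification defines a key-value-pair GetCapabilities request for discovering offered processes. The host URL below is a placeholder, not the Geoservice Groundwater Vulnerability endpoint.

```python
from urllib.parse import urlencode

def wps_get_capabilities_url(base_url):
    """Build a key-value-pair (KVP) GetCapabilities request URL as
    defined by the OGC WPS specification. The response is an XML
    capabilities document listing the processes the service offers."""
    params = {"service": "WPS", "request": "GetCapabilities"}
    return base_url + "?" + urlencode(params)

# Placeholder host for illustration.
url = wps_get_capabilities_url("http://example.org/wps")
print(url)
```

A client would follow this with DescribeProcess and Execute requests to run a specific geoprocessing operation (such as a vulnerability assessment) on distributed geodata.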

  9. Evaluation of expert system application based on usability aspects

    NASA Astrophysics Data System (ADS)

    Munaiseche, C. P. C.; Liando, O. E. S.

    2016-04-01

    Usability is usually defined as the degree of human acceptance of a product or system, based on understanding of and correct reaction to an interface. The performance of a web application is influenced by the quality of its interface in supporting the information transfer process. Ideally, before expert system applications are installed in an operational environment, they should first be evaluated by usability testing. This research aimed to measure the usability of an expert system application using tasks as the interaction medium. The study uses an expert system application for diagnosing human skin disease, with a questionnaire method that utilizes tasks as the interaction medium for measuring usability. Participants executed specific tasks while the usability of the application was observed. The usability aspects observed were learnability, efficiency, memorability, errors, and satisfaction, with each questionnaire question representing one usability aspect. The results present the usability value for each aspect; the overall average across all five usability aspects was 4.28, indicating that the tested expert system application falls in the excellent range of usability, so the application can be deployed for operation by users. The main contribution of the study is that it is a first step in using a task model for usability evaluation of expert system application software.
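The overall score reported above is an average across the five usability aspects. A minimal sketch with hypothetical per-aspect means on a 1-5 scale; only the 4.28 overall value comes from the study, the individual aspect values below are invented to illustrate the computation.

```python
# Hypothetical per-aspect mean scores (1-5 Likert scale); the record
# reports only the overall average, not these individual values.
aspect_scores = {
    "learnability": 4.4,
    "efficiency": 4.2,
    "memorability": 4.3,
    "errors": 4.1,
    "satisfaction": 4.4,
}

# Overall usability is the unweighted mean of the five aspect means.
overall = sum(aspect_scores.values()) / len(aspect_scores)
print(round(overall, 2))  # 4.28
```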

  10. The internet

    PubMed Central

    Al-Shahi, R; Sadler, M; Rees, G; Bateman, D

    2002-01-01

    The growing use of email and the world wide web (WWW), by the public, academics, and clinicians—as well as the increasing availability of high quality information on the WWW—make a working knowledge of the internet important. Although this article aims to enhance readers' existing use of the internet and medical resources on the WWW, it is also intelligible to someone unfamiliar with the internet. A web browser is one of the central pieces of software in modern computing: it is a window on the WWW, file transfer protocol sites, networked newsgroups, and your own computer's files. Effective use of the internet for professional purposes requires an understanding of the best strategies to search the WWW and the mechanisms for ensuring secure data transfer, as well as a compendium of online resources including journals, textbooks, medical portals, and sites providing high quality patient information. This article summarises these resources, available to incorporate into your web browser as downloadable "Favorites" or "Bookmarks" from www.jnnp.com, where there are also freely accessible hypertext links to the recommended sites. PMID:12438460

  11. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*.

    PubMed

    Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa

    2010-08-21

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that do not need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers of emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies.

  12. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*

    PubMed Central

    2010-01-01

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that do not need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers of emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies. PMID:20727200

  13. Cleanups In My Community (CIMC) - Federal facilities that are also Superfund sites, National Layer

    EPA Pesticide Factsheets

    Federal facilities are properties owned by the federal government. This data layer provides access to Federal facilities that are Superfund sites as part of the CIMC web service. Data are collected using the Superfund Enterprise Management System (SEMS) and transferred to Envirofacts for access by the public. Data about Federal facility Superfund sites are located on their own EPA web pages, and CIMC links to those pages. Links to the relevant web pages for each site are provided within the attribute table. Federal facility sites can be either Superfund sites or RCRA Corrective Action sites, or they may have moved from one program to the other and back. In Cleanups in My Community, you can map or list any of these Federal Facility sites. This data layer shows only those facilities that are Superfund sites. RCRA federal facility sites and other Superfund NPL sites are included in other data layers as part of this web service. Superfund is a program administered by the EPA to locate, investigate, and clean up the worst hazardous waste sites throughout the United States. EPA administers the Superfund program in cooperation with individual states and tribal governments. These sites include abandoned warehouses, manufacturing facilities, processing plants, and landfills - the key word here being abandoned. The CIMC web service was initially published in 2013, but the data are updated on the 18th of each month. The full schedule for data updates in CIMC is located here:

  14. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling the sensor observations in near real-time and obtaining valuable information is challenging issues in these systems from a technical and scientific point of view. The ever-increasing population growth in urban areas has caused certain problems in developing countries, which has direct or indirect impact on human life. One of applicable solution for controlling and managing air quality by considering real time and update air quality information gathered by spatially distributed sensors in mega cities, using sensor web technology for developing monitoring and early warning systems. Urban air quality monitoring systems using functionalities of geospatial information system as a platform for analysing, processing, and visualization of data in combination with Sensor Web for supporting decision support systems in disaster management and emergency situations. This system uses Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. SWE framework introduces standards for services to access sensor data and discover events from sensor data streams as well as definition set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, process air quality sensor data and disseminate air quality status in real-time. It is possible to overcome interoperability challenges by using standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to central station where data is analysed and processed. The extracted air quality status is processed for discovering emergency situations, and if necessary air quality reports are sent to the authorities. 
This research proposes an architecture showing how to integrate an air quality sensor data stream into a geospatial data infrastructure, yielding an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and notifying registered users by warning e-mail in emergency cases. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system can retrieve SOS observations using WPS in a cascaded service-chaining pattern to monitor trends in sensor observations over time.
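The AQI calculation mentioned above is conventionally done by piecewise linear interpolation between pollutant concentration breakpoints. A minimal sketch in Python, assuming US-EPA-style breakpoints for 8-hour CO in ppm (the paper does not state which breakpoint table it uses):

```python
# Piecewise-linear AQI from a CO concentration (ppm). The breakpoint
# table below follows the US EPA convention and is an assumption; the
# paper's exact table is not given.
CO_BREAKPOINTS = [
    # (C_lo, C_hi, I_lo, I_hi)
    (0.0, 4.4, 0, 50),
    (4.5, 9.4, 51, 100),
    (9.5, 12.4, 101, 150),
    (12.5, 15.4, 151, 200),
    (15.5, 30.4, 201, 300),
]

def co_aqi(conc_ppm: float) -> int:
    """Map a CO concentration onto the AQI scale by linear interpolation."""
    for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
        if c_lo <= conc_ppm <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (conc_ppm - c_lo) + i_lo)
    raise ValueError("concentration outside breakpoint table")

print(co_aqi(7.2))  # falls in the 'Moderate' band
```

A deployed system would apply the same interpolation per pollutant and report the maximum sub-index as the overall AQI.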

  15. Stable-isotope analysis: a neglected tool for placing parasites in food webs.

    PubMed

    Sabadel, A J M; Stumbo, A D; MacLeod, C D

    2018-02-28

    Parasites are often overlooked in the construction of food webs, despite their ubiquitous presence in almost every type of ecosystem. Researchers who do recognize their importance often struggle to include parasites using classical food-web theory, mainly due to the parasites' multiple hosts and life stages. A novel approach using compound-specific stable-isotope analysis promises to provide considerable insight into the energetic exchanges of parasite and host, which may solve some of the issues inherent in incorporating parasites using a classical approach. Understanding the role of parasites within food webs, and tracing the associated biomass transfers, are crucial to constructing new models that will expand our knowledge of food webs. This mini-review focuses on stable-isotope studies published in the past decade, and introduces compound-specific stable-isotope analysis as a powerful, but underutilized, newly developed tool that may answer many unresolved questions regarding the role of parasites in food webs.

  16. 75 FR 27986 - Electronic Filing System-Web (EFS-Web) Contingency Option

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-19

    ...] Electronic Filing System--Web (EFS-Web) Contingency Option AGENCY: United States Patent and Trademark Office... availability of its patent electronic filing system, Electronic Filing System--Web (EFS-Web) by providing a new contingency option when the primary portal to EFS-Web has an unscheduled outage. Previously, the entire EFS...

  17. Finding, Browsing and Getting Data Easily Using SPDF Web Services

    NASA Technical Reports Server (NTRS)

Candey, R.; Chimiak, R.; Harris, B.; Johnson, R.; Kovalick, T.; Lal, N.; Leckner, H.; Liu, M.; McGuire, R.; Papitashvili, N.

    2010-01-01

The NASA GSFC Space Physics Data Facility (SPDF) provides heliophysics science-enabling information services for enhancing scientific research and enabling integration of these services into the Heliophysics Data Environment paradigm, via a standards-based approach (SOAP) and Representational State Transfer (REST) web services, in addition to web browser, FTP, and OPeNDAP interfaces. We describe these interfaces and the philosophies behind these web services, and show how to call them from various languages, such as IDL and Perl. We are working towards the "one simple line to call" philosophy extolled in the recent VxO discussions. Combining data from many instruments and missions enables broad research analysis as well as correlation and coordination with other experiments and missions.
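A REST interface of this kind can generally be exercised with nothing more than an HTTP GET. A hedged sketch using only the Python standard library; the base URL and parameter names here are hypothetical placeholders, not SPDF's actual API:

```python
import json
import urllib.parse
import urllib.request

def build_url(base_url: str, **params) -> str:
    """Compose a REST resource URL from a base and query parameters."""
    return base_url + "?" + urllib.parse.urlencode(params)

def fetch_json(base_url: str, **params) -> dict:
    """GET a REST resource and decode its JSON response."""
    with urllib.request.urlopen(build_url(base_url, **params)) as resp:
        return json.load(resp)

# Hypothetical call shape; substitute a real endpoint and parameters
# from the service documentation:
# data = fetch_json("https://example.org/datasets",
#                   mission="ACE", start="2010-01-01", stop="2010-01-02")
```

This is close to the "one simple line to call" idea: a single function call per query, with the service doing the heavy lifting.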

  18. ReSTful OSGi Web Applications Tutorial

    NASA Technical Reports Server (NTRS)

    Shams, Khawaja; Norris, Jeff

    2008-01-01

This slide presentation accompanies a tutorial on ReSTful (Representational State Transfer) web applications built with the Open Services Gateway initiative (OSGi) framework. ReST uses the HTTP protocol to enable developers to offer services to a diverse variety of clients: from shell scripts to sophisticated Java application suites. The tutorial also makes use of Eclipse for rapid development, the Eclipse debugger, test applications, and easy export to production servers.
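The reason ReST serves such diverse clients is that any HTTP-speaking tool can consume it. A minimal standard-library sketch of a GET endpoint returning JSON, purely illustrative and unrelated to the tutorial's actual OSGi code:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    """A single read-only JSON resource at /status."""

    def do_GET(self):
        if self.path == "/status":
            body = json.dumps({"service": "demo", "ok": True}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # silence per-request logging in this sketch

# To serve (blocking):
# HTTPServer(("127.0.0.1", 8000), StatusHandler).serve_forever()
```

A shell script consumes the same resource with `curl http://localhost:8000/status`, which is exactly the client diversity the abstract highlights.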

  19. Bridging the Field Trip Gap: Integrating Web-Based Video as a Teaching and Learning Partner in Interior Design Education

    ERIC Educational Resources Information Center

    Roehl, Amy

    2013-01-01

    This study utilizes web-based video as a strategy to transfer knowledge about the interior design industry in a format that interests the current generation of students. The model of instruction developed is based upon online video as an engaging, economical, and time-saving alternative to a field trip, guest speaker, or video teleconference.…

  20. 75 FR 3217 - J&T Hydro Company; H. Dean Brooks and W. Bruce Cox; Notice of Application for Transfer of License...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-20

    ... Ramseur Project No. 11392 located on the Deep River in Randolph County, North Carolina. The transferor and...)(iii)(2009) and the instructions on the Commission's Web site under the ``e-Filing'' link. If unable to... the Commission's Web site located at http://www.ferc.gov/filing-comments.asp . More information about...

  1. VO-KOREL: A Fourier Disentangling Service of the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Škoda, Petr; Hadrava, Petr; Fuchs, Jan

    2012-04-01

VO-KOREL is a web service exploiting Virtual Observatory technology to provide astronomers with an intuitive graphical front-end and a distributed computing back-end running the most recent version of the Fourier disentangling code KOREL. The system integrates the ideas of the e-shop basket, preserving the privacy of every user through transfer encryption and access authentication, with the features of a laboratory notebook, allowing easy housekeeping of both input parameters and final results, and it explores the newly emerging technology of cloud computing. The web-based front-end allows the user to submit data and parameter files, edit parameters, manage a job list, resubmit or cancel running jobs and, above all, watch the textual and graphical results of the disentangling process. The main part of the back-end is a simple job-queue submission system executing multiple instances of the FORTRAN code KOREL in parallel; this may easily be extended for GRID-based deployment on massively parallel computing clusters. A short introduction to the underlying technologies is given, briefly mentioning the advantages as well as the bottlenecks of the design used.
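The back-end described, a queue that runs several instances of an external code in parallel with a bounded level of concurrency, can be approximated by a worker pool. A sketch in Python; the `echo` commands stand in (hypothetically) for individual KOREL invocations with their own parameter files:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_job(cmd):
    """Run one external job and return (command, exit status)."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return cmd, proc.returncode

# Each entry stands in for one KOREL run; the commands are placeholders.
jobs = [["echo", f"job-{i}"] for i in range(4)]

# A fixed-size pool bounds how many instances run concurrently,
# which is the essence of the submission queue described above.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(run_job, jobs))
```

Swapping the thread pool for a batch scheduler's submit call is the natural path to the GRID deployment the abstract mentions.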

  2. Impact of nitrogen deposition on forest and lake food webs in nitrogen-limited environments.

    PubMed

    Meunier, Cédric L; Gundale, Michael J; Sánchez, Irene S; Liess, Antonia

    2016-01-01

Increased reactive nitrogen (Nr) deposition has raised the amount of N available to organisms and has greatly altered the transfer of energy through food webs, with major consequences for trophic dynamics. The aim of this review was to: (i) clarify the direct and indirect effects of Nr deposition on forest and lake food webs in N-limited biomes, (ii) compare and contrast how aquatic and terrestrial systems respond to increased Nr deposition, and (iii) identify how the nutrient pathways within and between ecosystems change in response to Nr deposition. We show that Nr deposition releases primary producers from N limitation in both forest and lake ecosystems and raises plants' N content, which in turn benefits herbivores with high N requirements. Such trophic effects are coupled with a general decrease in biodiversity caused by different N-use efficiencies; slow-growing species with low rates of N turnover are replaced by fast-growing species with high rates of N turnover. In contrast, Nr deposition diminishes below-ground production in forests, due to a range of mechanisms that reduce microbial biomass, and decreases lake benthic productivity by switching herbivore growth from N to phosphorus (P) limitation, and by intensifying P limitation of benthic fish. The flow of nutrients between ecosystems is expected to change with increasing Nr deposition. Due to higher litter production and more intense precipitation, more terrestrial matter will enter lakes. This will benefit bacteria and will in turn boost the microbial food web. Additionally, Nr deposition promotes emergent insects, which subsidize the terrestrial food web as prey for insectivores or by dying and decomposing on land. So far, most studies have examined Nr-deposition effects on the food web base, whereas our review highlights that changes at the base of food webs substantially impact higher trophic levels and therefore food web structure and functioning. © 2015 John Wiley & Sons Ltd.

  3. An Architecture Combining IMS-LD and Web Services for Flexible Data-Transfer in CSCL

    ERIC Educational Resources Information Center

    Magnisalis, Ioannis; Demetriadis, Stavros

    2017-01-01

    This article presents evaluation data regarding the MAPIS3 architecture which is proposed as a solution for the data-transfer among various tools to promote flexible collaborative learning designs. We describe the problem that this architecture deals with as "tool orchestration" in collaborative learning settings. This term refers to a…

  4. Native and nonnative fish populations of the Colorado River are food limited--evidence from new food web analyses

    USGS Publications Warehouse

    Kennedy, Theodore A.; Cross, Wyatt F.; Hall, Robert O.; Baxter, Colden V.; Rosi-Marshall, Emma J.

    2013-01-01

    Fish populations in the Colorado River downstream from Glen Canyon Dam appear to be limited by the availability of high-quality invertebrate prey. Midge and blackfly production is low and nonnative rainbow trout in Glen Canyon and native fishes in Grand Canyon consume virtually all of the midge and blackfly biomass that is produced annually. In Glen Canyon, the invertebrate assemblage is dominated by nonnative New Zealand mudsnails, the food web has a simple structure, and transfers of energy from the base of the web (algae) to the top of the web (rainbow trout) are inefficient. The food webs in Grand Canyon are more complex relative to Glen Canyon, because, on average, each species in the web is involved in more interactions and feeding connections. Based on theory and on studies from other ecosystems, the structure and organization of Grand Canyon food webs should make them more stable and less susceptible to large changes following perturbations of the flow regime relative to food webs in Glen Canyon. In support of this hypothesis, Grand Canyon food webs were much less affected by a 2008 controlled flood relative to the food web in Glen Canyon.

  5. Higher mass-independent isotope fractionation of methylmercury in the pelagic food web of Lake Baikal (Russia).

    PubMed

    Perrot, Vincent; Pastukhov, Mikhail V; Epov, Vladimir N; Husted, Søren; Donard, Olivier F X; Amouroux, David

    2012-06-05

Mercury undergoes several transformations that influence its stable isotope composition during a number of environmental and biological processes. Measurements of Hg isotopic mass-dependent (MDF) and mass-independent fractionation (MIF) in food webs may therefore help to identify major sources and processes leading to significant bioaccumulation of methylmercury (MeHg). In this work, δ(13)C, δ(15)N, concentrations of Hg species (MeHg, inorganic Hg), and the stable isotopic composition of Hg were determined at different trophic levels of the remote and pristine Lake Baikal ecosystem. Muscle from seals and different fish, as well as amphipods, zooplankton, and phytoplankton, was specifically investigated. MDF during trophic transfer of MeHg leading to enrichment of heavier isotopes in the predators was clearly established by δ(202)Hg measurements in the pelagic prey-predator system (carnivorous sculpins and top-predator seals). Despite the low concentrations of Hg in the ecosystem, the pelagic food web reveals very high MIF Δ(199)Hg (3.15-6.65‰) in comparison to coastal fish (0.26-1.65‰) and most previous studies in aquatic organisms. Trophic transfer does not influence MIF signature since similar Δ(199)Hg was observed in sculpins (4.59 ± 0.55‰) and seal muscles (4.62 ± 0.60‰). The MIF is suggested to be mainly controlled by specific physical and biogeochemical characteristics of the water column. The higher level of MIF in pelagic fish of Lake Baikal is mainly due to the bioaccumulation of residual MeHg that is efficiently turned over and photodemethylated in deep oligotrophic and stationary (i.e., long residence time) freshwater columns.
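For context, the mass-independent anomaly reported as Δ(199)Hg is conventionally defined as the deviation of the measured δ(199)Hg from the value predicted by mass-dependent scaling of δ(202)Hg. The scaling factor below is the commonly used kinetic MDF constant; the paper may apply a slightly different convention:

```latex
\Delta^{199}\mathrm{Hg} \;=\; \delta^{199}\mathrm{Hg} \;-\; 0.2520 \times \delta^{202}\mathrm{Hg}
```

A non-zero Δ(199)Hg therefore signals processes, such as photochemical demethylation, that fractionate isotopes beyond what mass difference alone predicts.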

  6. Evaluation of a metal shear web selectively reinforced with filamentary composites for space shuttle application. Phase 1 summary report: Shear web design development

    NASA Technical Reports Server (NTRS)

    Laakso, J. H.; Zimmerman, D. K.

    1972-01-01

An advanced composite shear web design concept was developed for the Space Shuttle orbiter main engine thrust beam structure. Various web concepts were synthesized by a computer-aided adaptive random search procedure. A practical concept was identified having a titanium-clad ±45° boron/epoxy web plate with vertical boron/epoxy-reinforced aluminum stiffeners. The boron/epoxy laminate contributes to the strength and stiffness efficiency of the basic web section. The titanium cladding protects the polymeric laminate parts from damaging environments and is chem-milled to provide reinforcement in selected areas. Detailed design drawings are presented for both boron/epoxy-reinforced and all-metal shear webs. Based on the detailed designs, the weight saving offered is 24% relative to all-metal construction, at an attractive cost per pound of weight saved. Small-scale element tests substantiate the boron/epoxy-reinforced design details in critical areas. The results show that the titanium cladding reliably reinforces the web laminate in critical edge load transfer and stiffener fastener hole areas.

  7. Remote monitoring of vibrational information in spider webs.

    PubMed

    Mortimer, B; Soler, A; Siviour, C R; Vollrath, F

    2018-05-22

    Spiders are fascinating model species to study information-acquisition strategies, with the web acting as an extension of the animal's body. Here, we compare the strategies of two orb-weaving spiders that acquire information through vibrations transmitted and filtered in the web. Whereas Araneus diadematus monitors web vibration directly on the web, Zygiella x-notata uses a signal thread to remotely monitor web vibration from a retreat, which gives added protection. We assess the implications of these two information-acquisition strategies on the quality of vibration information transfer, using laser Doppler vibrometry to measure vibrations of real webs and finite element analysis in computer models of webs. We observed that the signal thread imposed no biologically relevant time penalty for vibration propagation. However, loss of energy (attenuation) was a cost associated with remote monitoring via a signal thread. The findings have implications for the biological use of vibrations by spiders, including the mechanisms to locate and discriminate between vibration sources. We show that orb-weaver spiders are fascinating examples of organisms that modify their physical environment to shape their information-acquisition strategy.

  8. Remote monitoring of vibrational information in spider webs

    NASA Astrophysics Data System (ADS)

    Mortimer, B.; Soler, A.; Siviour, C. R.; Vollrath, F.

    2018-06-01

    Spiders are fascinating model species to study information-acquisition strategies, with the web acting as an extension of the animal's body. Here, we compare the strategies of two orb-weaving spiders that acquire information through vibrations transmitted and filtered in the web. Whereas Araneus diadematus monitors web vibration directly on the web, Zygiella x-notata uses a signal thread to remotely monitor web vibration from a retreat, which gives added protection. We assess the implications of these two information-acquisition strategies on the quality of vibration information transfer, using laser Doppler vibrometry to measure vibrations of real webs and finite element analysis in computer models of webs. We observed that the signal thread imposed no biologically relevant time penalty for vibration propagation. However, loss of energy (attenuation) was a cost associated with remote monitoring via a signal thread. The findings have implications for the biological use of vibrations by spiders, including the mechanisms to locate and discriminate between vibration sources. We show that orb-weaver spiders are fascinating examples of organisms that modify their physical environment to shape their information-acquisition strategy.

  9. Technological Networks

    NASA Astrophysics Data System (ADS)

    Mitra, Bivas

The study of networks in the form of mathematical graph theory is one of the fundamental pillars of discrete mathematics. However, recent years have witnessed a substantial new movement in network research. The focus of the research is shifting away from the analysis of small graphs and the properties of individual vertices or edges to consideration of statistical properties of large-scale networks. This new approach has been driven largely by the availability of technological networks like the Internet [12], the World Wide Web [2], etc. that allow us to gather and analyze data on a scale far larger than previously possible. At the same time, technological networks have evolved as socio-technological systems, as concepts of social systems based on self-organization theory have become unified in technological networks [13]. In today’s society, we have simple and universal access to great amounts of information and services. These information services are based upon the infrastructure of the Internet and the World Wide Web. The Internet is the system composed of ‘computers’ connected by cables or some other form of physical connections. Over this physical network, it is possible to exchange e-mails, transfer files, etc. On the other hand, the World Wide Web (commonly shortened to the Web) is a system of interlinked hypertext documents accessed via the Internet, where nodes represent web pages and links represent hyperlinks between the pages. Peer-to-peer (P2P) networks [26] have also recently become a popular medium through which huge amounts of data can be shared. P2P file sharing systems, where files are searched and downloaded among peers without the help of central servers, have emerged as a major component of Internet traffic. An important advantage of P2P networks is that all clients provide resources, including bandwidth, storage space, and computing power. In this chapter, we discuss these technological networks in detail.
The review is organized as follows. Section 2 presents an introduction to the Internet and different protocols related to it. This section also specifies the socio-technological properties of the Internet, like scale invariance, the small-world property, network resilience, etc. Section 3 describes the P2P networks, their categorization, and other related issues like search, stability, etc. Section 4 concludes the chapter.
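The scale invariance mentioned above, a heavy-tailed degree distribution in which a few hubs acquire far more links than the typical node, emerges from degree-proportional ("preferential") attachment. A minimal pure-Python sketch of Barabási–Albert-style growth, written for illustration rather than taken from the chapter:

```python
import random
from collections import Counter

def preferential_attachment(n, m=2, seed=42):
    """Grow a graph where each new node links to up to m existing nodes
    chosen proportionally to their current degree. Returns node degrees."""
    rng = random.Random(seed)
    targets = list(range(m))  # the initial nodes
    repeated = []             # each node appears once per incident edge
    degrees = Counter()
    for new in range(m, n):
        for t in set(targets):
            degrees[new] += 1
            degrees[t] += 1
            repeated.extend([new, t])
        # Sampling from 'repeated' picks nodes with probability
        # proportional to degree -- the preferential-attachment rule.
        targets = [rng.choice(repeated) for _ in range(m)]
    return degrees

deg = preferential_attachment(2000)
print(max(deg.values()), min(deg.values()))  # hubs vs. typical nodes
```

Plotting `Counter(deg.values())` on log-log axes shows the power-law-like tail that distinguishes such networks from random graphs with the same average degree.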

  10. Cloud Based Web 3d GIS Taiwan Platform

    NASA Astrophysics Data System (ADS)

    Tsai, W.-F.; Chang, J.-Y.; Yan, S. Y.; Chen, B.

    2011-09-01

This article presents the status of the web 3D GIS platform developed at the National Applied Research Laboratories. The purpose is to develop a global earth observation 3D GIS platform for applications to disaster monitoring and assessment in Taiwan. For quick response with preliminary and detailed assessments after a natural disaster occurs, the web 3D GIS platform is used to access, transfer, integrate, display, and analyze huge multi-scale datasets following the international OGC standards. The framework of the cloud service for data warehouse management and efficiency enhancement using VMware is also illustrated in this article.

  11. Bringing Control System User Interfaces to the Web

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xihui; Kasemir, Kay

With the evolution of web-based technologies, especially HTML5 [1], it becomes possible to create web-based control system user interfaces (UI) that are cross-browser and cross-device compatible. This article describes two technologies that facilitate this goal. The first one is the WebOPI [2], which can seamlessly display CSS BOY [3] Operator Interfaces (OPI) in web browsers without modification to the original OPI file. The WebOPI leverages the powerful graphical editing capabilities of BOY and provides the convenience of re-using existing OPI files. On the other hand, it uses generic JavaScript and a generic communication mechanism between the web browser and web server. It is not optimized for a control system, which results in unnecessary network traffic and resource usage. Our second technology is the WebSocket-based Process Data Access (WebPDA) [4]. It is a protocol that provides efficient control system data communication using WebSocket [5], so that users can create web-based control system UIs using standard web page technologies such as HTML, CSS and JavaScript. WebPDA is control system independent, potentially supporting any type of control system.
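A WebSocket data protocol like WebPDA typically frames requests and updates as small JSON messages. The message shape below is a hypothetical illustration, not WebPDA's documented wire format, sketched with the standard json module:

```python
import json

def subscribe_message(pv_name: str, msg_id: int) -> str:
    """Encode a hypothetical 'subscribe to process variable' request."""
    return json.dumps({"type": "subscribe", "id": msg_id, "pv": pv_name})

def decode_update(raw: str):
    """Decode a hypothetical value-update frame into (pv, value)."""
    msg = json.loads(raw)
    return msg["pv"], msg["value"]

frame = subscribe_message("temperature:cell1", 1)
pv, value = decode_update('{"type": "update", "pv": "temperature:cell1", "value": 21.5}')
```

The efficiency argument in the abstract comes from pushing only such compact, typed updates over a persistent WebSocket, instead of polling the server with full HTTP requests.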

  12. Development of a Web-based Glaucoma Registry at King Khaled Eye Specialist Hospital, Saudi Arabia: A Cost-Effective Methodology

    PubMed Central

    Zaman, Babar; Khandekar, Rajiv; Al Shahwan, Sami; Song, Jonathan; Al Jadaan, Ibrahim; Al Jiasim, Leyla; Owaydha, Ohood; Asghar, Nasira; Hijazi, Amar; Edward, Deepak P.

    2014-01-01

In this brief communication, we present the steps used to establish a web-based congenital glaucoma registry at our institution. The contents of a case report form (CRF) were developed by a group of glaucoma subspecialists. Information Technology (IT) specialists used Lime Survey™ software to create an electronic CRF. A MySQL server running on a virtual machine was used as the database. Two ophthalmologists and 2 IT specialists worked for 7 hours, and a biostatistician and a data registrar worked for 24 hours each, to establish the electronic CRF. Using the CRF transferred to the Lime Survey tool and the MySQL server application, data could be stored directly for analysis in programs including Microsoft Excel, SPSS, and the R language, and queried in real time. In a pilot test, clinical data from 80 patients with congenital glaucoma were entered into the registry, and successful descriptive analysis and data entry validation were performed. A web-based disease registry was established in a short period of time in a cost-efficient manner using available resources and a team-based approach. PMID:24791112

  13. Development of a web-based glaucoma registry at King Khaled Eye Specialist Hospital, Saudi Arabia: a cost-effective methodology.

    PubMed

    Zaman, Babar; Khandekar, Rajiv; Al Shahwan, Sami; Song, Jonathan; Al Jadaan, Ibrahim; Al Jiasim, Leyla; Owaydha, Ohood; Asghar, Nasira; Hijazi, Amar; Edward, Deepak P

    2014-01-01

In this brief communication, we present the steps used to establish a web-based congenital glaucoma registry at our institution. The contents of a case report form (CRF) were developed by a group of glaucoma subspecialists. Information Technology (IT) specialists used Lime Survey™ software to create an electronic CRF. A MySQL server running on a virtual machine was used as the database. Two ophthalmologists and 2 IT specialists worked for 7 hours, and a biostatistician and a data registrar worked for 24 hours each, to establish the electronic CRF. Using the CRF transferred to the Lime Survey tool and the MySQL server application, data could be stored directly for analysis in programs including Microsoft Excel, SPSS, and the R language, and queried in real time. In a pilot test, clinical data from 80 patients with congenital glaucoma were entered into the registry, and successful descriptive analysis and data entry validation were performed. A web-based disease registry was established in a short period of time in a cost-efficient manner using available resources and a team-based approach.
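Data-entry validation of the kind described, a form whose backing store rejects out-of-range entries so every consumer sees one version of the truth, can be enforced at the database layer. An illustrative sketch using Python's built-in SQLite (the field names and ranges are hypothetical, not the registry's actual CRF; the registry itself used MySQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# CHECK constraints reject invalid entries at the storage layer, so no
# entry path can introduce inconsistent data.
conn.execute("""
    CREATE TABLE glaucoma_cases (
        patient_id TEXT PRIMARY KEY,
        age_months INTEGER NOT NULL CHECK (age_months BETWEEN 0 AND 216),
        iop_mmhg   REAL    NOT NULL CHECK (iop_mmhg > 0 AND iop_mmhg < 80),
        eye        TEXT    NOT NULL CHECK (eye IN ('OD', 'OS', 'OU'))
    )
""")
conn.execute("INSERT INTO glaucoma_cases VALUES ('P001', 14, 28.5, 'OD')")

try:
    conn.execute("INSERT INTO glaucoma_cases VALUES ('P002', 14, -3.0, 'OS')")
except sqlite3.IntegrityError:
    pass  # negative intraocular pressure rejected by the CHECK constraint
```

The same constraints expressed in MySQL DDL would give the web form's back end the "safely constrains the data" property described above.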

  14. Hydrothermal activity, functional diversity and chemoautotrophy are major drivers of seafloor carbon cycling.

    PubMed

    Bell, James B; Woulds, Clare; Oevelen, Dick van

    2017-09-20

Hydrothermal vents are highly dynamic ecosystems and are unusually energy-rich environments in the deep sea. In situ hydrothermal-based productivity combined with sinking photosynthetic organic matter in a soft-sediment setting creates geochemically diverse environments, which remain poorly studied. Here, we use a comprehensive set of new and existing field observations to develop a quantitative ecosystem model of a deep-sea chemosynthetic ecosystem from the most southerly hydrothermal vent system known. We find evidence of chemosynthetic production supplementing the metazoan food web both at vent sites and elsewhere in the Bransfield Strait. Endosymbiont-bearing fauna were very important in supporting the transfer of chemosynthetic carbon into the food web, particularly to higher trophic levels. Chemosynthetic production occurred at all sites to varying degrees but was generally only a small component of the total organic matter inputs to the food web, even in the most hydrothermally active areas, owing in part to a low and patchy density of vent-endemic fauna. Differences between relative abundance of faunal functional groups, resulting from environmental variability, were clear drivers of differences in biogeochemical cycling and resulted in substantially different carbon processing patterns between habitats.

  15. Fatty acid transfer in the food web of a coastal Mediterranean lagoon: Evidence for high arachidonic acid retention in fish

    NASA Astrophysics Data System (ADS)

    Koussoroplis, Apostolos-Manuel; Bec, Alexandre; Perga, Marie-Elodie; Koutrakis, Emmanuil; Bourdier, Gilles; Desvilettes, Christian

    2011-02-01

The transfer of fatty acids (FAs) in the food web of a Mediterranean lagoon was studied using FA compositional patterns across several trophic levels. The structure of the food web was inferred from C and N stable isotope values, and an isotope mixing model was used to estimate the relative contribution of the different potential food sources to the biomass of consumers. Bidimensional plots of the FA composition of food web components against their δ15N values indicated a general trend of increasing proportions of highly unsaturated fatty acids (HUFAs) with increasing trophic level, while the proportions of saturated fatty acids (SAFAs) and 18-carbon polyunsaturated fatty acids (PUFAs) decreased. Using the relative contributions of food sources to consumers and their FA compositions, a model was built to estimate the PUFA composition of consumer mixed diets, which was compared to consumer PUFA profiles. This allowed the identification of the PUFAs most enriched or retained in consumer lipids. There was a surprisingly high retention of arachidonic acid (ARA), a trend which challenges the idea of low ARA needs in marine fish and suggests an important physiological role of this essential FA for fish in estuarine environments.
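In its simplest two-source form, the isotope mixing model mentioned above solves a linear mass balance for the fraction of each source in a consumer's diet. A sketch with invented example δ values (real applications use more sources and apply trophic enrichment corrections, as in IsoSource-style models):

```python
def two_source_fraction(d_consumer, d_source1, d_source2):
    """Fraction of source 1 in the mixture, from the linear mass balance
    d_consumer = f * d_source1 + (1 - f) * d_source2."""
    if d_source1 == d_source2:
        raise ValueError("sources are isotopically indistinguishable")
    return (d_consumer - d_source2) / (d_source1 - d_source2)

# Invented d13C values: a consumer lying between two food sources.
f = two_source_fraction(-18.0, -16.0, -22.0)
print(f)
```

With the consumer two-thirds of the way toward source 1 on the δ scale, the model attributes two-thirds of its biomass to that source.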

  16. Tracing Fluxes Of Aquatic Production And Contaminants Into Terrestrial Food Webs With Nitrogen Stable Isotopes

    NASA Astrophysics Data System (ADS)

    Rivard, A.; Cabana, G.; Rainey, W.; Power, M.

    2005-05-01

Biomagnifying contaminants such as mercury can be transported and redistributed across the watershed by streams and rivers. Their fate and effects on consumers depend on food web transfer both within and between aquatic and terrestrial ecosystems. The Truckee River (CA/NV) is heavily contaminated by Hg originating from century-old upstream mining operations. We used nitrogen stable isotope analysis to trace the incorporation of Hg transported by the Truckee and transferred by emerging aquatic insects into the riparian food web. N-isotope ratios and Hg concentrations of aquatic primary consumers were significantly elevated compared to those of terrestrial arthropods (13.3 vs 5.6‰ and 110 vs 17 ng g-1). Estimates of dependence on aquatic prey in 16 riparian passerine bird species based on blood δ15N ranged between 0.0 and 0.95 and were significantly related to Hg in blood. Similar correlations between Hg and δ15N measured in tail tips of western fence lizards (Sceloporus occidentalis) collected at increasing distances from the river were observed. High inter-individual variation in bird Hg was highly correlated with δ15N. These results show how stable isotopes and contaminant fluxes can reveal important food web linkages across aquatic/terrestrial ecotones.

  17. DTS: The NOAO Data Transport System

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, M.; Semple, T.

    2014-05-01

The NOAO Data Transport System (DTS) provides high-throughput, reliable data transfer between telescopes, pipelines, and archive centers located in the Northern and Southern hemispheres. It is a distributed application using XML-RPC for command and control, and either parallel-TCP or UDT protocols for bulk data transport. The system is data-agnostic, allowing arbitrary files or directories to be moved using the same infrastructure. Data paths are configured in the system by connecting nodes as the source or destination of data in a queue. Each leg of a data path may be configured independently based on the network environment between the sites. A queueing model is currently implemented to manage the automatic movement of data; a streaming model is planned to support arbitrarily large transfers (e.g. as in a disk recovery scenario) or to provide a 'pass-thru' interface to minimize overheads. A web-based monitor allows anyone to get a graphical overview of the DTS system as it runs; operators will be able to control individual nodes in the system. Through careful tuning of the network paths, DTS is able to achieve in excess of 80 percent of the nominal wire speed using only commodity networks, making it ideal for long-haul transport of large volumes of data.
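The command-and-control layer described uses XML-RPC, for which Python's standard library provides both ends. A minimal sketch; the method name `ping` and its reply are chosen for illustration and are not taken from DTS:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Server side: a node registers the control methods it answers to.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lambda name: f"queue {name} is alive", "ping")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: another node issues control calls over the same channel
# that the bulk parallel-TCP/UDT transfers run alongside.
proxy = ServerProxy(f"http://127.0.0.1:{port}")
reply = proxy.ping("north-to-south")
server.shutdown()
```

Keeping control traffic on a lightweight RPC channel, separate from the bulk transport sockets, is the design choice the abstract describes.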

  18. Global change in the trophic functioning of marine food webs

    PubMed Central

    Gascuel, Didier; Colléter, Mathieu; Palomares, Maria L. D.; Du Pontavice, Hubert; Pauly, Daniel; Cheung, William W. L.

    2017-01-01

The development of fisheries in the oceans, and other human drivers such as climate warming, have led to changes in species abundance, assemblages, trophic interactions, and ultimately in the functioning of marine food webs. Here, using a trophodynamic approach and global databases of catches and life history traits of marine species, we tested the hypothesis that anthropogenic ecological impacts may have led to changes in the global parameters defining the transfers of biomass within the food web. First, we developed two indicators to assess such changes: the Time Cumulated Indicator (TCI) measuring the residence time of biomass within the food web, and the Efficiency Cumulated Indicator (ECI) quantifying the fraction of secondary production reaching the top of the trophic chain. Then, we assessed, at the large marine ecosystem scale, the worldwide change of these two indicators over the 1950–2010 period. Global trends were identified and cluster analyses were used to characterize the variability of trends between ecosystems. Results showed that the most common pattern over the study period is a global decrease in TCI, while the ECI indicator tends to increase. Thus, changes in species assemblages would induce faster and apparently more efficient biomass transfers in marine food webs. Results also suggested that the main driver of change over that period had been the large increase in fishing pressure. The largest changes occurred in ecosystems where ‘fishing down the marine food web’ is most intense. PMID:28800358

  19. Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems

    ERIC Educational Resources Information Center

    Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul

    2009-01-01

    Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…

  20. Silver bioaccumulation in chironomid larvae as a potential source for upper trophic levels: a study case from northern Patagonia.

    PubMed

    Williams, Natalia; Rizzo, Andrea; Arribére, María A; Suárez, Diego Añón; Guevara, Sergio Ribeiro

    2018-01-01

    Silver (Ag) is a pollutant of high concern in aquatic ecosystems and is considered among the most toxic metallic ions. In lacustrine environments, contaminated sediments are a source of Ag for the food web. Chironomidae (Insecta: Diptera) are among the most abundant, diverse, and representative insect groups in aquatic ecosystems. Chironomid larvae are closely associated with benthic substrates and link primary producers and secondary consumers. Given their trophic position and life habits, these larvae can be considered the entry point for the transfer of Ag from the benthic deposit to the higher trophic levels of the food web. Previous studies in lakes of Nahuel Huapi National Park (Northern Patagonia) showed Ag enrichment over background levels (0.04-0.1 μg g-1 dry weight) both in biota (bivalves and fish liver) and in sediments from sites near human settlements. The aim of this study was to analyze the role of chironomids in the transfer of Ag from the benthic reservoir of Lake Moreno Oeste to the food web. The concentration of Ag in chironomid larval tissue ranged from 0.1 to 1.5 μg g-1 dry weight, reaching a bioaccumulation factor of up to 17 over substrates, depending on the associated substrate type, feeding habits, larval stage, and season. The main Ag transfer to higher trophic levels by chironomids occurs in the littoral zone, mostly from larvae inhabiting submerged vegetation (Myriophyllum quitense) and sediment from vegetated zones. This study presents novel evidence of the doorway role played by chironomid larvae in Ag pathways from the sediments into the food webs of freshwater ecosystems.

  1. Dynamic analysis of temporal moisture profiles in heatset printing studied with near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tåg, C.-M.; Toiviainen, M.; Juuti, M.; Gane, P. A. C.

    2010-10-01

    The dynamics of water transfer onto coated paper, and of its permeation and absorption into the porous structure, were studied online in a full-scale heatset web offset printing environment. The moisture content of the paper was investigated at five different positions during the printing process. Changes in the moisture content of the paper were studied as a function of the web temperature, printing speed, and silicone application in the folding unit positioned after the hot air drying oven. Additionally, the influence of fountain solution composition on the pick-up by the paper was investigated. The water content of the fountain solution transferred to the paper from the printing units was observed as changes in near-infrared absorbance. A calibration data set enabled the subsequent quantification of the dynamic moisture content of the paper at the studied locations. An increase in the printing speed reduced the water transfer to the paper, and an increase in web temperature resulted in a reduction in the moisture content. An increase in the dosage level of the water-silicone mixture was observed as a re-moistening effect on the paper. Differences in the drying strategy resulted in different moisture profiles depending on the type of fountain solution used. In conclusion, the near-infrared signal provides an effective way to characterize the moisture dynamics online at different press units.
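
    The calibration step mentioned above maps near-infrared absorbance readings to moisture content. A minimal sketch of such a one-variable calibration is shown below; the data pairs and the linear model are illustrative assumptions, not values from the study.

```python
# Hedged sketch: ordinary least-squares calibration of moisture (%) against
# NIR absorbance. The calibration pairs here are made up for illustration.
def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical calibration pairs: (absorbance, moisture %).
absorbance = [0.10, 0.20, 0.30, 0.40]
moisture   = [2.0,  4.0,  6.0,  8.0]
a, b = fit_line(absorbance, moisture)
predict = lambda x: a * x + b        # moisture estimate for a new reading
print(round(predict(0.25), 2))
```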

  2. Leveraging Globus to Support Access and Delivery of Scientific Data

    NASA Astrophysics Data System (ADS)

    Cram, T.; Schuster, D.; Ji, Z.; Worley, S. J.

    2015-12-01

    The NCAR Research Data Archive (RDA; http://rda.ucar.edu) contains a large and diverse collection of meteorological and oceanographic observations, operational and reanalysis outputs, and remote sensing datasets to support atmospheric and geoscience research. The RDA contains more than 600 dataset collections that support the varying needs of a diverse user community. The number of RDA users is increasing annually, and the most popular method of accessing the RDA data holdings is through web-based protocols, such as wget- and cURL-based scripts. In 2014, 11,000 unique users downloaded more than 1.1 petabytes of data from the RDA, and customized data products were prepared for more than 45,000 user-driven requests. To further support this increase in web download usage, the RDA has implemented the Globus data transfer service (www.globus.org) to provide a GridFTP data transfer option for the user community. The Globus service is broadly scalable, has an easy-to-install client, is sustainably supported, and provides a robust, efficient, and reliable data transfer option for the research community. This presentation will highlight the technical functionality, challenges, and usefulness of the Globus data transfer service for accessing the RDA data holdings.
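
    The script-based access pattern mentioned above typically turns a list of archive file paths into download commands. The sketch below illustrates that idea; the host name, dataset identifier, and file names are placeholders, not actual RDA paths.

```python
# Sketch: generate wget commands for a list of archive files, mirroring the
# wget-script access pattern described in the abstract. All names are
# hypothetical.
def wget_commands(base_url, files, outdir="."):
    """Build one wget command per file (-N: only fetch newer copies)."""
    return [f"wget -N -P {outdir} {base_url.rstrip('/')}/{f}" for f in files]

cmds = wget_commands("https://example.org/data/ds-example",
                     ["grib1/2014/file1.grb", "grib1/2014/file2.grb"])
for c in cmds:
    print(c)
```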

  3. Delivering images to the operating room: a web-based solution.

    PubMed

    Bennett, W F; Tunstall, K M; Skinner, P W; Spigos, D G

    2002-01-01

    As radiology departments become filmless, they are discovering that delivering images to certain areas is particularly difficult. Many departments have found that the operating room is one such area: space is constrained, and it is difficult for a sterile surgeon to manipulate the images. This report describes one method to overcome this obstacle. The authors' institution has been using a picture archiving and communication system (PACS) for approximately 3 years, and it has been a filmless department for 1 year. The PACS transfers images to a web server for distribution throughout the hospital. It is accessed through Internet Explorer without any additional software. The authors recently started a pilot program in which they installed dual-panel flat screen monitors in 6 operating rooms. The computers are connected to the hospital backbone by Ethernet. Graphics cards installed in the computers allow the use of dual monitors. Because the surgeons were experienced in viewing cases on the enterprise web system, they had little difficulty adapting to the operating room (OR) system. Initial reception of the system is positive. The surgeons found the web system superior to film because of its flexibility and image manipulation. Images can be magnified to facilitate viewing from across the room. The ultimate goal of electronic radiology is to replace hardcopy film in all aspects. One area in which PACS has had difficulty accomplishing this goal is the operating room; most institutions have continued to print film for the OR. The authors have initiated a project that may allow web viewing in the OR. Because of limited space in the OR, an additional computer was undesirable, so the CPU tower, keyboard, and mouse were mounted on a frame on the wall. The images are displayed on 2 flat screen monitors, which simulate the viewboxes traditionally used by the surgeons. Interviews with the surgeons have found both positive and negative aspects of the system. The overall impression is good, but the timeliness of the intraoperative films needs to be improved. The authors' pilot project of installing a web-based display system in the operating room is still being evaluated. Initial results have been positive, and if no major problems arise, the project will be expanded. These results show that it is possible to provide image delivery to the OR over the intranet that is acceptable to the surgeons.

  4. Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems

    NASA Technical Reports Server (NTRS)

    Ponyik, Joseph G.; York, David W.

    2002-01-01

    Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software, along with the interface to the embedded system, are typically unique to the system for which they are built, resulting in extra cost in terms of development time and maintenance effort. World Wide Web standards have been developed in the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms, but the World Wide Web standards allow them to interface without knowing the details of the system at the other end of the interface. Embedded Web Technology is the merging of Embedded Systems with the World Wide Web. It decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an Embedded System's internal network.

  5. Vcs.js - Visualization Control System for the Web

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Lipsa, D.; Doutriaux, C.; Beezley, J. D.; Williams, D. N.; Fries, S.; Harris, M. B.

    2016-12-01

    VCS is a general-purpose visualization library, optimized for climate data, which is part of the UV-CDAT system. It provides a Python API for drawing 2D plots such as line plots, scatter plots, Taylor diagrams, data colored by scalar values, vector glyphs, isocontours, and map projections. VCS is based on the VTK library. Vcs.js is the corresponding JavaScript API, designed to be as close as possible to the original VCS Python API and to provide similar functionality for the Web. Vcs.js includes additional functionality compared with VCS: this additional API is used to introspect data files available on the server and variables available in a data file. Vcs.js can display plots in the browser window. It always works with a server that reads a data file, extracts variables from the file, and subsets the data. From this point, two alternate paths are possible. First, the system can render the data on the server using VCS, producing an image which is sent to the browser to be displayed. This path works for all plot types and produces a reference image identical to the images produced by VCS; it uses the VTK-Web library. As an optimization, usable in certain conditions, a second path is possible: data is packed and sent to the browser, which uses a JavaScript plotting library, such as plotly, to display it. Plots that work well in the browser are line plots and scatter plots for any data, and many other plot types for small data and supported grid types. As web technology matures, more plots could be supported for rendering in the browser. Rendering can be done either on the client or on the server, and we expect that the best place to render will change depending on the available web technology, data transfer costs, server management costs, and value provided to users. We intend to provide a flexible solution that allows for both client- and server-side rendering and a meaningful way to choose between the two. We provide a web-based user interface called vCdat which uses Vcs.js as its visualization library. Our paper will discuss the principles guiding our design choices for Vcs.js, present our design in detail, and show a sample usage of the library.
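
    The client-versus-server rendering decision described above can be sketched as a small policy function. The thresholds and the set of browser-friendly plot types are assumptions for illustration; the abstract only states that line and scatter plots work in the browser for any data, and other plot types only for small data on supported grids.

```python
# Sketch of a render-location chooser, per the two paths in the abstract.
BROWSER_PLOTS = {"line", "scatter"}      # handled well by a JS plotting library

def render_location(plot_type, n_points, grid_supported=True,
                    max_browser_points=100_000):
    """Return 'client' when the browser path applies, else 'server' (VCS/VTK)."""
    if plot_type in BROWSER_PLOTS:
        return "client"                  # any data size, per the abstract
    if n_points <= max_browser_points and grid_supported:
        return "client"                  # small data on a supported grid
    return "server"                      # reference image rendered with VCS

print(render_location("line", 5_000_000))      # line plots stay in the browser
print(render_location("isofill", 5_000_000))   # large non-browser plot: server
```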

  6. A RESTful API for accessing microbial community data for MG-RAST.

    PubMed

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; Brettin, Tom; D'Souza, Mark; Gerlach, Wolfgang; Matthews, Hunter; Paczian, Tobias; Wilkening, Jared; Glass, Elizabeth M; Desai, Narayan; Meyer, Folker

    2015-01-01

    Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programming interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service.
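
    Consuming a RESTful JSON API of this kind usually amounts to building a resource URL and decoding the JSON response. The sketch below illustrates the pattern only: the base URL, resource path, parameters, and the canned response are placeholders, not documented MG-RAST endpoints.

```python
# Sketch of a minimal REST/JSON client. A real client would issue the HTTP
# GET over the network; here a canned JSON string stands in for the response.
import json
from urllib.parse import urlencode

def resource_url(base, resource, **params):
    """Compose a REST resource URL with an optional query string."""
    query = f"?{urlencode(params)}" if params else ""
    return f"{base.rstrip('/')}/{resource}{query}"

def parse_response(body):
    """Responses are JSON objects; decode into a dict."""
    return json.loads(body)

url = resource_url("https://api.example.org/1", "metagenome",
                   limit=10, verbosity="minimal")
print(url)
doc = parse_response('{"id": "mgm-example.3", "sequence_type": "WGS"}')
print(doc["id"])
```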

  7. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation, and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing, and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization, and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.

  8. GLobal Integrated Design Environment

    NASA Technical Reports Server (NTRS)

    Kunkel, Matthew; McGuire, Melissa; Smith, David A.; Gefert, Leon P.

    2011-01-01

    The GLobal Integrated Design Environment (GLIDE) is a collaborative engineering application built to resolve the design-session issue of passing data in real time between multiple discipline experts in a collaborative environment. Utilizing Web protocols and multiple programming languages, GLIDE allows engineers to use the applications to which they are accustomed (in this case, Excel) to send and receive datasets via the Internet to a database-driven Web server. Traditionally, a collaborative design session consists of one or more engineers representing each discipline meeting together in a single location. The discipline leads exchange parameters and iterate through their respective processes to converge on an acceptable dataset. In cases in which the engineers are unable to meet, their parameters are passed via e-mail, telephone, facsimile, or even postal mail. This slow process of data exchange could stretch a design session to weeks or even months. While the iterative process remains in place, software can now exchange parameters securely and efficiently, while at the same time making much more information about a design session available. GLIDE is written in a combination of several programming languages, including REALbasic, PHP, and Microsoft Visual Basic. GLIDE client installers are available to download for both Microsoft Windows and Macintosh systems. The GLIDE client software is compatible with Microsoft Excel 2000 or later on Windows systems, and with Microsoft Excel X or later on Macintosh systems. GLIDE follows the client-server paradigm, transferring encrypted and compressed data via standard Web protocols. Currently, the engineers use Excel as a front end to the GLIDE client, as many of their custom tools run in Excel.
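
    The compressed parameter exchange described above can be sketched as serialize-compress-encode on the client and the reverse on the server. GLIDE's actual wire format is not given here, so the field names and the JSON/zlib/base64 pipeline below are illustrative assumptions (and the sketch omits the encryption layer the abstract mentions).

```python
# Sketch: pack a dataset of design parameters for transfer over standard
# Web protocols, then unpack it on the receiving side.
import base64
import json
import zlib

def pack(params):
    """Serialize, compress, and encode a parameter dict for an HTTP POST body."""
    return base64.b64encode(zlib.compress(json.dumps(params).encode()))

def unpack(blob):
    """Reverse of pack(): decode, decompress, and deserialize."""
    return json.loads(zlib.decompress(base64.b64decode(blob)))

# Hypothetical design parameters exchanged between discipline leads.
dataset = {"thrust_N": 4400.0, "isp_s": 312.5, "stage": 2}
blob = pack(dataset)                 # what the client would send to the server
print(unpack(blob) == dataset)
```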

  9. The effects of climatic fluctuations and extreme events on running water ecosystems

    PubMed Central

    Woodward, Guy; Bonada, Núria; Brown, Lee E.; Death, Russell G.; Durance, Isabelle; Gray, Clare; Hladyz, Sally; Ledger, Mark E.; Milner, Alexander M.; Ormerod, Steve J.; Thompson, Ross M.

    2016-01-01

    Most research on the effects of environmental change in freshwaters has focused on incremental changes in average conditions, rather than fluctuations or extreme events such as heatwaves, cold snaps, droughts, floods or wildfires, which may have even more profound consequences. Such events are commonly predicted to increase in frequency, intensity and duration with global climate change, with many systems being exposed to conditions with no recent historical precedent. We propose a mechanistic framework for predicting potential impacts of environmental fluctuations on running-water ecosystems by scaling up effects of fluctuations from individuals to entire ecosystems. This framework requires integration of four key components: effects of the environment on individual metabolism, metabolic and biomechanical constraints on fluctuating species interactions, assembly dynamics of local food webs, and mapping the dynamics of the meta-community onto ecosystem function. We illustrate the framework by developing a mathematical model of environmental fluctuations on dynamically assembling food webs. We highlight (currently limited) empirical evidence for emerging insights and theoretical predictions. For example, widely supported predictions about the effects of environmental fluctuations are: high vulnerability of species with high per capita metabolic demands such as large-bodied ones at the top of food webs; simplification of food web network structure and impaired energetic transfer efficiency; and reduced resilience and top-down relative to bottom-up regulation of food web and ecosystem processes. We conclude by identifying key questions and challenges that need to be addressed to develop more accurate and predictive bio-assessments of the effects of fluctuations, and implications of fluctuations for management practices in an increasingly uncertain world. PMID:27114576

  10. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze, and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth, and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware, and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS).
    Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open-source data catalog, archive, file management, and data grid framework; OpenSSO, an open-source access management and federation platform; Solr, an open-source enterprise search platform; Redmine, an open-source project collaboration and management framework; GDAL, an open-source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
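
    An OGC WMS request of the kind the portal's services accept is just a parameterized URL. The sketch below composes a GetMap request; the endpoint and layer name are placeholders, while the parameter names follow the WMS 1.3.0 specification.

```python
# Sketch: compose a WMS 1.3.0 GetMap request URL for a (hypothetical) lunar
# basemap layer served from a placeholder endpoint.
from urllib.parse import urlencode

def getmap_url(endpoint, layer, bbox, width, height,
               crs="CRS:84", fmt="image/png"):
    """Build a GetMap URL; bbox is (minx, miny, maxx, maxy)."""
    params = {"SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
              "LAYERS": layer, "CRS": crs,
              "BBOX": ",".join(str(v) for v in bbox),
              "WIDTH": width, "HEIGHT": height, "FORMAT": fmt}
    return f"{endpoint}?{urlencode(params)}"

url = getmap_url("https://example.org/wms", "lunar_basemap",
                 (-180, -90, 180, 90), 1024, 512)
print(url)
```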

  11. Resource Management Scheme Based on Ubiquitous Data Analysis

    PubMed Central

    Lee, Heung Ki; Jung, Jaehee

    2014-01-01

    Resource management of the main memory and process handler is critical to enhancing the performance of a web server system. Owing to the transaction delay that affects incoming requests from web clients, web server systems pre-generate several web processes in anticipation of future requests. This decreases page generation time because enough processes are available to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. A proper mechanism for pre-generating processes is required to deal with clients' requests. Unfortunately, it is difficult to predict how many requests a web server system will receive: if it creates too many web processes, it wastes a considerable amount of memory, and performance suffers. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, so that the web process management scheme consumes the fewest possible web transaction resources. In experiments, real web trace data were used to demonstrate the improved performance of the proposed scheme. PMID:25197692
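
    The core idea above, predicting the next interval's request volume from recent log history and sizing the process pool accordingly, can be sketched as follows. The moving-average predictor, the per-process capacity, and the pool limits are assumptions for illustration, not the paper's actual model.

```python
# Sketch: size a pre-generated web process pool from recent request counts
# mined from the web log. All constants are illustrative.
def predict_requests(history, window=3):
    """Moving average of the last `window` per-interval request counts."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def pool_size(history, per_process=10, lo=2, hi=64):
    """Processes to keep warm: predicted load / per-process capacity, clamped."""
    need = int(predict_requests(history) / per_process) + 1
    return max(lo, min(hi, need))

log_counts = [120, 150, 180, 210, 240]   # requests per interval (made up)
print(pool_size(log_counts))             # processes to pre-generate
```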

  12. Drying of fiber webs

    DOEpatents

    Warren, David W.

    1997-01-01

    A process and an apparatus for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the use of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air to allow heat recovery at significantly higher temperature levels than attainable in evaporative dryers.

  13. Ocean acidification increases fatty acids levels of larval fish.

    PubMed

    Díaz-Gil, Carlos; Catalán, Ignacio A; Palmer, Miquel; Faulk, Cynthia K; Fuiman, Lee A

    2015-07-01

    Rising levels of anthropogenic carbon dioxide in the atmosphere are acidifying the oceans and producing diverse and important effects on marine ecosystems, including the production of fatty acids (FAs) by primary producers and their transfer through food webs. FAs, particularly essential FAs, are necessary for normal structure and function in animals and influence composition and trophic structure of marine food webs. To test the effect of ocean acidification (OA) on the FA composition of fish, we conducted a replicated experiment in which larvae of the marine fish red drum (Sciaenops ocellatus) were reared under a climate change scenario of elevated CO2 levels (2100 µatm) and under current control levels (400 µatm). We found significantly higher whole-body levels of FAs, including nine of the 11 essential FAs, and altered relative proportions of FAs in the larvae reared under higher levels of CO2. Consequences of this effect of OA could include alterations in performance and survival of fish larvae and transfer of FAs through food webs. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  14. A Proxy Design to Leverage the Interconnection of CoAP Wireless Sensor Networks with Web Applications

    PubMed Central

    Ludovici, Alessandro; Calveras, Anna

    2015-01-01

    In this paper, we present the design of a Constrained Application Protocol (CoAP) proxy able to interconnect Web applications based on Hypertext Transfer Protocol (HTTP) and WebSocket with CoAP-based Wireless Sensor Networks. Sensor networks are commonly used to monitor and control physical objects or environments. Smart Cities represent applications of such a nature. Wireless Sensor Networks gather data from their surroundings and send them to a remote application. This data flow may be short- or long-lived. The traditional HTTP long-polling used by Web applications may not be adequate for long-term communications. To overcome this problem, we include the WebSocket protocol in the design of the CoAP proxy. We evaluate the performance of the CoAP proxy in terms of latency and memory consumption. The tests consider long- and short-lived communications. In both cases, we evaluate the performance obtained by the CoAP proxy when using WebSocket and when using HTTP long-polling. PMID:25585107
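
    The proxy's role described above involves two decisions: which Web-side protocol to use for a given data flow, and how to map an incoming HTTP path onto a CoAP resource in the sensor network. The sketch below illustrates both; the duration threshold, host name, and path mapping are assumptions for illustration, not the paper's design.

```python
# Sketch: protocol selection and path mapping for an HTTP/WebSocket-to-CoAP
# proxy. All constants are hypothetical.
def upstream_protocol(expected_duration_s, long_lived_threshold_s=60):
    """Long-lived flows go over WebSocket; short exchanges use long-polling."""
    if expected_duration_s > long_lived_threshold_s:
        return "websocket"
    return "http-long-polling"

def to_coap_path(http_path):
    """Map an HTTP request path onto a CoAP resource (placeholder host)."""
    return "coap://sensors.local" + http_path

print(upstream_protocol(5))        # one-shot sensor reading
print(upstream_protocol(3600))     # hour-long monitoring stream
print(to_coap_path("/temperature/node1"))
```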

  15. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network.

    PubMed

    Frey, Lewis J; Sward, Katherine A; Newth, Christopher J L; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-11-01

    To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data transferred consistently using the data dictionary, while 1% needed human curation. Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Simple, Scalable, Script-based, Science Processor for Measurements - Data Mining Edition (S4PM-DME)

    NASA Astrophysics Data System (ADS)

    Pham, L. B.; Eng, E. K.; Lynnes, C. S.; Berrick, S. W.; Vollmer, B. E.

    2005-12-01

    The S4PM-DME is the Goddard Earth Sciences Distributed Active Archive Center's (GES DAAC) web-based data mining environment. The S4PM-DME replaces the Near-line Archive Data Mining (NADM) system with a better web environment and a richer set of production rules. S4PM-DME enables registered users to submit and execute custom data mining algorithms. The S4PM-DME system uses the GES DAAC-developed Simple Scalable Script-based Science Processor for Measurements (S4PM) to automate tasks and perform the actual data processing. A web interface allows the user to access the S4PM-DME system. The user first develops a personalized data mining algorithm on his/her home platform and then uploads it to the S4PM-DME system. Algorithms in the C and FORTRAN languages are currently supported. The user-developed algorithm is automatically audited for potential security problems before it is installed within the S4PM-DME system and made available to the user. Once the algorithm has been installed, the user can promote it to the "operational" environment. From here, the user can search and order the data available in the GES DAAC archive for his/her science algorithm. The user can also set up a processing subscription, which will automatically process new data as they become available in the GES DAAC archive. The generated mined data products are then made available for FTP pickup. The benefits of using S4PM-DME are 1) decreasing the time it typically takes a user to download GES DAAC data to his/her system, thus off-loading heavy network traffic; 2) freeing up load on the user's own system; and 3) making use of the rich and abundant ocean and atmosphere data from the MODIS and AIRS instruments available from the GES DAAC.

  17. Performance enhancement of a web-based picture archiving and communication system using commercial off-the-shelf server clusters.

    PubMed

    Liu, Yan-Lin; Shih, Cheng-Ting; Chang, Yuan-Jen; Chang, Shu-Jun; Wu, Jay

    2014-01-01

    The rapid development of picture archiving and communication systems (PACSs) has thoroughly changed the way medical information is communicated and managed. However, as the scale of a hospital's operations increases, the large volume of digital images transferred over the network inevitably decreases system efficiency. In this study, a server cluster consisting of two server nodes was constructed, with network load balancing (NLB), distributed file system (DFS), and structured query language (SQL) duplication services installed. A total of 1 to 16 workstations transferred computed radiography (CR), computed tomography (CT), and magnetic resonance (MR) images simultaneously to simulate the clinical situation. The average transmission rate (ATR) was compared between the cluster and noncluster servers. In the download scenario, the ATRs of CR, CT, and MR images increased by 44.3%, 56.6%, and 100.9%, respectively, when using the server cluster, whereas in the upload scenario the ATRs increased by 23.0%, 39.2%, and 24.9%. In the mixed scenario, transmission performance increased by 45.2% with eight workstations. The fault tolerance mechanisms of the server cluster maintained system availability and image integrity. The server cluster can improve transmission efficiency while maintaining high reliability and continuous availability in a healthcare environment.
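
    Network load balancing spreads incoming image transfers across the cluster nodes. Actual NLB operates as a network-layer service, but the distribution policy itself can be illustrated with a minimal round-robin sketch (the node names are invented):

```python
from itertools import cycle

def make_balancer(nodes):
    """Return a function that assigns each incoming transfer to the
    next server node in round-robin order, a simple stand-in for the
    distribution step of network load balancing (NLB)."""
    ring = cycle(nodes)
    return lambda: next(ring)

# Two nodes, as in the cluster described in the abstract.
assign = make_balancer(["node-1", "node-2"])
transfers = [assign() for _ in range(4)]  # alternates between nodes
```

    With the DFS and SQL duplication services keeping both nodes consistent, either node can serve any request, which is what makes this alternation safe.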

  18. Performance Enhancement of a Web-Based Picture Archiving and Communication System Using Commercial Off-the-Shelf Server Clusters

    PubMed Central

    Chang, Shu-Jun; Wu, Jay

    2014-01-01

    The rapid development of picture archiving and communication systems (PACSs) has thoroughly changed the way medical information is communicated and managed. However, as the scale of a hospital's operations increases, the large volume of digital images transferred over the network inevitably decreases system efficiency. In this study, a server cluster consisting of two server nodes was constructed, with network load balancing (NLB), distributed file system (DFS), and structured query language (SQL) duplication services installed. A total of 1 to 16 workstations transferred computed radiography (CR), computed tomography (CT), and magnetic resonance (MR) images simultaneously to simulate the clinical situation. The average transmission rate (ATR) was compared between the cluster and noncluster servers. In the download scenario, the ATRs of CR, CT, and MR images increased by 44.3%, 56.6%, and 100.9%, respectively, when using the server cluster, whereas in the upload scenario the ATRs increased by 23.0%, 39.2%, and 24.9%. In the mixed scenario, transmission performance increased by 45.2% with eight workstations. The fault tolerance mechanisms of the server cluster maintained system availability and image integrity. The server cluster can improve transmission efficiency while maintaining high reliability and continuous availability in a healthcare environment. PMID:24701580

  19. Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.

    2015-12-01

    Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to or the know-how to take advantage of advanced high performance computing (HPC) capability? Web based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and web based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models and facilitates the management of online data and execution of models on HPC systems. This presentation will illustrate the use of web based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions that are executed completely in the cloud, on HPC resources using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples for how this approach can be extended to other models to enhance the use of web and data services in the geosciences.
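
    The workflow described above, desktop scripts that invoke cloud-side analysis on data already stored in the cloud, amounts to building a service request that references a stored resource rather than shipping the raster itself. The endpoint, parameter names, and function below are hypothetical, not the actual HydroShare or TauDEM web service API:

```python
from urllib.parse import urlencode

# Hypothetical service endpoint, for illustration only.
BASE = "https://example.org/taudem"

def delineation_request(dem_resource_id, outlet_x, outlet_y):
    """Build the URL for a remote watershed-delineation call on a DEM
    already stored server-side, identified by resource id; only the
    small request travels over the network, never the large raster."""
    query = urlencode({
        "service": "delineate",
        "dem": dem_resource_id,   # cloud-stored dataset reference
        "x": outlet_x,            # outlet point coordinates
        "y": outlet_y,
    })
    return f"{BASE}?{query}"
```

    A desktop script would issue this request and fetch only the (small) delineated watershed result, which is the point of keeping both data and computation in the cloud.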

  20. An Integrated Service Platform for Remote Sensing Image 3D Interpretation and Draughting based on HTML5

    NASA Astrophysics Data System (ADS)

    LIU, Yiping; XU, Qing; ZHANG, Heng; LV, Liang; LU, Wanjie; WANG, Dandi

    2016-11-01

    The purpose of this paper is to solve the problems of traditional single-purpose interpretation and draughting systems, such as inconsistent standards, limited functionality, dependence on plug-ins, closed architecture, and a low level of integration. On the basis of a comprehensive analysis of target element composition, map representation, and the features of similar systems, a 3D interpretation and draughting integrated service platform for multi-source, multi-scale and multi-resolution geospatial objects is established based on HTML5 and WebGL. The platform not only integrates object recognition, access, retrieval, three-dimensional display and test evaluation, but also supports the collection, transfer, storage, refreshing and maintenance of geospatial object data, and shows clear prospects and potential for growth.

  1. Teleradiology mobile internet system with a new information security solution

    NASA Astrophysics Data System (ADS)

    Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kusumoto, Masahiko; Kaneko, Masahiro; Moriyama, Noriyuki

    2014-03-01

    We have developed an external storage system using a secret sharing scheme and tokenization for regional medical cooperation, PHR services and information preservation. As the use of mobile devices such as smartphones and tablets accelerates PHR services, confidential medical information is exposed to the risk of damage and interception. In this work we measured the transfer rates for sending and receiving data between PACS and the external storage system over the Internet. The external storage system consists of data centers in Okinawa, Osaka, Sapporo and Tokyo that hold shares produced by the secret sharing scheme. PACS continuously transmitted 382 CT images, about 200 MB in total, to the external data centers; the transfer took about 250 seconds. Because preservation relies on a secret sharing scheme, security is strong, but the transfer time is excessive. Therefore, in our method the DICOM data are anonymized by masking the header information. The anonymized DICOM data are stored in the hospital's database, while the header information containing personal data is divided into multiple shares by the secret sharing scheme and stored at two or more external data centers. The token linking the anonymized DICOM data to the externally stored header information is kept strictly in a token server. The header information containing the patient's personal data amounts to only about 2% of the entire DICOM data, and its transfer took about 5 seconds. Other common solutions for protecting computer communication networks from attacks are classified as cryptographic or authentication techniques. An individual-number IC card is connected to the electronic certification authority of the web medical image conference system, and is issued only to persons authorized to operate that system.
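
    The core idea of dispersing the sensitive header across several data centers can be illustrated with a minimal XOR-based n-of-n secret sharing sketch: no proper subset of shares reveals anything, and all shares together reconstruct the secret. The paper's scheme is a threshold variant (not shown here), and the header bytes below are invented stand-ins:

```python
import os

def split_secret(secret: bytes, n: int = 4) -> list:
    """Split `secret` into n shares; all n are needed to reconstruct.
    A minimal XOR-based n-of-n sketch for illustration only."""
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    last = bytearray(secret)
    for share in shares:            # fold each random share into the last
        for i, r in enumerate(share):
            last[i] ^= r
    return shares + [bytes(last)]

def combine_shares(shares) -> bytes:
    """XOR all shares together to recover the original secret."""
    out = bytearray(len(shares[0]))
    for share in shares:
        for i, r in enumerate(share):
            out[i] ^= r
    return bytes(out)

# Invented stand-in for DICOM header fields holding personal data.
header = b"PatientName^19700101^ID0001"
shares = split_secret(header, n=4)   # e.g. one share per data center
```

    Each share on its own is indistinguishable from random bytes, which is why a single compromised data center learns nothing about the patient.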

  2. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project addressing ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer highly specialized functionality for Earth Science applications, while Grid technology supports distributed and parallel processing. One challenge of the enviroGRIDS project is interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications across many fields but especially in Earth observation research. Given the huge volumes of data in the geospatial domain and the issues they introduce (data management, secure data transfer, data distribution and computation), an infrastructure capable of managing these problems becomes essential. The Grid facilitates secure interoperation of heterogeneous, distributed geospatial data, supports the creation and management of large distributed computational jobs, and provides certificate-based security for communication and message transfer. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment, and focuses on the description and implementation of the most promising one. 
In these use cases we pay special attention to the relations between the computational Grid and the OGC Web service protocols; the advantages offered by Grid technology, such as secure interoperability between distributed geospatial resources; and the issues introduced by integrating distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture that allows a flexible and scalable approach to integrating the geospatial domain, represented by OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is explored at the data, management and computation levels. The analysis covers OGC Web service interoperability in general, with specific details for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for the Web (CSW). Issues regarding the mapping and interoperability between OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments. The presentation highlights how the capabilities of the Grid environment and Grid applications can be extended and utilized for geospatial interoperability, which provides complex geospatial functionality combined with the high computational power and security of the Grid, high spatial model resolution and broad geographical coverage, and flexible combination and interoperability of geographical models. 
In accordance with Service Oriented Architecture concepts and the interoperability requirements between geospatial and Grid infrastructures, each main function is visible from the enviroGRIDS Portal and, consequently, from end-user applications such as Decision Maker/Citizen oriented Applications. The enviroGRIDS Portal is the user's single point of entry into the system and presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
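
    Of the OGC interfaces named above, WMS is the simplest to illustrate: a GetMap request is just an HTTP query whose parameter names are fixed by the OGC WMS 1.3.0 specification. The sketch below composes such a request; the base URL and layer name are invented:

```python
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512):
    """Compose a WMS 1.3.0 GetMap request URL; the parameter names
    (SERVICE, VERSION, REQUEST, LAYERS, CRS, BBOX, ...) are those
    defined by the OGC WMS specification."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Invented server and layer, roughly covering the Black Sea catchment.
url = getmap_url("https://example.org/wms", "catchment", (40.0, 27.0, 48.0, 42.0))
```

    In an architecture like the one described, such a request could be issued from a Grid job as readily as from a browser, which is what makes the OGC/Grid mapping tractable.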

  3. Utilization of an Educational Web-Based Mobile App for Acquisition and Transfer of Critical Anatomical Knowledge, Thereby Increasing Classroom and Laboratory Preparedness in Veterinary Students

    ERIC Educational Resources Information Center

    Hannon, Kevin

    2017-01-01

    Contact time with students is becoming more valuable and must be utilized efficiently. Unfortunately, many students attend anatomy lectures and labs ill-prepared, and this limits efficiency. To address this issue we have created an interactive mobile app designed to facilitate the acquisition and transfer of critical anatomical knowledge in…

  4. A Complex Web of Education Policy Borrowing and Transfer: Education for All and the Plan for the Development of Education in Brazil

    ERIC Educational Resources Information Center

    Rambla, Xavier

    2014-01-01

    This article analyses how Education for All policies were transferred to Brazil and Latin America by means of ambitious educational strategic plans such as the Plan for the Development of Education and the National Education Plans -- promoted by the Federal Government of Brazil, and the Latin American Educational Goals -- promoted by the…

  5. Transferring Young People with Profound Intellectual and Multiple Disabilities from Pediatric to Adult Medical Care: Parents' Experiences and Recommendations

    ERIC Educational Resources Information Center

    Bindels-de Heus, Karen G. C. B.; van Staa, AnneLoes; van Vliet, Ingeborg; Ewals, Frans V. P. M.; Hilberink, Sander R.

    2013-01-01

    Many children with profound intellectual and multiple disabilities (PIMD) now reach adulthood. The aim of this study was to elicit parents' experiences with the transfer from pediatric to adult medical care. A convenience sample of 131 Dutch parents of young people with PIMD (16--26 years) completed a web-based questionnaire. Twenty-two percent of…

  6. Hematology Glossary

    MedlinePlus

    allogeneic: refers to blood, stem cells, bone marrow, or other tissue that is transferred from one person to another

  7. Entanglement of spin waves among four quantum memories.

    PubMed

    Choi, K S; Goban, A; Papp, S B; van Enk, S J; Kimble, H J

    2010-11-18

    Quantum networks are composed of quantum nodes that interact coherently through quantum channels, and open a broad frontier of scientific opportunities. For example, a quantum network can serve as a 'web' for connecting quantum processors for computation and communication, or as a 'simulator' allowing investigations of quantum critical phenomena arising from interactions among the nodes mediated by the channels. The physical realization of quantum networks generically requires dynamical systems capable of generating and storing entangled states among multiple quantum memories, and efficiently transferring stored entanglement into quantum channels for distribution across the network. Although such capabilities have been demonstrated for diverse bipartite systems, entangled states have not been achieved for interconnects capable of 'mapping' multipartite entanglement stored in quantum memories to quantum channels. Here we demonstrate measurement-induced entanglement stored in four atomic memories; user-controlled, coherent transfer of the atomic entanglement to four photonic channels; and characterization of the full quadripartite entanglement using quantum uncertainty relations. Our work therefore constitutes an advance in the distribution of multipartite entanglement across quantum networks. We also show that our entanglement verification method is suitable for studying the entanglement order of condensed-matter systems in thermal equilibrium.

  8. Development and Evaluation of an Interactive WebQuest Environment: "Web Macerasi"

    ERIC Educational Resources Information Center

    Gulbahar, Yasemin; Madran, R. Orcun; Kalelioglu, Filiz

    2010-01-01

    This study was conducted to develop a web-based interactive system, Web Macerasi, for teaching-learning and evaluation purposes, and to find out the possible effects of this system. The study has two stages. In the first stage, a WebQuest site was designed as an interactive system in which various Internet and web technologies were used for…

  9. Stakeholders' expectations on connectivity research for water and land management addressed by a survey in the collaborative EU-COST Connecteur network

    NASA Astrophysics Data System (ADS)

    Smetanova, Anna; Paton, Eva N.; Keesstra, Saskia

    2016-04-01

    Transfer of knowledge across the science-society interface is essential for both ethical and economic reasons, and indispensable for successful climate change adaptation and integrated management of sustainable, resilient landscapes. Transdisciplinary research on connectivity (the degree to which a system facilitates the movement of matter and energy through itself; an emergent property of the system state; Connecteur web resources, 2015) has the potential to supply monitoring, modelling and management tools that help land and water managers reach these goals. Research on water and sediment connectivity has received significant and increasing scientific attention across the environmental disciplines, and the COST Action ES 1306 Connecteur facilitates multi-sectorial collaboration in connectivity research at the EU level. To appropriately address the transfer of cutting-edge research developments from the Connecteur network, a collaborative research project on stakeholders' perception of connectivity was conducted by Working Group 5, "Transition of connectivity research towards sustainable land and water management". The questionnaire survey was carried out by volunteering scientists involved in the Connecteur network from 19 European countries. In total, 84 stakeholders from all major sectors of water and land management were asked about the main challenges of their work, their understanding of connectivity, the desired areas of cooperation with connectivity science, and the best tools for transferring knowledge. The results showed differences between stakeholder groups in how they perceive and work with connectivity, as well as in their requirements for knowledge transfer. 
While farmers and, to a lesser extent, agricultural administration officers articulated little or no need for connectivity management, the majority of stakeholders involved in land and water management found it important. The most frequently expressed communication needs were the involvement of scientists in educational activities targeting farmers, the provision of and training in easily usable newly developed tools (models or maps) based on existing data for land and water management, and freely available data. The results of the study help to improve the research pathways of all working groups of COST Action ES 1306 Connecteur and to identify the most important ways of transferring connectivity science to all relevant stakeholders. The project was supported by COST-STSM-ES1306-011215-063624. Connecteur web resources (2015) http://connecteur.info/wiki/connectivity-wiki/, 07.01.2016

  10. Experiences Building Globus Genomics: A Next-Generation Sequencing Analysis Service using Galaxy, Globus, and Amazon Web Services

    PubMed Central

    Madduri, Ravi K.; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J.; Foster, Ian T.

    2014-01-01

    We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads. PMID:25342933
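
    The multi-step pipelines described above, where each stage's output feeds the next as in a Galaxy workflow, can be illustrated with a minimal function-composition sketch. This is not the Galaxy or Globus Genomics API; the stage names are toy stand-ins:

```python
from functools import reduce

def make_pipeline(*stages):
    """Chain processing stages so each stage's output becomes the
    next stage's input, loosely mimicking a multi-step NGS workflow."""
    def run(data):
        return reduce(lambda acc, stage: stage(acc), stages, data)
    return run

# Toy stages standing in for retrieval, alignment and summarization.
fetch   = lambda ids: [f"read-{i}" for i in ids]
align   = lambda reads: [r.upper() for r in reads]
summary = lambda reads: len(reads)

workflow = make_pipeline(fetch, align, summary)
```

    Defining the pipeline once and reusing it over many inputs is what enables the scheduling layer (HTCondor, in the system described) to fan the same workflow out across many processors.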

  11. Experiences Building Globus Genomics: A Next-Generation Sequencing Analysis Service using Galaxy, Globus, and Amazon Web Services.

    PubMed

    Madduri, Ravi K; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J; Foster, Ian T

    2014-09-10

    We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads.

  12. The Effects of Real-Time Interactive Multimedia Teleradiology System

    PubMed Central

    Al-Safadi, Lilac

    2016-01-01

    This study describes the design of a real-time interactive multimedia teleradiology system and assesses how the system is used by referring physicians in point-of-care situations and supports or hinders aspects of physician-radiologist interaction. We developed a real-time multimedia teleradiology management system that automates the transfer of images and radiologists' reports and surveyed physicians to triangulate the findings and to verify the realism and results of the experiment. The web-based survey was delivered to 150 physicians from a range of specialties. The survey was completed by 72% of physicians. Data showed a correlation between rich interactivity, satisfaction, and effectiveness. The results of our experiments suggest that real-time multimedia teleradiology systems are valued by referring physicians and may have the potential for enhancing their practice and improving patient care and highlight the critical role of multimedia technologies to provide real-time multimode interactivity in current medical care. PMID:27294118

  13. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    Backdoors and information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, enhancing the security of Web servers and avoiding the damage of illegal access. First, a system was proposed for discovering patterns of information leakage in CGI scripts from Web log data. Second, these patterns were provided to system administrators so they could modify their code and enhance their Web site's security. The following aspects are described: one is to combine the Web application log with the Web log to extract more information, so that Web data mining can discover information that firewalls and intrusion detection systems cannot find; another is a proposed operational module that enhances Web site security. For cluster server sessions, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
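
    Combining server access logs with application logs starts with parsing raw log lines into fields a mining step can use. A minimal sketch for the Common Log Format, whose field layout is standardized, is shown below; the sample line is invented:

```python
import re

# Common Log Format: host, identity, user, [timestamp], "request",
# status, bytes. The named groups extract the fields a mining step
# over CGI requests would typically use.
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def parse_line(line):
    """Return the parsed fields as a dict, or None for malformed lines."""
    m = CLF.match(line)
    return m.groupdict() if m else None

line = '10.0.0.5 - - [06/Jun/2003:10:00:00 +0000] "GET /cgi-bin/search?q=x HTTP/1.0" 200 512'
record = parse_line(line)
```

    Once both log sources are reduced to records like this, they can be joined (e.g. on host and timestamp) and fed to the clustering step.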

  14. Minimal incorporation of Deepwater Horizon oil by estuarine filter feeders.

    PubMed

    Fry, Brian; Anderson, Laurie C

    2014-03-15

    Natural abundance carbon isotope analyses are sensitive tracers for fates and use of oil in aquatic environments. Use of oil carbon in estuarine food webs should lead to isotope values approaching those of oil itself, -27‰ for stable carbon isotopes reflecting oil origins and -1000‰ for carbon-14 reflecting oil age. To test for transfer of oil from the 2010 Deepwater Horizon spill into estuarine food webs, filter-feeding barnacles (Balanus sp.) and marsh mussels (Geukensia demissa) were collected from Louisiana estuaries near the site of the oil spill. Carbon-14 analyses of these animals from open waters and oiled marshes showed that oil use was <1% and near detection limits estimated at 0.3% oil incorporation. Respiration studies showed no evidence for enhanced microbial activity in bay waters. Results are consistent with low dietary impacts of oil for filter feeders and little overall impact on respiration in the productive Louisiana estuarine systems. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Web-based radiology: a future to be created.

    PubMed

    Canadè, Adolfo; Palladino, Francesco; Pitzalis, Gianluca; Campioni, Paolo; Marano, Pasquale

    2003-01-01

    The impact of the Internet on medicine and surgery is certainly remarkable, but its influence on diagnostic imaging has been even stronger. The standardization of digital images acquired by different medical imaging equipment has further facilitated diffusion, transmission and communication in radiology within hospitals as well as on the Web. Radiology departments are bound to become "filmless", and with today's tablet PCs radiological images can be delivered directly to the patient's bedside within the corresponding electronic patient record. For radiology, interactive education can be envisaged with a tutor who guides students through the network. The Internet is an inexhaustible source of radiologic educational and informational material, with numerous sites offering clinical cases, tutorials and teaching files, journals and magisterial lectures online. In the near future, the Internet could be applied to the simulation of clinico-radiologic cases or in artificial intelligence applications with expert systems to support the solution of the most complex cases.

  16. Programmatic access to logical models in the Cell Collective modeling environment via a REST API.

    PubMed

    Kowal, Bryan M; Schreier, Travis R; Dauer, Joseph T; Helikar, Tomáš

    2016-01-01

    Cell Collective (www.cellcollective.org) is a web-based interactive environment for constructing, simulating and analyzing logical models of biological systems. Herein, we present a Web service to access models, annotations, and simulation data in the Cell Collective platform through a Representational State Transfer (REST) Application Programming Interface (API). The REST API provides a convenient method for obtaining Cell Collective data from almost any programming language. To ensure easy processing of the retrieved data, the request output from the API is available in a standard JSON format. The Cell Collective REST API is freely available at http://thecellcollective.org/tccapi. All public models in Cell Collective are available through the REST API. Users interested in creating and accessing their own models through the REST API first need to create an account in Cell Collective (http://thecellcollective.org). thelikar2@unl.edu. Technical user documentation: https://goo.gl/U52GWo. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
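
    Because the API output is standard JSON, retrieval reduces to an HTTP GET plus a JSON decode in any language. The sketch below builds a request URL against the base given in the abstract; the `model/<id>` path and the fields in the sample payload are hypothetical, for illustration only:

```python
import json
from urllib.parse import urljoin

# Base URL from the abstract; the endpoint layout below is invented.
API_BASE = "http://thecellcollective.org/tccapi/"

def model_url(model_id: int) -> str:
    """Build a (hypothetical) per-model request URL."""
    return urljoin(API_BASE, f"model/{model_id}")

def decode_model(payload: str) -> dict:
    """JSON output means any language's standard decoder suffices."""
    return json.loads(payload)

# An invented sample response for the decoding step.
sample = '{"id": 42, "name": "T-cell signaling", "components": 50}'
model = decode_model(sample)
```

    In practice the payload would come from an HTTP GET against `model_url(...)` (with authentication for private models); the decode step is the same.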

  17. A teledentistry system for the second opinion.

    PubMed

    Gambino, Orazio; Lima, Fausto; Pirrone, Roberto; Ardizzone, Edoardo; Campisi, Giuseppina; di Fede, Olga

    2014-01-01

    In this paper we present a teledentistry system aimed at the second opinion task. It makes use of a particular camera, called an intra-oral or dental camera, to capture photographs and real-time video of the inside of the mouth. The pictures acquired by the operator with this device are sent to the Oral Medicine Expert (OME) by means of a standard File Transfer Protocol (FTP) service, and the real-time video is channeled into a video stream using the VideoLAN client/server (VLC) application. The system is composed of HTML5 web pages generated by PHP and supports the second opinion both when the operator and OME are logged in simultaneously and when one of them is offline.

  18. Compliant threads maximize spider silk connection strength and toughness

    PubMed Central

    Meyer, Avery; Pugno, Nicola M.; Cranford, Steven W.

    2014-01-01

    Millions of years of evolution have adapted spider webs to achieve a range of functionalities, including the well-known capture of prey, with efficient use of material. One feature that has escaped extensive investigation is the silk-on-silk connection joints within spider webs, particularly from a structural mechanics perspective. We report a joint theoretical and computational analysis of an idealized silk-on-silk fibre junction. By modifying the theory of multiple peeling, we quantitatively compare the performance of the system while systematically increasing the rigidity of the anchor thread, by both scaling the stress–strain response and the introduction of an applied pre-strain. The results of our study indicate that compliance is a virtue—the more extensible the anchorage, the tougher and stronger the connection becomes. In consideration of the theoretical model, in comparison with rigid substrates, a compliant anchorage enormously increases the effective adhesion strength (work required to detach), independent of the adhered thread itself, attributed to a nonlinear alignment between thread and anchor (contact peeling angle). The results can direct novel engineering design principles to achieve possible load transfer from compliant fibre-to-fibre anchorages, be they silk-on-silk or another, as-yet undeveloped, system. PMID:25008083

  19. Drying of fiber webs

    DOEpatents

    Warren, D.W.

    1997-04-15

    A process and an apparatus are disclosed for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface without dilution by ambient air; it is thus differentiated from the evaporative drying techniques of the prior industrial art, which depend on steam-heated cylinders to supply heat to the paper web surface and on ambient air to carry away moisture evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air, allowing heat recovery at significantly higher temperature levels than attainable in evaporative dryers. 6 figs.

  20. Towards Mechanistic Understanding of Mercury Availability and Toxicity to Aquatic Primary Producers.

    PubMed

    Dranguet, Perrine; Flück, Rebecca; Regier, Nicole; Cosio, Claudia; Le Faucheur, Séverine; Slaveykova, Vera I

    2014-11-01

    The present article reviews current knowledge and recent progress on the bioavailability and toxicity of mercury to aquatic primary producers. Mercury is a ubiquitous toxic trace element of global concern, and primary producers, at the base of the food web, are central to its incorporation into the food web. Here, the emphasis is on key, but still poorly understood, processes governing the interactions between mercury species and two representative groups of primary producers: phytoplankton and macrophytes. Mass transfer to the biota surface, adsorption to the cell wall, internalization and release from cells, as well as the underlying toxicity mechanisms of both inorganic mercury and methylmercury, are discussed critically. In addition, intracellular distribution and transformation processes, their importance for mercury toxicity, species-sensitivity differences, and trophic transfer are presented. The mini-review is illustrated with examples of our own research.

  1. Species- and habitat-specific bioaccumulation of total mercury and methylmercury in the food web of a deep oligotrophic lake.

    PubMed

    Arcagni, Marina; Juncos, Romina; Rizzo, Andrea; Pavlin, Majda; Fajon, Vesna; Arribére, María A; Horvat, Milena; Ribeiro Guevara, Sergio

    2018-01-15

    Niche segregation between introduced and native fish in Lake Nahuel Huapi, a deep oligotrophic lake in Northwest Patagonia (Argentina), occurs through the consumption of different prey. Therefore, in this work we analyzed total mercury [THg] and methylmercury [MeHg] concentrations in top predator fish and in their main prey to test whether their feeding habits influence [Hg]. Results indicate that [THg] and [MeHg] varied by foraging habitat, increasing with a greater proportion of benthic diet and decreasing with a more pelagic diet in Lake Nahuel Huapi. This is consistent with the fact that the native creole perch, a mostly benthivorous feeder that shares the highest trophic level of the food web with introduced salmonids, had higher [THg] and [MeHg] than the more pelagic-feeding rainbow trout and the bentho-pelagic-feeding brown trout. This differential THg and MeHg bioaccumulation in native and introduced fish provides evidence for the hypothesis that there are two main Hg transfer pathways from the base of the food web to top predators: a pelagic pathway, where Hg (mostly as inorganic species) is transferred from water through plankton and forage fish to salmonids, and a benthic pathway, where Hg is transferred from the sediments (where most Hg methylation occurs) through crayfish (with higher [MeHg] than plankton) to native fish, leading to one-fold higher [Hg]. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Mercury bioaccumulation in bats reflects dietary connectivity to aquatic food webs.

    PubMed

    Becker, Daniel J; Chumchal, Matthew M; Broders, Hugh G; Korstian, Jennifer M; Clare, Elizabeth L; Rainwater, Thomas R; Platt, Steven G; Simmons, Nancy B; Fenton, M Brock

    2018-02-01

    Mercury (Hg) is a persistent and widespread heavy metal with neurotoxic effects in wildlife. While bioaccumulation of Hg has historically been studied in aquatic food webs, terrestrial consumers can become contaminated with Hg when they feed on aquatic organisms (e.g., emergent aquatic insects, fish, and amphibians). However, the extent to which dietary connectivity to aquatic ecosystems can explain patterns of Hg bioaccumulation in terrestrial consumers has not been well studied. Bats (Order: Chiroptera) can serve as a model system for illuminating the trophic transfer of Hg given their high dietary diversity and foraging links to both aquatic and terrestrial food webs. Here we quantitatively characterize the dietary correlates of long-term exposure to Hg across a diverse local assemblage of bats in Belize and more globally across bat species from around the world with a comparative analysis of hair samples. Our data demonstrate considerable interspecific variation in hair total Hg concentrations in bats that span three orders of magnitude across species, ranging from 0.04 mg/kg in frugivorous bats (Artibeus spp.) to 145.27 mg/kg in the piscivorous Noctilio leporinus. Hg concentrations showed strong phylogenetic signal and were best explained by dietary connectivity of bat species to aquatic food webs. Our results highlight that phylogeny can be predictive of Hg concentrations through similarity in diet and how interspecific variation in feeding strategies influences chronic exposure to Hg and enables movement of contaminants from aquatic to terrestrial ecosystems. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Personalization of Rule-based Web Services.

    PubMed

    Choi, Okkyung; Han, Sang Yong

    2008-04-04

    Nowadays Web users have clearly expressed their wish to receive personalized services directly. Personalization is the tailoring of services to the immediate requirements of the user. However, the current Web Services System does not provide features that support this, such as personalization of services and intelligent matchmaking. This research proposes a flexible, personalized Rule-based Web Services System that addresses these problems and enables efficient search, discovery and construction across general Web documents and Semantic Web documents in a Web Services System. The system performs matchmaking among service requesters', service providers' and users' preferences using a Rule-based Search Method, and subsequently ranks the search results. A prototype of efficient Web Services search and construction for the suggested system has been developed based on the current work.

  4. Mercury biomagnification and the trophic structure of the ichthyofauna from a remote lake in the Brazilian Amazon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azevedo-Silva, Claudio Eduardo, E-mail: ceass@biof

    The present study assesses mercury biomagnification and the trophic structure of the ichthyofauna from the Puruzinho Lake, Brazilian Amazon. In addition to mercury determination, the investigation comprised the calculation of the Trophic Magnification Factor (TMF) and Trophic Magnification Slope (TMS), through measurements of stable isotopes of carbon (δ{sup 13}C) and nitrogen (δ{sup 15}N) in fish samples. These assessments were executed in two different scenarios, i.e., considering (1) all fish species or (2) only the resident fish (excluding the migratory species). Bottom litter, superficial sediment and seston were the sources used for generating the trophic position (TP) data used in the calculation of the TMF. Samples from 84 fish were analysed, comprising 13 species, which were categorized into four trophic guilds: iliophagous, planktivorous, omnivorous and piscivorous fish. The δ{sup 13}C values pointed to the separation of the ichthyofauna into two groups. One group comprised iliophagous and planktivorous species, which are linked to the food chains of phytoplankton and detritus. The other group was composed of omnivorous and piscivorous fish, which are associated with the trophic webs of phytoplankton, bottom litter, detritus and periphyton, as well as with the food chains of igapó (blackwater-flooded Amazonian forests). The TP values suggest that the ichthyofauna from the Puruzinho Lake is part of a short food web, with three well-characterized trophic levels. Mercury concentrations and δ{sup 13}C values point to multiple sources for Hg input and transfer. The similarity in Hg levels and TP values between piscivorous and planktivorous fish suggests a comparable efficiency for the transfer of this metal through pelagic and littoral food chains. 
Regarding the two abovementioned scenarios, i.e., considering (1) the entire ichthyofauna and (2) only the resident species, the TMF values were 5.25 and 4.49, and the TMS values were 0.21 and 0.19, respectively. These findings confirm that Hg biomagnifies through the food web of the Puruzinho Lake ichthyofauna. The migratory species did not significantly change the mercury biomagnification rate in Puruzinho Lake; however, they may play a relevant role in Hg transport. The biomagnification rate (TMS value) in Puruzinho Lake was higher than the average values for its latitude, being comparable to TMS values of temperate and polar systems (marine and freshwater environments). - Highlights: • Mercury biomagnified in the food web of a remote Amazonian lake. • δ{sup 13}C and δ{sup 15}N values suggest multiple Hg sources in a food web with 3 trophic levels. • We found similar Hg transfer efficiencies in pelagic and littoral food chains. • Migration may influence the trophic structure assessment performed with δ{sup 15}N. • The migrating species did not significantly alter the biomagnification calculation.
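    The TMS reported above is conventionally estimated as the least-squares slope of log10[Hg] regressed on trophic position. A minimal sketch of that calculation follows; the fish samples below are made-up illustrative numbers, not the Puruzinho Lake data.

    ```python
    # Illustrative computation of a Trophic Magnification Slope (TMS):
    # the least-squares slope of log10(Hg concentration) against trophic
    # position (TP). All sample values are hypothetical.
    import math

    def trophic_magnification_slope(tp, hg):
        """Least-squares slope of log10(hg) regressed on trophic position."""
        logs = [math.log10(c) for c in hg]
        n = len(tp)
        mean_tp = sum(tp) / n
        mean_log = sum(logs) / n
        cov = sum((t - mean_tp) * (l - mean_log) for t, l in zip(tp, logs))
        var = sum((t - mean_tp) ** 2 for t in tp)
        return cov / var

    # Hypothetical fish samples: trophic position and Hg (mg/kg).
    tp = [2.0, 2.5, 3.0, 3.5, 4.0]
    hg = [0.05, 0.063, 0.079, 0.100, 0.126]
    tms = trophic_magnification_slope(tp, hg)
    print(round(tms, 3))
    ```

    A positive slope indicates biomagnification: Hg concentrations rise roughly log-linearly with each step up the food web.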

  5. Radiology on handheld devices: image display, manipulation, and PACS integration issues.

    PubMed

    Raman, Bhargav; Raman, Raghav; Raman, Lalithakala; Beaulieu, Christopher F

    2004-01-01

    Handheld personal digital assistants (PDAs) have undergone continuous and substantial improvements in hardware and graphics capabilities, making them a compelling platform for novel developments in teleradiology. The latest PDAs have processor speeds of up to 400 MHz and storage capacities of up to 80 Gbytes with memory expansion methods. A Digital Imaging and Communications in Medicine (DICOM)-compliant, vendor-independent handheld image access system was developed in which a PDA server acts as the gateway between a picture archiving and communication system (PACS) and PDAs. The system is compatible with most currently available PDA models. It is capable of both wired and wireless transfer of images and includes custom PDA software and World Wide Web interfaces that implement a variety of basic image manipulation functions. Implementation of this system, which is currently undergoing debugging and beta testing, required optimization of the user interface to efficiently display images on smaller PDA screens. The PDA server manages user work lists and implements compression and security features to accelerate transfer speeds, protect patient information, and regulate access. Although some limitations remain, PDA-based teleradiology has the potential to increase the efficiency of the radiologic work flow, increasing productivity and improving communication with referring physicians and patients. Copyright RSNA, 2004
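    The abstract notes that the PDA server applies compression to accelerate image transfers to bandwidth-limited handhelds. A minimal sketch of such a compress-on-server / decompress-on-client step, using zlib purely as a stand-in codec (the actual compression scheme used by the system is not specified in the abstract):

    ```python
    # Minimal sketch of a lossless compression step a gateway server might
    # apply before sending image data to a handheld client. zlib is a
    # stand-in; the PDA server's real codec is unknown.
    import zlib

    def compress_for_transfer(pixel_bytes: bytes, level: int = 6) -> bytes:
        """Deflate-compress an image buffer before transmission."""
        return zlib.compress(pixel_bytes, level)

    def decompress_on_client(payload: bytes) -> bytes:
        """Restore the original bytes on the handheld side."""
        return zlib.decompress(payload)

    # A flat synthetic "image": highly compressible, like many uniform
    # background regions in medical images.
    image = bytes(1024) + b"\x80" * 1024
    wire = compress_for_transfer(image)
    print(len(image), len(wire))  # the compressed payload is far smaller
    ```

    Lossless round-tripping matters here: diagnostic images must be reconstructed bit-for-bit on the client.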

  6. When parasites become prey: ecological and epidemiological significance of eating parasites

    USGS Publications Warehouse

    Johnson, Pieter T.J.; Dobson, Andrew P.; Lafferty, Kevin D.; Marcogliese, David J.; Memmott, Jane; Orlofske, Sarah A.; Poulin, Robert; Thieltges, David W.

    2010-01-01

    Recent efforts to include parasites in food webs have drawn attention to a previously ignored facet of foraging ecology: parasites commonly function as prey within ecosystems. Because of the high productivity of parasites, their unique nutritional composition and their pathogenicity in hosts, their consumption affects both food-web topology and disease risk in humans and wildlife. Here, we evaluate the ecological, evolutionary and epidemiological significance of feeding on parasites, including concomitant predation, grooming, predation on free-living stages and intraguild predation. Combining empirical data and theoretical models, we show that consumption of parasites is neither rare nor accidental, and that it can sharply affect parasite transmission and food web properties. Broader consideration of predation on parasites will enhance our understanding of disease control, food web structure and energy transfer, and the evolution of complex life cycles.

  7. Implementing WebGL and HTML5 in Macromolecular Visualization and Modern Computer-Aided Drug Design.

    PubMed

    Yuan, Shuguang; Chan, H C Stephen; Hu, Zhenquan

    2017-06-01

    Web browsers have long been recognized as potential platforms for remote macromolecule visualization. However, the difficulty of transferring large-scale data to clients and the lack of native support for hardware-accelerated applications in the browser undermined the feasibility of such utilities. With the introduction of WebGL and HTML5 technologies in recent years, it is now possible to exploit the power of a graphics processing unit (GPU) from a browser without any third-party plugin, and many new tools have been developed for biological molecule visualization and modern drug discovery. In contrast to traditional offline tools, WebGL- and HTML5-based tools feature real-time computing, interactive data analysis, and cross-platform operation, facilitating biological research in a more efficient and user-friendly way. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Introductory Tools for Radiative Transfer Models

    NASA Astrophysics Data System (ADS)

    Feldman, D.; Kuai, L.; Natraj, V.; Yung, Y.

    2006-12-01

    Satellite data are currently so voluminous that, despite their unprecedented quality and potential for scientific application, only a small fraction is analyzed, due to two factors: researchers' computational constraints and the relatively small number of researchers actively utilizing the data. Ultimately it is hoped that the terabytes of unanalyzed archived data can receive scientific scrutiny, but this will require a popularization of the associated analysis methods. Since much of the complexity lies in the proper implementation of the radiative transfer model, it is reasonable and appropriate to make the model as accessible as possible to general audiences. Unfortunately, the algorithmic and conceptual details that are necessary for state-of-the-art analysis also tend to frustrate accessibility for those new to remote sensing. Several efforts have been made to offer web-based radiative transfer calculations, and these are useful for limited calculations, but analysis of more than a few spectra requires home- or server-based computing resources. We present a system designed to allow easier access to radiative transfer models, implemented on a home computing platform, in the hope that it can be utilized and expanded upon in advanced high school and introductory college settings. This learning-by-doing process is aided by several powerful tools. The first is a wikipedia-style introduction to the salient features of radiative transfer that references the seminal works in the field and refers to more complicated calculations and algorithms sparingly. The second is a technical forum, commonly referred to as a tiki-wiki, that addresses technical and conceptual questions through public postings, private messages, and a ranked searching routine. Together, these tools may facilitate greater interest in the field of remote sensing.

  9. Transfer of benzo[a]pyrene from microplastics to Artemia nauplii and further to zebrafish via a trophic food web experiment: CYP1A induction and visual tracking of persistent organic pollutants.

    PubMed

    Batel, Annika; Linti, Frederic; Scherer, Martina; Erdinger, Lothar; Braunbeck, Thomas

    2016-07-01

    The uptake of microplastic particles and the transfer of potential harmful substances along with microplastics has been studied in a variety of organisms, especially invertebrates. However, the potential accumulation of very small microplastic particles along food webs ending with vertebrate models has not been investigated so far. Therefore, a simple artificial food chain with Artemia sp. nauplii and zebrafish (Danio rerio) was established to analyze the transfer of microplastic particles and associated persistent organic pollutants (POPs) between different trophic levels. Very small (1-20 μm) microplastic particles accumulated in Artemia nauplii and were subsequently transferred to fish. Virgin particles not loaded with POPs did not cause any observable physical harm in the intestinal tracts of zebrafish, although parts of the particles were retained within the mucus of intestinal villi and might even have been taken up by epithelial cells. The transfer of associated POPs was tested with the polycyclic aromatic hydrocarbon benzo[a]pyrene and an ethoxyresorufin-O-deethylase (EROD) assay for CYP1A induction in zebrafish liver as well as via fluorescence analyses. Whereas a significant induction in the EROD assay could not be shown, because of high individual variation and low sensitivity regarding substance concentration, the fluorescence tracking of benzo[a]pyrene indicates that food-borne microplastic-associated POPs may actually desorb in the intestine of fish and are thus transferred to the intestinal epithelium and liver. Environ Toxicol Chem 2016;35:1656-1666. © 2016 SETAC.

  10. Wireless, Web-Based Interactive Control of Optical Coherence Tomography with Mobile Devices.

    PubMed

    Mehta, Rajvi; Nankivil, Derek; Zielinski, David J; Waterman, Gar; Keller, Brenton; Limkakeng, Alexander T; Kopper, Regis; Izatt, Joseph A; Kuo, Anthony N

    2017-01-01

    Optical coherence tomography (OCT) is widely used in ophthalmology clinics and has potential for more general medical settings and remote diagnostics. In anticipation of remote applications, we developed wireless interactive control of an OCT system using mobile devices. A web-based user interface (WebUI) was developed to interact with a handheld OCT system. The WebUI consisted of key OCT displays and controls ported to a webpage using HTML and JavaScript. Client-server relationships were created between the WebUI and the OCT system computer. The WebUI was accessed on a cellular phone mounted to the handheld OCT probe to wirelessly control the OCT system. Twenty subjects were imaged using the WebUI to assess the system. System latency was measured using different connection types (wireless 802.11n only, wireless to remote virtual private network [VPN], and cellular). Using a cellular phone, the WebUI was successfully used to capture posterior eye OCT images in all subjects. Simultaneous interactivity by a remote user on a laptop was also demonstrated. On average, use of the WebUI added only 58, 95, and 170 ms to the system latency using wireless only, wireless to VPN, and cellular connections, respectively. Qualitatively, operator usage was not affected. Using a WebUI, we demonstrated wireless and remote control of an OCT system with mobile devices. The web and open source software tools used in this project make it possible for any mobile device to potentially control an OCT system through a WebUI. This platform can be a basis for remote, teleophthalmology applications using OCT.

  11. A local chaotic quasi-attractor in a kicked rotator

    NASA Astrophysics Data System (ADS)

    Jiang, Yu-Mei; Lu, Yun-Qing; Zhao, Jin-Gang; Wang, Xu-Ming; Chen, He-Sheng; He, Da-Ren

    2002-03-01

    Recently, Hu et al. reported diffusion in a special kind of stochastic web observed in a kicked rotator described by a discontinuous but invertible two-dimensional area-preserving map^1. We modified the function form of the system so that the period of the kicking force becomes different in two parts of the space, making the conservative map both discontinuous and noninvertible. It is found that when the ratio between the two periods becomes smaller or larger than (but near to) 1, the chaotic diffusion in the web transfers to chaotic transients, which are attracted to the elliptic islands that existed inside the holes of the web when the ratio equaled 1. Upon reaching the islands, the iteration follows the conservative laws exactly. We therefore refer to these elliptic islands as a "regular quasi-attractor"^2. When the ratio increases further, far from 1, all the elliptic islands disappear and a local chaotic quasi-attractor appears instead, attracting iterations starting from most initial points in the phase space. This behavior may be considered a kind of "confinement" of the chaotic motion of a particle. ^1B. Hu et al., Phys. Rev. Lett., 82 (1999) 4224. ^2J. Wang et al., Phys. Rev. E, 64 (2001) 026202.

  12. The Protein Disease Database of human body fluids: II. Computer methods and data issues.

    PubMed

    Lemkin, P F; Orr, G A; Goldstein, M P; Creed, G J; Myrick, J E; Merril, C R

    1995-01-01

    The Protein Disease Database (PDD) is a relational database of proteins and diseases. With this database it is possible to screen for quantitative protein abnormalities associated with disease states. These quantitative relationships use data drawn from the peer-reviewed biomedical literature. Assays may also include those observed in high-resolution electrophoretic gels that offer the potential to quantitate many proteins in a single test as well as data gathered by enzymatic or immunologic assays. We are using the Internet World Wide Web (WWW) and the Web browser paradigm as an access method for wide distribution and querying of the Protein Disease Database. The WWW hypertext transfer protocol and its Common Gateway Interface make it possible to build powerful graphical user interfaces that can support easy-to-use data retrieval using query specification forms or images. The details of these interactions are totally transparent to the users of these forms. Using a client-server SQL relational database, user query access, initial data entry and database maintenance are all performed over the Internet with a Web browser. We discuss the underlying design issues, mapping mechanisms and assumptions that we used in constructing the system, data entry, access to the database server, security, and synthesis of derived two-dimensional gel image maps and hypertext documents resulting from SQL database searches.
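    The record above describes web forms that translate user queries into SQL against a relational protein-disease database. A hypothetical miniature of that relational core, with an assumed three-table schema and illustrative rows (not the PDD's actual schema or data):

    ```python
    # Hypothetical miniature of a protein-disease relational database,
    # queried with the kind of SQL a CGI form handler would generate.
    # Table names, columns, and rows are all illustrative assumptions.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE protein (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE disease (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE abnormality (
            protein_id INTEGER REFERENCES protein(id),
            disease_id INTEGER REFERENCES disease(id),
            fold_change REAL  -- level relative to healthy controls
        );
    """)
    db.executemany("INSERT INTO protein VALUES (?, ?)",
                   [(1, "albumin"), (2, "C-reactive protein")])
    db.executemany("INSERT INTO disease VALUES (?, ?)",
                   [(1, "nephrotic syndrome"), (2, "acute inflammation")])
    db.executemany("INSERT INTO abnormality VALUES (?, ?, ?)",
                   [(1, 1, 0.4), (2, 2, 8.0)])

    # "Which proteins are elevated in a given disease?" -- the type of
    # screening query a PDD search form would issue.
    rows = db.execute("""
        SELECT p.name, a.fold_change
        FROM abnormality a
        JOIN protein p ON p.id = a.protein_id
        JOIN disease d ON d.id = a.disease_id
        WHERE d.name = ? AND a.fold_change > 1.0
    """, ("acute inflammation",)).fetchall()
    print(rows)
    ```

    The parameterized `?` placeholder mirrors the security concern raised in the abstract: form input should never be spliced directly into SQL text.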

  13. The Development of Smart Home System for Controlling and Monitoring Energy Consumption using WebSocket Protocol

    NASA Astrophysics Data System (ADS)

    Witthayawiroj, Niti; Nilaphruek, Pongpon

    2017-03-01

    Energy consumption, especially of electricity, is one of the most serious problems in households today: more electricity is consumed than people actually need. Overuse often results from the inconvenience of walking to a switch to turn off a light or other appliance, so lights are frequently left on; moreover, residents have no tools for monitoring how much energy their homes consume. People therefore lack both monitoring and control of their energy usage. This study has two main objectives: 1) creating a communication framework among server, clients and devices, and 2) developing a prototype system that addresses these problems, giving users the ability to see how much electricity their houses have used and to turn appliances on and off over the Internet from smart devices, such as phones and tablets that support the Android platform or any web browser. A Raspberry Pi is used as the microcontroller, and data are transferred to the smart device over the WebSocket protocol, which is well suited to real-time communication. Example features on the device's screen include user management and the control and monitoring of appliances. User-satisfaction results indicate that the system is effective and easy to use. In future work, current sensors may be added for more accurate electricity measurement, along with Wi-Fi modules so that more appliances can report their power consumption.
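    The real-time push that makes WebSocket attractive here rests on a compact binary framing defined by RFC 6455. A sketch of the server-to-client text-frame format, covering only unmasked payloads under 126 bytes (a real server or WebSocket library also handles extended lengths, masking, and control frames):

    ```python
    # Sketch of the RFC 6455 wire format for a server-to-client text
    # message, the kind of real-time push a smart-home monitor relies on.
    # Only short, unmasked payloads are handled in this sketch.
    def encode_text_frame(message: str) -> bytes:
        payload = message.encode("utf-8")
        if len(payload) >= 126:
            raise ValueError("extended payload lengths not handled here")
        # 0x81 = FIN bit set + opcode 0x1 (text frame); second byte is the
        # payload length (mask bit clear for server-to-client frames).
        return bytes([0x81, len(payload)]) + payload

    def decode_text_frame(frame: bytes) -> str:
        assert frame[0] == 0x81, "expected a final, unfragmented text frame"
        length = frame[1] & 0x7F
        return frame[2:2 + length].decode("utf-8")

    # A hypothetical appliance-status update pushed to the phone.
    frame = encode_text_frame('{"lamp": "off", "watts": 0}')
    print(frame.hex())
    print(decode_text_frame(frame))
    ```

    The two-byte header (versus HTTP's per-request headers) is why WebSocket suits frequent, small status updates like per-appliance power readings.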

  14. PC-based web authoring: How to learn as little unix as possible while getting on the Web

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gennari, L.T.; Breaux, M.; Minton, S.

    1996-09-01

    This document is a general guide for creating Web pages, using commonly available word processing and file transfer applications. It is not a full guide to HTML, nor does it provide an introduction to the many WYSIWYG HTML editors available. The viability of the authoring method it describes will not be affected by changes in the HTML specification or the rapid release-and-obsolescence cycles of commercial WYSIWYG HTML editors. This document provides a gentle introduction to HTML for the beginner, and as the user gains confidence and experience, encourages greater familiarity with HTML through continued exposure to and hands-on usage of HTML code.

  15. Mac-based Web authoring: How to learn as little Unix as possible while getting on the Web.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gennari, L.T.

    1996-06-01

    This document is a general guide for creating Web pages, using commonly available word processing and file transfer applications. It is not a full guide to HTML, nor does it provide an introduction to the many WYSIWYG HTML editors available. The viability of the authoring method it describes will not be affected by changes in the HTML specification or the rapid release-and-obsolescence cycles of commercial WYSIWYG HTML editors. This document provides a gentle introduction to HTML for the beginner, and as the user gains confidence and experience, encourages greater familiarity with HTML through continued exposure to and hands-on usage of HTML code.

  16. Future View: Web Navigation based on Learning User's Browsing Strategy

    NASA Astrophysics Data System (ADS)

    Nagino, Norikatsu; Yamada, Seiji

    In this paper, we propose a Future View system that assists a user's everyday Web browsing. The Future View prefetches Web pages based on the user's browsing strategy and presents them to the user in order to assist browsing. To learn the user's browsing strategy, the Future View uses two types of learning classifier systems: a content-based classifier system for content change patterns and an action-based classifier system for user action patterns. The results of learning are applied to crawling by Web robots, and the gathered Web pages are presented to the user through a Web browser interface. We experimentally show the effectiveness of navigation using the Future View.

  17. Trophic transfer of metals along freshwater food webs: Evidence of cadmium biomagnification in nature

    USGS Publications Warehouse

    Croteau, M.-N.; Luoma, S.N.; Stewart, A.R.

    2005-01-01

    We conducted a study with cadmium (Cd) and copper (Cu) in the delta of San Francisco Bay, using nitrogen and carbon stable isotopes to identify trophic position and food web structure. Cadmium is progressively enriched among trophic levels in discrete epiphyte-based food webs composed of macrophyte-dwelling invertebrates (the first link being epiphytic algae) and fishes (the first link being gobies). Cadmium concentrations were biomagnified 15 times within the scope of two trophic links in both food webs. Trophic enrichment in invertebrates was twice that of fishes. No tendency toward trophic-level enrichment was observed for Cu, regardless of whether organisms were sorted by food web or treated on a taxonomic basis within discrete food webs. The greatest toxic effects of Cd are likely to occur with increasing trophic positions, where animals are ingesting Cd-rich prey (or food). In Franks Tract this occurs within discrete food chains composed of macrophyte-dwelling invertebrates or fishes inhabiting submerged aquatic vegetation. Unraveling ecosystem complexity is necessary before species most exposed and at risk can be identified. © 2005, by the American Society of Limnology and Oceanography, Inc.

  18. ONR (Office of Naval Research) Far East Scientific Bulletin. Volume 9, Number 4, October-December 1984,

    DTIC Science & Technology

    1984-12-01

    appeared in 1854 in Erpétologie Générale ou Histoire Naturelle Complète des Reptiles and was reproduced in the book, Australia 18 Animals Discovered...Zone in relation to their environment and importance in the marine food web. Initial emphasis is being placed on publishing distribution charts of...section is the study of the nutrient supply, primary productivity, and the transfer of energy through the food web to animals and plants that are now

  19. Advanced composite vertical stabilizer for DC-10 transport aircraft

    NASA Technical Reports Server (NTRS)

    Stephens, C. O.

    1979-01-01

    Structural design, tooling, fabrication, and test activities are reported for a program to develop an advanced composite vertical stabilizer (CVS) for the DC 10 Commercial Transport Aircraft. Structural design details are described and the status of structural and weight analyses are reported. A structural weight reduction of 21.7% is currently predicted. Test results are discussed for sine-wave-stiffened shear webs representative of the CVS spar webs, and for lightning current transfer tests on a panel representative of the CVS skins.

  20. Creating Web-Based Scientific Applications Using Java Servlets

    NASA Technical Reports Server (NTRS)

    Palmer, Grant; Arnold, James O. (Technical Monitor)

    2001-01-01

    There are many advantages to developing web-based scientific applications. Any number of people can access the application concurrently. The application can be accessed from a remote location. The application becomes essentially platform-independent because it can be run from any machine that has internet access and can run a web browser. Maintenance and upgrades to the application are simplified since only one copy of the application exists in a centralized location. This paper details the creation of web-based applications using Java servlets. Java is a powerful, versatile programming language that is well suited to developing web-based programs. A Java servlet provides the interface between the central server and the remote client machines. The servlet accepts input data from the client, runs the application on the server, and sends the output back to the client machine. The type of servlet that supports the HTTP protocol will be discussed in depth. Among the topics the paper will discuss are how to write an http servlet, how the servlet can run applications written in Java and other languages, and how to set up a Java web server. The entire process will be demonstrated by building a web-based application to compute stagnation point heat transfer.
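    The servlet request/response cycle the paper describes (accept client input, run the application on the server, return the output) can be sketched with Python's stdlib `http.server` as an analogue of an HTTP servlet. The computation below is a trivial placeholder, not the paper's stagnation-point heat-transfer code, and the handler and parameter names are illustrative assumptions.

    ```python
    # The servlet pattern sketched with Python's stdlib http.server: the
    # handler reads input from the query string, runs a computation on
    # the server, and writes the result back to the client. The squaring
    # step is a stand-in for the real server-side application.
    import threading
    import urllib.parse
    import urllib.request
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    class ComputeHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            params = urllib.parse.parse_qs(urllib.parse.urlparse(self.path).query)
            x = float(params.get("x", ["0"])[0])
            result = x * x  # placeholder for the real application
            body = str(result).encode("ascii")
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # keep the demo quiet
            pass

    # Start the "servlet container" on an ephemeral port, then act as the
    # remote client: send input, receive the computed output.
    server = ThreadingHTTPServer(("127.0.0.1", 0), ComputeHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/?x=3") as resp:
        answer = resp.read().decode("ascii")
    print(answer)  # prints "9.0"
    server.shutdown()
    ```

    As in the servlet model, many concurrent clients can hit the same centralized handler, and only the server needs the application installed.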

  1. AISI/DOE Technology Roadmap Program: Development of Cost-effective, Energy-efficient Steel Framing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nader R. Elhajj

    2003-01-06

    Steel members in wall construction form a thermal bridge that interrupts the insulation layer of a wall. This causes a higher rate of heat transfer by conduction through the wall framing than through other parts of the wall. One method to reduce the thermal bridging effect is to provide a break, such as insulating sheathing. A thermally efficient slit-web steel stud was developed in this program to mitigate the conductivity of steel. The thermal performance of the slit-web stud was evaluated at Oak Ridge National Laboratory using hot-box testing. The thermal test results showed that the prototype slit-web stud performed 17% better than the solid-web stud, using R-13 fiber glass batts with exterior OSB sheathing and interior drywall. The structural behavior of this slit-web stud was evaluated in axial, bending, shear, shearwall, and stub-column tests. Test results indicated that the slit-web stud performed similarly to or better than the solid-web stud in most structural performance characteristics investigated. Thus, the prototype slit-web stud has been shown to be thermally efficient, economically viable, structurally sound, easily manufactured, and usable in a range of residential installations.
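    The thermal-bridging effect the report addresses can be illustrated with a parallel-path calculation, which area-weights the U-factors of the stud path and the cavity path. All R-values and area fractions below are assumed for illustration, not taken from the report (and note that for steel framing the parallel-path method underestimates bridging; ASHRAE's modified zone method is normally preferred):

```python
def parallel_path_u(paths):
    """Area-weighted overall U-factor from parallel heat-flow paths.
    paths: list of (area_fraction, r_value) tuples; fractions must sum to 1."""
    assert abs(sum(f for f, _ in paths) - 1.0) < 1e-9
    return sum(f / r for f, r in paths)

# Hypothetical layer R-values (ft^2.F.h/Btu): R-13 batt cavity plus
# sheathing/drywall R-1.5; stud paths differ only in the stud's own R.
cavity = 13.0 + 1.5
solid_stud = 0.5 + 1.5   # solid steel web: nearly a short circuit
slit_stud = 3.0 + 1.5    # assumed improved stud-path R after slitting the web

u_solid = parallel_path_u([(0.85, cavity), (0.15, solid_stud)])
u_slit = parallel_path_u([(0.85, cavity), (0.15, slit_stud)])
print(f"solid-web R_eff = {1/u_solid:.1f}, slit-web R_eff = {1/u_slit:.1f}")
```

    Even with the stud occupying only 15% of the wall area, raising the stud-path resistance noticeably raises the wall's effective R, which is the mechanism the slit-web design exploits.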

  2. A System for Information Management in BioMedical Studies—SIMBioMS

    PubMed Central

    Krestyaninova, Maria; Zarins, Andris; Viksna, Juris; Kurbatova, Natalja; Rucevskis, Peteris; Neogi, Sudeshna Guha; Gostev, Mike; Perheentupa, Teemu; Knuuttila, Juha; Barrett, Amy; Lappalainen, Ilkka; Rung, Johan; Podnieks, Karlis; Sarkans, Ugis; McCarthy, Mark I; Brazma, Alvis

    2009-01-01

    Summary: SIMBioMS is a web-based open source software system for managing data and information in biomedical studies. It provides a solution for the collection, storage, management and retrieval of information about research subjects and biomedical samples, as well as experimental data obtained using a range of high-throughput technologies, including gene expression, genotyping, proteomics and metabonomics. The system can easily be customized and has proven to be successful in several large-scale multi-site collaborative projects. It is compatible with emerging functional genomics data standards and provides data import and export in accepted standard formats. Protocols for transferring data to durable archives at the European Bioinformatics Institute have been implemented. Availability: The source code, documentation and initialization scripts are available at http://simbioms.org. Contact: support@simbioms.org; mariak@ebi.ac.uk PMID:19633095

  3. DOORS to the semantic web and grid with a PORTAL for biomedical computing.

    PubMed

    Taswell, Carl

    2008-03-01

    The semantic web remains in the early stages of development. It has not yet achieved the goals envisioned by its founders as a pervasive web of distributed knowledge and intelligence. Success will be attained when a dynamic synergism can be created between people and a sufficient number of infrastructure systems and tools for the semantic web in analogy with those for the original web. The domain name system (DNS), web browsers, and the benefits of publishing web pages motivated many people to register domain names and publish web sites on the original web. An analogous resource label system, semantic search applications, and the benefits of collaborative semantic networks will motivate people to register resource labels and publish resource descriptions on the semantic web. The Domain Ontology Oriented Resource System (DOORS) and Problem Oriented Registry of Tags and Labels (PORTAL) are proposed as infrastructure systems for resource metadata within a paradigm that can serve as a bridge between the original web and the semantic web. The Internet Registry Information Service (IRIS) registers domain names while DNS publishes domain addresses with mapping of names to addresses for the original web. Analogously, PORTAL registers resource labels and tags while DOORS publishes resource locations and descriptions with mapping of labels to locations for the semantic web. BioPORT is proposed as a prototype PORTAL registry specific for the problem domain of biomedical computing.
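    The IRIS/DNS analogy can be made concrete with a toy sketch in which registration (PORTAL, like IRIS) is kept separate from publication and resolution (DOORS, like DNS). Everything below, including the label scheme, URL, and function names, is hypothetical, invented only to illustrate the proposed division of roles:

```python
portal_registry = {}  # label -> tags       (registration, PORTAL / IRIS role)
doors_records = {}    # label -> record     (publication, DOORS / DNS role)

def portal_register(label, tags):
    """Register a resource label with its problem-domain tags."""
    portal_registry[label] = set(tags)

def doors_publish(label, location, description):
    """Publish a location and description for an already-registered label."""
    if label not in portal_registry:
        raise KeyError(f"label {label!r} not registered")
    doors_records[label] = {"location": location, "description": description}

def doors_resolve(label):
    """Map a label to a location, as DNS maps a name to an address."""
    return doors_records[label]["location"]

portal_register("bioport:segmentation-tool", ["biomedical-computing", "imaging"])
doors_publish("bioport:segmentation-tool", "https://example.org/segtool",
              "An image segmentation resource")
print(doors_resolve("bioport:segmentation-tool"))  # -> https://example.org/segtool
```

    The key design point mirrored here is that a label must exist in the registry before a description can be published against it, just as a domain must be registered before DNS records for it are served.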

  4. Using the World Wide Web for GIDEP Problem Data Processing at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    McPherson, John W.; Haraway, Sandra W.; Whirley, J. Don

    1999-01-01

    Since April 1997, Marshall Space Flight Center has been using electronic transfer and the web to support our processing of the Government-Industry Data Exchange Program (GIDEP) and NASA ALERT information. Specific aspects include: (1) extraction of ASCII text information from GIDEP for loading into Word documents for e-mail to ALERT actionees; (2) downloading of GIDEP form image formats in Adobe Acrobat (.pdf) for internal storage and display on the MSFC ALERT web page; (3) linkage of stored GIDEP problem forms with summary information for access from the MSFC ALERT Distribution Summary Chart or from an HTML table of released MSFC ALERTs; (4) archival of historic ALERTs for reference by GIDEP ID, MSFC ID, or MSFC release date; (5) on-line tracking of ALERT response status using a Microsoft Access database and the web; and (6) on-line response to ALERTs from MSFC actionees through interactive web forms. The technique, benefits, effort, coordination, and lessons learned for each aspect are covered herein.

  5. Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.

    2012-12-01

    Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.

  6. Climate change alters the structure of arctic marine food webs due to poleward shifts of boreal generalists.

    PubMed

    Kortsch, Susanne; Primicerio, Raul; Fossheim, Maria; Dolgov, Andrey V; Aschan, Michaela

    2015-09-07

    Climate-driven poleward shifts, leading to changes in species composition and relative abundances, have been recently documented in the Arctic. Among the fastest moving species are boreal generalist fish, which are expected to affect arctic marine food web structure and ecosystem functioning substantially. Here, we address structural changes at the food web level induced by poleward shifts via topological network analysis of highly resolved boreal and arctic food webs of the Barents Sea. We detected considerable differences in structural properties and link configuration between the boreal and the arctic food webs, the latter being more modular and less connected. We found that a main characteristic of the boreal fish moving poleward into the arctic region of the Barents Sea is high generalism, a property that increases connectance and reduces modularity in the arctic marine food web. Our results reveal that habitats form natural boundaries for food web modules, and that generalists play an important functional role in coupling pelagic and benthic modules. We posit that these habitat couplers have the potential to promote the transfer of energy and matter between habitats, but also the spread of perturbations, thereby changing arctic marine food web structure considerably with implications for ecosystem dynamics and functioning. © 2015 The Authors.
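    The paper's key structural quantity, connectance, is straightforward to compute: directed connectance is C = L/S² for L feeding links among S species. The toy web below (assumed species and links, for illustration only) shows the mechanism the authors describe, where an incoming generalist consumer raises connectance:

```python
def connectance(species, links):
    """Directed connectance C = L / S^2 of a food web."""
    return len(links) / len(species) ** 2

# Toy arctic-style web: (resource, consumer) feeding links.
species = ["ice_algae", "copepod", "amphipod", "polar_cod", "seal"]
links = [("ice_algae", "copepod"), ("ice_algae", "amphipod"),
         ("copepod", "polar_cod"), ("amphipod", "polar_cod"),
         ("polar_cod", "seal")]
c_before = connectance(species, links)  # 5 / 25 = 0.2

# A boreal generalist feeding on several resources couples existing modules.
species_after = species + ["boreal_generalist"]
links_after = links + [("copepod", "boreal_generalist"),
                       ("amphipod", "boreal_generalist"),
                       ("polar_cod", "boreal_generalist")]
c_after = connectance(species_after, links_after)  # 8 / 36 ~ 0.222
print(f"C before: {c_before:.3f}, after generalist arrives: {c_after:.3f}")
```

    Because the generalist adds links faster than it adds species, overall connectance rises, which is the topological signature of borealization reported for the Barents Sea web.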

  7. Climate change alters the structure of arctic marine food webs due to poleward shifts of boreal generalists

    PubMed Central

    Kortsch, Susanne; Primicerio, Raul; Fossheim, Maria; Dolgov, Andrey V.; Aschan, Michaela

    2015-01-01

    Climate-driven poleward shifts, leading to changes in species composition and relative abundances, have been recently documented in the Arctic. Among the fastest moving species are boreal generalist fish, which are expected to affect arctic marine food web structure and ecosystem functioning substantially. Here, we address structural changes at the food web level induced by poleward shifts via topological network analysis of highly resolved boreal and arctic food webs of the Barents Sea. We detected considerable differences in structural properties and link configuration between the boreal and the arctic food webs, the latter being more modular and less connected. We found that a main characteristic of the boreal fish moving poleward into the arctic region of the Barents Sea is high generalism, a property that increases connectance and reduces modularity in the arctic marine food web. Our results reveal that habitats form natural boundaries for food web modules, and that generalists play an important functional role in coupling pelagic and benthic modules. We posit that these habitat couplers have the potential to promote the transfer of energy and matter between habitats, but also the spread of perturbations, thereby changing arctic marine food web structure considerably with implications for ecosystem dynamics and functioning. PMID:26336179

  8. Quality evaluation on an e-learning system in continuing professional education of nurses.

    PubMed

    Lin, I-Chun; Chien, Yu-Mei; Chang, I-Chiu

    2006-01-01

    Maintaining high quality in Web-based learning is a powerful means of increasing the overall efficiency and effectiveness of distance learning. Many studies have evaluated Web-based learning, but few have evaluated it from an information systems (IS) perspective. This study applied the well-known IS Success model to measure the quality of a Web-based learning system, using a Web-based questionnaire for data collection. One hundred and fifty-four nurses participated in the survey. Based on confirmatory factor analysis, the variables of the research model were found suitable for measuring the quality of a Web-based learning system. As Web-based education continues to grow worldwide, the results of this study may assist system adopters (hospital executives), learners (nurses), and system designers in making reasonable and informed judgments with regard to the quality of Web-based learning systems in continuing professional education.

  9. 75 FR 42376 - Proposed Information Collection; Comment Request; NTIA/FCC Web-based Frequency Coordination System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-21

    ... Information Collection; Comment Request; NTIA/FCC Web- based Frequency Coordination System AGENCY: National.... Abstract The National Telecommunications and Information Administration (NTIA) hosts a Web-based system...) bands that are shared on a co-primary basis by federal and non-federal users. The Web-based system...

  10. Wireless, Web-Based Interactive Control of Optical Coherence Tomography with Mobile Devices

    PubMed Central

    Mehta, Rajvi; Nankivil, Derek; Zielinski, David J.; Waterman, Gar; Keller, Brenton; Limkakeng, Alexander T.; Kopper, Regis; Izatt, Joseph A.; Kuo, Anthony N.

    2017-01-01

    Purpose Optical coherence tomography (OCT) is widely used in ophthalmology clinics and has potential for more general medical settings and remote diagnostics. In anticipation of remote applications, we developed wireless interactive control of an OCT system using mobile devices. Methods A web-based user interface (WebUI) was developed to interact with a handheld OCT system. The WebUI consisted of key OCT displays and controls ported to a webpage using HTML and JavaScript. Client–server relationships were created between the WebUI and the OCT system computer. The WebUI was accessed on a cellular phone mounted to the handheld OCT probe to wirelessly control the OCT system. Twenty subjects were imaged using the WebUI to assess the system. System latency was measured using different connection types (wireless 802.11n only, wireless to remote virtual private network [VPN], and cellular). Results Using a cellular phone, the WebUI was successfully used to capture posterior eye OCT images in all subjects. Simultaneous interactivity by a remote user on a laptop was also demonstrated. On average, use of the WebUI added only 58, 95, and 170 ms to the system latency using wireless only, wireless to VPN, and cellular connections, respectively. Qualitatively, operator usage was not affected. Conclusions Using a WebUI, we demonstrated wireless and remote control of an OCT system with mobile devices. Translational Relevance The web and open source software tools used in this project make it possible for any mobile device to potentially control an OCT system through a WebUI. This platform can be a basis for remote, teleophthalmology applications using OCT. PMID:28138415

  11. The Role of Highly Unsaturated Fatty Acids in Aquatic Food Webs

    NASA Astrophysics Data System (ADS)

    Perhar, G.; Arhonditsis, G. B.

    2009-05-01

    Highly unsaturated fatty acids (HUFAs) are important molecules transferred across the plant-animal interface in aquatic food webs. Defined here as carbon chains 18 carbons or longer, with a double bond at the third (omega-3) or sixth (omega-6) position from the methyl end, HUFAs are formed in primary producers (phytoplankton). With limited ability to synthesize them de novo, consumers and higher trophic organisms must obtain their HUFAs primarily from their diet. Bioconversion of HUFAs from one form to another is in theory possible, as is synthesis via elongation and the transformation of a saturated into a highly unsaturated fatty acid, but the enzymes required for these processes are absent in most species. HUFAs are hypothesized to be somatic growth limiting compounds for herbivorous zooplankton and have been shown to be critical for juvenile fish growth and well-being. Zooplankton tend to vary their fatty acid concentrations, collection strategies and utilization methods by taxonomy, and various mechanisms have been suggested to account for these differences (e.g., the seasonal and nervous system hypotheses). Considering also that copepods overwinter in an active state while daphnids overwinter as resting eggs, and that copepods tend to accumulate docosahexaenoic acid (DHA) through collection and bioconversion while daphnids focus on eicosapentaenoic acid (EPA), one can link high DHA concentrations to active overwintering; but both EPA and DHA have similar melting points, putting DHA's cold-weather adaptation abilities into question. Another characteristic setting copepods apart from daphnids is nervous system complexity: copepod axons are coated in thick myelin sheaths, permitting rapid neural processing such as rapid prey attack and intelligent predator avoidance; DHA may be required for the proper functioning of copepod neurons. Recent modeling results have suggested that food webs with high-quality primary producers (species high in HUFAs, i.e., diatoms) at their base can attain inverted biomass distributions with efficient energy transfer between trophic levels, making HUFA pathways in aquatic food webs of special interest to fisheries and environmental managers. Building on our previous work, which implicitly considered HUFAs through a proxy (a generic food quality term that also indexes ingestibility, digestibility and toxicity), our aim is to elucidate the underlying mechanisms controlling HUFA transport through the lower aquatic food web, with an emphasis on the hypothesized somatic growth limiting potential. A biochemical submodel coupled to a plankton model has been formulated and calibrated, accounting explicitly for the omega-3 and omega-6 families of fatty acids; specifically, alpha-linolenic acid (ALA, a precursor to EPA), EPA and DHA. Further insights into the role of HUFAs in food web dynamics and the subsequent implications for ecosystem functioning are gained through bifurcation analysis of the model. Our research aims to address existing gaps in the literature pertaining to the role and impact of HUFAs on plankton dynamics, which have traditionally been thought to be driven by stoichiometric ratios and limiting nutrients. In this study, we challenge the notion of nutrients as the primary driving factor of aquatic ecosystem patterns by introducing a modeling framework that accounts for the interplay between nutrients and HUFAs.

  12. Nature of phosphorus limitation in the ultraoligotrophic eastern Mediterranean.

    PubMed

    Thingstad, T F; Krom, M D; Mantoura, R F C; Flaten, G A F; Groom, S; Herut, B; Kress, N; Law, C S; Pasternak, A; Pitta, P; Psarra, S; Rassoulzadegan, F; Tanaka, T; Tselepides, A; Wassmann, P; Woodward, E M S; Riser, C Wexels; Zodiatis, G; Zohary, T

    2005-08-12

    Phosphate addition to surface waters of the ultraoligotrophic, phosphorus-starved eastern Mediterranean in a Lagrangian experiment caused unexpected ecosystem responses. The system exhibited a decline in chlorophyll and an increase in bacterial production and copepod egg abundance. Although nitrogen and phosphorus colimitation hindered phytoplankton growth, phosphorus may have been transferred through the microbial food web to copepods via two, not mutually exclusive, pathways: (i) bypass of the phytoplankton compartment by phosphorus uptake in heterotrophic bacteria and (ii) tunnelling, whereby phosphate luxury consumption rapidly shifts the stoichiometric composition of copepod prey. Copepods may thus be coupled to lower trophic levels through interactions not usually considered.

  13. Nature of Phosphorus Limitation in the Ultraoligotrophic Eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Thingstad, T. F.; Krom, M. D.; Mantoura, R. F. C.; Flaten, G. A. F.; Groom, S.; Herut, B.; Kress, N.; Law, C. S.; Pasternak, A.; Pitta, P.; Psarra, S.; Rassoulzadegan, F.; Tanaka, T.; Tselepides, A.; Wassmann, P.; Woodward, E. M. S.; Riser, C. Wexels; Zodiatis, G.; Zohary, T.

    2005-08-01

    Phosphate addition to surface waters of the ultraoligotrophic, phosphorus-starved eastern Mediterranean in a Lagrangian experiment caused unexpected ecosystem responses. The system exhibited a decline in chlorophyll and an increase in bacterial production and copepod egg abundance. Although nitrogen and phosphorus colimitation hindered phytoplankton growth, phosphorus may have been transferred through the microbial food web to copepods via two, not mutually exclusive, pathways: (i) bypass of the phytoplankton compartment by phosphorus uptake in heterotrophic bacteria and (ii) tunnelling, whereby phosphate luxury consumption rapidly shifts the stoichiometric composition of copepod prey. Copepods may thus be coupled to lower trophic levels through interactions not usually considered.

  14. Lyα-emitting galaxies as a probe of reionization: large-scale bubble morphology and small-scale absorbers

    NASA Astrophysics Data System (ADS)

    Kakiichi, Koki; Dijkstra, Mark; Ciardi, Benedetta; Graziani, Luca

    2016-12-01

    The visibility of Lyα-emitting galaxies during the Epoch of Reionization is controlled by both diffuse H I patches in large-scale bubble morphology and small-scale absorbers. To investigate their impacts on Lyα transfer, we apply a novel combination of analytic modelling and cosmological hydrodynamical, radiative transfer simulations to three reionization models: (I) the `bubble' model, where only diffuse H I outside ionized bubbles is present; (II) the `web' model, where H I exists only in overdense self-shielded gas; and (III) the hybrid `web-bubble' model. The three models can explain the observed Lyα luminosity function equally well, but with very different H I fractions. This confirms a degeneracy between the ionization topology of the intergalactic medium (IGM) and the H I fraction inferred from Lyα surveys. We highlight the importance of the clustering of small-scale absorbers around galaxies. A combined analysis of the Lyα luminosity function and the Lyα fraction can break this degeneracy and provide constraints on the reionization history and its topology. Constraints can be improved by analysing the full M_UV-dependent redshift evolution of the Lyα fraction of Lyman break galaxies. We find that the IGM-transmission probability distribution function is unimodal for bubble models and bimodal in web models. Comparing our models to observations, we infer that the neutral fraction at z ~ 7 is likely to be of the order of tens of per cent when interpreted with bubble or web-bubble models, with a conservative lower limit ~1 per cent when interpreted with web models.

  15. Stereoisomer-specific Trophodynamics of the Chiral Brominated Flame Retardants HBCD and TBECH in a Marine Food Web, with Implications for Human Exposure.

    PubMed

    Ruan, Yuefei; Zhang, Xiaohua; Qiu, Jian-Wen; Leung, Kenneth M Y; Lam, James C W; Lam, Paul K S

    2018-06-25

    Stereoisomers of 1,2,5,6,9,10-hexabromocyclododecane (HBCD) and 1,2-dibromo-4-(1,2-dibromoethyl)-cyclohexane (TBECH) were determined in sediments and 30 marine species in a marine food web to investigate their trophic transfer. Lipid content was found to affect the bioaccumulation of ΣHBCD and ΣTBECH in these species. Elevated biomagnification of each diastereomer from prey species to marine mammals was observed. For HBCD, biota samples showed a shift from γ- to α-HBCD when compared with sediments and technical mixtures; trophic magnification of (‒)-α- and (+)-α-HBCD was observed in the food web, with trophic magnification factors (TMFs) of 11.8 and 8.7, respectively. For TBECH, the relative abundance of γ- and δ-TBECH exhibited an increasing trend from abiotic matrices to biota samples; trophic magnification was observed for each diastereomer, with TMFs ranging from 1.9 to 3.5; the enantioselective bioaccumulation of the first-eluting enantiomer of δ-TBECH in organisms at higher trophic levels was consistently observed across samples. This is the first report on the trophic transfer of TBECH in a food web. The estimated daily intake of HBCD for Hong Kong residents was approximately 16 times higher than that for the general population in China, and the health risk to local children was high based on the relevant available reference dose.
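    A trophic magnification factor is conventionally estimated as 10 raised to the slope of an ordinary least-squares regression of log10 concentration on trophic level, with TMF > 1 indicating biomagnification. A minimal sketch with synthetic data (not the paper's measurements):

```python
import math

def trophic_magnification_factor(trophic_levels, concentrations):
    """TMF = 10**slope from OLS fit of log10(concentration) vs trophic level."""
    ys = [math.log10(c) for c in concentrations]
    n = len(trophic_levels)
    mean_x = sum(trophic_levels) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(trophic_levels, ys))
             / sum((x - mean_x) ** 2 for x in trophic_levels))
    return 10 ** slope

# Synthetic food web: concentration rises tenfold per trophic level.
tls = [2.0, 2.5, 3.0, 3.5, 4.0]
conc = [1.0, 10.0 ** 0.5, 10.0, 10.0 ** 1.5, 100.0]
print(trophic_magnification_factor(tls, conc))  # -> 10.0 (slope = 1 per level)
```

    With real data the log-concentration values scatter around the fit, and the TMF's confidence interval, not just its point estimate, determines whether magnification is significant.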

  16. Improving end of life care: an information systems approach to reducing medical errors.

    PubMed

    Tamang, S; Kopec, D; Shagas, G; Levy, K

    2005-01-01

    Chronic and terminally ill patients are disproportionately affected by medical errors. In addition, the elderly suffer more preventable adverse events than younger patients. Targeting system-wide "error-reducing" reforms to vulnerable populations can significantly reduce the incidence and prevalence of human error in medical practice. Recent developments in health informatics, particularly the application of artificial intelligence (AI) techniques such as data mining, neural networks, and case-based reasoning (CBR), present tremendous opportunities for mitigating error in disease diagnosis and patient management. Additionally, the ubiquity of the Internet creates the possibility of an almost ideal network for the dissemination of medical information. We explore the capacity and limitations of web-based palliative information systems (IS) to transform the delivery of care, streamline processes, and improve the efficiency and appropriateness of medical treatment. As a result, medical errors that occur with patients dealing with severe, chronic illness and the frail elderly can be reduced. The palliative model grew out of the need for pain relief and comfort measures for patients diagnosed with cancer. Applied definitions of palliative care extend this convention, but there is no widely accepted definition. This research will discuss the development life cycle of two palliative information systems: the CONFER QOLP management information system (MIS), currently used by a community-based palliative care program in Brooklyn, New York, and the CAREN case-based reasoning prototype. CONFER is a web platform based on the idea of "eCare". CONFER uses XML (Extensible Markup Language), a W3C-endorsed standard markup language, to define system data. The second system, CAREN, is a CBR prototype designed for palliative care patients in the cancer trajectory. CBR is a technique that exploits the similarities between situations, matching decision-making to the best-known precedent cases. The prototype uses the open-source CASPIAN shell developed at the University of Wales, Aberystwyth, and is available by anonymous FTP. We will discuss and analyze the preliminary results we have obtained using this CBR tool. Our research suggests that automated information systems can be used to improve the quality of care at the end of life and disseminate expert-level 'know-how' to palliative care clinicians. We will present how our CBR prototype can be successfully deployed, securely transferring information using the Secure File Transfer Protocol (SFTP) and a Java CBR engine.
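    The retrieve-and-reuse core of case-based reasoning can be sketched in a few lines: score stored cases by attribute overlap with the query and reuse the best precedent's recorded plan. This is an illustrative toy, not the CASPIAN shell's actual matching algorithm, and the case attributes and plans are invented:

```python
def similarity(case_a, case_b):
    """Fraction of matching attribute values over the union of attributes."""
    keys = set(case_a) | set(case_b)
    return sum(case_a.get(k) == case_b.get(k) for k in keys) / len(keys)

def retrieve(case_base, query):
    """Return the stored case most similar to the query (CBR 'retrieve' step)."""
    return max(case_base, key=lambda c: similarity(c["features"], query))

case_base = [
    {"features": {"pain": "severe", "mobility": "low", "stage": "late"},
     "plan": "opioid titration + home visit"},
    {"features": {"pain": "mild", "mobility": "high", "stage": "early"},
     "plan": "oral analgesic, routine follow-up"},
]
query = {"pain": "severe", "mobility": "low", "stage": "middle"}
best = retrieve(case_base, query)
print(best["plan"])  # -> opioid titration + home visit
```

    A production CBR system adds the remaining steps of the classic cycle, revising the reused plan for the new situation and retaining the outcome as a new case.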

  17. An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films

    NASA Astrophysics Data System (ADS)

    Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander

    Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility and will be available to users via a web portal that facilitates highly parallelized analysis.

  18. NOAA's Big Data Partnership at the National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2015-12-01

    In April of 2015, the U.S. Department of Commerce announced NOAA's Big Data Partnership (BDP) with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Corp., and the Open Cloud Consortium through Cooperative Research and Development Agreements. Recent progress on the activities with these Partners at the National Centers for Environmental Information (NCEI) will be presented. These activities include the transfer of over 350 TB of NOAA's archived data from NCEI's tape-based archive system to BDP cloud providers; new opportunities for data mining and investigation; application of NOAA's data maturity and stewardship concepts to the BDP; and integration of both archived and near-realtime data streams into a synchronized, distributed data system. Both lessons learned and future opportunities for the environmental data community will be presented.

  19. Use of QuakeSim and UAVSAR for Earthquake Damage Mitigation and Response

    NASA Technical Reports Server (NTRS)

    Donnellan, A.; Parker, J. W.; Bawden, G.; Hensley, S.

    2009-01-01

    Spaceborne, airborne, and modeling and simulation techniques are being applied to earthquake risk assessment and response for mitigation from this natural disaster. QuakeSim is a web-based portal for modeling interseismic strain accumulation using paleoseismic and crustal deformation data. The models are used for understanding strain accumulation and release from earthquakes as well as stress transfer to neighboring faults. Simulations of the fault system can be used for understanding the likelihood and patterns of earthquakes as well as the likelihood of large aftershocks from events. UAVSAR is an airborne L-band InSAR system for collecting crustal deformation data. QuakeSim, UAVSAR, and DESDynI (following launch) can be used for monitoring earthquakes, the associated rupture and damage, and postseismic motions for prediction of aftershock locations.

  20. 75 FR 27182 - Energy Conservation Program: Web-Based Compliance and Certification Management System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ... Conservation Program: Web-Based Compliance and Certification Management System AGENCY: Office of Energy... certification reports to the Department of Energy (DOE) through an electronic Web-based tool, the Compliance and... following means: 1. Compliance and Certification Management System (CCMS)--via the Web portal: http...

  1. Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service

    PubMed Central

    Hatano, Kenji; Ohe, Kazuhiko

    2003-01-01

    An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Services has been developed. XML Web Services are a distributed processing technology built on standard Internet protocols. With the seamless remote method invocation of XML Web Services, users can obtain the latest disease-code master information from rich desktop applications or web sites that refer to this service. PMID:14728364
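    The "seamless remote method invocation" described here can be demonstrated with Python's stdlib XML-RPC, used below as a stand-in for the SOAP-style XML Web Service of the paper: a method registered on the server is invoked by name over XML-over-HTTP from the client. The disease codes and the `lookup` function are invented for illustration:

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

DISEASE_MASTER = {  # hypothetical excerpt of a disease-code master
    "8830052": "Type 2 diabetes mellitus",
    "8836695": "Essential hypertension",
}

def lookup(code):
    """Return the disease name for a master code."""
    return DISEASE_MASTER.get(code, "unknown code")

# Server side: expose lookup() as a remotely callable method.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lookup, "lookup")
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the proxy makes the remote call look like a local one.
port = server.server_address[1]
client = ServerProxy(f"http://127.0.0.1:{port}/")
print(client.lookup("8830052"))  # -> Type 2 diabetes mellitus
server.shutdown()
```

    The appeal noted in the abstract is exactly this transparency: the client code calls `client.lookup(...)` as if it were local, while the request and response travel as XML over HTTP.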

  2. Web Mining: Machine Learning for Web Applications.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Chau, Michael

    2004-01-01

    Presents an overview of machine learning research and reviews methods used for evaluating machine learning systems. Ways that machine-learning algorithms were used in traditional information retrieval systems in the "pre-Web" era are described, and the field of Web mining and how machine learning has been used in different Web mining…

  3. Recent advancements on the development of web-based applications for the implementation of seismic analysis and surveillance systems

    NASA Astrophysics Data System (ADS)

    Friberg, P. A.; Luis, R. S.; Quintiliani, M.; Lisowski, S.; Hunter, S.

    2014-12-01

    Recently, a novel set of modules has been included in the Open Source Earthworm seismic data processing system, supporting the use of web applications. These include the Mole sub-system, for storing relevant event data in a MySQL database (see M. Quintiliani and S. Pintore, SRL, 2013), and an embedded webserver, Moleserv, for serving such data to web clients in QuakeML format. These modules have enabled, for the first time using Earthworm, the use of web applications for seismic data processing. These can greatly simplify the operation and maintenance of seismic data processing centers by having one or more servers providing the relevant data as well as the data processing applications themselves to client machines running arbitrary operating systems.Web applications with secure online web access allow operators to work anywhere, without the often cumbersome and bandwidth hungry use of secure shell or virtual private networks. Furthermore, web applications can seamlessly access third party data repositories to acquire additional information, such as maps. Finally, the usage of HTML email brought the possibility of specialized web applications, to be used in email clients. This is the case of EWHTMLEmail, which produces event notification emails that are in fact simple web applications for plotting relevant seismic data.Providing web services as part of Earthworm has enabled a number of other tools as well. One is ISTI's EZ Earthworm, a web based command and control system for an otherwise command line driven system; another is a waveform web service. The waveform web service serves Earthworm data to additional web clients for plotting, picking, and other web-based processing tools. 
The current Earthworm waveform web service hosts an advanced plotting capability for providing views of event-based waveforms from a Mole database served by Moleserv. The current trend towards cloud services supported by web applications is driving improvements in JavaScript, CSS and HTML, as well as faster and more efficient web browsers, including mobile ones. It is foreseeable that in the near future web applications will be as powerful and efficient as native applications. Hence, the work described here is a first step towards bringing the Open Source Earthworm seismic data processing system to this new paradigm.
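The record above describes Moleserv serving event data to web clients in QuakeML format. As a rough illustration of what such a client does with the payload, the sketch below extracts origin time and magnitude from a simplified, hypothetical QuakeML-like document (real QuakeML uses an outer envelope namespace and many more elements; this is not Moleserv's actual output):

```python
# Hedged sketch: parsing a minimal QuakeML-style event document.
# The sample XML and the single-namespace layout are assumptions.
import xml.etree.ElementTree as ET

NS = {"q": "http://quakeml.org/xmlns/bed/1.2"}

def parse_event(quakeml: str) -> dict:
    """Return origin time and magnitude of the first event in the document."""
    root = ET.fromstring(quakeml)
    event = root.find(".//q:event", NS)
    time = event.find(".//q:origin/q:time/q:value", NS).text
    mag = float(event.find(".//q:magnitude/q:mag/q:value", NS).text)
    return {"time": time, "magnitude": mag}

sample = """<quakeml xmlns="http://quakeml.org/xmlns/bed/1.2">
  <eventParameters>
    <event>
      <origin><time><value>2014-10-01T12:00:00Z</value></time></origin>
      <magnitude><mag><value>4.7</value></mag></magnitude>
    </event>
  </eventParameters>
</quakeml>"""
```

A web client would fetch such a document from the Moleserv endpoint and feed the parsed values to its plotting code.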

  4. The Cancer Genomics Hub (CGHub): overcoming cancer through the power of torrential data

    PubMed Central

Wilks, Christopher; Cline, Melissa S.; Weiler, Erich; Diekhans, Mark; Craft, Brian; Martin, Christy; Murphy, Daniel; Pierce, Howdy; Black, John; Nelson, Donavan; Litzinger, Brian; Hatton, Thomas; Maltbie, Lori; Ainsworth, Michael; Allen, Patrick; Rosewood, Linda; Mitchell, Elizabeth; Smith, Bradley; Warner, Jim; Groboske, John; Telc, Haifang; Wilson, Daniel; Sanford, Brian; Schmidt, Hannes; Haussler, David; Maltbie, Daniel

    2014-01-01

The Cancer Genomics Hub (CGHub) is the online repository of the sequencing programs of the National Cancer Institute (NCI), including The Cancer Genome Atlas (TCGA), the Cancer Cell Line Encyclopedia (CCLE) and the Therapeutically Applicable Research to Generate Effective Treatments (TARGET) projects, with data from 25 different types of cancer. CGHub currently contains >1.4 PB of data, has grown at an average rate of 50 TB a month and serves >100 TB per week. The architecture of CGHub is designed to support bulk searching and downloading through a Web-accessible application programming interface, enforce patient genome confidentiality in data storage and transmission and optimize for efficiency in access and transfer. In this article, we describe the design of these three components, present performance results for our transfer protocol, GeneTorrent, and finally report on the growth of the system in terms of data stored and transferred, including estimated limits on the current architecture. Our experience-based estimates suggest that centralizing storage and computational resources is more efficient than wide distribution across many satellite labs. Database URL: https://cghub.ucsc.edu PMID:25267794

  5. The ToxBank Data Warehouse: Supporting the Replacement of In Vivo Repeated Dose Systemic Toxicity Testing.

    PubMed

    Kohonen, Pekka; Benfenati, Emilio; Bower, David; Ceder, Rebecca; Crump, Michael; Cross, Kevin; Grafström, Roland C; Healy, Lyn; Helma, Christoph; Jeliazkova, Nina; Jeliazkov, Vedrin; Maggioni, Silvia; Miller, Scott; Myatt, Glenn; Rautenberg, Michael; Stacey, Glyn; Willighagen, Egon; Wiseman, Jeff; Hardy, Barry

    2013-01-01

    The aim of the SEURAT-1 (Safety Evaluation Ultimately Replacing Animal Testing-1) research cluster, comprised of seven EU FP7 Health projects co-financed by Cosmetics Europe, is to generate a proof-of-concept to show how the latest technologies, systems toxicology and toxicogenomics can be combined to deliver a test replacement for repeated dose systemic toxicity testing on animals. The SEURAT-1 strategy is to adopt a mode-of-action framework to describe repeated dose toxicity, combining in vitro and in silico methods to derive predictions of in vivo toxicity responses. ToxBank is the cross-cluster infrastructure project whose activities include the development of a data warehouse to provide a web-accessible shared repository of research data and protocols, a physical compounds repository, reference or "gold compounds" for use across the cluster (available via wiki.toxbank.net), and a reference resource for biomaterials. Core technologies used in the data warehouse include the ISA-Tab universal data exchange format, REpresentational State Transfer (REST) web services, the W3C Resource Description Framework (RDF) and the OpenTox standards. We describe the design of the data warehouse based on cluster requirements, the implementation based on open standards, and finally the underlying concepts and initial results of a data analysis utilizing public data related to the gold compounds. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
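The ToxBank data warehouse builds on the ISA-Tab universal data exchange format, which stores study metadata as tab-separated tables whose headers carry bracketed qualifiers such as `Characteristics[...]` and `Factor Value[...]`. A minimal sketch of reading such a table follows; the specific column names, compound, and doses are invented for illustration, not taken from the ToxBank gold-compound data:

```python
# Hedged sketch: reading an ISA-Tab style study table with the stdlib.
# The sample content below is illustrative, not real ToxBank data.
import csv
import io

def read_isatab(text: str) -> list:
    """Return the rows of a tab-delimited ISA-Tab table as dicts
    keyed by the header line."""
    return list(csv.DictReader(io.StringIO(text), delimiter="\t"))

study = (
    "Source Name\tCharacteristics[compound]\tFactor Value[dose]\n"
    "sample1\tacetaminophen\t10 uM\n"
    "sample2\tacetaminophen\t100 uM\n"
)
rows = read_isatab(study)
```

Keeping the bracketed qualifiers intact in the keys preserves the ISA-Tab semantics (what is an intrinsic characteristic versus an experimental factor) when the rows are loaded into a warehouse.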

  6. Plastic and other microfibers in sediments, macroinvertebrates and shorebirds from three intertidal wetlands of southern Europe and west Africa.

    PubMed

    Lourenço, Pedro M; Serra-Gonçalves, Catarina; Ferreira, Joana Lia; Catry, Teresa; Granadeiro, José P

    2017-12-01

    Microplastics are widespread in aquatic environments and can be ingested by a wide range of organisms. They can also be transferred along food webs. Estuaries and other tidal wetlands may be particularly prone to this type of pollution due to their particular hydrological characteristics and sewage input, but few studies have compared wetlands with different anthropogenic pressure. Furthermore, there is no information on microplastic transfer to secondary intertidal consumers such as shorebirds. We analysed intertidal sediments, macroinvertebrates and shorebirds, from three important wetlands along the Eastern Atlantic (Tejo estuary, Portugal; Banc d'Arguin, Mauritania and Bijagós archipelago, Guinea-Bissau), in order to evaluate the prevalence and transfer of microplastics along the intertidal food web. We further investigated variables that could explain the distribution of microplastics within the intertidal areas of the Tejo estuary. Microfibers were recorded in a large proportion of sediment samples (91%), macroinvertebrates (60%) and shorebird faeces (49%). μ-FTIR analysis indicated only 52% of these microfibers were composed of synthetic polymers (i.e. plastics). Microfiber concentrations were generally higher in the Tejo and lower in the Bijagós, with intermediate values for Banc d'Arguin, thus following a latitudinal gradient. Heavier anthropogenic pressure in the Tejo explains this pattern, but the relatively high concentrations in a pristine site like the Banc d'Arguin demonstrate the spread of pollution in the oceans. Similar microfiber concentrations in faeces of shorebirds with different foraging behaviour and similar composition of fibres collected from invertebrate and faeces suggest shorebirds mainly ingest microfibers through their prey, confirming microfiber transfer along intertidal food webs. 
Within the Tejo estuary, concentrations of microfibers in the sediment and in bivalves were positively related to the percentage of fine sediments and to the population size of the closest township, suggesting that hydrodynamics and local domestic sewage are the main factors influencing the distribution of microfibers. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. 78 FR 66420 - Proposed Enhancements to the Motor Carrier Safety Measurement System (SMS) Public Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ...-0392] Proposed Enhancements to the Motor Carrier Safety Measurement System (SMS) Public Web Site AGENCY... on the Agency's Safety Measurement System (SMS) public Web site. FMCSA first announced the... public Web site that are the direct result of feedback from stakeholders regarding the information...

  8. ITMS: Individualized Teaching Material System: Adaptive Integration of Web Pages Distributed in Some Servers.

    ERIC Educational Resources Information Center

    Mitsuhara, Hiroyuki; Kurose, Yoshinobu; Ochi, Youji; Yano, Yoneo

    The authors developed a Web-based Adaptive Educational System (Web-based AES) named ITMS (Individualized Teaching Material System). ITMS adaptively integrates knowledge on the distributed Web pages and generates individualized teaching material that has various contents. ITMS also presumes the learners' knowledge levels from the states of their…

  9. Microbial Food-Web Drivers in Tropical Reservoirs.

    PubMed

    Domingues, Carolina Davila; da Silva, Lucia Helena Sampaio; Rangel, Luciana Machado; de Magalhães, Leonardo; de Melo Rocha, Adriana; Lobão, Lúcia Meirelles; Paiva, Rafael; Roland, Fábio; Sarmento, Hugo

    2017-04-01

    Element cycling in aquatic systems is driven chiefly by planktonic processes, and the structure of the planktonic food web determines the efficiency of carbon transfer through trophic levels. However, few studies have comprehensively evaluated all planktonic food-web components in tropical regions. The aim of this study was to unravel the top-down controls (metazooplankton community structure), bottom-up controls (resource availability), and hydrologic (water residence time) and physical (temperature) variables that affect different components of the microbial food web (MFW) carbon stock in tropical reservoirs, through structural equation models (SEM). We conducted a field study in four deep Brazilian reservoirs (Balbina, Tucuruí, Três Marias, and Funil) with different trophic states (oligo-, meso-, and eutrophic). We found evidence of a high contribution of the MFW (up to 50% of total planktonic carbon), especially in the less-eutrophic reservoirs (Balbina and Tucuruí). Bottom-up and top-down effects assessed through SEM indicated negative interactions between soluble reactive phosphorus and phototrophic picoplankton (PPP), dissolved inorganic nitrogen, and heterotrophic nanoflagellates (HNF). Copepods positively affected ciliates, and cladocerans positively affected heterotrophic bacteria (HB) and PPP. Higher copepod/cladoceran ratios and an indirect positive effect of copepods on HB might strengthen HB-HNF coupling. We also found low values for the degree of uncoupling (D) and a low HNF/HB ratio compared with literature data (mostly from temperate regions). This study demonstrates the importance of evaluating the whole size spectrum (including microbial compartments) of the different planktonic compartments, in order to capture the complex carbon dynamics of tropical aquatic ecosystems.

  10. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

When a user logs in to a website, behind the scenes the user leaves his or her impressions, usage patterns, and access patterns in the web server's log file. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites, and can help system administrators improve system performance. Web logs also provide invaluable help in creating adaptive web sites and in analyzing network traffic. This paper presents the design and implementation of a web usage mining agent for digging into web log files.
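The starting point for any such agent is parsing the server's access log. As a hedged illustration (the paper does not specify a log format; Apache Common Log Format is assumed here), the sketch below tallies successful requests per page, the raw material for the organization and presentation analyses the abstract mentions:

```python
# Hedged sketch: counting page popularity from web server access logs.
# Apache Common Log Format is an assumption; adapt the regex for other formats.
import re
from collections import Counter

LOG_RE = re.compile(
    r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) \S+'
)

def page_counts(log_lines) -> Counter:
    """Count successful (HTTP 200) requests per URL path,
    skipping lines that do not parse."""
    counts = Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m:
            host, timestamp, method, path, status = m.groups()
            if status == "200":
                counts[path] += 1
    return counts

logs = [
    '10.0.0.1 - - [01/Mar/2002:10:00:00 +0000] "GET /index.html HTTP/1.0" 200 1043',
    '10.0.0.2 - - [01/Mar/2002:10:00:05 +0000] "GET /index.html HTTP/1.0" 200 1043',
    '10.0.0.1 - - [01/Mar/2002:10:00:09 +0000] "GET /missing HTTP/1.0" 404 209',
]
```

From counts like these, an agent can go on to mine per-user sessions and access sequences for adaptive-site recommendations.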

  11. Web Mining for Web Image Retrieval.

    ERIC Educational Resources Information Center

    Chen, Zheng; Wenyin, Liu; Zhang, Feng; Li, Mingjing; Zhang, Hongjiang

    2001-01-01

    Presents a prototype system for image retrieval from the Internet using Web mining. Discusses the architecture of the Web image retrieval prototype; document space modeling; user log mining; and image retrieval experiments to evaluate the proposed system. (AEF)

  12. WebCIS: large scale deployment of a Web-based clinical information system.

    PubMed

    Hripcsak, G; Cimino, J J; Sengupta, S

    1999-01-01

WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application timeouts. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users who have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system.

  13. Mobile access to virtual randomization for investigator-initiated trials.

    PubMed

    Deserno, Thomas M; Keszei, András P

    2017-08-01

    Background/aims Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and a valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places, where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting the R server using the Java R Interface. Mobile devices communicate via the representational state transfer web services. Furthermore, a simple web-based setup allows configuring the appropriate statistics by non-statisticians. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results Apps are provided for iOS and Android and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees a strictly sequential processing in all trial sites. 
Covering 88% of all randomization models that are used in recent trials, virtual randomization becomes available for investigator-initiated trials and potentially for large multi-center trials.
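Of the randomization models the authors' R script supports (simple, random allocation rule, block, stratified), the block model is the easiest to illustrate. The sketch below is a re-implementation in Python for exposition only, not the authors' code; the block size and arm labels are assumptions:

```python
# Hedged sketch: permuted-block randomization, computed on demand
# (no pre-printed allocation list), mirroring the "virtual" idea.
import random

def block_randomize(n_subjects: int, arms=("A", "B"),
                    block_size=4, seed=None) -> list:
    """Allocate subjects in randomly permuted blocks so the arms
    stay balanced after every completed block."""
    assert block_size % len(arms) == 0, "block must divide evenly across arms"
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n_subjects:
        # Each block contains every arm equally often, then is shuffled.
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        allocations.extend(block)
    return allocations[:n_subjects]

assignments = block_randomize(10, seed=42)
```

Seeding with a per-trial value stored in the core system is what makes the list "virtual": any site can recompute allocation k on demand and obtain the same answer, with no list to distribute or mishandle.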

  14. A Data Management Framework for Real-Time Water Quality Monitoring

    NASA Astrophysics Data System (ADS)

    Mulyono, E.; Yang, D.; Craig, M.

    2007-12-01

CSU East Bay operates two in-situ, near-real-time water quality monitoring stations in San Francisco Bay as a member of the Center for Integrative Coastal Ocean Observation, Research, and Education (CICORE) and the Central and Northern California Ocean Observing System (CeNCOOS). We have been operating stations at Dumbarton Pier and San Leandro Marina for the past two years. At each station, a sonde measures seven water quality parameters every six minutes. During the first year of operation, we retrieved data from the sondes every few weeks by visiting the sites and uploading data to a handheld logger. Last year we implemented a telemetry system utilizing a cellular CDMA modem to transfer data from the field to our data center on an hourly basis. Data from each station are initially stored in monthly files in native format. We import data from these files into a SQL database every hour. SQL is handled by Django, an open source web framework. Django provides a user-friendly web user interface (UI) to administer the data. We utilized parts of the Django UI for our database web front-end, which allows users to access our database via the World Wide Web and perform basic queries. We also serve our data to other aggregating sites, including the central CICORE website and NOAA's National Data Buoy Center (NDBC). Since Django is written in Python, it allows us to integrate other Python modules into our software, such as the Matplotlib library for scientific graphics. We store our code in a Subversion repository, which keeps track of software revisions. Code is tested using Python's unittest and doctest modules within Django's testing facility, which warns us when our code modifications cause other parts of the software to break. During the past two years of data acquisition, we have incrementally updated our data model to accommodate changes in physical hardware, including equipment moves, instrument replacements, and sensor upgrades that affected data format.
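The hourly import step (native-format monthly files into a SQL database) can be sketched as below. The record layout, the three-parameter subset, and the SQLite back-end are assumptions for illustration; the authors use Django's ORM over their own schema and a seven-parameter sonde format:

```python
# Hedged sketch: idempotent import of sonde records into SQL.
# Field names and the comma-separated layout are illustrative assumptions.
import sqlite3

def import_records(lines, conn) -> None:
    """Load timestamped readings; safe to re-run on the same file."""
    conn.execute("""CREATE TABLE IF NOT EXISTS reading
                    (timestamp TEXT PRIMARY KEY,
                     temperature REAL, salinity REAL, ph REAL)""")
    for line in lines:
        parts = line.strip().split(",")
        row = (parts[0], float(parts[1]), float(parts[2]), float(parts[3]))
        # INSERT OR IGNORE keeps hourly re-reads of the current monthly
        # file idempotent: already-imported timestamps are skipped.
        conn.execute("INSERT OR IGNORE INTO reading VALUES (?,?,?,?)", row)
    conn.commit()

conn = sqlite3.connect(":memory:")
import_records(["2007-10-01T00:06,16.2,29.1,7.9",
                "2007-10-01T00:12,16.3,29.0,7.9"], conn)
```

Keying on the timestamp is what lets the hourly job re-read the whole current monthly file without duplicating rows, which simplifies recovery after telemetry gaps.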

  15. A RESTful API for accessing microbial community data for MG-RAST

    DOE PAGES

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; ...

    2015-01-08

Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service.

  16. A RESTful API for Accessing Microbial Community Data for MG-RAST

    PubMed Central

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; Brettin, Tom; D'Souza, Mark; Gerlach, Wolfgang; Matthews, Hunter; Paczian, Tobias; Wilkening, Jared; Glass, Elizabeth M.; Desai, Narayan; Meyer, Folker

    2015-01-01

    Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service. PMID:25569221
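Consuming a RESTful JSON API of this kind follows a simple pattern: compose a resource URL, issue a GET, decode the JSON body. The sketch below illustrates that pattern; the base URL, resource name, and parameters are placeholders rather than documented MG-RAST routes, so consult the real API documentation before use:

```python
# Hedged sketch: composing and fetching RESTful JSON resources.
# API_BASE and the "metagenome" resource are illustrative assumptions.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_BASE = "https://api.mg-rast.org"  # assumed base URL

def resource_url(resource: str, **params) -> str:
    """Compose a REST resource URL with sorted query parameters."""
    query = ("?" + urlencode(sorted(params.items()))) if params else ""
    return f"{API_BASE}/{resource}{query}"

def fetch_json(url: str) -> dict:
    """GET a resource and decode its JSON body."""
    with urlopen(url) as resp:
        return json.load(resp)

url = resource_url("metagenome", limit=10, order="name")
# fetch_json(url) would return the decoded JSON object when run online.
```

Because every pipeline object is addressable this way, third-party tools need nothing more than an HTTP client and a JSON parser, which is exactly the interoperability argument the abstract makes.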

  17. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    NASA Astrophysics Data System (ADS)

    Brumana, R.; Santana Quintero, M.; Barazzetti, L.; Previtali, M.; Banfi, F.; Oreni, D.; Roels, D.; Roncoroni, F.

    2015-08-01

Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation requires several open data sets, each characterized by its own protocol and service interface, which limits or impedes interoperability and integration. Indeed, developing websites targeted at landscape assessment and touristic purposes nowadays requires many resources in terms of time, cost and IT skills, so such applications have been limited to a few cases, mainly focusing on world-famous touristic sites. The capability to spread the development of web-based multimedia virtual museums based on geospatial data relies on the possibility of discovering the needed geo-spatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate such homogeneous access to open data by means of specific components (the brokers) performing the interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface has been implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible to professionals and tourists via the web and also via portable devices such as tablets and smartphones. The general aim of this work, on the case study of the Lake of Como (Tremezzina municipality), is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via the web.
The developed WebGIS system integrates multi-scale and multi-temporal maps with different layers of information (cultural, historical, landscape) represented by thematic icons, allowing the richness of the landscape value to be conveyed to both tourists and professionals.

  18. Development of Web-based Distributed Cooperative Development Environment of Sign-Language Animation System and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Hara, Kousuke; Nakayama, Shigeru

A web-based distributed cooperative development environment for a sign-language animation system has been developed. We have extended the previous animation system, which was constructed as a three-tiered system consisting of a sign-language animation interface layer, a sign-language data processing layer, and a sign-language animation database. Two components, a web client using a VRML plug-in and a web servlet, have been added to the previous system. The system supports a humanoid-model avatar for interoperability and can use stored sign-language animation data shared through the database. The evaluation of this system notes that the web client's inverse kinematics function improves sign-language animation authoring.

  19. Enabling a systems biology knowledgebase with gaggle and firegoose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baliga, Nitin S.

The overall goal of this project was to extend the existing Gaggle and Firegoose systems to develop an open-source technology that runs over the web and links desktop applications with many databases and software applications. This technology would enable researchers to incorporate workflows for data analysis that can be executed from this interface to other online applications. The four specific aims were to (1) provide one-click mapping of genes, proteins, and complexes across databases and species; (2) enable multiple simultaneous workflows; (3) expand sophisticated data analysis for online resources; and (4) enhance open-source development of the Gaggle-Firegoose infrastructure. Gaggle is an open-source Java software system that integrates existing bioinformatics programs and data sources into a user-friendly, extensible environment to allow interactive exploration, visualization, and analysis of systems biology data. Firegoose is an extension to the Mozilla Firefox web browser that enables data transfer between websites and desktop tools including Gaggle. In the last phase of this funding period, we have made substantial progress on development and application of the Gaggle integration framework. We implemented the workspace in the Network Portal. Users can capture data from Firegoose and save them to the workspace. Users can create workflows to start multiple software components programmatically and pass data between them. Results of analysis can be saved to the cloud so that they can be easily restored on any machine. We also developed the Gaggle Chrome Goose, a plugin for the Google Chrome browser, in tandem with an OpenCPU server in the Amazon EC2 cloud. This allows users to interactively perform data analysis on a single web page using the R packages deployed on the OpenCPU server. The cloud-based framework facilitates collaboration between researchers from multiple organizations.
We have made a number of enhancements to the cmonkey2 application to enable and improve its integration within different environments, and we have created a new tools pipeline for generating EGRIN2 models in a largely automated way.
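OpenCPU, which the Gaggle Chrome Goose uses, exposes R functions as plain HTTP resources: a POST to `/ocpu/library/<package>/R/<function>` runs the function with the form fields as arguments. The sketch below only composes such a call; the server URL is a placeholder (the project's EC2 instance is not public), and `stats::rnorm` is a stock R example rather than a Gaggle-specific function:

```python
# Hedged sketch: composing an OpenCPU function-call request.
# The base URL is a placeholder; stats::rnorm is a generic example.
from urllib.parse import urlencode

def opencpu_call(base: str, package: str, function: str, **args) -> tuple:
    """Return (url, form_body) for an OpenCPU R function call.
    OpenCPU maps POST /ocpu/library/<pkg>/R/<fn> to an R call whose
    arguments come from the form fields."""
    url = f"{base}/ocpu/library/{package}/R/{function}"
    body = urlencode(sorted(args.items()))
    return url, body

url, body = opencpu_call("https://cloud.opencpu.org", "stats", "rnorm", n=5)
```

The response to such a POST is a set of session URLs from which the result objects can be retrieved, which is what lets a browser page drive R analyses without any local installation.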

  20. PMD2HD--a web tool aligning a PubMed search results page with the local German Cancer Research Centre library collection.

    PubMed

    Bohne-Lang, Andreas; Lang, Elke; Taube, Anke

    2005-06-27

    Web-based searching is the accepted contemporary mode of retrieving relevant literature, and retrieving as many full text articles as possible is a typical prerequisite for research success. In most cases only a proportion of references will be directly accessible as digital reprints through displayed links. A large number of references, however, have to be verified in library catalogues and, depending on their availability, are accessible as print holdings or by interlibrary loan request. The problem of verifying local print holdings from an initial retrieval set of citations can be solved using Z39.50, an ANSI protocol for interactively querying library information systems. Numerous systems include Z39.50 interfaces and therefore can process Z39.50 interactive requests. However, the programmed query interaction command structure is non-intuitive and inaccessible to the average biomedical researcher. For the typical user, it is necessary to implement the protocol within a tool that hides and handles Z39.50 syntax, presenting a comfortable user interface. PMD2HD is a web tool implementing Z39.50 to provide an appropriately functional and usable interface to integrate into the typical workflow that follows an initial PubMed literature search, providing users with an immediate asset to assist in the most tedious step in literature retrieval, checking for subscription holdings against a local online catalogue. PMD2HD can facilitate literature access considerably with respect to the time and cost of manual comparisons of search results with local catalogue holdings. The example presented in this article is related to the library system and collections of the German Cancer Research Centre. However, the PMD2HD software architecture and use of common Z39.50 protocol commands allow for transfer to a broad range of scientific libraries using Z39.50-compatible library information systems.
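The step PMD2HD automates, turning PubMed citations into catalogue queries, can be illustrated with Z39.50's Prefix Query Format (PQF), where Bib-1 use attribute 8 selects the ISSN index. The sketch below only builds the query strings; actually sending them over Z39.50 requires a toolkit such as YAZ, and the citation records shown are invented examples:

```python
# Hedged sketch: building Z39.50 PQF queries from a PubMed result set.
# The citation dicts are illustrative; a real tool would parse PubMed XML.
def pqf_issn_query(issn: str) -> str:
    """Build a Bib-1 PQF query matching on ISSN (use attribute 1=8)."""
    return f'@attr 1=8 "{issn}"'

def queries_for_citations(citations) -> list:
    """One holdings query per unique journal in the result set."""
    seen = []
    for cite in citations:
        issn = cite.get("issn")
        if issn and issn not in seen:
            seen.append(issn)
    return [pqf_issn_query(i) for i in seen]

citations = [{"pmid": "11111111", "issn": "1472-6947"},
             {"pmid": "22222222", "issn": "1472-6947"}]
```

Deduplicating by journal before querying keeps the number of round trips to the catalogue proportional to journals rather than citations, which matters when checking holdings for a large retrieval set.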

  1. Mobile medical visual information retrieval.

    PubMed

    Depeursinge, Adrien; Duc, Samuel; Eggel, Ivan; Müller, Henning

    2012-01-01

In this paper, we propose mobile access to peer-reviewed medical information based on textual search and content-based visual image retrieval. Web-based interfaces designed for limited screen space were developed to query, via web services, a medical information retrieval engine that optimizes the amount of data to be transferred over wireless connections. Visual and textual retrieval engines with state-of-the-art performance were integrated. The results obtained show good usability of the software. Future use in clinical environments has the potential to increase the quality of patient care through bedside access to the medical literature in context.

  2. AMBIT RESTful web services: an implementation of the OpenTox application programming interface.

    PubMed

    Jeliazkova, Nina; Jeliazkov, Vedrin

    2011-05-16

The AMBIT web services package is one of the several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; and iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share datasets and models online.
The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing resource-intensive tasks and for data sharing, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application but also in a network of distributed services. Last but not least, the services provide a basis for building web mashups and end-user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems.
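The "read data from a web address, perform processing, write to a web address" paradigm can be made concrete with a single request: the client POSTs a dataset URI to a model URI, and the service fetches the data, runs the model, and returns the URI of the result. The sketch below only builds such a request; the URIs are placeholders, not real AMBIT endpoints, and the `dataset_uri` form field follows the OpenTox convention as this author understands it:

```python
# Hedged sketch: an OpenTox-style processing request.
# URIs are placeholders; the form-field name follows OpenTox conventions.
from urllib.request import Request

def processing_request(model_uri: str, dataset_uri: str) -> Request:
    """POST a dataset URI to a model resource. The service reads the
    data from dataset_uri, applies the model, and (per the paradigm)
    writes the result to a new web address returned in the response."""
    body = f"dataset_uri={dataset_uri}".encode()
    req = Request(model_uri, data=body, method="POST")
    req.add_header("Accept", "text/uri-list")
    return req

req = processing_request("https://example.org/model/1",
                         "https://example.org/dataset/42")
```

Because both input and output are addresses rather than payloads, large datasets never pass through the client, which is what makes the distributed-framework deployment described above practical.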

  3. AMBIT RESTful web services: an implementation of the OpenTox application programming interface

    PubMed Central

    2011-01-01

    The AMBIT web services package is one of several independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims to provide unified access to toxicity data, predictive models, and validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation or to initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API-compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. Datasets uploaded with chemical structures and an arbitrary set of properties automatically become available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service makes it easy to run predictions without installing any software, and to share datasets and models online. 
The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services can be used as a distributed framework for processing resource-intensive tasks and sharing data, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application but also in a network of distributed services. Last but not least, the services provide a basis for building web mashups and end-user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems. PMID:21575202
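
The "read data from a web address, perform processing, write to a web address" paradigm described in this record can be sketched in a few lines. This is an illustrative sketch only: the base URL, path layout, and the fetch/store stubs are invented stand-ins, not the actual AMBIT/OpenTox API.

```python
# Hypothetical sketch of the OpenTox-style REST paradigm described above:
# every resource (compound, dataset, model, task) has a unique web address,
# and processing means reading one address and writing results to another.
# The base URL, paths, and fetch/store stubs are illustrative, not the
# actual AMBIT API.

def resource_url(base, kind, identifier):
    """Compose the unique web address of a resource."""
    return f"{base}/{kind}/{identifier}"

def run_prediction(fetch, store, model_url, dataset_url):
    """Read data from a web address, process it, write to a web address."""
    data = fetch(dataset_url)                      # read from a web address
    result = {"model": model_url,                  # perform processing
              "predictions": [x * 2 for x in data]}
    return store(result)                           # write to a web address

base = "https://example.org/ambit"
dataset = resource_url(base, "dataset", 42)
model = resource_url(base, "model", 7)

stored = {}

def fake_fetch(url):        # stands in for an HTTP GET of the dataset
    return [1.0, 2.5]

def fake_store(result):     # stands in for an HTTP POST; returns a task URL
    stored["task/1"] = result
    return f"{base}/task/1"

task_url = run_prediction(fake_fetch, fake_store, model, dataset)
```

A real client would replace the stubs with HTTP calls; the point is that every input and output is itself an addressable web resource.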

  4. Patterns of usage for a Web-based clinical information system.

    PubMed

    Chen, Elizabeth S; Cimino, James J

    2004-01-01

    Understanding how clinicians are using clinical information systems to assist with their everyday tasks is valuable to the system design and development process. Developers of such systems are interested in monitoring usage in order to make enhancements. System log files are rich resources for gaining knowledge about how the system is being used. We have analyzed the log files of our Web-based clinical information system (WebCIS) to obtain various usage statistics including which WebCIS features are frequently being used. We have also identified usage patterns, which convey how the user is traversing the system. We present our method and these results as well as describe how the results can be used to customize menus, shortcut lists, and patient reports in WebCIS and similar systems.
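
The kind of log-file analysis this record describes, feature frequencies plus navigation patterns, can be sketched roughly as follows; the log format and feature names are invented examples, not actual WebCIS data.

```python
from collections import Counter

# Illustrative sketch of log analysis: count feature usage and extract
# simple navigation patterns (consecutive feature pairs within a session).
# The log entries and feature names are invented, not WebCIS data.

log = [  # (session id, feature accessed), in chronological order
    ("sess1", "labs"), ("sess1", "notes"), ("sess1", "labs"),
    ("sess2", "labs"), ("sess2", "notes"),
]

# Which features are used most often?
feature_counts = Counter(feature for _, feature in log)

# How do users traverse the system? Count consecutive feature pairs.
patterns = Counter()
last_seen = {}
for session, feature in log:
    if session in last_seen:
        patterns[(last_seen[session], feature)] += 1
    last_seen[session] = feature
```

Frequent pairs like `("labs", "notes")` are the usage patterns that could drive menu and shortcut customization.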

  5. A Data Management System Integrating Web-based Training and Randomized Trials: Requirements, Experiences and Recommendations.

    PubMed

    Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D

    2011-01-01

    This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance of web-course-trained participants (intervention group) and printed-manual-trained participants (comparison group) to determine the effectiveness of the web-course in teaching CBT skills. A single DMS was needed to support all aspects of the study: web-course delivery and management, as well as randomized trial management. The authors briefly reviewed several other systems that were described as built either to handle randomized trials or to deliver and evaluate web-based training. However, it was clear that these systems fell short of meeting our needs for simultaneous, coordinated management of the web-course and the randomized trial. New England Research Institute's (NERI) proprietary Advanced Data Entry and Protocol Tracking (ADEPT) system was coupled with the web-programmed course and customized for our purposes. This article highlights the requirements for a DMS that operates at the intersection of web-based course management systems and randomized clinical trial systems, and the extent to which the coupled, customized ADEPT satisfied those requirements. Recommendations are included for institutions and individuals considering conducting randomized trials and web-based training programs, and seeking a DMS that can meet similar requirements.

  6. Geospatial Technology Applications and Infrastructure in the Biological Resources Division.

    DTIC Science & Technology

    1998-09-01

    Keywords: Forestry/forest ecology; Geography; Geology; GIS/mapping technologies; GPS technology; HTML/World Wide Web; Information management/transfer; JAVA; Land... These technologies are being used to understand diet selection, habitat use, hibernation behavior, and social interactions of desert tortoises.

  7. Dataworks for GNSS: Software for Supporting Data Sharing and Federation of Geodetic Networks

    NASA Astrophysics Data System (ADS)

    Boler, F. M.; Meertens, C. M.; Miller, M. M.; Wier, S.; Rost, M.; Matykiewicz, J.

    2015-12-01

    Continuously-operating Global Navigation Satellite System (GNSS) networks are increasingly being installed globally for a wide variety of science and societal applications. GNSS enables Earth science research in areas including tectonic plate interactions, crustal deformation in response to loading by tectonics, magmatism, water and ice, and the dynamics of water - and thereby energy transfer - in the atmosphere at regional scale. The many individual scientists and organizations that set up GNSS stations globally are often open to sharing data, but lack the resources or expertise to deploy systems and software to manage and curate data and metadata and provide user tools that would support data sharing. UNAVCO previously gained experience in facilitating data sharing through the NASA-supported development of the Geodesy Seamless Archive Centers (GSAC) open source software. GSAC provides web interfaces and simple web services for data and metadata discovery and access, supports federation of multiple data centers, and simplifies transfer of data and metadata to long-term archives. The NSF supported the dissemination of GSAC to multiple European data centers forming the European Plate Observing System. To expand upon GSAC to provide end-to-end, instrument-to-distribution capability, UNAVCO developed Dataworks for GNSS with NSF funding to the COCONet project, and deployed this software on systems that are now operating as Regional GNSS Data Centers as part of the NSF-funded TLALOCNet and COCONet projects. Dataworks consists of software modules written in Python and Java for data acquisition, management and sharing. There are modules for GNSS receiver control and data download, a database schema for metadata, tools for metadata handling, ingest software to manage file metadata, data file management scripts, GSAC, scripts for mirroring station data and metadata from partner GSACs, and extensive software and operator documentation. 
UNAVCO plans to provide a cloud VM image of Dataworks that would allow standing up a Dataworks-enabled GNSS data center without requiring upfront investment in server hardware. By enabling data creators to organize their data and metadata for sharing, Dataworks helps scientists expand their data curation awareness and responsibility, and enhances data access for all.

  8. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems for Web-Based and Mobile Devices

    DTIC Science & Technology

    2015-05-01

    Walt Scacchi and Thomas... Topics include open architecture (OA) software systems and emerging challenges in achieving Better Buying Power (BBP) via OA software systems for Web-based and mobile devices.

  9. Factors to keep in mind when introducing virtual microscopy.

    PubMed

    Glatz-Krieger, Katharina; Spornitz, Udo; Spatz, Alain; Mihatsch, Michael J; Glatz, Dieter

    2006-03-01

    Digitization of glass slides and delivery of so-called virtual slides (VS) emulating a real microscope over the Internet have become reality due to recent improvements in technology. We have implemented a virtual microscope for instruction of medical students and for continuing medical education. Up to 30,000 images per slide are captured using a microscope with an automated stage. The images are post-processed and then served by a plain hypertext transfer protocol (http)-server. A virtual slide client (vMic) based on Macromedia's Flash MX, a highly accepted technology available on every modern Web browser, has been developed. All necessary virtual slide parameters are stored in an XML file together with the image. Evaluation of the courses by questionnaire indicated that most students and many but not all pathologists regard virtual slides as an adequate replacement for traditional slides. All our virtual slides are publicly accessible over the World Wide Web (WWW) at http://vmic.unibas.ch . Recently, several commercially available virtual slide acquisition systems (VSAS) have been developed that use various technologies to acquire and distribute virtual slides. These systems differ in speed, image quality, compatibility, viewer functionalities and price. This paper gives an overview of the factors to keep in mind when introducing virtual microscopy.
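
The "up to 30,000 images per slide" figure reflects the tiling scheme behind virtual slides; the arithmetic can be sketched as follows. The slide dimensions and tile size below are assumptions for illustration, not parameters of the system described above.

```python
import math

# Back-of-the-envelope sketch of virtual-slide tiling: a scanned slide is
# served as a grid of fixed-size image tiles so the client fetches only
# what is in view. Dimensions and tile size are assumed example values.

def tile_grid(width_px, height_px, tile_px=256):
    cols = math.ceil(width_px / tile_px)
    rows = math.ceil(height_px / tile_px)
    return cols, rows, cols * rows

# A 40,000 x 30,000 pixel scan yields tens of thousands of tiles, on the
# order of the per-slide image counts mentioned above.
cols, rows, n_tiles = tile_grid(40_000, 30_000)
```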

  10. Interactive access to LP DAAC satellite data archives through a combination of open-source and custom middleware web services

    USGS Publications Warehouse

    Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.

    2015-01-01

    Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.
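
The server-side reduction this record describes, subsetting and quality-filtering at the archive so only the needed values cross the network, can be sketched as follows. The grid, quality flags, and bounds are invented example data, not the LP DAAC services themselves.

```python
# Sketch of server-side reduction: apply a spatial subset and a quality
# filter at the archive so only the needed values are transferred.
# The grid values, quality flags, and bounds are invented examples.

grid = {  # (row, col) -> (value, quality_flag); flag 0 means good quality
    (0, 0): (10.0, 0), (0, 1): (11.0, 1),
    (1, 0): (12.0, 0), (1, 1): (13.0, 0),
}

def subset(grid, row_range, col_range, max_flag=0):
    """Return only the in-bounds, good-quality values (the reduced content)."""
    return {
        (r, c): value
        for (r, c), (value, flag) in grid.items()
        if row_range[0] <= r <= row_range[1]
        and col_range[0] <= c <= col_range[1]
        and flag <= max_flag
    }

reduced = subset(grid, (0, 1), (0, 1))  # drops the poor-quality cell (0, 1)
```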

  11. EPA Web Taxonomy

    EPA Pesticide Factsheets

    EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's Web-based information assets. EPA's Web Taxonomy is being provided in Simple Knowledge Organization System (SKOS) format. SKOS is a standard for sharing and linking knowledge organization systems that promises to make Federal terminology resources more interoperable.
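
A taxonomy term expressed in SKOS can be sketched as a small Turtle serializer. The namespace URI and term names below are invented examples; `skos:Concept`, `skos:prefLabel`, and `skos:broader` are standard SKOS vocabulary.

```python
# Minimal sketch of serializing one taxonomy term as a SKOS concept in
# Turtle. The namespace and term identifiers are invented; a full Turtle
# document would also declare the skos: prefix.

def skos_concept(base, term_id, label, broader_id=None):
    lines = [
        f"<{base}/{term_id}> a skos:Concept ;",
        f'    skos:prefLabel "{label}"@en',
    ]
    if broader_id is not None:
        lines[-1] += " ;"
        lines.append(f"    skos:broader <{base}/{broader_id}>")
    lines[-1] += " ."
    return "\n".join(lines)

ttl = skos_concept("https://example.gov/taxonomy", "water-quality",
                   "Water Quality", broader_id="water")
```

The `skos:broader` link is what makes the vocabulary hierarchical and machine-navigable.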

  12. Science Plan Visualisation for Rosetta

    NASA Astrophysics Data System (ADS)

    Schmidt, A.; Grieger, B.; Völk, S.

    2013-12-01

    Rosetta is a mission of the European Space Agency (ESA) to rendezvous with comet Churyumov-Gerasimenko in mid-2014. The trajectories and their corresponding operations are flexible and particularly complex. To make informed decisions among the many free parameters, novel ways to communicate operations to the community have been explored. To support science planning by communicating operational ideas and disseminating operational scenarios, the science ground segment makes use of Web-based visualisation technologies. To keep the threshold to analysing operations proposals as low as possible, various implementation techniques have been investigated. An important goal was to use the Web to make the content as accessible as possible. By adopting the recent standard WebGL and generating static pages of time-dependent three-dimensional views of the spacecraft as well as the corresponding fields of view of instruments, directly from the operational and for-study files, users are given the opportunity to explore interactively in their Web browsers what is being proposed in addition to using the traditional file products and analysing them in detail. The scenes and animations can be viewed in any modern Web browser and be combined with other analyses. This is to facilitate verification and cross-validation of complex products, often done by comparing different independent analyses and studies. By providing different timesteps in animations, it is possible to focus on long-term planning or short-term planning without distracting the user from the essentials. This is particularly important since the information that can be displayed in a Web browser is somewhat related to the data volume that can be transferred across the wire. In Web browsers, it is more challenging to do numerical calculations on demand. Since requests for additional data have to be passed through a Web server, they are more complex and also require a more complex infrastructure. 
The volume of data that can be kept in a browser environment is limited and might have to be transferred over often slow network links. Thus, careful design and reduction of data is required. Regarding user interaction, Web browsers are often limited to a mouse and keyboards. In terms of benefits, the threshold and turn-around times for discussing operational ideas by using the visualisation techniques described here are lowered. An additional benefit of the approach was the cooperative use of products by distributed users which resulted in higher-quality software and data by incorporating more feedback than what would usually have been available.

  13. Communication of Science Plans in the Rosetta Mission

    NASA Astrophysics Data System (ADS)

    Schmidt, Albrecht; Grieger, Björn; Völk, Stefan

    2014-05-01

    Rosetta is a mission of the European Space Agency (ESA) to rendez-vous with comet Churyumov-Gerasimenko in mid-2014. The trajectories and their corresponding operations are both flexible and particularly complex. To make informed decisions among the many free parameters, novel ways to communicate operations to the community have been explored. To support science planning by communicating operational ideas and disseminating operational scenarios, the science ground segment makes use of Web-based visualisation technologies. To keep the threshold to analysing operations proposals as low as possible, various implementation techniques have been investigated. An important goal was to use the Web to make the content as accessible as possible. By adopting the recent standard WebGL and generating static pages of time-dependent three-dimensional views of the spacecraft as well as the corresponding field-of-views of instruments, directly from the operational and for-study files, users are given the opportunity to explore interactively in their Web browsers what is being proposed in addition to using the traditional file products and analysing them in detail. The scenes and animations can be viewed in any modern Web browser and be combined with other analyses. This is to facilitate verification and cross-validation of complex products, often done by comparing different independent analyses and studies. By providing different timesteps in animations, it is possible to focus on long-term planning or short-term planning without distracting the user from the essentials. This is particularly important since the information that can be displayed in a Web browser is somewhat related to data volume that can be transferred across the wire. In Web browsers, it is more challenging to do numerical calculations on demand. Since requests for additional data have to be passed through a Web server, they are more complex and also require a more complex infrastructure. 
The volume of data that can be kept in a browser environment is limited and might have to be transferred over often slow network links. Thus, careful design and reduction of data is required. Regarding user interaction, Web browsers are often limited to a mouse and keyboards. In terms of benefits, the threshold and turn-around times for discussing operational ideas by using the visualisation techniques described here are lowered. An additional benefit of the approach was the cooperative use of products by distributed users which resulted in higher-quality software and data by incorporating more feedback than what would usually have been available.

  14. A Web-based cost-effective training tool with possible application to brain injury rehabilitation.

    PubMed

    Wang, Peijun; Kreutzer, Ina Anna; Bjärnemo, Robert; Davies, Roy C

    2004-06-01

    Virtual reality (VR) has provoked enormous interest in the medical community. In particular, VR offers therapists new approaches for improving rehabilitation effects. However, most of these VR assistant tools are not very portable, extensible or economical. Due to the vast amount of 3D data, they are not suitable for Internet transfer. Furthermore, in order to run these VR systems smoothly, special hardware devices are needed. As a result, existing VR assistant tools tend to be available in hospitals but not in patients' homes. To overcome these disadvantages, as a case study, this paper proposes a Web-based Virtual Ticket Machine, called WBVTM, using VRML [VRML Consortium, The Virtual Reality Modeling Language: International Standard ISO/IEC DIS 14772-1, 1997, available at ], Java and EAI (External Authoring Interface) [Silicon Graphics, Inc., The External Authoring Interface (EAI), available at ], to help people with acquired brain injury (ABI) to relearn basic living skills at home at a low cost. As these technologies are open standard and feature usability on the Internet, WBVTM achieves the goals of portability, easy accessibility and cost-effectiveness.

  15. Volcano Modelling and Simulation gateway (VMSg): A new web-based framework for collaborative research in physical modelling and simulation of volcanic phenomena

    NASA Astrophysics Data System (ADS)

    Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.

    2009-12-01

    Physical and numerical modelling is becoming of increasing importance in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures. Therefore, new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of past and ongoing national and European projects, is based on a dynamic Content Management System (CMS) and was developed to host and present numerical models of the main volcanic processes and relationships including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, lava flows, etc. Model applications, numerical code documentation, simulation datasets as well as model validation and calibration test-cases are also part of the gateway material.

  16. On-the-fly form generation and on-line metadata configuration--a clinical data management Web infrastructure in Java.

    PubMed

    Beck, Peter; Truskaller, Thomas; Rakovac, Ivo; Cadonna, Bruno; Pieber, Thomas R

    2006-01-01

    In this paper we describe the approach to build a web-based clinical data management infrastructure on top of an entity-attribute-value (EAV) database which provides for flexible definition and extension of clinical data sets as well as efficient data handling and high performance query execution. A "mixed" EAV implementation provides a flexible and configurable data repository and at the same time utilizes the performance advantages of conventional database tables for rarely changing data structures. A dynamically configurable data dictionary contains further information for data validation. The online user interface can also be assembled dynamically. A data transfer object which encapsulates data together with all required metadata is populated by the backend and directly used to dynamically render frontend forms and handle incoming data. The "mixed" EAV model enables flexible definition and modification of clinical data sets while reducing performance drawbacks of pure EAV implementations to a minimum. The system currently is in use in an electronic patient record with focus on flexibility and a quality management application (www.healthgate.at) with high performance requirements.
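
The entity-attribute-value idea in this record can be sketched in a few lines: flexible attributes live as (entity, attribute, value) rows, a small data dictionary validates them, and a pivot step reassembles a conventional record. The attribute names and validation rules below are invented examples, not the system's actual data dictionary.

```python
# Illustrative sketch of EAV storage with a data dictionary, assuming
# made-up clinical attributes. Flexible attributes are stored as
# (entity, attribute, value) rows and pivoted back into records.

data_dictionary = {  # metadata used for validation, as described above
    "hba1c": {"type": float, "min": 2.0, "max": 20.0},
    "smoker": {"type": str},
}

def validate(attribute, value):
    rule = data_dictionary[attribute]
    if not isinstance(value, rule["type"]):
        raise TypeError(attribute)
    if "min" in rule and not (rule["min"] <= value <= rule["max"]):
        raise ValueError(attribute)
    return value

eav_rows = [  # (entity id, attribute, value)
    (1, "hba1c", validate("hba1c", 6.8)),
    (1, "smoker", validate("smoker", "no")),
    (2, "hba1c", validate("hba1c", 7.4)),
]

def pivot(rows, entity):
    """Reassemble one entity's EAV rows into a conventional record."""
    return {attr: val for ent, attr, val in rows if ent == entity}

record = pivot(eav_rows, 1)
```

The "mixed" design in the paper keeps rarely changing structures in conventional tables and reserves this row-per-attribute layout for the parts that must stay flexible.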

  17. Automatic phylogenetic classification of bacterial beta-lactamase sequences including structural and antibiotic substrate preference information.

    PubMed

    Ma, Jianmin; Eisenhaber, Frank; Maurer-Stroh, Sebastian

    2013-12-01

    Beta lactams comprise the largest and still most effective group of antibiotics, but bacteria can gain resistance through different beta lactamases that can degrade these antibiotics. We developed a user friendly tree building web server that allows users to assign beta lactamase sequences to their respective molecular classes and subclasses. Further clinically relevant information includes if the gene is typically chromosomal or transferable through plasmids as well as listing the antibiotics which the most closely related reference sequences are known to target and cause resistance against. This web server can automatically build three phylogenetic trees: the first tree with closely related sequences from a Tachyon search against the NCBI nr database, the second tree with curated reference beta lactamase sequences, and the third tree built specifically from substrate binding pocket residues of the curated reference beta lactamase sequences. We show that the latter is better suited to recover antibiotic substrate assignments through nearest neighbor annotation transfer. The users can also choose to build a structural model for the query sequence and view the binding pocket residues of their query relative to other beta lactamases in the sequence alignment as well as in the 3D structure relative to bound antibiotics. This web server is freely available at http://blac.bii.a-star.edu.sg/.
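
The nearest-neighbour annotation transfer this record relies on can be sketched simply: the query sequence inherits the annotation of its closest curated reference. The distances below are toy values; TEM-1 (class A, penicillinase) and NDM-1 (class B, carbapenemase) are real reference enzymes, but nothing here reproduces the server's actual pipeline.

```python
# Sketch of nearest-neighbour annotation transfer with toy distances,
# e.g. as might come from an alignment of binding-pocket residues.

references = {
    "TEM-1": {"class": "A", "substrates": ["penicillins"]},
    "NDM-1": {"class": "B", "substrates": ["carbapenems"]},
}

def transfer_annotation(distances, references):
    """Pick the reference at minimal distance and copy its annotation."""
    nearest = min(distances, key=distances.get)
    return nearest, references[nearest]

nearest, annotation = transfer_annotation({"TEM-1": 0.12, "NDM-1": 0.47},
                                          references)
```

The paper's finding is that distances computed from substrate-binding-pocket residues make this transfer more reliable than whole-sequence distances.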

  18. Effects of prey density, temperature and predator diversity on nonconsumptive predator-driven mortality in a freshwater food web.

    PubMed

    Veselý, Lukáš; Boukal, David S; Buřič, Miloš; Kozák, Pavel; Kouba, Antonín; Sentis, Arnaud

    2017-12-22

    Nonconsumptive predator-driven mortality (NCM), defined as prey mortality due to predation that does not result in prey consumption, is an underestimated component of predator-prey interactions with possible implications for population dynamics and ecosystem functioning. However, the biotic and abiotic factors influencing this mortality component remain largely unexplored, leaving a gap in our understanding of the impacts of environmental change on ecological communities. We investigated the effects of temperature, prey density, and predator diversity and density on NCM in an aquatic food web module composed of dragonfly larvae (Aeshna cyanea) and marbled crayfish (Procambarus fallax f. virginalis) preying on common carp (Cyprinus carpio) fry. We found that NCM increased with prey density and depended on the functional diversity and density of the predator community. Warming significantly reduced NCM only in the dragonfly larvae but the magnitude depended on dragonfly larvae density. Our results indicate that energy transfer across trophic levels is more efficient due to lower NCM in functionally diverse predator communities, at lower resource densities and at higher temperatures. This suggests that environmental changes such as climate warming and reduced resource availability could increase the efficiency of energy transfer in food webs only if functionally diverse predator communities are conserved.

  19. Bioaccumulation and trophic transfer of short chain chlorinated paraffins in a marine food web from Liaodong Bay, North China.

    PubMed

    Ma, Xindong; Zhang, Haijun; Wang, Zhen; Yao, Ziwei; Chen, Jingwen; Chen, Jiping

    2014-05-20

    Short chain chlorinated paraffins (SCCPs) are under evaluation for inclusion in the Stockholm Convention on persistent organic pollutants. However, information on their bioconcentration and biomagnification in marine ecosystems is unavailable, limiting the evaluation of their ecological risks. In this study, seawater, sediment, zooplankton, invertebrates, and fishes collected from Liaodong Bay, Bohai Sea, North China were analyzed to investigate the residual level, congener group profile, bioaccumulation, and trophic transfer of SCCPs in a marine food web. The total concentrations of SCCPs ranged from 4.1 to 13.1 ng L(-1) in seawater, 65 to 541 ng g(-1) (dw) in sediment, and 86 to 4400 ng g(-1) (ww) in organisms. Correspondence analysis indicated the relative enrichment of C10Cl5 and C11Cl5 formula groups in most aquatic organisms. Both the logarithm bioaccumulation factors (log BAFs: 4.1-6.7) and biota-sediment accumulation factors (BSAFs: 0.1-7.3) of individual congeners implied the bioaccumulation of SCCPs. The trophic magnification factor (TMF) of ∑SCCPs was determined to be 2.38 in the zooplankton-shrimp-fish food web, indicating biomagnification potential of SCCPs in the marine ecosystem. The TMF values of individual congener groups significantly correlated with their log KOW values.
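
A trophic magnification factor of the kind reported here is commonly estimated by regressing log10(concentration) against trophic level and taking 10 to the power of the slope. The sketch below uses invented trophic levels and concentrations, not the measurements from this study.

```python
import math

# Sketch of the usual TMF estimation: least-squares slope of
# log10(concentration) vs trophic level, then TMF = 10**slope.
# The sample values are invented for illustration.

samples = [  # (trophic level, concentration in ng/g ww)
    (2.0, 100.0), (3.0, 240.0), (4.0, 560.0),
]

xs = [tl for tl, _ in samples]
ys = [math.log10(c) for _, c in samples]
n = len(samples)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
tmf = 10 ** slope  # TMF > 1 indicates biomagnification up the food web
```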

  20. Sensor Webs as Virtual Data Systems for Earth Science

    NASA Astrophysics Data System (ADS)

    Moe, K. L.; Sherwood, R.

    2008-05-01

    The NASA Earth Science Technology Office established a 3-year Advanced Information Systems Technology (AIST) development program in late 2006 to explore the technical challenges associated with integrating sensors, sensor networks, data assimilation and modeling components into virtual data systems called "sensor webs". The AIST sensor web program was initiated in response to a renewed emphasis on the sensor web concepts. In 2004, NASA proposed an Earth science vision for a more robust Earth observing system, coupled with remote sensing data analysis tools and advances in Earth system models. The AIST program is conducting the research and developing components to explore the technology infrastructure that will enable the visionary goals. A working statement for a NASA Earth science sensor web vision is the following: On-demand sensing of a broad array of environmental and ecological phenomena across a wide range of spatial and temporal scales, from a heterogeneous suite of sensors both in-situ and in orbit. Sensor webs will be dynamically organized to collect data, extract information from it, accept input from other sensor / forecast / tasking systems, interact with the environment based on what they detect or are tasked to perform, and communicate observations and results in real time. The focus on sensor webs is to develop the technology and prototypes to demonstrate the evolving sensor web capabilities. There are 35 AIST projects ranging from 1 to 3 years in duration addressing various aspects of sensor webs involving space sensors such as Earth Observing-1, in situ sensor networks such as the southern California earthquake network, and various modeling and forecasting systems. Some of these projects build on proof-of-concept demonstrations of sensor web capabilities like the EO-1 rapid fire response initially implemented in 2003. 
Other projects simulate future sensor web configurations to evaluate the effectiveness of sensor-model interactions for producing improved science predictions. Still other projects are maturing technology to support autonomous operations, communications and system interoperability. This paper will highlight lessons learned by various projects during the first half of the AIST program. Several sensor web demonstrations have been implemented and resulting experience with evolving standards, such as the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) among others, will be featured. The role of sensor webs in support of the intergovernmental Group on Earth Observations' Global Earth Observation System of Systems (GEOSS) will also be discussed. The GEOSS vision is a distributed system of systems that builds on international components to supply observing and processing systems that are, in the whole, comprehensive, coordinated and sustained. Sensor web prototypes are under development to demonstrate how remote sensing satellite data, in situ sensor networks and decision support systems collaborate in applications of interest to GEO, such as flood monitoring. Furthermore, the international Committee on Earth Observation Satellites (CEOS) has stepped up to the challenge to provide the space-based systems component for GEOSS. CEOS has proposed "virtual constellations" to address emerging data gaps in environmental monitoring, avoid overlap among observing systems, and make maximum use of existing space and ground assets. Exploratory applications that support the objectives of virtual constellations will also be discussed as a future role for sensor webs.

  1. Innovative technology for web-based data management during an outbreak

    PubMed Central

    Mukhi, Shamir N; Chester, Tammy L Stuart; Klaver-Kibria, Justine DA; Nowicki, Deborah L; Whitlock, Mandy L; Mahmud, Salah M; Louie, Marie; Lee, Bonita E

    2011-01-01

    Lack of automated and integrated data collection and management, and poor linkage of clinical, epidemiological and laboratory data during an outbreak can inhibit effective and timely outbreak investigation and response. This paper describes an innovative web-based technology, referred to as Web Data, developed for the rapid set-up and provision of interactive and adaptive data management during outbreak situations. We also describe the benefits and limitations of the Web Data technology identified through a questionnaire that was developed to evaluate the use of Web Data implementation and application during the 2009 H1N1 pandemic by Winnipeg Regional Health Authority and Provincial Laboratory for Public Health of Alberta. Some of the main benefits include: improved and secure data access, increased efficiency and reduced error, enhanced electronic collection and transfer of data, rapid creation and modification of the database, conversion of specimen-level to case-level data, and user-defined data extraction and query capabilities. Areas requiring improvement include: better understanding of privacy policies, increased capability for data sharing and linkages between jurisdictions to alleviate data entry duplication. PMID:23569597
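
The specimen-level to case-level conversion mentioned above can be sketched as a simple aggregation: laboratory specimen records sharing a case identifier are collapsed into one case record. Field names and values are illustrative, not the Web Data schema.

```python
from collections import defaultdict

# Sketch of specimen-to-case aggregation with invented field names:
# group specimen records by case id and derive case-level summaries.

specimens = [
    {"case_id": "C1", "specimen": "S1", "result": "positive"},
    {"case_id": "C1", "specimen": "S2", "result": "negative"},
    {"case_id": "C2", "specimen": "S3", "result": "positive"},
]

def to_case_level(specimens):
    cases = defaultdict(lambda: {"specimens": [], "any_positive": False})
    for s in specimens:
        case = cases[s["case_id"]]
        case["specimens"].append(s["specimen"])
        case["any_positive"] |= (s["result"] == "positive")
    return dict(cases)

cases = to_case_level(specimens)
```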

  2. The EarthServer Federation: State, Role, and Contribution to GEOSS

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Baumann, Peter

    2016-04-01

    The intercontinental EarthServer initiative has established a European datacube platform with proven scalability: known databases exceed 100 TB, and single queries have been split across more than 1,000 cloud nodes. Because its service interface is rigorously based on the OGC "Big Geo Data" standards, Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS), a range of clients can dock into the services, from open-source OpenLayers and QGIS through NASA WorldWind to proprietary ESRI ArcGIS. Datacube fusion in a "mix and match" style is supported by the platform technology, the rasdaman Array Database System, which transparently federates queries so that users simply approach any node of the federation to access any data item, internally optimized for minimal data transfer. Notably, rasdaman is part of the GEOSS GCI. NASA is contributing its Web WorldWind virtual globe for user-friendly data extraction, navigation, and analysis. Integrated datacube / metadata queries are contributed by CITE. Current federation members include ESA (managed by MEEO s.r.l.), Plymouth Marine Laboratory (PML), the European Centre for Medium-Range Weather Forecasts (ECMWF), Australia's National Computational Infrastructure, and Jacobs University (adding in Planetary Science). Further data centers have expressed interest in joining. We present the EarthServer approach, discuss its underlying technology, and illustrate the contribution this datacube platform can make to GEOSS.
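    As a rough illustration of the declarative WCPS query style the platform is built on (the coverage name below is hypothetical, and the exact function set and output formats vary by server version), a client might generate a request string like:

```python
# Sketch of building an OGC WCPS query string. The coverage name
# "AvgTemperature" is hypothetical; a real client would send the query to a
# server's WCPS endpoint for evaluation over the datacube.
def build_wcps_query(coverage, aggregate="avg"):
    # WCPS is a for/return expression language over coverages (datacubes).
    return f'for c in ({coverage}) return encode({aggregate}(c), "csv")'

query = build_wcps_query("AvgTemperature")
print(query)  # for c in (AvgTemperature) return encode(avg(c), "csv")
```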

  3. Transfer and alignment of random single-walled carbon nanotube films by contact printing.

    PubMed

    Liu, Huaping; Takagi, Daisuke; Chiashi, Shohei; Homma, Yoshikazu

    2010-02-23

    We present a simple method to transfer large-area random single-walled carbon nanotube (SWCNT) films grown on SiO(2) substrates onto another surface through a simple contact printing process. The transferred random SWCNT films can be assembled into highly ordered, dense regular arrays with high uniformity and reproducibility by sliding the growth substrate during the transfer process. The position of the transferred SWCNT film can be controlled by predefined patterns on the receiver substrates. The process is compatible with a variety of substrates, and even metal meshes for transmission electron microscopy (TEM) can be used as receiver substrates. Thus, suspended web-like SWCNT networks and aligned SWCNT arrays can be formed over the grids of TEM meshes, so that the structures of the transferred SWCNTs can be directly observed by TEM. This simple technique can be used to controllably transfer SWCNTs for property studies, for the fabrication of devices, or even as support films for TEM meshes.

  4. The GBT Dynamic Scheduling System: Powered by the Web

    NASA Astrophysics Data System (ADS)

    Marganian, P.; Clark, M.; McCarty, M.; Sessoms, E.; Shelton, A.

    2009-09-01

    The web technologies utilized for the Robert C. Byrd Green Bank Telescope's (GBT) new Dynamic Scheduling System are discussed, focusing on languages, frameworks, and tools. We use a popular Python web framework, TurboGears, to take advantage of the extensive web services the system provides. TurboGears is a model-view-controller framework, which aggregates SQLAlchemy, Genshi, and CherryPy respectively. On top of this framework, Javascript (Prototype, script.aculo.us, and JQuery) and cascading style sheets (Blueprint) are used for desktop-quality web pages.
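    The model-view-controller split described above, with SQLAlchemy, Genshi, and CherryPy in the three roles, can be caricatured in a few lines of plain Python. This is an illustrative sketch of the pattern only, not TurboGears code; all names are invented:

```python
# Minimal model-view-controller sketch (illustrative only; in the Dynamic
# Scheduling System these roles are played by SQLAlchemy, Genshi, and
# CherryPy, aggregated by the TurboGears framework).

class SessionModel:                      # "model": owns the persisted data
    def __init__(self):
        self._sessions = [{"id": 1, "name": "SessionA"}]

    def all(self):
        return list(self._sessions)

def render_view(sessions):               # "view": turns data into markup
    items = "".join(f"<li>{s['name']}</li>" for s in sessions)
    return f"<ul>{items}</ul>"

class ScheduleController:                # "controller": maps a request to model + view
    def __init__(self, model):
        self.model = model

    def index(self):
        return render_view(self.model.all())

page = ScheduleController(SessionModel()).index()
print(page)  # <ul><li>SessionA</li></ul>
```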

  5. Staged cascade fluidized bed combustor

    DOEpatents

    Cannon, Joseph N.; De Lucia, David E.; Jackson, William M.; Porter, James H.

    1984-01-01

    A fluid bed combustor comprising a plurality of fluidized bed stages interconnected by downcomers providing controlled solids transfer from stage to stage. Each stage is formed from a number of heat transfer tubes carried by a multiapertured web which passes fluidizing air to upper stages. The combustor cross section is tapered inwardly from the middle towards the top and bottom ends. Sorbent materials, as well as non-volatile solid fuels, are added to the top stages of the combustor, and volatile solid fuels are added at an intermediate stage.

  6. Communication and Gamification in the Web-Based Foreign Language Educational System: Web-Based Foreign Language Educational System

    ERIC Educational Resources Information Center

    Osipov, Ilya V.; Volinsky, Alex A.; Nikulchev, Evgeny; Prasikova, Anna Y.

    2016-01-01

    The paper describes development of the educational online web communication platform for teaching and learning foreign languages. The main objective was to develop a web application for teaching foreigners to understand casual fluent speech. The system is based on the time bank principle, allowing users to teach others their native language along…

  7. Decade of Change.

    ERIC Educational Resources Information Center

    Hunter, Leslie Gene

    1995-01-01

    Discusses advancements in the field of history-related computer-assisted instruction and research. Describes the components of Historiography and Methods of Research, a class that introduces history students to such practical applications as the World Wide Web (WWW), File Transfer Protocol (FTP), listservs, archival access, and others. Briefly…

  8. A Beginner's Guide to the Internet.

    ERIC Educational Resources Information Center

    McAdams, Charles A.; Nelson, Mark A.

    1995-01-01

    Maintains that the Internet offers services and opportunities for music teachers and students. Provides an overview of topics such as electronic mail, File Transfer Protocol (FTP), Gopher, and the World Wide Web (WWW). Includes two lists of music resources available on the Internet. (CFR)

  9. BOTULISM E IN LAKE ERIE: ECOLOGY AND LOWER FOOD WEB TRANSFER

    EPA Science Inventory

    This project will determine the environmental conditions that favor botulism Type E bacteria in Lake Erie and explore whether quagga mussels are altering bottom sediment conditions to favor C. botulinum growth. Analysis of environmental parameters, including water chemistry, alg...

  10. Web-based infectious disease surveillance systems and public health perspectives: a systematic review.

    PubMed

    Choi, Jihye; Cho, Youngtae; Shim, Eunyoung; Woo, Hyekyung

    2016-12-08

    Emerging and re-emerging infectious diseases are a significant public health concern, and early detection and immediate response is crucial for disease control. These challenges have led to the need for new approaches and technologies to reinforce the capacity of traditional surveillance systems for detecting emerging infectious diseases. In the last few years, the availability of novel web-based data sources has contributed substantially to infectious disease surveillance. This study explores the burgeoning field of web-based infectious disease surveillance systems by examining their current status, importance, and potential challenges. A systematic review framework was applied to the search, screening, and analysis of web-based infectious disease surveillance systems. We searched PubMed, Web of Science, and Embase databases to extensively review the English literature published between 2000 and 2015. Eleven surveillance systems were chosen for evaluation according to their high frequency of application. Relevant terms, including newly coined terms, development and classification of the surveillance systems, and various characteristics associated with the systems were studied. Based on a detailed and informative review of the 11 web-based infectious disease surveillance systems, it was evident that these systems exhibited clear strengths, as compared to traditional surveillance systems, but with some limitations yet to be overcome. The major strengths of the newly emerging surveillance systems are that they are intuitive, adaptable, low-cost, and operated in real-time, all of which are necessary features of an effective public health tool. The most apparent potential challenges of the web-based systems are those of inaccurate interpretation and prediction of health status, and privacy issues, based on an individual's internet activity. 
Despite being in a nascent stage with further modification needed, web-based surveillance systems have evolved to complement traditional national surveillance systems. This review highlights ways in which the strengths of existing systems can be maintained and weaknesses alleviated to implement optimal web surveillance systems.

  11. Validation of Web-Based Physical Activity Measurement Systems Using Doubly Labeled Water

    PubMed Central

    Yamaguchi, Yukio; Yamada, Yosuke; Tokushima, Satoru; Hatamoto, Yoichi; Sagayama, Hiroyuki; Kimura, Misaka; Higaki, Yasuki; Tanaka, Hiroaki

    2012-01-01

    Background: Online or Web-based measurement systems have been proposed as convenient methods for collecting physical activity data. We developed two Web-based physical activity systems—the 24-hour Physical Activity Record Web (24hPAR WEB) and 7 days Recall Web (7daysRecall WEB). Objective: To examine the validity of two Web-based physical activity measurement systems using the doubly labeled water (DLW) method. Methods: We assessed the validity of the 24hPAR WEB and 7daysRecall WEB in 20 individuals, aged 25 to 61 years. The order of email distribution and subsequent completion of the two Web-based measurement systems was randomized. Each measurement tool was used for a week. The participants’ activity energy expenditure (AEE) and total energy expenditure (TEE) were assessed over each week using the DLW method and compared with the respective energy expenditures estimated using the Web-based systems. Results: The mean AEE was 3.90 (SD 1.43) MJ estimated using the 24hPAR WEB and 3.67 (SD 1.48) MJ measured by the DLW method. The Pearson correlation for AEE between the two methods was r = .679 (P < .001). The Bland-Altman 95% limits of agreement ranged from –2.10 to 2.57 MJ between the two methods. The Pearson correlation for TEE between the two methods was r = .874 (P < .001). The mean AEE was 4.29 (SD 1.94) MJ using the 7daysRecall WEB and 3.80 (SD 1.36) MJ by the DLW method. The Pearson correlation for AEE between the two methods was r = .144 (P = .54). The Bland-Altman 95% limits of agreement ranged from –3.83 to 4.81 MJ between the two methods. The Pearson correlation for TEE between the two methods was r = .590 (P = .006). The average input times using terminal devices were 8 minutes and 10 seconds for the 24hPAR WEB and 6 minutes and 38 seconds for the 7daysRecall WEB. Conclusions: Both Web-based systems were found to be effective methods for collecting physical activity data and are appropriate for use in epidemiological studies. 
Because the measurement accuracy of the 24hPAR WEB was moderate to high, it could be suitable for evaluating the effect of interventions on individuals as well as for examining physical activity behavior. PMID:23010345
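    The Bland-Altman 95% limits of agreement reported above are simply the mean of the paired differences plus or minus 1.96 standard deviations. A minimal sketch, using fabricated AEE values rather than the study's data:

```python
import statistics

def bland_altman_limits(method_a, method_b):
    # Paired differences between the two measurement methods.
    diffs = [a - b for a, b in zip(method_a, method_b)]
    mean_diff = statistics.mean(diffs)
    sd_diff = statistics.stdev(diffs)          # sample standard deviation
    # 95% limits of agreement: mean difference +/- 1.96 SD.
    return mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

# Illustrative (fabricated) AEE values in MJ, not the study's data:
web_aee = [3.9, 4.1, 3.5, 4.4, 3.8]
dlw_aee = [3.7, 4.0, 3.6, 4.1, 3.5]
low, high = bland_altman_limits(web_aee, dlw_aee)
```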

  12. Impact of mercury contamination on the population dynamics of Peringia ulvae (Gastropoda): Implications on metal transfer through the trophic web

    NASA Astrophysics Data System (ADS)

    Cardoso, P. G.; Sousa, E.; Matos, P.; Henriques, B.; Pereira, E.; Duarte, A. C.; Pardal, M. A.

    2013-09-01

    The effects of mercury contamination on the population structure and dynamics of the gastropod Peringia ulvae (also known as Hydrobia ulvae), and its impact on the trophic web, were assessed along a mercury gradient in Ria de Aveiro (Portugal). The gastropod proved tolerant to the contaminant: the highest densities, biomasses and growth productivity values were recorded in the intermediately contaminated area, followed by the most contaminated area and finally the least contaminated one. P. ulvae was, however, negatively affected by mercury in terms of growth and life span, so in the most contaminated area the population consisted mainly of juveniles and young individuals. The intermediately contaminated area showed a greater balance among age groups, with adults dominant, while the least contaminated area presented intermediate values. P. ulvae life spans were shortest in the most contaminated area (7-8 mo), followed by the least contaminated area (10-11 mo) and the intermediately contaminated area (11-14 mo). P. ulvae proved to be an important vehicle of mercury transfer from sediments to the trophic web, incorporating approximately 15 g of Hg annually in the inner area of the Laranjo Bay (0.6 km2). Therefore, although P. ulvae is not a good bio-indicator of mercury contamination, since its structure and functioning did not suffer profound modifications, it is a crucial element in mercury biomagnification throughout the food web.

  13. Development Of A Web Service And Android 'APP' For The Distribution Of Rainfall Data. A Bottom-Up Remote Sensing Data Mining And Redistribution Project In The Age Of The 'Web 2.0'

    NASA Astrophysics Data System (ADS)

    Mantas, Vasco M.; Pereira, A. J. S. C.; Liu, Zhong

    2013-12-01

    A project was devised to develop a set of freely available applications and web services that can (1) simplify access from mobile devices to TOVAS data and (2) support the development of new datasets through data repackaging and mash-up. The bottom-up approach enables the multiplication of new services, often of limited direct interest to the organizations that produce the original, global datasets, but significant to small, local users. Through this multiplication of services, the development cost is transferred to the intermediate or end users and the entire process is made more efficient, even allowing new players to use the data in innovative ways.

  14. Impact of the 3 °C temperature rise on bacterial growth and carbon transfer towards higher trophic levels: Empirical models for the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Šolić, Mladen; Krstulović, Nada; Šantić, Danijela; Šestanović, Stefanija; Kušpilić, Grozdan; Bojanić, Natalia; Ordulj, Marin; Jozić, Slaven; Vrdoljak, Ana

    2017-09-01

    The Mediterranean Sea (including the Adriatic Sea) has been identified as a 'hotspot' for climate change, with the prediction of an increase in water temperature of 2-4 °C over the next few decades. Being mainly oligotrophic, and strongly phosphorus limited, the Adriatic Sea is characterized by the important role of the microbial food web in the production and transfer of biomass and energy towards higher trophic levels. We hypothesized that the predicted 3 °C temperature rise in the near future might cause an increase of bacterial production and bacterial losses to grazers, which could significantly enlarge the trophic base for metazoans. This empirical study is based on a combined 'space-for-time substitution' analysis (performed on 3583 data sets) and on an experimental approach (36 in situ grazing experiments performed at different temperatures). It showed that the predicted 3 °C temperature increase resulting from global warming could cause a significant increase in bacterial growth at temperatures lower than 16 °C (during the colder winter-spring period, as well as in the deeper layers). The effect of temperature on bacterial growth could be additionally doubled in conditions without phosphorus limitation. Furthermore, a 3 °C increase in temperature could double the grazing on bacteria by heterotrophic nanoflagellate (HNF) and ciliate predators, and it could increase the proportion of bacterial production transferred to the metazoan food web by 42%. Therefore, it is expected that global warming may further strengthen the role of the microbial food web in the carbon cycle in the Adriatic Sea.

  15. Informatics application provides instant research to practice benefits.

    PubMed Central

    Bowles, K. H.; Peng, T.; Qian, R.; Naylor, M. D.

    2001-01-01

    A web-based research information system was designed to enable our research team to efficiently measure health-related quality of life among frail older adults in a variety of health care settings (home care, nursing homes, assisted living, PACE). The structure, process, and outcome data are collected using laptop computers and downloaded to a SQL database. A unique feature of this project is the ability to transfer research to practice by instantly sharing individual and aggregate results with the clinicians caring for these elders, directly impacting the quality of their care. Clinicians can also dial in to the database to access standard queries or receive customized reports about the patients in their facilities. This paper will describe the development and implementation of the information system. The conference presentation will include a demonstration and examples of research to practice benefits. PMID:11825156

  16. Wireless Infrastructure for Performing Monitoring, Diagnostics, and Control HVAC and Other Energy-Using Systems in Small Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patrick O'Neill

    This project focused on developing a low-cost wireless infrastructure for monitoring, diagnosing, and controlling building systems and equipment. End users receive information via the Internet and need only a web browser and Internet connection. The system used wireless communications for: (1) collecting data centrally on site from many wireless sensors installed on building equipment, (2) transmitting control signals to actuators and (3) transmitting data to an offsite network operations center where it is processed and made available to clients on the Web (see Figure 1). Although this wireless infrastructure can be applied to any building system, it was tested on two representative applications: (1) monitoring and diagnostics for packaged rooftop HVAC units used widely on small commercial buildings and (2) continuous diagnosis and control of scheduling errors such as lights and equipment left on during unoccupied hours. This project developed a generic infrastructure for performance monitoring, diagnostics, and control, applicable to a broad range of building systems and equipment, but targeted specifically to small to medium commercial buildings (an underserved market segment). The proposed solution is based on two wireless technologies. The first, wireless telemetry, is used for cell phones and paging and is reliable and widely available. The risk associated with this technology proved to be easily managed during the project. The second technology is on-site wireless communication for acquiring data from sensors and transmitting control signals. The technology must enable communication with many nodes, overcome physical obstructions, operate in environments with other electrical equipment, support operation with on-board power (instead of line power) for some applications, operate at low transmission power in license-free radio bands, and be low cost. We proposed wireless mesh networking to meet these needs. 
    This technology is relatively new and has been applied only in research and tests. This proved to be a major challenge for the project and was ultimately abandoned in favor of a directly wired solution for collecting sensor data at the building. The primary reason for this was the relatively short ranges at which we were able to effectively place the sensor nodes from the central receiving unit. Several different mesh technologies were attempted with similar results. Two hardware devices were created during the original performance period of the project. The first device, the WEB-MC, is a master control unit that has two radios, a CPU, and memory, and serves as the central communications device for the WEB-MC System (currently called the 'BEST Wireless HVAC Maintenance System' as a tentative commercial product name). The WEB-MC communicates with the local mesh network system via one of its antennas. Communication with the mesh network enables the WEB-MC to configure the network, send/receive data from individual motes, and serves as the primary mechanism for collecting sensor data at remote locations. The second antenna enables the WEB-MC to connect to a cellular network ('Long-Haul Communications') to transfer data to and from the NorthWrite Network Operations Center (NOC). A third 'all-in-one' hardware solution was created after the project was extended (Phase 2) and additional resources were provided. The project team leveraged a project funded by the State of Washington to develop a hardware solution that integrated the functionality of the original two devices. The primary reason for this approach was to eliminate the mesh network technical difficulties that severely limited the functionality of the original hardware approach. There were five separate software developments required to deliver the functionality needed for this project. 
    These include the Data Server (or Network Operations Center), Web Application, Diagnostic Software, WEB-MC Embedded Software, and Mote Embedded Software. Each of these developments was necessarily dependent on the others. This resulted in a challenging management task, requiring high-bandwidth communication among all the team members. Fortunately, the project team performed exceptionally well together and was able to work through the various challenges that this presented, for example when one software tool required a detailed description of the output of a second tool before that tool had been fully designed.

  17. Architecture and the Web.

    ERIC Educational Resources Information Center

    Money, William H.

    Instructors should be concerned with how to incorporate the World Wide Web into an information systems (IS) curriculum organized across three areas of knowledge: information technology, organizational and management concepts, and theory and development of systems. The Web fits broadly into the information technology component. For the Web to be…

  18. Web-Based Intelligent E-Learning Systems: Technologies and Applications

    ERIC Educational Resources Information Center

    Ma, Zongmin

    2006-01-01

    Collecting and presenting the latest research and development results from the leading researchers in the field of e-learning systems, Web-Based Intelligent E-Learning Systems: Technologies and Applications provides a single record of current research and practical applications in Web-based intelligent e-learning systems. This book includes major…

  19. Web data mining

    NASA Astrophysics Data System (ADS)

    Wibonele, Kasanda J.; Zhang, Yanqing

    2002-03-01

    A web data mining system using granular computing and ASP programming is proposed. This is a web-based application, which allows web users to submit survey data for many different companies. The survey is a collection of questions that will help these companies develop and improve their business and customer service with their clients by analyzing survey data. The web application allows users to submit data from anywhere, and all the survey data is collected into a database for further analysis. An administrator of the web application can log in to the system and view all the data submitted. The web application resides on a web server, and the database resides on the MS SQL server.

  20. Large area sheet task: Advanced Dendritic Web Growth Development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.

    1981-01-01

    A melt level control system was implemented to provide stepless silicon feed rates from zero to rates exactly matching the silicon consumed during web growth. Bench tests of the unit were successfully completed and the system mounted in a web furnace for operational verification. Tests of long term temperature drift correction techniques were made; web width monitoring seems most appropriate for feedback purposes. A system to program the initiation of the web growth cycle was successfully tested. A low cost temperature controller was tested which functions as well as units four times as expensive.

  1. The Namibia Early Flood Warning System, A CEOS Pilot Project

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Sohlberg, Robert; Handy, Matthew; Grossman, Robert

    2012-01-01

    Over the past few years, an international collaboration has developed a pilot project under the auspices of the Committee on Earth Observation Satellites (CEOS) Disasters team. The overall team consists of civilian satellite agencies. For this pilot effort, the development team consists of NASA, the Canadian Space Agency, Univ. of Maryland, Univ. of Colorado, Univ. of Oklahoma, the Ukraine Space Research Institute and the Joint Research Center (JRC) of the European Commission. This development team collaborates with regional, national and international agencies to deliver end-to-end disaster coverage. In particular, the team is collaborating on this effort with the Namibia Department of Hydrology, beginning in Namibia. However, the ultimate goal is to expand the functionality to provide early warning over the southern Africa region. The collaboration was initiated by the United Nations Office of Outer Space Affairs and the CEOS Working Group for Information Systems and Services (WGISS). The initial driver was to demonstrate international interoperability using various space agency sensors and models along with regional in-situ ground sensors. In 2010, the team created a preliminary semi-manual system to demonstrate moving and combining key data streams and delivering the data to the Namibia Department of Hydrology during their flood season, which typically runs January through April. In this pilot, a variety of moderate resolution and high resolution satellite flood imagery was rapidly delivered and used in conjunction with flood predictive models in Namibia. This was collected in conjunction with ground measurements and was used to examine how to create a customized flood early warning system. During the first year, the team made use of SensorWeb technology to gather various sensor data which were used to monitor flood waves traveling down basins originating in Angola, but eventually flooding villages in Namibia. 
    The team made use of standardized interfaces such as those articulated under the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) set of web services [1][2]. However, it was discovered that making a system like this functional raised many performance issues. Data sets were large and located in a variety of locations behind firewalls, and had to be accessed across open networks, so security was an issue. Furthermore, network access acted as a bottleneck to transferring map products to where they were needed. Finally, during disasters, many users and computer processes act in parallel, and thus it was very easy to overload the single string of computers stitched together in the virtual system that was initially developed. To address some of these performance issues, the team partnered with the Open Cloud Consortium (OCC), which supplied a Computation Cloud located at the University of Illinois at Chicago and some manpower to administer this Cloud. The Flood SensorWeb [3] system was interfaced to the Cloud to provide a high performance user interface and product development engine. Figure 1 shows the functional diagram of the Flood SensorWeb. Figure 2 shows some of the functionality of the Computation Cloud that was integrated. A significant portion of the original system was ported to the Cloud, and during the past year technical issues were resolved, including web access to the Cloud, security over the open Internet, initial experiments on handling surge capacity by using the virtual machines in the cloud in parallel, tiling techniques to render large data sets as map layers, interfaces to allow users to customize the data processing/product chain, and other performance enhancements. The conclusion reached from this effort, and this presentation, is that defining the interoperability standards is a small fraction of the work. 
    For example, once open web service standards were defined, many users could not make use of them due to security restrictions. Furthermore, once an interoperable system is functional, a surge of users can render it unusable, especially in the disaster domain.

  2. Sensor system for web inspection

    DOEpatents

    Sleefe, Gerard E.; Rudnick, Thomas J.; Novak, James L.

    2002-01-01

    A system for electrically measuring variations over a flexible web has a capacitive sensor including spaced electrically conductive, transmit and receive electrodes mounted on a flexible substrate. The sensor is held against a flexible web with sufficient force to deflect the path of the web, which moves relative to the sensor.

  3. Visual Based Retrieval Systems and Web Mining--Introduction.

    ERIC Educational Resources Information Center

    Iyengar, S. S.

    2001-01-01

    Briefly discusses Web mining and image retrieval techniques, and then presents a summary of articles in this special issue. Articles focus on Web content mining, artificial neural networks as tools for image retrieval, content-based image retrieval systems, and personalizing the Web browsing experience using media agents. (AEF)

  4. Closing the circle of care: implementation of a web-based communication tool to improve emergency department discharge communication with family physicians.

    PubMed

    Hunchak, Cheryl; Tannenbaum, David; Roberts, Michael; Shah, Thrushar; Tisma, Predrag; Ovens, Howard; Borgundvaag, Bjug

    2015-03-01

    Postdischarge emergency department (ED) communication with family physicians is often suboptimal and negatively impacts patient care. We designed and piloted an online notification system that electronically alerts family physicians of patient ED visits and provides access to visit-specific laboratory and diagnostic information. Nine (of 10 invited) high-referring family physicians participated in this single ED pilot. A prepilot chart audit (30 patients from each family physician) determined the baseline rate of paper-based record transmission. A web-based communication portal was designed and piloted by the nine family physicians over 1 year. Participants provided usability feedback via focus groups and written surveys. Review of 270 patient charts in the prepilot phase revealed a 13% baseline rate of handwritten chart transfer and a 44% rate of any information transfer between the ED and family physician offices following discharge. During the pilot, participating family physicians accrued 880 patient visits. Seven and two family physicians accessed online records for 74% and 12% of visits, respectively, an overall 60.7% of visits, corresponding to an overall absolute increase in receipt of patient ED visit information of 17%. The postpilot survey found that 100% of family physicians reported that they were "often" or "always" aware of patient ED visits, used the portal "always" or "regularly" to access patients' health records online, and felt that the web portal contributed to improved actual and perceived continuity of patient care. Introduction of a web-based ED visit communication tool improved ED-family physician communication. The impact of this system on improved continuity of care, timeliness of follow-up, and reduced duplication of investigations and referrals requires additional study.

  5. Australian Academic Use of the Internet.

    ERIC Educational Resources Information Center

    Applebee, Andrelyn C.; Clayton, Peter; Pascoe, Celina

    1997-01-01

    A study of academic staff at the University of Canberra (Australia) in 1995 determined use of electronic mail, Telnet, file transfer protocol (FTP) software, World Wide Web, library and document delivery services, discussion groups, and student communication. Examined demographic characteristics of faculty (discipline, employment status, gender,…

  6. FACILITATING PUBLIC ACCESS TO GOVERNMENT ENVIRONMENTAL MONITORING DATA: THE LIVING EVERGLADES WEB SITE

    EPA Science Inventory

    The Technology Transfer and Support Division of the USEPA, Office of Research and Development's (ORD) National Risk Management Research Laboratory has developed this handbook, in conjunction with the South Florida Water Management District (SFWMD), to document The Living Everglad...

  7. JPARSS: A Java Parallel Network Package for Grid Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jie; Akers, Walter; Chen, Ying

    2002-03-01

    The emergence of high-speed wide area networks makes grid computing a reality. However, grid applications that need reliable data transfer still have difficulty achieving optimal TCP performance, due to the tuning of TCP window size required to improve bandwidth and reduce latency on a high-speed wide area network. This paper presents a Java package called JPARSS (Java Parallel Secure Stream (Socket)) that divides data into partitions that are sent over several parallel Java streams simultaneously, allowing Java or Web applications to achieve optimal TCP performance in a grid environment without the need to tune TCP window size. This package enables single sign-on, certificate delegation, and secure or plain-text data transfer using several security components based on X.509 certificates and SSL. Several experiments will be presented to show that using Java parallel streams is more effective than tuning TCP window size. In addition, a simple architecture using Web services…
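
    The partition-and-reassemble idea behind JPARSS can be sketched in a few lines. This is an illustrative Python sketch, not the JPARSS API: a thread pool stands in for real parallel sockets, and the function names are assumptions.

```python
# Sketch of parallel-stream transfer: divide the payload into partitions,
# send each over its own "stream" concurrently, and reassemble in order.
from concurrent.futures import ThreadPoolExecutor

def split(data: bytes, n_streams: int) -> list[bytes]:
    """Divide data into at most n_streams roughly equal partitions."""
    size = -(-len(data) // n_streams)  # ceiling division
    return [data[i:i + size] for i in range(0, len(data), size)]

def send(index: int, chunk: bytes) -> tuple[int, bytes]:
    # A real implementation would write `chunk` to socket number `index`;
    # here we simply return it tagged with its partition index.
    return index, chunk

def parallel_transfer(data: bytes, n_streams: int = 4) -> bytes:
    chunks = split(data, n_streams)
    with ThreadPoolExecutor(max_workers=n_streams) as pool:
        results = list(pool.map(send, range(len(chunks)), chunks))
    # Reassemble by partition index so out-of-order completion is safe.
    results.sort(key=lambda pair: pair[0])
    return b"".join(chunk for _, chunk in results)
```

    The receiver only needs the partition index carried with each chunk to restore the original byte order, which is why the streams may complete in any order.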

  8. The Modern Research Data Portal: A Design Pattern for Networked, Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; Dart, Eli; Foster, Ian

    Here we describe best practices for providing convenient, high-speed, secure access to large data via research data portals. We capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance Science DMZs and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.
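
    The pattern's central move is that the portal's control logic only issues a small transfer task description, while the bulk data flows directly between data transfer nodes in a Science DMZ and never passes through the web server. A minimal sketch of such a control-plane document follows; all names and fields here are illustrative assumptions, not the Globus API (see https://docs.globus.org/mrdp for the real skeletons).

```python
# Sketch of control/data separation: the portal builds a tiny task document
# and hands it to a transfer service; the portal never touches the bytes.
def make_transfer_task(source_endpoint: str, dest_endpoint: str,
                       items: list[tuple[str, str]], label: str) -> dict:
    """Build the control-plane document describing a bulk transfer."""
    return {
        "label": label,
        "source": source_endpoint,
        "destination": dest_endpoint,
        "items": [{"source_path": s, "dest_path": d} for s, d in items],
    }

task = make_transfer_task(
    "dtn.facility.example",        # hypothetical DTN at the facility
    "dtn.university.example",      # hypothetical DTN at the user's site
    [("/data/run42/out.h5", "/home/alice/out.h5")],
    label="run42 results",
)
```

    Because the task document is small, the portal itself can run on modest cloud infrastructure while the DTNs sustain high-throughput transfers.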

  9. The Modern Research Data Portal: a design pattern for networked, data-intensive science

    DOE PAGES

    Chard, Kyle; Dart, Eli; Foster, Ian; ...

    2018-01-15

    We describe best practices for providing convenient, high-speed, secure access to large data via research data portals. Here, we capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance data enclaves and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.

  10. The Modern Research Data Portal: a design pattern for networked, data-intensive science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; Dart, Eli; Foster, Ian

    We describe best practices for providing convenient, high-speed, secure access to large data via research data portals. Here, we capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance data enclaves and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.

  11. The energetics of fish growth and how it constrains food-web trophic structure.

    PubMed

    Barneche, Diego R; Allen, Andrew P

    2018-06-01

    The allocation of metabolic energy to growth fundamentally influences all levels of biological organisation. Here we use a first-principles theoretical model to characterise the energetics of fish growth at distinct ontogenetic stages and in distinct thermal regimes. Empirically, we show that the mass scaling of growth rates follows that of metabolic rate, and is somewhat steeper at earlier ontogenetic stages. We also demonstrate that the cost of growth, E_m, varies substantially among fishes, and that it may increase with temperature, trophic level and level of activity. Theoretically, we show that E_m is a primary determinant of the efficiency of energy transfer across trophic levels, and that energy is transferred more efficiently between trophic levels if the prey are young and sedentary. Overall, our study demonstrates the importance of characterising the energetics of individual growth in order to understand constraints on the structure of food webs and ecosystems. © 2018 John Wiley & Sons Ltd/CNRS.

  12. Touring DNS Open Houses for Trends and Configurations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalafut, Prof. Andrew; Shue, Craig A; Gupta, Prof. Minaxi

    2011-01-01

    DNS is a critical component of the Internet. It maps domain names to IP addresses and serves as a distributed database for various other applications, including mail, Web, and spam filtering. This paper examines DNS zones in the Internet for diversity, adoption rates of new technologies, and prevalence of configuration issues. To gather data, we sweep 60% of the Internet's domains in June-August 2007 for zone transfers; 6.6% of them allow us to transfer their complete information. Surprisingly, this includes a large fraction of the domains deploying DNSSEC. We find that DNS zones vary significantly in size and some span many ASes. Also, while anti-spam technologies appear to be getting deployed, the adoption rates of DNSSEC and IPv6 continue to be low. Finally, we also find that carelessness in handling DNS records can lead to reduced availability of name servers, email, and Web servers. This also undermines anti-spam efforts and the efforts to shut down phishing sites or to contain malware infections.

  13. MAGI: a Node.js web service for fast microRNA-Seq analysis in a GPU infrastructure.

    PubMed

    Kim, Jihoon; Levy, Eric; Ferbrache, Alex; Stepanowsky, Petra; Farcas, Claudiu; Wang, Shuang; Brunner, Stefan; Bath, Tyler; Wu, Yuan; Ohno-Machado, Lucila

    2014-10-01

    MAGI is a web service for fast microRNA-Seq data analysis in a graphics processing unit (GPU) infrastructure. Using just a browser, users have access to results as web reports in just a few hours (a >600% end-to-end performance improvement over the state of the art). MAGI's salient features are (i) transfer of large input files in native FASTA with Qualities (FASTQ) format through drag-and-drop operations, (ii) rapid prediction of microRNA target genes leveraging parallel computing with GPU devices, (iii) all-in-one analytics with novel feature extraction, statistical tests for differential expression and diagnostic plot generation for quality control and (iv) interactive visualization and exploration of results in web reports that are readily available for publication. MAGI relies on the Node.js JavaScript framework, along with NVIDIA CUDA C, PHP: Hypertext Preprocessor (PHP), Perl and R. It is freely available at http://magi.ucsd.edu. © The Author 2014. Published by Oxford University Press.

  14. Development of a Web-based financial application System

    NASA Astrophysics Data System (ADS)

    Hasan, M. R.; Ibrahimy, M. I.; Motakabber, S. M. A.; Ferdaus, M. M.; Khan, M. N. H.; Mostafa, M. G.

    2013-12-01

    The paper describes a technique for developing a web-based financial system that follows current technology and business needs. In the development of a web-based application, both user friendliness and technology are important. The ASP.NET MVC 4 platform and SQL Server 2008 were used to develop the web-based financial system. The resulting data entry and report monitoring features of the application are user friendly. The paper also highlights critical situations encountered during development, which can help others develop a quality product.

  15. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems for Web and Mobile Devices

    DTIC Science & Technology

    2016-02-22

    SPONSORED REPORT SERIES: Achieving Better Buying Power through Acquisition of Open Architecture Software Systems for Web and Mobile Devices. Acquisition Research Program, Naval Postgraduate School. Executive Summary: Many people within large enterprises rely on up to four Web-based or mobile devices for their…

  16. Synoptic reporting in tumor pathology: advantages of a web-based system.

    PubMed

    Qu, Zhenhong; Ninan, Shibu; Almosa, Ahmed; Chang, K G; Kuruvilla, Supriya; Nguyen, Nghia

    2007-06-01

    The American College of Surgeons Commission on Cancer (ACS-CoC) mandates that pathology reports at ACS-CoC-approved cancer programs include all scientifically validated data elements for each site and tumor specimen. The College of American Pathologists (CAP) has produced cancer checklists in static text formats to assist reporting. To be inclusive, the CAP checklists are pages long, requiring extensive text editing and multiple intermediate steps. We created a set of dynamic tumor-reporting templates, using Microsoft Active Server Page (ASP.NET), with drop-down list and data-compile features, and added a reminder function to indicate missing information. Users can access this system on the Internet, prepare the tumor report by selecting relevant data from drop-down lists with an embedded tumor staging scheme, and directly transfer the final report into a laboratory information system by using the copy-and-paste function. By minimizing extensive text editing and eliminating intermediate steps, this system can reduce reporting errors, improve work efficiency, and increase compliance.
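
    The reminder function described above, which flags required data elements still missing before a synoptic report is finalized, reduces to a simple completeness check. A minimal sketch follows; the field names are illustrative assumptions, not the actual CAP checklist.

```python
# Sketch of a missing-element reminder for a synoptic tumor report:
# flag required checklist elements that are absent or left blank.
REQUIRED_ELEMENTS = ["histologic_type", "grade", "margins", "pT", "pN"]

def missing_elements(report: dict) -> list[str]:
    """Return required checklist elements absent or empty in the draft."""
    return [e for e in REQUIRED_ELEMENTS if not report.get(e)]

draft = {"histologic_type": "adenocarcinoma", "grade": "G2", "pT": "pT2"}
# missing_elements(draft) -> ["margins", "pN"]
```

    Running the check on every save, rather than at sign-out, is what lets such a system catch omissions before the report reaches the laboratory information system.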

  17. Effective 3-D surface modeling for geographic information systems

    NASA Astrophysics Data System (ADS)

    Yüksek, K.; Alparslan, M.; Mendi, E.

    2013-11-01

    In this work, we propose a dynamic, flexible and interactive urban digital terrain platform (DTP) with spatial data and query processing capabilities of Geographic Information Systems (GIS), multimedia database functionality and graphical modeling infrastructure. A new data element, called Geo-Node, which stores image, spatial data and 3-D CAD objects is developed using an efficient data structure. The system effectively handles data transfer of Geo-Nodes between main memory and secondary storage with an optimized Directional Replacement Policy (DRP) based buffer management scheme. Polyhedron structures are used in Digital Surface Modeling (DSM) and smoothing process is performed by interpolation. The experimental results show that our framework achieves high performance and works effectively with urban scenes independent from the amount of spatial data and image size. The proposed platform may contribute to the development of various applications such as Web GIS systems based on 3-D graphics standards (e.g. X3-D and VRML) and services which integrate multi-dimensional spatial information and satellite/aerial imagery.
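
    The buffer management scheme above caches Geo-Nodes in main memory and evicts under a replacement policy when the buffer is full. The paper's Directional Replacement Policy is not specified here, so plain LRU stands in as an illustrative eviction policy; the class and method names are assumptions.

```python
# Sketch of a Geo-Node buffer manager: cache nodes in memory, fetch from
# secondary storage on a miss, evict under a replacement policy when full.
from collections import OrderedDict

class GeoNodeBuffer:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._cache: OrderedDict[int, bytes] = OrderedDict()

    def get(self, node_id: int, load_from_disk) -> bytes:
        if node_id in self._cache:
            self._cache.move_to_end(node_id)   # mark most recently used
            return self._cache[node_id]
        data = load_from_disk(node_id)          # secondary-storage fetch
        self._cache[node_id] = data
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)     # evict least recently used
        return data
```

    A directional policy would replace the eviction line with one that prefers victims behind the viewer's current direction of movement through the terrain; the surrounding cache machinery stays the same.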

  18. Effective 3-D surface modeling for geographic information systems

    NASA Astrophysics Data System (ADS)

    Yüksek, K.; Alparslan, M.; Mendi, E.

    2016-01-01

    In this work, we propose a dynamic, flexible and interactive urban digital terrain platform with spatial data and query processing capabilities of geographic information systems, multimedia database functionality and graphical modeling infrastructure. A new data element, called Geo-Node, which stores image, spatial data and 3-D CAD objects is developed using an efficient data structure. The system effectively handles data transfer of Geo-Nodes between main memory and secondary storage with an optimized directional replacement policy (DRP) based buffer management scheme. Polyhedron structures are used in digital surface modeling and smoothing process is performed by interpolation. The experimental results show that our framework achieves high performance and works effectively with urban scenes independent from the amount of spatial data and image size. The proposed platform may contribute to the development of various applications such as Web GIS systems based on 3-D graphics standards (e.g., X3-D and VRML) and services which integrate multi-dimensional spatial information and satellite/aerial imagery.

  19. Mobile Cloud Computing with SOAP and REST Web Services

    NASA Astrophysics Data System (ADS)

    Ali, Mushtaq; Fadli Zolkipli, Mohamad; Mohamad Zain, Jasni; Anwar, Shahid

    2018-05-01

    Mobile computing in conjunction with mobile web services offers a strong approach to tackling the limitations of mobile devices. Mobile web services are based on two technologies, SOAP and REST, which work with existing protocols to develop web services. Each approach has its own distinct features, but given the resource constraints of mobile devices, the better of the two is the one that minimizes computation and transmission overhead while offloading. Transferring load from a mobile device to remote servers for execution is called computational offloading. Numerous approaches implement computational offloading as a viable solution for alleviating the resource constraints of mobile devices, yet a dynamic method of computational offloading is always required for smooth and simple migration of complex tasks. The intention of this work is to present a distinctive approach that does not engage mobile resources for long periods. We use web services to delegate computationally intensive tasks for remote execution, and we tested both the SOAP and REST web services approaches for mobile computing. Two parameters were considered in our lab experiments: execution time and energy consumption. The results show that RESTful web service execution is far better than executing the same application via SOAP web services, in terms of both execution time and energy consumption. In experiments with the developed prototype matrix multiplication app, REST execution time was about 200% better than the SOAP approach, and REST energy consumption was about 250% better.
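
    One concrete reason REST offloading can cost less to transmit is envelope overhead: the same request encoded as a SOAP envelope versus a plain JSON body. The sketch below uses a generic SOAP 1.1 envelope shape and made-up request fields, not the paper's actual service.

```python
# Compare the wire size of one offloading request in SOAP vs. REST/JSON.
import json

# Hypothetical parameters for a matrix-multiplication offload request.
args = {"rows_a": 64, "cols_a": 64, "cols_b": 64}

soap = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Body><Multiply>"
    + "".join(f"<{k}>{v}</{k}>" for k, v in args.items())
    + "</Multiply></soap:Body></soap:Envelope>"
)
rest = json.dumps(args)
# len(soap) is several times len(rest): the envelope and namespace
# declarations are a fixed overhead paid on every request.
```

    For a battery-powered device making many small offloading calls, this fixed per-request overhead compounds into measurable transmission and energy cost, consistent with the direction of the paper's results.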

  20. A novel architecture for information retrieval system based on semantic web

    NASA Astrophysics Data System (ADS)

    Zhang, Hui

    2011-12-01

    Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so the web faces a new challenge of information overload. The challenge now before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats; they are suitable for presentation, but machines cannot understand their meaning. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, providing new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that when such a system lacks sufficient knowledge, it returns a large number of meaningless results to users because of the huge volume of candidate information. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
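
    The routing step described above can be sketched very simply: an inference check decides whether the knowledge base covers the query; if not, the query falls back to the keyword engine rather than producing a flood of meaningless semantic results. The tiny concept set and engine names below are illustrative assumptions.

```python
# Sketch of query routing between a semantic engine and a keyword engine,
# gated on whether the ontology knows any of the query's terms.
KNOWN_CONCEPTS = {"protein", "gene", "enzyme"}

def route_query(query: str) -> str:
    terms = set(query.lower().split())
    if terms & KNOWN_CONCEPTS:
        return "semantic-engine"   # enough ontology coverage to reason
    return "keyword-engine"        # fall back to plain keyword search
```

    A real inference engine would consult the ontology's class hierarchy rather than a flat term set, but the control flow, reason when you can, fall back when you cannot, is the same.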

  1. Effects of User and System Characteristics on Perceived Usefulness and Perceived Ease of Use of the Web-Based Classroom Response System

    ERIC Educational Resources Information Center

    Ke, Chih-Horng; Sun, Huey-Min; Yang, Yuan-Chi

    2012-01-01

    This study explores the effect of user and system characteristics on our proposed web-based classroom response system (CRS) by a longitudinal design. The results of research are expected to understand the important factors of user and system characteristics in the web-based CRS. The proposed system can supply interactive teaching contents,…

  2. Integrating digital educational content created and stored within disparate software environments: an extensible markup language (XML) solution in real-world use.

    PubMed

    Frank, M S; Schultz, T; Dreyer, K

    2001-06-01

    To provide a standardized and scalable mechanism for exchanging digital radiologic educational content between software systems that use disparate authoring, storage, and presentation technologies. Our institution uses two distinct software systems for creating educational content for radiology. Each system is used to create in-house educational content as well as commercial educational products. One system is an authoring and viewing application that facilitates the input and storage of hierarchical knowledge and associated imagery, and is capable of supporting a variety of entity relationships. This system is primarily used for the production and subsequent viewing of educational CD-ROMs. Another software system is primarily used for radiologic education on the world wide web. This system facilitates input and storage of interactive knowledge and associated imagery, delivering this content over the internet in a Socratic manner simulating in-person interaction with an expert. A subset of knowledge entities common to both systems was derived. An additional subset of knowledge entities that could be bidirectionally mapped via algorithmic transforms was also derived. An extensible markup language (XML) object model and associated lexicon were then created to represent these knowledge entities and their interactive behaviors. Forward-looking attention was exercised in the creation of the object model in order to facilitate straightforward future integration of other sources of educational content. XML generators and interpreters were written for both systems. Deriving the XML object model and lexicon was the most critical and time-consuming aspect of the project. The coding of the XML generators and interpreters required only a few hours for each environment. Subsequently, the transfer of hundreds of educational cases and thematic presentations between the systems can now be accomplished in a matter of minutes.
The use of algorithmic transforms results in nearly 100% transfer of context as well as content, thus providing "presentation-ready" outcomes. The automation of knowledge exchange between dissimilar digital teaching environments magnifies the efforts of educators and enriches the learning experience for participants. XML is a powerful and useful mechanism for transferring educational content, as well as the context and interactive behaviors of such content, between disparate systems.
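
    A generator/interpreter pair of the kind described above is essentially a round-trip between an in-memory knowledge entity and its XML form. The element names below are illustrative, not the authors' lexicon.

```python
# Minimal XML generator and interpreter for a teaching-case entity:
# serialize to XML, then parse back to the same structure.
import xml.etree.ElementTree as ET

def to_xml(case: dict) -> str:
    root = ET.Element("teachingCase", id=case["id"])
    ET.SubElement(root, "diagnosis").text = case["diagnosis"]
    for img in case["images"]:
        ET.SubElement(root, "image", href=img)
    return ET.tostring(root, encoding="unicode")

def from_xml(xml: str) -> dict:
    root = ET.fromstring(xml)
    return {
        "id": root.get("id"),
        "diagnosis": root.findtext("diagnosis"),
        "images": [img.get("href") for img in root.findall("image")],
    }

case = {"id": "c17", "diagnosis": "pneumothorax", "images": ["cxr1.png"]}
# from_xml(to_xml(case)) == case, i.e. the transform is lossless.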

  3. Microbial loop contribution to exergy in the sediments of the Marsala lagoon (Italy)

    NASA Astrophysics Data System (ADS)

    Pusceddu, A.; Danovaro, R.

    2003-04-01

    Recent advances in ecological modelling have stressed the need for new descriptors of ecosystem health, able to consider the actual transfer of energy through food webs, including also the potential transfer/loss of (genetic) information. In ecological terms, exergy is defined as a goal function which, as the sum of energy (biomass) and (genetic) information contained in a given system due to living organisms, acts as a quality indicator of ecosystems. Biopolymeric organic carbon (BPC) quantity and biochemical composition, bacteria, heterotrophic nanoflagellate and meiofauna abundance, biomass and exergy contents were investigated, on a seasonal basis, in the Marsala lagoon (Mediterranean Sea), at two stations characterized by contrasting hydrodynamic conditions. Carbohydrate (2.8 mg g-1), protein (1.6 mg g-1) and lipid (0.86 mg g-1) contents were extremely high, with values at the more exposed station about 3 times lower than those at the sheltered one. BPC (on average 2.5 mg C g-1), dominated by carbohydrates (50%), was mostly refractory and largely unaccounted for by primary organic matter (4% of BPC), indicating that the Marsala lagoon sediments act as a "detritus sink". At both stations, bacterial (on average 0.3 mg C g-1) and heterotrophic nanoflagellate (9.8 μg C g-1) biomass values were rather high, whereas meiofauna biomass was extremely low (on average 7.2 μg C cm-2). The exergy transfer along the benthic microbial loop components in the Marsala lagoon appeared largely bottlenecked by the refractory composition of organic detritus. In the more exposed station, the exergy transfer towards the higher trophic levels was more efficient than in the sheltered one. Although total exergy values were significantly higher in summer than in winter, at both stations the exergy transfer in winter was more efficient than in summer. 
Our results indicate that, in 'detritus sink' systems, auxiliary energy (e.g., wind-induced sediment resuspension) might be of paramount importance for increasing efficiency of organic detritus channeling to higher trophic levels.

  4. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows

    PubMed Central

    Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José

    2015-01-01

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for automatic composition of Web services, based on QoS parameters that are measured at execution time. The AWSCS is a system to implement different approaches for automatic composition of Web services and also to execute the resulting flows from these approaches. Aiming at demonstrating the results of this paper, a scenario was developed, where empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services, when flows with the same functionality but different problem-solving strategies were compared. Furthermore, we observed that the load applied to the running system, as well as the type of load submitted, is an important factor in defining which approach to Web service composition can achieve the best performance in production. PMID:26068216

  5. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows.

    PubMed

    Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José

    2015-01-01

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for automatic composition of Web services, based on QoS parameters that are measured at execution time. The AWSCS is a system to implement different approaches for automatic composition of Web services and also to execute the resulting flows from these approaches. Aiming at demonstrating the results of this paper, a scenario was developed, where empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services, when flows with the same functionality but different problem-solving strategies were compared. Furthermore, we observed that the load applied to the running system, as well as the type of load submitted, is an important factor in defining which approach to Web service composition can achieve the best performance in production.

  6. The architecture of the management system of complex steganographic information

    NASA Astrophysics Data System (ADS)

    Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.

    2017-01-01

    The aim of the study is to create a wide area information system that allows one to control processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the algorithmic core of the system, classic methods of steganography are used to embed information, while methods of mathematical statistics and computational intelligence are used to identify the embedded information. The main result of the paper is the development of the architecture of the management system of complex steganographic information. The suggested architecture utilizes cloud technology in order to provide service via a web service over the Internet. It is meant to support the processing of multimedia data streams with many sources of different types. The information system, built in accordance with the proposed architecture, will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; prevention of information leakage caused by insiders.
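
    The "classic methods of steganography" the system manages include least-significant-bit embedding. The standalone sketch below shows that textbook method, not the paper's implementation: message bits replace the LSB of successive carrier bytes, changing each by at most one.

```python
# Classic LSB steganography: hide message bits in the least significant
# bit of successive carrier bytes, then recover them.
def embed(carrier: bytearray, message: bytes) -> bytearray:
    """Hide message bits (LSB-first per byte) in the carrier's LSBs."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    assert len(bits) <= len(carrier), "carrier too small for message"
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # clear LSB, then set message bit
    return out

def extract(carrier: bytearray, n_bytes: int) -> bytes:
    """Recover n_bytes of hidden message from the carrier's LSBs."""
    bits = [b & 1 for b in carrier[: n_bytes * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(n_bytes)
    )
```

    In a managed system such as the one described, this embedding step would sit behind the generation/embedding services, with the statistical detection components operating on the opposite side of the same transform.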

  7. SCIATRAN 3.1: A new radiative transfer model and retrieval package

    NASA Astrophysics Data System (ADS)

    Rozanov, Alexei; Rozanov, Vladimir; Kokhanovsky, Alexander; Burrows, John P.

    The SCIATRAN 3.1 package is a result of further development of the SCIATRAN 2.X software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. After the implementation of the vector radiative transfer model in SCIATRAN 3.0, the spectral range covered by the model has been extended into the thermal infrared, reaching approximately 40 micrometers. Another major improvement concerns the treatment of underlying surface effects. Among others, a sophisticated representation of the water surface with a bidirectional reflection distribution function (BRDF) has been implemented, accounting for the Fresnel reflection of polarized light and for the effect of foam. A newly developed representation for a snow surface allows radiative transfer calculations to be performed within an unpolluted or soiled snow layer. Furthermore, a new approach has been implemented allowing radiative transfer calculations to be performed for a coupled atmosphere-ocean system. This means that the underlying ocean is no longer considered a purely reflecting surface. Instead, full radiative transfer calculations are performed within the water, allowing the user to simulate the radiance within both the atmosphere and the ocean. Similar to previous versions, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR-TIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer location within or outside the Earth's atmosphere, including underwater observations. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new features of the radiative transfer model, is given, including remarks on the availability for the scientific community. 
Furthermore, some application examples of the radiative transfer model are shown.

  8. Mercury bioaccumulation along food webs in temperate aquatic ecosystems colonized by aquatic macrophytes in south western France.

    PubMed

    Gentès, Sophie; Maury-Brachet, Régine; Guyoneaud, Rémy; Monperrus, Mathilde; André, Jean-Marc; Davail, Stéphane; Legeay, Alexia

    2013-05-01

    Mercury (Hg) is considered an important pollutant for aquatic systems as its organic form, methylmercury (MeHg), is easily bioaccumulated and biomagnified along food webs. In various ecosystems, aquatic periphyton associated with macrophytes was identified as an important site of Hg storage and methylation by microorganisms. Our study concerns temperate aquatic ecosystems (South Western France) colonized by invasive macrophytes and characterized by high mercury methylation potentials. This work establishes original data concerning Hg bioaccumulation in organisms (plants, crustaceans, molluscs and fish) from five contrasting ecosystems. For low trophic level species, total Hg (THg) concentrations were low (from 27 ± 2 ng THg g(-1) dw in the Asiatic clam Corbicula fluminea to 418 ± 114 ng THg g(-1) dw in the crayfish Procambarus clarkii). THg concentrations in some carnivorous fish (high trophic level) were close to or exceeded the International Marketing Level (IML), with values ranging from 1049 ± 220 ng THg g(-1) dw in pike perch muscle (Sander lucioperca) to 3910 ± 1307 ng THg g(-1) dw in eel muscle (Anguilla anguilla). Trophic levels for the individuals were also evaluated through stable isotope analysis, and linked to Hg concentrations of organisms. A significant Hg biomagnification (r(2) = 0.9) was observed in the Aureilhan lake, despite the absence of top predator fish. For this site, Ludwigia sp. periphyton, as an entry point of Hg into food webs, is a serious hypothesis which remains to be confirmed. This study provides a first investigation of Hg transfer in the ecosystems of south western France and allows the assessment of the risk associated with the presence of Hg in aquatic food webs. Copyright © 2013 Elsevier Inc. All rights reserved.
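
    Biomagnification relationships like the r(2) = 0.9 regression above are commonly quantified as the slope of log10(Hg concentration) against trophic level. The sketch below computes that slope by ordinary least squares on made-up illustrative numbers, not the study's data.

```python
# Trophic magnification slope: least-squares slope of log10(Hg) vs. trophic
# level; a positive slope indicates biomagnification up the food web.
import math

def tms(trophic_levels: list[float], hg_ng_per_g: list[float]) -> float:
    logs = [math.log10(c) for c in hg_ng_per_g]
    n = len(logs)
    mean_x = sum(trophic_levels) / n
    mean_y = sum(logs) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(trophic_levels, logs))
    den = sum((x - mean_x) ** 2 for x in trophic_levels)
    return num / den

# A slope of 1.0 would mean a tenfold Hg increase per trophic level:
# tms([2.0, 3.0, 4.0], [30.0, 300.0, 3000.0]) -> 1.0
```

    Stable isotope analysis supplies the trophic-level axis in studies like this one, which is why Hg concentrations and isotope ratios are measured on the same individuals.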

  9. Tracking the flow of bacterially derived 13C and 15N through soil faunal feeding channels.

    PubMed

    Crotty, F V; Blackshaw, R P; Murray, P J

    2011-06-15

    The soil food web has been referred to as a 'black box', a 'poor man's tropical rainforest' and an 'enigma', owing to its opacity, its diversity and the limited insight into feeding specificity. Here we investigate the flow of C and N through the soil food web as a way to understand the feeding interactions occurring within it. A bacterium, Pseudomonas lurida, enriched to 99 atom% in ¹³C and ¹⁵N, was introduced to soil cores from two different habitats, a grassland and a woodland with the same soil type, to trace the flow of bacterial C and N through the soil food web. Throughout the experiment the soil remained enriched in ¹³C and ¹⁵N. Almost all the invertebrates tested gained C and N enrichment indicative of the labelled bacteria, implying that bacterial feeding is a common mechanism within the soil. Only three groups were significantly enriched in both ¹³C and ¹⁵N in both habitats: Collembola (Entomobryomorpha), Acari (Oribatida) and Nematoda, indicating that these organisms consume the most bacteria within both systems. When the invertebrates were grouped into hypothesised trophic levels, those considered secondary decomposers gained the most enrichment across all invertebrates tested. This enrichment was also high in the micro-predators within the soil, implying that their main food source was the secondary decomposers, particularly the Collembola. Using an enriched bacterium to track trophic transfer between organisms within the soil food web is a novel way of empirically demonstrating interactions that normally cannot be seen. Copyright © 2011 John Wiley & Sons, Ltd.
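
    The fraction of an animal's C or N derived from the labelled bacterium is typically estimated with a two-source linear mixing model; a minimal sketch, with purely hypothetical atom% values:

```python
# Two-source linear mixing: fraction of a consumer's carbon derived from the
# labelled bacterial source. All atom% 13C values below are hypothetical and
# are not the study's data.
def source_fraction(consumer, background, source):
    """Fraction of the tracer source assimilated by the consumer."""
    return (consumer - background) / (source - background)

f = source_fraction(consumer=10.0, background=1.08, source=99.0)
print(f"bacterial C fraction ~ {f:.3f}")
```
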

  10. The MetabolomeExpress Project: enabling web-based processing, analysis and transparent dissemination of GC/MS metabolomics datasets.

    PubMed

    Carroll, Adam J; Badger, Murray R; Harvey Millar, A

    2010-07-14

    Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to produce rapid, community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas chromatography/mass spectrometry (GC/MS) makes it an obvious target for such development. While a number of web tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems is currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g. metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-tests, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress (https://www.metabolome-express.org) provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
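
    A minimal sketch of the kind of re-analysis such a tool offers, here Welch's t statistic computed over two hypothetical sets of normalized metabolite responses (pure stdlib; no p-value, which would require the t distribution):

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Hypothetical normalized metabolite responses, control vs. treatment
control = [1.0, 1.2, 1.1, 0.9, 1.3]
treated = [2.0, 2.2, 1.9, 2.1, 2.3]
t = welch_t(control, treated)
print(f"t = {t:.2f}")
```
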

  11. Assessment of Web-Based Authentication Methods in the U.S.: Comparing E-Learning Systems to Internet Healthcare Information Systems

    ERIC Educational Resources Information Center

    Mattord, Herbert J.

    2012-01-01

    Organizations continue to rely on password-based authentication methods to control access to many Web-based systems. This research study developed a benchmarking instrument intended to assess authentication methods used in Web-based information systems (IS). It developed an Authentication Method System Index (AMSI) to analyze collected data from…

  12. 78 FR 49480 - Proposed Information Collection; Comment Request; NTIA/FCC Web-based Frequency Coordination System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... Information Collection; Comment Request; NTIA/FCC Web- based Frequency Coordination System AGENCY: National... INFORMATION: I. Abstract The National Telecommunications and Information Administration (NTIA) hosts a web... (RF) bands that are shared on a co-primary basis by federal and non-federal users. The web-based...

  13. 75 FR 29307 - Web Based Supply Chain Management Commodity Offer Form, Paperwork Collection Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-25

    ... DEPARTMENT OF AGRICULTURE Agricultural Marketing Service [Doc. No FV10-CP-01, AMS-FV-10-0041] Web... collection request is required for the implementation of a new system named Web Based Supply Chain Management...-2782. Mail: David Tuckwiller, Project Manager, Web Based Supply Chain Management System, Agricultural...

  14. Information Diversity in Web Search

    ERIC Educational Resources Information Center

    Liu, Jiahui

    2009-01-01

    The web is a rich and diverse information source with incredible amounts of information about all kinds of subjects in various forms. This information source affords great opportunity to build systems that support users in their work and everyday lives. To help users explore information on the web, web search systems should find information that…

  15. Access and privacy rights using web security standards to increase patient empowerment.

    PubMed

    Falcão-Reis, Filipa; Costa-Pereira, Altamiro; Correia, Manuel E

    2008-01-01

    Electronic Health Record (EHR) systems are becoming more and more sophisticated and nowadays include numerous applications, which are accessed not only by medical professionals but also by accounting and administrative personnel. This can pose problems for basic rights such as privacy and confidentiality. The principles, guidelines and recommendations compiled by the OECD for the protection of privacy and trans-border flows of personal data are described and considered within health information system development. Granting access to an EHR should depend on the owner of the record, the patient: he must be entitled to define who is allowed to access his EHRs, beyond the access control scheme each health organization may have implemented. In this way, it is not only up to health professionals to decide who has access to what, but to the patient himself. Implementing such a policy is a step towards patient empowerment, which society should encourage and governments should promote. The paper then introduces a technical solution based on web security standards. This would give patients the ability to monitor and control which entities have access to their personal EHRs, empowering them with the knowledge of how much of their medical history is known and by whom. It is necessary to create standard data access protocols, mechanisms and policies to protect privacy rights and, furthermore, to enable patients to automatically track the movement (flow) of their personal data and information in the context of health information systems. This solution must be functional and, above all, user-friendly, and the interface should take into consideration usability heuristics in order to provide the user with the best tools. The current official standards on confidentiality and privacy in health care, currently being developed within the EU, are explained in order to arrive at a consensual view of the guidelines all member states should follow to transfer such principles into national laws. A perspective is given on the state of the art concerning web security standards, which can be used to readily engineer health information systems that comply with these patient-empowering goals. In conclusion, health systems with the characteristics thus described are technically feasible and should be generally implemented and deployed.
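
    The layered policy described, organizational access control plus a patient-defined grant list, can be sketched as below; the user names and roles are hypothetical, and a real system would draw the grants from a signed, auditable policy store.

```python
# Minimal sketch of layered access control: the organization's role-based
# rules AND the patient's own grant list must both allow access.
ORG_ROLES = {"dr_smith": "physician", "clerk_01": "accounting"}
CLINICAL_ROLES = {"physician", "nurse"}

def patient_grants():
    # In a real system this would come from the patient's signed policy record
    return {"dr_smith"}

def may_read_ehr(user, grants):
    role_ok = ORG_ROLES.get(user) in CLINICAL_ROLES  # organizational layer
    patient_ok = user in grants                      # patient-defined layer
    return role_ok and patient_ok

grants = patient_grants()
print(may_read_ehr("dr_smith", grants))   # clinical role AND patient grant
print(may_read_ehr("clerk_01", grants))   # administrative role, no grant
```
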

  16. Influence of ocean acidification on plankton community structure during a winter-to-summer succession: An imaging approach indicates that copepods can benefit from elevated CO2 via indirect food web effects

    PubMed Central

    Taucher, Jan; Haunost, Mathias; Boxhammer, Tim; Bach, Lennart T.; Algueró-Muñiz, María; Riebesell, Ulf

    2017-01-01

    Plankton communities play a key role in the marine food web and are expected to be highly sensitive to ongoing environmental change. Oceanic uptake of anthropogenic carbon dioxide (CO2) causes pronounced shifts in marine carbonate chemistry and a decrease in seawater pH. These changes, summarized by the term ocean acidification (OA), can significantly affect the physiology of planktonic organisms. However, studies on the response of entire plankton communities to OA, which also include indirect effects via food-web interactions, are still relatively rare. Thus, it is presently unclear how OA could affect the functioning of entire ecosystems and biogeochemical element cycles. In this study, we report on a long-term in situ mesocosm experiment in which we investigated the response of natural plankton communities in temperate waters (Gullmarfjord, Sweden) to elevated CO2 concentrations and OA as expected for the end of the century (~760 μatm pCO2). Based on a plankton-imaging approach, we examined the size structure, community composition and food-web characteristics of the whole plankton assemblage, ranging from picoplankton to mesozooplankton, during an entire winter-to-summer succession. The plankton imaging system revealed pronounced temporal changes in the size structure of the copepod community over the course of the plankton bloom. The observed shift towards smaller individuals resulted in an overall 25% decrease in copepod biomass, despite increasing numerical abundances. Furthermore, we observed distinct effects of elevated CO2 on the biomass and size structure of the entire plankton community. Notably, the biomass of copepods, dominated by Pseudocalanus acuspes, displayed a tendency towards elevation by up to 30–40% under simulated ocean acidification. This effect was significant for certain copepod size classes and was most likely driven by CO2-stimulated responses of primary producers and a complex interplay of trophic interactions that allowed this CO2 effect to propagate up the food web. Such OA-induced shifts in plankton community structure could have far-reaching consequences for food-web interactions, biomass transfer to higher trophic levels and biogeochemical cycling of marine ecosystems. PMID:28178268

  17. Web information retrieval for health professionals.

    PubMed

    Ting, S L; See-To, Eric W K; Tse, Y K

    2013-06-01

    This paper presents a Web Information Retrieval System (WebIRS), designed to assist healthcare professionals in obtaining up-to-date medical knowledge and information via the World Wide Web (WWW). The system leverages document classification and text summarization techniques to deliver highly correlated medical information to physicians. The system architecture of the proposed WebIRS is first discussed, and then a case study on an application of the proposed system in a Hong Kong medical organization is presented to illustrate the adoption process; a questionnaire was administered to collect feedback on the operation and performance of WebIRS in comparison with conventional information retrieval on the WWW. A prototype system has been constructed and implemented on a trial basis in a medical organization. It has proven to be of benefit to healthcare professionals through its automatic classification and summarization of the medical information that physicians need and are interested in. The results of the case study show that, with the proposed WebIRS, a significant reduction in searching time and effort can be attained, with retrieval of highly relevant materials.
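
    A toy stand-in for the summarization step, scoring sentences by corpus word frequency and keeping the top-scoring one, illustrates the extractive approach; it is not the system's actual algorithm:

```python
import re
from collections import Counter

# Extractive summarization sketch: score each sentence by the document-wide
# frequency of its words and return the top-scoring sentences.
def summarize(text, n_sentences=1):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z]+", text.lower()))
    def score(s):
        return sum(freq[w] for w in re.findall(r"[a-z]+", s.lower()))
    return sorted(sentences, key=score, reverse=True)[:n_sentences]

doc = ("Aspirin reduces fever. Aspirin also reduces inflammation. "
       "The weather was pleasant.")
print(summarize(doc))
```
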

  18. Protecting clinical data on Web client computers: the PCASSO approach.

    PubMed Central

    Masys, D. R.; Baker, D. B.

    1998-01-01

    The ubiquity and ease of use of the Web have made it an increasingly popular medium for communication of health-related information. Web interfaces to commercially available clinical information systems are now available or under development by most major vendors. To the extent that such interfaces involve the use of unprotected operating systems, they are vulnerable to security limitations of Web client software environments. The Patient Centered Access to Secure Systems Online (PCASSO) project extends the protections for person-identifiable health data on Web client computers. PCASSO uses several approaches, including physical protection of authentication information, execution containment, graphical displays, and monitoring the client system for intrusions and co-existing programs that may compromise security. PMID:9929243

  19. Evaluation Criteria for the Educational Web-Information System

    ERIC Educational Resources Information Center

    Seok, Soonhwa; Meyen, Edward; Poggio, John C.; Semon, Sarah; Tillberg-Webb, Heather

    2008-01-01

    This article addresses how evaluation criteria improve educational Web-information system design, and the tangible and intangible benefits of using evaluation criteria, when implemented in an educational Web-information system design. The evaluation criteria were developed by the authors through a content validation study applicable to…

  20. 78 FR 76391 - Proposed Enhancements to the Motor Carrier Safety Measurement System (SMS) Public Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-17

    ...-0392] Proposed Enhancements to the Motor Carrier Safety Measurement System (SMS) Public Web Site AGENCY... proposed enhancements to the display of information on the Agency's Safety Measurement System (SMS) public Web site. On December 6, 2013, Advocates [[Page 76392

  1. Improving root-zone soil moisture estimations using dynamic root growth and crop phenology

    USDA-ARS?s Scientific Manuscript database

    Water Energy Balance (WEB) Soil Vegetation Atmosphere Transfer (SVAT) modelling can be used to estimate soil moisture by forcing the model with observed data such as precipitation and solar radiation. Recently, an innovative approach that assimilates remotely sensed thermal infrared (TIR) observatio...

  2. Monitoring performance of a highly distributed and complex computing infrastructure in LHCb

    NASA Astrophysics Data System (ADS)

    Mathe, Z.; Haen, C.; Stagni, F.

    2017-10-01

    In order to ensure optimal performance of LHCb Distributed Computing, based on LHCbDIRAC, it is necessary to be able to inspect the behavior over time of many components: firstly the agents and services on which the infrastructure is built, but also all the computing tasks and data transfers that are managed by this infrastructure. This consists of recording and then analyzing time series of a large number of observables, for which the usage of SQL relational databases is far from optimal. Therefore within DIRAC we have been studying novel possibilities based on NoSQL databases (Elasticsearch, OpenTSDB and InfluxDB); as a result of this study we developed a new monitoring system based on Elasticsearch. It has been deployed on the LHCb Distributed Computing infrastructure, for which it collects data from all the components (agents, services, jobs) and allows creating reports through Kibana and a web user interface based on the DIRAC web framework. In this paper we describe this new implementation of the DIRAC monitoring system. We give details on the Elasticsearch implementation within the general DIRAC framework, as well as an overview of the advantages of the pipeline aggregation used for creating a dynamic bucketing of the time series. We present the advantages of using the Elasticsearch DSL high-level library for creating and running queries. Finally, we present the performance of the system.
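
    A dynamic bucketing of a time series with a pipeline aggregation, as described above, might look like the following Elasticsearch Query DSL body; the index fields and metric names here are hypothetical, not LHCbDIRAC's actual schema:

```python
import json

# Sketch of an Elasticsearch Query DSL request body of the kind such a
# monitoring system might issue: bucket job records into a date histogram,
# average a metric per bucket, then apply a `derivative` pipeline aggregation
# over the bucketed series.
query = {
    "size": 0,
    "query": {"term": {"component": "WorkloadManagement"}},
    "aggs": {
        "per_hour": {
            "date_histogram": {"field": "timestamp", "fixed_interval": "1h"},
            "aggs": {
                "avg_running": {"avg": {"field": "running_jobs"}},
                "jobs_rate": {"derivative": {"buckets_path": "avg_running"}},
            },
        }
    },
}
print(json.dumps(query)[:60], "...")
```
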

  3. Web-services-based spatial decision support system to facilitate nuclear waste siting

    NASA Astrophysics Data System (ADS)

    Huang, L. Xinglai; Sheng, Grant

    2006-10-01

    The availability of spatial web services enables data sharing among managers, decision and policy makers and other stakeholders in much simpler ways than before, and has consequently created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment to explore the decision problem, understand and refine the problem definition, and generate and evaluate multiple alternatives for a decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is comprised of distributed web services that either have their own functions or provide different geospatial data, and may reside on different computers and in different locations. The WSDSS includes six key components, namely: a database management system, a catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual and methodological challenges and issues associated with developing web-services-based spatial decision support systems are described.
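
    The catalog component named above can be thought of as a registry mapping service names to endpoints and capability types; a minimal sketch with hypothetical services and URLs:

```python
# Catalog sketch: a registry of distributed services with capability tags, so
# a client can discover which service answers a given kind of request.
# All names and URLs are hypothetical.
CATALOG = {
    "buffer-analysis": {"url": "http://gis.example.org/wps", "type": "analysis"},
    "geology-layers":  {"url": "http://data.example.org/wfs", "type": "data"},
    "report-builder":  {"url": "http://app.example.org/report", "type": "report"},
}

def discover(service_type):
    """Return catalog entries offering the requested capability."""
    return {name: e for name, e in CATALOG.items() if e["type"] == service_type}

print(sorted(discover("data")))
```
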

  4. The Cancer Genomics Hub (CGHub): overcoming cancer through the power of torrential data.

    PubMed

    Wilks, Christopher; Cline, Melissa S; Weiler, Erich; Diehkans, Mark; Craft, Brian; Martin, Christy; Murphy, Daniel; Pierce, Howdy; Black, John; Nelson, Donavan; Litzinger, Brian; Hatton, Thomas; Maltbie, Lori; Ainsworth, Michael; Allen, Patrick; Rosewood, Linda; Mitchell, Elizabeth; Smith, Bradley; Warner, Jim; Groboske, John; Telc, Haifang; Wilson, Daniel; Sanford, Brian; Schmidt, Hannes; Haussler, David; Maltbie, Daniel

    2014-01-01

    The Cancer Genomics Hub (CGHub) is the online repository of the sequencing programs of the National Cancer Institute (NCI), including The Cancer Genome Atlas (TCGA), the Cancer Cell Line Encyclopedia (CCLE) and the Therapeutically Applicable Research to Generate Effective Treatments (TARGET) projects, with data from 25 different types of cancer. CGHub currently contains >1.4 PB of data, has grown at an average rate of 50 TB a month and serves >100 TB per week. The architecture of CGHub is designed to support bulk searching and downloading through a Web-accessible application programming interface, to enforce patient genome confidentiality in data storage and transmission, and to optimize for efficiency in access and transfer. In this article, we describe the design of these three components, present performance results for our transfer protocol, GeneTorrent, and finally report on the growth of the system in terms of data stored and transferred, including estimated limits on the current architecture. Our experience-based estimates suggest that centralizing storage and computational resources is more efficient than wide distribution across many satellite labs. Database URL: https://cghub.ucsc.edu. Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
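
    Whatever the transfer protocol, a bulk genomic download typically ends with an integrity check against the repository's published checksum; a small sketch of that step (the manifest digest here is computed locally purely for illustration, and GeneTorrent itself is not reproduced):

```python
import hashlib, os, tempfile

# Verify a downloaded file against a manifest checksum, reading in chunks so
# arbitrarily large files fit in constant memory.
def sha256_of(path, chunk=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Stand-in for a downloaded file and its manifest entry
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"ACGT" * 1000)
    path = f.name
manifest_digest = hashlib.sha256(b"ACGT" * 1000).hexdigest()

ok = sha256_of(path) == manifest_digest
os.unlink(path)
print("transfer verified:", ok)
```
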

  5. Ultraviolet and Visible Emission Mechanisms in Astrophysics

    NASA Technical Reports Server (NTRS)

    Stancil, Phillip C.; Schultz, David R.

    2003-01-01

    The project involved the study of ultraviolet (UV) and visible emission mechanisms in astrophysical and atmospheric environments. In many situations, the emission is a direct consequence of a charge-transfer collision of an ion with a neutral, with capture of an electron into an excited state of the product ion. The process is also important in establishing the ionization and thermal balance of an astrophysical plasma. As little of the necessary collision data are available, the main thrust of the project was the calculation of total and state-selective charge transfer cross sections and rate coefficients for a very large number of collision systems. The data were computed using modern explicit techniques including the molecular-orbital close-coupling (MOCC), classical trajectory Monte Carlo (CTMC), and continuum distorted wave (CDW) methods. Estimates were also made in some instances using the multichannel Landau-Zener (MCLZ) and classical over-the-barrier (COB) models. Much of the computed data has been formatted for inclusion in a charge transfer database on the World Wide Web (cfadc.phy.ornl.gov/astro/ps/data/). A considerable amount of data was generated during the lifetime of the grant. Some of it has not yet been analyzed, but it will be as soon as possible; the data will then be placed on our website and papers ultimately written.
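
    The single-crossing Landau-Zener probability underlying the MCLZ model mentioned above can be evaluated directly; the coupling, velocity and slope-difference values below are illustrative, in atomic units:

```python
import math

# Standard Landau-Zener single-crossing formula:
#   P = exp(-2*pi*H12**2 / (hbar * v * |dF|))
# where H12 is the coupling at the crossing, v the radial velocity and dF the
# difference of the diabatic potential slopes. Parameter values are
# illustrative only (atomic units, hbar = 1).
def lz_probability(h12, v, dF, hbar=1.0):
    """Probability of staying on the diabatic curve through one crossing."""
    return math.exp(-2.0 * math.pi * h12 ** 2 / (hbar * v * abs(dF)))

p = lz_probability(h12=0.01, v=0.05, dF=0.02)
print(f"P_diabatic = {p:.3f}")
```
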

  6. The interRAI Acute Care instrument incorporated in an eHealth system for standardized and web-based geriatric assessment: strengths, weaknesses, opportunities and threats in the acute hospital setting

    PubMed Central

    2013-01-01

    Background The interRAI Acute Care instrument is a multidimensional geriatric assessment system intended to determine a hospitalized older person’s medical, psychosocial and functional capacity and needs. Its objective is to develop an overall plan for treatment and long-term follow-up based on a common set of standardized items that can be used in various care settings. A Belgian web-based software system (BelRAI-software) was developed to enable clinicians to interpret the output and to communicate the patients’ data across wards and care organizations. The purpose of the study is to evaluate the (dis)advantages of implementing the interRAI Acute Care instrument as a comprehensive geriatric assessment instrument in an acute hospital context. Methods In a cross-sectional multicenter study on four geriatric wards in three acute hospitals, trained clinical staff (nurses, occupational therapists, social workers, and geriatricians) assessed 410 inpatients in routine clinical practice. The BelRAI-system was evaluated by focus groups, observations, and questionnaires. The Strengths, Weaknesses, Opportunities and Threats were mapped (SWOT-analysis) and validated by the participants. Results The primary strengths of the BelRAI-system were a structured overview of the patients’ condition early after admission and the promotion of multidisciplinary assessment. Our study was a first attempt to transfer standardized data between home care organizations, nursing homes and hospitals, and a way to centralize medical, allied-health and nursing data. With the BelRAI-software, privacy of data is guaranteed. Weaknesses are the time-consuming character of the process and the overlap with other assessment instruments or (electronic) registration forms. There is room for improving the user-friendliness and the efficiency of the software, which needs hospital-specific adaptations. Opportunities are timely and systematic problem detection and continuity of care. A shortage of funding for personnel to coordinate the assessment process is the most important threat. Conclusion The BelRAI-software allows standardized transmural information transfer and the centralization of medical, allied-health and nursing data. It is strictly secured and follows strict privacy regulations, allowing hospitals to optimize (transmural) communication and interaction. However, weaknesses and threats exist and must be tackled in order to promote large-scale implementation. PMID:24007312

  7. The interRAI Acute Care instrument incorporated in an eHealth system for standardized and web-based geriatric assessment: strengths, weaknesses, opportunities and threats in the acute hospital setting.

    PubMed

    Devriendt, Els; Wellens, Nathalie I H; Flamaing, Johan; Declercq, Anja; Moons, Philip; Boonen, Steven; Milisen, Koen

    2013-09-05

    The interRAI Acute Care instrument is a multidimensional geriatric assessment system intended to determine a hospitalized older person's medical, psychosocial and functional capacity and needs. Its objective is to develop an overall plan for treatment and long-term follow-up based on a common set of standardized items that can be used in various care settings. A Belgian web-based software system (BelRAI-software) was developed to enable clinicians to interpret the output and to communicate the patients' data across wards and care organizations. The purpose of the study is to evaluate the (dis)advantages of implementing the interRAI Acute Care instrument as a comprehensive geriatric assessment instrument in an acute hospital context. In a cross-sectional multicenter study on four geriatric wards in three acute hospitals, trained clinical staff (nurses, occupational therapists, social workers, and geriatricians) assessed 410 inpatients in routine clinical practice. The BelRAI-system was evaluated by focus groups, observations, and questionnaires. The Strengths, Weaknesses, Opportunities and Threats were mapped (SWOT-analysis) and validated by the participants. The primary strengths of the BelRAI-system were a structured overview of the patients' condition early after admission and the promotion of multidisciplinary assessment. Our study was a first attempt to transfer standardized data between home care organizations, nursing homes and hospitals, and a way to centralize medical, allied-health and nursing data. With the BelRAI-software, privacy of data is guaranteed. Weaknesses are the time-consuming character of the process and the overlap with other assessment instruments or (electronic) registration forms. There is room for improving the user-friendliness and the efficiency of the software, which needs hospital-specific adaptations. Opportunities are timely and systematic problem detection and continuity of care. A shortage of funding for personnel to coordinate the assessment process is the most important threat. The BelRAI-software allows standardized transmural information transfer and the centralization of medical, allied-health and nursing data. It is strictly secured and follows strict privacy regulations, allowing hospitals to optimize (transmural) communication and interaction. However, weaknesses and threats exist and must be tackled in order to promote large-scale implementation.

  8. System Testing of Desktop and Web Applications

    ERIC Educational Resources Information Center

    Slack, James M.

    2011-01-01

    We want our students to experience system testing of both desktop and web applications, but the cost of professional system-testing tools is far too high. We evaluate several free tools and find that AutoIt makes an ideal educational system-testing tool. We show several examples of desktop and web testing with AutoIt, starting with simple…

  9. CRIMEtoYHU: a new web tool to develop yeast-based functional assays for characterizing cancer-associated missense variants.

    PubMed

    Mercatanti, Alberto; Lodovichi, Samuele; Cervelli, Tiziana; Galli, Alvaro

    2017-12-01

    Evaluation of the functional impact of cancer-associated missense variants is more difficult than for protein-truncating mutations, and standard guidelines for the interpretation of sequence variants have consequently been proposed. A number of algorithms and software products have been developed to predict the impact of cancer-associated missense mutations on protein structure and function. Importantly, direct assessment of variants with high-throughput functional assays in simple genetic systems can help speed up the functional evaluation of newly identified cancer-associated variants. We developed the web tool CRIMEtoYHU (CTY) to help geneticists evaluate the functional impact of cancer-associated missense variants. Humans and the yeast Saccharomyces cerevisiae share thousands of protein-coding genes although they diverged about a billion years ago. Yeast humanization can therefore help decipher the functional consequences of human genetic variants found in cancer and give information on the pathogenicity of missense variants. To humanize specific positions within yeast genes, the human and yeast genes have to share functional homology. If a mutation in a specific residue is associated with a particular phenotype in humans, a similar substitution in the yeast counterpart may reveal its effect at the organism level. CTY simultaneously finds yeast homologous genes, identifies the corresponding variants and determines the transferability of human variants to their yeast counterparts by assigning a reliability score (RS) that may be predictive of the validity of a functional assay. CTY analyzes newly identified mutations or retrieves mutations reported in the COSMIC database, provides information about the functional conservation between yeast and human, and shows the mutation distribution in human genes. When no yeast homologue is found for a newly submitted mutation, CTY aborts the analysis. Then, on the basis of protein-domain localization and functional conservation between yeast and human, the selected variants are ranked by the RS. The RS is assigned by an algorithm that combines functional data, the type of mutation, the chemistry of the amino acid substitution and the degree of mutation transferability between the human and yeast proteins. Mutations with a positive RS are highly transferable to yeast and, therefore, yeast functional assays will be more predictive. To validate the web application, we analyzed 8078 cancer-associated variants located in 31 genes that have a yeast homologue. More than 50% of the variants are transferable to yeast. Incidentally, 88% of all transferable mutations have a reliability score >0. Moreover, we analyzed with CTY 72 functionally validated missense variants located in yeast genes at positions corresponding to human cancer-associated variants. All of these variants gave a positive RS. To further validate CTY, we analyzed 3949 protein variants (with positive RS) with the predictive algorithm PROVEAN. This analysis shows that yeast-based functional assays will be more predictive for variants with a positive RS. We believe that CTY could be an important resource for the cancer research community, providing information on the functional impact of specific mutations as well as supporting the design of functional assays useful for decision support in precision medicine. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
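
    A toy score combining the kinds of evidence listed, conservation, domain localization and substitution chemistry, might look like this; the weights and the conservative-pair table are invented and are not CTY's actual algorithm:

```python
# Invented reliability-score-style ranking: positive scores suggest a variant
# is transferable to a yeast assay, negative scores suggest it is not.
CONSERVATIVE_PAIRS = {("I", "L"), ("L", "I"), ("D", "E"), ("E", "D"),
                      ("K", "R"), ("R", "K"), ("S", "T"), ("T", "S")}

def reliability_score(residue_conserved, in_functional_domain, wt, mut):
    score = 0
    score += 2 if residue_conserved else -2       # yeast/human conservation
    score += 1 if in_functional_domain else 0     # domain localization
    score += 1 if (wt, mut) in CONSERVATIVE_PAIRS else -1  # chemistry
    return score

transferable = reliability_score(True, True, "D", "E")   # conserved, in domain
doubtful = reliability_score(False, False, "G", "W")     # neither
print(transferable, doubtful)
```
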

  10. WebAlchemist: a Web transcoding system for mobile Web access in handheld devices

    NASA Astrophysics Data System (ADS)

    Whang, Yonghyun; Jung, Changwoo; Kim, Jihong; Chung, Sungkwon

    2001-11-01

    In this paper, we describe the design and implementation of WebAlchemist, a prototype web transcoding system, which automatically converts a given HTML page into a sequence of equivalent HTML pages that can be properly displayed on a hand-held device. The WebAlchemist system is based on a set of HTML transcoding heuristics managed by the Transcoding Manager (TM) module. In order to tackle difficult-to-transcode pages, such as ones with large or complex table structures, we have developed several new transcoding heuristics that extract partial semantics from syntactic information such as table width, font size and cascading style sheets. Subjective evaluation results using popular HTML pages (such as the CNN home page) show that WebAlchemist generates readable, structure-preserving transcoded pages that can be properly displayed on hand-held devices.
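
    One such heuristic, linearizing a wide table into a flat cell sequence that a narrow screen can show as a list, can be sketched with the standard-library HTML parser; this stand-in handles only simple <td> content and is not the system's actual implementation:

```python
from html.parser import HTMLParser

# Collect the text of every <td> cell in document order, so a wide table can
# be re-emitted as a one-column list for a narrow display.
class TableLinearizer(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.cells.append(data.strip())

parser = TableLinearizer()
parser.feed("<table><tr><td>News</td><td>Sports</td></tr>"
            "<tr><td>Weather</td></tr></table>")
print(parser.cells)
```
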

  11. Web Services and Handle Infrastructure - WDCC's Contributions to International Projects

    NASA Astrophysics Data System (ADS)

    Föll, G.; Weigelt, T.; Kindermann, S.; Lautenschlager, M.; Toussaint, F.

    2012-04-01

    Climate science demands on data management are growing rapidly as climate models grow in the precision with which they depict spatial structures and in the completeness with which they describe a vast range of physical processes. The ExArch project is exploring the challenges of developing a software management infrastructure which will scale to the multi-exabyte archives of climate data that are likely to be crucial to major policy decisions by the end of the decade. The ExArch approach to future integration of exascale climate archives is based, on the one hand, on a distributed web service architecture providing data analysis and quality control functionality across archives and, on the other hand, on a consistent persistent identifier infrastructure deployed to support distributed data management and data replication. Distributed data analysis functionality is based on the CDO (Climate Data Operators) package. The CDO tool is used for processing the archived data and metadata. CDO is a collection of command-line operators for manipulating and analysing climate and forecast model data. A range of formats is supported and over 500 operators are provided. CDO is presently designed to work in a scripting environment with local files. ExArch will extend the tool to support efficient usage in an exascale archive with distributed data and computational resources by providing flexible scheduling capabilities. Quality control will become increasingly important in an exascale computing context. Researchers will be dealing with millions of data files from multiple sources and will need to know whether the files satisfy a range of basic quality criteria. Hence, ExArch will provide a flexible and extensible quality control system. The data will be held at more than 30 computing centres and data archives around the world, but to users it will appear as a single archive thanks to a standardized ExArch Web Processing Service.
Data infrastructures such as the one built by ExArch can greatly benefit from assigning persistent identifiers (PIDs) to the main entities, such as data and metadata records. A PID should then not only consist of a globally unique identifier, but also support built-in facilities to relate PIDs to each other, to build multi-hierarchical virtual collections and to enable attaching basic metadata directly to PIDs. With such a toolset, PIDs can support crucial data management tasks. For example, data replication performed in ExArch can be supported through PIDs as they can help to establish durable links between identical copies. By linking derivative data objects together, their provenance can be traced with a level of detail and reliability currently unavailable in the Earth system modelling domain. Regarding data transfers, virtual collections of PIDs may be used to package data prior to transmission. If the PID of such a collection is used as the primary key in data transfers, safety of transfer and traceability of data objects across repositories increases. End-users can benefit from PIDs as well since they make data discovery independent from particular storage sites and enable user-friendly communication about primary research objects. A generic PID system can in fact be a fundamental building block for scientific e-infrastructures across projects and domains.
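The PID facilities described above (unique identifiers, attached metadata, typed relations, and virtual collections packaged under a single PID) can be sketched in a few lines. The record layout, relation names and handle prefix below are assumptions for illustration, not the Handle System's actual data model:

```python
# Toy PID registry: each PID carries metadata and typed relations to other PIDs.
import uuid

REGISTRY = {}

def mint_pid(metadata=None):
    """Create a globally unique PID with optional attached metadata."""
    pid = f"hdl:21.T999/{uuid.uuid4()}"  # hypothetical handle prefix
    REGISTRY[pid] = {"metadata": metadata or {}, "relations": []}
    return pid

def relate(pid, relation, target_pid):
    """Record a typed relation, e.g. 'isReplicaOf' or 'isDerivedFrom'."""
    REGISTRY[pid]["relations"].append((relation, target_pid))

def mint_collection(member_pids):
    """Package members into a virtual collection addressed by a single PID,
    e.g. to use as the primary key for a data transfer."""
    coll = mint_pid({"type": "collection", "size": len(member_pids)})
    for member in member_pids:
        relate(coll, "hasMember", member)
    return coll
```

Linking a replica back to its source with `isReplicaOf`, as in the replication use case above, then amounts to one `relate` call, and the collection PID can travel with a transfer to keep its members traceable.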

  12. Mercury bioaccumulation and trophic transfer in the terrestrial food web of a montane forest.

    PubMed

    Rimmer, Christopher C; Miller, Eric K; McFarland, Kent P; Taylor, Robert J; Faccio, Steven D

    2010-04-01

    We investigated mercury (Hg) concentrations in a terrestrial food web in high-elevation forests in Vermont. Hg concentrations increased in the order autotrophic organisms < herbivores < detritivores < omnivores < carnivores. Within the carnivores studied, raptors had higher blood Hg concentrations than their songbird prey. The Hg concentration in the blood of the focal study species, Bicknell's thrush (Catharus bicknelli), varied over the course of the summer in response to a diet shift related to changing availability of arthropod prey. The Bicknell's thrush food web is more detrital-based (with higher Hg concentrations) in early summer and more foliage-based (with lower Hg concentrations) during late summer. There were significant year effects in different ecosystem compartments, indicating a possible connection between atmospheric Hg deposition, detrital-layer Hg concentrations, arthropod Hg concentrations, and passerine blood Hg concentrations.

  13. On Building a Search Interface Discovery System

    NASA Astrophysics Data System (ADS)

    Shestakov, Denis

    A huge portion of the Web, known as the deep Web, is accessible via search interfaces to myriad databases on the Web. While relatively good approaches for querying the contents of web databases have recently been proposed, one cannot fully utilize them while most search interfaces remain undiscovered. Thus, the automatic recognition of search interfaces to online databases is crucial for any application accessing the deep Web. This paper describes the architecture of the I-Crawler, a system for finding and classifying search interfaces. The I-Crawler is intentionally designed to be used in deep web characterization surveys and for constructing directories of deep web resources.
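The recognition task the I-Crawler performs can be illustrated with a toy classifier: treat a page as a candidate search interface when it contains a form with a free-text input and a submit control. This heuristic is an illustrative stand-in for the paper's classifier, not its actual feature set:

```python
# Toy search-interface detector using the standard-library HTML parser.
from html.parser import HTMLParser

class SearchFormDetector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_form = False
        self.has_text_input = False
        self.has_submit = False
        self.found = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "form":
            self.in_form, self.has_text_input, self.has_submit = True, False, False
        elif self.in_form and tag == "input":
            if a.get("type", "text") == "text":   # <input> defaults to type=text
                self.has_text_input = True
            elif a.get("type") == "submit":
                self.has_submit = True

    def handle_endtag(self, tag):
        if tag == "form":
            if self.has_text_input and self.has_submit:
                self.found = True
            self.in_form = False

def looks_like_search_interface(html):
    detector = SearchFormDetector()
    detector.feed(html)
    return detector.found
```

A production system would add many more features (field labels, form action URLs, result-page behaviour) and learn the decision boundary rather than hard-coding it.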

  14. Intelligent Web-Based Learning System with Personalized Learning Path Guidance

    ERIC Educational Resources Information Center

    Chen, C. M.

    2008-01-01

    Personalized curriculum sequencing is an important research issue for web-based learning systems because no fixed learning paths will be appropriate for all learners. Therefore, many researchers focused on developing e-learning systems with personalized learning mechanisms to assist on-line web-based learning and adaptively provide learning paths…

  15. Global Patterns in Ecological Indicators of Marine Food Webs: A Modelling Approach

    PubMed Central

    Heymans, Johanna Jacomina; Coll, Marta; Libralato, Simone; Morissette, Lyne; Christensen, Villy

    2014-01-01

    Background: Ecological attributes estimated from food web models have the potential to be indicators of good environmental status given their capabilities to describe redundancy, food web changes, and sensitivity to fishing. They can be used as a baseline to show how they might be modified in the future with human impacts such as climate change, acidification, eutrophication, or overfishing. Methodology: In this study ecological network analysis indicators of 105 marine food web models were tested for variation with traits such as ecosystem type, latitude, ocean basin, depth, size, time period, and exploitation state, whilst also considering structural properties of the models such as number of linkages, number of living functional groups or total number of functional groups as covariate factors. Principal findings: Eight indicators were robust to model construction: relative ascendency; relative overhead; redundancy; total systems throughput (TST); primary production/TST; consumption/TST; export/TST; and total biomass of the community. Large-scale differences were seen in the ecosystems of the Atlantic and Pacific Oceans, with the Western Atlantic being more complex with an increased ability to mitigate impacts, while the Eastern Atlantic showed lower internal complexity. In addition, the Eastern Pacific was less organised than the Eastern Atlantic although both of these systems had increased primary production as eastern boundary current systems. Differences by ecosystem type highlighted coral reefs as having the largest energy flow and total biomass per unit of surface, while lagoons, estuaries, and bays had lower transfer efficiencies and higher recycling. These differences prevailed over time, although some traits changed with fishing intensity.
Keystone groups were mainly higher trophic level species with mostly top-down effects, while structural/dominant groups were mainly lower trophic level groups (benthic primary producers such as seagrass and macroalgae, and invertebrates). Keystone groups were prevalent in estuarine or small/shallow systems, and in systems with reduced fishing pressure. Changes to the abundance of key functional groups might have significant implications for the functioning of ecosystems and should be avoided through management. Conclusion/significance: Our results provide additional understanding of patterns of structural and functional indicators in different ecosystems. Ecosystem traits such as type, size, depth, and location need to be accounted for when setting reference levels as these affect absolute values of ecological indicators. Therefore, establishing absolute reference values for ecosystem indicators may not be suitable to the ecosystem-based, precautionary approach. Reference levels for ecosystem indicators should be developed for individual ecosystems or ecosystems with the same typologies (similar location, ecosystem type, etc.) and not benchmarked against all other ecosystems. PMID:24763610
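Two of the robust indicators named in this abstract can be computed directly from a food web flow matrix: total system throughput (TST) is the sum of all flows through the network, and ratios such as primary production/TST follow from it. The three-compartment flow values below are invented for illustration:

```python
# Sketch of two ecological network analysis indicators from a flow matrix.

def total_system_throughput(flows):
    """TST contribution of internal flows: the sum of every flow (e.g. t/km2/year)."""
    return sum(sum(row) for row in flows)

# flows[i][j] = flow from compartment i to compartment j
# compartments: 0 = primary producers, 1 = herbivores, 2 = carnivores
flows = [
    [0.0, 50.0, 0.0],   # producers -> herbivores
    [0.0, 0.0, 10.0],   # herbivores -> carnivores
    [0.0, 0.0, 0.0],
]
primary_production = 200.0  # primary production entering the web (an input flow)

# TST here counts internal flows plus the primary production input; full network
# analysis would also include exports and respiration.
tst = total_system_throughput(flows) + primary_production
pp_over_tst = primary_production / tst
```

Indicators like relative ascendency and redundancy are computed from the same matrix using information-theoretic weightings of these flows.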

  16. Global patterns in ecological indicators of marine food webs: a modelling approach.

    PubMed

    Heymans, Johanna Jacomina; Coll, Marta; Libralato, Simone; Morissette, Lyne; Christensen, Villy

    2014-01-01

    Ecological attributes estimated from food web models have the potential to be indicators of good environmental status given their capabilities to describe redundancy, food web changes, and sensitivity to fishing. They can be used as a baseline to show how they might be modified in the future with human impacts such as climate change, acidification, eutrophication, or overfishing. In this study ecological network analysis indicators of 105 marine food web models were tested for variation with traits such as ecosystem type, latitude, ocean basin, depth, size, time period, and exploitation state, whilst also considering structural properties of the models such as number of linkages, number of living functional groups or total number of functional groups as covariate factors. Eight indicators were robust to model construction: relative ascendency; relative overhead; redundancy; total systems throughput (TST); primary production/TST; consumption/TST; export/TST; and total biomass of the community. Large-scale differences were seen in the ecosystems of the Atlantic and Pacific Oceans, with the Western Atlantic being more complex with an increased ability to mitigate impacts, while the Eastern Atlantic showed lower internal complexity. In addition, the Eastern Pacific was less organised than the Eastern Atlantic although both of these systems had increased primary production as eastern boundary current systems. Differences by ecosystem type highlighted coral reefs as having the largest energy flow and total biomass per unit of surface, while lagoons, estuaries, and bays had lower transfer efficiencies and higher recycling. These differences prevailed over time, although some traits changed with fishing intensity. Keystone groups were mainly higher trophic level species with mostly top-down effects, while structural/dominant groups were mainly lower trophic level groups (benthic primary producers such as seagrass and macroalgae, and invertebrates). 
Keystone groups were prevalent in estuarine or small/shallow systems, and in systems with reduced fishing pressure. Changes to the abundance of key functional groups might have significant implications for the functioning of ecosystems and should be avoided through management. Our results provide additional understanding of patterns of structural and functional indicators in different ecosystems. Ecosystem traits such as type, size, depth, and location need to be accounted for when setting reference levels as these affect absolute values of ecological indicators. Therefore, establishing absolute reference values for ecosystem indicators may not be suitable to the ecosystem-based, precautionary approach. Reference levels for ecosystem indicators should be developed for individual ecosystems or ecosystems with the same typologies (similar location, ecosystem type, etc.) and not benchmarked against all other ecosystems.

  17. A web-based biosignal data management system for U-health data integration.

    PubMed

    Ro, Dongwoo; Yoo, Sooyoung; Choi, Jinwook

    2008-11-06

    In the ubiquitous healthcare environment, biosignal data should be easily accessible and properly maintained. This paper describes a web-based data management system consisting of a device interface, a data upload control, a central repository, and a web server. For user-specific web services, an MFER Upload ActiveX Control was developed.

  18. Developing Distributed Collaboration Systems at NASA: A Report from the Field

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; Knight, Chris; Norvig, Peter (Technical Monitor)

    2001-01-01

    Web-based collaborative systems have assumed a pivotal role in the information systems development arena. While business-to-customer (B-to-C) and business-to-business (B-to-B) electronic commerce systems, search engines, and chat sites are the focus of attention, web-based systems span the gamut of information systems that were traditionally confined to internal organizational client-server networks. For example, the Domino Application Server allows Lotus Notes (trademarked) users to build collaborative intranet applications, and mySAP.com (trademarked) enables web portals and e-commerce applications for SAP users. This paper presents the experiences in the development of one such system: Postdoc, a government off-the-shelf web-based collaborative environment. Issues related to the design of web-based collaborative information systems, including lessons learned from the development and deployment of the system as well as measured performance, are presented. Finally, the limitations of the implementation approach and future plans are discussed.

  19. Technique of nonvascularized toe phalangeal transfer and distraction lengthening in the treatment of multiple digit symbrachydactyly.

    PubMed

    Netscher, David T; Lewis, Eric V

    2008-06-01

    A combination of nonvascularized multiple toe phalangeal transfers, web space deepening, and distraction lengthening may provide excellent function in the child born with the oligodactylous type of symbrachydactyly. These techniques may reconstruct multiple digits, maintaining a wide and stable grip span with good prehension to the thumb. We detail the techniques of each of these 3 stages in reconstruction and describe appropriate patient selection. Potential complications are discussed. However, with strict attention to technical details, these complications can be minimized.

  20. Farmers' Market Manager's Level of Communication and Influence on Electronic Benefits Transfer (EBT) Adoption at Midwest Farmers' Markets.

    PubMed

    Hasin, Afroza; Smith, Sylvia

    2018-01-01

    To understand market managers' level of communication and use of technology that might influence the decision to adopt Electronic Benefits Transfer (EBT) at farmers' markets. Cross-sectional study using the Theory of Diffusion of Innovation. Electronic survey administered in the midwest states of Illinois, Michigan, and Wisconsin. Farmers' market managers in Illinois, Michigan, and Wisconsin. Information on EBT adoption, market managers' communication, and technology use. Binary logistic regression analysis with EBT adoption as the dependent variable and frequency of technology use, partnership with organizations, farmers' market association (FMA) membership, a Facebook page and Web site for the market, and primary source of information as independent variables. Chi-square tests and ANOVA were used to compare states and adopter categories. Logistic regression results showed that the odds of adopting EBT were 7.5 times higher for markets that had partnerships with other organizations. Compared with non-adopters, a significantly greater number of early adopters had partnerships, FMA membership, and a Facebook page and Web site for the market, and reported to a board of directors. Partnerships, FMA membership, a Facebook page and Web site, and mandatory reporting to a board of directors were important factors that influenced EBT adoption at midwest farmers' markets. Copyright © 2017 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  1. 210Po and 210Pb trophic transfer within the phytoplankton-zooplankton-anchovy/sardine food web: a case study from the Gulf of Lion (NW Mediterranean Sea).

    PubMed

    Strady, Emilie; Harmelin-Vivien, Mireille; Chiffoleau, Jean François; Veron, Alain; Tronczynski, Jacek; Radakovitch, Olivier

    2015-05-01

    The transfer of (210)Po and (210)Pb in the food web of small pelagic fishes (from phytoplankton and zooplankton to anchovy Engraulis encrasicolus and sardine Sardina pilchardus) is investigated in the Gulf of Lion (GoL). We present original data on (210)Po and (210)Pb activity concentrations and C and N stable isotope ratios, measured (i) in different size classes of phytoplankton and zooplankton during spring and winter in different environments of the GoL, and (ii) in the two fish species. Significant spatial patterns based on (210)Po and (210)Pb activity concentrations and (210)Po/(210)Pb ratios in the different plankton size classes are evidenced by hierarchical clustering, both in spring and winter. This variability, also observed for C and N stable isotope ratios, is connected to local specific pelagic habitats and hydrodynamics. The sampling strategy suggests that (210)Po bioaccumulation in the GoL remains at a constant level from the first trophic level (dominated by phytoplankton) to the second (zooplankton), while (210)Pb bioaccumulation shows an increase in winter. Based on stable N isotope ratios and (210)Po activity concentrations measured in anchovies and sardines, we demonstrate (210)Po biomagnification along the trophic food web of these two planktivorous pelagic fishes. Copyright © 2015 Elsevier Ltd. All rights reserved.
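Biomagnification claims like the one above are commonly quantified with a trophic transfer factor (TTF): the ratio of activity concentrations between a consumer and its diet, with TTF > 1 indicating biomagnification. The activity values below are illustrative, not the paper's measurements:

```python
# Trophic transfer factor sketch for 210Po between zooplankton and anchovy.

def trophic_transfer_factor(consumer_activity, diet_activity):
    """TTF = activity concentration in consumer / activity in its food source."""
    return consumer_activity / diet_activity

po210_zooplankton = 40.0  # hypothetical 210Po activity, Bq/kg dry weight
po210_anchovy = 95.0      # hypothetical 210Po activity in anchovy, Bq/kg dry weight

ttf = trophic_transfer_factor(po210_anchovy, po210_zooplankton)
biomagnifies = ttf > 1.0  # TTF above 1 indicates biomagnification up the food web
```

The constant bioaccumulation between the first and second trophic levels reported above would correspond to a TTF near 1 for that step.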

  2. Transfer of Real-time Dynamic Radiation Environment Assimilation Model; Research to Operation

    NASA Astrophysics Data System (ADS)

    Cho, K. S. F.; Hwang, J.; Shin, D. K.; Kim, G. J.; Morley, S.; Henderson, M. G.; Friedel, R. H.; Reeves, G. D.

    2015-12-01

    The Real-time Dynamic Radiation Environment Assimilation Model (rtDREAM) was developed by LANL for nowcasting energetic electron fluxes in the radiation belt, to quantify potential risks from radiation damage to satellites. Assimilated data come from multiple sources, including LANL assets (GEO, GPS). For the transfer of the rtDREAM code from research to operations, LANL, KSWC and NOAA have signed a Memorandum of Understanding (MOU) on the collaboration between the three parties. Under this MOU, KSWC/RRA provides all the support for transitioning the research version of DREAM to operations. KASI is primarily responsible for providing all the interfaces between the current scientific output formats of the code and useful space weather products that can be used and accessed through the web. In the second phase, KASI will be responsible for performing the work needed to transform the Van Allen Probes beacon data into "DREAM-ready" inputs. KASI will also provide the "operational" code framework and additional data preparation, model output, display and web page codes back to LANL and SWPC. KASI is already a NASA partnering ground station for the Van Allen Probes' space weather beacon data and can thus demonstrate the use and utility of these data for web-based comparison between rtDREAM and observations. NOAA has offered to take on some of the data processing tasks specific to the GOES data.

  3. Embedded controller for GEM detector readout system

    NASA Astrophysics Data System (ADS)

    Zabołotny, Wojciech M.; Byszuk, Adrian; Chernyshova, Maryna; Cieszewski, Radosław; Czarski, Tomasz; Dominik, Wojciech; Jakubowska, Katarzyna L.; Kasprowicz, Grzegorz; Poźniak, Krzysztof; Rzadkiewicz, Jacek; Scholz, Marek

    2013-10-01

    This paper describes the embedded controller used for the multichannel readout system of the GEM detector. The controller is based on an embedded Mini-ITX mainboard running the GNU/Linux operating system, and offers two interfaces to communicate with the FPGA-based readout system. FPGA configuration and diagnostics are controlled via a low-speed USB-based interface, while high-speed setup of the readout parameters and reception of the measured data are handled by the PCI Express (PCIe) interface. Hardware access is synchronized by a dedicated server written in C. Multiple clients may connect to this server via a TCP/IP network, and different priorities are assigned to individual clients. Specialized protocols have been implemented both for low-level access at the register level and for high-level access with transfer of structured data using the "msgpack" protocol. High-level functionality has been split between multiple TCP/IP servers for parallel operation. The status of the system may be checked, and basic maintenance performed, via a web interface, while expert access is possible via an SSH server. The system was designed with reliability and flexibility in mind.
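Protocols that move structured data over TCP, like the msgpack-based one described above, typically frame each message with a length prefix so the receiver knows where one message ends and the next begins. This sketch uses a 4-byte big-endian length header with a JSON body in place of msgpack (which requires a third-party package); the framing logic is the same:

```python
# Length-prefixed message framing, as commonly used for structured data over TCP.
import json
import struct

def encode_message(obj):
    """Serialize obj and prepend its byte length as a 4-byte big-endian header."""
    body = json.dumps(obj).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_message(data):
    """Parse one framed message; return (obj, remaining unconsumed bytes)."""
    (length,) = struct.unpack(">I", data[:4])
    body = data[4:4 + length]
    return json.loads(body.decode("utf-8")), data[4 + length:]
```

A register-level request such as a read of one FPGA register would then travel as one framed message, and the server could dispatch by a "cmd" field; the command vocabulary here is hypothetical.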

  4. Organizational aspects of e-referrals.

    PubMed

    Wootton, R; Harno, K; Reponen, J

    2003-01-01

    Three different, well established systems for e-referral were examined. They ranged from a system in a single country handling a large number of cases (60,000 per year) to a global system covering many countries which handled fewer cases (150 per year). Nonetheless, there appeared to be a number of common features. Whether the purpose is e-transfer or e-consultation, the underlying model of the e-referral process is: the referrer initiates an e-request; the organization managing the process receives it; the organization allocates it for reply; the responder replies to the initiator. Various things can go wrong and the organization managing the e-referral process needs to be able to track requests through the system; this requires various performance metrics. E-referral can be conducted using email, or as messages passed either directly between computer systems or via a Web-link to a server. The experience of the three systems studied shows that significant changes in work practice are needed to launch an e-referral service successfully. The use of e-referral between primary and secondary care improves access to services and can be shown to be cost-effective.
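The four-step model above (referrer initiates; the managing organization receives; the organization allocates; the responder replies) maps naturally onto a small state machine, and timestamping each step yields the performance metrics needed to track requests through the system. The state names and metric below are illustrative assumptions:

```python
# Sketch of e-referral request tracking through the four-step workflow.
import time

STATES = ["initiated", "received", "allocated", "replied"]

class Referral:
    def __init__(self, referrer):
        self.referrer = referrer
        self.history = [("initiated", time.time())]

    @property
    def state(self):
        return self.history[-1][0]

    def advance(self):
        """Move to the next state in the e-referral workflow."""
        idx = STATES.index(self.state)
        if idx == len(STATES) - 1:
            raise ValueError("referral already replied to")
        self.history.append((STATES[idx + 1], time.time()))

    def turnaround(self):
        """Seconds from initiation to reply, once replied: one key metric."""
        if self.state != "replied":
            return None
        return self.history[-1][1] - self.history[0][1]
```

Aggregating `turnaround()` across referrals, and counting how many stall before "replied", gives the kind of metrics the managing organization needs to see where things go wrong.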

  5. Keeping Track Every Step of the Way

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Knowledge Sharing Systems, Inc., a producer of intellectual assets management software systems for the federal government, universities, non-profit laboratories, and private companies, constructed and presently manages the NASA Technology Tracking System, also known as TechTracS. Under contract to Langley Research Center, TechTracS identifies and captures all NASA technologies, manages the patent prosecution process, and then tracks their progress en route to commercialization. The system supports all steps involved in various technology transfer activities, and is considered the premier intellectual asset management system used in the federal government today. NASA TechTracS consists of multiple relational databases and web servers, located at each of the 10 field centers, as well as NASA Headquarters. The system is capable of supporting the following functions: planning commercial technologies; commercialization activities; reporting new technologies and inventions; and processing and tracking intellectual property rights, licensing, partnerships, awards, and success stories. NASA TechTracS is critical to the Agency's ongoing mission to commercialize its revolutionary technologies in a variety of sectors within private industry, both aerospace and non- aerospace.

  6. Development of a Smart Mobile Data Module for Fetal Monitoring in E-Healthcare.

    PubMed

    Houzé de l'Aulnoit, Agathe; Boudet, Samuel; Génin, Michaël; Gautier, Pierre-François; Schiro, Jessica; Houzé de l'Aulnoit, Denis; Beuscart, Régis

    2018-03-23

    The fetal heart rate (FHR) is a marker of fetal well-being in utero (when monitoring maternal and/or fetal pathologies) and during labor. Here, we developed a smart mobile data module for the remote acquisition and transmission (via a Wi-Fi or 4G connection) of FHR recordings, together with a web-based viewer for displaying the FHR datasets on a computer, smartphone or tablet. In order to define the features required by users, we modelled the fetal monitoring procedure (in home and hospital settings) via semi-structured interviews with midwives and obstetricians. Using this information, we developed a mobile data transfer module based on a Raspberry Pi. When connected to a standalone fetal monitor, the module acquires the FHR signal and sends it (via a Wi-Fi or a 3G/4G mobile internet connection) to a secure server within our hospital information system. The archived, digitized signal data are linked to the patient's electronic medical records. An HTML5/JavaScript web viewer converts the digitized FHR data into easily readable and interpretable graphs for viewing on a computer (running Windows, Linux or MacOS) or a mobile device (running Android, iOS or Windows Phone OS). The data can be viewed in real time or offline. The application includes tools required for correct interpretation of the data (signal loss calculation, scale adjustment, and precise measurements of the signal's characteristics). We performed a proof-of-concept case study of the transmission, reception and visualization of FHR data for a pregnant woman at 30 weeks of amenorrhea. She was hospitalized in the pregnancy assessment unit and FHR data were acquired three times a day with a Philips Avalon® FM30 fetal monitor. The prototype (Raspberry Pi) was connected to the fetal monitor's RS232 port. 
The emission and reception of prerecorded signals were tested; the web server correctly received the signals, and the FHR recording was visualized in real time on a computer, a tablet and smartphones (running Android and iOS) via the web viewer. This process did not perturb the hospital's computer network, and there was no data delay or loss during a 60-min test. The web viewer was tested successfully in the various usage situations. The system was as user-friendly as expected and enabled rapid, secure archiving. We have developed a system for the acquisition, transmission, recording and visualization of FHR data. Healthcare professionals can view the FHR data remotely on their computer, tablet or smartphone. Integration of FHR data into a hospital information system enables optimal, secure, long-term data archiving.
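The web viewer described above includes a signal-loss calculation. A straightforward definition, assumed here rather than taken from the paper, is the percentage of samples in a recording for which the monitor produced no valid FHR value (encoded as 0 or None below):

```python
# Signal-loss calculation sketch for an FHR trace.

def signal_loss_percent(fhr_samples):
    """Percentage of missing or invalid samples in an FHR recording."""
    if not fhr_samples:
        return 0.0
    missing = sum(1 for sample in fhr_samples if not sample)  # 0 or None = invalid
    return 100.0 * missing / len(fhr_samples)
```

Displaying this figure alongside the trace lets clinicians judge whether a recording is complete enough to interpret.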

  7. CoP Sensing Framework on Web-Based Environment

    NASA Astrophysics Data System (ADS)

    Mustapha, S. M. F. D. Syed

    Web technologies and Web applications have shown similarly high growth rates in terms of daily usage and user acceptance. Web applications have not only penetrated traditional domains such as education and business but have also encroached into areas such as politics, social life, lifestyle, and culture. The emergence of Web technologies has enabled Web access even for the person on the move, through PDAs or mobile phones connected via Wi-Fi, HSDPA, or other communication protocols. These two phenomena drive the need to build Web-based systems as supporting tools for many mundane activities. One of the many focuses of research has therefore been the implementation challenges of building Web-based support systems in different types of environment. This chapter describes the implementation issues in building a community learning framework that can be supported on a Web-based platform. The Community of Practice (CoP) has been chosen as the community learning theory for the case study and analysis, as it challenges the creativity of the architectural design of the Web system to capture the presence of learning activities. The chapter describes the characteristics of the CoP, to clarify the inherent intricacies of modeling it in a Web-based environment; the evidence of CoP activity that needs to be traced automatically and unobtrusively; and the technologies needed for full adoption of a Web-based support system for the community learning framework.

  8. The Effect of Teaching Methods and Learning Style on Learning Program Design in Web-Based Education Systems

    ERIC Educational Resources Information Center

    Hung, Yen-Chu

    2012-01-01

    The instructional value of web-based education systems has been an important area of research in information systems education. This study investigates the effect of various teaching methods on program design learning for students with specific learning styles in web-based education systems. The study takes first-year Computer Science and…

  9. Web-Based Triage in a College Health Setting

    ERIC Educational Resources Information Center

    Sole, Mary Lou; Stuart, Patricia L.; Deichen, Michael

    2006-01-01

    The authors describe the initiation and use of a Web-based triage system in a college health setting. During the first 4 months of implementation, the system recorded 1,290 encounters. More women accessed the system (70%); the average age was 21.8 years. The Web-based triage system advised the majority of students to seek care within 24 hours;…

  10. Using Hi-FAME (High Feedback-Assessment-Multimedia-Environment) Instructional Model in WBI: A Case Study for Biology Teacher Education.

    ERIC Educational Resources Information Center

    Wang, Tzu-Hua; Wang, Wei-Lung; Wang, Kuo-Hua; Huang, Shih-Chieh

    The study attempted to adapt two web tools, FFS system (Frontpage Feedback System) and WATA system (Web-based Assessment and Test Analysis System), to construct a Hi-FAME (High Feedback-Assessment-Multimedia-Environment) Model in WBI (Web-based Instruction) to facilitate pre-service teacher training. Participants were 30 junior pre-service…

  11. The Development of a Web-Based Assessment System to Identify Students' Misconception Automatically on Linear Kinematics with a Four-Tier Instrument Test

    ERIC Educational Resources Information Center

    Pujayanto, Pujayanto; Budiharti, Rini; Adhitama, Egy; Nuraini, Niken Rizky Amalia; Putri, Hanung Vernanda

    2018-01-01

    This research proposes the development of a web-based assessment system to identify students' misconception. The system, named WAS (web-based assessment system), can identify students' misconception profile on linear kinematics automatically after the student has finished the test. The test instrument was developed and validated. Items were…

  12. Health care providers' attitudes towards transfer and transition in young persons with long term illness- a web-based survey.

    PubMed

    Sparud-Lundin, Carina; Berghammer, Malin; Moons, Philip; Bratt, Ewa-Lena

    2017-04-11

    Transition programs in health care for young persons with special health care needs aim to maximize lifelong functioning. Exploring health care professionals' perspectives may increase the chances of successful implementation of transition programs. The aim was to survey health care professionals' attitudes towards components of, and barriers to, transition and transfer in young people with long-term medical conditions and special health care needs. A cross-sectional web-based survey was sent by e-mail to 529 physicians and nurses in Swedish pediatric and adult outpatient clinics. The response rate was 38% (n = 201). The survey consisted of 59 questions regarding different aspects of components of and barriers to transition and transfer. Descriptive statistics were computed to summarize demographic data and categorized responses. The chi-square test was used for comparison between proportions of categories. Most respondents agreed on the destinations of care for adolescents within their specialty. Age and psychosocial aspects such as maturity and family situation were considered the most important initiators of transfer. A joint meeting with the patient (82%), the presence of a transition coordinator (76%) and a written individualized transfer plan (55%) were reported as important transition components. Pediatric care professionals considered the absence of a transition coordinator to be more of a transition barrier than adult care professionals did (p = 0.018), and also a more important transfer component (p = 0.017). Other barriers were lack of funding (45%) and limited clinical space (19%). Transition programs were more common in university hospitals than in regional hospitals (12% vs 2%, p < 0.001), as was having a transition coordinator (12% vs 3%, p = 0.004).
The findings highlight a willingness to work on new transition strategies and provide direction for improvement, taking local transition components as well as potential barriers into consideration when implementing future transition programs. Some differences in attitudes towards transitional care remain between pediatric and adult care professionals.
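
    The proportion comparisons reported in this abstract (e.g., 12% vs 2% of hospitals offering a transition program) rest on the Chi-square test of independence. A minimal sketch of that test for a 2x2 table follows; the counts are hypothetical illustrations, not the study's raw data:

    ```python
    # Chi-square test of independence for a 2x2 contingency table,
    # as used in the survey to compare proportions between groups.
    # All counts below are hypothetical, for illustration only.

    def chi_square_2x2(table):
        """Return the chi-square statistic for a 2x2 table of observed counts."""
        row_totals = [sum(row) for row in table]
        col_totals = [sum(col) for col in zip(*table)]
        total = sum(row_totals)
        chi2 = 0.0
        for i, row in enumerate(table):
            for j, observed in enumerate(row):
                expected = row_totals[i] * col_totals[j] / total
                chi2 += (observed - expected) ** 2 / expected
        return chi2

    # Hypothetical: 40/100 pediatric vs 25/100 adult professionals rating
    # the absence of a transition coordinator as a barrier.
    observed = [[40, 60], [25, 75]]
    print(round(chi_square_2x2(observed), 3))
    ```

    The statistic is then compared against the chi-square distribution with one degree of freedom to obtain the p-values reported in the abstract.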

  13. Isotopic study of mercury sources and transfer between a freshwater lake and adjacent forest food web.

    PubMed

    Kwon, Sae Yun; Blum, Joel D; Nadelhoffer, Knute J; Timothy Dvonch, J; Tsui, Martin Tsz-Ki

    2015-11-01

    Studies of monomethylmercury (MMHg) sources and biogeochemical pathways have been extensive in aquatic ecosystems but limited in forest ecosystems. Increasing evidence suggests that there is significant mercury (Hg) exchange between aquatic and forest ecosystems. We use Hg stable isotope ratios (δ(202)Hg and Δ(199)Hg) to investigate the relative importance of MMHg sources and assess Hg transfer pathways between Douglas Lake and adjacent forests located at the University of Michigan Biological Station, USA. We characterize the Hg isotopic compositions of basal resources and use linear regression of % MMHg versus δ(202)Hg and Δ(199)Hg to estimate Hg isotope values for inorganic mercury (IHg) and MMHg in the aquatic and adjacent forest food webs. In the aquatic ecosystem, we found that lake sediment represents a mixture of IHg pools deposited via watershed runoff and precipitation. The δ(202)Hg and Δ(199)Hg values estimated for IHg are consistent with other studies that measured forest floor in temperate forests. The Δ(199)Hg value estimated for MMHg in the aquatic food web indicates that MMHg undergoes ~20% photochemical degradation prior to bioaccumulation. In the forest ecosystem, we found a significant negative relationship between total Hg and the δ(202)Hg and Δ(199)Hg values of soil collected at multiple distances from the lakeshore, and of lake sediment. This suggests that IHg input from watershed runoff provides an important Hg transfer pathway between the forest and aquatic ecosystems. We measured Δ(199)Hg values for high-trophic-level insects and compared them at multiple distances perpendicular to the lake shoreline. The Δ(199)Hg values correspond to the % canopy cover, suggesting that forest MMHg undergoes varying extents of photochemical degradation and that the extent may be controlled by sunlight exposure. 
Our study demonstrates that the use of Hg isotopes adds important new insight into the relative importance of MMHg sources and complex Hg transfer pathways across ecosystem boundaries. Copyright © 2015 Elsevier B.V. All rights reserved.
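
    The endmember estimation described in this abstract fits a line through isotope values (δ(202)Hg or Δ(199)Hg) versus % MMHg and reads off the intercepts at 0% MMHg (pure IHg) and 100% MMHg (pure MMHg). A minimal ordinary-least-squares sketch of that extrapolation, using made-up isotope values rather than the study's measurements:

    ```python
    # Estimate IHg and MMHg endmember isotope values by extrapolating a
    # linear fit of delta-202Hg against % MMHg, following the mixing-model
    # approach described in the abstract. All numbers are hypothetical.

    def ols_fit(x, y):
        """Return (intercept, slope) of the least-squares line y = a + b*x."""
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
            sum((xi - mean_x) ** 2 for xi in x)
        a = mean_y - b * mean_x
        return a, b

    # Hypothetical food-web samples: (% MMHg, measured delta-202Hg in permil)
    pct_mmhg = [10.0, 30.0, 50.0, 70.0]
    d202 = [-1.40, -1.20, -1.00, -0.80]

    a, b = ols_fit(pct_mmhg, d202)
    ihg_endmember = a               # extrapolation to 0% MMHg
    mmhg_endmember = a + 100.0 * b  # extrapolation to 100% MMHg
    print(ihg_endmember, mmhg_endmember)
    ```

    With real data the fit has scatter, so the study would also report uncertainty on the two intercepts.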

  14. Development of grid-like applications for public health using Web 2.0 mashup techniques.

    PubMed

    Scotch, Matthew; Yip, Kevin Y; Cheung, Kei-Hoi

    2008-01-01

    Development of public health informatics applications often requires the integration of multiple data sources. This process can be challenging due to issues such as differing file formats, schemas, and naming systems, and the need to scrape the content of web pages. A potential solution to these system development challenges is the use of Web 2.0 technologies. In general, Web 2.0 technologies are newer Internet services that encourage and value information sharing and collaboration among individuals. In this case report, we describe the development and use of Web 2.0 technologies, including Yahoo! Pipes, within a public health application that integrates animal, human, and temperature data to assess the risk of West Nile Virus (WNV) outbreaks. The results of development and testing suggest that while Web 2.0 applications are reasonable environments for rapid prototyping, they are not mature enough for large-scale public health data applications. The application, in fact a "system of systems," often failed due to varied timeouts for application response across web sites and services, internal caching errors, and software added to web sites by administrators to manage the load on their servers. In spite of these concerns, the results of this study demonstrate the potential value of grid computing and Web 2.0 approaches in public health informatics.
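
    The failure mode this abstract describes, one slow or erroring upstream source taking down the whole mashup, is commonly mitigated by fetching each source independently and degrading gracefully. A hedged sketch of that pattern; the feed names and fetcher functions are invented for illustration and are not part of the original system:

    ```python
    # Integrate several independent data sources, tolerating per-source
    # failures so that one bad feed does not sink the whole mashup.
    # Feed names and fetcher functions here are hypothetical.

    def integrate(fetchers):
        """Call each fetcher; collect successes and failures separately."""
        results, failures = {}, {}
        for name, fetch in fetchers.items():
            try:
                results[name] = fetch()
            except Exception as exc:  # includes timeouts from the transport layer
                failures[name] = repr(exc)
        return results, failures

    def failing_feed():
        raise TimeoutError("feed timed out")

    # Hypothetical feeds for a WNV risk-assessment mashup:
    fetchers = {
        "dead_birds": lambda: [{"county": "New Haven", "count": 3}],
        "temperature": lambda: {"mean_c": 24.1},
        "human_cases": failing_feed,
    }

    results, failures = integrate(fetchers)
    print(sorted(results), sorted(failures))
    ```

    In a real deployment each fetcher would wrap an HTTP call with its own timeout, so a stalled service surfaces as a recorded failure rather than blocking the remaining feeds indefinitely.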

  15. 106-17 Telemetry Standards Metadata Configuration Chapter 23

    DTIC Science & Technology

    2017-07-01

    Chapter 23 specifies the Metadata Description Language (MDL). Acronyms used in the chapter: HTML (Hypertext Markup Language), MDL (Metadata Description Language), PCM (pulse code modulation), TMATS (Telemetry Attributes Transfer Standard), W3C (World Wide Web Consortium), XML (eXtensible Markup Language), XSD (XML schema document). Part of the Telemetry Network Standard.

  16. U.S.-Mexican Security Cooperation: the Merida Initiative and Beyond

    DTIC Science & Technology

    2010-08-16

    As of mid-2010, those funds had yet to be transferred from the State Department to USAID for implementation. Cited sources include "Cárteles Perturban al Sistema Carcelario" ("Cartels Disrupt the Prison System"), El... and "Quejas a Web" ("Complaints to the Web"), Milenio, July 28, 2010. (Congressional Research Service report.)

  17. 77 FR 25877 - Amendments to ONRR's Web Site and Mailing Addresses and Payment Definitions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-02

    Topics include electronic funds transfers, geothermal energy, Indian lands, mineral royalties, and oil and gas exploration. The Secretary announced the name change of MMS to the Bureau of Ocean Energy Management, Regulation, and Enforcement, and its subsequent reorganization into the Office of Natural Resources Revenue (ONRR), the Bureau of Ocean Energy Management (BOEM), and the Bureau of Safety and Environmental Enforcement (BSEE).

  18. A Coevolutionary Arms Race: Understanding Plant-Herbivore Interactions

    ERIC Educational Resources Information Center

    Becklin, Katie M.

    2008-01-01

    Plants and insects share a long evolutionary history characterized by relationships that affect individual, population, and community dynamics. Plant-herbivore interactions are a prominent feature of this evolutionary history; it is through these interactions that energy is transferred from primary producers to the rest of the food web. Not…

  19. A Guide for Using the Internet.

    ERIC Educational Resources Information Center

    Brown, Herb

    This manual provides an instructional overview of Internet resources with student exercises. The work consists of nine chapters: (1) Introduction to the Internet; (2) How to Access the Internet; (3) Electronic Mail; (4) Gopher; (5) File Transfer Protocol-FTP; (6) Newsgroups & Newsreaders; (7) Telnet & TN3270; (8) World Wide Web; and (9)…

  20. Web-Based Notes Is an Inadequate Learning Resource.

    ERIC Educational Resources Information Center

    Amory, Alan; Naicker, Kevin

    Development of online courses requires the use of appropriate educational philosophies that discourage rote learning and passive transfer of information from teacher to learner. This paper reports on the development, use and evaluation of two second year Biology online software packages used by students in constructivist environments. The courses…
