Focused Crawling of the Deep Web Using Service Class Descriptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rocco, D; Liu, L; Critchlow, T
2004-06-21
Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
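As an illustration of the service-class idea described in this abstract, the following Python sketch treats a service class description as a set of named input and output patterns and checks whether a probed page matches them. The patterns, field names, and sample HTML are hypothetical; DynaBot's actual SCD format and matching analysis are not reproduced here.

```python
# Hypothetical sketch of SCD-style service matching (not DynaBot's actual code).
import re

# A "service class description" here is just named input/output patterns.
SCD_CITATION = {
    "inputs": {"author": r"^[A-Z][a-z]+(?: [A-Z]\.)?$", "year": r"^(19|20)\d{2}$"},
    "outputs": {"doi": r"\b10\.\d{4,9}/\S+\b", "title": r"<h\d[^>]*>.+?</h\d>"},
}

def matches_service_class(scd, sample_inputs, probe_response_html):
    """Return True if the probe inputs fit the SCD input types and the
    response page contains at least one expected output pattern."""
    inputs_ok = all(
        re.match(pattern, sample_inputs.get(name, ""))
        for name, pattern in scd["inputs"].items()
    )
    outputs_found = sum(
        bool(re.search(pattern, probe_response_html, re.IGNORECASE | re.DOTALL))
        for pattern in scd["outputs"].values()
    )
    return inputs_ok and outputs_found > 0

# Example probe of a hypothetical citation service.
html = "<h2>Focused Crawling of the Deep Web</h2> doi:10.1000/xyz123"
print(matches_service_class(SCD_CITATION, {"author": "Rocco D.", "year": "2004"}, html))
```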
Providing Multi-Page Data Extraction Services with XWRAPComposer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ling; Zhang, Jianjun; Han, Wei
2008-04-30
Dynamic Web data sources – sometimes known collectively as the Deep Web – increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DYNABOT, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DYNABOT has three unique characteristics. First, DYNABOT utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DYNABOT employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DYNABOT incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
A Framework for Transparently Accessing Deep Web Sources
ERIC Educational Resources Information Center
Dragut, Eduard Constantin
2010-01-01
An increasing number of Web sites expose their content via query interfaces, many of them offering the same type of products/services (e.g., flight tickets, car rental/purchasing). They constitute the so-called "Deep Web". Accessing the content on the Deep Web has been a long-standing challenge for the database community. For a user interested in…
Stratification-Based Outlier Detection over the Deep Web.
Xian, Xuefeng; Zhao, Pengpeng; Sheng, Victor S; Fang, Ligang; Gu, Caidong; Yang, Yuanfeng; Cui, Zhiming
2016-01-01
For many applications, finding rare instances or outliers can be more interesting than finding common patterns. Existing work in outlier detection has not considered the context of the deep web. In this paper, we argue that, for many scenarios, it is more meaningful to detect outliers over the deep web. In the context of the deep web, users must submit queries through a query interface to retrieve the corresponding data. Therefore, traditional data mining methods cannot be directly applied. The primary contribution of this paper is to develop a new data mining method for outlier detection over the deep web. In our approach, the query space of a deep web data source is stratified based on a pilot sample. Neighborhood sampling and uncertainty sampling are developed in this paper with the goal of improving recall and precision based on the stratification. Finally, a careful performance evaluation of our algorithm confirms that our approach can effectively detect outliers in the deep web.
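The sketch below illustrates the general flavor of stratification-based sampling over a query-only source in Python: a pilot sample defines strata, each stratum is sampled through the query interface, and values far from the stratum mean are flagged. The equal-width strata, thresholds, and toy source are assumptions for illustration; they are not the paper's neighborhood or uncertainty sampling algorithms.

```python
# Illustrative sketch only, assuming numeric query results; not the paper's algorithm.
import random
import statistics

def stratify(pilot_sample, n_strata=4):
    """Split the observed value range of a pilot sample into equal-width strata."""
    lo, hi = min(pilot_sample), max(pilot_sample)
    width = (hi - lo) / n_strata or 1.0
    return [(lo + i * width, lo + (i + 1) * width) for i in range(n_strata)]

def sample_stratum(query_fn, stratum, budget):
    """Issue `budget` queries restricted to one stratum of the query space."""
    return [query_fn(random.uniform(*stratum)) for _ in range(budget)]

def outliers(values, k=3.0):
    """Flag values more than k standard deviations from the stratum mean."""
    mu = statistics.mean(values)
    sd = statistics.pstdev(values) or 1.0
    return [v for v in values if abs(v - mu) > k * sd]

# Toy "deep web source": a query interface we can only sample through a function.
random.seed(1)
source = lambda q: q + random.gauss(0, 1) + (50 if random.random() < 0.01 else 0)
pilot = [source(random.uniform(0, 100)) for _ in range(200)]
for stratum in stratify(pilot):
    print(stratum, outliers(sample_stratum(source, stratum, 100)))
```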
Semantic Annotations and Querying of Web Data Sources
NASA Astrophysics Data System (ADS)
Hornung, Thomas; May, Wolfgang
A large part of the Web, actually holding a significant portion of the useful information throughout the Web, consists of views on hidden databases, provided by numerous heterogeneous interfaces that are partly human-oriented via Web forms ("Deep Web"), and partly based on Web Services (only machine accessible). In this paper we present an approach for annotating these sources in a way that makes them citizens of the Semantic Web. We illustrate how queries can be stated in terms of the ontology, and how the annotations are used to select and access appropriate sources and to answer the queries.
Automatic Generation of Data Types for Classification of Deep Web Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ngu, A H; Buttler, D J; Critchlow, T J
2005-02-14
A Service Class Description (SCD) is an effective meta-data based approach for discovering Deep Web sources whose data exhibit some regular patterns. However, it is tedious and error-prone to create an SCD manually. Moreover, a manually created SCD is not adaptive to the frequent changes of Web sources. It requires its creator to identify all the possible input and output types of a service a priori. In many domains, it is impossible to exhaustively list all the possible input and output data types of a source in advance. In this paper, we describe machine learning approaches for automatic generation of the data types of an SCD. We propose two different approaches for learning data types of a class of Web sources. The Brute-Force Learner is able to generate data types that can achieve high recall, but with low precision. The Clustering-based Learner generates data types that have a high precision rate, but with a lower recall rate. We demonstrate the feasibility of these two learning-based solutions for automatic generation of data types for citation Web sources and present a quantitative evaluation of these two solutions.
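To make the idea of learning data types from sampled field values concrete, the following Python sketch groups values by a coarse character-class signature; each group approximates one candidate data type. This is only an illustration under assumed patterns, not the Brute-Force or Clustering-based Learner described above.

```python
# Minimal sketch of clustering-style data type induction from sample field values.
import re
from collections import defaultdict

def signature(value):
    """Map a field value to a coarse character-class signature, e.g. 'Aa9'."""
    sig = []
    for ch in value:
        cls = "A" if ch.isupper() else "a" if ch.islower() else "9" if ch.isdigit() else ch
        if not sig or sig[-1] != cls:
            sig.append(cls)
    return "".join(sig)

def induce_types(sample_values):
    """Group sampled values by signature; each group approximates one data type."""
    groups = defaultdict(list)
    for v in sample_values:
        groups[signature(v)].append(v)
    return dict(groups)

samples = ["2004", "1998", "Liu, L.", "Critchlow, T.", "10.1000/xyz123"]
for sig, values in induce_types(samples).items():
    print(sig, "->", values)
```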
NASA Astrophysics Data System (ADS)
Hornung, Thomas; Simon, Kai; Lausen, Georg
Combining information from different Web sources is often a tedious and repetitive process; even simple information requests might require iterating over the result list of one Web query and using each result as input for a subsequent query. One approach to such chained queries is data-centric mashups, which allow users to model the data flow visually as a graph in which the nodes represent data sources and the edges the data flow.
deepTools2: a next generation web server for deep-sequencing data analysis.
Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas
2016-07-08
We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continues to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
deepTools: a flexible platform for exploring deep-sequencing data.
Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas
2014-07-01
We present a Galaxy-based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straightforward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools web server is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
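deepTools is driven from the command line; the following Python sketch simply wraps one such call with subprocess. bamCoverage is a real deepTools program, but the exact flags shown are assumptions that may differ between versions, so consult `bamCoverage --help` for the installed release.

```python
# Sketch of driving the deepTools command-line programs from Python via subprocess.
# The flags below are assumptions and may vary between deepTools versions.
import subprocess

def make_coverage_track(bam_path, bigwig_path, bin_size=50):
    """Generate a normalized coverage file (bigWig) from aligned reads (BAM)."""
    cmd = [
        "bamCoverage",
        "--bam", bam_path,
        "--outFileName", bigwig_path,
        "--binSize", str(bin_size),
    ]
    subprocess.run(cmd, check=True)

# Example (assumes deepTools is installed and sample.bam is indexed):
# make_coverage_track("sample.bam", "sample_coverage.bw")
```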
Inside the Web: A Look at Digital Libraries and the Invisible/Deep Web
ERIC Educational Resources Information Center
Su, Mila C.
2009-01-01
The evolution of the Internet and the World Wide Web continually exceeds expectations with the "swift pace" of technological innovations. Information is added, and just as quickly becomes outdated at a rapid pace. Researchers have found that Digital materials can provide access to primary source materials and connect the researcher to institutions…
Hybrid Schema Matching for Deep Web
NASA Astrophysics Data System (ADS)
Chen, Kerui; Zuo, Wanli; He, Fengling; Chen, Yongheng
Schema matching is the process of identifying semantic mappings, or correspondences, between two or more schemas. Schema matching is a first step and critical part of data integration. For schema matching of the deep web, most research is interested only in the query interface and rarely pays attention to the abundant schema information contained in query result pages. This paper proposes a hybrid schema matching technique, which combines attributes that appear in the query structures and query results of different data sources and mines the matched schemas within them. Experimental results demonstrate the effectiveness of this method for improving the accuracy of schema matching.
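A minimal Python sketch of the matching step is shown below: attribute labels collected from query interfaces and from result pages are paired when a crude token-overlap or edit similarity exceeds a threshold. The similarity measure, threshold, and sample labels are assumptions for illustration and do not reproduce the paper's hybrid technique.

```python
# Illustrative attribute matching across sources using simple label similarity.
from difflib import SequenceMatcher

def similarity(a, b):
    """Crude label similarity combining exact-token overlap and edit similarity."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    jaccard = len(ta & tb) / len(ta | tb)
    return max(jaccard, SequenceMatcher(None, a.lower(), b.lower()).ratio())

def match_schemas(interface_attrs, result_attrs, threshold=0.5):
    """Pair attributes seen on the query interface with attributes mined from
    result pages when their labels are similar enough."""
    return [
        (qa, ra, round(similarity(qa, ra), 2))
        for qa in interface_attrs
        for ra in result_attrs
        if similarity(qa, ra) >= threshold
    ]

print(match_schemas(["Author", "Title", "Year"],
                    ["author name", "publication title", "year"]))
```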
DOE Office of Scientific and Technical Information (OSTI.GOV)
None Available
2018-02-06
To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.
Hoelzer, Simon; Schweiger, Ralf K; Rieger, Joerg; Meyer, Michael
2006-01-01
The organizational structures of web contents and electronic information resources must adapt to the demands of a growing volume of information and user requirements. Otherwise the information society will be threatened by disinformation. The biomedical sciences are especially vulnerable in this regard, since they are strongly oriented toward text-based knowledge sources. Here sustainable improvement can only be achieved by using a comprehensive, integrated approach that not only includes data management but also specifically incorporates the editorial processes, including structuring information sources and publication. The technical resources needed to effectively master these tasks are already available in the form of the data standards and tools of the Semantic Web. They include Rich Site Summaries (RSS), which have become an established means of distributing and syndicating conventional news messages and blogs. They can also provide access to the contents of the previously mentioned information sources, which are conventionally classified as 'deep web' content.
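Since the passage above points to RSS as the syndication vehicle for such content, here is a small standard-library Python sketch of consuming an RSS 2.0 feed; the feed URL is a placeholder.

```python
# Sketch of reading an RSS 2.0 feed with the Python standard library.
import urllib.request
import xml.etree.ElementTree as ET

def list_feed_items(feed_url):
    """Print the title and link of each item in an RSS 2.0 feed."""
    with urllib.request.urlopen(feed_url) as response:
        root = ET.fromstring(response.read())
    for item in root.iter("item"):
        print(item.findtext("title"), "-", item.findtext("link"))

# Example (placeholder URL): list_feed_items("https://example.org/updates.rss")
```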
On Building a Search Interface Discovery System
NASA Astrophysics Data System (ADS)
Shestakov, Denis
A huge portion of the Web known as the deep Web is accessible via search interfaces to myriads of databases on the Web. While relatively good approaches for querying the contents of web databases have recently been proposed, they cannot be fully utilized while most search interfaces remain unlocated. Thus, the automatic recognition of search interfaces to online databases is crucial for any application accessing the deep Web. This paper describes the architecture of the I-Crawler, a system for finding and classifying search interfaces. The I-Crawler is intentionally designed to be used in deep web characterization surveys and for constructing directories of deep web resources.
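A first step for any such system is recognizing candidate search interfaces on fetched pages. The Python sketch below uses the standard html.parser module to flag pages containing a form with a free-text input; the heuristic is an assumption for illustration and is far simpler than the I-Crawler's classifier.

```python
# Toy detector for candidate search interfaces (HTML forms with text inputs).
from html.parser import HTMLParser

class FormFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.forms = []          # one dict per <form> found
        self._current = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self._current = {"action": attrs.get("action", ""), "text_inputs": 0}
        elif tag == "input" and self._current is not None:
            if attrs.get("type", "text").lower() in ("text", "search"):
                self._current["text_inputs"] += 1

    def handle_endtag(self, tag):
        if tag == "form" and self._current is not None:
            self.forms.append(self._current)
            self._current = None

def looks_like_search_interface(html):
    finder = FormFinder()
    finder.feed(html)
    # Heuristic: at least one form with a free-text input field.
    return any(f["text_inputs"] >= 1 for f in finder.forms)

print(looks_like_search_interface(
    '<form action="/search"><input type="text" name="q"/></form>'))
```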
Sobczak, W.V.; Cloern, J.E.; Jassby, A.D.; Cole, B.E.; Schraga, T.S.; Arnsberg, A.
2005-01-01
Detritus from terrestrial ecosystems is the major source of organic matter in many streams, rivers, and estuaries, yet the role of detritus in supporting pelagic food webs is debated. We examined the importance of detritus to secondary productivity in the Sacramento and San Joaquin River Delta (California, United States), a large complex of tidal freshwater habitats. The Delta ecosystem has low primary productivity but large detrital inputs, so we hypothesized that detritus is the primary energy source fueling production in pelagic food webs. We assessed the sources, quantity, composition, and bioavailability of organic matter among a diversity of habitats (e.g., marsh sloughs, floodplains, tidal lakes, and deep river channels) over two years to test this hypothesis. Our results support the emerging principle that detritus dominates riverine and estuarine organic matter supply and supports the majority of ecosystem metabolism. Yet in contrast to prevailing ideas, we found that detritus was weakly coupled to the Delta's pelagic food web. Results from independent approaches showed that phytoplankton production was the dominant source of organic matter for the Delta's pelagic food web, even though primary production accounts for a small fraction of the Delta's organic matter supply. If these results are general, they suggest that the value of organic matter to higher trophic levels, including species targeted by programs of ecosystem restoration, is a function of phytoplankton production. © 2005 Estuarine Research Federation.
An insight into the deep web; why it matters for addiction psychiatry?
Orsolini, Laura; Papanti, Duccio; Corkery, John; Schifano, Fabrizio
2017-05-01
Nowadays, the web is rapidly spreading, playing a significant role in the marketing, sale, and distribution of "quasi-legal" drugs, hence facilitating continuous changes in drug scenarios. The easily renewable and anarchic online drug market is indeed gradually transforming the drug market itself from a "street" to a "virtual" one, with customers able to shop with relative anonymity in a 24-hr marketplace. The hidden "deep web" is facilitating this phenomenon. The paper aims at providing mental health and addiction professionals with an overview of current knowledge about pro-drug activities on the deep web. A nonparticipant netnographic qualitative study of a list of pro-drug websites (blogs, fora, and drug marketplaces) located on the surface web was carried out. A systematic Internet search was conducted on Duckduckgo® and Google® using the following keywords: "drugs" or "legal highs" or "Novel Psychoactive Substances" or "NPS" combined with the term deep web. Four themes (e.g., "How to access into the deepweb"; "Darknet and the online drug trading sites"; "Grams-search engine for the deep web"; and "Cryptocurrencies") and 14 categories were generated and discussed. This paper provides a comprehensive and systematic guide to the deep web, focusing on practical information about online drug marketplaces that is useful for addiction professionals. Copyright © 2017 John Wiley & Sons, Ltd.
Automating Information Discovery Within the Invisible Web
NASA Astrophysics Data System (ADS)
Sweeney, Edwina; Curran, Kevin; Xie, Ermai
A Web crawler or spider crawls through the Web looking for pages to index, and when it locates a new page it passes the page on to an indexer. The indexer identifies links, keywords, and other content and stores these within its database. This database is searched by entering keywords through an interface and suitable Web pages are returned in a results page in the form of hyperlinks accompanied by short descriptions. The Web, however, is increasingly moving away from being a collection of documents to a multidimensional repository for sounds, images, audio, and other formats. This is leading to a situation where certain parts of the Web are invisible or hidden. The term "Deep Web" has emerged to refer to the mass of information that can be accessed via the Web but cannot be indexed by conventional search engines. The concept of the Deep Web makes searches quite complex for search engines. Google states that the claim that conventional search engines cannot find such documents as PDFs, Word, PowerPoint, Excel, or any non-HTML page is not fully accurate and steps have been taken to address this problem by implementing procedures to search items such as academic publications, news, blogs, videos, books, and real-time information. However, Google still only provides access to a fraction of the Deep Web. This chapter explores the Deep Web and the current tools available for accessing it.
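The crawl-then-index loop described above can be illustrated with a toy, in-memory Python example; the page contents and links are hypothetical, and a real crawler would fetch live URLs and respect robots.txt.

```python
# Toy illustration of the crawl-then-index loop, restricted to in-memory "pages"
# so it runs without network access.
import re
from collections import defaultdict

PAGES = {  # hypothetical static pages: URL -> (HTML, outgoing links)
    "a.html": ("<p>deep web databases</p>", ["b.html"]),
    "b.html": ("<p>search interfaces to databases</p>", []),
}

def crawl_and_index(seed):
    index = defaultdict(set)            # keyword -> set of URLs
    frontier, seen = [seed], set()
    while frontier:
        url = frontier.pop()
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        html, links = PAGES[url]
        for word in re.findall(r"[a-z]+", html.lower()):
            index[word].add(url)        # the "indexer" step
        frontier.extend(links)          # follow hyperlinks
    return index

index = crawl_and_index("a.html")
print(sorted(index["databases"]))       # keyword lookup, as a search engine would do
```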
Seasonal pathways of organic matter within the Avilés submarine canyon: Food web implications
NASA Astrophysics Data System (ADS)
Romero-Romero, Sonia; Molina-Ramírez, Axayacatl; Höfer, Juan; Duineveld, Gerard; Rumín-Caparrós, Aitor; Sanchez-Vidal, Anna; Canals, Miquel; Acuña, José Luis
2016-11-01
The transport and fate of organic matter (OM) sources within the Avilés submarine canyon (Cantabrian Sea, Southern Bay of Biscay) were studied using carbon and nitrogen stable isotope ratios. The isotopic composition of settling particles and deep bottom sediments closely resembled that of surface particulate OM, and there were no marked differences in the isotopic composition of settling particles inside and outside of the Avilés Canyon (AC). This indicates that the AC receives inputs of sinking OM mostly from the upper water column and less through advective near-bottom down-canyon transport. Sinking OM fluxes are of marine and terrestrial origin in proportions which vary seasonally. Analysis of δ13C in the canyon fauna indicates a dependence on OM mainly produced by marine phytoplankton. A tight coupling of isotopic signatures between pelagic organisms and benthic suspension feeders reflects an active biological vertical transport of OM from the surface to the deep-sea. The food web presented seasonal variations in the trophic niche width and the amplitude of the primary carbon sources, reflecting seasonality in the availability of fresh particulate OM. Those seasonal changes are larger for benthic organisms of lower trophic levels.
Food-Web Complexity in Guaymas Basin Hydrothermal Vents and Cold Seeps.
Portail, Marie; Olu, Karine; Dubois, Stanislas F.; Escobar-Briones, Elva; Gelinas, Yves; Menot, Lénaick; Sarrazin, Jozée
2016-01-01
In the Guaymas Basin, the presence of cold seeps and hydrothermal vents in close proximity, similar sedimentary settings and comparable depths offers a unique opportunity to assess and compare the functioning of these deep-sea chemosynthetic ecosystems. The food webs of five seep and four vent assemblages were studied using stable carbon and nitrogen isotope analyses. Although the two ecosystems shared similar potential basal sources, their food webs differed: seeps relied predominantly on methanotrophy and thiotrophy via the Calvin-Benson-Bassham (CBB) cycle and vents on petroleum-derived organic matter and thiotrophy via the CBB and reductive tricarboxylic acid (rTCA) cycles. In contrast to symbiotic species, the heterotrophic fauna exhibited high trophic flexibility among assemblages, suggesting weak trophic links to the metabolic diversity of chemosynthetic primary producers. At both ecosystems, food webs did not appear to be organised through predator-prey links but rather through weak trophic relationships among co-occurring species. Examples of trophic or spatial niche differentiation highlighted the importance of species-sorting processes within chemosynthetic ecosystems. Variability in food web structure, addressed through Bayesian metrics, revealed consistent trends across ecosystems. Food-web complexity significantly decreased with increasing methane concentrations, a common proxy for the intensity of seep and vent fluid fluxes. Although high fluid-fluxes have the potential to enhance primary productivity, they generate environmental constraints that may limit microbial diversity, colonisation of consumers and the structuring role of competitive interactions, leading to an overall reduction of food-web complexity and an increase in trophic redundancy. Heterogeneity provided by foundation species was identified as an additional structuring factor. According to their biological activities, foundation species may have the potential to partly release the competitive pressure within communities of low fluid-flux habitats. Finally, ecosystem functioning in vents and seeps was highly similar despite environmental differences (e.g. physico-chemistry, dominant basal sources) suggesting that ecological niches are not specifically linked to the nature of fluids. This comparison of seep and vent functioning in the Guaymas basin thus provides further support for the hypothesis of continuity among deep-sea chemosynthetic ecosystems.
WARCProcessor: An Integrative Tool for Building and Management of Web Spam Corpora.
Callón, Miguel; Fdez-Glez, Jorge; Ruano-Ordás, David; Laza, Rosalía; Pavón, Reyes; Fdez-Riverola, Florentino; Méndez, Jose Ramón
2017-12-22
In this work we present the design and implementation of WARCProcessor, a novel multiplatform integrative tool aimed at building scientific datasets to facilitate experimentation in web spam research. The application allows the user to specify multiple criteria that change the way in which new corpora are generated whilst reducing the number of repetitive and error-prone tasks related to existing corpus maintenance. To this end, WARCProcessor supports up to six commonly used data sources for web spam research and can store the output corpus in standard WARC format together with complementary metadata files. Additionally, the application facilitates the automatic and concurrent download of web sites from the Internet, allowing the user to configure the depth of the links to be followed as well as the behaviour when redirected URLs appear. WARCProcessor provides both an interactive GUI and a command-line utility for execution in the background.
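For readers who want to inspect corpora in the WARC format mentioned above, the following Python sketch reads a WARC file with the warcio package (assumed installed via `pip install warcio`); it is unrelated to WARCProcessor's own implementation.

```python
# Sketch of reading a WARC file with the warcio library (assumed installed).
from warcio.archiveiterator import ArchiveIterator

def list_responses(warc_path):
    """Print the target URI of every HTTP response record in a WARC file."""
    with open(warc_path, "rb") as stream:
        for record in ArchiveIterator(stream):
            if record.rec_type == "response":
                print(record.rec_headers.get_header("WARC-Target-URI"))

# Example (placeholder file name): list_responses("corpus.warc.gz")
```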
Deep-sea macrourid fishes scavenge on plant material: Evidence from in situ observations
NASA Astrophysics Data System (ADS)
Jeffreys, Rachel M.; Lavaleye, Marc S. S.; Bergman, Magda J. N.; Duineveld, Gerard C. A.; Witbaard, Rob; Linley, Thom
2010-04-01
Deep-sea benthic communities primarily rely on an allochthonous food source. This may be in the form of phytodetritus or as food falls e.g. sinking carcasses of nekton or debris of marine macrophyte algae. Deep-sea macrourids are the most abundant demersal fish in the deep ocean. Macrourids are generally considered to be the apex predators/scavengers in deep-sea communities. Baited camera experiments and stable isotope analyses have demonstrated that animal carrion derived from the surface waters is an important component in the diets of macrourids; some macrourid stomachs also contained vegetable/plant material e.g. onion peels, oranges, algae. The latter observations led us to the question: is plant material an attractive food source for deep-sea scavenging fish? We simulated a plant food fall using in situ benthic lander systems equipped with a baited time-lapse camera. Abyssal macrourids and cusk-eels were attracted to the bait, both feeding vigorously on the bait, and the majority of the bait was consumed in <30 h. These observations indicate (1) plant material can produce an odour plume similar to that of animal carrion and attracts deep-sea fish, and (2) deep-sea fish readily eat plant material. This represents to our knowledge the first in situ documentation of deep-sea fish ingesting plant material and highlights the variability in the scavenging nature of deep-sea fishes. This may have implications for food webs in areas where macrophyte/seagrass detritus is abundant at the seafloor e.g. canyon systems and continental shelves close to seagrass meadows (Bahamas and Mediterranean).
ERIC Educational Resources Information Center
Turner, Laura
2001-01-01
Focuses on the Deep Web, defined as Web content in searchable databases of the type that can be found only by direct query. Discusses the problems of indexing; inability to find information not indexed in the search engine's database; and metasearch engines. Describes 10 sites created to access online databases or directly search them. Lists ways…
NASA Astrophysics Data System (ADS)
Meillier, Céline; Chatelain, Florent; Michel, Olivier; Bacon, Roland; Piqueras, Laure; Bacher, Raphael; Ayasso, Hacheme
2016-04-01
We present SELFI, the Source Emission Line FInder, a new Bayesian method optimized for detection of faint galaxies in Multi Unit Spectroscopic Explorer (MUSE) deep fields. MUSE is the new panoramic integral field spectrograph at the Very Large Telescope (VLT) that has unique capabilities for spectroscopic investigation of the deep sky. It has provided data cubes with 324 million voxels over a single 1 arcmin² field of view. To address the challenge of faint-galaxy detection in these large data cubes, we developed a new method that processes 3D data either for modeling or for estimation and extraction of source configurations. This object-based approach yields a natural sparse representation of the sources in massive data fields, such as MUSE data cubes. In the Bayesian framework, the parameters that describe the observed sources are considered random variables. The Bayesian model leads to a general and robust algorithm where the parameters are estimated in a fully data-driven way. This detection algorithm was applied to the MUSE observation of Hubble Deep Field-South. With 27 h total integration time, these observations provide a catalog of 189 sources of various categories and with secure redshifts. The algorithm retrieved 91% of the galaxies with only 9% false detections. This method also allowed the discovery of three new Lyα emitters and one [OII] emitter, all without any Hubble Space Telescope counterpart. We analyzed the reasons for failure for some targets, and found that the most important limitation of the method is when faint sources are located in the vicinity of bright spatially resolved galaxies that cannot be approximated by the Sérsic elliptical profile. The software and its documentation are available on the MUSE science web service (muse-vlt.eu/science).
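To show the shape of the data such a detector works on, the toy numpy sketch below injects a faint emission line into a mock (x, y, wavelength) cube and flags voxels whose spectrally smoothed flux exceeds a sigma threshold. The cube size, kernel, and threshold are arbitrary assumptions; SELFI's Bayesian model is far more sophisticated.

```python
# Toy emission-line detection in a mock 3D spectral cube; not SELFI's algorithm.
import numpy as np

rng = np.random.default_rng(0)
cube = rng.normal(0.0, 1.0, size=(50, 50, 200))      # noise-only mock cube
cube[20, 30, 100:105] += 8.0                           # inject a faint line emitter

def detect(cube, threshold=5.0):
    """Return (x, y, wavelength) indices whose spectral running mean is
    `threshold` sigma above the smoothed cube's noise level."""
    kernel = np.ones(5) / 5.0
    smoothed = np.apply_along_axis(lambda s: np.convolve(s, kernel, mode="same"), 2, cube)
    sigma = smoothed.std()
    return np.argwhere(smoothed > threshold * sigma)

print(detect(cube)[:5])
```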
Micro-Power Sources Enabling Robotic Outpost Based Deep Space Exploration
NASA Technical Reports Server (NTRS)
West, W. C.; Whitacre, J. F.; Ratnakumar, B. V.; Brandon, E. J.; Studor, G. F.
2001-01-01
Robotic outpost-based exploration represents a fundamental shift in mission design from conventional, single-spacecraft missions towards a distributed-risk approach with many miniaturized semi-autonomous robots and sensors. This approach can facilitate wide-area sampling and exploration, and may consist of a web of orbiters, landers, or penetrators. To meet the mass and volume constraints of deep space missions such as the Europa Ocean Science Station, the distributed units must be fully miniaturized to leverage the wide-area exploration approach. However, presently there is a dearth of available options for powering these miniaturized sensors and robots. This group is currently examining miniaturized, solid-state batteries as candidates to meet the demands of applications requiring micro-power sources with low power, mass, and volume. These applications may include powering microsensors, battery-backing rad-hard CMOS memory and providing momentary chip back-up power. Additional information is contained in the original extended abstract.
Proactive Support of Internet Browsing when Searching for Relevant Health Information.
Rurik, Clas; Zowalla, Richard; Wiesner, Martin; Pfeifer, Daniel
2015-01-01
Many people use the Internet as one of the primary sources of health information. This is due to the high volume of, and easy access to, freely available information regarding diseases, diagnoses and treatments. However, users may find it difficult to retrieve information which is easily understandable and does not require a deep medical background. In this paper, we present a new kind of Web browser add-on that proactively supports users when searching for relevant health information. Our add-on not only visualizes the understandability of displayed medical text but also provides further recommendations of Web pages which hold similar content but are potentially easier to comprehend.
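One simple proxy for "understandability" that such an add-on could compute is the Flesch Reading Ease score; the Python sketch below implements that formula with a rough syllable heuristic. The threshold and heuristic are assumptions, not the authors' actual model.

```python
# Sketch of readability scoring for page text using the Flesch Reading Ease formula.
import re

def count_syllables(word):
    """Very rough vowel-group syllable count."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

text = "Hypertension means high blood pressure. It often has no symptoms."
score = flesch_reading_ease(text)
print(score, "easy" if score >= 60 else "hard")   # higher scores read more easily
```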
Software for Allocating Resources in the Deep Space Network
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Borden, Chester; Zendejas, Silvino; Baldwin, John
2003-01-01
TIGRAS 2.0 is a computer program designed to satisfy a need for improved means for analyzing the tracking demands of interplanetary space-flight missions upon the set of ground antenna resources of the Deep Space Network (DSN) and for allocating those resources. Written in Microsoft Visual C++, TIGRAS 2.0 provides a single rich graphical analysis environment for use by diverse DSN personnel, by connecting to various data sources (relational databases or files) based on the stages of the analyses being performed. Notable among the algorithms implemented by TIGRAS 2.0 are a DSN antenna-load-forecasting algorithm and a conflict-aware DSN schedule-generating algorithm. Computers running TIGRAS 2.0 can also be connected using SOAP/XML to a Web services server that provides analysis services via the World Wide Web. TIGRAS 2.0 supports multiple windows and multiple panes in each window for users to view and use information, all in the same environment, to eliminate repeated switching among various application programs and Web pages. TIGRAS 2.0 enables the use of multiple windows for various requirements, trajectory-based time intervals during which spacecraft are viewable, ground resources, forecasts, and schedules. Each window includes a time navigation pane, a selection pane, a graphical display pane, a list pane, and a statistics pane.
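As a rough illustration of conflict-aware allocation, the Python sketch below greedily assigns requested tracking passes to antennas so that passes on the same antenna never overlap; the mission names, antenna IDs, and greedy strategy are placeholders and do not represent TIGRAS 2.0's actual algorithms.

```python
# Hedged sketch of conflict-aware antenna allocation; illustrative only.
def allocate(requests, antennas):
    """requests: list of (mission, start_hour, end_hour); returns the schedule
    plus any requests that could not be placed without a conflict."""
    schedule = {a: [] for a in antennas}
    unassigned = []
    for mission, start, end in sorted(requests, key=lambda r: r[1]):
        for antenna, booked in schedule.items():
            if all(end <= s or start >= e for s, e in booked):   # no overlap
                booked.append((start, end))
                print(f"{mission}: {antenna} {start:02d}:00-{end:02d}:00")
                break
        else:
            unassigned.append(mission)
    return schedule, unassigned

allocate([("MER", 1, 5), ("Cassini", 2, 6), ("Voyager", 4, 8)],
         ["DSS-14", "DSS-43"])
```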
Jensen, Sigmund; Neufeld, Josh D; Birkeland, Nils-Kåre; Hovland, Martin; Murrell, John Colin
2008-11-01
Deep-water coral reefs are seafloor environments with diverse biological communities surrounded by cold permanent darkness. Sources of energy and carbon for the nourishment of these reefs are presently unclear. We investigated one aspect of the food web using DNA stable-isotope probing (DNA-SIP). Sediment from beneath a Lophelia pertusa reef off the coast of Norway was incubated until assimilation of 5 micromol 13CH4 g(-1) wet weight occurred. Extracted DNA was separated into 'light' and 'heavy' fractions for analysis of labelling. Bacterial community fingerprinting of PCR-amplified 16S rRNA gene fragments revealed two predominant 13C-specific bands. Sequencing of these bands indicated that carbon from 13CH4 had been assimilated by a Methylomicrobium and an uncultivated member of the Gammaproteobacteria. Cloning and sequencing of 16S rRNA genes from the heavy DNA, in addition to genes encoding particulate methane monooxygenase and methanol dehydrogenase, all linked Methylomicrobium with methane metabolism. Putative cross-feeders were affiliated with Methylophaga (Gammaproteobacteria), Hyphomicrobium (Alphaproteobacteria) and previously unrecognized methylotrophs of the Gammaproteobacteria, Alphaproteobacteria, Deferribacteres and Bacteroidetes. This first marine methane SIP study provides evidence for the presence of methylotrophs that participate in sediment food webs associated with deep-water coral reefs.
DCO-VIVO: A Collaborative Data Platform for the Deep Carbon Science Communities
NASA Astrophysics Data System (ADS)
Wang, H.; Chen, Y.; West, P.; Erickson, J. S.; Ma, X.; Fox, P. A.
2014-12-01
The Deep Carbon Observatory (DCO) is a decade-long scientific endeavor to understand carbon in the complex deep Earth system. Thousands of DCO scientists from institutions across the globe are organized into communities representing four domains of exploration: Extreme Physics and Chemistry, Reservoirs and Fluxes, Deep Energy, and Deep Life. Cross-community and cross-disciplinary collaboration is one of the most distinctive features of DCO's flexible research framework. VIVO is an open-source Semantic Web platform that facilitates cross-institutional researcher and research discovery. It includes a number of standard ontologies that interconnect people, organizations, publications, activities, locations, and other entities of research interest to enable browsing, searching, visualizing, and generating Linked Open (research) Data. The DCO-VIVO solution expedites research collaboration between DCO scientists and communities. Based on DCO's specific requirements, the DCO Data Science team developed a series of extensions to the VIVO platform, including an extended VIVO information model, extended querying over the semantic information within VIVO, integration with other open-source collaborative environments and data management systems, single sign-on, assignment of unique Handles to DCO objects, and publication and dataset ingest extensions that use existing publication systems. We present here the iterative development of these capabilities, which are now in daily use by the DCO community of scientists for research reporting, information sharing, and resource discovery in support of research activities and program management.
Vertical transport of carbon-14 into deep-sea food webs
NASA Astrophysics Data System (ADS)
Pearcy, W. G.; Stuiver, Minze
1983-04-01
During the years 1973 to 1976 the carbon-14 content was higher in epipelagic and vertically migrating, upper mesopelagic animals (caught between 0 and 500 m) than in lower mesopelagic, bathypelagic, and abyssobenthic animals (500 to 5180 m) in the northeastern Pacific Ocean. Only one species of deep-sea fish had a Δ14C value as high as surface-caught fish. The 14C content of most animals was higher than pre-bomb levels, but the relatively low 14C content of most deep-sea animals indicates that the majority of their carbon was not derived directly from a near-surface food chain labeled with bomb carbon. A mean residence time of about 35 y was estimated for the organic carbon pool for abyssobenthic animals based on the relative increase of radiocarbon in surface-dwelling animals since 1967. The results suggest that rapidly sinking particles from surface waters, such as fecal pellets, are not the major source of organic carbon for deep-sea fishes and large benthic invertebrates.
Cyanide Suicide After Deep Web Shopping: A Case Report.
Le Garff, Erwan; Delannoy, Yann; Mesli, Vadim; Allorge, Delphine; Hédouin, Valéry; Tournel, Gilles
2016-09-01
Cyanide is a product that is known for its use in industrial or laboratory processes, as well as for intentional intoxication. The toxicity of cyanide is well described in humans, with rapid inhibition of cellular aerobic metabolism after ingestion or inhalation leading to severe clinical effects that are frequently lethal. We report the case of a young white man found dead in a hotel room after self-poisoning with cyanide ordered on the deep Web. This case suggests the probable use of a complex suicide kit including cyanide as the lethal agent and dextromethorphan as a sedative and anxiolytic substance. This case is an original example of emerging deep Web shopping for illegal drug procurement.
Unveiling the Synchrotron Cosmic Web: Pilot Study
NASA Astrophysics Data System (ADS)
Brown, Shea; Rudnick, Lawrence; Pfrommer, Christoph; Jones, Thomas
2011-10-01
The overall goal of this project is to challenge our current theoretical understanding of the relativistic particle populations in the inter-galactic medium (IGM) through deep 1.4 GHz observations of 13 massive, high-redshift clusters of galaxies. Designed to complement/extend the GMRT radio halo survey (Venturi et al. 2007), these observations will attempt to detect the peaks of the purported synchrotron cosmic-web, and place serious limits on models of CR acceleration and magnetic field amplification during large-scale structure formation. The primary goals of this survey are: 1) Confirm the bi-modal nature of the radio halo population, which favors turbulent re-acceleration of cosmic-ray electrons (CRe) during cluster mergers as the source of the diffuse radio emission; 2) Directly test hadronic secondary models which predict the presence of cosmic-ray protons (CRp) in the cores of massive X-ray clusters; 3) Search in polarization for shock structures, a potential source of CR acceleration in the IGM.
Gardner, James V.; Sulak, Kenneth J.; Dartnell, Peter; Hellequin, Laurent; Calder, Brian R.; Mayer, Larry A.
2000-01-01
An extensive deep (~100 m) reef tract occurs on the Mississippi-Alabama outer continental shelf (OCS). The tract, known as "The Pinnacles", is apparently part of a sequence of drowned reef complexes along the "40-fathom" shelf edge of the northern Gulf of Mexico (Ludwick and Walton, 1957). It is critical to determine the accurate geomorphology of deep reefs because of their importance as benthic habitats for fisheries. The Pinnacles were mapped by Ludwick and Walton (1957) using a single-beam echo sounder but the spatial extent and morphology were interpreted from a series of widely separated, poorly navigated bathymetric profiles. Other recent studies, supported by Minerals Management Service (MMS), used towed sidescan sonars and single-channel seismic-reflection profiling. None of the existing studies provide the quality of geomorphic data necessary for reasonable habitat studies. The fish faunas of shallow hermatypic reefs have been well studied, but those of deep ahermatypic reefs have been relatively ignored. The ecology of deep ahermatypic reefs is fundamentally different from that of hermatypic reefs because autochthonous intracellular symbiotic zooxanthellae (the carbon source for hermatypic corals) do not form the base of the trophic web. Instead, exogenous plankton, transported to the reef by currents, serves as the primary carbon source. Deep OCS reefs also lie below the practical working depths for SCUBA; consequently, remote investigations from a ship or in situ investigations using submersibles or ROVs are required. Community structure and trophodynamics of demersal fishes of the Pinnacles are presently the focus of USGS research. A goal of the research is to answer questions concerning the relative roles played by geomorphology and surficial geology in the interaction with and control of biological differentiation. OCS reefs are important because we now know that such areas are important coral reef fish havens, key spawning sites, and critical early larval and juvenile habitats for economically important sport/food fishes. Also, deep-reef ecosystems as well as the fish populations they sustain are impacted by intensive oil-field development. It is now known that deep OCS reefs function as a key source of re-population (via seasonal and ontogenetic migration) of already heavily impacted inshore reefs. A reflection of this realization is the recent closure by the Gulf States Fisheries Management Council of a 600 mi² area of the Florida Middle Grounds (another unmapped major "40-fathom" OCS reef complex) to commercial fishing to preserve grouper spawning aggregations. It is known that the Pinnacles reefs support a lush fauna of ahermatypic hard corals, soft corals, black corals, sessile crinoids and sponges—together forming a living habitat for a well-developed fish fauna. The fish fauna comprises typical Caribbean reef fishes and Carolinian shelf fishes, plus epipelagic fishes, and a few deep-sea fishes. The base of the megafaunal invertebrate food web is plankton, borne by essentially continuous semi-laminar currents flowing predominantly out of the SW, up, along and across the shelf edge. These currents are intercepted by pinnacles reefs, which lie roughly in two linear tracts, parallel to the coastline (see fig. 1 in report). USGS research initiated in 1997 (Sulak et al., in progress) has demonstrated that the Pinnacles reef fish fauna is dominated by planktivorous fishes.
Ongoing food habits, trophic web and stable isotope analyses by the USGS are reinforcing a basic picture of deep OCS reefs as ecosystems based on exogenous current-borne plankton. Long-term current meter deployments have demonstrated that the >3 m,
2016-07-21
Today's internet has multiple webs. The surface web is what Google and other search engines index and pull based on links. Essentially, the surface...financial records, research and development), and personal data (medical records or legal documents). These are all deep web. Standard search engines don't
Verification Tools Secure Online Shopping, Banking
NASA Technical Reports Server (NTRS)
2010-01-01
Just like rover or rocket technology sent into space, the software that controls these technologies must be extensively tested to ensure reliability and effectiveness. Ames Research Center invented the open-source Java Pathfinder (JPF) toolset for the deep testing of Java-based programs. Fujitsu Labs of America Inc., based in Sunnyvale, California, improved the capabilities of the JPF Symbolic Pathfinder tool, establishing the tool as a means of thoroughly testing the functionality and security of Web-based Java applications such as those used for Internet shopping and banking.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
...-foot-wide, 20-foot-deep excavated power canal; (2) a 55-foot-long, 65-foot-wide, 8-foot-deep excavated... 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc.gov/docs... Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the docket number (P-13743-000, 13753...
Deep pelagic food web structure as revealed by in situ feeding observations.
Choy, C Anela; Haddock, Steven H D; Robison, Bruce H
2017-12-06
Food web linkages, or the feeding relationships between species inhabiting a shared ecosystem, are an ecological lens through which ecosystem structure and function can be assessed, and thus are fundamental to informing sustainable resource management. Empirical feeding datasets have traditionally been painstakingly generated from stomach content analysis, direct observations and from biochemical trophic markers (stable isotopes, fatty acids, molecular tools). Each approach carries inherent biases and limitations, as well as advantages. Here, using 27 years (1991-2016) of in situ feeding observations collected by remotely operated vehicles (ROVs), we quantitatively characterize the deep pelagic food web of central California within the California Current, complementing existing studies of diet and trophic interactions with a unique perspective. Seven hundred and forty-three independent feeding events were observed with ROVs from near-surface waters down to depths approaching 4000 m, involving an assemblage of 84 different predators and 82 different prey types, for a total of 242 unique feeding relationships. The greatest diversity of prey was consumed by narcomedusae, followed by physonect siphonophores, ctenophores and cephalopods. We highlight key interactions within the poorly understood 'jelly web', showing the importance of medusae, ctenophores and siphonophores as key predators, whose ecological significance is comparable to large fish and squid species within the central California deep pelagic food web. Gelatinous predators are often thought to comprise relatively inefficient trophic pathways within marine communities, but we build upon previous findings to document their substantial and integral roles in deep pelagic food webs. © 2017 The Authors.
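Feeding observations like those described above are naturally represented as a directed graph. The Python sketch below builds such a graph with the networkx package from a few hypothetical predator-prey pairs and reports simple link counts; the species and events are placeholders, not the study's data.

```python
# Sketch of turning observed feeding events into a directed food-web graph.
import networkx as nx

feeding_events = [            # (predator, prey) pairs from hypothetical observations
    ("narcomedusa", "copepod"),
    ("narcomedusa", "fish larva"),
    ("siphonophore", "copepod"),
    ("squid", "siphonophore"),
]

web = nx.DiGraph()
web.add_edges_from(feeding_events)

# Unique feeding relationships and per-predator prey diversity.
print(web.number_of_edges())
print({p: web.out_degree(p) for p in web if web.out_degree(p) > 0})
```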
Dining in the Deep: The Feeding Ecology of Deep-Sea Fishes
NASA Astrophysics Data System (ADS)
Drazen, Jeffrey C.; Sutton, Tracey T.
2017-01-01
Deep-sea fishes inhabit ~75% of the biosphere and are a critical part of deep-sea food webs. Diet analysis and more recent trophic biomarker approaches, such as stable isotopes and fatty-acid profiles, have enabled the description of feeding guilds and an increased recognition of the vertical connectivity in food webs in a whole-water-column sense, including benthic-pelagic coupling. Ecosystem modeling requires data on feeding rates; the available estimates indicate that deep-sea fishes have lower per-individual feeding rates than coastal and epipelagic fishes, but the overall predation impact may be high. A limited number of studies have measured the vertical flux of carbon by mesopelagic fishes, which appears to be substantial. Anthropogenic activities are altering deep-sea ecosystems and their services, which are mediated by trophic interactions. We also summarize outstanding data gaps.
[Oncologic gynecology and the Internet].
Gizler, Robert; Bielanów, Tomasz; Kulikiewicz, Krzysztof
2002-11-01
The strategy of searching the World Wide Web for medical sites was presented in this article. Both "deep web" and "surface web" resources were searched. The 10 best sites connected with gynecological oncology, in the authors' opinion, were presented.
Moby and Moby 2: creatures of the deep (web).
Vandervalk, Ben P; McCarthy, E Luke; Wilkinson, Mark D
2009-03-01
Facile and meaningful integration of data from disparate resources is the 'holy grail' of bioinformatics. Some resources have begun to address this problem by providing their data using Semantic Web standards, specifically the Resource Description Framework (RDF) and the Web Ontology Language (OWL). Unfortunately, adoption of Semantic Web standards has been slow overall, and even in cases where the standards are being utilized, interconnectivity between resources is rare. In response, we have seen the emergence of centralized 'semantic warehouses' that collect public data from third parties, integrate it, translate it into OWL/RDF and provide it to the community as a unified and queryable resource. One limitation of the warehouse approach is that queries are confined to the resources that have been selected for inclusion. A related problem, perhaps of greater concern, is that the majority of bioinformatics data exists in the 'Deep Web'-that is, the data does not exist until an application or analytical tool is invoked, and therefore does not have a predictable Web address. The inability to utilize Uniform Resource Identifiers (URIs) to address this data is a barrier to its accessibility via URI-centric Semantic Web technologies. Here we examine 'The State of the Union' for the adoption of Semantic Web standards in the health care and life sciences domain by key bioinformatics resources, explore the nature and connectivity of several community-driven semantic warehousing projects, and report on our own progress with the CardioSHARE/Moby-2 project, which aims to make the resources of the Deep Web transparently accessible through SPARQL queries.
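SPARQL is the query language these Semantic Web resources expose. The Python sketch below issues a query through the SPARQLWrapper package; the endpoint URL and query are placeholders for illustration.

```python
# Sketch of a SPARQL query issued from Python with the SPARQLWrapper package.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://example.org/sparql")   # hypothetical endpoint
endpoint.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?gene ?label WHERE {
        ?gene rdfs:label ?label .
    } LIMIT 5
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["gene"]["value"], row["label"]["value"])
```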
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-30
... wide by 50 feet long by 30 feet deep; (3) the existing 50-foot-long by 20-foot-wide by 30-foot- deep... Commission's Web site ( http://www.ferc.gov/docs-filing/ferconline.asp ) under the ``eFiling'' link. For a... 20426. For more information on how to submit these types of filings please go to the Commission's Web...
Trophic dynamics of deep-sea megabenthos are mediated by surface productivity.
Tecchio, Samuele; van Oevelen, Dick; Soetaert, Karline; Navarro, Joan; Ramírez-Llodra, Eva
2013-01-01
Most deep-sea benthic ecosystems are food limited and, in the majority of cases, are driven by the organic matter falling from the surface or advected downslope. Species may adapt to this scarceness by applying a wide variety of responses, such as feeding specialisation, niche width variation, and reduction in metabolic rates. The Mediterranean Sea hosts a gradient of food availability at the deep seafloor over its wide longitudinal transect. In the Mediterranean, broad regional studies on trophic habits are almost absent, and the response of deep-sea benthos to different trophic conditions is still speculative. Here, we show that both primary and secondary production processes taking place at surface layers are key drivers of deep-sea food web structuring. By employing an innovative statistical tool, we interpreted bulk-tissue δ13C and δ15N isotope ratios in benthic megafauna, and associated surface and mesopelagic components from the 3 basins of the Mediterranean Sea at 3 different depths (1200, 2000, and 3000 m). The trophic niche width and the amplitude of primary carbon sources were positively correlated with both primary and secondary surface production indicators. Moreover, mesopelagic organic matter utilization processes showed an intermediate position between surface and deep benthic components. These results shed light on the understanding of deep-sea ecosystems functioning and, at the same time, they demand further investigation.
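One widely used isotopic-niche metric compatible with the ellipse-based approach mentioned in related studies is the standard ellipse area (SEA). The numpy sketch below computes SEA from synthetic δ13C and δ15N values; the data are random placeholders and the Bayesian treatment used in the paper is not reproduced.

```python
# Sketch of the standard ellipse area (SEA) isotopic-niche metric with numpy.
import numpy as np

rng = np.random.default_rng(42)
d13c = rng.normal(-19.0, 0.8, 40)          # synthetic delta13C values (per mil)
d15n = rng.normal(12.0, 1.2, 40)           # synthetic delta15N values (per mil)

def standard_ellipse_area(x, y):
    """SEA = pi * a * b, where a and b are the semi-axes given by the square
    roots of the eigenvalues of the covariance matrix of (x, y)."""
    cov = np.cov(x, y)
    eigenvalues = np.linalg.eigvalsh(cov)
    return np.pi * np.sqrt(eigenvalues[0]) * np.sqrt(eigenvalues[1])

print(round(standard_ellipse_area(d13c, d15n), 2))   # niche width, in per-mil^2
```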
Demopoulos, Amanda W.J.; McClain-Counts, Jennifer; Ross, Steve W.; Brooke, Sandra; Mienis, Furu
2017-01-01
Examination of food webs and trophic niches provides insights into organisms' functional ecology, yet few studies have examined trophodynamics within submarine canyons, where the interaction of canyon morphology and oceanography influences habitat provision and food deposition. Using stable isotope analysis and Bayesian ellipses, we documented deep-sea food-web structure and trophic niches in Baltimore Canyon and the adjacent open slopes in the US Mid-Atlantic Region. Results revealed isotopically diverse feeding groups, comprising approximately 5 trophic levels. Regression analysis indicated that consumer isotope data are structured by habitat (canyon vs. slope), feeding group, and depth. Benthic feeders were enriched in 13C and 15N relative to suspension feeders, consistent with consuming older, more refractory organic matter. In contrast, canyon suspension feeders had the largest and most distinct isotopic niche, indicating they consume an isotopically discrete food source, possibly fresher organic material. The wider isotopic niche observed for canyon consumers indicated the presence of feeding specialists and generalists. High dispersion in δ13C values for canyon consumers suggests that the isotopic composition of particulate organic matter changes, which is linked to depositional dynamics, resulting in discrete zones of organic matter accumulation or resuspension. Heterogeneity in habitat and food availability likely enhances trophic diversity in canyons. Given their abundance in the world's oceans, our results from Baltimore Canyon suggest that submarine canyons may represent important havens for trophic diversity.
DEEPWATER AND NEARSHORE FOOD WEB CHARACTERIZATIONS IN LAKE SUPERIOR
Due to the difficulty associated with sampling deep aquatic systems, food web relationships among deepwater fauna are often poorly known. We are characterizing nearshore versus offshore habitats in the Great Lakes and investigating food web linkages among profundal, pelagic, and ...
NASA Astrophysics Data System (ADS)
McMahon, K.; McCarthy, M. D.; Guilderson, T. P.; Sherwood, O.; Williams, B.; Larsen, T.; Glynn, D. S.
2017-12-01
Future climate change is predicted to alter ocean productivity, food web dynamics, biogeochemical cycling, and the efficacy of the biological pump. Proteinaceous deep-sea corals act as "living sediment traps," providing long-term, high-resolution records of exported surface ocean production and a window into past changes in ocean condition as a historical context for potential future changes. Here, we present recent work developing the application of compound-specific stable isotope analysis of individual amino acids to proteinaceous deep-sea corals to reconstruct past changes in phytoplankton community composition and biogeochemical cycling. We present new calibrations for molecular isotope comparisons between metabolically active coral polyp tissue and bioarchival proteinaceous skeleton. We then applied these techniques to deep-sea corals from the North Pacific Subtropical Gyre (NPSG) to reconstruct centennial to millennial time scale changes in phytoplankton community composition and biogeochemical cycling as a function of regional climate change. This work suggests that the NPSG has undergone multiple major phytoplankton regime shifts over the last millennium between prokaryotic and eukaryotic phytoplankton communities and associated sources of nitrogen fueling production. The most recent regime, which started around the end of the Little Ice Age and the onset of the Industrial era, is unprecedented in the last 1000 years and resulted in a 30-50% increase in diazotrophic cyanobacteria contribution to export production and an associated 17-27% increase in N2-fixation in the NPSG over the last century. By offering the first direct phylogenetic context for long-term shifts in isotopic records of exported particulate organic matter, our data represent a major step forward in understanding the evolution of marine plankton community dynamics, food web architecture, biogeochemical cycling, and the climate feedback loops through the biological pump.
Dancing girl flap: a new flap suitable for web release.
Shinya, K
1999-12-01
To create a deep web, a flap must be designed to have a high elongation effect in one direction along the mid-lateral line of the finger and also to have a shortening effect in the other direction, crossing at a right angle to the mid-lateral line. The dancing girl flap is a modification of a four-flap Z-plasty with two additional Z-plasties. It has a high elongation effect in one direction (>550%) and a shortening effect in the other direction at a right angle (<33%), creating a deep, U-shaped surface. This new flap can be used to release severe scar contracture with a web, and is most suitable for incomplete syndactyly with webs as high as the proximal interphalangeal joint.
Domain-specific Web Service Discovery with Service Class Descriptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rocco, D; Caverlee, J; Liu, L
2005-02-14
This paper presents DynaBot, a domain-specific web service discovery system. The core idea of the DynaBot service discovery system is to use domain-specific service class descriptions powered by an intelligent Deep Web crawler. In contrast to current registry-based service discovery systems--like the several available UDDI registries--DynaBot promotes focused crawling of the Deep Web of services and discovers candidate services that are relevant to the domain of interest. It uses intelligent filtering algorithms to match services found by focused crawling with the domain-specific service class descriptions. We demonstrate the capability of DynaBot through the BLAST service discovery scenario and describe our initial experience with DynaBot.
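Editor's note: as a concrete illustration of the SCD-based matching idea summarized above, the following Python sketch scores a probed Deep Web form against a simple service class description. It is a minimal sketch under assumed data structures (ServiceClassDescription, match_score, the keyword sets are all hypothetical); it does not reproduce DynaBot's actual representations or algorithms.

```python
# Minimal, illustrative sketch of SCD-based service matching in the spirit of
# DynaBot. All names and thresholds are hypothetical, not the real system's.
from dataclasses import dataclass, field

@dataclass
class ServiceClassDescription:
    """A service class described by the input fields a form should expose
    and the vocabulary expected on its result pages."""
    name: str
    input_keywords: set = field(default_factory=set)
    result_keywords: set = field(default_factory=set)

def match_score(scd, form_field_labels, result_page_text):
    """Score a probed form against an SCD: fraction of expected input keywords
    seen in the form labels plus fraction of expected result vocabulary seen
    in a sample result page returned by probing."""
    labels = {label.lower() for label in form_field_labels}
    input_hits = sum(1 for kw in scd.input_keywords if any(kw in label for label in labels))
    result_hits = sum(1 for kw in scd.result_keywords if kw in result_page_text.lower())
    input_frac = input_hits / max(len(scd.input_keywords), 1)
    result_frac = result_hits / max(len(scd.result_keywords), 1)
    return 0.5 * (input_frac + result_frac)

blast_scd = ServiceClassDescription(
    name="sequence-search",
    input_keywords={"sequence", "database", "program"},
    result_keywords={"alignment", "e-value", "identities"},
)

score = match_score(blast_scd,
                    ["Enter sequence", "Choose database", "Program"],
                    "Alignments ... Identities = 98% ... E-value = 1e-50")
print(f"candidate service matches '{blast_scd.name}' with score {score:.2f}")
```

In such a scheme, a focused crawler would keep a candidate service whenever its best score against any SCD exceeds a tuned threshold.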
Deep pelagic food web structure as revealed by in situ feeding observations
Haddock, Steven H. D.; Robison, Bruce H.
2017-01-01
Food web linkages, or the feeding relationships between species inhabiting a shared ecosystem, are an ecological lens through which ecosystem structure and function can be assessed, and thus are fundamental to informing sustainable resource management. Empirical feeding datasets have traditionally been painstakingly generated from stomach content analysis, direct observations and from biochemical trophic markers (stable isotopes, fatty acids, molecular tools). Each approach carries inherent biases and limitations, as well as advantages. Here, using 27 years (1991–2016) of in situ feeding observations collected by remotely operated vehicles (ROVs), we quantitatively characterize the deep pelagic food web of central California within the California Current, complementing existing studies of diet and trophic interactions with a unique perspective. Seven hundred and forty-three independent feeding events were observed with ROVs from near-surface waters down to depths approaching 4000 m, involving an assemblage of 84 different predators and 82 different prey types, for a total of 242 unique feeding relationships. The greatest diversity of prey was consumed by narcomedusae, followed by physonect siphonophores, ctenophores and cephalopods. We highlight key interactions within the poorly understood ‘jelly web’, showing the importance of medusae, ctenophores and siphonophores as key predators, whose ecological significance is comparable to large fish and squid species within the central California deep pelagic food web. Gelatinous predators are often thought to comprise relatively inefficient trophic pathways within marine communities, but we build upon previous findings to document their substantial and integral roles in deep pelagic food webs. PMID:29212727
Net-Centric Sensors and Data Sources (N-CSDS) GEODSS Sidecar
NASA Astrophysics Data System (ADS)
Richmond, D.
2012-09-01
Vast amounts of Space Situational Sensor data are collected each day on closed, legacy systems. Massachusetts Institute of Technology Lincoln Laboratory (MIT/LL) developed a Net-Centric approach to expose this data under the Extended Space Sensors Architecture (ESSA) Advanced Concept Technology Demonstration (ACTD). The Net-Centric Sensors and Data Sources (N-CSDS) Ground-based Electro Optical Deep Space Surveillance (GEODSS) Sidecar is the next generation that moves the ESSA ACTD engineering tools to an operational baseline. The N-CSDS GEODSS sidecar high level architecture will be presented, highlighting the features that support deployment at multiple diverse sensor sites. Other key items that will be covered include: 1) the Web Browser interface to perform searches of historical data; 2) the capabilities of the deployed Web Services and example service request/responses; 3) example data and potential user applications; 4) specifics regarding the process to gain access to the N-CSDS GEODSS sensor data in near real time; 5) current status and future deployment plans (including plans for deployment to the Maui GEODSS Site).
The deep lymphatic anatomy of the hand.
Ma, Chuan-Xiang; Pan, Wei-Ren; Liu, Zhi-An; Zeng, Fan-Qiang; Qiu, Zhi-Qiang
2018-07-01
The deep lymphatic anatomy of the hand still remains the least described in medical literature. Eight hands were harvested from four nonembalmed human cadavers amputated above the wrist. A small amount of 6% hydrogen peroxide was employed to detect the lymphatic vessels around the superficial and deep palmar vascular arches, in webs from the index to little fingers, the thenar and hypothenar areas. A 30-gauge needle was inserted into the vessels and injected with a barium sulphate compound. Each specimen was dissected, photographed and radiographed to demonstrate deep lymphatic distribution of the hand. Five groups of deep collecting lymph vessels were found in the hand: superficial palmar arch lymph vessel (SPALV); deep palmar arch lymph vessel (DPALV); thenar lymph vessel (TLV); hypothenar lymph vessel (HTLV); deep finger web lymph vessel (DFWLV). Each group of vessels drained in different directions first, then all turned and ran towards the wrist in different layers. The deep lymphatic drainage of the hand has been presented. The results will provide an anatomical basis for clinical management, educational reference and scientific research. Copyright © 2018 Elsevier GmbH. All rights reserved.
Userscripts for the life sciences.
Willighagen, Egon L; O'Boyle, Noel M; Gopalakrishnan, Harini; Jiao, Dazhi; Guha, Rajarshi; Steinbeck, Christoph; Wild, David J
2007-12-21
The web has seen an explosion of chemistry and biology related resources in the last 15 years: thousands of scientific journals, databases, wikis, blogs and resources are available with a wide variety of types of information. There is a huge need to aggregate and organise this information. However, the sheer number of resources makes it unrealistic to link them all in a centralised manner. Instead, search engines to find information in those resources flourish, and formal languages like Resource Description Framework and Web Ontology Language are increasingly used to allow linking of resources. A recent development is the use of userscripts to change the appearance of web pages, by on-the-fly modification of the web content. This opens possibilities to aggregate information and computational results from different web resources into the web page of one of those resources. Several userscripts are presented that enrich biology and chemistry related web resources by incorporating or linking to other computational or data sources on the web. The scripts make use of Greasemonkey-like plugins for web browsers and are written in JavaScript. Information from third-party resources is extracted using open Application Programming Interfaces, while common Universal Resource Locator schemes are used to make deep links to related information in that external resource. The userscripts presented here use a variety of techniques and resources, and show the potential of such scripts. This paper discusses a number of userscripts that aggregate information from two or more web resources. Examples are shown that enrich web pages with information from other resources, and show how information from web pages can be used to link to, search, and process information in other resources. Due to the nature of userscripts, scientists are able to select those scripts they find useful on a daily basis, as the scripts run directly in their own web browser rather than on the web server. This flexibility allows the scientists to tune the features of web resources to optimise their productivity.
Userscripts for the Life Sciences
Willighagen, Egon L; O'Boyle, Noel M; Gopalakrishnan, Harini; Jiao, Dazhi; Guha, Rajarshi; Steinbeck, Christoph; Wild, David J
2007-01-01
Background The web has seen an explosion of chemistry and biology related resources in the last 15 years: thousands of scientific journals, databases, wikis, blogs and resources are available with a wide variety of types of information. There is a huge need to aggregate and organise this information. However, the sheer number of resources makes it unrealistic to link them all in a centralised manner. Instead, search engines to find information in those resources flourish, and formal languages like Resource Description Framework and Web Ontology Language are increasingly used to allow linking of resources. A recent development is the use of userscripts to change the appearance of web pages, by on-the-fly modification of the web content. This opens possibilities to aggregate information and computational results from different web resources into the web page of one of those resources. Results Several userscripts are presented that enrich biology and chemistry related web resources by incorporating or linking to other computational or data sources on the web. The scripts make use of Greasemonkey-like plugins for web browsers and are written in JavaScript. Information from third-party resources is extracted using open Application Programming Interfaces, while common Universal Resource Locator schemes are used to make deep links to related information in that external resource. The userscripts presented here use a variety of techniques and resources, and show the potential of such scripts. Conclusion This paper discusses a number of userscripts that aggregate information from two or more web resources. Examples are shown that enrich web pages with information from other resources, and show how information from web pages can be used to link to, search, and process information in other resources. Due to the nature of userscripts, scientists are able to select those scripts they find useful on a daily basis, as the scripts run directly in their own web browser rather than on the web server. This flexibility allows the scientists to tune the features of web resources to optimise their productivity. PMID:18154664
Polychlorinated Biphenyl (PCB) Bioaccumulation in Fish: A Look at Michigan's Upper Peninsula
NASA Astrophysics Data System (ADS)
Sokol, E. C.; Urban, N. R.; Perlinger, J. A.; Khan, T.; Friedman, C. L.
2014-12-01
Fish consumption is an important economic, social and cultural component of Michigan's Upper Peninsula, where safe fish consumption is threatened due to polychlorinated biphenyl (PCB) contamination. Despite its predominantly rural nature, the Upper Peninsula has a history of industrial PCB use. PCB congener concentrations in fish vary 50-fold in Upper Peninsula lakes. Several factors may contribute to this high variability including local point sources, unique watershed and lake characteristics, and food web structure. It was hypothesized that the variability in congener distributions could be used to identify factors controlling concentrations in fish, and then to use those factors to predict PCB contamination in fish from lakes that had not been monitored. Watershed and lake characteristics were acquired from several databases for 16 lakes sampled in the State's fish contaminant survey. Species congener distributions were compared using Principal Component Analysis (PCA) to distinguish between lakes with local vs. regional, atmospheric sources; six lakes were predicted to have local sources and half of those have confirmed local PCB use. For lakes without local PCB sources, PCA indicated that lake size was the primary factor influencing PCB concentrations. The EPA's bioaccumulation model, BASS, was used to predict PCB contamination in lakes without local sources as a function of food web characteristics. The model was used to evaluate the hypothesis that deep, oligotrophic lakes have longer food webs and higher PCB concentrations in top predator fish. Based on these findings, we will develop a mechanistic watershed-lake model to predict PCB concentrations in fish as a function of atmospheric PCB concentrations, lake size, and trophic state. Future atmospheric concentrations, predicted by modeling potential primary and secondary emission scenarios, will be used to predict the time horizon for safe fish consumption.
NASA Astrophysics Data System (ADS)
Baker, Philip; Minzlaff, Ulrike; Schoenle, Alexandra; Schwabe, Enrico; Hohlfeld, Manon; Jeuck, Alexandra; Brenke, Nils; Prausse, Dennis; Rothenbeck, Marcel; Brix, Saskia; Frutos, Inmaculada; Jörger, Katharina M.; Neusser, Timea P.; Koppelmann, Rolf; Devey, Colin; Brandt, Angelika; Arndt, Hartmut
2018-02-01
Deep-sea ecosystems, limited by their inability to use primary production as a source of carbon, rely on other sources to maintain life. Sedimentation of organic carbon into the deep sea has been previously studied; however, the high biomass of sedimented Sargassum algae discovered during the VEMA Transit expedition in 2014/2015 to the southern North Atlantic, and its potential as a regular carbon input, has been an underestimated phenomenon. To determine the potential for this carbon flux, a literature survey of previous studies that estimated the abundance of surface water Sargassum was conducted. We compared these estimates with quantitative analyses of sedimented Sargassum appearing on photos taken with an autonomous underwater vehicle (AUV) directly above the abyssal sediment during the expedition. Organismal communities associated with Sargassum fluitans from surface waters were investigated and Sargassum samples collected from surface waters and the deep sea were biochemically analyzed (fatty acids, stable isotopes, C:N ratios) to determine degradation potential and the trophic significance within deep-sea communities. The estimated Sargassum biomass (fresh weight) in the deep sea (0.07-3.75 g/m2) was several times higher than that estimated from surface waters in the North Atlantic (0.024-0.84 g/m2). Biochemical analysis showed degradation of Sargassum occurring during sedimentation or in the deep sea; however, fatty acid and stable isotope analysis did not indicate direct trophic interactions between the algae and benthic organisms. Thus, it is assumed that components of the deep-sea microbial food web form an important link between the macroalgae and larger benthic organisms. Evaluation of the epifauna showed a diverse nano-, micro-, meio-, and macrofauna on surface Sargassum that may be transported across the Atlantic, but we had no evidence for a vertical exchange of faunal components. The large-scale sedimentation of Sargassum forms an important trophic link between surface and benthic production and has to be further considered in the future as a regular carbon input to the deep-sea floor in the North Atlantic.
Naesström, Matilda; Blomstedt, Patric; Hariz, Marwan; Bodlund, Owe
2017-01-01
Deep brain stimulation (DBS) is under investigation for severe obsessive-compulsive disorder (OCD) resistant to other therapies. The number of implants worldwide is slowly increasing. Therefore, it is of importance to explore knowledge and concerns of this novel treatment among patients and their psychiatric healthcare contacts. This information is relevant for scientific professionals working with clinical studies of DBS for this indication, especially for future study designs and the creation of information targeting healthcare professionals and patients. The aim of this study was to explore the knowledge and concerns toward DBS among patients with OCD, psychiatrists, and cognitive behavioral therapists. The study was conducted through web-based surveys of the target groups: psychiatrists, patients, and cognitive behavioral therapists. The surveys contained questions regarding previous knowledge of DBS, source of knowledge, attitudes, and concerns towards the therapy. Among psychiatrists and psychotherapists, the main source of information was scientific publications; the patients' main source of information was the media. Common concerns among the groups included complications from surgery, anesthesia, stimulation side effects, and the novelty of the treatment. Specific concerns for the groups included personality changes, mentioned by patients and psychotherapists, and ethical concerns among psychiatrists. The participants of this study identified several challenges for DBS in OCD: the source and quality of information, efficacy, potential adverse effects, and eligibility, for all of which the current evidence base is still limited. A broad research agenda is needed for studies going forward.
ERIC Educational Resources Information Center
Lagoze, Carl; Neylon, Eamonn; Mooney, Stephen; Warnick, Walter L.; Scott, R. L.; Spence, Karen J.; Johnson, Lorrie A.; Allen, Valerie S.; Lederman, Abe
2001-01-01
Includes four articles that discuss Dublin Core metadata, digital rights management and electronic books, including interoperability; and directed query engines, a type of search engine designed to access resources on the deep Web that is being used at the Department of Energy. (LRW)
Multibeam mapping of the West Florida Shelf, Gulf of Mexico
Gardner, James V.; Dartnell, Peter; Sulak, Kenneth J.
2002-01-01
A zone of deep-water reefs is thought to extend from the mid and outer shelf south of Mississippi and Alabama to at least the northwestern Florida shelf off Panama City, Florida (Figure 1). The reefs off Mississippi and Alabama are found in water depths of 60 to 120 m (Ludwick and Walton, 1957; Gardner et al., 2001, in press) and were the focus of a multibeam echosounder (MBES) mapping survey by the U.S. Geological Survey (USGS) in 2000 (Gardner et al., 2000; Gardner et al., 2001, in press). If this deep-water-reef trend does exist along the northwestern Florida shelf, then it is critical to accurately determine the geomorphology and type of the reefs that occur there, because of their importance as benthic habitats for fisheries. Georeferenced high-resolution mapping of bathymetry is a fundamental first step in the study of areas suspected to be critical habitats. Morphology is thought to be critical to defining the distribution of dominant demersal plankton/planktivore communities. Fish faunas of shallow hermatypic reefs have been well studied, but those of deep ahermatypic reefs have been relatively ignored. The ecology of deep-water ahermatypic reefs is fundamentally different from hermatypic reefs because autochthonous intracellular symbiotic zooxanthellae (the carbon source for hermatypic corals) do not form the base of the trophic web in ahermatypic reefs. Instead, exogenous plankton, transported to the reef by currents, serves as the primary carbon source. Thus, one of the principal uses of the morphology data will be to identify whether any reefs found are hermatypic or ahermatypic in origin.
76 FR 67456 - Common Formats for Patient Safety Data Collection and Event Reporting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
... Common Formats, can be accessed electronically at the following HHS Web site: http://www.PSO.AHRQ.gov... Thromboembolism (VTE), which includes Deep Vein Thrombosis (DVT) and Pulmonary Embolism (PE), will apply to both... available at the PSO Privacy Protection Center (PPC) Web site: https://www.psoppc.org/web/patientsafety...
w4CSeq: software and web application to analyze 4C-seq data.
Cai, Mingyang; Gao, Fan; Lu, Wange; Wang, Kai
2016-11-01
Circularized Chromosome Conformation Capture followed by deep sequencing (4C-Seq) is a powerful technique to identify genome-wide partners interacting with a pre-specified genomic locus. Here, we present a computational and statistical approach to analyze 4C-Seq data generated from both enzyme digestion and sonication fragmentation-based methods. We implemented a command line software tool and a web interface called w4CSeq, which takes in the raw 4C sequencing data (FASTQ files) as input, performs automated statistical analysis and presents results in a user-friendly manner. Besides providing users with the list of candidate interacting sites/regions, w4CSeq generates figures showing genome-wide distribution of interacting regions, and sketches the enrichment of key features such as TSSs, TTSs, CpG sites and DNA replication timing around 4C sites. Users can establish their own web server by downloading source codes at https://github.com/WGLab/w4CSeq. Additionally, a demo web server is available at http://w4cseq.wglab.org. CONTACT: kaiwang@usc.edu or wangelu@usc.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
... following methods: Government-wide rulemaking Web site: http://www.regulations.gov . Follow the instructions... irrigation system improvements outlined in this plan will provide more efficient use of this water. Deep... reduction of excess deep percolation passing below the plant root zone. Deep percolation of irrigation water...
ERIC Educational Resources Information Center
Wighting, Mervyn J.; Lucking, Robert A.; Christmann, Edwin P.
2004-01-01
Teachers search for ways to enhance oceanography units in the classroom. There are many online resources available to help one explore the mysteries of the deep. This article describes a collection of Web sites on this topic appropriate for middle level classrooms.
Frossard, Victor; Verneaux, Valérie; Millet, Laurent; Magny, Michel; Perga, Marie-Elodie
2015-06-01
Stable C isotope ratio (δ(13)C) values of chironomid remains (head capsules; HC) were used to infer changes in benthic C sources over the last 150 years for two French sub-Alpine lakes. The HCs were retrieved from a series of sediment cores from different depths. The HC δ(13)C values started to decrease with the onset of eutrophication. The HC δ(13)C temporal patterns varied among depths, which revealed spatial differences in the contribution of methanotrophic bacteria to the benthic secondary production. The estimates of the methane (CH4)-derived C contribution to chironomid biomass ranged from a few percent prior to the 1930s to up to 30 % in recent times. The chironomid fluxes increased concomitantly with changes in HC δ(13)C values before a drastic decrease due to the development of hypoxic conditions. The hypoxia reinforced the implication for CH4-derived C transfer to chironomid production. In Lake Annecy, the HC δ(13)C values were negatively correlated to total organic C (TOC) content in the sediment (Corg), whereas no relationship was found in Lake Bourget. In Lake Bourget, chironomid abundances reached their maximum with TOC contents between 1 and 1.5 % Corg, which could constitute a threshold for change in chironomid abundance and consequently for the integration of CH4-derived C into the lake food webs. Our results indicated that the CH4-derived C contribution to the benthic food webs occurred at different depths in these two large, deep lakes (deep waters and sublittoral zone), and that the trophic transfer of this C was promoted in sublittoral zones where O2 gradients were dynamic.
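Editor's note: the methane-derived carbon estimates quoted above are the kind of figure that a two-end-member δ13C mixing model yields. The sketch below illustrates that calculation only; the end-member values are placeholders for illustration, not the ones used in the study.

```python
# Two-end-member delta-13C mixing model of the kind commonly used to estimate
# the methane-derived carbon fraction in chironomid biomass. End-member values
# are placeholder assumptions, not the study's.
def methane_carbon_fraction(d13c_sample, d13c_algal=-30.0, d13c_methanotroph=-65.0):
    """Fraction of biomass carbon derived from methanotrophic bacteria,
    assuming a linear mix of two isotopic end-members."""
    frac = (d13c_sample - d13c_algal) / (d13c_methanotroph - d13c_algal)
    return min(max(frac, 0.0), 1.0)  # clamp to the physically meaningful range

for d13c in (-31.0, -35.0, -40.0):
    print(f"head-capsule d13C = {d13c:5.1f} permil -> "
          f"~{methane_carbon_fraction(d13c):.0%} CH4-derived carbon")
```

With these placeholder end-members, head-capsule values a few permil lighter than the algal baseline translate into the few-percent to ~30% contributions reported above.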
NASA Astrophysics Data System (ADS)
Zapata-Hernández, Germán; Sellanes, Javier; Thiel, Martin; Henríquez, Camila; Hernández, Sebastián; Fernández, Julio C. C.; Hajdu, Eduardo
2016-11-01
Estuarine environments are complex ecological systems, which depend on multiple inputs of organic sources that could support their benthic communities. The deep-water megabenthic communities of the Interior Sea of Chiloé (ISCh, northern part of the fjord region of Chile) were studied to characterize their taxonomic composition and to trace the energy pathways supporting them by using stable isotope analysis (SIA). Megabenthic and demersal organisms as well as sunken macroalgal debris and terrestrial organic matter (TOM: wood, leaves, branches) were obtained by bottom trawling along an estuarine gradient covering 100-460 m water depth. Additionally, particulate organic matter (POM) and the sedimentary organic matter (SOM) were sampled and carbon (δ13C) and nitrogen (δ15N) isotope ratios were determined for all these organisms and potential food sources. A total of 140 taxa were obtained, including invertebrates (e.g. polychaetes, mollusks, crustaceans and echinoderms) bony fishes, rays and sharks. Based on the stable isotope values it was possible to infer a strong dependence on primary production derived from phytoplankton which is exported to the benthos. A potentially important contribution from sunken macroalgae to megabenthic consumers was established only for some invertebrates, such as the irregular echinoid Tripylaster philippii and the decapod Eurypodius latreillii. The trophic structure metrics suggest a similar isotopic niche width, trophic diversity and species packaging in the food webs among the major basins in the ISCh. It is thus concluded that the benthic food webs are supported principally by surface primary production, but macroalgal subsidies could be exploited by selected invertebrate taxa (e.g. detritivores) and terrestrial carbon pathways are important for certain specialized taxa (e.g. Xylophaga dorsalis).
A Holistic, Similarity-Based Approach for Personalized Ranking in Web Databases
ERIC Educational Resources Information Center
Telang, Aditya
2011-01-01
With the advent of the Web, the notion of "information retrieval" has acquired a completely new connotation and currently encompasses several disciplines ranging from traditional forms of text and data retrieval in unstructured and structured repositories to retrieval of static and dynamic information from the contents of the surface and deep Web.…
ERIC Educational Resources Information Center
Rouet, Jean-Francois; Ros, Christine; Goumi, Antonine; Macedo-Rouet, Monica; Dinet, Jerome
2011-01-01
Two experiments investigated primary and secondary school students' Web menu selection strategies using simulated Web search tasks. It was hypothesized that students' selections of websites depend on their perception and integration of multiple relevance cues. More specifically, students should be able to disentangle superficial cues (e.g.,…
Boulos, Maged N Kamel; Honda, Kiyoshi
2006-01-01
Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
Source Update Capture in Information Agents
NASA Technical Reports Server (NTRS)
Ashish, Naveen; Kulkarni, Deepak; Wang, Yao
2003-01-01
In this paper we present strategies for successfully capturing updates at Web sources. Web-based information agents provide integrated access to autonomous Web sources that can get updated. For many information agent applications we are interested in knowing when a Web source to which the application provides access has been updated. We may also be interested in capturing all the updates at a Web source over a period of time, i.e., detecting the updates and, for each update, retrieving and storing the new version of the data. Previous work on update and change detection by polling does not adequately address this problem. We present strategies for intelligently polling a Web source for efficiently capturing changes at the source.
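Editor's note: the abstract does not spell out the polling strategies themselves, so the sketch below shows one generic adaptive-polling approach (poll faster after an observed change, back off when the source is stable). The function names, hash-based change test, and interval bounds are illustrative assumptions, not the paper's algorithm.

```python
# Generic adaptive-polling sketch for capturing updates at a Web source:
# archive a snapshot whenever the content hash changes, and adjust the
# polling interval from the observed change behaviour.
import hashlib
import time
import urllib.request

def fetch_fingerprint(url):
    """Download the page and return a content hash used to detect updates."""
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

def capture_updates(url, interval=600, min_interval=60, max_interval=86400, cycles=10):
    """Poll `url` for a fixed number of cycles, recording each detected update."""
    last = fetch_fingerprint(url)
    snapshots = [last]
    for _ in range(cycles):
        time.sleep(interval)
        current = fetch_fingerprint(url)
        if current != last:
            snapshots.append(current)                    # update captured
            interval = max(min_interval, interval // 2)  # source is active: poll faster
            last = current
        else:
            interval = min(max_interval, interval * 2)   # source is quiet: back off
    return snapshots
```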
Ross, Steve W.; Demopoulos, Amanda W.J.; Kellogg, Christina A.; Morrison, Cheryl L.; Nizinski, Martha S.; Ames, Cheryl L.; Casazza, Tara L.; Gualtieri, Daniel; Kovacs, Kaitlin; McClain, Jennifer P.; Quattrini, Andrea M.; Roa-Varon, Adela Y.; Thaler, Andrew D.
2012-01-01
This report summarizes research funded by the U.S. Geological Survey (USGS) in collaboration with the University of North Carolina at Wilmington (UNCW) on the ecology of deep chemosynthetic communities in the Gulf of Mexico. The research was conducted at the request of the U.S. Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE; formerly Minerals Management Service) to complement a BOEMRE-funded project titled "Deepwater Program: Investigations of Chemosynthetic Communities on the Lower Continental Slope of the Gulf of Mexico." The overall research partnership, known as "Chemo III," was initiated to increase understanding of the distribution, structure, function, and vulnerabilities of these poorly known associations of animals and microbes for water depths greater than 1,000 meters (m) in the Gulf of Mexico. Chemosynthetic communities rely on carbon sources that are largely independent of sunlight and photosynthetic food webs. Despite recent research directed toward chemosynthetic and deep coral (for example, Lophelia pertusa) based ecosystems, these habitats are still poorly studied, especially at depths greater than 1,000 m. With the progression into deeper waters by fishing and energy industries, developing sufficient knowledge to manage these deep ecosystems is essential. Increased understanding of deep-sea communities will enable sound evaluations of potential impacts and appropriate mitigations.
NASA Astrophysics Data System (ADS)
Kürten, Benjamin; Al-Aidaroos, Ali M.; Kürten, Saskia; El-Sherbiny, Mohsen M.; Devassy, Reny P.; Struck, Ulrich; Zarokanellos, Nikolaos; Jones, Burton H.; Hansen, Thomas; Bruss, Gerd; Sommer, Ulrich
2016-01-01
Although zooplankton occupy key roles in aquatic biogeochemical cycles, little is known about the pelagic food web and trophodynamics of zooplankton in the Red Sea. Natural abundance stable isotope analysis (SIA) of carbon (δ13C) and N (δ15N) is one approach to elucidating pelagic food web structures and diet assimilation. Integrating the combined effects of ecological processes and hydrography, ecohydrographic features often translate into geographic patterns in δ13C and δ15N values at the base of food webs. This is due, for example, to divergent 15N abundances in source end-members (deep water sources: high δ15N, diazotrophs: low δ15N). Such patterns in the spatial distributions of stable isotope values were coined isoscapes. Empirical data of atmospheric, oceanographic, and biological processes, which drive the ecohydrographic gradients of the oligotrophic Red Sea, are under-explored, and some are anticipated rather than proven. Specifically, five processes underpin Red Sea gradients: (a) monsoon-related intrusions of nutrient-rich Indian Ocean water; (b) basin scale thermohaline circulation; (c) mesoscale eddy activity that causes up-welling of deep water nutrients into the upper layer; (d) the biological fixation of atmospheric nitrogen (N2) by diazotrophs; and (e) the deposition of dust and aerosol-derived N. This study assessed relationships between environmental samples (nutrients, chlorophyll a), oceanographic data (temperature, salinity, current velocity [ADCP]), particulate organic matter (POM), and net-phytoplankton, with the δ13C and δ15N values of zooplankton collected in spring 2012 from 16°28′ to 26°57′N along the central axis of the Red Sea. The δ15N of bulk POM and most zooplankton taxa increased from North (Duba) to South (Farasan). The potential contribution of deep water nutrient-fueled phytoplankton, POM, and diazotrophs varied among sites. Estimates suggested higher diazotroph contributions in the North, a greater contribution of POM in the South, and of small phytoplankton in the central Red Sea. Consistent variation across taxonomic and trophic groups at latitudinal scale, corresponding with patterns of nutrient stoichiometry and phytoplankton composition, indicates that the zooplankton ecology in the Red Sea is largely influenced by hydrographic features. It suggests that the primary ecohydrography of the Red Sea is driven not only by the thermohaline circulation, but also by mesoscale activities that transport nutrients to the upper water layers and interact with the general circulation pattern. Ecohydrographic features of the Red Sea, therefore, aid in explaining the observed configuration of its isoscape at the macroecological scale.
Naesström, Matilda; Blomstedt, Patric; Hariz, Marwan; Bodlund, Owe
2017-01-01
Background: Deep brain stimulation (DBS) is under investigation for severe obsessive-compulsive disorder (OCD) resistant to other therapies. The number of implants worldwide is slowly increasing. Therefore, it is of importance to explore knowledge and concerns of this novel treatment among patients and their psychiatric healthcare contacts. This information is relevant for scientific professionals working with clinical studies of DBS for this indication, especially for future study designs and the creation of information targeting healthcare professionals and patients. The aim of this study was to explore the knowledge and concerns toward DBS among patients with OCD, psychiatrists, and cognitive behavioral therapists. Methods: The study was conducted through web-based surveys of the target groups: psychiatrists, patients, and cognitive behavioral therapists. The surveys contained questions regarding previous knowledge of DBS, source of knowledge, attitudes, and concerns towards the therapy. Results: Among psychiatrists and psychotherapists, the main source of information was scientific publications; the patients' main source of information was the media. Common concerns among the groups included complications from surgery, anesthesia, stimulation side effects, and the novelty of the treatment. Specific concerns for the groups included personality changes, mentioned by patients and psychotherapists, and ethical concerns among psychiatrists. Conclusion: The participants of this study identified several challenges for DBS in OCD: the source and quality of information, efficacy, potential adverse effects, and eligibility, for all of which the current evidence base is still limited. A broad research agenda is needed for studies going forward. PMID:29285414
Structure, functioning, and cumulative stressors of Mediterranean deep-sea ecosystems
NASA Astrophysics Data System (ADS)
Tecchio, Samuele; Coll, Marta; Sardà, Francisco
2015-06-01
Environmental stressors, such as climate fluctuations, and anthropogenic stressors, such as fishing, are of major concern for the management of deep-sea ecosystems. Deep-water habitats are limited by primary productivity and are mainly dependent on the vertical input of organic matter from the surface. Global change over the latest decades is imparting variations in primary productivity levels across oceans, and thus it has an impact on the amount of organic matter landing on the deep seafloor. In addition, anthropogenic impacts are now reaching the deep ocean. The Mediterranean Sea, the largest enclosed basin on the planet, is not an exception. However, ecosystem-level studies of response to varying food input and anthropogenic stressors on deep-sea ecosystems are still scant. We present here a comparative ecological network analysis of three food webs of the deep Mediterranean Sea, with contrasting trophic structure. After modelling the flows of these food webs with the Ecopath with Ecosim approach, we compared indicators of network structure and functioning. We then developed temporal dynamic simulations varying the organic matter input to evaluate its potential effect. Results show that, following the west-to-east gradient in the Mediterranean Sea of marine snow input, organic matter recycling increases, net production decreases to negative values and trophic organisation is overall reduced. The levels of food-web activity followed the gradient of organic matter availability at the seafloor, confirming that deep-water ecosystems directly depend on marine snow and are therefore influenced by variations of energy input, such as climate-driven changes. In addition, simulations of varying marine snow arrival at the seafloor, combined with the hypothesis of a possible fishery expansion on the lower continental slope in the western basin, evidence that the trawling fishery may pose an impact which could be an order of magnitude stronger than a climate-driven reduction of marine snow.
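Editor's note: the Ecopath with Ecosim approach mentioned above balances each food-web group with a mass-balance equation. The sketch below shows a simplified form of that bookkeeping (ecotrophic efficiency = demands on a group divided by its production); the numeric values are invented for illustration and do not come from the Mediterranean models.

```python
# Simplified Ecopath-style mass balance: for each group, production times the
# ecotrophic efficiency must cover predation, catches, and net exports.
# Values below are made up purely for illustration.
def ecotrophic_efficiency(biomass, p_over_b, predation_mortality, catches=0.0, exports=0.0):
    """EE = (predation on the group + catches + exports) / (B * P/B).
    Values above 1 mean the group's production cannot support the demands
    placed on it, i.e. the model is not balanced."""
    production = biomass * p_over_b
    return (predation_mortality + catches + exports) / production

# e.g. a benthos group: B = 5 t/km2, P/B = 2 per yr, predation = 7 t/km2/yr
print(ecotrophic_efficiency(biomass=5.0, p_over_b=2.0, predation_mortality=7.0))  # 0.7
```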
Why Is My Voice Changing? (For Teens)
... enter puberty earlier or later than others. How Deep Will My Voice Get? How deep a guy's voice gets depends on his genes: ...
Hidden cycle of dissolved organic carbon in the deep ocean.
Follett, Christopher L; Repeta, Daniel J; Rothman, Daniel H; Xu, Li; Santinelli, Chiara
2014-11-25
Marine dissolved organic carbon (DOC) is a large (660 Pg C) reactive carbon reservoir that mediates the oceanic microbial food web and interacts with climate on both short and long timescales. Carbon isotopic content provides information on the DOC source via δ(13)C and age via Δ(14)C. Bulk isotope measurements suggest a microbially sourced DOC reservoir with two distinct components of differing radiocarbon age. However, such measurements cannot determine internal dynamics and fluxes. Here we analyze serial oxidation experiments to quantify the isotopic diversity of DOC at an oligotrophic site in the central Pacific Ocean. Our results show diversity in both stable and radio isotopes at all depths, confirming DOC cycling hidden within bulk analyses. We confirm the presence of isotopically enriched, modern DOC cocycling with an isotopically depleted older fraction in the upper ocean. However, our results show that up to 30% of the deep DOC reservoir is modern and supported by a 1 Pg/y carbon flux, which is 10 times higher than inferred from bulk isotope measurements. Isotopically depleted material turns over at an apparent time scale of 30,000 y, which is far slower than indicated by bulk isotope measurements. These results are consistent with global DOC measurements and explain both the fluctuations in deep DOC concentration and the anomalous radiocarbon values of DOC in the Southern Ocean. Collectively these results provide an unprecedented view of the ways in which DOC moves through the marine carbon cycle.
Hidden cycle of dissolved organic carbon in the deep ocean
Follett, Christopher L.; Repeta, Daniel J.; Rothman, Daniel H.; Xu, Li; Santinelli, Chiara
2014-01-01
Marine dissolved organic carbon (DOC) is a large (660 Pg C) reactive carbon reservoir that mediates the oceanic microbial food web and interacts with climate on both short and long timescales. Carbon isotopic content provides information on the DOC source via δ13C and age via Δ14C. Bulk isotope measurements suggest a microbially sourced DOC reservoir with two distinct components of differing radiocarbon age. However, such measurements cannot determine internal dynamics and fluxes. Here we analyze serial oxidation experiments to quantify the isotopic diversity of DOC at an oligotrophic site in the central Pacific Ocean. Our results show diversity in both stable and radio isotopes at all depths, confirming DOC cycling hidden within bulk analyses. We confirm the presence of isotopically enriched, modern DOC cocycling with an isotopically depleted older fraction in the upper ocean. However, our results show that up to 30% of the deep DOC reservoir is modern and supported by a 1 Pg/y carbon flux, which is 10 times higher than inferred from bulk isotope measurements. Isotopically depleted material turns over at an apparent time scale of 30,000 y, which is far slower than indicated by bulk isotope measurements. These results are consistent with global DOC measurements and explain both the fluctuations in deep DOC concentration and the anomalous radiocarbon values of DOC in the Southern Ocean. Collectively these results provide an unprecedented view of the ways in which DOC moves through the marine carbon cycle. PMID:25385632
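Editor's note: the flux and turnover figures quoted above follow from simple stock-over-flux arithmetic. In the sketch below, only the 660 Pg total DOC pool, the ~1 Pg/yr flux, and the ~30% modern share are taken from the abstract; the deep-ocean share of the DOC stock is a placeholder assumption.

```python
# Back-of-the-envelope turnover arithmetic: turnover time = stock / flux.
TOTAL_DOC_PG = 660.0          # global marine DOC (from the abstract)
DEEP_DOC_PG = 500.0           # assumed deep-ocean share of the stock (placeholder)
MODERN_FRACTION = 0.30        # up to 30% of deep DOC is modern (from the abstract)
MODERN_FLUX_PG_PER_YR = 1.0   # flux supporting the modern fraction (from the abstract)

modern_stock = DEEP_DOC_PG * MODERN_FRACTION
turnover_years = modern_stock / MODERN_FLUX_PG_PER_YR
print(f"Modern deep DOC ~{modern_stock:.0f} Pg, turnover ~{turnover_years:.0f} yr "
      f"(versus ~30,000 yr quoted for the isotopically depleted fraction)")
```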
NASA Technical Reports Server (NTRS)
Harmon, B. A.; Wilson, C. A.; Fishman, G. J.; Connaughton, V.; Henze, W.; Paciesas, W. S.; Finger, M. H.; McCollough, M. L.; Sahi, M.; Peterson, B.
2004-01-01
The Burst and Transient Source Experiment (BATSE), aboard the Compton Gamma Ray Observatory (CGRO), provided a record of the low-energy gamma-ray sky (approx. 20-1000 keV) between 1991 April and 2000 May (9.1 yr). BATSE monitored the high-energy sky using the Earth occultation technique (EOT) for point sources whose emission extended for times on the order of the CGRO orbital period (approx. 92 min) or greater. Using the EOT to extract flux information, a catalog of sources using data from the BATSE Large Area Detectors has been prepared. The first part of the catalog consists of results from the all-sky monitoring of 58 sources, mostly Galactic, with intrinsic variability on timescales of hours to years. For these sources, we have included tables of flux and spectral data, and outburst times for transients. Light curves (or flux histories) have been placed on the World Wide Web. We then performed a deep sampling of these 58 objects, plus a selection of 121 more objects, combining data from the entire 9.1 yr BATSE data set. Source types considered were primarily accreting binaries, but a small number of representative active galaxies, X-ray-emitting stars, and supernova remnants were also included. The sample represents a compilation of sources monitored and/or discovered with BATSE and other high-energy instruments between 1991 and 2000, known sources taken from the HEAO 1 A-4 and Macomb & Gehrels catalogs. The deep sample results include definite detections of 83 objects and possible detections of 36 additional objects. The definite detections spanned three classes of sources: accreting black hole and neutron star binaries, active galaxies, and Supernova remnants. The average fluxes measured for the fourth class, the X-ray emitting stars, were below the confidence limit for definite detection.
33 CFR 401.2 - Interpretation.
Code of Federal Regulations, 2014 CFR
2014-07-01
...: (a) Corporation means the Saint Lawrence Seaway Development Corporation; (b) E-business means web applications on the St. Lawrence Seaway Management Corporation Web site which provides direct electronic...) Seaway means the deep waterway between the Port of Montreal and Lake Erie and includes all locks, canals...
PyGPlates - a GPlates Python library for data analysis through space and deep geological time
NASA Astrophysics Data System (ADS)
Williams, Simon; Cannon, John; Qin, Xiaodong; Müller, Dietmar
2017-04-01
A fundamental consideration for studying the Earth through deep time is that the configurations of the continents, tectonic plates, and plate boundaries are continuously changing. Within a diverse range of fields including geodynamics, paleoclimate, and paleobiology, the importance of considering geodata in their reconstructed context across previous cycles of supercontinent aggregation, dispersal and ocean basin evolution is widely recognised. Open-source software tools such as GPlates provide paleo-geographic information systems for geoscientists to combine a wide variety of geodata and examine them within tectonic reconstructions through time. The availability of such powerful tools also brings new challenges - we want to learn something about the key associations between reconstructed plate motions and the geological record, but the high-dimensional parameter space makes it difficult for a human being to visually comprehend and quantify these associations. To achieve true spatio-temporal data-mining, new tools are needed. Here, we present a further development of the GPlates ecosystem - a Python-based tool for geotectonic analysis. In contrast to existing GPlates tools that are built around a graphical user interface (GUI) and interactive visualisation, pyGPlates offers a programming interface for the automation of quantitative plate tectonic analysis of arbitrary complexity. The vast array of open-source Python-based tools for data-mining, statistics and machine learning can now be linked to pyGPlates, allowing spatial data to be seamlessly analysed in space and geological "deep time", and with the ability to spread large computations across multiple processors. The presentation will illustrate a range of example applications, both simple and advanced. Basic examples include data querying, filtering, and reconstruction, and file-format conversions. For the innovative study of plate kinematics, pyGPlates has been used to explore the relationships between absolute plate motions, subduction zone kinematics, and mid-ocean ridge migration and orientation through deep time; to investigate the systematics of continental rift velocity evolution during Pangea breakup; and to make connections between kinematics of the Andean subduction zone and ore deposit formation. To support the numerical modelling community, pyGPlates facilitates the connection between tectonic surface boundary conditions contained within plate tectonic reconstructions (plate boundary configurations and plate velocities) and simulations such as thermo-mechanical models of lithospheric deformation and mantle convection. To support the development of web-based applications that can serve the wider geoscience community, we will demonstrate how pyGPlates can be combined with other open-source tools to serve alternative reconstructions together with a diverse array of reconstructed data sets in a self-consistent framework over the internet. PyGPlates is available to the public via the GPlates web site and contains comprehensive documentation covering installation on Windows/Mac/Linux platforms, sample code, tutorials and a detailed reference of pyGPlates functions and classes.
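Editor's note: the following fragment sketches the kind of scripted reconstruction workflow the abstract describes, following the basic documented pyGPlates usage pattern (a RotationModel plus the reconstruct call). The file names and the time range are placeholders; consult the pyGPlates documentation for the authoritative API.

```python
# Minimal pyGPlates-style reconstruction script (placeholder file names).
import pygplates

# Load a rotation model and reconstruct a feature collection to 50 Ma.
rotation_model = pygplates.RotationModel('rotations.rot')
pygplates.reconstruct('seafloor_features.gpmlz',   # input features (placeholder)
                      rotation_model,
                      'reconstructed_50Ma.shp',    # reconstructed output
                      50)                          # reconstruction time in Ma

# Batch the same call over deep time, e.g. to feed a geodynamic model or
# a statistical analysis of reconstructed geometries.
for time_ma in range(0, 201, 10):
    pygplates.reconstruct('seafloor_features.gpmlz', rotation_model,
                          f'reconstructed_{time_ma}Ma.shp', time_ma)
```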
RaptorX-Property: a web server for protein structure property prediction.
Wang, Sheng; Li, Wei; Liu, Shiwang; Xu, Jinbo
2016-07-08
RaptorX Property (http://raptorx2.uchicago.edu/StructurePropertyPred/predict/) is a web server predicting structure property of a protein sequence without using any templates. It outperforms other servers, especially for proteins without close homologs in PDB or with very sparse sequence profile (i.e. carries little evolutionary information). This server employs a powerful in-house deep learning model DeepCNF (Deep Convolutional Neural Fields) to predict secondary structure (SS), solvent accessibility (ACC) and disorder regions (DISO). DeepCNF not only models complex sequence-structure relationship by a deep hierarchical architecture, but also interdependency between adjacent property labels. Our experimental results show that, tested on CASP10, CASP11 and the other benchmarks, this server can obtain ∼84% Q3 accuracy for 3-state SS, ∼72% Q8 accuracy for 8-state SS, ∼66% Q3 accuracy for 3-state solvent accessibility, and ∼0.89 area under the ROC curve (AUC) for disorder prediction. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
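Editor's note: the Q3 and Q8 figures quoted above are per-residue accuracies. The sketch below shows how Q3 is computed from predicted versus observed 3-state secondary structure; it illustrates the metric only, not the DeepCNF predictor itself.

```python
# Q3: percentage of residues whose predicted 3-state secondary structure
# (H = helix, E = strand, C = coil) matches the observed state.
def q3_accuracy(predicted, observed):
    assert len(predicted) == len(observed)
    correct = sum(p == o for p, o in zip(predicted, observed))
    return 100.0 * correct / len(observed)

pred = "CCHHHHHCCEEEECC"
obs  = "CCHHHHCCCEEEECC"
print(f"Q3 = {q3_accuracy(pred, obs):.1f}%")  # 14 of 15 residues correct -> 93.3%
```

The 8-state Q8 score is computed the same way over the finer DSSP alphabet.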
NASA Astrophysics Data System (ADS)
Williams, Rebecca L.; Wakeham, Stuart; McKinney, Rick; Wishner, Karen F.
2014-08-01
The unique physical and biogeochemical characteristics of oxygen minimum zones (OMZs) influence plankton ecology, including zooplankton trophic webs. Using carbon and nitrogen stable isotopes, this study examined zooplankton trophic webs in the Eastern Tropical North Pacific (ETNP) OMZ. δ13C values were used to indicate zooplankton food sources, and δ15N values were used to indicate zooplankton trophic position and nitrogen cycle pathways. Vertically stratified MOCNESS net tows collected zooplankton from 0 to 1000 m at two stations along a north-south transect in the ETNP during 2007 and 2008, the Tehuantepec Bowl and the Costa Rica Dome. Zooplankton samples were separated into four size fractions for stable isotope analyses. Particulate organic matter (POM), assumed to represent a primary food source for zooplankton, was collected with McLane large volume in situ pumps. The isotopic composition and trophic ecology of the ETNP zooplankton community had distinct spatial and vertical patterns influenced by OMZ structure. The most pronounced vertical isotope gradients occurred near the upper and lower OMZ oxyclines. Material with lower δ13C values was apparently produced in the upper oxycline, possibly by chemoautotrophic microbes, and was subsequently consumed by zooplankton. Between-station differences in δ15N values suggested that different nitrogen cycle processes were dominant at the two locations, which influenced the isotopic characteristics of the zooplankton community. A strong depth gradient in zooplankton δ15N values in the lower oxycline suggested an increase in trophic cycling just below the core of the OMZ. Shallow POM (0-110 m) was likely the most important food source for mixed layer, upper oxycline, and OMZ core zooplankton, while deep POM was an important food source for most lower oxycline zooplankton (except for samples dominated by the seasonally migrating copepod Eucalanus inermis). There was no consistent isotopic progression among the four zooplankton size classes for these bulk mixed assemblage samples, implying overlapping trophic webs within the total size range considered.
ERIC Educational Resources Information Center
Rodicio, Héctor García
2015-01-01
When searching and using resources on the Web, students have to evaluate Web pages in terms of relevance and reliability. This evaluation can be done in a more or less systematic way, by either considering deep or superficial cues of relevance and reliability. The goal of this study was to examine how systematic students are when evaluating Web…
50 CFR 679.21 - Prohibited species bycatch management.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Region Web site (http://alaskafisheries.noaa.gov/). (c) Salmon taken in the BS pollock fisheries... GOA groundfish species or species group. (B) Deep-water species fishery. Fishing with trawl gear... the NMFS Alaska Region Web site (http://alaskafisheries.noaa.gov/): (A) The Chinook salmon PSC...
50 CFR 679.21 - Prohibited species bycatch management.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Region Web site (http://alaskafisheries.noaa.gov/). (c) Salmon taken in the BS pollock fisheries... GOA groundfish species or species group. (B) Deep-water species fishery. Fishing with trawl gear... the NMFS Alaska Region Web site (http://alaskafisheries.noaa.gov/): (A) The Chinook salmon PSC...
[Study on Information Extraction of Clinic Expert Information from Hospital Portals].
Zhang, Yuanpeng; Dong, Jiancheng; Qian, Danmin; Geng, Xingyun; Wu, Huiqun; Wang, Li
2015-12-01
Clinic expert information provides important references for residents in need of hospital care. Usually, such information is hidden in the deep web and cannot be directly indexed by search engines. To extract clinic expert information from the deep web, the first challenge is to judge whether a search form belongs to the target domain. This paper proposes a novel method based on a domain model, which is a tree structure constructed from the attributes of search interfaces. With this model, search interfaces can be classified into a domain and filled in with domain keywords. Another challenge is to extract information from the returned web pages indexed by search interfaces. To filter the noise information on a web page, a block importance model is proposed. The experimental results indicated that the domain model yielded a precision 10.83% higher than that of the rule-based method, whereas the block importance model yielded an F₁ measure 10.5% higher than that of the XPath method.
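Editor's note: the following toy sketch illustrates the form-classification step described above, i.e. deciding whether a search interface belongs to the clinic-expert domain by matching its attributes against a domain vocabulary. The attribute set, threshold, and function names are illustrative assumptions, not the paper's actual domain model.

```python
# Toy form-to-domain classification: a candidate search interface is assigned
# to the clinic-expert domain when enough of its attributes match the domain's
# attribute vocabulary. Vocabulary and threshold are illustrative only.
CLINIC_EXPERT_ATTRIBUTES = {"name", "department", "title", "specialty", "hospital"}

def belongs_to_domain(form_attributes, domain_attributes=CLINIC_EXPERT_ATTRIBUTES,
                      threshold=0.6):
    """Return True if the share of form attributes found in the domain
    vocabulary meets or exceeds the threshold."""
    attrs = {a.strip().lower() for a in form_attributes}
    overlap = len(attrs & domain_attributes)
    return overlap / max(len(attrs), 1) >= threshold

print(belongs_to_domain(["Name", "Department", "Specialty", "Page size"]))  # True
print(belongs_to_domain(["Origin", "Destination", "Departure date"]))       # False
```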
Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library
ERIC Educational Resources Information Center
Fagan, Jody Condit; Keach, Jennifer A.
2010-01-01
When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gwyn, Stephen D. J., E-mail: Stephen.Gwyn@nrc-cnrc.gc.ca
This paper describes the image stacks and catalogs of the Canada-France-Hawaii Telescope Legacy Survey produced using the MegaPipe data pipeline at the Canadian Astronomy Data Centre. The Legacy Survey is divided into two parts. The Deep Survey consists of four fields each of 1 deg², with magnitude limits (50% completeness for point sources) of u = 27.5, g = 27.9, r = 27.7, i = 27.4, and z = 26.2. It contains 1.6 × 10⁶ sources. The Wide Survey consists of 150 deg² split over four fields, with magnitude limits of u = 26.0, g = 26.5, r = 25.9, i = 25.7, and z = 24.6. It contains 3 × 10⁷ sources. This paper describes the calibration, image stacking, and catalog generation process. The images and catalogs are available on the web through several interfaces: normal image and text file catalog downloads, a 'Google Sky' interface, an image cutout service, and a catalog database query service.
Gardner, James V.; Mayer, Larry A.; Hughes-Clarke, John E.; Dartnell, Peter; Sulak, Kenneth J.
2001-01-01
A zone of deep-water reefs is thought to extend from the mid and outer shelf south of Mississippi and Alabama to at least the northwestern Florida shelf off Panama City, Florida (Figure 1). The reefs off Mississippi and Alabama are found in water depths of 60 to 120 m (Ludwick and Walton, 1957; Gardner et al., in press) and were the focus of a multibeam echosounder (MBES) mapping survey by the U.S. Geological Survey (USGS) in 2000 (Gardner et al., 2000; in press). If this deep-water-reef trend does exist along the northwestern Florida shelf, then it is critical to accurately determine the geomorphology and type of the reefs that occur because of their importance as benthic habitats for fisheries. Precisely georeferenced high-resolution mapping of bathymetry is a fundamental first step in the study of areas suspected to be critical habitats. Morphology is thought to be critical to defining the distribution of dominant demersal planktivore communities. Fish faunas of shallow hermatypic reefs have been well studied, but those of deep ahermatypic reefs have been relatively ignored. The ecology of deep-water ahermatypic reefs is fundamentally different from that of hermatypic reefs because autochthonous intracellular symbiotic zooxanthellae (the carbon source for hermatypic corals) do not form the base of the trophic web in ahermatypic reefs. Instead, exogenous plankton, transported to the reef by currents, serves as the primary carbon source. Thus, one of the principal uses of the morphology data will be to identify whether any reefs found are hermatypic or ahermatypic in origin. Community structure and trophodynamics of demersal fishes of the outer continental shelf of the northeastern Gulf of Mexico presently are the focus of a major USGS research project. A goal of the project is to answer questions concerning the relative roles played by morphology and surficial geology in controlling biological differentiation. Deep-water reefs are important because they are fish havens, key spawning sites, and critical early larval and juvenile habitats for economically important sport/food fishes. It is known that deep-water reefs function as a key source for re-population (via seasonal and ontogenetic migration) of heavily impacted inshore reefs. The deep-water reefs south of Mississippi and Alabama support a lush fauna of ahermatypic hard corals, soft corals, black corals, sessile crinoids, and sponges that together form a living habitat for a well-developed fish fauna. The fish fauna comprises typical Caribbean reef fishes and Carolinian shelf fishes, plus epipelagic fishes and a few deep-sea fishes. The base of the megafaunal invertebrate food web is plankton, borne by essentially continuous semi-laminar currents generated by eddies, spawned off the Loop Current, that periodically travel across the shelf edge. A few sidescan-sonar surveys have been made of areas locally identified as Destin Pinnacles, Steamboat Lumps Marine Reserve (Koenig et al., 2000; Scanlon et al., 2000; 2001), Twin Ridges (Briere et al., 2000; Scanlon et al., 2000), and Madison-Swanson Marine Reserve (Koenig et al., 2000; Scanlon et al., 2000; 2001). However, no quantitative and little qualitative information about the geomorphology and surficial geology can be gained from these data. Existing bathymetry along the northwestern Florida shelf suggests the existence of areas of possible isolated deep-water reefs.
NOAA bathymetric maps NOS NH16-9 and NG16-12 show geomorphic expressions that hint at the presence of reefs in isolated areas rather than in a continuous zone. No systematic, high-resolution bathymetry had been collected in this area prior to this cruise. After the successful mapping of the deep-water reefs on the Mississippi and Alabama shelf (Gardner et al., 2000; in press), a partnership composed of the USGS, Minerals Management Service, and NOAA was formed to continue the deep-reef mapping onto the northwest Florida mid-shelf and upper slope. This cruise is the first fruit of that partnership.
Spatial scales of carbon flow in a river food web
Finlay, J.C.; Khandwala, S.; Power, M.E.
2002-01-01
Spatial extents of food webs that support stream and river consumers are largely unknown, but such information is essential for basic understanding and management of lotic ecosystems. We used predictable variation in algal δ13C with water velocity, and measurements of consumer δ13C and δ15N to examine carbon flow and trophic structure in food webs of the South Fork Eel River in Northern California. Analyses of δ13C showed that the most abundant macroinvertebrate groups (collector-gatherers and scrapers) relied on algae from local sources within their riffle or shallow pool habitats. In contrast, filter-feeding invertebrates in riffles relied in part on algal production derived from upstream shallow pools. Riffle invertebrate predators also relied in part on consumers of pool-derived algal carbon. One abundant taxon drifting from shallow pools and riffles (baetid mayflies) relied on algal production derived from the habitats from which they dispersed. The trophic linkage from pool algae to riffle invertebrate predators was thus mediated through either predation on pool herbivores dispersing into riffles, or on filter feeders. Algal production in shallow pool habitats dominated the resource base of vertebrate predators in all habitats at the end of the summer. We could not distinguish between the trophic roles of riffle algae and terrestrial detritus, but both carbon sources appeared to play minor roles for vertebrate consumers. In shallow pools, small vertebrates, including three-spined stickleback (Gasterosteus aculeatus), roach (Hesperoleucas symmetricus), and rough-skinned newts (Taricha granulosa), relied on invertebrate prey derived from local pool habitats. During the most productive summer period, growth of all size classes of steelhead and resident rainbow trout (Oncorhynchus mykiss) in all habitats (shallow pools, riffles, and deep unproductive pools) was largely derived from algal production in shallow pools. Preliminary data suggest that the strong role of shallow pool algae in riffle steelhead growth during summer periods was due to drift of pool invertebrates to riffles, rather than movement of riffle trout. Data for δ15N showed that resident rainbow trout (25-33 cm standard length) in deep pools preyed upon small size classes of juvenile steelhead that were most often found in riffles or shallow pools. While many invertebrate consumers relied primarily on algal production derived from local habitats, our study shows that growth of top predators in the river is strongly linked to food webs in adjacent habitats. These results suggest a key role for emigration of aquatic prey in determining carbon flow to top predators.
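The mixing equations are not given in the abstract; a standard two-end-member δ13C mixing model of the kind commonly used in such studies (an illustration, not necessarily the authors' exact formulation) estimates the fraction of consumer carbon derived from source A versus source B as:

    f_A = \frac{\delta^{13}C_{consumer} - \delta^{13}C_B}{\delta^{13}C_A - \delta^{13}C_B}, \qquad f_B = 1 - f_A, \qquad 0 \le f_A \le 1

where A and B might be, for example, shallow-pool algae versus riffle algae or terrestrial detritus.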
50 CFR 679.21 - Prohibited species bycatch management.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Region Web site (http://alaskafisheries.noaa.gov/). (c) Salmon taken in the BS pollock fisheries... GOA groundfish species or species group. (B) Deep-water species fishery. Fishing with trawl gear... combine management of available trawl halibut PSC limits in the second season deep-water and shallow-water...
Deep Lake Explorer: Bringing citizen scientists to the underwater world of the Great Lakes
Deep Lake Explorer is a web application hosted on the Zooniverse platform that allows the public to interpret underwater video collected in the Great Lakes. Crowdsourcing image interpretation using the Zooniverse platform has proven successful for many projects, but few projects ...
Kulmanov, Maxat; Khan, Mohammed Asif; Hoehndorf, Robert; Wren, Jonathan
2018-02-15
A large number of protein sequences are becoming available through the application of novel high-throughput sequencing technologies. Experimental functional characterization of these proteins is time-consuming and expensive, and is often only done rigorously for a few selected model organisms. Computational function prediction approaches have been suggested to fill this gap. The functions of proteins are classified using the Gene Ontology (GO), which contains over 40 000 classes. Additionally, proteins have multiple functions, making function prediction a large-scale, multi-class, multi-label problem. We have developed a novel method to predict protein function from sequence. We use deep learning to learn features from protein sequences as well as a cross-species protein-protein interaction network. Our approach specifically outputs information in the structure of the GO and utilizes the dependencies between GO classes as background information to construct a deep learning model. We evaluate our method using the standards established by the Computational Assessment of Function Annotation (CAFA) and demonstrate a significant improvement over baseline methods such as BLAST, in particular for predicting cellular locations. Web server: http://deepgo.bio2vec.net, Source code: https://github.com/bio-ontology-research-group/deepgo. robert.hoehndorf@kaust.edu.sa. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
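DeepGO itself additionally encodes the GO class hierarchy and a protein-protein interaction network; the sketch below is only a minimal, generic illustration of the multi-label sequence-classification setup described above (MAX_LEN, N_GO_CLASSES, and the layer sizes are assumptions, not the published architecture):

    # Minimal illustrative sketch (not the DeepGO implementation): a 1-D CNN over
    # one-hot amino-acid sequences with one sigmoid output unit per GO class, so a
    # protein can receive many labels at once (multi-label prediction).
    import numpy as np
    import tensorflow as tf

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
    AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}
    MAX_LEN = 1000        # pad/truncate sequences to a fixed length (assumption)
    N_GO_CLASSES = 932    # hypothetical number of GO terms kept after filtering

    def one_hot(seq):
        """Encode an amino-acid string as a (MAX_LEN, 20) one-hot matrix."""
        x = np.zeros((MAX_LEN, len(AMINO_ACIDS)), dtype=np.float32)
        for i, aa in enumerate(seq[:MAX_LEN]):
            if aa in AA_INDEX:
                x[i, AA_INDEX[aa]] = 1.0
        return x

    model = tf.keras.Sequential([
        tf.keras.layers.Conv1D(128, kernel_size=8, activation="relu",
                               input_shape=(MAX_LEN, len(AMINO_ACIDS))),
        tf.keras.layers.GlobalMaxPooling1D(),
        tf.keras.layers.Dense(256, activation="relu"),
        # Sigmoid rather than softmax: GO labels are not mutually exclusive.
        tf.keras.layers.Dense(N_GO_CLASSES, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

    x_example = np.expand_dims(one_hot("MKTAYIAKQR"), axis=0)
    print(model.predict(x_example).shape)   # (1, N_GO_CLASSES) of per-class scores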
NASA Technical Reports Server (NTRS)
Harmon, B. A.; Wilson, C. A.; Fishman, G. J.; Connaughton, V.; Henze, W.; Paciesas, W. S.; Finger, M. H.; McCollough, M. L.; Sahi, M.; Peterson, B.
2003-01-01
The Burst and Transient Source Experiment (BATSE), aboard the Compton Gamma Ray Observatory (CGRO), provided a record of the low-energy gamma-ray sky (approx. 20-1000 keV) between 1991 April and 2000 May (9.1 y). BATSE monitored the high energy sky using the Earth occultation technique (EOT) for point sources whose emission extended for times on the order of the CGRO orbital period (approx. 92 min) or greater. Using the EOT to extract flux information, a catalog of sources using data from the BATSE large area detectors has been prepared. The first part of the catalog consists of results from the all-sky monitoring of 58 sources, mostly Galactic, with intrinsic variability on timescales of hours to years. For these sources, we have included tables of flux and spectral data, and outburst times for transients. Light curves (or flux histories) covering the entire nine-year mission are being placed on the World Wide Web. We then performed a deep sampling of these 58 objects, plus a selection of 121 more objects, combining data from the entire 9.1 y BATSE dataset. Source types considered were primarily accreting binaries, but a small number of representative active galaxies, X-ray-emitting stars, and supernova remnants were also included. The sample represents a compilation of sources monitored and/or discovered with BATSE and other high energy instruments between 1991 and 2000, plus known sources taken from the HEAO 1 A-4 (Levine et al. 1984) and Macomb and Gehrels (1999) catalogs. The deep sample results include definite detections of 82 objects and possible detections of 36 additional objects. The definite detections spanned three classes of sources: accreting black hole and neutron star binaries, active galaxies, and supernova remnants. The average fluxes measured for the fourth class, the X-ray emitting stars, were below the confidence limit for definite detection. Flux data for the deep sample are presented in four energy bands: 20-40, 40-70, 70-160, and 160-430 keV. The limiting average flux level (9.1 y) for the sample varies from 3.5 to 20 mCrab (5σ) between 20 and 430 keV, depending on systematic error, which in turn is primarily dependent on the sky location. To strengthen the credibility of detection of weaker sources (approx. 5-25 mCrab), we generated Earth occultation images, searched for periodic behavior using FFT and epoch folding methods, and critically evaluated the energy-dependent emission in the four flux bands. The deep sample results are intended for guidance in performing future all-sky surveys or pointed observations in the hard X-ray and low-energy gamma-ray band, as well as more detailed studies with the BATSE EOT.
50 CFR 679.21 - Prohibited species bycatch management.
Code of Federal Regulations, 2012 CFR
2012-10-01
... to 907-586-7465. Forms are available on the NMFS Alaska Region Web site (http://alaskafisheries.noaa... the retained aggregate amount of other GOA groundfish species or species group. (B) Deep-water species... the NMFS Alaska Region Web site (http://alaskafisheries.noaa.gov/): (A) The Chinook salmon PSC...
Asynchronous Discourse in a Web-Assisted Mathematics Education Course
ERIC Educational Resources Information Center
Li, Zhongxiao
2009-01-01
Fall term of 2006, a web-assisted undergraduate mathematics course was taught at the University of Idaho: Math 235 Mathematics for Elementary Teachers I. The course goals were: To foster a deep understanding of critical mathematical content; and to promote the development of mathematical communication and collaboration concepts, skills, and…
Search Interface Design Using Faceted Indexing for Web Resources.
ERIC Educational Resources Information Center
Devadason, Francis; Intaraksa, Neelawat; Patamawongjariya, Pornprapa; Desai, Kavita
2001-01-01
Describes an experimental system designed to organize and provide access to Web documents using a faceted pre-coordinate indexing system based on the Deep Structure Indexing System (DSIS) derived from POPSI (Postulate based Permuted Subject Indexing) of Bhattacharyya, and the facet analysis and chain indexing system of Ranganathan. (AEF)
ERIC Educational Resources Information Center
Gupta, Amardeep
2005-01-01
Current search engines--even the constantly surprising Google--seem unable to leap the next big barrier in search: the trillions of bytes of dynamically generated data created by individual web sites around the world, or what some researchers call the "deep web." The challenge now is not information overload, but information overlook.…
ALMA deep field in SSA22: Survey design and source catalog of a 20 arcmin² survey at 1.1 mm
NASA Astrophysics Data System (ADS)
Umehata, Hideki; Hatsukade, Bunyo; Smail, Ian; Alexander, David M.; Ivison, Rob J.; Matsuda, Yuichi; Tamura, Yoichi; Kohno, Kotaro; Kato, Yuta; Hayatsu, Natsuki H.; Kubo, Mariko; Ikarashi, Soh
2018-06-01
To search for dust-obscured star-formation activity in the early Universe, it is essential to obtain a deep and wide submillimeter/millimeter map. The advent of the Atacama Large Millimeter/submillimeter Array (ALMA) has enabled us to obtain such maps with sufficiently high spatial resolution to be free from source confusion. We present a new 1.1 mm-wave map obtained by ALMA in the SSA22 field. The field contains a remarkable proto-cluster at z = 3.09; therefore, it is an ideal region to investigate the role of a large-scale cosmic web on dust-obscured star formation. The typical 1σ depth of our map is 73 μJy beam⁻¹ with a 0.″5 resolution. Combining the present survey with earlier, archived observations, we map an area of 20 arcmin² (71 comoving Mpc² at z = 3.09). Within the combined survey area we have detected 35 sources at a signal-to-noise ratio (S/N) > 5, with flux densities of S1.1mm = 0.43-5.6 mJy, equivalent to star-formation rates of ≳100-1000 M⊙ yr⁻¹ at z = 3.09, for a Chabrier initial mass function: 17 sources out of 35 are new detections. The cumulative number counts show an excess by a factor of three to five compared to blank fields. The excess suggests enhanced, dust-enshrouded star-formation activity in the proto-cluster on a 10 comoving Mpc scale, indicating accelerated galaxy evolution in this overdense region.
The biogeochemistry of anchialine caves: Progress and possibilities
Pohlman, John W.
2011-01-01
Recent investigations of anchialine caves and sinkholes have identified complex food webs dependent on detrital and, in some cases, chemosynthetically produced organic matter. Chemosynthetic microbes in anchialine systems obtain energy from reduced compounds produced during organic matter degradation (e.g., sulfide, ammonium, and methane), similar to what occurs in deep ocean cold seeps and mud volcanoes, but distinct from dominant processes operating at hydrothermal vents and sulfurous mineral caves where the primary energy source is mantle derived. This review includes case studies from both anchialine and non-anchialine habitats, where evidence for in situ chemosynthetic production of organic matter and its subsequent transfer to higher trophic level metazoans is documented. The energy sources and pathways identified are synthesized to develop conceptual models for elemental cycles and energy cascades that occur within oligotrophic and eutrophic anchialine caves. Strategies and techniques for testing the hypothesis of chemosynthesis as an active process in anchialine caves are also suggested.
ShapeShop: Towards Understanding Deep Learning Representations via Interactive Experimentation.
Hohman, Fred; Hodas, Nathan; Chau, Duen Horng
2017-05-01
Deep learning is the driving force behind many recent technologies; however, deep neural networks are often viewed as "black-boxes" due to their internal complexity that is hard to understand. Little research focuses on helping people explore and understand the relationship between a user's data and the learned representations in deep learning models. We present our ongoing work, ShapeShop, an interactive system for visualizing and understanding what semantics a neural network model has learned. Built using standard web technologies, ShapeShop allows users to experiment with and compare deep learning models to help explore the robustness of image classifiers.
49 CFR 575.106 - Tire fuel efficiency consumer information program.
Code of Federal Regulations, 2013 CFR
2013-10-01
... tires, deep tread, winter-type snow tires, space-saver or temporary use spare tires, tires with nominal... deep tread, winter-type snow tires and limited production tires that it manufactures which are exempt... to have included in the database of information available to consumers on NHTSA's Web site. (ii...
49 CFR 575.106 - Tire fuel efficiency consumer information program.
Code of Federal Regulations, 2014 CFR
2014-10-01
... tires, deep tread, winter-type snow tires, space-saver or temporary use spare tires, tires with nominal... deep tread, winter-type snow tires and limited production tires that it manufactures which are exempt... to have included in the database of information available to consumers on NHTSA's Web site. (ii...
49 CFR 575.106 - Tire fuel efficiency consumer information program.
Code of Federal Regulations, 2012 CFR
2012-10-01
... tires, deep tread, winter-type snow tires, space-saver or temporary use spare tires, tires with nominal... deep tread, winter-type snow tires and limited production tires that it manufactures which are exempt... to have included in the database of information available to consumers on NHTSA's Web site. (ii...
WEB-GIS Decision Support System for CO2 storage
NASA Astrophysics Data System (ADS)
Gaitanaru, Dragos; Leonard, Anghel; Radu Gogu, Constantin; Le Guen, Yvi; Scradeanu, Daniel; Pagnejer, Mihaela
2013-04-01
The environmental decision support system (DSS) paradigm evolves and changes as more knowledge and technology become available to the environmental community. Geographic Information Systems (GIS) can be used to extract, assess and disseminate some types of information, which are otherwise difficult to access by traditional methods. At the same time, with the help of the Internet and accompanying tools, creating and publishing online interactive maps has become easier and rich with options. The Decision Support System (MDSS) developed for the MUSTANG (A MUltiple Space and Time scale Approach for the quaNtification of deep saline formations for CO2 storaGe) project is a user-friendly, web-based application that uses GIS capabilities. MDSS can be used by experts for CO2 injection and storage in deep saline aquifers. The main objective of the MDSS is to help experts make decisions based on large amounts of structured data and information. In order to achieve this objective, the MDSS has a geospatial object-oriented database structure for a wide variety of data and information. The entire application is based on several principles leading to a series of capabilities and specific characteristics: (i) Open source - the entire platform (MDSS) is based on open-source technologies - (1) database engine, (2) application server, (3) geospatial server, (4) user interfaces, (5) add-ons, etc. (ii) Multiple database connections - MDSS can connect to different databases that are located on different server machines. (iii) Desktop user experience - MDSS architecture and design follow the structure of desktop software. (iv) Communication - the server side and the desktop are bound together by a series of functions that allow the user to upload, use, modify and download data within the application. The architecture of the system involves one database and a modular application composed of: (1) a visualization module, (2) an analysis module, (3) a guidelines module, and (4) a risk assessment module. The Database component is built using the PostgreSQL and PostGIS open-source technologies. The visualization module allows the user to view data from CO2 injection sites in different ways: (1) geospatial visualization, (2) table view, (3) 3D visualization. The analysis module allows the user to perform analyses such as injectivity, containment, and capacity analysis. The Risk Assessment module focuses on the site risk matrix approach. The Guidelines module contains methodological guidelines for CO2 injection and storage in deep saline aquifers.
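As an illustration of how such a visualization module might query the PostGIS-backed database, the sketch below uses a hypothetical table and credentials (the schema, column names, and connection details are assumptions, not the MDSS implementation):

    # Illustrative sketch only (hypothetical schema and credentials, not the MDSS
    # code): fetch CO2 injection sites within 50 km of a point from a
    # PostGIS-enabled PostgreSQL database.
    import json
    import psycopg2

    conn = psycopg2.connect(dbname="mdss", user="mdss", password="secret", host="localhost")
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT site_name, ST_AsGeoJSON(geom)
            FROM injection_sites
            WHERE ST_DWithin(geom::geography,
                             ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
                             %s);
            """,
            (26.1, 44.4, 50000),   # lon, lat, search radius in metres (example values)
        )
        for name, geojson in cur.fetchall():
            print(name, json.loads(geojson)["type"])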
A Science Portal and Archive for Extragalactic Globular Cluster Systems Data
NASA Astrophysics Data System (ADS)
Young, Michael; Rhode, Katherine L.; Gopu, Arvind
2015-01-01
For several years we have been carrying out a wide-field imaging survey of the globular cluster populations of a sample of giant spiral, S0, and elliptical galaxies with distances of ~10-30 Mpc. We use mosaic CCD cameras on the WIYN 3.5-m and Kitt Peak 4-m telescopes to acquire deep BVR imaging of each galaxy and then analyze the data to derive global properties of the globular cluster system. In addition to measuring the total numbers, specific frequencies, spatial distributions, and color distributions for the globular cluster populations, we have produced deep, high-quality images and lists of tens to thousands of globular cluster candidates for the ~40 galaxies included in the survey.With the survey nearing completion, we have been exploring how to efficiently disseminate not only the overall results, but also all of the relevant data products, to the astronomical community. Here we present our solution: a scientific portal and archive for extragalactic globular cluster systems data. With a modern and intuitive web interface built on the same framework as the WIYN One Degree Imager Portal, Pipeline, and Archive (ODI-PPA), our system will provide public access to the survey results and the final stacked mosaic images of the target galaxies. In addition, the astrometric and photometric data for thousands of identified globular cluster candidates, as well as for all point sources detected in each field, will be indexed and searchable. Where available, spectroscopic follow-up data will be paired with the candidates. Advanced imaging tools will enable users to overlay the cluster candidates and other sources on the mosaic images within the web interface, while metadata charting tools will allow users to rapidly and seamlessly plot the survey results for each galaxy and the data for hundreds of thousands of individual sources. Finally, we will appeal to other researchers with similar data products and work toward making our portal a central repository for data related to well-studied giant galaxy globular cluster systems. This work is supported by NSF Faculty Early Career Development (CAREER) award AST-0847109.
Chen, Jingan; Yang, Haiquan; Zeng, Yan; Guo, Jianyang; Song, Yilong; Ding, Wei
2018-06-01
The concentrations and isotopic compositions of dissolved inorganic carbon (DIC) and particulate organic carbon (POC) were measured in order to better constrain the sources and cycling of POC in Lake Fuxian, the largest deep freshwater lake in China. Model results based on the combined δ13C and Δ14C showed that the average lake-wide contributions of autochthonous POC, terrestrial POC, and resuspended sediment POC to the bulk POC in Lake Fuxian were 61%, 22%, and 17%, respectively. This indicated autochthonous POC might play a dominant role in sustaining this large oligotrophic lake ecosystem. A mean 17% contribution of resuspended sediment POC to the bulk POC implied that sediment might have a more significant influence on the aquatic environment and ecosystem than previously recognized in large deep lakes. The contributions of POC from different sources to the water-column POC were a function of the initial composition of the source materials, photosynthesis, the physical regime of the lake, sediment resuspension, and the respiration and degradation of organic matter, and were affected indirectly by environmental factors such as light, temperature, DO, wind speed, turbidity, and nutrient concentration. This study is not only the first systematic investigation of the radiocarbon and stable isotope compositions of POC in a large deep freshwater lake in China, but also one of the most extensive radiocarbon studies on the ecosystem of any of the world's great lakes. The unique data constrain the relative influences of autochthonous POC, terrestrial POC, and resuspended sediment POC, and deepen the understanding of POC cycling in large freshwater lakes. This study is far from comprehensive, but it serves to highlight the potential of combined radiocarbon and stable carbon isotopes for constraining the sources and cycling of POC in large lake systems. More radiocarbon investigations on the water-column POC and the aquatic food webs are necessary to illuminate further the fate of autochthonous POC, terrestrial POC, and resuspended sediment POC, and their eco-environmental effects. Copyright © 2017 Elsevier B.V. All rights reserved.
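A standard three-end-member isotope mass balance consistent with the description above (shown for illustration; the authors' exact model is not reproduced in the abstract) solves for the three source fractions from the measured δ13C and Δ14C of bulk POC:

    f_{auto} + f_{terr} + f_{sed} = 1
    \delta^{13}C_{POC} = f_{auto}\,\delta^{13}C_{auto} + f_{terr}\,\delta^{13}C_{terr} + f_{sed}\,\delta^{13}C_{sed}
    \Delta^{14}C_{POC} = f_{auto}\,\Delta^{14}C_{auto} + f_{terr}\,\Delta^{14}C_{terr} + f_{sed}\,\Delta^{14}C_{sed}

These are three equations in three unknowns, solvable once end-member values for autochthonous, terrestrial, and resuspended-sediment POC are assigned.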
de Dumast, Priscille; Mirabel, Clément; Cevidanes, Lucia; Ruellas, Antonio; Yatabe, Marilia; Ioshida, Marcos; Ribera, Nina Tubau; Michoud, Loic; Gomes, Liliane; Huang, Chao; Zhu, Hongtu; Muniz, Luciana; Shoukri, Brandon; Paniagua, Beatriz; Styner, Martin; Pieper, Steve; Budin, Francois; Vimort, Jean-Baptiste; Pascal, Laura; Prieto, Juan Carlos
2018-07-01
The purpose of this study is to describe the methodological innovations of a web-based system for storage, integration and computation of biomedical data, using a training imaging dataset to remotely compute a deep neural network classifier of temporomandibular joint osteoarthritis (TMJOA). The imaging dataset for this study consisted of three-dimensional (3D) surface meshes of mandibular condyles constructed from cone beam computed tomography (CBCT) scans. The training dataset consisted of 259 condyles, 105 from control subjects and 154 from patients with a diagnosis of TMJOA. For the image analysis classification, 34 right and left condyles from 17 patients (39.9 ± 11.7 years), who experienced signs and symptoms of the disease for less than 5 years, were included as the testing dataset. For the integrative statistical model of clinical, biological and imaging markers, the sample consisted of the same 17 test OA subjects and 17 age and sex matched control subjects (39.4 ± 15.4 years), who did not show any sign or symptom of OA. For these 34 subjects, a standardized clinical questionnaire and blood and saliva samples were also collected. The technological methodologies in this study include a deep neural network classifier of 3D condylar morphology (ShapeVariationAnalyzer, SVA), and a flexible web-based system for data storage, computation and integration (DSCI) of high-dimensional imaging, clinical, and biological data. The DSCI system trained and tested the neural network, indicating 5 stages of structural degenerative changes in condylar morphology in the TMJ with 91% close agreement between the clinician consensus and the SVA classifier. The DSCI also remotely ran a novel statistical analysis, the Multivariate Functional Shape Data Analysis, which computed high-dimensional correlations between 3D shape coordinates, clinical pain levels, and levels of biological markers, and then graphically displayed the computation results. The findings of this study demonstrate a comprehensive phenotypic characterization of TMJ health and disease at clinical, imaging and biological levels, using novel flexible and versatile open-source tools for a web-based system that provides advanced shape statistical analysis and a neural-network-based classification of temporomandibular joint osteoarthritis. Published by Elsevier Ltd.
Hoefler, Vaughan; Nagaoka, Hiroko; Miller, Craig S
2016-11-01
A systematic review was performed to compare the long-term survival of deep dentine caries-affected permanent teeth treated with partial-caries-removal (PCR) versus similar teeth treated with stepwise-caries-removal techniques (SWT). Clinical studies investigating long-term PCR and SWT outcomes in unrestored permanent teeth with deep dentine caries were evaluated. Failures were defined as loss of pulp vitality or restorative failures following treatment. PubMed, Web of Science, Dentistry and Oral Sciences Source, and Central databases were systematically searched. From 136 potentially relevant articles, 9 publications utilizing data from 5 studies (2 RCTs, and 3 observational case-series) reporting outcomes for 426 permanent teeth over two to ten years were analyzed. Regarding restorative failures, >88% success at two years for both techniques was reported. For loss of pulp vitality, observational studies reported >96% vitality at two years for each technique, while one RCT reported significantly higher vitality (p<0.05) at three years for PCR (96%) compared to SWT (83%). Risk of bias was high in all studies. Successful vitality and restorative outcomes for both PCR and SWT have been demonstrated at two years and beyond in permanent teeth with deep dentine caries. Partial-caries-removal may result in fewer pulpal complications over a three year period than SWT, although claims of a therapeutic advantage are based on very few, limited-quality studies. Partial-caries-removal and SWT are deep caries management techniques that reduce pulp exposure risk. Permanent teeth with deep dentine caries treated with either technique have a high likelihood for survival beyond two years. Copyright © 2016 Elsevier Ltd. All rights reserved.
Construction of a Virginia short-span bridge with the Strongwell 36-inch double-web I-beam.
DOT National Transportation Integrated Search
2005-01-01
The Route 601 Bridge in Sugar Grove, VA, spans 39 ft over Dickey Creek. The bridge is the first to use the Strongwell 36-in-deep fiber-reinforced polymer (FRP) double-web beam (DWB) in a vehicular bridge superstructure. Construction of the new bridge...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-03
..., 50-foot-deep frame module fitted with a trash rack and containing 10 low-head bulb turbines each... electronically via the Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site... Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the docket number (P-13500-002...
Mercury and selenium in the food web of Lake Nahuel Huapi, Patagonia, Argentina.
Arcagni, Marina; Rizzo, Andrea; Juncos, Romina; Pavlin, Majda; Campbell, Linda M; Arribére, María A; Horvat, Milena; Ribeiro Guevara, Sergio
2017-01-01
Although it is located far from point sources of Hg pollution, high Hg concentrations were recorded in plankton from the deep oligotrophic Lake Nahuel Huapi in North Patagonia. Native and introduced top predator fish with differing feeding habits are a valuable economic resource to the region. Hence, Hg and Se trophic interactions and pathways to these fish were assessed in the food web of this lake at three sites, using stable nitrogen and carbon isotopes. As expected based on the high THg in plankton, mercury did not biomagnify in the food web of Lake Nahuel Huapi, as most of the THg in plankton is in the inorganic form. As was observed in other aquatic systems, Se did not biomagnify either. When trophic pathways to top predator fish were analyzed, they showed that THg biomagnified in the food chains of native fish but biodiluted in the food chains of introduced salmonids. A more benthic diet, typical of native fish, resulted in higher [THg] bioaccumulation than a more pelagic or mixed diet, as in the case of introduced fish. Se:THg molar ratios were higher than 1 in all the fish species, indicating that Se may offer a natural protection against Hg toxicity. Copyright © 2016 Elsevier Ltd. All rights reserved.
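For reference, the Se:THg molar ratio is obtained from mass-based concentrations by dividing each by its atomic mass (a standard conversion, not necessarily the authors' exact notation):

    \mathrm{Se{:}THg_{molar}} = \frac{[\mathrm{Se}]/M_{Se}}{[\mathrm{THg}]/M_{Hg}}, \qquad M_{Se} \approx 78.97\ \mathrm{g\ mol^{-1}}, \quad M_{Hg} \approx 200.59\ \mathrm{g\ mol^{-1}}

Values above 1 indicate a molar excess of selenium over mercury in the tissue.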
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoon Lee, Sang; Hong, Tianzhen; Sawaya, Geof
The paper presents a method and process to establish a database of energy efficiency performance (DEEP) to enable quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 35 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models are developed for a comprehensive assessment of building energy performance based on DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six construction vintages and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air-conditioning, plug loads, and domestic hot water. DEEP contains the energy simulation results for individual retrofit measures as well as packages of measures to consider interactive effects between multiple measures. The large-scale EnergyPlus simulations are being conducted on the supercomputers at the National Energy Research Scientific Computing Center of Lawrence Berkeley National Laboratory. The pre-simulated database is part of an on-going project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP for recommended measures, estimated energy savings, and financial payback period based on users' decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The pre-simulated database and associated comprehensive measure analysis enhance the ability to perform retrofit assessments that reduce energy use for small and medium buildings whose owners typically do not have the resources to conduct a costly building energy audit. DEEP will be migrated into DEnCity, DOE's Energy City, which integrates large-scale energy data into a multi-purpose, open, and dynamic database leveraging diverse sources of existing simulation data.
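A retrofit toolkit of this kind typically ranks pre-simulated measures by a simple payback criterion; the sketch below is a hypothetical illustration (made-up measure names, costs, and savings, not DEEP data or its schema):

    # Illustrative sketch only: rank pre-simulated retrofit measures for one
    # prototype/climate-zone combination by simple payback (cost / annual savings).
    measures = [
        # name, installed cost ($), annual energy cost savings ($/yr) -- example values
        ("LED lighting retrofit",  12000, 4200),
        ("High-efficiency RTU",    30000, 5100),
        ("Cool roof coating",       8000,  900),
    ]

    def simple_payback(cost, annual_savings):
        """Years to recover the installed cost from annual energy cost savings."""
        return float("inf") if annual_savings <= 0 else cost / annual_savings

    for name, cost, savings in sorted(measures, key=lambda m: simple_payback(m[1], m[2])):
        print(f"{name:28s} payback = {simple_payback(cost, savings):5.1f} yr")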
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
... project (Project No. 13780-000) would consist of: (1) An 85-foot-long, 100-foot-wide, 14-foot-deep excavated power canal; (2) a 95-foot-long, 100-foot-wide, 10-foot-deep excavated tailrace; (3) a 100-foot...)(iii) and the instructions on the Commission's Web site http://www.ferc.gov/docs-filing/efiling.asp...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-16
... Advisory Board can be found at the EPA SAB Web site at http://www.epa.gov/sab . SUPPLEMENTARY INFORMATION..., and human health effects. The Deepwater Horizon spill identified the need for additional research on alternative spill response technologies; environmental impacts of chemical dispersants under deep sea...
NASA Astrophysics Data System (ADS)
Kawamura, Taichi; Lognonné, Philippe; Nishikawa, Yasuhiro; Tanaka, Satoshi
2017-07-01
While deep moonquakes are seismic events commonly observed on the Moon, their source mechanism is still unexplained. The two main issues are poorly constrained source parameters and incompatibilities between the thermal profiles suggested by many studies and the apparent need for brittle properties at these depths. In this study, we reinvestigated the deep moonquake data to reestimate its source parameters and uncover the characteristics of deep moonquake faults that differ from those on Earth. We first improve the estimation of source parameters through spectral analysis using "new" broadband seismic records made by combining those of the Apollo long- and short-period seismometers. We use the broader frequency band of the combined spectra to estimate corner frequencies and DC values of spectra, which are important parameters to constrain the source parameters. We further use the spectral features to estimate seismic moments and stress drops for more than 100 deep moonquake events from three different source regions. This study revealed that deep moonquake faults are extremely smooth compared to terrestrial faults. Second, we reevaluate the brittle-ductile transition temperature that is consistent with the obtained source parameters. We show that the source parameters imply that the tidal stress is the main source of the stress glut causing deep moonquakes and the large strain rate from tides makes the brittle-ductile transition temperature higher. Higher transition temperatures open a new possibility to construct a thermal model that is consistent with deep moonquake occurrence and pressure condition and thereby improve our understandings of the deep moonquake source mechanism.
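The abstract does not state which spectral source model was adopted; a commonly applied set of relations for an ω²-type (Brune) model, given the low-frequency (DC) spectral level Ω0 and the corner frequency f_c, is (illustrative, not necessarily the authors' exact formulation):

    M_0 \propto \rho\, v_s^{3}\, R\, \Omega_0 \quad \text{(with radiation-pattern and free-surface corrections)}
    r = \frac{2.34\, v_s}{2\pi f_c}, \qquad \Delta\sigma = \frac{7}{16}\,\frac{M_0}{r^{3}}

where ρ is density, v_s the shear-wave speed, R the hypocentral distance, r the source radius, M_0 the seismic moment, and Δσ the stress drop.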
ShapeShop: Towards Understanding Deep Learning Representations via Interactive Experimentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohman, Frederick M.; Hodas, Nathan O.; Chau, Duen Horng
Deep learning is the driving force behind many recent technologies; however, deep neural networks are often viewed as “black-boxes” due to their internal complexity that is hard to understand. Little research focuses on helping people explore and understand the relationship between a user’s data and the learned representations in deep learning models. We present our ongoing work, ShapeShop, an interactive system for visualizing and understanding what semantics a neural network model has learned. Built using standard web technologies, ShapeShop allows users to experiment with and compare deep learning models to help explore the robustness of image classifiers.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-16
.... Therefore, you should always check the Agency's Web site and call the appropriate advisory committee hot... currently approved for mid- to deep- dermal implantation for the correction of moderate to severe facial... material on its Web site prior to the meeting, the background material will be made publicly available at...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-05
...-deep, 3-mile-long canal carrying flows diverted from Cottonwood Creek by an existing diversion... on the Commission's Web site ( http://www.ferc.gov/docs-filing/ferconline.asp ) under the ``eFiling...Library'' link of Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the docket...
78 FR 72060 - Chimney Rock National Monument Management Plan; San Juan National Forest; Colorado
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-02
..., as well as objects of deep cultural and educational value. The plan will also provide for continued... Ranger District office in Pagosa Springs, Colorado, and on the San Juan National Forest Web site at www..., direct mailings, emails, and will be posted on the San Juan National Forest Web site. It is important...
78 FR 27405 - Anesthetic and Analgesic Drug Products Advisory Committee; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-10
... check the Agency's Web site at http://www.fda.gov/AdvisoryCommittees/default.htm and scroll down to the... proposed indications of routine reversal of moderate and deep neuromuscular blockade (NMB) induced by... meeting. If FDA is unable to post the background material on its Web site prior to the meeting, the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-18
... provide timely notice. Therefore, you should always check the Agency's Web site at http://www.fda.gov...]derm Voluma XC is indicated for deep (dermal/subcutaneous and/or submuscular/ supraperiosteal... the background material on its Web site prior to the meeting, the background material will be made...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-11
... deep saline geologic formations for permanent geologic storage. DATES: DOE invites the public to...; or by fax (304) 285-4403. The Draft EIS is available on DOE's NEPA Web page at: http://nepa.energy.gov/DOE_NEPA_documents.htm ; and on the National Energy Technology Laboratory's Web page at: http...
Web accessibility and open source software.
Obrenović, Zeljko
2009-07-01
A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration are complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse project called the Accessibility Tools Framework (ACTF), the aim of which is the development of an extensible infrastructure upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.
A new probe of the magnetic field power spectrum in cosmic web filaments
NASA Astrophysics Data System (ADS)
Hales, Christopher A.; Greiner, Maksim; Ensslin, Torsten A.
2015-08-01
Establishing the properties of magnetic fields on scales larger than galaxy clusters is critical for resolving the unknown origin and evolution of galactic and cluster magnetism. More generally, observations of magnetic fields on cosmic scales are needed for assessing the impacts of magnetism on cosmology, particle physics, and structure formation over the full history of the Universe. However, firm observational evidence for magnetic fields in large-scale structure remains elusive. In an effort to address this problem, we have developed a novel statistical method to infer the magnetic field power spectrum in cosmic web filaments using observations of the two-point correlation of Faraday rotation measures from a dense grid of extragalactic radio sources. Here we describe our approach, which embeds and extends the pioneering work of Kolatt (1998) within the context of Information Field Theory (a statistical theory for Bayesian inference on spatially distributed signals; Enßlin et al., 2009). We describe prospects for observation, for example with forthcoming data from the ultra-deep JVLA CHILES Con Pol survey and future surveys with the SKA.
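For context, the Faraday rotation measure of a polarized background source is the standard line-of-sight integral (the correlation estimator itself is only sketched here, not the authors' full Information Field Theory formalism):

    \mathrm{RM} = 0.812 \int_{source}^{observer} n_e(l)\, B_{\parallel}(l)\, dl \quad [\mathrm{rad\ m^{-2}}]

with n_e in cm⁻³, B∥ in μG, and path length in pc. The observable exploited here is the angular two-point correlation ⟨RM(n̂₁) RM(n̂₂)⟩ as a function of source separation, which carries information about the line-of-sight-integrated magnetic power spectrum of the intervening filaments.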
Bathymetric limits of chondrichthyans in the deep sea: A re-evaluation
NASA Astrophysics Data System (ADS)
Musick, J. A.; Cotton, C. F.
2015-05-01
Chondrichthyans are largely absent in abyssal (>3000 m) habitats in most regions of the world ocean and are uncommon below 2000 m. The deeper-living chondrichthyans include certain rajids, squaliforms and holocephalans. Several hypotheses have been erected to explain the absence of chondrichthyans from the abyss. These are mostly based on energetics: deep-sea food webs are impoverished due to their distance from primary production, and chondrichthyans, occupying the highest trophic levels, cannot be supported due to entropy among trophic levels. We examined this hypothesis by comparing trophic levels, calculated from dietary data, of deep-sea chondrichthyans with those of deep-sea teleosts. Chondrichthyans were mostly above trophic level 4, whereas all the teleosts examined were below that level. Both small and medium squaloids, as well as sharks and skates of large size, feed on fishes, cephalopods and scavenged prey, and thus occupy the highest trophic levels in bathydemersal fish communities. In addition, whereas teleosts and chondrichthyans both store lipids in their livers to support long periods of fasting, chondrichthyans must devote much of their liver lipids to maintain neutral buoyancy. Consequently teleosts with swim bladders are better adapted to survive in the abyss where food sources are sparse and unpredictable. The potential prey field for both chondrichthyans and teleosts declines in biomass and diversity with depth, but teleosts have more flexibility in their feeding mechanisms and food habits, and occupy abyssal trophic guilds for which chondrichthyans are ill adapted.
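Trophic levels "calculated from dietary data" are conventionally obtained from diet-composition fractions (a standard formulation, shown here for illustration):

    TL_i = 1 + \sum_j p_{ij}\, TL_j

where p_{ij} is the proportion of prey item j in the diet of consumer i, and primary producers and detritus are assigned TL = 1; under this convention the chondrichthyans discussed above fall mostly above TL = 4 and the deep-sea teleosts examined fall below it.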
ERIC Educational Resources Information Center
Fraser, Landon; Locatis, Craig
2001-01-01
Investigated the effects of link annotations on high school user search performance in Web hypertext environments having deep (layered) and shallow link structures. Results confirmed previous research that shallow link structures are better than deep (layered) link structures, and also showed that annotations had virtually no effect on search…
Biomagnification of persistent organic pollutants in a deep-sea, temperate food web.
Romero-Romero, Sonia; Herrero, Laura; Fernández, Mario; Gómara, Belén; Acuña, José Luis
2017-12-15
Polychlorinated biphenyls (PCBs), polybrominated diphenyl ethers (PBDEs) and polychlorinated dibenzo-p-dioxins and -furans (PCDD/Fs) were measured in a temperate, deep-sea ecosystem, the Avilés submarine Canyon (AC; Cantabrian Sea, Southern Bay of Biscay). There was an increase of contaminant concentration with the trophic level of the organisms, as calculated from stable nitrogen isotope data (δ15N). Such biomagnification was only significant for the pelagic food web and its magnitude was highly dependent on the type of top predators included in the analysis. The trophic magnification factor (TMF) for PCB-153 in the pelagic food web (spanning four trophic levels) was 6.2 or 2.2, depending on whether homeotherm top predators (cetaceans and seabirds) were or were not included in the analysis, respectively. Since body size is significantly correlated with δ15N, it can be used as a proxy to estimate trophic magnification, which can potentially lead to a simple and convenient method to calculate the TMF. In spite of their lower biomagnification, deep-sea fishes showed higher concentrations than their shallower counterparts, although those differences were not significant. In summary, the AC fauna exhibits contaminant levels comparable to or lower than those reported in other systems. Copyright © 2017 Elsevier B.V. All rights reserved.
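A TMF is conventionally estimated by regressing log-transformed contaminant concentrations on isotope-derived trophic level (a typical formulation with an assumed 3.4‰ enrichment per trophic step; the authors' exact parameters are not given in the abstract):

    TL = 2 + \frac{\delta^{15}N_{consumer} - \delta^{15}N_{baseline}}{3.4}, \qquad \log_{10} C = a + b\,TL, \qquad \mathrm{TMF} = 10^{\,b}

where C is the (typically lipid-normalized) contaminant concentration, the baseline is a primary consumer assigned TL = 2, and TMF > 1 indicates biomagnification.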
DeepSig: deep learning improves signal peptide detection in proteins.
Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Casadio, Rita
2018-05-15
The identification of signal peptides in protein sequences is an important step toward protein localization and function characterization. Here, we present DeepSig, an improved approach for signal peptide detection and cleavage-site prediction based on deep learning methods. Comparative benchmarks performed on an updated independent dataset of proteins show that DeepSig is the current best performing method, scoring better than other available state-of-the-art approaches on both signal peptide detection and precise cleavage-site identification. DeepSig is available as both standalone program and web server at https://deepsig.biocomp.unibo.it. All datasets used in this study can be obtained from the same website. pierluigi.martelli@unibo.it. Supplementary data are available at Bioinformatics online.
Beyond the vent: New perspectives on hydrothermal plumes and pelagic biology
NASA Astrophysics Data System (ADS)
Phillips, Brennan T.
2017-03-01
Submarine hydrothermal vent fields introduce buoyant plumes of chemically altered seawater to the deep-sea water column. Chemoautotrophic microbes exploit this energy source, facilitating seafloor-based primary production that evidence suggests may transfer to pelagic consumers. While most hydrothermal plumes have relatively small volumes, there are recent examples of large-scale plume events associated with periods of eruptive activity, which have had a pronounced effect on water-column biology. This correlation suggests that hydrothermal plumes may have influenced basin-scale ocean chemistry during periods of increased submarine volcanism during the Phanerozoic eon. This paper synthesizes a growing body of scientific evidence supporting the hypothesis that hydrothermal plumes are the energetic basis of unique deep-sea pelagic food webs. While many important questions remain concerning the biology of hydrothermal plumes, this discussion is not present in ongoing management efforts related to seafloor massive sulfide (SMS) mining. Increased research efforts, focused on high-resolution surveys of midwater biology relative to plume structures, are recommended to establish baseline conditions and monitor the impact of future mining-based disturbances to the pelagic biosphere.
Open-Source web-based geographical information system for health exposure assessment
2012-01-01
This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that by using solely open source software it was possible to develop a customisable web-based GIS application that provides the functions necessary to convey health and environmental data to experts and non-experts alike without the requirement of proprietary software. PMID:22233606
Information Diversity in Web Search
ERIC Educational Resources Information Center
Liu, Jiahui
2009-01-01
The web is a rich and diverse information source with incredible amounts of information about all kinds of subjects in various forms. This information source affords great opportunity to build systems that support users in their work and everyday lives. To help users explore information on the web, web search systems should find information that…
40 CFR 63.4321 - How do I demonstrate initial compliance with the emission limitations?
Code of Federal Regulations, 2011 CFR
2011-07-01
... compliant material option for any individual web coating/printing operation, for any group of web coating/printing operations in the affected source, or for all the web coating/printing operations in the affected... HAP concentration option for any web coating/printing operation(s) in the affected source for which...
40 CFR 63.4321 - How do I demonstrate initial compliance with the emission limitations?
Code of Federal Regulations, 2010 CFR
2010-07-01
... compliant material option for any individual web coating/printing operation, for any group of web coating/printing operations in the affected source, or for all the web coating/printing operations in the affected... HAP concentration option for any web coating/printing operation(s) in the affected source for which...
77 FR 74470 - Intent to Prepare an Environmental Impact Statement (EIS) for the Donlin Gold Project
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-14
... added to the project mailing list and for additional information, please visit the following web site... miles long by 1 mile wide by 1,850 feet deep; a waste treatment facility (tailings impoundment... description of the proposed project will be posted on the project web site prior to these meetings to help the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-04
... capacity of 450 kilowatts; (4) an existing 10- foot-wide, 8-foot-deep intake canal; (5) new trash racks... Commission's Web site under the ``eFiling'' link. If unable to be filed electronically, documents may be... information on how to submit these types of filings please go to the Commission's Web site located at http...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... would consist of: (1) A new approximately 135-acre, 30-foot-deep upper reservoir constructed of enclosed... 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site under the ``eFiling... filings please go to the Commission's Web site located at http://www.ferc.gov/filing-comments.asp . More...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... from http://www.regulations.gov or from the Alaska Region Web site at http://alaskafisheries.noaa.gov...) at 605 West 4th Avenue, Suite 306, Anchorage, AK 99501, phone 907-271-2809, or from the Council's Web... biomass trends for the following species are relatively stable: shallow-water flatfish, deep-water...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... consist of: (1) A new approximately 135-acre, 30-foot-deep upper reservoir constructed of enclosed earth... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site under the ``e... filings please go to the Commission's Web site located at http://www.ferc.gov/filing-comments.asp . More...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
...-foot-wide, 98-foot-deep concrete lined vertical shaft containing 10-foot-diameter siphon piping and a... via the Internet. See 18 CFR Sec. 385.2001(a)(1)(iii) and the instructions on the Commission's Web... Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the docket number (P-14360) in...
NASA Astrophysics Data System (ADS)
Goehring, E. C.; Carlsen, W.; Larsen, J.; Simms, E.; Smith, M.
2007-12-01
From Local to EXtreme Environments (FLEXE) is an innovative new project of the GLOBE Program that involves middle and high school students in systematic, facilitated analyses and comparisons of real environmental data. Through FLEXE, students collect and analyze data from various sources, including the multi-year GLOBE database, deep-sea scientific research projects, and direct measurements of the local environment collected by students using GLOBE sampling protocols. Initial FLEXE materials and training have focused on student understanding of energy transfer through components of the Earth system, including a comparison of how local environmental conditions differ from those found at deep-sea hydrothermal vent communities. While the importance of data acquisition, accuracy and replication is emphasized, FLEXE is also uniquely structured to deepen students' understanding of multiple aspects of the process and nature of science, including written communication of results and on-line peer review. Analyses of data are facilitated through structured, web-based interactions and culminating activities with at-sea scientists through an online forum. The project benefits from the involvement of a professional evaluator, and as the model is tested and refined, it may serve as a template for the inclusion of additional "extreme" earth systems. FLEXE is a partnership of the international GLOBE web-based education program and the NSF Ridge 2000 mid-ocean ridge and hydrothermal vent research program, and includes the expertise of the Center for Science and the Schools at Penn State University. International collaborators also include the InterRidge and ChEss international research programs.
ERIC Educational Resources Information Center
Kammerer, Yvonne; Kalbfell, Eva; Gerjets, Peter
2016-01-01
In two experiments we systematically examined whether contradictions between two web pages--of which one was commercially biased as stated in an "about us" section--stimulated university students' consideration of source information both during and after reading. In Experiment 1 "about us" information of the web pages was…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-20
... implemented, this rule would remove the harvest and possession prohibition of six deep-water snapper-grouper... intent of this rule is to reduce the socio-economic impacts to fishermen harvesting deep-water snapper... obtained from the Southeast Regional Office Web site at http://sero.nmfs.noaa.gov . FOR FURTHER INFORMATION...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-21
... to 10-foot-deep canal extending between the head gates and the powerhouse; (3) a gate structure in...-foot-wide, 8 to 10-foot- deep canal extending between the head gates and the powerhouse; (3) a gate... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc...
78 FR 5421 - Mid-Atlantic Fishery Management Council (MAFMC); Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... 5 p.m. there will be a Scoping Hearing for the Deep Sea Corals Amendment. On Thursday February 14--A presentation on the new Council Web site will be held from 9 a.m. until 9:30 a.m. The Council will hold its... Fishery Management Plan (Deep Sea Corals Amendment) and review alternatives to be included in the...
NASA Astrophysics Data System (ADS)
Ruby, C.; Skarke, A. D.; Mesick, S.
2016-02-01
The Coastal and Marine Ecological Classification Standard (CMECS) is a network of common nomenclature that provides a comprehensive framework for organizing physical, biological, and chemical information about marine ecosystems. It was developed by the National Oceanic and Atmospheric Administration (NOAA) Coastal Services Center, in collaboration with other federal agencies and academic institutions, as a means for scientists to more easily access, compare, and integrate marine environmental data from a wide range of sources and time frames. CMECS has been endorsed by the Federal Geographic Data Committee (FGDC) as a national metadata standard. The research presented here is focused on the application of CMECS to deep-sea video and environmental data collected by the NOAA ROV Deep Discoverer and the NOAA Ship Okeanos Explorer in the Gulf of Mexico in 2011-2014. Specifically, a spatiotemporal index of the physical, chemical, biological, and geological features observed in ROV video records was developed in order to allow scientists, otherwise unfamiliar with the specific content of existing video data, to rapidly determine the abundance and distribution of features of interest, and thus evaluate the applicability of those video data to their research. CMECS units (setting, component, or modifier) for seafloor images extracted from high-definition ROV video data were established based upon visual assessment as well as analysis of coincident environmental sensor (temperature, conductivity), navigation (ROV position, depth, attitude), and log (narrative dive summary) data. The resulting classification units were integrated into easily searchable textual and geo-databases as well as an interactive web map. The spatial distribution and associations of deep-sea habitats as indicated by CMECS classifications are described and optimized methodological approaches for application of CMECS to deep-sea video and environmental data are presented.
40 CFR 63.4331 - How do I demonstrate initial compliance with the emission limitations?
Code of Federal Regulations, 2011 CFR
2011-07-01
... web coating/printing operations, you may use the emission rate without add-on controls option for any individual web coating/printing operation, for any group of web coating/printing operations in the affected source, or for all the web coating/printing operations as a group in the affected source. You must use...
40 CFR 63.4331 - How do I demonstrate initial compliance with the emission limitations?
Code of Federal Regulations, 2010 CFR
2010-07-01
... web coating/printing operations, you may use the emission rate without add-on controls option for any individual web coating/printing operation, for any group of web coating/printing operations in the affected source, or for all the web coating/printing operations as a group in the affected source. You must use...
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating... affected source subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and...
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating... affected source subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and...
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating... affected source subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and...
From projected species distribution to food-web structure under climate change.
Albouy, Camille; Velez, Laure; Coll, Marta; Colloca, Francesco; Le Loc'h, François; Mouillot, David; Gravel, Dominique
2014-03-01
Climate change is inducing deep modifications in species geographic ranges worldwide. However, the consequences of such changes on community structure are still poorly understood, particularly the impacts on food-web properties. Here, we propose a new framework, coupling species distribution and trophic models, to predict climate change impacts on food-web structure across the Mediterranean Sea. Sea surface temperature was used to determine the fish climate niches and their future distributions. Body size was used to infer trophic interactions between fish species. Our projections reveal that 54 fish species of 256 endemic and native species included in our analysis would disappear by 2080-2099 from the Mediterranean continental shelf. The number of feeding links between fish species would decrease on 73.4% of the continental shelf. However, the connectance of the overall fish web would increase on average, from 0.26 to 0.29, mainly due to a differential loss rate of feeding links and species richness. This result masks a systematic decrease in predator generality, estimated here as the number of prey species, from 30.0 to 25.4. Therefore, our study highlights large-scale impacts of climate change on marine food-web structure with potential deep consequences on ecosystem functioning. However, these impacts will likely be highly heterogeneous in space, challenging our current understanding of climate change impact on local marine ecosystems. © 2013 John Wiley & Sons Ltd.
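The connectance and generality figures quoted above follow from simple counts on a predator-prey matrix. Below is a minimal Python sketch (not the authors' code; the random web and its size are placeholder assumptions) of how connectance (links divided by squared species richness) and mean predator generality (mean number of prey per predator) can be computed:

import numpy as np

def connectance(adj: np.ndarray) -> float:
    """adj[i, j] = True if species i eats species j."""
    s = adj.shape[0]                 # species richness S
    links = int(adj.sum())           # number of feeding links L
    return links / s**2              # C = L / S^2

def mean_generality(adj: np.ndarray) -> float:
    """Mean number of prey species per predator (rows with at least one prey)."""
    prey_counts = adj.sum(axis=1)
    predators = prey_counts > 0
    return float(prey_counts[predators].mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    web = rng.random((50, 50)) < 0.2   # toy 50-species web, ~20% link density
    print(f"connectance     = {connectance(web):.3f}")
    print(f"mean generality = {mean_generality(web):.1f}")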
Nebula--a web-server for advanced ChIP-seq data analysis.
Boeva, Valentina; Lermine, Alban; Barette, Camille; Guillouf, Christel; Barillot, Emmanuel
2012-10-01
ChIP-seq consists of chromatin immunoprecipitation and deep sequencing of the extracted DNA fragments. It is the technique of choice for accurate characterization of the binding sites of transcription factors and other DNA-associated proteins. We present a web service, Nebula, which allows inexperienced users to perform a complete bioinformatics analysis of ChIP-seq data. Nebula was designed for both bioinformaticians and biologists. It is based on the Galaxy open source framework. Galaxy already includes a large number of functionalities for mapping reads and peak calling. We added the following to Galaxy: (i) peak calling with FindPeaks and a module for immunoprecipitation quality control, (ii) de novo motif discovery with ChIPMunk, (iii) calculation of the density and the cumulative distribution of peak locations relative to gene transcription start sites, (iv) annotation of peaks with genomic features and (v) annotation of genes with peak information. Nebula generates the graphs and the enrichment statistics at each step of the process. During Steps 3-5, Nebula optionally repeats the analysis on a control dataset and compares these results with those from the main dataset. Nebula can also incorporate gene expression (or gene modulation) data during these steps. In summary, Nebula is an innovative web service that provides an advanced ChIP-seq analysis pipeline with ready-to-publish results. Nebula is available at http://nebula.curie.fr/. Supplementary data are available at Bioinformatics online.
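As an illustration of step (iii) above, the following hedged Python sketch (not Nebula's Galaxy-based implementation; all coordinates are hypothetical) computes the distance from each peak to its nearest transcription start site and the empirical cumulative distribution of those distances:

import numpy as np

def distance_to_nearest_tss(peak_midpoints: np.ndarray, tss: np.ndarray) -> np.ndarray:
    """Signed distance from each peak midpoint to its nearest TSS (single chromosome assumed)."""
    tss_sorted = np.sort(tss)
    idx = np.searchsorted(tss_sorted, peak_midpoints)
    idx_left = np.clip(idx - 1, 0, len(tss_sorted) - 1)
    idx_right = np.clip(idx, 0, len(tss_sorted) - 1)
    d_left = peak_midpoints - tss_sorted[idx_left]
    d_right = peak_midpoints - tss_sorted[idx_right]
    return np.where(np.abs(d_left) < np.abs(d_right), d_left, d_right)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    tss = np.sort(rng.integers(0, 1_000_000, 200))      # toy TSS positions
    peaks = np.sort(rng.integers(0, 1_000_000, 500))    # toy peak midpoints
    d = distance_to_nearest_tss(peaks, tss)
    x = np.sort(np.abs(d))                               # empirical CDF of |distance|
    cdf = np.arange(1, len(x) + 1) / len(x)
    print("median |peak - TSS| distance:", int(np.median(np.abs(d))), "bp")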
DeepBase: annotation and discovery of microRNAs and other noncoding RNAs from deep-sequencing data.
Yang, Jian-Hua; Qu, Liang-Hu
2012-01-01
Recent advances in high-throughput deep-sequencing technology have produced large numbers of short and long RNA sequences and enabled the detection and profiling of known and novel microRNAs (miRNAs) and other noncoding RNAs (ncRNAs) at unprecedented sensitivity and depth. In this chapter, we describe the use of deepBase, a database that we have developed to integrate all public deep-sequencing data and to facilitate the comprehensive annotation and discovery of miRNAs and other ncRNAs from these data. deepBase provides an integrative, interactive, and versatile web graphical interface to evaluate miRBase-annotated miRNA genes and other known ncRNAs, explore the expression patterns of miRNAs and other ncRNAs, and discover novel miRNAs and other ncRNAs from deep-sequencing data. deepBase also provides a deepView genome browser to comparatively analyze these data at multiple levels. deepBase is available at http://deepbase.sysu.edu.cn/.
Analyzing traffic source impact on returning visitors ratio in information provider website
NASA Astrophysics Data System (ADS)
Prasetio, A.; Sari, P. K.; Sharif, O. O.; Sofyan, E.
2016-04-01
Web site performance, especially the returning visitor rate, is an important metric for an information provider web site. Since a high returning visitor ratio is a good indication of a web site's visitor loyalty, it is important to find a way to improve this metric. This research investigated whether there is any difference in the returning visitor metric among three web traffic sources, namely direct, referral and search. Monthly returning visitors and total visitors from each source were retrieved from the Google Analytics tool and then used to calculate the returning visitor ratio. The period of data observation is from July 2012 to June 2015, resulting in a total of 108 samples. These data were then analysed using One-Way Analysis of Variance (ANOVA) to address our research question. The results showed that different traffic sources have significantly different returning visitor ratios, especially between the referral traffic source and the other two traffic sources. On the other hand, this research did not find any significant difference between the returning visitor ratios from the direct and search traffic sources. The owner of the web site can therefore focus on multiplying referral links from other relevant sites.
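The analysis described above reduces to a one-way ANOVA over three groups of monthly returning-visitor ratios. A hedged Python sketch follows (the numbers are simulated placeholders, not the study's Google Analytics data):

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
direct   = rng.normal(0.40, 0.05, 36)   # hypothetical monthly ratios, 36 months each
referral = rng.normal(0.55, 0.05, 36)
search   = rng.normal(0.41, 0.05, 36)

f_stat, p_value = stats.f_oneway(direct, referral, search)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Post-hoc pairwise checks (Bonferroni-corrected t-tests) mirror the reported
# finding that the referral source differs from the other two.
pairs = [("direct vs referral", direct, referral),
         ("direct vs search", direct, search),
         ("referral vs search", referral, search)]
for name, a, b in pairs:
    t, p = stats.ttest_ind(a, b)
    print(f"{name}: p = {min(p * 3, 1.0):.4f} (Bonferroni)")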
Semantic integration of data on transcriptional regulation.
Baitaluk, Michael; Ponomarenko, Julia
2010-07-01
Experimental and predicted data concerning gene transcriptional regulation are distributed among many heterogeneous sources. However, there are no resources to integrate these data automatically or to provide a 'one-stop shop' experience for users seeking information essential for deciphering and modeling gene regulatory networks. IntegromeDB, a semantic graph-based 'deep-web' data integration system that automatically captures, integrates and manages publicly available data concerning transcriptional regulation, as well as other relevant biological information, is proposed in this article. The problems associated with data integration are addressed by ontology-driven data mapping, multiple data annotation and heterogeneous data querying, also enabling integration of the user's data. IntegromeDB integrates over 100 experimental and computational data sources relating to genomics, transcriptomics, genetics, and functional and interaction data concerning gene transcriptional regulation in eukaryotes and prokaryotes. IntegromeDB is accessible through the integrated research environment BiologicalNetworks at http://www.BiologicalNetworks.org. Contact: baitaluk@sdsc.edu. Supplementary data are available at Bioinformatics online.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
...--County on January 28, 2011. The public notice is available on Charleston District's public Web site at... eight open mining pits over a twelve-year period, with pit depths ranging from 110 to 840 feet deep. The... of January 28, 2011, and are available on Charleston District's public Web site at http://www.sac...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
... electronically via the Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site...-6-foot-deep, 50-to-200-foot-wide headrace canal; (4) an existing 25-foot-long, 49-foot wide... the Web at http://www.ferc.gov using the ``eLibrary'' link. Enter the docket number excluding the last...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
...-deep intake canal; (5) new trash racks, head gates, and stop log structure; (6) an existing 6-foot... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc... copy of the application, can be viewed or printed on the ``eLibrary'' link of the Commission's Web site...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-15
... above mean sea level (msl); (3) an existing 12-foot-long, 16-foot-wide, 10-foot-deep head box and intake....2001(a)(1)(iii) and the instructions on the Commission's Web site http:[sol][sol]www.ferc.gov/docs... Commission's Web site at http:[sol][sol]www.ferc.gov/docs-filing/ elibrary.asp. Enter the docket number (P...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-28
...-deep, 24-foot-diameter vertical shaft to connect the upper and lower reservoir to the power tunnel; (6... electronically via the Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site...Library'' link of Commission's Web site at http:[sol][sol]www.ferc.gov/docs-filing/ elibrary.asp. Enter...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
...) a new 130-foot-long, 20-foot-wide, 6-foot-deep concrete intake channel; (4) a new 10-foot-high, 20... on the Commission's Web site http://www.ferc.gov/docs-filing/efiling.asp . Commenters can submit... viewed or printed on the ``eLibrary'' link of Commission's Web site at http://www.ferc.gov/docs-filing...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-07
... Agency's Web site and call the appropriate advisory committee hot line/phone line to learn about possible... wrinkles in the face. The AQUAMID dermal filler is intended for use in mid-to-deep sub-dermal implantation... before the meeting. If FDA is unable to post the background material on its Web site prior to the meeting...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-10
....2001(a)(l)(iii) and the instructions on the Commission's Web site under the ``eFiling'' link. If unable... the Commission's Web site located at http://www.ferc.gov/filing-comments.asp . Please include the..., which will be dropped into a 8-foot-long, 6-foot-wide, and 6-foot-deep concrete diversion chamber that...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-16
...) filed an application for transfer of license of the Worthville Dam Project No. 3156, located on the Deep.... See 18 CFR 385.2001(a)(1)(iii)(2008) and the instructions on the Commission's Web site under the ``e... filings please go to the Commission's Web site located at http://www.ferc.gov/filing-comments.asp . More...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-27
... structure; (9) a 12-foot-diameter, 2,842- foot-long concrete tunnel; (10) a 73-foot-deep forebay; (11) three... do not need to refile. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web...Library'' link of Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the docket...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-26
...-foot-wide and 10 to 12-foot-deep; (3) a new powerhouse equipped with a single 0.9 megawatt Kaplan... electronically via the Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site...Library'' link of the Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-09
... always check the Agency's Web site at http://www.fda.gov/AdvisoryCommittees/default.htm and scroll down... conditions by means other than the generation of deep heat within body tissues. On July 6, 2012 (77 FR 39953... than 2 business days before the meeting. If FDA is unable to post the background material on its Web...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-21
... within the proposed project boundary; (3) an existing 12-foot-long, 6.6-foot-wide, 6.6-foot-deep head box....2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc.gov/docs-filing... the application, can be viewed or printed on the ``eLibrary'' link of Commission's Web site at http...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-01
... powerhouse adjacent to the training walls; (7) a new 25-foot-wide, 5-foot-deep crest gate adjacent to the... electronically via the Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site... Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the docket number (P-13944) in...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-20
... Ramseur Project No. 11392 located on the Deep River in Randolph County, North Carolina. The transferor and...)(iii)(2009) and the instructions on the Commission's Web site under the ``e-Filing'' link. If unable to... the Commission's Web site located at http://www.ferc.gov/filing-comments.asp . More information about...
Colombo, Cinzia; Mosconi, Paola; Confalonieri, Paolo; Baroni, Isabella; Traversa, Silvia; Hill, Sophie J; Synnot, Anneliese J; Oprandi, Nadia; Filippini, Graziella
2014-07-24
Multiple sclerosis (MS) patients and their family members increasingly seek health information on the Internet. There has been little exploration of how MS patients integrate health information with their needs, preferences, and values for decision making. The INtegrating and Deriving Evidence, Experiences, and Preferences (IN-DEEP) project is a collaboration between Italian and Australian researchers and MS patients, aimed to make high-quality evidence accessible and meaningful to MS patients and families, developing a Web-based resource of evidence-based information starting from their information needs. The objective of this study was to analyze MS patients and their family members' experience with Web-based health information, to evaluate how they assess this information, and how they integrate health information with personal values. We organized 6 focus groups, 3 with MS patients and 3 with family members, in the Northern, Central, and Southern parts of Italy (April-June 2011). They included 40 MS patients aged between 18 and 60, diagnosed as having MS at least 3 months earlier, and 20 family members aged 18 and over, being relatives of a person with at least a 3-month MS diagnosis. The focus groups were audio-recorded and transcribed verbatim (Atlas software, V 6.0). Data were analyzed from a conceptual point of view through a coding system. An online forum was hosted by the Italian MS society on its Web platform to widen the collection of information. Nine questions were posted covering searching behavior, use of Web-based information, truthfulness of Web information. At the end, posts were downloaded and transcribed. Information needs covered a comprehensive communication of diagnosis, prognosis, and adverse events of treatments, MS causes or risk factors, new drugs, practical, and lifestyle-related information. The Internet is considered useful by MS patients; however, at the beginning or in a later stage of the disease a refusal to actively search for information could occur. Participants used to search on the Web before or after their neurologist's visit or when a new therapy was proposed. Social networks are widely used to read others' stories and retrieve information about daily management. A critical issue was the difficulty of recognizing reliable information on the Web. Many sources were used but the neurologist was mostly the final source of treatment decisions. MS patients used the Internet as a tool to integrate information about the illness. Information needs covered a wide spectrum, and the searched topics changed with progression of the disease. Criteria for evaluating Internet accuracy and credibility of information were often lacking or generic. This may limit the empowerment of patients in health care choices.
Colombo, Cinzia; Confalonieri, Paolo; Baroni, Isabella; Traversa, Silvia; Hill, Sophie J; Synnot, Anneliese J; Oprandi, Nadia; Filippini, Graziella
2014-01-01
Background Multiple sclerosis (MS) patients and their family members increasingly seek health information on the Internet. There has been little exploration of how MS patients integrate health information with their needs, preferences, and values for decision making. The INtegrating and Deriving Evidence, Experiences, and Preferences (IN-DEEP) project is a collaboration between Italian and Australian researchers and MS patients, aimed to make high-quality evidence accessible and meaningful to MS patients and families, developing a Web-based resource of evidence-based information starting from their information needs. Objective The objective of this study was to analyze MS patients and their family members' experience with Web-based health information, to evaluate how they assess this information, and how they integrate health information with personal values. Methods We organized 6 focus groups, 3 with MS patients and 3 with family members, in the Northern, Central, and Southern parts of Italy (April-June 2011). They included 40 MS patients aged between 18 and 60, diagnosed as having MS at least 3 months earlier, and 20 family members aged 18 and over, being relatives of a person with at least a 3-month MS diagnosis. The focus groups were audio-recorded and transcribed verbatim (Atlas software, V 6.0). Data were analyzed from a conceptual point of view through a coding system. An online forum was hosted by the Italian MS society on its Web platform to widen the collection of information. Nine questions were posted covering searching behavior, use of Web-based information, truthfulness of Web information. At the end, posts were downloaded and transcribed. Results Information needs covered a comprehensive communication of diagnosis, prognosis, and adverse events of treatments, MS causes or risk factors, new drugs, practical, and lifestyle-related information. The Internet is considered useful by MS patients; however, at the beginning or in a later stage of the disease a refusal to actively search for information could occur. Participants used to search on the Web before or after their neurologist's visit or when a new therapy was proposed. Social networks are widely used to read others' stories and retrieve information about daily management. A critical issue was the difficulty of recognizing reliable information on the Web. Many sources were used but the neurologist was mostly the final source of treatment decisions. Conclusions MS patients used the Internet as a tool to integrate information about the illness. Information needs covered a wide spectrum, and the searched topics changed with progression of the disease. Criteria for evaluating Internet accuracy and credibility of information were often lacking or generic. This may limit the empowerment of patients in health care choices. PMID:25093374
WorldWide Telescope: A Newly Open Source Astronomy Visualization System
NASA Astrophysics Data System (ADS)
Fay, Jonathan; Roberts, Douglas A.
2016-01-01
After eight years of development by Microsoft Research, WorldWide Telescope (WWT) was made an open source project at the end of June 2015. WWT was motivated by the desire to put new surveys of objects, such as the Sloan Digital Sky Survey, in the context of the night sky. The development of WWT under Microsoft started with the creation of a Windows desktop client that is widely used in various education, outreach and research projects. Using this, users can explore the data built into WWT as well as data that is loaded in. Beyond exploration, WWT can be used to create tours that present various datasets in a narrative format. In the past two years, the team developed a collection of web controls, including an HTML5 web client, which contains much of the functionality of the Windows desktop client. The project under Microsoft has deep connections with several user communities, such as education through the WWT Ambassadors program (http://wwtambassadors.org/) and planetariums and museums such as the Adler Planetarium. WWT can also support research, including using WWT to visualize the Bones of the Milky Way and rich connections between WWT and the Astrophysics Data System (ADS, http://labs.adsabs.harvard.edu/adsabs/). One important new research connection is the use of WWT to create dynamic and potentially interactive supplements to journal articles, which were created in 2015. Now WWT is an open source, community-led project. The source code is available on GitHub (https://github.com/WorldWideTelescope). There is significant developer documentation on the website (http://worldwidetelescope.org/Developers/), and an extensive developer workshop (http://wwtworkshops.org/?tribe_events=wwt-developer-workshop) took place in the fall of 2015. Now that WWT is open source, anyone with an interest in the project can be a contributor. As important as helping out with coding, the project needs people interested in documentation, testing, training and other roles.
Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K; Cai, Chang; Nagarajan, Srikantan S
2018-06-01
Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.
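The core of the proposed procedure is linear algebra on the data matrix: project the data onto the complement of the deep-region beamspace to isolate superficial activity, estimate its temporal subspace, and remove that subspace from the rows of the data. The Python sketch below illustrates that idea only in outline; it is an assumption-laden simplification, not the published bDSSP implementation, and the projector, data shapes, and interference dimension are placeholders:

import numpy as np

def bdssp_like_clean(B: np.ndarray, P_deep: np.ndarray, n_interf: int) -> np.ndarray:
    """B: (n_sensors, n_samples) MEG data; P_deep: (n_sensors, n_sensors) projector onto
    the beamspace spanning the assumed deep-source region; n_interf: assumed dimension
    of the superficial (interference) temporal subspace."""
    Q = np.eye(B.shape[0]) - P_deep          # complement of the deep-region beamspace
    B_sup = Q @ B                            # data dominated by superficial sources
    # temporal subspace of the superficial sources (leading right singular vectors)
    _, _, Vt = np.linalg.svd(B_sup, full_matrices=False)
    V_int = Vt[:n_interf].T                  # (n_samples, n_interf)
    # project the rows of B onto the orthogonal complement of that temporal subspace
    P_time = np.eye(B.shape[1]) - V_int @ V_int.T
    return B @ P_time

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_sensors, n_samples = 64, 1000
    B = rng.standard_normal((n_sensors, n_samples))          # toy data
    U, _ = np.linalg.qr(rng.standard_normal((n_sensors, 10)))  # toy rank-10 beamspace
    cleaned = bdssp_like_clean(B, U @ U.T, n_interf=20)
    print(cleaned.shape)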
NASA Astrophysics Data System (ADS)
Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K.; Cai, Chang; Nagarajan, Srikantan S.
2018-06-01
Objective. Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. Approach. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Main results. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. Significance. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.
Urbach, E.; Vergin, K.L.; Larson, G.L.; Giovannoni, S.J.
2007-01-01
The distribution of bacterial and archaeal species in Crater Lake plankton varies dramatically over depth and with time, as assessed by hybridization of group-specific oligonucleotides to RNA extracted from lakewater. Nonmetric multidimensional scaling (MDS) analysis of relative bacterial phylotype densities revealed complex relationships among assemblages sampled from depth profiles in July, August and September of 1997 through 1999. CL500-11 green nonsulfur bacteria (Phylum Chloroflexi) and marine Group I crenarchaeota are consistently dominant groups in the oxygenated deep waters at 300 and 500 m. Other phylotypes found in the deep waters are similar to surface and mid-depth populations and vary with time. Euphotic zone assemblages are dominated either by ??-proteobacteria or CL120-10 verrucomicrobia, and ACK4 actinomycetes. MDS analyses of euphotic zone populations in relation to environmental variables and phytoplankton and zooplankton population structures reveal apparent links between Daphnia pulicaria zooplankton population densities and microbial community structure. These patterns may reflect food web interactions that link kokanee salmon population densities to community structure of the bacterioplankton, via fish predation on Daphnia with cascading consequences to Daphnia bacterivory and predation on bacterivorous protists. These results demonstrate a stable bottom-water microbial community. They also extend previous observations of food web-driven changes in euphotic zone bacterioplankton community structure to an oligotrophic setting. © 2007 Springer Science+Business Media B.V.
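For readers unfamiliar with the ordination method mentioned above, the following Python sketch shows a nonmetric MDS of hypothetical relative phylotype-density profiles using Bray-Curtis dissimilarities. The data are simulated placeholders, and the original analysis was not necessarily performed with these tools:

import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(7)
# rows = lakewater samples (depth x month), columns = relative phylotype densities
profiles = rng.dirichlet(np.ones(12), size=20)

diss = squareform(pdist(profiles, metric="braycurtis"))     # sample-by-sample dissimilarity
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
coords = nmds.fit_transform(diss)                            # 2-D ordination of samples
print("NMDS coordinates of the first three samples:\n", coords[:3])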
Insect-damaged fossil leaves record food web response to ancient climate change and extinction.
Wilf, P
2008-01-01
Plants and herbivorous insects have dominated terrestrial ecosystems for over 300 million years. Uniquely in the fossil record, foliage with well-preserved insect damage offers abundant and diverse information both about producers and about ecological and sometimes taxonomic groups of consumers. These data are ideally suited to investigate food web response to environmental perturbations, and they represent an invaluable deep-time complement to neoecological studies of global change. Correlations between feeding diversity and temperature, between herbivory and leaf traits that are modulated by climate, and between insect diversity and plant diversity can all be investigated in deep time. To illustrate, I emphasize recent work on the time interval from the latest Cretaceous through the middle Eocene (67-47 million years ago (Ma)), including two significant events that affected life: the end-Cretaceous mass extinction (65.5 Ma) and its ensuing recovery; and globally warming temperatures across the Paleocene-Eocene boundary (55.8 Ma). Climatic effects predicted from neoecology generally hold true in these deep-time settings. Rising temperature is associated with increased herbivory in multiple studies, a result with major predictive importance for current global warming. Diverse floras are usually associated with diverse insect damage; however, recovery from the end-Cretaceous extinction reveals uncorrelated plant and insect diversity as food webs rebuilt chaotically from a drastically simplified state. Calibration studies from living forests are needed to improve interpretation of the fossil data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie
Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web-based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web-based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.
50 CFR 679.28 - Equipment and operational requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... observer must be able to stand upright and have a work area at least 0.9 m deep in the area in front of the table and scale. (4) Table. The observer sampling station must include a table at least 0.6 m deep, 1.2... Station available on the NMFS Alaska Region Web site at http://www.fakr.noaa.gov. Inspections will be...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-03
...-wide, and 10-foot-deep head box and intake channel; (4) a new 6-foot-high, 14-foot-wide sluice gate...) an existing 375-foot-long, 20- foot-wide, and 4-foot-deep tailrace; (8) a new above ground 300-foot... instructions on the Commission's Web site http://www.ferc.gov/docs-filing/efiling.asp . Commenters can submit...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-06
...-kilowatt (kW) power recovery turbine; (4) a 25-foot-long, 8-foot- wide, 3-foot-deep cobble-lined tailrace... 150-foot-long, 8- foot-wide, 3-foot-deep cobble-lined tailrace discharging flows into Port Althorp... electronically via the Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-21
... approximately 350 acres and would include a three- berth, deep-water wharf. The proposed wharf would be 3,000 feet long and 105 feet wide, with access to suitably deep water provided by an approximately 1,100 foot... and at the Web site www.eisgatewaypacificwa.gov or can be requested by contacting the Corps, Seattle...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... about 416.7 feet above mean sea level; (3) an existing 31- foot-long, 12.9-foot-wide, and 10-foot-deep...-foot-long, 30-foot-wide, and 4-foot-deep tailrace; (8) a new above-ground 365-foot-long, 35-kilovolt... Commission's Web site http://www.ferc.gov/docs-filing/efiling.asp . Commenters can submit brief comments up...
ERIC Educational Resources Information Center
Zhang, Shenglan; Duke, Nell K.
2011-01-01
Much research has demonstrated that students are largely uncritical users of Web sites as sources of information. Research-tested frameworks are needed to increase elementary-age students' awareness of the need and ability to critically evaluate Web sites as sources of information. This study is a randomized field trial of such a framework called…
Compact, High-Power, Fiber-Laser-Based Coherent Sources Tunable in the Mid-Infrared and THz Spectrum
2015-02-20
... the advancement of nonlinear frequency conversion sources and optical parametric oscillators (OPOs) for the deep mid-infrared (mid-IR) spectral regions >5 µm. We have successfully developed tunable deep mid-IR systems in both...
Evaluation of the Kloswall longwall mining system
NASA Astrophysics Data System (ADS)
Guay, P. J.
1982-04-01
A new longwall mining system specifically designed to extract a very deep web (48 inches or deeper) from a longwall panel was studied. Productivity and cost analysis comparing the new mining system with a conventional longwall operation taking a 30 inch wide web is presented. It is shown that the new system will increase annual production and return on investment in most cases. Conceptual drawings and specifications for a high capacity three drum shearer and a unique shield type of roof support specifically designed for very wide web operation are reported. The advantages and problems associated with wide web mining in general and as they relate specifically to the equipment selected for the new mining system are discussed.
Stress monitoring versus microseismic ruptures in an active deep mine
NASA Astrophysics Data System (ADS)
Tonnellier, Alice; Bouffier, Christian; Bigarré, Pascal; Nyström, Anders; Österberg, Anders; Fjellström, Peter
2015-04-01
Nowadays, the underground mining industry has developed high-technology mass mining methods to optimise productivity at deep levels. Such massive extraction induces high-level stress redistribution generating seismic events around the mining works, threatening safety and economics. For this reason, mining irregular deep ore bodies calls for steadily enhanced scientific practices and technologies to guarantee a safer and more stable mine environment for the miners and the infrastructure. INERIS, within the framework of the FP7 European project I2Mine and in partnership with the Swedish mining company Boliden, has developed new methodologies in order to monitor both quasi-static stress changes and ruptures in a seismic-prone area. To this purpose, a unique local permanent microseismic and stress monitoring network has been installed in the deep-working Garpenberg mine situated to the north of Uppsala (Sweden). In this mine, ore is extracted using sublevel stoping with a paste fill production/distribution system and the long-hole drilling method. This monitoring network has been deployed between about 1100 and 1250 meters depth. It consists of six 1-component and five 3-component microseismic probes (14-Hz geophones) deployed in the Lappberget area, in addition to three 3D stress monitoring cells that focus on a very local exploited area. The objective is three-fold: to quantify accurately quasi-static stress changes and freshly-induced stress gradients with drift development in the orebody, to study quantitatively those stress changes versus induced detected and located microseismic ruptures, and possibly to identify quasi-static stress transfer from those seismic ruptures. Geophysical and geotechnical data are acquired continuously and automatically transferred to the INERIS datacenter through the web. They are made available on a secured web cloud monitoring infrastructure called e.cenaris and completed with mine data. This interface enables the visualisation of the monitoring data coming from the mine in quasi-real time and facilitates information exchanges and decision making for experts and stakeholders. On the basis of this data acquisition and sharing, preliminary analysis has been started to highlight whether stress variations and seismic source behaviour might be directly linked to mine working evolution and could improve the knowledge of the equilibrium states inside the mine. Knowing such parameters will help to better understand the response of deep mining activities to the loads induced by exploitation and to develop, if possible, methods to prevent major hazards such as rock bursts and other ground failure phenomena.
Cieplik, Fabian; Buchalla, Wolfgang; Hellwig, Elmar; Al-Ahmad, Ali; Hiller, Karl-Anton; Maisch, Tim; Karygianni, Lamprini
2017-06-01
For deep carious lesions, a more conservative treatment modality ("selective caries removal") has been proposed, where only the heavily contaminated dentine is removed. In this regard, effective adjuncts for cavity disinfection such as the antimicrobial photodynamic therapy (aPDT) can be valuable clinically prior to definitive restoration. Therefore, the aim of this study was to systematically assess clinical studies on the effectiveness of aPDT as a supplementary tool in the treatment of deep caries lesions. Searches were performed in four databases (PubMed, EMBASE, ISI Web of Science, ClinicalTrials.gov) from 1st January, 2011 until 21st June, 2016 for search terms relevant to the observed parameters, pathological condition, intervention and anatomic entity. The pooled information was evaluated according to PRISMA guidelines. At first, 1651 articles were recovered, of which 1249 full-text articles were evaluated, 270 articles thereof were reviewed for eligibility and finally 6 articles met all inclusion criteria. The aPDT protocols involved Methylene Blue, Toluidine Blue and aluminium-chloride-phthalocyanine as photosensitizers and diode lasers, light-emitting diodes and halogen light-sources. The data from five reports, utilizing both culture-dependent and -independent methods, disclosed significant reduction of cariogenic bacterial load after mechanical caries removal with adjunct aPDT. As these studies exhibit some methodological limitations, e.g. lack of positive controls, this systematic review can support the application of aPDT to a limited extent only in terms of reducing the microbial load in deep carious lesions before restorative treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
Geoseq: a tool for dissecting deep-sequencing datasets.
Gurtowski, James; Cancio, Anthony; Shah, Hardik; Levovitz, Chaya; George, Ajish; Homann, Robert; Sachidanandam, Ravi
2010-10-12
Datasets generated on deep-sequencing platforms have been deposited in various public repositories such as the Gene Expression Omnibus (GEO), Sequence Read Archive (SRA) hosted by the NCBI, or the DNA Data Bank of Japan (ddbj). Despite being rich data sources, they have not been used much due to the difficulty in locating and analyzing datasets of interest. Geoseq (http://geoseq.mssm.edu) provides a new method of analyzing short reads from deep sequencing experiments. Instead of mapping the reads to reference genomes or sequences, Geoseq maps a reference sequence against the sequencing data. It is web-based, and holds pre-computed data from public libraries. The analysis reduces the input sequence to tiles and measures the coverage of each tile in a sequence library through the use of suffix arrays. The user can upload custom target sequences or use gene/miRNA names for the search and get back results as plots and spreadsheet files. Geoseq organizes the public sequencing data using a controlled vocabulary, allowing identification of relevant libraries by organism, tissue and type of experiment. Analysis of small sets of sequences against deep-sequencing datasets, as well as identification of public datasets of interest, is simplified by Geoseq. We applied Geoseq to a) identify differential isoform expression in mRNA-seq datasets, b) identify miRNAs (microRNAs) in libraries and their mature and star sequences, and c) identify potentially mis-annotated miRNAs. The ease of using Geoseq for these analyses suggests its utility and uniqueness as an analysis tool.
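The tiling idea described above can be illustrated with a short Python sketch. This is not Geoseq's suffix-array implementation; a plain k-mer hash of the reads stands in for the index, and the sequences are made-up placeholders:

from collections import defaultdict

def tile_coverage(reference: str, reads: list[str], tile_len: int = 20) -> dict[int, int]:
    """Coverage of each reference tile = number of exact occurrences of that tile in the reads."""
    index = defaultdict(int)                      # k-mer -> occurrence count in the read library
    for read in reads:
        for i in range(len(read) - tile_len + 1):
            index[read[i:i + tile_len]] += 1
    coverage = {}
    for start in range(0, len(reference) - tile_len + 1):
        coverage[start] = index[reference[start:start + tile_len]]
    return coverage

if __name__ == "__main__":
    ref = "ACGTACGTGGCCTTAACGGTACCGATCGATCGTTAACCGG"
    reads = [ref[5:30], ref[10:35], "TTTTTTTTTTTTTTTTTTTTTTTTT"]   # toy read library
    cov = tile_coverage(ref, reads, tile_len=15)
    print({start: depth for start, depth in cov.items() if depth})  # tiles with coverage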
NASA Astrophysics Data System (ADS)
Sasaki, T.; Azuma, S.; Matsuda, S.; Nagayama, A.; Ogido, M.; Saito, H.; Hanafusa, Y.
2016-12-01
The Japan Agency for Marine-Earth Science and Technology (JAMSTEC) archives a large amount of deep-sea research videos and photos obtained by JAMSTEC's research submersibles and vehicles with cameras. The web site "JAMSTEC E-library of Deep-sea Images : J-EDI" (http://www.godac.jamstec.go.jp/jedi/e/) has made videos and photos available to the public via the Internet since 2011. Users can search for target videos and photos by keywords, easy-to-understand icons, and dive information at J-EDI because operating staff classify videos and photos by content, e.g. living organisms and geological environments, and add comments to them. Dive survey data including videos and photos are not only valuable academically but also helpful for education and outreach activities. With the aim of improving visibility for broader communities, this year we added new functions for 3-dimensional display that synchronize various dive survey data with videos. New functions: Users can search for dive survey data on 3D maps with plotted dive points using the WebGL virtual map engine "Cesium". By selecting a dive point, users can watch deep-sea videos and photos and associated environmental data, e.g. water temperature, salinity, rock and biological sample photos, obtained by the dive survey. Users can browse a dive track visualized in a 3D virtual space using the WebGL JavaScript library. By synchronizing this virtual dive track with videos, users can watch deep-sea videos recorded at any point on a dive track. Users can play an animation in which a submersible-shaped polygon automatically traces a 3D virtual dive track while displays of dive survey data are synchronized with the trace. Users can directly refer to additional information from other JAMSTEC data sites, such as the marine biodiversity database, marine biological sample database, rock sample database, and cruise and dive information database, on each page where a 3D virtual dive track is displayed. A 3D visualization of a dive track lets users experience a virtual dive survey. In addition, by synchronizing a virtual dive track with videos, it is easy to understand the living organisms and geological environments of a dive point. Therefore, these functions will visually support understanding of deep-sea environments in lectures and educational activities.
NASA Astrophysics Data System (ADS)
Preciado, Izaskun; Cartes, Joan E.; Punzón, Antonio; Frutos, Inmaculada; López-López, Lucía; Serrano, Alberto
2017-03-01
Trophic interactions in the deep-sea fish community of the Galicia Bank seamount (NE Atlantic) were inferred by using stomach contents analyses (SCA) and stable isotope analyses (SIA) of 27 fish species and their main prey items. Samples were collected during three surveys performed in 2009, 2010 and 2011 between 625 and 1800 m depth. Three main trophic guilds were determined using SCA data: pelagic, benthopelagic and benthic feeders, respectively. Vertically migrating macrozooplankton and meso-bathypelagic shrimps were identified to play a key role as pelagic prey for the deep sea fish community of the Galicia Bank. Habitat overlap was hardly detected; as a matter of fact, when species coexisted most of them evidenced a low dietary overlap, indicating a high degree of resource partitioning. A high potential competition, however, was observed among benthopelagic feeders, i.e.: Etmopterus spinax, Hoplostethus mediterraneus and Epigonus telescopus. A significant correlation was found between δ15N and δ13C for all the analysed species. When calculating Trophic Levels (TLs) for the main fish species, using both the SCA and SIA approaches, some discrepancies arose: TLs calculated from SIA were significantly higher than those obtained from SCA, probably indicating a higher consumption of benthic-suprabenthic prey in the previous months. During the summer, food web functioning in the Galicia Bank was more influenced by the assemblages dwelling in the water column than by deep-sea benthos, which was rather scarce in the summer samples. These discrepancies demonstrate the importance of using both approaches, SCA (snapshot of diet) and SIA (assimilated food in previous months), when attempting trophic studies, if an overview of food web dynamics in different compartments of the ecosystem is to be obtained.
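Trophic levels from stable isotopes, as used above, are conventionally estimated from δ15N enrichment per trophic step. The sketch below shows the standard formula in Python; the baseline, enrichment factor, and species values are illustrative assumptions, not the paper's data:

# TL_consumer = TL_baseline + (d15N_consumer - d15N_baseline) / TEF
def trophic_level_from_d15n(d15n_consumer: float,
                            d15n_baseline: float = 5.0,   # e.g. a primary-consumer baseline (assumed)
                            tl_baseline: float = 2.0,
                            tef: float = 3.4) -> float:    # per-step 15N enrichment, permil (assumed)
    return tl_baseline + (d15n_consumer - d15n_baseline) / tef

if __name__ == "__main__":
    for species, d15n in [("Etmopterus spinax", 12.1),
                          ("Hoplostethus mediterraneus", 11.3)]:   # hypothetical d15N values
        print(f"{species}: TL ~ {trophic_level_from_d15n(d15n):.2f}")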
75 FR 37783 - DOE/NSF Nuclear Science Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-30
... Science Foundation's Nuclear Physics Office. Technical Talk on Deep Underground Science and Engineering... Energy's Office of Nuclear Physics Web site for viewing. Rachel Samuel, Deputy Committee Management...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http:[sol][sol... following new features: (1) A 8-foot-long, 3-foot-wide, 3-foot-deep drop inlet structure; (2) a 2-foot... available for review at the Commission in the Public Reference Room or may be viewed on the Commission's Web...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-14
.... See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site at http://www.ferc..., 3.75-foot-wide, 2-foot-deep pond; (2) a 1-foot-high lumber diversion into 2.5-foot-high, 3.75-foot... public inspection. This filing may be viewed on the web at http://www.ferc.gov using the ``eLibrary...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-26
...) a 12-foot-diameter, 2,842-foot-long concrete tunnel; (10) a 73-foot-deep forebay; (11) three 5.4- to... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc... Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp . Enter the docket number (P-13804) in...
Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael
2015-01-01
Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596
Sweetman, Andrew K; Smith, Craig R; Dale, Trine; Jones, Daniel O B
2014-12-07
Jellyfish blooms are common in many oceans, and anthropogenic changes appear to have increased their magnitude in some regions. Although mass falls of jellyfish carcasses have been observed recently at the deep seafloor, the dense necrophage aggregations and rapid consumption rates typical for vertebrate carrion have not been documented. This has led to a paradigm of limited energy transfer to higher trophic levels at jelly falls relative to vertebrate organic falls. We show from baited camera deployments in the Norwegian deep sea that dense aggregations of deep-sea scavengers (more than 1000 animals at peak densities) can rapidly form at jellyfish baits and consume entire jellyfish carcasses in 2.5 h. We also show that scavenging rates on jellyfish are not significantly different from fish carrion of similar mass, and reveal that scavenging communities typical for the NE Atlantic bathyal zone, including the Atlantic hagfish, galatheid crabs, decapod shrimp and lyssianasid amphipods, consume both types of carcasses. These rapid jellyfish carrion consumption rates suggest that the contribution of gelatinous material to organic fluxes may be seriously underestimated in some regions, because jelly falls may disappear much more rapidly than previously thought. Our results also demonstrate that the energy contained in gelatinous carrion can be efficiently incorporated into large numbers of deep-sea scavengers and food webs, lessening the expected impacts (e.g. smothering of the seafloor) of enhanced jellyfish production on deep-sea ecosystems and pelagic-benthic coupling. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Two-step web-mining approach to study geology/geophysics-related open-source software projects
NASA Astrophysics Data System (ADS)
Behrends, Knut; Conze, Ronald
2013-04-01
Geology/geophysics is a highly interdisciplinary science, overlapping with, for instance, physics, biology and chemistry. In today's software-intensive work environments, geoscientists often encounter new open-source software from scientific fields that are only remotely related to their own field of expertise. We show how web-mining techniques can help to carry out systematic discovery and evaluation of such software. In a first step, we downloaded ~500 abstracts (each consisting of ~1 kb UTF-8 text) from agu-fm12.abstractcentral.com. This web site hosts the abstracts of all publications presented at AGU Fall Meeting 2012, the world's largest annual geology/geophysics conference. All abstracts belonged to the category "Earth and Space Science Informatics", an interdisciplinary label cross-cutting many disciplines such as "deep biosphere", "atmospheric research", and "mineral physics". Each publication was represented by a highly structured record with ~20 short data attributes, the largest of which was the unstructured "abstract" field. We processed texts of the abstracts with the statistics software "R" to calculate a corpus and a term-document matrix. Using the R package "tm", we applied text-mining techniques to filter data and develop hypotheses about software-development activities happening in various geology/geophysics fields. Analyzing the term-document matrix with basic techniques (e.g., word frequencies, co-occurrences, weighting) as well as more complex methods (clustering, classification), several key pieces of information were extracted. For example, text-mining can be used to identify scientists who are also developers of open-source scientific software, and the names of their programming projects and codes can also be identified. In a second step, based on the intermediate results found by processing the conference abstracts, any new hypotheses can be tested in another web-mining subproject: by merging the dataset with open data from github.com and stackoverflow.com. These popular, developer-centric websites have powerful application programming interfaces, and follow an open-data policy. In this regard, these sites offer a web-accessible reservoir of information that can be tapped to study questions such as: which open source software projects are eminent in the various geoscience fields? What are the most popular programming languages? How are they trending? Are there any interesting temporal patterns in committer activities? How large are programming teams and how do they change over time? What free software packages exist in the vast realms of related fields? Does the software from these fields have capabilities that might still be useful to me as a researcher, or can help me perform my work better? Are there any open-source projects that might be commercially interesting? This evaluation strategy reveals programming projects that tend to be new. As many important legacy codes are not hosted on open-source code-repositories, the presented search method might overlook some older projects.
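The first processing step described above (corpus to term-document matrix to word frequencies) was done in R with the "tm" package; a rough Python equivalent, shown here only as a hedged stand-in with placeholder abstracts, looks like this:

from sklearn.feature_extraction.text import CountVectorizer

abstracts = [
    "open source software for deep biosphere data management",
    "atmospheric research informatics and open data services",
    "mineral physics simulation codes released as open source software",
]

vectorizer = CountVectorizer(stop_words="english")
tdm = vectorizer.fit_transform(abstracts)          # documents x terms, sparse matrix

freqs = tdm.sum(axis=0).A1                         # total frequency of each term
terms = vectorizer.get_feature_names_out()
top = sorted(zip(terms, freqs), key=lambda pair: -pair[1])[:5]
print(top)                                         # most frequent terms across the corpus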
50 CFR 679.28 - Equipment and operational requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... sampling table. The observer must be able to stand upright and have a work area at least 0.9 m deep in the... least 0.6 m deep, 1.2 m wide and 0.9 m high and no more than 1.1 m high. The entire surface area of the... Station available on the NMFS Alaska Region Web site at http://www.fakr.noaa.gov. Inspections will be...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-11
... new 3,000-foot-long, 1,000-foot- wide, 50- to 75-foot-deep upper reservoir, with a surface area of 50...-foot-long, 1,000-foot- wide, 50- to 75-foot-deep lower reservoir with a surface area of 80 acres and a... instructions on the Commission's Web site http://www.ferc.gov/docs-filing/efiling.asp . Commenters can submit...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-08
... the drill pad would measure 4 by 20 feet and be approximately 5 feet deep. An estimated 1.45 acres of... the drill pad would measure 8 by 10 feet and be approximately 6 feet deep. An estimated 42.64 acres of... the proposal will be posted on the project Web site at http://www.fs.fed.us/nepa/nepa_project_exp.php...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-30
...)(1)(iii) and the instructions on the Commission's Web site http://www.ferc.gov/docs-filing/efiling... conduit equipped with a 7-foot-high, 7-foot-wide gate; (e) a 16-foot-wide, 4-foot-deep, 200-foot-long... a 198 kW turbine generating unit; (d) a 14-foot-wide, 9-foot-deep, 100-foot-long tailrace; and (e...
About Kennedy's Disease: Symptoms
... multilingual website and translation delivery network Close "The web site acted as a central organizing influence for ... certain areas. Loss of sensation. Decreased or Absent Deep Tendon Reflexes When a doctor taps the knee ...
76 FR 24923 - National Science Board; Sunshine Act Meetings; Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
...: Some portions open, some portions closed. UPDATES: Please refer to the National Science Board Web site... Information Item: Status Deep Underground Science and Engineering Laboratory Information Item: High...
Is Multitask Deep Learning Practical for Pharma?
Ramsundar, Bharath; Liu, Bowen; Wu, Zhenqin; Verras, Andreas; Tudor, Matthew; Sheridan, Robert P; Pande, Vijay
2017-08-28
Multitask deep learning has emerged as a powerful tool for computational drug discovery. However, despite a number of preliminary studies, multitask deep networks have yet to be widely deployed in the pharmaceutical and biotech industries. This lack of acceptance stems from both software difficulties and lack of understanding of the robustness of multitask deep networks. Our work aims to resolve both of these barriers to adoption. We introduce a high-quality open-source implementation of multitask deep networks as part of the DeepChem open-source platform. Our implementation enables simple python scripts to construct, fit, and evaluate sophisticated deep models. We use our implementation to analyze the performance of multitask deep networks and related deep models on four collections of pharmaceutical data (three of which have not previously been analyzed in the literature). We split these data sets into train/valid/test using time and neighbor splits to test multitask deep learning performance under challenging conditions. Our results demonstrate that multitask deep networks are surprisingly robust and can offer strong improvement over random forests. Our analysis and open-source implementation in DeepChem provide an argument that multitask deep networks are ready for widespread use in commercial drug discovery.
The Leeuwin Current and its eddies: An introductory overview
NASA Astrophysics Data System (ADS)
Waite, A. M.; Thompson, P. A.; Pesant, S.; Feng, M.; Beckley, L. E.; Domingues, C. M.; Gaughan, D.; Hanson, C. E.; Holl, C. M.; Koslow, T.; Meuleners, M.; Montoya, J. P.; Moore, T.; Muhling, B. A.; Paterson, H.; Rennie, S.; Strzelecki, J.; Twomey, L.
2007-04-01
The Leeuwin Current (LC) is an anomalous poleward-flowing eastern boundary current that carries warm, low-salinity water southward along the coast of Western Australia. We present an introduction to a new body of work on the physical and biological dynamics of the LC and its eddies, collected in this Special Issue of Deep-Sea Research II, including (1) several modelling efforts aimed at understanding LC dynamics and eddy generation, (2) papers from regional surveys of primary productivity and nitrogen uptake patterns in the LC, and (3) the first detailed field investigations of the biological oceanography of LC mesoscale eddies. Key results in papers collected here include insight into the source regions of the LC and the Leeuwin Undercurrent (LUC), the energetic interactions of the LC and LUC, and their roles in the generation of warm-core (WC) and cold-core (CC) eddies, respectively. In near-shore waters, the dynamics of upwelling were found to control the spatio-temporal variability of primary production, and important latitudinal differences were found in the fraction of production driven by nitrate (the f-ratio). The ubiquitous deep chlorophyll maximum within the LC was found to be a significant contributor to total water column production within the region. WC eddies, including a single large eddy studied in 2000, contained relatively elevated chlorophyll a concentrations thought to originate at least in part from the continental shelf/shelf break region and to have been incorporated during eddy formation. During the Eddies 2003 voyage, a more detailed study comparing the WC and CC eddies illuminated more mechanistic details of the unusual dynamics and ecology of the eddies. Food web analysis suggested that the WC eddy had an enhanced "classic" food web, with more concentrated mesozooplankton and larger diatom populations than in the CC eddy. Finally, implications for fisheries management are addressed.
Automating Mid- and Long-Range Scheduling for the NASA Deep Space Network
NASA Technical Reports Server (NTRS)
Johnston, Mark D.; Tran, Daniel
2012-01-01
NASA has recently deployed a new mid-range scheduling system for the antennas of the Deep Space Network (DSN), called Service Scheduling Software, or S(sup 3). This system was designed and deployed as a modern web application containing a central scheduling database integrated with a collaborative environment, exploiting the same technologies as social web applications but applied to a space operations context. This is highly relevant to the DSN domain since the network schedule of operations is developed in a peer-to-peer negotiation process among all users of the DSN. These users represent not only NASA's deep space missions, but also international partners and ground-based science and calibration users. The initial implementation of S(sup 3) is complete and the system has been operational since July 2011. This paper describes some key aspects of the S(sup 3) system, the challenges of modeling complex scheduling requirements, and the ongoing extension of S(sup 3) to encompass long-range planning, downtime analysis, and forecasting, as the next step in developing a single integrated DSN scheduling tool suite to cover all time ranges.
The Deep Impact Network Experiment Operations Center Monitor and Control System
NASA Technical Reports Server (NTRS)
Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan
2009-01-01
The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.
Glider observations of the Dotson Ice Shelf outflow and its connection to the Amundsen Sea Polynya
NASA Astrophysics Data System (ADS)
Miles, T. N.; Schofield, O.; Lee, S. H.; Yager, P. L.; Ha, H. K.
2016-02-01
The Amundsen Sea is one of the most productive polynyas in the Antarctic per unit area and is undergoing rapid changes including a reduction in sea ice duration, thinning ice sheets, retreat of glaciers and the potential collapse of the Thwaites Glacier in Pine Island Bay. A growing body of research has indicated that these changes are altering the water mass properties and associated biogeochemistry within the polynya. Unfortunately, difficulties in accessing the remote location have greatly limited the amount of in situ data that has been collected. In this study data from a Teledyne Webb Slocum glider was used to supplement ship-based sampling along the Dotson Ice Shelf (DIS). This autonomous underwater vehicle revealed a detailed view of a meltwater laden outflow from below the western flank of the DIS. Circumpolar Deep Water intruding onto the shelf drives glacial melt and the supply of macronutrients that, along with ample light, supports the large phytoplankton blooms in the Amundsen Sea Polynya. Less well understood is the source of micronutrients, such as iron, necessary to support this bloom to the central polynya where chlorophyll concentrations are highest. This outflow region showed decreasing optical backscatter with proximity to the bed indicating that particulate matter was sourced from the overlying glacier rather than resuspended sediment. This result suggests that particulate iron, and potentially phytoplankton primary productivity, is intrinsically linked to the magnitude and duration of sub-glacial melt from Circumpolar Deep Water intrusions onto the shelf.
Development of a web application for water resources based on open source software
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.
2014-01-01
This article presents research and development of a prototype web application for water resources using latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing of geospatial data, (2) support of water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PhP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real time multi-user collaboration platform, the programing languages code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services and, it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-users access.
Hogsden, Kristy L; Harding, Jon S
2012-03-01
We compared food web structure in 20 streams with either anthropogenic or natural sources of acidity and metals or circumneutral water chemistry in New Zealand. Community and diet analysis indicated that mining streams receiving anthropogenic inputs of acidic and metal-rich drainage had much simpler food webs (fewer species, shorter food chains, fewer links) than those in naturally acidic, naturally high metal, and circumneutral streams. Food webs of naturally high metal streams were structurally similar to those in mining streams, lacking fish predators and having few species, whereas webs in naturally acidic streams differed very little from those in circumneutral streams due to strong similarities in community composition and diets of secondary and top consumers. The combined negative effects of acidity and metals on stream food webs are clear. However, elevated metal concentrations, regardless of source, appear to play a more important role than acidity in driving food web structure. Copyright © 2011 Elsevier Ltd. All rights reserved.
Granulomatosis with Polyangiitis (GPA): Symptoms and Causes
... of the nose (saddling) caused by weakened cartilage Deep vein thrombosis By Mayo Clinic ... is a not-for-profit organization and proceeds from Web advertising help support our mission. Mayo Clinic does ...
15 CFR 930.42 - Public participation.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Specify a source for additional information, e.g., a State agency web site; and (4) Specify a contact for... sites. However, electronic notices, e.g., web sites, shall not be the sole source of a public notification, but may be used in conjunction with other means. Web sites may be used to provide a location for...
Arsenic speciation in food chains from mid-Atlantic hydrothermal vents.
Taylor, Vivien F; Jackson, Brian P; Siegfried, Matthew; Navratilova, Jana; Francesconi, Kevin A; Kirshtein, Julie; Voytek, Mary
2012-05-04
Arsenic concentration and speciation were determined in benthic fauna collected from the Mid-Atlantic Ridge hydrothermal vents. The shrimp species, Rimicaris exoculata, the vent chimney-dwelling mussel, Bathymodiolus azoricus, Branchipolynoe seepensis, a commensal worm of B. azoricus, and the gastropod Peltospira smaragdina showed variations in As concentration and in stable isotope (δ13C and δ15N) signature between species, suggesting different sources of As uptake. Arsenic speciation showed arsenobetaine to be the dominant species in R. exoculata, whereas in B. azoricus and B. seepensis arsenosugars were most abundant, although arsenobetaine, dimethylarsinate, and inorganic arsenic were also observed, along with several unidentified species. Scrape samples from outside the vent chimneys, covered with microbial mat, which is a presumed food source for many vent organisms, contained high levels of total As, but organic species were not detectable. The formation of arsenosugars in pelagic environments is typically attributed to marine algae, and the pathway to arsenobetaine is still unknown. The occurrence of arsenosugars and arsenobetaine in these deep sea organisms, where primary production is chemolithoautotrophic and stable isotope analyses indicate food sources are of vent origin, suggests that organic arsenicals can occur in a food web without algae or other photosynthetic life.
Kupka, M S; Dorn, C; Richter, O; van der Ven, H; Baur, M
2003-08-01
It is well established that medical information sources are developing continuously from printed media to digital online sources. To demonstrate the effectiveness and feasibility of web-based information sources for health professionals that are maintained in a decentralized manner, two projects are described. The information platform of the German Working Group for Information Technologies in Gynecology and Obstetrics (AIG) and the information source concerning the German Registry for in vitro fertilization (DIR) were implemented using ordinary software and standard computer equipment. Only minimal resources and training were necessary to run safe and reliable web-based information sources in a cost- and time-effective manner.
Undergraduate Students Searching and Reading Web Sources for Writing
ERIC Educational Resources Information Center
Li, Yongyan
2012-01-01
With the Internet-evoked paradigm shift in the academy, there has been a growing interest in students' Web-based information-seeking and source-use practices. Nevertheless, little is known as to how individual students go about searching for sources online and selecting source material for writing particular assignments. This exploratory study…
1991-01-01
patterns and water circulations, normal water fluctuations, salinity, threatened and endangered species, fish or other aquatic organisms in the food web ... velocity and anchoring of sediments; habitat for aquatic organisms in the food web; habitat for resident and transient wildlife species; and ... The remaining anomalous areas, RA-2 and RA-5 (3 to 5+ feet deep), coincide with the location of former possible landfilling activities. However, it
Deep-Sea Microbes: Linking Biogeochemical Rates to -Omics Approaches
NASA Astrophysics Data System (ADS)
Herndl, G. J.; Sintes, E.; Bayer, B.; Bergauer, K.; Amano, C.; Hansman, R.; Garcia, J.; Reinthaler, T.
2016-02-01
Over the past decade substantial progress has been made in determining deep ocean microbial activity and resolving some of the enigmas in understanding the deep ocean carbon flux. Also, metagenomics approaches have shed light onto the dark ocean's microbes, but linking -omics approaches to biogeochemical rate measurements is generally rare in microbial oceanography and even more so for the deep ocean. In this presentation, we will show by combining metagenomics, -proteomics and biogeochemical rate measurements on the bulk and single-cell level that deep-sea microbes exhibit characteristics of generalists with a large genome repertoire, versatile in utilizing substrates as revealed by metaproteomics. This is in striking contrast with the apparently rather uniform dissolved organic matter pool in the deep ocean. Combining the different -omics approaches with metabolic rate measurements, we will highlight some major inconsistencies and enigmas in our understanding of the carbon cycling and microbial food web structure in the dark ocean.
OpenFIRE - A Web GIS Service for Distributing the Finnish Reflection Experiment Datasets
NASA Astrophysics Data System (ADS)
Väkevä, Sakari; Aalto, Aleksi; Heinonen, Aku; Heikkinen, Pekka; Korja, Annakaisa
2017-04-01
The Finnish Reflection Experiment (FIRE) is a land-based deep seismic reflection survey conducted between 2001 and 2003 by a research consortium of the Universities of Helsinki and Oulu, the Geological Survey of Finland, and the Russian state-owned enterprise SpetsGeofysika. The dataset consists of 2100 kilometers of high-resolution profiles across the Archaean and Proterozoic nuclei of the Fennoscandian Shield. Although FIRE data have been available on request since 2009, the data have remained underused outside the original research consortium. The original FIRE data have been quality-controlled. The shot gathers have been cross-checked and a comprehensive errata document has been created. The brute stacks provided by the Russian seismic contractor have been reprocessed into seismic sections and replotted. Complete documentation of the intermediate processing steps is provided together with guidelines for setting up a computing environment and plotting the data. An open-access web service, "OpenFIRE", for the visualization and downloading of FIRE data has been created. The service includes a mobile-responsive map application capable of enriching seismic sections with data from other sources such as open data from the National Land Survey and the Geological Survey of Finland. The AVAA team of the Finnish Open Science and Research Initiative has provided a tailored Liferay portal with necessary web components such as an API (Application Programming Interface) for download requests. INSPIRE (Infrastructure for Spatial Information in Europe)-compliant discovery metadata have been produced and geospatial data will be exposed as Open Geospatial Consortium standard services. The technical guidelines of the European Plate Observing System have been followed and the service could be considered as a reference application for sharing reflection seismic data. The OpenFIRE web service is available at www.seismo.helsinki.fi/openfire.
Monitor and Control of the Deep-Space network via Secure Web
NASA Technical Reports Server (NTRS)
Lamarra, N.
1997-01-01
(Viewgraph presentation.) NASA lead center for robotic space exploration; operating division of Caltech/Jet Propulsion Laboratory. Current missions: Voyagers, Galileo, Pathfinder, Global Surveyor. Upcoming missions: Cassini, Mars and New Millennium.
75 FR 55617 - National Science Board; Sunshine Act Meetings Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-13
... to the National Science Board Web site http://www.nsf.gov/nsb for additional information and schedule... of Deep Underground Science and Engineering Laboratory (DUSEL) on South Dakota Graduate Education in...
... Things to Help You Express the Pain and Deep Emotion Some people cut because the emotions that ... of Use Notice of Nondiscrimination Visit the Nemours Web site. Note: All information on TeensHealth® is for ...
ComplexContact: a web server for inter-protein contact prediction using deep learning.
Zeng, Hong; Wang, Sheng; Zhou, Tianming; Zhao, Feifeng; Li, Xiufeng; Wu, Qing; Xu, Jinbo
2018-05-22
ComplexContact (http://raptorx2.uchicago.edu/ComplexContact/) is a web server for sequence-based interfacial residue-residue contact prediction of a putative protein complex. Interfacial residue-residue contacts are critical for understanding how proteins form complexes and interact at the residue level. When receiving a pair of protein sequences, ComplexContact first searches for their sequence homologs and builds two paired multiple sequence alignments (MSAs), then it applies co-evolution analysis and a CASP-winning deep learning (DL) method to predict interfacial contacts from paired MSAs and visualizes the prediction as an image. The DL method was originally developed for intra-protein contact prediction and performed the best in CASP12. Our large-scale experimental test further shows that ComplexContact greatly outperforms pure co-evolution methods for inter-protein contact prediction, regardless of the species.
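The co-evolution analysis mentioned above starts from statistics computed over paired MSA columns. As a hedged illustration of that raw ingredient only, and emphatically not of ComplexContact's deep-learning predictor, the toy sketch below computes the mutual information between two alignment columns; the alignment data are invented.

```python
# Illustrative sketch of a basic co-evolution signal (mutual information)
# between two columns of a paired multiple sequence alignment.
# This is not ComplexContact's CASP-winning DL method, only the kind of
# raw column statistic that co-evolution pipelines begin with.
from collections import Counter
from math import log2

def column_mi(col_a, col_b):
    """Mutual information (bits) between two alignment columns (lists of residues)."""
    n = len(col_a)
    pa = Counter(col_a)
    pb = Counter(col_b)
    pab = Counter(zip(col_a, col_b))
    mi = 0.0
    for (a, b), nab in pab.items():
        p_ab = nab / n
        mi += p_ab * log2(p_ab / ((pa[a] / n) * (pb[b] / n)))
    return mi

# Toy paired alignment: column i from protein A, column j from protein B
col_i = list("LLLIVVLL")
col_j = list("FFFMAAFF")
print(f"MI(i, j) = {column_mi(col_i, col_j):.3f} bits")
```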
Lepori, Fabio; Roberts, James J.
2017-01-01
We used monitoring data from Lake Lugano (Switzerland and Italy) to assess key ecosystem responses to three decades of nutrient management (1983–2014). We investigated whether reductions in external phosphorus loadings (Lext) caused declines in lake phosphorus concentrations (P) and phytoplankton biomass (Chl a), as assumed by the predictive models that underpinned the management plan. Additionally, we examined the hypothesis that deep lakes respond quickly to Lext reductions. During the study period, nutrient management reduced Lext by approximately a half. However, the effects of such reduction on P and Chl a were complex. Far from the scenarios predicted by classic nutrient-management approaches, the responses of P and Chl a did not only reflect changes in Lext, but also variation in internal P loadings (Lint) and food-web structure. In turn, Lint varied depending on basin morphometry and climatic effects, whereas food-web structure varied due to apparently stochastic events of colonization and near-extinction of key species. Our results highlight the complexity of the trajectory of deep-lake ecosystems undergoing nutrient management. From an applied standpoint, they also suggest that [i] the recovery of warm monomictic lakes may be slower than expected due to the development of Lint, and that [ii] classic P and Chl a models based on Lext may be useful in nutrient management programs only if their predictions are used as starting points within adaptive frameworks.
Response of a macrotidal estuary to changes in anthropogenic mercury loading between 1850 and 2000.
Sunderland, Elsie M; Dalziel, John; Heyes, Andrew; Branfireun, Brian A; Krabbenhoft, David P; Gobas, Frank A P C
2010-03-01
Methylmercury (MeHg) bioaccumulation in marine food webs poses risks to fish-consuming populations and wildlife. Here we develop and test an estuarine mercury cycling model for a coastal embayment of the Bay of Fundy, Canada. Mass budget calculations reveal that MeHg fluxes into sediments from settling solids exceed losses from sediment-to-water diffusion and resuspension. Although measured methylation rates in benthic sediments are high, rapid demethylation results in negligible net in situ production of MeHg. These results suggest that inflowing fluvial and tidal waters, rather than coastal sediments, are the dominant MeHg sources for pelagic marine food webs in this region. Model simulations show water column MeHg concentrations peaked in the 1960s and declined by almost 40% by the year 2000. Water column MeHg concentrations respond rapidly to changes in mercury inputs, reaching 95% of steady state in approximately 2 months. Thus, MeHg concentrations in pelagic organisms can be expected to respond rapidly to mercury loading reductions achieved through regulatory controls. In contrast, MeHg concentrations in sediments have steadily increased since the onset of industrialization despite recent decreases in total mercury loading. Benthic food web MeHg concentrations are likely to continue to increase over the next several decades at present-day mercury emissions levels because the deep active sediment layer in this system contains a large amount of legacy mercury and requires hundreds of years to reach steady state with inputs.
Response of a macrotidal estuary to changes in anthropogenic mercury loading between 1850 and 2000
Sunderland, E.M.; Dalziel, J.; Heyes, A.; Branfireun, B.A.; Krabbenhoft, D.P.; Gobas, F.A.P.C.
2010-01-01
Methylmercury (MeHg) bioaccumulation in marine food webs poses risks to fish-consuming populations and wildlife. Here we develop and test an estuarine mercury cycling model for a coastal embayment of the Bay of Fundy, Canada. Mass budget calculations reveal that MeHg fluxes into sediments from settling solids exceed losses from sediment-to-water diffusion and resuspension. Although measured methylation rates in benthic sediments are high, rapid demethylation results in negligible net in situ production of MeHg. These results suggest that inflowing fluvial and tidal waters, rather than coastal sediments, are the dominant MeHg sources for pelagic marine food webs in this region. Model simulations show water column MeHg concentrations peaked in the 1960s and declined by almost 40% by the year 2000. Water column MeHg concentrations respond rapidly to changes in mercury inputs, reaching 95% of steady state in approximately 2 months. Thus, MeHg concentrations in pelagic organisms can be expected to respond rapidly to mercury loading reductions achieved through regulatory controls. In contrast, MeHg concentrations in sediments have steadily increased since the onset of industrialization despite recent decreases in total mercury loading. Benthic food web MeHg concentrations are likely to continue to increase over the next several decades at present-day mercury emissions levels because the deep active sediment layer in this system contains a large amount of legacy mercury and requires hundreds of years to reach steady state with inputs. © 2010 American Chemical Society.
Two Stage Data Augmentation for Low Resourced Speech Recognition (Author’s Manuscript)
2016-09-12
Keywords: speech recognition, deep neural networks, data augmentation. 1. Introduction. When training data is limited—whether it be audio or text—the obvious ... Schwartz, and S. Tsakalidis, "Enhancing low resource keyword spotting with automatically retrieved web documents," in Interspeech, 2015, pp. 839-843. [2] ... and F. Seide, "Feature learning in deep neural networks - a study on speech recognition tasks," in International Conference on Learning Representations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-04
... 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc.gov/docs-filing... conduit equipped with a 7-foot-high, 7-foot-wide gate; (e) a 16-foot-wide, 4-foot-deep, 200-foot-long... generating unit; (d) and a 14- foot-wide, 9-foot-deep, 100-foot-long tailrace (e) six 900-foot-long, 600 volt...
NASA Astrophysics Data System (ADS)
Williams, B.; Thibodeau, B.; Chikaraishi, Y.; Ohkouchi, N.; Grottoli, A. G.
2014-12-01
Instrumental and proxy data and global climate model experiments indicate a multi-decadal shoaling of the western tropical Pacific (WTP) thermocline potentially related to a shift in ENSO frequency. In the WTP, the nutricline coincides with the thermocline, and a shoaling of the nutricline brings more nitrate-rich seawater higher in the water column and within the sunlit euphotic zone. In the nutrient-poor WTP, this incursion of nitrate-rich water at the bottom of the euphotic zone may stimulate productivity in the water column. However, there is a general paucity of measurements below the surface with which to investigate recent changes in seawater chemistry. Nitrogen isotope (δ15N) measurements of particulate organic matter (POM) can elucidate the source of nitrogen to the WTP and related trophic dynamics. This POM is the food source to the long-lived proteinaceous corals, and drives the nitrogen isotopic composition of their skeleton. Here, we report time series δ15N values from the banded skeletons of proteinaceous corals from offshore Palau in the WTP that provide proxy information about past changes in euphotic zone nitrogen dynamics. Bulk skeletal δ15N values declined between 1977 and 2010 suggesting a progressively increasing contribution of deep water with isotopically-light nitrate to the euphotic zone and/or a shortening of the planktonic food web. Since only some amino acids are enriched in δ15N with each trophic transfer in a food web, we measured the δ15N composition of seven individual amino acids in the same coral skeleton. The δ15N time series of the individual amino acids also declined over time, mirroring the bulk values. These new data indicate that the changes in the source nitrogen to the base of the euphotic zone drives a decline in coral skeletal δ15N values, consistent with the shoaling nutricline, with no coinciding alteration of the trophic structure in the WTP.
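The amino-acid δ15N measurements mentioned above are typically used to separate trophic enrichment from baseline (source) changes. As a hedged aside, the sketch below applies the widely cited glutamic acid/phenylalanine trophic-position formula; the constants β ≈ 3.4‰ and a trophic discrimination of ≈ 7.6‰ are generic literature values (e.g., from Chikaraishi and co-workers), not parameters reported in this abstract, and the example values are invented.

```python
# Hedged sketch: canonical amino-acid-based trophic position estimate,
#   TP = (d15N_Glu - d15N_Phe - beta) / TDF + 1
# beta (~3.4 per mil) and TDF (~7.6 per mil) are commonly cited literature
# values, not values taken from the study summarized above.
def trophic_position(d15n_glu, d15n_phe, beta=3.4, tdf=7.6):
    return (d15n_glu - d15n_phe - beta) / tdf + 1.0

# Example: a consumer with Glu = 18.0 per mil and Phe = 6.0 per mil
print(round(trophic_position(18.0, 6.0), 2))   # ~2.13
```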
Arterial supply of the thumb: Systemic review.
Miletin, J; Sukop, A; Baca, V; Kachlik, D
2017-10-01
We offer a complete systemic review of the anatomy of arteries of the thumb, including their sources in the first web space. Eleven studies were selected from the PubMed, Medline, Embase, Scopus and Ovid databases. Data about each artery of the thumb were obtained; in particular, the incidence and dominance of each of these arteries were calculated. The ulnopalmar digital artery of the thumb (UPDAT) was found in 99.63%, the radiopalmar digital artery of the thumb (RPDAT) in 99.26%, the ulnodorsal digital artery of the thumb (UDDAT) in 83.39%, and the radiodorsal digital artery of the thumb (RDDAT) in 70.38%. The sources for the thumb arteries are the first palmar metacarpal artery (for UPDAT in 63.15%, for RPDAT in 78.88%, for UDDAT in 56.95% and for RDDAT in 41.48%), the first dorsal metacarpal artery (for UPDAT in 20.54%, for RPDAT 2.53%, for UDDAT in 20.62%, and for RDDAT in 4.81%) and the superficial palmar arch, either complete or incomplete (for UPDAT in 25.57%, for RPDAT in 23.04%, for UDDAT in 0%, and for RDDAT in 5.19%). The dominant source could be identified in 88.2% of cases: the first palmar metacarpal artery (66.2%), the first dorsal metacarpal artery (15.5%) and the superficial palmar arch, complete or incomplete (8.2%). Four arteries usually supply the thumb. Any artery in the first web space can be a source for the thumb arteries. We propose a new classification of the arteries of the hand, dividing them into three systems (superficial palmar, deep palmar and dorsal system), and suggest that the term "princeps pollicis artery" be reconsidered and systemic anatomical terms of the thumb arteries preferred. Clin. Anat. 30:963-973, 2017. ©2017 Wiley-Liss, Inc. © 2017 Wiley Periodicals, Inc.
Dual-Carbon sources fuel the OCS deep-reef Community, a stable isotope investigation
Sulak, Kenneth J.; Berg, J.; Randall, Michael T.; Dennis, George D.; Brooks, R.A.
2008-01-01
The hypothesis that phytoplankton is the sole carbon source for the OCS deep-reef community (>60 m) was tested. Trophic structure for NE Gulf of Mexico deep reefs was analyzed via carbon and nitrogen stable isotopes. Carbon signatures for 114 entities (carbon sources, sediment, fishes, and invertebrates) supported surface phytoplankton as the primary fuel for the deep reef. However, a second carbon source, the macroalga Sargassum, with its epiphytic macroalgal associate, Cladophora liniformis, was also identified. Macroalgal carbon signatures were detected among 23 consumer entities. Most notably, macroalgae contributed 45 % of total carbon to the 13C isotopic spectrum of the particulate-feeding reef-crest gorgonian Nicella. The discontinuous spatial distribution of some sessile deep-reef invertebrates utilizing pelagic macroalgal carbon may be trophically tied to the contagious distribution of Sargassum biomass along major ocean surface features.
Replacing missing values using trustworthy data values from web data sources
NASA Astrophysics Data System (ADS)
Izham Jaya, M.; Sidi, Fatimah; Mat Yusof, Sharmila; Suriani Affendey, Lilly; Ishak, Iskandar; Jabar, Marzanah A.
2017-09-01
In practice, collected data are usually incomplete and contain missing values. Existing approaches to managing missing values overlook the importance of trustworthy data values in replacing missing values. Given that trusted, complete data are very important in data analysis, we propose a framework for missing value replacement using trustworthy data values from web data sources. The proposed framework adopts an ontology to map data values from web data sources to the incomplete dataset. As data from the web can conflict with one another, we propose a trust score measurement based on data accuracy and data reliability. The trust score is then used to select trustworthy data values from web data sources for missing value replacement. We successfully implemented the proposed framework using a financial dataset and present the findings in this paper. From our experiment, we show that replacing missing values with trustworthy data values is important, especially in cases of conflicting data.
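To make the trust-scored selection step concrete, here is a minimal sketch. The equal weighting of accuracy and reliability, the candidate records and the field names are illustrative assumptions, not the authors' exact measure.

```python
# Minimal sketch of trust-scored replacement of a missing value.
# The 50/50 weighting of accuracy and reliability and the candidate records
# below are illustrative assumptions, not the paper's exact formulation.
def trust_score(accuracy, reliability, w_acc=0.5, w_rel=0.5):
    return w_acc * accuracy + w_rel * reliability

candidates = [  # candidate values for one missing cell, from different web sources
    {"source": "source_a", "value": 101.5, "accuracy": 0.92, "reliability": 0.80},
    {"source": "source_b", "value": 99.8,  "accuracy": 0.85, "reliability": 0.95},
    {"source": "source_c", "value": 250.0, "accuracy": 0.40, "reliability": 0.60},
]

best = max(candidates, key=lambda c: trust_score(c["accuracy"], c["reliability"]))
print(f"fill missing value with {best['value']} from {best['source']}")
```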
Linking mercury exposure to habitat and feeding behaviour in Beaufort Sea beluga whales
NASA Astrophysics Data System (ADS)
Loseto, L. L.; Stern, G. A.; Deibel, D.; Connelly, T. L.; Prokopowicz, A.; Lean, D. R. S.; Fortier, L.; Ferguson, S. H.
2008-12-01
Mercury (Hg) levels in the Beaufort Sea beluga population have been increasing since the 1990s. Ultimately, it is the Hg content of prey that determines beluga Hg levels. However, the Beaufort Sea beluga diet is not understood, and little is known about the dietary Hg sources in their summer habitat. During the summer, they segregate into social groups based on habitat use, leading to the hypothesis that they feed in different food webs with different dietary Hg sources. Methyl mercury (MeHg) and total mercury (THg) levels were measured in the estuarine-shelf, Amundsen Gulf and epibenthic food webs in the western Canadian Arctic collected during the Canadian Arctic Shelf Exchange Study (CASES) to assess their dietary Hg contribution. To our knowledge, this is the first study to report MeHg levels in estuarine fish and epibenthic invertebrates from the Arctic Ocean. Although the Mackenzie River is a large source of Hg, the estuarine-shelf prey items had the lowest MeHg levels, ranging from 0.1 to 0.27 μg/g dry weight (dw) in arctic cisco (Coregonus autumnalis) and saffron cod (Eleginus gracilis), respectively. Highest MeHg levels occurred in fourhorn sculpin (Myoxocephalus quadricornis) (0.5 μg/g dw) from the epibenthic food web. Belugas hypothesized to feed in the epibenthic and Amundsen Gulf food webs had the highest Hg levels, matching the high Hg levels in the associated food webs, and estuarine-shelf belugas had the lowest Hg levels (2.6 μg/g dw), corresponding with the low food web Hg levels, supporting the variation in dietary Hg uptake. The trophic level transfer of Hg was similar among the food webs, highlighting the importance of Hg sources at the bottom of the food web as well as food web length. We propose that future biomagnification studies incorporate predator behaviour with food web structure to assist in the evaluation of dietary Hg sources.
Sources: A Compilation of Useful Information for Teachers & Teacher-Librarians. Canadian Edition.
ERIC Educational Resources Information Center
School Libraries in Canada, 2002
2002-01-01
Includes a variety of sources for quality information for Canadian school libraries. Highlights include professional associations; award-winning books; Canadian children's and young adult authors and illustrators; educational films; Web sites; Canadian information sources on the Web; Canadian poetry; and professional resources. (LRW)
WebProtégé: a collaborative Web-based platform for editing biomedical ontologies.
Horridge, Matthew; Tudorache, Tania; Nuylas, Csongor; Vendetti, Jennifer; Noy, Natalya F; Musen, Mark A
2014-08-15
WebProtégé is an open-source Web application for editing OWL 2 ontologies. It contains several features to aid collaboration, including support for the discussion of issues, change notification and revision-based change tracking. WebProtégé also features a simple user interface, which is geared towards editing the kinds of class descriptions and annotations that are prevalent throughout biomedical ontologies. Moreover, it is possible to configure the user interface using views that are optimized for editing Open Biomedical Ontology (OBO) class descriptions and metadata. Some of these views are shown in the Supplementary Material and can be seen in WebProtégé itself by configuring the project as an OBO project. WebProtégé is freely available for use on the Web at http://webprotege.stanford.edu. It is implemented in Java and JavaScript using the OWL API and the Google Web Toolkit. All major browsers are supported. For users who do not wish to host their ontologies on the Stanford servers, WebProtégé is available as a Web app that can be run locally using a Servlet container such as Tomcat. Binaries, source code and documentation are available under an open-source license at http://protegewiki.stanford.edu/wiki/WebProtege. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Utilization of carbon sources in a northern Brazilian mangrove ecosystem
NASA Astrophysics Data System (ADS)
Giarrizzo, Tommaso; Schwamborn, Ralf; Saint-Paul, Ulrich
2011-12-01
Carbon and nitrogen stable isotope ratios (δ13C and δ15N) and trophic level (TL) estimates based on stomach content analysis and published data were used to assess the contribution of autotrophic sources to 55 consumers in an intertidal mangrove creek of the Curuçá estuary, northern Brazil. Primary producers showed δ13C signatures ranging between -29.2 and -19.5‰ and δ15N from 3.0 to 6.3‰. The wide range of the isotopic composition of carbon of consumers (-28.6 to -17.1‰) indicated that different autotrophic sources are important in the intertidal mangrove food webs. Food web segregation structures the ecosystem into three relatively distinct food webs: (i) mangrove food web, where vascular plants contribute directly or indirectly via POM to the most 13C-depleted consumers (e.g. Ucides cordatus and zooplanktivorous food chains); (ii) algal food web, where benthic algae are eaten directly by consumers (e.g. Uca maracoani, mullets, polychaetes, several fishes); (iii) mixed food web where the consumers use the carbon from different primary sources (mainly benthivorous fishes). An IsoError mixing model was used to determine the contributions of primary sources to consumers, based on δ13C values. Model outputs were very sensitive to the magnitude of trophic isotope fractionation and to the variability in 13C data. Nevertheless, the simplification of the system by a priori aggregation of primary producers allowed interpretable results for several taxa, revealing the segregation into different food webs.
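The IsoError analysis above partitions consumer carbon between sources from δ13C values. As a hedged illustration of only the underlying two-source, one-isotope mixing arithmetic (without IsoError's error propagation), the sketch below estimates the fraction of carbon from one source; the trophic fractionation value and the example δ13C numbers are illustrative assumptions.

```python
# Basic two-source delta13C mixing calculation (illustrative only; the study
# used the IsoError model, which additionally propagates variance).
def source_fraction(d13c_consumer, d13c_source1, d13c_source2, trophic_shift=1.0):
    """Fraction of carbon from source 1, after correcting the consumer for one
    trophic step of fractionation (trophic_shift, per mil, an assumed value)."""
    corrected = d13c_consumer - trophic_shift
    return (corrected - d13c_source2) / (d13c_source1 - d13c_source2)

# Example: consumer -21.0 per mil, benthic algae -19.0, mangrove source -28.0
f_algae = source_fraction(-21.0, -19.0, -28.0)
print(round(f_algae, 2))   # ~0.67 of carbon attributed to benthic algae
```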
46 CFR 154.440 - Allowable stress.
Code of Federal Regulations, 2014 CFR
2014-10-01
...: (1) For tank web frames, stringers, or girders of carbon manganese steel or aluminum alloys, meet σB... in appendix A of this part. (c) Tank plating must meet the American Bureau of Shipping's deep tank...
46 CFR 154.440 - Allowable stress.
Code of Federal Regulations, 2013 CFR
2013-10-01
...: (1) For tank web frames, stringers, or girders of carbon manganese steel or aluminum alloys, meet σB... in appendix A of this part. (c) Tank plating must meet the American Bureau of Shipping's deep tank...
46 CFR 154.440 - Allowable stress.
Code of Federal Regulations, 2012 CFR
2012-10-01
...: (1) For tank web frames, stringers, or girders of carbon manganese steel or aluminum alloys, meet σB... in appendix A of this part. (c) Tank plating must meet the American Bureau of Shipping's deep tank...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardamone, Carolin N.; Van Dokkum, Pieter G.; Urry, C. Megan
2010-08-15
We present deep optical 18-medium-band photometry from the Subaru telescope over the ~30' x 30' Extended Chandra Deep Field-South, as part of the Multiwavelength Survey by Yale-Chile (MUSYC). This field has a wealth of ground- and space-based ancillary data, and contains the GOODS-South field and the Hubble Ultra Deep Field. We combine the Subaru imaging with existing UBVRIzJHK and Spitzer IRAC images to create a uniform catalog. Detecting sources in the MUSYC "BVR" image we find ~40,000 galaxies with R_AB < 25.3, the median 5σ limit of the 18 medium bands. Photometric redshifts are determined using the EAzY code and compared to ~2000 spectroscopic redshifts in this field. The medium-band filters provide very accurate redshifts for the (bright) subset of galaxies with spectroscopic redshifts, particularly at 0.1 < z < 1.2 and at z ≳ 3.5. For 0.1 < z < 1.2, we find a 1σ scatter in Δz/(1 + z) of 0.007, similar to results obtained with a similar filter set in the COSMOS field. As a demonstration of the data quality, we show that the red sequence and blue cloud can be cleanly identified in rest-frame color-magnitude diagrams at 0.1 < z < 1.2. We find that ~20% of the red sequence galaxies show evidence of dust emission at longer rest-frame wavelengths. The reduced images, photometric catalog, and photometric redshifts are provided through the public MUSYC Web site.
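The quoted photometric-redshift scatter in Δz/(1 + z) can be computed from matched photometric and spectroscopic redshifts. A minimal sketch follows; the normalized median absolute deviation used here is a common robust estimator, which may differ from the paper's exact statistic, and the redshift values are invented.

```python
# Sketch: scatter of (z_phot - z_spec) / (1 + z_spec) for matched objects.
# The NMAD estimator is a common robust choice; the survey's exact statistic
# may differ (e.g. a clipped standard deviation). Data below are placeholders.
import numpy as np

def photoz_scatter(z_phot, z_spec):
    dz = (np.asarray(z_phot) - np.asarray(z_spec)) / (1.0 + np.asarray(z_spec))
    return 1.4826 * np.median(np.abs(dz - np.median(dz)))

z_spec = np.array([0.31, 0.55, 0.82, 1.10])
z_phot = np.array([0.32, 0.54, 0.83, 1.12])
print(f"sigma_NMAD = {photoz_scatter(z_phot, z_spec):.4f}")
```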
Devitt, Brian Meldan; Baker, Joseph F; Fitzgerald, Eilis; McCarthy, Conor
2010-01-01
A case of injury to the third web space of the right hand of a rugby player, as a result of buddy strapping of the little and ring fingers with electrical insulating tape, is presented. A deep laceration of the web space and distal palmar fascia resulted, necessitating wound exploration and repair. This case highlights the danger of using electrical insulating tape as a means to buddy strap fingers. PMID:22736733
NASA Astrophysics Data System (ADS)
Trumpy, Eugenio; Manzella, Adele
2017-02-01
The Italian National Geothermal Database (BDNG) is the largest collection of Italian geothermal data and was set up in the 1980s. It has since been updated both in terms of content and management tools: information on deep wells and thermal springs (with temperatures > 30 °C) is currently organized and stored in a PostgreSQL relational database management system, which guarantees high performance, data security and easy access through different client applications. The BDNG is the core of the Geothopica web site, whose webGIS tool allows different types of user to access geothermal data, to visualize multiple types of datasets, and to perform integrated analyses. The webGIS tool has been recently improved by two specially designed, programmed and implemented visualization tools to display data on well lithology and underground temperatures. This paper describes the contents of the database and its software and data updates, as well as the webGIS tool, including the new tools for lithology and temperature data visualization. The geoinformation organized in the database and accessible through Geothopica is of use not only for geothermal purposes, but also for any kind of georesource and CO2 storage project requiring the organization of, and access to, deep underground data. Geothopica also supports project developers, researchers, and decision makers in the assessment, management and sustainable deployment of georesources.
miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.
Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M
2009-07-01
Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. Given this scenario, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
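The input described above is simply a list of unique reads with their copy numbers. As a rough sketch, the snippet below collapses raw small-RNA reads into that kind of file; the exact column layout expected by the server is an assumption here, and the reads are toy examples.

```python
# Sketch: collapse raw small-RNA reads into "unique read <tab> copy number",
# the kind of simple input file the miRanalyzer server is described as taking.
# The exact file layout expected by the server is an assumption.
from collections import Counter

raw_reads = [
    "TGAGGTAGTAGGTTGTATAGTT",   # repeated read
    "TGAGGTAGTAGGTTGTATAGTT",
    "ACTGGCCTACAAAGTCCCAGT",
]

counts = Counter(raw_reads)
with open("unique_reads.txt", "w") as fh:
    for read, n in counts.most_common():
        fh.write(f"{read}\t{n}\n")
```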
Roll-to-roll light directed electrophoretic deposition system and method
Pascall, Andrew J.; Kuntz, Joshua
2017-06-06
A roll-to-roll light directed electrophoretic deposition system and method advances a roll of a flexible electrode web substrate along a roll-to-roll process path, where a material source is positioned to provide on the flexible electrode web substrate a thin film colloidal dispersion of electrically charged colloidal material dispersed in a fluid. A counter electrode is also positioned to come in contact with the thin film colloidal dispersion opposite the flexible electrode web substrate, where one of the counter electrode and the flexible electrode web substrate is a photoconductive electrode. A voltage source is connected to produce an electric potential between the counter electrode and the flexible electrode web substrate to induce electrophoretic deposition on the flexible electrode web substrate when the photoconductive electrode is rendered conductive, and a patterned light source is arranged to illuminate the photoconductive electrode with a light pattern and render conductive illuminated areas of the photoconductive electrode so that a patterned deposit of the electrically charged colloidal material is formed on the flexible electrode web substrate.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-04
...-and-r[email protected] , Attention Docket ID No. EPA-HQ-OAR-2003-0119. Facsimile: Fax your comments to... otherwise protected through http://www.regulations.gov or e-mail. The http://www.regulations.gov Web site is... following Web site: http://www.epa.gov/airquality/combustion . Please refer to this Web site to confirm the...
Charting Our Path with a Web Literacy Map
ERIC Educational Resources Information Center
Dalton, Bridget
2015-01-01
Being a literacy teacher today means being a teacher of Web literacies. This article features the "Web Literacy Map", an open source tool from Mozilla's Webmaker project. The map focuses on Exploring (navigating the Web), Building (creating for the Web), and Connecting (participating on the Web). Readers are invited to use resources,…
Landers, Richard N; Brusso, Robert C; Cavanaugh, Katelyn J; Collmus, Andrew B
2016-12-01
The term big data encompasses a wide range of approaches of collecting and analyzing data in ways that were not possible before the era of modern personal computing. One approach to big data of great potential to psychologists is web scraping, which involves the automated collection of information from webpages. Although web scraping can create massive big datasets with tens of thousands of variables, it can also be used to create modestly sized, more manageable datasets with tens of variables but hundreds of thousands of cases, well within the skillset of most psychologists to analyze, in a matter of hours. In this article, we demystify web scraping methods as currently used to examine research questions of interest to psychologists. First, we introduce an approach called theory-driven web scraping in which the choice to use web-based big data must follow substantive theory. Second, we introduce data source theories, a term used to describe the assumptions a researcher must make about a prospective big data source in order to meaningfully scrape data from it. Critically, researchers must derive specific hypotheses to be tested based upon their data source theory, and if these hypotheses are not empirically supported, plans to use that data source should be changed or eliminated. Third, we provide a case study and sample code in Python demonstrating how web scraping can be conducted to collect big data along with links to a web tutorial designed for psychologists. Fourth, we describe a 4-step process to be followed in web scraping projects. Fifth and finally, we discuss legal, practical and ethical concerns faced when conducting web scraping projects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
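The article ships its own Python sample code; the following is an independent minimal sketch of the same kind of scraping step, not the authors' code. The URL, the CSS selectors and the page structure are hypothetical placeholders, and a real project would also respect the site's terms of service and robots.txt, as the article's ethics discussion implies.

```python
# Minimal, independent web-scraping sketch (not the article's sample code).
# The URL and the CSS selectors refer to a hypothetical page structure.
import csv
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/forum/page/1", timeout=30)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

rows = []
for post in soup.select("div.post"):          # hypothetical page structure
    rows.append({
        "author": post.select_one(".author").get_text(strip=True),
        "text": post.select_one(".body").get_text(strip=True),
    })

with open("posts.csv", "w", newline="", encoding="utf-8") as fh:
    writer = csv.DictWriter(fh, fieldnames=["author", "text"])
    writer.writeheader()
    writer.writerows(rows)
```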
Effects of web-based electrocardiography simulation on strategies and learning styles.
Granero-Molina, José; Fernández-Sola, Cayetano; López-Domene, Esperanza; Hernández-Padilla, José Manuel; Preto, Leonel São Romão; Castro-Sánchez, Adelaida María
2015-08-01
To identify the association between the use of web simulation electrocardiography and the learning approaches, strategies and styles of nursing degree students. A descriptive and correlational design with a one-group pretest-posttest measurement was used. The study sample included 246 students in a Basic and Advanced Cardiac Life Support class of a nursing degree programme. No significant differences between genders were found in any dimension of learning styles and approaches to learning. After the introduction of web simulation electrocardiography, significant differences were found in some item scores of learning styles, theorist (p < 0.040) and pragmatic (p < 0.010), and in approaches to learning. The use of a web electrocardiogram (ECG) simulation is associated with the development of active and reflexive learning styles, improving motivation and a deep approach in nursing students.
Frey, Lewis J; Sward, Katherine A; Newth, Christopher J L; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael
2015-11-01
To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data transferred consistently using the data dictionary, while 1% needed human curation. Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Isotopic diversity indices: how sensitive to food web structure?
Brind'Amour, Anik; Dubois, Stanislas F
2013-01-01
Recently revisited, the concept of niche ecology has led to the formalisation of functional and trophic niches using stable isotope ratios. Isotopic diversity indices (IDI) derived from a set of measures assessing the dispersion/distribution of points in the δ-space were recently suggested and increasingly used in the literature. However, three main criticisms emerge from the use of these IDI: 1) they fail to account for overlap among isotopic sources, 2) some indices are highly sensitive to the number of species and/or the presence of rare species, and 3) the lack of standardization prevents any spatial and temporal comparisons. Using simulations, we investigated the ability of six commonly used IDI to discriminate among different trophic food web structures, with a focus on the first two criticisms. We tested the sensitivity of the IDI to five food web structures along a gradient of source overlap, varying from two distinct food chains with differentiated sources to two superimposed food chains sharing two sources. For each food web structure, we varied the number of species (from 10 to 100 species) and the type of species feeding behaviour (i.e. random or selective feeding). Values of IDI were generally larger in food webs with distinct basal sources and tended to decrease as the superimposition of the food chains increased. This was more pronounced when species displayed food preferences in comparison to food webs where species fed randomly on any prey. The number of species composing the food web also had strong effects on the metrics, including those that were supposedly less sensitive to small sample size. In all cases, computing IDI on food webs with low numbers of species always increases the uncertainty of the metrics. A threshold of ~20 species was detected above which several metrics can be safely used.
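The indices discussed above are dispersion measures in δ13C-δ15N space. As a hedged illustration of what such metrics look like in practice (not the six specific indices evaluated in the simulations), the sketch below computes two widely used examples, total convex hull area and mean distance to centroid, on invented species coordinates.

```python
# Sketch of two widely used isotopic diversity metrics in the d13C-d15N plane:
# total convex hull area (TA) and mean distance to centroid (CD).
# Illustrative only; the species coordinates below are invented.
import numpy as np
from scipy.spatial import ConvexHull

# one row per species: (d13C, d15N)
species = np.array([
    [-24.0, 6.1], [-22.5, 8.0], [-20.0, 9.5],
    [-26.0, 7.2], [-21.0, 12.0], [-23.5, 10.3],
])

hull_area = ConvexHull(species).volume      # in 2-D, .volume is the hull area
centroid = species.mean(axis=0)
mean_dist_to_centroid = np.linalg.norm(species - centroid, axis=1).mean()

print(f"TA = {hull_area:.2f}, CD = {mean_dist_to_centroid:.2f}")
```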
Anomalies of rupture velocity in deep earthquakes
NASA Astrophysics Data System (ADS)
Suzuki, M.; Yagi, Y.
2010-12-01
Explaining deep seismicity is a long-standing challenge in earth science. Below 300 km, the occurrence rate of earthquakes remains at a low level until ~530 km depth, then rises until ~600 km, and finally terminates near 700 km. Given the difficulty of estimating fracture properties and observing the stress field in the mantle transition zone (410-660 km), the seismic source processes of deep earthquakes are the most important information for understanding the distribution of deep seismicity. However, in a compilation of seismic source models of deep earthquakes, the source parameters for individual deep earthquakes are quite varied [Frohlich, 2006]. Rupture velocities for deep earthquakes estimated using seismic waveforms range from 0.3 to 0.9Vs, where Vs is the shear wave velocity, a considerably wider range than the velocities for shallow earthquakes. The uncertainty of seismic source models prevents us from determining the main characteristics of the rupture process and understanding the physical mechanisms of deep earthquakes. Recently, the back projection method has been used to derive a detailed and stable seismic source image from dense seismic network observations [e.g., Ishii et al., 2005; Walker et al., 2005]. Using this method, we can obtain an image of the seismic source process from the observed data without a priori constraints or discarding parameters. We applied the back projection method to teleseismic P-waveforms of 24 large, deep earthquakes (moment magnitude Mw ≥ 7.0, depth ≥ 300 km) recorded since 1994 by the Data Management Center of the Incorporated Research Institutions for Seismology (IRIS-DMC) and reported in the U.S. Geological Survey (USGS) catalog, and constructed seismic source models of deep earthquakes. By imaging the seismic rupture process for a set of recent deep earthquakes, we found that the rupture velocities are less than about 0.6Vs except in the depth range of 530 to 600 km. This is consistent with the depth variation of deep seismicity: it peaks between about 530 and 600 km, where fast-rupture earthquakes (greater than 0.7Vs) are observed. Similarly, aftershock productivity is particularly low from 300 to 550 km depth and increases markedly at depths greater than 550 km [e.g., Persh and Houston, 2004]. We propose that large fracture surface energy (Gc) values for deep earthquakes generally prevent the acceleration of dynamic rupture propagation and the generation of earthquakes between 300 and 700 km depth, whereas small Gc values in the exceptional depth range promote dynamic rupture propagation and explain the seismicity peak near 600 km.
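Back projection, as used above, stacks teleseismic P waveforms after shifting each station's trace by the travel time predicted from a candidate source grid point, and maps beam power over the grid. The sketch below shows only that stacking step in a heavily simplified form; the waveforms, travel times and window length are synthetic placeholders, not the study's data or processing parameters.

```python
# Simplified back-projection stacking step: for one candidate grid point,
# shift each station's P waveform by its predicted travel time and sum.
# Waveforms, travel times and the window length are synthetic placeholders.
import numpy as np

def beam_power(waveforms, travel_times, dt, window):
    """waveforms: (n_sta, n_samp) array; travel_times: predicted arrival (s) per station."""
    n_sta, _ = waveforms.shape
    shifts = np.round(travel_times / dt).astype(int)
    stack = np.zeros(window)
    for k in range(n_sta):
        seg = waveforms[k, shifts[k]:shifts[k] + window]
        stack[:len(seg)] += seg
    return np.sum(stack ** 2)   # beam power for this grid point

rng = np.random.default_rng(0)
dt = 0.05                                       # sample interval (s)
waveforms = rng.standard_normal((10, 4000))     # 10 stations of noise as a stand-in
travel_times = rng.uniform(60.0, 80.0, size=10) # seconds, placeholder predictions
print(beam_power(waveforms, travel_times, dt, window=200))
```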
49 CFR 575.106 - Tire fuel efficiency consumer information program.
Code of Federal Regulations, 2010 CFR
2010-10-01
... tires, deep tread, winter-type snow tires, space-saver or temporary use spare tires, tires with nominal... Web site. (ii) Requirements for tire retailers. Subject to paragraph (e)(1)(iii) of this section, each...
49 CFR 575.106 - Tire fuel efficiency consumer information program.
Code of Federal Regulations, 2011 CFR
2011-10-01
... tires, deep tread, winter-type snow tires, space-saver or temporary use spare tires, tires with nominal... Web site. (ii) Requirements for tire retailers. Subject to paragraph (e)(1)(iii) of this section, each...
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Schrock, Mitchell; Baldwin, John R.; Borden, Charles S.
2010-01-01
The Ground Resource Allocation and Planning Environment (GRAPE 1.0) is a Web-based, collaborative team environment based on the Microsoft SharePoint platform, which provides Deep Space Network (DSN) resource planners tools and services for sharing information and performing analysis.
Deep Learning Improves Antimicrobial Peptide Recognition.
Veltri, Daniel; Kamath, Uday; Shehu, Amarda
2018-03-24
Bacterial resistance to antibiotics is a growing concern. Antimicrobial peptides (AMPs), natural components of innate immunity, are popular targets for developing new drugs. Machine learning methods are now commonly adopted by wet-laboratory researchers to screen for promising candidates. In this work we utilize deep learning to recognize antimicrobial activity. We propose a neural network model with convolutional and recurrent layers that leverages primary sequence composition. Results show that the proposed model outperforms state-of-the-art classification models on a comprehensive data set. By utilizing the embedding weights, we also present a reduced-alphabet representation and show that reasonable AMP recognition can be maintained using nine amino-acid types. Models and data sets are made freely available through the Antimicrobial Peptide Scanner vr.2 web server at www.ampscanner.com. Contact amarda@gmu.edu for general inquiries and dan.veltri@gmail.com for web server information. Supplementary data are available at Bioinformatics online.
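As a rough illustration of the model family described (convolutional plus recurrent layers over the primary sequence), the sketch below builds a tiny classifier in Keras. The layer sizes, vocabulary, and training data are arbitrary placeholders; this is not the published AMP Scanner vr.2 architecture or its weights.

```python
# Minimal convolutional + recurrent sequence classifier; all hyperparameters and
# data below are placeholders, not the published AMP Scanner model.
import numpy as np
from tensorflow.keras import layers, models

MAX_LEN = 200   # assumed padded peptide length
VOCAB = 21      # 20 amino acids + padding token

model = models.Sequential([
    layers.Embedding(input_dim=VOCAB, output_dim=16),      # learned residue embedding
    layers.Conv1D(64, kernel_size=7, activation="relu"),   # local sequence motifs
    layers.MaxPooling1D(pool_size=5),
    layers.LSTM(64),                                        # longer-range composition
    layers.Dense(1, activation="sigmoid"),                  # AMP vs. non-AMP
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Toy stand-in data: integer-encoded peptides and binary labels.
X = np.random.randint(1, VOCAB, size=(128, MAX_LEN))
y = np.random.randint(0, 2, size=(128,))
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
```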
Bell, James B; Woulds, Clare; Oevelen, Dick van
2017-09-20
Hydrothermal vents are highly dynamic ecosystems and are unusually energy-rich environments in the deep sea. In situ hydrothermal-based productivity combined with sinking photosynthetic organic matter in a soft-sediment setting creates geochemically diverse environments, which remain poorly studied. Here, we use a comprehensive set of new and existing field observations to develop a quantitative ecosystem model of a deep-sea chemosynthetic ecosystem from the most southerly hydrothermal vent system known. We find evidence of chemosynthetic production supplementing the metazoan food web both at vent sites and elsewhere in the Bransfield Strait. Endosymbiont-bearing fauna were very important in supporting the transfer of chemosynthetic carbon into the food web, particularly to higher trophic levels. Chemosynthetic production occurred at all sites to varying degrees but was generally only a small component of the total organic matter inputs to the food web, even in the most hydrothermally active areas, owing in part to a low and patchy density of vent-endemic fauna. Differences in the relative abundance of faunal functional groups, resulting from environmental variability, were clear drivers of differences in biogeochemical cycling and resulted in substantially different carbon processing patterns between habitats.
Biodiversity maintenance in food webs with regulatory environmental feedbacks.
Bagdassarian, Carey K; Dunham, Amy E; Brown, Christopher G; Rauscher, Daniel
2007-04-21
Although the food web is one of the most fundamental and oldest concepts in ecology, elucidating the strategies and structures by which natural communities of species persist remains a challenge to empirical and theoretical ecologists. We show that simple regulatory feedbacks between autotrophs and their environment when embedded within complex and realistic food-web models enhance biodiversity. The food webs are generated through the niche-model algorithm and coupled with predator-prey dynamics, with and without environmental feedbacks at the autotroph level. With high probability and especially at lower, more realistic connectance levels, regulatory environmental feedbacks result in fewer species extinctions, that is, in increased species persistence. These same feedback couplings, however, also sensitize food webs to environmental stresses leading to abrupt collapses in biodiversity with increased forcing. Feedback interactions between species and their material environments anchor food-web persistence, adding another dimension to biodiversity conservation. We suggest that the regulatory features of two natural systems, deep-sea tubeworms with their microbial consortia and a soil ecosystem manifesting adaptive homeostatic changes, can be embedded within niche-model food-web dynamics.
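For readers unfamiliar with the niche-model algorithm the abstract builds on (Williams and Martinez 2000), the sketch below generates a niche-model food web for a chosen species richness and connectance. The predator-prey dynamics and the environmental feedback terms studied in the paper are not included, and the parameter values are illustrative only.

```python
# A minimal niche-model food-web generator; richness and connectance are
# illustrative, and the dynamics and feedbacks are omitted.
import numpy as np

def niche_model(S, C, rng=None):
    """Return an S x S adjacency matrix A with A[i, j] = 1 if species i eats species j."""
    rng = rng or np.random.default_rng()
    n = np.sort(rng.uniform(0.0, 1.0, S))            # niche values
    beta = (1.0 - 2.0 * C) / (2.0 * C)
    r = n * rng.beta(1.0, beta, S)                   # feeding-range widths (expected connectance ~ C)
    c = rng.uniform(r / 2.0, n)                      # feeding-range centres
    low, high = c - r / 2.0, c + r / 2.0
    return ((n >= low[:, None]) & (n <= high[:, None])).astype(int)

A = niche_model(S=30, C=0.15, rng=np.random.default_rng(1))
print("realized connectance:", A.sum() / 30 ** 2)
```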
Saint: a lightweight integration environment for model annotation.
Lister, Allyson L; Pocock, Matthew; Taschuk, Morgan; Wipat, Anil
2009-11-15
Saint is a web application which provides a lightweight annotation integration environment for quantitative biological models. The system enables modellers to rapidly mark up models with biological information derived from a range of data sources. Saint is freely available for use on the web at http://www.cisban.ac.uk/saint. The web application is implemented in Google Web Toolkit and Tomcat, with all major browsers supported. The Java source code is freely available for download at http://saint-annotate.sourceforge.net. The Saint web server requires an installation of libSBML and has been tested on Linux (32-bit Ubuntu 8.10 and 9.04).
Development of an open-source web-based intervention for Brazilian smokers - Viva sem Tabaco.
Gomide, H P; Bernardino, H S; Richter, K; Martins, L F; Ronzani, T M
2016-08-02
Web-based interventions for smoking cessation available in Portuguese do not adhere to evidence-based treatment guidelines. In addition, all existing web-based interventions are built on proprietary platforms that developing countries often cannot afford. We aimed to describe the development of "Viva sem Tabaco", an open-source web-based intervention. The development of the intervention included the selection of content from evidence-based guidelines for smoking cessation, the design of the first layout, the conduct of two focus groups to identify potential features, refinement of the layout based on the focus groups, and correction of content based on feedback provided by specialists in smoking cessation. At the end, we released the source code and the intervention on the Internet and translated them into Spanish and English. The intervention fills gaps in the information available in Portuguese and addresses the lack of open-source interventions for smoking cessation. The open-source licensing format and its translation system may help researchers from different countries deploy evidence-based interventions for smoking cessation.
[Research progress on food sources and food web structure of wetlands based on stable isotopes].
Chen, Zhan Yan; Wu, Hai Tao; Wang, Yun Biao; Lyu, Xian Guo
2017-07-18
The trophic dynamics of wetland organisms is the basis for assessing wetland structure and function. Stable isotopes of carbon and nitrogen have been widely applied to identify trophic relationships in food source, food composition and food web transport in wetland ecosystem studies. This paper provides an overall review of the current methodology of isotope mixing models and trophic levels in wetland ecosystems, and discusses the standards for trophic fractionation and baselines. Moreover, we characterize the typical food sources and isotopic compositions of wetland ecosystems, and summarize the food sources at the different trophic levels of herbivores, omnivores and carnivores based on stable isotopic analyses. We also discuss the limitations of stable isotopes in tracing food sources and in constructing food webs. Based on current results, development trends and upcoming requirements, future studies should focus on sample treatment, conservation and trophic enrichment measurement in the wetland food web, as well as on combining a variety of methodologies including traditional stomach content analysis, molecular markers, and multiple isotopes.
Project CONVERGE: Impacts of local oceanographic processes on Adélie penguin foraging ecology
NASA Astrophysics Data System (ADS)
Kohut, J. T.; Bernard, K. S.; Fraser, W.; Oliver, M. J.; Statscewich, H.; Patterson-Fraser, D.; Winsor, P.; Cimino, M. A.; Miles, T. N.
2016-02-01
During the austral summer of 2014-2015, project CONVERGE deployed a multi-platform network to sample the Adélie penguin foraging hotspot associated with Palmer Deep Canyon along the Western Antarctic Peninsula. The focus of CONVERGE was to assess the impact of prey-concentrating ocean circulation dynamics on Adélie penguin foraging behavior. Food web links between phytoplankton and zooplankton abundance and penguin behavior were examined to better understand the within-season variability in Adélie foraging ecology. Since the High Frequency Radar (HFR) network installation in November 2014, the radial component current data from each of the three sites have been combined to provide high resolution (0.5 km) surface velocity maps. These hourly maps have revealed an incredibly dynamic system with strong fronts and frequent eddies extending across the Palmer Deep foraging area. A coordinated fleet of underwater gliders was used in concert with the HFR fields to sample the hydrography and phytoplankton distributions associated with convergent and divergent features. Three gliders mapped the along- and across-canyon variability of the hydrography, chlorophyll fluorescence and acoustic backscatter in the context of the observed surface currents and simultaneous penguin tracks. This presentation will highlight these synchronized measures of the food web in the context of the observed HFR fronts and eddies. The location and persistence of these features, coupled with ecological sampling through the food web, offer an unprecedented view of the Palmer Deep ecosystem. Specific examples will highlight how the vertical structure of the water column beneath the surface features stacks the primary and secondary producers relative to observed penguin foraging behavior. The coupling from the physics through the food web as observed by our multi-platform network gives strong evidence for the critical role that distribution patterns of lower trophic levels have on Adélie foraging.
Digital London: Creating a Searchable Web of Interlinked Sources on Eighteenth Century London
ERIC Educational Resources Information Center
Shoemaker, Robert
2005-01-01
Purpose: To outline the conceptual and technical difficulties encountered, as well as the opportunities created, when developing an interlinked collection of web-based digitised primary sources on eighteenth century London. Design/methodology/approach: As a pilot study for a larger project, a variety of primary sources, including the "Old…
Deep learning application: rubbish classification with aid of an android device
NASA Astrophysics Data System (ADS)
Liu, Sijiang; Jiang, Bo; Zhan, Jie
2017-06-01
Deep learning is currently a very hot topic in pattern recognition and artificial intelligence research. Aiming at the practical problem that people often do not know the correct classification for an item of rubbish, and building on the strong image classification ability of deep learning methods, we have designed a prototype system to help users classify rubbish. First, the CaffeNet model was adopted for training our classification network on the ImageNet dataset, and the trained network was deployed on a web server. Second, an Android app was developed with which users capture images of unclassified rubbish, upload them to the web server for backend analysis, and retrieve the feedback, so that users can conveniently obtain classification guidance on an Android device. Tests on our prototype system show that an image of a single type of rubbish in its original shape can be used reliably to judge its classification, whereas an image containing several kinds of rubbish, or rubbish whose shape has changed, may fail to help users decide its classification. Nevertheless, the system shows promise as an aid to rubbish classification if the network training strategy can be optimized further.
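The server side of such a system can be sketched as an image classifier behind an HTTP endpoint that the mobile app calls. In the sketch below, a pretrained torchvision ResNet-18 stands in for the CaffeNet model the authors trained; the route name, port, and response format are assumptions, not the authors' API.

```python
# A minimal sketch of a classification web service: a pretrained ImageNet model
# (ResNet-18 here, as a stand-in for CaffeNet) behind a hypothetical /classify route.
import io
import torch
from torchvision import models, transforms
from PIL import Image
from flask import Flask, request, jsonify

app = Flask(__name__)
model = models.resnet18(pretrained=True).eval()
preprocess = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224), transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

@app.route("/classify", methods=["POST"])
def classify():
    # The app would POST the captured photo as multipart form data under "image".
    img = Image.open(io.BytesIO(request.files["image"].read())).convert("RGB")
    with torch.no_grad():
        logits = model(preprocess(img).unsqueeze(0))
    return jsonify({"imagenet_class_index": int(logits.argmax())})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```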
Meeting Reference Responsibilities through Library Web Sites.
ERIC Educational Resources Information Center
Adams, Michael
2001-01-01
Discusses library Web sites and explains some of the benefits when libraries make their sites into reference portals, linking them to other useful Web sites. Topics include print versus Web information sources; limitations of search engines; what Web sites to include, including criteria for inclusions; and organizing the sites. (LRW)
Discovering Authorities and Hubs in Different Topological Web Graph Structures.
ERIC Educational Resources Information Center
Meghabghab, George
2002-01-01
Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)
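For context, the HITS procedure mentioned in the abstract can be written in a few lines: hub and authority scores are refined by repeated multiplication with the page link matrix, followed by normalization. The toy graph below is illustrative only.

```python
# A minimal sketch of the HITS (hubs and authorities) iteration on a toy link matrix.
import numpy as np

def hits(adj, n_iter=50):
    """adj[i, j] = 1 if page i links to page j; returns (hub, authority) score vectors."""
    n = adj.shape[0]
    hub = np.ones(n)
    auth = np.ones(n)
    for _ in range(n_iter):
        auth = adj.T @ hub            # good authorities are linked from good hubs
        hub = adj @ auth              # good hubs link to good authorities
        auth /= np.linalg.norm(auth)
        hub /= np.linalg.norm(hub)
    return hub, auth

# Toy 4-page web graph.
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 0],
                [0, 0, 1, 0]], dtype=float)
hub, auth = hits(adj)
print("hubs:", hub.round(3), "authorities:", auth.round(3))
```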
Automatic generation of Web mining environments
NASA Astrophysics Data System (ADS)
Cibelli, Maurizio; Costagliola, Gennaro
1999-02-01
The main problem related to the retrieval of information from the world wide web is the enormous number of unstructured documents and resources, i.e., the difficulty of locating and tracking appropriate sources. This paper presents a web mining environment (WME), which is capable of finding, extracting and structuring information related to a particular domain from web documents, using general purpose indices. The WME architecture includes a web engine filter (WEF), to sort and reduce the answer set returned by a web engine, a data source pre-processor (DSP), which processes html layout cues in order to collect and qualify page segments, and a heuristic-based information extraction system (HIES), to finally retrieve the required data. Furthermore, we present a web mining environment generator, WMEG, that allows naive users to generate a WME specific to a given domain by providing a set of specifications.
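A purely illustrative sketch of the three-stage pipeline described (filtering the engine's answer set, segmenting pages on layout cues, and heuristic field extraction) is given below; the function names, scoring rule, and regular expressions are assumptions rather than the WME implementation.

```python
# Toy three-stage pipeline in the spirit of WEF -> DSP -> HIES; all rules are placeholders.
import re

def web_engine_filter(results, domain_terms):
    """WEF stage: keep and rank results whose snippets mention the target domain."""
    scored = [(sum(t in r["snippet"].lower() for t in domain_terms), r) for r in results]
    return [r for score, r in sorted(scored, key=lambda x: x[0], reverse=True) if score > 0]

def data_source_preprocessor(html):
    """DSP stage: use layout cues (here, table rows) to collect candidate page segments."""
    return re.findall(r"<tr>(.*?)</tr>", html, flags=re.S)

def heuristic_extractor(segments):
    """HIES stage: pull the required fields (here, a title/price pair) from each segment."""
    out = []
    for seg in segments:
        m = re.search(r"<td>(?P<title>[^<]+)</td>\s*<td>\$(?P<price>[\d.]+)</td>", seg)
        if m:
            out.append({"title": m.group("title"), "price": float(m.group("price"))})
    return out

results = [{"url": "http://example.org/cars", "snippet": "Used car listings and prices"}]
page = "<table><tr><td>Sedan</td><td>$4500.00</td></tr></table>"
if web_engine_filter(results, ["car", "price"]):
    print(heuristic_extractor(data_source_preprocessor(page)))
```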
Parasites in food webs: the ultimate missing links
Lafferty, Kevin D; Allesina, Stefano; Arim, Matias; Briggs, Cherie J; De Leo, Giulio; Dobson, Andrew P; Dunne, Jennifer A; Johnson, Pieter T J; Kuris, Armand M; Marcogliese, David J; Martinez, Neo D; Memmott, Jane; Marquet, Pablo A; McLaughlin, John P; Mordecai, Erin A; Pascual, Mercedes; Poulin, Robert; Thieltges, David W
2008-01-01
Parasitism is the most common consumer strategy among organisms, yet only recently has there been a call for the inclusion of infectious disease agents in food webs. The value of this effort hinges on whether parasites affect food-web properties. Increasing evidence suggests that parasites have the potential to uniquely alter food-web topology in terms of chain length, connectance and robustness. In addition, parasites might affect food-web stability, interaction strength and energy flow. Food-web structure also affects infectious disease dynamics because parasites depend on the ecological networks in which they live. Empirically, incorporating parasites into food webs is straightforward. We may start with existing food webs and add parasites as nodes, or we may try to build food webs around systems for which we already have a good understanding of infectious processes. In the future, perhaps researchers will add parasites while they construct food webs. Less clear is how food-web theory can accommodate parasites. This is a deep and central problem in theoretical biology and applied mathematics. For instance, is representing parasites with complex life cycles as a single node equivalent to representing other species with ontogenetic niche shifts as a single node? Can parasitism fit into fundamental frameworks such as the niche model? Can we integrate infectious disease models into the emerging field of dynamic food-web modelling? Future progress will benefit from interdisciplinary collaborations between ecologists and infectious disease biologists. PMID:18462196
Introduction: From pathogenesis to therapy, deep endometriosis remains a source of controversy.
Donnez, Jacques
2017-12-01
Deep endometriosis remains a source of controversy. A number of theories may explain its pathogenesis and many arguments support the hypothesis that genetic or epigenetic changes are a prerequisite for development of lesions into deep endometriosis. Deep endometriosis is frequently responsible for pelvic pain, dysmenorrhea, and/or deep dyspareunia, but can also cause obstetrical complications. Diagnosis may be improved by high-quality imaging. Therapeutic approaches are a source of contention as well. In this issue's Views and Reviews, medical and surgical strategies are discussed, and it is emphasized that treatment should be designed according to a patient's symptoms and individual needs. It is also vital that referral centers have the knowledge and experience to treat deep endometriosis medically and/or surgically. The debate must continue because emerging trends in therapy need to be followed and investigated for optimal management. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
75 FR 61779 - National Science Board: Sunshine Act Meetings; Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-06
...:30 p.m. to 3 p.m. SUBJECT MATTER: Review of NSB Action Item (NSB/CPP-10-63) (Deep Underground Science... National Science Board Web site http://www.nsf.gov/nsb for additional information and schedule updates...
Vulnerability of deep groundwater in the Bengal Aquifer System to contamination by arsenic
Burgess, W.G.; Hoque, M.A.; Michael, H.A.; Voss, C.I.; Breit, G.N.; Ahmed, K.M.
2010-01-01
Shallow groundwater, the primary water source in the Bengal Basin, contains up to 100 times the World Health Organization (WHO) drinking-water guideline of 10 µg l⁻¹ arsenic (As), threatening the health of 70 million people. Groundwater from a depth greater than 150 m, which almost uniformly meets the WHO guideline, has become the preferred alternative source. The vulnerability of deep wells to contamination by As is governed by the geometry of induced groundwater flow paths and the geochemical conditions encountered between the shallow and deep regions of the aquifer. Stratification of flow separates deep groundwater from shallow sources of As in some areas. Oxidized sediments also protect deep groundwater through the ability of ferric oxyhydroxides to adsorb As. Basin-scale groundwater flow modelling suggests that, over large regions, deep hand-pumped wells for domestic supply may be secure against As invasion for hundreds of years. By contrast, widespread deep irrigation pumping might effectively eliminate deep groundwater as an As-free resource within decades. Finer-scale models, incorporating spatial heterogeneity, are needed to investigate the security of deep municipal abstraction at specific urban locations. © 2010 Macmillan Publishers Limited. All rights reserved.
Public-Sector Information Security: A Call to Action for Public-Sector CIOs
2002-10-01
scenarios. However, in a larger sense, it is a story for all public-sector CIOs, a story both prophetic and sobering. Deep in this story, however, there...information technology (IT), our way of life, and the values that lay deep in the core of our American culture. These values include rights to...defines roles and accountabilities. The Scope of the Problem Today there are 109.5 million Internet hosts on the World Wide Web . Five years ago there
Personal travel assistants and the world wide web
DOT National Transportation Integrated Search
1997-01-01
To be successful, handheld computers known as Personal Travel Assistants (PTAs) must be connected to external information sources. The viability of using the Internet and the world wide web (www) as such sources is explored. Considerations include wh...
Radiation tolerance of boron doped dendritic web silicon solar cells
NASA Technical Reports Server (NTRS)
Rohatgi, A.
1980-01-01
The potential of dendritic web silicon for producing radiation-hard solar cells is compared with that of float zone silicon material. Solar cells with an n(+)-p-P(+) structure and approximately 15% (AM1) efficiency were subjected to 1 MeV electron irradiation. Radiation tolerance of web cell efficiency was found to be at least as good as that of the float zone silicon cell. A study of the annealing behavior of radiation-induced defects via deep level transient spectroscopy revealed that the Ev + 0.31 eV defect, attributed to a boron-oxygen-vacancy complex, is responsible for the reverse annealing of the irradiated cells in the temperature range of 150 to 350 C.
WebScope: A New Tool for Fusion Data Analysis and Visualization
NASA Astrophysics Data System (ADS)
Yang, Fei; Dang, Ningning; Xiao, Bingjia
2010-04-01
A visualization tool was developed as a web browser application based on Java applets embedded in HTML pages, in order to provide worldwide access to the EAST experimental data. It can display data from various trees in different servers in a single panel. With WebScope, it is easier to make comparisons between different data sources and to perform simple calculations over them.
Open source software integrated into data services of Japanese planetary explorations
NASA Astrophysics Data System (ADS)
Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.
2015-12-01
Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listings for long-term preservation, while also offering rich web applications for ease of access built with modern web technologies based on open source software. This presentation showcases the use of open source software throughout our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); as the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data; the main purpose of this application is public outreach, and it is developed with the NASA World Wind Java SDK. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations; it uses Highcharts to draw graphs in web browsers. FLOW is a tool to simulate the field of view of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS and is released under the BSD 3-Clause License. The SPICE Toolkit is essential to compile FLOW; SPICE is also open source software developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for building and integrating DARTS services.
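As an aside on the WMS plumbing mentioned above, the sketch below requests a map tile from a WMS 1.1.1 server (such as MapServer) using standard GetMap parameters. The endpoint URL and layer name are placeholders, not the actual DARTS/KADIAS service.

```python
# A minimal sketch of a WMS GetMap request; the endpoint and layer are hypothetical.
import requests

WMS_ENDPOINT = "https://example.org/cgi-bin/mapserv"   # placeholder URL, not the DARTS server
params = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "kaguya_dem",                             # hypothetical layer name
    "SRS": "EPSG:4326",
    "BBOX": "-10,-10,10,10",                            # lon/lat bounding box
    "WIDTH": "512", "HEIGHT": "512",
    "FORMAT": "image/png",
}
resp = requests.get(WMS_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()
with open("moon_tile.png", "wb") as fh:
    fh.write(resp.content)       # save the returned map image
```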
How Students Evaluate Information and Sources when Searching the World Wide Web for Information
ERIC Educational Resources Information Center
Walraven, Amber; Brand-Gruwel, Saskia; Boshuizen, Henny P. A.
2009-01-01
The World Wide Web (WWW) has become the biggest information source for students while solving information problems for school projects. Since anyone can post anything on the WWW, information is often unreliable or incomplete, and it is important to evaluate sources and information before using them. Earlier research has shown that students have…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-28
...-AP76 Oil and Natural Gas Sector: New Source Performance Standards and National Emission Standards for... and Natural Gas Sector: New Source Performance Standards and National Emission Standards for Hazardous... be charged for copying. World Wide Web. The EPA Web site for this rulemaking is located at: http...
An open-source, mobile-friendly search engine for public medical knowledge.
Samwald, Matthias; Hanbury, Allan
2014-01-01
The World Wide Web has become an important source of information for medical practitioners. To complement the capabilities of currently available web search engines we developed FindMeEvidence, an open-source, mobile-friendly medical search engine. In a preliminary evaluation, the quality of results from FindMeEvidence proved to be competitive with those from TRIP Database, an established, closed-source search engine for evidence-based medicine.
CrossCheck: an open-source web tool for high-throughput screen data analysis.
Najafov, Jamil; Najafov, Ayaz
2017-07-19
Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
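The core cross-referencing step can be illustrated with a short sketch: overlap a user's gene list with published hit lists and score each overlap with a hypergeometric test. The datasets, gene symbols, and background size below are toy placeholders, not CrossCheck's database or its exact statistics.

```python
# Toy cross-referencing of a user gene list against published hit lists,
# scored with a hypergeometric enrichment test.
from scipy.stats import hypergeom

user_genes = {"TP53", "ATM", "CHEK2", "BRCA1", "MDM2"}
published_datasets = {
    "RNAi_screen_A":   {"TP53", "ATM", "RB1", "CDK4"},
    "CRISPR_screen_B": {"BRCA1", "BRCA2", "PALB2", "CHEK2", "MDM2"},
}
BACKGROUND = 20000   # assumed genome-wide background size

for name, hits in published_datasets.items():
    overlap = user_genes & hits
    # P(overlap >= observed) under random sampling from the background
    p = hypergeom.sf(len(overlap) - 1, BACKGROUND, len(hits), len(user_genes))
    print(f"{name}: overlap={sorted(overlap)}, p={p:.2e}")
```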
No Longer Conveyor but Creator: Developing an Epistemology of the World Wide Web.
ERIC Educational Resources Information Center
Trombley, Laura E. Skandera; Flanagan, William G.
2001-01-01
Discusses the impact of the World Wide Web in terms of epistemology. Topics include technological innovations, including new dimensions of virtuality; the accessibility of information; tracking Web use via cookies; how the Web transforms the process of learning and knowing; linking information sources; and the Web as an information delivery…
Awareness and action for eliminating health care disparities in pain care: Web-based resources.
Fan, Ling; Thomas, Melissa; Deitrick, Ginna E; Polomano, Rosemary C
2008-01-01
Evidence shows that disparities in pain care exist, and this problem spans across all health care settings. Health care disparities are complex, and stem from the health system climate, limitations imposed by laws and regulations, and discriminatory practices that are deep seated in biases, stereotypes, and uncertainties surrounding communication and decision-making processes. A search of the Internet identified thousands of Web sites, documents, reports, and educational materials pertaining to health and pain disparities. Web sites for federal agencies, private foundations, and professional and consumer-oriented organizations provide useful information on disparities related to age, race, ethnicity, geography, socioeconomic status, and specific populations. The contents of 10 Web sites are examined for resources to assist health professionals and consumers in better understanding health and pain disparities and ways to overcome them in practice.
Carbon flows in the benthic food web at the deep-sea observatory HAUSGARTEN (Fram Strait)
NASA Astrophysics Data System (ADS)
van Oevelen, Dick; Bergmann, Melanie; Soetaert, Karline; Bauerfeind, Eduard; Hasemann, Christiane; Klages, Michael; Schewe, Ingo; Soltwedel, Thomas; Budaeva, Nataliya E.
2011-11-01
The HAUSGARTEN observatory is located in the eastern Fram Strait (Arctic Ocean) and is used as a long-term monitoring site to follow changes in the Arctic benthic ecosystem. Linear inverse modelling was applied to decipher carbon flows among the compartments of the benthic food web at the central HAUSGARTEN station (2500 m) based on an empirical data set consisting of data on biomass, prokaryote production, total carbon deposition and community respiration. The model resolved 99 carbon flows among 4 abiotic and 10 biotic compartments, ranging from prokaryotes up to megafauna. Total carbon input was 3.78±0.31 mmol C m⁻² d⁻¹, which is a comparatively small fraction of total primary production in the area. The community respiration of 3.26±0.20 mmol C m⁻² d⁻¹ is dominated by prokaryotes (93%) and has lower contributions from surface-deposit feeding macrofauna (1.7%) and suspension feeding megafauna (1.9%), whereas contributions from nematode and other macro- and megabenthic compartments were limited to <1%. The high prokaryotic contribution to carbon processing suggests that functioning of the benthic food web at the central HAUSGARTEN station is comparable to abyssal plain sediments that are characterised by strong energy limitation. Faunal diet compositions suggest that labile detritus is important for deposit-feeding nematodes (24% of their diet) and surface-deposit feeding macrofauna (~44%), but that semi-labile detritus is more important in the diets of deposit-feeding macro- and megafauna. Dependency indices on these food sources were also calculated, as these integrate direct (i.e. direct grazing and predator-prey interactions) and indirect (i.e. longer loops in the food web) pathways in the food web. Projected sea-ice retreats for the Arctic Ocean typically anticipate a decrease in the labile detritus flux to the already food-limited benthic food web. The dependency indices indicate that faunal compartments depend similarly on labile and semi-labile detritus, which suggests that the benthic biota may be more sensitive to changes in labile detritus inputs than when assessed from diet composition alone. Species-specific responses to different types of labile detritus inputs, e.g. pelagic algae versus sympagic algae, however, are presently unknown and are needed to assess the vulnerability of individual components of the benthic food web.
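To give a flavour of the linear inverse approach, the sketch below solves a toy three-compartment carbon budget as a non-negative least-squares problem, using mass balances as equations and two measured totals (the input and respiration figures quoted above) as data constraints. It is an under-determined illustration only, not the HAUSGARTEN model, which resolves 99 flows under many more constraints.

```python
# Toy linear inverse food-web model: solve unknown carbon flows from mass balances
# and measured totals; the compartments and their links are illustrative only.
import numpy as np
from scipy.optimize import lsq_linear

# Unknown flows (mmol C m^-2 d^-1):
# x0 detritus->prokaryotes, x1 detritus->fauna, x2 prokaryotes->fauna,
# x3 prokaryote respiration, x4 fauna respiration
A = np.array([
    [1, 0, -1, -1,  0],   # prokaryote mass balance: inputs - outputs = 0
    [0, 1,  1,  0, -1],   # fauna mass balance
    [1, 1,  0,  0,  0],   # measured total carbon input
    [0, 0,  0,  1,  1],   # measured community respiration
], dtype=float)
b = np.array([0.0, 0.0, 3.78, 3.26])

# The system is under-determined (5 unknowns, 4 equations); lsq_linear returns one
# non-negative least-squares solution, whereas real models add many more constraints.
res = lsq_linear(A, b, bounds=(0, np.inf))
print(dict(zip(["det->prok", "det->fauna", "prok->fauna",
                "prok resp", "fauna resp"], res.x.round(2))))
```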
Elusive or Illuminating: Using the Web To Explore the Salem Witchcraft Trials.
ERIC Educational Resources Information Center
Hurter, Stephanie R.
2003-01-01
Presents Web sites useful for teaching about the Salem (Massachusetts) witchcraft trials. Includes Web sites that offer primary source material, collections of Web sites, teaching material, and sites that are interactive, including features, such as QuickTime movies. (CMK)
Extracting Databases from Dark Data with DeepDive.
Zhang, Ce; Shin, Jaeho; Ré, Christopher; Cafarella, Michael; Niu, Feng
2016-01-01
DeepDive is a system for extracting relational databases from dark data : the mass of text, tables, and images that are widely collected and stored but which cannot be exploited by standard relational tools. If the information in dark data - scientific papers, Web classified ads, customer service notes, and so on - were instead in a relational database, it would give analysts a massive and valuable new set of "big data." DeepDive is distinctive when compared to previous information extraction systems in its ability to obtain very high precision and recall at reasonable engineering cost; in a number of applications, we have used DeepDive to create databases with accuracy that meets that of human annotators. To date we have successfully deployed DeepDive to create data-centric applications for insurance, materials science, genomics, paleontologists, law enforcement, and others. The data unlocked by DeepDive represents a massive opportunity for industry, government, and scientific researchers. DeepDive is enabled by an unusual design that combines large-scale probabilistic inference with a novel developer interaction cycle. This design is enabled by several core innovations around probabilistic training and inference.
Blind source deconvolution for deep Earth seismology
NASA Astrophysics Data System (ADS)
Stefan, W.; Renaut, R.; Garnero, E. J.; Lay, T.
2007-12-01
We present an approach to automatically estimate an empirical source characterization of deep earthquakes recorded teleseismically and subsequently remove the source from the recordings by applying regularized deconvolution. A principal goal in this work is to effectively deblur the seismograms, resulting in more impulsive and narrower pulses, permitting better constraints in high resolution waveform analyses. Our method consists of two stages: (1) we first estimate the empirical source by automatically registering traces to their 1st principal component with a weighting scheme based on their deviation from this shape; we then use this shape as an estimate of the earthquake source. (2) We compare different deconvolution techniques to remove the source characteristic from the trace. In particular, Total Variation (TV) regularized deconvolution is used, which exploits the fact that most natural signals have an underlying sparseness in an appropriate basis, in this case, impulsive onsets of seismic arrivals. We show several examples of deep focus Fiji-Tonga region earthquakes for the phases S and ScS, comparing source responses for the separate phases. TV deconvolution is compared to water level deconvolution, Tikhonov deconvolution, and L1 norm deconvolution, for both data and synthetics. This approach significantly improves our ability to study subtle waveform features that are commonly masked by either noise or the earthquake source. Eliminating source complexities improves our ability to resolve deep mantle triplications and waveform complexities associated with possible double crossings of the post-perovskite phase transition, as well as increasing stability in waveform analyses used for deep mantle anisotropy measurements.
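One of the benchmark methods named above, water level deconvolution, can be sketched in a few lines: divide the trace spectrum by the source spectrum after clamping small source amplitudes to a floor. The synthetic wavelet and reflectivity below are placeholders, and this is not the authors' TV-regularized code.

```python
# Water-level deconvolution on a synthetic trace; the Gaussian wavelet and spike
# positions are placeholders for illustration only.
import numpy as np

def water_level_deconvolution(trace, source, level=0.01):
    n = len(trace)
    S = np.fft.rfft(source, n)
    T = np.fft.rfft(trace, n)
    floor = level * np.max(np.abs(S))
    # Replace small source amplitudes by the water level, keeping the phase.
    S_stab = np.where(np.abs(S) < floor, floor * np.exp(1j * np.angle(S)), S)
    return np.fft.irfft(T / S_stab, n)

rng = np.random.default_rng(0)
reflectivity = np.zeros(512)
reflectivity[100], reflectivity[180] = 1.0, -0.6            # two impulsive arrivals
source = np.exp(-0.5 * ((np.arange(64) - 32) / 5.0) ** 2)   # Gaussian stand-in wavelet
# Circular convolution (consistent with the FFT-based deconvolution) plus noise.
trace = np.fft.irfft(np.fft.rfft(reflectivity) * np.fft.rfft(source, 512), 512)
trace += 0.01 * rng.standard_normal(512)

recovered = water_level_deconvolution(trace, source)
print("largest recovered spikes near samples:", np.sort(np.argsort(np.abs(recovered))[-2:]))
```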
Bromberg, Yana; Yachdav, Guy; Ofran, Yanay; Schneider, Reinhard; Rost, Burkhard
2009-05-01
The rapidly increasing quantity of protein sequence data continues to widen the gap between available sequences and annotations. Comparative modeling suggests some aspects of the 3D structures of approximately half of all known proteins; homology- and network-based inferences annotate some aspect of function for a similar fraction of the proteome. For most known protein sequences, however, there is detailed knowledge about neither their function nor their structure. Comprehensive efforts towards the expert curation of sequence annotations have failed to meet the demand of the rapidly increasing number of available sequences. Only the automated prediction of protein function in the absence of homology can close the gap between available sequences and annotations in the foreseeable future. This review focuses on two novel methods for automated annotation, and briefly presents an outlook on how modern web software may revolutionize the field of protein sequence annotation. First, predictions of protein binding sites and functional hotspots, and the evolution of these into the most successful type of prediction of protein function from sequence will be discussed. Second, a new tool, comprehensive in silico mutagenesis, which contributes important novel predictions of function and at the same time prepares for the onset of the next sequencing revolution, will be described. While these two new sub-fields of protein prediction represent the breakthroughs that have been achieved methodologically, it will then be argued that a different development might further change the way biomedical researchers benefit from annotations: modern web software can connect the worldwide web in any browser with the 'Deep Web' (ie, proprietary data resources). The availability of this direct connection, and the resulting access to a wealth of data, may impact drug discovery and development more than any existing method that contributes to protein annotation.
Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D
2012-10-01
The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICEs) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allows automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. As a web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.
Geoinformatics in the public service: building a cyberinfrastructure across the geological surveys
Allison, M. Lee; Gundersen, Linda C.; Richard, Stephen M.; Keller, G. Randy; Baru, Chaitanya
2011-01-01
Advanced information technology infrastructure is increasingly being employed in the Earth sciences to provide researchers with efficient access to massive central databases and to integrate diversely formatted information from a variety of sources. These geoinformatics initiatives enable manipulation, modeling and visualization of data in a consistent way, and are helping to develop integrated Earth models at various scales, and from the near surface to the deep interior. This book uses a series of case studies to demonstrate computer and database use across the geosciences. Chapters are thematically grouped into sections that cover data collection and management; modeling and community computational codes; visualization and data representation; knowledge management and data integration; and web services and scientific workflows. Geoinformatics is a fascinating and accessible introduction to this emerging field for readers across the solid Earth sciences and an invaluable reference for researchers interested in initiating new cyberinfrastructure projects of their own.
75 FR 82072 - Notice of Lodging of a Consent Decree Under the Clean Water Act
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-29
... injunctive measures, including the construction of seven deep underground tunnel systems--to reduce its CSO... Decree, may also be examined on the following Department of Justice Web site, to http://www.usdoj.gov...
In situ grazing experiments apply new technology to gain insights into deep-sea microbial food webs
NASA Astrophysics Data System (ADS)
Pachiadaki, Maria G.; Taylor, Craig; Oikonomou, Andreas; Yakimov, Michail M.; Stoeck, Thorsten; Edgcomb, Virginia
2016-07-01
Predation by grazing protists in aquatic habitats can influence prokaryotic community structure and provides a source of new, labile organic matter. Due to methodological difficulties associated with studies of deep-sea (below photic zone) microbiota, trophic interactions between eukaryotes and prokaryotes in mesopelagic and bathypelagic realms are largely obscured. Further complicating matters, examinations of trophic interactions using water samples that have been exposed to upwards of hundreds of atmospheres of pressure change prior to initiating experiments can potentially introduce significant artifacts. Here we present results of the first study of protistan grazing in water layers ranging from the euphotic zone to the bathypelagic, utilizing the Microbial Sampler-Submersible Incubation Device (MS-SID) that makes possible in situ studies of microbial activities. Protistan grazing in the mesopelagic and bathypelagic realm of the East Mediterranean Sea was quantified using fluorescently labeled prokaryotes (FLP) prepared from the naturally-occurring prokaryotic assemblages. These studies reveal daily prokaryotic removal due to grazing ranging from 31.3±5.9% at 40 m depth to 0.5±0.3% at 950 m. At 3540 m depth, where a chemocline habitat exists with abundant and active prokaryotes above Urania basin, the daily consumption of prokaryotes by protists was 19.9±6.6% of the in situ abundance.
WebGL and web audio software lightweight components for multimedia education
NASA Astrophysics Data System (ADS)
Chang, Xin; Yuksel, Kivanc; Skarbek, Władysław
2017-08-01
The paper presents the results of our recent work on the development of the contemporary computing platform DC2 for multimedia education using WebGL and Web Audio, the W3C standards. Using the literate programming paradigm, the WEBSA educational tools were developed. They offer the user (student) access to an expandable collection of WebGL shaders and Web Audio scripts. A unique feature of DC2 is the option of literate programming, offered to both the author and the reader, in order to improve the interactivity of lightweight WebGL and Web Audio components. For instance, users can define source audio nodes (including synthetic sources), destination audio nodes, and nodes for audio processing such as sound wave shaping, spectral band filtering, and convolution-based modification. In the case of WebGL, besides classic graphics effects based on mesh and fractal definitions, novel shader-based image processing and analysis are offered, such as nonlinear filtering, histograms of gradients, and Bayesian classifiers.
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... CATEGORIES National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This... subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and web coating...
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CATEGORIES National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This... subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and web coating...
ERIC Educational Resources Information Center
Mayfield, Jacqueline; Mayfield, Milton; Kohl, John
2005-01-01
The World Wide Web presents many opportunities for improving the instructional quality of international business communication related classes by providing access to a large variety of information sources. These sources can be used as supplements to traditional texts, as the basis for specific program assignments, or even as the main focus of a…
Risk of Neurological Insult in Competitive Deep Breath-Hold Diving.
Tetzlaff, Kay; Schöppenthau, Holger; Schipke, Jochen D
2017-02-01
It has been widely believed that tissue nitrogen uptake from the lungs during breath-hold diving would be insufficient to cause decompression stress in humans. With competitive free diving, however, diving depths have been ever increasing over the past decades. A case is presented of a competitive free-diving athlete who suffered stroke-like symptoms after surfacing from his last dive of a series of 3 deep breath-hold dives. A literature and Web search was performed to screen for similar cases of subjects with serious neurological symptoms after deep breath-hold dives. A previously healthy 31-y-old athlete experienced right-sided motor weakness and difficulty speaking immediately after surfacing from a breath-hold dive to a depth of 100 m. He had performed 2 preceding breath-hold dives to that depth with surface intervals of only 15 min. The presentation of symptoms and neuroimaging findings supported a clinical diagnosis of stroke. Three more cases of neurological insults were retrieved by the literature and Web search; in all cases the athletes presented with stroke-like symptoms after single breath-hold dives to depths exceeding 100 m. Two of these cases had only a short delay to recompression treatment and completely recovered from the insult. This report highlights the possibility of neurological insult, e.g., stroke, due to cerebral arterial gas embolism as a consequence of decompression stress after deep breath-hold dives. Thus, stroke as a clinical presentation of cerebral arterial gas embolism should be considered another risk of extreme breath-hold diving.
Using Deep Learning for Gamma Ray Source Detection at the First G-APD Cherenkov Telescope (FACT)
NASA Astrophysics Data System (ADS)
Bieker, Jacob
2018-06-01
Finding gamma-ray sources is of paramount importance for Imaging Air Cherenkov Telescopes (IACT). This study looks at using deep neural networks on data from the First G-APD Cherenkov Telescope (FACT) as a proof-of-concept of finding gamma-ray sources with deep learning for the upcoming Cherenkov Telescope Array (CTA). In this study, FACT’s individual photon level observation data from the last 5 years was used with convolutional neural networks to determine if one or more sources were present. The neural networks used various architectures to determine which architectures were most successful in finding sources. Neural networks offer a promising method for finding faint and extended gamma-ray sources for IACTs. With further improvement and modifications, they offer a compelling method for source detection for the next generation of IACTs.
AEGIS-X: Deep Chandra Imaging of the Central Groth Strip
NASA Astrophysics Data System (ADS)
Nandra, K.; Laird, E. S.; Aird, J. A.; Salvato, M.; Georgakakis, A.; Barro, G.; Perez-Gonzalez, P. G.; Barmby, P.; Chary, R.-R.; Coil, A.; Cooper, M. C.; Davis, M.; Dickinson, M.; Faber, S. M.; Fazio, G. G.; Guhathakurta, P.; Gwyn, S.; Hsu, L.-T.; Huang, J.-S.; Ivison, R. J.; Koo, D. C.; Newman, J. A.; Rangel, C.; Yamada, T.; Willmer, C.
2015-09-01
We present the results of deep Chandra imaging of the central region of the Extended Groth Strip, the AEGIS-X Deep (AEGIS-XD) survey. When combined with previous Chandra observations of a wider area of the strip, AEGIS-X Wide (AEGIS-XW), these provide data to a nominal exposure depth of 800 ks in the three central ACIS-I fields, a region of approximately 0.29 deg². This is currently the third deepest X-ray survey in existence; a factor of ∼2-3 shallower than the Chandra Deep Fields (CDFs), but over an area ∼3 times greater than each CDF. We present a catalog of 937 point sources detected in the deep Chandra observations, along with identifications of our X-ray sources from deep ground-based, Spitzer, GALEX, and Hubble Space Telescope imaging. Using a likelihood ratio analysis, we associate multiband counterparts for 929/937 of our X-ray sources, with an estimated 95% reliability, making the identification completeness approximately 94% in a statistical sense. Reliable spectroscopic redshifts for 353 of our X-ray sources are available, predominantly from Keck (DEEP2/3) and MMT Hectospec, so the current spectroscopic completeness is ∼38%. For the remainder of the X-ray sources, we compute photometric redshifts based on multiband photometry in up to 35 bands from the UV to mid-IR. Particular attention is given to the fact that the vast majority of the X-ray sources are active galactic nuclei and require hybrid templates. Our photometric redshifts have a mean accuracy of σ = 0.04 and an outlier fraction of approximately 5%, reaching σ = 0.03 with less than 4% outliers in the area covered by CANDELS. The X-ray, multiwavelength photometry, and redshift catalogs are made publicly available.
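Photometric-redshift accuracy figures of this kind are commonly quoted as the normalized median absolute deviation of (z_phot - z_spec)/(1 + z_spec) together with an outlier fraction beyond a fixed threshold. The sketch below computes both on simulated values; the 0.15 outlier cut is a common convention and an assumption here, not necessarily the paper's exact definition.

```python
# Common photo-z quality statistics on toy data: sigma_NMAD and outlier fraction.
import numpy as np

def photoz_quality(z_phot, z_spec, outlier_cut=0.15):
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    sigma_nmad = 1.4826 * np.median(np.abs(dz - np.median(dz)))
    outlier_fraction = np.mean(np.abs(dz) > outlier_cut)
    return sigma_nmad, outlier_fraction

# Toy example with simulated scatter and a few catastrophic outliers.
rng = np.random.default_rng(2)
z_spec = rng.uniform(0.1, 3.0, 1000)
z_phot = z_spec + 0.03 * (1 + z_spec) * rng.standard_normal(1000)
z_phot[:30] += 1.0                      # catastrophic failures
sigma, f_out = photoz_quality(z_phot, z_spec)
print(f"sigma_NMAD = {sigma:.3f}, outlier fraction = {f_out:.1%}")
```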
Solar cells and modules from dendritic web silicon
NASA Technical Reports Server (NTRS)
Campbell, R. B.; Rohatgi, A.; Seman, E. J.; Davis, J. R.; Rai-Choudhury, P.; Gallagher, B. D.
1980-01-01
Some of the noteworthy features of the processes developed in the fabrication of solar cell modules are the handling of long lengths of web, the use of cost effective dip coating of photoresist and antireflection coatings, selective electroplating of the grid pattern and ultrasonic bonding of the cell interconnect. Data on the cells are obtained by means of dark I-V analysis and deep level transient spectroscopy. A histogram of over 100 dendritic web solar cells fabricated in a number of runs using different web crystals shows an average efficiency of over 13%, with some efficiencies running above 15%. Lower cell efficiency is generally associated with low minority carrier lifetime due to recombination centers sometimes present in the bulk silicon. A cost analysis of the process sequence using a 25 MW production line indicates a selling price of $0.75/peak watt in 1986. It is concluded that the efficiency of dendritic web cells approaches that of float zone silicon cells, reduced somewhat by the lower bulk lifetime of the former.
Comparison of Physics Frameworks for WebGL-Based Game Engine
NASA Astrophysics Data System (ADS)
Yogya, Resa; Kosala, Raymond
2014-03-01
Recently, a new technology called WebGL has shown a lot of potential for developing games. However, since this technology is still new, there are many possibilities in the game development area that have not yet been explored. This paper tries to uncover the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open source physics frameworks, Bullet, Cannon, and JigLib, into a WebGL-based game engine. Using experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness and compatibility. The results show that it is possible to integrate open source physics frameworks into a WebGL-based game engine, and that Bullet is the best physics framework to integrate into the WebGL-based game engine.
North Atlantic Deep Water Production during the Last Glacial Maximum
Howe, Jacob N. W.; Piotrowski, Alexander M.; Noble, Taryn L.; Mulitza, Stefan; Chiessi, Cristiano M.; Bayon, Germain
2016-01-01
Changes in deep ocean ventilation are commonly invoked as the primary cause of lower glacial atmospheric CO2. The water mass structure of the glacial deep Atlantic Ocean and the mechanism by which it may have sequestered carbon remain elusive. Here we present neodymium isotope measurements from cores throughout the Atlantic that reveal glacial–interglacial changes in water mass distributions. These results demonstrate the sustained production of North Atlantic Deep Water under glacial conditions, indicating that southern-sourced waters were not as spatially extensive during the Last Glacial Maximum as previously believed. We demonstrate that the depleted glacial δ13C values in the deep Atlantic Ocean cannot be explained solely by water mass source changes. A greater amount of respired carbon, therefore, must have been stored in the abyssal Atlantic during the Last Glacial Maximum. We infer that this was achieved by a sluggish deep overturning cell, comprised of well-mixed northern- and southern-sourced waters. PMID:27256826
NASA Astrophysics Data System (ADS)
Jeffreys, Rachel M.; Lavaleye, Marc S. S.; Bergman, Magda J. N.; Duineveld, Gerard C. A.; Witbaard, Rob
2011-04-01
Deep-sea benthic communities derive their energetic requirements from overlying surface water production, which is deposited at the seafloor as phytodetritus. Benthic invertebrates are the primary consumers of this food source, with deep-sea fish at the top of the trophic hierarchy. Recently, we demonstrated with the use of baited cameras that macrourid fish rapidly respond to and feed vigorously on large plant food falls mimicked by spinach ( Jeffreys et al., 2010). Since higher plant remains are scarce in the deep-sea, with the exception of canyons, where terrestrial material has been observed, these results led us to ask if a more commonly documented plant material i.e. phytodetritus might form a food source for deep-sea fish and mobile scavenging megafauna. We simulated a phytodetritus dump at the seafloor in two contrasting environments (1) the NE Atlantic where carpets of phytodetritus have been previously observed and (2) the oligotrophic western Mediterranean, where the deposition of phytodetritus at the seafloor is a rare occurrence. We recorded the response of the scavenging fauna using an in situ benthic lander equipped with baited time-lapse cameras. In the NE Atlantic at 3000 m, abyssal macrourids and cusk-eels were observed ingesting the phytodetritus. The phytodetrital patch was significantly diminished within 2 h. Abundance estimates calculated from first arrival times of macrourids at the phytodetrital patch in the Atlantic corresponded with abundance estimates from video-transect indicating that fish were attracted to the scent of phytodetrital bait. In contrast to this, in the western Mediterranean at 2800 m a single macrourid was observed investigating the phytodetrital patch but did not feed from it. The phytodetrital patch was significantly diminished within 6.5 h as a result of mainly invertebrate activity. At 1900 m, Lepidion lepidion was observed near the lander and the bait, but did not feed. The phytodetrital patch remained intact until the end of the experiment. In the deployments in the Mediterranean abundance estimates from first arrival times at the bait, corrected for their body size, were lower than estimates obtained from video-transects and trawl catches. This suggests that the Mediterranean fish were not readily attracted to this food source. In contrast, invertebrates in the Balearic Sea were observed ingesting the phytodetritus bait despite the rare occurrence of phytodetritus dumps in the Mediterranean. Stable isotope values of the fish at both study sites, set within the context of the benthic food web, did not demonstrate a strong trophic link to phytodetritus. Fatty acid profiles of these fish indicated a strong link between their lipid pool and primary producers i.e. phytoplankton, which may be attributed to trophic transfer. The usefulness of fatty acid biomarkers in ascertaining deep-sea fish diets is discussed. Our study suggests that the abyssal grenadier C. armatus on the Atlantic Iberian margin is attracted to phytodetritus. However the exact contribution of this food source to the diet of macrourids in this area remains unresolved.
Depth-specific Analyses of the Lake Superior Food Web
Characteristics of large, deep aquatic systems include depth gradients in community composition, in the quality and distribution of food resources, and in the strategies that organisms use to obtain their nutrition. In Lake Superior, nearshore communities that rely upon a combina...
NASA Astrophysics Data System (ADS)
Baltar, Federico; Arístegui, Javier; Sintes, Eva; Gasol, Josep M.; Reinthaler, Thomas; Herndl, Gerhard J.
2010-05-01
It is generally assumed that sinking particulate organic carbon (POC) constitutes the main source of organic carbon supply to the deep ocean's food webs. However, a major discrepancy between the rates of sinking POC supply (collected with sediment traps) and the prokaryotic organic carbon demand (the total amount of carbon required to sustain the heterotrophic metabolism of the prokaryotes, i.e., production plus respiration; PCD) of deep-water communities has been consistently reported for the dark realm of the global ocean. While the sinking POC flux declines exponentially with depth, the concentration of suspended, buoyant non-sinking POC (nsPOC; obtained with oceanographic bottles) exhibits only small variations with depth in the (sub)tropical Northeast Atlantic. Based on available data for the North Atlantic, we show here that the sinking POC flux would contribute only 4-12% of the PCD in the mesopelagic realm (depending on the primary production rate in surface waters). The amount of nsPOC potentially available to heterotrophic prokaryotes in the mesopelagic realm can be partly replenished by dark dissolved inorganic carbon fixation, contributing between 12% and 72% of the PCD daily. Taken together, there is evidence that the mesopelagic microheterotrophic biota is more dependent on the nsPOC pool than on the sinking POC supply. Hence, the enigmatic major mismatch between the organic carbon demand of the deep-water heterotrophic microbiota and the POC supply rates might be substantially smaller once the potentially available nsPOC and its autochthonous production are included in oceanic carbon cycling models.
News Resources on the World Wide Web.
ERIC Educational Resources Information Center
Notess, Greg R.
1996-01-01
Describes up-to-date news sources that are presently available on the Internet and World Wide Web. Highlights include electronic newspapers; AP (Associated Press) sources and Reuters; sports news; stock market information; New York Times; multimedia capabilities, including CNN Interactive; and local and regional news. (LRW)
The deep web, dark matter, metabundles and the broadband elites: do you need an informaticist?
Holden, Gary; Rosenberg, Gary
2003-01-01
The World Wide Web (WWW) is growing in size and is becoming a substantial component of life. This seems especially true for US professionals, including social workers. It will require effort by these professionals to use the WWW effectively and efficiently. One of the main issues that these professionals will encounter in these efforts is the quality of materials located on the WWW. This paper reviews some of the factors related to improving the quality of information obtained from the WWW by social workers.
A very deep IRAS survey at the north ecliptic pole
NASA Technical Reports Server (NTRS)
Houck, J. R.; Hacking, P. B.; Condon, J. J.
1987-01-01
The data from approximately 20 hours of observation of the 4- to 6-square-degree field surrounding the north ecliptic pole have been combined to produce a very deep IR survey in the four IRAS bands. Scans from both pointed and survey observations were included in the data analysis. At 12 and 25 microns the deep survey is limited by detector noise and is approximately 50 times deeper than the IRAS Point Source Catalog (PSC). At 60 microns the problems of source confusion and Galactic cirrus combine to limit the deep survey to approximately 12 times deeper than the PSC. These problems are so severe at 100 microns that flux values are only given for locations corresponding to sources selected at 60 microns. In all, 47 sources were detected at 12 microns, 37 at 25 microns, and 99 at 60 microns. The data-analysis procedures and the significance of the 12- and 60-micron source-count results are discussed.
[Dynamics and interactions between the university community and public health 2.0].
Rodríguez-Gómez, Rodolfo
2016-01-01
To explore the experiences of a group of participants in a university community with the web in general and with digital content on public health, to describe their motivations, and to understand how social networks influence their interaction with content on public health. Qualitative research. In-depth, semi-structured interviews were conducted to understand the phenomenon. Five categories emerged from the study: socialization and internalization of the cyberculture, social marketing linked to the web and public health, culture of fear and distrust, the concept of health, and the health system and public health. Participants have internalized the web and have given it a strong symbolic capital. The challenges of public health 2.0 are not only to achieve interaction with users and to gain a place in cyberspace, but also to fight against the stigma of the "public" and to take advantage of the influence of the web on small-world networks to communicate.
Pienaar, Rudolph; Rannou, Nicolas; Bernal, Jorge; Hahn, Daniel; Grant, P Ellen
2015-01-01
The utility of web browsers for general purpose computing, long anticipated, is only now coming to fruition. In this paper we present a web-based medical image data and information management software platform called ChRIS ([Boston] Children's Research Integration System). ChRIS' deep functionality allows for easy retrieval of medical image data from resources typically found in hospitals, organizes and presents information in a modern feed-like interface, provides access to a growing library of plugins that process these data (typically on a connected High Performance Compute Cluster), allows for easy data sharing between users and instances of ChRIS, and provides powerful 3D visualization and real-time collaboration.
1994-09-01
50 years ago as an imperative for a simple fighter-bomber escort team has since produced a highly sophisticated web of relationships between multip...encyclopedia of mission-specific objectives that are neither defined nor conceived at the operational level. The ATO cannot possibly cut this deep, nor...but they were nervous). [Meanwhile] F-15s are at their orbit point 150 miles deep in Iraq-waiting! We should do better than that. Responsiveness What
NASA Astrophysics Data System (ADS)
Das, I.; Oberai, K.; Sarathi Roy, P.
2012-07-01
Landslides exhibit themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth surface. Making landslide databases available online via the WWW (World Wide Web) promotes the dissemination of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS drastically reduces the cost of the project, as proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets along with the landslide susceptibility map were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front-end application and PostgreSQL with the PostGIS extension serves as the back-end application for the web-enabled landslide spatio-temporal databases. This dynamic visualization process through a web platform brings an understanding of the landslides and the resulting damage closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.
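The landslide susceptibility layer above is exposed as an OGC Web Feature Service; as a rough illustration of how such a service can be queried programmatically (not the study's actual endpoint, whose URL and layer name are unknown and shown here as placeholders), a minimal GetFeature request with Python's requests library might look like this:

```python
import requests

# Hypothetical WFS endpoint and layer name; placeholders only.
WFS_URL = "https://example.org/cgi-bin/mapserv"
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typename": "landslide_susceptibility",   # assumed layer name
    "outputFormat": "GML2",
    "maxFeatures": "10",
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])  # first part of the returned GML payload
```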
9 CFR 94.12 - Pork and pork products from regions where swine vesicular disease exists.
Code of Federal Regulations, 2013 CFR
2013-01-01
... vesicular disease is maintained on the APHIS Web site at http://www.aphis.usda.gov/import_export/animals... 260 °C for approximately 210 minutes after which they must be cooked in hot oil (deep-fried) at a...
Depth-specific Analyses of the Lake Superior Food Web, oral presentation
Characteristics of large, deep aquatic systems include depth gradients in community composition, in the quality and distribution of food resources, and in the strategies that organisms use to obtain their nutrition. In Lake Superior, nearshore communities that rely upon a combina...
A flexible cruciform journal bearing mount
NASA Technical Reports Server (NTRS)
Frost, A. E.; Geiger, W. A.
1973-01-01
Flexible mount achieves low roll, pitch, and yaw stiffnesses while maintaining high radial stiffness by holding the bearing pad in a fixed relationship to a deep-web cruciform member and holding this member in a fixed relationship to the bearing support. This mount has particular application in small, high-performance gas turbines.
Ocean Drilling Program: Related Sites
) 306-0390. Web site: www.nsf.gov. Joint Oceanographic Institutions for Deep Earth Sampling (JOIDES) US Members: Columbia University, Lamont-Doherty Earth Observatory; Florida State University; Oregon State University, College of Oceanic and Atmospheric Sciences; Pennsylvania State University, College of Earth and
Ondex Web: web-based visualization and exploration of heterogeneous biological networks.
Taubert, Jan; Hassani-Pak, Keywan; Castells-Brooke, Nathalie; Rawlings, Christopher J
2014-04-01
Ondex Web is a new web-based implementation of the network visualization and exploration tools from the Ondex data integration platform. New features such as context-sensitive menus and annotation tools provide users with intuitive ways to explore and manipulate the appearance of heterogeneous biological networks. Ondex Web is open source, written in Java and can be easily embedded into Web sites as an applet. Ondex Web supports loading data from a variety of network formats, such as XGMML, NWB, Pajek and OXL. http://ondex.rothamsted.ac.uk/OndexWeb.
Kooistra, Lammert; Bergsma, Aldo; Chuma, Beatus; de Bruin, Sytze
2009-01-01
This paper describes the development of a sensor-web-based approach which combines earth observation and in situ sensor data to derive the kind of information typically offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the Netherlands with a spatial resolution of 250 m. Daily available MODIS surface reflectance products and meteorological parameters obtained through a Sensor Observation Service (SOS) were used as input for a vegetation productivity model. This paper presents the vegetation productivity model, the sensor data sources and the implementation of the automated processing facility. Finally, an evaluation is made of the opportunities and limitations of sensor-web-based approaches for the development of web services that combine both satellite and in situ sensor sources. PMID:22574019
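The abstract does not give the productivity model's equations; purely as an illustrative sketch, a common light-use-efficiency formulation (GPP = ε × fAPAR × PAR, with ε down-regulated by temperature and moisture scalars) could combine MODIS-derived fAPAR with SOS-supplied meteorology roughly as follows. The function, parameter values and scalar forms are assumptions, not the authors' model:

```python
def gpp_light_use_efficiency(par_mj_m2_day, fapar, eps_max=1.8,
                             t_scalar=1.0, w_scalar=1.0):
    """Daily gross primary production (g C m-2 d-1) from a simple
    light-use-efficiency model: GPP = eps * fAPAR * PAR.

    eps_max is the maximum light-use efficiency (g C per MJ of absorbed PAR);
    t_scalar and w_scalar (0-1) down-regulate it for temperature and
    moisture stress. All values here are illustrative assumptions.
    """
    eps = eps_max * t_scalar * w_scalar
    return eps * fapar * par_mj_m2_day

# Example: one 250 m pixel on one day (made-up numbers).
print(gpp_light_use_efficiency(par_mj_m2_day=9.5, fapar=0.62,
                               t_scalar=0.9, w_scalar=0.8))
```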
T-Check in Technologies for Interoperability: Web Services and Security--Single Sign-On
2007-12-01
following tools: • Apache Tomcat 6.0—a Java Servlet container to host the Web services and a simple Web client application [Apache 2007a] • Apache Axis...Eclipse. Eclipse – an open development platform. http://www.eclipse.org/ (2007) [Hunter 2001] Hunter, Jason. Java Servlet Programming, 2nd Edition...Citation SAML 1.1 Java Toolkit SAML Ping Identity’s SAML-1.1 implementation [SourceID 2006] OpenSAML SAML An open source implementation of SAML 1.1
Readability of Online Patient Education Materials Related to IR.
McEnteggart, Gregory E; Naeem, Muhammad; Skierkowski, Dorothy; Baird, Grayson L; Ahn, Sun H; Soares, Gregory
2015-08-01
To assess the readability of online patient education materials (OPEM) related to common diseases treated by and procedures performed by interventional radiology (IR). The following websites were chosen based on their average Google search return for each IR OPEM content area examined in this study: Society of Interventional Radiology (SIR), Cardiovascular and Interventional Radiological Society of Europe (CIRSE), National Library of Medicine, RadiologyInfo, Mayo Clinic, WebMD, and Wikipedia. IR OPEM content area was assessed for the following: peripheral arterial disease, central venous catheter, varicocele, uterine artery embolization, vertebroplasty, transjugular intrahepatic portosystemic shunt, and deep vein thrombosis. The following algorithms were used to estimate and compare readability levels: Flesch-Kincaid Grade Formula, Flesch Reading Ease Score, Gunning Frequency of Gobbledygook, Simple Measure of Gobbledygook, and Coleman-Liau Index. Data were analyzed using general mixed modeling. On average, online sources that required beyond high school grade-level readability were Wikipedia (15.0), SIR (14.2), and RadiologyInfo (12.4); sources that required high school grade-level readability were CIRSE (11.3), Mayo Clinic (11.0), WebMD (10.6), and National Library of Medicine (9.0). On average, OPEM on uterine artery embolization, vertebroplasty, varicocele, and peripheral arterial disease required the highest level of readability (12.5, 12.3, 12.3, and 12.2, respectively). The IR OPEM assessed in this study were written above the recommended sixth-grade reading level and the health literacy level of the average American adult. Many patients in the general public may not have the ability to read and understand health information in IR OPEM. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.
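For reference, the Flesch-Kincaid grade formula cited above combines average sentence length with average syllables per word (0.39 × words/sentences + 11.8 × syllables/words − 15.59). A minimal sketch is shown below; the vowel-group syllable counter is a crude approximation, and the sample text is invented:

```python
import re

def count_syllables(word):
    # Naive approximation: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59"""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words)) - 15.59)

print(round(flesch_kincaid_grade(
    "Deep vein thrombosis is a blood clot that forms in a deep vein, "
    "usually in the leg. It can cause swelling and pain."), 1))
```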
ERIC Educational Resources Information Center
Krauthamer, Helene
2001-01-01
Describes a workshop used with classes doing Web research for their English papers in a computer lab. Shows how this is a good opportunity for students to learn to find, evaluate, and save Web sources, how to read critically and annotate the sources, and how to weave them into working drafts and avoid plagiarism. (SR)
Information sources [Chapter 12
Daniel G. Neary; John N. Rinne; Alvin L. Medina
2012-01-01
The main information sources for the UVR consist of several web sites with general information and bibliographies. RMRS has publications on its Air, Water, Aquatic Environments (AWAE) Program Flagstaff web site. Another RMRS and University of Arizona website on semi-arid and arid watersheds contains a large, searchable bibliography of supporting information from the...
NASA Astrophysics Data System (ADS)
Franzen, Thomas M. O.; Sadler, Elaine M.; Chhetri, Rajan; Ekers, Ronald D.; Mahony, Elizabeth K.; Murphy, Tara; Norris, Ray P.; Waldram, Elizabeth M.; Whittam, Imogen H.
2014-04-01
We present a source catalogue and first results from a deep, blind radio survey carried out at 20 GHz with the Australia Telescope Compact Array, with follow-up observations at 5.5, 9 and 18 GHz. The Australia Telescope 20 GHz (AT20G) deep pilot survey covers a total area of 5 deg² in the Chandra Deep Field South and in Stripe 82 of the Sloan Digital Sky Survey. We estimate the survey to be 90 per cent complete above 2.5 mJy. Of the 85 sources detected, 55 per cent have steep spectra (α_{1.4}^{20} < -0.5) and 45 per cent have flat or inverted spectra (α_{1.4}^{20} ≥ -0.5). The steep-spectrum sources tend to have single power-law spectra between 1.4 and 18 GHz, while the spectral indices of the flat- or inverted-spectrum sources tend to steepen with frequency. Among the 18 inverted-spectrum (α_{1.4}^{20} ≥ 0.0) sources, 10 have clearly defined peaks in their spectra with α_{1.4}^{5.5} > 0.15 and α_{9}^{18} < -0.15. On a 3-yr time-scale, at least 10 sources varied by more than 15 per cent at 20 GHz, showing that variability is still common at the low flux densities probed by the AT20G-deep pilot survey. We find a strong and puzzling shift in the typical spectral index of the 15-20-GHz source population when combining data from the AT20G, Ninth Cambridge and Tenth Cambridge surveys: there is a shift towards a steeper-spectrum population when going from ~1 Jy to ~5 mJy, which is followed by a shift back towards a flatter-spectrum population below ~5 mJy. The 5-GHz source-count model by Jackson & Wall, which only includes contributions from FRI and FRII sources, and star-forming galaxies, does not reproduce the observed flattening of the flat-spectrum counts below ~5 mJy. It is therefore possible that another population of sources is contributing to this effect.
The Management of the Scientific Information Environment: The Role of the Research Library Web Site.
ERIC Educational Resources Information Center
Arte, Assunta
2001-01-01
Describes the experiences of the Italian National Research Council Library staff in the successful development and implementation of its Web site. Discusses electronic information sources that interface with the Web site; library services; technical infrastructure; and the choice of a Web-based library management system. (Author/LRW)
Scene Recognition for Indoor Localization Using a Multi-Sensor Fusion Approach.
Liu, Mengyun; Chen, Ruizhi; Li, Deren; Chen, Yujin; Guo, Guangyi; Cao, Zhipeng; Pan, Yuanjin
2017-12-08
After decades of research, there is still no solution for indoor localization like the GNSS (Global Navigation Satellite System) solution for outdoor environments. The major reasons for this are the complex spatial topology and RF transmission environment. To deal with these problems, an indoor scene constrained method for localization is proposed in this paper, inspired by the visual cognition ability of the human brain and by progress in the computer vision field regarding high-level image understanding. Furthermore, a multi-sensor fusion method is implemented on a commercial smartphone including cameras, WiFi and inertial sensors. Compared to former research, the camera on a smartphone is used to "see" which scene the user is in. With this information, a particle filter algorithm constrained by scene information is adopted to determine the final location. For indoor scene recognition, we take advantage of deep learning, which has proven to be highly effective in the computer vision community. For the particle filter, both WiFi and magnetic field signals are used to update the weights of particles. Similar to other fingerprinting localization methods, there are two stages in the proposed system: offline training and online localization. In the offline stage, an indoor scene model is trained by Caffe (one of the most popular open source frameworks for deep learning) and a fingerprint database is constructed from user trajectories in different scenes. To reduce the volume of training data required for deep learning, a fine-tuning method is adopted for model training. In the online stage, a camera in a smartphone is used to recognize the initial scene. Then a particle filter algorithm is used to fuse the sensor data and determine the final location. To prove the effectiveness of the proposed method, an Android client and a web server are implemented. The Android client is used to collect data and locate a user. The web server is developed for indoor scene model training and communication with an Android client. To evaluate the performance, comparison experiments are conducted and the results demonstrate that a positioning accuracy of 1.32 m at 95% is achievable with the proposed solution. Both positioning accuracy and robustness are enhanced compared to approaches without scene constraint, including commercial products such as IndoorAtlas.
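As a rough sketch of the fusion step described above, and not the authors' implementation, the following updates particle weights from WiFi and magnetic-field residuals under Gaussian noise assumptions; all parameter values and signal models are invented:

```python
import numpy as np

def update_weights(particles, weights, wifi_pred, wifi_obs,
                   mag_pred, mag_obs, sigma_wifi=4.0, sigma_mag=2.0):
    """One measurement update of a particle filter fusing WiFi
    fingerprints and magnetic-field magnitude.

    particles: (N, 2) candidate positions; wifi_pred/mag_pred are the
    fingerprint-database predictions at each particle; *_obs are the
    current measurements. Gaussian likelihoods are assumed throughout.
    """
    lik_wifi = np.exp(-0.5 * ((wifi_pred - wifi_obs) / sigma_wifi) ** 2)
    lik_mag = np.exp(-0.5 * ((mag_pred - mag_obs) / sigma_mag) ** 2)
    weights = weights * lik_wifi * lik_mag
    weights /= weights.sum() + 1e-12              # renormalise
    estimate = (particles * weights[:, None]).sum(axis=0)
    return weights, estimate

# Toy example with 4 particles (all numbers are made up).
p = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w = np.full(4, 0.25)
w, est = update_weights(p, w, np.array([-60, -55, -70, -65]), -56,
                        np.array([48.0, 50.0, 46.0, 47.0]), 49.5)
print(w, est)
```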
Scene Recognition for Indoor Localization Using a Multi-Sensor Fusion Approach
Chen, Ruizhi; Li, Deren; Chen, Yujin; Guo, Guangyi; Cao, Zhipeng
2017-01-01
After decades of research, there is still no solution for indoor localization like the GNSS (Global Navigation Satellite System) solution for outdoor environments. The major reasons for this phenomenon are the complex spatial topology and RF transmission environment. To deal with these problems, an indoor scene constrained method for localization is proposed in this paper, which is inspired by the visual cognition ability of the human brain and the progress in the computer vision field regarding high-level image understanding. Furthermore, a multi-sensor fusion method is implemented on a commercial smartphone including cameras, WiFi and inertial sensors. Compared to former research, the camera on a smartphone is used to “see” which scene the user is in. With this information, a particle filter algorithm constrained by scene information is adopted to determine the final location. For indoor scene recognition, we take advantage of deep learning that has been proven to be highly effective in the computer vision community. For particle filter, both WiFi and magnetic field signals are used to update the weights of particles. Similar to other fingerprinting localization methods, there are two stages in the proposed system, offline training and online localization. In the offline stage, an indoor scene model is trained by Caffe (one of the most popular open source frameworks for deep learning) and a fingerprint database is constructed by user trajectories in different scenes. To reduce the volume requirement of training data for deep learning, a fine-tuned method is adopted for model training. In the online stage, a camera in a smartphone is used to recognize the initial scene. Then a particle filter algorithm is used to fuse the sensor data and determine the final location. To prove the effectiveness of the proposed method, an Android client and a web server are implemented. The Android client is used to collect data and locate a user. The web server is developed for indoor scene model training and communication with an Android client. To evaluate the performance, comparison experiments are conducted and the results demonstrate that a positioning accuracy of 1.32 m at 95% is achievable with the proposed solution. Both positioning accuracy and robustness are enhanced compared to approaches without scene constraint including commercial products such as IndoorAtlas. PMID:29292761
Levels-of-processing effect on internal source monitoring in schizophrenia.
Ragland, J Daniel; McCarthy, Erin; Bilker, Warren B; Brensinger, Colleen M; Valdez, Jeffrey; Kohler, Christian; Gur, Raquel E; Gur, Ruben C
2006-05-01
Recognition can be normalized in schizophrenia by providing patients with semantic organizational strategies through a levels-of-processing (LOP) framework. However, patients may rely primarily on familiarity effects, making recognition less sensitive than source monitoring to the strength of the episodic memory trace. The current study investigates whether providing semantic organizational strategies can also normalize patients' internal source-monitoring performance. Sixteen clinically stable medicated patients with schizophrenia and 15 demographically matched healthy controls were asked to identify the source of remembered words following an LOP-encoding paradigm in which they alternated between processing words on a 'shallow' perceptual versus a 'deep' semantic level. A multinomial analysis provided orthogonal measures of item recognition and source discrimination, and bootstrapping generated variance to allow for parametric analyses. LOP and group effects were tested by contrasting recognition and source-monitoring parameters for words that had been encoded during deep versus shallow processing conditions. As in a previous study there were no group differences in LOP effects on recognition performance, with patients and controls benefiting equally from deep versus shallow processing. Although there were no group differences in internal source monitoring, only controls had significantly better performance for words processed during the deep encoding condition. Patient performance did not correlate with clinical symptoms or medication dose. Providing a deep processing semantic encoding strategy significantly improved patients' recognition performance only. The lack of a significant LOP effect on internal source monitoring in patients may reflect subtle problems in the relational binding of semantic information that are independent of strategic memory processes.
Bates, Benjamin R; Romina, Sharon; Ahmed, Rukhsana; Hopson, Danielle
2006-03-01
Recent use of the Internet as a source of health information has raised concerns about consumers' ability to tell 'good' information from 'bad' information. Although consumers report that they use source credibility to judge information quality, several observational studies suggest that consumers make little use of source credibility. This study examines consumer evaluations of web pages attributed to a credible source as compared to generic web pages on measures of message quality. In spring 2005, a community-wide convenience survey was distributed in a regional hub city in Ohio, USA. 519 participants were randomly assigned one of six messages discussing lung cancer prevention: three messages each attributed to a highly credible national organization and three identical messages each attributed to a generic web page. Independent sample t-tests were conducted to compare each attributed message to its counterpart attributed to a generic web page on measures of trustworthiness, truthfulness, readability, and completeness. The results demonstrated that differences in attribution to a source did not have a significant effect on consumers' evaluations of the quality of the information. Conclusions: The authors offer suggestions for national organizations to promote credibility to consumers as a heuristic for choosing better online health information through the use of media co-channels to emphasize credibility.
NASA Astrophysics Data System (ADS)
Holzer, Mark; DeVries, Timothy; Bianchi, Daniele; Newton, Robert; Schlosser, Peter; Winckler, Gisela
2017-01-01
Hydrothermal vents along the ocean's tectonic ridge systems inject superheated water and large amounts of dissolved metals that impact the deep ocean circulation and the oceanic cycling of trace metals. The hydrothermal fluid contains dissolved mantle helium that is enriched in 3He relative to the atmosphere, providing an isotopic tracer of the ocean's deep circulation and a marker of hydrothermal sources. This work investigates the potential for the 3He/4He isotope ratio to constrain the ocean's mantle 3He source and to provide constraints on the ocean's deep circulation. We use an ensemble of 11 data-assimilated steady-state ocean circulation models and a mantle helium source based on geographically varying sea-floor spreading rates. The global source distribution is partitioned into 6 regions, and the vertical profile and source amplitude of each region are varied independently to determine the optimal 3He source distribution that minimizes the mismatch between modeled and observed δ3He. In this way, we are able to fit the observed δ3He distribution to within a relative error of ∼15%, with a global 3He source that ranges from 640 to 850 mol yr-1, depending on circulation. The fit captures the vertical and interbasin gradients of the δ3He distribution very well and reproduces its jet-sheared saddle point in the deep equatorial Pacific. This demonstrates that the data-assimilated models have much greater fidelity to the deep ocean circulation than other coarse-resolution ocean models. Nonetheless, the modelled δ3He distributions still display some systematic biases, especially in the deep North Pacific where δ3He is overpredicted by our models, and in the southeastern tropical Pacific, where observed westward-spreading δ3He plumes are not well captured. Sources inferred by the data-assimilated transport with and without isopycnally aligned eddy diffusivity differ widely in the Southern Ocean, in spite of the ability to match the observed distributions of CFCs and radiocarbon for either eddy parameterization.
Extracting Databases from Dark Data with DeepDive
Zhang, Ce; Shin, Jaeho; Ré, Christopher; Cafarella, Michael; Niu, Feng
2016-01-01
DeepDive is a system for extracting relational databases from dark data: the mass of text, tables, and images that are widely collected and stored but which cannot be exploited by standard relational tools. If the information in dark data — scientific papers, Web classified ads, customer service notes, and so on — were instead in a relational database, it would give analysts a massive and valuable new set of "big data." DeepDive is distinctive when compared to previous information extraction systems in its ability to obtain very high precision and recall at reasonable engineering cost; in a number of applications, we have used DeepDive to create databases with accuracy that meets that of human annotators. To date we have successfully deployed DeepDive to create data-centric applications for insurance, materials science, genomics, paleontology, law enforcement, and others. The data unlocked by DeepDive represents a massive opportunity for industry, government, and scientific researchers. DeepDive is enabled by an unusual design that combines large-scale probabilistic inference with a novel developer interaction cycle. This design is enabled by several core innovations around probabilistic training and inference. PMID:28316365
The S-Web Model for the Sources of the Slow Solar Wind
NASA Technical Reports Server (NTRS)
Antiochos, Spiro K.; Karpen, Judith T.; DeVore, C. Richard
2012-01-01
Models for the origin of the slow solar wind must account for two seemingly contradictory observations: The slow wind has the composition of the closed-field corona, implying that it originates from the continuous opening and closing of flux at the boundary between open and closed field. On the other hand, the slow wind has large angular width, up to 60 degrees, suggesting that its source extends far from the open-closed boundary. We describe a model that can explain both observations. The key idea is that the source of the slow wind at the Sun is a network of narrow (possibly singular) open-field corridors that map to a web of separatrices (the S-Web) and quasi-separatrix layers in the heliosphere. We discuss the dynamics of the S-Web model and its implications for present observations and for the upcoming observations from Solar Orbiter and Solar Probe Plus.
The UMLS Knowledge Source Server: an experience in Web 2.0 technologies.
Thorn, Karen E; Bangalore, Anantha K; Browne, Allen C
2007-10-11
The UMLS Knowledge Source Server (UMLSKS), developed at the National Library of Medicine (NLM), makes the knowledge sources of the Unified Medical Language System (UMLS) available to the research community over the Internet. In 2003, the UMLSKS was redesigned utilizing state-of-the-art technologies available at that time. That design offered a significant improvement over the prior version but presented a set of technology-dependent issues that limited its functionality and usability. Four areas of desired improvement were identified: software interfaces, web interface content, system maintenance/deployment, and user authentication. By employing next generation web technologies, newer authentication paradigms and further refinements in modular design methods, these areas could be addressed and corrected to meet the ever increasing needs of UMLSKS developers. In this paper we detail the issues present with the existing system and describe the new system's design using new technologies considered entrants in the Web 2.0 development era.
NASA Astrophysics Data System (ADS)
Ferrini, V. L.; Grange, B.; Morton, J. J.; Soule, S. A.; Carbotte, S. M.; Lehnert, K.
2016-12-01
The National Deep Submergence Facility (NDSF) operates the Human Occupied Vehicle (HOV) Alvin, the Remotely Operated Vehicle (ROV) Jason, and the Autonomous Underwater Vehicle (AUV) Sentry. These vehicles are deployed throughout the global oceans to acquire sensor data and physical samples for a variety of interdisciplinary science programs. As part of the EarthCube Integrative Activity Alliance Testbed Project (ATP), new web services were developed to improve access to existing online NDSF data and metadata resources. These services make use of tools and infrastructure developed by the Interdisciplinary Earth Data Alliance (IEDA) and enable programmatic access to metadata and data resources as well as the development of new service-driven user interfaces. The Alvin Frame Grabber and Jason Virtual Van enable the exploration of frame-grabbed images derived from video cameras on NDSF dives. Metadata available for each image include time and vehicle position, data from environmental sensors, and scientist-generated annotations, and data are organized and accessible by cruise and/or dive. A new FrameGrabber web service and service-driven user interface were deployed to offer integrated access to these data resources through a single API, allowing users to search across content curated in both systems. In addition, a new NDSF Dive Metadata web service and service-driven user interface were deployed to provide consolidated access to basic information about each NDSF dive (e.g., vehicle name, dive ID, location), which is important for linking distributed data resources curated in different data systems.
SIDECACHE: Information access, management and dissemination framework for web services.
Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A
2011-06-14
Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually, followed by a web service restart. Requests for information obtained by dynamic access of upstream sources are sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology, where new information is being continuously generated and the latest information is important. SideCache provides several types of services, including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework has also been used to share research results through the use of a SideCache-derived web service.
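SideCache's actual API is not described in the abstract; the sketch below only illustrates the general pattern of proxy access with local caching and rate control for an upstream web source. Class and method names, limits, and the placeholder URL are assumptions:

```python
import time
import urllib.request

class CachingProxy:
    """Minimal illustration of proxy access with local caching and
    rate control for an upstream web source (not SideCache's API)."""

    def __init__(self, min_interval_s=1.0, ttl_s=3600.0):
        self.min_interval_s = min_interval_s   # minimum spacing between upstream hits
        self.ttl_s = ttl_s                     # how long cached entries stay fresh
        self._cache = {}                       # url -> (fetched_at, body)
        self._last_fetch = 0.0

    def get(self, url):
        now = time.time()
        hit = self._cache.get(url)
        if hit and now - hit[0] < self.ttl_s:
            return hit[1]                      # serve from local cache
        wait = self.min_interval_s - (now - self._last_fetch)
        if wait > 0:
            time.sleep(wait)                   # respect upstream rate limits
        with urllib.request.urlopen(url, timeout=30) as resp:
            body = resp.read()
        self._last_fetch = time.time()
        self._cache[url] = (self._last_fetch, body)
        return body

# proxy = CachingProxy()
# data = proxy.get("https://example.org/upstream/resource")  # placeholder URL
```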
Magnetic Fields for All: The GPIPS Community Web-Access Portal
NASA Astrophysics Data System (ADS)
Carveth, Carol; Clemens, D. P.; Pinnick, A.; Pavel, M.; Jameson, K.; Taylor, B.
2007-12-01
The new GPIPS website portal provides community users with an intuitive and powerful interface to query the data products of the Galactic Plane Infrared Polarization Survey. The website, which was built using PHP for the front end and MySQL for the database back end, allows users to issue queries based on galactic or equatorial coordinates, GPIPS-specific identifiers, polarization information, magnitude information, and several other attributes. The returns are presented in HTML tables, with the added option of either downloading or being emailed an ASCII file including the same or more information from the database. Other functionalities of the website include providing details of the status of the Survey (which fields have been observed or are planned to be observed), techniques involved in data collection and analysis, and descriptions of the database contents and names. For this initial launch of the website, users may access the GPIPS polarization point source catalog and the deep coadd photometric point source catalog. Future planned developments include a graphics-based method for querying the database, as well as tools to combine neighboring GPIPS images into larger image files for both polarimetry and photometry. This work is partially supported by NSF grant AST-0607500.
CO2 dynamics in the Amargosa Desert: Fluxes and isotopic speciation in a deep unsaturated zone
Walvoord, Michelle Ann; Striegl, Robert G.; Prudic, David E.; Stonestrom, David A.
2005-01-01
Natural unsaturated-zone gas profiles at the U.S. Geological Survey's Amargosa Desert Research Site, near Beatty, Nevada, reveal the presence of two physically and isotopically distinct CO2 sources, one shallow and one deep. The shallow source derives from seasonally variable autotrophic and heterotrophic respiration in the root zone. Scanning electron micrograph results indicate that at least part of the deep CO2 source is associated with calcite precipitation at the 110-m-deep water table. We use a geochemical gas-diffusion model to explore processes of CO2 production and behavior in the unsaturated zone. The individual isotopic species 12CO2, 13CO2, and 14CO2 are treated as separate chemical components that diffuse and react independently. Steady state model solutions, constrained by the measured δ13C (in CO2) and δ14C (in CO2) profiles, indicate that the shallow CO2 source from root and microbial respiration composes ∼97% of the annual average total CO2 production at this arid site. Despite the small contribution from deep CO2 production, amounting to ∼0.1 mol m⁻² yr⁻¹, upward diffusion from depth strongly influences the distribution of CO2 and carbon isotopes in the deep unsaturated zone. In addition to diffusion from deep CO2 production, 14C exchange with a sorbed CO2 phase is indicated by the modeled δ14C profiles, confirming previous work. The new model of carbon-isotopic profiles provides a quantitative approach for evaluating fluxes of carbon under natural conditions in deep unsaturated zones.
Signal restoration through deconvolution applied to deep mantle seismic probes
NASA Astrophysics Data System (ADS)
Stefan, W.; Garnero, E.; Renaut, R. A.
2006-12-01
We present a method of signal restoration to improve the signal-to-noise ratio, sharpen seismic arrival onset, and act as an empirical source deconvolution of specific seismic arrivals. Observed time-series g_i are modelled as a convolution of a simpler time-series f_i and an invariant point spread function (PSF) h that attempts to account for the earthquake source process. The method is used on the shear wave time window containing SKS and S, whereby using a Gaussian PSF produces more impulsive, narrower signals in the wave train. The resulting restored time-series facilitates more accurate and objective relative traveltime estimation of the individual seismic arrivals. We demonstrate the accuracy of the reconstruction method on synthetic seismograms generated by the reflectivity method. Clean and sharp reconstructions are obtained with real data, even for signals with relatively high noise content. Reconstructed signals are simpler, more impulsive, and narrower, which allows highlighting of some details of arrivals that are not readily apparent in raw waveforms. In particular, phases nearly coincident in time can be separately identified after processing. This is demonstrated for two seismic wave pairs used to probe deep mantle and core-mantle boundary structure: (1) the Sab and Scd arrivals, which travel above and within, respectively, a 200-300-km-thick, higher than average shear wave velocity layer at the base of the mantle, observable in the 88-92 deg epicentral distance range, and (2) SKS and SPdiffKS, which are core waves, the latter having short arcs of P-wave diffraction, and are nearly identical in timing near 108-110 deg in distance. A Java/Matlab algorithm was developed for the signal restoration, which can be downloaded from the authors' web page, along with example data and synthetic seismograms.
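The abstract models each observed trace as a convolution g = f * h with a Gaussian PSF. A generic frequency-domain (Wiener-style) deconvolution sketch in NumPy is given below; it is not the authors' Java/Matlab algorithm, and the PSF shape, noise level, and regularization constant are arbitrary:

```python
import numpy as np

def wiener_deconvolve(g, h, eps=1e-2):
    """Recover f from g = f * h by regularised spectral division:
    F = G * conj(H) / (|H|^2 + eps). h is zero-padded to len(g)."""
    n = len(g)
    H = np.fft.rfft(h, n)
    G = np.fft.rfft(g, n)
    F = G * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.fft.irfft(F, n)

# Toy example: a two-spike "reflectivity" blurred by a Gaussian source wavelet.
t = np.arange(256)
f_true = np.zeros(256); f_true[[60, 90]] = [1.0, -0.6]
h = np.exp(-0.5 * ((t - 20) / 4.0) ** 2)              # assumed Gaussian PSF
g = np.convolve(f_true, h)[:256] + 0.01 * np.random.randn(256)
f_est = wiener_deconvolve(g, h)
print(int(np.argmax(f_est)))   # should recover the positive spike near sample 60
```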
Deep mantle: Enriched carbon source detected
NASA Astrophysics Data System (ADS)
Barry, Peter H.
2017-09-01
Estimates of carbon in the deep mantle vary by more than an order of magnitude. Coupled volcanic CO2 emission data and magma supply rates reveal a carbon-rich mantle plume source region beneath Hawai'i with 40% more carbon than previous estimates.
Food web flows through a sub-arctic deep-sea benthic community
NASA Astrophysics Data System (ADS)
Gontikaki, E.; van Oevelen, D.; Soetaert, K.; Witte, U.
2011-11-01
The benthic food web of the deep Faroe-Shetland Channel (FSC) was modelled using the linear inverse modelling methodology. The reconstruction of carbon pathways by inverse analysis was based on benthic oxygen uptake rates, biomass data and the transfer of labile carbon through the food web as revealed by a pulse-chase experiment. Carbon deposition was estimated at 2.2 mmol C m⁻² d⁻¹. Approximately 69% of the deposited carbon was respired by the benthic community, with bacteria being responsible for 70% of the total respiration. The major fraction of the labile detritus flux was recycled within the microbial loop, leaving merely 2% of the deposited labile phytodetritus available for metazoan consumption. Bacteria assimilated carbon at high efficiency (0.55), but only 24% of bacterial production was grazed by metazoans; the remainder returned to the dissolved organic matter pool due to viral lysis. Refractory detritus was the basal food resource for nematodes, covering ∼99% of their carbon requirements. In contrast, macrofauna seemed to obtain the major part of their metabolic needs from bacteria (49% of macrofaunal consumption). Labile detritus transfer was well constrained, based on the data from the pulse-chase experiment, but appeared to be of limited importance to the diet of the examined benthic organisms (<1% and 5% of the carbon requirements of nematodes and macrofauna, respectively). Predation on nematodes was generally low, with the exception of sub-surface deposit-feeding polychaetes, which obtained 35% of their energy requirements from nematode ingestion. Carnivorous polychaetes also covered 35% of their carbon demand through predation, although the preferred prey in this case was other macrofaunal animals rather than nematodes. Bacteria and detritus contributed 53% and 12%, respectively, to the total carbon ingestion of carnivorous polychaetes, suggesting a high degree of omnivory among higher consumers in the FSC benthic food web. Overall, this study provides a unique insight into the functioning of a deep-sea benthic community and demonstrates how conventional data can be exploited further when combined with state-of-the-art modelling approaches.
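Linear inverse food-web models of this kind solve an underdetermined set of mass-balance equations subject to non-negativity (and other) constraints. The toy sketch below is not the authors' FSC model; it merely shows how a handful of invented flows could be solved with scipy.optimize.lsq_linear under such constraints:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Unknown carbon flows (mmol C m-2 d-1), all illustrative:
# x = [detritus->bacteria, detritus->fauna, bacteria->respiration,
#      bacteria->fauna, fauna->respiration, detritus->burial]
A = np.array([
    [1, 1, 0, 0, 0, 1],     # total deposition
    [1, 0, -1, -1, 0, 0],   # bacterial mass balance (in - out = 0)
    [0, 1, 0, 1, -1, 0],    # faunal mass balance (in - out = 0)
    [0, 0, 1, 0, 1, 0],     # total community respiration
], dtype=float)
b = np.array([2.2, 0.0, 0.0, 1.5])   # data constraints (toy values)

# Non-negative flows; the system is underdetermined, so this returns one
# feasible least-squares solution, not a unique food-web estimate.
res = lsq_linear(A, b, bounds=(0, np.inf))
print(np.round(res.x, 3))
```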
9 CFR 94.9 - Pork and pork products from regions where classical swine fever exists.
Code of Federal Regulations, 2013 CFR
2013-01-01
... swine fever is maintained on the APHIS Web site at http://www.aphis.usda.gov/import_export/animals... which they must be cooked in hot oil (deep-fried) at a minimum of 104 °C for an additional 150 minutes...
9 CFR 94.12 - Pork and pork products from regions where swine vesicular disease exists.
Code of Federal Regulations, 2014 CFR
2014-01-01
... has declared free of swine vesicular disease is maintained on the APHIS Web site at http://www.aphis... which they must be cooked in hot oil (deep-fried) at a minimum of 104 °C for an additional 150 minutes...
Galaxy and Mass Assembly (GAMA): Exploring the WISE Web in G12
NASA Astrophysics Data System (ADS)
Jarrett, T. H.; Cluver, M. E.; Magoulas, C.; Bilicki, M.; Alpaslan, M.; Bland-Hawthorn, J.; Brough, S.; Brown, M. J. I.; Croom, S.; Driver, S.; Holwerda, B. W.; Hopkins, A. M.; Loveday, J.; Norberg, P.; Peacock, J. A.; Popescu, C. C.; Sadler, E. M.; Taylor, E. N.; Tuffs, R. J.; Wang, L.
2017-02-01
We present an analysis of the mid-infrared Wide-field Infrared Survey Explorer (WISE) sources seen within the equatorial GAMA G12 field, located in the North Galactic Cap. Our motivation is to study and characterize the behavior of WISE source populations in anticipation of the deep multiwavelength surveys that will define the next decade, with the principal science goal of mapping the 3D large-scale structures and determining the global physical attributes of the host galaxies. In combination with cosmological redshifts, we identify galaxies from their WISE W1 (3.4 μm) resolved emission, and we also perform a star-galaxy separation using apparent magnitude, colors, and statistical modeling of star counts. The resulting galaxy catalog has ≃590,000 sources in 60 deg², reaching a W1 5σ depth of 31 μJy. At the faint end, where redshifts are not available, we employ a luminosity function analysis to show that approximately 27% of all WISE extragalactic sources to a limit of 17.5 mag (31 μJy) are at high redshift, z > 1. The spatial distribution is investigated using two-point correlation functions and a 3D source density characterization at 5 Mpc and 20 Mpc scales. For angular distributions, we find that brighter and more massive sources are strongly clustered relative to fainter sources with lower mass; likewise, based on WISE colors, spheroidal galaxies have the strongest clustering, while late-type disk galaxies have the lowest clustering amplitudes. In three dimensions, we find a number of distinct groupings, often bridged by filaments and superstructures. Using special visualization tools, we map these structures, exploring how clustering may play a role with stellar mass and galaxy type.
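For readers unfamiliar with the two-point statistics mentioned above, the Landy-Szalay estimator ξ = (DD − 2DR + RR)/RR compares normalized pair counts in the data with those of an unclustered random catalogue. A small self-contained sketch (with toy positions in a unit square, not GAMA/WISE data) is:

```python
import numpy as np
from scipy.spatial.distance import pdist

def landy_szalay(data, randoms, bins):
    """Two-point correlation via the Landy-Szalay estimator,
    xi = (DD - 2*DR + RR) / RR, with pair counts normalised by the
    total numbers of data-data, data-random and random-random pairs."""
    nd, nr = len(data), len(randoms)
    dd, _ = np.histogram(pdist(data), bins=bins)
    rr, _ = np.histogram(pdist(randoms), bins=bins)
    cross = np.linalg.norm(data[:, None, :] - randoms[None, :, :], axis=-1)
    dr, _ = np.histogram(cross.ravel(), bins=bins)
    dd = dd / (nd * (nd - 1) / 2)
    dr = dr / (nd * nr)
    rr = rr / (nr * (nr - 1) / 2)
    return (dd - 2 * dr + rr) / rr

# Toy example: unclustered points should give xi ~ 0 in every bin.
rng = np.random.default_rng(0)
data = rng.random((500, 2))
randoms = rng.random((2000, 2))
print(np.round(landy_szalay(data, randoms, np.linspace(0.02, 0.3, 8)), 3))
```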
Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M Barry; Lee, Brian; Stinn, John; Worthen, Katherine
2010-01-01
Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities, and, using a centralized architecture, add new data sources and analytical capacity. Because this centralized model has proven to be difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining the ability to use a federated model based on secure web services architecture, with data stewardship remaining with the data provider. As a case study for this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse via a secure web service, and shared aggregate clinical effects and case counts data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a javascript plotting library. Two iterations of the NPDS web service were completed in 12 weeks. The visualization client, Quicksilver, was developed in four months. This implementation of web services combined with a visualization client represents incremental positive progress in transitioning national data sources like BioSense and NPDS to a federated data exchange model. Quicksilver effectively demonstrates how the use of secure web services in conjunction with a lightweight, rapidly deployed visualization client can easily integrate isolated data sources for biosurveillance.
PaaS for web applications with OpenShift Origin
NASA Astrophysics Data System (ADS)
Lossent, A.; Rodriguez Peon, A.; Wagner, A.
2017-10-01
The CERN Web Frameworks team has deployed OpenShift Origin to facilitate deployment of web applications and to improve efficiency in terms of computing resource usage. OpenShift leverages Docker containers and Kubernetes orchestration to provide a Platform-as-a-Service solution oriented toward web applications. We will review use cases and how OpenShift was integrated with other services such as source control, web site management and authentication services.
NASA Technical Reports Server (NTRS)
Schmidt, M.; Hasinger, G.; Gunn, J.; Schneider, D.; Burg, R.; Giacconi, R.; Lehmann, I.; MacKenty, J.; Truemper, J.; Zamorani, G.
1998-01-01
The ROSAT Deep Survey includes a complete sample of 50 X-ray sources with fluxes in the 0.5-2 keV band larger than 5.5 × 10⁻¹⁵ erg cm⁻² s⁻¹ in the Lockman field (Hasinger et al., Paper 1). We have obtained deep broad-band CCD images of the field and spectra of many optical objects near the positions of the X-ray sources. We define systematically the process leading to the optical identifications of the X-ray sources. For this purpose, we introduce five identification (ID) classes that characterize the process in each case. Among the 50 X-ray sources, we identify 39 AGNs, 3 groups of galaxies, 1 galaxy and 3 galactic stars. Four X-ray sources remain unidentified so far; two of these objects may have an unusually large ratio of X-ray to optical flux.
The Role of Virtual Reference in Library Web Site Design: A Qualitative Source for Usage Data
ERIC Educational Resources Information Center
Powers, Amanda Clay; Shedd, Julie; Hill, Clay
2011-01-01
Gathering qualitative information about usage behavior of library Web sites is a time-consuming process requiring the active participation of patron communities. Libraries that collect virtual reference transcripts, however, hold valuable data regarding how the library Web site is used that could benefit Web designers. An analysis of virtual…
Meeting the Needs of Travel Clientele: Tried and True Strategies That Work.
ERIC Educational Resources Information Center
Blessing, Kathy; Whitney, Cherine
This paper describes sources for meeting the information needs of travel clientele. Topics addressed include: (1) U.S. government Web sites; (2) collection development tools, including review journals, online bookstores, travel Web sites, and sources of point-by-point comparisons of guide books; (3) prominent guidebook series and publisher Web…
An Open-Source and Java-Technologies Approach to Web Applications
2003-09-01
program for any purpose (Freedom 0). • The freedom to study how the program works, and adapt it to individual needs (Freedom 1). Access to the source...manage information for many purposes. Today a key technology that allows developers to make Web applications is server-side programming to generate a
Katzman, G L; Morris, D; Lauman, J; Cochella, C; Goede, P; Harnsberger, H R
2001-06-01
To foster a community-supported evaluation process for open-source digital teaching file (DTF) development and maintenance. The mechanisms used to support this process will include standard web browsers, web servers, forum software, and custom additions to the forum software to potentially enable a mediated voting protocol. The web server will also serve as a focal point for beta and release software distribution, which is the desired end goal of this process. We foresee that www.mdtf.org will provide for widespread distribution of open source DTF software that will include function and interface design decisions from community participation on the website forums.
Howe, Emily; Simenstad, Charles A; Ogston, Andrea
2017-10-01
We measured the influence of landscape setting on estuarine food web connectivity in five macrotidal Pacific Northwest estuaries across a gradient of freshwater influence. We used stable isotopes (δ13C, δ15N, δ34S) in combination with a Bayesian mixing model to trace primary producer contributions to suspension- and deposit-feeding bivalve consumers (Mytilus trossulus and Macoma nasuta) transplanted into three estuarine vegetation zones: emergent marsh, mudflat, and eelgrass. Eelgrass includes both Japanese eelgrass (Zostera japonica) and native eelgrass (Zostera marina). Fluvial discharge and consumer feeding mode strongly influenced the strength and spatial scale of observed food web linkages, while season played a secondary role. Mussels displayed strong cross-ecosystem connectivity in all estuaries, with decreasing marine influence in the more fluvial estuaries. Mussel diets indicated homogenization of detrital sources within the water column of each estuary. In contrast, the diets of benthic deposit-feeding clams indicated stronger compartmentalization in food web connectivity, especially in the largest river delta, where clam diets were trophically disconnected from marsh sources of detritus. This suggests detritus deposition is patchy across space and less homogeneous than the suspended detritus pool. In addition to fluvial setting, other estuary-specific environmental drivers, such as marsh area or particle transport speed, influenced the degree of food web linkages across space and time, often accounting for unexpected patterns in food web connectivity. Transformations of the estuarine landscape that alter river hydrology or the availability of detritus sources can thus potentially disrupt natural food web connectivity at the landscape scale, especially for sedentary organisms, which cannot track their food sources through space. © 2017 by the Ecological Society of America.
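As a simplified illustration of how mixing models apportion diet, and not the multi-source Bayesian model used in the study, the sketch below solves a two-source, single-isotope linear mixing equation with a trophic enrichment factor; the end-member values are invented:

```python
def two_source_mixing(delta_consumer, delta_a, delta_b, tef=0.0):
    """Fraction of source A in a consumer's diet from one isotope:
    delta_consumer - tef = f*delta_a + (1-f)*delta_b  =>  solve for f.
    tef is the trophic enrichment factor between diet and tissue."""
    corrected = delta_consumer - tef
    f = (corrected - delta_b) / (delta_a - delta_b)
    return min(1.0, max(0.0, f))   # clamp to a physically meaningful range

# Toy d13C values (per mil): marsh detritus vs. phytoplankton end-members.
f_marsh = two_source_mixing(delta_consumer=-22.0,
                            delta_a=-26.0,   # marsh-derived detritus (assumed)
                            delta_b=-20.0,   # phytoplankton (assumed)
                            tef=0.5)
print(round(f_marsh, 2))   # fraction of marsh detritus in the diet
```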
Zhang, Hanyuan; Vieira Resende E Silva, Bruno; Cui, Juan
2018-05-01
Small RNA sequencing is the most widely used tool for microRNA (miRNA) discovery, and shows great potential for the efficient study of miRNA cross-species transport, i.e., by detecting the presence of exogenous miRNA sequences in the host species. Because of the increased appreciation of dietary miRNAs and their far-reaching implication in human health, research interests are currently growing with regard to exogenous miRNAs bioavailability, mechanisms of cross-species transport and miRNA function in cellular biological processes. In this article, we present microRNA Discovery (miRDis), a new small RNA sequencing data analysis pipeline for both endogenous and exogenous miRNA detection. Specifically, we developed and deployed a Web service that supports the annotation and expression profiling data of known host miRNAs and the detection of novel miRNAs, other noncoding RNAs, and the exogenous miRNAs from dietary species. As a proof-of-concept, we analyzed a set of human plasma sequencing data from a milk-feeding study where 225 human miRNAs were detected in the plasma samples and 44 show elevated expression after milk intake. By examining the bovine-specific sequences, data indicate that three bovine miRNAs (bta-miR-378, -181* and -150) are present in human plasma possibly because of the dietary uptake. Further evaluation based on different sets of public data demonstrates that miRDis outperforms other state-of-the-art tools in both detection and quantification of miRNA from either animal or plant sources. The miRDis Web server is available at: http://sbbi.unl.edu/miRDis/index.php.
Zheng, Ling-Ling; Xu, Wei-Lin; Liu, Shun; Sun, Wen-Ju; Li, Jun-Hao; Wu, Jie; Yang, Jian-Hua; Qu, Liang-Hu
2016-07-08
tRNA-derived small RNA fragments (tRFs) are one class of small non-coding RNAs derived from transfer RNAs (tRNAs). tRFs play important roles in cellular processes and are involved in multiple cancers. High-throughput small RNA (sRNA) sequencing experiments can detect all the cellular expressed sRNAs, including tRFs. However, distinguishing genuine tRFs from RNA fragments generated by random degradation remains a major challenge. In this study, we developed an integrated web-based computing system, tRF2Cancer, to accurately identify tRFs from sRNA deep-sequencing data and evaluate their expression in multiple cancers. The binomial test was introduced to evaluate whether reads from a small RNA-seq data set represent tRFs or degraded fragments. A classification method was then used to annotate the types of tRFs based on their sites of origin in pre-tRNA or mature tRNA. We applied the pipeline to analyze 10,991 data sets from 32 types of cancers and identified thousands of expressed tRFs. A tool called 'tRFinCancer' was developed to facilitate inspection of the expression of tRFs across different types of cancers. Another tool called 'tRFBrowser' shows both the sites of origin and the distribution of chemical modification sites in tRFs on their source tRNA. The tRF2Cancer web server is available at http://rna.sysu.edu.cn/tRFfinder/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
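The abstract names a binomial test for deciding whether reads stacked on a candidate fragment exceed what random degradation of the parent tRNA would produce. A hedged sketch of that kind of test (not the authors' exact parameterization; requires SciPy 1.7+ for binomtest) is:

```python
from scipy.stats import binomtest

def is_trf_candidate(reads_at_fragment, total_trna_reads,
                     expected_fraction, alpha=0.01):
    """Test whether reads mapping to one candidate fragment exceed the
    fraction expected from uniform (random) degradation of the tRNA.

    expected_fraction: fraction of the tRNA covered by the fragment,
    used as the null probability. All thresholds here are assumptions."""
    result = binomtest(reads_at_fragment, total_trna_reads,
                       expected_fraction, alternative="greater")
    return result.pvalue < alpha, result.pvalue

# Toy numbers: 400 of 1000 tRNA reads fall on a fragment spanning ~20%
# of the tRNA; under random degradation we would expect ~200.
print(is_trf_candidate(400, 1000, 0.20))
```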
Perrot, Vincent; Pastukhov, Mikhail V; Epov, Vladimir N; Husted, Søren; Donard, Olivier F X; Amouroux, David
2012-06-05
Mercury undergoes several transformations that influence its stable isotope composition during a number of environmental and biological processes. Measurements of Hg isotopic mass-dependent (MDF) and mass-independent fractionation (MIF) in food webs may therefore help to identify major sources and processes leading to significant bioaccumulation of methylmercury (MeHg). In this work, δ(13)C, δ(15)N, concentration of Hg species (MeHg, inorganic Hg), and stable isotopic composition of Hg were determined at different trophic levels of the remote and pristine Lake Baikal ecosystem. Muscle of seals and different fish as well as amphipods, zooplankton, and phytoplankton were specifically investigated. MDF during trophic transfer of MeHg leading to enrichment of heavier isotopes in the predators was clearly established by δ(202)Hg measurements in the pelagic prey-predator system (carnivorous sculpins and top-predator seals). Despite the low concentrations of Hg in the ecosystem, the pelagic food web reveals very high MIF Δ(199)Hg (3.15-6.65‰) in comparison to coastal fish (0.26-1.65‰) and most previous studies in aquatic organisms. Trophic transfer does not influence MIF signature since similar Δ(199)Hg was observed in sculpins (4.59 ± 0.55‰) and seal muscles (4.62 ± 0.60‰). The MIF is suggested to be mainly controlled by specific physical and biogeochemical characteristics of the water column. The higher level of MIF in pelagic fish of Lake Baikal is mainly due to the bioaccumulation of residual MeHg that is efficiently turned over and photodemethylated in deep oligotrophic and stationary (i.e., long residence time) freshwater columns.
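For reference, mass-independent fractionation of the odd Hg isotopes is conventionally reported as the deviation of the measured δ value from the value predicted by the kinetic mass-dependent fractionation law; for 199Hg the commonly used convention (a standard definition, not a result of this study) is

\Delta^{199}\mathrm{Hg} \approx \delta^{199}\mathrm{Hg} - 0.2520\,\delta^{202}\mathrm{Hg}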
Keynote Talk: Mining the Web 2.0 for Improved Image Search
NASA Astrophysics Data System (ADS)
Baeza-Yates, Ricardo
There are several semantic sources that can be found in the Web that are either explicit, e.g. Wikipedia, or implicit, e.g. derived from Web usage data. Most of them are related to user generated content (UGC) or what is called today the Web 2.0. In this talk we show how to use these sources of evidence in Flickr, such as tags, visual annotations or clicks, which represent the wisdom of crowds behind UGC, to improve image search. These results are the work of the multimedia retrieval team at Yahoo! Research Barcelona and they are already being used in Yahoo! image search. This work is part of a larger effort to produce a virtuous data feedback circuit based on the right combination of many different technologies to leverage the Web itself.
Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; M., M.
2016-06-01
Integrating the Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The research seeks to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between the Sensor Web and the SDI, and develop case studies such as hazard and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS, and metadata services on the SDI side and the 'Sensor Observation Service' (SOS), 'Sensor Planning Service' (SPS), 'Sensor Alert Service' (SAS), and the 'Web Notification Service' (WNS), a service that facilitates asynchronous message interchange between users and services and between two OGC-SWE services, on the Sensor Web side. In conclusion, it is important for geospatial studies to integrate SDI with the Sensor Web; the integration can be done by merging the common OGC interfaces of SDI and Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
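As a concrete illustration of matching the interfaces named above, both SDI services (e.g. WMS) and Sensor Web services (e.g. SOS) answer the same style of key-value GetCapabilities request; the sketch below issues one of each with the Python requests library. The endpoint URLs are placeholders, not real services, and would need to be replaced before running.

import requests

# Placeholder endpoints; substitute real SDI and SWE service URLs.
endpoints = {
    "WMS (SDI)": ("https://example.org/geoserver/wms",
                  {"service": "WMS", "request": "GetCapabilities"}),
    "SOS (Sensor Web)": ("https://example.org/sos",
                         {"service": "SOS", "request": "GetCapabilities"}),
}

for name, (url, params) in endpoints.items():
    # Both families of OGC services answer the same kind of capabilities query,
    # which is what makes a one-to-one correspondence between them feasible.
    resp = requests.get(url, params=params, timeout=30)
    print(name, resp.status_code, resp.headers.get("Content-Type"))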
Value of Information Web Application
2015-04-01
their understanding of VoI attributes (source reliability, information content, and latency). The VoI web application emulates many features of a...only when using the Firefox web browser on those computers (Internet Explorer was not viable due to unchangeable user settings). During testing, the
9 CFR 94.9 - Pork and pork products from regions where classical swine fever exists.
Code of Federal Regulations, 2014 CFR
2014-01-01
... declared free of classical swine fever is maintained on the APHIS Web site at http://www.aphis.usda.gov... which they must be cooked in hot oil (deep-fried) at a minimum of 104 °C for an additional 150 minutes...
E-Learning for Depth in the Semantic Web
ERIC Educational Resources Information Center
Shafrir, Uri; Etkind, Masha
2006-01-01
In this paper, we describe concept parsing algorithms, a novel semantic analysis methodology at the core of a new pedagogy that focuses learners' attention on deep comprehension of the conceptual content of learned material. Two new e-learning tools are described in some detail: interactive concept discovery learning and meaning equivalence…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
...., Charleston, SC 29403. To submit comments please see our Web site at: http://www.sac.usace.army.mil/?action... container traffic and cargo value. In 2009, the Charleston port district was ranked ninth (out of 200 deep...
NASA Technical Reports Server (NTRS)
Baldwin, John; Zendejas, Silvino; Gutheinz, Sandy; Borden, Chester; Wang, Yeou-Fang
2009-01-01
Mission and Assets Database (MADB) Version 1.0 is an SQL database system with a Web user interface to centralize information. The database stores flight project support resource requirements, view periods, antenna information, schedule, and forecast results for use in mid-range and long-term planning of Deep Space Network (DSN) assets.
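A minimal sketch of the kind of relational schema such a planning database might use; the table and column names below are hypothetical illustrations, not taken from MADB.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mission (
    mission_id   INTEGER PRIMARY KEY,
    name         TEXT NOT NULL
);
CREATE TABLE support_requirement (
    mission_id   INTEGER REFERENCES mission(mission_id),
    antenna      TEXT,            -- e.g. a 34 m or 70 m DSN antenna
    start_utc    TEXT,            -- requested view period start
    end_utc      TEXT,            -- requested view period end
    hours_needed REAL
);
""")
conn.execute("INSERT INTO mission VALUES (1, 'Example Deep Space Mission')")
conn.execute("INSERT INTO support_requirement VALUES (1, 'DSS-43', "
             "'2009-01-01T00:00', '2009-01-01T08:00', 6.0)")
for row in conn.execute(
        "SELECT m.name, r.antenna, r.hours_needed "
        "FROM mission m JOIN support_requirement r USING (mission_id)"):
    print(row)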
Dive and discover: Expeditions to the seafloor
NASA Astrophysics Data System (ADS)
Lawrence, Lisa Ayers
The Dive and Discover Web site is a virtual treasure chest of deep sea science and classroom resources. The goals of Dive and Discover are to engage students, teachers, and the general public in the excitement of ocean discovery through an interactive educational Web site. You can follow scientists on oceanographic research cruises by reading their daily cruise logs, viewing photos and video clips of the discoveries, and even e-mailing questions to the scientists and crew. WHOI has also included an “Educator's Companion” section with teaching strategies, activities, and assessments, making Dive and Discover an excellent resource for the classroom.
Dive and discover: Expeditions to the seafloor
NASA Astrophysics Data System (ADS)
Ayers Lawrence, Lisa
The Dive and Discover Web site is a virtual treasure chest of deep sea science and classroom resources. The goals of Dive and Discover are to engage students, teachers, and the general public in the excitement of ocean discovery through an interactive educational Web site. You can follow scientists on oceanographic research cruises by reading their daily cruise logs, viewing photos and video clips of the discoveries, and even e-mailing questions to the scientists and crew. WHOI has also included an "Educator's Companion" section with teaching strategies, activities, and assessments, making Dive and Discover an excellent resource for the classroom.
The Research on Automatic Construction of Domain Model Based on Deep Web Query Interfaces
NASA Astrophysics Data System (ADS)
JianPing, Gu
The integration of services should be transparent, meaning that users no longer face millions of individual Web services, do not need to care where the required data are stored, and do not need to learn how to obtain these data. In this paper, we analyze the uncertainty of schema matching and then propose a series of similarity measures. To reduce the cost of execution, we propose a type-based optimization method and a schema-matching pruning method for numeric data. Based on the above analysis, we propose an uncertain schema matching method. The experiments demonstrate the effectiveness and efficiency of our method.
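As a generic illustration of attribute-level similarity between two Deep Web query interfaces (not the authors' uncertain schema matching measures), a simple token-based Jaccard score can be computed as follows; the interface labels are hypothetical.

def jaccard(a: str, b: str) -> float:
    """Token-set Jaccard similarity between two attribute labels."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

# Hypothetical attribute labels from two flight-search query interfaces.
interface_a = ["departure city", "arrival city", "departure date"]
interface_b = ["from city", "to city", "date of departure"]

for attr_a in interface_a:
    best = max(interface_b, key=lambda attr_b: jaccard(attr_a, attr_b))
    print(f"{attr_a!r} -> {best!r} (score {jaccard(attr_a, best):.2f})")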
A novel web-enabled healthcare solution on health vault system.
Liao, Lingxia; Chen, Min; Rodrigues, Joel J P C; Lai, Xiaorong; Vuong, Son
2012-06-01
Complicated Electronic Medical Records (EMR) systems have created problems regarding easy implementation and interoperability for a Web-enabled healthcare solution, which is normally provided by an independent healthcare provider with limited IT knowledge and interest. An EMR system with a well-designed and user-friendly interface, such as the Microsoft HealthVault system, used as the back-end platform of a Web-enabled healthcare application, is one approach to dealing with these problems. This paper analyzes the patient-oriented Web-enabled healthcare service application as the new trend shifting healthcare delivery from hospital/clinic-centric to patient-centric, the current e-healthcare applications, and the main back-end EMR systems. Then, we present a novel Web-enabled healthcare solution based on the Microsoft HealthVault EMR system to meet customers' needs, such as low total cost, easy development and maintenance, and good interoperability. A sample system is given to show how the solution can be fulfilled, evaluated, and validated. We expect that this paper will provide a deep understanding of the available EMR systems, leading to insights for new solutions and approaches toward next-generation EMR systems.
Capturing Trust in Social Web Applications
NASA Astrophysics Data System (ADS)
O'Donovan, John
The Social Web constitutes a shift in information flow from the traditional Web. Previously, content was provided by the owners of a website, for consumption by the end-user. Nowadays, these websites are being replaced by Social Web applications which are frameworks for the publication of user-provided content. Traditionally, Web content could be `trusted' to some extent based on the site it originated from. Algorithms such as Google's PageRank were (and still are) used to compute the importance of a website, based on analysis of underlying link topology. In the Social Web, analysis of link topology merely tells us about the importance of the information framework which hosts the content. Consumers of information still need to know about the importance/reliability of the content they are reading, and therefore about the reliability of the producers of that content. Research into trust and reputation of the producers of information in the Social Web is still very much in its infancy. Every day, people are forced to make trusting decisions about strangers on the Web based on a very limited amount of information. For example, purchasing a product from an eBay seller with a `reputation' of 99%, downloading a file from a peer-to-peer application such as BitTorrent, or allowing Amazon.com to tell you what products you will like. Even something as simple as reading comments on a Web-blog requires the consumer to make a trusting decision about the quality of that information. In all of these example cases, and indeed throughout the Social Web, there is a pressing demand for increased information upon which we can make trusting decisions. This chapter examines the diversity of sources from which trust information can be harnessed within Social Web applications and discusses a high level classification of those sources. Three different techniques for harnessing and using trust from a range of sources are presented. These techniques are deployed in two sample Social Web applications: a recommender system and an online auction. In all cases, it is shown that harnessing an increased amount of information upon which to make trust decisions greatly enhances the user experience with the Social Web application.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-04
... Web site. E-mail: Comments may be sent by electronic mail (e-mail) to a-and-r[email protected] otherwise protected through http://www.regulations.gov or e-mail. The http://www.regulations.gov Web site is... Web site: http://www.epa.gov/airquality/combustion . Please refer to this Web site to confirm the date...
Methylation of inorganic mercury in polar marine waters
NASA Astrophysics Data System (ADS)
Lehnherr, Igor; St. Louis, Vincent L.; Hintelmann, Holger; Kirk, Jane L.
2011-05-01
Monomethylmercury is a neurotoxin that accumulates in marine organisms, with serious implications for human health. The toxin is of particular concern to northern Inuit peoples, for example, whose traditional diets are composed primarily of marine mammals and fish. The ultimate source of monomethylmercury to marine organisms has remained uncertain, although various potential sources have been proposed, including export from coastal and deep-sea sediments and major river systems, atmospheric deposition and water-column production. Here, we report results from incubation experiments in which we added isotopically labelled inorganic mercury and monomethylmercury to seawater samples collected from a range of sites in the Canadian Arctic Archipelago. Monomethylmercury formed from the methylation of inorganic mercury in all samples. Demethylation of monomethylmercury was also observed in water from all sites. We determined steady-state concentrations of monomethylmercury in marine waters by incorporating the rate constants for monomethylmercury formation and degradation derived from these experiments into a numerical model. We estimate that the conversion of inorganic mercury to monomethylmercury in the water column accounts for around 47% (+/-62%, standard deviation) of the monomethylmercury present in polar marine waters, with site-to-site differences in inorganic mercury and monomethylmercury levels accounting for most of the variability. We suggest that water-column methylation of inorganic mercury is a significant source of monomethylmercury in pelagic marine food webs in the Arctic, and possibly in the world's oceans in general.
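A hedged sketch of the steady-state reasoning described above: if methylation and demethylation are treated as competing first-order processes with rate constants k_m and k_d, the implied steady-state monomethylmercury concentration is

[\mathrm{MMHg}]_{ss} \approx \frac{k_m}{k_d}\,[\mathrm{Hg(II)}]

This first-order form is a common simplification; the authors' numerical model may include additional transport and loss terms.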
Natural Products from Deep-Sea-Derived Fungi – A New Source of Novel Bioactive Compounds?
Daletos, Georgios; Ebrahim, Weaam; Ancheeva, Elena; El-Neketi, Mona; Song, Weiguo; Lin, Wenhan; Proksch, Peter
2018-01-01
Over the last two decades, deep-sea-derived fungi have come to be considered a new source of pharmacologically active secondary metabolites for drug discovery, mainly based on the underlying assumption that the uniqueness of the deep sea will give rise to equally unprecedented natural products. Indeed, up to now over 200 new metabolites have been identified from deep-sea fungi, which is in support of the statement made above. This review summarizes the new and/or bioactive compounds reported from deep-sea-derived fungi in the last six years (2010 - October 2016) and critically evaluates whether the data published so far really support the notion that these fungi are a promising source of new bioactive chemical entities. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Vives, Ingrid; Grimalt, Joan O; Ventura, Marc; Catalan, Jordi
2005-06-01
We investigated the contents of polycyclic aromatic hydrocarbons (PAHs) in the food web organisms included in the diet of brown trout from a remote mountain lake. The preferential habitat and trophic level of the component species have been assessed from the signature of stable isotopes (delta13C and delta15N). Subsequently, the patterns of accumulation and transformation of these hydrocarbons in the food chain have been elucidated. Most of the food web organisms exhibit PAH distributions largely dominated by phenanthrene, which agrees with its predominance in atmospheric deposition, water, and suspended particles. Total PAH levels are higher in the organisms from the littoral habitat than from the deep sediments or the pelagic water column. However, organisms from deep sediments exhibit higher proportions of higher molecular weight PAH than those in other lake areas. Distinct organisms exhibit specific features in their relative PAH composition that point to different capacities for uptake and metabolic degradation. Brown trout show an elevated capacity for metabolic degradation because they have lower PAH concentrations than food and they are enriched strongly in lower molecular weight compounds. The PAH levels in trout highly depend on organisms living in the littoral areas. Fish exposure to PAH, therefore, may vary from lake to lake according to the relative contribution of littoral organisms to their diet.
Material and physical model for evaluation of deep brain activity contribution to EEG recordings
NASA Astrophysics Data System (ADS)
Ye, Yan; Li, Xiaoping; Wu, Tiecheng; Li, Zhe; Xie, Wenwen
2015-12-01
Deep brain activity is conventionally recorded with surgical implantation of electrodes. During the neurosurgery, brain tissue damage and the consequent side effects to patients are inevitably incurred. In order to eliminate undesired risks, we propose that deep brain activity should be measured using the noninvasive scalp electroencephalography (EEG) technique. However, the deeper the neuronal activity is located, the noisier the corresponding scalp EEG signals are. Thus, the present study aims to evaluate whether deep brain activity could be observed from EEG recordings. In the experiment, a three-layer cylindrical head model was constructed to mimic a human head. A single dipole source (sine wave, 10 Hz, altering amplitudes) was embedded inside the model to simulate neuronal activity. When the dipole source was activated, surface potential was measured via electrodes attached on the top surface of the model and raw data were recorded for signal analysis. Results show that the dipole source activity positioned at 66 mm depth in the model, equivalent to the depth of deep brain structures, is clearly observed from surface potential recordings. Therefore, it is highly possible that deep brain activity could be observed from EEG recordings and deep brain activity could be measured using the noninvasive scalp EEG technique.
Light penetration structures the deep acoustic scattering layers in the global ocean.
Aksnes, Dag L; Røstad, Anders; Kaartvedt, Stein; Martinez, Udane; Duarte, Carlos M; Irigoien, Xabier
2017-05-01
The deep scattering layer (DSL) is a ubiquitous acoustic signature found across all oceans and arguably the dominant feature structuring the pelagic open ocean ecosystem. It is formed by mesopelagic fishes and pelagic invertebrates. The DSL animals are an important food source for marine megafauna and contribute to the biological carbon pump through the active flux of organic carbon transported in their daily vertical migrations. They occupy depths from 200 to 1000 m at daytime and migrate to a varying degree into surface waters at nighttime. Their daytime depth, which determines the migration amplitude, varies across the global ocean in concert with water mass properties, in particular the oxygen regime, but the causal underpinning of these correlations has been unclear. We present evidence that the broad variability in the oceanic DSL daytime depth observed during the Malaspina 2010 Circumnavigation Expedition is governed by variation in light penetration. We find that the DSL depth distribution conforms to a common optical depth layer across the global ocean and that a correlation between dissolved oxygen and light penetration provides a parsimonious explanation for the association of shallow DSL distributions with hypoxic waters. In enhancing understanding of this phenomenon, our results should improve the ability to predict and model the dynamics of one of the largest animal biomass components on earth, with key roles in the oceanic biological carbon pump and food web.
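The idea of a common optical depth layer can be written down with a first-order Beer-Lambert sketch (an illustration only, not the paper's statistical analysis): if downwelling irradiance decays as E(z) = E_0 e^{-K_d z}, then the depth of a fixed isolume E_iso is

z_{\mathrm{DSL}} \approx \frac{1}{K_d}\,\ln\frac{E_0}{E_{\mathrm{iso}}}

so clearer water (smaller K_d) places the daytime layer deeper, consistent with the light-penetration control described above.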
On2broker: Semantic-Based Access to Information Sources at the WWW.
ERIC Educational Resources Information Center
Fensel, Dieter; Angele, Jurgen; Decker, Stefan; Erdmann, Michael; Schnurr, Hans-Peter; Staab, Steffen; Studer, Rudi; Witt, Andreas
On2broker provides brokering services to improve access to heterogeneous, distributed, and semistructured information sources as they are presented in the World Wide Web. It relies on the use of ontologies to make explicit the semantics of Web pages. This paper discusses the general architecture and main components (i.e., query engine, information…
Source Evaluation of Domain Experts and Novices during Web Search
ERIC Educational Resources Information Center
Brand-Gruwel, S.; Kammerer, Y.; van Meeuwen, L.; van Gog, T.
2017-01-01
Nowadays, almost everyone uses the World Wide Web (WWW) to search for information of any kind. In education, students frequently use the WWW for selecting information to accomplish assignments such as writing an essay or preparing a presentation. The evaluation of sources and information is an important sub-skill in this process. But many students…
NASA Astrophysics Data System (ADS)
Jean, M. M.; Falloon, T.; Gillis, K. M.
2014-12-01
We have acquired high-precision Pb-isotopic signatures of primitive lithologies (basalts/gabbros) recovered from IODP Expedition 345. The Hess Deep Rift, located in the vicinity of the Galapagos triple junction (Cocos, Nazca, and Pacific), is viewed as one of the best-studied tectonic windows into fast-spreading crust because it exposes a relatively young (<1.5 Ma) cross section of oceanic crust. This allows for (1) characterization of the mantle source(s) at Hess Deep, (2) insight into the extent of isotopic homogeneity or heterogeneity in the area, and (3) constraints on the relative contributions from the intruding Cocos-Nazca spreading center. The observed Pb-isotopic variation at Hess Deep covers almost the entire range of EPR MORB (10°N to -5°S). Hess Deep samples range over 208Pb/204Pb (37.3-38.25), 207Pb/204Pb (15.47-15.58), and 206Pb/204Pb (17.69-18.91). These compositions suggest that this part of the Hess Deep mantle is no more isotopically homogeneous than EPR mantle. Two distinct arrays are also observed: 208Pb-enriched (r2=0.985; n=30) and 208Pb-depleted (r2=0.988; n=6). The 208Pb/204Pb isotopes indicate that the Pb source for some of the samples at Hess Deep had very low Th/U ratios, whereas other areas around the Galapagos microplate seem to have more "normal" ratios. These trends are less apparent when viewed with 207Pb isotopes. Instead, the majority of basalts and gabbros follow the NHRL; however, at the depleted end of this array a negative excursion to more enriched compositions is observed. This negative but linear trend could signify an alteration trend or mixing with an EMI-type mantle source, yet this mixing is not observed with 208Pb. This trend is also observed at Pito Deep, which has similar origins to Hess Deep (Barker et al., 2008; Pollack et al., 2009). The Galapagos region has been considered a testing ground for mixing of HIMU, Enriched Mantle, and Depleted Mantle reservoirs (e.g., Schilling et al., 2002). According to our data, however, an EPR component must also be considered. We model Hess Deep Pb-isotopes as a 4-component system. EPR-DM-EM comprise a 'local' reservoir, but the majority of samples contain a mixture of modified-HIMU-EM-EPR, a product of incoming plume material entrained within the Galapagos Spreading Center.
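As background for the mixing arguments above, binary mixing of Pb isotope ratios is weighted by the Pb concentration of each endmember; a generic two-endmember form (an illustration, not the authors' four-component model) is

R_{\mathrm{mix}} = \frac{f\,C_{1}R_{1} + (1-f)\,C_{2}R_{2}}{f\,C_{1} + (1-f)\,C_{2}}

where f is the mass fraction of endmember 1, C_i the Pb concentration and R_i the isotope ratio (e.g. 206Pb/204Pb) of endmember i; the curvature of the resulting mixing array depends on the concentration contrast C_1/C_2.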
NASA Astrophysics Data System (ADS)
Pulsani, B. R.
2017-11-01
The Tank Information System is a web application which provides comprehensive information about the minor irrigation tanks of Telangana State. As part of the program, a web mapping application using Flex and ArcGIS Server was developed to make the data available to the public. As Flex became outdated over time, a migration of the client interface to the latest JavaScript-based technologies was carried out. Initially, the Flex-based application was migrated to the ArcGIS JavaScript API using the Dojo Toolkit. Both client applications used published services from ArcGIS Server. To check the migration pattern from proprietary to open source, the JavaScript-based ArcGIS application was later migrated to OpenLayers and the Dojo Toolkit, using published services from GeoServer. The migration pattern observed in the study especially emphasizes the use of the Dojo Toolkit and a PostgreSQL database for ArcGIS Server so that migration to open source can be performed effortlessly. The current application provides a case study which could assist organizations in migrating their proprietary ArcGIS web applications to open source. Furthermore, the study reveals the cost benefits of adopting open source over commercial software.
Planning of electroporation-based treatments using Web-based treatment-planning software.
Pavliha, Denis; Kos, Bor; Marčan, Marija; Zupanič, Anže; Serša, Gregor; Miklavčič, Damijan
2013-11-01
Electroporation-based treatment combining high-voltage electric pulses and poorly permeant cytotoxic drugs, i.e., electrochemotherapy (ECT), is currently used for treating superficial tumor nodules by following standard operating procedures. Besides ECT, another electroporation-based treatment, nonthermal irreversible electroporation (N-TIRE), is also efficient at ablating deep-seated tumors. To perform ECT or N-TIRE of deep-seated tumors, following standard operating procedures is not sufficient and patient-specific treatment planning is required for successful treatment. Treatment planning is required because of the use of individual long-needle electrodes and the diverse shape, size and location of deep-seated tumors. Many institutions that already perform ECT of superficial metastases could benefit from treatment-planning software that would enable the preparation of patient-specific treatment plans. To this end, we have developed Web-based treatment-planning software for planning electroporation-based treatments that does not require prior engineering knowledge from the user (e.g., the clinician). The software includes algorithms for automatic tissue segmentation and, after segmentation, generation of a 3D model of the tissue. The procedure allows the user to define how the electrodes will be inserted. Finally, the electric field distribution is computed, the positions of the electrodes and the voltage to be applied are optimized using the 3D model, and a downloadable treatment plan is made available to the user.
2013-01-01
Background Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results Here we have developed a web application called SVAw (Surrogate variable analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open source Bioconductor SVA package. In our software, we have extended the SVA program functionality in three aspects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcome for the user. Conclusions SVAw is a freely accessible web server solution for the surrogate variable analysis of high-throughput datasets and facilitates removing unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both the web and standalone applications and the instructions for installation can be downloaded from our web site. PMID:23497726
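A minimal sketch of the adjustment idea, assuming the surrogate variables have already been estimated: each gene is regressed on the surrogate variables and the residuals are kept. This illustrates only the correction step, not the SVA estimation algorithm or the SVAw workflow; all values below are synthetic.

import numpy as np

rng = np.random.default_rng(0)
n_genes, n_samples = 1000, 20

expr = rng.normal(size=(n_genes, n_samples))          # genes x samples matrix
sv = rng.normal(size=(n_samples, 2))                  # assumed (already estimated) surrogate variables

# Regress each gene on the surrogate variables (plus intercept) and keep residuals.
design = np.column_stack([np.ones(n_samples), sv])    # samples x (1 + n_sv)
coef, *_ = np.linalg.lstsq(design, expr.T, rcond=None)
adjusted = expr - (design @ coef).T                   # expression with SV effects removed

print(adjusted.shape)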
NASA Astrophysics Data System (ADS)
Riesselman, C. R.; Scher, H.; Robinson, M. M.; Dowsett, H. J.; Bell, D. B.
2012-12-01
Earth's future climate may resemble the mid-Piacenzian Age of the Pliocene, a time when global temperatures were sustained within the range predicted for the coming century. Surface and deep water temperature reconstructions and coupled ocean-atmosphere general circulation model simulations by the USGS PRISM (Pliocene Research Interpretation and Synoptic Mapping) Group identify a dramatic North Atlantic warm surface temperature anomaly in the mid-Piacenzian (3.264 - 3.025 Ma), accompanied by increased evaporation. The anomaly is detected in deep waters at 46°S, suggesting enhanced meridional overturning circulation and more southerly penetration of North Atlantic Deep Water (NADW) during the PRISM interval. However, deep water temperature proxies are not diagnostic of water mass and some coupled model simulations predict transient decreases in NADW production in the 21st century, presenting a contrasting picture of future climate. We present a new multi-proxy investigation of Atlantic deep ocean circulation during the warm mid-Piacenzian, using δ13C of benthic foraminifera as a proxy for water mass age and the neodymium isotopic composition of fossil fish teeth (ɛNd) as a proxy for water mass source and mixing. This reconstruction utilizes both new and previously published data from DSDP and ODP cores along equatorial (Ceara Rise), southern mid-latitude (Walvis Ridge), and south Atlantic (Meteor Rise/Agulhas Ridge) depth transects. Additional end-member sites in the regions of modern north Atlantic and Southern Ocean deep water formation provide a Pliocene baseline for comparison. δ13C throughout the Atlantic basin is remarkably homogeneous during the PRISM interval. δ13C values of Cibicidoides spp. and C. wuellerstorfi largely range between 0‰ and 1‰ at North Atlantic, shallow equatorial, southern mid-latitude, and south Atlantic sites with water depths from 2000-4700 m; both depth and latitudinal gradients are generally small (~0.3‰). However, equatorial Ceara Rise sites below 3500 m diverge, with δ13C values as low as -1.2‰ at ~3.15 Ma. The uniquely negative δ13C values at deep Ceara Rise sites suggest that, during PRISM warmth, the oldest Atlantic deep waters may have resided along the modern deep western boundary current, while younger deep water masses were concentrated to the south and east. In the modern Atlantic, the ɛNd value of southern-sourced waters is more radiogenic than that of northern-sourced waters, providing a complementary means to characterize Pliocene water mass geometry. ɛNd values from shallow (2500 m) and deep (4700 m) Walvis Ridge sites average -10 and -11 respectively; the shallow site is somewhat more radiogenic than published coretop ɛNd (-12), suggesting enhanced Pliocene influence of southern-sourced water masses. Ongoing analytical efforts will fingerprint Piacenzian ɛNd from north and south deep water source regions and will target additional depth transect ɛNd, allowing us to investigate the possibility that "older" carbon isotopic signatures at western equatorial sites reflect entrainment of proto-NADW while "younger" signatures at southern and eastern sites reflect the influence of southern-sourced deep water.
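For reference, the neodymium isotope notation used above is conventionally defined relative to the chondritic uniform reservoir (CHUR); this is the standard definition, not a value from this study:

\varepsilon_{\mathrm{Nd}} = \left[\frac{(^{143}\mathrm{Nd}/^{144}\mathrm{Nd})_{\mathrm{sample}}}{(^{143}\mathrm{Nd}/^{144}\mathrm{Nd})_{\mathrm{CHUR}}} - 1\right] \times 10^{4}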
Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M. Barry; Lee, Brian; Stinn, John; Worthen, Katherine
2010-01-01
Objectives Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities, and, using a centralized architecture, add new data sources and analytical capacity. Because this centralized model has proven to be difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining the ability to use a federated model based on secure web services architecture, with data stewardship remaining with the data provider. Methods As a case study for this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse (the National Poison Data System, NPDS) via a secure web service, and shared aggregate clinical effects and case counts data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a JavaScript plotting library. Results Two iterations of the NPDS web service were completed in 12 weeks. The visualization client, Quicksilver, was developed in four months. Discussion This implementation of web services combined with a visualization client represents incremental positive progress in transitioning national data sources like BioSense and NPDS to a federated data exchange model. Conclusion Quicksilver effectively demonstrates how the use of secure web services in conjunction with a lightweight, rapidly deployed visualization client can easily integrate isolated data sources for biosurveillance. PMID:23569581
REMORA: a pilot in the ocean of BioMoby web-services.
Carrere, Sébastien; Gouzy, Jérôme
2006-04-01
Emerging web-services technology allows interoperability between multiple distributed architectures. Here, we present REMORA, a web server implemented according to the BioMoby web-service specifications, providing life science researchers with an easy-to-use workflow generator and launcher, a repository of predefined workflows and a survey system. Jerome.Gouzy@toulouse.inra.fr The REMORA web server is freely available at http://bioinfo.genopole-toulouse.prd.fr/remora, sources are available upon request from the authors.
A Query Integrator and Manager for the Query Web
Brinkley, James F.; Detwiler, Landon T.
2012-01-01
We introduce two concepts: the Query Web as a layer of interconnected queries over the document web and the semantic web, and a Query Web Integrator and Manager (QI) that enables the Query Web to evolve. QI permits users to write, save and reuse queries over any web accessible source, including other queries saved in other installations of QI. The saved queries may be in any language (e.g. SPARQL, XQuery); the only condition for interconnection is that the queries return their results in some form of XML. This condition allows queries to chain off each other, and to be written in whatever language is appropriate for the task. We illustrate the potential use of QI for several biomedical use cases, including ontology view generation using a combination of graph-based and logical approaches, value set generation for clinical data management, image annotation using terminology obtained from an ontology web service, ontology-driven brain imaging data integration, small-scale clinical data integration, and wider-scale clinical data integration. Such use cases illustrate the current range of applications of QI and lead us to speculate about the potential evolution from smaller groups of interconnected queries into a larger query network that layers over the document and semantic web. The resulting Query Web could greatly aid researchers and others who now have to manually navigate through multiple information sources in order to answer specific questions. PMID:22531831
NASA Astrophysics Data System (ADS)
Lares, M.
The presence of institutions on the internet is nowadays very important to strengthen communication channels, both internal and with the general public. The Córdoba Observatory has several web portals, including the official web page, a blog and presence on several social networks. These are among the fundamental pillars of outreach activities, and serve as a communication channel for events and scientific, academic, and outreach news. They are also a source of information for the staff, as well as of data related to the Observatory's internal organization and scientific production. Several statistical studies are presented, based on data taken from the visits to the official web pages. I comment on some aspects of the role of web pages as a source of consultation and as a quick response to information needs. FULL TEXT IN SPANISH
Fiber-based tunable repetition rate source for deep tissue two-photon fluorescence microscopy.
Charan, Kriti; Li, Bo; Wang, Mengran; Lin, Charles P; Xu, Chris
2018-05-01
Deep tissue multiphoton imaging requires high peak power to enhance signal and low average power to prevent thermal damage. Both goals can be advantageously achieved through laser repetition rate tuning instead of simply adjusting the average power. We show that the ideal repetition rate for deep two-photon imaging in the mouse brain is between 1 and 10 MHz, and we present a fiber-based source with an arbitrarily tunable repetition rate within this range. The performance of the new source is compared to a mode-locked Ti:Sapphire (Ti:S) laser for in vivo imaging of mouse brain vasculature. At 2.5 MHz, the fiber source requires 5.1 times less average power to obtain the same signal as a standard Ti:S laser operating at 80 MHz.
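The benefit of repetition-rate tuning follows from the textbook scaling of two-photon excited fluorescence with pulse parameters (a standard relation, not a result of this paper): for pulse duration τ and repetition rate f,

S_{2P} \propto \frac{P_{\mathrm{avg}}^{2}}{f\,\tau}

so the average power required for a fixed signal scales as the square root of f; lowering f from 80 MHz to 2.5 MHz predicts a reduction of roughly sqrt(80/2.5) ≈ 5.7, broadly consistent with the factor of 5.1 reported above.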
NASA Astrophysics Data System (ADS)
Cardellini, Carlo; Frigeri, Alessandro; Lehnert, Kerstin; Ash, Jason; McCormick, Brendan; Chiodini, Giovanni; Fischer, Tobias; Cottrell, Elizabeth
2015-04-01
The release of volatiles from the Earth's interior takes place in both volcanic and non-volcanic areas of the planet. The comprehension of such a complex process and the improvement of current estimates of global carbon emissions will greatly benefit from the integration of geochemical, petrological and volcanological data. At present, major online data repositories relevant to studies of degassing are not linked and interoperable. In the framework of the Deep Earth Carbon Degassing (DECADE) initiative of the Deep Carbon Observatory (DCO), we are developing interoperability between three data systems that will make their data accessible via the DECADE portal: (1) the Smithsonian Institution's Global Volcanism Program database (VOTW) of volcanic activity data, (2) EarthChem databases for geochemical and geochronological data of rocks and melt inclusions, and (3) the MaGa database (Mapping Gas emissions), which contains compositional and flux data of gases released at volcanic and non-volcanic degassing sites. The DECADE web portal will create a powerful search engine over these databases from a single entry point and will return comprehensive multi-component datasets. A user will be able, for example, to obtain data relating to the compositions of emitted gases, the compositions and ages of the erupted products, and coincident activity for a specific volcano. This level of capability requires a complete synergy between the databases, including availability of standards-based web services (WMS, WFS) at all data systems. Data and metadata can thus be extracted from each system without interfering with each database's local schema or being replicated to achieve integration at the DECADE web portal. The DECADE portal will enable new synoptic perspectives on the Earth degassing process, allowing exploration of Earth degassing related datasets over previously unexplored spatial or temporal ranges.
Starvation and recovery in the deep-sea methanotroph Methyloprofundus sedimenti.
Tavormina, Patricia L; Kellermann, Matthias Y; Antony, Chakkiath Paul; Tocheva, Elitza I; Dalleska, Nathan F; Jensen, Ashley J; Valentine, David L; Hinrichs, Kai-Uwe; Jensen, Grant J; Dubilier, Nicole; Orphan, Victoria J
2017-01-01
In the deep ocean, the conversion of methane into derived carbon and energy drives the establishment of diverse faunal communities. Yet specific biological mechanisms underlying the introduction of methane-derived carbon into the food web remain poorly described, due to a lack of cultured representative deep-sea methanotrophic prokaryotes. Here, the response of the deep-sea aerobic methanotroph Methyloprofundus sedimenti to methane starvation and recovery was characterized. By combining lipid analysis, RNA analysis, and electron cryotomography, it was shown that M. sedimenti undergoes discrete cellular shifts in response to methane starvation, including changes in headgroup-specific fatty acid saturation levels, and reductions in cytoplasmic storage granules. Methane starvation is associated with a significant increase in the abundance of gene transcripts pertinent to methane oxidation. Methane reintroduction to starved cells stimulates a rapid, transient extracellular accumulation of methanol, revealing a way in which methane-derived carbon may be routed to community members. This study provides new understanding of methanotrophic responses to methane starvation and recovery, and lays the initial groundwork to develop Methyloprofundus as a model chemosynthesizing bacterium from the deep sea. © 2016 John Wiley & Sons Ltd.
Gulf of Mexico Deep-Sea Coral Ecosystem Studies, 2008-2011
Kellogg, Christina A.
2009-01-01
Most people are familiar with tropical coral reefs, located in warm, well-illuminated, shallow waters. However, corals also exist hundreds and even thousands of meters below the ocean surface, where it is cold and completely dark. These deep-sea corals, also known as cold-water corals, have become a topic of interest due to conservation concerns over the impacts of trawling, exploration for oil and gas, and climate change. Although the existence of these corals has been known since the 1800s, our understanding of their distribution, ecology, and biology is limited due to the technical difficulties of conducting deep-sea research. DISCOVRE (DIversity, Systematics, and COnnectivity of Vulnerable Reef Ecosystems) is a new U.S. Geological Survey (USGS) program focused on deep-water coral ecosystems in the Gulf of Mexico. This integrated, multidisciplinary, international effort investigates a variety of topics related to unique and fragile deep-sea coral ecosystems from the microscopic level to the ecosystem level, including components of microbiology, population genetics, paleoecology, food webs, taxonomy, community ecology, physical oceanography, and mapping.
ERIC Educational Resources Information Center
Sathick, Javubar; Venkat, Jaya
2015-01-01
Mining social web data is a challenging task and finding user interest for personalized and non-personalized recommendation systems is another important task. Knowledge sharing among web users has become crucial in determining usage of web data and personalizing content in various social websites as per the user's wish. This paper aims to design a…
The GISMO two-millimeter deep field in GOODS-N
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staguhn, Johannes G.; Kovács, Attila; Arendt, Richard G.
2014-07-20
We present deep continuum observations using the GISMO camera at a wavelength of 2 mm centered on the Hubble Deep Field in the GOODS-N field. These are the first deep field observations ever obtained at this wavelength. The 1σ sensitivity in the innermost ∼4' of the 7' diameter map is ∼135 μJy beam⁻¹, a factor of three higher in flux/beam sensitivity than the deepest available SCUBA 850 μm observations, and almost a factor of four higher in flux/beam sensitivity than the combined MAMBO/AzTEC 1.2 mm observations of this region. Our source extraction algorithm identifies 12 sources directly, and another 3 through correlation with known sources at 1.2 mm and 850 μm. Five of the directly detected GISMO sources have counterparts in the MAMBO/AzTEC catalog, and four of those also have SCUBA counterparts. HDF850.1, one of the first blank-field detected submillimeter galaxies, is now detected at 2 mm. The median redshift of all sources with counterparts of known redshifts is z̃ = 2.91 ± 0.94. Statistically, the detections are most likely real for five of the seven 2 mm sources without shorter wavelength counterparts, while the probability for none of them being real is negligible.
Tracing Mississippi River influences in estuarine food webs of coastal Louisiana.
Wissel, Björn; Fry, Brian
2005-08-01
The Breton Sound estuary in southern Louisiana receives large amounts of Mississippi River water via a controlled diversion structure at the upstream end of the estuary. We used stable isotopes to trace spatial and seasonal responses of the downstream food web to winter and spring introductions of river water. Analysis of delta13C, delta15N, and delta34S in the common local consumers such as grass shrimp (Palaemonetes sp.), barnacles (Balanus sp.), and small plankton-feeding fish (bay anchovies, Anchoa mitchilli) showed that the diversion was associated with two of the five major source regimes that were supporting food webs: a river regime near the diversion and a river-influenced productive marsh regime farther away from the diversion. Mixing models identified a third river-influenced source regime at the marine end of the estuary where major natural discharge from the Bird's Foot Delta wraps around into estuarine waters. The remaining two source regimes represented typical estuarine conditions: local freshwater sources especially from precipitation and a brackish source regime representing higher salinity marine influences. Overall, the Mississippi River diversion accounted for 75% of food web support in the upper estuary and 25% in the middle estuary, with influence strongest along known flow pathways and closest to the diversion. Isotopes also traced seasonal changes in river contributions, and indicated increased plant community productivity along the major flow path of diversion water. In the Breton Sound estuary, bottom-up forcing of food webs is strongly linked to river introductions and discharge, occurring in spatial and temporal patterns predictable from known river input regimes and known hydrologic circulation patterns.
Reducing Methylmercury Accumulation in the Food Webs of San Francisco Bay and Its Local Watersheds
Davis, J.A.; Looker, R.E.; Yee, D.; Marvin-DiPasquale, M.; Grenier, J.L.; Austin, C.M.; McKee, L.J.; Greenfield, B.K.; Brodberg, R.; Blum, J.D.
2013-01-01
San Francisco Bay (California, USA) and its local watersheds present an interesting case study in estuarine mercury (Hg) contamination. This review focuses on the most promising avenues for attempting to reduce methylmercury (MeHg) contamination in Bay Area aquatic food webs and identifying the scientific information that is most urgently needed to support these efforts. Concern for human exposure to MeHg in the region has led to advisories for consumption of sport fish. Striped bass from the Bay have the highest average Hg concentration measured for this species in USA estuaries, and this degree of contamination has been constant for the past 40 years. Similarly, largemouth bass in some Bay Area reservoirs have some of the highest Hg concentrations observed in the entire US. Bay Area wildlife, particularly birds, face potential impacts to reproduction based on Hg concentrations in the tissues of several Bay species. Source control of Hg is one of the primary possible approaches for reducing MeHg accumulation in Bay Area aquatic food webs. Recent findings (particularly Hg isotope measurements) indicate that the decades-long residence time of particle-associated Hg in the Bay is sufficient to allow significant conversion of even the insoluble forms of Hg into MeHg. Past inputs have been thoroughly mixed throughout this shallow and dynamic estuary. The large pool of Hg already present in the ecosystem dominates the fraction converted to MeHg and accumulating in the food web. Consequently, decreasing external Hg inputs can be expected to reduce MeHg in the food web, but it will likely take many decades to centuries before those reductions are achieved. Extensive efforts to reduce loads from the largest Hg mining source (the historic New Almaden mining district) are underway. Hg is spread widely across the urban landscape, but there are a number of key sources, source areas, and pathways that provide opportunities to capture larger quantities of Hg and reduce loads from urban runoff. Atmospheric deposition is a lower priority for source control in the Bay Area due to a combination of a lack of major local sources and Hg isotope data indicating it is a secondary contributor to food web MeHg. Internal net production of MeHg is the dominant source of MeHg that enters the food web. Controlling internal net production is the second primary management approach, and has the potential to reduce food web MeHg more effectively and within a much shorter time-frame. MeHg cycling and control opportunities vary by habitat. Controlling net MeHg production and accumulation in the food web of upstream reservoirs and ponds is very promising due to the many features of these ecosystems that can be manipulated. The most feasible control options in tidal marshes relate to the design of flow patterns and subhabitats in restoration projects. Options for controlling MeHg production in open Bay habitat are limited due primarily to the highly dispersed distribution of Hg throughout the ecosystem. Other changes in these habitats may also have a large influence on food web MeHg, including temperature changes due to global warming, sea level rise, food web alterations due to introduced species and other causes, and changes in sediment supply. Other options for reducing or mitigating exposure and risk include controlling bioaccumulation, cleanup of contaminated sites, and reducing other factors (e.g., habitat availability) that limit at risk wildlife populations. PMID:23122771
THE VLA-COSMOS SURVEY. IV. DEEP DATA AND JOINT CATALOG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schinnerer, E.; Sargent, M. T.; Bondi, M.
2010-06-15
In the context of the VLA-COSMOS Deep project, additional VLA A array observations at 1.4 GHz were obtained for the central degree of the COSMOS field and combined with the existing data from the VLA-COSMOS Large project. A newly constructed Deep mosaic with a resolution of 2.5″ was used to search for sources down to 4σ with 1σ ≈ 12 μJy beam⁻¹ in the central 50' x 50'. This new catalog is combined with the catalog from the Large project (obtained at 1.5″ x 1.4″ resolution) to construct a new Joint catalog. All sources listed in the new Joint catalog have peak flux densities of ≥5σ at 1.5″ and/or 2.5″ resolution to account for the fact that a significant fraction of sources at these low flux levels are expected to be slightly resolved at 1.5″ resolution. All properties listed in the Joint catalog, such as peak flux density, integrated flux density, and source size, are determined in the 2.5″ resolution Deep image. In addition, the Joint catalog contains 43 newly identified multi-component sources.
Business Faculty Research: Satisfaction with the Web versus Library Databases
ERIC Educational Resources Information Center
Dewald, Nancy H.; Silvius, Matthew A.
2005-01-01
Business faculty members teaching at undergraduate campuses of the Pennsylvania State University were surveyed in order to assess their satisfaction with free Web sources and with subscription databases for their professional research. Although satisfaction with the Web's ease of use was higher than that for databases, overall satisfaction for…
An Evaluation Instrument for Internet Web Sites.
ERIC Educational Resources Information Center
Livengood, Stephanie Plank
This paper describes the creation of a comprehensive evaluation tool for reference librarians in adult service divisions to use in selecting World Wide Web sites as reference sources. Traditional evaluation criteria, endorsed and applied by librarians over the years, are not sufficient for the evaluation of today's hypermedia web site environment.…
40 CFR 63.821 - Designation of affected sources.
Code of Federal Regulations, 2010 CFR
2010-07-01
... packaging rotogravure or wide-web flexographic printing presses at a facility plus any other equipment at... packaging rotogravure or wide-web flexographic press which is used primarily for coating, laminating, or... applied by the press using wide-web flexographic print stations in each month never exceeds 5 percent of...
Countries: General, Electricity, Geography, Health, Literature: Children's, Plants.
ERIC Educational Resources Information Center
Web Feet, 2002
2002-01-01
Presents an annotated list of Web site educational resources kindergarten through eighth grade. The Web sites this month cover the following subjects: countries (general); electricity; geography; health; children's literature; and plants. Includes a list of "Calendar Connections" to Web site sources of information on Earth Day in April…
ERIC Educational Resources Information Center
Carr, Caleb T.; Zube, Paul; Dickens, Eric; Hayter, Carolyn A.; Barterian, Justin A.
2013-01-01
To explore the integration of education processes into social media, we tested an initial model of student learning via interactive web tools and theorized three sources of influence: interpersonal, intrapersonal, and masspersonal. Three-hundred thirty-seven students observed an online lecture and then completed a series of scales. Structural…
Effects of hydrological forcing on the structure of a tropical estuarine food web
Trisha B. Atwood; Tracy N. Wiegner; Richard A. MacKenzie
2012-01-01
River flow can impact which sources of particulate organic matter (POM) fuel estuarine food webs. Here, we used stable carbon (C) and nitrogen (N) isotope analyses to compare how contributions of different POM sources (terrestrial, estuarine, and marine) to the diets of zooplankton and juvenile fishes differed between low and high river flow conditions, as well as...
Deep Kalman Filter: Simultaneous Multi-Sensor Integration and Modelling; A GNSS/IMU Case Study
Hosseinyalamdary, Siavash
2018-01-01
Bayes filters, such as the Kalman and particle filters, have been used in sensor fusion to integrate two sources of information and obtain the best estimate of unknowns. The efficient integration of multiple sensors requires deep knowledge of their error sources. Some sensors, such as Inertial Measurement Unit (IMU), have complicated error sources. Therefore, IMU error modelling and the efficient integration of IMU and Global Navigation Satellite System (GNSS) observations has remained a challenge. In this paper, we developed deep Kalman filter to model and remove IMU errors and, consequently, improve the accuracy of IMU positioning. To achieve this, we added a modelling step to the prediction and update steps of the Kalman filter, so that the IMU error model is learned during integration. The results showed our deep Kalman filter outperformed the conventional Kalman filter and reached a higher level of accuracy. PMID:29695119
Deep Kalman Filter: Simultaneous Multi-Sensor Integration and Modelling; A GNSS/IMU Case Study.
Hosseinyalamdary, Siavash
2018-04-24
Bayes filters, such as the Kalman and particle filters, have been used in sensor fusion to integrate two sources of information and obtain the best estimate of unknowns. The efficient integration of multiple sensors requires deep knowledge of their error sources. Some sensors, such as Inertial Measurement Unit (IMU), have complicated error sources. Therefore, IMU error modelling and the efficient integration of IMU and Global Navigation Satellite System (GNSS) observations has remained a challenge. In this paper, we developed deep Kalman filter to model and remove IMU errors and, consequently, improve the accuracy of IMU positioning. To achieve this, we added a modelling step to the prediction and update steps of the Kalman filter, so that the IMU error model is learned during integration. The results showed our deep Kalman filter outperformed the conventional Kalman filter and reached a higher level of accuracy.
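A minimal one-dimensional Kalman filter sketch, with a comment marking where the modelling step described above (a learned IMU error model) would sit between prediction and update; the constant-position process model and the noise values are assumptions for illustration, not the paper's GNSS/IMU implementation.

import numpy as np

rng = np.random.default_rng(1)
truth = 5.0
measurements = truth + rng.normal(scale=0.5, size=50)   # noisy GNSS-like observations

x, p = 0.0, 1.0       # state estimate and its variance
q, r = 1e-3, 0.25     # assumed process and measurement noise variances

for z in measurements:
    # Prediction step (constant-position process model).
    p = p + q
    # --- Modelling step would go here in a deep Kalman filter: a learned model
    # corrects systematic sensor errors in the prediction before the update
    # (omitted in this sketch).
    # Update step.
    k = p / (p + r)              # Kalman gain
    x = x + k * (z - x)
    p = (1.0 - k) * p

print(f"final estimate {x:.3f} (truth {truth})")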
Code of Federal Regulations, 2014 CFR
2014-01-01
... maintained on the APHIS Web site at http://www.aphis.usda.gov/import_export/animals/animal_disease_status... approximately 210 minutes after which they must be cooked in hot oil (deep-fried) at a minimum of 104 °C for an...
Code of Federal Regulations, 2013 CFR
2013-01-01
... APHIS Web site at http://www.aphis.usda.gov/import_export/animals/animal_disease_status.shtml. Copies of... 210 minutes after which they must be cooked in hot oil (deep-fried) at a minimum of 104 °C for an...
75 FR 76077 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-07
.... ADDRESSES: Comments may be submitted in the following ways: E-Gov Web Site: http://www.regulations.gov....regulations.gov , including any personal information provided. You should know that anyone is able to search... meters) deep as measured from mean low water that are at risk of being an exposed underwater pipeline or...
Jerome I. Friedman, Henry W. Kendall, Richard E. Taylor and the Development
on the Web. Documents: Experimental Search for a Heavy Electron, DOE Technical Report, September 1967 (Taylor, R. E.); Deep Inelastic Electron Scattering: Experimental, DOE Technical Report, October ...
The objective of our study was to characterize the trophic connections of the dominant fishes of the deep-pelagic region of the northern Mid-Atlantic Ridge (MAR) with respect to vertical distribution using carbon (C) and nitrogen (N) stable isotope analysis. Our goals were to id...
Sustained deposition of contaminants from the Deepwater Horizon spill.
Yan, Beizhan; Passow, Uta; Chanton, Jeffrey P; Nöthig, Eva-Maria; Asper, Vernon; Sweet, Julia; Pitiranggon, Masha; Diercks, Arne; Pak, Dorothy
2016-06-14
The 2010 Deepwater Horizon oil spill resulted in 1.6–2.6 × 10^10 grams of petrocarbon accumulation on the seafloor. Data from a deep sediment trap, deployed 7.4 km SW of the well between August 2010 and October 2011, disclose that the sinking of spill-associated substances, mediated by marine particles, especially phytoplankton, continued at least 5 mo following the capping of the well. In August/September 2010, an exceptionally large diatom bloom sedimentation event coincided with elevated sinking rates of oil-derived hydrocarbons, black carbon, and two key components of drilling mud, barium and olefins. Barium remained in the water column for months and even entered pelagic food webs. Both saturated and polycyclic aromatic hydrocarbon source indicators corroborate a predominant contribution of crude oil to the sinking hydrocarbons. Cosedimentation with diatoms accumulated contaminants that were dispersed in the water column and transported them downward, where they were concentrated into the upper centimeters of the seafloor, potentially leading to sustained impact on benthic ecosystems.
The New CCSDS Image Compression Recommendation
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu; Armbruster, Philippe; Kiely, Aaron; Masschelein, Bart; Moury, Gilles; Schaefer, Christoph
2005-01-01
The Consultative Committee for Space Data Systems (CCSDS) data compression working group has recently adopted a recommendation for image data compression, with a final release expected in 2005. The algorithm adopted in the recommendation consists of a two-dimensional discrete wavelet transform of the image, followed by progressive bit-plane coding of the transformed data. The algorithm can provide both lossless and lossy compression, and allows a user to directly control the compressed data volume or the fidelity with which the wavelet-transformed data can be reconstructed. The algorithm is suitable for both frame-based image data and scan-based sensor data, and has applications for near-Earth and deep-space missions. The standard will be accompanied by free software sources on a future web site. An Application-Specific Integrated Circuit (ASIC) implementation of the compressor is currently under development. This paper describes the compression algorithm along with the requirements that drove the selection of the algorithm. Performance results and comparisons with other compressors are given for a test set of space images.
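The two stages named in the abstract, a two-dimensional discrete wavelet transform followed by progressive bit-plane coding of the transformed data, can be sketched as follows. The wavelet family and the coefficient handling below are simplifying assumptions for illustration only, not the filters or coding order specified by the CCSDS recommendation.

    import numpy as np
    import pywt

    image = np.random.randint(0, 256, (64, 64)).astype(float)

    # One level of a 2-D discrete wavelet transform ('bior4.4' is a stand-in
    # wavelet, not necessarily the one mandated by the standard).
    cA, (cH, cV, cD) = pywt.dwt2(image, 'bior4.4')

    # Progressive bit-plane coding idea: emit coefficient bits from the most
    # significant plane downward, so truncating the stream trades rate for fidelity.
    coeffs = np.rint(np.abs(cA)).astype(int)
    for plane in range(int(coeffs.max()).bit_length() - 1, -1, -1):
        bits = (coeffs >> plane) & 1   # one bit-plane of the quantised magnitudes
        print(plane, int(bits.sum()))  # a real coder would entropy-code these bits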
Deep Crustal Melting and the Survival of Continental Crust
NASA Astrophysics Data System (ADS)
Whitney, D.; Teyssier, C. P.; Rey, P. F.; Korchinski, M.
2017-12-01
Plate convergence involving continental lithosphere leads to crustal melting, which ultimately stabilizes the crust because it drives rapid upward flow of hot deep crust, followed by rapid cooling at shallow levels. Collision drives partial melting during crustal thickening (at 40-75 km) and/or continental subduction (at 75-100 km). These depths are not typically exceeded by crustal rocks that are exhumed in each setting because partial melting significantly decreases viscosity, facilitating upward flow of deep crust. Results from numerical models and nature indicate that deep crust moves laterally and then vertically, crystallizing at depths as shallow as 2 km. Deep crust flows en masse, without significant segregation of melt into magmatic bodies, over 10s of kms of vertical transport. This is a major mechanism by which deep crust is exhumed and is therefore a significant process of heat and mass transfer in continental evolution. The result of vertical flow of deep, partially molten crust is a migmatite dome. When lithosphere is under extension or transtension, the deep crust is solicited by faulting of the brittle upper crust, and the flow of deep crust in migmatite domes traverses nearly the entire thickness of orogenic crust in <10 million years. This cycle of burial, partial melting, rapid ascent, and crystallization/cooling preserves the continents from being recycled into the mantle by convergent tectonic processes over geologic time. Migmatite domes commonly preserve a record of high-T - low-P metamorphism. Domes may also contain rocks or minerals that record high-T - high-P conditions, including high-P metamorphism broadly coeval with host migmatite, evidence for the deep crustal origin of migmatite. There exists a spectrum of domes, from entirely deep-sourced to mixtures of deep and shallow sources. Controlling factors in deep vs. shallow sources are relative densities of crustal layers and rate of extension: fast extension (cm/yr) promotes efficient ascent of deep crust, whereas slow extension (mm/yr) produces significantly less exhumation. Recognition of the importance of migmatite (gneiss) domes as archives of orogenic deep crust is applicable to determining the chemical and physical properties of continental crust, as well as mechanisms and timescales of crustal differentiation.
Hamed, Mohamed; Spaniol, Christian; Nazarieh, Maryam; Helms, Volkhard
2015-07-01
TFmiR is a freely available web server for deep and integrative analysis of combinatorial regulatory interactions between transcription factors, microRNAs and target genes that are involved in disease pathogenesis. Since the inner workings of cells rely on the correct functioning of an enormously complex system of activating and repressing interactions that can be perturbed in many ways, TFmiR helps to better elucidate cellular mechanisms at the molecular level from a network perspective. The provided topological and functional analyses promote TFmiR as a reliable systems biology tool for researchers across the life science communities. TFmiR web server is accessible through the following URL: http://service.bioinformatik.uni-saarland.de/tfmir. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Development of high-efficiency solar cells on silicon web
NASA Technical Reports Server (NTRS)
Meier, D. L.
1986-01-01
Efforts to achieve higher-efficiency cells by identifying carrier loss mechanisms, designing cell structures, and developing processing techniques are described. Techniques such as deep-level transient spectroscopy (DLTS), laser-beam-induced current (LBIC), and transmission electron microscopy (TEM) indicated that dislocations in the web material, rather than twin planes, were primarily responsible for limiting diffusion lengths in the web. Diffusion lengths and cell efficiencies were improved from 19 to 120 microns and from 8 to 10.3% (no AR coating), respectively, by implanting hydrogen at 1500 eV with a beam current density of 2.0 mA/sq cm. Processing improvements included the use of a double-layer AR coating (ZnS and MgF2) and the addition of an aluminum back surface reflector. Cells of more than 16% efficiency were achieved.
NASA Astrophysics Data System (ADS)
Papiol, V.; Cartes, J. E.; Fanelli, E.; Rumolo, P.
2013-03-01
The food-web structure and seasonality of the dominant taxa of benthopelagic megafauna (fishes and decapods) on the middle slope of the Catalan Sea (Balearic Basin, NW Mediterranean) were investigated using the carbon and nitrogen stable isotope ratios of 29 species. Macrofauna (infauna, suprabenthos and zooplankton) were also analysed as potential prey. Samples were collected on a seasonal basis from 600 to 1000 m depth between February 2007 and February 2008. The fishes and decapods were classified into feeding groups based on the literature: benthic feeders (including suprabenthos) and zooplankton feeders, the latter further separated into migratory and non-migratory species. Decapods exhibited depleted δ15N and enriched δ13C compared to fishes. Annual mean δ13C of fishes ranged from - 19.15‰ (Arctozenus risso) to - 16.65‰ (Phycis blennoides) and of δ15N from 7.27‰ (Lampanyctus crocodilus) to 11.31‰ (Nezumia aequalis). Annual mean values of δ13C of decapods were from - 18.94‰ (Sergestes arcticus) to - 14.78‰ (Pontophilus norvegicus), and of δ15N from 6.36‰ (Sergia robusta) to 9.72‰ (Paromola cuvieri). Stable isotopes distinguished well amongst the 3 feeding guilds established a priori, pointing to high levels of resource partitioning in deep-sea communities. The trophic structure of the community was a function of the position of predators along the benthic-pelagic gradient, with benthic feeders isotopically enriched relative to pelagic feeders. This difference allowed the identification of two food webs based on pelagic versus benthic consumption. Prey and predator sizes were also important in structuring the community. The most generalised seasonal pattern was δ13C depletion from winter to spring and summer, especially amongst migratory macroplankton feeders. This suggests greater consumption of pelagic prey, likely related with increases in pelagic production or with ontogenic migrations of organisms from mid-water to the Benthic Boundary Layer (BBL). δ15N enrichment was detected in periods of water column stratification, particularly amongst benthic feeder fishes. Megafauna relied on a single source of nutrition after peaks in surface production, presumably marine snow. Conversely, a larger array of food sources, probably from advection, sustained the community in periods of water column stratification. Benthic feeder δ13C values of both taxa were positively correlated with fluorescence measured 5 m above the seabed and negatively correlated with total organic carbon in the sediments, both being food sources for deposit feeding macroinfauna. Macroplankton feeder δ13C values were linked to environmental variables related to vertical transport from surface production, i.e. lipids and chlorophyll and their degradation products, likely due to their stronger reliance on sinking phytodetritus through consumption of planktonic prey.
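For readers less familiar with the isotope conventions used above: relative trophic position is commonly estimated from δ15N as TL = λ + (δ15N_consumer − δ15N_baseline) / Δ, where λ is the trophic level of the baseline organism and Δ is the assumed per-level enrichment (often taken as about 3.4‰). The enrichment value and baseline choice given here are generic conventions, not values reported in this study.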
Web queries as a source for syndromic surveillance.
Hulth, Anette; Rydevik, Gustaf; Linde, Annika
2009-01-01
In the field of syndromic surveillance, various sources are exploited for outbreak detection, monitoring and prediction. This paper describes a study on queries submitted to a medical web site, with influenza as a case study. The hypothesis of the work was that queries on influenza and influenza-like illness would provide a basis for the estimation of the timing of the peak and the intensity of the yearly influenza outbreaks that would be as good as the existing laboratory and sentinel surveillance. We calculated the occurrence of various queries related to influenza from search logs submitted to a Swedish medical web site for two influenza seasons. These figures were subsequently used to generate two models, one to estimate the number of laboratory verified influenza cases and one to estimate the proportion of patients with influenza-like illness reported by selected General Practitioners in Sweden. We applied an approach designed for highly correlated data, partial least squares regression. In our work, we found that certain web queries on influenza follow the same pattern as that obtained by the two other surveillance systems for influenza epidemics, and that they have equal power for the estimation of the influenza burden in society. Web queries give a unique access to ill individuals who are not (yet) seeking care. This paper shows the potential of web queries as an accurate, cheap and labour extensive source for syndromic surveillance.
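A minimal sketch of the kind of partial least squares regression described above, mapping weekly counts of correlated query terms to an influenza indicator; the data, number of latent components, and variable names are illustrative assumptions rather than the authors' model.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    query_counts = rng.poisson(50, size=(104, 12)).astype(float)   # 2 seasons x 12 query terms
    lab_cases = query_counts[:, :3].sum(axis=1) + rng.normal(0, 5, size=104)

    pls = PLSRegression(n_components=2)     # few latent components absorb the collinearity
    pls.fit(query_counts, lab_cases)
    estimated = pls.predict(query_counts).ravel()
    print(np.corrcoef(estimated, lab_cases)[0, 1])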
Demonstration of miniaturized 20mW CW 280nm and 266nm solid-state UV laser sources
NASA Astrophysics Data System (ADS)
Landru, Nicolas; Georges, Thierry; Beaurepaire, Julien; Le Guen, Bruno; Le Bail, Guy
2015-02-01
Visible 561 nm and 532 nm laser emissions from 14-mm long DPSS monolithic cavities are frequency converted to deep UV 280 nm and 266 nm in 16-mm long monolithic external cavities. Wavelength conversion is fully insensitive to mechanical vibrations and the whole UV laser sources fit in a miniaturized housing. More than 20 mW deep UV laser emission is demonstrated with high power stability, low noise and good beam quality. Aging tests are in progress but long lifetimes are expected thanks to the cavity design. Protein detection and deep UV resonant Raman spectroscopy are applications that could benefit from these laser sources.
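The reported wavelengths are consistent with single-stage frequency doubling (second-harmonic generation) in the external cavities, which halves the pump wavelength: 532 nm / 2 = 266 nm and 561 nm / 2 = 280.5 nm ≈ 280 nm. Identifying the conversion process as second-harmonic generation is an inference from these numbers, not a detail stated in the abstract.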
An open source web interface for linking models to infrastructure system databases
NASA Astrophysics Data System (ADS)
Knox, S.; Mohamed, K.; Harou, J. J.; Rheinheimer, D. E.; Medellin-Azuara, J.; Meier, P.; Tilmant, A.; Rosenberg, D. E.
2016-12-01
Models of networked engineered resource systems such as water or energy systems are often built collaboratively with developers from different domains working at different locations. These models can be linked to large scale real world databases, and they are constantly being improved and extended. As the development and application of these models becomes more sophisticated, and the computing power required for simulations and/or optimisations increases, so has the need for online services and tools which enable the efficient development and deployment of these models. Hydra Platform is an open source, web-based data management system, which allows modellers of network-based models to remotely store network topology and associated data in a generalised manner, allowing it to serve multiple disciplines. Hydra Platform uses a web API using JSON to allow external programs (referred to as `Apps') to interact with its stored networks and perform actions such as importing data, running models, or exporting the networks to different formats. Hydra Platform supports multiple users accessing the same network and has a suite of functions for managing users and data. We present ongoing development in Hydra Platform, the Hydra Web User Interface, through which users can collaboratively manage network data and models in a web browser. The web interface allows multiple users to graphically access, edit and share their networks, run apps and view results. Through apps, which are located on the server, the web interface can give users access to external data sources and models without the need to install or configure any software. This also ensures model results can be reproduced by removing platform or version dependence. Managing data and deploying models via the web interface provides a way for multiple modellers to collaboratively manage data, deploy and monitor model runs and analyse results.
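A hedged illustration of how an external 'App' might call a JSON web API of this general kind; the URL, payload fields, and response structure below are hypothetical placeholders, not Hydra Platform's actual interface.

    import requests

    # Placeholder endpoint and payload; a real Hydra Platform deployment defines
    # its own URL, call names, and authentication scheme.
    payload = {"get_network": {"network_id": 42, "include_data": True}}
    response = requests.post("https://example.org/hydra/json", json=payload, timeout=30)
    network = response.json()      # e.g. topology plus attached datasets
    print(sorted(network.keys()))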
Kong, Xiao-le; Wang, Shi-qin; Zhao, Huan; Yuan, Rui-qiang
2015-11-01
There is an obvious regional contradiction between water resources and agricultural produce in lower plain area of North China, however, excessive fluorine in deep groundwater further limits the use of regional water resources. In order to understand the spatial distribution characteristics and source of F(-) in groundwater, study was carried out in Nanpi County by field survey and sampling, hydrogeochemical analysis and stable isotopes methods. The results showed that the center of low fluoride concentrations of shallow groundwater was located around reservoir of Dalang Lake, and centers of high fluoride concentrations were located in southeast and southwest of the study area. The region with high fluoride concentration was consistent with the over-exploitation region of deep groundwater. Point source pollution of subsurface drainage and non-point source of irrigation with deep groundwater in some regions were the main causes for the increasing F(-) concentrations of shallow groundwater in parts of the sampling sites. Rock deposition and hydrogeology conditions were the main causes for the high F(-) concentrations (1.00 mg x L(-1), threshold of drinking water quality standard in China) in deep groundwater. F(-) released from clay minerals into the water increased the F(-) concentrations in deep groundwater because of over-exploitation. With the increasing exploitation and utilization of brackish shallow groundwater and the compressing and restricting of deep groundwater exploitation, the water environment in the middle and east lower plain area of North China will undergo significant change, and it is important to identify the distribution and source of F(-) in surface water and groundwater for reasonable development and use of water resources in future.
SchNet - A deep learning architecture for molecules and materials
NASA Astrophysics Data System (ADS)
Schütt, K. T.; Sauceda, H. E.; Kindermans, P.-J.; Tkatchenko, A.; Müller, K.-R.
2018-06-01
Deep learning has led to a paradigm shift in artificial intelligence, including web, text, and image search, speech recognition, as well as bioinformatics, with growing impact in chemical physics. Machine learning, in general, and deep learning, in particular, are ideally suitable for representing quantum-mechanical interactions, enabling us to model nonlinear potential-energy surfaces or enhancing the exploration of chemical compound space. Here we present the deep learning architecture SchNet that is specifically designed to model atomistic systems by making use of continuous-filter convolutional layers. We demonstrate the capabilities of SchNet by accurately predicting a range of properties across chemical space for molecules and materials, where our model learns chemically plausible embeddings of atom types across the periodic table. Finally, we employ SchNet to predict potential-energy surfaces and energy-conserving force fields for molecular dynamics simulations of small molecules and perform an exemplary study on the quantum-mechanical properties of C20-fullerene that would have been infeasible with regular ab initio molecular dynamics.
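The central operation, a continuous-filter convolution in which a small filter-generating network maps interatomic distances to filter values that weight neighbouring atom features, can be sketched roughly as below. Layer sizes, the Gaussian radial basis expansion, and the absence of a learned filter network are simplifications for illustration, not the published architecture.

    import numpy as np

    def rbf_expand(dist, centers):
        # expand each interatomic distance on a grid of Gaussian radial basis functions
        return np.exp(-10.0 * (dist[..., None] - centers) ** 2)

    def cfconv(atom_features, positions, W_filter):
        # continuous-filter convolution: filters are generated from distances,
        # so atoms need not lie on a regular grid the way pixels do
        diff = positions[:, None, :] - positions[None, :, :]
        dist = np.linalg.norm(diff, axis=-1)                 # (n_atoms, n_atoms)
        centers = np.linspace(0.0, 5.0, W_filter.shape[0])
        filters = rbf_expand(dist, centers) @ W_filter       # (n_atoms, n_atoms, n_feat)
        # aggregate neighbour features weighted element-wise by the generated filters
        return np.einsum('jf,ijf->if', atom_features, filters)

    rng = np.random.default_rng(0)
    out = cfconv(rng.normal(size=(5, 8)),      # 5 atoms, 8 features each
                 rng.normal(size=(5, 3)),      # 3-D positions
                 rng.normal(size=(16, 8)))     # 16 radial basis functions -> 8 filter channels
    print(out.shape)                           # (5, 8)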
NASA Astrophysics Data System (ADS)
Copeland, Adrienne Marie
Patchiness of prey can influence the behavior of a predator, as predicted by the optimal foraging theory which states that an animal will maximize the energy gain while minimizing energy loss. While this relationship has been studied and is relatively well understood in some terrestrial systems, the same is far from true in marine systems. It is as important to investigate this in the marine realm in order to better understand predator distribution and behavior. Micronekton, organisms from 2-20 cm, might be a key component in understanding this as it is potentially an essential link in the food web between primary producers and higher trophic levels, including cephalopods which are primary prey items of deep diving odontocetes (toothed whales). My dissertation assesses the spatial and temporal variability of micronekton in the Northwestern Hawaiian Islands (NWHI), the Main Hawaiian Islands' (MHI) Island of Hawaii, and the Gulf of Mexico (GOM). Additionally it focuses on understanding the relationship between the spatial distribution of micronekton and environmental and geographic factors, and how the spatial and temporal variability of this micronekton relates to deep diving odontocete foraging. I used both an active Simrad EK60 echosounder system to collect water column micronekton backscatter and a passive acoustic system to detect the presence of echolocation clicks from deep diving beaked, sperm, and short-finned pilot whales. My results provide insight into what might be contributing to hotspots of micronekton which formed discrete layers in all locations, a shallow scattering layer (SSL) from the surface to about 200 m and a deep scattering layer (DSL) starting at about 350 m. In both the GOM and the NWHI, the bathymetry and proximity to shore influenced the amount of micronekton backscatter with locations closer to shore and at shallower depths having higher backscatter. We found in all three locations that some species of deep diving odontocetes were searching for prey in these areas with higher micronekton backscatter. Beaked whales in the NWHI, short-finned pilot whales in the NWHI and MHI, and sperm whales in the GOM were present in areas of higher micronekton backscatter. These hotspots of backscatter may be good predictors of the distribution of some deep-diving toothed whale foragers since the hotspots potentially indicate a food web supporting the prey of the cetaceans.
NASA Astrophysics Data System (ADS)
Takemura, Shunsuke; Maeda, Takuto; Furumura, Takashi; Obara, Kazushige
2016-05-01
In this study, the source location of the 30 May 2015 (Mw 7.9) deep-focus Bonin earthquake was constrained using P wave seismograms recorded across Japan. We focus on propagation characteristics of high-frequency P wave. Deep-focus intraslab earthquakes typically show spindle-shaped seismogram envelopes with peak delays of several seconds and subsequent long-duration coda waves; however, both the main shock and aftershock of the 2015 Bonin event exhibited pulse-like P wave propagations with high apparent velocities (~12.2 km/s). Such P wave propagation features were reproduced by finite-difference method simulations of seismic wave propagation in the case of slab-bottom source. The pulse-like P wave seismogram envelopes observed from the 2015 Bonin earthquake show that its source was located at the bottom of the Pacific slab at a depth of ~680 km, rather than within its middle or upper regions.
NASA Astrophysics Data System (ADS)
Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.
2012-12-01
Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provides an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.
gemcWeb: A Cloud Based Nuclear Physics Simulation Software
NASA Astrophysics Data System (ADS)
Markelon, Sam
2017-09-01
gemcWeb allows users to run nuclear physics simulations from the web. Because it is completely device agnostic, scientists can run simulations from anywhere with an Internet connection. Having a full user system, gemcWeb allows users to revisit and revise their projects, and to share configurations and results with collaborators. gemcWeb is based on the simulation software gemc, which is in turn based on standard Geant4. gemcWeb requires no C++, gemc, or Geant4 knowledge. A simple but powerful GUI allows users to configure their project from geometries and configurations stored on the deployment server. Simulations are then run on the server, with results posted to the user and then securely stored. Python based and open source, the main version of gemcWeb is hosted internally at Jefferson National Laboratory and used by the CLAS12 and Electron-Ion Collider Project groups. However, as the software is open source and hosted as a GitHub repository, an instance can be deployed on the open web or on any institution's intranet. An instance can be configured to host experiments specific to an institution, and the code base can be modified by any individual or group. Special thanks to: Maurizio Ungaro, PhD, creator of gemc; Markus Diefenthaler, PhD, advisor; and Kyungseon Joo, PhD, advisor.
NOAA Operational Tsunameter Support for Research
NASA Astrophysics Data System (ADS)
Bouchard, R.; Stroker, K.
2008-12-01
In March 2008, the National Oceanic and Atmospheric Administration's (NOAA) National Data Buoy Center (NDBC) completed the deployment of the last of the 39-station network of deep-sea tsunameters. As part of NOAA's effort to strengthen tsunami warning capabilities, NDBC expanded the network from 6 to 39 stations and upgraded all stations to the second generation Deep-ocean Assessment and Reporting of Tsunamis technology (DART II). Consisting of a bottom pressure recorder (BPR) and a surface buoy, the tsunameters deliver water-column heights, estimated from pressure measurements at the sea floor, to Tsunami Warning Centers in less than 3 minutes. This network provides coastal communities in the Pacific, Atlantic, Caribbean, and the Gulf of Mexico with faster and more accurate tsunami warnings. In addition, both the coarse resolution real-time data and the high resolution (15-second) recorded data provide invaluable contributions to research, such as the detection of the 2004 Sumatran tsunami in the Northeast Pacific (Gower and González, 2006) and the experimental tsunami forecast system (Bernard et al., 2007). NDBC normally recovers the BPRs every 24 months and sends the recovered high resolution data to NOAA's National Geophysical Data Center (NGDC) for archive and distribution. NGDC edits and processes this raw binary format to obtain research-quality data. NGDC provides access to retrospective BPR data from 1986 to the present. The DART database includes pressure and temperature data from the ocean floor, stored in a relational database, enabling data integration with the global tsunami and significant earthquake databases. All data are accessible via the Web as tables, reports, interactive maps, OGC Web Map Services (WMS), and Web Feature Services (WFS) to researchers around the world. References: Gower, J. and F. González, 2006. U.S. Warning System Detected the Sumatra Tsunami, Eos Trans. AGU, 87(10). Bernard, E. N., C. Meinig, and A. Hilton, 2007. Deep Ocean Tsunami Detection: Third Generation DART, Eos Trans. AGU, 88(52), Fall Meet. Suppl., Abstract S51C-03.
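The water-column height mentioned above follows, to first order, from the hydrostatic relation h ≈ P / (ρ g); the constant density and gravity in the toy conversion below are simplifying assumptions, and operational DART processing applies further corrections.

    rho = 1030.0               # seawater density, kg/m^3 (assumed constant)
    g = 9.81                   # gravitational acceleration, m/s^2
    bottom_pressure = 40.4e6   # Pa, an illustrative deep-ocean reading
    height = bottom_pressure / (rho * g)
    print(round(height, 1), "m of water column")   # roughly 4000 m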
Responsible vendors, intelligent consumers: Silk Road, the online revolution in drug trading.
Van Hout, Marie Claire; Bingham, Tim
2014-03-01
Silk Road is located on the Deep Web and provides an anonymous transacting infrastructure for the retail of drugs and pharmaceuticals. Members are attracted to the site due to protection of identity by screen pseudonyms, variety and quality of product listings, selection of vendors based on reviews, reduced personal risks, stealth of product delivery, development of personal connections with vendors in stealth modes and forum activity. The study aimed to explore vendor accounts of Silk Road as retail infrastructure. A single and holistic case study with embedded units approach (Yin, 2003) was chosen to explore the accounts of vendor subunits situated within the Silk Road marketplace. Vendors (n=10) completed an online interview via the direct message facility and via Tor mail. Vendors described themselves as 'intelligent and responsible' consumers of drugs. Decisions to commence vending operations on the site centred on simplicity in setting up vendor accounts, and opportunity to operate within a low risk, high traffic, high mark-up, secure and anonymous Deep Web infrastructure. The embedded online culture of harm reduction ethos appealed to them in terms of the responsible vending and use of personally tested high quality products. The professional approach to running their Silk Road businesses and dedication to providing a quality service was characterised by professional advertising of quality products, professional communication and visibility on forum pages, speedy dispatch of slightly overweight products, competitive pricing, good stealth techniques and efforts to avoid customer disputes. Vendors appeared content with a fairly constant buyer demand and described a relatively competitive market between small and big time market players. Concerns were evident with regard to Bitcoin instability. The greatest threat to Silk Road and other sites operating on the Deep Web is not law enforcement or market dynamics, it is technology itself. Copyright © 2013 Elsevier B.V. All rights reserved.
Unipept web services for metaproteomics analysis.
Mesuere, Bart; Willems, Toon; Van der Jeugt, Felix; Devreese, Bart; Vandamme, Peter; Dawyndt, Peter
2016-06-01
Unipept is an open source web application that is designed for metaproteomics analysis with a focus on interactive data visualization. It is underpinned by a fast index built from UniProtKB and the NCBI taxonomy that enables quick retrieval of all UniProt entries in which a given tryptic peptide occurs. Unipept version 2.4 introduced web services that provide programmatic access to the metaproteomics analysis features. This enables integration of Unipept functionality in custom applications and data processing pipelines. The web services are freely available at http://api.unipept.ugent.be and are open sourced under the MIT license. Contact: Unipept@ugent.be. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
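A minimal example of the programmatic access described above; the endpoint path and parameter names follow common REST conventions and should be treated as assumptions, with http://api.unipept.ugent.be as the authoritative reference for the actual interface.

    import requests

    # Endpoint and parameters are illustrative; consult the API documentation.
    response = requests.get(
        "http://api.unipept.ugent.be/api/v1/pept2lca.json",
        params={"input[]": ["AALTER", "MDGTEYIIVK"], "equate_il": "true"},
        timeout=30,
    )
    for record in response.json():
        print(record)   # lowest common ancestor per tryptic peptide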
Zinc in an ultraoligotrophic lake food web.
Montañez, Juan Cruz; Arribére, María A; Rizzo, Andrea; Arcagni, Marina; Campbell, Linda; Ribeiro Guevara, Sergio
2018-06-01
Zinc (Zn) bioaccumulation and trophic transfer were analyzed in the food web of Lake Nahuel Huapi, a deep, unpolluted ultraoligotrophic system in North Patagonia. Benthic macroinvertebrates, plankton, and native and introduced fish were collected at three sites. The effect of pyroclastic inputs on Zn levels in lacustrine food webs was assessed by studying the impact of the eruption of the Puyehue-Cordón Caulle volcanic complex (PCCVC) in 2011, through three sampling campaigns carried out immediately before and after the PCCVC eruption and after 2 years of recovery of the ecosystem. Zinc trophodynamics in the L. Nahuel Huapi food web were assessed using nitrogen stable isotopes (δ15N). There was no significant increase of Zn concentrations ([Zn]) in L. Nahuel Huapi biota after the PCCVC eruption, despite evidence of a [Zn] increase in lake water that could be associated with volcanic ash leaching. The organisms studied exhibited [Zn] above the threshold level considered for dietary deficiency, regulating Zn adequately even under a catastrophic situation like the 2011 PCCVC eruption. Zinc concentrations exhibited a biodilution pattern in the lake's food web. To the best of our knowledge, the present research is the first report of Zn biodilution in lacustrine systems, and the first to study Zn transfer in a freshwater food web including both pelagic and benthic compartments.
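Biodilution of the kind reported above is typically diagnosed by regressing log-transformed concentrations against δ15N across the food web, with a negative slope indicating that concentrations decrease with trophic position. The sketch below uses made-up numbers, not data from this study.

    import numpy as np

    delta15N = np.array([2.1, 4.5, 6.8, 9.0, 11.5])        # baseline to top predators
    zn_ppm = np.array([310.0, 180.0, 140.0, 95.0, 70.0])   # illustrative Zn concentrations

    slope, intercept = np.polyfit(delta15N, np.log10(zn_ppm), 1)
    print(slope)   # negative slope -> biodilution through the food web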
Fiber-based tunable repetition rate source for deep tissue two-photon fluorescence microscopy
Charan, Kriti; Li, Bo; Wang, Mengran; Lin, Charles P.; Xu, Chris
2018-01-01
Deep tissue multiphoton imaging requires high peak power to enhance signal and low average power to prevent thermal damage. Both goals can be advantageously achieved through laser repetition rate tuning instead of simply adjusting the average power. We show that the ideal repetition rate for deep two-photon imaging in the mouse brain is between 1 and 10 MHz, and we present a fiber-based source with an arbitrarily tunable repetition rate within this range. The performance of the new source is compared to a mode-locked Ti:Sapphire (Ti:S) laser for in vivo imaging of mouse brain vasculature. At 2.5 MHz, the fiber source requires 5.1 times less average power to obtain the same signal as a standard Ti:S laser operating at 80 MHz. PMID:29760989
Systematic Review of Quality of Patient Information on Liposuction in the Internet
Zuk, Grzegorz; Eylert, Gertraud; Raptis, Dimitri Aristotle; Guggenheim, Merlin; Shafighi, Maziar
2016-01-01
Background: A large number of patients who are interested in esthetic surgery actively search the Internet, which represents nowadays the first source of information. However, the quality of information available in the Internet on liposuction is currently unknown. The aim of this study was to assess the quality of patient information on liposuction available in the Internet. Methods: The quantitative and qualitative assessment of Web sites was based on a modified Ensuring Quality Information for Patients tool (36 items). Five hundred Web sites were identified by the most popular web search engines. Results: Two hundred forty-five Web sites were assessed after duplicates and irrelevant sources were excluded. Only 72 (29%) Web sites addressed >16 items, and scores tended to be higher for professional societies, portals, patient groups, health departments, and academic centers than for Web sites developed by physicians, respectively. The Ensuring Quality Information for Patients score achieved by Web sites ranged between 8 and 29 of total 36 points, with a median value of 16 points (interquartile range, 14–18). The top 10 Web sites with the highest scores were identified. Conclusions: The quality of patient information on liposuction available in the Internet is poor, and existing Web sites show substantial shortcomings. There is an urgent need for improvement in offering superior quality information on liposuction for patients intending to undergo this procedure. PMID:27482498
Systematic Review of Quality of Patient Information on Liposuction in the Internet.
Zuk, Grzegorz; Palma, Adrian Fernando; Eylert, Gertraud; Raptis, Dimitri Aristotle; Guggenheim, Merlin; Shafighi, Maziar
2016-06-01
A large number of patients who are interested in esthetic surgery actively search the Internet, which represents nowadays the first source of information. However, the quality of information available in the Internet on liposuction is currently unknown. The aim of this study was to assess the quality of patient information on liposuction available in the Internet. The quantitative and qualitative assessment of Web sites was based on a modified Ensuring Quality Information for Patients tool (36 items). Five hundred Web sites were identified by the most popular web search engines. Two hundred forty-five Web sites were assessed after duplicates and irrelevant sources were excluded. Only 72 (29%) Web sites addressed >16 items, and scores tended to be higher for professional societies, portals, patient groups, health departments, and academic centers than for Web sites developed by physicians, respectively. The Ensuring Quality Information for Patients score achieved by Web sites ranged between 8 and 29 of total 36 points, with a median value of 16 points (interquartile range, 14-18). The top 10 Web sites with the highest scores were identified. The quality of patient information on liposuction available in the Internet is poor, and existing Web sites show substantial shortcomings. There is an urgent need for improvement in offering superior quality information on liposuction for patients intending to undergo this procedure.
Compilation and network analyses of cambrian food webs.
Dunne, Jennifer A; Williams, Richard J; Martinez, Neo D; Wood, Rachel A; Erwin, Douglas H
2008-04-29
A rich body of empirically grounded theory has developed about food webs--the networks of feeding relationships among species within habitats. However, detailed food-web data and analyses are lacking for ancient ecosystems, largely because of the low resolution of taxa coupled with uncertain and incomplete information about feeding interactions. These impediments appear insurmountable for most fossil assemblages; however, a few assemblages with excellent soft-body preservation across trophic levels are candidates for food-web data compilation and topological analysis. Here we present plausible, detailed food webs for the Chengjiang and Burgess Shale assemblages from the Cambrian Period. Analyses of degree distributions and other structural network properties, including sensitivity analyses of the effects of uncertainty associated with Cambrian diet designations, suggest that these early Paleozoic communities share remarkably similar topology with modern food webs. Observed regularities reflect a systematic dependence of structure on the numbers of taxa and links in a web. Most aspects of Cambrian food-web structure are well-characterized by a simple "niche model," which was developed for modern food webs and takes into account this scale dependence. However, a few aspects of topology differ between the ancient and recent webs: longer path lengths between species and more species in feeding loops in the earlier Chengjiang web, and higher variability in the number of links per species for both Cambrian webs. Our results are relatively insensitive to the exclusion of low-certainty or random links. The many similarities between Cambrian and recent food webs point toward surprisingly strong and enduring constraints on the organization of complex feeding interactions among metazoan species. The few differences could reflect a transition to more strongly integrated and constrained trophic organization within ecosystems following the rapid diversification of species, body plans, and trophic roles during the Cambrian radiation. More research is needed to explore the generality of food-web structure through deep time and across habitats, especially to investigate potential mechanisms that could give rise to similar structure, as well as any differences.
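The "niche model" invoked above places each species on a one-dimensional niche axis and assigns it a feeding range whose expected width is set by the target connectance. A compact sketch of the standard construction follows; the parameter values are illustrative.

    import numpy as np

    def niche_model(n_species, connectance, seed=0):
        rng = np.random.default_rng(seed)
        n = np.sort(rng.uniform(0.0, 1.0, n_species))        # niche value per species
        beta = 1.0 / (2.0 * connectance) - 1.0               # sets the expected connectance
        r = n * rng.beta(1.0, beta, n_species)               # feeding-range width
        c = rng.uniform(r / 2.0, n, n_species)               # range centre below own niche value
        low, high = c - r / 2.0, c + r / 2.0
        # adjacency: species i eats species j if n[j] falls within i's feeding range
        return (n[None, :] >= low[:, None]) & (n[None, :] <= high[:, None])

    links = niche_model(30, 0.1)
    print(int(links.sum()), "feeding links among 30 species")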
Deep-towed high resolution seismic imaging II: Determination of P-wave velocity distribution
NASA Astrophysics Data System (ADS)
Marsset, B.; Ker, S.; Thomas, Y.; Colin, F.
2018-02-01
The acquisition of high resolution seismic data in deep waters requires the development of deep towed seismic sources and receivers able to deal with the high hydrostatic pressure environment. The low frequency piezoelectric transducer of the SYSIF (SYstème Sismique Fond) deep towed seismic device complies with the former requirement, taking advantage of the coupling of a mechanical resonance (Janus driver) and a fluid resonance (Helmholtz cavity) to produce a large frequency bandwidth acoustic signal (220-1050 Hz). The ability to perform deep towed multichannel seismic imaging with SYSIF was demonstrated in 2014, yet the ability to determine the P-wave velocity distribution was not achieved. P-wave velocity analysis relies on the ratio between the source-receiver offset range and the depth of the seismic reflectors, thus towing the seismic source and receivers closer to the sea bed provides a better geometry for P-wave velocity determination. Yet technical issues related to the acoustic source directivity arise for this approach in the particular framework of piezoelectric sources. A signal processing sequence is therefore added to the initial processing flow. Data acquisition took place during the GHASS (Gas Hydrates, fluid Activities and Sediment deformations in the western Black Sea) cruise in the Romanian waters of the Black Sea. The results of the imaging processing are presented for two seismic data sets acquired over gas hydrates and gas bearing sediments. The improvement in the final seismic resolution demonstrates the validity of the velocity model.
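The stated dependence on the offset-to-depth ratio comes from the hyperbolic moveout relation used in reflection velocity analysis, in its simplest form t(x)^2 = t0^2 + x^2 / V^2, where x is the source-receiver offset, t0 the zero-offset two-way time, and V the velocity above the reflector. When the maximum offset is small relative to reflector depth, the moveout t(x) − t0 becomes too small to resolve V, which is why towing the source and receivers closer to the seabed improves the geometry for velocity determination. This is the standard textbook relation, given here as background rather than as the authors' exact processing.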
Deep challenges for China's war on water pollution.
Han, Dongmei; Currell, Matthew J; Cao, Guoliang
2016-11-01
China's Central government has released an ambitious plan to tackle the nation's water pollution crisis. However, this is inhibited by a lack of data, particularly for groundwater. We compiled and analyzed water quality classification data from publicly available government sources, further revealing the scale and extent of the crisis. We also compiled nitrate data in shallow and deep groundwater from a range of literature sources, covering 52 of China's groundwater systems; the most comprehensive national-scale assessment yet. Nitrate pollution at levels exceeding the US EPA's maximum contaminant level (10 mg/L NO3-N) occurs at the 90th percentile in 25 of 36 shallow aquifers and 10 out of 37 deep or karst aquifers. Isotopic compositions of groundwater nitrate (δ15N and δ18O-NO3 values ranging from -14.9‰ to 35.5‰ and -8.1‰ to 51.0‰, respectively) indicate many nitrate sources including soil nitrogen, agricultural fertilizers, untreated wastewater and/or manure, and locally show evidence of denitrification. From these data, it is clear that contaminated groundwater is ubiquitous in deep aquifers as well as shallow groundwater (and surface water). Deep aquifers contain water recharged tens of thousands of years before present, long before widespread anthropogenic nitrate contamination. This groundwater has therefore likely been contaminated due to rapid bypass flow along wells or other conduits. Addressing the issue of well condition is urgently needed to stop further pollution of China's deep aquifers, which are some of China's most important drinking water sources. China's new 10-point Water Pollution Plan addresses previous shortcomings; however, control and remediation of deep groundwater pollution will take decades of sustained effort. Copyright © 2016. Published by Elsevier Ltd.
Deep Interactive Learning with Sharkzor
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Sharkzor is a web application for machine-learning-assisted image sorting and summarization. Deep learning algorithms are leveraged to infer, augment, and automate the user's mental model. Initially, images uploaded by the user are spread out on a canvas. The user then interacts with the images to impute their mental model into the application's algorithmic underpinnings. Methods of interaction within Sharkzor's user interface and user experience support three primary user tasks: triage, organize, and automate. The user triages the large pile of overlapping images by moving images of interest into proximity. The user then organizes said images into meaningful groups. After interacting with the images and groups, deep learning helps to automate the user's interactions. The loop of interaction, automation, and response by the user allows the system to quickly make sense of large amounts of data.
Artemis: Integrating Scientific Data on the Grid (Preprint)
2004-07-01
Theseus execution engine [Barish and Knoblock 03] to efficiently execute the generated datalog program. The Theseus execution engine has a wide...variety of operations to query databases, web sources, and web services. Theseus also contains a wide variety of relational operations, such as...selection, union, or projection. Furthermore, Theseus optimizes the execution of an integration plan by querying several data sources in parallel and
32 CFR Appendix A to Part 806b - Definitions
Code of Federal Regulations, 2010 CFR
2010-07-01
... exemption for protecting the identity of confidential sources. Cookie: Data created by a Web server that is... (persistent cookie). It provides a way for the Web site to identify users and keep track of their preferences... or is sent to a Web site different from the one you are currently viewing. Defense Data Integrity...
32 CFR Appendix A to Part 806b - Definitions
Code of Federal Regulations, 2011 CFR
2011-07-01
... exemption for protecting the identity of confidential sources. Cookie: Data created by a Web server that is... (persistent cookie). It provides a way for the Web site to identify users and keep track of their preferences... or is sent to a Web site different from the one you are currently viewing. Defense Data Integrity...
Publicizing Your Web Resources for Maximum Exposure.
ERIC Educational Resources Information Center
Smith, Kerry J.
2001-01-01
Offers advice to librarians for marketing their Web sites on Internet search engines. Advises against relying solely on spiders and recommends adding metadata to the source code and delivering that information directly to the search engines. Gives an overview of metadata and typical coding for meta tags. Includes Web addresses for a number of…
Implementing a Dynamic Database-Driven Course Using LAMP
ERIC Educational Resources Information Center
Laverty, Joseph Packy; Wood, David; Turchek, John
2011-01-01
This paper documents the formulation of a database driven open source architecture web development course. The design of a web-based curriculum faces many challenges: a) relative emphasis of client and server-side technologies, b) choice of a server-side language, and c) the cost and efficient delivery of a dynamic web development, database-driven…
Opinion Integration and Summarization
ERIC Educational Resources Information Center
Lu, Yue
2011-01-01
As Web 2.0 applications become increasingly popular, more and more people express their opinions on the Web in various ways in real time. Such wide coverage of topics and abundance of users make the Web an extremely valuable source for mining people's opinions about all kinds of topics. However, since the opinions are usually expressed as…
Experience on Mashup Development with End User Programming Environment
ERIC Educational Resources Information Center
Yue, Kwok-Bun
2010-01-01
Mashups, Web applications integrating data and functionality from other Web sources to provide a new service, have quickly become ubiquitous. Because of their role as a focal point in three important trends (Web 2.0, situational software applications, and end user development), mashups are a crucial emerging technology for information systems…
ERIC Educational Resources Information Center
Gerjets, Peter; Kammerer, Yvonne; Werner, Benita
2011-01-01
Web searching for complex information requires to appropriately evaluating diverse sources of information. Information science studies identified different criteria applied by searchers to evaluate Web information. However, the explicit evaluation instructions used in these studies might have resulted in a distortion of spontaneous evaluation…
Web Resources for Camp Staff: Where To Look for Answers to Your Questions.
ERIC Educational Resources Information Center
Pavlicin, Karen M.
1997-01-01
The World Wide Web is a good source of quick information, which is especially helpful during the busy camping season. Among the subjects on the Web relevant to camp are horsemanship, canoeing, waterfront safety, government standards, legislative news, disabilities, youth resources, vegetarian meals, grant writing, news, and stress management.…
Designing Crop Simulation Web Service with Service Oriented Architecture Principle
NASA Astrophysics Data System (ADS)
Chinnachodteeranun, R.; Hung, N. D.; Honda, K.
2015-12-01
Crop simulation models are efficient tools for simulating crop growth processes and yield. Running crop models requires data from various sources as well as time-consuming data processing, such as data quality checking and data formatting, before those data can be input to the model. This limits the use of crop modeling to crop modelers. We aim to make running crop models convenient for a wide range of users so that the utilization of crop models will expand, which will directly improve agricultural applications. As a first step, we developed a prototype that runs DSSAT on the Web, called Tomorrow's Rice (v. 1). It predicts rice yields based on a planting date, rice variety, and soil characteristics using the DSSAT crop model. A user only needs to select a planting location on the Web GUI; the system then queries historical weather data from available sources and returns the expected yield. Currently, we are working on weather data connection via the Sensor Observation Service (SOS) interface defined by the Open Geospatial Consortium (OGC). Weather data can be automatically connected to a weather generator that produces weather scenarios for running the crop model. To expand these services further, we are designing a web service framework consisting of layers of web services that support composition and execution of crop simulations. This framework allows a third-party application to call and cascade each service as needed for data preparation and for running the DSSAT model using a dynamic web service mechanism. The framework has a module to manage data format conversion, which means users do not need to spend their time curating the data inputs. Dynamic linking of data sources and services is implemented using the Service Component Architecture (SCA). This agricultural web service platform demonstrates interoperability of weather data via the SOS interface, convenient connections between weather data sources and the weather generator, and the chaining of various services for running crop models for decision support.
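A rough illustration of requesting observations over an OGC SOS interface of the kind mentioned above; the host, offering, and observed-property identifiers are placeholders, and the exact key-value parameters accepted depend on the particular SOS server.

    import requests

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "weather_station_42",          # placeholder identifiers
        "observedProperty": "air_temperature",
        "responseFormat": "application/json",
    }
    response = requests.get("https://example.org/sos", params=params, timeout=30)
    print(response.status_code, response.headers.get("Content-Type"))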
NASA Astrophysics Data System (ADS)
Zhai, Dongsheng; Liu, Chen
Since 2005, the term Web 2.0 has gradually become a hot topic on the Internet. Web 2.0 lets users create web contents as distinct from webmasters or web coders. Web 2.0 has come to our work, our life and even has become an indispensable part of our web-life. Its applications have already been widespread in many fields on the Internet. So far, China has about 137 million netizens [1], therefore its Web 2.0 market is so attractive that many sources of venture capital flow into the Chinese Web 2.0 market and there are also a lot of new Web 2.0 companies in China. However, the development of Web 2.0 in China is accompanied by some problems and obstacles. In this paper, we will mainly discuss Web 2.0 applications in China, with their current problems and future development trends.
Liang, Liming; Chai, Jiake; Jia, Xiaoming; Wang, Yirong; Meng, Suyu; Liu, Tao
2012-12-01
To investigate the effectiveness of repairing severe cicatricial contracture deformity in the web-space by kite-like incision combined with full-thickness skin grafting. Between June 2008 and September 2011, 31 patients (87 web-spaces) with severe cicatricial contracture deformities in the web-spaces were treated. There were 24 males and 7 females, aged 5-43 years (median, 22 years). The causes of injuries were flame burn (26 cases), scald (3 cases), electric arc burn (1 case), and chemical burn (1 case). The degree of burn was deep second degree (14 cases) and third degree (17 cases). The interval time from injury to operation was 10 months to 17 years (median, 2.2 years). The kite-like incision was marked on the scar in the web-space. The rhombic scar between the adjacent metacarpophalangeal joints was excised, and cicatricial contracture was released completely. The secondary wound in the web-space was repaired with full-thickness autogeneic skin grafting. The secondary wound at donor site was directly sutured. All full-thickness skin grafts survived well. The incisions at donor sites healed primarily. Of 31 patients, 29 (82 web-spaces) were followed up 6-18 months (mean, 13 months). The sizes and depths of reconstructed web-spaces were similar to those of normal ones. No secondary cicatricial contracture was observed, and the function of fingers recovered well. The short-term effectiveness is satisfactory by kite-like incision combined with full-thickness skin grafting for repairing severe cicatricial contracture deformities in the web-space, while the long-term effectiveness needs further observation.
Systematic Review of Quality of Patient Information on Phalloplasty in the Internet.
Karamitros, Georgios A; Kitsos, Nikolaos A; Sapountzis, Stamatis
2017-12-01
An increasing number of patients considering aesthetic surgery use Internet health information as their first source of information. However, the quality of information available in the Internet on phalloplasty is currently unknown. This study aimed to assess the quality of patient information on phalloplasty available in the Internet. The assessment of the Web sites was based on the modified Ensuring Quality Information for Patients (EQIP) instrument (36 items). Three hundred Web sites were identified by the most popular Web search engines. Ninety Web sites were assessed after duplicates, irrelevant sources, and Web sites in languages other than English were excluded. Only 16 (18%) Web sites addressed >21 items, and scores tended to be higher for Web sites developed by academic centers and the industry than for Web sites developed by private practicing surgeons. The EQIP score achieved by Web sites ranged between 4 and 29 of the total 36 points, with a median value of 17.5 points (interquartile range, 13-21). The top 5 Web sites with the highest scores were identified. The quality of patient information on phalloplasty in the Internet is substandard, and the existing Web sites present inadequate information. There is a dire need to improve the quality of Internet phalloplasty resources for potential patients who might consider this procedure. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at www.springer.com/00266.
NASA Astrophysics Data System (ADS)
Frigeri, A.; Cardellini, C.; Chiodini, G.; Frondini, F.; Bagnato, E.; Aiuppa, A.; Fischer, T. P.; Lehnert, K. A.
2014-12-01
The study of the main pathways of carbon flux from the deep Earth requires the analysis of a large quantity and variety of data on volcanic and non-volcanic gas emissions. Hence, there is need for common frameworks to aggregate available data and insert new observations. Since 2010 we have been developing the Mapping Gas emissions (MaGa) web-based database to collect data on carbon degassing form volcanic and non-volcanic environments. MaGa uses an Object-relational model, translating the experience of field surveyors into the database schema. The current web interface of MaGa allows users to browse the data in tabular format or by browsing an interactive web-map. Enabled users can insert information as measurement methods, instrument details as well as the actual values collected in the field. Measurements found in the literature can be inserted as well as direct field observations made by human-operated instruments. Currently the database includes fluxes and gas compositions from active craters degassing, diffuse soil degassing and fumaroles both from dormant volcanoes and open-vent volcanoes from literature survey and data about non-volcanic emission of the Italian territory. Currently, MaGa holds more than 1000 volcanic plume degassing fluxes, data from 30 sites of diffuse soil degassing from italian volcanoes, and about 60 measurements from fumarolic and non volcanic emission sites. For each gas emission site, the MaGa holds data, pictures, descriptions on gas sampling, analysis and measurement methods, together with bibliographic references and contacts to researchers having experience on each site. From 2012, MaGa developments started to be focused towards the framework of the Deep Earth Carbon Degassing research initiative of the Deep Carbon Observatory. Whithin the DECADE initiative, there are others data systems, as EarthChem and the Smithsonian Institution's Global Volcanism Program. An interoperable interaction between the DECADE data systems is being planned. MaGa is showing good potentials to improve the knowledge on Earth degassing firstly by making data more accessible and encouraging participation among researchers, and secondly by allowing to observe and explore, for the first time, a gas emission dataset with spatial and temporal extents never analyzed before.
NASA Astrophysics Data System (ADS)
Bagli, Stefano; Pistocchi, Alberto; Mazzoli, Paolo; Borga, Marco; Bertoldi, Giacomo; Brenner, Johannes; Luzzi, Valerio
2016-04-01
Climate change, increasing pressure on farmland to satisfy growing demand, and the need to ensure environmental quality require an increasing capacity for water management if agriculture is to remain competitive. In this context, web-based tools for forecasting and monitoring the hydrological conditions of topsoil can be an effective means to save water, maximize crop protection, and reduce soil loss and the leaching of pollutants. Such tools need to be targeted to their users and be accessible in a simple way in order to allow adequate uptake in practice. IASMHYN ("Improved management of Agricultural Systems by Monitoring and Hydrological evaluation") is a web mapping service designed to provide and update, on a daily basis, the main water budget variables for farmland management. A beta version of the tool is available at www.gecosistema.com/iasmhyn. IASMHYN is an instrument for "second level monitoring" that takes accurate hydro-meteorological information from ground stations and remote sensing sources and turns it into practically usable decision variables for precision farming, making use of geostatistical analysis and hydrological models. The main routines embedded in IASMHYN exclusively use open-source libraries (R packages and Python) to perform the following operations: (1) automatic acquisition of observed data, both from ground stations and remote sensing, concerning precipitation (RADAR) and temperature (MODIS-LST) available from various sources; (2) interpolation of the acquisitions through regression kriging (sketched below) in order to map the meteorological data spatially; (3) running of hydrological models to obtain spatial information on hydrological soil variables of immediate interest in agriculture. The real-time results are available through a web interface and provide the user with spatial maps and time series of the following variables, supporting decisions on irrigation, soil protection from erosion, and pollution risk of groundwater and streams: daily precipitation and its characteristics (rain, snow or hail, rain erosiveness); maximum, minimum and average daily temperature; soil water content (SWC); infiltration into the deep layers of the soil and surface runoff; potential loss of soil due to erosion; and residence time of a possible chemical (pesticides, fertilizers) applied to the soil. Thematic real-time maps are produced to support user decisions on irrigation, soil management, and pesticide/fertilizer application. The ongoing project will also lead to validation and improvement of estimates of hydrological variables from satellite imagery and radar data. The tool has been cross-validated with estimates of evapotranspiration and soil water content at agricultural sites in South Tyrol (Italy) in the framework of the MONALISA project (http://www.monalisa-project.eu). A comparison with physically based models, satellite imagery and radar data will allow further generalization of the product. The ultimate goal of the tool is to bring to market a service that is generally applicable in Europe, using commonly available data, to provide individual farmers and organizations with effective and up-to-date information for planning and programming their activities.
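Step (2), regression kriging, can be sketched in a few lines: fit a trend on covariates (here a simple elevation-based lapse model) and apply ordinary kriging to the residuals. This is a generic illustration with invented station data and the pykrige/scikit-learn libraries as an assumed toolset; it is not IASMHYN's actual routine.

```python
# Minimal regression-kriging sketch: trend on covariates + ordinary kriging of
# the residuals. Station coordinates, temperatures, elevations and the grid
# are invented example data, not IASMHYN inputs.
import numpy as np
from sklearn.linear_model import LinearRegression
from pykrige.ok import OrdinaryKriging

# Synthetic station observations: lon, lat, elevation (m), daily mean T (degC)
lon = np.array([11.0, 11.2, 11.4, 11.1, 11.3])
lat = np.array([46.3, 46.5, 46.4, 46.6, 46.2])
elev = np.array([250.0, 900.0, 1500.0, 600.0, 1200.0])
temp = np.array([14.2, 10.1, 6.3, 12.0, 8.0])

# 1) Trend: regress temperature on elevation (lapse-rate-like model).
trend = LinearRegression().fit(elev.reshape(-1, 1), temp)
residuals = temp - trend.predict(elev.reshape(-1, 1))

# 2) Ordinary kriging of the residuals onto a regular grid.
grid_lon = np.linspace(11.0, 11.4, 20)
grid_lat = np.linspace(46.2, 46.6, 20)
ok = OrdinaryKriging(lon, lat, residuals, variogram_model="linear")
res_grid, _ = ok.execute("grid", grid_lon, grid_lat)

# 3) Add the trend back using a (constant, purely illustrative) grid elevation.
grid_elev = np.full(res_grid.shape, 800.0)
temp_grid = trend.predict(grid_elev.reshape(-1, 1)).reshape(res_grid.shape) + res_grid
print(temp_grid.shape)
```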
Global dust sources detection using MODIS Deep Blue Collection 6 aerosol products
NASA Astrophysics Data System (ADS)
Pérez García-Pando, C.; Ginoux, P. A.
2015-12-01
Our understanding of the global dust cycle is limited by a dearth of information about dust sources, especially small-scale features that could account for a large fraction of global emissions. Remote sensing is the most useful tool for locating dust sources. Relevant sensors include microwave and visible channels as well as lidar. On the global scale, major dust source regions have been identified using polar-orbiting satellite instruments. The MODIS Deep Blue algorithm has been particularly useful for detecting small-scale sources such as floodplains, alluvial fans, rivers, and wadis, as well as for identifying anthropogenic sources associated with agriculture. The recent release of Collection 6 MODIS aerosol products makes it possible to extend dust source detection to all land surfaces, which is useful for identifying mid- to high-latitude dust sources and for detecting not only dust from agriculture but also fugitive dust from transport and industrial activities. This presentation will overview the advantages and drawbacks of using MODIS Deep Blue for dust detection in comparison with other instruments (polar-orbiting and geostationary). The results of Collection 6 with a new dust screening will be compared against AERONET. Applications to long-range transport of anthropogenic dust will be presented.
ALFAZOA Deep HI Survey to Identify Galaxies in the ZOA 37° ≦ l ≦ 43° and -2.5° ≦ b ≦ 3°
NASA Astrophysics Data System (ADS)
Palencia, Kelby; Minchin, Robert; Sanchez, Monica; Henning, Patricia; Taylor, Rhys
2018-01-01
The region of sky where our own Galaxy lies, as viewed from the solar system, is called the Zone of Avoidance (ZOA). Due to extinction and confusion in the ZOA, sources behind it appear to be blocked. This project works with data from the Arecibo ALFAZOA Deep survey to identify galaxies in the ZOA within 37° ≦ l ≦ 43° and -2.5° ≦ b ≦ 3°. ALFAZOA Deep surveyed part of the inner Galaxy in the ZOA for the first time and is more sensitive than the previous ALFAZOA Shallow survey. FRELLED and Miriad were used to identify and analyze the data in this region. From these data, 57 sources were identified. Of these 57 sources, 51 were galaxies, 3 of which were previously known, leaving 48 new galaxies; the remaining 6 sources require follow-up. Two groups of galaxies were also identified, one lying around 1,500-3,200 km/s and the other between 10,600-11,700 km/s in redshift. The sources in the 10,600-11,700 km/s group also need follow-up, as they lie near the part of the spectrum where the receiver signal starts to weaken.
[The modern sources for making a medical geography description].
2014-02-01
The current article is dedicated to the use of the Internet for acquiring medical geography information. The vast majority of modern domestic reference manuals are neither reliable nor up to date. At a time when foreign printed sources are not easily accessible, foreign web resources often become the main source of information. The article provides practical advice on how to find general, medical, and military-medical data on the web. The necessity of careful cross-validation of all obtained data is emphasized in order to be confident of their reliability.
NASA Astrophysics Data System (ADS)
Hunt, Brian P. V.; Carlotti, François; Donoso, Katty; Pagano, Marc; D'Ortenzio, Fabrizio; Taillandier, Vincent; Conan, Pascal
2017-08-01
Knowledge of the relative contributions of phytoplankton size classes to zooplankton biomass is necessary to understand food-web functioning and response to climate change. During the Deep Water formation Experiment (DEWEX), conducted in the north-west Mediterranean Sea in winter (February) and spring (April) of 2013, we investigated phytoplankton-zooplankton trophic links in contrasting oligotrophic and eutrophic conditions. Size fractionated particulate matter (pico-POM, nano-POM, and micro-POM) and zooplankton (64 to >4000 μm) composition and carbon and nitrogen stable isotope ratios were measured inside and outside the nutrient-rich deep convection zone in the central Liguro-Provencal basin. In winter, phytoplankton biomass was low (0.28 mg m-3) and evenly spread among picophytoplankton, nanophytoplankton, and microphytoplankton. Using an isotope mixing model, we estimated average contributions to zooplankton biomass by pico-POM, nano-POM, and micro-POM of 28, 59, and 15%, respectively. In spring, the nutrient poor region outside the convection zone had low phytoplankton biomass (0.58 mg m-3) and was dominated by pico/nanophytoplankton. Estimated average contributions to zooplankton biomass by pico-POM, nano-POM, and micro-POM were 64, 28 and 10%, respectively, although the model did not differentiate well between pico-POM and nano-POM in this region. In the deep convection zone, spring phytoplankton biomass was high (1.34 mg m-3) and dominated by micro/nano phytoplankton. Estimated average contributions to zooplankton biomass by pico-POM, nano-POM, and micro-POM were 42, 42, and 20%, respectively, indicating that a large part of the microphytoplankton biomass may have remained ungrazed.
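A two-tracer, three-source linear mixing model of the general kind referenced above can be written as a small mass-balance system. The isotope signatures and trophic enrichment factors in this sketch are invented for illustration and are not the DEWEX data; real applications typically use Bayesian mixing models that propagate uncertainty.

```python
# Minimal three-source, two-tracer isotope mixing model (illustrative values).
import numpy as np

# Source signatures (delta13C, delta15N) for pico-, nano- and micro-POM (invented).
sources = np.array([
    [-24.0, 2.0],    # pico-POM
    [-22.0, 3.0],    # nano-POM
    [-19.5, 4.5],    # micro-POM
])

# Consumer (zooplankton) signature, corrected by assumed trophic enrichment
# factors (~0.8 per mil for C, ~2.5 per mil for N per trophic level).
consumer = np.array([-21.5, 5.5])
enrichment = np.array([0.8, 2.5])
corrected = consumer - enrichment

# Mass balance: the source fractions must sum to 1 and reproduce both tracers.
A = np.vstack([sources.T, np.ones(3)])   # 3 equations x 3 unknown fractions
b = np.append(corrected, 1.0)
fractions = np.linalg.solve(A, b)        # point estimate; may fall outside [0, 1]
print(dict(zip(["pico-POM", "nano-POM", "micro-POM"], fractions.round(2))))
```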
Kang, Sokbom; Lee, Jong-Min; Lee, Jae-Kwan; Kim, Jae-Weon; Cho, Chi-Heum; Kim, Seok-Mo; Park, Sang-Yoon; Park, Chan-Yong; Kim, Ki-Tae
2014-03-01
The purpose of this study is to develop a Web-based nomogram for predicting the individualized risk of para-aortic nodal metastasis in incompletely staged patients with endometrial cancer. From 8 institutions, the medical records of 397 patients who underwent pelvic and para-aortic lymphadenectomy as a surgical staging procedure were retrospectively reviewed. A multivariate logistic regression model was created and internally validated by rigorous bootstrap resampling methods. Finally, the model was transformed into a user-friendly Web-based nomogram (http://www.kgog.org/nomogram/empa001.html). The rate of para-aortic nodal metastasis was 14.4% (57/397 patients). Using stepwise variable selection, 4 variables were finally adopted: deep myometrial invasion, non-endometrioid subtype, lymphovascular space invasion, and log-transformed CA-125 levels. After 1000 repetitions of bootstrapping, all 4 variables retained a significant association with para-aortic nodal metastasis in the multivariate analysis: deep myometrial invasion (P = 0.001), non-endometrioid histologic subtype (P = 0.034), lymphovascular space invasion (P = 0.003), and log-transformed serum CA-125 levels (P = 0.004). The model showed good discrimination (C statistic = 0.87; 95% confidence interval, 0.82-0.92) and accurate calibration (Hosmer-Lemeshow P = 0.74). This nomogram showed good performance in predicting para-aortic metastasis in patients with endometrial cancer. The tool may be useful in determining the extent of lymphadenectomy after incomplete surgery.
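The modelling workflow described above (multivariable logistic regression, bootstrap validation, discrimination measured by the C statistic) can be sketched generically in Python. The synthetic data, coefficients, and the particular optimism-correction scheme below are illustrative assumptions, not the study's data or exact procedure.

```python
# Generic sketch: logistic regression with a bootstrap optimism correction of
# the C statistic (AUC). Predictors mirror the four named in the abstract, but
# all values are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.integers(0, 2, n),          # deep myometrial invasion (0/1)
    rng.integers(0, 2, n),          # non-endometrioid subtype (0/1)
    rng.integers(0, 2, n),          # lymphovascular space invasion (0/1)
    rng.normal(3.0, 1.0, n),        # log-transformed CA-125
])
logit = -3.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1] + 1.0 * X[:, 2] + 0.6 * X[:, 3]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

# Bootstrap optimism: refit on resamples and compare resample AUC to original-data AUC.
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    boot = LogisticRegression().fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], boot.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(y, boot.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)

corrected_auc = apparent_auc - np.mean(optimism)
print(f"apparent C = {apparent_auc:.3f}, optimism-corrected C = {corrected_auc:.3f}")
```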
Arcagni, Marina; Juncos, Romina; Rizzo, Andrea; Pavlin, Majda; Fajon, Vesna; Arribére, María A; Horvat, Milena; Ribeiro Guevara, Sergio
2018-01-15
Niche segregation between introduced and native fish in Lake Nahuel Huapi, a deep oligotrophic lake in Northwest Patagonia (Argentina), occurs through the consumption of different prey. Therefore, in this work we analyzed total mercury [THg] and methylmercury [MeHg] concentrations in top predator fish and in their main prey to test whether their feeding habits influence [Hg]. Results indicate that [THg] and [MeHg] varied by foraging habitat in Lake Nahuel Huapi, increasing with a greater proportion of benthic diet and decreasing with pelagic diet. This is consistent with the fact that the native creole perch, a mostly benthivorous feeder that shares the highest trophic level of the food web with introduced salmonids, had higher [THg] and [MeHg] than the more pelagic-feeding rainbow trout and the bentho-pelagic-feeding brown trout. This differential THg and MeHg bioaccumulation observed in native and introduced fish provides evidence for the hypothesis that there are two main Hg transfer pathways from the base of the food web to top predators: a pelagic pathway, in which Hg is transferred from water through plankton (with Hg mostly in inorganic species) and forage fish to salmonids, and a benthic pathway, in which Hg is transferred from the sediments (where most Hg methylation occurs) through crayfish (with higher [MeHg] than plankton) to native fish, leading to one-fold higher [Hg]. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Bourgeois, Solveig; Witte, Ursula; Harrison, Ailish M.; Makela, Anni; Kazanidis, Georgios; Archambault, Philippe
2016-04-01
Ongoing climate change in the Arctic is causing drastic alteration of Arctic marine ecosystem functioning, such as shifts in patterns of primary production, and is modifying the presently tight pelagic-benthic coupling. Consequently, benthic communities, which rely upon organic matter produced in the top layers of the ocean, will also be affected by these changes. Benthic megafaunal communities play a significant role in ecological processes and ecosystem functioning (e.g., organic matter recycling, bioturbation, and serving as a food source for higher trophic levels). Yet information is scarce regarding the main food sources of dominant benthic organisms, and therefore the impact of the ongoing changes is difficult to assess. The goal of this study is to investigate the preferential feeding on different carbon sources by megabenthic organisms in the Canadian High Arctic and to identify the environmental drivers that explain the observed trends. In summer 2013, benthic megafauna were collected at 9 stations spread along latitudinal (58 to 81°N) and longitudinal (62 to 114°W) transects in Baffin Bay and Parry Channel, respectively. Carbon and nitrogen bulk stable isotope analyses (δ13C and δ15N) were performed on several species divided into groups according to their feeding type. This study highlights distinct trends in δ13C values of benthic organisms, suggesting the importance of both phytoplankton and ice algae as carbon sources for megafauna in the Canadian High Arctic. The importance of physical and biological parameters as drivers of food web structure will furthermore be discussed.
Class Projects on the Internet.
ERIC Educational Resources Information Center
Nicholson, Danny
1996-01-01
Discusses the use of the Internet in the classroom. Presents a project on renewable energy sources in which students produce web pages. Provides the web page address of the project completed by students. (ASK)
Development of an IHE MRRT-compliant open-source web-based reporting platform.
Pinto Dos Santos, Daniel; Klos, G; Kloeckner, R; Oberle, R; Dueber, C; Mildenberger, P
2017-01-01
To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform into clinical routine. The reporting platform uses standard web technologies (HTML / JavaScript and PHP / MySQL) only. Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports were generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. • A platform for structured reporting using IHE MRRT-compliant templates is presented. • Incorporating structured reporting into clinical routine is feasible. • Full source code will be provided upon request under a free license.
Reducing methylmercury accumulation in the food webs of San Francisco Bay and its local watersheds.
Davis, J A; Looker, R E; Yee, D; Marvin-Di Pasquale, M; Grenier, J L; Austin, C M; McKee, L J; Greenfield, B K; Brodberg, R; Blum, J D
2012-11-01
San Francisco Bay (California, USA) and its local watersheds present an interesting case study in estuarine mercury (Hg) contamination. This review focuses on the most promising avenues for attempting to reduce methylmercury (MeHg) contamination in Bay Area aquatic food webs and identifying the scientific information that is most urgently needed to support these efforts. Concern for human exposure to MeHg in the region has led to advisories for consumption of sport fish. Striped bass from the Bay have the highest average Hg concentration measured for this species in USA estuaries, and this degree of contamination has been constant for the past 40 years. Similarly, largemouth bass in some Bay Area reservoirs have some of the highest Hg concentrations observed in the entire US. Bay Area wildlife, particularly birds, face potential impacts to reproduction based on Hg concentrations in the tissues of several Bay species. Source control of Hg is one of the primary possible approaches for reducing MeHg accumulation in Bay Area aquatic food webs. Recent findings (particularly Hg isotope measurements) indicate that the decades-long residence time of particle-associated Hg in the Bay is sufficient to allow significant conversion of even the insoluble forms of Hg into MeHg. Past inputs have been thoroughly mixed throughout this shallow and dynamic estuary. The large pool of Hg already present in the ecosystem dominates the fraction converted to MeHg and accumulating in the food web. Consequently, decreasing external Hg inputs can be expected to reduce MeHg in the food web, but it will likely take many decades to centuries before those reductions are achieved. Extensive efforts to reduce loads from the largest Hg mining source (the historic New Almaden mining district) are underway. Hg is spread widely across the urban landscape, but there are a number of key sources, source areas, and pathways that provide opportunities to capture larger quantities of Hg and reduce loads from urban runoff. Atmospheric deposition is a lower priority for source control in the Bay Area due to a combination of a lack of major local sources. Internal net production of MeHg is the dominant source of MeHg that enters the food web. Controlling internal net production is the second primary management approach, and has the potential to reduce food web MeHg in some habitats more effectively and within a much shorter time-frame. Controlling net MeHg production and accumulation in the food web of upstream reservoirs and ponds is very promising due to the many features of these ecosystems that can be manipulated. The most feasible control options in tidal marshes relate to the design of flow patterns and subhabitats in restoration projects. Options for controlling MeHg production in open Bay habitat are limited due primarily to the highly dispersed distribution of Hg throughout the ecosystem. Other changes in these habitats may also have a large influence on food web MeHg, including temperature changes due to global warming, sea level rise, food web alterations due to introduced species and other causes, and changes in sediment supply. Other options for reducing or mitigating exposure and risk include controlling bioaccumulation, cleanup of contaminated sites, and reducing other factors (e.g., habitat availability) that limit at-risk wildlife populations. Copyright © 2012 Elsevier Inc. All rights reserved.
Reducing methylmercury accumulation in the food webs of San Francisco Bay and its local watersheds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, J.A., E-mail: jay@sfei.org; Looker, R.E.; Yee, D.
San Francisco Bay (California, USA) and its local watersheds present an interesting case study in estuarine mercury (Hg) contamination. This review focuses on the most promising avenues for attempting to reduce methylmercury (MeHg) contamination in Bay Area aquatic food webs and identifying the scientific information that is most urgently needed to support these efforts. Concern for human exposure to MeHg in the region has led to advisories for consumption of sport fish. Striped bass from the Bay have the highest average Hg concentration measured for this species in USA estuaries, and this degree of contamination has been constant for the past 40 years. Similarly, largemouth bass in some Bay Area reservoirs have some of the highest Hg concentrations observed in the entire US. Bay Area wildlife, particularly birds, face potential impacts to reproduction based on Hg concentrations in the tissues of several Bay species. Source control of Hg is one of the primary possible approaches for reducing MeHg accumulation in Bay Area aquatic food webs. Recent findings (particularly Hg isotope measurements) indicate that the decades-long residence time of particle-associated Hg in the Bay is sufficient to allow significant conversion of even the insoluble forms of Hg into MeHg. Past inputs have been thoroughly mixed throughout this shallow and dynamic estuary. The large pool of Hg already present in the ecosystem dominates the fraction converted to MeHg and accumulating in the food web. Consequently, decreasing external Hg inputs can be expected to reduce MeHg in the food web, but it will likely take many decades to centuries before those reductions are achieved. Extensive efforts to reduce loads from the largest Hg mining source (the historic New Almaden mining district) are underway. Hg is spread widely across the urban landscape, but there are a number of key sources, source areas, and pathways that provide opportunities to capture larger quantities of Hg and reduce loads from urban runoff. Atmospheric deposition is a lower priority for source control in the Bay Area due to a combination of a lack of major local sources. Internal net production of MeHg is the dominant source of MeHg that enters the food web. Controlling internal net production is the second primary management approach, and has the potential to reduce food web MeHg in some habitats more effectively and within a much shorter time-frame. Controlling net MeHg production and accumulation in the food web of upstream reservoirs and ponds is very promising due to the many features of these ecosystems that can be manipulated. The most feasible control options in tidal marshes relate to the design of flow patterns and subhabitats in restoration projects. Options for controlling MeHg production in open Bay habitat are limited due primarily to the highly dispersed distribution of Hg throughout the ecosystem. Other changes in these habitats may also have a large influence on food web MeHg, including temperature changes due to global warming, sea level rise, food web alterations due to introduced species and other causes, and changes in sediment supply. Other options for reducing or mitigating exposure and risk include controlling bioaccumulation, cleanup of contaminated sites, and reducing other factors (e.g., habitat availability) that limit at-risk wildlife populations.
The Impact of PeerWise Approach on the Academic Performance of Medical Students
ERIC Educational Resources Information Center
Kadir, Farkaad A.; Ansari, Reshma M.; AbManan, Norhafizah; Abdullah, Mohd Hafiz Ngoo; Nor, Hamdan Mohd
2014-01-01
PeerWise is a novel, freely available, online pedagogical tool that allows students to create and deposit questions for peer evaluation. A participatory learning approach through this web-based system was used to motivate and promote a deep approach in learning nervous system by 124 second year MBBS students at Cyberjaya University College of…
McCain Emphasizes School Choice, Accountability, but Lacks Specifics
ERIC Educational Resources Information Center
Hoff, David J.
2008-01-01
Buried deep within the campaign Web site of Senator John McCain, the Arizona Republican explains the principles that define his K-12 agenda: choice, accountability, and teacher quality. But his 25-year congressional record and statements in his current campaign do give a glimpse of what Senator McCain--better known for his views on defense and…
Live Synchronous Web Meetings in Asynchronous Online Courses: Reconceptualizing Virtual Office Hours
ERIC Educational Resources Information Center
Lowenthal, Patrick R.; Snelson, Chareen; Dunlap, Joanna C.
2017-01-01
Most online courses rely solely on asynchronous text-based online communication. This type of communication can foster anytime, anywhere reflection, critical thinking, and deep learning. However, it can also frustrate participants because of the lack of spontaneity and visual cues and the time it takes for conversations to develop and feedback to…
NASA Astrophysics Data System (ADS)
Nikkhoo, Mehdi; Walter, Thomas R.; Lundgren, Paul; Spica, Zack; Legrand, Denis
2016-04-01
The Azufre-Lastarria volcanic complex in the central Andes has been recognized as a major region of magma intrusion. Deep and shallow inflating reservoirs, inferred through InSAR time-series inversions, are the main sources of multi-scale deformation accompanied by pronounced fumarolic activity. The possible interactions between these reservoirs, as well as the paths of propagating fluids and the development of their pathways, have not, however, been investigated. Results from recent seismic noise tomography in the area show localized zones of shear-wave velocity anomalies, with a low shear-wave velocity region at 1 km depth and another at 4 km depth beneath Lastarria. Although the inferred shallow zone is in good agreement with the location of the shallow deformation source, the deep zone does not correspond to any deformation source in the area. Here, using the boundary element method (BEM), we have performed an in-depth continuum-mechanical investigation of the available ascending and descending InSAR data. We modelled the deep source, taking into account the effects of topography and complex source geometry on the inversion. After calculating the stress field induced by this source, we apply Paul's criterion (a variation on Mohr-Coulomb failure) to identify locations liable to failure. We show that the locations of tensile and shear failure coincide almost perfectly with the shallow and deep shear-wave velocity anomalies, respectively. Based on the stress-change models, we conjecture that the deep reservoir controls the development of shallower hydrothermal fluids, a hypothesis that can be tested and applied to other volcanoes.
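The failure-mapping step can be illustrated with the classical Mohr-Coulomb form of the Coulomb failure stress evaluated on candidate planes. This is a simplified stand-in for the Paul criterion actually used in the study, and the stress tensor, friction coefficient, and plane orientations below are invented numbers.

```python
# Simplified illustration: Coulomb failure stress on candidate planes from a
# given stress tensor (tension-positive convention). Values are invented.
import numpy as np

def coulomb_stress(sigma, normal, mu=0.6, cohesion=0.0):
    """Coulomb failure stress (positive = promotes failure) on a plane with
    unit normal `normal`, given stress tensor `sigma` in Pa (tension positive)."""
    normal = np.asarray(normal, dtype=float)
    normal /= np.linalg.norm(normal)
    traction = sigma @ normal
    sigma_n = traction @ normal                          # normal stress on the plane
    tau = np.linalg.norm(traction - sigma_n * normal)    # shear stress on the plane
    return tau + mu * sigma_n - cohesion

# Example stress tensor (Pa), as might be sampled from a BEM model output.
sigma = np.array([[ 2.0e5,  5.0e4, 0.0],
                  [ 5.0e4, -1.0e5, 0.0],
                  [ 0.0,    0.0,  -3.0e5]])

for plane_normal in ([1, 0, 0], [0, 1, 0], [1, 1, 0]):
    print(plane_normal, f"{coulomb_stress(sigma, plane_normal):.3e} Pa")
```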
Brazin, Lillian R
2006-01-01
This is the final biennial update listing directories, journal articles, Web sites, and general books that aid the librarian, house officer, or medical student in finding information on medical residency and fellowship programs. The World Wide Web provides the most complete and up-to-date source of information about postgraduate training programs and specialties. This update continues to go beyond postgraduate training resources to include selected Web sites and books on curriculum vitae writing, practice management, personal finances, the "Match," certification and licensure examination preparation, lifestyle issues, job hunting, and the DEA license application process. Print resources are included if they provide information not on the Internet, have features that are particularly useful, or cover too many relevant topics in depth to be covered in a journal article or on a Web site. The Internet is a major marketing tool for hospitals seeking to recruit the best and brightest physicians for their training programs. Even the smallest community hospital has a Web site.
Levels-of-processing effect on internal source monitoring in schizophrenia
RAGLAND, J. DANIEL; McCARTHY, ERIN; BILKER, WARREN B.; RENSINGER, COLLEEN M. B; VALDEZ, JEFFREY; KOHLER, CHRISTIAN; GUR, RAQUEL E.; GUR, RUBEN C.
2015-01-01
Background Recognition can be normalized in schizophrenia by providing patients with semantic organizational strategies through a levels-of-processing (LOP) framework. However, patients may rely primarily on familiarity effects, making recognition less sensitive than source monitoring to the strength of the episodic memory trace. The current study investigates whether providing semantic organizational strategies can also normalize patients’ internal source-monitoring performance. Method Sixteen clinically stable medicated patients with schizophrenia and 15 demographically matched healthy controls were asked to identify the source of remembered words following an LOP-encoding paradigm in which they alternated between processing words on a ‘shallow’ perceptual versus a ‘deep’ semantic level. A multinomial analysis provided orthogonal measures of item recognition and source discrimination, and bootstrapping generated variance to allow for parametric analyses. LOP and group effects were tested by contrasting recognition and source-monitoring parameters for words that had been encoded during deep versus shallow processing conditions. Results As in a previous study, there were no group differences in LOP effects on recognition performance, with patients and controls benefiting equally from deep versus shallow processing. Although there were no group differences in internal source monitoring, only controls had significantly better performance for words processed during the deep encoding condition. Patient performance did not correlate with clinical symptoms or medication dose. Conclusions Providing a deep processing semantic encoding strategy significantly improved patients’ recognition performance only. The lack of a significant LOP effect on internal source monitoring in patients may reflect subtle problems in the relational binding of semantic information that are independent of strategic memory processes. PMID:16608558
Clinic expert information extraction based on domain model and block importance model.
Zhang, Yuanpeng; Wang, Li; Qian, Danmin; Geng, Xingyun; Yao, Dengfu; Dong, Jiancheng
2015-11-01
To extract expert clinic information from the Deep Web, two challenges must be faced. The first is making a judgment about forms. A novel method is proposed based on a domain model, a tree structure constructed from the attributes of query interfaces. With this model, query interfaces can be classified to a domain and filled in with domain keywords. The second challenge is extracting information from the response Web pages indexed by the query interfaces. To filter out noisy information on a Web page, a block importance model is proposed in which both content and spatial features are taken into account. The experimental results indicate that the domain model yields a precision 4.89% higher than that of the rule-based method, whereas the block importance model yields an F1 measure 10.5% higher than that of the XPath method. Copyright © 2015 Elsevier Ltd. All rights reserved.
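The published block importance model is not reproduced in the abstract, so the sketch below only illustrates the general idea of scoring page blocks by a weighted combination of content and spatial features; the specific features, weights, and threshold are assumptions for demonstration.

```python
# Illustrative block-importance scoring: combine content features (text length,
# link density) with spatial features (position, area). Weights are assumptions.
from dataclasses import dataclass

@dataclass
class Block:
    text_len: int        # characters of visible text (content feature)
    link_density: float  # linked chars / total chars (content feature)
    center_dist: float   # normalized distance of block center from page center (spatial)
    area_ratio: float    # block area / page area (spatial)

def importance(b: Block) -> float:
    """Higher scores suggest main content; lower scores suggest navigation/ads."""
    content = 0.5 * min(b.text_len / 1000.0, 1.0) + 0.2 * (1.0 - b.link_density)
    spatial = 0.2 * (1.0 - b.center_dist) + 0.1 * min(b.area_ratio * 4.0, 1.0)
    return content + spatial

blocks = [
    Block(text_len=2400, link_density=0.05, center_dist=0.1, area_ratio=0.40),  # result block
    Block(text_len=150,  link_density=0.90, center_dist=0.9, area_ratio=0.05),  # nav bar
]
main_content = [b for b in blocks if importance(b) >= 0.6]
print([round(importance(b), 2) for b in blocks], len(main_content))
```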
Development of the EM tomography system by the vertical electromagnetic profiling (VEMP) method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miura, Y.; Osato, K.; Takasugi, S.
1995-12-31
As a part of the "Deep-Seated Geothermal Resources Survey" project being undertaken by the NEDO, the Vertical ElectroMagnetic Profiling (VEMP) method is being developed to accurately obtain deep resistivity structure. The VEMP method acquires multi-frequency, three-component magnetic field data in an open hole well using controlled sources (loop sources or grounded-wire sources) emitted at the surface. Numerical simulation using EM3D demonstrated that the phase data of the VEMP method are very sensitive to resistivity structure and will also indicate the presence of deep anomalies. Forward modelling was also used to determine the required transmitter moments for various grounded-wire and loop sources for a field test using the WD-1 well in the Kakkonda geothermal area. Field logging of the well was carried out in May 1994, and the processed field data match the simulated data well.
ERIC Educational Resources Information Center
Smith, Karl D.
1977-01-01
Explains an upper elementary game of tag that illustrates energy flow in food webs using candy bars as food sources. A follow-up field trip to a river and five language arts projects are also suggested. (CS)
Surfing the Web and Parkinson's law.
Baldwin, F D
1996-05-01
The World Wide Web accounts for much of the popular interest in the Internet and offers a rich and variegated source of medical information. It's where you'll find online attractions ranging from "The Visible Human" to collections of lawyer jokes, as well as guides to clinical materials. Here's a basic introduction to the Web, its features, and its vocabulary.
ERIC Educational Resources Information Center
Griffin, Teresa; Cohen, Deb
2012-01-01
The ubiquity and familiarity of the world wide web means that students regularly turn to it as a source of information. In doing so, they "are said to rely heavily on simple search engines, such as Google to find what they want." Researchers have also investigated how students use search engines, concluding that "the young web users tended to…
Guiding Students in Using the World Wide Web for Research.
ERIC Educational Resources Information Center
Kubly, Kristin
This paper addresses the need for educators and librarians to guide students in using the World Wide Web appropriately by teaching them to evaluate Internet resources using criteria designed to identify the authoritative sources. The pros and cons of information commonly found on the Web are discussed, as well as academic Internet subject or…
Measuring Law Library Catalog Web Site Usability: A Web Analytic Approach
ERIC Educational Resources Information Center
Fang, Wei; Crawford, Marjorie E.
2008-01-01
Although there is a proliferation of information available on the Web, and law professors, students, and other users have a variety of channels to locate information and complete their research activities, the law library catalog still remains an important source for offering users access to information that has been evaluated and cataloged by…
40 CFR 63.3370 - How do I demonstrate compliance with the emission standards?
Code of Federal Regulations, 2010 CFR
2010-07-01
... SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating ... as-purchased coating material, i, in a month, kg. Mvret = Mass of volatile matter retained in the coated web after curing or ...
FASTQ quality control dashboard
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-07-25
FQCDB builds upon existing open-source software, FastQC, implementing a modern web interface across the parsed output of FastQC. In addition, FQCDB is extensible as a web service to include additional plots of type line, boxplot, or heatmap across data formatted according to its guidelines. The interface is also configurable via a more readable JSON format, enabling customization by non-web programmers.
DeepBlue epigenomic data server: programmatic data retrieval and analysis of epigenome region sets
Albrecht, Felipe; List, Markus; Bock, Christoph; Lengauer, Thomas
2016-01-01
Large amounts of epigenomic data are generated under the umbrella of the International Human Epigenome Consortium, which aims to establish 1000 reference epigenomes within the next few years. These data have the potential to unravel the complexity of epigenomic regulation. However, their effective use is hindered by the lack of flexible and easy-to-use methods for data retrieval. Extracting region sets of interest is a cumbersome task that involves several manual steps: identifying the relevant experiments, downloading the corresponding data files and filtering the region sets of interest. Here we present the DeepBlue Epigenomic Data Server, which streamlines epigenomic data analysis as well as software development. DeepBlue provides a comprehensive programmatic interface for finding, selecting, filtering, summarizing and downloading region sets. It contains data from four major epigenome projects, namely ENCODE, ROADMAP, BLUEPRINT and DEEP. DeepBlue comes with a user manual, examples and a well-documented application programming interface (API). The latter is accessed via the XML-RPC protocol supported by many programming languages. To demonstrate usage of the API and to enable convenient data retrieval for non-programmers, we offer an optional web interface. DeepBlue can be openly accessed at http://deepblue.mpi-inf.mpg.de. PMID:27084938
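Because the API is exposed over XML-RPC, it can be reached from Python's standard library alone. The endpoint path, command name, and the anonymous user key in this sketch are assumptions for illustration; the authoritative command list and signatures are in the DeepBlue user manual and API documentation.

```python
# Hedged sketch only: Python's standard xmlrpc.client talking to an XML-RPC
# endpoint. Endpoint path, command name and key below are assumptions.
import xmlrpc.client

url = "http://deepblue.mpi-inf.mpg.de/xmlrpc"   # hypothetical endpoint path
server = xmlrpc.client.ServerProxy(url, allow_none=True)

user_key = "anonymous_key"                       # placeholder access key

# Hypothetical call: list the genome assemblies known to the server.
# DeepBlue-style commands typically return a (status, value) pair.
status, genomes = server.list_genomes(user_key)
print(status, genomes[:3])
```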
Clinical software development for the Web: lessons learned from the BOADICEA project
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.
Design Drivers of Water Data Services
NASA Astrophysics Data System (ADS)
Valentine, D.; Zaslavsky, I.
2008-12-01
The CUAHSI Hydrologic Information System (HIS) is being developed as a geographically distributed network of hydrologic data sources and functions that are integrated using web services so that they function as a connected whole. The core of the HIS service-oriented architecture is a collection of water web services, which provide uniform access to multiple repositories of observation data. These services use SOAP protocols and communicate WaterML (Water Markup Language). When a client makes a data or metadata request using a CUAHSI HIS web service, the request is made in a standard manner, following the CUAHSI HIS web service signatures, regardless of how the underlying data source may be organized. Also, regardless of the format in which the data are returned by the source, the web services respond to requests by returning the data in the standard format of WaterML. The goal of WaterML design has been to capture the semantics of hydrologic observation discovery and retrieval and to express the point-observations information model as an XML schema. To a large extent, it follows the representation of the information model adopted by the CUAHSI Observations Data Model (ODM) relational design. Another driver of WaterML design is the specifications and metadata adopted by USGS NWIS, EPA STORET, and other federal agencies, as it seeks to provide a common foundation for exchanging both agency data and data collected in multiple academic projects. A further WaterML design principle was to create, in version 1 of HIS in particular, a fairly rigid and simple XML schema that is easy to generate and parse, thus creating the least barrier to adoption by hydrologists. WaterML includes a series of elements that reflect common notions used in describing hydrologic observations, such as site, variable, source, observation series, seriesCatalog, and data values. Each of the three main request methods in the water web services - GetSiteInfo, GetVariableInfo, and GetValues - has a corresponding response element in WaterML: SitesResponse, VariableResponse, and TimeSeriesResponse. The WaterML specification is being adopted by federal agencies. The experimental USGS NWIS Daily Values web service returns a WaterML-compliant TimeSeriesResponse. The National Climatic Data Center is also prototyping WaterML for data delivery and has developed a REST-based service that generates WaterML-compliant output for the NCDC ASOS network. Such agency-supported web services coming online provide a much more efficient way to deliver agency data compared to the web-site scraper services that the CUAHSI HIS project developed initially. The CUAHSI water data web services will continue to serve as the main communication mechanism within CUAHSI HIS, connecting a variety of data sources with a growing set of web service clients being developed in both academia and the commercial sector. The driving forces for the development of the web services continue to be: application experience and needs of the growing number of CUAHSI HIS users, who experiment with additional data types, analysis modes, data browsing and searching strategies, and provide feedback to WaterML developers; data description requirements posed by various federal and state agencies; and harmonization with standards being adopted or developed in neighboring communities, in particular the relevant standards being explored within the Open Geospatial Consortium. CUAHSI WaterML is a standard output schema for CUAHSI HIS water web services.
Its formal specification is available as an OGC discussion paper at www.opengeospatial.org/standards/dp/.
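As a hedged illustration, a WaterOneFlow SOAP method such as GetValues (one of the three request methods named above) could be called from Python with a generic SOAP client like zeep; the WSDL address and the argument names and formats below are assumptions, since each service publishes its own WSDL and parameter conventions.

```python
# Hedged sketch: calling a WaterOneFlow-style SOAP service with zeep.
# The WSDL URL, site/variable codes, and argument names are placeholders.
from zeep import Client

wsdl = "http://example.org/cuahsi_1_1.asmx?WSDL"   # placeholder WSDL address
client = Client(wsdl)                              # requires a live service WSDL

# GetValues returns a WaterML TimeSeriesResponse document.
response = client.service.GetValues(
    location="NWISDV:01646500",    # hypothetical network:site code
    variable="NWISDV:00060",       # hypothetical variable code (e.g. discharge)
    startDate="2008-01-01",
    endDate="2008-01-31",
    authToken="",
)
print(response)
```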
NASA Astrophysics Data System (ADS)
Choi, Woo June; Wang, Ruikang K.
2015-10-01
We report noninvasive, in vivo optical imaging deep within a mouse brain by swept-source optical coherence tomography (SS-OCT), enabled by a 1.3-μm vertical cavity surface emitting laser (VCSEL). VCSEL SS-OCT offers a constant signal sensitivity of 105 dB throughout an entire depth of 4.25 mm in air, ensuring an extended usable imaging depth range of more than 2 mm in turbid biological tissue. Using this approach, we show deep brain imaging in mice with an open-skull cranial window preparation, revealing intact mouse brain anatomy from the superficial cerebral cortex to the deep hippocampus. VCSEL SS-OCT would be applicable to small animal studies for the investigation of deep tissue compartments in living brains where diseases such as dementia and tumors can take their toll.
Chemical stratigraphy of the Apollo 17 deep drill cores 70009-70007
NASA Technical Reports Server (NTRS)
Ehmann, W. D.; Ali, M. Z.
1977-01-01
A description is presented of an analysis of a total of 26 samples from three core segments (70009, 70008, 70007) of the Apollo 17 deep drill string. The deep drill string was taken about 700 m east of the Camelot Crater in the Taurus-Littrow region of the moon. Three core segments have been chemically characterized from the mainly coarse-grained upper portion of the deep drill string. The chemical data suggest that the entire 70007-70009 portion of the deep drill string examined was not deposited as a single unit, but was formed by several events sampling slightly different source materials which may have occurred over a relatively short period of time. According to the data from drill stem 70007, there were at least two phases of deposition. Core segment 70009 is probably derived from somewhat different source material than 70008. It seems to be a very well mixed material.
WIRM: An Open Source Toolkit for Building Biomedical Web Applications
Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.
2002-01-01
This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108
Design and implementation of CUAHSI WaterML and WaterOneFlow Web Services
NASA Astrophysics Data System (ADS)
Valentine, D. W.; Zaslavsky, I.; Whitenack, T.; Maidment, D.
2007-12-01
WaterOneFlow is a term for a group of web services created by and for the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) community. CUAHSI web services facilitate the retrieval of hydrologic observations information from online data sources using the SOAP protocol. CUAHSI Water Markup Language (below referred to as WaterML) is an XML schema defining the format of messages returned by the WaterOneFlow web services.
Artieta-Pinedo, Isabel; Paz-Pascual, Carmen; Grandes, Gonzalo; Villanueva, Gemma
2018-03-01
The aim of this study is to evaluate the quality of web pages found by women when carrying out an exploratory search concerning pregnancy, childbirth, the postpartum period and breastfeeding. This is a descriptive study of the first 25 web pages that appeared in the search engines Google, Yahoo and Bing in October 2014 in the Basque Country (Spain) when entering eight Spanish words and seven English words related to pregnancy, childbirth, the postpartum period, breastfeeding and newborns. Web pages aimed at healthcare professionals and forums were excluded. Reliability was evaluated using the LIDA questionnaire, and the contents of the web pages with the highest scores were then described. A total of 126 web pages were found using the key search words. Of these, 14 scored in the top 30% for reliability. The content analysis of these found that the mean score for "references to the source of the information" was 3.4 (SD: 2.17), that for "up-to-date" was 4.30 (SD: 1.97), and the score for "conflict of interest statement" was 5.90 (SD: 2.16). The mean for web pages created by universities and official bodies was 13.64 (SD: 4.47), whereas the mean for those created by private bodies was 11.23 (SD: 4.51) (F(1,124) = 5.27, p = 0.02). The content analysis of these web pages found that the most commonly discussed topic was breastfeeding, followed by self-care during pregnancy and the onset of childbirth. In this study, web pages from established healthcare or academic institutions were found to contain the most reliable information. The significant number of web pages found in this study with poor-quality information indicates the need for healthcare professionals to guide women when sourcing information online. As the origin of a web page has a direct effect on reliability, the involvement of healthcare professionals in the use, counselling, and generation of new technologies as an intervention tool is increasingly essential. Copyright © 2017 Elsevier Ltd. All rights reserved.
COEUS: “semantic web in a box” for biomedical applications
2012-01-01
Background As the “omics” revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter’s complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. Results COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a “semantic web in a box” approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. Conclusions The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/. PMID:23244467
COEUS: "semantic web in a box" for biomedical applications.
Lopes, Pedro; Oliveira, José Luís
2012-12-17
As the "omics" revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter's complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a "semantic web in a box" approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/.
Burger, Joanna; Gochfeld, Michael; Jeitner, Christian; Pittfield, Taryn; Donio, Mark
2015-01-01
Health and safety professionals, and the public, are interested in the best methods of providing timely information about disasters. The objective of this study was to examine information sources used for Superstorm Sandy with respect to the storm, evacuation routes, shelters, safety, and health issues. Respondents in Central New Jersey and Jersey Shore communities were differentially impacted by the storm. Jersey shore respondents had higher evacuation rates (47 % vs 13 %), higher flood waters in homes, longer power outages (average 23 vs 6 days), and longer periods without internet (29 vs 6 days). Electricity outages disrupted both sources and receivers of communication. Both groups obtained most of their information regarding safety from television, radio, friends and web/email. Information sources on health varied by location, with central Jersey respondents using mainly TV and the web, and Jersey shore respondents obtaining health information from the radio, and TV (before the storm). For information on evacuation routes, Jersey shore respondents obtained information from many sources, while central Jersey respondents obtained it from TV. Information on mold was largely obtained from friends and the web, since mold issues were dealt with several weeks after Sandy. The reliance on traditional sources of information (TV, radio, friends) found in this study suggests that the extreme power outages rendered web, cell phones, and social media on cell phones less usable, and suggests the need for an integrated communication strategy with redundancies that takes into account prolonged power outages over large geographical areas. PMID:24279815
Burger, Joanna; Gochfeld, Michael; Jeitner, Christian; Pittfield, Taryn; Donio, Mark
2013-01-01
Health and safety professionals and the public are interested in the best methods of providing timely information about disasters. The objective of this study was to examine information sources used for Superstorm Sandy with respect to the storm, evacuation routes, shelters, safety, and health issues. Respondents in central New Jersey and Jersey shore communities were differentially impacted by the storm. Jersey shore respondents had higher evacuation rates (47% vs. 13%), higher flood waters in homes, longer power outages (average 23 vs. 6 d), and longer periods without Internet (29 vs. 6 d). Electricity outages disrupted both sources and receivers of communication. Both groups obtained most of their information regarding safety from television, radio, friends, and Web/e-mail. Information sources on health varied by location, with central Jersey respondents using mainly TV and the Web, and Jersey shore respondents obtaining health information from the radio and TV (before the storm). For information on evacuation routes, Jersey shore respondents obtained information from many sources, while central Jersey respondents obtained it from TV. Information on mold was largely obtained from friends and the Web, since mold issues were dealt with several weeks after Sandy. The reliance on traditional sources of information (TV, radio, friends) found in this study suggests that the extreme power outages rendered Web, cell phones, and social media on cell phones less usable, and suggests the need for an integrated communication strategy with redundancies that takes into account prolonged power outages over large geographical areas.
Detecting people of interest from internet data sources
NASA Astrophysics Data System (ADS)
Cardillo, Raymond A.; Salerno, John J.
2006-04-01
In previous papers, we have documented success in determining the key people of interest from a large corpus of real-world evidence. Our recent efforts focus on exploring additional domains and data sources. Internet data sources such as email, web pages, and news feeds make it easier to gather a large corpus of documents for various domains, but detecting people of interest in these sources introduces new challenges. Analyzing these massive sources magnifies entity resolution problems, and demands a storage management strategy that supports efficient algorithmic analysis and visualization techniques. This paper discusses the techniques we used in order to analyze the ENRON email repository, which are also applicable to analyzing web pages returned from our "Buddy" meta-search engine.
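The ENRON techniques themselves are not detailed in this abstract; as a generic, hedged sketch of the kind of analysis such work involves, the snippet below builds a sender-to-recipient graph from parsed email headers and ranks people by degree centrality. The corpus layout, header handling, and the use of degree centrality are illustrative assumptions, not the authors' method.

    # Hypothetical sketch: rank people in an email corpus by a simple centrality
    # measure over the sender -> recipient graph. Corpus layout is assumed.
    import email
    import glob
    import networkx as nx

    G = nx.DiGraph()

    for path in glob.glob("corpus/**/*.eml", recursive=True):  # assumed layout
        with open(path, errors="ignore") as fh:
            msg = email.message_from_file(fh)
        sender = (msg.get("From") or "").strip().lower()
        recipients = [r.strip().lower() for r in (msg.get("To") or "").split(",") if r.strip()]
        for rcpt in recipients:
            if sender and rcpt:
                weight = G.get_edge_data(sender, rcpt, default={}).get("weight", 0)
                G.add_edge(sender, rcpt, weight=weight + 1)

    # Degree centrality as a crude "interest" score; a real system would add
    # entity resolution, temporal analysis, and message-content features.
    scores = nx.degree_centrality(G)
    for person, score in sorted(scores.items(), key=lambda kv: -kv[1])[:10]:
        print(f"{score:.3f}  {person}")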
GaLactic and Extragalactic All-Sky MWA-eXtended (GLEAM-X) survey: Pilot observations
NASA Astrophysics Data System (ADS)
Hurley-Walker, N.; Seymour, N.; Staveley-Smith, L.; Johnston-Hollitt, M.; Kapinska, A.; McKinley, B.
2017-01-01
This proposal is a pilot study for the extension of the highly successful GaLactic and Extragalactic MWA (GLEAM) survey (Wayth et al. 2015). The aim is to test new observing strategies and data reduction techniques suitable for exploiting the longer baselines of the extended Phase 2 MWA array. Deeper and wider surveys at higher resolution will enable a legion of science capabilities pertaining to galaxy evolution, clusters and the cosmic web, whilst maintaining the advantages over LOFAR, including a larger field of view, wider frequency coverage and better sensitivity to extended emission. We will continue the successful drift-scan observing mode to test the feasibility of a large-area survey in 2017-B and onward. We will also target a single deep area with a bright calibrator source to establish the utility of focussed deep observations. In both cases, we will be exploring calibration and imaging strategies across 72-231 MHz with the new long baselines. The published extragalactic sky catalogue (Hurley-Walker et al. 2017) improves the prospects for good ionospheric calibration in this new regime, as well as trivialising flux calibration. The new Alternative Data Release of the TIFR GMRT Sky Survey (TGSS-ADR1; Intema et al. 2016), which has 30" resolution and covers the proposed observing area, allows us to test whether our calibration and imaging strategy correctly recovers the true structure of (high surface-brightness) resolved sources. GLEAM-X will have lower noise, higher surface-brightness sensitivity, and considerably wider bandwidth than TGSS. These properties will enable a wide range of science, such as: detecting and characterising cluster relics and haloes beyond z = 0.45; accurately determining radio source counts at multiple frequencies; measuring the low-frequency luminosity function to z ≈ 0.5; performing Galactic plane science such as HII region detection and cosmic tomography; and determining the typical ionospheric diffractive scale at the MRO, feeding into SKA-Low calibration strategies. In addition, the proposal is designed to be used commensally for transient science, and will also create a more accurate, higher-resolution foreground model for the EoR2 field, allowing better foreground subtraction and therefore increased sensitivity to the EoR signal.
Research and Teaching About the Deep Earth
NASA Astrophysics Data System (ADS)
Williams, Michael L.; Mogk, David W.; McDaris, John
2010-08-01
Understanding the Deep Earth: Slabs, Drips, Plumes and More; Virtual Workshop, 17-19 February and 24-26 February 2010; Images and models of active faults, subducting plates, mantle drips, and rising plumes are spurring new excitement about deep-Earth processes and connections between Earth's internal systems and plate tectonics. The new results and the steady progress of EarthScope's USArray across the country are also providing a special opportunity to reach students and the general public. The pace of discoveries about the deep Earth is accelerating due to advances in experimental, modeling, and sensing technologies; new data processing capabilities; and the installation of new networks, especially the EarthScope facility. EarthScope is an interdisciplinary program that combines geology and geophysics to study the structure and evolution of the North American continent. To explore the current state of deep-Earth science and ways in which it can be brought into the undergraduate classroom, 40 professors attended a virtual workshop given by On the Cutting Edge, a program that strives to improve undergraduate geoscience education through an integrated cooperative series of workshops and Web-based resources. The 6-day, two-part workshop consisted of plenary talks, large and small group discussions, and the development and review of new classroom and laboratory activities.
Deep-water kelp refugia as potential hotspots of tropical marine diversity and productivity.
Graham, Michael H; Kinlan, Brian P; Druehl, Louis D; Garske, Lauren E; Banks, Stuart
2007-10-16
Classic marine ecological paradigms view kelp forests as inherently temperate-boreal phenomena replaced by coral reefs in tropical waters. These paradigms hinge on the notion that tropical surface waters are too warm and nutrient-depleted to support kelp productivity and survival. We present a synthetic oceanographic and ecophysiological model that accurately identifies all known kelp populations and, by using the same criteria, predicts the existence of >23,500 km² of unexplored submerged (30- to 200-m depth) tropical kelp habitats. Predicted tropical kelp habitats were most probable in regions where bathymetry and upwelling resulted in mixed-layer shoaling above the depth of minimum annual irradiance dose for kelp survival. Using model predictions, we discovered extensive new deep-water Eisenia galapagensis populations in the Galápagos that increased in abundance with increasing depth to >60 m, complete with cold-water flora and fauna of temperate affinities. The predictability of deep-water kelp habitat and the discovery of expansive deep-water Galápagos kelp forests validate the extent of deep-water tropical kelp refugia, with potential implications for regional productivity and biodiversity, tropical food web ecology, and understanding of the resilience of tropical marine systems to climate change.
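The habitat criterion summarized above (kelp predicted where the mixed layer shoals above the depth that still receives the minimum annual irradiance dose) can be illustrated with a toy light-attenuation calculation. The exponential decay model and every numeric value below are illustrative assumptions, not the authors' calibrated parameters.

    import numpy as np

    # Toy illustration of the habitat criterion: kelp habitat is predicted where
    # the mixed layer shoals above the deepest depth that still receives the
    # minimum annual irradiance dose. Beer-Lambert attenuation and all constants
    # here are illustrative assumptions only.

    def max_viable_depth(surface_dose, k_attenuation, min_dose):
        """Deepest depth (m) where annual dose >= min_dose, assuming
        dose(z) = surface_dose * exp(-k_attenuation * z)."""
        return np.log(surface_dose / min_dose) / k_attenuation

    def predicts_kelp(mixed_layer_depth, seafloor_depth, surface_dose,
                      k_attenuation, min_dose):
        z_max = max_viable_depth(surface_dose, k_attenuation, min_dose)
        # requires cool, nutrient-rich water (shallow mixed layer) and a
        # seafloor lying within the photic window for kelp
        return mixed_layer_depth < z_max and seafloor_depth <= z_max

    # Made-up example: clear tropical water with a shallow, upwelling-driven
    # mixed layer above a 60 m bank.
    print(predicts_kelp(mixed_layer_depth=40, seafloor_depth=60,
                        surface_dose=50.0, k_attenuation=0.04, min_dose=2.0))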
... on MedlinePlus health topic pages. With the Web service, software developers can build applications that leverage the authoritative, reliable health information in MedlinePlus. The MedlinePlus Web service is free of charge and does not require ...
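The snippet above is truncated; for readers wanting to try the service, the hedged sketch below queries what I believe to be the MedlinePlus Web service endpoint for health topic pages. The URL, parameters, and XML element names follow NLM's public documentation as best recalled and should be treated as assumptions to verify.

    # Hedged sketch of calling the MedlinePlus Web service for health topics.
    # Endpoint, parameters, and XML element names are assumptions to verify
    # against NLM's current documentation.
    import requests
    import xml.etree.ElementTree as ET

    URL = "https://wsearch.nlm.nih.gov/ws/query"  # assumed endpoint
    params = {"db": "healthTopics", "term": "diabetes", "retmax": 5}

    resp = requests.get(URL, params=params, timeout=10)
    resp.raise_for_status()

    # The response is XML; print each returned document's title.
    root = ET.fromstring(resp.text)
    for doc in root.iter("document"):
        for content in doc.iter("content"):
            if content.get("name") == "title":
                print("".join(content.itertext()))  # strip highlight markup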
Using Citizen Science to Close Gaps in Cabled Ocean Observatory Research
NASA Astrophysics Data System (ADS)
Morley, M. G.; Moran, K.; Riddell, D. J.; Hoeberechts, M.; Flagg, R.; Walsh, J.; Dobell, R.; Longo, J.
2015-12-01
Ocean Networks Canada operates the world-leading NEPTUNE and VENUS cabled ocean observatories off the west coast of British Columbia, and a community observatory in Cambridge Bay, Nunavut. Continuous power and connectivity permit large volumes of data to be collected and made available to scientists and citizens alike over the Internet through a web-based interface. The Oceans 2.0 data management system contains over one quarter petabyte of data, including more than 20,000 hours of video from fixed seafloor cameras and a further 8,000 hours of video collected by remotely operated vehicles. Cabled observatory instrument deployments enable the collection of high-frequency, long-duration time series of data from a specific location. This enables the study of important questions such as whether effects of climate change—for instance, variations in temperature or sea-level—are seen over the long term. However, cabled observatory monitoring also presents challenges to scientific researchers: the overwhelming volume of data and the fixed spatial location can be barriers to addressing some big questions. Here we describe how Ocean Networks Canada is using Citizen Science to address these limitations and supplement cabled observatory research. Two applications are presented: Digital Fishers is a crowd-sourcing application in which participants watch short deep-sea video clips and make annotations based on scientific research questions. To date, 3,000 participants have contributed 140,000 scientific observations on topics including sablefish abundance, hydrothermal vent geology and deep-sea feeding behaviour. Community Fishers is a program in which ordinary citizens aboard vessels of opportunity collect ocean data including water temperature, salinity, dissolved oxygen and chlorophyll. The program's focus is to directly address the typical quality concerns around data that are collected using a citizen science approach. This is done by providing high quality scientific instruments and basic (but imperative) training to the citizens and vessel operators who participate. The data are downloaded using a specially designed tablet app, and then transmitted to Oceans 2.0 where raw and corrected data and metadata are made available through the web in real-time.
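As a generic sketch of how crowd-sourced clip annotations such as those from Digital Fishers might be aggregated before scientific use, the snippet below applies per-clip majority voting with an agreement threshold. The record layout and the 60% threshold are illustrative assumptions, not the Oceans 2.0 schema or the program's actual pipeline.

    from collections import Counter

    # Illustrative aggregation of crowd-sourced video-clip annotations by simple
    # majority vote; layout and threshold are assumptions, not Oceans 2.0's schema.
    annotations = [
        {"clip_id": "clip-001", "user": "a", "label": "sablefish"},
        {"clip_id": "clip-001", "user": "b", "label": "sablefish"},
        {"clip_id": "clip-001", "user": "c", "label": "no_fish"},
        {"clip_id": "clip-002", "user": "a", "label": "no_fish"},
    ]

    def consensus(records, min_agreement=0.6):
        """Return {clip_id: label} where one label reaches the requested vote
        fraction; otherwise flag the clip for expert review."""
        by_clip = {}
        for rec in records:
            by_clip.setdefault(rec["clip_id"], []).append(rec["label"])
        result = {}
        for clip_id, labels in by_clip.items():
            label, count = Counter(labels).most_common(1)[0]
            result[clip_id] = label if count / len(labels) >= min_agreement else "needs_review"
        return result

    print(consensus(annotations))  # {'clip-001': 'sablefish', 'clip-002': 'no_fish'}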
Deepwater Nitrogen Fixation: Who's Doing it, Where, and Why?
NASA Astrophysics Data System (ADS)
Montoya, J. P.; Weber, S.; Vogts, A.; Voss, M.; Saxton, M.; Joye, S. B.
2016-02-01
Nitrogen availability frequently limits marine primary production and N2-fixation plays an important role in supporting biological production in surface waters of many oligotrophic regions. Although subsurface waters typically contain high concentrations of nitrate and other nutrients, measurements from a variety of oceanic settings show measurable, and at times high rates of N2-fixation in deep, dark waters below the mixed layer. We have explored the distribution of N2-fixation throughout the water column of the Gulf of Mexico (GoM) during a series of cruises beginning shortly after the Deepwater Horizon (DWH) spill in 2010 and continuing at roughly annual intervals. These cruises allowed us to sample oligotrophic waters across a range of depths, and to explore the connections between the C and N cycles mediated by release of oil and gas (petrocarbon) from natural seeps as well as anthropogenic sources (e.g., the DWH). We used stable isotope abundances (15N and 13C) in particles and zooplankton in combination with experimental measurements of N2-fixation and CH4 assimilation to assess the contribution of oil- and gas-derived C to the pelagic food web, and the impact of CH4 releases on the pelagic C and N cycles. Our isotopic measurements document the movement of petrocarbon into the pelagic food web, and our experiments revealed that high rates of N2-fixation were widespread in deep water immediately after the DWH incident, and restricted to the vicinity of natural seeps in subsequent years. Unfortunately, these approaches provided no insight into the organisms actually responsible for N2-fixation and CH4-assimilation. We used nano-scale Secondary Ion Mass Spectrometry (nanoSIMS) to image the organisms responsible for these processes, and molecular approaches to explore the diversity of methanotrophs and diazotrophs present in the system. The ability to resolve isotopic distributions on the scale of individual cells is a critical part of bridging the gap between molecular approaches that identify organisms, and biogeochemical techniques that allow us to measure the activity of communities.
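The stable-isotope work described above typically rests on a two-end-member mixing model; the sketch below shows that standard calculation with invented δ13C values, purely to illustrate the arithmetic behind estimating the petrocarbon contribution to the food web.

    # Standard two-end-member stable-isotope mixing model. All delta values are
    # invented examples; they are not measurements from these cruises.

    def petrocarbon_fraction(delta_sample, delta_petro, delta_background):
        """f = (d_sample - d_background) / (d_petro - d_background)."""
        return (delta_sample - delta_background) / (delta_petro - delta_background)

    # Hypothetical d13C values (per mil): oil- and gas-derived carbon is
    # isotopically distinct from typical marine particulate organic carbon.
    f = petrocarbon_fraction(delta_sample=-24.0,
                             delta_petro=-27.5,
                             delta_background=-21.0)
    print(f"Estimated petrocarbon contribution: {f:.0%}")  # ~46%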
2013-09-01
Figure captions (list-of-figures fragment): Figure 17, reliable acoustic paths from a deep source to shallow receivers (from Urick 1983); Figure 19, computer-generated ray diagram of the DSC for a source near the axis, reflected rays omitted (from Urick 1983); Figure 20, worldwide DSC axis depths.
Chaves, Paula; Simões, Daniela; Paço, Maria; Pinho, Francisco; Duarte, José Alberto; Ribeiro, Fernando
2017-12-01
Deep friction massage is one of several physiotherapy interventions suggested for the management of tendinopathy. The aims of this study were to determine the prevalence of deep friction massage use in clinical practice, to characterize the application parameters used by physiotherapists, and to identify empirical model-based patterns of deep friction massage application in degenerative tendinopathy. The study was an observational, analytical, cross-sectional, national web-based survey. A total of 478 physiotherapists were selected through a snowball sampling method. The participants completed an online questionnaire about personal and professional characteristics as well as specific questions regarding the use of deep friction massage. The deep friction massage parameters used by physiotherapists were presented as counts and proportions. Latent class analysis was used to identify the empirical model-based patterns. Crude and adjusted odds ratios and 95% confidence intervals were computed. The use of deep friction massage was reported by 88.1% of the participants; tendinopathy was the clinical condition where it was most frequently used (84.9%) and, of these, 55.9% reported its use in degenerative tendinopathy. The "duration of application" parameter in the chronic phase and the "frequency of application" in the acute and chronic phases diverge most from those recommended by the author of deep friction massage. We found a high prevalence of deep friction massage use, notably in degenerative tendinopathy. Our results have shown that the application parameters are heterogeneous and diverse. This is reflected by the identification of two application patterns, although neither is in complete agreement with Cyriax's description. Copyright © 2017 Elsevier Ltd. All rights reserved.
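The crude odds ratios and 95% confidence intervals mentioned above follow the standard 2x2-table calculation; a minimal sketch with made-up counts (the numbers are not from this survey):

    import math

    # Standard odds ratio with a Wald 95% confidence interval from a 2x2 table.
    # Counts below are invented for illustration only.
    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a, b = exposed with/without outcome; c, d = unexposed with/without."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se)
        upper = math.exp(math.log(or_) + z * se)
        return or_, lower, upper

    # Example: reported use of deep friction massage (yes/no) across two
    # hypothetical practice-setting groups.
    or_, lo, hi = odds_ratio_ci(a=120, b=30, c=80, d=50)
    print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR = 2.50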
Quantum-chemical insights from deep tensor neural networks
Schütt, Kristof T.; Arbabzadah, Farhad; Chmiela, Stefan; Müller, Klaus R.; Tkatchenko, Alexandre
2017-01-01
Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol−1) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems. PMID:28067221
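A deep tensor neural network refines per-atom feature vectors through repeated interaction passes, in which each atom's representation is updated by messages from its neighbours modulated by a radial-basis expansion of the interatomic distance. The NumPy snippet below is a heavily simplified, untrained illustration of one such pass; the dimensions, weights, and exact update rule are arbitrary stand-ins, not the authors' architecture.

    import numpy as np

    # Heavily simplified, untrained illustration of one "interaction pass" in the
    # spirit of a deep tensor neural network. All sizes and weights are arbitrary.
    rng = np.random.default_rng(0)

    n_atoms, n_features, n_rbf = 5, 16, 8
    coords = rng.normal(size=(n_atoms, 3))              # toy 3D coordinates
    features = rng.normal(size=(n_atoms, n_features))   # per-atom embeddings

    # Gaussian radial-basis expansion of pairwise distances
    centers = np.linspace(0.0, 4.0, n_rbf)
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    rbf = np.exp(-((dists[..., None] - centers) ** 2) / 0.5)   # (n, n, n_rbf)

    # Random matrices standing in for learned parameters
    W_cf = rng.normal(size=(n_features, n_features)) * 0.1
    W_df = rng.normal(size=(n_rbf, n_features)) * 0.1

    def interaction_pass(x):
        """Update each atom by distance-modulated messages from its neighbours."""
        messages = np.tanh(x @ W_cf)[None, :, :] * (rbf @ W_df)   # (n, n, f)
        mask = 1.0 - np.eye(n_atoms)[:, :, None]                  # drop self-messages
        return x + (messages * mask).sum(axis=1)

    features = interaction_pass(features)
    # A small readout network would then map each atom's features to an energy
    # contribution, summed to give a size-extensive molecular energy.
    print(features.shape)  # (5, 16)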
Quantum-chemical insights from deep tensor neural networks.
Schütt, Kristof T; Arbabzadah, Farhad; Chmiela, Stefan; Müller, Klaus R; Tkatchenko, Alexandre
2017-01-09
Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol−1) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems.
Quantum-chemical insights from deep tensor neural networks
NASA Astrophysics Data System (ADS)
Schütt, Kristof T.; Arbabzadah, Farhad; Chmiela, Stefan; Müller, Klaus R.; Tkatchenko, Alexandre
2017-01-01
Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol-1) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems.
DeepInfer: open-source deep learning deployment toolkit for image-guided therapy
NASA Astrophysics Data System (ADS)
Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang
2017-03-01
Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into clinical research workflows, causing a gap between state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.
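DeepInfer's actual deployment mechanism (a registry of task-specific models exposed inside 3D Slicer) is not reproduced here; as a generic, hedged illustration of the step it automates, the PyTorch sketch below loads a pretrained segmentation model and applies it to a new volume. The checkpoint path, expected input shape, and preprocessing are assumptions.

    # Generic sketch of applying a pretrained segmentation model to a new image
    # volume -- the step DeepInfer packages for clinical researchers. The
    # checkpoint path, input shape, and preprocessing are assumptions; this is
    # not DeepInfer's actual API.
    import numpy as np
    import torch

    model = torch.jit.load("models/prostate_segmenter.pt")  # assumed TorchScript file
    model.eval()

    # Stand-in MRI volume (depth, height, width), normalized to zero mean / unit std.
    volume = np.random.rand(32, 128, 128).astype(np.float32)
    volume = (volume - volume.mean()) / (volume.std() + 1e-8)

    with torch.no_grad():
        x = torch.from_numpy(volume)[None, None]        # (batch, channel, D, H, W)
        logits = model(x)                               # assumed (1, classes, D, H, W)
        segmentation = logits.argmax(dim=1).squeeze(0)  # per-voxel label map

    print(segmentation.shape)  # torch.Size([32, 128, 128])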