Science.gov

Sample records for efficient web log

  1. Efficient Preprocessing technique using Web log mining

    NASA Astrophysics Data System (ADS)

Raiyani, Sheetal A.; Jain, Shailendra

    2012-11-01

Web Usage Mining can be described as the discovery and analysis of user access patterns through the mining of log files and associated data from a particular website. Large numbers of visitors interact daily with web sites around the world; enormous amounts of data are generated, and this information can be very valuable to a company for understanding customer behavior. This paper presents a complete preprocessing approach comprising data cleaning, user identification, and session identification activities to improve the quality of the data. User identification, a key issue in the preprocessing phase, aims to identify distinct web users. Traditional user identification is based on the site structure, supported by heuristic rules, which reduces its efficiency. To address this difficulty we introduce a proposed technique, DUI (Distinct User Identification), based on IP address, agent, session time, and pages referred within the desired session time. It can be used in counter-terrorism, fraud detection, and detection of unusual access to secure data, and, through detection of users' regular access behavior, it can improve the overall design and performance of future accesses based on the preprocessing results.
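
    The record above does not give the authors' implementation; the sketch below is only a minimal illustration of the kind of distinct-user identification it describes, grouping combined-format log records by IP address and user agent and splitting each group into sessions with a timeout. The regular expression, field names, and the 30-minute threshold are assumptions for illustration.

```python
import re
from collections import defaultdict
from datetime import datetime, timedelta

# Apache combined log format: IP, timestamp, request, status, size, referrer, user agent.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'\d{3} \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)
SESSION_TIMEOUT = timedelta(minutes=30)  # assumed session cut-off

def distinct_user_sessions(lines):
    """Group hits by (IP, agent) as distinct users, then split each user's hits into sessions."""
    hits = defaultdict(list)
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # data cleaning: drop malformed records
        t = datetime.strptime(m.group("time").split()[0], "%d/%b/%Y:%H:%M:%S")
        hits[(m.group("ip"), m.group("agent"))].append((t, m.group("request"), m.group("referrer")))

    sessions = {}
    for user, events in hits.items():
        events.sort()
        user_sessions, current = [], [events[0]]
        for prev, curr in zip(events, events[1:]):
            if curr[0] - prev[0] > SESSION_TIMEOUT:
                user_sessions.append(current)
                current = []
            current.append(curr)
        user_sessions.append(current)
        sessions[user] = user_sessions
    return sessions
```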

  2. Analysis of Web Proxy Logs

    NASA Astrophysics Data System (ADS)

    Fei, Bennie; Eloff, Jan; Olivier, Martin; Venter, Hein

Network forensics involves capturing, recording and analysing network audit trails. A crucial part of network forensics is to gather evidence at the server level, proxy level and from other sources. A web proxy relays URL requests from clients to a server. Analysing web proxy logs can give unobtrusive insights into the browsing behavior of computer users and provide an overview of the Internet usage in an organisation. More importantly, in terms of network forensics, it can aid in detecting anomalous browsing behavior. This paper demonstrates the use of a self-organising map (SOM), a powerful data mining technique, in network forensics. In particular, it focuses on how a SOM can be used to analyse data gathered at the web proxy level.
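
    As a rough illustration of the approach described (the paper's own configuration is not given here), the sketch below trains a small self-organising map with plain NumPy on per-session feature vectors that one might derive from proxy logs, e.g. request count, bytes transferred, and number of distinct hosts. The grid size, decay schedule, and feature choice are all assumptions.

```python
import numpy as np

def train_som(data, grid=(10, 10), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Train a small self-organising map on row-wise feature vectors (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    dim = data.shape[1]
    weights = rng.random((h, w, dim))
    # Grid coordinates, used to compute neighbourhood distances on the map.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)

    for t in range(iters):
        x = data[rng.integers(len(data))]
        # Best-matching unit: the node whose weight vector is closest to x.
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Exponentially decaying learning rate and neighbourhood radius.
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
        influence = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))[..., None]
        weights += lr * influence * (x - weights)
    return weights
```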

  3. Web Logs in the English Classroom: More Than Just Chat.

    ERIC Educational Resources Information Center

    Richardson, Will

    2003-01-01

    Details the use and appeal of Web logs to enhance classroom discussion and allow for outside involvement in the classroom. Defines a Web log, addresses discussing literature in a Web log, and describes the author's first attempts at using Web-log technology. Presents considerations for using Web logs as part of classroom instruction. (SG)

  4. Using Web Logs in the Science Classroom

    ERIC Educational Resources Information Center

    Duplichan, Staycle C.

    2009-01-01

    As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…

  5. Web Log Analysis: A Study of Instructor Evaluations Done Online

    ERIC Educational Resources Information Center

    Klassen, Kenneth J.; Smith, Wayne

    2004-01-01

    This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…

  6. Comparing Web and Touch Screen Transaction Log Files

    PubMed Central

    Huntington, Paul; Williams, Peter

    2001-01-01

    Background Digital health information is available on a wide variety of platforms including PC-access of the Internet, Wireless Application Protocol phones, CD-ROMs, and touch screen public kiosks. All these platforms record details of user sessions in transaction log files, and there is a growing body of research into the evaluation of this data. However, there is very little research that has examined the problems of comparing the transaction log files of kiosks and the Internet. Objectives To provide a first step towards examining the problems of comparing the transaction log files of kiosks and the Internet. Methods We studied two platforms: touch screen kiosks and a comparable Web site. For both of these platforms, we examined the menu structure (which affects transaction log file data), the log-file structure, and the metrics derived from log-file records. Results We found substantial differences between the generated metrics. Conclusions None of the metrics discussed can be regarded as an effective way of comparing the use of kiosks and Web sites. Two metrics stand out as potentially comparable and valuable: the number of user sessions per hour and user penetration of pages. PMID:11720960
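
    A minimal sketch of how the two metrics singled out above — user sessions per hour and page penetration (the share of sessions that reach a given page) — could be computed from a generic list of timestamped page views. The record format and the 30-minute session cut-off are assumptions, not details taken from the study.

```python
from collections import Counter, defaultdict
from datetime import timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # assumed cut-off

def build_sessions(views):
    """views: iterable of (user_id, timestamp, page). Returns a list of sessions,
    each a time-ordered list of (timestamp, page) for one user, split on the timeout."""
    by_user = defaultdict(list)
    for user, ts, page in views:
        by_user[user].append((ts, page))
    sessions = []
    for events in by_user.values():
        events.sort()
        current = []
        for ts, page in events:
            if current and ts - current[-1][0] > SESSION_TIMEOUT:
                sessions.append(current)
                current = []
            current.append((ts, page))
        sessions.append(current)
    return sessions

def sessions_per_hour(sessions):
    """Number of sessions starting in each clock hour."""
    return Counter(s[0][0].replace(minute=0, second=0, microsecond=0) for s in sessions)

def page_penetration(sessions):
    """Share of sessions in which each page was viewed at least once."""
    seen = Counter()
    for s in sessions:
        for page in {page for _, page in s}:
            seen[page] += 1
    return {page: n / len(sessions) for page, n in seen.items()}
```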

  7. Statistics, Structures & Satisfied Customers: Using Web Log Data to Improve Site Performance.

    ERIC Educational Resources Information Center

    Peacock, Darren

    This paper explores some of the ways in which the National Museum of Australia is using Web analysis tools to shape its future directions in the delivery of online services. In particular, it explores the potential of quantitative analysis, based on Web server log data, to convert these ephemeral traces of user experience into a strategic…

  8. Analyzing Web Server Logs to Improve a Site's Usage. The Systems Librarian

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2005-01-01

    This column describes ways to streamline and optimize how a Web site works in order to improve both its usability and its visibility. The author explains how to analyze logs and other system data to measure the effectiveness of the Web site design and search engine.

  9. Users' Perceptions of the Web As Revealed by Transaction Log Analysis.

    ERIC Educational Resources Information Center

    Moukdad, Haidar; Large, Andrew

    2001-01-01

    Describes the results of a transaction log analysis of a Web search engine, WebCrawler, to analyze user's queries for information retrieval. Results suggest most users do not employ advanced search features, and the linguistic structure often resembles a human-human communication model that is not always successful in human-computer communication.…

  10. Efficient Web Services Policy Combination

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh; Harman, Joseph G.

    2010-01-01

Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions do not have the ability to communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies to bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults where one organization proposes a rule on a particular feature, and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have inherent inefficiency, which heavily restricts their practical applications. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows an automatic and scalable composition of security policies between multiple organizations. It is based on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is when policies are represented in defeasible logic, and composition is based on rules for non-monotonic inference. In this system, policy writers construct metapolicies describing both the policy that they wish to enforce and annotations describing their composition preferences. These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take precedence.

  11. An Efficient Web Page Ranking for Semantic Web

    NASA Astrophysics Data System (ADS)

    Chahal, P.; Singh, M.; Kumar, S.

    2014-01-01

With the enormous amount of information presented on the web, the retrieval of relevant information has become a serious problem and has been a topic of research for the last few years. The most common tools to retrieve information from the web are search engines like Google. Search engines are usually based on keyword searching and indexing of web pages. This approach is not very efficient, as the result set of web pages obtained includes many irrelevant pages; sometimes even the entire result set may contain a lot of pages irrelevant to the user. The next generation of search engines must address this problem. Recently, many semantic web search engines have been developed, such as Ontolook and Swoogle, which help in searching for meaningful documents presented on the semantic web. In this process the ranking of the retrieved web pages is crucial. Some attempts have been made at ranking semantic web pages, but the ranking of these semantic web documents is neither satisfactory nor up to users' expectations. In this paper we propose a semantic-web-based document ranking scheme that relies not only on the keywords but also on the conceptual instances present between the keywords. As a result, only relevant pages appear at the top of the result set of searched web pages. We explore all relevant relations between the keywords, capturing the user's intention, and then calculate the fraction of these relations on each web page to determine its relevance. We have found that this ranking technique gives better results than the prevailing methods.

  12. Effect of temporal relationships in associative rule mining for web log data.

    PubMed

    Khairudin, Nazli Mohd; Mustapha, Aida; Ahmad, Mohd Hanif

    2014-01-01

The advent of web-based applications and services has created diverse and voluminous web log data stored in web servers, proxy servers, client machines, or organizational databases. This paper investigates the effect of a temporal attribute in association rule mining for web log data. We incorporated the characteristics of time in the rule mining process and analysed the effect of various temporal parameters. The rules generated from temporal association rule mining are then compared against the rules generated from classical rule mining approaches such as the Apriori and FP-Growth algorithms. The results showed that by incorporating the temporal attribute, the number of rules generated is smaller but comparable in terms of quality.

  13. Effect of Temporal Relationships in Associative Rule Mining for Web Log Data

    PubMed Central

    Mohd Khairudin, Nazli; Mustapha, Aida

    2014-01-01

The advent of web-based applications and services has created diverse and voluminous web log data stored in web servers, proxy servers, client machines, or organizational databases. This paper investigates the effect of a temporal attribute in association rule mining for web log data. We incorporated the characteristics of time in the rule mining process and analysed the effect of various temporal parameters. The rules generated from temporal association rule mining are then compared against the rules generated from classical rule mining approaches such as the Apriori and FP-Growth algorithms. The results showed that by incorporating the temporal attribute, the number of rules generated is smaller but comparable in terms of quality. PMID:24587757
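
    The two records above describe adding a temporal attribute before classical frequent-itemset mining. The sketch below shows one plausible reading, not the authors' implementation: tag each session's page set with a coarse time-of-day bucket and run a plain Apriori-style level-wise search over the augmented transactions. The bucketing, support threshold, and miner are illustrative assumptions.

```python
from itertools import combinations

def time_bucket(hour):
    """Coarse temporal attribute attached to each transaction (assumed bucketing)."""
    return "night" if hour < 6 else "morning" if hour < 12 else "afternoon" if hour < 18 else "evening"

def augment(sessions):
    """sessions: list of (start_hour, set_of_pages) -> transactions with an extra time item."""
    return [frozenset(pages | {f"time={time_bucket(hour)}"}) for hour, pages in sessions]

def frequent_itemsets(transactions, min_support=0.1, max_len=3):
    """Plain Apriori-style level-wise search over small transaction sets."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    freq, current, k = {}, [frozenset([i]) for i in items], 1
    while current and k <= max_len:
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        level = {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}
        freq.update(level)
        # Candidate generation: join frequent k-itemsets into (k+1)-itemsets.
        keys = list(level)
        current = list({a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1})
        k += 1
    return freq
```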

  14. Analysis of Large Data Logs: An Application of Poisson Sampling on Excite Web Queries.

    ERIC Educational Resources Information Center

    Ozmutlu, H. Cenk; Spink, Amanda; Ozmutlu, Seda

    2002-01-01

    Discusses the need for tools that allow effective analysis of search engine queries to provide a greater understanding of Web users' information seeking behavior and describes a study that developed an effective strategy for selecting samples from large-scale data sets. Reports on Poisson sampling with data logs from the Excite search engine.…
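
    The abstract does not spell out the sampling procedure; the sketch below illustrates one common form of Poisson sampling for large logs — each record is kept independently with a fixed inclusion probability — which can be applied in a single streaming pass without loading the data set into memory. The probability and file name are assumptions.

```python
import random

def poisson_sample(records, p=0.001, seed=42):
    """Single-pass Poisson sampling: each record is kept independently with
    probability p, so the sample size itself is random (binomially distributed)."""
    rng = random.Random(seed)
    return [r for r in records if rng.random() < p]

# Usage sketch: stream a large query log line by line.
# with open("excite_queries.log") as f:   # hypothetical file name
#     sample = poisson_sample(f, p=0.0005)
```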

  15. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  16. Analyzing Engagement in a Web-Based Intervention Platform Through Visualizing Log-Data

    PubMed Central

    2014-01-01

Background Engagement has emerged as a significant cross-cutting concern within the development of Web-based interventions. There have been calls to institute a more rigorous approach to the design of Web-based interventions, to increase both the quantity and quality of engagement. One approach would be to use log-data to better understand the process of engagement and patterns of use. However, an important challenge lies in organizing log-data for productive analysis. Objective Our aim was to conduct an initial exploration of the use of visualizations of log-data to enhance understanding of engagement with Web-based interventions. Methods We applied exploratory sequential data analysis to highlight sequential aspects of the log data, such as time or module number, to provide insights into engagement. After applying a number of processing steps, a range of visualizations were generated from the log-data. We then examined the usefulness of these visualizations for understanding the engagement of individual users and the engagement of cohorts of users. The visualizations created are illustrated with two datasets drawn from studies using the SilverCloud Platform: (1) a small, detailed dataset with interviews (n=19) and (2) a large dataset (n=326) with 44,838 logged events. Results We present four exploratory visualizations of user engagement with a Web-based intervention, including Navigation Graph, Stripe Graph, Start–Finish Graph, and Next Action Heat Map. The first represents individual usage and the last three, specific aspects of cohort usage. We provide examples of each with a discussion of salient features. Conclusions Log-data analysis through data visualization is an alternative way of exploring user engagement with Web-based interventions, which can yield different insights than more commonly used summative measures. We describe how understanding the process of engagement through visualizations can support the development and evaluation of Web-based interventions.
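
    As a hedged illustration of one of the cohort views named above (the Next Action Heat Map), the sketch below counts transitions between consecutive logged actions per user and renders the transition matrix with matplotlib. The event-tuple format and rendering choices are assumptions rather than the SilverCloud tooling.

```python
from collections import Counter

import matplotlib.pyplot as plt
import numpy as np

def next_action_heatmap(event_log, actions):
    """event_log: list of (user_id, timestamp, action); actions: ordered list of action names."""
    index = {a: i for i, a in enumerate(actions)}
    counts, last_action = Counter(), {}
    for user, ts, action in sorted(event_log, key=lambda e: (e[0], e[1])):
        prev = last_action.get(user)
        if prev in index and action in index:
            counts[(prev, action)] += 1  # transition prev -> action for this user
        last_action[user] = action

    matrix = np.zeros((len(actions), len(actions)))
    for (a, b), n in counts.items():
        matrix[index[a], index[b]] = n

    plt.imshow(matrix, cmap="viridis")
    plt.xticks(range(len(actions)), actions, rotation=90)
    plt.yticks(range(len(actions)), actions)
    plt.xlabel("next action")
    plt.ylabel("current action")
    plt.colorbar(label="transition count")
    plt.tight_layout()
    plt.show()
```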

  17. Development of high efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Rohatgi, A.; Meier, D. L.; Campbell, R. B.; Schmidt, D. N.; Rai-Choudhury, P.

    1984-01-01

Web base material is being improved with the goal of obtaining solar cell efficiencies in excess of 18% (AM1). Carrier loss mechanisms in web silicon were investigated, techniques were developed to reduce carrier recombination in the web, and web cells were fabricated using effective surface passivation. The effect of stress on web cell performance was also investigated.

  18. Efficient Web Change Monitoring with Page Digest

    SciTech Connect

    Buttler, D J; Rocco, D; Liu, L

    2004-02-20

    The Internet and the World Wide Web have enabled a publishing explosion of useful online information, which has produced the unfortunate side effect of information overload: it is increasingly difficult for individuals to keep abreast of fresh information. In this paper we describe an approach for building a system for efficiently monitoring changes to Web documents. This paper has three main contributions. First, we present a coherent framework that captures different characteristics of Web documents. The system uses the Page Digest encoding to provide a comprehensive monitoring system for content, structure, and other interesting properties of Web documents. Second, the Page Digest encoding enables improved performance for individual page monitors through mechanisms such as short-circuit evaluation, linear time algorithms for document and structure similarity, and data size reduction. Finally, we develop a collection of sentinel grouping techniques based on the Page Digest encoding to reduce redundant processing in large-scale monitoring systems by grouping similar monitoring requests together. We examine how effective these techniques are over a wide range of parameters and have seen an order of magnitude speed up over existing Web-based information monitoring systems.
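
    The Page Digest encoding itself is not reproduced in the abstract; the sketch below only illustrates the general idea of change monitoring by comparing compact digests of a page's tag structure and text content between fetches, so unchanged pages can be short-circuited cheaply. The hashing choices are assumptions, not the paper's encoding.

```python
import hashlib
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Record the sequence of start tags and the text content separately."""
    def __init__(self):
        super().__init__()
        self.tags, self.text = [], []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)
    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

def digest(html):
    """Compact fingerprints of a page's structure and content."""
    p = TagCollector()
    p.feed(html)
    structure = hashlib.sha256("/".join(p.tags).encode()).hexdigest()
    content = hashlib.sha256(" ".join(p.text).encode()).hexdigest()
    return structure, content

def changed(old_html, new_html):
    """Short-circuit evaluation: compare cheap digests instead of re-diffing full documents."""
    return digest(old_html) != digest(new_html)
```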

  19. New Tools for Research on Instruction and Instructional Policy: A Web-Based Teacher Log. A CTP Working Paper.

    ERIC Educational Resources Information Center

    Ball, Deborah Loewenberg; Camburn, Eric; Correnti, Richard; Phelps, Geoffrey; Wallace, Raven

    This paper discusses the initial development and testing of a Web-based instrument for collecting daily data on instruction. This teacher log was developed for use in the Study of Instructional Improvement, a longitudinal study on school improvement in high poverty areas. The researchers wanted to further develop the potential of teacher logs by…

  20. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Rohatgi, A.; Meier, D. L.; Campbell, R. B.; Seidensticker, R. G.; Rai-Choudhury, P.

    1984-01-01

    The development of high efficiency solar cells on a silicon web is discussed. Heat treatment effects on web quality; the influence of twin plane lamellae, trace impurities and stress on minority carrier lifetime; and the fabrication of cells are discussed.

  1. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

Backdoors or information leaks in Web servers can be detected by applying Web Mining techniques to abnormal Web log and Web application log data. The security of Web servers can thus be enhanced and the damage of illegal access avoided. Firstly, a system for discovering patterns of information leakage in CGI scripts from Web log data is proposed. Secondly, those patterns are provided to system administrators so they can modify their code and enhance their Web site security. The following aspects are described: one is to combine the web application log with the web log to extract more information, so that web data mining can be used to mine the web log and discover information that a firewall and an Intrusion Detection System cannot find; another is to propose an operation module for the web site to enhance Web site security. For cluster server sessions, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
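
    For the density-based clustering of cluster-server sessions mentioned at the end of the abstract, the sketch below shows a generic version using scikit-learn's DBSCAN on per-session features; the feature set and parameters are illustrative assumptions rather than the authors' configuration.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

def cluster_sessions(features, eps=0.8, min_samples=5):
    """features: array of shape (n_sessions, n_features), e.g. request count,
    error-status count, bytes transferred, distinct CGI scripts hit (assumed features)."""
    X = StandardScaler().fit_transform(np.asarray(features, dtype=float))
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X)
    return labels

# Sessions labelled -1 (noise) fall outside any dense cluster and are natural
# candidates for manual review as potential information-leak or backdoor accesses.
```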

  2. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    ERIC Educational Resources Information Center

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…

  3. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  4. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Meier, D. L.; Greggi, J.; Okeeffe, T. W.; Rai-Choudhury, P.

    1986-01-01

    Work was performed to improve web base material with a goal of obtaining solar cell efficiencies in excess of 18% (AM1). Efforts in this program are directed toward identifying carrier loss mechanisms in web silicon, eliminating or reducing these mechanisms, designing a high efficiency cell structure with the aid of numerical models, and fabricating high efficiency web solar cells. Fabrication techniques must preserve or enhance carrier lifetime in the bulk of the cell and minimize recombination of carriers at the external surfaces. Three completed cells were viewed by cross-sectional transmission electron microscopy (TEM) in order to investigate further the relation between structural defects and electrical performance of web cells. Consistent with past TEM examinations, the cell with the highest efficiency (15.0%) had no dislocations but did have 11 twin planes.

  5. Competency-based residency training and the web log: modeling practice-based learning and enhancing medical knowledge†

    PubMed Central

    Hollon, Matthew F.

    2015-01-01

    Background By using web-based tools in medical education, there are opportunities to innovatively teach important principles from the general competencies of graduate medical education. Objectives Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI) in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. Method The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. Results The frequency that residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008); however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39), remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correct when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001). Conclusions Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents. PMID:26653701

  6. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Meier, D. L.; Greggi, J.; Rai-Choudhury, P.

    1986-01-01

    Work is reported aimed at identifying and reducing sources of carrier recombination both in the starting web silicon material and in the processed cells. Cross-sectional transmission electron microscopy measurements of several web cells were made and analyzed. The effect of the heavily twinned region on cell efficiency was modeled, and the modeling results compared to measured values for processed cells. The effects of low energy, high dose hydrogen ion implantation on cell efficiency and diffusion length were examined. Cells were fabricated from web silicon known to have a high diffusion length, with a new double layer antireflection coating being applied to these cells. A new contact system, to be used with oxide passivated cells and which greatly reduces the area of contact between metal and silicon, was designed. The application of DLTS measurements to beveled samples was further investigated.

  7. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Meier, D. L.

    1986-01-01

The achievement of higher-efficiency cells through efforts directed toward identifying carrier loss mechanisms, designing cell structures, and developing processing techniques is described. Use of techniques such as deep-level transient spectroscopy (DLTS), laser-beam-induced current (LBIC), and transmission electron microscopy (TEM) indicated that dislocations in web material, rather than twin planes, were primarily responsible for limiting diffusion lengths in the web. Diffusion lengths and cell efficiencies were improved from 19 to 120 microns and from 8 to 10.3% (no AR), respectively, by implanting hydrogen at 1500 eV and a beam current density of 2.0 mA/sq cm. Some of the processing improvements included the use of a double-layer AR coating (ZnS and MgF2) and the addition of an aluminum back-surface reflector. Cells of more than 16% efficiency were achieved.

  8. pcrEfficiency: a Web tool for PCR amplification efficiency prediction

    PubMed Central

    2011-01-01

Background Relative calculation of differential gene expression in quantitative PCR reactions requires comparison between amplification experiments that include reference genes and genes under study. Ignoring the differences between their efficiencies may lead to miscalculation of gene expression even with the same starting amount of template. Although there are several tools performing PCR primer design, there is no tool available that predicts PCR efficiency for a given amplicon and primer pair. Results We have used a statistical approach based on 90 primer pair combinations amplifying templates from bacteria, yeast, plants and humans, ranging in size between 74 and 907 bp, to identify the parameters that affect PCR efficiency. We developed a generalized additive model fitting the data and constructed an open source Web interface that allows users to obtain oligonucleotides optimized for PCR, with predicted amplification efficiencies, starting from a given sequence. Conclusions pcrEfficiency provides an easy-to-use web interface allowing the prediction of PCR efficiencies prior to wet-lab experiments, thus easing quantitative real-time PCR set-up. A web-based service as well as the source code are provided freely at http://srvgen.upct.es/efficiency.html under the GPL v2 license. PMID:22014212

  9. Log files analysis to assess the use and workload of a dynamic web server dedicated to end-stage renal disease.

    PubMed

    Ben Said, Mohamed; Le Mignot, Loic; Richard, Jean Baptiste; Le Bihan, Christine; Toubiana, Laurent; Jais, Jean-Philippe; Landais, Paul

    2006-01-01

A Multi-Source Information System (MSIS) has been designed for the Renal Epidemiology and Information Network (REIN) dedicated to End-Stage Renal Disease (ESRD). MSIS aims at providing reliable follow-up data for ESRD patients. It is based on an n-tier architecture, made up of a universal client and a dynamic Web server connected to a production database and to a data warehouse. MSIS has been operational since 2002 and has been progressively deployed in 9 regions in France. It includes 16,677 patients. We show that the analysis of MSIS web log files allows evaluation of the use of the system and of the workload from a public-health perspective.

  10. Log-Tool

    SciTech Connect

    Goodall, John

    2012-05-21

Log files are typically semi- or un-structured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g., intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input – it will input data from files, standard input, and syslog; (2) Parser – it will parse the log file based on regular expressions into structured data (JSON format); (3) Output – it will output structured data into commonly used formats, including Redis (a database), standard output, and syslog.
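
    A minimal sketch in the spirit of the three components listed above — input, regex-based parsing into structured JSON, and output to standard output or Redis. The regular expression targets the Apache common log format and the Redis list key is an assumed convention; this is not the Log-Tool code itself.

```python
import json
import re
import sys

# Apache common log format: host, user, timestamp, request, status, size.
COMMON_LOG = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
)

def parse(line):
    """Parse one common-log line into a dict, or None if it does not match."""
    m = COMMON_LOG.match(line)
    return m.groupdict() if m else None

def run(stream, redis_key=None):
    """Read lines, emit JSON records to stdout and optionally push them onto a Redis list."""
    r = None
    if redis_key:
        import redis  # redis-py, only needed when a Redis key is given
        r = redis.Redis()
    for line in stream:
        record = parse(line)
        if record is None:
            continue  # skip unparseable lines
        payload = json.dumps(record)
        print(payload)
        if r is not None:
            r.rpush(redis_key, payload)  # assumed key/queue convention

if __name__ == "__main__":
    run(sys.stdin)
```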

  11. Evolving dynamic web pages using web mining

    NASA Astrophysics Data System (ADS)

    Menon, Kartik; Dagli, Cihan H.

    2003-08-01

The heterogeneity and lack of structure that permeate much of the ever-expanding information sources on the WWW make it difficult for users to properly and efficiently access different web pages. Different users have different needs from the same web page, so it is necessary to train the system to understand the needs and demands of the users; in other words, there is a need for efficient and proper web mining. In this paper, issues and possible ways of training the system and providing a high level of organization for semi-structured data available on the web are discussed. Web pages can be evolved based on the history of query searches, browsing, links traversed, and observations of user behavior such as bookmarking and time spent viewing. Fuzzy clustering techniques help in grouping natural user groups, while neural networks, association rules, and web traversal patterns help in efficient sequential analysis based on the user's previous searches and queries. In this paper we analyze web server logs using the above-mentioned techniques to learn more about user interactions. Analyzing these web server logs helps to closely understand user behavior and web access patterns.

  12. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling.

    PubMed

    Devi, R Suganya; Manjula, D; Siddharth, R K

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web which has to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of Depth First Search Algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and its metadata such as title, keywords, and description are extracted. This content is very essential for any type of analyser work to be carried on the Big Data obtained as a result of Web Crawling.
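
    The sketch below is a simplified, standard-library-only illustration of the crawl-and-extract step described above: a depth-first traversal over hyperlinks that records each page's title and meta keywords/description. The same-domain restriction and depth limit are assumptions, and the parser is far less complete than a production crawler.

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class PageParser(HTMLParser):
    """Collect hyperlinks, the <title> text, and meta keywords/description."""
    def __init__(self):
        super().__init__()
        self.links, self.meta, self._in_title, self.title = [], {}, False, ""
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and a.get("href"):
            self.links.append(a["href"])
        elif tag == "meta" and a.get("name") in ("keywords", "description"):
            self.meta[a["name"]] = a.get("content", "")
        elif tag == "title":
            self._in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
    def handle_data(self, data):
        if self._in_title:
            self.title += data

def crawl(start_url, max_depth=2):
    """Depth-first crawl restricted to the start URL's host (assumed policy)."""
    host, seen, results = urlparse(start_url).netloc, set(), {}
    stack = [(start_url, 0)]
    while stack:
        url, depth = stack.pop()
        if url in seen or depth > max_depth:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip unreachable pages
        p = PageParser()
        p.feed(html)
        results[url] = {"title": p.title.strip(), **p.meta}
        for link in p.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == host:
                stack.append((absolute, depth + 1))
    return results
```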

  13. Understanding Academic Information Seeking Habits through Analysis of Web Server Log Files: The Case of the Teachers College Library Website

    ERIC Educational Resources Information Center

    Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary

    2009-01-01

    Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)

  14. Using client-side event logging and path tracing to assess and improve the quality of web-based surveys.

    PubMed Central

    White, Thomas M.; Hauan, Michael J.

    2002-01-01

Web-based data collection has considerable appeal. However, the quality of data collected using such instruments is often questionable. There can be systematic problems with the wording of the surveys, and/or the means by which they are deployed. In unsupervised data collection, there are also concerns about whether subjects understand the questions, and whether they are answering honestly. This paper presents a schema for using client-side timestamps and traces of subjects' paths through instruments to detect problems with the definition of instruments and their deployment. We discuss two large, anonymous, web-based, medical surveys as examples of the utility of this approach. PMID:12463954

  15. Standards, Efficiency, and the Evolution of Web Design

    ERIC Educational Resources Information Center

    Mitchell, Erik

    2010-01-01

    The author recently created a presentation using HTML5 based on a tutorial put together by Marcin Wichary. The example presentation is part proof-of-concept, part instructional piece, and it is part of a larger site on HTML5 and how one can use it to create rich Web-based applications. The more he delved into HTML5, the more he found that it was…

  16. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Rohatgi, A.; Meier, D. L.; Campbell, R. B.; Seidensticker, R. G.; Rai-Choudhury, P.

    1985-01-01

    High-efficiency dendritic cells were discussed. The influence of twin planes and heat treatment on the location and effect of trace impurities was of particular interest. Proper heat treatment often increases efficiency by causing impurities to pile up at twin planes. Oxide passivation had a beneficial effect on efficiency. A very efficient antireflective (AR) coating of zinc selenide and magnesium fluoride was designed and fabricated. An aluminum back-surface reflector was also effective.

  17. Re-evaluation of an improved efficiency polymeric web point-focus Fresnel lens

    SciTech Connect

    Stillwell, C.B.

    1988-08-01

    The optical efficiency of the lens developed by 3M and reported in Development and Evaluation of an Improved Efficiency Polymeric Web Point-Focus Fresnel Lens was measured by Sandia and reported to be 82%. Subsequent to publication of that report, additional lens tests at Sandia showed a lens efficiency of only 79%. This report presents the results of a study to determine why the lens efficiency is now lower than originally observed. 2 refs., 5 figs., 2 tabs.

  18. Towards a Simple and Efficient Web Search Framework

    DTIC Science & Technology

    2014-11-01

…any useful information about the various aspects of a topic. For example, for the query “raspberry pi”, it covers topics such as “what is raspberry pi” … topics generated by the LDA topic model for the query “raspberry pi”. One simple explanation is that web texts are too noisy and unfocused for the LDA process … “making a raspberry pi”. However, the topics generated based on the 10 top-ranked documents do not make much sense to us in terms of their keywords…

  19. Recovery Efficiency Test Project: Phase 1, Activity report. Volume 1: Site selection, drill plan preparation, drilling, logging, and coring operations

    SciTech Connect

    Overbey, W.K. Jr.; Carden, R.S.; Kirr, J.N.

    1987-04-01

The Recovery Efficiency Test well project addressed a number of technical issues. The primary objective was to determine the increased gas recovery efficiency of a long horizontal wellbore over that of a vertical wellbore and, more specifically, what improvements can be expected from inducing multiple hydraulic fractures from such a wellbore. BDM Corporation located, planned, and drilled a long-radius-turn horizontal well in the Devonian shale Lower Huron section in Wayne County, West Virginia, demonstrating that state-of-the-art technology is capable of drilling such wells. BDM successfully tested drilling, coring, and logging in a horizontal well using air as the circulating medium; conducted reservoir modeling studies to project flow rates and reserves in advance of drilling operations; observed two-phase flow conditions in the wellbore not observed previously; cored a fracture zone which produced gas; observed that fractures in the core and the wellbore were not systematically spaced (spacing varied from 5 to 68 feet in different parts of the wellbore); and observed that the highest gas show rates reported by the mud logger corresponded to the zone with the lowest fracture spacing (five feet), i.e., the highest fracture frequency. Four-and-one-half-inch casing was successfully installed in the borehole and was equipped to isolate the horizontal section into eight (8) zones for future testing and stimulation operations. 6 refs., 48 figs., 10 tabs.

  20. Production and food web efficiency decrease as fishing activity increases in a coastal ecosystem

    NASA Astrophysics Data System (ADS)

    Anh, Pham Viet; Everaert, Gert; Goethals, Peter; Vinh, Chu Tien; De Laender, Frederik

    2015-11-01

    Fishing effort in the Vietnamese coastal ecosystem has rapidly increased from the 1990s to the 2000s, with unknown consequences for local ecosystem structure and functioning. Using ecosystem models that integrate fisheries and food webs we found profound differences in the production of six functional groups, the food web efficiency, and eight functional food web indices between the 1990s (low fishing intensity) and the 2000s (high fishing intensity). The functional attributes (e.g. consumption) of high trophic levels (e.g. predators) were lower in the 2000s than in the 1990s while primary production did not vary, causing food web efficiency to decrease up to 40% with time for these groups. The opposite was found for lower trophic levels (e.g. zooplankton): the functional attributes and food web efficiency increased with time (22 and 10% for the functional attributes and food web efficiency, respectively). Total system throughput, a functional food web index, was about 10% higher in the 1990s than in the 2000s, indicating a reduction of the system size and activity with time. The network analyses further indicated that the Vietnamese coastal ecosystem in the 1990s was more developed (higher ascendancy and capacity), more stable (higher overhead) and more mature (higher ratio of ascendancy and capacity) than in the 2000s. In the 1990s the recovery time of the ecosystem was shorter than in 2000s, as indicated by a higher Finn's cycling index in the 1990s (7.8 and 6.5% in 1990s and 2000s, respectively). Overall, our results demonstrate that the Vietnamese coastal ecosystem has experienced profound changes between the 1990s and 2000s, and emphasise the need for a closer inspection of the ecological impact of fishing.

  1. 3Drefine: an interactive web server for efficient protein structure refinement

    PubMed Central

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-01-01

3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen bonding network combined with atomic-level energy minimization on the optimized model using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371

  2. An efficient scheme for automatic web pages categorization using the support vector machine

    NASA Astrophysics Data System (ADS)

    Bhalla, Vinod Kumar; Kumar, Neeraj

    2016-07-01

In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages from the Internet within a fraction of a second. To achieve this goal, an efficient categorization of web page contents is required. Manual categorization of these billions of web pages with high accuracy is a challenging task. Most of the existing techniques reported in the literature are semi-automatic, and a higher level of accuracy cannot be achieved using them. To achieve these goals, this paper proposes an automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, extraction and evaluation of features are done first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed in the proposed scheme. Feature extraction and weight assignment are based on a collection of domain-specific keyword lists developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of the ids of the keywords in the keyword list. Also, stemming of keywords and tag text is done to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method in combination with feature extraction and statistical analysis, using a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy for different categories of web pages.
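
    The paper builds its own DOM- and keyword-based feature extractor; the sketch below only shows the generic final step of training a support vector machine on labelled page text with scikit-learn, with TF-IDF features standing in for the paper's feature set. All parameter choices are illustrative.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def train_page_classifier(page_texts, labels):
    """page_texts: extracted visible/tag text per page; labels: domain category per page."""
    X_train, X_test, y_train, y_test = train_test_split(
        page_texts, labels, test_size=0.2, random_state=0, stratify=labels)
    model = make_pipeline(
        TfidfVectorizer(stop_words="english", max_features=20000),
        SVC(kernel="rbf", C=10, gamma="scale"),
    )
    model.fit(X_train, y_train)
    # Held-out evaluation per category.
    print(classification_report(y_test, model.predict(X_test)))
    return model
```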

  3. Spatial data efficient transmission in WebGIS based on IPv6

    NASA Astrophysics Data System (ADS)

    Wang, Zhen-feng; Liu, Ji-ping; Wang, Liang; Tao, Kun-wang

    2008-12-01

The large size of spatial data and the limited bandwidth of the network make it difficult to transmit spatial data efficiently in WebGIS. This paper employs IPv6 (Internet Protocol version 6), the successor to the currently deployed IPv4, to transmit spatial data efficiently. As the core of the NGN (Next Generation Network), IPv6 brings many advantages for resolving performance problems in current IPv4 network applications. Multicast, which is mandatory in IPv6 routers, allows one server to serve many clients simultaneously and efficiently, thus improving the capacity of network applications. The new anycast address type in IPv6 makes it possible for network client applications to find the nearest server, which makes data transmission between client and server fastest. The paper introduces how to apply IPv6 multicast and anycast in WebGIS to transmit data efficiently.

  4. Logging on to Learn

    ERIC Educational Resources Information Center

    Butler, Kevin

    2010-01-01

    A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…

  5. Structural efficiency studies of corrugated compression panels with curved caps and beaded webs

    NASA Technical Reports Server (NTRS)

    Davis, R. C.; Mills, C. T.; Prabhakaran, R.; Jackson, L. R.

    1984-01-01

Curved cross-sectional elements are employed in structural concepts for minimum-mass compression panels. Corrugated panel concepts with curved caps and beaded webs are optimized by using a nonlinear mathematical programming procedure and a rigorous buckling analysis. These panel geometries are shown to have superior structural efficiencies compared with known concepts published in the literature. Fabrication of these efficient corrugation concepts became possible through advances made in the art of superplastic forming of metals. Results of the mass optimization studies of the concepts are presented as structural efficiency charts for axial compression.

  6. Efficient 3D rendering for web-based medical imaging software: a proof of concept

    NASA Astrophysics Data System (ADS)

    Cantor-Rivera, Diego; Bartha, Robert; Peters, Terry

    2011-03-01

Medical Imaging Software (MIS) found in research and in clinical practice, such as in Picture Archiving and Communication Systems (PACS) and Radiology Information Systems (RIS), has not been able to take full advantage of the Internet as a deployment platform. MIS is usually tightly coupled to algorithms that have substantial hardware and software requirements. Consequently, MIS is deployed on thick clients, which usually leads project managers to allocate more resources during the deployment phase of the application than the resources that would be allocated if the application were deployed through a web interface. To minimize the costs associated with this scenario, many software providers use or develop plug-ins to provide the delivery platform (internet browser) with the features to load, interact with, and analyze medical images. Nevertheless, there has not been a successful standard means to achieve this goal so far. This paper presents a study of WebGL as an alternative to plug-in development for efficient rendering of 3D medical models and DICOM images. WebGL is a technology that enables the internet browser to have access to the local graphics hardware in a native fashion. Because it is based on OpenGL, a widely accepted graphics industry standard, WebGL is being implemented in most of the major commercial browsers. After a discussion of the details of the technology, a series of experiments is presented to determine the operational boundaries in which WebGL is adequate for MIS. A comparison with current alternatives is also addressed. Finally, conclusions and future work are discussed.

  7. Evaluation of the efficiency and effectiveness of independent dose calculation followed by machine log file analysis against conventional measurement based IMRT QA.

    PubMed

    Sun, Baozhou; Rangaraj, Dharanipathy; Boddu, Sunita; Goddu, Murty; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar; Mutic, Sasa

    2012-09-06

    Experimental methods are commonly used for patient-specific IMRT delivery verification. There are a variety of IMRT QA techniques which have been proposed and clinically used with a common understanding that not one single method can detect all possible errors. The aim of this work was to compare the efficiency and effectiveness of independent dose calculation followed by machine log file analysis to conventional measurement-based methods in detecting errors in IMRT delivery. Sixteen IMRT treatment plans (5 head-and-neck, 3 rectum, 3 breast, and 5 prostate plans) created with a commercial treatment planning system (TPS) were recalculated on a QA phantom. All treatment plans underwent ion chamber (IC) and 2D diode array measurements. The same set of plans was also recomputed with another commercial treatment planning system and the two sets of calculations were compared. The deviations between dosimetric measurements and independent dose calculation were evaluated. The comparisons included evaluations of DVHs and point doses calculated by the two TPS systems. Machine log files were captured during pretreatment composite point dose measurements and analyzed to verify data transfer and performance of the delivery machine. Average deviation between IC measurements and point dose calculations with the two TPSs for head-and-neck plans were 1.2 ± 1.3% and 1.4 ± 1.6%, respectively. For 2D diode array measurements, the mean gamma value with 3% dose difference and 3 mm distance-to-agreement was within 1.5% for 13 of 16 plans. The mean 3D dose differences calculated from two TPSs were within 3% for head-and-neck cases and within 2% for other plans. The machine log file analysis showed that the gantry angle, jaw position, collimator angle, and MUs were consistent as planned, and maximal MLC position error was less than 0.5 mm. The independent dose calculation followed by the machine log analysis takes an average 47 ± 6 minutes, while the experimental approach (using IC and

  8. Web Mining for Web Image Retrieval.

    ERIC Educational Resources Information Center

    Chen, Zheng; Wenyin, Liu; Zhang, Feng; Li, Mingjing; Zhang, Hongjiang

    2001-01-01

    Presents a prototype system for image retrieval from the Internet using Web mining. Discusses the architecture of the Web image retrieval prototype; document space modeling; user log mining; and image retrieval experiments to evaluate the proposed system. (AEF)

  9. ProteMiner-SSM: a web server for efficient analysis of similar protein tertiary substructures

    PubMed Central

    Chang, Darby Tien-Hau; Chen, Chien-Yu; Chung, Wen-Chin; Oyang, Yen-Jen; Juan, Hsueh-Fen; Huang, Hsuan-Cheng

    2004-01-01

Analysis of protein–ligand interactions is a fundamental issue in drug design. As the detailed and accurate analysis of protein–ligand interactions involves calculation of binding free energy based on thermodynamics and even quantum mechanics, which is highly expensive in terms of computing time, conformational and structural analysis of proteins and ligands has been widely employed as a screening process in computer-aided drug design. In this paper, a web server called ProteMiner-SSM designed for efficient analysis of similar protein tertiary substructures is presented. In one experiment reported in this paper, the web server has been exploited to obtain some clues about a biochemical hypothesis. The main distinction in the software design of the web server is the filtering process incorporated to expedite the analysis. The filtering process extracts the residues located in the caves of the protein tertiary structure for analysis and operates with O(n log n) time complexity, where n is the number of residues in the protein. In comparison, the α-hull algorithm, which is a widely used algorithm in computer graphics for identifying those instances that are on the contour of a three-dimensional object, features O(n^2) time complexity. Experimental results show that the filtering process presented in this paper is able to speed up the analysis by a factor ranging from 3.15 to 9.37 times. The ProteMiner-SSM web server can be found at http://proteminer.csie.ntu.edu.tw/. There is a mirror site at http://p4.sbl.bc.sinica.edu.tw/proteminer/. PMID:15215355

  10. Working with Data: Discovering Knowledge through Mining and Analysis; Systematic Knowledge Management and Knowledge Discovery; Text Mining; Methodological Approach in Discovering User Search Patterns through Web Log Analysis; Knowledge Discovery in Databases Using Formal Concept Analysis; Knowledge Discovery with a Little Perspective.

    ERIC Educational Resources Information Center

    Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.

    2000-01-01

    These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)

  11. Development and evaluation of an improved efficiency polymeric web point-focus Fresnel lens

    SciTech Connect

    Cobb, S. Jr.

    1987-04-01

The feasibility of producing parquets of point-focus Fresnel lenses with a 2° draft angle on the riser in a continuous polymeric web is described. The parquet produced consisted of 14 square lenses, each 8.16 in. on a side, in a 2 by 7 format. The primary aim was to show that an increased efficiency was possible over that reported in SAND83-7023 by decreasing the draft angle. A secondary aim was also to produce a web of sufficient thickness to be used without lamination to a thick superstrate. The results demonstrated that increased efficiency was realized for both the thin and thick caliper material, with performance nearly equal to a direct-cut control lens. The results also show that a bowing or sagging problem exists in the laminated lenses. They also show that the thicker, non-laminated lenses may not be stiff enough to lie flat and may buckle, causing these lenses to be potentially unacceptable.

  12. Major constrains of the pelagic food web efficiency in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Zoccarato, L.; Fonda Umani, S.

    2015-03-01

Grazing pressure plays a key role in plankton communities, affecting their biodiversity and shaping their structure. Predation exerted by 2-200 μm protists (i.e. microzooplankton and heterotrophic nanoplankton) influences the carbon fate in marine environments, channeling new organic matter from the microbial loop toward the "classic" grazing food web. In this study, we analyzed more than 80 dilution experiments carried out at many Mediterranean sites at the surface and in the meso-bathypelagic layers. Our aims were to investigate prey-predator interactions, determine selectivity among energy sources (in terms of available biomass) and efficiency in their exploitation, and highlight likely constraints that can modulate carbon transfer processes within pelagic food webs. Generally, microzooplankton showed higher impacts on prey stocks than heterotrophic nanoflagellates, expressing larger ingestion rates and efficiencies. Across different trophic conditions characterized on the basis of chlorophyll a concentration, the microzooplankton diet was shown to change in prey composition: nano- and picoplankton almost completely covered consumer needs in oligotrophy and mesotrophy, while microphytoplankton (mostly diatoms) represented more than 80% of the consumers' diet in eutrophy, where, nevertheless, picoplankton mortality remained relatively high. Ingestion rates of both consumers (nano- and microzooplankters) increased with the availability of prey biomass and consequently with the trophic condition of the environment. Nevertheless, overall the heterotrophic fraction of picoplankton was the most exploited biomass for both classes of consumers. Ingestion efficiency (as the ratio between available biomass and ingestion rate) increased at low biomasses, and therefore the highest efficiencies were recorded in oligotrophic conditions and in the bathypelagic layers.

  13. Maximizing the quantum efficiency of microchannel plate detectors - The collection of photoelectrons from the interchannel web using an electric field

    NASA Technical Reports Server (NTRS)

    Taylor, R. C.; Hettrick, M. C.; Malina, R. F.

    1983-01-01

    High quantum efficiency and two-dimensional imaging capabilities make the microchannel plate (MCP) a suitable detector for a sky survey instrument. The Extreme Ultraviolet Explorer satellite, to be launched in 1987, will use MCP detectors. A feature which limits MCP efficiency is related to the walls of individual channels. The walls are of finite thickness and thus form an interchannel web. Under normal circumstances, this web does not contribute to the detector's quantum efficiency. Panitz and Foesch (1976) have found that in the case of a bombardment with ions, electrons were ejected from the electrode material coating the web. By applying a small electric field, the electrons were returned to the MCP surface where they were detected. The present investigation is concerned with the enhancement of quantum efficiencies in the case of extreme UV wavelengths. Attention is given to a model and a computer simulation which quantitatively reproduce the experimental results.

  14. NACE: A web-based tool for prediction of intercompartmental efficiency of human molecular genetic networks.

    PubMed

    Popik, Olga V; Ivanisenko, Timofey V; Saik, Olga V; Petrovskiy, Evgeny D; Lavrik, Inna N; Ivanisenko, Vladimir A

    2016-06-15

    Molecular genetic processes generally involve proteins from distinct intracellular localisations. Reactions that follow the same process are distributed among various compartments within the cell. In this regard, the reaction rate and the efficiency of biological processes can depend on the subcellular localisation of proteins. Previously, the authors proposed a method of evaluating the efficiency of biological processes based on the analysis of the distribution of protein subcellular localisation (Popik et al., 2014). Here, NACE is presented, which is an open access web-oriented program that implements this method and allows the user to evaluate the intercompartmental efficiency of human molecular genetic networks. The method has been extended by a new feature that provides the evaluation of the tissue-specific efficiency of networks for more than 2800 anatomical structures. Such assessments are important in cases when molecular genetic pathways in different tissues proceed with the participation of various proteins with a number of intracellular localisations. For example, an analysis of KEGG pathways, conducted using the developed program, showed that the efficiencies of many KEGG pathways are tissue-specific. Analysis of efficiencies of regulatory pathways in the liver, linking proteins of the hepatitis C virus with human proteins involved in the KEGG apoptosis pathway, showed that intercompartmental efficiency might play an important role in host-pathogen interactions. Thus, the developed tool can be useful in the study of the effectiveness of functioning of various molecular genetic networks, including metabolic, regulatory, host-pathogen interactions and others taking into account tissue-specific gene expression. The tool is available via the following link: http://www-bionet.sscc.ru/nace/.

  15. Adapting Web-based Instruction to Residents’ Knowledge Improves Learning Efficiency

    PubMed Central

    Beckman, Thomas J.; Thomas, Kris G.; Thompson, Warren G.

    2008-01-01

    Summary BACKGROUND Increased clinical demands and decreased available time accentuate the need for efficient learning in postgraduate medical training. Adapting Web-based learning (WBL) to learners’ prior knowledge may improve efficiency. OBJECTIVE We hypothesized that time spent learning would be shorter and test scores not adversely affected for residents who used a WBL intervention that adapted to prior knowledge. DESIGN Randomized, crossover trial. SETTING Academic internal medicine residency program continuity clinic. PARTICIPANTS 122 internal medicine residents. INTERVENTIONS Four WBL modules on ambulatory medicine were developed in standard and adaptive formats. The adaptive format allowed learners who correctly answered case-based questions to skip the corresponding content. MEASUREMENTS and Main Results The measurements were knowledge posttest, time spent on modules, and format preference. One hundred twenty-two residents completed at least 1 module, and 111 completed all 4. Knowledge scores were similar between the adaptive format (mean ± standard error of the mean, 76.2 ± 0.9) and standard (77.2 ± 0.9, 95% confidence interval [CI] for difference −3.0 to 1.0, P = .34). However, time spent was lower for the adaptive format (29.3 minutes [CI 26.0 to 33.0] per module) than for the standard (35.6 [31.6 to 40.3]), an 18% decrease in time (CI 9 to 26%, P = .0003). Seventy-two of 96 respondents (75%) preferred the adaptive format. CONCLUSIONS Adapting WBL to learners’ prior knowledge can reduce learning time without adversely affecting knowledge scores, suggesting greater learning efficiency. In an era where reduced duty hours and growing clinical demands on trainees and faculty limit the time available for learning, such efficiencies will be increasingly important. For clinical trial registration, see http://www.clinicaltrials.gov NCT00466453 (http://www.clinicaltrials.gov/ct/show/NCT00466453?order=1). PMID:18612729
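
    As a quick arithmetic check of the efficiency claim, the reported 18% time reduction follows directly from the per-module means quoted above; a minimal Python sketch using only values from the abstract:

        # Mean time per module (minutes), as reported in the abstract.
        adaptive = 29.3
        standard = 35.6
        reduction = (standard - adaptive) / standard
        print(f"{reduction:.1%}")   # ~17.7%, consistent with the reported 18% decrease (CI 9 to 26%)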

  16. Impacts of elevated terrestrial nutrient loads and temperature on pelagic food-web efficiency and fish production.

    PubMed

    Lefébure, R; Degerman, R; Andersson, A; Larsson, S; Eriksson, L-O; Båmstedt, U; Byström, P

    2013-05-01

    Both temperature and terrestrial organic matter have strong impacts on aquatic food-web dynamics and production. Temperature affects the vital rates of all organisms, while terrestrial organic matter can act as an energy source for lower trophic levels and simultaneously reduce light availability for autotrophic production. As climate change predictions for the Baltic Sea and elsewhere suggest increases in both terrestrial matter runoff and temperature, we studied the effects on pelagic food-web dynamics and food-web efficiency under a plausible future scenario with respect to these abiotic variables in a large-scale mesocosm experiment. Total basal (phytoplankton plus bacterial) production was slightly reduced when only temperature was increased, but was otherwise similar across all other treatments. Separate increases in nutrient loads and temperature decreased the ratio of autotrophic:heterotrophic production, but the combined treatment of elevated temperature and terrestrial nutrient loads increased both fish production and food-web efficiency. CDOM:Chl a ratios strongly indicated that terrestrial and not autotrophic carbon was the main energy source in these food webs, and our results also showed that zooplankton biomass was positively correlated with increased bacterial production. Concomitantly, biomass of the dominant calanoid copepod Acartia sp. increased as an effect of increased temperature. As the combined effects of increased temperature and terrestrial organic nutrient loads were required to increase zooplankton abundance and fish production, conclusions about the effects of climate change on food-web dynamics and fish production must be based on realistic combinations of several abiotic factors. Moreover, our results question established notions on the net inefficiency of heterotrophic carbon transfer to the top of the food web.

  17. OLTARIS: An Efficient Web-Based Tool for Analyzing Materials Exposed to Space Radiation

    NASA Technical Reports Server (NTRS)

    Slaba, Tony; McMullen, Amelia M.; Thibeault, Sheila A.; Sandridge, Chris A.; Clowdsley, Martha S.; Blatting, Steve R.

    2011-01-01

    The near-Earth space radiation environment includes energetic galactic cosmic rays (GCR), high intensity proton and electron belts, and the potential for solar particle events (SPE). These sources may penetrate shielding materials and deposit significant energy in sensitive electronic devices on board spacecraft and satellites. Material and design optimization methods may be used to reduce the exposure and extend the operational lifetime of individual components and systems. Since laboratory experiments are expensive and may not cover the range of particles and energies relevant for space applications, such optimization may be done computationally with efficient algorithms that include the various constraints placed on the component, system, or mission. In the present work, the web-based tool OLTARIS (On-Line Tool for the Assessment of Radiation in Space) is presented, and the applicability of the tool for rapidly analyzing exposure levels within either complicated shielding geometries or user-defined material slabs exposed to space radiation is demonstrated. An example approach for material optimization is also presented. Slabs of various advanced multifunctional materials are defined and exposed to several space radiation environments. The materials and thicknesses defining each layer in the slab are then systematically adjusted to arrive at an optimal slab configuration.
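
    The slab-optimization step described above amounts to a systematic search over layer materials and thicknesses. The sketch below illustrates that idea only; the material list, thickness grid, and dose_for_slab placeholder are hypothetical and do not call the OLTARIS service.

        from itertools import product

        # Hypothetical layer materials and candidate areal densities (g/cm^2).
        materials = ["polyethylene", "aluminum", "boron-nitride composite"]
        thicknesses = [0.5, 1.0, 2.0, 5.0]

        def dose_for_slab(slab):
            # Placeholder response function: in practice, dose values would come from
            # transport calculations (e.g. OLTARIS runs) for each layer configuration.
            return sum(t / (i + 2.0) for i, (_, t) in enumerate(slab))

        best = min((tuple(zip(materials, combo)) for combo in product(thicknesses, repeat=len(materials))),
                   key=dose_for_slab)
        print(best)   # lowest-dose configuration under the placeholder model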

  18. Efficiency of Using a Web-Based Approach to Teach Reading Strategies to Iranian EFL Learners

    ERIC Educational Resources Information Center

    Dehghanpour, Elham; Hashemian, Mahmood

    2015-01-01

    Applying new technologies, with their effective potentials, has changed education and, consequently, the L2 teacher role. Coping with online materials imposes the necessity of employing Web-based approaches in L2 instruction. The ability to use reading strategies in a Web-based condition needs sufficient skill which will be fulfilled if it is…

  19. Analyzing web log files of the health on the net HONmedia search engine to define typical image search tasks for image retrieval evaluation.

    PubMed

    Müller, Henning; Boyer, Célia; Gaudinat, Arnaud; Hersh, William; Geissbuhler, Antoine

    2007-01-01

    Medical institutions produce ever-increasing amounts of diverse information. The digital form makes these data available for use on more than a single patient. Images are no exception to this. However, less is known about how medical professionals search for visual medical information and how they want to use it outside of the context of a single patient. This article analyzes ten months of usage log files of the Health on the Net (HON) medical media search engine. Key words were extracted from all queries and the most frequent terms and subjects were identified. The dataset required extensive pre-treatment. Problems included national character sets, spelling errors and the use of terms in several languages. The results show that media search, particularly for images, was frequently used. The most common queries were for general concepts (e.g., heart, lung). To define realistic information needs for the ImageCLEFmed challenge evaluation (Cross Language Evaluation Forum medical image retrieval), we used frequent queries that were still specific enough to cover at least two of the three axes of modality, anatomic region, and pathology. Several research groups evaluated their image retrieval algorithms based on these defined topics.
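
    To illustrate the kind of processing described above, here is a minimal, hedged Python sketch that extracts and counts query terms from search-engine log lines. The line layout and the "q=" query parameter are assumptions for illustration; the HONmedia log format is not specified in the abstract.

        import re
        from collections import Counter

        # Count the most frequent query terms in a set of search-log lines.
        def top_terms(log_lines, n=10):
            counts = Counter()
            for line in log_lines:
                match = re.search(r"q=([^&\s]+)", line)   # assumed query parameter
                if not match:
                    continue
                query = match.group(1).replace("+", " ").lower()
                counts.update(re.findall(r"[a-z]+", query))
            return counts.most_common(n)

        sample = ["GET /media/search?q=heart+x-ray HTTP/1.1",
                  "GET /media/search?q=lung HTTP/1.1"]
        print(top_terms(sample))   # [('heart', 1), ('x', 1), ('ray', 1), ('lung', 1)]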

  20. MultiLog: a tool for the control and output merging of multiple logging applications.

    PubMed

    Woodruff, Jonathan; Alexander, Jason

    2016-12-01

    MultiLog is a logging tool that controls, gathers, and combines the output, on-the-fly, from existing research and commercial logging applications or "loggers." Loggers record a specific set of user actions on a computing device, helping researchers to better understand environments or interactions, guiding the design of new or improved interfaces and applications. MultiLog reduces researchers' required implementation effort by simplifying the set-up of multiple loggers and seamlessly combining their output. This in turn increases the availability of logging systems to non-technical experimenters for both short-term and longitudinal observation studies. MultiLog supports two operating modes: "researcher mode" where experimenters configure multiple logging systems, and "deployment mode" where the system is deployed to user-study participants' systems. Researcher mode allows researchers to install, configure log filtering and obfuscation, observe near real-time event streams, and save configuration files ready for deployment. Deployment mode simplifies data collection from multiple loggers by running in the system tray at user log-in, starting loggers, combining their output, and securely uploading the data to a web-server. It also supports real-time browsing of log data, pausing of logging, and removal of log lines. Performance evaluations show that MultiLog does not adversely affect system performance, even when simultaneously running several logging systems. Initial studies show the system runs reliably over a period of 10 weeks.
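
    The core merging step described above can be pictured as a time-ordered merge of several already-sorted event streams. Below is a minimal Python sketch of that idea only; the (timestamp, source, message) record layout is an assumption for illustration and is not MultiLog's actual on-disk format.

        import heapq
        from datetime import datetime

        def merge_logs(*streams):
            """Merge sorted streams of (iso_timestamp, source, message) records by time."""
            return heapq.merge(*streams, key=lambda rec: datetime.fromisoformat(rec[0]))

        keyboard = [("2016-01-01T10:00:00", "keylogger", "key A"),
                    ("2016-01-01T10:00:02", "keylogger", "key B")]
        mouse = [("2016-01-01T10:00:01", "mouselogger", "click (10, 20)")]

        for record in merge_logs(keyboard, mouse):
            print(record)   # records interleaved in timestamp order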

  1. Reviews Equipment: Data logger Book: Imagined Worlds Equipment: Mini data loggers Equipment: PICAXE-18M2 data logger Books: Engineering: A Very Short Introduction and To Engineer Is Human Book: Soap, Science, & Flat-Screen TVs Equipment: uLog and SensorLab Web Watch

    NASA Astrophysics Data System (ADS)

    2012-07-01

    WE RECOMMEND Data logger Fourier NOVA LINK: data logging and analysis To Engineer is Human Engineering: essays and insights Soap, Science, & Flat-Screen TVs People, politics, business and science overlap uLog sensors and sensor adapter A new addition to the LogIT range offers simplicity and ease of use WORTH A LOOK Imagined Worlds Socio-scientific predictions for the future Mini light data logger and mini temperature data logger Small-scale equipment for schools SensorLab Plus LogIT's supporting software, with extra features HANDLE WITH CARE CAXE110P PICAXE-18M2 data logger Data logger 'on view' but disappoints Engineering: A Very Short Introduction A broad-brush treatment fails to satisfy WEB WATCH Two very different websites for students: advanced physics questions answered and a more general BBC science resource

  2. SU-E-J-150: Impact of Intrafractional Prostate Motion On the Accuracy and Efficiency of Prostate SBRT Delivery: A Retrospective Analysis of Prostate Tracking Log Files

    SciTech Connect

    Xiang, H; Hirsch, A; Willins, J; Kachnic, J; Qureshi, M; Katz, M; Nicholas, B; Keohan, S; De Armas, R; Lu, H; Efstathiou, J; Zietman, A

    2014-06-01

    Purpose: To measure intrafractional prostate motion by time-based stereotactic x-ray imaging and investigate the impact on the accuracy and efficiency of prostate SBRT delivery. Methods: Prostate tracking log files with 1,892 x-ray image registrations from 18 SBRT fractions for 6 patients were retrospectively analyzed. Patient setup and beam delivery sessions were reviewed to identify extended periods of large prostate motion that caused delays in setup or interruptions in beam delivery. The 6D prostate motions were compared to the clinically used PTV margin of 3–5 mm (3 mm posterior, 5 mm all other directions), a hypothetical PTV margin of 2–3 mm (2 mm posterior, 3 mm all other directions), and the rotation correction limits (roll ±2°, pitch ±5° and yaw ±3°) of CyberKnife to quantify beam delivery accuracy. Results: Significant incidents of treatment start delay and beam delivery interruption were observed, mostly related to large pitch rotations of ≥±5°. Optimal setup time of 5–15 minutes was recorded in 61% of the fractions, and optimal beam delivery time of 30–40 minutes in 67% of the fractions. At a default imaging interval of 15 seconds, the percentage of prostate motion beyond PTV margin of 3–5 mm varied among patients, with a mean at 12.8% (range 0.0%–31.1%); and the percentage beyond PTV margin of 2–3 mm was at a mean of 36.0% (range 3.3%–83.1%). These timely detected offsets were all corrected real-time by the robotic manipulator or by operator intervention at the time of treatment interruptions. Conclusion: The durations of patient setup and beam delivery were directly affected by the occurrence of large prostate motion. Frequent imaging of down to 15 second interval is necessary for certain patients. Techniques for reducing prostate motion, such as using endorectal balloon, can be considered to assure consistently higher accuracy and efficiency of prostate SBRT delivery.
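
    The margin comparison reported above reduces to counting how many logged offsets exceed an anisotropic tolerance. A hedged Python sketch of that bookkeeping follows; the axis ordering, the sign convention for "posterior", and the sample numbers are assumptions for illustration, not values from the tracking logs.

        import numpy as np

        def fraction_beyond_margin(offsets_mm, posterior=3.0, other=5.0):
            """Fraction of (LR, SI, AP) offsets outside a 3 mm posterior / 5 mm elsewhere margin."""
            offsets = np.asarray(offsets_mm, dtype=float)
            ap_limit = np.where(offsets[:, 2] < 0, posterior, other)   # tighter limit posteriorly (assumed sign)
            beyond = (np.abs(offsets[:, :2]) > other).any(axis=1) | (np.abs(offsets[:, 2]) > ap_limit)
            return beyond.mean()

        sample_offsets = [[1.2, -0.5, -3.4], [0.3, 0.8, 1.1], [4.9, 5.6, -0.2]]
        print(f"{fraction_beyond_margin(sample_offsets):.1%}")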

  3. Index Compression and Efficient Query Processing in Large Web Search Engines

    ERIC Educational Resources Information Center

    Ding, Shuai

    2013-01-01

    The inverted index is the main data structure used by all the major search engines. Search engines build an inverted index on their collection to speed up query processing. As the size of the web grows, the length of the inverted list structures, which can easily grow to hundreds of MBs or even GBs for common terms (roughly linear in the size of…

  4. Harnessing modern web application technology to create intuitive and efficient data visualization and sharing tools

    PubMed Central

    Wood, Dylan; King, Margaret; Landis, Drew; Courtney, William; Wang, Runtang; Kelly, Ross; Turner, Jessica A.; Calhoun, Vince D.

    2014-01-01

    Neuroscientists increasingly need to work with big data in order to derive meaningful results in their field. Collecting, organizing and analyzing this data can be a major hurdle on the road to scientific discovery. This hurdle can be lowered using the same technologies that are currently revolutionizing the way that cultural and social media sites represent and share information with their users. Web application technologies and standards such as RESTful webservices, HTML5 and high-performance in-browser JavaScript engines are being utilized to vastly improve the way that the world accesses and shares information. The neuroscience community can also benefit tremendously from these technologies. We present here a web application that allows users to explore and request the complex datasets that need to be shared among the neuroimaging community. The COINS (Collaborative Informatics and Neuroimaging Suite) Data Exchange uses web application technologies to facilitate data sharing in three phases: Exploration, Request/Communication, and Download. This paper will focus on the first phase, and how intuitive exploration of large and complex datasets is achieved using a framework that centers around asynchronous client-server communication (AJAX) and also exposes a powerful API that can be utilized by other applications to explore available data. First opened to the neuroscience community in August 2012, the Data Exchange has already provided researchers with over 2500 GB of data. PMID:25206330

  5. Harnessing modern web application technology to create intuitive and efficient data visualization and sharing tools.

    PubMed

    Wood, Dylan; King, Margaret; Landis, Drew; Courtney, William; Wang, Runtang; Kelly, Ross; Turner, Jessica A; Calhoun, Vince D

    2014-01-01

    Neuroscientists increasingly need to work with big data in order to derive meaningful results in their field. Collecting, organizing and analyzing this data can be a major hurdle on the road to scientific discovery. This hurdle can be lowered using the same technologies that are currently revolutionizing the way that cultural and social media sites represent and share information with their users. Web application technologies and standards such as RESTful webservices, HTML5 and high-performance in-browser JavaScript engines are being utilized to vastly improve the way that the world accesses and shares information. The neuroscience community can also benefit tremendously from these technologies. We present here a web application that allows users to explore and request the complex datasets that need to be shared among the neuroimaging community. The COINS (Collaborative Informatics and Neuroimaging Suite) Data Exchange uses web application technologies to facilitate data sharing in three phases: Exploration, Request/Communication, and Download. This paper will focus on the first phase, and how intuitive exploration of large and complex datasets is achieved using a framework that centers around asynchronous client-server communication (AJAX) and also exposes a powerful API that can be utilized by other applications to explore available data. First opened to the neuroscience community in August 2012, the Data Exchange has already provided researchers with over 2500 GB of data.

  6. MEDock: a web server for efficient prediction of ligand binding sites based on a novel optimization algorithm.

    PubMed

    Chang, Darby Tien-Hau; Oyang, Yen-Jen; Lin, Jung-Hsin

    2005-07-01

    The prediction of ligand binding sites is an essential part of the drug discovery process. Knowing the location of binding sites greatly facilitates the search for hits, the lead optimization process, the design of site-directed mutagenesis experiments and the hunt for structural features that influence the selectivity of binding in order to minimize the drug's adverse effects. However, docking is still the rate-limiting step for such predictions; consequently, much more efficient algorithms are required. In this article, the design of the MEDock web server is described. The goal of this server is to provide an efficient utility for predicting ligand binding sites. The MEDock web server incorporates a global search strategy that exploits the maximum entropy property of the Gaussian probability distribution in the context of information theory. As a result of the global search strategy, the optimization algorithm incorporated in MEDock is significantly superior when dealing with very rugged energy landscapes, which usually have insurmountable barriers. This article describes four different benchmark cases that span a diverse set of different types of ligand binding interactions. These benchmarks were compared with the use of the Lamarckian genetic algorithm (LGA), which is the major workhorse of the well-known AutoDock program. These results demonstrate that MEDock consistently converged to the correct binding modes with significantly smaller numbers of energy evaluations than the LGA required. When judged by a threshold of the number of energy evaluations consumed in the docking simulation, MEDock also greatly elevates the rate of accurate predictions for all benchmark cases. MEDock is available at http://medock.csie.ntu.edu.tw/ and http://bioinfo.mc.ntu.edu.tw/medock/.
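
    The abstract describes a Gaussian-based global search but gives no algorithmic detail; the sketch below is therefore only a generic, cross-entropy-style illustration of "sample from a Gaussian, refit it to the best candidates, repeat" on a toy energy function. It is not the actual MEDock algorithm, scoring function, or interface.

        import numpy as np

        def gaussian_global_search(energy, dim, iters=50, pop=200, elite=20, seed=0):
            rng = np.random.default_rng(seed)
            mean, cov = np.zeros(dim), np.eye(dim) * 10.0
            for _ in range(iters):
                samples = rng.multivariate_normal(mean, cov, size=pop)
                ranked = samples[np.argsort([energy(x) for x in samples])[:elite]]
                mean, cov = ranked.mean(axis=0), np.cov(ranked.T) + 1e-6 * np.eye(dim)
            return mean

        # Toy "energy landscape" with a single minimum at (1, -2, 3).
        target = np.array([1.0, -2.0, 3.0])
        print(gaussian_global_search(lambda x: np.sum((x - target) ** 2), dim=3))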

  7. Log N-log S is inconclusive

    NASA Technical Reports Server (NTRS)

    Klebesadel, R. W.; Fenimore, E. E.; Laros, J.

    1983-01-01

    The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power law distribution, the results are not adequate at this stage of observations to differentiate between a -3/2 and a -1 power law slope.
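
    The -3/2 test mentioned above is a slope fit of the cumulative burst count N(>S) against fluence S on log-log axes. A minimal Python sketch with synthetic fluences (not PVO or KONUS data) illustrates the procedure:

        import numpy as np

        rng = np.random.default_rng(1)
        S = rng.pareto(1.5, size=300) + 1.0        # synthetic fluences with a -3/2 cumulative law
        S_sorted = np.sort(S)[::-1]
        N = np.arange(1, S_sorted.size + 1)        # cumulative count N(>S)

        slope, _ = np.polyfit(np.log10(S_sorted), np.log10(N), 1)
        print(f"fitted log N-log S slope: {slope:.2f}")   # roughly -1.5 for this synthetic sample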

  8. Log Truck-Weighing System

    NASA Technical Reports Server (NTRS)

    1977-01-01

    ELDEC Corp., Lynwood, Wash., built a weight-recording system for logging trucks based on electronic technology the company acquired as a subcontractor on space programs such as Apollo and the Saturn launch vehicle. ELDEC employed its space-derived expertise to develop a computerized weight-and-balance system for Lockheed's TriStar jetliner. ELDEC then adapted the airliner system to a similar product for logging trucks. Electronic equipment computes tractor weight, trailer weight and overall gross weight, and this information is presented to the driver by an instrument in the cab. The system costs $2,000 but it pays for itself in a single year. It allows operators to use a truck's hauling capacity more efficiently since the load can be maximized without exceeding legal weight limits for highway travel. Approximately 2,000 logging trucks now use the system.

  9. Costs and Efficiency of Online and Offline Recruitment Methods: A Web-Based Cohort Study

    PubMed Central

    Riis, Anders H; Hatch, Elizabeth E; Wise, Lauren A; Nielsen, Marie G; Rothman, Kenneth J; Toft Sørensen, Henrik; Mikkelsen, Ellen M

    2017-01-01

    Background The Internet is widely used to conduct research studies on health issues. Many different methods are used to recruit participants for such studies, but little is known about how various recruitment methods compare in terms of efficiency and costs. Objective The aim of our study was to compare online and offline recruitment methods for Internet-based studies in terms of efficiency (number of recruited participants) and costs per participant. Methods We employed several online and offline recruitment methods to enroll 18- to 45-year-old women in an Internet-based Danish prospective cohort study on fertility. Offline methods included press releases, posters, and flyers. Online methods comprised advertisements placed on five different websites, including Facebook and Netdoktor.dk. We defined seven categories of mutually exclusive recruitment methods and used electronic tracking via unique Uniform Resource Locator (URL) and self-reported data to identify the recruitment method for each participant. For each method, we calculated the average cost per participant and efficiency, that is, the total number of recruited participants. Results We recruited 8252 study participants. Of these, 534 were excluded as they could not be assigned to a specific recruitment method. The final study population included 7724 participants, of whom 803 (10.4%) were recruited by offline methods, 3985 (51.6%) by online methods, 2382 (30.8%) by online methods not initiated by us, and 554 (7.2%) by other methods. Overall, the average cost per participant was €6.22 for online methods initiated by us versus €9.06 for offline methods. Costs per participant ranged from €2.74 to €105.53 for online methods and from €0 to €67.50 for offline methods. Lowest average costs per participant were for those recruited from Netdoktor.dk (€2.99) and from Facebook (€3.44). Conclusions In our Internet-based cohort study, online recruitment methods were superior to offline methods in terms

  10. Log processing systems

    SciTech Connect

    Bowlin, W.P.; Kneer, M.P.; Ballance, J.D.

    1989-11-07

    This patent describes an improvement in a computer-controlled processing system for lumber production. It comprises a computer and a sequence of processing stations for processing a log segment, including: an excess-material-removing station for generating opposed flat side surfaces on the log segment, these flat side surfaces being determined by the computer to become sides of boards to be severed from the log segment; a profiling station for forming profiled edges above and below the flat side surfaces to become the side edges of the boards to be severed from the log segment; and a severing station for severing the boards from the log segment. A conveyance means establishes a path of conveyance and maintains continuous control of the log segment while conveying it along the path and through the above-defined sequence of processing stations.

  11. Optimal message log reclamation for independent checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. Kent

    1993-01-01

    Independent (uncoordinated) checkpointing for parallel and distributed systems allows maximum process autonomy but suffers from possible domino effects and the associated storage space overhead for maintaining multiple checkpoints and message logs. In most research on checkpointing and recovery, it was assumed that only the checkpoints and message logs older than the global recovery line could be discarded. It is shown how recovery line transformation and decomposition can be applied to the problem of efficiently identifying all discardable message logs, thereby achieving optimal garbage collection. Communication trace-driven simulation for several parallel programs is used to show the benefits of the proposed algorithm for message log reclamation.

  12. Well Log ETL tool

    SciTech Connect

    Good, Jessica

    2013-08-01

    This is an executable python script which offers two different conversions for well log data: 1) Conversion from a BoreholeLASLogData.xls model to a LAS version 2.0 formatted XML file. 2) Conversion from a LAS 2.0 formatted XML file to an entry in the WellLog Content Model. Example templates for BoreholeLASLogData.xls and WellLogsTemplate.xls can be found in the package after download.
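
    As a rough illustration of the first conversion (spreadsheet to XML), here is a hedged Python sketch. The sheet layout, column names, and XML tags are assumptions for illustration; the actual tool defines its own templates and LAS 2.0 mapping.

        import pandas as pd
        import xml.etree.ElementTree as ET

        def xls_to_xml(xls_path, xml_path):
            frame = pd.read_excel(xls_path)              # assumed: one row per depth sample
            root = ET.Element("WellLog")
            for _, row in frame.iterrows():
                sample = ET.SubElement(root, "Sample")
                for column, value in row.items():
                    tag = str(column).strip().replace(" ", "_")   # XML tags cannot contain spaces
                    ET.SubElement(sample, tag).text = str(value)
            ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

        # xls_to_xml("BoreholeLASLogData.xls", "well_log.xml")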

  13. Multiple log potash assay

    NASA Astrophysics Data System (ADS)

    Hill, D. G.

    1993-10-01

    A five-mineral multiple-log potash assay technique has been successfully applied to evaluate potash-rich intervals in evaporite sequences. The technique is able to distinguish economic potash minerals from non-economic potash minerals and from other non-potash radioactive minerals. It can be applied on location, using a programmable calculator or microcomputer, providing near real-time logs of potash mineral concentrations. Log assay values show good agreement with core wet chemistry analyses.

  14. Ulysses log 1992

    NASA Technical Reports Server (NTRS)

    Perez, Raul Garcia

    1993-01-01

    The Ulysses Log tells the story of some intriguing problems that we (=The Spacecraft Team) have encountered. Ulysses was launched on 6 Oct. 1990, and it made the fastest trip to Jupiter (8 Feb. 1992). It is presently going out of the ecliptic. This paper presents log entries from the following areas: (1) ingenious maneuvers; (2) telecommunication problems; and (3) surprises.

  15. Food web efficiency differs between humic and clear water lake communities in response to nutrients and light.

    PubMed

    Faithfull, C L; Mathisen, P; Wenzel, A; Bergström, A K; Vrede, T

    2015-03-01

    This study demonstrates that clear and humic freshwater pelagic communities respond differently to the same environmental stressors, i.e. nutrient and light availability. Thus, effects on humic communities cannot be generalized from existing knowledge about these environmental stressors on clear water communities. Small humic lakes are the most numerous type of lake in the boreal zone, but little is known about how these lakes will respond to increased inflows of nutrients and terrestrial dissolved organic C (t-DOC) due to climate change and increased human impacts. Therefore, we compared the effects of nutrient addition and light availability on pelagic humic and clear water lake communities in a mesocosm experiment. When nutrients were added, phytoplankton production (PPr) increased in both communities, but pelagic energy mobilization (PEM) and bacterial production (BP) only increased in the humic community. At low light conditions, the addition of nutrients led to increased PPr only in the humic community, suggesting that, in contrast to the clear water community, humic phytoplankton were already adapted to lower ambient light levels. Low light significantly reduced PPr and PEM in the clear water community, but without reducing total zooplankton production, which resulted in a doubling of food web efficiency (FWE = total zooplankton production/PEM). However, total zooplankton production was not correlated with PEM, PPr, BP, PPr:BP or C:nutrient stoichiometry for either community type. Therefore, other factors such as food chain length, food quality, ultra-violet radiation or duration of the experiment, must have determined total zooplankton production and ultimately FWE.

  16. Future climate scenarios for a coastal productive planktonic food web resulting in microplankton phenology changes and decreased trophic transfer efficiency.

    PubMed

    Calbet, Albert; Sazhin, Andrey F; Nejstgaard, Jens C; Berger, Stella A; Tait, Zachary S; Olmos, Lorena; Sousoni, Despoina; Isari, Stamatina; Martínez, Rodrigo A; Bouquet, Jean-Marie; Thompson, Eric M; Båmstedt, Ulf; Jakobsen, Hans H

    2014-01-01

    We studied the effects of future climate change scenarios on plankton communities of a Norwegian fjord using a mesocosm approach. After the spring bloom, natural plankton were enclosed and treated in duplicate with inorganic nutrients elevated to pre-bloom conditions (N, P, Si; eutrophication), a lowering of 0.4 pH units (acidification), and a 3°C temperature rise (warming). All nutrient-amended treatments resulted in phytoplankton blooms dominated by chain-forming diatoms, reaching 13-16 μg chlorophyll (chl) a l-1. In the control mesocosms, chl a remained below 1 μg l-1. Acidification and warming had contrasting effects on the phenology and bloom dynamics of autotrophic and heterotrophic microplankton. Bacillariophyceae, prymnesiophyceae, cryptophyta, and Protoperidinium spp. peaked earlier at higher temperature and lower pH. Chlorophyta showed lower peak abundances with acidification, but higher peak abundances with increased temperature. The peak magnitude of autotrophic dinophyceae and ciliates was, on the other hand, lowered with combined warming and acidification. Over time, the plankton communities shifted from autotrophic phytoplankton blooms to a more heterotrophic system in all mesocosms, especially in the control unaltered mesocosms. The mass balance and the proportion of heterotrophic to autotrophic biomass predict a shift towards a more autotrophic community and less-efficient food-web transfer when temperature, nutrients and acidification are combined in a future climate-change scenario. We suggest that this result may be related to a lower food quality for microzooplankton under acidification and warming scenarios and to an increase of catabolic processes compared to anabolic ones at higher temperatures.

  17. Acoustic borehole logging

    SciTech Connect

    Medlin, W.L.; Manzi, S.J.

    1990-10-09

    This patent describes an acoustic borehole logging method. It comprises traversing a borehole with a borehole logging tool containing a transmitter of acoustic energy, having a free-field frequency spectrum with at least one characteristic resonant frequency of vibration, and a spaced-apart receiver; repeatedly exciting the transmitter with a swept-frequency tone burst of a duration sufficiently greater than the travel time of acoustic energy between the transmitter and the receiver to allow borehole cavity resonances to be established within the borehole cavity formed between the borehole logging tool and the borehole wall; detecting acoustic energy amplitude-modulated by the borehole cavity resonances with the spaced-apart receiver; and recording an amplitude-versus-frequency output of the receiver, in correlation with depth, as a log of the borehole frequency spectrum representative of the subsurface formation comprising the borehole wall.

  18. EE-3A Logging Report

    SciTech Connect

    Anderson, David W.

    1993-12-15

    Two logs of EE-3A were performed during the last couple of weeks. The first was a Temperature/Casing-Collar Locator (CCL) log, which took place on Friday, December 10th, 1993. The second was a Caliper log, done in cooperation with the Dia-Log Company of Odessa, TX, on Monday, December 13th, 1993.

  19. Data for Free: Using LMS Activity Logs to Measure Community in Online Courses

    ERIC Educational Resources Information Center

    Black, Erik W.; Dawson, Kara; Priem, Jason

    2008-01-01

    In the study of online learning community, many investigators have turned attention to automatically logged web data. This study aims to further this work by seeking to determine whether logs of student activity within online graduate level courses related to student perceptions of course community. Researchers utilized the data logging features…

  20. 6. Log calving barn. Interior view showing log postandbeam support ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Log calving barn. Interior view showing log post-and-beam support system and animal stalls. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT

  1. Recognizing Patterns In Log-Polar Coordinates

    NASA Technical Reports Server (NTRS)

    Weiman, Carl F. R.

    1992-01-01

    Log-Hough transform is basis of improved method for recognition of patterns - particularly, straight lines - in noisy images. Takes advantage of rotational and scale invariance of mapping from Cartesian to log-polar coordinates, and offers economy of representation and computation. Unification of iconic and Hough domains simplifies computations in recognition and eliminates erroneous quantization of slopes attributable to finite spacing of Cartesian coordinate grid of classical Hough transform. Equally efficient recognizing curves. Log-Hough transform more amenable to massively parallel computing architectures than traditional Cartesian Hough transform. "In-place" nature makes it possible to apply local pixel-neighborhood processing.
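
    The scale and rotation invariance mentioned above comes from the coordinate mapping itself: under the Cartesian-to-log-polar map, uniform scaling becomes a shift in log-radius and rotation becomes a shift in angle. A small Python check:

        import numpy as np

        def to_log_polar(x, y):
            """Map Cartesian (x, y) to (log r, theta)."""
            return np.log(np.hypot(x, y)), np.arctan2(y, x)

        x, y = 3.0, 4.0
        u1, v1 = to_log_polar(x, y)
        u2, v2 = to_log_polar(2.0 * x, 2.0 * y)      # scale the point by 2
        print(u2 - u1, v2 - v1)                      # ~log(2) shift in log-radius, angle unchanged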

  2. e2g: an interactive web-based server for efficiently mapping large EST and cDNA sets to genomic sequences.

    PubMed

    Krüger, Jan; Sczyrba, Alexander; Kurtz, Stefan; Giegerich, Robert

    2004-07-01

    e2g is a web-based server which efficiently maps large expressed sequence tag (EST) and cDNA datasets to genomic DNA. It significantly extends the volume of data that can be mapped in reasonable time, and makes this improved efficiency available as a web service. Our server hosts large collections of EST sequences (e.g. 4.1 million mouse ESTs of 1.87 Gb) in precomputed indexed data structures for efficient sequence comparison. The user can upload a genomic DNA sequence of interest and rapidly compare this to the complete collection of ESTs on the server. This delivers a mapping of the ESTs on the genomic DNA. The e2g web interface provides a graphical overview of the mapping. Alignments of the mapped EST regions with parts of the genomic sequence are visualized. Zooming functions allow the user to interactively explore the results. Mapped sequences can be downloaded for further analysis. e2g is available on the Bielefeld University Bioinformatics Server at http://bibiserv.techfak.uni-bielefeld.de/e2g/.

  3. Real-Time System Log Monitoring/Analytics Framework

    SciTech Connect

    Oral, H Sarp; Dillow, David A; Park, Byung H; Shipman, Galen M; Geist, Al; Gunasekaran, Raghul

    2011-01-01

    Analyzing system logs provides useful insights for identifying system/application anomalies and helps in better usage of system resources. Nevertheless, it is simply not practical to scan through the raw log messages on a regular basis for large-scale systems. First, the sheer volume of unstructured log messages affects readability, and second, correlating the log messages to system events is a daunting task. These factors limit the use of large-scale system logs primarily to generating alerts on known system events and to post-mortem diagnosis for identifying previously unknown system events that impacted the system's performance. In this paper, we describe a log monitoring framework that enables prompt analysis of system events in real time. Our web-based framework provides a summarized view of console, netwatch, consumer, and apsched logs in real time. The logs are parsed and processed to generate views of applications, message types, individual or groups of compute nodes, and sections of the compute platform. Also, from past application runs we build a statistical profile of user/application characteristics with respect to known system events, recoverable/non-recoverable error messages and resources utilized. The web-based tool is being developed for Jaguar XT5 at the Oak Ridge Leadership Computing Facility.

  4. NMR logging apparatus

    DOEpatents

    Walsh, David O; Turner, Peter

    2014-05-27

    Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.

  5. 4. Log chicken house (far left foreground), log bunkhouse (far ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Log chicken house (far left foreground), log bunkhouse (far left background), one-room log cabin (left of center background), log root cellar (center), post-and-beam center in foreground, and blacksmith shop (far right foreground). View to southeast. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  6. Dissemination Strategies and Adherence Predictors for Web-Based Interventions--How Efficient Are Patient Education Sessions and Email Reminders?

    ERIC Educational Resources Information Center

    Schweier, R.; Romppel, M.; Richter, C.; Grande, G.

    2016-01-01

    The Internet offers the potential to efficaciously deliver health interventions at a low cost and with a low threshold across any distance. However, since many web-based interventions are confronted with low use and adherence, proactive dissemination strategies are needed. We, therefore, tested the efficacy of a 1-h patient education session as…

  7. Reviews Book: Enjoyable Physics Equipment: SEP Colorimeter Box Book: Pursuing Power and Light Equipment: SEP Bottle Rocket Launcher Equipment: Sciencescope GLE Datalogger Equipment: EDU Logger Book: Physics of Sailing Book: The Lightness of Being Software: Logotron Insight iLog Studio iPhone Apps Lecture: 2010 IOP Schools and Colleges Lecture Web Watch

    NASA Astrophysics Data System (ADS)

    2010-09-01

    WE RECOMMEND Enjoyable Physics Mechanics book makes learning more fun SEP Colorimeter Box A useful and inexpensive colorimeter for the classroom Pursuing Power and Light Account of the development of science in the 19th centuary SEP Bottle Rocket Launcher An excellent resource for teaching about projectiles GLE Datalogger GPS software is combined with a datalogger EDU Logger Remote datalogger has greater sensing abilities Logotron Insight iLog Studio Software enables datlogging, data analysis and modelling iPhone Apps Mobile phone games aid study of gravity WORTH A LOOK Physics of Sailing Book journeys through the importance of physics in sailing The Lightness of Being Study of what the world is made from LECTURE The 2010 IOP Schools and Colleges Lecture presents the physics of fusion WEB WATCH Planet Scicast pushes boundaries of pupil creativity

  8. Logs Perl Module

    SciTech Connect

    Owen, R. K.

    2007-04-04

    A perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
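
    The module itself is Perl; the following is only a small Python sketch of the same idea, reading PBS accounting-log lines and filtering by date and record type. The semicolon-delimited "datetime;record_type;id;message" layout and the timestamp format are assumptions about typical PBS accounting files, not a specification of this module.

        from datetime import datetime

        def filter_records(path, record_types=("E",), since=None):
            """Yield (timestamp, type, id, message) for matching PBS accounting records."""
            with open(path) as handle:
                for line in handle:
                    parts = line.rstrip("\n").split(";", 3)
                    if len(parts) < 4:
                        continue                                  # skip malformed lines
                    stamp = datetime.strptime(parts[0], "%m/%d/%Y %H:%M:%S")
                    if parts[1] in record_types and (since is None or stamp >= since):
                        yield stamp, parts[1], parts[2], parts[3]

        # for record in filter_records("20070404", record_types=("E", "S")):
        #     print(record)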

  9. Log of Apollo 11.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)

  10. Alaska's Logging Camp School.

    ERIC Educational Resources Information Center

    Millward, Robert E.

    1999-01-01

    A visit to Ketchikan, Alaska, reveals a floating, one-teacher logging-camp school that uses multiage grouping and interdisciplinary teaching. There are 10 students. The school gym and playground, bunkhouse, fuel tanks, mess hall, and students' homes bob up and down and are often moved to other sites. (MLH)

  11. Interactive Reflective Logs

    ERIC Educational Resources Information Center

    Deaton, Cynthia Minchew; Deaton, Benjamin E.; Leland, Katina

    2010-01-01

    The authors created an interactive reflective log (IRL) to provide teachers with an opportunity to use a journal approach to record, evaluate, and communicate student understanding of science concepts. Unlike a traditional journal, the IRL incorporates prompts to encourage students to discuss their understanding of science content and science…

  12. Petrographic image logging system

    SciTech Connect

    Payne, C.J.; Ulrich, M.R.; Maxwell, G.B. ); Adams, J.P. )

    1991-03-01

    The Petrographic Image Logging System (PILS) is a logging system data base for Macintosh computers that allows the merging of traditional wire-line, core, and mud log data with petrographic images. The system is flexible; it allows the user to record, manipulate, and display almost any type of character, graphic, and image information. Character and graphic data are linked and entry in either mode automatically generates the alternate mode. Character/graphic data may include such items as ROP, wire-line log data, interpreted lithologies, ditch cutting lith-percentages, porosity grade and type, grain size, core/DST information, and sample descriptions. Image data may include petrographic and SEM images of cuttings, core, and thin sections. All data are tied to depth. Data are entered quickly and easily in an interactive manner with a mouse, keyboard, and digitizing tablet or may be imported and immediately autoplotted from a variety of environments via modem, network, or removable disk. Color log displays, including petrographic images, are easily available on CRT or as hardcopy. The system consists of a petrographic microscope, video camera, Macintosh computer, video framegrabber and digitizing tablet. Hardcopy is scaleable and can be generated by a variety of color printing devices. The software is written in Supertalk, a color superset of the standard Apple Hypercard programming language, hypertalk. This system is being tested by Mobil in the lab and at the well site. Implementation has provided near 'real-time' core and cuttings images from drilling wells to the geologist back at the office.

  13. Development of Kevlar parachute webbings

    SciTech Connect

    Ericksen, R.H.

    1991-01-01

    This paper describes the development of Kevlar webbings for parachute applications. Evaluation of existing webbings and a study of the effects of filling yarn denier and pick count on tensile and joint strength provided data for fabric design. Measurements of warp crimp as a function of filling denier and pick count demonstrated the relationship between warp crimp and strength. One newly developed webbing had higher strength efficiency and another had higher joint efficiency than comparable existing webbings. Both new webbings had overall efficiencies over 5% higher than values for existing webbings. 10 refs., 4 figs., 2 tabs.

  14. Log-Concavity and Strong Log-Concavity: a review

    PubMed Central

    Saumard, Adrien; Wellner, Jon A.

    2016-01-01

    We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning. PMID:27134693
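
    For reference, the two central notions can be stated compactly; the normalization of strong log-concavity below follows common usage and may differ in constants from the review's exact convention:

        A density $f:\mathbb{R}\to[0,\infty)$ is \emph{log-concave} if $\log f$ is concave, i.e.
        \[
          f\bigl(\lambda x+(1-\lambda)y\bigr)\;\ge\; f(x)^{\lambda} f(y)^{1-\lambda}
          \quad\text{for all } x,y\in\mathbb{R},\ \lambda\in[0,1],
        \]
        and \emph{strongly log-concave} with parameter $c>0$ if $x\mapsto f(x)\,e^{cx^{2}/2}$ is
        log-concave, i.e. $(\log f)''\le -c$ wherever $\log f$ is twice differentiable.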

  15. Log-Concavity and Strong Log-Concavity: a review.

    PubMed

    Saumard, Adrien; Wellner, Jon A

    We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning.

  16. Keystroke Logging in Writing Research: Using Inputlog to Analyze and Visualize Writing Processes

    ERIC Educational Resources Information Center

    Leijten, Marielle; Van Waes, Luuk

    2013-01-01

    Keystroke logging has become instrumental in identifying writing strategies and understanding cognitive processes. Recent technological advances have refined logging efficiency and analytical outputs. While keystroke logging allows for ecological data collection, it is often difficult to connect the fine grain of logging data to the underlying…

  17. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C.; Zimmermann, A.; Korup, O.; Iroume, A.; Francke, T.; Bronstert, A.

    2013-12-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics with non-parametric Quantile Regression Forests (QRF) applied to high-frequency (3-min) measurements of stream discharge and sediment concentrations in similar-sized (~0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors, which in turn provides model uncertainties. We find that, where no logging occurred, ~80% of the total sediment load was transported during extremely variable runoff events within only 5% of the monitoring period. Dry-season logging in particular dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more efficient moderate events. We conclude that QRF may reliably support forest management recommendations by providing robust simulations of the post-logging response of water and sediment fluxes at high temporal resolution.
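
    The quantile-from-forest idea used above can be sketched in a few lines: train an ordinary random forest, then estimate a conditional quantile from the training targets that share leaves with the query point. The sketch below uses synthetic data and a simplified pooling of co-leaf targets (Meinshausen's QRF weights leaf members per tree; this pooled version is an approximation), so it is an illustration rather than the study's model.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 1.0, size=(500, 2))                 # e.g. discharge, rainfall intensity
        y = 10.0 * X[:, 0] + rng.gamma(2.0, 1.0, size=500)       # skewed synthetic "sediment flux"

        forest = RandomForestRegressor(n_estimators=200, min_samples_leaf=5, random_state=0).fit(X, y)
        train_leaves = forest.apply(X)                           # leaf index of each sample in each tree

        def predict_quantile(x_new, q):
            leaves = forest.apply(np.atleast_2d(x_new))[0]
            pooled = np.concatenate([y[train_leaves[:, t] == leaves[t]]
                                     for t in range(len(forest.estimators_))])
            return np.quantile(pooled, q)

        print(predict_quantile([0.8, 0.5], 0.9))                 # 90th-percentile flux estimate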

  18. Identifying related journals through log analysis

    PubMed Central

    Lu, Zhiyong; Xie, Natalie; Wilbur, W. John

    2009-01-01

    Motivation: With the explosion of biomedical literature and the evolution of online and open access, scientists are reading more articles from a wider variety of journals. Thus, the list of core journals relevant to their research may be less obvious and may often change over time. To help researchers quickly identify appropriate journals to read and publish in, we developed a web application for finding related journals based on the analysis of PubMed log data. Availability: http://www.ncbi.nlm.nih.gov/IRET/Journals Contact: luzh@ncbi.nlm.nih.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19734155

  19. What Can Instructors and Policy Makers Learn about Web-Supported Learning through Web-Usage Mining

    ERIC Educational Resources Information Center

    Cohen, Anat; Nachmias, Rafi

    2011-01-01

    This paper focuses on a Web-log based tool for evaluating pedagogical processes occurring in Web-supported academic instruction and students' attitudes. The tool consists of computational measures which demonstrate what instructors and policy makers can learn about Web-supported instruction through Web-usage mining. The tool can provide different…

  20. 2. One-room log cabin (right), log root cellar (center), two-room ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. One-room log cabin (right), log root cellar (center), two-room log cabin (left), and post-and-beam garage (background). View to southwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  1. 12. Upstream view showing the lower log pond log chute in ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. Upstream view showing the lower log pond log chute in the main channel of the Hudson River. The log chute in the dam can be seen in the background. Facing southwest. - Glens Falls Dam, 100' to 450' West of U.S. Route 9 Bridge Spanning Hudson River, Glens Falls, Warren County, NY

  2. Grid Logging: Best Practices Guide

    SciTech Connect

    Tierney, Brian L; Tierney, Brian L; Gunter, Dan

    2008-04-01

    The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers and Grid middleware itself. Currently, most generated log files are only useful to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used. Other custom logging APIs or even printf could be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) consistently structured, typed log events; (2) a standard high-resolution timestamp; (3) use of logging levels and categories to separate logs by detail and purpose; (4) consistent use of global and local identifiers; and (5) use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
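
    A minimal sketch of what such a log event might look like in practice, assuming Python's standard logging module and a JSON-per-line layout (the field names and schema here are illustrative, not mandated by the guide):

        import json
        import logging
        import time
        import uuid

        logging.basicConfig(level=logging.INFO, format="%(message)s")

        def log_event(event, level="INFO", **fields):
            """Emit one structured, newline-delimited ASCII event with a high-resolution timestamp."""
            record = {
                "ts": f"{time.time():.6f}",      # high-resolution UNIX timestamp
                "level": level,
                "event": event,
                "guid": str(uuid.uuid4()),       # global identifier for cross-host correlation
                **fields,
            }
            logging.getLogger("grid").log(getattr(logging, level), json.dumps(record))

        log_event("transfer.start", src="gridftp://siteA/file", dst="gridftp://siteB/file", nbytes=1048576)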

  3. My Journey with Learning Logs

    ERIC Educational Resources Information Center

    Hurst, Beth

    2005-01-01

    Learning logs, or reading response logs, have long been established as an effective reading strategy that helps students learn from text (Atwell, 1987; Blough & Berman, 1991; Calkins, 1986; Commander & Smith, 1996; Kuhrt & Farris, 1990; Reed, 1988; Sanders, 1985). In this paper, the author describes her experiences using learning logs as a…

  4. Student Portfolio Analysis for Decision Support of Web-Based Classroom Teacher by Data Cube Technology.

    ERIC Educational Resources Information Center

    Chang, Chih-Kai; Chen, Gwo-Dong; Liu, Baw-Jhiune; Ou, Kou-Liang

    As learners use World Wide Web-based distance learning systems over a period of years, large amounts of learning logs are generated. An instructor needs analysis tools to manage the logs and discover unusual patterns within them to improve instruction. However, logs of a Web server cannot serve as learners' portfolios to satisfy the requirements…

  5. Making WEB Meaning.

    ERIC Educational Resources Information Center

    McKenzie, Jamie

    1996-01-01

    Poorly organized and dominated by amateurs, hucksters, and marketeers, the net requires efficient navigating devices. Students at Bellingham (Washington) Public Schools tackle information overload by contributing to virtual museums on school Web sites, using annotated Web curriculum lists, and conducting research in cooperative teams stressing…

  6. Instructional Efficiency of Performance Analysis Training for Learners at Different Levels of Competency in Using a Web-Based EPSS

    ERIC Educational Resources Information Center

    Darabi, A. Aubteen; Nelson, David W.; Mackal, Melissa C.

    2004-01-01

    The measure of performance improvement potential (Gilbert, 1978) in human performance technology uses an exemplary performance as a criterion against which to measure the potential improvement in the performance of a workforce. The measure is calculated based on the performance efficiency which compares expended resources to productivity. The same…

  7. Use of Group Discussion and Learning Portfolio to Build Knowledge for Managing Web Group Learning

    ERIC Educational Resources Information Center

    Chen, Gwo-Dong; Ou, Kuo-Liang; Wang, Chin-Yeh

    2003-01-01

    To monitor and enhance the learning performance of learning groups in a Web learning system, teachers need to know the learning status of the group and determine the key influences affecting group learning outcomes. Teachers can achieve this goal by observing the group discussions and learning behavior from Web logs and analyzing the Web log data…

  8. Oracle Log Buffer Queueing

    SciTech Connect

    Rivenes, A S

    2004-12-08

    The purpose of this document is to investigate Oracle database log buffer queuing and its effect on the ability to load data using a specialized data loading system. Experiments were carried out on a Linux system using an Oracle 9.2 database. Previous experiments on a Sun 4800 running Solaris had shown that 100,000 entities per minute was an achievable rate. The question was then asked, can we do this on Linux, and where are the bottlenecks? A secondary question was also lurking: how can the loading be further scaled to handle even higher throughput requirements? Testing was conducted using a Dell PowerEdge 6650 server with four CPUs and a Dell PowerVault 220s RAID array with 14 36GB drives and 128 MB of cache. Oracle Enterprise Edition 9.2.0.4 was used for the database and Red Hat Linux Advanced Server 2.1 was used for the operating system. This document details the maximum observed throughputs using the same test suite that was used for the Sun tests, provides a detailed description of the testing performed along with an analysis of the bottlenecks encountered, and discusses issues related to Oracle and Linux together with some recommendations based on the findings.

  9. Acoustic paramagnetic logging tool

    DOEpatents

    Vail, III, William B.

    1988-01-01

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in a formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores.
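
    For context, the Larmor sweep frequency quoted above follows from the standard precession relation; the figure below is a rough check assuming proton precession in a typical Earth field of about 50 microtesla (an assumed value, not one taken from the patent):

        f_L = \frac{\gamma B}{2\pi}, \qquad \frac{\gamma_p}{2\pi} \approx 42.58\ \mathrm{MHz/T}
        \;\Rightarrow\; f_L \approx 42.58\times 10^{6}\ \mathrm{Hz/T} \times 50\times10^{-6}\ \mathrm{T} \approx 2.1\ \mathrm{kHz},

    which is consistent with the approximately 2 kHz sweep mentioned in the abstract.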

  10. 3. Log bunkhouse (far left), log chicken house (left of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Log bunkhouse (far left), log chicken house (left of center), equipment shed (center), and workshop (far right). View to northwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  11. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.

    2014-03-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux often focus on annual timescales but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics with non-parametric quantile regression forests (QRF) based on high-frequency (3 min) discharge measurements and sediment concentration data sampled every 30-60 min in similar-sized (˜0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the random forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ˜80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. In particular, dry-season logging dampened the relative role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRFs outperform traditional sediment rating curves (SRCs) in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
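
    As a rough illustration of the quantile-from-forest idea (a simplified sketch, not the authors' implementation, which weights training observations by leaf membership), the following Python snippet pools per-tree predictions from a scikit-learn random forest to read off conditional quantiles; the arrays X and y are hypothetical placeholders for the predictors (e.g., discharge) and sediment concentrations:

        # Approximate conditional quantiles of sediment concentration by pooling
        # per-tree predictions from a random forest (a crude stand-in for QRF).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.random((500, 3))                        # placeholder predictors
        y = 10 * X[:, 0] ** 2 + rng.normal(0, 1, 500)   # placeholder sediment response

        X_train, X_test, y_train, _ = train_test_split(X, y, random_state=0)

        forest = RandomForestRegressor(n_estimators=300, min_samples_leaf=5, random_state=0)
        forest.fit(X_train, y_train)

        # Stack the per-tree predictions and take empirical quantiles per test sample.
        per_tree = np.stack([tree.predict(X_test) for tree in forest.estimators_], axis=1)
        q10, q50, q90 = np.percentile(per_tree, [10, 50, 90], axis=1)
        print(q10[:3], q50[:3], q90[:3])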

  12. MimoPro: a more efficient Web-based tool for epitope prediction using phage display libraries

    PubMed Central

    2011-01-01

    Background A B-cell epitope is a group of residues on the surface of an antigen which stimulates humoral responses. Locating these epitopes on antigens is important for the purpose of effective vaccine design. In recent years, mapping affinity-selected peptides screened from a random phage display library to the native epitope has become popular in epitope prediction. These peptides, also known as mimotopes, share similar structure and function with the corresponding native epitopes. Great effort has been made in using this similarity between such mimotopes and native epitopes in prediction, which has resulted in better outcomes than statistics-based methods can achieve. However, this approach does not maintain a high degree of accuracy in all circumstances. Results In this study, we propose a new method that maps a group of mimotopes back to a source antigen so as to locate the interacting epitope on the antigen. The core of this method is a searching algorithm that is incorporated with both dynamic programming (DP) and branch and bound (BB) optimization and operated on a series of overlapping patches on the surface of a protein. These patches are then transformed to a number of graphs using an adaptable distance threshold (ADT) regulated by an appropriate compactness factor (CF), a novel parameter proposed in this study. Compared with both Pep-3D-Search and PepSurf, two leading graph-based search tools, on average over the results of 18 test cases, MimoPro, the Web-based implementation of our proposed method, performed better in sensitivity, precision, and Matthews correlation coefficient (MCC) than both did in epitope prediction. In addition, MimoPro is significantly faster than both Pep-3D-Search and PepSurf in processing. Conclusions Our search algorithm designed for processing well constructed graphs using an ADT regulated by CF is more sensitive and significantly faster than other graph-based approaches in epitope prediction. MimoPro is a viable alternative to both

  13. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    DTIC Science & Technology

    2010-06-01

    The report excerpt reproduces Simple Event Correlator (SEC) rule fragments adapted from the SANS whitepaper "Detecting Attacks on Web Applications from Log Files": rules of type=Single with ptype=RegExp look for injected image tags in web application logs, flag detected cross-site scripting (XSS), and pass the raw log line through a shellcmd action to a syslog client for forwarding as a synthetic event.
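
    The rule fragments target injected image tags in web application logs; a minimal Python sketch of the same pattern-matching idea (an illustration only, not the SEC configuration from the report), assuming a hypothetical Apache-style access log file named access.log:

        # Flag requests whose URL contains an injected <img> or <script> fragment,
        # a common cross-site scripting (XSS) signature in web application logs.
        import re

        XSS_PATTERNS = [                                         # illustrative patterns
            re.compile(r"<\s*img[^>]*>", re.IGNORECASE),
            re.compile(r"<\s*script", re.IGNORECASE),
            re.compile(r"%3C\s*(img|script)", re.IGNORECASE),    # URL-encoded variants
        ]

        def scan_log(path="access.log"):
            with open(path, encoding="utf-8", errors="replace") as handle:
                for lineno, line in enumerate(handle, start=1):
                    if any(p.search(line) for p in XSS_PATTERNS):
                        print(f"possible XSS at line {lineno}: {line.strip()}")

        if __name__ == "__main__":
            scan_log()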

  14. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    PubMed Central

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-01-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US–Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms. PMID:24743552

  15. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    NASA Astrophysics Data System (ADS)

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-04-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.

  16. Primary detection of hardwood log defects using laser surface scanning

    NASA Astrophysics Data System (ADS)

    Thomas, Edward; Thomas, Liya; Mili, Lamine; Ehrich, Roger W.; Abbott, A. Lynn; Shaffer, Clifford

    2003-05-01

    The use of laser technology to scan hardwood log surfaces for defects holds great promise for improving processing efficiency and the value and volume of lumber produced. External and internal defect detection to optimize hardwood log and lumber processing is one of the top four technological needs in the nation's hardwood industry. The location, type, and severity of defects on hardwood logs are the key indicators of log quality and value. These visual cues provide information about internal log characteristics and products for which the log is suitable. We scanned 162 logs with a high-resolution industrial four-head laser surface scanner. The resulting data sets contain hundreds of thousands of three-dimensional coordinate points. The size of the data and noise presented special problems during processing. Robust regression models were used to fit geometric shapes to the data. The estimated orthogonal distances between the fitted model and the log surface are converted to a two-dimensional image to facilitate defect detection. Using robust regression methods and standard image processing tools we have demonstrated that severe surface defects on hardwood logs can be detected using height and contour analyses of three-dimensional laser scan data.

  17. MANCaLog: A Logic for Multi-Attribute Network Cascades

    DTIC Science & Technology

    2013-01-01

    The report excerpt consists of abstract fragments: the case study looked at research on "niacin" indexed by Thomson Reuters Web of Knowledge (http://wokinfo.com), a topic chosen due to its broad interest, including its effect on high-density lipoproteins (HDL); using Web of Knowledge, the authors extracted information on 4,202 articles about the topic. The paper presents the MANCaLog language for modeling cascades in multi-agent systems organized in networks.

  18. Well Logging with Californium-252

    SciTech Connect

    Boulogne, A.R.

    2003-01-06

    Californium-252 is an intense neutron emitter that has only recently become available for experimental well logging. The purpose of this research is to investigate the application of well logging to groundwater hydrology; however, most of the techniques and purposes are quite similar to applications in the petroleum industry.

  19. ASCOT: a text mining-based web-service for efficient search and assisted creation of clinical trials.

    PubMed

    Korkontzelos, Ioannis; Mu, Tingting; Ananiadou, Sophia

    2012-04-30

    Clinical trials are mandatory protocols describing medical research on humans and are among the most valuable sources of medical practice evidence. Searching for trials relevant to some query is laborious due to the immense number of existing protocols. Apart from search, writing new trials includes composing detailed eligibility criteria, which might be time-consuming, especially for new researchers. In this paper we present ASCOT, an efficient search application customised for clinical trials. ASCOT uses text mining and data mining methods to enrich clinical trials with metadata, which in turn serve as effective tools to narrow down search. In addition, ASCOT integrates a component for recommending eligibility criteria based on a set of selected protocols.

  20. Web Search Studies: Multidisciplinary Perspectives on Web Search Engines

    NASA Astrophysics Data System (ADS)

    Zimmer, Michael

    Perhaps the most significant tool of our internet age is the web search engine, providing a powerful interface for accessing the vast amount of information available on the world wide web and beyond. While still in its infancy compared to the knowledge tools that precede it - such as the dictionary or encyclopedia - the impact of web search engines on society and culture has already received considerable attention from a variety of academic disciplines and perspectives. This article aims to organize a meta-discipline of “web search studies,” centered around a nucleus of major research on web search engines from five key perspectives: technical foundations and evaluations; transaction log analyses; user studies; political, ethical, and cultural critiques; and legal and policy analyses.

  1. Web Mining

    NASA Astrophysics Data System (ADS)

    Fürnkranz, Johannes

    The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.

  2. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.

    2013-09-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but potentially overlook important geomorphic responses on shorter time scales immediately following timber harvest. Sediment fluxes are commonly estimated from linear regression of intermittent measurements of water and sediment discharge using sediment rating curves (SRCs). However, these often unsatisfactorily reproduce non-linear effects such as discharge-load hystereses. We resolve such important dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3 min) measurements of stream discharge and sediment concentrations in similar-sized (~ 0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest (RF) algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~ 80% of the total sediment load was transported during rare but high magnitude runoff events during only 5% of the monitoring period. The variability of sediment flux of these rare events spans four orders of magnitude. In particular, dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRF outperforms traditional SRCs in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment discharge at high temporal resolution.

  3. High resolution gamma spectroscopy well logging system

    SciTech Connect

    Giles, J.R.; Dooley, K.J.

    1997-05-01

    A Gamma Spectroscopy Logging System (GSLS) has been developed to study sub-surface radionuclide contamination. The absolute counting efficiencies of the GSLS detectors were determined using cylindrical reference sources. More complex borehole geometries were modeled using commercially available shielding software and correction factors were developed based on relative gamma-ray fluence rates. Examination of varying porosity and moisture content showed that as porosity increases, and as the formation saturation ratio decreases, relative gamma-ray fluence rates increase linearly for all energies. Correction factors for iron and water cylindrical shields were found to agree well with correction factors determined during previous studies allowing for the development of correction factors for type-304 stainless steel and low-carbon steel casings. Regression analyses of correction factor data produced equations for determining correction factors applicable to spectral gamma-ray well logs acquired under non-standard borehole conditions.

  4. Close-Call Action Log Form

    NASA Technical Reports Server (NTRS)

    Spuler, Linda M.; Ford, Patricia K.; Skeete, Darren C.; Hershman, Scot; Raviprakash, Pushpa; Arnold, John W.; Tran, Victor; Haenze, Mary Alice

    2005-01-01

    "Close Call Action Log Form" ("CCALF") is the name of both a computer program and a Web-based service provided by the program for creating an enhanced database of close calls (in the colloquial sense of mishaps that were avoided by small margins) assigned to the Center Operations Directorate (COD) at Johnson Space Center. CCALF provides a single facility for on-line collaborative review of close calls. Through CCALF, managers can delegate responses to employees. CCALF utilizes a pre-existing e-mail system to notify managers that there are close calls to review, but eliminates the need for the prior practices of passing multiple e-mail messages around the COD, then collecting and consolidating them into final responses: CCALF now collects comments from all responders for incorporation into reports that it generates. Also, whereas it was previously necessary to manually calculate metrics (e.g., numbers of maintenance-work orders necessitated by close calls) for inclusion in the reports, CCALF now computes the metrics, summarizes them, and displays them in graphical form. The reports and all pertinent information used to generate the reports are logged, tracked, and retained by CCALF for historical purposes.

  5. Effects of selective logging on tropical forest tree growth

    NASA Astrophysics Data System (ADS)

    Figueira, Adelaine Michela E. S.; Miller, Scott D.; de Sousa, Cleilim Albert D.; Menton, Mary C.; Maia, Augusto R.; Da Rocha, Humberto R.; Goulden, Michael L.

    2008-03-01

    We combined measurements of tree growth and carbon dioxide exchange to investigate the effects of selective logging on the Aboveground Live Biomass (AGLB) of a tropical rain forest in the Amazon. Most of the measurements began at least 10 months before logging and continued at least 36 months after logging. The logging removed ˜15% of the trees with Diameter at Breast Height (DBH) greater than 35 cm, which resulted in an instantaneous 10% reduction in AGLB. Both wood production and mortality increased following logging, while Gross Primary Production (GPP) was unchanged. The ratio of wood production to GPP (the wood Carbon Use Efficiency or wood CUE) more than doubled following logging. Small trees (10 cm < DBH < 35 cm) accounted for most of the enhanced wood production. Medium trees (35 cm < DBH < 55 cm) that were within 30 m of canopy gaps created by the logging also showed increased growth. The patterns of enhanced growth are most consistent with logging-induced increases in light availability. The AGLB continued to decline over the study, as mortality outpaced wood production. Wood CUE and mortality remained elevated throughout the 3 years of postlogging measurements. The future trajectory of AGLB and the forest's carbon balance are uncertain, and will depend on how long it takes for heterotrophic respiration, mortality, and CUE to return to prelogging levels.

  6. Demonstration of the Web-based Interspecies Correlation Estimation (Web-ICE) modeling application

    EPA Science Inventory

    The Web-based Interspecies Correlation Estimation (Web-ICE) modeling application is available to the risk assessment community through a user-friendly internet platform (http://epa.gov/ceampubl/fchain/webice/). ICE models are log-linear least square regressions that predict acute...
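
    As a generic illustration of the log-linear regression form mentioned for the ICE models (a sketch of the general idea, not the Web-ICE application or its fitted coefficients), using hypothetical acute toxicity values for a surrogate and a predicted species:

        # Fit log10(toxicity of predicted taxon) as a linear function of
        # log10(toxicity of surrogate taxon), the general ICE-style model form.
        import numpy as np

        surrogate = np.array([0.5, 1.2, 3.4, 10.0, 25.0, 60.0])   # hypothetical LC50s, mg/L
        predicted = np.array([0.7, 1.0, 4.1, 12.0, 30.0, 55.0])   # hypothetical LC50s, mg/L

        slope, intercept = np.polyfit(np.log10(surrogate), np.log10(predicted), deg=1)

        def predict_lc50(surrogate_lc50):
            """Back-transform the log-linear prediction to concentration units."""
            return 10 ** (intercept + slope * np.log10(surrogate_lc50))

        print(round(predict_lc50(5.0), 2))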

  7. New materials for fireplace logs

    NASA Technical Reports Server (NTRS)

    Kieselback, D. J.; Smock, A. W.

    1971-01-01

    Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.

  8. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  9. Maintaining Quality of Library Web Sites Using Cluster and Path Analysis.

    ERIC Educational Resources Information Center

    Matylonek, John

    2002-01-01

    Provides methods for the systematic redesign of a library Web site by comparing baseline data with changes in use brought about by design changes. Shows how complementary information, based on Web users' log statistics and direct observations of users, can enhance a library Web site. Relates this information to practical Web site decisions that…

  10. Sexual information seeking on web search engines.

    PubMed

    Spink, Amanda; Koricich, Andrew; Jansen, B J; Cole, Charles

    2004-02-01

    Sexual information seeking is an important element within human information behavior. Seeking sexually related information on the Internet takes many forms and channels, including chat room discussions, accessing Websites or searching Web search engines for sexual materials. The study of sexual Web queries provides insight into sexually-related information-seeking behavior, of value to Web users and providers alike. We qualitatively analyzed queries from logs of 1,025,910 Alta Vista and AlltheWeb.com Web user queries from 2001. We compared the differences in sexually-related Web searching between Alta Vista and AlltheWeb.com users. Differences were found in session duration, query outcomes, and search term choices. Implications of the findings for sexual information seeking are discussed.

  11. Early identification of adverse drug reactions from search log data.

    PubMed

    White, Ryen W; Wang, Sheng; Pant, Apurv; Harpaz, Rave; Shukla, Pushpraj; Sun, Walter; DuMouchel, William; Horvitz, Eric

    2016-02-01

    The timely and accurate identification of adverse drug reactions (ADRs) following drug approval is a persistent and serious public health challenge. Aggregated data drawn from anonymized logs of Web searchers has been shown to be a useful source of evidence for detecting ADRs. However, prior studies have been based on the analysis of established ADRs, the existence of which may already be known publically. Awareness of these ADRs can inject existing knowledge about the known ADRs into online content and online behavior, and thus raise questions about the ability of the behavioral log-based methods to detect new ADRs. In contrast to previous studies, we investigate the use of search logs for the early detection of known ADRs. We use a large set of recently labeled ADRs and negative controls to evaluate the ability of search logs to accurately detect ADRs in advance of their publication. We leverage the Internet Archive to estimate when evidence of an ADR first appeared in the public domain and adjust the index date in a backdated analysis. Our results demonstrate how search logs can be used to detect new ADRs, the central challenge in pharmacovigilance.

  12. Impacts of Intensive Logging on the Trophic Organisation of Ant Communities in a Biodiversity Hotspot

    PubMed Central

    Woodcock, Paul; Edwards, David P.; Newton, Rob J.; Vun Khen, Chey; Bottrell, Simon H.; Hamer, Keith C.

    2013-01-01

    Trophic organisation defines the flow of energy through ecosystems and is a key component of community structure. Widespread and intensifying anthropogenic disturbance threatens to disrupt trophic organisation by altering species composition and relative abundances and by driving shifts in the trophic ecology of species that persist in disturbed ecosystems. We examined how intensive disturbance caused by selective logging affects trophic organisation in the biodiversity hotspot of Sabah, Borneo. Using stable nitrogen isotopes, we quantified the positions in the food web of 159 leaf-litter ant species in unlogged and logged rainforest and tested four predictions: (i) there is a negative relationship between the trophic position of a species in unlogged forest and its change in abundance following logging, (ii) the trophic positions of species are altered by logging, (iii) disturbance alters the frequency distribution of trophic positions within the ant assemblage, and (iv) disturbance reduces food chain length. We found that ant abundance was 30% lower in logged forest than in unlogged forest but changes in abundance of individual species were not related to trophic position, providing no support for prediction (i). However, trophic positions of individual species were significantly higher in logged forest, supporting prediction (ii). Consequently, the frequency distribution of trophic positions differed significantly between unlogged and logged forest, supporting prediction (iii), and food chains were 0.2 trophic levels longer in logged forest, the opposite of prediction (iv). Our results demonstrate that disturbance can alter trophic organisation even without trophically-biased changes in community composition. Nonetheless, the absence of any reduction in food chain length in logged forest suggests that species-rich arthropod food webs do not experience trophic downgrading or a related collapse in trophic organisation despite the disturbance caused by logging

  13. Impacts of intensive logging on the trophic organisation of ant communities in a biodiversity hotspot.

    PubMed

    Woodcock, Paul; Edwards, David P; Newton, Rob J; Vun Khen, Chey; Bottrell, Simon H; Hamer, Keith C

    2013-01-01

    Trophic organisation defines the flow of energy through ecosystems and is a key component of community structure. Widespread and intensifying anthropogenic disturbance threatens to disrupt trophic organisation by altering species composition and relative abundances and by driving shifts in the trophic ecology of species that persist in disturbed ecosystems. We examined how intensive disturbance caused by selective logging affects trophic organisation in the biodiversity hotspot of Sabah, Borneo. Using stable nitrogen isotopes, we quantified the positions in the food web of 159 leaf-litter ant species in unlogged and logged rainforest and tested four predictions: (i) there is a negative relationship between the trophic position of a species in unlogged forest and its change in abundance following logging, (ii) the trophic positions of species are altered by logging, (iii) disturbance alters the frequency distribution of trophic positions within the ant assemblage, and (iv) disturbance reduces food chain length. We found that ant abundance was 30% lower in logged forest than in unlogged forest but changes in abundance of individual species were not related to trophic position, providing no support for prediction (i). However, trophic positions of individual species were significantly higher in logged forest, supporting prediction (ii). Consequently, the frequency distribution of trophic positions differed significantly between unlogged and logged forest, supporting prediction (iii), and food chains were 0.2 trophic levels longer in logged forest, the opposite of prediction (iv). Our results demonstrate that disturbance can alter trophic organisation even without trophically-biased changes in community composition. Nonetheless, the absence of any reduction in food chain length in logged forest suggests that species-rich arthropod food webs do not experience trophic downgrading or a related collapse in trophic organisation despite the disturbance caused by logging

  14. Adapting the right web pages to the right users

    NASA Astrophysics Data System (ADS)

    Hui, Xiong; Sung, Sam Y.; Huang, Stephen

    2000-04-01

    With the explosive use of the Internet, there is an ever-increasing volume of Web usage data being generated and warehoused in numerous successful Web sites. Analyzing Web usage data can help Web developers to improve the organization and presentation of their Web sites. Considering the fact that mining for patterns and rules in market basket data is well studied in the data mining field, we provide a mapping approach, which can transform Web usage data into a form like market basket data. Using our model, all the methods developed by data mining research groups can be directly applied to Web usage data without much change. Existing methods for knowledge discovery in Web logs are restricted by the difficulty of getting complete and reliable Web usage data and effectively identifying user sessions using the current Web server log mechanism. The problem is due to Web caching and the existence of proxy servers. As an effort to remedy this problem, we built our own Web server log mechanism that can effectively capture user access behavior and will not be deliberately bypassed by proxy servers and end users.
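
    A minimal sketch of the mapping described above (assuming hypothetical, already-sessionised log records, not the authors' server-side logging mechanism): each user session becomes a "basket" of requested pages, to which standard frequent-itemset or association-rule miners can then be applied.

        # Turn (session_id, url) click records into market-basket style transactions
        # and count the support of frequent page pairs.
        from collections import defaultdict
        from itertools import combinations

        clicks = [                       # hypothetical sessionised web usage data
            ("s1", "/home"), ("s1", "/products"), ("s1", "/cart"),
            ("s2", "/home"), ("s2", "/products"),
            ("s3", "/home"), ("s3", "/cart"),
        ]

        baskets = defaultdict(set)
        for session, url in clicks:
            baskets[session].add(url)

        pair_counts = defaultdict(int)
        for pages in baskets.values():
            for pair in combinations(sorted(pages), 2):
                pair_counts[pair] += 1

        support = {pair: n / len(baskets) for pair, n in pair_counts.items()}
        print(sorted(support.items(), key=lambda kv: -kv[1]))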

  15. Mail LOG: Program operating instructions

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The operating instructions for the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has four modes of operation: (1) INPUT, putting new records into the data base; (2) REVISE, changing or modifying existing records in the data base; (3) SEARCH, finding special records existing in the data base; and (4) ARCHIVE, storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.

  16. Method for induced polarization logging

    SciTech Connect

    Vinegar, H.J.; Waxman, M.H.

    1987-04-14

    A method is described for generating a log of the formation phase shift, resistivity and spontaneous potential of an earth formation from data obtained from the earth formation with a multi-electrode induced polarization logging tool. The method comprises obtaining data samples from the formation, at measurement points equally spaced in time, of the magnitude and phase of the induced voltage and of the magnitude and phase of the current supplied by a circuit through a reference resistance R0 to a survey current electrode associated with the tool.

  17. Solar-A reformatted data files and observing log

    NASA Technical Reports Server (NTRS)

    Morrison, M. D.; Lemen, J. R.; Acton, L. W.; Bentley, R. D.; Kosugi, T.; Tsuneta, S.; Ogawara, Y.; Watanabe, T.

    1991-01-01

    An overview is presented of the Solar-A telemetry data files which are to be created and the format and organization which the files are to use. The organization chosen is to be efficient in space, to facilitate access to the data, and to allow the data to be transportable to different machines. An observing log file is to be created automatically, using the reformatted data files as the input. It will be possible to perform searches with the observing log to list cases where instruments are in certain modes and/or seeing certain signal levels. A user will be able to search the observing log and obtain a list of all cases where a given set of conditions are satisfied. An event log will be created listing the times when the instrument or spacecraft modes change.
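
    A minimal sketch of the kind of observing-log search described (with hypothetical field names, not the Solar-A reformatted-file schema): list the times at which a given instrument is in a given mode and sees at least a given signal level.

        # Search an observing log for entries matching an instrument mode and a
        # minimum signal level; the entries below are illustrative placeholders.
        observing_log = [
            {"time": "1991-10-01T00:00", "instrument": "SXT", "mode": "FULL", "signal": 120},
            {"time": "1991-10-01T00:10", "instrument": "SXT", "mode": "PARTIAL", "signal": 300},
            {"time": "1991-10-01T00:20", "instrument": "HXT", "mode": "FULL", "signal": 80},
        ]

        def search(log, instrument, mode, min_signal):
            return [entry["time"] for entry in log
                    if entry["instrument"] == instrument
                    and entry["mode"] == mode
                    and entry["signal"] >= min_signal]

        print(search(observing_log, "SXT", "FULL", 100))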

  18. A Novel Framework for Medical Web Information Foraging Using Hybrid ACO and Tabu Search.

    PubMed

    Drias, Yassine; Kechid, Samir; Pasi, Gabriella

    2016-01-01

    We present in this paper a novel approach based on multi-agent technology for Web information foraging. For this purpose we propose an architecture in which we distinguish two important phases. The first is a learning process for localizing the most relevant pages that might interest the user. This is performed on a fixed instance of the Web. The second takes into account the openness and dynamicity of the Web. It consists of incremental learning that starts from the result of the first phase and reshapes the outcomes to take into account the changes that the Web undergoes. The system was implemented using a colony of artificial ants hybridized with tabu search in order to achieve more effectiveness and efficiency. To validate our proposal, experiments were conducted on MedlinePlus, a real website dedicated to research in the domain of health, in contrast to other previous works where experiments were performed on web log datasets. The main results are promising, both for those related to strong Web regularities and for the response time, which is very short and hence complies with the real-time constraint.

  19. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  20. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  1. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  2. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 7 2013-07-01 2013-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  3. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 7 2011-07-01 2011-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  4. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 7 2012-07-01 2012-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  5. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 7 2014-07-01 2014-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  6. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  7. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  8. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  9. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J. )

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
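
    A generic sketch of the inverse approach described (not the authors' tool-response equations): given an assumed forward model mapping formation parameters to synthetic log readings, estimate the parameters by nonlinear least squares. The forward model and measured values below are illustrative only.

        # Estimate formation parameters (porosity and water saturation) by minimizing
        # the misfit between measured log readings and a simple assumed forward model.
        import numpy as np
        from scipy.optimize import least_squares

        def forward(params):
            porosity, sw = params
            density = 2.65 * (1 - porosity) + 1.0 * porosity        # matrix + fluid mix
            resistivity = 0.2 / (porosity ** 2 * sw ** 2 + 1e-9)    # Archie-like form
            neutron = porosity                                      # idealised response
            return np.array([density, resistivity, neutron])

        measured = np.array([2.24, 8.9, 0.25])                      # hypothetical log values

        def residuals(params):
            return forward(params) - measured

        fit = least_squares(residuals, x0=[0.2, 0.5],
                            bounds=([0.01, 0.01], [0.4, 1.0]))
        print(fit.x)   # estimated porosity and water saturation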

  10. Downhole memory-logging tools

    SciTech Connect

    Lysne, P.

    1992-01-01

    Logging technologies developed for hydrocarbon resource evaluation have not migrated into geothermal applications even though data so obtained would strengthen reservoir characterization efforts. Two causative issues have impeded progress: (i) there is a general lack of vetted, high-temperature instrumentation, and (ii) the interpretation of log data generated in a geothermal formation is in its infancy. Memory-logging tools provide a path around the first obstacle by providing quality data at a low cost. These tools feature on-board computers that process and store data, and newer systems may be programmed to make "decisions." Since memory tools are completely self-contained, they are readily deployed using the slick line found on most drilling locations. They have proven to be rugged, and a minimum training program is required for operator personnel. Present tools measure properties such as temperature and pressure, and the development of noise, deviation, and fluid conductivity logs based on existing hardware is relatively easy. A more complex geochemical tool aimed at a quantitative analysis of potassium, uranium and thorium will be available in about one year, and it is expandable into all nuclear measurements common in the hydrocarbon industry. A second tool designed to sample fluids at conditions exceeding 400 degrees C is in the proposal stage. Partnerships are being formed between the geothermal industry, scientific drilling programs, and the national laboratories to define and develop inversion algorithms relating raw tool data to more pertinent information. 8 refs.

  11. CRYPTOSPORIDIUM LOG INACTIVATION CALCULATION METHODS

    EPA Science Inventory

    Appendix O of the Surface Water Treatment Rule (SWTR) Guidance Manual introduces the CeffT10 (i.e., reaction zone outlet C value and T10 time) method for calculating ozone CT value and Giardia and virus log inactivation. The LT2ESWTR Pre-proposal Draft Regulatory Language for St...
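
    A minimal sketch of the CT-based calculation referred to above, under the commonly used SWTR convention that estimated Giardia log inactivation scales as three times the ratio of the achieved CT to the tabulated CT required for 3-log inactivation; all numerical values below are placeholders, not figures from the rule's tables.

        # Compute CT from the reaction-zone outlet ozone residual (Ceff) and the T10
        # contact time, then estimate Giardia log inactivation from the CT ratio.
        C_EFF_MG_L = 0.8          # hypothetical outlet ozone residual, mg/L
        T10_MINUTES = 4.0         # hypothetical T10 contact time, minutes
        CT_REQUIRED_3LOG = 1.9    # placeholder table value, mg*min/L

        ct = C_EFF_MG_L * T10_MINUTES
        log_inactivation = 3.0 * ct / CT_REQUIRED_3LOG
        print(f"CT = {ct:.2f} mg*min/L, estimated log inactivation = {log_inactivation:.2f}")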

  12. A New Approach to Logging.

    ERIC Educational Resources Information Center

    Miles, Donna

    2001-01-01

    In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…

  13. IsoWeb: a bayesian isotope mixing model for diet analysis of the whole food web.

    PubMed

    Kadoya, Taku; Osada, Yutaka; Takimoto, Gaku

    2012-01-01

    Quantitative description of food webs provides fundamental information for the understanding of population, community, and ecosystem dynamics. Recently, stable isotope mixing models have been widely used to quantify dietary proportions of different food resources to a focal consumer. Here we propose a novel mixing model (IsoWeb) that estimates diet proportions of all consumers in a food web based on stable isotope information. IsoWeb requires a topological description of a food web, and stable isotope signatures of all consumers and resources in the web. A merit of IsoWeb is that it takes into account variation in trophic enrichment factors among different consumer-resource links. Sensitivity analysis using realistic hypothetical food webs suggests that IsoWeb is applicable to a wide variety of food webs differing in the number of species, connectance, sample size, and data variability. Sensitivity analysis based on real topological webs showed that IsoWeb can allow for a certain level of topological uncertainty in target food webs, including erroneously assuming false links, omission of existent links and species, and trophic aggregation into trophospecies. Moreover, using an illustrative application to a real food web, we demonstrated that IsoWeb can compare the plausibility of different candidate topologies for a focal web. These results suggest that IsoWeb provides a powerful tool to analyze food-web structure from stable isotope data. We provide R and BUGS codes to aid efficient applications of IsoWeb.
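
    For orientation, the mass balance underlying isotope mixing models of this kind can be written as below (a generic statement of the standard mixing equation, not the IsoWeb likelihood itself), where the p_i are the diet proportions and the Delta_i are the link-specific trophic enrichment factors that IsoWeb allows to vary:

        \delta^{15}\mathrm{N}_{\mathrm{consumer}} \;=\; \sum_{i} p_i \left( \delta^{15}\mathrm{N}_{\mathrm{resource},\,i} + \Delta_i \right),
        \qquad \sum_{i} p_i = 1, \quad p_i \ge 0.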

  14. Web Engineering

    SciTech Connect

    White, Bebo

    2003-06-23

    Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.

  15. Multicriteria evaluation of simulated logging scenarios in a tropical rain forest.

    PubMed

    Huth, Andreas; Drechsler, Martin; Köhler, Peter

    2004-07-01

    Forest growth models are useful tools for investigating the long-term impacts of logging. In this paper, the results of the rain forest growth model FORMIND were assessed by a multicriteria decision analysis. The main processes covered by FORMIND include tree growth, mortality, regeneration and competition. Tree growth is calculated based on a carbon balance approach. Trees compete for light and space; dying large trees fall down and create gaps in the forest. Sixty-four different logging scenarios for an initially undisturbed forest stand at Deramakot (Malaysia) were simulated. The scenarios differ regarding the logging cycle, logging method, cutting limit and logging intensity. We characterise the impacts with four criteria describing the yield, canopy opening and changes in species composition. Multicriteria decision analysis was used for the first time to evaluate the scenarios and identify the efficient ones. Our results plainly show that reduced-impact logging scenarios are more 'efficient' than the others, since in these scenarios forest damage is minimised without significantly reducing yield. Nevertheless, there is a trade-off between yield and achieving a desired ecological state of logged forest; the ecological state of the logged forests can only be improved by reducing yields and enlarging the logging cycles. Our study also demonstrates that high cutting limits or low logging intensities cannot compensate for the high level of damage caused by conventional logging techniques.
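
    A minimal sketch of the efficiency screening used in such multicriteria evaluations (generic Pareto dominance over two criteria, not the study's specific criteria or weights), with hypothetical scenario scores in which yield is to be maximised and forest damage minimised:

        # Identify Pareto-efficient logging scenarios given two criteria:
        # yield (to maximise) and canopy damage (to minimise). Values are hypothetical.
        scenarios = {
            "conventional, 40 yr cycle":   {"yield": 95, "damage": 60},
            "reduced impact, 40 yr cycle": {"yield": 90, "damage": 30},
            "reduced impact, 60 yr cycle": {"yield": 70, "damage": 20},
            "conventional, 60 yr cycle":   {"yield": 72, "damage": 45},
        }

        def dominates(a, b):
            """True if a is at least as good as b on both criteria and strictly better on one."""
            at_least_as_good = a["yield"] >= b["yield"] and a["damage"] <= b["damage"]
            strictly_better = a["yield"] > b["yield"] or a["damage"] < b["damage"]
            return at_least_as_good and strictly_better

        efficient = [name for name, score in scenarios.items()
                     if not any(dominates(other, score)
                                for other in scenarios.values() if other is not score)]
        print(efficient)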

  16. The Chemnitz LogAnalyzer: a tool for analyzing data from hypertext navigation research.

    PubMed

    Brunstein, Angela; Naumann, Anja; Krems, Josef F

    2005-05-01

    Computer-based studies usually produce log files as raw data. These data cannot be analyzed adequately with conventional statistical software. The Chemnitz LogAnalyzer provides tools for quick and comfortable visualization and analyses of hypertext navigation behavior by individual users and for aggregated data. In addition, it supports analogous analyses of questionnaire data and reanalysis with respect to several predefined orders of nodes of the same hypertext. As an illustration of how to use the Chemnitz LogAnalyzer, we give an account of one study on learning with hypertext. Participants either searched for specific details or read a hypertext document to familiarize themselves with its content. The tool helped identify navigation strategies affected by these two processing goals and provided comparisons, for example, of processing times and visited sites. Altogether, the Chemnitz LogAnalyzer fills the gap between log files as raw data of Web-based studies and conventional statistical software.

  17. Latent log-linear models for handwritten digit classification.

    PubMed

    Deselaers, Thomas; Gass, Tobias; Heigold, Georg; Ney, Hermann

    2012-06-01

    We present latent log-linear models, an extension of log-linear models incorporating latent variables, and we propose two applications thereof: log-linear mixture models and image deformation-aware log-linear models. The resulting models are fully discriminative, can be trained efficiently, and the model complexity can be controlled. Log-linear mixture models offer additional flexibility within the log-linear modeling framework. Unlike previous approaches, the image deformation-aware model directly considers image deformations and allows for a discriminative training of the deformation parameters. Both are trained using alternating optimization. For certain variants, convergence to a stationary point is guaranteed and, in practice, even variants without this guarantee converge and find models that perform well. We tune the methods on the USPS data set and evaluate on the MNIST data set, demonstrating the generalization capabilities of our proposed models. Our models, although using significantly fewer parameters, are able to obtain competitive results with models proposed in the literature.
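
    A generic way to write a log-linear mixture of the kind described (a sketch of the general form with a discrete latent variable m marginalised out, not the paper's exact parameterisation):

        p(c \mid x) \;=\; \sum_{m} p(c, m \mid x)
        \;=\; \frac{\sum_{m} \exp\!\big(\lambda_{c,m}^{\top} f(x)\big)}
                   {\sum_{c'} \sum_{m'} \exp\!\big(\lambda_{c',m'}^{\top} f(x)\big)},

    where f(x) are the features and the lambda_{c,m} are the class- and component-specific log-linear parameters trained discriminatively.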

  18. A new interface linking the ODP Log and RIDGE Multibeam Databases

    NASA Astrophysics Data System (ADS)

    Reagan, M.; Haxby, W.; Broglia, C.

    2001-12-01

    Over the past few years, a major effort has been undertaken by ODP Logging Services to create an easily accessible, on-line database of the log data collected during ODP cruises. The database currently consists of data from Legs 101-197 which can be retrieved using any web browser via the ODP Logging Services web site (http://www.ldeo.columbia.edu/BRG/ODP/DATA). Concurrently the RIDGE Multibeam Synthesis project at Lamont-Doherty Earth Observatory (LDEO) has been developing its own online database of multibeam data that can be accessed at http://coast.ldeo.columbia.edu. Recently the capabilities of these two databases have been combined using MapAPP, an interface developed by the RIDGE Multibeam Synthesis project and modified by ODP Logging Services for use with the log database. Both databases can be accessed with a simple menu selection. The interface allows for graphical searching and selection of sites in the regional context of the multibeam data using a java applet. It retains the easy download capabilities built into the log database, but also provides several new features including the ability to plot log curves 'on the fly'. This capability can be used to display logs from a single hole, or to compare logs from several holes, thus providing a regional view of the data. The integration of this new graphical interface with the extensive content of the ODP Log Database provides users with a powerful tool for viewing and manipulating data. Future enhancements are anticipated to provide even greater capabilities and ease of use.

  19. Chemical logging of geothermal wells

    DOEpatents

    Allen, Charles A.; McAtee, Richard E.

    1981-01-01

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.

  20. Chemical logging of geothermal wells

    DOEpatents

    Allen, C.A.; McAtee, R.E.

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.
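
    A minimal sketch of the chemical-log criterion described in these two records (with hypothetical return-fluid concentrations, not data from the patented apparatus): track the ratio of calcium to carbonate-plus-bicarbonate ion concentrations by depth and flag a continuous increase.

        # Compute the Ca / (CO3 + HCO3) ratio per depth sample and flag a sustained
        # increase, the signature of a warm or hot aquifer described above.
        samples = [  # hypothetical return-fluid chemistry (mg/L) by depth (m)
            {"depth": 100, "ca": 40, "co3": 12, "hco3": 180},
            {"depth": 150, "ca": 44, "co3": 11, "hco3": 175},
            {"depth": 200, "ca": 51, "co3": 10, "hco3": 168},
            {"depth": 250, "ca": 60, "co3": 9,  "hco3": 160},
        ]

        ratios = [s["ca"] / (s["co3"] + s["hco3"]) for s in samples]
        increasing = all(later > earlier for earlier, later in zip(ratios, ratios[1:]))
        print("ratios:", [round(r, 3) for r in ratios])
        print("possible geothermal aquifer below" if increasing else "no sustained increase")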

  1. Audit Log for Forensic Photography

    NASA Astrophysics Data System (ADS)

    Neville, Timothy; Sorell, Matthew

    We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.
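
    One common way to realise the chain-of-evidence property described above is a hash-chained, append-only log; the following is a minimal generic sketch, not the architecture proposed in the paper:

        # An append-only audit log in which every entry commits to the previous one
        # via a SHA-256 hash, so any later modification breaks the chain.
        import hashlib, json, time

        class AuditLog:
            def __init__(self):
                self.entries = []

            def append(self, event):
                prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
                body = {"time": time.time(), "event": event, "prev": prev_hash}
                digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
                self.entries.append({**body, "hash": digest})

            def verify(self):
                prev = "0" * 64
                for entry in self.entries:
                    body = {k: entry[k] for k in ("time", "event", "prev")}
                    recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
                    if entry["prev"] != prev or recomputed != entry["hash"]:
                        return False
                    prev = entry["hash"]
                return True

        log = AuditLog()
        log.append({"action": "capture", "image": "IMG_0001.CR2", "photographer": "unit-7"})
        log.append({"action": "transfer", "to": "evidence-server"})
        print(log.verify())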

  2. An Analysis Platform for Mobile Ad Hoc Network (MANET) Scenario Execution Log Data

    DTIC Science & Technology

    2016-01-01

    The report excerpt notes that the SEDAP web interface has been tested with, and is compatible with, the following browsers: Internet Explorer, Google Chrome, and Iceweasel. SEDAP is a plugin-based platform that provides a flexible and efficient mechanism for the model generation process; the platform consists of a web portal together with the backend module and web service shown in the report's diagrams (Figs. 8 and 9).

  3. Log analysis to understand medical professionals' image searching behaviour.

    PubMed

    Tsikrika, Theodora; Müller, Henning; Kahn, Charles E

    2012-01-01

    This paper reports on the analysis of the query logs of a visual medical information retrieval system that provides access to radiology resources. Our analysis shows that, despite sharing similarities with general Web search and also with biomedical text search, query formulation and query modification when searching for visual biomedical information have unique characteristics that need to be taken into account in order to enhance the effectiveness of the search support offered by such systems. Typical information needs of medical professionals searching radiology resources are also identified with the goal to create realistic search tasks for a medical image retrieval evaluation benchmark.

  4. Using Web Metric Software to Drive: Mobile Website Development

    ERIC Educational Resources Information Center

    Tidal, Junior

    2011-01-01

    Many libraries have developed mobile versions of their websites. In order to understand their users, web developers have conducted both usability tests and focus groups, yet analytical software and web server logs can also be used to better understand users. Using data collected from these tools, the Ursula C. Schwerin Library has made informed…

  5. Using Advanced Search Operators on Web Search Engines.

    ERIC Educational Resources Information Center

    Jansen, Bernard J.

    Studies show that the majority of Web searchers enter extremely simple queries, so a reasonable system design approach would be to build search engines to compensate for this user characteristic. One hundred representative queries were selected from the transaction log of a major Web search service. These 100 queries were then modified using the…

  6. Global Connections: Web Conferencing Tools Help Educators Collaborate Anytime, Anywhere

    ERIC Educational Resources Information Center

    Forrester, Dave

    2009-01-01

    Web conferencing tools help educators from around the world collaborate in real time. Teachers, school counselors, and administrators need only to put on their headsets, check the time zone, and log on to meet and learn from educators across the globe. In this article, the author discusses how educators can use Web conferencing at their schools.…

  7. Intelligent web image retrieval system

    NASA Astrophysics Data System (ADS)

    Hong, Sungyong; Lee, Chungwoo; Nah, Yunmook

    2001-07-01

    Recently, Web sites such as e-business sites and shopping mall sites deal with lots of image information. To find a specific image from these image sources, we usually use web search engines or image database engines, which rely on keyword-only retrievals or color-based retrievals with limited search capabilities. This paper presents an intelligent web image retrieval system. We propose the system architecture, the texture and color based image classification and indexing techniques, and representation schemes of user usage patterns. The query can be given by providing keywords, by selecting one or more sample texture patterns, by assigning color values within positional color blocks, or by combining some or all of these factors. The system keeps track of users' preferences by generating user query logs and automatically adds more search information to subsequent user queries. To show the usefulness of the proposed system, some experimental results showing recall and precision are also explained.

  8. Avian responses to selective logging shaped by species traits and logging practices.

    PubMed

    Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S; Koh, Lian Pin

    2015-06-07

    Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decrease below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging.

  9. Avian responses to selective logging shaped by species traits and logging practices

    PubMed Central

    Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S.; Koh, Lian Pin

    2015-01-01

    Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decreased below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging. PMID:25994673

  10. Use of a secure Internet Web site for collaborative medical research.

    PubMed

    Marshall, W W; Haley, R W

    2000-10-11

    Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
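
    A minimal standard-library sketch of two items on that checklist, salted password authentication and an audit trail of accesses; the table layout, function names, and in-memory database are invented for illustration and are not the system described in the article:

```python
# Sketch of two measures from the article's checklist: hashed password checks
# and an audit trail of accesses. Standard library only; the schema and
# function names are invented for illustration.
import hashlib, hmac, os, sqlite3, time

DB = sqlite3.connect(":memory:")
DB.execute("CREATE TABLE users (name TEXT PRIMARY KEY, salt BLOB, pw_hash BLOB)")
DB.execute("CREATE TABLE audit_trail (ts REAL, user TEXT, action TEXT)")

def add_user(name, password):
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    DB.execute("INSERT INTO users VALUES (?, ?, ?)", (name, salt, pw_hash))

def authenticate(name, password):
    row = DB.execute("SELECT salt, pw_hash FROM users WHERE name = ?", (name,)).fetchone()
    if row is None:
        return False
    salt, stored = row
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    ok = hmac.compare_digest(candidate, stored)
    # Every authentication attempt is written to the audit trail.
    DB.execute("INSERT INTO audit_trail VALUES (?, ?, ?)",
               (time.time(), name, "login_ok" if ok else "login_failed"))
    return ok

add_user("collaborator1", "correct horse battery staple")
print(authenticate("collaborator1", "wrong password"))                  # False, logged
print(authenticate("collaborator1", "correct horse battery staple"))    # True, logged
```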

  11. Geological well log analysis. Third ed

    SciTech Connect

    Pirson, S.J.

    1983-01-01

    Until recently, well logs have mainly been used for correlation, structural mapping, and quantitative evaluation of hydrocarbon-bearing formations. This third edition of Geological Well Log Analysis, however, describes how well logs can be used for geological studies and mineral exploration. This is done by analyzing well logs for numerous parameters and indices of significant mineral accumulation, primarily in sediments. Contents are: SP and Eh curves as redoxomorphic logs; sedimentological studies by log curve shapes; exploration for stratigraphic traps; continuous dipmeter as a structural tool; continuous dipmeter as a sedimentation tool; paleo-facies logging and mapping; hydrogeology 1--hydrodynamics of compaction; hydrogeology 2--geostatic equilibrium; and hydrogeology 3--hydrodynamics of infiltration. Appendixes cover: a computer program for calculating the dip magnitude, azimuth, and the degree and orientation of the resistivity anisotropy; a lithology computer program for calculating the curvature of a structure; and a basic log analysis package for the HP-41CV programmable calculator.

  12. Sensor web

    NASA Technical Reports Server (NTRS)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responding, and then only listen during time slots corresponding to those pods which respond.
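
    The patent abstract gives no algorithmic detail, so the following is only a toy illustration of a coarse-then-fine offset estimate (a single round-trip exchange versus the median over many), with invented clock interfaces standing in for the radio exchange between a pod and the master:

```python
# Toy two-phase clock-offset estimate, loosely mirroring the coarse-then-fine
# synchronization described in the abstract. The pod/master interfaces are
# hypothetical; a real sensor pod would exchange radio messages instead.
import random, statistics

def exchange(master_clock, pod_clock, jitter=0.01):
    """One request/response; returns (t_send, t_master, t_recv) seen by the pod."""
    t_send = pod_clock()
    t_master = master_clock() + random.uniform(-jitter, jitter)  # delay noise
    t_recv = pod_clock() + random.uniform(0, 2 * jitter)
    return t_send, t_master, t_recv

def estimate_offset(master_clock, pod_clock, samples):
    offsets = []
    for _ in range(samples):
        t_send, t_master, t_recv = exchange(master_clock, pod_clock)
        offsets.append(t_master - (t_send + t_recv) / 2.0)  # NTP-style midpoint
    return statistics.median(offsets)

true_offset = 3.7          # seconds the pod clock lags the master
pod_time = 1000.0
master = lambda: pod_time + true_offset
pod = lambda: pod_time

coarse = estimate_offset(master, pod, samples=1)   # one exchange: cheap, low power
fine = estimate_offset(master, pod, samples=25)    # many exchanges: tighter estimate
print(round(coarse, 3), round(fine, 3))
```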

  13. Log-Linear Models for Gene Association

    PubMed Central

    Hu, Jianhua; Joshi, Adarsh; Johnson, Valen E.

    2009-01-01

    We describe a class of log-linear models for the detection of interactions in high-dimensional genomic data. This class of models leads to a Bayesian model selection algorithm that can be applied to data that have been reduced to contingency tables using ranks of observations within subjects, and discretization of these ranks within gene/network components. Many normalization issues associated with the analysis of genomic data are thereby avoided. A prior density based on Ewens’ sampling distribution is used to restrict the number of interacting components assigned high posterior probability, and the calculation of posterior model probabilities is expedited by approximations based on the likelihood ratio statistic. Simulation studies are used to evaluate the efficiency of the resulting algorithm for known interaction structures. Finally, the algorithm is validated in a microarray study for which it was possible to obtain biological confirmation of detected interactions. PMID:19655032
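
    The abstract does not reproduce the model form; as a reminder of the class it belongs to, a generic log-linear model for a two-way contingency table is written out below (this is the standard textbook form, not the paper's gene-network specification or its Ewens-type prior):

```latex
% Generic log-linear model for a two-way contingency table (illustration of
% the model class only; the paper's specification is not reproduced here).
\[
  \log \mu_{ij} \;=\; \lambda \;+\; \lambda^{A}_{i} \;+\; \lambda^{B}_{j} \;+\; \lambda^{AB}_{ij},
\]
% where $\mu_{ij}$ is the expected count in cell $(i,j)$ and the interaction
% terms $\lambda^{AB}_{ij}$ capture association between factors $A$ and $B$.
```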

  14. Advanced dendritic web growth development and development of single-crystal silicon dendritic ribbon and high-efficiency solar cell program

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.

    1986-01-01

    Efforts to demonstrate that the dendritic web technology is ready for commercial use by the end of 1986 continue. A commercial readiness goal involves improvements to crystal growth furnace throughput to demonstrate an area growth rate of greater than 15 sq cm/min while simultaneously growing 10 meters or more of ribbon under conditions of continuous melt replenishment. Continuous means that the silicon melt is replenished at the same rate that it is consumed by ribbon growth, so that the melt level remains constant. Efforts continue on the computer thermal modeling required to define high speed, low stress, continuous growth configurations; on the study of convective effects in the molten silicon and growth furnace cover gas; on furnace component modifications; on web quality assessments; and on experimental growth activities.

  15. Data Mining of Network Logs

    NASA Technical Reports Server (NTRS)

    Collazo, Carlimar

    2011-01-01

    The statement of purpose is to analyze network monitoring logs to support the computer incident response team. Specifically: gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and provide a way to break down a URL based on protocol, host name, domain name, path, and other attributes. Finally, provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedure used for analysis and data reduction is a computer program that analyzes the URL and distinguishes advertisement links from actual content links.
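
    A standard-library sketch of the URL breakdown described above; the advertisement-domain list is a made-up placeholder and not the filter used by the actual program:

```python
# URL breakdown along the lines described above, using the standard library.
# The advertisement-domain list is a hypothetical placeholder.
from urllib.parse import urlparse, parse_qs

AD_DOMAINS = {"ads.example.com", "doubleclick.example"}  # invented for illustration

def breakdown(url):
    parts = urlparse(url)
    return {
        "protocol": parts.scheme,
        "host": parts.hostname,
        "domain": ".".join(parts.hostname.split(".")[-2:]) if parts.hostname else None,
        "path": parts.path,
        "query": parse_qs(parts.query),
        "is_ad": parts.hostname in AD_DOMAINS if parts.hostname else False,
    }

print(breakdown("https://ads.example.com/banner?size=728x90&id=42"))
```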

  16. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

    There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system needs to be developed before aerial logging will become effective and accepted in the logging industry. This paper presents such a system, designed on simple principles with realistic cost and ecological benefits.

  17. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  18. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  19. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  20. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  1. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  2. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 7 2012-07-01 2012-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  3. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  4. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 5 2013-10-01 2013-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  5. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  6. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  7. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  8. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  9. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  10. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  11. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  12. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 7 2013-07-01 2013-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  13. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 5 2014-10-01 2014-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  14. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 7 2011-07-01 2011-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  15. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 7 2014-07-01 2014-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  16. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  17. A Quantitative Cost Effectiveness Model for Web-Supported Academic Instruction

    ERIC Educational Resources Information Center

    Cohen, Anat; Nachmias, Rafi

    2006-01-01

    This paper describes a quantitative cost effectiveness model for Web-supported academic instruction. The model was designed for Web-supported instruction (rather than distance learning only) characterizing most of the traditional higher education institutions. It is based on empirical data (Web logs) of students' and instructors' usage…

  18. Applying WebMining on KM system

    NASA Astrophysics Data System (ADS)

    Shimazu, Keiko; Ozaki, Tomonobu; Furukawa, Koichi

    KM (Knowledge Management) systems have recently been adopted within the realm of enterprise management. At the same time, data mining technology is widely acknowledged within information systems' R&D divisions. In particular, the acquisition of meaningful information from Web usage data has become one of the most exciting areas. In this paper, we employ a Web based KM system and propose a framework for applying Web usage mining technology to KM data. As it turns out, task duration varies according to different user operations, such as referencing a table-of-contents page, downloading a target file, and writing to a bulletin board. This in turn makes it possible to predict the purpose of the user's task. By taking these observations into account, we segmented access log data manually. These results were compared with results obtained by applying the constant interval method. Next, we obtained a segmentation rule for Web access logs by applying a machine-learning algorithm to the manually segmented access logs as training data. The newly obtained segmentation rule was then compared with other known methods, including the time interval method, by evaluating their segmentation results in terms of recall and precision rates; our rule attained the best results in both measures. Furthermore, the segmented data were fed to an association rule miner and the obtained association rules were utilized to modify the Web structure.
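
    A minimal sketch of the time-interval baseline that the learned rule is compared against; the 30-minute gap threshold and the (user, timestamp, URL) record format are assumptions made for illustration:

```python
# Baseline time-interval segmentation of an access log, i.e. the simple method
# the paper compares its learned rule against. The 30-minute threshold and the
# (user, timestamp, url) tuple format are assumptions.
from datetime import datetime, timedelta

THRESHOLD = timedelta(minutes=30)

def segment_sessions(records):
    """records: iterable of (user, datetime, url), assumed sorted by time per user."""
    sessions = {}
    for user, ts, url in records:
        user_sessions = sessions.setdefault(user, [])
        # Start a new session when the gap to the previous hit exceeds THRESHOLD.
        if user_sessions and ts - user_sessions[-1][-1][1] <= THRESHOLD:
            user_sessions[-1].append((user, ts, url))
        else:
            user_sessions.append([(user, ts, url)])
    return sessions

log = [
    ("alice", datetime(2024, 1, 1, 9, 0), "/toc"),
    ("alice", datetime(2024, 1, 1, 9, 10), "/download/report.pdf"),
    ("alice", datetime(2024, 1, 1, 14, 0), "/bulletin/post"),
]
for user, sess in segment_sessions(log).items():
    print(user, [len(s) for s in sess])   # alice [2, 1]
```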

  19. Porosity Log Prediction Using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Dwi Saputro, Oki; Lazuardi Maulana, Zulfikar; Dzar Eljabbar Latief, Fourier

    2016-08-01

    Well logging is important in oil and gas exploration. Many physical parameters of a reservoir are derived from well logging measurements. Geophysicists often use well logging to obtain reservoir properties such as porosity, water saturation, and permeability. Most of the time, direct measurement of these reservoir properties is considered expensive. One way to substitute for the measurement is to conduct a prediction using an artificial neural network. In this paper, an artificial neural network is used to predict porosity log data from other log data. Three wells from the 'yy' field are used for the prediction experiment. The log data are sonic, gamma ray, and porosity logs. One of the three wells is used as training data for the artificial neural network, which employs the Levenberg-Marquardt backpropagation algorithm. Through several trials, we find that the optimal training input is sonic log data and gamma ray log data with 10 hidden layers. The prediction result in well 1 has a correlation of 0.92 and a mean squared error of 5.67 x 10^-4. The trained network was then applied to the other wells' data. The results show that the correlations in well 2 and well 3 are 0.872 and 0.9077, respectively, and the mean squared errors in well 2 and well 3 are 11 x 10^-4 and 9.539 x 10^-4. From these results we conclude that sonic and gamma ray logs can be a good combination for predicting porosity with a neural network.
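
    A sketch of that workflow in scikit-learn using synthetic log data: train on one well, predict porosity in another. Note two assumptions: scikit-learn has no Levenberg-Marquardt solver, so 'lbfgs' is used as a stand-in, and the "10 hidden" configuration is read here as a single hidden layer of 10 units.

```python
# Workflow sketch only: synthetic stand-ins for the sonic/gamma ray/porosity
# logs, 'lbfgs' instead of Levenberg-Marquardt, and an assumed (10,) hidden layer.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
sonic = rng.uniform(60, 120, 500)        # us/ft, fake "training well"
gamma = rng.uniform(20, 150, 500)        # API units
porosity = 0.002 * sonic - 0.0005 * gamma + rng.normal(0, 0.01, 500)

X_train = np.column_stack([sonic, gamma])
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                 max_iter=2000, random_state=0),
)
model.fit(X_train, porosity)

# Apply the trained network to another (synthetic) well.
X_other_well = np.column_stack([rng.uniform(60, 120, 100), rng.uniform(20, 150, 100)])
predicted_porosity = model.predict(X_other_well)
print(predicted_porosity[:5])
```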

  20. Logs key to solving water production problems

    SciTech Connect

    Wyatt, D.F. Jr.; Crook, R.J.

    1995-11-20

    Water source identification is the first and most important step in controlling unwanted water production that can severely limit the productive life of a well and, thereby, decrease hydrocarbon recovery. Water-control treatments often fail because the source of the water problem is not identified, the wrong treatment is performed, or the correct treatment is performed incorrectly. Table 1 lists typical problems, means of identification and evaluation, and chemical treatments available for correcting the problem. Well logs can help diagnose downhole situations that can lead to unwanted water production, and the effectiveness of water-control treatments can be evaluated with cased and open hole logs. The paper discusses cement bond logs and the pulse echo tool for cement evaluation. Casing evaluation is carried out with mechanical caliper logs and electromagnetic tools. Reservoir monitoring with pulsed neutron logs and pulsed neutron spectrometry is discussed. Also discussed are production logging, radioactive tracer logging, and well tests.

  1. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hill, F. E.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1980-01-01

    A barrier crucible design which consistently maintains melt stability over long periods of time was successfully tested and used in long growth runs. The pellet feeder for melt replenishment was operated continuously for growth runs of up to 17 hours. The liquid level sensor, comprising a laser/sensor system, was operated, performed well, and met the requirements for maintaining liquid level height during growth and melt replenishment. An automated feedback loop connecting the feed mechanism and the liquid level sensing system was designed, constructed, and operated successfully for 3.5 hours, demonstrating the feasibility of semi-automated dendritic web growth. The sensitivity of the cost of sheet to variations in capital equipment cost and recycling dendrites was calculated, and it was shown that these factors have relatively little impact on sheet cost. Dendrites from web which had gone all the way through the solar cell fabrication process, when melted and grown into web, produce crystals which show no degradation in cell efficiency. Material quality remains high, and cells made from web grown at the start, during, and at the end of a run from a replenished melt show comparable efficiencies.

  2. Logs Wanted - Dead or Alive

    NASA Astrophysics Data System (ADS)

    Schuchardt, A.; Morche, D.

    2015-12-01

    Rivers cover only a small part of the Earth's surface, yet they transfer sediment in globally significant quantities. In mountainous regions, the majority of the total channel length occurs in headwater streams. Those mountain channels are influenced in terms of sediment connectivity by processes on the slopes. For example, in such a sediment routing system, sediment originating from debris flows on the slopes is delivered along sediment pathways to the channel system and can be transported further downstream as solid load. Interruption of instream coarse sediment connectivity is closely related to the existence of channel blocking barriers, which can also be formed by biota. By storing sediment, large wood (LW) log jams disrupt in-channel sediment connectivity. We present a study design intended to decipher the short to long term (c. 10^-2 to 10^2 years) sediment (dis)connectivity effects of large wood. The study areas are two basins in mountain ranges in Germany and Austria. In Austria the drainage area of the river Fugnitz was chosen, which is located in the National Park Thayatal. The other drainage area, of the river Sieber in Saxony-Anhalt, Germany, is located in the Harz National Park. Since studies on LW and its geomorphological effects in Central European rivers are still rare, the main goals of the project are: to identify important triggers for LW transport from slopes into the channels; to examine the spatial distribution and characterization of LW in main and slope channels by mapping and dGPS measurements; to determine the effects of LW on channel hydraulic parameters (e.g. slope, width, grain size composition, roughness) by field measurements of channel long profiles and cross sections with dGPS and Wolman particle counts; and to quantify the direct effects of LW on discharge and bed load transport by measuring flow velocity with an Ott-Nautilus current meter and by measuring bed load up- and downstream of log jams using a portable Helley

  3. Web multimedia information retrieval using improved Bayesian algorithm.

    PubMed

    Yu, Yi-Jun; Chen, Chun; Yu, Yi-Min; Lin, Huai-Zhong

    2003-01-01

    The main thrust of this paper is the application of a novel data mining approach to the log of users' feedback to improve web multimedia information retrieval performance. A user space model was constructed based on data mining and then integrated into the original information space model to improve the accuracy of the new information space model. It can remove clutter and irrelevant text information and help to eliminate the mismatch between the page author's expression and the user's understanding and expectation. The user space model was also utilized to discover the relationship between high-level and low-level features for assigning weights. The authors propose an improved Bayesian algorithm for data mining. Experiments showed that the proposed algorithm is efficient.

  4. Leak checker data logging system

    DOEpatents

    Gannon, J.C.; Payne, J.J.

    1996-09-03

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, to permit repetitive testing of vacuum integrity in reduced time. 18 figs.

  5. Leak checker data logging system

    DOEpatents

    Gannon, Jeffrey C.; Payne, John J.

    1996-01-01

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, to permit repetitive testing of vacuum integrity in reduced time.

  6. Logged In and Zoned Out.

    PubMed

    Ravizza, Susan M; Uitvlugt, Mitchell G; Fenn, Kimberly M

    2017-02-01

    Laptop computers are widely prevalent in university classrooms. Although laptops are a valuable tool, they offer access to a distracting temptation: the Internet. In the study reported here, we assessed the relationship between classroom performance and actual Internet usage for academic and nonacademic purposes. Students who were enrolled in an introductory psychology course logged into a proxy server that monitored their online activity during class. Past research relied on self-report, but the current methodology objectively measured time, frequency, and browsing history of participants' Internet usage. In addition, we assessed whether intelligence, motivation, and interest in course material could account for the relationship between Internet use and performance. Our results showed that nonacademic Internet use was common among students who brought laptops to class and was inversely related to class performance. This relationship was upheld after we accounted for motivation, interest, and intelligence. Class-related Internet use was not associated with a benefit to classroom performance.

  7. Correlating Log Messages for System Diagnostics

    SciTech Connect

    Gunasekaran, Raghul; Dillow, David A; Shipman, Galen M; Maxwell, Don E; Hill, Jason J; Park, Byung H; Geist, Al

    2010-01-01

    In large-scale computing systems, the sheer volume of log data generated presents daunting challenges for debugging and monitoring of these systems. The Oak Ridge Leadership Computing Facility's premier simulation platform, the Cray XT5 known as Jaguar, can generate a few hundred thousand log entries in less than a minute for many system level events. Determining the root cause of such system events requires analysis and interpretation of a large number of log messages. Most often, the log messages are best understood when they are interpreted collectively rather than individually. In this paper, we present our approach to interpreting log messages by identifying their commonalities and grouping them into clusters. Given a set of log messages within a time interval, we group the messages based on source, target, and/or error type, and correlate the messages with hardware and application information. We monitor the Lustre log messages in the XT5 console log and show that such grouping of log messages assists in detecting the source of system events. By intelligent grouping and correlation of events in the log, we are able to provide system administrators with meaningful information in a concise format for root cause analysis.
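
    The paper's own clustering pipeline is not reproduced here; the sketch below only illustrates the grouping step it describes, bucketing console-log messages within a time window by (source, error type). The log line format and regex are assumptions, not the facility's actual format:

```python
# Minimal sketch of the grouping step described above: within a time window,
# bucket console-log messages by (source, error type) so related events can be
# read together. The log format and regex are invented for illustration.
import re
from collections import defaultdict

LINE = re.compile(r"^(?P<ts>\d+)\s+(?P<source>\S+)\s+(?P<error>[A-Z_]+)\s+(?P<msg>.*)$")

def group_messages(lines, window_start, window_end):
    clusters = defaultdict(list)
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue
        ts = int(m.group("ts"))
        if window_start <= ts <= window_end:
            clusters[(m.group("source"), m.group("error"))].append(m.group("msg"))
    return clusters

sample = [
    "1000 nid00012 LUSTRE_ERR ost connection lost",
    "1001 nid00012 LUSTRE_ERR ost connection lost",
    "1003 nid00087 MCE_WARN corrected memory error",
]
for key, msgs in group_messages(sample, 990, 1010).items():
    print(key, len(msgs))
```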

  8. Web Evaluation Tool (WET): A Creative Web Tool for Online Educators

    ERIC Educational Resources Information Center

    Hamza, Mohammad Khalid

    2003-01-01

    The Nielsen/NetRatings 2000 report noted that in 2002, online usage at work jumped 17 percent year-over-year, driven by female office workers. Nearly 46 million American office workers logged onto the Web, the highest peak since January 2000. It was also predicted that the number of students using the Internet was expected to reach 13.5…

  9. Mining Formative Evaluation Rules Using Web-Based Learning Portfolios for Web-Based Learning Systems

    ERIC Educational Resources Information Center

    Chen, Chih-Ming; Hong, Chin-Ming; Chen, Shyuan-Yi; Liu, Chao-Yu

    2006-01-01

    Learning performance assessment aims to evaluate what knowledge learners have acquired from teaching activities. Objective technical measures of learning performance are difficult to develop, but are extremely important for both teachers and learners. Learning performance assessment using learning portfolios or web server log data is becoming an…

  10. LOG PERIODIC DIPOLE ARRAY WITH PARASITIC ELEMENTS

    DTIC Science & Technology

    The design and measured characteristics of dipole and monopole versions of a log periodic array with parasitic elements are discussed. In a dipole...for the elements to obtain log periodic performance of the antenna. This design with parasitic elements lends itself to a monopole version of the...antenna which has a simplified feeding configuration. The result is a log periodic antenna design that can be used from high frequencies through microwave frequencies.

  11. 'Infectious web'.

    PubMed

    Kotra, L P; Ojcius, D M

    2000-12-01

    A comprehensive list of all known bacterial pathogens of humans is now available at various web sites on the internet. The sites contain hyperlinks to original scientific literature, along with general information on laboratory testing, antibiotic resistance and clinical treatment. More specific sites highlight the fungus Pneumocystis carinii, arguably the main cause of pneumonia in immunosuppressed individuals.

  12. Webbing It.

    ERIC Educational Resources Information Center

    Brandsberg, Jennifer

    1996-01-01

    Provides a quick look at some World Wide Web sites that contain current election year information. Recommends Project Vote Smart, a site with links to online news organizations, the home pages of all presidential candidates, and other political sites. Briefly notes several interactive CD-ROM resources. (MJP)

  13. Web Sitings.

    ERIC Educational Resources Information Center

    Lo, Erika

    2001-01-01

    Presents seven mathematics games, located on the World Wide Web, for elementary students, including: Absurd Math: Pre-Algebra from Another Dimension; The Little Animals Activity Centre; MathDork Game Room (classic video games focusing on algebra); Lemonade Stand (students practice math and business skills); Math Cats (teaches the artistic beauty…

  14. A Distributed Network Logging Topology

    DTIC Science & Technology

    2010-03-01

    ...implemented: unreliable UDP transfer, reliable TCP transfer, or potentially an encrypted SSL transfer. Each method would have different levels of traffic...infrastructure that resolves the centralized server bottleneck and data loss problem while still maintaining a searchable and efficient storage system.

  15. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast... made comparing the radio station clock with standard time, including errors observed and...

  16. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast... made comparing the radio station clock with standard time, including errors observed and...

  17. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast... made comparing the radio station clock with standard time, including errors observed and...

  18. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast... made comparing the radio station clock with standard time, including errors observed and...

  19. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast... made comparing the radio station clock with standard time, including errors observed and...

  20. Efficient Tracking, Logging, and Blocking of Accesses to Digital Objects

    DTIC Science & Technology

    2015-09-01


  1. Sleep Logs: Measurement of Individual and Operational Efficiency

    DTIC Science & Technology

    1991-05-01

    during a given mission scenario. There are many tools for measuring amounts and patterns of sleep. Portable brain wave recording systems, for an...Hullaney, and Wybarney, 1982). It can be used easily in the field to separate periods of rest/sleep (minimal activities, presumably asleep) from...physically active periods. However, such actigraphic units are relatively expensive. The most economical and preferred method to study sleep, especially ill

  2. Flow rate logging seepage meter

    NASA Technical Reports Server (NTRS)

    Reay, William G. (Inventor); Walthall, Harry G. (Inventor)

    1996-01-01

    An apparatus for remotely measuring and logging the flow rate of groundwater seepage into surface water bodies. As groundwater seeps into a cavity created by a bottomless housing, it displaces water through an inlet and into a waterproof sealed upper compartment, at which point, the water is collected by a collection bag, which is contained in a bag chamber. A magnet on the collection bag approaches a proximity switch as the collection bag fills, and eventually enables the proximity switch to activate a control circuit. The control circuit then rotates a three-way valve from the collection path to a discharge path, enables a data logger to record the time, and enables a pump, which discharges the water from the collection bag, through the three-way valve and pump, and into the sea. As the collection bag empties, the magnet leaves the proximity of the proximity switch, and the control circuit turns off the pump, resets the valve to provide a collection path, and restarts the collection cycle.

  3. Human dynamics revealed through Web analytics

    NASA Astrophysics Data System (ADS)

    Gonçalves, Bruno; Ramasco, José J.

    2008-08-01

    The increasing ubiquity of Internet access and the frequency with which people interact with it raise the possibility of using the Web to better observe, understand, and monitor several aspects of human social behavior. Web sites with large numbers of frequently returning users are ideal for this task. If these sites belong to companies or universities, their usage patterns can furnish information about the working habits of entire populations. In this work, we analyze the properly anonymized logs detailing the access history to Emory University’s Web site. Emory is a medium-sized university located in Atlanta, Georgia. We find interesting structure in the activity patterns of the domain and study in a systematic way the main forces behind the dynamics of the traffic. In particular, we find that linear preferential linking, priority-based queuing, and the decay of interest for the contents of the pages are the essential ingredients to understand the way users navigate the Web.

  4. DARK ENERGY FROM THE LOG-TRANSFORMED CONVERGENCE FIELD

    SciTech Connect

    Seo, Hee-Jong; Sato, Masanori; Takada, Masahiro; Dodelson, Scott

    2012-03-20

    A logarithmic transform of the convergence field improves 'the information content', i.e., the overall precision associated with the measurement of the amplitude of the convergence power spectrum, by improving the covariance matrix properties. The translation of this improvement in the information content to that in cosmological parameters, such as those associated with dark energy, requires knowing the sensitivity of the log-transformed field to those cosmological parameters. In this paper, we use N-body simulations with ray tracing to generate convergence fields at multiple source redshifts as a function of cosmology. The gain in information associated with the log-transformed field does lead to tighter constraints on dark energy parameters, but only if shape noise is neglected. The presence of shape noise quickly diminishes the advantage of the log-mapping, more quickly than we would expect based on the information content. With or without shape noise, using a larger pixel size allows for a more efficient log-transformation.
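
    The abstract does not spell out the mapping; the logarithmic transform commonly applied to the convergence field in this line of work has the form below, quoted here as an assumption about the convention rather than as the paper's exact definition:

```latex
% Commonly used logarithmic mapping of the convergence field (stated as an
% assumption; the paper's precise definition is not quoted here):
\[
  \kappa_{\ln} \;=\; \kappa_0 \, \ln\!\left(1 + \frac{\kappa}{\kappa_0}\right),
  \qquad \kappa_0 = \lvert \kappa_{\min} \rvert ,
\]
% which keeps the transform well defined for the negative values of $\kappa$
% that occur in underdense regions of the map.
```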

  5. Research on web performance optimization principles and models

    NASA Astrophysics Data System (ADS)

    Wang, Xin

    2013-03-01

    The rapid development of the Internet has made Web performance optimization an increasingly prominent issue, and therefore inevitable. The first principle of Web performance optimization is to understand that every gain has a cost and that returns diminish; optimization should therefore start from the highest level, where the largest gains are obtained. Technical models to improve Web performance are: sharing costs, high-speed caching, profiles, parallel processing, and simplified processing. Based on this study, key Web performance optimization recommendations are given, which improve the efficiency of Web usage and have important significance for accelerating the efficient use of the Internet.

  6. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  7. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  8. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  9. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  10. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  11. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  12. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  13. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  14. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  15. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  16. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  17. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  18. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  19. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  20. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  1. Discover Presidential Log Cabins. Teacher's Discussion Guide.

    ERIC Educational Resources Information Center

    National Park Service (Dept. of Interior), Washington, DC.

    Discover Presidential Log Cabins is a set of materials designed to help educate 6-8 grade students about the significance of three log cabin sites occupied by George Washington, Ulysses Grant, Abraham Lincoln, and Theodore Roosevelt. This teacher's discussion guide is intended for use as part of a larger, comprehensive social studies program, and…

  2. 'Infectious web'.

    PubMed

    Kotra, L P; Ojcius, D M

    2000-07-01

    Infections by Helicobacter pylori are responsible for duodenal and gastric ulcers and are a significant risk factor for the development of gastric adenocarcinoma. H. pylori was discovered in 1983, but many institutes in Canada, Europe, and the United States are already involved in programs to understand and treat the infections, as reflected by the growing number of internet sites devoted to this bacterium. Most AIDS patients and about 20% of children with acute lymphoblastic leukemia develop Pneumocystis carinii pneumonia. Information on clinical symptoms and treatment, as well as the P. carinii genome sequencing project, is described at several web sites. Students and researchers wishing to understand the correlation between telomere length and AIDS may turn to web sites of the University of Colorado and Washington University School of Medicine for the latest on telomeres and telomerase, and their function in aging and cancer.

  3. Project Assessment Skills Web Application

    NASA Technical Reports Server (NTRS)

    Goff, Samuel J.

    2013-01-01

    The purpose of this project is to utilize Ruby on Rails to create a web application that will replace a spreadsheet keeping track of training courses and tasks. The goal is to create a fast and easy to use web application that will allow users to track progress on training courses. This application will allow users to update and keep track of all of the training required of them. The training courses will be organized by group and by user, making readability easier. This will also allow group leads and administrators to get a sense of how everyone is progressing in training. Currently, updating and finding information from this spreadsheet is a long and tedious task. By upgrading to a web application, finding and updating information will be easier than ever as well as adding new training courses and tasks. Accessing this data will be much easier in that users just have to go to a website and log in with NDC credentials rather than request the relevant spreadsheet from the holder. In addition to Ruby on Rails, I will be using JavaScript, CSS, and jQuery to help add functionality and ease of use to my web application. This web application will include a number of features that will help update and track progress on training. For example, one feature will be to track progress of a whole group of users to be able to see how the group as a whole is progressing. Another feature will be to assign tasks to either a user or a group of users. All of these together will create a user friendly and functional web application.

  4. Designing and Piloting a Leadership Daily Practice Log: Using Logs to Study the Practice of Leadership

    ERIC Educational Resources Information Center

    Spillane, James P.; Zuberi, Anita

    2009-01-01

    Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…

  5. Use of a Web-Based Calculator and a Structured Report Generator to Improve Efficiency, Accuracy, and Consistency of Radiology Reporting.

    PubMed

    Towbin, Alexander J; Hawkins, C Matthew

    2017-03-29

    While medical calculators are common, they are infrequently used in the day-to-day radiology practice. We hypothesized that a calculator coupled with a structured report generator would decrease the time required to interpret and dictate a study in addition to decreasing the number of errors in interpretation. A web-based application was created to help radiologists calculate leg-length discrepancies. A time motion study was performed to evaluate if the calculator helped to decrease the time for interpretation and dictation of leg-length radiographs. Two radiologists each evaluated two sets of ten radiographs, one set using the traditional pen and paper method and the other set using the calculator. The time to interpret each study and the time to dictate each study were recorded. In addition, each calculation was checked for errors. When comparing the two methods of calculating the leg lengths, the manual method was significantly slower than the calculator for all time points measured: the mean time to calculate the leg-length discrepancy (131.8 vs. 59.7 s; p < 0.001), the mean time to dictate the report (31.8 vs. 11 s; p < 0.001), and the mean total time (163.7 vs. 70.7 s; p < 0.001). Reports created by the calculator were more accurate than reports created via the manual method (100 vs. 90%), although this result was not significant (p = 0.16). A calculator with a structured report generator significantly improved the time required to calculate and dictate leg-length discrepancy studies.

  6. Investigating metrics of geospatial web services: The case of a CEOS federated catalog service for earth observation data

    NASA Astrophysics Data System (ADS)

    Han, Weiguo; Di, Liping; Yu, Genong; Shao, Yuanzheng; Kang, Lingjun

    2016-07-01

    Geospatial Web Services (GWS) make geospatial information and computing resources discoverable and accessible over the Web. Among them, Open Geospatial Consortium (OGC) standards-compliant data, catalog and processing services are most popular, and have been widely adopted and leveraged in geospatial research and applications. GWS metrics, such as visit count, average processing time, and user distribution, are important for evaluating overall performance and impact. However, these metrics, especially those of a federated catalog service, have not been systematically evaluated and reported to relevant stakeholders from the point of view of service providers. Taking an integrated catalog service for earth observation data as an example, this paper describes metrics information retrieval, organization, and representation for a catalog service federation. An extensible and efficient log file analyzer is implemented to retrieve a variety of service metrics from the log file and store analysis results in an easily programmable format. An Ajax-powered Web portal is built to provide stakeholders, sponsors, developers, partners, and other types of users with specific and relevant insights into metrics information in an interactive and informative form. The deployed system has provided useful information for periodic reports, service delivery, and decision support. The proposed measurement strategy and analytics framework can serve as guidance to help GWS providers evaluate their services.
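
    A sketch of the kind of log-file analysis the paper describes, reduced to visit counts and average processing time per service path; the access-log line format and the trailing latency field are assumptions, not the CEOS federation's actual format:

```python
# Sketch of the kind of metrics extraction described above: visit counts and
# average processing time per service path, parsed from an access log whose
# format (combined-log style with a trailing "NNNms" latency) is assumed.
import re
from collections import defaultdict

LOG_LINE = re.compile(r'"GET (?P<path>\S+)[^"]*" \d{3} \S+ (?P<ms>\d+)ms$')

def service_metrics(lines):
    counts = defaultdict(int)
    total_ms = defaultdict(int)
    for line in lines:
        m = LOG_LINE.search(line)
        if not m:
            continue
        service = m.group("path").split("?")[0]
        counts[service] += 1
        total_ms[service] += int(m.group("ms"))
    return {s: (counts[s], total_ms[s] / counts[s]) for s in counts}

sample = [
    '10.0.0.1 - - [01/Jul/2016] "GET /csw?request=GetRecords HTTP/1.1" 200 5120 312ms',
    '10.0.0.2 - - [01/Jul/2016] "GET /csw?request=GetCapabilities HTTP/1.1" 200 900 45ms',
]
print(service_metrics(sample))   # {'/csw': (2, 178.5)}
```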

  7. Query log analysis of an electronic health record search engine.

    PubMed

    Yang, Lei; Mei, Qiaozhu; Zheng, Kai; Hanauer, David A

    2011-01-01

    We analyzed a longitudinal collection of query logs from a full-text search engine designed to facilitate information retrieval in electronic health records (EHR). The collection, 202,905 queries and 35,928 user sessions recorded over the course of 4 years, represents the information-seeking behavior of 533 medical professionals, including frontline practitioners, coding personnel, patient safety officers, and biomedical researchers, with respect to patient data stored in EHR systems. In this paper, we present descriptive statistics of the queries, a categorization of the information needs manifested through the queries, and temporal patterns of the users' information-seeking behavior. The results suggest that information needs in the medical domain are substantially more sophisticated than those that general-purpose web search engines need to accommodate. We therefore envision a significant challenge, along with significant opportunities, in providing intelligent query recommendations to facilitate information retrieval in EHR.

  8. A design method for an intuitive web site

    SciTech Connect

    Quinniey, M.L.; Diegert, K.V.; Baca, B.G.; Forsythe, J.C.; Grose, E.

    1999-11-03

    The paper describes a methodology for designing a web site for human factors engineers that is applicable to designing a web site for any group of people. Many web pages on the World Wide Web are not organized in a format that allows a user to efficiently find information. Often the information and hypertext links on web pages are not organized into intuitive groups. Intuition implies that a person is able to use their knowledge of a paradigm to solve a problem. Intuitive groups are categories that allow web page users to find information by using their intuition or mental models of categories. In order to improve human factors engineers' efficiency in finding information on the World Wide Web, research was performed to develop a web site that serves as a tool for finding information effectively. The paper describes a methodology for designing a web site for a group of people who perform similar tasks in an organization.

  9. Sample size calculation for testing differences between cure rates with the optimal log-rank test.

    PubMed

    Wu, Jianrong

    2017-01-01

    In this article, sample size calculations are developed for use when the main interest is in the differences between the cure rates of two groups. Following the work of Ewell and Ibrahim, the asymptotic distribution of the weighted log-rank test is derived under the local alternative. The optimal log-rank test under the proportional distributions alternative is discussed, and sample size formulas for the optimal and standard log-rank tests are derived. Simulation results show that the proposed formulas provide adequate sample size estimation for trial designs and that the optimal log-rank test is more efficient than the standard log-rank test, particularly when both cure rates and percentages of censoring are small.
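
    The paper's formulas for cure-rate differences and the optimal log-rank test are not given in the abstract; for orientation only, the widely used Schoenfeld approximation for the standard two-group log-rank test is sketched below. It is a stand-in illustration of a log-rank sample-size calculation, not the method derived in the paper:

    ```python
    from math import ceil, log
    from statistics import NormalDist

    def schoenfeld_sample_size(hazard_ratio, p_event, alpha=0.05, power=0.80, alloc=0.5):
        """Schoenfeld approximation for the standard two-sided log-rank test.

        hazard_ratio: hazard ratio under the alternative; p_event: overall probability
        of observing an event during the trial; alloc: fraction allocated to group 1.
        Shown for orientation only -- not the cure-rate formulas of the paper.
        """
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        events = (z_alpha + z_beta) ** 2 / (alloc * (1 - alloc) * log(hazard_ratio) ** 2)
        return ceil(events), ceil(events / p_event)

    events, subjects = schoenfeld_sample_size(hazard_ratio=0.67, p_event=0.6)
    print(events, subjects)  # about 196 events and 327 subjects under these inputs
    ```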

  10. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  11. The fluid-compensated cement bond log

    SciTech Connect

    Nayfeh, T.H.; Leslie, H.D.; Wheelis, W.B.

    1984-09-01

    An experimental and numerical wave mechanics study of cement bond logs demonstrated that wellsite computer processing can now segregate wellbore fluid effects from the sonic signal response to changing cement strength. Traditionally, cement logs have been interpreted as if water were in the wellbore, without consideration of wellbore fluid effects. These effects were assumed to be negligible. However, with the increasing number of logs being run in completion fluids such as CaCl2, ZnBr2, and CaBr2, large variations in cement bond logs became apparent. A Schlumberger internal paper showing that bond log amplitude is related to the acoustic impedance of the fluid in which the tool is run led to a comprehensive study of wellbore fluid effects. Numerical and experimental models were developed simulating wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings by varying the wellbore fluid densities, viscosities, and fluid types (acoustic impedance). Parallel numerical modeling was undertaken using similar parameters. The results showed that the bond log amplitude varied dramatically with the wellbore fluid's acoustic impedance; for example, there was a 70 percent increase in the signal amplitude for 11.5-lb/gal CaCl2 over the signal amplitude in water. This led to the development of a Fluid-Compensated Bond log that corrects the amplitude for acoustic impedance of varying wellbore fluids, thereby making the measurements more directly related to the cement quality.

  12. Focused Crawling of the Deep Web Using Service Class Descriptions

    SciTech Connect

    Rocco, D; Liu, L; Critchlow, T

    2004-06-21

    Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  13. Nonblocking and orphan free message logging protocols

    NASA Technical Reports Server (NTRS)

    Alvisi, Lorenzo; Hoppe, Bruce; Marzullo, Keith

    1992-01-01

    Currently existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery; however, it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead compared to one implemented for a system in which messages may be lost but processes do not crash.

  14. Reference manual for data base on Nevada well logs

    USGS Publications Warehouse

    Bauer, E.M.; Cartier, K.D.

    1995-01-01

    The U.S. Geological Survey and Nevada Division of Water Resources are cooperatively using a data base for managing well-log information for the State of Nevada. The Well-Log Data Base is part of an integrated system of computer data bases using the Ingres Relational Data-Base Management System, which allows efficient storage and access to water information from the State Engineer's office. The data base contains a main table, two ancillary tables, and nine lookup tables, as well as a menu-driven system for entering, updating, and reporting on the data. This reference guide outlines the general functions of the system and provides a brief description of data tables and data-entry screens.

  15. Novel Desorber for Online Drilling Mud Gas Logging.

    PubMed

    Lackowski, Marcin; Tobiszewski, Marek; Namieśnik, Jacek

    2016-01-01

    This work presents the construction solution and experimental results of a novel desorber for online drilling mud gas logging. Traditional desorbers use mechanical mixing of the liquid to stimulate transfer of hydrocarbons to the gaseous phase that is further analyzed. The presented approach is based on transfer of hydrocarbons from the liquid to gas bubbles flowing through it, followed by gas analysis. The desorber was checked for gas logging with four different drilling muds collected from Polish boreholes. Results of optimization studies are also presented. The comparison of the novel desorber with a commercial one reveals strong advantages of the novel design. It is characterized by much better hydrocarbon recovery efficiency and allows lower limits of detection to be reached for the whole analytical system. The presented desorber appears to be a very attractive alternative to the widely used mechanical desorbers.

  16. Predicting hospital visits from geo-tagged Internet search logs

    PubMed Central

    Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H.

    2016-01-01

    The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user’s future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources—a crucial prerequisite for securing healthcare access for everyone in the days to come. PMID:27570641
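
    The de-identified dataset and the authors' models are not available from the abstract; a hedged sketch of the general approach (turning search text into features for a visit-probability classifier) might look like the following, with all example data invented and the geo-tagged location features of the study omitted for brevity:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy stand-in data: (search text, 1 if the user later visited a medical facility).
    searches = [
        "urgent care near me open now",
        "er wait times downtown hospital",
        "best pizza delivery",
        "chest pain when breathing deeply",
        "movie showtimes tonight",
        "walk in clinic strep test cost",
    ]
    visited = [1, 1, 0, 1, 0, 1]

    # Text features only; the study also mined location patterns from geo-tags.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(searches, visited)

    print(model.predict_proba(["urgent care open sunday"])[0, 1])  # estimated visit probability
    ```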

  17. Novel Desorber for Online Drilling Mud Gas Logging

    PubMed Central

    Lackowski, Marcin; Tobiszewski, Marek; Namieśnik, Jacek

    2016-01-01

    This work presents the construction solution and experimental results of a novel desorber for online drilling mud gas logging. Traditional desorbers use mechanical mixing of the liquid to stimulate transfer of hydrocarbons to the gaseous phase that is further analyzed. The presented approach is based on transfer of hydrocarbons from the liquid to gas bubbles flowing through it, followed by gas analysis. The desorber was checked for gas logging with four different drilling muds collected from Polish boreholes. Results of optimization studies are also presented. The comparison of the novel desorber with a commercial one reveals strong advantages of the novel design. It is characterized by much better hydrocarbon recovery efficiency and allows lower limits of detection to be reached for the whole analytical system. The presented desorber appears to be a very attractive alternative to the widely used mechanical desorbers. PMID:27127674

  18. Graph Structures and Algorithms for Query-Log Analysis

    NASA Astrophysics Data System (ADS)

    Donato, Debora

    Query logs are repositories that record all the interactions of users with a search engine. This incredibly rich user behavior data can be modeled using appropriate graph structures. In recent years there has been an increasing amount of literature on studying properties, models, and algorithms for query-log graphs. Understanding the structure of such graphs, modeling user querying patterns, and designing algorithms for leveraging the latent knowledge (also known as the wisdom of the crowds) in those graphs introduce new challenges in the field of graph mining. The main goal of this paper is to present the reader with an example of these graph structures, namely the query-flow graph. This representation has been shown to be extremely effective for modeling user querying patterns and has been extensively used for developing real-time applications. Moreover, we present graph-based algorithmic solutions applied in the context of problems appearing in web applications, such as query recommendation and user-session segmentation.
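
    One common construction of a query-flow graph treats queries as nodes and adds a weighted edge for each pair of queries issued consecutively within a session. The sketch below is a generic illustration of that idea, not the specific model or edge-weighting scheme of the paper:

    ```python
    from collections import defaultdict

    def build_query_flow_graph(sessions):
        """sessions: iterable of query lists, each ordered by time within one session.

        Returns {query: {next_query: count}} counting consecutive query pairs.
        Generic sketch of a query-flow graph; edge weighting schemes vary by paper.
        """
        edges = defaultdict(lambda: defaultdict(int))
        for session in sessions:
            for current_query, next_query in zip(session, session[1:]):
                if current_query != next_query:
                    edges[current_query][next_query] += 1
        return edges

    graph = build_query_flow_graph([
        ["cheap flights", "cheap flights paris", "paris hotels"],
        ["paris hotels", "paris museums"],
    ])
    # Simple recommendation heuristic: suggest the most frequent successors of a query.
    print(sorted(graph["paris hotels"].items(), key=lambda kv: -kv[1]))
    ```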

  19. Determination of log P values of new cyclen based antimalarial drug leads using RP-HPLC.

    PubMed

    Rudraraju, A V; Amoyaw, P N A; Hubin, T J; Khan, M O F

    2014-09-01

    Lipophilicity, expressed by log P, is an important physicochemical property of drugs that affects many biological processes, including drug absorption and distribution. The main purpose of this study was to determine the log P values of newly discovered drug leads using reversed-phase high-performance liquid chromatography (RP-HPLC). The reference standards, with varying polarity ranges, were dissolved in methanol and analyzed by RP-HPLC using a C18 column. The mobile phase consisted of a mixture of acetonitrile, methanol and water in a gradient elution mode. A calibration curve was plotted between the experimental log P values and the obtained log k values of the reference standard compounds, and a best-fit line was obtained. The log k values of the new drug leads were determined in the same solvent system and were used to calculate the respective log P values from the best-fit equation. The log P vs. log k data gave a best-fit linear curve with an R2 of 0.9786, with P values for the intercept and slope of 1.19 x 10(-6) and 1.56 x 10(-10), respectively, at the 0.05 level of significance. Log P values of 15 new drug leads and related compounds, all of which are derivatives of macrocyclic polyamines and their metal complexes, were determined. The values obtained are closely related to the calculated log P (Clog P) values obtained using ChemDraw Ultra 12.0. This experiment provided efficient, fast and reasonable estimates of log P values of the new drug leads by RP-HPLC.
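
    The calibration step described (fit a straight line between experimental log P and measured log k for reference standards, then convert the log k of new leads into log P estimates) can be reproduced in outline as below; the numeric values are placeholders, not the study's data:

    ```python
    import numpy as np

    # Reference standards: measured retention (log k) and literature log P (placeholder values).
    log_k_ref = np.array([-0.42, -0.10, 0.25, 0.61, 0.98])
    log_p_ref = np.array([0.9, 1.8, 2.7, 3.6, 4.5])

    # Best-fit line: log P = slope * log k + intercept.
    slope, intercept = np.polyfit(log_k_ref, log_p_ref, 1)
    r_squared = np.corrcoef(log_k_ref, log_p_ref)[0, 1] ** 2

    # Apply the calibration to log k values measured for new drug leads.
    log_k_new = np.array([0.05, 0.44, 0.80])
    log_p_new = slope * log_k_new + intercept

    print(f"logP = {slope:.2f}*logk + {intercept:.2f} (R^2 = {r_squared:.3f})")
    print(log_p_new)
    ```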

  20. Challenges in mapping behaviours to activities using logs from a citizen science project

    NASA Astrophysics Data System (ADS)

    Morais, Alessandra M. M.; Guarino de Vasconcelos, Leandro; Santos, Rafael D. C.

    2016-05-01

    Citizen science projects are those which recruit volunteers to participate as assistants in scientific studies. Since these projects depend on volunteer efforts, understanding the motivation that drives a volunteer to collaborate is important to ensure a project's success. One way to understand motivation is by interviewing the volunteers. While this approach may elicit detailed information on the volunteers' motivation and actions, it is restricted to a subset of willing participants. For web-based projects we could instead use logs of volunteers' activities, which record which volunteer did what and when for all volunteers in a project. In this work we present some metrics that can be calculated from the logs, based on a model of interaction. We also comment on the applicability of those metrics, describe ongoing work that may yield more precise logs and metrics, and comment on issues for further research.

  1. Deep Web video

    SciTech Connect

    None Available

    2009-06-01

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  2. Deep Web video

    ScienceCinema

    None Available

    2016-07-12

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  3. Evaluation of historical dry well surveillance logs

    SciTech Connect

    Price, R.K.

    1996-09-09

    Several dry well surveillance logs from 1975 through 1995 for the SX Tank Farm have been examined to identify potential subsurface zones of radioactive contaminant migration. Several dynamic conditions of the gamma-ray-emitting radioactive contaminants have been identified.

  4. Expansion of industrial logging in Central Africa.

    PubMed

    Laporte, Nadine T; Stabach, Jared A; Grosch, Robert; Lin, Tiffany S; Goetz, Scott J

    2007-06-08

    Industrial logging has become the most extensive land use in Central Africa, with more than 600,000 square kilometers (30%) of forest currently under concession. With use of a time series of satellite imagery for the period from 1976 to 2003, we measured 51,916 kilometers of new logging roads. The density of roads across the forested region was 0.03 kilometer per square kilometer, but areas of Gabon and Equatorial Guinea had values over 0.09 kilometer per square kilometer. A new frontier of logging expansion was identified within the Democratic Republic of Congo, which contains 63% of the remaining forest of the region. Tree felling and skid trails increased disturbance in selectively logged areas.

  5. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-11-13

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via a wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  6. Optimal message log reclamation for uncoordinated checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. K.

    1994-01-01

    Uncoordinated checkpointing for message-passing systems allows maximum process autonomy and general nondeterministic execution, but suffers from a potential domino effect and the large space overhead for maintaining checkpoints and message logs. Traditionally, it has been assumed that only obsolete checkpoints and message logs before the global recovery line can be garbage-collected. Recently, an approach to identifying all garbage checkpoints based on recovery line transformation and decomposition has been developed. We show in this paper that the same approach can be applied to the problem of identifying all garbage message logs for systems requiring message logging to record in-transit messages. Communication trace-driven simulation for several parallel programs is used to evaluate the proposed algorithm.

  7. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-01-30

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via a wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  8. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 5 2012-07-01 2012-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  9. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 1 2012-07-01 2012-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  10. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 1 2014-07-01 2013-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  11. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  12. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 5 2013-07-01 2013-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  13. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 1 2013-07-01 2013-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  14. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 1 2011-07-01 2011-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  15. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  16. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 5 2014-07-01 2014-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  17. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  18. Selective Logging, Fire, and Biomass in Amazonia

    NASA Technical Reports Server (NTRS)

    Houghton, R. A.

    1999-01-01

    Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests had been logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions of carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.

  19. Conversation Threads Hidden within Email Server Logs

    NASA Astrophysics Data System (ADS)

    Palus, Sebastian; Kazienko, Przemysław

    Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from those email logs. Unfortunately, each mail is stored in a different record, and those records are not tied to each other in any obvious way. In this paper, a method for discussion thread extraction is proposed, together with experiments on two different data sets, Enron and WrUT.
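
    The abstract does not detail the proposed extraction method; one common heuristic for recovering conversation threads from per-message log records is to link messages through their Message-ID / In-Reply-To / References headers, sketched below with hypothetical record fields:

    ```python
    def group_into_threads(messages):
        """messages: list of dicts with 'id', optional 'in_reply_to', optional 'references'.

        Returns a list of threads (lists of message ids). Header-based heuristic only;
        real logs also need subject- and time-based fallbacks when headers are missing.
        """
        parent = {}

        def find(x):
            while parent.get(x, x) != x:
                parent[x] = parent.get(parent[x], parent[x])
                x = parent[x]
            return x

        def union(a, b):
            parent.setdefault(a, a)
            parent.setdefault(b, b)
            parent[find(a)] = find(b)

        for msg in messages:
            parent.setdefault(msg["id"], msg["id"])
            refs = ([msg["in_reply_to"]] if msg.get("in_reply_to") else []) + msg.get("references", [])
            for ref in refs:
                union(msg["id"], ref)

        threads = {}
        for msg in messages:
            threads.setdefault(find(msg["id"]), []).append(msg["id"])
        return list(threads.values())

    print(group_into_threads([
        {"id": "<a@x>"},
        {"id": "<b@x>", "in_reply_to": "<a@x>"},
        {"id": "<c@x>", "references": ["<a@x>", "<b@x>"]},
        {"id": "<d@x>"},
    ]))
    ```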

  20. 3D GPR Imaging of Wooden Logs

    NASA Astrophysics Data System (ADS)

    Halabe, Udaya B.; Pyakurel, Sandeep

    2007-03-01

    There has been a lack of an effective NDE technique to locate internal defects within wooden logs. The few available elastic wave propagation based techniques are limited to predicting E values. Other techniques such as X-rays have not been very successful in detecting internal defects in logs. If defects such as embedded metals could be identified before the sawing process, the saw mills could significantly increase their production by reducing the probability of damage to the saw blade and the associated downtime and the repair cost. Also, if the internal defects such as knots and decayed areas could be identified in logs, the sawing blade can be oriented to exclude the defective portion and optimize the volume of high valued lumber that can be obtained from the logs. In this research, GPR has been successfully used to locate internal defects (knots, decays and embedded metals) within the logs. This paper discusses GPR imaging and mapping of the internal defects using both 2D and 3D interpretation methodology. Metal pieces were inserted in a log and the reflection patterns from these metals were interpreted from the radargrams acquired using 900 MHz antenna. Also, GPR was able to accurately identify the location of knots and decays. Scans from several orientations of the log were collected to generate 3D cylindrical volume. The actual location of the defects showed good correlation with the interpreted defects in the 3D volume. The time/depth slices from 3D cylindrical volume data were useful in understanding the extent of defects inside the log.

  1. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  2. From Web 2.0 to Teacher 2.0

    ERIC Educational Resources Information Center

    Thomas, David A.; Li, Qing

    2008-01-01

    The World Wide Web is evolving in response to users who demand faster and more efficient access to information, portability, and reusability of digital objects between Web-based and computer-based applications and powerful communication, publication, collaboration, and teaching and learning tools. This article reviews current uses of Web-based…

  3. Investigating the Web Structure by Isolated Stars

    NASA Astrophysics Data System (ADS)

    Uno, Yushi; Ota, Yoshinobu; Uemichi, Akio

    The link structure of the Web is generally represented by the webgraph, and it is often used for web structure mining, which mainly aims to find hidden communities on the Web. In this paper, we identify a common frequent substructure and give it a formal graph definition, which we call an isolated star (i-star), and propose an efficient enumeration algorithm for i-stars. We then investigate the structure of the Web by enumerating i-stars from real web data. As a result, we observed that most i-stars correspond to index structures in single domains, while some of them are verified to be candidate communities, which implies the validity of i-stars as a useful substructure for web structure mining and link spam detection. We also observed that the distributions of i-star sizes follow a power law, which is further evidence of the scale-freeness of the webgraph.

  4. 5. Log calving barn. Detail of wall corner showing half ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Log calving barn. Detail of wall corner showing half dovetail notching on hand-hewn logs. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT

  5. 55. VIEW OF STEAMOPERATED LOG HOIST TO PUT IN COMING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    55. VIEW OF STEAM-OPERATED LOG HOIST TO PUT INCOMING LOGS INTO RALPH HULL LUMBER CO. LOG POND. PHOTOGRAPHER: UNKNOWN. DATE: 1942. COURTESY OF RALPH HULL. - Hull-Oakes Lumber Company, 23837 Dawson Road, Monroe, Benton County, OR

  6. Well log characterization of natural gas hydrates

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.

    2011-01-01

    In the last 25 years we have seen significant advancements in the use of downhole well logging tools to acquire detailed information on the occurrence of gas hydrate in nature: From an early start of using wireline electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells drilled in Arctic permafrost environments to today where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. The most established and well known use of downhole log data in gas hydrate research is the use of electrical resistivity and acoustic velocity data (both compressional- and shear-wave data) to make estimates of gas hydrate content (i.e., reservoir saturations) in various sediment types and geologic settings. New downhole logging tools designed to make directionally oriented acoustic and propagation resistivity log measurements have provided the data needed to analyze the acoustic and electrical anisotropic properties of both highly inter-bedded and fracture dominated gas hydrate reservoirs. Advancements in nuclear-magnetic-resonance (NMR) logging and wireline formation testing have also allowed for the characterization of gas hydrate at the pore scale. Integrated NMR and formation testing studies from northern Canada and Alaska have yielded valuable insight into how gas hydrates are physically distributed in sediments and the occurrence and nature of pore fluids (i.e., free-water along with clay and capillary bound water) in gas-hydrate-bearing reservoirs. Information on the distribution of gas hydrate at the pore scale has provided invaluable insight on the mechanisms controlling the formation and occurrence of gas hydrate in nature along with data on gas hydrate reservoir properties (i.e., permeabilities) needed to accurately predict gas production rates for various gas hydrate

  7. Thermal Properties of Bazhen fm. Sediments from Thermal Core Logging

    NASA Astrophysics Data System (ADS)

    Spasennykh, Mikhail; Popov, Evgeny; Popov, Yury; Chekhonin, Evgeny; Romushkevich, Raisa; Zagranovskaya, Dzhuliya; Belenkaya, Irina; Zhukov, Vladislav; Karpov, Igor; Saveliev, Egor; Gabova, Anastasia

    2016-04-01

    The Bazhen formation (B. fm.) is the largest self-contained source-and-reservoir continuous petroleum system, covering more than 1 million km2 (West Siberia, Russia). High lithological differentiation in the Bazhen deposits, which are dominated by silicic shales and carbonates and accompanied by extremely high total organic carbon values (up to 35%), high pyrite content, and a brittle mineralogical composition, complicates standard thermal-property assessment for low-permeability rocks. Reliable information on the thermal characteristics of an unconventional system is a necessary part of work such as modelling reservoir processes under thermal EOR to assess their efficiency, developing and optimizing the design of oil recovery methods, interpreting well temperature logging data, and basin petroleum modelling. A unique set of data, including thermal conductivity, thermal diffusivity, volumetric heat capacity, and thermal anisotropy for the B.fm. rocks, was obtained from thermal core logging (high-resolution continuous thermal profiling) on more than 4680 core samples (including 2000 B.fm. samples) along seven wells from four oil fields. Some systematic relations between the thermal properties of the B.fm. rocks and their mineralogical composition and structural and textural properties were obtained. The high-resolution data were processed jointly with the standard petrophysical logging, which allowed better separation of the formation. The research work was done with financial support of the Russian Ministry of Education and Science (unique identification number RFMEFI58114X0008).

  8. Computer Cache. Online Recess--Web Games for Play and Fun

    ERIC Educational Resources Information Center

    Byerly, Greg; Brodie, Carolyn S.

    2005-01-01

    There are many age-appropriate, free, and easy-to-use online games available on the Web. In this column the authors describe some of their favorites for use with and by elementary students. They have not included games that require children to log on and/or register with their names or play against someone else interactively over the Web. None of…

  9. Web-Based Learning Programs: Use by Learners with Various Cognitive Styles

    ERIC Educational Resources Information Center

    Chen, Ling-Hsiu

    2010-01-01

    To consider how Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…

  10. Web Mining: Machine Learning for Web Applications.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Chau, Michael

    2004-01-01

    Presents an overview of machine learning research and reviews methods used for evaluating machine learning systems. Ways that machine-learning algorithms were used in traditional information retrieval systems in the "pre-Web" era are described, and the field of Web mining and how machine learning has been used in different Web mining…

  11. Workspaces in the Semantic Web

    NASA Technical Reports Server (NTRS)

    Wolfe, Shawn R.; Keller, RIchard M.

    2005-01-01

    Due to the recency and relatively limited adoption of Semantic Web technologies, practical issues related to technology scaling have received less attention than foundational issues. Nonetheless, these issues must be addressed if the Semantic Web is to realize its full potential. In particular, we concentrate on the lack of scoping methods that reduce the size of semantic information spaces so they are more efficient to work with and more relevant to an agent's needs. We provide some intuition to motivate the need for such reduced information spaces, called workspaces, give a formal definition, and suggest possible methods of deriving them.

  12. Well log evaluation of gas hydrate saturations

    USGS Publications Warehouse

    Collett, Timothy S.

    1998-01-01

    The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters; one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The `standard' and `quick look' Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.
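
    For reference, the "standard" Archie relation mentioned here is commonly written as follows, with water saturation estimated from resistivity and porosity logs and hydrate saturation taken as its complement (parameter values are formation-specific, and this is the textbook form rather than the study's exact calibration):

    ```latex
    S_w = \left( \frac{a\, R_w}{\phi^{m} R_t} \right)^{1/n}, \qquad S_h = 1 - S_w
    ```

    Here R_t is the deep-resistivity log reading, R_w the formation-water resistivity, phi the porosity (from density or neutron logs), a, m, and n empirically determined constants, and S_h the gas-hydrate saturation.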

  13. Well log evaluation of gas hydrate saturations

    USGS Publications Warehouse

    Collett, T.S.

    1998-01-01

    The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters; one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The "standard" and "quick look" Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in all of the gas hydrate accumulations assessed in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.

  14. The Design of Plywood Webs for Airplane Wing Beams

    NASA Technical Reports Server (NTRS)

    Trayer, George W

    1931-01-01

    This report deals with the design of plywood webs for wooden box beams to obtain maximum strength per unit weight. A method of arriving at the most efficient and economical web thickness, and hence the most suitable unit shear stress, is presented and working stresses in shear for various types of webs and species of plywood are given. The questions of diaphragm spacing and required glue area between the webs and flange are also discussed.

  15. Enhancing DSN Operations Efficiency with the Discrepancy Reporting Management System (DRMS)

    NASA Technical Reports Server (NTRS)

    Chatillon, Mark; Lin, James; Cooper, Tonja M.

    2003-01-01

    The DRMS is the Discrepancy Reporting Management System used by the Deep Space Network (DSN). It uses a web interface and is a management tool designed to track and manage data outage incidents during spacecraft tracks against equipment and software, known as DRs (Discrepancy Reports); to record "out of pass" incident logs against equipment and software in a Station Log; to record instances where equipment has been restarted or reset as Reset records; and to electronically record equipment readiness status across the DSN. Tracking and managing these items increases DSN operational efficiency by providing the ability to establish the operational history of equipment items, data on the quality of service provided to DSN customers, the ability to measure service performance, early insight into processes, procedures and interfaces that may need updating or changing, and the capability to trace a data outage to a software or hardware change. The items listed above help the DSN focus resources on the areas of most need.

  16. Statistical factor analysis technique for characterizing basalt through interpreting nuclear and electrical well logging data (case study from Southern Syria).

    PubMed

    Asfahani, Jamal

    2014-02-01

    The factor analysis technique is proposed in this research for interpreting the combination of nuclear well logging, including natural gamma ray, density and neutron porosity, and the electrical well logging of long and short normal, in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs make it possible to establish the lithological score cross-section of the studied well. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and the altered basalt product, clay. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and huge amounts of well logging data with a high number of variables need to be interpreted.
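
    The core step described, deriving score logs from a factor analysis of several log variables (gamma ray, density, neutron porosity, long and short normal resistivity), can be sketched generically with scikit-learn on synthetic depth samples; this is an illustration of the technique, not the author's exact workflow:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Synthetic stand-in: rows are depth samples, columns are the five log variables
    # (gamma ray, bulk density, neutron porosity, long normal, short normal).
    logs = rng.normal(size=(500, 5))

    # Standardize the variables, then extract factor scores along depth.
    scores = FactorAnalysis(n_components=4, random_state=0).fit_transform(
        StandardScaler().fit_transform(logs)
    )

    print(scores.shape)  # (500, 4): four "score logs" along depth
    ```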

  17. Online Persistence in Higher Education Web-Supported Courses

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2011-01-01

    This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…

  18. Use of an Academic Library Web Site Search Engine.

    ERIC Educational Resources Information Center

    Fagan, Jody Condit

    2002-01-01

    Describes an analysis of the search engine logs of Southern Illinois University, Carbondale's library to determine how patrons used the site search. Discusses results that showed patrons did not understand the function of the search and explains improvements that were made in the Web site and in online reference services. (Author/LRW)

  19. Adolescents' Web-Based Literacies, Identity Construction, and Skill Development

    ERIC Educational Resources Information Center

    Alvermann, Donna E.; Marshall, James D.; McLean, Cheryl A.; Huddleston, Andrew P.; Joaquin, Jairus; Bishop, John

    2012-01-01

    Five qualitative multiple-case studies document adolescents' uses of Web-based resources and digital literacy skills to construct their online identities. Working from a perspective that integrates new literacies with multimodality, the researchers enlisted the help of five high school students who kept daily logs of the websites they visited for…

  20. Unconventional neutron sources for oil well logging

    NASA Astrophysics Data System (ADS)

    Frankle, C. M.; Dale, G. E.

    2013-09-01

    Americium-Beryllium (AmBe) radiological neutron sources have been widely used in the petroleum industry for well logging purposes. There is a strong desire on the part of various governmental and regulatory bodies to find alternate sources due to the high activity and small size of AmBe sources. Other neutron sources are available, both radiological (252Cf) and electronic accelerator driven (D-D and D-T). All of these, however, have substantially different neutron energy spectra from AmBe and thus cause significantly different responses in well logging tools. We report on simulations performed using unconventional sources and techniques to attempt to better replicate the porosity and carbon/oxygen ratio responses a well logging tool would see from AmBe neutrons. The AmBe response of these two types of tools is compared to the response from 252Cf, D-D, D-T, filtered D-T, and T-T sources.

  1. Spreadsheet log analysis in subsurface geology

    USGS Publications Warehouse

    Doveton, J.H.

    2000-01-01

    Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.

  2. Lithologic logs and geophysical logs from test drilling in Palm Beach County, Florida, since 1974

    USGS Publications Warehouse

    Swayze, Leo J.; McGovern, Michael C.; Fischer, John N.

    1980-01-01

    Test-hole data that may be used to determine the hydrogeology of the zone of high permeability in Palm Beach County, Fla., are presented. Lithologic logs from 46 test wells and geophysical logs from 40 test wells are contained in this report. (USGS)

  3. LogSafe and Smart: Minnesota OSHA's LogSafe Program Takes Root.

    ERIC Educational Resources Information Center

    Honerman, James

    1999-01-01

    Logging is now the most dangerous U.S. occupation. The Occupational Safety and Health Administration (OSHA) developed specialized safety training for the logging industry but has been challenged to reach small operators. An OSHA-approved state program in Minnesota provides annual safety seminars to about two-thirds of the state's full-time…

  4. Relationships between log N-log S and celestial distribution of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Nishimura, J.; Yamagami, T.

    1985-01-01

    The apparent conflict between the log N-log S curve and the isotropic celestial distribution of the gamma-ray bursts is discussed. A possible selection effect due to the time profile of each burst is examined. It is shown that the contradiction is due to this selection effect of the gamma-ray bursts.

  5. Antibiotic Pollution in Marine Food Webs in Laizhou Bay, North China: Trophodynamics and Human Exposure Implication.

    PubMed

    Liu, Sisi; Zhao, Hongxia; Lehmler, Hans-Joachim; Cai, Xiyun; Chen, Jingwen

    2017-02-21

    Little information is available about the bioaccumulation and biomagnification of antibiotics in marine food webs. Here, we investigate the levels and trophic transfer of 9 sulfonamide (SA), 5 fluoroquinolone (FQ), and 4 macrolide (ML) antibiotics, as well as trimethoprim in nine invertebrate and ten fish species collected from a marine food web in Laizhou Bay, North China in 2014 and 2015. All the antibiotics were detected in the marine organisms, with SAs and FQs being the most abundant antibiotics. Benthic fish accumulated more SAs than invertebrates and pelagic fish, while invertebrates exhibited higher FQ levels than fish. Generally, SAs and trimethoprim biomagnified in the food web, while the FQs and MLs were biodiluted. Trophic magnification factors (TMF) were 1.2-3.9 for SAs and trimethoprim, 0.3-1.0 for FQs and MLs. Limited biotransformation and relatively high assimilation efficiencies are the likely reasons for the biomagnification of SAs. The pH dependent distribution coefficients (log D) but not the lipophilicity (log KOW) of SAs and FQs had a significant correlation (r = 0.73; p < 0.05) with their TMFs. Although the calculated estimated daily intakes (EDI) for antibiotics suggest that consumption of seafood from Laizhou Bay is not associated with significant human health risks, this study provides important insights into the guidance of risk management of antibiotics.
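
    Trophic magnification factors of the kind quoted here are conventionally obtained by regressing log-transformed concentration on trophic level and exponentiating the slope; as a worked illustration of that standard calculation (not necessarily the authors' exact procedure):

    ```latex
    \log_{10} C_i = a + b \cdot TL_i, \qquad \mathrm{TMF} = 10^{b}
    ```

    Under this convention, the reported TMF of 3.9 corresponds to a regression slope of about 0.59 log units per trophic level (biomagnification), while TMF values below 1, as found for the FQs and MLs, correspond to a negative slope, i.e., trophic dilution.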

  6. EPA Web Taxonomy

    EPA Pesticide Factsheets

    EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's Web-based information assets. EPA's Web Taxonomy is being provided in Simple Knowledge Organization System (SKOS) format. SKOS is a standard for sharing and linking knowledge organization systems that promises to make Federal terminology resources more interoperable.

  7. Development of pulsed neutron uranium logging instrument.

    PubMed

    Wang, Xin-guang; Liu, Dan; Zhang, Feng

    2015-03-01

    This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of (235)U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and that the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.

  8. Development of pulsed neutron uranium logging instrument

    SciTech Connect

    Wang, Xin-guang; Liu, Dan; Zhang, Feng

    2015-03-15

    This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of (235)U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and that the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.

  9. Compacting a Kentucky coal for quality logs

    SciTech Connect

    Lin, Y.; Li, Z.; Mao, S.

    1999-07-01

    A Kentucky coal was found to be more difficult to compact into large, strong logs. The study showed that compaction parameters affecting the strength of compacted coal logs can be categorized into three groups. The first group comprises inherent coal properties such as elasticity and the coefficient of friction, the second group comprises machine properties such as mold geometry, and the third group comprises coal mixture preparation parameters such as particle size distribution. Theoretical analysis showed that an appropriate backpressure can reduce the surface cracks that occur during ejection. This has been confirmed by the experiments conducted.

  10. MAIL LOG, program summary and specifications

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The summary and specifications for obtaining the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding specific records in the data base; and (4) archive - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.
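
    The original program ran on SPADS and its record layout is not given here; purely as an illustration of the four modes listed, a minimal record store could be organized like this (class and field names are hypothetical):

    ```python
    class MailLog:
        """Toy illustration of the four modes described; not the original MAIL LOG."""

        def __init__(self):
            self.records = {}   # active records keyed by record id
            self.archived = {}  # records moved out of the active data base

        def input(self, record_id, **fields):      # mode 1: add a new record
            self.records[record_id] = dict(fields)

        def revise(self, record_id, **changes):    # mode 2: modify an existing record
            self.records[record_id].update(changes)

        def search(self, **criteria):              # mode 3: find matching records
            return {rid: rec for rid, rec in self.records.items()
                    if all(rec.get(k) == v for k, v in criteria.items())}

        def archive(self, record_id):              # mode 4: store a record away
            self.archived[record_id] = self.records.pop(record_id)

    log = MailLog()
    log.input("M001", sender="LaRC", subject="Scout launch schedule")
    log.revise("M001", subject="Scout launch schedule (rev A)")
    print(log.search(sender="LaRC"))
    log.archive("M001")
    ```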

  11. Quality of the log-geometric distribution extrapolation for smaller undiscovered oil and gas pool size

    USGS Publications Warehouse

    Chenglin, L.; Charpentier, R.R.

    2010-01-01

    The U.S. Geological Survey procedure for the estimation of the general form of the parent distribution requires that the parameters of the log-geometric distribution be calculated and analyzed for the sensitivity of these parameters to different conditions. In this study, we derive the shape factor of a log-geometric distribution from the ratio of frequencies between adjacent bins. The shape factor has a log straight-line relationship with the ratio of frequencies. Additionally, the calculation equations of a ratio of the mean size to the lower size-class boundary are deduced. For a specific log-geometric distribution, we find that the ratio of the mean size to the lower size-class boundary is the same. We apply our analysis to simulations based on oil and gas pool distributions from four petroleum systems of Alberta, Canada and four generated distributions. Each petroleum system in Alberta has a different shape factor. Generally, the shape factors in the four petroleum systems stabilize with the increase of discovered pool numbers. For a log-geometric distribution, the shape factor becomes stable when discovered pool numbers exceed 50 and the shape factor is influenced by the exploration efficiency when the exploration efficiency is less than 1. The simulation results show that calculated shape factors increase with those of the parent distributions, and undiscovered oil and gas resources estimated through the log-geometric distribution extrapolation are smaller than the actual values. ?? 2010 International Association for Mathematical Geology.

  12. Web data mining

    NASA Astrophysics Data System (ADS)

    Wibonele, Kasanda J.; Zhang, Yanqing

    2002-03-01

    A web data mining system using granular computing and ASP programming is proposed. This is a web-based application that allows web users to submit survey data for many different companies. The survey is a collection of questions whose answers, once analyzed, help these companies develop and improve their business and customer service. The application allows users to submit data from anywhere, and all survey data are collected into a database for further analysis. An administrator can log in to the system and view all the data submitted. The web application resides on a web server, and the database resides on an MS SQL server.

  13. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    SciTech Connect

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and NetLogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular, we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what their runtimes were, and so on. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. The poster demonstrates the scalability of the system by presenting results of uploading task execution records and of querying the system for overall workflow performance information.

  14. 1. GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING CEDAR LOGS FROM TRUCKS AT LOG DUMP, ADJACENT TO MILL; TRUCKS FORMERLY USED TRIP STAKES, THOUGH FOR SAFER HANDLING OF LOGS WELDED STAKES ARE NOW REQUIRED; AS A RESULT LOADING IS NOW DONE WITH A CRANE - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR

  15. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  16. 32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 5 2013-07-01 2013-07-01 false Status of logs. 700.846 Section 700.846 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY UNITED STATES NAVY REGULATIONS AND... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

  17. 47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Station logs for LPFM stations. 73.877 Section... BROADCAST SERVICES Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

  18. 47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Retention of logs. 73.782 Section 73.782... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

  19. 32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Status of logs. 700.846 Section 700.846 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY UNITED STATES NAVY REGULATIONS AND... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

  20. 32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 5 2014-07-01 2014-07-01 false Status of logs. 700.846 Section 700.846 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY UNITED STATES NAVY REGULATIONS AND... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

  1. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  2. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  3. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

  4. 47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Retention of logs. 73.782 Section 73.782... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

  5. 47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Station logs for LPFM stations. 73.877 Section... BROADCAST SERVICES Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

  6. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  7. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

  8. 47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Retention of logs. 73.782 Section 73.782... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

  9. 32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 5 2012-07-01 2012-07-01 false Status of logs. 700.846 Section 700.846 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY UNITED STATES NAVY REGULATIONS AND... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

  10. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

  11. 47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Retention of logs. 73.782 Section 73.782... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

  12. 47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Station logs for LPFM stations. 73.877 Section... BROADCAST SERVICES Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

  13. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  14. 47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Station logs for LPFM stations. 73.877 Section... BROADCAST SERVICES Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

  15. 47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Station logs for LPFM stations. 73.877 Section... BROADCAST SERVICES Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

  16. 32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Status of logs. 700.846 Section 700.846 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY UNITED STATES NAVY REGULATIONS AND... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

  17. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

  18. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

  19. 47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Retention of logs. 73.782 Section 73.782... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

  20. There's Life in Those Dead Logs!

    ERIC Educational Resources Information Center

    Biggs, Devin; Miller, Todd; Hall, Dee

    2006-01-01

    Although it is unspectacular in appearance, dead wood is one of the most ecologically important resources in forests. Fallen logs, dead standing trees, stumps, and even cavities in live trees fulfill a wide range of roles. Prominent among these is that they provide habitat for many organisms, especially insects. Fourth-grade students at Fox…

  1. Precision prediction of the log power spectrum

    NASA Astrophysics Data System (ADS)

    Repp, A.; Szapudi, I.

    2017-01-01

    At translinear scales, the log power spectrum captures significantly more cosmological information than the standard power spectrum. At high wavenumbers k, the Fisher information in the standard power spectrum P(k) fails to increase in proportion to k, in part due to correlations between large- and small-scale modes. As a result, P(k) suffers from an information plateau on these translinear scales, so that analysis with the standard power spectrum cannot access the information contained in these small-scale modes. The log power spectrum P_A(k), on the other hand, captures the majority of this otherwise lost information. Until now there has been no means of predicting the amplitude of the log power spectrum apart from cataloging the results of simulations. We here present a cosmology-independent prescription for the log power spectrum; this prescription displays accuracy comparable to that of Smith et al., over a range of redshifts and smoothing scales, and for wavenumbers up to 1.5 h Mpc^-1.
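
    For readers who want to experiment, the short Python sketch below (not the authors' code) measures P_A(k) for a gridded density field by taking A = ln(1 + delta), Fourier transforming, and bin-averaging |A_k|^2; the grid size, box size and toy lognormal field are assumptions made for illustration only.

        # Sketch: bin-averaged log power spectrum P_A(k) of a 3-D density grid.
        import numpy as np

        def log_power_spectrum(delta, box_size, n_bins=20):
            n = delta.shape[0]
            A = np.log(1.0 + delta)
            A -= A.mean()                               # remove the mean of the log field
            Ak = np.fft.fftn(A) * (box_size / n) ** 3   # FFT with a volume normalisation
            power = np.abs(Ak) ** 2 / box_size ** 3
            kfreq = 2.0 * np.pi * np.fft.fftfreq(n, d=box_size / n)
            kx, ky, kz = np.meshgrid(kfreq, kfreq, kfreq, indexing="ij")
            kmag = np.sqrt(kx ** 2 + ky ** 2 + kz ** 2).ravel()
            pflat = power.ravel()
            edges = np.linspace(kmag[kmag > 0].min(), kmag.max(), n_bins + 1)
            idx = np.digitize(kmag, edges)
            P_A = np.array([pflat[idx == i].mean() if np.any(idx == i) else np.nan
                            for i in range(1, n_bins + 1)])
            return 0.5 * (edges[1:] + edges[:-1]), P_A

        # Toy lognormal-like field on a 32^3 grid in a 100 Mpc/h box (illustration only):
        rng = np.random.default_rng(0)
        delta = np.exp(rng.normal(0.0, 0.3, size=(32, 32, 32))) - 1.0
        k, PA = log_power_spectrum(delta, box_size=100.0)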

  2. [Human development and log-periodic law].

    PubMed

    Cash, Roland; Chaline, Jean; Nottale, Laurent; Grou, Pierre

    2002-05-01

    We suggest applying the log-periodic law formerly used to describe various crisis phenomena, in biology (evolutionary leaps), inorganic systems (earthquakes), societies and economy (economic crisis, market crashes) to the various steps of human ontogeny. We find a statistically significant agreement between this model and the data.

  3. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  4. 29 CFR 1910.266 - Logging operations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... other device is then hooked to the lifting or pulling machine. Danger tree. A standing tree that... delivery, such as, but not limited to, marking danger trees and trees/logs to be cut to length, felling... danger tree shall be felled, removed or avoided. Each danger tree, including lodged trees and...

  5. Data logging technology in ambulatory medical instrumentation.

    PubMed

    Anderson, R; Lyons, G M

    2001-05-01

    This paper reviews the advancements made in ambulatory data logging used in the study of human subjects since the inception of the analogue tape based data logger in the 1960s. Research into the area of ambulatory monitoring has been rejuvenated due to the development of novel storage technologies during the 1990s. Data logging systems that were previously impractical due to lack of processing power, practical size and cost are now available to the practitioner. An overview of the requirements of present day ambulatory data logging is presented and analogue tape, solid-state memory and disk drive storage recording systems that have been described in the literature are investigated in detail. It is proposed that digital based technology offers the best solution to the problems encountered during human based data logging. The appearance of novel digital storage media will continue the trend of increased recording durations, signal resolution and number of parameters thus allowing the momentum gained throughout the last several decades to continue.

  6. Modelling tropical forests response to logging

    NASA Astrophysics Data System (ADS)

    Cazzolla Gatti, Roberto; Di Paola, Arianna; Valentini, Riccardo; Paparella, Francesco

    2013-04-01

    Tropical rainforests are among the ecosystems most threatened by large-scale fragmentation due to human activities such as heavy logging and agricultural clearance, even though they provide crucial ecosystem goods and services, such as sequestering carbon from the atmosphere, protecting watersheds and conserving biodiversity. In several countries forest resource extraction has shifted from clearcutting to selective logging in order to maintain a significant forest cover and stock of living biomass. However, knowledge of the short- and long-term effects of removing selected species from tropical rainforest is scarce and needs to be further investigated. One of the main effects of selective logging on forest dynamics appears to be local disturbance, which involves the invasion of open space by weeds, vines and climbers at the expense of the late-successional cenosis. We present a simple deterministic model that describes the dynamics of tropical rainforest subject to selective logging, in order to understand how and why weeds displace native species. We argue that the selective removal of the tallest tropical trees creates gaps of light that allow weeds, vines and climbers to prevail over native species, inhibiting the recovery of the original vegetation. Our results show that different regime shifts may occur depending on the type of forest management adopted. This hypothesis is supported by a dataset of tree heights and weed/vine cover that we collected from 9 plots located in Central and West Africa, in both untouched and managed areas.

  7. The Design Log: A New Informational Tool

    ERIC Educational Resources Information Center

    Spivak, Mayer

    1978-01-01

    The design log is a record of observations, diagnoses, prescriptions, and performance specifications for each space in a structure. It is a systematic approach to design that integrates information about user needs with traditional architectural programming and design. (Author/MLF)

  8. Predicting reservoir wettability via well logs

    NASA Astrophysics Data System (ADS)

    Feng, Cheng; Fu, Jinhua; Shi, Yujiang; Li, Gaoren; Mao, Zhiqiang

    2016-06-01

    Wettability is an important factor in controlling the distribution of oil and water. However, its evaluation has so far been a difficult problem because no log data can directly indicate it. In this paper, a new method is proposed for quantitatively predicting reservoir wettability via well log analysis. Specifically, based on the J function, diagenetic facies classification and piecewise power functions, capillary pressure curves are constructed from conventional logs and a nuclear magnetic resonance (NMR) log respectively. Under the influence of wettability, the latter is distorted while the former remains unaffected. Therefore, the ratio of the median radii obtained from the two kinds of capillary pressure curves is calculated to reflect wettability, and a quantitative relationship between this ratio and reservoir wettability is then established. Using low-permeability core sample capillary pressure curves, NMR T2 spectra and contact angle experimental data from the bottom of the Upper Triassic reservoirs in the western Ordos Basin, China, the two capillary-pressure-curve construction models and a predictive wettability model are calibrated. The wettability model is verified against the Amott wettability index and the saturation exponent from resistivity measurements; the wettability levels they determine are comparable, indicating that the proposed model is quite reliable. The model also performs well in a field study. Thus, the quantitative reservoir wettability prediction model proposed in this paper provides an effective tool for formation evaluation, field development and the improvement of oil recovery.

  9. The fluid-compensated cement bond log

    SciTech Connect

    Nayfeh, T.H.; Wheelis, W.B. Jr.; Leslie, H.D.

    1986-08-01

    Simulations of cement bond logging (CBL) have shown that wellbore fluid effects can be segregated from sonic-signal response to changing cement strengths. Traditionally, the effects have been considered negligible and the CBLs have been interpreted as if water were in the wellbore. However, large variations in CBLs have become apparent with the increasing number of logs run in completion fluids, such as CaCl2, ZnBr2, and CaBr2. To study wellbore fluid effects, physical and numerical models were developed that simulated the wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings for a range of wellbore fluid types and for both densities and viscosities. Parallel numerical modeling used similar parameters. Results show that bond-log amplitudes varied dramatically with the wellbore fluid acoustic impedance; i.e., there was a 70% increase in signal amplitude for 11.5-lbm/gal (1370-kg/m3) CaCl2 over the signal amplitude in water. This led to the development of a fluid-compensated bond log that corrects the amplitude for the acoustic impedance of various wellbore fluids, thereby making the measurements more directly related to cement quality.

  10. More Efficient Learning on Web Courseware Systems?

    ERIC Educational Resources Information Center

    Zufic, Janko; Kalpic, Damir

    2007-01-01

    The article describes research conducted on students at the University in Pula, which attempted to establish whether there is a relationship between exam success and the type of online teaching material from which a student learns. Students were subjected to psychological testing that measured factors of intelligence: verbal, non-verbal and…

  11. Critical care procedure logging using handheld computers

    PubMed Central

    Carlos Martinez-Motta, J; Walker, Robin; Stewart, Thomas E; Granton, John; Abrahamson, Simon; Lapinsky, Stephen E

    2004-01-01

    Introduction We conducted this study to evaluate the feasibility of implementing an internet-linked handheld computer procedure logging system in a critical care training program. Methods Subspecialty trainees in the Interdepartmental Division of Critical Care at the University of Toronto received and were trained in the use of Palm handheld computers loaded with a customized program for logging critical care procedures. The procedures were entered into the handheld device using checkboxes and drop-down lists, and data were uploaded to a central database via the internet. To evaluate the feasibility of this system, we tracked the utilization of this data collection system. Benefits and disadvantages were assessed through surveys. Results All 11 trainees successfully uploaded data to the central database, but only six (55%) continued to upload data on a regular basis. The most common reason cited for not using the system pertained to initial technical problems with data uploading. From 1 July 2002 to 30 June 2003, a total of 914 procedures were logged. Significant variability was noted in the number of procedures logged by individual trainees (range 13–242). The database generated by regular users provided potentially useful information to the training program director regarding the scope and location of procedural training among the different rotations and hospitals. Conclusion A handheld computer procedure logging system can be effectively used in a critical care training program. However, user acceptance was not uniform, and continued training and support are required to increase user acceptance. Such a procedure database may provide valuable information that may be used to optimize trainees' educational experience and to document clinical training experience for licensing and accreditation. PMID:15469577

  12. Requirements-Driven Log Analysis Extended Abstract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2012-01-01

    Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
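
    To make the idea concrete, here is a minimal Python sketch (not the author's tool) that treats a log as a list of events, each a mapping from field names to values, and checks it against one hypothetical requirement: every "open" of a resource must later be matched by a "close".

        # Minimal sketch: checking a log -- a sequence of events, each a mapping
        # from field names to values -- against a simple open/close requirement.

        def check_open_close(log):
            """Return a list of violation messages for unmatched open/close events."""
            pending = {}                  # resource id -> index of the unmatched open
            errors = []
            for i, event in enumerate(log):
                kind, res = event.get("kind"), event.get("resource")
                if kind == "open":
                    if res in pending:
                        errors.append(f"event {i}: '{res}' opened twice without close")
                    pending[res] = i
                elif kind == "close":
                    if res not in pending:
                        errors.append(f"event {i}: '{res}' closed but never opened")
                    pending.pop(res, None)
            errors.extend(f"event {i}: '{res}' opened but never closed"
                          for res, i in pending.items())
            return errors

        # Example log (hypothetical field names):
        log = [{"kind": "open", "resource": "A"},
               {"kind": "open", "resource": "B"},
               {"kind": "close", "resource": "A"}]
        print(check_open_close(log))      # -> ["event 1: 'B' opened but never closed"]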

  13. Improved production log interpretation in horizontal wells using pulsed neutron logs

    SciTech Connect

    Brady, J.L.; Kohring, J.J.; North, R.J.

    1996-12-31

    Production log flow profiles provide a valuable tool to evaluate well and reservoir performance. Horizontal wellbores and their associated completion designs present several challenges to profile interpretation for conventional production logging sensors and techniques. A unique approach combining pulsed neutron capture (PNC) log data with conventional production logging measurements is providing improved flow profile answers in slotted-liner, horizontal well completions on the North Slope of Alaska. Identifying and eliminating undesirable gas production is one of the chief goals of production logging on the North Slope. This process becomes difficult in horizontal wellbores as fluid segregation affects the area investigated by the various logging sensors and also the velocities of the individual phases. Typical slotted-liner completions further complicate analysis because fluids are able to flow in the liner/openhole annulus. Analysis of PNC log data provides two good qualitative indicators of formation permeability. The first technique is derived from the difference in the formation sigma response before and after injecting a high-capture-cross-section borax solution. The second technique uses the difference between the formation sigma response and the formation porosity measured while injecting the formation with crude or seawater. Further analysis of PNC log runs shows that the two techniques correlate closely with production flow profiles under solution gas-oil ratio (GOR) conditions. These two techniques, in combination with conventional production logging measurements of temperature, capacitance, pressure, and spinner, improve flow profile results. PNC results can be combined with temperature and pressure data in the absence of valid spinner data to provide an approximate flow profile. These techniques have been used to successfully determine profiles in both cemented and slotted-liner completions with GORs in excess of 15,000 scf/bbl.

  14. Seeking Insights About Cycling Mood Disorders via Anonymized Search Logs

    PubMed Central

    White, Ryen W; Horvitz, Eric

    2014-01-01

    Background Mood disorders affect a significant portion of the general population. Cycling mood disorders are characterized by intermittent episodes (or events) of the disease. Objective Using anonymized Web search logs, we identify a population of people with significant interest in mood stabilizing drugs (MSD) and seek evidence of mood swings in this population. Methods We extracted queries to the Microsoft Bing search engine made by 20,046 Web searchers over six months, separately explored searcher demographics using data from a large external panel of users, and sought supporting information from people with mood disorders via a survey. We analyzed changes in information needs over time relative to searches on MSD. Results Queries for MSD focused on side effects and their relation to the disease. We found evidence of significant changes in search behavior and interests coinciding with days that MSD queries are made. These include large increases (>100%) in the access of nutrition information, commercial information, and adult materials. A survey of patients diagnosed with mood disorders provided evidence that repeated queries on MSD may come with exacerbations of mood disorder. A classifier predicting the occurrence of such queries one day before they are observed obtains strong performance (AUC=0.78). Conclusions Observed patterns in search behavior align with known behaviors and those highlighted by survey respondents. These observations suggest that searchers showing intensive interest in MSD may be patients who have been prescribed these drugs. Given behavioral dynamics, we surmise that the days on which MSD queries are made may coincide with commencement of mania or depression. Although we do not have data on mood changes and whether users have been diagnosed with bipolar illness, we see evidence of cycling in people who show interest in MSD and further show that we can predict impending shifts in behavior and interest. PMID:24568936
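
    The prediction task reported above can be pictured with the hedged Python sketch below; the features, synthetic labels and model are illustrative stand-ins, not the study's actual data or classifier, and only the evaluation metric (AUC) matches the abstract.

        # Illustrative sketch only: predict from one day's behavioural counters
        # whether an MSD query occurs the next day, and report AUC.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        n = 5000
        # Hypothetical daily features: nutrition, commercial and adult-content accesses.
        X = rng.poisson(lam=[2.0, 3.0, 0.5], size=(n, 3)).astype(float)
        # Synthetic label loosely tied to the features, for demonstration only.
        p = 1.0 / (1.0 + np.exp(-(0.4 * X[:, 0] + 0.2 * X[:, 2] - 2.0)))
        y = rng.binomial(1, p)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = LogisticRegression().fit(X_tr, y_tr)
        print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))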

  15. Storage Manager and File Transfer Web Services

    SciTech Connect

    William A Watson III; Ying Chen; Jie Chen; Walt Akers

    2002-07-01

    Web services are emerging as an interesting mechanism for a wide range of grid services, particularly those focused upon information services and control. When coupled with efficient data transfer services, they provide a powerful mechanism for building a flexible, open, extensible data grid for science applications. In this paper we present our prototype work on a Java Storage Resource Manager (JSRM) web service and a Java Reliable File Transfer (JRFT) web service. A Java client (Grid File Manager) built on top of JSRM has been developed to demonstrate the capabilities of these web services. The purpose of this work is to show the extent to which SOAP-based web services are an appropriate direction for building a grid-wide data management system, and eventually grid-based portals.

  16. Graphical viewer for displaying locations and logs of selected wells and test holes in Putnam County, New York

    USGS Publications Warehouse

    Wolcott, Stephen W.

    2005-01-01

    Aquifers (water bearing geologic units) are the primary source of drinking water in most of Putnam County, N.Y. The principal sources of data used to define the geometry and hydraulic characteristics of aquifers are the logs of wells and test holes within the county. This report explains how to use a graphical viewer, available on the World Wide Web (http://ny.water.usgs.gov/pubs/of/of051198), to locate selected wells and test holes in Putnam County and display their logs.

  17. Structural basis for cytokinin production by LOG from Corynebacterium glutamicum

    PubMed Central

    Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin

    2016-01-01

    “Lonely guy” (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a “PGGXGTXXE” motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms. PMID:27507425

  18. Structural basis for cytokinin production by LOG from Corynebacterium glutamicum.

    PubMed

    Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin

    2016-08-10

    "Lonely guy" (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a "PGGXGTXXE" motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms.

  19. Does selective logging change ground-dwelling beetle assemblages in a subtropical broad-leafed forest of China?

    PubMed

    Yu, Xiao-Dong; Liu, Chong-Ling; Lü, Liang; Bearer, Scott L; Luo, Tian-Hong; Zhou, Hong-Zhang

    2017-04-01

    Selective logging with natural regeneration is advocated as a near-to-nature strategy and has been implemented in many forested systems during recent decades. However, the efficiency of such practices for the maintenance of forest species is poorly understood. We compared the species richness, abundance and composition of ground-dwelling beetles between selectively logged and unlogged forests to evaluate the possible effects of selective logging in a subtropical broad-leafed forest in southeastern China. Using pitfall traps, beetles were sampled in two naturally regenerating stands after clearcuts (ca. 50 years old, stem-exclusion stage: selectively logged 20 years ago) and two mature stands (> 80 years old, understory re-initiation stage: selectively logged 50 years ago) during 2009 and 2010. Overall, selective logging had no significant effects on total beetle richness and abundance, but the saproxylic species group and some abundant forest species significantly decreased in abundance in selectively logged plots compared with unlogged plots in mature stands. Beetle assemblages showed significant differences between selectively logged and unlogged plots in mature stands. Some environmental characteristics associated with selective logging (e.g., logging strategy, stand age, and cover of shrub and moss layers) were the most important variables explaining beetle assemblage structure. We conclude that selective logging has no significant impact on the overall richness and abundance of ground-dwelling beetles. However, its negative effects on the saproxylic species group and some unlogged-forest specialists highlight the need for large intact forested areas to sustain forest specialist beetles.

  20. Selecting Aquifer Wells for Planned Gyroscopic Logging

    SciTech Connect

    Rohe, Michael James; Studley, Gregory Wayne

    2002-04-01

    Understanding the configuration of the eastern Snake River Plain aquifer's water table is made difficult, in part, by borehole deviation in aquifer wells. A borehole has deviation if it is not vertical or straight. Deviation impairs the analysis of water table elevation measurements because it results in measurements that are greater than the true distance from the top of the well to the water table. Conceptual models of the water table configuration are important to environmental management decision-making at the INEEL; these models are based on measurements of depth to the water table taken from aquifer wells at or near the INEEL. When accurate data on the amount of deviation in any given borehole are acquired, measurements of depth-to-water can be adjusted to reflect the true depth, so that more accurate conceptual models can be developed. Collection of additional borehole deviation data with gyroscopic logging is planned for selected wells to increase confidence in the quality of water level measurements. Selection of wells for the planned logging is based on qualitative and quantitative screening criteria. An existing data set from magnetic deviation logs was useful in establishing these criteria; however, magnetic logs are considered less accurate than gyroscopic deviation logs under certain conditions. Population distributions for 128 aquifer wells with magnetic deviation data were used to establish three quantitative screening thresholds. Qualitative criteria consisted of administrative controls, accessibility issues, and drilling methods. The qualitative criteria eliminated all but 116 of the 337 aquifer wells in the vicinity of the INEEL that were initially examined in this screening effort. Of these, 72 have associated magnetic deviation data; 44 do not. Twenty-five (25) of the 72 wells with magnetic deviation data have deviation greater than one of the three quantitative screening thresholds. These 25 are recommended for the planned gyroscopic borehole deviation logging.

  1. Web-based pathology practice examination usage

    PubMed Central

    Klatt, Edward C.

    2014-01-01

    Context: General and subject-specific practice examinations for students in health sciences studying pathology were placed onto a free public internet web site entitled WebPath and were accessed four clicks from the home web site menu. Subjects and Methods: Multiple choice questions were coded into .html files with JavaScript functions for web browser viewing in a timed format. A Perl script with a common gateway interface for web page forms scored examinations and placed results into a log file on an internet computer server. The four general review examinations of 30 questions each could be completed in up to 30 min. The 17 subject-specific examinations of 10 questions each, with accompanying images, could be completed in up to 15 min each. The scores and user educational field of study recorded in the log files were compiled from June 2006 to January 2014. Results: The four general review examinations had 31,639 accesses with completion of all questions, for a completion rate of 54% and an average score of 75%. A score of 100% was achieved by 7% of users, ≥90% by 21%, and ≥50% by 95% of users. In top-to-bottom web page menu order, review examination usage was 44%, 24%, 17%, and 15% of all accessions. The 17 subject-specific examinations had 103,028 completions, with a completion rate of 73% and an average score of 74%. A score of 100% was achieved by 20% of users overall, ≥90% by 37%, and ≥50% by 90% of users. The first three menu items on the web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three accounted for no more than 2.2% each. Conclusions: Completion rates were higher for the shorter 10-question subject examinations. Users identifying themselves as MD/DO scored higher than other users, averaging 75%. Usage was higher for examinations at the top of the web page menu. The scores achieved suggest that a cohort of serious users fully completing the examinations had sufficient preparation to use them to support their pathology
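
    The compilation step described in the Methods can be pictured with the Python sketch below; the tab-separated log layout (date, exam id, user field, questions answered, questions asked, score) is an assumption made for illustration, not the site's actual Perl/CGI format.

        # Sketch under an assumed log-file layout: one tab-separated line per attempt.
        from collections import defaultdict

        def summarize_exam_log(path):
            stats = defaultdict(lambda: {"attempts": 0, "completed": 0, "score_sum": 0.0})
            with open(path) as fh:
                for line in fh:
                    date, exam, field, answered, asked, score = line.rstrip("\n").split("\t")
                    rec = stats[exam]
                    rec["attempts"] += 1
                    if int(answered) == int(asked):     # attempt counts as completed
                        rec["completed"] += 1
                        rec["score_sum"] += float(score)
            for exam, rec in sorted(stats.items()):
                rate = 100.0 * rec["completed"] / rec["attempts"]
                avg = rec["score_sum"] / max(rec["completed"], 1)
                print(f"{exam}: completion {rate:.0f}%, average score {avg:.0f}%")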

  2. Bringing Control System User Interfaces to the Web

    SciTech Connect

    Chen, Xihui; Kasemir, Kay

    2013-01-01

    With the evolution of web based technologies, especially HTML5 [1], it becomes possible to create web-based control system user interfaces (UI) that are cross-browser and cross-device compatible. This article describes two technologies that facilitate this goal. The first one is the WebOPI [2], which can seamlessly display CSS BOY [3] Operator Interfaces (OPI) in web browsers without modification to the original OPI file. The WebOPI leverages the powerful graphical editing capabilities of BOY and provides the convenience of re-using existing OPI files. On the other hand, it uses generic JavaScript and a generic communication mechanism between the web browser and web server. It is not optimized for a control system, which results in unnecessary network traffic and resource usage. Our second technology is the WebSocket-based Process Data Access (WebPDA) [4]. It is a protocol that provides efficient control system data communication using WebSocket [5], so that users can create web-based control system UIs using standard web page technologies such as HTML, CSS and JavaScript. WebPDA is control system independent, potentially supporting any type of control system.

  3. The X-ray log N-log S relation. [background radiation in extragalactic media

    NASA Technical Reports Server (NTRS)

    Boldt, Elihu

    1989-01-01

    Results from various surveys are reviewed as regards X-ray source counts at high galactic latitudes and the luminosity functions determined for extragalactic sources. Constraints on the associated log N-log S relation provided by the extragalactic X-ray background are emphasized in terms of its spatial fluctuations and spectrum as well as absolute flux level. The large number of sources required for this background suggests that there is not a sharp boundary in the redshift distribution of visible matter.

  4. Web usage mining at an academic health sciences library: an exploratory study

    PubMed Central

    Bracke, Paul J.

    2004-01-01

    Objectives: This paper explores the potential of multinomial logistic regression analysis to perform Web usage mining for an academic health sciences library Website. Methods: Usage of database-driven resource gateway pages was logged for a six-month period, including information about users' network addresses, referring uniform resource locators (URLs), and types of resource accessed. Results: It was found that referring URL did vary significantly by two factors: whether a user was on-campus and what type of resource was accessed. Conclusions: Although the data available for analysis are limited by the nature of the Web and concerns for privacy, this method demonstrates the potential for gaining insight into Web usage that supplements Web log analysis. It can be used to improve the design of static and dynamic Websites today and could be used in the design of more advanced Web systems in the future. PMID:15494757
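
    The following Python sketch illustrates the kind of analysis described, fitting a multinomial logistic regression of referring-URL category on on-campus status and resource type; the categories, synthetic data and effect sizes are invented for illustration and do not reproduce the study's results.

        # Illustrative sketch only: multinomial logistic regression of referrer
        # category on on-campus status and resource type, with synthetic data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 2000
        on_campus = rng.integers(0, 2, n)              # 0 = off-campus, 1 = on-campus
        resource = rng.integers(0, 3, n)               # 0 = database, 1 = e-journal, 2 = catalog

        # Synthetic referrer category: 0 = library home, 1 = search engine, 2 = bookmark.
        logits = np.zeros((n, 3))
        logits[:, 1] += 0.8 * on_campus                # toy effect of being on campus
        logits[:, 2] += 0.5 * (resource == 2)          # toy effect of the catalog pages
        probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
        referrer = np.array([rng.choice(3, p=p) for p in probs])

        # Design matrix: on-campus flag plus dummy variables for resource type.
        X = np.column_stack([on_campus, resource == 1, resource == 2]).astype(float)
        model = LogisticRegression(max_iter=1000).fit(X, referrer)
        print(model.coef_)                             # per-category coefficient estimates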

  5. Convolution effect on TCR log response curve and the correction method for it

    NASA Astrophysics Data System (ADS)

    Chen, Q.; Liu, L. J.; Gao, J.

    2016-09-01

    Through-casing resistivity (TCR) logging has been successfully used in production wells for the dynamic monitoring of oil pools and the distribution of residual oil, but its limited vertical resolution reduces its efficiency in identifying thin beds. The vertical resolution is limited by the distortion of the vertical response of TCR logging, which was studied in this work. It was found that the vertical response curve of TCR logging is the convolution of the true formation resistivity with the convolution (response) function of the TCR logging tool. Due to this convolution effect, the measurement error at thin beds can reach 30% or more, so information about thin beds is very likely to be obscured. The convolution function of the TCR logging tool was obtained in both continuous and discrete forms in this work. Through a modified Lyle-Kalman deconvolution method, the true formation resistivity can be optimally estimated, so this inverse algorithm can correct the error caused by the convolution effect and thereby improve the vertical resolution of the TCR logging tool for the identification of thin beds.
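
    The convolution model itself is easy to reproduce; the Python sketch below distorts a synthetic thin resistive bed with an assumed Gaussian tool response and then recovers an estimate with a plain Tikhonov-regularised inverse, which stands in for (and is not) the modified Lyle-Kalman method used in the paper. All depths, widths and resistivities are invented.

        # Sketch of the convolution model and a simple regularised deconvolution.
        import numpy as np

        depth = np.arange(0, 30, 0.1)                               # metres (assumed sampling)
        true_res = np.where((depth > 12) & (depth < 13), 50.0, 5.0)  # thin resistive bed

        # Assumed Gaussian tool response, roughly 1.5 m wide, normalised to unit area.
        z = np.arange(-3, 3, 0.1)
        g = np.exp(-0.5 * (z / 0.75) ** 2)
        g /= g.sum()

        measured = np.convolve(true_res, g, mode="same")            # the distorted TCR curve

        # Build the convolution matrix and invert with Tikhonov regularisation.
        n = depth.size
        I = np.eye(n)
        G = np.array([np.convolve(I[i], g, mode="same") for i in range(n)]).T
        lam = 1e-2
        estimate = np.linalg.solve(G.T @ G + lam * I, G.T @ measured)
        print("peak of measured:", measured.max(), "peak of estimate:", estimate.max())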

  6. Spider's web inspires fibres for industry

    NASA Astrophysics Data System (ADS)

    Dacey, James

    2010-03-01

    Spiders may not be everybody's idea of natural beauty, but nobody can deny the artistry in the webs that they spin, especially when decorated with water baubles in the morning dew. Inspired by this spectacle, a group of researchers in China has mimicked the structural properties of the spider's web to create a fibre for industry that can manipulate water with the same skill and efficiency, writes James Dacey.

  7. Stochastic theory of log-periodic patterns

    NASA Astrophysics Data System (ADS)

    Canessa, Enrique

    2000-12-01

    We introduce an analytical model based on birth-death clustering processes to help in understanding the empirical log-periodic corrections to power law scaling and the finite-time singularity as reported in several domains including rupture, earthquakes, world population and financial systems. In our stochastic theory log-periodicities are a consequence of transient clusters induced by an entropy-like term that may reflect the amount of co-operative information carried by the state of a large system of different species. The clustering completion rates for the system are assumed to be given by a simple linear death process. The singularity at t0 is derived in terms of birth-death clustering coefficients.

  8. Quantifying logging residue - before the fact

    SciTech Connect

    Bones, J.T.

    1982-06-01

    Tree biomass estimation, which is being integrated into the U.S. Forest Service Renewable Resources Evaluation Program, will give foresters the ability to estimate the amount of logging residues they might expect from harvested treetops and branches and from residual rough, rotten, and small trees before the actual harvest. With planning, and increased demand for such timber products as pulpwood and fuelwood, product recovery could be increased by up to 43% in softwood stands and 99% in hardwoods. Recovery levels affect gross product receipts and site preparation costs. An example of product recovery and residue generation is presented for three harvesting options in Pennsylvania hardwood stands. Under the whole-tree harvesting option, 46% more product was recovered than in single-product harvesting, and logging residue levels were reduced by 58%.

  9. No chiral truncation of quantum log gravity?

    NASA Astrophysics Data System (ADS)

    Andrade, Tomás; Marolf, Donald

    2010-03-01

    At the classical level, chiral gravity may be constructed as a consistent truncation of a larger theory called log gravity by requiring that left-moving charges vanish. In turn, log gravity is the limit of topologically massive gravity (TMG) at a special value of the coupling (the chiral point). We study the situation at the level of linearized quantum fields, focussing on a unitary quantization. While the TMG Hilbert space is continuous at the chiral point, the left-moving Virasoro generators become ill-defined and cannot be used to define a chiral truncation. In a sense, the left-moving asymptotic symmetries are spontaneously broken at the chiral point. In contrast, in a non-unitary quantization of TMG, both the Hilbert space and charges are continuous at the chiral point and define a unitary theory of chiral gravity at the linearized level.

  10. INSPIRE and SPIRES Log File Analysis

    SciTech Connect

    Adams, Cole; /Wheaton Coll. /SLAC

    2012-08-31

    SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.
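
    The log-file scripts described are straightforward to approximate; the Python sketch below assumes Apache combined-log-style lines and a toy frustration heuristic (five or more searches from one host within two minutes), neither of which is taken from the actual project.

        # Sketch only: count searches per host and flag possible "frustration".
        import re
        from collections import defaultdict
        from datetime import datetime

        LINE = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "GET (?P<path>\S+)')

        def analyse(path):
            searches = defaultdict(list)             # host -> list of search timestamps
            with open(path) as fh:
                for line in fh:
                    m = LINE.match(line)
                    if not m or "/search?" not in m["path"]:
                        continue
                    ts = datetime.strptime(m["ts"].split()[0], "%d/%b/%Y:%H:%M:%S")
                    searches[m["host"]].append(ts)

            frustrated = []
            for host, times in searches.items():
                times.sort()
                # Toy heuristic: 5 searches within a 2-minute window.
                if any((times[i + 4] - times[i]).total_seconds() <= 120
                       for i in range(len(times) - 4)):
                    frustrated.append(host)
            print(f"{sum(map(len, searches.values()))} searches from {len(searches)} hosts,")
            print(f"{len(frustrated)} hosts show possible frustration")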

  11. Identifying orthoimages in Web Map Services

    NASA Astrophysics Data System (ADS)

    Florczyk, A. J.; Nogueras-Iso, J.; Zarazaga-Soria, F. J.; Béjar, R.

    2012-10-01

    Orthoimages are essential in many Web applications to facilitate the background context that helps to understand other georeferenced information. Catalogues and service registries of Spatial Data Infrastructures do not necessarily register all the services providing access to imagery data on the Web, and it is not easy to automatically identify whether the data offered by a Web service are directly imagery data or not. This work presents a method for an automatic detection of the orthoimage layers offered by Web Map Services. The method combines two types of heuristics. The first one consists in analysing the text in the capabilities document. The second type is content-based heuristics, which analyse the content offered by the Web Map Service layers. These heuristics gather and analyse the colour features of a sample collection of image fragments that represent the offered content. An experiment has been performed over a set of Web Map Service layers, which have been fetched from a repository of capabilities documents gathered from the Web. This has proven the efficiency of the method (precision of 87% and recall of 60%). This functionality has been offered as a Web Processing Service, and it has been integrated within the Virtual Spain project to provide a catalogue of orthoimages and build realistic 3D views.
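
    As an illustration of the two heuristic families, the Python sketch below combines a keyword test on layer metadata with a colour-statistics test on one sampled tile; the keyword list, the distinct-colour threshold and the use of requests/Pillow are assumptions, not the paper's implementation.

        # Sketch of the two heuristic families only (keywords + colour statistics).
        import io
        import numpy as np
        import requests
        from PIL import Image

        KEYWORDS = ("ortho", "orthoimage", "orthophoto", "aerial", "imagery", "satellite")

        def text_heuristic(layer_title, layer_abstract=""):
            text = f"{layer_title} {layer_abstract}".lower()
            return any(k in text for k in KEYWORDS)

        def colour_heuristic(getmap_url, min_distinct=1000):
            """Fetch one sample tile and test whether it looks photographic
            (many distinct colours) rather than a rendered vector map (few)."""
            raw = requests.get(getmap_url, timeout=30).content
            img = Image.open(io.BytesIO(raw)).convert("RGB")
            pixels = np.asarray(img).reshape(-1, 3)
            return len(np.unique(pixels, axis=0)) >= min_distinct

        def looks_like_orthoimagery(layer_title, getmap_url):
            return text_heuristic(layer_title) or colour_heuristic(getmap_url)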

  12. Guide to the Internet. Logging in, fetching files, reading news.

    PubMed Central

    Pallen, M.

    1995-01-01

    Aside from email and the world wide web, there are several other systems for distributing information on the Internet. Telnet is a system that allows you to log on to a remote computer from anywhere on the Internet and affords access to many useful biomedical sites. File transfer protocol (FTP) is a method of transferring files from one computer to another over the Internet. It can be used to download files, including software, from numerous publicly accessible "anonymous FTP archives" around the world. Such archives can be searched using a tool known as Archie. Network News is a system of electronic discussion groups covering almost every imaginable subject, including many areas of medicine and the biomedical sciences; MOOs are virtual environments that allow real-time electronic conferencing and teaching over the Internet. It is difficult to predict the future of medicine on the Internet. However, the net opens up many possibilities not available through previous technologies. It is now up to medical practitioners to realise the Internet's full potential. PMID:8555810

  13. Quantitative Literacy: Working with Log Graphs

    NASA Astrophysics Data System (ADS)

    Shawl, S.

    2013-04-01

    The need to work with and understand different types of graphs is a common occurrence in everyday life. Examples include anything having to do with investments, serving as an educated juror in a case that involves evidence presented graphically, and understanding many aspects of our current political discourse. Within a science class, graphs play a crucial role in presenting and interpreting data. In astronomy, where the range of graphed values spans many orders of magnitude, log axes must be used and understood. Experience shows that students do not understand how to read and interpret log axes or how they differ from linear axes. Alters (1996), in a study of college students in an algebra-based physics class, found little understanding of log plotting. The purpose of this poster is to show the method and progression I have developed for use in my “ASTRO 101” class, with the goal of helping students better understand the H-R diagram, the mass-luminosity relationship, and digital spectra.
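
    A classroom-style example of the progression from linear to log axes is sketched below in Python with matplotlib; the data are an invented mass-luminosity-like power law, chosen only because a power law becomes a straight line on log-log axes.

        # Classroom-style illustration (invented data): the same relation on
        # linear and on log-log axes.
        import numpy as np
        import matplotlib.pyplot as plt

        mass = np.linspace(0.1, 40, 200)          # solar masses
        lum = mass ** 3.5                         # rough main-sequence-like scaling

        fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3.5))
        ax1.plot(mass, lum)
        ax1.set(title="Linear axes", xlabel="Mass (Msun)", ylabel="Luminosity (Lsun)")

        ax2.loglog(mass, lum)                     # the power law becomes a straight line
        ax2.set(title="Log-log axes", xlabel="Mass (Msun)", ylabel="Luminosity (Lsun)")

        fig.tight_layout()
        plt.show()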

  14. Precision pressure/temperature logging tool

    SciTech Connect

    Henfling, J.A.; Normann, R.A.

    1998-01-01

    Past memory logging tools have provided excellent pressure/temperature data when used in a geothermal environment, and they are easier to maintain and deploy than tools requiring an electric wireline connection to the surface. However, they are deficient since the tool operator is unaware of downhole conditions that could require changes in the logging program. Tools that make "decisions" based on preprogrammed scenarios can partially overcome this difficulty, and a suite of such memory tools has been developed at Sandia National Laboratories. The first tool, which forms the basis for future instruments, measures pressure and temperature. Design considerations include a minimization of cost while insuring quality data, size compatibility with diamond-cored holes, operation in holes to 425 C (800 F), transportability by ordinary passenger air service, and ease of operation. This report documents the development and construction of the pressure/temperature tool. It includes: (1) description of the major components; (2) calibration; (3) typical logging scenario; (4) tool data examples; and (5) conclusions. The mechanical and electrical drawings, along with the tool's software, will be furnished upon request.

  15. WebTag: Web Browsing into Sensor Tags over NFC

    PubMed Central

    Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Álvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio

    2012-01-01

    Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as for example the design of smarter embedded processors, the improvement of the network architectures, the development of efficient communication protocols or the maximization of the life cycle autonomy. This work tries to improve the communication link of the data transmission in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked with the proprietary protocols used for the wireless link (like ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for a direct IP based access to a sensor tag over the Near Field Communication (NFC) technology for secure applications. WebTag allows a direct web access to the sensor tag by means of a standard web browser, it reads the sensor data, configures the sampling rate and implements IP based security policies. It is, definitely, a new step towards the evolution of the Internet of Things paradigm. PMID:23012511

  16. WebTag: Web browsing into sensor tags over NFC.

    PubMed

    Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Alvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio

    2012-01-01

    Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as the design of smarter embedded processors, improved network architectures, efficient communication protocols, and maximized life-cycle autonomy. This work aims to improve the data-transmission link in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked by the proprietary protocols used for the wireless link (such as ZigBee, Bluetooth, or RFID). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser: it reads the sensor data, configures the sampling rate, and implements IP-based security policies. It is, definitely, a new step towards the evolution of the Internet of Things paradigm.

  17. JPL web team

    NASA Technical Reports Server (NTRS)

    Bickler, D. B.

    1986-01-01

    Activities of the Jet Propulsion Laboratory (JPL) Web Team are reported; they were directed toward identifying and attacking problem areas in the growth of dendritic web ribbon, to complement the program at Westinghouse Electric Corp.

  18. Trilinos Web Interface Package

    SciTech Connect

    Hu, Jonathan; Phenow, Michael N.; Sala, Marzio; Tuminaro, Ray S.

    2006-09-01

    WebTrilinos is a scientific portal, a web-based environment for using several Trilinos packages through the web. If you are teaching sparse linear algebra, you can use WebTrilinos to present code snippets and simple scripts and let students execute them from their browsers. If you want to test linear algebra solvers, you can use the MatrixPortal module: simply select problems and options, then plot the results in graphs.

  19. Laser scanning measurements on trees for logging harvesting operations.

    PubMed

    Zheng, Yili; Liu, Jinhao; Wang, Dian; Yang, Ruixi

    2012-01-01

    Logging harvesters represent a set of high-performance modern forestry machines that can carry out a series of continuous operations such as felling, delimbing, peeling, bucking and so forth with human intervention. It is found by experiment that during alignment of the harvesting head to capture the trunk, the operator needs a great deal of observation, judgment, and repeated operation, which leads to time and fuel losses. In order to improve operating efficiency and reduce operating costs, point clouds of standing trees are collected with a low-cost 2D laser scanner. A cluster-extraction algorithm and a filtering algorithm are used to classify each trunk from the point cloud. On the assumption that every cross section of the target trunk is approximately a standard circle, and combining information from an Attitude and Heading Reference System, the radii and center locations of the trunks in the scanning range are calculated by the Fletcher-Reeves conjugate gradient algorithm. The method is validated through experiments in an aspen forest, and the optimized calculation time is compared with that of previous work by other researchers. Moreover, the use of the calculated results for automatic capture of trunks by the harvesting head during the logging operation is discussed in particular.

  20. Laser Scanning Measurements on Trees for Logging Harvesting Operations

    PubMed Central

    Zheng, Yili; Liu, Jinhao; Wang, Dian; Yang, Ruixi

    2012-01-01

    Logging harvesters represent a set of high-performance modern forestry machines that can carry out a series of continuous operations such as felling, delimbing, peeling, bucking and so forth with human intervention. It is found by experiment that during alignment of the harvesting head to capture the trunk, the operator needs a great deal of observation, judgment, and repeated operation, which leads to time and fuel losses. In order to improve operating efficiency and reduce operating costs, point clouds of standing trees are collected with a low-cost 2D laser scanner. A cluster-extraction algorithm and a filtering algorithm are used to classify each trunk from the point cloud. On the assumption that every cross section of the target trunk is approximately a standard circle, and combining information from an Attitude and Heading Reference System, the radii and center locations of the trunks in the scanning range are calculated by the Fletcher-Reeves conjugate gradient algorithm. The method is validated through experiments in an aspen forest, and the optimized calculation time is compared with that of previous work by other researchers. Moreover, the use of the calculated results for automatic capture of trunks by the harvesting head during the logging operation is discussed in particular. PMID:23012543
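
    The following Python sketch illustrates the kind of circle fit the abstract describes: squared radial residuals of synthetic scanner points are minimized with SciPy's nonlinear conjugate-gradient method, which stands in for the authors' Fletcher-Reeves implementation. The trunk geometry, noise level, and cost function are assumptions for demonstration only.

    # Minimal sketch of fitting a circle (trunk cross-section) to 2D laser points
    # by minimizing squared radial residuals with a nonlinear conjugate-gradient
    # method. SciPy's "CG" minimizer stands in for the Fletcher-Reeves solver
    # named in the abstract; the data below are synthetic.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    true_center, true_radius = np.array([2.0, -1.0]), 0.15   # assumed trunk geometry (m)
    angles = rng.uniform(0, np.pi, 60)                        # scanner sees roughly half the trunk
    points = true_center + true_radius * np.column_stack([np.cos(angles), np.sin(angles)])
    points += rng.normal(scale=0.005, size=points.shape)      # scanner noise

    def cost(params):
        cx, cy, r = params
        dist = np.hypot(points[:, 0] - cx, points[:, 1] - cy)
        return np.sum((dist - r) ** 2)

    x0 = np.array([points[:, 0].mean(), points[:, 1].mean(), 0.1])  # crude initial guess
    result = minimize(cost, x0, method="CG")
    print("center =", result.x[:2], "radius =", result.x[2])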

  1. Log-periodic route to fractal functions.

    PubMed

    Gluzman, S; Sornette, D

    2002-03-01

    Log-periodic oscillations have been found to decorate the usual power-law behavior that describes the approach to a critical point when the continuous scale-invariance symmetry is partially broken into a discrete-scale invariance symmetry. For Ising or Potts spins with ferromagnetic interactions on hierarchical systems, the relative magnitude of the log-periodic corrections is usually very small, of order 10^-5. In growth processes [diffusion limited aggregation (DLA)], rupture, earthquakes, and financial crashes, log-periodic oscillations with amplitudes of the order of 10% have been reported. We suggest a "technical" explanation for this four-order-of-magnitude difference based on the property of the "regular function" g(x), which embodies the effect of the microscopic degrees of freedom summed over in a renormalization group (RG) approach F(x) = g(x) + (1/μ) F(γx) for an observable F as a function of a control parameter x. For systems for which the RG equation has not been derived, this equation can be understood as a Jackson q-integral, the natural tool for describing discrete-scale invariance. We classify the "Weierstrass-type" solutions of the RG into two classes characterized by the amplitudes A(n) of the power-law series expansion. These two classes are separated by a novel "critical" point. Growth processes (DLA), rupture, earthquakes, and financial crashes thus seem to be characterized by oscillatory or bounded regular microscopic functions that lead to a slow power-law decay of A(n), giving strong log-periodic amplitudes. If, in addition, the phases of A(n) are ergodic and mixing, the observable presents self-affine nondifferentiable properties. In contrast, the regular function of statistical physics models with "ferromagnetic"-type interactions at equilibrium involves unbounded logarithms of polynomials of the control variable that lead to a fast exponential decay of A(n), giving weak log-periodic amplitudes and smoothed observables.
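
    For readers unfamiliar with the mechanism, the renormalization-group relation quoted above can be iterated to make the origin of log-periodicity explicit; the following LaTeX fragment records the standard steps and is not taken from the paper.

    % Iterating F(x) = g(x) + mu^{-1} F(gamma x) (standard steps, not from the paper):
    \[
      F(x) \;=\; g(x) + \mu^{-1} F(\gamma x)
           \;=\; \sum_{n=0}^{\infty} \mu^{-n}\, g(\gamma^{n} x).
    \]
    % A scaling solution of the homogeneous part, F(x) = mu^{-1} F(gamma x), behaves as
    \[
      F(x) \;\sim\; x^{m}\, P\!\left(\frac{\ln x}{\ln \gamma}\right),
      \qquad m = \frac{\ln \mu}{\ln \gamma},
    \]
    % where P is periodic with period 1: a power law decorated by oscillations that are
    % periodic in ln x, i.e., log-periodic.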

  2. Design of a Web-tool for diagnostic clinical trials handling medical imaging research.

    PubMed

    Baltasar Sánchez, Alicia; González-Sistal, Angel

    2011-04-01

    New clinical studies in medicine are based on patients and controls using different imaging diagnostic modalities. Medical information systems are not designed for clinical trials employing clinical imaging. Although commercial software and communication systems focus on storage of image data, they are not suitable for storage and mining of new types of quantitative data. We sought to design a Web-tool to support diagnostic clinical trials involving different experts and hospitals or research centres. The image analysis of this project is based on skeletal X-ray imaging. It involves a computerised image method using quantitative analysis of regions of interest in healthy bone and skeletal metastases. The database is implemented with ASP.NET 3.5 and C# technologies for our Web-based application. For data storage, we chose MySQL v.5.0, one of the most popular open source databases. User logins were necessary, and access to patient data was logged for auditing. For security, all data transmissions were carried over encrypted connections. This Web-tool is available to users scattered at different locations; it allows an efficient organisation and storage of data (case report form) and images and allows each user to know precisely what his task is. The advantages of our Web-tool are as follows: (1) sustainability is guaranteed; (2) network locations for collection of data are secured; (3) all clinical information is stored together with the original images and the results derived from processed images and statistical analysis that enable us to perform retrospective studies; (4) changes are easily incorporated because of the modular architecture; and (5) assessment of trial data collected at different sites is centralised to reduce statistical variance.

  3. Multimedia Web Searching Trends.

    ERIC Educational Resources Information Center

    Ozmutlu, Seda; Spink, Amanda; Ozmutlu, H. Cenk

    2002-01-01

    Examines and compares multimedia Web searching by Excite and FAST search engine users in 2001. Highlights include audio and video queries; time spent on searches; terms per query; ranking of the most frequently used terms; and differences in Web search behaviors of U.S. and European Web users. (Author/LRW)

  4. Evaluating Web Usability

    ERIC Educational Resources Information Center

    Snider, Jean; Martin, Florence

    2012-01-01

    Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…

  5. Commercial Web Site Links.

    ERIC Educational Resources Information Center

    Thelwall, Mike

    2001-01-01

    Discusses business use of the Web and related search engine design issues as well as research on general and academic links before reporting on a survey of the links published by a collection of business Web sites. Results indicate around 66% of Web sites do carry external links, most of which are targeted at a specific purpose, but about 17%…

  6. Implementing Good Web Style.

    ERIC Educational Resources Information Center

    Plankis, Brian J.

    1998-01-01

    Provides an overview of Web-site design and discusses three steps in building a site: audience analysis, design, and evaluation. Includes an analysis of loading speeds with and without graphics; examples of no-style, low-bandwidth, and high-bandwidth Web sites; and addresses for related Web sites. (PEN)

  7. WWW: Neuroscience Web Sites

    ERIC Educational Resources Information Center

    Liu, Dennis

    2006-01-01

    The human brain contains an estimated 100 billion neurons, and browsing the Web, one might be led to believe that there's a Web site for every one of those cells. It's no surprise that there are lots of Web sites concerning the nervous system. After all, the human brain is toward the top of nearly everyone's list of favorite organs and of…

  8. Web presence of an integrated delivery system at year one: lessons learned.

    PubMed

    Ong, Kenneth R; Kingham, Bernadette; Sotiridy, Kate; Kaufman, David; Polkowski, Michelle; Schofield, John

    2003-04-01

    The log analysis of a web site can be used to guide performance improvement. Log analysis identifies which resources on a website are often accessed and those that are not. A log analysis can provide cost justification for a corporate web presence. We describe the log analysis of a large integrated delivery system in New York City from its launch May 1, 2001, through April 30, 2002. During this first year there were 428753 sessions with 1322153 page views. An analysis of page views, exclusive of the default home page, revealed that the pages most frequently visited were related to job opportunities (22.4%, 120086 of 428753), general information ('About') (8.5%, 45355), St. Vincent's Hospital Manhattan (8.3%, 44630), graduate medical education and allied health (8.1%, 43317), facilities (7.2%, 38677), web site search (6.4%, 34322), and physician finder (4.3%, 22910). The web site facilitated 1,980 online job applications. Health information received 15356 page views from 1793 visits. The re-design of the site for year 2 promoted the more frequently selected links, flattened the navigational architecture, and enabled access to all web pages within two to three mouse clicks of the home page. Lessons learned in the process of developing and maintaining a health care organization web site are shared in the case study.

  9. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and the reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

  10. Thresholds of logging intensity to maintain tropical forest biodiversity.

    PubMed

    Burivalova, Zuzana; Sekercioğlu, Cağan Hakkı; Koh, Lian Pin

    2014-08-18

    Primary tropical forests are lost at an alarming rate, and much of the remaining forest is being degraded by selective logging. Yet, the impacts of logging on biodiversity remain poorly understood, in part due to the seemingly conflicting findings of case studies: about as many studies have reported increases in biodiversity after selective logging as have reported decreases. Consequently, meta-analytical studies that treat selective logging as a uniform land use tend to conclude that logging has negligible effects on biodiversity. However, selectively logged forests might not all be the same. Through a pantropical meta-analysis and using an information-theoretic approach, we compared and tested alternative hypotheses for key predictors of the richness of tropical forest fauna in logged forest. We found that the species richness of invertebrates, amphibians, and mammals decreases as logging intensity increases and that this effect varies with taxonomic group and continental location. In particular, mammals and amphibians would suffer a halving of species richness at logging intensities of 38 m(3) ha(-1) and 63 m(3) ha(-1), respectively. Birds exhibit an opposing trend as their total species richness increases with logging intensity. An analysis of forest bird species, however, suggests that this pattern is largely due to an influx of habitat generalists into heavily logged areas while forest specialist species decline. Our study provides a quantitative analysis of the nuanced responses of species along a gradient of logging intensity, which could help inform evidence-based sustainable logging practices from the perspective of biodiversity conservation.

  11. Designing an Illustrated Food Web to Teach Ecological Concepts: Challenges and Solutions.

    ERIC Educational Resources Information Center

    Godkin, Celia M.

    1999-01-01

    Argues that food webs are an efficient method through which to communicate the core idea of ecology--that all living things are interconnected. Assesses the challenges and solutions to using illustrated food webs. (Author/CCM)

  12. 8. William E. Barrett, Photographer, August 1975. LOG DOCK AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. William E. Barrett, Photographer, August 1975. LOG DOCK AND PARTIALLY DEMOLISHED JACKSLIP USED FOR HAULING LOGS UP TO SAWMILL. - Meadow River Lumber Company, Highway 60, Rainelle, Greenbrier County, WV

  13. 2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR

  14. 5. Log draft horse barn. Detail of west side showing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Log draft horse barn. Detail of west side showing Dutch door and square notching at wall corner. View to east. - William & Lucina Bowe Ranch, Log Draft Horse Barn, 290 feet southwest of House, Melrose, Silver Bow County, MT

  15. 6. SIDE ELEVATION, DETAIL SHOWING ORIGINAL LOG CONSTRUCTION, CLAPBOARD ADDITION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. SIDE ELEVATION, DETAIL SHOWING ORIGINAL LOG CONSTRUCTION, CLAPBOARD ADDITION AND CHIMNEY STACK - Shinn-Curtis Log Cabin, 23 Washington Street (moved from Rancocas Boulevard), Mount Holly, Burlington County, NJ

  16. 8. Double crib barn, south corner, log section, loft area, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Double crib barn, south corner, log section, loft area, detail of log construction - Wilkins Farm, Barn, South side of Dove Hollow Road, 6000 feet east of State Route 259, Lost City, Hardy County, WV

  17. 3. MAIN ELEVATION, DETAIL SHOWING HEWN LOGS WITH HALFDOVETAIL JOINTS; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. MAIN ELEVATION, DETAIL SHOWING HEWN LOGS WITH HALF-DOVETAIL JOINTS; LATHE AND PLASTER ADDITION; AND CLAPBOARD SIDING - Shinn-Curtis Log Cabin, 23 Washington Street (moved from Rancocas Boulevard), Mount Holly, Burlington County, NJ

  18. 4. Exterior, detail south elevation, showing jointing of logs on ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Exterior, detail south elevation, showing jointing of logs on later extension. Sept. 12, 1940. Mixon. - Upper Swedish Log Cabin, Darby Creek vicinity, Clifton Heights (Upper Darby Township), Darby, Delaware County, PA

  19. 35. SOUTHWEST CORNER OF EAST CHIMNEY BASE SHOWING CONTINUOUS LOG ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    35. SOUTHWEST CORNER OF EAST CHIMNEY BASE SHOWING CONTINUOUS LOG FOUNDATION OVER VAULT AND THE WEST CRIBBING LOG WITH STONE FILL ON THE EAST. - Penacook House, Daniel Webster Highway (U.S. Route 3), Boscawen, Merrimack County, NH

  20. Drying of fiber webs

    DOEpatents

    Warren, David W.

    1997-01-01

    A process and an apparatus for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the uses of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture, which is evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air to allow heat recovery at significantly higher temperature levels than attainable in evaporative dryers.

  1. Drying of fiber webs

    DOEpatents

    Warren, D.W.

    1997-04-15

    A process and an apparatus are disclosed for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the uses of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture, which is evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air to allow heat recovery at significantly higher temperature levels than attainable in evaporative dryers. 6 figs.

  2. Baum's Algorithm Learns Intersections of Halfspaces with Respect to Log-Concave Distributions

    NASA Astrophysics Data System (ADS)

    Klivans, Adam R.; Long, Philip M.; Tang, Alex K.

    In 1990, E. Baum gave an elegant polynomial-time algorithm for learning the intersection of two origin-centered halfspaces with respect to any symmetric distribution (i.e., any distribution D such that D(E) = D(-E)) [3]. Here we prove that his algorithm also succeeds with respect to any mean-zero distribution with a log-concave density (a broad class of distributions that need not be symmetric). As far as we are aware, prior to this work it was not known how to efficiently learn any class of intersections of halfspaces with respect to log-concave distributions.
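
    For reference, the distributional assumption named above can be stated explicitly; the following LaTeX fragment gives the standard definition of a log-concave density and is not taken from the paper.

    % Standard definition (not from the paper): a density f on R^n is log-concave if
    \[
      f\bigl(\lambda x + (1-\lambda) y\bigr) \;\ge\; f(x)^{\lambda}\, f(y)^{1-\lambda}
      \qquad \text{for all } x, y \in \mathbb{R}^{n},\ \lambda \in [0,1],
    \]
    % equivalently, \log f is concave. Gaussians and uniform densities on convex
    % bodies are log-concave, and such densities need not be symmetric.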

  3. Well log evaluation of natural gas hydrates

    SciTech Connect

    Collett, T.S.

    1992-10-01

    Gas hydrates are crystalline substances composed of water and gas, in which a solid-water-lattice accommodates gas molecules in a cage-like structure. Gas hydrates are globally widespread in permafrost regions and beneath the sea in sediment of outer continental margins. While methane, propane, and other gases can be included in the clathrate structure, methane hydrates appear to be the most common in nature. The amount of methane sequestered in gas hydrates is probably enormous, but estimates are speculative and range over three orders of magnitude from about 100,000 to 270,000,000 trillion cubic feet. The amount of gas in the hydrate reservoirs of the world greatly exceeds the volume of known conventional gas reserves. Gas hydrates also represent a significant drilling and production hazard. A fundamental question linking gas hydrate resource and hazard issues is: What is the volume of gas hydrates and included gas within a given gas hydrate occurrence? Most published gas hydrate resource estimates have, of necessity, been made by broad extrapolation of only general knowledge of local geologic conditions. Gas volumes that may be attributed to gas hydrates are dependent on a number of reservoir parameters, including the areal extent of the gas-hydrate occurrence, reservoir thickness, hydrate number, reservoir porosity, and the degree of gas-hydrate saturation. Two of the most difficult reservoir parameters to determine are porosity and degree of gas hydrate saturation. Well logs often serve as a source of porosity and hydrocarbon saturation data; however, well-log calculations within gas-hydrate-bearing intervals are subject to error. The primary reason for this difficulty is the lack of quantitative laboratory and field studies. The primary purpose of this paper is to review the response of well logs to the presence of gas hydrates.

  4. Well log evaluation of natural gas hydrates

    SciTech Connect

    Collett, T.S.

    1992-10-01

    Gas hydrates are crystalline substances composed of water and gas, in which a solid-water-lattice accommodates gas molecules in a cage-like structure. Gas hydrates are globally widespread in permafrost regions and beneath the sea in sediment of outer continental margins. While methane, propane, and other gases can be included in the clathrate structure, methane hydrates appear to be the most common in nature. The amount of methane sequestered in gas hydrates is probably enormous, but estimates are speculative and range over three orders of magnitude from about 100,000 to 270,000,000 trillion cubic feet. The amount of gas in the hydrate reservoirs of the world greatly exceeds the volume of known conventional gas reserves. Gas hydrates also represent a significant drilling and production hazard. A fundamental question linking gas hydrate resource and hazard issues is: What is the volume of gas hydrates and included gas within a given gas hydrate occurrence? Most published gas hydrate resource estimates have, of necessity, been made by broad extrapolation of only general knowledge of local geologic conditions. Gas volumes that may be attributed to gas hydrates are dependent on a number of reservoir parameters, including the areal extent of the gas-hydrate occurrence, reservoir thickness, hydrate number, reservoir porosity, and the degree of gas-hydrate saturation. Two of the most difficult reservoir parameters to determine are porosity and degree of gas hydrate saturation. Well logs often serve as a source of porosity and hydrocarbon saturation data; however, well-log calculations within gas-hydrate-bearing intervals are subject to error. The primary reason for this difficulty is the lack of quantitative laboratory and field studies. The primary purpose of this paper is to review the response of well logs to the presence of gas hydrates.
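
    A minimal Python sketch of the volumetric bookkeeping implied by the reservoir parameters listed above follows; every numeric value, including the hydrate-to-gas expansion factor, is an illustrative assumption rather than a figure from the paper.

    # Minimal sketch of a volumetric estimate built from the reservoir parameters
    # listed in the abstract. All numeric values, and the hydrate-to-gas expansion
    # factor, are illustrative assumptions, not figures from the paper.
    def hydrate_gas_in_place(area_m2, thickness_m, porosity, hydrate_saturation,
                             expansion_factor=164.0):
        """Free-gas equivalent (standard m^3) of the gas held in hydrate.

        expansion_factor ~164 is the commonly quoted volume of methane gas
        (at standard conditions) released per volume of methane hydrate.
        """
        hydrate_volume = area_m2 * thickness_m * porosity * hydrate_saturation
        return hydrate_volume * expansion_factor

    # Hypothetical occurrence: 10 km^2, 20 m thick, 35% porosity, 60% hydrate saturation
    print(f"{hydrate_gas_in_place(10e6, 20.0, 0.35, 0.60):.3e} m^3 of gas")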

  5. Apparatus for focused electrode induced polarization logging

    SciTech Connect

    Vinegar, H.J.; Waxman, M.H.

    1986-04-15

    An induced polarization logging tool is described for measuring parameters of a formation surrounding a borehole. The logging tool consists of: a non-conductive logging sonde; a plurality of electrodes disposed on the sonde, the electrodes including at least a survey current electrode and guard electrodes disposed on opposite sides of the survey current electrode, a non-polarizing voltage measuring electrode, a non-polarizing voltage reference electrode and a current return electrode, both the voltage reference and current return electrodes being located a greater distance from the survey current electrode than the guard electrodes; means connected to the survey current electrode and the guard electrodes for generating a signal representative of the potential difference in the formation between the survey current electrode and the guard electrodes; first control means directly coupled to the survey current electrode, the first control means controlling the current flow to the survey current electrode in response to the potential difference signal; a second control means directly coupled to the guard electrodes to control the current flow to the guard electrodes in response to the potential difference signal; a source of alternating current located at the surface, one end of the source being coupled to the two control means and the other to the current return electrode, the source supplying alternating current at various discrete frequencies between substantially 0.01 and 100 Hz; measurement means directly coupled to the voltage measurement and survey current electrodes to measure the amplitude and phase of the voltage induced in the formation and the amplitude and phase of the current flow to the survey electrode; and transmission means for transmitting the measurements to the surface.

  6. Automatic log spectrum restoration of atmospheric seeing

    NASA Astrophysics Data System (ADS)

    Navarro, R.; Gomez, R.; Santamaria, J.

    1987-03-01

    This paper presents an automatic method for (1) digital estimation of the width of the atmospheric seeing in astronomical images of extended objects and (2) image restoration using the constrained Jansson-Van Cittert deconvolution algorithm. The seeing is estimated by computing the radial profile of the averaged log spectrum of the image. The result of this estimation is then used to compute the Point Spread Function (PSF) used in the deconvolution process. The method is applied to a photographic image of a sunspot. The quality of the restoration demonstrates the power and usefulness of the method.
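
    The sketch below shows one common form of the constrained Jansson-Van Cittert iteration in one dimension, in the spirit of the method named above; the Gaussian PSF, bounds, and relaxation constant are assumptions for demonstration, not values from the paper.

    # Sketch of a constrained Jansson-Van Cittert deconvolution in 1D.
    # The Gaussian PSF, bounds, and relaxation constant are illustrative assumptions.
    import numpy as np

    def gaussian_psf(width_pixels, size=25):
        x = np.arange(size) - size // 2
        psf = np.exp(-0.5 * (x / width_pixels) ** 2)
        return psf / psf.sum()

    def jansson_van_cittert(observed, psf, n_iter=100, lo=0.0, hi=1.0, r0=1.0):
        """Iterate f <- f + r(f) * (g - psf * f), with r(f) suppressing the
        correction near the physical bounds [lo, hi] (Jansson's relaxation)."""
        estimate = observed.copy()
        for _ in range(n_iter):
            reblurred = np.convolve(estimate, psf, mode="same")
            relax = r0 * (1.0 - 2.0 * np.abs(estimate - (lo + hi) / 2.0) / (hi - lo))
            estimate = np.clip(estimate + relax * (observed - reblurred), lo, hi)
        return estimate

    # Toy example: a blurred double "sunspot" profile
    truth = np.zeros(200)
    truth[80:90] = 1.0
    truth[120:125] = 0.7
    psf = gaussian_psf(width_pixels=4.0)   # width would come from the seeing estimate
    blurred = np.convolve(truth, psf, mode="same")
    restored = jansson_van_cittert(blurred, psf)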

  7. Acoustic Logging Modeling by Refined Biot's Equations

    NASA Astrophysics Data System (ADS)

    Plyushchenkov, Boris D.; Turchaninov, Victor I.

    An explicit, uniform, completely conservative finite difference scheme for the refined Biot's equations is proposed. The system is modified according to the modern theory of dynamic permeability and tortuosity in a fluid-saturated elastic porous medium. Approximate local boundary transparency conditions are constructed. The acoustic logging device is simulated by the choice of appropriate boundary conditions on its external surface. This scheme and these conditions are suitable for exploring borehole acoustic problems in permeable formations in a realistic axially symmetric situation. The developed approach can also be adapted to the nonsymmetric case.

  8. Log amplifier with pole-zero compensation

    DOEpatents

    Brookshier, William

    1987-01-01

    A logarithmic amplifier circuit provides pole-zero compensation for improved stability and response time over 6-8 decades of input signal frequency. The amplifier circuit includes a first operational amplifier with a first feedback loop which includes a second, inverting operational amplifier in a second feedback loop. The compensated output signal is provided by the second operational amplifier with the log elements, i.e., resistors, and the compensating capacitors in each of the feedback loops having equal values so that each break point or pole is offset by a compensating break point or zero.

  9. Calibration Tests of a German Log Rodmeter

    NASA Technical Reports Server (NTRS)

    Mottard, Elmo J.; Stillman, Everette R.

    1949-01-01

    A German log rodmeter of the pitot static type was calibrated in Langley tank no. 1 at speeds up to 34 knots and angles of yaw from 0 deg to plus or minus 10 3/4 degrees. The dynamic head approximated the theoretical head at 0 degrees yaw but decreased as the yaw was increased. The static head was negative and in general became more negative with increasing speed and yaw. Cavitation occurred at speeds above 31 knots at 0 deg yaw and 21 knots at 10 3/4 deg yaw.

  10. Log amplifier with pole-zero compensation

    DOEpatents

    Brookshier, W.

    1985-02-08

    A logarithmic amplifier circuit provides pole-zero compensation for improved stability and response time over 6-8 decades of input signal frequency. The amplifier circuit includes a first operational amplifier with a first feedback loop which includes a second, inverting operational amplifier in a second feedback loop. The compensated output signal is provided by the second operational amplifier with the log elements, i.e., resistors, and the compensating capacitors in each of the feedback loops having equal values so that each break point or pole is offset by a compensating break point or zero.

  11. VAFLE: visual analytics of firewall log events

    NASA Astrophysics Data System (ADS)

    Ghoniem, Mohammad; Shurkhovetskyy, Georgiy; Bahey, Ahmed; Otjacques, Benoît.

    2013-12-01

    In this work, we present VAFLE, an interactive network security visualization prototype for the analysis of firewall log events. Keeping it simple yet effective for analysts, we provide multiple coordinated interactive visualizations augmented with clustering capabilities customized to support anomaly detection and cyber situation awareness. We evaluate the usefulness of the prototype in a use case with network traffic datasets from previous VAST Challenges, illustrating its effectiveness at promoting fast and well-informed decisions. We explain how a security analyst may spot suspicious traffic using VAFLE. We further assess its usefulness through a qualitative evaluation involving network security experts, whose feedback is reported and discussed.

  12. WebViz: A web browser based application for collaborative analysis of 3D data

    NASA Astrophysics Data System (ADS)

    Ruegg, C. S.

    2011-12-01

    In the age of high-speed Internet, where people can interact instantly, scientific tools have lacked technology that incorporates this concept of communication using the web. To solve this issue a web application for geological studies has been created, tentatively titled WebViz. This web application utilizes tools provided by Google Web Toolkit to create an AJAX web application capable of features found in non-web-based software. Using these tools, a web application can be created that acts as a piece of software accessible from anywhere in the globe with a reasonably speedy Internet connection. An application of this technology can be seen with data regarding the recent tsunami from the major Japan earthquakes. After constructing the appropriate data for a computer rendering program called HVR, WebViz can request images of the tsunami data and display them to anyone who has access to the application. This convenience alone makes WebViz a viable solution, but the option to interact with these data together with others around the world makes WebViz a serious computational tool. WebViz can also be used on any JavaScript-enabled browser, such as those found on modern tablets and smart phones, over a fast wireless connection. Because WebViz is currently built using Google Web Toolkit, the portability of the application is in its most efficient form. Though many developers have been involved with the project, each person has contributed to increase the usability and speed of the application. In the project's most recent form a dramatic speed increase has been designed as well as a more efficient user interface. The speed increase has been informally noticed in recent uses of the application in China and Australia, with the hosting server located at the University of Minnesota. The user interface has been improved not only in appearance but also in functionality. Major functions of the application are rotating the 3D object using buttons

  13. Why, What, and How to Log? Lessons from LISTEN

    ERIC Educational Resources Information Center

    Mostow, Jack; Beck, Joseph E.

    2009-01-01

    The ability to log tutorial interactions in comprehensive, longitudinal, fine-grained detail offers great potential for educational data mining--but what data is logged, and how, can facilitate or impede the realization of that potential. We propose guidelines gleaned over 15 years of logging, exploring, and analyzing millions of events from…

  14. The Learning Log as an Integrated Instructional Assessment Tool.

    ERIC Educational Resources Information Center

    Topaz, Beverley

    1997-01-01

    Use of student learning logs is recommended as a means for both students and teacher to assess second-language learning. The approach encourages learners to analyze their learning difficulties and plan for overcoming them. Incorporated into portfolios, logs can be used to analyze progress. Sample log sheet and chart used as a framework for…

  15. 14 CFR 125.407 - Maintenance log: Airplanes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Maintenance log: Airplanes. 125.407 Section... Maintenance log: Airplanes. (a) Each person who takes corrective action or defers action concerning a reported... record the action taken in the airplane maintenance log in accordance with part 43 of this chapter....

  16. 14 CFR 121.701 - Maintenance log: Aircraft.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Maintenance log: Aircraft. 121.701 Section... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Records and Reports § 121.701 Maintenance log... have made, a record of that action in the airplane's maintenance log. (b) Each certificate holder...

  17. 31 CFR 593.309 - Round log or timber product.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Round log or timber product. 593.309 Section 593.309 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) OFFICE... SANCTIONS REGULATIONS General Definitions § 593.309 Round log or timber product. The term round log...

  18. Learning Logs in the Science Classroom: The Literacy Advantage

    ERIC Educational Resources Information Center

    Steenson, Cheryl

    2006-01-01

    In this article, the author discusses one of the most functional forms of writing to learn, the two-column learning logs. Two-column learning logs are based on the premise that collecting information and processing information are two very different aspects of learning. Two-column logs allow students to connect the facts and theories of science to…

  19. 21 CFR 211.182 - Equipment cleaning and use log.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Equipment cleaning and use log. 211.182 Section... Reports § 211.182 Equipment cleaning and use log. A written record of major equipment cleaning... individual equipment logs that show the date, time, product, and lot number of each batch processed....

  20. 46 CFR 148.100 - Log book entries.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 5 2014-10-01 2014-10-01 false Log book entries. 148.100 Section 148.100 Shipping COAST... THAT REQUIRE SPECIAL HANDLING Minimum Transportation Requirements § 148.100 Log book entries. During... date and time of each measurement and test must be recorded in the vessel's log....

  1. 14 CFR 125.407 - Maintenance log: Airplanes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Maintenance log: Airplanes. 125.407 Section... Maintenance log: Airplanes. (a) Each person who takes corrective action or defers action concerning a reported... record the action taken in the airplane maintenance log in accordance with part 43 of this chapter....

  2. 31 CFR 593.309 - Round log or timber product.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 3 2011-07-01 2011-07-01 false Round log or timber product. 593.309 Section 593.309 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) OFFICE... SANCTIONS REGULATIONS General Definitions § 593.309 Round log or timber product. The term round log...

  3. 21 CFR 211.182 - Equipment cleaning and use log.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 4 2014-04-01 2014-04-01 false Equipment cleaning and use log. 211.182 Section... Reports § 211.182 Equipment cleaning and use log. A written record of major equipment cleaning... individual equipment logs that show the date, time, product, and lot number of each batch processed....

  4. 14 CFR 125.407 - Maintenance log: Airplanes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Maintenance log: Airplanes. 125.407 Section... Maintenance log: Airplanes. (a) Each person who takes corrective action or defers action concerning a reported... record the action taken in the airplane maintenance log in accordance with part 43 of this chapter....

  5. 21 CFR 211.182 - Equipment cleaning and use log.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 4 2013-04-01 2013-04-01 false Equipment cleaning and use log. 211.182 Section... Reports § 211.182 Equipment cleaning and use log. A written record of major equipment cleaning... individual equipment logs that show the date, time, product, and lot number of each batch processed....

  6. 14 CFR 125.407 - Maintenance log: Airplanes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Maintenance log: Airplanes. 125.407 Section... Maintenance log: Airplanes. (a) Each person who takes corrective action or defers action concerning a reported... record the action taken in the airplane maintenance log in accordance with part 43 of this chapter....

  7. 31 CFR 593.309 - Round log or timber product.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Round log or timber product. 593.309 Section 593.309 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) OFFICE... SANCTIONS REGULATIONS General Definitions § 593.309 Round log or timber product. The term round log...

  8. 46 CFR 148.100 - Log book entries.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 5 2012-10-01 2012-10-01 false Log book entries. 148.100 Section 148.100 Shipping COAST... THAT REQUIRE SPECIAL HANDLING Minimum Transportation Requirements § 148.100 Log book entries. During... date and time of each measurement and test must be recorded in the vessel's log....

  9. 31 CFR 593.309 - Round log or timber product.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 3 2010-07-01 2010-07-01 false Round log or timber product. 593.309 Section 593.309 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) OFFICE... SANCTIONS REGULATIONS General Definitions § 593.309 Round log or timber product. The term round log...

  10. 31 CFR 593.309 - Round log or timber product.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 3 2012-07-01 2012-07-01 false Round log or timber product. 593.309 Section 593.309 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) OFFICE... SANCTIONS REGULATIONS General Definitions § 593.309 Round log or timber product. The term round log...

  11. 21 CFR 211.182 - Equipment cleaning and use log.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 4 2012-04-01 2012-04-01 false Equipment cleaning and use log. 211.182 Section... Reports § 211.182 Equipment cleaning and use log. A written record of major equipment cleaning... individual equipment logs that show the date, time, product, and lot number of each batch processed....

  12. 14 CFR 121.701 - Maintenance log: Aircraft.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Maintenance log: Aircraft. 121.701 Section... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Records and Reports § 121.701 Maintenance log... have made, a record of that action in the airplane's maintenance log. (b) Each certificate holder...

  13. 21 CFR 211.182 - Equipment cleaning and use log.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Equipment cleaning and use log. 211.182 Section... Reports § 211.182 Equipment cleaning and use log. A written record of major equipment cleaning... individual equipment logs that show the date, time, product, and lot number of each batch processed....

  14. 14 CFR 125.407 - Maintenance log: Airplanes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Maintenance log: Airplanes. 125.407 Section... Maintenance log: Airplanes. (a) Each person who takes corrective action or defers action concerning a reported... record the action taken in the airplane maintenance log in accordance with part 43 of this chapter....

  15. 14 CFR 121.701 - Maintenance log: Aircraft.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Maintenance log: Aircraft. 121.701 Section... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Records and Reports § 121.701 Maintenance log... have made, a record of that action in the airplane's maintenance log. (b) Each certificate holder...

  16. 46 CFR 148.100 - Log book entries.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 5 2011-10-01 2011-10-01 false Log book entries. 148.100 Section 148.100 Shipping COAST... THAT REQUIRE SPECIAL HANDLING Minimum Transportation Requirements § 148.100 Log book entries. During... date and time of each measurement and test must be recorded in the vessel's log....

  17. 14 CFR 121.701 - Maintenance log: Aircraft.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Maintenance log: Aircraft. 121.701 Section... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Records and Reports § 121.701 Maintenance log... have made, a record of that action in the airplane's maintenance log. (b) Each certificate holder...

  18. 46 CFR 148.100 - Log book entries.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 5 2013-10-01 2013-10-01 false Log book entries. 148.100 Section 148.100 Shipping COAST... THAT REQUIRE SPECIAL HANDLING Minimum Transportation Requirements § 148.100 Log book entries. During... date and time of each measurement and test must be recorded in the vessel's log....

  19. 14 CFR 121.701 - Maintenance log: Aircraft.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Maintenance log: Aircraft. 121.701 Section... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Records and Reports § 121.701 Maintenance log... have made, a record of that action in the airplane's maintenance log. (b) Each certificate holder...

  20. Drilling and geophysical logs of the tophole at an oil-and-gas well site, Central Venango County, Pennsylvania

    USGS Publications Warehouse

    Williams, John H.; Bird, Philip H.; Conger, Randall W.; Anderson, J. Alton

    2014-01-01

    Collection and integrated analysis of drilling and geophysical logs provided an efficient and effective means for characterizing the geohydrologic framework and conditions penetrated by the tophole at the selected oil-and-gas well site. The logging methods and lessons learned at this well site could be applied at other oil-and-gas drilling sites to better characterize the shallow subsurface with the overall goal of protecting freshwater aquifers during hydrocarbon development.

  1. Selected borehole geophysical logs and drillers' logs, northern coastal plain of New Jersey

    USGS Publications Warehouse

    Murashige, J.E.; Birkelo, B.A.; Pucci, A.A.

    1989-01-01

    This report presents lithologic data compiled during the initial phase of a cooperative study by the U.S. Geological Survey and the New Jersey Department of Environmental Protection, Division of Water Resources, to assess the hydrogeology of the Potomac-Raritan-Magothy aquifer system in the northern Coastal Plain of New Jersey. The report includes 109 geophysical logs and 328 drillers' logs that were selected as representative of the Potomac-Raritan-Magothy aquifer system. A description of the Potomac-Raritan-Magothy aquifer system is also given. (USGS)

  2. ADASS Web Database XML Project

    NASA Astrophysics Data System (ADS)

    Barg, M. I.; Stobie, E. B.; Ferro, A. J.; O'Neil, E. J.

    In the spring of 2000, at the request of the ADASS Program Organizing Committee (POC), we began organizing information from previous ADASS conferences in an effort to create a centralized database. The beginnings of this database originated from data (invited speakers, participants, papers, etc.) extracted from HyperText Markup Language (HTML) documents from past ADASS host sites. Unfortunately, not all HTML documents are well formed and parsing them proved to be an iterative process. It was evident at the beginning that if these Web documents were organized in a standardized way, such as XML (Extensible Markup Language), the processing of this information across the Web could be automated, more efficient, and less error prone. This paper will briefly review the many programming tools available for processing XML, including Java, Perl and Python, and will explore the mapping of relational data from our MySQL database to XML.
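
    As a minimal illustration of the relational-to-XML mapping described above, the Python sketch below builds an XML document from a handful of records using only the standard library; the element and field names are assumptions, and in practice the rows would come from the MySQL database rather than a hard-coded list.

    # Minimal sketch of mapping relational conference records to XML with the
    # Python standard library. Field and element names are illustrative
    # assumptions; a real run would fetch `rows` from the MySQL database.
    import xml.etree.ElementTree as ET

    rows = [
        {"author": "A. Example", "title": "Sample ADASS Paper", "year": "2000"},
        {"author": "B. Sample", "title": "Another Contribution", "year": "2000"},
    ]

    root = ET.Element("conference", attrib={"name": "ADASS"})
    for row in rows:
        paper = ET.SubElement(root, "paper", attrib={"year": row["year"]})
        ET.SubElement(paper, "author").text = row["author"]
        ET.SubElement(paper, "title").text = row["title"]

    print(ET.tostring(root, encoding="unicode"))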

  3. Ice logging with light and sound

    NASA Astrophysics Data System (ADS)

    Bay, Ryan C.; Bramall, Nathan; Price, P. Buford

    Polar ice may well be the purest solid substance on Earth, and yet the impurities it contains—gases, dust, and micro-organisms—provide a rich record of Earth's past climate, volcanism, and one-celled life going back ~400,000 years. Until recently detailed records had been deciphered mostly in chemical and biological laboratories from meter-long ice cores removed by drills capable of coring to bedrock thousands of meters down. Now, borehole instruments are adding a new dimension to the study of ice sheets. They can rapidly log records of past climate, volcanism, c-axis ice fabric, and soon, even microbial life and grain size. Gary Clow, a pioneer in borehole logging, has been measuring temperature profiles that provide information on climate and ice flow [Dahl-Jensen et al., 1998]. From sonic velocity profiles, Kendrick Taylor and Gregg Lamorey are able to infer c-axis fabrics, which record the history of ice flow. Robert Hawley and Ed Waddington have developed a video logger that detects annual layers in firn ice.

  4. Dewarless Logging Tool - 1st Generation

    SciTech Connect

    HENFLING,JOSEPH A.; NORMANN,RANDY A.

    2000-08-01

    This report focuses on Sandia National Laboratories' effort to create high-temperature logging tools for geothermal applications without the need for heat shielding. One of the mechanisms for failure in conventional downhole tools is temperature. They can only survive a limited number of hours in high temperature environments. For the first time since the evolution of integrated circuits, components are now commercially available that are qualified to 225 C with many continuing to work up to 300 C. These components are primarily based on Silicon-On-Insulator (SOI) technology. Sandia has developed and tested a simple data logger based on this technology that operates up to 300 C with a few limiting components operating to only 250 C without thermal protection. An actual well log to 240 C without shielding is discussed. The first prototype high-temperature tool measures pressure and temperature using a wire-line for power and communication. The tool is based around the HT83C51 microcontroller. A brief discussion of the background and status of the High Temperature Instrumentation program at Sandia, objectives, data logger development, and future project plans are given.

  5. Dual excitation acoustic paramagnetic logging tool

    DOEpatents

    Vail, III, William B.

    1989-01-01

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described.

  6. Dual excitation acoustic paramagnetic logging tool

    DOEpatents

    Vail, W.B. III.

    1989-02-14

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described. 6 figs.
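
    As a quick consistency check on the quoted excitation frequency (standard physics, not taken from the patent), the proton Larmor frequency in a typical Earth magnetic field of roughly 50 microtesla works out to about 2 kHz:

    % Proton Larmor frequency in an assumed Earth field of B ~ 50 microtesla:
    \[
      f_{L} \;=\; \frac{\gamma_{p}}{2\pi}\, B
            \;\approx\; 42.58~\mathrm{MHz\,T^{-1}} \times 50~\mu\mathrm{T}
            \;\approx\; 2.1~\mathrm{kHz},
    \]
    % consistent with the "approx. 2 kHz" sweep frequency quoted in the abstract.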

  7. Coal log pipeline: Development status of the first commercial system

    SciTech Connect

    Marrero, T.R.

    1996-12-31

    The coal log pipeline (CLP) is an innovative means for long-distance transportation of coal. In the CLP concept, coal is pressed into the form of cylinders--coal logs--that are propelled by water flowing through underground pipe. A coal log pipeline has many advantages when compared to coal transport by unit train, slurry pipeline and long-distance trucking: low cost, low energy consumption, low water consumption, simple dewatering at the pipeline exit, safety, and environmental friendliness. The coal logs travel butted together, as trains. Between the coal log "trains," some space is allowed for valve switching. The optimum diameter of a coal log is approximately 90 to 95% of the inside diameter of the pipe. The coal-to-water ratio is about 4 to 1. A 200 mm diameter CLP can transport about 2 million tonnes of coal per year. At their destination, the coal logs come out of the pipeline onto a moving conveyor which transports the logs to a crusher or stockpile. Coal logs are crushed to match the size of the existing fuel. The water effluent is treated and reused at the power plant; there is no need for its discharge. Coal logs can be manufactured with or without the use of binder. By using less than 2 percent emulsified asphalt as binder, no heat is required to compact coal logs. Binderless coal logs can be compacted at less than 90°C. Compaction pressures, for coal logs made with or without binder, are about 70 MPa. The coal particle size distribution and moisture content must be controlled. The economics of the coal log pipeline system have been studied. Results indicate that a new coal log pipeline is cost-competitive with existing railroads for distances greater than approximately 80 km. CLP is much more economical than a coal slurry pipeline of the same diameter. This paper describes the current R&D and commercialization plan for CLP. 4 refs.
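
    A back-of-envelope check of the quoted capacity figure is straightforward; in the sketch below, the transport velocity, coal log density, and line-fill fraction are assumptions chosen only to show that a 200 mm line with logs at roughly 90 to 95% of the pipe diameter plausibly moves on the order of 2 million tonnes per year.

```python
import math

pipe_id_m = 0.200               # 200 mm pipeline cited in the abstract
log_diam_m = 0.92 * pipe_id_m   # logs sized at ~90-95% of the pipe inside diameter
coal_density = 1300.0           # kg/m^3, assumed density of compacted coal logs
velocity_mps = 2.0              # m/s, assumed mean transport velocity
fill_fraction = 0.8             # assumed fraction of line length occupied by log trains

log_area = math.pi * (log_diam_m / 2.0) ** 2                          # one log's cross-section
mass_flow = log_area * velocity_mps * fill_fraction * coal_density    # kg/s
tonnes_per_year = mass_flow * 3600 * 24 * 365 / 1000.0
print(f"approximate capacity: {tonnes_per_year / 1e6:.1f} million tonnes/year")
```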

  8. Three-class classification models of logS and logP derived by using GA-CG-SVM approach.

    PubMed

    Zhang, Hui; Xiang, Ming-Li; Ma, Chang-Ying; Huang, Qi; Li, Wei; Xie, Yang; Wei, Yu-Quan; Yang, Sheng-Yong

    2009-05-01

    In this investigation, three-class classification models of aqueous solubility (logS) and lipophilicity (logP) have been developed by using a support vector machine (SVM) method combined with a genetic algorithm (GA) for feature selection and a conjugate gradient method (CG) for parameter optimization. A 5-fold cross-validation and an independent test set method were used to evaluate the SVM classification models. For logS, the overall prediction accuracy is 87.1% for the training set and 90.0% for the test set. For logP, the overall prediction accuracy is 81.0% for the training set and 82.0% for the test set. In general, for both logS and logP, the prediction accuracies of the three-class models are several percentage points lower than those of two-class models. A comparison between the performance of GA-CG-SVM models and that of GA-SVM models shows that the SVM parameter optimization has a significant impact on the quality of the SVM classification model.
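
    A minimal sketch of the classification-and-evaluation step is given below using scikit-learn on synthetic descriptors; the genetic-algorithm feature selection and conjugate-gradient parameter search used in the paper are not reproduced, and a plain grid search over the RBF-SVM hyperparameters stands in for them.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score, train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for a molecular descriptor matrix with three classes
# (e.g., low / medium / high solubility).
X, y = make_classification(n_samples=600, n_features=30, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# Grid search over (C, gamma) stands in for the paper's GA/CG optimization.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [1, 10, 100], "gamma": [0.001, 0.01, 0.1]},
                    cv=5)
grid.fit(X_train, y_train)

cv_acc = cross_val_score(grid.best_estimator_, X_train, y_train, cv=5).mean()
test_acc = grid.best_estimator_.score(X_test, y_test)
print(f"5-fold CV accuracy: {cv_acc:.3f}, independent test accuracy: {test_acc:.3f}")
```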

  9. Web document clustering using hyperlink structures

    SciTech Connect

    He, Xiaofeng; Zha, Hongyuan; Ding, Chris H.Q; Simon, Horst D.

    2001-05-07

    With the exponential growth of information on the World Wide Web, there is great demand for developing efficient and effective methods for organizing and retrieving the information available. Document clustering plays an important role in information retrieval and taxonomy management for the World Wide Web and remains an interesting and challenging problem in the field of web computing. In this paper we consider document clustering methods that explore textual information, hyperlink structure, and co-citation relations. In particular, we apply the normalized cut clustering method developed in computer vision to the task of hyperdocument clustering. We also explore some theoretical connections of the normalized-cut method to the K-means method. We then experiment with the normalized-cut method in the context of clustering query result sets for web search engines.
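
    For readers unfamiliar with the normalized-cut idea, the sketch below clusters a tiny, made-up document similarity graph with scikit-learn's spectral clustering, which partitions the normalized graph Laplacian in the same spirit; the six-document affinity matrix is invented and is not data from the paper.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

# Toy symmetric similarity matrix combining hyperlink / co-citation evidence
# for six hypothetical web documents (two fairly obvious groups).
S = np.array([
    [0, 3, 2, 0, 0, 0],
    [3, 0, 2, 0, 0, 1],
    [2, 2, 0, 1, 0, 0],
    [0, 0, 1, 0, 3, 2],
    [0, 0, 0, 3, 0, 3],
    [0, 1, 0, 2, 3, 0],
], dtype=float)

# Spectral clustering with a precomputed affinity performs a normalized-cut style
# partition: embed with the normalized Laplacian, then cluster the embedding.
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(S)
print("cluster labels:", labels)
```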

  10. Web Audio/Video Streaming Tool

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2003-01-01

    In order to promote a NASA-wide educational outreach program to educate and inform the public about space exploration, NASA, at Kennedy Space Center, is seeking efficient ways to add more content to the web by streaming audio/video files. This project proposes a high-level overview of a framework for the creation, management, and scheduling of audio/video assets over the web. To support short-term goals, the prototype of a web-based tool is designed and demonstrated to automate the process of streaming audio/video files. The tool provides web-enabled user interfaces to manage video assets, create publishable schedules of video assets for streaming, and schedule the streaming events. These operations are performed on user-defined and system-derived metadata of audio/video assets stored in a relational database, while the assets reside in a separate repository. The prototype tool is designed using ColdFusion 5.0.

  11. In-Home Demonstration of the Reduction of Woodstove Emissions from the Use of Densified Logs.

    SciTech Connect

    Barnett, Stockton G.; Bighouse, Roger D.

    1992-07-07

    There is a need to reduce emissions from conventional wood stoves in the short term while stove replacement takes place over the longer term. One possible approach is to use fuels that would burn cleaner than cordwood. Densified fuels have been commercially available for years and offer such a possibility. The objective of this project was to evaluate the emissions and efficiency performance of two commercially available densified log types in homes and compare their performance with cordwood. Researchers measured particulate matter (PM), carbon monoxide (CO), and volatile organic matter (VOC) emissions. Both total VOC and methane values are presented. Each home used an Automated Woodstove Emissions Sampler system, developed for the EPA and Bonneville Power Administration, in a series of four week-long tests for each stove. The sequence of tests in each stove was cordwood, Pres-to-Logs, Eco-Logs, and a second, confirming test using Pres-to-Logs. Results show an average reduction of 52% in PM grams per hour emissions overall for the nine stoves using Pres-to-Logs. All nine stoves displayed a reduction in PM emissions. CO emissions were more modestly reduced by 27%, and VOCs were reduced 39%. The emissions reduction percentage was similar for both types of stoves.

  12. In-home demonstration of the reduction of woodstove emissions from the use of densified logs

    SciTech Connect

    Barnett, S.G.; Bighouse, R.D.

    1992-07-07

    There is a need to reduce emissions from conventional wood stoves in the short term while stove replacement takes place over the longer term. One possible approach is to use fuels that would burn cleaner than cordwood. Densified fuels have been commercially available for years and offer such a possibility. The objective of this project was to evaluate the emissions and efficiency performance of two commercially available densified log types in homes and compare their performance with cordwood. Researchers measured particulate matter (PM), carbon monoxide (CO), and volatile organic matter (VOC) emissions. Both total VOC and methane values are presented. Each home used an Automated Woodstove Emissions Sampler system, developed for the EPA and Bonneville Power Administration, in a series of four week-long tests for each stove. The sequence of tests in each stove was cordwood, Pres-to-Logs, Eco-Logs, and a second, confirming test using Pres-to-Logs. Results show an average reduction of 52% in PM grams per hour emissions overall for the nine stoves using Pres-to-Logs. All nine stoves displayed a reduction in PM emissions. CO emissions were more modestly reduced by 27%, and VOCs were reduced 39%. The emissions reduction percentage was similar for both types of stoves.

  13. Symmetric log-domain diffeomorphic Registration: a demons-based approach.

    PubMed

    Vercauteren, Tom; Pennec, Xavier; Perchant, Aymeric; Ayache, Nicholas

    2008-01-01

    Modern morphometric studies use non-linear image registration to compare anatomies and perform group analysis. Recently, log-Euclidean approaches have contributed to promote the use of such computational anatomy tools by permitting simple computations of statistics on a rather large class of invertible spatial transformations. In this work, we propose a non-linear registration algorithm perfectly fit for log-Euclidean statistics on diffeomorphisms. Our algorithm works completely in the log-domain, i.e. it uses a stationary velocity field. This implies that we guarantee the invertibility of the deformation and have access to the true inverse transformation. This also means that our output can be directly used for log-Euclidean statistics without relying on the heavy computation of the log of the spatial transformation. As it is often desirable, our algorithm is symmetric with respect to the order of the input images. Furthermore, we use an alternate optimization approach related to Thirion's demons algorithm to provide a fast non-linear registration algorithm. First results show that our algorithm outperforms both the demons algorithm and the recently proposed diffeomorphic demons algorithm in terms of accuracy of the transformation while remaining computationally efficient.
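
    One core operation any log-domain implementation needs is turning a stationary velocity field into a dense deformation (and its inverse, by simply negating the field) via scaling and squaring. The sketch below shows that operation on a synthetic 2-D field with plain bilinear composition; it is a schematic illustration, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def compose(disp_a, disp_b):
    """Displacement of the composed map x -> (x + b(x)) + a(x + b(x))."""
    h, w, _ = disp_a.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.stack([yy + disp_b[..., 0], xx + disp_b[..., 1]])
    warped_a = np.stack([map_coordinates(disp_a[..., k], coords, order=1,
                                         mode="nearest") for k in range(2)], axis=-1)
    return warped_a + disp_b

def exp_field(velocity, n_steps=6):
    """Scaling and squaring: exp(v) from n successive self-compositions of v / 2**n."""
    disp = velocity / (2 ** n_steps)
    for _ in range(n_steps):
        disp = compose(disp, disp)
    return disp

# Synthetic smooth stationary velocity field on a 64 x 64 grid.
h = w = 64
yy, xx = np.meshgrid(np.linspace(-1, 1, h), np.linspace(-1, 1, w), indexing="ij")
bump = np.exp(-(xx ** 2 + yy ** 2) * 4.0)
v = np.stack([3.0 * bump, -2.0 * bump], axis=-1)

phi = exp_field(v)                 # forward deformation
phi_inv = exp_field(-v)            # inverse comes for free by negating the velocity
residual = compose(phi, phi_inv)   # should be close to the identity (zero displacement)
print("max |phi o phi^-1 - id| displacement:", np.abs(residual).max())
```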

  14. Planning and interpreting cement bond logs; Implementation of an expert system

    SciTech Connect

    Murphy, K.; Wydrinski, R.; Feldman, D.S. )

    1991-07-01

    The Cement Bond Log Advisor is an expert system being developed to assist users in the design, evaluation, and interpretation of cement bond logs (CBL's). CBL's are produced by sonic tools run after the casing is cemented in a well to determine how well the cement is bonded to the casing and formation and whether a cement squeeze job should be attempted. During the development of the program, two particular challenges were encountered and resolved in ways that might be of general interest to developers of expert system applications for the petroleum industry. The first challenge was to find an efficient and effective method for guiding the user in correlating and interpreting the relationships between multiple sources of continuous visual data (the various waveforms making up the log output). Second, the nomographs, which correlate log amplitudes with cement compressive strength and are used in the log interpretation, were derived from empirical data and were not readily reduced to functional relationships through traditional curve-fitting techniques. The use of adductive induction was found to be effective in deriving suitable relationships. The paper focuses on the techniques used to implement the application, with emphasis placed on how the two challenges described above were met.

  15. Regularized Multitask Learning for Multidimensional Log-Density Gradient Estimation.

    PubMed

    Yamane, Ikko; Sasaki, Hiroaki; Sugiyama, Masashi

    2016-07-01

    Log-density gradient estimation is a fundamental statistical problem and possesses various practical applications such as clustering and measuring nongaussianity. A naive two-step approach of first estimating the density and then taking its log gradient is unreliable because an accurate density estimate does not necessarily lead to an accurate log-density gradient estimate. To cope with this problem, a method to directly estimate the log-density gradient without density estimation has been explored and demonstrated to work much better than the two-step method. The objective of this letter is to improve the performance of this direct method in multidimensional cases. Our idea is to regard the problem of log-density gradient estimation in each dimension as a task and apply regularized multitask learning to the direct log-density gradient estimator. We experimentally demonstrate the usefulness of the proposed multitask method in log-density gradient estimation and mode-seeking clustering.
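
    A one-dimensional sketch of the direct estimator that the letter builds on is given below: a linear-in-parameters model of the log-density gradient fitted with the integration-by-parts criterion, so no density estimate is ever formed. The Gaussian basis, its width, and the regularization constant are arbitrary choices, and the multitask coupling across dimensions proposed in the letter is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=500)   # samples from N(0, 1); the true gradient of log p is -x

# Gaussian basis functions phi_j(x) = exp(-(x - c_j)^2 / (2 s^2)) and derivatives.
centers = np.linspace(-3.0, 3.0, 30)
s = 0.5

def phi(x):
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * s ** 2))

def dphi(x):
    return phi(x) * (-(x[:, None] - centers[None, :]) / s ** 2)

# Direct fit: minimize (1/2) E[g(x)^2] + E[g'(x)] over g = theta . phi, which is the
# squared-error criterion after integration by parts (no density estimate needed).
P, dP = phi(x), dphi(x)
G = P.T @ P / len(x)
b = dP.mean(axis=0)
theta = np.linalg.solve(G + 1e-3 * np.eye(len(centers)), -b)

x_test = np.linspace(-2.0, 2.0, 9)
print("estimated grad log p:", np.round(phi(x_test) @ theta, 2))
print("true grad log p     :", np.round(-x_test, 2))
```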

  16. A comparison of new ultrasonic cement and casing evaluation logs with standard cement bond logs

    SciTech Connect

    Sheives, T.C.; Tello, L.N.; Maki, V.E. Jr.; Standley, T.E.; Blankinship, T.J.

    1986-01-01

    New ultrasonic inspection techniques have been implemented to evaluate cement bond and casing conditions. These techniques, relying on state-of-the-art downhole high-speed waveform digitization, can successfully determine cement voids and channels and also provide accurate bond information in the presence of a microannulus. The downhole microprocessor-controlled electronics transmits the digitized waveforms to the surface for computations and display. The downhole waveforms are then viewed by the operator in real time to ensure proper log quality. Numerous log examples showing comparisons with the standard Cement Bond Log (CBL) demonstrate that the accuracy and resolution of the new approach provide much more information. Casing conditions are determined using both acoustic caliper measurements and a new casing thickness measurement technique. The acoustic caliper measurement not only accurately determines the casing inside diameter, but also detects the pipe gap at casing collars. The new measurement determines actual casing thickness variations due to internal or external pipe wear or corrosion. Experimental results along with log examples show the effectiveness and accuracy of measuring the actual casing thickness at the well site in real time.
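
    The thickness measurement itself rests on the elementary pulse-echo relation: the ultrasonic pulse crosses the casing wall twice, so the wall thickness is half the product of the sound speed in steel and the echo transit time. The numbers below are assumed values used only to show the arithmetic.

```python
# Pulse-echo thickness estimate: thickness = velocity * transit_time / 2,
# because the pulse travels through the wall and back.
v_steel_mps = 5900.0     # m/s, nominal compressional velocity in steel (assumed)
transit_time_us = 3.1    # microseconds between inner- and outer-wall echoes (assumed)

thickness_mm = v_steel_mps * (transit_time_us * 1e-6) / 2.0 * 1000.0
print(f"estimated casing wall thickness: {thickness_mm:.2f} mm")
```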

  17. Web-based collection of expert opinion on routine scalp EEG: software development and interrater reliability.

    PubMed

    Halford, Jonathan J; Pressly, William B; Benbadis, Selim R; Tatum, William O; Turner, Robert P; Arain, Amir; Pritchard, Paul B; Edwards, Jonathan C; Dean, Brian C

    2011-04-01

    Computerized detection of epileptiform transients (ETs), characterized by interictal spikes and sharp waves in the EEG, has been a research goal for the last 40 years. A reliable method for detecting ETs would assist physicians in interpretation and improve efficiency in reviewing long-term EEG recordings. Computer algorithms developed thus far for detecting ETs are not as reliable as human experts, primarily due to the large number of false-positive detections. Comparing the performance of different algorithms is difficult because each study uses individual EEG test datasets. In this article, we present EEGnet, a distributed web-based platform for the acquisition and analysis of large-scale training datasets for comparison of different EEG ET detection algorithms. This software allows EEG scorers to log in through the web, mark EEG segments of interest, and categorize segments of interest using a conventional clinical EEG user interface. This software platform was used by seven board-certified academic epileptologists to score 40 short 30-second EEG segments from 40 patients, half containing ETs and half containing artifacts and normal variants. The software performance was adequate. Interrater reliability for marking the location of paroxysmal activity was low. Interrater reliability of marking artifacts and ETs was high and moderate, respectively.
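
    As a sketch of how chance-corrected agreement among several raters can be quantified for categorical markings like these, the example below computes Fleiss' kappa with statsmodels for seven raters scoring ten segments into three invented categories; the ratings are randomly generated and bear no relation to the study's data.

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Invented ratings: 10 EEG segments scored by 7 raters into 3 categories
# (0 = epileptiform transient, 1 = artifact, 2 = normal variant).
rng = np.random.default_rng(1)
true_label = rng.integers(0, 3, size=10)
ratings = np.array([[lbl if rng.random() < 0.8 else rng.integers(0, 3)
                     for _ in range(7)] for lbl in true_label])

table, _ = aggregate_raters(ratings)   # rows: segments, columns: category counts
print("Fleiss' kappa:", round(fleiss_kappa(table), 3))
```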

  18. Using Open Web APIs in Teaching Web Mining

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju

    2009-01-01

    With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…

  19. Sign language Web pages.

    PubMed

    Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G

    2006-01-01

    The WORLD WIDE WEB has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, Web accessing can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of sign-linked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users consequently could navigate sign language information with ease and pleasure.

  20. WebEAV

    PubMed Central

    Nadkarni, Prakash M.; Brandt, Cynthia M.; Marenco, Luis

    2000-01-01

    The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples. PMID:10887163
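
    For readers unfamiliar with the storage pattern the framework targets, the sketch below shows a minimal entity-attribute-value layout in an in-memory SQLite database: one row per (entity, attribute, value) fact instead of one column per attribute. The table and attribute names are invented and do not reflect the authors' schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE attribute (id INTEGER PRIMARY KEY, name TEXT, datatype TEXT);
    CREATE TABLE eav (entity_id INTEGER, attribute_id INTEGER, value TEXT,
                      PRIMARY KEY (entity_id, attribute_id));
""")
con.executemany("INSERT INTO attribute VALUES (?, ?, ?)",
                [(1, "systolic_bp", "int"), (2, "diagnosis", "text")])
con.executemany("INSERT INTO eav VALUES (?, ?, ?)",
                [(101, 1, "128"), (101, 2, "hypertension"), (102, 1, "115")])

# Pivot one entity's facts back into attribute/value pairs, which is the kind of
# generic query a Web front end over an EAV store issues constantly.
rows = con.execute("""
    SELECT a.name, e.value FROM eav e
    JOIN attribute a ON a.id = e.attribute_id
    WHERE e.entity_id = ?""", (101,)).fetchall()
print(rows)
```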

  1. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  2. Live From the Front: Operational Ramifications of Military Web Logs in Combat Zones

    DTIC Science & Technology

    2007-05-10

    Only fragments of the report's footnotes were extracted for this record; they repeatedly cite Katherine C. Den Bleyker, "The First Amendment versus Operational Security: Where Should the Milblogging Balance Lie?" Fordham Intellectual... (accessed 25 April 2007).

  3. Beyond Logs and Surveys: In-Depth Measures of People's Web Use Skills.

    ERIC Educational Resources Information Center

    Hargittai, Eszter

    2002-01-01

    Examines how people find information online in the context of their other media use and general Internet use patterns, and uses information about demographic background and social support networks. Describes the methodology, and suggests a mix of survey instruments and in-person observations can yield the type of data necessary to understand the…

  4. Proposal for a Web Encoding Service (wes) for Spatial Data Transactio

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    The use of web services in Spatial Data Infrastructure (SDI) has been well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of using City Geographic Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available web services in OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g., possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). The proposed integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.

  5. Building Community Web Platform

    NASA Astrophysics Data System (ADS)

    Ohmukai, Ikki; Matsuo, Yutaka; Matsumura, Naohiro; Takeda, Hideaki

    In this paper we propose a Web-based communication environment called the "Community Web Platform." Our platform provides an easy way to exchange personal knowledge among people using lightweight metadata such as RSS and FOAF. We investigate the nature of "personal trustness" in this environment, since it is the one and only measure for evaluating subjective information and knowledge. We also discuss how to develop and maintain Community Web applications from our experience.

  6. Using Web Surveys to Determine Audience Characteristics and Product Preferences

    ERIC Educational Resources Information Center

    Philbrick, Jane Hass; Smith, F. Ruth; Bart, Barbara

    2010-01-01

    A web survey is a cost-effective and efficient method to use when measuring the characteristics of an audience and developing or testing new product concepts. This paper reports on the use of a web survey by a start-up media/internet firm, Farmers' Almanac TV. The results indicate that using email to contact respondents from a client list results…

  7. Student Performance Assessment Using Bayesian Network and Web Portfolios.

    ERIC Educational Resources Information Center

    Liu, Chen-Chung; Chen, Gwo-Dong; Wang, Chin-Yeh; Lu, Ching-Fang

    2002-01-01

    Proposes a novel methodology that employs Bayesian network software to assist teachers in efficiently deriving and utilizing the student model of activity performance from Web portfolios online. This system contains Web portfolios that record in detail students' learning activities, peer interaction, and knowledge progress. (AEF)

  8. Assessing Library Instruction through Web Usability and Vocabulary Studies

    ERIC Educational Resources Information Center

    Castonguay, Remi

    2008-01-01

    Can we use the methods of Web usability testing to learn about library instruction? This article is among the first in the field trying to establish a link between usability and instruction. The author discusses useful insights that Web usability can bring to our pedagogy as well as to the efficiency of library instruction. The result of a Web…

  9. Guiding Students in Finding Information on the Web.

    ERIC Educational Resources Information Center

    Quible, Zane K.

    1999-01-01

    Argues that business-communication instructors can aid students in their research by introducing them to the terminology and functions of an efficient Web-search process. Discusses the operation of four search tools: Web directories, search engines, indexes, and spiders or robots. Discusses Boolean logic, and other ways to improve the productivity…

  10. Patterns of usage for a Web-based clinical information system.

    PubMed

    Chen, Elizabeth S; Cimino, James J

    2004-01-01

    Understanding how clinicians are using clinical information systems to assist with their everyday tasks is valuable to the system design and development process. Developers of such systems are interested in monitoring usage in order to make enhancements. System log files are rich resources for gaining knowledge about how the system is being used. We have analyzed the log files of our Web-based clinical information system (WebCIS) to obtain various usage statistics including which WebCIS features are frequently being used. We have also identified usage patterns, which convey how the user is traversing the system. We present our method and these results as well as describe how the results can be used to customize menus, shortcut lists, and patient reports in WebCIS and similar systems.
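
    The kind of tabulation described above can be sketched in a few lines: count how often each feature appears in the parsed log entries and which two-step sequences are most common. The session data and feature names below are invented, not WebCIS log content.

```python
from collections import Counter

# Invented per-user click streams already extracted from web server log files.
sessions = {
    "user1": ["lab_results", "radiology", "lab_results", "discharge_note"],
    "user2": ["lab_results", "medications", "lab_results"],
    "user3": ["radiology", "lab_results", "medications"],
}

feature_counts = Counter(f for clicks in sessions.values() for f in clicks)
transitions = Counter(pair for clicks in sessions.values()
                      for pair in zip(clicks, clicks[1:]))

print("most used features :", feature_counts.most_common(3))
print("common usage paths :", transitions.most_common(3))
```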

  11. Properties of food webs

    SciTech Connect

    Pimm, S.L.

    1980-04-01

    On the assumption that systems of interacting species, when perturbed from equilibrium, should return to equilibrium quickly, one can predict four properties of food webs: (1) food chains should be short, (2) species feeding on more than one trophic level (omnivores) should be rare, (3) those species that do feed on more than one trophic level should do so by feeding on species in adjacent trophic levels, and (4) host-parasitoid systems are likely to be exceptions to (1)-(3) when interaction coefficients permit greater trophic complexity. By generating random, model food webs (with many features identical to webs described from a variety of marine, freshwater, and terrestrial systems), it is possible to generate expected values for the number of trophic levels and the degree of omnivory within webs. When compared with these random webs, real world webs are shown to have fewer trophic levels, less omnivory, and very few omnivores feeding on nonadjacent trophic levels. Insect webs are shown to have a greater degree of omnivory than other webs. The confirmation of all these predictions from stability analyses suggests that system stability places necessary, though not sufficient, limitations on the possible shapes of food webs.

  12. Web Accessibility and Guidelines

    NASA Astrophysics Data System (ADS)

    Harper, Simon; Yesilada, Yeliz

    Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby the society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.

  13. Silicon Web Process Development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Hopkins, R. H.; Mchugh, J. P.; Hill, F. E.; Heimlich, M. E.; Driggers, J. M.

    1978-01-01

    Progress in the development of techniques to grow silicon web at a 25 sq cm/min output rate is reported. The feasibility of web growth with simultaneous melt replenishment is discussed. Other factors covered include: (1) tests of aftertrimmers to improve web width; (2) evaluation of growth lid designs to raise speed and output rate; (3) tests of melt replenishment hardware; and (4) investigation of directed gas flow systems to control unwanted oxide deposition in the system and to improve convective cooling of the web. Compatibility with sufficient solar cell performance is emphasized.

  14. Web 2.0

    NASA Astrophysics Data System (ADS)

    Gibson, Becky

    The Web is growing and changing from a paradigm of static publishing to one of participation and interaction. This change has implications for people with disabilities who rely on access to the Web for employment, information, entertainment, and increased independence. The interactive and collaborative nature of Web 2.0 can present access problems for some users. There are some best practices which can be put in place today to improve access. New specifications such as Accessible Rich Internet Applications (ARIA) and IAccessible2 are opening the doors to increasing the accessibility of Web 2.0 and beyond.

  15. An introduction to webs

    NASA Astrophysics Data System (ADS)

    White, C. D.

    2016-04-01

    Webs are sets of Feynman diagrams that contribute to the exponents of scattering amplitudes, in the kinematic limit in which emitted radiation is soft. As such, they have a number of phenomenological and formal applications, and offer tantalizing glimpses into the all-order structure of perturbative quantum field theory. This article is based on a series of lectures given to graduate students, and aims to provide a pedagogical introduction to webs. Topics covered include exponentiation in (non-)abelian gauge theories, the web mixing matrix formalism for non-abelian gauge theories, and recent progress on the calculation of web diagrams. Problems are included throughout the text, to aid understanding.
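
    For orientation, the exponentiation statement that the lectures revolve around can be written schematically as follows (the notation here is a common one and is not necessarily the author's):

\[
  \mathcal{A} \;=\; \mathcal{A}_0 \, \exp\!\Big( \sum_{W} \widetilde{C}_W \, \mathcal{F}_W \Big),
\]

    where the sum runs over webs W, \(\mathcal{F}_W\) is the kinematic contribution of a web and \(\widetilde{C}_W\) its modified color factor; in the abelian case the webs reduce to connected photon subdiagrams with their ordinary charge factors.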

  16. Simulation Control Graphical User Interface Logging Report

    NASA Technical Reports Server (NTRS)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This makes sure that a software component will not spin up until all the appropriate dependencies have been configured properly. I was also able to assist hardware modelers in verifying the configuration of models after they had been upgraded to a new software version. I developed some code that analyzes the MDL files to determine if any errors were generated due to the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  17. Tolerance bounds for log gamma regression models

    NASA Technical Reports Server (NTRS)

    Jones, R. A.; Scholz, F. W.; Ossiander, M.; Shorack, G. R.

    1985-01-01

    The present procedure for finding lower confidence bounds for the quantiles of Weibull populations, on the basis of the solution of a quadratic equation, is more accurate than current Monte Carlo tables and extends to any location-scale family. It is shown that this method is accurate for all members of the log gamma(K) family, where K = 1/2 to infinity, and works well for censored data, while also extending to regression data. An even more accurate procedure involving an approximation to the Lawless (1982) conditional procedure, with numerical integrations whose tables are independent of the data, is also presented. These methods are applied to the case of failure strengths of ceramic specimens from each of three billets of Si3N4, which have undergone flexural strength testing.

  18. Calibration Tests of a Japanese Log Rodmeter

    NASA Technical Reports Server (NTRS)

    Mottard, Elmo J.

    1949-01-01

    A Japanese log rodmeter of the rotating-vane impeller type, with a commutator on the impeller shaft, was calibrated in Langley tank no. 1. The rotational speed of two impellers was determined for forward speeds up to 24 knots at angles of yaw up to ±10°. In general, the rotational speeds of two apparently identical impellers tested in the rodmeter decreased with increasing yaw angle, right yaw causing a greater decrease than left yaw. The difference in calibration between the two impellers was approximately the same as that produced by a change in yaw angle from 5° left to 5° right. Evidence of cavitation within the impeller fairing appeared at speeds above 24 knots.

  19. RPM-WEBBSYS: A web-based computer system to apply the rational polynomial method for estimating static formation temperatures of petroleum and geothermal wells

    NASA Astrophysics Data System (ADS)

    Wong-Loya, J. A.; Santoyo, E.; Andaverde, J. A.; Quiroz-Ruiz, A.

    2015-12-01

    A Web-Based Computer System (RPM-WEBBSYS) has been developed for applying the Rational Polynomial Method (RPM) to estimate static formation temperatures (SFT) of geothermal and petroleum wells. The system is also capable of reproducing the full thermal recovery processes that occur during well completion. RPM-WEBBSYS has been programmed using advances in information technology to perform SFT computations more efficiently. RPM-WEBBSYS can be readily and rapidly executed from any computing device (e.g., personal computers and portable computing devices such as tablets or smartphones) with Internet access and a web browser. The computer system was validated using bottomhole temperature (BHT) measurements logged in a synthetic heat transfer experiment, where a good match between predicted and true SFT was achieved. RPM-WEBBSYS was finally applied to BHT logs collected from well drilling and shut-in operations, where the typical problems of under- and over-estimation of the SFT (exhibited by most of the existing analytical methods) were effectively corrected.
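
    The general idea can be illustrated with a small synthetic example: fit a low-order rational polynomial to bottomhole temperatures recorded at increasing shut-in times and read the static formation temperature off its infinite-time limit. The data, the first-order polynomial, and the fitting details below are assumptions for illustration and are not the published RPM implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic build-up data: BHTs (deg C) at increasing shut-in times (hours),
# generated so that they approach a static formation temperature of 150 C.
t = np.array([6.0, 12.0, 18.0, 24.0, 36.0, 48.0])
bht = 150.0 - 60.0 / (1.0 + 0.15 * t) + np.random.default_rng(0).normal(0.0, 0.3, t.size)

def rational(t, a0, a1, b1):
    # First-order rational polynomial; its limit as t -> infinity is a1 / b1.
    return (a0 + a1 * t) / (1.0 + b1 * t)

params, _ = curve_fit(rational, t, bht, p0=(90.0, 20.0, 0.15))
a0, a1, b1 = params
print(f"estimated static formation temperature: {a1 / b1:.1f} C")
```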

  20. Motivation Mining: Prospecting the Web.

    ERIC Educational Resources Information Center

    Small, Ruth V.; Arnone, Marilyn P.

    1999-01-01

    Describes WebMAC instruments, which differ from other Web-evaluation instruments because they have a theoretical base, are user-centered, are designed for students in grades 7 through 12, and assess the motivational quality of Web sites. Examples are given of uses of WebMAC Middle and WebMAC Senior in activities to promote evaluation and…

  1. No-reference image quality assessment based on log-derivative statistics of natural scenes

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Chandler, Damon M.

    2013-10-01

    We propose an efficient blind/no-reference image quality assessment algorithm using a log-derivative statistical model of natural scenes. Our method, called the DErivative Statistics-based QUality Evaluator (DESIQUE), extracts image quality-related statistical features at two image scales in both the spatial and frequency domains. In the spatial domain, normalized pixel values of an image are modeled in two ways: pointwise-based statistics for single pixel values and pairwise-based log-derivative statistics for the relationship of pixel pairs. In the frequency domain, log-Gabor filters are used to extract the fine scales of the image, which are also modeled by the log-derivative statistics. All of these statistics can be fitted by a generalized Gaussian distribution model, and the estimated parameters are fed into combined frameworks to estimate image quality. We train our models on the LIVE database by using optimized support vector machine learning. Experimental results on other databases show that the proposed algorithm not only yields a substantial improvement in predictive performance as compared to other state-of-the-art no-reference image quality assessment methods, but also maintains high computational efficiency.
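
    One ingredient of the method can be sketched compactly: form a pairwise log-derivative statistic from neighboring pixels and fit it with a generalized Gaussian distribution whose parameters become features. The random image, the single derivative direction, and the fitting choices below are stand-ins, not the full DESIQUE feature set.

```python
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(0)
image = rng.random((64, 64))   # stand-in for a normalized grayscale natural image

# Pairwise log-derivative feature: difference of log intensities of horizontally
# adjacent pixels (a small epsilon keeps the logarithm finite).
eps = 1e-3
log_img = np.log(image + eps)
d_h = (log_img[:, 1:] - log_img[:, :-1]).ravel()

# Fit a zero-mean generalized Gaussian; the estimated shape and scale parameters
# are the kind of features that would be fed to the learned quality model.
beta, loc, scale = gennorm.fit(d_h, floc=0.0)
print(f"GGD shape = {beta:.3f}, scale = {scale:.4f}")
```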

  2. Faculty Web Grade Entry: University of Phoenix

    ERIC Educational Resources Information Center

    Elisala, Tandy R.

    2005-01-01

    The University of Phoenix is a large, private, four-year university with a commitment to providing timely and efficient student services. With continued growth and process improvement opportunities utilizing technology, the institution had an opportunity to automate and streamline grade processing. This article focuses on the Faculty Web Grade…

  3. Leveraging the Semantic Web for Adaptive Education

    ERIC Educational Resources Information Center

    Kravcik, Milos; Gasevic, Dragan

    2007-01-01

    In the area of technology-enhanced learning reusability and interoperability issues essentially influence the productivity and efficiency of learning and authoring solutions. There are two basic approaches how to overcome these problems--one attempts to do it via standards and the other by means of the Semantic Web. In practice, these approaches…

  4. Web 2 Nowhere?

    ERIC Educational Resources Information Center

    Shapiro, Steven

    2012-01-01

    Web 2.0 seems to be all the rage these days. One cannot go to a library conference and attend presentations or stroll down the halls without hearing some mention of it in magical tones reserved for some great discovery. The excitement surrounding Web 2.0 reminds the author of the frenzy that gripped the country between 1848 and 1855, when…

  5. Taming the Tangled Web.

    ERIC Educational Resources Information Center

    Sturgeon, Julie

    2001-01-01

    Describes the Open Knowledge Initiative (OKI) and its use as a resource for higher education institutions interested in developing web-based learning capabilities. Highlights the OKI collaborative effort and its goal to ensure that the web tools it designs are installable and supportable on smaller campuses and by smaller institutions. (GR)

  6. Wetlands and Web Pages.

    ERIC Educational Resources Information Center

    Tisone-Bartels, Dede

    1998-01-01

    Argues that the preservation of areas like the Shoreline Park (California) wetlands depends on educating students about the value of natural resources. Describes the creation of a Web page on the wetlands for third-grade students by seventh-grade art and ecology students. Outlines the technical process of developing a Web page. (DSK)

  7. Rhizoctonia web blight

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Rhizoctonia web blight, caused by several Rhizoctonia spp., is an important disease of evergreen azaleas and other ornamental plants in nurseries. The primary pathogens causing web blight are binucleate Rhizoctonia anastomosis groups (AG) (= Ceratobasidium D.P. Rogers, teleomorph). In southern AL an...

  8. The Social Web

    ERIC Educational Resources Information Center

    Richardson, Will

    2006-01-01

    This article takes a look at tech guru Will Richardson's new book, "Blogs, Wikis, Podcasts, and Other Powerful Web Tools for Classrooms." Whether it's blogs or wikis or RSS, all roads now point to a Web where little is done in isolation. The biggest, most sweeping change in the people's relationship with the Internet may not be as much the ability…

  9. Web Page Design.

    ERIC Educational Resources Information Center

    Lindsay, Lorin

    Designing a web home page involves many decisions that affect how the page will look, the kind of technology required to use the page, the links the page will provide, and kinds of patrons who can use the page. The theme of information literacy needs to be built into every web page; users need to be taught the skills of sorting and applying…

  10. Web Publishing Schedule

    EPA Pesticide Factsheets

    Section 207(f)(2) of the E-Gov Act requires federal agencies to develop an inventory of information to be published on their Web sites, establish a schedule for publishing that information, make those schedules available for public comment, and post the schedules on the Web site.

  11. Sign Language Web Pages

    ERIC Educational Resources Information Center

    Fels, Deborah I.; Richards, Jan; Hardman, Jim; Lee, Daniel G.

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, Web accessing can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The…

  12. Literature on the Web.

    ERIC Educational Resources Information Center

    Deal, Nancy

    2003-01-01

    A teacher in the English education program at Buffalo State College describes her development of Web-based literature guides for preservice teachers to use in preparation and student teaching and for secondary-level English/language arts teachers to use in their classrooms. Discusses assembling materials for the web guide; an overview of site…

  13. Web Design Matters

    ERIC Educational Resources Information Center

    Mathews, Brian

    2009-01-01

    The web site is a library's most important feature. Patrons use the web site for numerous functions, such as renewing materials, placing holds, requesting information, and accessing databases. The homepage is the place they turn to look up the hours, branch locations, policies, and events. Whether users are at work, at home, in a building, or on…

  14. The Web's Unelected Government.

    ERIC Educational Resources Information Center

    Garfinkel, Simson L.

    1998-01-01

    The World Wide Web Consortium--an organization based at the Massachusetts Institute of Technology (MIT) that has 275 corporate members and holds closed meetings--is the closest thing the Web has to a central authority; however, almost nobody outside the telecommunications industry understands what the consortium is. Analyzes the role this body may…

  15. EPA Web Training Classes

    EPA Pesticide Factsheets

    Scheduled webinars can help you better manage EPA web content. Class topics include Drupal basics, creating different types of pages in the WebCMS such as document pages and forms, using Google Analytics, and best practices for metadata and accessibility.

  16. CERES Web Links

    Atmospheric Science Data Center

    2013-03-21

        Web links to relevant CERES information and answers to questions about the CERES data can be found at the following sites: the CERES Home Page, the CERES TRMM Home Page, the CERES Information page on the Atmospheric Science Data Center site, and the CERES "ARM" Validation Experiment (CAVE) Home Page.

  17. Rill erosion in burned and salvage logged western montane forests: Effects of logging equipment type, traffic level, and slash treatment

    NASA Astrophysics Data System (ADS)

    Wagenbrenner, J. W.; Robichaud, P. R.; Brown, R. E.

    2016-10-01

    Following wildfires, forest managers often consider salvage logging burned trees to recover the monetary value of timber, reduce fuel loads, or to meet other objectives. Relatively little is known about the cumulative hydrologic effects of wildfire and subsequent timber harvest using logging equipment. We used controlled rill experiments in logged and unlogged (control) forests burned at high severity in northern Montana, eastern Washington, and southern British Columbia to quantify rill overland flow and sediment production rates (fluxes) after ground-based salvage logging. We tested different types of logging equipment (feller-bunchers, tracked and wheeled skidders, and wheeled forwarders) as well as traffic levels and the addition of slash to skid trails as a best management practice. Rill experiments were done at each location in the first year after the fire and repeated in subsequent years. Logging was completed in the first or second post-fire year. We found that ground-based logging using heavy equipment compacted soil, reduced soil water repellency, and reduced vegetation cover. Vegetation recovery rates were slower in most logged areas than in the controls. Runoff rates were higher in the skidder and forwarder plots than their respective controls at the Montana and Washington sites in the year that logging occurred, and the difference in runoff between the skidder and control plots at the British Columbia site was nearly significant (p = 0.089). Most of the significant increases in runoff in the logged plots persisted for subsequent years. The type of skidder, the addition of slash, and the amount of forwarder traffic did not significantly affect the runoff rates. Across the three sites, rill sediment fluxes were 5-1900% greater in logged plots than the controls in the year of logging, and the increases were significant for all logging treatments except the low-use forwarder trails. There was no difference in the first-year sediment fluxes between the feller

  18. Web-Based Learning Support System

    NASA Astrophysics Data System (ADS)

    Fan, Lisa

    Web-based learning support systems offer many benefits over traditional learning environments and have become very popular. The Web is a powerful environment for distributing information and delivering knowledge to an increasingly wide and diverse audience. Typical Web-based learning environments, such as WebCT and Blackboard, include course content delivery tools, quiz modules, grade reporting systems, assignment submission components, etc. They are powerful integrated learning management systems (LMS) that support a number of activities performed by teachers and students during the learning process [1]. However, students who study a course on the Internet tend to be more heterogeneously distributed than those found in a traditional classroom situation. In order to achieve optimal efficiency in a learning process, an individual learner needs his or her own personalized assistance. For a web-based open and dynamic learning environment, personalized support for learners becomes more important. This chapter demonstrates how to realize personalized learning support in dynamic and heterogeneous learning environments by utilizing Adaptive Web technologies. It focuses on course personalization, tailoring content and teaching materials to each student's needs and capabilities. An example of using Rough Set theory to analyze students' personal information to assist them with effective learning and to predict student performance is presented.

  19. Vibration Propagation in Spider Webs

    NASA Astrophysics Data System (ADS)

    Hatton, Ross; Otto, Andrew; Elias, Damian

    Due to their poor eyesight, spiders rely on web vibrations for situational awareness. Web-borne vibrations are used to determine the location of prey, predators, and potential mates. The influence of web geometry and composition on web vibrations is important for understanding spiders' behavior and ecology. Past studies on web vibrations have experimentally measured the frequency response of web geometries by removing threads from existing webs. The full influence of web structure and tension distribution on vibration transmission, however, has not been addressed in prior work. We have constructed physical artificial webs and computer models to better understand the effect of web structure on vibration transmission. These models provide insight into the propagation of vibrations through the webs, the frequency response of the bare web, and the influence of the spider's mass and stiffness on the vibration transmission patterns. Funded by NSF-1504428.

  20. Funnel-web spider bite

    MedlinePlus

    Funnel-web spider bite (MedlinePlus encyclopedia, //medlineplus.gov/ency/article/002844.htm): describes the effects of a bite from the funnel-web spider. Male funnel-web spiders are more poisonous ...