Science.gov

Sample records for efficient web log

  1. Efficient Preprocessing technique using Web log mining

    NASA Astrophysics Data System (ADS)

    Raiyani, Sheetal A.; Jain, Shailendra

    2012-11-01

    Web Usage Mining can be described as the discovery and analysis of user access patterns through mining of log files and associated data from a particular website. Large numbers of visitors interact daily with web sites around the world, generating enormous amounts of data, and this information can be very valuable to a company seeking to understand customer behavior. In this paper a complete preprocessing approach comprising data cleaning, user identification, and session identification is presented to improve the quality of the data. User identification, a key issue in the preprocessing phase, aims to identify unique web users. Traditional user identification is based on the site structure supported by heuristic rules, which reduces the efficiency of user identification. To resolve this difficulty we introduce a proposed technique, DUI (Distinct User Identification), based on IP address, agent, session time, and pages referred during the desired session time. The results can be used in counterterrorism, fraud detection, and detection of unusual access to secure data, while detection of regular user access behavior can improve the overall design and performance of subsequent preprocessing.
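
    The abstract does not give implementation details, but the core idea of distinct user and session identification can be sketched roughly as follows: group log entries by (IP address, user agent) and split each group into sessions whenever the gap between hits exceeds a timeout. The field names and the 30-minute cutoff below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of distinct-user and session identification from web log
# records, loosely in the spirit of the DUI idea described above. Field names
# and the 30-minute session timeout are illustrative assumptions.
from collections import defaultdict
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # assumed heuristic cutoff

def identify_users_and_sessions(records):
    """records: iterable of dicts with 'ip', 'agent', 'time' (ISO string), 'url'."""
    by_user = defaultdict(list)
    for r in records:
        # Treat each distinct (IP address, user agent) pair as one user.
        by_user[(r["ip"], r["agent"])].append(r)

    sessions = {}
    for user, hits in by_user.items():
        hits.sort(key=lambda r: r["time"])      # ISO strings sort chronologically
        user_sessions, current, last_t = [], [], None
        for r in hits:
            t = datetime.fromisoformat(r["time"])
            if last_t is not None and t - last_t > SESSION_TIMEOUT:
                user_sessions.append(current)   # gap too long: start a new session
                current = []
            current.append(r["url"])
            last_t = t
        if current:
            user_sessions.append(current)
        sessions[user] = user_sessions
    return sessions

if __name__ == "__main__":
    demo = [
        {"ip": "10.0.0.1", "agent": "Mozilla", "time": "2012-11-01T10:00:00", "url": "/a"},
        {"ip": "10.0.0.1", "agent": "Mozilla", "time": "2012-11-01T10:05:00", "url": "/b"},
        {"ip": "10.0.0.1", "agent": "Mozilla", "time": "2012-11-01T12:00:00", "url": "/c"},
    ]
    print(identify_users_and_sessions(demo))
```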

  2. Analysis of Web Proxy Logs

    NASA Astrophysics Data System (ADS)

    Fei, Bennie; Eloff, Jan; Olivier, Martin; Venter, Hein

    Network forensics involves capturing, recording and analysing network audit trails. A crucial part of network forensics is to gather evidence at the server level, proxy level and from other sources. A web proxy relays URL requests from clients to a server. Analysing web proxy logs can give unobtrusive insights into the browsing behaviour of computer users and provide an overview of the Internet usage in an organisation. More importantly, in terms of network forensics, it can aid in detecting anomalous browsing behaviour. This paper demonstrates the use of a self-organising map (SOM), a powerful data mining technique, in network forensics. In particular, it focuses on how a SOM can be used to analyse data gathered at the web proxy level.
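
    As a rough illustration of the technique named here, the sketch below trains a minimal self-organising map (plain NumPy, not the authors' tooling) on numeric features that might be derived from proxy log entries; the grid size, learning schedule, and features are assumptions.

```python
# Minimal self-organising map (SOM) sketch in NumPy, trained on numeric
# features one might derive from proxy log entries (hour of day, response
# size, etc.). Features and parameters are illustrative only.
import numpy as np

def train_som(data, grid=(8, 8), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    # Coordinates of every node, used for the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # Best-matching unit: node whose weight vector is closest to x.
        dists = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        # Gaussian neighbourhood around the BMU shrinks over time.
        d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        influence = np.exp(-d2 / (2 * sigma ** 2))[..., None]
        weights += lr * influence * (x - weights)
    return weights

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy "proxy log" features: scaled hour of day and scaled bytes transferred.
    feats = np.column_stack([rng.random(500), rng.random(500)])
    som = train_som(feats)
    print(som.shape)  # (8, 8, 2)
```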

  3. Web Logs in the English Classroom: More Than Just Chat.

    ERIC Educational Resources Information Center

    Richardson, Will

    2003-01-01

    Details the use and appeal of Web logs to enhance classroom discussion and allow for outside involvement in the classroom. Defines a Web log, addresses discussing literature in a Web log, and describes the author's first attempts at using Web-log technology. Presents considerations for using Web logs as part of classroom instruction. (SG)

  4. Using Web Logs in the Science Classroom

    ERIC Educational Resources Information Center

    Duplichan, Staycle C.

    2009-01-01

    As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…

  6. Web Log Analysis: A Study of Instructor Evaluations Done Online

    ERIC Educational Resources Information Center

    Klassen, Kenneth J.; Smith, Wayne

    2004-01-01

    This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…

  8. Analysis of Web access logs for surveillance of influenza.

    PubMed

    Johnson, Heather A; Wagner, Michael M; Hogan, William R; Chapman, Wendy; Olszewski, Robert T; Dowling, John; Barnas, Gary

    2004-01-01

    The purpose of this study was to determine whether the level of influenza in a population correlates with the number of times that internet users access information about influenza on health-related Web sites. We obtained Web access logs from the Healthlink Web site. Web access logs contain information about the user and the information the user accessed, and are maintained electronically by most Web sites, including Healthlink. We developed weekly counts of the number of accesses of selected influenza-related articles on the Healthlink Web site and measured their correlation with traditional influenza surveillance data from the Centers for Disease Control and Prevention (CDC) using the cross-correlation function (CCF). We defined timeliness as the time lag at which the correlation was a maximum. There was a moderately strong correlation between the frequency of influenza-related article accesses and the CDC's traditional surveillance data, but the results on timeliness were inconclusive. With improvements in methods for performing spatial analysis of the data and the continuing increase in Web searching behavior among Americans, Web article access has the potential to become a useful data source for public health early warning systems.
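
    The cross-correlation step can be sketched as follows: correlate the weekly access counts with the surveillance series at a range of lags and report the lag with the maximum correlation as the timeliness estimate. The series below are synthetic placeholders, not Healthlink or CDC data.

```python
# Sketch of the cross-correlation idea: correlate weekly article-access counts
# with a surveillance series at several lags and report the lag with the
# highest correlation. The two series below are synthetic placeholders.
import numpy as np

def best_lag(web_counts, surveillance, max_lag=8):
    web = np.asarray(web_counts, dtype=float)
    surv = np.asarray(surveillance, dtype=float)
    results = {}
    for lag in range(-max_lag, max_lag + 1):
        # Positive lag: the web series leads surveillance by `lag` weeks.
        if lag >= 0:
            a, b = web[: len(web) - lag], surv[lag:]
        else:
            a, b = web[-lag:], surv[: len(surv) + lag]
        if len(a) > 2:
            results[lag] = np.corrcoef(a, b)[0, 1]
    lag = max(results, key=results.get)
    return lag, results[lag]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    season = np.sin(np.linspace(0, 3 * np.pi, 52)) + 1.5
    surveillance = season + 0.1 * rng.standard_normal(52)
    web_counts = np.roll(season, -2) + 0.1 * rng.standard_normal(52)  # web leads by ~2 weeks
    print(best_lag(web_counts, surveillance))
```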

  9. Comparing Web and Touch Screen Transaction Log Files

    PubMed Central

    Huntington, Paul; Williams, Peter

    2001-01-01

    Background Digital health information is available on a wide variety of platforms including PC-access of the Internet, Wireless Application Protocol phones, CD-ROMs, and touch screen public kiosks. All these platforms record details of user sessions in transaction log files, and there is a growing body of research into the evaluation of this data. However, there is very little research that has examined the problems of comparing the transaction log files of kiosks and the Internet. Objectives To provide a first step towards examining the problems of comparing the transaction log files of kiosks and the Internet. Methods We studied two platforms: touch screen kiosks and a comparable Web site. For both of these platforms, we examined the menu structure (which affects transaction log file data), the log-file structure, and the metrics derived from log-file records. Results We found substantial differences between the generated metrics. Conclusions None of the metrics discussed can be regarded as an effective way of comparing the use of kiosks and Web sites. Two metrics stand out as potentially comparable and valuable: the number of user sessions per hour and user penetration of pages. PMID:11720960
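
    Of the two metrics the authors single out, user sessions per hour is straightforward to compute from timestamped session records; the sketch below assumes ISO 8601 session start times, an illustrative input format rather than the kiosk or web log layout used in the study.

```python
# Sketch of the "user sessions per hour" metric mentioned above, computed
# from a list of session start times. The input format is an assumption.
from collections import Counter
from datetime import datetime

def sessions_per_hour(session_start_times):
    """session_start_times: iterable of ISO 8601 strings, one per user session."""
    counts = Counter()
    for ts in session_start_times:
        t = datetime.fromisoformat(ts)
        # Truncate each timestamp to the hour and tally sessions per hour bin.
        counts[t.replace(minute=0, second=0, microsecond=0)] += 1
    return dict(sorted(counts.items()))

if __name__ == "__main__":
    demo = ["2001-06-01T09:05:00", "2001-06-01T09:40:00", "2001-06-01T10:10:00"]
    for hour, n in sessions_per_hour(demo).items():
        print(hour.isoformat(), n)
```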

  10. Information needs for increasing log transport efficiency

    Treesearch

    Timothy P. McDonald; Steven E. Taylor; Robert B. Rummer; Jorge Valenzuela

    2001-01-01

    Three methods of dispatching trucks to loggers were tested using a log transport simulation model: random allocation, fixed assignment of trucks to loggers, and dispatch based on knowledge of the current status of trucks and loggers within the system. This 'informed' dispatch algorithm attempted to minimize the difference in time between when a logger would...

  11. Relevant Term Suggestion in Interactive Web Search Based on Contextual Information in Query Session Logs.

    ERIC Educational Resources Information Center

    Huang, Chien-Kang; Chien, Lee-Feng; Oyang, Yen-Jen

    2003-01-01

    Proposes an effective term suggestion approach to interactive Web searches. Explains a log-based approach to relevant term extraction and term suggestion where relevant terms suggested for a user query are those that co-occur in similar query sessions from search engine logs rather than in the retrieved documents. (Author/LRW)
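
    The co-occurrence idea can be illustrated with a small sketch: rank candidate terms by how often they appear in the same query sessions as the query term. The toy sessions and the raw-count ranking below are simplifications of the paper's log-based extraction.

```python
# Sketch of log-based relevant-term suggestion: rank terms by how often they
# co-occur with the query term within the same query session. The toy session
# data and the raw-count ranking are illustrative simplifications.
from collections import Counter

def suggest_terms(sessions, query_term, top_n=5):
    """sessions: list of query sessions, each a list of query strings."""
    co_counts = Counter()
    for session in sessions:
        # All distinct terms typed anywhere within this query session.
        terms = {t for q in session for t in q.lower().split()}
        if query_term in terms:
            co_counts.update(terms - {query_term})
    return co_counts.most_common(top_n)

if __name__ == "__main__":
    sessions = [
        ["web log mining", "session identification"],
        ["web log analysis", "apache log format"],
        ["holiday photos"],
    ]
    print(suggest_terms(sessions, "log"))
```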

  12. Efficient Web Services Policy Combination

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh; Harman, Joseph G.

    2010-01-01

    Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions do not have the ability to communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies to bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults where one organization proposes a rule on a particular feature, and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have inherent inefficiency, which heavily restrict their practical applications. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows an automatic and scalable composition of security policies between multiple organizations. It is based on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is when policies are represented in defeasible logic, and composition is based on rules for non-monotonic inference. In this system, policy writers construct metapolicies describing both the policy that they wish to enforce and annotations describing their composition preferences. These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take
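
    The priority-based intuition (though not the defeasible-logic machinery itself) can be illustrated with a toy sketch: when two policies disagree on a feature, the higher-priority rule wins, and when only one policy speaks, its rule is adopted.

```python
# Toy illustration of priority-based rule combination: when two policies give
# conflicting rules for the same feature, the higher-priority rule wins; when
# only one policy speaks, its rule is adopted. This is *not* the defeasible-
# logic composition method of the paper, only the conflict-resolution intuition.
def combine_policies(policy_a, policy_b):
    """Each policy maps feature -> (decision, priority); higher priority wins."""
    combined = {}
    for feature in set(policy_a) | set(policy_b):
        rule_a, rule_b = policy_a.get(feature), policy_b.get(feature)
        if rule_a is None or (rule_b is not None and rule_b[1] > rule_a[1]):
            combined[feature] = rule_b
        else:
            combined[feature] = rule_a
    return combined

if __name__ == "__main__":
    org1 = {"encryption": ("require-TLS", 10), "logging": ("full", 3)}
    org2 = {"encryption": ("allow-plain", 5), "auth": ("mutual-cert", 7)}
    print(combine_policies(org1, org2))
    # encryption -> require-TLS (priority 10), logging -> full, auth -> mutual-cert
```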

  13. Analyzing Web Server Logs to Improve a Site's Usage. The Systems Librarian

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2005-01-01

    This column describes ways to streamline and optimize how a Web site works in order to improve both its usability and its visibility. The author explains how to analyze logs and other system data to measure the effectiveness of the Web site design and search engine.
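
    A minimal example of the kind of server-log analysis described here is parsing Apache common-log-format lines and tallying the most requested paths; the sketch below uses a standard pattern for that format and is not tied to any particular analysis product.

```python
# Minimal sketch of server-log analysis of the kind described above: parse
# Apache common-log-format lines and tally the most requested paths.
import re
from collections import Counter

LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

def top_pages(lines, n=10):
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("status").startswith("2"):  # successful requests only
            counts[m.group("path")] += 1
    return counts.most_common(n)

if __name__ == "__main__":
    sample = [
        '203.0.113.7 - - [01/Mar/2005:10:00:00 +0000] "GET /catalog HTTP/1.0" 200 5120',
        '203.0.113.7 - - [01/Mar/2005:10:00:05 +0000] "GET /search HTTP/1.0" 404 312',
    ]
    print(top_pages(sample))
```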

  15. Users' Perceptions of the Web As Revealed by Transaction Log Analysis.

    ERIC Educational Resources Information Center

    Moukdad, Haidar; Large, Andrew

    2001-01-01

    Describes the results of a transaction log analysis of a Web search engine, WebCrawler, to analyze user's queries for information retrieval. Results suggest most users do not employ advanced search features, and the linguistic structure often resembles a human-human communication model that is not always successful in human-computer communication.…

  16. Statistics, Structures & Satisfied Customers: Using Web Log Data to Improve Site Performance.

    ERIC Educational Resources Information Center

    Peacock, Darren

    This paper explores some of the ways in which the National Museum of Australia is using Web analysis tools to shape its future directions in the delivery of online services. In particular, it explores the potential of quantitative analysis, based on Web server log data, to convert these ephemeral traces of user experience into a strategic…

  18. Blog Revolution: Expanding Classroom Horizons with Web Logs

    ERIC Educational Resources Information Center

    Richardson, Will

    2005-01-01

    Blogs are not a passing fad as a new blog is created every second. There are more than 900,000 blog posts a day. Blogs are one of many new disruptive technologies that are transforming the world. They are creating a richer, more dynamic, more interactive Web where participation is the rule rather than the exception. Classrooms and schools are…

  19. An Efficient Web Page Ranking for Semantic Web

    NASA Astrophysics Data System (ADS)

    Chahal, P.; Singh, M.; Kumar, S.

    2014-01-01

    With the enormous amount of information presented on the web, the retrieval of relevant information has become a serious problem and has been a topic of research for the last few years. The most common tools for retrieving information from the web are search engines such as Google. Search engines are usually based on keyword searching and indexing of web pages. This approach is not very efficient, as the result set of web pages obtained includes many irrelevant pages; sometimes even the entire result set may be irrelevant to the user. The next generation of search engines must address this problem. Recently, many semantic web search engines have been developed, such as Ontolook and Swoogle, which help in searching for meaningful documents on the semantic web. In this process the ranking of the retrieved web pages is crucial, and the attempts made so far at ranking semantic web documents are neither satisfactory nor up to users' expectations. In this paper we propose a semantic-web-based document ranking scheme that relies not only on the keywords but also on the conceptual instances present between the keywords, so that only relevant pages appear at the top of the result set. We explore all relevant relations between the keywords, reflecting the user's intention, and then calculate the fraction of these relations on each web page to determine its relevance. We have found that this ranking technique gives better results than the prevailing methods.

  20. Effect of Temporal Relationships in Associative Rule Mining for Web Log Data

    PubMed Central

    Mohd Khairudin, Nazli; Mustapha, Aida

    2014-01-01

    The advent of web-based applications and services has created such diverse and voluminous web log data stored in web servers, proxy servers, client machines, or organizational databases. This paper attempts to investigate the effect of temporal attribute in relational rule mining for web log data. We incorporated the characteristics of time in the rule mining process and analysed the effect of various temporal parameters. The rules generated from temporal relational rule mining are then compared against the rules generated from the classical rule mining approach such as the Apriori and FP-Growth algorithms. The results showed that by incorporating the temporal attribute via time, the number of rules generated is subsequently smaller but is comparable in terms of quality. PMID:24587757

  1. Effect of temporal relationships in associative rule mining for web log data.

    PubMed

    Khairudin, Nazli Mohd; Mustapha, Aida; Ahmad, Mohd Hanif

    2014-01-01

    The advent of web-based applications and services has created such diverse and voluminous web log data stored in web servers, proxy servers, client machines, or organizational databases. This paper attempts to investigate the effect of temporal attribute in relational rule mining for web log data. We incorporated the characteristics of time in the rule mining process and analysed the effect of various temporal parameters. The rules generated from temporal relational rule mining are then compared against the rules generated from the classical rule mining approach such as the Apriori and FP-Growth algorithms. The results showed that by incorporating the temporal attribute via time, the number of rules generated is subsequently smaller but is comparable in terms of quality.
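
    As a rough sketch of adding a temporal attribute to the mining step, the example below filters web log transactions to a time window before running a basic frequent-itemset pass; the window, support threshold, and restriction to item pairs are illustrative simplifications rather than the paper's Apriori/FP-Growth setup.

```python
# Sketch of the general idea of adding a temporal attribute to association
# rule mining over web log transactions: filter transactions to a time window
# first, then run a basic frequent-itemset pass. The window, threshold, and
# restriction to item pairs are simplifications, not the paper's comparison.
from collections import Counter
from datetime import datetime
from itertools import combinations

def frequent_pairs_in_window(transactions, start, end, min_support=2):
    """transactions: list of (iso_timestamp, set_of_pages)."""
    window = [items for ts, items in transactions
              if start <= datetime.fromisoformat(ts) <= end]
    pair_counts = Counter()
    for items in window:
        for pair in combinations(sorted(items), 2):
            pair_counts[pair] += 1
    return {p: c for p, c in pair_counts.items() if c >= min_support}

if __name__ == "__main__":
    txns = [
        ("2014-01-05T10:00:00", {"/home", "/products", "/cart"}),
        ("2014-01-05T18:30:00", {"/home", "/cart"}),
        ("2014-03-20T09:00:00", {"/home", "/blog"}),
    ]
    window = frequent_pairs_in_window(
        txns, datetime(2014, 1, 1), datetime(2014, 1, 31))
    print(window)  # {('/cart', '/home'): 2}
```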

  2. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  4. Analysis of Large Data Logs: An Application of Poisson Sampling on Excite Web Queries.

    ERIC Educational Resources Information Center

    Ozmutlu, H. Cenk; Spink, Amanda; Ozmutlu, Seda

    2002-01-01

    Discusses the need for tools that allow effective analysis of search engine queries to provide a greater understanding of Web users' information seeking behavior and describes a study that developed an effective strategy for selecting samples from large-scale data sets. Reports on Poisson sampling with data logs from the Excite search engine.…
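
    In the survey-sampling sense used here, Poisson sampling includes each log record independently with its own inclusion probability, which makes it easy to thin a very large query log in one streaming pass; the uniform 1% probability below is a toy choice, not the study's design.

```python
# Sketch of Poisson sampling over a large query log: each record is included
# independently with its own inclusion probability, so the log can be thinned
# in a single streaming pass. The uniform 1% probability is a toy choice.
import random

def poisson_sample(records, inclusion_prob=0.01, seed=42):
    rng = random.Random(seed)
    for record in records:          # records can be a generator over a huge log
        if rng.random() < inclusion_prob:
            yield record

if __name__ == "__main__":
    fake_log = (f"query-{i}" for i in range(100_000))
    sample = list(poisson_sample(fake_log))
    print(len(sample))  # roughly 1,000 records in expectation
```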

  5. Analyzing Engagement in a Web-Based Intervention Platform Through Visualizing Log-Data

    PubMed Central

    2014-01-01

    Background Engagement has emerged as a significant cross-cutting concern within the development of Web-based interventions. There have been calls to institute a more rigorous approach to the design of Web-based interventions, to increase both the quantity and quality of engagement. One approach would be to use log-data to better understand the process of engagement and patterns of use. However, an important challenge lies in organizing log-data for productive analysis. Objective Our aim was to conduct an initial exploration of the use of visualizations of log-data to enhance understanding of engagement with Web-based interventions. Methods We applied exploratory sequential data analysis to highlight sequential aspects of the log data, such as time or module number, to provide insights into engagement. After applying a number of processing steps, a range of visualizations were generated from the log-data. We then examined the usefulness of these visualizations for understanding the engagement of individual users and the engagement of cohorts of users. The visualizations created are illustrated with two datasets drawn from studies using the SilverCloud Platform: (1) a small, detailed dataset with interviews (n=19) and (2) a large dataset (n=326) with 44,838 logged events. Results We present four exploratory visualizations of user engagement with a Web-based intervention, including Navigation Graph, Stripe Graph, Start–Finish Graph, and Next Action Heat Map. The first represents individual usage and the last three, specific aspects of cohort usage. We provide examples of each with a discussion of salient features. Conclusions Log-data analysis through data visualization is an alternative way of exploring user engagement with Web-based interventions, which can yield different insights than more commonly used summative measures. We describe how understanding the process of engagement through visualizations can support the development and evaluation of Web

  6. Analyzing engagement in a web-based intervention platform through visualizing log-data.

    PubMed

    Morrison, Cecily; Doherty, Gavin

    2014-11-13

    Engagement has emerged as a significant cross-cutting concern within the development of Web-based interventions. There have been calls to institute a more rigorous approach to the design of Web-based interventions, to increase both the quantity and quality of engagement. One approach would be to use log-data to better understand the process of engagement and patterns of use. However, an important challenge lies in organizing log-data for productive analysis. Our aim was to conduct an initial exploration of the use of visualizations of log-data to enhance understanding of engagement with Web-based interventions. We applied exploratory sequential data analysis to highlight sequential aspects of the log data, such as time or module number, to provide insights into engagement. After applying a number of processing steps, a range of visualizations were generated from the log-data. We then examined the usefulness of these visualizations for understanding the engagement of individual users and the engagement of cohorts of users. The visualizations created are illustrated with two datasets drawn from studies using the SilverCloud Platform: (1) a small, detailed dataset with interviews (n=19) and (2) a large dataset (n=326) with 44,838 logged events. We present four exploratory visualizations of user engagement with a Web-based intervention, including Navigation Graph, Stripe Graph, Start-Finish Graph, and Next Action Heat Map. The first represents individual usage and the last three, specific aspects of cohort usage. We provide examples of each with a discussion of salient features. Log-data analysis through data visualization is an alternative way of exploring user engagement with Web-based interventions, which can yield different insights than more commonly used summative measures. We describe how understanding the process of engagement through visualizations can support the development and evaluation of Web-based interventions. Specifically, we show how visualizations

  7. Development of high efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Rohatgi, A.; Meier, D. L.; Campbell, R. B.; Schmidt, D. N.; Rai-Choudhury, P.

    1984-01-01

    Web base material is being improved with a goal toward obtaining solar cell efficiencies in excess of 18% (AM1). Carrier loss mechanisms in web silicon were investigated, techniques were developed to reduce carrier recombination in the web, and web cells were fabricated using effective surface passivation. The effect of stress on web cell performance was also investigated.

  8. Efficient Web Change Monitoring with Page Digest

    SciTech Connect

    Buttler, D J; Rocco, D; Liu, L

    2004-02-20

    The Internet and the World Wide Web have enabled a publishing explosion of useful online information, which has produced the unfortunate side effect of information overload: it is increasingly difficult for individuals to keep abreast of fresh information. In this paper we describe an approach for building a system for efficiently monitoring changes to Web documents. This paper has three main contributions. First, we present a coherent framework that captures different characteristics of Web documents. The system uses the Page Digest encoding to provide a comprehensive monitoring system for content, structure, and other interesting properties of Web documents. Second, the Page Digest encoding enables improved performance for individual page monitors through mechanisms such as short-circuit evaluation, linear time algorithms for document and structure similarity, and data size reduction. Finally, we develop a collection of sentinel grouping techniques based on the Page Digest encoding to reduce redundant processing in large-scale monitoring systems by grouping similar monitoring requests together. We examine how effective these techniques are over a wide range of parameters and have seen an order of magnitude speed up over existing Web-based information monitoring systems.

  9. Efficiently log and perforate 60 + wells with coiled tubing

    SciTech Connect

    Fertl, W.H.; Hotz, R.F.

    1987-07-01

    In today's petroleum industry, more and more emphasis is being placed on logging and completion techniques for highly deviated (extended-reach) and horizontal boreholes. This is the result of cost-effective development of oil and gas via: a minimum number of production platforms on large structures, incremental but marginal reserves in outlying and/or small fault blocks, shallow reservoirs in deep offshore waters, and significant hydrocarbon accumulations in environmentally sensitive and/or restrictive areas, e.g., perma-frost, urban areas, etc. The major challenge in logging such high-angle, extended-reach, and also horizontal boreholes is guiding the logging tool string to the bottom of the wellbore. In the horizontal portion of a borehole, the use of coiled tubing has proven successful in ''pushing'' the logging instrumentation toward the bottom (end) of the borehole.

  10. New Tools for Research on Instruction and Instructional Policy: A Web-Based Teacher Log. A CTP Working Paper.

    ERIC Educational Resources Information Center

    Ball, Deborah Loewenberg; Camburn, Eric; Correnti, Richard; Phelps, Geoffrey; Wallace, Raven

    This paper discusses the initial development and testing of a Web-based instrument for collecting daily data on instruction. This teacher log was developed for use in the Study of Instructional Improvement, a longitudinal study on school improvement in high poverty areas. The researchers wanted to further develop the potential of teacher logs by…

  11. MMT nightly tracking logs: a web-enabled database for continuous evaluation of tracking performance

    NASA Astrophysics Data System (ADS)

    Clark, D.; Gibson, J. D.; Porter, D.; Trebisky, T.

    2012-09-01

    Over the past few years, the MMT Observatory has developed a number of web browser front ends for operation interfaces and staff access to internal databases. Among these is a facility for viewing reduced tracking logs as both time series and FFTs for convenient examination of tracking performance. Part of the back-end software also keeps the tracking data in a searchable database, allowing data over long periods of time to be collected and analyzed to look for trends and the influence of environmental factors on tracking, and to help detect tracking degradation in a timely manner.
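
    The reduction described here can be sketched as turning a tracking log into an error time series and computing its amplitude spectrum with an FFT; the 10 Hz sample rate and the synthetic error signal below are assumptions, not MMT data.

```python
# Sketch of the kind of reduction described above: turn a tracking log into a
# time series of tracking error and compute its amplitude spectrum with an
# FFT. The 10 Hz sample rate and the synthetic error signal are assumptions.
import numpy as np

def error_spectrum(error_series, sample_rate_hz=10.0):
    err = np.asarray(error_series, dtype=float)
    err = err - err.mean()                      # remove the DC offset
    spectrum = np.abs(np.fft.rfft(err)) / len(err)
    freqs = np.fft.rfftfreq(len(err), d=1.0 / sample_rate_hz)
    return freqs, spectrum

if __name__ == "__main__":
    t = np.arange(0, 60, 0.1)                   # one minute sampled at 10 Hz
    # Synthetic tracking error: slow drift plus a 1.2 Hz oscillation.
    err = 0.005 * t + 0.3 * np.sin(2 * np.pi * 1.2 * t)
    freqs, amp = error_spectrum(err)
    print(f"dominant frequency: {freqs[amp.argmax()]:.2f} Hz")
```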

  12. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Rohatgi, A.; Meier, D. L.; Campbell, R. B.; Seidensticker, R. G.; Rai-Choudhury, P.

    1984-01-01

    The development of high efficiency solar cells on a silicon web is discussed. Heat treatment effects on web quality; the influence of twin plane lamellae, trace impurities and stress on minority carrier lifetime; and the fabrication of cells are discussed.

  13. Designing efficient logging systems for northern hardwoods using equipment production capabilities and costs.

    Treesearch

    R.B. Gardner

    1966-01-01

    Describes a typical logging system used in the Lake and Northeastern States, discusses each step in the operation, and presents a simple method for designing an efficient logging system for such an operation. Points out that a system should always be built around the key piece of equipment, which is usually the skidder. Specific equipment types and their production...

  14. Competency-based residency training and the web log: modeling practice-based learning and enhancing medical knowledge.

    PubMed

    Hollon, Matthew F

    2015-01-01

    By using web-based tools in medical education, there are opportunities to innovatively teach important principles from the general competencies of graduate medical education. Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI) in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. The frequency that residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008); however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39), remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correct when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001). Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents.

  15. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    ERIC Educational Resources Information Center

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…

  16. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  18. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    Backdoors or information leaks of Web servers can be detected by applying Web Mining techniques to abnormal Web log and Web application log data. The security of Web servers can thus be enhanced and the damage of illegal access avoided. Firstly, a system for discovering the patterns of information leakage in CGI scripts from Web log data is proposed. Secondly, those patterns are provided to system administrators so that they can modify their code and enhance Web site security. The following aspects are described: one is to combine the web application log with the web log to extract more information, so that web data mining can discover information that firewalls and intrusion detection systems cannot find; another is to propose an operation module for the web site to enhance its security. For the cluster server session, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
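
    The density-based clustering step can be illustrated with scikit-learn's DBSCAN on simple per-session features, flagging sessions labelled as noise for closer inspection; the features, scaling, and parameters below are assumptions rather than the paper's configuration.

```python
# Sketch of density-based clustering of web sessions: sessions that DBSCAN
# labels as noise (-1) are flagged for closer inspection. The features,
# scaling, and eps/min_samples values are illustrative assumptions.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

def flag_outlier_sessions(features, eps=0.8, min_samples=5):
    """features: array of shape (n_sessions, n_features), e.g. [requests, errors, bytes]."""
    X = StandardScaler().fit_transform(features)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X)
    return np.where(labels == -1)[0]            # indices of "noise" sessions

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    normal = rng.normal([50, 2, 1e5], [10, 1, 2e4], size=(200, 3))
    probing = np.array([[400, 350, 5e3]])       # one scanner-like session
    print(flag_outlier_sessions(np.vstack([normal, probing])))  # should include index 200
```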

  19. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Meier, D. L.; Greggi, J.; Okeeffe, T. W.; Rai-Choudhury, P.

    1986-01-01

    Work was performed to improve web base material with a goal of obtaining solar cell efficiencies in excess of 18% (AM1). Efforts in this program are directed toward identifying carrier loss mechanisms in web silicon, eliminating or reducing these mechanisms, designing a high efficiency cell structure with the aid of numerical models, and fabricating high efficiency web solar cells. Fabrication techniques must preserve or enhance carrier lifetime in the bulk of the cell and minimize recombination of carriers at the external surfaces. Three completed cells were viewed by cross-sectional transmission electron microscopy (TEM) in order to investigate further the relation between structural defects and electrical performance of web cells. Consistent with past TEM examinations, the cell with the highest efficiency (15.0%) had no dislocations but did have 11 twin planes.

  20. Competency-based residency training and the web log: modeling practice-based learning and enhancing medical knowledge†

    PubMed Central

    Hollon, Matthew F.

    2015-01-01

    Background By using web-based tools in medical education, there are opportunities to innovatively teach important principles from the general competencies of graduate medical education. Objectives Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI) in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. Method The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. Results The frequency that residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008); however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39), remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correct when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001). Conclusions Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents. PMID:26653701

  1. Increased capture of pediatric surgical complications utilizing a novel case-log web application to enhance quality improvement.

    PubMed

    Fisher, Jason C; Kuenzler, Keith A; Tomita, Sandra S; Sinha, Prashant; Shah, Paresh; Ginsburg, Howard B

    2017-01-01

    Documenting surgical complications is limited by multiple barriers and is not fostered in the electronic health record. Tracking complications is essential for quality improvement (QI) and required for board certification. Current registry platforms do not facilitate meaningful complication reporting. We developed a novel web application that improves accuracy and reduces barriers to documenting complications. We deployed a custom web application that allows pediatric surgeons to maintain case logs. The program includes a module for entering complication data in real time. Reminders to enter outcome data occur at key postoperative intervals to optimize recall of events. Between October 1, 2014, and March 31, 2015, frequencies of surgical complications captured by the existing hospital reporting system were compared with data aggregated by our application. 780 cases were captured by the web application, compared with 276 cases registered by the hospital system. We observed an increase in the capture of major complications when compared to the hospital dataset (14 events vs. 4 events). This web application improved real-time reporting of surgical complications, exceeding the accuracy of administrative datasets. Custom informatics solutions may help reduce barriers to self-reporting of adverse events and improve the data that presently inform pediatric surgical QI. Diagnostic study/Retrospective study. Level III - case control study.

  2. Evaluation of the Feasibility of Screening Patients for Early Signs of Lung Carcinoma in Web Search Logs.

    PubMed

    White, Ryen W; Horvitz, Eric

    2017-03-01

    A statistical model that predicts the appearance of strong evidence of a lung carcinoma diagnosis via analysis of large-scale anonymized logs of web search queries from millions of people across the United States. To evaluate the feasibility of screening patients at risk of lung carcinoma via analysis of signals from online search activity. We identified people who issue special queries that provide strong evidence of a recent diagnosis of lung carcinoma. We then considered patterns of symptoms expressed as searches about concerning symptoms over several months prior to the appearance of the landmark web queries. We built statistical classifiers that predict the future appearance of landmark queries based on the search log signals. This was a retrospective log analysis of the online activity of millions of web searchers seeking health-related information online. Of web searchers who queried for symptoms related to lung carcinoma, some (n = 5443 of 4 813 985) later issued queries that provide strong evidence of recent clinical diagnosis of lung carcinoma and are regarded as positive cases in our analysis. Additional evidence on the reliability of these queries as representing clinical diagnoses is based on the significant increase in follow-on searches for treatments and medications for these searchers and on the correlation between lung carcinoma incidence rates and our log-based statistics. The remaining symptom searchers (n = 4 808 542) are regarded as negative cases. Performance of the statistical model for early detection from online search behavior, for different lead times, different sets of signals, and different cohorts of searchers stratified by potential risk. The statistical classifier predicting the future appearance of landmark web queries based on search log signals identified searchers who later input queries consistent with a lung carcinoma diagnosis, with a true-positive rate ranging from 3% to 57% for false-positive rates ranging
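
    The general setup, though not the authors' model, can be sketched as training a classifier on per-searcher features extracted from query histories to predict whether a landmark diagnostic query later appears; the features, synthetic data, and logistic-regression choice below are purely illustrative.

```python
# Sketch of the general setup described above: per-searcher features derived
# from query histories are used to train a classifier that predicts whether a
# later "landmark" diagnostic query appears. The features, synthetic data, and
# logistic-regression choice are illustrative; they are not the study's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Toy features per searcher: symptom-query count, weeks of symptom searching,
# and count of risk-related queries.
X = np.column_stack([
    rng.poisson(2, n), rng.integers(0, 20, n), rng.poisson(1, n),
]).astype(float)
# Synthetic labels with a weak dependence on the features (positives are rare).
logits = 0.4 * X[:, 0] + 0.1 * X[:, 1] + 0.5 * X[:, 2] - 6.0
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X_tr, y_tr)
print("ROC AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```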

  3. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Meier, D. L.; Greggi, J.; Rai-Choudhury, P.

    1986-01-01

    Work is reported aimed at identifying and reducing sources of carrier recombination both in the starting web silicon material and in the processed cells. Cross-sectional transmission electron microscopy measurements of several web cells were made and analyzed. The effect of the heavily twinned region on cell efficiency was modeled, and the modeling results compared to measured values for processed cells. The effects of low energy, high dose hydrogen ion implantation on cell efficiency and diffusion length were examined. Cells were fabricated from web silicon known to have a high diffusion length, with a new double layer antireflection coating being applied to these cells. A new contact system, to be used with oxide passivated cells and which greatly reduces the area of contact between metal and silicon, was designed. The application of DLTS measurements to beveled samples was further investigated.

  4. Capricorn-A Web-Based Automatic Case Log and Volume Analytics for Diagnostic Radiology Residents.

    PubMed

    Chen, Po-Hao; Chen, Yin Jie; Cook, Tessa S

    2015-10-01

    On-service clinical learning is a mainstay of radiology education. However, an accurate and timely case log is difficult to keep, especially in the absence of software tools tailored to resident education. Furthermore, volume-related feedback from the residency program sometimes occurs months after a rotation ends, limiting the opportunity for meaningful intervention. We surveyed the residents of a single academic institution to evaluate the current state of and the existing need for tracking interpretation volume. Using the results of the survey, we created an open-source automated case log software. Finally, we evaluated the effect of the software tool on the residency in a 1-month, postimplementation survey. Before implementation of the system, 89% of respondents stated that volume is an important component of training, but 71% stated that volume data was inconvenient to obtain. Although the residency program provides semiannual reviews, 90% preferred reviewing interpretation volumes at least once monthly. After implementation, 95% of the respondents stated that the software is convenient to access, 75% found it useful, and 88% stated they would use the software at least once a month. The included analytics module, which benchmarks the user using historical aggregate average volumes, is the most often used feature of the software. Server log demonstrates that, on average, residents use the system approximately twice a week. An automated case log software system may fulfill a previously unmet need in diagnostic radiology training, making accurate and timely review of volume-related performance analytics a convenient process.

  5. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Meier, D. L.

    1986-01-01

    Achievement of higher efficiency cells by directing efforts toward identifying carrier loss mechanisms; design of cell structures; and development of processing techniques are described. Use of techniques such as deep-level transient spectroscopy (DLTS), laser-beam-induced current (LBIC), and transmission electron microscopy (TEM) indicated that dislocations in web material rather than twin planes were primarily responsible for limiting diffusion lengths in the web. Lifetimes and cell efficiencies can be improved from 19 to 120 microns, and 8 to 10.3% (no AR), respectively, by implanting hydrogen at 1500 eV and a beam current density of 2.0 mA/sq cm. Some of the processing improvements included use of a double-layer AR coating (ZnS and MgF2) and an addition of an aluminum back surface reflectors. Cells of more than 16% efficiency were achieved.

  6. Log In to Experiential Learning Theory: Supporting Web-Based Faculty Development.

    PubMed

    Omer, Selma; Choi, Sunhea; Brien, Sarah; Parry, Marcus

    2017-09-27

    For an increasingly busy and geographically dispersed faculty, the Faculty of Medicine at the University of Southampton, United Kingdom, developed a range of Web-based faculty development modules, based on Kolb's experiential learning cycle, to complement the faculty's face-to-face workshops. The objective of this study was to assess users' views and perceptions of the effectiveness of Web-based faculty development modules based on Kolb's experiential learning cycle. We explored (1) users' satisfaction with the modules, (2) whether Kolb's design framework supported users' learning, and (3) whether the design principle impacts their work as educators. We gathered data from users over a 3-year period using evaluation surveys built into each of the seven modules. Quantitative data were analyzed using descriptive statistics, and responses to open-ended questions were analyzed using content analysis. Out of the 409 module users, 283 completed the survey (69.1% response rate). Over 80% of the users reported being satisfied or very satisfied with seven individual aspects of the modules. The findings suggest a strong synergy between the design features that users rated most highly and the key stages of Kolb's learning cycle. The use of simulations and videos to give the users an initial experience as well as the opportunity to "Have a go" and receive feedback in a safe environment were both considered particularly useful. In addition to providing an opportunity for reflection, many participants considered that the modules would enhance their roles as educators through: increasing their knowledge on various education topics and the required standards for medical training, and improving their skills in teaching and assessing students through practice and feedback and ultimately increasing their confidence. Kolb's theory-based design principle used for Web-based faculty development can support faculty to improve their skills and has impact on their role as educators

  7. pcrEfficiency: a Web tool for PCR amplification efficiency prediction

    PubMed Central

    2011-01-01

    Background Relative calculation of differential gene expression in quantitative PCR reactions requires comparison between amplification experiments that include reference genes and genes under study. Ignoring the differences between their efficiencies may lead to miscalculation of gene expression even with the same starting amount of template. Although there are several tools performing PCR primer design, there is no tool available that predicts PCR efficiency for a given amplicon and primer pair. Results We have used a statistical approach based on 90 primer pair combinations amplifying templates from bacteria, yeast, plants and humans, ranging in size between 74 and 907 bp, to identify the parameters that affect PCR efficiency. We developed a generalized additive model fitting the data and constructed an open source Web interface that, starting from a given sequence, provides oligonucleotides optimized for PCR together with their predicted amplification efficiencies. Conclusions pcrEfficiency provides an easy-to-use web interface allowing the prediction of PCR efficiencies prior to wet lab experiments, thus easing quantitative real-time PCR set-up. A web-based service as well as the source code are provided freely at http://srvgen.upct.es/efficiency.html under the GPL v2 license. PMID:22014212

  8. Blog/web log - a new easy and interactive website building tool for a non-net savvy radiologist.

    PubMed

    Sethi, Sumer K

    2007-05-01

    Recently, there has been no escaping the mention of blogs in the media. Blogging has emerged as a social phenomenon, which has impacted politics, business, and communication. Blogging software has enabled people with limited knowledge of the Internet to publish their thoughts online and participate in a global conversation, while the Blogosphere has hyperaccelerated the spread of information. Technorati, a blog search engine, is now tracking over 7.8 million weblogs and 937 million links, and reports that there are about 30,000 to 40,000 new blogs created a day. The majority of people who blog do so as a hobby, using blogs to publish their thoughts, feelings, and viewpoints on whatever topics interest them. Blogging software also enables people to post pictures, music, and more recently videos. For many people blogs are used as online journals or diaries; other people use blogs to communicate with their family and friends. Whether people generate revenue with their blogs or use them as a hobby, the one thing they all have in common is that they are part of the Blogosphere, or network of blogs, that gives people a voice and allows them to spread information at an unprecedented rate. Although searching PubMed produces few results for "blog" (6 relevant articles of 24), "weblog" (1 article), "web log" (8 entries of varying relevance), and "blogging" (4 articles) (none of which were in Radiology, RadioGraphics, AJR, or JVIR), blogging might well become an important means of information transfer in Radiology also. As radiology is an image-based science, a blog is a satisfying endeavor in that you can share your experiences with others instantaneously. In this context, I would like to submit my experience with an easy method for building a Web site known as blogging (maintaining a Web log). As a radiologist, I use my blog (http://www.sumerdoc.blogspot.com/; or http://www.indianradiology.com/) to post interesting cases from my routine practice along with any interesting

  9. Jensen-Bregman LogDet divergence with application to efficient similarity search for covariance matrices.

    PubMed

    Cherian, Anoop; Sra, Suvrit; Banerjee, Arindam; Papanikolopoulos, Nikolaos

    2013-09-01

    Covariance matrices have found success in several computer vision applications, including activity recognition, visual surveillance, and diffusion tensor imaging. This is because they provide an easy platform for fusing multiple features compactly. An important task in all of these applications is to compare two covariance matrices using a (dis)similarity function, for which the common choice is the Riemannian metric on the manifold inhabited by these matrices. As this Riemannian manifold is not flat, the dissimilarities should take into account the curvature of the manifold. As a result, such distance computations tend to slow down, especially when the matrix dimensions are large or gradients are required. Further, suitability of the metric to enable efficient nearest neighbor retrieval is an important requirement in the contemporary times of big data analytics. To alleviate these difficulties, this paper proposes a novel dissimilarity measure for covariances, the Jensen-Bregman LogDet Divergence (JBLD). This divergence enjoys several desirable theoretical properties and at the same time is computationally less demanding (compared to standard measures). Utilizing the fact that the square root of JBLD is a metric, we address the problem of efficient nearest neighbor retrieval on large covariance datasets via a metric tree data structure. To this end, we propose a K-Means clustering algorithm on JBLD. We demonstrate the superior performance of JBLD on covariance datasets from several computer vision applications.
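
    For symmetric positive-definite matrices X and Y, the JBLD divergence is commonly written as JBLD(X, Y) = log det((X + Y)/2) - (1/2) log det(XY); the sketch below implements that formula with slogdet for numerical stability.

```python
# The Jensen-Bregman LogDet divergence as commonly stated for symmetric
# positive-definite matrices X and Y:
#   JBLD(X, Y) = log det((X + Y) / 2) - (1/2) * log det(X @ Y)
# Implemented with slogdet for numerical stability; inputs must be SPD.
import numpy as np

def jbld(X, Y):
    _, ld_mid = np.linalg.slogdet((X + Y) / 2.0)
    _, ld_x = np.linalg.slogdet(X)
    _, ld_y = np.linalg.slogdet(Y)
    return ld_mid - 0.5 * (ld_x + ld_y)   # log det(XY) = log det X + log det Y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5)); X = A @ A.T + 5 * np.eye(5)   # make SPD
    B = rng.standard_normal((5, 5)); Y = B @ B.T + 5 * np.eye(5)
    print(jbld(X, X))   # 0.0 up to floating point
    print(jbld(X, Y))   # symmetric and nonnegative
```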

  10. Log files analysis to assess the use and workload of a dynamic web server dedicated to end-stage renal disease.

    PubMed

    Ben Said, Mohamed; Le Mignot, Loic; Richard, Jean Baptiste; Le Bihan, Christine; Toubiana, Laurent; Jais, Jean-Philippe; Landais, Paul

    2006-01-01

    A Multi-Source Information System (MSIS) has been designed for the Renal Epidemiology and Information Network (REIN) dedicated to End-Stage Renal Disease (ESRD). MSIS aims at providing reliable follow-up data for ESRD patients. It is based on an n-tier architecture, made up of a universal client and a dynamic Web server connected to a production database and to a data warehouse. MSIS has been operational since 2002 and is progressively being deployed in 9 regions in France. It includes 16,677 patients. We show that the analysis of MSIS web log files allows evaluating the use of the system and the workload from a public-health perspective.

  11. Log-Tool

    SciTech Connect

    Goodall, John

    2012-05-21

    Log files are typically semi- or un-structured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g. intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input – it will input data from files, standard input, and syslog, (2) Parser – it will parse the log file based on regular expressions into structured data (JSON format), (3) Output – it will output structured data into commonly used formats, including Redis (a database), standard output, and syslog.
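
    The parse-and-structure step can be sketched as applying a regular expression to raw log lines and emitting one JSON object per line; the syslog-style pattern below is an illustration, not Log-tool's own configuration.

```python
# Minimal sketch of the parse-and-structure step described above: apply a
# regular expression to raw log lines and emit one JSON object per line.
# This illustrates the idea only; it is not the Log-tool implementation.
import json
import re

# Example pattern for syslog-style lines: "<timestamp> <host> <program>: <message>"
PATTERN = re.compile(
    r"(?P<timestamp>\w{3} +\d+ [\d:]+) (?P<host>\S+) (?P<program>[^:]+): (?P<message>.*)"
)

def parse_lines(lines, pattern=PATTERN):
    for line in lines:
        m = pattern.match(line.rstrip("\n"))
        if m:
            yield json.dumps(m.groupdict())

if __name__ == "__main__":
    demo = ["May 21 10:15:32 host1 sshd[212]: Accepted publickey for alice"]
    for record in parse_lines(demo):
        print(record)
```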

  12. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling

    PubMed Central

    Devi, R. Suganya; Manjula, D.; Siddharth, R. K.

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web which has to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of Depth First Search Algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and its metadata such as title, keywords, and description are extracted. This content is very essential for any type of analyser work to be carried on the Big Data obtained as a result of Web Crawling. PMID:26137592

  13. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling.

    PubMed

    Devi, R Suganya; Manjula, D; Siddharth, R K

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web which has to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of Depth First Search Algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and its metadata such as title, keywords, and description are extracted. This content is very essential for any type of analyser work to be carried on the Big Data obtained as a result of Web Crawling.
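
    The overall idea can be sketched compactly: crawl links depth-first from a start URL and pull the title and meta description from each page. The standard-library sketch below simplifies depth limiting and politeness, and it is not the paper's modified Depth First Search algorithm.

```python
# Compact sketch of the overall idea above: crawl links depth-first from a
# start URL and pull the <title> and meta description from each page, using
# only the standard library. This is not the paper's modified DFS algorithm.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links, self.title, self.description = [], "", ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def crawl(url, max_depth=2, seen=None):
    """Depth-first crawl that yields (url, title, description) tuples."""
    seen = set() if seen is None else seen
    if max_depth < 0 or url in seen:
        return
    seen.add(url)
    try:
        html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    except OSError:
        return
    parser = PageParser()
    parser.feed(html)
    yield url, parser.title.strip(), parser.description
    for link in parser.links:
        yield from crawl(urljoin(url, link), max_depth - 1, seen)

if __name__ == "__main__":
    for page in crawl("https://example.com", max_depth=1):
        print(page)
```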

  14. Evolving dynamic web pages using web mining

    NASA Astrophysics Data System (ADS)

    Menon, Kartik; Dagli, Cihan H.

    2003-08-01

    The heterogeneity and the lack of structure that permeate much of the ever-expanding information sources on the WWW make it difficult for the user to properly and efficiently access different web pages. Different users have different needs from the same web page. It is necessary to train the system to understand the needs and demands of the users; in other words, there is a need for efficient and proper web mining. In this paper, issues and possible ways of training the system and providing a high level of organization for semi-structured data available on the web are discussed. Web pages can be evolved based on the history of query searches, browsing, links traversed, and observation of user behavior such as bookmarking and time spent viewing. Fuzzy clustering techniques help in grouping natural users and groups; neural networks, association rules, and web traversal patterns help in efficient sequential analysis based on previous searches and queries by the user. In this paper we analyze web server logs using the above-mentioned techniques to learn more about user interactions. Analyzing these web server logs helps to closely understand user behavior and web access patterns.

  15. Estimating Hardwood Sawmill Conversion Efficiency Based on Sawing Machine and Log.

    Treesearch

    Michael W. Wade; Steven H. Bullard; Philip H. Steele; Philip A. Araman

    1992-01-01

    Increased problems of hardwood timber availability have caused many sawmillers, industry analysts, and planners to recognize the importance of sawmill conversion efficiency. Conversion efficiency not only affects sawmill profits, but is also important on a much broader level. Timber supply issues have caused resource planners and policy makers to consider the effects...

  16. Understanding Academic Information Seeking Habits through Analysis of Web Server Log Files: The Case of the Teachers College Library Website

    ERIC Educational Resources Information Center

    Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary

    2009-01-01

    Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)

  18. Using client-side event logging and path tracing to assess and improve the quality of web-based surveys.

    PubMed Central

    White, Thomas M.; Hauan, Michael J.

    2002-01-01

    Web-based data collection has considerable appeal. However, the quality of data collected using such instruments is often questionable. There can be systematic problems with the wording of the surveys, and/or the means by which they are deployed. In unsupervised data collection, there are also concerns about whether subjects understand the questions, and whether they are answering honestly. This paper presents a schema for using client-side timestamps and traces of subjects' paths through instruments to detect problems with the definition of instruments and their deployment. We discuss two large, anonymous, web-based, medical surveys as examples of the utility of this approach. PMID:12463954
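
    The paper's schema is not reproduced here; as one hedged illustration of how client-side timestamps can flag questionable responses, the sketch below takes a hypothetical export of (respondent, question, timestamp) events and flags respondents whose median per-question dwell time is implausibly short. The record layout and threshold are assumptions.

```python
# Illustrative quality check over client-side event logs (hypothetical format):
# each record is (respondent_id, question_id, unix_timestamp). Respondents whose
# median per-question dwell time falls below a threshold are flagged for review.
from collections import defaultdict
from statistics import median

MIN_MEDIAN_SECONDS = 2.0  # assumed threshold, tune per instrument


def flag_speeders(events):
    by_respondent = defaultdict(list)
    for respondent, question, ts in events:
        by_respondent[respondent].append(ts)

    flagged = []
    for respondent, stamps in by_respondent.items():
        stamps.sort()
        dwell = [b - a for a, b in zip(stamps, stamps[1:])]
        if dwell and median(dwell) < MIN_MEDIAN_SECONDS:
            flagged.append(respondent)
    return flagged


if __name__ == "__main__":
    demo = [("r1", "q1", 0.0), ("r1", "q2", 0.5), ("r1", "q3", 1.1),
            ("r2", "q1", 0.0), ("r2", "q2", 12.0), ("r2", "q3", 30.0)]
    print(flag_speeders(demo))  # -> ['r1']
```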

  19. Enabling and Integrating Distributed Web Resources for Efficient and Effective Discovery of Information on the Web

    NASA Astrophysics Data System (ADS)

    Verma, Neeta; Thangamuthu, Pechimuthu; Mishra, Alka

    The National Portal of India [1] integrates information from distributed web resources such as the websites and portals of different Ministries, Departments, and State Governments, as well as district administrations. These websites were developed at different points in time, using different standards and technologies. Integrating information from such distributed, disparate web resources is therefore a challenging task, and it also affects information discovery by a citizen using a unified interface such as the National Portal. Existing text-based search engines would also not yield the desired results [7].

  20. Efficiency of log wood combustion affects the toxicological and chemical properties of emission particles.

    PubMed

    Tapanainen, Maija; Jalava, Pasi I; Mäki-Paakkanen, Jorma; Hakulinen, Pasi; Lamberg, Heikki; Ruusunen, Jarno; Tissari, Jarkko; Jokiniemi, Jorma; Hirvonen, Maija-Riitta

    2012-05-01

    Particulate matter (PM) has been identified as a major environmental pollutant causing severe health problems. Large amounts of harmful PM are emitted from residential wood combustion, but the toxicological properties of wood combustion particles are poorly known. The aim was to investigate the chemical and consequent toxicological characteristics of PM(1) emitted from different phases of batch combustion in four heating appliances. Mouse RAW264.7 macrophages and human BEAS-2B bronchial epithelial cells were exposed for 24 h to different doses (15-300 µg/mL) of wood combustion particles. After the exposure, cytotoxicity, genotoxicity, production of the inflammatory mediators (TNF-α and MIP-2) and effects on the cell cycle were assessed. Furthermore, the detected toxicological responses were compared with the chemical composition of the PM(1) samples, including PAHs, metals and ions. All the wood combustion samples exerted high cytotoxicity, but only moderate inflammatory activity. The particles emitted from the inefficient phase of batch combustion in the sauna stove (SS) induced the most extensive cytotoxic and genotoxic responses in mammalian cells. Polycyclic aromatic hydrocarbons (PAHs) and other organic compounds in the PM(1) samples might have contributed to these effects. In contrast, water-soluble metals seemed to participate in the cytotoxic responses triggered by the particles from more efficient batch combustion in the masonry heaters. Overall, the toxicological responses decreased when the combustion phase was more efficient. The efficiency of batch combustion plays a significant role in the harmfulness of PM even under incomplete wood combustion processes.

  1. Standards, Efficiency, and the Evolution of Web Design

    ERIC Educational Resources Information Center

    Mitchell, Erik

    2010-01-01

    The author recently created a presentation using HTML5 based on a tutorial put together by Marcin Wichary. The example presentation is part proof-of-concept, part instructional piece, and it is part of a larger site on HTML5 and how one can use it to create rich Web-based applications. The more he delved into HTML5, the more he found that it was…

  3. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Rohatgi, A.; Meier, D. L.; Campbell, R. B.; Seidensticker, R. G.; Rai-Choudhury, P.

    1985-01-01

    High-efficiency dendritic web cells were discussed. The influence of twin planes and heat treatment on the location and effect of trace impurities was of particular interest. Proper heat treatment often increases efficiency by causing impurities to pile up at twin planes. Oxide passivation had a beneficial effect on efficiency. A very efficient antireflective (AR) coating of zinc selenide and magnesium fluoride was designed and fabricated. An aluminum back-surface reflector was also effective.

  4. Re-evaluation of an improved efficiency polymeric web point-focus Fresnel lens

    SciTech Connect

    Stillwell, C.B.

    1988-08-01

    The optical efficiency of the lens developed by 3M and reported in Development and Evaluation of an Improved Efficiency Polymeric Web Point-Focus Fresnel Lens was measured by Sandia and reported to be 82%. Subsequent to publication of that report, additional lens tests at Sandia showed a lens efficiency of only 79%. This report presents the results of a study to determine why the lens efficiency is now lower than originally observed. 2 refs., 5 figs., 2 tabs.

  5. Towards a Simple and Efficient Web Search Framework

    DTIC Science & Technology

    2014-11-01

    any useful information about the various aspects of a topic. For example, for the query “raspberry pi”, it covers topics such as “what is raspberry pi” ... topics generated by the LDA topic model for the query “raspberry pi”. One simple explanation is that web texts are too noisy and unfocused for the LDA process ... “making a raspberry pi”. However, the topics generated based on the 10 top-ranked documents do not make much sense to us in terms of their keywords

  6. Recovery Efficiency Test Project: Phase 1, Activity report. Volume 1: Site selection, drill plan preparation, drilling, logging, and coring operations

    SciTech Connect

    Overbey, W.K. Jr.; Carden, R.S.; Kirr, J.N.

    1987-04-01

    The Recovery Efficiency Test well project addressed a number of technical issues. The primary objective was to determine the increased gas recovery efficiency of a long horizontal wellbore over that of a vertical wellbore and, more specifically, what improvements can be expected from inducing multiple hydraulic fractures from such a wellbore. BDM Corporation located, planned, and drilled a long-radius-turn horizontal well in the Devonian shale Lower Huron section in Wayne County, West Virginia, demonstrating that state-of-the-art technology is capable of drilling such wells. BDM successfully tested drilling, coring, and logging in a horizontal well using air as the circulating medium; conducted reservoir modeling studies to project flow rates and reserves in advance of drilling operations; observed two-phase flow conditions in the wellbore not observed previously; cored a fracture zone which produced gas; observed that fractures in the core and the wellbore were not systematically spaced (varying from 5 to 68 feet in different parts of the wellbore); and observed that the highest gas show rates reported by the mud logger corresponded to the zone with the lowest fracture spacing (five feet), i.e., the highest fracture frequency. Four-and-one-half-inch casing was successfully installed in the borehole and was equipped to isolate the horizontal section into eight (8) zones for future testing and stimulation operations. 6 refs., 48 figs., 10 tabs.

  7. Production and food web efficiency decrease as fishing activity increases in a coastal ecosystem

    NASA Astrophysics Data System (ADS)

    Anh, Pham Viet; Everaert, Gert; Goethals, Peter; Vinh, Chu Tien; De Laender, Frederik

    2015-11-01

    Fishing effort in the Vietnamese coastal ecosystem has rapidly increased from the 1990s to the 2000s, with unknown consequences for local ecosystem structure and functioning. Using ecosystem models that integrate fisheries and food webs we found profound differences in the production of six functional groups, the food web efficiency, and eight functional food web indices between the 1990s (low fishing intensity) and the 2000s (high fishing intensity). The functional attributes (e.g. consumption) of high trophic levels (e.g. predators) were lower in the 2000s than in the 1990s while primary production did not vary, causing food web efficiency to decrease up to 40% with time for these groups. The opposite was found for lower trophic levels (e.g. zooplankton): the functional attributes and food web efficiency increased with time (22 and 10% for the functional attributes and food web efficiency, respectively). Total system throughput, a functional food web index, was about 10% higher in the 1990s than in the 2000s, indicating a reduction of the system size and activity with time. The network analyses further indicated that the Vietnamese coastal ecosystem in the 1990s was more developed (higher ascendancy and capacity), more stable (higher overhead) and more mature (higher ratio of ascendancy and capacity) than in the 2000s. In the 1990s the recovery time of the ecosystem was shorter than in 2000s, as indicated by a higher Finn's cycling index in the 1990s (7.8 and 6.5% in 1990s and 2000s, respectively). Overall, our results demonstrate that the Vietnamese coastal ecosystem has experienced profound changes between the 1990s and 2000s, and emphasise the need for a closer inspection of the ecological impact of fishing.

  8. 3Drefine: an interactive web server for efficient protein structure refinement

    PubMed Central

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-01-01

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of hydrogen bonding network combined with atomic-level energy minimization on the optimized model using a composite physics and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371

  9. 3Drefine: an interactive web server for efficient protein structure refinement.

    PubMed

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-07-08

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of hydrogen bonding network combined with atomic-level energy minimization on the optimized model using a composite physics and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. An efficient scheme for automatic web pages categorization using the support vector machine

    NASA Astrophysics Data System (ADS)

    Bhalla, Vinod Kumar; Kumar, Neeraj

    2016-07-01

    In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages from the Internet within a fraction of a second. To achieve this goal, an efficient categorization of web page contents is required. Manual categorization of these billions of web pages with high accuracy is a challenging task. Most of the existing techniques reported in the literature are semi-automatic, and a higher level of accuracy cannot be achieved using them. To meet these goals, this paper proposes an automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, extraction and evaluation of features are done first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed in the proposed scheme. Feature extraction and weight assignment are based on a collection of domain-specific keywords compiled by considering various domain pages. Moreover, the keyword list is reduced on the basis of the ids of the keywords in the list. Stemming of keywords and tag text is also performed to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method in combination with feature extraction and statistical analysis, using a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy on different categories of web pages.
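
    The scheme above relies on a custom HTML-DOM feature extractor and domain keyword weighting that are not reproduced here. Purely to illustrate the final classification step, the sketch below substitutes a generic TF-IDF representation and a linear SVM from scikit-learn; the categories and toy documents are made up.

```python
# Generic TF-IDF + linear SVM pipeline for page-text categorization.
# This stands in for the paper's custom DOM/keyword features; data are toy examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

train_pages = [
    "league match score goal player",       # sports
    "election parliament vote minister",     # politics
    "gene protein sequencing genome",        # science
    "tournament coach stadium fans",         # sports
]
train_labels = ["sports", "politics", "science", "sports"]

# Fit the whole text -> vector -> SVM chain in one call.
clf = make_pipeline(TfidfVectorizer(stop_words="english"), LinearSVC())
clf.fit(train_pages, train_labels)

print(clf.predict(["vote counting in parliament today"]))   # likely 'politics'
```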

  11. Spatial data efficient transmission in WebGIS based on IPv6

    NASA Astrophysics Data System (ADS)

    Wang, Zhen-feng; Liu, Ji-ping; Wang, Liang; Tao, Kun-wang

    2008-12-01

    The large size of spatial data and the limited bandwidth of networks make it difficult to transmit spatial data in WebGIS. This paper employs IPv6 (Internet Protocol version 6), the successor of the IPv4 protocol in use today, to transmit spatial data efficiently. As the core of the NGN (Next Generation Network), IPv6 brings many advantages for resolving the performance problems of current IPv4 network applications. Multicast, which is mandatory in IPv6 routers, allows one server to serve many clients simultaneously and efficiently, thus improving the capacity of network applications. The new anycast address type in IPv6 makes it possible for network client applications to find the nearest server, which makes data transmission between client and server as fast as possible. The paper describes how to apply IPv6 multicast and anycast in WebGIS to transmit data efficiently.
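
    As a hedged illustration of the multicast idea discussed above (not code from the paper), the following Python fragment joins an IPv6 multicast group and waits for a datagram, which is how many clients could receive the same spatial data sent once by a WebGIS server. The group address, port, and interface name are assumptions, and the socket options shown are Linux-oriented.

```python
# Minimal IPv6 multicast receiver (Linux-oriented sketch).
# Group address, port and interface name below are assumptions for illustration.
import socket
import struct

GROUP = "ff15::1234"   # transient, site-scoped multicast group (assumed)
PORT = 5007
IFACE = "eth0"         # assumed interface name

sock = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the multicast group on the chosen interface.
ifindex = socket.if_nametoindex(IFACE)
mreq = socket.inet_pton(socket.AF_INET6, GROUP) + struct.pack("@I", ifindex)
sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_JOIN_GROUP, mreq)

print(f"listening for spatial data on [{GROUP}]:{PORT} ...")
data, addr = sock.recvfrom(65535)   # e.g. one chunk of tile or feature data
print(f"received {len(data)} bytes from {addr[0]}")
```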

  12. Structural efficiency studies of corrugated compression panels with curved caps and beaded webs

    NASA Technical Reports Server (NTRS)

    Davis, R. C.; Mills, C. T.; Prabhakaran, R.; Jackson, L. R.

    1984-01-01

    Curved cross-sectional elements are employed in structural concepts for minimum-mass compression panels. Corrugated panel concepts with curved caps and beaded webs are optimized by using a nonlinear mathematical programming procedure and a rigorous buckling analysis. These panel geometries are shown to have superior structural efficiencies compared with known concepts published in the literature. Fabrication of these efficient corrugation concepts became possible by advances made in the art of superplastically forming of metals. Results of the mass optimization studies of the concepts are presented as structural efficiency charts for axial compression.

  13. Logging on to Learn

    ERIC Educational Resources Information Center

    Butler, Kevin

    2010-01-01

    A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…

  14. On Mining Web Access Logs

    DTIC Science & Technology

    2000-05-01

    IEEE Trans. Fuzzy Systems, 1:2, pp. 98–110, 1993. [30] R. Krishnapuram and J. M. Keller, "The Possibilistic c-Means Algorithm: Insights and ... defined heuristically in many different ways. We use the Fuzzy c-Means [10] membership model given by $u_{ij} = \left(1/r(x_j, v_i)\right)^{1/(m-1)} \big/ \sum_{k=1}^{c} \left(1/r(x_j, v_k)\right)^{1/(m-1)}$ ... (6) is $1/c$ times the harmonic mean of the dissimilarities $\{r(x_j, v_i) : i = 1, \ldots, c\}$ when $c = 2$. The objective function for the Robust Fuzzy c
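
    For reference, the membership update quoted in the snippet above can be computed directly from the dissimilarities of a point to the c cluster prototypes. The sketch below is a small stand-alone illustration of that formula; the distance values and the fuzzifier m are assumed.

```python
# Fuzzy c-Means membership u_ij for one data point x_j, given its dissimilarities
# r(x_j, v_i) to each of the c cluster prototypes (values here are assumed).
def fcm_memberships(dissimilarities, m=2.0, eps=1e-12):
    """u_ij = (1/r_i)^(1/(m-1)) / sum_k (1/r_k)^(1/(m-1))."""
    exponent = 1.0 / (m - 1.0)
    weights = [(1.0 / max(r, eps)) ** exponent for r in dissimilarities]
    total = sum(weights)
    return [w / total for w in weights]


if __name__ == "__main__":
    r = [0.5, 2.0, 4.0]               # assumed distances of x_j to three prototypes
    u = fcm_memberships(r, m=2.0)
    print([round(v, 3) for v in u])   # closest prototype gets the largest membership
    print(round(sum(u), 3))           # memberships sum to 1
```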

  15. LocExpress: a web server for efficiently estimating expression of novel transcripts.

    PubMed

    Hou, Mei; Tian, Feng; Jiang, Shuai; Kong, Lei; Yang, Dechang; Gao, Ge

    2016-12-22

    The temporally and spatially specific expression pattern of a transcript across multiple tissues and cell types can provide key clues about its function. While several gene atlases are available online as pre-computed databases for known gene models, it is still challenging to obtain expression profiles for previously uncharacterized (i.e., novel) transcripts efficiently. Here we developed LocExpress, a web server for efficiently estimating the expression of novel transcripts across multiple tissues and cell types in human (20 normal tissues/cell types and 14 cell lines) as well as in mouse (24 normal tissues/cell types and nine cell lines). As a wrapper to an RNA-Seq quantification algorithm, LocExpress reduces the time cost by making abundance estimation calls locally within the minimum spanning bundle region of the input transcripts. For a given novel gene model, such a local context-oriented strategy allows LocExpress to estimate its FPKMs in hundreds of samples within minutes on a standard Linux box, making an online web server possible. To the best of our knowledge, LocExpress is the only web server to provide nearly real-time expression estimation for novel transcripts in common tissues and cell types. The server is publicly available at http://loc-express.cbi.pku.edu.cn .

  16. Preparation With Web-Based Observational Practice Improves Efficiency of Simulation-Based Mastery Learning.

    PubMed

    Cheung, Jeffrey J H; Koh, Jansen; Brett, Clare; Bägli, Darius J; Kapralos, Bill; Dubrowski, Adam

    2016-10-01

    Our current understanding of what results in effective simulation-based training is restricted to the physical practice and debriefing stages, with little attention paid to the earliest stage: how learners are prepared for these experiences. This study explored the utility of Web-based observational practice (OP), featuring combinations of reading materials (RMs), OP, and collaboration, to prepare novice medical students for a simulation-based mastery learning (SBML) workshop in central venous catheterization. Thirty medical students were randomized into 3 groups differing in their preparatory materials for the SBML workshop: a control group with RMs only, an individual OP group, and a collaborative OP (COP) group, with the latter two receiving Web-based OP in addition to the RMs. Preparation occurred 1 week before the SBML workshop, followed by a retention test 1 week afterward. The impact on learning efficiency was measured by the time to completion (TTC) of the SBML workshop. Web site preparation behavior data were also collected. The Web-based groups demonstrated significantly lower TTC compared with the RM group (P = 0.038, d = 0.74). Although no differences were found between group performances at retention, the COP group spent significantly more time and produced more elaborate answers than the OP group on an OP activity during preparation. When preparing for SBML, Web-based OP is superior to reading materials alone; however, COP may be an important motivational factor in increasing learner engagement with instructional materials. Taken together, Web-based preparation and, specifically, OP may be an important consideration in optimizing simulation instructional design.

  17. Efficient 3D rendering for web-based medical imaging software: a proof of concept

    NASA Astrophysics Data System (ADS)

    Cantor-Rivera, Diego; Bartha, Robert; Peters, Terry

    2011-03-01

    Medical Imaging Software (MIS) found in research and in clinical practice, such as in Picture Archiving and Communication Systems (PACS) and Radiology Information Systems (RIS), has not been able to take full advantage of the Internet as a deployment platform. MIS is usually tightly coupled to algorithms that have substantial hardware and software requirements. Consequently, MIS is deployed on thick clients, which usually leads project managers to allocate more resources during the deployment phase of the application than the resources that would be allocated if the application were deployed through a web interface. To minimize the costs associated with this scenario, many software providers use or develop plug-ins to provide the delivery platform (internet browser) with the features to load, interact with, and analyze medical images. Nevertheless, there has not been a successful standard means of achieving this goal so far. This paper presents a study of WebGL as an alternative to plug-in development for efficient rendering of 3D medical models and DICOM images. WebGL is a technology that enables the internet browser to have access to the local graphics hardware in a native fashion. Because it is based on OpenGL, a widely accepted graphics industry standard, WebGL is being implemented in most of the major commercial browsers. After a discussion of the details of the technology, a series of experiments are presented to determine the operational boundaries in which WebGL is adequate for MIS. A comparison with current alternatives is also addressed. Finally, conclusions and future work are discussed.

  18. Evaluation of the efficiency and effectiveness of independent dose calculation followed by machine log file analysis against conventional measurement based IMRT QA.

    PubMed

    Sun, Baozhou; Rangaraj, Dharanipathy; Boddu, Sunita; Goddu, Murty; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar; Mutic, Sasa

    2012-09-06

    Experimental methods are commonly used for patient-specific IMRT delivery verification. There are a variety of IMRT QA techniques which have been proposed and clinically used with a common understanding that not one single method can detect all possible errors. The aim of this work was to compare the efficiency and effectiveness of independent dose calculation followed by machine log file analysis to conventional measurement-based methods in detecting errors in IMRT delivery. Sixteen IMRT treatment plans (5 head-and-neck, 3 rectum, 3 breast, and 5 prostate plans) created with a commercial treatment planning system (TPS) were recalculated on a QA phantom. All treatment plans underwent ion chamber (IC) and 2D diode array measurements. The same set of plans was also recomputed with another commercial treatment planning system and the two sets of calculations were compared. The deviations between dosimetric measurements and independent dose calculation were evaluated. The comparisons included evaluations of DVHs and point doses calculated by the two TPSs. Machine log files were captured during pretreatment composite point dose measurements and analyzed to verify data transfer and performance of the delivery machine. Average deviations between IC measurements and point dose calculations with the two TPSs for head-and-neck plans were 1.2 ± 1.3% and 1.4 ± 1.6%, respectively. For 2D diode array measurements, the mean gamma value with 3% dose difference and 3 mm distance-to-agreement was within 1.5% for 13 of 16 plans. The mean 3D dose differences calculated from the two TPSs were within 3% for head-and-neck cases and within 2% for other plans. The machine log file analysis showed that the gantry angle, jaw position, collimator angle, and MUs were consistent with the plan, and the maximal MLC position error was less than 0.5 mm. The independent dose calculation followed by the machine log analysis takes an average of 47 ± 6 minutes, while the experimental approach (using IC and
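
    Machine log formats are vendor-specific and are not reproduced in the record above. As a hedged illustration of the log-file check described (comparing delivered leaf positions against the plan and the 0.5 mm figure), the sketch below assumes a simple CSV export with planned and actual leaf positions in millimetres; the column names and file name are assumptions.

```python
# Sketch: compare planned vs. delivered MLC leaf positions from a CSV export.
# Column names and the CSV layout are assumptions; vendor log formats differ.
import csv

TOLERANCE_MM = 0.5   # threshold quoted in the study


def max_mlc_error(csv_path):
    worst = 0.0
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):   # expects columns: leaf_id, planned_mm, actual_mm
            err = abs(float(row["actual_mm"]) - float(row["planned_mm"]))
            worst = max(worst, err)
    return worst


if __name__ == "__main__":
    worst = max_mlc_error("mlc_log_export.csv")   # hypothetical file name
    status = "OK" if worst <= TOLERANCE_MM else "investigate"
    print(f"max leaf deviation = {worst:.2f} mm -> {status}")
```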

  19. Configuration of Appalachian logging roads

    Treesearch

    John E. Baumgras; John E. Baumgras

    1971-01-01

    The configuration - the curvature and grade - of logging roads in southern Appalachia is seldom severe, according to a recent Forest Service study. To improve the efficiency of logging roads, we must first define the characteristics of these roads; and in this report we provide a quantitative description of the configuration of over 200 miles of logging roads.

  20. Niche logging

    Treesearch

    Robert B. Rummer

    1997-01-01

    Logging is facing a world of change. A logger's niche can be defined by terrain, climate, location, timber and product, local government, Federal government, landowners, and mills. The author offers strategies for survival and successful competition.

  1. FunctionAnnotator, a versatile and efficient web tool for non-model organism annotation.

    PubMed

    Chen, Ting-Wen; Gan, Ruei-Chi; Fang, Yi-Kai; Chien, Kun-Yi; Liao, Wei-Chao; Chen, Chia-Chun; Wu, Timothy H; Chang, Ian Yi-Feng; Yang, Chi; Huang, Po-Jung; Yeh, Yuan-Ming; Chiu, Cheng-Hsun; Huang, Tzu-Wen; Tang, Petrus

    2017-09-05

    Along with the constant improvement in high-throughput sequencing technology, an increasing number of transcriptome sequencing projects are carried out in organisms without decoded genome information and even on environmental biological samples. To study the biological functions of novel transcripts, the very first task is to identify their potential functions. We present a web-based annotation tool, FunctionAnnotator, which offers comprehensive annotations, including GO term assignment, enzyme annotation, domain/motif identification and predictions for subcellular localization. To accelerate the annotation process, we have optimized the computation processes and used parallel computing for all annotation steps. Moreover, FunctionAnnotator is designed to be versatile, and it generates a variety of useful outputs for facilitating other analyses. Here, we demonstrate how FunctionAnnotator can be helpful in annotating non-model organisms. We further illustrate that FunctionAnnotator can estimate the taxonomic composition of environmental samples and assist in the identification of novel proteins by combining RNA-Seq data with proteomics technology. In summary, FunctionAnnotator can efficiently annotate transcriptomes and greatly benefits studies focusing on non-model organisms or metatranscriptomes. FunctionAnnotator, a comprehensive annotation web-service tool, is freely available online at: http://fa.cgu.edu.tw/ . This new web-based annotator will shed light on field studies involving organisms without a reference genome.

  2. ProteMiner-SSM: a web server for efficient analysis of similar protein tertiary substructures

    PubMed Central

    Chang, Darby Tien-Hau; Chen, Chien-Yu; Chung, Wen-Chin; Oyang, Yen-Jen; Juan, Hsueh-Fen; Huang, Hsuan-Cheng

    2004-01-01

    Analysis of protein–ligand interactions is a fundamental issue in drug design. As the detailed and accurate analysis of protein–ligand interactions involves calculation of binding free energy based on thermodynamics and even quantum mechanics, which is highly expensive in terms of computing time, conformational and structural analysis of proteins and ligands has been widely employed as a screening process in computer-aided drug design. In this paper, a web server called ProteMiner-SSM designed for efficient analysis of similar protein tertiary substructures is presented. In one experiment reported in this paper, the web server has been exploited to obtain some clues about a biochemical hypothesis. The main distinction in the software design of the web server is the filtering process incorporated to expedite the analysis. The filtering process extracts the residues located in the caves of the protein tertiary structure for analysis and operates with O(n log n) time complexity, where n is the number of residues in the protein. In comparison, the α-hull algorithm, which is a widely used algorithm in computer graphics for identifying those instances that are on the contour of a three-dimensional object, features O(n²) time complexity. Experimental results show that the filtering process presented in this paper is able to speed up the analysis by a factor ranging from 3.15 to 9.37 times. The ProteMiner-SSM web server can be found at http://proteminer.csie.ntu.edu.tw/. There is a mirror site at http://p4.sbl.bc.sinica.edu.tw/proteminer/. PMID:15215355

  3. Development and evaluation of an improved efficiency polymeric web point-focus Fresnel lens

    SciTech Connect

    Cobb, S. Jr.

    1987-04-01

    The feasibility of producing parquets of point-focus Fresnel lenses with a 2° draft angle on the riser in a continuous polymeric web is described. The parquet produced consisted of 14 square lenses, each 8.16 in. on a side, in a 2 by 7 format. The primary aim was to show that an increased efficiency was possible over that reported in SAND83-7023 by decreasing the draft angle. A secondary aim was also to produce a web of sufficient thickness to be used without lamination to a thick superstrate. The results demonstrated that increased efficiency was realized for both the thin and thick caliper material, with performance nearly equal to a direct-cut control lens. The results also show that a bowing or sagging problem exists in the laminated lenses. They also show that the thicker, non-laminated lenses may not be stiff enough to lie flat and may buckle, causing these lenses to be potentially unacceptable.

  4. Major constrains of the pelagic food web efficiency in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Zoccarato, L.; Fonda Umani, S.

    2015-03-01

    Grazing pressure plays a key role in plankton communities, affecting their biodiversity and shaping their structures. Predation exerted by 2-200 μm protists (i.e. microzooplankton and heterotrophic nanoplankton) influences the carbon fate in marine environments, channeling new organic matter from the microbial loop toward the "classic" grazing food web. In this study, we analyzed more than 80 dilution experiments carried out in many Mediterranean sites at the surface and in the meso-bathypelagic layers. Our aims were to investigate prey-predator interactions, determine selectivity among energy sources (in terms of available biomass) and efficiency in their exploitation, and highlight likely constraints that can modulate carbon transfer processes within the pelagic food webs. Generally, microzooplankton showed higher impacts on prey stocks than heterotrophic nanoflagellates, expressing larger ingestion rates and efficiencies. Across the different trophic conditions, characterized on the basis of chlorophyll a concentration, the microzooplankton diet was shown to change in prey composition: nano- and picoplankton almost completely covered consumer needs in oligotrophy and mesotrophy, while microphytoplankton (mostly diatoms) represented more than 80% of the consumers' diet in eutrophy, where, nevertheless, picoplankton mortality remained relatively high. Ingestion rates of both consumers (nano- and microzooplankters) increased with the availability of prey biomasses and consequently with the trophic condition of the environment. Nevertheless, overall, the heterotrophic fraction of picoplankton was the biomass most exploited by both classes of consumers. Ingestion efficiency (as the ratio between available biomass and ingestion rate) increased at low biomasses, and therefore the highest efficiencies were recorded in oligotrophic conditions and in the bathypelagic layers.

  5. Web Mining for Web Image Retrieval.

    ERIC Educational Resources Information Center

    Chen, Zheng; Wenyin, Liu; Zhang, Feng; Li, Mingjing; Zhang, Hongjiang

    2001-01-01

    Presents a prototype system for image retrieval from the Internet using Web mining. Discusses the architecture of the Web image retrieval prototype; document space modeling; user log mining; and image retrieval experiments to evaluate the proposed system. (AEF)

  7. Maximizing the quantum efficiency of microchannel plate detectors - The collection of photoelectrons from the interchannel web using an electric field

    NASA Technical Reports Server (NTRS)

    Taylor, R. C.; Hettrick, M. C.; Malina, R. F.

    1983-01-01

    High quantum efficiency and two-dimensional imaging capabilities make the microchannel plate (MCP) a suitable detector for a sky survey instrument. The Extreme Ultraviolet Explorer satellite, to be launched in 1987, will use MCP detectors. A feature which limits MCP efficiency is related to the walls of individual channels. The walls are of finite thickness and thus form an interchannel web. Under normal circumstances, this web does not contribute to the detector's quantum efficiency. Panitz and Foesch (1976) have found that in the case of a bombardment with ions, electrons were ejected from the electrode material coating the web. By applying a small electric field, the electrons were returned to the MCP surface where they were detected. The present investigation is concerned with the enhancement of quantum efficiencies in the case of extreme UV wavelengths. Attention is given to a model and a computer simulation which quantitatively reproduce the experimental results.

  9. Working with Data: Discovering Knowledge through Mining and Analysis; Systematic Knowledge Management and Knowledge Discovery; Text Mining; Methodological Approach in Discovering User Search Patterns through Web Log Analysis; Knowledge Discovery in Databases Using Formal Concept Analysis; Knowledge Discovery with a Little Perspective.

    ERIC Educational Resources Information Center

    Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.

    2000-01-01

    These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)

  11. NACE: A web-based tool for prediction of intercompartmental efficiency of human molecular genetic networks.

    PubMed

    Popik, Olga V; Ivanisenko, Timofey V; Saik, Olga V; Petrovskiy, Evgeny D; Lavrik, Inna N; Ivanisenko, Vladimir A

    2016-06-15

    Molecular genetic processes generally involve proteins from distinct intracellular localisations. Reactions that follow the same process are distributed among various compartments within the cell. In this regard, the reaction rate and the efficiency of biological processes can depend on the subcellular localisation of proteins. Previously, the authors proposed a method of evaluating the efficiency of biological processes based on the analysis of the distribution of protein subcellular localisation (Popik et al., 2014). Here, NACE is presented, which is an open access web-oriented program that implements this method and allows the user to evaluate the intercompartmental efficiency of human molecular genetic networks. The method has been extended by a new feature that provides the evaluation of the tissue-specific efficiency of networks for more than 2800 anatomical structures. Such assessments are important in cases when molecular genetic pathways in different tissues proceed with the participation of various proteins with a number of intracellular localisations. For example, an analysis of KEGG pathways, conducted using the developed program, showed that the efficiencies of many KEGG pathways are tissue-specific. Analysis of efficiencies of regulatory pathways in the liver, linking proteins of the hepatitis C virus with human proteins involved in the KEGG apoptosis pathway, showed that intercompartmental efficiency might play an important role in host-pathogen interactions. Thus, the developed tool can be useful in the study of the effectiveness of functioning of various molecular genetic networks, including metabolic, regulatory, host-pathogen interactions and others taking into account tissue-specific gene expression. The tool is available via the following link: http://www-bionet.sscc.ru/nace/.

  12. Increasing efficiency of information dissemination and collection through the World Wide Web

    Treesearch

    Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott

    2000-01-01

    Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...

  13. Logging damage

    Treesearch

    Ralph D. Nyland

    1989-01-01

    The best commercial logging will damage at least some residual trees during all forms of partial cutting, no matter how carefully done. Yet recommendations at the end of this Note show there is much that you can do to limit damage by proper road and trail layout, proper training and supervision of crews, appropriate equipment, and diligence.

  14. SBMLmod: a Python-based web application and web service for efficient data integration and model simulation.

    PubMed

    Schäuble, Sascha; Stavrum, Anne-Kristin; Bockwoldt, Mathias; Puntervoll, Pål; Heiland, Ines

    2017-06-24

    Systems Biology Markup Language (SBML) is the standard model representation and description language in systems biology. Enriching and analysing systems biology models by integrating the multitude of available data, increases the predictive power of these models. This may be a daunting task, which commonly requires bioinformatic competence and scripting. We present SBMLmod, a Python-based web application and service, that automates integration of high throughput data into SBML models. Subsequent steady state analysis is readily accessible via the web service COPASIWS. We illustrate the utility of SBMLmod by integrating gene expression data from different healthy tissues as well as from a cancer dataset into a previously published model of mammalian tryptophan metabolism. SBMLmod is a user-friendly platform for model modification and simulation. The web application is available at http://sbmlmod.uit.no , whereas the WSDL definition file for the web service is accessible via http://sbmlmod.uit.no/SBMLmod.wsdl . Furthermore, the entire package can be downloaded from https://github.com/MolecularBioinformatics/sbml-mod-ws . We envision that SBMLmod will make automated model modification and simulation available to a broader research community.
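
    SBMLmod itself is available at the URLs above; the fragment below only illustrates the underlying idea of the data integration step, scaling selected kinetic parameters of an SBML model by tissue-specific expression factors, using python-libsbml. The input file name, parameter ids, and scaling factors are assumptions, not part of SBMLmod.

```python
# Illustration of the underlying idea only (not SBMLmod's code): scale selected
# kinetic parameters of an SBML model by gene-expression factors using python-libsbml.
# File name, parameter ids and expression factors below are assumptions.
import libsbml  # pip install python-libsbml

expression_scale = {"Vmax_TDO": 0.4, "Vmax_IDO": 2.5}   # assumed tissue-specific factors

doc = libsbml.readSBMLFromFile("tryptophan_model.xml")  # hypothetical input model
model = doc.getModel()

for pid, factor in expression_scale.items():
    param = model.getParameter(pid)      # skip ids that are not in the model
    if param is not None:
        param.setValue(param.getValue() * factor)

libsbml.writeSBMLToFile(doc, "tryptophan_model_tissue.xml")
print("wrote adjusted model")
```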

  15. Adapting web-based instruction to residents' knowledge improves learning efficiency: a randomized controlled trial.

    PubMed

    Cook, David A; Beckman, Thomas J; Thomas, Kris G; Thompson, Warren G

    2008-07-01

    Background: Increased clinical demands and decreased available time accentuate the need for efficient learning in postgraduate medical training. Adapting Web-based learning (WBL) to learners' prior knowledge may improve efficiency. Objective: We hypothesized that time spent learning would be shorter and test scores not adversely affected for residents who used a WBL intervention that adapted to prior knowledge. Design: Randomized, crossover trial. Setting: Academic internal medicine residency program continuity clinic. Participants: 122 internal medicine residents. Interventions: Four WBL modules on ambulatory medicine were developed in standard and adaptive formats. The adaptive format allowed learners who correctly answered case-based questions to skip the corresponding content. Measurements and Main Results: The measurements were knowledge posttest, time spent on modules, and format preference. One hundred twenty-two residents completed at least 1 module, and 111 completed all 4. Knowledge scores were similar between the adaptive format (mean +/- standard error of the mean, 76.2 +/- 0.9) and standard (77.2 +/- 0.9, 95% confidence interval [CI] for difference -3.0 to 1.0, P = .34). However, time spent was lower for the adaptive format (29.3 minutes [CI 26.0 to 33.0] per module) than for the standard (35.6 [31.6 to 40.3]), an 18% decrease in time (CI 9 to 26%, P = .0003). Seventy-two of 96 respondents (75%) preferred the adaptive format. Conclusions: Adapting WBL to learners' prior knowledge can reduce learning time without adversely affecting knowledge scores, suggesting greater learning efficiency. In an era where reduced duty hours and growing clinical demands on trainees and faculty limit the time available for learning, such efficiencies will be increasingly important. For clinical trial registration, see http://www.clinicaltrials.gov NCT00466453 ( http://www.clinicaltrials.gov/ct/show/NCT00466453?order=1 ).

  16. Adapting Web-based Instruction to Residents’ Knowledge Improves Learning Efficiency

    PubMed Central

    Beckman, Thomas J.; Thomas, Kris G.; Thompson, Warren G.

    2008-01-01

    Summary BACKGROUND Increased clinical demands and decreased available time accentuate the need for efficient learning in postgraduate medical training. Adapting Web-based learning (WBL) to learners’ prior knowledge may improve efficiency. OBJECTIVE We hypothesized that time spent learning would be shorter and test scores not adversely affected for residents who used a WBL intervention that adapted to prior knowledge. DESIGN Randomized, crossover trial. SETTING Academic internal medicine residency program continuity clinic. PARTICIPANTS 122 internal medicine residents. INTERVENTIONS Four WBL modules on ambulatory medicine were developed in standard and adaptive formats. The adaptive format allowed learners who correctly answered case-based questions to skip the corresponding content. MEASUREMENTS and Main Results The measurements were knowledge posttest, time spent on modules, and format preference. One hundred twenty-two residents completed at least 1 module, and 111 completed all 4. Knowledge scores were similar between the adaptive format (mean ± standard error of the mean, 76.2 ± 0.9) and standard (77.2 ± 0.9, 95% confidence interval [CI] for difference −3.0 to 1.0, P = .34). However, time spent was lower for the adaptive format (29.3 minutes [CI 26.0 to 33.0] per module) than for the standard (35.6 [31.6 to 40.3]), an 18% decrease in time (CI 9 to 26%, P = .0003). Seventy-two of 96 respondents (75%) preferred the adaptive format. CONCLUSIONS Adapting WBL to learners’ prior knowledge can reduce learning time without adversely affecting knowledge scores, suggesting greater learning efficiency. In an era where reduced duty hours and growing clinical demands on trainees and faculty limit the time available for learning, such efficiencies will be increasingly important. For clinical trial registration, see http://www.clinicaltrials.gov NCT00466453 (http://www.clinicaltrials.gov/ct/show/NCT00466453?order=1). PMID:18612729

  17. Development of high-efficiency solar cells on silicon web. Seventh quarterly progress report, October 1-December 31, 1985

    SciTech Connect

    Meier, D.L.; Greggi, J.; Rai-Choudhury, P.

    1986-03-06

    The seventh quarterly report describes the results of work aimed at identifying and reducing sources of carrier recombination both in the starting web silicon material and in the processed cells. Cross-sectional TEM measurements of several web cells have been made and analyzed. The effect of the heavily twinned region on cell efficiency has been modeled, and the modeling results compared to measured values for processed cells. The effects of low-energy, high-dose hydrogen ion implantation on cell efficiency and diffusion length have been examined. Cells have been fabricated from web silicon known to have a high diffusion length, with a new double-layer antireflection coating being applied to these cells. A new contact system, to be used with oxide-passivated cells and which greatly reduces the area of contact between metal and silicon, has been designed. Finally, the application of DLTS measurements to beveled samples has been further investigated.

  18. Impacts of elevated terrestrial nutrient loads and temperature on pelagic food-web efficiency and fish production.

    PubMed

    Lefébure, R; Degerman, R; Andersson, A; Larsson, S; Eriksson, L-O; Båmstedt, U; Byström, P

    2013-05-01

    Both temperature and terrestrial organic matter have strong impacts on aquatic food-web dynamics and production. Temperature affects vital rates of all organisms, and terrestrial organic matter can act as an energy source for lower trophic levels while simultaneously reducing light availability for autotrophic production. As climate change predictions for the Baltic Sea and elsewhere suggest increases in both terrestrial matter runoff and temperature, we studied the effects on pelagic food-web dynamics and food-web efficiency in a plausible future scenario with respect to these abiotic variables in a large-scale mesocosm experiment. Total basal (phytoplankton plus bacterial) production was slightly reduced when only temperature was increased, but was otherwise similar across all other treatments. Separate increases in nutrient loads and temperature decreased the ratio of autotrophic:heterotrophic production, but the combined treatment of elevated temperature and terrestrial nutrient loads increased both fish production and food-web efficiency. CDOM:Chl a ratios strongly indicated that terrestrial and not autotrophic carbon was the main energy source in these food webs, and our results also showed that zooplankton biomass was positively correlated with increased bacterial production. Concomitantly, biomass of the dominant calanoid copepod Acartia sp. increased as an effect of increased temperature. As the combined effects of increased temperature and terrestrial organic nutrient loads were required to increase zooplankton abundance and fish production, conclusions about effects of climate change on food-web dynamics and fish production must be based on realistic combinations of several abiotic factors. Moreover, our results question established notions on the net inefficiency of heterotrophic carbon transfer to the top of the food web.

  19. OLTARIS: An Efficient Web-Based Tool for Analyzing Materials Exposed to Space Radiation

    NASA Technical Reports Server (NTRS)

    Slaba, Tony; McMullen, Amelia M.; Thibeault, Sheila A.; Sandridge, Chris A.; Clowdsley, Martha S.; Blatting, Steve R.

    2011-01-01

    The near-Earth space radiation environment includes energetic galactic cosmic rays (GCR), high intensity proton and electron belts, and the potential for solar particle events (SPE). These sources may penetrate shielding materials and deposit significant energy in sensitive electronic devices on board spacecraft and satellites. Material and design optimization methods may be used to reduce the exposure and extend the operational lifetime of individual components and systems. Since laboratory experiments are expensive and may not cover the range of particles and energies relevant for space applications, such optimization may be done computationally with efficient algorithms that include the various constraints placed on the component, system, or mission. In the present work, the web-based tool OLTARIS (On-Line Tool for the Assessment of Radiation in Space) is presented, and the applicability of the tool for rapidly analyzing exposure levels within either complicated shielding geometries or user-defined material slabs exposed to space radiation is demonstrated. An example approach for material optimization is also presented. Slabs of various advanced multifunctional materials are defined and exposed to several space radiation environments. The materials and thicknesses defining each layer in the slab are then systematically adjusted to arrive at an optimal slab configuration.

  20. Efficiency of Using a Web-Based Approach to Teach Reading Strategies to Iranian EFL Learners

    ERIC Educational Resources Information Center

    Dehghanpour, Elham; Hashemian, Mahmood

    2015-01-01

    Applying new technologies with their effective potentials have changed education and, consequently, the L2 teacher role. Coping with online materials imposes the necessity of employing Web-based approaches in L2 instruction. The ability to use reading strategies in a Web-based condition needs sufficient skill which will be fulfilled if it is…

  1. Analyzing web log files of the health on the net HONmedia search engine to define typical image search tasks for image retrieval evaluation.

    PubMed

    Müller, Henning; Boyer, Célia; Gaudinat, Arnaud; Hersh, William; Geissbuhler, Antoine

    2007-01-01

    Medical institutions produce an ever-increasing amount of diverse information. The digital form makes these data available for use on more than a single patient. Images are no exception to this. However, little is known about how medical professionals search for visual medical information and how they want to use it outside the context of a single patient. This article analyzes ten months of usage log files of the Health on the Net (HON) medical media search engine. Keywords were extracted from all queries, and the most frequent terms and subjects were identified. The dataset required substantial pre-treatment; problems included national character sets, spelling errors, and the use of terms in several languages. The results show that media search, particularly for images, was used frequently. The most common queries were for general concepts (e.g., heart, lung). To define realistic information needs for the ImageCLEFmed challenge evaluation (Cross Language Evaluation Forum medical image retrieval), we used frequent queries that were still specific enough to cover at least two of the three axes of modality, anatomic region, and pathology. Several research groups evaluated their image retrieval algorithms based on these defined topics.
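
    The HONmedia log format is not given in the record above. As a minimal illustration of the keyword extraction step it describes, the sketch below assumes one query string per line, normalises case, tolerates mixed character encodings, and counts the most frequent terms; the file name and stop-word list are assumptions.

```python
# Sketch of the keyword extraction step: read one query per line, normalize,
# and count the most frequent terms. The input format and stop list are assumptions.
import re
from collections import Counter

STOPWORDS = {"the", "of", "and", "a", "in", "for", "to"}  # minimal, illustrative


def top_terms(log_path, n=20):
    counts = Counter()
    # errors="replace" tolerates the mixed national character sets noted above.
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            terms = re.findall(r"[^\W\d_]+", line.lower())
            counts.update(t for t in terms if t not in STOPWORDS and len(t) > 2)
    return counts.most_common(n)


if __name__ == "__main__":
    for term, freq in top_terms("honmedia_queries.log"):   # hypothetical file name
        print(f"{freq:6d}  {term}")
```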

  2. MultiLog: a tool for the control and output merging of multiple logging applications.

    PubMed

    Woodruff, Jonathan; Alexander, Jason

    2016-12-01

    MultiLog is a logging tool that controls, gathers, and combines the output, on the fly, from existing research and commercial logging applications or "loggers." Loggers record a specific set of user actions on a computing device, helping researchers to better understand environments or interactions and guiding the design of new or improved interfaces and applications. MultiLog reduces researchers' required implementation effort by simplifying the set-up of multiple loggers and seamlessly combining their output. This in turn increases the availability of logging systems to non-technical experimenters for both short-term and longitudinal observation studies. MultiLog supports two operating modes: "researcher mode," where experimenters configure multiple logging systems, and "deployment mode," where the system is deployed to user-study participants' systems. Researcher mode allows researchers to install loggers, configure log filtering and obfuscation, observe near real-time event streams, and save configuration files ready for deployment. Deployment mode simplifies data collection from multiple loggers by running in the system tray at user log-in, starting loggers, combining their output, and securely uploading the data to a web server. It also supports real-time browsing of log data, pausing of logging, and removal of log lines. Performance evaluations show that MultiLog does not adversely affect system performance, even when simultaneously running several logging systems. Initial studies show the system runs reliably over a period of 10 weeks.
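    The following sketch illustrates the general idea of combining several loggers' output on the fly by timestamp; it is not MultiLog code, and the "ISO timestamp, logger name, event" line layout is assumed for the example.

    ```python
    import heapq
    from datetime import datetime

    def merge_logs(*streams):
        """Merge any number of per-logger line streams into one stream ordered by timestamp."""
        def keyed(stream):
            for line in stream:
                ts = datetime.fromisoformat(line.split("\t", 1)[0])
                yield ts, line
        for _, line in heapq.merge(*(keyed(s) for s in streams)):
            yield line

    mouse = ["2016-05-01T09:00:01\tmouse\tclick (120, 340)"]
    keys  = ["2016-05-01T09:00:00\tkeys\tkeydown a",
             "2016-05-01T09:00:02\tkeys\tkeydown b"]
    for line in merge_logs(mouse, keys):
        print(line)
    ```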

  3. SU-E-J-150: Impact of Intrafractional Prostate Motion On the Accuracy and Efficiency of Prostate SBRT Delivery: A Retrospective Analysis of Prostate Tracking Log Files

    SciTech Connect

    Xiang, H; Hirsch, A; Willins, J; Kachnic, J; Qureshi, M; Katz, M; Nicholas, B; Keohan, S; De Armas, R; Lu, H; Efstathiou, J; Zietman, A

    2014-06-01

    Purpose: To measure intrafractional prostate motion by time-based stereotactic x-ray imaging and investigate the impact on the accuracy and efficiency of prostate SBRT delivery. Methods: Prostate tracking log files with 1,892 x-ray image registrations from 18 SBRT fractions for 6 patients were retrospectively analyzed. Patient setup and beam delivery sessions were reviewed to identify extended periods of large prostate motion that caused delays in setup or interruptions in beam delivery. The 6D prostate motions were compared to the clinically used PTV margin of 3–5 mm (3 mm posterior, 5 mm all other directions), a hypothetical PTV margin of 2–3 mm (2 mm posterior, 3 mm all other directions), and the rotation correction limits (roll ±2°, pitch ±5°, and yaw ±3°) of CyberKnife to quantify beam delivery accuracy. Results: Significant incidents of treatment start delay and beam delivery interruption were observed, mostly related to large pitch rotations of ≥±5°. Optimal setup time of 5–15 minutes was recorded in 61% of the fractions, and optimal beam delivery time of 30–40 minutes in 67% of the fractions. At a default imaging interval of 15 seconds, the percentage of prostate motion beyond the PTV margin of 3–5 mm varied among patients, with a mean of 12.8% (range 0.0%–31.1%); the percentage beyond the PTV margin of 2–3 mm had a mean of 36.0% (range 3.3%–83.1%). These timely detected offsets were all corrected in real time by the robotic manipulator or by operator intervention at the time of treatment interruptions. Conclusion: The durations of patient setup and beam delivery were directly affected by the occurrence of large prostate motion. Frequent imaging, at intervals as short as 15 seconds, is necessary for certain patients. Techniques for reducing prostate motion, such as using an endorectal balloon, can be considered to assure consistently higher accuracy and efficiency of prostate SBRT delivery.
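    A toy version of the margin check described in the Results is sketched below: it counts logged translational offsets that exceed an anisotropic 3–5 mm PTV margin. The offsets and the sign convention for the posterior direction are illustrative assumptions, not the clinical log format.

    ```python
    def fraction_beyond_margin(offsets_mm, posterior_margin=3.0, other_margin=5.0):
        """Fraction of (left-right, sup-inf, ant-post) offsets exceeding an anisotropic margin."""
        beyond = 0
        for left_right, sup_inf, ant_post in offsets_mm:
            limit_ap = posterior_margin if ant_post < 0 else other_margin  # negative = posterior (assumed)
            if (abs(left_right) > other_margin or abs(sup_inf) > other_margin
                    or abs(ant_post) > limit_ap):
                beyond += 1
        return beyond / len(offsets_mm)

    offsets = [(0.5, 1.2, -0.8), (2.1, -4.9, -3.4), (6.2, 0.3, 1.1)]   # illustrative registrations, mm
    print(f"{100 * fraction_beyond_margin(offsets):.1f}% of registrations beyond the 3-5 mm margin")
    ```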

  4. Reviews Equipment: Data logger Book: Imagined Worlds Equipment: Mini data loggers Equipment: PICAXE-18M2 data logger Books: Engineering: A Very Short Introduction and To Engineer Is Human Book: Soap, Science, & Flat-Screen TVs Equipment: uLog and SensorLab Web Watch

    NASA Astrophysics Data System (ADS)

    2012-07-01

    WE RECOMMEND Data logger Fourier NOVA LINK: data logging and analysis To Engineer is Human Engineering: essays and insights Soap, Science, & Flat-Screen TVs People, politics, business and science overlap uLog sensors and sensor adapter A new addition to the LogIT range offers simplicity and ease of use WORTH A LOOK Imagined Worlds Socio-scientific predictions for the future Mini light data logger and mini temperature data logger Small-scale equipment for schools SensorLab Plus LogIT's supporting software, with extra features HANDLE WITH CARE CAXE110P PICAXE-18M2 data logger Data logger 'on view' but disappoints Engineering: A Very Short Introduction A broad-brush treatment fails to satisfy WEB WATCH Two very different websites for students: advanced physics questions answered and a more general BBC science resource

  5. An efficient and flexible web services-based multidisciplinary design optimisation framework for complex engineering systems

    NASA Astrophysics Data System (ADS)

    Li, Liansheng; Liu, Jihong

    2012-08-01

    Multidisciplinary design optimisation (MDO) involves multiple disciplines, multiple coupled relationships and multiple processes, and is carried out by specialists dispersed geographically on heterogeneous platforms with different analysis and optimisation tools. Difficulties with product design data integration and data sharing among the participants seriously hamper the development and application of MDO in enterprises. Therefore, a multi-hierarchical integrated product design data model (MH-iPDM) supporting MDO in the web environment and a web services-based multidisciplinary design optimisation (Web-MDO) framework are proposed in this article. Based on enabling technologies including web services, ontology, workflow, agents, XML and evidence theory, the proposed framework enables geographically dispersed designers to work collaboratively in the MDO environment. The ontology-based workflow enables the logical reasoning of MDO to be processed dynamically. Evidence theory-based uncertainty reasoning and analysis supports the quantification, aggregation and analysis of conflicting epistemic uncertainty from multiple sources, which improves product quality. Finally, a proof-of-concept prototype system is developed using J2EE, and an example of a supersonic business jet is demonstrated to verify the autonomous execution of MDO strategies and the effectiveness of the proposed approach.

  6. Index Compression and Efficient Query Processing in Large Web Search Engines

    ERIC Educational Resources Information Center

    Ding, Shuai

    2013-01-01

    The inverted index is the main data structure used by all the major search engines. Search engines build an inverted index on their collection to speed up query processing. As the size of the web grows, the length of the inverted list structures, which can easily grow to hundreds of MBs or even GBs for common terms (roughly linear in the size of…
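    For readers unfamiliar with the data structure, a minimal in-memory inverted index looks like the sketch below: each term maps to a postings list of (document id, term frequency) pairs. Production engines additionally compress these lists (for example with gap plus variable-byte encoding), which is the compression problem the title refers to.

    ```python
    from collections import defaultdict

    def build_index(docs):
        """Build a toy inverted index: term -> sorted list of (doc_id, term_frequency)."""
        index = defaultdict(dict)
        for doc_id, text in docs.items():
            for term in text.lower().split():
                index[term][doc_id] = index[term].get(doc_id, 0) + 1
        # postings kept sorted by doc id so gaps (deltas) stay small and compress well
        return {term: sorted(postings.items()) for term, postings in index.items()}

    docs = {1: "web search engines build an inverted index",
            2: "the inverted index speeds up query processing"}
    index = build_index(docs)
    print(index["inverted"])   # -> [(1, 1), (2, 1)]
    ```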

  8. O’ Surgery Case Log Data, Where Art Thou?

    PubMed Central

    Patel, Mayur B; Guillamondegui, Oscar D; Ott, Mickey M; Palmiter, Kimberly A; May, Addison K

    2012-01-01

    Background The American College of Surgeons Case Log (ACS Case Log) represents a data system that satisfies the American Board of Surgery (ABS) Maintenance of Certification (MOC) program, yet has broad data fields for surgical subspecialties. Using the ACS Case Log, we have developed a method of data capture, categorization, and reporting of acute care surgery fellows' experiences. Study Design In July 2010, our Acute Care Surgery fellowship required our fellows to log their clinical experiences into the ACS Case Log. Cases were entered in a manner similar to billable documentation rules. Keywords were entered that specified institutional services and/or resuscitation types. These data were exported in comma-separated value format, de-identified, and structured by Current Procedural Terminology (CPT) codes relevant to acute care surgery, then sub-stratified by fellow and/or fellow year. Results Fifteen report types were created consisting of operative experience by service, procedure by major category (cardiothoracic, vascular, solid organ, abdominal wall, hollow viscus, and soft tissue), total resuscitations, ultrasound, airway, Intensive Care Unit services, basic neurosurgery, and basic orthopaedics. Results are viewable via a secure web application, accessible nationally, and exportable to many formats. Conclusions Utilizing the ACS Case Log satisfies the ABS MOC program requirements and provides a method for monitoring and reporting acute care surgery fellow experiences. This system is flexible enough to accommodate the needs of surgical subspecialties and their training programs. As documentation requirements expand, efficient clinical documentation is a must for the busy surgeon. Although our data entry and processing method has the immediate capacity for acute care surgery fellowships nationwide, multiple larger decisions regarding national case log systems should be encouraged. PMID:22634118
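    A hypothetical sketch of the export-and-categorize step described above is shown below: it reads a de-identified comma-separated export and tallies cases by major category. The column name and the CPT-prefix-to-category mapping are placeholders, not the ACS Case Log schema or the fellowship's actual groupings.

    ```python
    import csv
    from collections import Counter

    # Placeholder mapping from two-digit CPT prefixes to the report categories named above.
    CATEGORY_BY_PREFIX = {"32": "cardiothoracic", "35": "vascular", "47": "solid organ",
                          "49": "abdominal wall", "44": "hollow viscus", "11": "soft tissue"}

    def tally_by_category(csv_path):
        """Count de-identified cases per major category from a CSV with a 'cpt_code' column (assumed)."""
        counts = Counter()
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                prefix = row["cpt_code"][:2]
                counts[CATEGORY_BY_PREFIX.get(prefix, "other")] += 1
        return counts

    # Example (hypothetical file name): print(tally_by_category("fellow_cases_deidentified.csv"))
    ```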

  9. Harnessing modern web application technology to create intuitive and efficient data visualization and sharing tools.

    PubMed

    Wood, Dylan; King, Margaret; Landis, Drew; Courtney, William; Wang, Runtang; Kelly, Ross; Turner, Jessica A; Calhoun, Vince D

    2014-01-01

    Neuroscientists increasingly need to work with big data in order to derive meaningful results in their field. Collecting, organizing and analyzing this data can be a major hurdle on the road to scientific discovery. This hurdle can be lowered using the same technologies that are currently revolutionizing the way that cultural and social media sites represent and share information with their users. Web application technologies and standards such as RESTful webservices, HTML5 and high-performance in-browser JavaScript engines are being utilized to vastly improve the way that the world accesses and shares information. The neuroscience community can also benefit tremendously from these technologies. We present here a web application that allows users to explore and request the complex datasets that need to be shared among the neuroimaging community. The COINS (Collaborative Informatics and Neuroimaging Suite) Data Exchange uses web application technologies to facilitate data sharing in three phases: Exploration, Request/Communication, and Download. This paper will focus on the first phase, and how intuitive exploration of large and complex datasets is achieved using a framework that centers around asynchronous client-server communication (AJAX) and also exposes a powerful API that can be utilized by other applications to explore available data. First opened to the neuroscience community in August 2012, the Data Exchange has already provided researchers with over 2500 GB of data.

  10. Harnessing modern web application technology to create intuitive and efficient data visualization and sharing tools

    PubMed Central

    Wood, Dylan; King, Margaret; Landis, Drew; Courtney, William; Wang, Runtang; Kelly, Ross; Turner, Jessica A.; Calhoun, Vince D.

    2014-01-01

    Neuroscientists increasingly need to work with big data in order to derive meaningful results in their field. Collecting, organizing and analyzing this data can be a major hurdle on the road to scientific discovery. This hurdle can be lowered using the same technologies that are currently revolutionizing the way that cultural and social media sites represent and share information with their users. Web application technologies and standards such as RESTful webservices, HTML5 and high-performance in-browser JavaScript engines are being utilized to vastly improve the way that the world accesses and shares information. The neuroscience community can also benefit tremendously from these technologies. We present here a web application that allows users to explore and request the complex datasets that need to be shared among the neuroimaging community. The COINS (Collaborative Informatics and Neuroimaging Suite) Data Exchange uses web application technologies to facilitate data sharing in three phases: Exploration, Request/Communication, and Download. This paper will focus on the first phase, and how intuitive exploration of large and complex datasets is achieved using a framework that centers around asynchronous client-server communication (AJAX) and also exposes a powerful API that can be utilized by other applications to explore available data. First opened to the neuroscience community in August 2012, the Data Exchange has already provided researchers with over 2500 GB of data. PMID:25206330

  11. MEDock: a web server for efficient prediction of ligand binding sites based on a novel optimization algorithm.

    PubMed

    Chang, Darby Tien-Hau; Oyang, Yen-Jen; Lin, Jung-Hsin

    2005-07-01

    The prediction of ligand binding sites is an essential part of the drug discovery process. Knowing the location of binding sites greatly facilitates the search for hits, the lead optimization process, the design of site-directed mutagenesis experiments and the hunt for structural features that influence the selectivity of binding in order to minimize the drug's adverse effects. However, docking is still the rate-limiting step for such predictions; consequently, much more efficient algorithms are required. In this article, the design of the MEDock web server is described. The goal of this server is to provide an efficient utility for predicting ligand binding sites. The MEDock web server incorporates a global search strategy that exploits the maximum entropy property of the Gaussian probability distribution in the context of information theory. As a result of the global search strategy, the optimization algorithm incorporated in MEDock is significantly superior when dealing with very rugged energy landscapes, which usually have insurmountable barriers. This article describes four different benchmark cases that span a diverse set of different types of ligand binding interactions. These benchmarks were compared with the use of the Lamarckian genetic algorithm (LGA), which is the major workhorse of the well-known AutoDock program. These results demonstrate that MEDock consistently converged to the correct binding modes with significantly smaller numbers of energy evaluations than the LGA required. When judged by a threshold of the number of energy evaluations consumed in the docking simulation, MEDock also greatly elevates the rate of accurate predictions for all benchmark cases. MEDock is available at http://medock.csie.ntu.edu.tw/ and http://bioinfo.mc.ntu.edu.tw/medock/.

  12. Montana Logging Utilization, 2002

    Treesearch

    Todd A. Morgan; Timothy P. Spoelma; Charles E. Keegan; Alfred L. Chase; Michael T. Thompson

    2005-01-01

    A study of logging utilization in Montana during 2002 provided logging and product utilization data for sawlog and veneer log harvests in Montana. Results of the study indicate a shift toward greater utilization of smaller diameter material, as 78 percent of the harvested volume in Montana during 2002 came from trees less than 17 inches diameter at breast height. The...

  13. Log N-log S is inconclusive

    NASA Technical Reports Server (NTRS)

    Klebesadel, R. W.; Fenimore, E. E.; Laros, J.

    1983-01-01

    The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power-law distribution, the results are not adequate at this stage of observation to differentiate between a -3/2 and a -1 power-law slope.

  14. Log Truck-Weighing System

    NASA Technical Reports Server (NTRS)

    1977-01-01

    ELDEC Corp., Lynwood, Wash., built a weight-recording system for logging trucks based on electronic technology the company acquired as a subcontractor on space programs such as Apollo and the Saturn launch vehicle. ELDEC employed its space-derived expertise to develop a computerized weight-and-balance system for Lockheed's TriStar jetliner. ELDEC then adapted the airliner system to a similar product for logging trucks. Electronic equipment computes tractor weight, trailer weight and overall gross weight, and this information is presented to the driver by an instrument in the cab. The system costs $2,000 but it pays for itself in a single year. It allows operators to use a truck's hauling capacity more efficiently since the load can be maximized without exceeding legal weight limits for highway travel. Approximately 2,000 logging trucks now use the system.
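    A back-of-the-envelope version of the in-cab calculation is sketched below: given tractor, trailer, and load weights, report the gross weight and the remaining legal payload. The 80,000 lb gross limit is a common U.S. figure used here only as an illustrative placeholder, not a value from the article.

    ```python
    def weight_summary(tractor_lb, trailer_lb, load_lb, legal_gross_lb=80_000):
        """Return gross weight and how much legal payload remains (assumed legal limit)."""
        gross = tractor_lb + trailer_lb + load_lb
        return {"gross": gross, "remaining_legal_payload": max(0, legal_gross_lb - gross)}

    # Illustrative weights in pounds
    print(weight_summary(tractor_lb=32_000, trailer_lb=14_000, load_lb=28_500))
    ```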

  15. Costs and Efficiency of Online and Offline Recruitment Methods: A Web-Based Cohort Study

    PubMed Central

    Riis, Anders H; Hatch, Elizabeth E; Wise, Lauren A; Nielsen, Marie G; Rothman, Kenneth J; Toft Sørensen, Henrik; Mikkelsen, Ellen M

    2017-01-01

    Background The Internet is widely used to conduct research studies on health issues. Many different methods are used to recruit participants for such studies, but little is known about how various recruitment methods compare in terms of efficiency and costs. Objective The aim of our study was to compare online and offline recruitment methods for Internet-based studies in terms of efficiency (number of recruited participants) and costs per participant. Methods We employed several online and offline recruitment methods to enroll 18- to 45-year-old women in an Internet-based Danish prospective cohort study on fertility. Offline methods included press releases, posters, and flyers. Online methods comprised advertisements placed on five different websites, including Facebook and Netdoktor.dk. We defined seven categories of mutually exclusive recruitment methods and used electronic tracking via unique Uniform Resource Locator (URL) and self-reported data to identify the recruitment method for each participant. For each method, we calculated the average cost per participant and efficiency, that is, the total number of recruited participants. Results We recruited 8252 study participants. Of these, 534 were excluded as they could not be assigned to a specific recruitment method. The final study population included 7724 participants, of whom 803 (10.4%) were recruited by offline methods, 3985 (51.6%) by online methods, 2382 (30.8%) by online methods not initiated by us, and 554 (7.2%) by other methods. Overall, the average cost per participant was €6.22 for online methods initiated by us versus €9.06 for offline methods. Costs per participant ranged from €2.74 to €105.53 for online methods and from €0 to €67.50 for offline methods. Lowest average costs per participant were for those recruited from Netdoktor.dk (€2.99) and from Facebook (€3.44). Conclusions In our Internet-based cohort study, online recruitment methods were superior to offline methods in terms

  16. Costs and Efficiency of Online and Offline Recruitment Methods: A Web-Based Cohort Study.

    PubMed

    Christensen, Tina; Riis, Anders H; Hatch, Elizabeth E; Wise, Lauren A; Nielsen, Marie G; Rothman, Kenneth J; Toft Sørensen, Henrik; Mikkelsen, Ellen M

    2017-03-01

    The Internet is widely used to conduct research studies on health issues. Many different methods are used to recruit participants for such studies, but little is known about how various recruitment methods compare in terms of efficiency and costs. The aim of our study was to compare online and offline recruitment methods for Internet-based studies in terms of efficiency (number of recruited participants) and costs per participant. We employed several online and offline recruitment methods to enroll 18- to 45-year-old women in an Internet-based Danish prospective cohort study on fertility. Offline methods included press releases, posters, and flyers. Online methods comprised advertisements placed on five different websites, including Facebook and Netdoktor.dk. We defined seven categories of mutually exclusive recruitment methods and used electronic tracking via unique Uniform Resource Locator (URL) and self-reported data to identify the recruitment method for each participant. For each method, we calculated the average cost per participant and efficiency, that is, the total number of recruited participants. We recruited 8252 study participants. Of these, 534 were excluded as they could not be assigned to a specific recruitment method. The final study population included 7724 participants, of whom 803 (10.4%) were recruited by offline methods, 3985 (51.6%) by online methods, 2382 (30.8%) by online methods not initiated by us, and 554 (7.2%) by other methods. Overall, the average cost per participant was €6.22 for online methods initiated by us versus €9.06 for offline methods. Costs per participant ranged from €2.74 to €105.53 for online methods and from €0 to €67.50 for offline methods. Lowest average costs per participant were for those recruited from Netdoktor.dk (€2.99) and from Facebook (€3.44). In our Internet-based cohort study, online recruitment methods were superior to offline methods in terms of efficiency (total number of participants

  17. Utilization and cost for animal logging operations

    Treesearch

    Suraj P. Shrestha; Bobby L. Lanford

    2001-01-01

    Forest harvesting with animals is a labor-intensive operation. Due to the development of efficient machines and high volume demands from the forest products industry, mechanization of logging developed very fast, leaving behind the traditional horse and mule logging. It is expensive to use machines on smaller woodlots, which require frequent moves if mechanically...

  18. Log processing systems

    SciTech Connect

    Bowlin, W.P.; Kneer, M.P.; Ballance, J.D.

    1989-11-07

    This patent describes an improvement in a computer-controlled processing system for lumber production. It comprises: a computer; a sequence of processing stations for processing a log segment, including an excess-material-removing station for generating opposed flat side surfaces on the log segment, the flat side surfaces being determined by the computer to become sides of boards to be severed from the log segment; a profiling station for forming profiled edges above and below the flat side surfaces to become the side edges of the boards to be severed from the log segment; and a severing station for severing the boards from the log segment; together with a conveyance means establishing a path of conveyance and maintaining continuous control of the log segment while conveying it along the path and through the above-defined sequence of processing stations.

  19. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
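    The sketch below is not LogScope or its specification language; it only illustrates the underlying idea of checking a log against a temporal rule expressed as a small state machine (here: every COMMAND event must eventually be followed by a matching SUCCESS event).

    ```python
    def check_commands_succeed(events):
        """Return the names of COMMAND events never followed by a matching SUCCESS event."""
        pending = set()
        for kind, name in events:          # each event is a (type, name) tuple in log order
            if kind == "COMMAND":
                pending.add(name)
            elif kind == "SUCCESS":
                pending.discard(name)
        return pending

    log = [("COMMAND", "open_valve"), ("SUCCESS", "open_valve"), ("COMMAND", "close_valve")]
    print("unconfirmed commands:", check_commands_succeed(log))
    ```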

  20. Optimal message log reclamation for independent checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. Kent

    1993-01-01

    Independent (uncoordinated) checkpointing for parallel and distributed systems allows maximum process autonomy but suffers from possible domino effects and the associated storage space overhead for maintaining multiple checkpoints and message logs. In most research on checkpointing and recovery, it was assumed that only the checkpoints and message logs older than the global recovery line can be discarded. It is shown how recovery line transformation and decomposition can be applied to the problem of efficiently identifying all discardable message logs, thereby achieving optimal garbage collection. Communication trace-driven simulation for several parallel programs is used to show the benefits of the proposed algorithm for message log reclamation.
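    For orientation, the sketch below implements only the conventional baseline the abstract contrasts itself with: discarding entries older than each process's position on the global recovery line. The paper's optimal reclamation algorithm, based on recovery line transformation and decomposition, is not reproduced here; all names and data layouts are illustrative.

    ```python
    def reclaim(logs, recovery_line):
        """Keep only log entries at or beyond the recovery line.

        logs: {process: [(sequence_number, payload), ...]}
        recovery_line: {process: first sequence number still needed for recovery}
        """
        kept = {}
        for proc, entries in logs.items():
            keep_from = recovery_line.get(proc, 0)
            kept[proc] = [(n, p) for n, p in entries if n >= keep_from]
        return kept

    logs = {"p0": [(1, "m1"), (2, "m2"), (3, "m3")], "p1": [(1, "m4"), (2, "m5")]}
    print(reclaim(logs, {"p0": 3, "p1": 2}))
    ```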

  1. Web-based oil immersion whole slide imaging increases efficiency and clinical team satisfaction in hematopathology tumor board

    PubMed Central

    Chen, Zhongchuan Will; Kohan, Jessica; Perkins, Sherrie L.; Hussong, Jerry W.; Salama, Mohamed E.

    2014-01-01

    Background: Whole slide imaging (WSI) is widely used for education and research, but is increasingly being used to streamline clinical workflow. We present our experience with regard to satisfaction and time utilization using oil immersion WSI for presentation of blood/marrow aspirate smears, core biopsies, and tissue sections in hematology/oncology tumor board/treatment planning conferences (TPC). Methods: Lymph nodes and bone marrow core biopsies were scanned at ×20 magnification and blood/marrow smears at ×83 under oil immersion, then uploaded to an online library, with areas of interest annotated digitally for display via a web browser. Pathologist time required to prepare slides for scanning was compared to that required to prepare for microscope projection (MP). Time required to present cases during TPC was also compared. A 10-point evaluation survey was used to assess clinician satisfaction with each presentation method. Results: There was no significant difference in hematopathologist preparation time between WSI and MP. However, presentation time was significantly less for WSI compared to MP, as selection and annotation of slides was done prior to TPC with WSI, enabling more efficient use of TPC presentation time. Survey results showed a significant increase in satisfaction by clinical attendees with regard to image quality, efficiency of presentation of pertinent findings, aid in clinical decision-making, and overall satisfaction regarding pathology presentation. A majority of respondents also noted decreased motion sickness with WSI. Conclusions: Whole slide imaging, particularly with the ability to use oil scanning, provides higher quality images compared to MP and significantly increases clinician satisfaction. WSI streamlines preparation for TPC by permitting prior slide selection, resulting in greater efficiency during TPC presentation. PMID:25379347

  2. Overruns - Southern Pine Logs

    Treesearch

    Robert A. Campbell

    1962-01-01

    Overrun and underrun data were collected for the four major southern pine species during a series of grade yield studies in the late 1950's in Arkansas, Florida, Georgia, Mississippi, and South Carolina. Each of the 1,491 logs was carefully scaled by the Doyle, Scribner Decimal C, and International ¼-inch log rules. All logs were sawed on circle mills and the...

  3. Well Log ETL tool

    SciTech Connect

    Good, Jessica

    2013-08-01

    This is an executable python script which offers two different conversions for well log data: 1) Conversion from a BoreholeLASLogData.xls model to a LAS version 2.0 formatted XML file. 2) Conversion from a LAS 2.0 formatted XML file to an entry in the WellLog Content Model. Example templates for BoreholeLASLogData.xls and WellLogsTemplate.xls can be found in the package after download.

  5. Multiple log potash assay

    NASA Astrophysics Data System (ADS)

    Hill, D. G.

    1993-10-01

    A five-mineral multiple-log potash assay technique has been successfully applied to evaluate potash-rich intervals in evaporite sequences. The technique is able to distinguish economic potash minerals from non-economic potash minerals and from other non-potash radioactive minerals. It can be applied on location, using a programmable calculator or microcomputer, providing near real-time logs of potash mineral concentrations. Log assay values show good agreement with core wet chemistry analyses.
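    A hedged illustration of the general multi-log assay idea follows: if each log is modeled as a linear mix of end-member mineral responses, the volume fractions can be estimated by least squares at each depth. The response matrix, mineral set, and log values are invented placeholders, not the published five-mineral technique.

    ```python
    import numpy as np

    # Rows: gamma ray, neutron porosity, bulk density, Pe (illustrative log responses)
    # Columns: sylvite, carnallite, halite, clay (illustrative end members)
    R = np.array([[500.0, 200.0, 10.0, 120.0],
                  [ -2.0,  60.0, -3.0,  40.0],
                  [ 1.99,  1.57, 2.03,  2.5 ],
                  [ 8.5,   4.1,  4.7,   3.4 ]])
    logs = np.array([180.0, 12.0, 2.05, 5.2])   # measured responses at one depth (made up)

    fractions, *_ = np.linalg.lstsq(R, logs, rcond=None)
    fractions = np.clip(fractions, 0, None)      # no negative volume fractions
    fractions /= fractions.sum()                 # renormalise to sum to one
    print(dict(zip(["sylvite", "carnallite", "halite", "clay"], fractions.round(3))))
    ```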

  6. 1962 Washington log production.

    Treesearch

    Richard L. Nielsen

    1963-01-01

    Washington's 1962 log production reached 5.05 billion board feet. This is an increase of 14 percent, or 616 million board feet, over 1961 and the highest total log production since the 1941 figure of 5.14 billion board feet.

  7. Midsouth veneer log production

    Treesearch

    Herbert S. Sternitzke

    1971-01-01

    Veneer manufacturing is an important segment of the forest industries and is increasing in importance every year. Veneer logs are high-valued in comparison with other kinds of logs and bolts, and considerable employment is generated and much value added in their manufacture.

  8. Ulysses log 1992

    NASA Technical Reports Server (NTRS)

    Perez, Raul Garcia

    1993-01-01

    The Ulysses Log tells the story of some intriguing problems that we (the Spacecraft Team) have encountered. Ulysses was launched on 6 Oct. 1990, and it made the fastest trip to Jupiter (8 Feb. 1992). It is presently going out of the ecliptic. This paper presents log entries from the following areas: (1) ingenious maneuvers; (2) telecommunication problems; and (3) surprises.

  9. Future Climate Scenarios for a Coastal Productive Planktonic Food Web Resulting in Microplankton Phenology Changes and Decreased Trophic Transfer Efficiency

    PubMed Central

    Calbet, Albert; Sazhin, Andrey F.; Nejstgaard, Jens C.; Berger, Stella A.; Tait, Zachary S.; Olmos, Lorena; Sousoni, Despoina; Isari, Stamatina; Martínez, Rodrigo A.; Bouquet, Jean-Marie; Thompson, Eric M.; Båmstedt, Ulf; Jakobsen, Hans H.

    2014-01-01

    We studied the effects of future climate change scenarios on plankton communities of a Norwegian fjord using a mesocosm approach. After the spring bloom, natural plankton were enclosed and treated in duplicates with inorganic nutrients elevated to pre-bloom conditions (N, P, Si; eutrophication), lowering of 0.4 pH units (acidification), and rising 3°C temperature (warming). All nutrient-amended treatments resulted in phytoplankton blooms dominated by chain-forming diatoms, and reached 13–16 μg chlorophyll (chl) a l−1. In the control mesocosms, chl a remained below 1 μg l−1. Acidification and warming had contrasting effects on the phenology and bloom-dynamics of autotrophic and heterotrophic microplankton. Bacillariophyceae, prymnesiophyceae, cryptophyta, and Protoperidinium spp. peaked earlier at higher temperature and lower pH. Chlorophyta showed lower peak abundances with acidification, but higher peak abundances with increased temperature. The peak magnitude of autotrophic dinophyceae and ciliates was, on the other hand, lowered with combined warming and acidification. Over time, the plankton communities shifted from autotrophic phytoplankton blooms to a more heterotrophic system in all mesocosms, especially in the control unaltered mesocosms. The development of mass balance and proportion of heterotrophic/autotrophic biomass predict a shift towards a more autotrophic community and less-efficient food web transfer when temperature, nutrients and acidification are combined in a future climate-change scenario. We suggest that this result may be related to a lower food quality for microzooplankton under acidification and warming scenarios and to an increase of catabolic processes compared to anabolic ones at higher temperatures. PMID:24721992

  10. Food web efficiency differs between humic and clear water lake communities in response to nutrients and light.

    PubMed

    Faithfull, C L; Mathisen, P; Wenzel, A; Bergström, A K; Vrede, T

    2015-03-01

    This study demonstrates that clear and humic freshwater pelagic communities respond differently to the same environmental stressors, i.e. nutrient and light availability. Thus, effects on humic communities cannot be generalized from existing knowledge about these environmental stressors on clear water communities. Small humic lakes are the most numerous type of lake in the boreal zone, but little is known about how these lakes will respond to increased inflows of nutrients and terrestrial dissolved organic C (t-DOC) due to climate change and increased human impacts. Therefore, we compared the effects of nutrient addition and light availability on pelagic humic and clear water lake communities in a mesocosm experiment. When nutrients were added, phytoplankton production (PPr) increased in both communities, but pelagic energy mobilization (PEM) and bacterial production (BP) only increased in the humic community. At low light conditions, the addition of nutrients led to increased PPr only in the humic community, suggesting that, in contrast to the clear water community, humic phytoplankton were already adapted to lower ambient light levels. Low light significantly reduced PPr and PEM in the clear water community, but without reducing total zooplankton production, which resulted in a doubling of food web efficiency (FWE = total zooplankton production/PEM). However, total zooplankton production was not correlated with PEM, PPr, BP, PPr:BP or C:nutrient stoichiometry for either community type. Therefore, other factors such as food chain length, food quality, ultra-violet radiation or duration of the experiment, must have determined total zooplankton production and ultimately FWE.

  11. Future climate scenarios for a coastal productive planktonic food web resulting in microplankton phenology changes and decreased trophic transfer efficiency.

    PubMed

    Calbet, Albert; Sazhin, Andrey F; Nejstgaard, Jens C; Berger, Stella A; Tait, Zachary S; Olmos, Lorena; Sousoni, Despoina; Isari, Stamatina; Martínez, Rodrigo A; Bouquet, Jean-Marie; Thompson, Eric M; Båmstedt, Ulf; Jakobsen, Hans H

    2014-01-01

    We studied the effects of future climate change scenarios on plankton communities of a Norwegian fjord using a mesocosm approach. After the spring bloom, natural plankton were enclosed and treated in duplicates with inorganic nutrients elevated to pre-bloom conditions (N, P, Si; eutrophication), lowering of 0.4 pH units (acidification), and rising 3°C temperature (warming). All nutrient-amended treatments resulted in phytoplankton blooms dominated by chain-forming diatoms, and reached 13-16 μg chlorophyll (chl) a l-1. In the control mesocosms, chl a remained below 1 μg l-1. Acidification and warming had contrasting effects on the phenology and bloom-dynamics of autotrophic and heterotrophic microplankton. Bacillariophyceae, prymnesiophyceae, cryptophyta, and Protoperidinium spp. peaked earlier at higher temperature and lower pH. Chlorophyta showed lower peak abundances with acidification, but higher peak abundances with increased temperature. The peak magnitude of autotrophic dinophyceae and ciliates was, on the other hand, lowered with combined warming and acidification. Over time, the plankton communities shifted from autotrophic phytoplankton blooms to a more heterotrophic system in all mesocosms, especially in the control unaltered mesocosms. The development of mass balance and proportion of heterotrophic/autotrophic biomass predict a shift towards a more autotrophic community and less-efficient food web transfer when temperature, nutrients and acidification are combined in a future climate-change scenario. We suggest that this result may be related to a lower food quality for microzooplankton under acidification and warming scenarios and to an increase of catabolic processes compared to anabolic ones at higher temperatures.

  12. Idaho-Montana Logging

    NASA Image and Video Library

    2013-12-16

    Logging operations have left a striking checkerboard pattern in the landscape along the Idaho-Montana border, sandwiched between Clearwater and Bitterroot National Forests as seen in this image acquired by NASA Terra spacecraft.

  13. Acoustic borehole logging

    SciTech Connect

    Medlin, W.L.; Manzi, S.J.

    1990-10-09

    This patent describes an acoustic borehole logging method. It comprises traversing a borehole with a borehole logging tool containing a transmitter of acoustic energy, having a free-field frequency spectrum with at least one characteristic resonant frequency of vibration, and a spaced-apart receiver; repeatedly exciting the transmitter with a swept-frequency tone burst of a duration sufficiently greater than the travel time of acoustic energy between the transmitter and the receiver to allow borehole cavity resonances to be established within the borehole cavity formed between the borehole logging tool and the borehole wall; detecting, with the spaced-apart receiver, acoustic energy amplitude-modulated by the borehole cavity resonances; and recording an amplitude-versus-frequency output of the receiver, in correlation with depth, as a log of the borehole frequency spectrum representative of the subsurface formation comprising the borehole wall.

  14. 1964 Oregon log production.

    Treesearch

    Brian R. Wall

    1965-01-01

    The production of logs in Oregon in 1964 was 9.4 billion board feet, or nearly 9 percent above 1963. This year, 1964, had the third highest level of log production in history, exceeded only in 1955 and in 1952. The proportion of total cut from private lands fell to 43 percent, even though the total private cut increased 6 percent over that in 1963. Forest industry,...

  15. 6. Log calving barn. Interior view showing log post-and-beam support ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Log calving barn. Interior view showing log post-and-beam support system and animal stalls. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT

  16. EE-3A Logging Report

    SciTech Connect

    Anderson, David W.

    1993-12-15

    Two logs of EE-3A were performed during the last couple of weeks. The first was a Temperature/Casing-Collar Locator (CCL) log, which took place on Friday, December 10, 1993. The second was a caliper log, done in cooperation with the Dia-Log Company of Odessa, TX, on Monday, December 13, 1993.

  17. Recognizing Patterns In Log-Polar Coordinates

    NASA Technical Reports Server (NTRS)

    Weiman, Carl F. R.

    1992-01-01

    Log-Hough transform is basis of improved method for recognition of patterns - particularly, straight lines - in noisy images. Takes advantage of rotational and scale invariance of mapping from Cartesian to log-polar coordinates, and offers economy of representation and computation. Unification of iconic and Hough domains simplifies computations in recognition and eliminates erroneous quantization of slopes attributable to finite spacing of Cartesian coordinate grid of classical Hough transform. Equally efficient at recognizing curves. Log-Hough transform more amenable to massively parallel computing architectures than traditional Cartesian Hough transform. "In-place" nature makes it possible to apply local pixel-neighborhood processing.
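    A minimal sketch of the Cartesian-to-log-polar mapping the method relies on is given below; under this mapping, rotation becomes a shift in the angular coordinate and scaling a shift in log r, which is the invariance the log-Hough approach exploits.

    ```python
    import numpy as np

    def to_log_polar(points, eps=1e-9):
        """Map (x, y) points to (log r, theta); eps avoids log(0) at the origin."""
        pts = np.asarray(points, dtype=float)
        r = np.hypot(pts[:, 0], pts[:, 1])
        theta = np.arctan2(pts[:, 1], pts[:, 0])
        return np.column_stack([np.log(r + eps), theta])

    points = [(1, 0), (0, 2), (-3, 0), (0, -4)]
    print(to_log_polar(points))
    ```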

  18. e2g: an interactive web-based server for efficiently mapping large EST and cDNA sets to genomic sequences.

    PubMed

    Krüger, Jan; Sczyrba, Alexander; Kurtz, Stefan; Giegerich, Robert

    2004-07-01

    e2g is a web-based server which efficiently maps large expressed sequence tag (EST) and cDNA datasets to genomic DNA. It significantly extends the volume of data that can be mapped in reasonable time, and makes this improved efficiency available as a web service. Our server hosts large collections of EST sequences (e.g. 4.1 million mouse ESTs of 1.87 Gb) in precomputed indexed data structures for efficient sequence comparison. The user can upload a genomic DNA sequence of interest and rapidly compare this to the complete collection of ESTs on the server. This delivers a mapping of the ESTs on the genomic DNA. The e2g web interface provides a graphical overview of the mapping. Alignments of the mapped EST regions with parts of the genomic sequence are visualized. Zooming functions allow the user to interactively explore the results. Mapped sequences can be downloaded for further analysis. e2g is available on the Bielefeld University Bioinformatics Server at http://bibiserv.techfak.uni-bielefeld.de/e2g/.

  19. Improved grading system for structural logs for log homes

    Treesearch

    D.W. Green; T.M. Gorman; J.W. Evans; J.F. Murphy

    2004-01-01

    Current grading standards for logs used in log home construction use visual criteria to sort logs into either “wall logs” or structural logs (round and sawn round timbers). The conservative nature of this grading system, and the grouping of stronger and weaker species for marketing purposes, probably results in the specification of logs with larger diameter than would...

  20. Data for Free: Using LMS Activity Logs to Measure Community in Online Courses

    ERIC Educational Resources Information Center

    Black, Erik W.; Dawson, Kara; Priem, Jason

    2008-01-01

    In the study of online learning community, many investigators have turned attention to automatically logged web data. This study aims to further this work by seeking to determine whether logs of student activity within online graduate level courses related to student perceptions of course community. Researchers utilized the data logging features…

  1. Leveraging Web Services in Providing Efficient Discovery, Retrieval, and Integration of NASA-Sponsored Observations and Predictions

    NASA Astrophysics Data System (ADS)

    Bambacus, M.; Alameh, N.; Cole, M.

    2006-12-01

    The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. The gateway (online

  2. NMR logging apparatus

    SciTech Connect

    Walsh, David O; Turner, Peter

    2014-05-27

    Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.

  3. Service-oriented workflow to efficiently and automatically fulfill products in a highly individualized web and mobile environment

    NASA Astrophysics Data System (ADS)

    Qiao, Mu

    2015-03-01

    Service Oriented Architecture (SOA) is widely used in building flexible and scalable web sites and services. In most of the web and mobile photo book and gifting business space, the products ordered are highly variable, with no standard template into which text or images can be substituted, unlike commercial variable data printing. In this paper, the author describes an SOA workflow in a multi-site, multi-product-line fulfillment system in which three major challenges are addressed: utilization of hardware and equipment, a high degree of automation with fault recovery, and scalability and flexibility under order volume fluctuation.

  4. 4. Log chicken house (far left foreground), log bunkhouse (far ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Log chicken house (far left foreground), log bunkhouse (far left background), one-room log cabin (left of center background), log root cellar (center), post-and-beam center in foreground, and blacksmith shop (far right foreground). View to southeast. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  5. Modes of log gravity

    SciTech Connect

    Bergshoeff, Eric A.; Rosseel, Jan; Hohm, Olaf; Townsend, Paul K.

    2011-05-15

    The physical modes of a recently proposed D-dimensional 'critical gravity', linearized about its anti-de Sitter vacuum, are investigated. All 'log mode' solutions, which we categorize as 'spin-2' or 'Proca', arise as limits of the massive spin-2 modes of the noncritical theory. The linearized Einstein tensor of a spin-2 log mode is itself a 'nongauge' solution of the linearized Einstein equations whereas the linearized Einstein tensor of a Proca mode takes the form of a linearized general coordinate transformation. Our results suggest the existence of a holographically dual logarithmic conformal field theory.

  6. Real-Time System Log Monitoring/Analytics Framework

    SciTech Connect

    Oral, H Sarp; Dillow, David A; Park, Byung H; Shipman, Galen M; Geist, Al; Gunasekaran, Raghul

    2011-01-01

    Analyzing system logs provides useful insights for identifying system/application anomalies and helps in better usage of system resources. Nevertheless, it is simply not practical to scan through the raw log messages on a regular basis for large-scale systems. First, the sheer volume of unstructured log messages affects readability, and second, correlating the log messages to system events is a daunting task. These factors limit the use of large-scale system logs primarily to generating alerts on known system events and to post-mortem diagnosis for identifying previously unknown system events that impacted system performance. In this paper, we describe a log monitoring framework that enables prompt analysis of system events in real time. Our web-based framework provides a summarized view of console, netwatch, consumer, and apsched logs in real time. The logs are parsed and processed to generate views of applications, message types, individual or groups of compute nodes, and sections of the compute platform. Also, from past application runs we build a statistical profile of user/application characteristics with respect to known system events, recoverable/non-recoverable error messages, and resources utilized. The web-based tool is being developed for the Jaguar XT5 at the Oak Ridge Leadership Computing Facility.
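    As a rough illustration of the summarization step described above, the sketch below collapses raw console lines into per-node counts of message types. The "[node] message" line layout and the sample messages are assumptions for the example, not the actual console or netwatch format.

    ```python
    import re
    from collections import Counter

    LINE = re.compile(r"^\[(?P<node>[\w\-]+)\]\s+(?P<msg>.*)$")

    def summarise(lines):
        """Count occurrences of (node, message type), using the text before ':' as a crude type key."""
        per_node = Counter()
        for line in lines:
            m = LINE.match(line)
            if not m:
                continue
            msg_type = m.group("msg").split(":", 1)[0]
            per_node[(m.group("node"), msg_type)] += 1
        return per_node

    sample = ["[nid00012] MCE: machine check exception",
              "[nid00012] MCE: machine check exception",
              "[nid00104] Lustre: connection restored"]
    print(summarise(sample).most_common())
    ```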

  7. Dissemination Strategies and Adherence Predictors for Web-Based Interventions--How Efficient Are Patient Education Sessions and Email Reminders?

    ERIC Educational Resources Information Center

    Schweier, R.; Romppel, M.; Richter, C.; Grande, G.

    2016-01-01

    The Internet offers the potential to efficaciously deliver health interventions at a low cost and with a low threshold across any distance. However, since many web-based interventions are confronted with low use and adherence, proactive dissemination strategies are needed. We, therefore, tested the efficacy of a 1-h patient education session as…

  9. Logging slash flammability

    Treesearch

    George R. Fahnestock

    1960-01-01

    Some of the most disastrous forest fires in North American history burned in slash left from logging and land clearing. In the era before organized fire control, the names Miramichi, Peshtigo, Hinckley, and Cloquet stand for millions of acres blackened and thousands of lives snuffed out. More recently the Half Moon Fire in Montana, the Tillamook Fire in Oregon, the...

  10. Log of Apollo 11.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)

  11. Interactive Reflective Logs

    ERIC Educational Resources Information Center

    Deaton, Cynthia Minchew; Deaton, Benjamin E.; Leland, Katina

    2010-01-01

    The authors created an interactive reflective log (IRL) to provide teachers with an opportunity to use a journal approach to record, evaluate, and communicate student understanding of science concepts. Unlike a traditional journal, the IRL incorporates prompts to encourage students to discuss their understanding of science content and science…

  12. Logs Perl Module

    SciTech Connect

    Owen, R. K.

    2007-04-04

    A perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
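    A comparable filter can be sketched in Python (the module itself is Perl). PBS accounting records are commonly semicolon-delimited as "timestamp;record type;id;key=value ..."; treat that layout as an assumption when adapting this to a particular server's logs.

    ```python
    from datetime import datetime

    def filter_records(lines, record_type="E", since=None):
        """Yield (timestamp, job id, attributes) for records of the given type, optionally after 'since'."""
        for line in lines:
            try:
                stamp, rtype, job_id, attrs = line.rstrip("\n").split(";", 3)
            except ValueError:
                continue                      # skip lines that do not match the assumed layout
            when = datetime.strptime(stamp, "%m/%d/%Y %H:%M:%S")
            if rtype == record_type and (since is None or when >= since):
                yield when, job_id, dict(kv.split("=", 1) for kv in attrs.split() if "=" in kv)

    line = "04/04/2007 12:00:01;E;123.server;user=rko queue=batch resources_used.walltime=01:02:03"
    print(list(filter_records([line])))
    ```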

  13. Alaska's Logging Camp School.

    ERIC Educational Resources Information Center

    Millward, Robert E.

    1999-01-01

    A visit to Ketchikan, Alaska, reveals a floating, one-teacher logging-camp school that uses multiage grouping and interdisciplinary teaching. There are 10 students. The school gym and playground, bunkhouse, fuel tanks, mess hall, and students' homes bob up and down and are often moved to other sites. (MLH)

  15. Petrographic image logging system

    SciTech Connect

    Payne, C.J.; Ulrich, M.R.; Maxwell, G.B. ); Adams, J.P. )

    1991-03-01

    The Petrographic Image Logging System (PILS) is a logging-system database for Macintosh computers that allows the merging of traditional wire-line, core, and mud log data with petrographic images. The system is flexible; it allows the user to record, manipulate, and display almost any type of character, graphic, and image information. Character and graphic data are linked, and entry in either mode automatically generates the alternate mode. Character/graphic data may include such items as ROP, wire-line log data, interpreted lithologies, ditch cutting lith-percentages, porosity grade and type, grain size, core/DST information, and sample descriptions. Image data may include petrographic and SEM images of cuttings, core, and thin sections. All data are tied to depth. Data are entered quickly and easily in an interactive manner with a mouse, keyboard, and digitizing tablet, or may be imported and immediately autoplotted from a variety of environments via modem, network, or removable disk. Color log displays, including petrographic images, are easily available on CRT or as hardcopy. The system consists of a petrographic microscope, video camera, Macintosh computer, video framegrabber and digitizing tablet. Hardcopy is scalable and can be generated by a variety of color printing devices. The software is written in SuperTalk, a color superset of the standard Apple HyperCard programming language, HyperTalk. This system is being tested by Mobil in the lab and at the well site. Implementation has provided near 'real-time' core and cuttings images from drilling wells to the geologist back at the office.

  16. Reviews Book: Enjoyable Physics Equipment: SEP Colorimeter Box Book: Pursuing Power and Light Equipment: SEP Bottle Rocket Launcher Equipment: Sciencescope GLE Datalogger Equipment: EDU Logger Book: Physics of Sailing Book: The Lightness of Being Software: Logotron Insight iLog Studio iPhone Apps Lecture: 2010 IOP Schools and Colleges Lecture Web Watch

    NASA Astrophysics Data System (ADS)

    2010-09-01

    WE RECOMMEND Enjoyable Physics Mechanics book makes learning more fun SEP Colorimeter Box A useful and inexpensive colorimeter for the classroom Pursuing Power and Light Account of the development of science in the 19th centuary SEP Bottle Rocket Launcher An excellent resource for teaching about projectiles GLE Datalogger GPS software is combined with a datalogger EDU Logger Remote datalogger has greater sensing abilities Logotron Insight iLog Studio Software enables datlogging, data analysis and modelling iPhone Apps Mobile phone games aid study of gravity WORTH A LOOK Physics of Sailing Book journeys through the importance of physics in sailing The Lightness of Being Study of what the world is made from LECTURE The 2010 IOP Schools and Colleges Lecture presents the physics of fusion WEB WATCH Planet Scicast pushes boundaries of pupil creativity

  17. Log-Concavity and Strong Log-Concavity: a review

    PubMed Central

    Saumard, Adrien; Wellner, Jon A.

    2016-01-01

    We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning. PMID:27134693
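
    For reference, the standard definitions can be stated as follows; the notation below is a commonly used formulation and is not quoted from the review itself. A density f on ℝ is log-concave if \log f is concave, i.e.,

        f(\lambda x + (1 - \lambda) y) \ge f(x)^{\lambda} f(y)^{1 - \lambda} \quad \text{for all } x, y \text{ and } \lambda \in [0, 1],

    and f is strongly log-concave with parameter c > 0 if x \mapsto f(x)\, e^{c x^2 / 2} is log-concave, i.e., \log f is c-strongly concave.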

  18. Log-Concavity and Strong Log-Concavity: a review.

    PubMed

    Saumard, Adrien; Wellner, Jon A

    We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning.

  19. Development of Kevlar parachute webbings

    SciTech Connect

    Ericksen, R.H.

    1991-01-01

    This paper describes the development of Kevlar webbings for parachute applications. Evaluation of existing webbings and a study of the effects of filling yarn denier and pick count on tensile and joint strength provided data for fabric design. Measurements of warp crimp as a function of filling denier and pick count demonstrated the relationship between warp crimp and strength. One newly developed webbing had higher strength efficiency and another had higher joint efficiency than comparable existing webbings. Both new webbings had overall efficiencies over 5% higher than values for existing webbings. 10 refs., 4 figs., 2 tabs.

  20. A World Wide Web-based antimicrobial stewardship program improves efficiency, communication, and user satisfaction and reduces cost in a tertiary care pediatric medical center.

    PubMed

    Agwu, Allison L; Lee, Carlton K K; Jain, Sanjay K; Murray, Kara L; Topolski, Jason; Miller, Robert E; Townsend, Timothy; Lehmann, Christoph U

    2008-09-15

    Antimicrobial stewardship programs aim to reduce inappropriate hospital antimicrobial use. At the Johns Hopkins Children's Medical and Surgical Center (Baltimore, MD), we implemented a World Wide Web-based antimicrobial restriction program to address problems with the existing restriction program. A user survey identified opportunities for improvement of an existing antimicrobial restriction program and resulted in subsequent design, implementation, and evaluation of a World Wide Web-based antimicrobial restriction program at a 175-bed, tertiary care pediatric teaching hospital. The program provided automated clinical decision support, facilitated approval, and enhanced real-time communication among prescribers, pharmacists, and pediatric infectious diseases fellows. Approval status, duration, and rationale; missing request notifications; and expiring approvals were stored in a database that is accessible via a secure Intranet site. Before and after implementation of the program, user satisfaction, reports of missed and/or delayed doses, antimicrobial dispensing times, and cost were evaluated. After implementation of the program, there was a $370,069 reduction in projected annual cost associated with restricted antimicrobial use and an 11.6% reduction in the number of dispensed doses. User satisfaction increased from 22% to 68% and from 13% to 69% among prescribers and pharmacists, respectively. There were 21% and 32% reductions in the number of prescriber reports of missed and delayed doses, respectively, and there was a 37% reduction in the number of pharmacist reports of delayed approvals; measured dispensing times were unchanged (P = .24). In addition, 40% fewer restricted antimicrobial-related phone calls were noted by the pharmacy. The World Wide Web-based antimicrobial approval program led to improved communication, more-efficient antimicrobial administration, increased user satisfaction, and significant cost savings. Integrated tools, such as this World

  1. Aspen for cabin logs

    Treesearch

    A.W. Sump

    1947-01-01

    A plentiful supply of pine and cedar logs provided the early settlers of this country with a cheap and durable material for the construction of their homes and farm buildings. Only the axe and the ingenuity of the pioneer were needed to erect a shelter against the elements of nature. Early in the 19th century, the circular saw came into use resulting in a change in...

  2. Choosing methods and equipment for logging

    Treesearch

    Fred C. Simmons

    1948-01-01

    A logging job is one of the most difficult types of business to manage efficiently. In practically everything the logger does he is compelled to make a choice between several methods of operation and types of equipment. The conditions under which he works are constantly changing, particularly when he is forced to move fairly often from one timber tract to another. But...

  3. Use and Appreciation of a Web-Based, Tailored Intervention (E-health4Uth) Combined With Counseling to Promote Adolescents’ Health in Preventive Youth Health Care: Survey and Log-File Analysis

    PubMed Central

    Bannink, Rienke; Broeren, Suzanne; Joosten-van Zwanenburg, Evelien; van As, Els; van de Looij-Jansen, Petra

    2014-01-01

    Background Health promotion for adolescents is important in the prevention of mental health problems and health-risk behaviors. We implemented two interventions in a preventive youth health care setting. Adolescents in the E-health4Uth group received Web-based, tailored messages on their health behavior and well-being. Adolescents in the E-health4Uth and counseling group received the same tailored messages, but were subsequently referred to a school nurse for a consultation if they were at risk of mental health problems. Objective This study evaluated the use and appreciation of these Web-based, tailored messages and additional consultation with a school nurse. Differences in use and appreciation according to demographics (ie, gender, level of education, and ethnicity) of the adolescents were also assessed. Methods Two youth health care organizations participated in this study and conducted the interventions in 12 secondary schools. In total, 1702 adolescents participated; 533 in the E-health4Uth group, 554 in the E-health4Uth and counseling group, and 615 in the control group (ie, care as usual). Adolescents completed an evaluation questionnaire assessing the use and appreciation of the tailored messages immediately after receiving these messages and at a 4-month follow-up. After the consultation, adolescents and nurses completed an evaluation questionnaire on the use and appreciation of the consultation. Results The majority of the adolescents (845/1034, 81.72%) indicated they had read the tailored messages. Most items on the use and appreciation of the tailored messages and the program were scored positive (overall satisfaction on a scale from 1, most-negative, to 10, most-positive: mean 6.70, SD 1.60). In general, adolescents in vocational training, girls, and adolescents of non-Dutch ethnicity, indicated they used the tailored messages more often and appreciated the content of the messages better than adolescents receiving preuniversity education, boys, and

  4. Keystroke Logging in Writing Research: Using Inputlog to Analyze and Visualize Writing Processes

    ERIC Educational Resources Information Center

    Leijten, Marielle; Van Waes, Luuk

    2013-01-01

    Keystroke logging has become instrumental in identifying writing strategies and understanding cognitive processes. Recent technological advances have refined logging efficiency and analytical outputs. While keystroke logging allows for ecological data collection, it is often difficult to connect the fine grain of logging data to the underlying…

  5. Keystroke Logging in Writing Research: Using Inputlog to Analyze and Visualize Writing Processes

    ERIC Educational Resources Information Center

    Leijten, Marielle; Van Waes, Luuk

    2013-01-01

    Keystroke logging has become instrumental in identifying writing strategies and understanding cognitive processes. Recent technological advances have refined logging efficiency and analytical outputs. While keystroke logging allows for ecological data collection, it is often difficult to connect the fine grain of logging data to the underlying…

  6. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C.; Zimmermann, A.; Korup, O.; Iroume, A.; Francke, T.; Bronstert, A.

    2013-12-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3-min) measurements of stream discharge and sediment concentrations in similar-sized (~0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors, which in turn provides model uncertainties. We find that, where no logging occurred, ~80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. Dry-season logging, in particular, dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more efficient moderate events. We conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
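
    As a rough illustration of the QRF idea summarized above (conditional quantiles obtained by pooling the training responses that share leaves with a query point, in the spirit of Meinshausen-style quantile regression forests), the sketch below uses scikit-learn's random forest; the synthetic predictors and response stand in for the study's discharge and sediment-concentration series and are purely hypothetical.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def qrf_quantiles(rf, X_train, y_train, X_query, quantiles=(0.1, 0.5, 0.9)):
          # Weight each training response by how often it shares a leaf with the query,
          # then read the requested quantiles off the weighted empirical CDF.
          train_leaves = rf.apply(X_train)      # (n_train, n_trees) leaf indices
          query_leaves = rf.apply(X_query)      # (n_query, n_trees)
          order = np.argsort(y_train)
          y_sorted = np.asarray(y_train)[order]
          out = np.empty((len(X_query), len(quantiles)))
          for i, leaves in enumerate(query_leaves):
              w = np.zeros(train_leaves.shape[0])
              for t in range(train_leaves.shape[1]):
                  in_leaf = train_leaves[:, t] == leaves[t]
                  w[in_leaf] += 1.0 / in_leaf.sum()
              cdf = np.cumsum(w[order]) / train_leaves.shape[1]
              idx = np.minimum(np.searchsorted(cdf, quantiles), len(cdf) - 1)
              out[i] = y_sorted[idx]
          return out

      # Hypothetical usage on synthetic data (not the catchment measurements):
      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(500, 3))                         # e.g. discharge, rainfall, season
      y_train = np.exp(X_train[:, 0]) + rng.normal(0, 0.1, 500)   # stand-in for sediment flux
      rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=10, random_state=0).fit(X_train, y_train)
      print(qrf_quantiles(rf, X_train, y_train, rng.normal(size=(3, 3))))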

  7. 2. One-room log cabin (right), log root cellar (center), two-room ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. One-room log cabin (right), log root cellar (center), two-room log cabin (left), and post-and-beam garage (background). View to southwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  8. 12. Upstream view showing the lower log pond log chute in ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. Upstream view showing the lower log pond log chute in the main channel of the Hudson River. The log chute in the dam can be seen in the background. Facing southwest. - Glens Falls Dam, 100' to 450' West of U.S. Route 9 Bridge Spanning Hudson River, Glens Falls, Warren County, NY

  9. Identifying related journals through log analysis

    PubMed Central

    Lu, Zhiyong; Xie, Natalie; Wilbur, W. John

    2009-01-01

    Motivation: With the explosion of biomedical literature and the evolution of online and open access, scientists are reading more articles from a wider variety of journals. Thus, the list of core journals relevant to their research may be less obvious and may often change over time. To help researchers quickly identify appropriate journals to read and publish in, we developed a web application for finding related journals based on the analysis of PubMed log data. Availability: http://www.ncbi.nlm.nih.gov/IRET/Journals Contact: luzh@ncbi.nlm.nih.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19734155

  10. Electromagnetic wave logging dipmeter

    SciTech Connect

    Meador, R.A.

    1983-12-20

    An improvement to dipmeter logs has very closely spaced radio frequency sensor coils mounted in pairs in each of the formation contacting pads. A transmitter mounted in a sonde emits the radio frequency energy, such as in the range of from two to one hundred megahertz. The phase difference in radio frequency signals between receiver coil pairs in each pad is measured, providing improved data resolution for computing formation dip, and making possible dip measurements in wells drilled with oil base mud or air (invert type muds).

  11. A collection of log rules

    Treesearch

    Frank Freese

    1973-01-01

    A log rule may be defined as a table or formula showing the estimated net yield for logs of a given diameter and length. Ordinarily the yield is expressed in terms of board feet of finished lumber, though a few rules give the cubic volume of the log or some fraction of it. Built into each log rule are allowances for losses due to such things as slabs, saw kerf, edgings...

  12. My Journey with Learning Logs

    ERIC Educational Resources Information Center

    Hurst, Beth

    2005-01-01

    Learning logs, or reading response logs, have long been established as an effective reading strategy that helps students learn from text (Atwell, 1987; Blough & Berman, 1991; Calkins, 1986; Commander & Smith, 1996; Kuhrt & Farris, 1990; Reed, 1988; Sanders, 1985). In this paper, the author describes her experiences using learning logs as a…

  13. Grid Logging: Best Practices Guide

    SciTech Connect

    Tierney, Brian L; Tierney, Brian L; Gunter, Dan

    2008-04-01

    The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers and Grid middleware itself. Currently, most generated log files are useful only to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used. Other custom logging APIs or even printf could be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) Consistently structured, typed, log events; (2) A standard high-resolution timestamp; (3) Use of logging levels and categories to separate logs by detail and purpose; (4) Consistent use of global and local identifiers; and (5) Use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
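
    As one possible illustration of these recommendations, the sketch below (Python's standard logging module) emits one newline-delimited, structured event per line with a high-resolution timestamp, a level, and a global identifier. The field names and the choice of JSON as the text encoding are assumptions made for the example, not a format prescribed by the guide.

      import json
      import logging
      import time
      import uuid

      class StructuredFormatter(logging.Formatter):
          # Render each record as a single newline-delimited, typed event.
          def format(self, record):
              event = {
                  "ts": f"{time.time():.6f}",             # high-resolution timestamp
                  "level": record.levelname,              # logging level
                  "event": record.getMessage(),           # event name / message
                  "guid": getattr(record, "guid", None),  # global identifier, if supplied
              }
              return json.dumps(event)

      logger = logging.getLogger("grid.transfer")
      handler = logging.StreamHandler()
      handler.setFormatter(StructuredFormatter())
      logger.addHandler(handler)
      logger.setLevel(logging.INFO)

      # Example event carrying a global identifier across components:
      logger.info("file.transfer.start", extra={"guid": str(uuid.uuid4())})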

  14. What Can Instructors and Policy Makers Learn about Web-Supported Learning through Web-Usage Mining

    ERIC Educational Resources Information Center

    Cohen, Anat; Nachmias, Rafi

    2011-01-01

    This paper focuses on a Web-log based tool for evaluating pedagogical processes occurring in Web-supported academic instruction and students' attitudes. The tool consists of computational measures which demonstrate what instructors and policy makers can learn about Web-supported instruction through Web-usage mining. The tool can provide different…

  15. What Can Instructors and Policy Makers Learn about Web-Supported Learning through Web-Usage Mining

    ERIC Educational Resources Information Center

    Cohen, Anat; Nachmias, Rafi

    2011-01-01

    This paper focuses on a Web-log based tool for evaluating pedagogical processes occurring in Web-supported academic instruction and students' attitudes. The tool consists of computational measures which demonstrate what instructors and policy makers can learn about Web-supported instruction through Web-usage mining. The tool can provide different…

  16. Geophysical logs in British stratigraphy

    SciTech Connect

    Whittaker, A.; Holliday, D.W.; Penn, I.E.

    1985-01-01

    This Special Report outlines the stratigraphic applications of the main geophysical logging tools. It characterises the British geological succession by means of the geophysical log signatures of its principal constituent formations. A large amount of previously unpublished data is provided on a geographical area long known for its importance in the development of the science of stratigraphy. The book unites modern developments of petroleum industry geophysical techniques with long-established stratigraphical discovery/research. Contents include: Introduction; Types of logs commonly used; Some geological uses of geophysical logs; Log signatures in British Stratigraphy; References.

  17. Robust Spatial Autoregressive Modeling for Hardwood Log Inspection

    Treesearch

    Dongping Zhu; A.A. Beex

    1994-01-01

    We explore the application of a stochastic texture modeling method toward a machine vision system for log inspection in the forest products industry. This machine vision system uses computerized tomography (CT) imaging to locate and identify internal defects in hardwood logs. The application of CT to such industrial vision problems requires efficient and robust image...

  18. Primary detection of hardwood log defects using laser surface scanning

    Treesearch

    Ed Thomas; Liya Thomas; Lamine Mili; Roger Ehrich; A. Lynn Abbott; Clifford Shaffer; Clifford Shaffer

    2003-01-01

    The use of laser technology to scan hardwood log surfaces for defects holds great promise for improving processing efficiency and the value and volume of lumber produced. External and internal defect detection to optimize hardwood log and lumber processing is one of the top four technological needs in the nation's hardwood industry. The location, type, and...

  19. Predicting yields from Appalachian red oak logs and lumber

    Treesearch

    Daniel E. Dunmire

    1971-01-01

    One utilization problem is in pinpointing how to efficiently and effectively recover usable parts from logs, bolts, and lumber. Yields, which are output divided by input, provide a key to managers who make processing decisions. Research results are applied to indicate yields of graded lumber and dimension stock from graded Appalachian red oak (group) logs. How to...

  20. Predicting internal yellow-poplar log defect features using surface indicators

    Treesearch

    R. Edward Thomas

    2008-01-01

    Determining the defects that are located within the log is crucial to understanding the tree/log resource for efficient processing. However, existing means of doing this non-destructively require the use of expensive X-ray/CT, MRI, or microwave technology. These methods do not lend themselves to fast, efficient, and cost-effective analysis of logs and tree stems in...

  1. Instructional Efficiency of Performance Analysis Training for Learners at Different Levels of Competency in Using a Web-Based EPSS

    ERIC Educational Resources Information Center

    Darabi, A. Aubteen; Nelson, David W.; Mackal, Melissa C.

    2004-01-01

    The measure of performance improvement potential (Gilbert, 1978) in human performance technology uses an exemplary performance as a criterion against which to measure the potential improvement in the performance of a workforce. The measure is calculated based on the performance efficiency which compares expended resources to productivity. The same…

  2. Making WEB Meaning.

    ERIC Educational Resources Information Center

    McKenzie, Jamie

    1996-01-01

    Poorly organized and dominated by amateurs, hucksters, and marketeers, the net requires efficient navigating devices. Students at Bellingham (Washington) Public Schools tackle information overload by contributing to virtual museums on school Web sites, using annotated Web curriculum lists, and conducting research in cooperative teams stressing…

  3. Student Portfolio Analysis for Decision Support of Web-Based Classroom Teacher by Data Cube Technology.

    ERIC Educational Resources Information Center

    Chang, Chih-Kai; Chen, Gwo-Dong; Liu, Baw-Jhiune; Ou, Kou-Liang

    As learners use World Wide Web-based distance learning systems over a period of years, large amounts of learning logs are generated. An instructor needs analysis tools to manage the logs and discover unusual patterns within them to improve instruction. However, logs of a Web server cannot serve as learners' portfolios to satisfy the requirements…

  4. Web-based Curriculum

    PubMed Central

    Zebrack, Jennifer R; Mitchell, Julie L; Davids, Susan L; Simpson, Deborah E

    2005-01-01

    OBJECTIVE To address the need for women's health education by designing, implementing, and evaluating a self-study, web-based women's health curriculum. DESIGN Cohort of students enrolled in the ambulatory portion of the medicine clerkship with comparison group of students who had not yet completed this rotation. PARTICIPANTS/SETTING Third- and fourth-year medical students on the required medicine clerkship (115 students completed the curriculum; 158 completed patient-related logs). INTERVENTION Following an extensive needs assessment and formulation of competencies and objectives, we developed a web-based women's health curriculum completed during the ambulatory portion of the medicine clerkship. The modules were case based and included web links, references, and immediate feedback on posttesting. We discuss technical issues with implementation and maintenance. MEASUREMENTS AND MAIN RESULTS We evaluated this curriculum using anonymous questionnaires, open-ended narrative comments, online multiple-choice tests, and personal digital assistant (PDA) logs of patient-related discussions of women's health. Students completing the curriculum valued learning women's health, preferred this self-directed learning over lecture, scored highly on knowledge tests, and were involved in more and higher-level discussions of women's health with faculty (P <.001). CONCLUSIONS We present a model for the systematic design of a web-based women's health curriculum as part of a medicine clerkship. The web-based instruction resolved barriers associated with limited curriculum time and faculty availability, provided an accessible and standard curriculum, and met the needs of adult learners (with their motivation to learn topics they value and apply this knowledge in their daily work). We hypothesize that our web-based curriculum spurred students to later discuss these topics with faculty. Web-based learning may be particularly suited for women's health because of its multidisciplinary

  5. Oracle Log Buffer Queueing

    SciTech Connect

    Rivenes, A S

    2004-12-08

    The purpose of this document is to investigate Oracle database log buffer queuing and its effect on the ability to load data using a specialized data loading system. Experiments were carried out on a Linux system using an Oracle 9.2 database. Previous experiments on a Sun 4800 running Solaris had shown that 100,000 entities per minute was an achievable rate. The question was then asked, can we do this on Linux, and where are the bottlenecks? A secondary question was also lurking: how can the loading be further scaled to handle even higher throughput requirements? Testing was conducted using a Dell PowerEdge 6650 server with four CPUs and a Dell PowerVault 220s RAID array with 14 36GB drives and 128 MB of cache. Oracle Enterprise Edition 9.2.0.4 was used for the database and Red Hat Linux Advanced Server 2.1 was used for the operating system. This document will detail the maximum observed throughputs using the same test suite that was used for the Sun tests. A detailed description of the testing performed along with an analysis of bottlenecks encountered will be made. Issues related to Oracle and Linux will also be detailed and some recommendations made based on the findings.

  6. Acoustic paramagnetic logging tool

    DOEpatents

    Vail, III, William B.

    1988-01-01

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which generally makes it possible to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores.

  7. MimoPro: a more efficient Web-based tool for epitope prediction using phage display libraries

    PubMed Central

    2011-01-01

    Background A B-cell epitope is a group of residues on the surface of an antigen which stimulates humoral responses. Locating these epitopes on antigens is important for the purpose of effective vaccine design. In recent years, mapping affinity-selected peptides screened from a random phage display library to the native epitope has become popular in epitope prediction. These peptides, also known as mimotopes, share similar structure and function with the corresponding native epitopes. Great effort has been made in using this similarity between such mimotopes and native epitopes in prediction, which has resulted in better outcomes than statistics-based methods can achieve. However, this approach cannot maintain a high degree of satisfaction in all circumstances. Results In this study, we propose a new method that maps a group of mimotopes back to a source antigen so as to locate the interacting epitope on the antigen. The core of this method is a searching algorithm that is incorporated with both dynamic programming (DP) and branch and bound (BB) optimization and operated on a series of overlapping patches on the surface of a protein. These patches are then transformed to a number of graphs using an adaptable distance threshold (ADT) regulated by an appropriate compactness factor (CF), a novel parameter proposed in this study. Compared with both Pep-3D-Search and PepSurf, two leading graph-based search tools, on average from the results of 18 test cases, MimoPro, the Web-based implementation of our proposed method, performed better in sensitivity, precision, and Matthews correlation coefficient (MCC) than both did in epitope prediction. In addition, MimoPro is significantly faster than both Pep-3D-Search and PepSurf in processing. Conclusions Our search algorithm designed for processing well constructed graphs using an ADT regulated by CF is more sensitive and significantly faster than other graph-based approaches in epitope prediction. MimoPro is a viable alternative to both

  8. 3. Log bunkhouse (far left), log chicken house (left of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Log bunkhouse (far left), log chicken house (left of center), equipment shed (center), and workshop (far right). View to northwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  9. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.

    2014-03-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux often focus on annual timescales but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics with non-parametric quantile regression forests (QRF) based on high-frequency (3 min) discharge measurements and sediment concentration data sampled every 30-60 min in similar-sized (˜0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the random forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ˜80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. In particular, dry-season logging dampened the relative role of these rare, extreme sediment-transport events by increasing load efficiency during more efficient moderate events. We show that QRFs outperform traditional sediment rating curves (SRCs) in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.

  10. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    PubMed Central

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-01-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US–Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms. PMID:24743552

  11. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    NASA Astrophysics Data System (ADS)

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-04-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.

  12. Use of Group Discussion and Learning Portfolio to Build Knowledge for Managing Web Group Learning

    ERIC Educational Resources Information Center

    Chen, Gwo-Dong; Ou, Kuo-Liang; Wang, Chin-Yeh

    2003-01-01

    To monitor and enhance the learning performance of learning groups in a Web learning system, teachers need to know the learning status of the group and determine the key influences affecting group learning outcomes. Teachers can achieve this goal by observing the group discussions and learning behavior from Web logs and analyzing the Web log data…

  13. Use of Group Discussion and Learning Portfolio to Build Knowledge for Managing Web Group Learning

    ERIC Educational Resources Information Center

    Chen, Gwo-Dong; Ou, Kuo-Liang; Wang, Chin-Yeh

    2003-01-01

    To monitor and enhance the learning performance of learning groups in a Web learning system, teachers need to know the learning status of the group and determine the key influences affecting group learning outcomes. Teachers can achieve this goal by observing the group discussions and learning behavior from Web logs and analyzing the Web log data…

  14. Effects of log defects on lumber recovery.

    Treesearch

    James M. Cahill; Vincent S. Cegelka

    1989-01-01

    The impact of log defects on lumber recovery and the accuracy of cubic log scale deductions were evaluated from log scale and product recovery data for more than 3,000 logs. Lumber tally loss was estimated by comparing the lumber yield of sound logs to that of logs containing defects. The data were collected at several product recovery studies; they represent most of...

  15. ASCOT: a text mining-based web-service for efficient search and assisted creation of clinical trials

    PubMed Central

    2012-01-01

    Clinical trials are mandatory protocols describing medical research on humans and among the most valuable sources of medical practice evidence. Searching for trials relevant to some query is laborious due to the immense number of existing protocols. Apart from search, writing new trials includes composing detailed eligibility criteria, which might be time-consuming, especially for new researchers. In this paper we present ASCOT, an efficient search application customised for clinical trials. ASCOT uses text mining and data mining methods to enrich clinical trials with metadata, that in turn serve as effective tools to narrow down search. In addition, ASCOT integrates a component for recommending eligibility criteria based on a set of selected protocols. PMID:22595088

  16. ASCOT: a text mining-based web-service for efficient search and assisted creation of clinical trials.

    PubMed

    Korkontzelos, Ioannis; Mu, Tingting; Ananiadou, Sophia

    2012-04-30

    Clinical trials are mandatory protocols describing medical research on humans and among the most valuable sources of medical practice evidence. Searching for trials relevant to some query is laborious due to the immense number of existing protocols. Apart from search, writing new trials includes composing detailed eligibility criteria, which might be time-consuming, especially for new researchers. In this paper we present ASCOT, an efficient search application customised for clinical trials. ASCOT uses text mining and data mining methods to enrich clinical trials with metadata, that in turn serve as effective tools to narrow down search. In addition, ASCOT integrates a component for recommending eligibility criteria based on a set of selected protocols.

  17. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    DTIC Science & Technology

    2010-06-01

    from SANS Whitepaper - "... Detecting Attacks on Web Applications from Log Files" #look for image tags type=Single continue=TakeNext ptype=RegExp ... shellcmd /home/user/sec-2.5.3/common/syslogclient "... Synthetic: " "$2|$1|xss detected in image tag: $3" #send the raw log type=Single ptype=RegExp ... Expressions taken from SANS Whitepaper - "... Detecting Attacks on Web Applications from Log Files" #look for image tags type=Single continue=TakeNext

  18. Primary detection of hardwood log defects using laser surface scanning

    NASA Astrophysics Data System (ADS)

    Thomas, Edward; Thomas, Liya; Mili, Lamine; Ehrich, Roger W.; Abbott, A. Lynn; Shaffer, Clifford

    2003-05-01

    The use of laser technology to scan hardwood log surfaces for defects holds great promise for improving processing efficiency and the value and volume of lumber produced. External and internal defect detection to optimize hardwood log and lumber processing is one of the top four technological needs in the nation's hardwood industry. The location, type, and severity of defects on hardwood logs are the key indicators of log quality and value. These visual cues provide information about internal log characteristics and products for which the log is suitable. We scanned 162 logs with a high-resolution industrial four-head laser surface scanner. The resulting data sets contain hundreds of thousands of three-dimensional coordinate points. The size of the data and noise presented special problems during processing. Robust regression models were used to fit geometric shapes to the data. The estimated orthogonal distances between the fitted model and the log surface are converted to a two-dimensional image to facilitate defect detection. Using robust regression methods and standard image processing tools we have demonstrated that severe surface defects on hardwood logs can be detected using height and contour analyses of three-dimensional laser scan data.

  19. Well Logging with Californium-252

    SciTech Connect

    Boulogne, A.R.

    2003-01-06

    Californium-252 is an intense neutron emitter that has only recently become available for experimental well logging. The purpose of this research is to investigate the application of well logging to groundwater hydrology; however, most of the techniques and purposes are quite similar to applications in the petroleum industry.

  20. Logging slash and forest protection.

    Treesearch

    Raphael Zon; Russell N. Cunningham

    1931-01-01

    What to do with the brush after logging? This question has been debated in Wisconsin throughout the entire history of lumbering. In the popular mind, the occurrence of severe forest conflagrations has invariably been associated with the presence of logging slash on the ground. The occurrence of vast forest fires was noted by explorers and fur traders long before...

  1. Sawing SHOLO logs: three methods

    Treesearch

    Ronald E. Coleman; Hugh W. Reynolds

    1973-01-01

    Three different methods of sawing the SHOLO log were compared on a board-foot yield basis. Using sawmill simulation, all three methods of sawing were performed on the same sample of logs, eliminating differences due to sampling. A statistical test was made to determine whether or not there were any real differences between the board-foot yields. Two of the sawing...

  2. Protecting log cabins from decay

    Treesearch

    R. M. Rowell; J. M. Black; L. R. Gjovik; W. C. Feist

    1977-01-01

    This report answers the questions most often asked of the Forest Service on the protection of log cabins from decay, and on practices for the exterior finishing and maintenance of existing cabins. Causes of stain and decay are discussed, as are some basic techniques for building a cabin that will minimize decay. Selection and handling of logs, their preservative...

  3. Review of log sort yards

    Treesearch

    John Rusty Dramm; Gerry L. Jackson; Jenny Wong

    2002-01-01

    This report provides a general overview of current log sort yard operations in the United States, including an extensive literature review and information collected during on-site visits to several operations throughout the nation. Log sort yards provide many services in marketing wood and fiber by concentrating, merchandising, processing, sorting, and adding value to...

  4. Sonic log prediction in carbonates

    NASA Astrophysics Data System (ADS)

    Islam, Nayyer

    This work is conducted to study the complications associated with sonic log prediction in carbonate rocks and to investigate possible solutions to accurately predict the sonic logs in the Traverse Limestone. Well logs from fifty different wells were analyzed to define the mineralogy of the Traverse Limestone by using conventional 4-mineral and 3-mineral identification approaches. We modified the conventional 3-mineral identification approach (that completely neglects the gamma ray response) to correct the shale effects on the basis of the gamma ray log before employing the 3-mineral identification. This modification helped to gain meaningful insight into the data when a plot was made between DGA (dry grain density) and UMA (Photoelectric Volumetric Cross-section) with the characteristic ternary diagram of quartz, calcite, and dolomite. The results were then compared with the 4-mineral identification approach. Contour maps of the average mineral fractions present in the Traverse Limestone were prepared to show the basin-wide mineralogy of the Traverse Limestone. In the second part, the sonic response of the Traverse Limestone was predicted in fifty randomly distributed wells. We used the modified time average equation that accounts for the shale effects on the basis of the gamma ray log, and used it to predict the sonic behavior from density porosity and average porosity. To account for the secondary porosity of dolomite, we subtracted the dolomitic fraction of clean porosity from the total porosity. The pseudo-sonic logs were then compared with the measured sonic logs on a root mean square (RMS) basis. Addition of the dolomite correction to the modified time average equation improved the results of sonic prediction from neutron porosity and average porosity. The results demonstrated that sonic logs could be predicted in carbonate rocks with a root mean square error of about 4 µs/ft. We also attempted the use of individual mineral components for sonic log prediction but the
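
    For reference, the classical Wyllie time-average relation underlying such predictions, together with one common shale-corrected variant (shown as an illustration, not necessarily the exact modification used in this work), can be written as

        \Delta t = \phi\,\Delta t_{fluid} + (1 - \phi)\,\Delta t_{matrix}

        \Delta t = \phi_e\,\Delta t_{fluid} + V_{sh}\,\Delta t_{sh} + (1 - \phi_e - V_{sh})\,\Delta t_{matrix}

    where \Delta t is the sonic transit time, \phi (or the effective porosity \phi_e) comes from the porosity logs, and V_{sh} is a shale volume estimated from the gamma ray log.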

  5. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.

    2013-09-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but potentially overlook important geomorphic responses on shorter time scales immediately following timber harvest. Sediment fluxes are commonly estimated from linear regression of intermittent measurements of water and sediment discharge using sediment rating curves (SRCs). However, these often unsatisfactorily reproduce non-linear effects such as discharge-load hystereses. We resolve such important dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3 min) measurements of stream discharge and sediment concentrations in similar-sized (~ 0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest (RF) algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~ 80% of the total sediment load was transported during rare but high magnitude runoff events during only 5% of the monitoring period. The variability of sediment flux of these rare events spans four orders of magnitude. In particular, dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRFs outperform traditional SRCs in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment discharge at high temporal resolution.
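
    For context, a sediment rating curve in its usual power-law form is the log-linear least-squares fit

        C = a\,Q^{b} \quad \Longleftrightarrow \quad \log C = \log a + b\,\log Q

    with C the suspended sediment concentration, Q the water discharge, and a, b regression coefficients (generic notation, not taken from the abstract); the QRF approach described above replaces this single fitted line with conditional quantiles of C given the predictors.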

  6. Web Search Studies: Multidisciplinary Perspectives on Web Search Engines

    NASA Astrophysics Data System (ADS)

    Zimmer, Michael

    Perhaps the most significant tool of our internet age is the web search engine, providing a powerful interface for accessing the vast amount of information available on the world wide web and beyond. While still in its infancy compared to the knowledge tools that precede it - such as the dictionary or encyclopedia - the impact of web search engines on society and culture has already received considerable attention from a variety of academic disciplines and perspectives. This article aims to organize a meta-discipline of “web search studies,” centered around a nucleus of major research on web search engines from five key perspectives: technical foundations and evaluations; transaction log analyses; user studies; political, ethical, and cultural critiques; and legal and policy analyses.

  7. Security Guards for the Future Web

    DTIC Science & Technology

    2004-09-01

    major company, including BEA, IBM, Microsoft, Oracle, and Sun, supports and/or develops Web Service standards. Table 3-1. Layers of Web Services...2002. [Eic04] Spec #1-WS-Addressing, http://dotnetjunkies.com/WebLog/seichert/archive/2004/02/06/6740.aspx, September 04. [Fed03] Federation of...a large following among key vendors and is implemented to some degree in products such as BEA WebLogic, BizTalk, and Collaxa. Business Process

  8. Soil disturbances from horse/mule logging operations coupled with machines in the Southern United States

    Treesearch

    Suraj P. Shrestha; Bobby L. Lanford; Robert Rummer; Mark Dubois

    2008-01-01

    Forest harvesting with animals is a labor-intensive operation. While mechanized logging is very efficient for large tracts of timber, it is often disruptive to the soil. Small logging operations using animals may be less environmentally disruptive. To better understand horse/mule logging performances for soil disturbance, five different horse/mule harvesting operations...

  9. Costs of logging thinnings and a clearcutting in Appalachia using a truck-mounted crane

    Treesearch

    Raymond L. Sarles; Kenneth R. Whitenack

    1984-01-01

    Four timber cutting treatments - three levels of thinning and a clearcutting - were applied on 60-year-old mountain stands of Allegheny hardwoods. The stands were logged by a three-man crew using chain saws and a truck-mounted crane. Logging operations were studied, and production rates determined for tree-length logs decked at roadside. Work efficiency and...

  10. High resolution gamma spectroscopy well logging system

    SciTech Connect

    Giles, J.R.; Dooley, K.J.

    1997-05-01

    A Gamma Spectroscopy Logging System (GSLS) has been developed to study sub-surface radionuclide contamination. The absolute counting efficiencies of the GSLS detectors were determined using cylindrical reference sources. More complex borehole geometries were modeled using commercially available shielding software and correction factors were developed based on relative gamma-ray fluence rates. Examination of varying porosity and moisture content showed that as porosity increases, and as the formation saturation ratio decreases, relative gamma-ray fluence rates increase linearly for all energies. Correction factors for iron and water cylindrical shields were found to agree well with correction factors determined during previous studies allowing for the development of correction factors for type-304 stainless steel and low-carbon steel casings. Regression analyses of correction factor data produced equations for determining correction factors applicable to spectral gamma-ray well logs acquired under non-standard borehole conditions.

  11. Web Mining

    NASA Astrophysics Data System (ADS)

    Fürnkranz, Johannes

    The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.

  12. Close-Call Action Log Form

    NASA Technical Reports Server (NTRS)

    Spuler, Linda M.; Ford, Patricia K.; Skeete, Darren C.; Hershman, Scot; Raviprakash, Pushpa; Arnold, John W.; Tran, Victor; Haenze, Mary Alice

    2005-01-01

    "Close Call Action Log Form" ("CCALF") is the name of both a computer program and a Web-based service provided by the program for creating an enhanced database of close calls (in the colloquial sense of mishaps that were avoided by small margins) assigned to the Center Operations Directorate (COD) at Johnson Space Center. CCALF provides a single facility for on-line collaborative review of close calls. Through CCALF, managers can delegate responses to employees. CCALF utilizes a pre-existing e-mail system to notify managers that there are close calls to review, but eliminates the need for the prior practices of passing multiple e-mail messages around the COD, then collecting and consolidating them into final responses: CCALF now collects comments from all responders for incorporation into reports that it generates. Also, whereas it was previously necessary to manually calculate metrics (e.g., numbers of maintenance-work orders necessitated by close calls) for inclusion in the reports, CCALF now computes the metrics, summarizes them, and displays them in graphical form. The reports and all pertinent information used to generate the reports are logged, tracked, and retained by CCALF for historical purposes.

  13. New materials for fireplace logs

    NASA Technical Reports Server (NTRS)

    Kieselback, D. J.; Smock, A. W.

    1971-01-01

    Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.

  14. Effects of selective logging on tropical forest tree growth

    NASA Astrophysics Data System (ADS)

    Figueira, Adelaine Michela E. S.; Miller, Scott D.; de Sousa, Cleilim Albert D.; Menton, Mary C.; Maia, Augusto R.; Da Rocha, Humberto R.; Goulden, Michael L.

    2008-03-01

    We combined measurements of tree growth and carbon dioxide exchange to investigate the effects of selective logging on the Aboveground Live Biomass (AGLB) of a tropical rain forest in the Amazon. Most of the measurements began at least 10 months before logging and continued at least 36 months after logging. The logging removed ˜15% of the trees with Diameter at Breast Height (DBH) greater than 35 cm, which resulted in an instantaneous 10% reduction in AGLB. Both wood production and mortality increased following logging, while Gross Primary Production (GPP) was unchanged. The ratio of wood production to GPP (the wood Carbon Use Efficiency or wood CUE) more than doubled following logging. Small trees (10 cm < DBH < 35 cm) accounted for most of the enhanced wood production. Medium trees (35 cm < DBH < 55 cm) that were within 30 m of canopy gaps created by the logging also showed increased growth. The patterns of enhanced growth are most consistent with logging-induced increases in light availability. The AGLB continued to decline over the study, as mortality outpaced wood production. Wood CUE and mortality remained elevated throughout the 3 years of postlogging measurements. The future trajectory of AGLB and the forest's carbon balance are uncertain, and will depend on how long it takes for heterotrophic respiration, mortality, and CUE to return to prelogging levels.

  15. Acoustic sorting models for improved log segregation

    Treesearch

    Xiping Wang; Steve Verrill; Eini Lowell; Robert J. Ross; Vicki L. Herian

    2013-01-01

    In this study, we examined three individual log measures (acoustic velocity, log diameter, and log vertical position in a tree) for their ability to predict average modulus of elasticity (MOE) and grade yield of structural lumber obtained from Douglas-fir (Pseudotsuga menziesii [Mirb. Franco]) logs. We found that log acoustic velocity only had a...

  16. Demonstration of the Web-based Interspecies Correlation Estimation (Web-ICE) modeling application

    EPA Science Inventory

    The Web-based Interspecies Correlation Estimation (Web-ICE) modeling application is available to the risk assessment community through a user-friendly internet platform (http://epa.gov/ceampubl/fchain/webice/). ICE models are log-linear least square regressions that predict acute...

  17. Demonstration of the Web-based Interspecies Correlation Estimation (Web-ICE) modeling application

    EPA Science Inventory

    The Web-based Interspecies Correlation Estimation (Web-ICE) modeling application is available to the risk assessment community through a user-friendly internet platform (http://epa.gov/ceampubl/fchain/webice/). ICE models are log-linear least square regressions that predict acute...
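
    As a generic illustration of a log-linear interspecies correlation model of this kind (the symbols are generic and not the application's exact parameterization):

        \log_{10}(\text{acute toxicity, predicted taxon}) = a + b\,\log_{10}(\text{acute toxicity, surrogate taxon})

    where a and b are least-squares estimates obtained from paired acute toxicity records for the two taxa.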

  18. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  19. A Guide to Hardwood Log Grading

    Treesearch

    Everette D. Rast; David L. Sonderman; Glenn L. Gammon

    1973-01-01

    A guide to hardwood log grading (revised) was developed as a teaching aid and field reference in grading hardwood logs. Outlines basic principles and gives detailed practical applications, with illustrations, in grading hardwood logs. Includes standards for various use classes.

  20. Improving Website Hyperlink Structure Using Server Logs

    PubMed Central

    Paranjape, Ashwin; West, Robert; Zia, Leila; Leskovec, Jure

    2016-01-01

    Good websites should be easy to navigate via hyperlinks, yet maintaining a high-quality link structure is difficult. Identifying pairs of pages that should be linked may be hard for human editors, especially if the site is large and changes frequently. Further, given a set of useful link candidates, the task of incorporating them into the site can be expensive, since it typically involves humans editing pages. In the light of these challenges, it is desirable to develop data-driven methods for automating the link placement task. Here we develop an approach for automatically finding useful hyperlinks to add to a website. We show that passively collected server logs, beyond telling us which existing links are useful, also contain implicit signals indicating which nonexistent links would be useful if they were to be introduced. We leverage these signals to model the future usefulness of yet nonexistent links. Based on our model, we define the problem of link placement under budget constraints and propose an efficient algorithm for solving it. We demonstrate the effectiveness of our approach by evaluating it on Wikipedia, a large website for which we have access to both server logs (used for finding useful new links) and the complete revision history (containing a ground truth of new links). As our method is based exclusively on standard server logs, it may also be applied to any other website, as we show with the example of the biomedical research site Simtk. PMID:28345077
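
    The abstract does not spell out the algorithm, but the flavor of link placement under a budget can be sketched as a greedy selection over log-derived usefulness scores. Everything below (function name, score field, per-page budget) is a hypothetical simplification for illustration, not the paper's method.

      from collections import defaultdict

      def greedy_link_placement(candidates, budget_per_page):
          # candidates: iterable of (source_page, target_page, estimated_usefulness),
          # where the usefulness estimate would come from a model fit on server logs.
          chosen, used = [], defaultdict(int)
          for src, dst, score in sorted(candidates, key=lambda c: c[2], reverse=True):
              if used[src] < budget_per_page:     # respect the per-page budget
                  chosen.append((src, dst, score))
                  used[src] += 1
          return chosen

      # Hypothetical usage:
      print(greedy_link_placement(
          [("PageA", "PageB", 120.0), ("PageA", "PageC", 45.0), ("PageD", "PageB", 30.0)],
          budget_per_page=1))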

  1. Hardwood log grades and lumber grade yields for factory lumber logs

    Treesearch

    Leland F. Hanks; Glenn L. Gammon; Robert L. Brisbin; Everette D. Rast

    1980-01-01

    The USDA Forest Service Standard Grades for Hardwood Factory Lumber Logs are described, and lumber grade yields for 16 species and 2 species groups are presented by log grade and log diameter. The grades enable foresters, log buyers, and log sellers to select and grade those logs suitable for conversion into standard factory grade lumber. By using the appropriate lumber...

  2. Early identification of adverse drug reactions from search log data.

    PubMed

    White, Ryen W; Wang, Sheng; Pant, Apurv; Harpaz, Rave; Shukla, Pushpraj; Sun, Walter; DuMouchel, William; Horvitz, Eric

    2016-02-01

    The timely and accurate identification of adverse drug reactions (ADRs) following drug approval is a persistent and serious public health challenge. Aggregated data drawn from anonymized logs of Web searchers have been shown to be a useful source of evidence for detecting ADRs. However, prior studies have been based on the analysis of established ADRs, the existence of which may already be known publicly. Awareness of these ADRs can inject existing knowledge about the known ADRs into online content and online behavior, and thus raise questions about the ability of the behavioral log-based methods to detect new ADRs. In contrast to previous studies, we investigate the use of search logs for the early detection of known ADRs. We use a large set of recently labeled ADRs and negative controls to evaluate the ability of search logs to accurately detect ADRs in advance of their publication. We leverage the Internet Archive to estimate when evidence of an ADR first appeared in the public domain and adjust the index date in a backdated analysis. Our results demonstrate how search logs can be used to detect new ADRs, the central challenge in pharmacovigilance.

  3. Mail LOG: Program operating instructions

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The operating instructions for the software package, MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT - putting new records into the data base; (2) REVISE - changing or modifying existing records in the data base; (3) SEARCH - finding special records existing in the data base; and (4) ARCHIVE - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.
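
    The four modes map naturally onto simple operations over a record store. A minimal sketch of that structure is shown below; the original is a FORTRAN program on a PRIME 300, so the Python, field names, and function signatures here are purely illustrative assumptions.

        # Minimal sketch of a record store with the four MAIL LOG modes:
        # INPUT, REVISE, SEARCH, and ARCHIVE. Field names are hypothetical.
        records = {}      # active records keyed by record id
        archive = {}      # archived records

        def input_record(rec_id, subfile, description):
            """INPUT: put a new record into the data base."""
            records[rec_id] = {"subfile": subfile, "description": description}

        def revise_record(rec_id, **changes):
            """REVISE: change or modify an existing record."""
            records[rec_id].update(changes)

        def search_records(keyword):
            """SEARCH: find records whose description contains the keyword."""
            return [r for r in records.values() if keyword in r["description"]]

        def archive_record(rec_id):
            """ARCHIVE: store an existing record away from the active file."""
            archive[rec_id] = records.pop(rec_id)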

  4. Maintaining Quality of Library Web Sites Using Cluster and Path Analysis.

    ERIC Educational Resources Information Center

    Matylonek, John

    2002-01-01

    Provides methods for the systematic redesign of a library Web site through comparing baseline data changes in use brought about by design changes. Shows how complementary information, based on Web users' log statistics and direct observations of users, can enhance a library Web site. Relates this information to practical Web site decisions that…

  5. Impacts of intensive logging on the trophic organisation of ant communities in a biodiversity hotspot.

    PubMed

    Woodcock, Paul; Edwards, David P; Newton, Rob J; Vun Khen, Chey; Bottrell, Simon H; Hamer, Keith C

    2013-01-01

    Trophic organisation defines the flow of energy through ecosystems and is a key component of community structure. Widespread and intensifying anthropogenic disturbance threatens to disrupt trophic organisation by altering species composition and relative abundances and by driving shifts in the trophic ecology of species that persist in disturbed ecosystems. We examined how intensive disturbance caused by selective logging affects trophic organisation in the biodiversity hotspot of Sabah, Borneo. Using stable nitrogen isotopes, we quantified the positions in the food web of 159 leaf-litter ant species in unlogged and logged rainforest and tested four predictions: (i) there is a negative relationship between the trophic position of a species in unlogged forest and its change in abundance following logging, (ii) the trophic positions of species are altered by logging, (iii) disturbance alters the frequency distribution of trophic positions within the ant assemblage, and (iv) disturbance reduces food chain length. We found that ant abundance was 30% lower in logged forest than in unlogged forest but changes in abundance of individual species were not related to trophic position, providing no support for prediction (i). However, trophic positions of individual species were significantly higher in logged forest, supporting prediction (ii). Consequently, the frequency distribution of trophic positions differed significantly between unlogged and logged forest, supporting prediction (iii), and food chains were 0.2 trophic levels longer in logged forest, the opposite of prediction (iv). Our results demonstrate that disturbance can alter trophic organisation even without trophically-biased changes in community composition. Nonetheless, the absence of any reduction in food chain length in logged forest suggests that species-rich arthropod food webs do not experience trophic downgrading or a related collapse in trophic organisation despite the disturbance caused by logging

  6. Impacts of Intensive Logging on the Trophic Organisation of Ant Communities in a Biodiversity Hotspot

    PubMed Central

    Woodcock, Paul; Edwards, David P.; Newton, Rob J.; Vun Khen, Chey; Bottrell, Simon H.; Hamer, Keith C.

    2013-01-01

    Trophic organisation defines the flow of energy through ecosystems and is a key component of community structure. Widespread and intensifying anthropogenic disturbance threatens to disrupt trophic organisation by altering species composition and relative abundances and by driving shifts in the trophic ecology of species that persist in disturbed ecosystems. We examined how intensive disturbance caused by selective logging affects trophic organisation in the biodiversity hotspot of Sabah, Borneo. Using stable nitrogen isotopes, we quantified the positions in the food web of 159 leaf-litter ant species in unlogged and logged rainforest and tested four predictions: (i) there is a negative relationship between the trophic position of a species in unlogged forest and its change in abundance following logging, (ii) the trophic positions of species are altered by logging, (iii) disturbance alters the frequency distribution of trophic positions within the ant assemblage, and (iv) disturbance reduces food chain length. We found that ant abundance was 30% lower in logged forest than in unlogged forest but changes in abundance of individual species were not related to trophic position, providing no support for prediction (i). However, trophic positions of individual species were significantly higher in logged forest, supporting prediction (ii). Consequently, the frequency distribution of trophic positions differed significantly between unlogged and logged forest, supporting prediction (iii), and food chains were 0.2 trophic levels longer in logged forest, the opposite of prediction (iv). Our results demonstrate that disturbance can alter trophic organisation even without trophically-biased changes in community composition. Nonetheless, the absence of any reduction in food chain length in logged forest suggests that species-rich arthropod food webs do not experience trophic downgrading or a related collapse in trophic organisation despite the disturbance caused by logging

  7. Method for induced polarization logging

    SciTech Connect

    Vinegar, H.J.; Waxman, M.H.

    1987-04-14

    A method is described for generating a log of the formation phase shift, resistivity and spontaneous potential of an earth formation from data obtained from the earth formation with a multi-electrode induced polarization logging tool. The method comprises obtaining data samples from the formation at measurement points equally spaced in time of the magnitude and phase of the induced voltage and the magnitude and phase of the current supplied by a circuit through a reference resistance R/sub 0/ to a survey current electrode associated with the tool.
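
    Although the patent text above does not give the computation, the underlying idea of measuring a complex formation response from the magnitude and phase of the induced voltage and the survey current can be sketched as follows. The variable names and the simple impedance ratio are illustrative assumptions, not the patented method.

        import cmath

        def formation_response(v_mag, v_phase_rad, i_mag, i_phase_rad):
            """Illustrative complex-impedance calculation from sampled magnitude
            and phase of the induced voltage and the survey current."""
            voltage = cmath.rect(v_mag, v_phase_rad)   # complex induced voltage
            current = cmath.rect(i_mag, i_phase_rad)   # complex survey current
            impedance = voltage / current
            phase_shift = cmath.phase(impedance)       # formation phase shift (radians)
            magnitude = abs(impedance)                 # proportional to resistivity
                                                       # (times a geometric factor)
            return magnitude, phase_shift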

  8. Sexual information seeking on web search engines.

    PubMed

    Spink, Amanda; Koricich, Andrew; Jansen, B J; Cole, Charles

    2004-02-01

    Sexual information seeking is an important element within human information behavior. Seeking sexually related information on the Internet takes many forms and channels, including chat room discussions, accessing Websites or searching Web search engines for sexual materials. The study of sexual Web queries provides insight into sexually-related information-seeking behavior, of value to Web users and providers alike. We qualitatively analyzed queries from logs of 1,025,910 Alta Vista and AlltheWeb.com Web user queries from 2001. We compared the differences in sexually-related Web searching between Alta Vista and AlltheWeb.com users. Differences were found in session duration, query outcomes, and search term choices. Implications of the findings for sexual information seeking are discussed.

  9. Solar-A reformatted data files and observing log

    NASA Technical Reports Server (NTRS)

    Morrison, M. D.; Lemen, J. R.; Acton, L. W.; Bentley, R. D.; Kosugi, T.; Tsuneta, S.; Ogawara, Y.; Watanabe, T.

    1991-01-01

    An overview is presented of the Solar-A telemetry data files which are to be created and the format and organization which the files are to use. The organization chosen is to be efficient in space, to facilitate access to the data, and to allow the data to be transportable to different machines. An observing log file is to be created automatically, using the reformatted data files as the input. It will be possible to perform searches with the observing log to list cases where instruments are in certain modes and/or seeing certain signal levels. A user will be able to search the observing log and obtain a list of all cases where a given set of conditions are satisfied. An event log will be created listing the times when the instrument or spacecraft modes change.

  10. Predicting internal red oak (Quercus rubra) log defect features using surface defect measurements

    Treesearch

    R. Edward. Thomas

    2013-01-01

    Determining the defects located within a log is crucial to understanding the tree/log resource for efficient processing. However, existing means of doing this non-destructively require the use of expensive x-ray/CT (computerized tomography), MRI (magnetic resonance imaging), or microwave technology. These methods do not lend themselves to fast, efficient, and cost-...

  11. Adapting the right web pages to the right users

    NASA Astrophysics Data System (ADS)

    Hui, Xiong; Sung, Sam Y.; Huang, Stephen

    2000-04-01

    With the explosive use of the Internet, there is an ever-increasing volume of Web usage data being generated and warehoused in numerous successful Web sites. Analyzing Web usage data can help Web developers to improve the organization and presentation of their Web sites. Considering the fact that mining for patterns and rules in market basket data is well studied in the data mining field, we provide a mapping approach, which can transform Web usage data into a form like market basket data. Using our model, all the methods developed by data mining research groups can be directly applied to Web usage data without much change. Existing methods for knowledge discovery in Web logs are restricted by the difficulty of obtaining complete and reliable Web usage data and effectively identifying user sessions using the current Web server log mechanism. The problem is due to Web caching and the existence of proxy servers. As an effort to remedy this problem, we built our own Web server log mechanism that can effectively capture user access behavior and will not be deliberately bypassed by proxy servers and end users.
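
    The mapping the authors describe, from sessionized Web usage data to market-basket-style transactions, can be sketched roughly as follows. The log format, session identifiers, and page names are assumptions for illustration only.

        from collections import defaultdict

        # Hypothetical cleaned log entries: (session_id, requested_page)
        log_entries = [
            ("s1", "/home"), ("s1", "/products"), ("s1", "/cart"),
            ("s2", "/home"), ("s2", "/about"),
        ]

        def sessions_to_baskets(entries):
            """Group page requests by session so that each session becomes a
            'basket' of pages, analogous to items in market basket data."""
            baskets = defaultdict(set)
            for session_id, page in entries:
                baskets[session_id].add(page)
            return list(baskets.values())

        baskets = sessions_to_baskets(log_entries)
        # These baskets can now be fed to standard association-rule miners
        # (e.g., Apriori) without further change.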

  12. Log exports by port, 1987.

    Treesearch

    Debra D. Warren

    1989-01-01

    Volumes and average values of log exports by port have been compiled by quarter for 1987. The tables show the four Northwest customs districts by ports, species, and destinations. These data were received from the U.S. Department of Commerce too late to be published in the 1987 quarterly reports, "Production, Prices, Employment, and Trade in Northwest Forest...

  13. Downhole memory-logging tools

    SciTech Connect

    Lysne, P.

    1992-01-01

    Logging technologies developed for hydrocarbon resource evaluation have not migrated into geothermal applications even though data so obtained would strengthen reservoir characterization efforts. Two causative issues have impeded progress: (i) there is a general lack of vetted, high-temperature instrumentation, and (ii) the interpretation of log data generated in a geothermal formation is in its infancy. Memory-logging tools provide a path around the first obstacle by providing quality data at a low cost. These tools feature on-board computers that process and store data, and newer systems may be programmed to make "decisions". Since memory tools are completely self-contained, they are readily deployed using the slick line found on most drilling locations. They have proven to be rugged, and a minimum training program is required for operator personnel. Present tools measure properties such as temperature and pressure, and the development of noise, deviation, and fluid conductivity logs based on existing hardware is relatively easy. A more complex geochemical tool aimed at a quantitative analysis of potassium, uranium and thorium will be available in about one year, and it is expandable into all nuclear measurements common in the hydrocarbon industry. A second tool designed to sample fluids at conditions exceeding 400 °C is in the proposal stage. Partnerships are being formed between the geothermal industry, scientific drilling programs, and the national laboratories to define and develop inversion algorithms relating raw tool data to more pertinent information. 8 refs.

  14. Downhole Memory-Logging Tools

    SciTech Connect

    Lysne, Peter

    1992-03-24

    Logging technologies developed for hydrocarbon resource evaluation have not migrated into geothermal applications even though data so obtained would strengthen reservoir characterization efforts. Two causative issues have impeded progress: (1) there is a general lack of vetted, high-temperature instrumentation, and (2) the interpretation of log data generated in a geothermal formation is in its infancy. Memory-logging tools provide a path around the first obstacle by providing quality data at a low cost. These tools feature onboard computers that process and store data, and newer systems may be programmed to make ''decisions''. Since memory tools are completely self-contained, they are readily deployed using the slick line found on most drilling locations. They have proven to be rugged, and a minimum training program is required for operator personnel. Present tools measure properties such as temperature and pressure, and the development of noise, deviation, and fluid conductivity logs based on existing hardware is relatively easy. A more complex geochemical tool aimed at a quantitative analysis of potassium, uranium and thorium will be available in about one year, and it is expandable into all nuclear measurements common in the hydrocarbon industry. A second tool designed to sample fluids at conditions exceeding 400 C (752 F) is in the proposal stage. Partnerships are being formed between the geothermal industry, scientific drilling programs, and the national laboratories to define and develop inversion algorithms relating raw tool data to more pertinent information.

  15. CRYPTOSPORIDIUM LOG INACTIVATION CALCULATION METHODS

    EPA Science Inventory

    Appendix O of the Surface Water Treatment Rule (SWTR) Guidance Manual introduces the CeffT10 (i.e., reaction zone outlet C value and T10 time) method for calculating ozone CT value and Giardia and virus log inactivation. The LT2ESWTR Pre-proposal Draft Regulatory Language for St...
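
    The CeffT10 approach described above multiplies an outlet disinfectant concentration by the T10 contact time to obtain a CT value, which is then compared against required-CT tables to assign log inactivation credit. A simplified sketch of that arithmetic follows; the required-CT value and the ratio-based credit calculation are placeholders for illustration, not regulatory numbers or the official procedure.

        def ozone_ct(c_eff_mg_per_l, t10_minutes):
            """CeffT10-style CT value: outlet ozone residual times T10 time."""
            return c_eff_mg_per_l * t10_minutes

        def estimated_log_inactivation(ct_achieved, ct_required_per_log):
            """Log credit as the ratio of achieved CT to the CT required per
            log of inactivation (table value; placeholder here)."""
            return ct_achieved / ct_required_per_log

        # Example with placeholder numbers only:
        ct = ozone_ct(c_eff_mg_per_l=0.4, t10_minutes=6.0)   # CT = 2.4 mg·min/L
        print(estimated_log_inactivation(ct, ct_required_per_log=1.0))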

  16. Outdoor Education Student Log Book.

    ERIC Educational Resources Information Center

    Garbutt, Barbara; And Others.

    A student log book for outdoor education was developed to aid Oakland County (Michigan) teachers and supervisors of outdoor education in preparing student campers for their role and responsibilities in the total program. A sample letter to sixth graders explains the purpose of the booklet. General camp rules (10) are presented, followed by 6 woods…

  17. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J. )

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
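
    The statistical approach described, using inverse methods to back out formation parameters from measured log responses, can be illustrated with a generic nonlinear least-squares fit. The forward model below is a made-up placeholder, not a petrophysical equation from the paper.

        import numpy as np
        from scipy.optimize import least_squares

        def forward_model(params, depths):
            """Placeholder forward model mapping formation parameters to a
            predicted log response (purely illustrative)."""
            porosity, saturation = params
            return porosity * np.exp(-saturation * depths)

        def residuals(params, depths, measured):
            return forward_model(params, depths) - measured

        depths = np.linspace(0.0, 10.0, 50)
        measured = forward_model([0.25, 0.3], depths) + 0.01 * np.random.randn(50)

        # Invert the measured log for the formation parameters.
        fit = least_squares(residuals, x0=[0.1, 0.1], args=(depths, measured))
        print(fit.x)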

  18. CRYPTOSPORIDIUM LOG INACTIVATION CALCULATION METHODS

    EPA Science Inventory

    Appendix O of the Surface Water Treatment Rule (SWTR) Guidance Manual introduces the CeffT10 (i.e., reaction zone outlet C value and T10 time) method for calculating ozone CT value and Giardia and virus log inactivation. The LT2ESWTR Pre-proposal Draft Regulatory Language for St...

  19. Logging Work Injuries in Appalachia

    Treesearch

    Charles H. Wolf; Gilbert P. Dempsey

    1978-01-01

    Logging accidents are costly. They may bring pain to injured workers, hardship to their families, and higher insurance premiums and lower productivity to their employers. Our analysis of 1,172 injuries in central Appalachia reveals that nearly half of all time lost-and almost all fatalities-resulted from accidents during felling and unloading. The largest proportion of...

  20. Soil Wetness Influences Log Skidding

    Treesearch

    William N. Darwin

    1960-01-01

    One of the least explored variables in timber harvesting is the effect of ground conditions on log production. The Southern Hardwoods Laboratory is studying this variable and its influence on performance of skidding vehicles in Southern bottom lands. The test reported here was designed to evaluate the effects of bark features on skidding coefficients, but it also...

  1. Postfire logging in riparian areas.

    Treesearch

    Gordon H. Reeves; Peter A. Bisson; Bruce E. Rieman; Lee E. Benda

    2006-01-01

    We reviewed the behavior of wildfire in riparian zones, primarily in the western United States, and the potential ecological consequences of postfire logging. Fire behavior in riparian zones is complex, but many aquatic and riparian organisms exhibit a suite of adaptations that allow relatively rapid recovery after fire. Unless constrained by other factors, fish tend...

  2. The formula Scribner log rule.

    Treesearch

    George R. Staebler

    1952-01-01

    The Scribner Decimal C is the accepted log rule in the Pacific Northwest. Usually volume, growth and yield tables are expressed by this rule to give them practical meaning. Yet in the research required for such studies, the rule is unsatisfactory because of rounded values and irregular jumps in volume from diameter to diameter and length to length.

  3. A New Approach to Logging.

    ERIC Educational Resources Information Center

    Miles, Donna

    2001-01-01

    In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…

  4. Hardwood log supply: a broader perspective

    Treesearch

    Iris Montague; Adri Andersch; Jan Wiedenbeck; Urs. Buehlmann

    2015-01-01

    At regional and state meetings we talk with others in our business about the problems we face: log exports, log quality, log markets, logger shortages, cash flow problems, the weather. These are familiar talking points and real and persistent problems. But what is the relative importance of these problems for log procurement in different regions of...

  5. When is hardwood cable logging economical?

    Treesearch

    Chris B. LeDoux

    1985-01-01

    Using cable logging to harvest eastern hardwood logs on steep terrain can result in low production rates and high costs per unit of wood produced. Logging managers can improve productivity and profitability by knowing how the interaction of site-specific variables and cable logging equipment affect costs and revenues. Data from selected field studies and forest model...

  6. A method of estimating log weights.

    Treesearch

    Charles N. Mann; Hilton H. Lysons

    1972-01-01

    This paper presents a practical method of estimating the weights of logs before they are yarded. Knowledge of log weights is required to achieve optimum loading of modern yarding equipment. Truckloads of logs are weighed and measured to obtain a local density index (pounds per cubic foot) for a species of logs. The density index is then used to estimate the weights of...
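
    The procedure described, deriving a local density index (pounds per cubic foot) from weighed and measured truckloads and applying it to individual log volumes, might be sketched as follows. The use of Smalian's formula for log volume is an assumption for illustration; the paper itself only specifies the density-index approach.

        import math

        def density_index(truckload_weights_lb, truckload_volumes_cuft):
            """Local density index (lb per cubic foot) from weighed and
            measured truckloads of a species."""
            return sum(truckload_weights_lb) / sum(truckload_volumes_cuft)

        def smalian_volume_cuft(d_small_in, d_large_in, length_ft):
            """Smalian's formula for log volume from end diameters (inches)
            and length (feet); used here only as an illustrative volume rule."""
            area_small = math.pi * (d_small_in / 24.0) ** 2   # end area, sq ft
            area_large = math.pi * (d_large_in / 24.0) ** 2
            return 0.5 * (area_small + area_large) * length_ft

        index = density_index([42000, 39500], [820, 760])   # lb per cu ft
        volume = smalian_volume_cuft(14, 18, 32)            # cu ft
        print(index * volume)                               # estimated log weight, lb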

  7. Nondestructive evaluation for sorting red maple logs

    Treesearch

    Xiping Wang; Robert J. Ross; David W. Green; Karl Englund; Michael Wolcott

    2000-01-01

    Existing log grading procedures in the United States make only visual assessments of log quality. These procedures do not incorporate estimates of the modulus of elasticity (MOE) of logs. It is questionable whether the visual grading procedures currently used for logs adequately assess the potential quality of structural products manufactured from them, especially...

  8. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  9. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  10. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  11. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 7 2013-07-01 2013-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  12. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 7 2011-07-01 2011-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  13. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 7 2012-07-01 2012-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  14. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 7 2014-07-01 2014-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  15. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  16. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  17. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  18. A Novel Framework for Medical Web Information Foraging Using Hybrid ACO and Tabu Search.

    PubMed

    Drias, Yassine; Kechid, Samir; Pasi, Gabriella

    2016-01-01

    We present in this paper a novel approach based on multi-agent technology for Web information foraging. We propose for this purpose an architecture with two important phases. The first is a learning process for localizing the most relevant pages that might interest the user, performed on a fixed instance of the Web. The second takes into account the openness and dynamicity of the Web: it consists of incremental learning that starts from the result of the first phase and reshapes the outcomes to reflect the changes the Web undergoes. The system was implemented using a colony of artificial ants hybridized with tabu search in order to achieve greater effectiveness and efficiency. To validate our proposal, experiments were conducted on MedlinePlus, a real website dedicated to research in the health domain, in contrast to previous works where experiments were performed on Web log datasets. The main results are promising, both for those related to strong Web regularities and for the response time, which is very short and hence complies with the real-time constraint.
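
    As a rough illustration of the ant-colony component only, page selection in such a system could follow the generic ACO transition rule sketched below; this is not the specific hybridization with tabu search used in the paper, and the pheromone and relevance values are hypothetical.

        import random

        def choose_next_page(candidates, pheromone, relevance, alpha=1.0, beta=2.0):
            """Generic ACO transition rule: pick the next page with probability
            proportional to pheromone^alpha * heuristic_relevance^beta."""
            weights = [(pheromone[p] ** alpha) * (relevance[p] ** beta)
                       for p in candidates]
            total = sum(weights)
            r, cumulative = random.uniform(0, total), 0.0
            for page, w in zip(candidates, weights):
                cumulative += w
                if r <= cumulative:
                    return page
            return candidates[-1]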

  19. IsoWeb: a bayesian isotope mixing model for diet analysis of the whole food web.

    PubMed

    Kadoya, Taku; Osada, Yutaka; Takimoto, Gaku

    2012-01-01

    Quantitative description of food webs provides fundamental information for the understanding of population, community, and ecosystem dynamics. Recently, stable isotope mixing models have been widely used to quantify dietary proportions of different food resources to a focal consumer. Here we propose a novel mixing model (IsoWeb) that estimates diet proportions of all consumers in a food web based on stable isotope information. IsoWeb requires a topological description of a food web, and stable isotope signatures of all consumers and resources in the web. A merit of IsoWeb is that it takes into account variation in trophic enrichment factors among different consumer-resource links. Sensitivity analysis using realistic hypothetical food webs suggests that IsoWeb is applicable to a wide variety of food webs differing in the number of species, connectance, sample size, and data variability. Sensitivity analysis based on real topological webs showed that IsoWeb can allow for a certain level of topological uncertainty in target food webs, including erroneously assuming false links, omission of existent links and species, and trophic aggregation into trophospecies. Moreover, using an illustrative application to a real food web, we demonstrated that IsoWeb can compare the plausibility of different candidate topologies for a focal web. These results suggest that IsoWeb provides a powerful tool to analyze food-web structure from stable isotope data. We provide R and BUGS codes to aid efficient applications of IsoWeb.

  20. IsoWeb: A Bayesian Isotope Mixing Model for Diet Analysis of the Whole Food Web

    PubMed Central

    Kadoya, Taku; Osada, Yutaka; Takimoto, Gaku

    2012-01-01

    Quantitative description of food webs provides fundamental information for the understanding of population, community, and ecosystem dynamics. Recently, stable isotope mixing models have been widely used to quantify dietary proportions of different food resources to a focal consumer. Here we propose a novel mixing model (IsoWeb) that estimates diet proportions of all consumers in a food web based on stable isotope information. IsoWeb requires a topological description of a food web, and stable isotope signatures of all consumers and resources in the web. A merit of IsoWeb is that it takes into account variation in trophic enrichment factors among different consumer-resource links. Sensitivity analysis using realistic hypothetical food webs suggests that IsoWeb is applicable to a wide variety of food webs differing in the number of species, connectance, sample size, and data variability. Sensitivity analysis based on real topological webs showed that IsoWeb can allow for a certain level of topological uncertainty in target food webs, including erroneously assuming false links, omission of existent links and species, and trophic aggregation into trophospecies. Moreover, using an illustrative application to a real food web, we demonstrated that IsoWeb can compare the plausibility of different candidate topologies for a focal web. These results suggest that IsoWeb provides a powerful tool to analyze food-web structure from stable isotope data. We provide R and BUGS codes to aid efficient applications of IsoWeb. PMID:22848427
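
    For intuition, the simplest special case of an isotope mixing model, two food sources and one consumer with a fixed trophic enrichment factor, reduces to a closed-form proportion. IsoWeb itself is a Bayesian model over a whole food web with link-specific enrichment factors; the equation below is only the textbook two-source case, with notation assumed rather than taken from the paper.

        % Two-source, one-isotope mixing (textbook special case, not IsoWeb itself):
        % p_A is the diet proportion of source A, \Delta the trophic enrichment factor.
        p_A = \frac{\delta^{15}\mathrm{N}_{\mathrm{consumer}} - \Delta - \delta^{15}\mathrm{N}_{B}}
                   {\delta^{15}\mathrm{N}_{A} - \delta^{15}\mathrm{N}_{B}},
        \qquad p_B = 1 - p_A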

  1. Multicriteria evaluation of simulated logging scenarios in a tropical rain forest.

    PubMed

    Huth, Andreas; Drechsler, Martin; Köhler, Peter

    2004-07-01

    Forest growth models are useful tools for investigating the long-term impacts of logging. In this paper, the results of the rain forest growth model FORMIND were assessed by a multicriteria decision analysis. The main processes covered by FORMIND include tree growth, mortality, regeneration and competition. Tree growth is calculated based on a carbon balance approach. Trees compete for light and space; dying large trees fall down and create gaps in the forest. Sixty-four different logging scenarios for an initially undisturbed forest stand at Deramakot (Malaysia) were simulated. The scenarios differ regarding the logging cycle, logging method, cutting limit and logging intensity. We characterise the impacts with four criteria describing the yield, canopy opening and changes in species composition. Multicriteria decision analysis was used for the first time to evaluate the scenarios and identify the efficient ones. Our results plainly show that reduced-impact logging scenarios are more 'efficient' than the others, since in these scenarios forest damage is minimised without significantly reducing yield. Nevertheless, there is a trade-off between yield and achieving a desired ecological state of logged forest; the ecological state of the logged forests can only be improved by reducing yields and enlarging the logging cycles. Our study also demonstrates that high cutting limits or low logging intensities cannot compensate for the high level of damage caused by conventional logging techniques.

  2. Web Engineering

    SciTech Connect

    White, Bebo

    2003-06-23

    Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.

  3. Accurately determining log and bark volumes of saw logs using high-resolution laser scan data

    Treesearch

    R. Edward Thomas; Neal D. Bennett

    2014-01-01

    Accurately determining the volume of logs and bark is crucial to estimating the total expected value recovery from a log. Knowing the correct size and volume of a log helps to determine which processing method, if any, should be used on a given log. However, applying volume estimation methods consistently can be difficult. Errors in log measurement and oddly shaped...

  4. House log drying rates in southeast Alaska for covered and uncovered softwood logs

    Treesearch

    David Nicholls; Allen Brackley

    2009-01-01

    Log moisture content has an important impact on many aspects of log home construction, including log processing, transportation costs, and dimensional stability in use. Air-drying times for house logs from freshly harvested trees can depend on numerous factors including initial moisture content, log diameter, bark condition, and environmental conditions during drying....

  5. Chemical logging of geothermal wells

    DOEpatents

    Allen, Charles A.; McAtee, Richard E.

    1981-01-01

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.

  6. Chemical logging of geothermal wells

    DOEpatents

    Allen, C.A.; McAtee, R.E.

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.

  7. Audit Log for Forensic Photography

    NASA Astrophysics Data System (ADS)

    Neville, Timothy; Sorell, Matthew

    We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.

  8. Latent log-linear models for handwritten digit classification.

    PubMed

    Deselaers, Thomas; Gass, Tobias; Heigold, Georg; Ney, Hermann

    2012-06-01

    We present latent log-linear models, an extension of log-linear models incorporating latent variables, and we propose two applications thereof: log-linear mixture models and image deformation-aware log-linear models. The resulting models are fully discriminative, can be trained efficiently, and the model complexity can be controlled. Log-linear mixture models offer additional flexibility within the log-linear modeling framework. Unlike previous approaches, the image deformation-aware model directly considers image deformations and allows for a discriminative training of the deformation parameters. Both are trained using alternating optimization. For certain variants, convergence to a stationary point is guaranteed and, in practice, even variants without this guarantee converge and find models that perform well. We tune the methods on the USPS data set and evaluate on the MNIST data set, demonstrating the generalization capabilities of our proposed models. Our models, although using significantly fewer parameters, are able to obtain competitive results with models proposed in the literature.
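
    The log-linear mixture idea can be written compactly. In schematic form (the notation below is assumed for illustration, not copied from the paper), the class posterior marginalizes a standard log-linear model over a latent component m:

        p(c \mid x) \;=\; \sum_{m=1}^{M} p(c, m \mid x)
                   \;=\; \sum_{m=1}^{M}
                   \frac{\exp\!\big(\lambda_{c,m}^{\top} f(x)\big)}
                        {\sum_{c'} \sum_{m'} \exp\!\big(\lambda_{c',m'}^{\top} f(x)\big)}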

  9. The Chemnitz LogAnalyzer: a tool for analyzing data from hypertext navigation research.

    PubMed

    Brunstein, Angela; Naumann, Anja; Krems, Josef F

    2005-05-01

    Computer-based studies usually produce log files as raw data. These data cannot be analyzed adequately with conventional statistical software. The Chemnitz LogAnalyzer provides tools for quick and comfortable visualization and analyses of hypertext navigation behavior by individual users and for aggregated data. In addition, it supports analogous analyses of questionnaire data and reanalysis with respect to several predefined orders of nodes of the same hypertext. As an illustration of how to use the Chemnitz LogAnalyzer, we give an account of one study on learning with hypertext. Participants either searched for specific details or read a hypertext document to familiarize themselves with its content. The tool helped identify navigation strategies affected by these two processing goals and provided comparisons, for example, of processing times and visited sites. Altogether, the Chemnitz LogAnalyzer fills the gap between log files as raw data of Web-based studies and conventional statistical software.

  10. Log sampling methods and software for stand and landscape analyses.

    Treesearch

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe methods for efficient, accurate sampling of logs at landscape and stand scales to estimate density, total length, cover, volume, and weight. Our methods focus on optimizing the sampling effort by choosing an appropriate sampling method and transect length for specific forest conditions and objectives. Sampling methods include the line-intersect method and...
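
    One widely used estimator behind line-intersect log sampling, included here only as general background rather than quoted from the report, gives log volume per unit area from the diameters of logs crossed by a transect of length L (all quantities in consistent units):

        % Van Wagner's line-intersect estimator of log volume per unit area,
        % with d_i the diameter of the i-th intersected log at the crossing point.
        V \;=\; \frac{\pi^{2}}{8L} \sum_{i} d_i^{2}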

  11. Application of work sampling technique to analyze logging operations.

    Treesearch

    Edwin S. Miyata; Helmuth M. Steinhilb; Sharon A. Winsauer

    1981-01-01

    Discusses the advantages and disadvantages of various time study methods for determining efficiency and productivity in logging. The work sampling method is compared with the continuous time-study method. Gives the feasibility, capabilities, and limitations of the work sampling method.

  12. A new interface linking the ODP Log and RIDGE Multibeam Databases

    NASA Astrophysics Data System (ADS)

    Reagan, M.; Haxby, W.; Broglia, C.

    2001-12-01

    Over the past few years, a major effort has been undertaken by ODP Logging Services to create an easily accessible, on-line database of the log data collected during ODP cruises. The database currently consists of data from Legs 101-197 which can be retrieved using any web browser via the ODP Logging Services web site (http://www.ldeo.columbia.edu/BRG/ODP/DATA). Concurrently, the RIDGE Multibeam Synthesis project at Lamont-Doherty Earth Observatory (LDEO) has been developing its own online database of multibeam data that can be accessed at http://coast.ldeo.columbia.edu. Recently the capabilities of these two databases have been combined using MapAPP, an interface developed by the RIDGE Multibeam Synthesis project and modified by ODP Logging Services for use with the log database. Both databases can be accessed with a simple menu selection. The interface allows for graphical searching and selection of sites in the regional context of the multibeam data using a java applet. It retains the easy download capabilities built into the log database, but also provides several new features including the ability to plot log curves "on the fly". This capability can be used to display logs from a single hole, or to compare logs from several holes, thus providing a regional view of the data. The integration of this new graphical interface with the extensive content of the ODP Log Database provides users with a powerful tool for viewing and manipulating data. Future enhancements are anticipated to provide even greater capabilities and ease of use.

  13. Log analysis to understand medical professionals' image searching behaviour.

    PubMed

    Tsikrika, Theodora; Müller, Henning; Kahn, Charles E

    2012-01-01

    This paper reports on the analysis of the query logs of a visual medical information retrieval system that provides access to radiology resources. Our analysis shows that, despite sharing similarities with general Web search and also with biomedical text search, query formulation and query modification when searching for visual biomedical information have unique characteristics that need to be taken into account in order to enhance the effectiveness of the search support offered by such systems. Typical information needs of medical professionals searching radiology resources are also identified with the goal to create realistic search tasks for a medical image retrieval evaluation benchmark.

  14. Log-Linear Models for Gene Association

    PubMed Central

    Hu, Jianhua; Joshi, Adarsh; Johnson, Valen E.

    2009-01-01

    We describe a class of log-linear models for the detection of interactions in high-dimensional genomic data. This class of models leads to a Bayesian model selection algorithm that can be applied to data that have been reduced to contingency tables using ranks of observations within subjects, and discretization of these ranks within gene/network components. Many normalization issues associated with the analysis of genomic data are thereby avoided. A prior density based on Ewens’ sampling distribution is used to restrict the number of interacting components assigned high posterior probability, and the calculation of posterior model probabilities is expedited by approximations based on the likelihood ratio statistic. Simulation studies are used to evaluate the efficiency of the resulting algorithm for known interaction structures. Finally, the algorithm is validated in a microarray study for which it was possible to obtain biological confirmation of detected interactions. PMID:19655032

  15. Design and Implementation of a Robotic Surgery Training Experience Logging System.

    PubMed

    Baldea, Kristin G; Thorwarth, Ryan; Bajic, Petar; Quek, Marcus L; Gupta, Gopal N

    2017-06-28

    Residents currently log robotic cases in the ACGME system as a "surgeon" if they performed any critical step of the procedure on the surgeon console. There is no standardization as to which steps or how much of the procedure should be performed by the resident. It was our objective to establish a tool for logging the true operative experience in robotic surgery to aid in assessing surgical competency as well as curriculum development. We propose a tool to log surgical skill progression, experience, and feedback for robotic cases. A web-based robotic experience logging system (RoboLog) was developed with procedures deconstructed to their major steps. Trainees may request the supervising attending review their performance. RoboLog provides automated summary reports to both residents and attendings. RoboLog was successfully developed and piloted with a total of 310 cases logged over 1 year. A reporting structure was developed where residents could view statistics on several data points such as step-specific involvement and feedback from attending staff. Detailed data on resident experience were obtained. For instance, 82% of the 151 robotic prostatectomies were logged as "surgeon", yet urethral transection had <35% resident involvement. Our current system for logging robotic experience is lacking given the fact that resident involvement on the surgical console is variable. Widespread usage of a logging system with more insight into step-specific involvement is needed. RoboLog fills this need and can be used to track robotic training progress and aid in development of a standardized curriculum. Copyright © 2017. Published by Elsevier Inc.
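
    A step-specific logging record of the kind RoboLog summarizes could be modeled minimally as follows. The field names, the summary calculation, and the Python representation are assumptions for illustration, not the RoboLog schema.

        from dataclasses import dataclass

        @dataclass
        class StepLog:
            """One deconstructed step of a robotic case and who performed it."""
            case_id: str
            procedure: str
            step: str
            resident_performed: bool

        def step_involvement(logs, procedure, step):
            """Percentage of logged cases in which the resident performed a step."""
            relevant = [l for l in logs if l.procedure == procedure and l.step == step]
            if not relevant:
                return 0.0
            return 100.0 * sum(l.resident_performed for l in relevant) / len(relevant)

        logs = [
            StepLog("c1", "prostatectomy", "urethral transection", False),
            StepLog("c2", "prostatectomy", "urethral transection", True),
        ]
        print(step_involvement(logs, "prostatectomy", "urethral transection"))  # 50.0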

  16. Avian responses to selective logging shaped by species traits and logging practices.

    PubMed

    Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S; Koh, Lian Pin

    2015-06-07

    Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decrease below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging.

  17. Avian responses to selective logging shaped by species traits and logging practices

    PubMed Central

    Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S.; Koh, Lian Pin

    2015-01-01

    Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decrease below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging. PMID:25994673

  18. Tucker Wireline Open Hole Wireline Logging

    SciTech Connect

    Milliken, M.

    2002-05-23

    The Tucker Wireline unit ran a suite of open hole logs right behind the RMOTC logging contractor for comparison purposes. The tools included Dual Laterolog, Phased Induction, BHC Sonic, and Density-Porosity.

  19. 29 CFR 1910.266 - Logging operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Buck. To cut a felled tree into logs. Butt. The bottom of the felled part of a tree. Cable yarding. The... to prevent the root wad, butt or logs from striking an employee. These precautions include, but are...

  20. Geological well log analysis. Third ed

    SciTech Connect

    Pirson, S.J.

    1983-01-01

    Until recently, well logs have mainly been used for correlation, structural mapping, and quantitative evaluation of hydrocarbon bearing formations. This third edition of Geologic Well Log Analysis, however, describes how well logs can be used for geological studies and mineral exploration. This is done by analyzing well logs for numerous parameters and indices of significant mineral accumulation, primarily in sediments. Contents are: SP and Eh curves as redoxomorphic logs; sedimentological studies by log curve shapes; exploration for stratigraphic traps; continuous dipmeter as a structural tool; continuous dipmeter as a sedimentation tool; Paleo-facies logging and mapping; hydrogeology 1--hydrodynamics of compaction; hydrogeology 2--geostatic equilibrium; and hydrogeology 3--hydrodynamics of infiltration. Appendixes cover: Computer program for calculating the dip magnitude, azimuth, and the degree and orientation of the resistivity anisotropy; a lithology computer program for calculating the curvature of a structure; and basic log analysis package for HP-41CV programmable calculator.

  1. Advanced dendritic web growth development and development of single-crystal silicon dendritic ribbon and high-efficiency solar cell program

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.

    1986-01-01

    Efforts to demonstrate that the dendritic web technology is ready for commercial use by the end of 1986 continue. A commercial readiness goal involves improvements to crystal growth furnace throughput to demonstrate an area growth rate of greater than 15 sq cm/min while simultaneously growing 10 meters or more of ribbon under conditions of continuous melt replenishment. Continuous means that the silicon melt is being replenished at the same rate that it is being consumed by ribbon growth so that the melt level remains constant. Efforts continue on the computer thermal modeling required to define high speed, low stress, continuous growth configurations; on the study of convective effects in the molten silicon and growth furnace cover gas; on furnace component modifications; on web quality assessments; and on experimental growth activities.

  2. Data Mining of Network Logs

    NASA Technical Reports Server (NTRS)

    Collazo, Carlimar

    2011-01-01

    The statement of purpose is to analyze network monitoring logs to support the computer incident response team. Specifically, gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and provide a way to break down a URL based on protocol, host name, domain name, path, and other attributes. Finally, provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedures used for analysis and data reduction will be a computer program that analyzes the URL and identifies advertisement links among the actual content links.
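
    Breaking a URL into protocol, host, path, and other attributes is well supported by Python's standard library. The sketch below shows the kind of decomposition described; the ad-detection heuristic and the domain list are placeholder assumptions, not the program used in the project.

        from urllib.parse import urlparse

        def breakdown(url):
            """Decompose a URL into protocol, host, path, and query attributes."""
            parts = urlparse(url)
            return {
                "protocol": parts.scheme,
                "host": parts.netloc,
                "path": parts.path,
                "query": parts.query,
            }

        def looks_like_ad(url, ad_domains=("doubleclick.net", "adserver.example")):
            """Placeholder heuristic: flag URLs served from known ad domains."""
            return any(domain in urlparse(url).netloc for domain in ad_domains)

        print(breakdown("https://ad.doubleclick.net/banners/x?id=42"))
        print(looks_like_ad("https://ad.doubleclick.net/banners/x?id=42"))  # True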

  3. Intelligent web image retrieval system

    NASA Astrophysics Data System (ADS)

    Hong, Sungyong; Lee, Chungwoo; Nah, Yunmook

    2001-07-01

    Recently, the web sites such as e-business sites and shopping mall sites deal with lots of image information. To find a specific image from these image sources, we usually use web search engines or image database engines that rely on keyword-only retrievals or color-based retrievals with limited search capabilities. This paper presents an intelligent web image retrieval system. We propose the system architecture, the texture and color based image classification and indexing techniques, and representation schemes of user usage patterns. The query can be given by providing keywords, by selecting one or more sample texture patterns, by assigning color values within positional color blocks, or by combining some or all of these factors. The system keeps track of users' preferences by generating user query logs and automatically adds more search information to subsequent user queries. To show the usefulness of the proposed system, some experimental results showing recall and precision are also explained.

  4. Using Web Metric Software to Drive: Mobile Website Development

    ERIC Educational Resources Information Center

    Tidal, Junior

    2011-01-01

    Many libraries have developed mobile versions of their websites. In order to understand their users, web developers have conducted both usability tests and focus groups, yet analytical software and web server logs can also be used to better understand users. Using data collected from these tools, the Ursula C. Schwerin Library has made informed…

  5. Using Advanced Search Operators on Web Search Engines.

    ERIC Educational Resources Information Center

    Jansen, Bernard J.

    Studies show that the majority of Web searchers enter extremely simple queries, so a reasonable system design approach would be to build search engines to compensate for this user characteristic. One hundred representative queries were selected from the transaction log of a major Web search service. These 100 queries were then modified using the…

  6. Using Advanced Search Operators on Web Search Engines.

    ERIC Educational Resources Information Center

    Jansen, Bernard J.

    Studies show that the majority of Web searchers enter extremely simple queries, so a reasonable system design approach would be to build search engines to compensate for this user characteristic. One hundred representative queries were selected from the transaction log of a major Web search service. These 100 queries were then modified using the…

  7. Using Web Metric Software to Drive: Mobile Website Development

    ERIC Educational Resources Information Center

    Tidal, Junior

    2011-01-01

    Many libraries have developed mobile versions of their websites. In order to understand their users, web developers have conducted both usability tests and focus groups, yet analytical software and web server logs can also be used to better understand users. Using data collected from these tools, the Ursula C. Schwerin Library has made informed…

  8. Global Connections: Web Conferencing Tools Help Educators Collaborate Anytime, Anywhere

    ERIC Educational Resources Information Center

    Forrester, Dave

    2009-01-01

    Web conferencing tools help educators from around the world collaborate in real time. Teachers, school counselors, and administrators need only to put on their headsets, check the time zone, and log on to meet and learn from educators across the globe. In this article, the author discusses how educators can use Web conferencing at their schools.…

  9. Selective logging and its relation to deforestation

    Treesearch

    Gregory P. Asner; Michael Keller; Marco Lentini; Frank Merry; Souza Jr. Carlos

    2009-01-01

    Selective logging is a major contributor to the social, economic, and ecological dynamics of Brazilian Amazonia. Logging activities have expanded from low-volume floodplain harvests in past centuries to high-volume operations today that take about 25 million m³ of wood from the forest each year. The most common high-impact conventional and often illegal logging...

  10. Selective logging in the Brazilian Amazon.

    Treesearch

    G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva

    2005-01-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...

  11. Pacific Rim log trade: determinants and trends.

    Treesearch

    Donald F. Flora; Andrea L. Anderson; Wendy J. McGinnls

    1991-01-01

    Pacific Rim trade in softwood logs amounts to about $3 billion annually, of which the U.S. share is about $2 billion. Log exporting is a significant part of the forest economy in the Pacific Northwest. The 10 major Pacific Rim log-trading client and competitor countries differ widely in their roles in trade and in their policies affecting the industry.

  12. How much scarification from summer logging?

    Treesearch

    David A. Marquis; John C. Bjorkbom

    1960-01-01

    Scarification of the soil creates seedbeds that are favorable for the establishment of both paper birch and yellow birch. Logging in the summer often has been recommended as a method of obtaining these seedbeds. However, our observations on experimental logging jobs have shown that logging alone does not provide scarification over enough of the area to assure...

  13. Hardwood log grading scale stick improved

    Treesearch

    M. D. Ostrander; G. H. Englerth

    1953-01-01

    In February 1952 the Northeastern Forest Experiment Station described (Research Note 13) a new log-grading scale stick developed by the Station for use as a visual aid in grading hardwood factory logs. It was based on the U.S. Forest Products Laboratory's log-grade specifications.

  14. Linking log quality with product performance

    Treesearch

    D. W. Green; Robert Ross

    1997-01-01

    In the United States, log grading procedures use visual assessment of defects, in relation to the log scaling diameter, to estimate the yield of lumber that may be expected from the log. This procedure was satisfactory when structural grades were based only on defect size and location. In recent years, however, structural products have increasingly been graded using a...

  15. Challenges in converting among log scaling methods.

    Treesearch

    Henry. Spelter

    2003-01-01

    The traditional method of measuring log volume in North America is the board foot log scale, which uses simple assumptions about how much of a log's volume is recoverable. This underestimates the true recovery potential and leads to difficulties in comparing volumes measured with the traditional board foot system and those measured with the cubic scaling systems...

  16. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  17. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  18. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  19. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  20. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  1. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 7 2012-07-01 2012-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  2. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  3. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 5 2013-10-01 2013-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  4. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  5. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  6. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  7. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  8. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  9. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  10. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  11. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 7 2013-07-01 2013-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  12. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 5 2014-10-01 2014-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  13. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 7 2011-07-01 2011-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  14. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 7 2014-07-01 2014-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  15. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  16. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

    There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system has to be developed before aerial logging will become effective and accepted in the logging industry. This paper presents such a system designed on simple principles with realistic cost and ecological benefits.

  17. Sensor web

    NASA Technical Reports Server (NTRS)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responding, and then only listen during time slots corresponding to those pods which respond.
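
    The two-phase idea can be illustrated with a toy model (this is not the patented method; the PodClock class, the jitter model, and the exchange counts are invented): a single cheap exchange stands in for the coarse synchronization, and an averaged series of exchanges stands in for the fine synchronization.

      import random

      class PodClock:
          """Toy pod clock with an unknown offset from the master clock."""
          def __init__(self, offset):
              self.offset = offset        # true offset, unknown to the pod
              self.correction = 0.0       # accumulated synchronization correction

          def read(self, master_time):
              return master_time + self.offset + self.correction

      def sync(pod, master_time, exchanges):
          """Estimate the pod/master offset from noisy exchanges and correct it."""
          estimates = [pod.read(master_time) - master_time + random.gauss(0, 0.01)
                       for _ in range(exchanges)]
          pod.correction -= sum(estimates) / len(estimates)

      pod = PodClock(offset=0.37)
      sync(pod, master_time=100.0, exchanges=1)     # coarse: one cheap exchange
      sync(pod, master_time=200.0, exchanges=20)    # fine: averaged exchanges
      print(round(pod.read(300.0) - 300.0, 3))      # residual offset near zero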

  18. Use of a secure Internet Web site for collaborative medical research.

    PubMed

    Marshall, W W; Haley, R W

    2000-10-11

    Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.

  19. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  20. Porosity Log Prediction Using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Dwi Saputro, Oki; Lazuardi Maulana, Zulfikar; Dzar Eljabbar Latief, Fourier

    2016-08-01

    Well logging is important in oil and gas exploration. Many physical parameters of a reservoir are derived from well logging measurements. Geophysicists often use well logging to obtain reservoir properties such as porosity, water saturation, and permeability. Most of the time, the measurement of these reservoir properties is considered expensive. One method to substitute for the measurement is to conduct a prediction using an artificial neural network. In this paper, an artificial neural network is used to predict porosity log data from other log data. Three wells from the ‘yy’ field are used to conduct the prediction experiment. The log data are sonic, gamma ray, and porosity logs. One of the three wells is used as training data for the artificial neural network, which employs the Levenberg-Marquardt backpropagation algorithm. Through several trials, we find that the optimal training input is sonic log data and gamma ray log data with 10 hidden layers. The prediction result in well 1 has a correlation of 0.92 and a mean squared error of 5.67 x 10^-4. The trained network is then applied to the other wells' data. The results show that the correlations in well 2 and well 3 are 0.872 and 0.9077, respectively. The mean squared errors in well 2 and well 3 are 11 x 10^-4 and 9.539 x 10^-4. From these results we can conclude that the sonic log and gamma ray log could be a good combination for predicting porosity with a neural network.
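
    A rough sketch of this porosity-from-logs workflow under stated assumptions: the paper trains with Levenberg-Marquardt backpropagation, which scikit-learn does not provide, so this sketch substitutes an L-BFGS solver; the CSV file names and the DT/GR/PHI column names are hypothetical.

      import numpy as np
      import pandas as pd
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler

      # Hypothetical well-log files with sonic (DT), gamma ray (GR), and porosity (PHI).
      train = pd.read_csv("well1_logs.csv")    # training well
      test = pd.read_csv("well2_logs.csv")     # blind well

      scaler = StandardScaler().fit(train[["DT", "GR"]])
      X_train = scaler.transform(train[["DT", "GR"]])
      X_test = scaler.transform(test[["DT", "GR"]])

      # Single hidden layer of 10 units as a stand-in for the paper's configuration.
      net = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                         max_iter=2000, random_state=0)
      net.fit(X_train, train["PHI"])

      pred = net.predict(X_test)
      corr = np.corrcoef(pred, test["PHI"])[0, 1]
      mse = np.mean((pred - test["PHI"]) ** 2)
      print(f"correlation={corr:.3f}  mse={mse:.3e}")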

  1. Logs Wanted - Dead or Alive

    NASA Astrophysics Data System (ADS)

    Schuchardt, A.; Morche, D.

    2015-12-01

    Rivers cover only a small part of the Earth's surface, yet they transfer sediment in globally significant quantities. In mountainous regions, the majority of the total channel length occurs in headwater streams. Those mountain channels are influenced in terms of sediment connectivity by processes on the slopes. For example, in such a sediment routing system, sediment originating from debris flows on the slopes is delivered along sediment pathways to the channel system and can be transported further downstream as solid load. Interruption of instream coarse sediment connectivity is closely related to the existence of channel-blocking barriers, which can also be formed by biota. By storing sediment, large wood (LW) log jams disrupt in-channel sediment connectivity. We present a study design in order to decipher the short- to long-term (c. 10^-2 to 10^2 years) sediment (dis)connectivity effects of large wood. The study areas are two basins in mountain ranges in Germany and Austria. In Austria the drainage area of the river Fugnitz was chosen, which is located in the National Park Thayatal. The other drainage area, of the river Sieber in Saxony-Anhalt, Germany, is located in the Harz National Park. Since studies on LW and its geomorphological effects in Central European rivers are still rare, the main goals of the project are: • to identify important triggers for LW transport from slopes into the channels • to examine the spatial distribution and characterization of LW in main and slope channels by mapping and dGPS measurements • to determine the effects of LW on channel hydraulic parameters (e.g. slope, width, grain size composition, roughness) by field measurements of channel long profiles and cross sections with dGPS and Wolman particle counts • to quantify the direct effects of LW on discharge and bed load transport by measuring flow velocity with an Ott-Nautilus current meter and to measure bed load up- and downstream of log jams using a portable Helley

  2. Logs key to solving water production problems

    SciTech Connect

    Wyatt, D.F. Jr.; Crook, R.J.

    1995-11-20

    Water source identification is the first and most important step in controlling unwanted water production that can severely limit the productive life of a well and, thereby, decrease hydrocarbon recovery. Water-control treatments often fail because the source of the water problem is not identified, the wrong treatment is performed, or the correct treatment is performed incorrectly. Table 1 lists typical problems, means of identification and evaluation, and chemical treatments available for correcting the problem. Well logs can help diagnose downhole situations that can lead to unwanted water production, and the effectiveness of water-control treatments can be evaluated with cased and open hole logs. The paper discusses cement bond logs and the pulse echo tool for cement evaluation. Casing evaluation is carried out by mechanical caliper logs and electromagnetic tools. Reservoir monitoring with pulsed neutron logs and pulsed neutron spectrometry is discussed. Also discussed are production logging, radioactive tracer logging, and well tests.

  3. Leak checker data logging system

    DOEpatents

    Gannon, J.C.; Payne, J.J.

    1996-09-03

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between testing to permit repetitive testing of vacuum integrity in reduced time. 18 figs.

  4. Leak checker data logging system

    DOEpatents

    Gannon, Jeffrey C.; Payne, John J.

    1996-01-01

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between testing to permit repetitive testing of vacuum integrity in reduced time.

  5. Leak checker data logging system

    SciTech Connect

    Payne, J.J.; Gannon, J.C.

    1994-12-31

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between testing to permit repetitive testing of vacuum integrity in reduced time.

  6. Logged In and Zoned Out.

    PubMed

    Ravizza, Susan M; Uitvlugt, Mitchell G; Fenn, Kimberly M

    2017-02-01

    Laptop computers are widely prevalent in university classrooms. Although laptops are a valuable tool, they offer access to a distracting temptation: the Internet. In the study reported here, we assessed the relationship between classroom performance and actual Internet usage for academic and nonacademic purposes. Students who were enrolled in an introductory psychology course logged into a proxy server that monitored their online activity during class. Past research relied on self-report, but the current methodology objectively measured time, frequency, and browsing history of participants' Internet usage. In addition, we assessed whether intelligence, motivation, and interest in course material could account for the relationship between Internet use and performance. Our results showed that nonacademic Internet use was common among students who brought laptops to class and was inversely related to class performance. This relationship was upheld after we accounted for motivation, interest, and intelligence. Class-related Internet use was not associated with a benefit to classroom performance.

  7. "Blogs" Catching on as Tool for Instruction: Teachers Use Interactive Web Pages to Hone Writing Skills

    ERIC Educational Resources Information Center

    Borja, Rhea R.

    2005-01-01

    A growing number of K-12 educators are using Web logs, or "blogs" for short, to foster better writing, reading, communication, and other academic skills. Such Web sites, often open to the public, double as chronological journals and can include Web links and photographs as well as audio and video elements. Opinions on the use of blogs are shared…

  8. A Quantitative Cost Effectiveness Model for Web-Supported Academic Instruction

    ERIC Educational Resources Information Center

    Cohen, Anat; Nachmias, Rafi

    2006-01-01

    This paper describes a quantitative cost effectiveness model for Web-supported academic instruction. The model was designed for Web-supported instruction (rather than distance learning only) characterizing most of the traditional higher education institutions. It is based on empirical data (Web logs) of students' and instructors' usage…

  10. Applying WebMining on KM system

    NASA Astrophysics Data System (ADS)

    Shimazu, Keiko; Ozaki, Tomonobu; Furukawa, Koichi

    KM (Knowledge Management) systems have recently been adopted within the realm of enterprise management. On the other hand, data mining technology is widely acknowledged within Information systems' R&D Divisions. In particular, acquisition of meaningful information from Web usage data has become one of the most exciting areas. In this paper, we employ a Web-based KM system and propose a framework for applying Web Usage Mining technology to KM data. As it turns out, task duration varies according to different user operations such as referencing a table-of-contents page, downloading a target file, and writing to a bulletin board. This in turn makes it possible to easily predict the purpose of the user's task. By taking these observations into account, we segmented access log data manually. These results were compared with results obtained by applying the constant interval method. Next, we obtained a segmentation rule of Web access logs by applying a machine-learning algorithm to manually segmented access logs as training data. Then, the newly obtained segmentation rule was compared with other known methods including the time interval method by evaluating their segmentation results in terms of recall and precision rates, and it was shown that our rule attained the best results in both measures. Furthermore, the segmented data were fed to an association rule miner and the obtained association rules were utilized to modify the Web structure.
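
    A minimal sketch of the constant time-interval baseline that the mined segmentation rule is compared against; the log format and the 30-minute idle threshold are assumptions, not the authors' learned rule.

      from datetime import datetime, timedelta

      # Hypothetical access-log records: (user, timestamp, url).
      records = [
          ("u1", "2024-01-01 09:00:05", "/kb/contents"),
          ("u1", "2024-01-01 09:03:10", "/kb/report.pdf"),
          ("u1", "2024-01-01 10:15:00", "/bbs/post"),
      ]

      def segment_by_interval(rows, gap=timedelta(minutes=30)):
          """Start a new session whenever the same user is idle longer than `gap`."""
          by_user, sessions = {}, []
          for user, ts, url in rows:
              t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
              by_user.setdefault(user, []).append((t, url))
          for user, hits in by_user.items():
              hits.sort()
              current = [hits[0]]
              for prev, nxt in zip(hits, hits[1:]):
                  if nxt[0] - prev[0] > gap:
                      sessions.append((user, current))
                      current = []
                  current.append(nxt)
              sessions.append((user, current))
          return sessions

      print(len(segment_by_interval(records)))   # -> 2 sessions for user u1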

  11. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hill, F. E.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1980-01-01

    A barrier crucible design which consistently maintains melt stability over long periods of time was successfully tested and used in long growth runs. The pellet feeder for melt replenishment was operated continuously for growth runs of up to 17 hours. The liquid level sensor comprising a laser/sensor system was operated, performed well, and meets the requirements for maintaining liquid level height during growth and melt replenishment. An automated feedback loop connecting the feed mechanism and the liquid level sensing system was designed and constructed and operated successfully for 3.5 hours demonstrating the feasibility of semi-automated dendritic web growth. The sensitivity of the cost of sheet, to variations in capital equipment cost and recycling dendrites was calculated and it was shown that these factors have relatively little impact on sheet cost. Dendrites from web which had gone all the way through the solar cell fabrication process, when melted and grown into web, produce crystals which show no degradation in cell efficiency. Material quality remains high and cells made from web grown at the start, during, and the end of a run from a replenished melt show comparable efficiencies.

  12. Component Architectures and Web-Based Learning Environments

    ERIC Educational Resources Information Center

    Ferdig, Richard E.; Mishra, Punya; Zhao, Yong

    2004-01-01

    The Web has caught the attention of many educators as an efficient communication medium and content delivery system. But we feel there is another aspect of the Web that has not been given the attention it deserves. We call this aspect of the Web its "component architecture." Briefly it means that on the Web one can develop very complex…

  13. Web multimedia information retrieval using improved Bayesian algorithm.

    PubMed

    Yu, Yi-Jun; Chen, Chun; Yu, Yi-Min; Lin, Huai-Zhong

    2003-01-01

    The main thrust of this paper is the application of a novel data mining approach to the log of users' feedback to improve web multimedia information retrieval performance. A user space model was constructed based on data mining, and then integrated into the original information space model to improve the accuracy of the new information space model. It can remove clutter and irrelevant text information and help to eliminate the mismatch between the page author's expression and the user's understanding and expectation. The user space model was also utilized to discover the relationship between high-level and low-level features for assigning weights. The authors proposed an improved Bayesian algorithm for data mining. Experiments proved that the authors' proposed algorithm was efficient.

  14. Correlating Log Messages for System Diagnostics

    SciTech Connect

    Gunasekaran, Raghul; Dillow, David A; Shipman, Galen M; Maxwell, Don E; Hill, Jason J; Park, Byung H; Geist, Al

    2010-01-01

    In large-scale computing systems, the sheer volume of log data generated presents daunting challenges for debugging and monitoring of these systems. The Oak Ridge Leadership Computing Facility's premier simulation platform, the Cray XT5 known as Jaguar, can generate a few hundred thousand log entries in less than a minute for many system-level events. Determining the root cause of such system events requires analysis and interpretation of a large number of log messages. Most often, the log messages are best understood when they are interpreted collectively rather than individually. In this paper, we present our approach to interpreting log messages by identifying their commonalities and grouping them into clusters. Given a set of log messages within a time interval, we group the messages based on source, target, and/or error type, and correlate the messages with hardware and application information. We monitor the Lustre log messages in the XT5 console log and show that such grouping of log messages assists in detecting the source of system events. By intelligent grouping and correlation of events in the log, we are able to provide system administrators with meaningful information in a concise format for root cause analysis.
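
    A small sketch of the grouping idea under assumptions: each console-log line is reduced to a (time window, source, error type) key and the matching messages are collected per key. The line format, regular expression, and sample messages are invented and do not reflect the actual Jaguar/Lustre log syntax.

      import re
      from collections import defaultdict

      # Invented console-log lines: "<epoch> <source> <error type>: detail".
      lines = [
          "1000 nid00012 LustreError: timeout on ost7",
          "1003 nid00012 LustreError: timeout on ost7",
          "1010 nid00040 LustreError: timeout on ost7",
          "1400 nid00012 MCE: machine check",
      ]

      PATTERN = re.compile(r"^(\d+)\s+(\S+)\s+([A-Za-z]+):\s*(.*)$")

      def group(lines, window=300):
          """Cluster messages sharing (source, error type) inside a time window."""
          clusters = defaultdict(list)
          for line in lines:
              m = PATTERN.match(line)
              if not m:
                  continue
              t, source, etype, detail = m.groups()
              clusters[(int(t) // window, source, etype)].append(detail)
          return clusters

      for key, details in group(lines).items():
          print(key, len(details))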

  15. Understanding the Usage of Content in a Mental Health Intervention for Depression: An Analysis of Log Data

    PubMed Central

    2014-01-01

    Background Web-based interventions for the early treatment of depressive symptoms can be considered effective in reducing mental complaints. However, there is a limited understanding of which elements in an intervention contribute to effectiveness. For efficiency and effectiveness of interventions, insight is needed into the use of content and persuasive features. Objective The aims of this study were (1) to illustrate how log data can be used to understand the uptake of the content of a Web-based intervention that is based on the acceptance and commitment therapy (ACT) and (2) to discover how log data can be of value for improving the incorporation of content in Web-based interventions. Methods Data from 206 participants (out of the 239) who started the first nine lessons of the Web-based intervention, Living to the Full, were used for a secondary analysis of a subset of the log data of the parent study about adherence to the intervention. The log files used in this study were per lesson: login, start mindfulness, download mindfulness, view success story, view feedback message, start multimedia, turn on text-message coach, turn off text-message coach, and view text message. Differences in usage between lessons were explored with repeated measures ANOVAs (analysis of variance). Differences between groups were explored with one-way ANOVAs. To explore the possible predictive value of the login per lesson quartiles on the outcome measures, four linear regressions were used with login quartiles as predictor and with the outcome measures (Center for Epidemiologic Studies—Depression [CES-D] and the Hospital Anxiety and Depression Scale—Anxiety [HADS-A] on post-intervention and follow-up) as dependent variables. Results A significant decrease in logins and in the use of content and persuasive features over time was observed. The usage of features varied significantly during the treatment process. The usage of persuasive features increased during the third part of the

  16. Understanding the usage of content in a mental health intervention for depression: an analysis of log data.

    PubMed

    Van Gemert-Pijnen, Julia Ewc; Kelders, Saskia M; Bohlmeijer, Ernst T

    2014-01-31

    Web-based interventions for the early treatment of depressive symptoms can be considered effective in reducing mental complaints. However, there is a limited understanding of which elements in an intervention contribute to effectiveness. For efficiency and effectiveness of interventions, insight is needed into the use of content and persuasive features. The aims of this study were (1) to illustrate how log data can be used to understand the uptake of the content of a Web-based intervention that is based on the acceptance and commitment therapy (ACT) and (2) to discover how log data can be of value for improving the incorporation of content in Web-based interventions. Data from 206 participants (out of the 239) who started the first nine lessons of the Web-based intervention, Living to the Full, were used for a secondary analysis of a subset of the log data of the parent study about adherence to the intervention. The log files used in this study were per lesson: login, start mindfulness, download mindfulness, view success story, view feedback message, start multimedia, turn on text-message coach, turn off text-message coach, and view text message. Differences in usage between lessons were explored with repeated measures ANOVAs (analysis of variance). Differences between groups were explored with one-way ANOVAs. To explore the possible predictive value of the login per lesson quartiles on the outcome measures, four linear regressions were used with login quartiles as predictor and with the outcome measures (Center for Epidemiologic Studies-Depression [CES-D] and the Hospital Anxiety and Depression Scale-Anxiety [HADS-A] on post-intervention and follow-up) as dependent variables. A significant decrease in logins and in the use of content and persuasive features over time was observed. The usage of features varied significantly during the treatment process. The usage of persuasive features increased during the third part of the ACT (commitment to value-based living
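
    As an illustration of the quartile-predictor regressions described above (a sketch with invented numbers, not the study data or code), a simple linear regression of a post-intervention score on login quartile can be run with scipy:

      import numpy as np
      from scipy import stats

      # Invented per-participant values: login quartile (1-4) and post-intervention CES-D.
      login_quartile = np.array([1, 1, 2, 2, 2, 3, 3, 4, 4, 4])
      ces_d_post = np.array([24, 22, 21, 20, 19, 18, 17, 16, 15, 14])

      result = stats.linregress(login_quartile, ces_d_post)
      print(f"slope={result.slope:.2f}  p={result.pvalue:.4f}  r^2={result.rvalue**2:.2f}")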

  17. Understanding PubMed® user search behavior through log analysis

    PubMed Central

    Islamaj Dogan, Rezarta; Murray, G. Craig; Névéol, Aurélie; Lu, Zhiyong

    2009-01-01

    This article reports on a detailed investigation of PubMed users’ needs and behavior as a step toward improving biomedical information retrieval. PubMed provides a free service to researchers, with access to more than 19 million citations for biomedical articles from MEDLINE and life science journals. It is accessed by millions of users each day. Efficient search tools are crucial for biomedical researchers to keep abreast of the biomedical literature relating to their own research. This study provides insight into PubMed users’ needs and their behavior. This investigation was conducted through the analysis of one month of log data, consisting of more than 23 million user sessions and more than 58 million user queries. Multiple aspects of users’ interactions with PubMed are characterized in detail with evidence from these logs. Despite having many features in common with general Web searches, biomedical information searches have unique characteristics that are made evident in this study. PubMed users are more persistent in seeking information, and they reformulate queries often. The three most frequent types of search are search by author name, search by gene/protein, and search by disease. Use of abbreviations in queries is very frequent. Factors such as result set size influence users’ decisions. Analysis of characteristics such as these plays a critical role in identifying users’ information needs and their search habits. In turn, such an analysis also provides useful insight for improving biomedical information retrieval. Database URL: http://www.ncbi.nlm.nih.gov/PubMed PMID:20157491
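
    A small sketch of the kind of session-level summarization such a log analysis involves (query frequencies and a crude reformulation rate); the (session, query) layout is an invented simplification of PubMed transaction-log data:

      from collections import Counter, defaultdict

      # Invented (session_id, query) transaction-log rows.
      rows = [
          ("s1", "brca1"), ("s1", "brca1 breast cancer"), ("s2", "smith j"),
          ("s3", "p53"), ("s3", "p53 apoptosis"), ("s3", "tp53 apoptosis"),
      ]

      sessions = defaultdict(list)
      for sid, query in rows:
          sessions[sid].append(query)

      query_freq = Counter(q for _, q in rows)
      reformulated = sum(1 for qs in sessions.values() if len(set(qs)) > 1)

      print("most common queries:", query_freq.most_common(2))
      print("sessions with reformulation:", reformulated, "of", len(sessions))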

  18. Efficient Tracking, Logging, and Blocking of Accesses to Digital Objects

    DTIC Science & Technology

    2015-09-01

    that viruses and malware may expose data to the outside world; 3. Endpoint Data Loss Prevention (DLP) technologies and techniques to prevent...309, 2004. [2] A. Dinaburg, P. Royal, M. Sharif, and W. Lee. Ether: malware analysis via hardware virtualiza- tion extensions. In Proceedings of... malware etc. DLP system, on the other hand, is aimed at protecting sensitive data from leaking out an organization. The protected content is mainly of

  19. Sleep Logs: Measurement of Individual and Operational Efficiency

    DTIC Science & Technology

    1991-05-01

    during a given mission scenario. There are many tools for measuring amounts and patterns of sleep. Portable brain wave recording systems, for an...Hullaney, and Wybarney, 1982). It can be used easily in the field to separate periods of rest/sleep (minimal activities , presumably asleep’ from...physically active periods. However, such actigraphic units are relatively expensive. The most economical and preferred method to study sleep, especially ill

  20. A Distributed Network Logging Topology

    DTIC Science & Technology

    2010-03-01

    while still maintaining a searchable and efficient storage system. v Acknowledgments I would like to thank my advisor, Lt Col Brett...implemented: unreliable UDP transfer, reliable TCP transfer, or potentially an encrypted SSL transfer. Each method would have different levels of traffic...infrastructure that resolves the centralized server bottleneck and data loss problem while still maintaining a searchable and efficient storage system. 15

  1. LOG PERIODIC DIPOLE ARRAY WITH PARASITIC ELEMENTS

    DTIC Science & Technology

    The design and measured characteristics of dipole and monopole versions of a log periodic array with parasitic elements are discussed. In a dipole...for the elements to obtain log periodic performance of the antenna. This design with parasitic elements lends itself to a monopole version of the...antenna which has a simplified feeding configuration. The result is a log periodic antenna design that can be used from high frequencies through microwave frequencies.

  2. Surgical e-learning: validation of multimedia web-based lectures.

    PubMed

    Ridgway, Paul F; Sheikh, Athar; Sweeney, Karl J; Evoy, Denis; McDermott, Enda; Felle, Patrick; Hill, Arnold D; O'Higgins, Niall J

    2007-02-01

    Distance learning has been advocated increasingly as a modern, efficient method of teaching surgery. Efficiency of knowledge transfer and validity of web-based courses have not been subjected to rigorous study to date. An entirely web-based surgical 5-week lecture course was designed. Fifty per cent of the lectures were prepared as HTML slides with voice-over while the other half were presented in text-only form. Only the written material presented was examined. The lectures were presented via an educational web module. The lecture series was balanced specifically to reduce the pre-existent knowledge bias. Web usage was estimated utilising surrogates, including the number of hits as well as log-on timing. Face validity was assessed by a standardised questionnaire. Eighty-eight students took part in the lecture series and subsequent examination and questionnaire. Median multiple choice questionnaire (MCQ) marks were significantly higher in the aural lecture-derived stems versus the non-aural (P = 0.012, Mann-Whitney U-test). There was widespread approval of web-based learning as an adjunct to conventional teaching. Usage rates were augmented significantly in the final week when compared to the previous 4 weeks (mean total hits weeks 1-4 +/- SEM: 100.9 +/- 9.7 and mean total hits week 5: 152.1 +/- 13.1; P < 0.001, Kruskal-Wallis). However, total hits did not correlate with overall examination results (r(2) = 0.16). The aural lectures demonstrated higher face validity than the non-aural for content and presentation (P < 0.05, Kruskal-Wallis).
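
    For reference, the form of the between-group comparison reported above can be sketched with scipy's Mann-Whitney U test; the mark vectors below are placeholders, not the study data.

      from scipy.stats import mannwhitneyu

      # Placeholder MCQ marks for stems derived from aural vs text-only lectures.
      aural_marks = [72, 68, 75, 80, 77, 74, 69, 81]
      text_only_marks = [65, 70, 66, 72, 63, 68, 71, 64]

      stat, p = mannwhitneyu(aural_marks, text_only_marks, alternative="two-sided")
      print(f"U={stat:.1f}  p={p:.3f}")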

  3. Flow rate logging seepage meter

    NASA Technical Reports Server (NTRS)

    Reay, William G. (Inventor); Walthall, Harry G. (Inventor)

    1996-01-01

    An apparatus for remotely measuring and logging the flow rate of groundwater seepage into surface water bodies. As groundwater seeps into a cavity created by a bottomless housing, it displaces water through an inlet and into a waterproof sealed upper compartment, at which point, the water is collected by a collection bag, which is contained in a bag chamber. A magnet on the collection bag approaches a proximity switch as the collection bag fills, and eventually enables the proximity switch to activate a control circuit. The control circuit then rotates a three-way valve from the collection path to a discharge path, enables a data logger to record the time, and enables a pump, which discharges the water from the collection bag, through the three-way valve and pump, and into the sea. As the collection bag empties, the magnet leaves the proximity of the proximity switch, and the control circuit turns off the pump, resets the valve to provide a collection path, and restarts the collection cycle.

  4. Web Worries.

    ERIC Educational Resources Information Center

    Reidelbach, Dorothy

    1996-01-01

    Four common difficulties in development and maintenance of a World Wide Web site are those of capturing control of the ever-changing medium and providing product consistency; providing relatively easy and rapid access for users; coping with copyrights when there are so few legal guidelines; and preserving privacy for use of credit cards. Solutions…

  5. Webbing It.

    ERIC Educational Resources Information Center

    Brandsberg, Jennifer

    1996-01-01

    Provides a quick look at some World Wide Web sites that contain current election year information. Recommends Project Vote Smart, a site with links to online news organizations, the home pages of all presidential candidates, and other political sites. Briefly notes several interactive CD-ROM resources. (MJP)

  6. Fiber webs

    Treesearch

    Roger M. Rowell; James S. Han; Von L. Byrd

    2005-01-01

    Wood fibers can be used to produce a wide variety of low-density three-dimensional webs, mats, and fiber-molded products. Short wood fibers blended with long fibers can be formed into flexible fiber mats, which can be made by physical entanglement, nonwoven needling, or thermoplastic fiber melt matrix technologies. The most common types of flexible mats are carded, air...

  7. 'Infectious web'.

    PubMed

    Kotra, L P; Ojcius, D M

    2000-12-01

    A comprehensive list of all known bacterial pathogens of humans is now available at various web sites on the internet. The sites contain hyperlinks to original scientific literature, along with general information on laboratory testing, antibiotic resistance and clinical treatment. More specific sites highlight the fungus Pneumocystis carinii, arguably the main cause of pneumonia in immunosuppressed individuals.

  9. Web Sitings.

    ERIC Educational Resources Information Center

    Lo, Erika

    2001-01-01

    Presents seven mathematics games, located on the World Wide Web, for elementary students, including: Absurd Math: Pre-Algebra from Another Dimension; The Little Animals Activity Centre; MathDork Game Room (classic video games focusing on algebra); Lemonade Stand (students practice math and business skills); Math Cats (teaches the artistic beauty…

  10. Vibration transmission through sheet webs of hobo spiders (Eratigena agrestis) and tangle webs of western black widow spiders (Latrodectus hesperus).

    PubMed

    Vibert, Samantha; Scott, Catherine; Gries, Gerhard

    2016-11-01

    Web-building spiders construct their own vibratory signaling environments. Web architecture should affect signal design, and vice versa, such that vibratory signals are transmitted with a minimum of attenuation and degradation. However, the web is the medium through which a spider senses both vibratory signals from courting males and cues produced by captured prey. Moreover, webs function not only in vibration transmission, but also in defense from predators and the elements. These multiple functions may impose conflicting selection pressures on web design. We investigated vibration transmission efficiency and accuracy through two web types with contrasting architectures: sheet webs of Eratigena agrestis (Agelenidae) and tangle webs of Latrodectus hesperus (Theridiidae). We measured vibration transmission efficiencies by playing frequency sweeps through webs with a piezoelectric vibrator and a loudspeaker, recording the resulting web vibrations at several locations on each web using a laser Doppler vibrometer. Transmission efficiencies through both web types were highly variable, with within-web variation greater than among-web variation. There was little difference in transmission efficiencies of longitudinal and transverse vibrations. The inconsistent transmission of specific frequencies through webs suggests that parameters other than frequency are most important in allowing these spiders to distinguish between vibrations of prey and courting males.
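
    One simple way to express transmission efficiency from paired stimulus/response recordings is the ratio of output to input amplitude spectra; the synthetic signals and the plain spectral ratio below are assumptions for illustration, not the authors' laser-vibrometer processing chain.

      import numpy as np

      fs = 2000                                  # sampling rate in Hz (assumed)
      t = np.arange(0, 1.0, 1 / fs)

      # Synthetic stand-ins for the stimulus at the vibrator and the response on the web.
      stimulus = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 300 * t)
      response = 0.8 * np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 300 * t)

      freqs = np.fft.rfftfreq(len(t), 1 / fs)
      gain = np.abs(np.fft.rfft(response)) / (np.abs(np.fft.rfft(stimulus)) + 1e-12)

      for f in (50, 300):
          idx = np.argmin(np.abs(freqs - f))
          print(f"{f} Hz transmission ratio ~ {gain[idx]:.2f}")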

  11. A time-efficient web-based teaching tool to improve medical knowledge and decrease ABIM failure rate in select residents.

    PubMed

    Drake, Sean M; Qureshi, Waqas; Morse, William; Baker-Genaw, Kimberly

    2015-01-01

    The American Board of Internal Medicine (ABIM) exam's pass rate is considered a quality measure of a residency program, yet few interventions have shown benefit in reducing the failure rate. We developed a web-based Directed Reading (DR) program with an aim to increase medical knowledge and reduce ABIM exam failure rate. Internal medicine residents at our academic medical center with In-Training Examination (ITE) scores ≤35th percentile from 2007 to 2013 were enrolled in DR. The program matches residents to reading assignments based on their own ITE-failed educational objectives and provides direct electronic feedback from their teaching physicians. ABIM exam pass rates were analyzed across various groups between 2002 and 2013 to examine the effect of the DR program on residents with ITE scores ≤35th percentile pre- (2002-2006) and post-intervention (2007-2013). A time commitment survey was also given to physicians and DR residents at the end of the study. Residents who never scored ≤35th percentile on ITE were the most likely to pass the ABIM exam on first attempt regardless of time period. For those who ever scored ≤35th percentile on ITE, 91.9% of residents who participated in DR passed the ABIM exam on first attempt vs 85.2% of their counterparts pre-intervention (p < 0.001). This showed an improvement in ABIM exam pass rate for this subset of residents after introduction of the DR program. The time survey showed that faculty used an average of 40±18 min per week to participate in DR and residents required an average of 25 min to search/read about the objective and 20 min to write a response. Although residents who ever scored ≤35th percentile on ITE were more likely to fail ABIM exam on first attempt, those who participated in the DR program were less likely to fail than the historical control counterparts. The web-based teaching method required little time commitment by faculty.

  12. A time-efficient web-based teaching tool to improve medical knowledge and decrease ABIM failure rate in select residents.

    PubMed

    Drake, Sean M; Qureshi, Waqas; Morse, William; Baker-Genaw, Kimberly

    2015-01-01

    Aim The American Board of Internal Medicine (ABIM) exam's pass rate is considered a quality measure of a residency program, yet few interventions have shown benefit in reducing the failure rate. We developed a web-based Directed Reading (DR) program with an aim to increase medical knowledge and reduce ABIM exam failure rate. Methods Internal medicine residents at our academic medical center with In-Training Examination (ITE) scores ≤35th percentile from 2007 to 2013 were enrolled in DR. The program matches residents to reading assignments based on their own ITE-failed educational objectives and provides direct electronic feedback from their teaching physicians. ABIM exam pass rates were analyzed across various groups between 2002 and 2013 to examine the effect of the DR program on residents with ITE scores ≤35 percentile pre- (2002-2006) and post-intervention (2007-2013). A time commitment survey was also given to physicians and DR residents at the end of the study. Results Residents who never scored ≤35 percentile on ITE were the most likely to pass the ABIM exam on first attempt regardless of time period. For those who ever scored ≤35 percentile on ITE, 91.9% of residents who participated in DR passed the ABIM exam on first attempt vs 85.2% of their counterparts pre-intervention (p<0.001). This showed an improvement in ABIM exam pass rate for this subset of residents after introduction of the DR program. The time survey showed that faculty used an average of 40±18 min per week to participate in DR and residents required an average of 25 min to search/read about the objective and 20 min to write a response. Conclusions Although residents who ever scored ≤35 percentile on ITE were more likely to fail ABIM exam on first attempt, those who participated in the DR program were less likely to fail than the historical control counterparts. The web-based teaching method required little time commitment by faculty.

  13. A time-efficient web-based teaching tool to improve medical knowledge and decrease ABIM failure rate in select residents

    PubMed Central

    Drake, Sean M.; Qureshi, Waqas; Morse, William; Baker-Genaw, Kimberly

    2015-01-01

    Aim The American Board of Internal Medicine (ABIM) exam's pass rate is considered a quality measure of a residency program, yet few interventions have shown benefit in reducing the failure rate. We developed a web-based Directed Reading (DR) program with an aim to increase medical knowledge and reduce ABIM exam failure rate. Methods Internal medicine residents at our academic medical center with In-Training Examination (ITE) scores ≤35th percentile from 2007 to 2013 were enrolled in DR. The program matches residents to reading assignments based on their own ITE-failed educational objectives and provides direct electronic feedback from their teaching physicians. ABIM exam pass rates were analyzed across various groups between 2002 and 2013 to examine the effect of the DR program on residents with ITE scores ≤35 percentile pre- (2002–2006) and post-intervention (2007–2013). A time commitment survey was also given to physicians and DR residents at the end of the study. Results Residents who never scored ≤35 percentile on ITE were the most likely to pass the ABIM exam on first attempt regardless of time period. For those who ever scored ≤35 percentile on ITE, 91.9% of residents who participated in DR passed the ABIM exam on first attempt vs 85.2% of their counterparts pre-intervention (p<0.001). This showed an improvement in ABIM exam pass rate for this subset of residents after introduction of the DR program. The time survey showed that faculty used an average of 40±18 min per week to participate in DR and residents required an average of 25 min to search/read about the objective and 20 min to write a response. Conclusions Although residents who ever scored ≤35 percentile on ITE were more likely to fail ABIM exam on first attempt, those who participated in the DR program were less likely to fail than the historical control counterparts. The web-based teaching method required little time commitment by faculty. PMID:26521767

  14. Mining Formative Evaluation Rules Using Web-Based Learning Portfolios for Web-Based Learning Systems

    ERIC Educational Resources Information Center

    Chen, Chih-Ming; Hong, Chin-Ming; Chen, Shyuan-Yi; Liu, Chao-Yu

    2006-01-01

    Learning performance assessment aims to evaluate what knowledge learners have acquired from teaching activities. Objective technical measures of learning performance are difficult to develop, but are extremely important for both teachers and learners. Learning performance assessment using learning portfolios or web server log data is becoming an…

  15. Web Evaluation Tool (WET): A Creative Web Tool for Online Educators

    ERIC Educational Resources Information Center

    Hamza, Mohammad Khalid

    2003-01-01

    The Nielsen/Net report Ratings 2000 reported that in 2002, online usage at work jumped 17 percent year-over-year, driven by female office workers. Nearly 46 million American office workers logged onto the Web, the highest peak since January 2000. It was also predicted that the number of students using the Internet was expected to reach 13.5…

  17. DARK ENERGY FROM THE LOG-TRANSFORMED CONVERGENCE FIELD

    SciTech Connect

    Seo, Hee-Jong; Sato, Masanori; Takada, Masahiro; Dodelson, Scott

    2012-03-20

    A logarithmic transform of the convergence field improves 'the information content', i.e., the overall precision associated with the measurement of the amplitude of the convergence power spectrum, by improving the covariance matrix properties. The translation of this improvement in the information content to that in cosmological parameters, such as those associated with dark energy, requires knowing the sensitivity of the log-transformed field to those cosmological parameters. In this paper, we use N-body simulations with ray tracing to generate convergence fields at multiple source redshifts as a function of cosmology. The gain in information associated with the log-transformed field does lead to tighter constraints on dark energy parameters, but only if shape noise is neglected. The presence of shape noise quickly diminishes the advantage of the log-mapping, more quickly than we would expect based on the information content. With or without shape noise, using a larger pixel size allows for a more efficient log-transformation.
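
    As a rough illustration of the mapping discussed above, the sketch below applies the log-transform commonly written as kappa_ln = kappa0 * ln(1 + kappa/kappa0) to a toy convergence map. The choice of kappa0 and the Gaussian toy field are assumptions for illustration only, not the ray-traced simulations or the exact convention used in the paper.

```python
import numpy as np

def log_transform_convergence(kappa, kappa0=None):
    """Logarithmic mapping often applied to weak-lensing convergence maps:
    kappa_ln = kappa0 * ln(1 + kappa / kappa0).

    Taking kappa0 close to |min(kappa)| (with a small margin so the argument
    of the log stays positive) is an assumption made for illustration.
    """
    if kappa0 is None:
        kappa0 = 1.1 * abs(kappa.min())
    return kappa0 * np.log1p(kappa / kappa0)

# Toy example: a Gaussian random field standing in for a convergence map.
rng = np.random.default_rng(0)
kappa = rng.normal(0.0, 0.02, size=(256, 256))
kappa_ln = log_transform_convergence(kappa)
print(kappa.std(), kappa_ln.std())
```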

  18. Search Smarter, Not Harder, on the World Wide Web.

    ERIC Educational Resources Information Center

    Foster, Janet

    1999-01-01

    Highlights Web sites and print recommendations that make excellent resources for narrowing down the scope of online searches and making online time more efficient. Discusses Web directories, search engines, meta search tools, and books and journals on Internet searching. (AEF)

  19. A handy aid for hardwood log graders

    Treesearch

    M. D. Ostrander

    1952-01-01

    In hardwood log grading, the beginner encounters a formidable task: to memorize the specifications, exceptions to general rules, etc., as set down in the U. S. Forest Products Laboratory's "Hardwood Log Grades for Standard Lumber." He must refer to this text repeatedly until he becomes familiar with all the ins and outs of the job. This slows him down...

  20. Grading sugar pine saw logs in trees.

    Treesearch

    John W. Henley

    1972-01-01

    Small limbs and small overgrown limbs cause problems when grading saw logs in sugar pine trees. Surface characteristics and lumber recovery information for 426 logs from 64 sugar pine trees were examined. Resulting modifications in the grading specification that allow a grader to ignore small limbs and small limb indicators do not appear to decrease the performance of...

  1. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  2. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  3. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  4. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  5. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  6. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  7. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  8. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  9. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  10. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  11. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  12. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  13. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  14. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  15. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  16. Internal log scanning: Research to reality

    Treesearch

    Daniel L. Schmoldt

    2000-01-01

    Improved log breakdown into lumber has been an active research topic since the 1960s. Demonstrated economic gains have driven the search for a cost-effective method to scan logs internally, from which it is assumed one can choose a better breakdown strategy. X-ray computed tomography (CT) has been widely accepted as the most promising internal imaging technique....

  17. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O' Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. It also provides means of comparing two or more grading systems, or a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  18. Logging deck organization with a bundler

    Treesearch

    Dana. Mitchell

    2009-01-01

    The original John Deere 1490D Slash Bundler is mounted on a forwarder so that it can collect woody biomass scattered throughout a tract. However, typical logging operations in the southeastern United States delimb and top at the landing, so logging residues are concentrated at the landing. In a current study by researchers at Auburn...

  19. Logging methods and peeling of Aspen

    Treesearch

    T. Schantz-Hansen

    1948-01-01

    The logging of forest products is influenced by many factors, including the size of the trees, density of the stand, the soundness of the trees, size of the area logged, topography and soil, weather conditions, the degree of utilization, the skill of the logger and the equipment used, the distance from market, etc. Each of these factors influences not only the method...

  20. Synthetic rope applications in Appalachian logging

    Treesearch

    Ben D. Spong; Jingxin Wang

    2008-01-01

    New ultra-high molecular weight polyethylene rope has shown good results as a replacement for wire rope in logging applications in the western United States. A single case study trial was performed in Appalachian forest conditions to assess the appropriateness of this technology for hardwood logging applications. The study focused on use of the rope in West Virginia...

  1. Mathematical model of a smoldering log.

    Treesearch

    Fernando de Souza Costa; David. Sandberg

    2004-01-01

    A mathematical model is developed describing the natural smoldering of logs. The model considers the steady, one-dimensional propagation of infinitesimally thin fronts of drying, pyrolysis, and char oxidation in a horizontal semi-infinite log. Expressions for the burn rates, temperature distribution profiles, and positions of the drying, pyrolysis, and smoldering fronts...

  2. Logging truck noise near nesting northern goshawks

    Treesearch

    Teryl G. Grubb; Larry L. Pater; David K. Delaney

    1998-01-01

    We measured noise levels of four logging trucks as the trucks passed within approximately 500 m of two active northern goshawk (Accipiter gentilis) nests on the Kaibab Plateau in northern Arizona in 1997. Neither a brooding adult female nor a lone juvenile exhibited any discernable behavioral response to logging truck noise, which peaked at 53.4 and...

  3. Discover Presidential Log Cabins. Teacher's Discussion Guide.

    ERIC Educational Resources Information Center

    National Park Service (Dept. of Interior), Washington, DC.

    Discover Presidential Log Cabins is a set of materials designed to help educate 6-8 grade students about the significance of three log cabin sites occupied by George Washington, Ulysses Grant, Abraham Lincoln, and Theodore Roosevelt. This teacher's discussion guide is intended for use as part of a larger, comprehensive social studies program, and…

  4. Reduced-impact logging: challenges and opportunities

    Treesearch

    F.E. Putz; P. Sist; T. Fredericksen; D. Dykstra

    2008-01-01

    Over the past two decades, sets of timber harvesting guidelines designed to mitigate the deleterious environmental impacts of tree felling, yarding, and hauling have become known as "reduced-impact logging" (RIL) techniques. Although none of the components of RIL are new, concerns about destructive logging practices and worker safety in the tropics stimulated...

  5. Impacts of extended working hours in logging

    Treesearch

    Dana Mitchell; Tom Gallagher

    2008-01-01

    Last year at the 2007 AIM in Minneapolis, MN, the authors presented the human factors impacts to consider when implementing extended working hours in the logging industry. In a continuation of this project, we have researched existing literature to identify possible actions that logging business owners can take to reduce the impact of extended working hours on their...

  6. Measuring Reading Instruction with Teacher Logs

    ERIC Educational Resources Information Center

    Rowan, Brian; Correnti, Richard

    2009-01-01

    The authors argue that the criticisms of their earlier article on teacher logs ("Educational Researcher," March 2009) by Smagorinsky and Willis do not address, much less undermine, the evidence they presented as part of their validation argument about the teacher logs. Moreover, they argue that their method for studying classrooms is not nearly as…

  7. Recovery from simulated sawn logs with sweep.

    Treesearch

    Robert A. Monserud; Dean L. Parry; Christine L. Todoroki

    2004-01-01

    A sawing simulator, AUTOSAW, was used to examine the effect of increasing sweep on lumber recovery. Sample material consisted of 51 logs from 22 western hemlock (Tsuga heterophylla (Raf.) Sarg. ) trees in western Oregon, United States. All knots on the 4.9-m logs were measured, mapped, and converted into 3-dimensional digital formats. The digital...

  8. Log sort yard economics, planning, and feasibility

    Treesearch

    John Rusty Dramm; Robert Govett; Ted Bilek; Gerry L. Jackson

    2004-01-01

    This publication discusses basic marketing and economic concepts, planning approach, and feasibility methodology for assessing log sort yard operations. Special attention is given to sorting small diameter and underutilized logs from forest restoration, fuels reduction, and thinning operations. A planned programming approach of objectively determining the feasibility...

  9. Selective Logging in the Brazilian Amazon

    NASA Astrophysics Data System (ADS)

    Asner, Gregory P.; Knapp, David E.; Broadbent, Eben N.; Oliveira, Paulo J. C.; Keller, Michael; Silva, Jose N.

    2005-10-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of ~0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.

  10. Selective logging in the Brazilian Amazon.

    PubMed

    Asner, Gregory P; Knapp, David E; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Silva, Jose N

    2005-10-21

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of approximately 0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.

  11. Coal-log pipeline system development

    SciTech Connect

    Liu, H.

    1991-12-01

    Project tasks include: (1) Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower to produce logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long distance pipeline systems. Prior to conducting any testing and demonstration, grantee shall perform an information search and make full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. (2) Perform the necessary development to demonstrate a small model of the most promising injection system for coal-logs, and tests the logs produced. (3) Conduct economic analysis of coal-log pipeline, based upon the work to date. Refine and complete the economic model. (VC)

  12. Designing and Piloting a Leadership Daily Practice Log: Using Logs to Study the Practice of Leadership

    ERIC Educational Resources Information Center

    Spillane, James P.; Zuberi, Anita

    2009-01-01

    Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…

  13. Designing and Piloting a Leadership Daily Practice Log: Using Logs to Study the Practice of Leadership

    ERIC Educational Resources Information Center

    Spillane, James P.; Zuberi, Anita

    2009-01-01

    Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…

  14. Use of a Web-Based Calculator and a Structured Report Generator to Improve Efficiency, Accuracy, and Consistency of Radiology Reporting.

    PubMed

    Towbin, Alexander J; Hawkins, C Matthew

    2017-03-29

    While medical calculators are common, they are infrequently used in the day-to-day radiology practice. We hypothesized that a calculator coupled with a structured report generator would decrease the time required to interpret and dictate a study in addition to decreasing the number of errors in interpretation. A web-based application was created to help radiologists calculate leg-length discrepancies. A time motion study was performed to evaluate if the calculator helped to decrease the time for interpretation and dictation of leg-length radiographs. Two radiologists each evaluated two sets of ten radiographs, one set using the traditional pen and paper method and the other set using the calculator. The time to interpret each study and the time to dictate each study were recorded. In addition, each calculation was checked for errors. When comparing the two methods of calculating the leg lengths, the manual method was significantly slower than the calculator for all time points measured: the mean time to calculate the leg-length discrepancy (131.8 vs. 59.7 s; p < 0.001), the mean time to dictate the report (31.8 vs. 11 s; p < 0.001), and the mean total time (163.7 vs. 70.7 s; p < 0.001). Reports created by the calculator were more accurate than reports created via the manual method (100 vs. 90%), although this result was not significant (p = 0.16). A calculator with a structured report generator significantly improved the time required to calculate and dictate leg-length discrepancy studies.
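
    For illustration only, here is a minimal sketch of the arithmetic behind a leg-length discrepancy calculation and a simple structured report string; the measurement fields, units, and report wording are hypothetical and are not taken from the cited web application.

```python
def leg_length_report(right_femur_cm, right_tibia_cm, left_femur_cm, left_tibia_cm):
    """Compute per-side leg lengths and the discrepancy, then return a
    short structured report string. Field names and wording are hypothetical."""
    right_total = right_femur_cm + right_tibia_cm
    left_total = left_femur_cm + left_tibia_cm
    discrepancy = right_total - left_total
    side = "right" if discrepancy >= 0 else "left"
    return (
        f"Right leg {right_total:.1f} cm, left leg {left_total:.1f} cm; "
        f"discrepancy {abs(discrepancy):.1f} cm ({side} longer)."
    )

print(leg_length_report(44.2, 36.8, 43.5, 36.9))
```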

  15. 'Infectious web'.

    PubMed

    Kotra, L P; Ojcius, D M

    2000-07-01

    Infections by Helicobacter pylori are responsible for duodenal and gastric ulcers and are a significant risk factor for the development of gastric adenocarcinoma. H. pylori was discovered in 1983, but many institutes in Canada, Europe, and the United States are already involved in programs to understand and treat the infections, as reflected by the growing number of internet sites devoted to this bacterium. Most AIDS patients and about 20% of children with acute lymphoblastic leukemia develop Pneumocystis carinii pneumoniae. Information on clinical symptoms and treatment, as well as the P. carinii genome sequencing project, are described at several web sites. Students and researchers wishing to understand the correlation between telomere length and AIDS may turn to web sites of the University of Colorado and Washington University School of Medicine for the latest on telomeres and telomerase, and their function in aging and cancer.

  16. Research on web performance optimization principles and models

    NASA Astrophysics Data System (ADS)

    Wang, Xin

    2013-03-01

    The rapid development of the Internet has made Web performance optimization an increasingly prominent and ultimately unavoidable concern. The first principle of Web performance optimization is to understand the trade-offs involved: every gain has a cost, and returns diminish; optimization should therefore start at the highest level, where the largest improvements can be obtained. Technical models for improving Web performance include cost sharing, high-speed caching, profiling, parallel processing, and simplified processing. Based on this study, key Web performance optimization recommendations are given; improving Web performance and accelerating the efficient use of the Internet is of significant practical importance.
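
    As a concrete illustration of one of the techniques listed above (high-speed caching), the sketch below memoizes an expensive page-fragment render in process memory. It is a generic example, not code from the paper, and the render function is a stand-in for real work.

```python
from functools import lru_cache
import time

@lru_cache(maxsize=1024)
def render_fragment(product_id: int) -> str:
    # Stand-in for a slow database query plus template rendering.
    time.sleep(0.05)
    return f"<div class='product' data-id='{product_id}'>...</div>"

start = time.perf_counter()
render_fragment(42)          # slow: misses the cache
first = time.perf_counter() - start

start = time.perf_counter()
render_fragment(42)          # fast: served from the in-memory cache
second = time.perf_counter() - start
print(f"first {first*1000:.1f} ms, cached {second*1000:.3f} ms")
```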

  17. Human dynamics revealed through Web analytics.

    PubMed

    Gonçalves, Bruno; Ramasco, José J

    2008-08-01

    The increasing ubiquity of Internet access and the frequency with which people interact with it raise the possibility of using the Web to better observe, understand, and monitor several aspects of human social behavior. Web sites with large numbers of frequently returning users are ideal for this task. If these sites belong to companies or universities, their usage patterns can furnish information about the working habits of entire populations. In this work, we analyze the properly anonymized logs detailing the access history to Emory University's Web site. Emory is a medium-sized university located in Atlanta, Georgia. We find interesting structure in the activity patterns of the domain and study in a systematic way the main forces behind the dynamics of the traffic. In particular, we find that linear preferential linking, priority-based queuing, and the decay of interest for the contents of the pages are the essential ingredients to understand the way users navigate the Web.

  18. Human dynamics revealed through Web analytics

    NASA Astrophysics Data System (ADS)

    Gonçalves, Bruno; Ramasco, José J.

    2008-08-01

    The increasing ubiquity of Internet access and the frequency with which people interact with it raise the possibility of using the Web to better observe, understand, and monitor several aspects of human social behavior. Web sites with large numbers of frequently returning users are ideal for this task. If these sites belong to companies or universities, their usage patterns can furnish information about the working habits of entire populations. In this work, we analyze the properly anonymized logs detailing the access history to Emory University’s Web site. Emory is a medium-sized university located in Atlanta, Georgia. We find interesting structure in the activity patterns of the domain and study in a systematic way the main forces behind the dynamics of the traffic. In particular, we find that linear preferential linking, priority-based queuing, and the decay of interest for the contents of the pages are the essential ingredients to understand the way users navigate the Web.

  19. Project Assessment Skills Web Application

    NASA Technical Reports Server (NTRS)

    Goff, Samuel J.

    2013-01-01

    The purpose of this project is to utilize Ruby on Rails to create a web application that will replace a spreadsheet keeping track of training courses and tasks. The goal is to create a fast and easy to use web application that will allow users to track progress on training courses. This application will allow users to update and keep track of all of the training required of them. The training courses will be organized by group and by user, making readability easier. This will also allow group leads and administrators to get a sense of how everyone is progressing in training. Currently, updating and finding information from this spreadsheet is a long and tedious task. By upgrading to a web application, finding and updating information will be easier than ever as well as adding new training courses and tasks. Accessing this data will be much easier in that users just have to go to a website and log in with NDC credentials rather than request the relevant spreadsheet from the holder. In addition to Ruby on Rails, I will be using JavaScript, CSS, and jQuery to help add functionality and ease of use to my web application. This web application will include a number of features that will help update and track progress on training. For example, one feature will be to track progress of a whole group of users to be able to see how the group as a whole is progressing. Another feature will be to assign tasks to either a user or a group of users. All of these together will create a user friendly and functional web application.

  20. Query log analysis of an electronic health record search engine.

    PubMed

    Yang, Lei; Mei, Qiaozhu; Zheng, Kai; Hanauer, David A

    2011-01-01

    We analyzed a longitudinal collection of query logs of a full-text search engine designed to facilitate information retrieval in electronic health records (EHR). The collection, 202,905 queries and 35,928 user sessions recorded over the course of 4 years, represents the information-seeking behavior of 533 medical professionals, including frontline practitioners, coding personnel, patient safety officers, and biomedical researchers for patient data stored in EHR systems. In this paper, we present descriptive statistics of the queries, a categorization of information needs manifested through the queries, as well as temporal patterns of the users' information-seeking behavior. The results suggest that information needs in the medical domain are substantially more sophisticated than those that general-purpose web search engines need to accommodate. Therefore, we envision there exists a significant challenge, along with significant opportunities, to provide intelligent query recommendations to facilitate information retrieval in EHR.

  1. Query Log Analysis of an Electronic Health Record Search Engine

    PubMed Central

    Yang, Lei; Mei, Qiaozhu; Zheng, Kai; Hanauer, David A.

    2011-01-01

    We analyzed a longitudinal collection of query logs of a full-text search engine designed to facilitate information retrieval in electronic health records (EHR). The collection, 202,905 queries and 35,928 user sessions recorded over the course of 4 years, represents the information-seeking behavior of 533 medical professionals, including frontline practitioners, coding personnel, patient safety officers, and biomedical researchers for patient data stored in EHR systems. In this paper, we present descriptive statistics of the queries, a categorization of information needs manifested through the queries, as well as temporal patterns of the users’ information-seeking behavior. The results suggest that information needs in the medical domain are substantially more sophisticated than those that general-purpose web search engines need to accommodate. Therefore, we envision there exists a significant challenge, along with significant opportunities, to provide intelligent query recommendations to facilitate information retrieval in EHR. PMID:22195150
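
    A minimal sketch of the kind of descriptive statistics such an analysis involves, assuming a hypothetical tab-separated log format (timestamp, user, session, query); the cited EHR query logs themselves are not public.

```python
import csv
from collections import defaultdict
from statistics import mean

def describe_query_log(path):
    """Simple descriptive statistics over a query log.

    Assumes a tab-separated file with columns: timestamp, user_id,
    session_id, query. This layout is hypothetical.
    """
    session_queries = defaultdict(list)
    with open(path, newline="") as fh:
        for _ts, user_id, session_id, query in csv.reader(fh, delimiter="\t"):
            session_queries[(user_id, session_id)].append(query)
    term_counts = [len(q.split()) for qs in session_queries.values() for q in qs]
    return {
        "sessions": len(session_queries),
        "queries": len(term_counts),
        "mean_queries_per_session": mean(len(qs) for qs in session_queries.values()),
        "mean_terms_per_query": mean(term_counts),
    }
```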

  2. The fluid-compensated cement bond log

    SciTech Connect

    Nayfeh, T.H.; Leslie, H.D.; Wheelis, W.B.

    1984-09-01

    An experimental and numerical wave mechanics study of cement bond logs demonstrated that wellsite computer processing can now segregate wellbore fluid effects from the sonic signal response to changing cement strength. Traditionally, cement logs have been interpreted as if water were in the wellbore, without consideration of wellbore fluid effects. These effects were assumed to be negligible. However, with the increasing number of logs being run in completion fluids such as CaCl₂, ZnBr₂, and CaBr₂, large variations in cement bond logs became apparent. A Schlumberger internal paper showing that bond log amplitude is related to the acoustic impedance of the fluid in which the tool is run led to a comprehensive study of wellbore fluid effects. Numerical and experimental models were developed simulating wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings by varying the wellbore fluid densities, viscosities, and fluid types (acoustic impedance). Parallel numerical modeling was undertaken using similar parameters. The results showed that the bond log amplitude varied dramatically with the wellbore fluid's acoustic impedance; for example, there was a 70 percent increase in the signal amplitude for 11.5-lb/gal CaCl₂ over the signal amplitude in water. This led to the development of a Fluid-Compensated Bond log that corrects the amplitude for acoustic impedance of varying wellbore fluids, thereby making the measurements more directly related to the cement quality.

  3. Sedimentological analysis using geophysical well logs

    SciTech Connect

    Izotova, T.S. )

    1993-09-01

    The application of geophysical well logs in sedimentology and stratigraphic prospecting holds great promise in solving a number of geological problems. A suite of logs provides data on a wide range of rock properties: vertical and lateral variation of resistivity, natural polarization, natural and induced radioactivity, shear strength, and acoustic properties. Each of these properties is controlled by the depositional environment of the sediments and their later diagenesis. The attention of geologists and geophysicists is drawn to new techniques in the interpretation of geophysical well logs for exploration, appraisal, and development of oil and gas fields. The relationship between geophysical logs and depositional environments is explored. Bulk composition, rock structure, and texture and facies variation can be quantified by electric log parameters. Also, the possibility of using logs to demonstrate long- and short-period sedimentary cycles is demonstrated. Methods of sedimentological analysis using geophysical well logs are demonstrated. The importance of a genetic approach in the interpretation of geological sequences and paleogeological reconstructions is emphasized using examples taken from oil and gas prospecting operations in the Ukraine.

  4. Sample size calculation for testing differences between cure rates with the optimal log-rank test.

    PubMed

    Wu, Jianrong

    2017-01-01

    In this article, sample size calculations are developed for use when the main interest is in the differences between the cure rates of two groups. Following the work of Ewell and Ibrahim, the asymptotic distribution of the weighted log-rank test is derived under the local alternative. The optimal log-rank test under the proportional distributions alternative is discussed, and sample size formulas for the optimal and standard log-rank tests are derived. Simulation results show that the proposed formulas provide adequate sample size estimation for trial designs and that the optimal log-rank test is more efficient than the standard log-rank test, particularly when both cure rates and percentages of censoring are small.
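
    For reference, here is a sketch of the textbook Schoenfeld-style approximation for the standard (unweighted) log-rank test with 1:1 allocation. This is not the optimal log-rank or cure-rate formula derived in the article; the overall event probability p_event is an assumed input used only for illustration.

```python
from math import ceil, log
from statistics import NormalDist

def logrank_sample_size(hazard_ratio, p_event, alpha=0.05, power=0.80):
    """Schoenfeld approximation for the standard log-rank test, 1:1 allocation:
    required events d = 4*(z_{1-a/2} + z_{1-b})^2 / (ln HR)^2, and total
    sample size n = d / p_event, where p_event is the overall probability of
    observing an event (not censored or cured). Textbook formula, not the
    optimal log-rank test of the cited article.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    events = 4 * (z_alpha + z_beta) ** 2 / log(hazard_ratio) ** 2
    return ceil(events), ceil(events / p_event)

print(logrank_sample_size(hazard_ratio=0.7, p_event=0.6))
```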

  5. Investigating metrics of geospatial web services: The case of a CEOS federated catalog service for earth observation data

    NASA Astrophysics Data System (ADS)

    Han, Weiguo; Di, Liping; Yu, Genong; Shao, Yuanzheng; Kang, Lingjun

    2016-07-01

    Geospatial Web Services (GWS) make geospatial information and computing resources discoverable and accessible over the Web. Among them, Open Geospatial Consortium (OGC) standards-compliant data, catalog and processing services are most popular, and have been widely adopted and leveraged in geospatial research and applications. The GWS metrics, such as visit count, average processing time, and user distribution, are important to evaluate their overall performance and impacts. However, these metrics, especially those of a federated catalog service, have not been systematically evaluated and reported to relevant stakeholders from the point of view of service providers. Taking an integrated catalog service for earth observation data as an example, this paper describes metrics information retrieval, organization, and representation of a catalog service federation. An extensible and efficient log file analyzer is implemented to retrieve a variety of service metrics from the log file and store analysis results in an easily programmable format. An Ajax-powered Web portal is built to provide stakeholders, sponsors, developers, partners, and other types of users with specific and relevant insights into metrics information in an interactive and informative form. The deployed system has provided useful information for periodical reports, service delivery, and decision support. The proposed measurement strategy and analytics framework can serve as guidance to help GWS providers evaluate their services.
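
    A minimal sketch of what such a log-file analyzer might compute (visit count, average processing time, user distribution), assuming an Apache-style access log with a trailing response-time field in milliseconds; the log format and field layout are assumptions, not the system described in the paper.

```python
import re
from collections import Counter

# Apache-style access log with an extra response-time (ms) field at the end.
# This format is an assumption for illustration; real deployments vary.
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ (?P<ms>\d+)'
)

def summarize_access_log(lines):
    visits, total_ms, by_ip = 0, 0, Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        visits += 1
        total_ms += int(m["ms"])
        by_ip[m["ip"]] += 1
    return {
        "visit_count": visits,
        "avg_processing_ms": total_ms / visits if visits else 0.0,
        "top_users": by_ip.most_common(5),
    }

sample = ['198.51.100.7 - - [01/Jul/2016:12:00:01 +0000] "GET /csw?request=GetRecords HTTP/1.1" 200 5120 153']
print(summarize_access_log(sample))
```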

  6. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…
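
    To give a flavor of an XML-structured event log and how it can be parsed, here is a toy example; the element and attribute names are hypothetical and do not reflect the report's proposed schema.

```python
import xml.etree.ElementTree as ET
from collections import Counter

# A toy event log in the spirit of an XML-structured G/SBA log.
# Element and attribute names are hypothetical, not the report's schema.
SAMPLE = """
<session learner="s001" task="circuit-1">
  <event t="0.0" type="start"/>
  <event t="4.2" type="place" object="resistor"/>
  <event t="9.8" type="run_simulation"/>
  <event t="15.1" type="submit" correct="true"/>
</session>
"""

root = ET.fromstring(SAMPLE)
events = root.findall("event")
counts = Counter(e.get("type") for e in events)
duration = float(events[-1].get("t")) - float(events[0].get("t"))
print(root.get("learner"), dict(counts), f"{duration:.1f}s on task")
```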

  7. Nonblocking and orphan free message logging protocols

    NASA Technical Reports Server (NTRS)

    Alvisi, Lorenzo; Hoppe, Bruce; Marzullo, Keith

    1992-01-01

    Existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery; however, it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead as compared to one implemented for a system in which messages may be lost but processes do not crash.

  8. Reference manual for data base on Nevada well logs

    USGS Publications Warehouse

    Bauer, E.M.; Cartier, K.D.

    1995-01-01

    The U.S. Geological Survey and Nevada Division of Water Resources are cooperatively using a data base for managing well-log information for the State of Nevada. The Well-Log Data Base is part of an integrated system of computer data bases using the Ingres Relational Data-Base Management System, which allows efficient storage and access to water information from the State Engineer's office. The data base contains a main table, two ancillary tables, and nine lookup tables, as well as a menu-driven system for entering, updating, and reporting on the data. This reference guide outlines the general functions of the system and provides a brief description of data tables and data-entry screens.

  9. Novel Desorber for Online Drilling Mud Gas Logging

    PubMed Central

    Lackowski, Marcin; Tobiszewski, Marek; Namieśnik, Jacek

    2016-01-01

    This work presents the construction solution and experimental results of a novel desorber for online drilling mud gas logging. Traditional desorbers use mechanical mixing of the liquid to stimulate transfer of hydrocarbons to the gaseous phase that is further analyzed. The presented approach is based on transfer of hydrocarbons from the liquid to gas bubbles flowing through it, followed by gas analysis. The desorber was checked for gas logging from four different drilling muds collected from Polish boreholes. The results of optimization studies are also presented in this study. The comparison of the novel desorber with a commercial one reveals strong advantages of the novel one. It is characterized by much better hydrocarbon recovery efficiency and allows the whole analytical system to reach lower limits of detection. The presented desorber seems to be a very attractive alternative to widely used mechanical desorbers. PMID:27127674

  10. Novel Desorber for Online Drilling Mud Gas Logging.

    PubMed

    Lackowski, Marcin; Tobiszewski, Marek; Namieśnik, Jacek

    2016-01-01

    This work presents the construction solution and experimental results of a novel desorber for online drilling mud gas logging. Traditional desorbers use mechanical mixing of the liquid to stimulate transfer of hydrocarbons to the gaseous phase that is further analyzed. The presented approach is based on transfer of hydrocarbons from the liquid to gas bubbles flowing through it, followed by gas analysis. The desorber was checked for gas logging from four different drilling muds collected from Polish boreholes. The results of optimization studies are also presented in this study. The comparison of the novel desorber with a commercial one reveals strong advantages of the novel one. It is characterized by much better hydrocarbon recovery efficiency and allows the whole analytical system to reach lower limits of detection. The presented desorber seems to be a very attractive alternative to widely used mechanical desorbers.

  11. Predicting hospital visits from geo-tagged Internet search logs

    PubMed Central

    Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H.

    2016-01-01

    The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user’s future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources—a crucial prerequisite for securing healthcare access for everyone in the days to come. PMID:27570641
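
    A heavily simplified sketch of the general modeling idea (featurize search text, fit a classifier that outputs a visit probability); the queries, labels, and model choice below are synthetic illustrations, not the authors' de-identified data or their actual method.

```python
# Featurize search-log text and fit a classifier that outputs the probability
# of a future medical-facility visit. All data below is synthetic.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

queries = [
    "urgent care near me chest pain",
    "pharmacy hours aspirin dosage",
    "er wait times downtown hospital",
    "movie showtimes tonight",
]
visited_within_week = [1, 0, 1, 0]  # synthetic labels

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(queries, visited_within_week)
print(model.predict_proba(["symptoms of appendicitis nearest emergency room"])[0, 1])
```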

  12. A design method for an intuitive web site

    SciTech Connect

    Quinniey, M.L.; Diegert, K.V.; Baca, B.G.; Forsythe, J.C.; Grose, E.

    1999-11-03

    The paper describes a methodology for designing a web site for human factors engineers that is applicable to designing a web site for any group of people. Many web pages on the World Wide Web are not organized in a format that allows a user to efficiently find information. Often the information and hypertext links on web pages are not organized into intuitive groups. Intuition implies that a person is able to use their knowledge of a paradigm to solve a problem. Intuitive groups are categories that allow web page users to find information by using their intuition or mental models of categories. To improve human factors engineers' efficiency in finding information on the World Wide Web, research was performed to develop a web site that serves as a tool for finding information effectively. The paper describes a methodology for designing a web site for a group of people who perform similar tasks in an organization.

  13. Graph Structures and Algorithms for Query-Log Analysis

    NASA Astrophysics Data System (ADS)

    Donato, Debora

    Query logs are repositories that record all the interactions of users with a search engine. This incredibly rich user behavior data can be modeled using appropriate graph structures. In recent years there has been an increasing amount of literature on studying properties, models, and algorithms for query-log graphs. Understanding the structure of such graphs, modeling user querying patterns, and designing algorithms for leveraging the latent knowledge (also known as the wisdom of the crowds) in those graphs introduce new challenges in the field of graph mining. The main goal of this paper is to present the reader with an example of these graph structures, i.e., the Query-flow graph. This representation has been shown to be extremely effective for modeling user querying patterns and has been extensively used for developing real-time applications. Moreover, we present graph-based algorithmic solutions applied to problems appearing in web applications, such as query recommendation and user-session segmentation.
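
    A minimal sketch of a query-flow graph built from search sessions, where a directed edge q1 -> q2 counts how often q2 immediately follows q1 within a session; the sessions here are synthetic, and real query-flow graphs typically attach richer features and transition probabilities to each edge.

```python
import networkx as nx

# Synthetic sessions; each is an ordered list of queries from one user session.
sessions = [
    ["cheap flights", "cheap flights to rome", "rome hotels"],
    ["rome hotels", "rome weather"],
    ["cheap flights", "cheap flights to rome", "rome weather"],
]

G = nx.DiGraph()
for session in sessions:
    for q1, q2 in zip(session, session[1:]):
        if G.has_edge(q1, q2):
            G[q1][q2]["weight"] += 1   # reinforce an existing transition
        else:
            G.add_edge(q1, q2, weight=1)

for q1, q2, data in G.edges(data=True):
    print(f"{q1!r} -> {q2!r}: {data['weight']}")
```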

  14. Coal log pipeline research at the University of Missouri. 4th Quarterly report, October 1, 1993--December 31, 1993

    SciTech Connect

    Liu, H.

    1994-05-01

    This paper is a progress report on a research project aimed at the development of coal log technology. Efforts have been directed at the development of technology for the fabrication of stable coal logs, as well as the energy efficient transport of these logs, in particular by pipelines. Work has been directed at new types of binders, new fabrication presses, the application of polymers to reduce transport losses, and modeling efforts.

  15. Determination of log P values of new cyclen based antimalarial drug leads using RP-HPLC.

    PubMed

    Rudraraju, A V; Amoyaw, P N A; Hubin, T J; Khan, M O F

    2014-09-01

    Lipophilicity, expressed by log P, is an important physicochemical property of drugs that affects many biological processes, including drug absorption and distribution. The main purpose of this study is to determine the log P values of newly discovered drug leads using reversed-phase high-performance liquid chromatography (RP-HPLC). The reference standards, with varying polarity ranges, were dissolved in methanol and analyzed by RP-HPLC using a C18 column. The mobile phase consisted of a mixture of acetonitrile, methanol and water in a gradient elution mode. A calibration curve was plotted between the experimental log P values and the obtained log k values of the reference standard compounds, and a best-fit line was obtained. The log k values of the new drug leads were determined in the same solvent system and were used to calculate the respective log P values by using the best-fit equation. The log P vs. log k data gave a best-fit linear curve with an R² of 0.9786, with p-values of the intercept and slope of 1.19 × 10⁻⁶ and 1.56 × 10⁻¹⁰, respectively, at the 0.05 level of significance. Log P values of 15 new drug leads and related compounds, all of which are derivatives of macrocyclic polyamines and their metal complexes, were determined. The values obtained are closely related to the calculated log P (Clog P) values obtained using ChemDraw Ultra 12.0. This experiment provided efficient, fast and reasonable estimates of the log P values of the new drug leads using RP-HPLC.
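
    The calibration-curve workflow described above can be sketched as a simple linear fit of experimental log P against measured log k for the reference standards, then applied to new compounds; the numbers below are synthetic, not the study's data.

```python
import numpy as np

# Fit experimental log P of reference standards against their measured log k,
# then use the fitted line to estimate log P of new compounds. Synthetic values.
log_k_standards = np.array([-0.40, -0.05, 0.32, 0.71, 1.10])
log_p_standards = np.array([0.5, 1.4, 2.2, 3.1, 4.0])

slope, intercept = np.polyfit(log_k_standards, log_p_standards, 1)
r_squared = np.corrcoef(log_k_standards, log_p_standards)[0, 1] ** 2

log_k_new = np.array([0.15, 0.88])       # measured log k of new drug leads
log_p_new = slope * log_k_new + intercept
print(f"logP = {slope:.3f}*logk + {intercept:.3f}, R^2 = {r_squared:.4f}", log_p_new)
```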

  16. Evaluation of historical dry well surveillance logs

    SciTech Connect

    Price, R.K.

    1996-09-09

    Several dry well surveillance logs from 1975 through 1995 for the SX Tank Farm have been examined to identify potential subsurface zones of radioactive contaminant migration. Several dynamic conditions of the gamma-ray-emitting radioactive contaminants have been identified.

  17. Expansion of industrial logging in Central Africa.

    PubMed

    Laporte, Nadine T; Stabach, Jared A; Grosch, Robert; Lin, Tiffany S; Goetz, Scott J

    2007-06-08

    Industrial logging has become the most extensive land use in Central Africa, with more than 600,000 square kilometers (30%) of forest currently under concession. With use of a time series of satellite imagery for the period from 1976 to 2003, we measured 51,916 kilometers of new logging roads. The density of roads across the forested region was 0.03 kilometer per square kilometer, but areas of Gabon and Equatorial Guinea had values over 0.09 kilometer per square kilometer. A new frontier of logging expansion was identified within the Democratic Republic of Congo, which contains 63% of the remaining forest of the region. Tree felling and skid trails increased disturbance in selectively logged areas.

  18. Optimal message log reclamation for uncoordinated checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. K.

    1994-01-01

    Uncoordinated checkpointing for message-passing systems allows maximum process autonomy and general nondeterministic execution, but suffers from potential domino effect and the large space overhead for maintaining checkpoints and message logs. Traditionally, it has been assumed that only obsolete checkpoints and message logs before the global recovery line can be garbage-collected. Recently, an approach to identifying all garbage checkpoints based on recovery line transformation and decomposition has been developed. We show in this paper that the same approach can be applied to the problem of identifying all garbage message logs for systems requiring message logging to record in-transit messages. Communication trace-driven simulation for several parallel programs is used to evaluate the proposed algorithm.

  19. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-11-13

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via a wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole and without a pipe trip, providing both time savings and unique scientific advantages.

  20. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-01-30

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via a wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole and without a pipe trip, providing both time savings and unique scientific advantages.

  1. Optimal message log reclamation for uncoordinated checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. K.

    1994-01-01

    Uncoordinated checkpointing for message-passing systems allows maximum process autonomy and general nondeterministic execution, but suffers from potential domino effect and the large space overhead for maintaining checkpoints and message logs. Traditionally, it has been assumed that only obsolete checkpoints and message logs before the global recovery line can be garbage-collected. Recently, an approach to identifying all garbage checkpoints based on recovery line transformation and decomposition has been developed. We show in this paper that the same approach can be applied to the problem of identifying all garbage message logs for systems requiring message logging to record in-transit messages. Communication trace-driven simulation for several parallel programs is used to evaluate the proposed algorithm.

  2. Focused Crawling of the Deep Web Using Service Class Descriptions

    SciTech Connect

    Rocco, D; Liu, L; Critchlow, T

    2004-06-21

    Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  3. Conversation Threads Hidden within Email Server Logs

    NASA Astrophysics Data System (ADS)

    Palus, Sebastian; Kazienko, Przemysław

    Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from those email logs. Unfortunately, each email is a separate record, and those records are not tied to each other in any obvious way. In this paper, a method for discussion thread extraction is proposed, together with experiments on two different data sets: Enron and WrUT.
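
    A common baseline for this task links messages into threads via the Message-ID, In-Reply-To, and References headers; the sketch below shows only that baseline and is not the specific extraction method proposed in the paper.

```python
from email import message_from_string
from collections import defaultdict

# Toy raw messages; real logs would be parsed from server records.
raw_messages = [
    "Message-ID: <1@ex>\nSubject: budget\nFrom: a@ex\n\nDraft attached.",
    "Message-ID: <2@ex>\nIn-Reply-To: <1@ex>\nSubject: Re: budget\nFrom: b@ex\n\nLooks fine.",
    "Message-ID: <3@ex>\nSubject: lunch?\nFrom: c@ex\n\nNoon?",
]

# Map each message to its parent (None if it starts a thread).
parent = {}
for raw in raw_messages:
    msg = message_from_string(raw)
    parent[msg["Message-ID"]] = msg.get("In-Reply-To")

def thread_root(mid):
    """Follow parent links until reaching the first message of the thread."""
    while parent.get(mid):
        mid = parent[mid]
    return mid

threads = defaultdict(list)
for mid in parent:
    threads[thread_root(mid)].append(mid)
print(dict(threads))
```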

  4. Selective Logging, Fire, and Biomass in Amazonia

    NASA Technical Reports Server (NTRS)

    Houghton, R. A.

    1999-01-01

    Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics, logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests had been logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions of carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.

  5. Veneer recovery from Douglas-fir logs.

    Treesearch

    E.H. Clarke; A.C. Knauss

    1957-01-01

    During 1956, the Pacific Northwest Forest and Range Experiment Station made a series of six veneer-recovery studies in the Douglas-fir region of Oregon and Washington. The net volume of logs involved totaled approximately 777 M board-feet. Purpose of these studies was to determine volume recovery, by grade of veneer, from the four principal grades of Douglas-fir logs...

  6. Hispanic logging worker safety in the south

    Treesearch

    Brandon O' Neal; Bob Shaffer

    2006-01-01

    Hispanic (Spanish-speaking) workers have entered the logging workforce in the South in significant numbers during the past ten years. According to the U.S. Labor Department, Hispanic workers in the construction and agriculture industries have significantly higher injury rates than non–Hispanics do. In view of that trend, of logging workers’ generally high exposure to...

  7. 3D GPR Imaging of Wooden Logs

    NASA Astrophysics Data System (ADS)

    Halabe, Udaya B.; Pyakurel, Sandeep

    2007-03-01

    There has been a lack of an effective NDE technique to locate internal defects within wooden logs. The few available elastic wave propagation based techniques are limited to predicting E values. Other techniques such as X-rays have not been very successful in detecting internal defects in logs. If defects such as embedded metals could be identified before the sawing process, the saw mills could significantly increase their production by reducing the probability of damage to the saw blade and the associated downtime and the repair cost. Also, if the internal defects such as knots and decayed areas could be identified in logs, the sawing blade can be oriented to exclude the defective portion and optimize the volume of high valued lumber that can be obtained from the logs. In this research, GPR has been successfully used to locate internal defects (knots, decays and embedded metals) within the logs. This paper discusses GPR imaging and mapping of the internal defects using both 2D and 3D interpretation methodology. Metal pieces were inserted in a log and the reflection patterns from these metals were interpreted from the radargrams acquired using 900 MHz antenna. Also, GPR was able to accurately identify the location of knots and decays. Scans from several orientations of the log were collected to generate 3D cylindrical volume. The actual location of the defects showed good correlation with the interpreted defects in the 3D volume. The time/depth slices from 3D cylindrical volume data were useful in understanding the extent of defects inside the log.

  8. Challenges in mapping behaviours to activities using logs from a citizen science project

    NASA Astrophysics Data System (ADS)

    Morais, Alessandra M. M.; Guarino de Vasconcelos, Leandro; Santos, Rafael D. C.

    2016-05-01

    Citizen science projects are those which recruit volunteers to participate as assistants in scientific studies. Since these projects depend on volunteer efforts, understanding the motivation that drives a volunteer to collaborate is important to ensure their success. One way to understand motivation is by interviewing the volunteers. While this approach may elicit detailed information on the volunteers' motivation and actions, it is restricted to a subset of willing participants. For web-based projects we could instead use logs of volunteers' activities, which record which volunteer did what and when for all volunteers in a project. In this work we present some metrics that can be calculated from the logs, based on a model of interaction. We also comment on the applicability of those metrics, describe ongoing work that may yield more precise logs and metrics, and comment on issues for further research.
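
    A minimal sketch of the kind of metrics such activity logs support (actions per volunteer and the gaps between consecutive actions); the (volunteer, action, timestamp) record layout below is an assumption, not the project's actual log schema.

```python
# Hedged sketch of simple per-volunteer metrics computable from "who did what
# and when" activity logs; the record layout is an assumption.
from collections import defaultdict
from datetime import datetime

events = [
    ("vol-01", "classify", "2016-03-01T10:00:00"),
    ("vol-01", "classify", "2016-03-01T10:02:30"),
    ("vol-02", "comment",  "2016-03-02T18:15:00"),
]  # hypothetical log records

per_volunteer = defaultdict(list)
for volunteer, action, ts in events:
    per_volunteer[volunteer].append(datetime.fromisoformat(ts))

for volunteer, times in sorted(per_volunteer.items()):
    times.sort()
    gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    print(volunteer, "actions:", len(times), "gaps between actions (s):", gaps)
```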

  9. Logging legacies affect insect pollinator communities in southern Appalachian forests

    Treesearch

    Michelle M. Jackson; Monica G. Turner; Scott M. Pearson

    2014-01-01

    Many temperate deciduous forests are recovering from past logging, but the effects of logging legacies and environmental gradients on forest insect pollinators have not been well studied. In this study, we asked how pollinator abundance and community composition varied with distance from logging roads and elevation in old (logged >90 years ago) and young (logged 20–...

  10. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 5 2012-07-01 2012-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  11. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 1 2012-07-01 2012-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  12. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 1 2014-07-01 2013-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  13. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  14. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 5 2013-07-01 2013-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  15. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 1 2013-07-01 2013-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  16. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 1 2011-07-01 2011-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  17. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  18. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 5 2014-07-01 2014-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  19. CLARET user's manual: Mainframe Logs. Revision 1

    SciTech Connect

    Frobose, R.H.

    1984-11-12

    CLARET (Computer Logging and RETrieval) is a stand-alone PDP 11/23 system that can support 16 terminals. It provides a forms-oriented front end by which operators enter online activity logs for the Lawrence Livermore National Laboratory's OCTOPUS computer network. The logs are stored on the PDP 11/23 disks for later retrieval, and hardcopy reports are generated both automatically and upon request. Online viewing of the current logs is provided to management. As each day's logs are completed, the information is automatically sent to a CRAY and included in an online database system. The terminal used for the CLARET system is a dual-port Hewlett Packard 2626 terminal that can be used as either the CLARET logging station or as an independent OCTOPUS terminal. Because this is a stand-alone system, it does not depend on the availability of the OCTOPUS network to run and, in the event of a power failure, can be brought up independently.

  20. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  1. Deep Web video

    SciTech Connect

    None Available

    2009-06-01

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  2. Deep Web video

    ScienceCinema

    None Available

    2016-07-12

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  3. Financial returns under uncertainty for conventional and reduced-impact logging in permanent production forests of the Brazilian Amazon

    Treesearch

    Frederick Boltz; Douglas R. Carter; Thomas P. Holmes; Rodrigo Pereira

    2001-01-01

    Reduced-impact logging (RIL) techniques are designed to improve the efficiency of timber harvesting while mitigating its adverse effects on the forest ecosystem. Research on RIL in select tropical forest regions has demonstrated clear ecological benefits relative to conventional logging (CL) practices while the financial competitiveness of RIL is less conclusive. We...

  4. Minnesota logging utilization factors, 1975-1976--development, use, implications.

    Treesearch

    James E. Blyth; W. Brad Smith

    1979-01-01

    Discusses Minnesota saw log and pulpwood logging utilization factors developed during 1975-1976 and their implications. Compares factors for several species groups and shows their use in estimating growing stock cut for pulpwood and saw logs.

  5. 5. Log calving barn. Detail of wall corner showing half ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Log calving barn. Detail of wall corner showing half dovetail notching on hand-hewn logs. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT

  6. 55. VIEW OF STEAM-OPERATED LOG HOIST TO PUT INCOMING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    55. VIEW OF STEAM-OPERATED LOG HOIST TO PUT INCOMING LOGS INTO RALPH HULL LUMBER CO. LOG POND. PHOTOGRAPHER: UNKNOWN. DATE: 1942. COURTESY OF RALPH HULL. - Hull-Oakes Lumber Company, 23837 Dawson Road, Monroe, Benton County, OR

  7. Well log characterization of natural gas hydrates

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.

    2011-01-01

    In the last 25 years we have seen significant advancements in the use of downhole well logging tools to acquire detailed information on the occurrence of gas hydrate in nature: From an early start of using wireline electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells drilled in Arctic permafrost environments to today where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. The most established and well known use of downhole log data in gas hydrate research is the use of electrical resistivity and acoustic velocity data (both compressional- and shear-wave data) to make estimates of gas hydrate content (i.e., reservoir saturations) in various sediment types and geologic settings. New downhole logging tools designed to make directionally oriented acoustic and propagation resistivity log measurements have provided the data needed to analyze the acoustic and electrical anisotropic properties of both highly inter-bedded and fracture dominated gas hydrate reservoirs. Advancements in nuclear-magnetic-resonance (NMR) logging and wireline formation testing have also allowed for the characterization of gas hydrate at the pore scale. Integrated NMR and formation testing studies from northern Canada and Alaska have yielded valuable insight into how gas hydrates are physically distributed in sediments and the occurrence and nature of pore fluids (i.e., free-water along with clay and capillary bound water) in gas-hydrate-bearing reservoirs. Information on the distribution of gas hydrate at the pore scale has provided invaluable insight on the mechanisms controlling the formation and occurrence of gas hydrate in nature along with data on gas hydrate reservoir properties (i.e., permeabilities) needed to accurately predict gas production rates for various gas hydrate

  8. Thermal Properties of Bazhen fm. Sediments from Thermal Core Logging

    NASA Astrophysics Data System (ADS)

    Spasennykh, Mikhail; Popov, Evgeny; Popov, Yury; Chekhonin, Evgeny; Romushkevich, Raisa; Zagranovskaya, Dzhuliya; Belenkaya, Irina; Zhukov, Vladislav; Karpov, Igor; Saveliev, Egor; Gabova, Anastasia

    2016-04-01

    The Bazhen formation (B. fm.) is the largest self-contained source-and-reservoir continuous petroleum system, covering more than 1 million km2 (West Siberia, Russia). High lithological differentiation in Bazhen deposits, dominated by silicic shales and carbonates and accompanied by extremely high total organic carbon values (of up to 35%), pyrite content and a brittle mineralogical composition, complicates standard thermal property assessment for low-permeability rocks. Reliable information on the thermal characteristics of an unconventional system is a necessary part of work such as modelling of different processes in the reservoir under thermal EOR to assess their efficiency, developing and optimizing the design of oil recovery methods, interpreting well temperature logging data, and basin petroleum modelling. A unique set of data including thermal conductivity, thermal diffusivity, volumetric heat capacity and thermal anisotropy for the B. fm. rocks was obtained from thermal core logging (high-resolution continuous thermal profiling) on more than 4680 core samples (including 2000 B. fm. samples) along seven wells for four oil fields. Some systematic peculiarities of the relation between the thermal properties of the B. fm. rocks and their mineralogical composition and structural and textural properties were obtained. The high-resolution data are processed jointly with the standard petrophysical logging, which allowed us to provide better separation of the formation. The research work was done with financial support of the Russian Ministry of Education and Science (unique identification number RFMEFI58114X0008).

  9. The use of web ontology languages and other semantic web tools in drug discovery.

    PubMed

    Chen, Huajun; Xie, Guotong

    2010-05-01

    To optimize drug development processes, pharmaceutical companies require principled approaches to integrate disparate data on a unified infrastructure, such as the web. The semantic web, developed on the web technology, provides a common, open framework capable of harmonizing diversified resources to enable networked and collaborative drug discovery. We survey the state of art of utilizing web ontologies and other semantic web technologies to interlink both data and people to support integrated drug discovery across domains and multiple disciplines. Particularly, the survey covers three major application categories including: i) semantic integration and open data linking; ii) semantic web service and scientific collaboration and iii) semantic data mining and integrative network analysis. The reader will gain: i) basic knowledge of the semantic web technologies; ii) an overview of the web ontology landscape for drug discovery and iii) a basic understanding of the values and benefits of utilizing the web ontologies in drug discovery. i) The semantic web enables a network effect for linking open data for integrated drug discovery; ii) The semantic web service technology can support instant ad hoc collaboration to improve pipeline productivity and iii) The semantic web encourages publishing data in a semantic way such as resource description framework attributes and thus helps move away from a reliance on pure textual content analysis toward more efficient semantic data mining.
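
    As a concrete illustration of the linked-data idea the review surveys, the sketch below describes a drug-target relationship as RDF triples and queries it with SPARQL via rdflib; the vocabulary URIs and facts are invented placeholders, not a real drug ontology.

```python
# Hedged sketch of the "linked data" pattern: a drug and its target described
# as RDF triples and queried with SPARQL; the example.org terms are invented.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/drug/")
g = Graph()
g.add((EX.gleevec, RDF.type, EX.Drug))
g.add((EX.gleevec, EX.inhibits, EX.BCR_ABL))
g.add((EX.BCR_ABL, EX.label, Literal("BCR-ABL fusion kinase")))

results = g.query("""
    PREFIX ex: <http://example.org/drug/>
    SELECT ?target WHERE { ex:gleevec ex:inhibits ?target . }
""")
for row in results:
    print(row.target)   # prints the URI of the inhibited target
```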

  10. From Web 2.0 to Teacher 2.0

    ERIC Educational Resources Information Center

    Thomas, David A.; Li, Qing

    2008-01-01

    The World Wide Web is evolving in response to users who demand faster and more efficient access to information, portability, and reusability of digital objects between Web-based and computer-based applications and powerful communication, publication, collaboration, and teaching and learning tools. This article reviews current uses of Web-based…

  11. From Web 2.0 to Teacher 2.0

    ERIC Educational Resources Information Center

    Thomas, David A.; Li, Qing

    2008-01-01

    The World Wide Web is evolving in response to users who demand faster and more efficient access to information, portability, and reusability of digital objects between Web-based and computer-based applications and powerful communication, publication, collaboration, and teaching and learning tools. This article reviews current uses of Web-based…

  12. Spider Webs

    NASA Image and Video Library

    2016-10-19

    This image shows a lava channel north of Kuiper Crater in the high southern latitudes just before spring equinox. It was a target suggested by members of the public, using our suggestion tool called HiWish. The channel confluence at the top of the image illustrates interesting volcanic processes that took place long ago. However, it was the mounds on the rim of the channel to the south of the confluence that we initially found alarming. These mounds, up to 400 meters in diameter, are decorated by radial and concentric patterns that resemble spider webs. Radial and concentric fractures are familiar from forces penetrating a brittle layer, such as a rock thrown through a glass window. These particular fractures were evidently produced by something emerging from below the brittle surface of Mars. It seems likely that ice lenses, resulting from the accumulation of ice beneath the surface, created these peculiar mounds. Ice is less dense than rock, so the buried ice rose and pushed upwards on the surface and generated these spider web-like patterns. An analogous process creates similar sized mounds in arctic tundra on Earth that are known as "pingos," an Inuit word. The Martian fractures in this location are nowadays filled with dust instead of ice, so it is unclear how long ago this activity took place. It seems likely that these pingo-forming periglacial processes took place much more recently than the volcanic activity also evident in this region of Mars. http://photojournal.jpl.nasa.gov/catalog/PIA21110

  13. Investigating the Web Structure by Isolated Stars

    NASA Astrophysics Data System (ADS)

    Uno, Yushi; Ota, Yoshinobu; Uemichi, Akio

    The link structure of the Web is generally represented by the webgraph, and it is often used for web structure mining, which mainly aims to find hidden communities on the Web. In this paper, we identify a common frequent substructure and give it a formal graph definition, which we call an isolated star (i-star), and propose an efficient enumeration algorithm for i-stars. We then investigate the structure of the Web by enumerating i-stars from real web data. As a result, we observed that most i-stars correspond to index structures in single domains, while some of them are verified to be candidates for communities, which implies the validity of i-stars as a useful substructure for web structure mining and link-spam detection. We also observed that the distributions of i-star sizes follow a power law, which is further evidence of the scale-freeness of the webgraph.
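
    The paper's formal i-star definition is not reproduced in this record, so the following is only a rough stand-in: under the simplifying assumption that an i-star is a page whose out-neighbours are pointed to only by that page and have no out-links of their own, enumeration reduces to a single pass over the adjacency lists.

```python
# Rough stand-in for i-star enumeration under an assumed, simplified definition;
# not the paper's algorithm. The toy edge list is invented.
from collections import defaultdict

edges = [("hub", "p1"), ("hub", "p2"), ("hub", "p3"), ("other", "q")]  # toy webgraph
out_nbrs, in_nbrs = defaultdict(set), defaultdict(set)
for u, v in edges:
    out_nbrs[u].add(v)
    in_nbrs[v].add(u)

def isolated_stars(min_leaves=2):
    for hub, leaves in out_nbrs.items():
        if len(leaves) >= min_leaves and all(
                in_nbrs[v] == {hub} and not out_nbrs.get(v) for v in leaves):
            yield hub, sorted(leaves)

print(list(isolated_stars()))  # [('hub', ['p1', 'p2', 'p3'])]
```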

  14. Well log evaluation of gas hydrate saturations

    USGS Publications Warehouse

    Collett, Timothy S.

    1998-01-01

    The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters, one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The "standard" and "quick look" Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.
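
    The "standard" Archie calculation mentioned above can be sketched as follows; the constants a, m, n and the log readings are illustrative assumptions, and gas-hydrate saturation is taken simply as the pore space not occupied by water. This is not the authors' code.

```python
# Minimal sketch of the Archie relation for water saturation, with gas-hydrate
# saturation taken as 1 - Sw; constants and example readings are assumptions.

def archie_water_saturation(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def hydrate_saturation(rw, rt, phi, **kw):
    """Gas-hydrate saturation as the pore space not occupied by water."""
    return 1.0 - archie_water_saturation(rw, rt, phi, **kw)

if __name__ == "__main__":
    # Hypothetical values: formation-water resistivity 0.3 ohm-m, deep
    # resistivity 40 ohm-m, porosity 35% -- chosen only to exercise the formula.
    print(f"Sh = {hydrate_saturation(rw=0.3, rt=40.0, phi=0.35):.2f}")
```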

  15. Well log evaluation of gas hydrate saturations

    USGS Publications Warehouse

    Collett, T.S.

    1998-01-01

    The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters, one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The "standard" and "quick look" Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in all of the gas hydrate accumulations assessed in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.

  16. Web Mining: Machine Learning for Web Applications.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Chau, Michael

    2004-01-01

    Presents an overview of machine learning research and reviews methods used for evaluating machine learning systems. Ways that machine-learning algorithms were used in traditional information retrieval systems in the "pre-Web" era are described, and the field of Web mining and how machine learning has been used in different Web mining…

  17. Web Mining: Machine Learning for Web Applications.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Chau, Michael

    2004-01-01

    Presents an overview of machine learning research and reviews methods used for evaluating machine learning systems. Ways that machine-learning algorithms were used in traditional information retrieval systems in the "pre-Web" era are described, and the field of Web mining and how machine learning has been used in different Web mining…

  18. Workspaces in the Semantic Web

    NASA Technical Reports Server (NTRS)

    Wolfe, Shawn R.; Keller, RIchard M.

    2005-01-01

    Due to the recency and relatively limited adoption of Semantic Web technologies, practical issues related to technology scaling have received less attention than foundational issues. Nonetheless, these issues must be addressed if the Semantic Web is to realize its full potential. In particular, we concentrate on the lack of scoping methods that reduce the size of semantic information spaces so they are more efficient to work with and more relevant to an agent's needs. We provide some intuition to motivate the need for such reduced information spaces, called workspaces, give a formal definition, and suggest possible methods of deriving them.

  19. New excitation method for acoustic logging transmitters

    NASA Astrophysics Data System (ADS)

    Zhang, K.; Ju, X. D.; Tan, B. H.; Lu, J. Q.; Men, B. Y.; Wu, W. H.; Chen, J. Y.

    2017-08-01

    The traditional transducer coupled excitation method for acoustic logging tools has many disadvantages given its transformer characteristics. A new excitation method that uses a complex programmable logic device and vertical metal oxide semiconductor field effect transistor is proposed in this study. Theoretical calculations, finite element analyses and acoustic experiments are performed to compare the acoustic and electrical characteristics of the new and the traditional methods. Results show that the acoustic waves emitted by the new method have lower dominant frequencies and approximately four times more acoustic energy than those emitted by the transformer coupled method. Furthermore, the adjustment functions and channel consistencies of the new electrical circuit are better than those of the traditional method. All the results indicate that the logging tools that use the new method are more flexible and accurate with better detection depth and sensitivity. This method has already been used in logging tools and has improved their performance.

  20. Unconventional neutron sources for oil well logging

    NASA Astrophysics Data System (ADS)

    Frankle, C. M.; Dale, G. E.

    2013-09-01

    Americium-Beryllium (AmBe) radiological neutron sources have been widely used in the petroleum industry for well logging purposes. There is strong desire on the part of various governmental and regulatory bodies to find alternate sources due to the high activity and small size of AmBe sources. Other neutron sources are available, both radiological (252Cf) and electronic accelerator driven (D-D and D-T). All of these, however, have substantially different neutron energy spectra from AmBe and thus cause significantly different responses in well logging tools. We report on simulations performed using unconventional sources and techniques to attempt to better replicate the porosity and carbon/oxygen ratio responses a well logging tool would see from AmBe neutrons. The AmBe response of these two types of tools is compared to the response from 252Cf, D-D, D-T, filtered D-T, and T-T sources.

  1. Spreadsheet log analysis in subsurface geology

    USGS Publications Warehouse

    Doveton, J.H.

    2000-01-01

    Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.

  2. Statistical factor analysis technique for characterizing basalt through interpreting nuclear and electrical well logging data (case study from Southern Syria).

    PubMed

    Asfahani, Jamal

    2014-02-01

    Factor analysis technique is proposed in this research for interpreting the combination of nuclear well logging, including natural gamma ray, density and neutron-porosity, and the electrical well logging of long and short normal, in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs make it possible to establish the lithological score cross-section of the studied well. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt and the alteration basalt product, clay. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large volumes of well logging data with a high number of variables need to be interpreted. © 2013 Elsevier Ltd. All rights reserved.
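
    A minimal sketch of the workflow the abstract describes, factor analysis of a multi-variable well-log matrix to obtain score logs, using scikit-learn; the variable names, synthetic data and choice of four factors are assumptions, not the paper's setup.

```python
# Hedged sketch: factor analysis of a well-log matrix to obtain "score logs".
# Variable names, synthetic data, and the four-factor choice are assumptions.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
variables = ["gamma", "density", "neutron_porosity", "long_normal", "short_normal"]
logs = pd.DataFrame(rng.normal(size=(300, len(variables))), columns=variables)
# one synthetic row per depth step; real input would be the measured log traces

X = StandardScaler().fit_transform(logs[variables])
fa = FactorAnalysis(n_components=4, random_state=0)
scores = fa.fit_transform(X)                      # the four score logs

for i in range(scores.shape[1]):
    logs[f"score_{i + 1}"] = scores[:, i]
print(pd.DataFrame(fa.components_, columns=variables))  # factor loadings
```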

  3. Coal log pipeline research at University of Missouri. Fourth quarterly report for 1995, October 1, 1995--December 30, 1995

    SciTech Connect

    Liu, H.

    1995-12-31

    The purpose of this project is to design and develop fast and efficient machines for manufacturing high quality coal logs. During the last three months, efforts were focused on the revision and improvement of the design of the 300-ton hydraulic press machine for coal log production. The conceptual design of the machine has been sent to Automated Resources, Inc. for review. Experiments were conducted on threshold binder (orimulsion) concentration. They showed that the initial weight loss of coal logs (due to chipping of corners) is unaffected by the binder concentration unless the concentration is 1% or more. For binder levels above 0.25%, more binder causes less coal log wear after a long time or a large number of cycles of circulation through the pipe. After 250 cycles in the pipe, binderless coal logs suffer approximately twice the wear of logs with 1% binder.

  4. Enhancing DSN Operations Efficiency with the Discrepancy Reporting Management System (DRMS)

    NASA Technical Reports Server (NTRS)

    Chatillon, Mark; Lin, James; Cooper, Tonja M.

    2003-01-01

    The DRMS is the Discrepancy Reporting Management System used by the Deep Space Network (DSN). It uses a web interface and is a management tool designed to track and manage: data outage incidents during spacecraft tracks against equipment and software, known as DRs (Discrepancy Reports); to record "out of pass" incident logs against equipment and software in a Station Log; to record instances where equipment has been restarted or reset as Reset records; and to electronically record equipment readiness status across the DSN. Tracking and managing these items increases DSN operational efficiency by providing: the ability to establish the operational history of equipment items, data on the quality of service provided to the DSN customers, the ability to measure service performance, early insight into processes, procedures and interfaces that may need updating or changing, and the capability to trace a data outage to a software or hardware change. The items listed above help the DSN to focus resources on areas of most need.

  5. Enhancing DSN Operations Efficiency with the Discrepancy Reporting Management System (DRMS)

    NASA Technical Reports Server (NTRS)

    Chatillon, Mark; Lin, James; Cooper, Tonja M.

    2003-01-01

    The DRMS is the Discrepancy Reporting Management System used by the Deep Space Network (DSN). It uses a web interface and is a management tool designed to track and manage: data outage incidents during spacecraft tracks against equipment and software, known as DRs (Discrepancy Reports); to record "out of pass" incident logs against equipment and software in a Station Log; to record instances where equipment has been restarted or reset as Reset records; and to electronically record equipment readiness status across the DSN. Tracking and managing these items increases DSN operational efficiency by providing: the ability to establish the operational history of equipment items, data on the quality of service provided to the DSN customers, the ability to measure service performance, early insight into processes, procedures and interfaces that may need updating or changing, and the capability to trace a data outage to a software or hardware change. The items listed above help the DSN to focus resources on areas of most need.

  6. Web-Based Learning Programs: Use by Learners with Various Cognitive Styles

    ERIC Educational Resources Information Center

    Chen, Ling-Hsiu

    2010-01-01

    To consider how a Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…

  7. Computer Cache. Online Recess--Web Games for Play and Fun

    ERIC Educational Resources Information Center

    Byerly, Greg; Brodie, Carolyn S.

    2005-01-01

    There are many age-appropriate, free, and easy-to-use online games available on the Web. In this column the authors describe some of their favorites for use with and by elementary students. They have not included games that require children to log on and/or register with their names or play against someone else interactively over the Web. None of…

  8. Web-Based Learning Programs: Use by Learners with Various Cognitive Styles

    ERIC Educational Resources Information Center

    Chen, Ling-Hsiu

    2010-01-01

    To consider how a Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…

  9. Lithologic logs and geophysical logs from test drilling in Palm Beach County, Florida, since 1974

    USGS Publications Warehouse

    Swayze, Leo J.; McGovern, Michael C.; Fischer, John N.

    1980-01-01

    Test-hole data that may be used to determine the hydrogeology of the zone of high permeability in Palm Beach County, Fla., are presented. Lithologic logs from 46 test wells and geophysical logs from 40 test wells are contained in this report. (USGS)

  10. The Causes of Logging Truck Delays on Two West Virginia Logging Operations

    Treesearch

    John E. Baumgras

    1978-01-01

    Logging truck downtime increases timber harvesting costs. To determine the extent and causes of truck delays, four logging trucks on two separate operations were monitored for a 7-month period by recording speedometers and with tallies of delay causes. The results show the number of truck delays per shift, their duration, and the total delay time per shift for eight...

  11. Comparison of logging residue from lump sum and log scale timber sales.

    Treesearch

    James O Howard; Donald J. DeMars

    1985-01-01

    Data from 1973 and 1980 logging residues studies were used to compare the volume of residue from lump sum and log scale timber sales. Covariance analysis was used to adjust the mean volume for each data set for potential variation resulting from differences in stand conditions. Mean residue volumes from the two sale types were significantly different at the 5-percent...

  12. RAYSAW: a log sawing simulator for 3D laser-scanned hardwood logs

    Treesearch

    R. Edward. Thomas

    2013-01-01

    Laser scanning of hardwood logs provides detailed high-resolution imagery of log surfaces. Characteristics such as sweep, taper, and crook, as well as most surface defects, are visible to the eye in the scan data. In addition, models have been developed that predict interior knot size and position based on external defect information. Computerized processing of...

  13. Financial feasibility of a log sort yard handling small-diameter logs: A preliminary study

    Treesearch

    Han-Sup Han; E. M. (Ted) Bilek; John (Rusty) Dramm; Dan Loeffler; Dave Calkin

    2011-01-01

    The value and use of the trees removed in fuel reduction thinning and restoration treatments could be enhanced if the wood were effectively evaluated and sorted for quality and highest value before delivery to the next manufacturing destination. This article summarizes a preliminary financial feasibility analysis of a log sort yard that would serve as a log market to...

  14. Relationships between log N-log S and celestial distribution of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Nishimura, J.; Yamagami, T.

    1985-01-01

    The apparent conflict between the log N-log S curve and the isotropic celestial distribution of gamma-ray bursts is discussed. A possible selection effect due to the time profile of each burst is examined. It is shown that the contradiction is due to this selection effect of the gamma-ray bursts.

  15. LogSafe and Smart: Minnesota OSHA's LogSafe Program Takes Root.

    ERIC Educational Resources Information Center

    Honerman, James

    1999-01-01

    Logging is now the most dangerous U.S. occupation. The Occupational Safety and Health Administration (OSHA) developed specialized safety training for the logging industry but has been challenged to reach small operators. An OSHA-approved state program in Minnesota provides annual safety seminars to about two-thirds of the state's full-time…

  16. The Design of Plywood Webs for Airplane Wing Beams

    NASA Technical Reports Server (NTRS)

    Trayer, George W

    1931-01-01

    This report deals with the design of plywood webs for wooden box beams to obtain maximum strength per unit weight. A method of arriving at the most efficient and economical web thickness, and hence the most suitable unit shear stress, is presented and working stresses in shear for various types of webs and species of plywood are given. The questions of diaphragm spacing and required glue area between the webs and flange are also discussed.

  17. Impact of logging on aboveground biomass stocks in lowland rain forest, Papua New Guinea.

    PubMed

    Bryan, Jane; Shearman, Phil; Ash, Julian; Kirkpatrick, J B

    2010-12-01

    Greenhouse-gas emissions resulting from logging are poorly quantified across the tropics. There is a need for robust measurement of rain forest biomass and the impacts of logging from which carbon losses can be reliably estimated at regional and global scales. We used a modified Bitterlich plotless technique to measure aboveground live biomass at six unlogged and six logged rain forest areas (coupes) across two approximately 3000-ha regions at the Makapa concession in lowland Papua New Guinea. "Reduced-impact logging" is practiced at Makapa. We found the mean unlogged aboveground biomass in the two regions to be 192.96 +/- 4.44 Mg/ha and 252.92 +/- 7.00 Mg/ha (mean +/- SE), which was reduced by logging to 146.92 +/- 4.58 Mg/ha and 158.84 +/- 4.16, respectively. Killed biomass was not a fixed proportion, but varied with unlogged biomass, with 24% killed in the lower-biomass region, and 37% in the higher-biomass region. Across the two regions logging resulted in a mean aboveground carbon loss of 35 +/- 2.8 Mg/ha. The plotless technique proved efficient at estimating mean aboveground biomass and logging damage. We conclude that substantial bias is likely to occur within biomass estimates derived from single unreplicated plots.
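
    As a quick consistency check, the reported carbon loss of about 35 Mg/ha follows from averaging the biomass reductions in the two regions and assuming that roughly half of dry biomass is carbon (the 0.5 carbon fraction is an assumption, not stated in the record).

```python
# Consistency check of the ~35 Mg/ha carbon loss reported above, assuming a
# carbon fraction of ~0.5 of dry biomass (an assumption on our part).
biomass_loss_region_1 = 192.96 - 146.92        # Mg/ha
biomass_loss_region_2 = 252.92 - 158.84        # Mg/ha
mean_biomass_loss = (biomass_loss_region_1 + biomass_loss_region_2) / 2
carbon_loss = 0.5 * mean_biomass_loss
print(f"{carbon_loss:.1f} Mg C/ha")            # ~35 Mg/ha, matching the abstract
```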

  18. Generating event logs from non-process-aware systems enabling business process mining

    NASA Astrophysics Data System (ADS)

    Pérez-Castillo, Ricardo; Weber, Barbara; Pinggera, Jakob; Zugal, Stefan; García-Rodríguez de Guzmán, Ignacio; Piattini, Mario

    2011-08-01

    As information systems age they become legacy information systems (LISs), embedding business knowledge not present in other artefacts. LISs must be modernised when their maintainability falls below acceptable limits but the embedded business knowledge is valuable information that must be preserved to align the modernised versions of LISs with organisations' real-world business processes. Business process mining permits the discovery and preservation of all meaningful embedded business knowledge by using event logs, which represent the business activities executed by an information system. Event logs can be easily obtained through the execution of process-aware information systems (PAISs). However, several non-process-aware information systems also implicitly support organisations' business processes. This article presents a technique for obtaining event logs from traditional information systems (without any in-built logging functionality) by statically analysing and modifying LISs. The technique allows the modified systems to dynamically record event logs. The approach is validated with a case study involving a healthcare information system used in Austrian hospitals, which shows the technique obtains event logs that effectively and efficiently enable the discovery of embedded business processes. This implies the techniques provided within the process mining field, which are based on event logs, may also be applied to traditional information systems.
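
    The general idea of making an otherwise log-less application emit process-mining-style event records (case id, activity, timestamp) can be sketched as follows. The decorator pattern, field names and CSV format are assumptions for illustration; the article's actual technique works by statically analysing and modifying the legacy system itself.

```python
# Hedged sketch: emitting (case id, activity, timestamp) event records from an
# application with no built-in logging; not the paper's implementation.
import csv
import functools
from datetime import datetime, timezone

EVENT_LOG = "event_log.csv"

def log_event(case_id, activity):
    """Append one event record to the CSV event log."""
    with open(EVENT_LOG, "a", newline="") as f:
        csv.writer(f).writerow(
            [case_id, activity, datetime.now(timezone.utc).isoformat()])

def logged(activity):
    """Decorator that records an event each time the wrapped function runs."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(case_id, *args, **kwargs):
            log_event(case_id, activity)
            return fn(case_id, *args, **kwargs)
        return inner
    return wrap

@logged("admit_patient")               # hypothetical business activity
def admit_patient(case_id, name):
    return f"admitted {name} under case {case_id}"

admit_patient("case-001", "Jane Doe")
```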

  19. Wave functions of log-periodic oscillators

    SciTech Connect

    Bessa, V.; Guedes, I.

    2011-06-15

    We use the Lewis and Riesenfeld invariant method [J. Math. Phys. 10, 1458 (1969)] and a unitary transformation to obtain the exact Schroedinger wave functions for time-dependent harmonic oscillators exhibiting log-periodic-type behavior. For each oscillator we calculate the quantum fluctuations in the coordinate and momentum as well as the quantum correlations between the coordinate and momentum. We observe that the oscillator with m = m0 t/t0 and ω = ω0 t0/t, which exhibits an exact log-periodic oscillation, behaves as the harmonic oscillator with m and ω constant.

  20. Development of pulsed neutron uranium logging instrument.

    PubMed

    Wang, Xin-guang; Liu, Dan; Zhang, Feng

    2015-03-01

    This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of (235)U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and that the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.
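
    Since the uranium-bearing index is reported to be linearly correlated with uranium content, a calibration can be as simple as a least-squares line fitted to calibration-well data; the sketch below uses invented numbers, not the instrument's data.

```python
# Hedged sketch of a linear calibration between a uranium-bearing index and
# uranium content, as implied by the reported linear correlation; values invented.
import numpy as np

index = np.array([0.12, 0.25, 0.41, 0.58, 0.77])       # uranium-bearing index
grade = np.array([0.010, 0.021, 0.035, 0.050, 0.066])  # uranium content (fraction)

slope, intercept = np.polyfit(index, grade, 1)          # least-squares line
print(f"grade ~ {slope:.3f} * index + {intercept:.4f}")
```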

  1. Compacting a Kentucky coal for quality logs

    SciTech Connect

    Lin, Y.; Li, Z.; Mao, S.

    1999-07-01

    A Kentucky coal was found to be more difficult to compact into large, strong logs. The study showed that the compaction parameters affecting the strength of compacted coal logs could be categorized into three groups. The first group is coal inherent properties such as elasticity and coefficient of friction, the second group is machine properties such as mold geometry, and the third group is coal mixture preparation parameters such as particle size distribution. Theoretical analysis showed that an appropriate backpressure can reduce the surface cracks that occur during ejection. This has been confirmed by the experiments conducted.

  2. Development of pulsed neutron uranium logging instrument

    SciTech Connect

    Wang, Xin-guang; Liu, Dan; Zhang, Feng

    2015-03-15

    This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of 235U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and that the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.

  3. MAIL LOG, program summary and specifications

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The summary and specifications for obtaining the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records existing in the data base; and (4) archive - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.

  4. Quality of the log-geometric distribution extrapolation for smaller undiscovered oil and gas pool size

    USGS Publications Warehouse

    Chenglin, L.; Charpentier, R.R.

    2010-01-01

    The U.S. Geological Survey procedure for estimating the general form of the parent distribution requires that the parameters of the log-geometric distribution be calculated and analyzed for their sensitivity to different conditions. In this study, we derive the shape factor of a log-geometric distribution from the ratio of frequencies between adjacent bins. The shape factor has a log straight-line relationship with the ratio of frequencies. Additionally, equations for calculating the ratio of the mean size to the lower size-class boundary are deduced. For a specific log-geometric distribution, we find that the ratio of the mean size to the lower size-class boundary is constant. We apply our analysis to simulations based on oil and gas pool distributions from four petroleum systems of Alberta, Canada, and four generated distributions. Each petroleum system in Alberta has a different shape factor. Generally, the shape factors in the four petroleum systems stabilize as the number of discovered pools increases. For a log-geometric distribution, the shape factor becomes stable when the number of discovered pools exceeds 50, and the shape factor is influenced by the exploration efficiency when the exploration efficiency is less than 1. The simulation results show that calculated shape factors increase with those of the parent distributions, and undiscovered oil and gas resources estimated through log-geometric distribution extrapolation are smaller than the actual values. © 2010 International Association for Mathematical Geology.
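
    One simple reading of the adjacent-bin idea above: if pool counts per size class decline log-geometrically, the ratio between adjacent bin frequencies is roughly constant, and a shape parameter can be taken from its logarithm. The bin counts and the exact definition of the shape factor below are assumptions, not the paper's parameterization.

```python
# Hedged sketch: estimating a shape parameter from adjacent-bin frequency
# ratios of a log-geometric pool-size distribution; counts and the shape-factor
# definition are assumptions, not the paper's exact formulation.
import math

bin_counts = [120, 64, 30, 17, 8, 4]   # hypothetical discovered pools per size class

ratios = [bin_counts[i + 1] / bin_counts[i] for i in range(len(bin_counts) - 1)]
mean_ratio = math.exp(sum(math.log(r) for r in ratios) / len(ratios))  # geometric mean
shape_factor = -math.log(mean_ratio)   # assumed definition: log of the decline rate

print(f"adjacent-bin ratio ~ {mean_ratio:.2f}, shape factor ~ {shape_factor:.2f}")
```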

  5. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    SciTech Connect

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and Netlogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular, we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what their runtimes were, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. This poster shows the scalability of the system by presenting results of uploading task execution records into the system and of querying the system for overall workflow performance information.
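
    A minimal sketch of the per-task summaries described above (which tasks ran, where, and for how long), computed from parsed execution-log records; the record layout is an assumption, not the actual Pegasus/Netlogger schema.

```python
# Hedged sketch of summarizing task-execution records (counts and mean runtime
# per task type); the parsed-record layout below is an assumption.
from collections import defaultdict
from statistics import mean

records = [
    {"task": "cybershake.sgt", "host": "tg-node-12", "runtime_s": 431.0},
    {"task": "cybershake.sgt", "host": "tg-node-07", "runtime_s": 405.5},
    {"task": "cybershake.post", "host": "tg-node-12", "runtime_s": 58.2},
]  # hypothetical parsed log entries

by_task = defaultdict(list)
for rec in records:
    by_task[rec["task"]].append(rec["runtime_s"])

for task, runtimes in by_task.items():
    print(f"{task}: n={len(runtimes)}, mean runtime {mean(runtimes):.1f} s")
```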

  6. Online Persistence in Higher Education Web-Supported Courses

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2011-01-01

    This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…
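
    One simple persistence measure derivable from such log files is the span between a student's first and last logged action per course enrollment; the column names below are assumptions, not Moodle's actual log schema.

```python
# Hedged sketch of a per-enrollment persistence measure from web-log timestamps;
# the column names and sample rows are assumptions.
import pandas as pd

log = pd.DataFrame({
    "enrollment_id": [1, 1, 1, 2, 2],
    "timestamp": pd.to_datetime([
        "2008-10-01 09:00", "2008-11-15 20:30", "2009-01-20 08:10",
        "2008-10-03 11:00", "2008-10-20 22:45"]),
})

span = (log.groupby("enrollment_id")["timestamp"]
           .agg(lambda ts: (ts.max() - ts.min()).days)
           .rename("active_days_span"))
print(span)   # days between first and last logged action per enrollment
```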

  7. Use of an Academic Library Web Site Search Engine.

    ERIC Educational Resources Information Center

    Fagan, Jody Condit

    2002-01-01

    Describes an analysis of the search engine logs of Southern Illinois University, Carbondale's library to determine how patrons used the site search. Discusses results that showed patrons did not understand the function of the search and explains improvements that were made in the Web site and in online reference services. (Author/LRW)
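
    A minimal sketch of the query-log tally such an analysis rests on; the tab-separated timestamp-and-query line format is an assumption.

```python
# Hedged sketch: tally the most frequent queries in a site-search log; the
# log file name and its tab-separated layout are assumptions.
from collections import Counter

def top_queries(path, n=10):
    counts = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) >= 2 and parts[1].strip():
                counts[parts[1].strip().lower()] += 1
    return counts.most_common(n)

# Example usage (file name hypothetical):
# for query, hits in top_queries("search_engine.log"):
#     print(f"{hits:6d}  {query}")
```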

  8. Adolescents' Web-Based Literacies, Identity Construction, and Skill Development

    ERIC Educational Resources Information Center

    Alvermann, Donna E.; Marshall, James D.; McLean, Cheryl A.; Huddleston, Andrew P.; Joaquin, Jairus; Bishop, John

    2012-01-01

    Five qualitative multiple-case studies document adolescents' uses of Web-based resources and digital literacy skills to construct their online identities. Working from a perspective that integrates new literacies with multimodality, the researchers enlisted the help of five high school students who kept daily logs of the websites they visited for…

  9. Antibiotic Pollution in Marine Food Webs in Laizhou Bay, North China: Trophodynamics and Human Exposure Implication.

    PubMed

    Liu, Sisi; Zhao, Hongxia; Lehmler, Hans-Joachim; Cai, Xiyun; Chen, Jingwen

    2017-02-21

    Little information is available about the bioaccumulation and biomagnification of antibiotics in marine food webs. Here, we investigate the levels and trophic transfer of 9 sulfonamide (SA), 5 fluoroquinolone (FQ), and 4 macrolide (ML) antibiotics, as well as trimethoprim in nine invertebrate and ten fish species collected from a marine food web in Laizhou Bay, North China in 2014 and 2015. All the antibiotics were detected in the marine organisms, with SAs and FQs being the most abundant antibiotics. Benthic fish accumulated more SAs than invertebrates and pelagic fish, while invertebrates exhibited higher FQ levels than fish. Generally, SAs and trimethoprim biomagnified in the food web, while the FQs and MLs were biodiluted. Trophic magnification factors (TMF) were 1.2-3.9 for SAs and trimethoprim, 0.3-1.0 for FQs and MLs. Limited biotransformation and relatively high assimilation efficiencies are the likely reasons for the biomagnification of SAs. The pH dependent distribution coefficients (log D) but not the lipophilicity (log KOW) of SAs and FQs had a significant correlation (r = 0.73; p < 0.05) with their TMFs. Although the calculated estimated daily intakes (EDI) for antibiotics suggest that consumption of seafood from Laizhou Bay is not associated with significant human health risks, this study provides important insights into the guidance of risk management of antibiotics.
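
    Trophic magnification factors of the kind reported here are conventionally estimated as 10 raised to the slope of a regression of log10 concentration on trophic level; the sketch below follows that convention with invented numbers, not the study's data.

```python
# Conventional TMF calculation (TMF = 10**slope of log10(concentration) vs.
# trophic level); the concentrations and trophic levels below are invented.
import numpy as np

trophic_level = np.array([2.1, 2.8, 3.2, 3.6, 4.1])
conc_ng_g = np.array([0.8, 1.5, 2.4, 3.9, 7.1])       # hypothetical antibiotic levels

slope, intercept = np.polyfit(trophic_level, np.log10(conc_ng_g), 1)
tmf = 10 ** slope
print(f"slope = {slope:.2f}, TMF = {tmf:.2f}")        # TMF > 1 implies biomagnification
```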

  10. Children's Literature Web Sites.

    ERIC Educational Resources Information Center

    Yokota, Junko; Cai, Mingshui

    2002-01-01

    Presents annotations of approximately 80 web sites that range in coverage from idiosyncratic and focused to diverse and comprehensive metasites. Notes categories of sites include: children's literature web guides; trade book publisher web sites; author/illustrator sites (metasites and individual); book review sources and teaching ideas; web sites…

  11. EPA Web Taxonomy

    EPA Pesticide Factsheets

    EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's Web-based information assets. EPA's Web Taxonomy is being provided in Simple Knowledge Organization System (SKOS) format. SKOS is a standard for sharing and linking knowledge organization systems that promises to make Federal terminology resources more interoperable.

  12. Modelling tropical forests response to logging

    NASA Astrophysics Data System (ADS)

    Cazzolla Gatti, Roberto; Di Paola, Arianna; Valentini, Riccardo; Paparella, Francesco

    2013-04-01

    Tropical rainforests are among the ecosystems most threatened by large-scale fragmentation due to human activities such as heavy logging and agricultural clearance, even though they provide crucial ecosystem goods and services, such as sequestering carbon from the atmosphere, protecting watersheds and conserving biodiversity. In several countries forest resource extraction has experienced a shift from clearcutting to selective logging to maintain a significant forest cover and stock of living biomass. However, knowledge of the short- and long-term effects of removing selected species in tropical rainforests is scarce and needs to be further investigated. One of the main effects of selective logging on forest dynamics seems to be the local disturbance, which involves the invasion of open space by weeds, vines and climbers at the expense of the late-successional cenosis. We present a simple deterministic model that describes the dynamics of tropical rainforest subject to selective logging in order to understand how and why weeds displace native species. We argue that the selective removal of the tallest tropical trees creates gaps of light that allow weeds, vines and climbers to prevail over native species, inhibiting the recovery of the original vegetation. Our results show that different regime shifts may occur depending on the type of forest management adopted. This hypothesis is supported by a dataset of tree heights and weed/vine cover that we collected from 9 plots located in Central and West Africa, in both untouched and managed areas.

  13. Stream macroinvertebrate response to clearcut logging

    Treesearch

    J. Bruce Wallace; Damon. Ely

    2014-01-01

    Why study response of stream invertebrates to watershed disturbances such as clearcut logging? Stream invertebrates can be excellent integrators of changes in such ecosystem phenomena as changes in the food base of ecosystems. For example, a number of invertebrate taxa appear to track changes in food resources. Many taxa also exhibit substrate-specific as well as taxon...

  14. The Design Log: A New Informational Tool

    ERIC Educational Resources Information Center

    Spivak, Mayer

    1978-01-01

    The design log is a record of observations, diagnoses, prescriptions, and performance specifications for each space in a structure. It is a systematic approach to design that integrates information about user needs with traditional architectural programming and design. (Author/MLF)

  15. Disturbance during logging stimulates regeneration of koa

    Treesearch

    Paul G. Scowcroft; Robert E. Nelson

    1976-01-01

    The abundance and distribution of Acacia koa regeneration after logging were studied on a 500-acre (202-ha) tract of koa forest heavily infested with Passiflora mollissima vines on the island of Hawaii. Koa seedling density was about three times greater in disturbed areas than in undisturbed ones. Most of the koa seedlings in...

  16. Alaska midgrade logs: supply and offshore demand.

    Treesearch

    Donald F. Flora; Wendy J. McGinnis

    1989-01-01

    The outlook for shipments and prices of export logs from Alaska differs significantly by grade (quality class). For the majority lying in the middle of the value range, the trend of prices is projected to increase $200 per thousand board feet, or about 55 percent, by 2000. Shipments are expected to rise about 30 percent by 1995 and then subside about 10 percent. These...

  17. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... statement of any corrective action taken. (5) Entries must be made giving details of all work performed... written log signed by the operator who supervised or performed the work and, unless the operator is.... (6) An entry at least once every thirty days that the batteries or other reserve power sources have...

  18. Identifying unstable sites on logging roads

    Treesearch

    R. M. Rice; J. Lewis

    1986-01-01

    Logging roads are an important source of forestry-related erosion. The amount of erosion on a forest road is determined by the interaction between how the road is constructed and maintained and the environment in which it is built. The roads in this study were constructed with large bulldozers, and most excavated material was sidecast. The roads studied were...

  19. [Human development and log-periodic law].

    PubMed

    Cash, Roland; Chaline, Jean; Nottale, Laurent; Grou, Pierre

    2002-05-01

    We suggest applying the log-periodic law formerly used to describe various crisis phenomena, in biology (evolutionary leaps), inorganic systems (earthquakes), societies and economy (economic crisis, market crashes) to the various steps of human ontogeny. We find a statistically significant agreement between this model and the data.

  20. 29 CFR 1910.266 - Logging operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR OCCUPATIONAL SAFETY AND HEALTH STANDARDS Special Industries § 1910.266 Logging operations. (a) Table of..., troughs, railings, screens, mats, or platforms, or by location, to prevent injury. Health care provider. A...