Sample records for logging techniques developed

  1. Development of a 3D log sawing optimization system for small sawmills in central Appalachia, US

    Treesearch

    Wenshu Lin; Jingxin Wang; Edward Thomas

    2011-01-01

    A 3D log sawing optimization system was developed to perform log generation, opening face determination, sawing simulation, and lumber grading using 3D modeling techniques. Heuristic and dynamic programming algorithms were used to determine opening face and grade sawing optimization. Positions and shapes of internal log defects were predicted using a model developed by...

  2. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O'Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. It also provides a means of comparing two or more grading systems, or a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  3. An integrated 3D log processing optimization system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang

    2013-01-01

    An integrated 3D log processing optimization system was developed to perform 3D log generation, opening face determination, headrig log sawing simulation, flitch edging and trimming simulation, cant resawing, and lumber grading. A circular cross-section model, together with 3D modeling techniques, was used to reconstruct 3D virtual logs. Internal log defects (knots)...

  4. Research and development of improved geothermal well logging techniques, tools and components (current projects, goals and status). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamers, M.D.

    One of the key needs in the advancement of geothermal energy is the availability of adequate subsurface measurements to aid the reservoir engineer in the development and operation of geothermal wells. Some current projects sponsored by the U.S. Department of Energy's Division of Geothermal Energy pertaining to the development of improved well logging techniques, tools, and components are described. An attempt is made to show how these projects, as key elements of the overall program goals, contribute to the improvement of geothermal logging technology.

  5. Deriving Criteria-supporting Benchmark Values from Empirical Response Relationships: Comparison of Statistical Techniques and Effect of Log-transforming the Nutrient Variable

    EPA Science Inventory

    In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor-response relationships. However, there is little guidance for choosing among techniques, and the extent to which log-transfor...

  6. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

    There is a gap in aerial logging techniques that has to be filled: a simple, safe, sizeable system must be developed before aerial logging will become effective and accepted in the logging industry. This paper presents such a system, designed on simple principles with realistic cost and ecological benefits.

  7. Reduction of lithologic-log data to numbers for use in the digital computer

    USGS Publications Warehouse

    Morgan, C.O.; McNellis, J.M.

    1971-01-01

    The development of a standardized system for conveniently coding lithologic-log data for use in the digital computer has long been needed. The technique suggested involves a reduction of the original written alphanumeric log to a numeric log by use of computer programs. This numeric log can then be retrieved as a written log, interrogated for pertinent information, or analyzed statistically. © 1971 Plenum Publishing Corporation.
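The reduction the record describes, written lithologic terms mapped to machine-readable numbers, can be sketched as below. The coding scheme and record layout here are hypothetical illustrations, not the actual 1971 USGS codes:

```python
# Hypothetical numeric codes for lithologic terms; the real 1971 scheme differs.
LITHOLOGY_CODES = {"clay": 10, "silt": 20, "sand": 30, "gravel": 40, "limestone": 50}

def encode_log(entries):
    """Reduce written lithologic-log entries (top_ft, bottom_ft, description)
    to numeric records suitable for retrieval and statistical analysis."""
    numeric = []
    for top, bottom, description in entries:
        code = LITHOLOGY_CODES.get(description.lower().strip(), 0)  # 0 = unclassified
        numeric.append((top, bottom, code))
    return numeric

written_log = [(0, 12, "Clay"), (12, 30, "Sand"), (30, 45, "Gravel")]
print(encode_log(written_log))  # [(0, 12, 10), (12, 30, 30), (30, 45, 40)]
```

Once encoded, the numeric log can be filtered ("all sand intervals") or summarized statistically, which is the point of the reduction.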

  8. Locating knots by industrial tomography- A feasibility study

    Treesearch

    Fred W. Taylor; Francis G. Wagner; Charles W. McMillin; Ira L. Morgan; Forrest F. Hopkins

    1984-01-01

    Industrial photon tomography was used to scan four southern pine logs and one red oak log. The logs were scanned at 16 cross-sectional slice planes located 1 centimeter apart along their longitudinal axes. Tomographic reconstructions were made from the scan data collected at these slice planes, and a cursory image analysis technique was developed to locate the log...

  9. Thorium normalization as a hydrocarbon accumulation indicator for Lower Miocene rocks in Ras Ghara area, Gulf of Suez, Egypt

    NASA Astrophysics Data System (ADS)

    El-Khadragy, A. A.; Shazly, T. F.; AlAlfy, I. M.; Ramadan, M.; El-Sawy, M. Z.

    2018-06-01

    An exploration method has been developed that uses surface and aerial gamma-ray spectral measurements to prospect for petroleum in stratigraphic and structural traps. The Gulf of Suez is an important region for studying hydrocarbon potential in Egypt. The thorium normalization technique was applied to the sandstone reservoirs in the region to determine hydrocarbon-potential zones using the three spectrometric radioactive gamma-ray logs (eU, eTh, and K% logs). This method was applied to the recorded gamma-ray spectrometric logs for the Rudeis and Kareem Formations in the Ras Ghara oil field, Gulf of Suez, Egypt. The conventional well logs (gamma-ray, resistivity, neutron, density, and sonic logs) were analyzed to determine the net pay zones in the study area. The agreement ratios between the thorium normalization technique and the results of the well log analyses are high, so the thorium normalization technique can serve as a guide to hydrocarbon accumulation in the studied reservoir rocks.

  10. Evaluation of residual oil saturation after waterflood in a carbonate reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verma, M.K.; Boucherit, M.; Bouvier, L.

    Four different approaches, including special core analysis (SCAL), log-inject-log, thermal-decay-time (TDT) logs, and material balance, were used to narrow the range of residual oil saturation (ROS) after waterflood, S_orw, in a carbonate reservoir in Qatar to between 23% and 27%. An equation was developed that relates S_orw to connate-water saturation, S_wi, and porosity. This paper presents the results of S_orw determinations with four different techniques: core waterflood followed by centrifuging, log-inject-log, TDT logging, and material balance.

  11. Standard weight (Ws) equations for four rare desert fishes

    USGS Publications Warehouse

    Didenko, A.V.; Bonar, Scott A.; Matter, W.J.

    2004-01-01

    Standard weight (Ws) equations have been used extensively to examine body condition in sport fishes. However, development of these equations for nongame fishes has only recently been emphasized. We used the regression-line-percentile technique to develop standard weight equations for four rare desert fishes: flannelmouth sucker Catostomus latipinnis, razorback sucker Xyrauchen texanus, roundtail chub Gila robusta, and humpback chub G. cypha. The Ws equation for flannelmouth suckers of 100-690 mm total length (TL) was developed from 17 populations: log10Ws = -5.180 + 3.068 log10TL. The Ws equation for razorback suckers of 110-885 mm TL was developed from 12 populations: log10Ws = -4.886 + 2.985 log10TL. The Ws equation for roundtail chub of 100-525 mm TL was developed from 20 populations: log10Ws = -5.065 + 3.015 log10TL. The Ws equation for humpback chub of 120-495 mm TL was developed from 9 populations: log10Ws = -5.278 + 3.096 log10TL. These equations meet criteria for acceptable standard weight indexes and can be used to calculate relative weight, an index of body condition.
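The four equations share one functional form, log10(Ws) = a + b·log10(TL). A minimal sketch of applying them to compute relative weight (Wr = 100·W/Ws, the body-condition index the record mentions), using the coefficients quoted in the abstract:

```python
import math

# Coefficients (a, b) from the abstract: log10(Ws) = a + b*log10(TL),
# with Ws in grams and TL in millimeters.
WS_COEFFS = {
    "flannelmouth sucker": (-5.180, 3.068),
    "razorback sucker":    (-4.886, 2.985),
    "roundtail chub":      (-5.065, 3.015),
    "humpback chub":       (-5.278, 3.096),
}

def standard_weight(species, tl_mm):
    """Standard weight Ws (g) for a fish of total length tl_mm."""
    a, b = WS_COEFFS[species]
    return 10 ** (a + b * math.log10(tl_mm))

def relative_weight(species, tl_mm, weight_g):
    """Relative weight Wr = 100 * W / Ws; Wr near 100 indicates a fish
    in typical condition for its length."""
    return 100.0 * weight_g / standard_weight(species, tl_mm)
```

For example, a 400 mm flannelmouth sucker weighing 600 g yields a Wr a little below 100, i.e. slightly lighter than the length-standardized norm. Note each equation is valid only over the TL range stated in the abstract.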

  12. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application to making clinical and administrative decisions for the management of hospital activities.
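An event log, in the process-mining sense used here, is minimally a set of (case id, activity, timestamp) triples. The sketch below uses a hypothetical hospital log and computes the directly-follows relation, a standard building block of discovery algorithms; it is an illustration of the technique, not the paper's actual HIS component:

```python
from collections import Counter

# Hypothetical HIS event log: (case_id, activity, timestamp) triples,
# the minimal attributes process-mining tools require.
event_log = [
    ("P1", "admission", "2015-03-01T08:00"),
    ("P1", "triage",    "2015-03-01T08:20"),
    ("P1", "discharge", "2015-03-02T10:00"),
    ("P2", "admission", "2015-03-01T09:00"),
    ("P2", "triage",    "2015-03-01T09:25"),
    ("P2", "surgery",   "2015-03-01T13:00"),
    ("P2", "discharge", "2015-03-03T11:00"),
]

def directly_follows(log):
    """Count activity pairs that directly follow each other within a case;
    this relation underlies discovery algorithms such as the alpha miner."""
    traces = {}
    # ISO-8601 timestamps sort lexicographically, so a plain sort orders events.
    for case, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
        traces.setdefault(case, []).append(activity)
    pairs = Counter()
    for trace in traces.values():
        pairs.update(zip(trace, trace[1:]))
    return pairs

print(directly_follows(event_log).most_common(3))
```

Frequencies over this relation are what let analysts see, for example, that triage always follows admission but surgery occurs only on some paths.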

  13. Bio-logging of physiological parameters in higher marine vertebrates

    NASA Astrophysics Data System (ADS)

    Ponganis, Paul J.

    2007-02-01

    Bio-logging of physiological parameters in higher marine vertebrates had its origins in the field of bio-telemetry in the 1960s and 1970s. The development of microprocessor technology allowed its first application to bio-logging investigations of Weddell seal diving physiology in the early 1980s. Since that time, with the use of increased memory capacity, new sensor technology, and novel data processing techniques, investigators have examined heart rate, temperature, swim speed, stroke frequency, stomach function (gastric pH and motility), heat flux, muscle oxygenation, respiratory rate, diving air volume, and oxygen partial pressure (P) during diving. Swim speed, heart rate, and body temperature have been the most commonly studied parameters. Bio-logging investigation of pressure effects has only been conducted with the use of blood samplers and nitrogen analyses on animals diving at isolated dive holes. The advantages/disadvantages and limitations of recording techniques, probe placement, calibration techniques, and study conditions are reviewed.

  14. Measuring ecological impacts from logging in natural forests of the eastern Amazonia as a tool to assess forest degradation

    Treesearch

    Marco W Lentini; Johan C Zweede; Thomas P Holmes

    2010-01-01

    Sound forest management practices have been seen as a promising strategy for combining forest conservation with rural economic development in Amazônia. However, the implementation of Reduced Impact Logging (RIL) techniques in the field has been incipient, while most of the Amazonian timber production is generated through predatory and illegal logging. Despite several...

  15. New Factorization Techniques and Parallel O(log N) Algorithms for Forward Dynamics Solution of Single Closed-Chain Robot Manipulators

    NASA Technical Reports Server (NTRS)

    Fijany, Amir

    1993-01-01

    In this paper, parallel O(log N) algorithms for dynamic simulation of a single closed-chain rigid multibody system, specialized to the case of a robot manipulator in contact with the environment, are developed.

  16. A New Approach to Logging.

    ERIC Educational Resources Information Center

    Miles, Donna

    2001-01-01

    In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…

  17. Transverse vibration techniques : logs to structural systems

    Treesearch

    Robert J. Ross

    2008-01-01

    Transverse vibration as a nondestructive testing and evaluation technique was first examined in the early 1960s. Initial research and development efforts focused on clear wood, lumber, and laminated products. Out of those efforts, tools were developed that are used today to assess lumber properties. Recently, use of this technique has been investigated for evaluating a...

  18. Automated lithology prediction from PGNAA and other geophysical logs.

    PubMed

    Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T

    2006-02-01

    Different methods of lithology prediction from geophysical data have been developed in the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural-gamma), and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, based on statistical analysis. Limited tests suggest that PGNAA logging data can be used to predict lithology: a success rate of 73% for lithology prediction was achieved from PGNAA logging data alone. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.

  19. Novel medium-throughput technique for investigating drug-cyclodextrin complexation by pH-metric titration using the partition coefficient method.

    PubMed

    Dargó, Gergő; Boros, Krisztina; Péter, László; Malanga, Milo; Sohajda, Tamás; Szente, Lajos; Balogh, György T

    2018-05-05

    The present study aimed to develop a medium-throughput screening technique for investigating cyclodextrin (CD)-active pharmaceutical ingredient (API) complexes. Dual-phase potentiometric lipophilicity measurement, the gold-standard technique, was combined with the partition coefficient method (plotting the reciprocal of the partition coefficients of APIs as a function of CD concentration). A general equation was derived for determining the stability constants of 1:1 CD-API complexes (K1:1,CD) based solely on the changes in partition coefficients (logPo/w,N - logPapp,N), without measuring the actual API concentrations. The experimentally determined logP value (-1.64) of 6-deoxy-6[(5/6)-fluoresceinylthioureido]-HPBCD (FITC-NH-HPBCD) was used to estimate the logP value (≈ -2.5 to -3) of (2-hydroxypropyl)-β-cyclodextrin (HPBCD). The results suggested that the amount of HPBCD in the octanol phase can be considered inconsequential. The decrease in octanol volume due to octanol-CD complexation was accounted for by introducing a corrected octanol-water phase ratio. The K1:1,CD values obtained by the developed method showed good accordance with the results from other orthogonal methods. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Measurement of the distribution of ventilation-perfusion ratios in the human lung with proton MRI: comparison with the multiple inert-gas elimination technique.

    PubMed

    Sá, Rui Carlos; Henderson, A Cortney; Simonson, Tatum; Arai, Tatsuya J; Wagner, Harrieth; Theilmann, Rebecca J; Wagner, Peter D; Prisk, G Kim; Hopkins, Susan R

    2017-07-01

    We have developed a novel functional proton magnetic resonance imaging (MRI) technique to measure the regional ventilation-perfusion (V̇A/Q̇) ratio in the lung. We conducted a comparison study of this technique in healthy subjects (n = 7, age = 42 ± 16 yr, forced expiratory volume in 1 s = 94% predicted) by comparing data measured using MRI to data obtained from the multiple inert-gas elimination technique (MIGET). Regional ventilation measured in a sagittal lung slice using specific ventilation imaging was combined with proton density measured using a fast gradient-echo sequence to calculate regional alveolar ventilation, registered with perfusion images acquired using arterial spin labeling, and divided on a voxel-by-voxel basis to obtain the regional V̇A/Q̇ ratio. LogSDV̇ and LogSDQ̇, measures of heterogeneity derived from the standard deviation (log scale) of the ventilation and perfusion vs. V̇A/Q̇ ratio histograms, respectively, were calculated. On a separate day, subjects underwent study with MIGET, and LogSDV̇ and LogSDQ̇ were calculated from MIGET data using the 50-compartment model. MIGET LogSDV̇ and LogSDQ̇ were normal in all subjects. LogSDQ̇ was highly correlated between MRI and MIGET (R = 0.89, P = 0.007); the intercept was not significantly different from zero (-0.062, P = 0.65) and the slope did not differ significantly from identity (1.29, P = 0.34). MIGET and MRI measures of LogSDV̇ were well correlated (R = 0.83, P = 0.02); the intercept differed from zero (0.20, P = 0.04) and the slope deviated from the line of identity (0.52, P = 0.01). We conclude that in normal subjects there is reasonable agreement between MIGET measures of heterogeneity and those from proton MRI measured in a single slice of lung. NEW & NOTEWORTHY We report a comparison of a new proton MRI technique to measure the regional V̇A/Q̇ ratio against the multiple inert-gas elimination technique (MIGET). The study reports good relationships between measures of heterogeneity derived from MIGET and those derived from MRI. Although currently limited to a single-slice acquisition, these data suggest that single sagittal slice measures of the V̇A/Q̇ ratio provide an adequate means to assess heterogeneity in the normal lung. Copyright © 2017 the American Physiological Society.

  1. The Economics of Reduced Impact Logging in the American Tropics: A Review of Recent Initiatives

    Treesearch

    Frederick Boltz; Thomas P. Holmes; Douglas R. Carter

    1999-01-01

    Programs aimed at developing and implementing reduced-impact logging (RIL) techniques are currently underway in important forest regions of Latin America, given the importance of timber production in the American tropics to national and global markets. RIL efforts focus upon planning and extraction methods which lessen harvest impact on residual commercial timber...

  2. Proposed standard-weight (W(s)) equations for kokanee, golden trout and bull trout

    USGS Publications Warehouse

    Hyatt, M.H.; Hubert, W.A.

    2000-01-01

    We developed standard-weight (Ws) equations for kokanee (lacustrine Oncorhynchus nerka), golden trout (O. aguabonita), and bull trout (Salvelinus confluentus) using the regression-line-percentile technique. The Ws equation for kokanee of 120-550 mm TL is log10 Ws = -5.062 + 3.033 log10 TL, where Ws is in grams and TL is total length in millimeters; the English-unit equivalent is log10 Ws = -3.458 + 3.033 log10 TL, where Ws is in pounds and TL is total length in inches. The Ws equation for golden trout of 120-530 mm TL is log10 Ws = -5.088 + 3.041 log10 TL, with the English-unit equivalent being log10 Ws = -3.473 + 3.041 log10 TL. The Ws equation for bull trout of 120-850 mm TL is log10 Ws = -5.327 + 3.115 log10 TL, with the English-unit equivalent being log10 Ws = -3.608 + 3.115 log10 TL.
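The metric and English-unit forms of each equation are unit conversions of one another: converting pounds to grams and inches to millimeters shifts only the intercept, by log10(453.59237) - b·log10(25.4). A short consistency check of the quoted coefficients:

```python
import math

# Metric coefficients (intercept, slope) from the abstract (Ws in g, TL in mm)
# and the quoted English-unit intercepts (Ws in lb, TL in in); slope is shared.
METRIC = {
    "kokanee":      (-5.062, 3.033),
    "golden trout": (-5.088, 3.041),
    "bull trout":   (-5.327, 3.115),
}
ENGLISH_INTERCEPT = {"kokanee": -3.458, "golden trout": -3.473, "bull trout": -3.608}

G_PER_LB, MM_PER_IN = 453.59237, 25.4

for species, (a_metric, b) in METRIC.items():
    # Rewriting log10 Ws(lb) = a_en + b*log10 TL(in) in grams and millimeters:
    # a_metric = a_en + log10(G_PER_LB) - b * log10(MM_PER_IN)
    derived = ENGLISH_INTERCEPT[species] + math.log10(G_PER_LB) - b * math.log10(MM_PER_IN)
    assert abs(derived - a_metric) < 0.005, species
print("metric and English intercepts agree to rounding precision")
```

All three pairs agree to the three decimal places reported, which is a useful sanity check when transcribing such equations.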

  3. SpaceOps 2012 Plus 2: Social Tools to Simplify ISS Flight Control Communications and Log Keeping

    NASA Technical Reports Server (NTRS)

    Cowart, Hugh S.; Scott, David W.

    2014-01-01

    A paper written for the SpaceOps 2012 Conference (Simplify ISS Flight Control Communications and Log Keeping via Social Tools and Techniques) identified three innovative concepts for real-time flight control communications tools based on social mechanisms: a) Console Log Tool (CoLT) - a log-keeping application at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) that provides "anywhere" access, comment, and notification features similar to those found in Social Networking Systems (SNS); b) Cross-Log Communication via Social Techniques - a concept from Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) that would use microblogging's @tag and #tag protocols to make information/requests visible and/or discoverable in logs owned by @Destination addressees; and c) Communications Dashboard (CommDash) - an MSFC concept for a Facebook-like interface to visually integrate and manage basic console log content, text chat streams analogous to voice loops, text chat streams dedicated to particular conversations, generic and position-specific status displays/streams, and a graphically based hailing display. CoLT was deployed operationally at nearly the same time as SpaceOps 2012, the Cross-Log Communication idea is currently waiting for a champion to carry it forward, and CommDash was approved as a NASA Information Technology (IT) Labs project. This paper discusses lessons learned from two years of actual CoLT operations, updates CommDash prototype development status, discusses the potential for using Cross-Log Communication in the MCC-H and/or POIC environments, and considers other ways of synergizing console applications.

  4. MID Plot: a new lithology technique. [Matrix identification plot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clavier, C.; Rust, D.H.

    1976-01-01

    Lithology interpretation by the Litho-Porosity (M-N) method has been used for years, but is evidently too cumbersome and ambiguous for widespread acceptance as a field technique. To set aside these objections, another method has been devised. Instead of the log-derived parameters M and N, the MID Plot uses quasi-physical quantities, (ρma)a and (Δtma)a, as its porosity-independent variables. These parameters, taken from suitably scaled Neutron-Density and Sonic-Neutron crossplots, define a unique matrix mineral or mixture for each point on the logs. The matrix points on the MID Plot thus remain constant in spite of changes in mud filtrate, porosity, or neutron tool types (all of which significantly affect the M-N Plot). This new development is expected to bring welcome relief in areas where lithology identification is a routine part of log analysis.

  5. Application of borehole geophysics to water-resources investigations

    USGS Publications Warehouse

    Keys, W.S.; MacCary, L.M.

    1971-01-01

    This manual is intended to be a guide for hydrologists using borehole geophysics in ground-water studies. The emphasis is on the application and interpretation of geophysical well logs, and not on the operation of a logger. It describes in detail those logging techniques that have been utilized within the Water Resources Division of the U.S. Geological Survey, and those used in petroleum investigations that have potential application to hydrologic problems. Most of the logs described can be made by commercial logging service companies, and many can be made with small water-well loggers. The general principles of each technique and the rules of log interpretation are the same, regardless of differences in instrumentation. Geophysical well logs can be interpreted to determine the lithology, geometry, resistivity, formation factor, bulk density, porosity, permeability, moisture content, and specific yield of water-bearing rocks, and to define the source, movement, and chemical and physical characteristics of ground water. Numerous examples of logs are used to illustrate applications and interpretation in various ground-water environments. The interrelations between various types of logs are emphasized, and the following aspects are described for each of the important logging techniques: Principles and applications, instrumentation, calibration and standardization, radius of investigation, and extraneous effects.

  6. Development of Enabling Scientific Tools to Characterize the Geologic Subsurface at Hanford

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenna, Timothy C.; Herron, Michael M.

    2014-07-08

    This final report to the Department of Energy provides a summary of activities conducted under our exploratory grant, funded through the U.S. DOE Subsurface Biogeochemical Research Program in the category of enabling scientific tools, covering the period from July 15, 2010 to July 14, 2013. The main goal of this exploratory project is to determine the parameters necessary to translate existing borehole log data into reservoir properties following scientifically sound petrophysical relationships. For this study, we focused on samples and Ge-based spectral gamma logging system (SGLS) data collected from wells located in the Hanford 300 Area. The main activities consisted of 1) the analysis of available core samples for a variety of mineralogical, chemical, and physical properties; 2) evaluation of selected spectral gamma logs, environmental corrections, and calibration; and 3) development of algorithms and a proposed workflow that permits translation of log responses into useful reservoir properties such as lithology, matrix density, porosity, and permeability. These techniques have been successfully employed in the petroleum industry; however, the approach is relatively new when applied to subsurface remediation. This exploratory project has been successful in meeting its stated objectives. We have demonstrated that our approach can lead to an improved interpretation of existing well log data. The algorithms we developed can utilize available log data, in particular gamma and spectral gamma logs, and continued optimization will improve their application to ERSP goals of understanding subsurface properties.

  7. DEVELOPMENT AND APPLICATION OF BOREHOLE FLOWMETERS FOR ENVIRONMENTAL ASSESSMENT

    EPA Science Inventory

    In order to understand the origin of contaminant plumes and infer their future migration, one requires knowledge of the hydraulic conductivity (K) distribution. In many aquifers, the borehole flowmeter offers the most direct technique available for developing a log of hydraulic ...

  8. Comparison of various techniques for calibration of AIS data

    NASA Technical Reports Server (NTRS)

    Roberts, D. A.; Yamaguchi, Y.; Lyon, R. J. P.

    1986-01-01

    The Airborne Imaging Spectrometer (AIS) samples a region which is strongly influenced by decreasing solar irradiance at longer wavelengths and strong atmospheric absorptions. Four techniques, the Log Residual, the Least Upper Bound Residual, the Flat Field Correction, and calibration using field reflectance measurements, were investigated as means of removing these two features. Of the four techniques, field reflectance calibration proved to be superior in terms of noise and normalization. Of the other three techniques, the Log Residual was superior when applied to areas which did not contain one dominant cover type. In heavily vegetated areas, the Log Residual proved to be ineffective. After removing anomalously bright data values, the Least Upper Bound Residual proved to be almost as effective as the Log Residual in sparsely vegetated areas and much more effective in heavily vegetated areas. Of all the techniques, the Flat Field Correction was the noisiest.
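The abstract does not define the Log Residual, so the sketch below assumes the commonly cited formulation: subtract each pixel's mean log radiance (spectral mean) and each band's mean log radiance (spatial mean) from the log data, adding back the grand mean. Data here are synthetic:

```python
import numpy as np

def log_residual(cube):
    """Log Residual normalization of an image cube (pixels x bands),
    assuming the standard definition: log radiance minus its per-pixel
    (spectral) mean and per-band (spatial) mean, plus the grand mean."""
    logd = np.log(cube)
    spectral_mean = logd.mean(axis=1, keepdims=True)  # mean over bands, per pixel
    spatial_mean = logd.mean(axis=0, keepdims=True)   # mean over pixels, per band
    return logd - spectral_mean - spatial_mean + logd.mean()

rng = np.random.default_rng(0)
cube = rng.uniform(0.1, 1.0, size=(50, 20))  # synthetic 50-pixel, 20-band cube
lr = log_residual(cube)
# By construction the residual's per-band and per-pixel means are ~0,
# which is what removes both the illumination and the topographic factors.
print(np.allclose(lr.mean(axis=0), 0), np.allclose(lr.mean(axis=1), 0))
```

This double centering is also why the method degrades when one cover type dominates, as the abstract notes: the spatial mean then absorbs the dominant material's spectrum rather than just the irradiance term.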

  9. The Spontaneous Ray Log: A New Aid for Constructing Pseudo-Synthetic Seismograms

    NASA Astrophysics Data System (ADS)

    Quadir, Adnan; Lewis, Charles; Rau, Ruey-Juin

    2018-02-01

    Conventional synthetic seismograms for hydrocarbon exploration combine the sonic and density logs, whereas pseudo-synthetic seismograms are constructed with a density log plus a resistivity, neutron, gamma ray, or rarely a spontaneous potential log. Herein, we introduce a new technique for constructing a pseudo-synthetic seismogram by combining the gamma ray (GR) and self-potential (SP) logs to produce the spontaneous ray (SR) log. Three wells, each of which consisted of more than 1000 m of carbonates, sandstones, and shales, were investigated; each well was divided into 12 Groups based on formation tops, and the Pearson product-moment correlation coefficient (PCC) was calculated for each "Group" from each of the GR, SP, and SR logs. The highest PCC-valued log curves for each Group were then combined to produce a single log whose values were cross-plotted against the reference well's sonic ITT values to determine a linear transform for producing a pseudo-sonic (PS) log and, ultimately, a pseudo-synthetic seismogram. The acceptable Nash-Sutcliffe efficiency (NSE) range for the pseudo-sonic logs of the three wells was 78-83%. The technique was tested on three wells, one of which was used as a blind test well, with satisfactory results. The PCC value between the composite PS (SR) log with low-density correction and the conventional sonic (CS) log was 86%. Because of the common occurrence of spontaneous potential and gamma ray logs in many of the hydrocarbon basins of the world, this inexpensive and straightforward technique could hold significant promise in areas that need alternate ways to create pseudo-synthetic seismograms for seismic reflection interpretation.
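The core of the workflow, choosing the best-correlated log curve and fitting a linear transform to sonic interval transit time, can be sketched as follows on synthetic data. This is an illustration only: the paper selects the best curve per stratigraphic Group and uses the GR/SP-derived SR log, whereas this sketch selects one curve for a whole well:

```python
import numpy as np

def pearson(x, y):
    """Pearson product-moment correlation coefficient (PCC)."""
    return np.corrcoef(x, y)[0, 1]

def pseudo_sonic(candidate_logs, sonic_itt):
    """Pick the candidate log best correlated (by |PCC|) with the reference
    sonic ITT, then fit a linear transform mapping it to pseudo-sonic values."""
    best_name, best_log = max(candidate_logs.items(),
                              key=lambda kv: abs(pearson(kv[1], sonic_itt)))
    slope, intercept = np.polyfit(best_log, sonic_itt, deg=1)
    return best_name, slope * best_log + intercept

# Synthetic well: GR tracks the sonic response, SP is unrelated noise.
rng = np.random.default_rng(1)
depth = 500
sonic = 80 + 20 * rng.random(depth)                    # reference sonic ITT
logs = {"GR": 0.5 * sonic + rng.normal(0, 2, depth),   # strongly related
        "SP": rng.normal(0, 1, depth)}                 # unrelated
name, ps = pseudo_sonic(logs, sonic)
print(name)  # → GR
```

The resulting pseudo-sonic trace can then feed a reflectivity series for the pseudo-synthetic seismogram, which is the step the NSE values in the abstract evaluate.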

  10. Summary goodness-of-fit statistics for binary generalized linear models with noncanonical link functions.

    PubMed

    Canary, Jana D; Blizzard, Leigh; Barry, Ronald P; Hosmer, David W; Quinn, Stephen J

    2016-05-01

    Generalized linear models (GLM) with a canonical logit link function are the primary modeling technique used to relate a binary outcome to predictor variables. However, noncanonical links can offer more flexibility, producing convenient analytical quantities (e.g., probit GLMs in toxicology) and desired measures of effect (e.g., relative risk from log GLMs). Many summary goodness-of-fit (GOF) statistics exist for logistic GLM. Their properties make the development of GOF statistics relatively straightforward, but it can be more difficult under noncanonical links. Although GOF tests for logistic GLM with continuous covariates (GLMCC) have been applied to GLMCCs with log links, we know of no GOF tests in the literature specifically developed for GLMCCs that can be applied regardless of the link function chosen. We generalize the Tsiatis GOF statistic (TG), originally developed for logistic GLMCCs, so that it can be applied under any link function. Further, we show that the algebraically related Hosmer-Lemeshow (HL) and Pigeon-Heyse (J(2)) statistics can be applied directly. In a simulation study, TG, HL, and J(2) were used to evaluate the fit of probit, log-log, complementary log-log, and log models, all calculated with a common grouping method. The TG statistic consistently maintained Type I error rates, while those of HL and J(2) were often lower than expected if terms with little influence were included. Generally, the statistics had similar power to detect an incorrect model. An exception occurred when a log GLMCC was incorrectly fit to data generated from a logistic GLMCC. In this case, TG had more power than HL or J(2). © 2015 John Wiley & Sons Ltd/London School of Economics.
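The grouping idea behind the HL statistic (and shared by the others via a "common grouping method") is simple enough to sketch: bin observations by deciles of the fitted probability and compare observed to expected event counts. A minimal sketch, not the paper's generalized TG statistic:

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow GOF statistic: group observations by quantiles of the
    fitted probability p and compare observed vs expected event counts.
    Returns (statistic, p-value) against a chi-square with groups-2 df."""
    order = np.argsort(p)
    y, p = y[order], p[order]
    stat = 0.0
    for chunk_y, chunk_p in zip(np.array_split(y, groups), np.array_split(p, groups)):
        obs, exp = chunk_y.sum(), chunk_p.sum()
        n = len(chunk_y)
        if 0 < exp < n:  # guard against degenerate groups
            stat += (obs - exp) ** 2 / (exp * (1 - exp / n))
    return stat, chi2.sf(stat, groups - 2)

# Data generated from the fitted model itself, so the fit should look adequate.
rng = np.random.default_rng(2)
p = rng.uniform(0.05, 0.95, 500)
y = (rng.random(500) < p).astype(float)
stat, pval = hosmer_lemeshow(y, p)
```

Under a correctly specified model the statistic is approximately chi-square with groups-2 degrees of freedom; the paper's point is that this approximation behaves differently across link functions and statistic variants.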

  11. Proposed standard-weight (Ws) equation and length-categorization standards for brown trout (Salmo trutta) in lentic habitats

    USGS Publications Warehouse

    Hyatt, M.W.; Hubert, W.A.

    2001-01-01

    We developed a standard-weight (Ws) equation for brown trout (Salmo trutta) in lentic habitats by applying the regression-line-percentile technique to samples from 49 populations in North America. The proposed Ws equation is log10 Ws = -5.422 + 3.194 log10 TL, when Ws is in grams and TL is total length in millimeters. The English-unit equivalent is log10 Ws = -3.592 + 3.194 log10 TL, when Ws is in pounds and TL is total length in inches. The equation is applicable for fish of 140-750 mm TL. Proposed length-category standards to evaluate fish within populations are: stock, 200 mm (8 in); quality, 300 mm (12 in); preferred, 400 mm (16 in); memorable, 500 mm (20 in); and trophy, 600 mm (24 in).
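The proposed equation and length-category standards combine naturally into a small assessment helper. A sketch using only the values quoted in the abstract; the "substock" label for fish below 200 mm is an assumption following common usage, not stated in the record:

```python
import math

# Length-category standards (mm) proposed for lentic brown trout,
# checked from largest cutoff down.
CATEGORIES = [(600, "trophy"), (500, "memorable"), (400, "preferred"),
              (300, "quality"), (200, "stock")]

def length_category(tl_mm):
    """Classify a fish by total length; 'substock' below 200 mm is assumed."""
    for cutoff, name in CATEGORIES:
        if tl_mm >= cutoff:
            return name
    return "substock"

def standard_weight_g(tl_mm):
    """Ws (g) from the proposed equation log10 Ws = -5.422 + 3.194 log10 TL,
    valid for 140-750 mm TL."""
    if not 140 <= tl_mm <= 750:
        raise ValueError("equation applies to 140-750 mm TL only")
    return 10 ** (-5.422 + 3.194 * math.log10(tl_mm))

print(length_category(455), round(standard_weight_g(455), 1))
```

A 455 mm lentic brown trout is thus "preferred" and its standard weight is a bit under 1.2 kg; relative weight is then 100 times the observed weight over this Ws.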

  12. Low-Cost Evaluation of EO-1 Hyperion and ALI for Detection and Biophysical Characterization of Forest Logging in Amazonia (NCC5-481)

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P.; Keller, Michael M.; Silva, Jose Natalino; Zweede, Johan C.; Pereira, Rodrigo, Jr.

    2002-01-01

    Major uncertainties exist regarding the rate and intensity of logging in tropical forests worldwide; these uncertainties severely limit economic, ecological, and biogeochemical analyses of these regions. Recent sawmill surveys in the Amazon region of Brazil show that the area logged is nearly equal to the total area deforested annually, but conversion of survey data to forest area, forest structural damage, and biomass estimates requires multiple assumptions about logging practices. Remote sensing could provide an independent means to monitor logging activity and to estimate the biophysical consequences of this land use. Previous studies have demonstrated that the detection of logging in Amazon forests is difficult, and no studies have developed either the quantitative physical basis or the remote sensing approaches needed to estimate the effects of various logging regimes on forest structure. A major reason for these limitations has been a lack of sufficient, well-calibrated optical satellite data, which, in turn, has impeded the development and use of physically-based, quantitative approaches for detection and structural characterization of forest logging regimes. We propose to use data from the EO-1 Hyperion imaging spectrometer to greatly increase our ability to estimate the presence and structural attributes of selective logging in the Amazon Basin. Our approach is based on four "biogeophysical indicators" not yet derived simultaneously from any satellite sensor: 1) green canopy leaf area index; 2) degree of shadowing; 3) presence of exposed soil; and 4) non-photosynthetic vegetation material. Airborne, field, and modeling studies have shown that the optical reflectance continuum (400-2500 nm) contains sufficient information to derive estimates of each of these indicators. Our ongoing studies in the eastern Amazon basin also suggest that these four indicators are sensitive to logging intensity. Satellite-based estimates of these indicators should provide a means to quantify both the presence and degree of structural disturbance caused by various logging regimes. Our quantitative assessment of Hyperion hyperspectral and ALI multispectral data for the detection and structural characterization of selective logging in Amazonia will benefit from data collected through an ongoing project run by the Tropical Forest Foundation, within which we have developed a study of the canopy and landscape biophysics of conventional and reduced-impact logging. We will add to our base of forest structural information in concert with an EO-1 overpass. Using a photon transport model inversion technique that accounts for non-linear mixing of the four biogeophysical indicators, we will estimate these parameters across a gradient of selective logging intensity provided by conventional and reduced-impact logging sites. We will also compare our physically-based approach to both conventional (e.g., NDVI) and novel (e.g., SWIR-channel) vegetation indices as well as to linear mixture modeling methods. We will cross-compare these approaches using the Hyperion and ALI imagers to determine the strengths and limitations of these two sensors for applications of forest biophysics. This effort will yield the first physically-based, quantitative analysis of the detection and intensity of selective logging in Amazonia, comparing hyperspectral and improved multispectral approaches as well as inverse modeling, linear mixture modeling, and vegetation index techniques.
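
    The linear mixture modeling the authors compare against can be sketched as least-squares unmixing of a pixel spectrum into endmember fractions. The two-endmember, two-band setup below is purely illustrative (the study uses four indicators and a non-linear photon transport model, and imposes none of these simplifications):

```python
def unmix(pixel, endmembers):
    """Estimate endmember fractions f minimizing ||E f - pixel||^2 via the
    normal equations E^T E f = E^T pixel. `endmembers` is a bands x m matrix
    (one row per band). No sum-to-one or positivity constraints are applied."""
    bands, m = len(pixel), len(endmembers[0])
    # Build the normal equations.
    A = [[sum(endmembers[b][i] * endmembers[b][j] for b in range(bands))
          for j in range(m)] for i in range(m)]
    rhs = [sum(endmembers[b][i] * pixel[b] for b in range(bands))
           for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for c in range(m):
        piv = max(range(c, m), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        rhs[c], rhs[piv] = rhs[piv], rhs[c]
        for r in range(c + 1, m):
            f = A[r][c] / A[c][c]
            for k in range(c, m):
                A[r][k] -= f * A[c][k]
            rhs[r] -= f * rhs[c]
    x = [0.0] * m
    for r in range(m - 1, -1, -1):
        x[r] = (rhs[r] - sum(A[r][k] * x[k] for k in range(r + 1, m))) / A[r][r]
    return x
```

    With a pixel synthesized as 70% "canopy" and 30% "soil", the solver recovers those fractions exactly.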

  13. Direct push driven in situ color logging tool (CLT): technique, analysis routines, and application

    NASA Astrophysics Data System (ADS)

    Werban, U.; Hausmann, J.; Dietrich, P.; Vienken, T.

    2014-12-01

    Direct push technologies have recently seen broad development, providing several tools for in situ parameterization of unconsolidated sediments. One of these techniques is the measurement of soil color, a proxy that relates to soil/sediment properties. We introduce the direct push driven color logging tool (CLT) for real-time and depth-resolved investigation of soil colors within the visible spectrum. Until now, no routines have existed for handling high-resolution (mm-scale) soil color data. To develop such a routine, we transform raw data (CIEXYZ) into soil color surrogates of selected color spaces (CIExyY, CIEL*a*b*, CIEL*c*h*, sRGB) and denoise small-scale natural variability by Haar and Daublet4 wavelet transformation, gathering interpretable color logs over depth. However, interpreting color log data on its own remains challenging. Additional information, such as site-specific knowledge of the geological setting, is required to correlate soil color data to specific layer properties. Hence, we provide exemplary results from a joint interpretation of in situ-obtained soil color data and state-of-the-art direct push profiling tool data and discuss the benefit of the additional data. The developed routine is capable of transferring the colorimetric data into interpretable color surrogates. Soil color data proved to correlate with small-scale lithological/chemical changes (e.g., grain size, oxidative and reductive conditions), especially when combined with additional direct push vertical high-resolution data (e.g., cone penetration testing and soil sampling). Thus, the technique allows enhanced profiling by providing another reproducible high-resolution parameter for the analysis of subsurface conditions. This opens potential new areas of application and new outputs for such data in site investigation. It is our intention to improve color measurements in terms of both method of application and data interpretation, making them useful for characterizing vadose zone/soil/sediment properties.
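
    Two of the CIEXYZ-to-surrogate transforms mentioned above are standard colorimetry and can be sketched directly; the sRGB conversion below assumes the standard D65 linear matrix and gamma companding, which may differ from the tool's actual calibration:

```python
def xyz_to_xyY(X, Y, Z):
    """CIE XYZ -> chromaticity coordinates (x, y) plus luminance Y."""
    s = X + Y + Z
    if s == 0:
        return 0.0, 0.0, 0.0
    return X / s, Y / s, Y

def xyz_to_srgb(X, Y, Z):
    """CIE XYZ (D65, 0-1 scale) -> sRGB via the standard linear matrix
    followed by gamma companding; channels are clamped to [0, 1]."""
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
    def gamma(u):
        u = min(max(u, 0.0), 1.0)
        return 12.92 * u if u <= 0.0031308 else 1.055 * u ** (1 / 2.4) - 0.055
    return gamma(r), gamma(g), gamma(b)
```

    Equal-energy white maps to chromaticity (1/3, 1/3), and the D65 white point maps to sRGB channels near 1.0, which is a quick sanity check for the matrix.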

  14. Petrophysical evaluation of subterranean formations

    DOEpatents

    Klein, James D; Schoderbek, David A; Mailloux, Jason M

    2013-05-28

    Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NMR logs, or any combination or subset thereof.

  15. Borehole geophysics applied to ground-water investigations

    USGS Publications Warehouse

    Keys, W.S.

    1990-01-01

    The purpose of this manual is to provide hydrologists, geologists, and others who have the necessary background in hydrogeology with the basic information needed to apply the most useful borehole-geophysical-logging techniques to the solution of problems in ground-water hydrology. Geophysical logs can provide information on the construction of wells and on the character of the rocks and fluids penetrated by those wells, as well as on changes in the character of these factors over time. The response of well logs is caused by petrophysical factors, by the quality, temperature, and pressure of interstitial fluids, and by ground-water flow. Qualitative and quantitative analysis of analog records and computer analysis of digitized logs are used to derive geohydrologic information. This information can then be extrapolated vertically within a well and laterally to other wells using logs. The physical principles by which the mechanical and electronic components of a logging system measure properties of rocks, fluids, and wells, as well as the principles of measurement, must be understood if geophysical logs are to be interpreted correctly. Planning a logging operation involves selecting the equipment and the logs most likely to provide the needed information. Information on well construction and geohydrology is needed to guide this selection. Quality control of logs is an important responsibility of both the equipment operator and the log analyst and requires both calibration and well-site standardization of equipment. Logging techniques that are widely used in ground-water hydrology or that have significant potential for application to this field include spontaneous potential, resistance, resistivity, gamma, gamma spectrometry, gamma-gamma, neutron, acoustic velocity, acoustic televiewer, caliper, and fluid temperature, conductivity, and flow. 
The following topics are discussed for each of these techniques: principles and instrumentation, calibration and standardization, volume of investigation, extraneous effects, and interpretation and applications.

  16. Borehole geophysics applied to ground-water investigations

    USGS Publications Warehouse

    Keys, W.S.

    1988-01-01

    The purpose of this manual is to provide hydrologists, geologists, and others who have the necessary training with the basic information needed to apply the most useful borehole-geophysical-logging techniques to the solution of problems in ground-water hydrology. Geophysical logs can provide information on the construction of wells and on the character of the rocks and fluids penetrated by those wells, in addition to changes in the character of these factors with time. The response of well logs is caused by petrophysical factors; the quality, temperature, and pressure of interstitial fluids; and ground-water flow. Qualitative and quantitative analysis of the analog records and computer analysis of digitized logs are used to derive geohydrologic information. This information can then be extrapolated vertically within a well and laterally to other wells using logs. The physical principles by which the mechanical and electronic components of a logging system measure properties of rocks, fluids, and wells, and the principles of measurement, need to be understood to correctly interpret geophysical logs. Planning the logging operation involves selecting the equipment and the logs most likely to provide the needed information. Information on well construction and geohydrology is needed to guide this selection. Quality control of logs is an important responsibility of both the equipment operator and the log analyst and requires both calibration and well-site standardization of equipment. Logging techniques that are widely used in ground-water hydrology or that have significant potential for application to this field include spontaneous potential, resistance, resistivity, gamma, gamma spectrometry, gamma-gamma, neutron, acoustic velocity, acoustic televiewer, caliper, and fluid temperature, conductivity, and flow. 
The following topics are discussed for each of these techniques: principles and instrumentation, calibration and standardization, volume of investigation, extraneous effects, and interpretation and applications.

  17. Porosity and hydraulic conductivity estimation of the basaltic aquifer in Southern Syria by using nuclear and electrical well logging techniques

    NASA Astrophysics Data System (ADS)

    Asfahani, Jamal

    2017-08-01

    An alternative approach using nuclear neutron-porosity and long-normal (64 inch) and short-normal (16 inch) electrical resistivity well logging techniques is proposed to estimate the porosity and the hydraulic conductivity (K) of the basaltic aquifers in Southern Syria. The method is applied to the available logs of the Kodana well in Southern Syria. The K value obtained by applying this technique is reasonable and comparable with the hydraulic conductivity value of 3.09 m/day obtained from the pumping test carried out at the Kodana well. The proposed well logging methodology therefore seems promising and could be applied in basaltic environments for estimating the hydraulic conductivity parameter. However, more detailed research is still required to establish the performance of the proposed technique in basaltic environments.

  18. Fracture Characterization

    EPA Science Inventory

    The goal of this volume is to compare and assess various techniques for understanding fracture patterns at a site at Pease International Tradeport, NH, and to give an overview of the site as a whole. Techniques included are: core logging, geophysical logging, radar studies, and...

  19. Preliminary geological investigation of AIS data at Mary Kathleen, Queensland, Australia

    NASA Technical Reports Server (NTRS)

    Huntington, J. F.; Green, A. A.; Craig, M. D.; Cocks, T. D.

    1986-01-01

    The Airborne Imaging Spectrometer (AIS) was flown over granitic, volcanic, and calc-silicate terrain around the Mary Kathleen Uranium Mine in Queensland, in a test of its mineralogical mapping capabilities. An analysis strategy and restoration and enhancement techniques were developed to process the 128-band AIS data. A preliminary analysis of one of three AIS flight lines shows that the data contain considerable spectral variation but that they are also contaminated by second-order leakage of radiation from the near-infrared region. This makes the recognition of expected spectral absorption shapes very difficult. The effect appears worst in terrains containing considerable vegetation. Techniques that attempt to predict this supplementary radiation, coupled with the log residual analytical technique, show that expected mineral absorption spectra can be derived. These results suggest that, with additional refinement of the correction procedures, the Australian AIS data may be salvaged. Application of the log residual analysis method has proved very successful on the Cuprite, Nevada data set, highlighting the alunite, kaolinite, and SiOH mineralogy.
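
    The log residual technique referred to here is commonly formulated as removing per-pixel brightness and per-band gain in log space (the Green and Craig formulation); the sketch below assumes that formulation rather than the exact variant used on the AIS data:

```python
import math

def log_residuals(data):
    """Log residuals for a pixels x bands radiance matrix: subtract each
    pixel's mean log radiance (brightness/topography) and each band's mean
    log radiance (illumination/gain), adding back the grand mean."""
    logs = [[math.log(v) for v in row] for row in data]
    n, m = len(logs), len(logs[0])
    row_mean = [sum(r) / m for r in logs]                       # per pixel
    col_mean = [sum(logs[i][b] for i in range(n)) / n
                for b in range(m)]                              # per band
    grand = sum(row_mean) / n
    return [[logs[i][b] - row_mean[i] - col_mean[b] + grand
             for b in range(m)] for i in range(n)]
```

    When every pixel is just a scaled copy of one spectrum (pure brightness variation), the residuals vanish, which is exactly the point: only spectral shape differences survive.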

  20. Logging while fishing: An alternate method to cut and thread fishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tollefsen, E.; Crary, S.; Flores, B.

    1996-12-31

    New technology has been introduced to allow completion of the wireline logging program after the tool string has become lodged in the wellbore. Charges associated with extracting a stuck tool are substantial. These charges result from the nonproductive time during the fishing trip, an associated wiper trip, and re-logging the well. The ability to continue the logging program while retrieving the logging string from the wellbore is needed. Logging While Fishing (LWF) is a hybrid of existing technologies combined with a new sub capable of severing a cable remotely. This new method is comprised of cut and thread fishing, drillpipe conveyed logging, and bridled tool techniques. Utilizing these techniques it is possible to complete wireline logging operations while removing a stuck tool from the wellbore. Completing logging operations using this hybrid method will save operating companies time and money. Other benefits, depending on the situation, include reduced fishing time and an increased level of safety. This application has been demonstrated on jobs in the Gulf of Mexico, North Sea, Venezuela, and Southeast Asia.

  1. Design considerations for large woody debris placement in stream enhancement projects. North American Journal of Fisheries Management

    Treesearch

    Robert H. Hilderbrand; A. Dennis Lemly; C. Andrew Dolloff; Kelly L. Harpster

    1998-01-01

    Log length exerted a critical influence in stabilizing large woody debris (LWD) pieces added as an experimental stream restoration technique. Logs longer than the average bank-full channel width (5.5 m) were significantly less likely to be displaced than logs shorter than this width. The longest log in stable log groups was significantly longer than the longest log in...

  2. A VLSI architecture for performing finite field arithmetic with reduced table look-up

    NASA Technical Reports Server (NTRS)

    Hsu, I. S.; Truong, T. K.; Reed, I. S.

    1986-01-01

    A new table look-up method for finding the log and antilog of finite field elements has been developed by N. Glover. In his method, the log and antilog of a field element are found by the use of several smaller tables. The method is based on the Chinese Remainder Theorem. The technique often results in a significant reduction in the memory requirements of the problem. A VLSI architecture is developed for a special case of this new algorithm to perform finite field arithmetic including multiplication, division, and the finding of an inverse element in the finite field.
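
    For context, the conventional single-table log/antilog scheme that Glover's Chinese-Remainder-Theorem decomposition shrinks looks like this in GF(2^8). The AES polynomial 0x11b and generator 0x03 are illustrative choices, not the paper's parameters:

```python
def gf_mul_slow(a, b):
    """Carry-less multiply mod the polynomial x^8 + x^4 + x^3 + x + 1 (0x11b)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11b
        b >>= 1
    return r

# Log/antilog tables over GF(2^8); 0x03 is a primitive element, so its
# powers enumerate all 255 nonzero field elements.
EXP = [0] * 255
LOG = [0] * 256
_x = 1
for _i in range(255):
    EXP[_i] = _x
    LOG[_x] = _i
    _x = gf_mul_slow(_x, 0x03)

def gf_mul(a, b):
    """Multiply via one log-table addition mod 255 and one antilog lookup."""
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 255]

def gf_inv(a):
    """Multiplicative inverse from the log table: a^-1 = g^(255 - log a)."""
    return EXP[(255 - LOG[a]) % 255]
```

    Division is then a log subtraction, which is why reducing the size of these tables (as the CRT method does) pays off directly in hardware.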

  3. The development of a high-throughput measurement method of octanol/water distribution coefficient based on hollow fiber membrane solvent microextraction technique.

    PubMed

    Bao, James J; Liu, Xiaojing; Zhang, Yong; Li, Youxin

    2014-09-15

    This paper describes the development of a novel high-throughput hollow fiber membrane solvent microextraction technique for the simultaneous measurement of the octanol/water distribution coefficient (logD) for organic compounds such as drugs. The method is based on a purpose-designed system consisting of a 96-well plate modified with 96 hollow fiber membrane tubes and a matching lid with 96 center holes and 96 side holes distributed across the 96-grid layout. Each center hole is glued to a hollow fiber membrane tube sealed at one end, which separates the aqueous phase from the octanol phase. A needle, such as a microsyringe or an automatic sampler, can be inserted directly into the membrane tube to deposit octanol as the acceptor phase or to withdraw the mixture of octanol and drug. Each side hole is filled with the aqueous donor phase, which can freely exchange with the outside of the hollow fiber membranes. The logD can be calculated by measuring the drug concentration in each phase after extraction equilibrium. After a comprehensive comparison, a polytetrafluoroethylene hollow fiber with a thickness of 210 μm, an extraction time of 300 min, a temperature of 25 °C, and atmospheric pressure without stirring were selected for the high-throughput measurement. The correlation coefficient of the linear fit of the logD values of five drugs determined by our system against reference values was 0.9954, showing good accuracy. The -8.9% intra-day and -4.4% inter-day precision of logD for metronidazole indicates good precision. In addition, the logD values of eight drugs were simultaneously and successfully measured, indicating that the 96-well high-throughput method for measuring logD values is accurate, precise, reliable, and useful for high-throughput screening. Copyright © 2014 Elsevier B.V. All rights reserved.
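
    The distribution coefficient itself is a simple ratio once the equilibrium concentrations are known. The mass-balance variant below assumes only the aqueous phase is assayed, which is one common workup and not necessarily the paper's exact calculation:

```python
import math

def log_d(c_octanol, c_water):
    """Octanol/water distribution coefficient from equilibrium concentrations."""
    return math.log10(c_octanol / c_water)

def log_d_mass_balance(c0_water, c_water_final, v_water, v_octanol):
    """When only the aqueous phase is assayed, infer the octanol concentration
    by mass balance: whatever left the water must be in the octanol."""
    c_oct = (c0_water - c_water_final) * v_water / v_octanol
    return math.log10(c_oct / c_water_final)
```

    For equal phase volumes, a drug that drops from 10 to 1 concentration units in water gives logD = log10(9) ≈ 0.95.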

  4. Logging while fishing technique results in substantial savings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tollefsen, E.; Everett, M.

    1996-12-01

    During wireline logging operations, tools occasionally become stuck in the borehole and require fishing. A typical fishing job can take anywhere from 1½ to 4 days. In the Gulf of Mexico, a fishing job can easily cost between $100,000 and $500,000. These costs result from nonproductive time during the fishing trip, the associated wiper trip, and relogging the well. Logging while fishing (LWF) technology is a patented system capable of retrieving a stuck fish and completing the logging run during the same pipe descent. Completing logging operations using the LWF method saves time and money. The technique also provides well information where data may not otherwise have been obtained. Other benefits include reduced fishing time and an increased level of safety.

  5. Yearly report, Yucca Mountain project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brune, J.N.

    1992-09-30

    We proposed to: (1) develop our data logging and analysis equipment and techniques for analyzing seismic data from the Southern Great Basin Seismic Network (SGBSN); (2) investigate the SGBSN data for evidence of seismicity patterns, depth distribution patterns, and correlations with geologic features; (3) repair and maintain our three broad-band downhole digital seismograph stations at Nelson, Nevada, Troy Canyon, Nevada, and Deep Springs, California; (4) install, operate, and log data from a super-sensitive microearthquake array at Yucca Mountain; and (5) analyze data from microearthquakes relative to seismic hazard at Yucca Mountain.

  6. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    Backdoors or information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, enhancing the security of Web servers and avoiding the damage of illegal access. First, a system for discovering the patterns of information leakage in CGI scripts from Web log data is proposed. Second, these patterns are provided to system administrators so they can modify their code and enhance Web site security. Two aspects are described: one is to combine the Web application log with the Web log to extract more information, so that Web data mining can discover information that a firewall and an Intrusion Detection System cannot find; the other is an operation module for the Web site that enhances its security. For clustering server sessions, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
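
    A density-based clustering step like the one mentioned for server sessions can be sketched with plain DBSCAN over session feature vectors; the features, distance, and parameters here are hypothetical stand-ins, not the paper's configuration:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: label each point with a cluster id, or -1 for noise.
    `points` are tuples of numeric features (e.g., per-session request rate,
    error ratio); a point is a core point if >= min_pts neighbors lie
    within Euclidean distance eps (the point itself counts)."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    neighbors = [[j for j, q in enumerate(points) if dist(p, q) <= eps]
                 for i, p in enumerate(points)]
    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        if len(neighbors[i]) < min_pts:
            labels[i] = -1                 # provisionally noise
            continue
        labels[i] = cid                    # start a new cluster from a core point
        seeds = list(neighbors[i])
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cid            # noise reached from a core: border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            if len(neighbors[j]) >= min_pts:
                seeds.extend(neighbors[j]) # expand through core points only
        cid += 1
    return labels
```

    Sessions that fall outside any dense region come back labeled -1, which is the anomaly signal such a module would act on.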

  7. Extracting the Textual and Temporal Structure of Supercomputing Logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, S; Singh, I; Chandra, A

    2009-05-26

    Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format makes it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
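
    The syntactic-structure idea can be approximated by masking the variable fields of each message so that lines sharing a template collapse to one key. This regex-based sketch is a deliberate simplification of the paper's textual clustering, not its algorithm:

```python
import re
from collections import defaultdict

def skeleton(msg):
    """Reduce a log line to its syntactic template by masking variable
    fields (hex ids first, then decimal numbers)."""
    msg = re.sub(r'0x[0-9a-fA-F]+', '<HEX>', msg)
    msg = re.sub(r'\d+', '<NUM>', msg)
    return msg

def group_by_template(lines):
    """Bucket log lines by their masked template."""
    groups = defaultdict(list)
    for line in lines:
        groups[skeleton(line)].append(line)
    return groups
```

    Messages that differ only in node numbers or addresses land in the same semantic group, which is the starting point for the temporal correlation analysis the abstract describes.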

  8. Detection of concrete dam leakage using an integrated geophysical technique based on flow-field fitting method

    NASA Astrophysics Data System (ADS)

    Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.

    2017-05-01

    An integrated geophysical investigation was performed at S dam located at Dadu basin in China to assess the condition of the dam curtain. The key methodology of the integrated technique used was flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream), and location of the anomalous leakage outlets in the dam foundation. Limitations of the flow-field fitting method were complemented with resistivity logging to identify the internal erosion which had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent when compared with data provided by seismic tomography, borehole television, water injection test, and rock quality designation.

  9. Logging damage in thinned, young-growth true fir stands in California and recommendations for prevention.

    Treesearch

    Paul E. Aho; Gary Fiddler; Mike. Srago

    1983-01-01

    Logging-damage surveys and tree-dissection studies were made in commercially thinned, naturally established young-growth true fir stands in the Lassen National Forest in northern California. Significant damage occurred to residual trees in stands logged by conventional methods. Logging damage was substantially lower in stands thinned using techniques designed to reduce...

  10. Characterization of the Hydrocarbon Potential and Non-Potential Zones Using Wavelet-Based Fractal Analysis

    NASA Astrophysics Data System (ADS)

    Mukherjee, Bappa; Roy, P. N. S.

    The identification of prospective and dry zones from well log data is of major importance, and accuracy in identifying potential zones is a crucial issue in hydrocarbon exploration. The problem has received considerable attention, and many conventional techniques have been proposed. The purpose of this study is to recognize the hydrocarbon- and non-hydrocarbon-bearing portions of a reservoir using a non-conventional technique. Wavelet-based fractal analysis (WBFA) is applied to wire-line log data to distinguish pre-defined hydrocarbon (HC) and non-hydrocarbon (NHC) zones through the self-affine nature of the log signals. The feasibility of the proposed technique is tested on the most commonly used logs: self-potential, gamma ray, resistivity, and porosity responses. These logs were obtained from industry to delineate several HC and NHC zones across all wells in the study region, which belongs to the upper Assam basin. The results show that HC-bearing zones are mainly situated in a variety of sandstone lithologies, which leads to higher Hurst exponents, while NHC zones correspond to lithologies with higher shale content and lower Hurst exponents. The proposed technique can reduce the chance of misinterpretation in conventional reservoir characterization.
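
    A generic wavelet-variance Hurst estimator, in the spirit of WBFA though not the paper's exact pipeline, fits the slope of log2 detail-coefficient variance across dyadic scales. The fBm-style mapping H = (slope - 1)/2 below assumes a cumulative (random-walk-like) log signal, which is an assumption about the signal model:

```python
import math, random

def haar_level_vars(x):
    """Variance of orthonormal Haar detail coefficients at each dyadic level,
    stopping when fewer than 16 coefficients would remain."""
    vars_, level = [], list(x)
    while len(level) >= 32:
        d = [(level[2 * i] - level[2 * i + 1]) / math.sqrt(2)
             for i in range(len(level) // 2)]
        a = [(level[2 * i] + level[2 * i + 1]) / math.sqrt(2)
             for i in range(len(level) // 2)]
        vars_.append(sum(v * v for v in d) / len(d))
        level = a
    return vars_

def hurst_wavelet(x):
    """Least-squares slope of log2(detail variance) vs level j; for an
    fBm-like trace the slope is ~2H + 1, so H = (slope - 1) / 2."""
    v = haar_level_vars(x)
    js = list(range(1, len(v) + 1))
    ly = [math.log2(vi) for vi in v]
    n = len(js)
    jm, ym = sum(js) / n, sum(ly) / n
    slope = (sum((j - jm) * (y - ym) for j, y in zip(js, ly)) /
             sum((j - jm) ** 2 for j in js))
    return (slope - 1) / 2
```

    On a simple random walk the estimate lands near the theoretical H = 0.5; persistent (sand-like, smoother) series score higher and anti-persistent ones lower, mirroring the HC/NHC contrast described above.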

  11. Sweep visually evoked potentials and visual findings in children with West syndrome.

    PubMed

    de Freitas Dotto, Patrícia; Cavascan, Nívea Nunes; Berezovsky, Adriana; Sacai, Paula Yuri; Rocha, Daniel Martins; Pereira, Josenilson Martins; Salomão, Solange Rios

    2014-03-01

    West syndrome (WS) is a type of early childhood epilepsy characterized by progressive deterioration of neurological development that includes vision. The aim was to demonstrate the clinical importance of grating visual acuity (GVA) threshold measurement by the sweep visually evoked potentials technique (sweep-VEP) as a reliable tool for evaluating visual cortex status in WS children. This is a retrospective study of the best-corrected binocular GVA and ophthalmological features of WS children referred to the Laboratory of Clinical Electrophysiology of Vision of UNIFESP from 1998 to 2012 (Committee on Ethics in Research of UNIFESP n° 0349/08). The GVA deficit was calculated by comparing the binocular GVA score (logMAR units) of each patient with the median values of age norms from our own lab and classified as mild (0.1-0.39 logMAR), moderate (0.40-0.80 logMAR), or severe (>0.81 logMAR). Associated ophthalmological features were also described. Data from 30 WS children (age 6 to 108 months, median = 14.5 months, mean ± SD = 22.0 ± 22.1 months; 19 male) were analyzed. The majority presented severe GVA deficit (0.15-1.44 logMAR; mean ± SD = 0.82 ± 0.32 logMAR; median = 0.82 logMAR), poor visual behavior, high prevalence of strabismus, and great variability in ocular positioning. The GVA deficit did not vary according to gender (P = .8022), WS type (P = .908), birth age (P = .2881), perinatal oxygenation (P = .7692), visual behavior (P = .8789), ocular motility (P = .1821), nystagmus (P = .2868), risk of drug-induced retinopathy (P = .4632), or participation in early visual stimulation therapy (P = .9010). The sweep-VEP technique is a reliable tool to classify visual system impairment in WS children, in agreement with the poor visual behavior they exhibit. Copyright © 2013 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.
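
    The deficit bands quoted in the abstract map directly to a small classifier. The handling of values below 0.1 logMAR and of the abstract's 0.80-0.81 gap is an assumption (the boundary is taken at 0.81, and sub-0.1 deficits are treated as within norms):

```python
def gva_deficit(patient_logmar, norm_median_logmar):
    """GVA deficit: patient's binocular acuity minus the age-norm median,
    both in logMAR (higher logMAR means worse acuity)."""
    return patient_logmar - norm_median_logmar

def classify_deficit(deficit):
    """Severity bands from the abstract: mild 0.1-0.39, moderate 0.40-0.80,
    severe >0.81 logMAR; values below 0.1 treated as within norms."""
    if deficit >= 0.81:
        return "severe"
    if deficit >= 0.40:
        return "moderate"
    if deficit >= 0.10:
        return "mild"
    return "within norms"
```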

  12. Increased Oil Production and Reserves from Improved Completion Techniques in the Bluebell Field, Uinta Basin, Utah

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deo, M.D.; Morgan, C.D.

    1999-04-28

    The objective of the project is to increase oil production and reserves through improved reservoir characterization and completion techniques in the Uinta Basin, Utah. To accomplish this objective, a two-year geologic and engineering characterization of the Bluebell field was conducted. The study evaluated surface and subsurface data, currently used completion techniques, and common production problems. It was determined that advanced cased- and open-hole logs could be effective in identifying productive beds and that stage-interval (about 500 ft [150 m] per stage) and bed-scale isolation completion techniques could result in improved well performance. In the first demonstration well (the Michelle Ute well discussed in the previous technical report), dipole shear anisotropy (anisotropy) and dual-burst thermal decay time (TDT) logs were run before the treatment, and an isotope tracer log was run after it. The logs were very helpful in characterizing the remaining hydrocarbon potential in the well, but mechanical failure resulted in a poor recompletion and did not yield a significant improvement in the well's oil production.

  13. Integration of carbon conservation into sustainable forest management using high resolution satellite imagery: A case study in Sabah, Malaysian Borneo

    NASA Astrophysics Data System (ADS)

    Langner, Andreas; Samejima, Hiromitsu; Ong, Robert C.; Titin, Jupiri; Kitayama, Kanehiro

    2012-08-01

    Conservation of tropical forests is of outstanding importance for mitigating climate change effects and preserving biodiversity. In Borneo most of the forests are classified as permanent forest estates and are selectively logged using conventional logging techniques that cause heavy damage to the forest ecosystems. Incorporation of sustainable forest management into climate change mitigation measures such as Reducing Emissions from Deforestation and Forest Degradation (REDD+) can help to avert further forest degradation by synergizing sustainable timber production with the conservation of biodiversity. In order to evaluate the efficiency of such initiatives, monitoring methods for forest degradation and above-ground biomass in tropical forests are urgently needed. In this study we developed an index using Landsat satellite data to describe the crown cover condition of lowland mixed dipterocarp forests. We showed that this index, combined with field data, can be used to estimate above-ground biomass via a regression model in two permanent forest estates in Sabah, Malaysian Borneo. Tangkulap represented a conventionally logged forest estate, while Deramakot has been managed in accordance with sustainable forestry principles. The results revealed that the conventional logging techniques used in Tangkulap between 1991 and 2000 decreased above-ground biomass by an average of 6.0 t C/ha per year (95% confidence interval: 5.2-7.0 t C/ha), whereas the biomass in Deramakot increased by 6.1 t C/ha per year (95% confidence interval: 5.3-7.2 t C/ha) between 2000 and 2007 while under sustainable forest management. This indicates that sustainable forest management with reduced-impact logging helps to protect above-ground biomass. 
In absolute terms, a conservative amount of 10.5 t C/ha per year, as documented using the methodology developed in this study, can be attributed to the different management systems, which will be of interest when implementing REDD+ that rewards the enhancement of carbon stocks.
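
    The regression step described above can be sketched in miniature. The coefficients, index values, and calibration plots below are hypothetical placeholders, not the study's data; only the workflow (fit an index-to-biomass regression, apply it at two dates, annualize the difference) mirrors the abstract.

```python
# Minimal sketch (hypothetical numbers): fit a linear regression of field
# biomass (t C/ha) on a satellite-derived crown-cover index, then estimate
# annual biomass change from model estimates at two dates.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical calibration plots: (crown-cover index, field biomass in t C/ha)
index = [0.2, 0.4, 0.6, 0.8]
biomass = [40.0, 80.0, 120.0, 160.0]
a, b = fit_linear(index, biomass)

# Apply the fitted model to the same stand at two dates, seven years apart.
c_2000 = a + b * 0.55
c_2007 = a + b * 0.76
annual_change = (c_2007 - c_2000) / 7
print(round(annual_change, 1))  # t C/ha per year
```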

  14. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, identifying the risk of disease mortality, helps healthcare providers manage their patients effectively by suggesting appropriate treatment options. In this study, we apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with a probabilistic loss function, to develop and validate prognostic risk models predicting 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, significantly outperforming prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient comorbidities into our models, which improved the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of models built by other classifiers. We also propose a probabilistic loss function to identify large-error and small-error instances. The new loss function outperforms the functions used in previous studies by a 1% improvement in AUC. This study revealed that building prediction models from EHR data can be very challenging for existing classification methods because of the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different considerations in their diagnosis and treatment. 
    Our risk models provided two valuable insights for the application of predictive modeling techniques in biomedicine: logistic risk models often make systematic prediction errors, and it is prudent to use subgroup-based prediction models such as those given by CPXR(Log) when investigating heterogeneous diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
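
    The pattern-aided idea can be illustrated with a toy sketch: instances matching a contrast pattern are scored by a local logistic model, everything else by a baseline model. The data, the `pattern` predicate, and all helper functions below are invented for illustration and are not the CPXR(Log) implementation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.5, epochs=3000):
    """Plain logistic regression fitted by stochastic gradient descent."""
    w = [0.0] * (len(X[0]) + 1)  # bias + one weight per feature
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = sigmoid(z) - yi
            w[0] -= lr * err
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * err * xj
    return w

def predict(w, xi):
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Heterogeneous toy cohort: with the comorbidity flag set, risk rises with the
# feature; without it, risk falls. One global logistic model cannot fit both.
X = [[0.1, 0], [0.3, 0], [0.7, 0], [0.9, 0],
     [0.1, 1], [0.3, 1], [0.7, 1], [0.9, 1]]
y = [1, 1, 0, 0, 0, 0, 1, 1]

def pattern(xi):  # contrast pattern: "comorbidity present"
    return xi[1] == 1

match = [(xi, yi) for xi, yi in zip(X, y) if pattern(xi)]
rest = [(xi, yi) for xi, yi in zip(X, y) if not pattern(xi)]
local = train_logreg([xi for xi, _ in match], [yi for _, yi in match])
baseline = train_logreg([xi for xi, _ in rest], [yi for _, yi in rest])

def pattern_aided_predict(xi):
    """Route to the local model when the pattern matches, else to the baseline."""
    return predict(local if pattern(xi) else baseline, xi)

correct = sum((pattern_aided_predict(xi) > 0.5) == bool(yi)
              for xi, yi in zip(X, y))
print(correct, "of", len(y))
```

    The two local models fit both subgroups, which is exactly the heterogeneity the abstract highlights.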

  15. L-O-S-T: Logging Optimization Selection Technique

    Treesearch

    Jerry L. Koger; Dennis B. Webster

    1984-01-01

    L-O-S-T is a FORTRAN computer program developed to systematically quantify, analyze, and improve user selected harvesting methods. Harvesting times and costs are computed for road construction, landing construction, system move between landings, skidding, and trucking. A linear programming formulation utilizing the relationships among marginal analysis, isoquants, and...

  16. Developing attractants and trapping techniques for the emerald ash borer

    Treesearch

    Therese M. Poland; Peter de Groot; Gary Grant; Linda MacDonald; Deborah G. McCullough

    2003-01-01

    Shortly after the 2002 discovery of emerald ash borer (EAB), Agrilus planipennis Fairmaire (Coleoptera: Buprestidae), in southeastern Michigan and Windsor, Ontario, quarantines regulating the movement of ash logs, firewood, and nursery stock were established to reduce the risk of human-assisted spread of this exotic forest insect pest. Accurate...

  17. Reduced-impact logging: challenges and opportunities

    Treesearch

    F.E. Putz; P. Sist; T. Fredericksen; D. Dykstra

    2008-01-01

    Over the past two decades, sets of timber harvesting guidelines designed to mitigate the deleterious environmental impacts of tree felling, yarding, and hauling have become known as "reduced-impact logging" (RIL) techniques. Although none of the components of RIL are new, concerns about destructive logging practices and worker safety in the tropics stimulated...

  18. Characterization of a complex near-surface structure using well logging and passive seismic measurements

    NASA Astrophysics Data System (ADS)

    Benjumea, Beatriz; Macau, Albert; Gabàs, Anna; Figueras, Sara

    2016-04-01

    We combine geophysical well logging and passive seismic measurements to characterize the near-surface geology of an area located in Hontomin, Burgos (Spain). This area poses several near-surface challenges for a geophysical study. The irregular topography is characterized by limestone outcrops and areas of unconsolidated sediments. Additionally, the near-surface geology includes an upper layer of pure limestones overlying marly limestones and marls (Upper Cretaceous). These materials lie on top of Lower Cretaceous siliciclastic sediments (sandstones, clays, gravels). In any case, a layer with reduced velocity is expected. The geophysical data sets used in this study include sonic and gamma-ray logs at two boreholes and passive seismic measurements: three arrays and 224 seismic stations for applying the horizontal-to-vertical amplitude spectral ratio (H/V) method. Well-logging data define two significant changes in the P-wave-velocity log within the Upper Cretaceous layer and one more at the Upper to Lower Cretaceous contact. This technique has also been used for refining the geological interpretation. The passive seismic measurements provide a map of sediment thickness, with a maximum of around 40 m, and shear-wave velocity profiles from the array technique. A comparison between seismic velocities from well logging and from array measurements defines the resolution limits of the passive seismic techniques and aids their interpretation. This study shows how these low-cost techniques can provide useful information about near-surface complexity that can be used for designing a geophysical field survey or for seismic processing steps such as statics or imaging.
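
    The step from an H/V resonance peak to sediment thickness is commonly taken with the quarter-wavelength relation; a minimal sketch, with hypothetical Vs and f0 chosen only to land near the ~40 m maximum mentioned above:

```python
# Quarter-wavelength relation linking the H/V resonance frequency f0 of a soft
# layer over stiff bedrock to layer thickness h: f0 = Vs / (4h). Values below
# are illustrative, not measurements from the Hontomin site.

def sediment_thickness(vs_mps, f0_hz):
    """Invert the quarter-wavelength relation: h = Vs / (4 * f0)."""
    return vs_mps / (4.0 * f0_hz)

h = sediment_thickness(320.0, 2.0)
print(h)  # thickness in meters
```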

  19. Development of prototypes of bioactive packaging materials based on immobilized bacteriophages for control of growth of bacterial pathogens in foods.

    PubMed

    Lone, Ayesha; Anany, Hany; Hakeem, Mohammed; Aguis, Louise; Avdjian, Anne-Claire; Bouget, Marina; Atashi, Arash; Brovko, Luba; Rochefort, Dominic; Griffiths, Mansel W

    2016-01-18

    Due to the lack of adequate control methods to prevent contamination in fresh produce and growing consumer demand for natural products, the use of bacteriophages has emerged as a promising approach to enhance the safety of these foods. This study sought to control Listeria monocytogenes in cantaloupes and ready-to-eat (RTE) meat and Escherichia coli O104:H4 in alfalfa seeds and sprouts under different storage conditions by using specific lytic bacteriophage cocktails applied either free or immobilized. Bacteriophage cocktails were introduced into prototypes of packaging materials using different techniques: i) immobilizing on positively charged modified cellulose membranes, ii) impregnating paper with bacteriophage suspension, and iii) encapsulating in alginate beads followed by application of the beads onto the paper. Phage-treated and non-treated samples were stored for various times at 4°C, 12°C or 25°C. In cantaloupe, when the free phage cocktail was added, L. monocytogenes counts dropped below the detection limit of the plating technique (<1 log CFU/g) after 5 days of storage at both 4°C and 12°C. However, at 25°C, counts below the detection limit were observed after 3 and 6 h, and a 2-log CFU/g reduction in cell numbers was seen after 24 h. For the immobilized Listeria phage cocktail, around a 1-log CFU/g reduction in the Listeria count was observed by the end of the storage period at all tested storage temperatures. For the alfalfa seeds and sprouts, regardless of the type of phage application technique (spraying of free phage suspension, or contact with bacteriophage-based materials (paper coated with encapsulated bacteriophage or impregnated with bacteriophage suspension)), the count of E. coli O104:H4 was below the detection limit (<1 log CFU/g) after 1 h in seeds, and about a 1-log-cycle reduction in the E. coli count was observed on the germinated sprouts by day 5. 
    In RTE meat, LISTEX™ P100, a commercial phage product, significantly reduced the growth of L. monocytogenes at both storage temperatures, 4°C and 10°C, for 25 days, regardless of the bacteriophage application format (immobilized or free). In conclusion, the developed phage-based materials demonstrated a significant antimicrobial effect when applied to artificially contaminated foods and can serve as prototypes for bioactive antimicrobial packaging materials capable of enhancing the safety of fresh produce and RTE meat. Copyright © 2015 Elsevier B.V. All rights reserved.
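
    The "log reduction" figures quoted above follow from a one-line definition; a minimal sketch (the counts are illustrative, not the study's raw data):

```python
import math

def log_reduction(n0_cfu, n1_cfu):
    """Log10 reduction between initial and surviving counts (CFU/g)."""
    return math.log10(n0_cfu / n1_cfu)

# A drop from 1e5 to 1e3 CFU/g is the kind of "2-log reduction" reported above.
print(log_reduction(1e5, 1e3))
```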

  20. Impact of logging on aboveground biomass stocks in lowland rain forest, Papua New Guinea.

    PubMed

    Bryan, Jane; Shearman, Phil; Ash, Julian; Kirkpatrick, J B

    2010-12-01

    Greenhouse-gas emissions resulting from logging are poorly quantified across the tropics. There is a need for robust measurement of rain forest biomass and the impacts of logging from which carbon losses can be reliably estimated at regional and global scales. We used a modified Bitterlich plotless technique to measure aboveground live biomass at six unlogged and six logged rain forest areas (coupes) across two approximately 3000-ha regions at the Makapa concession in lowland Papua New Guinea. "Reduced-impact logging" is practiced at Makapa. We found the mean unlogged aboveground biomass in the two regions to be 192.96 +/- 4.44 Mg/ha and 252.92 +/- 7.00 Mg/ha (mean +/- SE), which was reduced by logging to 146.92 +/- 4.58 Mg/ha and 158.84 +/- 4.16 Mg/ha, respectively. Killed biomass was not a fixed proportion but varied with unlogged biomass: 24% was killed in the lower-biomass region and 37% in the higher-biomass region. Across the two regions logging resulted in a mean aboveground carbon loss of 35 +/- 2.8 Mg/ha. The plotless technique proved efficient at estimating mean aboveground biomass and logging damage. We conclude that substantial bias is likely to occur within biomass estimates derived from single unreplicated plots.
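
    The reported killed fractions and the ~35 Mg C/ha figure can be reproduced from the regional means if carbon is taken as about 50% of dry biomass. That conversion factor is a common convention and an assumption on our part; the paper's exact factor may differ.

```python
# Reproduce the abstract's percentages and mean carbon loss from its regional
# biomass means, under the assumed convention carbon ≈ 0.5 * dry biomass.
CARBON_FRACTION = 0.5

regions = {
    "low":  {"unlogged": 192.96, "logged": 146.92},  # Mg/ha
    "high": {"unlogged": 252.92, "logged": 158.84},
}

losses = {}
for name, r in regions.items():
    killed = r["unlogged"] - r["logged"]
    losses[name] = {
        "killed_fraction": killed / r["unlogged"],
        "carbon_loss": CARBON_FRACTION * killed,  # Mg C/ha
    }

mean_carbon_loss = sum(v["carbon_loss"] for v in losses.values()) / 2
print(round(losses["low"]["killed_fraction"], 2),
      round(losses["high"]["killed_fraction"], 2),
      round(mean_carbon_loss, 1))
```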

  1. Interlake production established using quantitative hydrocarbon well-log analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lancaster, J.; Atkinson, A.

    1988-07-01

    Production was established in a new pay zone of the basal Interlake Formation adjacent to production in Midway field in Williams County, North Dakota. Hydrocarbon saturation, which was computed using hydrocarbon well-log (mud-log) data, and computed permeability encouraged the operator to run casing and test this zone. By use of drilling-rig parameters, drilling-mud properties, hydrocarbon-show data from the mud log, drilled rock and porosity descriptions, and wireline log porosity, this new technique computes oil saturation (percent of porosity) and permeability to the invading filtrate using the Darcy equation. The Leonardo Fee well was drilled to test the Devonian Duperow, the Silurian upper Interlake, and the Ordovician Red River. The upper two objectives were penetrated downdip from Midway production and there were no hydrocarbon shows. The Red River was determined to be tight, based on sample examination by well-site personnel. The basal Interlake, however, liberated hydrocarbon shows that were analyzed by this new technology. The results of this evaluation accurately predicted that this well would be a commercial success when placed in production. Where geophysical log analysis might be questionable, this new evaluation technique may provide answers regarding anticipated oil saturation and producibility. The encouraging results for hydrocarbon saturation and permeability produced by this technique may be largely responsible for the well being in production today.
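
    The Darcy-equation step the abstract mentions can be sketched as a direct rearrangement for permeability. The inputs below are arbitrary SI-unit placeholders, not data from this well.

```python
# Darcy's law rearranged for permeability: k = q * mu * L / (A * dp).
# Consistent SI units give k in m^2; 1 darcy ≈ 9.869e-13 m^2.
DARCY_M2 = 9.869e-13

def darcy_permeability(q_m3s, mu_pas, length_m, area_m2, dp_pa):
    """Permeability (m^2) from flow rate, viscosity, length, area, pressure drop."""
    return q_m3s * mu_pas * length_m / (area_m2 * dp_pa)

k = darcy_permeability(q_m3s=1e-6, mu_pas=1e-3, length_m=0.1,
                       area_m2=1e-3, dp_pa=1e5)
print(k, k / DARCY_M2)  # m^2, and the same value expressed in darcies
```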

  2. Glutenite bodies sequence division of the upper Es4 in northern Minfeng zone of Dongying Sag, Bohai Bay Basin, China

    NASA Astrophysics Data System (ADS)

    Shao, Xupeng

    2017-04-01

    Glutenite bodies are widely developed in the northern Minfeng zone of the Dongying Sag, and their litho-electric relationship is not clear. In addition, because the conventional sequence stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence stratigraphic research. The wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage, compared with conventional methods, of dividing sequence stratigraphy quantitatively. Building on the conventional sequence research method, this paper used the above techniques to divide the fourth-order sequences of the upper Es4 in the northern Minfeng zone of the Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent: both divide sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but low resolution, so the two techniques should be combined. The upper Es4 in the northern Minfeng zone of the Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with the depositional characteristics of a fining-upward sequence in granularity. Key words: Dongying Sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy
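
    A minimal sketch of the logging-data wavelet idea, using a single-level Haar transform applied recursively to a synthetic log. The input values are invented, and real workflows typically use continuous wavelet transforms rather than this toy; the point is only that large detail coefficients at coarse levels flag candidate sequence boundaries.

```python
# Recursive Haar decomposition of a synthetic gamma-ray log: pairwise averages
# (approximation) carry the trend, half-differences (detail) carry the jumps.

def haar_step(signal):
    """One Haar level: pairwise averages and half-differences."""
    approx = [(signal[i] + signal[i + 1]) / 2
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

log = [1, 1, 1, 1, 5, 5, 5, 5]  # abrupt baseline shift mid-log
levels = []
a = log
while len(a) > 1:
    a, d = haar_step(a)
    levels.append(d)
print(levels)  # the nonzero coarse-level detail marks the boundary
```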

  3. An evaluation of borehole flowmeters used to measure horizontal ground-water flow in limestones of Indiana, Kentucky, and Tennessee, 1999

    USGS Publications Warehouse

    Wilson, John T.; Mandell, Wayne A.; Paillet, Frederick L.; Bayless, E. Randall; Hanson, Randall T.; Kearl, Peter M.; Kerfoot, William B.; Newhouse, Mark W.; Pedler, William H.

    2001-01-01

    Three borehole flowmeters and hydrophysical logging were used to measure ground-water flow in carbonate bedrock at sites in southeastern Indiana and on the west-central border of Kentucky and Tennessee. The three flowmeters make point measurements of the direction and magnitude of horizontal flow, and hydrophysical logging measures the magnitude of horizontal flow over an interval. The directional flowmeters evaluated include a horizontal heat-pulse flowmeter, an acoustic Doppler velocimeter, and a colloidal borescope flowmeter. Each method was used to measure flow in selected zones where previous geophysical logging had indicated water-producing beds, bedding planes, or other permeable features that made conditions favorable for horizontal-flow measurements. Background geophysical logging indicated that ground-water production from the Indiana test wells was characterized by inflow from a single, 20-foot-thick limestone bed. The Kentucky/Tennessee test wells produced water from one or more bedding planes where geophysical logs indicated the bedding planes had been enlarged by dissolution. Two of the three test wells at the latter site contained measurable vertical flow between two or more bedding planes under ambient hydraulic head conditions. Field measurements and data analyses for each flow-measurement technique were completed by a developer of the technology or by a contractor with extensive experience in the application of that specific technology. Comparison of the horizontal-flow measurements indicated that the three point-measurement techniques rarely measured the same velocities and flow directions at the same measurement stations. Repeat measurements at selected depth stations also failed to consistently reproduce flow direction, flow magnitude, or both. At a few test stations, two of the techniques provided similar flow magnitude or direction but usually not both. 
    Some of this variability may be attributed to naturally occurring changes in hydraulic conditions during the 1-month study period in August and September 1999. The actual velocities and flow directions are unknown; therefore, it is uncertain which technique provided the most accurate measurements of horizontal flow in the boreholes and which measurements were most representative of flow in the aquifers. The horizontal heat-pulse flowmeter consistently yielded flow magnitudes considerably less than those provided by the acoustic Doppler velocimeter and colloidal borescope. The design of the horizontal heat-pulse flowmeter compensates for the local acceleration of ground-water velocity in the open borehole. The magnitude of the velocities estimated from the hydrophysical logging was comparable to that of the horizontal heat-pulse flowmeter, presumably because the hydrophysical logging also effectively compensates for the effect of the borehole on the flow field and averages velocity over a length of borehole rather than at a point. The acoustic Doppler velocimeter and colloidal borescope have discrete sampling points that allow for measuring preferential flow velocities that can be substantially higher than the average velocity through a length of borehole. The acoustic Doppler velocimeter and colloidal borescope also measure flow at the center of the borehole where the acceleration of the flow field should be greatest. Of the three techniques capable of measuring direction and magnitude of horizontal flow, only the acoustic Doppler velocimeter measured vertical flow. The acoustic Doppler velocimeter consistently measured downward velocity in all test wells. This apparent downward flow was attributed, in part, to particles falling through the water column as a result of mechanical disturbance during logging. Hydrophysical logging yielded estimates of vertical flow in the Kentucky/Tennessee test wells. 
    In two of the test wells, the hydrophysical logging involved deliberate isolation of water-producing bedding planes with a packer to ensure that small horizontal flow could be quantified without the presence of vertical flow. The presence of vertical flow in the Kentucky/Tennessee test wells may preclude the definitive measurement of horizontal flow without the use of effective packer devices. None of the point-measurement techniques used a packer, but each technique used baffle devices to help suppress the vertical flow. The effectiveness of these baffle devices is not known; therefore, the effect of vertical flow on the measurements cannot be quantified. The general lack of agreement among the point-measurement techniques in this study highlights the difficulty of using measurements at a single depth point in a borehole to characterize the average horizontal flow in a heterogeneous aquifer. The effective measurement of horizontal flow may depend on the precise depth at which measurements are made, and the measurements at a given depth may vary over time as hydraulic head conditions change. The various measurements also demonstrate that the magnitude and possibly the direction of horizontal flow are affected by the presence of the open borehole. Although there is a lack of agreement among the measurement techniques, these results could mean that effective characterization of horizontal flow in heterogeneous aquifers might be possible if data from many depth stations and from repeat measurements can be averaged over an extended time period. Complications related to vertical flow in the borehole highlight the importance of using background logging methods like vertical flowmeters or hydrophysical logging to characterize the borehole environment before horizontal-flow measurements are attempted. If vertical flow is present, a packer device may be needed to acquire definitive measurements of horizontal flow. 
    Because hydrophysical logging provides a complete depth profile of the borehole, a strength of this technique is in identifying horizontal- and vertical-flow zones in a well. Hydrophysical logging may be most applicable as a screening method. Horizontal-flow zones identified with the hydrophysical logging then could be evaluated with one of the point-measurement techniques for quantifying preferential flow zones and flow directions. Additional research is needed to determine how measurements of flow in boreholes relate to flow in bedrock aquifers. The flowmeters may need to be evaluated under controlled laboratory conditions to determine which of the methods accurately measure ground-water velocities and flow directions. Additional research also is needed to investigate variations in flow direction with time, daily changes in velocity, velocity corrections for fractured bedrock aquifers and unconsolidated aquifers, and directional differences in individual wells for hydraulically separated flow zones.

  4. Fast projection/backprojection and incremental methods applied to synchrotron light tomographic reconstruction.

    PubMed

    de Lima, Camila; Salomão Helou, Elias

    2018-01-01

    Iterative methods for tomographic image reconstruction have the computational cost of each iteration dominated by the computation of the (back)projection operator, which takes roughly O(N³) floating point operations (flops) for N × N pixel images. Furthermore, classical iterative algorithms may take too many iterations to achieve acceptable images, making these techniques impractical for high-resolution images. Techniques have been developed in the literature to reduce the computational cost of the (back)projection operator to O(N² log N) flops. Also, incremental algorithms have been devised that reduce by an order of magnitude the number of iterations required to achieve acceptable images. The present paper introduces an incremental algorithm with a cost of O(N² log N) flops per iteration and applies it to the reconstruction of very large tomographic images obtained from synchrotron light illuminated data.
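
    A back-of-envelope comparison of the two per-iteration costs quoted above (constant factors ignored) shows why the fast operator matters more as images grow:

```python
import math

def speedup(n):
    """Ratio of naive O(N^3) to fast O(N^2 log N) flop counts, constants ignored."""
    return n ** 3 / (n * n * math.log2(n))

# The asymptotic speedup is N / log2(N), so it grows with image size.
for n in (256, 1024, 4096):
    print(n, round(speedup(n), 1))
```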

  5. Postfire logging: is it beneficial to a forest?

    Treesearch

    Sally Duncan

    2002-01-01

    Public debate on postfire logging has intensified in recent years, particularly since passage of the "salvage rider" in 1995, directing accelerated harvest of dead trees in the western United States. Supporters of postfire logging argue that it is part of a suite of restoration techniques, and that removal of timber means reduction of fuels for...

  6. Capabilities and limitations of dispersive liquid-liquid microextraction with solidification of floating organic drop for the extraction of organic pollutants from water samples.

    PubMed

    Vera-Avila, Luz E; Rojo-Portillo, Tania; Covarrubias-Herrera, Rosario; Peña-Alvarez, Araceli

    2013-12-17

    Dispersive liquid-liquid microextraction with solidification of floating organic drop (DLLME-SFO) is one of the most interesting sample preparation techniques developed in recent years. Although several applications have been reported, the potential and limitations of this simple and rapid extraction technique have not been made sufficiently explicit. In this work, the extraction efficiency of DLLME-SFO for pollutants from different chemical families was determined. Studied compounds include 10 polycyclic aromatic hydrocarbons, 5 pesticides (chlorophenoxy herbicides and DDT), 8 phenols and 6 sulfonamides, covering a large range of polarity and hydrophobicity (LogKow 0-7 overall). After optimization of extraction conditions using 1-dodecanol as extractant, the procedure was applied to the extraction of each family from 10-mL spiked water samples, only adjusting sample pH as required. Absolute recoveries for pollutants with LogKow 3-7 were >70%, and recovery values within this group (18 compounds) were independent of structure or hydrophobicity; the precision of recovery was very acceptable (RSD<12%) and linear behavior was observed in the studied concentration range (r(2)>0.995). Extraction recoveries for pollutants with LogKow 1.46-2.8 were in the range 13-62%, directly depending on individual LogKow values; however, good linearity (r(2)>0.993) and precision (RSD<6.5%) were also demonstrated for these polar solutes despite the lower recoveries. DLLME-SFO with 1-dodecanol completely failed for extraction of compounds with LogKow≤1 (sulfa drugs); other, more polar extraction solvents (e.g., ionic liquids) should be explored for highly hydrophilic pollutants. Copyright © 2013 Elsevier B.V. All rights reserved.
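
    The dependence of recovery on LogKow is consistent with simple partition equilibrium. The sketch below treats Kow as a rough stand-in for the water/1-dodecanol partition coefficient and uses volumes typical of DLLME-SFO scales, not the paper's exact conditions, so the numbers illustrate the trend rather than the reported recoveries.

```python
# Single-stage liquid-liquid extraction at equilibrium: recovery
# E = K*Vo / (K*Vo + Vaq), with K ~ 10**LogKow as a crude proxy.

def recovery(log_k, vo_ml=0.05, vaq_ml=10.0):
    """Fraction extracted into the organic drop for a given log partition coefficient."""
    k = 10.0 ** log_k
    return k * vo_ml / (k * vo_ml + vaq_ml)

for log_k in (1, 2, 3, 5):
    print(log_k, round(recovery(log_k), 2))
```

    The trend matches the abstract: near-total failure below LogKow 1, partial and LogKow-dependent recovery in the middle range, and high recovery from LogKow 3 up.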

  7. Influence of drilling operations on drilling mud gas monitoring during IODP Exp. 338 and 348

    NASA Astrophysics Data System (ADS)

    Hammerschmidt, Sebastian; Toczko, Sean; Kubo, Yusuke; Wiersberg, Thomas; Fuchida, Shigeshi; Kopf, Achim; Hirose, Takehiro; Saffer, Demian; Tobin, Harold; Expedition 348 Scientists, the

    2014-05-01

    The history of scientific ocean drilling has produced new techniques and technologies for drilling science, dynamic positioning being one of the most famous. However, while industry has developed newer tools and techniques, only some of these have been used in scientific ocean drilling. The introduction of riser drilling, which recirculates the drilling mud and returns solids and gases from the formation to the platform, to the Integrated Ocean Drilling Program (IODP) through the launch of the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) riser-drilling vessel D/V Chikyu has made some of these techniques available to science. IODP Expedition 319 (NanTroSEIZE Stage 2: riser/riserless observatory) was the first such attempt, and among the tools and techniques used was drilling mud gas analysis. While industry regularly conducts drilling mud gas logging for safety concerns and reservoir evaluation, science is more interested in other components (e.g., He, 222Rn) that are beyond the scope of typical mud logging services. Drilling mud gas logging simply examines the gases released into the drilling mud as part of the drilling process; the bit breaks and grinds the formation, releasing any trapped gases. These then circulate within the "closed circuit" mud flow back to the drilling rig, where a degasser extracts the gases and passes them on to a dedicated mud gas logging unit. The unit contains gas chromatographs, mass spectrometers, spectral analyzers, radon gas analyzers, and a methane carbon isotope analyzer. Data are collected and stored in a database, together with several drilling parameters (rate of penetration, mud density, etc.). This initial attempt was further refined during IODP Expeditions 337 (Deep Coalbed Biosphere off Shimokita), 338 (NanTroSEIZE Stage 3: NanTroSEIZE Plate Boundary Deep Riser 2) and finally 348 (NanTroSEIZE Stage 3: NanTroSEIZE Plate Boundary Deep Riser 3). 
    Although still in its development stage for scientific application, this technique can provide a valuable suite of measurements to complement more traditional IODP shipboard measurements. Here we present unpublished data from IODP Expeditions 338 and 348, penetrating the Nankai accretionary wedge to 3058.5 meters below seafloor. Increasing mud density decreased degasser efficiency, especially for higher hydrocarbons. Blurring of the relative variations in total gas with depth was observed, and confirmed by comparison with headspace gas concentrations from the cored interval. Theoretically, overpressured zones in the formation can be identified through C2/C3 ratios, but these ratios are strongly affected by changing drilling parameters. Proper mud gas evaluations will need to carefully consider the effects of variable drilling parameters when designing experiments and interpreting the data.

  8. Heterogeneous Shallow-Shelf Carbonate Buildups in the Paradox Basin, Utah and Colorado: Targets for Increased Oil Production and Reserves Using Horizontal Drilling Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wray, Laura L.; Eby, David E.; Chidsey, Jr., Thomas C.

    2002-07-24

    This report covers research activities for the second half of the second project year (October 6, 2001, through April 5, 2002). This work includes description and analysis of cores, correlation of geophysical well logs, reservoir mapping, petrographic description of thin sections, cross plotting of permeability and porosity data, and development of horizontal drilling strategies for the Little Ute and Sleeping Ute fields in Montezuma County, Colorado. Geological characterization on a local scale focused on reservoir heterogeneity, quality, and lateral continuity, as well as possible compartmentalization, within these fields. This study utilizes representative core, geophysical logs, and thin sections to characterize and grade each field's potential for drilling horizontal laterals from existing development wells.

  9. Fuzzy inference system for identification of geological stratigraphy off Prydz Bay, East Antarctica

    NASA Astrophysics Data System (ADS)

    Singh, Upendra K.

    2011-12-01

    The analysis of well logging data plays a key role in the exploration and development of hydrocarbon reservoirs. Various well log parameters, such as porosity, gamma ray, density, transit time and resistivity, help in the classification of strata and the estimation of the physical, electrical and acoustical properties of the subsurface lithology. Strong and conspicuous changes in some of the log parameters associated with a particular stratigraphic formation are a function of its composition and physical properties, which aids classification. However, some substrata show moderate values in the respective log parameters, making it difficult to identify the kind of strata from the standard variability ranges of the log parameters and visual inspection alone. The complexity increases further as the number of sensors grows. An attempt is made to identify the kinds of stratigraphy from well logs over the Prydz Bay basin, East Antarctica, using a fuzzy inference system. A model is built from a few data sets of known stratigraphy, and the model is then used to infer the lithology of a borehole from its geophysical logs, which were not used in training. Initially the fuzzy-based algorithm is trained, validated and tested on well log data; it then identifies the formation lithology of a hydrocarbon reservoir system of the study area. The effectiveness of this technique is demonstrated by comparing the results with actual lithologs and coring data of ODP Leg 188. The fuzzy results show that the training performance is 82.95% while the prediction ability is 87.69%. The fuzzy results are very encouraging, and the model is able to decipher even thin layer seams and other strata from geophysical logs. The results identify a significant sand formation in the depth range 316.0-341.0 m, where core recovery is incomplete.
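
    A minimal sketch of a Mamdani-style fuzzy rule base for lithology from logs: triangular membership functions per log, min for rule conjunction, and the winning class taken by maximum firing strength. The membership breakpoints, the two logs used, and the two classes are invented for illustration and are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify(gamma_api, porosity_frac):
    """Two toy rules: low gamma AND moderate porosity -> sand;
    high gamma AND low porosity -> shale. AND is min, winner is max."""
    sand = min(tri(gamma_api, 0, 30, 70), tri(porosity_frac, 0.10, 0.25, 0.40))
    shale = min(tri(gamma_api, 60, 110, 160), tri(porosity_frac, 0.0, 0.05, 0.15))
    return "sand" if sand > shale else "shale"

print(classify(25, 0.22), classify(100, 0.06))
```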

  10. Successful Sampling Strategy Advances Laboratory Studies of NMR Logging in Unconsolidated Aquifers

    NASA Astrophysics Data System (ADS)

    Behroozmand, Ahmad A.; Knight, Rosemary; Müller-Petke, Mike; Auken, Esben; Barfod, Adrian A. S.; Ferré, Ty P. A.; Vilhelmsen, Troels N.; Johnson, Carole D.; Christiansen, Anders V.

    2017-11-01

    The nuclear magnetic resonance (NMR) technique has become popular in groundwater studies because it responds directly to the presence and mobility of water in a porous medium. There is a need to conduct laboratory experiments to aid in the development of NMR hydraulic conductivity models, as is typically done in the petroleum industry. However, the challenge has been obtaining high-quality laboratory samples from unconsolidated aquifers. At a study site in Denmark, we employed sonic drilling, which minimizes the disturbance of the surrounding material, and extracted twelve 7.6 cm diameter samples for laboratory measurements. We present a detailed comparison of the acquired laboratory and logging NMR data. The agreement observed between the laboratory and logging data suggests that the methodologies proposed in this study provide good conditions for studying NMR measurements of unconsolidated near-surface aquifers. Finally, we show how laboratory sample size and condition impact the NMR measurements.

  11. Protecting log cabins from decay

    Treesearch

    R. M. Rowell; J. M. Black; L. R. Gjovik; W. C. Feist

    1977-01-01

    This report answers the questions most often asked of the Forest Service on the protection of log cabins from decay, and on practices for the exterior finishing and maintenance of existing cabins. Causes of stain and decay are discussed, as are some basic techniques for building a cabin that will minimize decay. Selection and handling of logs, their preservative...

  12. The Inculcation of Critical Reflection through Reflective Learning Log: An Action Research in Entrepreneurship Module

    ERIC Educational Resources Information Center

    Kheng, Yeoh Khar

    2017-01-01

    Purpose: This study is part of the Scholarship of Teaching and Learning (SoTL) grant to examine written reflective learning logs among students studying BPME 3073 Entrepreneurship in UUM. Method: The data collection technique was researcher-directed textual data gathered through reflective learning logs, obtained from one hundred forty students. A…

  13. Internal log scanning: Research to reality

    Treesearch

    Daniel L. Schmoldt

    2000-01-01

    Improved log breakdown into lumber has been an active research topic since the 1960's. Demonstrated economic gains have driven the search for a cost-effective method to scan logs internally, from which it is assumed one can choose a better breakdown strategy. X-ray computed tomography (CT) has been widely accepted as the most promising internal imaging technique....

  14. 75 FR 60122 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... the respondents, including the use of automated collection techniques or other forms of information...: OMB Control Number: 3060-0360. Title: Section 80.409, Station Logs. Form No.: N/A. Type of Review... for filing suits upon such claims. Section 80.409(d), Ship Radiotelegraph Logs: Logs of ship stations...

  15. Application of Nuclear Well Logging Techniques to Lunar Resource Assessment

    NASA Technical Reports Server (NTRS)

    Albats, P.; Groves, J.; Schweitzer, J.; Tombrello, T.

    1992-01-01

    The use of neutron and gamma ray measurements for the analysis of material composition has become well established in the last 40 years. Schlumberger has pioneered the use of this technology for logging wells drilled to produce oil and gas, and for this purpose has developed neutron generators that allow measurements to be made in deep (5000 m) boreholes under adverse conditions. We also make ruggedized neutron and gamma ray detector packages that can be used to make reliable measurements on the drill collar of a rotating drill string while the well is being drilled, where the conditions are severe. Modern nuclear methods used in logging measure rock formation parameters like bulk density and porosity, fluid composition, and element abundances by weight including hydrogen concentration. The measurements are made with high precision and accuracy. These devices (well logging sondes) share many of the design criteria required for remote sensing in space; they must be small, light, rugged, and able to perform reliably under adverse conditions. We see a role for the adaptation of this technology to lunar or planetary resource assessment missions.

  16. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    NASA Astrophysics Data System (ADS)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using the augmented reality (AR) technique. For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions for large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.

  17. Collisional-radiative switching - A powerful technique for converging non-LTE calculations

    NASA Technical Reports Server (NTRS)

    Hummer, D. G.; Voels, S. A.

    1988-01-01

    A very simple technique has been developed to converge statistical equilibrium and model atmospheric calculations in extreme non-LTE conditions when the usual iterative methods fail to converge from an LTE starting model. The proposed technique is based on a smooth transition from a collision-dominated LTE situation to the desired non-LTE conditions in which radiation dominates, at least in the most important transitions. The proposed approach was used to successfully compute stellar models with He abundances of 0.20, 0.30, and 0.50; Teff = 30,000 K, and log g = 2.9.

  18. Log Houses in les Laurentides: From Oral Tradition to an Integrated Digital Documentation Based on the Re-Discovery of the Traditional Constructive-Geographical 'Repertoires' Through a Digital BIM Data Archive

    NASA Astrophysics Data System (ADS)

    Esponda, M.; Piraino, F.; Stanga, C.; Mezzino, D.

    2017-08-01

    This paper presents an integrated approach between digital documentation workflows and historical research to document log houses, an outstanding example of vernacular architecture in Quebec, focusing on their geometrical-dimensional characteristics as well as on the intangible elements associated with these historical structures. The 18 log houses selected in the Laurentians represent the material culture of how settlers adapted to the harsh Quebec environment at the end of the nineteenth century. The essay describes some results of research begun by professor Mariana Esponda in 2015 (Carleton University); the digital documentation was carried out through the grant New Paradigm/New Tools for Architectural Heritage in Canada, supported by the SSHRC Training Program (May-August 2016). The workflow of the research started with the digital documentation, accomplished with laser scanning techniques, followed by on-site observations and archival research. This led to the creation of an 'abacus', a first step toward the development of a territorial-historical database of the log houses, potentially updatable by other researchers. Another important part of the documentation of these buildings has been the development of Historic Building Information Models, fundamental to analyzing the geometry of the logs and understanding how these constructions were built. The realization of HBIMs was a first step in the modeling of irregular shapes such as those of the logs; different Levels of Detail were adopted in order to show how the models can be used for different purposes. In the future, they can potentially be used for the creation of a virtual tour app for the storytelling of these buildings.

  19. Financial returns under uncertainty for conventional and reduced-impact logging in permanent production forests of the Brazilian Amazon

    Treesearch

    Frederick Boltz; Douglas R. Carter; Thomas P. Holmes; Rodrigo Pereira

    2001-01-01

    Reduced-impact logging (RIL) techniques are designed to improve the efficiency of timber harvesting while mitigating its adverse effects on the forest ecosystem. Research on RIL in select tropical forest regions has demonstrated clear ecological benefits relative to conventional logging (CL) practices while the financial competitiveness of RIL is less conclusive. We...

  20. Boat-Wave-Induced Bank Erosion on the Kenai River, Alaska

    DTIC Science & Technology

    2008-03-01

    Various types of streambank stabilization on the Kenai River are described; common stabilization techniques consist of root wads, spruce tree revetments, coir logs, and riprap. Referenced figures include Type 1 banks with coir log habitat restoration (Figure 50) and with willow plantings/ladder access habitat restoration (Figure 51).

  1. Break-even zones for cable yarding by log size

    Treesearch

    Chris B. LeDoux

    1984-01-01

    The use of cable logging to extract small pieces of residue wood may result in low rates of production and a high cost per unit of wood produced. However, the logging manager can improve yarding productivity and break even in cable residue removal operations by using proper planning techniques. In this study, break-even zones for specific young-growth stands were...
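
    The break-even idea in the abstract reduces to a simple comparison of yarding cost per piece against the value recovered per piece. The sketch below is a minimal illustration under assumed inputs (hourly yarding cost, pieces yarded per hour, and residue value per cubic meter); the cited study derives its zones from stand- and terrain-specific production data, not from this formula.

```python
def breakeven_piece_size(yarding_cost_per_hour, pieces_per_hour, value_per_m3):
    """Smallest average piece volume (m^3) at which cable yarding of residue
    breaks even: the value recovered from a piece must cover the yarding
    cost allocated to that piece. Illustrative only."""
    cost_per_piece = yarding_cost_per_hour / pieces_per_hour
    return cost_per_piece / value_per_m3
```

    For example, at $120/h, 30 pieces per hour, and $40/m3, any average piece smaller than 0.1 m3 would be yarded at a loss.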

  2. Using a Video Split-Screen Technique To Evaluate Streaming Instructional Videos.

    ERIC Educational Resources Information Center

    Gibbs, William J.; Bernas, Ronan S.; McCann, Steven A.

    The Media Center at Eastern Illinois University developed and streamed on the Internet 26 short (one to five minutes) instructional videos about WebCT that illustrated specific functions, including logging-in, changing a password, and using chat. This study observed trainees using and reacting to selections of these videos. It set out to assess…

  3. Preliminary logging analysis system (PLANS): overview.

    Treesearch

    R.H. Twito; S.E. Reutebuch; R.J. McGaughey; C.N. Mann

    1987-01-01

    The paper previews a computer-aided design system, PLANS, that is useful for developing timber harvest and road network plans on large-scale topographic maps. Earlier planning techniques are reviewed, and the advantages of using advanced planning systems like PLANS are explained. There is a brief summary of the input, output, and function of each program in the PLANS...

  4. Wilderness experience in Rocky Mountain National Park 2002: Report to RMNP

    USGS Publications Warehouse

    Schuster, Elke; Johnson, S. Shea; Taylor, Jonathan G.

    2004-01-01

    The social science technique of Visitor Employed Photography [VEP] was used to obtain information from visitors about wilderness experiences. Visitors were selected at random from Park-designated wilderness trails, in proportion to their use, and asked to participate in the survey. Respondents were given single-use, 10-exposure cameras and photo-log diaries to record experiences. A total of 293 cameras were distributed, with a response rate of 87%. Following the development of the photos, a copy of the photos, two pertinent pages from the photo-log, and a follow-up survey were mailed to respondents. Fifty six percent of the follow-up surveys were returned. Findings from the two surveys were analyzed and compared.

  5. Forest Road Identification and Extraction Through Advanced Log Matching Techniques

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Hu, B.; Quist, L.

    2017-10-01

    A novel algorithm for forest road identification and extraction was developed. The algorithm utilized a Laplacian of Gaussian (LoG) filter on high-resolution multispectral imagery and slope calculation on LiDAR data to extract both primary and secondary road segments in the forest area. The proposed method used road shape features to extract the road segments, which were further processed as objects with orientation preserved. The road network was generated after post-processing with tensor voting. The proposed method was tested on the Hearst forest, located in central Ontario, Canada. Based on visual examination against manually digitized roads, the majority of roads in the test area were identified and extracted by the process.
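
    A minimal sketch of the LoG filtering step is shown below, using a synthetic image rather than the multispectral data of the study. A bright linear feature (a road-like stripe) produces a strongly negative LoG response along its centerline, so thresholding the negated response isolates candidate road pixels; the sigma and threshold values here are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic grayscale image: a bright horizontal "road" stripe on a dark background.
img = np.zeros((64, 64), dtype=float)
img[30:34, :] = 1.0

# Laplacian-of-Gaussian response; sigma would be tuned to the expected road width.
log_response = gaussian_laplace(img, sigma=2.0)

# A bright ridge yields a negative LoG response at its centre, so thresholding
# the negated response flags candidate road pixels (threshold is illustrative).
road_mask = -log_response > 0.05
```

    In the actual pipeline, the resulting mask would be intersected with LiDAR-derived slope constraints and grouped into oriented objects before tensor voting reconstructs the road network.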

  6. Laser optogalvanic spectroscopy of molecules

    NASA Technical Reports Server (NTRS)

    Webster, C. R.; Rettner, C. T.

    1983-01-01

    In laser optogalvanic (LOG) spectroscopy, a tunable laser is used to probe the spectral characteristics of atomic or molecular species within an electrical discharge in a low pressure gas. Optogalvanic signals arise when the impedance of the discharge changes in response to the absorption of laser radiation. The technique may, therefore, be referred to as impedance spectroscopy. This change in impedance may be monitored as a change in the voltage across the discharge tube. LOG spectra are recorded by scanning the wavelength of a chopped CW dye laser while monitoring the discharge voltage with a lock-in amplifier. LOG signals are obtained if the laser wavelength matches a transition in a species present in the discharge (or flame), and if the absorption of energy in the laser beam alters the impedance of the discharge. Infrared LOG spectroscopy of molecules has been demonstrated and may prove to be the most productive application in the field of optogalvanic techniques.

  7. Detecting and classifying method based on similarity matching of Android malware behavior with profile.

    PubMed

    Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang

    2016-01-01

    Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive. On-device techniques are prone to evasion, while off-device techniques need to be always online. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. To that end, Andro-profiler classifies malware by exploiting behavior profiles extracted from integrated system logs, including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing them. By comparing the behavior profile of a malicious application with the representative behavior profile of each malware family using a weighted similarity matching technique, Andro-profiler detects the malware and classifies it into a family. The experimental results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98 %, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
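
    The paper's exact similarity metric is not given in this abstract; as one plausible form of weighted similarity matching, the sketch below compares a sample's behavior-frequency profile against per-family representative profiles with a weighted cosine similarity. The behavior names, weights, and threshold are invented for illustration.

```python
import math

def weighted_similarity(profile, family_profile, weights):
    """Weighted cosine similarity between two behavior-frequency profiles
    (dicts of behavior -> count); behaviors absent from a profile count as 0."""
    keys = set(profile) | set(family_profile)
    dot = sum(weights.get(k, 1.0) * profile.get(k, 0) * family_profile.get(k, 0) for k in keys)
    na = math.sqrt(sum(weights.get(k, 1.0) * profile.get(k, 0) ** 2 for k in keys))
    nb = math.sqrt(sum(weights.get(k, 1.0) * family_profile.get(k, 0) ** 2 for k in keys))
    return dot / (na * nb) if na and nb else 0.0

def classify(profile, families, weights, threshold=0.5):
    """Assign the sample to the most similar family profile, or None if no
    family clears the similarity threshold."""
    best, score = None, 0.0
    for name, family_profile in families.items():
        s = weighted_similarity(profile, family_profile, weights)
        if s > score:
            best, score = name, s
    return best if score >= threshold else None
```

    A sample whose log-derived profile is dominated by, say, SMS-sending calls would then match an SMS-trojan family profile more strongly than a spyware one.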

  8. Design and Evaluation of Log-To-Dimension Manufacturing Systems Using System Simulation

    Treesearch

    Wenjie Lin; D. Earl Kline; Philip A. Araman; Janice K. Wiedenbeck

    1995-01-01

    In a recent study of alternative dimension manufacturing systems that produce green hardwood dimension directly from logs, it was observed that for Grade 2 and 3 red oak logs, up to 78 and 76 percent of the log scale volume could be converted into clear dimension parts. The potential high yields suggest that this processing system can be a promising technique for...

  9. Permeability Estimation Directly From Logging-While-Drilling Induced Polarization Data

    NASA Astrophysics Data System (ADS)

    Fiandaca, G.; Maurya, P. K.; Balbarini, N.; Hördt, A.; Christiansen, A. V.; Foged, N.; Bjerg, P. L.; Auken, E.

    2018-04-01

    In this study, we present the prediction of permeability from time domain spectral induced polarization (IP) data, measured in boreholes on undisturbed formations using the El-log logging-while-drilling technique. We collected El-log data and hydraulic properties on unconsolidated Quaternary and Miocene deposits in boreholes at three locations at a field site in Denmark, characterized by different electrical water conductivity and chemistry. The high vertical resolution of the El-log technique matches the lithological variability at the site, minimizing ambiguity in the interpretation originating from resolution issues. The permeability values were computed from IP data using a laboratory-derived empirical relationship presented in a recent study for saturated unconsolidated sediments, without any further calibration. A very good correlation, within 1 order of magnitude, was found between the IP-derived permeability estimates and those derived using grain size analyses and slug tests, with similar depth trends and permeability contrasts. Furthermore, the effect of water conductivity on the IP-derived permeability estimations was found negligible in comparison to the permeability uncertainties estimated from the inversion and the laboratory-derived empirical relationship.

  10. Simpler ISS Flight Control Communications and Log Keeping via Social Tools and Techniques

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Cowart, Hugh; Stevens, Dan

    2012-01-01

    The heart of flight operations control involves a) communicating effectively in real time with other controllers in the room and/or in remote locations and b) tracking significant events, decisions, and rationale to support the next set of decisions, provide a thorough shift handover, and troubleshoot/improve operations. International Space Station (ISS) flight controllers speak with each other via multiple voice circuits or loops, each with a particular purpose and constituency. Controllers monitor and/or respond to several loops concurrently. The primary tracking tools are console logs, typically kept by a single operator and not visible to others in real-time. Information from telemetry, commanding, and planning systems also plays into decision-making. Email is very secondary/tertiary due to timing and archival considerations. Voice communications and log entries supporting ISS operations have increased by orders of magnitude because the number of control centers, flight crew, and payload operations have grown. This paper explores three developmental ground system concepts under development at Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) and Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC). These concepts could reduce ISS control center voice traffic and console logging yet increase the efficiency and effectiveness of both. The goal of this paper is to kindle further discussion, exploration, and tool development.

  11. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, namely single-attribute analysis, multi-attribute analysis, and a probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot field, an oil field in Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find a suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher-resolution porosity sections. A low-impedance (6000-8000 m/s g/cc) and high-porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections in the 1060-1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient method for predicting porosity in the inter-well region.
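
    In its simplest form, the multi-attribute analysis mentioned above fits a linear operator from seismic attributes to log-derived porosity at well locations and then applies it between wells. The sketch below uses invented attribute and porosity values purely to show the least-squares mechanics; the actual study uses many more attributes and a convolutional operator.

```python
import numpy as np

# Hypothetical training data: two seismic attributes (acoustic impedance,
# amplitude envelope) sampled at well locations, with log-derived porosity.
attributes = np.array([[7000.0, 0.8], [6500.0, 1.1], [8000.0, 0.6], [6000.0, 1.3]])
porosity = np.array([0.16, 0.19, 0.12, 0.22])

# Fit porosity = a*impedance + b*envelope + c by least squares.
A = np.column_stack([attributes, np.ones(len(attributes))])
coef, *_ = np.linalg.lstsq(A, porosity, rcond=None)

# Apply the fitted operator to attributes in the inter-well region.
new_attrs = np.array([[7200.0, 0.75]])
pred = np.column_stack([new_attrs, np.ones(1)]) @ coef
```

    The probabilistic neural network replaces this linear operator with a kernel-based nonparametric estimator, which is why it can capture the sharper porosity contrasts reported in the abstract.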

  12. A three-dimensional bucking system for optimal bucking of Central Appalachian hardwoods

    Treesearch

    Jingxin Wang; Jingang Liu; Chris B. LeDoux

    2009-01-01

    An optimal tree stem bucking system was developed for central Appalachian hardwood species using three-dimensional (3D) modeling techniques. ActiveX Data Objects were implemented via MS Visual C++/OpenGL to manipulate tree data which were supported by a backend relational data model with five data entity types for stems, grades and prices, logs, defects, and stem shapes...

  13. Productivity and cost estimators for conventional ground-based skidding on steep terrain using preplanned skid roads

    Treesearch

    Michael D. Erickson; Curt C. Hassler; Chris B. LeDoux

    1991-01-01

    Continuous time and motion study techniques were used to develop productivity and cost estimators for the skidding component of ground-based logging systems operating on steep terrain using preplanned skid roads. Comparisons of productivity and costs were analyzed for an overland random-access skidding method versus a skidding method utilizing a network of preplanned...

  14. Application of work sampling technique to analyze logging operations.

    Treesearch

    Edwin S. Miyata; Helmuth M. Steinhilb; Sharon A. Winsauer

    1981-01-01

    Discusses the advantages and disadvantages of various time study methods for determining efficiency and productivity in logging. The work sampling method is compared with the continuous time-study method. Gives the feasibility, capability, and limitation of the work sampling method.
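
    The core of the work sampling method contrasted above is estimating the proportion of total work time spent in each activity from instantaneous random observations, rather than timing every element continuously. A minimal sketch, with invented activity names and the usual normal-approximation confidence half-width:

```python
import math
from collections import Counter

def work_sampling_estimate(observations, z=1.96):
    """Estimate the proportion of work time in each observed activity from
    instantaneous random observations, returning for each activity a
    (proportion, confidence half-width) pair at the given z value."""
    n = len(observations)
    counts = Counter(observations)
    return {
        activity: (k / n, z * math.sqrt((k / n) * (1 - k / n) / n))
        for activity, k in counts.items()
    }

# Hypothetical shift: 100 random snapshots of a skidding operation.
observations = ["skidding"] * 60 + ["delay"] * 40
estimates = work_sampling_estimate(observations)
```

    This illustrates the trade-off the paper discusses: work sampling is cheap to collect but yields proportions with sampling error, whereas continuous time study yields element-level times at much higher observation cost.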

  15. A stress wave based approach to NDE of logs for assessing potential veneer quality: Part I—small-diameter ponderosa pine.

    Treesearch

    Robert J. Ross; Susan W. Willits; William Von Segen; Terry Black; Brian K. Brashaw; Roy F. Pellerin

    1999-01-01

    Longitudinal stress wave nondestructive evaluation (NDE) techniques have been used in a variety of applications in the forest products industry. Recently, it has been shown that they can significantly aid in the assessment of log quality, particularly when they are used to predict performance of structural lumber obtained from a log. The purpose of the research...

  16. How accurately can we estimate energetic costs in a marine top predator, the king penguin?

    PubMed

    Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J

    2007-01-01

    King penguins (Aptenodytes patagonicus) are one of the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate-rate of oxygen consumption (f(H) - V(O2)) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present f(H) - V(O2) equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the f(H) - V(O2) technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of V(O2) from published, field f(H) data. The major conclusions from the present study are: (1) in contrast to that for walking, the f(H) - V(O2) relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(V(O2)) = -0.279 + 1.24log(f(H)) + 0.0237t - 0.0157log(f(H))t, derived in a previous study, is the most suitable equation presently available for estimating V(O2) in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an f(H) - V(O2) relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of f(H) - V(O2) prediction equations, is explained.
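
    Prediction equation (1) quoted in the abstract can be applied directly once heart rate is measured. The sketch below assumes base-10 logarithms, as is conventional for such allometric equations; the meaning and units of the covariate t are as defined in the original study and are not restated in this excerpt.

```python
import math

def predict_vo2(f_h, t):
    """Predicted rate of oxygen consumption from heart rate f_h via the
    abstract's prediction equation (1):
        log(VO2) = -0.279 + 1.24*log(f_h) + 0.0237*t - 0.0157*log(f_h)*t
    Base-10 logs are assumed; t is the covariate of the original study."""
    log_fh = math.log10(f_h)
    log_vo2 = -0.279 + 1.24 * log_fh + 0.0237 * t - 0.0157 * log_fh * t
    return 10 ** log_vo2
```

    Note the interaction term: the effective slope on log(f_h) shrinks from 1.24 by 0.0157 per unit of t, which is how the single equation spans the different states discussed in the study.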

  17. New Technique for TOC Estimation Based on Thermal Core Logging in Low-Permeable Formations (Bazhen fm.)

    NASA Astrophysics Data System (ADS)

    Popov, Evgeny; Popov, Yury; Spasennykh, Mikhail; Kozlova, Elena; Chekhonin, Evgeny; Zagranovskaya, Dzhuliya; Belenkaya, Irina; Alekseev, Aleksey

    2016-04-01

    A practical method of identifying organic-rich intervals within low-permeability dispersive rocks based on thermal conductivity measurements along the core is presented. Non-destructive, non-contact thermal core logging was performed with the optical scanning technique on 4,685 full-size core samples from 7 wells drilled in four low-permeability zones of the Bazhen formation (B.fm.) in Western Siberia (Russia). The method employs continuous, simultaneous measurements of rock anisotropy, volumetric heat capacity, thermal anisotropy coefficient and thermal heterogeneity factor along the cores, allowing high vertical resolution (up to 1-2 mm). B.fm. rock matrix thermal conductivity was observed to be essentially stable within the range 2.5-2.7 W/(m*K). This stable matrix thermal conductivity, along with a high thermal anisotropy coefficient, is characteristic of B.fm. sediments due to the low rock porosity values. It is shown experimentally that the measured thermal parameters relate linearly to organic richness rather than to deviations in the porosity coefficient. Thus, a new technique was developed employing the transformation of thermal conductivity profiles into continuous profiles of total organic carbon (TOC) values along the core. Comparison of TOC values estimated from thermal conductivity with pyrolytic TOC estimations of 665 core samples, obtained using the Rock-Eval and HAWK instruments, demonstrated the high efficiency of the new technique for separating organic-rich intervals. The data obtained with the new technique are essential for assessing source rock hydrocarbon generation potential, for basin and petroleum system modeling, and for estimating hydrocarbon reserves. The method allows the TOC richness to be accurately assessed using thermal well logs. The research work was done with financial support of the Russian Ministry of Education and Science (unique identification number RFMEFI58114X0008).
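
    The transformation described above, a linear relation between thermal conductivity and TOC calibrated against pyrolysis, can be sketched as a simple regression. All numbers below are hypothetical calibration values chosen only to show the mechanics; they are not the coefficients of the cited study.

```python
import numpy as np

# Hypothetical calibration pairs: optical-scanning thermal conductivity
# (W/(m*K)) on core intervals vs. pyrolytic TOC (wt%) at the same depths.
conductivity = np.array([2.7, 2.5, 2.2, 1.9, 1.6])
toc_pyrolysis = np.array([0.5, 1.5, 3.0, 4.5, 6.0])

# Fit the linear transform TOC = a*conductivity + b (higher organic content
# lowers bulk conductivity, so a is expected to be negative).
a, b = np.polyfit(conductivity, toc_pyrolysis, 1)

# Apply it to a continuous conductivity log to get a continuous TOC profile.
conductivity_log = np.array([2.6, 2.0, 1.7])
toc_profile = a * conductivity_log + b
```

    Once calibrated, the transform turns every millimetre-resolution conductivity profile into a TOC profile, which is what makes the technique attractive for intervals with incomplete core recovery.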

  18. 3D Numerical simulation of bed morphological responses to complex in-streamstructures

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Liu, X.

    2017-12-01

    In-stream structures are widely used in stream restoration for both hydraulic and ecological purposes. The geometries of the structures are usually designed to be extremely complex and irregular, so as to provide nature-like physical habitat. The aim of this study is to develop a numerical model to accurately predict the bed-load transport and the morphological changes caused by complex in-stream structures. The model is developed on the OpenFOAM platform. In the hydrodynamics part, it utilizes different turbulence models to capture the detailed turbulence information near the in-stream structures. The immersed boundary method (IBM) is efficiently implemented in the model to describe the movable bed and the rigid solid body of the in-stream structures. With IBM, the difficulty of mesh generation on the complex geometry is greatly alleviated, and the bed surface deformation can be coupled into the flow system. This morphodynamics model is first validated on simple structures, such as the morphology of the scour in a log-vane structure. It is then applied to a more complex structure, engineered log jams (ELJ), which consist of multiple logs piled together. The numerical results, including turbulence flow information and bed morphological responses, are evaluated against experimental measurements under the exact same flow conditions.

  19. Inhibition of biofilm formation on the surface of water storage containers using biosand zeolite silver-impregnated clay granular and silver impregnated porous pot filtration systems

    PubMed Central

    Moropeng, Resoketswe Charlotte; Mpenyana-Monyatsi, Lizzy; Momba, Maggie Ndombo Benteke

    2018-01-01

    Development of biofilms occurring on the inner surface of storage vessels offers a suitable medium for the growth of microorganisms and consequently contributes to the deterioration of treated drinking water quality in homes. The aim of this study was to determine whether the two point-of-use technologies (biosand zeolite silver-impregnated clay granular (BSZ-SICG) filter and silver-impregnated porous pot (SIPP) filter) deployed in a rural community of South Africa could inhibit the formation of biofilm on the surface of plastic-based containers generally used by rural households for the storage of their drinking water. Culture-based methods and molecular techniques were used to detect the indicator bacteria (Total coliforms, faecal coliform, E. coli) and pathogenic bacteria (Salmonella spp., Shigella spp. and Vibrio cholerae) in intake water and on the surface of storage vessels containing treated water. Scanning electron microscopy was also used to visualize the development of biofilm. Results revealed that the surface water source used by the Makwane community was heavily contaminated and harboured unacceptably high counts of bacteria (heterotrophic plate count: 4.4–4.3 Log10 CFU/100mL, total coliforms: 2.2 Log10 CFU/100 mL—2.1 Log10 CFU/100 mL, faecal coliforms: 1.9 Log10 CFU/100 mL—1.8 Log10 CFU/100 mL, E. coli: 1.7 Log10 CFU/100 mL—1.6 Log10 CFU/100 mL, Salmonella spp.: 3 Log10 CFU/100 mL -8 CFU/100 mL; Shigella spp. and Vibrio cholerae had 1.0 Log10 CFU/100 mL and 0.8 Log10 CFU/100 mL respectively). Biofilm formation was apparent on the surface of the storage containers with untreated water within 24 h. The silver nanoparticles embedded in the clay of the filtration systems provided an effective barrier for the inhibition of biofilm formation on the surface of household water storage containers. 
Biofilm formation occurred on the surface of storage plastic vessels containing drinking water treated with the SIPP filter between 14 and 21 days, and on those containing drinking water treated with the BSZ-SICG filter between 3 and 14 days. The attachment of target bacteria on the surface of the coupons inoculated in storage containers ranged from 0.07 to 227.8 CFU/cm2. To effectively prevent the development of biofilms on the surface of container-stored water, which can lead to the recontamination of treated water, plastic storage containers should be washed within 14 days for water treated with the SIPP filter and within 3 days for water treated with the BSZ-SICG filter. PMID:29621296

  20. Inhibition of biofilm formation on the surface of water storage containers using biosand zeolite silver-impregnated clay granular and silver impregnated porous pot filtration systems.

    PubMed

    Budeli, Phumudzo; Moropeng, Resoketswe Charlotte; Mpenyana-Monyatsi, Lizzy; Momba, Maggie Ndombo Benteke

    2018-01-01

    Biofilms developing on the inner surfaces of storage vessels offer a suitable medium for the growth of microorganisms and consequently contribute to the deterioration of treated drinking water quality in homes. The aim of this study was to determine whether the two point-of-use technologies (biosand zeolite silver-impregnated clay granular (BSZ-SICG) filter and silver-impregnated porous pot (SIPP) filter) deployed in a rural community of South Africa could inhibit the formation of biofilm on the surface of plastic-based containers generally used by rural households for the storage of their drinking water. Culture-based methods and molecular techniques were used to detect the indicator bacteria (total coliforms, faecal coliforms, E. coli) and pathogenic bacteria (Salmonella spp., Shigella spp. and Vibrio cholerae) in intake water and on the surface of storage vessels containing treated water. Scanning electron microscopy was also used to visualize the development of biofilm. Results revealed that the surface water source used by the Makwane community was heavily contaminated and harboured unacceptably high counts of bacteria (heterotrophic plate count: 4.4-4.3 Log10 CFU/100 mL, total coliforms: 2.2 Log10 CFU/100 mL-2.1 Log10 CFU/100 mL, faecal coliforms: 1.9 Log10 CFU/100 mL-1.8 Log10 CFU/100 mL, E. coli: 1.7 Log10 CFU/100 mL-1.6 Log10 CFU/100 mL, Salmonella spp.: 3 Log10 CFU/100 mL -8 CFU/100 mL; Shigella spp. and Vibrio cholerae had 1.0 Log10 CFU/100 mL and 0.8 Log10 CFU/100 mL respectively). Biofilm formation was apparent on the surface of the storage containers with untreated water within 24 h. The silver nanoparticles embedded in the clay of the filtration systems provided an effective barrier for the inhibition of biofilm formation on the surface of household water storage containers.
Biofilm formation occurred on the surface of plastic storage vessels containing drinking water treated with the SIPP filter between 14 and 21 days, and on those containing drinking water treated with the BSZ-SICG filter between 3 and 14 days. The attachment of target bacteria on the surface of the coupons placed in storage containers ranged from 0.07 to 227.8 CFU/cm2. To effectively prevent the development of biofilms on container surfaces, which can lead to the recontamination of treated water, plastic storage containers should be washed within 14 days for water treated with the SIPP filter and within 3 days for water treated with the BSZ-SICG filter.

  1. 3-D visualisation of palaeoseismic trench stratigraphy and trench logging using terrestrial remote sensing and GPR - combining techniques towards an objective multiparametric interpretation

    NASA Astrophysics Data System (ADS)

    Schneiderwind, S.; Mason, J.; Wiatr, T.; Papanikolaou, I.; Reicherter, K.

    2015-09-01

    Two normal faults on the Island of Crete and mainland Greece were studied to create and test an innovative workflow to make palaeoseismic trench logging more objective, and to visualise the sedimentary architecture within the trench wall in 3-D. This is achieved by combining classical palaeoseismic trenching techniques with multispectral approaches. A conventional trench log was first compared to the results of iso-cluster analysis of a true-colour photomosaic representing the spectrum of visible light. The disadvantages of passive data collection (e.g. illumination) were addressed by complementing the dataset with active near-infrared backscatter images from t-LiDAR measurements. The multispectral analysis shows that distinct layers can be identified, and it compares well with the conventional trench log. Accordingly, adjacent stratigraphic units could be distinguished by their particular multispectral composition signatures. Based on the trench log, a 3-D interpretation of GPR data collected on the vertical trench wall was then possible. This is highly beneficial for measuring representative layer thicknesses, displacements and geometries at depth within the trench wall; thus, misinterpretation due to cutting effects is minimised. Sedimentary feature geometries related to earthquake magnitude can be used to improve the accuracy of seismic hazard assessments. This manuscript therefore combines multiparametric approaches and shows: (i) how a 3-D visualisation of palaeoseismic trench stratigraphy and logging can be accomplished by combining t-LiDAR and GPR techniques, and (ii) how a multispectral digital analysis can offer additional advantages and higher objectivity in the interpretation of palaeoseismic and stratigraphic information. The multispectral datasets are stored, allowing unbiased input for future (re-)investigations.

  2. Integrated core-log petrofacies analysis in the construction of a reservoir geomodel: A case study of a mature Mississippian carbonate reservoir using limited data

    USGS Publications Warehouse

    Bhattacharya, S.; Doveton, J.H.; Carr, T.R.; Guy, W.R.; Gerlach, P.M.

    2005-01-01

    Small independent operators produce most of the Mississippian carbonate fields in the United States mid-continent, where a lack of integrated characterization studies precludes maximization of hydrocarbon recovery. This study uses integrative techniques to leverage extant data in an Osagian and Meramecian (Mississippian) cherty carbonate reservoir in Kansas. Available data include petrophysical logs of varying vintages, a limited number of cores, and production histories from each well. A consistent set of assumptions was used to extract well-level porosity and initial saturations from logs of different types and vintages to build a geomodel. Lacking regularly recorded well shut-in pressures, an iterative technique based on material-balance formulations was used to estimate the average reservoir-pressure decline that matched available drillstem test data and validated the log-analysis assumptions. Core plugs representing the principal reservoir petrofacies provide critical inputs for characterization and simulation studies. However, assigning plugs among multiple reservoir petrofacies is difficult in complex (carbonate) reservoirs. In a bottom-up approach, raw capillary pressure (Pc) data were plotted on the Super-Pickett plot, and log- and core-derived saturation-height distributions were reconciled to group plugs by facies, to identify core plugs representative of the principal reservoir facies, and to discriminate facies in the logged interval. Pc data from representative core plugs were used for effective pay evaluation to estimate water cut from completions in infill and producing wells and to guide selective perforations for economic exploitation of mature fields. The results from this study were used to drill 22 infill wells. Techniques demonstrated here can be applied in other fields and reservoirs. Copyright © 2005. The American Association of Petroleum Geologists. All rights reserved.

  3. Effectiveness of streambank-stabilization techniques along the Kenai River, Alaska

    USGS Publications Warehouse

    Dorava, Joseph M.

    1999-01-01

    The Kenai River in southcentral Alaska is the State's most popular sport fishery and an economically important salmon river that generates as much as $70 million annually. Boatwake-induced streambank erosion and the associated damage to riparian and riverine habitat present a potential threat to this fishery. Bank-stabilization techniques commonly in use along the Kenai River were selected for evaluation of their effectiveness at attenuating boatwakes and retarding streambank erosion. Spruce trees cabled to the bank and biodegradable man-made logs (called 'bio-logs') pinned to the bank were tested because they are commonly used techniques along the river. These two techniques were compared for their ability to reduce wake heights that strike the bank and to reduce erosion of bank material, as well as for the amount and quality of habitat they provide for juvenile chinook salmon. Additionally, an engineered bank-stabilization project was evaluated because this method of bank protection is being encouraged by managers of the river. During a test that included 20 controlled boat passes, the spruce trees and the bio-log provided a similar reduction in boatwake height and bank erosion; however, the spruce trees provided a greater amount of protective habitat than the bio-log. The engineered bank-stabilization project eroded less during nine boat passes and provided more protective cover than the adjacent unprotected natural bank. Features of the bank-stabilization techniques, such as tree limbs and willow plantings that extended into the water from the bank, attenuated the boatwakes, which helped reduce erosion. These features also provided protective cover to juvenile salmon.

  4. NMR Parameters Determination through ACE Committee Machine with Genetic Implanted Fuzzy Logic and Genetic Implanted Neural Network

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa; Gholami, Amin

    2015-06-01

    Free fluid porosity and rock permeability, undoubtedly the most critical parameters of a hydrocarbon reservoir, can be obtained by processing nuclear magnetic resonance (NMR) logs. Unlike conventional well logs (CWLs), NMR logging is very expensive and time-consuming. Therefore, the idea of synthesizing the NMR log from CWLs holds great appeal among reservoir engineers. For this purpose, three optimization strategies are followed. Firstly, an artificial neural network (ANN) is optimized by virtue of a hybrid genetic algorithm-pattern search (GA-PS) technique; then fuzzy logic (FL) is optimized by means of GA-PS; and eventually an alternating conditional expectation (ACE) model is constructed using the concept of a committee machine to combine the outputs of the optimized and non-optimized FL and ANN models. Results indicated that optimization of the traditional ANN and FL models using the GA-PS technique significantly enhances their performance. Furthermore, the ACE committee of the aforementioned models produces more accurate and reliable results than a singular model performing alone.
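The committee idea can be illustrated with a minimal sketch: a linear committee whose combination weights are fitted by least squares. This stands in for, and is much simpler than, the paper's ACE formulation; all numbers below are hypothetical.

```python
import numpy as np

# Hypothetical target values (e.g. free fluid porosity) and predictions from
# two expert models (an optimized ANN and an optimized FL model).
y_true = np.array([0.10, 0.15, 0.22, 0.30, 0.18])
pred_ann = np.array([0.11, 0.14, 0.20, 0.29, 0.19])
pred_fl = np.array([0.09, 0.16, 0.24, 0.31, 0.17])

# Fit combination weights (with an intercept) by least squares: a simple
# linear committee standing in for the ACE-based combination.
X = np.column_stack([np.ones_like(y_true), pred_ann, pred_fl])
w, *_ = np.linalg.lstsq(X, y_true, rcond=None)
committee = X @ w

def mse(pred, target):
    """Mean squared error of a prediction vector."""
    return float(np.mean((pred - target) ** 2))
```

Because each single expert lies in the span of the design matrix, the fitted committee can never have a worse in-sample error than either expert alone, which is the basic motivation for combining models.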

  5. Genesis analysis of high-gamma ray sandstone reservoir and its log evaluation techniques: a case study from the Junggar basin, northwest China.

    PubMed

    Wang, Liang; Mao, Zhiqiang; Sun, Zhongchun; Luo, Xingping; Song, Yong; Liu, Zhen

    2013-01-01

    In the Junggar basin, northwest China, many high gamma-ray (GR) sandstone reservoirs are found and routinely interpreted as mudstone non-reservoirs, with negative implications for the exploration and exploitation of oil and gas. Here, the recognition principles, genesis, and log evaluation techniques of high GR sandstone reservoirs are systematically studied. Studies show that sandstone reservoirs with an apparent shale content greater than 50% and a GR value higher than 110 API can be regarded as high GR sandstone reservoirs. The high GR response is mainly and directly caused by abnormally high uranium enrichment, not by tuff, feldspar or clay minerals. Affected by the formation's high water sensitivity and poor borehole quality, conventional logs can neither recognize these reservoirs nor evaluate their physical properties. Nuclear magnetic resonance (NMR) logs are therefore proposed and proved to be useful in reservoir recognition and physical property evaluation.

  6. Genesis Analysis of High-Gamma Ray Sandstone Reservoir and Its Log Evaluation Techniques: A Case Study from the Junggar Basin, Northwest China

    PubMed Central

    Wang, Liang; Mao, Zhiqiang; Sun, Zhongchun; Luo, Xingping; Song, Yong; Liu, Zhen

    2013-01-01

    In the Junggar basin, northwest China, many high gamma-ray (GR) sandstone reservoirs are found and routinely interpreted as mudstone non-reservoirs, with negative implications for the exploration and exploitation of oil and gas. Here, the recognition principles, genesis, and log evaluation techniques of high GR sandstone reservoirs are systematically studied. Studies show that sandstone reservoirs with an apparent shale content greater than 50% and a GR value higher than 110 API can be regarded as high GR sandstone reservoirs. The high GR response is mainly and directly caused by abnormally high uranium enrichment, not by tuff, feldspar or clay minerals. Affected by the formation's high water sensitivity and poor borehole quality, conventional logs can neither recognize these reservoirs nor evaluate their physical properties. Nuclear magnetic resonance (NMR) logs are therefore proposed and proved to be useful in reservoir recognition and physical property evaluation. PMID:24078797

  7. Proposed standard-weight equations for brook trout

    USGS Publications Warehouse

    Hyatt, M.W.; Hubert, W.A.

    2001-01-01

    Weight and length data were obtained for 113 populations of brook trout Salvelinus fontinalis across the species' geographic range in North America to estimate a standard-weight (Ws) equation for this species. Estimation was done by applying the regression-line-percentile technique to fish of 120-620 mm total length (TL). The proposed metric-unit (g and mm) equation is log10Ws = -5.186 + 3.103 log10TL; the English-unit (lb and in) equivalent is log10Ws = -3.483 + 3.103 log10TL. No systematic length bias was evident in the relative-weight values calculated from these equations.
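The proposed metric equation is a direct log-linear formula and can be transcribed as a small sketch; the function names and the length-range check are my own additions, but the coefficients are those stated in the abstract.

```python
import math

def standard_weight_g(total_length_mm):
    """Standard weight Ws (g) for brook trout from the proposed metric
    equation log10(Ws) = -5.186 + 3.103 * log10(TL), TL in mm.
    The equation was estimated for fish of 120-620 mm total length."""
    if not 120 <= total_length_mm <= 620:
        raise ValueError("equation was estimated for 120-620 mm TL")
    return 10 ** (-5.186 + 3.103 * math.log10(total_length_mm))

def relative_weight(weight_g, total_length_mm):
    """Relative weight Wr = 100 * W / Ws, the usual condition index
    computed from a standard-weight equation."""
    return 100.0 * weight_g / standard_weight_g(total_length_mm)
```

For example, a 300 mm fish has a standard weight of roughly 315-320 g, and a fish weighing exactly Ws has a relative weight of 100 by construction.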

  8. Depth optimal sorting networks resistant to k passive faults

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piotrow, M.

    In this paper, we study the problem of constructing a sorting network that is tolerant to faults and whose running time (i.e. depth) is as small as possible. We consider the scenario of worst-case comparator faults and follow the model of passive comparator failure proposed by Yao and Yao, in which a faulty comparator outputs its inputs directly, without comparison. Our main result is the first construction of an N-input, k-fault-tolerant sorting network that is of an asymptotically optimal depth Θ(log N + k). That improves over the recent result of Leighton and Ma, whose network is of depth O(log N + k log log N / log k). Actually, we present a fault-tolerant correction network that can be added after any N-input sorting network to correct its output in the presence of at most k faulty comparators. Since the depth of the network is O(log N + k) and the constants hidden behind the "O" notation are not big, the construction can be of practical use. Developing the techniques necessary to show the main result, we construct a fault-tolerant network for the insertion problem. As a by-product, we get an N-input, O(log N)-depth INSERT-network that is tolerant to random faults, thereby answering a question posed by Ma in his PhD thesis. The results are based on a new notion of constant-delay comparator networks, that is, networks in which each register is used (compared) only in a period of time of constant length. Copies of such networks can be put one after another with only a constant increase in depth per copy.
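The passive (Yao-Yao) fault model is easy to simulate. The sketch below uses a small odd-even transposition network for 4 inputs (not the paper's construction) to show that a single passive comparator, which passes its inputs through unchanged, can leave the output unsorted.

```python
from itertools import product

def apply_network(values, comparators, faulty=frozenset()):
    """Run a comparator network over a list; a comparator whose index is in
    `faulty` is passive (Yao-Yao model): it outputs its inputs unchanged."""
    v = list(values)
    for idx, (i, j) in enumerate(comparators):
        if idx in faulty:
            continue  # passive failure: no comparison performed
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

# Odd-even transposition sorting network for 4 inputs (4 rounds, 6 comparators).
net = [(0, 1), (2, 3), (1, 2), (0, 1), (2, 3), (1, 2)]

# By the 0/1 principle, checking all 0/1 inputs suffices to verify sorting.
sorts_everything = all(
    apply_network(list(p), net) == sorted(p)
    for p in product([0, 1], repeat=4)
)

# Knocking out one comparator (index 2) breaks sorting for some input.
broken = apply_network([3, 1, 2, 0], net, faulty={2})
```

A fault-tolerant design in the paper's sense would append a correction network so that the output is sorted even with up to k such passive failures.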

  9. Geophysical examination of coal deposits

    NASA Astrophysics Data System (ADS)

    Jackson, L. J.

    1981-04-01

    Geophysical techniques for the solution of mining problems and as an aid to mine planning are reviewed. Techniques of geophysical borehole logging are discussed. The responses of the coal seams to logging tools are easily recognized on the logging records. Cores for laboratory analysis are cut from selected sections of the borehole. In addition, information about the density and chemical composition of the coal may be obtained. Surface seismic reflection surveys using two-dimensional arrays of seismic sources and detectors detect faults with throws as small as 3 m at depths of 800 m. In geologically disturbed areas, good results have been obtained from three-dimensional surveys. Smaller faults as far as 500 m in advance of the working face may be detected using in-seam seismic surveying conducted from a roadway or working face. Small disturbances are detected by pulse radar and continuous-wave electromagnetic methods, either from within boreholes or from underground. Other geophysical techniques, which exploit the electrical, magnetic, gravitational, and geothermal properties of rocks, are described.

  10. Using Log Linear Analysis for Categorical Family Variables.

    ERIC Educational Resources Information Center

    Moen, Phyllis

    The Goodman technique of log linear analysis is ideal for family research, because it is designed for categorical (non-quantitative) variables. Variables are dichotomized (for example, married/divorced, childless/with children) or otherwise categorized (for example, level of permissiveness, life cycle stage). Contingency tables are then…
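The core computation behind log-linear analysis of a contingency table can be sketched in a few lines: fit the independence model and compare it to the observed counts with the likelihood-ratio statistic G². The 2x2 table below (marital status by children) uses purely illustrative counts, not data from the cited work.

```python
import math

# Hypothetical 2x2 family contingency table:
#                 with children, childless
table = [[30, 10],   # married
         [15, 25]]   # divorced

n = sum(sum(row) for row in table)
row_tot = [sum(row) for row in table]
col_tot = [sum(col) for col in zip(*table)]

# Expected counts under the independence log-linear model
# log(m_ij) = mu + lambda_i(row) + lambda_j(col).
expected = [[row_tot[i] * col_tot[j] / n for j in range(2)]
            for i in range(2)]

# Likelihood-ratio statistic G^2 = 2 * sum obs * ln(obs / exp),
# referred to a chi-square distribution with 1 df for a 2x2 table.
g2 = 2 * sum(table[i][j] * math.log(table[i][j] / expected[i][j])
             for i in range(2) for j in range(2))
```

A large G² relative to the chi-square critical value (3.84 at the 5% level for 1 df) indicates that the two categorical variables are associated rather than independent.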

  11. Method of assaying uranium with prompt fission and thermal neutron borehole logging adjusted by borehole physical characteristics

    DOEpatents

    Barnard, Ralston W.; Jensen, Dal H.

    1982-01-01

    Uranium formations are assayed by prompt fission neutron logging techniques. The uranium in the formation is proportional to the ratio of epithermal counts to thermal or epithermal dieaway. Various calibration factors enhance the accuracy of the measurement.

  12. Minnesota logging utilization factors, 1975-1976--development, use, implications.

    Treesearch

    James E. Blyth; W. Brad Smith

    1979-01-01

    Discusses Minnesota saw log and pulpwood logging utilization factors developed during 1975-1976 and their implications. Compares factors for several species groups and shows their use in estimating growing stock cut for pulpwood and saw logs.

  13. Inference of strata separation and gas emission paths in longwall overburden using continuous wavelet transform of well logs and geostatistical simulation

    NASA Astrophysics Data System (ADS)

    Karacan, C. Özgen; Olea, Ricardo A.

    2014-06-01

    Prediction of potential methane emission pathways from various sources into active mine workings or sealed gobs from longwall overburden is important for controlling methane and for improving mining safety. The aim of this paper is to infer strata separation intervals and thus gas emission pathways from standard well log data. The proposed technique was applied to well logs acquired through the Mary Lee/Blue Creek coal seam of the Upper Pottsville Formation in the Black Warrior Basin, Alabama, using well logs from a series of boreholes aligned along a nearly linear profile. For this purpose, continuous wavelet transform (CWT) of digitized gamma well logs was performed by using Mexican hat and Morlet, as the mother wavelets, to identify potential discontinuities in the signal. Pointwise Hölder exponents (PHE) of gamma logs were also computed using the generalized quadratic variations (GQV) method to identify the location and strength of singularities of well log signals as a complementary analysis. PHEs and wavelet coefficients were analyzed to find the locations of singularities along the logs. Using the well logs in this study, locations of predicted singularities were used as indicators in single normal equation simulation (SNESIM) to generate equi-probable realizations of potential strata separation intervals. Horizontal and vertical variograms of realizations were then analyzed and compared with those of indicator data and training image (TI) data using the Kruskal-Wallis test. A sum of squared differences was employed to select the most probable realization representing the locations of potential strata separations and methane flow paths. Results indicated that singularities located in well log signals reliably correlated with strata transitions or discontinuities within the strata. Geostatistical simulation of these discontinuities provided information about the location and extents of the continuous channels that may form during mining. 
If there is a gas source within their zone of influence, paths may develop and allow methane movement towards sealed or active gobs under pressure differentials. Knowledge gained from this research will better prepare mine operations for potential methane inflows, thus improving mine safety.
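The core of the CWT step can be sketched with a numpy-only Mexican hat transform applied to a synthetic "gamma log" containing one sharp lithologic step; this is an illustration of singularity detection by wavelet coefficients, not the paper's data or its GQV/PHE analysis.

```python
import numpy as np

def mexican_hat(n, scale):
    """Mexican hat (Ricker) wavelet sampled at n points for the given scale."""
    t = np.arange(n) - (n - 1) / 2.0
    a = t / scale
    return (1.0 - a**2) * np.exp(-(a**2) / 2.0)

def cwt(signal, scales):
    """Continuous wavelet transform by direct convolution; one row per scale."""
    out = np.empty((len(scales), len(signal)))
    for i, s in enumerate(scales):
        out[i] = np.convolve(signal, mexican_hat(10 * s, s), mode="same")
    return out

# Synthetic "gamma log": a sharp lithology step at index 200 plus a weak
# linear trend (the zero-mean wavelet suppresses the trend).
depth = np.arange(400)
gamma = np.where(depth < 200, 60.0, 95.0) + 0.01 * depth
coeffs = cwt(gamma - gamma.mean(), scales=[2, 4, 8])

# The discontinuity produces the strongest small-scale response near index 200.
edge = int(np.argmax(np.abs(coeffs[1])))
```

In the paper's workflow, locations flagged this way across neighbouring wells become the indicator data fed to the SNESIM geostatistical simulation.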

  14. Subsurface recognition of oolitic facies in carbonate sequence: Exploration and development applications: Ste. Genevieve Formation (Mississippian), Illinois basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandy, W.F.

    1989-08-01

    The oolitic grainstone facies of the Ste. Genevieve Limestone is a widespread and highly productive reservoir in the Illinois basin. However, exploration and development of these oolitic facies are hampered by the inability to recognize the reservoir on logs. In many areas, the only log data available are old wireline electric logs. Comparison of cores with log response in northern Lawrence field, Lawrence County, Illinois, indicates a subjective but predictable relationship between log signature and carbonate lithology. Two productive lithologies, dolomite and oolitic grainstone, display well-developed SP curves. However, resistivity response is greatest in dense limestone, less well developed in oolitic grainstone, and poorly developed in dolomites. On gamma-ray logs, oolitic facies can be differentiated from dolomites by their lower radioactivity. Oolitic sands are most easily recognized on porosity logs, where their average porosity is 13.7%, only half the average porosity of dolomites. In a new well, the best information for subsequent offset and development of an oolitic reservoir is provided by porosity and dipmeter logs.

  15. Method of assaying uranium with prompt fission and thermal neutron borehole logging adjusted by borehole physical characteristics. [Patent application]

    DOEpatents

    Barnard, R.W.; Jensen, D.H.

    1980-11-05

    Uranium formations are assayed by prompt fission neutron logging techniques. The uranium in the formation is proportional to the ratio of epithermal counts to thermal or epithermal dieaway. Various calibration factors enhance the accuracy of the measurement.

  16. Time: a vital resource.

    PubMed

    Collins, Sandra K; Collins, Kevin S

    2004-01-01

    Resolving problems with time management requires an understanding of the concept of working smarter rather than harder. Therefore, managing time effectively is a vital responsibility of department managers. When developing a plan for managing time more effectively, it is important to carefully analyze where time is currently being used or lost. Keeping a daily log can be a time-consuming effort. However, the log can provide information about ways that time may be saved and how to organize personal schedules to maximize time efficiency. The next step is to develop a strategy to decrease wasted time and create a more cohesive radiology department. The following list of time management strategies provides some suggestions for developing a plan: get focused; set goals and priorities; get organized; monitor individual motivation factors; develop memory techniques. In healthcare, success means delivering the highest quality of care by getting organized, meeting deadlines, creating efficient schedules and appropriately budgeting resources. Effective time management focuses on knowing what needs to be done when. The managerial challenge is to shift the emphasis from doing everything all at once to orchestrating the departmental activities in order to maximize the time given in a normal workday.

  17. Efficacy of Neutral Electrolyzed Water, Quaternary Ammonium and Lactic Acid-Based Solutions in Controlling Microbial Contamination of Food Cutting Boards Using a Manual Spraying Technique.

    PubMed

    Al-Qadiri, Hamzah M; Ovissipour, Mahmoudreza; Al-Alami, Nivin; Govindan, Byju N; Shiroodi, Setareh Ghorban; Rasco, Barbara

    2016-05-01

    Bactericidal activity of neutral electrolyzed water (NEW), quaternary ammonium (QUAT), and lactic acid-based solutions was investigated using a manual spraying technique against Salmonella Typhimurium, Escherichia coli O157:H7, Campylobacter jejuni, Listeria monocytogenes and Staphylococcus aureus that were inoculated onto the surface of scarred polypropylene and wooden food cutting boards. Antimicrobial activity was also examined when using cutting boards in preparation of raw chopped beef, chicken tenders or salmon fillets. Viable counts of survivors were determined as log10 CFU/100 cm2 within 0 (untreated control), 1, 3, and 5 min of treatment at ambient temperature. Within the first minute of treatment, NEW and QUAT solutions caused more than 3 log10 bacterial reductions on polypropylene surfaces whereas less than 3 log10 reductions were achieved on wooden surfaces. After 5 min of treatment, more than 5 log10 reductions were achieved for all bacterial strains inoculated onto polypropylene surfaces. Using NEW and QUAT solutions within 5 min reduced Gram-negative bacteria by 4.58 to 4.85 log10 compared to more than 5 log10 reductions in Gram-positive bacteria inoculated onto wooden surfaces. Lactic acid treatment was significantly less effective (P < 0.05) compared to NEW and QUAT treatments. A decline in antimicrobial effectiveness was observed (0.5 to <2 log10 reductions were achieved within the first minute) when both cutting board types were used to prepare raw chopped beef, chicken tenders or salmon fillets. © 2016 Institute of Food Technologists®
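The log10 reduction metric used throughout such disinfection studies is a simple ratio of viable counts before and after treatment; a minimal helper, with hypothetical counts:

```python
import math

def log10_reduction(cfu_before, cfu_after):
    """Log10 reduction between an untreated-control count and a treated
    count (e.g. CFU per 100 cm^2 of cutting-board surface)."""
    if cfu_before <= 0 or cfu_after <= 0:
        raise ValueError("viable counts must be positive")
    return math.log10(cfu_before / cfu_after)

# A 3-log10 reduction means 99.9% of the initial population was killed;
# a >5-log10 reduction leaves fewer than 1 survivor per 100,000 initial CFU.
```

For example, going from 10^7 to 10^2 CFU is a 5-log10 reduction.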

  18. High-voltage supply for neutron tubes in well-logging applications

    DOEpatents

    Humphreys, D.R.

    1982-09-15

    A high voltage supply is provided for a neutron tube used in well logging. The biased pulse supply of the invention combines DC and full pulse techniques and produces a target voltage comprising a substantial negative DC bias component on which is superimposed a pulse whose negative peak provides the desired negative voltage level for the neutron tube. The target voltage is preferably generated using voltage doubling techniques and employing a voltage source which generates bipolar pulse pairs having an amplitude corresponding to the DC bias level.

  19. High voltage supply for neutron tubes in well logging applications

    DOEpatents

    Humphreys, D. Russell

    1989-01-01

    A high voltage supply is provided for a neutron tube used in well logging. The "biased pulse" supply of the invention combines DC and "full pulse" techniques and produces a target voltage comprising a substantial negative DC bias component on which is superimposed a pulse whose negative peak provides the desired negative voltage level for the neutron tube. The target voltage is preferably generated using voltage doubling techniques and employing a voltage source which generates bipolar pulse pairs having an amplitude corresponding to the DC bias level.

  20. Integrating surface and borehole geophysics in ground water studies - an example using electromagnetic soundings in south Florida

    USGS Publications Warehouse

    Paillet, Frederick; Hite, Laura; Carlson, Matthew

    1999-01-01

    Time domain surface electromagnetic soundings, borehole induction logs, and other borehole logging techniques are used to construct a realistic model for the shallow subsurface hydraulic properties of unconsolidated sediments in south Florida. Induction logs are used to calibrate surface induction soundings in units of pore water salinity by correlating water sample specific electrical conductivity with the electrical conductivity of the formation over the sampled interval for a two‐layered aquifer model. Geophysical logs are also used to show that a constant conductivity layer model is appropriate for the south Florida study. Several physically independent log measurements are used to quantify the dependence of formation electrical conductivity on such parameters as salinity, permeability, and clay mineral fraction. The combined interpretation of electromagnetic soundings and induction logs was verified by logging three validation boreholes, confirming quantitative estimates of formation conductivity and thickness in the upper model layer, and qualitative estimates of conductivity in the lower model layer.

  1. Mobile capture of remote points of interest using line of sight modelling

    NASA Astrophysics Data System (ADS)

    Meek, Sam; Priestnall, Gary; Sharples, Mike; Goulding, James

    2013-03-01

    Recording points of interest using GPS whilst working in the field is an established technique in geographical fieldwork, where the user's current position is used as the spatial reference to be captured; this is known as geo-tagging. We outline the development and evaluation of a smartphone application called Zapp that enables geo-tagging of any distant point on the visible landscape. The ability of users to log or retrieve information relating to what they can see, rather than where they are standing, allows them to record observations of points in the broader landscape scene, or to access descriptions of landscape features from any viewpoint. The application uses the compass orientation and tilt of the phone to provide data for a line of sight algorithm that intersects with a Digital Surface Model stored on the mobile device. We describe the development process and design decisions for Zapp, present the results of a controlled study of the accuracy of the application, and report on the use of Zapp for a student field exercise. The studies indicate the feasibility of the approach, but also how the appropriate use of such techniques will be constrained by current levels of precision in mobile sensor technology. The broader implications for interactive query of the distant landscape and for remote data logging are discussed.

  2. Hyperspectral scattering profiles for prediction of the microbial spoilage of beef

    NASA Astrophysics Data System (ADS)

    Peng, Yankun; Zhang, Jing; Wu, Jianhu; Hang, Hui

    2009-05-01

    Spoilage in beef is the result of decomposition and the formation of metabolites caused by the growth and enzymatic activity of microorganisms. There is still no technology for the rapid, accurate and non-destructive detection of bacterially spoiled or contaminated beef. In this study, a hyperspectral imaging technique was exploited to measure biochemical changes within fresh beef. Fresh beef rump steaks were purchased from a commercial plant and left to spoil in a refrigerator at 8°C. Every 12 hours, hyperspectral scattering profiles over the spectral region between 400 nm and 1100 nm were collected directly from the sample surface in reflection mode in order to develop an optimal model for prediction of beef spoilage; in parallel, the total viable count (TVC) per gram of beef was obtained by classical microbiological plating methods. The spectral scattering profiles at individual wavelengths were fitted accurately by a two-parameter Lorentzian distribution function. TVC prediction models were developed, using multi-linear regression, by relating individual Lorentzian parameters and their combinations at different wavelengths to the log10(TVC) value. The best predictions were obtained with r2 = 0.96 and SEP = 0.23 for log10(TVC). The research demonstrated that the hyperspectral imaging technique is a valid tool for real-time and non-destructive detection of bacterial spoilage in beef.
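A two-parameter Lorentzian fit of a scattering profile can be sketched as follows; the exact parameterisation used in the paper is not stated, so this assumes the common form a / (1 + (x/b)^2) and fits it on synthetic, illustrative data by linearising in 1/y.

```python
import numpy as np

def lorentzian(x, a, b):
    """Two-parameter Lorentzian profile a / (1 + (x/b)^2): a is the peak
    value and b the half-width at half maximum (assumed form)."""
    return a / (1 + (x / b) ** 2)

def fit_lorentzian(x, y):
    """Fit a and b by linearising: 1/y = 1/a + (1/(a*b^2)) * x^2,
    then solving the linear least-squares problem."""
    X = np.column_stack([np.ones_like(x), x**2])
    c, *_ = np.linalg.lstsq(X, 1.0 / y, rcond=None)
    a = 1.0 / c[0]
    b = np.sqrt(1.0 / (a * c[1]))
    return a, b

# Synthetic noise-free scattering profile (illustrative, not measured data).
x = np.linspace(0.1, 10, 50)
a_fit, b_fit = fit_lorentzian(x, lorentzian(x, a=5.0, b=2.0))
```

In the study's pipeline, the fitted parameters at each wavelength (and their combinations) become the regressors for the multi-linear log10(TVC) prediction model.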

  3. Evidence of Long Range Dependence and Self-similarity in Urban Traffic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Helmy, Ahmed; Hui, Pan

    2015-01-01

    Transportation simulation technologies should accurately model traffic demand, distribution, and assignment parameters for urban environment simulation. These three parameters significantly impact the transportation engineering benchmark process and are also critical in realizing realistic traffic modeling situations. In this paper, we model and characterize the traffic density distribution of thousands of locations around the world. The traffic densities are generated from millions of images collected over several years and processed using computer vision techniques. The resulting traffic density distribution time series are then analyzed. It is found using the goodness-of-fit test that the traffic density distributions follow heavy-tail models such as Log-gamma, Log-logistic, and Weibull in over 90% of analyzed locations. Moreover, a heavy tail gives rise to long-range dependence and self-similarity, which we studied by estimating the Hurst exponent (H). Our analysis, based on seven different Hurst estimators, strongly indicates that the traffic distribution patterns are stochastically self-similar (0.5 ≤ H ≤ 1.0). We believe this is an important finding that will influence the design and development of the next generation of traffic simulation techniques and also aid in accurately modeling the traffic engineering of urban systems. In addition, it shall provide a much-needed input for the development of smart cities.
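One classical Hurst estimator (among the seven kinds the paper alludes to) is rescaled-range (R/S) analysis; a compact numpy sketch, applied here to white noise, for which H is 0.5 in theory:

```python
import numpy as np

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    the slope of log(mean R/S) versus log(window size)."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range
            s = chunk.std()                          # standard deviation
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_vals.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

rng = np.random.default_rng(0)
h_noise = hurst_rs(rng.standard_normal(4096))
```

For a self-similar series with long-range dependence, such as the traffic density series in the paper, the estimate would fall between 0.5 and 1.0; note that R/S estimates are upward-biased for short series, which is one reason studies cross-check several estimators.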

  4. X-Ray Processing of ChaMPlane Fields: Methods and Initial Results for Selected Anti-Galactic Center Fields

    NASA Astrophysics Data System (ADS)

    Hong, JaeSub; van den Berg, Maureen; Schlegel, Eric M.; Grindlay, Jonathan E.; Koenig, Xavier; Laycock, Silas; Zhao, Ping

    2005-12-01

    We describe the X-ray analysis procedure of the ongoing Chandra Multiwavelength Plane (ChaMPlane) Survey and report the initial results from the analysis of 15 selected anti-Galactic center observations (90deg

  5. Multicriteria evaluation of simulated logging scenarios in a tropical rain forest.

    PubMed

    Huth, Andreas; Drechsler, Martin; Köhler, Peter

    2004-07-01

    Forest growth models are useful tools for investigating the long-term impacts of logging. In this paper, the results of the rain forest growth model FORMIND were assessed by a multicriteria decision analysis. The main processes covered by FORMIND include tree growth, mortality, regeneration and competition. Tree growth is calculated based on a carbon balance approach. Trees compete for light and space; dying large trees fall down and create gaps in the forest. Sixty-four different logging scenarios for an initially undisturbed forest stand at Deramakot (Malaysia) were simulated. The scenarios differ in logging cycle, logging method, cutting limit and logging intensity. We characterise the impacts with four criteria describing the yield, canopy opening and changes in species composition. Multicriteria decision analysis was used for the first time to evaluate the scenarios and identify the efficient ones. Our results plainly show that reduced-impact logging scenarios are more 'efficient' than the others, since in these scenarios forest damage is minimised without significantly reducing yield. Nevertheless, there is a trade-off between yield and achieving a desired ecological state of the logged forest; the ecological state of logged forests can only be improved by reducing yields and lengthening the logging cycles. Our study also demonstrates that high cutting limits or low logging intensities cannot compensate for the high level of damage caused by conventional logging techniques.
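Identifying the 'efficient' scenarios among the sixty-four is, at its core, a Pareto-dominance screen. The sketch below illustrates that idea on two criteria with invented scenario names and values; the actual study used four criteria and real simulation outputs.

```python
# Sketch: Pareto-efficiency filter over logging scenarios. Criteria and
# values are invented for illustration; higher yield is better, lower
# forest damage is better.

def dominates(a, b):
    """a dominates b: no worse on both criteria and strictly better on one."""
    return (a["yield"] >= b["yield"] and a["damage"] <= b["damage"]
            and (a["yield"] > b["yield"] or a["damage"] < b["damage"]))

def pareto_front(scenarios):
    """Scenarios not dominated by any other scenario."""
    return [s for s in scenarios
            if not any(dominates(o, s) for o in scenarios if o is not s)]

scenarios = [
    {"name": "conventional",       "yield": 100, "damage": 60},
    {"name": "reduced-impact",     "yield": 95,  "damage": 25},
    {"name": "high-cutting-limit", "yield": 70,  "damage": 55},
    {"name": "low-intensity",      "yield": 60,  "damage": 20},
]
front = [s["name"] for s in pareto_front(scenarios)]
print(front)
```

Note how a high-cutting-limit scenario drops out: it is beaten on both criteria by reduced-impact logging, mirroring the paper's conclusion.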

  6. Assessment of PDMS-water partition coefficients: implications for passive environmental sampling of hydrophobic organic compounds

    USGS Publications Warehouse

    DiFilippo, Erica L.; Eganhouse, Robert P.

    2010-01-01

    Solid-phase microextraction (SPME) has shown potential as an in situ passive-sampling technique in aquatic environments. The reliability of this method depends upon accurate determination of the partition coefficient between the fiber coating and water (Kf). For some hydrophobic organic compounds (HOCs), Kf values spanning 4 orders of magnitude have been reported for polydimethylsiloxane (PDMS) and water. However, 24% of the published data examined in this review did not pass the criterion for negligible depletion, resulting in questionable Kf values. The range in reported Kf is reduced to just over 2 orders of magnitude for some polychlorinated biphenyls (PCBs) when these questionable values are removed. Other factors that could account for the range in reported Kf, such as fiber-coating thickness and fiber manufacturer, were evaluated and found to be insignificant. In addition to accurate measurement of Kf, an understanding of the impact of environmental variables, such as temperature and ionic strength, on partitioning is essential for application of laboratory-measured Kf values to field samples. To date, few studies have measured Kf for HOCs at conditions other than 20°C or 25°C in distilled water. The available data indicate measurable variations in Kf at different temperatures and different ionic strengths. Therefore, if the appropriate environmental variables are not taken into account, significant error will be introduced into calculated aqueous concentrations using this passive-sampling technique. A multiparameter linear solvation energy relationship (LSER) was developed to estimate log Kf in distilled water at 25°C based on published physicochemical parameters. This method provided a good correlation (R2 = 0.94) between measured and predicted log Kf values for several compound classes. Thus, an LSER approach may offer a reliable means of predicting log Kf for HOCs whose experimental log Kf values are presently unavailable.
Future research should focus on understanding the impact of environmental variables on Kf. Obtaining the data needed for an LSER approach to estimate Kf for all environmentally relevant HOCs would be beneficial to the application of SPME as a passive-sampling technique.
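An LSER of the kind described is a multiple linear regression of log Kf on solute descriptors. The sketch below fits a pared-down two-descriptor model by ordinary least squares on synthetic data; the descriptor names, data points, and coefficients are illustrative, not the published calibration.

```python
# Sketch: a pared-down LSER, log Kf = c0 + c1*V + c2*B, fitted by ordinary
# least squares via the normal equations. Descriptors (McGowan volume V,
# H-bond basicity B) and data are illustrative assumptions.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ols(X, y):
    """Solve the normal equations (X^T X) beta = X^T y."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(p)]
           for i in range(p)]
    Xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(p)]
    return solve(XtX, Xty)

# Rows: [1, V, B]; targets generated from known coefficients for the demo.
coef_true = [0.5, 3.2, -1.1]
X = [[1.0, v, b] for v in (0.8, 1.0, 1.2, 1.5) for b in (0.1, 0.3, 0.5)]
y = [sum(c * x for c, x in zip(coef_true, row)) for row in X]
beta = ols(X, y)
print([round(b, 2) for b in beta])
```

With noise-free synthetic data the fit recovers the generating coefficients exactly; real calibrations report fit quality via R2, as in the review above.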

  7. Determination and prediction of octanol-air partition coefficients of hydroxylated and methoxylated polybrominated diphenyl ethers.

    PubMed

    Zhao, Hongxia; Xie, Qing; Tan, Feng; Chen, Jingwen; Quan, Xie; Qu, Baocheng; Zhang, Xin; Li, Xiaona

    2010-07-01

    The octanol-air partition coefficients (K(OA)) of 19 hydroxylated polybrominated diphenyl ethers (OH-PBDEs) and 10 methoxylated polybrominated diphenyl ethers (MeO-PBDEs) were measured as a function of temperature using a gas chromatographic retention time technique. At room temperature (298.15 K), log K(OA) ranged from 8.30 for monobrominated OH/MeO-PBDEs to 13.29 for hexabrominated OH/MeO-PBDEs. The internal energies of phase change from octanol to air (Delta(OA)U) for 29 OH/MeO-PBDE congeners ranged from 72 to 126 kJ mol(-1). Using partial least-squares (PLS) analysis, a quantitative structure-property relationship (QSPR) model for log K(OA) of OH/MeO-PBDE congeners was developed based on 16 fundamental quantum chemical descriptors computed with the PM3 Hamiltonian, for which Q(cum)(2) was about 0.937. The molecular weight (Mw) and the energy of the lowest unoccupied molecular orbital (E(LUMO)) were found to be the main factors governing log K(OA). Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. Financial and Economic Analysis of Reduced Impact Logging

    Treesearch

    Tom Holmes

    2016-01-01

    Concern regarding extensive damage to tropical forests resulting from logging increased dramatically after World War II when mechanized logging systems developed in industrialized countries were deployed in the tropics. As a consequence, tropical foresters began developing logging procedures that were more environmentally benign, and by the 1990s, these practices began...

  9. A Monte Carlo simulation study of an improved K-edge log-subtraction X-ray imaging using a photon counting CdTe detector

    NASA Astrophysics Data System (ADS)

    Lee, Youngjin; Lee, Amy Candy; Kim, Hee-Joung

    2016-09-01

    Recently, significant effort has been devoted to the development of photon counting detectors (PCDs) based on CdTe for applications in X-ray imaging systems. The motivation for developing PCDs is higher image quality. In particular, the K-edge subtraction (KES) imaging technique using a PCD can improve image quality and is useful for increasing the contrast resolution of a target material by utilizing a contrast agent. Building on this technique, we present an idea for an improved K-edge log-subtraction (KELS) imaging technique. The KELS imaging technique based on PCDs can be realized by using different subtraction energy widths of the energy window. In this study, the effects of the KELS imaging technique and of the subtraction energy width of the energy window were investigated with respect to the contrast, standard deviation, and contrast-to-noise ratio (CNR) with a Monte Carlo simulation. We simulated a PCD X-ray imaging system based on CdTe and a polymethylmethacrylate (PMMA) phantom containing various iodine contrast agents. To acquire KELS images, images of the phantom using energies above and below the iodine K-edge absorption energy (33.2 keV) were acquired over different energy ranges. According to the results, the contrast and standard deviation decreased when the subtraction energy width of the energy window was increased. Also, the CNR using the KELS imaging technique was higher than that of the images acquired using the whole energy range. In particular, the maximum differences in CNR between the whole-energy-range and KELS images using 1, 2, and 3 mm diameter iodine contrast agents were 11.33, 8.73, and 8.29 times, respectively. Additionally, the optimum subtraction energy widths of the energy window were 5, 4, and 3 keV for the 1, 2, and 3 mm diameter iodine contrast agents, respectively. 
In conclusion, we successfully established an improved KELS imaging technique, optimized the subtraction energy width of the energy window, and, based on our results, recommend this technique for high image quality.
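The core of K-edge log-subtraction is simple: take the log-ratio of counts binned just below and just above the contrast agent's K-edge. The sketch below shows that operation on a toy one-dimensional "image"; all count values and the window placement are illustrative assumptions.

```python
# Sketch: K-edge log-subtraction (KELS) for a photon-counting detector.
# Counts are binned into energy windows below and above the iodine K-edge
# (33.2 keV); the subtraction signal is ln(I_below) - ln(I_above), which
# is large where iodine strongly attenuates the above-edge window.
import math

def kels_pixel(counts_below, counts_above):
    return math.log(counts_below) - math.log(counts_above)

# Toy 1-D "image": transmitted counts above the K-edge drop sharply where
# the iodine contrast agent sits (middle two pixels).
below = [1000, 1000, 1000, 1000]   # counts in the window below 33.2 keV
above = [950,  400,  420,  940]    # counts in the window above 33.2 keV
kels  = [round(kels_pixel(b, a), 2) for b, a in zip(below, above)]
print(kels)
```

Widening the energy windows admits more counts (lowering noise) but dilutes the edge contrast, which is the trade-off the study quantifies via the CNR.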

  10. Atmospheric stellar parameters from cross-correlation functions

    NASA Astrophysics Data System (ADS)

    Malavolta, L.; Lovis, C.; Pepe, F.; Sneden, C.; Udry, S.

    2017-08-01

    The increasing number of spectra gathered by spectroscopic sky surveys and transiting exoplanet follow-up has pushed the community to develop automated tools for atmospheric stellar parameters determination. Here we present a novel approach that allows the measurement of temperature (Teff), metallicity ([Fe/H]) and gravity (log g) within a few seconds and in a completely automated fashion. Rather than performing comparisons with spectral libraries, our technique is based on the determination of several cross-correlation functions (CCFs) obtained by including spectral features with different sensitivity to the photospheric parameters. We use literature stellar parameters of high signal-to-noise (SNR), high-resolution HARPS spectra of FGK main-sequence stars to calibrate Teff, [Fe/H] and log g as a function of CCF parameters. Our technique is validated using low-SNR spectra obtained with the same instrument. For FGK stars we achieve a precision of σ _{{T_eff}} = 50 K, σlog g = 0.09 dex and σ _{{{[Fe/H]}}} =0.035 dex at SNR = 50, while the precision for observation with SNR ≳ 100 and the overall accuracy are constrained by the literature values used to calibrate the CCFs. Our approach can easily be extended to other instruments with similar spectral range and resolution or to other spectral range and stars other than FGK dwarfs if a large sample of reference stars is available for the calibration. Additionally, we provide the mathematical formulation to convert synthetic equivalent widths to CCF parameters as an alternative to direct calibration. We have made our tool publicly available.

  11. Immobilized Artificial Membrane HPLC Derived Parameters vs PAMPA-BBB Data in Estimating in Situ Measured Blood-Brain Barrier Permeation of Drugs.

    PubMed

    Grumetto, Lucia; Russo, Giacomo; Barbato, Francesco

    2016-08-01

    The affinity indexes for phospholipids (log kW(IAM)) for 42 compounds were measured by high performance liquid chromatography (HPLC) on two different phospholipid-based stationary phases (immobilized artificial membrane, IAM), i.e., IAM.PC.MG and IAM.PC.DD2. The polar/electrostatic interaction forces between analytes and membrane phospholipids (Δlog kW(IAM)) were calculated as the differences between the experimental values of log kW(IAM) and those expected for isolipophilic neutral compounds having polar surface area (PSA) = 0. The values of passage through a porcine brain lipid extract (PBLE) artificial membrane for 36 out of the 42 compounds considered, measured by the so-called PAMPA-BBB technique, were taken from the literature (P0(PAMPA-BBB)). The values of blood-brain barrier (BBB) passage measured in situ, P0(in situ), for 38 out of the 42 compounds considered, taken from the literature, represented the permeability of the neutral forms on "efflux minimized" rodent models. The present work was aimed at verifying the soundness of Δlog kW(IAM) at describing the potential of passage through the BBB as compared to data achieved by the PAMPA-BBB technique. In a first instance, the values of log P0(PAMPA-BBB) (32 data points) were found significantly related to the n-octanol lipophilicity values of the neutral forms (log P(N)) (r(2) = 0.782) whereas no significant relationship (r(2) = 0.246) was found with lipophilicity values of the mixtures of ionized and neutral forms existing at the experimental pH 7.4 (log D(7.4)) as well as with either log kW(IAM) or Δlog kW(IAM) values. log P0(PAMPA-BBB) related moderately to log P0(in situ) values (r(2) = 0.604). The latter did not relate with either n-octanol lipophilicity indexes (log P(N) and log D(7.4)) or phospholipid affinity indexes (log kW(IAM)). 
In contrast, significant inverse linear relationships were observed between log P0(in situ) (38 data points) and Δlog kW(IAM) values for all the compounds but ibuprofen and chlorpromazine, which behaved as moderate outliers (r(2) = 0.656 and r(2) = 0.757 for values achieved on IAM.PC.MG and IAM.PC.DD2, respectively). Since log P0(in situ) values refer to the "intrinsic permeability" of the analytes regardless of their ionization degree, no correction of Δlog kW(IAM) values for ionization was needed. Furthermore, log P0(in situ) values were found to be roughly linearly related to log BB values (i.e., the logarithm of the ratio of brain concentration to blood concentration measured in vivo) for all the analytes but those predominantly present at the experimental pH 7.4 as anions. These results suggest that, at least for the data set considered, Δlog kW(IAM) parameters are more effective than log P0(PAMPA-BBB) at predicting log P0(in situ) values for all the analytes. Furthermore, ionization appears to affect BBB passage of acids (yielding anions) differently, and much more markedly, than that of the other ionizable compounds.

  12. Gradually truncated log-normal in USA publicly traded firm size distribution

    NASA Astrophysics Data System (ADS)

    Gupta, Hari M.; Campanha, José R.; de Aguiar, Daniela R.; Queiroz, Gabriel A.; Raheja, Charu G.

    2007-03-01

    We study the statistical distribution of firm size for USA and Brazilian publicly traded firms through the Zipf plot technique. Sale size is used to measure firm size. The Brazilian firm size distribution is given by a log-normal distribution without any adjustable parameter. However, we also need to consider different parameters of log-normal distribution for the largest firms in the distribution, which are mostly foreign firms. The log-normal distribution has to be gradually truncated after a certain critical value for USA firms. Therefore, the original hypothesis of proportional effect proposed by Gibrat is valid with some modification for very large firms. We also consider the possible mechanisms behind this distribution.
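The Zipf plot technique mentioned above amounts to plotting log(size) against log(rank) after sorting sizes in descending order; a gradually truncated log-normal shows up as a downward-bending tail. A minimal sketch with invented sales figures:

```python
# Sketch: construct Zipf-plot coordinates (log rank, log size) from firm
# sizes measured by sales. The sales figures are illustrative only.
import math

def zipf_points(sizes):
    """Sort descending and pair log(rank) with log(size)."""
    ranked = sorted(sizes, reverse=True)
    return [(math.log(r), math.log(s)) for r, s in enumerate(ranked, start=1)]

sales = [120.0, 85.0, 60.0, 44.0, 30.0, 21.0, 15.0, 9.0]
pts = zipf_points(sales)
# log(size) falls monotonically as log(rank) grows
print(all(pts[i][1] > pts[i + 1][1] for i in range(len(pts) - 1)))
```

Departures of the largest firms from the straight log-normal trend on such a plot are what motivate the gradual-truncation modification to Gibrat's proportional-effect hypothesis.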

  13. Artificial neural network modeling and cluster analysis for organic facies and burial history estimation using well log data: A case study of the South Pars Gas Field, Persian Gulf, Iran

    NASA Astrophysics Data System (ADS)

    Alizadeh, Bahram; Najjari, Saeid; Kadkhodaie-Ilkhchi, Ali

    2012-08-01

    Intelligent and statistical techniques were used to extract hidden organic facies from well log responses in the giant South Pars Gas Field, Persian Gulf, Iran. Data from the Mid-Cretaceous Kazhdomi Formation and the Permo-Triassic Kangan-Dalan Formations were used for this purpose. Initially, GR, SGR, CGR, THOR, POTA, NPHI and DT logs were applied to model the relationship between wireline logs and Total Organic Carbon (TOC) content using Artificial Neural Networks (ANN). The correlation coefficient (R2) between the measured and ANN-predicted TOC equals 89%. The performance of the model is measured by the mean squared error function, which does not exceed 0.0073. Using the cluster analysis technique and creating a binary hierarchical cluster tree, the constructed TOC column of each formation was clustered into 5 organic facies according to their geochemical similarity. Later, a second model with an accuracy of 84% was created by ANN to determine the specified clusters (facies) directly from well logs for quick cluster recognition in other wells of the studied field. Each created facies was correlated to its appropriate burial history curve. Hence, each facies of a formation could be scrutinized separately and directly from its well logs, demonstrating the time and depth of oil or gas generation. Therefore, potential production zones of the Kazhdomi probable source rock and the Kangan-Dalan reservoir formation could be identified while well logging operations (especially in LWD cases) are in progress. This could reduce uncertainty, save considerable time and cost for the oil industry, and aid in the successful implementation of exploration and exploitation plans.

  14. Automated Detection of Selective Logging in Amazon Forests Using Airborne Lidar Data and Pattern Recognition Algorithms

    NASA Astrophysics Data System (ADS)

    Keller, M. M.; d'Oliveira, M. N.; Takemura, C. M.; Vitoria, D.; Araujo, L. S.; Morton, D. C.

    2012-12-01

    Selective logging, the removal of several valuable timber trees per hectare, is an important land use in the Brazilian Amazon and may degrade forests through long-term changes in structure, loss of forest carbon and loss of species diversity. Similar to deforestation, the annual area affected by selective logging has declined significantly in the past decade. Nonetheless, this land use affects several thousand km2 per year in Brazil. We studied a 1000 ha area of the Antimary State Forest (FEA) in the State of Acre, Brazil (9.304°S, 68.281°W), which has a basal area of 22.5 m2 ha-1 and an above-ground biomass of 231 Mg ha-1. Logging intensity was low, approximately 10 to 15 m3 ha-1. We collected small-footprint airborne lidar data using an Optech ALTM 3100EA over the study area once each in 2010 and 2011. The study area contained both recent and older logging that used both conventional and technologically advanced logging techniques. Lidar return density averaged over 20 m-2 for both collection periods, with estimated horizontal and vertical precision of 0.30 and 0.15 m. A relative density model comparing returns from the 0-1 m elevation range to returns in the 1-5 m elevation range revealed the pattern of roads and skid trails. These patterns were confirmed by ground-based GPS survey. A GIS model of the road and skid network was built using lidar and ground data. We tested and compared two pattern recognition approaches used to automate logging detection: commercial eCognition segmentation and a Frangi filter algorithm. Both identified the road and skid trail network when compared against the GIS model. We report on the effectiveness of these two techniques.
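The relative density model described above can be sketched as a per-cell ratio of low lidar returns. This is a simplified reading of the method: the cell size, the exact ratio definition, and the toy returns below are all assumptions for illustration.

```python
# Sketch: a relative density model (RDM) on lidar returns. For each grid
# cell, take the ratio of near-ground returns (0-1 m) to all low returns
# (0-5 m); cleared roads and skid trails show a high ratio.

def rdm(returns, cell=10.0):
    """returns: list of (x, y, height_above_ground) -> {cell: ratio}."""
    cells = {}
    for x, y, h in returns:
        if h > 5.0:                      # ignore canopy returns
            continue
        key = (int(x // cell), int(y // cell))
        ground, low = cells.get(key, (0, 0))
        cells[key] = (ground + (1 if h <= 1.0 else 0), low + 1)
    return {k: ground / low for k, (ground, low) in cells.items()}

returns = [
    (2, 3, 0.2), (4, 6, 0.5), (7, 1, 0.8),      # cleared skid-trail cell
    (12, 3, 3.5), (14, 6, 4.2), (17, 2, 0.4),   # vegetated understory cell
]
ratios = rdm(returns)
print(ratios[(0, 0)], round(ratios[(1, 0)], 2))
```

A thresholded map of such ratios is the kind of input the segmentation and Frangi-filter approaches would then trace into a road and skid-trail network.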

  15. Solubility enhancement of dioxins and PCBs by surfactant monomers and micelles quantified with polymer depletion techniques.

    PubMed

    Schacht, Veronika J; Grant, Sharon C; Escher, Beate I; Hawker, Darryl W; Gaus, Caroline

    2016-06-01

    Partitioning of super-hydrophobic organic contaminants (SHOCs) to dissolved or colloidal materials such as surfactants can alter their behaviour by enhancing apparent aqueous solubility. Relevant partition constants are, however, challenging to quantify with reasonable accuracy. Partition constants to colloidal surfactants can be measured by introducing a polymer (PDMS) as third phase with known PDMS-water partition constant in combination with the mass balance approach. We quantified partition constants of PCBs and PCDDs (log KOW 5.8-8.3) between water and sodium dodecyl sulphate monomers (KMO) and micelles (KMI). A refined, recently introduced swelling-based polymer loading technique allowed highly precise (4.5-10% RSD) and fast (<24 h) loading of SHOCs into PDMS, and due to the miniaturisation of batch systems equilibrium was reached in <5 days for KMI and <3 weeks for KMO. SHOC losses to experimental surfaces were substantial (8-26%) in monomer solutions, but had a low impact on KMO (0.10-0.16 log units). Log KMO for PCDDs (4.0-5.2) were approximately 2.6 log units lower than respective log KMI, which ranged from 5.2 to 7.0 for PCDDs and 6.6-7.5 for PCBs. The linear relationship between log KMI and log KOW was consistent with more polar and moderately hydrophobic compounds. Apparent solubility increased with increasing hydrophobicity and was highest in micelle solutions. However, this solubility enhancement was also considerable in monomer solutions, up to 200 times for OCDD. Given the pervasive presence of surfactant monomers in typical field scenarios, these data suggest that low surfactant concentrations may be effective long-term facilitators for subsurface transport of SHOCs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. A proven record in changing attitudes about MWD logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cantrell, L.; Paxson, K.B.; Keyser, W.L.

    1993-07-01

    Measurement while drilling (MWD) logs for quantitative reservoir characterization were evaluated during drilling of Gulf of Mexico flexure trend projects, Kilauea (Green Canyon Blocks 6 and 50) and Tick (Garden Banks Block 189). Comparisons confirmed that MWD can be used as an accurate replacement for wireline logging when borehole size is not a limiting factor. Texaco MWD experience evolved from 'last resort' to primary formation evaluation logging, which resulted in rig-time and associated cost savings. Difficult wells are now drilled and evaluated with confidence, geopressure is safely monitored, conventional core interval tops are selected, and geologic interpretations and operational decisions are made before wells reach TD. This paper reviews the performance, accuracy, and limitations of the MWD systems and compares the results to standard geophysical well logging techniques. Four case histories are presented.

  17. Empirical Mode Decomposition of Geophysical Well-log Data of Bombay Offshore Basin, Mumbai, India

    NASA Astrophysics Data System (ADS)

    Siddharth Gairola, Gaurav; Chandrasekhar, Enamundram

    2016-04-01

    Geophysical well-log data manifest the nonlinear behaviour of the physical properties of heterogeneous subsurface layers as a function of depth. Therefore, nonlinear data analysis techniques must be implemented to quantify the degree of heterogeneity in the subsurface lithologies. One such nonlinear, data-adaptive technique is empirical mode decomposition (EMD), which decomposes the data into oscillatory signals of different wavelengths called intrinsic mode functions (IMFs). In the present study, EMD has been applied to the gamma-ray log and neutron porosity log of two different wells, Well B and Well C, located in the western offshore basin of India, to perform heterogeneity analysis and compare the results with those obtained by multifractal studies of the same data sets. By establishing a relationship between the IMF number (m) and the mean wavelength associated with each IMF (Im), a heterogeneity index (ρ) associated with subsurface layers can be determined using the relation Im = k·ρ^m, where k is a constant. The ρ values bear an inverse relation to the heterogeneity of the subsurface: smaller ρ values designate higher heterogeneity and vice versa. The ρ values estimated for different limestone payzones identified in the wells clearly show that Well C has a higher degree of heterogeneity than Well B. This correlates well with the estimated Vshale values for the limestone reservoir zone, showing higher shale content in Well C than in Well B. The ρ values determined for different payzones of both wells will be used to quantify the degree of heterogeneity in different wells. The multifractal behaviour of each IMF of both logs of both wells will be compared and discussed in terms of their heterogeneity indices.
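Given mean IMF wavelengths, the heterogeneity index ρ follows from the relation Im = k·ρ^m by a straight-line fit of ln(Im) against m, since ln(Im) = ln(k) + m·ln(ρ). A minimal sketch with synthetic wavelengths (the EMD step itself is not reproduced):

```python
# Sketch: recover the heterogeneity index rho from mean IMF wavelengths
# via Im = k * rho**m, i.e. a least-squares line fit of ln(Im) vs m.
# The wavelength values are synthetic, generated from known k and rho.
import math

def heterogeneity_index(wavelengths):
    """Slope of ln(Im) against IMF number m, exponentiated to give rho."""
    ms = list(range(1, len(wavelengths) + 1))
    ys = [math.log(w) for w in wavelengths]
    n = len(ms)
    mx, my = sum(ms) / n, sum(ys) / n
    slope = sum((m - mx) * (y - my) for m, y in zip(ms, ys)) / \
            sum((m - mx) ** 2 for m in ms)
    return math.exp(slope)      # the intercept would give ln(k)

im_values = [2 * 1.8 ** m for m in range(1, 7)]   # k = 2, rho = 1.8
rho = heterogeneity_index(im_values)
print(round(rho, 2))
```

On real logs the ln(Im)-vs-m points scatter about the line, and the fitted ρ is what the study compares between payzones.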

  18. Drill Cuttings-based Methodology to Optimize Multi-stage Hydraulic Fracturing in Horizontal Wells and Unconventional Gas Reservoirs

    NASA Astrophysics Data System (ADS)

    Ortega Mercado, Camilo Ernesto

    Horizontal drilling and hydraulic fracturing techniques have become almost mandatory technologies for economic exploitation of unconventional gas reservoirs. Key to commercial success is minimizing the risk while drilling and hydraulic fracturing these wells. Data collection is expensive and as a result this is one of the first casualties during budget cuts. As a result complete data sets in horizontal wells are nearly always scarce. In order to minimize the data scarcity problem, the research addressed throughout this thesis concentrates on using drill cuttings, an inexpensive direct source of information, for developing: 1) A new methodology for multi-stage hydraulic fracturing optimization of horizontal wells without any significant increases in operational costs. 2) A new method for petrophysical evaluation in those wells with limited amount of log information. The methods are explained using drill cuttings from the Nikanassin Group collected in the Deep Basin of the Western Canada Sedimentary Basin (WCSB). Drill cuttings are the main source of information for the proposed methodology in Item 1, which involves the creation of three 'log tracks' containing the following parameters for improving design of hydraulic fracturing jobs: (a) Brittleness Index, (b) Measured Permeability and (c) An Indicator of Natural Fractures. The brittleness index is primarily a function of Poisson's ratio and Young Modulus, parameters that are obtained from drill cuttings and sonic logs formulations. Permeability is measured on drill cuttings in the laboratory. The indication of natural fractures is obtained from direct observations on drill cuttings under the microscope. Drill cuttings are also the main source of information for the new petrophysical evaluation method mentioned above in Item 2 when well logs are not available. This is important particularly in horizontal wells where the amount of log data is almost non-existent in the vast majority of the wells. 
By combining data from drill cuttings and previously available empirical relationships developed from cores it is possible to estimate water saturations, pore throat apertures, capillary pressures, flow units, porosity (or cementation) exponent m, true formation resistivity Rt, distance to a water table (if present), and to distinguish the contributions of viscous and diffusion-like flow in the tight gas formation. The method further allows the construction of Pickett plots using porosity and permeability obtained from drill cuttings, without previous availability of well logs. The method assumes the existence of intervals at irreducible water saturation, which is the case of the Nikanassin Group throughout the gas column. The new methods mentioned above are not meant to replace the use of detailed and sophisticated evaluation techniques. But the proposed methods provide a valuable and practical aid in those cases where geomechanical and petrophysical information are scarce.
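The Pickett-plot step mentioned above rests on Archie's equation, which links porosity, true resistivity, and water saturation. The sketch below uses generic default parameters (a, m, n, Rw); these are illustrative, not the Nikanassin Group calibration.

```python
# Sketch: Archie's equation as used on a Pickett plot to estimate water
# saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n). Parameter values
# are generic defaults, not the study's calibration.

def archie_sw(phi, rt, rw=0.05, a=1.0, m=2.0, n=2.0):
    """Water saturation from porosity phi (fraction) and resistivity Rt (ohm-m)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Porosity 8 %, deep resistivity 80 ohm-m -> low Sw, consistent with a
# gas-bearing tight interval at irreducible water saturation.
sw = archie_sw(phi=0.08, rt=80.0)
print(round(sw, 2))
```

On a Pickett plot the same relation appears as straight lines of constant Sw in log(Rt)-log(phi) space, which is why porosity and permeability from cuttings suffice to construct it.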

  19. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  20. Nondestructive Methods for Detecting Defects in Softwood Logs

    Treesearch

    Kristin C. Schad; Daniel L. Schmoldt; Robert J. Ross

    1996-01-01

    Wood degradation and defects, such as voids and knots, affect the quality and processing time of lumber. The ability to detect internal defects in the log can save mills time and processing costs. In this study, we investigated three nondestructive evaluation techniques for detecting internal wood defects. Sound wave transmission, x-ray computed tomography, and impulse...

  1. Techniques for the wheeled-skidder operator

    Treesearch

    Robert L. Hartman; Harry G. Gibson

    1970-01-01

    How much production a logger gets from a logging job may depend heavily on his skidder operators. They are key men on any logging job. This is one conclusion that forestry engineers at the USDA Forest Service's Forestry Sciences Laboratory at Morgantown, West Virginia, came to after studying the operation of wheeled skidders in mountainous Appalachian terrain....

  2. CT Imaging, Data Reduction, and Visualization of Hardwood Logs

    Treesearch

    Daniel L. Schmoldt

    1996-01-01

    Computed tomography (CT) is a mathematical technique that, combined with noninvasive scanning such as x-ray imaging, has become a powerful tool to nondestructively test materials prior to use or to evaluate materials prior to processing. In the current context, hardwood lumber processing can benefit greatly by knowing what a log looks like prior to initial breakdown....

  3. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of trajectory log files from the linear accelerator on Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse with the three techniques of step and shoot (SS), sliding window (SW) and RapidArc (RA) was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection capability showed that the HN IMRT plans were the most sensitive: a 0.2 mm systematic error produced a 0.7% dose deviation on average. MLC random errors did not affect the dose error. Conclusion: The use of trajectory log files including actual information on MLC location, gantry angle, etc. should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. From the standpoint of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
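The comparison of planned and delivered MLC positions that underlies such a secondary check can be sketched as a simple deviation summary. The tolerance value and leaf positions below are illustrative assumptions, not values from the study.

```python
# Sketch: compare planned (DICOM-RT) vs delivered (trajectory-log) MLC leaf
# positions and flag systematic offsets. Tolerance and positions are
# illustrative; a real check covers every leaf at every control point.

def mlc_deviation(planned, delivered, tolerance_mm=0.5):
    devs = [d - p for p, d in zip(planned, delivered)]
    mean_dev = sum(devs) / len(devs)
    max_dev = max(abs(d) for d in devs)
    return {"mean_mm": round(mean_dev, 3),
            "max_mm": round(max_dev, 3),
            "within_tolerance": max_dev <= tolerance_mm}

planned   = [10.0, 12.5, 15.0, 17.5]   # leaf positions, mm
delivered = [10.2, 12.7, 15.2, 17.7]   # a 0.2 mm systematic offset
res = mlc_deviation(planned, delivered)
print(res)
```

A nonzero mean with a small spread signals a systematic shift (which the study found dosimetrically relevant), whereas a zero mean with large spread signals random error (which it found had little dose impact).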

  4. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting, and expectation maximization.
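Of the three techniques listed, iterative proportional fitting is the most compact to sketch: it rescales a seed contingency table until its margins match known row and column totals. The seed table and margins below are illustrative.

```python
# Sketch: iterative proportional fitting (IPF). Alternately scale rows and
# columns of a seed table so its margins match target totals; the result
# preserves the seed's interaction structure. Seed and margins are
# illustrative exposure-data stand-ins.

def ipf(table, row_targets, col_targets, iters=50):
    t = [row[:] for row in table]
    for _ in range(iters):
        for i, rt in enumerate(row_targets):            # scale rows
            s = sum(t[i])
            t[i] = [v * rt / s for v in t[i]]
        for j, ct in enumerate(col_targets):            # scale columns
            s = sum(t[i][j] for i in range(len(t)))
            for i in range(len(t)):
                t[i][j] *= ct / s
    return t

seed = [[1.0, 1.0], [1.0, 1.0]]
fitted = ipf(seed, row_targets=[60, 40], col_targets=[30, 70])
print([[round(v, 1) for v in row] for row in fitted])
```

With a uniform seed the fitted cells equal the independence table (row total × column total / grand total); an informative seed instead carries its association pattern into the adjusted table.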

  5. Joint Inversion of Geochemical Data and Geophysical Logs for Lithology Identification in CCSD Main Hole

    NASA Astrophysics Data System (ADS)

    Deng, Chengxiang; Pan, Heping; Luo, Miao

    2017-12-01

    The Chinese Continental Scientific Drilling (CCSD) main hole is located in the Sulu ultrahigh-pressure metamorphic (UHPM) belt, providing significant opportunities for studying metamorphic strata structure, kinetic processes, and tectonic evolution. Lithology identification is the primary and crucial stage for the above geoscientific research. To ease the burden on log analysts and improve the efficiency of lithology interpretation, many algorithms have been developed to automate lithology prediction. Traditional statistical techniques, such as discriminant analysis and K-nearest-neighbor classifiers, are poor at extracting the nonlinear features of metamorphic rocks from complex geophysical log data; artificial intelligence algorithms can solve nonlinear problems, but most struggle to tune model parameters to a global rather than a local optimum, and also face challenges in balancing training accuracy against generalization ability. Optimization methods have been applied extensively to the inversion of reservoir parameters of sedimentary formations using well logs. However, it is difficult to obtain accurate solutions from the logging response equations of the optimization method when applied in metamorphic formations, because of the strong overlap of nonstationary log signals. As the oxide contents of the various kinds of metamorphic rocks overlap relatively little, this study explores an approach, set in a metamorphic formation model and using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) optimization algorithm, to identify lithology from oxide data. We first incorporate 11 geophysical logs and lab-collected geochemical data from 47 core samples to construct an oxide profile of the CCSD main hole using a backwards stepwise multiple regression method, which eliminates irrelevant input logs step by step for higher statistical significance and accuracy. 
We then establish oxide response equations in accordance with the metamorphic formation model and employ the BFGS algorithm to minimize the objective function. Finally, we identify lithology according to the component content that accounts for the largest proportion. The results show that the lithology identified by the method of this paper is consistent with the core descriptions. Moreover, this method demonstrates the benefits of using oxide content as an adhesive connecting logging data with lithology; it makes the metamorphic formation model more understandable and accurate, and avoids having to select a complex formation model and build nonlinear logging response equations.
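The inversion idea (minimize the misfit between modeled and observed oxide contents over component fractions, then pick the dominant component) can be sketched with a toy problem; the oxide response matrix, fractions, and penalty weight below are illustrative assumptions, not CCSD values:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical oxide response matrix: rows = oxides (e.g. SiO2, Al2O3, FeO),
# columns = end-member rock components; values are illustrative only.
A = np.array([[70.0, 50.0, 45.0],
              [14.0, 16.0,  8.0],
              [ 3.0, 10.0, 12.0]])

def objective(f, observed):
    """Misfit between modeled (A @ f) and observed oxide contents, with a
    quadratic penalty pushing the component fractions to sum to one."""
    return np.sum((A @ f - observed) ** 2) + 100.0 * (np.sum(f) - 1.0) ** 2

true_f = np.array([0.6, 0.3, 0.1])
observed = A @ true_f                       # noise-free synthetic observation

res = minimize(objective, x0=np.full(3, 1 / 3), args=(observed,), method="BFGS")
fractions = res.x
dominant_component = int(np.argmax(fractions))   # lithology = largest fraction
```

With a full-rank response matrix and noise-free data, BFGS recovers the true fractions, and the argmax plays the role of the lithology call.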

  6. On the Rapid Computation of Various Polylogarithmic Constants

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Borwein, Peter; Plouffe, Simon

    1996-01-01

    We give algorithms for the computation of the d-th digit of certain transcendental numbers in various bases. These algorithms can be easily implemented (multiple precision arithmetic is not needed), require virtually no memory, and feature run times that scale nearly linearly with the order of the digit desired. They make it feasible to compute, for example, the billionth binary digit of log(2) or pi on a modest workstation in a few hours run time. We demonstrate this technique by computing the ten billionth hexadecimal digit of pi, the billionth hexadecimal digits of pi-squared, log(2) and log-squared(2), and the ten billionth decimal digit of log(9/10). These calculations rest on the observation that very special types of identities exist for certain numbers like pi, pi-squared, log(2) and log-squared(2). These are essentially polylogarithmic ladders in an integer base. A number of these identities that we derive in this work appear to be new, for example a critical identity for pi.
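The digit-extraction scheme rests on the BBP identity pi = Σ_k 16^(-k) [4/(8k+1) − 2/(8k+4) − 1/(8k+5) − 1/(8k+6)], whose terms can be summed modularly so that only the fractional part near the target digit is kept. A short sketch of the hexadecimal-digit computation (a standard implementation of the published formula, not the authors' original code):

```python
def pi_hex_digit(d):
    """Return the d-th hexadecimal digit of pi after the point (d >= 1),
    using the Bailey-Borwein-Plouffe digit-extraction formula."""
    def series(j):
        # fractional part of sum_k 16^(d-1-k) / (8k + j)
        s = 0.0
        for k in range(d):                 # finite part via modular exponentiation
            s += pow(16, d - 1 - k, 8 * k + j) / (8 * k + j)
            s -= int(s)
        k = d
        while True:                        # rapidly converging tail
            term = 16 ** (d - 1 - k) / (8 * k + j)
            if term < 1e-17:
                break
            s += term
            k += 1
        return s - int(s)

    frac = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return int(frac * 16)

# The hex expansion of pi is 3.243F6A88...
digits = [pi_hex_digit(i) for i in range(1, 9)]
```

Because only fractional parts are carried, memory stays constant and the cost grows nearly linearly with the digit position, as the abstract states.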

  7. Development of regional stump-to-mill logging cost estimators

    Treesearch

    Chris B. LeDoux; John E. Baumgras

    1989-01-01

    Planning logging operations requires estimating the logging costs for the sale or tract being harvested. Decisions need to be made on equipment selection and its application to terrain. In this paper a methodology is described that has been developed and implemented to solve the problem of accurately estimating logging costs by region. The methodology blends field time...

  8. Tangential scanning of hardwood logs: developing an industrial computer tomography scanner

    Treesearch

    Nand K. Gupta; Daniel L. Schmoldt; Bruce Isaacson

    1999-01-01

    It is generally believed that noninvasive scanning of hardwood logs such as computer tomography (CT) scanning prior to initial breakdown will greatly improve the processing of logs into lumber. This belief, however, has not translated into rapid development and widespread installation of industrial CT scanners for log processing. The roadblock has been more operational...

  9. Use of processed resistivity borehole imaging to assess the insoluble content of the massively bedded Preesall Halite NW England

    NASA Astrophysics Data System (ADS)

    Kingdon, Andrew; Evans, David J.

    2013-04-01

    With the decline of the UK's remaining conventional reserves of natural gas and the associated growth of imports, the lack of adequate storage capacity is a matter of concern for ensuring year-round energy security. In a number of countries, subsurface caverns for gas storage have been created by solution mining of massive halite deposits, and similar storage facilities are likely to become an important part of the UK's energy infrastructure. Crucial to the economic viability of such facilities is the percentage of insoluble material within the halite intervals, which strongly influences the relationship between cavern sump and working volumes: successful development of these caverns depends upon maximising the efficiency of cavern design and construction. The purity of a massive halite sequence can only be assessed either directly (i.e. by coring) or indirectly by downhole geophysical logs. The use of conventional geophysical logs in subsurface exploration is well established, but the literature generally relies on very low resolution tools with a typical vertical logging sample interval of 15 centimetres. Such tools therefore provide, at best, a "blurred" view of the sedimentary successions penetrated by the borehole, and discrete narrow bands of insoluble material will not be identifiable or distinguishable from zones of "dirtier" halite with disseminated mud materials. In 2008, Halite-Energy Group (formerly Canatxx Gas Storage Ltd) drilled the Burrows Marsh #1 borehole and acquired resistivity borehole imaging (FMI) logs through the Triassic Preesall Halite in the Preesall Saltfield, NW England. In addition to near-full circumferential imaging capability, rather than a single measurement per increment, FMI logs allow millimetre- to centimetre-scale imaging of sedimentary features, that is, one to two orders of magnitude higher vertical resolution. 
After binary segmentation of the FMI images to achieve a simple halite-insoluble ("mud") separation, the images were subjected to a filtering process to develop a detailed understanding of the halite sequence's insoluble content. The results were then calibrated, post-normalisation, against new laboratory determinations of the insoluble content of laterally equivalent core samples from the nearby Arm Hill #1 borehole. The FMI logs provide a greater degree of resolution than conventional geophysical logs, and the statistical analysis provided by this process further enhances the correlation between the logs and core and, ultimately, the assessment of insoluble content. Despite the obvious increase in resolution, precise statistical quantification of the success of the borehole imaging technique is somewhat obfuscated by the absence of both FMI logs and continuous core in a single borehole. The acquisition parameters for these images are at the limits for the tools, and the images are therefore noisier than those acquired in other lithologies or logging environments. The optimum acquisition parameters (in particular gain settings and logging speed), the nature of the filtering required to quantify the insoluble content, and the effects of image noise on those calculations are discussed.

  10. SU-G-JeP1-08: Dual Modality Verification for Respiratory Gating Using New Real- Time Tumor Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Shibuya, K

    Purpose: A respiratory gating system combining TrueBeam and a new real-time tumor-tracking radiotherapy (RTRT) system was installed. The RTRT system consists of two x-ray tubes and color image intensifiers. Using fluoroscopic images, the fiducial marker implanted near the tumor was tracked and used as the internal surrogate for respiratory gating. The purpose of this study was to develop a verification technique for respiratory gating with the new RTRT using cine electronic portal imaging device (EPID) images from TrueBeam and log files from the RTRT. Methods: A patient who underwent respiratory-gated SBRT of the lung using the RTRT was enrolled in this study. The log files of the three-dimensional coordinates of the fiducial marker used as an internal surrogate were acquired using the RTRT. Simultaneously, cine EPID images were acquired during respiratory-gated radiotherapy. The data acquisition was performed for one field at five sessions during the course of SBRT. The residual motion errors were calculated using the log files (E_log). The fiducial marker in the cine EPID images was automatically extracted by in-house software based on a template-matching algorithm, and the differences between the marker positions in the cine EPID images and the digitally reconstructed radiograph were calculated (E_EPID). Results: Marker detection on the EPID using the in-house software was influenced by low image contrast. For one field during the course of SBRT, respiratory gating using the RTRT showed mean ± S.D. of the 95th percentile E_EPID of 1.3 ± 0.3 mm and 1.1 ± 0.5 mm, and of E_log of 1.5 ± 0.2 mm and 1.1 ± 0.2 mm, in the LR and SI directions, respectively. Conclusion: We have developed a verification method for respiratory gating combining TrueBeam and the new real-time tumor-tracking radiotherapy system using EPID images and log files.

  11. Conjunctive patches subspace learning with side information for collaborative image retrieval.

    PubMed

    Zhang, Lining; Wang, Lipo; Lin, Weisi

    2012-08-01

    Content-Based Image Retrieval (CBIR) has attracted substantial attention during the past few years for its potential practical applications to image management. A variety of Relevance Feedback (RF) schemes have been designed to bridge the semantic gap between the low-level visual features and the high-level semantic concepts for an image retrieval task. Various Collaborative Image Retrieval (CIR) schemes aim to utilize the user historical feedback log data with similar and dissimilar pairwise constraints to improve the performance of a CBIR system. However, existing subspace learning approaches with explicit label information cannot be applied for a CIR task, although the subspace learning techniques play a key role in various computer vision tasks, e.g., face recognition and image classification. In this paper, we propose a novel subspace learning framework, i.e., Conjunctive Patches Subspace Learning (CPSL) with side information, for learning an effective semantic subspace by exploiting the user historical feedback log data for a CIR task. The CPSL can effectively integrate the discriminative information of labeled log images, the geometrical information of labeled log images and the weakly similar information of unlabeled images together to learn a reliable subspace. We formally formulate this problem into a constrained optimization problem and then present a new subspace learning technique to exploit the user historical feedback log data. Extensive experiments on both synthetic data sets and a real-world image database demonstrate the effectiveness of the proposed scheme in improving the performance of a CBIR system by exploiting the user historical feedback log data.

  12. Mineral content prediction for unconventional oil and gas reservoirs based on logging data

    NASA Astrophysics Data System (ADS)

    Maojin, Tan; Youlong, Zou; Guoyue

    2012-09-01

    Coal bed methane and shale oil and gas are both important unconventional oil and gas resources. Their reservoirs are typically nonlinear, with complex and varied mineral components; logging data interpretation models are difficult to establish for calculating the mineral contents, and empirical formulas cannot be constructed because of the variety of minerals. Radial basis function (RBF) network analysis is a new method developed in recent years; the technique can generate a smooth continuous function of several variables to approximate an unknown forward model. Firstly, the basic principles of the RBF network are discussed, including the network construction and basis functions, and the network training by the adjacent clustering algorithm is described in detail. For multi-mineral content in coal bed methane and shale oil and gas reservoirs, the RBF interpolation method uses a number of well logging measurements to predict the mineral component contents. For coal-bed methane reservoir parameter prediction, the RBF method is used to calculate mineral contents such as ash, volatile matter, and carbon content, achieving a mapping from various logging data to multiple minerals. For shale gas reservoirs, the RBF method can be used to predict the clay, quartz, feldspar, carbonate, and pyrite contents. Various tests in coal bed and gas shale show that the method is effective and applicable for mineral component content prediction.
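A minimal sketch of Gaussian RBF interpolation of the kind described, mapping log responses to a mineral content (all names and the toy data are illustrative assumptions; the paper's adjacent clustering training step is not reproduced):

```python
import numpy as np

def rbf_fit(X, y, epsilon=1.0):
    """Fit a Gaussian RBF interpolant by solving Phi w = y, where
    Phi[i, j] = exp(-eps^2 * ||x_i - x_j||^2)."""
    X = np.asarray(X, dtype=float)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    phi = np.exp(-(epsilon ** 2) * d2)
    return np.linalg.solve(phi, np.asarray(y, dtype=float))

def rbf_predict(X_train, weights, X_new, epsilon=1.0):
    """Evaluate the fitted interpolant at new points."""
    X_train = np.asarray(X_train, dtype=float)
    X_new = np.asarray(X_new, dtype=float)
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-(epsilon ** 2) * d2) @ weights

# Toy example: two normalized "log" inputs (say, density and gamma ray)
# mapped to a hypothetical ash content in percent.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([10.0, 20.0, 15.0, 30.0])
w = rbf_fit(X, y)
pred = rbf_predict(X, w, X)   # an interpolant reproduces its training targets
```

The same machinery extends to several logs and several target minerals by fitting one weight vector per mineral.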

  13. An analysis of production and costs in high-lead yarding.

    Treesearch

    Magnus E. Tennas; Robert H. Ruth; Carl M. Berntsen

    1955-01-01

    In recent years loggers and timber owners have needed better information for estimating logging costs in the Douglas-fir region. Brandstrom's comprehensive study, published in 1933 (1), has long been used as a guide in making cost estimates. But the use of new equipment and techniques and an overall increase in logging costs have made it increasingly difficult to...

  14. Using nonlinear quantile regression to estimate the self-thinning boundary curve

    Treesearch

    Quang V. Cao; Thomas J. Dean

    2015-01-01

    The relationship between tree size (quadratic mean diameter) and tree density (number of trees per unit area) has been a topic of research and discussion for many decades. Starting with Reineke in 1933, the maximum size-density relationship, on a log-log scale, has been assumed to be linear. Several techniques, including linear quantile regression, have been employed...
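A quantile-regression boundary fit of the kind discussed can be sketched by minimizing the pinball loss on the log-log scale; the synthetic data, warm start, and tau below are illustrative assumptions, and this linear sketch omits the nonlinear extension the abstract refers to:

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(params, logN, logD, tau):
    """Asymmetric (pinball) loss whose minimizer is the tau-th quantile line."""
    a, b = params
    r = logD - (a + b * logN)
    return np.sum(np.where(r >= 0, tau * r, (tau - 1) * r))

def fit_boundary(N, D, tau=0.95):
    """Estimate the self-thinning boundary log(D) = a + b*log(N) as the
    tau-th quantile of the size-density scatter on a log-log scale."""
    logN, logD = np.log(N), np.log(D)
    slope0, intercept0 = np.polyfit(logN, logD, 1)     # OLS line as warm start
    res = minimize(pinball_loss, x0=[intercept0, slope0],
                   args=(logN, logD, tau), method="Nelder-Mead")
    return res.x                                        # intercept a, slope b

# Synthetic stands scattered below a Reineke-like boundary with slope -0.6
rng = np.random.default_rng(1)
N = rng.uniform(200.0, 3000.0, 300)                     # trees per unit area
D = np.exp(5.0 - 0.6 * np.log(N)) * rng.uniform(0.5, 1.0, 300)
a, b = fit_boundary(N, D, tau=0.95)
```

With a high tau, the fitted line tracks the upper edge of the scatter rather than its mean, which is what makes quantile regression suitable for boundary estimation.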

  15. A method of improving sensitivity of carbon/oxygen well logging for low porosity formation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Juntao; Zhang, Feng; Zhang, Quanying

    The Carbon/Oxygen (C/O) spectral logging technique has been widely used to determine residual oil saturation and to evaluate water-flooded layers. In order to improve the sensitivity of the technique for low-porosity formations, Gaussian and linear models are applied to fit the peaks of the measured spectra to obtain the characteristic coefficients. Standard spectra of carbon and oxygen are combined to establish a new carbon/oxygen value calculation method, and the robustness of the new method is cross-validated with a known mixed gamma-ray spectrum. Formation models for different porosities and saturations are built using the Monte Carlo method. The carbon/oxygen responses are calculated by both the conventional energy-window method and the new method for oil saturations under low-porosity conditions. The results show that the new method can reduce the effects of gamma rays contaminated by interactions between neutrons and other elements on the carbon/oxygen ratio, and can therefore significantly improve the response sensitivity of carbon/oxygen well logging to oil saturation. The new method greatly improves carbon/oxygen well logging under low-porosity conditions.

  16. A method of improving sensitivity of carbon/oxygen well logging for low porosity formation

    DOE PAGES

    Liu, Juntao; Zhang, Feng; Zhang, Quanying; ...

    2016-12-01

    The Carbon/Oxygen (C/O) spectral logging technique has been widely used to determine residual oil saturation and to evaluate water-flooded layers. In order to improve the sensitivity of the technique for low-porosity formations, Gaussian and linear models are applied to fit the peaks of the measured spectra to obtain the characteristic coefficients. Standard spectra of carbon and oxygen are combined to establish a new carbon/oxygen value calculation method, and the robustness of the new method is cross-validated with a known mixed gamma-ray spectrum. Formation models for different porosities and saturations are built using the Monte Carlo method. The carbon/oxygen responses are calculated by both the conventional energy-window method and the new method for oil saturations under low-porosity conditions. The results show that the new method can reduce the effects of gamma rays contaminated by interactions between neutrons and other elements on the carbon/oxygen ratio, and can therefore significantly improve the response sensitivity of carbon/oxygen well logging to oil saturation. The new method greatly improves carbon/oxygen well logging under low-porosity conditions.
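The Gaussian peak-fitting step described above can be illustrated with a synthetic spectrum; the peak energy, noise level, and all names are assumptions for illustration, not values from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(E, amp, mu, sigma):
    """Single Gaussian photopeak: amp * exp(-(E - mu)^2 / (2 sigma^2))."""
    return amp * np.exp(-((E - mu) ** 2) / (2 * sigma ** 2))

# Synthetic gamma spectrum: one peak near the 4.44 MeV carbon line plus noise
rng = np.random.default_rng(0)
E = np.linspace(3.5, 5.5, 200)                       # energy axis, MeV
counts = gaussian(E, 1000.0, 4.44, 0.12) + rng.normal(0.0, 5.0, E.size)

# Fit the peak to recover its characteristic coefficients
popt, _ = curve_fit(gaussian, E, counts, p0=[800.0, 4.4, 0.1])
amp, mu, sigma = popt
```

Fitting the peak shape rather than summing fixed energy windows is what lets the characteristic coefficients be extracted with less contamination from neighboring lines.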

  17. ALOG: A spreadsheet-based program for generating artificial logs

    Treesearch

    Matthew F. Winn; Randolph H. Wynne; Philip A. Araman

    2004-01-01

    Log sawing simulation computer programs can be valuable tools for training sawyers as well as for testing different sawing patterns. Most available simulation programs rely on databases from which to draw logs and can be very costly and time-consuming to develop. ALOG (Artificial LOg Generator) is a Microsoft Excel®-based computer program that was developed to...

  18. High-Resolution Flow Logging for Hydraulic Characterization of Boreholes and Aquifer Flow Zones at Contaminated Bedrock Sites

    NASA Astrophysics Data System (ADS)

    Williams, J. H.; Johnson, C. D.; Paillet, F. L.

    2004-05-01

    In the past, flow logging was largely restricted to the application of spinner flowmeters to determine flow-zone contributions in large-diameter production wells screened in highly transmissive aquifers. Development and refinement of tool-measurement technology, field methods, and analysis techniques have greatly extended and enhanced flow logging to include the hydraulic characterization of boreholes and aquifer flow zones at contaminated bedrock sites. The state of the art in flow logging will be reviewed, and its application to bedrock-contamination investigations will be presented. In open bedrock boreholes, vertical flows are measured with high-resolution flowmeters equipped with flexible rubber-disk diverters fitted to the nominal borehole diameters to concentrate flow through the measurement throat of the tools. Heat-pulse flowmeters measure flows in the range of 0.05 to 5 liters per minute, and electromagnetic flowmeters measure flows in the range of 0.3 to 30 liters per minute. Under ambient and low-rate stressed (either extraction or injection) conditions, stationary flowmeter measurements are collected in competent sections of the borehole between fracture zones identified on borehole-wall images. Continuous flow, fluid-resistivity, and temperature logs are collected under both sets of conditions while trolling with a combination electromagnetic flowmeter and fluid tool. Electromagnetic flowmeters are used with underfit diverters to measure flow rates greater than 30 liters per minute and to suppress the effects of diameter variations while trolling. A series of corrections are applied to the flow-log data to account for the zero-flow response, bypass, trolling, and borehole-diameter biases and effects. The flow logs are quantitatively analyzed by matching simulated flows computed with a numerical model to measured flows by varying the hydraulic properties (transmissivity and hydraulic head) of the flow zones. 
Several case studies will be presented that demonstrate the integration of flow logging in site-characterization activities, including its use to evaluate cross-connection effects and determine flow-zone contributions to water-quality samples from open boreholes, and to design discrete-zone hydraulic tests and monitoring-well completions.

  19. Study on fracture identification of shale reservoir based on electrical imaging logging

    NASA Astrophysics Data System (ADS)

    Yu, Zhou; Lai, Fuqiang; Xu, Lei; Liu, Lin; Yu, Tong; Chen, Junyu; Zhu, Yuantong

    2017-05-01

    In recent years, shale gas exploration has made important progress and achieved major breakthroughs, and within it the study of mud shale fractures is extremely important: the development of fractures plays an important role in the development of gas reservoirs. Based on core observation and the analysis of laboratory thin sections and related materials, this paper classifies the lithology of the shale reservoirs of well XX in the Zhanhua Depression. Based on the logging-curve response of the mudstone fractures, the relationship between fracture development and logging response is established, and conventional logging and electrical imaging logging are combined to identify the fractures, finally determining the fracture types in the area and quantitatively characterizing the fractures. From the analysis of well XX in the Zhanhua Depression, it is concluded that high-angle fractures and microfractures are developed in the study area. The shape of the fractures can be clearly seen with imaging logging technology, allowing their type to be determined.

  20. Geoscience techniques for engineering assessment of Oman to India pipeline route

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baerenwald, P.D.; Mullee, J.E.; Campbell, K.J.

    1996-12-31

    A variety of geoscience techniques were used to define soil conditions and evaluate geologic processes in order to develop design criteria for complex segments of the proposed Oman to India pipeline route. Geophysical survey data, seafloor cores, ROV observation of the seafloor, and oceanographic measurements were the principal field data collected. Geotechnical soil testing, X-ray radiography, detailed geologic logging, and C-14 age dating of cores were carried out. The diverse sets of field data and lab test results were integrated by a multi-disciplined team of geoscientists and engineers to develop geologic and soil models, soil design criteria, a turbid flow model, and seafloor stability models. The integrated approach used here is applicable to other complex areas where seafloor stability needs to be assessed or design criteria need to be developed for active geologic processes.

  1. Field experiment provides ground truth for surface nuclear magnetic resonance measurement

    USGS Publications Warehouse

    Knight, R.; Grunewald, E.; Irons, T.; Dlubac, K.; Song, Y.; Bachman, H.N.; Grau, B.; Walsh, D.; Abraham, J.D.; Cannia, J.

    2012-01-01

    The need for sustainable management of fresh water resources is one of the great challenges of the 21st century. Since most of the planet's liquid fresh water exists as groundwater, it is essential to develop non-invasive geophysical techniques to characterize groundwater aquifers. A field experiment was conducted in the High Plains Aquifer, central United States, to explore the mechanisms governing the non-invasive Surface NMR (SNMR) technology. We acquired both SNMR data and logging NMR data at a field site, along with lithology information from drill cuttings. This allowed us to directly compare the NMR relaxation parameter measured during logging, T2, to the relaxation parameter T2* measured using the SNMR method. The latter can be affected by inhomogeneity in the magnetic field, thus obscuring the link between the NMR relaxation parameter and the hydraulic conductivity of the geologic material. When the logging T2 data were transformed to pseudo-T2* data, by accounting for inhomogeneity in the magnetic field and instrument dead time, we found good agreement with T2* obtained from the SNMR measurement. These results, combined with the additional information about lithology at the site, allowed us to delineate the physical mechanisms governing the SNMR measurement. Such understanding is a critical step in developing SNMR as a reliable geophysical method for the assessment of groundwater resources.
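The core of the T2-to-pseudo-T2* transformation can be sketched with the standard relaxation-rate relation 1/T2* = 1/T2 + 1/T2_inh, where T2_inh captures dephasing due to field inhomogeneity; the dead-time correction mentioned above is omitted here, and the example values are purely illustrative:

```python
def pseudo_t2_star(t2_s, t2_inhomogeneity_s):
    """Convert a logging T2 (seconds) to a pseudo-T2* by adding the
    dephasing rate from magnetic-field inhomogeneity:
        1/T2* = 1/T2 + 1/T2_inh
    """
    return 1.0 / (1.0 / t2_s + 1.0 / t2_inhomogeneity_s)

# Example: a 200 ms logging T2 in a strongly inhomogeneous field (T2_inh = 50 ms)
t2_star = pseudo_t2_star(0.200, 0.050)
```

Since rates add, T2* is always shorter than either input time, which is why the SNMR measurement is systematically faster-decaying than the logging T2.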

  2. PAMPA--critical factors for better predictions of absorption.

    PubMed

    Avdeef, Alex; Bendels, Stefanie; Di, Li; Faller, Bernard; Kansy, Manfred; Sugano, Kiyohiko; Yamauchi, Yukinori

    2007-11-01

    PAMPA, log P(OCT), and Caco-2 are useful tools in drug discovery for the prediction of oral absorption, brain penetration and for the development of structure-permeability relationships. Each approach has its advantages and limitations. Selection criteria for methods are based on many different factors: predictability, throughput, cost and personal preferences (people factor). The PAMPA concerns raised by Galinis-Luciani et al. (Galinis-Luciani et al., 2007, J Pharm Sci, this issue) are answered by experienced PAMPA practitioners, inventors and developers from diverse research organizations. Guidelines on how to use PAMPA are discussed. PAMPA and PAMPA-BBB have much better predictivity for oral absorption and brain penetration than log P(OCT) for real-world drug discovery compounds. PAMPA and Caco-2 have similar predictivity for passive oral absorption. However, it is not advisable to use PAMPA to predict absorption involving transporter-mediated processes, such as active uptake or efflux. Measurement of PAMPA is much more rapid and cost effective than Caco-2 and log P(OCT). PAMPA assay conditions are critical in order to generate high quality and relevant data, including permeation time, assay pH, stirring, use of cosolvents and selection of detection techniques. The success of using PAMPA in drug discovery depends on careful data interpretation, use of optimal assay conditions, implementation and integration strategies, and education of users. Copyright 2007 Wiley-Liss, Inc.

  3. Geophysical investigations in deep horizontal holes drilled ahead of tunnelling

    USGS Publications Warehouse

    Carroll, R.D.; Cunningham, M.J.

    1980-01-01

    Deep horizontal drill holes have been used since 1967 by the Defense Nuclear Agency as a primary exploration tool for siting nuclear events in tunnels at the Nevada Test Site. The U.S. Geological Survey has developed geophysical logging techniques for obtaining resistivity and velocity in these holes, and to date 33 horizontal drill holes in excess of 300 m in depth have been successfully logged. The deepest hole was drilled to a horizontal depth of 1125 m. The purposes of the logging measurements are to define clay zones, because of the unstable ground conditions such zones can present to tunnelling, and to define zones of partially saturated rock, because of the attenuating effects such zones have on the shock wave generated by the nuclear detonation. Excessive attenuation is undesirable because the shock wave is used as a tunnel closure mechanism to contain debris and other undesirable explosion products. Measurements are made by pumping resistivity, sonic and geophone probes down the drill string and out of the bit into the open hole. Clay zones are defined by the electrical resistivity technique based on empirical data relating the magnitude of the resistivity measurement to qualitative clay content. Rock exhibiting resistivity of less than 20 Ω-m is considered potentially unstable, and resistivities less than 10 Ω-m indicate appreciable amounts of clay are present in the rock. Partially saturated rock zones are defined by the measurement of the rock sound speed. Zones in the rock which exhibit velocities less than 2450 m/sec are considered of potential concern. © 1980.
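The resistivity and velocity thresholds quoted above lend themselves to a simple screening rule; a sketch (the function name and return format are illustrative, not from the survey's software):

```python
def classify_zone(resistivity_ohm_m, velocity_m_s):
    """Flag tunnelling hazards from horizontal-hole logs using the thresholds
    given above: < 20 ohm-m potentially unstable, < 10 ohm-m appreciable clay,
    sound speed < 2450 m/s suggesting partial saturation."""
    flags = []
    if resistivity_ohm_m < 10:
        flags.append("appreciable clay")
    elif resistivity_ohm_m < 20:
        flags.append("potentially unstable (clay)")
    if velocity_m_s < 2450:
        flags.append("possible partial saturation")
    return flags or ["no flag"]

# Example: a low-resistivity, slow zone triggers both criteria
zone = classify_zone(8.0, 2300.0)
```

In practice such flags would be evaluated interval by interval along the logged hole to map hazardous sections ahead of the tunnel face.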

  4. Small County: Development of a Virtual Environment for Instruction in Geological Characterization of Petroleum Reservoirs

    NASA Astrophysics Data System (ADS)

    Banz, B.; Bohling, G.; Doveton, J.

    2008-12-01

    Traditional programs of geological education continue to be focused primarily on the evaluation of surface or near-surface geology accessed at outcrops and shallow boreholes. However, most students who graduate to careers in geology work almost entirely on subsurface problems, interpreting drilling records and petrophysical logs from exploration and production wells. Thus, college graduates commonly find themselves ill-prepared when they enter the petroleum industry and require specialized training in drilling and petrophysical log interpretation. To aid in this training process, we are developing an environment for interactive instruction in the geological aspects of petroleum reservoir characterization employing a virtual subsurface closely reflecting the geology of the US mid-continent, in the fictional setting of Small County, Kansas. Stochastic simulation techniques are used to generate the subsurface characteristics, including the overall geological structure, distributions of facies, porosity, and fluid saturations, and petrophysical logs. The student then explores this subsurface by siting exploratory wells and examining drilling and petrophysical log records obtained from those wells. We are developing the application using the Eclipse Rich Client Platform, which allows for the rapid development of a platform-agnostic application while providing an immersive graphical interface. The application provides an array of views to enable relevant data display and student interaction. One such view is an interactive map of the county allowing the student to view the locations of existing well bores and select pertinent data overlays such as a contour map of the elevation of an interesting interval. Additionally, from this view a student may choose the site of a new well. Another view emulates a drilling log, complete with drilling rate plot and iconic representation of examined drill cuttings. 
From here, students are directed to stipulate subsurface lithology and interval tops as they progress through the drilling operation. Once the interpretation process is complete, the student is guided through an exercise emulating a drill stem test and then is prompted to decide on perforation intervals. The application provides a graphical framework by which the student is guided through well site selection, drilling data interpretation, and well completion or dry-hole abandonment, creating a tight feedback loop by which the student gains an over-arching view of drilling logistics and the subsurface data evaluation process.

  5. Estimating tree bole and log weights from green densities measured with the Bergstrom Xylodensimeter.

    Treesearch

    Dale R. Waddell; Michael B. Lambert; W.Y. Pong

    1984-01-01

    The performance of the Bergstrom xylodensimeter, designed to measure the green density of wood, was investigated and compared with a technique that derived green densities from wood disk samples. In addition, log and bole weights of old-growth Douglas-fir and western hemlock were calculated by various formulas and compared with lifted weights measured with a load cell...

  6. Footprints in the Sky: Using Student Track Logs from a "Bird's Eye View" Virtual Field Trip to Enhance Learning

    ERIC Educational Resources Information Center

    Treves, Richard; Viterbo, Paolo; Haklay, Mordechai

    2015-01-01

    Research into virtual field trips (VFTs) started in the 1990s but, only recently, the maturing technology of devices and networks has made them viable options for educational settings. By considering an experiment, the learning benefits of logging the movement of students within a VFT are shown. The data are visualized by two techniques:…

  7. Assessing wood quality of borer-infested red oak logs with a resonance acoustic technique

    Treesearch

    Xiping Wang; Henry E. Stelzer; Jan Wiedenbeck; Patricia K. Lebow; Robert J. Ross

    2009-01-01

    Large numbers of black oak (Quercus velutina Lam.) and scarlet oak (Quercus coccinea Muenchh.) trees are declining and dying in the Missouri Ozark forest as a result of oak decline. Red oak borer-infested trees produce low-grade logs that become extremely difficult to merchandize as the level of insect attack increases. The objective of this study was to investigate...

  8. Exploring Online Students' Self-Regulated Learning with Self-Reported Surveys and Log Files: A Data Mining Approach

    ERIC Educational Resources Information Center

    Cho, Moon-Heum; Yoo, Jin Soung

    2017-01-01

    Many researchers who are interested in studying students' online self-regulated learning (SRL) have heavily relied on self-reported surveys. Data mining is an alternative technique that can be used to discover students' SRL patterns from large data logs saved on a course management system. The purpose of this study was to identify students' online…

  9. Cable yarding residue after thinning young stands: a break-even simulation

    Treesearch

    Chris B. LeDoux

    1984-01-01

    The use of cable logging to extract small pieces of residue wood may result in low rates of production and a high cost per unit of wood produced. However, the logging manager can improve yarding productivity and break even in cable residue removal operations by using the proper planning techniques. In this study, breakeven zones for specific young-growth stands were...

  10. Solving Graph Laplacian Systems Through Recursive Bisections and Two-Grid Preconditioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ponce, Colin; Vassilevski, Panayot S.

    2016-02-18

    We present a parallelizable direct method for computing the solution to graph Laplacian-based linear systems derived from graphs that can be hierarchically bipartitioned with small edge cuts. For a graph of size n with constant-size edge cuts, our method decomposes a graph Laplacian in time O(n log n), and then uses that decomposition to perform a linear solve in time O(n log n). We then use the developed technique to design a preconditioner for graph Laplacians that do not have this property. Finally, we augment this preconditioner with a two-grid method that accounts for much of the preconditioner's weaknesses. We present an analysis of this method, as well as a general theorem for the condition number of a general class of two-grid support graph-based preconditioners. Numerical experiments illustrate the performance of the studied methods.
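
    For readers unfamiliar with graph Laplacian systems, the sketch below builds L = D - A for a small path graph and solves the grounded system directly with dense linear algebra. It illustrates the kind of system being solved, not the paper's O(n log n) recursive-bisection decomposition or two-grid preconditioner.

```python
import numpy as np

def graph_laplacian(n, edges):
    """Build the Laplacian L = D - A of an undirected, unit-weight graph."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1.0
        L[j, j] += 1.0
        L[i, j] -= 1.0
        L[j, i] -= 1.0
    return L

# Path graph on 5 nodes. Laplacians are singular (constant nullspace),
# so we "ground" node 0 (delete its row/column) before solving L x = b.
n = 5
L = graph_laplacian(n, [(i, i + 1) for i in range(n - 1)])
b = np.array([0.0, 1.0, 0.0, 0.0, -1.0])   # b must sum to zero
x = np.zeros(n)
x[1:] = np.linalg.solve(L[1:, 1:], b[1:])   # grounded solve, x[0] fixed at 0

residual = np.linalg.norm(L @ x - b)
```

    For a connected graph and a zero-sum right-hand side, the grounded solve recovers an exact solution of the full singular system.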

  11. Capillary-induced crack healing between surfaces of nanoscale roughness.

    PubMed

    Soylemez, Emrecan; de Boer, Maarten P

    2014-10-07

    Capillary forces are important in nature (granular materials, insect locomotion) and in technology (disk drives, adhesion). Although well studied in equilibrium state, the dynamics of capillary formation merit further investigation. Here, we show that microcantilever crack healing experiments are a viable experimental technique for investigating the influence of capillary nucleation on crack healing between rough surfaces. The average crack healing velocity, v̅, between clean hydrophilic polycrystalline silicon surfaces of nanoscale roughness is measured. A plot of v̅ versus energy release rate, G, reveals log-linear behavior, while the slope |d[log(v̅)]/dG| decreases with increasing relative humidity. A simplified interface model that accounts for the nucleation time of water bridges by an activated process is developed to gain insight into the crack healing trends. This methodology enables us to gain insight into capillary bridge dynamics, with a goal of attaining a predictive capability for this important microelectromechanical systems (MEMS) reliability failure mechanism.
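
    The log-linear behavior described (log v̅ versus G) amounts to fitting a straight line to log-velocity data. A sketch on synthetic data, with assumed illustrative constants that are not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical crack-healing data obeying log10(v) = a + b * G (log-linear)
a_true, b_true = -8.0, 20.0          # illustrative values, not from the paper
G = np.linspace(0.01, 0.10, 25)      # energy release rate (assumed units)
log_v = a_true + b_true * G + rng.normal(0.0, 0.05, G.size)

# Least-squares fit of the log-linear model; |d log(v)/dG| is the slope size
b_fit, a_fit = np.polyfit(G, log_v, 1)
slope_magnitude = abs(b_fit)
```

    In the experiments, the humidity dependence would appear as this slope magnitude decreasing with increasing relative humidity.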

  12. Investigation of methods and approaches for collecting and recording highway inventory data.

    DOT National Transportation Integrated Search

    2013-06-01

    Many techniques for collecting highway inventory data have been used by state and local agencies in the U.S. These techniques include field inventory, photo/video log, integrated GPS/GIS mapping systems, aerial photography, satellite imagery, vir...

  13. Flood frequency analysis using optimization techniques : final report.

    DOT National Transportation Integrated Search

    1992-10-01

    This study consists of three parts. In the first part, a comprehensive investigation was made to find an improved estimation method for the log-Pearson type 3 (LP3) distribution by using optimization techniques. Ninety sets of observed Louisiana floo...
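
    As background, the conventional baseline that such optimization studies try to improve on is a method-of-moments fit of the LP3 distribution to log-transformed flows. A minimal sketch with hypothetical flood data (not the Louisiana records):

```python
import math

def lp3_moments(flows):
    """Method-of-moments estimates for the log-Pearson type 3 distribution.
    Fits a Pearson III (shifted gamma) to y = log10(Q): shape alpha,
    scale beta (signed like the skew), and location tau."""
    y = [math.log10(q) for q in flows]
    n = len(y)
    m = sum(y) / n
    s = math.sqrt(sum((v - m) ** 2 for v in y) / (n - 1))
    # Sample skew with the standard small-sample correction factor
    g = (n / ((n - 1) * (n - 2))) * sum((v - m) ** 3 for v in y) / s**3
    alpha = 4.0 / g**2          # skew of Pearson III is 2/sqrt(alpha)
    beta = s * g / 2.0          # so that alpha * beta^2 = s^2
    tau = m - alpha * beta      # so that tau + alpha * beta = m
    return alpha, beta, tau

# Annual peak flows (hypothetical, not Louisiana data)
flows = [1200, 3400, 2100, 5600, 1800, 2500, 4200, 1500, 3100, 2700,
         1900, 6100, 2300, 2900, 3800]
alpha, beta, tau = lp3_moments(flows)
```

    An optimization-based estimator would replace these closed-form moment relations with a numerical search over (alpha, beta, tau).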

  14. Operator product expansion in Liouville field theory and Seiberg-type transitions in log-correlated random energy models

    NASA Astrophysics Data System (ADS)

    Cao, Xiangyu; Le Doussal, Pierre; Rosso, Alberto; Santachiara, Raoul

    2018-04-01

    We study transitions in log-correlated random energy models (logREMs) that are related to the violation of a Seiberg bound in Liouville field theory (LFT): the binding transition and the termination point transition (a.k.a., pre-freezing). By means of LFT-logREM mapping, replica symmetry breaking and traveling-wave equation techniques, we unify both transitions in a two-parameter diagram, which describes the free-energy large deviations of logREMs with a deterministic background log potential, or equivalently, the joint moments of the free energy and Gibbs measure in logREMs without background potential. Under the LFT-logREM mapping, the transitions correspond to the competition of discrete and continuous terms in a four-point correlation function. Our results provide a statistical interpretation of a peculiar nonlocality of the operator product expansion in LFT. The results are rederived by a traveling-wave equation calculation, which shows that the features of LFT responsible for the transitions are reproduced in a simple model of diffusion with absorption. We examine also the problem by a replica symmetry breaking analysis. It complements the previous methods and reveals a rich large deviation structure of the free energy of logREMs with a deterministic background log potential. Many results are verified in the integrable circular logREM, by a replica-Coulomb gas integral approach. The related problem of common length (overlap) distribution is also considered. We provide a traveling-wave equation derivation of the LFT predictions announced in a preceding work.

  15. Analysis of calibration materials to improve dual-energy CT scanning for petrophysical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyalasomavaiula, K.; McIntyre, D.; Jain, J.

    2011-01-01

    Dual energy CT-scanning is a rapidly emerging imaging technique employed in non-destructive evaluation of various materials. Although CT (Computerized Tomography) has been used for characterizing rocks and visualizing and quantifying multiphase flow through rocks for over 25 years, most of the scanning is done at a voltage setting above 100 kV for taking advantage of the Compton scattering (CS) effect, which responds to density changes. Below 100 kV the photoelectric effect (PE) is dominant, which responds to the effective atomic number (Zeff), which is directly related to the photoelectric factor. Using the combination of the two effects helps in better characterization of reservoir rocks. The most common technique for dual energy CT-scanning relies on homogeneous calibration standards to produce the most accurate decoupled data. However, the use of calibration standards with impurities increases the probability of error in the reconstructed data and results in poor rock characterization. This work combines ICP-OES (inductively coupled plasma optical emission spectroscopy) and LIBS (laser induced breakdown spectroscopy) analytical techniques to quantify the type and level of impurities in a set of commercially purchased calibration standards used in dual-energy scanning. The Zeff data on the calibration standards with and without impurity data were calculated using the weighted linear combination of the various elements present and used in calculating Zeff using the dual energy technique. Results show a 2 to 5% difference in predicted Zeff values, which may affect the corresponding log calibrations. The effect that these techniques have on improving material identification data is discussed and analyzed. The workflow developed in this paper will translate to more accurate material identification estimates for unknown samples and improve calibration of well logging tools.
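
    The "weighted linear combination of the various elements" used in computing Zeff can be sketched with the standard power-law mixture rule over electron fractions. The exponent 2.94 is a commonly used value for the photoelectric regime, and the water-plus-NaCl composition below is purely illustrative, not one of the paper's calibration standards.

```python
def electron_fractions(composition):
    """Electron fraction per element from {Z: number of atoms}."""
    total = sum(z * n for z, n in composition.items())
    return {z: z * n / total for z, n in composition.items()}

def z_effective(fractions, m=2.94):
    """Effective atomic number via the power-law mixture rule:
    Zeff = (sum_i f_i * Z_i**m) ** (1/m), f_i = electron fractions."""
    return sum(f * z**m for z, f in fractions.items()) ** (1.0 / m)

# Pure water (H2O) versus water with a small hypothetical NaCl impurity
zeff_pure = z_effective(electron_fractions({1: 2, 8: 1}))
zeff_impure = z_effective(electron_fractions({1: 2, 8: 1, 11: 0.01, 17: 0.01}))
shift_percent = 100.0 * (zeff_impure - zeff_pure) / zeff_pure
```

    Even a trace of a high-Z impurity shifts Zeff noticeably, which is the mechanism behind the calibration errors quantified in the study.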

  16. "Geo-statistics methods and neural networks in geophysical applications: A case study"

    NASA Astrophysics Data System (ADS)

    Rodriguez Sandoval, R.; Urrutia Fucugauchi, J.; Ramirez Cruz, L. C.

    2008-12-01

    The study focuses on the Ebano-Panuco basin of northeastern Mexico, which is being explored for hydrocarbon reservoirs. These reservoirs are in limestones, and there is interest in determining porosity and permeability in the carbonate sequences. The porosity maps presented in this study are estimated from application of multiattribute and neural-network techniques, which combine geophysical logs and 3-D seismic data by means of statistical relationships. The multiattribute analysis is a process to predict a volume of any underground petrophysical measurement from well-log and seismic data. The data consist of a series of target logs from wells which tie a 3-D seismic volume. The target logs are neutron porosity logs. From the 3-D seismic volume a series of sample attributes is calculated. The objective of this study is to derive a relationship between a set of attributes and the target log values. The selected set is determined by a process of forward stepwise regression. The analysis can be linear or nonlinear. In the linear mode the method consists of a series of weights derived by least-squares minimization. In the nonlinear mode, a neural network is trained using the selected attributes as inputs. In this case we used a probabilistic neural network (PNN). The method is applied to a real data set from PEMEX. For better reservoir characterization the porosity distribution was estimated using both techniques. The case showed a continuous improvement in the prediction of the porosity from the multiattribute to the neural network analysis. The improvement is in the training and the validation, which are important indicators of the reliability of the results. The neural network showed an improvement in resolution over the multiattribute analysis. The final maps provide more realistic results of the porosity distribution.
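
    The linear mode described, least-squares weights over attributes chosen by forward stepwise regression, can be sketched minimally on synthetic data. The attribute count, coefficients, and noise level are invented for illustration.

```python
import numpy as np

def forward_stepwise(A, target, k):
    """Pick k attribute columns greedily: at each step add the attribute that
    most reduces least-squares error, then refit the weights (linear mode)."""
    chosen, n = [], A.shape[0]
    for _ in range(k):
        best = None
        for j in range(A.shape[1]):
            if j in chosen:
                continue
            cols = chosen + [j]
            X = np.column_stack([np.ones(n), A[:, cols]])
            w, *_ = np.linalg.lstsq(X, target, rcond=None)
            err = np.linalg.norm(X @ w - target)
            if best is None or err < best[0]:
                best = (err, j, w)
        chosen.append(best[1])
        weights = best[2]          # intercept first, then chosen attributes
    return chosen, weights

rng = np.random.default_rng(2)
attrs = rng.standard_normal((200, 6))            # 6 candidate seismic attributes
# Synthetic neutron-porosity target driven by attributes 1 and 4 only
phi = 0.20 + 0.03 * attrs[:, 1] - 0.02 * attrs[:, 4] + rng.normal(0, 0.002, 200)
chosen, weights = forward_stepwise(attrs, phi, k=2)
```

    The nonlinear PNN mode would replace the final least-squares fit with a network trained on the same selected attributes.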

  17. Geophysical Log Data from Basalt Aquifers Near Waipahu on the Island of Oahu and Pahoa on the Island of Hawaii, Hawaii

    USGS Publications Warehouse

    Paillet, Frederick L.; Hess, Alfred E.

    1995-01-01

    Two relatively new geophysical logging techniques, the digitally enhanced borehole acoustic televiewer and the heat-pulse flowmeter, were tested from 1987 to 1991 at two sites in Hawaii: Waipahu on the island of Oahu, and Pahoa on the island of Hawaii. Although these data were obtained in an effort to test and improve these two logging techniques, the measurements are of interest to hydrologists studying the aquifers in Hawaii. This report presents a review of the measurements conducted during this effort and summarizes the data obtained in a form designed to make that data available to hydrologists studying the movement of ground water in Hawaiian aquifers. Caliper logs obtained at the Waipahu site indicate the distribution of openings in interbed clinker zones between relatively dense and impermeable basalt flows. The flowmeter data indicate the pattern of flow induced along seven observation boreholes that provide conduits between interbed zones in the vicinity of the Mahoe Pumping Station at the Waipahu site. The televiewer image logs obtained in some of the Waipahu Mahoe boreholes do not show any significant vertical or steeply dipping fractures that might allow communication across the dense interior of basalt flows. Acoustic televiewer logs obtained at the Pahoa site show that a number of steeply dipping fractures and dikes cut across basalt flows. Although flow under ambient hydraulic-head conditions in the Waipahu Mahoe Observation boreholes is attributed to hydraulic gradients associated with pumping from a nearby pumping station, flow in the Waipio Deep Observation borehole on Oahu and flow in the Scientific Observation borehole on Hawaii are attributed to the effects of natural recharge and downward decreasing hydraulic heads associated with that recharge.

  18. Intraocular straylight and contrast sensitivity after contralateral wavefront-guided LASIK and wavefront-guided PRK for myopia.

    PubMed

    Barreto, Jackson; Barboni, Mirella T S; Feitosa-Santana, Claudia; Sato, João R; Bechara, Samir J; Ventura, Dora F; Alves, Milton Ruiz

    2010-08-01

    To compare intraocular straylight measurements and contrast sensitivity after wavefront-guided LASIK (WFG LASIK) in one eye and wavefront-guided photorefractive keratectomy (WFG PRK) in the fellow eye for myopia and myopic astigmatism correction. A prospective, randomized study of 22 eyes of 11 patients who underwent simultaneous WFG LASIK and WFG PRK (contralateral eye). Both groups were treated with the NIDEK Advanced Vision Excimer Laser System, and a microkeratome was used for flap creation in the WFG LASIK group. High and low contrast visual acuity, wavefront analysis, contrast sensitivity, and retinal straylight measurements were performed preoperatively and at 3, 6, and 12 months postoperatively. A third-generation straylight meter, C-Quant (Oculus Optikgeräte GmbH), was used for measuring intraocular straylight. Twelve months postoperatively, mean uncorrected distance visual acuity was -0.06 +/- 0.07 logMAR in the WFG LASIK group and -0.10 +/- 0.10 logMAR in the WFG PRK group. Mean preoperative intraocular straylight was 0.94 +/- 0.12 log(s) for the WFG LASIK group and 0.96 +/- 0.11 log(s) for the WFG PRK group. After 12 months, the mean straylight value was 1.01 +/- 0.1 log(s) for the WFG LASIK group and 0.97 +/- 0.12 log(s) for the WFG PRK group. No difference was found between techniques after 12 months (P = .306). No significant difference in photopic and mesopic contrast sensitivity between groups was noted. Intraocular straylight showed no statistically significant increase 1 year after WFG LASIK and WFG PRK. Higher order aberrations increased significantly after surgery for both groups. Nevertheless, WFG LASIK and WFG PRK yielded excellent visual acuity and contrast sensitivity performance without significant differences between techniques.

  19. Acoustic waveform logging--Advances in theory and application

    USGS Publications Warehouse

    Paillet, F.L.; Cheng, C.H.; Pennington , W.D.

    1992-01-01

    Full-waveform acoustic logging has made significant advances in both theory and application in recent years, and these advances have greatly increased the capability of log analysts to measure the physical properties of formations. Advances in theory provide the analytical tools required to understand the properties of measured seismic waves, and to relate those properties to such quantities as shear and compressional velocity and attenuation, and primary and fracture porosity and permeability of potential reservoir rocks. The theory demonstrates that all parts of recorded waveforms are related to various modes of propagation, even in the case of dipole and quadrupole source logging. However, the theory also indicates that these mode properties can be used to design velocity and attenuation picking schemes, and shows how source frequency spectra can be selected to optimize results in specific applications. Synthetic microseismogram computations are an effective tool in waveform interpretation theory; they demonstrate how shear arrival picks and mode attenuation can be used to compute shear velocity and intrinsic attenuation, and formation permeability for monopole, dipole and quadrupole sources. Array processing of multi-receiver data offers the opportunity to apply even more sophisticated analysis techniques. Synthetic microseismogram data is used to illustrate the application of the maximum-likelihood method, semblance cross-correlation, and Prony's method analysis techniques to determine seismic velocities and attenuations. The interpretation of acoustic waveform logs is illustrated by reviews of various practical applications, including synthetic seismogram generation, lithology determination, estimation of geomechanical properties in situ, permeability estimation, and design of hydraulic fracture operations.
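
    Of the array-processing techniques named, semblance is the simplest to sketch: scan trial slownesses, shift the receiver traces into alignment, and measure the coherence of the stack. The sampling interval, receiver offsets, and Gaussian pulse below are assumed for illustration, not field parameters.

```python
import numpy as np

def semblance(traces, dt, offsets, slowness):
    """Semblance coherence for one trial slowness: shift each trace back by
    slowness*offset (in samples), then compare stacked to total energy."""
    n_t = traces.shape[1]
    shifted = np.zeros_like(traces)
    for r, off in enumerate(offsets):
        shift = int(round(slowness * off / dt))
        shifted[r, : n_t - shift] = traces[r, shift:]
    num = (shifted.sum(axis=0) ** 2).sum()
    den = traces.shape[0] * (shifted ** 2).sum()
    return num / den

dt = 1e-5                                     # 10 us sampling (assumed)
offsets = np.array([0.0, 0.15, 0.30, 0.45])   # receiver offsets, metres
true_slowness = 4e-4                          # s/m, i.e. 2500 m/s
t = np.arange(0.0, 2e-3, dt)
# Gaussian-pulse arrivals moving out across the receiver array
traces = np.array([np.exp(-((t - 5e-4 - true_slowness * off) / 5e-5) ** 2)
                   for off in offsets])

trial = np.linspace(2e-4, 6e-4, 41)
coh = np.array([semblance(traces, dt, offsets, s) for s in trial])
picked_slowness = trial[coh.argmax()]
```

    Real implementations interpolate sub-sample shifts and window the traces around each arrival; this integer-shift version is only the core idea.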

  20. A Guide to Hardwood Log Grading

    Treesearch

    Everette D. Rast; David L. Sonderman; Glenn L. Gammon

    1973-01-01

    A guide to hardwood log grading (revised) was developed as a teaching aid and field reference in grading hardwood logs. Outlines basic principles and gives detailed practical applications, with illustrations, in grading hardwood logs. Includes standards for various use classes.

  1. Application of borehole geophysics in defining the wellhead protection area for a fractured crystalline bedrock aquifer

    USGS Publications Warehouse

    Vernon, J.H.; Paillet, F.L.; Pedler, W.H.; Griswold, W.J.

    1993-01-01

    Wellbore geophysical techniques were used to characterize fractures and flow in a bedrock aquifer at a site near Blackwater Brook in Dover, New Hampshire. The primary focus of this study was the development of a model to assist in evaluating the area surrounding a planned water supply well where contaminants introduced at the land surface might be induced to flow towards a pumping well. Well logs and geophysical surveys used in this study included lithologic logs based on examination of cuttings obtained during drilling; conventional caliper and natural gamma logs; video camera and acoustic televiewer surveys; high-resolution vertical flow measurements under ambient conditions and during pumping; and borehole fluid conductivity logs obtained after the borehole fluid was replaced with deionized water. These surveys were used for several applications: 1) to define a conceptual model of aquifer structure to be used in groundwater exploration; 2) to estimate optimum locations for test and observation wells; and 3) to delineate a wellhead protection area (WHPA) for a planned water supply well. Integration of borehole data with surface geophysical and geological mapping data indicated that the study site lies along a northeast-trending intensely fractured contact zone between surface exposures of quartz monzonite and metasedimentary rocks. Four of five bedrock boreholes at the site were estimated to produce more than 150 gallons per minute (gpm) (568 L/min) of water during drilling. Aquifer testing and other investigations indicated that water flowed to the test well along fractures parallel to the northeast-trending contact zone and along other northeast and north-northwest-trending fractures. Statistical plots of fracture strikes showed frequency maxima in the same northeast and north-northwest directions, although additional maxima occurred in other directions.
Flowmeter surveys and borehole fluid conductivity logging after fluid replacement were used to identify water-producing zones in the boreholes; fractures associated with inflow into boreholes showed a dominant northeast orientation. Borehole fluid conductivity logging after fluid replacement also gave profiles of such water-quality parameters as fluid electrical conductivity (FEC), pH, temperature, and oxidation-reduction potential, strengthening the interpretation of cross-connection of boreholes by certain fracture zones. The results of this study showed that the application of these borehole geophysical techniques at the Blackwater Brook site led to an improved understanding of such parameters as fracture location, attitude, flow direction and velocity, and water quality, all of which are important in the determination of a WHPA.

  2. Optimization of antibacterial activity by Gold-Thread (Coptidis Rhizoma Franch) against Streptococcus mutans using evolutionary operation-factorial design technique.

    PubMed

    Choi, Ung-Kyu; Kim, Mi-Hyang; Lee, Nan-Hee

    2007-11-01

    This study was conducted to find the optimum extraction condition of Gold-Thread for antibacterial activity against Streptococcus mutans using the evolutionary operation-factorial design (EVOP-factorial) technique. Higher antibacterial activity was achieved at a higher extraction temperature (R2 = -0.79) and with a longer extraction time (R2 = -0.71). Antibacterial activity was not affected by the ethanol concentration of the extraction solvent (R2 = -0.12). The maximum antibacterial activity of Gold-Thread against S. mutans determined by the EVOP-factorial technique was obtained at 80 degrees C extraction temperature, 26 h extraction time, and 50% ethanol concentration. The population of S. mutans decreased from 6.110 logCFU/ml in the initial set to 4.125 logCFU/ml in the third set.

  3. Application of Fracture Distribution Prediction Model in Xihu Depression of East China Sea

    NASA Astrophysics Data System (ADS)

    Yan, Weifeng; Duan, Feifei; Zhang, Le; Li, Ming

    2018-02-01

    Logging data respond differently to changes in formation characteristics, and the presence of fractures produces outliers in the curves. For this reason, the development of fractures in a formation can be characterized by fine analysis of the logging curves. Well logs such as resistivity, sonic transit time, density, neutron porosity, and gamma ray, which are classified as conventional well logs, are more sensitive to formation fractures. Because the traditional fracture prediction model, which uses a simple weighted average of different logging data to calculate a comprehensive fracture index, is susceptible to subjective factors and can deviate substantially, a statistical method is introduced. Combining the responses of conventional logging data to the development of formation fractures, a prediction model based on membership functions is established; its essence is to analyse logging data with fuzzy mathematics theory. Fracture predictions for a well in the NX block of the Xihu depression from the two models are compared with imaging logging results, which shows that the accuracy of the membership-function model is better than that of the traditional model. Furthermore, its predictions are highly consistent with the imaging logs and better reflect the development of fractures. It can provide a reference for engineering practice.
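
    A membership-function fracture index of the general kind described can be sketched as follows. The ramp shapes, log thresholds, and equal-weight combination are hypothetical illustrations, not the paper's calibrated model.

```python
import numpy as np

def ramp(x, lo, hi):
    """Linear membership rising from 0 at lo to 1 at hi (clipped)."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def fracture_index(sample):
    """Combine per-log memberships into one index (thresholds illustrative):
    fractures tend to lower resistivity and density and to raise sonic
    transit time, so each membership is oriented toward 'fractured'."""
    m_res = ramp(-sample["resistivity"], -50.0, -5.0)   # low resistivity -> 1
    m_dt = ramp(sample["sonic_dt"], 55.0, 90.0)         # high transit time -> 1
    m_den = ramp(-sample["density"], -2.7, -2.3)        # low density -> 1
    return float(np.mean([m_res, m_dt, m_den]))

fractured = {"resistivity": 8.0, "sonic_dt": 85.0, "density": 2.35}
intact = {"resistivity": 45.0, "sonic_dt": 58.0, "density": 2.65}
fi_fractured = fracture_index(fractured)
fi_intact = fracture_index(intact)
```

    Unlike a fixed weighted average of raw log values, the memberships normalize each curve to the same [0, 1] scale before they are combined.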

  4. The Wettzell System Monitoring Concept and First Realizations

    NASA Technical Reports Server (NTRS)

    Ettl, Martin; Neidhardt, Alexander; Muehlbauer, Matthias; Ploetz, Christian; Beaudoin, Christopher

    2010-01-01

    Automated monitoring of operational system parameters for the geodetic space techniques is becoming more important in order to improve the geodetic data and to ensure the safety and stability of automatic and remote-controlled observations. Therefore, the Wettzell group has developed the system monitoring software, SysMon, which is based on a reliable, remotely-controllable hardware/software realization. A multi-layered data logging system based on a fanless, robust industrial PC with an internal database system is used to collect data from several external, serial, bus, or PCI-based sensors. The internal communication is realized with Remote Procedure Calls (RPC) and uses generative programming with the interface software generator idl2rpc.pl developed at Wettzell. Each data monitoring stream can be configured individually via configuration files to define the logging rates or analog-digital-conversion parameters. First realizations are currently installed at the new laser ranging system at Wettzell to address safety issues and at the VLBI station O'Higgins as a meteorological data logger. The system monitoring concept should be realized for the Wettzell radio telescope in the near future.

  5. Creative Analytics of Mission Ops Event Messages

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2017-01-01

    Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, and relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day, and the numbers are expected to dramatically increase as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data dropouts or system failures, and much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information systems technologies make it appropriate to now develop analytic capabilities which can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaborations, and lead to improved operations strategies: Industry Standard for Log Messages. The Object Management Group (OMG) Space Domain Task Force (SDTF) standards organization is in the process of creating a formal standard for industry for event log messages. The format is based on work at NASA GSFC. Open System Architectures. The DoD, NASA, and others are moving towards common open system architectures for mission ground data systems based on work at NASA GSFC with the full support of the commercial product industry and major integration contractors. Text Analytics. A specific area of data analytics which applies statistical, linguistic, and structural techniques to extract and classify information from textual sources.
This presentation describes work now underway at NASA to increase situational awareness through the collection of non-telemetry mission operations information into a common log format and then providing display and analytics tools to provide in-depth assessment of the log contents. The work includes: Common interface formats for acquiring time-tagged text messages Conversion of common files for schedules, orbital events, and stored commands to the common log format Innovative displays to depict thousands of messages on a single display Structured English text queries against the log message data store, extensible to a more mature natural language query capability Goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.
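
    A minimal sketch of the first steps above: acquiring time-tagged text messages in a common record format, then running a simple analytic over them. The pipe-delimited field layout here is an assumption for illustration, not the OMG/SDTF standard format, and the messages are invented.

```python
from datetime import datetime, timezone

def parse_event(line):
    """Parse 'ISO-time | source | severity | text' into a common record.
    The field layout is illustrative, not the emerging OMG standard's."""
    ts, source, severity, text = (p.strip() for p in line.split("|", 3))
    return {"time": datetime.fromisoformat(ts).replace(tzinfo=timezone.utc),
            "source": source, "severity": severity, "text": text}

raw = [
    "2017-03-01T10:00:02 | ORBIT  | INFO  | Entered station contact window",
    "2017-03-01T10:00:05 | CMD    | INFO  | Command load uplinked",
    "2017-03-01T10:03:41 | CMD    | WARN  | Command resent after timeout",
    "2017-03-01T10:07:12 | GROUND | ERROR | Data dropout on stream 2",
]
events = [parse_event(line) for line in raw]

# Simple analytics: message counts per source, and warnings-or-worse
counts = {}
for e in events:
    counts[e["source"]] = counts.get(e["source"], 0) + 1
alerts = [e for e in events if e["severity"] in ("WARN", "ERROR")]
```

    Once schedules, orbital events, and stored commands are converted into the same record form, the same counting and filtering works across all of them.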

  6. QUANTIFICATION OF IN-SITU GAS HYDRATES WITH WELL LOGS.

    USGS Publications Warehouse

    Collett, Timothy S.; Godbole, Sanjay P.; Economides, Christine

    1984-01-01

    This study evaluates in detail the expected theoretical log responses and the actual log responses within one stratigraphically controlled hydrate horizon in six wells spaced throughout the Kuparuk Oil Field. Detailed examination of the neutron porosity and sonic velocity responses within the horizon is included. In addition, the theoretical effect of the presence of hydrates on the neutron porosity and sonic velocity devices has been examined in order to correct for such an effect on the calculation of formation properties such as porosity and hydrate saturation. Also presented in the paper is a technique which allows the conclusive identification of a potential hydrate occurrence.

  7. Requirements-Driven Log Analysis Extended Abstract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2012-01-01

    Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is then likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
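
    Checking a log against a specification, as described, can be sketched with a simple response property: "every command sent is eventually acknowledged." Events are data records (field-name to value mappings); the field names and event types below are hypothetical, and this is not the author's actual tooling.

```python
def check_response(log, trigger, response, key):
    """Check the property: every `trigger` event is eventually followed by a
    `response` event carrying the same `key` value. Returns the indices of
    trigger events that were never answered (empty list means the log passes)."""
    pending = {}                      # key value -> index of unmatched trigger
    for i, event in enumerate(log):
        if event["type"] == trigger:
            pending[event[key]] = i
        elif event["type"] == response:
            pending.pop(event[key], None)
    return sorted(pending.values())

good_log = [
    {"type": "cmd_sent", "id": 7},
    {"type": "cmd_sent", "id": 8},
    {"type": "cmd_ack", "id": 7},
    {"type": "cmd_ack", "id": 8},
]
bad_log = good_log[:3]                # id 8 is never acknowledged

ok_violations = check_response(good_log, "cmd_sent", "cmd_ack", "id")
bad_violations = check_response(bad_log, "cmd_sent", "cmd_ack", "id")
```

    The checker runs entirely offline on a recorded log, which is exactly why it has low impact on the existing testing process.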

  8. Self organizing map neural networks approach for lithologic interpretation of nuclear and electrical well logs in basaltic environment, Southern Syria.

    PubMed

    Asfahani, J; Ahmad, Z; Ghani, B Abdul

    2018-07-01

    An approach based on self-organizing map (SOM) artificial neural networks is proposed for interpreting nuclear and electrical well-logging data. The well-logging measurements of the Kodana well in Southern Syria have been interpreted by applying the proposed approach. A lithological cross-section model of the basaltic environment has been derived, and four different kinds of basalt have consequently been distinguished: hard massive basalt, hard basalt, pyroclastic basalt, and the basalt alteration product, clay. The results obtained by SOM artificial neural networks are in good agreement with previously published results obtained by other techniques. The SOM approach was applied successfully in the case study of the Kodana well-logging data, and can therefore be recommended as a suitable and effective approach for handling huge well-logging data sets with the large number of variables required for lithological discrimination purposes. Copyright © 2018 Elsevier Ltd. All rights reserved.
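
    A minimal 1-D SOM can be sketched in plain numpy: for each sample, find its best-matching unit (BMU) and pull that unit and its map neighbors toward the sample with decaying learning rate and neighborhood width. The two synthetic "lithologies" and the three-log feature space below are invented stand-ins, not the Kodana data.

```python
import numpy as np

def train_som(data, n_units=4, epochs=40, lr0=0.5, sigma0=1.5, seed=3):
    """Train a tiny 1-D self-organizing map on row-vector samples."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n_units, data.shape[1])) * 0.1 + data.mean(axis=0)
    grid = np.arange(n_units)
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)                  # decaying rate
        sigma = max(sigma0 * (1.0 - epoch / epochs), 0.3)  # shrinking radius
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma**2))
            w += lr * h[:, None] * (x - w)                 # pull neighborhood
    return w

def bmu_of(w, x):
    return int(np.argmin(((w - x) ** 2).sum(axis=1)))

rng = np.random.default_rng(4)
# Two synthetic "lithologies" in (gamma, resistivity, density) feature space
hard_basalt = rng.normal([20.0, 500.0, 2.9], [2.0, 30.0, 0.05], (30, 3))
clay = rng.normal([90.0, 10.0, 2.2], [5.0, 2.0, 0.05], (30, 3))
data = np.vstack([hard_basalt, clay])
# Standardize so no single log dominates the distance computation
data = (data - data.mean(axis=0)) / data.std(axis=0)
som = train_som(data)
bmu_hard = bmu_of(som, data[:30].mean(axis=0))
bmu_clay = bmu_of(som, data[30:].mean(axis=0))
```

    After training, distinct lithologies map to distinct units, which is what allows the map to act as a lithological classifier for unlabeled log intervals.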

  9. Seismic lateral prediction in chalky limestone reservoirs offshore Qatar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubbens, I.B.H.M.; Murat, R.C.; Vankeulen, J.

    Following the discovery of non-structurally trapped oil accumulations in Cretaceous chalky reservoirs on the northern flank of the North Dome offshore QATAR, a seismic lateral prediction study was carried out for QATAR GENERAL PETROLEUM CORPORATION (Offshore Operations). The objectives of this study were to assist in the appraisal of these oil accumulations by predicting their possible lateral extent and to investigate if the technique applied could be used as a basis for further exploration of similar oil prospects in the area. Wireline logs of eight wells and some 1000 km of high quality seismic data were processed into acoustic impedance (A.I.) logs and seismic A.I. sections. Having obtained a satisfactory match of the A.I. well logs and the A.I. of the seismic traces at the well locations, relationships were established by the use of well log data which allowed the interpretation of the seismic A.I. in terms of reservoir quality. Measurements of the relevant A.I. characteristics were then carried out by computer along all seismic lines and porosity distribution maps prepared for some of the reservoirs. These maps, combined with detailed seismic depth contour maps at reservoir tops, led to the definition of good reservoir development areas downdip from poor reservoir quality zones, i.e. of the stratigraphic trap areas, and drilling locations could thus be proposed. The system remains to be adequately calibrated when core material becomes available in the area of study.

  10. Balancing conservation and economic sustainability: the future of the Amazon timber industry.

    PubMed

    Merry, Frank; Soares-Filho, Britaldo; Nepstad, Daniel; Amacher, Gregory; Rodrigues, Hermann

    2009-09-01

    Logging has been a much maligned feature of frontier development in the Amazon. Most discussions ignore the fact that logging can be part of a renewable, environmentally benign, and broadly equitable economic activity in these remote places. We estimate there to be some 4.5 ± 1.35 billion m³ of commercial timber volume in the Brazilian Amazon today, of which 1.2 billion m³ is currently profitable to harvest, with a total potential stumpage value of $15.4 billion. A successful forest sector in the Brazilian Amazon will integrate timber harvesting on private lands and on unprotected and unsettled government lands with timber concessions on public lands. If a legal, productive, timber industry can be established outside of protected areas, it will deliver environmental benefits in synergy with those provided by the region's network of protected areas, the latter of which we estimate to have an opportunity cost from lost timber revenues of $2.3 billion over 30 years. Indeed, on all land accessible to harvesting, the timber industry could produce an average of more than 16 million m³ per year over a 30-year harvest cycle, entirely outside of current protected areas, providing $4.8 billion in returns to landowners and generating $1.8 billion in sawnwood sales tax revenue. This level of harvest could be profitably complemented with an additional 10% from logging concessions on National Forests. This advance, however, should be realized only through widespread adoption of reduced impact logging techniques.

  11. Balancing Conservation and Economic Sustainability: The Future of the Amazon Timber Industry

    NASA Astrophysics Data System (ADS)

    Merry, Frank; Soares-Filho, Britaldo; Nepstad, Daniel; Amacher, Gregory; Rodrigues, Hermann

    2009-09-01

    Logging has been a much maligned feature of frontier development in the Amazon. Most discussions ignore the fact that logging can be part of a renewable, environmentally benign, and broadly equitable economic activity in these remote places. We estimate there to be some 4.5 ± 1.35 billion m³ of commercial timber volume in the Brazilian Amazon today, of which 1.2 billion m³ is currently profitable to harvest, with a total potential stumpage value of $15.4 billion. A successful forest sector in the Brazilian Amazon will integrate timber harvesting on private lands and on unprotected and unsettled government lands with timber concessions on public lands. If a legal, productive, timber industry can be established outside of protected areas, it will deliver environmental benefits in synergy with those provided by the region’s network of protected areas, the latter of which we estimate to have an opportunity cost from lost timber revenues of $2.3 billion over 30 years. Indeed, on all land accessible to harvesting, the timber industry could produce an average of more than 16 million m³ per year over a 30-year harvest cycle—entirely outside of current protected areas—providing $4.8 billion in returns to landowners and generating $1.8 billion in sawnwood sales tax revenue. This level of harvest could be profitably complemented with an additional 10% from logging concessions on National Forests. This advance, however, should be realized only through widespread adoption of reduced impact logging techniques.

  12. Teaching an Old Log New Tricks with Machine Learning.

    PubMed

    Schnell, Krista; Puri, Colin; Mahler, Paul; Dukatz, Carl

    2014-03-01

    To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights-no longer limited to the realm of reactive IT management, but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source.
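    One common first step in such log-file mining, shown here as an illustrative sketch rather than Accenture's actual framework, is to mask the variable fields of heterogeneous log lines so they collapse into a small set of recurring templates whose frequencies can feed learning and visualization:

```python
import re
from collections import Counter

# Illustrative sketch: mask variable fields (IP addresses, numbers) so
# that heterogeneous log lines collapse into recurring templates; the
# template frequencies can then drive anomaly detection or dashboards.

def template(line):
    line = re.sub(r"\d+\.\d+\.\d+\.\d+", "<IP>", line)  # IP addresses
    line = re.sub(r"\d+", "<NUM>", line)                # other numbers
    return line

log_lines = [
    "conn from 10.0.0.1 port 443",
    "conn from 10.0.0.2 port 80",
    "disk usage at 91 percent",
]
counts = Counter(template(l) for l in log_lines)
# {'conn from <IP> port <NUM>': 2, 'disk usage at <NUM> percent': 1}
```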

  13. Hardwood log grading scale stick improved

    Treesearch

    M. D. Ostrander; G. H. Englerth

    1953-01-01

    In February 1952 the Northeastern Forest Experiment Station described (Research Note 13) a new log-grading scale stick developed by the Station for use as a visual aid in grading hardwood factory logs. It was based on the U.S. Forest Products Laboratory's log-grade specifications.

  14. Long-term impacts of recurrent logging and fire in Amazon forests: a modeling study using the Ecosystem Demography Model (ED2)

    NASA Astrophysics Data System (ADS)

    Longo, M.; Keller, M.; Scaranello, M. A., Sr.; dos-Santos, M. N.; Xu, Y.; Huang, M.; Morton, D. C.

    2017-12-01

    Logging and understory fires are major drivers of tropical forest degradation, reducing carbon stocks and changing forest structure, composition, and dynamics. In contrast to deforested areas, sites that are disturbed by logging and fires retain some, albeit severely altered, forest structure and function. In this study we simulated selective logging using the Ecosystem Demography Model (ED-2) to investigate the impact of a broad range of logging techniques, harvest intensities, and recurrence cycles on the long-term dynamics of Amazon forests, including the magnitude and duration of changes in forest flammability following timber extraction. Model results were evaluated using eddy covariance towers at logged sites at the Tapajos National Forest in Brazil and data on long-term dynamics reported in the literature. ED-2 is able to reproduce both the fast (<5 yr) recovery of water and energy fluxes, compared to flux tower measurements, and the typical field-observed decadal time scales for biomass recovery when no additional logging occurs. Preliminary results using the original ED-2 fire model show that the canopy cover loss of forests under high-intensity, conventional logging causes sufficient drying to support more intense fires. These results indicate that under intense degradation, forests may shift to novel disturbance regimes, severely reducing carbon stocks, and inducing long-term changes in forest structure and composition from recurrent fires.

  15. Helmet and shoulder pad removal in football players with unstable cervical spine injuries.

    PubMed

    Dahl, Michael C; Ananthakrishnan, Dheera; Nicandri, Gregg; Chapman, Jens R; Ching, Randal P

    2009-05-01

    Football, one of the country's most popular team sports, is associated with the largest overall number of sports-related, catastrophic, cervical spine injuries in the United States (Mueller, 2007). Patient handling can be hindered by the protective sports equipment worn by the athlete. Improper stabilization of these patients can exacerbate neurologic injury. Because of the lack of consensus on the best method for equipment removal, a study was performed comparing three techniques: full body levitation, upper torso tilt, and log roll. These techniques were performed on an intact and lesioned cervical spine cadaveric model simulating conditions in the emergency department. The levitation technique was found to produce motion in the anterior and right lateral directions. The tilt technique resulted in motions in the posterior and left lateral directions, and the log roll technique generated motions in the right lateral direction and had the largest amount of increased instability when comparing the intact and lesioned specimens. These findings suggest that each method of equipment removal displays unique weaknesses that the practitioner should take into account, possibly on a patient-by-patient basis.

  16. The geomorphic and ecological effectiveness of habitat rehabilitation works: Continuous measurement of scour and fill around large logs in sand-bed streams

    NASA Astrophysics Data System (ADS)

    Borg, Dan; Rutherfurd, Ian; Stewardson, Mike

    2007-09-01

    Geomorphologists, ecologists and engineers have all contributed to stream rehabilitation projects by predicting the physical effect of habitat restoration structures. In this study we report the results of a stream rehabilitation project on the Snowy River, SE Australia, which aims to improve fish habitat and facilitate migration associated with scour holes around large wood in the streambed. Whilst engineering models allow us to predict maximum scour, the key management issue here was not the maximum scour depth but whether the holes persisted at a range of flows, and if they were present when fish actually required them. This led to the development of a new method to continuously monitor scour in a sand-bed, using a buried pressure transducer. In this study we monitored fluctuations in the bed level below three large logs (1 m diameter) on the Snowy River. Each log had a different scour mechanism: a plunge pool, a horseshoe vortex (analogous to a bridge pier), and a submerged jet beneath the log. The continuous monitoring demonstrated a complex relationship between discharge and pool scour. The horseshoe vortex pool maintained a constant level, whilst, contrary to expectations, both the plunge pool and the submerged jet pool gradually filled over the 12 months. Filling was associated with the average rise in flows in winter, and occurred despite several freshes and discharge spikes. The plunge pool showed the most variation, with bed levels fluctuating by over 1 m. A key factor in pool scour here may not be the local water depth at the log, but the position of the log in relation to larger scale movements of sand-waves in the stream. These results question assumptions on the relative importance of small floods or channel-maintenance flows that lead to beneficial scour around large wood in sand-bed streams.
Further, the continuous measurement of scour and fill around the logs suggested the presence of pool scour holes would have met critical requirements for Australian bass ( Macquaria novemaculeata) during the migration period, whereas less-frequent monitoring typical of rehabilitation trials would have suggested the contrary. The results of this study have demonstrated that geomorphic effectiveness is not always synonymous with biological effectiveness. Whilst physical models emphasise extreme changes, such as maximum scour, the key biological issue is whether scour occurs at the critical time of the life cycle. Continuous measurement of sand levels is an example of a geomorphic technique that will help to develop models that predict biologically meaningful processes, not just extremes.

  17. Evaluation and Enhancement of Carbon Dioxide Flooding Through Sweep Improvement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Richard

    2009-09-30

    Carbon dioxide displacement is a common improved recovery method applied to light oil reservoirs (30-45°API). The economic and technical success of CO₂ floods is often limited by poor sweep efficiency or large CO₂ utilization rates. Projected incremental recoveries for CO₂ floods range from 7% to 20% of the original oil in place; however, actual incremental recoveries range from 9% to 15% of the original oil in place, indicating the potential for significant additional recoveries with improved sweep efficiency. This research program was designed to study the effectiveness of carbon dioxide flooding in a mature reservoir to identify and develop methods and strategies to improve oil recovery in carbon dioxide floods. Specifically, the project has focused on relating laboratory, theoretical and simulation studies to actual field performance in a CO₂ flood in an attempt to understand and mitigate problems of areal and vertical sweep efficiency. In this work the focus has been on evaluating the status of existing swept regions of a mature CO₂ flood and developing procedures to improve the design of proposed floods. The Little Creek Field, Mississippi has been studied through laboratory, theoretical, numerical and simulation studies in an attempt to relate performance predictions to historical reservoir performance to determine sweep efficiency, improve the understanding of the reservoir response to CO₂ injection, and develop scaling methodologies to relate laboratory data and simulation results to predicted reservoir behavior. Existing laboratory information from Little Creek was analyzed and an extensive amount of field data was collected. This was merged with an understanding of previous work at Little Creek to generate a detailed simulation study of two portions of the field – the original pilot area and a currently active part of the field. 
This work was done to try to relate all of this information to an understanding of where the CO₂ went or is going and how recovery might be improved. New data was also generated in this process. Production logs were run to understand where the CO₂ was entering the reservoir related to core and log information and also to corroborate the simulation model. A methodology was developed and successfully tested for evaluating saturations in a cased-hole environment. Finally an experimental and theoretical program was initiated to relate laboratory work to field scale design and analysis of operations. This work found that an understanding of vertical and areal heterogeneity is crucial for understanding sweep processes as well as understanding appropriate mitigation techniques to improve the sweep. Production and injection logs can provide some understanding of that heterogeneity when core data is not available. The cased-hole saturation logs developed in the project will also be an important part of the evaluation of vertical heterogeneity. Evaluation of injection well/production well connectivities through statistical or numerical techniques were found to be as successful in evaluating CO₂ floods as they are for waterfloods. These are likely to be the lowest cost techniques to evaluate areal sweep. Full field simulation and 4D seismic techniques are other possibilities but were beyond the scope of the project. Detailed simulation studies of pattern areas proved insightful both for doing a “post-mortem” analysis of the pilot area as well as a late-term, active portion of the Little Creek Field. This work also evaluated options for improving sweep in the current flood as well as evaluating options that could have been successful at recovering more oil. That simulation study was successful due to the integration of a large amount of data supplied by the operator as well as collected through the course of the project. 
While most projects would not have the abundance of data that Little Creek had, integration of the available data continues to be critical for both the design and evaluation stages of CO₂ floods. For cases where data availability is limited, running injection/production logs and/or running cased-hole saturation tools to provide an indication of vertical heterogeneity will be important.

  18. Nondestructive Testing Technique to Quantify Deterioration from Marine Borer Attack in Sitka Spruce and Western Hemlock Logs: Observations from a Pilot Test

    Treesearch

    Robert Ross; John W. Forsman; John R. Erickson; Allen M. Brackley

    2014-01-01

    Stress-wave nondestructive evaluation (NDE) techniques are used widely in the forest products industry—from the grading of wood veneer to inspection of timber structures. Inspection professionals frequently use stress-wave NDE techniques to locate internal voids and decayed or deteriorated areas in large timbers. Although these techniques have proven useful, little...

  19. Utilization and cost for animal logging operations

    Treesearch

    Suraj P. Shrestha; Bobby L. Lanford

    2001-01-01

    Forest harvesting with animals is a labor-intensive operation. Due to the development of efficient machines and high volume demands from the forest products industry, mechanization of logging developed very fast, leaving behind the traditional horse and mule logging. It is expensive to use machines on smaller woodlots, which require frequent moves if mechanically...

  20. Parallel O(log n) algorithms for open- and closed-chain rigid multibody systems based on a new mass matrix factorization technique

    NASA Technical Reports Server (NTRS)

    Fijany, Amir

    1993-01-01

    In this paper, parallel O(log n) algorithms for computation of rigid multibody dynamics are developed. These parallel algorithms are derived by parallelization of new O(n) algorithms for the problem. The underlying feature of these O(n) algorithms is a drastically different strategy for decomposition of interbody force which leads to a new factorization of the mass matrix M. Specifically, it is shown that a factorization of the inverse of the mass matrix in the form of the Schur complement is derived as M⁻¹ = C - B*A⁻¹B, wherein matrices C, A, and B are block tridiagonal matrices. The new O(n) algorithm is then derived as a recursive implementation of this factorization of M⁻¹. For the closed-chain systems, similar factorizations and O(n) algorithms for computation of the operational space mass matrix Λ and its inverse Λ⁻¹ are also derived. It is shown that these O(n) algorithms are strictly parallel, that is, they are less efficient than other algorithms for serial computation of the problem. But, to our knowledge, they are the only known algorithms that can be parallelized and that lead to both time- and processor-optimal parallel algorithms for the problem, i.e., parallel O(log n) algorithms with O(n) processors. The developed parallel algorithms, in addition to their theoretical significance, are also practical from an implementation point of view due to their simple architectural requirements.
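    The Schur complement structure at the heart of this factorization can be illustrated numerically (a generic identity, not the paper's O(n) recursion): for a symmetric block matrix K = [[A, B], [Bᵀ, C]], the bottom-right block of K⁻¹ equals the inverse of the Schur complement S = C - BᵀA⁻¹B:

```python
import numpy as np

# Generic illustration with random blocks: the Schur complement
# S = C - B^T A^{-1} B of K = [[A, B], [B^T, C]] satisfies
# (K^{-1})_{22} = S^{-1}.

rng = np.random.default_rng(1)
n = 3
A = rng.random((n, n))
A = A @ A.T + n * np.eye(n)   # symmetric positive definite block
C = rng.random((n, n))
C = C @ C.T + n * np.eye(n)   # symmetric positive definite block
B = rng.random((n, n))

K = np.block([[A, B], [B.T, C]])
S = C - B.T @ np.linalg.inv(A) @ B      # Schur complement of A in K
lower_right = np.linalg.inv(K)[n:, n:]  # bottom-right block of K^{-1}
match = np.allclose(lower_right, np.linalg.inv(S))
```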

  1. Robust Library Building for Autonomous Classification of Downhole Geophysical Logs Using Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Silversides, Katherine L.; Melkumyan, Arman

    2017-03-01

    Machine learning techniques such as Gaussian Processes can be used to identify stratigraphically important features in geophysical logs. The marker shales in the banded iron formation hosted iron ore deposits of the Hamersley Ranges, Western Australia, form distinctive signatures in the natural gamma logs. The identification of these marker shales is important for stratigraphic identification of unit boundaries for the geological modelling of the deposit. Machine learning techniques each have different unique properties that will impact the results. For Gaussian Processes (GPs), the output values are inclined towards the mean value, particularly when there is not sufficient information in the library. The impact that these inclinations have on the classification can vary depending on the parameter values selected by the user. Therefore, when applying machine learning techniques, care must be taken to fit the technique to the problem correctly. This study focuses on optimising the settings and choices for training a GPs system to identify a specific marker shale. We show that the final results converge even when different, but equally valid starting libraries are used for the training. To analyse the impact on feature identification, GP models were trained so that the output was inclined towards a positive, neutral or negative output. For this type of classification, the best results were when the pull was towards a negative output. We also show that the GP output can be adjusted by using a standard deviation coefficient that changes the balance between certainty and accuracy in the results.
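    The "inclination towards the mean" discussed above can be reproduced with a generic GP regression sketch (an illustration of GP behavior in general, not the authors' trained system; inputs and outputs are invented):

```python
import numpy as np

# Generic GP regression sketch: with a zero prior mean, predictions far
# from the training data revert to 0 -- the pull towards the mean that
# the study manages through library building and parameter choices.

def rbf(x1, x2, length=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / length ** 2)

X = np.array([0.0, 1.0, 2.0])   # training inputs (e.g. depths)
y = np.array([1.0, 1.2, 0.9])   # training outputs (prior mean is 0)
noise = 1e-6

def gp_predict(x_star):
    K = rbf(X, X) + noise * np.eye(len(X))
    k_star = rbf(np.atleast_1d(float(x_star)), X)
    return float((k_star @ np.linalg.solve(K, y))[0])

near = gp_predict(1.0)   # at a training point: close to 1.2
far = gp_predict(50.0)   # far from all data: pulled to the prior mean 0
```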

  2. Visualization of usability and functionality of a professional website through web-mining.

    PubMed

    Jones, Josette F; Mahoui, Malika; Gopa, Venkata Devi Pragna

    2007-10-11

    Functional interface design requires understanding of the information system structure and the user. Web logs record user interactions with the interface, and thus provide some insight into user search behavior and the efficiency of the search process. The present study uses a data-mining approach with techniques such as association rules, clustering, and classification to visualize the usability and functionality of a digital library through in-depth analyses of web logs.

  3. Combined Log Inventory and Process Simulation Models for the Planning and Control of Sawmill Operations

    Treesearch

    Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold

    1991-01-01

    A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...

  4. Image-based reconstruction of three-dimensional myocardial infarct geometry for patient-specific modeling of cardiac electrophysiology

    PubMed Central

    Ukwatta, Eranga; Arevalo, Hermenegild; Rajchl, Martin; White, James; Pashakhanloo, Farhad; Prakosa, Adityo; Herzka, Daniel A.; McVeigh, Elliot; Lardo, Albert C.; Trayanova, Natalia A.; Vadakkumpadan, Fijoy

    2015-01-01

    Purpose: Accurate three-dimensional (3D) reconstruction of myocardial infarct geometry is crucial to patient-specific modeling of the heart aimed at providing therapeutic guidance in ischemic cardiomyopathy. However, myocardial infarct imaging is clinically performed using two-dimensional (2D) late-gadolinium enhanced cardiac magnetic resonance (LGE-CMR) techniques, and a method to build accurate 3D infarct reconstructions from the 2D LGE-CMR images has been lacking. The purpose of this study was to address this need. Methods: The authors developed a novel methodology to reconstruct 3D infarct geometry from segmented low-resolution (Lo-res) clinical LGE-CMR images. Their methodology employed the so-called logarithm of odds (LogOdds) function to implicitly represent the shape of the infarct in segmented image slices as LogOdds maps. These 2D maps were then interpolated into a 3D image, and the result transformed via the inverse of LogOdds to a binary image representing the 3D infarct geometry. To assess the efficacy of this method, the authors utilized 39 high-resolution (Hi-res) LGE-CMR images, including 36 in vivo acquisitions of human subjects with prior myocardial infarction and 3 ex vivo scans of canine hearts following coronary ligation to induce infarction. The infarct was manually segmented by trained experts in each slice of the Hi-res images, and the segmented data were downsampled to typical clinical resolution. The proposed method was then used to reconstruct 3D infarct geometry from the downsampled images, and the resulting reconstructions were compared with the manually segmented data. The method was extensively evaluated using metrics based on geometry as well as results of electrophysiological simulations of cardiac sinus rhythm and ventricular tachycardia in individual hearts. Several alternative reconstruction techniques were also implemented and compared with the proposed method. 
Results: The accuracy of the LogOdds method in reconstructing 3D infarct geometry, as measured by the Dice similarity coefficient, was 82.10% ± 6.58%, a significantly higher value than those of the alternative reconstruction methods. Among outcomes of electrophysiological simulations with infarct reconstructions generated by various methods, the simulation results corresponding to the LogOdds method showed the smallest deviation from those corresponding to the manual reconstructions, as measured by metrics based on both activation maps and pseudo-ECGs. Conclusions: The authors have developed a novel method for reconstructing 3D infarct geometry from segmented slices of Lo-res clinical 2D LGE-CMR images. This method outperformed alternative approaches in reproducing expert manual 3D reconstructions and in electrophysiological simulations. PMID:26233186
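    The LogOdds idea can be sketched in simplified form (a hypothetical one-dimensional example, not the paper's full 2D-slice-to-3D pipeline): shape probabilities are mapped to log-odds space, interpolated there, and mapped back through the logistic function before thresholding:

```python
import numpy as np

# Simplified illustration of the LogOdds function: probabilities are
# kept off exactly 0/1 so the log-odds transform stays finite.

def logodds(p):
    return np.log(p / (1 - p))

def inv_logodds(t):
    return 1 / (1 + np.exp(-t))

slice_a = np.array([0.9, 0.1, 0.8])  # shape probability on one slice
slice_b = np.array([0.7, 0.2, 0.9])  # and on the adjacent slice

# Interpolate the missing slice between them in log-odds space,
# then map back and threshold to a binary shape.
mid = inv_logodds(0.5 * (logodds(slice_a) + logodds(slice_b)))
reconstruction = mid > 0.5
```

    Interpolating in log-odds space rather than directly on binary masks is what lets the reconstruction behave smoothly near the shape boundary.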

  5. Image-based reconstruction of three-dimensional myocardial infarct geometry for patient-specific modeling of cardiac electrophysiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ukwatta, Eranga, E-mail: eukwatt1@jhu.edu; Arevalo, Hermenegild; Pashakhanloo, Farhad

    Purpose: Accurate three-dimensional (3D) reconstruction of myocardial infarct geometry is crucial to patient-specific modeling of the heart aimed at providing therapeutic guidance in ischemic cardiomyopathy. However, myocardial infarct imaging is clinically performed using two-dimensional (2D) late-gadolinium enhanced cardiac magnetic resonance (LGE-CMR) techniques, and a method to build accurate 3D infarct reconstructions from the 2D LGE-CMR images has been lacking. The purpose of this study was to address this need. Methods: The authors developed a novel methodology to reconstruct 3D infarct geometry from segmented low-resolution (Lo-res) clinical LGE-CMR images. Their methodology employed the so-called logarithm of odds (LogOdds) function to implicitly represent the shape of the infarct in segmented image slices as LogOdds maps. These 2D maps were then interpolated into a 3D image, and the result transformed via the inverse of LogOdds to a binary image representing the 3D infarct geometry. To assess the efficacy of this method, the authors utilized 39 high-resolution (Hi-res) LGE-CMR images, including 36 in vivo acquisitions of human subjects with prior myocardial infarction and 3 ex vivo scans of canine hearts following coronary ligation to induce infarction. The infarct was manually segmented by trained experts in each slice of the Hi-res images, and the segmented data were downsampled to typical clinical resolution. The proposed method was then used to reconstruct 3D infarct geometry from the downsampled images, and the resulting reconstructions were compared with the manually segmented data. The method was extensively evaluated using metrics based on geometry as well as results of electrophysiological simulations of cardiac sinus rhythm and ventricular tachycardia in individual hearts. Several alternative reconstruction techniques were also implemented and compared with the proposed method. 
Results: The accuracy of the LogOdds method in reconstructing 3D infarct geometry, as measured by the Dice similarity coefficient, was 82.10% ± 6.58%, a significantly higher value than those of the alternative reconstruction methods. Among outcomes of electrophysiological simulations with infarct reconstructions generated by various methods, the simulation results corresponding to the LogOdds method showed the smallest deviation from those corresponding to the manual reconstructions, as measured by metrics based on both activation maps and pseudo-ECGs. Conclusions: The authors have developed a novel method for reconstructing 3D infarct geometry from segmented slices of Lo-res clinical 2D LGE-CMR images. This method outperformed alternative approaches in reproducing expert manual 3D reconstructions and in electrophysiological simulations.

  6. Efficacy of the World Health Organization-recommended handwashing technique and a modified washing technique to remove Clostridium difficile from hands.

    PubMed

    Deschênes, Philippe; Chano, Frédéric; Dionne, Léa-Laurence; Pittet, Didier; Longtin, Yves

    2017-08-01

    The efficacy of the World Health Organization (WHO)-recommended handwashing technique against Clostridium difficile is uncertain, and whether it could be improved remains unknown. Also, the benefit of using a structured technique instead of an unstructured technique remains unclear. This study was a prospective comparison of 3 techniques (unstructured, WHO, and a novel technique dubbed WHO shortened repeated [WHO-SR] technique) to remove C difficile. Ten participants were enrolled and performed each technique. Hands were contaminated with 3 × 10⁶ colony forming units (CFU) of a nontoxigenic strain containing 90% spores. Efficacy was assessed using the whole-hand method. The relative efficacy of each technique and of a structured (either WHO or WHO-SR) vs an unstructured technique were assessed by Mann-Whitney U test and Wilcoxon signed-rank test. The median effectiveness of the unstructured, WHO, and WHO-SR techniques in log₁₀ CFU reduction was 1.30 (interquartile range [IQR], 1.27-1.43), 1.71 (IQR, 1.34-1.91), and 1.70 (IQR, 1.54-2.42), respectively. The WHO-SR technique was significantly more efficacious than the unstructured technique (P = .01). Washing hands with a structured technique was more effective than washing with an unstructured technique (median, 1.70 vs 1.30 log₁₀ CFU reduction, respectively; P = .007). A structured washing technique is more effective than an unstructured technique against C difficile. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
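    For readers less used to log-scale reductions, the reported medians convert to percentage removal by simple arithmetic (no additional study data involved):

```python
# A reduction of r log10 CFU removes a fraction 1 - 10**(-r) of the
# organisms; applying this to the medians reported above.

def pct_removed(log10_reduction):
    return (1 - 10 ** (-log10_reduction)) * 100

unstructured = pct_removed(1.30)  # roughly 95% removed
structured = pct_removed(1.70)    # roughly 98% removed
```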

  7. Evaluation of electronic logging and gamma ray device for bridge boring interpretation.

    DOT National Transportation Integrated Search

    1971-07-01

    Since shallow electric logging devices and gamma ray devices (for use in holes of less than 200 feet in depth) have recently been developed, it was the aim of this work to ascertain if correlation between electric logs and/or gamma ray logs and known...

  8. Techniques for land use change detection using Landsat imagery

    NASA Technical Reports Server (NTRS)

    Angelici, G. L.; Bryant, N. A.; Friedman, S. Z.

    1977-01-01

    A variety of procedures were developed for the delineation of areas of land use change using Landsat Multispectral Scanner data and the generation of statistics revealing the nature of the changes involved (i.e., number of acres changed from rural to urban). Techniques of the Image Based Information System were utilized in all stages of the procedure, from logging the Landsat data and registering two frames of imagery, to extracting the changed areas and printing tabulations of land use change in acres. Two alternative methods of delineating land use change are presented while enumerating the steps of the entire process. The Houston, Texas urban area, and the Orlando, Florida urban area, are used as illustrative examples of various procedures.

  9. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    NASA Astrophysics Data System (ADS)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

Disease mapping comprises a set of statistical techniques that produce maps of disease rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of cases in an area; it carries the greatest uncertainty when the disease is rare or the geographical area is small. Therefore, Bayesian models or statistical smoothing based on the log-normal model are introduced to address this problem. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to the data using WinBUGS software. The study starts with a brief review of these models, beginning with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method, and that it can overcome the SMR problem when no bladder cancer is observed in an area.
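    The SMR described in this record is simply observed over expected counts. A minimal sketch (the area counts are hypothetical) of the ratio, and of why small areas make it unstable:

    ```python
    def smr(observed, expected):
        """Standardized Morbidity Ratio: observed / expected case counts
        in an area. SMR > 1 suggests elevated relative risk."""
        if expected == 0:
            raise ValueError("SMR is undefined when the expected count is zero")
        return observed / expected

    # Two hypothetical areas with the same SMR of 2.0. In the small area a
    # single extra case would swing the estimate from 2.0 to 3.0 — the
    # instability that Bayesian / log-normal smoothing is meant to address.
    print(smr(2, 1))      # small area: 2 observed vs 1 expected -> 2.0
    print(smr(200, 100))  # large area: 200 observed vs 100 expected -> 2.0
    ```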

  10. T-H-A-T-S: timber-harvesting-and-transport-simulator: with subroutines for Appalachian logging

    Treesearch

    A. Jeff Martin

    1975-01-01

    A computer program for simulating harvesting operations is presented. Written in FORTRAN IV, the program contains subroutines that were developed for Appalachian logging conditions. However, with appropriate modifications, the simulator would be applicable for most logging operations and locations. The details of model development and its methodology are presented,...

  11. Ubiquitous Learning Project Using Life-Logging Technology in Japan

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Hou, Bin; Li, Mengmeng; Uosaki, Noriko; Mouri, Kosuke; Liu, Songran

    2014-01-01

    A Ubiquitous Learning Log (ULL) is defined as a digital record of what a learner has learned in daily life using ubiquitous computing technologies. In this paper, a project which developed a system called SCROLL (System for Capturing and Reusing Of Learning Log) is presented. The aim of developing SCROLL is to help learners record, organize,…

  12. Using Plasticity Values Determined From Systematic Hardness Indentation Measurements for Predicting Impact Behavior in Structural Ceramics: A New, Simple Screening Technique

    DTIC Science & Technology

    2009-09-01

A power-law equation (H = kF^c) is shown to fit the Knoop data quite well for the materials studied, including aluminum oxynitride (AlON), silicon carbide, aluminum oxide, and boron carbide. A plot of log10(HK) vs. log10(F) yielded easily comparable straight lines whose slopes and intercepts characterize each material; one reported fit is HK = 24.183 F^-0.0699 (HK in GPa, load F in N; R² = 0.97).
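    The log-log linearization behind the power-law fit H = kF^c can be sketched as a plain least-squares fit on log10-transformed data. The sample loads below are illustrative, with synthetic hardness values generated from the reported fit HK = 24.183 F^-0.0699:

    ```python
    import math

    def fit_power_law(loads, hardness):
        """Fit H = k * F**c by ordinary least squares on the linearized form
        log10(H) = log10(k) + c * log10(F)."""
        xs = [math.log10(f) for f in loads]
        ys = [math.log10(h) for h in hardness]
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        c = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
        k = 10 ** (mean_y - c * mean_x)
        return k, c

    # Synthetic data lying exactly on the reported curve; the fit recovers
    # k and c to within floating-point error.
    loads = [0.98, 1.96, 4.9, 9.8, 19.6]               # N (illustrative)
    hardness = [24.183 * f ** -0.0699 for f in loads]  # GPa
    k, c = fit_power_law(loads, hardness)
    print(round(k, 3), round(c, 4))  # 24.183 -0.0699
    ```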

  13. Using borehole flow logging to optimize hydraulic-test procedures in heterogeneous fractured aquifers

    USGS Publications Warehouse

    Paillet, F.L.

    1995-01-01

Hydraulic properties of heterogeneous fractured aquifers are difficult to characterize, and such characterization usually requires equipment-intensive and time-consuming applications of hydraulic testing in situ. Conventional coring and geophysical logging techniques provide useful and reliable information on the distribution of bedding planes, fractures and solution openings along boreholes, but it is often unclear how these locally permeable features are organized into larger-scale zones of hydraulic conductivity. New borehole flow-logging equipment provides techniques designed to identify hydraulically active fractures intersecting boreholes, and to indicate how these fractures might be connected to larger-scale flow paths in the surrounding aquifer. Potential complications in interpreting flowmeter logs include: 1) Ambient hydraulic conditions that mask the detection of hydraulically active fractures; 2) Inability to maintain quasi-steady drawdowns during aquifer tests, which causes temporal variations in flow intensity to be confused with inflows during pumping; and 3) Effects of uncontrolled background variations in hydraulic head, which also complicate the interpretation of inflows during aquifer tests. Application of these techniques is illustrated by the analysis of cross-borehole flowmeter data from an array of four bedrock boreholes in granitic schist at the Mirror Lake, New Hampshire, research site. Only two days of field operations were required to unambiguously identify the few fractures or fracture zones that contribute most of the inflow to boreholes in the CO borehole array during pumping. Such information was critical in the interpretation of water-quality data. This information also permitted the setting of the available string of two packers in each borehole so as to return the aquifer as close to pre-drilling conditions as possible with the available equipment.

  14. Spent Fuel Test-Climax: core logging for site investigation and instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilder, D.G.; Yow, J.L. Jr.; Thorpe, R.K.

    1982-05-28

As an integral part of the Spent Fuel Test-Climax, 5150 ft (1570 m) of granite core was obtained. This core was diamond drilled in various sizes, mainly 38-mm and 76-mm diameters. The core was taken with single-tube core barrels and was unoriented. Techniques used to drill and log this core are discussed, as well as techniques to orient the core. Of the 5150 ft (1570 m) of core, more than 3645 ft (1111 m) was retained and logged in some detail. As a result of the core logging, geologic discontinuities were identified, and joint frequency and spacing were characterized. Discontinuities identified included several joint sets, shear zones, and faults. Correlations based on coring alone were generally found to be impossible, even for the more prominent features. The only feature properly correlated from the exploratory drilling was the fault system at the end of the facility, but it was not identified from the exploratory core as a fault. Identification of discontinuities was later helped by underground mapping that identified several different joint sets with different characteristics. It was found that joint frequency varied from 0.3 to 1.1 joints per foot of core for open fractures and from 0.3 to 3.3 joints per foot for closed or healed fractures. Histograms of fracture spacing indicate that there is likely a random distribution of spacing superimposed upon uniformly spaced fractures. It was found that a low-angle joint set had a persistent mean orientation. These joints were healed and had pervasive wall-rock alteration, which made identification of joints in this set possible. The recognition of a joint set with known attitude allowed orientation of much of the core. This orientation technique was found to be effective. 10 references, 25 figures, 4 tables.

  15. Partitioning of fluorotelomer alcohols to octanol and different sources of dissolved organic carbon.

    PubMed

    Carmosini, Nadia; Lee, Linda S

    2008-09-01

Interest in the environmental fate of fluorotelomer alcohols (FTOHs) has spurred efforts to understand their equilibrium partitioning behavior. Experimentally determined partition coefficients for FTOHs between soil/water and air/water have been reported, but direct measurements of partition coefficients for dissolved organic carbon (DOC)/water (K(doc)) and octanol/water (K(ow)) have been lacking. Here we measured the partitioning of 8:2 and 6:2 FTOH between one or more types of DOC and water using enhanced solubility or dialysis bag techniques, and also quantified K(ow) values for 4:2 to 8:2 FTOH using a batch equilibration method. The range in measured log K(doc) values for 8:2 FTOH using the enhanced solubility technique with DOC derived from two soils, two biosolids, and three reference humic acids is 2.00-3.97, with the lowest values obtained for the biosolids and an average across all other DOC sources (biosolid DOC excluded) of 3.54 +/- 0.29. For 6:2 FTOH and Aldrich humic acid, a log K(doc) value of 1.96 +/- 0.45 was measured using the dialysis technique. These average values are approximately 1 to 2 log units lower than previously indirectly estimated K(doc) values. Overall, the affinity for DOC tends to be slightly lower than that for particulate soil organic carbon. Measured log K(ow) values for 4:2 (3.30 +/- 0.04), 6:2 (4.54 +/- 0.01), and 8:2 FTOH (5.58 +/- 0.06) were in good agreement with previously reported estimates. Using relationships between experimentally measured partition coefficients and C-atom chain length, we estimated K(doc) and K(ow) values for shorter and longer chain FTOHs, respectively, that we were unable to measure experimentally.

  16. Delineation of faults, fractures, foliation, and ground-water-flow zones in fractured-rock, on the southern part of Manhattan, New York, through use of advanced borehole-geophysical techniques

    USGS Publications Warehouse

    Stumm, Frederick; Chu, Anthony; Monti, Jack

    2004-01-01

Advanced borehole-geophysical techniques were used to assess the geohydrology of crystalline bedrock in 20 boreholes on the southern part of Manhattan Island, N.Y., in preparation for construction of a third water tunnel for New York City. The borehole-logging techniques included natural gamma, single-point resistance, short-normal resistivity, mechanical and acoustic caliper, magnetic susceptibility, borehole-fluid temperature and resistivity, borehole-fluid specific conductance, dissolved oxygen, pH, redox, heat-pulse flowmeter (at selected boreholes), borehole deviation, acoustic and optical televiewer, and borehole radar (at selected boreholes). Hydraulic head and specific-capacity test data were collected from 29 boreholes. The boreholes penetrated gneiss, schist, and other crystalline bedrock that has an overall southwest to northwest-dipping foliation. Most of the fractures penetrated are nearly horizontal or have moderate- to high-angle northwest or eastward dip azimuths. Foliation dip within the potential tunnel-construction zone is northwestward and southeastward in the proposed North Water-Tunnel, northwestward to southwestward in the proposed Midtown Water-Tunnel, and northwestward to westward in the proposed South Water-Tunnel. Fracture population dip azimuths are variable. Heat-pulse flowmeter logs obtained under pumping and nonpumping (ambient) conditions, together with other geophysical logs, indicate transmissive fracture zones in each borehole. The 60-megahertz directional borehole-radar logs delineated the location and orientation of several radar reflectors that did not intersect the projection of the borehole. Fracture indexes range from 0.12 to 0.93 fractures per foot of borehole. Analysis of specific-capacity tests from each borehole indicated that transmissivity ranges from 2 to 459 feet squared per day; the highest transmissivity is at the Midtown Water-Tunnel borehole (E35ST-D).

  17. Practical life log video indexing based on content and context

    NASA Astrophysics Data System (ADS)

    Tancharoen, Datchakorn; Yamasaki, Toshihiko; Aizawa, Kiyoharu

    2006-01-01

Today, multimedia information has gained an important role in daily life, and people can use imaging devices to capture their visual experiences. In this paper, we present our personal Life Log system to record personal experiences in the form of wearable video and environmental data; in addition, an efficient retrieval system is demonstrated to recall desired media. We summarize practical video indexing techniques based on Life Log content and context to detect talking scenes by using audio/visual cues, and semantic key frames from GPS data. Voice annotation is also demonstrated as a practical indexing method. Moreover, we apply body media sensors to record continuous lifestyle data and use the body media data to index the semantic key frames. In the experiments, we demonstrated various video indexing results that provided their semantic content, and showed Life Log visualizations for examining personal life effectively.

  18. Copy-move forgery detection utilizing Fourier-Mellin transform log-polar features

    NASA Astrophysics Data System (ADS)

    Dixit, Rahul; Naskar, Ruchira

    2018-03-01

    In this work, we address the problem of region duplication or copy-move forgery detection in digital images, along with detection of geometric transforms (rotation and rescale) and postprocessing-based attacks (noise, blur, and brightness adjustment). Detection of region duplication, following conventional techniques, becomes more challenging when an intelligent adversary brings about such additional transforms on the duplicated regions. In this work, we utilize Fourier-Mellin transform with log-polar mapping and a color-based segmentation technique using K-means clustering, which help us to achieve invariance to all the above forms of attacks in copy-move forgery detection of digital images. Our experimental results prove the efficiency of the proposed method and its superiority to the current state of the art.

  19. Environmental and Genetic Factors Explain Differences in Intraocular Scattering.

    PubMed

    Benito, Antonio; Hervella, Lucía; Tabernero, Juan; Pennos, Alexandros; Ginis, Harilaos; Sánchez-Romera, Juan F; Ordoñana, Juan R; Ruiz-Sánchez, Marcos; Marín, José M; Artal, Pablo

    2016-01-01

To study the relative impact of genetic and environmental factors on the variability of intraocular scattering within a classical twin study. A total of 64 twin pairs, 32 monozygotic (MZ) (mean age: 54.9 ± 6.3 years) and 32 dizygotic (DZ) (mean age: 56.4 ± 7.0 years), were measured after a complete ophthalmologic exam had been performed to exclude all ocular pathologies, such as cataracts, that increase intraocular scatter. Intraocular scattering was evaluated using two different techniques based on estimation of the straylight parameter log(S): a compact optical instrument based on the principle of optical integration, and a psychophysical measurement. Intraclass correlation coefficients (ICC) were used as descriptive statistics of twin resemblance, and genetic models were fitted to estimate heritability. No statistically significant difference was found between the MZ and DZ groups for age (P = 0.203), best-corrected visual acuity (P = 0.626), cataract gradation (P = 0.701), sex (P = 0.941), optical log(S) (P = 0.386), or psychophysical log(S) (P = 0.568), with only a minor difference in equivalent sphere (P = 0.008). Intraclass correlation coefficients between siblings were similar for scatter parameters: 0.676 in MZ and 0.471 in DZ twins for optical log(S); 0.533 in MZ twins and 0.475 in DZ twins for psychophysical log(S). For equivalent sphere, ICCs were 0.767 in MZ and 0.228 in DZ twins. Conservative estimates of heritability for the measured scattering parameters were 0.39 and 0.20, respectively. Correlations of intraocular scatter (straylight) parameters in the groups of identical and nonidentical twins were similar. Heritability estimates were of limited magnitude, suggesting that both genetic and environmental factors determine the variance of ocular straylight in healthy middle-aged adults.

  20. Psychophysics, reliability, and norm values for temporal contrast sensitivity implemented on the two alternative forced choice C-Quant device.

    PubMed

    van den Berg, Thomas J T P; Franssen, Luuk; Kruijt, Bastiaan; Coppens, Joris E

    2011-08-01

The current paper describes the design and population testing of a flicker sensitivity assessment technique corresponding to the psychophysical approach for straylight measurement. The purpose is twofold: to check the subjects' capability to perform the straylight test, and to serve as a test of retinal integrity for other purposes. The test was implemented in the Oculus C-Quant straylight meter, using in-house software (MATLAB). The geometry of the visual field layout was identical, as was the subjects' 2AFC task. A comparable reliability criterion ("unc") was developed. The outcome measure was logTCS (temporal contrast sensitivity). The population test was performed in science fair settings on about 400 subjects. Moreover, 2 subjects underwent extensive tests to check whether optical defects, mimicked with trial lenses and scatter filters, affected the TCS outcome. The repeated-measures standard deviation was 0.11 log units for the reference population. Normal values for logTCS were around 2 (threshold 1%), with some dependence on age (range 6 to 85 years). The test outcome did not change upon a tenfold (optical) deterioration in visual acuity or straylight. The test has adequate precision for checking a subject's capability to perform straylight assessment. The unc reliability criterion ensures sufficient precision, also for assessment of retinal sensitivity loss.

  1. Indoor cultivation and cultural characteristics of Wolfiporia cocos sclerotia using mushroom culture bottles.

    PubMed

    Kubo, Toshiyuki; Terabayashi, Susumu; Takeda, Shuichi; Sasaki, Hiroshi; Aburada, Masaki; Miyamoto, Ken-ichi

    2006-06-01

We developed a new indoor cultivation technique for Wolfiporia cocos (Wolf) Ryvarden et Gilbertson (Syn. Poria cocos Wolf), not in soil, but using mushroom culture bottles with pine logs, and clarified some cultural characteristics of sclerotia in the laboratory. To determine the optimum conditions for sclerotia growth, the weight of sclerotia and the concentration of CO2 were tested with three different air filters (cloth, paper, and urethane resin) and in closed bottles. When the cloth air filter was used, the growth rate was the fastest and the yield was maximal. These results suggested that aeration was an important environmental factor for cultivation. To clarify the characteristics of culture in the cloth-air-filtered and closed bottles, the weight of sclerotia, the composition of the pine logs, and the contents of pachymic acid and dehydropachymic acid were examined over 24 weeks. The growth of sclerotia and the wood-decaying efficiency in the cloth-air-filtered bottles were better than those in the closed bottles. Also, it was found that W. cocos is a brown rot fungus, based on the alkaline solubility of the pine logs in the wood decay process. In addition, the contents of pachymic acid and dehydropachymic acid and the TLC patterns of the cultivated and commercial sclerotia did not differ remarkably.

  2. A method to describe inelastic gamma field distribution in neutron gamma density logging.

    PubMed

    Zhang, Feng; Zhang, Quanying; Liu, Juntao; Wang, Xinguang; Wu, He; Jia, Wenbao; Ti, Yongzhou; Qiu, Fei; Zhang, Xiaoyang

    2017-11-01

Pulsed neutron gamma density logging (NGD) is of great significance for radioprotection and density measurement in LWD; however, the current methods have difficulty with quantitative calculation and single-factor analysis of the inelastic gamma field distribution. In order to clarify the NGD mechanism, a new method is developed to describe the inelastic gamma field distribution. Based on fast-neutron scattering and gamma attenuation, the inelastic gamma field distribution is characterized by the inelastic scattering cross section, fast-neutron scattering free path, formation density, and other parameters, and the contribution of formation parameters to the field distribution is quantitatively analyzed. The results show that the contribution of density attenuation is opposite to that of the inelastic scattering cross section and fast-neutron scattering free path. As the detector spacing increases, density attenuation gradually plays a dominant role in the gamma field distribution, which means that large detector spacing is more favorable for the density measurement. In addition, the relationship between density sensitivity and detector spacing was studied using this gamma field distribution, and the spacing of the near and far gamma-ray detectors was determined accordingly. The research provides theoretical guidance for tool parameter design and density determination in the pulsed neutron gamma density logging technique. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunawan, H.; Puspito, N. T.; Ibrahim, G.

A new approach for determining magnitude from the displacement amplitude (A), epicentral distance (Δ), and duration of high-frequency radiation (t) has been investigated for the Tasikmalaya earthquake of September 2, 2009, and its aftershocks. The moment magnitude scale is commonly determined from teleseismic surface waves with periods greater than 200 seconds, or from the moment magnitude of the P wave using teleseismic seismograms in the 10-60 second period range. In this research, a new technique was developed to determine the displacement amplitude and the duration of high-frequency radiation using near earthquakes. The duration of high-frequency radiation is determined from half the period of the P wave on the displacement seismogram. This is necessary because the rupture process of a near earthquake is very complex: the P wave mixes with other waves (the S wave) before the duration runs out, so it is difficult to separate out or determine the end of the P wave. Application to the 68 earthquakes recorded by station CISI, Garut, West Java, yields the following relationship: Mw = 0.78 log(A) + 0.83 log(Δ) + 0.69 log(t) + 6.46, with A in meters, Δ in kilometers, and t in seconds. The moment magnitude from this new approach is quite reliable and faster to process, making it useful for early warning.
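    The reported regression can be evaluated directly. A minimal sketch (the sample amplitude, distance, and duration are illustrative inputs, not measurements from the paper):

    ```python
    import math

    def moment_magnitude(amplitude_m, distance_km, duration_s):
        """Moment magnitude from the regression fitted to 68 earthquakes
        recorded at station CISI:
            Mw = 0.78*log10(A) + 0.83*log10(delta) + 0.69*log10(t) + 6.46
        with A in meters, delta in kilometers, and t in seconds."""
        return (0.78 * math.log10(amplitude_m)
                + 0.83 * math.log10(distance_km)
                + 0.69 * math.log10(duration_s)
                + 6.46)

    # Illustrative inputs: A = 1e-4 m, delta = 100 km, t = 10 s.
    print(round(moment_magnitude(1e-4, 100.0, 10.0), 2))  # 5.69
    ```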

  4. Physico-chemical measurements of CL-20 for environmental applications. Comparison with RDX and HMX.

    PubMed

    Monteil-Rivera, Fanny; Paquet, Louise; Deschamps, Stéphane; Balakrishnan, Vimal K; Beaulieu, Chantale; Hawari, Jalal

    2004-01-30

    CL-20 is a polycyclic energetic nitramine, which may soon replace the monocyclic nitramines RDX and HMX, because of its superior explosive performance. Therefore, to predict its environmental fate, analytical and physico-chemical data must be made available. An HPLC technique was thus developed to measure CL-20 in soil samples based on the US Environmental Protection Agency method 8330. We found that the soil water content and aging (21 days) had no effect on the recoveries (>92%) of CL-20, provided that the extracts were kept acidic (pH 3). The aqueous solubility of CL-20 was poor (3.6 mg l(-1) at 25 degrees C) and increased with temperature to reach 18.5 mg l(-1) at 60 degrees C. The octanol-water partition coefficient of CL-20 (log KOW = 1.92) was higher than that of RDX (log KOW = 0.90) and HMX (log KOW = 0.16), indicating its higher affinity to organic matter. Finally, CL-20 was found to decompose in non-acidified water upon contact with glass containers to give NO2- (2 equiv.), N2O (2 equiv.), and HCOO- (2 equiv.). The experimental findings suggest that CL-20 should be less persistent in the environment than RDX and HMX.

  5. Keystroke Analysis: Reflections on Procedures and Measures

    ERIC Educational Resources Information Center

    Baaijen, Veerle M.; Galbraith, David; de Glopper, Kees

    2012-01-01

    Although keystroke logging promises to provide a valuable tool for writing research, it can often be difficult to relate logs to underlying processes. This article describes the procedures and measures that the authors developed to analyze a sample of 80 keystroke logs, with a view to achieving a better alignment between keystroke-logging measures…

  6. Standing timber coefficients for Indiana walnut log production.

    Treesearch

    James E. Blyth; Edwin Kallio; John C. Callahan

    1969-01-01

    If the volume of walnut veneer logs and saw logs received at processing plants from Indiana forests is known, conversion factors developed in this paper can be used to determine how much timber was cut to provide these logs and the kinds of timber that were cut (sawtimber, cull trees, trees on nonforest land, etc.).

  7. LogSafe and Smart: Minnesota OSHA's LogSafe Program Takes Root.

    ERIC Educational Resources Information Center

    Honerman, James

    1999-01-01

    Logging is now the most dangerous U.S. occupation. The Occupational Safety and Health Administration (OSHA) developed specialized safety training for the logging industry but has been challenged to reach small operators. An OSHA-approved state program in Minnesota provides annual safety seminars to about two-thirds of the state's full-time…

  8. Selective logging in the Brazilian Amazon.

    Treesearch

    G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva

    2005-01-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...

  9. A method to evaluate hydraulic fracture using proppant detection.

    PubMed

    Liu, Juntao; Zhang, Feng; Gardner, Robin P; Hou, Guojing; Zhang, Quanying; Li, Hu

    2015-11-01

Accurate determination of the proppant placement and propped fracture height is important for evaluating and optimizing stimulation strategies. A technology using non-radioactive proppant and a pulsed neutron gamma energy spectra logging tool to determine the placement and height of propped fractures is proposed. Gd2O3 was incorporated into ceramic proppant, and a Monte Carlo method was utilized to build the logging tools and formation models. Characteristic responses of the recorded information of different logging tools to fracture widths, proppant concentrations, and influencing factors were studied. The results show that Gd capture gamma rays can be used to evaluate propped fractures; this method has higher sensitivity to changes in fracture width and traceable proppant content than existing non-radioactive proppant evaluation techniques, and only a post-fracture measurement is needed. Changes in gas saturation and borehole size have a great impact on determining propped fractures when compensated neutron and pulsed neutron capture tools are used. A field example is presented to validate the application of the new technique. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Techniques for estimating streamflow characteristics in the Eastern and Interior coal provinces of the United States

    USGS Publications Warehouse

    Wetzel, Kim L.; Bettandorff, J.M.

    1986-01-01

    Techniques are presented for estimating various streamflow characteristics, such as peak flows, mean monthly and annual flows, flow durations, and flow volumes, at ungaged sites on unregulated streams in the Eastern Coal region. Streamflow data and basin characteristics for 629 gaging stations were used to develop multiple-linear-regression equations. Separate equations were developed for the Eastern and Interior Coal Provinces. Drainage area is an independent variable common to all equations. Other variables needed, depending on the streamflow characteristic, are mean annual precipitation, mean basin elevation, main channel length, basin storage, main channel slope, and forest cover. A ratio of the observed 50- to 90-percent flow durations was used in the development of relations to estimate low-flow frequencies in the Eastern Coal Province. Relations to estimate low flows in the Interior Coal Province are not presented because the standard errors were greater than 0.7500 log units and were considered to be of poor reliability.

  11. Eliminating log rolling as a spine trauma order.

    PubMed

    Conrad, Bryan P; Rossi, Gianluca Del; Horodyski, Mary Beth; Prasarn, Mark L; Alemi, Yara; Rechtine, Glenn R

    2012-01-01

Currently, up to 25% of patients with spinal cord injuries may experience neurologic deterioration during the initial management of their injuries. Therefore, more effective procedures need to be established for the transportation and care of these patients to reduce the risk of secondary neurologic damage. Here, we present more acceptable methods to minimize motion in the unstable spine during the management of patients with traumatic spine injuries. This review summarizes more than a decade of research aimed at evaluating different methods of caring for patients with spine trauma. The most commonly utilized technique to transport spinal cord injured patients, the log rolling maneuver, produced more motion than placing a patient on a spine board, removing a spine board, performing continuous lateral therapy, and positioning a patient prone for surgery. Alternative maneuvers that produced less motion included the straddle lift and slide, 6 + lift and slide, scoop stretcher, mechanical kinetic therapy, mechanical transfers, and the use of the operating table to rotate the patient to the prone position for surgical stabilization. The log roll maneuver should be removed from the trauma response guidelines for patients with suspected spine injuries, as it creates significantly more motion in the unstable spine than the readily available alternatives. The only exception is the patient who is found prone, in which case the patient should be log rolled directly onto the spine board utilizing a push technique.

  12. The quality and availability of hardwood logging residue based on developed quality levels

    Treesearch

    Floyd G. Timson

    1980-01-01

    Hardwood logging residue was examined for salvageable quality material. Four quality levels (QL 1 to QL 4), based on four sets of specifications, were developed. The specifications used surface indicators, sweep, center decay, and piece size to determine quality. Twenty-six percent of the total logging residue (residue ≥ 4 inches in diameter outside bark at...

  13. Using parallel computing methods to improve log surface defect detection methods

    Treesearch

    R. Edward Thomas; Liya Thomas

    2013-01-01

    Determining the size and location of surface defects is crucial to evaluating the potential yield and value of hardwood logs. Recently a surface defect detection algorithm was developed using the Java language. This algorithm was developed around an earlier laser scanning system that had poor resolution along the length of the log (15 scan lines per foot). A newer...

  14. The Development and Validation of the Instructional Practices Log in Science: A Measure of K-5 Science Instruction

    ERIC Educational Resources Information Center

    Adams, Elizabeth L.; Carrier, Sarah J.; Minogue, James; Porter, Stephen R.; McEachin, Andrew; Walkowiak, Temple A.; Zulli, Rebecca A.

    2017-01-01

    The Instructional Practices Log in Science (IPL-S) is a daily teacher log developed for K-5 teachers to self-report their science instruction. The items on the IPL-S are grouped into scales measuring five dimensions of science instruction: "Low-level Sense-making," "High-level Sense-making," "Communication,"…

  15. Reconciling timber extraction with biodiversity conservation in tropical forests using reduced-impact logging

    PubMed Central

    Bicknell, Jake E; Struebig, Matthew J; Davies, Zoe G; Baraloto, Christopher

    2015-01-01

    Over 20% of the world's tropical forests have been selectively logged, and large expanses are allocated for future timber extraction. Reduced-impact logging (RIL) is being promoted as best practice forestry that increases sustainability and lowers CO2 emissions from logging, by reducing collateral damage associated with timber extraction. RIL is also expected to minimize the impacts of selective logging on biodiversity, although this is yet to be thoroughly tested. We undertake the most comprehensive study to date to investigate the biodiversity impacts of RIL across multiple taxonomic groups. We quantified birds, bats and large mammal assemblage structures, using a before-after control-impact (BACI) design across 20 sample sites over a 5-year period. Faunal surveys utilized point counts, mist nets and line transects and yielded >250 species. We examined assemblage responses to logging, as well as partitions of feeding guild and strata (understorey vs. canopy), and then tested for relationships with logging intensity to assess the primary determinants of community composition. Community analysis revealed little effect of RIL on overall assemblages, as structure and composition were similar before and after logging, and between logging and control sites. Variation in bird assemblages was explained by natural rates of change over time, and not logging intensity. However, when partitioned by feeding guild and strata, the frugivorous and canopy bird ensembles changed as a result of RIL, although the latter was also associated with change over time. Bats exhibited variable changes post-logging that were not related to logging, whereas large mammals showed no change at all. Indicator species analysis and correlations with logging intensities revealed that some species exhibited idiosyncratic responses to RIL, whilst abundance change of most others was associated with time. Synthesis and applications. 
Our study demonstrates the relatively benign effect of reduced-impact logging (RIL) on birds, bats and large mammals in a neotropical forest context, and therefore, we propose that forest managers should improve timber extraction techniques more widely. If RIL is extensively adopted, forestry concessions could represent sizeable and important additions to the global conservation estate – over 4 million km². PMID:25954054

  16. Visible optical radiation generates bactericidal effect applicable for inactivation of health care associated germs demonstrated by inactivation of E. coli and B. subtilis using 405-nm and 460-nm light emitting diodes

    NASA Astrophysics Data System (ADS)

    Hönes, Katharina; Stangl, Felix; Sift, Michael; Hessling, Martin

    2015-07-01

    The Ulm University of Applied Sciences is investigating a technique using visible optical radiation (405 nm and 460 nm) to inactivate health-hazardous bacteria in water. A conceivable application could be point-of-use disinfection in developing countries for a safe drinking water supply. Another possible field of application is providing sterile water in medical institutions such as hospitals or dental surgeries, where contaminated pipework or long-term disuse often results in higher germ concentrations. Optical radiation for disinfection is presently used mostly in the UV wavelength range, and the possibility of bacterial inactivation with visible light has so far been largely disregarded. One advantage of visible light is that, instead of mercury arc lamps, light emitting diodes can be used, which are commercially available in the visible spectrum and therefore cost-efficient. Furthermore, they have a considerably longer life span than UV-C LEDs and, in contrast to mercury arc lamps, are non-hazardous. Above all, there are specific germs, such as Bacillus subtilis, that show inactivation resistance to UV-C wavelengths. Owing to the entirely different inactivation mechanism, even higher disinfection rates are reached for these germs than for Escherichia coli as a standard laboratory germ. At 460 nm, a reduction of three log levels was achieved with Bacillus subtilis and half a log level with Escherichia coli, both at a dose of about 300 J/cm². At the more efficient wavelength of 405 nm, four and a half log levels were reached with Bacillus subtilis and one and a half log levels with Escherichia coli, again both at a dose of about 300 J/cm². In addition, the optical setup employed, which delivered homogeneous illumination and avoids the need for stirring to compensate for irregularities, was an important improvement over previously published setups. Evaluated by optical simulation in ZEMAX®, the designed optical element provided homogeneous illumination with a maximum variation of ±10%.
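    The log-level arithmetic used above (a "three log-level" reduction means survivors fall by a factor of 10³) follows directly from before/after colony counts. A minimal sketch with hypothetical plate counts, not the study's data:

```python
import math

def log_reduction(count_before, count_after):
    """Log10 reduction factor: 3.0 means survivors fell by a factor of 10**3."""
    return math.log10(count_before / count_after)

# Hypothetical plate counts (CFU/ml) before and after irradiation
print(round(log_reduction(1e6, 1e3), 3))      # 3.0  (a "three log-level" reduction)
print(round(log_reduction(1e6, 10**5.5), 3))  # 0.5  (a "half log-level" reduction)
```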

  17. Visual Acuity and Contrast Sensitivity Development in Children: Sweep Visually Evoked Potential and Psychophysics.

    PubMed

    Almoqbel, Fahad M; Irving, Elizabeth L; Leat, Susan J

    2017-08-01

    The purpose of this study was to investigate the development of visual acuity (VA) and contrast sensitivity in children as measured with objective (sweep visually evoked potential) and subjective, psychophysical techniques, including signal detection theory (SDT), which attempts to control for differences in criterion or behavior between adults and children. Furthermore, this study examines the possibility of applying SDT methods with children. Visual acuity and contrast thresholds were measured in 12 children 6 to 7 years old, 10 children 8 to 9 years old, 10 children 10 to 12 years old, and 16 adults. For sweep visually evoked potential measurements, spatial frequency was swept from 1 to 40 cpd to measure VA, and contrast of sine-wave gratings (1 or 8 cpd) was swept from 0.33 to 30% to measure contrast thresholds. For psychophysical measurements, VA and contrast thresholds (1 or 8 cpd) were measured using a temporal two-alternative forced-choice staircase procedure and also with a yes-no SDT procedure. Optotype (logMAR [log of the minimum angle of resolution]) VA was also measured. The results of the various procedures were in agreement showing that there are age-related changes in threshold values and logMAR VA after the age of 6 years and that these visual functions do not become adult-like until the age of 8 to 9 years at the earliest. It was also found that children can participate in SDT procedures and do show differences in criterion compared with adults in psychophysical testing. These findings confirm a slightly later development of VA and contrast sensitivity (8 years or older) and indicate the importance of using SDT or forced-choice procedures in any developmental study to attempt to overcome the effect of criterion in children.
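    The signal detection theory analysis mentioned above separates sensitivity from response criterion via z-transformed hit and false-alarm rates. A minimal sketch; the rates below are illustrative, not the study's data:

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def criterion(hit_rate, fa_rate):
    """Response criterion c = -(z(H) + z(FA)) / 2; c > 0 indicates a conservative bias."""
    z = NormalDist().inv_cdf
    return -(z(hit_rate) + z(fa_rate)) / 2

# Illustrative rates for a cautious (conservative) observer
print(round(d_prime(0.69, 0.16), 2))   # 1.49
print(round(criterion(0.69, 0.16), 2)) # 0.25
```

    Because d′ is criterion-free, it lets a developmental study separate a child's true sensitivity from a more cautious yes/no response style.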

  18. Enhancement of subsurface geologic structure model based on gravity, magnetotelluric, and well log data in Kamojang geothermal field

    NASA Astrophysics Data System (ADS)

    Yustin Kamah, Muhammad; Armando, Adilla; Larasati Rahmani, Dinda; Paramitha, Shabrina

    2017-12-01

    Geophysical methods such as gravity and magnetotelluric methods are commonly used in conventional and unconventional energy exploration, notably for exploring geothermal prospects. They are used to identify the subsurface geologic structures that are estimated to act as paths for fluid flow. This study was conducted in the Kamojang Geothermal Field with the aim of highlighting the volcanic lineament in West Java, precisely in the Guntur-Papandayan chain, where there are three geothermal systems. The Kendang Fault has a predominant NE-SW direction, identified by magnetotelluric and gravity data processing techniques. Gravity techniques such as spectral analysis, derivative solutions, and Euler deconvolution indicate the type and geometry of the anomaly. Magnetotelluric techniques such as inverse modeling and polar diagrams reveal subsurface resistivity characteristics and the major orientation. Furthermore, the results from those methods are compared to geologic information and several well-data sections, with which they are sufficiently consistent. This research is useful for tracing out additional potential development areas.

  19. Compositional data analysis as a robust tool to delineate hydrochemical facies within and between gas-bearing aquifers

    NASA Astrophysics Data System (ADS)

    Owen, D. Des. R.; Pawlowsky-Glahn, V.; Egozcue, J. J.; Buccianti, A.; Bradd, J. M.

    2016-08-01

    Isometric log ratios of proportions of major ions, derived from intuitive sequential binary partitions, are used to characterize hydrochemical variability within and between coal seam gas (CSG) and surrounding aquifers in a number of sedimentary basins in the USA and Australia. These isometric log ratios are the coordinates corresponding to an orthonormal basis in the sample space (the simplex). The characteristic proportions of ions, as described by linear models of isometric log ratios, can be used for a mathematical-descriptive classification of water types. This is a more informative and robust method of describing water types than simply classifying a water type based on the dominant ions. The approach allows (a) compositional distinctions between very similar water types to be made and (b) large data sets with a high degree of variability to be rapidly assessed with respect to particular relationships/compositions that are of interest. A major advantage of these techniques is that major and minor ion components can be comprehensively assessed and subtle processes—which may be masked by conventional techniques such as Stiff diagrams, Piper plots, and classic ion ratios—can be highlighted. Results show that while all CSG groundwaters are dominated by Na, HCO3, and Cl ions, the proportions of other ions indicate they can evolve via different means and the particular proportions of ions within total or subcompositions can be unique to particular basins. Using isometric log ratios, subtle differences in the behavior of Na, K, and Cl between CSG water types and very similar Na-HCO3 water types in adjacent aquifers are also described. A complementary pair of isometric log ratios, derived from a geochemically-intuitive sequential binary partition that is designed to reflect compositional variability within and between CSG groundwater, is proposed. 
These isometric log ratios can be used to model a hydrochemical pathway associated with methanogenesis and/or to delineate groundwater associated with high gas concentrations.
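    An isometric log ratio (balance) from a sequential binary partition can be computed directly from ion proportions. A minimal sketch assuming a hypothetical 3-part Na-HCO3-Cl composition, not the paper's actual partition:

```python
import math

def balance(numer, denom):
    """Isometric log ratio (balance) between two groups of parts:
    sqrt(r*s/(r+s)) * ln(gmean(numer) / gmean(denom))."""
    r, s = len(numer), len(denom)
    gmean = lambda xs: math.exp(sum(math.log(x) for x in xs) / len(xs))
    return math.sqrt(r * s / (r + s)) * math.log(gmean(numer) / gmean(denom))

# Hypothetical molar proportions of a Na-HCO3-Cl water type (closed to 1)
na, hco3, cl = 0.55, 0.35, 0.10
ilr1 = balance([na], [hco3, cl])  # Na against the remaining parts
ilr2 = balance([hco3], [cl])      # HCO3 against Cl
```

    Because the balances form orthonormal coordinates of the simplex, linear models fitted to them are not distorted by the closure constraint that affects raw ion concentrations.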

  20. 3-D visualisation of palaeoseismic trench stratigraphy and trench logging using terrestrial remote sensing and GPR - a multiparametric interpretation

    NASA Astrophysics Data System (ADS)

    Schneiderwind, Sascha; Mason, Jack; Wiatr, Thomas; Papanikolaou, Ioannis; Reicherter, Klaus

    2016-03-01

    Two normal faults on the island of Crete and mainland Greece were studied to test an innovative workflow with the goal of obtaining a more objective palaeoseismic trench log and a 3-D view of the sedimentary architecture within the trench walls. Sedimentary feature geometries in palaeoseismic trenches are related to palaeoearthquake magnitudes, which are used in seismic hazard assessments. If the geometry of these sedimentary features can be measured more representatively, seismic hazard assessments can be improved. In this study, more representative measurements of sedimentary features are achieved by combining classical palaeoseismic trenching techniques with multispectral approaches. A conventional trench log was first compared to the results of ISO (iterative self-organising) cluster analysis of a true colour photomosaic representing the spectrum of visible light. Photomosaic acquisition disadvantages (e.g. illumination) were addressed by complementing the data set with an active near-infrared backscatter image from t-LiDAR measurements. The multispectral analysis shows that distinct layers can be identified, and it compares well with the conventional trench log; accordingly, adjacent stratigraphic units could be distinguished by their particular multispectral composition signatures. Based on the trench log, a 3-D interpretation of attached 2-D ground-penetrating radar (GPR) profiles collected on the vertical trench wall was then possible. This is highly beneficial for measuring representative layer thicknesses, displacements, and geometries at depth within the trench wall. Thus, misinterpretation due to cutting effects is minimised.
This manuscript combines multiparametric approaches and shows (i) how a 3-D visualisation of palaeoseismic trench stratigraphy and logging can be accomplished by combining t-LiDAR and GPR techniques, and (ii) how a multispectral digital analysis can offer additional advantages to interpret palaeoseismic and stratigraphic data. The multispectral data sets are stored allowing unbiased input for future (re)investigations.

  1. Log-less metadata management on metadata server for parallel file systems.

    PubMed

    Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests, which have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage to provide a highly available metadata service, while also improving metadata-processing performance. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than that incurred by a metadata server that adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server has crashed or unexpectedly entered a nonoperational state.

  2. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    PubMed Central

    Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests, which have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage to provide a highly available metadata service, while also improving metadata-processing performance. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than that incurred by a metadata server that adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server has crashed or unexpectedly entered a nonoperational state. PMID:24892093
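    The log-less mechanism described in both records (clients retain sent requests in memory, so the MDS can skip synchronous journaling and recover by replay) can be sketched as a toy protocol. Everything below is a simplified illustration, not the paper's implementation:

```python
class MDS:
    """Metadata server that applies requests in memory, with no journal writes."""
    def __init__(self):
        self.metadata = {}

    def handle(self, request):
        op, path, value = request
        if op == "set":
            self.metadata[path] = value  # no synchronous log to nonvolatile storage

    def recover(self, clients):
        """After a crash, rebuild state by replaying clients' backed-up requests."""
        self.metadata = {}
        for client in clients:
            for request in client.backup:
                self.handle(request)

class Client:
    """Client file system that backs up sent metadata requests in its memory."""
    def __init__(self, mds):
        self.mds = mds
        self.backup = []  # requests kept so the MDS state can be reconstructed

    def send(self, request):
        self.backup.append(request)  # back up before sending
        self.mds.handle(request)

mds = MDS()
client = Client(mds)
client.send(("set", "/a.txt", {"size": 42}))
mds.metadata = {}          # simulate an MDS crash losing in-memory state
mds.recover([client])
print(mds.metadata["/a.txt"])  # {'size': 42}
```

    The trade-off the paper exploits is visible here: durability work moves from a synchronous server-side journal to cheap client-side memory copies, at the cost of needing all involved clients to participate in recovery.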

  3. A three-dimensional optimal sawing system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang; R. Edward. Thomas

    2011-01-01

    A three-dimensional (3D) log sawing optimization system was developed to perform 3D log generation, opening face determination, sawing simulation, and lumber grading. Superficial characteristics of logs such as length, large-end and small-end diameters, and external defects were collected from local sawmills. Internal log defect positions and shapes were predicted...

  4. ALOG user's manual: A Guide to using the spreadsheet-based artificial log generator

    Treesearch

    Matthew F. Winn; Philip A. Araman; Randolph H. Wynne

    2012-01-01

    Computer programs that simulate log sawing can be valuable training tools for sawyers, as well as a means of testing different sawing patterns. Most available simulation programs rely on diagrammed-log databases, which can be very costly and time consuming to develop. Artificial Log Generator (ALOG) is a user-friendly Microsoft® Excel®...

  5. Combining Radar and Optical Data for Forest Disturbance Studies

    NASA Technical Reports Server (NTRS)

    Ranson, K. Jon; Smith, David E. (Technical Monitor)

    2002-01-01

    Disturbance is an important factor in determining the carbon balance and succession of forests. Until the early 1990s, researchers focused on using optical or thermal sensors to detect and map forest disturbances from wildfires, logging, or insect outbreaks. As part of a NASA Siberian mapping project, a study evaluated the capability of three different radar sensors (ERS, JERS and Radarsat) and an optical sensor (Landsat 7) to detect fire scars, logging and insect damage in the boreal forest. This paper describes the data sets and techniques used to evaluate the use of remote sensing to detect disturbance in central Siberian forests. Using images from each sensor individually and in combination, an assessment of the utility of these sensors was developed. Transformed divergence analysis and maximum likelihood classification revealed that Landsat data was the single best data type for this purpose. However, the combined use of the three radar sensors and the optical sensor did improve the discrimination of these disturbances.

  6. Microbiological levels of randomly selected food contact surfaces in hotels located in Spain during 2007-2009.

    PubMed

    Doménech-Sánchez, Antonio; Laso, Elena; Pérez, María José; Berrocal, Clara Isabel

    2011-09-01

    The aim of this study was to survey the microbial levels of food contact surfaces in hotels. Microbiological levels of 4611 surfaces (chopping machines, kitchenware, knives, worktops, and cutting boards) from 280 different facilities in Spain were determined over a 3-year period. The contact-plate technique was used throughout the survey. Overall, the mean of the log of total aerobic count cm(-2) was 0.62, better than the values reported for child-care and assisted living facilities. Significant differences were detected among different types of surfaces, times of sampling, seasons, and years. The majority (74%) of food contact surfaces sampled in Spanish hotels were within the recommended standard of <1.3 log CFU cm(-2), and differences depended on several factors. Our results provide a representative picture of the actual situation in these resorts and establish the basis for the development of educational programs to improve food handlers' knowledge of foodborne diseases and their transmission via food contact surfaces.

  7. Comparison study of noise reduction algorithms in dual energy chest digital tomosynthesis

    NASA Astrophysics Data System (ADS)

    Lee, D.; Kim, Y.-S.; Choi, S.; Lee, H.; Choi, S.; Kim, H.-J.

    2018-04-01

    Dual energy chest digital tomosynthesis (CDT) is a recently developed medical technique that takes advantage of both tomosynthesis and dual energy X-ray images. However, quantum noise, which occurs in dual energy X-ray images, strongly interferes with diagnosis in various clinical situations. Therefore, noise reduction is necessary in dual energy CDT. In this study, noise-compensating algorithms, including a simple smoothing of high-energy images (SSH) and anti-correlated noise reduction (ACNR), were evaluated in a CDT system. We used a newly developed prototype CDT system and anthropomorphic chest phantom for experimental studies. The resulting images demonstrated that dual energy CDT can selectively image anatomical structures, such as bone and soft tissue. Among the resulting images, those acquired with ACNR showed the best image quality. Both coefficient of variation and contrast to noise ratio (CNR) were the highest in ACNR among the three different dual energy techniques, and the CNR of bone was significantly improved compared to the reconstructed images acquired at a single energy. This study demonstrated the clinical value of dual energy CDT and quantitatively showed that ACNR is the most suitable among the three developed dual energy techniques, including standard log subtraction, SSH, and ACNR.
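    Standard log subtraction, the baseline among the three dual energy techniques compared above, forms a tissue-selective image as a weighted difference of log-transformed projections. A minimal per-pixel sketch; the weight and intensities are illustrative, not the study's calibration:

```python
import math

def log_subtraction(high, low, w):
    """Standard dual-energy log subtraction: s = ln(I_high) - w * ln(I_low).
    Choosing w to cancel soft tissue leaves a bone-selective image (and vice versa)."""
    return [[math.log(h) - w * math.log(l) for h, l in zip(row_h, row_l)]
            for row_h, row_l in zip(high, low)]

# Illustrative 2x2 transmission images (fractions of incident intensity)
I_high = [[0.80, 0.55], [0.60, 0.30]]
I_low  = [[0.70, 0.35], [0.45, 0.15]]
bone = log_subtraction(I_high, I_low, w=0.5)
```

    Because the two log images carry anti-correlated quantum noise, the subtraction amplifies noise; ACNR-style methods add back a smoothed version of that anti-correlated component, which is why they outperformed plain log subtraction in the study.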

  8. What's new in well logging and formation evaluation

    USGS Publications Warehouse

    Prensky, S.

    2011-01-01

    A number of significant new developments are emerging in well logging and formation evaluation. Some of the new developments include an ultrasonic wireline imager, an electromagnetic free-point indicator, wired and fiber-optic coiled tubing systems, and extreme-temperature logging-while-drilling (LWD) tools. The continued consolidation of logging and petrophysical service providers in 2010 means that these innovations are increasingly being provided by a few large companies. Weatherford International has launched a slimhole cross-dipole tool as part of the company's line of compact logging tools. The 26-ft-long Compact Cross-Dipole Sonic (CXD) tool can be run as part of a quad-combo compact logging string. Halliburton has introduced a version of its circumferential acoustic scanning tool (CAST) that runs on monoconductor cable (CAST-M) to provide high-resolution images in open hole and in cased hole for casing and cement evaluation.

  9. Automatic lithofacies segmentation from well-logs data. A comparative study between the Self-Organizing Map (SOM) and Walsh transform

    NASA Astrophysics Data System (ADS)

    Aliouane, Leila; Ouadfeul, Sid-Ali; Rabhi, Abdessalem; Rouina, Fouzi; Benaissa, Zahia; Boudella, Amar

    2013-04-01

    The main goal of this work is to compare two lithofacies segmentation techniques for a reservoir interval. The first is based on Kohonen's Self-Organizing Map neural network; the second is based on Walsh transform decomposition. Application to real well-log data from two boreholes located in the Algerian Sahara shows that the Self-Organizing Map is able to provide more lithological detail than the lithofacies model obtained by the Walsh decomposition. Keywords: Comparison, Lithofacies, SOM, Walsh
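    A Kohonen SOM of the kind compared above can be sketched in a few lines for scalar well-log values. A minimal 1-D illustration with hypothetical gamma-ray readings; the paper's actual inputs, network size, and training schedule are not specified here:

```python
import math, random

def train_som(samples, n_nodes=2, epochs=100, lr0=0.5, sigma0=1.0, seed=0):
    """Minimal 1-D Kohonen Self-Organizing Map for scalar log values.
    Each node weight converges to a lithofacies prototype."""
    rng = random.Random(seed)
    lo, hi = min(samples), max(samples)
    weights = [rng.uniform(lo, hi) for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # decaying learning rate
        sigma = max(sigma0 * (1 - t / epochs), 0.5)   # shrinking neighborhood
        for x in samples:
            bmu = min(range(n_nodes), key=lambda i: abs(weights[i] - x))
            for i in range(n_nodes):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                weights[i] += lr * h * (x - weights[i])
    return weights

def segment(samples, weights):
    """Label each depth sample with its best-matching node (facies index)."""
    return [min(range(len(weights)), key=lambda i: abs(weights[i] - x))
            for x in samples]

# Hypothetical gamma-ray log (API units): interbedded clean sand and shale
gr_log = [30, 32, 31, 110, 115, 112, 33, 29, 118, 111]
weights = train_som(gr_log)
labels = segment(gr_log, weights)
```

    With two well-separated value populations the two nodes settle near the sand and shale levels, so the label sequence recovers the interbedding.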

  10. Contemporary surgical trends in the management of upper tract calculi.

    PubMed

    Oberlin, Daniel T; Flum, Andrew S; Bachrach, Laurie; Matulewicz, Richard S; Flury, Sarah C

    2015-03-01

    Upper tract nephrolithiasis is a common surgical condition that is treated with multiple surgical techniques, including shock wave lithotripsy, ureteroscopy and percutaneous nephrolithotomy. We analyzed case logs submitted to the ABU by candidates for initial certification and recertification to help elucidate the trends in management of upper tract urinary calculi. Annualized case logs from 2003 to 2012 were analyzed. We used logistic regression models to assess how surgeon specific attributes affected the way that upper tract stones were treated. Cases were identified by the CPT code of the corresponding procedure. A total of 6,620 urologists in 3 certification groups recorded case logs, including 2,275 for initial certification, 2,381 for first recertification and 1,964 for second recertification. A total of 441,162 procedures were logged, of which 54.2% were ureteroscopy, 41.3% were shock wave lithotripsy and 4.5% were percutaneous nephrolithotomy. From 2003 to 2013 there was an increase in ureteroscopy from 40.9% to 59.6% and a corresponding decrease in shock wave lithotripsy from 54% to 36.3%. For new urologists ureteroscopy increased from 47.6% to 70.9% of all stones cases logged and for senior clinicians ureteroscopy increased from 40% to 55%. Endourologists performed a significantly higher proportion of percutaneous nephrolithotomies than nonendourologists (10.6% vs 3.69%, p <0.0001) and a significantly smaller proportion of shock wave lithotripsies (34.2% vs 42.2%, p = 0.001). Junior and senior clinicians showed a dramatic adoption of endoscopic techniques. Treatment of upper tract calculi is an evolving field and provider specific attributes affect how these stones are treated. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  11. Combined intrastromal astigmatic keratotomy and laser in situ keratomileusis flap followed by photoablation to correct post-penetrating keratoplasty ametropia and high astigmatism: One-year follow-up.

    PubMed

    Shalash, Riad B; Elshazly, Malak I; Salama, Marwa M

    2015-10-01

    To evaluate a new technique combining intrastromal astigmatic keratotomy (AK) with a laser in situ keratomileusis (LASIK) flap followed by excimer laser photoablation to correct post-penetrating keratoplasty (PKP) high astigmatism and ametropia. Kasr El Aini Hospital, Cairo University, Cairo, Egypt. Prospective interventional uncontrolled case series. Patients with post-PKP high astigmatism and ametropia had paired intrastromal AK with LASIK flap using the M2 microkeratome followed 2 to 3 months later by excimer laser photoablation. The main outcome measures were uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), mean refractive spherical equivalent (SE), and mean cylinder after each step and at the 1-year follow-up. The study comprised 20 eyes (20 patients). All parameters were significantly improved in all patients by the last follow-up visit. The mean UDVA improved from 1.07 logMAR ± 0.2 (SD) preoperatively to 0.23 ± 0.18 logMAR (P < .001), the mean CDVA improved from 0.79 ± 0.18 logMAR to 0.12 ± 0.12 logMAR (P < .001), the mean refractive SE improved from -5.04 ± 2.62 diopters (D) to -1.47 ± 1.32 D (P = .001), and the mean cylinder reduced from -5.39 ± 0.98 D to -1.05 ± 0.71 D (P < .001). The mean correction index was 0.84 ± 0.10, and the mean flattening index was 0.83 ± 0.10. Thirty-five percent of cases developed microperforations, and 15% developed epithelial ingrowth. This combined approach allowed for the correction of high astigmatism and ametropia following PKP; however, epithelial ingrowth requiring intervention is a complication to be considered. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  12. Modelling uveal melanoma

    PubMed Central

    Foss, A.; Cree, I.; Dolin, P.; Hungerford, J.

    1999-01-01

    BACKGROUND/AIM—There has been no consistent pattern reported on how mortality for uveal melanoma varies with age. This information can be useful to model the complexity of the disease. The authors have examined ocular cancer trends, as an indirect measure for uveal melanoma mortality, to see how rates vary with age and to compare the results with their other studies on predicting metastatic disease.
METHODS—Age specific mortality was examined for England and Wales, the USA, and Canada. A log-log model was fitted to the data. The slopes of the log-log plots were used as a measure of disease complexity and compared with the results of previous work on predicting metastatic disease.
RESULTS—The log-log model provided a good fit for the US and Canadian data, but the observed rates deviated for England and Wales among people over the age of 65 years. The log-log model for mortality data suggests that the underlying process depends upon four rate limiting steps, while a similar model for the incidence data suggests between three and four rate limiting steps. Further analysis of previous data on predicting metastatic disease on the basis of tumour size and blood vessel density would indicate a single rate limiting step between developing the primary tumour and developing metastatic disease.
CONCLUSIONS—There is significant underreporting or underdiagnosis of ocular melanoma for England and Wales in those over the age of 65 years. In those under the age of 65, a model is presented for ocular melanoma oncogenesis requiring three rate limiting steps to develop the primary tumour and a fourth rate limiting step to develop metastatic disease. The three steps in the generation of the primary tumour involve two key processes—namely, growth and angiogenesis within the primary tumour. The step from development of the primary to development of metastatic disease is likely to involve a single rate limiting process.

 PMID:10216060
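    Fitting the slope of a log-log plot of rate against age is a simple least-squares problem; in multistage models of this kind the slope is read as roughly the number of rate-limiting steps minus one (the Armitage-Doll formulation, in which incidence is proportional to age raised to the steps-minus-one power). A minimal sketch with synthetic rates, not the paper's data:

```python
import math

def loglog_slope(ages, rates):
    """Least-squares slope of log(rate) versus log(age)."""
    xs = [math.log(a) for a in ages]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# Synthetic rates following rate = c * age**3, i.e. four rate-limiting steps
# under the Armitage-Doll reading (slope = steps - 1)
ages = [35, 45, 55, 65, 75]
rates = [1e-6 * a ** 3 for a in ages]
slope = loglog_slope(ages, rates)  # 3.0 up to floating point
```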

  13. Extragalactic counterparts to Einstein slew survey sources

    NASA Technical Reports Server (NTRS)

    Schachter, Jonathan F.; Elvis, Martin; Plummer, David; Remillard, Ron

    1992-01-01

    The Einstein slew survey consists of 819 bright X-ray sources, of which 636 (or 78 percent) are identified with counterparts in standard catalogs. The importance of bright X-ray surveys is stressed, and the slew survey is compared to the Rosat all sky survey. Statistical techniques for minimizing confusion in arcminute error circles in digitized data are discussed. The 238 slew survey active galactic nuclei, clusters, and BL Lacertae objects identified to date and their implications for logN-logS and source evolution studies are described.

  14. A New Essential Functions Installed DWH in Hospital Information System: Process Mining Techniques and Natural Language Processing.

    PubMed

    Honda, Masayuki; Matsumoto, Takehiro

    2017-01-01

    Several kinds of event log data produced in daily clinical activities have yet to be used for the secure and efficient improvement of hospital activities. Data Warehouse systems in Hospital Information Systems, used for the analysis of structured data such as diseases, lab tests, and medications, have also shown efficient outcomes. This article focuses on two kinds of essential functions: process mining using log data and non-structured data analysis via Natural Language Processing.

  15. Koopman Mode Decomposition Methods in Dynamic Stall: Reduced Order Modeling and Control

    DTIC Science & Technology

    2015-11-10

the flow phenomena by separating them into individual modes. The technique of Proper Orthogonal Decomposition (POD), see [Holmes: 1998], is a popular... sampled values h(k), k = 0,…,2M-1, of the exponential sum: 1. Solve the following linear system where... 2. Compute all zeros zj ∈ D, j = 1,…,M... of the Prony polynomial, i.e., calculate all eigenvalues of the associated companion matrix and form fj = log zj for j = 1,…,M, where log is the
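The snippet outlines the classical Prony method: solve a Hankel linear system for the coefficients of the Prony polynomial, compute its zeros as the eigenvalues of the associated companion matrix, and take logarithms to recover the exponents. A minimal NumPy sketch of that standard formulation (function name and sample values are illustrative, not from the report):

```python
import numpy as np

def prony_exponents(h, M):
    """Recover the exponents f_j in h(k) = sum_{j=1}^M c_j * exp(f_j * k)
    from the 2M samples h(0), ..., h(2M-1)."""
    h = np.asarray(h, dtype=complex)
    # Step 1: solve the Hankel system for the Prony polynomial coefficients p_m:
    #   sum_{m=0}^{M-1} p_m h(k+m) = -h(k+M),  k = 0, ..., M-1
    H = np.array([[h[k + m] for m in range(M)] for k in range(M)])
    p = np.linalg.solve(H, -h[M:2 * M])
    # Step 2: zeros z_j of the monic Prony polynomial
    #   z^M + p_{M-1} z^{M-1} + ... + p_0
    # are the eigenvalues of its companion matrix (np.roots builds it internally).
    z = np.roots(np.concatenate(([1.0], p[::-1])))
    # Step 3: f_j = log z_j (principal branch)
    return np.log(z)
```

For example, samples of h(k) = 2^k + 0.5^k yield exponents ±log 2.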

  16. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. The tool is organized into four modules: library design, model construction, simulation, and experimentation and analysis. The library design module supports the building of library knowledge, including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.

  17. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    PubMed

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
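The two-stage scheme described here, an access correlation matrix mined from the historical log followed by a heuristic placement, can be illustrated with a toy sketch. This is not the paper's algorithm: the session-based co-occurrence counting and the greedy rule that spreads strongly correlated files across nodes are generic assumptions chosen for illustration.

```python
from itertools import combinations

def access_correlation(sessions, files):
    """Co-access counts: how often two files are requested in the same session."""
    idx = {f: i for i, f in enumerate(files)}
    n = len(files)
    C = [[0] * n for _ in range(n)]
    for s in sessions:
        for a, b in combinations(sorted(set(s)), 2):
            C[idx[a]][idx[b]] += 1
            C[idx[b]][idx[a]] += 1
    return C

def place_greedy(C, files, n_nodes):
    """Heuristic placement: put each file on the node whose current contents
    it is least correlated with, so co-accessed files land on different
    nodes and can be fetched in parallel."""
    placement = {}
    for i, f in enumerate(files):
        best = min(range(n_nodes),
                   key=lambda nd: sum(C[i][j] for j, g in enumerate(files)
                                      if placement.get(g) == nd))
        placement[f] = best
    return placement
```

With sessions [["a","b"], ["a","b"], ["a","c"]], files "a" and "b" are frequently co-accessed and end up on different nodes.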

  18. A log-sinh transformation for data normalization and variance stabilization

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Shrestha, D. L.; Robertson, D. E.; Pokhrel, P.

    2012-05-01

    When quantifying model prediction uncertainty, it is statistically convenient to represent model errors that are normally distributed with a constant variance. The Box-Cox transformation is the most widely used technique to normalize data and stabilize variance, but it is not without limitations. In this paper, a log-sinh transformation is derived based on a pattern of errors commonly seen in hydrological model predictions. It is suited to applications where prediction variables are positively skewed and the spread of errors is seen to first increase rapidly, then slowly, and eventually approach a constant as the prediction variable becomes greater. The log-sinh transformation is applied in two case studies, and the results are compared with one- and two-parameter Box-Cox transformations.
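The log-sinh transformation referred to here is commonly written z = (1/b)·log(sinh(a + b·y)): for small y it behaves like a log transform (compressing a rapidly growing error spread), while for large y it becomes nearly linear, so the transformed spread approaches a constant. A small sketch under that assumed form, with illustrative parameter values:

```python
import numpy as np

def log_sinh(y, a, b):
    # forward transform: z = (1/b) * log(sinh(a + b*y))
    return np.log(np.sinh(a + b * y)) / b

def log_sinh_inv(z, a, b):
    # inverse: y = (arcsinh(exp(b*z)) - a) / b
    return (np.arcsinh(np.exp(b * z)) - a) / b
```

The parameters a and b are fitted to the data in practice; the values below are placeholders to show the round trip.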

  19. Integrated geostatistics for modeling fluid contacts and shales in Prudhoe Bay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez, G.; Chopra, A.K.; Severson, C.D.

    1997-12-01

Geostatistics techniques are being used increasingly to model reservoir heterogeneity at a wide range of scales. A variety of techniques is now available with differing underlying assumptions, complexity, and applications. This paper introduces a novel method of geostatistics to model dynamic gas-oil contacts and shales in the Prudhoe Bay reservoir. The method integrates reservoir description and surveillance data within the same geostatistical framework. Surveillance logs and shale data are transformed to indicator variables. These variables are used to evaluate vertical and horizontal spatial correlation and cross-correlation of gas and shale at different times and to develop variogram models. Conditional simulation techniques are used to generate multiple three-dimensional (3D) descriptions of gas and shales that provide a measure of uncertainty. These techniques capture the complex 3D distribution of gas-oil contacts through time. The authors compare results of the geostatistical method with conventional techniques as well as with infill wells drilled after the study. Predicted gas-oil contacts and shale distributions are in close agreement with gas-oil contacts observed at infill wells.

  20. Riparian Systems and Forest Management—Changes in Harvesting Techniques and their Effects on Decomposed Granitic Soils

    Treesearch

    John W. Bramhall

    1989-01-01

    In the 1950s, timber on steep granitic terrain in Trinity County, California was harvested by using the logging techniques of the time. After Trinity Dam was built in the 1960s, it became evident these techniques were not suited to quality riparian habitat and healthy anadromous fisheries. Since adoption of the Z'berg-Nejedly Forest Practice Act in 1973, efforts...

  1. Green lumber grade yields from factory grade logs of three oak species

    Treesearch

    Daniel A. Yaussy

    1986-01-01

    Multivariate regression models were developed to predict green board foot yields for the seven common factory lumber grades processed from white, black, and chestnut oak factory grade logs. These models use the standard log measurements of grade, scaling diameter, log length, and proportion of scaling defect. Any combination of lumber grades (such as 1 Common and...

  2. HW Buck for Windows: the optimal hardwood log bucking decision simulator with expanded capabilities

    Treesearch

    James B. Pickens; Scott Noble; Blair Orr; Philip A. Araman; John E. Baumgras; Al Steele

    2006-01-01

It has long been recognized that inappropriate placement of crosscuts when manufacturing hardwood logs from harvested stems (log bucking) reduces the value of the logs produced. Recent studies have estimated losses ranging from 28% to 38% in the Lake States region. These estimates were developed by evaluating the bucking cuts chosen by harvesting crews and comparing...

  3. RAYSAW: a log sawing simulator for 3D laser-scanned hardwood logs

    Treesearch

    R. Edward Thomas

    2013-01-01

    Laser scanning of hardwood logs provides detailed high-resolution imagery of log surfaces. Characteristics such as sweep, taper, and crook, as well as most surface defects, are visible to the eye in the scan data. In addition, models have been developed that predict interior knot size and position based on external defect information. Computerized processing of...

  4. SU-E-J-182: Reproducibility of Tumor Motion Probability Distribution Function in Stereotactic Body Radiation Therapy of Lung Using Real-Time Tumor-Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Park, S

    2015-06-15

Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next-generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients having fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient's mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07-0.30), 0.14 (0.07-0.32) and 0.16 (0.09-0.28) in the LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58-9.99 mm), 7.81 mm (2.87-15.57 mm) and 11.26 mm (3.80-21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
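The reproducibility metric above is a KL divergence between histogram-based motion PDFs built from logged marker positions. A generic sketch of that computation (the bin count, range, and ε-guard for empty bins are illustrative choices, not taken from the study):

```python
import numpy as np

def motion_pdf(positions, bins, rng):
    """Histogram-based PDF of logged marker positions along one axis."""
    hist, _ = np.histogram(positions, bins=bins, range=rng)
    p = hist.astype(float)
    p /= p.sum()
    return p

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i); eps guards empty bins."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))
```

Identical PDFs give a divergence of zero; the more the fraction-n PDF drifts from the first fraction's PDF, the larger the divergence.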

  5. Developments in Quantitative Structure-Activity Relationships (QSAR). A Review

    DTIC Science & Technology

    1976-07-01

hyphae Analogs Inhibition of β-Nitrostyrenes 20 84 Growth Botrytis cinerea Inhibition of α-Nitrostyrenes 6 84 Growth Bovine hemoglobin Binding of... Aspergillus niger, phenyl methacrylates upon Hansenula anomala and RR'NCSS Na+ upon Botrytis cinerea conformed to the general equation 35. The equations... log 1/C vs log kw, 79; Botrytis cinerea, 41, 64; Bovine hemoglobin, 36; Bovine serum albumin, 36; log 1/C vs log P, 79

  6. Sonodynamic inactivation of Gram-positive and Gram-negative bacteria using a Rose Bengal-antimicrobial peptide conjugate.

    PubMed

    Costley, David; Nesbitt, Heather; Ternan, Nigel; Dooley, James; Huang, Ying-Ying; Hamblin, Michael R; McHale, Anthony P; Callan, John F

    2017-01-01

Combating antimicrobial resistance is one of the most serious public health challenges facing society today. The development of new antibiotics or alternative techniques that can help combat antimicrobial resistance is being prioritised by many governments and stakeholders across the globe. Antimicrobial photodynamic therapy is one such technique that has received considerable attention but is limited by the inability of light to penetrate through human tissue, reducing its effectiveness when used to treat deep-seated infections. The related technique sonodynamic therapy (SDT) has the potential to overcome this limitation given the ability of low-intensity ultrasound to penetrate human tissue. In this study, a Rose Bengal-antimicrobial peptide conjugate was prepared for use in antimicrobial SDT (ASDT). When Staphylococcus aureus and Pseudomonas aeruginosa planktonic cultures were treated with the conjugate and subsequently exposed to ultrasound, 5 log and 7 log reductions, respectively, in bacterial numbers were observed. The conjugate also displayed improved uptake by bacterial cells compared with a mammalian cell line (P ≤ 0.01), whilst pre-treatment of a P. aeruginosa biofilm with ultrasound resulted in a 2.6-fold improvement in sensitiser diffusion (P ≤ 0.01). A preliminary in vivo experiment involving ASDT treatment of P. aeruginosa-infected wounds in mice demonstrated that ultrasound irradiation of conjugate-treated wounds effects a substantial reduction in bacterial burden. Combined, the results obtained from this study highlight ASDT as a targeted broad-spectrum novel modality with potential for the treatment of deep-seated bacterial infections. Copyright © 2016. Published by Elsevier B.V.

  7. Log evaluation in wells drilled with inverted oil emulsion mud. [GLOBAL program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, D.P.; Lacour-Gayet, P.J.; Suau, J.

    1981-01-01

As greater use is made of inverted oil emulsion muds in the development of North Sea oil fields, the need for more precise log evaluation in this environment becomes apparent. This paper demonstrates an approach using the Dual Induction Log, taking into account invasion and boundary effects. Lithology and porosity are derived from the Formation Density or Litho-Density Log, Compensated Neutron Log, Sonic Log and the Natural Gamma Ray Spectrometry Log. The effect of invasion by the oil component of the mud filtrate is treated in the evaluation, and a measurement of Moved Water is made. Computations of petrophysical properties are implemented by means of the GLOBAL interpretation program, taking advantage of its capability of adaptation to any combination of logging sensors. 8 refs.

  8. Reducing Interprocessor Dependence in Recoverable Distributed Shared Memory

    NASA Technical Reports Server (NTRS)

    Janssens, Bob; Fuchs, W. Kent

    1994-01-01

    Checkpointing techniques in parallel systems use dependency tracking and/or message logging to ensure that a system rolls back to a consistent state. Traditional dependency tracking in distributed shared memory (DSM) systems is expensive because of high communication frequency. In this paper we show that, if designed correctly, a DSM system only needs to consider dependencies due to the transfer of blocks of data, resulting in reduced dependency tracking overhead and reduced potential for rollback propagation. We develop an ownership timestamp scheme to tolerate the loss of block state information and develop a passive server model of execution where interactions between processors are considered atomic. With our scheme, dependencies are significantly reduced compared to the traditional message-passing model.

  9. LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.

    PubMed

    Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin

    2014-12-01

    The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, this also poses a great challenge to analyze the behavior and glean insights into the complex, large data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from the vast amount of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a proper visual summary of the dynamics of user loyalty of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and the interview with domain experts are conducted to demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.

  10. Generalized image contrast enhancement technique based on Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1994-03-01

    This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.

  11. Tractor-logging costs and production in old-growth redwood forests

    Treesearch

    Kenneth N. Boe

    1963-01-01

    A cost accounting analysis of full-scale logging operations in old-growth redwood during 2 years revealed that it cost $12.24 per M bd. ft. (gross Scribner log scale) to get logs on trucks. Road development costs averaged another $5.19 per M bd. ft. Felling-bucking production was calculated by average tree d.b.h. Both skidding and loading outputs per hour were...

  12. The presence and nature of ellipticity in Appalachian hardwood logs

    Treesearch

    R. Edward Thomas; John S. Stanovick; Deborah Conner

    2017-01-01

The ellipticity of hardwood logs is most often observed and measured from either end of a log. However, due to the nature of hardwood tree growth and bucking practices, the assessment of ellipticity in this manner may not be accurate. Trees grown on hillsides often develop supporting wood that gives the first few feet of the log butt a significant degree of...

  13. Defects in Hardwood Veneer Logs: Their Frequency and Importance

    Treesearch

    E.S. Harrar

    1954-01-01

    Most southern hardwood veneer and plywood plants have some method of classifying logs by grade to control the purchase price paid for logs bought on the open market. Such log-grading systems have been developed by experience and are dependent to a large extent upon the ability of the grader and his knowledge of veneer grades and yields required for the specific product...

  14. Microbial contamination on beef and sheep carcases in South Australia.

    PubMed

    Sumner, John; Petrenas, Elena; Dean, Peter; Dowsett, Paul; West, Geoff; Wiering, Rinie; Raven, Geoff

    2003-03-25

    A total of 523 chilled beef and lamb carcases were sampled from four abattoirs and 13 very small plants (VSPs) in South Australia during March 2002 in order to develop a microbiological profile of meat produced for domestic consumption within the State. Aerobic viable counts (AVCs) and Escherichia coli counts were obtained from samples taken by sponge-sampling the muscle-adipose tissue at sites designated for each species in the Microbiological Guidelines to the Australian Standard for Hygienic Production of Meat for Human Consumption (identical with those of the USA Pathogen Reduction: hazard analysis and critical control point (HACCP) systems: final rule). On beef carcases (n=159) mean log AVC/cm(2) was 1.82 and E. coli was detected on 18.8% of carcases (area sampled 200 cm(2)) for which the mean log of the positives was -0.34; for lamb carcases, on which 75 cm(2) was sampled (n=364), corresponding values were 2.59, 36.2% and log(10) 0.27, respectively. There was little difference in mean log AVC/cm(2) of carcases produced at abattoirs and VSPs, 1.72 versus 1.81, respectively, for beef, and 2.80 versus 2.44, respectively, for sheep. Prevalence of E. coli was lower at VSPs, however, with abattoirs having 28.4% for beef and 61.5% for sheep, compared with corresponding values of 4.7% and 18.5% at VSPs. In VSPs, the range of mean log AVC/cm(2) was 0.47-3.16 for beef and 1.63-3.65 for sheep carcases, data which will allow the Controlling Authority to assist plants to improve performance of slaughter and dressing techniques. The present survey is part of an assessment by the State meat authority of the effectiveness of co-regulation of meat hygiene between government and industry.
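The summary statistics in this survey are means of log10-transformed counts (the "mean log AVC/cm²" convention) together with prevalence percentages. A trivial sketch of that arithmetic, with illustrative values:

```python
import math

def mean_log_count(counts_per_cm2):
    """Mean of log10-transformed counts: the 'mean log AVC/cm2' convention,
    applied only to samples with a countable result."""
    return sum(math.log10(c) for c in counts_per_cm2) / len(counts_per_cm2)

def prevalence_percent(detections):
    """Share of carcases on which the organism was detected, in percent
    (detections is a list of 0/1 flags, one per carcase)."""
    return 100.0 * sum(detections) / len(detections)
```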

  15. A multidisciplinary approach to reservoir subdivision of the Maastrichtian chalk in the Dan field, Danish North Sea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kristensen, L.; Dons, T.; Schioler, P.

    1995-11-01

Correlation of wireline log data from the North Sea chalk reservoirs is frequently hampered by rather subtle log patterns in the chalk section due to the apparent monotonous nature of the chalk sediments, which may lead to ambiguous correlations. This study deals with a correlation technique based on an integration of biostratigraphic data, seismic interpretation, and wireline log correlation; this technique aims at producing a consistent reservoir subdivision that honors both the well data and the seismic data. This multidisciplinary approach has been used to subdivide and correlate the Maastrichtian chalk in the Dan field. The biostratigraphic subdivision is based on a new detailed dinoflagellate study of core samples from eight wells. Integrating the biostratigraphic results with three-dimensional seismic data allows recognition of four stratigraphic units within the Maastrichtian, bounded by assumed chronostratigraphic horizons. This subdivision is further refined by adding a seismic horizon and four horizons from wireline log correlations, establishing a total of nine reservoir units. The approximate chronostratigraphic nature of these units provides an improved interpretation of the depositional and structural patterns in this area. The three upper reservoir units pinch out and disappear in a northeasterly direction across the field. We interpret this stratal pattern as reflecting a relative sea level fall or regional basinal subsidence during the latest Maastrichtian, possibly combined with local synsedimentary uplift due to salt tectonics. Isochore maps indicate that the underlying six non-wedging units are unaffected by salt tectonics.

  16. Ceramic vacuum tubes for geothermal well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, R.D.

    1977-01-12

    The results of investigations carried out into the availability and suitability of ceramic vacuum tubes for the development of logging tools for geothermal wells are summarized. Design data acquired in the evaluation of ceramic vacuum tubes for the development of a 500/sup 0/C instrumentation amplifier are presented. The general requirements for ceramic vacuum tubes for application to the development of high temperature well logs are discussed. Commercially available tubes are described and future contract activities that specifically relate to ceramic vacuum tubes are detailed. Supplemental data is presented in the appendix. (MHR)

  17. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanhope, C; Liang, J; Drake, D

    2016-06-15

Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta's Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40 ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprised of 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4 mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, the relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans, significantly less noisy.
Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2 mm dose resolution are recommended; however, less modulated arcs may allow less stringent reconstructions. Following the aforementioned reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.

  18. [Utilization suitability of forest resources in typical forest zone of Changbai Mountains].

    PubMed

    Hao, Zhanqing; Yu, Deyong; Xiong, Zaiping; Ye, Ji

    2004-10-01

Conservation of natural forests does not simply mean no logging. The Northeast China Forest Region has a logging quota of mature forest as part of the natural forest conservation project. How to determine logging spots rationally and scientifically is therefore very important. Recent scientific theories of forest resources management advocate that the utilization of forest resources should adhere to the principle of sustainable use and pay attention to the ecological function of forest resources. According to the logging standards, RS and GIS techniques can be used to detect the precise location of forest resources and obtain information on forest areas and types, and thus provide more rational and scientific support for spatial choices about the future utilization of forest resources. In this paper, the Lushuihe Forest Bureau was selected as a typical case in the Changbai Mountains Forest Region to assess the utilization conditions of forest resources, and some advice on spatial choice for future management of forest resources in the study area is offered.

  19. Aligning observed and modelled behaviour based on workflow decomposition

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement for appropriate process models, is increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique based on a workflow decomposition method is proposed in this paper. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
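Conformance checking of an event log against a Petri net model can be illustrated with a minimal token-replay sketch: fire each logged event's transition and count the tokens that had to be created because a required input place was empty. Note this toy replay is only a stand-in; the paper itself uses alignments based on a state-equation method, which is more sophisticated than what is shown here.

```python
def replay(net, marking, trace):
    """Token replay: fire each event's transition, counting missing tokens.
    net maps each transition name to (input_places, output_places);
    marking maps place names to token counts."""
    marking = dict(marking)  # work on a copy
    missing = 0
    for t in trace:
        pre, post = net[t]
        for p in pre:
            if marking.get(p, 0) > 0:
                marking[p] -= 1
            else:
                missing += 1  # token had to be created: a deviation
        for p in post:
            marking[p] = marking.get(p, 0) + 1
    return missing
```

A trace that follows the model replays with zero missing tokens; skipping a required step produces a positive deviation count.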

  20. Life cycle performances of log wood applied for soil bioengineering constructions

    NASA Astrophysics Data System (ADS)

    Kalny, Gerda; Strauss-Sieberth, Alexandra; Strauss, Alfred; Rauch, Hans Peter

    2016-04-01

Nowadays there is a high demand for engineering solutions that consider not only technical aspects but also ecological and aesthetic values. Soil bioengineering is a construction technique that uses biological components for hydraulic and civil engineering solutions. Soil bioengineering solutions are based on the application of living plants and other auxiliary materials, including among others log wood. This kind of construction material supports the soil bioengineering system until the plants, as living construction material, take over the stability function. It is therefore important to know about the durability and degradation process of the wooden logs in order to retain the integral performance of a soil bioengineering system. These aspects are considered within the framework of the interdisciplinary research project "ELWIRA Plants, wood, steel and concrete - life cycle performances as construction materials". Field investigations were therefore conducted on soil bioengineering construction material, specifically European larch logs, from different soil bioengineering structures at the river Wien. The drilling resistance, as a parameter for particular material characteristics of selected logs, was measured and analysed. The drilling resistance was measured with a Rinntech Resistograph instrument at different positions on the wooden logs, each surrounded by one of three different backfills: fully surrounded by air, with earth contact on one side, or near the water surface in wet-dry conditions. The age of the logs ranges from one year up to 20 years. Results show the progression of drilling resistance throughout the whole cross section as an indicator for assessing soil bioengineering construction material. Logs surrounded by air showed a higher drilling resistance than logs with earth contact and those exposed to wet-dry conditions.
Hence, the functional capability of the wooden logs was analysed and discussed in terms of different levels of degradation. The results contribute to a sustainable and resource-conserving handling of building materials within construction and maintenance works on soil bioengineering structures.

  1. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
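The idea of checking a log against a state-machine-like monitor can be sketched in a few lines. The spec encoding and event format below are invented for illustration; LogScope's actual specification language is considerably richer (data parameters, multiple concurrent monitors, and so on).

```python
def monitor(events, spec):
    """Minimal rule-based log monitor (hypothetical, LogScope-inspired).
    spec maps (state, event_type) -> next_state; reaching the special
    state 'error' records a violation and resets the monitor."""
    state = "start"
    violations = []
    for i, ev in enumerate(events):
        nxt = spec.get((state, ev["type"]))
        if nxt is not None:
            state = nxt
            if state == "error":
                violations.append(i)  # index of the offending log event
                state = "start"
    return violations
```

For example, a rule "every COMMAND must be answered by SUCCESS before the next COMMAND" flags the second of two back-to-back COMMAND events.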

  2. Development of Cross-Platform Software for Well Logging Data Visualization

    NASA Astrophysics Data System (ADS)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil-gas field analysis and is of great importance in field development and operation. It is therefore important to have software that accurately and clearly presents the processed data to the user in the form of well logs. In this work, a software product has been developed that not only provides the basic functionality for this task (loading data from .las files, displaying well log curves, etc.), but can also be run in different operating systems and on different devices. The article presents a subject-field analysis and task formulation, and considers the software design stage. Finally, the resulting software product's interface is described.
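    The core task the abstract describes (loading curve data from .las files for display) can be sketched with a minimal parser. The fragment below is an illustrative simplification of the LAS format (only the curve-definition and ASCII-data sections, no wrap mode or null handling) and is not the authors' software; the sample file content is invented.

```python
# Minimal sketch of reading curve data from a LAS (Log ASCII Standard) file.
# Handles only the ~C (curve definition) and ~A (ASCII data) sections;
# real LAS files also carry version, well, and parameter sections.

def read_las_curves(text):
    """Return a dict mapping curve mnemonic -> list of values."""
    names, rows, section = [], [], None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                       # skip blanks and comments
        if line.startswith("~"):
            section = line[1].upper()      # section letter after '~'
            continue
        if section == "C":
            # curve mnemonic is the token before the first '.'
            names.append(line.split(".", 1)[0].strip())
        elif section == "A":
            rows.append([float(v) for v in line.split()])
    # transpose row-wise data into one column per curve
    return {n: [r[i] for r in rows] for i, n in enumerate(names)}

sample = """~Curve
DEPT.M      : depth
GR  .GAPI   : gamma ray
~Ascii
1500.0  75.2
1500.5  80.1
"""
curves = read_las_curves(sample)
```

    In a visualization tool like the one described, each returned column would then be plotted against the depth curve.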

  3. Development and validation of a new self-report instrument for measuring sedentary behaviors and light-intensity physical activity in adults.

    PubMed

    Barwais, Faisal Awad; Cuddihy, Thomas F; Washington, Tracy; Tomson, L Michaud; Brymer, Eric

    2014-08-01

    Low levels of physical activity and high levels of sedentary behavior (SB) are major public health concerns. This study was designed to develop and validate the 7-day Sedentary (S) and Light Intensity Physical Activity (LIPA) Log (7-day SLIPA Log), a self-report measure of specific daily behaviors. To develop the log, 62 specific SB and LIPA behaviors were chosen from the Compendium of Physical Activities. Face-to-face interviews were conducted with 32 sedentary volunteers to identify domains and behaviors of SB and LIPA. To validate the log, a further 22 sedentary adults were recruited to wear the GT3x for 7 consecutive days and nights. Pearson correlations (r) between the 7-day SLIPA Log and GT3x were significant for sedentary behavior (r = .86, P < .001) and for LIPA (r = .80, P < .001). Lying and sitting postures were positively correlated with GT3x output (r = .60 and r = .64, P < .001, respectively). No significant correlation was found for standing posture (r = .14, P = .53). The kappa values between the 7-day SLIPA Log and GT3x variables ranged from 0.09 to 0.61, indicating poor to good agreement. The 7-day SLIPA Log is a valid self-report measure of SB and LIPA in specific behavioral domains.
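    The validation above hinges on Pearson correlations between self-reported and accelerometer-measured time. A minimal sketch of that statistic; the paired minutes-per-day values below are invented for illustration, not the study's data.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative paired sedentary minutes/day (log vs. accelerometer)
log_minutes = [480, 520, 455, 600, 510]
gt3x_minutes = [470, 540, 430, 590, 525]
r = pearson_r(log_minutes, gt3x_minutes)
```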

  4. Correlations between chromatographic parameters and bioactivity predictors of potential herbicides.

    PubMed

    Janicka, Małgorzata

    2014-08-01

    Different liquid chromatography techniques, including reversed-phase liquid chromatography on Purosphere RP-18e, IAM.PC.DD2 and Cosmosil Cholester columns and micellar liquid chromatography with a Purosphere RP-8e column using buffered sodium dodecyl sulfate-acetonitrile as the mobile phase, were applied to study the lipophilic properties of 15 newly synthesized phenoxyacetic and carbamic acid derivatives, which are potential herbicides. Chromatographic lipophilicity descriptors were used to extrapolate log k parameters (log kw and log km) and log k values. Partitioning lipophilicity descriptors, i.e., log P coefficients in an n-octanol-water system, were computed from the molecular structures of the tested compounds. Bioactivity descriptors, including partition coefficients in a water-plant cuticle system and water-human serum albumin, and coefficients for human skin partition and permeation, were calculated in silico by ACD/ADME software using the linear solvation energy relationship of Abraham. Principal component analysis was applied to describe similarities between the various chromatographic and partitioning lipophilicities. Highly significant, predictive linear relationships were found between chromatographic parameters and bioactivity descriptors. © The Author [2013]. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Finite-difference modeling of the electroseismic logging in a fluid-saturated porous formation

    NASA Astrophysics Data System (ADS)

    Guan, Wei; Hu, Hengshan

    2008-05-01

    In a fluid-saturated porous medium, an electromagnetic (EM) wavefield induces an acoustic wavefield due to the electrokinetic effect. A potential geophysical application of this effect is electroseismic (ES) logging, in which the converted acoustic wavefield is received in a fluid-filled borehole to evaluate the parameters of the porous formation around the borehole. In this paper, a finite-difference scheme is proposed to model the ES logging responses to a vertical low frequency electric dipole along the borehole axis. The EM field excited by the electric dipole is calculated separately by finite-difference first, and is considered as a distributed exciting source term in a set of extended Biot's equations for the converted acoustic wavefield in the formation. This set of equations is solved by a modified finite-difference time-domain (FDTD) algorithm that allows for the calculation of dynamic permeability so that it is not restricted to low-frequency poroelastic wave problems. The perfectly matched layer (PML) technique without splitting the fields is applied to truncate the computational region. The simulated ES logging waveforms approximately agree with those obtained by the analytical method. The FDTD algorithm applies also to acoustic logging simulation in porous formations.
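    As a toy illustration of the finite-difference time-domain machinery the paper builds on, the sketch below time-steps the simple 1D scalar acoustic wave equation (u_tt = c² u_xx) with rigid boundaries. The actual scheme in the paper solves extended Biot's poroelastic equations with a distributed EM source term and PML absorbing boundaries; none of that is attempted here, and all parameter values are arbitrary.

```python
# Toy 1D FDTD for u_tt = c^2 u_xx with rigid (u = 0) boundaries.
# Second-order centered differences in both time and space.

def fdtd_1d(nx=201, nt=300, c=1.0, dx=1.0, dt=0.5):
    r2 = (c * dt / dx) ** 2        # squared Courant number; stable if <= 1
    u_prev = [0.0] * nx            # field at time step n-1
    u = [0.0] * nx                 # field at time step n
    u[nx // 2] = 1.0               # initial point disturbance at the center
    for _ in range(nt):
        u_next = [0.0] * nx        # endpoints stay 0 (rigid boundaries)
        for i in range(1, nx - 1):
            u_next[i] = (2.0 * u[i] - u_prev[i]
                         + r2 * (u[i + 1] - 2.0 * u[i] + u[i - 1]))
        u_prev, u = u, u_next
    return u

field = fdtd_1d()
```

    A PML, as used in the paper, would replace the rigid boundaries with an absorbing layer so outgoing waves are not reflected back into the computational region.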

  6. Using borehole flow data to characterize the hydraulics of flow paths in operating wellfields

    USGS Publications Warehouse

    Paillet, F.; Lundy, J.

    2004-01-01

    Understanding the flow paths in the vicinity of water well intakes is critical in the design of effective wellhead protection strategies for heterogeneous carbonate aquifers. High-resolution flow logs can be combined with geophysical logs and borehole-wall-image logs (acoustic televiewer) to identify the porous beds, solution openings, and fractures serving as conduits connecting the well bore to the aquifer. Qualitative methods of flow log analysis estimate the relative transmissivity of each water-producing zone, but do not indicate how those zones are connected to the far-field aquifer. Borehole flow modeling techniques can be used to provide quantitative estimates of both transmissivity and far-field hydraulic head in each producing zone. These data can be used to infer how the individual zones are connected with each other, and to the surrounding large-scale aquifer. Such information is useful in land-use planning and the design of well intakes to prevent entrainment of contaminants into water-supply systems. Specific examples of flow log applications in the identification of flow paths in operating wellfields are given for sites in Austin and Faribault, Minnesota. Copyright ASCE 2004.

  7. Solvent-assisted stir bar sorptive extraction by using swollen polydimethylsiloxane for enhanced recovery of polar solutes in aqueous samples: Application to aroma compounds in beer and pesticides in wine.

    PubMed

    Ochiai, Nobuo; Sasamoto, Kikuo; David, Frank; Sandra, Pat

    2016-07-15

    A novel solvent-assisted stir bar sorptive extraction (SA-SBSE) technique was developed for enhanced recovery of polar solutes in aqueous samples. A conventional PDMS stir bar was swollen in several solvents with log Kow ranging from 1.0 to 3.5 while stirring for 30 min prior to extraction. After extraction, thermal desorption - gas chromatography - (tandem) mass spectrometry (TD-GC-MS(/MS)) or liquid desorption - large volume injection (LD-LVI)-GC-MS was performed. An initial study investigated potential solvents for SA-SBSE by weighing the residual solvent in the swollen PDMS stir bar before and after extraction. Compared to conventional SBSE, SA-SBSE using diethyl ether, methyl isobutyl ketone, dichloromethane, diisopropyl ether and toluene provided higher recoveries from water samples for test solutes with log Kow < 2.5. For SA-SBSE using dichloromethane, recoveries were improved by factors of 1.4-4.1, while maintaining or even improving the recoveries for test solutes with log Kow > 2.5. The performance of the SA-SBSE method using dichloromethane, diisopropyl ether, and cyclohexane is illustrated with analyses of aroma compounds in beer and of pesticides in wine. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Approximate matching of regular expressions.

    PubMed

    Myers, E W; Miller, W

    1989-01-01

    Given a sequence A and regular expression R, the approximate regular expression matching problem is to find a sequence matching R whose optimal alignment with A is the highest scoring of all such sequences. This paper develops an algorithm to solve the problem in time O(MN), where M and N are the lengths of A and R. Thus, the time requirement is asymptotically no worse than for the simpler problem of aligning two fixed sequences. Our method is superior to an earlier algorithm by Wagner and Seiferas in several ways. First, it treats real-valued costs, in addition to integer costs, with no loss of asymptotic efficiency. Second, it requires only O(N) space to deliver just the score of the best alignment. Finally, its structure permits implementation techniques that make it extremely fast in practice. We extend the method to accommodate gap penalties, as required for typical applications in molecular biology, and further refine it to search for substrings of A that strongly align with a sequence in R, as required for typical database searches. We also show how to deliver an optimal alignment between A and R in only O(N + log M) space using O(MN log M) time. Finally, an O(MN(M + N) + N^2 log N) time algorithm is presented for alignment scoring schemes where the cost of a gap is an arbitrary increasing function of its length.
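    For context, the "simpler problem" whose O(MN) bound the paper's method matches is the classic dynamic-programming alignment of two fixed sequences. The sketch below shows that baseline (a global alignment score with O(N) working space, analogous to the paper's space-efficient scoring); it is not the regex-matching algorithm itself, and the unit match/mismatch/gap scores are illustrative.

```python
def align_score(a, b, match=1, mismatch=-1, gap=-1):
    """O(MN)-time, O(N)-space global alignment score of two fixed
    sequences, keeping only the previous row of the DP table."""
    m, n = len(a), len(b)
    prev = [j * gap for j in range(n + 1)]    # row 0: align prefix of b to gaps
    for i in range(1, m + 1):
        cur = [i * gap] + [0] * n             # column 0: align prefix of a to gaps
        for j in range(1, n + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            cur[j] = max(prev[j - 1] + s,     # substitute / match
                         prev[j] + gap,       # gap in b
                         cur[j - 1] + gap)    # gap in a
        prev = cur
    return prev[n]
```

    The regex version generalizes the column recurrence so that, instead of one predecessor per position, each DP cell draws on the states reachable in an automaton built from R.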

  9. Prediction of gas/particle partitioning of polybrominated diphenyl ethers (PBDEs) in global air: a theoretical study

    NASA Astrophysics Data System (ADS)

    Li, Y.-F.; Ma, W.-L.; Yang, M.

    2014-09-01

    Gas/particle (G/P) partitioning for most semivolatile organic compounds (SVOCs) is an important process that largely governs their atmospheric fate, their long-range atmospheric transport potential, and their routes of entry into the human body. All previous studies of this issue have been derived from hypothetical equilibrium conditions, the results of which in most cases do not predict monitoring results well. In this study, a steady-state model, instead of an equilibrium-state model, for the investigation of the G/P partitioning behavior of polybrominated diphenyl ethers (PBDEs) was established, and an equation for calculating the partition coefficients under steady state (KPS) for PBDE congeners (log KPS = log KPE + log α) was developed, in which an equilibrium term (log KPE = log KOA + log fOM - 11.91, where fOM is the organic matter content of the particles) and a nonequilibrium term (log α, mainly caused by dry and wet deposition of particles), both being functions of log KOA (the octanol-air partition coefficient), are included; equilibrium is the special case of steady state in which the nonequilibrium term equals zero. A criterion to classify the equilibrium and nonequilibrium status of PBDEs was also established using two threshold values of log KOA, log KOA1 and log KOA2, which divide the range of log KOA into 3 domains: equilibrium, nonequilibrium, and maximum partition domains. Accordingly, two threshold values of temperature t, tTH1 when log KOA = log KOA1 and tTH2 when log KOA = log KOA2, were identified, which divide the range of temperature into the same 3 domains for each BDE congener. We predicted the existence of the maximum partition domain (where log KPS reaches a maximum constant of -1.53), which every PBDE congener reaches when log KOA ≥ log KOA2, or t ≤ tTH2.
The novel equation developed in this study was applied to predict the G/P partition coefficients of PBDEs for published monitoring data worldwide, including Asia, Europe, North America, and the Arctic, and the results matched all the monitoring data well, except those obtained at e-waste sites, due to the unpredictable PBDE emissions at these sites. This study provides evidence that the newly developed steady-state-based equation is superior to the equilibrium-state-based equation that has been used for decades to describe G/P partitioning behavior. We suggest that the investigation of G/P partitioning behavior of PBDEs should be based on steady state, not equilibrium state, and that equilibrium is just a special case of steady state in which nonequilibrium factors can be ignored. We also believe that our new equation provides a useful tool for environmental scientists in both monitoring and modeling research on G/P partitioning of PBDEs and can be extended to predict the G/P partitioning behavior of other SVOCs as well.
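    The abstract's partitioning equation can be turned into a small helper. The sketch below follows the stated formula log KPS = log KPE + log α with log KPE = log KOA + log fOM - 11.91, and treats the reported maximum of -1.53 as a cap for the maximum partition domain; that cap interpretation, the choice to leave log α as a user-supplied input, and all example values are assumptions for illustration.

```python
from math import log10

def log_kps(log_koa, f_om, log_alpha=0.0, log_kps_max=-1.53):
    """Steady-state log particle/gas partition coefficient for a PBDE
    congener.  log_alpha = 0 recovers the equilibrium special case;
    the result is capped at the reported maximum partition value."""
    log_kpe = log_koa + log10(f_om) - 11.91   # equilibrium term
    return min(log_kpe + log_alpha, log_kps_max)

# Illustrative inputs: log KOA = 11.0, 10% organic matter, equilibrium case
example = log_kps(11.0, 0.1)
```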

  10. Structural characteristics of novel symmetrical diaryl derivatives with nitrogenated functions. Requirements for cytotoxic activity.

    PubMed

    Font, María; Ardaiz, Elena; Cordeu, Lucia; Cubedo, Elena; García-Foncillas, Jesús; Sanmartin, Carmen; Palop, Juan-Antonio

    2006-03-15

    In an attempt to discover the essential features that would allow us to explain the differences in cytotoxic activity shown by a series of symmetrical diaryl derivatives with nitrogenated functions, we have studied, by molecular modelling techniques, the variation in Log P and conformational behaviour in terms of structural modifications. The Log P data, although they provide few clues concerning the observed variability in activity, suggest that an initial separation of active and inactive compounds is possible based on this parameter. The subsequent study of the conformational behaviour of the compounds, selected according to their Log P values, showed that the active compounds preferentially display an extended conformation, whereas the inactive ones are associated with a certain type of folding, with a triangular-type conformation adopted in these cases.

  11. SU-F-T-230: A Simple Method to Assess Accuracy of Dynamic Wave Arc Irradiation Using An Electronic Portal Imaging Device and Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirashima, H; Miyabe, Y; Yokota, K

    2016-06-15

    Purpose: The Dynamic Wave Arc (DWA) technique, in which the multi-leaf collimator (MLC) and gantry/ring move simultaneously along a predefined non-coplanar trajectory, has been developed on the Vero4DRT. The aim of this study is to develop a simple method for quality assurance of DWA delivery using electronic portal imaging device (EPID) measurements and log file analysis. Methods: The Vero4DRT has an EPID on the beam axis, the resolution of which is 0.18 mm/pixel at the isocenter plane. EPID images were acquired automatically. To verify the detection accuracy of the MLC position from EPID images, the MLC position with intentional errors was assessed. Tests were designed considering three factors: (1) accuracy of the MLC position, (2) dose output consistency with variable dose rate (160–400 MU/min), gantry speed (2.4–6°/s), and ring speed (0.5–2.5°/s), and (3) MLC speed (1.6–4.2 cm/s). All the patterns were delivered to the EPID and compared with those obtained with a stationary radiation beam at a 0° gantry angle. The irradiation log, including the MLC position and gantry/ring angle, was recorded simultaneously. To perform independent checks of machine accuracy, the MLC position and gantry/ring angle were assessed using log files. Results: A 0.1 mm intentional error can be detected by the EPID, which is smaller than the EPID pixel size. The dose outputs under different conditions of dose rate, gantry/ring speed, and MLC speed showed good agreement, with a root-mean-square (RMS) error of 0.76%. The RMS errors between the detected and recorded data were 0.1 mm for the MLC position, 0.12° for the gantry angle, and 0.07° for the ring angle. Conclusion: The MLC position and dose outputs under variable conditions during DWA irradiation can be easily verified using EPID measurements and log file analysis. The proposed method is useful for routine verification.
This research is (partially) supported by the Practical Research for Innovative Cancer Control (15Ack0106151h0001) from the Japan Agency for Medical Research and Development, AMED. Authors Takashi Mizowaki and Masahiro Hiraoka have a consultancy agreement with Mitsubishi Heavy Industries, Ltd., Japan.
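    The log-file checks in the study reduce to root-mean-square comparisons of recorded versus planned values (MLC positions, gantry and ring angles). A minimal sketch of that computation; the leaf positions below are invented, not measured data, and the function is not the authors' analysis code.

```python
from math import sqrt

def rms_error(recorded, planned):
    """Root-mean-square deviation between recorded (log file)
    and planned values."""
    if len(recorded) != len(planned):
        raise ValueError("unequal sample counts")
    return sqrt(sum((r - p) ** 2 for r, p in zip(recorded, planned))
                / len(recorded))

# Illustrative MLC leaf positions in mm (not measured data)
planned = [10.0, 12.0, 14.0, 16.0]
recorded = [10.1, 11.9, 14.1, 15.9]
err = rms_error(recorded, planned)
```

    The same function applies unchanged to gantry and ring angle samples, with the result in degrees instead of millimeters.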

  12. Reengineering Human Performance and Fatigue Research Through Use of Physiological Monitoring Devices, Web-Based and Mobile Device Data Collection Methods, and Integrated Data Storage Techniques

    DTIC Science & Technology

    2003-12-01

    Lin, T. M. and Sadhu, P. XML How to Program . Prentice Hall, 2001. Dement, W. and Greenberg, S. (1966) “Changes in Total Amount of Stage Four Sleep...training in the science of conducting human research as well as how to use the collection equipment and analysis tools available to them at NPS. In order...redesign? • How will development of a PDA based tracking log give support to process redesign? D. SCOPE OF THESIS In efforts to improve data

  13. Veneer recovery from peeler-grade Douglas-fir logs in northwestern Oregon.

    Treesearch

    E.H. Clarke; A.C. Knauss

    1957-01-01

    Continued expansion of Douglas-fir plywood production develops a greater demand for all grades of peelable logs to supply the industry. Thus, it becomes increasingly important to know the veneer-grade recovery expected so that timber and log values can be more accurately appraised.

  14. Comparative evaluation of the performance of the Abbott RealTime HIV-1 assay for measurement of HIV-1 plasma viral load on genetically diverse samples from Greece

    PubMed Central

    2011-01-01

    Background HIV-1 is characterized by increased genetic heterogeneity which tends to hinder the reliability of detection and accuracy of HIV-1 RNA quantitation assays. Methods In this study, the Abbott RealTime HIV-1 (Abbott RealTime) assay was compared to the Roche Cobas TaqMan HIV-1 (Cobas TaqMan) and the Siemens Versant HIV-1 RNA 3.0 (bDNA 3.0) assays, using clinical samples of various viral load levels and subtypes from Greece, where the recent epidemiology of HIV-1 infection has been characterized by increasing genetic diversity and a marked increase in subtype A genetic strains among newly diagnosed infections. Results A high correlation was observed between the quantitative results obtained by the Abbott RealTime and the Cobas TaqMan assays. Viral load values quantified by the Abbott RealTime were on average lower than those obtained by the Cobas TaqMan, with a mean (SD) difference of -0.206 (0.298) log10 copies/ml. The mean differences according to HIV-1 subtypes between the two techniques for samples of subtype A, B, and non-A/non-B were 0.089, -0.262, and -0.298 log10 copies/ml, respectively. Overall, differences were less than 0.5 log10 for 85% of the samples, and >1 log10 in only one subtype B sample. Similarly, Abbott RealTime and bDNA 3.0 assays yielded a very good correlation of quantitative results, whereas viral load values assessed by the Abbott RealTime were on average higher (mean (SD) difference: 0.160 (0.287) log10 copies/ml). The mean differences according to HIV-1 subtypes between the two techniques for subtype A, B and non-A/non-B samples were 0.438, 0.105 and 0.191 log10 copies/ml, respectively. Overall, the majority of samples (86%) differed by less than 0.5 log10, while none of the samples showed a deviation of more than 1.0 log10. 
Conclusions In an area of changing HIV-1 subtype pattern, the Abbott RealTime assay showed a high correlation and good agreement of results when compared both to the Cobas TaqMan and bDNA 3.0 assays, for all HIV-1 subtypes tested. All three assays could determine viral load from samples of different HIV-1 subtypes adequately. However, assay variation should be taken into account when viral load monitoring of the same individual is assessed by different systems. PMID:21219667

  15. Analysis of ILRS Site Ties

    NASA Astrophysics Data System (ADS)

    Husson, V. S.; Long, J. L.; Pearlman, M.

    2001-12-01

    By the end of 2000, 94% of ILRS stations had completed station and site information forms (i.e. site logs). These forms contain six categories of information: site identifiers, contact information, approximate coordinates, system configuration history, system ranging capabilities, and local survey ties. The ILRS Central Bureau, in conjunction with the ILRS Networks and Engineering Working Group, has developed procedures to quality control site log contents. Part of this verification entails data integrity checks of local site ties, which are the primary focus of this paper. Local survey ties are critical to the combination of space geodetic network coordinate solutions (i.e. GPS, SLR, VLBI, DORIS) in the International Terrestrial Reference Frame (ITRF). Approximately 90% of active SLR sites are collocated with at least one other space geodetic technique. The process used to verify these SLR ties at collocated sites is identical to the approach used in ITRF2000. Local vectors (X, Y, Z) from each ILRS site log are differenced from the corresponding ITRF2000 position vectors (i.e. no transformations). These X, Y, and Z deltas are converted into North, East, and Up. Any delta, in any component, larger than 5 millimeters is flagged for investigation. In the absence of ITRF2000 SLR positions, CSR positions were used. To further enhance this comparison and to fill gaps in information, local ties contained in site logs from the other space geodetic services (i.e. IGS, IVS, IDS) were used in addition to ITRF2000 ties. Case studies of two collocated sites (McDonald/Ft. Davis and Hartebeesthoek) will be explored in depth. Recommendations on how local site surveys should be conducted and how this information should be managed will also be presented.
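    The conversion step described above (X, Y, Z deltas into North, East, Up) is the standard rotation of a geocentric difference vector into a local topocentric frame at the site's latitude and longitude. A sketch of that rotation; it assumes the textbook rotation matrix rather than the exact ILRS procedure, and the 5 mm screening threshold from the abstract is shown as an example use.

```python
from math import sin, cos, radians

def xyz_delta_to_neu(dx, dy, dz, lat_deg, lon_deg):
    """Rotate a geocentric (X, Y, Z) difference into local
    North, East, Up components at the given site location."""
    lat, lon = radians(lat_deg), radians(lon_deg)
    north = (-sin(lat) * cos(lon) * dx
             - sin(lat) * sin(lon) * dy
             + cos(lat) * dz)
    east = -sin(lon) * dx + cos(lon) * dy
    up = (cos(lat) * cos(lon) * dx
          + cos(lat) * sin(lon) * dy
          + sin(lat) * dz)
    return north, east, up

# Example screening: flag any component exceeding 5 mm (0.005 m)
n, e, u = xyz_delta_to_neu(0.004, -0.001, 0.002, 30.7, -104.0)
flagged = any(abs(v) > 0.005 for v in (n, e, u))
```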

  16. Self assembly of rectangular shapes on concentration programming and probabilistic tile assembly models

    PubMed Central

    Rajasekaran, Sanguthevar

    2013-01-01

    Efficient tile sets for self assembling rectilinear shapes are of critical importance in algorithmic self assembly. A lower bound on the tile complexity of any deterministic self assembly system for an n × n square is Ω(log(n)/log(log(n))) (inferred from the Kolmogorov complexity). Deterministic self assembly systems with optimal tile complexity have been designed for squares and related shapes in the past. However, designing Θ(log(n)/log(log(n))) unique tiles specific to a shape is still an intensive task in the laboratory. On the other hand, copies of a tile can be made rapidly using PCR (polymerase chain reaction) experiments. This led to the study of self assembly on tile concentration programming models. We present two major results in this paper on the concentration programming model. First we show how to self assemble rectangles with a fixed aspect ratio (α:β), with high probability, using Θ(α + β) tiles. This result is much stronger than the existing results by Kao et al. (Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008) and Doty (Randomized self-assembly for exact shapes. In: proceedings of the 50th annual IEEE symposium on foundations of computer science (FOCS), IEEE, Atlanta. pp 85–94, 2009), which can only self assemble squares and rely on tiles which perform binary arithmetic. Our result, on the other hand, is based on a technique called staircase sampling. This technique eliminates the need for sub-tiles which perform binary arithmetic, reduces the constant in the asymptotic bound, and eliminates the need for approximate frames (Kao et al. Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008). Our second result applies staircase sampling on the equimolar concentration programming model (The tile complexity of linear assemblies.
In: proceedings of the 36th international colloquium automata, languages and programming: Part I on ICALP ’09, Springer-Verlag, pp 235–253, 2009), to self assemble rectangles (of fixed aspect ratio) with high probability. The tile complexity of our algorithm is Θ(log(n)) and is optimal on the probabilistic tile assembly model (PTAM)—n being an upper bound on the dimensions of a rectangle. PMID:24311993

  17. An object-oriented approach to data display and storage: 3 years experience, 25,000 cases.

    PubMed

    Sainsbury, D A

    1993-11-01

    Object-oriented programming techniques were used to develop computer-based data display and storage systems. These have been operating in the 8 anaesthetising areas of the Adelaide Children's Hospital for 3 years. The analogue and serial outputs from an array of patient monitors are connected to IBM-compatible PC-XT computers. The information is displayed on a colour screen as wave-form and trend graphs and in digital format in 'real time'. The trend data is printed simultaneously on a dot matrix printer. This data is also stored for 24 hours on 'hard' disk. The major benefit has been the provision of a single visual focus for all monitored variables. The automatic logging of data has been invaluable in the analysis of critical incidents. The systems were made possible by recent, rapid improvements in computer hardware and software. This paper traces the development of the program and demonstrates the advantages of object-oriented programming techniques.

  18. Petrophysical analysis of geophysical logs of the National Drilling Company-U.S. Geological Survey ground-water research project for Abu Dhabi Emirate, United Arab Emirates

    USGS Publications Warehouse

    Jorgensen, Donald G.; Petricola, Mario

    1994-01-01

    A program of borehole-geophysical logging was implemented to supply geologic and geohydrologic information for a regional ground-water investigation of Abu Dhabi Emirate. Analysis of geophysical logs was essential to provide information on geohydrologic properties because drill cuttings were not always adequate to define lithologic boundaries. The standard suite of logs obtained at most project test holes consisted of caliper, spontaneous potential, gamma ray, dual induction, microresistivity, compensated neutron, compensated density, and compensated sonic. Ophiolitic detritus from the nearby Oman Mountains has unusual petrophysical properties that complicated the interpretation of geophysical logs. The density of coarse ophiolitic detritus is typically greater than 3.0 grams per cubic centimeter, porosity values are large, often exceeding 45 percent, and the clay fraction includes unusual clays, such as lizardite. Neither the spontaneous-potential log nor the natural gamma-ray log was a usable clay indicator. Because intrinsic permeability is a function of clay content, additional research in determining clay content was critical. A research program of geophysical logging was conducted to determine the petrophysical properties of the shallow subsurface formations. The logging included spectral-gamma and thermal-decay-time logs. These logs, along with the standard geophysical logs, were correlated to mineralogy and whole-rock chemistry as determined from sidewall cores. Thus, interpretation of lithology and fluids was accomplished. Permeability and specific yield were calculated from geophysical-log data and correlated to results from an aquifer test. On the basis of results from the research logging, a method of lithologic and water-resistivity interpretation was developed for the test holes at which the standard suite of logs was obtained. In addition, a computer program was developed to assist in the analysis of log data.
Geohydrologic properties were estimated, including volume of clay matrix, volume of matrix other than clay, density of matrix other than clay, density of matrix, intrinsic permeability, specific yield, and specific storage. Geophysical logs were used to (1) determine lithology, (2) correlate lithologic and permeable zones, (3) calibrate seismic reprocessing, (4) calibrate transient-electromagnetic surveys, and (5) calibrate uphole-survey interpretations. Logs were used at the drill site to (1) determine permeability zones, (2) determine dissolved-solids content, which is a function of water resistivity, and (3) design wells accordingly. Data and properties derived from logs were used to determine transmissivity and specific yield of aquifer materials.

  19. Mathematical model of a smoldering log.

    Treesearch

    Fernando de Souza Costa; David Sandberg

    2004-01-01

    A mathematical model is developed describing the natural smoldering of logs. It is considered the steady one dimensional propagation of infinitesimally thin fronts of drying, pyrolysis, and char oxidation in a horizontal semi-infinite log. Expressions for the burn rates, distribution profiles of temperature, and positions of the drying, pyrolysis, and smoldering fronts...

  20. Development of Mechanistic Flexible Pavement Design Concepts for the Heavyweight F-15 Aircraft

    DTIC Science & Technology

    1986-01-01

    UPON REPETITION - VERTICAL SUBGRADE STRAIN RESULTS 131 B-1 AC TENSILE STRAIN VERSUS AC THICKNESS, VARYING GRANULAR BASE THICKNESS 204 ... Log SR = 0.8243 - .4095(Log TAC)(Log E*) - .0110(TGR/Log TAC) + .0132(ERi) - .3811(Log ERi) R2=.947 SEE=.0506 (1.124) R2=.923 SEE=.06 Log DO ... Thickness. 204

  1. Quantitative geophysical investigations at the Diamond M field, Scurry County, Texas

    NASA Astrophysics Data System (ADS)

    Davogustto Cataldo, Oswaldo Ernesto

    The Diamond M field over the Horseshoe Atoll reservoir of west Texas has produced oil since 1942. Even with some 210 well penetrations, complex reservoir compartmentalization has justified an ongoing drilling program, with three wells drilled within the last three years. Accurate reservoir characterization requires accurate description of the geometry, geological facies, and petrophysical property distribution ranging from the core, through the log, to the seismic scale. The operator has conducted a careful logging and coring program, including dipole sonic logs, in addition to acquiring a modern 3D vertical geophone - vertical vibrator "P-wave" seismic data volume and an equivalent-size 2-component by 2-component "S-wave" seismic data volume. I analyze these data at different scales, integrating them into a whole. I begin with core analysis of the petrophysical properties of the Horseshoe Atoll reservoir, measuring porosity, permeability, NMR T2 relaxation, and velocities (Vp and Vs) as a function of pressure, and find that porosity measurements are consistent when measured with different techniques. When upscaled, these measurements are in excellent agreement with properties measured at the log scale. Together, these measurements provide a lithology-porosity template against which I correlate my seismic P- and S-impedance measurements. Careful examination of P- and S-impedances as well as density from prestack inversion of the original time-migrated gathers of the P-wave survey showed lower vertical resolution for S-impedance and density. These latter two parameters are controlled by the far-offset data, which suffer from migration stretch. I address this shortcoming by applying a recently developed non-stretch NMO technique, which not only improved the bandwidth of the data but also resulted in inversions that better match the S-impedance and density well log data.
The operator hypothesized that 2C by 2C S-wave data would better delineate lithology than conventional P-wave seismic data. Although introduced in the mid-1980s, 2C by 2C data are rarely acquired, with most surveys showing less vertical resolution than conventional (and, prior to slip-sweep technology, more economically acquired) P-wave data. Initial processing by the service company showed a comparable, but lower frequency, image for the "transverse" component, and poor images for the "radial" component. Although the dipole sonic logs did not indicate the presence of significant anisotropy, shear wave splitting is readily observed on the surface seismic stacks. I therefore developed a prestack Alford rotation algorithm that minimizes the cross-talk between components, resulting in vertical resolution comparable to the P-wave data, an independent measure of lithology, and a direct measure of the direction of the principal axes of anisotropy. The direction of azimuthal anisotropy is aligned N45E, consistent with the regional maximum horizontal stress axis obtained from the world stress map database. On average, the Cisco Formation appears 10% thicker on the slow shear (S2) volume than on the fast shear (S1) volume and between 70% and 100% thicker on the P-wave volume. Cross-plotting cumulative production against the various seismic attributes, I find a strong negative correlation to S-impedance and P-impedance. Zones of low S-impedance and low P-impedance correlate to better producing wells. More quantitative correlation will require analysis of the roles that fractures and porosity play in production.

  2. Near-infrared diffuse reflection systems for chlorophyll content of tomato leaves measurement

    NASA Astrophysics Data System (ADS)

    Jiang, Huanyu; Ying, Yibin; Lu, Huishan

    2006-10-01

    In this study, two measuring systems for the chlorophyll content of tomato leaves were developed based on near-infrared spectral techniques. The systems mainly consist of an FT-IR spectrum analyzer, fiber-optic diffuse reflectance accessories, and a data acquisition card. Diffuse reflectance of intact tomato leaves was measured with a fiber-optic diffuse reflectance accessory and a smart diffuse reflectance accessory. Calibration models were developed from spectral and constituent measurements; 90 samples served as the calibration set and 30 samples served as the validation set. Partial least squares (PLS) and principal component regression (PCR) techniques were used to develop the prediction models under different data preprocessing methods. The best model for chlorophyll content had a high correlation coefficient of 0.9348 and a low standard error of prediction (RMSEP) of 4.79, obtained using the full spectral range (12,500-4,000 cm-1), multiplicative scatter correction (MSC) path-length correction, and log(1/R) preprocessing. The results of this study suggest that the FT-NIR method is feasible for rapid and nondestructive detection of the chlorophyll content of tomato leaves.
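The PCR branch of such a calibration can be sketched with plain numpy: mean-centre the spectra, project onto the leading principal components, and regress the constituent on the scores. This is a hedged illustration of the model class only; the component count, data shapes, and preprocessing here are assumptions, not the published models.

```python
import numpy as np

def pcr_fit_predict(X_train, y_train, X_test, n_components=3):
    """Principal component regression: project mean-centred spectra
    onto the leading principal components, then regress the constituent
    (e.g. chlorophyll content) on the component scores."""
    x_mean, y_mean = X_train.mean(axis=0), y_train.mean()
    Xc = X_train - x_mean
    # principal directions from the SVD of the centred spectra
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    P = vt[:n_components].T                    # spectral loadings
    scores = Xc @ P                            # training scores
    beta, *_ = np.linalg.lstsq(scores, y_train - y_mean, rcond=None)
    return (X_test - x_mean) @ P @ beta + y_mean
```

In practice one would choose `n_components` by cross-validation and report RMSEP on the held-out validation set, as the study does.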

  3. The development of a full-digital and networkable multi-media based highway information system : phase 1

    DOT National Transportation Integrated Search

    1999-07-26

    This report covers the development of a Multimedia Based Highway Information System (MMHIS). MMHIS extends the capabilities of current photo logging facilities. Photographic logging systems used by highway agencies provide engineers with information ...

  4. Interpretation of well logs in a carbonate aquifer

    USGS Publications Warehouse

    MacCary, L.M.

    1978-01-01

    This report describes the log analysis of the Randolph and Sabinal core holes in the Edwards aquifer in Texas, with particular attention to the principles that can be applied generally to any carbonate system. The geologic and hydrologic data were obtained during the drilling of the two holes, from extensive laboratory analysis of the cores, and from numerous geophysical logs run in the two holes. Some logging methods are inherently superior to others for the analysis of limestone and dolomite aquifers. Three such systems are the density, neutron, and acoustic-velocity (sonic) logs. Most of the log analysis described here is based on the interpretation of suites of logs from these three systems. In certain instances, deeply focused resistivity logs can be used to good advantage in carbonate rock studies; this technique is used to compute the water resistivity in the Randolph core hole. The rocks penetrated by the Randolph core hole are typical of those carbonates that have undergone very little solution by recent ground-water circulation. There are few large solutional openings; the water is saline; and the rocks are dark, dolomitic, have pore space that is interparticle or intercrystalline, and contain unoxidized organic material. The total porosity of rocks in the saline zone is higher than that of rocks in the fresh-water aquifer; however, the intrinsic permeability is much less in the saline zone because there are fewer large solutional openings. The Sabinal core hole penetrates a carbonate environment that has experienced much solution by ground water during recent geologic time. The rocks have high secondary porosities controlled by sedimentary structures within the rock; the water is fresh; and the dominant rock composition is limestone. The relative percentages of limestone and dolomite, the average matrix (grain) densities of the rock mixtures, and the porosity of the rock mass can be calculated from density, neutron, and acoustic logs. 
With supporting data from resistivity logs, the formation water quality can be estimated, as well as the relative cementation or tortuosity of the rock. Many of these properties calculated from logs can be verified by analysis of the core available from test holes drilled in the saline and fresh water zones.
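The density-log porosity relation underlying such calculations can be written down directly. The matrix and fluid densities below are the textbook limestone and fresh-water defaults, not values from this report:

```python
def density_porosity(rho_bulk, rho_matrix=2.71, rho_fluid=1.0):
    """Standard density-log porosity:
    phi = (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid),
    with densities in g/cc.  Defaults assume a limestone matrix
    (2.71 g/cc) and fresh-water pore fluid (1.0 g/cc)."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

# e.g. a bulk density of 2.37 g/cc in limestone implies ~20% porosity
```

Combining this with neutron and sonic porosities over the same interval is what lets the limestone/dolomite mix and matrix density be solved for, as the abstract describes.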

  5. GTAG: architecture and design of miniature transmitter with position logging for radio telemetry

    NASA Astrophysics Data System (ADS)

    Řeřucha, Šimon; Bartonička, Tomáš; Jedlička, Petr

    2011-10-01

    Radio telemetry is a well-known technique used in zoological research to study the behaviour of animal species. Using GPS for frequent and precise position recording offers an interesting possibility for further enhancement of this method. We present the architecture and design concepts of a telemetry transmitter with a GPS module, called GTAG, suited for the study of the Egyptian fruit bat (Rousettus aegyptiacus). The model group we study sets particular constraints, especially the weight limit (9 g) and the impossibility of recharging the power source. We discuss the aspects of physical realization and the energy-consumption issues. We have developed a reference implementation that has already been deployed during telemetry sessions; we evaluate that experience and compare the estimated performance of our device against real data.

  6. Detecting well casing leaks in Bangladesh using a salt spiking method

    USGS Publications Warehouse

    Stahl, M.O.; Ong, J.B.; Harvey, C.F.; Johnson, C.D.; Badruzzaman, A.B.M.; Tarek, M.H.; VanGeen, A.; Anderson, J.A.; Lane, J.W.

    2014-01-01

    We apply fluid-replacement logging in arsenic-contaminated regions of Bangladesh using a low-cost, down-well fluid conductivity logging tool to detect leaks in the cased section of wells. The fluid-conductivity tool is designed for the developing world: it is lightweight and easily transportable, operable by one person, and can be built for minimal cost. The fluid-replacement test identifies leaking casing by comparison of fluid conductivity logs collected before and after spiking the wellbore with a sodium chloride tracer. Here, we present results of fluid-replacement logging tests from both leaking and non-leaking casing from wells in Araihazar and Munshiganj, Bangladesh, and demonstrate that the low-cost tool produces measurements comparable to those obtained with a standard geophysical logging tool. Finally, we suggest well testing procedures and approaches for preventing casing leaks in Bangladesh and other developing countries.
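The before/after comparison at the heart of the fluid-replacement test can be sketched as follows. The threshold fraction, array names, and leak criterion are illustrative assumptions, not the authors' field procedure:

```python
import numpy as np

def flag_casing_leaks(depth, cond_before, cond_after, spike_level, frac=0.5):
    """Compare fluid-conductivity logs collected before and after
    spiking the wellbore with NaCl.  Depths in the cased interval
    where conductivity has fallen below `frac` of the way from the
    background log to the spike level indicate dilution by inflow,
    i.e. a candidate casing leak."""
    depth = np.asarray(depth, float)
    before = np.asarray(cond_before, float)
    after = np.asarray(cond_after, float)
    threshold = before + frac * (spike_level - before)
    return depth[after < threshold]
```

A real survey would also repeat the post-spike log over time, since the rate at which the tracer washes out constrains the leak's size.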

  7. Detecting well casing leaks in Bangladesh using a salt spiking method.

    PubMed

    Stahl, M O; Ong, J B; Harvey, C F; Johnson, C D; Badruzzaman, A B M; Tarek, M H; van Geen, A; Anderson, J A; Lane, J W

    2014-09-01

    We apply fluid-replacement logging in arsenic-contaminated regions of Bangladesh using a low-cost, down-well fluid conductivity logging tool to detect leaks in the cased section of wells. The fluid-conductivity tool is designed for the developing world: it is lightweight and easily transportable, operable by one person, and can be built for minimal cost. The fluid-replacement test identifies leaking casing by comparison of fluid conductivity logs collected before and after spiking the wellbore with a sodium chloride tracer. Here, we present results of fluid-replacement logging tests from both leaking and non-leaking casing from wells in Araihazar and Munshiganj, Bangladesh, and demonstrate that the low-cost tool produces measurements comparable to those obtained with a standard geophysical logging tool. Finally, we suggest well testing procedures and approaches for preventing casing leaks in Bangladesh and other developing countries. © 2014, National Ground Water Association.

  8. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user logs in to a website, behind the scenes the user leaves his or her impressions, usage patterns, and access patterns in the web server's log file. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites, and can help system administrators improve system performance. Web logs provide invaluable help in creating adaptive websites and in analyzing network traffic. This paper presents the design and implementation of a Web usage mining agent for digging into web log files.
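The first step in any such agent is parsing the server's access log. The sketch below, a hedged illustration rather than the paper's implementation, counts successful requests per path from Common Log Format lines:

```python
import re
from collections import Counter

# Common Log Format: host ident user [timestamp] "request" status bytes
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+')

def page_popularity(lines):
    """Count successful (2xx) page requests per path from Common Log
    Format lines -- the raw access-pattern data a usage-mining agent
    starts from."""
    counts = Counter()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m and m.group('status').startswith('2'):
            counts[m.group('path')] += 1
    return counts
```

From these counts an agent can derive navigation structure suggestions (e.g. promoting frequently reached but deeply buried pages) or feed session-level patterns into further mining.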

  9. Elastic anisotropy and borehole stress estimation in the Seve Nappe Complex from the COSC-1 well, Åre, Sweden.

    NASA Astrophysics Data System (ADS)

    Wenning, Quinn; Almquist, Bjarne; Ask, Maria; Schmitt, Douglas R.; Zappone, Alba

    2015-04-01

    The Caledonian orogeny, preserved in Scandinavia and Greenland, began with the closure of the Iapetus Ocean and culminated in the collision of the Baltica and Laurentia cratons during the middle Paleozoic. The COSC scientific drilling project aims at understanding the crustal structure and composition of the Scandinavian Caledonides. The first well of the dual-phase drilling program, completed in the summer of 2014, drilled through ~2.5 km of the Seve Nappe Complex near the town of Åre, Sweden. Newly acquired drill core and borehole logs provide fresh core material for physical rock property measurements and in-situ stress determination. This contribution presents preliminary data on compressional and shear wave ultrasonic velocities (Vp, Vs) determined from laboratory measurements on drill cores, together with in-situ stress orientation analysis using image logs from the first borehole of the Collisional Orogeny in the Scandinavian Caledonides project (COSC-1). A hydrostatically oil-pressurized apparatus is used to measure the ultrasonic Vp and Vs on three orthogonally cut samples of amphibolite, calcium-bearing and felsic gneiss, meta-gabbro, and mylonitic schist from the drill core. We measure directional anisotropy variability for each lithology using one sample cut perpendicular to the foliation and two additional plugs cut parallel to the foliation, one parallel to the lineation and the other perpendicular to it. Measurements are performed using the pulse transmission technique on samples subjected to hydrostatic pressures of 1-350 MPa under dry conditions. We present preliminary results relating Vp and Vs anisotropy to geologic units and degree of deformation. Additionally, we use acoustic borehole televiewer logs to estimate the horizontal stress orientation, making use of well-developed techniques for observed borehole breakouts (compressive failure) and drilling-induced fractures (tensile failure). 
Preliminary observations show that very few drilling-induced tensile fractures are produced, and that borehole breakouts are episodic and suggest a NE-SW minimum horizontal stress direction.
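A common way to quantify directional anisotropy from such orthogonal plug measurements is the percent-anisotropy expression below; the example velocities are invented, not COSC-1 data:

```python
def velocity_anisotropy_percent(v_max, v_min):
    """Percent seismic anisotropy from the fastest and slowest
    directional velocity measured on a sample:
    A = 200 * (Vmax - Vmin) / (Vmax + Vmin)."""
    return 200.0 * (v_max - v_min) / (v_max + v_min)

# e.g. Vp of 6.6 km/s along foliation vs 6.0 km/s across it
# gives roughly 9.5% P-wave anisotropy
```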

  10. Antimicrobial activity of a novel adhesive containing chlorhexidine gluconate (CHG) against the resident microflora in human volunteers

    PubMed Central

    Carty, Neal; Wibaux, Anne; Ward, Colleen; Paulson, Daryl S.; Johnson, Peter

    2014-01-01

    Objectives To evaluate the antimicrobial activity of a new, transparent composite film dressing, whose adhesive contains chlorhexidine gluconate (CHG), against the native microflora present on human skin. Methods CHG-containing adhesive film dressings and non-antimicrobial control film dressings were applied to the skin on the backs of healthy human volunteers without antiseptic preparation. Dressings were removed 1, 4 or 7 days after application. The bacterial populations underneath were measured by quantitative cultures (cylinder-scrub technique) and compared with one another as a function of time. Results The mean baseline microflora recovery was 3.24 log10 cfu/cm2. The mean log reductions from baseline measured from underneath the CHG-containing dressings were 0.87, 0.78 and 1.30 log10 cfu/cm2 on days 1, 4 and 7, respectively, compared with log reductions of 0.67, −0.87 and −1.29 log10 cfu/cm2 from underneath the control film dressings. There was no significant difference between the log reductions of the two treatments on day 1, but on days 4 and 7 the log reduction associated with the CHG adhesive was significantly higher than that associated with the control adhesive. Conclusions The adhesive containing CHG was associated with a sustained antimicrobial effect that was not present in the control. Incorporating the antimicrobial into the adhesive layer confers upon it bactericidal properties in marked contrast to the non-antimicrobial adhesive, which contributed to bacterial proliferation when the wear time was ≥4 days. PMID:24722839
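The log-reduction figures reported above are simply differences of log10 counts, with negative values indicating growth relative to baseline:

```python
import math

def log_reduction(baseline_cfu_per_cm2, recovered_cfu_per_cm2):
    """Log10 reduction from baseline: positive values mean fewer
    bacteria were recovered than at baseline, negative values mean
    the population grew."""
    return math.log10(baseline_cfu_per_cm2) - math.log10(recovered_cfu_per_cm2)

# a baseline of 10**3.24 cfu/cm2 falling to 10**1.94 cfu/cm2
# is a 1.30-log reduction, matching the day-7 CHG figure above
```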

  11. Antimicrobial activity of a novel adhesive containing chlorhexidine gluconate (CHG) against the resident microflora in human volunteers.

    PubMed

    Carty, Neal; Wibaux, Anne; Ward, Colleen; Paulson, Daryl S; Johnson, Peter

    2014-08-01

    To evaluate the antimicrobial activity of a new, transparent composite film dressing, whose adhesive contains chlorhexidine gluconate (CHG), against the native microflora present on human skin. CHG-containing adhesive film dressings and non-antimicrobial control film dressings were applied to the skin on the backs of healthy human volunteers without antiseptic preparation. Dressings were removed 1, 4 or 7 days after application. The bacterial populations underneath were measured by quantitative cultures (cylinder-scrub technique) and compared with one another as a function of time. The mean baseline microflora recovery was 3.24 log10 cfu/cm(2). The mean log reductions from baseline measured from underneath the CHG-containing dressings were 0.87, 0.78 and 1.30 log10 cfu/cm(2) on days 1, 4 and 7, respectively, compared with log reductions of 0.67, -0.87 and -1.29 log10 cfu/cm(2) from underneath the control film dressings. There was no significant difference between the log reductions of the two treatments on day 1, but on days 4 and 7 the log reduction associated with the CHG adhesive was significantly higher than that associated with the control adhesive. The adhesive containing CHG was associated with a sustained antimicrobial effect that was not present in the control. Incorporating the antimicrobial into the adhesive layer confers upon it bactericidal properties in marked contrast to the non-antimicrobial adhesive, which contributed to bacterial proliferation when the wear time was ≥4 days. © The Author 2014. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy.

  12. Anthropogenic Land-use Change and the Dynamics of Amazon Forest Biomass

    NASA Technical Reports Server (NTRS)

    Laurance, William F.

    2004-01-01

    This project was focused on assessing the effects of prevailing land uses, such as habitat fragmentation, selective logging, and fire, on biomass and carbon storage in Amazonian forests, and on the dynamics of carbon sequestration in regenerating forests. Ancillary goals included developing GIS models to help predict the future condition of Amazonian forests, and assessing the effects of anthropogenic climate change and ENSO droughts on intact and fragmented forests. Ground-based studies using networks of permanent plots were linked with remote-sensing data (including Landsat TM and AVHRR) at regional scales, and higher-resolution techniques (IKONOS imagery, videography, LIDAR, aerial photographs) at landscape and local scales. The project's specific goals were quite eclectic and included: determining the effects of habitat fragmentation on forest dynamics, floristic composition, and the various components of above- and below-ground biomass; assessing historical and physical factors that affect trajectories of forest regeneration and carbon sequestration on abandoned lands; extrapolating results from local studies of biomass dynamics in fragmented and regenerating forests to landscape and regional scales in Amazonia, using remote sensing and GIS; testing the hypothesis that intact Amazonian forests are functioning as a significant carbon sink; examining destructive synergisms between forest fragmentation and fire; assessing the short-term impacts of selective logging on aboveground biomass; developing GIS models that integrate current spatial data on forest cover, deforestation, logging, mining, highways and roads, navigable rivers, vulnerability to wild fires, protected areas, and existing and planned infrastructure projects, in an effort to predict the future condition of Brazilian Amazonian forests over the next 20-25 years; and devising predictive spatial models to assess the influence of varied biophysical and anthropogenic predictors on Amazonian deforestation.

  13. Prediction of gas/particle partitioning of polybrominated diphenyl ethers (PBDEs) in global air: A theoretical study

    NASA Astrophysics Data System (ADS)

    Li, Y.-F.; Ma, W.-L.; Yang, M.

    2015-02-01

    Gas/particle (G/P) partitioning of semi-volatile organic compounds (SVOCs) is an important process that primarily governs their atmospheric fate, long-range atmospheric transport, and their routes of entering the human body. All previous studies on this issue have been based on the hypothesis of equilibrium conditions, the results of which do not predict results from monitoring studies well in most cases. In this study, a steady-state model instead of an equilibrium-state model for the investigation of the G/P partitioning behavior of polybrominated diphenyl ethers (PBDEs) was established, and an equation for calculating the partition coefficients under steady state (KPS) of PBDEs (log KPS = log KPE + log α) was developed, in which an equilibrium term (log KPE = log KOA + log fOM - 11.91, where fOM is the organic matter content of the particles) and a non-equilibrium term (log α, caused by dry and wet deposition of particles), both being functions of log KOA (the octanol-air partition coefficient), are included. It was found that equilibrium is a special case of steady state in which the non-equilibrium term equals zero. A criterion to classify the equilibrium and non-equilibrium status of PBDEs was also established using two threshold values of log KOA (log KOA1 and log KOA2), which divide the range of log KOA into three domains: equilibrium, non-equilibrium, and maximum partition. Accordingly, two threshold values of temperature t (tTH1, at which log KOA = log KOA1, and tTH2, at which log KOA = log KOA2) were identified, which divide the range of temperature into the same three domains for each PBDE congener. We predicted the existence of the maximum partition domain (where log KPS reaches a maximum constant of -1.53), which every PBDE congener can reach when log KOA ≥ log KOA2, or t ≤ tTH2. 
The novel equation developed in this study was applied to predict the G/P partition coefficients of PBDEs for our Chinese persistent organic pollutants (POPs) Soil and Air Monitoring Program, Phase 2 (China-SAMP-II) and for other monitoring programs worldwide, including in Asia, Europe, North America, and the Arctic, and the results matched all the monitoring data well, except those obtained at e-waste sites, owing to the unpredictable PBDE emissions at these sites. This study provided evidence that the newly developed steady-state-based equation is superior to the equilibrium-state-based equation that has been used to describe G/P partitioning behavior for decades. We suggest that investigation of the G/P partitioning behavior of PBDEs should be based on steady state, not equilibrium state, with equilibrium as just a special case of steady state in which non-equilibrium factors can be ignored. We also believe that our new equation provides a useful tool for environmental scientists in both monitoring and modeling research on G/P partitioning of PBDEs, and that it can be extended to predict the G/P partitioning behavior of other SVOCs as well.
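The steady-state equation quoted above (log KPS = log KOA + log fOM - 11.91 + log α, capped at the maximum-partition value of -1.53) can be evaluated directly. This is a minimal numerical sketch with illustrative inputs; the congener-specific thresholds log KOA1 and log KOA2 from the paper are not reproduced here:

```python
import math

def log_kps(log_koa, f_om, log_alpha=0.0, max_log_kps=-1.53):
    """Steady-state G/P partition coefficient:
    log K_PS = log K_OA + log f_OM - 11.91 + log(alpha),
    capped at the maximum-partition constant.  Setting log_alpha = 0
    recovers the equilibrium special case (K_PS = K_PE)."""
    value = log_koa + math.log10(f_om) - 11.91 + log_alpha
    return min(value, max_log_kps)
```

For low-KOA congeners the cap never binds (equilibrium-like behavior); for the heaviest congeners the cap reproduces the constant maximum-partition plateau the paper predicts.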

  14. Sonar surveys used in gas-storage cavern analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crossley, N.G.

    1998-05-04

    Natural-gas storage cavern internal configuration, inspection information, and cavern integrity data can be obtained during high-pressure operations with specialized gas-sonar survey logging techniques. TransGas Ltd., Regina, Sask., has successfully performed these operations on several of its deepest and highest pressurized caverns. The data can determine gas-in-place inventory and assess changes in spatial volumes. These changes can result from cavern creep, shrinkage, or closure or from various downhole abnormalities such as fluid infill or collapse of the sidewall or roof. The paper discusses conventional surveys with sonar, running surveys in pressurized caverns, accuracy of the sonar survey, initial development of Cavern 5, a roof fall, Cavern 4 development, and a damaged string.

  15. A prospective microbiome-wide association study of food sensitization and food allergy in early childhood.

    PubMed

    Savage, Jessica H; Lee-Sarwar, Kathleen A; Sordillo, Joanne; Bunyavanich, Supinda; Zhou, Yanjiao; O'Connor, George; Sandel, Megan; Bacharier, Leonard B; Zeiger, Robert; Sodergren, Erica; Weinstock, George M; Gold, Diane R; Weiss, Scott T; Litonjua, Augusto A

    2018-01-01

    Alterations in the intestinal microbiome are prospectively associated with the development of asthma; less is known regarding the role of microbiome alterations in food allergy development. Intestinal microbiome samples were collected at age 3-6 months in children participating in the follow-up phase of an interventional trial of high-dose vitamin D given during pregnancy. At age 3, sensitization to foods (milk, egg, peanut, soy, wheat, walnut) was assessed. Food allergy was defined as caretaker report of healthcare provider-diagnosed allergy to the above foods prior to age 3 with evidence of IgE sensitization. Analysis was performed using Phyloseq and DESeq2; P-values were adjusted for multiple comparisons. Complete data were available for 225 children; there were 87 cases of food sensitization and 14 cases of food allergy. Microbial diversity measures did not differ between food sensitization and food allergy cases and controls. The genera Haemophilus (log2 fold change -2.15, P=.003), Dialister (log2 fold change -2.22, P=.009), Dorea (log2 fold change -1.65, P=.02), and Clostridium (log2 fold change -1.47, P=.002) were underrepresented among subjects with food sensitization. The genera Citrobacter (log2 fold change -3.41, P=.03), Oscillospira (log2 fold change -2.80, P=.03), Lactococcus (log2 fold change -3.19, P=.05), and Dorea (log2 fold change -3.00, P=.05) were underrepresented among subjects with food allergy. The temporal association between bacterial colonization and food sensitization and allergy suggests that the microbiome may have a causal role in the development of food allergy. Our findings have therapeutic implications for the prevention and treatment of food allergy. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.
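The effect sizes quoted above are log2 fold changes, i.e. base-2 logarithms of abundance ratios, with negative values meaning the genus is underrepresented in cases. The bare-bones illustration below shows the sign convention only; it is not DESeq2's shrunken, dispersion-aware estimator:

```python
import math

def log2_fold_change(mean_case, mean_control):
    """log2 of the case/control abundance ratio: negative when the
    taxon is underrepresented among cases.  A naive ratio, unlike the
    moderated estimates DESeq2 reports."""
    return math.log2(mean_case / mean_control)
```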

  16. Determination of real-time polymerase chain reaction uncertainty of measurement using replicate analysis and a graphical user interface with Fieller’s theorem

    PubMed Central

    Stuart, James Ian; Delport, Johan; Lannigan, Robert; Zahariadis, George

    2014-01-01

    BACKGROUND: Disease monitoring of viruses using real-time polymerase chain reaction (PCR) requires knowledge of the precision of the test to determine what constitutes a significant change. Calculation of quantitative PCR confidence limits requires bivariate statistical methods. OBJECTIVE: To develop a simple-to-use graphical user interface to determine the uncertainty of measurement (UOM) of BK virus, cytomegalovirus (CMV) and Epstein-Barr virus (EBV) real-time PCR assays. METHODS: Thirty positive clinical samples for each of the three viral assays were repeated once. A graphical user interface was developed using a spreadsheet (Excel, Microsoft Corporation, USA) to enable data entry and calculation of the UOM (according to Fieller’s theorem) and PCR efficiency. RESULTS: The confidence limits for the BK virus, CMV and EBV tests were ∼0.5 log, 0.5 log to 1.0 log, and 0.5 log to 1.0 log, respectively. The efficiencies of these assays, in the same order, were 105%, 119% and 90%. The confidence limits remained stable over the linear range of all three tests. DISCUSSION: A >5-fold (0.7 log) and a >3-fold (0.5 log) change in viral load were significant for CMV and EBV when the results were ≤1000 copies/mL and >1000 copies/mL, respectively. A >3-fold (0.5 log) change in viral load was significant for BK virus over its entire linear range. PCR efficiency was ideal for BK virus and EBV but not CMV. Standardized international reference materials and shared reporting of UOM among laboratories are required for the development of treatment guidelines for BK virus, CMV and EBV in the context of changes in viral load. PMID:25285125
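Fieller's theorem gives confidence limits for a ratio of two estimated means, which is why it suits bivariate quantities like PCR-derived ratios. The generic textbook form is sketched below; this is not the authors' spreadsheet implementation, and the variable names and test values are invented:

```python
import math

def fieller_interval(a, b, v11, v22, v12, t_crit):
    """Fieller confidence limits for the ratio a/b, where a and b are
    estimated means with variances v11, v22 and covariance v12, and
    t_crit is the appropriate t quantile.  Returns (low, high); raises
    when g >= 1, i.e. when b is not significantly different from zero
    and the interval is unbounded."""
    g = t_crit ** 2 * v22 / b ** 2
    if g >= 1:
        raise ValueError("interval unbounded: denominator not significant")
    ratio = a / b
    centre = ratio - g * v12 / v22
    half = (t_crit / b) * math.sqrt(
        v11 - 2 * ratio * v12 + ratio ** 2 * v22
        - g * (v11 - v12 ** 2 / v22))
    return (centre - half) / (1 - g), (centre + half) / (1 - g)
```

The `g >= 1` branch is the practically important subtlety: a naive delta-method interval would silently return finite limits in exactly the cases where Fieller says none exist.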

  17. Determination of real-time polymerase chain reaction uncertainty of measurement using replicate analysis and a graphical user interface with Fieller's theorem.

    PubMed

    Stuart, James Ian; Delport, Johan; Lannigan, Robert; Zahariadis, George

    2014-07-01

    Disease monitoring of viruses using real-time polymerase chain reaction (PCR) requires knowledge of the precision of the test to determine what constitutes a significant change. Calculation of quantitative PCR confidence limits requires bivariate statistical methods. To develop a simple-to-use graphical user interface to determine the uncertainty of measurement (UOM) of BK virus, cytomegalovirus (CMV) and Epstein-Barr virus (EBV) real-time PCR assays. Thirty positive clinical samples for each of the three viral assays were repeated once. A graphical user interface was developed using a spreadsheet (Excel, Microsoft Corporation, USA) to enable data entry and calculation of the UOM (according to Fieller's theorem) and PCR efficiency. The confidence limits for the BK virus, CMV and EBV tests were ∼0.5 log, 0.5 log to 1.0 log, and 0.5 log to 1.0 log, respectively. The efficiencies of these assays, in the same order, were 105%, 119% and 90%. The confidence limits remained stable over the linear range of all three tests. A >5-fold (0.7 log) and a >3-fold (0.5 log) change in viral load were significant for CMV and EBV when the results were ≤1000 copies/mL and >1000 copies/mL, respectively. A >3-fold (0.5 log) change in viral load was significant for BK virus over its entire linear range. PCR efficiency was ideal for BK virus and EBV but not CMV. Standardized international reference materials and shared reporting of UOM among laboratories are required for the development of treatment guidelines for BK virus, CMV and EBV in the context of changes in viral load.

  18. Green Lumber Grade Yields for Subfactory Class Hardwood Logs

    Treesearch

    Leland F. Hanks

    1973-01-01

    Data on lumber grade yields for subfactory class logs are presented for ten species of hardwoods. Logs of this type are expected to assume greater importance in the market. The yields, when coupled with lumber prices, will be useful to sawmill operators for developing log prices in terms of standard factory lumber.

  19. Predicting survival of Escherichia coli O157:H7 in dry fermented sausage using artificial neural networks.

    PubMed

    Palanichamy, A; Jayas, D S; Holley, R A

    2008-01-01

    The Canadian Food Inspection Agency required the meat industry to ensure Escherichia coli O157:H7 does not survive (i.e., experiences a ≥5 log CFU/g reduction) in dry fermented sausage (salami) during processing, after a series of foodborne illness outbreaks resulting from this pathogenic bacterium occurred. The industry needs an effective technique like predictive modeling for estimating bacterial viability, because traditional microbiological enumeration is a time-consuming and laborious method. The accuracy and speed of artificial neural networks (ANNs), an approach developed from predictive microbiology, make them an attractive alternative, especially for on-line processing in industry. Data from a study of the interactive effects of different levels of pH, water activity, and concentrations of allyl isothiocyanate at various times during sausage manufacture in reducing numbers of E. coli O157:H7 were collected. The data were used to develop predictive models using a general regression neural network (GRNN), a form of ANN, and a statistical linear polynomial regression technique. Both models were compared for their predictive error using various statistical indices. GRNN predictions for the training and test data sets had less serious errors than the statistical model predictions. GRNN models outperformed the statistical model on the training set and slightly outperformed it on the test set. Also, the GRNN accurately predicted the level of allyl isothiocyanate required to ensure a 5-log reduction when an appropriate production set was created by interpolation. Because they are simple to generate, fast, and accurate, ANN models may be of value for industrial use in dry fermented sausage manufacture to reduce the hazard associated with E. coli O157:H7 in fresh beef and permit production of consistently safe products from this raw material.
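A GRNN is, at its core, Specht's normalized radial-basis estimator: a kernel-weighted average of the training outputs. The sketch below illustrates the model class only; the smoothing parameter and data are invented, and the study's actual inputs (pH, water activity, allyl isothiocyanate, time) and fitting procedure are not reproduced:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """General regression neural network prediction:
    y(x) = sum_i y_i * K_i / sum_i K_i with Gaussian kernels
    K_i = exp(-||x - x_i||^2 / (2*sigma^2)).  One pass over the
    training set per query point; sigma is the only free parameter."""
    X_train = np.asarray(X_train, float)
    y_train = np.asarray(y_train, float)
    preds = []
    for x in np.asarray(X_query, float):
        d2 = np.sum((X_train - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        preds.append(np.dot(w, y_train) / np.sum(w))
    return np.array(preds)
```

The appeal for on-line industrial use is that "training" is just storing the data; only sigma needs tuning, typically by holdout error, as in the study's train/test comparison.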

  20. Field project to obtain pressure core, wireline log, and production test data for evaluation of CO/sub 2/ flooding potential, Conoco MCA unit well No. 358, Maljamar Field, Lea County, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.

    1981-11-01

    This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on February 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO/sub 2/ injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core and computed log water saturations agree fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet, and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log- versus core-derived water saturations. However, both core and log analysis indicated that the ninth zone had the highest residual hydrocarbon saturations, and production data confirmed the validity of the oil saturation determinations. Residual oil saturations for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from the interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 feet and the lack of production from 3692 to 3718 feet indicated the zones to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flooding.
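The a, m, and n values mentioned above are the constants in Archie's equation, the standard relation used to derive water saturation from resistivity logs. A minimal sketch, with the classic clean-formation defaults rather than the Core Laboratories values:

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie water saturation:
    Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n),
    where Rw is formation-water resistivity, Rt the true formation
    resistivity from a deep resistivity log, and phi the porosity.
    Defaults (a=1, m=2, n=2) are textbook clean-sand values."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)
```

Residual oil saturation then follows as So = 1 - Sw, which is the quantity the pressure cores were used to check.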

  1. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for the International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real time, as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC, provides food for thought on their strengths and limitations, and proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g., interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It is essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what is in the logs (or, just as importantly, not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and the reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We examine log formats from several console positions and the types of information that are included and (just as importantly) excluded. We also look at when a summary or synopsis is effective, and when extensive detail is needed.
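The two practices the paper highlights, automated date/time entry and searchable merged logs, can be sketched in a few lines. This is a hypothetical illustration; the position code and entry layout are invented, not a POIC format:

```python
import datetime
import re

def log_entry(text, position="EPS"):
    """One console log line with an automatic UTC timestamp and a
    fixed-position prefix, so merged logs stay easy to scan and search.
    The position code here is a made-up example."""
    stamp = datetime.datetime.now(datetime.timezone.utc).strftime(
        "%Y-%m-%d %H:%M:%S")
    return f"{stamp} [{position}] {text}"

def search_log(lines, pattern):
    """Case-insensitive search across a merged master log."""
    rx = re.compile(pattern, re.IGNORECASE)
    return [ln for ln in lines if rx.search(ln)]
```

A consistent prefix is what makes per-position master logs trivially greppable when reconstructing an event across shifts.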

  2. Borehole techniques identifying subsurface chimney heights in loose ground-some experiences above underground nuclear explosions

    USGS Publications Warehouse

    Carroll, R.D.; Lacomb, J.W.

    1993-01-01

The location of the subsurface top of the chimney formed by the collapse of the cavity resulting from an underground nuclear explosion is examined at five sites at the Nevada Test Site. The chimneys were investigated by drilling, coring, geophysical logging (density, gamma-ray, caliper), and seismic velocity surveys. The identification of the top of the chimney can be complicated by chimney termination in friable volcanic rock of relatively high porosity. The presence of an apical void in three of the five cases is confirmed as the chimney horizon by coincidence with anomalies observed in coring, caliper and gamma-ray logging (two cases), seismic velocity, and drilling. In the two cases where an apical void is not present, several of these techniques yield anomalies at identical horizons; however, the exact depth of chimney penetration is subject to some degree of uncertainty. This is due chiefly to the extent to which core recovery and seismic velocity may be affected by perturbations in the tuff above the chimney due to the explosion and collapse. The data suggest, however, that the depth uncertainty may be only of the order of 10 m if several indicators are available. Of all indicators, core recovery and seismic velocity indicate anomalous horizons in every case. Because radiation products associated with the explosion are contained within the immediate vicinity of the cavity, gamma-ray logs are generally not diagnostic of chimney penetration. In no case is the density log indicative of the presence of the chimney.

  3. Fabrication of SrTiO3 Layer on Pt Electrode for Label-Free Capacitive Biosensors

    PubMed Central

    Carapella, Giovanni; Pilloton, Roberto; Di Matteo, Marisa

    2018-01-01

Due to their interesting ferroelectric, conductive and dielectric properties, in recent years, perovskite-structured materials have begun to attract increasing interest in the biosensing field. In this study, a strontium titanate perovskite layer (SrTiO3) has been synthesized on a platinum electrode and exploited for the development of an impedimetric label-free immunosensor for Escherichia coli O157:H7 detection. The electrochemical characterization of the perovskite-modified electrode during the construction of the immunosensor, as well as after the interaction with different E. coli O157:H7 concentrations, showed a reproducible decrease of the total capacitance of the system that was used for the analytical characterization of the immunosensor. Under optimized conditions, the capacitive immunosensor showed a linear relationship from 1 to 7 log cfu/mL with a low detection limit of 1 log cfu/mL. Moreover, the atomic force microscopy (AFM) technique underlined the increase in roughness of the SrTiO3-modified electrode surface after antibody immobilization, as well as the effective presence of cells with the typical size of E. coli. PMID:29547521

  4. Choice of Stimulus Range and Size Can Reduce Test-Retest Variability in Glaucomatous Visual Field Defects

    PubMed Central

    Swanson, William H.; Horner, Douglas G.; Dul, Mitchell W.; Malinovsky, Victor E.

    2014-01-01

    Purpose To develop guidelines for engineering perimetric stimuli to reduce test-retest variability in glaucomatous defects. Methods Perimetric testing was performed on one eye for 62 patients with glaucoma and 41 age-similar controls on size III and frequency-doubling perimetry and three custom tests with Gaussian blob and Gabor sinusoid stimuli. Stimulus range was controlled by values for ceiling (maximum sensitivity) and floor (minimum sensitivity). Bland-Altman analysis was used to derive 95% limits of agreement on test and retest, and bootstrap analysis was used to test the hypotheses about peak variability. Results Limits of agreement for the three custom stimuli were similar in width (0.72 to 0.79 log units) and peak variability (0.22 to 0.29 log units) for a stimulus range of 1.7 log units. The width of the limits of agreement for size III decreased from 1.78 to 1.37 to 0.99 log units for stimulus ranges of 3.9, 2.7, and 1.7 log units, respectively (F = 3.23, P < 0.001); peak variability was 0.99, 0.54, and 0.34 log units, respectively (P < 0.01). For a stimulus range of 1.3 log units, limits of agreement were narrowest with Gabor and widest with size III stimuli, and peak variability was lower (P < 0.01) with Gabor (0.18 log units) and frequency-doubling perimetry (0.24 log units) than with size III stimuli (0.38 log units). Conclusions Test-retest variability in glaucomatous visual field defects was substantially reduced by engineering the stimuli. Translational Relevance The guidelines should allow developers to choose from a wide range of stimuli. PMID:25371855

  5. Choice of Stimulus Range and Size Can Reduce Test-Retest Variability in Glaucomatous Visual Field Defects.

    PubMed

    Swanson, William H; Horner, Douglas G; Dul, Mitchell W; Malinovsky, Victor E

    2014-09-01

To develop guidelines for engineering perimetric stimuli to reduce test-retest variability in glaucomatous defects. Perimetric testing was performed on one eye for 62 patients with glaucoma and 41 age-similar controls on size III and frequency-doubling perimetry and three custom tests with Gaussian blob and Gabor sinusoid stimuli. Stimulus range was controlled by values for ceiling (maximum sensitivity) and floor (minimum sensitivity). Bland-Altman analysis was used to derive 95% limits of agreement on test and retest, and bootstrap analysis was used to test the hypotheses about peak variability. Limits of agreement for the three custom stimuli were similar in width (0.72 to 0.79 log units) and peak variability (0.22 to 0.29 log units) for a stimulus range of 1.7 log units. The width of the limits of agreement for size III decreased from 1.78 to 1.37 to 0.99 log units for stimulus ranges of 3.9, 2.7, and 1.7 log units, respectively (F = 3.23, P < 0.001); peak variability was 0.99, 0.54, and 0.34 log units, respectively (P < 0.01). For a stimulus range of 1.3 log units, limits of agreement were narrowest with Gabor and widest with size III stimuli, and peak variability was lower (P < 0.01) with Gabor (0.18 log units) and frequency-doubling perimetry (0.24 log units) than with size III stimuli (0.38 log units). Test-retest variability in glaucomatous visual field defects was substantially reduced by engineering the stimuli. The guidelines should allow developers to choose from a wide range of stimuli.
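Both versions of this record lean on Bland-Altman 95% limits of agreement. A small Python sketch of that computation (the test/retest sensitivities, in log units, are made up for illustration):

```python
import statistics

def bland_altman_limits(test, retest):
    """95% limits of agreement: mean difference +/- 1.96 SD of differences."""
    diffs = [a - b for a, b in zip(test, retest)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias - 1.96 * sd, bias + 1.96 * sd

test   = [2.0, 1.8, 1.5, 1.2, 0.9, 1.6]   # sensitivities at test (log units)
retest = [1.9, 1.9, 1.4, 1.3, 1.0, 1.5]   # sensitivities at retest
lo, hi = bland_altman_limits(test, retest)
width = hi - lo          # comparable quantity to the reported widths
```

The narrower this width across the measurement range, the better the stimulus performs on test-retest repeatability, which is the comparison the study makes across stimulus types and ranges.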

  6. Selective logging in the Brazilian Amazon.

    PubMed

    Asner, Gregory P; Knapp, David E; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Silva, Jose N

    2005-10-21

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of approximately 0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.

  7. Estimating generalized skew of the log-Pearson Type III distribution for annual peak floods in Illinois

    USGS Publications Warehouse

    Oberg, Kevin A.; Mades, Dean M.

    1987-01-01

Four techniques for estimating generalized skew in Illinois were evaluated: (1) a generalized skew map of the US; (2) an isoline map; (3) a prediction equation; and (4) a regional-mean skew. Peak-flow records at 730 gaging stations having 10 or more annual peaks were selected for computing station skews. Station skew values ranged from -3.55 to 2.95, with a mean of -0.11. Frequency curves computed for 30 gaging stations in Illinois using the variations of the regional-mean skew technique are similar to frequency curves computed using a skew map developed by the US Water Resources Council (WRC). Estimates of the 50-, 100-, and 500-yr floods computed for 29 of these gaging stations using the regional-mean skew techniques are within the 50% confidence limits of frequency curves computed using the WRC skew map. Although the three variations of the regional-mean skew technique were slightly more accurate than the WRC map, there is no appreciable difference between flood estimates computed using the variations of the regional-mean technique and flood estimates computed using the WRC skew map.
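The station and regional-mean skews referenced here follow standard log-Pearson Type III practice; a sketch of the two computations (the peak discharges are hypothetical, and this uses the textbook sample-skew formula, not the exact WRC procedure):

```python
import math
import statistics

def station_skew(peaks):
    """Sample skew coefficient of the log10 annual peaks (the station
    skew used when fitting a log-Pearson Type III distribution)."""
    logs = [math.log10(q) for q in peaks]
    n = len(logs)
    mean = statistics.mean(logs)
    s = statistics.stdev(logs)
    return n / ((n - 1) * (n - 2)) * sum(((x - mean) / s) ** 3 for x in logs)

def regional_mean_skew(station_skews):
    """Generalized skew estimated as the mean of nearby station skews."""
    return statistics.mean(station_skews)

# Hypothetical annual peak flows for one gaging station
peaks = [1200, 3400, 980, 5600, 2100, 760, 4300, 1500, 2900, 1900]
g_station = station_skew(peaks)
g_regional = regional_mean_skew([-0.2, 0.1, -0.23])
```

In practice the regional estimate is weighted with the station skew according to record length before fitting the frequency curve.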

  8. Generalized image contrast enhancement technique based on the Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1996-07-01

    This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
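Since the abstract notes that traditional histogram equalization is a special case of the proposed algorithm, a minimal sketch of that baseline may help (pure Python, on a tiny hypothetical image):

```python
def equalize(image, levels=256):
    """Classical histogram equalization: remap each gray level through
    the normalized cumulative histogram of the input image."""
    flat = [p for row in image for p in row]
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    cdf, running, total = [], 0, len(flat)
    for count in hist:
        running += count
        cdf.append(running / total)
    return [[round(cdf[p] * (levels - 1)) for p in row] for row in image]

# Tiny hypothetical image: a dark region and a bright region
img = [[10, 10, 10], [10, 200, 200]]
out = equalize(img)
```

The Heinemann-based method replaces this purely statistical mapping with nonlinear functions tuned to perceived brightness, but the overall structure (a monotone gray-scale remapping driven by the image's gray-level distribution) is the same.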

  9. Comparison of path visualizations and cognitive measures relative to travel technique in a virtual environment.

    PubMed

    Zanbaka, Catherine A; Lok, Benjamin C; Babu, Sabarish V; Ulinski, Amy C; Hodges, Larry F

    2005-01-01

We describe a between-subjects experiment that compared four different methods of travel and their effect on cognition and paths taken in an immersive virtual environment (IVE). Participants answered a set of questions based on Crook's condensation of Bloom's taxonomy that assessed their cognition of the IVE with respect to knowledge, understanding and application, and higher mental processes. Participants also drew a sketch map of the IVE and the objects within it. The users' sense of presence was measured using the Steed-Usoh-Slater Presence Questionnaire. The participants' position and head orientation were automatically logged during their exposure to the virtual environment. These logs were later used to create visualizations of the paths taken. Path analysis, such as exploring the overlaid path visualizations and dwell data information, revealed further differences among the travel techniques. Our results suggest that, for applications where problem solving and evaluation of information are important or where opportunity to train is minimal, having a large tracked space so that the participant can walk around the virtual environment provides benefits over common virtual travel techniques.

  10. Modal parameter identification using the log decrement method and band-pass filters

    NASA Astrophysics Data System (ADS)

    Liao, Yabin; Wells, Valana

    2011-10-01

This paper presents a time-domain technique for identifying modal parameters of test specimens based on the log-decrement method. For lightly damped multidegree-of-freedom or continuous systems, the conventional method is usually restricted to identification of fundamental-mode parameters only. Implementation of band-pass filters makes it possible for the proposed technique to extract modal information of higher modes. The method has been applied to a polymethyl methacrylate (PMMA) beam for complex modulus identification in the frequency range 10-1100 Hz. Results compare well with those obtained using the least-squares method, and with those previously published in the literature. The accuracy of the proposed method was further verified by experiments performed on a QuietSteel specimen with very low damping. The method is simple and fast. It can be used for a quick estimation of the modal parameters, or as a complementary approach for validation purposes.
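The log-decrement relation underlying the method can be sketched as follows. This covers only the single-mode decrement step (the band-pass isolation of higher modes, which is the paper's contribution, is omitted), and the damping and frequency values are hypothetical:

```python
import math

def damping_from_log_decrement(peaks):
    """Damping ratio from successive free-decay peaks:
    delta = ln(x_i / x_{i+1}),  zeta = delta / sqrt(4*pi**2 + delta**2)."""
    deltas = [math.log(a / b) for a, b in zip(peaks, peaks[1:])]
    delta = sum(deltas) / len(deltas)
    return delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)

# Synthetic single-mode free decay: zeta = 0.02, natural frequency 50 Hz
zeta, fn = 0.02, 50.0
wn = 2 * math.pi * fn
wd = wn * math.sqrt(1 - zeta ** 2)            # damped frequency
peak_times = [i * 2 * math.pi / wd for i in range(5)]
peaks = [math.exp(-zeta * wn * t) for t in peak_times]
zeta_est = damping_from_log_decrement(peaks)  # recovers zeta
```

For a multi-mode response, band-pass filtering the signal around each resonance first would yield a near single-mode decay to which this same decrement step applies.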

  11. Parallel compression of data chunks of a shared data object using a log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-10-25

Techniques are provided for parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ Log-Structured File techniques. The compressed data chunk can be de-compressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node and storing the compressed version of the data chunk to the shared data object on the storage node.
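A toy sketch of the client-side idea (compress each chunk before handing it to the storage node); the class and method names here are invented for illustration and are unrelated to the patented implementation:

```python
import zlib

class SharedObjectStore:
    """Toy stand-in for a shared data object on a storage node: each
    client writes its chunk compressed, keyed by logical offset."""

    def __init__(self):
        self.chunks = {}          # logical offset -> compressed bytes

    def write_chunk(self, offset, data):
        self.chunks[offset] = zlib.compress(data)

    def read_chunk(self, offset):
        return zlib.decompress(self.chunks[offset])

store = SharedObjectStore()
payload = b"sensor frame " * 100       # a compressible data chunk
store.write_chunk(0, payload)
restored = store.read_chunk(0)
```

In the actual scheme the offsets and lengths would live in log-structured file metadata, so clients can locate and decompress chunks independently and in parallel.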

  12. Measurement of stiffness of standing trees and felled logs using acoustics: A review.

    PubMed

    Legg, Mathew; Bradley, Stuart

    2016-02-01

    This paper provides a review on the use of acoustics to measure stiffness of standing trees, stems, and logs. An outline is given of the properties of wood and how these are related to stiffness and acoustic velocity throughout the tree. Factors are described that influence the speed of sound in wood, including the different types of acoustic waves which propagate in tree stems and lumber. Acoustic tools and techniques that have been used to measure the stiffness of wood are reviewed. The reasons for a systematic difference between direct and acoustic measurements of stiffness for standing trees, and methods for correction, are discussed. Other techniques, which have been used in addition to acoustics to try to improve stiffness measurements, are also briefly described. Also reviewed are studies which have used acoustic tools to investigate factors that influence the stiffness of trees. These factors include different silvicultural practices, geographic and environmental conditions, and genetics.
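Acoustic stiffness tools reviewed here rest on the one-dimensional wave relation E = ρv²; a minimal sketch with typical green-log values (the numbers are illustrative, not from the review):

```python
def dynamic_moe_gpa(density_kg_m3, velocity_m_s):
    """One-dimensional wave relation E = rho * v**2, converted to GPa."""
    return density_kg_m3 * velocity_m_s ** 2 / 1e9

# Illustrative green-log values: density ~1000 kg/m3, velocity ~3500 m/s
e_gpa = dynamic_moe_gpa(1000.0, 3500.0)
```

The systematic difference between acoustic and direct (static bending) stiffness discussed in the review arises partly because the wave type and moisture state affect the effective velocity and density entering this relation.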

  13. Evaluation of two immunomagnetic separation techniques for the detection and recovery of E. coli O157:H7 from finished composts

    USDA-ARS?s Scientific Manuscript database

    Two rapid immunomagnetic separation (IMS) protocols were evaluated to recover 1-2 log CFU/g inoculated E. coli O157:H7 from 30 different commercial, finished compost samples. Both protocols detected E. coli O157:H7 in compost samples; PCR techniques required the removal of inhibitors to reduce poss...

  14. Recovery Act Validation of Innovative Exploration Techniques Pilgrim Hot Springs, Alaska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holdmann, Gwen

    2015-04-30

Drilling and temperature logging campaigns between the late 1970s and early 1980s measured temperatures at Pilgrim Hot Springs in excess of 90°C. Between 2010 and 2014 the University of Alaska used a variety of methods including geophysical surveys, remote sensing techniques, heat budget modeling, and additional drilling to better understand the resource and estimate the available geothermal energy.

  15. MAIL LOG, program theory, volume 1. [Scout project automatic data system

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.

  16. Emergent Intelligent Behavior through Integrated Investigation of Embodied Natural Language, Reasoning, Learning, Computer Vision, and Robotic Manipulation

    DTIC Science & Technology

    2011-10-11

developed a method for determining the structure (component logs and their 3D placement) of a LINCOLN LOG assembly from a single image from an uncalibrated...small a class of components. Moreover, we focus on determining the precise pose and structure of an assembly, including the 3D pose of each...medial axes are parallel to the work surface. Thus valid structures have logs on... [Fig. 1: The 3D geometric shape parameters of LINCOLN LOGS.]

  17. Boulder-Faced Log Dams and other Alternatives for Gabion Check Dams in First-Order Ephemeral Streams with Coarse Bed Load in Ethiopia

    NASA Astrophysics Data System (ADS)

    Nyssen, Jan; Gebreslassie, Seifu; Assefa, Romha; Deckers, Jozef; Guyassa, Etefa; Poesen, Jean; Frankl, Amaury

    2017-04-01

Many thousands of gabion check dams have been installed to control gully erosion in Ethiopia, but several challenges still remain, such as gabion failure in ephemeral streams where coarse bed load abrades the gabions at the chute step. As an alternative to gabion check dams in torrents with coarse bed load, boulder-faced log dams were conceived, installed transversally across torrents and tested (n = 30). For this, logs (22-35 cm across) were embedded in the banks of torrents, 0.5-1 m above the bed and their upstream sides were faced with boulders (0.3-0.7 m across). Similar to gabion check dams, boulder-faced log dams lead to temporary ponding, spreading of peak flow over the entire channel width and sediment deposition. Results of testing under extreme flow conditions (including two storms with return periods of 5.6 and 7 years) show that 18 dams resisted strong floods. Beyond certain flood thresholds (represented by proxies such as Strahler's stream order, catchment area, D95, or channel width), 11 log dams were completely destroyed. Smallholder farmers see much potential in this type of structure to control first-order torrents with coarse bed load, since the technique is cost-effective and can be easily installed.

  18. Eliminating the rugosity effect from compensated density logs by geometrical response matching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flaum, C.; Holenka, J.M.; Case, C.R.

    1991-06-01

A theoretical and experimental effort to understand the effects of borehole rugosity on individual detector responses yielded an improved method of processing compensated density logs. Historically, the spine/ribs technique for obtaining borehole and mudcake compensation of dual-detector, gamma-gamma density logs has been very successful as long as the borehole and other environmental effects vary slowly with depth and the interest is limited to vertical features broader than several feet. With the increased interest in higher vertical resolution, a more detailed analysis of the effect of such quickly varying environmental effects as rugosity was required. A laboratory setup simulating the effect of rugosity on Schlumberger Litho-Density{sup SM} tools (LDT) was used to study vertical response in the presence of rugosity. The data served as the benchmark for the Monte Carlo models used to generate synthetic density logs in the presence of more complex rugosity patterns. The results provided in this paper show that proper matching of the two detector responses before application of conventional compensation methods can eliminate rugosity effects without degrading the measurement's vertical resolution. The accuracy of the results is as good as that obtained for a parallel mudcake or standoff with the conventional method. Application to both field and synthetic logs confirmed the validity of these results.

  19. Fluid-Rock Characterization and Interactions in NMR Well Logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George J. Hirasaki; Kishore K. Mohanty

    2005-09-05

The objective of this report is to characterize the fluid properties and fluid-rock interactions that are needed for formation evaluation by NMR well logging. The advances made in the understanding of NMR fluid properties are summarized in a chapter written for an AAPG book on NMR well logging. This includes live oils, viscous oils, natural gas mixtures, and the relation between relaxation time and diffusivity. Oil based drilling fluids can have an adverse effect on NMR well logging if they alter the wettability of the formation. The effect of various surfactants on wettability and surface relaxivity is evaluated for silica sand. The relation between the relaxation time and diffusivity distinguishes the response of brine, oil, and gas in a NMR well log. A new NMR pulse sequence in the presence of a field gradient and a new inversion technique enables the T{sub 2} and diffusivity distributions to be displayed as a two-dimensional map. The objectives of pore morphology and rock characterization are to identify vug connectivity by using X-ray CT scans, and to improve NMR permeability correlation. Improved estimation of permeability from NMR response is possible by using estimated tortuosity as a parameter to interpolate between two existing permeability models.

  20. Explorations in statistics: the log transformation.

    PubMed

    Curran-Everett, Douglas

    2018-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This thirteenth installment of Explorations in Statistics explores the log transformation, an established technique that rescales the actual observations from an experiment so that the assumptions of some statistical analysis are better met. A general assumption in statistics is that the variability of some response Y is homogeneous across groups or across some predictor variable X. If the variability-the standard deviation-varies in rough proportion to the mean value of Y, a log transformation can equalize the standard deviations. Moreover, if the actual observations from an experiment conform to a skewed distribution, then a log transformation can make the theoretical distribution of the sample mean more consistent with a normal distribution. This is important: the results of a one-sample t test are meaningful only if the theoretical distribution of the sample mean is roughly normal. If we log-transform our observations, then we want to confirm the transformation was useful. We can do this if we use the Box-Cox method, if we bootstrap the sample mean and the statistic t itself, and if we assess the residual plots from the statistical model of the actual and transformed sample observations.
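The key claim of this installment, that a log transformation equalizes standard deviations when the SD is roughly proportional to the mean, can be checked numerically (hypothetical two-group data sharing a common coefficient of variation):

```python
import math
import statistics

# Two hypothetical groups whose SD is proportional to the mean
low  = [9, 10, 11]       # mean ~10, SD 1
high = [90, 100, 110]    # mean ~100, SD 10 (same coefficient of variation)

sds = [statistics.stdev(g) for g in (low, high)]
log_sds = [statistics.stdev([math.log(x) for x in g]) for g in (low, high)]
# On the raw scale the SDs differ tenfold; after the log transformation
# the two groups have (essentially) equal SDs.
```

This is exactly the homogeneity-of-variability assumption the transformation is meant to rescue before applying analyses such as the one-sample t test discussed in the abstract.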

  1. Log-Based Recovery in Asynchronous Distributed Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kane, Kenneth Paul

    1989-01-01

A log-based mechanism is described for restoring consistent states to replicated data objects after failures. Preserving a causal form of consistency based on the notion of virtual time is focused upon in this report. Causal consistency has been shown to apply to a variety of applications, including distributed simulation, task decomposition, and mail delivery systems. Several mechanisms have been proposed for implementing causally consistent recovery, most notably those of Strom and Yemini, and Johnson and Zwaenepoel. The mechanism proposed here differs from these in two major respects. First, a roll-forward style of recovery is implemented. A functioning process is never required to roll-back its state in order to achieve consistency with a recovering process. Second, the mechanism does not require any explicit information about the causal dependencies between updates. Instead, all necessary dependency information is inferred from the orders in which updates are logged by the object servers. This basic recovery technique appears to be applicable to forms of consistency other than causal consistency. In particular, it is shown how the recovery technique can be modified to support an atomic form of consistency (grouping consistency). By combining grouping consistency with causal consistency, it may even be possible to implement serializable consistency within this mechanism.
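The central idea, inferring dependencies from the order in which updates are logged rather than from explicit metadata, can be caricatured in a few lines (a toy single-replica replay for illustration, not the thesis mechanism):

```python
def recover(object_logs):
    """Roll-forward sketch: replay updates strictly in the order they
    were logged; the log order itself encodes dependency information."""
    state = {}
    for log in object_logs:
        for key, value in log:
            state[key] = value
    return state

# Hypothetical update logs from two object servers
server_a = [("x", 1), ("y", 2)]
server_b = [("x", 3)]        # logged later, so it supersedes x = 1
state = recover([server_a, server_b])
```

Note the roll-forward character: recovery only ever applies logged updates forward; no functioning process is asked to roll back.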

  2. SU-E-T-664: Radiobiological Modeling of Prophylactic Cranial Irradiation in Mice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D; Debeb, B; Woodward, W

Purpose: Prophylactic cranial irradiation (PCI) is a clinical technique used to reduce the incidence of brain metastasis and improve overall survival in select patients with ALL and SCLC, and we have shown the potential of PCI in select breast cancer patients through a mouse model (manuscript in preparation). We developed a computational model using our experimental results to demonstrate the advantage of treating brain micro-metastases early. Methods: MATLAB was used to develop the computational model of brain metastasis and PCI in mice. The number of metastases per mouse and the volume of metastases from four- and eight-week endpoints were fit to normal and log-normal distributions, respectively. Model input parameters were optimized so that model output would match the experimental number of metastases per mouse. A limiting dilution assay was performed to validate the model. The effect of radiation at different time points was computationally evaluated through the endpoints of incidence, number of metastases, and tumor burden. Results: The correlation between experimental number of metastases per mouse and the Gaussian fit was 87% and 66% at the two endpoints. The experimental volumes and the log-normal fit had correlations of 99% and 97%. In the optimized model, the correlation between number of metastases per mouse and the Gaussian fit was 96% and 98%. The log-normal volume fit and the model agree 100%. The model was validated by a limiting dilution assay, where the correlation was 100%. The model demonstrates that cells are very sensitive to radiation at early time points, and delaying treatment introduces a threshold dose at which point the incidence and number of metastases decline. Conclusion: We have developed a computational model of brain metastasis and PCI in mice that is highly correlated to our experimental data. The model shows that early treatment of subclinical disease is highly advantageous.
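Fitting a log-normal to metastasis volumes, as the model does, amounts to taking the mean and SD of the log volumes; a sketch with hypothetical volumes (the values below are illustrative only):

```python
import math
import statistics

def fit_lognormal(volumes):
    """Log-normal fit by taking mu and sigma of the natural logs
    (the maximum-likelihood estimates for a log-normal)."""
    logs = [math.log(v) for v in volumes]
    return statistics.mean(logs), statistics.pstdev(logs)

# Hypothetical metastasis volumes (mm^3)
volumes = [0.5, 1.1, 0.8, 2.3, 0.9, 1.7]
mu, sigma = fit_lognormal(volumes)
median_volume = math.exp(mu)   # the median of a log-normal is exp(mu)
```

The quoted 97-100% correlations in the abstract compare such fitted distributions against the experimental volume histograms.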

  3. Hardwood sawyer trainer

    Treesearch

    Luis G. Occeña; Eknarin Santitrakul; Daniel L. Schmoldt

    2000-01-01

    It is well understood by now that the initial breakdown of hardwood logs into lumber has a tremendous impact on the total lumber value and conversion efficiency. The focus of this research project is the development of a computer-aided sawing trainer tool for the primary breakdown of hardwood logs. Maximum lumber recovery is dependent on the proper log orientation as...

  4. Web Log Analysis: A Study of Instructor Evaluations Done Online

    ERIC Educational Resources Information Center

    Klassen, Kenneth J.; Smith, Wayne

    2004-01-01

    This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…

  5. West Virginia wood waste from uncharted sources: log landings and active surface mines

    Treesearch

    Shawn T. Grushecky; Lawrence E. Osborn

    2013-01-01

    Traditionally, biomass availability estimates from West Virginia have focused on primary and secondary mill byproducts and logging residues. Other sources of woody biomass are available that have not been surveyed. Through a series of field studies during 2010 and 2011, biomass availability estimates were developed for surface mine sites and log landings in West...

  6. Project LOGgED ON: Advanced Science Online for Gifted Learners

    ERIC Educational Resources Information Center

    Reed, Christine; Urquhart, Jill

    2007-01-01

    Gifted students are often underserved because they do not have access to highly challenging curriculum. In October, 2002, Project LOGgED ON (www.scrolldown.com/loggedon/) at University of Virginia received federal funding from the Jacob Javits Act to tackle this issue. Those who were part of the LOGgED ON project developed advanced science…

  7. Assessing the feasibility and profitability of cable logging in southern upland hardwood forests

    Treesearch

    Chris B. LeDoux; Dennis M. May; Tony Johnson; Richard H. Widmann

    1995-01-01

    Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the USDA Forest Services' Forest Inventory and Analysis unit were modified to assess the feasibility and profitability of cable logging in southern upland hardwood forests. Depending on the harvest system and yarding distance used, cable logging can be...

  8. Rule-driven defect detection in CT images of hardwood logs

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2000-01-01

    This paper deals with automated detection and identification of internal defects in hardwood logs using computed tomography (CT) images. We have developed a system that employs artificial neural networks to perform tentative classification of logs on a pixel-by-pixel basis. This approach achieves a high level of classification accuracy for several hardwood species (...

  9. Cluster analysis and quality assessment of logged water at an irrigation project, eastern Saudi Arabia.

    PubMed

    Hussain, Mahbub; Ahmed, Syed Munaf; Abderrahman, Walid

    2008-01-01

    A multivariate statistical technique, cluster analysis, was used to assess the logged surface water quality at an irrigation project at Al-Fadhley, Eastern Province, Saudi Arabia. The principal idea behind using the technique was to utilize all available hydrochemical variables in the quality assessment including trace elements and other ions which are not considered in conventional techniques for water quality assessments like Stiff and Piper diagrams. Furthermore, the area belongs to an irrigation project where water contamination associated with the use of fertilizers, insecticides and pesticides is expected. This quality assessment study was carried out on a total of 34 surface/logged water samples. To gain a greater insight in terms of the seasonal variation of water quality, 17 samples were collected from both summer and winter seasons. The collected samples were analyzed for a total of 23 water quality parameters including pH, TDS, conductivity, alkalinity, sulfate, chloride, bicarbonate, nitrate, phosphate, bromide, fluoride, calcium, magnesium, sodium, potassium, arsenic, boron, copper, cobalt, iron, lithium, manganese, molybdenum, nickel, selenium, mercury and zinc. Cluster analysis in both Q and R modes was used. Q-mode analysis resulted in three distinct water types for both the summer and winter seasons. Q-mode analysis also showed the spatial as well as temporal variation in water quality. R-mode cluster analysis led to the conclusion that there are two major sources of contamination for the surface/shallow groundwater in the area: fertilizers, micronutrients, pesticides, and insecticides used in agricultural activities, and non-point natural sources.
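The preprocessing behind such a cluster analysis, z-scoring hydrochemical variables measured in very different units and then grouping samples by distance, can be sketched as follows (hypothetical sample values; a real Q-mode analysis would run full hierarchical linkage over all 34 samples and 23+ parameters):

```python
import math
import statistics

def standardize(rows):
    """Z-score each column so ions measured on very different scales
    contribute comparably to the distance computation."""
    cols = list(zip(*rows))
    means = [statistics.mean(c) for c in cols]
    sds = [statistics.stdev(c) for c in cols]
    return [[(v - m) / s for v, m, s in zip(row, means, sds)] for row in rows]

def nearest_pair(rows):
    """One Q-mode agglomeration step: find the two most similar samples."""
    best, pair = float("inf"), None
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            d = math.dist(rows[i], rows[j])
            if d < best:
                best, pair = d, (i, j)
    return pair

# Hypothetical samples; columns: TDS, sulfate, nitrate (mg/L)
samples = [[900, 250, 5], [950, 260, 6], [3000, 900, 45]]
z = standardize(samples)
closest = nearest_pair(z)   # the two chemically similar samples group first
```

R-mode analysis is the same machinery applied to the transposed matrix, clustering variables instead of samples, which is how the study linked fertilizer-derived ions into one contamination source.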

  10. Internet-based early intervention to prevent posttraumatic stress disorder in injury patients: randomized controlled trial.

    PubMed

    Mouthaan, Joanne; Sijbrandij, Marit; de Vries, Giel-Jan; Reitsma, Johannes B; van de Schoot, Rens; Goslings, J Carel; Luitse, Jan S K; Bakker, Fred C; Gersons, Berthold P R; Olff, Miranda

    2013-08-13

    Posttraumatic stress disorder (PTSD) develops in 10-20% of injury patients. We developed a novel, self-guided Internet-based intervention (called Trauma TIPS) based on techniques from cognitive behavioral therapy (CBT) to prevent the onset of PTSD symptoms. To determine whether Trauma TIPS is effective in preventing the onset of PTSD symptoms in injury patients. Adult, level 1 trauma center patients were randomly assigned to receive the fully automated Trauma TIPS Internet intervention (n=151) or to receive no early intervention (n=149). Trauma TIPS consisted of psychoeducation, in vivo exposure, and stress management techniques. Both groups were free to use care as usual (nonprotocolized talks with hospital staff). PTSD symptom severity was assessed at 1, 3, 6, and 12 months post injury with a clinical interview (Clinician-Administered PTSD Scale) by blinded trained interviewers and self-report instrument (Impact of Event Scale-Revised). Secondary outcomes were acute anxiety and arousal (assessed online), self-reported depressive and anxiety symptoms (Hospital Anxiety and Depression Scale), and mental health care utilization. Intervention usage was documented. The mean number of intervention logins was 1.7, SD 2.5, median 1, interquartile range (IQR) 1-2. Thirty-four patients in the intervention group did not log in (22.5%), 63 (41.7%) logged in once, and 54 (35.8%) logged in multiple times (mean 3.6, SD 3.5, median 3, IQR 2-4). On clinician-assessed and self-reported PTSD symptoms, both the intervention and control group showed a significant decrease over time (P<.001) without significant differences in trend. PTSD at 12 months was diagnosed in 4.7% of controls and 4.4% of intervention group patients. There were no group differences on anxiety or depressive symptoms over time. Post hoc analyses using latent growth mixture modeling showed a significant decrease in PTSD symptoms in a subgroup of patients with severe initial symptoms (n=20) (P<.001). 
Our results do not support the efficacy of the Trauma TIPS Internet-based early intervention in the prevention of PTSD symptoms for an unselected population of injury patients. Moreover, uptake was relatively low since one-fifth of individuals did not log in to the intervention. Future research should therefore focus on innovative strategies to increase intervention usage, for example, adding gameplay, embedding it in a blended care context, and targeting high-risk individuals who are more likely to benefit from the intervention. International Standard Randomized Controlled Trial Number (ISRCTN): 57754429; http://www.controlled-trials.com/ISRCTN57754429 (Archived by WebCite at http://webcitation.org/6FeJtJJyD).

  11. Integrated NMR Core and Log Investigations With Respect to ODP LEG 204

    NASA Astrophysics Data System (ADS)

    Arnold, J.; Pechnig, R.; Clauser, C.; Anferova, S.; Blümich, B.

    2005-12-01

    NMR techniques are widely used in the oil industry and are among the most suitable methods for evaluating in-situ formation porosity and permeability. Recently, efforts have been directed toward adapting NMR methods to the Ocean Drilling Program (ODP) and the upcoming Integrated Ocean Drilling Program (IODP). We apply a newly developed light-weight, mobile NMR core scanner as a non-destructive instrument to routinely determine rock porosity and to estimate the pore size distribution. The NMR core scanner is used for transverse relaxation measurements on water-saturated core sections using a CPMG sequence with a short echo time. A regularized Laplace-transform analysis yields the distribution of transverse relaxation times T2. In homogeneous magnetic fields, T2 is proportional to the pore diameter of rocks; hence, the T2 signal maps the pore-size distribution of the studied rock samples. For fully saturated samples, the integral of the distribution curve and the CPMG echo amplitude extrapolated to zero echo time are proportional to porosity. Preliminary results show that the NMR core scanner is a suitable tool for determining rock porosity and estimating the pore size distribution of limestones and sandstones. Presently, our investigations focus on Leg 204, where NMR Logging-While-Drilling (LWD) was performed for the first time in the ODP. Leg 204 was drilled into Hydrate Ridge on the Cascadia accretionary margin, offshore Oregon. All drilling and logging operations were highly successful, providing excellent core, wireline, and LWD data from adjacent boreholes. Cores recovered during Leg 204 consist mainly of clay and claystone. As the NMR core scanner operates at frequencies higher than those of the well-logging sensor, it has a shorter dead time. This advantage makes the NMR core scanner sensitive to signals with T2 values down to 0.1 ms, compared with 3 ms in NMR logging. Hence, we can study even rocks with small pores, such as the mud cores recovered during Leg 204. We present a comparison of data from core scanning and NMR logging. Future integration of conventional wireline data and electrical borehole wall images (RAB/FMS) will provide a detailed characterization of the sediments in terms of lithology, petrophysics, and fluid flow properties.
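    The processing chain described above, in which a CPMG echo train is inverted for a T2 distribution via a regularized Laplace transform and porosity is taken from the integral of the distribution, can be sketched numerically. A common implementation route is non-negative least squares with Tikhonov regularization over a log-spaced T2 grid; the decay constants, grid, and regularization weight below are illustrative, not the scanner's actual settings:

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic CPMG echo train: two pore populations (T2 = 3 ms and 60 ms).
t = np.arange(1, 501) * 0.5e-3            # echo times, 0.5 ms spacing (s)
y = 0.4 * np.exp(-t / 3e-3) + 0.6 * np.exp(-t / 60e-3)

# Multi-exponential kernel over a log-spaced T2 grid (0.1 ms .. 1 s).
T2 = np.logspace(-4, 0, 80)
K = np.exp(-t[:, None] / T2[None, :])

# Tikhonov-regularized NNLS: augment the kernel with sqrt(alpha) * I.
alpha = 1e-2
K_aug = np.vstack([K, np.sqrt(alpha) * np.eye(len(T2))])
y_aug = np.concatenate([y, np.zeros(len(T2))])
f, _ = nnls(K_aug, y_aug)

# For a fully saturated sample, the integral of the distribution (here,
# the sum of the amplitudes) is proportional to porosity.
porosity = f.sum()
print(round(porosity, 2))
```

The regularization term stabilizes the inherently ill-posed inverse Laplace transform at the cost of some smoothing of the recovered T2 distribution.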

  12. An improved thermodynamic model for the complexation of trivalent actinides and lanthanides with oxalic acid valid to high ionic strength.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Yongliang; Thakur, Punam; Borkowski, Marian

    The dissociation constants of oxalic acid (Ox) and the stability constants of Am3+, Cm3+ and Eu3+ with Ox2- have been determined at 25 °C, over a concentration range of 0.1 to 6.60 m NaClO4, using potentiometric titration and extraction techniques, respectively. The experimental data support the formation of the complexes M(Ox)n(3-2n), where M = Am3+, Cm3+ or Eu3+ and n = 1 or 2. The dissociation constant and stability constant values measured as a function of NaClO4 concentration were used to estimate the Pitzer parameters for the respective interactions of Am3+, Cm3+ and Eu3+ with Ox. Furthermore, the stability constant data for Am3+-Ox measured in NaClO4 and in NaCl solutions from the literature were fitted simultaneously in order to refine the existing actinide-oxalate complexation model, which can be used universally in the safety assessment of radioactive waste disposal. The thermodynamic stability constants log β°101 = 6.30 ± 0.06 and log β°102 = 10.84 ± 0.06 for Am3+ were obtained by simultaneously fitting data in NaCl and NaClO4 media. Additionally, log β°101 = 6.72 ± 0.08 and log β°102 = 11.05 ± 0.09 for Cm3+, and log β°101 = 6.67 ± 0.08 and log β°102 = 11.15 ± 0.09 for Eu3+, were calculated by extrapolation of data to zero ionic strength in NaClO4 medium only. For all stability constants, the Pitzer model gives an excellent representation of the data using the interaction parameters β(0), β(1) and CΦ determined in this work. The thermodynamic model developed in this work will be useful for accurately modeling the potential solubility of trivalent actinides and early lanthanides to an ionic strength of 6.60 m in low-temperature environments in the presence of Ox. The work is also applicable to accurate modeling of the transport of rare earth elements in various near-surface environments.

  13. Inactivation of Bacillus anthracis Spores by a Combination of Biocides and Heating under High-Temperature Short-Time Pasteurization Conditions

    PubMed Central

    Xu, Sa; Labuza, Theodore P.; Diez-Gonzalez, Francisco

    2008-01-01

    The milk supply is considered a primary route for a bioterrorism attack with Bacillus anthracis spores because typical high-temperature short-time (HTST) pasteurization conditions cannot inactivate spores. In the event of intentional contamination, an effective method to inactivate the spores in milk under HTST processing conditions is needed. This study was undertaken to identify combinations and concentrations of biocides that can inactivate B. anthracis spores at temperatures in the HTST range in less than 1 min. Hydrogen peroxide (HP), sodium hypochlorite (SH), and peroxyacetic acid (PA) were evaluated for their efficacy in inactivating spores of strains 7702, ANR-1, and 9131 in milk at 72, 80, and 85°C using a sealed capillary tube technique. Strains ANR-1 and 9131 were more resistant to all of the biocide treatments than strain 7702. Addition of 1,260 ppm SH to milk reduced the number of viable spores of each strain by 6 log CFU/ml in less than 90 and 60 s at 72 and 80°C, respectively. After neutralization, 1,260 ppm SH reduced the time necessary to inactivate 6 log CFU/ml (TTI6-log) at 80°C to less than 20 s. Treatment of milk with 7,000 ppm HP resulted in a similar level of inactivation in 60 s. Combined treatment with 1,260 ppm SH and 1,800 ppm HP inactivated spores of all strains in less than 20 s at 80°C. Mixing 15 ppm PA with milk containing 1,260 ppm SH resulted in TTI6-log of 25 and 12 s at 72 and 80°C, respectively. TTI6-log of less than 20 s were also achieved at 80°C by using two combinations of biocides: 250 ppm SH, 700 ppm HP, and 150 ppm PA; and 420 ppm SH (pH 7), 1,100 ppm HP, and 15 ppm PA. These results indicated that different combinations of biocides could consistently result in 6-log reductions in the number of B. anthracis spores in less than 1 min at temperatures in the HTST range. 
This information could be useful for developing more effective thermal treatment strategies which could be used in HTST milk plants to process contaminated milk for disposal and decontamination, as well as for potential protective measures. PMID:18390680
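    Under the standard log-linear inactivation model implied by the TTI6-log metric, the time to a 6-log reduction is six times the decimal reduction time D. A minimal sketch (the 12 s figure is taken from the 80°C SH+PA result above; treating the kinetics as log-linear is our assumption):

```python
# First-order (log-linear) thermal death model: log10(N0/N) = t / D.

def d_value(time_s: float, log_reduction: float) -> float:
    """Decimal reduction time: seconds for a 1-log kill at constant conditions."""
    return time_s / log_reduction

def time_to_n_log(d: float, n: float) -> float:
    """Time to achieve an n-log reduction under log-linear kinetics."""
    return n * d

# If a treatment achieves 6 logs in 12 s, the implied D-value is 2 s,
# and TTI6-log is recovered as 6 * D.
d = d_value(12.0, 6.0)
print(d, time_to_n_log(d, 6.0))  # 2.0 12.0
```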

  14. Inactivation of Bacillus anthracis spores by a combination of biocides and heating under high-temperature short-time pasteurization conditions.

    PubMed

    Xu, Sa; Labuza, Theodore P; Diez-Gonzalez, Francisco

    2008-06-01

    The milk supply is considered a primary route for a bioterrorism attack with Bacillus anthracis spores because typical high-temperature short-time (HTST) pasteurization conditions cannot inactivate spores. In the event of intentional contamination, an effective method to inactivate the spores in milk under HTST processing conditions is needed. This study was undertaken to identify combinations and concentrations of biocides that can inactivate B. anthracis spores at temperatures in the HTST range in less than 1 min. Hydrogen peroxide (HP), sodium hypochlorite (SH), and peroxyacetic acid (PA) were evaluated for their efficacy in inactivating spores of strains 7702, ANR-1, and 9131 in milk at 72, 80, and 85 degrees C using a sealed capillary tube technique. Strains ANR-1 and 9131 were more resistant to all of the biocide treatments than strain 7702. Addition of 1,260 ppm SH to milk reduced the number of viable spores of each strain by 6 log CFU/ml in less than 90 and 60 s at 72 and 80 degrees C, respectively. After neutralization, 1,260 ppm SH reduced the time necessary to inactivate 6 log CFU/ml (TTI6-log) at 80 degrees C to less than 20 s. Treatment of milk with 7,000 ppm HP resulted in a similar level of inactivation in 60 s. Combined treatment with 1,260 ppm SH and 1,800 ppm HP inactivated spores of all strains in less than 20 s at 80 degrees C. Mixing 15 ppm PA with milk containing 1,260 ppm SH resulted in TTI6-log of 25 and 12 s at 72 and 80 degrees C, respectively. TTI6-log of less than 20 s were also achieved at 80 degrees C by using two combinations of biocides: 250 ppm SH, 700 ppm HP, and 150 ppm PA; and 420 ppm SH (pH 7), 1,100 ppm HP, and 15 ppm PA. These results indicated that different combinations of biocides could consistently result in 6-log reductions in the number of B. anthracis spores in less than 1 min at temperatures in the HTST range. 
This information could be useful for developing more effective thermal treatment strategies which could be used in HTST milk plants to process contaminated milk for disposal and decontamination, as well as for potential protective measures.

  15. Estimating Effective Seismic Anisotropy of Coal Seam Gas Reservoirs from Sonic Log Data Using Orthorhombic Backus-style Upscaling

    NASA Astrophysics Data System (ADS)

    Gross, Lutz; Tyson, Stephen

    2015-04-01

    Fracture density and orientation are key parameters controlling the productivity of coal seam gas reservoirs. Seismic anisotropy can help to identify and quantify fracture characteristics. In particular, wide-offset land seismic recordings with dense azimuthal coverage offer the opportunity to recover anisotropy parameters. In many coal seam gas reservoirs (e.g. the Walloon Subgroup in the Surat Basin, Queensland, Australia (Esterle et al. 2013)), the thicknesses of coal beds and interbeds (e.g. mudstone) are well below the seismic wavelength (0.3-1 m versus 5-15 m). In these situations, the observed seismic anisotropy parameters represent effective elastic properties of the composite medium formed of fractured, anisotropic coal and isotropic interbeds. As a consequence, observed seismic anisotropy cannot be linked directly to fracture characteristics but requires a more careful interpretation. In this paper we discuss techniques to estimate effective seismic anisotropy parameters from well log data, with the objective of improving the interpretation for the case of thin layered coal beds. In the first step we use sonic log data to reconstruct the elasticity parameters as a function of depth (at the resolution of the sonic log). It is assumed that within a sample fractures are sparse, penny-shaped, equally spaced, and of the same size and orientation. Following the classical fracture model, this can be modeled as an elastic horizontally transversely isotropic (HTI) medium (Schoenberg & Sayers 1995). Under the additional assumption of dry fractures, normal and tangential fracture weaknesses are estimated from the slow and fast shear wave velocities of the sonic log. In the second step we apply Backus-style upscaling to construct effective anisotropy parameters on an appropriate length scale. In order to honor the HTI anisotropy present in each layer we have developed a new extension of the classical Backus averaging for layered isotropic media (Backus 1962). Our new method assumes layered HTI media with constant anisotropy orientation as recovered in the first step. It leads to an effective horizontal orthorhombic elastic model, from which Thomsen-style anisotropy parameters are calculated to derive azimuth-dependent normal moveout (NMO) velocities (see Grechka & Tsvankin 1998). In our presentation we show results of our approach from sonic well logs in the Surat Basin, investigating the potential of reconstructing S-wave velocity anisotropy and fracture density from azimuth-dependent NMO velocity profiles.
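    The classical Backus (1962) step that the paper extends can be sketched for the simpler layered-isotropic case: thickness-weighted arithmetic and harmonic averages of the layer moduli yield the five effective VTI stiffnesses. A minimal sketch of that baseline (the paper's HTI/orthorhombic extension is not reproduced here):

```python
import numpy as np

def backus_isotropic(h, vp, vs, rho):
    """Backus (1962) average of thin isotropic layers -> effective VTI stiffnesses.

    h: layer thicknesses; vp, vs: layer velocities (m/s); rho: densities (kg/m^3).
    Returns (A, C, F, L, M), the five independent VTI constants (Pa).
    """
    h = np.asarray(h, float)
    w = h / h.sum()                       # thickness weights
    mu = np.asarray(rho, float) * np.asarray(vs, float) ** 2   # Lame parameters
    lam = np.asarray(rho, float) * np.asarray(vp, float) ** 2 - 2 * mu

    avg = lambda x: np.sum(w * x)         # thickness-weighted arithmetic mean
    C = 1.0 / avg(1.0 / (lam + 2 * mu))
    F = C * avg(lam / (lam + 2 * mu))
    A = avg(4 * mu * (lam + mu) / (lam + 2 * mu)) + C * avg(lam / (lam + 2 * mu)) ** 2
    L = 1.0 / avg(1.0 / mu)
    M = avg(mu)
    return A, C, F, L, M

# Sanity check: a stack of identical layers must reduce to the isotropic case,
# A = C = lam + 2*mu and L = M = mu.
A, C, F, L, M = backus_isotropic([1, 1, 1], [3000] * 3, [1500] * 3, [2400] * 3)
print(A == C, abs(L - M) < 1e-3)
```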

  16. An improved thermodynamic model for the complexation of trivalent actinides and lanthanides with oxalic acid valid to high ionic strength.

    DOE PAGES

    Xiong, Yongliang; Thakur, Punam; Borkowski, Marian

    2015-07-30

    The dissociation constants of oxalic acid (Ox) and the stability constants of Am3+, Cm3+ and Eu3+ with Ox2- have been determined at 25 °C, over a concentration range of 0.1 to 6.60 m NaClO4, using potentiometric titration and extraction techniques, respectively. The experimental data support the formation of the complexes M(Ox)n(3-2n), where M = Am3+, Cm3+ or Eu3+ and n = 1 or 2. The dissociation constant and stability constant values measured as a function of NaClO4 concentration were used to estimate the Pitzer parameters for the respective interactions of Am3+, Cm3+ and Eu3+ with Ox. Furthermore, the stability constant data for Am3+-Ox measured in NaClO4 and in NaCl solutions from the literature were fitted simultaneously in order to refine the existing actinide-oxalate complexation model, which can be used universally in the safety assessment of radioactive waste disposal. The thermodynamic stability constants log β°101 = 6.30 ± 0.06 and log β°102 = 10.84 ± 0.06 for Am3+ were obtained by simultaneously fitting data in NaCl and NaClO4 media. Additionally, log β°101 = 6.72 ± 0.08 and log β°102 = 11.05 ± 0.09 for Cm3+, and log β°101 = 6.67 ± 0.08 and log β°102 = 11.15 ± 0.09 for Eu3+, were calculated by extrapolation of data to zero ionic strength in NaClO4 medium only. For all stability constants, the Pitzer model gives an excellent representation of the data using the interaction parameters β(0), β(1) and CΦ determined in this work. The thermodynamic model developed in this work will be useful for accurately modeling the potential solubility of trivalent actinides and early lanthanides to an ionic strength of 6.60 m in low-temperature environments in the presence of Ox. The work is also applicable to accurate modeling of the transport of rare earth elements in various near-surface environments.

  17. A log-linear model approach to estimation of population size using the line-transect sampling method

    USGS Publications Warehouse

    Anderson, D.R.; Burnham, K.P.; Crain, B.R.

    1978-01-01

    The technique of estimating wildlife population size and density using the belt or line-transect sampling method has been used in many past projects, such as the estimation of density of waterfowl nestling sites in marshes, and is being used currently in such areas as the assessment of Pacific porpoise stocks in regions of tuna fishing activity. A mathematical framework for line-transect methodology has only emerged in the last 5 yr. In the present article, we extend this mathematical framework to a line-transect estimator based upon a log-linear model approach.
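    For context, the basic line-transect density estimator that such models refine is D = n / (2·L·ESW), where the effective strip width ESW is the integral of the detection function. A minimal sketch with a half-normal detection function and illustrative numbers (this is the generic estimator, not the paper's log-linear formulation):

```python
import math

def effective_strip_width(sigma: float) -> float:
    """ESW for a half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)).

    The integral of g over [0, inf) is sigma * sqrt(pi / 2).
    """
    return sigma * math.sqrt(math.pi / 2)

def density(n_detections: int, transect_length: float, sigma: float) -> float:
    """Animals per unit area: D = n / (2 * L * ESW)."""
    return n_detections / (2.0 * transect_length * effective_strip_width(sigma))

# 40 detections along 10 km of transect with sigma = 50 m (all lengths in metres):
print(density(40, 10_000, 50.0))  # animals per square metre
```

A fitted model (log-linear or otherwise) replaces the assumed half-normal shape with one estimated from the perpendicular-distance data.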

  18. [Investigation of Elekta linac characteristics for VMAT].

    PubMed

    Luo, Guangwen; Zhang, Kunyi

    2012-01-01

    The aim of this study was to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and dose rate, gantry speed, and MLC leaf speed were analyzed from log files. Results showed that delivery switched among six discrete dose rates, and that gantry and MLC leaf speeds varied dynamically during delivery. The VMAT technique requires the linac to dynamically control more parameters, and these key dynamic variables can be checked from log files of VMAT delivery. Quality assurance procedures should be carried out for VMAT-related parameters.

  19. CS2 analysis in presence of non-Gaussian background noise - Effect on traditional estimators and resilience of log-envelope indicators

    NASA Astrophysics Data System (ADS)

    Borghesani, P.; Antoni, J.

    2017-06-01

    Second-order cyclostationary (CS2) analysis has become popular in the field of machine diagnostics, and a series of digital signal processing techniques has been developed to extract CS2 components from the background noise. Among those techniques, the squared envelope spectrum (SES) and the cyclic modulation spectrum (CMS) have gained popularity thanks to their high computational efficiency and simple implementation. The effectiveness of CMS and SES has previously been quantified under the hypothesis of Gaussian background noise and has led to statistical tests for the presence of CS2 peaks in squared envelope spectra and cyclic modulation spectra. However, a recently established link of CMS with SES, and of SES with kurtosis, has exposed a potential weakness of these indicators in the case of highly leptokurtic background noise. This case is often present in practice when the machine is subjected to highly impulsive phenomena, due either to harsh operating conditions or to electric noise generated by power electronics and captured by the sensor. This study investigates and quantifies for the first time the effect of leptokurtic noise on the capabilities of SES and CMS, by analysing three progressively harsher situations: high kurtosis, infinite kurtosis, and alpha-stable background noise (for which even first and second-order moments are not defined). The resilience of a recently proposed family of CS2 indicators, based on the log-envelope, is then verified analytically, numerically, and experimentally in the case of highly leptokurtic noise.
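    The squared envelope spectrum discussed above is simple to compute: form the analytic signal, square its magnitude, and take the spectrum of the result. A minimal sketch on a synthetic amplitude-modulated signal (the carrier and modulation frequencies are illustrative):

```python
import numpy as np

def squared_envelope_spectrum(x, fs):
    """SES: magnitude spectrum of the squared envelope of the analytic signal."""
    n = len(x)
    X = np.fft.fft(x)
    # Analytic signal via the frequency domain (equivalent to a Hilbert transform):
    # zero out negative frequencies, double the positive ones.
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    analytic = np.fft.ifft(X * h)
    env2 = np.abs(analytic) ** 2            # squared envelope
    ses = np.abs(np.fft.rfft(env2 - env2.mean())) / n
    freqs = np.arange(len(ses)) * fs / n
    return freqs, ses

# A 200 Hz carrier amplitude-modulated at 25 Hz: the SES should peak at 25 Hz,
# the cyclic frequency of the modulation.
fs, n = 1000, 1000
t = np.arange(n) / fs
x = (1 + np.cos(2 * np.pi * 25 * t)) * np.cos(2 * np.pi * 200 * t)
freqs, ses = squared_envelope_spectrum(x, fs)
print(freqs[np.argmax(ses)])  # 25.0
```

The log-envelope indicators the paper studies replace the squared envelope with its logarithm, which is what confers robustness to heavy-tailed (leptokurtic) noise.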

  20. Development of bovine serum albumin-water partition coefficients predictive models for ionogenic organic chemicals based on chemical form adjusted descriptors.

    PubMed

    Ding, Feng; Yang, Xianhai; Chen, Guosong; Liu, Jining; Shi, Lili; Chen, Jingwen

    2017-10-01

    The partition coefficients between bovine serum albumin (BSA) and water (KBSA/w) for ionogenic organic chemicals (IOCs) differ greatly from those of neutral organic chemicals (NOCs). For NOCs, several excellent models have been developed to predict logKBSA/w. However, the conventional descriptors were found to be inappropriate for modeling the logKBSA/w of IOCs. Thus, alternative approaches are urgently needed to develop predictive models for the KBSA/w of IOCs. In this study, molecular descriptors that characterize ionization effects (e.g. chemical-form-adjusted descriptors) were calculated and used to develop predictive models for the logKBSA/w of IOCs. The models developed had high goodness-of-fit, robustness, and predictive ability. The predictor variables selected for the models included the chemical-form-adjusted average of the negative potentials on the molecular surface (Vs-adj-), the chemical-form-adjusted molecular dipole moment (dipolemoment-adj), and the logarithm of the n-octanol/water distribution coefficient (logD). As these molecular descriptors can be calculated directly from molecular structures, the developed model can easily be used to fill the logKBSA/w data gap for other IOCs within the applicability domain. Furthermore, the chemical-form-adjusted descriptors calculated in this study could also be used to construct predictive models for other endpoints of IOCs. Copyright © 2017 Elsevier Inc. All rights reserved.
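    Models of this kind are typically fitted by multiple linear regression of logKBSA/w on the selected descriptors. A minimal sketch on synthetic data (the three descriptors stand in for Vs-adj-, the adjusted dipole moment, and logD; the coefficients and noise level are placeholders, not the paper's fitted model):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical training set: 60 chemicals x 3 descriptors, with synthetic
# "measured" logK_BSA/w generated from known coefficients plus noise.
n = 60
X = rng.normal(size=(n, 3))
true_coef = np.array([0.8, -0.3, 0.5])
y = X @ true_coef + 1.2 + rng.normal(scale=0.1, size=n)

# Ordinary least squares via the design matrix [X | 1] (intercept column).
A = np.column_stack([X, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Goodness-of-fit: coefficient of determination R^2.
y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(r2 > 0.9)
```

Robustness and predictive ability would additionally be checked by cross-validation and an external test set, as is standard for QSAR models.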

  1. Enumeration of verocytotoxigenic Escherichia coli (VTEC) O157 and O26 in milk by quantitative PCR.

    PubMed

    Mancusi, Rocco; Trevisani, Marcello

    2014-08-01

    Quantitative real-time polymerase chain reaction (qPCR) can be a convenient alternative to Most Probable Number (MPN) methods for counting VTEC in milk. The number of VTEC in milk is normally very low; therefore, with the aim of increasing the method's sensitivity, a qPCR protocol that relies on preliminary enrichment was developed. The growth pattern of six VTEC strains (serogroups O157 and O26) was studied using enrichment in Buffered Peptone Water (BPW) with or without acriflavine for 4-24 h. Milk samples were inoculated with these strains over a five-log concentration range, between 0.24-0.50 and 4.24-4.50 log CFU/ml. DNA was extracted from the enriched samples in duplicate and each extract was analysed in duplicate by qPCR using pairs of primers specific for serogroups O157 and O26. When samples were pre-enriched in BPW at 37°C for 8 h, the relationship between threshold cycles (CT values) and VTEC log numbers was linear over the five-log concentration range. The regression of PCR threshold cycle numbers on VTEC log CFU/ml had a slope coefficient of -3.10 (R²=0.96), which is indicative of a 10-fold difference in gene copy numbers between samples (with a 100 ± 10% PCR efficiency). The same 10-fold proportion used for inoculating the milk samples with VTEC was therefore also observed in the enriched samples at 8 h. A comparison of the CT values of milk samples and controls revealed that the strains inoculated in milk grew by 3 log increments over the 8 h enrichment period. Regression lines fitted to the qPCR and MPN data revealed that the error of the qPCR estimates is lower than that of the estimated MPN (r=0.982, R²=0.965 vs. r=0.967, R²=0.935). The growth rates of VTEC strains isolated from milk should be comparatively assessed before qPCR estimates based on the regression model are considered valid.
Comparative assessment of the growth rates can be done using spectrophotometric measurements of standardized cultures of isolates and reference strains cultured in BPW at 37°C for 8h. The method developed for the serogroups O157 and O26 can be easily adapted to the other VTEC serogroups that are relevant for human health. The qPCR method is less laborious and faster than the standard MPN method and has been shown to be a good technique for quantifying VTEC in milk. Copyright © 2014 Elsevier B.V. All rights reserved.
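    The link the abstract draws between the standard-curve slope of -3.10 and a PCR efficiency of roughly 100 ± 10% follows from the standard relation E = 10^(-1/slope) - 1. A minimal check:

```python
def pcr_efficiency(slope: float) -> float:
    """Amplification efficiency from a standard-curve slope (CT vs log10 CFU/ml).

    E = 10^(-1/slope) - 1; a slope of -3.32 corresponds to 100% efficiency
    (perfect doubling each cycle).
    """
    return 10 ** (-1.0 / slope) - 1.0

# The abstract's slope of -3.10 implies an efficiency of about 110%,
# consistent with the stated 100 +/- 10%.
eff = pcr_efficiency(-3.10)
print(round(100 * eff, 1))
```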

  2. Improved method estimating bioconcentration/bioaccumulation factor from octanol/water partition coefficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meylan, W.M.; Howard, P.H.; Aronson, D.

    1999-04-01

    A compound's bioconcentration factor (BCF) is the most commonly used indicator of its tendency to accumulate in aquatic organisms from the surrounding medium. Because it is expensive to measure, the BCF is generally estimated from the octanol/water partition coefficient (Kow), but currently used regression equations were developed from small data sets that do not adequately represent the wide range of chemical substances now subject to review. To develop an improved method, the authors collected BCF data in a file that contained information on measured BCFs and other key experimental details for 694 chemicals. Log BCF was then regressed against log Kow, and chemicals with significant deviations from the line of best fit were analyzed by chemical structure. The resulting algorithm classifies a substance as either nonionic or ionic, the latter group including carboxylic acids, sulfonic acids and their salts, and quaternary N compounds. Log BCF for nonionics is estimated from log Kow and a series of correction factors if applicable; different equations apply for log Kow 1.0 to 7.0 and >7.0. For ionics, chemicals are categorized by log Kow and a log BCF in the range 0.5 to 1.75 is assigned. Organometallics, nonionics with long alkyl chains, and aromatic azo compounds receive special treatment. The correlation coefficient and mean error for log BCF indicate that the new method fits existing data significantly better than other methods.
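    The structure of the algorithm, classify a substance as nonionic or ionic and then apply a Kow-dependent rule, can be sketched as follows. The numeric coefficients below are illustrative placeholders only; the abstract does not give the fitted equations, so these are not the authors' values:

```python
def log_bcf_nonionic(log_kow: float) -> float:
    """Illustrative piecewise logBCF estimate for nonionic chemicals.

    Shape of the method only (linear in logKow over 1-7, different behaviour
    outside that range); the coefficients are NOT the paper's fitted values.
    """
    if log_kow < 1.0:
        return 0.5                              # low-Kow floor (placeholder)
    if log_kow <= 7.0:
        return 0.77 * log_kow - 0.70            # mid-range regression (placeholder)
    return 0.77 * 7.0 - 0.70 - 1.37 * (log_kow - 7.0)  # declining branch (placeholder)

def log_bcf_ionic(log_kow: float) -> float:
    """Ionics: binned assignment in the 0.5-1.75 range described in the abstract."""
    if log_kow < 5.0:
        return 0.5
    return 1.75 if log_kow >= 6.0 else 1.0

print(round(log_bcf_nonionic(4.0), 2))  # 2.38
```

Correction factors for special structural classes (organometallics, long alkyl chains, aromatic azo compounds) would be added as further adjustments on top of these base estimates.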

  3. Host range of the emerald ash borer (Agrilus planipennis Fairmaire) (Coleoptera: Buprestidae) in North America: results of multiple-choice field experiments.

    PubMed

    Anulewicz, Andrea C; McCullough, Deborah G; Cappaert, David L; Poland, Therese M

    2008-02-01

    Emerald ash borer (Agrilus planipennis Fairmaire) (Coleoptera: Buprestidae), an invasive phloem-feeding pest, was identified as the cause of widespread ash (Fraxinus) mortality in southeast Michigan and Windsor, Ontario, Canada, in 2002. A. planipennis reportedly colonizes other genera in its native range in Asia, including Ulmus L., Juglans L., and Pterocarya Kunth. Attacks on nonash species have not been observed in North America to date, but there is concern that other genera could be colonized. From 2003 to 2005, we assessed adult A. planipennis landing rates, oviposition, and larval development on North American ash species and congeners of its reported hosts in Asia in multiple-choice field studies conducted at several southeast Michigan sites. Nonash species evaluated included American elm (U. americana L.), hackberry (Celtis occidentalis L.), black walnut (J. nigra L.), shagbark hickory [Carya ovata (Mill.) K.Koch], and Japanese tree lilac (Syringa reticulata Bl.). In studies with freshly cut logs, adult beetles occasionally landed on nonash logs but generally laid fewer eggs than on ash logs. Larvae fed and developed normally on ash logs, which were often heavily infested. No larvae were able to survive, grow, or develop on any nonash logs, although failed first-instar galleries occurred on some walnut logs. High densities of larvae developed on live green ash and white ash nursery trees, but there was no evidence of larval survival or development on Japanese tree lilac and black walnut trees in the same plantation. We felled, debarked, and intensively examined >28 m2 of phloem area on nine American elm trees growing in contact with or adjacent to heavily infested ash trees. We found no sign of A. planipennis feeding on any elm.

  4. An Accurate and Stable FFT-based Method for Pricing Options under Exp-Lévy Processes

    NASA Astrophysics Data System (ADS)

    Ding, Deng; Chong U, Sio

    2010-05-01

    An accurate and stable method for pricing European options in exp-Lévy models is presented. The main idea of this new method is to combine a quadrature technique with the Carr-Madan Fast Fourier Transform method. Theoretical analysis shows that the overall complexity of the new method is still O(N log N) for N grid points, as in the fast Fourier transform methods. Numerical experiments for different exp-Lévy processes also show that the proposed algorithm remains accurate and stable for small strike prices K, which develops and improves upon the Carr-Madan method.
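    The Carr-Madan approach referenced above prices a call from the characteristic function of the log-price by damping the payoff and inverting a Fourier-type integral. A minimal quadrature-based sketch using the Black-Scholes (log-normal) case as the exp-Lévy example; plain trapezoidal quadrature stands in for the paper's combined quadrature/FFT scheme:

```python
import numpy as np

def carr_madan_call(S0, K, r, sigma, T, alpha=1.5, vmax=200.0, n=4000):
    """European call via the Carr-Madan damped-transform integral.

    C(k) = exp(-alpha*k)/pi * Int_0^inf Re[exp(-i*v*k) * psi(v)] dv, with
    psi(v) = exp(-rT) * phi(v - (alpha+1)i) / (alpha^2 + alpha - v^2 + i(2alpha+1)v),
    where phi is the characteristic function of log S_T.
    """
    v = np.linspace(1e-8, vmax, n)
    # Black-Scholes characteristic function of log S_T (a special exp-Levy case).
    phi = lambda u: np.exp(
        1j * u * (np.log(S0) + (r - 0.5 * sigma**2) * T)
        - 0.5 * sigma**2 * u**2 * T
    )
    psi = np.exp(-r * T) * phi(v - (alpha + 1) * 1j) / (
        alpha**2 + alpha - v**2 + 1j * (2 * alpha + 1) * v
    )
    k = np.log(K)
    integrand = np.real(np.exp(-1j * v * k) * psi)
    dv = v[1] - v[0]
    integral = np.sum((integrand[:-1] + integrand[1:]) * 0.5) * dv  # trapezoid rule
    return np.exp(-alpha * k) / np.pi * integral

# At-the-money check against the Black-Scholes closed form (about 10.45):
price = carr_madan_call(100.0, 100.0, 0.05, 0.2, 1.0)
print(round(price, 2))
```

Evaluating the integral at a log-strike grid with the FFT instead of one strike at a time is what yields the O(N log N) cost for N strikes.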

  5. A method for development of a system of identification for Appalachian coal-bearing rocks

    USGS Publications Warehouse

    Ferm, J.C.; Weisenfluh, G.A.; Smith, G.C.

    2002-01-01

    The number of observable properties of sedimentary rocks is large, and numerous classifications have been proposed for describing them. Some rock classifications, however, may be disadvantageous in situations such as logging rock core during coal exploration programs, where speed and simplicity are of the essence. After experimenting with a number of formats for logging rock core in the Appalachian coal fields, a method using color photographs accompanied by a rock name and numeric code was selected. In order to generate a representative collection of rocks to be photographed, sampling methods were devised, and empirically based techniques were devised to identify repeatedly recognizable rock types. A number of cores representing the stratigraphic and geographic range of the region were sampled so that every megascopically recognizable variety was included in the collection; the frequency of samples of any variety reflects the frequency with which it would be encountered during logging. In order to generate repeatedly recognizable rock classes, the samples were sorted to display variation in grain size, mineral composition, color, and sedimentary structures. Class boundaries for each property were selected on the basis of existing, widely accepted limits and the precision with which those limits could be recognized. The process of sorting the core samples demonstrated relationships between rock properties and indicated that similar methods, applied to other groups of rocks, could yield more widely applicable field classifications. © 2002 Elsevier Science B.V. All rights reserved.

  6. Retinal nerve fiber layer thickness analysis in suspected malingerers with optic disc temporal pallor

    PubMed Central

    Civelekler, Mustafa; Halili, Ismail; Gundogan, Faith C; Sobaci, Gungor

    2009-01-01

    Purpose: To investigate the value of temporal retinal nerve fiber layer (RNFLtemporal) thickness in the prediction of malingering. Materials and Methods: This prospective, cross-sectional study was conducted on 33 military conscripts with optic disc temporal pallor (ODTP) and 33 age- and sex-matched healthy controls. Initial visual acuity (VAi) and visual acuity after simulation examination techniques (VAaset) were assessed. Subjects whose VAaset was two or more lines higher than their VAi were classified as malingerers. Thickness of the peripapillary RNFL was determined with OCT (Stratus OCT™, Carl Zeiss Meditec, Inc.). The RNFLtemporal thickness of each subject was categorized into one of the 1+ to 4+ groups according to the 50% confidence interval (CI), 25% CI and 5% CI values assessed in the control group. VAs were converted to LogMAR-VAs for statistical comparisons. Results: A significant difference was found only in the temporal quadrant of RNFL thickness in subjects with ODTP (P=0.002). Mean LogMAR-VA increased significantly after SETs (P<0.001). The sensitivity, specificity, positive and negative predictive values of categorized RNFLtemporal thickness in diagnosing malingering were 84.6%, 75.0%, 68.8%, and 88.2%, respectively. The ROC curve showed that an RNFLtemporal thickness of 67.5 μm is a significant cut-off point for determining malingering (P=0.001, area under the curve: 0.862). The correlations between LogMAR-VAs and RNFLtemporal thicknesses were significant; the correlation coefficient for LogMAR-VAi was lower than that for LogMAR-VAaset (r=−0.447, P=0.009 for LogMAR-VAi; r=−0.676, P<0.001 for LogMAR-VAaset). Conclusions: RNFLtemporal thickness assessment may be a valuable tool for objectively determining malingering in subjects with ODTP. PMID:19700875
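    The reported sensitivity, specificity, and predictive values all follow from an underlying 2x2 confusion table. A minimal sketch; the counts below are a hypothetical table chosen to be consistent with the reported 84.6%/75.0%/68.8%/88.2% figures, not values taken from the paper:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion table."""
    sens = tp / (tp + fn)   # true positives among all actual positives
    spec = tn / (tn + fp)   # true negatives among all actual negatives
    ppv = tp / (tp + fp)    # positive predictive value
    npv = tn / (tn + fn)    # negative predictive value
    return sens, spec, ppv, npv

# Hypothetical counts for the malingering classifier (illustration only):
sens, spec, ppv, npv = diagnostic_metrics(tp=11, fp=5, fn=2, tn=15)
print(round(100 * sens, 1), round(100 * spec, 1))  # 84.6 75.0
```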

  7. Accuracy and precision of Legionella isolation by US laboratories in the ELITE program pilot study.

    PubMed

    Lucas, Claressa E; Taylor, Thomas H; Fields, Barry S

    2011-10-01

    A pilot study for the Environmental Legionella Isolation Techniques Evaluation (ELITE) Program, a proficiency testing scheme for US laboratories that culture Legionella from environmental samples, was conducted from September 1, 2008 through March 31, 2009. Participants (n=20) processed panels consisting of six sample types: pure and mixed positive, pure and mixed negative, and pure and mixed variable. The majority (93%) of all samples (n=286) were correctly characterized, with 88.5% of positive samples and 100% of negative samples identified correctly. Variable samples were incorrectly identified as negative in 36.9% of reports. For all samples reported positive (n=128), participants underestimated the cfu/ml by a mean of 1.25 logs, with a standard deviation of 0.78 logs, a standard error of 0.07 logs, and a range of 3.57 logs relative to the CDC re-test value. Centering results around the interlaboratory mean yielded a standard deviation of 0.65 logs, a standard error of 0.06 logs, and a range of 3.22 logs. Sampling protocol, treatment regimen, culture procedure, and laboratory experience did not significantly affect the accuracy or precision of reported concentrations. Qualitative and quantitative results from the ELITE pilot study were similar to reports from a corresponding proficiency testing scheme in the European Union, indicating that these results are probably valid for most environmental laboratories worldwide. The large enumeration error observed suggests that the need for remediation of a water system should not be determined solely by the concentration of Legionella observed in a sample, since that value is likely to underestimate the true level of contamination. Published by Elsevier Ltd.
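The quoted mean, standard deviation, and standard error of the enumeration error are summary statistics of per-sample log10 ratios of reported to reference concentration. A minimal sketch, with invented cfu/ml values for illustration:

```python
import math

def log10_error_stats(reported, reference):
    """Mean, sample SD, and SE of log10(reported/reference)."""
    errors = [math.log10(r / t) for r, t in zip(reported, reference)]
    n = len(errors)
    mean = sum(errors) / n
    sd = math.sqrt(sum((e - mean) ** 2 for e in errors) / (n - 1))
    se = sd / math.sqrt(n)
    return mean, sd, se

# Invented reported vs. CDC re-test concentrations (cfu/ml):
mean, sd, se = log10_error_stats([10, 50, 100, 8], [1000, 2000, 5000, 100])
# mean < 0: these labs underestimate the reference value by ~1.6 logs
```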

  8. Global and Local Approaches Describing Critical Phenomena on the Developing and Developed Financial Markets

    NASA Astrophysics Data System (ADS)

    Grech, Dariusz

    We define and confront global and local methods of analyzing crash-like events on financial markets from the critical-phenomena point of view. These methods are based, respectively, on the analysis of log-periodicity and on the local fractal properties of financial time series in the vicinity of phase transitions (crashes). The log-periodicity analysis is made in a daily time horizon for the whole history (1991-2008) of the Warsaw Stock Exchange Index (WIG), connected with the largest developing financial market in Europe. We find that crash-like events on the Polish financial market are described better by the log-divergent price model decorated with log-periodic behavior than by the power-law-divergent price model usually discussed in log-periodic scenarios for developed markets. Predictions coming from the log-periodicity scenario are verified for all main crashes that took place in WIG history. It is argued that crash predictions within the log-periodicity model strongly depend on the amount of data taken to make a fit and are therefore likely to contain large inaccuracies. Next, this global analysis is confronted with the local fractal description. To do so, we calculate the so-called local (time-dependent) Hurst exponent H_loc for the WIG time series and for main US stock market indices such as the DJIA and S&P 500. We point out the dependence between the behavior of the local fractal properties of financial time series and the appearance of crashes on financial markets. We conclude that the local fractal method seems to work better than the global approach, both for developing and developed markets. The very recent situation on the market, particularly the Fed intervention in September 2007 and the situation immediately afterwards, is also analyzed within the fractal approach. It is shown in this context how the financial market evolves through different phases of fractional Brownian motion. Finally, the current situation on the American market is analyzed in fractal language, to show how far we still are from the end of the recession and from the beginning of a new boom on the US financial market or on other leading world stock markets.
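A time-dependent ("local") Hurst exponent can be estimated in several ways; the sketch below uses classical rescaled-range (R/S) analysis over a sliding window, which is only one such estimator and not necessarily the one used in the paper. Window and chunk sizes here are arbitrary:

```python
import math, random

def rs_hurst(series):
    """Hurst exponent via classical rescaled-range (R/S) analysis."""
    n = len(series)
    log_size, log_rs = [], []
    for size in (8, 16, 32, 64, 128):
        if size > n:
            break
        rs_vals = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = sum(chunk) / size
            cum, lo, hi = 0.0, 0.0, 0.0
            for x in chunk:                 # range of cumulative deviations
                cum += x - mean
                lo, hi = min(lo, cum), max(hi, cum)
            std = math.sqrt(sum((x - mean) ** 2 for x in chunk) / size)
            if std > 0:
                rs_vals.append((hi - lo) / std)
        log_size.append(math.log(size))
        log_rs.append(math.log(sum(rs_vals) / len(rs_vals)))
    # least-squares slope of log(R/S) against log(size) estimates H
    ms = sum(log_size) / len(log_size)
    mr = sum(log_rs) / len(log_rs)
    num = sum((a - ms) * (b - mr) for a, b in zip(log_size, log_rs))
    return num / sum((a - ms) ** 2 for a in log_size)

def local_hurst(series, window=256, step=64):
    """Sliding-window ('local') Hurst exponent of a return series."""
    return [rs_hurst(series[i:i + window])
            for i in range(0, len(series) - window + 1, step)]

random.seed(1)
returns = [random.gauss(0.0, 1.0) for _ in range(1024)]  # uncorrelated noise
h = local_hurst(returns)  # values scatter around 0.5 for white noise
```

Persistent (trending) stretches push the local estimate above 0.5 and anti-persistent ones below it, which is the kind of time variation the paper relates to the phases around a crash.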

  9. Planetary Geochemistry Techniques: Probing In-Situ with Neutron and Gamma Rays (PING) Instrument

    NASA Technical Reports Server (NTRS)

    Parsons, A.; Bodnarik, J.; Burger, D.; Evans, L.; Floyd, S.; Lin, L.; McClanahan, T.; Nankung, M.; Nowicki, S.; Schweitzer, J.

    2011-01-01

    The Probing In situ with Neutrons and Gamma rays (PING) instrument is a promising planetary science application of the active neutron-gamma ray technology so successfully used in oil field well logging and mineral exploration on Earth. The objective of our technology development program at NASA Goddard Space Flight Center's (NASA/GSFC) Astrochemistry Laboratory is to extend the application of neutron interrogation techniques to landed in situ planetary composition measurements by using a 14 MeV Pulsed Neutron Generator (PNG) combined with neutron and gamma ray detectors, to probe the surface and subsurface of planetary bodies without the need to drill. We are thus working to bring the PING instrument to the point where it can be flown on a variety of surface lander or rover missions to the Moon, Mars, Venus, asteroids, comets and the satellites of the outer planets.

  10. Green lumber grade yields from black cherry and red maple factory grade logs sawed at band and circular mills

    Treesearch

    Daniel A. Yaussy

    1989-01-01

    Multivariate regression models were developed to predict green board-foot yields (1 board ft = 2.360 dm³) for the standard factory lumber grades processed from black cherry (Prunus serotina Ehrh.) and red maple (Acer rubrum L.) factory grade logs sawed at band and circular sawmills. The models use log...

  11. Adjusting Quality index Log Values to Represent Local and Regional Commercial Sawlog Product Values

    Treesearch

    Orris D. McCauley; Joseph J. Mendel

    1969-01-01

    The primary purpose of this paper is not only to report the results of a comparative analysis of how well the Q.I. method predicts log product values relative to commercial sawmill log output values, but also to develop a methodology that will facilitate the comparison and provide the adjustments needed by the sawmill operator.

  12. A Manual of Instruction for Log Scaling and the Measurement of Timber Products.

    ERIC Educational Resources Information Center

    Idaho State Board of Vocational Education, Boise. Div. of Trade and Industrial Education.

    This manual was developed by a state advisory committee in Idaho to improve and standardize log scaling and provide a reference in training men for the job of log scaling in timber measurement. The content includes: (1) an introduction containing the scope of the manual, a definition and history of scaling, the reasons for scaling, and the…

  13. British Columbia log export policy: historical review and analysis.

    Treesearch

    Craig W. Shinn

    1993-01-01

    Log exports have been restricted in British Columbia for over 100 years. The intent of the restriction is to use the timber in British Columbia to encourage development of forest industry, employment, and well-being in the Province. Logs have been exempted from the within-Province manufacturing rule at various times, in varying amounts, for different reasons, and by...

  14. When Smart People Fail: An Analysis of the Transaction Log of an Online Public Access Catalog.

    ERIC Educational Resources Information Center

    Peters, Thomas A.

    1989-01-01

    Describes a low cost study of the transaction logs of an online catalog at an academic library that examined failure rates, usage patterns, and probable causes of patron problems. The implications of the findings for bibliographic instruction and collection development are discussed and the benefits of analyzing transaction logs are identified.…

  15. Butt-log grade distributions for five Appalachian hardwood species

    Treesearch

    John R. Myers; Gary W. Miller; Harry V. Wiant, Jr.; Joseph E. Barnard

    1986-01-01

    Tree quality is an important factor in determining the market value of hardwood timber stands, but many forest inventories do not include estimates of tree quality. Butt-log grade distributions were developed for northern red oak, black oak, white oak, chestnut oak, and yellow-poplar using USDA Forest Service log grades on more than 4,700 trees in West Virginia. Butt-...

  16. Multivariate regression model for predicting yields of grade lumber from yellow birch sawlogs

    Treesearch

    Andrew F. Howard; Daniel A. Yaussy

    1986-01-01

    A multivariate regression model was developed to predict green board-foot yields for the common grades of factory lumber processed from yellow birch factory-grade logs. The model incorporates the standard log measurements of scaling diameter, length, proportion of scalable defects, and the assigned USDA Forest Service log grade. Differences in yields between band and...

  17. Pathogen Reduction in Human Plasma Using an Ultrashort Pulsed Laser

    PubMed Central

    Tsen, Shaw-Wei D.; Kingsley, David H.; Kibler, Karen; Jacobs, Bert; Sizemore, Sara; Vaiana, Sara M.; Anderson, Jeanne; Tsen, Kong-Thon; Achilefu, Samuel

    2014-01-01

    Pathogen reduction is a viable approach to ensure the continued safety of the blood supply against emerging pathogens. However, the currently licensed pathogen reduction techniques are ineffective against non-enveloped viruses such as hepatitis A virus, and they introduce chemicals whose potential side effects have limited their widespread use. In this report, we demonstrate the inactivation of both enveloped and non-enveloped viruses in human plasma using a novel chemical-free method, a visible ultrashort pulsed laser. We found that laser treatment resulted in 2-log, 1-log, and 3-log reductions in human immunodeficiency virus, hepatitis A virus, and murine cytomegalovirus, respectively, in human plasma. Laser-treated plasma showed ≥70% retention for most coagulation factors tested. Furthermore, laser treatment did not alter the structure of a model coagulation factor, fibrinogen. Ultrashort pulsed lasers are a promising new method for chemical-free, broad-spectrum pathogen reduction in human plasma. PMID:25372037

  18. Cave Pearl Data Logger: A Flexible Arduino-Based Logging Platform for Long-Term Monitoring in Harsh Environments.

    PubMed

    Beddows, Patricia A; Mallon, Edward K

    2018-02-09

    A low-cost data logging platform is presented that provides long-term operation in remote or submerged environments. Three premade "breakout boards" from the open-source Arduino ecosystem are assembled into the core of the data logger. Power optimization techniques are presented which extend the operational life of this module-based design to >1 year on three alkaline AA batteries. Robust underwater housings are constructed for these loggers using PVC fittings. Both the logging platform and the enclosures are easy to build and modify without specialized tools or a significant background in electronics. This combination turns the Cave Pearl data logger into a generalized prototyping system, and this design flexibility is demonstrated with two field studies recording drip rates in a cave and water flow in a flooded cave system. This paper describes a complete DIY solution, suitable for a wide range of challenging deployment conditions.

  19. Cave Pearl Data Logger: A Flexible Arduino-Based Logging Platform for Long-Term Monitoring in Harsh Environments

    PubMed Central

    Mallon, Edward K.

    2018-01-01

    A low-cost data logging platform is presented that provides long-term operation in remote or submerged environments. Three premade “breakout boards” from the open-source Arduino ecosystem are assembled into the core of the data logger. Power optimization techniques are presented which extend the operational life of this module-based design to >1 year on three alkaline AA batteries. Robust underwater housings are constructed for these loggers using PVC fittings. Both the logging platform and the enclosures are easy to build and modify without specialized tools or a significant background in electronics. This combination turns the Cave Pearl data logger into a generalized prototyping system, and this design flexibility is demonstrated with two field studies recording drip rates in a cave and water flow in a flooded cave system. This paper describes a complete DIY solution, suitable for a wide range of challenging deployment conditions. PMID:29425185

  20. The modelling of carbon-based supercapacitors: Distributions of time constants and Pascal Equivalent Circuits

    NASA Astrophysics Data System (ADS)

    Fletcher, Stephen; Kirkpatrick, Iain; Dring, Roderick; Puttock, Robert; Thring, Rob; Howroyd, Simon

    2017-03-01

    Supercapacitors are an emerging technology with applications in pulse power, motive power, and energy storage. However, their carbon electrodes show a variety of non-ideal behaviours that have so far eluded explanation. These include Voltage Decay after charging, Voltage Rebound after discharging, and Dispersed Kinetics at long times. In the present work, we establish that a vertical ladder network of RC components can reproduce all these puzzling phenomena. Both software and hardware realizations of the network are described. In general, porous carbon electrodes contain random distributions of resistance R and capacitance C, with a wider spread of log R values than log C values. To understand what this implies, a simplified model is developed in which log R is treated as a Gaussian random variable while log C is treated as a constant. From this model, a new family of equivalent circuits is developed in which the continuous distribution of log R values is replaced by a discrete set of log R values drawn from a geometric series. We call these Pascal Equivalent Circuits. Their behaviour is shown to resemble closely that of real supercapacitors. The results confirm that distributions of RC time constants dominate the behaviour of real supercapacitors.
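The replacement of a continuous spread of log R by a discrete geometric series can be illustrated by superposing the exponential relaxations of parallel RC branches. This is a deliberately simplified parallel-branch picture with invented component values, not the vertical ladder network of the paper:

```python
import math

def relaxation(t, r_values, c=1.0):
    """Normalized discharge current of parallel RC branches with a common
    capacitance C: i(t)/i(0) = sum_k (1/R_k) exp(-t/(R_k*C)) / sum_k (1/R_k)."""
    g = [1.0 / r for r in r_values]
    i_t = sum(gk * math.exp(-t / (rk * c)) for gk, rk in zip(g, r_values))
    return i_t / sum(g)

# R values drawn from a geometric series (log R equally spaced),
# standing in for a Gaussian spread of log R at constant C:
r_values = [0.1 * 10 ** (k / 2) for k in range(7)]   # 0.1 ... 100 ohm
decay = [relaxation(t, r_values) for t in (0.0, 0.1, 1.0, 10.0)]
# the superposition decays over many decades of time ("dispersed kinetics")
```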

  1. The development and validation of the Instructional Practices Log in Science: a measure of K-5 science instruction

    NASA Astrophysics Data System (ADS)

    Adams, Elizabeth L.; Carrier, Sarah J.; Minogue, James; Porter, Stephen R.; McEachin, Andrew; Walkowiak, Temple A.; Zulli, Rebecca A.

    2017-02-01

    The Instructional Practices Log in Science (IPL-S) is a daily teacher log developed for K-5 teachers to self-report their science instruction. The items on the IPL-S are grouped into scales measuring five dimensions of science instruction: Low-level Sense-making, High-level Sense-making, Communication, Integrated Practices, and Basic Practices. As part of the current validation study, 206 elementary teachers completed 4137 daily log entries. The purpose of this paper is to provide evidence of validity for the IPL-S's scales, including (a) support for the theoretical framework; (b) cognitive interviews with logging teachers; (c) item descriptive statistics; (d) comparisons of 28 pairs of teacher and rater logs; and (e) an examination of the internal structure of the IPL-S. We present evidence to describe the extent to which the items and the scales are completed accurately by teachers and differentiate various types of science instructional strategies employed by teachers. Finally, we point to several practical implications of our work and potential uses for the IPL-S. Overall, results provide neutral to positive support for the validity of the groupings of items or scales.

  2. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper.

    PubMed

    Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong

    2017-06-10

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of a borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system. This index is determined by the blind zone length and remote reflecting interface detection capability of the system. To reduce the blind zone length and detect near the reflecting interface, a full bridge acoustic emission technique based on bootstrap gate driver (BGD) and metal-oxide-semiconductor field effect transistor (MOSFET) is designed by analyzing the working principle and impedance characteristics of a given piezoelectric transducer. To detect the remote reflecting interface and reduce the dynamic range of the received echo signals, the relationships between the echo amplitude and propagation distance of ultrasonic waves are determined. A signal compensation technique based on time-varying amplification theory, which can automatically change the gain according to the echo arrival time is designed. Lastly, the aforementioned techniques and corresponding circuits are experimentally verified. Results show that the blind zone length in the UDM system of the LWD caliper is significantly reduced and the capability to detect the remote reflecting interface is considerably improved.

  3. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper

    PubMed Central

    Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong

    2017-01-01

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of a borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system. This index is determined by the blind zone length and remote reflecting interface detection capability of the system. To reduce the blind zone length and detect near the reflecting interface, a full bridge acoustic emission technique based on bootstrap gate driver (BGD) and metal-oxide-semiconductor field effect transistor (MOSFET) is designed by analyzing the working principle and impedance characteristics of a given piezoelectric transducer. To detect the remote reflecting interface and reduce the dynamic range of the received echo signals, the relationships between the echo amplitude and propagation distance of ultrasonic waves are determined. A signal compensation technique based on time-varying amplification theory, which can automatically change the gain according to the echo arrival time is designed. Lastly, the aforementioned techniques and corresponding circuits are experimentally verified. Results show that the blind zone length in the UDM system of the LWD caliper is significantly reduced and the capability to detect the remote reflecting interface is considerably improved. PMID:28604603

  4. Operator-Friendly Technique and Quality Control Considerations for Indigo Colorimetric Measurement of Ozone Residual

    EPA Science Inventory

    Drinking water ozone disinfection systems measure ozone residual concentration, C, for regulatory compliance reporting of concentration-times-time (CT), and the resultant log-inactivation of virus, Giardia and Cryptosporidium. The indigotrisulfonate (ITS) colorimetric procedure i...

  5. Factors influencing woodlands of southwestern North Dakota

    Treesearch

    Michele M. Girard; Harold Goetz; Ardell J. Bjugstad

    1987-01-01

    Literature pertaining to woodlands of southwestern North Dakota is reviewed. Woodland species composition and distribution, and factors influencing woodland ecosystems such as climate, logging, fire, and grazing are described. Potential management and improvement techniques using vegetation and livestock manipulation have been suggested.

  6. Preparation and testing of drilled shafts with self-consolidating concrete.

    DOT National Transportation Integrated Search

    2012-06-01

    In this study, self-consolidating concrete (SCC) was evaluated in drilled shafts and the : integrity of drilled shafts was determined using cross-hole sonic logging (CSL), a low-strain : nondestructive integrity testing technique. SCC has very high f...

  7. Robust sampling-sourced numerical retrieval algorithm for optical energy loss function based on log-log mesh optimization and local monotonicity preserving Steffen spline

    NASA Astrophysics Data System (ADS)

    Maglevanny, I. I.; Smolar, V. A.

    2016-01-01

    We introduce a new technique for interpolating the energy-loss function (ELF) in solids sampled by empirical optical spectra. Finding appropriate interpolation methods for ELFs poses several challenges. The sampled ELFs are usually very heterogeneous and can originate from various sources, so "data gaps" can appear, and significant discontinuities and multiple high outliers can be present. As a result, an interpolation based on those data may not perform well at predicting reasonable physical results. Reliable interpolation tools suitable for ELF applications should therefore satisfy several important demands: accuracy and predictive power, robustness and computational efficiency, and ease of use. We examined the effect of different interpolation schemes on fitting quality, with emphasis on ELF mesh optimization procedures, and we argue that the optimal fitting should be based on a preliminary log-log scaling data transform, by which the non-uniformity of the sampled data distribution may be considerably reduced. The transformed data are then interpolated by a local monotonicity-preserving Steffen spline. The result is a piecewise-smooth fitting curve with continuous first-order derivatives that passes through all data points without spurious oscillations. Local extrema can occur only at grid points, where they are given by the data, but not in between two adjacent grid points. We find that the proposed technique gives the most accurate results and that its computational time is short. It is thus feasible to use this simple method to address practical problems associated with the interaction between a bulk material and a moving electron. A compact C++ implementation of our algorithm is also presented.
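The combination of a log-log data transform with a monotonicity-preserving spline can be sketched from scratch. The sketch below uses Steffen-style limited slopes with simple one-sided end conditions and invented sample data; it illustrates the approach, not the authors' C++ code:

```python
import math

def steffen_slopes(x, y):
    """Steffen's monotonicity-limited node slopes (no spurious oscillations)."""
    n = len(x)
    h = [x[i + 1] - x[i] for i in range(n - 1)]
    s = [(y[i + 1] - y[i]) / h[i] for i in range(n - 1)]
    d = [0.0] * n
    for i in range(1, n - 1):
        if s[i - 1] * s[i] <= 0:
            d[i] = 0.0                      # local extremum given by the data
        else:
            p = (s[i - 1] * h[i] + s[i] * h[i - 1]) / (h[i - 1] + h[i])
            d[i] = math.copysign(min(abs(s[i - 1]), abs(s[i]), 0.5 * abs(p)),
                                 s[i - 1])
    d[0], d[n - 1] = s[0], s[-1]            # simple one-sided end slopes
    return d

def hermite(x, y, d, xq):
    """Evaluate the piecewise cubic Hermite interpolant at xq."""
    i = 0
    while i < len(x) - 2 and xq > x[i + 1]:
        i += 1
    h = x[i + 1] - x[i]
    t = (xq - x[i]) / h
    return ((2*t**3 - 3*t**2 + 1) * y[i] + (t**3 - 2*t**2 + t) * h * d[i]
            + (-2*t**3 + 3*t**2) * y[i + 1] + (t**3 - t**2) * h * d[i + 1])

def loglog_interp(x, y, queries):
    """Interpolate strictly positive (x, y) data on log-log axes."""
    lx, ly = [math.log(v) for v in x], [math.log(v) for v in y]
    d = steffen_slopes(lx, ly)
    return [math.exp(hermite(lx, ly, d, math.log(q))) for q in queries]

# Sparse, non-uniform sample of an ELF-like peaked curve (invented data):
x = [0.1, 0.3, 1.0, 3.0, 10.0, 100.0]
y = [0.01, 0.2, 1.5, 0.9, 0.3, 0.02]
yq = loglog_interp(x, y, [0.1, 0.5, 2.0, 50.0, 100.0])
```

The log transform compresses the decade-spanning abscissa, and the limited slopes guarantee the fit stays within the data values on each subinterval, which is the "no spurious oscillations" property the abstract describes.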

  8. Fovea sparing internal limiting membrane peeling using multiple parafoveal curvilinear peels for myopic foveoschisis: technique and outcome.

    PubMed

    Jin, Haiying; Zhang, Qi; Zhao, Peiquan

    2016-10-18

    To introduce a modified surgical technique, "parafoveal multiple curvilinear internal limiting membrane (ILM) peeling", to preserve epi-foveal ILM in myopic foveoschisis surgery. Consecutive patients with myopic foveoschisis were enrolled in this prospective interventional case series. The surgeries were performed using a transconjunctival 23-gauge system. The macular area was divided into quadrants. The ILM was peeled off in a curvilinear manner centered on a site away from the central fovea in each quadrant. Shearing forces were used to control the direction and keep the peeling away from the central fovea. Epi-foveal ILM of about 500 to 1000 μm was preserved by this technique. The technique was performed in 20 eyes of 20 consecutive patients, and epi-foveal ILM was successfully preserved in all cases. Patients were followed up for more than 12 months. The mean postoperative logMAR visual acuity improved from 1.67 ± 0.65 preoperatively to 1.15 ± 0.49 (P = 0.015; paired t-test). Postoperative OCT examinations showed that full-thickness macular holes (MHs) did not develop in any case. Central foveal thickness decreased from 910 ± 261 μm preoperatively to 125 ± 85 μm postoperatively (P = 0.001; paired t-test). Fovea-sparing ILM peeling using multiple parafoveal curvilinear peels prevents the development of postoperative full-thickness MHs in eyes with myopic foveoschisis.

  9. SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabat, C; Defoor, D; Alexandrian, A

    2016-06-15

    Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, a requirement for more elaborate tracking techniques to monitor components' integrity is paramount. ElektaLog files are generated every 40 milliseconds, which can be analyzed to track subtle changes and provide another aspect of quality assurance. This allows for constant monitoring of fraction consistency in addition to machine reliability. With this in mind, it was the aim of the study to evaluate if ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%-2mm criterion for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-mean-square value of the individual recorded errors during treatment. Additionally, beam-on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and a jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: The use of ElektaLog files for day-to-day evaluation of linac integrity and patient QA allows for reliable analysis of system accuracy and performance.
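A 2%-2mm gamma comparison of two fluence maps can be sketched as a brute-force search over neighboring points. This is a simplified, globally normalized version on toy grids, not the ElektaLog analysis itself:

```python
import math

def gamma_pass_rate(ref, evalm, spacing=1.0, dose_tol=0.02, dist_tol=2.0):
    """Brute-force global 2D gamma index. dose_tol is a fraction of the
    maximum reference dose; dist_tol and spacing share the same units (mm).
    Returns the fraction of reference points with gamma <= 1."""
    rows, cols = len(ref), len(ref[0])
    d_norm = dose_tol * max(max(r) for r in ref)    # global normalization
    search = int(math.ceil(dist_tol / spacing)) + 1
    passed = total = 0
    for i in range(rows):
        for j in range(cols):
            best = float("inf")                      # minimized gamma^2
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        dist2 = (di * spacing) ** 2 + (dj * spacing) ** 2
                        dd = evalm[ii][jj] - ref[i][j]
                        best = min(best,
                                   dist2 / dist_tol ** 2 + (dd / d_norm) ** 2)
            total += 1
            passed += best <= 1.0
    return passed / total

# Toy fluence maps: identical maps give gamma = 0 at every point.
ref = [[1.0, 2.0, 3.0], [2.0, 4.0, 2.0], [1.0, 2.0, 1.0]]
same = gamma_pass_rate(ref, ref)   # pass rate 1.0
```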

  10. The effect of terpene enhancer lipophilicity on the percutaneous permeation of hydrocortisone formulated in HPMC gel systems.

    PubMed

    El-Kattan, A F; Asbill, C S; Michniak, B B

    2000-04-05

    The percutaneous permeation of hydrocortisone (HC) was investigated in hairless mouse skin after application of an alcoholic hydrogel using a diffusion cell technique. The formulations contained one of 12 terpenes, the selection of which was based on an increase in their lipophilicity (log P 1.06-5.36). Flux, cumulative receptor concentrations, skin content, and lag time of HC were measured over 24 h and compared with control gels (containing no terpene). Furthermore, HC skin content and the solubility of HC in the alcoholic hydrogel solvent mixture in the presence of terpene were determined and correlated to the enhancing activity of terpenes. The in vitro permeation experiments with hairless mouse skin revealed that the terpene enhancers varied in their ability to enhance the flux of HC. Nerolidol, which possessed the highest lipophilicity (log P = 5.36 ± 0.38), provided the greatest enhancement of HC flux (35.3-fold over control). Fenchone (log P = 2.13 ± 0.30) exhibited the lowest enhancement of HC flux (10.1-fold over control). In addition, a linear relationship was established between the log P of terpenes and the cumulative amount of HC in the receptor after 24 h (Q24). Nerolidol provided the highest Q24 (1733 ± 93 μg/cm²), whereas verbenone produced the lowest Q24 (653 ± 105 μg/cm²). Thymol provided the lowest HC skin content (1151 ± 293 μg/g), while cineole produced the highest HC skin content (18999 ± 5666 μg/g). No correlation was established between the log P of enhancers and HC skin content. A correlation, however, existed between the log P of terpenes and the lag time: as log P increased, a linear decrease in lag time was observed. Cymene yielded the shortest HC lag time, while fenchone produced the longest lag time. Also, the increase in the log P of terpenes resulted in a proportional increase in HC solubility in the formulation solvent mixture.

  11. Geopressure modeling from petrophysical data: An example from East Kalimantan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herkommer, M.A.

    1994-07-01

    Localized models of abnormal formation pressure (geopressure) are important economic and safety tools frequently used for well planning and drilling operations. Simplified computer-based procedures have been developed that permit these models to be developed more rapidly and with greater accuracy. These techniques are broadly applicable to basins throughout the world where abnormal formation pressures occur. An example from the Attaka field of East Kalimantan, southeast Asia, shows how geopressure models are developed. Using petrophysical and engineering data, empirical correlations between observed pressure and petrophysical logs can be created by computer-assisted data-fitting techniques. These correlations serve as the basis for models of the geopressure. By performing repeated analyses on wells at various locations, contour maps on the top of abnormal geopressure can be created. Methods that are simple in their development and application make the task of geopressure estimation less formidable to the geologist and petroleum engineer. Further, more accurate estimates can significantly improve drilling speeds while reducing the incidence of stuck pipe, kicks, and blowouts. In general, geopressure estimates are used in all phases of drilling operations: to develop mud plans and specify equipment ratings, to assist in the recognition of geopressured formations and determination of mud weights, and to improve predictions at offset locations and geologically comparable areas.

  12. Predicting landslides in clearcut patches

    Treesearch

    Raymond M. Rice; Norman H. Pillsbury

    1982-01-01

    Accelerated erosion in the form of landslides can be an undesirable consequence of clearcut logging on steep slopes. Forest managers need a method of predicting the risk of such erosion. Data collected after logging in a granitic area of northwestern California were used to develop a predictive equation. A linear discriminant function was developed that...

  13. Volume, value, and thinning: logs for the future.

    Treesearch

    Sally Duncan

    2002-01-01

    Thinning is one of our most important ways to influence tree and stand development. The objectives may include increasing the volume, size, and quality of wood produced from a forest and developing particular stand structures and characteristics for other values, such as wildlife or aesthetics. The Levels-of-Growing-Stock (LOGS) Cooperative was initiated in...

  14. Integrated reservoir characterization for unconventional reservoirs using seismic, microseismic and well log data

    NASA Astrophysics Data System (ADS)

    Maity, Debotyam

    This study is aimed at an improved understanding of unconventional reservoirs, which include tight reservoirs (such as shale oil and gas plays), geothermal developments, etc. We provide a framework for improved fracture-zone identification and mapping of the subsurface for a geothermal system by integrating data from different sources. The proposed ideas and methods were tested primarily on data obtained from the North Brawley geothermal field and The Geysers geothermal field, apart from synthetic datasets which were used to test new algorithms before application to the real datasets. The study has resulted in novel or improved algorithms for use at specific stages of data acquisition and analysis, including an improved phase-detection technique for passive seismic (and teleseismic) data as well as optimization of passive seismic surveys for the best possible processing results. The proposed workflow makes use of novel integration methods as a means of making the best use of the available geophysical data for fracture characterization. The methodology incorporates soft computing tools such as hybrid neural networks (neuro-evolutionary algorithms) as well as geostatistical simulation techniques to improve the property estimates and the overall characterization efficacy. The basic elements of the proposed characterization workflow involve using seismic and microseismic data to characterize structural and geomechanical features within the subsurface. We use passive seismic data to model geomechanical properties, which are combined with other properties evaluated from seismic and well logs to derive both qualitative and quantitative fracture-zone identifiers. The study has resulted in a broad framework highlighting a new technique for utilizing geophysical data (seismic and microseismic) for unconventional reservoir characterization. It provides an opportunity to optimally develop the resources in question by incorporating data from different sources and using their temporal and spatial variability as a means to better understand reservoir behavior. As part of this study, we have developed the following elements, which are discussed in the subsequent chapters: (1) an integrated characterization framework for unconventional settings with adaptable workflows for all stages of data processing, interpretation, and analysis; (2) a novel autopicking workflow for noisy passive seismic data, used for improved accuracy in event picking as well as for improved velocity model building; (3) an improved passive seismic survey design optimization framework for better data collection and improved property estimation; (4) extensive post-stack seismic attribute studies incorporating robust schemes applicable in complex reservoir settings; (5) uncertainty quantification and analysis to better quantify property estimates beyond the qualitative interpretations made and to validate observations independently, with quantified uncertainties, to prevent erroneous interpretations; (6) property mapping from microseismic data, including stress and anisotropic-weakness estimates, for integrated reservoir characterization and analysis; and (7) integration of results (seismic, microseismic, and well logs) from the analysis of individual datasets for integrated interpretation using the predefined integration framework and soft computing tools.

  15. Antimicrobial photodynamic inactivation: a bright new technique to kill resistant microbes

    PubMed Central

    Hamblin, Michael R

    2016-01-01

    Photodynamic therapy (PDT) uses photosensitizers (non-toxic dyes) that are activated by absorption of visible light to form reactive oxygen species (including singlet oxygen) that can oxidize biomolecules and destroy cells. Antimicrobial photodynamic inactivation (aPDI) can treat localized infections. aPDI neither induces resistance in microbes nor is it affected by existing drug-resistance status. We discuss some recent developments in aPDI. New photosensitizers, including polycationic conjugates, stable synthetic bacteriochlorins and functionalized fullerenes, are described. The microbial killing by aPDI can be synergistically potentiated (by several logs) by harmless inorganic salts via photochemistry. Genetically engineered bioluminescent microbial cells allow PDT to treat infections in animal models. Photoantimicrobials have a promising future in the face of the unrelenting increase in antibiotic resistance. PMID:27421070

  16. Contrast Invariant Interest Point Detection by Zero-Norm LoG Filter.

    PubMed

    Zhenwei Miao; Xudong Jiang; Kim-Hui Yap

    2016-01-01

    The Laplacian of Gaussian (LoG) filter is widely used in interest point detection. However, low-contrast image structures, though stable and significant, are often submerged by the high-contrast ones in the response image of the LoG filter, and hence are difficult to detect. To solve this problem, we derive a generalized LoG filter and propose a zero-norm LoG filter. The response of the zero-norm LoG filter is proportional to the weighted number of bright/dark pixels in a local region, which makes the filter invariant to image contrast. Based on the zero-norm LoG filter, we develop an interest point detector to extract local structures from images. Compared with contrast-dependent detectors, such as the popular scale invariant feature transform detector, the proposed detector is robust to illumination changes and abrupt variations in images. Experiments on benchmark databases demonstrate the superior performance of the proposed zero-norm LoG detector in terms of the repeatability and matching score of the detected points as well as the image recognition rate under different conditions.
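The contrast sensitivity that motivates the zero-norm variant is easy to see with the standard scale-normalized LoG detector. Below is a minimal sketch on a synthetic image using SciPy's `gaussian_laplace`; the zero-norm filter itself is not reproduced here:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic image: one dark Gaussian blob on a bright background.
img = np.full((64, 64), 200.0)
yy, xx = np.mgrid[0:64, 0:64]
img -= 150.0 * np.exp(-((yy - 20) ** 2 + (xx - 40) ** 2) / (2 * 4.0 ** 2))

# Scale-normalized LoG response; sigma roughly matched to the blob radius.
sigma = 4.0
response = sigma ** 2 * gaussian_laplace(img, sigma=sigma)

# Interest point = extremum of the (absolute) LoG response.
peak = np.unravel_index(np.argmax(np.abs(response)), response.shape)
print(peak)  # close to the blob center (20, 40)
```

Because the response scales with the blob's amplitude, a second blob with, say, a tenth of the contrast would be easily drowned out -- the behavior the zero-norm filter is designed to remove.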

  17. How log-normal is your country? An analysis of the statistical distribution of the exported volumes of products

    NASA Astrophysics Data System (ADS)

    Annunziata, Mario Alberto; Petri, Alberto; Pontuale, Giorgio; Zaccaria, Andrea

    2016-10-01

    We have considered the statistical distributions of the volumes of 1131 products exported by 148 countries. We have found that the form of these distributions is not unique but depends heavily on the level of development of the nation, as expressed by macroeconomic indicators like GDP, GDP per capita, total export and a recently introduced measure of countries' economic complexity called fitness. We have identified three major classes: a) an incomplete log-normal shape, truncated on the left side, for the less developed countries; b) a complete log-normal, with a wider range of volumes, for nations with an intermediate economy; and c) a strongly asymmetric shape for countries with a high degree of development. Finally, the log-normality hypothesis has been checked for the distributions of all 148 countries through different tests, Kolmogorov-Smirnov and Cramér-von Mises, confirming that it cannot be rejected only for the countries with an intermediate economy.
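The goodness-of-fit step can be sketched as follows, using synthetic volumes (not the actual trade data) and SciPy's Kolmogorov-Smirnov and Cramér-von Mises tests against a fitted log-normal. Note that testing against fitted parameters makes the p-values optimistic (the Lilliefors caveat):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical export volumes, log-normal by construction (not the real data).
volumes = rng.lognormal(mean=10.0, sigma=2.0, size=1131)

# Fit a log-normal, then test the fit as in the paper's final step.
shape, loc, scale = stats.lognorm.fit(volumes, floc=0)
ks = stats.kstest(volumes, 'lognorm', args=(shape, loc, scale))
cvm = stats.cramervonmises(volumes, 'lognorm', args=(shape, loc, scale))
print(ks.pvalue, cvm.pvalue)  # neither test rejects log-normality here
```

For the truncated ("incomplete") class, the same tests applied to left-censored samples would reject the complete log-normal, which is how the three classes can be separated in practice.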

  18. An interactive machine-learning approach for defect detection in computed tomography (CT) images of hardwood logs

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt; Philip A. Araman

    2005-01-01

    This paper describes recent progress in the analysis of computed tomography (CT) images of hardwood logs. The long-term goal of the work is to develop a system that is capable of autonomous (or semiautonomous) detection of internal defects, so that log breakdown decisions can be optimized based on defect locations. The problem is difficult because wood exhibits large...

  19. The PsyLOG mobile application: development of a tool for the assessment and monitoring of side effects of psychotropic medication.

    PubMed

    Kuzman, Martina Rojnic; Andlauer, Olivier; Burmeister, Kai; Dvoracek, Boris; Lencer, Rebekka; Koelkebeck, Katja; Nawka, Alexander; Riese, Florian

    2017-06-01

    Mobile health interventions are regarded as affordable and accessible tools that can enhance standard psychiatric care. As part of the mHealth Psycho-Educational Intervention Versus Antipsychotic-Induced Side Effects (mPIVAS) project (www.psylog.eu), we developed the mobile application "PsyLOG", based on smartphone technology, to monitor antipsychotic-induced side effects. The aim of this paper is to describe the rationale for and development of PsyLOG and its clinical use. The PsyLOG application runs on smartphones with the Android operating system. The application is currently available in seven languages (Croatian, Czech, English, French, German, Japanese and Serbian). It consists of several categories: "My Drug Effects", "My Life Styles", "My Charts", "My Medication", "My Strategies", "My Supporters", "Settings" and "About". The main category, "My Drug Effects", includes a list of 30 side effects, with the possibility to add three additional side effects. Each side effect is accompanied by an appropriate description and can be rated for severity on a visual analogue scale from 0 to 100%. The PsyLOG application is intended to enhance the link between patients and mental health professionals, serving as a tool that more objectively monitors side effects over certain periods of time. To the best of our knowledge, no such applications have so far been developed for patients taking antipsychotic medication or for their therapists.

  20. Health Screenings at School

    MedlinePlus

    ... others develop later. A child who has difficulty reading the blackboard may not know that she is ...

  1. Magnetic resonance imaging in laboratory petrophysical core analysis

    NASA Astrophysics Data System (ADS)

    Mitchell, J.; Chandrasekera, T. C.; Holland, D. J.; Gladden, L. F.; Fordham, E. J.

    2013-05-01

    Magnetic resonance imaging (MRI) is a well-known technique in medical diagnosis and materials science. In the more specialized arena of laboratory-scale petrophysical rock core analysis, the role of MRI has undergone a substantial change in focus over the last three decades. Initially, alongside the continual drive to exploit higher magnetic field strengths in MRI applications for medicine and chemistry, the same trend was followed in core analysis. However, the spatial resolution achievable in heterogeneous porous media is inherently limited due to the magnetic susceptibility contrast between solid and fluid. As a result, imaging resolution at the length-scale of typical pore diameters is not practical and so MRI of core-plugs has often been viewed as an inappropriate use of expensive magnetic resonance facilities. Recently, there has been a paradigm shift in the use of MRI in laboratory-scale core analysis. The focus is now on acquiring data in the laboratory that are directly comparable to data obtained from magnetic resonance well-logging tools (i.e., a common physics of measurement). To maintain consistency with well-logging instrumentation, it is desirable to measure distributions of transverse (T2) relaxation time (the industry-standard metric in well logging) at the laboratory scale. These T2 distributions can be spatially resolved over the length of a core-plug. The use of low-field magnets in the laboratory environment is optimal for core analysis not only because the magnetic field strength is closer to that of well-logging tools, but also because the magnetic susceptibility contrast is minimized, allowing the acquisition of quantitative image voxel (or pixel) intensities that are directly scalable to liquid volume. 
Beyond simple determination of macroscopic rock heterogeneity, it is possible to utilize the spatial resolution for monitoring forced displacement of oil by water or chemical agents, determining capillary pressure curves, and estimating wettability. The history of MRI in petrophysics is reviewed and future directions considered, including advanced data processing techniques such as compressed sensing reconstruction and Bayesian inference analysis of under-sampled data. Although this review focuses on rock core analysis, the techniques described are applicable in a wider context to porous media in general, such as cements, soils, ceramics, and catalytic materials.

  2. Infrared Spectroscopy for Rapid Characterization of Drill Core and Cutting Mineralogy

    NASA Astrophysics Data System (ADS)

    Calvin, W. M.; Kratt, C.; Kruse, F. A.

    2009-12-01

    Water geochemistry can vary with depth and location within a geothermal reservoir, owing to natural factors such as changing rock type, gas content, fluid source and temperature. The interaction of these variable fluids with the host rock will cause well known changes in alteration mineral assemblages that are commonly factored into the exploration of hydrothermal systems for economic metals, but are less utilized with regard to mapping borehole geology for geothermal energy production. Chemistry of geothermal fluids and rock alteration products can impact production factors such as pipeline corrosion and scaling and early studies explored the use of both silica and chlorites as geothermometers. Infrared spectroscopy is particularly good at identifying a wide variety of alteration minerals, especially in discrimination among clay minerals, with no sample preparation. The technique has been extensively used in the remote identification of materials, but is not commonly used on drill core or chips. We have performed several promising pilot studies that suggest the power of the technique to sample continuously and provide mineral logs akin to geophysical ones. We have surveyed a variety of samples, including drill chip boards, boxed core, and drill cuttings from envelopes, sample bottles and chip trays. This work has demonstrated that core and drill chips can be rapidly surveyed, acquiring spectra every few to tens of cm of section, or the vertical resolution of the chip tray (typically 10 feet). Depending on the sample type we can acquire spectral data over thousands of feet depth at high vertical resolution in a fraction of the time that is needed for traditional analytical methods such as XRD or TEM with better accuracy than traditional geologic drill or chip logging that uses visual inspection alone. 
We have successfully identified layered silicates such as illite, kaolinite, montmorillonite, chlorite and prehnite, as well as zeolites, opal, calcite, jarosite and iron oxides and hydroxides in geothermal drill samples. We are currently developing automated analysis techniques to convert this detailed spectral logging data into high-vertical-resolution mineral depth profiles that can be linked to lithology, stratigraphy, fracture zones and potential for geothermal production. Also in development are metrics that would link mapped mineralogy to known geothermometers such as Na-K, Mg depletion, discrimination among illite, montmorillonite, and beidellite, and kaolinite crystallinity. Identification of amorphous and crystalline silica components (chalcedony, cristobalite and quartz) can also constrain silica geothermometry. The degree of alteration and some mineral types have been shown to be a proxy for host rock permeability, natural circulation, and the potential for reservoir sealing. Analysis of alteration intensity is also under way. We will present a synthesis of results to date.

  3. Delineating chalk sand distribution of Ekofisk formation using probabilistic neural network (PNN) and stepwise regression (SWR): Case study Danish North Sea field

    NASA Astrophysics Data System (ADS)

    Haris, A.; Nafian, M.; Riyanto, A.

    2017-07-01

    The Danish North Sea fields consist of several formations (Ekofisk, Tor, and Cromer Knoll) deposited from the Paleocene to the Miocene. In this study, the integration of seismic and well log data sets is carried out to determine the chalk sand distribution in the Danish North Sea field. The integration is performed using seismic inversion analysis and seismic multi-attribute analysis. The seismic inversion algorithm used to derive acoustic impedance (AI) is a model-based technique. The derived AI is then used as an external attribute for the input of the multi-attribute analysis. Moreover, the multi-attribute analysis is used to generate linear and non-linear transformations among well log properties. In the linear case, the transformation is selected by weighted step-wise linear regression (SWR), while the non-linear case is handled by probabilistic neural networks (PNN). The porosity estimated by the PNN fits the well log data better than the SWR results. This result can be understood since the PNN performs non-linear regression, so the relationship between the attribute data and the predicted log data can be optimized. The distribution of chalk sand has been successfully identified and characterized by porosity values ranging from 23% up to 30%.
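A toy illustration of why a non-linear estimator can outperform a linear one for porosity prediction: a Nadaraya-Watson kernel regressor (the regression analogue of a PNN, sometimes called a GRNN) is compared with a straight-line fit on a hypothetical attribute-porosity relation. All data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical attribute (e.g. a normalized impedance) vs porosity:
# a non-linear relation in the 23-30% range, plus noise.
ai = rng.uniform(0.0, 1.0, 200)
phi = 0.23 + 0.07 * np.sin(2 * np.pi * ai) ** 2 + rng.normal(0, 0.005, 200)
tr, te = slice(0, 150), slice(150, 200)

# Linear fit: a stand-in for single-attribute step-wise regression.
a, b = np.polyfit(ai[tr], phi[tr], 1)
lin_pred = a * ai[te] + b

# Nadaraya-Watson kernel regression: the regression analogue of a PNN.
def grnn(x_train, y_train, x_query, sigma=0.05):
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

nn_pred = grnn(ai[tr], phi[tr], ai[te])

def rmse(pred):
    return np.sqrt(np.mean((pred - phi[te]) ** 2))

print(rmse(lin_pred), rmse(nn_pred))  # the non-linear model fits better
```

The kernel width `sigma` plays the role of the PNN smoothing parameter; too small and the estimate memorizes the training wells, too large and it degenerates toward the linear fit's bias.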

  4. A Generalized Approach for the Interpretation of Geophysical Well Logs in Ground-Water Studies - Theory and Application

    USGS Publications Warehouse

    Paillet, Frederick L.; Crowder, R.E.

    1996-01-01

    Quantitative analysis of geophysical logs in ground-water studies often involves at least as broad a range of applications and variation in lithology as is typically encountered in petroleum exploration, making such logs difficult to calibrate and complicating inversion problem formulation. At the same time, data inversion and analysis depend on inversion model formulation and refinement, so that log interpretation cannot be deferred to a geophysical log specialist unless active involvement with interpretation can be maintained by such an expert over the lifetime of the project. We propose a generalized log-interpretation procedure designed to guide hydrogeologists in the interpretation of geophysical logs, and in the integration of log data into ground-water models that may be systematically refined and improved in an iterative way. The procedure is designed to maximize the effective use of three primary contributions from geophysical logs: (1) the continuous depth scale of the measurements along the well bore; (2) the in situ measurement of lithologic properties and the correlation with hydraulic properties of the formations over a finite sample volume; and (3) multiple independent measurements that can potentially be inverted for multiple physical or hydraulic properties of interest. The approach is formulated in the context of geophysical inversion theory, and is designed to be interfaced with surface geophysical soundings and conventional hydraulic testing. The step-by-step procedures given in our generalized interpretation and inversion technique are based on both qualitative analysis, designed to assist formulation of the interpretation model, and quantitative analysis, used to assign numerical values to model parameters. The approach decides whether quantitative inversion is statistically warranted by formulating an over-determined inversion; if no such inversion is consistent with the inversion model, quantitative inversion is judged not to be possible with the given data set. Additional statistical criteria, such as the statistical significance of regressions, are used to guide the subsequent calibration of geophysical data in terms of hydraulic variables in those situations where quantitative data inversion is considered appropriate.
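The over-determined inversion at the heart of this procedure can be illustrated with a hypothetical linear forward model: three log responses, two formation parameters, solved by least squares. The sensitivity matrix and noise level below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
# Invented sensitivity matrix: three log responses (rows) as linear
# functions of two formation properties, porosity and clay fraction.
G = np.array([[1.0, 0.4],    # density-log sensitivity
              [0.9, 1.1],    # neutron-log sensitivity
              [0.2, 1.5]])   # gamma-log sensitivity
m_true = np.array([0.25, 0.10])
d = G @ m_true + rng.normal(0.0, 0.01, 3)  # noisy measurements

# Over-determined inversion: 3 data, 2 parameters, least squares.
m_est, residual, rank, _ = np.linalg.lstsq(G, d, rcond=None)
print(m_est)  # close to (0.25, 0.10)
```

In the paper's terms, a large residual relative to the assumed noise would indicate that no over-determined inversion is consistent with the model, and quantitative inversion would be judged unwarranted.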

  5. Objective straylight assessment of the human eye with a novel device

    NASA Astrophysics Data System (ADS)

    Schramm, Stefan; Schikowski, Patrick; Lerm, Elena; Kaeding, André; Klemm, Matthias; Haueisen, Jens; Baumgarten, Daniel

    2016-03-01

    Forward scattered light from the anterior segment of the human eye can be measured by Shack-Hartmann (SH) wavefront aberrometers, but only over a limited visual angle. We propose a novel Point Spread Function (PSF) reconstruction algorithm based on SH measurements with a novel measurement device to overcome these limitations. In our optical setup, we use a Digital Mirror Device as a variable field stop, in place of the conventional pinhole that suppresses scatter and reflections. Images with 21 different stop diameters were captured, and from each image the average subaperture image intensity and the average intensity of the pupil were computed. The 21 intensities represent integral values of the PSF, which is consequently reconstructed by differentiation with respect to the visual angle. A generalized form of the Stiles-Holladay approximation is fitted to the PSF, resulting in a stray light parameter Log(IS). Additionally, the transmission loss of the eye is computed. As a proof of principle, a study on 13 healthy young volunteers was carried out. Scatter filters were positioned in front of the volunteer's eye during C-Quant and scatter measurements to generate stray light emulating scatter in the lens. The stray light parameter is compared to the C-Quant measurement parameter Log(ISC) and the scatter density of the filters (SDF) with a partial correlation. Log(IS) shows significant correlation with the SDF and Log(ISC). The correlation is more prominent between Log(IS) combined with the transmission loss and the SDF and Log(ISC). Our novel measurement and reconstruction technique allows for objective stray light analysis of visual angles up to 4 degrees.
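The reconstruction idea (the PSF as the derivative of the stop-integrated intensity with respect to visual angle) can be sketched numerically. The encircled-energy curve below is an assumed stand-in, not measured data:

```python
import numpy as np

# Hypothetical stop angles (degrees) and an assumed encircled-energy curve:
# the intensity measured with a stop of angle theta integrates the PSF.
theta = np.linspace(0.2, 4.0, 21)      # 21 stop sizes, as in the device
integral = 1.0 - np.exp(-theta / 1.5)  # assumed, monotonically increasing

# The PSF follows by differentiating the integral with respect to angle.
psf = np.gradient(integral, theta)
print(psf[0], psf[-1])  # stray light falls off with visual angle
```

In the actual device a generalized Stiles-Holladay form would then be fitted to `psf` to extract the single stray light parameter Log(IS).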

  6. Pattern mining of user interaction logs for a post-deployment usability evaluation of a radiology PACS client.

    PubMed

    Jorritsma, Wiard; Cnossen, Fokie; Dierckx, Rudi A; Oudkerk, Matthijs; van Ooijen, Peter M A

    2016-01-01

    To perform a post-deployment usability evaluation of a radiology Picture Archiving and Communication System (PACS) client based on pattern mining of user interaction log data, and to assess the usefulness of this approach compared to a field study. All user actions performed on the PACS client were logged for four months. A data mining technique called closed sequential pattern mining was used to automatically extract frequently occurring interaction patterns from the log data. These patterns were used to identify usability issues with the PACS. The results of this evaluation were compared to the results of a field study based usability evaluation of the same PACS client. The interaction patterns revealed four usability issues: (1) the display protocols do not function properly, (2) the line measurement tool stays active until another tool is selected, rather than being deactivated after one use, (3) the PACS's built-in 3D functionality does not allow users to effectively perform certain 3D-related tasks, (4) users underuse the PACS's customization possibilities. All usability issues identified based on the log data were also found in the field study, which identified 48 issues in total. Post-deployment usability evaluation based on pattern mining of user interaction log data provides useful insights into the way users interact with the radiology PACS client. However, it reveals few usability issues compared to a field study and should therefore not be used as the sole method of usability evaluation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
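A much-simplified sketch of the mining step: counting frequent contiguous action subsequences in toy session logs. Real closed sequential pattern mining (e.g. the BIDE algorithm) also allows gaps between actions and prunes non-closed patterns:

```python
from collections import Counter

# Toy interaction logs: each session is a sequence of PACS actions.
sessions = [
    ["open_study", "select_line_tool", "measure", "select_line_tool"],
    ["open_study", "select_line_tool", "measure", "select_arrow"],
    ["open_study", "scroll", "select_line_tool", "measure", "select_arrow"],
]

def frequent_subsequences(logs, min_support=2, max_len=3):
    """Support counts for contiguous action subsequences (a simplification:
    closed sequential pattern mining also permits gaps between actions)."""
    counts = Counter()
    for log in logs:
        seen = set()
        for n in range(1, max_len + 1):
            for i in range(len(log) - n + 1):
                seen.add(tuple(log[i:i + n]))
        counts.update(seen)  # support = number of sessions containing it
    return {p: c for p, c in counts.items() if c >= min_support}

patterns = frequent_subsequences(sessions)
# The pair below occurs in every session -- the kind of recurring pattern
# that exposed the "measurement tool stays active" issue in the study.
print(patterns[("select_line_tool", "measure")])  # 3
```

Patterns with unexpectedly high support (such as repeatedly re-selecting the same tool) are the raw material the evaluators inspected for usability issues.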

  7. A Reading-Writing Connection in the Content Areas (Secondary Perspectives).

    ERIC Educational Resources Information Center

    Journal of Reading, 1990

    1990-01-01

    Discusses instructional activities designed to foster the reading-writing connection in the content area classroom. Describes the use of "possible sentences," learning logs, freewriting, dialogue journals, the RAFT technique (role, audience, format, and topic), and the "opinion-proof" organization strategy. (RS)

  8. Automated potentiometric titrations in KCl/water-saturated octanol: method for quantifying factors influencing ion-pair partitioning.

    PubMed

    Scherrer, Robert A; Donovan, Stephen F

    2009-04-01

    The knowledge base of factors influencing ion pair partitioning is very sparse, primarily because of the difficulty in determining accurate log P(I) values of desirable low molecular weight (MW) reference compounds. We have developed a potentiometric titration procedure in KCl/water-saturated octanol that provides a link to log P(I) through the thermodynamic cycle of ionization and partitioning. These titrations have the advantage of being independent of the magnitude of log P, while maintaining a reproducibility of a few hundredths of a log P in the calculated difference between log P neutral and log P ion pair (diff (log P(N - I))). Simple model compounds can be used. The titration procedure is described in detail, along with a program for calculating pK(a)'' values incorporating the ionization of water in octanol. Hydrogen bonding and steric factors have a greater influence on ion pairs than they do on neutral species, yet these factors are missing from current programs used to calculate log P(I) and log D. In contrast to the common assumption that diff (log P(N - I)) is the same for all amines, they can actually vary more than 3 log units, as in our examples. A major factor affecting log P(I) is the ability of water and the counterion to approach the charge center. Bulky substituents near the charge center have a negative influence on log P(I). On the other hand, hydrogen bonding groups near the charge center have the opposite effect by lowering the free energy of the ion pair. The use of this titration method to determine substituent ion pair stabilization values (IPS) should bring about more accurate log D calculations and encourage species-specific QSAR involving log D(N) and log D(I). This work also brings attention to the fascinating world of nature's highly stabilized ion pairs.
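The thermodynamic cycle that the titrations exploit reduces to simple arithmetic. The numbers below are hypothetical, chosen only to illustrate how an octanol pKa shift yields diff(log P(N - I)) and hence the ion-pair log P for a base:

```python
# Hypothetical numbers for a model amine (a base); not values from the paper.
pKa_water = 9.5      # aqueous pKa
pKa_octanol = 12.7   # pKa'' from the KCl/water-saturated-octanol titration
log_p_neutral = 2.1  # log P of the neutral species

# Thermodynamic cycle for a base: diff(log P(N - I)) = pKa'' - pKa(water).
diff_log_p = pKa_octanol - pKa_water
log_p_ion_pair = log_p_neutral - diff_log_p
print(round(diff_log_p, 1), round(log_p_ion_pair, 1))  # 3.2 -1.1
```

Because diff(log P(N - I)) enters as a difference of two pKa values, the result is independent of the magnitude of log P itself, which is the practical advantage the authors emphasize.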

  9. Automated Potentiometric Titrations in KCl/Water-Saturated Octanol: Method for Quantifying Factors Influencing Ion-Pair Partitioning

    PubMed Central

    2009-01-01

    The knowledge base of factors influencing ion pair partitioning is very sparse, primarily because of the difficulty in determining accurate log PI values of desirable low molecular weight (MW) reference compounds. We have developed a potentiometric titration procedure in KCl/water-saturated octanol that provides a link to log PI through the thermodynamic cycle of ionization and partitioning. These titrations have the advantage of being independent of the magnitude of log P, while maintaining a reproducibility of a few hundredths of a log P in the calculated difference between log P neutral and log P ion pair (diff (log PN − I)). Simple model compounds can be used. The titration procedure is described in detail, along with a program for calculating pKa′′ values incorporating the ionization of water in octanol. Hydrogen bonding and steric factors have a greater influence on ion pairs than they do on neutral species, yet these factors are missing from current programs used to calculate log PI and log D. In contrast to the common assumption that diff (log PN − I) is the same for all amines, they can actually vary more than 3 log units, as in our examples. A major factor affecting log PI is the ability of water and the counterion to approach the charge center. Bulky substituents near the charge center have a negative influence on log PI. On the other hand, hydrogen bonding groups near the charge center have the opposite effect by lowering the free energy of the ion pair. The use of this titration method to determine substituent ion pair stabilization values (IPS) should bring about more accurate log D calculations and encourage species-specific QSAR involving log DN and log DI. This work also brings attention to the fascinating world of nature’s highly stabilized ion pairs. PMID:19265385

  10. Binary Detection using Multi-Hypothesis Log-Likelihood, Image Processing

    DTIC Science & Technology

    2014-03-27

    ... geosynchronous orbit and other scenarios important to the USAF. The question posed in this thesis is how well, if at all, can a ... it is important to compare them to another modern technique. The third objective is to compare results from another image detection method, specifically ... Although adaptive optics is an important technique in moving closer to diffraction-limited imaging, it is not currently a practical solution for all ...

  11. Industrial application of semantic process mining

    NASA Astrophysics Data System (ADS)

    Espen Ingvaldsen, Jon; Atle Gulla, Jon

    2012-05-01

    Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. Over the last years, process mining prototypes have incorporated elements from semantics and data mining and targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.

  12. The Solar System Origin Revisited

    NASA Astrophysics Data System (ADS)

    Johnson, Fred M.

    2016-10-01

    A novel theory will be presented, based in part on astronomical observations, plasma physics experiments, principles of physics and forensic techniques. The new theory correctly predicts planetary distances with 1% precision. It accounts for the energy-production mechanism inside all of the planets, including our Earth. A log-log mass-luminosity plot of G2-class stars and solar system planets results in a straight line, whose slope implies that fission, rather than proton-proton fusion, is the operating energy-production mechanism. Furthermore, it is a confirmation that all our planets originated from within our Sun. Other still-born planets continue to appear on the Sun's surface; they are mislabeled as sunspots.

  13. Diamond knife-assisted deep anterior lamellar keratoplasty to manage keratoconus.

    PubMed

    Vajpayee, Rasik B; Maharana, Prafulla K; Sharma, Namrata; Agarwal, Tushar; Jhanji, Vishal

    2014-02-01

    To evaluate the outcomes of a new surgical technique, diamond knife-assisted deep anterior lamellar keratoplasty (DALK), and compare its visual and refractive results with big-bubble DALK in cases of keratoconus. Tertiary eyecare hospital. Comparative case series. The visual and surgical outcomes of diamond knife-assisted DALK were compared with those of successful big-bubble DALK. Diamond knife-assisted DALK was performed in 19 eyes and big-bubble DALK in 11 eyes. All surgeries were completed successfully. No intraoperative or postoperative complications occurred with diamond knife-assisted DALK. Six months after diamond knife-assisted DALK, the mean corrected distance visual acuity (CDVA) improved significantly from 1.87 ± 0.22 (SD) logMAR to 0.23 ± 0.06 logMAR, the mean keratometry improved from 65.99 ± 8.86 diopters (D) to 45.13 ± 1.16 D, and the mean keratometric cylinder improved from 7.99 ± 3.81 D to 2.87 ± 0.59 D (all P=.005). Postoperatively, the mean refractive astigmatism was 2.55 ± 0.49 D and the mean spherical equivalent was -1.97 ± 0.56 D. The mean logMAR CDVA (P=.06), postoperative keratometry (P=.64), refractive cylinder (P=.63), and endothelial cell loss (P=.11) were comparable between diamond knife-assisted DALK and big-bubble DALK. Diamond knife-assisted DALK was effective and predictable as a surgical technique for the management of keratoconus. This technique has the potential to offer visual and refractive outcomes comparable to those of big-bubble DALK. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  14. JBFA - buoyant flight

    NASA Technical Reports Server (NTRS)

    Ohari, T.

    1982-01-01

    A method was developed whereby a balloon was used to carry lumber out of a forest, allowing lumber production to continue without destroying the natural environment and view of the forest. Emphasis was on the best shape for a logging balloon; development of a balloon logging system, and associated safety plans, suitable for cutting lumber; tests on balloon construction and development of netting; and the weather of mountainous areas, especially solutions to problems caused by wind.

  15. Iterative raw measurements restoration method with penalized weighted least squares approach for low-dose CT

    NASA Astrophysics Data System (ADS)

    Takahashi, Hisashi; Goto, Taiga; Hirokawa, Koichi; Miyazaki, Osamu

    2014-03-01

    Statistical iterative reconstruction and post-log data restoration algorithms for CT noise reduction have been widely studied, and these techniques have enabled us to reduce irradiation doses while maintaining image quality. In low-dose scanning, electronic noise becomes obvious and results in some non-positive signals in the raw measurements. Non-positive signals must be converted to positive values so that they can be log-transformed. Since conventional conversion methods do not consider the local variance on the sinogram, they have difficulty controlling the strength of the filtering. Thus, in this work, we propose a method to convert the non-positive signal to a positive signal mainly by controlling the local variance. The method is implemented in two separate steps. First, an iterative restoration algorithm based on penalized weighted least squares is used to mitigate the effect of electronic noise. The algorithm preserves the local mean and reduces the local variance induced by the electronic noise. Second, raw measurements smoothed by the iterative algorithm are converted to positive signals according to a function that replaces each non-positive signal with its local mean. In phantom studies, we confirm that the proposed method properly preserves the local mean and reduces the variance induced by the electronic noise. Our technique results in dramatically reduced shading artifacts and can also successfully cooperate with the post-log data filter to reduce streak artifacts.
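The second step (replacing non-positive raw measurements with a local mean so the sinogram can be log-transformed) can be sketched as follows. The window size and data are illustrative, and the iterative PWLS stage is omitted:

```python
import numpy as np

# Toy raw CT measurements; electronic noise has produced non-positive samples.
raw = np.array([120.0, 80.0, -3.0, 95.0, 0.0, 110.0, 130.0])

def make_positive(signal, window=2):
    """Replace each non-positive sample with the mean of nearby positives."""
    out = signal.copy()
    for i in np.where(signal <= 0)[0]:
        lo, hi = max(0, i - window), min(len(signal), i + window + 1)
        neighbors = signal[lo:hi]
        out[i] = neighbors[neighbors > 0].mean()
    return out

positive = make_positive(raw)
attenuation = -np.log(positive / positive.max())  # post-log data, now defined
print(positive)
```

Using the local mean rather than a global floor value is what lets the conversion preserve low-frequency signal content while eliminating the undefined log values.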

  16. Fatal injuries caused by logs rolling off trucks: Kentucky 1994-1998.

    PubMed

    Struttmann, T W; Scheerer, A L

    2001-02-01

    Logging is one of the most hazardous occupations and fatality rates are consistently among the highest of all industries. A review of fatalities caused by logs rolling off trucks is presented. The Kentucky Fatality Assessment and Control Evaluation Project is a statewide surveillance system for occupational fatalities. Investigations are conducted on selected injuries with an emphasis on prevention strategy development. Logging was an area of high priority for case investigation. During 1994-1998, we identified seven incidents in which a worker was killed by a log rolling off a truck at a sawmill, accounting for 15% of the 45 deaths related to logging activities. These cases were reviewed to identify similar characteristics and risk factors. Investigations led to recommendations for behavioral, administrative, and engineering controls. Potential interventions include limiting load height on trucks, installing unloading cages at sawmills and prohibiting overloaded trucks on public roadways. Copyright 2001 Wiley-Liss, Inc.

  17. Developing Surveillance Methodology for Agricultural and Logging Injury in New Hampshire Using Electronic Administrative Data Sets.

    PubMed

    Scott, Erika E; Hirabayashi, Liane; Krupa, Nicole L; Sorensen, Julie A; Jenkins, Paul L

    2015-08-01

    Agriculture and logging rank among industries with the highest rates of occupational fatality and injury. Establishing a nonfatal injury surveillance system is a top priority in the National Occupational Research Agenda. Sources of data such as patient care reports (PCRs) and hospitalization data have recently transitioned to electronic databases. Using narrative and location codes from PCRs, along with International Classification of Diseases, 9th Revision, external cause of injury codes (E-codes) in hospital data, researchers are designing a surveillance system to track farm and logging injury. A total of 357 true agricultural or logging cases were identified. These data indicate that it is possible to identify agricultural and logging injury events in PCR and hospital data. Multiple data sources increase catchment; nevertheless, limitations in methods of identification of agricultural and logging injury contribute to the likely undercount of injury events.

  18. The combined use of heat-pulse flowmeter logging and packer testing for transmissive fracture recognition

    NASA Astrophysics Data System (ADS)

    Lo, Hung-Chieh; Chen, Po-Jui; Chou, Po-Yi; Hsu, Shih-Meng

    2014-06-01

    This paper presents an improved borehole prospecting methodology that combines several techniques for the hydrogeological characterization of fractured rock aquifers. The approach is demonstrated by on-site tests carried out at the Hoshe Experimental Forest site and in Tailuge National Park, Taiwan. Borehole televiewer logs are used to obtain fracture location and distribution along boreholes. The heat-pulse flowmeter log is used to measure vertical flow-velocity profiles, which can be analyzed to estimate fracture transmissivity and to indicate hydraulic connectivity between fractures. Double-packer hydraulic tests are performed to determine the rock-mass transmissivity. The computer program FLASH is used to analyze the data from the flowmeter logs; it is confirmed as a useful tool that quantitatively predicts fracture transmissivity, in good agreement with the hydraulic properties obtained from the packer tests. The locations of conductive fractures and their transmissivities are identified, after which the preferential flow paths through the fracture network are precisely delineated from a cross-borehole test. The results provide robust confirmation of the combined flowmeter and packer methods for the characterization of fractured-rock aquifers, particularly for investigations of groundwater resources and contaminant transport dynamics.

  19. Language Development: 2 Year Olds

    MedlinePlus

    ... enrich his vocabulary and language skills by making reading a part of your everyday routine. At this ...

  20. Social Development: 1 Year Olds

    MedlinePlus

    ... re doing around the house. Whether you’re reading the paper, sweeping the floors, mowing the lawn, ...

  1. Publications - GMC 181 | Alaska Division of Geological & Geophysical

    Science.gov Websites

    DGGS GMC 181 Publication Details. Title: Geologic logs and core assays of 14 nickel, copper, and cobalt ... information. Bibliographic Reference: Inspiration Development Company, 1991, Geologic logs and core assays of ...

  2. Accuracy and borehole influences in pulsed neutron gamma density logging while drilling.

    PubMed

    Yu, Huawei; Sun, Jianmeng; Wang, Jiaxin; Gardner, Robin P

    2011-09-01

    A new pulsed neutron gamma density (NGD) logging technique has been developed to replace radioactive chemical sources in oil logging tools. The present paper describes studies of the near and far density measurement accuracy of NGD logging at two spacings, and of borehole influences, using Monte Carlo simulation. The results show that the accuracy of the near density is not as good as that of the far density. Borehole effects are difficult to correct with conventional methods because both the near and far density measurements are significantly sensitive to standoff and mud properties. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Development of a Multi-Species Biotic Ligand Model Predicting the Toxicity of Trivalent Chromium to Barley Root Elongation in Solution Culture

    PubMed Central

    Song, Ningning; Zhong, Xu; Li, Bo; Li, Jumei; Wei, Dongpu; Ma, Yibing

    2014-01-01

    Little knowledge is available about the influence of cation competition and metal speciation on trivalent chromium (Cr(III)) toxicity. In the present study, the effects of pH and selected cations on the toxicity of Cr(III) to barley (Hordeum vulgare) root elongation were investigated in order to develop an appropriate biotic ligand model (BLM). Results showed that the toxicity of Cr(III) decreased with increasing activity of Ca2+ and Mg2+, but not with K+ and Na+. The effect of pH on Cr(III) toxicity to barley root elongation could be explained by H+ competition with Cr3+ bound to a biotic ligand (BL), as well as by the concomitant toxicity of CrOH2+ in solution culture. Stability constants were obtained for the binding of Cr3+, CrOH2+, Ca2+, Mg2+ and H+ to the biotic ligand: log K_CrBL = 7.34, log K_CrOHBL = 5.35, log K_CaBL = 2.64, log K_MgBL = 2.98, and log K_HBL = 4.74. On the basis of these estimated parameters, a BLM was successfully developed to predict Cr(III) toxicity to barley root elongation as a function of solution characteristics. PMID:25119269
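
    The single-site competition expression at the heart of a biotic ligand model can be written directly from the stability constants reported above. A minimal sketch follows: the fraction of biotic-ligand sites occupied by the toxic species (Cr3+ plus CrOH2+) under competition from Ca2+, Mg2+ and H+. The activity values in the usage below are hypothetical, and the authors' full model additionally links this occupancy to a root-elongation dose-response, which is omitted here.

```python
# Stability constants (log10 K) reported in the abstract
LOG_K = {"Cr": 7.34, "CrOH": 5.35, "Ca": 2.64, "Mg": 2.98, "H": 4.74}

def bl_fraction_toxic(act):
    """Fraction of biotic-ligand sites occupied by the toxic species
    Cr3+ and CrOH2+, given free-ion activities (mol/L) and competition
    from Ca2+, Mg2+ and H+ (single-site BLM competition expression)."""
    k = {s: 10.0 ** v for s, v in LOG_K.items()}
    toxic = k["Cr"] * act["Cr"] + k["CrOH"] * act["CrOH"]
    compete = k["Ca"] * act["Ca"] + k["Mg"] * act["Mg"] + k["H"] * act["H"]
    return toxic / (1.0 + toxic + compete)
```

    Raising the Ca2+ activity lowers the occupied fraction, consistent with the protective effect of Ca2+ reported in the abstract.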

  5. Graphical and PC-software analysis of volcano eruption precursors according to the Materials Failure Forecast Method (FFM)

    NASA Astrophysics Data System (ADS)

    Cornelius, Reinold R.; Voight, Barry

    1995-03-01

    The Materials Failure Forecast Method for volcanic eruptions (FFM) analyses the rate of precursory phenomena. The time of eruption onset is derived from the time of "failure" implied by an accelerating rate of deformation. The approach attempts to fit data, Ω, to the differential relationship Ω̈ = A·Ω̇^α, where the dot superscript denotes the time derivative, and the data Ω may be any of several parameters describing the accelerating deformation or energy release of the volcanic system. The rate coefficients, A and α, may be derived from appropriate data sets to provide an estimate of the time to "failure". As the method is still an experimental technique, it should be used with appropriate judgment during times of volcanic crisis. Limitations of the approach are identified and discussed. Several kinds of eruption precursors, all resembling accelerating creep during mechanical deformation of the system, can be used with FFM. Among these are tilt data, slope-distance measurements, crater fault movements and seismicity. The use of seismic coda, seismic amplitude-derived energy release, and time-integrated amplitudes or coda lengths is examined. Using cumulative coda length directly has some practical advantages over more rigorously derived parameters, and RSAM and SSAM technologies appear to be well suited to real-time applications. One graphical and four numerical techniques for applying FFM are discussed. The graphical technique is based on an inverse representation of rate versus time: for α = 2 the inverse-rate plot is linear; it is concave upward for α < 2 and concave downward for α > 2. The eruption time is found by simple extrapolation of the data set toward the time axis. Three numerical techniques are based on linear least-squares fits to linearized data sets. The "linearized least-squares technique" is the most robust and is expected to be the most practical numerical technique. It is based on an iterative linearization of the given rate-time series. The hindsight technique is disadvantaged by a bias favouring a too-early eruption time in foresight applications. The "log rate versus log acceleration technique", which uses a logarithmic representation of the fundamental differential equation, is disadvantaged by large data scatter after interpolation of accelerations. One further numerical technique, a nonlinear least-squares fit to rate data, requires special and more complex software. PC-oriented computer codes were developed for data manipulation, application of the three linearizing numerical methods, and curve fitting; separate software is required for graphing. All three linearizing techniques provide an eruption window based on a data envelope around the linear least-squares fit, at a specified level of confidence, and an estimated rate at the time of failure.
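
    The graphical inverse-rate technique reduces to a linear extrapolation when α = 2: 1/Ω̇ falls linearly with time, and the forecast eruption time is where the fitted line crosses the time axis. A minimal least-squares sketch of that idea, not the FFM computer codes described in the abstract:

```python
import numpy as np

def ffm_failure_time(t, rate):
    """Graphical FFM (alpha = 2): fit a straight line to inverse rate
    versus time and extrapolate to 1/rate = 0; the intercept with the
    time axis is the forecast 'failure' (eruption onset) time."""
    inv = 1.0 / np.asarray(rate, float)
    slope, intercept = np.polyfit(np.asarray(t, float), inv, 1)
    return -intercept / slope
```

    For α ≠ 2 the inverse-rate plot is curved, and the linearized or nonlinear fits mentioned in the abstract are needed instead.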

  6. Short-term impact of post-fire salvage logging on regeneration, hazardous fuel accumulation, and understorey development in ponderosa pine forest of the Black Hills, SD, USA

    Treesearch

    Tara L Keyser; Fredrick W Smith; Wayne D. Shepperd

    2009-01-01

    We examined the impacts of post-fire salvage logging on regeneration, fuel accumulation, and understorey vegetation and assessed whether the effects of salvage logging differed between stands burned under moderate and high fire severity following the 2000 Jasper Fire in the Black Hills. In unsalvaged sites, fire-related tree mortality...

  7. Use of polynomial expressions to describe the bioconcentration of hydrophobic chemicals by fish

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connell, D.W.; Hawker, D.W.

    1988-12-01

    For the bioconcentration of hydrophobic chemicals by fish, relationships have previously been established between the uptake rate constant (k1) and the octanol/water partition coefficient (Kow), and between the clearance rate constant (k2) and Kow. These have been refined and extended on the basis of data for chlorinated hydrocarbons and closely related compounds, including polychlorinated dibenzodioxins, covering a wider range of hydrophobicity (2.5 < log Kow < 9.5). This has allowed the development of new relationships between log Kow and various factors, including the bioconcentration factor (as log KB), equilibrium time (as log teq), and maximum biotic concentration (as log CB), which include extremely hydrophobic compounds not previously taken into account. The shapes of the curves generated by these equations are in qualitative agreement with theoretical prediction and are described by polynomial expressions, which are approximately linear over the more limited range of log Kow values used to develop the previous relationships. The influences of factors such as hydrophobicity, aqueous solubility, molecular weight, lipid solubility, and exposure time were considered. Decreasing lipid solubilities of extremely hydrophobic chemicals were found to result in increasing clearance rate constants, as well as decreasing equilibrium times and bioconcentration factors.
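
    The polynomial form of the log KB versus log Kow relationship can be sketched with an ordinary polynomial fit. The data points below are hypothetical and merely shaped like the reported trend (near-linear at moderate hydrophobicity, declining above log Kow ≈ 6); the paper's actual fitted coefficients are not reproduced here.

```python
import numpy as np

# Hypothetical (log Kow, log KB) pairs shaped like the reported trend
log_kow = np.array([2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5])
log_kb = np.array([1.8, 2.8, 3.8, 4.6, 4.9, 4.6, 3.8, 2.6])

coeffs = np.polyfit(log_kow, log_kb, 2)   # quadratic in log Kow
predict = np.poly1d(coeffs)               # callable log KB estimate
```

    A quadratic is the simplest polynomial that reproduces both the near-linear low-Kow limb and the fall-off for extremely hydrophobic compounds.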

  8. Partition of volatile organic compounds from air and from water into plant cuticular matrix: An LFER analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Platts, J.A.; Abraham, M.H.

    The partitioning of organic compounds between air and foliage and between water and foliage is of considerable environmental interest. The purpose of this work is to show that partitioning into the cuticular matrix of one particular species can be satisfactorily modeled by general equations the authors have previously developed and, hence, that the same general equations could be used to model partitioning into other plant materials of the same or different species. The general equations are linear free energy relationships that employ descriptors for polarity/polarizability, hydrogen bond acidity and basicity, dispersive effects, and volume. They have been applied to themore » partition of 62 very varied organic compounds between cuticular matrix of the tomato fruit, Lycopersicon esculentum, and either air (MX{sub a}) or water (MX{sub w}). Values of log MX{sub a} covering a range of 12.4 log units are correlated with a standard deviation of 0.232 log unit, and values of log MX{sub w} covering a range of 7.6 log unit are correlated with an SD of 0.236 log unit. Possibilities are discussed for the prediction of new air-plant cuticular matrix and water-plant cuticular matrix partition values on the basis of the equations developed.« less

  9. A small-diameter NMR logging tool for groundwater investigations

    USGS Publications Warehouse

    Walsh, David; Turner, Peter; Grunewald, Elliot; Zhang, Hong; Butler, James J.; Reboulet, Ed; Knobbe, Steve; Christy, Tom; Lane, John W.; Johnson, Carole D.; Munday, Tim; Fitzpatrick, Andrew

    2013-01-01

    A small-diameter nuclear magnetic resonance (NMR) logging tool has been developed and field tested at various sites in the United States and Australia. A novel design approach has produced relatively inexpensive, small-diameter probes that can be run in open or PVC-cased boreholes as small as 2 inches in diameter. The complete system, including surface electronics and various downhole probes, has been successfully tested in small-diameter monitoring wells in a range of hydrogeological settings. A variant of the probe that can be deployed by a direct-push machine has also been developed and tested in the field. The new NMR logging tool provides reliable, direct, and high-resolution information that is of importance for groundwater studies. Specifically, the technology provides direct measurement of total water content (total porosity in the saturated zone or moisture content in the unsaturated zone), and estimates of relative pore-size distribution (bound vs. mobile water content) and hydraulic conductivity. The NMR measurements show good agreement with ancillary data from lithologic logs, geophysical logs, and hydrogeologic measurements, and provide valuable information for groundwater investigations.
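
    Hydraulic-conductivity estimates from borehole NMR typically rely on a transform such as the Schlumberger-Doll Research (SDR) equation; the sketch below uses that standard estimator purely as an illustration, since the abstract does not state which transform or calibration constant this tool applies.

```python
def sdr_permeability(porosity, t2ml_ms, c=4.0):
    """SDR estimate: k [mD] = C * phi**4 * T2ML**2, with porosity as a
    fraction, T2ML (logarithmic mean of the T2 distribution) in ms, and
    C ~= 4 a common sandstone calibration constant."""
    return c * porosity ** 4 * t2ml_ms ** 2
```

    For example, a clean sand with 30 % porosity and a 200 ms logarithmic-mean T2 maps to roughly 1.3 darcies under this calibration.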

  10. Regional regression equations to estimate peak-flow frequency at sites in North Dakota using data through 2009

    USGS Publications Warehouse

    Williams-Sether, Tara

    2015-08-06

    Annual peak-flow frequency data from 231 U.S. Geological Survey streamflow-gaging stations in North Dakota and parts of Montana, South Dakota, and Minnesota, each with 10 or more years of unregulated peak-flow record, were used to develop regional regression equations for exceedance probabilities of 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, and 0.002 using generalized least-squares techniques. Updated peak-flow frequency estimates for 262 streamflow-gaging stations were developed using data through 2009 and the log-Pearson Type III procedures outlined by the Hydrology Subcommittee of the Interagency Advisory Committee on Water Data. An average generalized skew coefficient was determined for each of three hydrologic zones in North Dakota. A StreamStats web application was developed to estimate basin characteristics for the regional regression equation analysis. Methods for estimating weighted peak-flow frequencies at gaged and ungaged sites are presented.
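
    The log-Pearson Type III fitting step can be sketched with the Wilson-Hilferty frequency factor used in Bulletin 17 practice. This is a station-skew illustration only; the report's procedure additionally weights the station skew with the generalized (regional) skew and applies low-outlier screening, both omitted here.

```python
import math
from statistics import NormalDist

def lp3_quantile(peaks, exceed_prob, skew):
    """Peak-flow quantile from a log-Pearson Type III fit:
    log10(Q) = mean + K(skew, p) * std of the log10 annual peaks,
    using the Wilson-Hilferty frequency factor
    K = (2/g) * ((1 + g*z/6 - g**2/36)**3 - 1),
    where z is the standard normal quantile; g = 0 reduces to K = z."""
    logq = [math.log10(q) for q in peaks]
    n = len(logq)
    mu = sum(logq) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logq) / (n - 1))
    z = NormalDist().inv_cdf(1.0 - exceed_prob)
    g = skew
    if abs(g) < 1e-12:
        k = z
    else:
        k = (2.0 / g) * ((1.0 + g * z / 6.0 - g * g / 36.0) ** 3 - 1.0)
    return 10.0 ** (mu + k * sigma)
```

    With zero skew this reduces to a lognormal quantile; a positive skew, typical of flood peaks, pushes the rare-event quantiles higher.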

  11. Integrated reservoir characterization and flow simulation for well targeting and reservoir management, Iagifu-Hedinia field, Southern Highlands Province, Papua New Guinea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, S.P.; Livingston, J.E.; Fitzmorris, R.E.

    Infill drilling based on integrated reservoir characterization and flow simulation is increasing recoverable reserves by 20 MMBO in the Iagifu-Hedinia Field (IHF). Stratigraphically zoned models are input to window and full-field flow simulations, and the results of the flow simulations are used to target deviated and horizontal wells. Logging and pressure surveys facilitate detailed reservoir management. Flooding surfaces are the dominant control on differential depletion within and between reservoirs. The primary reservoir is the basal Cretaceous Toro Sandstone. Within the IHF, the Toro is a 100-m quartz sandstone composed of stacked, coarsening-upward parasequences within a wave-dominated deltaic complex. Flooding surfaces are used to form a hydraulic zonation, which is refined using discontinuities in RIFT pressure gradients and logs from development wells. For flow simulation, the models use 3D geostatistical techniques. First, variograms defining spatial correlation are developed; these are used to construct 3D porosity and permeability models that reflect the stratigraphic facies models. Structure models are built using dipmeter, biostratigraphic, and surface data. Deviated wells often cross axial surfaces, and geometry is predicted from dip-domain and SCAT analysis. Faults are identified using pressure-transient data and dipmeter. The Toro reservoir is subnormally pressured and its fluid contacts are hydrodynamically tilted. The hydrodynamic flow and tilted contacts are modeled by flow simulation and constrained by maps of the potentiometric surface.

  12. Better Management of Alcohol Liver Disease Using a ‘Microstructured Synbox’ System Comprising L. plantarum and EGCG

    PubMed Central

    Rishi, Praveen; Arora, Sumeha; Kaur, Ujjwal Jit; Chopra, Kanwaljit; Kaur, Indu Pal

    2017-01-01

    Synergistic combinations of probiotics with carbohydrate-based prebiotics are widely employed for the treatment of various gut-related disorders. However, such carbohydrate-based prebiotics encourage the growth of pathogens and probiotics equally. The aims of the study were (i) to explore the possibility of using epigallocatechin gallate (EGCG), a phenolic compound, as a prebiotic for L. plantarum; (ii) to develop and evaluate a microstructured synbox (microencapsulating both the probiotic and EGCG together) in a rat model of alcohol liver disease (ALD); and (iii) to confirm whether the combination can address issues of EGCG bioavailability and probiotic survivability in adverse gut conditions. The growth-enhancing effect of EGCG on L. plantarum (12.8±0.5 log10 units) was significantly (p≤0.05) better than that of inulin (11.4±0.38 log10 units), a natural storage carbohydrate. The formulated synbox significantly modulated the levels of alcohol, endotoxin, and hepatic enzymes and restored the hepatoarchitecture, in comparison with simultaneous administration of the free agents. Additionally, using a battery of techniques, the levels of various cellular and molecular markers, viz. NF-kB/p50, TNF-α, IL12/p40, and the signalling molecules TLR4, CD14, MD2, MyD88 and COX-2, were observed to be suppressed. The developed microbead synbox, a single delivery system for both agents, showed synergism and hence holds promise as a therapeutic option for ALD management. PMID:28060832

  13. Performance characteristics and estimation of measurement uncertainty of three plating procedures for Campylobacter enumeration in chicken meat.

    PubMed

    Habib, I; Sampers, I; Uyttendaele, M; Berkvens, D; De Zutter, L

    2008-02-01

    In this work, we present an intra-laboratory study to estimate the repeatability (r), reproducibility (R), and measurement uncertainty (U) associated with three media for Campylobacter enumeration: modified charcoal cefoperazone deoxycholate agar (mCCDA), Karmali agar, and CampyFood ID agar (CFA), a medium from bioMérieux SA. The study was performed at three levels: (1) pure bacterial cultures, using three Campylobacter strains; (2) artificially contaminated samples from three chicken meat matrixes (total n=30), spiked at two contamination levels, ca. 10^3 cfu Campylobacter/g and ca. 10^4 cfu Campylobacter/g; and (3) pilot testing of naturally contaminated chicken meat samples (n=20). Results from the pure-culture experiments revealed that enumeration of Campylobacter colonies on Karmali and CFA media was more convenient than on mCCDA using spread and spiral plating techniques. Based on the artificially contaminated samples, repeatability (r) values were comparable between the three media, estimated as 0.15 log10 cfu/g for mCCDA, 0.14 log10 cfu/g for Karmali, and 0.18 log10 cfu/g for CFA. The reproducibility performance of the three plating media was likewise comparable; general R values that can be used when testing chicken meat samples are 0.28 log10, 0.32 log10, and 0.25 log10 for plating on mCCDA, Karmali agar, and CFA, respectively. Measurement uncertainties associated with mCCDA, Karmali agar, and CFA using spread plating, for all meat matrixes combined, were ±0.24 log10 cfu/g, ±0.28 log10 cfu/g, and ±0.22 log10 cfu/g, respectively. A higher uncertainty was associated with Karmali agar for Campylobacter enumeration in artificially inoculated minced meat (±0.48 log10 cfu/g). The general performance of the CFA medium was comparable with that of mCCDA for artificially contaminated samples. However, in naturally contaminated samples, non-Campylobacter colonies gave a deep red colour similar to that of typical Campylobacter growth on CFA, and such colonies were not easily distinguishable by the naked eye. In general, the reproducibility, repeatability, and measurement uncertainty estimated in our study indicate that there are no major problems with the precision of the International Organization for Standardization (ISO) 10272-2:2006 protocol for Campylobacter enumeration using mCCDA medium.
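
    The r, R, and U figures in such studies follow standard ISO 5725-style formulas; a minimal sketch of the repeatability limit computed from duplicate log10 counts (the authors' exact experimental design and any operator/day factors are not reproduced here):

```python
import math

def repeatability(duplicates):
    """ISO 5725-style repeatability from duplicate results on the log10
    scale: s_r**2 = mean(d**2) / 2 over the within-sample differences d,
    and r = 2.8 * s_r (95 % limit on |difference| between duplicates)."""
    d2 = [(a - b) ** 2 for a, b in duplicates]
    s_r = math.sqrt(sum(d2) / (2 * len(d2)))
    return 2.8 * s_r, s_r

def expanded_uncertainty(s):
    """Expanded measurement uncertainty U = k * s, coverage factor k = 2."""
    return 2.0 * s
```

    The reproducibility limit R is obtained the same way, but from the standard deviation that also includes the between-run variance component.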

  14. Use of biopartitioning micellar chromatography and RP-HPLC for the determination of blood-brain barrier penetration of α-adrenergic/imidazoline receptor ligands, and QSPR analysis.

    PubMed

    Vucicevic, J; Popovic, M; Nikolic, K; Filipic, S; Obradovic, D; Agbaba, D

    2017-03-01

    For this study, 31 compounds, including 16 imidazoline/α-adrenergic receptor (IRs/α-ARs) ligands and 15 central nervous system (CNS) drugs, were characterized in terms of the retention factors (k) obtained using biopartitioning micellar and classical reversed-phase chromatography (log k_BMC and log k_wRP, respectively). From the retention factor (log k_wRP) and the slope of the linear curve (S), the isocratic parameter (φ0) was calculated. The retention factors were correlated with experimental log BB values for the group of examined compounds. A high correlation was obtained between the logarithm of the biopartitioning micellar chromatography (BMC) retention factor and log BB (r(log k_BMC/log BB): 0.77), while for the RP-HPLC system the correlations were lower (r(log k_wRP/log BB): 0.58; r(S/log BB): -0.50; r(φ0/Pe): 0.61). Based on the log k_BMC retention data and calculated molecular parameters of the examined compounds, quantitative structure-permeability relationship (QSPR) models were developed using partial least squares, stepwise multiple linear regression, support vector machine, and artificial neural network methodologies. The high degree of structural diversity of the analysed IRs/α-ARs ligands and CNS drugs provides a wide applicability domain for the QSPR models in estimating blood-brain barrier penetration of related compounds.

  15. Fracture characterization and fracture-permeability estimation at the underground research laboratory in southeastern Manitoba, Canada

    USGS Publications Warehouse

    Paillet, Frederick L.

    1988-01-01

    Various conventional geophysical well logs were obtained in conjunction with acoustic tube-wave amplitude and experimental heat-pulse flowmeter measurements in two deep boreholes in granitic rocks of the Canadian Shield in southeastern Manitoba. The objective of this study is the development of measurement techniques and data-processing methods for characterizing rock volumes that might be suitable for hosting a nuclear waste repository. One borehole, WRA1, intersected several major fracture zones and was suitable for testing quantitative permeability-estimation methods. The other borehole, URL13, appeared to intersect almost no permeable fractures; it was suitable for testing methods for characterizing rocks of very small permeability and uniform thermo-mechanical properties in a potential repository horizon. Epithermal neutron, acoustic transit time, and single-point resistance logs provided useful, qualitative indications of fractures in the extensively fractured borehole, WRA1. A single-point log indicates both weathering and the degree of opening of a fracture-borehole intersection. All logs indicate the large intervals of mechanically and geochemically uniform, unfractured granite below depths of 300 m in the relatively unfractured borehole, URL13. Some indications of minor fracturing were identified in that borehole, with one possible fracture at a depth of about 914 m producing a major acoustic waveform anomaly. Comparison of acoustic tube-wave attenuation with models of tube-wave attenuation in infinite fractures of given aperture provides permeability estimates ranging from equivalent single-fracture apertures of less than 0.01 mm to apertures of more than 0.5 mm. The possible fracture anomaly in borehole URL13 at a depth of about 914 m corresponds with a thin mafic dike in the core, where an unusually large acoustic contrast may have produced the observed waveform anomaly. No indications of naturally occurring flow existed in borehole URL13; however, flowmeter measurements indicated flow at less than 0.05 L/min from the upper fracture zones in borehole WRA1 to deeper fractures at depths below 800 m. (Author's abstract)

  16. TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, P; Patankar, A; Etmektzoglou, A

    Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate the translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory-log-based QA/beam delivery analyzer. Methods: The research beam builder can generate TrueBeam-readable XML files either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify or add control points. The delivered beam can be verified by reading the generated images and analyzing the trajectory log files. The image reader handles both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages over standalone software are: (i) no software installation or maintenance needed; (ii) easy accessibility across all devices; (iii) seamless upgrades; and (iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points, and kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools supporting research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. Disclosure: I am a full-time employee of Varian Medical Systems, Palo Alto.

  17. Decontamination of materials contaminated with Francisella philomiragia or MS2 bacteriophage using PES-Solid, a solid source of peracetic acid.

    PubMed

    Buhr, T L; Young, A A; Johnson, C A; Minter, Z A; Wells, C M

    2014-08-01

    The aim of the study was to develop test methods and evaluate the survival of Francisella philomiragia cells and MS2 bacteriophage after exposure to PES-Solid (a solid source of peracetic acid) formulations with or without surfactants. Francisella philomiragia cells (≥7.6 log10 CFU) or MS2 bacteriophage (≥6.8 log10 PFU) were deposited on seven different test materials and treated with three different PES-Solid formulations, three different preneutralized samples, and filter controls at room temperature for 15 min. There was 0-1.3 log10 CFU (<20 cells) of cell survival, or 0-1.7 log10 (<51 PFU) of bacteriophage survival, in all 21 test combinations (organism, formulation, and substrate) containing reactive PES-Solid. In addition, the microemulsion (Dahlgren Surfactant System) showed ≤2 log10 (100 cells) of viable F. philomiragia cells, indicating that the microemulsion achieved <2 log10 CFU on its own. Three PES-Solid formulations and one microemulsion system (DSS) inactivated F. philomiragia cells and/or MS2 bacteriophage deposited on seven different materials. A test method was developed to show that reactive PES-Solid formulations and a microemulsion system (DSS) inactivated >6 log10 CFU/PFU of F. philomiragia cells and/or MS2 bacteriophage on different materials. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
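
    The survival figures above are reported on the log10 scale; the underlying arithmetic is simply a log reduction, censored at the detection limit. A small sketch (the detection-limit handling is an assumption for illustration, not the authors' reporting convention):

```python
import math

def log10_reduction(n0, n_survivors, detection_limit=1.0):
    """Decontamination efficacy as log10(N0) - log10(N); survivor counts
    at or below the detection limit are censored to the limit, so the
    result is then read as a '>= x log10' reduction."""
    n = max(n_survivors, detection_limit)
    return math.log10(n0) - math.log10(n)
```

    For example, a 10^7.6 CFU challenge with 20 surviving cells corresponds to a reduction of about 6.3 log10, consistent with the ">6 log10" efficacy quoted above.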

  18. Predicting Information Flows in Network Traffic.

    ERIC Educational Resources Information Center

    Hinich, Melvin J.; Molyneux, Robert E.

    2003-01-01

    Discusses information flow in networks and predicting network traffic and describes a study that uses time series analysis on a day's worth of Internet log data. Examines nonlinearity and traffic invariants, and suggests that prediction of network traffic may not be possible with current techniques. (Author/LRW)

  19. South-East Asia's Trembling Rainforests.

    ERIC Educational Resources Information Center

    Laird, John

    1991-01-01

    This discussion focuses on potential solutions to the degradation of rainforests in Southeast Asia caused by indiscriminate logging, inappropriate road-construction techniques, forest fires, and the encroachment upon watersheds by both agricultural concerns and peasant farmers. Vignettes illustrate the impact of this degradation upon the animals,…

  20. Improving quantitative structure-activity relationship models using Artificial Neural Networks trained with dropout.

    PubMed

    Mendenhall, Jeffrey; Meiler, Jens

    2016-02-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery pose unique challenges for ML techniques, such as heavily biased dataset composition, and relatively large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both enrichment false positive rate and log-scaled area under the receiver-operating characteristic curve (logAUC) by 22-46 % over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods.
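    The dropout technique itself can be sketched in a few lines. The following is a generic "inverted dropout" forward pass, not the paper's implementation: each activation is zeroed with probability p during training and the survivors are rescaled by 1/(1-p), so the expected activation is unchanged and the layer is an identity at test time.

```python
import numpy as np

def dropout_forward(x, p, training, rng=None):
    """Inverted dropout: zero activations with probability p, rescale the rest."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= p       # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)           # rescale so E[output] == x

activations = np.ones((4, 5))
train_out = dropout_forward(activations, p=0.5, training=True)
test_out = dropout_forward(activations, p=0.5, training=False)
```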

  1. Improving Quantitative Structure-Activity Relationship Models using Artificial Neural Networks Trained with Dropout

    PubMed Central

    Mendenhall, Jeffrey; Meiler, Jens

    2016-01-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery (LB-CADD) pose unique challenges for ML techniques, such as heavily biased dataset composition, and relatively large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both Enrichment false positive rate (FPR) and log-scaled area under the receiver-operating characteristic curve (logAUC) by 22–46% over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods. PMID:26830599

  2. Creep rupture testing of carbon fiber-reinforced epoxy composites

    NASA Astrophysics Data System (ADS)

    Burton, Kathryn Anne

    Carbon fiber is becoming more prevalent in everyday life. As such, it is necessary to have a thorough understanding not only of general mechanical properties but also of long-term material behavior. Creep rupture testing of carbon fiber is very difficult due to its high strength and low strain-to-rupture properties. Past efforts have included testing on strands, single tows and overwrapped pressure vessels. In this study, 1-inch-wide, [0°/90°]s laminated composite specimens were constructed from fabric supplied by T.D. Williamson Inc. Specimen fabrication methods and gripping techniques were investigated, and a method was developed to collect long-term creep rupture behavior data. An Instron 1321 servo-hydraulic material testing machine was used to execute static strength and short-term creep rupture tests. A hanging dead-weight apparatus was designed to perform long-term creep rupture testing. The testing apparatus, specimens, and specimen grips functioned well. Collected data exhibited a power-law distribution and therefore a linear trend on a log strength-log time plot. Statistical analysis indicated the material exhibited slow degradation behavior, similar to previous studies, and could maintain a 50-year carrying capacity at 62% of static strength, approximately 45.7 ksi.
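    The observation that power-law creep-rupture data plot as a straight line on log strength-log time axes is what makes long-term extrapolation possible: a linear least-squares fit in log space recovers the degradation exponent. A minimal sketch on synthetic data (the exponent and coefficient below are illustrative, not the study's results):

```python
import numpy as np

def fit_power_law(time_hr, strength_ksi):
    """Fit log10(strength) = intercept + slope * log10(time)."""
    slope, intercept = np.polyfit(np.log10(time_hr), np.log10(strength_ksi), 1)
    return slope, intercept

def strength_at(time_hr, slope, intercept):
    """Extrapolated strength at a given time from the log-log fit."""
    return 10 ** (intercept + slope * np.log10(time_hr))

# Synthetic data generated from sigma = 80 * t**(-0.02):
t = np.array([1.0, 10.0, 100.0, 1000.0])
sigma = 80.0 * t ** -0.02
slope, intercept = fit_power_law(t, sigma)
```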

  3. Impact detection and analysis/health monitoring system for composites

    NASA Astrophysics Data System (ADS)

    Child, James E.; Kumar, Amrita; Beard, Shawn; Qing, Peter; Paslay, Don G.

    2006-05-01

    This manuscript includes information from test evaluations and development of a smart event detection system for use in monitoring composite rocket motor cases for damaging impacts. The primary purpose of the system, as a sentry for case impact event logging, is accomplished through implementation of a passive network of miniaturized piezoelectric sensors, a logger with pre-determined force threshold levels, and analysis software. Empirical approaches to structural characterization and network calibration, along with implementation techniques, were successfully evaluated; testing was performed on both unloaded (without propellant) and loaded rocket motors, with the cylindrical areas being of primary focus. The logged test impact data with known physical network parameters provided for impact location as well as force determination, typically within 3 inches of the actual impact location using a 4-foot network grid, and force accuracy within 25% of the actual impact force. The simple empirical characterization approach, along with the robust and flexible sensor grids and battery-operated portable logger, shows promise of a system that can increase confidence in composite integrity for both new assets progressing through manufacturing processes and existing assets in storage or transportation.

  4. Log-rise of the resistivity in the holographic Kondo model

    NASA Astrophysics Data System (ADS)

    Padhi, Bikash; Tiwari, Apoorv; Setty, Chandan; Phillips, Philip W.

    2018-03-01

    We study a single-channel Kondo effect using a recently developed [1-4] holographic large-N technique. In order to obtain the resistivity of this model, we introduce a probe field. The gravity dual of a localized fermionic impurity in 1+1-dimensional host matter is constructed by embedding a localized two-dimensional anti-de Sitter (AdS2) brane in the bulk of three-dimensional AdS3. This helps us construct an impurity charge density which acts as a source to the bulk equation of motion of the probe gauge field. The functional form of the charge density is obtained independently by solving the equations of motion for the fields confined to the AdS2 brane. The asymptotic solution of the probe field is dictated by the impurity charge density, which in turn affects the current-current correlation functions and hence the resistivity. Our choice of parameters tunes the near-boundary impurity current to be marginal, resulting in a log T behavior in the UV resistivity, as is expected for the Kondo problem. The resistivity at the IR fixed point turns out to be zero, signaling a complete screening of the impurity.

  5. Log export and import restrictions of the U.S. Pacific Northwest and British Columbia: past and present.

    Treesearch

    Christine L. Lane

    1998-01-01

    Export constraints affecting North American west coast logs have existed intermittently since 1831. Recent developments have tended toward tighter restrictions. National, Provincial, and State rules are described.

  6. Sight Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-01

    Enables applications to emit log information into an output file and produce a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.

  7. Pesticide and trace metal occurrence and aquatic benchmark exceedances in surface waters and sediments of urban wetlands and retention ponds in Melbourne, Australia.

    PubMed

    Allinson, Graeme; Zhang, Pei; Bui, AnhDuyen; Allinson, Mayumi; Rose, Gavin; Marshall, Stephen; Pettigrove, Vincent

    2015-07-01

    Samples of water and sediments were collected from 24 urban wetlands in Melbourne, Australia, in April 2010, and tested for more than 90 pesticides using a range of gas chromatographic (GC) and liquid chromatographic (LC) techniques, for sample 'hormonal' activity using yeast-based recombinant receptor-reporter gene bioassays, and for trace metals using spectroscopic techniques. At the time of sampling, there was almost no estrogenic activity in the water column. Twenty-three different pesticide residues were observed in one or more water samples from the 24 wetlands; chemicals observed at more than 40% of sites were simazine (100%), atrazine (79%), and metalaxyl and terbutryn (46%). Using the toxicity unit (TU) concept, fewer than 15% of the detected pesticides were considered to pose an individual, short-term risk to fish or zooplankton in the ponds and wetlands. However, one pesticide (fenvalerate) may have posed a possible short-term risk to fish (log10TUf > -3), and three pesticides (azoxystrobin, fenamiphos and fenvalerate) may have posed a risk to zooplankton (log10TUzp between -2 and -3); all the photosystem II (PSII) inhibiting herbicides may have posed a risk to primary producers in the ponds and wetlands (log10TUap and/or log10TUalg > -3). The wetland sediments were contaminated with 16 different pesticides; no chemical was observed at more than one third of sites, but based on frequency of detection and concentrations, bifenthrin (33%, maximum 59 μg/kg) is the priority insecticide of concern for the sediments studied. Five sites returned a TU greater than the possible effect threshold (i.e. log10TU > 1) as a result of bifenthrin contamination of their sediments. Most sediments did not exceed Australian sediment quality guideline levels for trace metals. However, more than half of the sites had threshold effect concentration quotient (TECQ) values >1 for Cu (58%), Pb (50%), Ni (67%) and Zn (63%), and 75% of sites had mean probable effect concentration quotients (PECQ) >0.2, suggesting that the collected sediments may have been having some impact on sediment-dwelling organisms.
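    The toxicity unit (TU) screening used above is the ratio of a measured environmental concentration to a toxicity endpoint (e.g. an LC50 or EC50), compared on a log10 scale; log10 TU above -3 flags a possible short-term risk. A minimal sketch with hypothetical concentrations and endpoints (the -3 threshold follows the abstract; `possible_risk` and the example numbers are illustrative):

```python
import math

def log10_tu(concentration_ug_l, endpoint_ug_l):
    """log10 of the toxicity unit: measured concentration over the endpoint."""
    return math.log10(concentration_ug_l / endpoint_ug_l)

def possible_risk(concentration_ug_l, endpoint_ug_l, threshold=-3.0):
    """Flag a possible short-term risk when log10 TU exceeds the threshold."""
    return log10_tu(concentration_ug_l, endpoint_ug_l) > threshold

# Hypothetical pesticide at 0.05 ug/L against a 10 ug/L fish LC50:
flagged = possible_risk(0.05, 10.0)
```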

  8. Linear modeling of the soil-water partition coefficient normalized to organic carbon content by reversed-phase thin-layer chromatography.

    PubMed

    Andrić, Filip; Šegan, Sandra; Dramićanin, Aleksandra; Majstorović, Helena; Milojković-Opsenica, Dušanka

    2016-08-05

    Soil-water partition coefficient normalized to the organic carbon content (KOC) is one of the crucial properties influencing the fate of organic compounds in the environment. Chromatographic methods are a well-established alternative to the direct sorption techniques used for KOC determination. The present work proposes reversed-phase thin-layer chromatography (RP-TLC) as a simpler, yet equally accurate, method as the officially recommended HPLC technique. Several TLC systems were studied, including octadecyl-(RP18) and cyano-(CN) modified silica layers in combination with methanol-water and acetonitrile-water mixtures as mobile phases. In total, 50 compounds of different molecular shape, size, and varying ability to establish specific interactions were selected (phenols, benzodiazepines, triazine herbicides, and polyaromatic hydrocarbons). A calibration set of 29 compounds with known logKOC values determined by sorption experiments was used to build simple univariate calibrations, Principal Component Regression (PCR) and Partial Least Squares (PLS) models between logKOC and TLC retention parameters. The models exhibit good statistical performance, indicating that CN-layers contribute better to logKOC modeling than RP18-silica. The most promising TLC methods, the officially recommended HPLC method, and four in silico estimation approaches have been compared by the non-parametric Sum of Ranking Differences (SRD) approach. The best estimations of logKOC values were achieved by simple univariate calibration of TLC retention data involving CN-silica layers and moderate content of methanol (40-50% v/v). They ranked far better than the officially recommended HPLC method, which ranked in the middle. The worst estimates were obtained from in silico computations based on the octanol-water partition coefficient. A Linear Solvation Energy Relationship study revealed that the increased polarity of CN-layers over RP18 in combination with methanol-water mixtures is the key to better modeling of logKOC, through significant diminishing of the dipolar and proton-accepting influence of the mobile phase as well as enhancing the excess molar refractivity of the chromatographic systems. Copyright © 2016 Elsevier B.V. All rights reserved.
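    The univariate calibration step described above amounts to ordinary linear regression between known logKOC values and a TLC retention parameter, followed by prediction for new compounds. A minimal sketch on synthetic data (the retention values, slope and intercept are illustrative, not the paper's calibration):

```python
import numpy as np

def calibrate(retention, log_koc):
    """Fit logKOC = slope * retention + intercept on the calibration set."""
    slope, intercept = np.polyfit(retention, log_koc, 1)
    return slope, intercept

def predict_log_koc(retention, slope, intercept):
    """Estimate logKOC for new compounds from their retention parameter."""
    return slope * np.asarray(retention) + intercept

r_cal = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
koc_cal = 1.0 + 3.0 * r_cal          # synthetic linear relationship
slope, intercept = calibrate(r_cal, koc_cal)
est = predict_log_koc([0.4], slope, intercept)[0]
```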

  9. Predicting both passive intestinal absorption and the dissociation constant toward albumin using the PAMPA technique.

    PubMed

    Bujard, Alban; Sol, Marine; Carrupt, Pierre-Alain; Martel, Sophie

    2014-10-15

    The parallel artificial membrane permeability assay (PAMPA) is a high-throughput screening (HTS) method that is widely used to predict in vivo passive permeability through biological barriers, such as the skin, the blood brain barrier (BBB) and the gastrointestinal tract (GIT). The PAMPA technique has also been used to predict the dissociation constant (Kd) between a compound and human serum albumin (HSA) while disregarding passive permeability. Furthermore, the assay is based on the use of two separate 5-point kinetic experiments, which increases the analysis time. In the present study, we adapted the hexadecane membrane (HDM)-PAMPA assay to both predict passive gastrointestinal absorption via the permeability coefficient logPe value and determine the Kd. Two assays were performed: one in the presence and one in the absence of HSA in the acceptor compartment. In the absence of HSA, logPe values were determined after a 4-h incubation time, as originally described, but the dimethylsulfoxide (DMSO) percentage and pH were altered to be compatible with the protein. In parallel, a second PAMPA assay was performed in the presence of HSA during a 16-h incubation period. By adding HSA, a variation in the amount of compound crossing the membrane was observed compared to the permeability measured in the absence of HSA. The concentration of compound reaching the acceptor compartment in each case was used to determine both parameters (logPe and logKd) using numerical simulations, which highlighted the originality of this method because these calculations required only two endpoint measurements instead of a complete kinetic study. It should be noted that the amount of compound that reaches the acceptor compartment in the presence of HSA is modulated by complex dissociation in the receptor compartment. Only compounds that are moderately bound to albumin (-3

  10. Geophysical evaluation of sandstone aquifers in the Reconcavo-Tucano Basin, Bahia -- Brazil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lima, O.A.L. de

    1993-11-01

    The upper clastic sediments in the Reconcavo-Tucano basin comprise a multilayer aquifer system of Jurassic age. Its groundwater is normally fresh down to depths of more than 1,000 m. Locally, however, there are zones producing high-salinity or sulfur geothermal water. Analysis of electrical logs of more than 150 wells enabled the identification of the most typical sedimentary structures and the gross geometries of the sandstone units in selected areas of the basin. Based on this information, the thick sands are interpreted as coalescent point bars and the shales as flood-plain deposits of a large fluvial environment. The resistivity logs and core laboratory data are combined to develop empirical equations relating aquifer porosity and permeability to log-derived parameters such as formation factor and cementation exponent. Temperature logs of 15 wells were useful to quantify the water leakage through semiconfining shales. The groundwater quality was inferred from spontaneous potential (SP) log deflections under control of chemical analyses of water samples. An empirical chart is developed that relates the SP-derived water resistivity to the true water resistivity within the formations. The patterns of salinity variation with depth inferred from SP logs were helpful in identifying subsurface flows along major fault zones, where extensive mixing of water is taking place. A total of 49 vertical Schlumberger resistivity soundings aid in defining aquifer structures and in extrapolating the log-derived results. Transition zones between fresh and saline waters have also been detected based on a combination of logging and surface sounding data. Ionic filtering by water leakage across regional shales, local convection and mixing along major faults, and hydrodynamic dispersion away from lateral permeability contrasts are the main mechanisms controlling the observed distributions of salinity and temperature within the basin.
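    The link between resistivity logs and porosity typically runs through an Archie-type relation: the formation factor F = Ro/Rw, and F = phi**(-m) with cementation exponent m. The paper's exact empirical equations are not reproduced here; this is a sketch of the standard textbook form with example numbers:

```python
def formation_factor(ro_ohm_m, rw_ohm_m):
    """Formation factor: formation resistivity over water resistivity."""
    return ro_ohm_m / rw_ohm_m

def porosity_from_f(f, m=2.0):
    """Invert Archie's relation F = phi**(-m) for porosity."""
    return f ** (-1.0 / m)

# Example: Ro = 80 ohm-m and Rw = 5 ohm-m give F = 16,
# and with m = 2 a porosity of 0.25.
phi = porosity_from_f(formation_factor(80.0, 5.0), m=2.0)
```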

  11. Interspecies quantitative structure-activity relationships (QSARs) for eco-toxicity screening of chemicals: the role of physicochemical properties.

    PubMed

    Furuhama, A; Hasunuma, K; Aoki, Y

    2015-01-01

    In addition to molecular structure profiles, descriptors based on physicochemical properties are useful for explaining the eco-toxicities of chemicals. In a previous study we reported that a criterion based on the difference between the partition coefficient (log POW) and distribution coefficient (log D) values of chemicals enabled us to identify aromatic amines and phenols for which interspecies relationships with strong correlations could be developed for fish-daphnid and algal-daphnid toxicities. The chemicals that met the log D-based criterion were expected to have similar toxicity mechanisms (related to membrane penetration). Here, we investigated the applicability of log D-based criteria to the eco-toxicity of other kinds of chemicals, including aliphatic compounds. At pH 10, use of a log POW - log D > 0 criterion and omission of outliers resulted in the selection of more than 100 chemicals whose acute fish toxicities or algal growth inhibition toxicities were almost equal to their acute daphnid toxicities. The advantage of log D-based criteria is that they allow for simple, rapid screening and prioritizing of chemicals. However, inorganic molecules and chemicals containing certain structural elements cannot be evaluated, because calculated log D values are unavailable.
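    The screening criterion itself is a one-line filter over calculated properties: keep a chemical when its log POW minus its log D (at the chosen pH) is positive. A minimal sketch with hypothetical records (the names and values are invented for illustration):

```python
def meets_criterion(chemical):
    """log D-based screening: log POW - log D > 0 at the evaluation pH."""
    return chemical["log_pow"] - chemical["log_d"] > 0

chemicals = [
    {"name": "amine A", "log_pow": 2.1, "log_d": 0.8},   # ionizable: passes
    {"name": "alkane B", "log_pow": 3.0, "log_d": 3.0},  # neutral: fails
]
selected = [c["name"] for c in chemicals if meets_criterion(c)]
```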

  12. Predicting the Rate of River Bank Erosion Caused by Large Wood Log

    NASA Astrophysics Data System (ADS)

    Zhang, N.; Rutherfurd, I.; Ghisalberti, M.

    2016-12-01

    When a single tree falls into a river channel, flow is deflected and accelerated between the tree roots and the bank face, increasing shear stress and scouring the bank. The scallop-shaped erosion increases the diversity of the channel morphology, but also causes concern for adjacent landholders. Concern about increased bank erosion is one of the main reasons why large wood is still removed from channels in SE Australia. Further, the hydraulic effect of many logs in the channel can reduce overall bank erosion rates. Although both phenomena have been described before, this research develops a hydraulic model that estimates their magnitude, and tests and calibrates this model with flume and field measurements, using logs of various configurations and sizes. Specifically, the model estimates the change in excess shear stress on the bank associated with the log. The model addresses the effects of log angle, distance from the bank, log size and flow condition by solving mass continuity and energy conservation between the cross sections of the approaching flow and the contracted flow. We then evaluate our model against flume experiments performed with semi-realistic log models representing logs of different sizes and decay stages, by comparing the measured and simulated velocity increase in the gap between the log and the bank. The log angle, distance from the bank, and flow condition are systematically varied for each log model during the experiments. Finally, the calibrated model is compared with field data collected in anabranching channels of the Murray River in SE Australia, where there are abundant instream logs and regulated, consistently high flow for irrigation. Preliminary results suggest that a log can significantly increase the shear stress on the bank, especially when it is positioned perpendicular to the flow. The shear stress increases with the log angle in a rising curve (the log angle is the angle between the log trunk and the flow direction; 0° means the log is parallel to the flow with the canopy pointing downstream). However, the shear stress shows insignificant changes as the log is moved closer to the bank.
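    The core of such a contraction model, mass continuity between the approach section and the gap between log and bank, plus a quadratic shear-stress closure, can be sketched as follows. The widths, friction coefficient, and the tau = rho * Cf * v**2 closure are illustrative assumptions, not the authors' calibrated model:

```python
RHO = 1000.0  # water density, kg/m^3

def gap_velocity(v_approach, width_channel, width_gap):
    """Continuity v1*A1 = v2*A2 with equal depth in both sections."""
    return v_approach * width_channel / width_gap

def shear_stress(v, cf=0.003):
    """Quadratic drag closure: tau = rho * Cf * v**2, Cf assumed constant."""
    return RHO * cf * v ** 2

# Halving the flow width doubles the velocity and quadruples the shear stress:
v_gap = gap_velocity(v_approach=0.5, width_channel=20.0, width_gap=10.0)
amplification = shear_stress(v_gap) / shear_stress(0.5)
```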

  13. A Multi-temporal Analysis of Logging Impacts on Tropical Forest Structure Using Airborne Lidar Data

    NASA Astrophysics Data System (ADS)

    Keller, M. M.; Pinagé, E. R.; Duffy, P.; Longo, M.; dos-Santos, M. N.; Leitold, V.; Morton, D. C.

    2017-12-01

    The long-term impacts of selective logging on carbon cycling and ecosystem function in tropical forests are still uncertain. Despite improvements in selective logging detection using satellite data, quantifying changes in forest structure from logging and recovery following logging is difficult using orbital data. We analyzed the dynamics of forest structure, comparing logged and unlogged forests in the Eastern Brazilian Amazon (Paragominas Municipality, Pará State), using small-footprint discrete-return airborne lidar data acquired in 2012 and 2014. Logging operations were conducted at the 1200 ha study site from 2006 through 2013 using reduced impact logging techniques, management practices that minimize canopy and ground damage compared to more common conventional logging. Nevertheless, logging still reduced aboveground biomass by 10% to 20% in logged areas compared to intact forests. We aggregated lidar point-cloud data at spatial scales ranging from 50 m to 250 m, developed a binomial classification model based on the height distribution of lidar returns in 2012, and validated the model against the 2014 lidar acquisition. We accurately classified intact and logged forest classes compared with field data. Classification performance improved as spatial resolution increased (AUC = 0.974 at 250 m). We analyzed the differences in canopy gaps, understory damage (based on a relative density model), and biomass (estimated from total canopy height) of the intact and logged classes. As expected, logging greatly increased both canopy gap formation and understory damage. However, while the area identified as canopy gap persisted for at least 8 years (from the oldest logging treatments in 2006 to the most recent lidar acquisition in 2014), the effects of ground damage were mostly erased by vigorous understory regrowth after about 5 years. The rate of new gap formation was 6 to 7 times greater in recently logged forests compared to undisturbed forests. New gaps opened at a rate 1.8 times greater than background even 8 years following logging, demonstrating the occurrence of delayed tree mortality. Our study showed that even low-intensity anthropogenic disturbances can cause persistent changes in tropical forest structure and dynamics.
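    The classification metric quoted above (AUC) can be computed directly from classifier scores via the rank-sum (Mann-Whitney) identity: the fraction of positive-negative pairs ranked correctly. A minimal sketch on synthetic scores; the paper's classifier used lidar height-distribution features, and this shows only the evaluation step:

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """AUC via the rank-sum identity: P(score_pos > score_neg), ties count half."""
    scores_pos = np.asarray(scores_pos, float)
    scores_neg = np.asarray(scores_neg, float)
    wins = 0.0
    for s in scores_pos:
        wins += np.sum(s > scores_neg) + 0.5 * np.sum(s == scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Perfectly separated synthetic scores give AUC = 1.0:
auc_value = auc([0.9, 0.8, 0.7], [0.2, 0.1, 0.3])
```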

  14. RTK and DGPS measurements using INTERNET and GSM radiolink

    NASA Astrophysics Data System (ADS)

    Rogowski, J. B.; Rogowski, A.; Kujawa, L.

    2003-04-01

    The practical need for GNSS positioning in real time has driven the development of media for data transmission. DGPS corrections can be transmitted over an area of a few hundred kilometers (tested at the Polish Solec Kujawski radio station) on long waves. The RTK technique needs greater capacity of the radio links and a shorter distance between the base stations. The RTK data from the base stations could be transmitted in the DARC system by local stations on UKF (VHF) channels, but the local stations are not interested in propagating RTCM data. Experiences with RTK and DGPS measurements using data transmission by INTERNET and GSM radio link are presented in the paper.

  15. The albedo of particles in reflection nebulae

    NASA Technical Reports Server (NTRS)

    Rush, W. F.

    1974-01-01

    The relation between the apparent angular extent of a reflection nebula and the apparent magnitude of its illuminating star was reconsidered under a less restrictive set of assumptions. A computational technique was developed which permits the use of fits to the observed m-log a values to determine the albedo of particles composing reflection nebulae, providing only that a phase function and average optical thickness are assumed. Multiple scattering, anisotropic phase functions, and illumination by the general star field are considered, and the albedo of reflection nebular particles appears to be the same as that for interstellar particles in general. The possibility of continuous fluorescence contributions to the surface brightness is also considered.

  16. An Integrated System for Wildlife Sensing

    DTIC Science & Technology

    2014-08-14

    design requirement. “Sensor Controller” software. A custom Sensor Controller application was developed for the Android device in order to collect...and log readings from that device’s sensors. “Camera Controller” software. A custom Camera Controller application was developed for the Android device...into 2 separate Android applications (Figure 4). The Sensor Controller logs readings periodically from the Android device’s organic sensors, and

  17. Initial fungal colonizer affects mass loss and fungal community development in Picea abies logs 6 yr after inoculation

    Treesearch

    Daniel L. Lindner; Rimvydas Vasaitis; Ariana Kubartova; Johan Allmer; Hanna Johannesson; Mark T. Banik; Jan. Stenlid

    2011-01-01

    Picea abies logs were inoculated with Resinicium bicolor, Fomitopsis pinicola or left un-inoculated and placed in an old-growth boreal forest. Mass loss and fungal community data were collected after 6 yr to test whether simplification of the fungal community via inoculation affects mass loss and fungal community development. Three...

  18. Antimicrobial activity of buttermilk and lactoferrin peptide extracts on poultry pathogens.

    PubMed

    Jean, Catherine; Boulianne, Martine; Britten, Michel; Robitaille, Gilles

    2016-11-01

    Antibiotics are commonly used in poultry feed as growth promoters. This practice is questioned given the arising importance of antibiotic resistance. Antimicrobial peptides can be used as food additives for a potent alternative to synthetic or semi-synthetic antibiotics. The objective of this study was to develop a peptide production method based on membrane adsorption chromatography in order to produce extracts with antimicrobial activity against avian pathogens (Salmonella enterica var. Enteritidis, Salmonella enterica var. Typhimurium, and two Escherichia coli strains, O78:H80 and TK3 O1:K1) as well as Staphylococcus aureus. To achieve this, buttermilk powder and purified lactoferrin were digested with pepsin. The peptide extracts (<10 kDa) were fractionated depending on their charges through high-capacity cation-exchange and anion-exchange adsorptive membranes. The yields of cationic peptide extracts were 6·3 and 15·4% from buttermilk and lactoferrin total peptide extracts, respectively. Antimicrobial activity was assessed using the microdilution technique on microplates. Our results indicate that the buttermilk cationic peptide extracts were bactericidal at less than 5 mg/ml against the selected avian strains, with losses of 1·7 log CFU/ml (Salm. Typhimurium) to 3 log CFU/ml (E. coli O78:H80); viability decreased by 1·5 log CFU/ml for Staph. aureus, a Gram-positive bacterium. Anionic and non-adsorbed peptide extracts were inactive at 5 mg/ml. These results demonstrate that membrane adsorption chromatography is an effective way to prepare a cationic peptide extract from buttermilk that is active against avian pathogens.

  19. Classifying zones of suitability for manual drilling using textural and hydraulic parameters of shallow aquifers: a case study in northwestern Senegal

    NASA Astrophysics Data System (ADS)

    Fussi, F. Fabio; Fumagalli, Letizia; Fava, Francesco; Di Mauro, Biagio; Kane, Cheik Hamidou; Niang, Magatte; Wade, Souleye; Hamidou, Barry; Colombo, Roberto; Bonomi, Tullia

    2017-12-01

    A method is proposed that uses analysis of borehole stratigraphic logs for the characterization of shallow aquifers and for the assessment of areas suitable for manual drilling. The model is based on available borehole-log parameters: depth to hard rock, depth to water, thickness of laterite and hydraulic transmissivity of the shallow aquifer. The model is applied to a study area in northwestern Senegal. A dataset of borehole logs has been processed using a software package (TANGAFRIC) developed during the research. After a manual procedure to assign a standard category describing the lithological characteristics, the next step is the automated extraction of different textural parameters and the estimation of hydraulic conductivity using reference values available in the literature. The hydraulic conductivity values estimated from stratigraphic data have been partially validated by comparing them with measured values from a series of pumping tests carried out in large-diameter wells. The results show that this method is able to produce a reliable interpretation of the shallow hydrogeological context using information generally available in the region. The research contributes to improving the identification of areas where conditions are suitable for manual drilling. This is achieved by applying the described method, based on a structured and semi-quantitative approach, to classify the zones of suitability for given manual drilling techniques using data available in most African countries. Ultimately, this work will support proposed international programs aimed at promoting low-cost water supply in Africa and enhancing access to safe drinking water for the population.
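    The lithology-to-hydraulics step, assigning a reference hydraulic conductivity to each textural class and summing layer transmissivities T = K * b over the logged section, can be sketched as follows. The class names and K values are illustrative literature-style figures, not TANGAFRIC's actual lookup table:

```python
# Illustrative reference conductivities by textural class, m/day.
K_REF_M_PER_DAY = {"coarse sand": 30.0, "fine sand": 5.0, "clay": 0.001}

def transmissivity(layers):
    """Sum K * thickness over logged intervals; layers: (class, thickness_m)."""
    return sum(K_REF_M_PER_DAY[lith] * b for lith, b in layers)

# Hypothetical borehole log with three intervals:
log_layers = [("fine sand", 4.0), ("coarse sand", 2.0), ("clay", 3.0)]
t_m2_per_day = transmissivity(log_layers)
```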

  20. Protective immune response of chickens to oral vaccination with thermostable live Fowlpox virus vaccine (strain TPV-1) coated on oiled rice.

    PubMed

    Wambura, Philemon N; Godfrey, S K

    2010-03-01

    The objective of the present study was to develop and evaluate a local vaccine (strain TPV-1) against fowlpox (FP) in chickens. Two separate groups of chickens were vaccinated with FP vaccine through the oral (coated on oiled rice) and wing-web stab routes, respectively. The results showed that the haemagglutination-inhibition (HI) antibody titres in both vaccinated groups were comparable and significantly higher (P < 0.05) than in the control chickens. It was further revealed that 14 days after vaccination an HI GMT of ≥2 log2 was recorded in chickens vaccinated by the oral and wing-web stab routes, whereas 35 days after vaccination the HI antibody titres reached 5.6 log2 and 6.3 log2, respectively. Moreover, in both groups the birds showed 100% protection against challenge virus at 35 days after vaccination. The findings from the present study have shown that the oral route is as effective as the wing-web stab route for vaccination of chickens against FP. However, the oral route can be used in mass vaccination of birds, thus avoiding the need to catch individual birds for vaccination. It was noteworthy that strain TPV-1 virus could be propagated by a simple allantoic cavity inoculation and harvesting of allantoic fluid, and that it survived exposure at 57 degrees C for 2 hours. If the oral vaccination technique is optimized it may be used in controlling FP in scavenging and feral chickens. In conclusion, the present study has shown that the FP vaccine (strain TPV-1) was safe, thermostable, immunogenic and efficacious in vaccinated chickens.
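    The geometric mean titre (GMT) figures quoted in log2 units follow from averaging titres on the log2 scale; the GMT itself is 2 raised to that mean. A minimal sketch with illustrative titres, not the study's data:

```python
import math

def gmt_log2(titres):
    """Mean of the titres on the log2 scale (HI titres are serial 2-fold dilutions)."""
    return sum(math.log2(t) for t in titres) / len(titres)

def gmt(titres):
    """Geometric mean titre: 2 raised to the mean log2 titre."""
    return 2 ** gmt_log2(titres)

example = gmt_log2([32, 64, 128])   # mean of 5, 6, 7 log2
```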

  1. A graphical automated detection system to locate hardwood log surface defects using high-resolution three-dimensional laser scan data

    Treesearch

    Liya Thomas; R. Edward Thomas

    2011-01-01

    We have developed an automated defect detection system and a state-of-the-art Graphic User Interface (GUI) for hardwood logs. The algorithm identifies defects at least 0.5 inch high and at least 3 inches in diameter on barked hardwood log and stem surfaces. To summarize defect features and to build a knowledge base, hundreds of defects were measured, photographed, and...

  2. Meta-Analysis of the Reduction of Norovirus and Male-Specific Coliphage Concentrations in Wastewater Treatment Plants.

    PubMed

    Pouillot, Régis; Van Doren, Jane M; Woods, Jacquelina; Plante, Daniel; Smith, Mark; Goblick, Gregory; Roberts, Christopher; Locas, Annie; Hajen, Walter; Stobo, Jeffrey; White, John; Holtzman, Jennifer; Buenaventura, Enrico; Burkhardt, William; Catford, Angela; Edwards, Robyn; DePaola, Angelo; Calci, Kevin R

    2015-07-01

    Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Wastewater treatment plant (WWTP) effluents impacting bivalve mollusk-growing areas are potential sources of NoV contamination. We have developed a meta-analysis that evaluates WWTP influent concentrations and log10 reductions of NoV genotype I (NoV GI; in numbers of genome copies per liter [gc/liter]), NoV genotype II (NoV GII; in gc/liter), and male-specific coliphage (MSC; in number of PFU per liter), a proposed viral surrogate for NoV. The meta-analysis included relevant data (2,943 measurements) reported in the scientific literature through September 2013 and previously unpublished surveillance data from the United States and Canada. Model results indicated that the mean WWTP influent concentration of NoV GII (3.9 log10 gc/liter; 95% credible interval [CI], 3.5, 4.3 log10 gc/liter) is larger than the value for NoV GI (1.5 log10 gc/liter; 95% CI, 0.4, 2.4 log10 gc/liter), with large variations occurring from one WWTP to another. For WWTPs with mechanical systems and chlorine disinfection, mean log10 reductions were -2.4 log10 gc/liter (95% CI, -3.9, -1.1 log10 gc/liter) for NoV GI, -2.7 log10 gc/liter (95% CI, -3.6, -1.9 log10 gc/liter) for NoV GII, and -2.9 log10 PFU per liter (95% CI, -3.4, -2.4 log10 PFU per liter) for MSCs. Comparable values for WWTPs with lagoon systems and chlorine disinfection were -1.4 log10 gc/liter (95% CI, -3.3, 0.5 log10 gc/liter) for NoV GI, -1.7 log10 gc/liter (95% CI, -3.1, -0.3 log10 gc/liter) for NoV GII, and -3.6 log10 PFU per liter (95% CI, -4.8, -2.4 log10 PFU per liter) for MSCs. Within WWTPs, correlations exist between mean NoV GI and NoV GII influent concentrations and between the mean log10 reduction in NoV GII and the mean log10 reduction in MSCs. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  3. Molecular methods (digital PCR and real-time PCR) for the quantification of low copy DNA of Phytophthora nicotianae in environmental samples.

    PubMed

    Blaya, Josefa; Lloret, Eva; Santísima-Trinidad, Ana B; Ros, Margarita; Pascual, Jose A

    2016-04-01

    Currently, real-time polymerase chain reaction (qPCR) is the technique most often used to quantify pathogen presence. Digital PCR (dPCR) is a newer technique with the potential to have a substantial impact on plant pathology research owing to its reproducibility, sensitivity and low susceptibility to inhibitors. In this study, we evaluated the feasibility of using dPCR and qPCR to quantify Phytophthora nicotianae in several background matrices, including host tissues (stems and roots) and soil samples. In spite of the lower dynamic range of dPCR (3 logs, compared with 7 logs for qPCR), the technique proved highly precise at very low copy numbers. dPCR accurately detected the pathogen in all sample types over a broad concentration range. Moreover, dPCR appears to be less susceptible to inhibitors than qPCR in plant samples. Linear regression analysis showed a high correlation between the results obtained with the two techniques in soil, stem and root samples, with R(2) = 0.873, 0.999 and 0.995, respectively. These results suggest that dPCR is a promising alternative for quantifying soil-borne pathogens in environmental samples, even in early stages of the disease. © 2015 Society of Chemical Industry.
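
    dPCR's absolute quantification rests on Poisson statistics: the fraction of positive partitions is corrected to a mean copy number per partition. This is standard to the method, not specific to this paper; the partition volume below is a placeholder:

```python
import math

def dpcr_copies_per_ul(positive, total, partition_volume_ul):
    """Digital PCR quantification: Poisson-correct the fraction of positive
    partitions to mean copies per partition (lambda), then divide by the
    partition volume to get concentration. Volume here is a placeholder."""
    lam = -math.log(1.0 - positive / total)   # mean copies per partition
    return lam / partition_volume_ul

# half the partitions positive -> lambda = ln(2) copies per partition
print(dpcr_copies_per_ul(10000, 20000, 0.00085))
```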

  4. Results of investigations at the Zunil geothermal field, Guatemala: Well logging and brine geochemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, A.; Dennis, B.; Van Eeckhout, E.

    1991-07-01

    The well logging team from Los Alamos and its counterpart from Central America were tasked to investigate the condition of four producing geothermal wells in the Zunil Geothermal Field. The information obtained would be used to help evaluate the Zunil geothermal reservoir in terms of possible additional drilling and future power plant design. The field activities focused on downhole measurements in four production wells (ZCQ-3, ZCQ-4, ZCQ-5, and ZCQ-6). The teams took measurements of the wells in both static (shut-in) and flowing conditions, using the high-temperature well logging tools developed at Los Alamos National Laboratory. Two well logging missions were conducted in the Zunil field. In October 1988, measurements were made in wells ZCQ-3, ZCQ-5, and ZCQ-6. In December 1989, the second field operation logged ZCQ-4 and repeated logs in ZCQ-3. Both field operations included not only well logging but also the collection of numerous fluid samples from both thermal and nonthermal waters. 18 refs., 22 figs., 7 tabs.

  5. Use of NMR logging to obtain estimates of hydraulic conductivity in the High Plains aquifer, Nebraska, USA

    USGS Publications Warehouse

    Dlubac, Katherine; Knight, Rosemary; Song, Yi-Qiao; Bachman, Nate; Grau, Ben; Cannia, Jim; Williams, John

    2013-01-01

    Hydraulic conductivity (K) is one of the most important parameters of interest in groundwater applications because it quantifies the ease with which water can flow through an aquifer material. Hydraulic conductivity is typically measured by conducting aquifer tests or wellbore flow (WBF) logging. Of interest in our research is the use of proton nuclear magnetic resonance (NMR) logging to obtain information about water-filled porosity and pore space geometry, the combination of which can be used to estimate K. In this study, we acquired a suite of advanced geophysical logs, aquifer tests, WBF logs, and sidewall cores at a field site in Lexington, Nebraska, which is underlain by the High Plains aquifer. We first used two empirical equations developed for petroleum applications to predict K from NMR logging data: the Schlumberger Doll Research equation (KSDR) and the Timur-Coates equation (KT-C), with the standard empirical constants determined for consolidated materials. We upscaled our NMR-derived K estimates to the scale of the WBF-logging K estimates (KWBF-logging) for comparison. All the upscaled KT-C estimates were within an order of magnitude of KWBF-logging and all of the upscaled KSDR estimates were within 2 orders of magnitude of KWBF-logging. We optimized the fit between the upscaled NMR-derived K and KWBF-logging estimates to determine a set of site-specific empirical constants for the unconsolidated materials at our field site. We conclude that reliable estimates of K can be obtained from NMR logging data, thus providing an alternate method for obtaining estimates of K at high levels of vertical resolution.
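
    The two empirical forms have the following commonly quoted shapes; the default constants below are placeholders, since (as the study itself shows) they must be recalibrated for unconsolidated materials:

```python
def k_sdr(phi, t2ml, b=4.6):
    """Schlumberger-Doll Research form: K = b * phi**4 * T2ML**2,
    with phi the NMR porosity and T2ML the mean-log T2 relaxation time.
    b = 4.6 is a placeholder empirical constant."""
    return b * phi ** 4 * t2ml ** 2

def k_timur_coates(phi, ffv, bfv, c=10.0):
    """Timur-Coates form: K = (phi / C)**4 * (FFV / BFV)**2, with FFV/BFV
    the free- and bound-fluid volumes split from the T2 distribution.
    C = 10 is a placeholder empirical constant."""
    return (phi / c) ** 4 * (ffv / bfv) ** 2
```

    Site-specific calibration then amounts to fitting b (or C and the cutoff) so the upscaled estimates match KWBF-logging.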

  6. Introducing high performance distributed logging service for ACS

    NASA Astrophysics Data System (ADS)

    Avarias, Jorge A.; López, Joao S.; Maureira, Cristián; Sommer, Heiko; Chiozzi, Gianluca

    2010-07-01

    The ALMA Common Software (ACS) is a software framework that provides the infrastructure for the Atacama Large Millimeter Array and other projects. ACS, based on CORBA, offers basic services and common design patterns for distributed software. Every properly built system needs to be able to log status and error information. Logging in a single computer scenario can be as easy as using fprintf statements. However, in a distributed system, it must provide a way to centralize all logging data in a single place without overloading the network or complicating the applications. ACS provides a complete logging service infrastructure in which every log has an associated priority and timestamp, allowing filtering at different levels of the system (application, service and clients). Currently the ACS logging service uses an implementation of the CORBA Telecom Log Service in a customized way, using only a minimal subset of the features provided by the standard. The most relevant feature used by ACS is the ability to treat the logs as event data that gets distributed over the network in a publisher-subscriber paradigm. For this purpose the CORBA Notification Service, which is resource intensive, is used. On the other hand, the Data Distribution Service (DDS) provides an alternative standard for publisher-subscriber communication for real-time systems, offering better performance and featuring decentralized message processing. The current document describes how the new high performance logging service of ACS has been modeled and developed using DDS, replacing the Telecom Log Service. Benefits and drawbacks are analyzed. A benchmark is presented comparing the differences between the implementations.
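
    The publish-subscribe pattern with priority filtering described above can be sketched with a toy in-process bus (the real ACS implementation uses CORBA Notification Service or DDS over the network; this is only an illustration of the pattern):

```python
import time
from collections import defaultdict

class LogBus:
    """Toy in-process publisher-subscriber log bus: subscribers register a
    minimum priority, and each published record is delivered only to
    subscribers whose threshold it meets."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, min_priority, handler):
        self.subscribers[min_priority].append(handler)

    def publish(self, priority, message):
        record = {"ts": time.time(), "priority": priority, "msg": message}
        for min_p, handlers in self.subscribers.items():
            if priority >= min_p:      # priority-based filtering
                for h in handlers:
                    h(record)

bus = LogBus()
seen = []
bus.subscribe(2, seen.append)          # only logs with priority >= 2
bus.publish(1, "debug detail")         # below threshold: filtered out
bus.publish(3, "antenna error")        # delivered to the subscriber
```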

  7. Alternative Hand Contamination Technique To Compare the Activities of Antimicrobial and Nonantimicrobial Soaps under Different Test Conditions

    PubMed Central

    Fuls, Janice L.; Rodgers, Nancy D.; Fischler, George E.; Howard, Jeanne M.; Patel, Monica; Weidner, Patrick L.; Duran, Melani H.

    2008-01-01

    Antimicrobial hand soaps provide a greater bacterial reduction than nonantimicrobial soaps. However, the link between greater bacterial reduction and a reduction of disease has not been definitively demonstrated. Confounding factors, such as compliance, soap volume, and wash time, may all influence the outcomes of studies. The aim of this work was to examine the effects of wash time and soap volume on the relative activities and the subsequent transfer of bacteria to inanimate objects for antimicrobial and nonantimicrobial soaps. Increasing the wash time from 15 to 30 seconds increased reduction of Shigella flexneri from 2.90 to 3.33 log10 counts (P = 0.086) for the antimicrobial soap, while nonantimicrobial soap achieved reductions of 1.72 and 1.67 log10 counts (P > 0.6). Increasing soap volume increased bacterial reductions for both the antimicrobial and the nonantimicrobial soaps. When the soap volume was normalized based on weight (∼3 g), nonantimicrobial soap reduced Serratia marcescens by 1.08 log10 counts, compared to the 3.83-log10 reduction caused by the antimicrobial soap (P < 0.001). The transfer of Escherichia coli to plastic balls following a 15-second hand wash with antimicrobial soap resulted in a bacterial recovery of 2.49 log10 counts, compared to the 4.22-log10 (P < 0.001) bacterial recovery on balls handled by hands washed with nonantimicrobial soap. This indicates that nonantimicrobial soap was less active and that the effectiveness of antimicrobial soaps can be improved with longer wash time and greater soap volume. The transfer of bacteria to objects was significantly reduced due to greater reduction in bacteria following the use of antimicrobial soap. PMID:18441107
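
    The log10 reductions reported above are differences of log-transformed counts; for intuition, a 3.83-log10 reduction corresponds to roughly a 6,800-fold decrease in recoverable bacteria:

```python
import math

def log10_reduction(cfu_before, cfu_after):
    """Log reduction between bacterial counts: log10(N0) - log10(N)."""
    return math.log10(cfu_before) - math.log10(cfu_after)

print(log10_reduction(1_000_000, 1_000))  # → 3.0
```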

  9. Inactivation of Escherichia coli O157:H7 on Orange Fruit Surfaces and in Juice Using Photocatalysis and High Hydrostatic Pressure.

    PubMed

    Yoo, Sungyul; Ghafoor, Kashif; Kim, Jeong Un; Kim, Sanghun; Jung, Bora; Lee, Dong-Un; Park, Jiyong

    2015-06-01

    Nonpasteurized orange juice is manufactured by squeezing juice from fruit without peel removal. Fruit surfaces may carry pathogenic microorganisms that can contaminate squeezed juice. Titanium dioxide-UVC photocatalysis (TUVP), a nonthermal technique capable of microbial inactivation via generation of hydroxyl radicals, was used to decontaminate orange surfaces. Levels of spot-inoculated Escherichia coli O157:H7 (initial level of 7.0 log CFU/cm(2)) on oranges (12 cm(2)) were reduced by 4.3 log CFU/ml when treated with TUVP (17.2 mW/cm(2)). Reductions of 1.5, 3.9, and 3.6 log CFU/ml were achieved using tap water, chlorine (200 ppm), and UVC alone (23.7 mW/cm(2)), respectively. E. coli O157:H7 in juice from TUVP (17.2 mW/cm(2))-treated oranges was reduced by 1.7 log CFU/ml. After orange juice was treated with high hydrostatic pressure (HHP) at 400 MPa for 1 min without any prior fruit surface disinfection, the level of E. coli O157:H7 was reduced by 2.4 log CFU/ml. However, the E. coli O157:H7 level in juice was reduced by 4.7 log CFU/ml (to lower than the detection limit) when TUVP treatment of oranges was followed by HHP treatment of juice, indicating a synergistic inactivation effect. The inactivation kinetics of E. coli O157:H7 on orange surfaces followed a biphasic model. HHP treatment did not affect the pH, °Brix, or color of juice. However, the ascorbic acid concentration and pectinmethylesterase activity were reduced by 35.1 and 34.7%, respectively.
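
    The biphasic inactivation kinetics mentioned above are commonly written in the Cerf form: a sensitive subpopulation dying quickly and a resistant one dying slowly. The parameters below are placeholders, not the study's fitted values:

```python
import math

def biphasic_log_survival(t, f, k1, k2):
    """Cerf-type biphasic model: log10(N/N0) for a population with a
    sensitive fraction f (rate k1, log10 units per unit time) and a
    resistant fraction 1-f (rate k2). Parameters are illustrative."""
    return math.log10(f * 10 ** (-k1 * t) + (1 - f) * 10 ** (-k2 * t))

print(biphasic_log_survival(0.0, 0.99, 2.0, 0.2))  # → 0.0 (no treatment yet)
```

    Early on the curve falls at roughly k1 per unit time; once the sensitive fraction is gone, the slope flattens toward k2, producing the characteristic two-slope survival curve.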

  10. Performance of Encounternet Tags: Field Tests of Miniaturized Proximity Loggers for Use on Small Birds

    PubMed Central

    Levin, Iris I.; Zonana, David M.; Burt, John M.; Safran, Rebecca J.

    2015-01-01

    Proximity logging is a new tool for understanding social behavior as it allows for accurate quantification of social networks. We report results from field calibration and deployment tests of miniaturized proximity tags (Encounternet), digital transceivers that log encounters between tagged individuals. We examined radio signal behavior in relation to tag attachment (tag, tag on bird, tag on saline-filled balloon) to understand how received signal strength (RSSI) is affected by the tag mounting technique used for calibration tests. We investigated inter-tag and inter-receiver station variability, and in each calibration test we accounted for the effects of antennae orientation. Additionally, we used data from a live deployment on breeding barn swallows (Hirundo rustica erythrogaster) to analyze the quality of the logs, including reciprocal agreement in dyadic logs. We evaluated the impact (in terms of mass changes) of tag attachment on the birds. We were able to statistically distinguish between RSSI values associated with different close-proximity (<5 m) tag-tag distances regardless of antennae orientation. Inter-tag variability was low, but we did find significant inter-receiver station variability. Reciprocal agreement of dyadic logs was high and social networks were constructed from proximity tag logs based on two different RSSI thresholds. There was no evidence of significant mass loss in the time birds were wearing tags. We conclude that proximity loggers are accurate and effective for quantifying social behavior. However, because RSSI and distance cannot be perfectly resolved, data from proximity loggers are most appropriate for comparing networks based on specific RSSI thresholds. The Encounternet system is flexible and customizable, and tags are now light enough for use on small animals (<50 g). PMID:26348329

  11. Image processing system for the measurement of timber truck loads

    NASA Astrophysics Data System (ADS)

    Carvalho, Fernando D.; Correia, Bento A. B.; Davies, Roger; Rodrigues, Fernando C.; Freitas, Jose C. A.

    1993-01-01

    The paper industry uses wood as its raw material. To know the quantity of wood in the pile of sawn tree trunks, every truck load entering the plant is measured to determine its volume. The objective of this procedure is to know the solid volume of wood stocked in the plant. Weighing the tree trunks has its own problems, due to their high capacity for absorbing water. Image processing techniques were used to evaluate the volume of a truck load of logs of wood. The system is based on a PC equipped with an image processing board using data flow processors. Three cameras allow image acquisition of the sides and rear of the truck. The lateral images contain information about the sectional area of the logs, and the rear image contains information about the length of the logs. The machine vision system and the implemented algorithms are described. The results being obtained with the industrial prototype that is now installed in a paper mill are also presented.

  12. A generic standard additions based method to determine endogenous analyte concentrations by immunoassays to overcome complex biological matrix interference.

    PubMed

    Pang, Susan; Cowen, Simon

    2017-12-13

    We describe a novel generic method to derive the unknown endogenous concentrations of analyte within complex biological matrices (e.g. serum or plasma) based upon the relationship between the immunoassay signal response of a biological test sample spiked with known analyte concentrations and the log transformed estimated total concentration. If the estimated total analyte concentration is correct, a portion of the sigmoid on a log-log plot is very close to linear, allowing the unknown endogenous concentration to be estimated using a numerical method. This approach obviates conventional relative quantification using an internal standard curve and need for calibrant diluent, and takes into account the individual matrix interference on the immunoassay by spiking the test sample itself. This technique is based on standard additions for chemical analytes. Unknown endogenous analyte concentrations within even 2-fold diluted human plasma may be determined reliably using as few as four reaction wells.
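
    A simplified numerical sketch of the idea: scan candidate endogenous concentrations and keep the one that best linearizes signal versus log10(total concentration). This is an illustration of the principle, not the authors' exact algorithm:

```python
import math

def linfit_r2(x, y):
    # coefficient of determination R^2 for a straight-line least-squares fit
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy * sxy / (sxx * syy)

def estimate_endogenous(spikes, signals, candidates):
    # the candidate endogenous concentration that makes signal vs
    # log10(candidate + spike) most linear is taken as the estimate
    return max(candidates,
               key=lambda c0: linfit_r2([math.log10(c0 + s) for s in spikes],
                                        signals))

# synthetic check: a linear-in-log response with true endogenous level 5.0
spikes = [0.0, 2.0, 5.0, 10.0]          # four spiked reaction wells
signals = [3.0 * math.log10(5.0 + s) + 1.0 for s in spikes]
candidates = [c / 10 for c in range(1, 201)]
print(estimate_endogenous(spikes, signals, candidates))  # → 5.0
```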

  13. Image analysis for the automated estimation of clonal growth and its application to the growth of smooth muscle cells.

    PubMed

    Gavino, V C; Milo, G E; Cornwell, D G

    1982-03-01

    Image analysis was used for the automated measurement of colony frequency (f) and colony diameter (d) in cultures of smooth muscle cells. Initial studies with the inverted microscope showed that the number of cells (N) in a colony varied directly with d: log N = 1.98 log d - 3.469. Image analysis generated the complement of a cumulative distribution for f as a function of d. The number of cells in each segment of the distribution function was calculated by multiplying f and the average N for the segment. These data were displayed as a cumulative distribution function. The total number of colonies (fT) and the total number of cells (NT) were used to calculate the average colony size (NA). Population doublings (PD) were then expressed as log2 NA. Image analysis confirmed previous studies in which colonies were sized and counted with an inverted microscope. Thus, image analysis is a rapid and automated technique for the measurement of clonal growth.
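
    The two relations in the abstract translate directly to code (d is in whatever units the original calibration used; the fit constants are the paper's):

```python
import math

def cells_in_colony(d):
    """Cell count N from colony diameter d via the reported fit:
    log N = 1.98 * log d - 3.469 (d in the units of the original calibration)."""
    return 10 ** (1.98 * math.log10(d) - 3.469)

def population_doublings(total_cells, total_colonies):
    """PD = log2 of the average colony size NA = NT / fT."""
    return math.log2(total_cells / total_colonies)
```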

  14. IMPLEMENTING A NOVEL CYCLIC CO2 FLOOD IN PALEOZOIC REEFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James R. Wood; W. Quinlan; A. Wylie

    2003-07-01

    Recycled CO2 will be used in this demonstration project to produce bypassed oil from the Silurian Charlton 6 pinnacle reef (Otsego County) in the Michigan Basin. Contract negotiations by our industry partner to gain access to this CO2, which would otherwise be vented to the atmosphere, are near completion. A new method of subsurface characterization, log curve amplitude slicing, is being used to map facies distributions and reservoir properties in two reefs, the Belle River Mills and Chester 18 fields. These fields are being used as type fields because they have excellent log-curve and core data coverage. Amplitude slicing of the normalized gamma ray curves is showing trends that may indicate significant heterogeneity and compartmentalization in these reservoirs. Digital and hard copy data continue to be compiled for the Niagaran reefs in the Michigan Basin. Technology transfer took place through technical presentations on the log curve amplitude slicing technique and a booth at the Midwest PTTC meeting.

  15. The dominant microbial community associated with fermentation of Obushera (sorghum and millet beverages) determined by culture-dependent and culture-independent methods.

    PubMed

    Mukisa, Ivan M; Porcellato, Davide; Byaruhanga, Yusuf B; Muyanja, Charles M B K; Rudi, Knut; Langsrud, Thor; Narvhus, Judith A

    2012-11-01

    Obushera includes four fermented cereal beverages from Uganda, namely Obutoko, Enturire, Ekitiribita and Obuteire, whose microbial diversity has not hitherto been fully investigated. Knowledge of the microbial diversity and dynamics in these products is crucial for understanding their safety and for the development of appropriate starter cultures for controlled industrial processing. Culture-dependent and culture-independent techniques, including denaturating gradient gel electrophoresis (DGGE) and mixed DNA sequencing of polymerase chain reaction (PCR) amplified ribosomal RNA genes, were used to study the bacteria and yeast diversity of Obushera. The pH dropped from 6.0-4.6 to 3.5-4.0 within 1-2 days for Obutoko, Enturire and Obuteire, whereas that of Ekitiribita decreased to 4.4 after 4 days. Counts of lactic acid bacteria (LAB) increased from 5.0 to 11.0 log cfu g(-1) and yeasts increased from 3.4 to 7.1 log cfu g(-1), while coliform counts decreased from 2.0 to <1 log cfu g(-1) during four days of fermentation. LAB and yeast isolates were identified by rRNA gene sequence analysis. LAB isolates included: Enterococcus spp., Lactobacillus (Lb.) plantarum, Lb. fermentum, Lb. delbrueckii, Lactococcus lactis, Leuconostoc lactis, Streptococcus (S.) infantarius subsp. infantarius, Pediococcus pentosaceus and Weissella (W.) confusa. DGGE indicated predominance of S. gallolyticus, S. infantarius subsp. infantarius, Lb. fermentum, Lb. delbrueckii, W. confusa, Lb. reuteri, Fructobacillus spp., L. lactis and L. lactis. Yeast isolates included Clavispora lusitaniae, Cyberlindnera fabianii, Issatchenkia orientalis and Saccharomyces cerevisiae. DGGE indicated predominance of S. cerevisiae in Obutoko, Enturire and Obuteire and also detected Pichia spp. and I. orientalis in Obutoko. Obushera produced in the laboratory was initially dominated by Enterobacteriaceae and later by Lactococcus spp. Enterobacteriaceae and Bacillus spp. were also detected in Ekitiribita.
Development of starters for Obushera may require combinations of LAB and S. cerevisiae for Obutoko, Enturire and Obuteire and LAB for Ekitiribita. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Electrical Resistivity Tomography and Ground Penetrating Radar for locating buried petrified wood sites: a case study in the natural monument of the Petrified Forest of Evros, Greece

    NASA Astrophysics Data System (ADS)

    Vargemezis, George; Diamanti, Nectaria; Tsourlos, Panagiotis; Fikos, Ilias

    2014-05-01

    A geophysical survey was carried out in the Petrified Forest of Evros, the northernmost regional unit of Greece. This collection of petrified wood has an age of approximately 35 million years and is the oldest in Greece (i.e., older than the well-known Petrified Forest of Lesvos island in the North Aegean Sea, possibly the largest petrified forest worldwide). Protection, development and maintenance projects still need to be carried out in the area, given concerns for the forest's fate: many petrified logs remain exposed both to the weather, leading to erosion, and to the public. This survey was conducted as part of a more extensive framework for the development and protection of this natural monument. Geophysical surveying was chosen as a non-destructive investigation method, since the area of application is both a natural ecosystem and part of cultural heritage. Along with electrical resistivity tomography (ERT), ground penetrating radar (GPR) surveys were carried out to investigate possible locations of buried fossilized tree trunks. The geoelectrical sections derived from ERT data, in combination with the GPR profiles, provided a broad view of the subsurface. Two- and three-dimensional subsurface geophysical images of the surveyed area were constructed, pointing out probable locations of petrified logs. In the ERT data, petrified trunks were detected as highly resistive bodies, while lower resistivity values were associated with the surrounding geological materials. GPR surveying also indicated buried petrified log locations. As the two geophysical methods are affected in different ways by subsurface conditions, their combined use enhanced our ability to produce reliable interpretations of the subsurface.
After the completion of the geophysical investigations of this first stage, petrified trunks were revealed after a subsequent excavation at indicated locations. Moreover, we identified possible buried petrified targets at locations yet to be excavated.

  17. The Plant Ionome Revisited by the Nutrient Balance Concept

    PubMed Central

    Parent, Serge-Étienne; Parent, Léon Etienne; Egozcue, Juan José; Rozane, Danilo-Eduardo; Hernandes, Amanda; Lapointe, Line; Hébert-Gentile, Valérie; Naess, Kristine; Marchand, Sébastien; Lafond, Jean; Mattos, Dirceu; Barlow, Philip; Natale, William

    2013-01-01

    Tissue analysis is commonly used in ecology and agronomy to portray plant nutrient signatures. Nutrient concentration data, or ionomes, belong to the compositional data class, i.e., multivariate data that are proportions of some whole, hence carrying important numerical properties. Statistics computed across raw or ordinary log-transformed nutrient data are intrinsically biased, hence possibly leading to wrong inferences. Our objective was to present a sound and robust approach based on a novel nutrient balance concept to classify plant ionomes. We analyzed leaf N, P, K, Ca, and Mg of two wild and six domesticated fruit species from Canada, Brazil, and New Zealand sampled during reproductive stages. Nutrient concentrations were (1) analyzed without transformation, (2) ordinary log-transformed as commonly but incorrectly applied in practice, (3) additive log-ratio (alr) transformed as surrogate to stoichiometric rules, and (4) converted to isometric log-ratios (ilr) arranged as sound nutrient balance variables. Raw concentration and ordinary log transformation both led to biased multivariate analysis due to redundancy between interacting nutrients. The alr- and ilr-transformed data provided unbiased discriminant analyses of plant ionomes, where wild and domesticated species formed distinct groups and the ionomes of species and cultivars were differentiated without numerical bias. The ilr nutrient balance concept is preferable to alr, because the ilr technique projects the most important interactions between nutrients into a convenient Euclidean space. This novel numerical approach allows rectifying historical biases and supervising phenotypic plasticity in plant nutrition studies. PMID:23526060
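
    The alr and ilr transformations discussed above have compact standard definitions; a minimal sketch (the grouping of nutrients into balances is an assumption for illustration, e.g. [N, P, K] vs [Ca, Mg]):

```python
import math

def alr(parts):
    """Additive log-ratio: log of each part relative to the last part
    (the choice of reference part is arbitrary, a known drawback of alr)."""
    ref = parts[-1]
    return [math.log(p / ref) for p in parts[:-1]]

def ilr_balance(numerator, denominator):
    """A single isometric log-ratio 'balance' contrasting two groups of
    parts: a scaled log-ratio of their geometric means."""
    r, s = len(numerator), len(denominator)
    log_gn = sum(math.log(p) for p in numerator) / r    # log geometric mean
    log_gd = sum(math.log(p) for p in denominator) / s
    return math.sqrt(r * s / (r + s)) * (log_gn - log_gd)

print(ilr_balance([2.0, 2.0], [2.0]))  # → 0.0 (no contrast between groups)
```

    A full ilr basis uses D-1 such orthogonal balances for D nutrients, yielding coordinates in an unconstrained Euclidean space where standard multivariate statistics apply without the redundancy bias of raw concentrations.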

  18. Instructional Conversations and Literature Logs. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2006

    2006-01-01

    This What Works Clearinghouse (WWC) report examines the effect of "Instructional Conversations and Literature Logs" used in combination. The goal of "Instructional Conversations" is to help English language learners develop reading comprehension ability along with English language proficiency. "Instructional…

  19. Required Operational Capability, USMC-ROC-LOG-216.3.5 for the Ration, Cold Weather.

    DTIC Science & Technology

    1987-05-06

    ...in operations or training in an arctic environment. b. Organizational Concept. The ration, cold weather, will be issued in accordance with established ... all services. 7. Technical Feasibility and Energy/Environmental Impacts. a. Technical Feasibility. The risk of developing the ration ...

  20. Bushmeat supply and consumption in a tropical logging concession in northern Congo.

    PubMed

    Poulsen, J R; Clark, C J; Mavah, G; Elkan, P W

    2009-12-01

    Unsustainable hunting of wildlife for food empties tropical forests of many species critical to forest maintenance and livelihoods of forest people. Extractive industries, including logging, can accelerate exploitation of wildlife by opening forests to hunters and creating markets for bushmeat. We monitored human demographics, bushmeat supply in markets, and household bushmeat consumption in five logging towns in the northern Republic of Congo. Over 6 years we recorded 29,570 animals in town markets and collected 48,920 household meal records. Development of industrial logging operations led to a 69% increase in the population of logging towns and a 64% increase in bushmeat supply. The immigration of workers, jobseekers, and their families altered hunting patterns and was associated with increased use of wire snares and increased diversity in the species hunted and consumed. Immigrants hunted 72% of all bushmeat, which suggests the short-term benefits of hunting accrue disproportionately to "outsiders" to the detriment of indigenous peoples who have prior, legitimate claims to wildlife resources. Our results suggest that the greatest threat of logging to biodiversity may be the permanent urbanization of frontier forests. Although enforcement of hunting laws and promotion of alternative sources of protein may help curb the pressure on wildlife, the best strategy for biodiversity conservation may be to keep saw mills and the towns that develop around them out of forests.

  1. QSPR study of polychlorinated diphenyl ethers by molecular electronegativity distance vector (MEDV-4).

    PubMed

    Sun, Lili; Zhou, Liping; Yu, Yu; Lan, Yukun; Li, Zhiliang

    2007-01-01

    Polychlorinated diphenyl ethers (PCDEs) have attracted increasing concern as a group of ubiquitous potential persistent organic pollutants (POPs). By using the molecular electronegativity distance vector (MEDV-4), multiple linear regression (MLR) models are developed for sub-cooled liquid vapor pressures (P(L)), n-octanol/water partition coefficients (K(OW)) and sub-cooled liquid water solubilities (S(W,L)) of 209 PCDEs and diphenyl ether. The correlation coefficients (R) and the leave-one-out cross-validation (LOO) correlation coefficients (R(CV)) of all the 6-descriptor models for logP(L), logK(OW) and logS(W,L) exceed 0.98. By using stepwise multiple regression (SMR), the descriptors are selected, and the resulting models are a 5-descriptor model for logP(L), a 4-descriptor model for logK(OW), and a 6-descriptor model for logS(W,L), respectively. All these models exhibit excellent estimation capabilities for the internal sample set and good predictive capabilities for the external sample set. The consistency between observed and estimated/predicted values is best for logP(L) (R=0.996, R(CV)=0.996), followed by logK(OW) (R=0.992, R(CV)=0.992) and logS(W,L) (R=0.983, R(CV)=0.980). By using MEDV-4 descriptors, the QSPR models can be used for prediction, and the model predictions can hence extend the current database of experimental values.
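The workflow the abstract describes, MLR fitting plus leave-one-out cross-validation, can be sketched as below; the descriptor matrix and coefficient values here are synthetic stand-ins, not actual MEDV-4 descriptors.

```python
import numpy as np

def fit_mlr(X, y):
    """Least-squares fit of y = b0 + X.b; returns coefficients with intercept first."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def pearson_r(y, y_hat):
    return np.corrcoef(y, y_hat)[0, 1]

def loo_r(X, y):
    """Leave-one-out cross-validated correlation (R_CV): refit without each
    sample in turn, predict it, then correlate predictions with observations."""
    preds = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        coef = fit_mlr(X[mask], y[mask])
        preds[i] = coef[0] + X[i] @ coef[1:]
    return pearson_r(y, preds)

# Synthetic demo: 30 "compounds" with 4 hypothetical descriptors
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
true_b = np.array([1.5, -0.8, 0.3, 2.0])
y = X @ true_b + 0.5 + rng.normal(scale=0.1, size=30)  # a logKow-like property

coef = fit_mlr(X, y)
r = pearson_r(y, np.column_stack([np.ones(30), X]) @ coef)
r_cv = loo_r(X, y)
print(f"R = {r:.3f}, R_CV = {r_cv:.3f}")
```

A large gap between R and R_CV would signal overfitting; the abstract's near-identical values indicate stable models.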

  2. Ceramic vacuum tubes for geothermal well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, R.D.

    1977-01-01

    Useful design data acquired in the evaluation of ceramic vacuum tubes for the development of a 500°C instrumentation amplifier are presented. The general requirements for ceramic vacuum tubes are discussed for application to the development of high-temperature well logs. Commercially available tubes are described, and future contract activities that specifically relate to ceramic vacuum tubes are detailed. Supplemental data are presented in the appendix.

  3. A Computer Vision System for Locating and Identifying Internal Log Defects Using CT Imagery

    Treesearch

    Dongping Zhu; Richard W. Conners; Frederick Lamb; Philip A. Araman

    1991-01-01

    A number of researchers have shown the ability of magnetic resonance imaging (MRI) and computer tomography (CT) imaging to detect internal defects in logs. However, if these devices are ever to play a role in the forest products industry, automatic methods for analyzing data from these devices must be developed. This paper reports research aimed at developing a...

  4. Performance of a completely automated system for monitoring CMV DNA in plasma.

    PubMed

    Mengelle, C; Sandres-Sauné, K; Mansuy, J-M; Haslé, C; Boineau, J; Izopet, J

    2016-06-01

    Completely automated systems for monitoring CMV-DNA in plasma samples are now available. The aim was to evaluate the analytical and clinical performance of the VERIS™/MDx System CMV Assay(®). Analytical performance was assessed using quantified quality controls. Clinical performance was assessed by comparison with the COBAS(®) Ampliprep™/COBAS(®) Taqman CMV test using 169 plasma samples that had tested positive with the in-house technique in whole blood. The specificity of the VERIS™/MDx System CMV Assay(®) was 99% [CI 95%: 97.7-100]. Intra-assay reproducibilities were 0.03, 0.04, 0.05 and 0.04 log10IU/ml (means 2.78, 3.70, 4.64 and 5.60 log10IU/ml) for expected values of 2.70, 3.70, 4.70 and 5.70 log10IU/ml. The inter-assay reproducibilities were 0.12 and 0.08 (means 6.30 and 2.85 log10IU/ml) for expected values of 6.28 and 2.80 log10IU/ml. The lower limit of detection was 14.6 IU/ml, and the assay was linear from 2.34 to 5.58 log10IU/ml. The results for the positive samples were concordant (r=0.71, p<0.0001; slope of Deming regression 0.79 [CI 95%: 0.56-1.57] and y-intercept 0.79 [CI 95%: 0.63-0.95]). The VERIS™/MDx System CMV Assay(®) detected 18 more positive samples than did the COBAS(®) Ampliprep™/COBAS(®) Taqman CMV test, and the mean virus load was higher (by 0.41 log10IU/ml). Patient monitoring on 68 samples collected from 17 immunosuppressed patients showed similar trends between the two assays. As a secondary question, virus loads detected by the VERIS™/MDx System CMV Assay(®) were compared to those of the in-house procedure on whole blood. The results were similar between the two assays (-0.09 log10IU/ml), as were the patient monitoring trends. The performance of the VERIS™/MDx System CMV Assay(®) facilitated its routine use in monitoring CMV-DNA loads in plasma samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Parallel algorithms for computation of the manipulator inertia matrix

    NASA Technical Reports Server (NTRS)

    Amin-Javaheri, Masoud; Orin, David E.

    1989-01-01

    The development of an O(log2N) parallel algorithm for the manipulator inertia matrix is presented. It is based on the most efficient serial algorithm which uses the composite rigid body method. Recursive doubling is used to reformulate the linear recurrence equations which are required to compute the diagonal elements of the matrix. It results in O(log2N) levels of computation. Computation of the off-diagonal elements involves N linear recurrences of varying-size and a new method, which avoids redundant computation of position and orientation transforms for the manipulator, is developed. The O(log2N) algorithm is presented in both equation and graphic forms which clearly show the parallelism inherent in the algorithm.
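The recursive-doubling reformulation can be illustrated on a scalar first-order recurrence x_i = a_i·x_{i-1} + b_i (the paper's recurrences involve position and orientation transforms, but the combining rule is analogous). This sketch runs the ⌈log2 N⌉ rounds serially; each inner loop is the work that would execute in parallel:

```python
def recurrence_serial(a, b, x0):
    """Reference O(N) serial evaluation of x_i = a_i*x_{i-1} + b_i."""
    x, out = x0, []
    for ai, bi in zip(a, b):
        x = ai * x + bi
        out.append(x)
    return out

def recurrence_doubling(a, b, x0):
    """Recursive doubling: treat each step as an affine map x -> a*x + b and
    compose maps over spans whose length doubles each round (O(log N) rounds)."""
    n = len(a)
    A, B = list(a), list(b)          # A[i], B[i]: composed map ending at step i
    d = 1
    while d < n:
        newA, newB = A[:], B[:]
        for i in range(d, n):        # these updates are independent: parallel
            newA[i] = A[i] * A[i - d]            # compose with span ending at i-d
            newB[i] = A[i] * B[i - d] + B[i]
        A, B = newA, newB
        d *= 2
    return [A[i] * x0 + B[i] for i in range(n)]

a = [2.0, 0.5, 1.0, 3.0, 1.5, 0.25, 2.0, 1.0]
b = [1.0, -1.0, 0.5, 0.0, 2.0, 1.0, -0.5, 0.25]
print(recurrence_doubling(a, b, 1.0)[-1])  # 5.0, matching the serial result
```

With N = 8 steps the doubling version needs only 3 rounds, which is the source of the O(log2 N) depth claimed for the inertia-matrix diagonal.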

  6. A multiplex real-time PCR assay for identification of Pneumocystis jirovecii, Histoplasma capsulatum, and Cryptococcus neoformans/Cryptococcus gattii in samples from AIDS patients with opportunistic pneumonia.

    PubMed

    Gago, Sara; Esteban, Cristina; Valero, Clara; Zaragoza, Oscar; Puig de la Bellacasa, Jorge; Buitrago, María José

    2014-04-01

    A molecular diagnostic technique based on real-time PCR was developed for the simultaneous detection of three of the most frequent causative agents of fungal opportunistic pneumonia in AIDS patients: Pneumocystis jirovecii, Histoplasma capsulatum, and Cryptococcus neoformans/Cryptococcus gattii. This technique was tested in cultured strains and in clinical samples from HIV-positive patients. The methodology used involved species-specific molecular beacon probes targeted to the internal transcribed spacer regions of the rDNA. An internal control was also included in each assay. The multiplex real-time PCR assay was tested in 24 clinical strains and 43 clinical samples from AIDS patients with proven fungal infection. The technique developed showed high reproducibility (r(2) of >0.98) and specificity (100%). For H. capsulatum and Cryptococcus spp., the detection limits of the method were 20 and 2 fg of genomic DNA/20 μl reaction mixture, respectively, while for P. jirovecii the detection limit was 2.92 log10 copies/20 μl reaction mixture. The sensitivity in vitro was 100% for clinical strains and 90.7% for clinical samples. The assay was positive for 92.5% of the patients. For one of the patients with proven histoplasmosis, P. jirovecii was also detected in a bronchoalveolar lavage sample. No PCR inhibition was detected. This multiplex real-time PCR technique is fast, sensitive, and specific and may have clinical applications.

  7. Subsurface Formation Evaluation on Mars: Application of Methods from the Oil Patch

    NASA Astrophysics Data System (ADS)

    Passey, Q. R.

    2006-12-01

    The ability to drill 10- to 100-meter deep wellbores on Mars would allow for evaluation of shallow subsurface formations enabling the extension of current interpretations of the geologic history of this planet; moreover, subsurface access is likely to provide direct evidence to determine if water or permafrost is present. Methodologies for evaluating sedimentary rocks using drill holes and in situ sample and data acquisition are well developed here on Earth. Existing well log instruments can measure K, Th, and U from natural spectral gamma-ray emission, compressional and shear acoustic velocities, electrical resistivity and dielectric properties, bulk density (Cs-137 or Co-60 source), photoelectric absorption of gamma-rays (sensitive to the atomic number), hydrogen index from epithermal and thermal neutron scattering and capture, free hydrogen in water molecules from nuclear magnetic resonance, formation capture cross section, temperature, pressure, and elemental abundances (C, O, Si, Ca, H, Cl, Fe, S, and Gd) using 14 MeV pulsed neutron activation more elements possible with supercooled Ge detectors. Additionally, high-resolution wellbore images are possible using a variety of optical, electrical, and acoustic imaging tools. In the oil industry, these downhole measurements are integrated to describe potential hydrocarbon reservoir properties: lithology, mineralogy, porosity, depositional environment, sedimentary and structural dip, sedimentary features, fluid type (oil, gas, or water), and fluid amount (i.e., saturation). In many cases it is possible to determine the organic-carbon content of hydrocarbon source rocks from logs (if the total organic carbon content is 1 wt% or greater), and more accurate instruments likely could be developed. 
Since Martian boreholes will likely be drilled without using opaque drilling fluids (as generally used in terrestrial drilling), additional instruments can be used such as high resolution direct downhole imaging and other surface contact measurements (such as IR spectroscopy and x-ray fluorescence). However, such wellbores would require modification of some instruments since conventional drilling fluids often provide the coupling of the instrument sensors to the formation (e.g., sonic velocity and galvanic resistivity measurements). The ability to drill wellbores on Mars opens up new opportunities for exploration but also introduces additional technical challenges. Currently it is not known if all existing terrestrial logging instruments can be miniaturized sufficiently for a shallow Mars wellbore, but the existing well logging techniques and instruments provide a solid framework on which to build a Martian subsurface evaluation program.

  8. Setting analyst: A practical harvest planning technique

    Treesearch

    Olivier R.M. Halleux; W. Dale Greene

    2001-01-01

    Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...

  9. ADULT COHO SALMON AND STEELHEAD USE OF BOULDER WEIRS IN SOUTHWEST OREGON STREAMS

    EPA Science Inventory

    The placement of log and boulder structures in streams is a common and often effective technique for improving juvenile salmonid rearing habitat and increasing fish densities. Less frequently examined has been the use of these structures by adult salmonids. In 2004, spawner densi...

  10. Stabilization techniques for reactive aggregate in soil-cement base course.

    DOT National Transportation Integrated Search

    2003-01-01

    Anhydrite (CaSO4) beds occur as a cap rock on a salt dome in Winn Parish in north Louisiana. Locally known as Winn Rock, it has been quarried for gravel for road building. It has been used as a surface course for local parish and logging roads. Stabi...

  11. Techniques for estimating flood-peak discharges of rural, unregulated streams in Ohio

    USGS Publications Warehouse

    Koltun, G.F.

    2003-01-01

    Regional equations for estimating 2-, 5-, 10-, 25-, 50-, 100-, and 500-year flood-peak discharges at ungaged sites on rural, unregulated streams in Ohio were developed by means of ordinary and generalized least-squares (GLS) regression techniques. One-variable, simple equations and three-variable, full-model equations were developed on the basis of selected basin characteristics and flood-frequency estimates determined for 305 streamflow-gaging stations in Ohio and adjacent states. The average standard errors of prediction ranged from about 39 to 49 percent for the simple equations, and from about 34 to 41 percent for the full-model equations. Flood-frequency estimates determined by means of log-Pearson Type III analyses are reported along with weighted flood-frequency estimates, computed as a function of the log-Pearson Type III estimates and the regression estimates. Values of explanatory variables used in the regression models were determined from digital spatial data sets by means of a geographic information system (GIS), with the exception of drainage area, which was determined by digitizing the area within basin boundaries manually delineated on topographic maps. Use of GIS-based explanatory variables represents a major departure in methodology from that described in previous reports on estimating flood-frequency characteristics of Ohio streams. Examples are presented illustrating application of the regression equations to ungaged sites on ungaged and gaged streams. A method is provided to adjust regression estimates for ungaged sites by use of weighted and regression estimates for a gaged site on the same stream. A region-of-influence method, which employs a computer program to estimate flood-frequency characteristics for ungaged sites based on data from gaged sites with similar characteristics, was also tested and compared to the GLS full-model equations. 
For all recurrence intervals, the GLS full-model equations had superior prediction accuracy relative to the simple equations and therefore are recommended for use.
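Weighted flood-frequency estimates of this kind are commonly formed by inverse-variance weighting of the station (log-Pearson Type III) and regression estimates in log space; the sketch below shows that generic convention, not necessarily the report's exact formula, and the numbers are hypothetical.

```python
def weighted_estimate(log_q_station, var_station, log_q_regression, var_regression):
    """Inverse-variance weighted combination of two independent log-space
    flood-peak estimates (generic sketch of the weighting idea)."""
    w1 = 1.0 / var_station
    w2 = 1.0 / var_regression
    return (w1 * log_q_station + w2 * log_q_regression) / (w1 + w2)

# Hypothetical 100-year peak estimates, in log10 cubic feet per second
lq = weighted_estimate(4.10, 0.02, 4.30, 0.04)
print(round(lq, 3))  # 4.167 -- between the inputs, nearer the lower-variance one
```

The combined value always lies between the two estimates and leans toward whichever has the smaller variance, which is why station weighting improves nearby ungaged-site estimates.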

  12. Is skin penetration a determining factor in skin sensitization ...

    EPA Pesticide Factsheets

    Summary: Background. It is widely accepted that substances that cannot penetrate through the skin will not be sensitisers. Thresholds based on relevant physicochemical parameters, such as a LogKow > 1 and a MW limit, have been proposed; the aim here was to test whether LogKow > 1 is a true requirement for sensitisation. Methods. A large dataset of substances that had been evaluated for their skin sensitisation potential, together with measured LogKow values, was compiled from the REACH database. The incidence of skin sensitisers relative to non-skin sensitisers below and above the LogKow = 1 threshold was evaluated. Results. 1482 substances with associated skin sensitisation outcomes and measured LogKow values were identified. 305 substances had a measured LogKow < 0 and, of those, 38 were sensitisers. Conclusions. There was no significant difference in the incidence of skin sensitisation above and below the LogKow = 1 threshold. Reaction chemistry considerations could explain the skin sensitisation observed for the 38 sensitisers with a LogKow < 0. The LogKow threshold is a "self-evident truth" borne of the widespread misconception that the ability to efficiently penetrate the stratum corneum is a key determinant of skin sensitisation potential and potency. This work uses the extracted REACH data to test the validity of common assumptions in the skin sensitisation AOP and builds toward a proof-of-concept IATA.
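The incidence comparison described above amounts to testing whether the sensitiser proportion differs below versus above the threshold. A minimal sketch using a two-proportion z-test follows; the counts are hypothetical, since the abstract reports only some of the actual figures.

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test: does sensitiser incidence x1/n1 (below the
    LogKow threshold) differ from x2/n2 (above it)? Returns (z, two-sided p)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))     # pooled standard error
    z = (p1 - p2) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_two_sided

# Hypothetical counts: 60/500 sensitisers below threshold, 130/982 above
z, p = two_proportion_z(60, 500, 130, 982)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With proportions this close, p is far above 0.05, mirroring the abstract's "no significant difference" conclusion.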

  13. The SHOLO mill: return on investment versus mill design

    Treesearch

    Hugh W. Reynolds; Charles J. Gatchell; Charles J. Gatchell

    1971-01-01

    The newly developed SHOLO (from SHOrt Log) process can be used to convert low-grade hardwood logs into parts for standard warehouse pallets and pulp chips. Should you build a SHOLO mill? This paper has been prepared to help you decide.

  14. Application of digital image processing techniques to astronomical imagery, 1979

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1979-01-01

    Several areas of applications of image processing to astronomy were identified and discussed. These areas include: (1) deconvolution for atmospheric seeing compensation; a comparison between maximum entropy and conventional Wiener algorithms; (2) polarization in galaxies from photographic plates; (3) time changes in M87 and methods of displaying these changes; (4) comparing emission line images in planetary nebulae; and (5) log intensity, hue saturation intensity, and principal component color enhancements of M82. Examples are presented of these techniques applied to a variety of objects.

  15. Modeling complex aquifer systems: a case study in Baton Rouge, Louisiana (USA)

    NASA Astrophysics Data System (ADS)

    Pham, Hai V.; Tsai, Frank T.-C.

    2017-05-01

    This study targets two challenges in groundwater model development: grid generation and model calibration for aquifer systems that are fluvial in origin. Realistic hydrostratigraphy can be developed using a large quantity of well log data to capture the complexity of an aquifer system. However, generating valid groundwater model grids to be consistent with the complex hydrostratigraphy is non-trivial. Model calibration can also become intractable for groundwater models that intend to match the complex hydrostratigraphy. This study uses the Baton Rouge aquifer system, Louisiana (USA), to illustrate a technical need to cope with grid generation and model calibration issues. A grid generation technique is introduced based on indicator kriging to interpolate 583 wireline well logs in the Baton Rouge area to derive a hydrostratigraphic architecture with fine vertical discretization. Then, an upscaling procedure is developed to determine a groundwater model structure with 162 layers that captures facies geometry in the hydrostratigraphic architecture. To handle model calibration for such a large model, this study utilizes a derivative-free optimization method in parallel computing to complete parameter estimation in a few months. The constructed hydrostratigraphy indicates the Baton Rouge aquifer system is fluvial in origin. The calibration result indicates hydraulic conductivity for Miocene sands is higher than that for Pliocene to Holocene sands and indicates the Baton Rouge fault and the Denham Springs-Scotlandville fault to be low-permeability leaky aquifers. The modeling result shows significantly low groundwater level in the "2,000-foot" sand due to heavy pumping, indicating potential groundwater upward flow from the "2,400-foot" sand.

  16. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server). 
    The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a mySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger Visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
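The instrumentation pattern the toolkit prescribes, timestamped key=value event records at the critical points of an operation, can be sketched as below. The function and field names here are illustrative only, not the actual NetLogger API; the record shape loosely follows the ULM-style text format.

```python
import socket
import sys
import time

def log_event(event, stream=sys.stdout, **fields):
    """Emit one timestamped, ULM-style key=value event record
    (illustrative sketch -- not the actual NetLogger library API)."""
    micros = time.time_ns() % 1_000_000_000 // 1000
    ts = time.strftime("%Y%m%d%H%M%S", time.gmtime()) + f".{micros:06d}"
    parts = [f"DATE={ts}", f"HOST={socket.gethostname()}", f"NL.EVNT={event}"]
    parts += [f"{k.upper()}={v}" for k, v in fields.items()]
    stream.write(" ".join(parts) + "\n")

# Bracket a critical section with start/end events, as the toolkit prescribes:
log_event("TransferStart", prog="demo", size=1048576)
data_moved = 1048576          # ... application work happens here ...
log_event("TransferEnd", prog="demo", size=data_moved)
```

Because every record carries a high-resolution timestamp and host name, correlating start/end pairs across components is a simple sort-and-match over the merged logs, provided the clocks are synchronized.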

  17. Interaction of Cesium Ions with Calix[4]arene-bis(t-octylbenzo-18-crown-6): NMR and Theoretical Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kriz, Jaroslav; Dybal, Jiri; Vanura, Petr

    2011-01-01

    Using 1H, 13C, and 133Cs NMR spectra, it is shown that calix[4]arene-bis(t-octylbenzo-18-crown-6) (L) forms complexes with one (L·Cs+) and two (L·2Cs+) Cs+ ions offered by cesium bis(1,2-dicarbollide) cobaltate (CsDCC) in nitrobenzene-d5. The ions interact with all six oxygen atoms in the crown-ether ring and with the electrons of the calixarene aromatic moieties. According to the extraction technique, the stability constant of the first complex is log Bnb(L·Cs+) = 8.8 ± 0.1. According to 133Cs NMR spectra, the value of the equilibrium constant of the second complex is log Knb(2)(L·2Cs+) = 6.3 ± 0.2, i.e., its stabilization constant is log Bnb(L·2Cs+) = 15.1 ± 0.3. Self-diffusion measurements by 1H pulsed-field-gradient (PFG) NMR combined with density functional theory (DFT) calculations suggest that one DCC ion is tightly associated with L·Cs+, decreasing its positive charge and consequently stabilizing the second complex, L·2Cs+. Using a saturation-transfer 133Cs NMR technique, the correlation times of chemical exchange between L·Cs+ and L·2Cs+, as well as between L·2Cs+ and free Cs+ ions, were determined as 33.6 and 29.2 ms, respectively.

  18. Process mining techniques: an application to time management

    NASA Astrophysics Data System (ADS)

    Khowaja, Ali Raza

    2018-04-01

    In any environment, people must ensure that all of their work is completed on time and to an adequate standard of quality. To realize the full potential of process mining, one needs to understand the underlying processes in detail. Personal information and communication have long been prominent topics on the internet, but information and communication tools in daily life also capture schedules, locations, and environmental context; more generally, social media applications make such data available for analysis through event logs, supporting both process analysis and combined environmental and location analysis. Process mining can exploit these real-life processes with the help of event logs already present in such datasets, whether user-censored or user-labeled. These techniques can be used to redesign a user's workflow and to understand daily processes in greater detail: one way to raise the quality of everyday processes is to examine each one closely and, after analysis, make changes to obtain better results. Here, we applied process mining techniques to a dataset of seven subjects, combined into a single dataset collected in Korea. The paper comments on the efficiency of the processes in the event logs as it relates to time management.
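A minimal sketch of one core process-mining step is discovering a directly-follows graph from an event log: counting, across traces, how often one activity immediately follows another. The daily-schedule activity names below are hypothetical.

```python
from collections import Counter

def directly_follows(event_log):
    """Discover a directly-follows graph from an event log: for each trace
    (one case's ordered activities), count pairs (a, b) where b follows a."""
    dfg = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Toy daily-schedule log: each trace is one day's ordered activities
log = [
    ["wake", "commute", "work", "lunch", "work", "commute", "sleep"],
    ["wake", "gym", "commute", "work", "lunch", "work", "sleep"],
    ["wake", "commute", "work", "work", "commute", "sleep"],
]
dfg = directly_follows(log)
print(dfg[("work", "lunch")])    # 2
print(dfg[("wake", "commute")])  # 2
```

Edge frequencies like these are the raw material for the process models and time-management insights the paper discusses: frequent edges show routine, rare edges show deviations.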

  19. Controls on the physical properties of gas-hydrate-bearing sediments because of the interaction between gas hydrate and porous media

    USGS Publications Warehouse

    Lee, Myung W.; Collett, Timothy S.

    2005-01-01

    Physical properties of gas-hydrate-bearing sediments depend on the pore-scale interaction between gas hydrate and porous media as well as the amount of gas hydrate present. Well log measurements such as proton nuclear magnetic resonance (NMR) relaxation and electromagnetic propagation tool (EPT) techniques depend primarily on the bulk volume of gas hydrate in the pore space irrespective of the pore-scale interaction. However, elastic velocities or permeability depend on how gas hydrate is distributed in the pore space as well as the amount of gas hydrate. Gas-hydrate saturations estimated from NMR and EPT measurements are free of adjustable parameters; thus, the estimations are unbiased estimates of gas hydrate if the measurement is accurate. However, the amount of gas hydrate estimated from elastic velocities or electrical resistivities depends on many adjustable parameters and models related to the interaction of gas hydrate and porous media, so these estimates are model dependent and biased. NMR, EPT, elastic-wave velocity, electrical resistivity, and permeability measurements acquired in the Mallik 5L-38 well in the Mackenzie Delta, Canada, show that all of the well log evaluation techniques considered provide comparable gas-hydrate saturations in clean (low shale content) sandstone intervals with high gas-hydrate saturations. However, in shaly intervals, estimates from log measurement depending on the pore-scale interaction between gas hydrate and host sediments are higher than those estimates from measurements depending on the bulk volume of gas hydrate.

  20. Factors influencing the inactivation of Alicyclobacillus acidoterrestris spores exposed to high hydrostatic pressure in apple juice

    NASA Astrophysics Data System (ADS)

    Sokołowska, B.; Skąpska, S.; Fonberg-Broczek, M.; Niezgoda, J.; Chotkiewicz, M.; Dekowska, A.; Rzoska, S. J.

    2013-03-01

    Alicyclobacillus acidoterrestris, a thermoacidophilic and spore-forming bacterium, survives the typical pasteurization process and can cause the spoilage of juices, producing compounds associated with a disinfectant-like odour (guaiacol, 2,6-dibromophenol, 2,6-dichlorophenol). Therefore, the use of other, more effective techniques such as high hydrostatic pressure (HHP) is being considered for preserving juices. The aim of this study was to search for factors affecting the resistance of A. acidoterrestris spores to HHP. A baroprotective effect of increased solute concentration in apple juice on A. acidoterrestris spores during high-pressure processing was observed. During the 45 min pressurization (200 MPa, 50°C) of the spores in concentrated apple juice (71.1°Bx), no significant changes were observed in their number. However, in the juices with a soluble solids content of 35.7, 23.6 and 11.2°Bx, the reduction in spores was 1.3-2.4 log, 2.6-3.3 log and 2.8-4.0 log, respectively. No clear effect of spore age on survival under high-pressure conditions was found. Spores surviving pressurization and subjected to subsequent HHP treatment showed increased resistance to pressure, by as much as 2.0 log.
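The spore reductions above are decimal (log10) reductions, i.e., log10 of the ratio of counts before and after treatment. A quick sketch with hypothetical counts:

```python
from math import log10

def log_reduction(n0, n):
    """Decimal (log10) reduction of a viable spore count after treatment."""
    return log10(n0 / n)

# Hypothetical example: treatment reduces 1e6 spores/ml to about 10**3.4 spores/ml
print(round(log_reduction(1e6, 10**3.4), 2))  # 2.6
```

So a "2.6 log" reduction means the surviving population is roughly 1/400th of the initial one.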
