Sample records for advanced logging techniques

  1. Acoustic waveform logging--Advances in theory and application

    USGS Publications Warehouse

    Paillet, F.L.; Cheng, C.H.; Pennington, W.D.

    1992-01-01

    Full-waveform acoustic logging has made significant advances in both theory and application in recent years, and these advances have greatly increased the capability of log analysts to measure the physical properties of formations. Advances in theory provide the analytical tools required to understand the properties of measured seismic waves, and to relate those properties to such quantities as shear and compressional velocity and attenuation, and primary and fracture porosity and permeability of potential reservoir rocks. The theory demonstrates that all parts of recorded waveforms are related to various modes of propagation, even in the case of dipole and quadrupole source logging. The theory also indicates that these mode properties can be used to design velocity and attenuation picking schemes, and shows how source frequency spectra can be selected to optimize results in specific applications. Synthetic microseismogram computations are an effective tool in waveform interpretation theory; they demonstrate how shear arrival picks and mode attenuation can be used to compute shear velocity and intrinsic attenuation, and formation permeability for monopole, dipole, and quadrupole sources. Array processing of multi-receiver data offers the opportunity to apply even more sophisticated analysis techniques. Synthetic microseismogram data are used to illustrate the application of the maximum-likelihood method, semblance cross-correlation, and Prony's method analysis techniques to determine seismic velocities and attenuations. The interpretation of acoustic waveform logs is illustrated by reviews of various practical applications, including synthetic seismogram generation, lithology determination, estimation of geomechanical properties in situ, permeability estimation, and design of hydraulic fracture operations.
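    Semblance cross-correlation, one of the array-processing techniques named above, can be sketched in a few lines: for each trial slowness the receiver traces are shifted by their predicted moveout and a windowed coherence is computed. This is an illustrative sketch, not the authors' implementation; the window length, slowness grid, and linear-moveout assumption are choices made here.

```python
import numpy as np

def semblance(waveforms, dt, offsets, slownesses, window=0.5e-3):
    """Brute-force slowness-time semblance for an array of sonic waveforms.

    waveforms  : (n_receivers, n_samples) array
    dt         : sample interval (s)
    offsets    : receiver offsets from the source (m)
    slownesses : trial slownesses (s/m)
    Returns a (n_slowness, n_samples) coherence map with values in [0, 1].
    """
    n_rec, n_samp = waveforms.shape
    t = np.arange(n_samp) * dt
    win = max(1, int(window / dt))
    out = np.zeros((len(slownesses), n_samp))
    for i, s in enumerate(slownesses):
        # Shift each trace back by its predicted moveout s * (offset - offset0)
        # so that an arrival with true slowness s aligns across receivers.
        shifted = np.array([
            np.interp(t, t - s * (x - offsets[0]), tr, left=0.0, right=0.0)
            for x, tr in zip(offsets, waveforms)
        ])
        # Windowed coherence: (sum of traces)^2 over N * (sum of squares).
        num = np.convolve(shifted.sum(axis=0) ** 2, np.ones(win), mode="same")
        den = n_rec * np.convolve((shifted ** 2).sum(axis=0), np.ones(win), mode="same")
        out[i] = np.where(den > 0, num / den, 0.0)
    return out
```

    The slowness at which the map peaks is the velocity pick; by the Cauchy-Schwarz inequality the coherence never exceeds 1, which it approaches when the traces align perfectly.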

  2. Research and development of improved geothermal well logging techniques, tools and components (current projects, goals and status). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamers, M.D.

    One of the key needs in the advancement of geothermal energy is the availability of adequate subsurface measurements to aid the reservoir engineer in the development and operation of geothermal wells. Some current projects sponsored by the U.S. Department of Energy's Division of Geothermal Energy pertaining to the development of improved well logging techniques, tools, and components are described. An attempt is made to show how these projects contribute to the improvement of geothermal logging technology and form key elements of the overall program goals.

  3. Increased Oil Production and Reserves from Improved Completion Techniques in the Bluebell Field, Uinta Basin, Utah

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deo, M.D.; Morgan, C.D.

    1999-04-28

    The objective of the project is to increase oil production and reserves through improved reservoir characterization and completion techniques in the Uinta Basin, Utah. To accomplish this objective, a two-year geologic and engineering characterization of the Bluebell field was conducted. The study evaluated surface and subsurface data, currently used completion techniques, and common production problems. It was determined that advanced cased- and open-hole logs could be effective in determining productive beds, and that stage-interval (about 500 ft [150 m] per stage) and bed-scale isolation completion techniques could improve well performance. In the first demonstration well (the Michelle Ute well discussed in the previous technical report), dipole shear anisotropy and dual-burst thermal decay time (TDT) logs were run before the treatment, and an isotope tracer log was run after it. The logs were very helpful in characterizing the remaining hydrocarbon potential in the well, but a mechanical failure led to a poor recompletion and no significant improvement in oil production from the well.

  4. Geophysical examination of coal deposits

    NASA Astrophysics Data System (ADS)

    Jackson, L. J.

    1981-04-01

    Geophysical techniques for the solution of mining problems and as an aid to mine planning are reviewed. Techniques of geophysical borehole logging are discussed. The responses of the coal seams to logging tools are easily recognized on the logging records. Cores for laboratory analysis are cut from selected sections of the borehole. In addition, information about the density and chemical composition of the coal may be obtained. Surface seismic reflection surveys using two-dimensional arrays of seismic sources and detectors detect faults with throws as small as 3 m at depths of 800 m. In geologically disturbed areas, good results have been obtained from three-dimensional surveys. Smaller faults as far as 500 m in advance of the working face may be detected using in-seam seismic surveying conducted from a roadway or working face. Small disturbances are detected by pulse radar and continuous-wave electromagnetic methods, either from within boreholes or from underground. Other geophysical techniques, which exploit the electrical, magnetic, gravitational, and geothermal properties of rocks, are described.

  5. Delineation of faults, fractures, foliation, and ground-water-flow zones in fractured-rock, on the southern part of Manhattan, New York, through use of advanced borehole-geophysical techniques

    USGS Publications Warehouse

    Stumm, Frederick; Chu, Anthony; Monti, Jack

    2004-01-01

    Advanced borehole-geophysical techniques were used to assess the geohydrology of crystalline bedrock in 20 boreholes on the southern part of Manhattan Island, N.Y., in preparation for construction of a third water tunnel for New York City. The borehole-logging techniques included natural gamma, single-point resistance, short-normal resistivity, mechanical and acoustic caliper, magnetic susceptibility, borehole-fluid temperature and resistivity, borehole-fluid specific conductance, dissolved oxygen, pH, redox, heat-pulse flowmeter (at selected boreholes), borehole deviation, acoustic and optical televiewer, and borehole radar (at selected boreholes). Hydraulic head and specific-capacity test data were collected from 29 boreholes. The boreholes penetrated gneiss, schist, and other crystalline bedrock that has an overall southwest to northwest-dipping foliation. Most of the fractures penetrated are nearly horizontal or have moderate- to high-angle northwest or eastward dip azimuths. Foliation dip within the potential tunnel-construction zone is northwestward and southeastward in the proposed North Water-Tunnel, northwestward to southwestward in the proposed Midtown Water-Tunnel, and northwestward to westward in the proposed South Water-Tunnel. Fracture population dip azimuths are variable. Heat-pulse flowmeter logs obtained under pumping and nonpumping (ambient) conditions, together with other geophysical logs, indicate transmissive fracture zones in each borehole. The 60-megahertz directional borehole-radar logs delineated the location and orientation of several radar reflectors that did not intersect the projection of the borehole. Fracture indexes range from 0.12 to 0.93 fractures per foot of borehole. Analysis of specific-capacity tests from each borehole indicated that transmissivity ranges from 2 to 459 feet squared per day; the highest transmissivity is at the Midtown Water-Tunnel borehole (E35ST-D).

  6. Preliminary logging analysis system (PLANS): overview.

    Treesearch

    R.H. Twito; S.E. Reutebuch; R.J. McGaughey; C.N. Mann

    1987-01-01

    The paper previews a computer-aided design system, PLANS, that is useful for developing timber harvest and road network plans on large-scale topographic maps. Earlier planning techniques are reviewed, and the advantages of using advanced planning systems like PLANS are explained. There is a brief summary of the input, output, and function of each program in the PLANS...

  7. Automated Detection of Selective Logging in Amazon Forests Using Airborne Lidar Data and Pattern Recognition Algorithms

    NASA Astrophysics Data System (ADS)

    Keller, M. M.; d'Oliveira, M. N.; Takemura, C. M.; Vitoria, D.; Araujo, L. S.; Morton, D. C.

    2012-12-01

    Selective logging, the removal of several valuable timber trees per hectare, is an important land use in the Brazilian Amazon and may degrade forests through long-term changes in structure, loss of forest carbon, and loss of species diversity. As with deforestation, the annual area affected by selective logging has declined significantly in the past decade. Nonetheless, this land use affects several thousand km² per year in Brazil. We studied a 1000 ha area of the Antimary State Forest (FEA) in the State of Acre, Brazil (9.304°S, 68.281°W), that has a basal area of 22.5 m² ha⁻¹ and an above-ground biomass of 231 Mg ha⁻¹. Logging intensity was low, approximately 10 to 15 m³ ha⁻¹. We collected small-footprint airborne lidar data using an Optech ALTM 3100EA over the study area once each in 2010 and 2011. The study area contained both recent and older logging that used both conventional and technologically advanced logging techniques. Lidar return density averaged over 20 returns m⁻² for both collection periods, with estimated horizontal and vertical precision of 0.30 and 0.15 m. A relative density model comparing returns from the 0-1 m elevation range to returns in the 1-5 m elevation range revealed the pattern of roads and skid trails. These patterns were confirmed by ground-based GPS survey. A GIS model of the road and skid network was built using lidar and ground data. We tested and compared two pattern recognition approaches used to automate logging detection. Both commercial eCognition segmentation and a Frangi filter algorithm identified the road and skid-trail network when compared against the GIS model. We report on the effectiveness of these two techniques.
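    The relative density model described above can be sketched as a simple gridded ratio of near-ground returns to understory returns. The 0-1 m and 1-5 m height bands come from the abstract; the cell size, the ratio definition, and the NaN handling are assumptions made for illustration, not the authors' processing chain.

```python
import numpy as np

def relative_density(x, y, h, cell=1.0):
    """Relative density model (RDM) on a regular grid.

    x, y : point coordinates (m); h : height above ground (m).
    Per cell, RDM = n(0-1 m) / (n(0-1 m) + n(1-5 m)); cells with no
    returns in either band are NaN. High RDM marks bare ground
    (roads, skid trails) beneath an otherwise intact canopy.
    """
    xi = ((x - x.min()) / cell).astype(int)
    yi = ((y - y.min()) / cell).astype(int)
    nx, ny = xi.max() + 1, yi.max() + 1
    low = np.zeros((ny, nx))
    mid = np.zeros((ny, nx))
    ground = (h >= 0) & (h < 1)      # near-ground returns
    under = (h >= 1) & (h < 5)       # understory returns
    np.add.at(low, (yi[ground], xi[ground]), 1)  # unbuffered accumulation
    np.add.at(mid, (yi[under], xi[under]), 1)
    tot = low + mid
    return np.where(tot > 0, low / np.maximum(tot, 1), np.nan)
```

    Thresholding the resulting raster (and cleaning it morphologically) would then yield candidate road and skid-trail pixels for the pattern-recognition step.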

  8. Successful Sampling Strategy Advances Laboratory Studies of NMR Logging in Unconsolidated Aquifers

    NASA Astrophysics Data System (ADS)

    Behroozmand, Ahmad A.; Knight, Rosemary; Müller-Petke, Mike; Auken, Esben; Barfod, Adrian A. S.; Ferré, Ty P. A.; Vilhelmsen, Troels N.; Johnson, Carole D.; Christiansen, Anders V.

    2017-11-01

    The nuclear magnetic resonance (NMR) technique has become popular in groundwater studies because it responds directly to the presence and mobility of water in a porous medium. There is a need to conduct laboratory experiments to aid in the development of NMR hydraulic conductivity models, as is typically done in the petroleum industry. However, the challenge has been obtaining high-quality laboratory samples from unconsolidated aquifers. At a study site in Denmark, we employed sonic drilling, which minimizes the disturbance of the surrounding material, and extracted twelve 7.6 cm diameter samples for laboratory measurements. We present a detailed comparison of the acquired laboratory and logging NMR data. The agreement observed between the laboratory and logging data suggests that the methodologies proposed in this study provide good conditions for studying NMR measurements of unconsolidated near-surface aquifers. Finally, we show how laboratory sample size and condition impact the NMR measurements.
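    The abstract does not specify the hydraulic-conductivity model under development, but a common petroleum-industry starting point for NMR permeability, of the kind alluded to above, is the SDR (Schlumberger-Doll Research) relation k = C·φ⁴·T2ML², where T2ML is the geometric mean of the T2 distribution. The sketch below assumes that relation; the constant C and the units (millidarcies, T2 in ms) are conventional sandstone defaults, not values from this study.

```python
import numpy as np

def sdr_permeability(phi, t2_bins, amplitudes, c=4.0):
    """SDR permeability estimate: k = c * phi**4 * T2ML**2.

    phi        : porosity (fraction)
    t2_bins    : T2 relaxation-time bin centers (ms, assumed)
    amplitudes : T2 distribution amplitudes for each bin
    c          : empirical constant (common sandstone default ~4
                 when k is in mD and T2 in ms; must be calibrated)
    """
    w = amplitudes / amplitudes.sum()
    # T2ML: amplitude-weighted geometric mean of the distribution.
    t2ml = np.exp(np.sum(w * np.log(t2_bins)))
    return c * phi ** 4 * t2ml ** 2
```

    Laboratory measurements on intact samples, like the sonic-drilled cores described here, are what allow C (and the exponents) to be recalibrated for unconsolidated aquifer material rather than consolidated reservoir rock.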

  9. Forest Road Identification and Extraction Through Advanced Log Matching Techniques

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Hu, B.; Quist, L.

    2017-10-01

    A novel algorithm for forest road identification and extraction was developed. The algorithm applied a Laplacian of Gaussian (LoG) filter to high-resolution multispectral imagery and a slope calculation to LiDAR data to extract both primary and secondary road segments in the forest area. The proposed method used road shape features to extract the road segments, which were further processed as objects with orientation preserved. The road network was generated after post-processing with tensor voting. The proposed method was tested on the Hearst forest, located in central Ontario, Canada. Based on visual examination against manually digitized roads, the majority of roads in the test area were identified and extracted by the process.

  10. Intraocular straylight and contrast sensitivity after contralateral wavefront-guided LASIK and wavefront-guided PRK for myopia.

    PubMed

    Barreto, Jackson; Barboni, Mirella T S; Feitosa-Santana, Claudia; Sato, João R; Bechara, Samir J; Ventura, Dora F; Alves, Milton Ruiz

    2010-08-01

    To compare intraocular straylight measurements and contrast sensitivity after wavefront-guided LASIK (WFG LASIK) in one eye and wavefront-guided photorefractive keratectomy (WFG PRK) in the fellow eye for myopia and myopic astigmatism correction. This was a prospective, randomized study of 22 eyes of 11 patients who underwent simultaneous WFG LASIK and WFG PRK (contralateral eye). Both groups were treated with the NIDEK Advanced Vision Excimer Laser System, and a microkeratome was used for flap creation in the WFG LASIK group. High and low contrast visual acuity, wavefront analysis, contrast sensitivity, and retinal straylight measurements were performed preoperatively and at 3, 6, and 12 months postoperatively. A third-generation straylight meter, C-Quant (Oculus Optikgeräte GmbH), was used for measuring intraocular straylight. Twelve months postoperatively, mean uncorrected distance visual acuity was -0.06 +/- 0.07 logMAR in the WFG LASIK group and -0.10 +/- 0.10 logMAR in the WFG PRK group. Mean preoperative intraocular straylight was 0.94 +/- 0.12 log(s) for the WFG LASIK group and 0.96 +/- 0.11 log(s) for the WFG PRK group. After 12 months, the mean straylight value was 1.01 +/- 0.10 log(s) for the WFG LASIK group and 0.97 +/- 0.12 log(s) for the WFG PRK group. No difference was found between techniques after 12 months (P = .306). No significant difference in photopic and mesopic contrast sensitivity between groups was noted. Intraocular straylight showed no statistically significant increase 1 year after WFG LASIK and WFG PRK. Higher-order aberrations increased significantly after surgery for both groups. Nevertheless, WFG LASIK and WFG PRK yielded excellent visual acuity and contrast sensitivity performance without significant differences between techniques.

  11. Project LOGgED ON: Advanced Science Online for Gifted Learners

    ERIC Educational Resources Information Center

    Reed, Christine; Urquhart, Jill

    2007-01-01

    Gifted students are often underserved because they do not have access to highly challenging curriculum. In October, 2002, Project LOGgED ON (www.scrolldown.com/loggedon/) at University of Virginia received federal funding from the Jacob Javits Act to tackle this issue. Those who were part of the LOGgED ON project developed advanced science…

  12. Keystroke Logging in Writing Research: Using Inputlog to Analyze and Visualize Writing Processes

    ERIC Educational Resources Information Center

    Leijten, Marielle; Van Waes, Luuk

    2013-01-01

    Keystroke logging has become instrumental in identifying writing strategies and understanding cognitive processes. Recent technological advances have refined logging efficiency and analytical outputs. While keystroke logging allows for ecological data collection, it is often difficult to connect the fine grain of logging data to the underlying…

  13. Fluid-Rock Characterization and Interactions in NMR Well Logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George J. Hirasaki; Kishore K. Mohanty

    2005-09-05

    The objective of this report is to characterize the fluid properties and fluid-rock interactions that are needed for formation evaluation by NMR well logging. The advances made in the understanding of NMR fluid properties are summarized in a chapter written for an AAPG book on NMR well logging. This includes live oils, viscous oils, natural gas mixtures, and the relation between relaxation time and diffusivity. Oil-based drilling fluids can have an adverse effect on NMR well logging if they alter the wettability of the formation. The effect of various surfactants on wettability and surface relaxivity is evaluated for silica sand. The relation between the relaxation time and diffusivity distinguishes the response of brine, oil, and gas in an NMR well log. A new NMR pulse sequence in the presence of a field gradient and a new inversion technique enable the T2 and diffusivity distributions to be displayed as a two-dimensional map. The objectives of pore morphology and rock characterization are to identify vug connectivity by using X-ray CT scans, and to improve the NMR permeability correlation. Improved estimation of permeability from NMR response is possible by using estimated tortuosity as a parameter to interpolate between two existing permeability models.

  14. Well log characterization of natural gas-hydrates

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.

    2012-01-01

    In the last 25 years there have been significant advancements in the use of well-logging tools to acquire detailed information on the occurrence of gas hydrates in nature: whereas wireline electrical resistivity and acoustic logs were formerly used to identify gas-hydrate occurrences in wells drilled in Arctic permafrost environments, more advanced wireline and logging-while-drilling (LWD) tools are now routinely used to examine the petrophysical nature of gas-hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Resistivity- and acoustic-logging tools are the most widely used for estimating the gas-hydrate content (i.e., reservoir saturations) in various sediment types and geologic settings. Recent integrated sediment coring and well-log studies have confirmed that electrical-resistivity and acoustic-velocity data can yield accurate gas-hydrate saturations in sediment grain-supported (isotropic) systems such as sand reservoirs, but more advanced log-analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. New well-logging tools designed to make directionally oriented acoustic and propagation-resistivity log measurements provide the data needed to analyze the acoustic and electrical anisotropic properties of both highly interbedded and fracture-dominated gas-hydrate reservoirs. Advancements in nuclear magnetic resonance (NMR) logging and wireline formation testing (WFT) also allow for the characterization of gas hydrate at the pore scale. Integrated NMR and formation testing studies from northern Canada and Alaska have yielded valuable insight into how gas hydrates are physically distributed in sediments and the occurrence and nature of pore fluids (i.e., free water along with clay- and capillary-bound water) in gas-hydrate-bearing reservoirs.
Information on the distribution of gas hydrate at the pore scale has provided invaluable insight on the mechanisms controlling the formation and occurrence of gas hydrate in nature along with data on gas-hydrate reservoir properties (i.e., porosities and permeabilities) needed to accurately predict gas production rates for various gas-hydrate production schemes.
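    The resistivity-based saturation estimates described above are conventionally built on the Archie relation, Sw = (a·Rw / (φ^m·Rt))^(1/n), with hydrate saturation Sh = 1 - Sw, since hydrate (like hydrocarbon) is electrically resistive relative to pore water. A minimal sketch follows; the constants a, m, and n are illustrative defaults and would need calibration to the reservoir, not values taken from this report.

```python
def hydrate_saturation(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Gas-hydrate saturation from the Archie relation.

    rt  : measured formation resistivity (ohm-m)
    rw  : connate-water resistivity (ohm-m)
    phi : porosity (fraction)
    a, m, n : empirical Archie constants (illustrative defaults here)

    Sw = (a * Rw / (phi**m * Rt)) ** (1/n); Sh = 1 - Sw, clipped to [0, 1].
    """
    sw = (a * rw / (phi ** m * rt)) ** (1.0 / n)
    return max(0.0, 1.0 - min(sw, 1.0))
```

    This isotropic form is the one that works well in grain-supported sand reservoirs; the fractured (anisotropic) systems mentioned above are precisely where such a single-exponent model breaks down and directional measurements are needed.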

  15. Gulf of Mexico Gas Hydrate Joint Industry Project Leg II logging-while-drilling data acquisition and analysis

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.; Zyrianova, Margarita V.; Mrozewski, Stefan A.; Guerin, Gilles; Cook, Ann E.; Goldberg, Dave S.

    2012-01-01

    One of the objectives of the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II (GOM JIP Leg II) was the collection of a comprehensive suite of logging-while-drilling (LWD) data within gas-hydrate-bearing sand reservoirs in order to make accurate estimates of the concentration of gas hydrates under various geologic conditions and to understand the geologic controls on the occurrence of gas hydrate at each of the sites drilled during this expedition. The LWD sensors just above the drill bit provided important information on the nature of the sediments and the occurrence of gas hydrate. There have been significant advancements in the use of downhole well-logging tools to acquire detailed information on the occurrence of gas hydrate in nature: from using electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells to the present, where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Recent integrated sediment coring and well-log studies have confirmed that electrical resistivity and acoustic velocity data can yield accurate gas hydrate saturations in sediment grain-supported (isotropic) systems such as sand reservoirs, but more advanced log analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. In support of the GOM JIP Leg II effort, well-log data montages have been compiled and presented in this report, which includes downhole logs obtained from all seven wells drilled during this expedition, with a focus on identifying and characterizing the potential gas-hydrate-bearing sedimentary section in each of the wells. Also presented and reviewed in this report are the gas-hydrate saturation and sediment porosity logs for each of the wells as calculated from available downhole well logs.

  16. X-Ray Processing of ChaMPlane Fields: Methods and Initial Results for Selected Anti-Galactic Center Fields

    NASA Astrophysics Data System (ADS)

    Hong, JaeSub; van den Berg, Maureen; Schlegel, Eric M.; Grindlay, Jonathan E.; Koenig, Xavier; Laycock, Silas; Zhao, Ping

    2005-12-01

    We describe the X-ray analysis procedure of the ongoing Chandra Multiwavelength Plane (ChaMPlane) Survey and report the initial results from the analysis of 15 selected anti-Galactic center observations (90deg

  17. Reducing logging damage

    Treesearch

    Richard D. Cosens

    1952-01-01

    Reducing logging damage to reproduction and residual stands is an important part of harvesting the old-growth forests of California. Much of the overmature timber is on areas with an acceptable stocking of advance growth. When the old trees are harvested, the advance growth is scarred, deformed, broken, or killed outright. Insects and disease attack the broken and...

  18. Gulf of Mexico Gas Hydrate Joint Industry Project Leg II logging-while-drilling data acquisition and analysis

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.; Zyrianova, Margarita V.; Mrozewski, Stefan A.; Guerin, Gilles; Cook, Ann E.; Goldberg, Dave S.

    2012-01-01

    One of the objectives of the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II (GOM JIP Leg II) was the collection of a comprehensive suite of logging-while-drilling (LWD) data within gas-hydrate-bearing sand reservoirs in order to make accurate estimates of the concentration of gas hydrates under various geologic conditions and to understand the geologic controls on the occurrence of gas hydrate at each of the sites drilled during this expedition. The LWD sensors just above the drill bit provided important information on the nature of the sediments and the occurrence of gas hydrate. There have been significant advancements in the use of downhole well-logging tools to acquire detailed information on the occurrence of gas hydrate in nature: from using electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells to the present, where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Recent integrated sediment coring and well-log studies have confirmed that electrical resistivity and acoustic velocity data can yield accurate gas hydrate saturations in sediment grain-supported (isotropic) systems such as sand reservoirs, but more advanced log analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. In support of the GOM JIP Leg II effort, well-log data montages have been compiled and presented in this report, which includes downhole logs obtained from all seven wells drilled during this expedition, with a focus on identifying and characterizing the potential gas-hydrate-bearing sedimentary section in each of the wells. Also presented and reviewed in this report are the gas-hydrate saturation and sediment porosity logs for each of the wells as calculated from available downhole well logs.

  19. Application of borehole geophysics to water-resources investigations

    USGS Publications Warehouse

    Keys, W.S.; MacCary, L.M.

    1971-01-01

    This manual is intended to be a guide for hydrologists using borehole geophysics in ground-water studies. The emphasis is on the application and interpretation of geophysical well logs, and not on the operation of a logger. It describes in detail those logging techniques that have been utilized within the Water Resources Division of the U.S. Geological Survey, and those used in petroleum investigations that have potential application to hydrologic problems. Most of the logs described can be made by commercial logging service companies, and many can be made with small water-well loggers. The general principles of each technique and the rules of log interpretation are the same, regardless of differences in instrumentation. Geophysical well logs can be interpreted to determine the lithology, geometry, resistivity, formation factor, bulk density, porosity, permeability, moisture content, and specific yield of water-bearing rocks, and to define the source, movement, and chemical and physical characteristics of ground water. Numerous examples of logs are used to illustrate applications and interpretation in various ground-water environments. The interrelations between various types of logs are emphasized, and the following aspects are described for each of the important logging techniques: Principles and applications, instrumentation, calibration and standardization, radius of investigation, and extraneous effects.

  20. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O' Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. The technique also provides a means of comparing two or more grading systems, or of comparing a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  1. Use of advanced borehole geophysical techniques to delineate fractured-rock ground-water flow and fractures along water-tunnel facilities in northern Queens County, New York

    USGS Publications Warehouse

    Stumm, Frederick; Chu, Anthony; Lange, Andrew D.; Paillet, Frederick L.; Williams, John H.; Lane, John W.

    2001-01-01

    Advanced borehole geophysical methods were used to assess the geohydrology of crystalline bedrock along the course of a new water tunnel for New York City. The logging methods include natural gamma, spontaneous potential, single-point resistance, mechanical and acoustic caliper, focused electromagnetic induction, electromagnetic resistivity, magnetic susceptibility, borehole-fluid temperature and conductance, differential temperature, heat-pulse flowmeter, acoustic televiewer, borehole deviation, optical televiewer, and borehole radar. Integrated interpretation of the geophysical logs from an 825-foot borehole (1) provided information on the extent, orientation, and structure (foliation and fractures) within the entire borehole, including intensely fractured intervals from which core recovery may be poor; (2) delineated transmissive fracture zones intersected by the borehole and provided estimates of their transmissivity and hydraulic head; and (3) enabled mapping of the location and orientation of structures at distances as much as 100 ft from the borehole. Analyses of the borehole-wall image and the geophysical logs from the borehole on Crescent Street, in northern Queens County, are presented here to illustrate the application of the methods. The borehole penetrates gneiss and other crystalline bedrock that has predominantly southeastward-dipping foliation and nearly horizontal and southeastward-dipping fractures. The heat-pulse flowmeter logs obtained under pumping and nonpumping conditions, together with the other geophysical logs, indicate five transmissive fracture zones. More than 90 percent of the open-hole transmissivity is associated with a fracture zone at 272 feet BLS (below land surface). A transmissive zone at 787 feet BLS that consists of nearly parallel fractures lies within the projected tunnel path; here the hydraulic head is 12 to 15 feet lower than that of transmissive zones above the 315-foot depth.
The 60-megahertz directional borehole radar logs indicate the location and orientation of two closely spaced radar reflectors that would intersect the projection of the borehole below its drilled depth. Subsequent excavation of the tunnel past the borehole allowed comparison of the log analysis with conditions observed in the tunnel. The tunnel was found to intersect gneiss with southeastward-dipping foliation; many nearly horizontal fractures; and a southeastward-dipping fracture zone whose location, character, and orientation were consistent with those of the mapped radar reflectors. The fracture zone produced inflow to the tunnel at a rate of 50 to 100 gallons per minute. All conditions indicated by the logging methods were consistent with those observed within the tunnel.

  2. Comparison of various techniques for calibration of AIS data

    NASA Technical Reports Server (NTRS)

    Roberts, D. A.; Yamaguchi, Y.; Lyon, R. J. P.

    1986-01-01

    The Airborne Imaging Spectrometer (AIS) samples a region which is strongly influenced by decreasing solar irradiance at longer wavelengths and strong atmospheric absorptions. Four techniques, the Log Residual, the Least Upper Bound Residual, the Flat Field Correction, and calibration using field reflectance measurements, were investigated as a means for removing these two features. Of the four techniques, field reflectance calibration proved to be superior in terms of noise and normalization. Of the other three techniques, the Log Residual was superior when applied to areas which did not contain one dominant cover type. In heavily vegetated areas, the Log Residual proved to be ineffective. After removing anomalously bright data values, the Least Upper Bound Residual proved to be almost as effective as the Log Residual in sparsely vegetated areas and much more effective in heavily vegetated areas. Of all the techniques, the Flat Field Correction was the noisiest.
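    The Log Residual technique can be summarized as removing a per-pixel factor (albedo/topography) and a per-band factor (illumination/atmosphere) in log space. A minimal sketch, assuming the data are arranged as a (pixels × bands) array of positive radiances; this follows the general Green-and-Craig-style formulation rather than the exact processing used in the paper.

```python
import numpy as np

def log_residual(cube):
    """Log Residual correction for a (n_pixels, n_bands) radiance array.

    In log space, subtract each pixel's spectral mean (topographic and
    albedo factor) and each band's spatial mean (illumination and
    atmospheric factor), then add back the grand mean so the output
    stays near unity for a featureless scene.
    """
    logx = np.log(cube)
    pix = logx.mean(axis=1, keepdims=True)   # per-pixel spectral log-mean
    band = logx.mean(axis=0, keepdims=True)  # per-band spatial log-mean
    return np.exp(logx - pix - band + logx.mean())
```

    If a scene were exactly separable into a pixel brightness factor times a band illumination factor, the residual would be identically 1; departures from 1 are the scene's intrinsic spectral contrast. The technique's weakness in heavily vegetated areas follows from the band mean being dominated by a single cover type.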

  3. Well log characterization of natural gas hydrates

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.

    2011-01-01

    In the last 25 years we have seen significant advancements in the use of downhole well logging tools to acquire detailed information on the occurrence of gas hydrate in nature: From an early start of using wireline electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells drilled in Arctic permafrost environments to today where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. The most established and well known use of downhole log data in gas hydrate research is the use of electrical resistivity and acoustic velocity data (both compressional- and shear-wave data) to make estimates of gas hydrate content (i.e., reservoir saturations) in various sediment types and geologic settings. New downhole logging tools designed to make directionally oriented acoustic and propagation resistivity log measurements have provided the data needed to analyze the acoustic and electrical anisotropic properties of both highly inter-bedded and fracture dominated gas hydrate reservoirs. Advancements in nuclear-magnetic-resonance (NMR) logging and wireline formation testing have also allowed for the characterization of gas hydrate at the pore scale. Integrated NMR and formation testing studies from northern Canada and Alaska have yielded valuable insight into how gas hydrates are physically distributed in sediments and the occurrence and nature of pore fluids (i.e., free-water along with clay and capillary bound water) in gas-hydrate-bearing reservoirs. 
Information on the distribution of gas hydrate at the pore scale has provided invaluable insight on the mechanisms controlling the formation and occurrence of gas hydrate in nature along with data on gas hydrate reservoir properties (i.e., permeabilities) needed to accurately predict gas production rates for various gas hydrate production schemes.

  4. SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabat, C; Defoor, D; Alexandrian, A

    2016-06-15

    Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, a requirement for more elaborate tracking techniques to monitor component integrity is paramount. ElektaLog files are generated every 40 milliseconds and can be analyzed to track subtle changes, providing another aspect of quality assurance. This allows for constant monitoring of fraction consistency in addition to machine reliability. With this in mind, the aim of the study was to evaluate whether ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%/2-mm criterion for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-mean-square value of the individual recorded errors during treatment. Additionally, beam-on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and a jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: The use of ElektaLog files for day-to-day evaluation of linac integrity and patient QA can be utilized for reliable analysis of system accuracy and performance.
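
    As a rough illustration of the RMS error figures quoted in the Results, the sketch below computes root-mean-square positional errors from per-sample deviations; the numbers are hypothetical, not actual ElektaLog data:

```python
import math

def rms(errors):
    """Root-mean-square of per-sample positional errors (mm)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical per-snapshot position errors (mm), one entry per 40-ms
# log sample; illustrative values only.
leaf_errors = [0.31, -0.28, 0.35, -0.30, 0.26]
jaw_errors = [0.12, -0.15, 0.18, -0.14, 0.16]
print(round(rms(leaf_errors), 3))  # -> 0.302
print(round(rms(jaw_errors), 3))   # -> 0.151
```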

  5. The Spontaneous Ray Log: A New Aid for Constructing Pseudo-Synthetic Seismograms

    NASA Astrophysics Data System (ADS)

    Quadir, Adnan; Lewis, Charles; Rau, Ruey-Juin

    2018-02-01

    Conventional synthetic seismograms for hydrocarbon exploration combine the sonic and density logs, whereas pseudo-synthetic seismograms are constructed with a density log plus a resistivity, neutron, gamma ray, or, rarely, a spontaneous potential log. Herein, we introduce a new technique for constructing a pseudo-synthetic seismogram by combining the gamma ray (GR) and self-potential (SP) logs to produce the spontaneous ray (SR) log. Three wells, each of which consisted of more than 1000 m of carbonates, sandstones, and shales, were investigated; each well was divided into 12 Groups based on formation tops, and the Pearson product-moment correlation coefficient (PCC) was calculated for each "Group" from each of the GR, SP, and SR logs. The highest PCC-valued log curves for each Group were then combined to produce a single log whose values were cross-plotted against the reference well's sonic ITT values to determine a linear transform for producing a pseudo-sonic (PS) log and, ultimately, a pseudo-synthetic seismogram. Nash-Sutcliffe efficiency (NSE) values for the pseudo-sonic logs of the three wells ranged from 78 to 83%. The technique was tested on three wells, one of which served as a blind test well, with satisfactory results. The PCC value between the composite PS (SR) log with low-density correction and the conventional sonic (CS) log was 86%. Because spontaneous potential and gamma ray logs are common in many of the world's hydrocarbon basins, this inexpensive and straightforward technique could hold significant promise in areas that need alternate ways to create pseudo-synthetic seismograms for seismic reflection interpretation.
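
    The two statistics this study relies on, the Pearson product-moment correlation coefficient (PCC) and the Nash-Sutcliffe efficiency (NSE), can be sketched as follows; the sonic and pseudo-sonic values are hypothetical, not the paper's data:

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient (PCC)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def nse(observed, modeled):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    mo = sum(observed) / len(observed)
    sse = sum((o - m) ** 2 for o, m in zip(observed, modeled))
    var = sum((o - mo) ** 2 for o in observed)
    return 1 - sse / var

# Hypothetical sonic ITT values and a pseudo-sonic log built from a
# linear transform of a combined GR/SP curve.
sonic = [90.0, 100.0, 110.0, 120.0, 130.0]
pseudo = [91.0, 101.0, 108.0, 121.0, 129.0]
print(round(pearson(sonic, pseudo), 3))  # -> 0.997
print(round(nse(sonic, pseudo), 3))      # -> 0.992
```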

  6. Ruffed grouse (Bonasa umbellus) drumming log and habitat use in Grand Teton National Park, Wyoming

    USGS Publications Warehouse

    Buhler, M.L.; Anderson, S.H.

    2001-01-01

    We described 15 Ruffed Grouse (Bonasa umbellus) drumming logs and adjacent habitat within Grand Teton National Park, Wyoming. Drumming logs and adjacent habitat differed from 30 random non-drumming sites. Drumming logs had fewer limbs (8; P = 0.003) and a smaller percentage of bark remaining (12%; P = 0.0001). These logs were in advanced stages of decay but were still firm to the touch. Additionally, drumming logs were found close to clearings but in areas with increased amounts of undergrowth and mature trees. Adjacent habitat analysis (0.04-ha circular plot centered on logs) indicated drumming locations had significantly greater average canopy height, more vegetative cover consisting of conifer and total canopy cover, and more vertical foliage between 0.3 m and 3.0 m in height. Adjacent habitat was in advanced stages of maturity as indicated by significant numbers of both large-diameter logs and large-diameter lodgepole pine (Pinus contorta) and quaking aspen (Populus tremuloides) snags. Tree species dominating the canopy and subcanopy were large-diameter Engelmann spruce (Picea engelmannii), lodgepole pine, and quaking aspen. Subalpine fir (Abies lasiocarpa) and quaking aspen saplings were more numerous at used sites. Ruffed Grouse drummed in coniferous areas within close proximity of quaking aspen.

  7. Thorium normalization as a hydrocarbon accumulation indicator for Lower Miocene rocks in Ras Ghara area, Gulf of Suez, Egypt

    NASA Astrophysics Data System (ADS)

    El-Khadragy, A. A.; Shazly, T. F.; AlAlfy, I. M.; Ramadan, M.; El-Sawy, M. Z.

    2018-06-01

    An exploration method has been developed that uses surface and aerial gamma-ray spectral measurements for petroleum prospecting in stratigraphic and structural traps. The Gulf of Suez is an important region for studying hydrocarbon potential in Egypt. The thorium normalization technique was applied to the sandstone reservoirs in the region to delineate potential hydrocarbon zones using the three spectrometric gamma-ray logs (eU, eTh, and K% logs). The method was applied to the recorded gamma-ray spectrometric logs for the Rudeis and Kareem Formations in the Ras Ghara oil field, Gulf of Suez, Egypt. The conventional well logs (gamma-ray, resistivity, neutron, density, and sonic logs) were analyzed to determine the net pay zones in the study area. The agreement ratios between the thorium normalization technique and the results of the well log analyses are high, so thorium normalization can be used as a guide to hydrocarbon accumulation in the studied reservoir rocks.
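
    One common form of the thorium normalization technique predicts uranium from the interval-average eU/eTh ratio and flags positive residuals (uranium excess) as possible hydrocarbon indicators. The sketch below uses hypothetical readings and an arbitrary anomaly threshold, not the paper's data or its exact formulation:

```python
def thorium_normalized_anomaly(eU, eTh):
    """One common form of thorium normalization: predict uranium from the
    interval-average eU/eTh ratio, then return the relative residual.
    Positive residuals (excess uranium) are the flag of interest."""
    ratio = sum(eU) / sum(eTh)                # interval-average eU/eTh
    return [(u - ratio * th) / (ratio * th) for u, th in zip(eU, eTh)]

# Hypothetical spectral gamma-ray readings (ppm) over five depth samples;
# sample 2 carries a uranium excess of the kind the technique flags.
eU = [2.0, 2.1, 3.6, 2.0, 1.9]
eTh = [8.0, 8.2, 8.1, 8.3, 7.9]
anoms = thorium_normalized_anomaly(eU, eTh)
flagged = [i for i, a in enumerate(anoms) if a > 0.2]   # 0.2 is an arbitrary cut
print(flagged)  # -> [2]
```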

  8. Heart-rot hazard is low in Abies amabilis reproduction injured by logging.

    Treesearch

    Paul E. Aho

    1960-01-01

    Clear-cut units in upper-slope forest types in western Washington and Oregon often have an understory of Pacific silver fir (Abies amabilis) at the time of logging. Foresters sometimes hesitate to preserve this advance regeneration, partly because of the possibility that heart rots infecting through logging wounds might considerably reduce the...

  9. An integrated 3D log processing optimization system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang

    2013-01-01

    An integrated 3D log processing optimization system was developed to perform 3D log generation, opening face determination, headrig log sawing simulation, flitch edging and trimming simulation, cant resawing, and lumber grading. A circular cross-section model, together with 3D modeling techniques, was used to reconstruct 3D virtual logs. Internal log defects (knots)...

  10. Petrophysical evaluation of subterranean formations

    DOEpatents

    Klein, James D; Schoderbek, David A; Mailloux, Jason M

    2013-05-28

    Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NMR logs, or any combination or subset thereof.

  11. Borehole geophysics applied to ground-water investigations

    USGS Publications Warehouse

    Keys, W.S.

    1990-01-01

    The purpose of this manual is to provide hydrologists, geologists, and others who have the necessary background in hydrogeology with the basic information needed to apply the most useful borehole-geophysical-logging techniques to the solution of problems in ground-water hydrology. Geophysical logs can provide information on the construction of wells and on the character of the rocks and fluids penetrated by those wells, as well as on changes in the character of these factors over time. The response of well logs is caused by petrophysical factors, by the quality, temperature, and pressure of interstitial fluids, and by ground-water flow. Qualitative and quantitative analysis of analog records and computer analysis of digitized logs are used to derive geohydrologic information. This information can then be extrapolated vertically within a well and laterally to other wells using logs. The physical principles by which the mechanical and electronic components of a logging system measure properties of rocks, fluids, and wells, as well as the principles of measurement, must be understood if geophysical logs are to be interpreted correctly. Planning a logging operation involves selecting the equipment and the logs most likely to provide the needed information. Information on well construction and geohydrology is needed to guide this selection. Quality control of logs is an important responsibility of both the equipment operator and the log analyst and requires both calibration and well-site standardization of equipment. Logging techniques that are widely used in ground-water hydrology or that have significant potential for application to this field include spontaneous potential, resistance, resistivity, gamma, gamma spectrometry, gamma-gamma, neutron, acoustic velocity, acoustic televiewer, caliper, and fluid temperature, conductivity, and flow.
The following topics are discussed for each of these techniques: principles and instrumentation, calibration and standardization, volume of investigation, extraneous effects, and interpretation and applications.

  12. Borehole geophysics applied to ground-water investigations

    USGS Publications Warehouse

    Keys, W.S.

    1988-01-01

    The purpose of this manual is to provide hydrologists, geologists, and others who have the necessary training with the basic information needed to apply the most useful borehole-geophysical-logging techniques to the solution of problems in ground-water hydrology. Geophysical logs can provide information on the construction of wells and on the character of the rocks and fluids penetrated by those wells, in addition to changes in the character of these factors with time. The response of well logs is caused by petrophysical factors; by the quality, temperature, and pressure of interstitial fluids; and by ground-water flow. Qualitative and quantitative analysis of the analog records and computer analysis of digitized logs are used to derive geohydrologic information. This information can then be extrapolated vertically within a well and laterally to other wells using logs. The physical principles by which the mechanical and electronic components of a logging system measure properties of rocks, fluids, and wells, as well as the principles of measurement, need to be understood to correctly interpret geophysical logs. Planning the logging operation involves selecting the equipment and the logs most likely to provide the needed information. Information on well construction and geohydrology is needed to guide this selection. Quality control of logs is an important responsibility of both the equipment operator and the log analyst and requires both calibration and well-site standardization of equipment. Logging techniques that are widely used in ground-water hydrology or that have significant potential for application to this field include spontaneous potential, resistance, resistivity, gamma, gamma spectrometry, gamma-gamma, neutron, acoustic velocity, acoustic televiewer, caliper, and fluid temperature, conductivity, and flow.
The following topics are discussed for each of these techniques: principles and instrumentation, calibration and standardization, volume of investigation, extraneous effects, and interpretation and applications.

  13. Magnetic resonance imaging in laboratory petrophysical core analysis

    NASA Astrophysics Data System (ADS)

    Mitchell, J.; Chandrasekera, T. C.; Holland, D. J.; Gladden, L. F.; Fordham, E. J.

    2013-05-01

    Magnetic resonance imaging (MRI) is a well-known technique in medical diagnosis and materials science. In the more specialized arena of laboratory-scale petrophysical rock core analysis, the role of MRI has undergone a substantial change in focus over the last three decades. Initially, alongside the continual drive to exploit higher magnetic field strengths in MRI applications for medicine and chemistry, the same trend was followed in core analysis. However, the spatial resolution achievable in heterogeneous porous media is inherently limited due to the magnetic susceptibility contrast between solid and fluid. As a result, imaging resolution at the length-scale of typical pore diameters is not practical, and so MRI of core-plugs has often been viewed as an inappropriate use of expensive magnetic resonance facilities. Recently, there has been a paradigm shift in the use of MRI in laboratory-scale core analysis. The focus is now on acquiring data in the laboratory that are directly comparable to data obtained from magnetic resonance well-logging tools (i.e., a common physics of measurement). To maintain consistency with well-logging instrumentation, it is desirable to measure distributions of transverse (T2) relaxation time, the industry-standard metric in well logging, at the laboratory scale. These T2 distributions can be spatially resolved over the length of a core-plug. The use of low-field magnets in the laboratory environment is optimal for core analysis not only because the magnetic field strength is closer to that of well-logging tools, but also because the magnetic susceptibility contrast is minimized, allowing the acquisition of quantitative image voxel (or pixel) intensities that are directly scalable to liquid volume.
Beyond simple determination of macroscopic rock heterogeneity, it is possible to utilize the spatial resolution for monitoring forced displacement of oil by water or chemical agents, determining capillary pressure curves, and estimating wettability. The history of MRI in petrophysics is reviewed and future directions considered, including advanced data processing techniques such as compressed sensing reconstruction and Bayesian inference analysis of under-sampled data. Although this review focuses on rock core analysis, the techniques described are applicable in a wider context to porous media in general, such as cements, soils, ceramics, and catalytic materials.

  14. Porosity and hydraulic conductivity estimation of the basaltic aquifer in Southern Syria by using nuclear and electrical well logging techniques

    NASA Astrophysics Data System (ADS)

    Asfahani, Jamal

    2017-08-01

    An alternative approach using nuclear neutron-porosity and electrical resistivity well logging with long (64 inch) and short (16 inch) normal techniques is proposed to estimate the porosity and the hydraulic conductivity (K) of the basaltic aquifers in Southern Syria. The method is applied to the available logs of the Kodana well in Southern Syria. The K value obtained by this technique is reasonable and comparable with the hydraulic conductivity of 3.09 m/day obtained from the pumping test carried out at the Kodana well. The proposed well-logging methodology appears promising for estimating hydraulic conductivity in basaltic environments, although more detailed research is still required to refine the technique.

  15. Deriving Criteria-supporting Benchmark Values from Empirical Response Relationships: Comparison of Statistical Techniques and Effect of Log-transforming the Nutrient Variable

    EPA Science Inventory

    In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...

  16. Fracture Characterization

    EPA Science Inventory

    The goal of this volume is to compare and assess various techniques for understanding fracture patterns at a site at Pease International Tradeport, NH, and to give an overview of the site as a whole. Techniques included are: core logging, geophysical logging, radar studies, and...

  17. Logging while fishing: An alternate method to cut and thread fishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tollefsen, E.; Crary, S.; Flores, B.

    1996-12-31

    New technology has been introduced to allow completion of the wireline logging program after the tool string has become lodged in the wellbore. Charges associated with extracting a stuck tool are substantial. These charges result from the nonproductive time during the fishing trip, an associated wiper trip, and re-logging the well. The ability to continue the logging program while retrieving the logging string from the wellbore is needed. Logging While Fishing (LWF) is a hybrid of existing technologies combined with a new sub capable of severing a cable remotely. The new method combines cut-and-thread fishing, drillpipe-conveyed logging, and bridled-tool techniques. Utilizing these techniques, it is possible to complete wireline logging operations while removing a stuck tool from the wellbore. Completing logging operations using this hybrid method saves operating companies time and money. Other benefits, depending on the situation, include reduced fishing time and an increased level of safety. This application has been demonstrated on jobs in the Gulf of Mexico, North Sea, Venezuela, and Southeast Asia.

  18. Reduction of lithologic-log data to numbers for use in the digital computer

    USGS Publications Warehouse

    Morgan, C.O.; McNellis, J.M.

    1971-01-01

    The development of a standardized system for conveniently coding lithologic-log data for use in the digital computer has long been needed. The technique suggested involves a reduction of the original written alphanumeric log to a numeric log by use of computer programs. This numeric log can then be retrieved as a written log, interrogated for pertinent information, or analyzed statistically. © 1971 Plenum Publishing Corporation.
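
    A minimal sketch of such a reduction, with a hypothetical keyword-to-code table (the 1971 coding scheme itself is not reproduced here):

```python
# Hypothetical keyword-to-code table; the actual USGS scheme defined its
# own numeric codes.
LITH_CODES = {"limestone": 10, "shale": 20, "sandstone": 30, "dolomite": 40}

def encode_log(entries):
    """Reduce written lithologic-log entries to numeric (top, base, code)
    triples that can be stored, queried, and analyzed statistically."""
    coded = []
    for top, base, description in entries:
        words = [w.strip(",.;:") for w in description.lower().split()]
        code = next((LITH_CODES[w] for w in words if w in LITH_CODES), 0)
        coded.append((top, base, code))
    return coded

written_log = [(0, 15, "Shale, gray, silty"), (15, 42, "Limestone, cherty")]
print(encode_log(written_log))  # -> [(0, 15, 20), (15, 42, 10)]
```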

  19. Design considerations for large woody debris placement in stream enhancement projects. North American Journal of Fisheries Management

    Treesearch

    Robert H. Hilderbrand; A. Dennis Lemly; C. Andrew Dolloff; Kelly L. Harpster

    1998-01-01

    Log length exerted a critical influence in stabilizing large woody debris (LWD) pieces added as an experimental stream restoration technique. Logs longer than the average bank-full channel width (5.5 m) were significantly less likely to be displaced than logs shorter than this width. The longest log in stable log groups was significantly longer than the longest log in...

  20. Does the novel lateral trauma position cause more motion in an unstable cervical spine injury than the logroll maneuver?

    PubMed

    Hyldmo, Per Kristian; Horodyski, MaryBeth; Conrad, Bryan P; Aslaksen, Sindre; Røislien, Jo; Prasarn, Mark; Rechtine, Glenn R; Søreide, Eldar

    2017-11-01

    Prehospital personnel who lack advanced airway management training must rely on basic techniques when transporting unconscious trauma patients. The supine position is associated with a loss of airway patency when compared to lateral recumbent positions. Thus, an inherent conflict exists between securing an open airway using the recovery position and maintaining spinal immobilization in the supine position. The lateral trauma position is a novel technique that aims to combine airway management with spinal precautions. The objective of this study was to compare the spinal motion allowed by the novel lateral trauma position and the well-established log-roll maneuver. Using a full-body cadaver model with an induced globally unstable cervical spine (C5-C6) lesion, we investigated the mean range of motion (ROM) produced at the site of the injury in six dimensions by performing the two maneuvers using an electromagnetic tracking device. Compared to the log-roll maneuver, the lateral trauma position caused similar mean ROM in five of the six dimensions. Only medial/lateral linear motion was significantly greater in the lateral trauma position (1.4 mm (95% confidence interval [CI] 0.4, 2.4 mm)). In this cadaver study, the novel lateral trauma position and the well-established log-roll maneuver resulted in comparable amounts of motion in an unstable cervical spine injury model. We suggest that the lateral trauma position may be considered for unconscious non-intubated trauma patients. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

    There is a gap in aerial logging techniques that has to be filled: a simple, safe, sizeable system must be developed before aerial logging becomes effective and accepted in the logging industry. This paper presents such a system, designed on simple principles with realistic cost and ecological benefits.

  2. 1987 Nuclear Science Symposium, 34th, and 1987 Symposium on Nuclear Power Systems, 19th, San Francisco, CA, Oct. 21-23, 1987, Proceedings

    NASA Astrophysics Data System (ADS)

    Armantrout, Guy A.

    1988-02-01

    The present conference considers topics in radiation detectors, advanced electronic circuits, data acquisition systems, radiation detector systems, high-energy and nuclear physics radiation detection, spaceborne instrumentation, health physics and environmental radiation detection, nuclear medicine, nuclear well logging, and nuclear reactor instrumentation. Attention is given to the response of scintillators to heavy ions, phonon-mediated particle detection, ballistic deficits in pulse-shaping amplifiers, fast analog ICs for particle physics, logic cell arrays, the CERN host interface, high performance data buses, a novel scintillating glass for high-energy physics applications, background events in microchannel plates, a tritium accelerator mass spectrometer, a novel positron tomograph, advancements in PET, cylindrical positron tomography, nuclear techniques in subsurface geology, REE borehole neutron activation, and a continuous tritium monitor for aqueous process streams.

  3. Logging while fishing technique results in substantial savings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tollefsen, E.; Everett, M.

    1996-12-01

    During wireline logging operations, tools occasionally become stuck in the borehole and require fishing. A typical fishing job can take anywhere from 1-1/2 to 4 days. In the Gulf of Mexico, a fishing job can easily cost between $100,000 and $500,000. These costs result from nonproductive time during the fishing trip, the associated wiper trip, and relogging the well. Logging while fishing (LWF) technology is a patented system capable of retrieving a stuck fish and completing the logging run during the same pipe descent. Completing logging operations using the LWF method saves time and money. The technique also provides well information where data may not otherwise have been obtained. Other benefits include reduced fishing time and an increased level of safety.

  4. Visual texture for automated characterisation of geological features in borehole televiewer imagery

    NASA Astrophysics Data System (ADS)

    Al-Sit, Waleed; Al-Nuaimy, Waleed; Marelli, Matteo; Al-Ataby, Ali

    2015-08-01

    Detailed characterisation of the structure of subsurface fractures is greatly facilitated by digital borehole logging instruments, the interpretation of which is typically time-consuming and labour-intensive. Despite recent advances towards autonomy and automation, the final interpretation remains heavily dependent on the skill, experience, alertness and consistency of a human operator. Existing computational tools fail to detect layers between rocks that do not exhibit distinct fracture boundaries, and often struggle to characterise cross-cutting layers and partial fractures. This paper presents a novel approach to the characterisation of planar rock discontinuities from digital images of borehole logs. Multi-resolution texture segmentation and pattern recognition techniques utilising Gabor filters are combined with an iterative adaptation of the Hough transform to enable non-distinct, partial, distorted and steep fractures and layers to be accurately identified and characterised in a fully automated fashion. This approach has successfully detected fractures and layers with high detection accuracy and at a relatively low computational cost.
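
    One building block of such automation can be sketched simply: a planar discontinuity crossing a borehole unrolls to a sinusoid in the televiewer image, so a Hough-style vote over (depth, amplitude, phase) can recover its parameters. This is a toy brute-force version on synthetic points, omitting the paper's Gabor-filter texture stage:

```python
import math

def hough_sinusoid(points, depths, amps, phases):
    """Brute-force Hough-style vote: a planar fracture in an unrolled borehole
    image traces depth(az) = d0 + A*sin(az + phi). Each candidate parameter
    triple collects votes from edge points lying within a small tolerance;
    the best-supported triple is returned."""
    best, best_votes = None, -1
    for d0 in depths:
        for A in amps:
            for phi in phases:
                votes = sum(1 for az, z in points
                            if abs(z - (d0 + A * math.sin(az + phi))) < 0.05)
                if votes > best_votes:
                    best, best_votes = (d0, A, phi), votes
    return best

# Synthetic edge points from a fracture at 10 m depth, 0.5 m amplitude.
pts = [(k * 2 * math.pi / 16, 10.0 + 0.5 * math.sin(k * 2 * math.pi / 16))
       for k in range(16)]
grid = hough_sinusoid(pts, depths=[9.5, 10.0, 10.5],
                      amps=[0.0, 0.5, 1.0], phases=[0.0, math.pi / 2])
print(grid)  # -> (10.0, 0.5, 0.0)
```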

  5. Measurement of the distribution of ventilation-perfusion ratios in the human lung with proton MRI: comparison with the multiple inert-gas elimination technique.

    PubMed

    Sá, Rui Carlos; Henderson, A Cortney; Simonson, Tatum; Arai, Tatsuya J; Wagner, Harrieth; Theilmann, Rebecca J; Wagner, Peter D; Prisk, G Kim; Hopkins, Susan R

    2017-07-01

    We have developed a novel functional proton magnetic resonance imaging (MRI) technique to measure the regional ventilation-perfusion (V̇A/Q̇) ratio in the lung. We conducted a comparison study of this technique in healthy subjects (n = 7, age = 42 ± 16 yr, forced expiratory volume in 1 s = 94% predicted) by comparing data measured using MRI to those obtained from the multiple inert gas elimination technique (MIGET). Regional ventilation measured in a sagittal lung slice using specific ventilation imaging was combined with proton density measured using a fast gradient-echo sequence to calculate regional alveolar ventilation, registered with perfusion images acquired using arterial spin labeling, and divided on a voxel-by-voxel basis to obtain the regional V̇A/Q̇ ratio. LogSDV̇ and LogSDQ̇, measures of heterogeneity derived from the standard deviation (log scale) of the ventilation and perfusion vs. V̇A/Q̇ ratio histograms, respectively, were calculated. On a separate day, subjects underwent study with MIGET, and LogSDV̇ and LogSDQ̇ were calculated from MIGET data using the 50-compartment model. MIGET LogSDV̇ and LogSDQ̇ were normal in all subjects. LogSDQ̇ was highly correlated between MRI and MIGET (R = 0.89, P = 0.007); the intercept was not significantly different from zero (-0.062, P = 0.65) and the slope did not differ significantly from identity (1.29, P = 0.34). MIGET and MRI measures of LogSDV̇ were well correlated (R = 0.83, P = 0.02); the intercept differed from zero (0.20, P = 0.04) and the slope deviated from the line of identity (0.52, P = 0.01). We conclude that in normal subjects there is reasonable agreement between MIGET measures of heterogeneity and those from proton MRI measured in a single slice of lung. NEW & NOTEWORTHY We report a comparison of a new proton MRI technique to measure regional V̇A/Q̇ ratio against the multiple inert gas elimination technique (MIGET). The study reports good relationships between measures of heterogeneity derived from MIGET and those derived from MRI. Although currently limited to a single-slice acquisition, these data suggest that single sagittal slice measures of the V̇A/Q̇ ratio provide an adequate means to assess heterogeneity in the normal lung. Copyright © 2017 the American Physiological Society.
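
    The LogSD heterogeneity indices referred to above are, in essence, weighted standard deviations of ln(V/Q); a minimal sketch with a hypothetical five-compartment perfusion distribution:

```python
import math

def log_sd(weights, vq_ratios):
    """Heterogeneity index: standard deviation of ln(V/Q) weighted by the
    fraction of total perfusion (LogSDQ) or ventilation (LogSDV) in each
    compartment of the recovered distribution."""
    total = sum(weights)
    w = [x / total for x in weights]
    mean_ln = sum(wi * math.log(r) for wi, r in zip(w, vq_ratios))
    var = sum(wi * (math.log(r) - mean_ln) ** 2 for wi, r in zip(w, vq_ratios))
    return math.sqrt(var)

# Hypothetical five-compartment perfusion distribution vs. V/Q ratio.
q = [0.1, 0.2, 0.4, 0.2, 0.1]     # fractional perfusion per compartment
vq = [0.3, 0.6, 1.0, 1.7, 3.0]    # V/Q ratio of each compartment
print(round(log_sd(q, vq), 2))  # -> 0.61
```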

  6. Fluid-Rock Characterization and Interactions in NMR Well Logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirasaki, George J.; Mohanty, Kishore K.

    2003-02-10

    The objective of this project was to characterize the fluid properties and fluid-rock interactions that are needed for formation evaluation by NMR well logging. The advances made in the understanding of NMR fluid properties are summarized in a chapter written for an AAPG book on NMR well logging. This includes live oils, viscous oils, natural gas mixtures, and the relation between relaxation time and diffusivity.

  7. Acoustic testing to enhance western forest values and meet customer wood quality needs

    Treesearch

    Peter Carter; David Briggs; Robert J. Ross; Xiping Wang

    2005-01-01

    Nondestructive testing (NDT) of wood products, such as lumber and veneer, for stiffness and strength evaluation has been proven and commercialized for many years. The NDT concept has been extended and commercialized in the Director HM-200™ tool for testing logs in advance of processing so manufacturers can make more informed log purchases and better match logs to...

  8. Balancing conservation and economic sustainability: the future of the Amazon timber industry.

    PubMed

    Merry, Frank; Soares-Filho, Britaldo; Nepstad, Daniel; Amacher, Gregory; Rodrigues, Hermann

    2009-09-01

    Logging has been a much maligned feature of frontier development in the Amazon. Most discussions ignore the fact that logging can be part of a renewable, environmentally benign, and broadly equitable economic activity in these remote places. We estimate there to be some 4.5 +/- 1.35 billion m(3) of commercial timber volume in the Brazilian Amazon today, of which 1.2 billion m(3) is currently profitable to harvest, with a total potential stumpage value of $15.4 billion. A successful forest sector in the Brazilian Amazon will integrate timber harvesting on private lands and on unprotected and unsettled government lands with timber concessions on public lands. If a legal, productive, timber industry can be established outside of protected areas, it will deliver environmental benefits in synergy with those provided by the region's network of protected areas, the latter of which we estimate to have an opportunity cost from lost timber revenues of $2.3 billion over 30 years. Indeed, on all land accessible to harvesting, the timber industry could produce an average of more than 16 million m(3) per year over a 30-year harvest cycle-entirely outside of current protected areas-providing $4.8 billion in returns to landowners and generating $1.8 billion in sawnwood sales tax revenue. This level of harvest could be profitably complemented with an additional 10% from logging concessions on National Forests. This advance, however, should be realized only through widespread adoption of reduced impact logging techniques.

  9. Balancing Conservation and Economic Sustainability: The Future of the Amazon Timber Industry

    NASA Astrophysics Data System (ADS)

    Merry, Frank; Soares-Filho, Britaldo; Nepstad, Daniel; Amacher, Gregory; Rodrigues, Hermann

    2009-09-01

    Logging has been a much maligned feature of frontier development in the Amazon. Most discussions ignore the fact that logging can be part of a renewable, environmentally benign, and broadly equitable economic activity in these remote places. We estimate there to be some 4.5 ± 1.35 billion m3 of commercial timber volume in the Brazilian Amazon today, of which 1.2 billion m3 is currently profitable to harvest, with a total potential stumpage value of $15.4 billion. A successful forest sector in the Brazilian Amazon will integrate timber harvesting on private lands and on unprotected and unsettled government lands with timber concessions on public lands. If a legal, productive, timber industry can be established outside of protected areas, it will deliver environmental benefits in synergy with those provided by the region's network of protected areas, the latter of which we estimate to have an opportunity cost from lost timber revenues of $2.3 billion over 30 years. Indeed, on all land accessible to harvesting, the timber industry could produce an average of more than 16 million m3 per year over a 30-year harvest cycle, entirely outside of current protected areas, providing $4.8 billion in returns to landowners and generating $1.8 billion in sawnwood sales tax revenue. This level of harvest could be profitably complemented with an additional 10% from logging concessions on National Forests. This advance, however, should be realized only through widespread adoption of reduced impact logging techniques.

  10. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    The backdoor or information leak of Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, enhancing Web server security and avoiding the damage of illegal access. First, a system for discovering the patterns of information leakage in CGI scripts from Web log data is proposed. Second, those patterns are provided to system administrators so they can modify their code and enhance Web site security. Two aspects are described: one is combining the Web application log with the Web log to extract more information, so that Web data mining can discover information that a firewall and Intrusion Detection System cannot find; the other is an operation module for the Web site that enhances its security. In the cluster server session, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.

  11. Extracting the Textual and Temporal Structure of Supercomputing Logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, S; Singh, I; Chandra, A

    2009-05-26

    Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format makes it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
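
    The textual-clustering idea in this abstract can be sketched in a few lines: masking the variable fields of each message (numbers, hex IDs, dotted addresses) leaves its syntactic template, and messages sharing a template fall into the same group. This is a minimal illustration under assumed message formats, not the authors' algorithm; the regexes and sample log lines are invented for the example.

    ```python
    import re
    from collections import defaultdict

    def template_of(message):
        """Mask fields that vary between messages (hex IDs, dotted
        addresses, plain numbers) so only the syntactic template remains."""
        masked = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", message)
        masked = re.sub(r"\d+(?:\.\d+)+", "<IP>", masked)  # dotted numerics
        masked = re.sub(r"\d+", "<NUM>", masked)
        return masked

    def cluster_logs(messages):
        """Group messages that share a template (a one-pass, online-friendly step)."""
        groups = defaultdict(list)
        for m in messages:
            groups[template_of(m)].append(m)
        return groups

    logs = [
        "node 17 failed heartbeat check after 3000 ms",
        "node 42 failed heartbeat check after 2950 ms",
        "link 10.0.3.7 down at port 8",
        "link 10.0.9.2 down at port 8",
    ]
    groups = cluster_logs(logs)   # two templates: heartbeat failures, link-down events
    ```

    A production system would refine templates incrementally (online) and track per-group timestamps to find temporally correlated events.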

  12. Estimating two indirect logging costs caused by accelerated erosion.

    Treesearch

    Glen O. Klock

    1976-01-01

    In forest areas where high soil erosion potential exists, a comparative yarding cost estimate, including the indirect costs determined by methods proposed here, shows that the total cost of using "advanced" logging methods may be less than that of "traditional" systems.

  13. Physical-scale models of engineered log jams in rivers

    USDA-ARS?s Scientific Manuscript database

    Stream restoration and river engineering projects are employing engineered log jams increasingly for stabilization and in-stream improvements. To further advance the design of these structures and their morphodynamic effects on corridors, the basis for physical-scale models of rivers with engineere...

  14. Locating knots by industrial tomography- A feasibility study

    Treesearch

    Fred W. Taylor; Francis G. Wagner; Charles W. McMillin; Ira L. Morgan; Forrest F. Hopkins

    1984-01-01

    Industrial photon tomography was used to scan four southern pine logs and one red oak log. The logs were scanned at 16 cross-sectional slice planes located 1 centimeter apart along their longitudinal axes. Tomographic reconstructions were made from the scan data collected at these slice planes, and a cursory image analysis technique was developed to locate the log...

  15. Logging damage in thinned, young-growth true fir stands in California and recommendations for prevention.

    Treesearch

    Paul E. Aho; Gary Fiddler; Mike. Srago

    1983-01-01

    Logging-damage surveys and tree-dissection studies were made in commercially thinned, naturally established young-growth true fir stands in the Lassen National Forest in northern California. Significant damage occurred to residual trees in stands logged by conventional methods. Logging damage was substantially lower in stands thinned using techniques designed to reduce...

  16. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application to clinical and administrative decision making in the management of hospital activities.

  17. Bio-logging of physiological parameters in higher marine vertebrates

    NASA Astrophysics Data System (ADS)

    Ponganis, Paul J.

    2007-02-01

    Bio-logging of physiological parameters in higher marine vertebrates had its origins in the field of bio-telemetry in the 1960s and 1970s. The development of microprocessor technology allowed its first application to bio-logging investigations of Weddell seal diving physiology in the early 1980s. Since that time, with the use of increased memory capacity, new sensor technology, and novel data processing techniques, investigators have examined heart rate, temperature, swim speed, stroke frequency, stomach function (gastric pH and motility), heat flux, muscle oxygenation, respiratory rate, diving air volume, and oxygen partial pressure (PO2) during diving. Swim speed, heart rate, and body temperature have been the most commonly studied parameters. Bio-logging investigation of pressure effects has only been conducted with the use of blood samplers and nitrogen analyses on animals diving at isolated dive holes. The advantages/disadvantages and limitations of recording techniques, probe placement, calibration techniques, and study conditions are reviewed.

  18. Characterization of the Hydrocarbon Potential and Non-Potential Zones Using Wavelet-Based Fractal Analysis

    NASA Astrophysics Data System (ADS)

    Mukherjee, Bappa; Roy, P. N. S.

    Identifying prospective and dry zones from well-log data is of major importance, and correctly identifying potential zones is a crucial issue in hydrocarbon exploration. The problem has received considerable attention, and many conventional techniques have been proposed. The purpose of this study is to distinguish the hydrocarbon-bearing and non-hydrocarbon-bearing portions of a reservoir using a non-conventional technique: wavelet-based fractal analysis (WBFA) applied to wire-line log data, which separates pre-defined hydrocarbon (HC) and non-hydrocarbon (NHC) zones by the self-affine nature of their signals. The feasibility of the proposed technique is tested on the most commonly used logs (self-potential, gamma ray, resistivity, and porosity responses), obtained from industry to delineate several HC and NHC zones in all wells of the study region, which belongs to the upper Assam basin. The results show that HC-bearing zones are mainly situated in a variety of sandstone lithologies, which leads to a higher Hurst exponent, whereas NHC zones correspond to lithologies with higher shale content and a lower Hurst exponent. The proposed technique can reduce the chance of misinterpretation in conventional reservoir characterization.
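
    The Hurst-exponent contrast that the study relies on (higher H in sandstone HC zones, lower H in shaly NHC zones) can be illustrated with a simple estimator. The sketch below uses the aggregated-variance method rather than the paper's wavelet-based fractal analysis: for a self-affine series, the variance of block means of size m scales as m^(2H-2), so the log-log slope beta gives H = 1 + beta/2. An uncorrelated series should come out near H = 0.5; all inputs are synthetic.

    ```python
    import numpy as np

    def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
        """Aggregated-variance Hurst estimate: for a self-affine series the
        variance of block means of size m scales like m**(2H - 2), so the
        log-log slope beta gives H = 1 + beta / 2."""
        x = np.asarray(x, dtype=float)
        variances = []
        for m in block_sizes:
            n_blocks = len(x) // m
            block_means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
            variances.append(block_means.var())
        beta = np.polyfit(np.log(block_sizes), np.log(variances), 1)[0]
        return 1.0 + beta / 2.0

    rng = np.random.default_rng(0)
    white_noise = rng.standard_normal(20000)   # uncorrelated: expect H near 0.5
    h = hurst_aggvar(white_noise)
    ```

    Applied to a real wire-line log, the same straight-line fit would be made on wavelet detail variances across scales instead of block-mean variances.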

  19. Development of a 3D log sawing optimization system for small sawmills in central Appalachia, US

    Treesearch

    Wenshu Lin; Jingxin Wang; Edward Thomas

    2011-01-01

    A 3D log sawing optimization system was developed to perform log generation, opening face determination, sawing simulation, and lumber grading using 3D modeling techniques. Heuristic and dynamic programming algorithms were used to determine opening face and grade sawing optimization. Positions and shapes of internal log defects were predicted using a model developed by...

  20. Evaluation of residual oil saturation after waterflood in a carbonate reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verma, M.K.; Boucherit, M.; Bouvier, L.

    Four different approaches, including special core analysis (SCAL), log-inject-log, thermal-decay-time (TDT) logs, and material balance, were used to narrow the range of residual oil saturation (ROS) after waterflood, S_orw, in a carbonate reservoir in Qatar to between 23% and 27%. An equation was developed that relates S_orw with connate-water saturation, S_wi, and porosity. This paper presents the results of S_orw determinations with four different techniques: core waterflood followed by centrifuging, log-inject-log, TDT logging, and material balance.

  1. Functional forms and price elasticities in a discrete continuous choice model of the residential water demand

    NASA Astrophysics Data System (ADS)

    Vásquez Lavín, F. A.; Hernandez, J. I.; Ponce, R. D.; Orrego, S. A.

    2017-07-01

    During recent decades, water demand estimation has gained considerable attention from scholars. From an econometric perspective, the most used functional forms include log-log and linear specifications. Despite the advances in this field and the relevance for policymaking, little attention has been paid to the functional forms used in these estimations, and most authors have not provided justifications for their selection of functional forms. A discrete continuous choice model of the residential water demand is estimated using six functional forms (log-log, full-log, log-quadratic, semilog, linear, and Stone-Geary), and the expected consumption and price elasticity are evaluated. From a policy perspective, our results highlight the relevance of functional form selection for both the expected consumption and price elasticity.
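
    Why the functional form matters can be seen in a minimal sketch: in a log-log specification ln q = a + b ln p, the slope b is itself the (constant) price elasticity, so OLS on logged data recovers it directly, whereas in a linear specification the implied elasticity b·p/q varies along the demand curve. The data below are synthetic, with an assumed elasticity of -0.6.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    true_elasticity = -0.6                       # assumed value for illustration
    price = rng.uniform(0.5, 5.0, size=500)
    demand = 30.0 * price**true_elasticity * np.exp(rng.normal(0.0, 0.05, size=500))

    # In the log-log form ln q = a + b ln p, the slope b IS the constant
    # price elasticity, so a straight-line fit on logged data recovers it.
    b, a = np.polyfit(np.log(price), np.log(demand), 1)
    ```

    The same exercise with the other five forms (full-log, log-quadratic, semilog, linear, Stone-Geary) would give elasticities that depend on where on the demand curve they are evaluated, which is exactly the selection issue the paper raises.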

  2. Advanced Cyber Attack Modeling Analysis and Visualization

    DTIC Science & Technology

    2010-03-01

    [Abstract not recoverable; the extracted excerpt contains only figure and reference fragments: TVA attack graphs combining network web logs, NetFlow data, TCP dump data, and system logs for detect/protect security management and what-if analysis (Figure 8), plus citations on drawing clustered graphs and the NVisionIP NetFlow visualization tool.]

  3. Downhole well log and core montages from the Mount Elbert Gas Hydrate Stratigraphic Test Well, Alaska North Slope

    USGS Publications Warehouse

    Collett, T.S.; Lewis, R.E.; Winters, W.J.; Lee, M.W.; Rose, K.K.; Boswell, R.M.

    2011-01-01

    The BPXA-DOE-USGS Mount Elbert Gas Hydrate Stratigraphic Test Well was an integral part of an ongoing project to determine the future energy resource potential of gas hydrates on the Alaska North Slope. As part of this effort, the Mount Elbert well included an advanced downhole geophysical logging program. Because gas hydrate is unstable at ground surface pressure and temperature conditions, a major emphasis was placed on the downhole-logging program to determine the occurrence of gas hydrates and the in-situ physical properties of the sediments. In support of this effort, well-log and core data montages have been compiled which include downhole log and core data obtained from the gas-hydrate-bearing sedimentary section in the Mount Elbert well. Also shown are numerous reservoir parameters, including gas-hydrate saturation and sediment porosity log traces calculated from available downhole well log and core data. © 2010.

  4. Reduced-impact logging: challenges and opportunities

    Treesearch

    F.E. Putz; P. Sist; T. Fredericksen; D. Dykstra

    2008-01-01

    Over the past two decades, sets of timber harvesting guidelines designed to mitigate the deleterious environmental impacts of tree felling, yarding, and hauling have become known as "reduced-impact logging" (RIL) techniques. Although none of the components of RIL are new, concerns about destructive logging practices and worker safety in the tropics stimulated...

  5. Characterization of a complex near-surface structure using well logging and passive seismic measurements

    NASA Astrophysics Data System (ADS)

    Benjumea, Beatriz; Macau, Albert; Gabàs, Anna; Figueras, Sara

    2016-04-01

    We combine geophysical well logging and passive seismic measurements to characterize the near-surface geology of an area located in Hontomin, Burgos (Spain). This area poses several near-surface challenges for a geophysical study. The irregular topography is characterized by limestone outcrops and areas of unconsolidated sediments. Additionally, the near-surface geology includes an upper layer of pure limestones overlying marly limestones and marls (Upper Cretaceous). These materials lie on top of Lower Cretaceous siliciclastic sediments (sandstones, clays, gravels). In any case, a layer with reduced velocity is expected. The geophysical data sets used in this study include sonic and gamma-ray logs at two boreholes and passive seismic measurements: three arrays and 224 seismic stations for applying the horizontal-to-vertical amplitude spectra ratio method (H/V). Well-logging data define two significant changes in the P-wave-velocity log within the Upper Cretaceous layer and one more at the Upper to Lower Cretaceous contact. This technique has also been used for refining the geological interpretation. The passive seismic measurements provide a map of sediment thickness, with a maximum of around 40 m, and shear-wave velocity profiles from the array technique. A comparison between seismic velocities from well logging and array measurements defines the resolution limits of the passive seismic techniques and aids their interpretation. This study shows how these low-cost techniques can provide useful information about near-surface complexity that could be used for designing a geophysical field survey or for seismic processing steps such as statics or imaging.

  6. Creative Analytics of Mission Ops Event Messages

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2017-01-01

    Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, while relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day, and the numbers are expected to increase dramatically as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data dropouts or system failures, and much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information-systems technologies now make it appropriate to develop analytic capabilities that can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaboration, and lead to improved operations strategies. Industry standard for log messages: the Object Management Group (OMG) Space Domain Task Force (SDTF) is in the process of creating a formal industry standard for event log messages; the format is based on work at NASA GSFC. Open system architectures: the DoD, NASA, and others are moving toward common open system architectures for mission ground data systems, based on work at NASA GSFC with the full support of the commercial product industry and major integration contractors. Text analytics: a specific area of data analytics that applies statistical, linguistic, and structural techniques to extract and classify information from textual sources.
    This presentation describes work now underway at NASA to increase situational awareness by collecting non-telemetry mission operations information into a common log format and then providing display and analytics tools for in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays that depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural-language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.

  7. Impact of logging on aboveground biomass stocks in lowland rain forest, Papua New Guinea.

    PubMed

    Bryan, Jane; Shearman, Phil; Ash, Julian; Kirkpatrick, J B

    2010-12-01

    Greenhouse-gas emissions resulting from logging are poorly quantified across the tropics. There is a need for robust measurement of rain forest biomass and the impacts of logging from which carbon losses can be reliably estimated at regional and global scales. We used a modified Bitterlich plotless technique to measure aboveground live biomass at six unlogged and six logged rain forest areas (coupes) across two approximately 3000-ha regions at the Makapa concession in lowland Papua New Guinea. "Reduced-impact logging" is practiced at Makapa. We found the mean unlogged aboveground biomass in the two regions to be 192.96 ± 4.44 Mg/ha and 252.92 ± 7.00 Mg/ha (mean ± SE), which was reduced by logging to 146.92 ± 4.58 Mg/ha and 158.84 ± 4.16 Mg/ha, respectively. Killed biomass was not a fixed proportion, but varied with unlogged biomass, with 24% killed in the lower-biomass region, and 37% in the higher-biomass region. Across the two regions logging resulted in a mean aboveground carbon loss of 35 ± 2.8 Mg/ha. The plotless technique proved efficient at estimating mean aboveground biomass and logging damage. We conclude that substantial bias is likely to occur within biomass estimates derived from single unreplicated plots.
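
    The abstract's headline figures follow from simple arithmetic, reproduced below under one labeled assumption: that carbon is roughly 50% of dry aboveground biomass (a conventional value, not one stated in the abstract).

    ```python
    # Mean aboveground biomass (Mg/ha) before and after logging, per region,
    # taken from the abstract.
    unlogged = [192.96, 252.92]
    logged = [146.92, 158.84]
    carbon_fraction = 0.5  # ASSUMPTION: typical carbon content of dry biomass

    losses = [u - l for u, l in zip(unlogged, logged)]                 # biomass killed, Mg/ha
    killed_fraction = [loss / u for loss, u in zip(losses, unlogged)]  # ~24% and ~37%
    mean_carbon_loss = carbon_fraction * sum(losses) / len(losses)     # ~35 Mg C/ha
    ```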

  8. Interlake production established using quantitative hydrocarbon well-log analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lancaster, J.; Atkinson, A.

    1988-07-01

    Production was established in a new pay zone of the basal Interlake Formation adjacent to production in Midway field in Williams County, North Dakota. Hydrocarbon saturation, which was computed using hydrocarbon well-log (mud-log) data, and computed permeability encouraged the operator to run casing and test this zone. By use of drilling rig parameters, drilling mud properties, hydrocarbon-show data from the mud log, drilled rock and porosity descriptions, and wireline log porosity, this new technique computes oil saturation (percent of porosity) and permeability to the invading filtrate, using the Darcy equation. The Leonardo Fee well was drilled to test the Devonian Duperow, the Silurian upper Interlake, and the Ordovician Red River. The upper two objectives were penetrated downdip from Midway production and there were no hydrocarbon shows. It was determined that the Red River was tight, based on sample examination by well site personnel. The basal Interlake, however, liberated hydrocarbon shows that were analyzed by this new technology. The results of this evaluation accurately predicted this well would be a commercial success when placed in production. Where geophysical log analysis might be questionable, this new evaluation technique may provide answers to anticipated oil saturation and producibility. The encouraging results of hydrocarbon saturation and permeability, produced by this technique, may be largely responsible for the well being in production today.

  9. Conventional and advanced oxidation processes used in disinfection of treated urban wastewater.

    PubMed

    Rodríguez-Chueca, J; Ormad, M P; Mosteo, R; Sarasa, J; Ovelleiro, J L

    2015-03-01

    The purpose of the current study is to compare the inactivation of Escherichia coli in wastewater effluents using conventional treatments (chlorination) and advanced oxidation processes (AOPs) such as UV irradiation, hydrogen peroxide (H2O2)/solar irradiation, and photo-Fenton processes. In addition, an analysis of the operational costs of each treatment is carried out taking into account the optimal dosages of chemicals used. Total inactivation of bacteria (7.5 log) was achieved by means of chlorination and UV irradiation. However, bacterial regrowth was observed 6 hours after the completion of UV treatment, obtaining a disinfection value around 3 to 4 log. On the other hand, the combination H2O2/solar irradiation achieved a maximum inactivation of E. coli of 3.30 ± 0.35 log. The photo-Fenton reaction achieved a level of inactivation of 4.87 ± 0.10 log. The order of disinfection, taking into account the reagent/cost ratio of each treatment, is as follows: chlorination > UV irradiation > photo-Fenton > H2O2/sunlight irradiation.
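
    The "log" units used throughout this abstract are log10 reductions in viable counts, so "7.5 log" inactivation means counts fell by a factor of 10^7.5. A minimal helper makes the convention explicit; the counts below are illustrative, not the study's data.

    ```python
    import math

    def log_inactivation(n0, n):
        """Disinfection efficacy as the log10 reduction in viable counts."""
        return math.log10(n0 / n)

    full = log_inactivation(10**7.5, 1.0)              # total inactivation: 7.5 log
    partial = log_inactivation(1e6, 1e6 * 10**-3.3)    # ~3.3 log, like H2O2/solar
    ```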

  10. Postfire logging: is it beneficial to a forest?

    Treesearch

    Sally Duncan

    2002-01-01

    Public debate on postfire logging has intensified in recent years, particularly since passage of the "salvage rider" in 1995, directing accelerated harvest of dead trees in the western United States. Supporters of postfire logging argue that it is part of a suite of restoration techniques, and that removal of timber means reduction of fuels for...

  11. Log-Log Convexity of Type-Token Growth in Zipf's Systems

    NASA Astrophysics Data System (ADS)

    Font-Clos, Francesc; Corral, Álvaro

    2015-06-01

    It is traditionally assumed that Zipf's law implies the power-law growth of the number of different elements with the total number of elements in a system—the so-called Heaps' law. We show that a careful definition of Zipf's law leads to the violation of Heaps' law in random systems, with growth curves that have a convex shape in log-log scale. These curves fulfill universal data collapse that only depends on the value of Zipf's exponent. We observe that real books behave very much in the same way as random systems, despite the presence of burstiness in word occurrence. We advance an explanation for this unexpected correspondence.
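
    The type-token growth described here is easy to reproduce numerically: drawing tokens from a finite vocabulary with Zipf-distributed probabilities yields a distinct-type count that grows sublinearly, i.e. convex in log-log scale. The vocabulary size and exponent below are arbitrary choices for illustration, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Finite vocabulary with Zipf-distributed token probabilities P(r) ~ r**-alpha.
    alpha, vocab_size = 1.5, 50_000
    ranks = np.arange(1, vocab_size + 1)
    probs = ranks.astype(float) ** -alpha
    probs /= probs.sum()

    tokens = rng.choice(vocab_size, size=20_000, p=probs)

    def types_seen(tokens, n):
        """Distinct types among the first n tokens (one point on the Heaps curve)."""
        return len(set(tokens[:n].tolist()))

    v_small = types_seen(tokens, 5_000)
    v_large = types_seen(tokens, 20_000)
    # Quadrupling the tokens far less than quadruples the types: sublinear,
    # convex-in-log-log growth.
    ```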

  12. AVTA Federal Fleet PEV Readiness Data Logging and Characterization Study: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schey, Stephen; Francfort, Jim

    2015-06-01

    Collect and evaluate data on federal fleet operations as part of the Advanced Vehicle Testing Activity's Federal Fleet Vehicle Data Logging and Characterization Study. The Advanced Vehicle Testing Activity study seeks to collect and evaluate data to validate the utilization of advanced plug-in electric vehicle (PEV) transportation. This report summarizes the fleets studied to identify daily operational characteristics of select vehicles and report findings on vehicle and mission characterizations to support the successful introduction of PEVs into the agencies' fleets. Individual observations of these selected vehicles provide the basis for recommendations related to electric vehicle adoption and whether a battery electric vehicle or plug-in hybrid electric vehicle (collectively referred to as PEVs) can fulfill the mission requirements.

  13. Protecting log cabins from decay

    Treesearch

    R. M. Rowell; J. M. Black; L. R. Gjovik; W. C. Feist

    1977-01-01

    This report answers the questions most often asked of the Forest Service on the protection of log cabins from decay, and on practices for the exterior finishing and maintenance of existing cabins. Causes of stain and decay are discussed, as are some basic techniques for building a cabin that will minimize decay. Selection and handling of logs, their preservative...

  14. The Inculcation of Critical Reflection through Reflective Learning Log: An Action Research in Entrepreneurship Module

    ERIC Educational Resources Information Center

    Kheng, Yeoh Khar

    2017-01-01

    Purpose: This study is part of the Scholarship of Teaching and Learning (SoTL) grant to examine written reflective learning logs among students studying BPME 3073 Entrepreneurship in UUM. Method: The data collection technique is researcher-directed textual data through reflective learning logs, obtained from one hundred forty students. A…

  15. Internal log scanning: Research to reality

    Treesearch

    Daniel L. Schmoldt

    2000-01-01

    Improved log breakdown into lumber has been an active research topic since the 1960's. Demonstrated economic gains have driven the search for a cost-effective method to scan logs internally, from which it is assumed one can chose a better breakdown strategy. X-ray computed tomography (CT) has been widely accepted as the most promising internal imaging technique....

  16. 75 FR 60122 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... the respondents, including the use of automated collection techniques or other forms of information...: OMB Control Number: 3060-0360. Title: Section 80.409, Station Logs. Form No.: N/A. Type of Review... for filing suits upon such claims. Section 80.409(d), Ship Radiotelegraph Logs: Logs of ship stations...

  17. Users' Perceptions of the Web As Revealed by Transaction Log Analysis.

    ERIC Educational Resources Information Center

    Moukdad, Haidar; Large, Andrew

    2001-01-01

    Describes the results of a transaction log analysis of a Web search engine, WebCrawler, to analyze user's queries for information retrieval. Results suggest most users do not employ advanced search features, and the linguistic structure often resembles a human-human communication model that is not always successful in human-computer communication.…

  18. Automated lithology prediction from PGNAA and other geophysical logs.

    PubMed

    Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T

    2006-02-01

    Different methods of lithology prediction from geophysical data have been developed in the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma), and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper investigates the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, which is based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology: a success rate of 73% was achieved from PGNAA logging data alone. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.
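
    LogTrans itself is not specified in the abstract, but the general idea of statistical lithology prediction from log responses can be sketched with a nearest-centroid classifier. All classes, feature values, and noise levels below are synthetic and purely illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic (gamma-ray, density) log responses per lithology class.
    # Centroid values are invented for illustration, not calibrated data.
    centroids = {"coal": (30.0, 1.4), "shale": (120.0, 2.5), "sandstone": (60.0, 2.3)}

    def make_samples(c, n=50):
        return np.column_stack([rng.normal(c[0], 5.0, n), rng.normal(c[1], 0.05, n)])

    train = {lith: make_samples(c) for lith, c in centroids.items()}
    means = {lith: s.mean(axis=0) for lith, s in train.items()}
    scale = np.concatenate(list(train.values())).std(axis=0)  # put units on a common footing

    def predict(sample):
        """Assign the lithology whose training centroid is nearest in scaled space."""
        dists = {lith: np.linalg.norm((sample - m) / scale) for lith, m in means.items()}
        return min(dists, key=dists.get)
    ```

    A real workflow would train on depth intervals of known lithology and validate a success rate (like the 73% reported) on held-out intervals.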

  19. Financial returns under uncertainty for conventional and reduced-impact logging in permanent production forests of the Brazilian Amazon

    Treesearch

    Frederick Boltz; Douglas R. Carter; Thomas P. Holmes; Rodrigo Pereira

    2001-01-01

    Reduced-impact logging (RIL) techniques are designed to improve the efficiency of timber harvesting while mitigating its adverse effects on the forest ecosystem. Research on RIL in select tropical forest regions has demonstrated clear ecological benefits relative to conventional logging (CL) practices while the financial competitiveness of RIL is less conclusive. We...

  20. Boat-Wave-Induced Bank Erosion on the Kenai River, Alaska

    DTIC Science & Technology

    2008-03-01

    [Abstract not recoverable; the extracted excerpt contains only figure-caption and body fragments: common streambank stabilization techniques on the Kenai River consist of root wads, spruce tree revetments, coir logs, and riprap; Figure 50 shows a Type 1 bank with coir log habitat restoration, and Figure 51 a Type 1 bank with willow plantings/ladder access habitat restoration.]

  1. Break-even zones for cable yarding by log size

    Treesearch

    Chris B. LeDoux

    1984-01-01

    The use of cable logging to extract small pieces of residue wood may result in low rates of production and a high cost per unit of wood produced. However, the logging manager can improve yarding productivity and break even in cable residue removal operations by using proper planning techniques. In this study, break-even zones for specific young-growth stands were...

  2. Progress in analysis of computed tomography (CT) images of hardwood logs for defect detection

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2003-01-01

    This paper addresses the problem of automatically detecting internal defects in logs using computed tomography (CT) images. The overall purpose is to assist in breakdown optimization. Several studies have shown that the commercial value of resulting boards can be increased substantially if defect locations are known in advance, and if this information is used to make...

  3. Proposed standard-weight (W(s)) equations for kokanee, golden trout and bull trout

    USGS Publications Warehouse

    Hyatt, M.H.; Hubert, W.A.

    2000-01-01

    We developed standard-weight (W(s)) equations for kokanee (lacustrine Oncorhynchus nerka), golden trout (O. aguabonita), and bull trout (Salvelinus confluentus) using the regression-line-percentile technique. The W(s) equation for kokanee of 120-550 mm TL is log10 W(s) = -5.062 + 3.033 log10 TL, when W(s) is in grams and TL is total length in millimeters; the English-unit equivalent is log10 W(s) = -3.458 + 3.033 log10 TL, when W(s) is in pounds and TL is total length in inches. The W(s) equation for golden trout of 120-530 mm TL is log10 W(s) = -5.088 + 3.041 log10 TL, with the English-unit equivalent being log10 W(s) = -3.473 + 3.041 log10 TL. The W(s) equation for bull trout of 120-850 mm TL is log10 W(s) = -5.327 + 3.115 log10 TL, with the English-unit equivalent being log10 W(s) = -3.608 + 3.115 log10 TL.
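    The W(s) equations above apply directly; a small sketch evaluating the metric kokanee equation (the function name and range check are ours):

```python
import math

def standard_weight_kokanee(tl_mm):
    """Standard weight (grams) for kokanee, from the metric equation in the
    abstract: log10 W(s) = -5.062 + 3.033 log10 TL, valid for 120-550 mm TL."""
    if not 120 <= tl_mm <= 550:
        raise ValueError("equation applies to 120-550 mm TL only")
    return 10 ** (-5.062 + 3.033 * math.log10(tl_mm))

# A 400 mm kokanee has a standard weight of roughly 676 g.
print(round(standard_weight_kokanee(400), 1))
```

    The golden trout and bull trout equations differ only in intercept and slope, so the same pattern applies with their coefficients substituted.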

  4. SpaceOps 2012 Plus 2: Social Tools to Simplify ISS Flight Control Communications and Log Keeping

    NASA Technical Reports Server (NTRS)

    Cowart, Hugh S.; Scott, David W.

    2014-01-01

    A paper written for the SpaceOps 2012 Conference (Simplify ISS Flight Control Communications and Log Keeping via Social Tools and Techniques) identified three innovative concepts for real-time flight control communications tools based on social mechanisms: a) Console Log Tool (CoLT) - A log-keeping application at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) that provides "anywhere" access, comment, and notification features similar to those found in Social Networking Systems (SNS), b) Cross-Log Communication via Social Techniques - A concept from Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) that would use microblogging's @tag and #tag protocols to make information/requests visible and/or discoverable in logs owned by @Destination addressees, and c) Communications Dashboard (CommDash) - An MSFC concept for a Facebook-like interface to visually integrate and manage basic console log content, text chat streams analogous to voice loops, text chat streams dedicated to particular conversations, generic and position-specific status displays/streams, and a graphically based hailing display. CoLT was deployed operationally at nearly the same time as SpaceOps 2012, the Cross-Log Communications idea is currently waiting for a champion to carry it forward, and CommDash was approved as a NASA Information Technology (IT) Labs project. This paper discusses lessons learned from two years of actual CoLT operations, updates CommDash prototype development status, discusses the potential for using Cross-Log Communications in MCC-H and/or POIC environments, and considers other ways of synergizing console applications.

  5. Laser optogalvanic spectroscopy of molecules

    NASA Technical Reports Server (NTRS)

    Webster, C. R.; Rettner, C. T.

    1983-01-01

    In laser optogalvanic (LOG) spectroscopy, a tunable laser is used to probe the spectral characteristics of atomic or molecular species within an electrical discharge in a low pressure gas. Optogalvanic signals arise when the impedance of the discharge changes in response to the absorption of laser radiation. The technique may, therefore, be referred to as impedance spectroscopy. This change in impedance may be monitored as a change in the voltage across the discharge tube. LOG spectra are recorded by scanning the wavelength of a chopped CW dye laser while monitoring the discharge voltage with a lock-in amplifier. LOG signals are obtained if the laser wavelength matches a transition in a species present in the discharge (or flame), and if the absorption of energy in the laser beam alters the impedance of the discharge. Infrared LOG spectroscopy of molecules has been demonstrated and may prove to be the most productive application in the field of optogalvanic techniques.

  6. Detecting and classifying method based on similarity matching of Android malware behavior with profile.

    PubMed

    Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang

    2016-01-01

    Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive. On-device techniques can be evaded, while off-device techniques must always be online. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. To that end, Andro-profiler classifies malware by exploiting behavior profiles extracted from integrated system logs, including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing them. By comparing the behavior profile of a malicious application with the representative behavior profile of each malware family using a weighted similarity matching technique, Andro-profiler detects the application and classifies it into a malware family. The experimental results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98%, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
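    The abstract names a weighted similarity matching technique but not its exact form. One plausible sketch treats behavior profiles as feature-count vectors with analyst-chosen weights; this is illustrative, not Andro-profiler's published metric, and all feature names are invented.

```python
import math

def weighted_similarity(profile, family_profile, weights):
    """Weighted cosine similarity between two behavior profiles
    (feature -> count dicts). `weights` rates the importance of each
    behavior category; unlisted features default to weight 1.0."""
    feats = set(profile) | set(family_profile)
    dot = sum(weights.get(f, 1.0) * profile.get(f, 0) * family_profile.get(f, 0)
              for f in feats)
    na = math.sqrt(sum(weights.get(f, 1.0) * profile.get(f, 0) ** 2 for f in feats))
    nb = math.sqrt(sum(weights.get(f, 1.0) * family_profile.get(f, 0) ** 2 for f in feats))
    return dot / (na * nb) if na and nb else 0.0

def classify(profile, families, weights, threshold=0.8):
    """Return the best-matching family, or None if below the threshold."""
    best = max(families, key=lambda f: weighted_similarity(profile, families[f], weights))
    score = weighted_similarity(profile, families[best], weights)
    return best if score >= threshold else None

families = {
    "smsbot":  {"send_sms": 10, "read_contacts": 3},
    "spyware": {"read_contacts": 9, "net_upload": 6},
}
weights = {"send_sms": 2.0}
print(classify({"send_sms": 8, "read_contacts": 2}, families, weights))
```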

  7. Design and Evaluation of Log-To-Dimension Manufacturing Systems Using System Simulation

    Treesearch

    Wenjie Lin; D. Earl Kline; Philip A. Araman; Janice K. Wiedenbeck

    1995-01-01

    In a recent study of alternative dimension manufacturing systems that produce green hardwood dimension directly from logs, it was observed that for Grade 2 and 3 red oak logs, up to 78 and 76 percent of the log scale volume could be converted into clear dimension parts. The potential high yields suggest that this processing system can be a promising technique for...

  8. Measuring ecological impacts from logging in natural forests of the eastern Amazonia as a tool to assess forest degradation

    Treesearch

    Marco W Lentini; Johan C Zweede; Thomas P Holmes

    2010-01-01

    Sound forest management practices have been seen as a promising strategy for aligning forest conservation with rural economic development in Amazônia. However, the implementation of Reduced Impact Logging (RIL) techniques in the field has been incipient, while most of the Amazonian timber production is generated through predatory and illegal logging. Despite several...

  9. Permeability Estimation Directly From Logging-While-Drilling Induced Polarization Data

    NASA Astrophysics Data System (ADS)

    Fiandaca, G.; Maurya, P. K.; Balbarini, N.; Hördt, A.; Christiansen, A. V.; Foged, N.; Bjerg, P. L.; Auken, E.

    2018-04-01

    In this study, we present the prediction of permeability from time domain spectral induced polarization (IP) data, measured in boreholes on undisturbed formations using the El-log logging-while-drilling technique. We collected El-log data and hydraulic properties on unconsolidated Quaternary and Miocene deposits in boreholes at three locations at a field site in Denmark, characterized by different electrical water conductivity and chemistry. The high vertical resolution of the El-log technique matches the lithological variability at the site, minimizing ambiguity in the interpretation originating from resolution issues. The permeability values were computed from IP data using a laboratory-derived empirical relationship presented in a recent study for saturated unconsolidated sediments, without any further calibration. A very good correlation, within 1 order of magnitude, was found between the IP-derived permeability estimates and those derived using grain size analyses and slug tests, with similar depth trends and permeability contrasts. Furthermore, the effect of water conductivity on the IP-derived permeability estimations was found negligible in comparison to the permeability uncertainties estimated from the inversion and the laboratory-derived empirical relationship.

  10. Removal of native coliphages and coliform bacteria from municipal wastewater by various wastewater treatment processes: implications to water reuse.

    PubMed

    Zhang, K; Farahbakhsh, K

    2007-06-01

    The efficacy of a conventional activated sludge wastewater treatment process and of membrane bioreactor (MBR) technology in removing microbial pathogens was investigated. Total and fecal coliforms and somatic and F-specific coliphages were used as indicators of pathogenic bacteria and viruses. Up to 5.7 logs removal of coliforms and 5.5 logs of coliphages were observed in the conventional treatment process with advanced tertiary treatment. Addition of chemical coagulants appeared to improve the efficacy of primary and secondary treatment for microorganism removal. Complete removal of fecal coliforms and up to 5.8 logs removal of coliphages were observed in the MBR system. The MBR system was shown to be capable of high removal of coliphages despite variation in feed coliphage concentrations. The results of this study indicated that the MBR system can achieve better microbial removal in far fewer steps than the conventional activated sludge process with advanced tertiary treatment. The final effluent from either treatment process can potentially be reused.
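    Log-removal values like the 5.7-log figure above convert directly to surviving fractions; a short sketch of the arithmetic (the example concentrations are hypothetical):

```python
import math

def log_removal(influent, effluent):
    """Log10 removal value (LRV) from influent and effluent concentrations
    (same units, e.g. CFU/100 mL)."""
    return math.log10(influent / effluent)

def percent_removed(lrv):
    """Percentage of organisms removed for a given LRV."""
    return (1.0 - 10.0 ** -lrv) * 100.0

# Hypothetical example: 10^6 CFU/100 mL in, 2 CFU/100 mL out.
print(round(log_removal(1e6, 2), 2))
# A 5.7-log removal leaves a surviving fraction of 10^-5.7, i.e. >99.999% removed.
print(percent_removed(5.7))
```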

  11. Application of work sampling technique to analyze logging operations.

    Treesearch

    Edwin S. Miyata; Helmuth M. Steinhilb; Sharon A. Winsauer

    1981-01-01

    Discusses the advantages and disadvantages of various time study methods for determining efficiency and productivity in logging. The work sampling method is compared with the continuous time-study method. Gives the feasibility, capability, and limitation of the work sampling method.

  12. A stress wave based approach to NDE of logs for assessing potential veneer quality: Part I—small-diameter ponderosa pine.

    Treesearch

    Robert J. Ross; Susan W. Willits; William Von Segen; Terry Black; Brian K. Brashaw; Roy F. Pellerin

    1999-01-01

    Longitudinal stress wave nondestructive evaluation (NDE) techniques have been used in a variety of applications in the forest products industry. Recently, it has been shown that they can significantly aid in the assessment of log quality, particularly when they are used to predict performance of structural lumber obtained from a log. The purpose of the research...

  13. Smart built-in test

    NASA Technical Reports Server (NTRS)

    Richards, Dale W.

    1990-01-01

    The work which built-in test (BIT) is asked to perform in today's electronic systems increases with every insertion of new technology or introduction of tighter performance criteria. Yet the basic purpose remains unchanged -- to determine with high confidence the operational capability of that equipment. Achievement of this level of BIT performance requires the management and assimilation of a large amount of data, both realtime and historical. Smart BIT has taken advantage of advanced techniques from the field of artificial intelligence (AI) in order to meet these demands. The Smart BIT approach enhances traditional functional BIT by utilizing AI techniques to incorporate environmental stress data, temporal BIT information and maintenance data, and realtime BIT reports into an integrated test methodology for increased BIT effectiveness and confidence levels. Future research in this area will incorporate onboard fault-logging of BIT output, stress data and Smart BIT decision criteria in support of a singular, integrated and complete test and maintenance capability. The state of this research is described along with a discussion of directions for future development.

  14. PandaEPL: a library for programming spatial navigation experiments.

    PubMed

    Solway, Alec; Miller, Jonathan F; Kahana, Michael J

    2013-12-01

    Recent advances in neuroimaging and neural recording techniques have enabled researchers to make significant progress in understanding the neural mechanisms underlying human spatial navigation. Because these techniques generally require participants to remain stationary, computer-generated virtual environments are used. We introduce PandaEPL, a programming library for the Python language designed to simplify the creation of computer-controlled spatial-navigation experiments. PandaEPL is built on top of Panda3D, a modern open-source game engine. It allows users to construct three-dimensional environments that participants can navigate from a first-person perspective. Sound playback and recording and also joystick support are provided through the use of additional optional libraries. PandaEPL also handles many tasks common to all cognitive experiments, including managing configuration files, logging all internal and participant-generated events, and keeping track of the experiment state. We describe how PandaEPL compares with other software for building spatial-navigation experiments and walk the reader through the process of creating a fully functional experiment.

  15. PandaEPL: A library for programming spatial navigation experiments

    PubMed Central

    Solway, Alec; Miller, Jonathan F.

    2013-01-01

    Recent advances in neuroimaging and neural recording techniques have enabled researchers to make significant progress in understanding the neural mechanisms underlying human spatial navigation. Because these techniques generally require participants to remain stationary, computer-generated virtual environments are used. We introduce PandaEPL, a programming library for the Python language designed to simplify the creation of computer-controlled spatial-navigation experiments. PandaEPL is built on top of Panda3D, a modern open-source game engine. It allows users to construct three-dimensional environments that participants can navigate from a first-person perspective. Sound playback and recording and also joystick support are provided through the use of additional optional libraries. PandaEPL also handles many tasks common to all cognitive experiments, including managing configuration files, logging all internal and participant-generated events, and keeping track of the experiment state. We describe how PandaEPL compares with other software for building spatial-navigation experiments and walk the reader through the process of creating a fully functional experiment. PMID:23549683

  16. Challenge Paper: Validation of Forensic Techniques for Criminal Prosecution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erbacher, Robert F.; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.

    2007-04-10

    Abstract: As in many domains, there is increasing agreement in the user and research community that digital forensics analysts would benefit from the extension, development, and application of advanced techniques for performing large-scale, heterogeneous data analysis. Modern digital forensics analysis of cyber-crimes and cyber-enabled crimes often requires scrutiny of massive amounts of data. For example, a case involving network compromise across multiple enterprises might require forensic analysis of numerous sets of network logs and computer hard drives, potentially involving 100s of gigabytes of heterogeneous data, or even terabytes or petabytes of data. Also, the goal of forensic analysis is not only to determine whether the illicit activity being considered is taking place, but also to identify the source of the activity and the full extent of the compromise or impact on the local network. Even after this analysis, there remains the challenge of using the results in subsequent criminal and civil processes.

  17. Vadose Zone Transport Field Study: Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gee, Glendon W.; Ward, Anderson L.

    2001-11-30

    Studies were initiated at the Hanford Site to evaluate the processes controlling the transport of fluids in the vadose zone and to develop a reliable database upon which vadose-zone transport models can be calibrated. These models are needed to evaluate contaminant migration through the vadose zone to underlying groundwaters at Hanford. A study site that had previously been extensively characterized using geophysical monitoring techniques was selected in the 200 E Area. Techniques used previously included neutron probe for water content, spectral gamma logging for radionuclide tracers, and gamma scattering for wet bulk density. Building on the characterization efforts of the past 20 years, the site was instrumented to facilitate the comparison of nine vadose-zone characterization methods: advanced tensiometers, neutron probe, electrical resistance tomography (ERT), high-resolution resistivity (HRR), electromagnetic induction imaging (EMI), cross-borehole radar (XBR), and cross-borehole seismic (XBS). Soil coring was used to obtain soil samples for analyzing ionic and isotopic tracers.

  18. Smart built-in test

    NASA Astrophysics Data System (ADS)

    Richards, Dale W.

    1990-03-01

    The work which built-in test (BIT) is asked to perform in today's electronic systems increases with every insertion of new technology or introduction of tighter performance criteria. Yet the basic purpose remains unchanged -- to determine with high confidence the operational capability of that equipment. Achievement of this level of BIT performance requires the management and assimilation of a large amount of data, both realtime and historical. Smart BIT has taken advantage of advanced techniques from the field of artificial intelligence (AI) in order to meet these demands. The Smart BIT approach enhances traditional functional BIT by utilizing AI techniques to incorporate environmental stress data, temporal BIT information and maintenance data, and realtime BIT reports into an integrated test methodology for increased BIT effectiveness and confidence levels. Future research in this area will incorporate onboard fault-logging of BIT output, stress data and Smart BIT decision criteria in support of a singular, integrated and complete test and maintenance capability. The state of this research is described along with a discussion of directions for future development.

  19. 3-D visualisation of palaeoseismic trench stratigraphy and trench logging using terrestrial remote sensing and GPR - combining techniques towards an objective multiparametric interpretation

    NASA Astrophysics Data System (ADS)

    Schneiderwind, S.; Mason, J.; Wiatr, T.; Papanikolaou, I.; Reicherter, K.

    2015-09-01

    Two normal faults on the Island of Crete and mainland Greece were studied to create and test an innovative workflow to make palaeoseismic trench logging more objective, and to visualise the sedimentary architecture within the trench wall in 3-D. This is achieved by combining classical palaeoseismic trenching techniques with multispectral approaches. A conventional trench log was first compared to the results of iso-cluster analysis of a true colour photomosaic representing the spectrum of visible light. The disadvantages of passive data collection (e.g. dependence on illumination) were addressed by complementing the dataset with an active near-infrared backscatter image from t-LiDAR measurements. The multispectral analysis shows that distinct layers can be identified, and it compares well with the conventional trench log. Adjacent stratigraphic units could thus be distinguished by their particular multispectral composition signatures. Based on the trench log, a 3-D interpretation of GPR data collected on the vertical trench wall was then possible. This is highly beneficial for measuring representative layer thicknesses, displacements and geometries at depth within the trench wall. Thus, misinterpretation due to cutting effects is minimised. Sedimentary feature geometries related to earthquake magnitude can be used to improve the accuracy of seismic hazard assessments. Therefore, this manuscript combines multiparametric approaches and shows: (i) how a 3-D visualisation of palaeoseismic trench stratigraphy and logging can be accomplished by combining t-LiDAR and GPR techniques, and (ii) how a multispectral digital analysis can offer additional advantages and a higher objectivity in the interpretation of palaeoseismic and stratigraphic information. The multispectral datasets are stored, allowing unbiased input for future (re-)investigations.
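    Iso-cluster analysis is an unsupervised pixel classification; a minimal k-means sketch over hypothetical four-band (R, G, B, NIR) pixel vectors conveys the idea. This is illustrative only; the authors' workflow presumably used dedicated remote-sensing software.

```python
def kmeans(pixels, k, iters=20):
    """Tiny k-means over multispectral pixel vectors. Deterministic seeding
    from the first k pixels keeps the sketch reproducible."""
    centers = [list(p) for p in pixels[:k]]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            # Assign each pixel to the nearest center (squared distance).
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # Recompute each center as the mean of its cluster.
                centers[i] = [sum(c) / len(cl) for c in zip(*cl)]
    return centers

# Two hypothetical spectral populations (e.g. two sedimentary units).
pixels = [(12, 10, 9, 11), (200, 198, 205, 199), (11, 9, 10, 12),
          (198, 202, 201, 200), (10, 11, 11, 10), (201, 199, 200, 202)]
print(kmeans(pixels, 2))
```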

  20. Integrated core-log petrofacies analysis in the construction of a reservoir geomodel: A case study of a mature Mississippian carbonate reservoir using limited data

    USGS Publications Warehouse

    Bhattacharya, S.; Doveton, J.H.; Carr, T.R.; Guy, W.R.; Gerlach, P.M.

    2005-01-01

    Small independent operators produce most of the Mississippian carbonate fields in the United States mid-continent, where a lack of integrated characterization studies precludes maximization of hydrocarbon recovery. This study uses integrative techniques to leverage extant data in an Osagian and Meramecian (Mississippian) cherty carbonate reservoir in Kansas. Available data include petrophysical logs of varying vintages, a limited number of cores, and production histories from each well. A consistent set of assumptions was used to extract well-level porosity and initial saturations from logs of different types and vintages to build a geomodel. Lacking regularly recorded well shut-in pressures, an iterative technique based on material-balance formulations was used to estimate an average reservoir-pressure decline that matched available drillstem test data and validated the log-analysis assumptions. Core plugs representing the principal reservoir petrofacies provide critical inputs for characterization and simulation studies. However, assigning plugs among multiple reservoir petrofacies is difficult in complex (carbonate) reservoirs. In a bottom-up approach, raw capillary pressure (Pc) data were plotted on the Super-Pickett plot, and log- and core-derived saturation-height distributions were reconciled to group plugs by facies, to identify core plugs representative of the principal reservoir facies, and to discriminate facies in the logged interval. Pc data from representative core plugs were used for effective pay evaluation to estimate water cut from completions in infill and producing wells and to guide selective perforations for economic exploitation of mature fields. The results from this study were used to drill 22 infill wells. The techniques demonstrated here can be applied in other fields and reservoirs. Copyright © 2005 The American Association of Petroleum Geologists. All rights reserved.

  1. Application of advanced geophysical logging methods in the characterization of a fractured-sedimentary bedrock aquifer, Ventura County, California

    USGS Publications Warehouse

    Williams, John H.; Lane, John W.; Singha, Kamini; Haeni, F. Peter

    2002-01-01

    An integrated suite of advanced geophysical logging methods was used to characterize the geology and hydrology of three boreholes completed in fractured-sedimentary bedrock in Ventura County, California. The geophysical methods included caliper, gamma, electromagnetic induction, borehole deviation, optical and acoustic televiewer, borehole radar, fluid resistivity, temperature, and electromagnetic flowmeter. The geophysical logging 1) provided insights useful for the overall geohydrologic characterization of the bedrock and 2) enhanced the value of information collected by other methods from the boreholes including core-sample analysis, multiple-level monitoring, and packer testing. The logged boreholes, which have open intervals of 100 to 200 feet, penetrate a sequence of interbedded sandstone and mudstone with bedding striking 220 to 250 degrees and dipping 15 to 40 degrees to the northwest. Fractures intersected by the boreholes include fractures parallel to bedding and fractures with variable strike that dip moderately to steeply. Two to three flow zones were detected in each borehole. The flow zones consist of bedding-parallel or steeply dipping fractures or a combination of bedding-parallel fractures and moderately to steeply dipping fractures. About 75 to more than 90 percent of the measured flow under pumped conditions was produced by only one of the flow zones in each borehole.

  2. Human Factors and Data Logging Processes With the Use of Advanced Technology for Adults With Type 1 Diabetes: Systematic Integrative Review

    PubMed Central

    Martin, Clare; Franklin, Rachel; Duce, David; Harrison, Rachel

    2018-01-01

    Background People with type 1 diabetes (T1D) undertake self-management to prevent short and long-term complications. Advanced technology potentially supports such activities but requires consideration of psychological and behavioral constructs and usability issues. Economic factors and health care provider capacity influence access and uptake of advanced technology. Previous reviews have focused upon clinical outcomes or were descriptive or have synthesized studies on adults with those on children and young people where human factors are different. Objective This review described and examined the relationship between human factors and adherence with technology for data logging processes in adults with T1D. Methods A systematic literature search was undertaken by using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Quality appraisal was undertaken and data were abstracted and categorized into the themes that underpinned the human factor constructs that were examined. Results A total of 18 studies were included. A total of 6 constructs emerged from the data analysis: the relationship between adherence to data logging and measurable outcomes; satisfaction with the transition to advanced technology for self-management; use of advanced technology and time spent on diabetes-related activities; strategies to mediate the complexities of diabetes and the use of advanced technology; cognition in the wild; and meanings, views, and perspectives from the users of technology. Conclusions Increased treatment satisfaction was found on transition from traditional to advanced technology use—insulin pump and continuous glucose monitoring (CGM); the most significant factor was when blood glucose levels were consistently <7.00 mmol/L (P ≤.01). Participants spent considerable time on their diabetes self-care. 
Logging of data was positively correlated with increasing age when using an app that provided meaningful feedback (regression coefficient=55.8 recordings/year; P ≤.01). There were benefits of CGM for older people in mediating complexities and fears of hypoglycemia, with significant differences in well-being (P ≤.001). Qualitative studies explored the contextual use and uptake of technology. The results suggested frustrations with CGM, continuous subcutaneous insulin infusion, calibration of devices, and alarms. There were also implications for "body image" and for the way in which "significant others" influenced the behavior and attitude of the individual toward technology use. There were wide variations in the normal use of and interaction with technology across a continuum of sociocultural contexts, which has implications for the way in which future technologies should be designed. Quantitative studies were limited by small sample sizes, making it difficult to generalize findings to other contexts. This was further limited by a sample that was predominantly white, well-controlled, and engaged with self-care. The use of critical appraisal frameworks demonstrated where research into human factors and data logging processes of individuals could be improved. This included engaging people in the design of the technology, especially hard-to-reach or marginalized groups. PMID:29535079

  3. Effectiveness of streambank-stabilization techniques along the Kenai River, Alaska

    USGS Publications Warehouse

    Dorava, Joseph M.

    1999-01-01

    The Kenai River in southcentral Alaska is the State's most popular sport fishery and an economically important salmon river that generates as much as $70 million annually. Boatwake-induced streambank erosion and the associated damage to riparian and riverine habitat present a potential threat to this fishery. Bank-stabilization techniques commonly in use along the Kenai River were selected for evaluation of their effectiveness at attenuating boatwakes and retarding streambank erosion. Spruce trees cabled to the bank and biodegradable man-made logs (called 'bio-logs') pinned to the bank were tested because they are commonly used techniques along the river. These two techniques were compared for their ability to reduce wake heights that strike the bank and to reduce erosion of bank material, as well as for the amount and quality of habitat they provide for juvenile chinook salmon. Additionally, an engineered bank-stabilization project was evaluated because this method of bank protection is being encouraged by managers of the river. During a test that included 20 controlled boat passes, the spruce trees and the bio-log provided a similar reduction in boatwake height and bank erosion; however, the spruce trees provided a greater amount of protective habitat than the bio-log. The engineered bank-stabilization project eroded less during nine boat passes and provided more protective cover than the adjacent unprotected natural bank. Features of the bank-stabilization techniques, such as tree limbs and willow plantings that extended into the water from the bank, attenuated the boatwakes, which helped reduce erosion. These features also provided protective cover to juvenile salmon.

  4. New Factorization Techniques and Parallel O(log N) Algorithms for Forward Dynamics Solution of Single Closed-Chain Robot Manipulators

    NASA Technical Reports Server (NTRS)

    Fijany, Amir

    1993-01-01

    In this paper, parallel O(log N) algorithms for the dynamic simulation of a single closed-chain rigid multibody system, specialized to the case of a robot manipulator in contact with the environment, are developed.
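    The abstract gives no algorithmic detail; parallel O(log N) dynamics algorithms of this kind generally build on log-depth parallel prefix (scan) operations over spatial-operator recursions. A serial simulation of the generic scan primitive (not Fijany's specific method):

```python
def parallel_prefix_sum(x):
    """Hillis-Steele inclusive scan: O(log N) parallel steps, simulated
    serially. On a parallel machine each list comprehension below would be
    one simultaneous step across all N elements."""
    y = list(x)
    step = 1
    while step < len(y):
        y = [y[i] + (y[i - step] if i >= step else 0) for i in range(len(y))]
        step *= 2  # Doubling the stride gives the log-depth recursion.
    return y

print(parallel_prefix_sum([1, 2, 3, 4]))  # → [1, 3, 6, 10]
```

    In multibody dynamics the elementwise `+` is replaced by composition of link-to-link spatial transforms, but the log-depth structure is the same.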

  5. NMR Parameters Determination through ACE Committee Machine with Genetic Implanted Fuzzy Logic and Genetic Implanted Neural Network

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa; Gholami, Amin

    2015-06-01

    Free fluid porosity and rock permeability, undoubtedly the most critical parameters of a hydrocarbon reservoir, can be obtained by processing nuclear magnetic resonance (NMR) logs. Unlike conventional well logs (CWLs), NMR logging is very expensive and time-consuming. Therefore, the idea of synthesizing NMR logs from CWLs holds great appeal for reservoir engineers. For this purpose, three optimization strategies are followed. Firstly, an artificial neural network (ANN) is optimized by virtue of a hybrid genetic algorithm-pattern search (GA-PS) technique; then fuzzy logic (FL) is optimized by means of GA-PS; and eventually an alternating conditional expectation (ACE) model is constructed using the concept of a committee machine to combine the outputs of the optimized and non-optimized FL and ANN models. Results indicated that optimization of the traditional ANN and FL models using the GA-PS technique significantly enhances their performance. Furthermore, the ACE committee of the aforementioned models produces more accurate and reliable results than any single model performing alone.
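    The committee-machine idea of combining expert outputs can be illustrated with a simple least-squares linear combiner. The paper's ACE combiner is nonlinear; this linear stand-in, with invented data, is only a sketch of the principle.

```python
def committee_weights(pred_a, pred_b, target):
    """Fit weights (w_a, w_b, bias) minimizing the squared error of
    w_a*a + w_b*b + bias against the target, via the 3x3 normal equations.
    A simplified linear committee; ACE fits nonlinear transforms instead."""
    n = len(target)
    cols = [list(pred_a), list(pred_b), [1.0] * n]  # design columns [a, b, 1]
    A = [[sum(ci * cj for ci, cj in zip(c1, c2)) for c2 in cols] for c1 in cols]
    rhs = [sum(ci * t for ci, t in zip(c, target)) for c in cols]
    # Solve by Gauss-Jordan elimination (no pivoting; fine for this sketch).
    for i in range(3):
        piv = A[i][i]
        A[i] = [v / piv for v in A[i]]
        rhs[i] /= piv
        for j in range(3):
            if j != i:
                f = A[j][i]
                A[j] = [vj - f * vi for vj, vi in zip(A[j], A[i])]
                rhs[j] -= f * rhs[i]
    return rhs  # [w_a, w_b, bias]

# Invented expert outputs whose target is exactly 2*a + 3*b + 1.
w = committee_weights([1, 2, 3, 4], [0, 1, 0, 1], [3, 8, 7, 12])
print([round(v, 6) for v in w])  # → [2.0, 3.0, 1.0]
```

    In the paper's setting, `pred_a` and `pred_b` would be the ANN and FL estimates of an NMR-derived parameter, and `target` the measured NMR log over a training interval.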

  6. Genesis analysis of high-gamma ray sandstone reservoir and its log evaluation techniques: a case study from the Junggar basin, northwest China.

    PubMed

    Wang, Liang; Mao, Zhiqiang; Sun, Zhongchun; Luo, Xingping; Song, Yong; Liu, Zhen

    2013-01-01

    In the Junggar basin, northwest China, many high gamma-ray (GR) sandstone reservoirs are found and routinely misinterpreted as mudstone non-reservoirs, with negative implications for the exploration and exploitation of oil and gas. Accordingly, the recognition principles, genesis, and log evaluation techniques of high GR sandstone reservoirs are systematically studied. Studies show that sandstone reservoirs with an apparent shale content greater than 50% and a GR value higher than 110 API can be regarded as high GR sandstone reservoirs. The high GR response is mainly and directly caused by abnormally high uranium enrichment, not by tuff, feldspar, or clay minerals. Affected by the formation's high water sensitivity and poor borehole quality, conventional logs can neither recognize these reservoirs nor evaluate their physical properties. Nuclear magnetic resonance (NMR) logging is therefore proposed and proved useful for reservoir recognition and physical property evaluation.

  7. Genesis Analysis of High-Gamma Ray Sandstone Reservoir and Its Log Evaluation Techniques: A Case Study from the Junggar Basin, Northwest China

    PubMed Central

    Wang, Liang; Mao, Zhiqiang; Sun, Zhongchun; Luo, Xingping; Song, Yong; Liu, Zhen

    2013-01-01

    In the Junggar basin, northwest China, many high gamma-ray (GR) sandstone reservoirs are found and routinely misinterpreted as mudstone non-reservoirs, with negative implications for the exploration and exploitation of oil and gas. Accordingly, the recognition principles, genesis, and log evaluation techniques of high GR sandstone reservoirs are systematically studied. Studies show that sandstone reservoirs with an apparent shale content greater than 50% and a GR value higher than 110 API can be regarded as high GR sandstone reservoirs. The high GR response is mainly and directly caused by abnormally high uranium enrichment, not by tuff, feldspar, or clay minerals. Affected by the formation's high water sensitivity and poor borehole quality, conventional logs can neither recognize these reservoirs nor evaluate their physical properties. Nuclear magnetic resonance (NMR) logging is therefore proposed and proved useful for reservoir recognition and physical property evaluation. PMID:24078797

  8. Proposed standard-weight equations for brook trout

    USGS Publications Warehouse

    Hyatt, M.W.; Hubert, W.A.

    2001-01-01

    Weight and length data were obtained for 113 populations of brook trout Salvelinus fontinalis across the species' geographic range in North America to estimate a standard-weight (Ws) equation for this species. Estimation was done by applying the regression-line-percentile technique to fish of 120-620 mm total length (TL). The proposed metric-unit (g and mm) equation is log10Ws = -5.186 + 3.103 log10TL; the English-unit (lb and in) equivalent is log10Ws = -3.483 + 3.103 log10TL. No systematic length bias was evident in the relative-weight values calculated from these equations.
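The proposed metric equation can be applied directly; the sketch below also computes the relative weight (Wr = 100·W/Ws) mentioned in the abstract. The example length and weight are hypothetical:

```python
import math

def standard_weight_g(total_length_mm):
    """Proposed metric standard weight for brook trout:
    log10(Ws) = -5.186 + 3.103 * log10(TL), valid for 120-620 mm TL."""
    return 10 ** (-5.186 + 3.103 * math.log10(total_length_mm))

def relative_weight(weight_g, total_length_mm):
    """Relative weight Wr = 100 * W / Ws, the usual condition index."""
    return 100.0 * weight_g / standard_weight_g(total_length_mm)
```

A 300 mm brook trout has a standard weight of roughly 317 g under this equation; a fish of that length weighing less scores a relative weight below 100.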

  9. Review of hydraulic fracture mapping using advanced accelerometer-based receiver systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.; Uhl, J.E.; Engler, B.P.

    Hydraulic fracturing is an important tool for natural gas and oil exploitation, but its optimization has been impeded by an inability to observe how the fracture propagates and what its overall dimensions are. The few experiments in which fractures have been exposed through coring or mineback have shown that hydraulic fractures are complicated multi-stranded structures that may behave much differently than currently predicted by models. It is clear that model validation, fracture optimization, problem identification and solution, and field development have all been encumbered by the absence of any ground truth information on fracture behavior in field applications. The solution to this problem is to develop techniques to image the hydraulic fracture in situ from either the surface, the treatment well, or offset wells. Several diagnostic techniques have been available to assess individual elements of the fracture geometry, but most of these techniques have limitations on their usefulness. For example, tracers and temperature logs can only measure fracture height at the wellbore, well testing and production history matching provide a productive length which may or may not be different from the true fracture length, and tiltmeters can provide accurate information on azimuth and type of fracture (horizontal or vertical), but length and height can only be extracted from a non-unique inversion of the data. However, there is a method, the microseismic technique, which possesses the potential for imaging the entire hydraulic fracture and, more importantly, its growth history. This paper discusses application of advanced technology to the microseismic method in order to provide detailed accurate images of fractures and their growth processes.

  10. Using Log Linear Analysis for Categorical Family Variables.

    ERIC Educational Resources Information Center

    Moen, Phyllis

    The Goodman technique of log linear analysis is ideal for family research, because it is designed for categorical (non-quantitative) variables. Variables are dichotomized (for example, married/divorced, childless/with children) or otherwise categorized (for example, level of permissiveness, life cycle stage). Contingency tables are then…

  11. Method of assaying uranium with prompt fission and thermal neutron borehole logging adjusted by borehole physical characteristics

    DOEpatents

    Barnard, Ralston W.; Jensen, Dal H.

    1982-01-01

    Uranium formations are assayed by prompt fission neutron logging techniques. The uranium in the formation is proportional to the ratio of epithermal counts to thermal or epithermal dieaway. Various calibration factors enhance the accuracy of the measurement.
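The stated proportionality can be expressed as a one-line sketch; the factor names below are illustrative placeholders, not terms from the patent:

```python
def uranium_index(epithermal_counts, dieaway, calibration=1.0, borehole_corr=1.0):
    """Assayed uranium taken as proportional to the ratio of epithermal
    counts to dieaway, scaled by calibration and borehole-correction
    factors (placeholder names; the patent's factors are not specified here)."""
    return calibration * borehole_corr * epithermal_counts / dieaway
```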

  12. A New Approach to Logging.

    ERIC Educational Resources Information Center

    Miles, Donna

    2001-01-01

    In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…

  13. Novel medium-throughput technique for investigating drug-cyclodextrin complexation by pH-metric titration using the partition coefficient method.

    PubMed

    Dargó, Gergő; Boros, Krisztina; Péter, László; Malanga, Milo; Sohajda, Tamás; Szente, Lajos; Balogh, György T

    2018-05-05

    The present study aimed to develop a medium-throughput screening technique for investigating cyclodextrin (CD)-active pharmaceutical ingredient (API) complexes. Dual-phase potentiometric lipophilicity measurement, the gold-standard technique, was combined with the partition coefficient method (plotting the reciprocal of the partition coefficients of APIs as a function of CD concentration). A general equation was derived for determining the stability constants of 1:1 CD-API complexes (K1:1,CD) based solely on the changes of the neutral-species partition coefficients (logPo/w - logPapp), without measurement of the actual API concentrations. The experimentally determined logP value (-1.64) of 6-deoxy-6-[(5/6)-fluoresceinylthioureido]-HPBCD (FITC-NH-HPBCD) was used to estimate the logP value (approximately -2.5 to -3) of (2-hydroxypropyl)-β-cyclodextrin (HPBCD). The results suggested that the amount of HPBCD in the octanol phase can be considered inconsequential. The decrease of octanol volume due to octanol-CD complexation was also considered; thus a corrected octanol-water phase ratio was introduced. The K1:1,CD values obtained by the developed method showed good accordance with the results of other, orthogonal methods. Copyright © 2018 Elsevier B.V. All rights reserved.
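Under the simplifying assumption that the CD-API complex stays entirely in the aqueous phase (ignoring the octanol-volume correction the study introduces), the apparent partition coefficient follows P_app = P0/(1 + K·[CD]), so K can be recovered from the logP shift alone. A hypothetical sketch:

```python
def stability_constant(logP0, logPapp, cd_conc):
    """K(1:1) in M^-1 from the drop in apparent partition coefficient.

    Assumes a 1:1 complex confined to the aqueous phase, so that
    P_app = P0 / (1 + K*[CD])  =>  K = (10**(logP0 - logPapp) - 1) / [CD].
    cd_conc is the cyclodextrin concentration in mol/L.
    """
    return (10 ** (logP0 - logPapp) - 1.0) / cd_conc
```

A one-unit logP drop at 9 mM CD, for example, corresponds to K of about 1000 M^-1.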

  14. Comparison of long-term results between laparoscopy-assisted gastrectomy and open gastrectomy with D2 lymph node dissection for advanced gastric cancer.

    PubMed

    Hamabe, Atsushi; Omori, Takeshi; Tanaka, Koji; Nishida, Toshirou

    2012-06-01

    Laparoscopy-assisted gastrectomy (LAG) has been established as a low-invasive surgery for early gastric cancer. However, it remains unknown whether it is applicable also for advanced gastric cancer, mainly because the long-term results of LAG with D2 lymph node dissection for advanced gastric cancer have not been well validated compared with open gastrectomy (OG). A retrospective cohort study was performed to compare LAG and OG with D2 lymph node dissection. For this study, 167 patients (66 LAG and 101 OG patients) who underwent gastrectomy with D2 lymph node dissection for advanced gastric cancer were reviewed. Recurrence-free survival and overall survival time were estimated using Kaplan-Meier curves. Stratified log-rank statistical evaluation was used to compare the difference between the LAG and OG groups stratified by histologic type, pathologic T status, N status, and postoperative adjuvant chemotherapy. The adjusted Cox proportional hazards regression models were used to calculate the hazard ratios (HRs) of LAG. The 5-year recurrence-free survival rate was 89.6% in the LAG group and 75.8% in the OG group (nonsignificant difference; stratified log-rank statistic, 3.11; P = 0.0777). The adjusted HR of recurrence for LAG compared with OG was 0.389 [95% confidence interval (CI) 0.131-1.151]. The 5-year overall survival rate was 94.4% in the LAG group and 78.5% in the OG group (nonsignificant difference; stratified log-rank statistic, 0.4817; P = 0.4877). The adjusted HR of death for LAG compared with OG was 0.633 (95% CI 0.172-2.325). The findings show that LAG with D2 lymph node dissection is acceptable in terms of long-term results for advanced gastric cancer cases and may be applicable for advanced gastric cancer treatment.
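The recurrence-free survival rates quoted above come from Kaplan-Meier product-limit estimates; a minimal sketch of that estimator on hypothetical follow-up data (times in months, with 1 = event observed, 0 = censored):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times: follow-up times; events: 1 = event observed (recurrence/death),
    0 = censored. Returns [(time, S(time)), ...] at each event time.
    """
    pairs = sorted(zip(times, events))
    n = len(pairs)
    s, removed, curve = 1.0, 0, []
    for t in sorted(set(times)):
        at_risk = n - removed                                  # still under observation
        d = sum(1 for tt, e in pairs if tt == t and e == 1)    # events at t
        c = sum(1 for tt, _ in pairs if tt == t)               # all leaving at t
        if d:
            s *= 1.0 - d / at_risk
            curve.append((t, s))
        removed += c
    return curve
```

On the toy input below the curve steps down to 0.75 at t = 1 and 0.375 at t = 3; the censored subjects at t = 2 and t = 4 shrink the risk set without contributing an event.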

  15. An evaluation of borehole flowmeters used to measure horizontal ground-water flow in limestones of Indiana, Kentucky, and Tennessee, 1999

    USGS Publications Warehouse

    Wilson, John T.; Mandell, Wayne A.; Paillet, Frederick L.; Bayless, E. Randall; Hanson, Randall T.; Kearl, Peter M.; Kerfoot, William B.; Newhouse, Mark W.; Pedler, William H.

    2001-01-01

    Three borehole flowmeters and hydrophysical logging were used to measure ground-water flow in carbonate bedrock at sites in southeastern Indiana and on the west-central border of Kentucky and Tennessee. The three flowmeters make point measurements of the direction and magnitude of horizontal flow, and hydrophysical logging measures the magnitude of horizontal flow over an interval. The directional flowmeters evaluated include a horizontal heat-pulse flowmeter, an acoustic Doppler velocimeter, and a colloidal borescope flowmeter. Each method was used to measure flow in selected zones where previous geophysical logging had indicated water-producing beds, bedding planes, or other permeable features that made conditions favorable for horizontal-flow measurements. Background geophysical logging indicated that ground-water production from the Indiana test wells was characterized by inflow from a single, 20-foot-thick limestone bed. The Kentucky/Tennessee test wells produced water from one or more bedding planes where geophysical logs indicated the bedding planes had been enlarged by dissolution. Two of the three test wells at the latter site contained measurable vertical flow between two or more bedding planes under ambient hydraulic head conditions. Field measurements and data analyses for each flow-measurement technique were completed by a developer of the technology or by a contractor with extensive experience in the application of that specific technology. Comparison of the horizontal-flow measurements indicated that the three point-measurement techniques rarely measured the same velocities and flow directions at the same measurement stations. Repeat measurements at selected depth stations also failed to consistently reproduce either flow direction, flow magnitude, or both. At a few test stations, two of the techniques provided similar flow magnitude or direction but usually not both.
Some of this variability may be attributed to naturally occurring changes in hydraulic conditions during the 1-month study period in August and September 1999. The actual velocities and flow directions are unknown; therefore, it is uncertain which technique provided the most accurate measurements of horizontal flow in the boreholes and which measurements were most representative of flow in the aquifers. The horizontal heat-pulse flowmeter consistently yielded flow magnitudes considerably less than those provided by the acoustic Doppler velocimeter and colloidal borescope. The design of the horizontal heat-pulse flowmeter compensates for the local acceleration of ground-water velocity in the open borehole. The magnitudes of the velocities estimated from hydrophysical logging were comparable to those of the horizontal heat-pulse flowmeter, presumably because the hydrophysical logging also effectively compensates for the effect of the borehole on the flow field and averages velocity over a length of borehole rather than at a point. The acoustic Doppler velocimeter and colloidal borescope have discrete sampling points that allow for measuring preferential flow velocities that can be substantially higher than the average velocity through a length of borehole. The acoustic Doppler velocimeter and colloidal borescope also measure flow at the center of the borehole where the acceleration of the flow field should be greatest. Of the three techniques capable of measuring direction and magnitude of horizontal flow, only the acoustic Doppler velocimeter measured vertical flow. The acoustic Doppler velocimeter consistently measured downward velocity in all test wells. This apparent downward flow was attributed, in part, to particles falling through the water column as a result of mechanical disturbance during logging. Hydrophysical logging yielded estimates of vertical flow in the Kentucky/Tennessee test wells.
In two of the test wells, the hydrophysical logging involved deliberate isolation of water-producing bedding planes with a packer to ensure that small horizontal flow could be quantified without the presence of vertical flow. The presence of vertical flow in the Kentucky/Tennessee test wells may preclude the definitive measurement of horizontal flow without the use of effective packer devices. None of the point-measurement techniques used a packer, but each technique used baffle devices to help suppress the vertical flow. The effectiveness of these baffle devices is not known; therefore, the effect of vertical flow on the measurements cannot be quantified. The general lack of agreement among the point-measurement techniques in this study highlights the difficulty of using measurements at a single depth point in a borehole to characterize the average horizontal flow in a heterogeneous aquifer. The effective measurement of horizontal flow may depend on the precise depth at which measurements are made, and the measurements at a given depth may vary over time as hydraulic head conditions change. The various measurements also demonstrate that the magnitude and possibly the direction of horizontal flow are affected by the presence of the open borehole. Although there is a lack of agreement among the measurement techniques, these results could mean that effective characterization of horizontal flow in heterogeneous aquifers might be possible if data from many depth stations and from repeat measurements can be averaged over an extended time period. Complications related to vertical flow in the borehole highlight the importance of using background logging methods like vertical flowmeters or hydrophysical logging to characterize the borehole environment before horizontal-flow measurements are attempted. If vertical flow is present, a packer device may be needed to acquire definitive measurements of horizontal flow.
Because hydrophysical logging provides a complete depth profile of the borehole, a strength of this technique is in identifying horizontal- and vertical-flow zones in a well. Hydrophysical logging may be most applicable as a screening method. Horizontal-flow zones identified with the hydrophysical logging then could be evaluated with one of the point-measurement techniques for quantifying preferential flow zones and flow directions. Additional research is needed to determine how measurements of flow in boreholes relate to flow in bedrock aquifers. The flowmeters may need to be evaluated under controlled laboratory conditions to determine which of the methods accurately measure ground-water velocities and flow directions. Additional research also is needed to investigate variations in flow direction with time, daily changes in velocity, velocity corrections for fractured bedrock aquifers and unconsolidated aquifers, and directional differences in individual wells for hydraulically separated flow zones.

  16. Method of assaying uranium with prompt fission and thermal neutron borehole logging adjusted by borehole physical characteristics. [Patent application]

    DOEpatents

    Barnard, R.W.; Jensen, D.H.

    1980-11-05

    Uranium formations are assayed by prompt fission neutron logging techniques. The uranium in the formation is proportional to the ratio of epithermal counts to thermal or epithermal dieaway. Various calibration factors enhance the accuracy of the measurement.

  17. Reviews Equipment: Data logger Book: Imagined Worlds Equipment: Mini data loggers Equipment: PICAXE-18M2 data logger Books: Engineering: A Very Short Introduction and To Engineer Is Human Book: Soap, Science, & Flat-Screen TVs Equipment: uLog and SensorLab Web Watch

    NASA Astrophysics Data System (ADS)

    2012-07-01

    WE RECOMMEND Data logger Fourier NOVA LINK: data logging and analysis To Engineer is Human Engineering: essays and insights Soap, Science, & Flat-Screen TVs People, politics, business and science overlap uLog sensors and sensor adapter A new addition to the LogIT range offers simplicity and ease of use WORTH A LOOK Imagined Worlds Socio-scientific predictions for the future Mini light data logger and mini temperature data logger Small-scale equipment for schools SensorLab Plus LogIT's supporting software, with extra features HANDLE WITH CARE CAXE110P PICAXE-18M2 data logger Data logger 'on view' but disappoints Engineering: A Very Short Introduction A broad-brush treatment fails to satisfy WEB WATCH Two very different websites for students: advanced physics questions answered and a more general BBC science resource

  18. Efficacy of Neutral Electrolyzed Water, Quaternary Ammonium and Lactic Acid-Based Solutions in Controlling Microbial Contamination of Food Cutting Boards Using a Manual Spraying Technique.

    PubMed

    Al-Qadiri, Hamzah M; Ovissipour, Mahmoudreza; Al-Alami, Nivin; Govindan, Byju N; Shiroodi, Setareh Ghorban; Rasco, Barbara

    2016-05-01

    Bactericidal activity of neutral electrolyzed water (NEW), quaternary ammonium (QUAT), and lactic acid-based solutions was investigated using a manual spraying technique against Salmonella Typhimurium, Escherichia coli O157:H7, Campylobacter jejuni, Listeria monocytogenes, and Staphylococcus aureus that were inoculated onto the surface of scarred polypropylene and wooden food cutting boards. Antimicrobial activity was also examined when using cutting boards in preparation of raw chopped beef, chicken tenders or salmon fillets. Viable counts of survivors were determined as log10 CFU/100 cm² within 0 (untreated control), 1, 3, and 5 min of treatment at ambient temperature. Within the first minute of treatment, NEW and QUAT solutions caused more than 3 log10 bacterial reductions on polypropylene surfaces whereas less than 3 log10 reductions were achieved on wooden surfaces. After 5 min of treatment, more than 5 log10 reductions were achieved for all bacterial strains inoculated onto polypropylene surfaces. Using NEW and QUAT solutions within 5 min reduced Gram-negative bacteria by 4.58 to 4.85 log10 compared to more than 5 log10 reductions in Gram-positive bacteria inoculated onto wooden surfaces. Lactic acid treatment was significantly less effective (P < 0.05) compared to NEW and QUAT treatments. A decline in antimicrobial effectiveness was observed (0.5 to <2 log10 reductions were achieved within the first minute) when both cutting board types were used to prepare raw chopped beef, chicken tenders or salmon fillets. © 2016 Institute of Food Technologists®
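The reported reductions are straightforward log10 differences between counts before and after treatment; a small sketch with hypothetical CFU counts:

```python
import math

def log10_reduction(cfu_before, cfu_after, detection_limit=1.0):
    """Log10 reduction = log10(N0) - log10(N); counts at or below the
    detection limit are clamped so the logarithm stays defined."""
    n0 = max(cfu_before, detection_limit)
    n = max(cfu_after, detection_limit)
    return math.log10(n0 / n)
```

For example, a drop from 10^7 to 10^2 CFU per 100 cm² is a 5-log10 reduction.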

  19. Use of NMR logging to obtain estimates of hydraulic conductivity in the High Plains aquifer, Nebraska, USA

    USGS Publications Warehouse

    Dlubac, Katherine; Knight, Rosemary; Song, Yi-Qiao; Bachman, Nate; Grau, Ben; Cannia, Jim; Williams, John

    2013-01-01

    Hydraulic conductivity (K) is one of the most important parameters of interest in groundwater applications because it quantifies the ease with which water can flow through an aquifer material. Hydraulic conductivity is typically measured by conducting aquifer tests or wellbore flow (WBF) logging. Of interest in our research is the use of proton nuclear magnetic resonance (NMR) logging to obtain information about water-filled porosity and pore space geometry, the combination of which can be used to estimate K. In this study, we acquired a suite of advanced geophysical logs, aquifer tests, WBF logs, and sidewall cores at the field site in Lexington, Nebraska, which is underlain by the High Plains aquifer. We first used two empirical equations developed for petroleum applications to predict K from NMR logging data: the Schlumberger Doll Research equation (KSDR) and the Timur-Coates equation (KT-C), with the standard empirical constants determined for consolidated materials. We upscaled our NMR-derived K estimates to the scale of the WBF-logging K (KWBF-logging) estimates for comparison. All the upscaled KT-C estimates were within an order of magnitude of KWBF-logging and all of the upscaled KSDR estimates were within 2 orders of magnitude of KWBF-logging. We optimized the fit between the upscaled NMR-derived K and KWBF-logging estimates to determine a set of site-specific empirical constants for the unconsolidated materials at our field site. We conclude that reliable estimates of K can be obtained from NMR logging data, thus providing an alternate method for obtaining estimates of K at high levels of vertical resolution.
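The two empirical predictors named in the abstract have well-known textbook forms, sketched below. The constants (C = 4 for SDR, C = 0.1 for Timur-Coates with porosity as a fraction) are illustrative defaults for consolidated rock, not the site-specific values the study fits:

```python
def k_sdr(phi, t2ml_ms, c=4.0):
    """Schlumberger-Doll Research form: k = C * phi**4 * T2ML**2
    (k in mD, phi as a fraction, T2ML = log-mean T2 in ms)."""
    return c * phi ** 4 * t2ml_ms ** 2

def k_timur_coates(phi, ffi, bvi, c=0.1):
    """Timur-Coates form: k = (phi/C)**4 * (FFI/BVI)**2, with FFI/BVI the
    free-fluid to bound-fluid volume ratio from the T2 distribution."""
    return (phi / c) ** 4 * (ffi / bvi) ** 2
```

Both return permeability, which is then converted to hydraulic conductivity with fluid properties; calibrating c against KWBF-logging is the site-specific fitting step the study describes.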

  20. High-voltage supply for neutron tubes in well-logging applications

    DOEpatents

    Humphreys, D.R.

    1982-09-15

    A high voltage supply is provided for a neutron tube used in well logging. The biased pulse supply of the invention combines DC and full pulse techniques and produces a target voltage comprising a substantial negative DC bias component on which is superimposed a pulse whose negative peak provides the desired negative voltage level for the neutron tube. The target voltage is preferably generated using voltage doubling techniques and employing a voltage source which generates bipolar pulse pairs having an amplitude corresponding to the DC bias level.

  1. High voltage supply for neutron tubes in well logging applications

    DOEpatents

    Humphreys, D. Russell

    1989-01-01

    A high voltage supply is provided for a neutron tube used in well logging. The "biased pulse" supply of the invention combines DC and "full pulse" techniques and produces a target voltage comprising a substantial negative DC bias component on which is superimposed a pulse whose negative peak provides the desired negative voltage level for the neutron tube. The target voltage is preferably generated using voltage doubling techniques and employing a voltage source which generates bipolar pulse pairs having an amplitude corresponding to the DC bias level.

  2. Integrating surface and borehole geophysics in ground water studies - an example using electromagnetic soundings in south Florida

    USGS Publications Warehouse

    Paillet, Frederick; Hite, Laura; Carlson, Matthew

    1999-01-01

    Time domain surface electromagnetic soundings, borehole induction logs, and other borehole logging techniques are used to construct a realistic model for the shallow subsurface hydraulic properties of unconsolidated sediments in south Florida. Induction logs are used to calibrate surface induction soundings in units of pore water salinity by correlating water sample specific electrical conductivity with the electrical conductivity of the formation over the sampled interval for a two‐layered aquifer model. Geophysical logs are also used to show that a constant conductivity layer model is appropriate for the south Florida study. Several physically independent log measurements are used to quantify the dependence of formation electrical conductivity on such parameters as salinity, permeability, and clay mineral fraction. The combined interpretation of electromagnetic soundings and induction logs was verified by logging three validation boreholes, confirming quantitative estimates of formation conductivity and thickness in the upper model layer, and qualitative estimates of conductivity in the lower model layer.
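The calibration step, relating induction-log formation conductivity to sampled pore-water conductivity so that surface soundings can be read in salinity units, amounts to fitting a scale factor. A hypothetical sketch (a single proportionality constant per layer, a simplification of the layered-model interpretation described above):

```python
def fit_scale(formation_cond, water_cond):
    """Least-squares scale F in water_cond ~ F * formation_cond, i.e. the
    slope of a no-intercept regression of sampled pore-water conductivity
    on induction-log formation conductivity over the sampled interval."""
    num = sum(f * w for f, w in zip(formation_cond, water_cond))
    den = sum(f * f for f in formation_cond)
    return num / den

def pore_water_conductivity(sounding_cond, scale):
    """Convert surface-sounding conductivities to pore-water units."""
    return [scale * c for c in sounding_cond]
```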

  3. Advanced sorting technologies for optimal wood products and woody biomass utilization

    Treesearch

    Xiping Wang

    2012-01-01

    Forest materials represent great potential for advancing our goals in the 21st century for sustainable building, energy independence, and carbon sequestration. A critical component of an improved system for producing bioproducts and bioenergy from forest materials is the ability to sort trees, stems, and logs into end-product categories that represent their highest...

  4. Reassessing the Economic Value of Advanced Level Mathematics

    ERIC Educational Resources Information Center

    Adkins, Michael; Noyes, Andrew

    2016-01-01

    In the late 1990s, the economic return to Advanced level (A-level) mathematics was examined. The analysis was based upon a series of log-linear models of earnings in the 1958 National Child Development Survey (NCDS) and the National Survey of 1980 Graduates and Diplomates. The core finding was that A-level mathematics had a unique earnings premium…

  5. Using Log Variables in a Learning Management System to Evaluate Learning Activity Using the Lens of Activity Theory

    ERIC Educational Resources Information Center

    Park, Yeonjeong; Jo, Il-Hyun

    2017-01-01

    As the advance of learning technologies and analytics tools continues, learning management systems (LMSs) have been required to fulfil the growing expectations for smart learning. However, the reality regarding the level of technology integration in higher education differs considerably from such expectations or the speed of advances in…

  6. Redefining the Surgical Council of Resident Education (SCORE) Curriculum: A Comparison with the Operative Experiences of Graduated General Surgical Residents.

    PubMed

    Strosberg, David S; Quinn, Kristen M; Abdel-Misih, Sherif R; Harzman, Alan E

    2018-04-01

    Our objective was to quantify and classify the surgical operations performed by general surgery residents and compare these with the updated Surgical Council on Resident Education (SCORE) curriculum. We performed a retrospective review of logged surgical cases from general surgery residents who completed training at a single center from 2011 to 2015. The logged cases were correlated with the operations extracted from the SCORE curriculum. One hundred fifty-one procedures were examined; there were 98 "core" and 53 "advanced" cases as determined by the SCORE. Twenty-eight residents graduated with an average of 1017 major cases. Each resident completed 66 (67%) of the core cases and 17 (32%) of the advanced cases an average of one or more times, with 39 (40%) core cases and 6 (11%) advanced cases completed five or more times. Core procedures that are infrequently or never performed by residents should be identified in each program to focus resident education.

  7. Human Factors and Data Logging Processes With the Use of Advanced Technology for Adults With Type 1 Diabetes: Systematic Integrative Review.

    PubMed

    Waite, Marion; Martin, Clare; Franklin, Rachel; Duce, David; Harrison, Rachel

    2018-03-15

    People with type 1 diabetes (T1D) undertake self-management to prevent short- and long-term complications. Advanced technology potentially supports such activities but requires consideration of psychological and behavioral constructs and usability issues. Economic factors and health care provider capacity influence access and uptake of advanced technology. Previous reviews have focused upon clinical outcomes, were descriptive, or synthesized studies on adults together with those on children and young people, where human factors differ. This review described and examined the relationship between human factors and adherence to technology for data logging processes in adults with T1D. A systematic literature search was undertaken by using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Quality appraisal was undertaken and data were abstracted and categorized into the themes that underpinned the human factor constructs that were examined. A total of 18 studies were included. A total of 6 constructs emerged from the data analysis: the relationship between adherence to data logging and measurable outcomes; satisfaction with the transition to advanced technology for self-management; use of advanced technology and time spent on diabetes-related activities; strategies to mediate the complexities of diabetes and the use of advanced technology; cognition in the wild; and meanings, views, and perspectives from the users of technology. Increased treatment satisfaction was found on transition from traditional to advanced technology use (insulin pump and continuous glucose monitoring, CGM); the most significant factor was when blood glucose levels were consistently <7.00 mmol/L (P ≤.01). Participants spent considerable time on their diabetes self-care. Logging of data was positively correlated with increasing age when using an app that provided meaningful feedback (regression coefficient=55.8 recordings/year; P ≤.01). 
There were benefits of CGM for older people in mediating complexities and fears of hypoglycemia with significant differences in well-being (P ≤.001). Qualitative studies explored the contextual use and uptake of technology. The results suggested frustrations with CGM, continuous subcutaneous insulin infusion, calibration of devices, and alarms, as well as implications for "body image" and the way in which "significant others" influenced the behavior and attitude of the individual toward technology use. There were wide variations in the normal use of and interaction with technology across a continuum of sociocultural contexts, which has implications for the way in which future technologies should be designed. Quantitative studies were limited by small sample sizes, making it difficult to generalize findings to other contexts. This was further limited by a sample that was predominantly white, well-controlled, and engaged with self-care. The use of critical appraisal frameworks demonstrated where research into human factors and data logging processes of individuals could be improved. This included engaging people in the design of the technology, especially hard-to-reach or marginalized groups. ©Marion Waite, Clare Martin, Rachel Franklin, David Duce, Rachel Harrison. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 15.03.2018.

  8. Multicriteria evaluation of simulated logging scenarios in a tropical rain forest.

    PubMed

    Huth, Andreas; Drechsler, Martin; Köhler, Peter

    2004-07-01

    Forest growth models are useful tools for investigating the long-term impacts of logging. In this paper, the results of the rain forest growth model FORMIND were assessed by a multicriteria decision analysis. The main processes covered by FORMIND include tree growth, mortality, regeneration and competition. Tree growth is calculated based on a carbon balance approach. Trees compete for light and space; dying large trees fall down and create gaps in the forest. Sixty-four different logging scenarios for an initially undisturbed forest stand at Deramakot (Malaysia) were simulated. The scenarios differ regarding the logging cycle, logging method, cutting limit and logging intensity. We characterise the impacts with four criteria describing the yield, canopy opening and changes in species composition. Multicriteria decision analysis was used for the first time to evaluate the scenarios and identify the efficient ones. Our results plainly show that reduced-impact logging scenarios are more 'efficient' than the others, since in these scenarios forest damage is minimised without significantly reducing yield. Nevertheless, there is a trade-off between yield and achieving a desired ecological state of logged forest; the ecological state of the logged forests can only be improved by reducing yields and enlarging the logging cycles. Our study also demonstrates that high cutting limits or low logging intensities cannot compensate for the high level of damage caused by conventional logging techniques.

  9. A mixed-methods analysis of logging injuries in Montana and Idaho.

    PubMed

    Lagerstrom, Elise; Magzamen, Sheryl; Rosecrance, John

    2017-12-01

    Despite advances in mechanization, logging continues to be one of the most dangerous occupations in the United States. Logging in the Intermountain West region (Montana and Idaho) is especially hazardous due to steep terrain, extreme weather, and remote work locations. We implemented a mixed-methods approach combining analyses of workers' compensation claims and focus groups to identify factors associated with injuries and fatalities in the logging industry. Inexperienced workers (<6 months experience) accounted for over 25% of claims. Sprain/strain injuries were the most common, accounting for 36% of claims, while fatalities had the highest median claim cost ($274,411). Focus groups identified job tasks involving felling trees, skidding, and truck driving as having the highest risk. Injury prevention efforts should focus on training related to safe work methods (especially for inexperienced workers), the development of a safety culture and safety leadership, as well as implementation of engineering controls. © 2017 Wiley Periodicals, Inc.

  10. Recursive Fury: Conspiracist Ideation in the Blogosphere in Response to Research on Conspiracist Ideation

    PubMed Central

    Lewandowsky, Stephan; Cook, John; Oberauer, Klaus; Marriott, Michael

    2013-01-01

    Conspiracist ideation has been repeatedly implicated in the rejection of scientific propositions, although empirical evidence to date has been sparse. A recent study involving visitors to climate blogs found that conspiracist ideation was associated with the rejection of climate science and the rejection of other scientific propositions such as the link between lung cancer and smoking, and between HIV and AIDS (Lewandowsky et al., in press; LOG12 from here on). This article analyses the response of the climate blogosphere to the publication of LOG12. We identify and trace the hypotheses that emerged in response to LOG12 and that questioned the validity of the paper’s conclusions. Using established criteria to identify conspiracist ideation, we show that many of the hypotheses exhibited conspiratorial content and counterfactual thinking. For example, whereas hypotheses were initially narrowly focused on LOG12, some ultimately grew in scope to include actors beyond the authors of LOG12, such as university executives, a media organization, and the Australian government. The overall pattern of the blogosphere’s response to LOG12 illustrates the possible role of conspiracist ideation in the rejection of science, although alternative scholarly interpretations may be advanced in the future. PMID:23508808

  11. Differences in hepatic phenotype between hemochromatosis patients with HFE C282Y homozygosity and other HFE genotypes.

    PubMed

    Cheng, Raymond; Barton, James C; Morrison, Elizabeth D; Phatak, Pradyumna D; Krawitt, Edward L; Gordon, Stuart C; Kowdley, Kris V

    2009-07-01

    There are limited data comparing hepatic phenotype among hemochromatosis patients with different HFE genotypes. The goal of this study was to compare hepatic histopathologic features and hepatic iron concentration (HIC) among patients with phenotypic hemochromatosis and different HFE genotypes. We studied 182 US patients with phenotypic hemochromatosis. Degree of hepatic fibrosis, pattern of iron deposition, presence of steatosis or necroinflammation, and HIC were compared among different HFE genotypes. C282Y/H63D compound heterozygotes and patients with HFE genotypes other than C282Y/C282Y were more likely to have stainable Kupffer cell iron (31.1% vs. 9.5%; P=0.02), portal or lobular inflammation (28.9% vs. 15.6%; P=0.03), and steatosis (33.3% vs. 10.2%; P<0.01) on liver biopsy than C282Y homozygotes. Mean log10 HIC (P<0.05) and log10 ferritin (P<0.05) were higher among C282Y homozygotes than in patients with other HFE genotypes. In a logistic regression analysis using age, sex, HFE genotype, log10 ferritin, and log10 HIC as independent variables, log10 serum ferritin (P=0.0008), male sex (P=0.0086), and log10 HIC (P=0.047), but not HFE genotype (P=0.0554), were independently associated with the presence or absence of advanced hepatic fibrosis. C282Y/H63D compound heterozygotes and other non-C282Y homozygotes who express the hepatic hemochromatosis phenotype frequently have evidence of steatosis or chronic hepatitis and lower body iron stores than C282Y homozygotes. These data suggest that presence of concomitant liver disease may explain expression of the hemochromatosis phenotype among non-C282Y homozygotes. Increased age, HIC, and ferritin are associated with advanced hepatic fibrosis, regardless of HFE genotype.

  12. Slow Crack Growth of Brittle Materials With Exponential Crack-Velocity Formulation. Part 2; Constant Stress Rate Experiments

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Nemeth, Noel N.; Gyekenyesi, John P.

    2002-01-01

    The previously determined life prediction analysis based on an exponential crack-velocity formulation was examined using a variety of experimental data on glass and advanced structural ceramics in constant stress rate and preload testing at ambient and elevated temperatures. The data fit to the relation of strength versus the log of the stress rate was very reasonable for most of the materials. Also, the preloading technique was determined equally applicable to the case of slow-crack-growth (SCG) parameter n greater than 30 for both the power-law and exponential formulations. The major limitation in the exponential crack-velocity formulation, however, was that the inert strength of a material must be known a priori to evaluate the important SCG parameter n, a significant drawback as compared with the conventional power-law crack-velocity formulation.

  13. Immobilized Artificial Membrane HPLC Derived Parameters vs PAMPA-BBB Data in Estimating in Situ Measured Blood-Brain Barrier Permeation of Drugs.

    PubMed

    Grumetto, Lucia; Russo, Giacomo; Barbato, Francesco

    2016-08-01

    The affinity indexes for phospholipids (log kW(IAM)) for 42 compounds were measured by high performance liquid chromatography (HPLC) on two different phospholipid-based stationary phases (immobilized artificial membrane, IAM), i.e., IAM.PC.MG and IAM.PC.DD2. The polar/electrostatic interaction forces between analytes and membrane phospholipids (Δlog kW(IAM)) were calculated as the differences between the experimental values of log kW(IAM) and those expected for isolipophilic neutral compounds having polar surface area (PSA) = 0. The values of passage through a porcine brain lipid extract (PBLE) artificial membrane for 36 out of the 42 compounds considered, measured by the so-called PAMPA-BBB technique, were taken from the literature (P0(PAMPA-BBB)). The values of blood-brain barrier (BBB) passage measured in situ, P0(in situ), for 38 out of the 42 compounds considered, taken from the literature, represented the permeability of the neutral forms on "efflux minimized" rodent models. The present work was aimed at verifying the soundness of Δlog kW(IAM) in describing the potential of passage through the BBB as compared to data achieved by the PAMPA-BBB technique. First, the values of log P0(PAMPA-BBB) (32 data points) were found significantly related to the n-octanol lipophilicity values of the neutral forms (log P(N)) (r(2) = 0.782), whereas no significant relationship (r(2) = 0.246) was found with lipophilicity values of the mixtures of ionized and neutral forms existing at the experimental pH 7.4 (log D(7.4)), or with either log kW(IAM) or Δlog kW(IAM) values. log P0(PAMPA-BBB) related moderately to log P0(in situ) values (r(2) = 0.604). The latter did not relate with either n-octanol lipophilicity indexes (log P(N) and log D(7.4)) or phospholipid affinity indexes (log kW(IAM)). 
In contrast, significant inverse linear relationships were observed between log P0(in situ) (38 data points) and Δlog kW(IAM) values for all the compounds but ibuprofen and chlorpromazine, which behaved as moderate outliers (r(2) = 0.656 and r(2) = 0.757 for values achieved on IAM.PC.MG and IAM.PC.DD2, respectively). Since log P0(in situ) values refer to the "intrinsic permeability" of the analytes regardless of their ionization degree, no correction for ionization of Δlog kW(IAM) values was needed. Furthermore, log P0(in situ) values were found to be roughly linearly related to log BB values (i.e., the logarithm of the ratio brain concentration/blood concentration measured in vivo) for all the analytes but those predominantly present at the experimental pH 7.4 as anions. These results suggest that, at least for the data set considered, Δlog kW(IAM) parameters are more effective than log P0(PAMPA-BBB) at predicting log P0(in situ) values for all the analytes. Furthermore, ionization appears to affect the BBB passage of acids (yielding anions) differently, and much more markedly, than that of the other ionizable compounds.

  14. Gradually truncated log-normal in USA publicly traded firm size distribution

    NASA Astrophysics Data System (ADS)

    Gupta, Hari M.; Campanha, José R.; de Aguiar, Daniela R.; Queiroz, Gabriel A.; Raheja, Charu G.

    2007-03-01

    We study the statistical distribution of firm size for USA and Brazilian publicly traded firms through the Zipf plot technique. Sale size is used to measure firm size. The Brazilian firm size distribution is given by a log-normal distribution without any adjustable parameter. However, we also need to consider different parameters of log-normal distribution for the largest firms in the distribution, which are mostly foreign firms. The log-normal distribution has to be gradually truncated after a certain critical value for USA firms. Therefore, the original hypothesis of proportional effect proposed by Gibrat is valid with some modification for very large firms. We also consider the possible mechanisms behind this distribution.
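    The Zipf plot construction used here can be sketched in a few lines: firms are ranked by size in descending order and log rank is plotted against log size; a log-normal appears as a curved line on these axes, whereas a pure power law would be straight. A minimal sketch with synthetic sales figures (the study's firm data are not reproduced here):

```python
import math
import random

def zipf_plot_points(sizes):
    """Rank sizes in descending order and return (log10 rank, log10 size)
    pairs -- the coordinates of a Zipf plot."""
    ranked = sorted(sizes, reverse=True)
    return [(math.log10(rank), math.log10(size))
            for rank, size in enumerate(ranked, start=1)]

# Synthetic log-normal "sales": exponentials of normal draws.
random.seed(7)
sales = [math.exp(random.gauss(10.0, 2.0)) for _ in range(1000)]
points = zipf_plot_points(sales)
# Sizes decrease monotonically with rank, as required for a Zipf plot.
assert all(points[i][1] >= points[i + 1][1] for i in range(len(points) - 1))
```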

  15. Release of sugar pine seedlings and saplings by harvest cutting.

    Treesearch

    William E. Hallin

    1959-01-01

    Sugar pine, the preferred species to grow on many forest areas in southwestern Oregon, is often seeded or planted on clearcuts there. Advance growth in the form of seedlings, saplings, and poles is common in the mixed- conifer type, and costly planting can be eliminated if this advance growth can be saved during logging and slash disposal. However, if the necessary...

  16. Artificial neural network modeling and cluster analysis for organic facies and burial history estimation using well log data: A case study of the South Pars Gas Field, Persian Gulf, Iran

    NASA Astrophysics Data System (ADS)

    Alizadeh, Bahram; Najjari, Saeid; Kadkhodaie-Ilkhchi, Ali

    2012-08-01

    Intelligent and statistical techniques were used to extract the hidden organic facies from well log responses in the Giant South Pars Gas Field, Persian Gulf, Iran. Data from the Mid-Cretaceous Kazhdomi Formation and the Permo-Triassic Kangan-Dalan Formation were used for this purpose. Initially, GR, SGR, CGR, THOR, POTA, NPHI and DT logs were applied to model the relationship between wireline logs and Total Organic Carbon (TOC) content using Artificial Neural Networks (ANN). The correlation coefficient (R2) between the measured and ANN-predicted TOC was 89%. The performance of the model, measured by the mean squared error, does not exceed 0.0073. Using the cluster analysis technique and creating a binary hierarchical cluster tree, the constructed TOC column of each formation was clustered into 5 organic facies according to their geochemical similarity. A second model, with an accuracy of 84%, was then created by ANN to determine the specified clusters (facies) directly from well logs for quick cluster recognition in other wells of the studied field. Each facies was correlated to its appropriate burial history curve. Hence each facies of a formation could be scrutinized separately and directly from its well logs, indicating the time and depth of oil or gas generation. Therefore, potential production zones of the Kazhdomi probable source rock and the Kangan-Dalan reservoir formation could be identified while well logging operations (especially in LWD cases) were in progress. This could reduce uncertainty, save considerable time and cost for oil companies, and aid in the successful implementation of exploration and exploitation plans.

  17. Solubility enhancement of dioxins and PCBs by surfactant monomers and micelles quantified with polymer depletion techniques.

    PubMed

    Schacht, Veronika J; Grant, Sharon C; Escher, Beate I; Hawker, Darryl W; Gaus, Caroline

    2016-06-01

    Partitioning of super-hydrophobic organic contaminants (SHOCs) to dissolved or colloidal materials such as surfactants can alter their behaviour by enhancing apparent aqueous solubility. Relevant partition constants are, however, challenging to quantify with reasonable accuracy. Partition constants to colloidal surfactants can be measured by introducing a polymer (PDMS) as third phase with known PDMS-water partition constant in combination with the mass balance approach. We quantified partition constants of PCBs and PCDDs (log KOW 5.8-8.3) between water and sodium dodecyl sulphate monomers (KMO) and micelles (KMI). A refined, recently introduced swelling-based polymer loading technique allowed highly precise (4.5-10% RSD) and fast (<24 h) loading of SHOCs into PDMS, and due to the miniaturisation of batch systems equilibrium was reached in <5 days for KMI and <3 weeks for KMO. SHOC losses to experimental surfaces were substantial (8-26%) in monomer solutions, but had a low impact on KMO (0.10-0.16 log units). Log KMO for PCDDs (4.0-5.2) were approximately 2.6 log units lower than respective log KMI, which ranged from 5.2 to 7.0 for PCDDs and 6.6-7.5 for PCBs. The linear relationship between log KMI and log KOW was consistent with more polar and moderately hydrophobic compounds. Apparent solubility increased with increasing hydrophobicity and was highest in micelle solutions. However, this solubility enhancement was also considerable in monomer solutions, up to 200 times for OCDD. Given the pervasive presence of surfactant monomers in typical field scenarios, these data suggest that low surfactant concentrations may be effective long-term facilitators for subsurface transport of SHOCs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Comparative reduction of Giardia cysts, F+ coliphages, sulphite reducing clostridia and fecal coliforms by wastewater treatment processes.

    PubMed

    Nasser, Abidelfatah M; Benisti, Neta-Lee; Ofer, Naomi; Hovers, Sivan; Nitzan, Yeshayahu

    2017-01-28

    Advanced wastewater treatment processes are applied to prevent the environmental dissemination of pathogenic microorganisms. Giardia lamblia causes a severe disease called giardiasis, and is highly prevalent in untreated wastewater worldwide. Monitoring the microbial quality of wastewater effluents is usually based on testing for the levels of indicator microorganisms in the effluents. This study was conducted to compare the suitability of fecal coliforms, F+ coliphages and sulphite-reducing clostridia (SRC) as indicators for the reduction of Giardia cysts in two full-scale wastewater treatment plants. The treatment process consists of activated sludge, coagulation, high rate filtration and either chlorine or UV disinfection. The results of the study demonstrated that Giardia cysts are highly prevalent in raw wastewater at an average concentration of 3600 cysts/L. Fecal coliforms, F+ coliphages and SRC were also detected at high concentrations in raw wastewater. Giardia cysts were efficiently removed (3.6 log10) by the treatment train. The greatest reduction was observed for fecal coliforms (9.6 log10) whereas the least reduction was observed for F+ coliphages (2.1 log10) following chlorine disinfection. Similar reduction was observed for SRC by filtration and disinfection by either UV (3.6 log10) or chlorine (3.3 log10). Since F+ coliphages and SRC were found to be more resistant than fecal coliforms to the tertiary treatment processes, they may prove to be more suitable as indicators for Giardia. The results of this study demonstrated that advanced wastewater treatment may prove efficient for the removal of Giardia cysts and may prevent its transmission when treated effluents are applied for crop irrigation or stream restoration.
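    The log10 reductions quoted above follow from influent and effluent concentrations; a minimal sketch (the effluent figure below is illustrative, chosen to reproduce roughly the reported 3.6-log10 Giardia removal, and is not taken from the study):

```python
import math

def log10_reduction(influent, effluent):
    """Log10 reduction credit: log10(influent / effluent concentration)."""
    return math.log10(influent / effluent)

# 3600 cysts/L in raw wastewater reduced to about 0.9 cysts/L
# corresponds to roughly a 3.6-log10 removal.
print(round(log10_reduction(3600, 0.904), 1))  # → 3.6
```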

  19. Simultaneous Implantation of an Ahmed and Baerveldt Glaucoma Drainage Device for Uncontrolled Intraocular Pressure in Advanced Glaucoma.

    PubMed

    Rao, Veena S; Christenbury, Joseph; Lee, Paul; Allingham, Rand; Herndon, Leon; Challa, Pratap

    2017-02-01

    To evaluate efficacy and safety of a novel technique, simultaneous implantation of Ahmed and Baerveldt shunts, for improved control of intraocular pressure (IOP) in advanced glaucoma with visual field defects threatening central fixation. Retrospective case series; all patients receiving simultaneous Ahmed and Baerveldt implantation at a single institution between October 2004 and October 2009 were included. Records were reviewed preoperatively and at postoperative day 1, week 1, month 1, month 3, month 6, year 1, and yearly until year 5. Outcome measures included IOP, best-corrected visual acuity, visual field mean deviation, cup to disc ratio, number of glaucoma medications, and complications. Fifty-nine eyes were identified; mean (±SD) follow-up was 26±23 months. Primary open-angle glaucoma was most common (n=37, 63%). Forty-six eyes (78%) had prior incisional surgery. Mean preoperative IOP was 25.5±9.8 mm Hg. IOP was reduced 50% day 1 (P<0.001, mean 12.7±7.0 mm Hg), which persisted throughout follow-up. At year 1, cup to disc ratio and mean deviation were stable with decreased best-corrected visual acuity from logMAR 0.72±0.72(20/100) to 1.06±1.13(20/200) (P=0.007). The Kaplan-Meier survival analysis showed median and mean survival of 1205 and 829±91 days, respectively. Complication rate was 47%. IOP is markedly reduced postoperative day 1 following double glaucoma tube implantation with effects persisting over postoperative year 1 and up to year 5. Complications were higher than that seen in reports of single shunt implantation, which may be explained by patient complexity in this cohort. This technique may prove a promising novel approach for management of uncontrolled IOP in advanced glaucoma.

  20. Subclassification of Barcelona Clinic Liver Cancer B and C hepatocellular carcinoma: A cohort study of the multicenter registry database.

    PubMed

    Lee, Sangheun; Kim, Beom Kyung; Song, Kijun; Park, Jun Yong; Ahn, Sang Hoon; Kim, Seung Up; Han, Kwang-Hyub; Kim, Do Young

    2016-04-01

    We aimed to subclassify hepatocellular carcinoma (HCC) using Barcelona Clinic Liver Cancer intermediate and advanced stages, which include a highly heterogeneous population. From two registries ("random" and "voluntary" cohorts in the Korean Liver Cancer Study Group), patients who were newly diagnosed as HCC with intermediate or advanced stage between 2003 and 2005 were considered eligible. Overall survival (OS) was analyzed using Kaplan-Meier method with comparison by log-rank test. Patients with intermediate-stage HCC (n = 994) were subclassified according to tumor size and Child-Pugh class. Patients with tumor size < 5 cm (B1), those with tumor size ≥ 5 cm and Child-Pugh A (B2), and those with tumor size ≥ 5 cm and Child-Pugh B (B3) had median OS of 30.73, 20.60, and 9.23 months, respectively (P < 0.001 by log-rank test). Among patients with advanced stage HCC (n = 1746), patients were subclassified according to presence of significant portal vein invasion (sPVI; defined as portal vein invasion in lobar, main, or contralateral branch) and extrahepatic spread (EHS). Patients with neither sPVI nor EHS (C1), those with either sPVI or EHS (C2), and those with both sPVI and EHS (C3) had median OS of 8.43, 4.63, and 3.63 months, respectively (P < 0.001 by log-rank test). Subclassification of Barcelona Clinic Liver Cancer intermediate and advanced stages might be useful for determining patient prognosis and guiding treatment strategies for HCC. © 2015 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
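    The median overall survival figures above come from Kaplan-Meier curves: the survival estimate is stepped down at each death time by the fraction of at-risk patients who die, and the median is the first time the curve falls to 0.5 or below. A bare-bones sketch on made-up follow-up times in months (not the registry data):

```python
def km_median(times, events):
    """Kaplan-Meier estimate of median survival time.

    times: follow-up time per subject; events: 1 = death, 0 = censored.
    Returns the first time the survival curve drops to <= 0.5, or None
    if it never does.
    """
    data = sorted(zip(times, events))
    n = len(data)
    survival, at_risk, i = 1.0, n, 0
    while i < n:
        t = data[i][0]
        tied = [e for tt, e in data[i:] if tt == t]
        deaths = sum(tied)
        if deaths:
            survival *= 1 - deaths / at_risk
        at_risk -= len(tied)
        i += len(tied)
        if survival <= 0.5:
            return t
    return None

# Five deaths at months 1..5: the curve reaches 0.4 at month 3.
print(km_median([1, 2, 3, 4, 5], [1, 1, 1, 1, 1]))  # → 3
```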

  1. A proven record in changing attitudes about MWD logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cantrell, L.; Paxson, K.B.; Keyser, W.L.

    1993-07-01

    Measurement while drilling (MWD) logs for quantitative reservoir characterization were evaluated during drilling of Gulf of Mexico flexure trend projects, Kilauea (Green Canyon Blocks 6 and 50) and Tick (Garden Banks Block 189). Comparisons confirmed that MWD can be used as an accurate replacement for wireline logging when borehole size is not a limiting factor. Texaco's MWD experience evolved from 'last resort' to primary formation evaluation logging, which resulted in rig-time and associated cost savings. Difficult wells are now drilled and evaluated with confidence, geopressure is safely monitored, conventional core interval tops are selected, and geologic interpretations and operational decisions are made before wells TD. This paper reviews the performance, accuracy, and limitations of the MWD systems and compares the results to standard geophysical well logging techniques. Four case histories are presented.

  2. Empirical Mode Decomposition of Geophysical Well-log Data of Bombay Offshore Basin, Mumbai, India

    NASA Astrophysics Data System (ADS)

    Siddharth Gairola, Gaurav; Chandrasekhar, Enamundram

    2016-04-01

    Geophysical well-log data manifest the nonlinear behaviour of the physical properties of heterogeneous subsurface layers as a function of depth. Therefore, nonlinear data analysis techniques must be implemented to quantify the degree of heterogeneity in the subsurface lithologies. One such nonlinear, data-adaptive technique is empirical mode decomposition (EMD), which decomposes the data into oscillatory signals of different wavelengths called intrinsic mode functions (IMF). In the present study, EMD has been applied to the gamma-ray and neutron porosity logs of two different wells, Well B and Well C, located in the western offshore basin of India, to perform heterogeneity analysis and compare the results with those obtained by multifractal studies of the same data sets. By establishing a relationship between the IMF number (m) and the mean wavelength associated with each IMF (I_m), a heterogeneity index (ρ) associated with the subsurface layers can be determined using the relation I_m = kρ^m, where k is a constant. The ρ values bear an inverse relation to the heterogeneity of the subsurface: smaller ρ values designate higher heterogeneity and vice versa. The ρ values estimated for different limestone payzones identified in the wells clearly show that Well C has a higher degree of heterogeneity than Well B. This correlates well with the estimated Vshale values for the limestone reservoir zone, which show higher shale content in Well C than in Well B. The ρ values determined for different payzones of both wells will be used to quantify the degree of heterogeneity in different wells. The multifractal behaviour of each IMF of both logs in both wells will be compared with one another and discussed in terms of their heterogeneity indices.
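    Because the relation I_m = kρ^m is log-linear in the IMF number, ρ can be recovered by an ordinary least-squares fit of log I_m against m. A sketch with synthetic mean wavelengths (hypothetical values, not the wells' log data):

```python
import math

def heterogeneity_index(mean_wavelengths):
    """Fit log(I_m) = log(k) + m*log(rho) by least squares and return rho.
    mean_wavelengths[0] is I_1, the mean wavelength of the first IMF."""
    ms = range(1, len(mean_wavelengths) + 1)
    ys = [math.log(i) for i in mean_wavelengths]
    n = len(ys)
    mbar = sum(ms) / n
    ybar = sum(ys) / n
    slope = (sum((m - mbar) * (y - ybar) for m, y in zip(ms, ys))
             / sum((m - mbar) ** 2 for m in ms))
    return math.exp(slope)

# IMF mean wavelengths doubling with each mode imply rho = 2.
print(round(heterogeneity_index([4, 8, 16, 32, 64]), 3))  # → 2.0
```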

  3. MID Plot: a new lithology technique. [Matrix identification plot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clavier, C.; Rust, D.H.

    1976-01-01

    Lithology interpretation by the Litho-Porosity (M-N) method has been used for years, but is evidently too cumbersome and ambiguous for widespread acceptance as a field technique. To set aside these objections, another method has been devised. Instead of the log-derived parameters M and N, the MID Plot uses quasi-physical quantities, (ρ_ma)_a and (Δt_ma)_a, as its porosity-independent variables. These parameters, taken from suitably scaled Neutron-Density and Sonic-Neutron crossplots, define a unique matrix mineral or mixture for each point on the logs. The matrix points on the MID Plot thus remain constant in spite of changes in mud filtrate, porosity, or neutron tool types (all of which significantly affect the M-N Plot). This new development is expected to bring welcome relief in areas where lithology identification is a routine part of log analysis.
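    As an illustration of the kind of quasi-physical quantity involved, an apparent matrix density can be back-calculated from the bulk-density log and a crossplot porosity via the standard density-log response equation. This is a sketch of that generic relation, not the paper's chartbook procedure, and the 1.0 g/cm3 fluid density is an assumption:

```python
def apparent_matrix_density(rho_b, phi, rho_fluid=1.0):
    """(rho_ma)_a from the density-log response equation
    rho_b = phi*rho_fluid + (1 - phi)*rho_ma, solved for rho_ma.
    rho_b in g/cm3, phi as a fraction; fluid density assumed 1.0 g/cm3."""
    return (rho_b - phi * rho_fluid) / (1 - phi)

# A clean limestone point: rho_b = 2.45 g/cm3 at 15% porosity plots back
# near the calcite matrix density of ~2.71 g/cm3.
print(round(apparent_matrix_density(2.45, 0.15), 2))  # → 2.71
```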

  4. The Economics of Reduced Impact Logging in the American Tropics: A Review of Recent Initiatives

    Treesearch

    Frederick Boltz; Thomas P. Holmes; Douglas R. Carter

    1999-01-01

    Programs aimed at developing and implementing reduced-impact logging (RIL) techniques are currently underway in important forest regions of Latin America, given the importance of timber production in the American tropics to national and global markets. RIL efforts focus upon planning and extraction methods which lessen harvest impact on residual commercial timber...

  5. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  6. Nondestructive Methods for Detecting Defects in Softwood Logs

    Treesearch

    Kristin C. Schad; Daniel L. Schmoldt; Robert J. Ross

    1996-01-01

    Wood degradation and defects, such as voids and knots, affect the quality and processing time of lumber. The ability to detect internal defects in the log can save mills time and processing costs. In this study, we investigated three nondestructive evaluation techniques for detecting internal wood defects. Sound wave transmission, x-ray computed tomography, and impulse...

  7. Techniques for the wheeled-skidder operator

    Treesearch

    Robert L. Hartman; Harry G. Gibson

    1970-01-01

    How much production a logger gets from a logging job may depend heavily on his skidder operators. They are key men on any logging job. This is one conclusion that forestry engineers at the USDA Forest Service's Forestry Sciences Laboratory at Morgantown, West Virginia, came to after studying the operation of wheeled skidders in mountainous Appalachian terrain....

  8. CT Imaging, Data Reduction, and Visualization of Hardwood Logs

    Treesearch

    Daniel L. Schmoldt

    1996-01-01

    Computer tomography (CT) is a mathematical technique that, combined with noninvasive scanning such as x-ray imaging, has become a powerful tool to nondestructively test materials prior to use or to evaluate materials prior to processing. In the current context, hardwood lumber processing can benefit greatly by knowing what a log looks like prior to initial breakdown....

  9. New technology applied to well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stremel, K.

    1984-11-01

    Remote locations and increasingly complex geology require a higher level of sophistication in well-logging equipment and services. Applying technological advancements, well-logging contractors have developed a variety of new products and services designed to provide better quality data at reasonable prices. One of the most significant technological breakthroughs has been in satellite communications. Denver-based Western Tele-Communications Inc. is one of the few companies offering voice and data transmission services via satellite. Up to 9600 bits per second of real-time data is transmitted from terminals at remote wellsites through a main station in Denver to locations throughout the world. Because management in separate offices can review well data simultaneously, critical operations decisions can be made more quickly.

  10. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques that may be of use in extending, enhancing, and combining highway accident exposure data are discussed: log-linear modelling, iterative proportional fitting, and expectation maximization...
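    Of the techniques named, iterative proportional fitting is the simplest to sketch: a seed table is alternately rescaled so its row and column sums match known target margins. A minimal sketch with made-up counts (not the highway exposure data):

```python
def ipf(table, row_targets, col_targets, iterations=50):
    """Iterative proportional fitting on a 2-D table: alternately scale
    rows and columns until the table's margins match the targets."""
    t = [row[:] for row in table]
    for _ in range(iterations):
        for i, target in enumerate(row_targets):      # row step
            s = sum(t[i])
            t[i] = [x * target / s for x in t[i]]
        for j, target in enumerate(col_targets):      # column step
            s = sum(row[j] for row in t)
            for row in t:
                row[j] *= target / s
    return t

# Adjust a seed table of counts to known margins (rows sum to 10 and 20,
# columns to 12 and 18).
fitted = ipf([[1, 2], [3, 4]], [10, 20], [12, 18])
print([round(sum(row), 4) for row in fitted])  # → [10.0, 20.0]
```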

  11. Standard weight (Ws) equations for four rare desert fishes

    USGS Publications Warehouse

    Didenko, A.V.; Bonar, Scott A.; Matter, W.J.

    2004-01-01

    Standard weight (Ws) equations have been used extensively to examine body condition in sport fishes. However, development of these equations for nongame fishes has only recently been emphasized. We used the regression-line-percentile technique to develop standard weight equations for four rare desert fishes: flannelmouth sucker Catostomus latipinnis, razorback sucker Xyrauchen texanus, roundtail chub Gila robusta, and humpback chub G. cypha. The Ws equation for flannelmouth suckers of 100-690 mm total length (TL) was developed from 17 populations: log10Ws = -5.180 + 3.068 log10TL. The Ws equation for razorback suckers of 110-885 mm TL was developed from 12 populations: log10Ws = -4.886 + 2.985 log10TL. The Ws equation for roundtail chub of 100-525 mm TL was developed from 20 populations: log10Ws = -5.065 + 3.015 log10TL. The Ws equation for humpback chub of 120-495 mm TL was developed from 9 populations: log10Ws = -5.278 + 3.096 log10TL. These equations meet criteria for acceptable standard weight indexes and can be used to calculate relative weight, an index of body condition.
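    The published coefficients slot directly into the relative-weight index (Wr = 100 W/Ws) that these equations support. A sketch using the equations above; the example fish is hypothetical:

```python
import math

# log10Ws = a + b * log10TL, with TL in mm and Ws in g
# (coefficients from the abstract above).
WS_COEFFICIENTS = {
    "flannelmouth_sucker": (-5.180, 3.068),
    "razorback_sucker": (-4.886, 2.985),
    "roundtail_chub": (-5.065, 3.015),
    "humpback_chub": (-5.278, 3.096),
}

def standard_weight(species, tl_mm):
    """Standard weight Ws (g) for a fish of total length tl_mm."""
    a, b = WS_COEFFICIENTS[species]
    return 10 ** (a + b * math.log10(tl_mm))

def relative_weight(species, weight_g, tl_mm):
    """Relative weight Wr = 100 * W / Ws, the usual body-condition index."""
    return 100 * weight_g / standard_weight(species, tl_mm)

# A hypothetical 400-mm roundtail chub weighing 540 g:
ws = standard_weight("roundtail_chub", 400)                # ≈ 603 g
print(round(relative_weight("roundtail_chub", 540, 400)))  # → 90
```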

  12. On the Rapid Computation of Various Polylogarithmic Constants

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Borwein, Peter; Plouffe, Simon

    1996-01-01

    We give algorithms for the computation of the d-th digit of certain transcendental numbers in various bases. These algorithms can be easily implemented (multiple precision arithmetic is not needed), require virtually no memory, and feature run times that scale nearly linearly with the order of the digit desired. They make it feasible to compute, for example, the billionth binary digit of log(2) or pi on a modest workstation in a few hours run time. We demonstrate this technique by computing the ten billionth hexadecimal digit of pi, the billionth hexadecimal digits of pi-squared, log(2) and log-squared(2), and the ten billionth decimal digit of log(9/10). These calculations rest on the observation that very special types of identities exist for certain numbers like pi, pi-squared, log(2) and log-squared(2). These are essentially polylogarithmic ladders in an integer base. A number of these identities that we derive in this work appear to be new, for example a critical identity for pi.
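
    The digit-extraction idea rests on the BBP series pi = sum_{k>=0} 16^(-k) * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)): multiplying by 16^d and reducing modulo 1 lets each term be computed by modular exponentiation, so the d-th hexadecimal digit needs almost no memory. A minimal floating-point sketch (accurate only for modest d, not the billion-digit runs the paper reports):

```python
def pi_hex_digit(d):
    """Return the (d+1)-th hexadecimal digit of pi after the point,
    via the Bailey-Borwein-Plouffe digit-extraction formula."""
    def series(j):
        # Fractional part of sum_k 16^(d-k) / (8k + j).
        s = 0.0
        for k in range(d + 1):
            # Integer part of 16^(d-k) is discarded via modular exponentiation.
            s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
        # A few tail terms where 16^(d-k) < 1.
        for k in range(d + 1, d + 20):
            s += 16.0 ** (d - k) / (8 * k + j)
        return s % 1.0

    frac = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return int(frac * 16)
```

Since pi = 3.243F6A88... in hexadecimal, `pi_hex_digit(0)` yields 2 and `pi_hex_digit(3)` yields 15 (F).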

  13. Conjunctive patches subspace learning with side information for collaborative image retrieval.

    PubMed

    Zhang, Lining; Wang, Lipo; Lin, Weisi

    2012-08-01

    Content-Based Image Retrieval (CBIR) has attracted substantial attention during the past few years for its potential practical applications to image management. A variety of Relevance Feedback (RF) schemes have been designed to bridge the semantic gap between the low-level visual features and the high-level semantic concepts for an image retrieval task. Various Collaborative Image Retrieval (CIR) schemes aim to utilize the user historical feedback log data with similar and dissimilar pairwise constraints to improve the performance of a CBIR system. However, existing subspace learning approaches with explicit label information cannot be applied for a CIR task, although subspace learning techniques play a key role in various computer vision tasks, e.g., face recognition and image classification. In this paper, we propose a novel subspace learning framework, i.e., Conjunctive Patches Subspace Learning (CPSL) with side information, for learning an effective semantic subspace by exploiting the user historical feedback log data for a CIR task. The CPSL can effectively integrate the discriminative information of labeled log images, the geometrical information of labeled log images and the weakly similar information of unlabeled images to learn a reliable subspace. We formally formulate this problem as a constrained optimization problem and then present a new subspace learning technique to exploit the user historical feedback log data. Extensive experiments on both synthetic data sets and a real-world image database demonstrate the effectiveness of the proposed scheme in improving the performance of a CBIR system by exploiting the user historical feedback log data.

  14. Released advance reproduction of white and red fir. . . growth, damage, mortality

    Treesearch

    Donald T. Gordon

    1973-01-01

    Advance reproduction of white fir and red fir released by cutting overmature over-story was studied at the Swain Mountain Experimental Forest in northern California, at 6,300 feet elevation. Seedling and sapling height growth before logging was only 0.1-0.2 foot per year. Five years after cutting, seedling and sapling height growth had accelerated to about 0.5 to 0.8...

  15. Advances in borehole geophysics for hydrology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, P.H.

    1982-01-01

    Borehole geophysical methods provide vital subsurface information on rock properties, fluid movement, and the condition of engineered borehole structures. Within the first category, salient advances include the continuing improvement of the borehole televiewer, refinement of the electrical conductivity dipmeter for fracture characterization, and the development of a gigahertz-frequency electromagnetic propagation tool for water saturation measurements. The exploration of the rock mass between boreholes remains a challenging problem with high potential; promising methods are now incorporating high-density spatial sampling and sophisticated data processing. Flow-rate measurement methods appear adequate for all but low-flow situations. At low rates the tagging method seems the most attractive. The current exploitation of neutron-activation techniques for tagging means that the wellbore fluid itself is tagged, thereby eliminating the mixing of an alien fluid into the wellbore. Another method uses the acoustic noise generated by flow through constrictions and in and behind casing to detect and locate flaws in the production system. With the advent of field-recorded digital data, the interpretation of logs from sedimentary sequences is now reaching a sophisticated level with the aid of computer processing and the application of statistical methods. Lagging behind are interpretive schemes for the low-porosity, fracture-controlled igneous and metamorphic rocks encountered in the geothermal reservoirs and in potential waste-storage sites. Progress is being made on the general problem of fracture detection by use of electrical and acoustical techniques, but the reliable definition of permeability continues to be an elusive goal.

  16. An analysis of production and costs in high-lead yarding.

    Treesearch

    Magnus E. Tennas; Robert H. Ruth; Carl M. Berntsen

    1955-01-01

    In recent years loggers and timber owners have needed better information for estimating logging costs in the Douglas-fir region. Brandstrom's comprehensive study, published in 1933 (1), has long been used as a guide in making cost estimates. But the use of new equipment and techniques and an overall increase in logging costs have made it increasingly difficult to...

  17. Using nonlinear quantile regression to estimate the self-thinning boundary curve

    Treesearch

    Quang V. Cao; Thomas J. Dean

    2015-01-01

    The relationship between tree size (quadratic mean diameter) and tree density (number of trees per unit area) has been a topic of research and discussion for many decades. Starting with Reineke in 1933, the maximum size-density relationship, on a log-log scale, has been assumed to be linear. Several techniques, including linear quantile regression, have been employed...
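
    Quantile regression replaces squared error with the tilted absolute ("pinball") loss, so that on a log-log scale the self-thinning boundary becomes a straight line fitted at a high quantile. A pure-Python grid-search sketch (the data layout, grid ranges, and the choice of regressing log diameter on log density are illustrative assumptions, not the paper's method):

```python
def pinball_loss(residuals, tau):
    """Tilted absolute loss: tau*r for r >= 0, (tau - 1)*r for r < 0."""
    return sum(tau * r if r >= 0 else (tau - 1.0) * r for r in residuals)

def fit_boundary(log_density, log_diameter, tau=0.99):
    """Grid-search a line log(D) = a + b*log(N) minimizing the pinball loss
    at quantile tau -- an upper-boundary fit when tau is near 1."""
    best = None
    for i in range(41):                      # slopes -0.40 .. -0.80
        b = -0.4 - 0.01 * i
        for a10 in range(200, 401):          # intercepts 2.00 .. 4.00
            a = a10 / 100.0
            res = [ld - (a + b * ln)
                   for ln, ld in zip(log_density, log_diameter)]
            loss = pinball_loss(res, tau)
            if best is None or loss < best[0]:
                best = (loss, a, b)
    return best[1], best[2]
```

In practice the minimization is done with linear programming rather than a grid, but the loss function and the interpretation (a line that nearly all points fall below) are the same.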

  18. A method of improving sensitivity of carbon/oxygen well logging for low porosity formation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Juntao; Zhang, Feng; Zhang, Quanying

    Carbon/Oxygen (C/O) spectral logging has been widely used to determine residual oil saturation and to evaluate water-flooded layers. In order to improve the sensitivity of the technique for low-porosity formations, Gaussian and linear models are applied to fit the peaks of measured spectra to obtain the characteristic coefficients. Standard spectra of carbon and oxygen are combined to establish a new carbon/oxygen value calculation method, and the robustness of the new method is cross-validated with a known mixed gamma-ray spectrum. Formation models for different porosities and saturations are built using the Monte Carlo method. The carbon/oxygen responses are calculated by the conventional energy-window method, and the new method is applied to oil saturation under low-porosity conditions. The results show the new method can reduce the effects of gamma rays contaminated by the interaction between neutrons and other elements on the carbon/oxygen ratio, and therefore can significantly improve the response sensitivity of carbon/oxygen well logging to oil saturation. The new method greatly improves carbon/oxygen well logging under low-porosity conditions.
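
    The Gaussian peak fitting mentioned above has a convenient closed form: the logarithm of a Gaussian is a quadratic in x, so a linear least-squares fit to ln(y) recovers the amplitude, centroid, and width. A self-contained sketch (the synthetic peak parameters below are hypothetical, not values from the paper):

```python
import math

def fit_gaussian_peak(xs, ys):
    """Fit A*exp(-(x-mu)^2 / (2*sigma^2)) to (xs, ys) by least-squares
    quadratic regression on ln(y); exact for noise-free Gaussian samples."""
    lys = [math.log(y) for y in ys]
    # Normal equations for ln(y) ~ c0 + c1*x + c2*x^2.
    sx = [sum(x ** p for x in xs) for p in range(5)]
    a = [[sx[0], sx[1], sx[2]],
         [sx[1], sx[2], sx[3]],
         [sx[2], sx[3], sx[4]]]
    b = [sum(ly * x ** p for x, ly in zip(xs, lys)) for p in range(3)]
    # Solve the 3x3 system by Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = a[r][col] / a[col][col]
            for c_ in range(col, 3):
                a[r][c_] -= f * a[col][c_]
            b[r] -= f * b[col]
    c = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        c[r] = (b[r] - sum(a[r][k] * c[k] for k in range(r + 1, 3))) / a[r][r]
    c0, c1, c2 = c
    # Map the quadratic coefficients back to Gaussian parameters.
    sigma = math.sqrt(-1.0 / (2.0 * c2))
    mu = -c1 / (2.0 * c2)
    amp = math.exp(c0 - c1 * c1 / (4.0 * c2))
    return amp, mu, sigma

# Synthetic, noise-free peak; a real spectrum would need a window around the peak.
xs = [4.0 + 0.05 * i for i in range(21)]
ys = [100.0 * math.exp(-(x - 4.43) ** 2 / (2 * 0.2 ** 2)) for x in xs]
amp, mu, sigma = fit_gaussian_peak(xs, ys)
```

With noisy counts the log transform distorts the error weighting, so production codes typically use weighted or nonlinear least squares instead; the closed form is still a good initializer.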

  19. A method of improving sensitivity of carbon/oxygen well logging for low porosity formation

    DOE PAGES

    Liu, Juntao; Zhang, Feng; Zhang, Quanying; ...

    2016-12-01

    Carbon/Oxygen (C/O) spectral logging has been widely used to determine residual oil saturation and to evaluate water-flooded layers. In order to improve the sensitivity of the technique for low-porosity formations, Gaussian and linear models are applied to fit the peaks of measured spectra to obtain the characteristic coefficients. Standard spectra of carbon and oxygen are combined to establish a new carbon/oxygen value calculation method, and the robustness of the new method is cross-validated with a known mixed gamma-ray spectrum. Formation models for different porosities and saturations are built using the Monte Carlo method. The carbon/oxygen responses are calculated by the conventional energy-window method, and the new method is applied to oil saturation under low-porosity conditions. The results show the new method can reduce the effects of gamma rays contaminated by the interaction between neutrons and other elements on the carbon/oxygen ratio, and therefore can significantly improve the response sensitivity of carbon/oxygen well logging to oil saturation. The new method greatly improves carbon/oxygen well logging under low-porosity conditions.

  20. International Symposium on Special Topics in Chemical Propulsion (8th): Advancements in Energetic Materials and Chemical Propulsion Held in Cape Town, South Africa on 2-6 November 2009

    DTIC Science & Technology

    2009-11-06

    Hydrogen azide is well-known as an endothermic explosion gas for a long time, but there is a lack of understanding about the detailed kinetics of its...Martin, K.K. Kuo, R. Houim, and M. Degges Lidar Detection of Explosives Vapors Using Excimer Laser (Log #127) S. Bobrovnikov, E. Gorlov, G...Agents for High Energy Propellants (Log #190) B.M. Kosowski, J. Consaga, and A. Condo Formulation & Development of an Explosive that Allows

  1. Ratio of serum levels of AGEs to soluble RAGE is correlated with trimethylamine-N-oxide in non-diabetic subjects.

    PubMed

    Tahara, Atsuko; Tahara, Nobuhiro; Yamagishi, Sho-Ichi; Honda, Akihiro; Igata, Sachiyo; Nitta, Yoshikazu; Bekki, Munehisa; Nakamura, Tomohisa; Sugiyama, Yoichi; Sun, Jiahui; Takeuchi, Masayoshi; Shimizu, Makiko; Yamazaki, Hiroshi; Fukami, Kei; Fukumoto, Yoshihiro

    2017-12-01

    Trimethylamine (TMA), an intestinal microflora-dependent metabolite formed from phosphatidylcholine- and L-carnitine-rich food, such as red meat, is further converted to trimethylamine-N-oxide (TMAO), which could play a role in cardiometabolic disease. Red meat-derived products are one of the major environmental sources of advanced glycation end products (AGEs), which may also contribute to the pathogenesis of cardiometabolic disorders through the interaction with the receptor for AGEs (RAGE). However, the relationship among AGEs, the soluble form of RAGE (sRAGE) and TMAO in humans remains unclear. Non-diabetic subjects underwent a physical examination, determination of blood chemistry and anthropometric variables, including AGEs, sRAGE, TMA and TMAO. Multiple regression analyses revealed that HbA1c, uric acid and AGEs were independently associated with log TMA, whereas the log AGEs to sRAGE ratio and statin non-use were independently correlated with log TMAO. Our present findings indicated that the AGEs to sRAGE ratio was correlated with log TMAO, a marker of cardiometabolic disorders.

  2. Solubility of organic compounds in octanol: Improved predictions based on the geometrical fragment approach.

    PubMed

    Mathieu, Didier

    2017-09-01

    Two new models are introduced to predict the solubility of chemicals in octanol (Soct), taking advantage of the extensive character of log(Soct) through a decomposition of molecules into so-called geometrical fragments (GF). They are extensively validated and their compliance with regulatory requirements is demonstrated. The first model requires just a molecular formula as input. Despite its extreme simplicity, it performs as well as an advanced random forest model involving 86 descriptors, with a root mean square error (RMSE) of 0.64 log units for an external test set of 100 molecules. For the second one, which requires the melting point Tm as input, introducing GF descriptors reduces the RMSE from about 0.7 to <0.5 log units, a performance that could previously be obtained only through the use of Abraham descriptors. A script is provided for easy application of the models, taking into account the limits of their applicability domains.

  3. Hyperspectral image reconstruction for x-ray fluorescence tomography

    DOE PAGES

    Gürsoy, Doğa; Biçer, Tekin; Lanzirotti, Antonio; ...

    2015-01-01

    A penalized maximum-likelihood estimation is proposed to perform hyperspectral (spatio-spectral) image reconstruction for X-ray fluorescence tomography. The approach minimizes a Poisson-based negative log-likelihood of the observed photon counts, and uses a penalty term that has the effect of encouraging local continuity of model parameter estimates in both spatial and spectral dimensions simultaneously. The performance of the reconstruction method is demonstrated with experimental data acquired from a seed of Arabidopsis thaliana collected at the 13-ID-E microprobe beamline at the Advanced Photon Source. The resulting element distribution estimates with the proposed approach show significantly better reconstruction quality than the conventional analytical inversion approaches, and allow for a high data compression factor which can reduce data acquisition times remarkably. In particular, this technique provides the capability to tomographically reconstruct full energy-dispersive spectra without reconstruction artifacts that would compromise the interpretation of results.

  4. Harvesting data from advanced technologies.

    DOT National Transportation Integrated Search

    2014-11-01

    Data streams are emerging everywhere, such as Web logs, Web page click streams, sensor data streams, and credit card transaction flows. Different from traditional data sets, data streams are sequentially generated and arrive one by one rather than b...

  5. Policing in Afghanistan-Reform that Respects Tradition: Need for a Strategic Shift

    DTIC Science & Technology

    2010-05-01

    Simpkins Log: H 10-000547 Joint Advanced Warfighting Program Approved for public release; distribution is unlimited. About This Publication This work...iii Preface The Department of Defense established the Joint Advanced Warfighting Program (JAWP) in 1998 under the senior sponsorship of the Under...2 Greg Jaffe, “ Program Aims to Rebuild Afghan Police Force, Repair its Image,” Washington Post (12 March 2010), 8, www.washingtonpost.com

  6. Transverse vibration techniques : logs to structural systems

    Treesearch

    Robert J. Ross

    2008-01-01

    Transverse vibration as a nondestructive testing and evaluation technique was first examined in the early 1960s. Initial research and development efforts focused on clear wood, lumber, and laminated products. Out of those efforts, tools were developed that are used today to assess lumber properties. Recently, use of this technique has been investigated for evaluating a...

  7. Estimating tree bole and log weights from green densities measured with the Bergstrom Xylodensimeter.

    Treesearch

    Dale R. Waddell; Michael B. Lambert; W.Y. Pong

    1984-01-01

    The performance of the Bergstrom xylodensimeter, designed to measure the green density of wood, was investigated and compared with a technique that derived green densities from wood disk samples. In addition, log and bole weights of old-growth Douglas-fir and western hemlock were calculated by various formulas and compared with lifted weights measured with a load cell...

  8. Footprints in the Sky: Using Student Track Logs from a "Bird's Eye View" Virtual Field Trip to Enhance Learning

    ERIC Educational Resources Information Center

    Treves, Richard; Viterbo, Paolo; Haklay, Mordechai

    2015-01-01

    Research into virtual field trips (VFTs) started in the 1990s but, only recently, the maturing technology of devices and networks has made them viable options for educational settings. By considering an experiment, the learning benefits of logging the movement of students within a VFT are shown. The data are visualized by two techniques:…

  9. Assessing wood quality of borer-infested red oak logs with a resonance acoustic technique

    Treesearch

    Xiping Wang; Henry E. Stelzer; Jan Wiedenbeck; Patricia K. Lebow; Robert J. Ross

    2009-01-01

    Large numbers of black oak (Quercus velutina Lam.) and scarlet oak (Quercus coccinea Muenchh.) trees are declining and dying in the Missouri Ozark forest as a result of oak decline. Red oak borer-infested trees produce low-grade logs that become extremely difficult to merchandize as the level of insect attack increases. The objective of this study was to investigate...

  10. Exploring Online Students' Self-Regulated Learning with Self-Reported Surveys and Log Files: A Data Mining Approach

    ERIC Educational Resources Information Center

    Cho, Moon-Heum; Yoo, Jin Soung

    2017-01-01

    Many researchers who are interested in studying students' online self-regulated learning (SRL) have heavily relied on self-reported surveys. Data mining is an alternative technique that can be used to discover students' SRL patterns from large data logs saved on a course management system. The purpose of this study was to identify students' online…

  11. Cable yarding residue after thinning young stands: a break-even simulation

    Treesearch

    Chris B. LeDoux

    1984-01-01

    The use of cable logging to extract small pieces of residue wood may result in low rates of production and a high cost per unit of wood produced. However, the logging manager can improve yarding productivity and break even in cable residue removal operations by using the proper planning techniques. In this study, breakeven zones for specific young-growth stands were...

  12. Investigation of methods and approaches for collecting and recording highway inventory data.

    DOT National Transportation Integrated Search

    2013-06-01

    Many techniques for collecting highway inventory data have been used by state and local agencies in the U.S. These techniques include field inventory, photo/video log, integrated GPS/GIS mapping systems, aerial photography, satellite imagery, vir...

  13. Flood frequency analysis using optimization techniques : final report.

    DOT National Transportation Integrated Search

    1992-10-01

    This study consists of three parts. In the first part, a comprehensive investigation was made to find an improved estimation method for the log-Pearson type 3 (LP3) distribution by using optimization techniques. Ninety sets of observed Louisiana floo...

  14. Operator product expansion in Liouville field theory and Seiberg-type transitions in log-correlated random energy models

    NASA Astrophysics Data System (ADS)

    Cao, Xiangyu; Le Doussal, Pierre; Rosso, Alberto; Santachiara, Raoul

    2018-04-01

    We study transitions in log-correlated random energy models (logREMs) that are related to the violation of a Seiberg bound in Liouville field theory (LFT): the binding transition and the termination point transition (a.k.a. pre-freezing). By means of the LFT-logREM mapping, replica symmetry breaking and traveling-wave equation techniques, we unify both transitions in a two-parameter diagram, which describes the free-energy large deviations of logREMs with a deterministic background log potential, or equivalently, the joint moments of the free energy and Gibbs measure in logREMs without background potential. Under the LFT-logREM mapping, the transitions correspond to the competition of discrete and continuous terms in a four-point correlation function. Our results provide a statistical interpretation of a peculiar nonlocality of the operator product expansion in LFT. The results are rederived by a traveling-wave equation calculation, which shows that the features of LFT responsible for the transitions are reproduced in a simple model of diffusion with absorption. We also examine the problem by a replica symmetry breaking analysis. It complements the previous methods and reveals a rich large deviation structure of the free energy of logREMs with a deterministic background log potential. Many results are verified in the integrable circular logREM, by a replica-Coulomb gas integral approach. The related problem of common length (overlap) distribution is also considered. We provide a traveling-wave equation derivation of the LFT predictions announced in a preceding work.

  15. Novel non-invasive biological predictive index for liver fibrosis in hepatitis C virus genotype 4 patients

    PubMed Central

    Khattab, Mahmoud; Sakr, Mohamed Amin; Fattah, Mohamed Abdel; Mousa, Youssef; Soliman, Elwy; Breedy, Ashraf; Fathi, Mona; Gaber, Salwa; Altaweil, Ahmed; Osman, Ashraf; Hassouna, Ahmed; Motawea, Ibrahim

    2016-01-01

    AIM To investigate the diagnostic ability of a non-invasive biological marker to predict liver fibrosis in hepatitis C genotype 4 patients with high accuracy. METHODS A cohort of 332 patients infected with hepatitis C genotype 4 was included in this cross-sectional study. Fasting plasma glucose, insulin, C-peptide, and angiotensin-converting enzyme serum levels were measured. Insulin resistance was mathematically calculated using the homeostasis model of insulin resistance (HOMA-IR). RESULTS Fibrosis stages were distributed based on Metavir score as follows: F0 = 43, F1 = 136, F2 = 64, F3 = 45 and F4 = 44. Statistical analysis relied upon reclassification of fibrosis stages into mild fibrosis (F0-F1) = 179, moderate fibrosis (F2) = 64, and advanced fibrosis (F3-F4) = 89. Univariate analysis indicated that age, log aspartate amino transaminase, log HOMA-IR and log platelet count were independent predictors of liver fibrosis stage (P < 0.0001). A stepwise multivariate discriminant functional analysis was used to derive a discriminative model for liver fibrosis. Our index used cut-off values of ≥ 0.86 and ≤ -0.31 to diagnose advanced and mild fibrosis, respectively, with receiver operating characteristic curve areas of 0.91 and 0.88, respectively. The sensitivity, specificity, positive predictive value, negative predictive value and positive likelihood ratio were: 73%, 91%, 75%, 90% and 8.0, respectively, for advanced fibrosis, and 67%, 88%, 84%, 70% and 4.9, respectively, for mild fibrosis. CONCLUSION Our predictive model is easily available and reproducible, and predicted liver fibrosis with acceptable accuracy. PMID:27917265
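
    The two cut-offs reported above make the index a three-way decision rule; a trivial sketch (the function name and the "indeterminate" label for the middle zone are mine, not the paper's):

```python
def classify_fibrosis(index_value):
    """Apply the reported cut-offs: >= 0.86 -> advanced (F3-F4),
    <= -0.31 -> mild (F0-F1), otherwise indeterminate."""
    if index_value >= 0.86:
        return "advanced"
    if index_value <= -0.31:
        return "mild"
    return "indeterminate"
```

Patients falling between the cut-offs would need further work-up (e.g., elastography or biopsy), which is the usual trade-off of dual-cut-off indexes.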

  16. Novel non-invasive biological predictive index for liver fibrosis in hepatitis C virus genotype 4 patients.

    PubMed

    Khattab, Mahmoud; Sakr, Mohamed Amin; Fattah, Mohamed Abdel; Mousa, Youssef; Soliman, Elwy; Breedy, Ashraf; Fathi, Mona; Gaber, Salwa; Altaweil, Ahmed; Osman, Ashraf; Hassouna, Ahmed; Motawea, Ibrahim

    2016-11-18

    To investigate the diagnostic ability of a non-invasive biological marker to predict liver fibrosis in hepatitis C genotype 4 patients with high accuracy. A cohort of 332 patients infected with hepatitis C genotype 4 was included in this cross-sectional study. Fasting plasma glucose, insulin, C-peptide, and angiotensin-converting enzyme serum levels were measured. Insulin resistance was mathematically calculated using the homeostasis model of insulin resistance (HOMA-IR). Fibrosis stages were distributed based on Metavir score as follows: F0 = 43, F1 = 136, F2 = 64, F3 = 45 and F4 = 44. Statistical analysis relied upon reclassification of fibrosis stages into mild fibrosis (F0-F1) = 179, moderate fibrosis (F2) = 64, and advanced fibrosis (F3-F4) = 89. Univariate analysis indicated that age, log aspartate amino transaminase, log HOMA-IR and log platelet count were independent predictors of liver fibrosis stage (P < 0.0001). A stepwise multivariate discriminant functional analysis was used to derive a discriminative model for liver fibrosis. Our index used cut-off values of ≥ 0.86 and ≤ -0.31 to diagnose advanced and mild fibrosis, respectively, with receiver operating characteristic curve areas of 0.91 and 0.88, respectively. The sensitivity, specificity, positive predictive value, negative predictive value and positive likelihood ratio were: 73%, 91%, 75%, 90% and 8.0, respectively, for advanced fibrosis, and 67%, 88%, 84%, 70% and 4.9, respectively, for mild fibrosis. Our predictive model is easily available and reproducible, and predicted liver fibrosis with acceptable accuracy.

  17. "Geo-statistics methods and neural networks in geophysical applications: A case study"

    NASA Astrophysics Data System (ADS)

    Rodriguez Sandoval, R.; Urrutia Fucugauchi, J.; Ramirez Cruz, L. C.

    2008-12-01

    The study focuses on the Ebano-Panuco basin of northeastern Mexico, which is being explored for hydrocarbon reservoirs. These reservoirs are in limestones, and there is interest in determining porosity and permeability in the carbonate sequences. The porosity maps presented in this study are estimated from application of multiattribute and neural network techniques, which combine geophysical logs and 3-D seismic data by means of statistical relationships. The multiattribute analysis is a process to predict a volume of any underground petrophysical measurement from well-log and seismic data. The data consist of a series of target logs from wells which tie a 3-D seismic volume. The target logs are neutron porosity logs. From the 3-D seismic volume a series of sample attributes is calculated. The objective of this study is to derive a relationship between a set of such attributes and the target log values. The selected set is determined by a process of forward stepwise regression. The analysis can be linear or nonlinear. In the linear mode the method consists of a series of weights derived by least-squares minimization. In the nonlinear mode, a neural network is trained using the selected attributes as inputs. In this case we used a probabilistic neural network (PNN). The method is applied to a real data set from PEMEX. For better reservoir characterization the porosity distribution was estimated using both techniques. The case shows a continuous improvement in the prediction of the porosity from the multiattribute to the neural network analysis. The improvement is in the training and the validation, which are important indicators of the reliability of the results. The neural network showed an improvement in resolution over the multiattribute analysis. The final maps provide more realistic results of the porosity distribution.

  18. Geophysical Log Data from Basalt Aquifers Near Waipahu on the Island of Oahu and Pahoa on the Island of Hawaii, Hawaii

    USGS Publications Warehouse

    Paillet, Frederick L.; Hess, Alfred E.

    1995-01-01

    Two relatively new geophysical logging techniques, the digitally enhanced borehole acoustic televiewer and the heat-pulse flowmeter, were tested from 1987 to 1991 at two sites in Hawaii: Waipahu on the island of Oahu, and Pahoa on the island of Hawaii. Although these data were obtained in an effort to test and improve these two logging techniques, the measurements are of interest to hydrologists studying the aquifers in Hawaii. This report presents a review of the measurements conducted during this effort and summarizes the data obtained in a form designed to make that data available to hydrologists studying the movement of ground water in Hawaiian aquifers. Caliper logs obtained at the Waipahu site indicate the distribution of openings in interbed clinker zones between relatively dense and impermeable basalt flows. The flowmeter data indicate the pattern of flow induced along seven observation boreholes that provide conduits between interbed zones in the vicinity of the Mahoe Pumping Station at the Waipahu site. The televiewer image logs obtained in some of the Waipahu Mahoe boreholes do not show any significant vertical or steeply dipping fractures that might allow communication across the dense interior of basalt flows. Acoustic televiewer logs obtained at the Pahoa site show that a number of steeply dipping fractures and dikes cut across basalt flows. Although flow under ambient hydraulic-head conditions in the Waipahu Mahoe Observation boreholes is attributed to hydraulic gradients associated with pumping from a nearby pumping station, flow in the Waipio Deep Observation borehole on Oahu and flow in the Scientific Observation borehole on Hawaii are attributed to the effects of natural recharge and downward decreasing hydraulic heads associated with that recharge.

  19. Sweep visually evoked potentials and visual findings in children with West syndrome.

    PubMed

    de Freitas Dotto, Patrícia; Cavascan, Nívea Nunes; Berezovsky, Adriana; Sacai, Paula Yuri; Rocha, Daniel Martins; Pereira, Josenilson Martins; Salomão, Solange Rios

    2014-03-01

    West syndrome (WS) is a type of early childhood epilepsy characterized by progressive neurological development deterioration that includes vision. To demonstrate the clinical importance of grating visual acuity thresholds (GVA) measurement by the sweep visually evoked potentials technique (sweep-VEP) as a reliable tool for evaluation of the visual cortex status in WS children. This is a retrospective study of the best-corrected binocular GVA and ophthalmological features of WS children referred to the Laboratory of Clinical Electrophysiology of Vision of UNIFESP from 1998 to 2012 (Committee on Ethics in Research of UNIFESP n° 0349/08). The GVA deficit was calculated by subtracting the binocular GVA score (logMAR units) of each patient from the median values of age norms from our own lab and classified as mild (0.1-0.39 logMAR), moderate (0.40-0.80 logMAR) or severe (>0.81 logMAR). Associated ophthalmological features were also described. Data from 30 WS children (age from 6 to 108 months, median = 14.5 months, mean ± SD = 22.0 ± 22.1 months; 19 male) were analyzed. The majority presented severe GVA deficit (0.15-1.44 logMAR; mean ± SD = 0.82 ± 0.32 logMAR; median = 0.82 logMAR), poor visual behavior, high prevalence of strabismus and great variability in ocular positioning. The GVA deficit did not vary according to gender (P = .8022), WS type (P = .908), birth age (P = .2881), perinatal oxygenation (P = .7692), visual behavior (P = .8789), ocular motility (P = .1821), nystagmus (P = .2868), risk of drug-induced retinopathy (P = .4632) or participation in early visual stimulation therapy (P = .9010). The sweep-VEP technique is a reliable tool to classify visual system impairment in WS children, in agreement with the poor visual behavior exhibited by them.
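
    The grading above turns a measured binocular logMAR score into a deficit relative to the age norm. A small sketch of that bookkeeping (assuming the deficit is the patient's score minus the age-norm median, since higher logMAR means worse acuity; the abstract's ranges leave 0.80-0.81 uncovered, and this sketch treats any deficit above 0.80 as severe):

```python
def classify_gva_deficit(patient_logmar, age_norm_median_logmar):
    """Grade a grating visual acuity deficit: the patient's binocular
    logMAR score minus the age-norm median (higher logMAR = worse)."""
    deficit = patient_logmar - age_norm_median_logmar
    if deficit > 0.80:
        grade = "severe"
    elif deficit >= 0.40:
        grade = "moderate"
    elif deficit >= 0.10:
        grade = "mild"
    else:
        grade = "within norms"
    return deficit, grade
```
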

  20. Disinfection of an advanced primary effluent with peracetic acid and ultraviolet combined treatment: a continuous-flow pilot plant study.

    PubMed

    González, Abelardo; Gehr, Ronald; Vaca, Mabel; López, Raymundo

    2012-03-01

    Disinfection of an advanced primary effluent using a continuous-flow combined peracetic acid/ultraviolet (PAA/UV) radiation system was evaluated. The purpose was to determine whether the maximum microbial content established under Mexican standards for treated wastewaters meant for reuse--less than 240 most probable number fecal coliforms (FC)/100 mL--could be feasibly accomplished using either disinfectant individually, or the combined PAA/UV system. This meant achieving reductions of up to 5 logs, considering initial concentrations of 6.4 x 10(+6) to 5.8 x 10(+7) colony forming units/100 mL. During these experiments, total coliforms (TC) were counted because FC, at the most, will be equal to TC. Peracetic acid disinfection achieved less than 1.5 logs TC reduction when the C(t) x t product was less than 2.26 mg x min/L; 3.8 logs for a C(t) x t of 4.40 mg x min/L; and 5.9 logs for a C(t) x t of 24.2 mg x min/L. In continuous-flow UV irradiation tests at a low operating flow (21 L/min; conditions which produced an average UV fluence of 13.0 mJ/cm2), the highest TC reduction was close to 2.5 logs. The only condition that produced a disinfection efficiency of approximately 5 logs, when both disinfection agents were used together, was the combined process dosing 30 mg PAA/L at a pilot plant flow of 21 L/min and a contact time of 10 minutes, to attain an average C(t) x t product of 24.2 mg x min/L and an average UV fluence of 13 mJ/cm2. There was no conclusive evidence of a synergistic effect when both disinfectants were employed in combination as compared to the individual effects achieved when used separately, but this does not take into account the nonlinearity (tailing-off) of the dose-response curve.
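
    The "log reduction" figures above are base-10 ratios of influent to effluent counts, and the C(t) x t dose is residual disinfectant concentration multiplied by contact time. A quick sketch with hypothetical counts (the 2.42 mg/L average residual below is an illustrative value, not one reported in the study):

```python
import math

def log_reduction(count_in, count_out):
    """Base-10 log reduction: e.g. 10**7 -> 100 organisms/100 mL is 5 logs."""
    return math.log10(count_in / count_out)

def ct_product(avg_residual_mg_per_l, contact_time_min):
    """Disinfectant dose as average residual concentration x contact time."""
    return avg_residual_mg_per_l * contact_time_min

# A hypothetical average residual of 2.42 mg/L over a 10-min contact would
# give the 24.2 mg x min/L dose quoted above (well below the 300 mg x min/L
# nominal value of 30 mg/L x 10 min, since the PAA residual decays).
```
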

  1. Improving Models for Coseismic And Postseismic Deformation from the 2002 Denali, Alaska Earthquake

    NASA Astrophysics Data System (ADS)

    Harper, H.; Freymueller, J. T.

    2016-12-01

    Given the multi-decadal temporal scale of postseismic deformation, predictions of previous models for postseismic deformation resulting from the 2002 Denali Fault earthquake (M 7.9) do not agree with longer-term observations. In revising the past postseismic models with what is now over a decade of data, the first step is revisiting the coseismic displacements and slip distribution of the earthquake. Advances in processing allow us to better constrain coseismic displacement estimates, which affect slip distribution predictions in modeling. Additionally, updating the slip model structure from a homogeneous model to a layered model rectifies previous inconsistencies between coseismic and postseismic models. Previous studies have shown that two primary processes contribute to postseismic deformation: afterslip, which decays with a short time constant; and viscoelastic relaxation, which decays with a longer time constant. We fit continuous postseismic GPS time series with three different relaxation models: 1) logarithmic decay + exponential decay, 2) log + exp + exp, and 3) log + log + exp. A grid search is used to minimize the total model weighted residual sum of squares (WRSS), and we find optimal relaxation times of: 1) 0.125 years (log) and 21.67 years (exp); 2) 0.14 years (log), 0.68 years (exp), and 28.33 years (exp); 3) 0.055 years (log), 14.44 years (log), and 22.22 years (exp). While there is not a one-to-one correspondence between a particular decay constant and a mechanism, optimizing these constants allows us to model future time series and constrain the contributions of different postseismic processes.
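
    The model-1 fit (logarithmic + exponential decay with a grid search over relaxation times) can be sketched as below. This is a minimal illustration, not the study's code: the synthetic displacement series, amplitudes, and coarse grids are assumptions, and unit weights stand in for the WRSS weighting.

```python
import math

def model_basis(t, tau_log, tau_exp):
    """Basis functions for d(t) = a*log(1 + t/tau_log) + b*(1 - exp(-t/tau_exp))."""
    return math.log(1 + t / tau_log), 1 - math.exp(-t / tau_exp)

def fit_amplitudes(times, disp, tau_log, tau_exp):
    """Least-squares amplitudes (a, b) for fixed relaxation times, via the
    2x2 normal equations (the model is linear in a and b)."""
    s11 = s12 = s22 = s1d = s2d = 0.0
    for t, d in zip(times, disp):
        f1, f2 = model_basis(t, tau_log, tau_exp)
        s11 += f1 * f1; s12 += f1 * f2; s22 += f2 * f2
        s1d += f1 * d;  s2d += f2 * d
    det = s11 * s22 - s12 * s12
    return (s22 * s1d - s12 * s2d) / det, (s11 * s2d - s12 * s1d) / det

def grid_search(times, disp, tau_log_grid, tau_exp_grid):
    """Return the (tau_log, tau_exp) pair minimizing the residual sum of
    squares over the grid (unit weights in place of WRSS weighting)."""
    best = None
    for tl in tau_log_grid:
        for te in tau_exp_grid:
            a, b = fit_amplitudes(times, disp, tl, te)
            rss = 0.0
            for t, d in zip(times, disp):
                f1, f2 = model_basis(t, tl, te)
                rss += (d - a * f1 - b * f2) ** 2
            if best is None or rss < best[0]:
                best = (rss, tl, te)
    return best[1], best[2]

# Synthetic 14-year series with the paper's model-1 optimum built in
# (amplitudes 12 and 40 mm are invented for illustration).
times = [0.05 * i for i in range(1, 281)]
disp = [12.0 * math.log(1 + t / 0.125) + 40.0 * (1 - math.exp(-t / 21.67))
        for t in times]
tl, te = grid_search(times, disp, [0.05, 0.125, 0.5], [10.0, 21.67, 40.0])
```

    Because the amplitudes are linear parameters, only the relaxation times need the grid; each grid node is solved in closed form, which is what keeps this kind of search cheap.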

  2. NMR Methods, Applications and Trends for Groundwater Evaluation and Management

    NASA Astrophysics Data System (ADS)

    Walsh, D. O.; Grunewald, E. D.

    2011-12-01

    Nuclear magnetic resonance (NMR) measurements have tremendous potential for improving groundwater characterization, as they provide direct detection and measurement of groundwater and unique information about pore-scale properties. NMR measurements, commonly used in chemistry and medicine, are utilized in geophysical investigations through non-invasive surface NMR (SNMR) or downhole NMR logging measurements. Our recent and ongoing research has focused on improving the performance and interpretation of NMR field measurements for groundwater characterization. Engineering advancements have addressed several key technical challenges associated with SNMR measurements. Susceptibility of SNMR measurements to environmental noise has been dramatically reduced through the development of multi-channel acquisition hardware and noise-cancellation software. Multi-channel instrumentation (up to 12 channels) has also enabled more efficient 2D and 3D imaging. Previous limitations in measuring NMR signals from water in silt, clay and magnetic geology have been addressed by shortening the instrument dead-time from 40 ms to 4 ms, and increasing the power output. Improved pulse sequences have been developed to more accurately estimate NMR relaxation times and their distributions, which are sensitive to pore size distributions. Cumulatively, these advancements have vastly expanded the range of environments in which SNMR measurements can be obtained, enabling detection of groundwater in smaller pores, in magnetic geology, in the unsaturated zone, and near infrastructure (presented here in case studies). NMR logging can provide high-resolution estimates of bound and mobile water content and pore size distributions. While NMR logging has been utilized in oil and gas applications for decades, its use in groundwater investigations has been limited by the large size and high cost of oilfield NMR logging tools and services.
Recently, engineering efforts funded by the US Department of Energy have produced an NMR logging tool that is much smaller and less costly than comparable oilfield NMR logging tools. This system is specifically designed for near-surface groundwater investigations, incorporates small-diameter probes (as small as 1.67 inches in diameter) and man-portable surface stations, and provides NMR data and information content on par with oilfield NMR logging tools. A direct-push variant of this logging tool has also been developed. Key challenges associated with small-diameter tools include inherently lower SNR and logging speeds, and the desire to extend the sensitive zone as far as possible into unconsolidated formations while simultaneously maintaining high power and signal fidelity. Our ongoing research in groundwater NMR aims to integrate surface and borehole measurements for regional-scale permeability mapping, and to develop in-place NMR sensors for long-term monitoring of contaminant and remediation processes. In addition to groundwater resource characterization, promising new applications of NMR include assessing water content in ice and permafrost, management of groundwater in mining operations, and evaluation and management of groundwater in civil engineering applications.

  3. Optimization of antibacterial activity by Gold-Thread (Coptidis Rhizoma Franch) against Streptococcus mutans using evolutionary operation-factorial design technique.

    PubMed

    Choi, Ung-Kyu; Kim, Mi-Hyang; Lee, Nan-Hee

    2007-11-01

    This study was conducted to find the optimum extraction conditions of Gold-Thread for antibacterial activity against Streptococcus mutans using the evolutionary operation (EVOP)-factorial design technique. Higher antibacterial activity was achieved at a higher extraction temperature (R2 = -0.79) and with a longer extraction time (R2 = -0.71). Antibacterial activity was not affected by the ethanol concentration of the extraction solvent (R2 = -0.12). The maximum antibacterial activity of Gold-Thread against S. mutans determined by the EVOP-factorial technique was obtained at an 80 degrees C extraction temperature, 26 h extraction time, and 50% ethanol concentration. The population of S. mutans decreased from 6.110 log CFU/mL in the initial set to 4.125 log CFU/mL in the third set.

  4. QUANTIFICATION OF IN-SITU GAS HYDRATES WITH WELL LOGS.

    USGS Publications Warehouse

    Collett, Timothy S.; Godbole, Sanjay P.; Economides, Christine

    1984-01-01

    This study evaluates in detail the expected theoretical log responses and the actual log responses within one stratigraphically controlled hydrate horizon in six wells spaced throughout the Kuparuk Oil Field. Detailed examination of the neutron porosity and sonic velocity responses within the horizon is included. In addition, the theoretical effect of the presence of hydrates on the neutron porosity and sonic velocity devices has been examined in order to correct for such an effect on the calculation of formation properties such as porosity and hydrate saturation. Also presented in the paper is a technique which allows the conclusive identification of a potential hydrate occurrence.

  5. Self organizing map neural networks approach for lithologic interpretation of nuclear and electrical well logs in basaltic environment, Southern Syria.

    PubMed

    Asfahani, J; Ahmad, Z; Ghani, B Abdul

    2018-07-01

    An approach based on self-organizing map (SOM) artificial neural networks is proposed for interpreting nuclear and electrical well logging data. The well logging measurements of the Kodana well in Southern Syria have been interpreted by applying the proposed approach. A lithological cross-section model of the basaltic environment has been derived, and four different kinds of basalt have consequently been distinguished: hard massive basalt, hard basalt, pyroclastic basalt, and altered basalt products (clay). The results obtained by SOM artificial neural networks are in good agreement with previously published results obtained by other techniques. The SOM approach was applied successfully in the case study of the Kodana well logging data, and can therefore be recommended as a suitable and effective approach for handling large well logging data sets with the higher number of variables required for lithological discrimination purposes. Copyright © 2018 Elsevier Ltd. All rights reserved.
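
    A minimal self-organizing map fits in a few dozen lines. This is a generic 1-D SOM sketch, not the authors' implementation; the two synthetic feature clusters (standing in for, say, normalized natural gamma and resistivity readings of two basalt types) are assumptions for illustration.

```python
import math
import random

def train_som(data, n_units, epochs=200, lr0=0.5, radius0=None, seed=0):
    """Train a 1-D self-organizing map: for each input, pull the
    best-matching unit (BMU) and its neighbours towards the input,
    shrinking the learning rate and neighbourhood radius over time."""
    rng = random.Random(seed)
    dim = len(data[0])
    weights = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    if radius0 is None:
        radius0 = n_units / 2.0
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i] = [w + lr * h * (v - w) for w, v in zip(weights[i], x)]
    return weights

def bmu_index(weights, x):
    """Index of the map unit nearest to feature vector x."""
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

# Two synthetic "lithology" clusters, e.g. (gamma, resistivity) in [0, 1].
hard_basalt = [[0.10 + random.Random(i).random() * 0.05, 0.9] for i in range(20)]
pyroclastic = [[0.80 + random.Random(i).random() * 0.05, 0.2] for i in range(20)]
weights = train_som(hard_basalt + pyroclastic, n_units=4)
```

    After training, samples from the two clusters map to different best-matching units, which is the unsupervised grouping the lithological interpretation builds on.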

  6. A depth video sensor-based life-logging human activity recognition system for elderly care in smart indoor environments.

    PubMed

    Jalal, Ahmad; Kamal, Shaharyar; Kim, Daijin

    2014-07-02

    Recent advancements in depth video sensor technologies have made human activity recognition (HAR) realizable for elderly monitoring applications. Although conventional HAR utilizes RGB video sensors, HAR could be greatly improved with depth video sensors, which produce depth or distance information. In this paper, a depth-based life-logging HAR system is designed to recognize the daily activities of elderly people and turn these environments into an intelligent living space. Initially, a depth imaging sensor is used to capture depth silhouettes. Based on these silhouettes, human skeletons with joint information are produced, which are further used for activity recognition and generating life logs. The life-logging system is divided into two processes. Firstly, the training system includes data collection using a depth camera, feature extraction, and training for each activity via Hidden Markov Models. Secondly, after training, the recognition engine starts to recognize the learned activities and produces life logs. The system was evaluated using life-logging features against principal component and independent component features and achieved satisfactory recognition rates against the conventional approaches. Experiments conducted on the smart indoor activity datasets and the MSRDailyActivity3D dataset show promising results. The proposed system is directly applicable to any elderly monitoring system, such as monitoring healthcare problems for elderly people, or examining the indoor activities of people at home, office or hospital.

  7. A Depth Video Sensor-Based Life-Logging Human Activity Recognition System for Elderly Care in Smart Indoor Environments

    PubMed Central

    Jalal, Ahmad; Kamal, Shaharyar; Kim, Daijin

    2014-01-01

    Recent advancements in depth video sensor technologies have made human activity recognition (HAR) realizable for elderly monitoring applications. Although conventional HAR utilizes RGB video sensors, HAR could be greatly improved with depth video sensors, which produce depth or distance information. In this paper, a depth-based life-logging HAR system is designed to recognize the daily activities of elderly people and turn these environments into an intelligent living space. Initially, a depth imaging sensor is used to capture depth silhouettes. Based on these silhouettes, human skeletons with joint information are produced, which are further used for activity recognition and generating life logs. The life-logging system is divided into two processes. Firstly, the training system includes data collection using a depth camera, feature extraction, and training for each activity via Hidden Markov Models. Secondly, after training, the recognition engine starts to recognize the learned activities and produces life logs. The system was evaluated using life-logging features against principal component and independent component features and achieved satisfactory recognition rates against the conventional approaches. Experiments conducted on the smart indoor activity datasets and the MSRDailyActivity3D dataset show promising results. The proposed system is directly applicable to any elderly monitoring system, such as monitoring healthcare problems for elderly people, or examining the indoor activities of people at home, office or hospital. PMID:24991942

  8. Preliminary geological investigation of AIS data at Mary Kathleen, Queensland, Australia

    NASA Technical Reports Server (NTRS)

    Huntington, J. F.; Green, A. A.; Craig, M. D.; Cocks, T. D.

    1986-01-01

    The Airborne Imaging Spectrometer (AIS) was flown over granitic, volcanic, and calc-silicate terrain around the Mary Kathleen Uranium Mine in Queensland, in a test of its mineralogical mapping capabilities. An analysis strategy and restoration and enhancement techniques were developed to process the 128-band AIS data. A preliminary analysis of one of three AIS flight lines shows that the data contain considerable spectral variation but are also contaminated by second-order leakage of radiation from the near-infrared region. This makes the recognition of expected spectral absorption shapes very difficult. The effect appears worst in terrains containing considerable vegetation. Techniques that attempt to predict this supplementary radiation, coupled with the log residual analytical technique, show that expected mineral absorption spectra can be derived. These techniques suggest that, with additional refinement of the correction procedures, the Australian AIS data may be recovered. Application of the log residual analysis method has proved very successful on the Cuprite, Nevada, data set, highlighting the alunite, kaolinite, and SiOH mineralogy.
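
    The log residual transform mentioned here (commonly attributed to Green and Craig) removes a per-pixel brightness term and a per-band gain term from the log reflectance, leaving only relative spectral shape. A small sketch, with a made-up 2-pixel, 3-band matrix:

```python
import math

def log_residuals(data):
    """Log residuals of a pixels x bands matrix: subtract each pixel's mean
    log (illumination/topography term) and each band's mean log (instrument
    gain term), then add back the overall mean log."""
    logs = [[math.log(v) for v in row] for row in data]
    n_pix, n_band = len(logs), len(logs[0])
    pix_mean = [sum(row) / n_band for row in logs]
    band_mean = [sum(logs[i][j] for i in range(n_pix)) / n_pix
                 for j in range(n_band)]
    overall = sum(pix_mean) / n_pix
    return [[logs[i][j] - pix_mean[i] - band_mean[j] + overall
             for j in range(n_band)] for i in range(n_pix)]

# A pixel that is a uniformly scaled (e.g. shadowed) copy of another
# gets identical residuals, which is the point of the transform:
resid = log_residuals([[1.0, 2.0, 4.0], [2.0, 4.0, 8.0]])
```

    This multiplicative-factor removal is why the method can pull mineral absorption shapes out of data contaminated by broadband illumination effects, though it cannot by itself fix the second-order leakage described above.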

  9. Teaching an Old Log New Tricks with Machine Learning.

    PubMed

    Schnell, Krista; Puri, Colin; Mahler, Paul; Dukatz, Carl

    2014-03-01

    To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights-no longer limited to the realm of reactive IT management, but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source.

  10. Proposed standard-weight (Ws) equation and length-categorization standards for brown trout (Salmo trutta) in lentic habitats

    USGS Publications Warehouse

    Hyatt, M.W.; Hubert, W.A.

    2001-01-01

    We developed a standard-weight (Ws) equation for brown trout (Salmo trutta) in lentic habitats by applying the regression-line-percentile technique to samples from 49 populations in North America. The proposed Ws equation is log10 Ws = -5.422 + 3.194 log10 TL, where Ws is in grams and TL is total length in millimeters. The English-unit equivalent is log10 Ws = -3.592 + 3.194 log10 TL, where Ws is in pounds and TL is total length in inches. The equation is applicable for fish of 140-750 mm TL. Proposed length-category standards to evaluate fish within populations are: stock, 200 mm (8 in); quality, 300 mm (12 in); preferred, 400 mm (16 in); memorable, 500 mm (20 in); and trophy, 600 mm (24 in).
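
    The metric Ws equation and the length categories translate directly into code. The equation and thresholds are from the abstract; the function names and the "substock" label for fish below the stock threshold are our own choices:

```python
import math

def standard_weight_g(total_length_mm):
    """Proposed metric Ws equation: log10 Ws = -5.422 + 3.194 log10 TL."""
    if not 140 <= total_length_mm <= 750:
        raise ValueError("equation applies to 140-750 mm TL only")
    return 10 ** (-5.422 + 3.194 * math.log10(total_length_mm))

def length_category(total_length_mm):
    """Length-category standards proposed for lentic brown trout."""
    for name, threshold in [("trophy", 600), ("memorable", 500),
                            ("preferred", 400), ("quality", 300),
                            ("stock", 200)]:
        if total_length_mm >= threshold:
            return name
    return "substock"  # below the stock length; label is ours

ws = standard_weight_g(400)   # about 774 g for a 400 mm fish
cat = length_category(400)    # "preferred"
```

    Comparing an observed weight to Ws gives the relative weight (Wr = 100 × W/Ws) commonly used to assess condition.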

  11. Long-term impacts of recurrent logging and fire in Amazon forests: a modeling study using the Ecosystem Demography Model (ED2)

    NASA Astrophysics Data System (ADS)

    Longo, M.; Keller, M.; Scaranello, M. A., Sr.; dos-Santos, M. N.; Xu, Y.; Huang, M.; Morton, D. C.

    2017-12-01

    Logging and understory fires are major drivers of tropical forest degradation, reducing carbon stocks and changing forest structure, composition, and dynamics. In contrast to deforested areas, sites that are disturbed by logging and fires retain some, albeit severely altered, forest structure and function. In this study we simulated selective logging using the Ecosystem Demography Model (ED-2) to investigate the impact of a broad range of logging techniques, harvest intensities, and recurrence cycles on the long-term dynamics of Amazon forests, including the magnitude and duration of changes in forest flammability following timber extraction. Model results were evaluated using eddy covariance towers at logged sites at the Tapajos National Forest in Brazil and data on long-term dynamics reported in the literature. ED-2 is able to reproduce both the fast (<5 yr) recovery of water and energy fluxes observed at the flux towers and the typical, field-observed, decadal time scales for biomass recovery when no additional logging occurs. Preliminary results using the original ED-2 fire model show that the canopy cover loss of forests under high-intensity, conventional logging causes sufficient drying to support more intense fires. These results indicate that under intense degradation, forests may shift to novel disturbance regimes, severely reducing carbon stocks and inducing long-term changes in forest structure and composition from recurrent fires.

  12. Helmet and shoulder pad removal in football players with unstable cervical spine injuries.

    PubMed

    Dahl, Michael C; Ananthakrishnan, Dheera; Nicandri, Gregg; Chapman, Jens R; Ching, Randal P

    2009-05-01

    Football, one of the country's most popular team sports, is associated with the largest overall number of sports-related catastrophic cervical spine injuries in the United States (Mueller, 2007). Patient handling can be hindered by the protective sports equipment worn by the athlete, and improper stabilization of these patients can exacerbate neurologic injury. Because of the lack of consensus on the best method for equipment removal, a study was performed comparing three techniques: full body levitation, upper torso tilt, and log roll. These techniques were performed on an intact and a lesioned cervical spine cadaveric model simulating conditions in the emergency department. The levitation technique was found to produce motion in the anterior and right lateral directions. The tilt technique resulted in motions in the posterior and left lateral directions, and the log roll technique generated motions in the right lateral direction and showed the largest increase in instability when comparing the intact and lesioned specimens. These findings suggest that each method of equipment removal displays unique weaknesses that the practitioner should take into account, possibly on a patient-by-patient basis.

  13. Empirical relationships between tree fall and landscape-level amounts of logging and fire

    PubMed Central

    Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C.

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape. PMID:29474487

  14. Empirical relationships between tree fall and landscape-level amounts of logging and fire.

    PubMed

    Lindenmayer, David B; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape.

  15. Nondestructive Testing Technique to Quantify Deterioration from Marine Borer Attack in Sitka Spruce and Western Hemlock Logs: Observations from a Pilot Test

    Treesearch

    Robert Ross; John W. Forsman; John R. Erickson; Allen M. Brackley

    2014-01-01

    Stress-wave nondestructive evaluation (NDE) techniques are used widely in the forest products industry—from the grading of wood veneer to inspection of timber structures. Inspection professionals frequently use stress-wave NDE techniques to locate internal voids and decayed or deteriorated areas in large timbers. Although these techniques have proven useful, little...

  16. Low-Cost Evaluation of EO-1 Hyperion and ALI for Detection and Biophysical Characterization of Forest Logging in Amazonia (NCC5-481)

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P.; Keller, Michael M.; Silva, Jose Natalino; Zweede, Johan C.; Pereira, Rodrigo, Jr.

    2002-01-01

    Major uncertainties exist regarding the rate and intensity of logging in tropical forests worldwide: these uncertainties severely limit economic, ecological, and biogeochemical analyses of these regions. Recent sawmill surveys in the Amazon region of Brazil show that the area logged is nearly equal to total area deforested annually, but conversion of survey data to forest area, forest structural damage, and biomass estimates requires multiple assumptions about logging practices. Remote sensing could provide an independent means to monitor logging activity and to estimate the biophysical consequences of this land use. Previous studies have demonstrated that the detection of logging in Amazon forests is difficult and no studies have developed either the quantitative physical basis or remote sensing approaches needed to estimate the effects of various logging regimes on forest structure. A major reason for these limitations has been a lack of sufficient, well-calibrated optical satellite data, which in turn, has impeded the development and use of physically-based, quantitative approaches for detection and structural characterization of forest logging regimes. We propose to use data from the EO-1 Hyperion imaging spectrometer to greatly increase our ability to estimate the presence and structural attributes of selective logging in the Amazon Basin. Our approach is based on four "biogeophysical indicators" not yet derived simultaneously from any satellite sensor: 1) green canopy leaf area index; 2) degree of shadowing; 3) presence of exposed soil and; 4) non-photosynthetic vegetation material. Airborne, field and modeling studies have shown that the optical reflectance continuum (400-2500 nm) contains sufficient information to derive estimates of each of these indicators. Our ongoing studies in the eastern Amazon basin also suggest that these four indicators are sensitive to logging intensity. 
Satellite-based estimates of these indicators should provide a means to quantify both the presence and degree of structural disturbance caused by various logging regimes. Our quantitative assessment of Hyperion hyperspectral and ALI multi-spectral data for the detection and structural characterization of selective logging in Amazonia will benefit from data collected through an ongoing project run by the Tropical Forest Foundation, within which we have developed a study of the canopy and landscape biophysics of conventional and reduced-impact logging. We will add to our base of forest structural information in concert with an EO-1 overpass. Using a photon transport model inversion technique that accounts for non-linear mixing of the four biogeophysical indicators, we will estimate these parameters across a gradient of selective logging intensity provided by conventional and reduced impact logging sites. We will also compare our physically-based approach to both conventional (e.g., NDVI) and novel (e.g., SWIR-channel) vegetation indices as well as to linear mixture modeling methods. We will cross-compare these approaches using Hyperion and ALI imagers to determine the strengths and limitations of these two sensors for applications of forest biophysics. This effort will yield the first physically-based, quantitative analysis of the detection and intensity of selective logging in Amazonia, comparing hyperspectral and improved multi-spectral approaches as well as inverse modeling, linear mixture modeling, and vegetation index techniques.
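
    The linear mixture modeling mentioned above has a closed form in the two-endmember case when the fractions are constrained to sum to one. The endmember spectra below are invented four-band reflectances standing in for canopy and exposed-soil signatures, not Hyperion data:

```python
def unmix_two(observed, em1, em2):
    """Closed-form two-endmember linear unmixing with fractions summing to
    one: observed ≈ f*em1 + (1-f)*em2, solved by projecting (observed - em2)
    onto the endmember difference (em1 - em2)."""
    num = sum((o - b) * (a - b) for o, a, b in zip(observed, em1, em2))
    den = sum((a - b) ** 2 for a, b in zip(em1, em2))
    f = num / den
    return f, 1 - f

# Hypothetical reflectance at four bands for green canopy and exposed soil.
canopy = [0.05, 0.45, 0.30, 0.15]
soil   = [0.20, 0.25, 0.35, 0.40]
obs = [0.3 * c + 0.7 * s for c, s in zip(canopy, soil)]
f_canopy, f_soil = unmix_two(obs, canopy, soil)
```

    With more endmembers (shade, non-photosynthetic vegetation) the problem becomes a constrained least-squares fit per pixel; the non-linear mixing the abstract refers to is what the photon transport model inversion handles instead.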

  17. Robust Library Building for Autonomous Classification of Downhole Geophysical Logs Using Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Silversides, Katherine L.; Melkumyan, Arman

    2017-03-01

    Machine learning techniques such as Gaussian Processes can be used to identify stratigraphically important features in geophysical logs. The marker shales in the banded iron formation hosted iron ore deposits of the Hamersley Ranges, Western Australia, form distinctive signatures in the natural gamma logs. The identification of these marker shales is important for stratigraphic identification of unit boundaries for the geological modelling of the deposit. Each machine learning technique has unique properties that affect the results. For Gaussian Processes (GPs), the output values are inclined towards the mean value, particularly when there is not sufficient information in the library. The impact that these inclinations have on the classification can vary depending on the parameter values selected by the user. Therefore, when applying machine learning techniques, care must be taken to fit the technique to the problem correctly. This study focuses on optimising the settings and choices for training a GP system to identify a specific marker shale. We show that the final results converge even when different, but equally valid, starting libraries are used for the training. To analyse the impact on feature identification, GP models were trained so that the output was inclined towards a positive, neutral or negative output. For this type of classification, the best results were obtained when the pull was towards a negative output. We also show that the GP output can be adjusted by using a standard deviation coefficient that changes the balance between certainty and accuracy in the results.

  18. Visualization of usability and functionality of a professional website through web-mining.

    PubMed

    Jones, Josette F; Mahoui, Malika; Gopa, Venkata Devi Pragna

    2007-10-11

    Functional interface design requires understanding of the information system structure and the user. Web logs record user interactions with the interface and thus provide insight into user search behavior and the efficiency of the search process. The present study uses a data-mining approach, with techniques such as association rules, clustering, and classification, to visualize the usability and functionality of a digital library through in-depth analyses of web logs.

  19. Efficacy of the World Health Organization-recommended handwashing technique and a modified washing technique to remove Clostridium difficile from hands.

    PubMed

    Deschênes, Philippe; Chano, Frédéric; Dionne, Léa-Laurence; Pittet, Didier; Longtin, Yves

    2017-08-01

    The efficacy of the World Health Organization (WHO)-recommended handwashing technique against Clostridium difficile is uncertain, and whether it could be improved remains unknown. Also, the benefit of using a structured technique instead of an unstructured technique remains unclear. This study was a prospective comparison of 3 techniques (unstructured, WHO, and a novel technique dubbed WHO shortened repeated [WHO-SR] technique) to remove C difficile. Ten participants were enrolled and performed each technique. Hands were contaminated with 3 × 10^6 colony forming units (CFU) of a nontoxigenic strain containing 90% spores. Efficacy was assessed using the whole-hand method. The relative efficacy of each technique and of a structured (either WHO or WHO-SR) vs an unstructured technique were assessed by Mann-Whitney U test and Wilcoxon signed-rank test. The median effectiveness of the unstructured, WHO, and WHO-SR techniques in log10 CFU reduction was 1.30 (interquartile range [IQR], 1.27-1.43), 1.71 (IQR, 1.34-1.91), and 1.70 (IQR, 1.54-2.42), respectively. The WHO-SR technique was significantly more efficacious than the unstructured technique (P = .01). Washing hands with a structured technique was more effective than washing with an unstructured technique (median, 1.70 vs 1.30 log10 CFU reduction, respectively; P = .007). A structured washing technique is more effective than an unstructured technique against C difficile. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  20. SAGES's advanced GI/MIS fellowship curriculum pilot project.

    PubMed

    Weis, Joshua J; Goldblatt, Matthew; Pryor, Aurora; Dunkin, Brian J; Brunt, L Michael; Jones, Daniel B; Scott, Daniel J

    2018-06-01

    The American health care system faces deficits in quality and quantity of surgeons. SAGES is a major stakeholder in surgical fellowship training and is responsible for defining the curriculum for the Advanced GI/MIS fellowship. SAGES leadership is actively adapting this curriculum. The process of reform began in 2014 through a series of iterative meetings and discussions. A working group within the Resident and Fellow Training Committee reviewed case log data from 2012 to 2015. These data were used to propose new criteria designed to provide adequate exposure to core content. The working group also proposed using video assessment of an MIS case to provide objective assessment of competency. Case log data were available for 326 fellows with a total of 85,154 cases logged (median 227 per fellow). The working group proposed new criteria starting with minimum case volumes for five defined categories including foregut (20), bariatrics (25), inguinal hernia (10), ventral hernia (10), and solid organ/colon/thoracic (10). Fellows are expected to perform an additional 75 complex MIS cases of any category for a total of 150 required cases overall. The proposal also included a minimum volume of flexible endoscopy (50) and submission of an MIS foregut case for video assessment. The new criteria more clearly defined which surgeon roles count for major credit within individual categories. Fourteen fellowships volunteered to pilot these new criteria for the 2017-2018 academic year. The new SAGES Advanced GI/MIS fellowship has been crafted to better define the core content that should be contained in these fellowships, while still allowing sufficient heterogeneity so that individual learners can tailor their training to specific areas of interest. The criteria also introduce innovative, evidence-based methods for assessing competency. Pending the results of the pilot program, SAGES will consider broad implementation of the new fellowship criteria.

  1. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    NASA Astrophysics Data System (ADS)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

    Disease mapping comprises a set of statistical techniques that produce maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of cases in an area, which has the greatest uncertainty if the disease is rare or if the geographical area is small. Therefore, Bayesian models or statistical smoothing based on the log-normal model have been introduced to address this shortcoming of the SMR. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to data using WinBUGS software. The study starts with a brief review of these models, beginning with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method, and can overcome the SMR problem when there are no observed bladder cancer cases in an area.
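    The two estimators described above can be contrasted in a few lines: the raw SMR (observed/expected), and a simple log-normal shrinkage of log(SMR) toward the overall mean. The paper fits the full Bayesian model in WinBUGS; the counts, continuity correction, and moment-based prior below are illustrative assumptions only.

    ```python
    import numpy as np

    observed = np.array([0, 3, 12, 7, 1], dtype=float)  # hypothetical case counts per area
    expected = np.array([1.2, 2.5, 9.8, 8.1, 0.6])      # hypothetical expected counts

    smr = observed / expected  # classical SMR; unstable when counts are small or zero

    # Log-normal smoothing: work on the log scale and shrink toward the global
    # mean. Areas with zero observed cases get a small continuity correction.
    log_smr = np.log(np.maximum(observed, 0.5) / expected)
    prior_mean, prior_var = log_smr.mean(), log_smr.var()
    sampling_var = 1.0 / np.maximum(observed, 0.5)      # approx. variance of log(SMR)
    weight = prior_var / (prior_var + sampling_var)     # shrink more when data are sparse
    smoothed = np.exp(prior_mean + weight * (log_smr - prior_mean))

    print("SMR:     ", np.round(smr, 2))
    print("smoothed:", np.round(smoothed, 2))
    ```

    Note how the area with zero observed cases gets a nonzero smoothed risk instead of the degenerate SMR of 0, which is the point of the log-normal model in the abstract.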

  2. Advances in antimicrobial photodynamic inactivation at the nanoscale

    PubMed Central

    Kashef, Nasim; Huang, Ying-Ying; Hamblin, Michael R.

    2017-01-01

    The alarming worldwide increase in antibiotic resistance amongst microbial pathogens necessitates a search for new antimicrobial techniques, which will not be affected by, or indeed cause resistance themselves. Light-mediated photoinactivation is one such technique that takes advantage of the whole spectrum of light to destroy a broad spectrum of pathogens. Many of these photoinactivation techniques rely on the participation of a diverse range of nanoparticles and nanostructures that have dimensions very similar to the wavelength of light. Photodynamic inactivation relies on the photochemical production of singlet oxygen from photosensitizing dyes (type II pathway) that can benefit remarkably from formulation in nanoparticle-based drug delivery vehicles. Fullerenes are a closed-cage carbon allotrope nanoparticle with a high absorption coefficient and triplet yield. Their photochemistry is highly dependent on microenvironment, and can be type II in organic solvents and type I (hydroxyl radicals) in a biological milieu. Titanium dioxide nanoparticles act as a large band-gap semiconductor that can carry out photo-induced electron transfer under ultraviolet A light and can also produce reactive oxygen species that kill microbial cells. We discuss some recent studies in which quite remarkable potentiation of microbial killing (up to six logs) can be obtained by the addition of simple inorganic salts such as the non-toxic sodium/potassium iodide, bromide, nitrite, and even the toxic sodium azide. Interesting mechanistic insights were obtained to explain this increased killing. PMID:29226063

  3. Advances in antimicrobial photodynamic inactivation at the nanoscale

    NASA Astrophysics Data System (ADS)

    Kashef, Nasim; Huang, Ying-Ying; Hamblin, Michael R.

    2017-08-01

    The alarming worldwide increase in antibiotic resistance amongst microbial pathogens necessitates a search for new antimicrobial techniques, which will not be affected by, or indeed cause resistance themselves. Light-mediated photoinactivation is one such technique that takes advantage of the whole spectrum of light to destroy a broad spectrum of pathogens. Many of these photoinactivation techniques rely on the participation of a diverse range of nanoparticles and nanostructures that have dimensions very similar to the wavelength of light. Photodynamic inactivation relies on the photochemical production of singlet oxygen from photosensitizing dyes (type II pathway) that can benefit remarkably from formulation in nanoparticle-based drug delivery vehicles. Fullerenes are a closed-cage carbon allotrope nanoparticle with a high absorption coefficient and triplet yield. Their photochemistry is highly dependent on microenvironment, and can be type II in organic solvents and type I (hydroxyl radicals) in a biological milieu. Titanium dioxide nanoparticles act as a large band-gap semiconductor that can carry out photo-induced electron transfer under ultraviolet A light and can also produce reactive oxygen species that kill microbial cells. We discuss some recent studies in which quite remarkable potentiation of microbial killing (up to six logs) can be obtained by the addition of simple inorganic salts such as the non-toxic sodium/potassium iodide, bromide, nitrite, and even the toxic sodium azide. Interesting mechanistic insights were obtained to explain this increased killing.

  4. Using Plasticity Values Determined From Systematic Hardness Indentation Measurements for Predicting Impact Behavior in Structural Ceramics: A New, Simple Screening Technique

    DTIC Science & Technology

    2009-09-01

    A power-law equation (H = kF^c) is shown to fit the Knoop hardness data quite well for the materials studied, including aluminum oxynitride (AlON), silicon carbide, aluminum oxide, and boron carbide. A plot of log10(HK) vs. log10(F) yields easily comparable straight lines, whose slopes and intercepts allow direct comparison between materials; a representative fit is HK = 24.183 F^-0.0699 with R² = 0.97.
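    The power-law fit above reduces to ordinary linear regression after taking logs: log10(H) = log10(k) + c·log10(F). A minimal sketch, using synthetic data points generated from the representative fit quoted in the abstract:

    ```python
    import numpy as np

    # Synthetic Knoop data generated from the reported fit HK = 24.183 * F**-0.0699.
    loads = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])  # indentation load F (N)
    hardness = 24.183 * loads ** -0.0699                # Knoop hardness HK (GPa)

    # Fit a straight line in log-log space; slope = c, intercept = log10(k).
    slope, intercept = np.polyfit(np.log10(loads), np.log10(hardness), 1)
    k, c = 10 ** intercept, slope
    print(f"H = {k:.3f} * F^{c:.4f}")  # recovers k ≈ 24.183, c ≈ -0.0699
    ```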

  5. Using borehole flow logging to optimize hydraulic-test procedures in heterogeneous fractured aquifers

    USGS Publications Warehouse

    Paillet, F.L.

    1995-01-01

    Hydraulic properties of heterogeneous fractured aquifers are difficult to characterize, and such characterization usually requires equipment-intensive and time-consuming applications of hydraulic testing in situ. Conventional coring and geophysical logging techniques provide useful and reliable information on the distribution of bedding planes, fractures and solution openings along boreholes, but it is often unclear how these locally permeable features are organized into larger-scale zones of hydraulic conductivity. New borehole flow-logging equipment provides techniques designed to identify hydraulically active fractures intersecting boreholes, and to indicate how these fractures might be connected to larger-scale flow paths in the surrounding aquifer. Potential complications in interpreting flowmeter logs include: 1) Ambient hydraulic conditions that mask the detection of hydraulically active fractures; 2) Inability to maintain quasi-steady drawdowns during aquifer tests, which causes temporal variations in flow intensity to be confused with inflows during pumping; and 3) Effects of uncontrolled background variations in hydraulic head, which also complicate the interpretation of inflows during aquifer tests. Application of these techniques is illustrated by the analysis of cross-borehole flowmeter data from an array of four bedrock boreholes in granitic schist at the Mirror Lake, New Hampshire, research site. Only two days of field operations were required to unambiguously identify the few fractures or fracture zones that contribute most inflow to boreholes in the CO borehole array during pumping. Such information was critical in the interpretation of water-quality data. This information also permitted the setting of the available string of two packers in each borehole so as to return the aquifer as close to pre-drilling conditions as possible with the available equipment.

  6. Spent Fuel Test-Climax: core logging for site investigation and instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilder, D.G.; Yow, J.L. Jr.; Thorpe, R.K.

    1982-05-28

    As an integral part of the Spent Fuel Test-Climax, 5150 ft (1570 m) of granite core was obtained. This core was diamond drilled in various sizes, mainly 38-mm and 76-mm diameters. The core was taken with single tube core barrels and was unoriented. Techniques used to drill and log this core are discussed, as well as techniques to orient the core. Of the 5150 ft (1570 m) of core, more than 3645 ft (1111 m) was retained and logged in some detail. As a result of the core logging, geologic discontinuities were identified, and joint frequency and spacing were characterized. Discontinuities identified included several joint sets, shear zones and faults. Correlations based on coring alone were generally found to be impossible, even for the more prominent features. The only feature properly correlated from the exploratory drilling was the fault system at the end of the facility, but it was not identified from the exploratory core as a fault. Identification of discontinuities was later helped by underground mapping that identified several different joint sets with different characteristics. It was found that joint frequency varied from 0.3 to 1.1 joints per foot of core for open fractures and from 0.3 to 3.3 joints per foot for closed or healed fractures. Histograms of fracture spacing indicate that there is likely a random distribution of spacing superimposed upon uniformly spaced fractures. It was found that a low-angle joint set had a persistent mean orientation. These joints were healed and had pervasive wall rock alteration, which made identification of joints in this set possible. The recognition of a joint set with known attitude allowed orientation of much of the core. This orientation technique was found to be effective. 10 references, 25 figures, 4 tables.

  7. Partitioning of fluorotelomer alcohols to octanol and different sources of dissolved organic carbon.

    PubMed

    Carmosini, Nadia; Lee, Linda S

    2008-09-01

    Interest in the environmental fate of fluorotelomer alcohols (FTOHs) has spurred efforts to understand their equilibrium partitioning behavior. Experimentally determined partition coefficients for FTOHs between soil/water and air/water have been reported, but direct measurements of partition coefficients for dissolved organic carbon (DOC)/water (K(doc)) and octanol/water (K(ow)) have been lacking. Here we measured the partitioning of 8:2 and 6:2 FTOH between one or more types of DOC and water using enhanced solubility or dialysis bag techniques, and also quantified K(ow) values for 4:2 to 8:2 FTOH using a batch equilibration method. The range in measured log K(doc) values for 8:2 FTOH using the enhanced solubility technique with DOC derived from two soils, two biosolids, and three reference humic acids is 2.00-3.97, with the lowest values obtained for the biosolids and an average across all other DOC sources (biosolid DOC excluded) of 3.54 +/- 0.29. For 6:2 FTOH and Aldrich humic acid, a log K(doc) value of 1.96 +/- 0.45 was measured using the dialysis technique. These average values are approximately 1 to 2 log units lower than previously indirectly estimated K(doc) values. Overall, the affinity for DOC tends to be slightly lower than that for particulate soil organic carbon. Measured log K(ow) values for 4:2 (3.30 +/- 0.04), 6:2 (4.54 +/- 0.01), and 8:2 FTOH (5.58 +/- 0.06) were in good agreement with previously reported estimates. Using relationships between experimentally measured partition coefficients and C-atom chain length, we estimated K(doc) and K(ow) values for shorter and longer chain FTOHs, respectively, that we were unable to measure experimentally.
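    The chain-length extrapolation described in the last sentence amounts to fitting log K against the number of fluorinated carbons and reading off the line. A sketch using the measured log K(ow) values quoted in the abstract; the predicted 10:2 value is an illustration of the method, not a measured result.

    ```python
    import numpy as np

    chain = np.array([4, 6, 8])             # n in the n:2 FTOH naming
    log_kow = np.array([3.30, 4.54, 5.58])  # measured log K(ow) from the abstract

    # Least-squares line: log K(ow) is approximately linear in chain length.
    slope, intercept = np.polyfit(chain, log_kow, 1)
    log_kow_10_2 = slope * 10 + intercept   # extrapolate to a 10:2 FTOH

    print(f"slope = {slope:.3f} log units per CF2 unit; "
          f"predicted 10:2 log Kow ≈ {log_kow_10_2:.2f}")
    ```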

  8. Practical life log video indexing based on content and context

    NASA Astrophysics Data System (ADS)

    Tancharoen, Datchakorn; Yamasaki, Toshihiko; Aizawa, Kiyoharu

    2006-01-01

    Today, multimedia information has gained an important role in daily life, and people can use imaging devices to capture their visual experiences. In this paper, we present our personal Life Log system to record personal experiences in the form of wearable video and environmental data; in addition, an efficient retrieval system is demonstrated to recall the desired media. We summarize practical video indexing techniques based on Life Log content and context to detect talking scenes using audio/visual cues and to extract semantic key frames from GPS data. Voice annotation is also demonstrated as a practical indexing method. Moreover, we apply body media sensors to record continuous lifestyle data and use these data to index the semantic key frames. In the experiments, we demonstrate various video indexing results that provide semantic content and show Life Log visualizations for examining personal life effectively.

  9. Traceability of patient records usage: barriers and opportunities for improving user interface design and data management.

    PubMed

    Cruz-Correia, Ricardo; Lapão, Luís; Rodrigues, Pedro Pereira

    2011-01-01

    Although IT governance practices (like ITIL, which recommends the use of audit trails (AT) for proper service level management) are being introduced in many hospitals to cope with increasing information quality and safety requirements, the maturity levels of hospital IT departments are still not sufficient to support frequent use of audit trails. This paper aims to address the issues related to the existence of AT in patient records, to describe the situation in hospitals, and to produce recommendations. Representatives from four hospitals were interviewed regarding the use of AT in their hospital information systems. Very few AT are known to exist in these hospitals (an average of 1 per hospital among an estimated 21 existing information systems). CIOs should be much more concerned with the existence and maintenance of AT. Recommendations include server clock synchronization and the use of advanced log visualization tools.

  10. Armored instrumentation cable for geothermal well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis, B.R.; Johnson, J.; Todd, B.

    1981-01-01

    Multiconductor armored well-logging cable is used extensively by the oil and natural gas industry to lower instruments that measure geological and geophysical parameters into deep wellbores. Advanced technology in oil-well drilling makes it possible to achieve borehole depths of 9 km (30,000 ft). The higher temperatures in these deeper boreholes demand advancements in the design and manufacturing of wireline cable and in the electrical insulating and armoring materials used as integral components. If geothermal energy is proved an abundant economic resource, drilling temperatures approaching and exceeding 300 °C will become commonplace. The adaptation of teflons as electrical insulating material permitted use of armored cable in geothermal wellbores where temperatures are slightly in excess of 200 °C, and where the concentrations of corrosive minerals and gases are high. Teflon materials presently used in wireline cables, however, are not capable of continuous operation at the anticipated higher temperatures.

  11. Copy-move forgery detection utilizing Fourier-Mellin transform log-polar features

    NASA Astrophysics Data System (ADS)

    Dixit, Rahul; Naskar, Ruchira

    2018-03-01

    In this work, we address the problem of region duplication or copy-move forgery detection in digital images, along with detection of geometric transforms (rotation and rescale) and postprocessing-based attacks (noise, blur, and brightness adjustment). Detection of region duplication, following conventional techniques, becomes more challenging when an intelligent adversary brings about such additional transforms on the duplicated regions. In this work, we utilize Fourier-Mellin transform with log-polar mapping and a color-based segmentation technique using K-means clustering, which help us to achieve invariance to all the above forms of attacks in copy-move forgery detection of digital images. Our experimental results prove the efficiency of the proposed method and its superiority to the current state of the art.
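    The core idea above — that a log-polar resampling of the Fourier magnitude spectrum turns rotation and rescaling into shifts, yielding robust block features — can be illustrated with a minimal numpy sketch. This uses nearest-neighbour resampling and an angular sum, and is not the authors' exact pipeline (which also employs K-means colour segmentation).

    ```python
    import numpy as np

    def fourier_mellin_feature(block, n_rho=16, n_theta=16):
        """Log-polar resampling of the centred FFT magnitude of an image block."""
        mag = np.abs(np.fft.fftshift(np.fft.fft2(block)))
        cy, cx = mag.shape[0] // 2, mag.shape[1] // 2
        r_max = min(cy, cx)
        # Log-spaced radii and uniform angles define the log-polar grid.
        rho = np.exp(np.linspace(0.0, np.log(r_max), n_rho, endpoint=False))
        theta = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
        rr, tt = np.meshgrid(rho, theta, indexing="ij")
        ys = np.clip((cy + rr * np.sin(tt)).round().astype(int), 0, mag.shape[0] - 1)
        xs = np.clip((cx + rr * np.cos(tt)).round().astype(int), 0, mag.shape[1] - 1)
        log_polar = mag[ys, xs]
        # Collapsing the angular axis yields a rotation-insensitive radial signature.
        return log_polar.sum(axis=1)

    rng = np.random.default_rng(1)
    patch = rng.random((32, 32))       # stand-in for one candidate image block
    feature = fourier_mellin_feature(patch)
    print(feature.shape)               # one compact signature per candidate block
    ```

    In a full copy-move detector, such signatures would be computed for overlapping blocks and matched to locate duplicated regions.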

  12. Environmental and Genetic Factors Explain Differences in Intraocular Scattering.

    PubMed

    Benito, Antonio; Hervella, Lucía; Tabernero, Juan; Pennos, Alexandros; Ginis, Harilaos; Sánchez-Romera, Juan F; Ordoñana, Juan R; Ruiz-Sánchez, Marcos; Marín, José M; Artal, Pablo

    2016-01-01

    To study the relative impact of genetic and environmental factors on the variability of intraocular scattering within a classical twin study. A total of 64 twin pairs, 32 monozygotic (MZ) (mean age: 54.9 ± 6.3 years) and 32 dizygotic (DZ) (mean age: 56.4 ± 7.0 years), were measured after a complete ophthalmologic exam had been performed to exclude all ocular pathologies that increase intraocular scatter, such as cataracts. Intraocular scattering was evaluated using two different techniques based on estimation of the straylight parameter log(S): a compact optical instrument based on the principle of optical integration, and a psychophysical measurement. Intraclass correlation coefficients (ICC) were used as descriptive statistics of twin resemblance, and genetic models were fitted to estimate heritability. No statistically significant difference was found between the MZ and DZ groups for age (P = 0.203), best-corrected visual acuity (P = 0.626), cataract gradation (P = 0.701), sex (P = 0.941), optical log(S) (P = 0.386), or psychophysical log(S) (P = 0.568), with only a minor difference in equivalent sphere (P = 0.008). Intraclass correlation coefficients between siblings were similar for scatter parameters: 0.676 in MZ and 0.471 in DZ twins for optical log(S); 0.533 in MZ twins and 0.475 in DZ twins for psychophysical log(S). For equivalent sphere, ICCs were 0.767 in MZ and 0.228 in DZ twins. Conservative estimates of heritability for the measured scattering parameters were 0.39 and 0.20, respectively. Correlations of intraocular scatter (straylight) parameters in the groups of identical and nonidentical twins were similar. Heritability estimates were of limited magnitude, suggesting that both genetic and environmental factors determine the variance of ocular straylight in healthy middle-aged adults.
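    The twin-study logic above can be sketched with Falconer's back-of-envelope formula, which estimates heritability as h² = 2(r_MZ − r_DZ) from the intraclass correlations reported in the abstract. The paper fits formal genetic models, so these crude values differ somewhat from the reported conservative estimates.

    ```python
    # ICCs (r_MZ, r_DZ) quoted in the abstract.
    icc = {
        "optical log(S)":        (0.676, 0.471),
        "psychophysical log(S)": (0.533, 0.475),
        "equivalent sphere":     (0.767, 0.228),
    }

    results = {}
    for trait, (r_mz, r_dz) in icc.items():
        # Falconer's formula: the MZ-DZ correlation gap reflects genetic influence.
        results[trait] = 2 * (r_mz - r_dz)
        print(f"{trait}: h2 \u2248 {results[trait]:.2f}")
    ```

    Note that the equivalent-sphere estimate comes out above 1, a sign that Falconer's assumptions are violated for that trait; this is one reason model fitting, as in the paper, is preferred over the simple formula.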

  13. A method to evaluate hydraulic fracture using proppant detection.

    PubMed

    Liu, Juntao; Zhang, Feng; Gardner, Robin P; Hou, Guojing; Zhang, Quanying; Li, Hu

    2015-11-01

    Accurate determination of proppant placement and propped fracture height is important for evaluating and optimizing stimulation strategies. A technology using non-radioactive proppant and a pulsed neutron gamma energy spectra logging tool to determine the placement and height of propped fractures is proposed. Gd2O3 was incorporated into ceramic proppant, and a Monte Carlo method was utilized to build models of the logging tools and formation. Characteristic responses of the recorded information of different logging tools to fracture widths, proppant concentrations, and influencing factors were studied. The results show that Gd capture gamma rays can be used to evaluate propped fractures, with higher sensitivity to changes in fracture width and traceable proppant content than existing non-radioactive proppant evaluation techniques; moreover, only an after-fracture measurement is needed for the new method. Changes in gas saturation and borehole size have a great impact on determining propped fractures when compensated neutron and pulsed neutron capture tools are used. A field example is presented to validate the application of the new technique. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Woodmetrics: imaging devices and processes in wood inspection at Lulea University of Technology

    NASA Astrophysics Data System (ADS)

    Hagman, Olle

    1999-09-01

    Wood Technology research and education at Lulea University of Technology is located in Skelleftea, 800 km north of Stockholm. At the campus about 25 persons are involved in education and research in Wood Technology. We are educating M.Sc. and postgraduate students in Wood Technology. The research at the campus includes the following main fields: -- Wood Machining -- Woodmetrics -- Wood Drying -- Wood Composites/Wood Material Science. Our research strategy is to obtain an individual treatment of every tree, board and piece of wood in order to get the highest possible value for the forest products. This shall be accomplished with the aid of advanced scanning technology and computer technology. Woodmetrics means measuring different wood parameters in order to optimize the utilization of the raw material. Today we have the following projects in this field: -- Automatic wood inspection -- Color changes and moisture flow in drying processes -- Inner quality of logs and lumber -- Stem quality database -- Computer tomography -- Aesthetic properties of wood -- Market/industry/forest relations. In the Woodmetrics field we are using computer tomography, CCD cameras and other sensors in order to find and measure defects in trees and on boards. The signals are analyzed and classified with modern image analyzing techniques and advanced statistical methods.

  15. Reservoir assessment of the Nubian sandstone reservoir in South Central Gulf of Suez, Egypt

    NASA Astrophysics Data System (ADS)

    El-Gendy, Nader; Barakat, Moataz; Abdallah, Hamed

    2017-05-01

    The Gulf of Suez is considered one of the most important petroleum provinces in Egypt and contains the Saqqara and Edfu oil fields located in the South Central portion of the Gulf of Suez. The Nubian sandstone reservoir in the Gulf of Suez basin is well known for its great capability to store and produce large volumes of hydrocarbons. The Nubian sandstone overlies basement rocks throughout most of the Gulf of Suez region. It consists of a sequence of sandstones and shales of Paleozoic to Cretaceous age. The Nubian sandstone intersected in most wells has excellent reservoir characteristics. Its porosity is controlled by sedimentation style and diagenesis. The cementation materials are mainly kaolinite and quartz overgrowths. The permeability of the Nubian sandstone is mainly controlled by grain size, sorting, porosity and clay content, especially kaolinite, and decreases with increasing kaolinite content. The permeability of the Nubian sandstone is evaluated using Nuclear Magnetic Resonance (NMR) technology and formation pressure data in addition to the conventional logs, and the results were calibrated using core data. In this work, the Nubian sandstone was investigated and evaluated using complete suites of conventional and advanced logging techniques to understand its reservoir characteristics, which have an impact on the economics of oil recovery. The Nubian reservoir has a complicated wettability nature which affects the petrophysical evaluation and reservoir productivity. So, understanding the reservoir wettability is very important for managing well performance, productivity and oil recovery.

  16. Eliminating log rolling as a spine trauma order.

    PubMed

    Conrad, Bryan P; Rossi, Gianluca Del; Horodyski, Mary Beth; Prasarn, Mark L; Alemi, Yara; Rechtine, Glenn R

    2012-01-01

    Currently, up to 25% of patients with spinal cord injuries may experience neurologic deterioration during the initial management of their injuries. Therefore, more effective procedures need to be established for the transportation and care of these patients to reduce the risk of secondary neurologic damage. Here, we present more acceptable methods to minimize motion in the unstable spine during the management of patients with traumatic spine injuries. This review summarizes more than a decade of research aimed at evaluating different methods of caring for patients with spine trauma. The most commonly utilized technique to transport spinal cord injured patients, the log rolling maneuver, produced more motion than placing a patient on a spine board, removing a spine board, performing continuous lateral therapy, and positioning a patient prone for surgery. Alternative maneuvers that produced less motion included the straddle lift and slide, 6 + lift and slide, scoop stretcher, mechanical kinetic therapy, mechanical transfers, and the use of the operating table to rotate the patient to the prone position for surgical stabilization. The log roll maneuver should be removed from the trauma response guidelines for patients with suspected spine injuries, as it creates significantly more motion in the unstable spine than the readily available alternatives. The only exception is the patient who is found prone, in which case the patient should then be log rolled directly on to the spine board utilizing a push technique.

  17. Reconciling timber extraction with biodiversity conservation in tropical forests using reduced-impact logging

    PubMed Central

    Bicknell, Jake E; Struebig, Matthew J; Davies, Zoe G; Baraloto, Christopher

    2015-01-01

    Over 20% of the world's tropical forests have been selectively logged, and large expanses are allocated for future timber extraction. Reduced-impact logging (RIL) is being promoted as best practice forestry that increases sustainability and lowers CO2 emissions from logging, by reducing collateral damage associated with timber extraction. RIL is also expected to minimize the impacts of selective logging on biodiversity, although this is yet to be thoroughly tested. We undertake the most comprehensive study to date to investigate the biodiversity impacts of RIL across multiple taxonomic groups. We quantified birds, bats and large mammal assemblage structures, using a before-after control-impact (BACI) design across 20 sample sites over a 5-year period. Faunal surveys utilized point counts, mist nets and line transects and yielded >250 species. We examined assemblage responses to logging, as well as partitions of feeding guild and strata (understorey vs. canopy), and then tested for relationships with logging intensity to assess the primary determinants of community composition. Community analysis revealed little effect of RIL on overall assemblages, as structure and composition were similar before and after logging, and between logging and control sites. Variation in bird assemblages was explained by natural rates of change over time, and not logging intensity. However, when partitioned by feeding guild and strata, the frugivorous and canopy bird ensembles changed as a result of RIL, although the latter was also associated with change over time. Bats exhibited variable changes post-logging that were not related to logging, whereas large mammals showed no change at all. Indicator species analysis and correlations with logging intensities revealed that some species exhibited idiosyncratic responses to RIL, whilst abundance change of most others was associated with time. Synthesis and applications. 
Our study demonstrates the relatively benign effect of reduced-impact logging (RIL) on birds, bats and large mammals in a neotropical forest context, and therefore, we propose that forest managers should improve timber extraction techniques more widely. If RIL is extensively adopted, forestry concessions could represent sizeable and important additions to the global conservation estate – over 4 million km². PMID:25954054

  18. Compositional data analysis as a robust tool to delineate hydrochemical facies within and between gas-bearing aquifers

    NASA Astrophysics Data System (ADS)

    Owen, D. Des. R.; Pawlowsky-Glahn, V.; Egozcue, J. J.; Buccianti, A.; Bradd, J. M.

    2016-08-01

    Isometric log ratios of proportions of major ions, derived from intuitive sequential binary partitions, are used to characterize hydrochemical variability within and between coal seam gas (CSG) and surrounding aquifers in a number of sedimentary basins in the USA and Australia. These isometric log ratios are the coordinates corresponding to an orthonormal basis in the sample space (the simplex). The characteristic proportions of ions, as described by linear models of isometric log ratios, can be used for a mathematical-descriptive classification of water types. This is a more informative and robust method of describing water types than simply classifying a water type based on the dominant ions. The approach allows (a) compositional distinctions between very similar water types to be made and (b) large data sets with a high degree of variability to be rapidly assessed with respect to particular relationships/compositions that are of interest. A major advantage of these techniques is that major and minor ion components can be comprehensively assessed and subtle processes—which may be masked by conventional techniques such as Stiff diagrams, Piper plots, and classic ion ratios—can be highlighted. Results show that while all CSG groundwaters are dominated by Na, HCO3, and Cl ions, the proportions of other ions indicate they can evolve via different means and the particular proportions of ions within total or subcompositions can be unique to particular basins. Using isometric log ratios, subtle differences in the behavior of Na, K, and Cl between CSG water types and very similar Na-HCO3 water types in adjacent aquifers are also described. A complementary pair of isometric log ratios, derived from a geochemically-intuitive sequential binary partition that is designed to reflect compositional variability within and between CSG groundwater, is proposed. 
These isometric log ratios can be used to model a hydrochemical pathway associated with methanogenesis and/or to delineate groundwater associated with high gas concentrations.
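    A single isometric log-ratio (balance) coordinate from a sequential binary partition, as described above, contrasts the geometric means of two groups of parts. A minimal sketch; the particular partition contrasting (Na, K) against (Ca, Mg) and the ion proportions are illustrative choices, not the paper's partition.

    ```python
    import math

    def geometric_mean(xs):
        return math.exp(sum(math.log(x) for x in xs) / len(xs))

    def ilr_balance(numerator, denominator):
        """Balance b = sqrt(r*s/(r+s)) * ln(gm(num)/gm(den)) for r and s parts."""
        r, s = len(numerator), len(denominator)
        coef = math.sqrt(r * s / (r + s))
        return coef * math.log(geometric_mean(numerator) / geometric_mean(denominator))

    # Hypothetical major-ion proportions (parts must be strictly positive;
    # closure to a constant sum does not affect the log ratio).
    ions = {"Na": 0.55, "K": 0.02, "Ca": 0.05, "Mg": 0.03, "Cl": 0.15, "HCO3": 0.20}
    b = ilr_balance([ions["Na"], ions["K"]], [ions["Ca"], ions["Mg"]])
    print(f"ilr balance (Na,K | Ca,Mg) = {b:.3f}")
    ```

    Because the balance depends only on ratios of geometric means, it captures the relative proportions of ion groups regardless of total dissolved solids, which is what makes it robust for comparing waters across basins.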

  19. 3-D visualisation of palaeoseismic trench stratigraphy and trench logging using terrestrial remote sensing and GPR - a multiparametric interpretation

    NASA Astrophysics Data System (ADS)

    Schneiderwind, Sascha; Mason, Jack; Wiatr, Thomas; Papanikolaou, Ioannis; Reicherter, Klaus

    2016-03-01

Two normal faults on the island of Crete and mainland Greece were studied to test an innovative workflow with the goal of obtaining a more objective palaeoseismic trench log and a 3-D view of the sedimentary architecture within the trench walls. Sedimentary feature geometries in palaeoseismic trenches are related to palaeoearthquake magnitudes, which are used in seismic hazard assessments. If the geometry of these sedimentary features can be measured more representatively, seismic hazard assessments can be improved. In this study, more representative measurements of sedimentary features are achieved by combining classical palaeoseismic trenching techniques with multispectral approaches. A conventional trench log was first compared to the results of ISO (iterative self-organising) cluster analysis of a true-colour photomosaic representing the spectrum of visible light. Photomosaic acquisition disadvantages (e.g. illumination) were addressed by complementing the data set with active near-infrared backscatter images from t-LiDAR measurements. The multispectral analysis shows that distinct layers can be identified, and it compares well with the conventional trench log. Accordingly, adjacent stratigraphic units could be distinguished by their particular multispectral composition signatures. Based on the trench log, a 3-D interpretation of attached 2-D ground-penetrating radar (GPR) profiles collected on the vertical trench wall was then possible. This is highly beneficial for measuring representative layer thicknesses, displacements, and geometries at depth within the trench wall. Thus, misinterpretation due to cutting effects is minimised.
This manuscript combines multiparametric approaches and shows (i) how a 3-D visualisation of palaeoseismic trench stratigraphy and logging can be accomplished by combining t-LiDAR and GPR techniques, and (ii) how a multispectral digital analysis can offer additional advantages to interpret palaeoseismic and stratigraphic data. The multispectral data sets are stored allowing unbiased input for future (re)investigations.
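The ISO cluster step can be illustrated with a simplified k-means-style sketch. This is a toy stand-in for the ISODATA-type algorithm used in the study; the deterministic initialisation and the synthetic pixel spectra are illustrative assumptions:

```python
import numpy as np

def iso_cluster(pixels, k, iters=20):
    """Simplified iterative self-organising clustering of pixel spectra:
    repeatedly assign each pixel to its nearest class centre, then move
    each centre to the mean of its assigned pixels."""
    pixels = np.asarray(pixels, dtype=float)
    # deterministic spread-out initialisation (a simplification)
    centers = pixels[np.linspace(0, len(pixels) - 1, k).astype(int)].copy()
    for _ in range(iters):
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers
```

Applied to the stacked bands of a photomosaic plus a near-infrared backscatter channel, each resulting class would correspond to a candidate stratigraphic unit.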

  20. The development of a high-throughput measurement method of octanol/water distribution coefficient based on hollow fiber membrane solvent microextraction technique.

    PubMed

    Bao, James J; Liu, Xiaojing; Zhang, Yong; Li, Youxin

    2014-09-15

This paper describes the development of a novel high-throughput hollow fiber membrane solvent microextraction technique for the simultaneous measurement of the octanol/water distribution coefficient (logD) of organic compounds such as drugs. The method is based on a purpose-designed system consisting of a 96-well plate modified with 96 hollow fiber membrane tubes and a matching lid with 96 center holes and 96 side holes distributed over 96 grid positions. Each center hole is fitted with a hollow fiber membrane tube sealed at one end, which separates the aqueous phase from the octanol phase. A needle, from a microsyringe or an automatic sampler, can be inserted directly into the membrane tube to deposit octanol as the acceptor phase or to withdraw the mixture of octanol and drug. Each side hole is filled with the aqueous donor phase, which can freely exchange with the solution outside the hollow fiber membranes. The logD can be calculated by measuring the drug concentration in each phase after extraction equilibrium. After a comprehensive comparison, a polytetrafluoroethylene hollow fiber with a wall thickness of 210 μm, an extraction time of 300 min, a temperature of 25 °C, and atmospheric pressure without stirring were selected for the high-throughput measurement. The correlation coefficient of the linear fit of the logD values of five drugs determined by our system against reference values is 0.9954, showing good accuracy. The -8.9% intra-day and -4.4% inter-day variation of logD for metronidazole indicates good precision. In addition, the logD values of eight drugs were simultaneously and successfully measured, indicating that this 96-well high-throughput logD measurement method is accurate, precise, reliable, and useful for high-throughput screening. Copyright © 2014 Elsevier B.V. All rights reserved.
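The logD calculation itself is straightforward once equilibrium concentrations are known. A minimal sketch (the function names and the mass-balance variant are illustrative, not taken from the paper):

```python
import math

def log_d(c_octanol, c_water):
    """logD from measured equilibrium concentrations (same units) in the
    octanol acceptor phase and the aqueous donor phase."""
    return math.log10(c_octanol / c_water)

def log_d_mass_balance(n0, c_oct, v_oct, v_aq):
    """logD when only the octanol phase is sampled: the remaining aqueous
    amount follows by mass balance (assumes no losses to the membrane).
    n0: initial amount of drug; v_oct, v_aq: phase volumes."""
    c_aq = (n0 - c_oct * v_oct) / v_aq
    return math.log10(c_oct / c_aq)
```

In a 96-well format the same calculation is simply repeated per well after the equilibration period.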

  1. Log-less metadata management on metadata server for parallel file systems.

    PubMed

    Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests that have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage to achieve a highly available metadata service; this also improves metadata-processing performance. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead the metadata server incurs when it adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O data throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server has crashed or otherwise become non-operational.
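The idea can be caricatured in a few lines: clients retain acknowledged requests in memory, and a restarted MDS rebuilds its state by replaying them. This is a hypothetical sketch (class names, request format, and operations are invented; a real system must also order replays consistently across clients):

```python
class MDS:
    """Log-less metadata server sketch: metadata lives only in memory and is
    rebuilt after a crash by replaying client-side backups."""
    def __init__(self):
        self.metadata = {}

    def apply(self, request):
        op, path, value = request
        if op == "set":
            self.metadata[path] = value
        elif op == "unlink":
            self.metadata.pop(path, None)

    def recover(self, clients):
        self.metadata = {}
        for client in clients:           # replay every cached request
            for request in client.backup:
                self.apply(request)


class Client:
    """Client file system: backs up each request the MDS has handled."""
    def __init__(self):
        self.backup = []

    def send(self, mds, request):
        mds.apply(request)               # MDS handles it, writes no log
        self.backup.append(request)      # in-memory backup on the client
```

After a simulated crash, calling `recover` with the surviving clients reconstructs the metadata map without the MDS ever having journaled anything.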

  2. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    PubMed Central

    Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests that have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage to achieve a highly available metadata service; this also improves metadata-processing performance. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead the metadata server incurs when it adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O data throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server has crashed or otherwise become non-operational. PMID:24892093

  3. Development of minimum standards for event-based data collection loggers and performance measure definitions for signalized intersections.

    DOT National Transportation Integrated Search

    2017-01-01

    The arterial traffic signal performance measures were not used to their fullest potential in the past. The development of traffic signal controllers with event-based, high-resolution data logging capabilities enabled the advances in derivation and vi...

  4. The Crossett Story, Revised: Updating a Forestry Classic

    Treesearch

    Don C. Bragg; James M. Guldin; Michael G. Shelton

    2003-01-01

    Abstract: The Crossett Story slide show was developed in 1980 to detail the history of logging, field forestry, and research centered on the USDA Forest Service's Crossett Experimental Forest (CEF). However, science and technology have advanced considerably over the last several decades and the regulatory environment has...

  5. World-Wide and Regional Examination of Substrates Facilitating Timberline Expansion

    NASA Astrophysics Data System (ADS)

    Johnson, A. C.; Yeakley, J. A.

    2010-12-01

Upward advance of timberlines, associated with climate warming, is occurring in the Pacific Northwest (PNW) as well as many other mountainous regions of the world. Examination of the establishment and survival of sensitive seedlings, rather than examination of older resilient trees, may give a clearer understanding of current climatic factors affecting potential expansion of timberline. Our investigation of seedling establishment along timberline edges in the PNW indicates that trees often germinate on small landforms known as microsites. Microsites include small convexities or concavities on the soil surface having a scale of centimeters to meters, but also include associations with slope, aspect, rocks or plants, or substrates dominated by mineral soil or wood. Growing on favorable microsites helps seedlings cope with some of the stresses that exist at high-elevation sites, including wind, cold temperatures, high radiation, drought, animal predation, and infestation by fungal pathogens found in snow and soil. Microsites, by providing warmer substrates, adequate moisture, and shelter, allow plants to function more effectively in mountain environments. Our summary of microsite types and associated timberline advance in a world-wide context indicates that factors such as snow accumulation, summer rainfall, and availability of microsites will control timberline advance. In windswept timberline locations, rocks and plants provide shelter from wind and reduce the likelihood of night frost. In arid climates, concave microsites aid in snow deposition, providing needed moisture to seedlings during periods of drought. In contrast, convex microsites and wood substrates, typical sites of regeneration in the PNW where precipitation typically exceeds 150 cm per year, facilitate early snow melt, thereby lengthening the growing season. Large trees at the edge of timberline fall into alpine meadows, decay, and provide sites for seedling establishment.
These sites, commonly called nurse logs and well known as key microsites in lower-elevation forests, have been found to be conspicuous sites of timberline forest regeneration extending from the forest edge into alpine meadows. Nurse logs appear to be particularly important sites of regeneration in wetter alpine areas of the world such as the North Cascade Mountains of Washington in the PNW. Depending upon aspect and slope, one tree can potentially advance timberline close to 20 meters, a typical length of a tree growing at timberline. Nurse-log temperature during the growing season is significantly greater than that of the adjacent soil, particularly in areas with reduced overstory canopy. Increased substrate temperature, associated with increased root growth, has been found to facilitate growth of seedlings. Further, the water-holding capacity of rotten logs, which often surpasses that of soils, aids seedling growth during summer droughts.

  6. Temporal Decay in Timber Species Composition and Value in Amazonian Logging Concessions.

    PubMed

    Richardson, Vanessa A; Peres, Carlos A

    2016-01-01

    Throughout human history, slow-renewal biological resource populations have been predictably overexploited, often to the point of economic extinction. We assess whether and how this has occurred with timber resources in the Brazilian Amazon. The asynchronous advance of industrial-scale logging frontiers has left regional-scale forest landscapes with varying histories of logging. Initial harvests in unlogged forests can be highly selective, targeting slow-growing, high-grade, shade-tolerant hardwood species, while later harvests tend to focus on fast-growing, light-wooded, long-lived pioneer trees. Brazil accounts for 85% of all native neotropical forest roundlog production, and the State of Pará for almost half of all timber production in Brazilian Amazonia, the largest old-growth tropical timber reserve controlled by any country. Yet the degree to which timber harvests beyond the first-cut can be financially profitable or demographically sustainable remains poorly understood. Here, we use data on legally planned logging of ~17.3 million cubic meters of timber across 314 species extracted from 824 authorized harvest areas in private and community-owned forests, 446 of which reported volumetric composition data by timber species. We document patterns of timber extraction by volume, species composition, and monetary value along aging eastern Amazonian logging frontiers, which are then explained on the basis of historical and environmental variables. Generalized linear models indicate that relatively recent logging operations farthest from heavy-traffic roads are the most selective, concentrating gross revenues on few high-value species. We find no evidence that the post-logging timber species composition and total value of forest stands recovers beyond the first-cut, suggesting that the commercially most valuable timber species become predictably rare or economically extinct in old logging frontiers. 
If even more destructive land-use patterns are to be avoided, managing the yields of selectively logged forests is crucial for the long-term integrity of forest biodiversity and the financial viability of local industries. The logging history of eastern Amazonian old-growth forests likely mirrors unsustainable patterns of timber depletion over time in Brazil and other tropical countries.

  7. Temporal Decay in Timber Species Composition and Value in Amazonian Logging Concessions

    PubMed Central

    Peres, Carlos A.

    2016-01-01

    Throughout human history, slow-renewal biological resource populations have been predictably overexploited, often to the point of economic extinction. We assess whether and how this has occurred with timber resources in the Brazilian Amazon. The asynchronous advance of industrial-scale logging frontiers has left regional-scale forest landscapes with varying histories of logging. Initial harvests in unlogged forests can be highly selective, targeting slow-growing, high-grade, shade-tolerant hardwood species, while later harvests tend to focus on fast-growing, light-wooded, long-lived pioneer trees. Brazil accounts for 85% of all native neotropical forest roundlog production, and the State of Pará for almost half of all timber production in Brazilian Amazonia, the largest old-growth tropical timber reserve controlled by any country. Yet the degree to which timber harvests beyond the first-cut can be financially profitable or demographically sustainable remains poorly understood. Here, we use data on legally planned logging of ~17.3 million cubic meters of timber across 314 species extracted from 824 authorized harvest areas in private and community-owned forests, 446 of which reported volumetric composition data by timber species. We document patterns of timber extraction by volume, species composition, and monetary value along aging eastern Amazonian logging frontiers, which are then explained on the basis of historical and environmental variables. Generalized linear models indicate that relatively recent logging operations farthest from heavy-traffic roads are the most selective, concentrating gross revenues on few high-value species. We find no evidence that the post-logging timber species composition and total value of forest stands recovers beyond the first-cut, suggesting that the commercially most valuable timber species become predictably rare or economically extinct in old logging frontiers. 
If even more destructive land-use patterns are to be avoided, managing the yields of selectively logged forests is crucial for the long-term integrity of forest biodiversity and the financial viability of local industries. The logging history of eastern Amazonian old-growth forests likely mirrors unsustainable patterns of timber depletion over time in Brazil and other tropical countries. PMID:27410029

  8. The role of circulating tumor cells in evaluation of prognosis and treatment response in advanced non-small-cell lung cancer.

    PubMed

    Zhou, Jia; Dong, Fei; Cui, Fang; Xu, Rui; Tang, Xiaokui

    2017-04-01

Non-small-cell lung cancer (NSCLC) lacks validated biomarkers to predict prognosis and treatment response. This study investigated whether detectable circulating tumor cells (CTCs) could indicate a high risk of distant metastasis, provide prognostic information, and give an early indication of the response to conventional therapy in patients with advanced NSCLC. In this single-center prospective study, blood samples for CTC analysis were obtained from 59 patients with previously untreated stage III or IV NSCLC, both before and after administration of two cycles of chemotherapy. CTCs in peripheral blood were measured with the CellSearch detection technique. Carcino-embryonic antigen (CEA) and the number of metastatic sites were positively related to CTC count in multiple linear regression analysis (P < 0.05). The median overall survival was 11.2 months (95% CI: 10.37-12.03 months) for the baseline CTC ≥ 2 group compared with 8.3 months (95% CI: 7.72-8.88 months) for the CTC < 2 group (log-rank test P < 0.05). Similarly, patients with CTC ≥ 2 at baseline had a significantly shorter median PFS (4.3 months, 95% CI: 3.7-4.9 months) than patients with CTC < 2 (6.2 months, 95% CI: 5.59-6.82 months) (log-rank test P < 0.05). In the disease-control group (stable disease, partial response, or complete response), the CTC count before treatment did not differ from that after therapy by paired-samples t-test (t = 1.455, P = 0.154), and the same held for the progressed group (progressive disease) (t = -0.987, P = 0.335). The CTC count of the progressed group was higher than that of the disease-control group both at baseline and post-chemotherapy. These data provide evidence of a positive correlation of CTC counts with CEA and with the number of metastatic sites. CTCs could thus be an effective predictor of distant metastasis and poor prognosis. In this study, CTCs were poorly related to treatment response. 
Whether CTCs could be a predictor of curative effect in advanced NSCLC should be validated by further research.
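The survival comparisons above rest on Kaplan-Meier estimation. A minimal product-limit estimator, as a sketch (not the study's statistical pipeline; the example times are invented):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns (time, S(t)) pairs at each distinct event time."""
    pairs = sorted(zip(times, events))
    curve, surv = [], 1.0
    for t in sorted({t for t, e in pairs if e == 1}):
        at_risk = sum(1 for tt, _ in pairs if tt >= t)
        deaths = sum(1 for tt, e in pairs if tt == t and e == 1)
        surv *= 1.0 - deaths / at_risk
        curve.append((t, surv))
    return curve
```

Running the estimator separately for two groups (e.g. baseline CTC ≥ 2 vs < 2) gives the two curves that a log-rank test then compares.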

  9. Capabilities and limitations of dispersive liquid-liquid microextraction with solidification of floating organic drop for the extraction of organic pollutants from water samples.

    PubMed

    Vera-Avila, Luz E; Rojo-Portillo, Tania; Covarrubias-Herrera, Rosario; Peña-Alvarez, Araceli

    2013-12-17

Dispersive liquid-liquid microextraction with solidification of floating organic drop (DLLME-SFO) is one of the most interesting sample preparation techniques developed in recent years. Although several applications have been reported, the potential and limitations of this simple and rapid extraction technique have not been made sufficiently explicit. In this work, the extraction efficiency of DLLME-SFO was determined for pollutants from different chemical families. The studied compounds include 10 polycyclic aromatic hydrocarbons, 5 pesticides (chlorophenoxy herbicides and DDT), 8 phenols, and 6 sulfonamides, covering a large range of polarity and hydrophobicity (logKow 0-7 overall). After optimization of extraction conditions using 1-dodecanol as extractant, the procedure was applied to the extraction of each family from 10-mL spiked water samples, adjusting only the sample pH as required. Absolute recoveries for pollutants with logKow 3-7 were >70%, and recovery values within this group (18 compounds) were independent of structure or hydrophobicity; the precision of recovery was very acceptable (RSD < 12%) and linear behavior was observed in the studied concentration range (r² > 0.995). Extraction recoveries for pollutants with logKow 1.46-2.8 were in the range 13-62%, depending directly on individual logKow values; however, good linearity (r² > 0.993) and precision (RSD < 6.5%) were also demonstrated for these polar solutes despite the lower recoveries. DLLME-SFO with 1-dodecanol completely failed for the extraction of compounds with logKow ≤ 1 (sulfa drugs); other, more polar extraction solvents (e.g., ionic liquids) should be explored for highly hydrophilic pollutants. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Automatic lithofacies segmentation from well-logs data. A comparative study between the Self-Organizing Map (SOM) and Walsh transform

    NASA Astrophysics Data System (ADS)

    Aliouane, Leila; Ouadfeul, Sid-Ali; Rabhi, Abdessalem; Rouina, Fouzi; Benaissa, Zahia; Boudella, Amar

    2013-04-01

The main goal of this work is to compare two lithofacies segmentation techniques for a reservoir interval. The first is based on Kohonen's self-organizing map (SOM) neural network; the second is based on the Walsh transform decomposition. Application to real well-log data from two boreholes located in the Algerian Sahara shows that the self-organizing map provides more lithological detail than the lithofacies model given by the Walsh decomposition. Keywords: Comparison, Lithofacies, SOM, Walsh. References: 1) Aliouane, L., Ouadfeul, S., Boudella, A., 2011, Fractal analysis based on the continuous wavelet transform and lithofacies classification from well-logs data using the self-organizing map neural network, Arabian Journal of Geosciences, doi: 10.1007/s12517-011-0459-4. 2) Aliouane, L., Ouadfeul, S., Djarfour, N., Boudella, A., 2012, Petrophysical Parameters Estimation from Well-Logs Data Using Multilayer Perceptron and Radial Basis Function Neural Networks, Lecture Notes in Computer Science, Vol. 7667, pp. 730-736, doi: 10.1007/978-3-642-34500-5_86. 3) Ouadfeul, S., Aliouane, L., 2011, Multifractal analysis revisited by the continuous wavelet transform applied in lithofacies segmentation from well-logs data, International Journal of Applied Physics and Mathematics, Vol. 1, No. 1. 4) Ouadfeul, S., Aliouane, L., 2012, Lithofacies Classification Using the Multilayer Perceptron and the Self-organizing Neural Networks, Lecture Notes in Computer Science, Vol. 7667, pp. 737-744, doi: 10.1007/978-3-642-34500-5_87. 5) Weisstein, Eric W., "Fast Walsh Transform", from MathWorld--A Wolfram Web Resource, http://mathworld.wolfram.com/FastWalshTransform.html
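A minimal 1-D self-organizing map illustrates the first technique. This is a toy sketch: the node count, learning-rate and neighbourhood schedules, and the synthetic two-facies samples are assumptions, not the authors' configuration:

```python
import numpy as np

def train_som(samples, n_nodes=4, epochs=100, lr0=0.5, seed=1):
    """Minimal 1-D Kohonen self-organising map: each node's weight vector
    becomes a prototype (lithofacies) for the normalised log samples."""
    rng = np.random.default_rng(seed)
    x = np.asarray(samples, dtype=float)
    w = rng.random((n_nodes, x.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)              # decaying learning rate
        sigma = max(n_nodes / 2.0 * (1.0 - epoch / epochs), 0.5)
        for s in x:
            bmu = int(np.argmin(np.linalg.norm(w - s, axis=1)))
            d = np.abs(np.arange(n_nodes) - bmu)       # grid distance to BMU
            h = np.exp(-(d ** 2) / (2 * sigma ** 2))   # neighbourhood kernel
            w += lr * h[:, None] * (s - w)
    return w

def classify(samples, w):
    """Assign each sample to its best-matching node (facies label)."""
    return [int(np.argmin(np.linalg.norm(w - s, axis=1)))
            for s in np.asarray(samples, dtype=float)]
```

In a real application each sample would be a vector of normalised log readings (e.g. gamma ray, density, sonic) at one depth, and the trained nodes would be interpreted as lithofacies.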

  11. Contemporary surgical trends in the management of upper tract calculi.

    PubMed

    Oberlin, Daniel T; Flum, Andrew S; Bachrach, Laurie; Matulewicz, Richard S; Flury, Sarah C

    2015-03-01

Upper tract nephrolithiasis is a common surgical condition that is treated with multiple surgical techniques, including shock wave lithotripsy, ureteroscopy, and percutaneous nephrolithotomy. We analyzed case logs submitted to the American Board of Urology (ABU) by candidates for initial certification and recertification to help elucidate trends in the management of upper tract urinary calculi. Annualized case logs from 2003 to 2012 were analyzed. We used logistic regression models to assess how surgeon-specific attributes affected the way upper tract stones were treated. Cases were identified by the CPT code of the corresponding procedure. A total of 6,620 urologists in 3 certification groups recorded case logs, including 2,275 for initial certification, 2,381 for first recertification and 1,964 for second recertification. A total of 441,162 procedures were logged, of which 54.2% were ureteroscopy, 41.3% were shock wave lithotripsy and 4.5% were percutaneous nephrolithotomy. From 2003 to 2013 there was an increase in ureteroscopy from 40.9% to 59.6% and a corresponding decrease in shock wave lithotripsy from 54% to 36.3%. For new urologists ureteroscopy increased from 47.6% to 70.9% of all stone cases logged, and for senior clinicians ureteroscopy increased from 40% to 55%. Endourologists performed a significantly higher proportion of percutaneous nephrolithotomies than nonendourologists (10.6% vs 3.69%, p <0.0001) and a significantly smaller proportion of shock wave lithotripsies (34.2% vs 42.2%, p = 0.001). Junior and senior clinicians showed a dramatic adoption of endoscopic techniques. Treatment of upper tract calculi is an evolving field, and provider-specific attributes affect how these stones are treated. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  12. ROLE OF PARENTERAL NUTRITION IN ONCOLOGIC PATIENTS WITH INTESTINAL OCCLUSION AND PERITONEAL CARCINOMATOSIS.

    PubMed

    Aría Guerra, Eva; Cortés-Salgado, Alfonso; Mateo-Lobo, Raquel; Nattero, Lía; Riveiro, Javier; Vega-Piñero, Belén; Valbuena, Beatriz; Carabaña, Fátima; Carrero, Carmen; Grande, Enrique; Carrato, Alfredo; Botella-Carretero, José Ignacio

    2015-09-01

The precise role of parenteral nutrition in the management of oncologic patients with intestinal occlusion is not yet well defined. We aimed to identify the effects of parenteral nutrition on prognosis in these patients. 55 patients with intestinal occlusion and peritoneal carcinomatosis were included. Parenteral nutrition aimed at 20-35 kcal/kg/day and 1.0 g/kg/day of amino acids. Weight, body mass index, type of tumor, type of chemotherapy, and ECOG status, among others, were recorded and analyzed. 69.1% of the patients had gastrointestinal tumors, 18.2% gynecologic and 12.7% others. Age was 60 ± 13 y, baseline ECOG 1.5 ± 0.5 and body mass index 21.6 ± 4.3. Malnutrition was present in 85%. Survival from the start of parenteral nutrition did not differ significantly by baseline ECOG (log rank = 0.593, p = 0.743), previous lines of chemotherapy (log rank = 2.117, p = 0.548), baseline BMI (log rank = 2.686, p = 0.261), or type of tumor (log rank = 2.066, p = 0.356). Survival in patients who received home parenteral nutrition after hospital discharge was higher than in those who stayed in hospital (log rank = 7.090, p = 0.008). Survival in patients who started chemotherapy during or after parenteral nutrition was higher than in those who did not (log rank = 17.316, p < 0.001). A total of 3.6% of patients presented catheter-related infection, without affecting survival (log rank = 0.061, p = 0.804). Parenteral nutrition in patients with advanced cancer and intestinal occlusion is safe, and in those who respond to chemotherapy, further administration of home parenteral nutrition together with chemotherapy may contribute to prolonged survival. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  13. Glutenite bodies sequence division of the upper Es4 in northern Minfeng zone of Dongying Sag, Bohai Bay Basin, China

    NASA Astrophysics Data System (ADS)

    Shao, Xupeng

    2017-04-01

Glutenite bodies are widely developed in the northern Minfeng zone of the Dongying Sag, but their litho-electric relationship is not clear. In addition, because the conventional sequence-stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence-stratigraphic research. The wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage over conventional methods of dividing sequence stratigraphy quantitatively. Building on the conventional sequence research method, this paper used the above techniques to divide the fourth-order sequences of the upper Es4 in the northern Minfeng zone of the Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent: both divide sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but low resolution, so the two techniques should be combined. The upper Es4 in the northern Minfeng zone of the Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with the depositional characteristics of a fining-upward grain-size sequence. Key words: Dongying Sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy
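The log-based wavelet step can be sketched with a Ricker-wavelet continuous wavelet transform implemented directly in NumPy. The wavelet choice, the width grid, and the synthetic spike log are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet of given width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

def cwt(log_curve, widths):
    """Continuous wavelet transform of a well-log curve: one coefficient row
    per width; large widths emphasise sequence-scale cyclicity, small widths
    bed-scale detail."""
    log_curve = np.asarray(log_curve, dtype=float)
    out = np.empty((len(widths), len(log_curve)))
    for i, a in enumerate(widths):
        w = ricker(min(10 * int(a), len(log_curve)), a)
        out[i] = np.convolve(log_curve, w, mode="same")
    return out
```

Sequence boundaries would then be picked where the large-width coefficient rows change sign or amplitude character down the borehole.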

  14. Extragalactic counterparts to Einstein slew survey sources

    NASA Technical Reports Server (NTRS)

    Schachter, Jonathan F.; Elvis, Martin; Plummer, David; Remillard, Ron

    1992-01-01

The Einstein slew survey consists of 819 bright X-ray sources, of which 636 (or 78 percent) are identified with counterparts in standard catalogs. The importance of bright X-ray surveys is stressed, and the slew survey is compared to the ROSAT all-sky survey. Statistical techniques for minimizing confusion in arcminute error circles in digitized data are discussed. The 238 slew survey active galactic nuclei, clusters, and BL Lacertae objects identified to date and their implications for logN-logS and source evolution studies are described.
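A logN-logS curve is simply the cumulative count of sources brighter than each flux, plotted on log axes. A minimal sketch (the flux values are hypothetical; for a uniform, non-evolving Euclidean population the slope would be -3/2):

```python
import numpy as np

def log_n_log_s(fluxes):
    """Cumulative source counts: sort fluxes bright to faint so that N(>=S)
    is just the rank of each source; return both on log10 axes."""
    s = np.sort(np.asarray(fluxes, dtype=float))[::-1]
    n = np.arange(1, len(s) + 1)
    return np.log10(s), np.log10(n)

# Hypothetical source fluxes (arbitrary units)
log_s, log_n = log_n_log_s([1.0, 10.0, 100.0])
```

Fitting a straight line to (log S, log N) and comparing its slope to -3/2 is the standard first test for source evolution.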

  15. A New Essential Functions Installed DWH in Hospital Information System: Process Mining Techniques and Natural Language Processing.

    PubMed

    Honda, Masayuki; Matsumoto, Takehiro

    2017-01-01

Several kinds of event log data produced in daily clinical activities have yet to be used for the secure and efficient improvement of hospital activities. Data Warehouse systems in Hospital Information Systems, used for the analysis of structured data such as diseases, lab tests, and medications, have also shown efficient outcomes. This article focuses on two kinds of essential functions: process mining using log data, and non-structured data analysis via Natural Language Processing.
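A first step in process mining from event logs is the directly-follows graph. A minimal sketch (the event-log layout and activity names are hypothetical, not from the hospital system described):

```python
from collections import Counter

def directly_follows(event_log):
    """Directly-follows graph of an event log: event_log maps each case id
    to its time-ordered list of activities; edge counts reveal the dominant
    clinical pathways."""
    edges = Counter()
    for activities in event_log.values():
        for a, b in zip(activities, activities[1:]):
            edges[(a, b)] += 1
    return edges

# Hypothetical hospital event log (case id -> ordered activities)
log = {
    "case1": ["admit", "lab-test", "discharge"],
    "case2": ["admit", "lab-test", "lab-test", "discharge"],
}
dfg = directly_follows(log)
```

More elaborate discovery algorithms (alpha miner, heuristic miner) start from exactly this edge-count structure.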

  16. Koopman Mode Decomposition Methods in Dynamic Stall: Reduced Order Modeling and Control

    DTIC Science & Technology

    2015-11-10

the flow phenomena by separating them into individual modes. The technique of Proper Orthogonal Decomposition (POD), see [Holmes: 1998], is a popular... sampled values h(k), k = 0, …, 2M-1, of the exponential sum: 1. Solve the following linear system, where... 2. Compute all zeros zj ∈ D, j = 1, …, M, of the Prony polynomial, i.e., calculate all eigenvalues of the associated companion matrix and form fj = log zj for j = 1, …, M, where log is the
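The Prony steps excerpted above (solve a linear system for the Prony polynomial coefficients, find its zeros via the companion matrix, then take f_j = log z_j) can be sketched as follows (a least-squares variant; variable names are illustrative):

```python
import numpy as np

def prony(h, M):
    """Prony's method sketch: h holds >= 2M samples of
    sum_j c_j * z_j**k.  Returns the exponents f_j = log z_j."""
    h = np.asarray(h, dtype=float)
    N = len(h)
    # 1. Linear system for the Prony polynomial coefficients a_i:
    #    sum_{i=0}^{M-1} a_i h(k+i) = -h(k+M),  k = 0, ..., N-M-1
    A = np.array([[h[k + i] for i in range(M)] for k in range(N - M)])
    b = -h[M:N]
    a = np.linalg.lstsq(A, b, rcond=None)[0]
    # 2. Zeros of z^M + a_{M-1} z^{M-1} + ... + a_0; np.roots computes them
    #    as eigenvalues of the companion matrix, then f_j = log z_j.
    z = np.roots(np.concatenate(([1.0], a[::-1])))
    return np.log(z)
```

For noisy data the same structure underlies more robust variants such as the matrix-pencil method.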

  17. A log-sinh transformation for data normalization and variance stabilization

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Shrestha, D. L.; Robertson, D. E.; Pokhrel, P.

    2012-05-01

When quantifying model prediction uncertainty, it is statistically convenient to represent model errors as normally distributed with a constant variance. The Box-Cox transformation is the most widely used technique to normalize data and stabilize variance, but it is not without limitations. In this paper, a log-sinh transformation is derived based on a pattern of errors commonly seen in hydrological model predictions. It is suited to applications where prediction variables are positively skewed and the spread of errors is seen to first increase rapidly, then slowly, and eventually approach a constant as the prediction variable becomes greater. The log-sinh transformation is applied in two case studies, and the results are compared with one- and two-parameter Box-Cox transformations.
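A sketch of the transformation and its inverse, assuming the two-parameter form z = (1/b) ln sinh(a + b y) (check the paper for the exact parameterisation; the parameter values and flow data here are arbitrary):

```python
import numpy as np

def log_sinh(y, a, b):
    """Forward log-sinh transform z = (1/b) * ln(sinh(a + b*y))."""
    return np.log(np.sinh(a + b * y)) / b

def inv_log_sinh(z, a, b):
    """Back-transform: y = (arcsinh(exp(b*z)) - a) / b."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

# Arbitrary parameters applied to positively skewed flow values
flows = np.array([0.5, 2.0, 10.0, 50.0])
z = log_sinh(flows, a=1.0, b=0.1)
```

For large y the transform behaves like a shift (sinh grows exponentially), while for small y it behaves like a logarithm, which is what stabilises the variance pattern described in the abstract.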

  18. Time trends in recurrence of juvenile nasopharyngeal angiofibroma: Experience of the past 4 decades.

    PubMed

    Mishra, Anupam; Mishra, Subhash Chandra

    2016-01-01

An analysis of the time distribution of juvenile nasopharyngeal angiofibroma (JNA) recurrences from the last 4 decades is presented. Sixty recurrences were analyzed by actuarial survival methods. SPSS software was used to generate Kaplan-Meier (KM) curves, and time distributions were compared by the Log-rank, Breslow and Tarone-Ware tests. The overall recurrence rate was 17.59%. The majority underwent open transpalatal approach(es) without embolization. The probability of detecting a recurrence was 95% in the first 24 months, and the comparison of KM curves across 4 different time periods was not significant. This is the first and largest series to address the time distribution. The required follow-up period is 2 years. Our recurrence rate is just half that of the largest series reported so far, suggesting the superiority of transpalatal techniques. The similarity of the curves suggests that recent technical advances are unlikely to influence recurrence, which, per our hypothesis, more likely reflects tumor biology per se. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Implementation of polyatomic MCTDHF capability

    NASA Astrophysics Data System (ADS)

    Haxton, Daniel; Jones, Jeremiah; Rescigno, Thomas; McCurdy, C. William; Ibrahim, Khaled; Williams, Sam; Vecharynski, Eugene; Rouet, Francois-Henry; Li, Xiaoye; Yang, Chao

    2015-05-01

    The implementation of the Multiconfiguration Time-Dependent Hartree-Fock method for polyatomic molecules using a cartesian product grid of sinc basis functions will be discussed. The focus will be on two key components of the method: first, the use of a resolution-of-the-identity approximation; second, the use of established techniques for triple Toeplitz matrix algebra using fast Fourier transform over distributed memory architectures (MPI 3D FFT). The scaling of two-electron matrix element transformations is converted from O(N⁴) to O(N log N) by including these components. Here N = n³, with n the number of points on a side. We test the preliminary implementation by calculating absorption spectra of small hydrocarbons, using approximately 16-512 points on a side. This work is supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, under the Early Career program, and by the offices of BES and Advanced Scientific Computing Research, under the SciDAC program.
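The Toeplitz-plus-FFT idea behind the O(N log N) scaling can be illustrated in one dimension (a generic sketch of circulant embedding; the actual code applies this per dimension with distributed 3D FFTs over MPI):

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    # Multiply a Toeplitz matrix (first column c, first row r, r[0] == c[0])
    # by x in O(N log N): embed the matrix in a circulant of size 2N-1,
    # whose action is a circular convolution, then use the FFT.
    n = len(x)
    col = np.concatenate([c, r[:0:-1]])          # circulant's first column
    y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(x, len(col)))
    return y[:n].real

# Compare against the dense O(N^2) product
rng = np.random.default_rng(0)
c = rng.standard_normal(64)
r = np.concatenate([[c[0]], rng.standard_normal(63)])
x = rng.standard_normal(64)
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(64)]
              for i in range(64)])
print(np.allclose(T @ x, toeplitz_matvec(c, r, x)))  # True
```

With N = n³ grid points, avoiding the dense product per transformation is exactly what brings the two-electron cost down as described above.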

  20. Predictive ethoinformatics reveals the complex migratory behaviour of a pelagic seabird, the Manx Shearwater

    PubMed Central

    Freeman, Robin; Dean, Ben; Kirk, Holly; Leonard, Kerry; Phillips, Richard A.; Perrins, Chris M.; Guilford, Tim

    2013-01-01

    Understanding the behaviour of animals in the wild is fundamental to conservation efforts. Advances in bio-logging technologies have offered insights into the behaviour of animals during foraging, migration and social interaction. However, broader application of these systems has been limited by device mass, cost and longevity. Here, we use information from multiple logger types to predict individual behaviour in a highly pelagic, migratory seabird, the Manx Shearwater (Puffinus puffinus). Using behavioural states resolved from GPS tracking of foraging during the breeding season, we demonstrate that individual behaviours can be accurately predicted during multi-year migrations from low cost, lightweight, salt-water immersion devices. This reveals a complex pattern of migratory stopovers: some involving high proportions of foraging, and others of rest behaviour. We use this technique to examine three consecutive years of global migrations, revealing the prominence of foraging behaviour during migration and the importance of highly productive waters during migratory stopover. PMID:23635496

  1. Superstatistical generalised Langevin equation: non-Gaussian viscoelastic anomalous diffusion

    NASA Astrophysics Data System (ADS)

    Ślęzak, Jakub; Metzler, Ralf; Magdziarz, Marcin

    2018-02-01

    Recent advances in single particle tracking and supercomputing techniques demonstrate the emergence of normal or anomalous, viscoelastic diffusion in conjunction with non-Gaussian distributions in soft, biological, and active matter systems. We here formulate a stochastic model based on a generalised Langevin equation in which non-Gaussian shapes of the probability density function and normal or anomalous diffusion have a common origin, namely a random parametrisation of the stochastic force. We perform a detailed analysis demonstrating how various types of parameter distributions for the memory kernel result in exponential, power law, or power-log law tails of the memory functions. The studied system is also shown to exhibit a further unusual property: the velocity has a Gaussian one point probability density but non-Gaussian joint distributions. This behaviour is reflected in the relaxation from a Gaussian to a non-Gaussian distribution observed for the position variable. We show that our theoretical results are in excellent agreement with stochastic simulations.

  2. Acceptability of a Mobile Phone App for Measuring Time Use in Breast Cancer Survivors (Life in a Day): Mixed-Methods Study

    PubMed Central

    2018-01-01

    Background Advancements in mobile technology allow innovative data collection techniques such as measuring time use (ie, how individuals structure their time) for the purpose of improving health behavior change interventions. Objective The aim of this study was to examine the acceptability of a 5-day trial of the Life in a Day mobile phone app measuring time use in breast cancer survivors to advance technology-based measurement of time use. Methods Acceptability data were collected from participants (N=40; 100% response rate) using a self-administered survey after 5 days of Life in a Day use. Results Overall, participants had a mean age of 55 years (SD 8) and had completed 16 years of school (SD 2). Participants generally agreed that learning to use Life in a Day was easy (83%, 33/40) and would prefer to log activities using Life in a Day over a paper-and-pencil diary (73%, 29/40). A slight majority felt that completing Life in a Day for 5 consecutive days was not too much (60%, 24/40) or overly time-consuming (68%, 27/40). Life in a Day was rated as easy to read (88%, 35/40) and navigate (70%, 32/40). Participants also agreed that it was easy to log activities using the activity timer at the start and end of an activity (90%, 35/39). Only 13% (5/40) downloaded the app on their personal phone, whereas 63% (19/30) of the remaining participants would have preferred to use their personal phone. Overall, 77% (30/39) of participants felt that the Life in a Day app was good or very good. Those who agreed that it was easy to edit activities were significantly more likely to be younger when compared with those who disagreed (mean 53 vs 58 years, P=.04). Similarly, those who agreed that it was easy to remember to log activities were more likely to be younger (mean 52 vs 60 years, P<.001). Qualitative coding of 2 open-ended survey items yielded 3 common themes for Life in a Day improvement (ie, convenience, user interface, and reminders). 
Conclusions A mobile phone app is an acceptable time-use measurement modality. Improving convenience, user interface, and memory prompts while addressing the needs of older participants is needed to enhance app utility. Trial Registration ClinicalTrials.gov NCT00929617; https://clinicaltrials.gov/ct2/show/NCT00929617 (Archived by WebCite at http://www.webcitation.org/6z2bZ4P7X) PMID:29759953

  3. Riparian Systems and Forest Management—Changes in Harvesting Techniques and their Effects on Decomposed Granitic Soils

    Treesearch

    John W. Bramhall

    1989-01-01

    In the 1950s, timber on steep granitic terrain in Trinity County, California was harvested by using the logging techniques of the time. After Trinity Dam was built in the 1960s, it became evident these techniques were not suited to quality riparian habitat and healthy anadromous fisheries. Since adoption of the Z'berg-Nejedly Forest Practice Act in 1973, efforts...

  4. Fuzzy inference system for identification of geological stratigraphy off Prydz Bay, East Antarctica

    NASA Astrophysics Data System (ADS)

    Singh, Upendra K.

    2011-12-01

    The analysis of well logging data plays a key role in the exploration and development of hydrocarbon reservoirs. Well log parameters such as porosity, gamma ray, density, transit time and resistivity help in classifying strata and estimating the physical, electrical and acoustical properties of the subsurface lithology. Strong, conspicuous changes in the log parameters associated with a particular stratigraphic formation are a function of its composition and physical properties, and these aid classification. However, some substrata show moderate values in the respective log parameters, making it difficult to identify the kind of strata from the standard variability ranges of the log parameters and visual inspection alone. The complexity increases further as more sensors are involved. An attempt is made to identify the kinds of stratigraphy from well logs over the Prydz Bay basin, East Antarctica, using a fuzzy inference system. A model is built from a few data sets of known stratigraphy, and the model is then used to infer the lithology of a borehole from geophysical logs not used in the simulation. The fuzzy-based algorithm is first trained, validated and tested on well log data, and finally identifies the formation lithology of a hydrocarbon reservoir system in the study area. The effectiveness of this technique is demonstrated by comparing the results against actual lithologs and coring data of ODP Leg 188. The fuzzy results show a training performance of 82.95% and a prediction ability of 87.69%. The results are very encouraging, and the model is able to decipher even thin seams and other strata from geophysical logs. The result identifies a significant sand formation in the depth range 316.0-341.0 m, where core recovery is incomplete.
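The membership-function machinery behind such a fuzzy inference system can be sketched minimally (the lithology classes, gamma-ray ranges, and reading below are hypothetical illustrations, not the paper's tuned model):

```python
import numpy as np

def trimf(x, a, b, c):
    # Triangular membership function on [a, c] peaking at b (assumes a < b < c):
    # the basic building block of a Mamdani-style fuzzy inference system.
    x = np.asarray(x, float)
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical memberships of a single gamma-ray reading (API units) in two
# lithology classes; a real system combines several logs via fuzzy rules.
gr = 75.0
mu_sand = trimf(gr, 0, 40, 80)       # sands: low gamma ray
mu_shale = trimf(gr, 60, 120, 180)   # shales: high gamma ray
print(float(mu_sand), float(mu_shale))
```

A reading in the overlap region belongs partly to both classes, which is exactly how moderate log values that defeat crisp variability ranges are handled.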

  5. Detection of concrete dam leakage using an integrated geophysical technique based on flow-field fitting method

    NASA Astrophysics Data System (ADS)

    Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.

    2017-05-01

    An integrated geophysical investigation was performed at S dam, located in the Dadu basin in China, to assess the condition of the dam curtain. The key component of the integrated technique was the flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream), and location of the anomalous leakage outlets in the dam foundation. Limitations of the flow-field fitting method were complemented by resistivity logging to identify internal erosion that had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent with data provided by seismic tomography, borehole television, water injection tests, and rock quality designation.

  6. DNA Damage Response and Repair Gene Alterations Are Associated with Improved Survival in Patients with Platinum-Treated Advanced Urothelial Carcinoma.

    PubMed

    Teo, Min Yuen; Bambury, Richard M; Zabor, Emily C; Jordan, Emmet; Al-Ahmadie, Hikmat; Boyd, Mariel E; Bouvier, Nancy; Mullane, Stephanie A; Cha, Eugene K; Roper, Nitin; Ostrovnaya, Irina; Hyman, David M; Bochner, Bernard H; Arcila, Maria E; Solit, David B; Berger, Michael F; Bajorin, Dean F; Bellmunt, Joaquim; Iyer, Gopakumar; Rosenberg, Jonathan E

    2017-07-15

    Purpose: Platinum-based chemotherapy, which acts by inducing DNA damage, remains the standard treatment for advanced urothelial carcinoma. We hypothesize that somatic alterations in DNA damage response and repair (DDR) genes are associated with improved sensitivity to platinum-based chemotherapy. Experimental Design: Patients with a diagnosis of locally advanced or metastatic urothelial carcinoma treated with platinum-based chemotherapy who had exon sequencing with the Memorial Sloan Kettering-Integrated Mutation Profiling of Actionable Cancer Targets (MSK-IMPACT) assay were identified. Patients were dichotomized based on the presence/absence of alterations in a panel of 34 DDR genes. DDR alteration status was correlated with clinical outcomes and disease features. Results: One hundred patients were identified, of whom 47 harbored alterations in DDR genes. Patients with DDR alterations had improved progression-free survival (9.3 vs. 6.0 months, log-rank P = 0.007) and overall survival (23.7 vs. 13.0 months, log-rank P = 0.006). DDR alterations were also associated with a higher number of mutations and copy-number alterations. Trends toward a positive correlation between DDR status and nodal metastases and an inverse correlation with visceral metastases were observed. Different DDR pathways also appeared to have variable impact on clinical outcomes. Conclusions: Somatic DDR alteration is associated with improved clinical outcomes in platinum-treated patients with advanced urothelial carcinoma. Once validated, it can improve patient selection for clinical practice and future study enrollment. Clin Cancer Res; 23(14); 3610-8. ©2017 American Association for Cancer Research.

  7. On comparison of net survival curves.

    PubMed

    Pavlič, Klemen; Perme, Maja Pohar

    2017-05-02

    Relative survival analysis is a subfield of survival analysis where competing risks data are observed but the causes of death are unknown. A first step in the analysis of such data is usually the estimation of a net survival curve, possibly followed by regression modelling. Recently, a log-rank type test for the comparison of net survival curves has been introduced, and the goal of this paper is to explore its properties and put this methodological advance into the context of the field. We build on the association between the log-rank test and the univariate or stratified Cox model and show the analogy in the relative survival setting. We study the properties of the methods using both theoretical arguments and simulations. We provide an R function to enable practical usage of the log-rank type test. Both the log-rank type test and its model alternatives perform satisfactorily under the null, even though the correlation between their p-values is rather low, implying that the two approaches cannot be used interchangeably. The stratified version has higher power in the case of non-homogeneous hazards, but also carries a different interpretation. The log-rank type test and its stratified version can be interpreted in the same way as the results of an analogous semi-parametric additive regression model, despite the fact that no direct theoretical link can be established between the test statistics.
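For reference, the classical two-group log-rank test that this net-survival test generalizes can be sketched as follows (toy data, not the relative-survival variant itself):

```python
import math
import numpy as np

def logrank_test(time, event, group):
    # Two-group log-rank test: at each distinct event time, compare the
    # observed number of events in group 1 with the number expected under
    # the null of equal hazards, using the hypergeometric mean and variance.
    time, event, group = map(np.asarray, (time, event, group))
    O = E = V = 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t                     # still under observation at t
        n = at_risk.sum()
        n1 = (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        O += d1
        E += d * n1 / n
        if n > 1:
            V += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    stat = (O - E) ** 2 / V
    p = math.erfc(math.sqrt(stat / 2.0))        # chi-square survival, 1 df
    return stat, p

# Toy data: group 1's events occur later, so its hazard looks lower
time = [2, 4, 5, 7, 8, 10, 12, 14, 15, 18]
event = [1, 1, 1, 1, 0, 1, 1, 0, 1, 1]
group = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
stat, p = logrank_test(time, event, group)
print(f"chi2 = {stat:.2f}, p = {p:.3f}")
```

The net-survival version replaces these raw event counts with weighted, expected-hazard-corrected quantities, but the observed-versus-expected structure is the same.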

  8. The use of MP3 recorders to log data from equine hoof mounted accelerometers.

    PubMed

    Parsons, K J; Wilson, A M

    2006-11-01

    MP3 recorders are readily available, small, lightweight and low cost, providing the potential for logging analogue hoof-mounted accelerometer signals for the characterisation of equine locomotion. They, however, require testing in practice. The objectives were to test whether 1) multiple MP3 recorders can maintain synchronisation, giving the ability to synchronise independent recorders for the logging of multiple limbs simultaneously; and 2) features of a foot-mounted accelerometer signal attributable to foot-on and foot-off can be accurately identified from horse hoof-mounted accelerometers logged directly into an MP3 recorder. Three experiments were performed: 1) Maintenance of synchronisation was assessed by counting the number of samples recorded by each of 4 MP3 recorders while mounted on a trotting horse, and over 2 consecutive 30 min periods in 8 recorders on a bench. 2) Foot-on and foot-off times obtained from manual transcription of MP3-logged data and a directly logged accelerometer signal were compared. 3) MP3/accelerometer acquisition units were used to log accelerometer signals from racehorses during extended training sessions. Mean absolute synchronisation error between MP3 recorders was 10 samples per million samples recorded (range 1-32 samples per million). Error accumulation showed a linear correlation with time. Features attributable to foot-on and foot-off were equally identifiable from the MP3-recorded signal over a range of equine gaits. Multiple MP3 recorders can be synchronised and used as a relatively cheap, robust, reliable and accurate logging system when combined with an accelerometer and external battery for the specific application of measuring stride timing variables across the range of equine gaits during field locomotion.
Footfall timings can be used to identify intervals between the fore and hind contacts, to identify diagonal advanced placement, and to calculate stride timing variables (stance time, protraction time and stride time). These parameters are invaluable for the characterisation and assessment of equine locomotion.
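To put the 10 samples-per-million figure in perspective, a short worked example (the 44.1 kHz sampling rate is an assumption for illustration; the study does not state the recorder's rate):

```python
# Time drift implied by a given synchronisation error between recorders,
# assuming a 44.1 kHz MP3 sampling rate (assumed, not from the study).
fs = 44_100                     # samples per second (assumed)
ppm_error = 10                  # mean absolute sync error, samples per million
minutes = 30                    # one bench-test period
total_samples = fs * 60 * minutes
drift_samples = total_samples * ppm_error / 1e6
drift_ms = drift_samples / fs * 1000
print(f"{drift_samples:.0f} samples, about {drift_ms:.1f} ms, over {minutes} min")
```

Since the error accumulates roughly linearly with time, drift over longer sessions scales proportionally and can be corrected once quantified.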

  9. Meta-analysis of the Effects of Sanitizing Treatments on Salmonella, Escherichia coli O157:H7, and Listeria monocytogenes Inactivation in Fresh Produce

    PubMed Central

    Prado-Silva, Leonardo; Cadavez, Vasco; Gonzales-Barron, Ursula; Rezende, Ana Carolina B.

    2015-01-01

    The aim of this study was to perform a meta-analysis of the effects of sanitizing treatments of fresh produce on Salmonella spp., Escherichia coli O157:H7, and Listeria monocytogenes. From 55 primary studies found to report on such effects, 40 were selected based on specific criteria, yielding more than 1,000 data points on mean log reductions of these three bacterial pathogens impairing the safety of fresh produce. Data were partitioned to build three meta-analytical models that allow assessment of differences in mean log reductions among pathogens, fresh produce, and sanitizers. Moderating variables assessed in the meta-analytical models included type of fresh produce, type of sanitizer, concentration, and treatment time and temperature. Further, a proposal was made to classify the sanitizers according to bactericidal efficacy by means of a meta-analytical dendrogram. The results indicated that both time and temperature significantly affected the mean log reductions of the sanitizing treatment (P < 0.0001). In general, sanitizer treatments led to lower mean log reductions when applied to leafy greens (for example, 0.68 log reductions [0.00 to 1.37] achieved in lettuce) compared to other, nonleafy vegetables (for example, 3.04 mean log reductions [2.32 to 3.76] obtained for carrots). Among the pathogens, E. coli O157:H7 was more resistant to ozone (1.6 mean log reductions), while L. monocytogenes and Salmonella presented high resistance to organic acids, such as citric acid, acetic acid, and lactic acid (∼3.0 mean log reductions). With regard to the sanitizers, it was found that slightly acidic electrolyzed water, acidified sodium chlorite, and gaseous chlorine dioxide clustered together, indicating that they possessed the strongest bactericidal effect. The results reported are an important step in advancing the global understanding of the effectiveness of sanitizers for the microbial safety of fresh produce. PMID:26362982
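The core pooling step of such a meta-analysis can be sketched with simple fixed-effect inverse-variance weighting (a simplification: the paper fits full meta-analytical models with moderators such as produce type, sanitizer, time, and temperature; the study-level numbers below are hypothetical):

```python
import numpy as np

def pooled_log_reduction(means, ses):
    # Fixed-effect inverse-variance pooling: weight each study's mean log
    # reduction by 1/SE^2, combine, and return the pooled mean with a
    # 95% confidence interval.
    means = np.asarray(means, float)
    w = 1.0 / np.asarray(ses, float) ** 2
    m = np.sum(w * means) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return m, (m - 1.96 * se, m + 1.96 * se)

# Hypothetical study-level results for one sanitizer/pathogen pair
mean, ci = pooled_log_reduction([0.5, 0.9, 0.7], [0.20, 0.30, 0.25])
print(round(mean, 2), tuple(round(v, 2) for v in ci))
```

Pooled means like this, computed per sanitizer, are what feed a dendrogram-style clustering of bactericidal efficacy.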

  10. LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.

    PubMed

    Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin

    2014-12-01

    The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, it also poses a great challenge to analyze this behavior and glean insights from such complex, large-scale data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from vast amounts of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a visual summary of the loyalty dynamics of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and interviews with domain experts were conducted to demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.

  11. Generalized image contrast enhancement technique based on Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1994-03-01

    This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.
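For reference, the histogram equalization that the proposed mapping generalizes can be sketched as follows (a generic implementation of the baseline, not the paper's Heinemann-based algorithm):

```python
import numpy as np

def histogram_equalization(img, levels=256):
    # Plain histogram equalization: map each gray level through the
    # normalized cumulative histogram, spreading output levels so that
    # frequently occurring intensities get more dynamic range.
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / img.size
    mapping = np.round(cdf * (levels - 1)).astype(img.dtype)
    return mapping[img]

img = np.array([[10, 10, 200],
                [10, 50, 200]], dtype=np.uint8)
eq = histogram_equalization(img)
print(eq)
```

The Heinemann-based method replaces this purely statistical mapping with two nonlinear mapping functions tuned to perceived brightness, which is why equalization falls out as a special case.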

  12. The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.

    2014-10-01

    Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. To create new tools and techniques that achieve this goal requires that researchers have an understanding of the analytical questions to be addressed, data that illustrates the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems or to the operational data that is used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large-scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.

  13. Fine PM measurements: personal and indoor air monitoring.

    PubMed

    Jantunen, M; Hänninen, O; Koistinen, K; Hashim, J H

    2002-12-01

    This review compiles personal and indoor microenvironment particulate matter (PM) monitoring needs from recently set research objectives, most importantly the NRC publication "Research Priorities for Airborne Particulate Matter" (1998). Techniques and equipment used to monitor PM personal exposures and microenvironment concentrations, and the constituents of the sampled PM, during the last 20 years are then reviewed. Development objectives are set and discussed for personal and microenvironment PM samplers and monitors, for filter materials, and for analytical laboratory techniques for equipment calibration, filter weighing and laboratory climate control. Progress is leading towards smaller sample flows and lighter, silent, independent (battery-powered) monitors with data logging capacity to store microenvironment- or activity-relevant sensor data, advanced flow controls and continuous recording of the concentration. The best filters are non-hygroscopic, chemically pure and inert, and physically robust against mechanical wear. Semiautomatic and primary-standard-equivalent positive displacement flow meters are replacing less accurate methods in flow calibration, and personal sampling flow rates should also become mass flow controlled (with or without volumetric compensation for pressure and temperature changes). In the weighing laboratory the alternatives are climatic control (set temperature and relative humidity) and mechanically simpler thermostatic heating, air conditioning and dehumidification systems combined with numerical control of temperature, humidity and pressure effects on flow calibration and filter weighing.

  14. A multidisciplinary approach to reservoir subdivision of the Maastrichtian chalk in the Dan field, Danish North Sea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kristensen, L.; Dons, T.; Schioler, P.

    1995-11-01

    Correlation of wireline log data from the North Sea chalk reservoirs is frequently hampered by rather subtle log patterns in the chalk section due to the apparent monotonous nature of the chalk sediments, which may lead to ambiguous correlations. This study deals with a correlation technique based on an integration of biostratigraphic data, seismic interpretation, and wireline log correlation; this technique aims at producing a consistent reservoir subdivision that honors both the well data and the seismic data. This multidisciplinary approach has been used to subdivide and correlate the Maastrichtian chalk in the Dan field. The biostratigraphic subdivision is based on a new detailed dinoflagellate study of core samples from eight wells. Integrating the biostratigraphic results with three-dimensional seismic data allows recognition of four stratigraphic units within the Maastrichtian, bounded by assumed chronostratigraphic horizons. This subdivision is further refined by adding a seismic horizon and four horizons from wireline log correlations, establishing a total of nine reservoir units. The approximate chronostratigraphic nature of these units provides an improved interpretation of the depositional and structural patterns in this area. The three upper reservoir units pinch out and disappear in a northeasterly direction across the field. We interpret this stratal pattern as reflecting a relative sea level fall or regional basinal subsidence during the latest Maastrichtian, possibly combined with local synsedimentary uplift due to salt tectonics. Isochore maps indicate that the underlying six non-wedging units are unaffected by salt tectonics.

  15. Development of Enabling Scientific Tools to Characterize the Geologic Subsurface at Hanford

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenna, Timothy C.; Herron, Michael M.

    2014-07-08

    This final report to the Department of Energy provides a summary of activities conducted under our exploratory grant, funded through the U.S. DOE Subsurface Biogeochemical Research Program in the category of enabling scientific tools, which covers the period from July 15, 2010 to July 14, 2013. The main goal of this exploratory project is to determine the parameters necessary to translate existing borehole log data into reservoir properties following scientifically sound petrophysical relationships. For this study, we focused on samples and Ge-based spectral gamma logging system (SGLS) data collected from wells located in the Hanford 300 Area. The main activities consisted of 1) the analysis of available core samples for a variety of mineralogical, chemical and physical properties; 2) evaluation of selected spectral gamma logs, environmental corrections, and calibration; 3) development of algorithms and a proposed workflow that permits translation of log responses into useful reservoir properties such as lithology, matrix density, porosity, and permeability. These techniques have been successfully employed in the petroleum industry; however, the approach is relatively new when applied to subsurface remediation. This exploratory project has been successful in meeting its stated objectives. We have demonstrated that our approach can lead to an improved interpretation of existing well log data. The algorithms we developed can utilize available log data, in particular gamma and spectral gamma logs, and continued optimization will improve their application to ERSP goals of understanding subsurface properties.

  16. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanhope, C; Liang, J; Drake, D

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta’s Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprised of 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, the relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans, i.e., significantly less noisy. 
Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2mm dose resolution are recommended; however, less modulated arcs may allow less stringent reconstructions. Following the aforementioned reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.
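The gamma passing rate used to compare plans can be illustrated with a simplified 1D version (global normalization, exhaustive point search; clinical tools work in 3D and interpolate, and the dose profile below is a toy example):

```python
import numpy as np

def gamma_pass_rate(ref, ev, dx, dose_tol=0.02, dist_tol=2.0):
    # Simplified 1D global gamma analysis (default 2%/2mm): for each
    # reference point, take the minimum over all evaluated points of
    # sqrt((dose diff / dose tol)^2 + (distance / dist tol)^2);
    # the point passes if that minimum (the gamma index) is <= 1.
    ref = np.asarray(ref, float)
    ev = np.asarray(ev, float)
    x = np.arange(len(ref)) * dx        # positions in mm
    dmax = ref.max()                    # global dose normalization
    passed = []
    for xi, di in zip(x, ref):
        dd = (ev - di) / (dose_tol * dmax)
        dist = (x - xi) / dist_tol
        passed.append(np.sqrt(dd ** 2 + dist ** 2).min() <= 1.0)
    return 100.0 * np.mean(passed)

x = np.arange(41, dtype=float)                     # 1 mm grid
ref = 100.0 * np.exp(-((x - 20.0) ** 2) / 50.0)    # toy dose profile
print(gamma_pass_rate(ref, 1.01 * ref, dx=1.0))    # 1% scaling: all points pass
print(gamma_pass_rate(ref, np.roll(ref, 5), dx=1.0))  # 5 mm shift: failures appear
```

Comparing log-file-reconstructed doses against the original plan with such a metric is how the reconstruction settings above were scored.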

  17. [Utilization suitability of forest resources in typical forest zone of Changbai Mountains].

    PubMed

    Hao, Zhanqing; Yu, Deyong; Xiong, Zaiping; Ye, Ji

    2004-10-01

    Conserving natural forest does not simply mean no logging. The Northeast China Forest Region has a logging quota for mature forest as part of the natural forest conservation project, so determining logging spots rationally and scientifically is very important. Recent theories of forest resources management advocate that the utilization of forest resources should adhere to the principle of sustainable use and pay attention to the ecological function of forest resources. According to the logging standards, RS and GIS techniques can be used to detect the precise location of forest resources and obtain information on forest areas and types, and thus provide more rational and scientific support for spatial choices in the future utilization of forest resources. In this paper, the Lushuihe Forest Bureau was selected as a typical case in the Changbai Mountains Forest Region to assess the utilization conditions of forest resources, and some advice on spatial choices for the future management of forest resources in the study area was offered.

  18. Aligning observed and modelled behaviour based on workflow decomposition

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement for appropriate process models, is increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique is proposed in this paper, based on a workflow decomposition method. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is then investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
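The fitness dimension of conformance checking can be illustrated with classic token replay on a toy Petri net. This is a sketch of the standard produced/consumed/missing/remaining fitness formula, not the alignment or state-equation method proposed in the paper; the dictionary encoding of the net is an assumption made for brevity:

```python
def replay_fitness(net, initial, final, trace):
    """Token-replay fitness of one trace against a Petri net (sketch).

    net: {transition: (input_places, output_places)}
    initial/final: markings as {place: token_count}
    Returns fitness in [0, 1] using the classic formula
    0.5*(1 - missing/consumed) + 0.5*(1 - remaining/produced).
    """
    marking = dict(initial)
    produced = sum(initial.values())     # initial tokens count as produced
    consumed = 0
    missing = 0
    for t in trace:
        ins, outs = net[t]
        for p in ins:
            if marking.get(p, 0) > 0:
                marking[p] -= 1
            else:
                missing += 1             # token had to be created artificially
            consumed += 1
        for p in outs:
            marking[p] = marking.get(p, 0) + 1
            produced += 1
    for p, n in final.items():           # consuming the final marking
        for _ in range(n):
            if marking.get(p, 0) > 0:
                marking[p] -= 1
            else:
                missing += 1
            consumed += 1
    remaining = sum(marking.values())
    return 0.5 * (1 - missing / consumed) + 0.5 * (1 - remaining / produced)
```

A perfectly fitting trace scores 1.0; skipping an activity both leaves tokens behind and forces tokens to be conjured, pulling the score down. Decomposition, as in the paper, applies this kind of check to fragments of a large net independently.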

  19. Geohydrologic assessment of fractured crystalline bedrock on the southern part of Manhattan, New York, through the use of advanced borehole geophysical methods

    USGS Publications Warehouse

    Stumm, F.; Chu, A.; Joesten, P.K.; Lane, J.W.

    2007-01-01

    Advanced borehole-geophysical methods were used to assess the geohydrology of fractured crystalline bedrock in 31 of 64 boreholes on the southern part of Manhattan Island, NY, in preparation for the construction of a new water tunnel. The study area is located in a highly urbanized part of New York City. The boreholes penetrated gneiss, schist, and other crystalline bedrock that has an overall southwest- to northwest-dipping foliation. Most of the fractures intersected are nearly horizontal or have moderate- to high-angle northwest or eastward dip azimuths. Heat-pulse flowmeter logs obtained under nonpumping (ambient) and pumping conditions, together with other geophysical logs, delineated transmissive fracture zones in each borehole. Water-level and flowmeter data suggest the fractured-rock ground-water-flow system is interconnected. The 60-MHz directional borehole-radar logs delineated the location and orientation of several radar reflectors that did not intersect the projection of the borehole. A total of 53 faults intersected by the boreholes have mean orientation populations of N12°W, 66°W and N11°W, 70°E. A total of 77 transmissive fractures delineated using the heat-pulse flowmeter have mean orientations of N11°E, 14°SE (majority) and N23°E, 57°NW (minority). The transmissivity of the bedrock boreholes ranged from 0.7 to 870 feet squared (ft²) per day (0.07 to 81 metres squared (m²) per day). © 2007 Nanjing Institute of Geophysical Prospecting.

  20. Life cycle performances of log wood applied for soil bioengineering constructions

    NASA Astrophysics Data System (ADS)

    Kalny, Gerda; Strauss-Sieberth, Alexandra; Strauss, Alfred; Rauch, Hans Peter

    2016-04-01

    Nowadays there is a high demand for engineering solutions that consider not only technical aspects but also ecological and aesthetic values. Soil bioengineering is a construction technique that uses biological components for hydraulic and civil engineering solutions. Soil bioengineering solutions are based on the application of living plants and other auxiliary materials, including among others log wood. This kind of construction material supports the soil bioengineering system until the plants, as living construction material, take over the stabilizing function. It is therefore important to know about the durability and the degradation process of the wooden logs in order to retain the integral performance of a soil bioengineering system. These aspects are considered within the framework of the interdisciplinary research project "ELWIRA: Plants, wood, steel and concrete - life cycle performances as construction materials". Field investigations were conducted on soil bioengineering construction material, specifically European larch wood logs, from different soil bioengineering structures at the river Wien. The drilling resistance, as a parameter for particular material characteristics of selected logs, was measured with a Rinntech Resistograph instrument at different positions on the wooden logs, covering three different exposure conditions: fully surrounded by air, with earth contact on one side, and near the water surface in wet-dry conditions. The age of the logs used ranges from one year up to 20 years. Results show the progress of drilling resistance throughout the whole cross section as an indicator for assessing soil bioengineering construction material. Logs surrounded by air showed a higher drilling resistance than logs with earth contact and the ones exposed to wet-dry conditions. 
Hence the functional capability of the wooden logs was analysed and discussed in terms of different levels of degradation. The results contribute to a sustainable and resource-conserving handling of building materials in the frame of construction and maintenance works on soil bioengineering structures.

  1. Correlations between chromatographic parameters and bioactivity predictors of potential herbicides.

    PubMed

    Janicka, Małgorzata

    2014-08-01

    Different liquid chromatography techniques, including reversed-phase liquid chromatography on Purosphere RP-18e, IAM.PC.DD2 and Cosmosil Cholester columns and micellar liquid chromatography with a Purosphere RP-8e column and using buffered sodium dodecyl sulfate-acetonitrile as the mobile phase, were applied to study the lipophilic properties of 15 newly synthesized phenoxyacetic and carbamic acid derivatives, which are potential herbicides. Chromatographic lipophilicity descriptors were used to extrapolate log k parameters (log kw and log km) and log k values. Partitioning lipophilicity descriptors, i.e., log P coefficients in an n-octanol-water system, were computed from the molecular structures of the tested compounds. Bioactivity descriptors, including partition coefficients in a water-plant cuticle system and water-human serum albumin, and coefficients for human skin partition and permeation, were calculated in silico by ACD/ADME software using the linear solvation energy relationship of Abraham. Principal component analysis was applied to describe similarities between various chromatographic and partitioning lipophilicities. Highly significant, predictive linear relationships were found between chromatographic parameters and bioactivity descriptors. © The Author [2013]. Published by Oxford University Press. All rights reserved.

  2. Finite-difference modeling of the electroseismic logging in a fluid-saturated porous formation

    NASA Astrophysics Data System (ADS)

    Guan, Wei; Hu, Hengshan

    2008-05-01

    In a fluid-saturated porous medium, an electromagnetic (EM) wavefield induces an acoustic wavefield due to the electrokinetic effect. A potential geophysical application of this effect is electroseismic (ES) logging, in which the converted acoustic wavefield is received in a fluid-filled borehole to evaluate the parameters of the porous formation around the borehole. In this paper, a finite-difference scheme is proposed to model the ES logging responses to a vertical low-frequency electric dipole along the borehole axis. The EM field excited by the electric dipole is first calculated separately by finite differences, and is then treated as a distributed source term in a set of extended Biot's equations for the converted acoustic wavefield in the formation. This set of equations is solved by a modified finite-difference time-domain (FDTD) algorithm that allows for the calculation of dynamic permeability, so that it is not restricted to low-frequency poroelastic wave problems. The perfectly matched layer (PML) technique without splitting the fields is applied to truncate the computational region. The simulated ES logging waveforms approximately agree with those obtained by the analytical method. The FDTD algorithm also applies to acoustic logging simulation in porous formations.
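The time-stepping idea behind any FDTD scheme can be shown on the much simpler 1-D scalar wave equation (no poroelasticity, no PML; fixed ends instead of absorbing boundaries). All grid sizes and the Gaussian source are arbitrary illustration values, not parameters from the paper:

```python
import math

def fdtd_1d(nx=400, nt=150, c=1500.0, dx=0.5):
    """Leapfrog FDTD for the 1-D acoustic wave equation (illustrative only).

    Update: p_i^{n+1} = 2 p_i^n - p_i^{n-1} + C^2 (p_{i+1}^n - 2 p_i^n + p_{i-1}^n),
    stable for Courant number C = c*dt/dx <= 1.
    """
    dt = 0.9 * dx / c                    # choose dt to satisfy the CFL condition
    C2 = (c * dt / dx) ** 2
    src = nx // 2
    # Gaussian pressure pulse at the centre, zero initial velocity
    cur = [math.exp(-((i - src) * dx) ** 2 / 4.0) for i in range(nx)]
    prev = list(cur)
    for _ in range(nt):
        nxt = [0.0] * nx                 # ends held at zero (rigid boundaries)
        for i in range(1, nx - 1):
            nxt[i] = 2 * cur[i] - prev[i] + C2 * (cur[i + 1] - 2 * cur[i] + cur[i - 1])
        prev, cur = cur, nxt
    return cur
```

The initial pulse splits into left- and right-travelling halves moving at speed c; the paper's scheme does the analogous update on coupled poroelastic fields with an EM-derived source term distributed over the grid.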

  3. Using borehole flow data to characterize the hydraulics of flow paths in operating wellfields

    USGS Publications Warehouse

    Paillet, F.; Lundy, J.

    2004-01-01

    Understanding the flow paths in the vicinity of water well intakes is critical in the design of effective wellhead protection strategies for heterogeneous carbonate aquifers. High-resolution flow logs can be combined with geophysical logs and borehole-wall-image logs (acoustic televiewer) to identify the porous beds, solution openings, and fractures serving as conduits connecting the well bore to the aquifer. Qualitative methods of flow log analysis estimate the relative transmissivity of each water-producing zone, but do not indicate how those zones are connected to the far-field aquifer. Borehole flow modeling techniques can be used to provide quantitative estimates of both transmissivity and far-field hydraulic head in each producing zone. These data can be used to infer how the individual zones are connected with each other, and to the surrounding large-scale aquifer. Such information is useful in land-use planning and the design of well intakes to prevent entrainment of contaminants into water-supply systems. Specific examples of flow log applications in the identification of flow paths in operating wellfields are given for sites in Austin and Faribault, Minnesota. Copyright ASCE 2004.
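The quantitative step described above, estimating both transmissivity and far-field head for each producing zone, rests on having the zone's inflow measured under two different wellbore heads (ambient and pumped). A minimal sketch, assuming a steady Thiem-type radial inflow model with an assumed radius of influence; the published flow-modeling procedure is more elaborate:

```python
import math

def zone_properties(q_ambient, q_pumped, hw_ambient, hw_pumped,
                    r0_over_rw=100.0):
    """Estimate transmissivity T and far-field head of one producing zone
    from flowmeter inflows under two wellbore conditions (sketch only).

    Model: Q = k * T * (h_zone - h_well), with k = 2*pi / ln(r0/rw).
    Two (Q, h_well) pairs give two equations in the unknowns T and h_zone.
    Requires q_ambient != q_pumped and a nonzero pumped inflow.
    """
    k = 2 * math.pi / math.log(r0_over_rw)
    r = q_ambient / q_pumped
    # eliminate T between the two equations, then back-substitute
    h_zone = (hw_ambient - r * hw_pumped) / (1 - r)
    T = q_ambient / (k * (h_zone - hw_ambient))
    return T, h_zone
```

Zones whose estimated far-field heads differ markedly are likely connected to different parts of the large-scale aquifer, which is the inference the abstract uses for wellhead-protection planning.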

  4. Structural characteristics of novel symmetrical diaryl derivatives with nitrogenated functions. Requirements for cytotoxic activity.

    PubMed

    Font, María; Ardaiz, Elena; Cordeu, Lucia; Cubedo, Elena; García-Foncillas, Jesús; Sanmartin, Carmen; Palop, Juan-Antonio

    2006-03-15

    In an attempt to discover the essential features that would allow us to explain the differences in cytotoxic activity shown by a series of symmetrical diaryl derivatives with nitrogenated functions, we have studied by molecular modelling techniques the variation in Log P and conformational behaviour, in terms of structural modifications. The Log P data--although they provide few clues concerning the observed variability in activity--suggest that an initial separation of active and inactive compounds is possible based on this parameter. The subsequent study of the conformational behaviour of the compounds, selected according to their Log P values, showed that the active compounds preferentially display an extended conformation and inactive ones are associated with a certain type of folding, with a triangular-type conformation adopted in these cases.

  5. BORE II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bore II, co-developed by Berkeley Lab researchers Frank Hale, Chin-Fu Tsang, and Christine Doughty, provides vital information for solving water quality and supply problems and for improving remediation of contaminated sites. Termed "hydrophysical logging," this technology is based on the concept of measuring repeated depth profiles of fluid electric conductivity in a borehole that is pumping. As fluid enters the wellbore, its distinct electric conductivity causes peaks in the conductivity log that grow and migrate upward with time. Analysis of the evolution of the peaks enables characterization of groundwater flow distribution more quickly, more cost effectively, and with higher resolution than ever before. Combining the unique interpretation software Bore II with advanced downhole instrumentation (the hydrophysical logging tool), the method quantifies inflow and outflow locations, their associated flow rates, and the basic water quality parameters of the associated formation waters (e.g., pH, oxidation-reduction potential, temperature). In addition, when applied in conjunction with downhole fluid sampling, Bore II makes possible a complete assessment of contaminant concentration within groundwater.

  6. 78 FR 47242 - Drawbridge Operation Regulation; Umpqua River, Reedsport, OR

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-05

    ... the openings to twice daily; once in the morning and once in the evening. DATES: Comments and related... opening is requested at least six hours in advance. The U.S. 101 Umpqua River Bridge is a swing span... examined bridge opening logs and contacted all waterway users that have requested bridge openings...

  7. Mark Hopkins' Log: Teaching and the Analysis of Ideas.

    ERIC Educational Resources Information Center

    Grenander, M. E.

    1969-01-01

    Better equipped than most teachers in a humanistic background and in a knowledge of advances in interdisciplinary study, the English teacher is well-qualified to achieve a major educational goal--to help a student acquire a disciplined attitude toward knowledge through the analysis of ideas. One method of reaching this goal is through the…

  8. Advanced information processing system

    NASA Technical Reports Server (NTRS)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  9. How Ontologies are Made: Studying the Hidden Social Dynamics Behind Collaborative Ontology Engineering Projects.

    PubMed

    Strohmaier, Markus; Walk, Simon; Pöschko, Jan; Lamprecht, Daniel; Tudorache, Tania; Nyulas, Csongor; Musen, Mark A; Noy, Natalya F

    2013-05-01

    Traditionally, evaluation methods in the field of semantic technologies have focused on the end result of ontology engineering efforts, mainly, on evaluating ontologies and their corresponding qualities and characteristics. This focus has led to the development of a whole arsenal of ontology-evaluation techniques that investigate the quality of ontologies as a product. In this paper, we aim to shed light on the process of ontology engineering construction by introducing and applying a set of measures to analyze hidden social dynamics. We argue that especially for ontologies which are constructed collaboratively, understanding the social processes that have led to its construction is critical not only in understanding but consequently also in evaluating the ontology. With the work presented in this paper, we aim to expose the texture of collaborative ontology engineering processes that is otherwise left invisible. Using historical change-log data, we unveil qualitative differences and commonalities between different collaborative ontology engineering projects. Explaining and understanding these differences will help us to better comprehend the role and importance of social factors in collaborative ontology engineering projects. We hope that our analysis will spur a new line of evaluation techniques that view ontologies not as the static result of deliberations among domain experts, but as a dynamic, collaborative and iterative process that needs to be understood, evaluated and managed in itself. We believe that advances in this direction would help our community to expand the existing arsenal of ontology evaluation techniques towards more holistic approaches.

  10. How Ontologies are Made: Studying the Hidden Social Dynamics Behind Collaborative Ontology Engineering Projects

    PubMed Central

    Strohmaier, Markus; Walk, Simon; Pöschko, Jan; Lamprecht, Daniel; Tudorache, Tania; Nyulas, Csongor; Musen, Mark A.; Noy, Natalya F.

    2013-01-01

    Traditionally, evaluation methods in the field of semantic technologies have focused on the end result of ontology engineering efforts, mainly, on evaluating ontologies and their corresponding qualities and characteristics. This focus has led to the development of a whole arsenal of ontology-evaluation techniques that investigate the quality of ontologies as a product. In this paper, we aim to shed light on the process of ontology engineering construction by introducing and applying a set of measures to analyze hidden social dynamics. We argue that especially for ontologies which are constructed collaboratively, understanding the social processes that have led to its construction is critical not only in understanding but consequently also in evaluating the ontology. With the work presented in this paper, we aim to expose the texture of collaborative ontology engineering processes that is otherwise left invisible. Using historical change-log data, we unveil qualitative differences and commonalities between different collaborative ontology engineering projects. Explaining and understanding these differences will help us to better comprehend the role and importance of social factors in collaborative ontology engineering projects. We hope that our analysis will spur a new line of evaluation techniques that view ontologies not as the static result of deliberations among domain experts, but as a dynamic, collaborative and iterative process that needs to be understood, evaluated and managed in itself. We believe that advances in this direction would help our community to expand the existing arsenal of ontology evaluation techniques towards more holistic approaches. PMID:24311994

  11. Comparative evaluation of the performance of the Abbott RealTime HIV-1 assay for measurement of HIV-1 plasma viral load on genetically diverse samples from Greece

    PubMed Central

    2011-01-01

    Background HIV-1 is characterized by increased genetic heterogeneity which tends to hinder the reliability of detection and accuracy of HIV-1 RNA quantitation assays. Methods In this study, the Abbott RealTime HIV-1 (Abbott RealTime) assay was compared to the Roche Cobas TaqMan HIV-1 (Cobas TaqMan) and the Siemens Versant HIV-1 RNA 3.0 (bDNA 3.0) assays, using clinical samples of various viral load levels and subtypes from Greece, where the recent epidemiology of HIV-1 infection has been characterized by increasing genetic diversity and a marked increase in subtype A genetic strains among newly diagnosed infections. Results A high correlation was observed between the quantitative results obtained by the Abbott RealTime and the Cobas TaqMan assays. Viral load values quantified by the Abbott RealTime were on average lower than those obtained by the Cobas TaqMan, with a mean (SD) difference of -0.206 (0.298) log10 copies/ml. The mean differences according to HIV-1 subtypes between the two techniques for samples of subtype A, B, and non-A/non-B were 0.089, -0.262, and -0.298 log10 copies/ml, respectively. Overall, differences were less than 0.5 log10 for 85% of the samples, and >1 log10 in only one subtype B sample. Similarly, Abbott RealTime and bDNA 3.0 assays yielded a very good correlation of quantitative results, whereas viral load values assessed by the Abbott RealTime were on average higher (mean (SD) difference: 0.160 (0.287) log10 copies/ml). The mean differences according to HIV-1 subtypes between the two techniques for subtype A, B and non-A/non-B samples were 0.438, 0.105 and 0.191 log10 copies/ml, respectively. Overall, the majority of samples (86%) differed by less than 0.5 log10, while none of the samples showed a deviation of more than 1.0 log10. 
Conclusions In an area of changing HIV-1 subtype pattern, the Abbott RealTime assay showed a high correlation and good agreement of results when compared both to the Cobas TaqMan and bDNA 3.0 assays, for all HIV-1 subtypes tested. All three assays could determine viral load from samples of different HIV-1 subtypes adequately. However, assay variation should be taken into account when viral load monitoring of the same individual is assessed by different systems. PMID:21219667
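The mean (SD) log10 differences and within-0.5-log fractions reported above are straightforward to compute from paired measurements. A small helper with names of our own choosing (not the study's analysis code):

```python
import math

def agreement_stats(vl_a, vl_b):
    """Paired agreement between two viral-load assays on the log10 scale.

    vl_a, vl_b: matched viral loads (copies/ml) from each assay.
    Returns (mean difference, sample SD, fraction within 0.5 log10),
    the kind of summary reported in the abstract above.
    """
    diffs = [math.log10(a) - math.log10(b) for a, b in zip(vl_a, vl_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    within_half_log = sum(abs(d) <= 0.5 for d in diffs) / n
    return mean, sd, within_half_log
```

A systematic mean offset between assays (such as the -0.206 log10 reported) is tolerable for cross-sectional use, but, as the conclusion notes, matters when longitudinal monitoring of one patient switches between systems.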

  12. Self assembly of rectangular shapes on concentration programming and probabilistic tile assembly models

    PubMed Central

    Rajasekaran, Sanguthevar

    2013-01-01

    Efficient tile sets for self assembling rectilinear shapes are of critical importance in algorithmic self assembly. A lower bound on the tile complexity of any deterministic self assembly system for an n × n square is Ω(log(n)/log(log(n))) (inferred from the Kolmogorov complexity). Deterministic self assembly systems with an optimal tile complexity have been designed for squares and related shapes in the past. However, designing Θ(log(n)/log(log(n))) unique tiles specific to a shape is still an intensive task in the laboratory. On the other hand, copies of a tile can be made rapidly using PCR (polymerase chain reaction) experiments. This led to the study of self assembly on tile concentration programming models. We present two major results in this paper on the concentration programming model. First we show how to self assemble rectangles with a fixed aspect ratio (α:β), with high probability, using Θ(α + β) tiles. This result is much stronger than the existing results by Kao et al. (Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008) and Doty (Randomized self-assembly for exact shapes. In: proceedings of the 50th annual IEEE symposium on foundations of computer science (FOCS), IEEE, Atlanta. pp 85–94, 2009), which can only self assemble squares and rely on tiles which perform binary arithmetic. Our result, in contrast, is based on a technique called staircase sampling. This technique eliminates the need for sub-tiles which perform binary arithmetic, reduces the constant in the asymptotic bound, and eliminates the need for approximate frames (Kao et al. Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008). Our second result applies staircase sampling on the equimolar concentration programming model (The tile complexity of linear assemblies. 
In: proceedings of the 36th international colloquium automata, languages and programming: Part I on ICALP ’09, Springer-Verlag, pp 235–253, 2009), to self assemble rectangles (of fixed aspect ratio) with high probability. The tile complexity of our algorithm is Θ(log(n)) and is optimal on the probabilistic tile assembly model (PTAM)—n being an upper bound on the dimensions of a rectangle. PMID:24311993

  13. Yearly report, Yucca Mountain project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brune, J.N.

    1992-09-30

    We proposed to (1) develop our data logging and analysis equipment and techniques for analyzing seismic data from the Southern Great Basin Seismic Network (SGBSN); (2) investigate the SGBSN data for evidence of seismicity patterns, depth distribution patterns, and correlations with geologic features; (3) repair and maintain our three broad-band downhole digital seismograph stations at Nelson, Nevada, Troy Canyon, Nevada, and Deep Springs, California; (4) install, operate, and log data from a super-sensitive microearthquake array at Yucca Mountain; and (5) analyze data from micro-earthquakes relative to seismic hazard at Yucca Mountain.

  14. Summary goodness-of-fit statistics for binary generalized linear models with noncanonical link functions.

    PubMed

    Canary, Jana D; Blizzard, Leigh; Barry, Ronald P; Hosmer, David W; Quinn, Stephen J

    2016-05-01

    Generalized linear models (GLM) with a canonical logit link function are the primary modeling technique used to relate a binary outcome to predictor variables. However, noncanonical links can offer more flexibility, producing convenient analytical quantities (e.g., probit GLMs in toxicology) and desired measures of effect (e.g., relative risk from log GLMs). Many summary goodness-of-fit (GOF) statistics exist for logistic GLM. Their properties make the development of GOF statistics relatively straightforward, but it can be more difficult under noncanonical links. Although GOF tests for logistic GLM with continuous covariates (GLMCC) have been applied to GLMCCs with log links, we know of no GOF tests in the literature specifically developed for GLMCCs that can be applied regardless of link function chosen. We generalize the Tsiatis GOF statistic originally developed for logistic GLMCCs, (TG), so that it can be applied under any link function. Further, we show that the algebraically related Hosmer-Lemeshow (HL) and Pigeon-Heyse (J(2)) statistics can be applied directly. In a simulation study, TG, HL, and J(2) were used to evaluate the fit of probit, log-log, complementary log-log, and log models, all calculated with a common grouping method. The TG statistic consistently maintained Type I error rates, while those of HL and J(2) were often lower than expected if terms with little influence were included. Generally, the statistics had similar power to detect an incorrect model. An exception occurred when a log GLMCC was incorrectly fit to data generated from a logistic GLMCC. In this case, TG had more power than HL or J(2). © 2015 John Wiley & Sons Ltd/London School of Economics.
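The Hosmer-Lemeshow statistic discussed above groups observations by sorted fitted probability and compares observed with expected event counts per group. A minimal sketch with equal-size groups (real implementations differ in tie handling and degrees-of-freedom conventions, and the paper's TG and J(2) statistics are not shown):

```python
def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow summary goodness-of-fit statistic (sketch).

    y: 0/1 outcomes; p: fitted probabilities from any binary GLM.
    Observations are sorted by p and split into equal-size groups;
    the statistic is compared to a chi-square with groups-2 df in practice.
    """
    pairs = sorted(zip(p, y))
    n = len(pairs)
    stat = 0.0
    for g in range(groups):
        chunk = pairs[g * n // groups:(g + 1) * n // groups]
        if not chunk:
            continue
        obs = sum(yy for _, yy in chunk)       # observed events in group
        exp = sum(pp for pp, _ in chunk)       # expected events in group
        ng = len(chunk)
        pbar = exp / ng
        if 0 < pbar < 1:
            stat += (obs - exp) ** 2 / (ng * pbar * (1 - pbar))
    return stat
```

Because the grouping depends only on the fitted probabilities, not on the link, the same computation applies to probit, log, and complementary log-log fits, which is why HL carries over to noncanonical links as the abstract states.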

  15. Direct push driven in situ color logging tool (CLT): technique, analysis routines, and application

    NASA Astrophysics Data System (ADS)

    Werban, U.; Hausmann, J.; Dietrich, P.; Vienken, T.

    2014-12-01

    Direct push technologies have recently seen broad development, providing several tools for in situ parameterization of unconsolidated sediments. One of these techniques is the measurement of soil colors, a proxy that relates to soil/sediment properties. We introduce the direct push driven color logging tool (CLT) for real-time and depth-resolved investigation of soil colors within the visible spectrum. Until now, no routines exist for handling highly resolved (mm-scale) soil color data. To develop such a routine, we transform raw data (CIEXYZ) into soil color surrogates of selected color spaces (CIExyY, CIEL*a*b*, CIEL*c*h*, sRGB) and denoise small-scale natural variability by Haar and Daubechies-4 wavelet transformation, gathering interpretable color logs over depth. However, interpreting color log data as a single application remains challenging. Additional information, such as site-specific knowledge of the geological setting, is required to correlate soil color data with specific layer properties. Hence, we provide exemplary results from a joint interpretation of in situ-obtained soil color data and 'state-of-the-art' direct push based profiling tool data, and discuss the benefit of the additional data. The developed routine is capable of transferring the information obtained as colorimetric data into interpretable color surrogates. Soil color data proved to correlate with small-scale lithological/chemical changes (e.g., grain size, oxidative and reductive conditions), especially when combined with additional direct push vertical high resolution data (e.g., cone penetration testing and soil sampling). Thus, the technique allows enhanced profiling by providing another reproducible high-resolution parameter for analysing subsurface conditions. This opens potential new areas of application and new outputs for such data in site investigation. 
It is our intention to improve color measurements in terms of application method and data interpretation, making them useful for characterizing vadose zone/soil/sediment properties.
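The wavelet step above, denoising small-scale variability before interpretation, can be sketched with a single-level Haar shrinkage (the study uses multi-level Haar and Daubechies transforms, and the threshold choice here is left to the caller as an assumption):

```python
import math

def haar_denoise(signal, threshold):
    """One-level Haar wavelet shrinkage (illustrative sketch).

    signal: even-length sequence of samples (e.g. one color channel vs depth).
    Detail coefficients with magnitude below `threshold` are zeroed
    (hard thresholding), then the signal is reconstructed.
    """
    s2 = math.sqrt(2.0)
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s2 for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s2 for i in range(half)]
    detail = [d if abs(d) >= threshold else 0.0 for d in detail]
    out = []
    for a, d in zip(approx, detail):       # inverse Haar transform
        out.append((a + d) / s2)
        out.append((a - d) / s2)
    return out
```

With a zero threshold the transform is perfectly invertible; raising the threshold suppresses small-scale natural variability while preserving layer-scale steps, which is what makes the resulting color logs interpretable over depth.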

  16. Data mining learning bootstrap through semantic thumbnail analysis

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Farinella, Giovanni Maria; Giuffrida, Giovanni; Tribulato, Giuseppe

    2007-01-01

    The rapid increase of technological innovations in the mobile phone industry induces the research community to develop new and advanced systems to optimize the services offered by mobile phone operators (telcos), to maximize their effectiveness and improve their business. Data mining algorithms can run over data produced by mobile phone usage (e.g. image, video, text and log files) to discover users' preferences and predict the most likely (to be purchased) offer for each individual customer. One of the main challenges is the reduction of the learning time and cost of these automatic tasks. In this paper we discuss an experiment where a commercial offer is composed of a small picture augmented with a short text describing the offer itself. Each customer's purchase is properly logged with all relevant information. Upon arrival of new items we need to learn who the best customers (prospects) for each item are, that is, the ones most likely to be interested in purchasing that specific item. Such learning activity is time consuming and, in our specific case, is not applicable given the large number of new items arriving every day. Basically, given the current customer base we are not able to learn on all new items. Thus, we need some way to select among the new items to identify the best candidates. We do so by using a joint analysis of visual features and text to estimate how good each new item could be, that is, whether or not it is worth learning on it. Preliminary results show the effectiveness of the proposed approach in improving classical data mining techniques.

  17. Reduction of antibiotic resistance genes in municipal wastewater effluent by advanced oxidation processes.

    PubMed

    Zhang, Yingying; Zhuang, Yao; Geng, Jinju; Ren, Hongqiang; Xu, Ke; Ding, Lili

    2016-04-15

    This study investigated the reduction of antibiotic resistance genes (ARGs), intI1 and 16S rRNA genes, by advanced oxidation processes (AOPs), namely Fenton oxidation (Fe(2+)/H2O2) and the UV/H2O2 process. The ARGs include sul1, tetX, and tetG from municipal wastewater effluent. The results indicated that the Fenton oxidation and UV/H2O2 process could reduce selected ARGs effectively. Oxidation by the Fenton process was slightly better than that of the UV/H2O2 method. In particular, for the Fenton oxidation, under the optimal condition, wherein Fe(2+)/H2O2 had a molar ratio of 0.1 and a H2O2 concentration of 0.01 mol L(-1) with a pH of 3.0 and a reaction time of 2 h, 2.58-3.79 logs of target genes were removed. Under the initial effluent pH condition (pH = 7.0), the removal was 2.26-3.35 logs. For the UV/H2O2 process, when the pH was 3.5 with a H2O2 concentration of 0.01 mol L(-1) accompanied by 30 min of UV irradiation, all ARGs could achieve a reduction of 2.8-3.5 logs, and 1.55-2.32 logs at a pH of 7.0. The Fenton oxidation and UV/H2O2 process followed the first-order reaction kinetic model. The removal of target genes was affected by many parameters, including the initial Fe(2+)/H2O2 molar ratio, H2O2 concentration, solution pH, and reaction time. Among these factors, reagent concentrations and pH values are the most important during AOPs. Copyright © 2016 Elsevier B.V. All rights reserved.
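The statement that removal followed the first-order kinetic model means the log reduction grows linearly with treatment time. A least-squares fit of the rate constant constrained through the origin, as a hedged sketch (not the authors' fitting code; k is in log10 units per unit time):

```python
def first_order_k(times, log10_reductions):
    """Rate constant for the first-order inactivation model
    log10(N0/N) = k * t, fit by least squares through the origin
    (no reduction at t = 0).

    times: treatment times; log10_reductions: measured log removals.
    """
    num = sum(t * r for t, r in zip(times, log10_reductions))
    den = sum(t * t for t in times)
    return num / den
```

Given k, the time needed for any target removal follows directly (t = target / k), which is how kinetic fits like these translate into treatment design.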

  18. SaniTwice: a novel approach to hand hygiene for reducing bacterial contamination on hands when soap and water are unavailable.

    PubMed

    Edmonds, Sarah L; Mann, James; McCormack, Robert R; Macinga, David R; Fricker, Christopher M; Arbogast, James W; Dolan, Michael J

    2010-12-01

    The risk of inadequate hand hygiene in food handling settings is exacerbated when water is limited or unavailable, thereby making washing with soap and water difficult. The SaniTwice method involves application of excess alcohol-based hand sanitizer (ABHS), hand "washing" for 15 s, and thorough cleaning with paper towels while hands are still wet, followed by a standard application of ABHS. This study investigated the effectiveness of the SaniTwice methodology as an alternative to hand washing for cleaning and removal of microorganisms. On hands moderately soiled with beef broth containing Escherichia coli (ATCC 11229), washing with a nonantimicrobial hand washing product achieved a 2.86 (±0.64)-log reduction in microbial contamination compared with the baseline, whereas the SaniTwice method with 62 % ethanol (EtOH) gel, 62 % EtOH foam, and 70 % EtOH advanced formula gel achieved reductions of 2.64 ± 0.89, 3.64 ± 0.57, and 4.61 ± 0.33 log units, respectively. When hands were heavily soiled from handling raw hamburger containing E. coli, washing with nonantimicrobial hand washing product and antimicrobial hand washing product achieved reductions of 2.65 ± 0.33 and 2.69 ± 0.32 log units, respectively, whereas SaniTwice with 62 % EtOH foam, 70 % EtOH gel, and 70 % EtOH advanced formula gel achieved reductions of 2.87 ± 0.42, 2.99 ± 0.51, and 3.92 ± 0.65 log units, respectively. These results clearly demonstrate that the in vivo antibacterial efficacy of the SaniTwice regimen with various ABHS is equivalent to or exceeds that of the standard hand washing approach as specified in the U.S. Food and Drug Administration Food Code. Implementation of the SaniTwice regimen in food handling settings with limited water availability should significantly reduce the risk of foodborne infections resulting from inadequate hand hygiene.
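The log-unit reductions quoted throughout this record are log10 ratios of microbial counts before and after the hygiene procedure; for the record, the arithmetic is simply:

```python
import math

def log_reduction(count_before, count_after):
    """Log10 reduction between baseline and post-treatment counts
    (e.g. CFU per hand); a 3.0-log reduction is a 1000-fold decrease.
    Helper written for illustration, not the study's analysis code."""
    return math.log10(count_before) - math.log10(count_after)
```

On this scale the reported SaniTwice results (up to 4.61 log units) correspond to a roughly 40,000-fold decrease in recoverable organisms relative to baseline.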

  19. Interpretation of well logs in a carbonate aquifer

    USGS Publications Warehouse

    MacCary, L.M.

    1978-01-01

    This report describes the log analysis of the Randolph and Sabinal core holes in the Edwards aquifer in Texas, with particular attention to the principles that can be applied generally to any carbonate system. The geologic and hydrologic data were obtained during the drilling of the two holes, from extensive laboratory analysis of the cores, and from numerous geophysical logs run in the two holes. Some logging methods are inherently superior to others for the analysis of limestone and dolomite aquifers. Three such systems are the density, neutron, and acoustic-velocity (sonic) logs. Most of the log analysis described here is based on the interpretation of suites of logs from these three systems. In certain instances, deeply focused resistivity logs can be used to good advantage in carbonate rock studies; this technique is used to compute the water resistivity in the Randolph core hole. The rocks penetrated by the Randolph core hole are typical of those carbonates that have undergone very little solution by recent ground-water circulation. There are few large solutional openings; the water is saline; and the rocks are dark, dolomitic, have pore space that is interparticle or intercrystalline, and contain unoxidized organic material. The total porosity of rocks in the saline zone is higher than that of rocks in the fresh-water aquifer; however, the intrinsic permeability is much less in the saline zone because there are fewer large solutional openings. The Sabinal core hole penetrates a carbonate environment that has experienced much solution by ground water during recent geologic time. The rocks have high secondary porosities controlled by sedimentary structures within the rock; the water is fresh; and the dominant rock composition is limestone. The relative percentages of limestone and dolomite, the average matrix (grain) densities of the rock mixtures, and the porosity of the rock mass can be calculated from density, neutron, and acoustic logs. With supporting data from resistivity logs, the formation water quality can be estimated, as well as the relative cementation or tortuosity of the rock. Many of these properties calculated from logs can be verified by analysis of the core available from test holes drilled in the saline and fresh-water zones.

  20. How accurately can we estimate energetic costs in a marine top predator, the king penguin?

    PubMed

    Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J

    2007-01-01

    King penguins (Aptenodytes patagonicus) are one of the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate-rate of oxygen consumption (f(H) - V(O2)) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present f(H) - V(O2) equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the f(H) - V(O2) technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of V(O2) from published, field f(H) data. The major conclusions from the present study are: (1) in contrast to that for walking, the f(H) - V(O2) relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(V(O2)) = -0.279 + 1.24 log(f(H)) + 0.0237t - 0.0157 log(f(H))t, derived in a previous study, is the most suitable equation presently available for estimating V(O2) in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an f(H) - V(O2) relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of f(H) - V(O2) prediction equations, is explained.
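
    Prediction equation (1) can be applied directly once f(H) and the covariate t are known; a sketch assuming base-10 logarithms (the abstract does not define t or the measurement units, so treat this purely as an illustration):

```python
import math

def predict_vo2(f_h: float, t: float) -> float:
    """Evaluate prediction equation (1) from the abstract:
    log(VO2) = -0.279 + 1.24 log(fH) + 0.0237 t - 0.0157 log(fH) t.
    Base-10 logs are assumed; f_h is heart rate and t is the covariate
    used in the source study (neither unit is given in the abstract)."""
    log_fh = math.log10(f_h)
    log_vo2 = -0.279 + 1.24 * log_fh + 0.0237 * t - 0.0157 * log_fh * t
    return 10.0 ** log_vo2
```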

  1. New Technique for TOC Estimation Based on Thermal Core Logging in Low-Permeable Formations (Bazhen fm.)

    NASA Astrophysics Data System (ADS)

    Popov, Evgeny; Popov, Yury; Spasennykh, Mikhail; Kozlova, Elena; Chekhonin, Evgeny; Zagranovskaya, Dzhuliya; Belenkaya, Irina; Alekseev, Aleksey

    2016-04-01

    A practical method for identifying organic-rich intervals within low-permeable dispersive rocks, based on thermal conductivity measurements along the core, is presented. Non-destructive, non-contact thermal core logging was performed with the optical scanning technique on 4,685 full-size core samples from 7 wells drilled in four low-permeable zones of the Bazhen formation (B.fm.) in Western Siberia (Russia). The method employs continuous simultaneous measurements of rock thermal conductivity, volumetric heat capacity, thermal anisotropy coefficient and thermal heterogeneity factor along the cores, allowing high vertical resolution (up to 1-2 mm). B.fm. rock matrix thermal conductivity was observed to be essentially stable within the range of 2.5-2.7 W/(m*K). This stable matrix thermal conductivity, along with a high thermal anisotropy coefficient, is characteristic of B.fm. sediments due to the low rock porosity values. It is shown experimentally that the measured thermal parameters relate linearly to organic richness rather than to deviations in porosity. Thus, a new technique was developed that transforms the thermal conductivity profiles into continuous profiles of total organic carbon (TOC) values along the core. Comparison of the TOC values estimated from thermal conductivity with pyrolytic TOC estimations of 665 core samples, obtained using the Rock-Eval and HAWK instruments, demonstrated the high efficiency of the new technique for separating organic-rich intervals. The data obtained with the new technique are essential for assessing source rock (SR) hydrocarbon generation potential, for basin and petroleum system modeling, and for estimation of hydrocarbon reserves. The method allows the TOC richness to be accurately assessed using thermal well logs. The research work was done with financial support of the Russian Ministry of Education and Science (unique identification number RFMEFI58114X0008).
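
    The reported linear relation between thermal conductivity and TOC suggests a simple calibrate-then-transform workflow; a sketch with illustrative numbers (the study's actual regression coefficients and sample values are not given in the abstract):

```python
import numpy as np

# Illustrative calibration pairs: thermal conductivity (W/(m*K)) vs.
# pyrolytic TOC (wt%) for a subset of core samples. Higher organic
# content lowers conductivity, so the fitted slope is negative.
lam_cal = np.array([2.6, 2.2, 1.9, 1.6, 1.3])
toc_cal = np.array([0.5, 3.0, 6.0, 9.5, 13.0])

# Least-squares line: TOC ~ slope * lambda + intercept
slope, intercept = np.polyfit(lam_cal, toc_cal, 1)

# Transform a continuous conductivity profile into a continuous TOC profile
lam_profile = np.array([2.55, 2.10, 1.70, 1.40])
toc_profile = slope * lam_profile + intercept
```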

  2. More Learners, Finite Resources, and the Changing Landscape of Procedural Training at the Bedside.

    PubMed

    Gisondi, Michael A; Regan, Linda; Branzetti, Jeremy; Hopson, Laura R

    2018-05-01

    There is growing competition for nonoperative, procedural training in teaching hospitals, due to an increased number of individuals seeking to learn procedures from a finite number of appropriate teaching cases. Procedural training is required by students, postgraduate learners, and practicing providers who must maintain their skills. These learner groups are growing in size as the number of medical schools increases and advanced practice providers expand their skills to include complex procedures. These various learner needs occur against a background of advancing therapeutic techniques that improve patient care but also act to reduce the overall numbers of procedures available to learners. This article is a brief review of these and other challenges that are arising for program directors, medical school leaders, and hospital administrators who must act to ensure that all of their providers acquire and maintain competency in a wide array of procedural skills. The authors conclude their review with several recommendations to better address procedural training in this new era of learner competition. These include a call for innovative clinical rotations deliberately designed to improve procedural training, access to training opportunities at new clinical sites acquired in health system expansions, targeted faculty development for those who teach procedures, reporting of competition for bedside procedures by trainees, more frequent review of resident procedure and case logs, and the creation of an institutional oversight committee for procedural training.

  3. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administering a large-scale, multi-protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (increasing by 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging and on-demand report generation. The main challenges for such a system are: coping with CASTOR's log format diversity and its information scattered among several log files, the need for long-term information archival, the strict reliability requirements and the group-based GUI visualization. For this purpose, we have designed, developed and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers to a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions and finally the Web UI layer for accessing the information. With flexibility, extensibility and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACL). Since its deployment, it has been successfully used by CASTOR tape operators for quick overview of transactions, performance evaluation and malfunction detection, and by managers for report generation.
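
    The layered design described above, where each layer consumes from the previous one and feeds the next, can be sketched as a chain of generators; all names and message formats here are illustrative, not CASTOR's actual interfaces:

```python
from typing import Iterable

def transfer_layer(raw_lines: Iterable[str]) -> Iterable[str]:
    """Log Transfer layer: collect raw log lines into one stream."""
    for line in raw_lines:
        yield line

def mining_layer(lines: Iterable[str]) -> Iterable[dict]:
    """Data Mining layer: combine log lines into transaction records.
    Here a line is assumed (hypothetically) to start with a transaction id."""
    for line in lines:
        txn_id, _, event = line.partition(" ")
        yield {"txn": txn_id, "event": event}

def storage_layer(txns: Iterable[dict]) -> list:
    """Storage layer: archive the resulting transactions (in-memory here)."""
    return list(txns)

# Each layer only sees the previous layer's output, so any one of them
# can be replaced without touching the others.
archive = storage_layer(mining_layer(transfer_layer(
    ["t1 mount", "t1 read", "t2 mount"])))
print(len(archive))  # 3
```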

  4. Antimicrobial activity of a novel adhesive containing chlorhexidine gluconate (CHG) against the resident microflora in human volunteers

    PubMed Central

    Carty, Neal; Wibaux, Anne; Ward, Colleen; Paulson, Daryl S.; Johnson, Peter

    2014-01-01

    Objectives To evaluate the antimicrobial activity of a new, transparent composite film dressing, whose adhesive contains chlorhexidine gluconate (CHG), against the native microflora present on human skin. Methods CHG-containing adhesive film dressings and non-antimicrobial control film dressings were applied to the skin on the backs of healthy human volunteers without antiseptic preparation. Dressings were removed 1, 4 or 7 days after application. The bacterial populations underneath were measured by quantitative cultures (cylinder-scrub technique) and compared with one another as a function of time. Results The mean baseline microflora recovery was 3.24 log10 cfu/cm2. The mean log reductions from baseline measured from underneath the CHG-containing dressings were 0.87, 0.78 and 1.30 log10 cfu/cm2 on days 1, 4 and 7, respectively, compared with log reductions of 0.67, −0.87 and −1.29 log10 cfu/cm2 from underneath the control film dressings. There was no significant difference between the log reductions of the two treatments on day 1, but on days 4 and 7 the log reduction associated with the CHG adhesive was significantly higher than that associated with the control adhesive. Conclusions The adhesive containing CHG was associated with a sustained antimicrobial effect that was not present in the control. Incorporating the antimicrobial into the adhesive layer confers upon it bactericidal properties in marked contrast to the non-antimicrobial adhesive, which contributed to bacterial proliferation when the wear time was ≥4 days. PMID:24722839

  5. Antimicrobial activity of a novel adhesive containing chlorhexidine gluconate (CHG) against the resident microflora in human volunteers.

    PubMed

    Carty, Neal; Wibaux, Anne; Ward, Colleen; Paulson, Daryl S; Johnson, Peter

    2014-08-01

    To evaluate the antimicrobial activity of a new, transparent composite film dressing, whose adhesive contains chlorhexidine gluconate (CHG), against the native microflora present on human skin. CHG-containing adhesive film dressings and non-antimicrobial control film dressings were applied to the skin on the backs of healthy human volunteers without antiseptic preparation. Dressings were removed 1, 4 or 7 days after application. The bacterial populations underneath were measured by quantitative cultures (cylinder-scrub technique) and compared with one another as a function of time. The mean baseline microflora recovery was 3.24 log10 cfu/cm². The mean log reductions from baseline measured from underneath the CHG-containing dressings were 0.87, 0.78 and 1.30 log10 cfu/cm² on days 1, 4 and 7, respectively, compared with log reductions of 0.67, -0.87 and -1.29 log10 cfu/cm² from underneath the control film dressings. There was no significant difference between the log reductions of the two treatments on day 1, but on days 4 and 7 the log reduction associated with the CHG adhesive was significantly higher than that associated with the control adhesive. The adhesive containing CHG was associated with a sustained antimicrobial effect that was not present in the control. Incorporating the antimicrobial into the adhesive layer confers upon it bactericidal properties in marked contrast to the non-antimicrobial adhesive, which contributed to bacterial proliferation when the wear time was ≥4 days. © The Author 2014. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy.

  6. Field project to obtain pressure core, wireline log, and production test data for evaluation of CO/sub 2/ flooding potential, Conoco MCA unit well No. 358, Maljamar Field, Lea County, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.

    1981-11-01

    This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on February 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO/sub 2/ injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core water saturation and computed log porosities agree fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet, and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log-derived versus core-derived water saturations. However, both core and log analysis indicated the ninth zone had the highest residual hydrocarbon saturations, and production data confirmed the validity of the oil saturation determinations. Residual oil saturations for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from the interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 feet and the lack of production from 3692 to 3718 feet indicated the zones to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flooding.

  7. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for the International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC, provides food for thought on their strengths and limitations, and proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g., interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or, just as importantly, not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and the reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

  8. Impact of logging on a mangrove swamp in south Mexico: cost/benefit analysis.

    PubMed

    Tovilla-Hernández, C; Espino de la Lanza, G; Orihuela-Belmonte, D E

    2001-06-01

    Environmental changes caused by logging in a mangrove swamp were studied in Barra de Tecoanapa, Guerrero, Mexico. The original forest included Rhizophora mangle, Laguncularia racemosa, Avicennia germinans and halophytic vegetation, and produced wood (164.03 m3/ha) and organic matter (3.9 g/m2/day). A total of 3.5 tons of wood per year were harvested from this area. Later, an average of 2,555 kg of maize per planting cycle were obtained (market value of 88 USD). Succession when the area was abandoned included strictly facultative and glycophyte halophytes (16 families; Cyperaceae and Poaceae were the best represented). After logging, temperatures increased 13 degrees C in the soil and 11 degrees C in the air, whereas salinity reached 52 psu in the dry season. These changes modified soil color, and sand content increased from 42.6% to 63.4%. Logging was deleterious to species, habitat, biogeochemical and biological cycles, organic matter production, seeds, young plants, genetic exchange, conservation of soil and its fertility, coastal protection, and aesthetic value; 3,000 m2 had eroded as the river advanced towards the deforested area (the cost/benefit analysis showed a ratio of 246:1). There was long-term economic loss for the community, and only 30% of the site has recovered after five years.

  9. Borehole techniques identifying subsurface chimney heights in loose ground-some experiences above underground nuclear explosions

    USGS Publications Warehouse

    Carroll, R.D.; Lacomb, J.W.

    1993-01-01

    The location of the subsurface top of the chimney formed by the collapse of the cavity resulting from an underground nuclear explosion is examined at five sites at the Nevada Test Site. The chimneys were investigated by drilling, coring, geophysical logging (density, gamma-ray, caliper), and seismic velocity surveys. The identification of the top of the chimney can be complicated by chimney termination in friable volcanic rock of relatively high porosity. The presence of an apical void in three of the five cases is confirmed as the chimney horizon by coincidence with anomalies observed in coring, caliper and gamma-ray logging (two cases), seismic velocity, and drilling. In the two cases where an apical void is not present, several of these techniques yield anomalies at identical horizons; however, the exact depth of chimney penetration is subject to some degree of uncertainty. This is due chiefly to the extent to which core recovery and seismic velocity may be affected by perturbations in the tuff above the chimney due to the explosion and collapse. The data suggest, however, that the depth uncertainty may be only of the order of 10 m if several indicators are available. Of all indicators, core recovery and seismic velocity indicate anomalous horizons in every case. Because radiation products associated with the explosion are contained within the immediate vicinity of the cavity, gamma-ray logs are generally not diagnostic of chimney penetration. In no case is the density log indicative of the presence of the chimney. © 1993.

  10. Identification and Management of Pump Thrombus in the HeartWare Left Ventricular Assist Device System: A Novel Approach Using Log File Analysis.

    PubMed

    Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir

    2015-11-01

    The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute and rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power values were <1.25% and 200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
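
    The decision rule reported above, that tPA therapy is more likely to succeed when the percent of expected power stays below 200% and the rate of power increase stays below 1.25%, amounts to a two-threshold classifier. An illustrative restatement of the abstract's cut-offs, not clinical software:

```python
def tpa_likely_successful(percent_expected_power: float,
                          rate_of_power_increase: float) -> bool:
    """Thresholds distilled from the abstract: medical therapy succeeded
    in roughly 78-81% of cases when both values fell below these cut-offs."""
    return percent_expected_power < 200.0 and rate_of_power_increase < 1.25

# Group means reported for successful vs. unsuccessful treatments:
print(tpa_likely_successful(130.9, 0.61))  # True
print(tpa_likely_successful(196.1, 2.87))  # False
```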

  11. Simultaneous Tracking of Multiple Points Using a Wiimote

    ERIC Educational Resources Information Center

    Skeffington, Alex; Scully, Kyle

    2012-01-01

    This paper reviews the construction of an inexpensive motion tracking and data logging system, which can be used for a wide variety of teaching experiments ranging from entry-level physics courses to advanced courses. The system utilizes an affordable infrared camera found in a Nintendo Wiimote to track IR LEDs mounted to the objects to be…

  12. Technological advances in temperate hardwood tree improvement including breeding and molecular marker applications

    Treesearch

    Paula M. Pijut; Keith E. Woeste; G. Vengadesan

    2007-01-01

    Hardwood forests and plantations are an important economic resource for the forest products industry worldwide and to the international trade of lumber and logs. Hardwood trees are also planted for ecological reasons, for example, wildlife habitat, native woodland restoration, and riparian buffers. The demand for quality hardwood from tree plantations will continue to...

  13. Cooperative use of advanced scanning technology for low-volume hardwood processors

    Treesearch

    Luis G. Occeña; Timothy J. Rayner; Daniel L. Schmoldt; A. Lynn Abbott

    2001-01-01

    Of the several hundreds of hardwood lumber sawmills across the country, the majority are small- to medium-sized facilities operated as small businesses in rural communities. Trends of increased log costs and limited availability are forcing wood processors to become more efficient in their operations. Still, small mills are less able to adopt new, more efficient...

  14. Sensor-Free or Sensor-Full: A Comparison of Data Modalities in Multi-Channel Affect Detection

    ERIC Educational Resources Information Center

    Paquette, Luc; Rowe, Jonathan; Baker, Ryan; Mott, Bradford; Lester, James; DeFalco, Jeanine; Brawner, Keith; Sottilare, Robert; Georgoulas, Vasiliki

    2016-01-01

    Computational models that automatically detect learners' affective states are powerful tools for investigating the interplay of affect and learning. Over the past decade, affect detectors--which recognize learners' affective states at run-time using behavior logs and sensor data--have advanced substantially across a range of K-12 and postsecondary…

  15. Thermal properties of forest fuels

    Treesearch

    G.M. Byram; W.L. Fons

    1952-01-01

    Forest fuels are heterogeneous mixtures of a number of green and dead woody substances. Most common are leaves, grass, conifer needles, moss, bark, and wood. As a result of past fires, an area may also contain some charcoal. With the exception of charcoal, these materials are in various stages of decay or decomposition. Logs, limbs, and decayed wood in an advanced...

  16. Fast decoder for local quantum codes using Groebner basis

    NASA Astrophysics Data System (ADS)

    Haah, Jeongwan

    2013-03-01

    Based on arXiv:1204.1063. A local translation-invariant quantum code has a description in terms of Laurent polynomials. As an application of this observation, we present a fast decoding algorithm for translation-invariant local quantum codes in any spatial dimension using the straightforward division algorithm for multivariate polynomials. The running time is O(n log n) on average, or O(n² log n) in the worst case, where n is the number of physical qubits. The algorithm improves a subroutine of the renormalization-group decoder by Bravyi and Haah (arXiv:1112.3252) in the translation-invariant case. This work is supported in part by the Institute for Quantum Information and Matter, an NSF Physics Frontier Center, and the Korea Foundation for Advanced Studies.

  17. Integration of carbon conservation into sustainable forest management using high resolution satellite imagery: A case study in Sabah, Malaysian Borneo

    NASA Astrophysics Data System (ADS)

    Langner, Andreas; Samejima, Hiromitsu; Ong, Robert C.; Titin, Jupiri; Kitayama, Kanehiro

    2012-08-01

    Conservation of tropical forests is of outstanding importance for mitigation of climate change effects and preserving biodiversity. In Borneo most of the forests are classified as permanent forest estates and are selectively logged using conventional logging techniques causing high damage to the forest ecosystems. Incorporation of sustainable forest management into climate change mitigation measures such as Reducing Emissions from Deforestation and Forest Degradation (REDD+) can help to avert further forest degradation by synergizing sustainable timber production with the conservation of biodiversity. In order to evaluate the efficiency of such initiatives, monitoring methods for forest degradation and above-ground biomass in tropical forests are urgently needed. In this study we developed an index using Landsat satellite data to describe the crown cover condition of lowland mixed dipterocarp forests. We showed that this index combined with field data can be used to estimate above-ground biomass using a regression model in two permanent forest estates in Sabah, Malaysian Borneo. Tangkulap represented a conventionally logged forest estate while Deramakot has been managed in accordance with sustainable forestry principles. The results revealed that conventional logging techniques used in Tangkulap between 1991 and 2000 decreased the above-ground biomass by an average of 6.0 t C/ha per year (-5.2 to -7.0 t C/ha, 95% confidence interval), whereas the biomass in Deramakot increased by 6.1 t C/ha per year (5.3-7.2 t C/ha, 95% confidence interval) between 2000 and 2007 while under sustainable forest management. This indicates that sustainable forest management with reduced-impact logging helps to protect above-ground biomass. In absolute terms, a conservative amount of 10.5 t C/ha per year, as documented using the methodology developed in this study, can be attributed to the difference between the two management systems, which will be of interest when implementing REDD+ schemes that reward the enhancement of carbon stocks.

  18. Turbine blade tip durability analysis

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.

    1981-01-01

    An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis and life-prediction techniques in the life assessment of hot-section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle, and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.

  19. Generalized image contrast enhancement technique based on the Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1996-07-01

    This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
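
    The closing observation, that traditional histogram equalization is a special case of the proposed perception-based algorithm, can be illustrated with the classical procedure (a generic sketch, not the authors' Heinemann-model mapping):

```python
import numpy as np

def histogram_equalize(img: np.ndarray, levels: int = 256) -> np.ndarray:
    """Classical histogram equalization: remap gray levels through the
    normalized cumulative histogram so the output distribution of
    levels is approximately uniform."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]                                   # normalize to [0, 1]
    lut = np.round(cdf * (levels - 1)).astype(img.dtype)
    return lut[img]

# A dark 2x3 test image is stretched across the full gray-scale range
img = np.array([[0, 0, 1], [1, 2, 3]], dtype=np.uint8)
print(histogram_equalize(img).max())  # 255
```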

  20. Comparison of path visualizations and cognitive measures relative to travel technique in a virtual environment.

    PubMed

    Zanbaka, Catherine A; Lok, Benjamin C; Babu, Sabarish V; Ulinski, Amy C; Hodges, Larry F

    2005-01-01

    We describe a between-subjects experiment that compared four different methods of travel and their effect on cognition and paths taken in an immersive virtual environment (IVE). Participants answered a set of questions based on Crook's condensation of Bloom's taxonomy that assessed their cognition of the IVE with respect to knowledge, understanding and application, and higher mental processes. Participants also drew a sketch map of the IVE and the objects within it. The users' sense of presence was measured using the Steed-Usoh-Slater Presence Questionnaire. The participants' position and head orientation were automatically logged during their exposure to the virtual environment. These logs were later used to create visualizations of the paths taken. Path analysis, such as exploring the overlaid path visualizations and dwell data information, revealed further differences among the travel techniques. Our results suggest that, for applications where problem solving and evaluation of information are important or where the opportunity to train is minimal, having a large tracked space so that the participant can walk around the virtual environment provides benefits over common virtual travel techniques.

  1. Modal parameter identification using the log decrement method and band-pass filters

    NASA Astrophysics Data System (ADS)

    Liao, Yabin; Wells, Valana

    2011-10-01

    This paper presents a time-domain technique for identifying modal parameters of test specimens based on the log-decrement method. For lightly damped multidegree-of-freedom or continuous systems, the conventional method is usually restricted to identification of fundamental-mode parameters only. Implementation of band-pass filters makes it possible for the proposed technique to extract modal information of higher modes. The method has been applied to a polymethyl methacrylate (PMMA) beam for complex modulus identification in the frequency range 10-1100 Hz. Results compare well with those obtained using the Least Squares method, and with those previously published in the literature. The accuracy of the proposed method was further verified by experiments performed on a QuietSteel specimen with very low damping. The method is simple and fast. It can be used for a quick estimation of the modal parameters, or as a complementary approach for validation purposes.
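
    The core log-decrement computation can be sketched as follows; the band-pass filtering step is omitted and a clean single-mode decay is synthesized instead (all values illustrative):

```python
import math

# Log decrement from successive positive peaks of a free decay:
# delta = ln(x_i / x_{i+1}); damping ratio zeta = delta / sqrt(4*pi^2 + delta^2).

def log_decrement(peaks):
    deltas = [math.log(a / b) for a, b in zip(peaks, peaks[1:])]
    delta = sum(deltas) / len(deltas)
    return delta / math.sqrt(4 * math.pi**2 + delta**2)

# Synthetic single-mode decay with known zeta = 0.02 at fn = 10 Hz.
zeta, fn = 0.02, 10.0
wn = 2 * math.pi * fn
Td = 2 * math.pi / (wn * math.sqrt(1 - zeta**2))   # damped period
peaks = [math.exp(-zeta * wn * i * Td) for i in range(5)]
print(round(log_decrement(peaks), 6))              # recovers 0.02
```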

  2. Parallel compression of data chunks of a shared data object using a log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-10-25

    Techniques are provided for parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst-buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ log-structured file techniques. The compressed data chunk can be decompressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node and storing the compressed version of the data chunk to the shared data object on the storage node.
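
    A minimal sketch of the idea, with invented names rather than the patent's actual API: the client compresses each chunk, and the storage side appends compressed chunks log-style with an index for later reads:

```python
import zlib

# Invented names, not the patent's API: an append-only store keeps an
# index of (logical_offset, physical_offset, compressed_length).

class LogStructuredObject:
    def __init__(self):
        self.data = bytearray()   # append-only, log-structured store
        self.index = []

    def write_chunk(self, logical_offset, chunk):
        comp = zlib.compress(chunk)               # client-side compression
        self.index.append((logical_offset, len(self.data), len(comp)))
        self.data += comp

    def read_chunk(self, logical_offset):
        for lo, po, n in self.index:
            if lo == logical_offset:              # locate, then decompress
                return zlib.decompress(bytes(self.data[po:po + n]))
        raise KeyError(logical_offset)

obj = LogStructuredObject()
obj.write_chunk(0, b"A" * 4096)
obj.write_chunk(4096, b"B" * 4096)
```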

  3. Measurement of stiffness of standing trees and felled logs using acoustics: A review.

    PubMed

    Legg, Mathew; Bradley, Stuart

    2016-02-01

    This paper provides a review on the use of acoustics to measure stiffness of standing trees, stems, and logs. An outline is given of the properties of wood and how these are related to stiffness and acoustic velocity throughout the tree. Factors are described that influence the speed of sound in wood, including the different types of acoustic waves which propagate in tree stems and lumber. Acoustic tools and techniques that have been used to measure the stiffness of wood are reviewed. The reasons for a systematic difference between direct and acoustic measurements of stiffness for standing trees, and methods for correction, are discussed. Other techniques, which have been used in addition to acoustics to try to improve stiffness measurements, are also briefly described. Also reviewed are studies which have used acoustic tools to investigate factors that influence the stiffness of trees. These factors include different silvicultural practices, geographic and environmental conditions, and genetics.
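
    The acoustic measurement rests on the plane-wave relation E = ρv² between stiffness, green density, and acoustic velocity; a one-line helper with illustrative numbers:

```python
# E = rho * v^2: dynamic modulus of elasticity from green density and
# acoustic velocity. Input values below are illustrative only.

def dynamic_moe_gpa(density_kg_m3, velocity_m_s):
    return density_kg_m3 * velocity_m_s**2 / 1e9

print(dynamic_moe_gpa(1000.0, 3500.0))   # 12.25 GPa
```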

  4. Fast projection/backprojection and incremental methods applied to synchrotron light tomographic reconstruction.

    PubMed

    de Lima, Camila; Salomão Helou, Elias

    2018-01-01

    Iterative methods for tomographic image reconstruction have the computational cost of each iteration dominated by the computation of the (back)projection operator, which takes roughly O(N³) floating-point operations (flops) for N × N pixel images. Furthermore, classical iterative algorithms may take too many iterations to achieve acceptable images, making these techniques impractical for high-resolution images. Techniques have been developed in the literature to reduce the computational cost of the (back)projection operator to O(N² log N) flops. Also, incremental algorithms have been devised that reduce by an order of magnitude the number of iterations required to achieve acceptable images. The present paper introduces an incremental algorithm with a cost of O(N² log N) flops per iteration and applies it to the reconstruction of very large tomographic images obtained from synchrotron-light illuminated data.
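
    The two quoted costs can be compared directly; constants are omitted, so the ratios below are only the asymptotic speedup factors:

```python
import math

# Asymptotic flop counts quoted above, constants omitted: the classical
# (back)projection costs ~N^3, the fast variants ~N^2 log N.
for N in (256, 1024, 4096):
    classical = N**3
    fast = N**2 * math.log2(N)
    print(N, classical / fast)   # speedup grows like N / log N
```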

  5. Evaluation of two immunomagnetic separation techniques for the detection and recovery of E. coli O157:H7 from finished composts

    USDA-ARS?s Scientific Manuscript database

    Two rapid immunomagnetic separation (IMS) protocols were evaluated to recover 1-2 log CFU/g inoculated E. coli O157:H7 from 30 different commercial, finished compost samples. Both protocols detected E. coli O157:H7 in compost samples; PCR techniques required the removal of inhibitors to reduce poss...

  6. Recovery Act Validation of Innovative Exploration Techniques Pilgrim Hot Springs, Alaska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holdmann, Gwen

    2015-04-30

    Drilling and temperature logging campaigns between the late 1970s and early 1980s measured temperatures at Pilgrim Hot Springs in excess of 90°C. Between 2010 and 2014, the University of Alaska used a variety of methods, including geophysical surveys, remote sensing techniques, heat budget modeling, and additional drilling, to better understand the resource and estimate the available geothermal energy.

  7. Identification of Hypertension Management-related Errors in a Personal Digital Assistant-based Clinical Log for Nurses in Advanced Practice Nurse Training.

    PubMed

    Lee, Nam-Ju; Cho, Eunhee; Bakken, Suzanne

    2010-03-01

    The purposes of this study were to develop a taxonomy for detection of errors related to hypertension management and to apply the taxonomy to retrospectively analyze the documentation of nurses in Advanced Practice Nurse (APN) training. We developed the Hypertension Diagnosis and Management Error Taxonomy and applied it in a sample of adult patient encounters (N = 15,862) that were documented in a personal digital assistant-based clinical log by registered nurses in APN training. We used Structured Query Language (SQL) queries to retrieve hypertension-related data from the central database. The data were summarized using descriptive statistics. Blood pressure was documented in 77.5% (n = 12,297) of encounters; 21% had high blood pressure values. Missed diagnosis, incomplete diagnosis and misdiagnosis rates were 63.7%, 6.8% and 7.5%, respectively. In terms of treatment, the omission rates were 17.9% for essential medications and 69.9% for essential patient teaching. Contraindicated anti-hypertensive medications were documented in 12% of encounters with co-occurring diagnoses of hypertension and asthma. The Hypertension Diagnosis and Management Error Taxonomy was useful for identifying errors based on documentation in a clinical log. The results provide an initial understanding of the nature of errors associated with hypertension diagnosis and management of nurses in APN training. The information gained from this study can contribute to educational interventions that promote APN competencies in identification and management of hypertension as well as overall patient safety and informatics competencies. Copyright © 2010 Korean Society of Nursing Science. All rights reserved.
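
    The retrieval step can be sketched with an in-memory database; the table and column names here are invented, since the clinical log's real schema is not given in the abstract:

```python
import sqlite3

# Illustrative sketch of the SQL retrieval step. Table and column names
# are invented; the clinical log's real schema is not given.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE encounters (id INTEGER, systolic INTEGER, "
            "diastolic INTEGER, dx TEXT)")
con.executemany("INSERT INTO encounters VALUES (?, ?, ?, ?)",
                [(1, 150, 95, None),     # high BP, no diagnosis documented
                 (2, 118, 76, None),     # normal BP
                 (3, 162, 100, "HTN")])  # high BP, hypertension documented

# Encounters with high blood pressure (>=140 systolic) but no documented
# diagnosis: candidate "missed diagnosis" errors under the taxonomy.
missed = con.execute("SELECT id FROM encounters "
                     "WHERE systolic >= 140 AND dx IS NULL").fetchall()
```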

  8. Is uniportal thoracoscopic surgery a feasible approach for advanced stages of non-small cell lung cancer?

    PubMed Central

    Fieira, Eva; Delgado, Maria; Mendez, Lucía; Fernandez, Ricardo; de la Torre, Mercedes

    2014-01-01

    Objectives: Conventional video-assisted thoracoscopic (VATS) lobectomy for advanced lung cancer is a feasible and safe surgery in experienced centers. The aim of this study is to assess the feasibility of the uniportal VATS approach in the treatment of advanced non-small cell lung cancer (NSCLC) and compare the perioperative outcomes and survival with those in early-stage tumors operated through the uniportal approach. Methods: From June 2010 to December 2012, we performed 163 uniportal VATS major pulmonary resections. Only NSCLC cases were included in this study (130 cases). Patients were divided into two groups: (A) early stage and (B) advanced cases (>5 cm, T3 or T4, or tumors requiring neoadjuvant treatment). A descriptive and retrospective study was performed, comparing perioperative outcomes and survival obtained in both groups. A survival analysis was performed with Kaplan-Meier curves, and the log-rank test was used to compare survival between patients with early and advanced stages. Results: A total of 130 cases were included in the study: 87 (A) vs. 43 (B) patients (conversion rate 1.1 vs. 6.5%, P=0.119). Mean global age was 64.9 years and 73.8% were men. The patient demographic data were similar in both groups. Upper lobectomies (A, 52 vs. B, 21 patients) and anatomic segmentectomies (A, 4 vs. B, 0) were more frequent in group A, while pneumonectomy was more frequent in B (A, 1 vs. B, 6 patients). Surgical time was longer (144.9±41.3 vs. 183.2±48.9, P<0.001) and median number of lymph nodes statistically higher (14 vs. 16, P=0.004) in advanced cases. Median number of nodal stations (5 vs. 5, P=0.165), days of chest tube (2 vs. 2, P=0.098), HOS (3 vs. 3, P=0.072), and rate of complications (17.2% vs. 14%, P=0.075) were similar in both groups. One patient died on the 58th postoperative day.
The 30-month survival rate was 90% for the early-stage group and 74% for advanced cases. Conclusions: Uniportal VATS lobectomy for advanced cases of NSCLC is a safe and reliable procedure that provides perioperative outcomes similar to those obtained in early-stage tumours operated through this same technique. Further long-term survival analyses are ongoing on a large number of patients. PMID:24976985
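
    The Kaplan-Meier estimator used in the analysis above can be sketched in a few lines (toy follow-up data, not the study's):

```python
# Minimal Kaplan-Meier sketch: S(t) drops by a factor (1 - d/n) at each
# event time, with d events among n subjects still at risk. Toy data.

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = death, 0 = censored."""
    pairs = sorted(zip(times, events))
    surv, curve = 1.0, []
    at_risk = len(pairs)
    for t, e in pairs:
        if e:
            surv *= 1 - 1 / at_risk
            curve.append((t, surv))
        at_risk -= 1
    return curve

curve = kaplan_meier([5, 12, 20, 30, 30, 30], [1, 1, 0, 0, 0, 0])
```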

  9. Boulder-Faced Log Dams and other Alternatives for Gabion Check Dams in First-Order Ephemeral Streams with Coarse Bed Load in Ethiopia

    NASA Astrophysics Data System (ADS)

    Nyssen, Jan; Gebreslassie, Seifu; Assefa, Romha; Deckers, Jozef; Guyassa, Etefa; Poesen, Jean; Frankl, Amaury

    2017-04-01

    Many thousands of gabion check dams have been installed to control gully erosion in Ethiopia, but several challenges remain, such as gabion failure in ephemeral streams whose coarse bed load abrades the gabions at the chute step. As an alternative to gabion check dams in torrents with coarse bed load, boulder-faced log dams were conceived, installed transversally across torrents, and tested (n = 30). For this, logs (22-35 cm across) were embedded in the banks of torrents, 0.5-1 m above the bed, and their upstream sides were faced with boulders (0.3-0.7 m across). Similar to gabion check dams, boulder-faced log dams lead to temporary ponding, spreading of peak flow over the entire channel width, and sediment deposition. Results of testing under extreme flow conditions (including two storms with return periods of 5.6 and 7 years) show that 18 dams resisted strong floods. Beyond certain flood thresholds (represented by proxies such as Strahler's stream order, catchment area, D95, or channel width), 11 log dams were completely destroyed. Smallholder farmers see much potential in this type of structure to control first-order torrents with coarse bed load, since the technique is cost-effective and can be easily installed.

  10. Eliminating the rugosity effect from compensated density logs by geometrical response matching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flaum, C.; Holenka, J.M.; Case, C.R.

    1991-06-01

    A theoretical and experimental effort to understand the effects of borehole rugosity on individual detector responses yielded an improved method of processing compensated density logs. Historically, the spine-and-ribs technique for obtaining borehole and mudcake compensation of dual-detector, gamma-gamma density logs has been very successful as long as the borehole and other environmental effects vary slowly with depth and the interest is limited to vertical features broader than several feet. With the increased interest in higher vertical resolution, a more detailed analysis of the effect of such quickly varying environmental effects as rugosity was required. A laboratory setup simulating the effect of rugosity on Schlumberger Litho-Density(SM) tools (LDT) was used to study vertical response in the presence of rugosity. The data served as the benchmark for the Monte Carlo models used to generate synthetic density logs in the presence of more complex rugosity patterns. The results provided in this paper show that proper matching of the two detector responses before application of conventional compensation methods can eliminate rugosity effects without degrading the measurement's vertical resolution. The accuracy of the results is as good as that obtained for a parallel mudcake or standoff with the conventional method. Application to both field and synthetic logs confirmed the validity of these results.

  11. Explorations in statistics: the log transformation.

    PubMed

    Curran-Everett, Douglas

    2018-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This thirteenth installment of Explorations in Statistics explores the log transformation, an established technique that rescales the actual observations from an experiment so that the assumptions of some statistical analysis are better met. A general assumption in statistics is that the variability of some response Y is homogeneous across groups or across some predictor variable X. If the variability-the standard deviation-varies in rough proportion to the mean value of Y, a log transformation can equalize the standard deviations. Moreover, if the actual observations from an experiment conform to a skewed distribution, then a log transformation can make the theoretical distribution of the sample mean more consistent with a normal distribution. This is important: the results of a one-sample t test are meaningful only if the theoretical distribution of the sample mean is roughly normal. If we log-transform our observations, then we want to confirm the transformation was useful. We can do this if we use the Box-Cox method, if we bootstrap the sample mean and the statistic t itself, and if we assess the residual plots from the statistical model of the actual and transformed sample observations.
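
    The variance-stabilizing effect described above is easy to demonstrate: two toy groups whose spread is proportional to their level have very different raw standard deviations but nearly equal ones after a log transform:

```python
import math
import statistics

# Two toy groups with the same relative spread but a 10x level difference.
a = [10, 12, 15, 11, 13]
b = [100, 120, 150, 110, 130]

sd_ratio_raw = statistics.stdev(b) / statistics.stdev(a)
log_a = [math.log(x) for x in a]
log_b = [math.log(x) for x in b]
sd_ratio_log = statistics.stdev(log_b) / statistics.stdev(log_a)

print(round(sd_ratio_raw, 2), round(sd_ratio_log, 2))   # ~10.0 vs ~1.0
```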

  12. Application of Nuclear Well Logging Techniques to Lunar Resource Assessment

    NASA Technical Reports Server (NTRS)

    Albats, P.; Groves, J.; Schweitzer, J.; Tombrello, T.

    1992-01-01

    The use of neutron and gamma ray measurements for the analysis of material composition has become well established in the last 40 years. Schlumberger has pioneered the use of this technology for logging wells drilled to produce oil and gas, and for this purpose has developed neutron generators that allow measurements to be made in deep (5000 m) boreholes under adverse conditions. We also make ruggedized neutron and gamma ray detector packages that can be used to make reliable measurements on the drill collar of a rotating drill string while the well is being drilled, where the conditions are severe. Modern nuclear methods used in logging measure rock formation parameters like bulk density and porosity, fluid composition, and element abundances by weight including hydrogen concentration. The measurements are made with high precision and accuracy. These devices (well logging sondes) share many of the design criteria required for remote sensing in space; they must be small, light, rugged, and able to perform reliably under adverse conditions. We see a role for the adaptation of this technology to lunar or planetary resource assessment missions.

  13. Log-Based Recovery in Asynchronous Distributed Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kane, Kenneth Paul

    1989-01-01

    A log-based mechanism is described for restoring consistent states to replicated data objects after failures. The focus of this report is on preserving a causal form of consistency based on the notion of virtual time. Causal consistency has been shown to apply to a variety of applications, including distributed simulation, task decomposition, and mail delivery systems. Several mechanisms have been proposed for implementing causally consistent recovery, most notably those of Strom and Yemini, and of Johnson and Zwaenepoel. The mechanism proposed here differs from these in two major respects. First, it implements a roll-forward style of recovery: a functioning process is never required to roll back its state in order to achieve consistency with a recovering process. Second, the mechanism does not require any explicit information about the causal dependencies between updates. Instead, all necessary dependency information is inferred from the orders in which updates are logged by the object servers. This basic recovery technique appears to be applicable to forms of consistency other than causal consistency. In particular, it is shown how the recovery technique can be modified to support an atomic form of consistency (grouping consistency). By combining grouping consistency with causal consistency, it may even be possible to implement serializable consistency within this mechanism.
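
    The roll-forward idea can be caricatured in a few lines: the server's log order stands in for explicit dependency metadata, and recovery simply replays unapplied updates in that order (names and log format are illustrative, not the thesis's notation):

```python
# Roll-forward recovery sketch: updates are applied in the order they
# appear in the server's log; nothing is ever rolled back.

def recover(replica_state, server_log, applied):
    for seq, key, value in server_log:
        if seq not in applied:          # skip updates already applied
            replica_state[key] = value
            applied.add(seq)
    return replica_state

server_log = [(1, "x", 10), (2, "y", 20), (3, "x", 30)]
state = recover({}, server_log, set())
```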

  14. Inference of strata separation and gas emission paths in longwall overburden using continuous wavelet transform of well logs and geostatistical simulation

    NASA Astrophysics Data System (ADS)

    Karacan, C. Özgen; Olea, Ricardo A.

    2014-06-01

    Prediction of potential methane emission pathways from various sources into active mine workings or sealed gobs from longwall overburden is important for controlling methane and for improving mining safety. The aim of this paper is to infer strata separation intervals and thus gas emission pathways from standard well log data. The proposed technique was applied to well logs acquired through the Mary Lee/Blue Creek coal seam of the Upper Pottsville Formation in the Black Warrior Basin, Alabama, using well logs from a series of boreholes aligned along a nearly linear profile. For this purpose, continuous wavelet transform (CWT) of digitized gamma well logs was performed by using Mexican hat and Morlet, as the mother wavelets, to identify potential discontinuities in the signal. Pointwise Hölder exponents (PHE) of gamma logs were also computed using the generalized quadratic variations (GQV) method to identify the location and strength of singularities of well log signals as a complementary analysis. PHEs and wavelet coefficients were analyzed to find the locations of singularities along the logs. Using the well logs in this study, locations of predicted singularities were used as indicators in single normal equation simulation (SNESIM) to generate equi-probable realizations of potential strata separation intervals. Horizontal and vertical variograms of realizations were then analyzed and compared with those of indicator data and training image (TI) data using the Kruskal-Wallis test. A sum of squared differences was employed to select the most probable realization representing the locations of potential strata separations and methane flow paths. Results indicated that singularities located in well log signals reliably correlated with strata transitions or discontinuities within the strata. Geostatistical simulation of these discontinuities provided information about the location and extents of the continuous channels that may form during mining. 
If there is a gas source within their zone of influence, paths may develop and allow methane movement towards sealed or active gobs under pressure differentials. Knowledge gained from this research will better prepare mine operations for potential methane inflows, thus improving mine safety.

  15. Cluster analysis and quality assessment of logged water at an irrigation project, eastern Saudi Arabia.

    PubMed

    Hussain, Mahbub; Ahmed, Syed Munaf; Abderrahman, Walid

    2008-01-01

    A multivariate statistical technique, cluster analysis, was used to assess the logged surface water quality at an irrigation project at Al-Fadhley, Eastern Province, Saudi Arabia. The principal idea behind using the technique was to utilize all available hydrochemical variables in the quality assessment including trace elements and other ions which are not considered in conventional techniques for water quality assessments like Stiff and Piper diagrams. Furthermore, the area belongs to an irrigation project where water contamination associated with the use of fertilizers, insecticides and pesticides is expected. This quality assessment study was carried out on a total of 34 surface/logged water samples. To gain a greater insight in terms of the seasonal variation of water quality, 17 samples were collected from both summer and winter seasons. The collected samples were analyzed for a total of 23 water quality parameters including pH, TDS, conductivity, alkalinity, sulfate, chloride, bicarbonate, nitrate, phosphate, bromide, fluoride, calcium, magnesium, sodium, potassium, arsenic, boron, copper, cobalt, iron, lithium, manganese, molybdenum, nickel, selenium, mercury and zinc. Cluster analysis in both Q and R modes was used. Q-mode analysis resulted in three distinct water types for both the summer and winter seasons. Q-mode analysis also showed the spatial as well as temporal variation in water quality. R-mode cluster analysis led to the conclusion that there are two major sources of contamination for the surface/shallow groundwater in the area: fertilizers, micronutrients, pesticides, and insecticides used in agricultural activities, and non-point natural sources.

  16. A log-linear model approach to estimation of population size using the line-transect sampling method

    USGS Publications Warehouse

    Anderson, D.R.; Burnham, K.P.; Crain, B.R.

    1978-01-01

    The technique of estimating wildlife population size and density using the belt or line-transect sampling method has been used in many past projects, such as the estimation of density of waterfowl nestling sites in marshes, and is being used currently in such areas as the assessment of Pacific porpoise stocks in regions of tuna fishing activity. A mathematical framework for line-transect methodology has only emerged in the last 5 yr. In the present article, we extend this mathematical framework to a line-transect estimator based upon a log-linear model approach.
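
    The basic line-transect estimator that the log-linear model generalizes is D = n / (2·L·w_eff), where w_eff is the effective strip half-width estimated from the detection-distance model; a sketch with illustrative numbers:

```python
# D = n / (2 * L * w_eff): density when n objects are detected along
# total transect length L with effective half-width w_eff. Numbers
# below are illustrative only.

def density(n_detections, length_km, eff_half_width_km):
    return n_detections / (2 * length_km * eff_half_width_km)

d_hat = density(48, 120.0, 0.1)   # 48 detections, 120 km, 100 m half-width
print(d_hat)                      # 2.0 objects per km^2
```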

  17. [Investigation of Elekta linac characteristics for VMAT].

    PubMed

    Luo, Guangwen; Zhang, Kunyi

    2012-01-01

    The aim of this study is to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and dose rate, gantry speed, and MLC leaf speed were analyzed from log files. Results showed that delivery switched among six discrete dose-rate levels, while gantry and MLC leaf speeds varied dynamically during delivery. The VMAT technique requires the linac to control more parameters dynamically, and these key dynamic variables during VMAT delivery can be checked from log files. A quality assurance procedure should be carried out for the VMAT-related parameters.
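
    The log-file check described above can be sketched as follows; the record format and dose-rate levels here are invented for illustration and do not reflect actual Elekta log formats:

```python
# Invented per-control-point record format "cp,gantry_angle,dose_rate";
# real Elekta log formats differ. The six allowed levels are likewise
# illustrative stand-ins for the linac's discrete dose rates.
ALLOWED_DOSE_RATES = {25, 50, 100, 200, 400, 600}   # MU/min

def check_log(lines):
    bad = []
    for line in lines:
        cp, _gantry, dose_rate = line.split(",")
        if int(dose_rate) not in ALLOWED_DOSE_RATES:
            bad.append(int(cp))
    return bad

log = ["1,180.0,400", "2,178.2,200", "3,176.5,150"]
bad_cps = check_log(log)    # control point 3 used an unsupported rate
```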

  18. Analytic Simulation of the Elastic Waves Propagation in the Neighborhood of Fluid Filled Wells with Monopole Sources

    NASA Astrophysics Data System (ADS)

    Ávila-Carrera, R.; Sánchez-Sesma, F. J.; Spurlin, James H.; Valle-Molina, C.; Rodríguez-Castellanos, A.

    2014-09-01

    An analytic formulation to understand the scattering, diffraction and attenuation of elastic waves at the neighborhood of fluid filled wells is presented. An important, and not widely exploited, technique to carefully investigate the wave propagation in exploration wells is the logging of sonic waveforms. Fundamental decisions and production planning in petroleum reservoirs are made by interpretation of such recordings. Nowadays, geophysicists and engineers face problems related to the acquisition and interpretation under complex conditions associated with conducting open-hole measurements. A crucial problem that directly affects the response of sonic logs is the eccentricity of the measuring tool with respect to the center of the borehole. Even with the employment of centralizers, this simple variation, dramatically changes the physical conditions on the wave propagation around the well. Recent works in the numerical field reported advanced studies in modeling and simulation of acoustic wave propagation around wells, including complex heterogeneities and anisotropy. However, no analytical efforts have been made to formally understand the wireline sonic logging measurements acquired with borehole-eccentered tools. In this paper, the Graf's addition theorem was used to describe monopole sources in terms of solutions of the wave equation. The formulation was developed from the three-dimensional discrete wave-number method in the frequency domain. The cylindrical Bessel functions of the third kind and order zero were re-derived to obtain a simplified set of equations projected into a bi-dimensional plane-space for displacements and stresses. This new and condensed analytic formulation allows the straightforward calculation of all converted modes and their visualization in the time domain via Fourier synthesis. 
The main aim was to obtain spectral surfaces of transfer functions and synthetic seismograms that might be useful to understand the wave motion produced by the eccentricity of the source and explain in detail the new arising borehole propagation modes. Finally, time histories and amplitude spectra for relevant examples are presented and the validation of time traces using the spectral element method is reported.

  19. Acceptability of a Mobile Phone App for Measuring Time Use in Breast Cancer Survivors (Life in a Day): Mixed-Methods Study.

    PubMed

    Ainsworth, Matthew Cole; Pekmezi, Dori; Bowles, Heather; Ehlers, Diane; McAuley, Edward; Courneya, Kerry S; Rogers, Laura Q

    2018-05-14

    Advancements in mobile technology allow innovative data collection techniques such as measuring time use (ie, how individuals structure their time) for the purpose of improving health behavior change interventions. The aim of this study was to examine the acceptability of a 5-day trial of the Life in a Day mobile phone app measuring time use in breast cancer survivors to advance technology-based measurement of time use. Acceptability data were collected from participants (N=40; 100% response rate) using a self-administered survey after 5 days of Life in a Day use. Overall, participants had a mean age of 55 years (SD 8) and completed 16 years of school (SD 2). Participants generally agreed that learning to use Life in a Day was easy (83%, 33/40) and would prefer to log activities using Life in a Day over paper-and-pencil diary (73%, 29/40). A slight majority felt that completing Life in a Day for 5 consecutive days was not too much (60%, 24/40) or overly time-consuming (68%, 27/40). Life in a Day was rated as easy to read (88%, 35/40) and navigate (70%, 32/40). Participants also agreed that it was easy to log activities using the activity timer at the start and end of an activity (90%, 35/39). Only 13% (5/40) downloaded the app on their personal phone, whereas 63% (19/30) of the remaining participants would have preferred to use their personal phone. Overall, 77% (30/39) of participants felt that the Life in a Day app was good or very good. Those who agreed that it was easy to edit activities were significantly more likely to be younger when compared with those who disagreed (mean 53 vs 58 years, P=.04). Similarly, those who agreed that it was easy to remember to log activities were more likely to be younger (mean 52 vs 60 years, P<.001). Qualitative coding of 2 open-ended survey items yielded 3 common themes for Life in a Day improvement (ie, convenience, user interface, and reminders). A mobile phone app is an acceptable time-use measurement modality. 
Improving convenience, user interface, and memory prompts while addressing the needs of older participants is needed to enhance app utility. ClinicalTrials.gov NCT00929617; https://clinicaltrials.gov/ct2/show/NCT00929617 (Archived by WebCite at http://www.webcitation.org/6z2bZ4P7X). ©Matthew Cole Ainsworth, Dori Pekmezi, Heather Bowles, Diane Ehlers, Edward McAuley, Kerry S Courneya, Laura Q Rogers. Originally published in JMIR Cancer (http://cancer.jmir.org), 14.05.2018.

  20. Retinal nerve fiber layer thickness analysis in suspected malingerers with optic disc temporal pallor

    PubMed Central

    Civelekler, Mustafa; Halili, Ismail; Gundogan, Faith C; Sobaci, Gungor

    2009-01-01

    Purpose: To investigate the value of temporal retinal nerve fiber layer (RNFLtemporal) thickness in the prediction of malingering. Materials and Methods: This prospective, cross-sectional study was conducted on 33 military conscripts with optic disc temporal pallor (ODTP) and 33 age- and sex-matched healthy controls. Initial visual acuity (VAi) and visual acuity after simulation examination techniques (VAaset) were assessed. The subjects whose VAaset was two or more lines higher than VAi were determined as malingerers. Thickness of the peripapillary RNFL was determined with OCT (Stratus OCT™, Carl Zeiss Meditec, Inc.). RNFLtemporal thickness of each subject was categorized into one of the 1+ to 4+ groups according to the 50% confidence interval (CI), 25% CI and 5% CI values assessed in the control group. The VAs were converted to LogMAR-VAs for statistical comparisons. Results: A significant difference was found only in the temporal quadrant of RNFL thickness in subjects with ODTP (P=0.002). Mean LogMAR-VA increased significantly after SETs (P<0.001). Sensitivity, specificity, positive and negative predictive values of categorized RNFLtemporal thickness in diagnosing malingering were 84.6%, 75.0%, 68.8%, and 88.2%, respectively. The ROC curve showed that an RNFLtemporal thickness of 67.5 μm is a significant cut-off point in determining malingering (P=0.001, area under the curve: 0.862). The correlations between LogMAR-VAs and RNFLtemporal thicknesses were significant; the correlation coefficient for LogMAR-VAi was lower than the correlation for LogMAR-VAaset (r=−0.447, P=0.009 for LogMAR-VAi; r=−0.676, P<0.001 for LogMAR-VAaset). Conclusions: RNFLtemporal thickness assessment may be a valuable tool in objectively determining malingering in subjects with ODTP. PMID:19700875
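
    The four figures quoted (sensitivity, specificity, PPV, NPV) all derive from a single 2×2 table; a small helper, with toy counts chosen only to reproduce the quoted rates, since the study's actual table is not given:

```python
def diagnostics(tp, fn, fp, tn):
    """Standard 2x2-table diagnostic metrics."""
    return {
        "sensitivity": tp / (tp + fn),   # fraction of malingerers detected
        "specificity": tn / (tn + fp),   # fraction of non-malingerers cleared
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Toy counts, not the study's data; they merely reproduce the quoted rates.
d = diagnostics(tp=22, fn=4, fp=10, tn=30)
```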

  1. Depth optimal sorting networks resistant to k passive faults

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piotrow, M.

    In this paper, we study the problem of constructing a sorting network that is tolerant to faults and whose running time (i.e. depth) is as small as possible. We consider the scenario of worst-case comparator faults and follow the model of passive comparator failure proposed by Yao and Yao, in which a faulty comparator outputs its inputs directly, without comparison. Our main result is the first construction of an N-input, k-fault-tolerant sorting network that is of an asymptotically optimal depth Θ(log N + k). That improves over the recent result of Leighton and Ma, whose network is of depth O(log N + k log log N / log k). Actually, we present a fault-tolerant correction network that can be added after any N-input sorting network to correct its output in the presence of at most k faulty comparators. Since the depth of the network is O(log N + k) and the constants hidden behind the "O" notation are not big, the construction can be of practical use. Developing the techniques necessary to show the main result, we construct a fault-tolerant network for the insertion problem. As a by-product, we get an N-input, O(log N)-depth INSERT-network that is tolerant to random faults, thereby answering a question posed by Ma in his PhD thesis. The results are based on a new notion of constant delay comparator networks, that is, networks in which each register is used (compared) only in a period of time of a constant length. Copies of such networks can be put one after another with only a constant increase in depth per copy.
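
    The passive-fault model is easy to simulate: a faulty comparator simply passes its inputs through unchanged. A minimal sketch; naive (k+1)-fold replication, shown here for k = 1, tolerates passive faults but multiplies depth, whereas the paper's contribution is achieving tolerance within Θ(log N + k) depth:

```python
def apply_network(network, values, faulty=frozenset()):
    """Run a comparator network on a list. Comparators whose index is in
    `faulty` are passive (Yao-Yao model): they output their inputs unchanged."""
    v = list(values)
    for idx, (i, j) in enumerate(network):
        if idx not in faulty and v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

# A 3-input sorting network; a single passive fault can break it
net = [(0, 1), (1, 2), (0, 1)]
# Naive k = 1 tolerance: repeat every comparator twice (depth doubles)
net_k1 = [c for c in net for _ in range(2)]
```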

  2. Accuracy and precision of Legionella isolation by US laboratories in the ELITE program pilot study.

    PubMed

    Lucas, Claressa E; Taylor, Thomas H; Fields, Barry S

    2011-10-01

    A pilot study for the Environmental Legionella Isolation Techniques Evaluation (ELITE) Program, a proficiency testing scheme for US laboratories that culture Legionella from environmental samples, was conducted September 1, 2008 through March 31, 2009. Participants (n=20) processed panels consisting of six sample types: pure and mixed positive, pure and mixed negative, pure and mixed variable. The majority (93%) of all samples (n=286) were correctly characterized, with 88.5% of positive samples and 100% of negative samples identified correctly. Variable samples were incorrectly identified as negative in 36.9% of reports. For all samples reported positive (n=128), participants underestimated the cfu/ml by a mean of 1.25 logs with a standard deviation of 0.78 logs, a standard error of 0.07 logs, and a range of 3.57 logs compared to the CDC re-test value. Centering results around the interlaboratory mean yielded a standard deviation of 0.65 logs, a standard error of 0.06 logs, and a range of 3.22 logs. Sampling protocol, treatment regimen, culture procedure, and laboratory experience did not significantly affect the accuracy or precision of reported concentrations. Qualitative and quantitative results from the ELITE pilot study were similar to reports from a corresponding proficiency testing scheme available in the European Union, indicating these results are probably valid for most environmental laboratories worldwide. The large enumeration error observed suggests that the need for remediation of a water system should not be determined solely by the concentration of Legionella observed in a sample, since that value is likely to underestimate the true level of contamination. Published by Elsevier Ltd.
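
    Enumeration error in "logs" is just a difference of base-10 logarithms. A minimal sketch; the example counts are illustrative, not values from the study:

```python
import math

def log_error(reported_cfu_ml, reference_cfu_ml):
    """Enumeration error in log10 units; negative values are underestimates."""
    return math.log10(reported_cfu_ml) - math.log10(reference_cfu_ml)

# e.g. reporting 100 cfu/ml against a hypothetical re-test value of 1780 cfu/ml
err = log_error(100, 1780)   # roughly the pilot study's mean bias of -1.25 logs
```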

  3. Fractures, stress and fluid flow prior to stimulation of well 27-15, Desert Peak, Nevada, EGS project

    USGS Publications Warehouse

    Davatzes, Nicholas C.; Hickman, Stephen H.

    2009-01-01

    A suite of geophysical logs has been acquired for structural, fluid flow and stress analysis of well 27-15 in the Desert Peak Geothermal Field, Nevada, in preparation for stimulation and development of an Enhanced Geothermal System (EGS). Advanced Logic Technologies Borehole Televiewer (BHTV) and Schlumberger Formation MicroScanner (FMS) image logs reveal extensive drilling-induced tensile fractures, showing that the current minimum compressive horizontal stress, Shmin, in the vicinity of well 27-15 is oriented along an azimuth of 114±17°. This orientation is consistent with the dip direction of recently active normal faults mapped at the surface and with extensive sets of fractures and some formation boundaries seen in the BHTV and FMS logs. Temperature and spinner flowmeter surveys reveal several minor flowing fractures that are well oriented for normal slip, although overall permeability in the well is quite low. These results indicate that well 27-15 is a viable candidate for EGS stimulation and complement research by other investigators, including cuttings analysis, a reflection seismic survey, pressure transient and tracer testing, and micro-seismic monitoring.

  4. Application of random seismic inversion method based on tectonic model in thin sand body research

    NASA Astrophysics Data System (ADS)

    Dianju, W.; Jianghai, L.; Qingkai, F.

    2017-12-01

    Oil and gas exploitation in the Songliao Basin, Northeast China, has already progressed to a period of high water production. Previous detailed reservoir descriptions based on seismic images, sediment cores and borehole logging have great limitations for small-scale structural interpretation and thin sand body characterization, so precise guidance for petroleum exploration badly needs a more advanced method. To this end, we derived a method of random seismic inversion constrained by a tectonic model. Combined with numerical simulation techniques, it can effectively improve the ability to depict thin sand bodies and credibly reduce the blindness of reservoir analysis, from the whole to the local and from the macroscopic to the microscopic. At the same time, it can reduce the limitations of the study under the constraints of different geological conditions of the reservoir and achieve a reasonably exact estimation of the effective reservoir. Based on this research, this paper has optimized the evaluation of regional effective reservoirs and the adjustment of productive locations, combined with practical exploration and development in the Aonan oil field.

  5. Shell-model method for Gamow-Teller transitions in heavy deformed odd-mass nuclei

    NASA Astrophysics Data System (ADS)

    Wang, Long-Jun; Sun, Yang; Ghorui, Surja K.

    2018-04-01

    A shell-model method for calculating Gamow-Teller (GT) transition rates in heavy deformed odd-mass nuclei is presented. The method is developed within the framework of the projected shell model. To meet the computational requirements when many multi-quasiparticle configurations are included in the basis, a numerical advancement based on the Pfaffian formula is introduced. With this new many-body technique, it becomes feasible to perform state-by-state calculations for the GT nuclear matrix elements of β-decay and electron-capture processes, including those at high excitation energies in heavy nuclei, which are usually deformed. The first results, β− decays of the well-deformed A = 153 neutron-rich nuclei, are shown as an example. The known log(ft) data corresponding to the B(GT−) decay rates of the ground state of 153Nd to the low-lying states of 153Pm are well described. It is further shown that the B(GT) distributions can have a strong dependence on the detailed microscopic structure of relevant states of both the parent and daughter nuclei.

  6. Removal of total and antibiotic resistant bacteria in advanced wastewater treatment by ozonation in combination with different filtering techniques.

    PubMed

    Lüddeke, Frauke; Heß, Stefanie; Gallert, Claudia; Winter, Josef; Güde, Hans; Löffler, Herbert

    2015-02-01

    Elimination of bacteria by ozonation in combination with charcoal or slow sand filtration for advanced sewage treatment, to improve the quality of treated sewage and to reduce the potential risk for human health of receiving surface waters, was investigated at pilot scale at the sewage treatment plant Eriskirch, Baden-Wuerttemberg/Germany. To determine the elimination of sewage bacteria, the inflowing and leaving wastewater of different treatment processes was analysed in a culture-based approach for its content of Escherichia coli, enterococci and staphylococci and their resistance against selected antibiotics over a period of 17 months. For enterococci, single species and their antibiotic resistances were identified. In comparison to the established flocculation filtration at Eriskirch, ozonation plus charcoal or sand filtration (pilot-scale) reduced the concentrations of total and antibiotic resistant E. coli, enterococci and staphylococci. However, antibiotic resistant E. coli and staphylococci apparently survived ozone treatment better than antibiotic sensitive strains. Neither vancomycin resistant enterococci nor methicillin resistant Staphylococcus aureus (MRSA) were detected. The decreased percentage of antibiotic resistant enterococci after ozonation may be explained by different ozone sensitivities of species: Enterococcus faecium and Enterococcus faecalis, which determined the resistance level, seemed to be more sensitive to ozone than other Enterococcus species. Overall, ozonation followed by charcoal or sand filtration led to 0.8-1.1 log-units lower concentrations of total and antibiotic resistant E. coli, enterococci and staphylococci, as compared to the respective concentrations in sewage treated by flocculation filtration alone. Thus, advanced wastewater treatment by ozonation plus charcoal or sand filtration after common sewage treatment is an effective tool for further elimination of microorganisms from sewage before discharge into surface waters.
Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Limited salvage logging effects on forest regeneration after moderate-severity windthrow

    Treesearch

    C.J. Peterson

    2008-01-01

    Recent conceptual advances address forest response to multiple disturbances within a brief time period, providing an ideal framework for examining the consequences of natural disturbances followed by anthropogenic management activities. The combination of two or more disturbances in a short period may produce "ecological surprises,"...

  8. Empirical Evaluation of Advanced Electromagnetic Induction Systems - Factors Affecting Classification Effectiveness in Challenging Geologic Environments

    DTIC Science & Technology

    2016-10-01

    Figure 2-2). The array structure is fabricated from PVC and Garolite fiberglass. The array is normally deployed on a set of wheels, resulting in a sensor...Low branches were cleared to 8 feet above ground to reduce obstruction of the RTS prism; large logs and fallen timber were not removed. CH2M also

  9. Effects of prescribed fire in a central Appalachian oak-hickory stand

    Treesearch

    G.W. Wendel; H. Clay Smith; H. Clay Smith

    1986-01-01

    A prescribed fire in a central Appalachian mixed hardwood stand caused considerable damage to the butt logs of many overstory trees. Although there were increases in the abundance and distribution of several species of hardwoods, advanced red and chestnut oaks were poorly distributed 5-years after burning. An abundance of striped maple and other shrubs in the...

  10. Course-Embedded Mentoring for First-Year Students: Melding Academic Subject Support with Role Modeling, Psycho-Social Support, and Goal Setting

    ERIC Educational Resources Information Center

    Henry, Jim; Bruland, Holly Huff; Sano-Franchini, Jennifer

    2011-01-01

    This article examines a mentoring initiative that embedded advanced students in first-year composition courses to mentor students to excel to the best of their abilities. Mentors attended all classes along with students and conducted many out-of-class individual conferences, documenting each of them using program-implemented work logs. Four hundred…

  11. Direct seeding experiments on the 1951 Forks Burn.

    Treesearch

    Elmer W. Shaw

    1953-01-01

    Late in the summer of 1951 the Port Angeles and Western Railroad fire (commonly called the Forks fire) killed more than a half billion board feet of timber. An area approximately 20 miles long and 2-1/2 miles wide, covering 32,668 acres, was burned. It included fine virgin timber, thrifty plantations, ranch lands, reproduction areas, advanced young growth, logged-off...

  12. Logging intensity impact on small oak seedling survival and growth on the Cumberland Plateau in northeastern Alabama

    Treesearch

    Callie J. Schweitzer; Daniel C. Dey

    2013-01-01

    Ground disturbance caused by forest harvest operations can negatively impact oak regeneration. On the Cumberland Plateau, for successful regeneration, managers often must rely on very small (less than a foot in height) oak advance reproduction that is susceptible to disturbance by harvesting equipment. Furthermore, sites on the Plateau top are often harvested when...

  13. Experimental study of main rotor tip geometry and tail rotor interactions in hover. Volume 2: Run log and tabulated data

    NASA Technical Reports Server (NTRS)

    Balch, D. T.; Lombardi, J.

    1985-01-01

    A model scale hover test was conducted in the Sikorsky Aircraft Model Rotor Hover Facility to identify and quantify the impact of the tail rotor on the demonstrated advantages of advanced geometry tip configurations. The existence of mutual interference between a hovering main rotor and a tail rotor was acknowledged in the test. The test was conducted using the Basic Model Test Rig and two scaled main rotor systems, one representing a 1/5.727 scale UH-60A BLACK HAWK and the other a 1/4.71 scale S-76. Eight alternate rotor tip configurations were tested, 3 on the BLACK HAWK rotor and 6 on the S-76 rotor. Four of these tips were then selected for testing in close proximity to an operating tail rotor (operating in both tractor and pusher modes) to determine if the performance advantages that could be obtained from the use of advanced geometry tips in a main rotor only environment would still exist in the more complex flow field involving a tail rotor. This volume contains the test run log and tabulated data.

  14. Pathogen Reduction in Human Plasma Using an Ultrashort Pulsed Laser

    PubMed Central

    Tsen, Shaw-Wei D.; Kingsley, David H.; Kibler, Karen; Jacobs, Bert; Sizemore, Sara; Vaiana, Sara M.; Anderson, Jeanne; Tsen, Kong-Thon; Achilefu, Samuel

    2014-01-01

    Pathogen reduction is a viable approach to ensure the continued safety of the blood supply against emerging pathogens. However, the currently licensed pathogen reduction techniques are ineffective against non-enveloped viruses such as hepatitis A virus, and they introduce chemicals with concerns of side effects which prevent their widespread use. In this report, we demonstrate the inactivation of both enveloped and non-enveloped viruses in human plasma using a novel chemical-free method, a visible ultrashort pulsed laser. We found that laser treatment resulted in 2-log, 1-log, and 3-log reductions in human immunodeficiency virus, hepatitis A virus, and murine cytomegalovirus in human plasma, respectively. Laser-treated plasma showed ≥70% retention for most coagulation factors tested. Furthermore, laser treatment did not alter the structure of a model coagulation factor, fibrinogen. Ultrashort pulsed lasers are a promising new method for chemical-free, broad-spectrum pathogen reduction in human plasma. PMID:25372037

  15. Cave Pearl Data Logger: A Flexible Arduino-Based Logging Platform for Long-Term Monitoring in Harsh Environments.

    PubMed

    Beddows, Patricia A; Mallon, Edward K

    2018-02-09

    A low-cost data logging platform is presented that provides long-term operation in remote or submerged environments. Three premade "breakout boards" from the open-source Arduino ecosystem are assembled into the core of the data logger. Power optimization techniques are presented which extend the operational life of this module-based design to >1 year on three alkaline AA batteries. Robust underwater housings are constructed for these loggers using PVC fittings. Both the logging platform and the enclosures are easy to build and modify without specialized tools or a significant background in electronics. This combination turns the Cave Pearl data logger into a generalized prototyping system, and this design flexibility is demonstrated with two field studies recording drip rates in a cave and water flow in a flooded cave system. This paper describes a complete DIY solution, suitable for a wide range of challenging deployment conditions.
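
    The multi-year battery-life claim can be sanity-checked with a simple duty-cycle energy budget. A minimal sketch; all currents, the usable capacity, and the sampling duty figures below are assumptions for illustration, not values from the paper:

```python
def battery_life_days(capacity_mah, sleep_ma, active_ma, active_s_per_hour):
    """Battery-life estimate from the duty-cycle-weighted average current draw."""
    duty = active_s_per_hour / 3600.0
    avg_ma = active_ma * duty + sleep_ma * (1.0 - duty)
    return capacity_mah / avg_ma / 24.0

# Assumed figures: ~2000 mAh usable from alkaline AAs, 0.05 mA sleep current,
# 20 mA while sampling and writing to storage for 6 s once per hour
days = battery_life_days(2000, 0.05, 20, 6)   # comfortably over a year
```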

  16. Cave Pearl Data Logger: A Flexible Arduino-Based Logging Platform for Long-Term Monitoring in Harsh Environments

    PubMed Central

    Mallon, Edward K.

    2018-01-01

    A low-cost data logging platform is presented that provides long-term operation in remote or submerged environments. Three premade “breakout boards” from the open-source Arduino ecosystem are assembled into the core of the data logger. Power optimization techniques are presented which extend the operational life of this module-based design to >1 year on three alkaline AA batteries. Robust underwater housings are constructed for these loggers using PVC fittings. Both the logging platform and the enclosures are easy to build and modify without specialized tools or a significant background in electronics. This combination turns the Cave Pearl data logger into a generalized prototyping system, and this design flexibility is demonstrated with two field studies recording drip rates in a cave and water flow in a flooded cave system. This paper describes a complete DIY solution, suitable for a wide range of challenging deployment conditions. PMID:29425185

  17. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, single attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot field, Alberta, Canada. These techniques make use of seismic attributes generated by model based inversion and colored inversion techniques. The principal objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting the porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher resolution porosity sections. A low impedance (6000-8000 m/s g/cc) and high porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections between the 1060 and 1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model based inversion is the most efficient method for predicting porosity in the inter-well region.
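
    Single-attribute analysis in this setting amounts to a least-squares fit of well-log porosity against one seismic attribute, such as inverted impedance, which is then applied between wells. A minimal sketch; the attribute/porosity pairs below are hypothetical:

```python
def fit_line(xs, ys):
    """Ordinary least squares y = a + b*x for a single seismic attribute."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical scaled impedance vs. well-log porosity pairs
# (porosity typically decreases as impedance increases)
a, b = fit_line([1.0, 2.0, 3.0], [6.0, 4.0, 2.0])
```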

  18. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper.

    PubMed

    Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong

    2017-06-10

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system. This index is determined by the blind zone length and remote reflecting interface detection capability of the system. To reduce the blind zone length and detect near the reflecting interface, a full bridge acoustic emission technique based on a bootstrap gate driver (BGD) and metal-oxide-semiconductor field effect transistors (MOSFETs) is designed by analyzing the working principle and impedance characteristics of a given piezoelectric transducer. To detect the remote reflecting interface and reduce the dynamic range of the received echo signals, the relationships between the echo amplitude and propagation distance of ultrasonic waves are determined, and a signal compensation technique based on time-varying amplification theory, which can automatically change the gain according to the echo arrival time, is designed. Lastly, the aforementioned techniques and corresponding circuits are experimentally verified. Results show that the blind zone length in the UDM system of the LWD caliper is significantly reduced and the capability to detect the remote reflecting interface is considerably improved.
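
    The time-varying amplification idea can be illustrated with a gain law that cancels geometric spreading (proportional to 1/t) and exponential attenuation, so the compensated echo level no longer depends on arrival time. A minimal sketch; the attenuation coefficient, reference time, and toy echo model are assumptions, and the tool's actual gain schedule is certainly more elaborate:

```python
import math

def tvg_gain(t, alpha=0.02, t0=1.0):
    """Time-varying gain cancelling 1/t spreading and exp(-alpha*t) attenuation,
    normalized to unity at the reference time t0."""
    return (t / t0) * math.exp(alpha * (t - t0))

def echo_amplitude(t, alpha=0.02):
    """Toy echo model: geometric spreading plus exponential attenuation."""
    return math.exp(-alpha * t) / t

# After compensation, the echo level is the same for near and far interfaces,
# which shrinks the dynamic range the receiver must handle
level_near = echo_amplitude(2.0) * tvg_gain(2.0)
level_far = echo_amplitude(50.0) * tvg_gain(50.0)
```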

  19. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper

    PubMed Central

    Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong

    2017-01-01

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system. This index is determined by the blind zone length and remote reflecting interface detection capability of the system. To reduce the blind zone length and detect near the reflecting interface, a full bridge acoustic emission technique based on a bootstrap gate driver (BGD) and metal-oxide-semiconductor field effect transistors (MOSFETs) is designed by analyzing the working principle and impedance characteristics of a given piezoelectric transducer. To detect the remote reflecting interface and reduce the dynamic range of the received echo signals, the relationships between the echo amplitude and propagation distance of ultrasonic waves are determined, and a signal compensation technique based on time-varying amplification theory, which can automatically change the gain according to the echo arrival time, is designed. Lastly, the aforementioned techniques and corresponding circuits are experimentally verified. Results show that the blind zone length in the UDM system of the LWD caliper is significantly reduced and the capability to detect the remote reflecting interface is considerably improved. PMID:28604603

  20. Operator-Friendly Technique and Quality Control Considerations for Indigo Colorimetric Measurement of Ozone Residual

    EPA Science Inventory

    Drinking water ozone disinfection systems measure ozone residual concentration, C, for regulatory compliance reporting of concentration-times-time (CT), and the resultant log-inactivation of virus, Giardia and Cryptosporidium. The indigotrisulfonate (ITS) colorimetric procedure i...

  1. Factors influencing woodlands of southwestern North Dakota

    Treesearch

    Michele M. Girard; Harold Goetz; Ardell J. Bjugstad

    1987-01-01

    Literature pertaining to woodlands of southwestern North Dakota is reviewed. Woodland species composition and distribution, and factors influencing woodland ecosystems such as climate, logging, fire, and grazing are described. Potential management and improvement techniques using vegetation and livestock manipulation have been suggested.

  2. Preparation and testing of drilled shafts with self-consolidating concrete.

    DOT National Transportation Integrated Search

    2012-06-01

    In this study, self-consolidating concrete (SCC) was evaluated in drilled shafts and the : integrity of drilled shafts was determined using cross-hole sonic logging (CSL), a low-strain : nondestructive integrity testing technique. SCC has very high f...

  3. Robust sampling-sourced numerical retrieval algorithm for optical energy loss function based on log-log mesh optimization and local monotonicity preserving Steffen spline

    NASA Astrophysics Data System (ADS)

    Maglevanny, I. I.; Smolar, V. A.

    2016-01-01

    We introduce a new technique of interpolation of the energy-loss function (ELF) in solids sampled by empirical optical spectra. Finding appropriate interpolation methods for ELFs poses several challenges. The sampled ELFs are usually very heterogeneous and can originate from various sources, so that so-called "data gaps" can appear, and significant discontinuities and multiple high outliers can be present. As a result, an interpolation based on those data may not perform well at predicting reasonable physical results. Reliable interpolation tools, suitable for ELF applications, should therefore satisfy several important demands: accuracy and predictive power, robustness and computational efficiency, and ease of use. We examined the effect on the fitting quality of different interpolation schemes, with emphasis on ELF mesh optimization procedures, and we argue that the optimal fitting should be based on preliminary log-log scaling data transforms, by which the non-uniformity of the sampled data distribution may be considerably reduced. The transformed data are then interpolated by a local monotonicity preserving Steffen spline. The result is a piece-wise smooth fitting curve with continuous first-order derivatives that passes through all data points without spurious oscillations. Local extrema can occur only at grid points where they are given by the data, but not in between two adjacent grid points. It is found that the proposed technique gives the most accurate results and that its computational time is short. Thus, it is feasible to use this simple method to address practical problems associated with the interaction between a bulk material and a moving electron. A compact C++ implementation of our algorithm is also presented.
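
    The core transform is simple: map the data to log-log coordinates, interpolate there, and exponentiate back. A minimal sketch using piecewise-linear interpolation as a monotonicity-preserving stand-in for the Steffen spline of the paper:

```python
import math
from bisect import bisect_left

def loglog_interp(x, xs, ys):
    """Piecewise power-law interpolation: linear in log-log space.
    xs must be strictly increasing and all values strictly positive."""
    lx = [math.log(v) for v in xs]
    ly = [math.log(v) for v in ys]
    t = math.log(x)
    i = min(max(bisect_left(lx, t), 1), len(lx) - 1)
    w = (t - lx[i - 1]) / (lx[i] - lx[i - 1])
    return math.exp((1.0 - w) * ly[i - 1] + w * ly[i])

# Samples of y = x**2 are reproduced exactly between the nodes,
# because a power law is a straight line in log-log coordinates
y = loglog_interp(math.sqrt(10.0), [1.0, 10.0, 100.0], [1.0, 100.0, 10000.0])
```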

  4. The role of alternative (advanced) conscious sedation techniques in dentistry for adult patients: a series of cases.

    PubMed

    Robb, N

    2014-03-01

    The basic techniques of conscious sedation have been found to be safe and effective for the management of anxiety in adult dental patients requiring sedation to allow them to undergo dental treatment. There remains great debate within the profession as to the role of the so-called advanced sedation techniques. This paper presents a series of nine patients who were managed with advanced sedation techniques where the basic techniques were either inappropriate or had previously failed to provide adequate relief of anxiety. In these cases, had advanced sedation techniques not been available, the most likely recourse would have been general anaesthesia, a treatment modality that current guidance indicates should not be used where there is an appropriate alternative. The sedation techniques used provided that appropriate alternative management strategy.

  5. The effect of terpene enhancer lipophilicity on the percutaneous permeation of hydrocortisone formulated in HPMC gel systems.

    PubMed

    El-Kattan, A F; Asbill, C S; Michniak, B B

    2000-04-05

    The percutaneous permeation of hydrocortisone (HC) was investigated in hairless mouse skin after application of an alcoholic hydrogel using a diffusion cell technique. The formulations contained one of 12 terpenes, the selection of which was based on an increase in their lipophilicity (log P 1.06-5.36). Flux, cumulative receptor concentrations, skin content, and lag time of HC were measured over 24 h and compared with control gels (containing no terpene). Furthermore, HC skin content and the solubility of HC in the alcoholic hydrogel solvent mixture in the presence of terpene were determined, and correlated to the enhancing activity of terpenes. The in vitro permeation experiments with hairless mouse skin revealed that the terpene enhancers varied in their ability to enhance the flux of HC. Nerolidol which possessed the highest lipophilicity (log P = 5.36+/-0.38) provided the greatest enhancement for HC flux (35.3-fold over control). Fenchone (log P = 2.13+/-0.30) exhibited the lowest enhancement of HC flux (10.1-fold over control). In addition, a linear relationship was established between the log P of terpenes and the cumulative amount of HC in the receptor after 24 h (Q(24)). Nerolidol, provided the highest Q(24) (1733+/-93 microg/cm(2)), whereas verbenone produced the lowest Q(24) (653+/-105 microg/cm(2)). Thymol provided the lowest HC skin content (1151+/-293 microg/g), while cineole produced the highest HC skin content (18999+/-5666 microg/g). No correlation was established between the log P of enhancers and HC skin content. A correlation however, existed between the log P of terpenes and the lag time. As log P increased, a linear decrease in lag time was observed. Cymene yielded the shortest HC lag time, while fenchone produced the longest lag time. Also, the increase in the log P of terpenes resulted in a proportional increase in HC solubility in the formulation solvent mixture.
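
    The reported relationships, such as terpene log P against lag time or cumulative permeation, are linear correlations. A minimal Pearson-r sketch; the log P and lag-time values below are hypothetical, not the study's measurements:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical enhancer log P values vs. lag times (h), perfectly linear
# for illustration: higher lipophilicity, shorter lag
r = pearson_r([1.0, 2.0, 3.0, 4.0], [8.0, 6.0, 4.0, 2.0])
```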

  6. Evaluation of portfolio credit risk based on survival analysis for progressive censored data

    NASA Astrophysics Data System (ADS)

    Jaber, Jamil J.; Ismail, Noriszura; Ramli, Siti Norafidah Mohd

    2017-04-01

    In credit risk management, the Basel committee provides a choice of three approaches for financial institutions to calculate the required capital: the standardized approach, the Internal Ratings-Based (IRB) approach, and the Advanced IRB approach. The IRB approach is usually preferred to the standardized approach due to its higher accuracy and lower capital charges. This paper uses several parametric models (exponential, log-normal, gamma, Weibull, log-logistic, Gompertz) to evaluate the credit risk of the corporate portfolio in Jordanian banks based on a monthly sample collected from January 2010 to December 2015. The best model is selected using several goodness-of-fit criteria (MSE, AIC, BIC). The results indicate that the Gompertz distribution is the best parametric model for the data.
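
    Goodness-of-fit selection with AIC is mechanical once each model's maximized log-likelihood is in hand: compute the criterion per candidate and keep the smallest. A minimal sketch for the exponential case, which has a closed-form MLE; the sample durations are hypothetical:

```python
import math

def aic(loglik, n_params):
    """Akaike information criterion: 2k - 2*lnL, lower is better.
    (BIC replaces 2k with k*ln(n).)"""
    return 2 * n_params - 2 * loglik

def exp_loglik(data):
    """Maximized log-likelihood of an exponential model (MLE rate = 1/mean)."""
    lam = len(data) / sum(data)
    return sum(math.log(lam) - lam * x for x in data)

# Hypothetical survival durations (months) for a small portfolio
durations = [1.0, 2.0, 3.0]
score = aic(exp_loglik(durations), n_params=1)
```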

  7. Log-periodic view on critical dates of the Chinese stock market bubbles

    NASA Astrophysics Data System (ADS)

    Li, Chong

    2017-01-01

    We present an analysis of the critical dates of three historical Chinese stock market bubbles (July 2006-Oct. 2007, Dec. 2007-Oct. 2008, Oct. 2014-June 2015) based on the Shanghai Shenzhen CSI 300 index (CSI300). The analysis supports that the log-periodic power law singularity (LPPLS) model describes well the super-exponential (power law with finite-time singularity) increase or decrease of the CSI300 index, suggesting that the LPPLS can be used to predict the critical date. We also analyze the fitting parameter α of the LPPLS and the forecast gap, i.e. the interval between the last observed date and the expected critical date, proposing that the forecast gap offers an alternative means of advance warning of the market conversion.
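
    The LPPLS expectation is commonly written ln E[p(t)] = A + B(tc − t)^m + C(tc − t)^m cos(ω ln(tc − t) − φ) for t < tc. A minimal sketch of this standard functional form; parameter names follow the common convention, and the abstract's fitting parameter α presumably corresponds to the power-law exponent here called m:

```python
import math

def lppls(t, tc, m, omega, a, b, c, phi):
    """Standard LPPLS form: expected log-price approaching critical time tc (t < tc)."""
    dt = tc - t
    return a + b * dt ** m + c * dt ** m * math.cos(omega * math.log(dt) - phi)

# With c = 0 the log-periodic oscillation drops out, leaving a pure
# super-exponential power law a + b*(tc - t)**m
y = lppls(t=0.0, tc=1.0, m=0.5, omega=6.0, a=1.0, b=-0.5, c=0.0, phi=0.0)
```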

  8. Do plasma concentrations of apelin predict prognosis in patients with advanced heart failure?

    PubMed

    Dalzell, Jonathan R; Jackson, Colette E; Chong, Kwok S; McDonagh, Theresa A; Gardner, Roy S

    2014-01-01

    Apelin is an endogenous vasodilator and inotrope, plasma concentrations of which are reduced in advanced heart failure (HF). We determined the prognostic significance of plasma concentrations of apelin in advanced HF. Plasma concentrations of apelin were measured in 182 patients with advanced HF secondary to left ventricular systolic dysfunction. The predictive value of apelin for the primary end point of all-cause mortality was assessed over a median follow-up period of 544 (IQR: 196-923) days. In total, 30 patients (17%) reached the primary end point. Of those patients with a plasma apelin concentration above the median, 14 (16%) reached the primary end point compared with 16 (17%) of those with plasma apelin levels below the median (p = NS). NT-proBNP was the most powerful prognostic marker in this population (log rank statistic: 10.37; p = 0.001). Plasma apelin concentrations do not predict medium to long-term prognosis in patients with advanced HF secondary to left ventricular systolic dysfunction.

  9. Design and Testing of a Smartphone Application for Real-Time Self-Tracking Diabetes Self-Management Behaviors.

    PubMed

    Groat, Danielle; Soni, Hiral; Grando, Maria Adela; Thompson, Bithika; Kaufman, David; Cook, Curtiss B

    2018-04-01

    Type 1 diabetes (T1D) care requires multiple daily self-management behaviors (SMBs). Preliminary studies of SMBs rely mainly on self-reported survey and interview data; there is little information on adult T1D SMBs, or on the corresponding compensation techniques (CTs), gathered in real time. The article aims to use a patient-centered approach to design iDECIDE, a smartphone application that gathers daily diabetes SMBs and CTs related to meal and alcohol intake and exercise in real time, and to contrast patients' actual behaviors against those self-reported with the app. Two usability studies were used to improve iDECIDE's functionality. These were followed by a 30-day pilot test of the redesigned app. A survey designed to capture diabetes SMBs and CTs was administered prior to the 30-day pilot test, and survey results were compared against iDECIDE logs. The usability studies revealed that participants desired advanced features for self-tracking meals and alcohol intake. Thirteen participants recorded over 1,200 CTs for carbohydrates during the 30-day study. Participants also recorded 76 alcohol and 166 exercise CTs. Comparisons of survey responses and iDECIDE logs showed a mean (standard deviation) concordance of 77% (25) for SMBs related to meals, where a concordance of 100% indicates a perfect match. Concordance was low for alcohol and exercise events, at 35% (35) and 46% (41), respectively. The high variability found in SMBs and CTs highlights the need for real-time diabetes self-tracking mechanisms to better understand SMBs and CTs. Future work will use the developed app to collect SMBs and CTs and identify patient-specific diabetes adherence barriers that could be addressed with individualized education interventions. Schattauer GmbH Stuttgart.

  10. Adjuvant breast cancer therapy: current status and future strategies--growth kinetics and the improved drug therapy of breast cancer.

    PubMed

    Norton, L

    1999-02-01

    It is well-established that the adjuvant treatment of breast cancer is effective in prolonging both disease-free and overall survival. The pressing questions are how to improve on existing treatment, whether new agents should be incorporated into adjuvant regimens, and, if so, how they should best be utilized. The application of log-kill principles to the sigmoid growth curve characteristic of human cancers suggests that the chances of eradicating tumor will be increased by dose-dense schedules. If the tumor is composed of several cell lines with different sensitivities, the optimum therapy is likely to consist of several drugs given in sequence at a good dose and on a dense schedule. Such sequential chemotherapy, rather than the use of drugs given in combination at longer intervals, should maximize log-kill at the same time as minimizing tumor regrowth. There is now evidence that the actions of chemotherapy may involve Ras, tyrosine kinases (epidermal growth factor receptor, HER2), TC21, or similar molecules. This concept may provide important clues for optimizing the clinical applications of drug therapy and for designing new therapeutic approaches. It might also explain the reason why dose density may be more effective than other schedules of administration. New blood vessel formation is an obligatory step in the establishment of a tumor in its sigmoid growth course and there is evidence that taxanes adversely affect this process. Major practical advances in the curative drug therapy of cancer should follow the demonstration of better ways to maximize cell kill, the development of predictive in vitro methods of selecting active agents, the discovery of techniques to minimize both drug resistance and host-cell toxicity, and the improved understanding of cancer-stromal interactions and their therapeutic perturbation.
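    The log-kill argument above can be illustrated with a toy simulation (a sketch only; the kill fraction, Gompertz parameters, and cell numbers are hypothetical, not from the article): each dose removes a fixed number of logs of tumor cells, and the tumor regrows along a Gompertzian curve between doses, so shortening the inter-dose interval (dose density) leaves less time for regrowth and a lower final burden.

```python
import math

def gompertz_regrow(n, days, n_max=1e12, b=0.05):
    """Gompertzian regrowth dN/dt = b * N * ln(n_max / N), Euler steps of 1 day."""
    for _ in range(days):
        n += b * n * math.log(n_max / n)
    return n

def simulate(cycles, interval_days, log_kill=2.0, n0=1e9):
    """Each cycle: an instantaneous fixed log-kill, then regrowth until the next dose."""
    n = n0
    for _ in range(cycles):
        n *= 10.0 ** (-log_kill)          # each dose removes `log_kill` logs of cells
        n = gompertz_regrow(n, interval_days)
    return n

dense = simulate(cycles=6, interval_days=14)   # dose-dense schedule
sparse = simulate(cycles=6, interval_days=21)  # conventional spacing
```

    With identical per-dose kill, the dense schedule always ends with the smaller residual population, which is the quantitative core of the dose-density argument.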

  11. HALT to qualify electronic packages: a proof of concept

    NASA Astrophysics Data System (ADS)

    Ramesham, Rajeshuni

    2014-03-01

    A proof of concept of the Highly Accelerated Life Testing (HALT) technique was explored to assess and optimize electronic packaging designs for long-duration deep space missions over a wide temperature range (-150°C to +125°C). HALT is a custom hybrid suite of testing techniques using environments such as extreme temperatures and dynamic shock step processing from 0g up to 50g of acceleration. The HALT testing used in this study applied repetitive shock to the test vehicle components at various temperatures to precipitate workmanship and/or manufacturing defects and thereby expose the weak links of the designs. The purpose is to reduce the product development cycle time for improvements to the packaging design qualification. A test article was built using advanced electronic package designs and surface mount technology processes considered useful for a variety of JPL and NASA projects: surface mount packages such as ball grid arrays (BGA), plastic ball grid arrays (PBGA), very thin chip array ball grid arrays (CVBGA), quad flat-packs (QFP), and micro-lead-frame (MLF) packages, plus several passive components. These packages were daisy-chained and independently monitored during the HALT test, with continuity of the daisy chains logged by a data logging system. The HALT technique was then used to predict reliability and assess survivability of these advanced packaging techniques for long-duration deep space missions in much shorter test durations. The boards were tested at shock levels of up to 40g to 50g at temperatures ranging from +125°C to -150°C; the HALT system can deliver 50g shock levels at room temperature.
    Several tests were performed by subjecting the test boards to g levels ranging from 5g to 50g, test durations of 10 to 60 minutes, hot temperatures of up to +125°C, and cold temperatures down to -150°C. During the HALT test, electrical continuity measurements of the PBGA package showed an open circuit, whereas the BGA, MLF, and QFP packages showed only small variations in electrical continuity. The electrical continuity anomaly of the PBGA occurred within 12 hours of commencing the accelerated test. Similar test boards were assembled, thermal cycled independently from -150°C to +125°C, and monitored for electrical continuity through each package design. The PBGA package on the test board showed anomalous electrical continuity behavior after 959 thermal cycles. Each thermal cycle took around 2.33 hours, so the total test time to failure of the PBGA was 2,237 hours (~3.1 months) under thermal cycling alone. The accelerated technique (thermal cycling + shock) required only 12 hours to cause a failure in the same PBGA electronic package, an acceleration of ~186 times (more than two orders of magnitude) compared to the thermal-cycle-only test. This acceleration can save significant time and resources when predicting the life of a package component in a given environment, assuming the failure mechanisms are similar in both tests. Further studies are in progress to systematically evaluate the HALT technique on the other advanced electronic packaging components on the test board, using a constant temperature range for both tests. With this information, one will be able to estimate the number of mission thermal cycles to failure, and the hours to failure at given thermal and shock levels for a given test board, with a much shorter test program.
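    The acceleration arithmetic in the abstract can be reproduced directly (numbers taken from the text above; the product 959 × 2.33 h comes to ~2,234 h, slightly under the quoted 2,237 h, presumably due to rounding of the per-cycle duration):

```python
cycles_to_failure = 959        # thermal cycles to the PBGA continuity anomaly
hours_per_cycle = 2.33         # duration of one thermal cycle, in hours
thermal_hours = cycles_to_failure * hours_per_cycle   # time to failure, cycling only

halt_hours = 12                # time to the same failure under HALT (cycling + shock)
acceleration = thermal_hours / halt_hours

print(f"{thermal_hours:.0f} h under thermal cycling, {acceleration:.0f}x acceleration")
```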

  12. Assessment of PDMS-water partition coefficients: implications for passive environmental sampling of hydrophobic organic compounds

    USGS Publications Warehouse

    DiFilippo, Erica L.; Eganhouse, Robert P.

    2010-01-01

    Solid-phase microextraction (SPME) has shown potential as an in situ passive-sampling technique in aquatic environments. The reliability of this method depends upon accurate determination of the partition coefficient between the fiber coating and water (Kf). For some hydrophobic organic compounds (HOCs), Kf values spanning 4 orders of magnitude have been reported for polydimethylsiloxane (PDMS) and water. However, 24% of the published data examined in this review did not pass the criterion for negligible depletion, resulting in questionable Kf values. The range in reported Kf is reduced to just over 2 orders of magnitude for some polychlorinated biphenyls (PCBs) when these questionable values are removed. Other factors that could account for the range in reported Kf, such as fiber-coating thickness and fiber manufacturer, were evaluated and found to be insignificant. In addition to accurate measurement of Kf, an understanding of the impact of environmental variables, such as temperature and ionic strength, on partitioning is essential for application of laboratory-measured Kf values to field samples. To date, few studies have measured Kf for HOCs at conditions other than 20°C or 25°C in distilled water. The available data indicate measurable variations in Kf at different temperatures and different ionic strengths. Therefore, if the appropriate environmental variables are not taken into account, significant error will be introduced into calculated aqueous concentrations using this passive sampling technique. A multiparameter linear solvation energy relationship (LSER) was developed to estimate log Kf in distilled water at 25°C based on published physicochemical parameters. This method provided a good correlation (R² = 0.94) between measured and predicted log Kf values for several compound classes. Thus, an LSER approach may offer a reliable means of predicting log Kf for HOCs whose experimental log Kf values are presently unavailable.
Future research should focus on understanding the impact of environmental variables on Kf. Obtaining the data needed for an LSER approach to estimate Kf for all environmentally relevant HOCs would be beneficial to the application of SPME as a passive-sampling technique.
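    Multiparameter LSERs for partition coefficients are conventionally written in the Abraham solvation-parameter form; a representative version is shown below for orientation (the descriptor set and fitted coefficients actually used in the study are not specified in this abstract):

```latex
\log K_f = c + eE + sS + aA + bB + vV
```

    where E is the excess molar refraction, S the dipolarity/polarizability, A and B the hydrogen-bond acidity and basicity, V the McGowan characteristic volume, and the lowercase coefficients are fitted to measured log Kf values by multiple linear regression.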

  13. Delineating chalk sand distribution of Ekofisk formation using probabilistic neural network (PNN) and stepwise regression (SWR): Case study Danish North Sea field

    NASA Astrophysics Data System (ADS)

    Haris, A.; Nafian, M.; Riyanto, A.

    2017-07-01

    The Danish North Sea field comprises several formations (Ekofisk, Tor, and Cromer Knoll) deposited from the Paleocene to the Miocene. In this study, seismic and well log data sets are integrated to determine the chalk sand distribution in the Danish North Sea field. The integration is performed using seismic inversion analysis and seismic multi-attribute analysis. The seismic inversion algorithm used to derive acoustic impedance (AI) is a model-based technique. The derived AI is then used as an external attribute for the input of the multi-attribute analysis. The multi-attribute analysis generates linear and non-linear transformations between seismic attributes and well log properties. In the linear case, the transformation is obtained by weighted stepwise linear regression (SWR), while the non-linear model is built using a probabilistic neural network (PNN). The porosity estimated by the PNN fits the well log data better than the SWR result. This is to be expected, since the PNN performs non-linear regression, so the relationship between the attribute data and the predicted log can be better optimized. The distribution of chalk sand has been successfully identified and characterized by porosity values ranging from 23% up to 30%.
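    The PNN used for property prediction from seismic attributes is, at its core, a Gaussian-kernel regression (Specht's probabilistic/general regression network): each prediction is a distance-weighted average of training targets. A minimal sketch, with hypothetical attribute and porosity values, not data from the paper:

```python
import numpy as np

def pnn_predict(train_attrs, train_targets, query_attrs, sigma=0.5):
    """PNN/GRNN-style regression: Gaussian-kernel weighted average of targets."""
    # Squared Euclidean distances, shape (n_query, n_train)
    d2 = ((query_attrs[:, None, :] - train_attrs[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))          # kernel weights
    return (w @ train_targets) / w.sum(axis=1)    # weighted average per query

# Hypothetical training data: two standardized seismic attributes
# (e.g., AI and amplitude) against porosity in percent.
attrs = np.array([[0.1, 0.2], [0.9, 0.8], [0.5, 0.4], [0.3, 0.9]])
poro = np.array([23.0, 30.0, 26.0, 28.0])

est = pnn_predict(attrs, poro, np.array([[0.12, 0.22]]), sigma=0.1)
```

    The kernel width sigma controls the smoothness of the non-linear fit; tuning it (often by cross-validation) is the main design decision in this family of models.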

  14. A Generalized Approach for the Interpretation of Geophysical Well Logs in Ground-Water Studies - Theory and Application

    USGS Publications Warehouse

    Paillet, Frederick L.; Crowder, R.E.

    1996-01-01

    Quantitative analysis of geophysical logs in ground-water studies often involves at least as broad a range of applications and variation in lithology as is typically encountered in petroleum exploration, making such logs difficult to calibrate and complicating inversion problem formulation. At the same time, data inversion and analysis depend on inversion model formulation and refinement, so that log interpretation cannot be deferred to a geophysical log specialist unless active involvement with interpretation can be maintained by such an expert over the lifetime of the project. We propose a generalized log-interpretation procedure designed to guide hydrogeologists in the interpretation of geophysical logs, and in the integration of log data into ground-water models that may be systematically refined and improved in an iterative way. The procedure is designed to maximize the effective use of three primary contributions from geophysical logs: (1) The continuous depth scale of the measurements along the well bore; (2) The in situ measurement of lithologic properties and the correlation with hydraulic properties of the formations over a finite sample volume; and (3) Multiple independent measurements that can potentially be inverted for multiple physical or hydraulic properties of interest. The approach is formulated in the context of geophysical inversion theory, and is designed to be interfaced with surface geophysical soundings and conventional hydraulic testing. The step-by-step procedures given in our generalized interpretation and inversion technique are based on both qualitative analysis designed to assist formulation of the interpretation model, and quantitative analysis used to assign numerical values to model parameters. The approach decides whether quantitative inversion is statistically warranted by formulating an over-determined inversion.
If no such inversion is consistent with the inversion model, quantitative inversion is judged not possible with the given data set. Additional statistical criteria such as the statistical significance of regressions are used to guide the subsequent calibration of geophysical data in terms of hydraulic variables in those situations where quantitative data inversion is considered appropriate.
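    An over-determined inversion of the kind described, more independent log measurements than unknown formation properties, can be sketched with a linear tool-response model and least squares. The response matrix, property names, and noise level below are hypothetical illustrations, not values from the paper:

```python
import numpy as np

# Hypothetical linear tool-response model: three log measurements (rows)
# responding to two unknown formation properties (columns, e.g.,
# porosity and clay fraction).
G = np.array([[1.00, 0.40],
              [0.80, 0.10],
              [0.05, 1.20]])

m_true = np.array([0.25, 0.15])                        # "true" properties
d_obs = G @ m_true + np.array([0.002, -0.001, 0.001])  # logs with small noise

# Over-determined least-squares inversion: 3 equations, 2 unknowns
m_est, residuals, rank, _ = np.linalg.lstsq(G, d_obs, rcond=None)
```

    Whether the residual is consistent with the expected data error is then the statistical test for accepting the quantitative inversion, mirroring the criterion described above.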

  15. Objective straylight assessment of the human eye with a novel device

    NASA Astrophysics Data System (ADS)

    Schramm, Stefan; Schikowski, Patrick; Lerm, Elena; Kaeding, André; Klemm, Matthias; Haueisen, Jens; Baumgarten, Daniel

    2016-03-01

    Forward-scattered light from the anterior segment of the human eye can be measured by Shack-Hartmann (SH) wavefront aberrometers over only a limited visual angle. We propose a novel Point Spread Function (PSF) reconstruction algorithm based on SH measurements with a novel measurement device to overcome these limitations. In our optical setup, we use a Digital Mirror Device as a variable field stop, in place of the conventional pinhole that suppresses scatter and reflections. Images with 21 different stop diameters were captured, and from each image the average subaperture image intensity and the average intensity of the pupil were computed. The 21 intensities represent integral values of the PSF, which is then reconstructed by differentiation with respect to the visual angle. A generalized form of the Stiles-Holladay approximation is fitted to the PSF, resulting in a straylight parameter Log(IS). Additionally, the transmission loss of the eye is computed. As proof of principle, a study on 13 healthy young volunteers was carried out. Scatter filters were positioned in front of the volunteer's eye during C-Quant and scatter measurements to generate straylight emulating scatter in the lens. The straylight parameter was compared to the C-Quant measurement parameter Log(ISC) and to the scatter density of the filters (SDF) with a partial correlation. Log(IS) shows significant correlation with the SDF and Log(ISC). The correlation is more prominent when Log(IS) is combined with the transmission loss. Our novel measurement and reconstruction technique allows for objective straylight analysis at visual angles of up to 4 degrees.
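    The reconstruction logic can be summarized in two standard relations (a generic rendering, not the authors' exact parameterization): the intensity collected through a stop of angular radius θ integrates the PSF, so differentiation recovers the PSF, which is then fitted with a Stiles-Holladay-type form:

```latex
I(\theta) = \int_{0}^{\theta} \mathrm{PSF}(\theta')\, 2\pi\theta'\, \mathrm{d}\theta'
\;\;\Rightarrow\;\;
\mathrm{PSF}(\theta) = \frac{1}{2\pi\theta}\,\frac{\mathrm{d}I}{\mathrm{d}\theta},
\qquad
\mathrm{PSF}(\theta) \approx \frac{s}{\theta^{2}}
```

    with log s the straylight parameter, reported here as Log(IS).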

  16. Pattern mining of user interaction logs for a post-deployment usability evaluation of a radiology PACS client.

    PubMed

    Jorritsma, Wiard; Cnossen, Fokie; Dierckx, Rudi A; Oudkerk, Matthijs; van Ooijen, Peter M A

    2016-01-01

    To perform a post-deployment usability evaluation of a radiology Picture Archiving and Communication System (PACS) client based on pattern mining of user interaction log data, and to assess the usefulness of this approach compared to a field study. All user actions performed on the PACS client were logged for four months. A data mining technique called closed sequential pattern mining was used to automatically extract frequently occurring interaction patterns from the log data. These patterns were used to identify usability issues with the PACS. The results of this evaluation were compared to the results of a field study based usability evaluation of the same PACS client. The interaction patterns revealed four usability issues: (1) the display protocols do not function properly, (2) the line measurement tool stays active until another tool is selected, rather than being deactivated after one use, (3) the PACS's built-in 3D functionality does not allow users to effectively perform certain 3D-related tasks, (4) users underuse the PACS's customization possibilities. All usability issues identified based on the log data were also found in the field study, which identified 48 issues in total. Post-deployment usability evaluation based on pattern mining of user interaction log data provides useful insights into the way users interact with the radiology PACS client. However, it reveals few usability issues compared to a field study and should therefore not be used as the sole method of usability evaluation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
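    Closed sequential pattern mining keeps only those frequent subsequences that have no super-sequence with the same support. A brute-force sketch on hypothetical PACS session logs is below (action names are invented for illustration; production systems use algorithms such as CloSpan or BIDE rather than enumeration):

```python
from collections import Counter
from itertools import combinations

def subsequences(seq, max_len):
    """All (not necessarily contiguous) subsequences up to max_len."""
    for length in range(1, max_len + 1):
        for idx in combinations(range(len(seq)), length):
            yield tuple(seq[i] for i in idx)

def is_subsequence(small, big):
    it = iter(big)
    return all(item in it for item in small)  # 'in' consumes the iterator

def closed_frequent_patterns(sessions, min_support=2, max_len=4):
    support = Counter()
    for session in sessions:
        for pattern in set(subsequences(session, max_len)):
            support[pattern] += 1
    frequent = {p: s for p, s in support.items() if s >= min_support}
    # Closed: no distinct frequent super-sequence with equal support.
    return {p: s for p, s in frequent.items()
            if not any(q != p and sq == s and is_subsequence(p, q)
                       for q, sq in frequent.items())}

# Hypothetical user-interaction logs (one list of UI actions per session)
sessions = [
    ["open_study", "select_line_tool", "measure", "measure", "select_pan"],
    ["open_study", "select_line_tool", "measure", "measure", "close_study"],
    ["open_study", "window_level", "close_study"],
]
patterns = closed_frequent_patterns(sessions, min_support=2)
```

    On these toy sessions the closed pattern ('open_study', 'select_line_tool', 'measure', 'measure') surfaces with support 2, the kind of repeated-measurement signature that could point to an issue like (2) above.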

  17. DEVELOPMENT AND APPLICATION OF BOREHOLE FLOWMETERS FOR ENVIRONMENTAL ASSESSMENT

    EPA Science Inventory

    In order to understand the origin of contaminant plumes and infer their future migration, one requires a knowledge of the hydraulic conductivity (K) distribution. In many aquifers, the borehole flowmeter offers the most direct technique available for developing a log of hydraulic ...

  18. A Reading-Writing Connection in the Content Areas (Secondary Perspectives).

    ERIC Educational Resources Information Center

    Journal of Reading, 1990

    1990-01-01

    Discusses instructional activities designed to foster the reading-writing connection in the content area classroom. Describes the use of "possible sentences," learning logs, freewriting, dialogue journals, the RAFT technique (role, audience, format, and topic), and the "opinion-proof" organization strategy. (RS)

  19. Inactivation of marine heterotrophic bacteria in ballast water by an Electrochemical Advanced Oxidation Process.

    PubMed

    Moreno-Andrés, Javier; Ambauen, Noëmi; Vadstein, Olav; Hallé, Cynthia; Acevedo-Merino, Asunción; Nebot, Enrique; Meyn, Thomas

    2018-05-03

    Seawater treatment is increasingly required due to industrial activities that use substantial volumes of seawater in their processes. The shipping industry and the associated management of a ship's ballast water are currently considered a global challenge for the seas. In this context, the suitability of an Electrochemical Advanced Oxidation Process (EAOP) with Boron Doped Diamond (BDD) electrodes has been assessed on a laboratory scale for the disinfection of seawater. This technology can produce both reactive oxygen species and chlorine species (especially in seawater) that are responsible for inactivation. The EAOP was applied in a continuous-flow regime with real seawater. Natural marine heterotrophic bacteria (MHB) were used as an indicator of disinfection efficiency. A biphasic inactivation kinetic model was fitted to the experimental data, achieving 4-Log reductions at 0.019 Ah L−1. Assessment of regrowth after treatment suggests that the EAOP causes greater bacterial damage than chlorination. Furthermore, several issues lacking fundamental understanding were investigated, such as recolonization capacity and bacterial community dynamics. It was concluded that, although the disinfection process is effective, there is not only a possibility of regrowth after treatment but also a treatment-induced change in bacterial population diversity. Finally, energy consumption was estimated: 0.264 kWh·m−3 is needed for a 4.8-Log reduction of MHB, whereas 0.035 kWh·m−3 yields lower disinfection efficiency (2.2-Log reduction). However, with a residual oxidant in the solution, total inactivation can be achieved within three days. Copyright © 2018 Elsevier Ltd. All rights reserved.
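    Biphasic inactivation kinetics of the kind fitted here are commonly modeled as two subpopulations inactivated at different first-order rates (symbols generic; the paper's exact parameterization and fitted constants are not given in the abstract):

```latex
\log_{10}\frac{N}{N_0} = \log_{10}\!\left[\, f\,10^{-k_1 D} + (1 - f)\,10^{-k_2 D} \,\right]
```

    where N/N_0 is the survivor ratio, f the fraction of the more sensitive subpopulation, k_1 > k_2 the inactivation rate constants, and D the applied dose (here in Ah L−1).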

  20. Surface water disinfection by chlorination and advanced oxidation processes: Inactivation of an antibiotic resistant E. coli strain and cytotoxicity evaluation.

    PubMed

    Miranda, Andreza Costa; Lepretti, Marilena; Rizzo, Luigi; Caputo, Ivana; Vaiano, Vincenzo; Sacco, Olga; Lopes, Wilton Silva; Sannino, Diana

    2016-06-01

    The release of antibiotics into the environment can result in the spread of antibiotic resistance (AR), which in turn can seriously affect human health. Antibiotic resistant bacteria have been detected in different aquatic environments used as drinking water sources. Water disinfection may be a possible solution to minimize AR spread, but conventional processes, such as chlorination, result in the formation of dangerous disinfection by-products. In this study, advanced oxidation processes (AOPs), namely H2O2/UV, TiO2/UV and N-TiO2/UV, were compared with chlorination for the inactivation of an AR Escherichia coli (E. coli) strain in surface water. TiO2 P25 and nitrogen-doped TiO2 (N-TiO2), prepared by a sol-gel method at two different synthesis temperatures (0 and -20°C), were investigated in heterogeneous photocatalysis experiments. Under the investigated conditions, chlorination (1.0 mg L(-1)) was the fastest process, achieving total inactivation (6 Log) in 2.5 min. Among the AOPs, H2O2/UV gave the best inactivation rate: total inactivation (6 Log) was achieved in 45 min of treatment. Only N-doped TiO2 synthesized at 0°C failed to achieve total inactivation, reaching 4.5 Log even after 120 min of treatment. Moreover, the H2O2/UV and chlorination processes were evaluated for cytotoxicity potential by means of the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium colorimetric test on a human-derived cell line, and they affected HepG2 cell viability similarly. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. AVTA Federal Fleet PEV Readiness Data Logging and Characterization Study for NASA Glenn Research Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schey, Stephen; Francfort, Jim

    The Advanced Vehicle Testing Activity’s study seeks to collect and evaluate data to validate the utilization of advanced plug-in electric vehicle (PEV) transportation. This report focuses on the NASA Glenn Research Center (GRC) fleet to identify daily operational characteristics of select vehicles and report findings on vehicle and mission characterizations to support the successful introduction of PEVs into the agencies’ fleets. Individual observations of these selected vehicles provide the basis for recommendations related to electric vehicle adoption and whether a battery electric vehicle or plug-in hybrid electric vehicle (collectively referred to as PEVs) can fulfill the mission requirements.

  2. Binary Detection using Multi-Hypothesis Log-Likelihood, Image Processing

    DTIC Science & Technology

    2014-03-27

    geosynchronous orbit and other scenarios important to the USAF. 2 1.3 Research objectives The question posed in this thesis is how well, if at all, can a...is important to compare them to another modern technique. The third objective is to compare results from another image detection method, specifically...Although adaptive optics is an important technique in moving closer to diffraction limited imaging, it is not currently a practical solution for all

  3. Development of prototypes of bioactive packaging materials based on immobilized bacteriophages for control of growth of bacterial pathogens in foods.

    PubMed

    Lone, Ayesha; Anany, Hany; Hakeem, Mohammed; Aguis, Louise; Avdjian, Anne-Claire; Bouget, Marina; Atashi, Arash; Brovko, Luba; Rochefort, Dominic; Griffiths, Mansel W

    2016-01-18

    Due to lack of adequate control methods to prevent contamination in fresh produce and growing consumer demand for natural products, the use of bacteriophages has emerged as a promising approach to enhance safety of these foods. This study sought to control Listeria monocytogenes in cantaloupes and RTE meat and Escherichia coli O104:H4 in alfalfa seeds and sprouts under different storage conditions by using specific lytic bacteriophage cocktails applied either free or immobilized. Bacteriophage cocktails were introduced into prototypes of packaging materials using different techniques: i) immobilizing on positively charged modified cellulose membranes, ii) impregnating paper with bacteriophage suspension, and iii) encapsulating in alginate beads followed by application of beads onto the paper. Phage-treated and non-treated samples were stored for various times and at temperatures of 4°C, 12°C or 25°C. In cantaloupe, when free phage cocktail was added, L. monocytogenes counts dropped below the detection limit of the plating technique (<1 log CFU/g) after 5 days of storage at both 4°C and 12°C. However, at 25°C, counts below the detection limit were observed after 3 and 6h and a 2-log CFU/g reduction in cell numbers was seen after 24h. For the immobilized Listeria phage cocktail, around 1-log CFU/g reduction in the Listeria count was observed by the end of the storage period for all tested storage temperatures. For the alfalfa seeds and sprouts, regardless of the type of phage application technique (spraying of free phage suspension, bringing in contact with bacteriophage-based materials (paper coated with encapsulated bacteriophage or impregnated with bacteriophage suspension)), the count of E. coli O104:H4 was below the detection limit (<1 log CFU/g) after 1h in seeds and about a 1-log cycle reduction in E. coli count was observed on the germinated sprouts by day 5. 
In ready-to-eat (RTE) meat, LISTEX™ P100, a commercial phage product, was able to significantly reduce the growth of L. monocytogenes at both storage temperatures, 4°C and 10°C, for 25 days regardless of bacteriophage application format (immobilized or non-immobilized (free)). In conclusion, the developed phage-based materials demonstrated significant antimicrobial effect, when applied to the artificially contaminated foods, and can be used as prototypes for developing bioactive antimicrobial packaging materials capable of enhancing the safety of fresh produce and RTE meat. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Industrial application of semantic process mining

    NASA Astrophysics Data System (ADS)

    Espen Ingvaldsen, Jon; Atle Gulla, Jon

    2012-05-01

    Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. Over the last years, process mining prototypes have incorporated elements from semantics and data mining and targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.

  5. The Solar System Origin Revisited

    NASA Astrophysics Data System (ADS)

    Johnson, Fred M.

    2016-10-01

    A novel theory will be presented based in part on astronomical observations, plasma physics experiments, principles of physics, and forensic techniques. The new theory correctly predicts planetary distances with 1% precision. It accounts for the energy production mechanism inside all of the planets, including our Earth. A log-log mass-luminosity plot of G2-class stars and solar system planets results in a straight line, whose slope implies that fission rather than proton-proton fusion energy production is operating. Furthermore, this is a confirmation that all our planets originated from within our Sun. Other still-born planets continue to appear on the Sun's surface; they are mislabeled as sunspots.

  6. A VLSI architecture for performing finite field arithmetic with reduced table look-up

    NASA Technical Reports Server (NTRS)

    Hsu, I. S.; Truong, T. K.; Reed, I. S.

    1986-01-01

    A new table look-up method for finding the log and antilog of finite field elements has been developed by N. Glover. In his method, the log and antilog of a field element are found by the use of several smaller tables. The method is based on the Chinese Remainder Theorem, and the technique often results in a significant reduction in the memory requirements of the problem. A VLSI architecture is developed for a special case of this new algorithm to perform finite field arithmetic, including multiplication, division, and the finding of an inverse element in the finite field.
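    A concrete sketch in GF(2^8) (the AES field, chosen here for familiarity; this only illustrates the spirit of Glover's method, whose exact table decomposition is not given in the abstract): multiplication reduces to adding discrete logs mod 255, and since 255 = 3 · 5 · 17 the Chinese Remainder Theorem lets each log be stored as three small residues instead of one mod-255 value, which is the source of the memory savings.

```python
from math import prod

# Build log/antilog tables for GF(2^8) mod x^8+x^4+x^3+x+1 (0x11B),
# using 3 (= x + 1) as the multiplicative generator.
EXP = [0] * 255
LOG = [0] * 256
v = 1
for i in range(255):
    EXP[i] = v
    LOG[v] = i
    v ^= (v << 1) ^ (0x11B if v & 0x80 else 0)   # v *= 3 in GF(2^8)

def gf_mul(a, b):
    """Multiply via tables: log(ab) = log a + log b (mod 255)."""
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 255]

def crt(residues, moduli):
    """Recombine residues into a value mod prod(moduli) (moduli coprime)."""
    M = prod(moduli)
    return sum(r * (M // m) * pow(M // m, -1, m)
               for r, m in zip(residues, moduli)) % M

# The CRT trick: store log(a) as (log a mod 3, mod 5, mod 17) in three
# small tables; the full mod-255 log is recoverable whenever needed.
a = 0x57
parts = [LOG[a] % m for m in (3, 5, 17)]
assert crt(parts, (3, 5, 17)) == LOG[a]
```

    Division and inversion follow the same pattern (subtracting or negating logs mod 255), which is why a single log/antilog structure serves all three operations mentioned in the abstract.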

  7. PARKA II Experiment Utilizing SEA SPIDER. ONR Scientific Plan 2-69

    DTIC Science & Technology

    1969-06-26

    speed and wave height, and take a bathythermograph record to establish depth of surface layer . Log layer depth only with wind and wave data. Step 12...range acoustic propagation experiments designed to support the advanced development objectives of the Long Range Acoustic Propagation Project (LRAPP...environmental experiments conducted under the Long Range Acoustic Propagation Project (LR PP) for the purpose of, evaluating and improving

  8. Salmonid Communities in the South Fork of Caspar Creek, 1967 to 1969 and 1993 to 2003

    Treesearch

    Bradley E. Valentine; Richard A. Macedo; Tracie Hughes

    2007-01-01

    Demand for wood products and advances in logging technology post-World War II resulted in timber harvesting that extensively modified streams on the North Coast of California. To assess the resulting impacts to salmonid populations, the Department of Fish and Game conducted studies at widely spaced sites throughout the redwood region during the 1960s. In order to...

  9. Diamond knife-assisted deep anterior lamellar keratoplasty to manage keratoconus.

    PubMed

    Vajpayee, Rasik B; Maharana, Prafulla K; Sharma, Namrata; Agarwal, Tushar; Jhanji, Vishal

    2014-02-01

    To evaluate the outcomes of a new surgical technique, diamond knife-assisted deep anterior lamellar keratoplasty (DALK), and compare its visual and refractive results with big-bubble DALK in cases of keratoconus. Tertiary eyecare hospital. Comparative case series. The visual and surgical outcomes of diamond knife-assisted DALK were compared with those of successful big-bubble DALK. Diamond knife-assisted DALK was performed in 19 eyes and big-bubble DALK, in 11 eyes. All surgeries were completed successfully. No intraoperative or postoperative complications occurred with diamond knife-assisted DALK. Six months after diamond knife-assisted DALK, the mean corrected distance visual acuity (CDVA) improved significantly from 1.87 logMAR ± 0.22 (SD) to 0.23 ± 0.06 logMAR, the mean keratometry improved from 65.99 ± 8.86 diopters (D) to 45.13 ± 1.16 D, and the mean keratometric cylinder improved from 7.99 ± 3.81 D to 2.87 ± 0.59 D (all P=.005). Postoperatively, the mean refractive astigmatism was 2.55 ± 0.49 D and the mean spherical equivalent was -1.97 ± 0.56 D. The mean logMAR CDVA (P = .06), postoperative keratometry (P=.64), refractive cylinder (P=.63), and endothelial cell loss (P=.11) were comparable between diamond knife-assisted DALK and big-bubble DALK. Diamond knife-assisted DALK was effective and predictable as a surgical technique for management of keratoconus cases. This technique has the potential to offer visual and refractive outcomes comparable to those of big-bubble DALK. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  10. A double-blind placebo-controlled, randomised study comparing gemcitabine and marimastat with gemcitabine and placebo as first line therapy in patients with advanced pancreatic cancer

    PubMed Central

    Bramhall, S R; Schulz, J; Nemunaitis, J; Brown, P D; Baillet, M; Buckels, J A C

    2002-01-01

    Pancreatic cancer is the fifth most common cause of cancer death in the western world and the prognosis for unresectable disease remains poor. Recent advances in conventional chemotherapy and the development of novel ‘molecular’ treatment strategies with different toxicity profiles warrant investigation as combination treatment strategies. This randomised study in pancreatic cancer compares marimastat (orally administered matrix metalloproteinase inhibitor) in combination with gemcitabine to gemcitabine alone. Two hundred and thirty-nine patients with unresectable pancreatic cancer were randomised to receive gemcitabine (1000 mg m−2) in combination with either marimastat or placebo. The primary end-point was survival. Objective tumour response and duration of response, time to treatment failure and disease progression, quality of life and safety were also assessed. There was no significant difference in survival between gemcitabine and marimastat and gemcitabine and placebo (P=0.95 log-rank test). Median survival times were 165.5 and 164 days and 1-year survival was 18% and 17% respectively. There were no significant differences in overall response rates (11 and 16% respectively), progression-free survival (P=0.68 log-rank test) or time to treatment failure (P=0.70 log-rank test) between the treatment arms. The gemcitabine and marimastat combination was well tolerated with only 2.5% of patients withdrawn due to presumed marimastat toxicity. Grade 3 or 4 musculoskeletal toxicities were reported in only 4% of the marimastat treated patients, although 59% of marimastat treated patients reported some musculoskeletal events. The results of this study provide no evidence to support a combination of marimastat with gemcitabine in patients with advanced pancreatic cancer. The combination of marimastat with gemcitabine was well tolerated. Further studies of marimastat as a maintenance treatment following a response or stable disease on gemcitabine may be justified. 
British Journal of Cancer (2002) 87, 161–167. doi:10.1038/sj.bjc.6600446 www.bjcancer.com © 2002 Cancer Research UK PMID:12107836

  11. Iterative raw measurements restoration method with penalized weighted least squares approach for low-dose CT

    NASA Astrophysics Data System (ADS)

    Takahashi, Hisashi; Goto, Taiga; Hirokawa, Koichi; Miyazaki, Osamu

    2014-03-01

    Statistical iterative reconstruction and post-log data restoration algorithms for CT noise reduction have been widely studied, and these techniques have enabled us to reduce irradiation doses while maintaining image quality. In low-dose scanning, electronic noise becomes significant and produces some non-positive signals in the raw measurements. A non-positive signal must be converted to a positive one before it can be log-transformed. Since conventional conversion methods do not consider the local variance on the sinogram, it is difficult to control the strength of their filtering. In this work, we therefore propose a method that converts non-positive signals to positive ones mainly by controlling the local variance. The method is implemented in two separate steps. First, an iterative restoration algorithm based on penalized weighted least squares is used to mitigate the effect of electronic noise. The algorithm preserves the local mean and reduces the local variance induced by the electronic noise. Second, the raw measurements smoothed by the iterative algorithm are converted to positive signals by a function that replaces each non-positive signal with its local mean. In phantom studies, we confirm that the proposed method properly preserves the local mean and reduces the variance induced by the electronic noise. Our technique dramatically reduces shading artifacts and can also successfully cooperate with the post-log data filter to reduce streak artifacts.
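
    The second step described above, replacing non-positive raw measurements with a local mean so the data can be log-transformed, can be sketched in a few lines. This is an illustrative simplification: the window size and fallback floor are assumptions, and the paper's PWLS smoothing step is omitted.

```python
import numpy as np

def positivize_and_log(raw, window=5):
    """Replace each non-positive raw measurement with the local mean of the
    surrounding positive samples, then log-transform. A simplified stand-in
    for the paper's two-step restoration; window size is illustrative."""
    raw = np.asarray(raw, dtype=float)
    out = raw.copy()
    half = window // 2
    for i in np.flatnonzero(raw <= 0):
        lo, hi = max(0, i - half), min(raw.size, i + half + 1)
        neighbors = raw[lo:hi]
        pos = neighbors[neighbors > 0]
        # fall back to a small positive floor if the whole window is non-positive
        out[i] = pos.mean() if pos.size else 1e-6
    return np.log(out)  # post-log data, ready for downstream filtering
```

    Positive samples pass through unchanged, so the local mean is only imposed where the electronic noise drove the signal below zero.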

  12. The combined use of heat-pulse flowmeter logging and packer testing for transmissive fracture recognition

    NASA Astrophysics Data System (ADS)

    Lo, Hung-Chieh; Chen, Po-Jui; Chou, Po-Yi; Hsu, Shih-Meng

    2014-06-01

    This paper presents an improved borehole prospecting methodology based on a combination of techniques for the hydrogeological characterization of fractured rock aquifers. The approach is demonstrated by on-site tests carried out at the Hoshe Experimental Forest site and the Tailuge National Park, Taiwan. Borehole televiewer logs are used to obtain fracture location and distribution along the boreholes. The heat-pulse flowmeter log is used to measure vertical flow velocity profiles, which can be analyzed to estimate fracture transmissivity and to indicate hydraulic connectivity between fractures. Double-packer hydraulic tests are performed to determine the rock mass transmissivity. The computer program FLASH is used to analyze the data from the flowmeter logs. The FLASH program is confirmed as a useful tool that quantitatively predicts fracture transmissivity in comparison to the hydraulic properties obtained from packer tests. The locations of conductive fractures and their transmissivities are identified, after which the preferential flow paths through the fracture network are precisely delineated from a cross-borehole test. The results provide robust confirmation of the combined use of flowmeter and packer methods in the characterization of fractured-rock aquifers, particularly for the investigation of groundwater resources and contaminant transport dynamics.

  13. Hand hygiene regimens for the reduction of risk in food service environments.

    PubMed

    Edmonds, Sarah L; McCormack, Robert R; Zhou, Sifang Steve; Macinga, David R; Fricker, Christopher M

    2012-07-01

    Pathogenic strains of Escherichia coli and human norovirus are the main etiologic agents of foodborne illness resulting from inadequate hand hygiene practices by food service workers. This study was conducted to evaluate the antibacterial and antiviral efficacy of various hand hygiene product regimens under different soil conditions representative of those in food service settings and assess the impact of product formulation on this efficacy. On hands contaminated with chicken broth containing E. coli, representing a moderate soil load, a regimen combining an antimicrobial hand washing product with a 70% ethanol advanced formula (EtOH AF) gel achieved a 5.22-log reduction, whereas a nonantimicrobial hand washing product alone achieved a 3.10-log reduction. When hands were heavily soiled from handling ground beef containing E. coli, a wash-sanitize regimen with a 0.5% chloroxylenol antimicrobial hand washing product and the 70% EtOH AF gel achieved a 4.60-log reduction, whereas a wash-sanitize regimen with a 62% EtOH foam achieved a 4.11-log reduction. Sanitizing with the 70% EtOH AF gel alone was more effective than hand washing with a nonantimicrobial product for reducing murine norovirus (MNV), a surrogate for human norovirus, with 2.60- and 1.79-log reductions, respectively. When combined with hand washing, the 70% EtOH AF gel produced a 3.19-log reduction against MNV. A regimen using the SaniTwice protocol with the 70% EtOH AF gel produced a 4.04-log reduction against MNV. These data suggest that although the process of hand washing helped to remove pathogens from the hands, use of a wash-sanitize regimen was even more effective for reducing organisms. Use of a high-efficacy sanitizer as part of a wash-sanitize regimen further increased the efficacy of the regimen. The use of a well-formulated alcohol-based hand rub as part of a wash-sanitize regimen should be considered as a means to reduce risk of infection transmission in food service facilities.
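
    The log-reduction figures quoted above follow directly from plate counts; a minimal sketch of the arithmetic (the CFU values below are hypothetical, not the study's data):

```python
import math

def log_reduction(cfu_before, cfu_after):
    """Log10 reduction achieved by a hygiene regimen: the difference in
    log10 CFU recovered before and after treatment."""
    return math.log10(cfu_before) - math.log10(cfu_after)

# e.g. a drop from 10^7 CFU to about 6 x 10^1 CFU corresponds to the
# ~5.22-log reduction reported for the wash-plus-70% EtOH AF gel regimen
five_log = log_reduction(1e7, 1e2)
```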

  14. Effects of Thalassinoides ichnofabrics on the petrophysical properties of the Lower Cretaceous Lower Glen Rose Limestone, Middle Trinity Aquifer, Northern Bexar County, Texas

    NASA Astrophysics Data System (ADS)

    Golab, James A.; Smith, Jon J.; Clark, Allan K.; Blome, Charles D.

    2017-04-01

    The combined Edwards and Trinity aquifer system is the primary source of freshwater for the rapidly growing San Antonio and Austin metropolitan areas. The karstic Lower Cretaceous (Aptian-Albian) Lower Glen Rose Limestone (GRL) contains the middle Trinity aquifer and has been subdivided into six hydrostratigraphic units (HSUs) with distinct hydrologic characteristics. These HSUs were first identified in the subsurface via core examination at the Camp Stanley Storage Activity (CSSA) in northern Bexar County, Texas, and were then correlated to associated gamma-ray and resistivity logs. The Trinity aquifer system is a telogenetic karst, and fluid flow is directed primarily through solution-enhanced faults, fractures, and pervasive Thalassinoides networks because the matrix porosity of both transmissive and confining HSUs is very low. Meteoric water infiltrates the Trinity aquifer through vertically oriented faults and likely moves laterally through biogenic pores. Two 7.62 cm diameter GRL cores and well logs from monitoring wells CS-MW9-CC and CS-MW5-LGR recovered from the CSSA were used to characterize the effect such large-scale Thalassinoides networks have on the petrophysical properties (resistivity and natural gamma-ray) of four HSUs (Honey Creek, Rust, Doeppenschmidt, and Twin Sisters HSUs). Resistivity logs show that resistivity values > 300 Ω-m correlate with well-developed biogenic porosity and values of 650 Ω-m are associated with solution enhancement of the Thalassinoides networks. These high-resistivity zones are cyclical and are identified in muddy confining units even when no changes in lithology or karstic development are identified. Pervasive Thalassinoides networks act as starting points for widespread dissolution and lead to advanced karst development in transmissive HSUs. Natural gamma-ray logs do not reflect hydrologic characteristics directly, but are inversely correlated to resistivity logs and display m-scale cyclicity. Resistivity logs suggest that Thalassinoides networks are interconnected throughout strata within the GRL, and when coupled with natural gamma-ray logs, the lateral distribution of these networks within HSUs can be correlated. Identifying such fluid pathways is of particular importance for wells not located in proximity to major faults and karstic features.

  15. Joint Inversion of Geochemical Data and Geophysical Logs for Lithology Identification in CCSD Main Hole

    NASA Astrophysics Data System (ADS)

    Deng, Chengxiang; Pan, Heping; Luo, Miao

    2017-12-01

    The Chinese Continental Scientific Drilling (CCSD) main hole is located in the Sulu ultrahigh-pressure metamorphic (UHPM) belt, providing significant opportunities for studying metamorphic strata structure, kinetic processes and tectonic evolution. Lithology identification is the primary and crucial stage for such geoscientific research. To reduce the burden on log analysts and improve the efficiency of lithology interpretation, many algorithms have been developed to automate lithology prediction. Traditional statistical techniques, such as discriminant analysis and K-nearest neighbors classifiers, are poor at extracting the nonlinear features of metamorphic rocks from complex geophysical log data; artificial intelligence algorithms can solve nonlinear problems, but most struggle to tune their parameters to a global rather than a local optimum when establishing a model, and also face challenges in balancing training accuracy against generalization ability. Optimization methods have been applied extensively in the inversion of reservoir parameters of sedimentary formations using well logs. However, when applied in metamorphic formations, it is difficult to obtain accurate solutions from the logging response equations of an optimization method because of the strong overlap of nonstationary log signals. As the oxide contents of different kinds of metamorphic rocks overlap relatively little, this study explores an approach, set in a metamorphic formation model, that uses the Broyden-Fletcher-Goldfarb-Shanno (BFGS) optimization algorithm to identify lithology from oxide data. We first incorporate 11 geophysical logs and lab-collected geochemical data from 47 core samples to construct an oxide profile of the CCSD main hole using a backwards stepwise multiple regression method, which eliminates irrelevant input logs step by step for higher statistical significance and accuracy. Then we establish oxide response equations in accordance with the metamorphic formation model and employ the BFGS algorithm to minimize the objective function. Finally, we identify the lithology from the component whose content accounts for the largest proportion. The results show that the lithology identified by this method is consistent with the core descriptions. Moreover, the method demonstrates the benefits of using oxide content as an adhesive connecting logging data with lithology: it makes the metamorphic formation model more understandable and accurate, and avoids selecting a complex formation model and building nonlinear logging response equations.
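
    This style of inversion, minimizing a least-squares misfit of linear response equations with BFGS and labeling the rock by its dominant component, can be sketched with SciPy's implementation. The response matrix, the penalty weight, and the three-component model below are illustrative assumptions, not the paper's calibrated equations.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical response matrix: rows = logs (e.g. density, GR, a porosity
# index), columns = per-component end-member responses. Values invented.
A = np.array([[2.65, 2.71, 3.10],
              [30.0, 10.0, 150.0],
              [0.02, 0.00, 0.05]])
true_fractions = np.array([0.6, 0.3, 0.1])
measured = A @ true_fractions          # synthetic "observed" log responses

def misfit(x):
    # least-squares objective of the response equations, with a soft
    # penalty keeping the component fractions summing to one
    return np.sum((A @ x - measured) ** 2) + 10.0 * (x.sum() - 1.0) ** 2

res = minimize(misfit, x0=np.full(3, 1 / 3), method="BFGS")
fractions = res.x
lithology = int(np.argmax(fractions))  # dominant component labels the rock
```

    Because the synthetic data are noise-free, the minimizer recovers the generating fractions; with real logs the residual misfit and the fraction estimates both degrade together.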

  16. Performance characteristics and estimation of measurement uncertainty of three plating procedures for Campylobacter enumeration in chicken meat.

    PubMed

    Habib, I; Sampers, I; Uyttendaele, M; Berkvens, D; De Zutter, L

    2008-02-01

    In this work, we present an intra-laboratory study to estimate the repeatability (r), reproducibility (R), and measurement uncertainty (U) associated with three media for Campylobacter enumeration, namely modified charcoal cefoperazone deoxycholate agar (mCCDA), Karmali agar, and CampyFood ID agar (CFA), a medium by Biomérieux SA. The study was performed at three levels: (1) pure bacterial cultures, using three Campylobacter strains; (2) artificially contaminated samples from three chicken meat matrixes (total n=30), whereby samples were spiked at two contamination levels, ca. 10^3 cfu Campylobacter/g and ca. 10^4 cfu Campylobacter/g; and (3) pilot testing in naturally contaminated chicken meat samples (n=20). Results from the pure culture experiment revealed that enumeration of Campylobacter colonies on Karmali and CFA media was more convenient than on mCCDA using spread and spiral plating techniques. Based on testing of artificially contaminated samples, repeatability (r) values were comparable between the three media, estimated as 0.15 log10 cfu/g for mCCDA, 0.14 log10 cfu/g for Karmali, and 0.18 log10 cfu/g for CFA. Reproducibility of the three plating media was likewise comparable. General R values that can be used when testing chicken meat samples are 0.28 log10, 0.32 log10, and 0.25 log10 for plating on mCCDA, Karmali agar, and CFA, respectively. Measurement uncertainties associated with mCCDA, Karmali agar, and CFA using spread plating, for the combination of all meat matrixes, were ±0.24 log10 cfu/g, ±0.28 log10 cfu/g, and ±0.22 log10 cfu/g, respectively. Higher uncertainty was associated with Karmali agar for Campylobacter enumeration in artificially inoculated minced meat (±0.48 log10 cfu/g). The general performance of the CFA medium was comparable with that of mCCDA at the level of artificially contaminated samples. However, when tested on naturally contaminated samples, non-Campylobacter colonies gave a deep red colour similar to that of typical Campylobacter growth on CFA. Such colonies were not easily distinguishable by the naked eye. In general, the overall reproducibility, repeatability, and measurement uncertainty estimated by our study indicate that there are no major problems with the precision of the International Organization for Standardization (ISO) 10272-2:2006 protocol for Campylobacter enumeration using the mCCDA medium.
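
    Repeatability limits of the kind quoted above are conventionally taken as 2.8 times the repeatability standard deviation (the ISO 5725 convention), with s_r estimated from paired duplicate determinations. A sketch under that assumption, with made-up duplicate log10 counts:

```python
import math

def repeatability_limit(duplicates):
    """Repeatability limit r = 2.8 * s_r (ISO 5725 convention), with s_r
    estimated from paired duplicates: s_r^2 = mean(d^2) / 2, where d is the
    difference within each duplicate pair of log10 counts."""
    d2 = [(a - b) ** 2 for a, b in duplicates]
    s_r = math.sqrt(sum(d2) / (2 * len(d2)))
    return 2.8 * s_r

# duplicate log10 cfu/g counts of the same samples on one medium (invented)
pairs = [(3.10, 3.05), (3.90, 3.98), (2.75, 2.70)]
r = repeatability_limit(pairs)
```

    The reproducibility limit R is computed the same way from pairs obtained under changed conditions (different analyst, day, or batch), which is why R ≥ r in the study's figures.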

  17. Assessment of Remote Sensing Technologies for Location of Hydrogen and Helium Leaks

    NASA Technical Reports Server (NTRS)

    Sellar, R. Glenn; Sohn, Yongho; Mathur, Varun; Reardon, Peter

    2001-01-01

    In Phase 1 of this project, a hierarchy of techniques for H2 and He leak location was developed. A total of twelve specific remote sensing techniques were evaluated; the results are summarized. A basic diffusion model was also developed to predict the concentration and distribution of H2 or He resulting from a leak. The objectives of Phase 2 of the project consisted of the following four tasks: Advance Rayleigh Doppler technique from TRL 1 to TRL 2; Plan to advance Rayleigh Doppler technique from TRL 2 to TRL 3; Advance researchers and resources for further advancement; Extend diffusion model.

  18. Predicting Information Flows in Network Traffic.

    ERIC Educational Resources Information Center

    Hinich, Melvin J.; Molyneux, Robert E.

    2003-01-01

    Discusses information flow in networks and predicting network traffic and describes a study that uses time series analysis on a day's worth of Internet log data. Examines nonlinearity and traffic invariants, and suggests that prediction of network traffic may not be possible with current techniques. (Author/LRW)

  19. South-East Asia's Trembling Rainforests.

    ERIC Educational Resources Information Center

    Laird, John

    1991-01-01

    This discussion focuses on potential solutions to the degradation of rainforests in Southeast Asia caused by indiscriminate logging, inappropriate road-construction techniques, forest fires, and the encroachment upon watersheds by both agricultural concerns and peasant farmers. Vignettes illustrate the impact of this degradation upon the animals,…

  20. Improving quantitative structure-activity relationship models using Artificial Neural Networks trained with dropout.

    PubMed

    Mendenhall, Jeffrey; Meiler, Jens

    2016-02-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery pose unique challenges for ML techniques, such as heavily biased dataset composition and a relatively large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both enrichment false positive rate and log-scaled area under the receiver-operating characteristic curve (logAUC) by 22-46% over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods.
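
    The dropout technique itself is simple to state; a minimal NumPy sketch of the usual "inverted dropout" forward pass (a generic illustration, not the authors' ANN code):

```python
import numpy as np

def dropout_forward(x, rate, rng, train=True):
    """Inverted dropout: randomly zero a fraction `rate` of activations during
    training and rescale the survivors by 1/(1-rate), so the expected
    activation is unchanged and no rescaling is needed at prediction time."""
    if not train or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate   # True = unit survives this pass
    return x * mask / (1.0 - rate)

rng = np.random.default_rng(0)
h = np.ones((4, 8))                      # a hidden-layer activation batch
h_drop = dropout_forward(h, 0.25, rng)   # survivors scaled to 4/3
h_eval = dropout_forward(h, 0.25, rng, train=False)  # identity at test time
```

    Training with a fresh random mask each pass is what prevents co-adaptation of units, the effect the benchmark above measures on QSAR data.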

  1. Improving Quantitative Structure-Activity Relationship Models using Artificial Neural Networks Trained with Dropout

    PubMed Central

    Mendenhall, Jeffrey; Meiler, Jens

    2016-01-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery (LB-CADD) pose unique challenges for ML techniques, such as heavily biased dataset composition, and relatively large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both Enrichment false positive rate (FPR) and log-scaled area under the receiver-operating characteristic curve (logAUC) by 22–46% over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods. PMID:26830599

  2. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of linear accelerator trajectory log files on Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse, with the three techniques of step and shoot (SS), sliding window (SW) and RapidArc (RA), was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that the HN IMRT plans were the most sensitive: a 0.2 mm systematic error produced a 0.7% dose deviation on average. The MLC random errors did not affect the dose deviation. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
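
    The deviation percentages in the Results are plain relative differences between the two dose recomputations; for illustration (the dose values below are hypothetical):

```python
def dose_deviation_pct(plan_dose, trajectory_dose):
    """Percent deviation between the dose recomputed from the DICOM-RT plan
    and the dose recomputed from the trajectory log file."""
    return 100.0 * (trajectory_dose - plan_dose) / plan_dose

# e.g. a VMAT point dose of 2.000 Gy (plan) vs 2.012 Gy (trajectory log)
dev = dose_deviation_pct(2.000, 2.012)  # 0.6%, on the order of the RA results
```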

  3. Pesticide and trace metal occurrence and aquatic benchmark exceedances in surface waters and sediments of urban wetlands and retention ponds in Melbourne, Australia.

    PubMed

    Allinson, Graeme; Zhang, Pei; Bui, AnhDuyen; Allinson, Mayumi; Rose, Gavin; Marshall, Stephen; Pettigrove, Vincent

    2015-07-01

    Samples of water and sediments were collected from 24 urban wetlands in Melbourne, Australia, in April 2010, and tested for more than 90 pesticides using a range of gas chromatographic (GC) and liquid chromatographic (LC) techniques, sample 'hormonal' activity using yeast-based recombinant receptor-reporter gene bioassays, and trace metals using spectroscopic techniques. At the time of sampling, there was almost no estrogenic activity in the water column. Twenty-three different pesticide residues were observed in one or more water samples from the 24 wetlands; chemicals observed at more than 40% of sites were simazine (100%), atrazine (79%), and metalaxyl and terbutryn (46%). Using the toxicity unit (TU) concept, less than 15% of the detected pesticides were considered to pose an individual, short-term risk to fish or zooplankton in the ponds and wetlands. However, one pesticide (fenvalerate) may have posed a possible short-term risk to fish (log10TUf > -3), and three pesticides (azoxystrobin, fenamiphos and fenvalerate) may have posed a risk to zooplankton (log10TUzp between -2 and -3); all the photosystem II (PSII) inhibiting herbicides may have posed a risk to primary producers in the ponds and wetlands (log10TUap and/or log10TUalg > -3). The wetland sediments were contaminated with 16 different pesticides; no chemicals were observed at more than one third of sites, but based on frequency of detection and concentrations, bifenthrin (33%, maximum 59 μg/kg) is the priority insecticide of concern for the sediments studied. Five sites returned a TU greater than the possible effect threshold (i.e. log10TU > 1) as a result of bifenthrin contamination of their sediments. Most sediments did not exceed Australian sediment quality guideline levels for trace metals. 
However, more than half of the sites had threshold effect concentration quotients (TECQ) values >1 for Cu (58%), Pb (50%), Ni (67%) and Zn (63%), and 75% of sites had mean probable effect concentration quotients (PECQ) >0.2, suggesting that the collected sediments may have been having some impact on sediment-dwelling organisms.
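
    The toxicity-unit thresholds quoted above are concentration-to-EC50 ratios on a log10 scale; a minimal sketch of the calculation (the concentration and EC50 values below are hypothetical):

```python
import math

def log10_tu(concentration, ec50):
    """Toxicity unit on a log scale: TU = measured concentration / EC50 for a
    given taxon, so log10 TU = -3 means the sample sits at 0.1% of the
    median-effect concentration."""
    return math.log10(concentration / ec50)

# a pesticide at 0.5 ug/L against a zooplankton EC50 of 100 ug/L falls
# inside the paper's zooplankton risk band of log10 TU between -2 and -3
tu = log10_tu(0.5, 100.0)
```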

  4. Linear modeling of the soil-water partition coefficient normalized to organic carbon content by reversed-phase thin-layer chromatography.

    PubMed

    Andrić, Filip; Šegan, Sandra; Dramićanin, Aleksandra; Majstorović, Helena; Milojković-Opsenica, Dušanka

    2016-08-05

    Soil-water partition coefficient normalized to the organic carbon content (KOC) is one of the crucial properties influencing the fate of organic compounds in the environment. Chromatographic methods are a well-established alternative to the direct sorption techniques used for KOC determination. The present work proposes reversed-phase thin-layer chromatography (RP-TLC) as a simpler, yet equally accurate, alternative to the officially recommended HPLC technique. Several TLC systems were studied, including octadecyl-(RP18) and cyano-(CN) modified silica layers in combination with methanol-water and acetonitrile-water mixtures as mobile phases. In total, 50 compounds of different molecular shape and size, and of various ability to establish specific interactions, were selected (phenols, benzodiazepines, triazine herbicides, and polyaromatic hydrocarbons). A calibration set of 29 compounds with known logKOC values determined by sorption experiments was used to build simple univariate calibrations, Principal Component Regression (PCR), and Partial Least Squares (PLS) models between logKOC and TLC retention parameters. The models exhibit good statistical performance, indicating that CN-layers contribute better to logKOC modeling than RP18-silica. The most promising TLC methods, the officially recommended HPLC method, and four in silico estimation approaches were compared by the non-parametric Sum of Ranking Differences (SRD) approach. The best estimations of logKOC values were achieved by simple univariate calibration of TLC retention data involving CN-silica layers and a moderate content of methanol (40-50% v/v); they ranked far better than the officially recommended HPLC method, which ranked in the middle. The worst estimates were obtained from in silico computations based on the octanol-water partition coefficient. A Linear Solvation Energy Relationship study revealed that the increased polarity of CN-layers over RP18 in combination with methanol-water mixtures is the key to better modeling of logKOC, through a significant diminishing of the dipolar and proton-accepting influence of the mobile phase as well as an enhancement of the excess molar refractivity of the chromatographic systems. Copyright © 2016 Elsevier B.V. All rights reserved.
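
    The winning approach, simple univariate calibration of TLC retention data, amounts to an ordinary least-squares line between a retention parameter and logKOC. A sketch with invented retention/logKOC pairs (not the paper's calibration set):

```python
import numpy as np

# hypothetical retention parameters on a CN layer with moderate methanol
# content, paired with sorption-derived logKOC values of calibration solutes
rm = np.array([0.10, 0.35, 0.62, 0.88, 1.20])
logkoc = np.array([1.8, 2.4, 3.1, 3.7, 4.5])

# simple univariate calibration: one straight-line fit
slope, intercept = np.polyfit(rm, logkoc, 1)

def predict_logkoc(rm_new):
    """Estimate logKOC of a test compound from its TLC retention parameter."""
    return slope * rm_new + intercept
```

    Once calibrated, a single TLC run on a new compound yields its retention parameter and hence a logKOC estimate, which is the practical appeal over direct sorption experiments.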

  5. Predicting both passive intestinal absorption and the dissociation constant toward albumin using the PAMPA technique.

    PubMed

    Bujard, Alban; Sol, Marine; Carrupt, Pierre-Alain; Martel, Sophie

    2014-10-15

    The parallel artificial membrane permeability assay (PAMPA) is a high-throughput screening (HTS) method that is widely used to predict in vivo passive permeability through biological barriers, such as the skin, the blood brain barrier (BBB) and the gastrointestinal tract (GIT). The PAMPA technique has also been used to predict the dissociation constant (Kd) between a compound and human serum albumin (HSA) while disregarding passive permeability. Furthermore, the assay is based on the use of two separate 5-point kinetic experiments, which increases the analysis time. In the present study, we adapted the hexadecane membrane (HDM)-PAMPA assay to both predict passive gastrointestinal absorption via the permeability coefficient logPe value and determine the Kd. Two assays were performed: one in the presence and one in the absence of HSA in the acceptor compartment. In the absence of HSA, logPe values were determined after a 4-h incubation time, as originally described, but the dimethylsulfoxide (DMSO) percentage and pH were altered to be compatible with the protein. In parallel, a second PAMPA assay was performed in the presence of HSA during a 16-h incubation period. By adding HSA, a variation in the amount of compound crossing the membrane was observed compared to the permeability measured in the absence of HSA. The concentration of compound reaching the acceptor compartment in each case was used to determine both parameters (logPe and logKd) using numerical simulations, which highlighted the originality of this method because these calculations required only two endpoint measurements instead of a complete kinetic study. It should be noted that the amount of compound that reaches the acceptor compartment in the presence of HSA is modulated by complex dissociation in the receptor compartment. Only compounds that are moderately bound to albumin (-3

  6. The association of serum angiogenic growth factors with renal structure and function in patients with adult autosomal dominant polycystic kidney disease.

    PubMed

    Coban, Melahat; Inci, Ayca

    2018-07-01

    Autosomal dominant polycystic kidney disease (ADPKD) is a common congenital chronic kidney disease (CKD). We report here the relationship of serum angiopoietin-1 (Ang-1), Ang-2, and vascular endothelial growth factor (VEGF) with total kidney volume (TKV), total cyst volume (TCV), and renal failure in adult ADPKD patients at various stages of CKD. This cross-sectional study was conducted with 50 patients diagnosed with ADPKD and a control group of 45 age-matched healthy volunteers. In the patient group, TKV and TCV were determined with upper abdominal magnetic resonance imaging, whereas in controls, TKV was determined with ultrasonography according to the ellipsoid formula. Renal function was assessed with serum creatinine, estimated glomerular filtration rate (eGFR), and spot urinary protein/creatinine ratio (UPCR). Ang-1, Ang-2, and VEGF were measured using enzyme-linked immunosorbent assay. Patients with ADPKD had significantly higher TKV (p < 0.001) and UPCR (p < 0.001), and lower eGFR (p ≤ 0.001) compared to the controls. Log10 Ang-2 was found to be higher in ADPKD patients at all CKD stages. Multiple linear regression analysis showed that there was no association between log10 Ang-1, log10 Ang-2, or log10 VEGF and creatinine, eGFR, UPCR, or log10 TKV (p > 0.05). There was no association of serum angiogenic growth factors with TKV or renal failure in ADPKD patients. Increased serum Ang-2 observed in stages 1-2 CKD suggests that angiogenesis plays a role in the progression of early stage ADPKD, but not at later stages of the disease. This may be explained by possible cessation of angiogenesis in advanced stages of CKD due to the increased number of sclerotic glomeruli.

  7. Test-bench system for a borehole azimuthal acoustic reflection imaging logging tool

    NASA Astrophysics Data System (ADS)

    Liu, Xianping; Ju, Xiaodong; Qiao, Wenxiao; Lu, Junqiang; Men, Baiyong; Liu, Dong

    2016-06-01

    The borehole azimuthal acoustic reflection imaging logging tool (BAAR) is a new generation of imaging logging tool that can investigate strata in a relatively large region of space around the borehole. The BAAR is designed on the principle of modularization and has a very complex structure, so a dedicated test-bench system for debugging each module of the BAAR has become an urgent need. With the test-bench system introduced in this paper, testing and calibration of the BAAR can be easily achieved. The test-bench system is designed on the client/server model. The hardware system mainly consists of a host computer, an embedded controlling board, a bus interface board, a data acquisition board and a telemetry communication board. The host computer serves as the human-machine interface and processes the uploaded data. The software running on the host computer is developed in VC++. The embedded controlling board uses an Advanced RISC Machines 7 (ARM7) chip as the microcontroller and communicates with the host computer via Ethernet. The software for the embedded controlling board is developed on the embedded operating system uClinux. The bus interface board, data acquisition board and telemetry communication board are designed around a field-programmable gate array (FPGA) and provide test interfaces for the logging tool. To examine the feasibility of the test-bench system, it was set up to perform a test on the BAAR. By analyzing the test results, an unqualified channel in the electronic receiving cabin was discovered. The test-bench system can thus quickly determine the working condition of the sub-modules of the BAAR, which is of great significance in improving production efficiency and accelerating industrial production of the logging tool.

  8. Emerging Technologies for Assessing Physical Activity Behaviors in Space and Time

    PubMed Central

    Hurvitz, Philip M.; Moudon, Anne Vernez; Kang, Bumjoon; Saelens, Brian E.; Duncan, Glen E.

    2014-01-01

Precise measurement of physical activity is important for health research, providing a better understanding of activity location, type, duration, and intensity. This article describes a novel suite of tools to measure and analyze physical activity behaviors in spatial epidemiology research. We use individual-level, high-resolution, objective data collected in a space-time framework to investigate built and social environment influences on activity. First, we collect data with accelerometers, global positioning system units, and smartphone-based digital travel and photo diaries to overcome many limitations inherent in self-reported data. Behaviors are measured continuously over the full spectrum of environmental exposures in daily life, instead of focusing exclusively on the home neighborhood. Second, data streams are integrated using common timestamps into a single data structure, the “LifeLog.” A graphic interface tool, “LifeLog View,” enables simultaneous visualization of all LifeLog data streams. Third, we use geographic information system SmartMap rasters to measure spatially continuous environmental variables to capture exposures at the same spatial and temporal scale as in the LifeLog. These technologies enable precise measurement of behaviors in their spatial and temporal settings but also generate very large datasets; we discuss current limitations and promising methods for processing and analyzing such large datasets. Finally, we provide applications of these methods in spatially oriented research, including a natural experiment to evaluate the effects of new transportation infrastructure on activity levels, and a study of neighborhood environmental effects on activity using twins as quasi-causal controls to overcome self-selection and reverse causation problems. 
In summary, the integrative characteristics of large datasets contained in LifeLogs and SmartMaps hold great promise for advancing spatial epidemiologic research to promote healthy behaviors. PMID:24479113
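The timestamp-based integration behind the LifeLog can be sketched minimally in a few lines; the streams, field names, and values below are invented for illustration, not taken from the study:

```python
# Hypothetical minimal illustration of the "LifeLog" idea: two sensor streams
# (accelerometer counts and GPS fixes) sampled on a common epoch are joined
# by timestamp into one record per epoch.
accel = {"2014-01-01T08:00:00": 1320, "2014-01-01T08:00:30": 2105}  # counts/epoch
gps = {"2014-01-01T08:00:00": (47.61, -122.33),
       "2014-01-01T08:00:30": (47.62, -122.33)}                     # (lat, lon)

# join on shared timestamps into a single LifeLog-style structure
lifelog = [
    {"t": t, "counts": accel[t], "lat": gps[t][0], "lon": gps[t][1]}
    for t in sorted(set(accel) & set(gps))
]
```

A production pipeline would additionally handle clock drift, differing sampling rates, and missing fixes, which this sketch ignores.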

  9. Influence of drilling operations on drilling mud gas monitoring during IODP Exp. 338 and 348

    NASA Astrophysics Data System (ADS)

    Hammerschmidt, Sebastian; Toczko, Sean; Kubo, Yusuke; Wiersberg, Thomas; Fuchida, Shigeshi; Kopf, Achim; Hirose, Takehiro; Saffer, Demian; Tobin, Harold; Expedition 348 Scientists, the

    2014-05-01

Scientific ocean drilling has developed new techniques and technologies for drilling science, dynamic positioning being one of the most famous. However, while industry has developed newer tools and techniques, only some of these have been used in scientific ocean drilling. The introduction of riser drilling, which recirculates the drilling mud and returns formation solids and gases to the platform, came to the Integrated Ocean Drilling Program (IODP) with the launch of the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) riser-drilling vessel D/V Chikyu, making some of these techniques available to science. IODP Expedition 319 (NanTroSEIZE Stage 2: riser/riserless observatory) was the first such attempt, and among the tools and techniques used was drilling mud gas analysis. While industry regularly conducts drilling mud gas logging for safety monitoring and reservoir evaluation, science is more interested in other components (e.g., He, 222Rn) that are beyond the scope of typical mud logging services. Drilling mud gas logging simply examines the gases released into the drilling mud during drilling: the bit breaks and grinds the formation, releasing any trapped gases. These circulate within the "closed-circuit" mud flow back to the drilling rig, where a degasser extracts them and passes them on to a dedicated mud gas logging unit. The unit contains gas chromatographs, mass spectrometers, spectral analyzers, radon gas analyzers, and a methane carbon isotope analyzer. Data are collected and stored in a database, together with several drilling parameters (rate of penetration, mud density, etc.). This initial attempt was further refined during IODP Expeditions 337 (Deep Coalbed Biosphere off Shimokita), 338 (NanTroSEIZE Stage 3: NanTroSEIZE Plate Boundary Deep Riser 2) and finally 348 (NanTroSEIZE Stage 3: NanTroSEIZE Plate Boundary Deep Riser 3). 
Although still in its development stage for scientific application, this technique can provide a valuable suite of measurements to complement more traditional IODP shipboard measurements. Here we present unpublished data from IODP Expeditions 338 and 348, penetrating the Nankai Accretionary wedge to 3058.5 meters below seafloor. Increasing mud density decreased degasser efficiency, especially for higher hydrocarbons. Blurring of the relative variations in total gas by depth was observed, and confirmed with comparison to headspace gas concentrations from the cored interval. Theoretically, overpressured zones in the formation can be identified through C2/C3 ratios, but these ratios are highly affected by changing drilling parameters. Proper mud gas evaluations will need to carefully consider the effects of variable drilling parameters when designing experiments and interpreting the data.
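As a rough illustration of the gas-ratio screening mentioned above, the sketch below computes C2/C3 (ethane/propane) ratios from hypothetical mud gas readings; real interpretation would also have to correct for the drilling-parameter effects described in the abstract:

```python
def c2_c3_ratio(ethane_ppm, propane_ppm):
    """Ethane/propane ratio, used qualitatively as an overpressure indicator."""
    return float("nan") if propane_ppm == 0 else ethane_ppm / propane_ppm

# illustrative (not measured) mud gas values at three depths
depths_mbsf = [2800, 2900, 3000]   # metres below seafloor
c2_ppm = [120.0, 130.0, 400.0]
c3_ppm = [60.0, 62.0, 65.0]
ratios = [c2_c3_ratio(a, b) for a, b in zip(c2_ppm, c3_ppm)]
```

A ratio that departs strongly from the background trend (as at the deepest point here) would be the kind of anomaly worth cross-checking against mud density and rate of penetration before interpreting it as formation overpressure.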

  10. Advanced Neuroimaging in Traumatic Brain Injury

    PubMed Central

    Edlow, Brian L.; Wu, Ona

    2013-01-01

    Advances in structural and functional neuroimaging have occurred at a rapid pace over the past two decades. Novel techniques for measuring cerebral blood flow, metabolism, white matter connectivity, and neural network activation have great potential to improve the accuracy of diagnosis and prognosis for patients with traumatic brain injury (TBI), while also providing biomarkers to guide the development of new therapies. Several of these advanced imaging modalities are currently being implemented into clinical practice, whereas others require further development and validation. Ultimately, for advanced neuroimaging techniques to reach their full potential and improve clinical care for the many civilians and military personnel affected by TBI, it is critical for clinicians to understand the applications and methodological limitations of each technique. In this review, we examine recent advances in structural and functional neuroimaging and the potential applications of these techniques to the clinical care of patients with TBI. We also discuss pitfalls and confounders that should be considered when interpreting data from each technique. Finally, given the vast amounts of advanced imaging data that will soon be available to clinicians, we discuss strategies for optimizing data integration, visualization and interpretation. PMID:23361483

  11. Molecular methods (digital PCR and real-time PCR) for the quantification of low copy DNA of Phytophthora nicotianae in environmental samples.

    PubMed

    Blaya, Josefa; Lloret, Eva; Santísima-Trinidad, Ana B; Ros, Margarita; Pascual, Jose A

    2016-04-01

Currently, real-time polymerase chain reaction (qPCR) is the technique most often used to quantify pathogen presence. Digital PCR (dPCR) is a newer technique with the potential to have a substantial impact on plant pathology research owing to its reproducibility, sensitivity and low susceptibility to inhibitors. In this study, we evaluated the feasibility of using dPCR and qPCR to quantify Phytophthora nicotianae in several background matrices, including host tissues (stems and roots) and soil samples. In spite of the narrower dynamic range of dPCR (3 logs, compared with 7 logs for qPCR), the technique proved to have very high precision at very low copy numbers. dPCR was able to detect the pathogen accurately in all types of samples over a broad concentration range. Moreover, dPCR appears to be less susceptible to inhibitors than qPCR in plant samples. Linear regression analysis showed a high correlation between the results obtained with the two techniques in soil, stem and root samples, with R² = 0.873, 0.999 and 0.995, respectively. These results suggest that dPCR is a promising alternative for quantifying soil-borne pathogens in environmental samples, even in early stages of disease. © 2015 Society of Chemical Industry.
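The abstract does not spell out the dPCR math, but absolute quantification in digital PCR conventionally applies a Poisson correction to the fraction of positive partitions; a minimal sketch (the partition volume is an assumed example value, not taken from the study):

```python
import math

def dpcr_copies_per_ul(positive, total, partition_vol_ul):
    """Poisson-corrected dPCR quantification: the mean number of copies per
    partition is lambda = -ln(1 - p), where p is the fraction of positive
    partitions; dividing by partition volume gives copies per microlitre."""
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / partition_vol_ul
```

With half the partitions positive and a (hypothetical) 1 µL partition, the estimate is ln 2 ≈ 0.693 copies/µL; the correction matters most near saturation, where nearly all partitions contain at least one copy.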

  12. Establishment of a 12-gene expression signature to predict colon cancer prognosis

    PubMed Central

    Zhao, Guangxi; Dong, Pingping; Wu, Bingrui

    2018-01-01

A robust and accurate gene expression signature is essential to assist oncologists in determining which subset of patients at similar Tumor-Lymph Node-Metastasis (TNM) stage has high recurrence risk and could benefit from adjuvant therapies. Here we applied a two-step supervised machine-learning method and established a 12-gene expression signature to precisely predict colon adenocarcinoma (COAD) prognosis, using COAD RNA-seq transcriptome data from The Cancer Genome Atlas (TCGA). The predictive performance of the 12-gene signature was validated with two independent gene expression microarray datasets: GSE39582, which includes 566 COAD cases used to develop six molecular subtypes with distinct clinical, molecular and survival characteristics, and GSE17538, a dataset of 232 colon cancer patients used to generate a metastasis gene expression profile predicting recurrence and death in COAD patients. The signature effectively separated poor-prognosis patients from the good-prognosis group in GSE17538 (disease-specific survival (DSS): Kaplan-Meier (KM) Log Rank p = 0.0034; overall survival (OS): KM Log Rank p = 0.0336). For patients with a proficient mismatch repair system (pMMR) in GSE39582, the signature also effectively distinguished the high-risk group from the low-risk group (OS: KM Log Rank p = 0.005; relapse-free survival (RFS): KM Log Rank p = 0.022). Interestingly, advanced-stage patients were significantly enriched in the high 12-gene score group (Fisher’s exact test p = 0.0003). After stage stratification, the signature could still distinguish poor-prognosis from good-prognosis patients in GSE17538 within stage II (Log Rank p = 0.01) and stage II & III (Log Rank p = 0.017) for DFS. Among stage III or stage II/III pMMR patients treated with adjuvant chemotherapy (ACT), patients with higher 12-gene scores showed poorer prognosis (III, OS: KM Log Rank p = 0.046; III & II, OS: KM Log Rank p = 0.041). 
Among stage II/III pMMR patients with lower 12-gene scores in GSE39582, the subgroup receiving ACT showed significantly longer OS than those who received no ACT (Log Rank p = 0.021), whereas there was no significant difference between these subgroups among patients with higher 12-gene scores (Log Rank p = 0.12). Beyond COAD, the 12-gene signature also has prognostic value in several other cancer types, including kidney cancer, lung cancer, uveal and skin melanoma, brain cancer, and pancreatic cancer. Functional classification showed that seven of the twelve genes are involved in immune system function and regulation, so the 12-gene signature could potentially be used to guide decisions about adjuvant therapy for patients with stage II/III pMMR COAD.
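As a hedged illustration of how an expression signature is typically turned into risk groups, the sketch below scores patients with a weighted sum of expression values and splits them at the median score; the genes, weights, and expression values are invented, not the paper's 12 genes or fitted coefficients:

```python
import statistics

# hypothetical signature: risk score as a weighted sum of expression values
weights = {"G1": 0.8, "G2": -0.5, "G3": 1.2}  # illustrative coefficients only

def risk_score(expression):
    """Weighted sum of a patient's expression values over the signature genes."""
    return sum(weights[g] * expression[g] for g in weights)

patients = {
    "P1": {"G1": 2.0, "G2": 1.0, "G3": 3.0},
    "P2": {"G1": 0.5, "G2": 2.5, "G3": 0.2},
    "P3": {"G1": 1.5, "G2": 1.5, "G3": 2.0},
}
scores = {p: risk_score(e) for p, e in patients.items()}
cutoff = statistics.median(scores.values())
groups = {p: ("high" if s > cutoff else "low") for p, s in scores.items()}
```

In practice the groups would then be compared with Kaplan-Meier curves and a log-rank test, as the abstract reports.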

  13. Review of advanced imaging techniques

    PubMed Central

    Chen, Yu; Liang, Chia-Pin; Liu, Yang; Fischer, Andrew H.; Parwani, Anil V.; Pantanowitz, Liron

    2012-01-01

    Pathology informatics encompasses digital imaging and related applications. Several specialized microscopy techniques have emerged which permit the acquisition of digital images (“optical biopsies”) at high resolution. Coupled with fiber-optic and micro-optic components, some of these imaging techniques (e.g., optical coherence tomography) are now integrated with a wide range of imaging devices such as endoscopes, laparoscopes, catheters, and needles that enable imaging inside the body. These advanced imaging modalities have exciting diagnostic potential and introduce new opportunities in pathology. Therefore, it is important that pathology informaticists understand these advanced imaging techniques and the impact they have on pathology. This paper reviews several recently developed microscopic techniques, including diffraction-limited methods (e.g., confocal microscopy, 2-photon microscopy, 4Pi microscopy, and spatially modulated illumination microscopy) and subdiffraction techniques (e.g., photoactivated localization microscopy, stochastic optical reconstruction microscopy, and stimulated emission depletion microscopy). This article serves as a primer for pathology informaticists, highlighting the fundamentals and applications of advanced optical imaging techniques. PMID:22754737

  14. Advanced Tomographic Imaging Methods for the Analysis of Materials

    DTIC Science & Technology

    1991-08-01

used in composite manufacture: aluminum, silicon carbide, and titanium aluminide. Also depicted in Fig. 2 are the energy intervals which can...SiC fiber (SCS6) in a titanium-aluminide matrix. The contrast between SiC and TiAl is only 10% over a broad energy range. Therefore, distinguishing the...borehole logging, corrodent detection on turbine blades, kerogen analysis of shale, and contents of coals (sulfur, minerals, and Btu). APSTNG

  15. Multiscale Models of Melting Arctic Sea Ice

    DTIC Science & Technology

    2014-09-30

from weakly to highly correlated, or Poissonian toward Wigner-Dyson, as a function of system connectedness. This provides a mechanism for explaining...eluded us. Court Strong found such a method. It creates an optimal fit of a hyperbolic tangent model for the fractal dimension as a function of log A...actual melt pond images, and have made significant advances in the underlying functional and numerical analysis needed for these computations

  16. Disinfection of an advanced primary effluent using peracetic acid or ultraviolet radiation for its reuse in public services.

    PubMed

    Julio, Flores R; Hilario, Terres-Peña; Mabel, Vaca M; Raymundo, López C; Arturo, Lizardi-Ramos; Ma Neftalí, Rojas-Valencia

    2015-03-01

The disinfection of a continuous flow of effluent from an advanced primary treatment (coagulation-flocculation-sedimentation), with or without subsequent filtration, using either peracetic acid (PAA) or ultraviolet (UV) radiation was studied. The aim was to achieve the bacteriological quality required by the microbiological standard established in the Mexican regulations for treated wastewater reuse (NOM-003-SEMARNAT-1997), i.e., less than 240 MPN (most probable number) FC/100 mL. The PAA concentrations were 10, 15, and 20 mg/L, with contact times of 10 and 15 min. Fecal coliform (FC) inactivation ranged from 0.93 to 6.4 log units, and in all cases it reached the limits set by the regulation. Water quality influenced PAA disinfection effectiveness: an efficiency of 91% was achieved for the unfiltered effluent, compared with 99% when the wastewater was filtered. UV radiation was applied to wastewater flows of 21, 30 and 39 L/min, with dosages from 1 to 6 mJ/cm². This treatment did not achieve the bacteriological quality required for treated wastewater reuse, since the best FC inactivation was 1.62 log units, for a flow of 21 L/min of filtered wastewater and a UV dosage of 5.6 mJ/cm².

  17. Collisional-radiative switching - A powerful technique for converging non-LTE calculations

    NASA Technical Reports Server (NTRS)

    Hummer, D. G.; Voels, S. A.

    1988-01-01

    A very simple technique has been developed to converge statistical equilibrium and model atmospheric calculations in extreme non-LTE conditions when the usual iterative methods fail to converge from an LTE starting model. The proposed technique is based on a smooth transition from a collision-dominated LTE situation to the desired non-LTE conditions in which radiation dominates, at least in the most important transitions. The proposed approach was used to successfully compute stellar models with He abundances of 0.20, 0.30, and 0.50; Teff = 30,000 K, and log g = 2.9.
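The switching idea can be caricatured as a continuation method. The toy sketch below uses an invented two-level balance (the rates and the mock radiation term are arbitrary, not from the paper): collisional rates are boosted by a factor X so the iteration starts near LTE, each converged solution seeds the next stage, and X is reduced smoothly to 1.

```python
def solve_populations(X, b_start):
    """Damped fixed-point iteration for a toy departure coefficient b.
    X multiplies the collisional rates; X >> 1 forces the LTE limit (b -> 1)."""
    A21, C21 = 1.0, 1e-3          # toy radiative and collisional de-excitation rates
    b = b_start
    for _ in range(200):
        J = 0.5 / (1.0 + b)       # mock mean intensity, depends nonlinearly on b
        b_new = (X * C21 + J) / (X * C21 + A21 * J + 0.1)
        b = 0.5 * (b + b_new)     # damping for stability
    return b

# collisional-radiative switching: converge a sequence of models, each starting
# from the previous solution, while the collisional boost X is reduced to 1
b = 1.0                           # LTE starting value
for X in (1e6, 1e4, 1e2, 1e1, 1.0):
    b = solve_populations(X, b)
```

The point is the outer loop, not the toy physics: each intermediate model is easy to converge because it is close to the previous one, which is how the method carries an LTE starting model into the radiation-dominated regime.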

  18. Log Houses in les Laurentides. From Oral Tradition to AN Integrated Digital Documentation Based on the Re-Discovery of the Traditional Constructive-Geographical `REPERTOIRES' Through Digital Bim Data Archive

    NASA Astrophysics Data System (ADS)

    Esponda, M.; Piraino, F.; Stanga, C.; Mezzino, D.

    2017-08-01

This paper presents an integrated approach combining digital documentation workflows and historical research to document log houses, outstanding examples of vernacular architecture in Quebec, focusing on their geometrical-dimensional characteristics as well as on the intangible elements associated with these historical structures. The 18 log houses selected in the Laurentians represent the material culture of how settlers adapted to the harsh Quebec environment at the end of the nineteenth century. The paper describes results of research led by Professor Mariana Esponda (Carleton University) beginning in 2015; the digital documentation was carried out through the grant New Paradigm/New Tools for Architectural Heritage in Canada, supported by an SSHRC Training Program (May-August 2016). The workflow started with digital documentation, accomplished with laser scanning techniques, followed by onsite observations and archival research. This led to the creation of an 'abacus', a first step in the development of a territorial-historical database of the log houses, potentially updatable by other researchers. Another important part of the documentation has been the development of Historic Building Information Models (HBIM), fundamental for analyzing the geometry of the logs and understanding how these constructions were built. The realization of HBIMs was a first step in the modeling of irregular shapes such as those of the logs; different Levels of Detail were adopted to show how the models can be used for different purposes. In the future, they can potentially be used to create a virtual tour app for the storytelling of these buildings.

  19. Alternative Hand Contamination Technique To Compare the Activities of Antimicrobial and Nonantimicrobial Soaps under Different Test Conditions▿

    PubMed Central

    Fuls, Janice L.; Rodgers, Nancy D.; Fischler, George E.; Howard, Jeanne M.; Patel, Monica; Weidner, Patrick L.; Duran, Melani H.

    2008-01-01

    Antimicrobial hand soaps provide a greater bacterial reduction than nonantimicrobial soaps. However, the link between greater bacterial reduction and a reduction of disease has not been definitively demonstrated. Confounding factors, such as compliance, soap volume, and wash time, may all influence the outcomes of studies. The aim of this work was to examine the effects of wash time and soap volume on the relative activities and the subsequent transfer of bacteria to inanimate objects for antimicrobial and nonantimicrobial soaps. Increasing the wash time from 15 to 30 seconds increased reduction of Shigella flexneri from 2.90 to 3.33 log10 counts (P = 0.086) for the antimicrobial soap, while nonantimicrobial soap achieved reductions of 1.72 and 1.67 log10 counts (P > 0.6). Increasing soap volume increased bacterial reductions for both the antimicrobial and the nonantimicrobial soaps. When the soap volume was normalized based on weight (∼3 g), nonantimicrobial soap reduced Serratia marcescens by 1.08 log10 counts, compared to the 3.83-log10 reduction caused by the antimicrobial soap (P < 0.001). The transfer of Escherichia coli to plastic balls following a 15-second hand wash with antimicrobial soap resulted in a bacterial recovery of 2.49 log10 counts, compared to the 4.22-log10 (P < 0.001) bacterial recovery on balls handled by hands washed with nonantimicrobial soap. This indicates that nonantimicrobial soap was less active and that the effectiveness of antimicrobial soaps can be improved with longer wash time and greater soap volume. The transfer of bacteria to objects was significantly reduced due to greater reduction in bacteria following the use of antimicrobial soap. PMID:18441107
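The log10 reductions quoted above follow from a simple definition, sketched here for reference:

```python
import math

def log10_reduction(initial_cfu, surviving_cfu):
    """Log10 reduction = log10(initial / surviving); e.g. reducing 10**6 CFU
    to 10**3 CFU is a 3-log reduction (99.9% of the bacteria removed)."""
    return math.log10(initial_cfu / surviving_cfu)
```

For example, the 3.33-log reduction reported for the antimicrobial soap corresponds to fewer than 1 in 2000 inoculated bacteria surviving the wash.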

  20. Alternative hand contamination technique to compare the activities of antimicrobial and nonantimicrobial soaps under different test conditions.

    PubMed

    Fuls, Janice L; Rodgers, Nancy D; Fischler, George E; Howard, Jeanne M; Patel, Monica; Weidner, Patrick L; Duran, Melani H

    2008-06-01

    Antimicrobial hand soaps provide a greater bacterial reduction than nonantimicrobial soaps. However, the link between greater bacterial reduction and a reduction of disease has not been definitively demonstrated. Confounding factors, such as compliance, soap volume, and wash time, may all influence the outcomes of studies. The aim of this work was to examine the effects of wash time and soap volume on the relative activities and the subsequent transfer of bacteria to inanimate objects for antimicrobial and nonantimicrobial soaps. Increasing the wash time from 15 to 30 seconds increased reduction of Shigella flexneri from 2.90 to 3.33 log(10) counts (P = 0.086) for the antimicrobial soap, while nonantimicrobial soap achieved reductions of 1.72 and 1.67 log(10) counts (P > 0.6). Increasing soap volume increased bacterial reductions for both the antimicrobial and the nonantimicrobial soaps. When the soap volume was normalized based on weight (approximately 3 g), nonantimicrobial soap reduced Serratia marcescens by 1.08 log(10) counts, compared to the 3.83-log(10) reduction caused by the antimicrobial soap (P < 0.001). The transfer of Escherichia coli to plastic balls following a 15-second hand wash with antimicrobial soap resulted in a bacterial recovery of 2.49 log(10) counts, compared to the 4.22-log(10) (P < 0.001) bacterial recovery on balls handled by hands washed with nonantimicrobial soap. This indicates that nonantimicrobial soap was less active and that the effectiveness of antimicrobial soaps can be improved with longer wash time and greater soap volume. The transfer of bacteria to objects was significantly reduced due to greater reduction in bacteria following the use of antimicrobial soap.

  1. Inactivation of Escherichia coli O157:H7 on Orange Fruit Surfaces and in Juice Using Photocatalysis and High Hydrostatic Pressure.

    PubMed

    Yoo, Sungyul; Ghafoor, Kashif; Kim, Jeong Un; Kim, Sanghun; Jung, Bora; Lee, Dong-Un; Park, Jiyong

    2015-06-01

Nonpasteurized orange juice is manufactured by squeezing juice from fruit without peel removal. Fruit surfaces may carry pathogenic microorganisms that can contaminate squeezed juice. Titanium dioxide-UVC photocatalysis (TUVP), a nonthermal technique capable of microbial inactivation via generation of hydroxyl radicals, was used to decontaminate orange surfaces. Levels of spot-inoculated Escherichia coli O157:H7 (initial level of 7.0 log CFU/cm²) on oranges (12 cm²) were reduced by 4.3 log CFU/ml when treated with TUVP (17.2 mW/cm²). Reductions of 1.5, 3.9, and 3.6 log CFU/ml were achieved using tap water, chlorine (200 ppm), and UVC alone (23.7 mW/cm²), respectively. E. coli O157:H7 in juice from TUVP (17.2 mW/cm²)-treated oranges was reduced by 1.7 log CFU/ml. After orange juice was treated with high hydrostatic pressure (HHP) at 400 MPa for 1 min without any prior fruit surface disinfection, the level of E. coli O157:H7 was reduced by 2.4 log CFU/ml. However, the E. coli O157:H7 level in juice was reduced by 4.7 log CFU/ml (to lower than the detection limit) when TUVP treatment of oranges was followed by HHP treatment of juice, indicating a synergistic inactivation effect. The inactivation kinetics of E. coli O157:H7 on orange surfaces followed a biphasic model. HHP treatment did not affect the pH, °Brix, or color of juice. However, the ascorbic acid concentration and pectinmethylesterase activity were reduced by 35.1 and 34.7%, respectively.
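Biphasic inactivation kinetics of the kind mentioned above are commonly written in the Cerf two-population form; a minimal sketch with illustrative parameter values (not the study's fitted values):

```python
import math

def biphasic_log10_survival(t, f, k1, k2):
    """Cerf biphasic inactivation model: a sensitive subpopulation (fraction f,
    first-order rate k1) and a more resistant one (fraction 1 - f, rate k2 < k1).
    Returns log10(N(t)/N0)."""
    return math.log10(f * math.exp(-k1 * t) + (1.0 - f) * math.exp(-k2 * t))
```

The curve drops quickly while the sensitive fraction is killed off, then flattens to the slower rate of the resistant fraction, which produces the characteristic "shoulderless, tailed" survival plot.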

  2. Performance of Encounternet Tags: Field Tests of Miniaturized Proximity Loggers for Use on Small Birds

    PubMed Central

    Levin, Iris I.; Zonana, David M.; Burt, John M.; Safran, Rebecca J.

    2015-01-01

Proximity logging is a new tool for understanding social behavior as it allows for accurate quantification of social networks. We report results from field calibration and deployment tests of miniaturized proximity tags (Encounternet), digital transceivers that log encounters between tagged individuals. We examined radio signal behavior in relation to tag attachment (tag, tag on bird, tag on saline-filled balloon) to understand how radio signal strength is affected by the tag mounting technique used for calibration tests. We investigated inter-tag and inter-receiver station variability, and in each calibration test we accounted for the effects of antennae orientation. Additionally, we used data from a live deployment on breeding barn swallows (Hirundo rustica erythrogaster) to analyze the quality of the logs, including reciprocal agreement in dyadic logs. We evaluated the impact (in terms of mass changes) of tag attachment on the birds. We were able to statistically distinguish between RSSI values associated with different close-proximity (<5 m) tag-tag distances regardless of antennae orientation. Inter-tag variability was low, but we did find significant inter-receiver station variability. Reciprocal agreement of dyadic logs was high and social networks were constructed from proximity tag logs based on two different RSSI thresholds. There was no evidence of significant mass loss in the time birds were wearing tags. We conclude that proximity loggers are accurate and effective for quantifying social behavior. However, because RSSI and distance cannot be perfectly resolved, data from proximity loggers are most appropriate for comparing networks based on specific RSSI thresholds. The Encounternet system is flexible and customizable, and tags are now light enough for use on small animals (<50 g). PMID:26348329
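A minimal sketch of building threshold-based networks from dyadic proximity logs, including the kind of reciprocal-agreement check the authors describe; the tag IDs and RSSI values are invented:

```python
# dyadic encounter logs as (tag_a, tag_b, rssi_dbm); hypothetical values
logs = [("T1", "T2", -38), ("T2", "T1", -40), ("T1", "T3", -72), ("T3", "T1", -70)]

def network_edges(logs, rssi_threshold):
    """Keep an undirected edge only when both tags logged the encounter
    (reciprocal agreement) and every logged RSSI meets the threshold."""
    seen = {}
    for a, b, rssi in logs:
        key = tuple(sorted((a, b)))
        seen.setdefault(key, []).append(rssi)
    return {k for k, v in seen.items()
            if len(v) >= 2 and all(r >= rssi_threshold for r in v)}
```

A stricter (less negative) threshold keeps only the closest-range encounters, which is why the resulting networks should be compared at a fixed threshold rather than treated as absolute distance measurements.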

  3. Limited role of culture conversion for decision-making in individual patient care and for advancing novel regimens to confirmatory clinical trials.

    PubMed

Phillips, Patrick P J; Mendel, Carl M; Burger, Divan A; Crook, Angela M; Nunn, Andrew J; Dawson, Rodney; Diacon, Andreas H; Gillespie, Stephen H

    2016-02-04

    Despite recent increased clinical trials activity, no regimen has proved able to replace the standard 6-month regimen for drug-sensitive tuberculosis. Understanding the relationship between microbiological markers measured during treatment and long-term clinical outcomes is critical to evaluate their usefulness for decision-making for both individual patient care and for advancing novel regimens into time-consuming and expensive pivotal phase III trials. Using data from the randomized controlled phase III trial REMoxTB, we evaluated sputum-based markers of speed of clearance of bacilli: time to smear negative status; time to culture negative status on LJ or in MGIT; daily rate of change of log10(TTP) to day 56; and smear or culture results at weeks 6, 8 or 12; as individual- and trial-level surrogate endpoints for long-term clinical outcome. Time to culture negative status on LJ or in MGIT, time to smear negative status and daily rate of change in log10(TTP) were each independent predictors of clinical outcome, adjusted for treatment (p <0.001). However, discrimination between low and high risk patients, as measured by the c-statistic, was modest and not much higher than the reference model adjusted for BMI, history of smoking, HIV status, cavitation, gender and MGIT TTP. Culture conversion during treatment for tuberculosis, however measured, has only a limited role in decision-making for advancing regimens into phase III trials or in predicting the outcome of treatment for individual patients. REMoxTB ClinicalTrials.gov number: NCT00864383.

  4. Application of advanced techniques for the assessment of bio-stability of biowaste-derived residues: A minireview.

    PubMed

    Lü, Fan; Shao, Li-Ming; Zhang, Hua; Fu, Wen-Ding; Feng, Shi-Jin; Zhan, Liang-Tong; Chen, Yun-Min; He, Pin-Jing

    2018-01-01

Bio-stability is a key feature for the utilization and final disposal of biowaste-derived residues, such as aerobic compost or vermicompost of food waste, bio-dried waste, anaerobic digestate or landfilled waste. The present paper reviews conventional methods and advanced techniques used for the assessment of bio-stability. The conventional methods are reclassified into two categories. Advanced techniques, including spectroscopic (fluorescence, ultraviolet-visible, infrared, Raman, nuclear magnetic resonance), thermogravimetric and thermochemolysis analysis, are emphasized for their application to bio-stability assessment in recent years, and their principles, pros and cons are critically discussed. These advanced techniques prove convenient in sample preparation and supply diversified information. However, their viability as indicators of bio-stability ultimately depends on establishing their relationship with the conventional methods, especially those based on biotic response. Furthermore, some misuses in data interpretation should be noted. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Image processing system for the measurement of timber truck loads

    NASA Astrophysics Data System (ADS)

    Carvalho, Fernando D.; Correia, Bento A. B.; Davies, Roger; Rodrigues, Fernando C.; Freitas, Jose C. A.

    1993-01-01

The paper industry uses wood as its raw material. To track the solid volume of wood stocked in the plant, every truck load of sawn tree trunks entering the plant is measured to determine its volume; weighing the trunks is unreliable because of their high capacity for absorbing water. Image processing techniques were therefore used to evaluate the volume of a truck load of logs. The system is based on a PC equipped with an image processing board using data flow processors. Three cameras acquire images of the sides and rear of the truck: the lateral images contain information about the sectional area of the logs, and the rear image contains information about their length. The machine vision system and the implemented algorithms are described, and results obtained with the industrial prototype now installed in a paper mill are presented.
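The underlying volume estimate reduces to area times length. The sketch below is an assumed simplification of the system's computation, with an invented fill factor to discount air gaps between stacked logs; it is not the paper's calibrated algorithm:

```python
def load_volume_m3(section_area_m2, log_length_m, fill_factor=0.7):
    """Solid wood volume of a truck load of logs: cross-sectional area of the
    load (from the side-view cameras) times log length (from the rear-view
    camera), discounted by an assumed fill factor for gaps between logs."""
    return section_area_m2 * log_length_m * fill_factor
```

For example, a 6 m² load section of 2.4 m logs at a 0.7 fill factor yields roughly 10 m³ of solid wood; in practice the fill factor would be calibrated against scaled reference loads.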

  6. A generic standard additions based method to determine endogenous analyte concentrations by immunoassays to overcome complex biological matrix interference.

    PubMed

    Pang, Susan; Cowen, Simon

    2017-12-13

We describe a novel generic method to derive the unknown endogenous concentration of an analyte within a complex biological matrix (e.g. serum or plasma), based upon the relationship between the immunoassay signal response of a biological test sample spiked with known analyte concentrations and the log-transformed estimated total concentration. If the estimated total analyte concentration is correct, a portion of the sigmoid on a log-log plot is very close to linear, allowing the unknown endogenous concentration to be estimated numerically. This approach obviates conventional relative quantification against an internal standard curve and the need for calibrant diluent, and accounts for each sample's individual matrix interference by spiking the test sample itself. The technique is an adaptation of the standard-additions method used for chemical analytes. Unknown endogenous analyte concentrations within even 2-fold diluted human plasma may be determined reliably using as few as four reaction wells.
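One way to implement the described numerical search, sketched here under a simple assumed criterion (scan candidate endogenous concentrations and maximize the R² of signal versus log10 of the estimated total concentration) rather than the authors' exact algorithm:

```python
import math

def r_squared(xs, ys):
    """Coefficient of determination for a simple linear fit of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy * sxy / (sxx * syy)

def estimate_endogenous(spikes, signals, candidates):
    """Pick the candidate endogenous concentration for which the signal is
    most nearly linear in log10(candidate + spike)."""
    return max(candidates,
               key=lambda e: r_squared([math.log10(e + s) for s in spikes], signals))
```

On synthetic data where the signal really is linear in the log of total concentration, the scan recovers the endogenous level exactly; with real sigmoid assay responses only the near-linear portion of the curve should be used, as the abstract notes.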

  7. Image analysis for the automated estimation of clonal growth and its application to the growth of smooth muscle cells.

    PubMed

    Gavino, V C; Milo, G E; Cornwell, D G

    1982-03-01

    Image analysis was used for the automated measurement of colony frequency (f) and colony diameter (d) in cultures of smooth muscle cells. Initial studies with the inverted microscope showed that the number of cells (N) in a colony varied directly with d: log N = 1.98 log d - 3.469. Image analysis generated the complement of a cumulative distribution for f as a function of d. The number of cells in each segment of the distribution function was calculated by multiplying f and the average N for the segment. These data were displayed as a cumulative distribution function. The total number of colonies (fT) and the total number of cells (NT) were used to calculate the average colony size (NA). Population doublings (PD) were then expressed as log2 NA. Image analysis confirmed previous studies in which colonies were sized and counted with an inverted microscope. Thus, image analysis is a rapid and automated technique for the measurement of clonal growth.
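Using the regression and symbols reported above (log N = 1.98 log d − 3.469, NA = NT/fT, PD = log2 NA), the calculation can be sketched as follows; the example diameters and frequencies are illustrative, not data from the study:

```python
import numpy as np

def cells_per_colony(d):
    # regression from the abstract: log N = 1.98 log d - 3.469 (base-10 logs)
    return 10 ** (1.98 * np.log10(d) - 3.469)

def population_doublings(diameters, frequencies):
    # NA = NT / fT, then PD = log2(NA), with N per colony from the regression
    n = cells_per_colony(np.asarray(diameters, dtype=float))
    f = np.asarray(frequencies, dtype=float)
    return np.log2((f * n).sum() / f.sum())

# e.g. colony diameters (arbitrary length units) and their frequencies
pd_value = population_doublings([500.0, 1000.0, 2000.0], [10, 5, 1])
```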

  8. IMPLEMENTING A NOVEL CYCLIC CO2 FLOOD IN PALEOZOIC REEFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James R. Wood; W. Quinlan; A. Wylie

    2003-07-01

    Recycled CO2 will be used in this demonstration project to produce bypassed oil from the Silurian Charlton 6 pinnacle reef (Otsego County) in the Michigan Basin. Contract negotiations by our industry partner to gain access to this CO2, which would otherwise be vented to the atmosphere, are near completion. A new method of subsurface characterization, log curve amplitude slicing, is being used to map facies distributions and reservoir properties in two reefs, the Belle River Mills and Chester 18 Fields. The Belle River Mills and Chester 18 fields are being used as type fields because they have excellent log-curve and core data coverage. Amplitude slicing of the normalized gamma ray curves is showing trends that may indicate significant heterogeneity and compartmentalization in these reservoirs. Digital and hard copy data continue to be compiled for the Niagaran reefs in the Michigan Basin. Technology transfer took place through technical presentations regarding the log curve amplitude slicing technique and a booth at the Midwest PTTC meeting.
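As a rough illustration of the idea behind amplitude slicing (the project's exact procedure is not given in the abstract, so the normalization and slicing scheme below are assumptions), a normalized log curve can be averaged over proportional depth slices between correlated markers:

```python
import numpy as np

def amplitude_slices(depth, curve, n_slices):
    # normalize the curve between its extremes over the correlated interval,
    # then average the amplitude within proportional depth slices
    norm = (curve - curve.min()) / (curve.max() - curve.min())
    edges = np.linspace(depth[0], depth[-1], n_slices + 1)
    idx = np.clip(np.searchsorted(edges, depth, side="right") - 1,
                  0, n_slices - 1)
    return np.array([norm[idx == s].mean() for s in range(n_slices)])

depth = np.linspace(0.0, 100.0, 101)   # measured depth (illustrative)
gamma = np.sqrt(depth)                 # synthetic gamma-ray-like curve
slices = amplitude_slices(depth, gamma, 4)
```

Mapping the same slice across many wells is what would reveal lateral trends in facies or reservoir properties.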

  9. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, which identifies the risk of disease mortality, helps healthcare providers manage their patients effectively by informing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models predicting 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms functions used in previous studies by a 1% improvement in AUC. This study revealed that building prediction models from EHR data can be very challenging for existing classification methods due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations in their diagnosis and treatment.
Our risk models provided two valuable insights for application of predictive modeling techniques in biomedicine: Logistic risk models often make systematic prediction errors, and it is prudent to use subgroup based prediction models such as those given by CPXR(Log) when investigating heterogeneous diseases. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. The Plant Ionome Revisited by the Nutrient Balance Concept

    PubMed Central

    Parent, Serge-Étienne; Parent, Léon Etienne; Egozcue, Juan José; Rozane, Danilo-Eduardo; Hernandes, Amanda; Lapointe, Line; Hébert-Gentile, Valérie; Naess, Kristine; Marchand, Sébastien; Lafond, Jean; Mattos, Dirceu; Barlow, Philip; Natale, William

    2013-01-01

    Tissue analysis is commonly used in ecology and agronomy to portray plant nutrient signatures. Nutrient concentration data, or ionomes, belong to the compositional data class, i.e., multivariate data that are proportions of some whole, hence carrying important numerical properties. Statistics computed across raw or ordinary log-transformed nutrient data are intrinsically biased, hence possibly leading to wrong inferences. Our objective was to present a sound and robust approach based on a novel nutrient balance concept to classify plant ionomes. We analyzed leaf N, P, K, Ca, and Mg of two wild and six domesticated fruit species from Canada, Brazil, and New Zealand sampled during reproductive stages. Nutrient concentrations were (1) analyzed without transformation, (2) ordinary log-transformed as commonly but incorrectly applied in practice, (3) additive log-ratio (alr) transformed as surrogate to stoichiometric rules, and (4) converted to isometric log-ratios (ilr) arranged as sound nutrient balance variables. Raw concentration and ordinary log transformation both led to biased multivariate analysis due to redundancy between interacting nutrients. The alr- and ilr-transformed data provided unbiased discriminant analyses of plant ionomes, where wild and domesticated species formed distinct groups and the ionomes of species and cultivars were differentiated without numerical bias. The ilr nutrient balance concept is preferable to alr, because the ilr technique projects the most important interactions between nutrients into a convenient Euclidean space. This novel numerical approach allows rectifying historical biases and supervising phenotypic plasticity in plant nutrition studies. PMID:23526060
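For reference, the alr transform and one common ilr construction (pivot balances from a sequential binary partition; the paper's specific balance design may differ) can be sketched as:

```python
import numpy as np

def closure(x):
    # rescale a composition so its parts sum to 1
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def alr(x):
    # additive log-ratio against the last component as reference
    x = closure(x)
    return np.log(x[:-1] / x[-1])

def ilr(x):
    # isometric log-ratios via pivot balances: each coordinate contrasts
    # one part against the geometric mean of the remaining parts
    x = closure(x)
    D = len(x)
    z = np.empty(D - 1)
    for i in range(D - 1):
        gm_rest = np.exp(np.log(x[i + 1:]).mean())
        z[i] = np.sqrt((D - i - 1) / (D - i)) * np.log(x[i] / gm_rest)
    return z

# e.g. leaf N, P, K, Ca, Mg concentrations (illustrative values)
balances = ilr([40.0, 3.0, 20.0, 5.0, 2.0])
```

Because both transforms are scale-invariant, they depend only on ratios between nutrients, which is what makes downstream multivariate statistics unbiased for compositional data.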

  11. Advanced prior modeling for 3D bright field electron tomography

    NASA Astrophysics Data System (ADS)

    Sreehari, Suhas; Venkatakrishnan, S. V.; Drummy, Lawrence F.; Simmons, Jeffrey P.; Bouman, Charles A.

    2015-03-01

    Many important imaging problems in material science involve reconstruction of images containing repetitive non-local structures. Model-based iterative reconstruction (MBIR) could in principle exploit such redundancies through the selection of a log prior probability term. However, in practice, determining such a log prior term that accounts for the similarity between distant structures in the image is quite challenging. Much progress has been made in the development of denoising algorithms like non-local means and BM3D, and these are known to successfully capture non-local redundancies in images. But the fact that these denoising operations are not explicitly formulated as cost functions makes it unclear as to how to incorporate them in the MBIR framework. In this paper, we formulate a solution to bright field electron tomography by augmenting the existing bright field MBIR method to incorporate any non-local denoising operator as a prior model. We accomplish this using a framework we call plug-and-play priors that decouples the log likelihood and the log prior probability terms in the MBIR cost function. We specifically use 3D non-local means (NLM) as the prior model in the plug-and-play framework, and showcase high quality tomographic reconstructions of a simulated aluminum spheres dataset, and two real datasets of aluminum spheres and ferritin structures. We observe that streak and smear artifacts are visibly suppressed, and that edges are preserved. Also, we report lower RMSE values compared to the conventional MBIR reconstruction using qGGMRF as the prior model.
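The decoupling described above can be illustrated with a minimal plug-and-play ADMM loop, in which the prior step is any black-box denoiser; the toy soft-threshold denoiser and identity forward model below are our own stand-ins, not the paper's 3D NLM operator or tomographic projector:

```python
import numpy as np

def pnp_admm(y, A, denoiser, rho=1.0, n_iter=100):
    """Plug-and-play ADMM sketch: the proximal step for the log prior is
    replaced by an arbitrary denoiser, decoupling it from the
    log-likelihood term ||y - Ax||^2."""
    n = A.shape[1]
    x, v, u = np.zeros(n), np.zeros(n), np.zeros(n)
    lhs = A.T @ A + rho * np.eye(n)   # normal equations for the likelihood prox
    Aty = A.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(lhs, Aty + rho * (v - u))  # log-likelihood step
        v = denoiser(x + u)                            # plug-in prior step
        u = u + x - v                                  # dual update
    return v

# Toy demo: identity forward model, soft-threshold "denoiser"
rng = np.random.default_rng(0)
A = np.eye(4)
x_true = np.array([0.0, 0.0, 3.0, 0.0])
y = A @ x_true + 0.01 * rng.standard_normal(4)
soft = lambda z: np.sign(z) * np.maximum(np.abs(z) - 0.05, 0.0)
x_hat = pnp_admm(y, A, soft)
```

The key design point is that the denoiser never needs an explicit cost function, which is exactly why operators like NLM and BM3D can be used as priors in this framework.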

  12. Performance of a completely automated system for monitoring CMV DNA in plasma.

    PubMed

    Mengelle, C; Sandres-Sauné, K; Mansuy, J-M; Haslé, C; Boineau, J; Izopet, J

    2016-06-01

    Completely automated systems for monitoring CMV DNA in plasma samples are now available. The objective was to evaluate the analytical and clinical performance of the VERIS™/MDx System CMV Assay®. Analytical performance was assessed using quantified quality controls. Clinical performance was assessed by comparison with the COBAS® Ampliprep™/COBAS® Taqman CMV test using 169 plasma samples that had tested positive with the in-house technique in whole blood. The specificity of the VERIS™/MDx System CMV Assay® was 99% [95% CI: 97.7-100]. Intra-assay reproducibilities were 0.03, 0.04, 0.05 and 0.04 log10 IU/ml (means 2.78, 3.70, 4.64 and 5.60 log10 IU/ml) for expected values of 2.70, 3.70, 4.70 and 5.70 log10 IU/ml. Inter-assay reproducibilities were 0.12 and 0.08 log10 IU/ml (means 6.30 and 2.85 log10 IU/ml) for expected values of 6.28 and 2.80 log10 IU/ml. The lower limit of detection was 14.6 IU/ml, and the assay was linear from 2.34 to 5.58 log10 IU/ml. Results for the positive samples were concordant (r=0.71, p<0.0001; Deming regression slope 0.79 [95% CI: 0.56-1.57] and y-intercept 0.79 [95% CI: 0.63-0.95]). The VERIS™/MDx System CMV Assay® detected 18 more positive samples than the COBAS® Ampliprep™/COBAS® Taqman CMV test, and the mean virus load was higher (by 0.41 log10 IU/ml). Monitoring of 68 samples collected from 17 immunosuppressed patients showed similar trends between the two assays. As a secondary question, virus loads detected by the VERIS™/MDx System CMV Assay® were compared with those of the in-house procedure on whole blood; results were similar between the two assays (-0.09 log10 IU/ml), as were the patient-monitoring trends. The performance of the VERIS™/MDx System CMV Assay® supports its routine use for monitoring CMV DNA loads in plasma samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. High-Resolution Flow Logging for Hydraulic Characterization of Boreholes and Aquifer Flow Zones at Contaminated Bedrock Sites

    NASA Astrophysics Data System (ADS)

    Williams, J. H.; Johnson, C. D.; Paillet, F. L.

    2004-05-01

    In the past, flow logging was largely restricted to the application of spinner flowmeters to determine flow-zone contributions in large-diameter production wells screened in highly transmissive aquifers. Development and refinement of tool-measurement technology, field methods, and analysis techniques have greatly extended and enhanced flow logging to include the hydraulic characterization of boreholes and aquifer flow zones at contaminated bedrock sites. The state of the art in flow logging will be reviewed, and its application to bedrock-contamination investigations will be presented. In open bedrock boreholes, vertical flows are measured with high-resolution flowmeters equipped with flexible rubber-disk diverters fitted to the nominal borehole diameters to concentrate flow through the measurement throat of the tools. Heat-pulse flowmeters measure flows in the range of 0.05 to 5 liters per minute, and electromagnetic flowmeters measure flows in the range of 0.3 to 30 liters per minute. Under ambient and low-rate stressed (either extraction or injection) conditions, stationary flowmeter measurements are collected in competent sections of the borehole between fracture zones identified on borehole-wall images. Continuous flow, fluid-resistivity, and temperature logs are collected under both sets of conditions while trolling with a combination electromagnetic flowmeter and fluid tool. Electromagnetic flowmeters are used with underfit diverters to measure flow rates greater than 30 liters per minute and to suppress effects of diameter variations while trolling. A series of corrections is applied to the flow-log data to account for the zero-flow response, bypass, trolling, and borehole-diameter biases and effects. The flow logs are quantitatively analyzed by matching simulated flows computed with a numerical model to measured flows by varying the hydraulic properties (transmissivity and hydraulic head) of the flow zones. 
Several case studies will be presented that demonstrate the integration of flow logging in site-characterization activities to: 1) define the hydrogeologic framework; 2) evaluate cross-connection effects and determine flow-zone contributions to water-quality samples from open boreholes; and 3) design discrete-zone hydraulic tests and monitoring-well completions.
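A highly idealized version of the flow-zone model underlying such fitting can be sketched as follows, assuming each zone's inflow is proportional to transmissivity times the head difference between the zone and the borehole water level (symbols, geometry factors, and units are illustrative, not the authors' numerical model):

```python
import numpy as np

def simulate_zone_flows(T, h, Q):
    """Each zone contributes q_i = T_i * (h_i - h_w), where T_i is a
    lumped transmissivity, h_i the zone head, and h_w the borehole water
    level, which adjusts so inflows balance the total discharge Q
    (Q = 0 under ambient conditions)."""
    h_w = (np.dot(T, h) - Q) / T.sum()   # mass balance: sum T_i (h_i - h_w) = Q
    return T * (h - h_w)

T = np.array([2.0, 6.0])     # zone transmissivities (arbitrary units)
h = np.array([10.0, 10.5])   # zone heads (m)
ambient = simulate_zone_flows(T, h, Q=0.0)   # ambient: inflow exits elsewhere
pumped = simulate_zone_flows(T, h, Q=4.0)    # stressed: net extraction
```

Inverting this forward model, varying T and h until simulated flows match the measured ambient and stressed profiles, is the essence of the quantitative analysis described in the abstract.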

  14. Wilderness experience in Rocky Mountain National Park 2002: Report to RMNP

    USGS Publications Warehouse

    Schuster, Elke; Johnson, S. Shea; Taylor, Jonathan G.

    2004-01-01

    The social science technique of Visitor Employed Photography [VEP] was used to obtain information from visitors about wilderness experiences. Visitors were selected at random from Park-designated wilderness trails, in proportion to their use, and asked to participate in the survey. Respondents were given single-use, 10-exposure cameras and photo-log diaries to record experiences. A total of 293 cameras were distributed, with a response rate of 87%. Following the development of the photos, a copy of the photos, two pertinent pages from the photo-log, and a follow-up survey were mailed to respondents. Fifty-six percent of the follow-up surveys were returned. Findings from the two surveys were analyzed and compared.

  15. Irreversible electroporation of locally advanced pancreatic neck/body adenocarcinoma

    PubMed Central

    2015-01-01

    Objective Irreversible electroporation (IRE) of locally advanced pancreatic adenocarcinoma of the neck has been used to palliate appropriate stage 3 pancreatic cancers in patients without evidence of metastasis who have undergone appropriate induction therapy. Currently there is no standardized reported technique for pancreatic mid-body tumors regarding patient selection and intra-operative technique. Patients Subjects are patients with locally advanced pancreatic adenocarcinoma of the body/neck who have undergone appropriate induction chemotherapy for a reasonable duration. Main outcome measures The technique of open IRE of locally advanced pancreatic adenocarcinoma of the neck/body is described, with emphasis on intra-operative ultrasound and intra-operative electroporation management. Results The technique of open IRE of the pancreatic neck/body with bracketing of the celiac axis and superior mesenteric artery, with continuous intraoperative ultrasound imaging and consideration of an intraoperative navigational system, is described. Conclusions IRE of locally advanced pancreatic adenocarcinoma of the body/neck is feasible for appropriate patients with locally advanced unresectable pancreatic cancer. PMID:26029461

  16. INL Fleet Vehicle Characterization Study for the U.S. Department of Navy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Brion Dale; Francfort, James Edward; Smart, John Galloway

    Battelle Energy Alliance, LLC, managing and operating contractor for the U.S. Department of Energy’s Idaho National Laboratory, is the lead laboratory for U.S. Department of Energy Advanced Vehicle Testing. Battelle Energy Alliance, LLC collected and evaluated data on federal fleet operations as part of the Advanced Vehicle Testing Activity’s Federal Fleet Vehicle Data Logging and Characterization Study. The Advanced Vehicle Testing Activity’s study seeks to collect and evaluate data to validate use of advanced plug-in electric vehicle (PEV) transportation. This report focuses on US Department of Navy's fleet to identify daily operational characteristics of select vehicles and report findings on vehicle and mission characterizations to support the successful introduction of PEVs into the agency’s fleets. Individual observations of these selected vehicles provide the basis for recommendations related to electric vehicle adoption and whether a battery electric vehicle or plug-in hybrid electric vehicle (collectively referred to as PEVs) can fulfill the mission requirements.

  17. Thyroid Radiofrequency Ablation: Updates on Innovative Devices and Techniques

    PubMed Central

    Park, Hye Sun; Park, Auh Whan; Chung, Sae Rom; Choi, Young Jun; Lee, Jeong Hyun

    2017-01-01

    Radiofrequency ablation (RFA) is a well-known, effective, and safe method for treating benign thyroid nodules and recurrent thyroid cancers. Thyroid-dedicated devices and basic techniques for thyroid RFA were introduced by the Korean Society of Thyroid Radiology (KSThR) in 2012. Thyroid RFA has now been adopted worldwide, with subsequent advances in devices and techniques. To optimize the treatment efficacy and patient safety, understanding the basic and advanced RFA techniques and selecting the optimal treatment strategy are critical. The goal of this review is to therefore provide updates and analysis of current devices and advanced techniques for RFA treatment of benign thyroid nodules and recurrent thyroid cancers. PMID:28670156

  18. L-O-S-T: Logging Optimization Selection Technique

    Treesearch

    Jerry L. Koger; Dennis B. Webster

    1984-01-01

    L-O-S-T is a FORTRAN computer program developed to systematically quantify, analyze, and improve user selected harvesting methods. Harvesting times and costs are computed for road construction, landing construction, system move between landings, skidding, and trucking. A linear programming formulation utilizing the relationships among marginal analysis, isoquants, and...

  19. Setting analyst: A practical harvest planning technique

    Treesearch

    Olivier R.M. Halleux; W. Dale Greene

    2001-01-01

    Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...

  20. ADULT COHO SALMON AND STEELHEAD USE OF BOULDER WEIRS IN SOUTHWEST OREGON STREAMS

    EPA Science Inventory

    The placement of log and boulder structures in streams is a common and often effective technique for improving juvenile salmonid rearing habitat and increasing fish densities. Less frequently examined has been the use of these structures by adult salmonids. In 2004, spawner densi...

  1. Stabilization techniques for reactive aggregate in soil-cement base course.

    DOT National Transportation Integrated Search

    2003-01-01

    Anhydrite (CaSO4) beds occur as a cap rock on a salt dome in Winn Parish in north Louisiana. Locally known as Winn Rock, it has been quarried for gravel for road building. It has been used as a surface course for local parish and logging roads. Stabi...

  2. Developing attractants and trapping techniques for the emerald ash borer

    Treesearch

    Therese M. Poland; Peter de Groot; Gary Grant; Linda MacDonald; Deborah G. McCullough

    2003-01-01

    Shortly after the 2002 discovery of emerald ash borer (EAB), Agrilus planipennis Fairmaire (Coleoptera: Buprestidae), in southeastern Michigan and Windsor, Ontario, quarantines regulating the movement of ash logs, firewood, and nursery stock were established to reduce the risk of human-assisted spread of this exotic forest insect pest. Accurate...

  3. Visualization and characterization of users in a citizen science project

    NASA Astrophysics Data System (ADS)

    Morais, Alessandra M. M.; Raddick, Jordan; Coelho dos Santos, Rafael D.

    2013-05-01

    Recent technological advances have allowed the creation and use of internet-based systems where many users can collaborate in gathering and sharing information for specific or general purposes: social networks, e-commerce review systems, collaborative knowledge systems, etc. Since most of the data collected in these systems is user-generated, understanding the motivations and general behavior of users is a very important issue. Of particular interest are citizen science projects, where users without scientific training are asked to collaborate in labeling and classifying information (either automatically, by giving away idle computer time, or manually, by actually seeing data and providing information about it). Understanding the behavior of users of these types of data collection systems may help increase user involvement, categorize users according to different parameters, facilitate their collaboration with the systems, support the design of better user interfaces, and allow better planning and deployment of similar projects and systems. Behavior of those users can be estimated through analysis of their collaboration track: records of which user did what and when can be easily and unobtrusively collected in several different ways, the simplest being a log of activities. In this paper we present results on the visualization and characterization of almost 150,000 users with more than 80,000,000 collaborations with a citizen science project, Galaxy Zoo I, which asked users to classify galaxy images. Basic visualization techniques are not applicable due to the number of users, so techniques to characterize users' behavior based on feature extraction and clustering are used.
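A feature-extraction-and-clustering step of the kind mentioned above can be sketched with a minimal k-means on per-user features; the features and algorithm details here are illustrative, not those of the paper:

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Minimal k-means for grouping user-behavior feature vectors:
    alternate assigning points to the nearest center and recomputing
    centers as cluster means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# toy features per user: [classifications per session, active days]
X = np.array([[1.0, 2.0], [1.2, 2.1], [9.0, 30.0], [8.5, 28.0]])
labels, centers = kmeans(X, 2)
```

At the scale of the paper (~150,000 users), the same idea would be applied to richer activity-log features, with visualization done on the cluster summaries rather than on individual users.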

  4. Prediction of load threshold of fibre-reinforced laminated composite panels subjected to low velocity drop-weight impact using efficient data filtering techniques

    NASA Astrophysics Data System (ADS)

    Farooq, Umar; Myler, Peter

    This work is concerned with physical testing of carbon fibrous laminated composite panels under low-velocity drop-weight impacts from flat- and round-nose impactors. Eight-, sixteen-, and twenty-four-ply panels were considered. Non-destructive damage inspections of tested specimens were conducted to approximate impact-induced damage. Recorded data were correlated to load-time, load-deflection, and energy-time history plots to interpret impact-induced damage. Data filtering techniques were also applied to the noisy data that is unavoidably generated due to limitations of the testing and logging systems. Built-in, statistical, and numerical filters effectively predicted load thresholds for eight- and sixteen-ply laminates. However, flat-nose impact of twenty-four-ply laminates produced clipped data that can only be de-noised using oscillatory algorithms. Data filtering and extrapolation of such data have received little attention in the literature and need to be investigated. The present work demonstrates filtering and extrapolation of the clipped data using a Fast Fourier Convolution algorithm to predict load thresholds. Selected results were compared to the damage zones identified with C-scan, and acceptable agreement was observed. Based on the results, it is proposed that applying advanced data filtering and analysis methods to data collected by the available resources effectively enhances data interpretation without resorting to additional resources. The methodology could be useful for efficient and reliable data analysis and impact-induced damage prediction in similar cases.
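Generic FFT-based convolution smoothing of a noisy load-time trace (a stand-in for the specific Fast Fourier Convolution algorithm and built-in filters used in the study; the signal and kernel below are synthetic) can be sketched as:

```python
import numpy as np

def fft_smooth(signal, kernel):
    """Convolve a 1-D trace with a smoothing kernel via the FFT and
    return a 'same'-length result, trimming the convolution edges."""
    n = len(signal) + len(kernel) - 1
    full = np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(kernel, n), n)
    pad = (len(kernel) - 1) // 2
    return full[pad:pad + len(signal)]

t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * 3 * t)                       # idealized load trace
noisy = clean + 0.2 * np.random.default_rng(1).standard_normal(t.size)
kernel = np.ones(9) / 9.0                               # moving-average kernel
smoothed = fft_smooth(noisy, kernel)
```

Load-threshold picking then operates on the smoothed trace, where drops associated with damage initiation are no longer masked by sensor noise.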

  5. Application of digital image processing techniques to astronomical imagery, 1979

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1979-01-01

    Several areas of applications of image processing to astronomy were identified and discussed. These areas include: (1) deconvolution for atmospheric seeing compensation; a comparison between maximum entropy and conventional Wiener algorithms; (2) polarization in galaxies from photographic plates; (3) time changes in M87 and methods of displaying these changes; (4) comparing emission line images in planetary nebulae; and (5) log intensity, hue saturation intensity, and principal component color enhancements of M82. Examples are presented of these techniques applied to a variety of objects.

  6. Application of borehole geophysics in defining the wellhead protection area for a fractured crystalline bedrock aquifer

    USGS Publications Warehouse

    Vernon, J.H.; Paillet, F.L.; Pedler, W.H.; Griswold, W.J.

    1993-01-01

    Wellbore geophysical techniques were used to characterize fractures and flow in a bedrock aquifer at a site near Blackwater Brook in Dover, New Hampshire. The primary focus of this study was the development of a model to assist in evaluating the area surrounding a planned water supply well where contaminants introduced at the land surface might be induced to flow towards a pumping well. Well logs and geophysical surveys used in this study included lithologic logs based on examination of cuttings obtained during drilling; conventional caliper and natural gamma logs; video camera and acoustic televiewer surveys; high-resolution vertical flow measurements under ambient conditions and during pumping; and borehole fluid conductivity logs obtained after the borehole fluid was replaced with deionized water. These surveys were used for several applications: 1) to define a conceptual model of aquifer structure to be used in groundwater exploration; 2) to estimate optimum locations for test and observation wells; and 3) to delineate a wellhead protection area (WHPA) for a planned water supply well. Integration of borehole data with surface geophysical and geological mapping data indicated that the study site lies along a northeast-trending intensely fractured contact zone between surface exposures of quartz monzonite and metasedimentary rocks. Four of five bedrock boreholes at the site were estimated to produce more than 150 gallons per minute (gpm) (568 L/min) of water during drilling. Aquifer testing and other investigations indicated that water flowed to the test well along fractures parallel to the northeast-trending contact zone and along other northeast and north-northwest-trending fractures. Statistical plots of fracture strikes showed frequency maxima in the same northeast and north-northwest directions, although additional maxima occurred in other directions. 
Flowmeter surveys and borehole fluid conductivity logging after fluid replacement were used to identify water-producing zones in the boreholes; fractures associated with inflow into boreholes showed a dominant northeast orientation. Borehole fluid conductivity logging after fluid replacement also gave profiles of such water-quality parameters as fluid electrical conductivity (FEC), pH, temperature, and oxidation-reduction potential, strengthening the interpretation of cross-connection of boreholes by certain fracture zones. The results of this study showed that the application of these borehole geophysical techniques at the Blackwater Brook site led to an improved understanding of such parameters as fracture location, attitude, flow direction and velocity, and water quality, all of which are important in the determination of a WHPA.

  7. Systems-Level Synthetic Biology for Advanced Biofuel Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffing, Anne; Jensen, Travis J.; Strickland, Lucas Marshall

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  8. Interaction of Cesium Ions with Calix[4]arene-bis(t-octylbenzo-18-crown-6): NMR and Theoretical Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kriz, Jaroslav; Dybal, Jiri; Vanura, Petr

    2011-01-01

    Using 1H, 13C, and 133Cs NMR spectra, it is shown that calix[4]arene-bis(t-octylbenzo-18-crown-6) (L) forms complexes with one (L·Cs+) and two (L·2Cs+) Cs+ ions offered by cesium bis(1,2-dicarbollide) cobaltate (CsDCC) in nitrobenzene-d5. The ions interact with all six oxygen atoms in the crown-ether ring and with the electrons of the calixarene aromatic moieties. According to the extraction technique, the stability constant of the first complex is log βnb(L·Cs+) = 8.8 ± 0.1. According to 133Cs NMR spectra, the value of the equilibrium constant of the second complexation step is log K(2)nb(L·2Cs+) = 6.3 ± 0.2, i.e., its stability constant is log βnb(L·2Cs+) = 15.1 ± 0.3. Self-diffusion measurements by 1H pulsed-field gradient (PFG) NMR combined with density functional theory (DFT) calculations suggest that one DCC- ion is tightly associated with L·Cs+, decreasing its positive charge and consequently stabilizing the second complex, L·2Cs+. Using a saturation-transfer 133Cs NMR technique, the correlation times τex of chemical exchange between L·Cs+ and L·2Cs+ as well as between L·2Cs+ and free Cs+ ions were determined as 33.6 and 29.2 ms, respectively.

  9. Process mining techniques: an application to time management

    NASA Astrophysics Data System (ADS)

    Khowaja, Ali Raza

    2018-04-01

    In any environment, people have to make sure that all of their work is completed within a given time and to an acceptable quality. To realize the full potential of process mining, one needs to understand all of these processes in detail. Personal information and communication have always been prominent issues on the internet; information and communication tools in daily life (daily schedules, location analysis, environmental analysis and, more generally, social media applications) make data available not only for data analysis through event logs, but also for process analysis that combines environmental and location information. Process mining can exploit all these real-life processes with the help of event logs that are already available in such datasets, through user-censored or user-labeled data. These processes can be used to redesign a user's flow and to understand the processes in more detail. One way to increase the quality of the processes we go through in our daily lives is to look closely at each process and, after analyzing it, make changes to obtain better results. Accordingly, we applied process mining techniques to a dataset collected in Korea that combines seven different subjects. The paper comments on the efficiency of the processes in the event logs with respect to time management.
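The basic building block of many process-discovery techniques, the directly-follows relation over an event log, can be sketched as follows (illustrative; the abstract does not specify the paper's exact algorithms, and the toy traces are invented):

```python
from collections import Counter

def directly_follows(event_log):
    """Count how often activity b directly follows activity a within a
    trace - the elementary relation from which many process-discovery
    algorithms build a process model."""
    counts = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

# toy daily-routine event log: one trace (ordered activity list) per day
daily_log = [["wake", "commute", "work", "gym"],
             ["wake", "gym", "commute", "work"]]
df_counts = directly_follows(daily_log)
```

From such counts one can derive a directly-follows graph and inspect where time is spent or where routines diverge, which is the time-management angle the paper pursues.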

  10. Atmospheric stellar parameters from cross-correlation functions

    NASA Astrophysics Data System (ADS)

    Malavolta, L.; Lovis, C.; Pepe, F.; Sneden, C.; Udry, S.

    2017-08-01

    The increasing number of spectra gathered by spectroscopic sky surveys and transiting exoplanet follow-up has pushed the community to develop automated tools for the determination of atmospheric stellar parameters. Here we present a novel approach that allows the measurement of temperature (Teff), metallicity ([Fe/H]) and gravity (log g) within a few seconds and in a completely automated fashion. Rather than performing comparisons with spectral libraries, our technique is based on the determination of several cross-correlation functions (CCFs) obtained by including spectral features with different sensitivity to the photospheric parameters. We use literature stellar parameters of high signal-to-noise (SNR), high-resolution HARPS spectra of FGK main-sequence stars to calibrate Teff, [Fe/H] and log g as a function of CCF parameters. Our technique is validated using low-SNR spectra obtained with the same instrument. For FGK stars we achieve a precision of σ(Teff) = 50 K, σ(log g) = 0.09 dex and σ([Fe/H]) = 0.035 dex at SNR = 50, while the precision for observations with SNR ≳ 100 and the overall accuracy are constrained by the literature values used to calibrate the CCFs. Our approach can easily be extended to other instruments with similar spectral range and resolution, or to other spectral ranges and to stars other than FGK dwarfs, if a large sample of reference stars is available for the calibration. Additionally, we provide the mathematical formulation to convert synthetic equivalent widths to CCF parameters as an alternative to direct calibration. We have made our tool publicly available.
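At its core, the calibration step maps CCF summary parameters to literature stellar parameters by regression. A toy sketch with a single hypothetical CCF parameter and synthetic, exactly linear reference stars (the real method uses several CCFs and calibrates Teff, [Fe/H], and log g jointly):

```python
# Toy version of the calibration: fit Teff as a linear function of one
# hypothetical CCF summary parameter using ordinary least squares.
# (ccf_param, literature_Teff) pairs -- synthetic reference stars.
refs = [(0.80, 5000.0), (0.90, 5400.0), (1.00, 5800.0), (1.10, 6200.0)]

n = len(refs)
sx = sum(x for x, _ in refs)
sy = sum(y for _, y in refs)
sxx = sum(x * x for x, _ in refs)
sxy = sum(x * y for x, y in refs)

# Closed-form least-squares slope and intercept.
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def teff_from_ccf(x):
    """Predict Teff (K) from the CCF parameter using the calibration."""
    return intercept + slope * x

print(round(teff_from_ccf(0.95), 1))  # -> 5600.0 for this exactly linear toy data
```

With real data the fit is multivariate, and the scatter of the residuals is what sets the quoted precisions.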

  11. Controls on the physical properties of gas-hydrate-bearing sediments because of the interaction between gas hydrate and porous media

    USGS Publications Warehouse

    Lee, Myung W.; Collett, Timothy S.

    2005-01-01

    Physical properties of gas-hydrate-bearing sediments depend on the pore-scale interaction between gas hydrate and porous media as well as the amount of gas hydrate present. Well log measurements such as proton nuclear magnetic resonance (NMR) relaxation and electromagnetic propagation tool (EPT) techniques depend primarily on the bulk volume of gas hydrate in the pore space irrespective of the pore-scale interaction. However, elastic velocities or permeability depend on how gas hydrate is distributed in the pore space as well as the amount of gas hydrate. Gas-hydrate saturations estimated from NMR and EPT measurements are free of adjustable parameters; thus, the estimations are unbiased estimates of gas hydrate if the measurement is accurate. However, the amount of gas hydrate estimated from elastic velocities or electrical resistivities depends on many adjustable parameters and models related to the interaction of gas hydrate and porous media, so these estimates are model dependent and biased. NMR, EPT, elastic-wave velocity, electrical resistivity, and permeability measurements acquired in the Mallik 5L-38 well in the Mackenzie Delta, Canada, show that all of the well log evaluation techniques considered provide comparable gas-hydrate saturations in clean (low shale content) sandstone intervals with high gas-hydrate saturations. However, in shaly intervals, estimates from log measurement depending on the pore-scale interaction between gas hydrate and host sediments are higher than those estimates from measurements depending on the bulk volume of gas hydrate.

  12. Factors influencing the inactivation of Alicyclobacillus acidoterrestris spores exposed to high hydrostatic pressure in apple juice

    NASA Astrophysics Data System (ADS)

    Sokołowska, B.; Skąpska, S.; Fonberg-Broczek, M.; Niezgoda, J.; Chotkiewicz, M.; Dekowska, A.; Rzoska, S. J.

    2013-03-01

    Alicyclobacillus acidoterrestris, a thermoacidophilic and spore-forming bacterium, survives the typical pasteurization process and can cause the spoilage of juices, producing compounds associated with a disinfectant-like odour (guaiacol, 2,6-dibromophenol, 2,6-dichlorophenol). Therefore, the use of other, more effective techniques such as high hydrostatic pressure (HHP) is considered for preserving juices. The aim of this study was to search for factors affecting the resistance of A. acidoterrestris spores to HHP. A baroprotective effect of increased solute concentration in apple juice on A. acidoterrestris spores during high-pressure processing was observed. During the 45 min pressurization (200 MPa, 50°C) of the spores in concentrated apple juice (71.1°Bx), no significant changes were observed in their number. However, in the juices with a soluble solids content of 35.7, 23.6 and 11.2°Bx, the reduction in spores was 1.3-2.4 log, 2.6-3.3 log and 2.8-4.0 log, respectively. No clear effect of spore age on survival under high-pressure conditions was found. Spores surviving pressurization and subjected to subsequent HHP treatment showed increased resistance to pressure, by as much as 2.0 log.
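The log reductions quoted above are decimal logarithms of the survival ratio, log10(N0/N). A small sketch with hypothetical spore counts:

```python
# Log reduction of a spore population: log10(N0 / N), where N0 and N are
# viable counts before and after treatment. Counts below are hypothetical.
import math

def log_reduction(n0, n):
    return math.log10(n0 / n)

n0 = 1.0e6   # initial spores per mL (hypothetical)
n = 2.5e3    # survivors after HHP treatment (hypothetical)
print(f"{log_reduction(n0, n):.1f} log reduction")  # -> 2.6 log reduction
```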

  13. Impact of logging on the foraging behaviour of two sympatric species of Couas (Coua coquereli and Coua gigas) in the western dry forest of Madagascar.

    PubMed

    Chouteau, Philippe

    2009-06-01

    Two ground-dwelling coua species, Coquerel's Coua Coua coquereli and the Giant Coua Coua gigas, live in sympatry in the dry forest of Madagascar. These birds are typically insectivorous and mainly feed at ground level. The two species differ in size but share the same morphology, suggesting they have the same physical attributes for foraging and prey capture. To test whether the two species have the same foraging behaviour, and to determine how habitat disturbance due to logging could affect that behaviour, I compared and analysed the foraging strategies of both species in two different dry forest habitats: unlogged and logged. The two species differed in their foraging behaviour between the two habitats, mainly in their ability to climb in the vegetation and in the techniques they used. Coquerel's Coua used gleaning and probing more often in the unlogged forest, while the Giant Coua used lunging more often in this habitat. The Giant Coua also used leaves as a substrate more often in the logged forest. Some modifications in the diet were recorded as well. These results suggest that anthropogenic disturbance of the forest does influence the foraging behaviour of the terrestrial coua species living in the dry forest of Madagascar.

  14. Femtosecond Laser-Assisted Descemetorhexis: A Novel Technique in Descemet Membrane Endothelial Keratoplasty.

    PubMed

    Pilger, Daniel; von Sonnleithner, Christoph; Bertelmann, Eckart; Joussen, Antonia M; Torun, Necip

    2016-10-01

    To explore the feasibility of femtosecond laser-assisted descemetorhexis (DR) to facilitate Descemet membrane endothelial keratoplasty (DMEK) surgery. Six pseudophakic patients suffering from Fuchs' endothelial dystrophy underwent femtosecond laser-assisted DMEK surgery. DR was performed using the LenSx femtosecond laser, followed by manual removal of the Descemet membrane. Optical coherence tomography images were used to measure DR parameters. Patients were followed up for 1 month to examine best corrected visual acuity, endothelial cell loss, flap detachment, and structure of the anterior chamber of the eye. The diameter of the DR approximated the intended diameter closely [mean error of 34 μm (0.45%) and 54 μm (0.67%) in the x- and y-diameter, respectively] and did not require manual correction. The median visual acuity increased from 0.4 logMAR (range 0.6-0.4 logMAR) preoperative to 0.2 logMAR (range 0-0.4 logMAR) postoperative. The median endothelial cell loss was 22% (range 7%-34%). No clinically significant flap detachments were noted. All patients had clear corneas after surgery, and no side effects or damage to structures of the anterior chamber were noted. Femtosecond laser-assisted DR is a safe and precise method for facilitating DMEK surgery.

  15. Capture of intraocular lens optic by residual capsular opening in secondary implantation: long-term follow-up.

    PubMed

    Tian, Tian; Chen, Chunli; Jin, Haiying; Jiao, Lyu; Zhang, Qi; Zhao, Peiquan

    2018-04-02

    To introduce a novel surgical technique for optic capture through a residual capsular opening in secondary intraocular lens (IOL) implantation and to report the outcomes of a long follow-up. Twenty patients (20 eyes) who had received secondary IOL implantation with the optic capture technique were retrospectively reviewed. We used the residual capsular opening to capture the optic and inserted the haptics in the sulcus during surgery. Baseline clinical characteristics and surgical outcomes, including best-corrected visual acuity (BCVA), refractive status, and IOL position, were recorded. The postoperative location and stability of the IOL were evaluated using ultrasound biomicroscopy. The optic capture technique was successfully performed in all cases, including 5 cases with a large area of posterior capsular opacity, 6 cases with posterior capsular tear or rupture, and 9 cases with adhesive capsules. BCVA improved from 0.60 logMAR at baseline to 0.36 logMAR at the last follow-up (P < 0.001). Spherical equivalent changed from 10.67 ± 4.59 D at baseline to 0.12 ± 1.35 D at 6 months postoperatively (P < 0.001). Centered IOLs were observed in all cases and remained captured through the residual capsular opening in 19 (95%) eyes at the last follow-up. In one case, the captured optic slid into the ciliary sulcus at 7 months postoperatively. No other postoperative complications were observed in any case. This optic capture technique using the residual capsular opening is efficacious and safe and can maintain IOL stability over a long follow-up.
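logMAR acuity is the base-10 logarithm of the minimum angle of resolution in arcminutes, so decimal acuity = 10^(-logMAR). A sketch converting the study's mean BCVA values to the decimal scale:

```python
# logMAR acuity: logMAR = log10(MAR in arcmin); decimal acuity = 10**(-logMAR).
# Converts the study's baseline and final mean BCVA to the decimal scale.

def decimal_acuity(logmar):
    return 10 ** (-logmar)

baseline, final = 0.60, 0.36   # mean BCVA from the study (logMAR)
print(round(decimal_acuity(baseline), 2))  # -> 0.25
print(round(decimal_acuity(final), 2))     # -> 0.44
```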

  16. Single-tooth anesthesia: pressure-sensing technology provides innovative advancement in the field of dental local anesthesia.

    PubMed

    Hochman, Mark N

    2007-04-01

    This article will review standard techniques for intraligamentary injection and describe the technology and technique behind a new single-tooth anesthesia system. This system and technique represents a technological advancement and a greater understanding of intraligamentary anesthesia.

  17. An analysis of fracture trace patterns in areas of flat-lying sedimentary rocks for the detection of buried geologic structure. [Kansas and Texas

    NASA Technical Reports Server (NTRS)

    Podwysocki, M. H.

    1974-01-01

    Two study areas in a cratonic platform underlain by flat-lying sedimentary rocks were analyzed to determine if a quantitative relationship exists between fracture trace patterns and their frequency distributions and subsurface structural closures which might contain petroleum. Fracture trace lengths and frequency (number of fracture traces per unit area) were analyzed by trend surface analysis and length frequency distributions also were compared to a standard Gaussian distribution. Composite rose diagrams of fracture traces were analyzed using a multivariate analysis method which grouped or clustered the rose diagrams and their respective areas on the basis of the behavior of the rays of the rose diagram. Analysis indicates that the lengths of fracture traces are log-normally distributed according to the mapping technique used. Fracture trace frequency appeared higher on the flanks of active structures and lower around passive reef structures. Fracture trace log-mean lengths were shorter over several types of structures, perhaps due to increased fracturing and subsequent erosion. Analysis of rose diagrams using a multivariate technique indicated lithology as the primary control for the lower grouping levels. Groupings at higher levels indicated that areas overlying active structures may be isolated from their neighbors by this technique while passive structures showed no differences which could be isolated.
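The claim that trace lengths are log-normally distributed is equivalent to the logarithms of the lengths being Gaussian, so a quick check is that the log-lengths have near-zero skewness. A sketch on synthetic lengths (not the study's data):

```python
# If fracture-trace lengths are log-normal, their logarithms are normally
# distributed: skewness of the log-lengths should be near zero. Lengths
# below are synthetic, drawn from a log-normal for illustration.
import math
import random
import statistics

random.seed(1)
lengths = [math.exp(random.gauss(mu=1.0, sigma=0.5)) for _ in range(5000)]

logs = [math.log(x) for x in lengths]
m = statistics.fmean(logs)          # log-mean length
s = statistics.stdev(logs)
skew = sum(((x - m) / s) ** 3 for x in logs) / len(logs)

print(f"log-mean length = {m:.2f}, skewness of logs = {skew:.2f}")
```

For the synthetic sample the skewness comes out close to zero, as expected; strongly skewed log-lengths would argue against log-normality.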

  18. Analyzing coastal environments by means of functional data analysis

    NASA Astrophysics Data System (ADS)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grainsize of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful to describe variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after adapting a log-normal distribution to each PSD and resuming each of the density functions by its mean, sorting, skewness and kurtosis. The second applied a centered-log-ratio (clr) to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
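The centered log-ratio (clr) transform applied before PCA maps each compositional vector x to ln(x_i / g(x)), where g(x) is the geometric mean; the transformed components always sum to zero. A sketch with a hypothetical four-class grain-size composition:

```python
# Centered log-ratio (clr) transform for compositional data such as
# particle-size distributions: clr(x)_i = ln(x_i / g(x)), g = geometric mean.
import math

def clr(composition):
    # Geometric mean via the mean of the logs (requires strictly positive parts).
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]

psd = [0.10, 0.20, 0.30, 0.40]  # hypothetical grain-size class fractions
transformed = clr(psd)
print([round(v, 3) for v in transformed])
print(abs(round(sum(transformed), 9)))  # clr components sum to 0
```

After this transform the data live in ordinary Euclidean space, which is what makes the subsequent PCA and cluster analysis legitimate for compositions.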

  19. ELOPTA: a novel microcontroller-based operant device.

    PubMed

    Hoffman, Adam M; Song, Jianjian; Tuttle, Elaina M

    2007-11-01

    Operant devices have been used for many years in animal behavior research, yet such devices are generally highly specialized and quite expensive. Although commercial models are somewhat adaptable and resilient, they are also extremely expensive and are controlled by difficult-to-learn proprietary software. As an alternative to commercial devices, we have designed and produced a fully functional, programmable operant device using a PICmicro microcontroller (Microchip Technology, Inc.). The electronic operant testing apparatus (ELOPTA) is designed to deliver food when a study animal, in this case a bird, successfully depresses the correct sequence of illuminated keys. The device logs each keypress and can detect and log whenever a test animal is positioned at the device. Data can be easily transferred to a computer and imported into any statistical analysis software. At about 3% of the cost of a commercial device, ELOPTA will advance the behavioral sciences, including behavioral ecology, animal learning and cognition, and ethology.

  20. Web usage mining at an academic health sciences library: an exploratory study.

    PubMed

    Bracke, Paul J

    2004-10-01

    This paper explores the potential of multinomial logistic regression analysis to perform Web usage mining for an academic health sciences library Website. Usage of database-driven resource gateway pages was logged for a six-month period, including information about users' network addresses, referring uniform resource locators (URLs), and types of resource accessed. It was found that referring URL did vary significantly by two factors: whether a user was on-campus and what type of resource was accessed. Although the data available for analysis are limited by the nature of the Web and concerns for privacy, this method demonstrates the potential for gaining insight into Web usage that supplements Web log analysis. It can be used to improve the design of static and dynamic Websites today and could be used in the design of more advanced Web systems in the future.
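As a much simpler stand-in for the paper's multinomial logistic regression, a chi-square test of independence can already show whether resource type varies with on/off-campus status. The 2x3 table of page-view counts below is hypothetical, not the library's data:

```python
# Chi-square test of independence between user location (on/off campus) and
# resource type -- a simpler stand-in for the paper's multinomial logistic
# regression. The 2x3 table of page-view counts is hypothetical.

observed = [  # rows: on-campus, off-campus; cols: databases, e-journals, books
    [120, 90, 40],
    [60, 110, 30],
]

row_totals = [sum(r) for r in observed]
col_totals = [sum(c) for c in zip(*observed)]
total = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        exp = row_totals[i] * col_totals[j] / total  # expected under independence
        chi2 += (obs - exp) ** 2 / exp

dof = (len(observed) - 1) * (len(observed[0]) - 1)
print(f"chi2 = {chi2:.2f} on {dof} df")  # -> chi2 = 18.10 on 2 df
```

A chi2 of 18.10 far exceeds the 5.99 critical value for 2 df at alpha = 0.05, i.e. resource type and location would be judged dependent; the regression in the paper additionally models referring URL as an outcome.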

  1. Perspectives for on-line analysis of bauxite by neutron irradiation

    NASA Astrophysics Data System (ADS)

    Beurton, Gabriel; Ledru, Bertrand; Letourneur, Philippe

    1995-03-01

    The interest in bauxite as a major source of alumina results in a strong demand for on-line instrumentation suitable for sorting, blending, and processing operations at the bauxite mine and for monitoring instrumentation in the Bayer process. The results of laboratory experiments based on neutron interactions with bauxite are described. The technique was chosen in order to overcome the problem of spatial heterogeneity in bulk mineral analysis. The evaluated elements contributed to approximately 99.5% of the sample weight. In addition, the measurements provide valuable information on physical parameters such as density, hygrometry, and material flow. Using a pulsed generator, the analysis system offers potential for on-line measurements (borehole logging or conveyor belt). An overall description of the experimental set-up is given. The experimental data include measurements of natural radioactivity, delayed radioactivity induced by activation, and prompt gamma rays following neutron reaction. In situ applications of neutron interactions provide continuous analysis and produce results which are more statistically significant. The key factors contributing to advances in industrial applications are the development of high count rate gamma spectroscopy and computational tools to design measurement systems and interpret their results.

  2. Top Performing PMs: How DAU Develops Them

    DTIC Science & Technology

    2015-12-01

    Business Intermediate Systems Financial Management Planning, Research Development and 26 hrs., online Engineering 9 hrs., online LOG 103...share your expertise with the acquisition community? Want to help change the way DoD does business? Write an article (1,500 to 2,500 words) and...Defense AT&L can help advance your career. One of our authors has even been offered jobs on the basis of articles written for the magazine. Now we

  3. Advanced Video Activity Analytics (AVAA): Human Factors Evaluation

    DTIC Science & Technology

    2015-05-01

    video, and 3) creating and saving annotations (Fig. 11). (The logging program was updated after the pilot to also capture search clicks.) Playing and... visual search task and the auditory task together and thus automatically focused on the visual task. Alternatively, the operator may have intentionally...affect performance on the primary task; however, in the current test there was no apparent effect on the operator’s performance in the visual search task

  4. Advanced UV Source for Biological Agent Destruction

    DTIC Science & Technology

    2006-01-01

    protection against chemical agents. The AUVS can be inserted into HVAC air ducts to eliminate BW agents, used to purify water, and/or used to reduce...operating costs are very low. The technology has been shown to be very effective for destroying Bacillus pumilus endospores that are significantly more...resistant to UV than anthrax spores. Up to 7 orders of magnitude (7 logs) kill of B. pumilus spores has been demonstrated with the AUVS technology

  5. RF Testing Of Microwave Integrated Circuits

    NASA Technical Reports Server (NTRS)

    Romanofsky, R. R.; Ponchak, G. E.; Shalkhauser, K. A.; Bhasin, K. B.

    1988-01-01

    Fixtures and techniques are undergoing development. Four test fixtures and two advanced techniques developed in continuing efforts to improve RF characterization of MMIC's. Finline/waveguide test fixture developed to test submodules of 30-GHz monolithic receiver. Universal commercially manufactured coaxial test fixture modified to enable characterization of various microwave solid-state devices in frequency range of 26.5 to 40 GHz. Probe/waveguide fixture is compact, simple, and designed for nondestructive testing of large numbers of MMIC's. Nondestructive-testing fixture includes cosine-tapered ridge to match impedance of waveguide to microstrip. First advanced technique is microwave-wafer probing. Second advanced technique is electro-optical sampling.

  6. Endoscopic therapy for early gastric cancer: Standard techniques and recent advances in ESD

    PubMed Central

    Kume, Keiichiro

    2014-01-01

    The technique of endoscopic submucosal dissection (ESD) is now a well-known endoscopic therapy for early gastric cancer. ESD was introduced to resect large specimens of early gastric cancer in a single piece. ESD can provide precision of histologic diagnosis and can also reduce the recurrence rate. However, the drawback of ESD is its technical difficulty, and, consequently, it is associated with a high rate of complications, the need for advanced endoscopic techniques, and a lengthy procedure time. Various advances in the devices and techniques used for ESD have contributed to overcoming these drawbacks. PMID:24914364

  7. Association of CHEK2 polymorphisms with the efficacy of platinum-based chemotherapy for advanced non-small-cell lung cancer in Chinese never-smoking women.

    PubMed

    Xu, Wen; Liu, Di; Yang, Yang; Ding, Xi; Sun, Yifeng; Zhang, Baohong; Xu, Jinfu; Su, Bo

    2016-09-01

    Cell cycle checkpoint kinase 2 (CHEK2) plays an essential role in the repair of DNA damage. Single nucleotide polymorphisms (SNPs) in DNA repair genes are thought to influence treatment effects and survival of cancer patients. This study aimed to investigate the relationship between polymorphisms in the CHEK2 gene and the efficacy of platinum-based doublet chemotherapy in never-smoking Chinese female patients with advanced non-small-cell lung cancer (NSCLC). Using DNA from blood samples of 272 Chinese advanced NSCLC never-smoking female patients treated with first-line platinum-based chemotherapy, we analyzed the relationships between four SNPs in the CHEK2 gene and clinical outcomes. We found that overall survival (OS) was significantly associated with CHEK2 rs4035540 (log-rank P=0.020), as well as with the CHEK2 rs4035540 dominant model (log-rank P=0.026), especially in the lung adenocarcinoma group. After multivariate analysis, patients with the rs4035540 A/G genotype had a significantly better OS than those with the G/G genotype (HR=0.67; 95% CI, 0.48-0.93; P=0.016). In the toxicity analysis, patients with the CHEK2 rs4035540 A/A genotype had a higher risk of gastrointestinal toxicity than the G/G genotype group (P=0.009). Our findings indicate that SNPs in CHEK2 are relevant to Chinese never-smoking female patients with advanced NSCLC receiving platinum-based doublet chemotherapy: patients with the rs4035540 A/G genotype have a better OS, and patients with the rs4035540 A/A genotype have a higher risk of gastrointestinal toxicity. These results point to a direction for predicting prognosis in Chinese never-smoking NSCLC female patients. However, no significant associations were found between chemotherapy response and SNPs in CHEK2, and confirming this will require larger samples in further study.

  8. Advanced Diffusion-Weighted Magnetic Resonance Imaging Techniques of the Human Spinal Cord

    PubMed Central

    Andre, Jalal B.; Bammer, Roland

    2012-01-01

    Unlike those of the brain, advances in diffusion-weighted imaging (DWI) of the human spinal cord have been challenged by the more complicated and inhomogeneous anatomy of the spine, the differences in magnetic susceptibility between adjacent air and fluid-filled structures and the surrounding soft tissues, and the inherent limitations of the initially used echo-planar imaging techniques used to image the spine. Interval advances in DWI techniques for imaging the human spinal cord, with the specific aims of improving the diagnostic quality of the images, and the simultaneous reduction in unwanted artifacts have resulted in higher-quality images that are now able to more accurately portray the complicated underlying anatomy and depict pathologic abnormality with improved sensitivity and specificity. Diffusion tensor imaging (DTI) has benefited from the advances in DWI techniques, as DWI images form the foundation for all tractography and DTI. This review provides a synopsis of the many recent advances in DWI of the human spinal cord, as well as some of the more common clinical uses for these techniques, including DTI and tractography. PMID:22158130

  9. Simultaneous and individual quantitative estimation of Salmonella, Shigella and Listeria monocytogenes on inoculated Roma tomatoes (Lycopersicon esculentum var. Pyriforme) and Serrano peppers (Capsicum annuum) using an MPN technique.

    PubMed

    Cabrera-Díaz, E; Martínez-Chávez, L; Sánchez-Camarena, J; Muñiz-Flores, J A; Castillo, A; Gutiérrez-González, P; Arvizu-Medrano, S M; González-Aguilar, D G; Martínez-Gonzáles, N E

    2018-08-01

    Simultaneous and individual enumeration of Salmonella, Shigella and Listeria monocytogenes was compared on inoculated Roma tomatoes and Serrano peppers using a Most Probable Number (MPN) technique. Samples consisting of tomatoes (4 units) or peppers (8 units) were individually inoculated with a cocktail of three strains of Salmonella, Shigella or L. monocytogenes, or by simultaneous inoculation with three strains of each pathogen, at low (1.2-1.7 log CFU/sample) and high (2.2-2.7 log CFU/sample) inocula. Samples were analyzed by an MPN technique using universal pre-enrichment (UP) broth at 35 °C for 24 ± 2 h. The UP tubes from each MPN series were transferred to enrichment and plating media following the appropriate conventional methods for isolating each pathogen. Data were analyzed using multifactorial analysis of variance (p < 0.05) and the LSD multiple range test. There were differences (p < 0.05) in recovery by mode of inoculation (individual > simultaneous), type of bacterium (Salmonella > Shigella and L. monocytogenes), type of sample (UP broth > pepper and tomato), and inoculum level (high > low). The MPN technique was effective for Salmonella on both commodities. Shigella counts were higher on tomatoes than on peppers (p < 0.05), and L. monocytogenes counts were higher on peppers (p < 0.05). Copyright © 2018 Elsevier Ltd. All rights reserved.
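Standard MPN values come from maximum-likelihood tables, but Thomas's approximation gives a quick closed-form estimate from the tube counts. A sketch with a hypothetical three-dilution series (not data from this study):

```python
# Thomas's approximation to the Most Probable Number:
#   MPN per g = P / sqrt(N * T)
# where P = number of positive tubes, N = grams of sample in negative tubes,
# and T = grams of sample in all tubes. A quick estimate, not the exact
# maximum-likelihood MPN tables used by standard methods.
import math

def mpn_thomas(positives, tubes_per_dilution, grams_per_tube):
    P = sum(positives)
    N = sum((t - p) * g for p, t, g in
            zip(positives, tubes_per_dilution, grams_per_tube))
    T = sum(t * g for t, g in zip(tubes_per_dilution, grams_per_tube))
    return P / math.sqrt(N * T)

# Hypothetical 3-tube series at 0.1, 0.01, 0.001 g with 3, 1, 0 positives:
mpn = mpn_thomas([3, 1, 0], [3, 3, 3], [0.1, 0.01, 0.001])
print(f"approx. {mpn:.0f} MPN/g")  # -> approx. 46 MPN/g
```

The tabulated maximum-likelihood value for the same 3-1-0 pattern is close to this approximation, which is why Thomas's formula is a common sanity check.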

  10. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research.

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    Discusses small-group apprenticeships (SGAs) as a method for introducing cell culture techniques to high school participants. Teaches cell culture practices and introduces advance imaging techniques to solve various biomedical engineering problems. Clarifies and illuminates the value of small-group laboratory apprenticeships. (Author/KHR)

  11. Advanced techniques to prepare seed to sow

    Treesearch

    Robert P. Karrfalt

    2013-01-01

    This paper reviews research on improving the basic technique of cold stratification for tree and shrub seeds. Advanced stratification techniques include long stratification, stratification re-dry, or multiple cycles of warm-cold stratification. Research demonstrates that careful regulation of moisture levels and lengthening the stratification period have produced a...

  12. Fourth NASA Inter-Center Control Systems Conference

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Space vehicle control applications are discussed, along with aircraft guidance, control, and handling qualities. System simulation and identification, engine control, advanced propulsion techniques, and advanced control techniques are also included.

  13. Stress wave nondestructive evaluation of Douglas-fir peeler cores

    Treesearch

    Robert J. Ross; John I. Zerbe; Xiping Wang; David W. Green; Roy F. Pellerin

    2005-01-01

    With the need for evaluating the utilization of veneer peeler log cores in higher value products and the increasing importance of utilizing round timbers in poles, posts, stakes, and building construction components, we conducted a cooperative project to verify the suitability of stress wave nondestructive evaluation techniques for assessing peeler cores and some...

  14. Full analogue electronic realisation of the Hodgkin-Huxley neuronal dynamics in weak-inversion CMOS.

    PubMed

    Lazaridis, E; Drakakis, E M; Barahona, M

    2007-01-01

    This paper presents a non-linear analog synthesis path towards the modeling and full implementation of the Hodgkin-Huxley neuronal dynamics in silicon. The proposed circuits have been realized in weak-inversion CMOS technology and take advantage of both log-domain and translinear transistor-level techniques.

  15. Online Persistence in Higher Education Web-Supported Courses

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2011-01-01

    This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…

  16. 78 FR 56873 - Information Collection Being Reviewed by the Federal Communications Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... on the respondents, including the use of automated collection techniques or other forms of....: 3060-0360. Title: Section 80.409, Station Logs (Maritime Services). Form No.: N/A. Type of Review... the claim or complaint has been satisfied or barred by statute limiting the time for filing suits upon...

  17. Gas chromatographic quantitation of underivatized amines in the determination of their octanol-0.1 M sodium hydroxide partition coefficients by the shake-flask method.

    PubMed

    Grunewald, G L; Pleiss, M A; Gatchell, C L; Pazhenchevsky, R; Rafferty, M F

    1984-06-01

    The use of gas chromatography (GC) for the determination of 0.1 M sodium hydroxide-octanol partition coefficients (log P) for a wide variety of ethylamines is demonstrated. The conventional shake-flask procedure (SFP) is utilized with the addition of an internal reference, which is cleanly separated from the desired solute and solvents on a 10% Apiezon L, 2% potassium hydroxide on 80-100 mesh Chromosorb W AW column. The partitioned solute is extracted from the aqueous phase with chloroform and analyzed by GC. The method provides an accurate and highly reproducible means of determining log P values, as demonstrated by the low relative standard errors. The technique is both rapid and extremely versatile. The use of the internal standard method of analysis introduces consistency, since variables such as the exact weight of solute need not be known (unlike in the traditional SFP) and the volume of sample injected is not critical. The technique is readily applicable to microgram quantities of solute, making it ideal for a wide range of volatile, amine-bearing compounds.
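With an internal standard, each phase's concentration is proportional to the solute/standard peak-area ratio, so log P follows directly from four GC peak areas and the injection volume cancels. A sketch with hypothetical peak areas (not values from the paper):

```python
# Shake-flask log P from GC peak areas: with an internal standard (IS),
# each concentration is proportional to (solute area / IS area), so
#   log P = log10( (A_oct/A_IS,oct) / (A_aq/A_IS,aq) )
# and the injected volume drops out. Peak areas below are hypothetical.
import math

def log_p(area_solute_oct, area_is_oct, area_solute_aq, area_is_aq):
    ratio_oct = area_solute_oct / area_is_oct   # octanol-phase area ratio
    ratio_aq = area_solute_aq / area_is_aq      # aqueous-phase area ratio
    return math.log10(ratio_oct / ratio_aq)

print(round(log_p(8400.0, 1000.0, 120.0, 950.0), 2))  # -> 1.82
```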

  18. Effect of surgical hand scrub time on subsequent bacterial growth.

    PubMed

    Wheelock, S M; Lookinland, S

    1997-06-01

    In this experimental study, the researchers evaluated the effect of surgical hand scrub time on subsequent bacterial growth and assessed the effectiveness of the glove juice technique in a clinical setting. In a randomized crossover design, 25 perioperative staff members scrubbed for two or three minutes in the first trial and vice versa in the second trial, after which they wore sterile surgical gloves for one hour under clinical conditions. The researchers then sampled the subjects' nondominant hands for bacterial growth, cultured aliquots from the sampling solution, and counted microorganisms. Scrubbing for three minutes produced lower mean log bacterial counts than scrubbing for two minutes. Although the mean bacterial count differed significantly (P = .02) between the two-minute and three-minute surgical hand scrub times, the difference fell below 0.5 log, which is the threshold for practical and clinical significance. This finding suggests that a two-minute surgical hand scrub is clinically as effective as a three-minute surgical hand scrub. The glove juice technique demonstrated sensitivity and reliability in enumerating bacteria on the hands of perioperative staff members in a clinical setting.

  19. Determining the minimum in situ stress from hydraulic fracturing through perforations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.

    Hydraulic fracture stress measurements have been performed through perforations at depths from 1310 to 2470 m at the US Department of Energy's Multiwell Experiment site. The results of over sixty stress tests conducted through perforations have shown that small-volume hydraulic fractures generally provide an accurate, reproducible measurement of the minimum in situ stress. However, unusual behavior can occur in some tests, and techniques to evaluate this behavior are suggested. Unclear instantaneous shut-in pressures, which are found on occasional tests, are difficult to evaluate, but the problem appears to be a complex stress state; reprocessing the data using log-log or other functions does not necessarily provide the correct stress value. The possible error in such tests should be assessed from the original pressure-time data and not the reprocessing techniques. Stress results show that the stress distribution is dependent on lithology at this site; mudstones, shales and other nonreservoir rocks generally have a near-lithostatic stress, while sandstones have a considerably lower minimum stress value. 30 refs., 18 figs., 4 tabs.

  20. A Novel Microcharacterization Technique in the Measurement of Strain and Orientation Gradient in Advanced Materials

    NASA Technical Reports Server (NTRS)

    Garmestai, H.; Harris, K.; Lourenco, L.

    1997-01-01

    Representation of morphology and evolution of the microstructure during processing and their relation to properties requires proper experimental techniques. Residual strains, lattice distortion, and texture (micro-texture) at the interface and the matrix of a layered structure or a functionally gradient material and their variation are among parameters important in materials characterization but hard to measure with present experimental techniques. Current techniques available to measure changes in internal material parameters (residual stress, micro-texture, microplasticity) produce results which are either qualitative or unreliable. This problem becomes even more complicated in the case of a temperature variation. These parameters affect many of the mechanical properties of advanced materials including stress-strain relation, ductility, creep, and fatigue. A review of some novel experimental techniques using recent advances in electron microscopy is presented here to measure internal stress, (micro)texture, interfacial strength and (sub)grain formation and realignment. Two of these techniques are combined in the chamber of an Environmental Scanning Electron Microscope to measure strain and orientation gradients in advanced materials. These techniques, which include Backscattered Kikuchi Diffractometry (BKD) and Microscopic Strain Field Analysis, are used to characterize metallic and intermetallic matrix composites and superplastic materials. These techniques are compared with the more conventional x-ray diffraction and indentation techniques.

  1. The thermal maturation degree of organic matter from source rocks revealed by wells logs including examples from Murzuk Basin, Libya

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Negoita, V.; Gheorghe, A.

    1995-08-01

    The customary technique used to determine the organic matter quantity per rock volume, as well as the organic matter maturation stage, is based on geochemical analyses performed on a preselected number of samples and cuttings drawn from boreholes during the drilling period. The same objectives, however, can be approached without extra cost using the continuous well log measurements recorded in each well from the ground surface to the total depth. During the diagenetic stage, the identification of potential source rocks out of which no hydrocarbons have been generated may be carried out using a well logging suite including Gamma Ray Spectrometry, the Compensated Neutron/Litho Density combination and a Dual Induction/Sonic Log. During the catagenetic stage, the onset of oil generation brings important changes in the organic matter structure as well as in the fluid distribution throughout the pore space of source rocks. The replacement of electrically conductive water by electrically non-conductive hydrocarbons, together with the expulsion of water and oil from source rocks, is a process whose intensity depends on time/temperature geohistory and kerogen type. The different generation and expulsion scenarios of hydrocarbons taking place during the catagenetic and metagenetic stages of source rocks are very well revealed by Induction and Laterolog investigations. Several crossplots relating vitrinite reflectance, total organic carbon and log-derived physical parameters are illustrated and discussed. The field applications come from the Murzuk Basin, where Rompetrol of Libya is operating.

  2. A miniaturized fibrinolytic assay for plasminogen activators

    NASA Technical Reports Server (NTRS)

    Lewis, M. L.; Nachtwey, D. S.; Damron, K. L.

    1991-01-01

    This report describes a micro-clot lysis assay (MCLA) for evaluating fibrinolytic activity of plasminogen activators (PA). Fibrin clots were formed in wells of microtiter plates. Lysis of the clots by PA, indicated by change in turbidity (optical density, OD), was monitored with a microplate reader at five-minute intervals. Log-log plots of PA dilution versus endpoint, the time at which the OD value was halfway between the maximum and minimum values for each well, were linear over a broad range of PA concentrations (2-200 International units/ml). The MCLA is a modification and miniaturization of well-established fibrinolytic methods. The significant practical advantages of the MCLA are that it is a simple, relatively sensitive, non-radioactive, quantitative, kinetic, fibrinolytic micro-technique which can be automated.
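
    The log-log calibration the abstract relies on can be sketched as an ordinary least-squares fit in log space; this is a generic illustration of the reported linearity, not the authors' analysis code:

```python
import math

def fit_loglog(dilutions, endpoints):
    # Least-squares line through (log10 dilution, log10 endpoint time).
    # The reported linearity over ~2-200 IU/ml means PA activity of an
    # unknown sample can be read off this calibration line.
    xs = [math.log10(d) for d in dilutions]
    ys = [math.log10(t) for t in endpoints]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx
```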

  3. Comparison of Methods for Estimating Low Flow Characteristics of Streams

    USGS Publications Warehouse

    Tasker, Gary D.

    1987-01-01

    Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothesized distributions (Log-Pearson III and Weibull) had lower mean square errors than did the G. E. P. Box-D. R. Cox transformation method or the Log-W. C. Boughton method which is based on a fit of plotting positions.
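
    The bootstrap comparison described above can be sketched with the standard library alone. The two estimators below (a plotting-position quantile and a log-normal quantile) are illustrative stand-ins for the methods compared, not the paper's exact Log-Pearson III or Boughton procedures:

```python
import math
import random
import statistics

Z10 = -1.2816  # standard normal quantile for p = 0.1 (10-year low flow)

def empirical_7q10(sample):
    # Nonparametric 0.1 quantile via Weibull plotting positions i/(n+1),
    # linearly interpolated between order statistics.
    s = sorted(sample)
    n = len(s)
    pos = 0.1 * (n + 1)
    i = max(1, min(n - 1, int(pos)))
    frac = pos - i
    return s[i - 1] + frac * (s[i] - s[i - 1])

def lognormal_7q10(sample):
    # A hypothesized-distribution estimator: 0.1 quantile assuming
    # log-normal annual low flows.
    logs = [math.log(x) for x in sample]
    return math.exp(statistics.mean(logs) + Z10 * statistics.stdev(logs))

def bootstrap_spread(flows, estimator, n_boot=500, seed=1):
    # Bootstrap: resample the observed series with replacement, re-apply
    # the estimator, and summarize the replicates -- a nonparametric
    # estimate of sampling error that needs no assumed true distribution.
    rng = random.Random(seed)
    reps = [estimator([rng.choice(flows) for _ in flows])
            for _ in range(n_boot)]
    return statistics.mean(reps), statistics.variance(reps)
```

    Comparing the replicate variances (or mean square errors against a long-record benchmark) for each estimator is the essence of the comparison the abstract reports.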

  4. Generation of complete source samples from the Slew Survey

    NASA Technical Reports Server (NTRS)

    Schachter, Jonathan

    1992-01-01

    The Einstein Slew Survey consists of 819 bright X-ray sources, of which 636 (or 78 percent) are identified with counterparts in standard catalogs. We argue for the importance of bright X-ray surveys, and compare the Slew results to the ROSAT all-sky survey. Also, we discuss statistical techniques for minimizing confusion in arcminute error circles in digitized data. We describe the 238 Slew Survey AGN, clusters, and BL Lac objects identified to date and their implications for logN-logS and source evolution studies. Also given is a catalog of 1075 sources detected in the Einstein Imaging Proportional Counter (IPC) Slew Survey of the X-ray sky. Five hundred fifty-four of these sources were not previously known as X-ray sources.

  5. Numerical results on the transcendence of constants involving pi, e, and Euler's constant

    NASA Technical Reports Server (NTRS)

    Bailey, David H.

    1988-01-01

    The existence of simple polynomial equations (integer relations) for the constants e/pi, e + pi, log pi, gamma (Euler's constant), e exp gamma, gamma/e, gamma/pi, and log gamma is investigated by means of numerical computations. The recursive form of the Ferguson-Fourcade algorithm (Ferguson and Fourcade, 1979; Ferguson, 1986 and 1987) is implemented on the Cray-2 supercomputer at NASA Ames, applying multiprecision techniques similar to those described by Bailey (1988) except that FFTs are used instead of dual-prime-modulus transforms for multiplication. It is shown that none of the constants has an integer relation of degree eight or less with coefficients of Euclidean norm 10 to the 9th or less.
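
    The notion of an integer relation being tested can be illustrated with a brute-force search over small coefficients. This is a toy stand-in for the recursive Ferguson-Fourcade algorithm, which reaches degree 8 and coefficient norms of 10^9 where exhaustive search cannot:

```python
import math
from itertools import product

def small_integer_relation(x, degree, bound, tol=1e-9):
    # Exhaustive search for integers (a_0, ..., a_d), not all zero, with
    # |a_0 + a_1*x + ... + a_d*x**d| < tol.  Finding such a relation
    # would show x is algebraic of degree <= d with small coefficients.
    powers = [x ** k for k in range(degree + 1)]
    for coeffs in product(range(-bound, bound + 1), repeat=degree + 1):
        if all(c == 0 for c in coeffs):
            continue
        if abs(sum(c * p for c, p in zip(coeffs, powers))) < tol:
            return coeffs
    return None
```

    For an algebraic number such as sqrt(2) the search recovers a relation equivalent to x^2 - 2 = 0, while a transcendental constant such as pi yields none at small bounds; the paper's computation rules out relations for its constants up to degree eight with far larger coefficient norms.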

  6. Simple prognostic model for patients with advanced cancer based on performance status.

    PubMed

    Jang, Raymond W; Caraiscos, Valerie B; Swami, Nadia; Banerjee, Subrata; Mak, Ernie; Kaya, Ebru; Rodin, Gary; Bryson, John; Ridley, Julia Z; Le, Lisa W; Zimmermann, Camilla

    2014-09-01

    Providing survival estimates is important for decision making in oncology care. The purpose of this study was to provide survival estimates for outpatients with advanced cancer, using the Eastern Cooperative Oncology Group (ECOG), Palliative Performance Scale (PPS), and Karnofsky Performance Status (KPS) scales, and to compare their ability to predict survival. ECOG, PPS, and KPS were completed by physicians for each new patient attending the Princess Margaret Cancer Centre outpatient Oncology Palliative Care Clinic (OPCC) from April 2007 to February 2010. Survival analysis was performed using the Kaplan-Meier method. The log-rank test for trend was employed to test for differences in survival curves for each level of performance status (PS), and the concordance index (C-statistic) was used to test the predictive discriminatory ability of each PS measure. Measures were completed for 1,655 patients. PS delineated survival well for all three scales according to the log-rank test for trend (P < .001). Survival was approximately halved for each worsening performance level. Median survival times, in days, for each ECOG level were: ECOG 0, 293; ECOG 1, 197; ECOG 2, 104; ECOG 3, 55; and ECOG 4, 25.5. Median survival times, in days, for PPS (and KPS) were: PPS/KPS 80 to 100, 221 (215); PPS/KPS 60 to 70, 115 (119); PPS/KPS 40 to 50, 51 (49); PPS/KPS 10 to 30, 22 (29). The C-statistic was similar for all three scales and ranged from 0.63 to 0.64. We present a simple tool that uses PS alone to prognosticate in advanced cancer, and that has discriminatory ability similar to that of more complex models. Copyright © 2014 by American Society of Clinical Oncology.
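
    The Kaplan-Meier method named above is the standard product-limit estimator; a minimal self-contained version (not the study's analysis code) looks like this:

```python
def kaplan_meier(times, events):
    # Product-limit estimator: at each distinct event time t with d
    # deaths among n subjects still at risk, survival is multiplied by
    # (1 - d/n).  `events` flags 1 = death observed, 0 = censored.
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []                      # (time, S(t)) steps
    i = 0
    while i < len(data):
        t = data[i][0]
        n, deaths = at_risk, 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            at_risk -= 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / n
            curve.append((t, s))
    return curve
```

    The median survival times quoted per ECOG/PPS/KPS level are read off such curves as the time at which S(t) first drops to 0.5.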

  7. Conventional and advanced time series estimation: application to the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database, 1993-2006.

    PubMed

    Moran, John L; Solomon, Patricia J

    2011-02-01

    Time series analysis has seen limited application in the biomedical literature. The utility of conventional and advanced time series estimators was explored for intensive care unit (ICU) outcome series. Monthly mean time series, 1993-2006, for hospital mortality, severity-of-illness score (APACHE III), ventilation fraction and patient type (medical and surgical), were generated from the Australian and New Zealand Intensive Care Society adult patient database. Analyses encompassed geographical seasonal mortality patterns, series structural time changes, mortality series volatility using autoregressive moving average and Generalized Autoregressive Conditional Heteroscedasticity models in which predicted variances are updated adaptively, and bivariate and multivariate (vector error correction models) cointegrating relationships between series. The mortality series exhibited marked seasonality, declining mortality trend and substantial autocorrelation beyond 24 lags. Mortality increased in winter months (July-August); the medical series featured annual cycling, whereas the surgical series demonstrated long and short (3-4 months) cycling. Series structural breaks were apparent in January 1995 and December 2002. The covariance-stationary first-differenced mortality series was consistent with a seasonal autoregressive moving average process; the observed conditional-variance volatility (1993-1995) and residual Autoregressive Conditional Heteroscedasticity effects entailed a Generalized Autoregressive Conditional Heteroscedasticity model, preferred by information criterion and mean model forecast performance. Bivariate cointegration, indicating long-term equilibrium relationships, was established between mortality and severity-of-illness scores at the database level and for categories of ICUs. Multivariate cointegration was demonstrated for {log APACHE III score, log ICU length of stay, ICU mortality and ventilation fraction}. 
A system approach to understanding series time-dependence may be established using conventional and advanced econometric time series estimators. © 2010 Blackwell Publishing Ltd.
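
    The "substantial autocorrelation beyond 24 lags" reported above is diagnosed with the sample autocorrelation function, which can be computed with the standard library alone (a generic diagnostic, not the study's estimation code):

```python
def sample_acf(series, max_lag):
    # Sample autocorrelation r_k = c_k / c_0, where
    # c_k = (1/n) * sum_t (x_t - mean)(x_{t+k} - mean).  Slowly decaying
    # r_k across many monthly lags is the usual cue to difference the
    # series before ARMA/GARCH fitting, as the abstract describes.
    n = len(series)
    mean = sum(series) / n
    c0 = sum((x - mean) ** 2 for x in series) / n
    acf = []
    for k in range(1, max_lag + 1):
        ck = sum((series[t] - mean) * (series[t + k] - mean)
                 for t in range(n - k)) / n
        acf.append(ck / c0)
    return acf
```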

  8. Advanced Welding Concepts

    NASA Technical Reports Server (NTRS)

    Ding, Robert J.

    2010-01-01

    Four advanced welding techniques and their use in NASA are briefly reviewed in this poster presentation. The welding techniques reviewed are: Solid State Welding, Friction Stir Welding (FSW), Thermal Stir Welding (TSW) and Ultrasonic Stir Welding.

  9. Sensor failure detection system. [for the F100 turbofan engine

    NASA Technical Reports Server (NTRS)

    Beattie, E. C.; Laprad, R. F.; Mcglone, M. E.; Rock, S. M.; Akhter, M. M.

    1981-01-01

    Advanced concepts for detecting, isolating, and accommodating sensor failures were studied to determine their applicability to the gas turbine control problem. Five concepts were formulated based upon such techniques as Kalman filters, and a screening process led to the selection of one advanced concept for further evaluation. The selected advanced concept uses a Kalman filter to generate residuals, a weighted sum square residuals technique to detect soft failures, likelihood ratio testing of a bank of Kalman filters for isolation, and reconfiguring of the normal mode Kalman filter by eliminating the failed input to accommodate the failure. The advanced concept was compared to a baseline parameter synthesis technique. The advanced concept was shown to be a viable concept for detecting, isolating, and accommodating sensor failures for gas turbine applications.
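
    The detection piece of the scheme (Kalman residuals feeding a weighted sum-of-squared-residuals test) can be sketched for a single sensor. The random-walk state model, window, and threshold here are illustrative choices, not the report's engine-specific design:

```python
def detect_sensor_failure(measurements, q=1e-4, r=1.0,
                          window=5, threshold=11.07):
    # A scalar Kalman filter produces innovations (residuals); a moving
    # weighted sum of squared residuals (WSSR) over `window` samples
    # declares a soft failure when it exceeds a chi-square-style
    # threshold (11.07 ~ chi2 with 5 dof at p = 0.05).
    x, p = measurements[0], 1.0
    wssr_terms = []
    for k, z in enumerate(measurements):
        p += q                      # predict (random-walk state model)
        s = p + r                   # innovation variance
        nu = z - x                  # innovation (residual)
        wssr_terms.append(nu * nu / s)
        gain = p / s
        x += gain * nu              # update state estimate
        p *= 1.0 - gain             # update error covariance
        if k >= window and sum(wssr_terms[-window:]) > threshold:
            return k                # sample index at which failure flags
    return None
```

    Isolation in the report then runs a bank of such filters, each omitting one sensor, and picks the hypothesis with the highest likelihood.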

  10. Targeted Muscle Reinnervation for Transradial Amputation: Description of Operative Technique.

    PubMed

    Morgan, Emily N; Kyle Potter, Benjamin; Souza, Jason M; Tintle, Scott M; Nanos, George P

    2016-12-01

    Targeted muscle reinnervation (TMR) is a revolutionary surgical technique that, together with advances in upper extremity prostheses and advanced neuromuscular pattern recognition, allows intuitive and coordinated control in multiple planes of motion for shoulder disarticulation and transhumeral amputees. TMR also may provide improvement in neuroma-related pain and may represent an opportunity for sensory reinnervation as advances in prostheses and haptic feedback progress. Although most commonly utilized following shoulder disarticulation and transhumeral amputations, TMR techniques also represent an exciting opportunity for improvement in integrated prosthesis control and neuroma-related pain improvement in patients with transradial amputations. As there are no detailed descriptions of this technique in the literature to date, we provide our surgical technique for TMR in transradial amputations.

  11. Development of nanomaterial-enabled advanced oxidation techniques for treatment of organic micropollutants

    NASA Astrophysics Data System (ADS)

    Oulton, Rebekah Lynn

    Increasing demand for limited fresh water resources necessitates that alternative water sources be developed. Nonpotable reuse of treated wastewater represents one such alternative. However, the ubiquitous presence of organic micropollutants such as pharmaceuticals and personal care products (PPCPs) in wastewater effluents limits use of this resource. Numerous investigations have examined PPCP fate during wastewater treatment, focusing on their removal during conventional and advanced treatment processes. Analysis of influent and effluent data from published studies reveals that at best 1-log10 concentration unit of PPCP removal can generally be achieved with conventional treatment. In contrast, plants employing advanced treatment methods, particularly ozonation and/or membranes, remove most PPCPs, often to levels below analytical detection limits. However, membrane treatment is cost prohibitive for many facilities, and ozone treatment can be very selective. Ozone-recalcitrant compounds require the use of Advanced Oxidation Processes (AOPs), which utilize highly reactive hydroxyl radicals (*OH) to target resistant pollutants. Due to cost and energy use concerns associated with current AOPs, alternatives such as catalytic ozonation are under investigation. Catalytic ozonation uses substrates such as activated carbon to promote *OH formation during ozonation. Here, we show that multi-walled carbon nanotubes (MWCNTs) represent another viable substrate, promoting *OH formation during ozonation to levels exceeding activated carbon and equivalent to conventional ozone-based AOPs. Via a series of batch reactions, we observe a strong correlation between *OH formation and MWCNT surface oxygen concentrations. Results suggest that deprotonated carboxyl groups on the CNT surface are integral to their reactivity toward ozone and corresponding *OH formation. 
From a practical standpoint, we show that industrial grade MWCNTs exhibit *OH production similar to that of their research-grade counterparts. Accelerated aging studies indicate that MWCNTs maintain surface reactivity for an extended period during ozonation treatment. Further, *OH generation is essentially unaffected in complex water matrices containing known radical scavengers, and the process remains effective for degrading the ozone-recalcitrant herbicide atrazine. A proof-of-concept study verified that results from batch systems can be replicated in a flow-through reactor utilizing MWCNTs immobilized on a ceramic membrane support. Collectively, these results suggest that CNT-enhanced ozonation may provide a viable treatment alternative for emerging organic micropollutants.
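
    The "1-log10" removal figure quoted above is the standard log-removal metric, a one-line calculation worth making explicit:

```python
import math

def log_removal(c_influent, c_effluent):
    # log10 removal across a treatment step: 1.0 log = 90% removal,
    # 2.0 log = 99%, 3.0 log = 99.9%, and so on.
    return math.log10(c_influent / c_effluent)
```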

  12. Overall Survival of Patients with Locally Advanced or Metastatic Esophageal Squamous Cell Carcinoma Treated with Nimotuzumab in the Real World.

    PubMed

    Saumell, Yaimarelis; Sanchez, Lizet; González, Sandra; Ortiz, Ramón; Medina, Edadny; Galán, Yaima; Lage, Agustin

    2017-12-01

    Despite improvements in surgical techniques and treatments introduced into clinical practice, the overall survival of patients with esophageal squamous cell carcinoma remains low. Several epidermal growth factor receptor inhibitors are being evaluated in the context of clinical trials, but there is little evidence of effectiveness in real-world conditions. This study aimed at assessing the effectiveness of nimotuzumab combined with onco-specific treatment in Cuban real-life patients with locally advanced or metastatic esophageal squamous cell carcinoma. A comparative and retrospective effectiveness study was performed. The 93 patients treated with nimotuzumab were matched, with use of propensity score matching, with patients who received a diagnosis of locally advanced or metastatic squamous cell carcinoma of the esophagus in three Cuban provinces reported between 2011 and 2015 to the National Cancer Registry. The Kaplan-Meier method was used to estimate event-time distributions. Log-rank statistics were used for comparisons of overall survival between groups. A two-component mixture model assuming a Weibull distribution was fitted to assess the effect of nimotuzumab on short-term and long-term survival populations. There was an increase in median overall survival in patients treated with nimotuzumab (11.9 months versus 6.5 months without treatment) and an increase in the 1-year survival rate (54.0% versus 21.9% without treatment). The 2-year survival rates were 21.1% for patients treated with nimotuzumab and 0% in the untreated cohort. There were statistically significant differences in survival between groups treated and not treated with nimotuzumab, both in the short-term survival population (6.0 months vs 4.0 months, p = 0.009) and in the long-term survival population (18.0 months vs 11.0 months, p = 0.001). 
Our study shows that nimotuzumab treatment concurrent with chemoradiotherapy increases the survival of real-world patients with locally advanced or metastatic esophageal squamous cell carcinoma. Further prospective studies are required to confirm the therapeutic effectiveness of nimotuzumab in esophageal cancer.

  13. Web-of-Objects (WoO)-Based Context Aware Emergency Fire Management Systems for the Internet of Things

    PubMed Central

    Shamszaman, Zia Ush; Ara, Safina Showkat; Chong, Ilyoung; Jeong, Youn Kwae

    2014-01-01

    Recent advancements in the Internet of Things (IoT) and the Web of Things (WoT) accompany a smart life where real world objects, including sensing devices, are interconnected with each other. The Web representation of smart objects empowers innovative applications and services for various domains. To accelerate this approach, Web of Objects (WoO) focuses on the implementation aspects of bringing the assorted real world objects to the Web applications. In this paper, we propose an emergency fire management system in the WoO infrastructure. Consequently, we integrate the formation and management of Virtual Objects (ViO), which are derived from real world physical objects and are virtually connected with each other, into the semantic ontology model. The advantage of using the semantic ontology is that it allows information reusability, extensibility and interoperability, which enable ViOs to uphold orchestration, federation, collaboration and harmonization. Our system is context aware, as it receives contextual environmental information from distributed sensors and detects emergency situations. To handle a fire emergency, we present a decision support tool for the emergency fire management team. The previous fire incident log is the basis of the decision support system. A log repository collects all the emergency fire incident logs from ViOs and stores them in a repository. PMID:24531299

  14. Web-of-Objects (WoO)-based context aware emergency fire management systems for the Internet of Things.

    PubMed

    Shamszaman, Zia Ush; Ara, Safina Showkat; Chong, Ilyoung; Jeong, Youn Kwae

    2014-02-13

    Recent advancements in the Internet of Things (IoT) and the Web of Things (WoT) accompany a smart life where real world objects, including sensing devices, are interconnected with each other. The Web representation of smart objects empowers innovative applications and services for various domains. To accelerate this approach, Web of Objects (WoO) focuses on the implementation aspects of bringing the assorted real world objects to the Web applications. In this paper, we propose an emergency fire management system in the WoO infrastructure. Consequently, we integrate the formation and management of Virtual Objects (ViO), which are derived from real world physical objects and are virtually connected with each other, into the semantic ontology model. The advantage of using the semantic ontology is that it allows information reusability, extensibility and interoperability, which enable ViOs to uphold orchestration, federation, collaboration and harmonization. Our system is context aware, as it receives contextual environmental information from distributed sensors and detects emergency situations. To handle a fire emergency, we present a decision support tool for the emergency fire management team. The previous fire incident log is the basis of the decision support system. A log repository collects all the emergency fire incident logs from ViOs and stores them in a repository.

  15. The experiences of undergraduate nursing students with bots in Second LifeRTM

    NASA Astrophysics Data System (ADS)

    Rose, Lesele H.

    As technology continues to transform education from the status quo of traditional lecture-style instruction to an interactive, engaging learning experience, students' experiences within the learning environment continue to change as well. This dissertation addressed the need for continuing research in advancing implementation of technology in higher education. The purpose of this phenomenological study was to discover more about the experiences of undergraduate nursing students using standardized geriatric evaluation tools when interacting with scripted geriatric patient bots in a simulated instructional intake setting. Data was collected through a Demographics questionnaire, an Experiential questionnaire, and a Reflection questionnaire. Triangulation of data collection occurred through an automatically created log of the interactions with the two bots, and by an automatically recorded log of the participants' movements while in the simulated geriatric intake interview. The data analysis consisted of an iterative review of the questionnaires and the participants' logs in an effort to identify common themes, recurring comments, and issues which would benefit from further exploration. Findings revealed that the interactions with the bots were perceived as a valuable experience for the participants from the perspective of interacting with the Geriatric Evaluation Tools in the role of an intake nurse. Further research is indicated to explore instructional interactions with bots in effectively mastering the use of established Geriatric Evaluation Tools.

  16. Advanced magnetic resonance imaging of neurodegenerative diseases.

    PubMed

    Agosta, Federica; Galantucci, Sebastiano; Filippi, Massimo

    2017-01-01

    Magnetic resonance imaging (MRI) is playing an increasingly important role in the study of neurodegenerative diseases, delineating the structural and functional alterations determined by these conditions. Advanced MRI techniques are of special interest for their potential to characterize the signature of each neurodegenerative condition and aid both the diagnostic process and the monitoring of disease progression. This aspect will become crucial when disease-modifying (personalized) therapies will be established. MRI techniques are very diverse and go from the visual inspection of MRI scans to more complex approaches, such as manual and automatic volume measurements, diffusion tensor MRI, and functional MRI. All these techniques allow us to investigate the different features of neurodegeneration. In this review, we summarize the most recent advances concerning the use of MRI in some of the most important neurodegenerative conditions, putting an emphasis on the advanced techniques.

  17. Sonic Fatigue Design Techniques for Advanced Composite Aircraft Structures

    DTIC Science & Technology

    1980-04-01

    AFWAL-TR-80-3019 (AD A090553). Final report by Ian Holehouse, Rohr Industries. [Only table-of-contents and abstract fragments survive extraction: General Sonic Fatigue Theory; Composite Laminate Analysis; Preliminary Sonic Fatigue analysis. The abstract fragment notes that existing sonic fatigue design guides were developed for metal structures, whereas recent advanced composite...]

  18. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss "small-group apprenticeships (SGAs)" as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments…

  19. Advances in top-down and bottom-up surface nanofabrication: techniques, applications & future prospects.

    PubMed

    Biswas, Abhijit; Bayer, Ilker S; Biris, Alexandru S; Wang, Tao; Dervishi, Enkeleda; Faupel, Franz

    2012-01-15

    This review highlights the most significant advances of the nanofabrication techniques reported over the past decade with a particular focus on the approaches tailored towards the fabrication of functional nano-devices. The review is divided into two sections: top-down and bottom-up nanofabrication. Under the classification of top-down, special attention is given to technical reports that demonstrate multi-directional patterning capabilities less than or equal to 100 nm. These include recent advances in lithographic techniques, such as optical, electron beam, soft, nanoimprint, scanning probe, and block copolymer lithography. Bottom-up nanofabrication techniques--such as atomic layer deposition, sol-gel nanofabrication, molecular self-assembly, vapor-phase deposition and DNA-scaffolding for nanoelectronics--are also discussed. Specifically, we describe advances in the fabrication of functional nanocomposites and graphene using chemical and physical vapor deposition. Our aim is to provide a comprehensive platform for prominent nanofabrication tools and techniques in order to facilitate the development of new or hybrid nanofabrication techniques leading to novel and efficient functional nanostructured devices. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    NASA Astrophysics Data System (ADS)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describes various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field as the conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application to enable its user to search boring logs rapidly and visualize them using the augmented reality (AR) technique. For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions for large borehole databases for other modules. A field survey was also carried out using more than 100,000 borehole records.

  1. Log-linear model based behavior selection method for artificial fish swarm algorithm.

    PubMed

    Huang, Zhehuang; Chen, Yidong

    2015-01-01

    Artificial fish swarm algorithm (AFSA) is a population based optimization technique inspired by the social behavior of fishes. In the past several years, AFSA has been successfully applied in many research and application areas. The behavior of the fishes has a crucial impact on the performance of AFSA, including its global exploration ability and convergence speed. How to construct and select behaviors of the fishes is therefore an important task. To solve these problems, an improved artificial fish swarm algorithm based on a log-linear model is proposed and implemented in this paper. There are three main contributions. First, we propose a new behavior selection algorithm based on a log-linear model, which enhances the decision-making ability of behavior selection. Second, an adaptive movement behavior based on adaptive weights is presented, which adjusts dynamically according to the diversity of the fishes. Finally, some new behaviors are defined and introduced into the artificial fish swarm algorithm for the first time to improve global optimization capability. The experiments on high dimensional function optimization showed that the improved algorithm has more powerful global exploration ability and reasonable convergence speed compared with the standard artificial fish swarm algorithm.
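
    A log-linear selection rule of the kind named above scores each candidate behavior with a weighted feature sum, exponentiates, and samples in proportion (i.e., softmax sampling). The feature design and weights below are illustrative assumptions, not the paper's model:

```python
import math
import random

def select_behavior(features, weights, rng=random.random):
    # Log-linear behavior selection: logit_i = w . f_i for each candidate
    # behavior i, p_i = exp(logit_i) / sum_j exp(logit_j), then one
    # behavior index is sampled from the resulting distribution.
    logits = [sum(w * f for w, f in zip(weights, feats))
              for feats in features]
    m = max(logits)                          # stabilize the exponentials
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    u, acc = rng(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if u <= acc:
            return i, probs
    return len(probs) - 1, probs
```

    Sampling rather than greedily picking the highest-scoring behavior preserves some exploration, which is the balance between exploration ability and convergence speed the abstract emphasizes.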

  2. Solubilization of polycyclic aromatic hydrocarbons in micellar nonionic surfactant solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, D.A.; Luthy, R.G.; Liu, Zhongbao

    1991-01-01

    Experimental data are presented on the enhanced apparent solubilities of naphthalene, phenanthrene, and pyrene resulting from solubilization in aqueous solutions of four commercial, nonionic surfactants: an alkyl polyoxyethylene (POE) type, two octylphenol POE types, and a nonylphenol POE type. Apparent solubilities of the polycyclic aromatic hydrocarbon (PAH) compounds in surfactant solutions were determined by radiolabeled techniques. Solubilization of each PAH compound commenced at the surfactant critical micelle concentration and was proportional to the concentration of surfactant in micelle form. The partitioning of organic compounds between surfactant micelles and aqueous solution is characterized by a mole fraction micelle-phase/aqueous-phase partition coefficient, K(m). Values of log K(m) for PAH compounds in the surfactant solutions of this study range from 4.57 to 6.53. Log K(m) appears to be a linear function of log K(ow) for a given surfactant solution. A knowledge of partitioning in aqueous surfactant systems is a prerequisite to understanding mechanisms affecting the behavior of hydrophobic organic compounds in soil-water systems in which surfactants play a role in contaminant remediation or facilitated transport.
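
    The reported linearity of log K(m) in log K(ow) amounts to an ordinary least-squares line; the paired values below are illustrative stand-ins within the study's reported K(m) range, not its measurements.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Illustrative (hypothetical) log Kow / log Km pairs for three PAHs
log_kow = [3.30, 4.46, 4.88]
log_km = [4.60, 5.70, 6.10]
slope, intercept = linear_fit(log_kow, log_km)
```

    With such a fit, a log K(m) for an untested hydrophobic compound in the same surfactant could be estimated from its known log K(ow).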

  3. Determination and prediction of octanol-air partition coefficients of hydroxylated and methoxylated polybrominated diphenyl ethers.

    PubMed

    Zhao, Hongxia; Xie, Qing; Tan, Feng; Chen, Jingwen; Quan, Xie; Qu, Baocheng; Zhang, Xin; Li, Xiaona

    2010-07-01

    The octanol-air partition coefficients (K(OA)) of 19 hydroxylated polybrominated diphenyl ethers (OH-PBDEs) and 10 methoxylated polybrominated diphenyl ethers (MeO-PBDEs) were measured as a function of temperature using a gas chromatographic retention time technique. At room temperature (298.15 K), log K(OA) ranged from 8.30 for monobrominated OH/MeO-PBDEs to 13.29 for hexabrominated OH/MeO-PBDEs. The internal energies of phase change from octanol to air (Delta(OA)U) for 29 OH/MeO-PBDE congeners ranged from 72 to 126 kJ mol(-1). Using partial least-squares (PLS) analysis, a statistically quantitative structure-property relationship (QSPR) model for log K(OA) of OH/MeO-PBDE congeners was developed based on 16 fundamental quantum chemical descriptors computed with the PM3 Hamiltonian, for which Q(cum)(2) was about 0.937. The molecular weight (Mw) and the energy of the lowest unoccupied molecular orbital (E(LUMO)) were found to be the main factors governing log K(OA). © 2010 Elsevier Ltd. All rights reserved.
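
    The internal energy of phase change is commonly recovered from the temperature dependence of K(OA) by fitting log K(OA) = A + B/T and converting the slope, Delta(OA)U ≈ ln(10)·R·B; a sketch under that assumed functional form, with synthetic data rather than the paper's measurements:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def delta_u_from_log_koa(temps_k, log_koa):
    """Fit log K_OA = A + B/T by least squares and convert the slope B
    to an internal energy of phase change, Delta_OA_U = ln(10)*R*B (J/mol)."""
    xs = [1.0 / t for t in temps_k]
    n = len(xs)
    mx, my = sum(xs) / n, sum(log_koa) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, log_koa)) / \
        sum((x - mx) ** 2 for x in xs)
    return math.log(10) * R * b

# Synthetic check: generate data from a known Delta_OA_U of 100 kJ/mol
b_true = 100_000 / (math.log(10) * R)
temps = [283.15, 298.15, 313.15]
logs = [2.0 + b_true / t for t in temps]
```

    Running the fit on the synthetic series recovers the 100 kJ/mol used to generate it, which falls inside the 72-126 kJ/mol range reported above.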

  4. Removal of contaminants and pathogens from secondary effluents using intermittent sand filters.

    PubMed

    Bali, Mahmoud; Gueddari, Moncef; Boukchina, Rachid

    2011-01-01

    Intermittent infiltration percolation of wastewater through an unsaturated sand bed is an extensive treatment technique aimed at eliminating organic matter, oxidizing ammonium and removing pathogens. The main purpose of this study was to determine the depuration efficiency of a sand filter in removing contaminants from secondary wastewater effluents. Elimination of pathogenic bacteria (total and faecal coliforms, streptococci) and its relationship with filter depth were investigated. Results showed a high capacity of the infiltration percolation process to treat secondary effluents. Total elimination of suspended solids was obtained. Mean removal rates of BOD(5) and COD were more than 97% and more than 81%, respectively. Other water quality parameters such as NH(4)-N, TKN and PO(4)-P showed significant reductions, except NO(3)-N, which increased significantly in the filtered water. The efficiency of pathogenic bacteria removal was shown to depend mainly on the filter depth. Average reductions of 2.35 log total coliforms, 2.47 log faecal coliforms and 2.11 log faecal streptococci were obtained. The experimental study also showed the influence of temperature on the purification performance of the infiltration percolation process.
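
    The two removal units quoted above (log reductions for bacteria, percentages for BOD(5)/COD) follow from the standard definitions; the influent/effluent values in the checks below are hypothetical.

```python
import math

def log_reduction(influent, effluent):
    """Log10 reduction, the unit used for the coliform removals above."""
    return math.log10(influent) - math.log10(effluent)

def percent_removal(influent, effluent):
    """Percentage removal, the unit used for BOD5 and COD above."""
    return 100.0 * (1.0 - effluent / influent)
```

    A 2.35-log reduction therefore corresponds to leaving only about 10**(-2.35) ≈ 0.45% of the influent coliforms.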

  5. Metaheuristic optimization approaches to predict shear-wave velocity from conventional well logs in sandstone and carbonate case studies

    NASA Astrophysics Data System (ADS)

    Emami Niri, Mohammad; Amiri Kolajoobi, Rasool; Khodaiy Arbat, Mohammad; Shahbazi Raz, Mahdi

    2018-06-01

    Seismic wave velocities, along with petrophysical data, provide valuable information during the exploration and development stages of oil and gas fields. The compressional-wave velocity (VP) is acquired using conventional acoustic logging tools in many drilled wells, but the shear-wave velocity (VS) is recorded using advanced logging tools in only a limited number of wells, mainly because of the high operational costs. In addition, laboratory measurements of seismic velocities on core samples are expensive and time consuming, so alternative methods are often used to estimate VS. To date, several empirical correlations that predict VS from well logging measurements and petrophysical data such as VP, porosity and density have been proposed; however, these empirical relations can only be used in limited cases. Intelligent systems and optimization algorithms offer inexpensive, fast and efficient approaches for predicting VS. In this study, in addition to the widely used Greenberg–Castagna empirical method, we implement three relatively recently developed metaheuristic algorithms to construct linear and nonlinear models for predicting VS: teaching–learning based optimization, imperialist competitive and artificial bee colony algorithms. We demonstrate the applicability and performance of these algorithms for predicting VS from conventional well logs in two field data examples: a sandstone formation from an offshore oil field and a carbonate formation from an onshore oil field. We compared the VS estimated by each of the employed metaheuristic approaches with observed VS and with values predicted by the Greenberg–Castagna relations. The results indicate that, for both the sandstone and carbonate case studies, all three implemented metaheuristic algorithms are more efficient and reliable than the empirical correlation for predicting VS. The results also show that, in both case studies, the artificial bee colony algorithm performs slightly better in VS prediction than the two other employed approaches.
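
    As a point of reference for the empirical baseline, the Greenberg–Castagna approach builds on lithology-specific Vp–Vs fits such as the Castagna et al. (1985) mudrock line for water-saturated clastics; that single-lithology line is sketched below (velocities in km/s).

```python
def castagna_mudrock_vs(vp_km_s):
    """Castagna et al. (1985) mudrock line: Vs = 0.8621 * Vp - 1.1724 (km/s),
    an empirical fit for water-saturated clastic rocks."""
    return 0.8621 * vp_km_s - 1.1724
```

    The metaheuristic models in the study play the same role, mapping conventional log measurements to VS, but with coefficients and model forms tuned to the local data rather than a fixed global regression.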

  6. The standard deviation of extracellular water/intracellular water is associated with all-cause mortality and technique failure in peritoneal dialysis patients.

    PubMed

    Tian, Jun-Ping; Wang, Hong; Du, Feng-He; Wang, Tao

    2016-09-01

    The mortality rate of peritoneal dialysis (PD) patients is still high, and the predictive factors for PD patient mortality remain to be determined. This study aimed to explore the relationship between the standard deviation (SD) of extracellular water/intracellular water (E/I) and all-cause mortality and technique failure in continuous ambulatory PD (CAPD) patients. All 152 patients came from the PD Center between January 1st 2006 and December 31st 2007. Clinical data and E/I ratios from at least five visits, determined by bioelectrical impedance analysis, were collected. The patients were followed up till December 31st 2010. The primary outcomes were death from any cause and technique failure. Kaplan-Meier analysis and Cox proportional hazards models were used to identify risk factors for mortality and technique failure in CAPD patients. All patients were followed up for 59.6 ± 23.0 months. The patients were divided into two groups according to the SD of their E/I values: a lower SD of E/I group (≤0.126) and a higher SD of E/I group (>0.126). The patients with higher SD of E/I showed higher all-cause mortality (log-rank χ(2) = 10.719, P = 0.001) and technique failure (log-rank χ(2) = 9.724, P = 0.002) than those with lower SD of E/I. Cox regression analysis found that the SD of E/I independently predicted all-cause mortality (HR 3.551, 95 % CI 1.442-8.746, P = 0.006) and technique failure (HR 2.487, 95 % CI 1.093-5.659, P = 0.030) in CAPD patients after adjustment for confounders, except when sensitive C-reactive protein was added into the model. The SD of E/I was a strong independent predictor of all-cause mortality and technique failure in CAPD patients.
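
    The study's grouping variable is simply the standard deviation of a patient's repeated E/I measurements; a minimal sketch using the paper's 0.126 cutoff, with hypothetical visit data:

```python
from statistics import stdev

def ei_risk_group(ei_ratios, cutoff=0.126):
    """Return (SD of E/I, group label) from at least five visit
    measurements, using the cutoff reported in the study."""
    if len(ei_ratios) < 5:
        raise ValueError("at least five visits required")
    sd = stdev(ei_ratios)
    return sd, ("higher" if sd > cutoff else "lower")
```

    Note that `stdev` is the sample standard deviation; whether the study used the sample or population form is not stated in the abstract, so that choice is an assumption here.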

  7. Quantifying the Effects of Water Temperature, Soap Volume, Lather Time, and Antimicrobial Soap as Variables in the Removal of Escherichia coli ATCC 11229 from Hands.

    PubMed

    Jensen, Dane A; Macinga, David R; Shumaker, David J; Bellino, Roberto; Arbogast, James W; Schaffner, Donald W

    2017-06-01

    The literature on hand washing, while extensive, often contains conflicting data, and key variables are only superficially studied or not studied at all. Some hand washing recommendations are made without scientific support, and agreement between recommendations is limited. The influence of key variables such as soap volume, lather time, water temperature, and product formulation on hand washing efficacy was investigated in the present study. Baseline conditions were 1 mL of a bland (nonantimicrobial) soap, a 5-s lather time, and 38°C (100°F) water temperature. A nonpathogenic strain of Escherichia coli (ATCC 11229) was the challenge microorganism. Twenty volunteers (10 men and 10 women) participated in the study, and each test condition had 20 replicates. An antimicrobial soap formulation (1% chloroxylenol) was not significantly more effective than the bland soap for removing E. coli under a variety of test conditions. Overall, the mean reduction was 1.94 log CFU (range, 1.83 to 2.10 log CFU) with the antimicrobial soap and 2.22 log CFU (range, 1.91 to 2.54 log CFU) with the bland soap. Overall, lather time significantly influenced efficacy in one scenario, in which a 0.5-log greater reduction was observed after 20 s with bland soap compared with the baseline wash (P = 0.020). Water temperature as high as 38°C (100°F) and as low as 15°C (60°F) did not have a significant effect on the reduction of bacteria during hand washing; however, the energy usage differed between these temperatures. No significant differences were observed in mean log reductions experienced by men and women (both 2.08 log CFU; P = 0.988). A large part of the variability in the data was associated with the behaviors of the volunteers. Understanding what behaviors and human factors most influence hand washing may help researchers find techniques to optimize the effectiveness of hand washing.

  8. Actual distribution of Cronobacter spp. in industrial batches of powdered infant formula and consequences for performance of sampling strategies.

    PubMed

    Jongenburger, I; Reij, M W; Boer, E P J; Gorris, L G M; Zwietering, M H

    2011-11-15

    The actual spatial distribution of microorganisms within a batch of food influences the results of sampling for microbiological testing when this distribution is non-homogeneous. In the case of pathogens being non-homogeneously distributed, it markedly influences public health risk. This study investigated the spatial distribution of Cronobacter spp. in powdered infant formula (PIF) on an industrial batch scale for both a recalled batch and a reference batch. Additionally, the local spatial occurrence of clusters of Cronobacter cells was assessed, as well as the performance of typical sampling strategies in determining the presence of the microorganisms. The concentration of Cronobacter spp. was assessed over the course of the filling time of each batch by taking samples of 333 g using the most probable number (MPN) enrichment technique. The occurrence of clusters of Cronobacter spp. cells was investigated by plate counting. From the recalled batch, 415 MPN samples were drawn. The expected heterogeneous distribution of Cronobacter spp. could be quantified from these samples, which showed no detectable level (detection limit of -2.52 log CFU/g) in 58% of samples, whilst in the remainder concentrations were found to be between -2.52 and 2.75 log CFU/g. The estimated average concentration in the recalled batch was -2.78 log CFU/g, with a standard deviation of 1.10 log CFU/g. The estimated average concentration in the reference batch was -4.41 log CFU/g, with 99% of the 93 samples being below the detection limit. In the recalled batch, clusters of cells occurred sporadically in 8 out of 2290 samples of 1 g taken. The two largest clusters contained 123 (2.09 log CFU/g) and 560 (2.75 log CFU/g) cells. Various sampling strategies were evaluated for the recalled batch. Taking more and smaller samples while keeping the total sampling weight constant considerably improved the performance of the sampling plans in detecting this type of contaminated batch. Compared to random sampling, stratified random sampling improved the probability of detecting the heterogeneous contamination. Copyright © 2011 Elsevier B.V. All rights reserved.
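
    The benefit of taking more samples can be illustrated with a deliberately simple independence model: if a fraction p of possible sample draws would test positive, then n samples detect the batch with probability 1 - (1-p)^n. This ignores how sample size changes p and is only a sketch, not the paper's evaluation.

```python
def detection_probability(p_positive, n_samples):
    """P(at least one positive) for n independent samples, each
    positive with probability p_positive."""
    return 1.0 - (1.0 - p_positive) ** n_samples
```

    With 42% of the 333 g draws positive, as in the recalled batch, even a handful of samples detects the contamination with high probability; splitting a fixed total weight into more draws raises n and hence the detection probability, consistent with the study's conclusion.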

  9. Geophysical investigations in deep horizontal holes drilled ahead of tunnelling

    USGS Publications Warehouse

    Carroll, R.D.; Cunningham, M.J.

    1980-01-01

    Deep horizontal drill holes have been used since 1967 by the Defense Nuclear Agency as a primary exploration tool for siting nuclear events in tunnels at the Nevada Test Site. The U.S. Geological Survey has developed geophysical logging techniques for obtaining resistivity and velocity in these holes, and to date 33 horizontal drill holes in excess of 300 m in depth have been successfully logged. The deepest hole was drilled to a horizontal depth of 1125 m. The purposes of the logging measurements are to define clay zones, because of the unstable ground conditions such zones can present to tunnelling, and to define zones of partially saturated rock, because of the attenuating effects such zones have on the shock wave generated by the nuclear detonation. Excessive attenuation is undesirable because the shock wave is used as a tunnel closure mechanism to contain debris and other undesirable explosion products. Measurements are made by pumping resistivity, sonic and geophone probes down the drill string and out of the bit into the open hole. Clay zones are defined by the electrical resistivity technique based on empirical data relating the magnitude of the resistivity measurement to qualitative clay content. Rock exhibiting resistivity of less than 20 Ω·m is considered potentially unstable, and resistivities less than 10 Ω·m indicate appreciable amounts of clay are present in the rock. Partially saturated rock zones are defined by the measurement of the rock sound speed. Zones in the rock which exhibit velocities less than 2450 m/sec are considered of potential concern. © 1980.
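
    The interpretation rules stated above reduce to simple cutoffs; a direct encoding (helper names are ours, not the survey's):

```python
def clay_assessment(resistivity_ohm_m):
    """Empirical resistivity cutoffs from the study: <10 ohm-m indicates
    appreciable clay, <20 ohm-m potentially unstable ground."""
    if resistivity_ohm_m < 10:
        return "appreciable clay"
    if resistivity_ohm_m < 20:
        return "potentially unstable"
    return "no clay concern"

def partially_saturated(velocity_m_s):
    """Velocity cutoff from the study: <2450 m/s flags zones of concern."""
    return velocity_m_s < 2450
```

    Applied along the logged hole, these cutoffs turn the continuous resistivity and sonic traces into zone flags for the tunnelling and containment assessments.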

  10. Advanced Manufacturing Processes in the Motor Vehicle Industry

    DOT National Transportation Integrated Search

    1983-05-01

    Advanced manufacturing processes, which include a range of automation and management techniques, are aiding U.S. motor vehicle manufacturers to reduce vehicle costs. This report discusses these techniques in general and their specific applications in...

  11. Mineral content prediction for unconventional oil and gas reservoirs based on logging data

    NASA Astrophysics Data System (ADS)

    Maojin, Tan; Youlong, Zou; Guoyue

    2012-09-01

    Coal bed methane and shale oil & gas are both important unconventional oil and gas resources. Their reservoirs are typically non-linear, with complex and varied mineral components, so logging-data interpretation models for calculating mineral contents are difficult to establish, and empirical formulas cannot be constructed because of the variety of minerals present. Radial basis function (RBF) network analysis is a method developed in recent years; the technique can generate a smooth continuous function of several variables to approximate an unknown forward model. Firstly, the basic principles of the RBF network are discussed, including the network structure and basis functions, and the specific process of network training by the adjacent clustering algorithm is given in detail. The RBF interpolation method is then used to predict mineral component contents from multiple well logging measurements. For coal-bed methane reservoirs, the RBF method is used to calculate mineral contents such as ash, volatile matter and carbon content, establishing a mapping from the various logging data to multiple minerals. For shale gas reservoirs, the RBF method can be used to predict the clay content, quartz content, feldspar content, carbonate content and pyrite content. Various tests in coalbed and gas shale intervals show the method is effective and applicable for predicting mineral component contents.
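
    The core of the RBF approach is interpolation with radial kernels centered on the training samples; a one-dimensional Gaussian-kernel sketch on toy data (not well logs), with a small built-in linear solver so it stays self-contained:

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def rbf_fit(xs, ys, eps=1.0):
    """Exact RBF interpolation: solve for weights so the kernel expansion
    passes through every (x, y) training point."""
    phi = lambda r: math.exp(-(eps * r) ** 2)
    A = [[phi(abs(xi - xj)) for xj in xs] for xi in xs]
    w = gauss_solve(A, ys)
    return lambda x: sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, xs))
```

    In the paper's setting, x would be a vector of several log readings per depth and y a mineral content fraction; the same kernel-expansion idea carries over with a multivariate distance.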

  12. Prediction of Compressional, Shear, and Stoneley Wave Velocities from Conventional Well Log Data Using a Committee Machine with Intelligent Systems

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa

    2012-01-01

    Measurement of compressional, shear, and Stoneley wave velocities, carried out by dipole sonic imager (DSI) logs, provides invaluable data for geophysical interpretation, geomechanical studies and hydrocarbon reservoir characterization. The present study proposes an improved methodology for making a quantitative formulation between conventional well logs and sonic wave velocities. First, sonic wave velocities were predicted from conventional well logs using artificial neural network, fuzzy logic, and neuro-fuzzy algorithms. Subsequently, a committee machine with intelligent systems was constructed by virtue of a hybrid genetic algorithm-pattern search technique, with the outputs of the artificial neural network, fuzzy logic and neuro-fuzzy models used as inputs of the committee machine. It is capable of improving the accuracy of the final prediction by integrating the outputs of the aforementioned intelligent systems. The hybrid genetic algorithm-pattern search tool, embodied in the structure of the committee machine, assigns a weight factor to each individual intelligent system, indicating its involvement in the overall prediction of the DSI parameters. This methodology was implemented in the Asmari formation, the major carbonate reservoir rock of Iranian oil fields. A group of 1,640 data points was used to construct the intelligent model, and a group of 800 data points was employed to assess the reliability of the proposed model. The results showed that the committee machine with intelligent systems performed more effectively than the individual intelligent systems performing alone.
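
    The committee machine's final output is a weighted combination of the expert predictions; in the paper the weight factors are tuned by the hybrid genetic algorithm-pattern search, while the sketch below fixes them by hand for illustration.

```python
def committee_predict(expert_outputs, weights):
    """Combine expert predictions (e.g. the ANN, fuzzy logic, and
    neuro-fuzzy outputs) with per-expert weight factors. The weights
    here are fixed illustrations, not optimized values."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weight factors should sum to 1")
    return sum(w * y for w, y in zip(weights, expert_outputs))
```

    Constraining the weights to sum to one keeps the combined estimate in the same units and range as the individual velocity predictions.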

  13. Simpler ISS Flight Control Communications and Log Keeping via Social Tools and Techniques

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Cowart, Hugh; Stevens, Dan

    2012-01-01

    The heart of flight operations control involves a) communicating effectively in real time with other controllers in the room and/or in remote locations and b) tracking significant events, decisions, and rationale to support the next set of decisions, provide a thorough shift handover, and troubleshoot/improve operations. International Space Station (ISS) flight controllers speak with each other via multiple voice circuits or loops, each with a particular purpose and constituency. Controllers monitor and/or respond to several loops concurrently. The primary tracking tools are console logs, typically kept by a single operator and not visible to others in real time. Information from telemetry, commanding, and planning systems also plays into decision-making. Email is very secondary/tertiary due to timing and archival considerations. Voice communications and log entries supporting ISS operations have increased by orders of magnitude as the number of control centers, flight crew, and payload operations has grown. This paper explores three developmental ground system concepts under development at Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) and Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC). These concepts could reduce ISS control center voice traffic and console logging yet increase the efficiency and effectiveness of both. The goal of this paper is to kindle further discussion, exploration, and tool development.

  14. Solubility and modeling acid-base properties of adrenaline in NaCl aqueous solutions at different ionic strengths and temperatures.

    PubMed

    Bretti, Clemente; Cigala, Rosalia Maria; Crea, Francesco; De Stefano, Concetta; Vianelli, Giuseppina

    2015-10-12

    Solubility and acid-base properties of adrenaline were studied in NaCl aqueous solutions at different ionic strengths (0

  15. Analysis of calibration materials to improve dual-energy CT scanning for petrophysical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyalasomavaiula, K.; McIntyre, D.; Jain, J.

    2011-01-01

    Dual-energy CT scanning is a rapidly emerging imaging technique employed in non-destructive evaluation of various materials. Although CT (computerized tomography) has been used for characterizing rocks and for visualizing and quantifying multiphase flow through rocks for over 25 years, most scanning is done at a voltage setting above 100 kV to take advantage of the Compton scattering (CS) effect, which responds to density changes. Below 100 kV the photoelectric effect (PE) is dominant, which responds to the effective atomic number (Zeff), directly related to the photoelectric factor. Using the combination of the two effects helps in better characterization of reservoir rocks. The most common technique for dual-energy CT scanning relies on homogeneous calibration standards to produce the most accurate decoupled data. However, the use of calibration standards with impurities increases the probability of error in the reconstructed data and results in poor rock characterization. This work combines ICP-OES (inductively coupled plasma optical emission spectroscopy) and LIBS (laser-induced breakdown spectroscopy) analytical techniques to quantify the type and level of impurities in a set of commercially purchased calibration standards used in dual-energy scanning. Zeff values for the calibration standards, with and without the impurity data, were calculated using the weighted linear combination of the various elements present and used in calculating Zeff by the dual-energy technique. Results show a 2 to 5% difference in predicted Zeff values, which may affect the corresponding log calibrations. The effect that these techniques have on improving material identification data is discussed and analyzed. The workflow developed in this paper will translate to more accurate material identification estimates for unknown samples and improve calibration of well logging tools.
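
    The "weighted linear combination of the various elements" typically takes a power-mean form for Zeff; the exponent 2.94 below is a value commonly used in the photoelectric regime and is our assumption, as is the use of electron fractions for the weights.

```python
def z_effective(fractions, atomic_numbers, m=2.94):
    """Effective atomic number as a weighted power mean:
    Z_eff = (sum_i f_i * Z_i**m)**(1/m), with f_i electron fractions.
    Exponent m = 2.94 is a commonly assumed photoelectric-regime value."""
    if abs(sum(fractions) - 1.0) > 1e-9:
        raise ValueError("fractions must sum to 1")
    return sum(f * z ** m for f, z in zip(fractions, atomic_numbers)) ** (1.0 / m)
```

    Impurities shift the fractions f_i, which is why the paper's few-percent composition corrections translate into measurable shifts in the predicted Zeff of a calibration standard.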

  16. Advanced wiring technique and hardware application: Airplane and space vehicle

    NASA Technical Reports Server (NTRS)

    Ernst, H. L.; Eichman, C. D.

    1972-01-01

    An advanced wiring system is described which achieves the safety/reliability required for present and future airplane and space vehicle applications. Present wiring installation techniques and hardware are also analyzed to establish existing problem areas. An advanced wiring system employing a matrix interconnecting unit and plug-to-plug trunk bundles (FCC or ribbon cable) is outlined, and an installation study is presented. A planned program to develop, lab test and flight test key features of these techniques and hardware as a part of the SST technology follow-on activities is discussed.

  17. Three-dimensional trend mapping from wire-line logs

    USGS Publications Warehouse

    Doveton, J.H.; Ke-an, Z.

    1985-01-01

    Mapping of lithofacies and porosities of stratigraphic units is complicated because these properties vary in three dimensions. The method of moments was proposed by Krumbein and Libby (1957) as a technique to aid in resolving this problem. Moments are easily computed from wireline logs and are simple statistics which summarize vertical variation in a log trace. Combinations of moment maps have proved useful in understanding vertical and lateral changes in the lithology of sedimentary rock units. Although moments have meaning both as statistical descriptors and as mechanical properties, they also define polynomial curves which approximate lithologic changes as a function of depth. These polynomials can be fitted by least-squares methods, partitioning major trends in rock properties from fine-scale fluctuations. Analysis of variance yields the degree of fit of any polynomial and measures the proportion of vertical variability expressed by any moment or combination of moments. In addition, polynomial curves can be differentiated to determine depths at which pronounced expressions of facies occur and to locate boundaries between major lithologic subdivisions. Moments can be estimated at any location in an area by interpolating from log moments at control wells. A matrix algebra operation then converts moment estimates to coefficients of a polynomial function which describes a continuous curve of lithologic variation with depth. If this procedure is applied to a grid of geographic locations, the result is a model of variability in three dimensions. Resolution of the model is determined largely by the number of moments used in its generation. The method is illustrated with an analysis of lithofacies in the Simpson Group of south-central Kansas; the three-dimensional model is shown as cross sections and slice maps. In this study, the gamma-ray log is used as a measure of the shaliness of the unit. However, the method is general and can be applied, for example, to suites of neutron, density, or sonic logs to produce three-dimensional models of porosity in reservoir rocks. © 1985 Plenum Publishing Corporation.
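
    The moments themselves are cheap to compute from a digitized log trace; a discrete sketch with depth normalized to [0, 1] (normalization conventions vary, so treat this form as an assumption):

```python
def log_moments(depths, values, n_moments=3):
    """Raw vertical moments of a log trace over a stratigraphic unit:
    the k-th moment is sum(v * z**k) / sum(v), with z the depth
    rescaled to [0, 1] between the unit's top and base."""
    top, base = min(depths), max(depths)
    zs = [(d - top) / (base - top) for d in depths]
    total = sum(values)
    return [sum(v * z ** k for v, z in zip(values, zs)) / total
            for k in range(1, n_moments + 1)]
```

    A first moment near 0.5 indicates a vertically symmetric property distribution; departures from it capture the upward-coarsening or -fining trends the moment maps summarize.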

  18. Neural network prediction of carbonate lithofacies from well logs, Big Bow and Sand Arroyo Creek fields, Southwest Kansas

    USGS Publications Warehouse

    Qi, L.; Carr, T.R.

    2006-01-01

    In the Hugoton Embayment of southwestern Kansas, St. Louis Limestone reservoirs have relatively low recovery efficiencies, attributed to the heterogeneous nature of the oolitic deposits. This study establishes quantitative relationships between digital well logs and core description data, and applies these relationships in a probabilistic sense to predict lithofacies in 90 uncored wells across the Big Bow and Sand Arroyo Creek fields. In 10 wells, a single-hidden-layer neural network based on digital well logs and core-described lithofacies of the limestone depositional texture was used to train and establish a non-linear relationship between lithofacies assignments from detailed core descriptions and selected log curves. Neural network models were optimized by selecting six predictor variables and automated cross-validation of network parameters, and were then used to predict lithofacies on the whole data set of 2023 half-foot intervals from the 10 cored wells, with a selected network size of 35 and a damping parameter of 0.01. Predicted lithofacies compared to actual lithofacies display absolute accuracies of 70.37-90.82%. Counting predictions within one lithofacies class of the core description improves accuracy slightly (93.72%). Digital logs from uncored wells were batch processed to predict lithofacies and the probabilities related to each lithofacies at half-foot resolution corresponding to log units. The results were used to construct interpolated cross-sections, and useful depositional patterns of St. Louis lithofacies were illustrated, e.g., the concentration of oolitic deposits (including lithofacies 5 and 6) along local highs and the relative dominance of quartz-rich carbonate grainstone (lithofacies 1) in zones A and B of the St. Louis Limestone. Neural network techniques are applicable to other complex reservoirs, in which facies geometry and distribution are the key factors controlling heterogeneity and the distribution of rock properties. Future work involves extension of the neural network to predict reservoir properties, and construction of three-dimensional geo-models. © 2005 Elsevier Ltd. All rights reserved.
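
    The "within-one lithofacies" accuracy used above is straightforward to compute when lithofacies are coded on an ordinal scale; the codes in the check below are hypothetical.

```python
def within_one_accuracy(predicted, actual):
    """Fraction of intervals whose predicted lithofacies code is within
    one class of the core-described code (the 'within-one' accuracy)."""
    hits = sum(abs(p - a) <= 1 for p, a in zip(predicted, actual))
    return hits / len(actual)
```

    This relaxed metric is meaningful here because adjacent lithofacies codes represent gradational depositional textures, so a one-class miss is geologically minor.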

  19. Requirements-Driven Log Analysis Extended Abstract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2012-01-01

    Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
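
    Treating a log as a sequence of field-name-to-value records, one can check it against a simple requirement; the sketch below verifies a response property ("every trigger event is eventually followed by a response event") over a completed log, with hypothetical event names.

```python
def check_response(log, trigger, response, key="name"):
    """Check 'every trigger is eventually followed by a response' over a
    finished log of event records, by counting unmatched triggers.
    A simplified sketch of specification-based log analysis."""
    pending = 0
    for event in log:
        if event[key] == trigger:
            pending += 1
        elif event[key] == response and pending > 0:
            pending -= 1
    return pending == 0
```

    Because the check runs offline on recorded events, it can be adopted by a single analyst without touching the system under test or the team's existing process, which is exactly the low-impact path the abstract argues for.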

  20. Inhibition of biofilm formation on the surface of water storage containers using biosand zeolite silver-impregnated clay granular and silver impregnated porous pot filtration systems

    PubMed Central

    Moropeng, Resoketswe Charlotte; Mpenyana-Monyatsi, Lizzy; Momba, Maggie Ndombo Benteke

    2018-01-01

    Development of biofilms on the inner surface of storage vessels offers a suitable medium for the growth of microorganisms and consequently contributes to the deterioration of treated drinking water quality in homes. The aim of this study was to determine whether two point-of-use technologies (the biosand zeolite silver-impregnated clay granular (BSZ-SICG) filter and the silver-impregnated porous pot (SIPP) filter) deployed in a rural community of South Africa could inhibit the formation of biofilm on the surface of the plastic-based containers generally used by rural households for the storage of their drinking water. Culture-based methods and molecular techniques were used to detect indicator bacteria (total coliforms, faecal coliforms, E. coli) and pathogenic bacteria (Salmonella spp., Shigella spp. and Vibrio cholerae) in intake water and on the surface of storage vessels containing treated water. Scanning electron microscopy was also used to visualize the development of biofilm. Results revealed that the surface water source used by the Makwane community was heavily contaminated and harboured unacceptably high counts of bacteria (heterotrophic plate count: 4.4-4.3 Log10 CFU/100 mL, total coliforms: 2.2-2.1 Log10 CFU/100 mL, faecal coliforms: 1.9-1.8 Log10 CFU/100 mL, E. coli: 1.7-1.6 Log10 CFU/100 mL, Salmonella spp.: 3 Log10 CFU/100 mL-8 CFU/100 mL; Shigella spp. and Vibrio cholerae: 1.0 Log10 CFU/100 mL and 0.8 Log10 CFU/100 mL, respectively). Biofilm formation was apparent on the surface of the storage containers with untreated water within 24 h. The silver nanoparticles embedded in the clay of the filtration systems provided an effective barrier against biofilm formation on the surface of household water storage containers. Biofilm formation occurred on the surface of storage plastic vessels containing drinking water treated with the SIPP filter between 14 and 21 days, and on those containing drinking water treated with the BSZ-SICG filter between 3 and 14 days. The attachment of target bacteria on the surface of the coupons inoculated in storage containers ranged from 0.07 to 227.8 CFU/cm2. To effectively prevent the development of biofilms on the surface of container-stored water, which can lead to the recontamination of treated water, plastic storage containers should be washed within 14 days for water treated with the SIPP filter and within 3 days for water treated with the BSZ-SICG filter. PMID:29621296
