Sample records for logging instrumentation development

  1. Calibration of the Concorde radiation detection instrument and measurements at SST altitude.

    DOT National Transportation Integrated Search

    1971-06-01

    Performance tests were carried out on a solar cosmic radiation detection instrument developed for the Concorde SST. The instrument calibration curve (log dose-rate vs instrument reading) was reasonably linear from 0.004 to 1 rem/hr for both gamma rad...

  2. Ceramic vacuum tubes for geothermal well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, R.D.

    1977-01-12

    The results of investigations into the availability and suitability of ceramic vacuum tubes for the development of logging tools for geothermal wells are summarized. Design data acquired in the evaluation of ceramic vacuum tubes for the development of a 500°C instrumentation amplifier are presented. The general requirements for ceramic vacuum tubes for application to the development of high-temperature well logging are discussed. Commercially available tubes are described, and future contract activities that specifically relate to ceramic vacuum tubes are detailed. Supplemental data are presented in the appendix. (MHR)

  3. Ceramic vacuum tubes for geothermal well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, R.D.

    1977-01-01

    Useful design data acquired in the evaluation of ceramic vacuum tubes for the development of a 500°C instrumentation amplifier are presented. The general requirements for ceramic vacuum tubes are discussed for application to the development of high-temperature well logging. Commercially available tubes are described, and future contract activities that specifically relate to ceramic vacuum tubes are detailed. Supplemental data are presented in the appendix.

  4. Reading Logs and Literature Teaching Models in English Language Teacher Education

    ERIC Educational Resources Information Center

    Ochoa Delarriva, Ornella; Basabe, Enrique Alejandro

    2016-01-01

    Reading logs are regularly used in foreign language education since they are not only critical in the development of reading comprehension but may also be instrumental in taking readers beyond the referential into the representational realms of language. In this paper we offer the results of a qualitative analysis of a series of reading logs…

  5. Low-temperature methyl bromide fumigation of emerald ash borer (Coleoptera: Buprestidae) in ash logs.

    PubMed

    Barak, Alan V; Elder, Peggy; Fraser, Ivich

    2011-02-01

    Ash (Fraxinus spp.) logs, infested with fully developed, cold-acclimated larval and prepupal emerald ash borer, Agrilus planipennis Fairmaire (Coleoptera: Buprestidae), were fumigated with methyl bromide (MeBr) at 4.4 and 10.0 degrees C for 24 h. Concentration x time dosages of MeBr obtained were 1579 and 1273 g-h/m3 (24-h exposure) at 4.4 and 10.0 degrees C after applied doses of 112 and 96 g/m3, respectively. MeBr concentrations were measured simultaneously with a ContainIR infrared monitor and a Fumiscope thermal conductivity meter calibrated for MeBr, to assess the effect of CO2 on Fumiscope concentration readings relative to the infrared (IR) instrument. The presence of CO2 caused falsely high MeBr readings: the thermal conductivity meter registered 11.36 g/m3 of apparent MeBr per 1% CO2 in clean air, whereas the gas-specific infrared ContainIR instrument registered 9.55% CO2 as only 4.2 g/m3 MeBr (0.44 g/m3 per 1% CO2). The IR instrument was thus about 4% as sensitive to CO2 as the thermal conductivity meter. After aeration, fumigated and control logs were held for 8 wk to capture emerging beetles. No A. planipennis adults emerged from any of the fumigated logs, whereas 262 emerged from control logs (139 and 123/m2 at 4.4 and 10.0 degrees C, respectively). An effective fumigation dose and minimum periodic MeBr concentrations are proposed. The use of a CO2 scrubber in conjunction with nonspecific thermal conductivity instruments is necessary to measure MeBr concentrations more accurately.
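
    The reported CO2 cross-sensitivities imply a simple linear correction for thermal-conductivity readings. A minimal sketch, using the sensitivity figures from the abstract; the function name, the linear-subtraction model, and the example numbers are illustrative assumptions, not from the paper:

```python
# Correcting a thermal-conductivity MeBr reading for CO2 interference.
# Sensitivities are those reported in the abstract; the simple linear
# subtraction model is an illustrative assumption.

TC_BIAS = 11.36  # g/m^3 apparent MeBr per 1% CO2 (thermal-conductivity meter)
IR_BIAS = 0.44   # g/m^3 apparent MeBr per 1% CO2 (infrared instrument)

def corrected_mebr(tc_reading: float, co2_pct: float) -> float:
    """Subtract the CO2-induced false signal from a TC meter reading (g/m^3)."""
    return tc_reading - TC_BIAS * co2_pct

# A raw reading of 100 g/m^3 in the presence of 2% CO2
print(corrected_mebr(100.0, 2.0))
```

    This assumes the CO2 concentration is known, which in practice requires a separate CO2 measurement or, as the paper concludes, a CO2 scrubber ahead of the meter.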

  6. Qualitative development of a patient-reported outcome symptom measure in diarrhea-predominant irritable bowel syndrome.

    PubMed

    Marquis, P; Lasch, K E; Delgado-Herrera, L; Kothari, S; Lembo, A; Lademacher, C; Spears, G; Nishida, A; Tesler, Waldman L; Piault, E; Rosa, K; Zeiher, B

    2014-06-26

    Despite a documented clinical need, no patient-reported outcome (PRO) symptom measure meeting current regulatory requirements for clinically relevant end points is available for the evaluation of treatment benefit in diarrhea-predominant IBS (IBS-D). Patients (N=113) with IBS-D participated in five study phases: (1) eight concept elicitation focus groups (N=34), from which a 17-item IBS-D Daily Symptom Diary and four-item IBS-D Symptom Event Log (Diary and Event Log) were developed; (2) one-on-one cognitive interviews (N=11) to assess the instrument's comprehensiveness, understandability, appropriateness, and readability; (3) four data triangulation focus groups (N=32) to confirm the concepts elicited; (4) two hybrid (concept elicitation and cognitive interview) focus groups (N=16); and (5) two iterative sets of one-on-one cognitive interviews (N=20) to further clarify the symptoms of IBS-D and debrief a revised seven-item Diary and four-item Event Log. Of the 36 concepts initially identified, 22 were excluded because they were not saturated, not clinically relevant, not critical symptoms of IBS-D, considered upper GI symptoms, or too broad or vaguely defined. The remaining concepts were diarrhea, immediate need (urgency), bloating/pressure, frequency of bowel movements, cramps, abdominal/stomach pain, gas, completely emptied bowels/incomplete evacuation, accidents, bubbling in intestines (bowel sounds), rectal burning, stool consistency, rectal spasm, and pain while wiping. The final instrument included a daily diary with separate items for abdominal and stomach pain and an event log with four items completed after each bowel movement as follows: (1) a record of the bowel movement/event and an assessment of (2) severity of immediacy of need/bowel urgency, (3) incomplete evacuation, and (4) stool consistency (evaluated using the newly developed Astellas Stool Form Scale).
Based on rounds of interviews and clinical input, items considered secondary or nonspecific to IBS-D (rectal burning, bubbling in intestines, spasms, and pain while wiping) were excluded. The IBS-D Symptom Diary and Event Log represent a rigorously developed PRO instrument for the measurement of the IBS-D symptom experience from the perspective of the patient. Its content validity has been supported, and future work should evaluate the instrument's psychometric properties.

  7. Subsurface Formation Evaluation on Mars: Application of Methods from the Oil Patch

    NASA Astrophysics Data System (ADS)

    Passey, Q. R.

    2006-12-01

    The ability to drill 10- to 100-meter-deep wellbores on Mars would allow for evaluation of shallow subsurface formations, enabling the extension of current interpretations of the geologic history of this planet; moreover, subsurface access is likely to provide direct evidence to determine if water or permafrost is present. Methodologies for evaluating sedimentary rocks using drill holes and in situ sample and data acquisition are well developed here on Earth. Existing well log instruments can measure K, Th, and U from natural spectral gamma-ray emission, compressional and shear acoustic velocities, electrical resistivity and dielectric properties, bulk density (Cs-137 or Co-60 source), photoelectric absorption of gamma-rays (sensitive to the atomic number), hydrogen index from epithermal and thermal neutron scattering and capture, free hydrogen in water molecules from nuclear magnetic resonance, formation capture cross section, temperature, pressure, and elemental abundances (C, O, Si, Ca, H, Cl, Fe, S, and Gd) using 14 MeV pulsed neutron activation (more elements are possible with supercooled Ge detectors). Additionally, high-resolution wellbore images are possible using a variety of optical, electrical, and acoustic imaging tools. In the oil industry, these downhole measurements are integrated to describe potential hydrocarbon reservoir properties: lithology, mineralogy, porosity, depositional environment, sedimentary and structural dip, sedimentary features, fluid type (oil, gas, or water), and fluid amount (i.e., saturation). In many cases it is possible to determine the organic-carbon content of hydrocarbon source rocks from logs (if the total organic carbon content is 1 wt% or greater), and more accurate instruments likely could be developed. 
    Since Martian boreholes will likely be drilled without opaque drilling fluids (as generally used in terrestrial drilling), additional instruments can be used, such as high-resolution direct downhole imaging and other surface-contact measurements (such as IR spectroscopy and X-ray fluorescence). However, such wellbores would require modification of some instruments, since conventional drilling fluids often provide the coupling of the instrument sensors to the formation (e.g., sonic velocity and galvanic resistivity measurements). The ability to drill wellbores on Mars opens up new opportunities for exploration but also introduces additional technical challenges. Currently it is not known if all existing terrestrial logging instruments can be miniaturized sufficiently for a shallow Mars wellbore, but the existing well logging techniques and instruments provide a solid framework on which to build a Martian subsurface evaluation program.
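
    One of the simplest examples of the log integration described above is deriving porosity from a bulk-density measurement. A minimal sketch of the standard density-porosity relation used in the oil patch; the matrix and fluid densities below are typical assumed values (quartz matrix, water-filled pores), not figures from the abstract:

```python
# Standard density-log porosity relation:
#   porosity = (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)
# Default parameter values are common assumptions, not from this record.

def density_porosity(rho_bulk: float, rho_matrix: float = 2.65,
                     rho_fluid: float = 1.0) -> float:
    """Fractional porosity from a bulk-density log reading (g/cm^3)."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

# A bulk-density reading of 2.40 g/cm^3 in a quartz sandstone
print(round(density_porosity(2.40), 3))
```

    On Mars the fluid density term would itself be an unknown (water, ice, or gas-filled pores), which is one reason combining several log types is essential.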

  8. Factors Driving Learner Success in Online Professional Development

    ERIC Educational Resources Information Center

    Vu, Phu; Cao, Vien; Vu, Lan; Cepero, Jude

    2014-01-01

    This study examined factors that contributed to the success of online learners in an online professional development course. Research instruments included an online survey and learners' activity logs in an online professional development course for 512 in-service teachers. The findings showed that there were several factors affecting online…

  9. Keystroke Logging in Writing Research: Using Inputlog to Analyze and Visualize Writing Processes

    ERIC Educational Resources Information Center

    Leijten, Marielle; Van Waes, Luuk

    2013-01-01

    Keystroke logging has become instrumental in identifying writing strategies and understanding cognitive processes. Recent technological advances have refined logging efficiency and analytical outputs. While keystroke logging allows for ecological data collection, it is often difficult to connect the fine grain of logging data to the underlying…

  10. Considering the effect of the logging cable stretching under the influence of its own mass in the well on the accuracy of geophysical parameters determining

    NASA Astrophysics Data System (ADS)

    Kozhushko, A. A.; Pahotin, A. N.; Mal'tsev, V. N.; Bojarnikova, L. V.; Stepanova, E. P.

    2018-04-01

    The technology of carrying out various geophysical studies of oil wells using a carrying well-logging cable for the delivery of geophysical instruments and equipment is considered in the article. The relevance of the topic results from the need to evaluate the effect of well-logging cable stretch in the well, under the influence of its own mass and the mass of the geophysical instruments and equipment, on the accuracy and adequacy of geophysical studies. Calculation formulas are presented for determining the well-logging cable stretch under the mass of the geophysical tools and equipment, both with and without taking into account the mass of the carrying cable itself. For three types of carrying well-logging cables, the stretch in an oil well under the mass of the geophysical instruments and equipment and the cable's own mass was calculated as a function of the depth of the investigated well. Analysis of the results makes it possible to evaluate the cable extension numerically as a function of well depth; correcting the measured depths accordingly provides the required depth accuracy and reliability of the interpreted geophysical results.
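
    The stretch calculation described can be sketched with the standard linear-elasticity result: a tool of weight W at the end of a cable of length L and axial stiffness EA stretches it by WL/EA, and the cable's own distributed weight w (per unit length) adds wL^2/(2EA). This is the textbook formula, not necessarily the exact formulation in the article, and all numerical values below are illustrative assumptions:

```python
# Elastic stretch of a logging cable: point load (tool) plus the cable's
# own distributed weight. EA is the axial stiffness of the cable (N).
# All parameter values are illustrative, not taken from the article.

def cable_stretch_m(depth_m: float, tool_weight_n: float,
                    cable_weight_n_per_m: float, ea_n: float) -> float:
    """Total elongation (m) of a cable of length depth_m hanging in a well."""
    tool_term = tool_weight_n * depth_m / ea_n
    self_weight_term = cable_weight_n_per_m * depth_m ** 2 / (2.0 * ea_n)
    return tool_term + self_weight_term

# Example: 3000 m of cable, 500 N tool, 4 N/m cable weight, EA = 5e6 N
print(round(cable_stretch_m(3000.0, 500.0, 4.0, 5e6), 3))
```

    Note that the self-weight term grows with the square of depth, which is why the correction matters most for deep wells.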

  11. High Temperature Logging and Monitoring Instruments to Explore and Drill Deep into Hot Oceanic Crust.

    NASA Astrophysics Data System (ADS)

    Denchik, N.; Pezard, P. A.; Ragnar, A.; Jean-Luc, D.; Jan, H.

    2014-12-01

    Drilling an entire section of the oceanic crust and through the Moho has been a goal of the scientific community for more than half a century. On the basis of ODP and IODP experience and data, this will require instruments and strategies working at temperatures far above 200°C (reached, for example, at the bottom of DSDP/ODP Hole 504B), and possibly beyond 300°C. Concerning logging and monitoring instruments, progress was made over the past ten years in the context of the HiTI ("High Temperature Instruments") project, funded by the European Community, for deep drilling in hot Icelandic geothermal holes where supercritical conditions and a highly corrosive environment are expected at depth (with temperatures above 374°C and pressures exceeding 22 MPa). For example, a slickline (memory) tool tolerating up to 400°C and wireline tools tolerating up to 300°C were developed and tested in Icelandic high-temperature geothermal fields. The temperature limitation of the logging tools was set to comply with the present limitation of wireline cables (320°C). As part of this new set of downhole tools, temperature, pressure, fluid flow, and casing collar location can be measured up to 400°C with a single multisensor tool. Tools recording the natural gamma radiation spectrum and ultrasonic borehole-wall images, as well as fiber-optic cables (using distributed temperature sensing methods), were also developed for wireline deployment up to 300°C and tested in the field. A wireline dual-laterolog electrical resistivity tool was also developed but could not be field tested as part of HiTI. This new set of tools constitutes a basis for the deep exploration of the oceanic crust in the future. 
    In addition, new strategies including the real-time integration of drilling parameters with modeling of the thermo-mechanical status of the borehole could be developed, using time-lapse logging of temperature (for heat flow determination) and borehole-wall images (for hole stability and in-situ stress determination) as boundary conditions for the models. In all, and with limited integration of existing tools, the deployment of high-temperature downhole tools could contribute greatly to the success of the long-awaited Mohole project.

  12. Development and validation of a new self-report instrument for measuring sedentary behaviors and light-intensity physical activity in adults.

    PubMed

    Barwais, Faisal Awad; Cuddihy, Thomas F; Washington, Tracy; Tomson, L Michaud; Brymer, Eric

    2014-08-01

    Low levels of physical activity and high levels of sedentary behavior (SB) are major public health concerns. This study was designed to develop and validate the 7-day Sedentary (S) and Light Intensity Physical Activity (LIPA) Log (7-day SLIPA Log), a self-report measure of specific daily behaviors. To develop the log, 62 specific SB and LIPA behaviors were chosen from the Compendium of Physical Activities. Face-to-face interviews were conducted with 32 sedentary volunteers to identify domains and behaviors of SB and LIPA. To validate the log, a further 22 sedentary adults were recruited to wear the GT3x for 7 consecutive days and nights. Pearson correlations (r) between the 7-day SLIPA Log and GT3x were significant for SB (r = .86, P < .001) and for LIPA (r = .80, P < .001). Lying and sitting postures were positively correlated with GT3x output (r = .60 and r = .64, P < .001, respectively). No significant correlation was found for standing posture (r = .14, P = .53). The kappa values between the 7-day SLIPA Log and GT3x variables ranged from 0.09 to 0.61, indicating poor to good agreement. The 7-day SLIPA Log is a valid self-report measure of SB and LIPA in specific behavioral domains.

  13. Designing and Piloting a Leadership Daily Practice Log: Using Logs to Study the Practice of Leadership

    ERIC Educational Resources Information Center

    Spillane, James P.; Zuberi, Anita

    2009-01-01

    Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…

  14. Measuring physical activity during US Army Basic Combat Training: a comparison of 3 methods.

    PubMed

    Redmond, Jan E; Cohen, Bruce S; Simpson, Kathleen; Spiering, Barry A; Sharp, Marilyn A

    2013-01-01

    An understanding of the demands of physical activity (PA) during US Army Basic Combat Training (BCT) is necessary to support Soldier readiness and resilience. The purpose of this study was to determine the agreement among 3 different PA measurement instruments in the BCT environment. Twenty-four recruits from each of 11 companies wore an ActiGraph accelerometer (Actigraph, LLC, Pensacola, FL) and completed a daily PA log during 8 weeks of BCT at 2 different training sites. The PA of one recruit from each company was recorded using PAtracker, an Army-developed direct observation tool. Information obtained from the accelerometer, PA log, and PAtracker included time spent in various types of PA, body positions, PA intensities, and external loads carried. Pearson product moment correlations were run to determine the strength of association between the ActiGraph and PAtracker for measures of PA intensity and between the PAtracker and daily PA log for measures of body position and PA type. The Bland-Altman method was used to assess the limits of agreement (LoA) between the measurement instruments. Weak correlations (r=-0.052 to r=0.302) were found between the ActiGraph and PAtracker for PA intensity. Weak but positive correlations (r=0.033 to r=0.268) were found between the PAtracker and daily PA log for body position and type of PA. The 95% LoA for the ActiGraph and PAtracker for PA intensity were in disagreement. The 95% LoA for the PAtracker and daily PA log for standing and running and all PA types were in disagreement; sitting and walking were in agreement. The ActiGraph accelerometer provided the best measure of the recruits' PA intensity while the PAtracker and daily PA log were best for capturing body position and type of PA in the BCT environment. The use of multiple PA measurement instruments in this study was necessary to best characterize the physical demands of BCT.
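
    The Bland-Altman limits-of-agreement method used above to compare instruments can be sketched in a few lines: compute the paired differences, then report the mean difference (bias) plus and minus 1.96 standard deviations. The paired readings below are made up for illustration, not data from the study:

```python
# Bland-Altman limits of agreement for two instruments measuring the same
# quantity. Data are synthetic minutes of moderate activity, for illustration.
import statistics

def bland_altman_loa(a, b):
    """Return (bias, lower LoA, upper LoA) for paired readings a and b."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

actigraph = [30.0, 42.0, 25.0, 38.0, 33.0]  # hypothetical instrument A readings
patracker = [28.0, 45.0, 22.0, 40.0, 30.0]  # hypothetical instrument B readings
bias, lo, hi = bland_altman_loa(actigraph, patracker)
print(round(bias, 2), round(lo, 2), round(hi, 2))
```

    Two instruments "agree" in the Bland-Altman sense when the interval (lo, hi) is narrow enough to be acceptable for the application, not merely when the correlation is high.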

  15. Log Truck-Weighing System

    NASA Technical Reports Server (NTRS)

    1977-01-01

    ELDEC Corp., Lynwood, Wash., built a weight-recording system for logging trucks based on electronic technology the company acquired as a subcontractor on space programs such as Apollo and the Saturn launch vehicle. ELDEC employed its space-derived expertise to develop a computerized weight-and-balance system for Lockheed's TriStar jetliner, then adapted the airliner system to a similar product for logging trucks. Electronic equipment computes tractor weight, trailer weight, and overall gross weight, and this information is presented to the driver by an instrument in the cab. The system costs $2,000, but it pays for itself in a single year: it allows operators to use a truck's hauling capacity more efficiently, since the load can be maximized without exceeding legal weight limits for highway travel. Approximately 2,000 logging trucks now use the system.

  16. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. To instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file; this is useful for activating logging in daemons/services (e.g., the GridFTP server). 
    The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe; a typical use is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a MySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger Visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
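
    The instrumentation pattern described above, timestamped events emitted at critical points in a simple field=value text format, can be sketched as follows. The class and method names (EventLog, write_event) and the field names are invented for illustration; they are not the actual NetLogger API:

```python
# Minimal sketch of timestamped event logging at application "critical
# points", in a ULM-style "FIELD=value" one-event-per-line text format.
# This mimics the pattern the abstract describes; it is not NetLogger itself.
import time

class EventLog:
    def __init__(self):
        self.lines = []

    def write_event(self, name, **fields):
        # One event per line: timestamp, event name, then key=value pairs
        parts = ["DATE=%.6f" % time.time(), "NL.EVNT=%s" % name]
        parts += ["%s=%s" % (k, v) for k, v in sorted(fields.items())]
        self.lines.append(" ".join(parts))

log = EventLog()
log.write_event("transfer.start", host="node1", bytes=0)
log.write_event("transfer.end", host="node1", bytes=1048576)
print("\n".join(log.lines))
```

    Because every component logs in the same format with synchronized clocks, events from different hosts can later be merged and sorted by timestamp to reconstruct the end-to-end communication path.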

  17. Instrumentation at Paranal Observatory: maintaining the instrument suite of five large telescopes and its interferometer alive

    NASA Astrophysics Data System (ADS)

    Gillet, Gordon; Alvarez, José Luis; Beltrán, Juan; Bourget, Pierre; Castillo, Roberto; Diaz, Álvaro; Haddad, Nicolás; Leiva, Alfredo; Mardones, Pedro; O'Neal, Jared; Ribes, Mauricio; Riquelme, Miguel; Robert, Pascal; Rojas, Chester; Valenzuela, Javier

    2010-07-01

    This presentation provides an overview of the instrumentation activities at Paranal Observatory. It introduces the suite of 23 instruments and auxiliary systems that are under the responsibility of the Paranal Instrumentation group, with information on the type of instruments, their usage, and downtime statistics. The data are based on comprehensive records in the Paranal Night Log System and the Paranal Problem Reporting System, whose principles are explained as well. The work organization of the 15 team members around the high number of instruments is laid out, which includes: maintaining older instruments with obsolete components; receiving new instruments and supporting their integration and commissioning; and contributing to future instruments in their development phase. The assignments of the Instrumentation staff to the actual instruments as well as auxiliary equipment (Laser Guide Star Facility, Mask Manufacturing Unit, Cloud Observation Tool) are explained with respect to responsibility and scheduling issues. The essential hardware and software activities are presented, as well as the technical and organizational developments within the group towards its present and future challenges.

  18. NMR logging apparatus

    DOEpatents

    Walsh, David O; Turner, Peter

    2014-05-27

    Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.
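
    The impedance-matching step mentioned in this record can be illustrated with the textbook L-network design equations for matching a low load resistance up to a higher source-side resistance. This is the generic technique, not the specific matching means claimed in the patent, and all component values and the operating frequency are hypothetical:

```python
# Textbook L-network impedance match: series reactance on the low-R side,
# shunt reactance on the high-R side.
#   Q = sqrt(R_high/R_low - 1),  Xs = Q*R_low,  Xp = R_high/Q
# Values are illustrative, not from the patent.
import math

def l_match(r_high: float, r_low: float, freq_hz: float):
    """Return (series L in henries, shunt C in farads) matching r_low to r_high."""
    q = math.sqrt(r_high / r_low - 1.0)
    xs = q * r_low          # series inductive reactance on the low-R side
    xp = r_high / q         # shunt capacitive reactance on the high-R side
    w = 2.0 * math.pi * freq_hz
    return xs / w, 1.0 / (w * xp)

# Match a hypothetical 10-ohm downhole probe to a 50-ohm amplifier output
# at an assumed 250 kHz operating frequency
L, C = l_match(50.0, 10.0, 250e3)
print(L, C)
```

    In practice the transmission line between the surface amplifier and the downhole probe transforms the load impedance as well, so the matching network must account for the line length and characteristic impedance, not just the probe's terminal impedance.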

  19. Application of borehole geophysics to water-resources investigations

    USGS Publications Warehouse

    Keys, W.S.; MacCary, L.M.

    1971-01-01

    This manual is intended to be a guide for hydrologists using borehole geophysics in ground-water studies. The emphasis is on the application and interpretation of geophysical well logs, and not on the operation of a logger. It describes in detail those logging techniques that have been utilized within the Water Resources Division of the U.S. Geological Survey, and those used in petroleum investigations that have potential application to hydrologic problems. Most of the logs described can be made by commercial logging service companies, and many can be made with small water-well loggers. The general principles of each technique and the rules of log interpretation are the same, regardless of differences in instrumentation. Geophysical well logs can be interpreted to determine the lithology, geometry, resistivity, formation factor, bulk density, porosity, permeability, moisture content, and specific yield of water-bearing rocks, and to define the source, movement, and chemical and physical characteristics of ground water. Numerous examples of logs are used to illustrate applications and interpretation in various ground-water environments. The interrelations between various types of logs are emphasized, and the following aspects are described for each of the important logging techniques: Principles and applications, instrumentation, calibration and standardization, radius of investigation, and extraneous effects.

  20. Check-off logs for routine equipment maintenance.

    PubMed

    Brewster, M A; Carver, P H; Randolph, B

    1995-12-01

    The regulatory requirement for appropriate routine instrument maintenance documentation is approached by laboratories in numerous ways. Standard operating procedures (SOPs) may refer to maintenance listed in instrument manuals, may indicate "periodic" performance of an action, or may indicate specific tasks to be performed at certain frequencies. The Quality Assurance Unit (QAU) task of assuring the performance of these indicated maintenance tasks can be extremely laborious if these records are merged with other analysis records. Further, the lack of written maintenance schedules often leads to omission of infrequently performed tasks. We recommend creation of routine maintenance check-off logs for instruments with tasks grouped by frequency of expected performance. Usage of such logs should result in better laboratory compliance with SOPs and the compliance can be readily monitored by QAU or by regulatory agencies.

  1. Cosmological Distance Scale to Gamma-Ray Bursts

    NASA Astrophysics Data System (ADS)

    Azzam, W. J.; Linder, E. V.; Petrosian, V.

    1993-05-01

    The source counts, or the so-called log N -- log S relations, are the primary data that constrain the spatial distribution of sources with unknown distances, such as gamma-ray bursts. In order to test galactic, halo, and cosmological models for gamma-ray bursts we compare theoretical characteristics of the log N -- log S relations to those obtained from data gathered by the BATSE instrument on board the Compton Observatory (GRO) and other instruments. We use a new and statistically correct method that takes proper account of the variable nature of the triggering threshold to analyze the data. Constraints on models obtained by this comparison will be presented. This work is supported by NASA grants NAGW 2290, NAG5 2036, and NAG5 1578.
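
    The log N -- log S construction itself is simple: for each flux threshold S, count the sources brighter than S. For sources distributed uniformly in Euclidean space, N(>S) scales as S^(-3/2). A minimal sketch with synthetic fluxes (not BATSE data):

```python
# Building a log N - log S curve: N(>S) is the number of sources with
# peak flux above S. The synthetic sample below is drawn so that
# N(>S) ~ S^(-3/2), the homogeneous-Euclidean expectation.

def cumulative_counts(fluxes, thresholds):
    """Return N(>S) for each threshold S."""
    return [sum(1 for f in fluxes if f > s) for s in thresholds]

# 1000 synthetic sources: the i-th brightest has flux (i/1000)^(-2/3),
# which yields the -3/2 cumulative slope
fluxes = [(i / 1000.0) ** (-2.0 / 3.0) for i in range(1, 1001)]
counts = cumulative_counts(fluxes, [1.0, 10.0])
print(counts)
```

    Deviations from the -3/2 slope at faint fluxes are exactly what the galactic, halo, and cosmological models predict differently; the triggering-threshold correction mentioned in the abstract matters because the faintest bins are the most threshold-biased.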

  2. NMR Methods, Applications and Trends for Groundwater Evaluation and Management

    NASA Astrophysics Data System (ADS)

    Walsh, D. O.; Grunewald, E. D.

    2011-12-01

    Nuclear magnetic resonance (NMR) measurements have a tremendous potential for improving groundwater characterization, as they provide direct detection and measurement of groundwater and unique information about pore-scale properties. NMR measurements, commonly used in chemistry and medicine, are utilized in geophysical investigations through non-invasive surface NMR (SNMR) or downhole NMR logging measurements. Our recent and ongoing research has focused on improving the performance and interpretation of NMR field measurements for groundwater characterization. Engineering advancements have addressed several key technical challenges associated with SNMR measurements. Susceptibility of SNMR measurements to environmental noise has been dramatically reduced through the development of multi-channel acquisition hardware and noise-cancellation software. Multi-channel instrumentation (up to 12 channels) has also enabled more efficient 2D and 3D imaging. Previous limitations in measuring NMR signals from water in silt, clay and magnetic geology have been addressed by shortening the instrument dead-time from 40 ms to 4 ms, and increasing the power output. Improved pulse sequences have been developed to more accurately estimate NMR relaxation times and their distributions, which are sensitive to pore size distributions. Cumulatively, these advancements have vastly expanded the range of environments in which SNMR measurements can be obtained, enabling detection of groundwater in smaller pores, in magnetic geology, in the unsaturated zone, and nearby to infrastructure (presented here in case studies). NMR logging can provide high-resolution estimates of bound and mobile water content and pore size distributions. While NMR logging has been utilized in oil and gas applications for decades, its use in groundwater investigations has been limited by the large size and high cost of oilfield NMR logging tools and services. 
    Recently, engineering efforts funded by the US Department of Energy have produced an NMR logging tool that is much smaller and less costly than comparable oilfield NMR logging tools. This system is specifically designed for near-surface groundwater investigations, incorporates small-diameter probes (as small as 1.67 inches in diameter) and man-portable surface stations, and provides NMR data and information content on par with oilfield NMR logging tools. A direct-push variant of this logging tool has also been developed. Key challenges associated with small-diameter tools include inherently lower SNR and logging speeds, the desire to extend the sensitive zone as far as possible into unconsolidated formations, and simultaneously maintaining high power and signal fidelity. Our ongoing research in groundwater NMR aims to integrate surface and borehole measurements for regional-scale permeability mapping and to develop in-place NMR sensors for long-term monitoring of contaminant and remediation processes. In addition to groundwater resource characterization, promising new applications of NMR include assessing water content in ice and permafrost, management of groundwater in mining operations, and evaluation and management of groundwater in civil engineering applications.

  3. Development of Pediatric Sleep Questionnaires as Diagnostic or Epidemiological Tools: A Brief Review of Do's and Don'ts

    PubMed Central

    Spruyt, Karen; Gozal, David

    2010-01-01

    Questionnaires are a useful and extensively used tool in clinical sleep medicine and in sleep research. The number of sleep questionnaires targeting the pediatric age range has increased tremendously in recent years, and with such an explosion in the number of instruments, their heterogeneity has become all the more apparent. Here, we explore the theoretical and pragmatic processes required for instrument design and development, i.e., how any questionnaire, inventory, log, or diary should be created and evaluated, and also provide illustrative examples to further underline the potential pitfalls that are inherently embedded in every step of tool development. PMID:20952230

  4. Proceedings of the Conference on High-temperature Electronics

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of electronic devices for use in high-temperature environments is addressed. The instrumentation needs of planetary exploration, fossil and nuclear power reactors, turbine engine monitoring, and well logging are defined. Emphasis is placed on the fabrication and performance of materials and semiconductor devices, circuits and systems, and packaging.

  5. Continued development of hybrid directional boring technology and New horizontal logging development for characterization, monitoring and instrument emplacement at environmental sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wemple, R.P.; Meyer, R.D.; Jacobson, R.D.

    This work in partnership with industry is a continuation of cost-effective, innovative directional boring development begun in FY90 and planned to extend into FY94. Several demonstrations of the strategy of building hybrid hardware from utilities installation, geothermal, and soil mechanics technologies have been performed at Sandia National Laboratories (SNL) and at Charles Machine Works (CMW) test sites, as well as at a commercial refinery site. Additional tests at the SNL Directional Boring Test Range (DBTR) and a lagoon site are planned in calendar 1991. A new companion project to develop and demonstrate a hybrid capability for horizontal logging with penetrometers, specialty instruments and samplers has been taken from concept to early prototype hardware. The project goal of extending the tracking/locating capability of the shallow boring equipment to 80 in. is being pursued with encouraging results at 40 in. depths. Boring costs, not including tailored well completions dictated by individual site parameters, are estimated at $20 to $50 per foot. Applications continue to emerge for this work, and interest continues to be expressed by DoD and EPA researchers and environmental site engineers. 12 figs.

  6. Instrument response measurements of ion mobility spectrometers in situ: maintaining optimal system performance of fielded systems

    NASA Astrophysics Data System (ADS)

    Wallis, Eric; Griffin, Todd M.; Popkie, Norm, Jr.; Eagan, Michael A.; McAtee, Robert F.; Vrazel, Danet; McKinly, Jim

    2005-05-01

    Ion mobility spectrometry (IMS) is the most widespread detection technique in use by the military for the detection of chemical warfare agents, explosives, and other threat agents. Moreover, its role in homeland security and force protection has expanded due, in part, to its good sensitivity, low power consumption, light weight, and reasonable cost. With the increased use of IMS systems as continuous monitors, it becomes necessary to develop tools and methodologies to ensure optimal performance over a wide range of conditions and extended periods of time. In particular, instrument calibration is needed to ensure proper sensitivity and to correct for matrix or environmental effects. We have developed methodologies that deal with the semi-quantitative nature of IMS and allow us to generate response curves that gauge instrument performance and maintenance requirements. This instrumentation communicates with the IMS systems via a software interface that was developed in-house. The software measures system response, logs information to a database, and generates the response curves. This paper discusses the instrumentation, software, data collected, and initial results from fielded systems.
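As a hedged sketch of the response-curve idea (the abstract does not detail the in-house software, so the linear model and the 20% drift threshold below are assumptions for illustration only):

```python
def fit_response_curve(concentrations, responses):
    """Least-squares line: response = gain * concentration + offset.

    IMS response is only semi-quantitative; a linear model is an
    illustrative simplification over a narrow concentration range.
    """
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(responses) / n
    gain = sum((x - mx) * (y - my) for x, y in zip(concentrations, responses)) \
        / sum((x - mx) ** 2 for x in concentrations)
    offset = my - gain * mx
    return gain, offset

def needs_maintenance(gain, nominal_gain, tolerance=0.2):
    """Flag the instrument when sensitivity has drifted more than an
    assumed 20% from its nominal calibration gain."""
    return abs(gain - nominal_gain) / nominal_gain > tolerance

# Synthetic calibration run: unit gain, small baseline offset
gain, offset = fit_response_curve([0.0, 1.0, 2.0, 3.0], [0.1, 1.1, 2.1, 3.1])
```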

  7. Evaluation and performance of a newly developed patient-reported outcome instrument for diarrhea-predominant irritable bowel syndrome in a clinical study population

    PubMed Central

    Delgado-Herrera, Leticia; Lasch, Kathryn; Zeiher, Bernhardt; Lembo, Anthony J.; Drossman, Douglas A.; Banderas, Benjamin; Rosa, Kathleen; Lademacher, Christopher; Arbuckle, Rob

    2017-01-01

    Background: To evaluate the psychometric properties of the newly developed seven-item Irritable Bowel Syndrome – Diarrhea predominant (IBS-D) Daily Symptom Diary and four-item Event Log using phase II clinical trial safety and efficacy data in patients with IBS-D. This instrument measures diarrhea (stool frequency and stool consistency), abdominal pain related to IBS-D (stomach pain, abdominal pain, abdominal cramps), immediate need to have a bowel movement (immediate need and accident occurrence), bloating, pressure, gas, and incomplete evacuation. Methods: Psychometric properties and responsiveness of the instrument were evaluated in a clinical trial population [ClinicalTrials.gov identifier: NCT01494233]. Results: A total of 434 patients were included in the analyses. Significant differences were found among severity groups (p < 0.01) defined by IBS Patient Global Impression of Severity (PGI-S) and IBS Patient Global Impression of Change (PGI-C). Severity scores for each Diary and Event Log item score and five-item, four-item, and three-item summary scores were calculated. Between-group differences in changes over time were significant for all summary scores in groups stratified by changes in PGI-S (p < 0.05), two of six Diary items, and three of four Event Log items; a one-grade change in PGI-S was considered a meaningful difference, with mean change scores on all Diary items ranging from −0.13 to −0.86 [standard deviation (SD) 0.79–1.39]. Similarly, for patients who reported being ‘slightly improved’ (considered a clinically meaningful difference) on the PGI-C, mean change scores on Diary items ranged from −0.45 to −1.55 (SD 0.69–1.39). All estimates of clinically important change for each item and all summary scores were small and should be considered preliminary. These results are aligned with the previous standalone psychometric study regarding reliability and validity tests.
Conclusions: These analyses provide evidence of the psychometric properties of the IBS-D Daily Symptom Diary and Event Log in a clinical trial population. PMID:28932269

  8. Rugged, Low Cost, Environmental Sensors for a Turbulent World

    NASA Astrophysics Data System (ADS)

    Schulz, B.; Sandell, C. T.; Wickert, A. D.

    2017-12-01

    Ongoing scientific research and resource management require a diverse range of high-quality, low-cost sensors to maximize the number and type of measurements that can be obtained. To accomplish this, we have developed a series of diversified sensors for common environmental applications. The TP-DownHole is an ultra-compact temperature and pressure sensor designed for use in CMT (Continuous Multi-channel Tubing) multi-level wells. Its 1 mm water depth resolution, 30 cm altitude resolution, and rugged design make it ideal for both water level measurements and monitoring barometric pressure and associated temperature changes. The TP-DownHole sensor has also been incorporated into a self-contained, fully independent data recorder for extreme and remote environments. This device (the TP-Solo) is based on the TP-DownHole design, but has self-contained power and data storage and is designed to collect data independently for up to 6 months (logging once an hour), creating a specialized tool for extreme-environment data collection. To gather spectral information, we have also developed a very low-cost photodiode-based lux sensor to measure spectral irradiance; while this does not measure the entire solar radiation spectrum, simple modeling to rescale the remainder of the solar spectrum makes this a cost-effective alternative to a thermopile pyranometer. Lastly, we have developed an instrumentation amplifier designed to interface a wide range of sensitive instruments, such as thermopile pyranometers, thermocouples, and many other analog-output sensors, to common data logging systems. These three instruments are the first in a diverse family intended to give researchers a set of powerful and low-cost tools for environmental instrumentation.
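The pressure-to-depth conversion underlying such a downhole sensor can be sketched as follows (nominal freshwater density and standard gravity are assumed; the TP-DownHole's actual calibration is not given in the abstract):

```python
def water_depth_m(p_abs_pa, p_atm_pa=101_325.0, rho_kg_m3=1000.0, g_m_s2=9.81):
    """Hydrostatic depth from absolute pressure: d = (P - P_atm) / (rho * g).

    Freshwater density and standard gravity are illustrative constants; a
    real deployment would subtract a co-located barometric reading instead
    of a fixed standard atmosphere.
    """
    return (p_abs_pa - p_atm_pa) / (rho_kg_m3 * g_m_s2)

# One metre of fresh water adds rho * g * 1 m = 9810 Pa above atmosphere
depth = water_depth_m(101_325.0 + 9_810.0)
```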

  9. 10 CFR 39.69 - Radioactive contamination control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.69 Radioactive contamination control. (a) If the licensee detects... licensee shall continuously monitor, with an appropriate radiation detection instrument or a logging tool...

  10. 10 CFR 39.69 - Radioactive contamination control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.69 Radioactive contamination control. (a) If the licensee detects... licensee shall continuously monitor, with an appropriate radiation detection instrument or a logging tool...

  11. 10 CFR 39.69 - Radioactive contamination control.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.69 Radioactive contamination control. (a) If the licensee detects... licensee shall continuously monitor, with an appropriate radiation detection instrument or a logging tool...

  12. 10 CFR 39.69 - Radioactive contamination control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.69 Radioactive contamination control. (a) If the licensee detects... licensee shall continuously monitor, with an appropriate radiation detection instrument or a logging tool...

  13. 10 CFR 39.69 - Radioactive contamination control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.69 Radioactive contamination control. (a) If the licensee detects... licensee shall continuously monitor, with an appropriate radiation detection instrument or a logging tool...

  14. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR that adds a simple, very user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission, MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence this approach adds no instrumentation overhead. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
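The flavor of such rule-based log checking can be sketched in a few lines of Python. This is a toy counting check of one "response" temporal property, not RuleR's or LogScope's actual specification language; the event names are hypothetical.

```python
def every_request_acknowledged(trace, request, response):
    """Toy temporal rule: every `request` event must eventually be matched
    by a `response` event later in the trace (checked by simple counting,
    as an off-line, post-mortem log analysis would do)."""
    pending = 0
    for event in trace:
        if event == request:
            pending += 1
        elif event == response and pending > 0:
            pending -= 1
    return pending == 0

# Off-line analysis of two recorded event logs
ok_trace = ["cmd_sent", "cmd_ack", "cmd_sent", "cmd_ack"]
bad_trace = ["cmd_sent", "cmd_sent", "cmd_ack"]
```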

  15. 10 CFR 39.33 - Radiation detection instruments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Radiation detection instruments. 39.33 Section 39.33 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Equipment § 39.33 Radiation detection instruments. (a) The licensee shall keep a calibrated and operable...

  16. 10 CFR 39.33 - Radiation detection instruments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Radiation detection instruments. 39.33 Section 39.33 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Equipment § 39.33 Radiation detection instruments. (a) The licensee shall keep a calibrated and operable...

  17. 10 CFR 39.33 - Radiation detection instruments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Radiation detection instruments. 39.33 Section 39.33 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Equipment § 39.33 Radiation detection instruments. (a) The licensee shall keep a calibrated and operable...

  18. 10 CFR 39.33 - Radiation detection instruments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Radiation detection instruments. 39.33 Section 39.33 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Equipment § 39.33 Radiation detection instruments. (a) The licensee shall keep a calibrated and operable...

  19. 10 CFR 39.33 - Radiation detection instruments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Radiation detection instruments. 39.33 Section 39.33 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Equipment § 39.33 Radiation detection instruments. (a) The licensee shall keep a calibrated and operable...

  20. New LWD tools are just in time to probe for baby elephants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghiselin, D.

    Development of sophisticated formation evaluation instrumentation for use while drilling has led to a stratification of while-drilling services. Measurements while drilling (MWD) comprises measurements of mechanical parameters like weight-on-bit, mud pressures, torque, vibration, hole angle and direction. Logging while drilling (LWD) describes resistivity, sonic, and radiation logging which rival wireline measurements in accuracy. A critical feature of LWD is the rate that data can be telemetered to the surface. Early tools could only transmit 3 bits per second one way. In the last decade, the data rate has more than tripled. Despite these improvements, LWD tools have the ability to make many more measurements than can be telemetered in real-time. The paper discusses the development of this technology and its applications.

  1. Geomicrobial Optical Logging Detectors (GOLD)

    NASA Astrophysics Data System (ADS)

    Bramall, N. E.; Stoker, C. R.; Price, P. B.; Coates, J. D.; Allamandola, L. J.; Mattioda, A. L.

    2008-12-01

    We will present concepts for downhole instrumentation that could be used in the Deep Underground Science and Engineering Laboratory (DUSEL). We envision optical borehole-logging instruments that could monitor bacterial concentration, mineralogy, aromatic organics, temperature and oxygen concentration, allowing for the in situ monitoring of time-dependent microbial and short-scale geologic processes and provide valuable in situ data on stratigraphy to supplement core analyses, especially where instances of missing or damaged core sections make such studies difficult. Incorporated into these instruments will be a sampling/inoculation tool to allow for the recovery and/or manipulation of particularly interesting sections of the borehole wall for further study, enabling a series of microbiological studies. The borehole tools we will develop revolve around key emerging technologies and methods, some of which are briefly described below: 1) Autofluorescence Spectroscopy: Building on past instruments, we will develop a new borehole logger that searches for microbial life and organics using fluorescence spectroscopy. Many important organic compounds (e.g. PAHs) and biomolecules (e.g. aromatic amino acids, proteins, methanogenic coenzymes) fluoresce when excited with ultraviolet and visible light. Through the careful selection of excitation wavelength(s) and temporal gating parameters, a borehole logging instrument can detect and differentiate between these different compounds and the mineral matrix in which they exist. 2) Raman Spectroscopy: Though less sensitive than fluorescence spectroscopy, Raman spectroscopy is more definitive: it can provide important mineral phase distribution/proportions and other chemical data enabling studies of mineralogy and microbe-mineral interactions (when combined with fluorescence). 
3) Borehole Camera: Imaging of the borehole wall, with extended information in the UV, visible, and NIR for a more informative view, can provide considerable insight into in situ processes. 4) Temperature and Oxygen Sensors: The ambient temperature will be recorded, as well as the presence of oxygen. Oxygen presence can be measured using a fluorescence-quenching fiber optic probe to avoid interference from other gases. We foresee that this technology will enable experiments including studies of gene transfer, microbial habitat, in situ stratigraphy and hydrological processes. In addition, though designed to scan borehole walls, GOLD could be used to scan core samples as they are recovered, for rapid quantification and analysis, in order to identify samples of particular interest that could then be prioritized for more in-depth, traditional analysis.

  2. Active Wireline Heave Compensation for Ocean Drilling

    NASA Astrophysics Data System (ADS)

    Goldberg, D.; Liu, T.; Swain, K.; Furman, C.; Iturrino, G. J.

    2014-12-01

    The up-and-down heave motion of a ship causes a similar motion on any instruments tethered on wireline cable below it. If the amplitude of this motion is greater than a few tens of cm, a significant discrepancy in the depth below the ship is introduced, causing uncertainty in the acquired data. Large and irregular cable motions also increase the risk of damaging tethered instruments, particularly those with relatively delicate sensors. In 2005, Schlumberger and Deep Down, Inc. built an active wireline heave compensator (AHC) system for use onboard the JOIDES Resolution to compensate for heave motion on wireline logging tools deployed in scientific drill holes. The goals for the new AHC system were to (1) design a reliable heave compensation system; and (2) devise a robust and quantitative methodology for routine assessment of compensation efficiency (CE) during wireline operations. Software programs were developed to monitor CE and the dynamics of logging tools in real-time, including system performance under variable parameters such as water depth, sea state, cable length, logging speed and direction. We present the CE results from the AHC system on the JOIDES Resolution during a 5-year period of recent IODP operations and compare the results to those from previous compensation systems deployed during ODP and IODP. Based on new data under heave conditions of ±0.2-2.0 m and water depths of 300-4,800 m in open holes, the system reduces 65-80% of downhole tool displacement under stationary conditions and 50-60% during normal logging operations. Moreover, down/up tool motion at low speeds (300-600 m/h) reduces the system's CE values by 15-20%, and logging down at higher speeds (1,000-1,200 m/h) reduces CE values by 55-65%. Furthermore, the system yields slightly lower CE values of 40-50% without tension feedback of the downhole cable while logging.
These results indicate that the new system's compensation efficiency is comparable to or better than previous systems, with additional advantages that include upgradable compensation control software and the capability for continued assessment under varying environmental conditions. Future integration of downhole cable dynamics as an input feedback could further improve CE during logging operations.
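A compensation-efficiency figure of the kind quoted above might be computed from displacement records roughly as follows; an RMS-based definition is assumed here for illustration, since the abstract does not give the authors' exact formula.

```python
def rms(samples):
    """Root-mean-square of a displacement record."""
    return (sum(x * x for x in samples) / len(samples)) ** 0.5

def compensation_efficiency(heave, residual):
    """Assumed definition: CE (%) = 100 * (1 - rms(residual tool
    displacement) / rms(ship heave))."""
    return 100.0 * (1.0 - rms(residual) / rms(heave))

# Ship heaves +/- 1 m; the compensated tool still moves +/- 0.3 m
ce = compensation_efficiency([1.0, -1.0, 1.0, -1.0], [0.3, -0.3, 0.3, -0.3])
```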

  3. Visitor's Computer Guidelines | CTIO

    Science.gov Websites

    CTIO web page providing computer and network connection guidelines for visiting astronomers.

  4. The application of PGNAA borehole logging for copper grade estimation at Chuquicamata mine.

    PubMed

    Charbucinski, J; Duran, O; Freraut, R; Heresi, N; Pineyro, I

    2004-05-01

    The field trials of a prompt gamma neutron activation analysis (PGNAA) spectrometric logging method and instrumentation (SIROLOG) for copper grade estimation in production holes of a porphyry-type copper ore mine, Chuquicamata in Chile, are described. Examples of data analysis, calibration procedures and copper grade profiles are provided. The field tests have proved the suitability of the PGNAA logging system for in situ quality control of copper ore.

  5. Measuring colour rivalry suppression in amblyopia.

    PubMed

    Hofeldt, T S; Hofeldt, A J

    1999-11-01

    To determine if colour rivalry suppression is an index of the visual impairment in amblyopia and if the stereopsis and fusion evaluator (SAFE) instrument is a reliable indicator of the difference in visual input from the two eyes. To test the accuracy of the SAFE instrument for measuring the visual input from the two eyes, colour rivalry suppression was measured in six normal subjects. A test neutral density filter (NDF) was placed before one eye to induce a temporary relative afferent defect and the subject selected the NDF before the fellow eye to neutralise the test NDF. In a non-paediatric private practice, 24 consecutive patients diagnosed with unilateral amblyopia were tested with the SAFE. Of the 24 amblyopes, 14 qualified for the study because they were able to fuse images and had no comorbid disease. The relation between depth of colour rivalry suppression, stereoacuity, and interocular difference in logMAR acuity was analysed. In normal subjects, the SAFE instrument reversed temporary defects of 0.3 to 1.8 log units to within 0.6 log units. In amblyopes, the NDF to reverse colour rivalry suppression was positively related to interocular difference in logMAR acuity (beta=1.21, p<0.0001), and negatively related to stereoacuity (beta=-0.16, p=0.019). The interocular difference in logMAR acuity was negatively related to stereoacuity (beta=-0.13, p=0.009). Colour rivalry suppression as measured with the SAFE was found to agree closely with the degree of visual acuity impairment in non-paediatric patients with amblyopia.

  6. Responses of experimental river corridors to engineered log jams

    USDA-ARS?s Scientific Manuscript database

    Physical models of the Big Sioux River, SD, were constructed to assess the impact on flow, drag, and bed erosion and deposition in response to the installation of two different types of engineered log jams (ELJs). A fixed-bed model focused on flow velocity and forces acting on an instrumented ELJ, a...

  7. Controlled Vocabularies and Ontologies for Oceanographic Data: The R2R Eventlogger Project

    NASA Astrophysics Data System (ADS)

    Coburn, E.; Maffei, A. R.; Chandler, C. L.; Raymond, L. M.

    2012-12-01

    Research vessels coordinated by the United States University-National Oceanographic Laboratory System (US-UNOLS) collect data that are an important oceanographic resource. The NSF-funded Rolling Deck to Repository (R2R) project aims to improve access to these data and diminish the barriers to their use. One aspect of the R2R project has been to develop a shipboard scientific event logging system, Eventlogger, that incorporates best-practice guidelines, controlled vocabularies, a cruise metadata schema, and a scientific event log. This will facilitate the eventual ingestion of datasets into oceanographic data repositories for subsequent integration and synthesis by investigators. One important aspect of this system is the careful use of controlled vocabularies and ontologies. Existing ontologies will be used where available, and others will be developed. The use of internationally informed, consensus-driven controlled vocabularies will make datasets more interoperable and discoverable. The R2R Eventlogger project is led by Woods Hole Oceanographic Institution (WHOI), and the management of the controlled vocabularies and the mapping of these vocabularies to authoritative community vocabularies are led by the Data Librarian in the Marine Biological Laboratory/Woods Hole Oceanographic Institution (MBLWHOI) Library. The first target vocabulary is oceanographic instruments. Management of this vocabulary has thus far consisted of reconciling local community terms with the more widely used SeaDataNet Device Vocabulary terms. Rather than simply adopting the existing terms, data managers at the NSF-funded Biological and Chemical Oceanographic Data Management Office (BCO-DMO) map the local terms to them, since the local terms are supplied by investigators and often carry important information and meaning.
New terms (often custom or modified instruments) are submitted for review to the SeaDataNet community listserv for discussion and eventual incorporation into the Device Vocabulary. These vocabularies and their mappings are an important part of the Eventlogger system. Before a research cruise, investigators configure the instruments they intend to use for science activities. The instruments available for selection are pulled directly from the instrument vocabulary. The promotion and use of controlled vocabularies and ontologies will pave the way for linked data. Mapping local terms to agreed-upon authoritative terms creates links through which related datasets can be discovered and utilized. The Library is a natural home for the management of standards. Librarians have an established history of working with controlled vocabularies and metadata, and libraries serve as centers for information discovery. Eventlogger is currently being tested across the UNOLS fleet. A large submission of suggested instrument terms to the SeaDataNet community listserv is in progress. References: Maffei, Andrew R., Cynthia L. Chandler, Janet Fredericks, Nan Galbraith, Laura Stolp. Rolling Deck to Repository (R2R): A Controlled Vocabulary and Ontology Development Effort for Oceanographic Research Cruise Event Logging. EGU2011-12341. Poster presented at the 2011 EGU Meeting.
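The term-mapping workflow can be sketched as a simple lookup with a review flag for unmapped terms. The mappings below are hypothetical examples for illustration, not actual SeaDataNet Device Vocabulary entries.

```python
# Hypothetical local-to-authoritative term mappings, for illustration only
LOCAL_TO_AUTHORITATIVE = {
    "ctd": "CTD profiler",
    "adcp": "acoustic Doppler current profiler",
}

def map_term(local_term, vocabulary=LOCAL_TO_AUTHORITATIVE):
    """Return (term, needs_review).

    Known local terms map to the authoritative term; unknown ones are kept
    as-is and flagged for submission to the community vocabulary list.
    """
    key = local_term.strip().lower()
    if key in vocabulary:
        return vocabulary[key], False
    return local_term, True
```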

  8. BORE II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bore II, co-developed by Berkeley Lab researchers Frank Hale, Chin-Fu Tsang, and Christine Doughty, provides vital information for solving water quality and supply problems and for improving remediation of contaminated sites. Termed "hydrophysical logging," this technology is based on the concept of measuring repeated depth profiles of fluid electric conductivity in a borehole that is pumping. As fluid enters the wellbore, its distinct electric conductivity causes peaks in the conductivity log that grow and migrate upward with time. Analysis of the evolution of the peaks enables characterization of groundwater flow distribution more quickly, more cost effectively, and with higher resolution than ever before. Combining the unique interpretation software Bore II with advanced downhole instrumentation (the hydrophysical logging tool), the method quantifies inflow and outflow locations, their associated flow rates, and the basic water quality parameters of the associated formation waters (e.g., pH, oxidation-reduction potential, temperature). In addition, when applied in conjunction with downhole fluid sampling, Bore II makes possible a complete assessment of contaminant concentration within groundwater.
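The first signal-processing step, locating inflow peaks in a single conductivity-depth profile, can be sketched as a local-maximum search. This is a toy version under assumed data shapes; Bore II's actual inversion of peak evolution into flow rates is considerably more involved.

```python
def find_inflow_peaks(conductivity, threshold):
    """Indices of local maxima above `threshold` in one depth profile of
    fluid electric conductivity; each candidate peak marks a possible
    inflow point in the pumping wellbore."""
    peaks = []
    for i in range(1, len(conductivity) - 1):
        if (conductivity[i] > threshold
                and conductivity[i] >= conductivity[i - 1]
                and conductivity[i] > conductivity[i + 1]):
            peaks.append(i)
    return peaks

# Profile sampled top to bottom; two inflow zones stand out above baseline
profile = [1.0, 1.2, 5.0, 1.1, 1.0, 7.5, 2.0, 1.0]
inflows = find_inflow_peaks(profile, threshold=3.0)
```

Tracking how these peak indices grow and migrate upward across repeated profiles is what the method then interprets as flow distribution.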

  9. DIY Soundcard Based Temperature Logging System. Part I: Design

    ERIC Educational Resources Information Center

    Nunn, John

    2016-01-01

    This paper aims to enable schools to make their own low-cost temperature logging instrument and to learn something about its calibration in the process. This paper describes how a thermistor can be integrated into a simple potential divider circuit which is powered with the sound output of a computer and monitored by the microphone input. The…
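The thermistor-in-a-divider measurement can be sketched numerically as below. The 10 kΩ fixed resistor, 10 kΩ at 25 °C thermistor, and Beta = 3950 K are typical hobbyist values assumed for illustration, not the components in the paper's circuit.

```python
import math

def thermistor_temperature_c(v_ratio, r_fixed=10_000.0,
                             r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Temperature from the divider ratio v_out / v_in.

    Assumed wiring: v_out / v_in = R_therm / (R_therm + R_fixed).
    Beta model:     1/T = 1/T0 + ln(R_therm / R0) / beta   (T in kelvin).
    """
    r_therm = r_fixed * v_ratio / (1.0 - v_ratio)
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_therm / r0) / beta
    return 1.0 / inv_t - 273.15

# At the thermistor's reference temperature the divider sits at half input
temp_at_half = thermistor_temperature_c(0.5)
```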

  10. Math Talk and Representations in Elementary Classrooms of Beginning Teachers: A MLM Exploratory Analysis

    ERIC Educational Resources Information Center

    Alnizami, Reema

    2017-01-01

    This study examined the math talk and the use of multiple representations in elementary classrooms of 134 beginning teachers, all in their second year of teaching. A quantitative correlational research design was employed to investigate the research questions. The data were collected using a log instrument, the Instructional Practices Log in…

  11. Instruments and methods acoustic televiewer logging in glacier boreholes

    USGS Publications Warehouse

    Morin, R.H.; Descamps, G.E.; Cecil, L.D.

    2000-01-01

    The acoustic televiewer is a geophysical logging instrument that is deployed in a water-filled borehole and operated while trolling. It generates a digital, magnetically oriented image of the borehole wall that is developed from the amplitudes and transit times of acoustic waves emitted from the tool and reflected at the water-wall interface. The transit-time data are also converted to radial distances, from which cross-sectional views of the borehole shape can be constructed. Because the televiewer is equipped with both a three-component magnetometer and a two-component inclinometer, the borehole's trajectory in space is continuously recorded as well. This instrument is routinely used in mining and hydrogeologic applications, but in this investigation it was deployed in two boreholes drilled into Upper Fremont Glacier, Wyoming, U.S.A. The acoustic images recorded in this glacial setting are not as clear as those typically obtained in rocks, due to a lower reflection coefficient for water and ice than for water and rock. Results indicate that the depth and orientation of features intersecting the boreholes can be determined, but that interpreting their physical nature is problematic and requires corroborating information from inspection of cores. Nevertheless, these data can provide some insight into englacial structural characteristics. Additional information derived from the cross-sectional geometry of the borehole, as well as from its trajectory, may also be useful in studies concerned with stress patterns and deformation processes.
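The transit-time-to-radius conversion described above amounts to a one-line calculation. The 1480 m/s figure is a nominal speed of sound in water assumed here for illustration; the instrument's calibration and the borehole fluid temperature would supply the actual value.

```python
def radial_distance_mm(two_way_transit_us, sound_speed_m_s=1480.0):
    """Convert a two-way acoustic transit time (microseconds) into the
    radial distance (mm) from the tool to the borehole wall."""
    one_way_s = (two_way_transit_us * 1e-6) / 2.0
    return sound_speed_m_s * one_way_s * 1000.0

def cross_section_mm(transit_times_us):
    """Radial distances around one azimuthal scan, from which the
    borehole's cross-sectional shape can be reconstructed."""
    return [radial_distance_mm(t) for t in transit_times_us]

# A 100-microsecond echo corresponds to 74 mm of water between tool and wall
radius = radial_distance_mm(100.0)
```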

  12. Atmospheric stellar parameters from cross-correlation functions

    NASA Astrophysics Data System (ADS)

    Malavolta, L.; Lovis, C.; Pepe, F.; Sneden, C.; Udry, S.

    2017-08-01

    The increasing number of spectra gathered by spectroscopic sky surveys and transiting exoplanet follow-up has pushed the community to develop automated tools for atmospheric stellar parameter determination. Here we present a novel approach that allows the measurement of temperature (Teff), metallicity ([Fe/H]) and gravity (log g) within a few seconds and in a completely automated fashion. Rather than performing comparisons with spectral libraries, our technique is based on the determination of several cross-correlation functions (CCFs) obtained by including spectral features with different sensitivity to the photospheric parameters. We use literature stellar parameters of high signal-to-noise (SNR), high-resolution HARPS spectra of FGK main-sequence stars to calibrate Teff, [Fe/H] and log g as a function of CCF parameters. Our technique is validated using low-SNR spectra obtained with the same instrument. For FGK stars we achieve a precision of σ(Teff) = 50 K, σ(log g) = 0.09 dex and σ([Fe/H]) = 0.035 dex at SNR = 50, while the precision for observations with SNR ≳ 100 and the overall accuracy are constrained by the literature values used to calibrate the CCFs. Our approach can easily be extended to other instruments with similar spectral range and resolution, or to other spectral ranges and to stars other than FGK dwarfs, if a large sample of reference stars is available for the calibration. Additionally, we provide the mathematical formulation to convert synthetic equivalent widths to CCF parameters as an alternative to direct calibration. We have made our tool publicly available.
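    The calibration idea, regressing literature parameters on CCF summary quantities, can be sketched with ordinary least squares. The choice of CCF quantities, the linear functional form, and every number below are illustrative assumptions, not the paper's actual calibration:

```python
import numpy as np

# Sketch: fit literature stellar parameters as a function of CCF summary
# quantities (here, a hypothetical CCF area and FWHM) by least squares.
# The "literature" Teff values are simulated from a known linear law plus
# noise, so we can check that the fit recovers the coefficients.

rng = np.random.default_rng(0)
n_stars = 200
ccf_area = rng.uniform(0.5, 2.0, n_stars)   # hypothetical CCF areas
ccf_fwhm = rng.uniform(5.0, 12.0, n_stars)  # hypothetical FWHM, km/s

teff = 4500 + 800 * ccf_area + 60 * ccf_fwhm + rng.normal(0, 25, n_stars)

X = np.column_stack([np.ones(n_stars), ccf_area, ccf_fwhm])
coef, *_ = np.linalg.lstsq(X, teff, rcond=None)

# Predict Teff for a new star directly from its CCF quantities.
predict = lambda area, fwhm: coef @ [1.0, area, fwhm]
```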

  13. Effect of logging on subsurface pipeflow and erosion: coastal northern California, USA

    Treesearch

    R. R. Ziemer

    1992-01-01

    Abstract - Three zero-order swales, each with a contributing drainage area of about 1 ha, were instrumented to measure pipeflows within the Caspar Creek Experimental Watershed in northwestern California, USA. After two winters of data collection, the second-growth forest on two of the swales was clearcut logged. The third swale remained as an uncut control. After...

  14. Web-Enabled Optoelectronic Particle-Fallout Monitor

    NASA Technical Reports Server (NTRS)

    Lineberger, Lewis P.

    2008-01-01

    A Web-enabled optoelectronic particle-fallout monitor has been developed as a prototype of future such instruments that (1) would be installed in multiple locations for which assurance of cleanliness is required and (2) could be interrogated and controlled in nearly real time by multiple remote users. Like prior particle-fallout monitors, this instrument provides a measure of particles that accumulate on a surface as an indication of the quantity of airborne particulate contaminants. The design of this instrument reflects requirements to: reduce the cost and complexity of its optoelectronic sensory subsystem relative to those of prior optoelectronic particle-fallout monitors while maintaining or improving capabilities; use existing network and office computers for distributed display and control; derive electric power for the instrument from a computer network, a wall outlet, or a battery; provide for Web-based retrieval and analysis of measurement data and of a file containing such ancillary data as a log of command attempts at remote units; and use the User Datagram Protocol (UDP) for maximum performance and minimal network overhead.

  15. Measuring students' self-regulated learning in professional education: bridging the gap between event and aptitude measurements.

    PubMed

    Endedijk, Maaike D; Brekelmans, Mieke; Sleegers, Peter; Vermunt, Jan D

    Self-regulated learning (SRL) has benefits for students' academic performance in school, but also for expertise development during their professional career. This study examined the validity of an instrument to measure student teachers' regulation of their learning to teach across multiple and different kinds of learning events in the context of a postgraduate professional teacher education programme. Based on an analysis of the literature, we developed a log with structured questions that could be used as a multiple-event instrument to determine the quality of student teachers' regulation of learning by combining data from multiple learning experiences. The findings showed that this structured version of the instrument measured student teachers' regulation of their learning in a valid and reliable way. Furthermore, with the aid of the Structured Learning Report, individual differences in student teachers' regulation of learning could be discerned. Together the findings indicate that a multiple-event instrument can be used to measure regulation of learning in multiple contexts for various learning experiences at the same time, without the necessity of relying on students' ability to rate themselves across all these different experiences. In this way, this instrument can make an important contribution to bridging the gap between the two dominant approaches to measuring SRL: the traditional aptitude and event measurement approaches.

  16. Planetary Geochemistry Techniques: Probing In-Situ with Neutron and Gamma Rays (PING) Instrument

    NASA Technical Reports Server (NTRS)

    Parsons, A.; Bodnarik, J.; Burger, D.; Evans, L.; Floyd, S.; Lin, L.; McClanahan, T.; Nankung, M.; Nowicki, S.; Schweitzer, J.

    2011-01-01

    The Probing In situ with Neutrons and Gamma rays (PING) instrument is a promising planetary science application of the active neutron-gamma ray technology so successfully used in oil field well logging and mineral exploration on Earth. The objective of our technology development program at NASA Goddard Space Flight Center's (NASA/GSFC) Astrochemistry Laboratory is to extend the application of neutron interrogation techniques to landed in situ planetary composition measurements by using a 14 MeV Pulsed Neutron Generator (PNG) combined with neutron and gamma ray detectors, to probe the surface and subsurface of planetary bodies without the need to drill. We are thus working to bring the PING instrument to the point where it can be flown on a variety of surface lander or rover missions to the Moon, Mars, Venus, asteroids, comets and the satellites of the outer planets.

  17. Nuclear Tools For Oilfield Logging-While-Drilling Applications

    NASA Astrophysics Data System (ADS)

    Reijonen, Jani

    2011-06-01

    Schlumberger is an international oilfield service company with nearly 80,000 employees of 140 nationalities, operating globally in 80 countries. As a market leader in oilfield services, Schlumberger has developed a suite of technologies to assess the downhole environment, including, among others, electromagnetic, seismic, chemical, and nuclear measurements. In the past 10 years there has been a radical shift in the oilfield service industry from traditional wireline measurements to logging-while-drilling (LWD) analysis. For LWD measurements, the analysis is performed and the instruments are operated while the borehole is being drilled. The high temperature, high shock, and extreme vibration environment of LWD imposes stringent requirements for the devices used in these applications. This has a significant impact on the design of the components and subcomponents of a downhole tool. Another significant change in the past few years for nuclear-based oilwell logging tools is the desire to replace the sealed radioisotope sources with active, electronic ones. These active radiation sources provide great benefits compared to the isotopic sources, ranging from handling and safety to nonproliferation and well contamination issues. The challenge is to develop electronic generators that have a high degree of reliability for the entire lifetime of a downhole tool. LWD tool testing and operations are highlighted with particular emphasis on electronic radiation sources and nuclear detectors for the downhole environment.

  18. Simulated-use validation of a sponge ATP method for determining the adequacy of manual cleaning of endoscope channels.

    PubMed

    Alfa, Michelle J; Olson, Nancy

    2016-05-04

    The objective of this study was to validate the relative light unit (RLU) cut-off for adequate cleaning of flexible colonoscopes for an ATP (adenosine triphosphate) test kit that used a sponge channel collection method. This was a simulated-use study. The instrument channel segment of a flexible colonoscope was soiled with ATS (artificial test soil) containing approximately 8 Log10 Enterococcus faecalis and Pseudomonas aeruginosa/mL. Full cleaning, partial cleaning and no cleaning were evaluated for ATP, protein and bacterial residuals. Channel samples were collected using a sponge device to assess residual RLUs. Parallel colonoscopes inoculated and cleaned in the same manner were sampled using the flush method to quantitatively assess protein and bacterial residuals. The protein and viable count benchmarks for adequate cleaning were <6.4 µg/cm² and <4 Log10 cfu/cm². The negative controls for the instrument channel remained low over the course of the study, with on average 14 RLUs, 0.04 µg/cm² protein and 0.025 Log10 cfu/cm². Partial cleaning resulted in an average of 6601 RLUs, 3.99 µg/cm², 5.25 Log10 cfu/cm² E. faecalis and 4.48 Log10 cfu/cm² P. aeruginosa. After full cleaning, the average RLU was 29 (range 7-71 RLUs) and the average protein, E. faecalis and P. aeruginosa residuals were 0.23 µg/cm², 0.79 and 1.61 Log10 cfu/cm², respectively. The validated cut-off for acceptable manual cleaning was set at ≤100 RLUs for the sponge-collected channel ATP test kit.

  19. Seismic velocity deviation log: An effective method for evaluating spatial distribution of reservoir pore types

    NASA Astrophysics Data System (ADS)

    Shirmohamadi, Mohamad; Kadkhodaie, Ali; Rahimpour-Bonab, Hossain; Faraji, Mohammad Ali

    2017-04-01

    Velocity deviation log (VDL) is a synthetic log used to determine pore types in reservoir rocks, based on a combination of the sonic log with neutron-density logs. The current study proposes a two-step approach to create a map of porosity and pore types by integrating the results of petrographic studies, well logs and seismic data. In the first step, the velocity deviation log was created from the combination of the sonic log with the neutron-density log. The results allowed identifying negative, zero and positive deviations based on the created synthetic velocity log. Negative velocity deviations (below -500 m/s) indicate connected or interconnected pores and fractures, while positive deviations (above +500 m/s) are related to isolated pores. Zero deviations in the range of [-500 m/s, +500 m/s] are in good agreement with intercrystalline and microporosities. The results of petrographic studies were used to validate the main pore type derived from the velocity deviation log. In the next step, the velocity deviation log was estimated from seismic data by using a probabilistic neural network model. For this purpose, the inverted acoustic impedance along with amplitude-based seismic attributes were formulated to VDL. The methodology is illustrated by performing a case study from the Hendijan oilfield, northwestern Persian Gulf. The results of this study show that the integration of petrographic studies, well logs and seismic attributes is an instrumental way of understanding the spatial distribution of the main reservoir pore types.
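    The pore-type interpretation from the deviation thresholds can be sketched directly. A minimal sketch using the ±500 m/s cut-offs quoted above; the synthetic-velocity computation from the neutron-density log is omitted, and the sample deviations are hypothetical:

```python
# Velocity deviation log (VDL) sketch: VDL is the measured sonic velocity
# minus a synthetic velocity derived from the neutron-density porosity.
# Only the pore-type classification step is shown here, using the paper's
# +/-500 m/s cut-offs; the input deviations are hypothetical.

def classify_pore_type(vdl_ms: float) -> str:
    """Classify a velocity deviation (m/s) into the three VDL zones."""
    if vdl_ms < -500.0:
        return "negative: connected/interconnected pores or fractures"
    if vdl_ms > 500.0:
        return "positive: isolated pores"
    return "zero: intercrystalline pores / microporosity"

deviations = [-1200.0, 150.0, 900.0]  # hypothetical VDL samples
labels = [classify_pore_type(v) for v in deviations]
```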

  20. Sample Analysis at Mars Instrument Simulator

    NASA Technical Reports Server (NTRS)

    Benna, Mehdi; Nolan, Tom

    2013-01-01

    The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to planning and validating operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. SAMSIM was developed using Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument's electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of multi-purpose, full-scale numerical modeling of a flight instrument, with the purpose of supplementing, or even entirely eliminating, the need for a hardware engineering model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that take many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical modules. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs to this synthetic C&DH are mapped to virtual sensors and command lines that mimic in their structure and connectivity the layout of the instrument harnesses.
This module executes, and thus validates, complex command scripts prior to their up-linking to the SAM instrument. As an output, this module generates synthetic data and message logs at a rate that is similar to the actual instrument.

  1. Deep installations of monitoring instrumentation in unsaturated welded tuff

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tyler, S.

    1985-12-31

    The major goal of this research is to develop low cost techniques to measure matric potential and moisture content, and to sample liquid and vapor for chemical analysis, in the deep unsaturated zones of the arid areas of Nevada. This work has been prompted by the high level waste repository proposed in the unsaturated zone of Yucca Mountain. The work presented focuses on two deep (250 meter) boreholes planned for completion at the southern end of Yucca Mountain in fractured tuff. One borehole will be drilled without water and cased to slightly below the zone of saturation in order to measure the depth to saturation and to collect water samples for analysis. This hole will also be used for routine quarterly neutron logging. Between loggings, vapor liquid water samplers will be suspended in the borehole and packed off at selective screened intervals to collect water vapor for isotopic analysis. The second borehole will be drilled to slightly above the water table and serve as a multiple interval psychrometer installation. Thermocouple psychrometers will be placed in isolated screened intervals within the casing. These boreholes will be used for instrument testing, interference and permeability testing, and to monitor short term fluctuations of soil and rock moisture due to precipitation and recharge.

  2. Development of procedures for calculating stiffness and damping of elastomers in engineering applications. Part 5: Elastomer performance limits and the design and test of an elastomer damper

    NASA Technical Reports Server (NTRS)

    Tecza, J. A.; Darlow, M. S.; Smalley, A. J.

    1979-01-01

    Tests were performed on elastomer specimens of the material polybutadiene to determine the performance limitations imposed by strain, temperature, and frequency. Three specimens were tested: a shear specimen, a compression specimen, and a second compression specimen in which thermocouples were embedded in the elastomer buttons. Stiffness and damping were determined from all tests, and internal temperatures were recorded for the instrumented compression specimen. Measured results are presented together with comparisons between predictions of a thermo-viscoelastic analysis and the measured results. Dampers of polybutadiene and Viton were designed, built, and tested. Vibration measurements were made and sensitivity of vibration to change in unbalance was also determined. Values for log decrement were extracted from the synchronous response curves. Comparisons were made between measured sensitivity to unbalance and log decrement and predicted values for these quantities.

  3. Results of investigation at the Miravalles Geothermal Field, Costa Rica: Part 1, Well logging. Resultados de las investigaciones en el campo geotermico de Miravalles, Costa Rica: Parte 1, Registros de pozos (in EN;SP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis, B.R.; Lawton, R.G.; Kolar, J.D.

    The well-logging operations performed in the Miravalles Geothermal Field in Costa Rica were conducted during two separate field trips. The Phase I program provided the deployment of a suite of high-temperature borehole instruments, including the temperature/rabbit, fluid sampler, and three-arm caliper in Well PGM-3. These same tools were deployed in Well PGM-10 along with an additional survey run with a combination fluid velocity/temperature/pressure instrument used to measure thermodynamic properties under flowing well conditions. The Phase II program complemented Phase I with the suite of tools deployed in Wells PGM-5, PGM-11, and PGM-12. 4 refs., 25 figs., 1 tab.

  4. Active Neutron and Gamma-Ray Instrumentation for In Situ Planetary Science Applications

    NASA Technical Reports Server (NTRS)

    Parsons, A.; Bodnarik, J.; Evans, L.; Floyd, A.; Lim, L.; McClanahan, T.; Namkung, M.; Nowicki, S.; Schweitzer, J.; Starr, R.

    2011-01-01

    We describe the development of an instrument capable of detailed in situ bulk geochemical analysis of the surface of planets, moons, asteroids, and comets. This instrument technology uses a pulsed neutron generator to excite the solid materials of a planet and measures the resulting neutron and gamma-ray emission with its detector system. These time-resolved neutron and gamma-ray data provide detailed information about the bulk elemental composition, chemical context, and density distribution of the soil within 50 cm of the surface. While active neutron scattering and neutron-induced gamma-ray techniques have been used extensively for terrestrial nuclear well logging applications, our goal is to apply these techniques to surface instruments for use on any solid solar system body. Experiments at NASA Goddard Space Flight Center use a prototype neutron-induced gamma-ray instrument, and the resulting data presented here show the promise of this technique for becoming a versatile, robust workhorse technology for planetary science and exploration of any of the solid bodies in the solar system. The detection of neutrons at the surface also provides useful information about the material. This paper focuses on the data provided by the gamma-ray detector.

  5. Tree rings and rainfall in the equatorial Amazon

    NASA Astrophysics Data System (ADS)

    Granato-Souza, Daniela; Stahle, David W.; Barbosa, Ana Carolina; Feng, Song; Torbenson, Max C. A.; de Assis Pereira, Gabriel; Schöngart, Jochen; Barbosa, Joao Paulo; Griffin, Daniel

    2018-05-01

    The Amazon basin is a global center of hydroclimatic variability and biodiversity, but there are only eight instrumental rainfall stations with continuous records longer than 80 years in the entire basin, an area nearly the size of the coterminous US. The first long moisture-sensitive tree-ring chronology has been developed in the eastern equatorial Amazon of Brazil based on dendrochronological analysis of Cedrela cross sections cut during sustainable logging operations near the Rio Paru. The Rio Paru chronology dates from 1786 to 2016 and is significantly correlated with instrumental precipitation observations from 1939 to 2016. The strength and spatial scale of the precipitation signal vary during the instrumental period, but the Rio Paru chronology has been used to develop a preliminary reconstruction of February to November rainfall totals from 1786 to 2016. The reconstruction is related to SSTs in the Atlantic and especially the tropical Pacific, similar to the stronger pattern of association computed for the instrumental rainfall data from the eastern Amazon. The tree-ring data estimate extended drought and wet episodes in the mid- to late-nineteenth century, providing a valuable, long-term perspective on the moisture changes expected to emerge over the Amazon in the coming century due to deforestation and anthropogenic climate change.

  6. Finding Faults: Tohoku and other Active Megathrusts/Megasplays

    NASA Astrophysics Data System (ADS)

    Moore, J. C.; Conin, M.; Cook, B. J.; Kirkpatrick, J. D.; Remitti, F.; Chester, F.; Nakamura, Y.; Lin, W.; Saito, S.; Scientific Team, E.

    2012-12-01

    Current subduction-fault drilling procedure is to drill a logging hole, identify target faults, then core and instrument them. Seismic data may constrain faults, but the additional resolution of borehole logs is necessary for efficient coring and instrumentation under difficult conditions and tight schedules. Thus, refining the methodology of identifying faults in logging data has become important, and comparison of log signatures of faults in different locations is worthwhile. At the C0019 (JFAST) drill site, the Tohoku megathrust was principally identified as a decollement where steep cylindrically-folded bedding abruptly flattens below the basal detachment. A similar structural contrast occurs across a megasplay fault in the NanTroSEIZE transect (Site C0004). At the Tohoku decollement, a high gamma-ray value from a pelagic clay layer, predicted as a likely decollement sediment type, strengthens the megathrust interpretation. The original identification of the pelagic clay as a decollement candidate was based on results of previous coring of an oceanic reference site. Negative density anomalies, often seen as low resistivity zones, identified a subsidiary fault in the deformed prism overlying the Tohoku megathrust. Elsewhere, at Barbados, Nankai (Moroto), and Costa Rica, negative density anomalies are associated with the decollement and other faults in hanging walls. Log-based density anomalies in fault zones provide a basis for recognizing in-situ fault zone dilation. At the Tohoku Site C0019, breakouts are present above but not below the megathrust. Changes in breakout orientation and width (stress magnitude) occur across megasplay faults at Sites C0004 and C0010 in the NanTroSEIZE transect. Annular pressure anomalies are not apparent at the Tohoku megathrust, but are variably associated with faults and fracture zones drilled along the NanTroSEIZE transect.
Overall, images of changes in structural features, negative density anomalies, and changes in breakout occurrence and orientation provide the most common log criteria for recognizing major thrust zones in ocean drilling holes at convergent margins. In the case of JFAST, identification of faults by logging was confirmed during subsequent coring activities, and logging data was critical for successful placement of the observatory down hole.

  7. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    PubMed Central

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, considerable work has recently been developed for consistent estimation of causal relative risks and causal odds ratios. Such models can sometimes suffer from identification issues for weak instruments, which has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provides valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
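    For contrast with the proposed concordance test, the well-known two-stage least squares estimator mentioned above can be sketched on simulated data. Everything below (the instrument, the confounder, the effect sizes) is a constructed toy example, not the paper's data or method:

```python
import numpy as np

# Minimal two-stage least squares (2SLS) sketch: Z is a simulated genetic
# instrument, U an unmeasured confounder, X the exposure, and Y a
# continuous outcome with true causal effect beta = 0.5. Because Z is
# independent of U, regressing Y on the fitted exposure removes the
# confounding that would bias a naive regression of Y on X.

rng = np.random.default_rng(1)
n = 20000
Z = rng.binomial(2, 0.3, n).astype(float)  # hypothetical SNP dosage
U = rng.normal(size=n)                      # unmeasured confounder
X = 0.8 * Z + U + rng.normal(size=n)
Y = 0.5 * X + U + rng.normal(size=n)

# Stage 1: regress exposure on instrument to get fitted values.
A = np.column_stack([np.ones(n), Z])
x_hat = A @ np.linalg.lstsq(A, X, rcond=None)[0]

# Stage 2: regress outcome on the fitted exposure.
B = np.column_stack([np.ones(n), x_hat])
beta_iv = np.linalg.lstsq(B, Y, rcond=None)[0][1]
```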

  8. Comparison of fine particle measurements from a direct-reading instrument and a gravimetric sampling method.

    PubMed

    Kim, Jee Young; Magari, Shannon R; Herrick, Robert F; Smith, Thomas J; Christiani, David C

    2004-11-01

    Particulate air pollution, specifically the fine particle fraction (PM2.5), has been associated with increased cardiopulmonary morbidity and mortality in general population studies. Occupational exposure to fine particulate matter can exceed ambient levels by a large factor. Due to increased interest in the health effects of particulate matter, many particle sampling methods have been developed. In this study, two such measurement methods were used simultaneously and compared. PM2.5 was sampled using a filter-based gravimetric sampling method and a direct-reading instrument, the TSI Inc. model 8520 DUSTTRAK aerosol monitor. Both sampling methods were used to determine the PM2.5 exposure in a group of boilermakers exposed to welding fumes and residual fuel oil ash. The geometric mean PM2.5 concentration was 0.30 mg/m3 (GSD 3.25) and 0.31 mg/m3 (GSD 2.90) from the DUSTTRAK and gravimetric method, respectively. The Spearman rank correlation coefficient for the gravimetric and DUSTTRAK PM2.5 concentrations was 0.68. Linear regression models indicated that log(e) DUSTTRAK PM2.5 concentrations significantly predicted log(e) gravimetric PM2.5 concentrations (p < 0.01). The association between log(e) DUSTTRAK and log(e) gravimetric PM2.5 concentrations was found to be modified by surrogate measures for seasonal variation and type of aerosol. PM2.5 measurements from the DUSTTRAK are well correlated with and highly predictive of measurements from the gravimetric sampling method for the aerosols in these work environments. However, results from this study suggest that aerosol particle characteristics may affect the relationship between the gravimetric and DUSTTRAK PM2.5 measurements. Recalibration of the DUSTTRAK for the specific aerosol, as recommended by the manufacturer, may be necessary to produce valid measures of airborne particulate matter.

  9. Field experiment provides ground truth for surface nuclear magnetic resonance measurement

    USGS Publications Warehouse

    Knight, R.; Grunewald, E.; Irons, T.; Dlubac, K.; Song, Y.; Bachman, H.N.; Grau, B.; Walsh, D.; Abraham, J.D.; Cannia, J.

    2012-01-01

    The need for sustainable management of fresh water resources is one of the great challenges of the 21st century. Since most of the planet's liquid fresh water exists as groundwater, it is essential to develop non-invasive geophysical techniques to characterize groundwater aquifers. A field experiment was conducted in the High Plains Aquifer, central United States, to explore the mechanisms governing the non-invasive Surface NMR (SNMR) technology. We acquired both SNMR data and logging NMR data at a field site, along with lithology information from drill cuttings. This allowed us to directly compare the NMR relaxation parameter measured during logging, T2, to the relaxation parameter T2* measured using the SNMR method. The latter can be affected by inhomogeneity in the magnetic field, thus obscuring the link between the NMR relaxation parameter and the hydraulic conductivity of the geologic material. When the logging T2 data were transformed to pseudo-T2* data, by accounting for inhomogeneity in the magnetic field and instrument dead time, we found good agreement with T2* obtained from the SNMR measurement. These results, combined with the additional information about lithology at the site, allowed us to delineate the physical mechanisms governing the SNMR measurement. Such understanding is a critical step in developing SNMR as a reliable geophysical method for the assessment of groundwater resources.
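    The transform from logging T2 to pseudo-T2* rests on the standard relation between the two relaxation rates. For a magnetic field inhomogeneity ΔB0 over the measurement volume, a textbook form (not necessarily the exact correction used in the paper, which also accounts for instrument dead time) is:

```latex
\frac{1}{T_2^{*}} = \frac{1}{T_2} + \frac{\gamma \, \Delta B_0}{2}
```

    where γ is the gyromagnetic ratio of the proton; the inhomogeneity term shortens the apparent relaxation time T2* relative to the intrinsic T2.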

  10. Development of a microprocessor controller for stand-alone photovoltaic power systems

    NASA Technical Reports Server (NTRS)

    Millner, A. R.; Kaufman, D. L.

    1984-01-01

    A controller for stand-alone photovoltaic systems has been developed using a low power CMOS microprocessor. It performs battery state of charge estimation, array control, load management, instrumentation, automatic testing, and communications functions. Array control options are sequential subarray switching and maximum power control. A calculator keypad and LCD display provide manual control, fault diagnosis, and digital multimeter functions. An RS-232 port provides data logging or remote control capability. A prototype 5 kW unit has been built and tested successfully. The controller is expected to be useful in village photovoltaic power systems, large solar water pumping installations, and other battery management applications.
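    The battery state-of-charge estimation named above can be done many ways; a coulomb-counting sketch is the simplest. The capacity, charge efficiency, and sample currents below are illustrative assumptions, not values from the paper:

```python
# Battery state-of-charge (SOC) estimation by coulomb counting -- one
# simple scheme a stand-alone PV controller could use. All constants and
# sample currents here are illustrative assumptions.

CAPACITY_AH = 100.0       # nominal battery capacity, amp-hours
CHARGE_EFFICIENCY = 0.9   # fraction of charge current actually stored

def update_soc(soc: float, current_a: float, dt_h: float) -> float:
    """Integrate battery current (A, positive = charging) over dt_h hours."""
    eff = CHARGE_EFFICIENCY if current_a > 0 else 1.0
    soc += eff * current_a * dt_h / CAPACITY_AH
    return min(max(soc, 0.0), 1.0)  # clamp to the physical range [0, 1]

soc = 0.50
for amps in [20.0, 20.0, -10.0]:  # two hours charging, one discharging
    soc = update_soc(soc, amps, dt_h=1.0)
```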

  11. A study on directional resistivity logging-while-drilling based on self-adaptive hp-FEM

    NASA Astrophysics Data System (ADS)

    Liu, Dejun; Li, Hui; Zhang, Yingying; Zhu, Gengxue; Ai, Qinghui

    2014-12-01

    Numerical simulation of resistivity logging-while-drilling (LWD) tool response provides guidance for designing novel logging instruments and interpreting real-time logging data. In this paper, based on a self-adaptive hp-finite element method (hp-FEM) algorithm, we analyze LWD tool response against model parameters and briefly illustrate the geosteering capabilities of directional resistivity LWD. Numerical simulation results indicate that the choice of source spacing has an obvious influence on the investigation depth and detection precision of the resistivity LWD tool, and that the choice of frequency can improve the resolution of both low-resistivity and high-resistivity formations. The simulation results also indicate that the self-adaptive hp-FEM algorithm offers good convergence speed and calculation accuracy for guiding geosteering, and that it is suitable for simulating the response of resistivity LWD tools.

  12. School-Based UV-B Monitoring Project in Support of EOS-CHEM

    NASA Technical Reports Server (NTRS)

    Brooks, David R.

    2005-01-01

    This grant is an extension of Grant NAG5-8929 (Drexel Project Number 230026), resulting from extensions necessary to meet changing science objectives as described in the final report for NAG5-8929, a copy of which is attached. The instrument configuration resulting from NAG5-8929 has remained basically intact. Cosine response measurements conducted by James Slusser's group at Fort Collins, Colorado, in support of the proposal for Aura ground validation mentioned in the final report for NAG5-8929, indicated that there was significant light leakage to the detector through the sides of the nylon housing. This was easily remedied by machining a removable opaque collar (made from the same dark grey rigid plastic plumbing tubing as the collimating tube) that fits around the detector collar. Also during this grant period, data logging procedures were established for the UV-A instrument, to record irradiance before, during, and after an Aura overflight. This is required in order to compare spatial and temporal variability as required for ground validation of data products derived from the Ozone Monitoring Instrument (OMI). Standalone 12-bit loggers from Onset Computer Corporation (the U12 series), which were not available at the start of these projects, make possible relatively inexpensive logging for these instruments at a usable resolution. The configuration and disposition of these instruments, including the final version of a GLOBE protocol for using the instruments, currently depend on action taken on the Aura ground validation proposal submitted in 2004. A copy of that proposal is attached.

  13. A framework and a measurement instrument for sustainability of work practices in long-term care

    PubMed Central

    2011-01-01

    Background In health care, many organizations are working on quality improvement and/or innovation of their care practices. Although the effectiveness of improvement processes has been studied extensively, little attention has been given to the sustainability of the changed work practices after implementation. The objective of this study is to develop a theoretical framework and measurement instrument for sustainability. To this end, sustainability is conceptualized with two dimensions: routinization and institutionalization. Methods The exploratory methodological design consisted of three phases: a) framework development; b) instrument development; and c) field testing in former improvement teams in a quality improvement program for health care (N teams = 63, N individuals = 112). Data were not collected until at least one year had passed after implementation. Underlying constructs and their interrelations were explored using Structural Equation Modeling and Principal Component Analyses. Internal consistency was computed with Cronbach's alpha coefficient. A long and a short version of the instrument are proposed. Results The χ²-difference test of the −2 log-likelihood estimates demonstrated that the hierarchical two-factor model, with routinization and institutionalization as separate constructs, showed a better fit than the one-factor model (p < .01). Secondly, construct validity of the instrument was strong, as indicated by the high factor loadings of the items. Finally, the internal consistency of the subscales was good. Conclusions The theoretical framework offers a valuable starting point for the analysis of sustainability at the level of actual changed work practices. Even though the two dimensions routinization and institutionalization are related, they are clearly distinguishable and each has distinct value in the discussion of sustainability. Finally, the subscales conformed to psychometric properties defined in the literature. The instrument can be used in the evaluation of improvement projects. PMID:22087884
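    The internal-consistency statistic used above, Cronbach's alpha, is straightforward to compute from item-level scores. A minimal sketch in Python (the item scores below are invented for illustration, not data from the study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns (one list per item)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(col) for col in items)          # sum of item variances
    totals = [sum(col[i] for col in items) for i in range(n)]  # per-respondent totals
    return k / (k - 1) * (1 - item_vars / var(totals))

# Three perfectly consistent items give alpha = 1.0.
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
print(cronbach_alpha(items))
```

    Values near 1 indicate that the subscale items measure a common construct; the study reports good internal consistency for its subscales without giving the item data here.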

  14. Air Launch Instrumented Vehicles Evaluation (ALIVE).

    DTIC Science & Technology

    1977-02-01

    The study addressed aging of two 12-inch-diameter, SRBDM-type motors cast with moderate-burning-rate propellant. (The remainder of the retrieved text preserves only figure titles: Stress Intensity Factor vs Half Crack Length; Stress Intensity Factor/Load vs Half Crack Length; Log Stress Intensity Factor vs Log Crack Tip Velocity for Strip Biaxial Specimen; Log Stress Intensity Factor Adjusted for Strain.)

  15. SBIR Phase II Final Report: Low cost Autonomous NMR and Multi-sensor Soil Monitoring Instrument

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walsh, David O.

    In this 32-month SBIR Phase 2 program, Vista Clara designed, assembled, and successfully tested four new NMR instruments for soil moisture measurement and monitoring: an enhanced-performance man-portable Dart NMR logging probe and control unit for rapid, mobile measurement in core holes and 2” PVC access wells; a prototype 4-level Dart NMR monitoring probe and prototype multi-sensor soil monitoring control unit for long-term unattended monitoring of soil moisture and other measurements in situ; a non-invasive 1 m × 1 m Discus NMR soil moisture sensor with a surface-based magnet/coil array for rapid measurement of soil moisture in the top 50 cm of the subsurface; and a non-invasive, ultra-lightweight Earth’s-field surface NMR instrument for non-invasive measurement and mapping of soil moisture in the top 3 meters of the subsurface. The Phase 2 research and development achieved most, but not all, of our technical objectives. The single-coil Dart in-situ sensor and control unit were fully developed, demonstrated, and successfully commercialized within the Phase 2 period of performance. The multi-level version of the Dart probe was designed, assembled, and demonstrated in Phase 2, but its final assembly and testing were delayed until close to the end of the Phase 2 performance period, which limited our opportunities for demonstration in field settings. Likewise, the multi-sensor version of the Dart control unit was designed and assembled, but not in time for it to be deployed for any long-term monitoring demonstrations. The prototype ultra-lightweight surface NMR instrument was developed and demonstrated, and this result will be carried forward into the development of a new flexible surface NMR instrument and commercial product in 2018.

  16. Design of LabVIEW®-based software for the control of sequential injection analysis instrumentation for the determination of morphine

    PubMed Central

    Lenehan, Claire E.; Lewis, Simon W.

    2002-01-01

    LabVIEW®-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 × 10⁻¹⁰ to 5 × 10⁻⁶ M) with a line of best fit of y = 1.05x + 8.9164 (R² = 0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 × 10⁻⁸ M). The limit of detection (3σ) was determined as 5 × 10⁻¹¹ M morphine. PMID:18924729
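    The reported calibration line (y = 1.05x + 8.9164, with y the log10 signal in mV and x the log10 concentration in M) implies a power-law relation between signal and concentration, so a measured signal can be inverted back to a concentration. A minimal sketch using only the published fit coefficients:

```python
import math

SLOPE, INTERCEPT = 1.05, 8.9164  # from the reported line of best fit

def predicted_signal_mV(conc_M):
    """Predicted chemiluminescence signal (mV) for a morphine concentration (M)."""
    return 10 ** (SLOPE * math.log10(conc_M) + INTERCEPT)

def concentration_M(signal_mV):
    """Invert the calibration: morphine concentration (M) from a signal (mV)."""
    return 10 ** ((math.log10(signal_mV) - INTERCEPT) / SLOPE)

# Round-trip check at the precision-study standard, 5e-8 M.
print(concentration_M(predicted_signal_mV(5e-8)))
```

    The inversion is only meaningful inside the calibrated range (5 × 10⁻¹⁰ to 5 × 10⁻⁶ M); outside it the linear log-log fit is extrapolation.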

  17. Design of LabVIEW-based software for the control of sequential injection analysis instrumentation for the determination of morphine.

    PubMed

    Lenehan, Claire E; Barnett, Neil W; Lewis, Simon W

    2002-01-01

    LabVIEW-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 × 10⁻¹⁰ to 5 × 10⁻⁶ M) with a line of best fit of y = 1.05x + 8.9164 (R² = 0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 × 10⁻⁸ M). The limit of detection (3σ) was determined as 5 × 10⁻¹¹ M morphine.

  18. Instrumentation of Java Bytecode for Runtime Analysis

    NASA Technical Reports Server (NTRS)

    Goldberg, Allen; Havelund, Klaus

    2003-01-01

    This paper describes JSpy, a system for high-level instrumentation of Java bytecode, and its use with JPaX, our system for runtime analysis of Java programs. JPaX monitors the execution of temporal logic formulas and performs predicative analysis of deadlocks and data races. JSpy's input is an instrumentation specification, which consists of a collection of rules, where a rule is a predicate/action pair. The predicate is a conjunction of syntactic constraints on a Java statement, and the action is a description of logging information to be inserted in the bytecode corresponding to the statement. JSpy is built using JTrek, an instrumentation package at a lower level of abstraction.

  19. Design and development of control unit and software for the ADFOSC instrument of the 3.6 m Devasthal optical telescope

    NASA Astrophysics Data System (ADS)

    Kumar, T. S.

    2016-08-01

    In this paper, we describe the details of the control unit and GUI software for positioning two filter wheels, a slit wheel, and a grism wheel in the ADFOSC instrument. This is a first-generation instrument being built for the 3.6 m Devasthal optical telescope. The control hardware consists of five electronic boards based on low-cost 8-bit PIC microcontrollers distributed over an I2C bus. The four wheels are controlled by four identical boards configured in I2C slave mode, while the fifth board acts as an I2C master, sending commands to and receiving status from the slave boards. The master also communicates with the interfacing PC over the TCP/IP protocol using simple ASCII commands. Stepper motors with suitable driver amplifiers move the wheels, and homing after power-on is achieved using Hall effect sensors. Implementing distributed control units of identical design achieves modularity, enabling easier maintenance and upgrades. GUI-based software for commanding the instrument was developed in Microsoft Visual C++. The user selects the normal mode for operating the system during observations, while the engineering mode offers additional flexibility and low-level control during maintenance and testing. A detailed time-stamped log of commands, status, and errors is continuously generated. Both the control unit and the software have been successfully tested and integrated with the ADFOSC instrument.
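    The abstract states only that the I2C master board exchanges simple ASCII commands with the interfacing PC over TCP/IP; the actual command set is not given. A minimal client sketch under that assumption (the command string "STATUS FW1", the line terminator, and the reply format are hypothetical):

```python
import socket

def send_command(host, port, command, timeout=2.0):
    """Send one ASCII command line to the controller and return its reply line.

    The command vocabulary is hypothetical; the source only says "simple
    ASCII commands" are exchanged over TCP/IP.
    """
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((command + "\r\n").encode("ascii"))
        reply = sock.makefile("r", encoding="ascii").readline()
    return reply.strip()

# Usage (against a real or simulated controller):
#   send_command("192.0.2.10", 5000, "STATUS FW1")
```

    A line-oriented ASCII protocol like this is easy to exercise by hand with telnet or netcat during maintenance, which fits the engineering-mode use case described above.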

  20. Accessible laparoscopic instrument tracking ("InsTrac"): construct validity in a take-home box simulator.

    PubMed

    Partridge, Roland W; Hughes, Mark A; Brennan, Paul M; Hennessey, Iain A M

    2014-08-01

    Objective performance feedback has the potential to maximize the training benefit of laparoscopic simulators. Instrument movement metrics are, however, currently the preserve of complex and expensive systems. We aimed to develop and validate affordable, user-ready software that provides objective feedback by tracking instrument movement in a "take-home" laparoscopic simulator. Computer-vision processing tracks the movement of colored bands placed around the distal instrument shafts. The position of each instrument is logged from the simulator camera feed, and movement metrics are calculated in real time. Ten novices (junior doctors) and 13 general surgery trainees (StR) (training years 3-7) performed a standardized task (threading string through hoops) on the eoSim (eoSurgical™ Ltd., Edinburgh, Scotland, United Kingdom) take-home laparoscopic simulator. Statistical analysis was performed using unpaired t tests with Welch's correction. The software was able to track the instrument tips reliably and effectively. Significant differences between the two groups were observed in time to complete the task (StR versus novice, 2 minutes 33 seconds versus 9 minutes 53 seconds; P=.01), total distance traveled by the instruments (3.29 m versus 11.38 m, respectively; P=.01), average instrument motion smoothness (0.15 mm/s³ versus 0.06 mm/s³, respectively; P<.01), and handedness (mean difference between dominant and nondominant hand) (0.55 m versus 2.43 m, respectively; P=.03). There was no significant difference seen in the distance between instrument tips, acceleration, speed of instruments, or time off-screen. We have developed software that brings objective performance feedback to the portable laparoscopic box simulator. Construct validity has been demonstrated. Removing the need for additional motion-tracking hardware makes it affordable and accessible. It is user-ready and has the potential to enhance the training benefit of portable simulators both in the workplace and at home.
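    Movement metrics of the kind reported above (total distance traveled, handedness) can be computed directly from a time series of tracked tip positions. A sketch, with handedness taken here as the absolute difference in path length between the two hands (the paper's exact definitions may differ):

```python
import math

def path_length(points):
    """Total distance traveled by an instrument tip, from (x, y) samples."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def handedness(dominant_pts, nondominant_pts):
    """Absolute difference in distance traveled between the two instruments."""
    return abs(path_length(dominant_pts) - path_length(nondominant_pts))

# Toy example: one tip moves along a 3-4-5 triangle leg, the other barely moves.
dom = [(0.0, 0.0), (3.0, 4.0)]
nondom = [(0.0, 0.0), (0.0, 1.0)]
print(path_length(dom), handedness(dom, nondom))
```

    Smoothness metrics such as jerk would additionally need the frame timestamps, since they involve third derivatives of position.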

  1. “Blogging” About Course Concepts: Using Technology for Reflective Journaling in a Communications Class

    PubMed Central

    Bouldin, Alicia S.; Holmes, Erin R.; Fortenberry, Michael L.

    2006-01-01

    Objective Web log technology was applied to a reflective journaling exercise in a communication course during the second-professional year at the University of Mississippi School of Pharmacy, to encourage students to reflect on course concepts and apply them to the environment outside the classroom, and to assess their communication performance. Design Two Web log entries per week were required for full credit. Web logs were evaluated at three points during the term. At the end of the course, students evaluated the assignment using a 2-page survey instrument. Assessment The assignment contributed to student learning and increased awareness level for approximately 40% of the class. Students had few complaints about the logistics of the assignment. Conclusion The Web log technology was a useful tool for reflective journaling in this communications course. Future versions of the assignment will benefit from student feedback from this initial experience. PMID:17136203

  2. Borehole geophysics applied to ground-water investigations

    USGS Publications Warehouse

    Keys, W.S.

    1990-01-01

    The purpose of this manual is to provide hydrologists, geologists, and others who have the necessary background in hydrogeology with the basic information needed to apply the most useful borehole-geophysical-logging techniques to the solution of problems in ground-water hydrology. Geophysical logs can provide information on the construction of wells and on the character of the rocks and fluids penetrated by those wells, as well as on changes in the character of these factors over time. The response of well logs is caused by petrophysical factors, by the quality, temperature, and pressure of interstitial fluids, and by ground-water flow. Qualitative and quantitative analysis of analog records and computer analysis of digitized logs are used to derive geohydrologic information. This information can then be extrapolated vertically within a well and laterally to other wells using logs. The physical principles by which the mechanical and electronic components of a logging system measure properties of rocks, fluids, and wells, as well as the principles of measurement, must be understood if geophysical logs are to be interpreted correctly. Planning a logging operation involves selecting the equipment and the logs most likely to provide the needed information. Information on well construction and geohydrology is needed to guide this selection. Quality control of logs is an important responsibility of both the equipment operator and the log analyst and requires both calibration and well-site standardization of equipment. Logging techniques that are widely used in ground-water hydrology or that have significant potential for application to this field include spontaneous potential, resistance, resistivity, gamma, gamma spectrometry, gamma-gamma, neutron, acoustic velocity, acoustic televiewer, caliper, and fluid temperature, conductivity, and flow. The following topics are discussed for each of these techniques: principles and instrumentation, calibration and standardization, volume of investigation, extraneous effects, and interpretation and applications.

  3. Borehole geophysics applied to ground-water investigations

    USGS Publications Warehouse

    Keys, W.S.

    1988-01-01

    The purpose of this manual is to provide hydrologists, geologists, and others who have the necessary training with the basic information needed to apply the most useful borehole-geophysical-logging techniques to the solution of problems in ground-water hydrology. Geophysical logs can provide information on the construction of wells and on the character of the rocks and fluids penetrated by those wells, in addition to changes in the character of these factors with time. The response of well logs is caused by petrophysical factors; the quality, temperature, and pressure of interstitial fluids; and ground-water flow. Qualitative and quantitative analysis of the analog records and computer analysis of digitized logs are used to derive geohydrologic information. This information can then be extrapolated vertically within a well and laterally to other wells using logs. The physical principles by which the mechanical and electronic components of a logging system measure properties of rocks, fluids, and wells, and the principles of measurement, need to be understood to interpret geophysical logs correctly. Planning the logging operation involves selecting the equipment and the logs most likely to provide the needed information. Information on well construction and geohydrology is needed to guide this selection. Quality control of logs is an important responsibility of both the equipment operator and the log analyst and requires both calibration and well-site standardization of equipment. Logging techniques that are widely used in ground-water hydrology or that have significant potential for application to this field include spontaneous potential, resistance, resistivity, gamma, gamma spectrometry, gamma-gamma, neutron, acoustic velocity, acoustic televiewer, caliper, and fluid temperature, conductivity, and flow. The following topics are discussed for each of these techniques: principles and instrumentation, calibration and standardization, volume of investigation, extraneous effects, and interpretation and applications.

  4. A new fully automated FTIR system for total column measurements of greenhouse gases

    NASA Astrophysics Data System (ADS)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-10-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of the column-averaged volume mixing ratio for CO2, CH4, and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging, and web-based remote control. First results of total column measurements at Jena, Germany, show that the instrument works well and can capture parts of the diurnal as well as the seasonal cycle of CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  5. Quantitative comparison of measurements of urgent care service quality.

    PubMed

    Qin, Hong; Prybutok, Victor; Prybutok, Gayle

    2016-01-01

    Service quality and patient satisfaction are essential to health care organization success. Parasuraman, Zeithaml, and Berry introduced SERVQUAL, a prominent service quality measure not yet applied to urgent care. We develop an instrument to measure perceived service quality and identify the determinants of patient satisfaction/behavioral intentions. We examine the relationships among perceived service quality, patient satisfaction, and behavioral intentions, and demonstrate that urgent care service quality is not equivalent when measured using perceptions only, the difference of expectations minus perceptions, the ratio of perceptions to expectations, and the log of that ratio. Perceptions provide the best measure of urgent care service quality.
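    The four competing quality measures compared in the study can be computed per item from a perception score and an expectation score. A minimal sketch (the 1-7 Likert-style values in the example are illustrative, and the sign convention follows the abstract's "expectations minus perceptions"):

```python
import math

def service_quality_scores(perception, expectation):
    """The four alternative SERVQUAL-style measures for one survey item."""
    ratio = perception / expectation
    return {
        "perception": perception,            # perceptions-only score
        "gap": expectation - perception,     # expectations minus perceptions
        "ratio": ratio,                      # perceptions / expectations
        "log_ratio": math.log(ratio),        # log of the ratio
    }

# Example item: perceived quality 6, expected quality 5 (illustrative values).
print(service_quality_scores(6.0, 5.0))
```

    The ratio and log-ratio forms normalize perceptions by expectations, which is why the study can meaningfully ask whether the four formulations rank providers the same way.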

  6. Induction logging device

    DOEpatents

    Koelle, A.R.; Landt, J.A.

    An instrument is disclosed for mapping vertical conductive fractures in resistive bedrock. Eddy currents are magnetically induced by a pair of vertically oriented, mutually perpendicular, coplanar coils; these eddy currents drive magnetic fields which are picked up by a second, similar pair of coils.

  7. Interactive Multi-Instrument Database of Solar Flares (IMIDSF)

    NASA Astrophysics Data System (ADS)

    Sadykov, Viacheslav M.; Nita, Gelu M.; Oria, Vincent; Kosovichev, Alexander G.

    2017-08-01

    Solar flares represent a complicated physical phenomenon observed in a broad range of the electromagnetic spectrum, from radiowaves to gamma-rays. For a complete understanding of the flares it is necessary to perform a combined multi-wavelength analysis using observations from many satellites and ground-based observatories. For efficient data search, integration of different flare lists and representation of observational data, we have developed the Interactive Multi-Instrument Database of Solar Flares (https://solarflare.njit.edu/). The web database is fully functional and allows the user to search for uniquely-identified flare events based on their physical descriptors and availability of observations of a particular set of instruments. Currently, data from three primary flare lists (GOES, RHESSI and HEK) and a variety of other event catalogs (Hinode, Fermi GBM, Konus-Wind, OVSA flare catalogs, CACTus CME catalog, Filament eruption catalog) and observing logs (IRIS and Nobeyama coverage), are integrated. An additional set of physical descriptors (temperature and emission measure) along with observing summary, data links and multi-wavelength light curves is provided for each flare event since January 2002. Results of an initial statistical analysis will be presented.

  8. Emerging Methods and Systems for Observing Life in the Sea

    NASA Astrophysics Data System (ADS)

    Chavez, F.; Pearlman, J.; Simmons, S. E.

    2016-12-01

    There is a growing need for observations of life in the sea at time and space scales consistent with those made for physical and chemical parameters. International programs such as the Global Ocean Observing System (GOOS) and Marine Biodiversity Observation Networks (MBON) are making the case for expanded biological observations and working diligently to prioritize essential variables. Here we review past, present and emerging systems and methods for observing life in the sea from the perspective of maintaining continuous observations over long time periods. Methods that rely on ships with instrumentation and over-the-side sample collections will need to be supplemented and eventually replaced with those based from autonomous platforms. Ship-based optical and acoustic instruments are being reduced in size and power for deployment on moorings and autonomous vehicles. In parallel a new generation of low power, improved resolution sensors are being developed. Animal bio-logging is evolving with new, smaller and more sophisticated tags being developed. New genomic methods, capable of assessing multiple trophic levels from a single water sample, are emerging. Autonomous devices for genomic sample collection are being miniaturized and adapted to autonomous vehicles. The required processing schemes and methods for these emerging data collections are being developed in parallel with the instrumentation. An evolving challenge will be the integration of information from these disparate methods given that each provides their own unique view of life in the sea.

  9. DIY soundcard based temperature logging system. Part I: design

    NASA Astrophysics Data System (ADS)

    Nunn, John

    2016-11-01

    This paper aims to enable schools to make their own low-cost temperature logging instrument and to learn something about its calibration in the process. It describes how a thermistor can be integrated into a simple potential divider circuit which is powered by the sound output of a computer and monitored by the microphone input. The voltage across a fixed resistor is recorded and scaled to convert it into a temperature reading in the range 0-100 °C. The calibration process is described with reference to fixed points, and the effects of non-linearity are highlighted. An optimised calibration procedure is described which enables sub-degree resolution, and a software program was written which makes it possible to log, display, and save temperature changes over a user-determined period of time.
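    The divider-voltage-to-temperature conversion described above can be sketched as follows. The drive amplitude, fixed-resistor value, and thermistor beta-model coefficients are illustrative assumptions, not values from the paper:

```python
import math

V_DRIVE = 1.0        # soundcard excitation amplitude (normalized), assumed
R_FIXED = 10_000.0   # fixed divider resistor (ohms), illustrative
R_25 = 10_000.0      # thermistor resistance at 25 degC (ohms), illustrative
BETA = 3950.0        # thermistor beta coefficient (K), illustrative

def thermistor_resistance(v_fixed):
    """Thermistor resistance from the voltage across the fixed resistor.

    Divider relation: v_fixed = V_DRIVE * R_FIXED / (R_FIXED + R_therm).
    """
    return R_FIXED * (V_DRIVE - v_fixed) / v_fixed

def temperature_C(v_fixed):
    """Temperature (degC) via the beta-model approximation of the thermistor."""
    r = thermistor_resistance(v_fixed)
    inv_T = 1.0 / 298.15 + math.log(r / R_25) / BETA  # 1/T in kelvin
    return 1.0 / inv_T - 273.15

# At 25 degC the divider is balanced, so half the drive voltage appears
# across the fixed resistor.
print(temperature_C(0.5))
```

    The beta model is itself non-linear in voltage, which is the non-linearity the paper's fixed-point calibration procedure addresses.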

  10. Environmental and Genetic Factors Explain Differences in Intraocular Scattering.

    PubMed

    Benito, Antonio; Hervella, Lucía; Tabernero, Juan; Pennos, Alexandros; Ginis, Harilaos; Sánchez-Romera, Juan F; Ordoñana, Juan R; Ruiz-Sánchez, Marcos; Marín, José M; Artal, Pablo

    2016-01-01

    To study the relative impact of genetic and environmental factors on the variability of intraocular scattering within a classical twin study. A total of 64 twin pairs, 32 monozygotic (MZ) (mean age: 54.9 ± 6.3 years) and 32 dizygotic (DZ) (mean age: 56.4 ± 7.0 years), were measured after a complete ophthalmologic exam had been performed to exclude all ocular pathologies, such as cataracts, that increase intraocular scatter. Intraocular scattering was evaluated using two different techniques based on estimation of the straylight parameter log(S): a compact optical instrument based on the principle of optical integration, and a psychophysical measurement. Intraclass correlation coefficients (ICC) were used as descriptive statistics of twin resemblance, and genetic models were fitted to estimate heritability. No statistically significant difference was found between the MZ and DZ groups for age (P = 0.203), best-corrected visual acuity (P = 0.626), cataract gradation (P = 0.701), sex (P = 0.941), optical log(S) (P = 0.386), or psychophysical log(S) (P = 0.568), with only a minor difference in equivalent sphere (P = 0.008). Intraclass correlation coefficients between siblings were similar for scatter parameters: 0.676 in MZ and 0.471 in DZ twins for optical log(S); 0.533 in MZ twins and 0.475 in DZ twins for psychophysical log(S). For equivalent sphere, ICCs were 0.767 in MZ and 0.228 in DZ twins. Conservative estimates of heritability for the measured scattering parameters were 0.39 and 0.20, respectively. Correlations of intraocular scatter (straylight) parameters in the groups of identical and nonidentical twins were similar. Heritability estimates were of limited magnitude, suggesting that genetic and environmental factors determine the variance of ocular straylight in healthy middle-aged adults.
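    Twin-study heritability is classically approximated from the MZ and DZ intraclass correlations by Falconer's formula, h² = 2(r_MZ − r_DZ). The study fitted formal genetic models, so its estimates are not computed this way, but the formula gives a quick ballpark from the reported correlations:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's approximation of heritability from twin correlations."""
    return 2.0 * (r_mz - r_dz)

# Reported ICCs for optical log(S): 0.676 (MZ) and 0.471 (DZ).
print(falconer_h2(0.676, 0.471))
```

    Applied to the optical log(S) correlations this gives about 0.41, in the neighborhood of the reported conservative estimate of 0.39; the psychophysical log(S) correlations give a smaller value, consistent with the 0.20 reported.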

  11. A wireless high-speed data acquisition system for geotechnical centrifuge model testing

    NASA Astrophysics Data System (ADS)

    Gaudin, C.; White, D. J.; Boylan, N.; Breen, J.; Brown, T.; DeCatania, S.; Hortin, P.

    2009-09-01

    This paper describes a novel high-speed wireless data acquisition system (WDAS) developed at the University of Western Australia for operation onboard a geotechnical centrifuge, in an enhanced gravitational field of up to 300 times Earth's gravity. The WDAS system consists of up to eight separate miniature units distributed around the circumference of a 0.8 m diameter drum centrifuge, communicating with the control room via wireless Ethernet. Each unit is capable of powering and monitoring eight instrument channels at a sampling rate of up to 1 MHz at 16-bit resolution. The data are stored within the logging unit in solid-state memory, but may also be streamed in real-time at low frequency (up to 10 Hz) to the centrifuge control room, via wireless transmission. The high-speed logging runs continuously within a circular memory (buffer), allowing for storage of a pre-trigger segment of data prior to an event. To suit typical geotechnical modelling applications, the system can record low-speed data continuously, until a burst of high-speed acquisition is triggered when an experimental event occurs, after which the system reverts back to low-speed acquisition to monitor the aftermath of the event. Unlike PC-based data acquisition solutions, this system performs the full sequence of amplification, conditioning, digitization and storage on a single circuit board via an independent micro-controller allocated to each pair of instrumented channels. This arrangement is efficient, compact and physically robust to suit the centrifuge environment. This paper details the design specification of the WDAS along with the software interface developed to control the units. Results from a centrifuge test of a submarine landslide are used to illustrate the performance of the new WDAS.
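    The pre-trigger circular-buffer scheme described above can be sketched with a fixed-size deque that always retains the most recent samples; when an event triggers, the preceding segment has already been captured. A simplified software analogue of what the WDAS does per channel in hardware:

```python
from collections import deque

class PreTriggerBuffer:
    """Keep the last `size` samples; on trigger, snapshot the pre-event record."""

    def __init__(self, size):
        self.buf = deque(maxlen=size)  # circular memory: oldest sample drops out

    def push(self, sample):
        self.buf.append(sample)

    def trigger(self):
        """Return the pre-trigger segment, oldest sample first."""
        return list(self.buf)

# Continuously push samples; only the most recent `size` survive.
buf = PreTriggerBuffer(size=3)
for sample in range(10):
    buf.push(sample)
print(buf.trigger())
```

    In the real system this runs continuously at high speed on each micro-controller, with the low-speed stream forwarded to the control room in parallel.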

  12. Reductions in emissions from deforestation from Indonesia’s moratorium on new oil palm, timber, and logging concessions

    PubMed Central

    Busch, Jonah; Ferretti-Gallon, Kalifi; Engelmann, Jens; Wright, Max; Austin, Kemen G.; Stolle, Fred; Turubanova, Svetlana; Potapov, Peter V.; Margono, Belinda; Hansen, Matthew C.; Baccini, Alessandro

    2015-01-01

    To reduce greenhouse gas emissions from deforestation, Indonesia instituted a nationwide moratorium on new license areas (“concessions”) for oil palm plantations, timber plantations, and logging activity on primary forests and peat lands after May 2011. Here we indirectly evaluate the effectiveness of this policy using annual nationwide data on deforestation, concession licenses, and potential agricultural revenue from the decade preceding the moratorium. We estimate that on average granting a concession for oil palm, timber, or logging in Indonesia increased site-level deforestation rates by 17–127%, 44–129%, or 3.1–11.1%, respectively, above what would have occurred otherwise. We further estimate that if Indonesia’s moratorium had been in place from 2000 to 2010, then nationwide emissions from deforestation over that decade would have been 241–615 MtCO2e (2.8–7.2%) lower without leakage, or 213–545 MtCO2e (2.5–6.4%) lower with leakage. As a benchmark, an equivalent reduction in emissions could have been achieved using a carbon price-based instrument at a carbon price of $3.30–7.50/tCO2e (mandatory) or $12.95–19.45/tCO2e (voluntary). For Indonesia to have achieved its target of reducing emissions by 26%, the geographic scope of the moratorium would have had to expand beyond new concessions (15.0% of emissions from deforestation and peat degradation) to also include existing concessions (21.1% of emissions) and address deforestation outside of concessions and protected areas (58.7% of emissions). Place-based policies, such as moratoria, may be best thought of as bridge strategies that can be implemented rapidly while the institutions necessary to enable carbon price-based instruments are developed. PMID:25605880

  13. The Importance of Field Demonstration Sites: The View from the Unconventional Resource Region of the Appalachian Basin

    NASA Astrophysics Data System (ADS)

    Carr, T.

    2017-12-01

    The Appalachian basin, with the Marcellus and Utica shale units, is one of the most active unconventional resource plays in North America. Unconventional resource plays are critical and rapidly growing areas of energy, where research lags behind exploration and production activity. There remains a poor overall understanding of the physical, chemical, and biological factors that control shale gas production efficiency and the possible environmental impacts associated with shale gas development. We have developed an approach that works with local industrial partners and communities and across research organizations. The Marcellus Shale Energy and Environment Laboratory (MSEEL) consists of a multidisciplinary and multi-institutional team undertaking integrated geoscience, engineering, and environmental studies in cooperation with the Department of Energy. This approach is being expanded to other sites and to the international arena. MSEEL consists of four instrumented horizontal production wells, a cored and logged vertical pilot borehole, and a microseismic observation well. MSEEL has integrated geophysical observations (microseismic and surface), fiber-optic monitoring for distributed acoustic (DAS) and temperature sensing (DTS), well logs, core data, production logging, and continued monitoring to characterize subsurface rock properties and the propagation pattern of induced fractures in the stimulated reservoir volume. Significant geologic heterogeneity along the lateral affects fracture stimulation efficiency: both completion efficiency (clusters that receive effective stimulation) and production efficiency (clusters effectively contributing to production). MSEEL works to develop new knowledge of subsurface geology and engineering, and of surface environmental impact, to identify best practices that can optimize hydraulic fracture stimulation, increase flow rates and estimated ultimate recovery, and thereby reduce the number of wells and the environmental impact.

  14. Real-time use of the iPad by third-year medical students for clinical decision support and learning: a mixed methods study.

    PubMed

    Nuss, Michelle A; Hill, Janette R; Cervero, Ronald M; Gaines, Julie K; Middendorf, Bruce F

    2014-01-01

    Despite widespread use of mobile technology in medical education, medical students' use of mobile technology for clinical decision support and learning is not well understood. Three key questions were explored in this extensive mixed methods study: 1) how medical students used mobile technology in the care of patients, 2) which mobile applications (apps) they used, and 3) how their expertise and time spent changed over time. This year-long (July 2012-June 2013) mixed methods study explored the use of the iPad, using four data collection instruments: 1) beginning and end-of-year questionnaires, 2) iPad usage logs, 3) weekly rounding observations, and 4) weekly medical student interviews. Descriptive statistics were generated for the questionnaires and the apps reported in the usage logs. The iPad usage logs, observation logs, and weekly interviews were analyzed via inductive thematic analysis. Students predominantly used mobile technology to obtain real-time patient data via the electronic health record (EHR), to access medical knowledge resources for learning, and to inform patient care. The top four apps used were Epocrates(®), PDF Expert(®), VisualDx(®), and Micromedex(®). The majority of students indicated that their use (71%) and expertise (75%) using mobile technology grew over time. This mixed methods study provides substantial evidence that medical students used mobile technology for clinical decision support and learning. Integrating its use into the medical student's daily workflow was essential for achieving these outcomes. Developing expertise in using mobile technology and various apps was critical for effective and efficient support of real-time clinical decisions.

  15. From gold leaf to thermal neutrons: One hundred years of radioactivity and geological exploration (Invited)

    NASA Astrophysics Data System (ADS)

    Howarth, R. J.

    2010-12-01

    In 1789 Klaproth extracted ‘Uranit’ from shiny black ‘Pechblende’ ore obtained from the George Wagsfort silver mine at Johanngeorgenstadt in the Erzgebirge (Ore Mountains), Saxony, Germany. He believed it to be a new chemical element (but what he had obtained was actually an oxide; uranium was first isolated in its pure metallic form by Peligot in 1856, following an earlier attempt in 1841). By 1816 pitchblende had also been found in Hungary and Cornwall, England. In 1871, an English metallurgist, Richard Pearce, visiting the USA, discovered two cwt. of pitchblende ‘thrown away on a refuse-heap’ at the Wood Mine, Gilpin Co., Colorado. He returned the following year and leased the mine, which subsequently supplied ore to Mme. Curie. In 1867, Saint-Victor had noticed the blackening of silver halide emulsion by uranium compounds, but it was Becquerel who, in a series of experiments in 1896, first showed that they emitted a radiation that was not the same as that from a Crookes tube (X-rays). In 1898 both Mme. Curie and Schmidt independently described the radioactivity of thorium and its compounds, but she realised, from her electrometer experiments, that uranium minerals might contain a yet more active element than either U or Th. By the end of 1900 the U-decay series had been elucidated, and during 1903-4 a radioactive gas found to exist in soil, water, air and crude petroleum was shown to be identical with radium ‘emanation.’ Wolcott (1904) suggested that it might be used to prospect for U and Th ores. Nevertheless, for the next 25 years, investigations were largely confined to the laboratory, with instrumental development and studies of radiochemistry, mineralogy, autoradiography, pleochroic haloes, and the radioactive heating of the Earth.
However, there were exceptions: in 1905 von dem Borne used electrometer measurements to locate veins of pitchblende in a mine, and Ambronn (1921) measured the activity of successive core samples taken down an oil well to make a down-hole radioactivity profile. Technical advances were rapidly reflected in prospecting on foot, by car, and in the air, with successive adoption of the electrometer (1927); the Geiger-Müller (1945), scintillation (1952) and Hare (1954) counters; and the gamma-spectrometer (1960). The modern era of well-logging began with Fearon's 1937 patenting of logs using gamma rays (discovered by Villard, 1900; named by Rutherford, 1914) and neutrons (discovered by Chadwick, 1932), although the term ‘gamma ray log’ is reported as having first been used on 29 October 1938. A simultaneous gamma and neutron logging device was developed by Sherbatskoy in 1951. Neutron-gamma and gamma-gamma logs followed in the next two years and, by the time it was possible to undertake this with a single instrument (Monaghan 1961), further tools had been developed to attempt detection of both hydrocarbons and salt water in the formations passed through. One hundred years after Pearce’s discovery, the Thermal Neutron Decay Time Log was introduced; the marriage of radioactivity and geology had truly come of age.

  16. How many days of monitoring predict physical activity and sedentary behaviour in older adults?

    PubMed Central

    2011-01-01

    Background The number of days of pedometer or accelerometer data needed to reliably assess physical activity (PA) is important for research that examines the relationship with health. While this research has been completed in young to middle-aged adults, data are lacking in older adults, and data determining the number of days of self-reported PA needed are likewise absent. The purpose of this study was to examine the number of days needed to predict habitual PA and sedentary behaviour across pedometer, accelerometer, and physical activity log (PA log) data in older adults. Methods Participants (52 older men and women; age = 69.3 ± 7.4 years, range = 55-86 years) wore a Yamax Digiwalker SW-200 pedometer and an ActiGraph 7164 accelerometer while completing a PA log for 21 consecutive days. Mean differences for each instrument and intensity between days of the week were examined using separate repeated-measures analyses of variance with pairwise comparisons. Spearman-Brown Prophecy Formulae based on Intraclass Correlations of .80, .85, .90 and .95 were used to predict the number of days of accelerometer or pedometer wear or PA log daily records needed to represent total PA, light PA, moderate-to-vigorous PA, and sedentary behaviour. Results Results of this study showed that three days of accelerometer data, four days of pedometer data, or four days of completing PA logs are needed to accurately predict PA levels in older adults. When examining time spent in specific intensities of PA, fewer days of data are needed for accurate prediction of time spent in that activity for the ActiGraph but more for the PA log. To accurately predict average daily time spent in sedentary behaviour, five days of ActiGraph data are needed. Conclusions The number of days of objective (pedometer and ActiGraph) and subjective (PA log) data needed to accurately estimate daily PA in older adults was relatively consistent.
Despite no statistical differences between days for total PA by the pedometer and ActiGraph, the magnitude of differences between days suggests that day of the week cannot be completely ignored in the design and analysis of PA studies that involve < 7-day monitoring protocols for these instruments. More days of accelerometer data were needed to determine typical sedentary behaviour than PA level in this population of older adults. PMID:21679426
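    The Spearman-Brown prophecy step used in this study can be sketched as follows: given a single-day reliability (intraclass correlation), the formula predicts how many days must be averaged to reach a target reliability. The single-day ICC of 0.50 below is a hypothetical illustration value, not a figure from the study.

```python
import math

def days_needed(single_day_icc: float, target_icc: float) -> int:
    """Spearman-Brown prophecy: number of monitoring days k required for the
    k-day average to reach the target reliability. Derived from
    ICC_k = k*ICC_1 / (1 + (k-1)*ICC_1), solved for k."""
    k = (target_icc * (1.0 - single_day_icc)) / (single_day_icc * (1.0 - target_icc))
    return math.ceil(round(k, 9))  # round guards against float noise before ceil

# Hypothetical single-day reliability of 0.50 for pedometer steps:
print(days_needed(0.50, 0.80))  # -> 4
print(days_needed(0.50, 0.90))  # -> 9
```

As the target reliability rises from .80 to .95, the required number of days grows quickly, which is why studies report day counts per target ICC.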

  17. How many days of monitoring predict physical activity and sedentary behaviour in older adults?

    PubMed

    Hart, Teresa L; Swartz, Ann M; Cashin, Susan E; Strath, Scott J

    2011-06-16

    The number of days of pedometer or accelerometer data needed to reliably assess physical activity (PA) is important for research that examines the relationship with health. While this research has been completed in young to middle-aged adults, data are lacking in older adults, and data determining the number of days of self-reported PA needed are likewise absent. The purpose of this study was to examine the number of days needed to predict habitual PA and sedentary behaviour across pedometer, accelerometer, and physical activity log (PA log) data in older adults. Participants (52 older men and women; age = 69.3 ± 7.4 years, range = 55-86 years) wore a Yamax Digiwalker SW-200 pedometer and an ActiGraph 7164 accelerometer while completing a PA log for 21 consecutive days. Mean differences for each instrument and intensity between days of the week were examined using separate repeated-measures analyses of variance with pairwise comparisons. Spearman-Brown Prophecy Formulae based on Intraclass Correlations of .80, .85, .90 and .95 were used to predict the number of days of accelerometer or pedometer wear or PA log daily records needed to represent total PA, light PA, moderate-to-vigorous PA, and sedentary behaviour. Results of this study showed that three days of accelerometer data, four days of pedometer data, or four days of completing PA logs are needed to accurately predict PA levels in older adults. When examining time spent in specific intensities of PA, fewer days of data are needed for accurate prediction of time spent in that activity for the ActiGraph but more for the PA log. To accurately predict average daily time spent in sedentary behaviour, five days of ActiGraph data are needed. The number of days of objective (pedometer and ActiGraph) and subjective (PA log) data needed to accurately estimate daily PA in older adults was relatively consistent.
Despite no statistical differences between days for total PA by the pedometer and ActiGraph, the magnitude of differences between days suggests that day of the week cannot be completely ignored in the design and analysis of PA studies that involve < 7-day monitoring protocols for these instruments. More days of accelerometer data were needed to determine typical sedentary behaviour than PA level in this population of older adults.

  18. Measuring System for Growth Control of the Spirulina Aquaculture

    NASA Astrophysics Data System (ADS)

    Ponce S., Claudio; Ponce L., Ernesto; Bernardo S., Barraza

    2008-11-01

    This paper describes the workings of a data-logging instrument that measures growth levels of the Spirulina aquaculture. Spirulina is a very delicate alga, and its culture may be suddenly lost due to overgrowth. No instrument of this kind is currently available on the market. The transducer is a submersible laser device with a measurement margin of error of approximately 0.28%. The advantages of this new instrument are improved measurement and low cost. The future application of this work is in the industrial production of food and fuel from micro-algae culture for the growing world population.

  19. Minnesota logging utilization factors, 1975-1976--development, use, implications.

    Treesearch

    James E. Blyth; W. Brad Smith

    1979-01-01

    Discusses Minnesota saw log and pulpwood logging utilization factors developed during 1975-1976 and their implications. Compares factors for several species groups and shows their use in estimating growing stock cut for pulpwood and saw logs.

  20. A compiler and validator for flight operations on NASA space missions

    NASA Astrophysics Data System (ADS)

    Fonte, Sergio; Politi, Romolo; Capria, Maria Teresa; Giardino, Marco; De Sanctis, Maria Cristina

    2016-07-01

    In NASA missions, the management and programming of flight systems is performed in a specific scripting language, the SASF (Spacecraft Activity Sequence File). Checking syntax and grammar requires a compiler that flags any errors found in the sequence file produced for an instrument on board the flight system. From our experience on the Dawn mission, we developed VIRV (VIR Validator), a tool that checks the syntax and grammar of SASF files, runs simulations of VIR acquisitions, and reports any violations of flight rules in the sequences produced. The project of a SASF compiler (SSC - Spacecraft Sequence Compiler) is ready for a new implementation: generalization to different NASA missions. VIRV is, in effect, a compiler for a dialect of SASF that includes VIR commands as part of the SASF language. Our goal is to produce a general compiler for SASF in which every instrument contributes a library that is introduced into the compiler. The SSC can analyze an SASF file, produce a log of events, simulate the instrument acquisition, and check the flight rules for the selected instrument. The output of the program can be produced in GRASS GIS format and may help the operator analyze the geometry of the acquisition.
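    The flight-rule checking described above can be sketched as a rule pass over a parsed command sequence. The event format (time, command) and the rule below are invented for illustration; they are not part of the real SASF grammar or Dawn's flight rules.

```python
# Minimal sketch of a sequence flight-rule checker. The (time_s, command)
# tuples and the POWER_ON/ACQUIRE rule are hypothetical illustration values.
def check_flight_rules(sequence):
    """Return a log of violations of one invented rule: no 'ACQUIRE'
    command may run before 'POWER_ON' has been issued."""
    violations = []
    powered = False
    for time_s, command in sequence:
        if command == "POWER_ON":
            powered = True
        elif command == "ACQUIRE" and not powered:
            violations.append(f"t={time_s}s: ACQUIRE before POWER_ON")
    return violations

seq = [(0, "ACQUIRE"), (10, "POWER_ON"), (20, "ACQUIRE")]
print(check_flight_rules(seq))  # one violation, at t=0
```

A real validator would layer many such rules, per instrument, on top of the syntax and grammar checks.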

  1. Temperature control system for optical elements in astronomical instrumentation

    NASA Astrophysics Data System (ADS)

    Verducci, Orlando; de Oliveira, Antonio C.; Ribeiro, Flávio F.; Vital de Arruda, Márcio; Gneiding, Clemens D.; Fraga, Luciano

    2014-07-01

    Extremely low temperatures may damage the optical components assembled inside an astronomical instrument owing to cracking of the resin or glue used to attach lenses and mirrors. The very cold, dry environment at most astronomical observatories contributes to this problem. This paper describes the solution implemented at SOAR for remotely monitoring and controlling temperatures inside a spectrograph, in order to prevent possible damage to the optical parts. The system automatically switches heat dissipation elements located near the optics on and off as the measured temperature crosses a trigger value. This value is set to a temperature at which the instrument is not operational, so the system serves only to protect the optics rather than to prevent malfunction. The software was developed with LabVIEW™ and is based on an object-oriented design that offers flexibility and ease of maintenance. As a result, the system is able to keep the internal temperature of the instrument above a chosen limit, except perhaps during the response time, owing to thermal inertia. This inertia can be controlled, and even avoided, by choosing the correct amount of heat dissipation and suitable locations for the thermal elements. A log file records the temperature values measured by the system for operational analysis.
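    The switching logic described (heaters toggled as the temperature crosses a trigger value) amounts to a bang-bang controller; adding a hysteresis band avoids rapid on/off cycling near the threshold. The trigger and band values below are hypothetical, not SOAR's actual settings.

```python
def heater_state(temp_c: float, heater_on: bool,
                 trigger_c: float = 2.0, hysteresis_c: float = 1.0) -> bool:
    """Bang-bang control with hysteresis: switch heaters on below the
    trigger temperature, off once it has risen past trigger + hysteresis.
    Inside the band, keep the previous state to avoid chatter."""
    if temp_c < trigger_c:
        return True
    if temp_c > trigger_c + hysteresis_c:
        return False
    return heater_on  # inside the band: hold the previous state

state = False
for t in [5.0, 1.5, 2.5, 3.5]:  # simulated temperature log, in deg C
    state = heater_state(t, state)
print(state)  # -> False (temperature warmed past the band again)
```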

  2. Protocol Analysis as a Tool in Function and Task Analysis

    DTIC Science & Technology

    1999-10-01

    Autocontingency The use of log-linear and logistic regression methods to analyse sequential data seems appealing, and is strongly advocated by...collection and analysis of observational data. Behavior Research Methods, Instruments, and Computers, 23(3), 415-429. Patrick, J. D. (1991). Snob: A

  3. 10 CFR 39.61 - Training.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Training. 39.61 Section 39.61 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.61... handling tools, and radiation survey instruments by a field evaluation; and (4) Has demonstrated...

  4. 10 CFR 39.61 - Training.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Training. 39.61 Section 39.61 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.61... handling tools, and radiation survey instruments by a field evaluation; and (4) Has demonstrated...

  5. 10 CFR 39.61 - Training.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Training. 39.61 Section 39.61 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.61... handling tools, and radiation survey instruments by a field evaluation; and (4) Has demonstrated...

  6. 10 CFR 39.61 - Training.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Training. 39.61 Section 39.61 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.61... handling tools, and radiation survey instruments by a field evaluation; and (4) Has demonstrated...

  7. 10 CFR 39.61 - Training.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Training. 39.61 Section 39.61 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Radiation Safety Requirements § 39.61... handling tools, and radiation survey instruments by a field evaluation; and (4) Has demonstrated...

  8. Subsurface recognition of oolitic facies in carbonate sequence: Exploration and development applications: Ste. Genevieve Formation (Mississippian), Illinois basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandy, W.F.

    1989-08-01

    The oolitic grainstone facies of the Ste. Genevieve Limestone is a widespread and highly productive reservoir in the Illinois basin. However, exploration and development of these oolitic facies are hampered by the inability to recognize the reservoir on logs. In many areas, the only log data available are old wireline electric logs. Comparison of cores with log response in northern Lawrence field, Lawrence County, Illinois, indicates a subjective but predictable relationship between log signature and carbonate lithology. Two productive lithologies, dolomite and oolitic grainstone, display well-developed SP curves. However, resistivity response is greatest in dense limestone, less well developed in oolitic grainstone, and poorly developed in dolomites. On gamma-ray logs, oolitic facies can be differentiated from dolomites by their lower radioactivity. Oolitic sands are most easily recognized on porosity logs, where their average porosity is 13.7%, only half the average porosity of dolomites. In a new well, the best information for subsequent offset and development of an oolitic reservoir is provided by porosity and dipmeter logs.

  9. Inactivation of Bacteria S. aureus ATCC 25923 and S. Thyphimurium ATCC 14 028 Influence of UV-HPEF

    NASA Astrophysics Data System (ADS)

    Bakri, A.; Hariono, B.; Utami, M. M. D.; Sutrisno

    2018-01-01

    The research aimed to study the performance of the UV-HPEF unit in inactivating populations of Gram-positive (S. aureus ATCC 25923) and Gram-negative (S. Typhimurium ATCC 14028) bacteria inoculated in sterilized goat’s milk. The UV pasteurization instrument employed three reactors constructed as a series UV-C system at 10 W and 253.7 nm wavelength, made by Kada (USA) Inc., with a dose of 1.8 J/cm2 per reactor. The HPEF instrument used a high pulsed electric field at 31.67 kV/cm and 15 Hz, with a goat’s-milk flow rate of 4.32 ± 0.71 cc/second. Pathogenic bacteria were enumerated according to Indonesian National Standard 01-2782-1998. The inactivation rates of the pathogenic bacteria, i.e., S. Typhimurium ATCC 14028 and S. aureus ATCC 25923, were 0.28 and 0.19 log cycles (6.35 and 4.34 log cfu/ml/hour), respectively; D values were 0.16 and 0.23 hour, with k values of 14.62 and 10 hour-1, respectively.
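    In first-order microbial inactivation kinetics, the decimal reduction time D and the rate constant k are related by k = ln(10)/D ≈ 2.303/D, which is consistent with the D and k pairs reported above (the small difference for the first pair presumably reflects rounding). A minimal check:

```python
import math

def k_from_D(D_hours: float) -> float:
    """First-order inactivation: N(t) = N0 * 10**(-t/D) = N0 * exp(-k*t),
    hence k = ln(10) / D."""
    return math.log(10) / D_hours

print(round(k_from_D(0.23), 1))  # -> 10.0 per hour, matching the reported k
print(round(k_from_D(0.16), 1))  # -> 14.4 per hour (paper reports 14.62)
```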

  10. Armored instrumentation cable for geothermal well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis, B.R.; Johnson, J.; Todd, B.

    1981-01-01

    Multiconductor armored well-logging cable is used extensively by the oil and natural gas industry to lower instruments that measure geological and geophysical parameters into deep wellbores. Advanced oil-well drilling technology makes it possible to achieve borehole depths of 9 km (30,000 ft). The higher temperatures in these deeper boreholes demand advances in the design and manufacture of wireline cable and in the electrical insulating and armoring materials used as integral components. If geothermal energy is proved an abundant economic resource, drilling temperatures approaching and exceeding 300 °C will become commonplace. The adaptation of teflons as electrical insulating material permitted the use of armored cable in geothermal wellbores where temperatures are slightly in excess of 200 °C and where concentrations of corrosive minerals and gases are high. Teflon materials presently used in wireline cables, however, are not capable of continuous operation at the anticipated higher temperatures.

  11. Assessment of subjective intraocular forward scattering and quality of vision after posterior chamber phakic intraocular lens with a central hole (Hole ICL) implantation.

    PubMed

    Iijima, Ayaka; Shimizu, Kimiya; Yamagishi, Mayumi; Kobashi, Hidenaga; Igarashi, Akihito; Kamiya, Kazutaka

    2016-12-01

    To evaluate the subjective intraocular forward scattering and quality of vision after implantation of a posterior chamber phakic intraocular lens with a central hole (Hole ICL, STAAR Surgical). We prospectively examined 29 eyes of 29 consecutive patients (15 men and 14 women; age, 37.2 ± 8.8 years) undergoing Hole ICL implantation. We assessed the logarithmic straylight value [log(s)] using a straylight meter (C-Quant™, Oculus) preoperatively and 3 months postoperatively. The patients completed a questionnaire detailing symptoms on a quantitative grading scale (National Eye Institute Refractive Error Quality of Life Instrument-42; NEI RQL-42) 3 months postoperatively. We compared the preoperative and postoperative values of log(s) and evaluated the correlation of these values with patients' subjective symptoms. The mean log(s) did not change significantly, from 1.07 ± 0.20 preoperatively to 1.06 ± 0.17 postoperatively (Wilcoxon signed-rank test, p = 0.641). There was a significant correlation between the preoperative and postoperative log(s) (Spearman's correlation coefficient r = 0.695, p < 0.001). The postoperative log(s) was significantly associated with glare scores on the questionnaire (Spearman's correlation coefficient r = -0.575, p = 0.017). In our experience, Hole ICL implantation does not induce a significant additional change in subjective intraocular forward scattering. The symptom of glare after Hole ICL implantation was significantly correlated with the postoperative intraocular forward scattering in relation to the preoperative value. © 2016 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  12. Sonic Kayaks: Environmental monitoring and experimental music by citizens.

    PubMed

    Griffiths, Amber G F; Kemp, Kirsty M; Matthews, Kaffe; Garrett, Joanne K; Griffiths, David J

    2017-11-01

    The Sonic Kayak is a musical instrument used to investigate nature and developed during open hacklab events. The kayaks are rigged with underwater environmental sensors, which allow paddlers to hear real-time water temperature sonifications and underwater sounds, generating live music from the marine world. Sensor data is also logged every second with location, time and date, which allows for fine-scale mapping of water temperatures and underwater noise that was previously unattainable using standard research equipment. The system can be used as a citizen science data collection device, research equipment for professional scientists, or a sound art installation in its own right.

  13. Sonic Kayaks: Environmental monitoring and experimental music by citizens

    PubMed Central

    Kemp, Kirsty M.; Matthews, Kaffe; Garrett, Joanne K.; Griffiths, David J.

    2017-01-01

    The Sonic Kayak is a musical instrument used to investigate nature and developed during open hacklab events. The kayaks are rigged with underwater environmental sensors, which allow paddlers to hear real-time water temperature sonifications and underwater sounds, generating live music from the marine world. Sensor data is also logged every second with location, time and date, which allows for fine-scale mapping of water temperatures and underwater noise that was previously unattainable using standard research equipment. The system can be used as a citizen science data collection device, research equipment for professional scientists, or a sound art installation in its own right. PMID:29190283

  14. Stepwise shockwave velocity determinator

    NASA Technical Reports Server (NTRS)

    Roth, Timothy E.; Beeson, Harold

    1992-01-01

    To provide an uncomplicated and inexpensive method for measuring the far-field velocity of a surface shockwave produced by an explosion, a stepwise shockwave velocity determinator (SSVD) was developed. The velocity determinator is constructed of readily available materials and works on the principle of breaking discrete sensors composed of aluminum foil contacts. The discrete sensors have an average breaking threshold of approximately 7 kPa. An incremental output step of 250 mV is created with each foil contact breakage and is logged by analog-to-digital instrumentation. Velocity data obtained from the SSVD is within approximately 11 percent of the calculated surface shockwave velocity of a muzzle blast from a 30.06 rifle.
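    The SSVD's principle, dividing the known sensor spacing by the interval between successive foil breakages, can be sketched directly. The sensor positions and break times below are made-up illustration values, not data from the paper.

```python
def segment_velocities(positions_m, break_times_s):
    """Far-field shock velocity per segment: distance between consecutive
    sensors divided by the interval between their breakage timestamps
    (as logged by the analog-to-digital instrumentation)."""
    return [
        (positions_m[i + 1] - positions_m[i]) / (break_times_s[i + 1] - break_times_s[i])
        for i in range(len(positions_m) - 1)
    ]

# Hypothetical sensors 1 m apart, with breakage timestamps from the ADC log:
v = segment_velocities([0.0, 1.0, 2.0], [0.0, 0.002, 0.005])
# segment velocities in m/s (approx. 500 and 333), decreasing with distance
```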

  15. Induction logging device with a pair of mutually perpendicular bucking coils

    DOEpatents

    Koelle, Alfred R.; Landt, Jeremy A.

    1981-01-01

    An instrument is disclosed for mapping vertical conductive fractures in resistive bedrock. Eddy currents are magnetically induced by a pair of vertically oriented, mutually perpendicular, coplanar coils; these eddy currents drive magnetic fields that are picked up by a second, similar pair of coils.

  16. ASVCP guidelines: quality assurance for point-of-care testing in veterinary medicine.

    PubMed

    Flatland, Bente; Freeman, Kathleen P; Vap, Linda M; Harr, Kendal E

    2013-12-01

    Point-of-care testing (POCT) refers to any laboratory testing performed outside the conventional reference laboratory and implies close proximity to patients. Instrumental POCT systems consist of small, handheld or benchtop analyzers. These have potential utility in many veterinary settings, including private clinics, academic veterinary medical centers, the community (eg, remote area veterinary medical teams), and for research applications in academia, government, and industry. Concern about the quality of veterinary in-clinic testing has been expressed in published veterinary literature; however, little guidance focusing on POCT is available. Recognizing this void, the ASVCP formed a subcommittee in 2009 charged with developing quality assurance (QA) guidelines for veterinary POCT. Guidelines were developed through literature review and a consensus process. Major recommendations include (1) taking a formalized approach to POCT within the facility, (2) use of written policies, standard operating procedures, forms, and logs, (3) operator training, including periodic assessment of skills, (4) assessment of instrument analytical performance and use of both statistical quality control and external quality assessment programs, (5) use of properly established or validated reference intervals, (6) and ensuring accurate patient results reporting. Where possible, given instrument analytical performance, use of a validated 13s control rule for interpretation of control data is recommended. These guidelines are aimed at veterinarians and veterinary technicians seeking to improve management of POCT in their clinical or research setting, and address QA of small chemistry and hematology instruments. These guidelines are not intended to be all-inclusive; rather, they provide a minimum standard for maintenance of POCT instruments in the veterinary setting. © 2013 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
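    The recommended 1-3s control rule rejects an analytical run when a control result falls more than three standard deviations from the established control mean. A minimal sketch, with hypothetical control mean and SD values:

```python
def violates_13s(result: float, mean: float, sd: float) -> bool:
    """Westgard 1-3s rule: flag the run if a single control result lies
    outside mean +/- 3*SD of the established control material values."""
    return abs(result - mean) > 3.0 * sd

# Hypothetical glucose control material: mean 5.0 mmol/L, SD 0.2 mmol/L
print(violates_13s(5.3, 5.0, 0.2))  # -> False: within 3 SD, accept the run
print(violates_13s(5.7, 5.0, 0.2))  # -> True: 3.5 SD above the mean, reject
```

In practice the rule is applied to each control level and logged, alongside operator training records, as part of the facility's written QA procedures.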

  17. A novel quantitative reverse-transcription PCR (qRT-PCR) for the enumeration of total bacteria, using meat micro-flora as a model.

    PubMed

    Dolan, Anthony; Burgess, Catherine M; Barry, Thomas B; Fanning, Seamus; Duffy, Geraldine

    2009-04-01

    A sensitive quantitative reverse-transcription PCR (qRT-PCR) method was developed for the enumeration of total bacteria, using two sets of primers to separately target the ribonuclease-P (RNase P) RNA transcripts of gram-positive and gram-negative bacteria. Standard curves were generated using SYBR Green I kits for the LightCycler 2.0 instrument (Roche Diagnostics) to allow quantification of mixed microflora in liquid media. RNA standards were extracted from known cell equivalents and subsequently converted to cDNA for the construction of standard curves. The number of mixed bacteria in culture was determined by qRT-PCR, and the results correlated (r(2)=0.88, rsd=0.466) with the total viable count over the range from approximately log(10) 3 to log(10) 7 CFU ml(-1). The rapid nature of this assay (8 h) and its potential as an alternative to the standard plate count method for predicting total viable counts and shelf life are discussed.
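    Quantification from a standard curve reduces to fitting known cell equivalents (log10 CFU) against the instrument signal and inverting the fit for unknowns. The standard points below are invented for illustration, not the paper's data.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical standards: quantification cycle (Cq) versus log10 CFU/ml
log_cfu = [3.0, 4.0, 5.0, 6.0, 7.0]
cq = [30.1, 26.8, 23.5, 20.2, 16.9]
slope, intercept = fit_line(cq, log_cfu)   # fit in the inverted direction
print(round(slope * 22.0 + intercept, 2))  # -> 5.45 log CFU/ml at Cq 22.0
```

Reporting r-squared for the fit, as the paper does against plate counts, indicates how tightly the curve predicts viable counts over the working range.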

  18. TraceContract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Barringer, Howard

    2012-01-01

    TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events and can, for example, be generated by a running program instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics, including Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a novel combination of data-parameterized state machines and temporal logic. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
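    The data-parameterized state-machine idea can be illustrated with a small monitor that tracks state per data value across the trace. This is a Python sketch of the concept only, not TraceContract's actual Scala API; the event names and property are invented.

```python
# Conceptual sketch of trace analysis: check that every resource that is
# 'acquired' is later 'released' -- a data-parameterized safety property,
# where the monitor keeps one piece of state per resource name.
def check_trace(events):
    held = set()
    violations = []
    for kind, resource in events:
        if kind == "acquire":
            held.add(resource)
        elif kind == "release":
            if resource not in held:
                violations.append(f"release of {resource} without acquire")
            held.discard(resource)
    # at end of trace, anything still held violates the property
    violations.extend(f"{r} never released" for r in sorted(held))
    return violations

trace = [("acquire", "lockA"), ("acquire", "lockB"), ("release", "lockA")]
print(check_trace(trace))  # -> ['lockB never released']
```

In TraceContract the same property would be written as a Scala specification, with reactions optionally invoked when a violation is detected.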

  19. Development of a 3D log sawing optimization system for small sawmills in central Appalachia, US

    Treesearch

    Wenshu Lin; Jingxin Wang; Edward Thomas

    2011-01-01

    A 3D log sawing optimization system was developed to perform log generation, opening face determination, sawing simulation, and lumber grading using 3D modeling techniques. Heuristic and dynamic programming algorithms were used to determine opening face and grade sawing optimization. Positions and shapes of internal log defects were predicted using a model developed by...

  20. Standard weight (Ws) equations for four rare desert fishes

    USGS Publications Warehouse

    Didenko, A.V.; Bonar, Scott A.; Matter, W.J.

    2004-01-01

    Standard weight (Ws) equations have been used extensively to examine body condition in sport fishes. However, development of these equations for nongame fishes has only recently been emphasized. We used the regression-line-percentile technique to develop standard weight equations for four rare desert fishes: flannelmouth sucker Catostomus latipinnis, razorback sucker Xyrauchen texanus, roundtail chub Gila robusta, and humpback chub G. cypha. The Ws equation for flannelmouth suckers of 100-690 mm total length (TL) was developed from 17 populations: log10Ws = -5.180 + 3.068 log10TL. The Ws equation for razorback suckers of 110-885 mm TL was developed from 12 populations: log10Ws = -4.886 + 2.985 log10TL. The Ws equation for roundtail chub of 100-525 mm TL was developed from 20 populations: log10Ws = -5.065 + 3.015 log10TL. The Ws equation for humpback chub of 120-495 mm TL was developed from 9 populations: log10Ws = -5.278 + 3.096 log10TL. These equations meet criteria for acceptable standard weight indexes and can be used to calculate relative weight, an index of body condition.
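    The equations above can be applied directly; for example, using the flannelmouth sucker coefficients from the abstract (the observed weight of 600 g in the relative-weight step is a hypothetical value):

```python
import math

def standard_weight_g(total_length_mm: float, a: float, b: float) -> float:
    """Standard weight from log10(Ws) = a + b * log10(TL)."""
    return 10 ** (a + b * math.log10(total_length_mm))

# Flannelmouth sucker coefficients from the abstract (valid 100-690 mm TL):
ws = standard_weight_g(400, a=-5.180, b=3.068)

# Relative weight Wr compares an observed weight to the standard;
# the 600 g observed weight here is hypothetical.
wr = 100 * 600.0 / ws
```

Relative weight near 100 indicates a fish in typical condition for its length; values well below suggest poor condition.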

  1. Do the BSRI and PAQ really measure masculinity and femininity?

    PubMed

    Fernández, Juan; Coello, Ma Teresa

    2010-11-01

    The two most used instruments to assess masculinity (M) and femininity (F) are the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ). Two hypotheses will be tested: a) multidimensionality versus bidimensionality, and b) to what extent the two instruments, elaborated to measure the same constructs, classify subjects in the same way. Participants were 420 high school students, 198 women and 222 men, aged 12-15 years. Exploratory factor analysis and internal consistency analysis were carried out and log-linear models were tested. The data support a) the multidimensionality of both instruments and b) the lack of full concordance in the classification of persons according to the fourfold typology. Implications of the results are discussed regarding the supposed theory behind instrumentality/expressiveness and masculinity/femininity, as well as for the use of both instruments to classify different subjects into the four distinct types.
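    The "fourfold typology" referred to above is conventionally obtained by median splits on the M and F scale scores (masculine, feminine, androgynous, undifferentiated). A sketch of that classification step, with invented scores — the cut-off convention shown is the common median-split approach, not necessarily the exact procedure of this study:

    ```python
    from statistics import median

    # Median-split classification into the fourfold typology:
    # above-median M and F -> androgynous; only M -> masculine;
    # only F -> feminine; neither -> undifferentiated.
    def classify(m_scores, f_scores):
        m_med, f_med = median(m_scores), median(f_scores)
        types = []
        for m, f in zip(m_scores, f_scores):
            if m > m_med and f > f_med:
                types.append("androgynous")
            elif m > m_med:
                types.append("masculine")
            elif f > f_med:
                types.append("feminine")
            else:
                types.append("undifferentiated")
        return types

    print(classify([5.1, 3.0, 4.8, 2.9], [2.8, 5.0, 4.9, 2.7]))
    ```

    The study's concordance question amounts to running this classification on BSRI and PAQ scores for the same respondents and cross-tabulating the two typologies.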

  2. Using client-side event logging and path tracing to assess and improve the quality of web-based surveys.

    PubMed

    White, Thomas M; Hauan, Michael J

    2002-01-01

    Web-based data collection has considerable appeal. However, the quality of data collected using such instruments is often questionable. There can be systematic problems with the wording of the surveys and/or the means with which they are deployed. In unsupervised data collection, there are also concerns about whether subjects understand the questions, and whether they are answering honestly. This paper presents a schema for using client-side timestamps and traces of subjects' paths through instruments to detect problems with the definition of instruments and their deployment. We discuss two large, anonymous, web-based medical surveys as examples of the utility of this approach.
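    One simple use of client-side timestamps in the spirit of this approach is flagging items answered implausibly fast, a common sign that a question was not read. The event format and threshold below are our own illustration, not the paper's schema:

    ```python
    # Flag survey items answered faster than a plausibility threshold,
    # using client-side (item_id, shown_ts, answered_ts) timestamp logs.
    def flag_speeders(events, min_seconds=2.0):
        """Return the item_ids answered in less than min_seconds."""
        return [item for item, shown, answered in events
                if answered - shown < min_seconds]

    log = [("q1", 0.0, 5.2), ("q2", 5.2, 5.9), ("q3", 5.9, 12.0)]
    print(flag_speeders(log))  # ['q2']
    ```

    Path traces can be analyzed analogously, e.g. counting back-navigation or answer revisions per item to locate confusing questions.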

  3. The CALorimetric Electron Telescope (CALET) Launch and Early On-Orbit Performance

    NASA Astrophysics Data System (ADS)

    Guzik, T. Gregory; Calet Collaboration

    2016-03-01

    The CALET space experiment, developed by collaborators in Japan, Italy, and the United States, will study electrons to 20 TeV, gamma rays above 10 GeV, and nuclei with Z = 1 to 40 up to 1,000 TeV during a five-year mission on the International Space Station. The instrument consists of a particle charge identification module, a thin imaging calorimeter (3 r.l. in total) with tungsten plates interleaving scintillating fiber planes, and a thick calorimeter (27 r.l.) composed of lead tungstate logs. CALET has the depth, imaging capabilities, and energy resolution for excellent separation between hadrons, electrons, and gamma rays. The instrument was launched into orbit on August 19, 2015 and on August 25, 2015 was mounted as an attached payload on the International Space Station (ISS) Japanese Experiment Module - Exposed Facility (JEM-EF). The experiment has successfully completed on-orbit checkout and has now been transitioned to normal science operations. This presentation summarizes the instrument design, science goals, and early on-orbit performance. This effort is supported by NASA in the United States, by JAXA in Japan, and by ASI in Italy.

  4. SKALA, a log-periodic array antenna for the SKA-low instrument: design, simulations, tests and system considerations

    NASA Astrophysics Data System (ADS)

    de Lera Acedo, E.; Razavi-Ghods, N.; Troop, N.; Drought, N.; Faulkner, A. J.

    2015-10-01

    The very demanding requirements of the SKA-low instrument call for a challenging antenna design capable of delivering excellent performance in radiation patterns, impedance matching, polarization purity, cost, longevity, etc. This paper is devoted to the development (design and test of the first prototypes) of an active ultra-wideband antenna element for the low-frequency instrument of the SKA radio telescope. The antenna element and differential low noise amplifier described here were originally designed to cover the former SKA-low band (70-450 MHz), but the aim is now to cover the re-defined SKA-low band (50-350 MHz); furthermore, the antenna is capable of performing up to 650 MHz with the current design. The design is focused on maximum sensitivity in a wide field of view (+/- 45° from zenith) and low cross-polarization ratios. Furthermore, the size and cost of the element have to be kept to a minimum, as millions of these antennas will need to be deployed for the full SKA in very compact configurations. The primary focus of this paper is therefore to discuss various design implications for the SKA-low telescope.

  5. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that fire the application execution. For testing and verification, the environment of an application is abstracted using simplified models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution can be used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach that generates drivers and stubs based on values collected during runtime instead of using default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly, improving coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
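    The record-then-replay idea can be sketched compactly. This is our own Python illustration of the technique (JPF-Android targets Java/Android and its actual instrumentation differs): a wrapper logs return values during an instrumented run, and a generated stub replays them during verification, falling back to a default only when the recording runs out.

    ```python
    import functools

    RECORDED = {}   # method name -> list of return values seen at runtime

    def record(fn):
        """Instrumentation wrapper: log each call's return value."""
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            RECORDED.setdefault(fn.__name__, []).append(result)
            return result
        return wrapper

    def make_stub(name, default=None):
        """Stub that replays recorded values, then falls back to a default
        (the weakness of empty stubs that runtime collection mitigates)."""
        values = iter(RECORDED.get(name, []))
        def stub(*args, **kwargs):
            return next(values, default)
        return stub

    @record
    def get_device_id():
        return "emulator-5554"   # stand-in for a native library call

    get_device_id()              # instrumented run populates RECORDED
    stub = make_stub("get_device_id")
    print(stub())  # 'emulator-5554' — the recorded value, not a default
    ```

    During verification, code that branches on the device id now sees a realistic value and gets explored instead of crashing on `None`.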

  6. A new automated passive capillary lysimeter for logging real-time drainage water fluxes

    USDA-ARS?s Scientific Manuscript database

    Effective monitoring of chemical transport through the soil profile requires accurate and appropriate instrumentation to measure drainage water fluxes below the root zone of cropping system. The objectives of this study were to methodically describe in detail the construction and installation of a n...

  7. Wave-Ice Interaction and the Marginal Ice Zone

    DTIC Science & Technology

    2013-09-30

    concept, using a high-quality attitude and heading reference system ( AHRS ) together with an accurate twin-antennae GPS compass. The instruments logged...the AHRS parameters at 50Hz, together with GPS-derived fixes, heading (accurate to better than 1o) and velocities at 10Hz. The 30MB hourly files

  8. Item generation and pilot testing of the Comprehensive Professional Behaviours Development Log.

    PubMed

    Bartlett, Doreen J; Lucy, S Deborah; Bisbee, Leslie

    2006-01-01

    The purpose of this project was to generate and refine criteria for professional behaviors previously identified to be important for physical therapy practice and to develop and pilot test a new instrument, which we have called the Comprehensive Professional Behaviours Development Log (CPBDL). Items were generated from our previous work, the work of Warren May and his colleagues, a competency profile for entry-level physical therapists, our regulatory code of ethics, and an evaluation of clinical performance. A group of eight people, including recent graduates, clinical instructors and professional practice leaders, and faculty members, refined the items in two iterations using the Delphi process. The CPBDL contains nine key professional behaviors with a range of nine to 23 specific behavioral criteria for individuals to reflect on and to indicate the consistency of performance from a selection of "not at all," "sometimes," and "always" response options. Pilot testing with a group of 42 students in the final year of our entry-to-practice curriculum indicated that the criteria were clear, the measure was feasible to complete in a reasonable time frame, and there were no ceiling or floor effects. We believe that others, including health care educators and practicing professionals, might be interested in adapting the CPBDL in their own settings to enhance the professional behaviors of either students in preparation for entry to practice or clinicians wishing to demonstrate continuing competency to professional regulatory bodies.

  9. Financial and Economic Analysis of Reduced Impact Logging

    Treesearch

    Tom Holmes

    2016-01-01

    Concern regarding extensive damage to tropical forests resulting from logging increased dramatically after World War II when mechanized logging systems developed in industrialized countries were deployed in the tropics. As a consequence, tropical foresters began developing logging procedures that were more environmentally benign, and by the 1990s, these practices began...

  10. Life cycle performances of log wood applied for soil bioengineering constructions

    NASA Astrophysics Data System (ADS)

    Kalny, Gerda; Strauss-Sieberth, Alexandra; Strauss, Alfred; Rauch, Hans Peter

    2016-04-01

    Nowadays there is a high demand for engineering solutions that consider not only technical aspects but also ecological and aesthetic values. Soil bioengineering is a construction technique that uses biological components for hydraulic and civil engineering solutions. Soil bioengineering solutions are based on the application of living plants and other auxiliary materials, including log wood. This kind of construction material supports the soil bioengineering system until the plants, as living construction material, take over the stability function. It is therefore important to know about the durability and degradation process of wooden logs in order to retain the integral performance of a soil bioengineering system. These aspects are considered within the framework of the interdisciplinary research project "ELWIRA: Plants, wood, steel and concrete - life cycle performances as construction materials". Field investigations were conducted on soil bioengineering construction material, specifically European larch wood logs, from different soil bioengineering structures at the river Wien. The drilling resistance, as a parameter for particular material characteristics, was measured with a Rinntech Resistograph instrument at different positions of selected logs, each exposed to one of three different conditions: fully surrounded by air, with earth contact on one side, or near the water surface in wet-dry conditions. The age of the logs ranges from one year up to 20 years. Results show the progression of drilling resistance throughout the whole cross section as an indicator for assessing soil bioengineering construction material. Logs surrounded by air showed a higher drilling resistance than logs with earth contact and those exposed to wet-dry conditions. The functional capability of the wooden logs was then analysed and discussed in terms of different levels of degradation. The results contribute to a sustainable and resource-conserving handling of building materials in the construction and maintenance of soil bioengineering structures.

  11. Real-time use of the iPad by third-year medical students for clinical decision support and learning: a mixed methods study

    PubMed Central

    Nuss, Michelle A.; Hill, Janette R.; Cervero, Ronald M.; Gaines, Julie K.; Middendorf, Bruce F.

    2014-01-01

    Purpose Despite widespread use of mobile technology in medical education, medical students’ use of mobile technology for clinical decision support and learning is not well understood. Three key questions were explored in this extensive mixed methods study: 1) how medical students used mobile technology in the care of patients, 2) the mobile applications (apps) used, and 3) how expertise and time spent changed over time. Methods This year-long (July 2012–June 2013) mixed methods study explored the use of the iPad, using four data collection instruments: 1) beginning and end-of-year questionnaires, 2) iPad usage logs, 3) weekly rounding observations, and 4) weekly medical student interviews. Descriptive statistics were generated for the questionnaires and apps reported in the usage logs. The iPad usage logs, observation logs, and weekly interviews were analyzed via inductive thematic analysis. Results Students predominantly used mobile technology to obtain real-time patient data via the electronic health record (EHR), to access medical knowledge resources for learning, and to inform patient care. The top four apps used were Epocrates®, PDF Expert®, VisualDx®, and Micromedex®. The majority of students indicated that their use (71%) and expertise (75%) using mobile technology grew over time. Conclusions This mixed methods study provides substantial evidence that medical students used mobile technology for clinical decision support and learning. Integrating its use into the medical student's daily workflow was essential for achieving these outcomes. Developing expertise in using mobile technology and various apps was critical for effective and efficient support of real-time clinical decisions. PMID:25317266

  12. On coincident loop transient electromagnetic induction logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swidinsky, Andrei; Weiss, Chester J.

    Coincident loop transient induction wireline logging is examined as the borehole analog of the well-known land and airborne time-domain electromagnetic (EM) method. The concept of whole-space late-time apparent resistivity is modified from the half-space version commonly used in land and airborne geophysics and applied to the coincident loop voltages produced from various formation, borehole, and invasion models. Given typical tool diameters, off-time measurements with such an instrument must be made on the order of nanoseconds to microseconds, much more rapidly than for surface methods. Departure curves of the apparent resistivity for thin beds, calculated using an algorithm developed to model the transient response of a loop in a multilayered earth, indicate that the depth of investigation scales with the bed thickness. Modeled resistivity logs are comparable in accuracy and resolution with standard frequency-domain focused induction logs. However, if measurement times are longer than a few microseconds, the thicknesses of conductors can be overestimated, whereas resistors are underestimated. Thin-bed resolution characteristics are explained by visualizing snapshots of the EM fields in the formation, where a conductor traps the electric field while two current maxima are produced in the shoulder beds surrounding a resistor. Radial profiling is studied using a concentric cylinder earth model. Results show that true formation resistivity can be determined in the presence of either oil- or water-based mud, although in the latter case, measurements must be taken several orders of magnitude later in time. Lastly, the ability to determine true formation resistivity is governed by the degree to which the EM field heals after being distorted by borehole fluid and invasion, a process visualized and particularly evident in the case of conductive water-based mud.

  13. On coincident loop transient electromagnetic induction logging

    DOE PAGES

    Swidinsky, Andrei; Weiss, Chester J.

    2017-05-31

    Coincident loop transient induction wireline logging is examined as the borehole analog of the well-known land and airborne time-domain electromagnetic (EM) method. The concept of whole-space late-time apparent resistivity is modified from the half-space version commonly used in land and airborne geophysics and applied to the coincident loop voltages produced from various formation, borehole, and invasion models. Given typical tool diameters, off-time measurements with such an instrument must be made on the order of nanoseconds to microseconds, much more rapidly than for surface methods. Departure curves of the apparent resistivity for thin beds, calculated using an algorithm developed to model the transient response of a loop in a multilayered earth, indicate that the depth of investigation scales with the bed thickness. Modeled resistivity logs are comparable in accuracy and resolution with standard frequency-domain focused induction logs. However, if measurement times are longer than a few microseconds, the thicknesses of conductors can be overestimated, whereas resistors are underestimated. Thin-bed resolution characteristics are explained by visualizing snapshots of the EM fields in the formation, where a conductor traps the electric field while two current maxima are produced in the shoulder beds surrounding a resistor. Radial profiling is studied using a concentric cylinder earth model. Results show that true formation resistivity can be determined in the presence of either oil- or water-based mud, although in the latter case, measurements must be taken several orders of magnitude later in time. Lastly, the ability to determine true formation resistivity is governed by the degree to which the EM field heals after being distorted by borehole fluid and invasion, a process visualized and particularly evident in the case of conductive water-based mud.

  14. Development of a Real Time Internal Charging Tool for Geosynchronous Orbit

    NASA Technical Reports Server (NTRS)

    Posey, Nathaniel A.; Minow, Joseph I.

    2013-01-01

    The high-energy electron fluxes encountered by satellites in geosynchronous orbit pose a serious threat to onboard instrumentation and other circuitry. A substantial build-up of charge within a satellite's insulators can lead to electric fields in excess of the breakdown strength, which can result in destructive electrostatic discharges. The software tool we've developed uses data on the plasma environment taken from NOAA's GOES-13 satellite to track the resulting electric field strength within a material of arbitrary depth and conductivity and allows us to monitor the risk of material failure in real time. The tool also utilizes a transport algorithm to simulate the effects of shielding on the dielectric. Data on the plasma environment and the resulting electric fields are logged to allow for playback at a variable frame rate.
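    The field build-up the tool tracks can be illustrated with the standard single-layer dielectric charging model, where deposited current competes with conductive leakage: E(t) = jρ(1 − exp(−t/τ)) with relaxation time τ = ε0·εr·ρ. This is a textbook simplification for illustration, not necessarily the tool's actual algorithm, and the example numbers are invented:

    ```python
    import math

    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def internal_field(j, resistivity, eps_r, t):
        """Electric field (V/m) in a uniform dielectric under a constant
        deposited current density j (A/m^2), from the single-layer model
        E(t) = j*rho*(1 - exp(-t/tau)), tau = eps0*eps_r*rho."""
        tau = EPS0 * eps_r * resistivity
        return j * resistivity * (1.0 - math.exp(-t / tau))

    # Example: 1 pA/cm^2 (1e-8 A/m^2) deposited into a dielectric with
    # resistivity 1e16 ohm-m and eps_r = 2.
    tau = EPS0 * 2.0 * 1e16            # relaxation time, roughly two days
    print(f"tau = {tau/3600:.1f} h, "
          f"E after 10 h = {internal_field(1e-8, 1e16, 2.0, 36000):.2e} V/m")
    ```

    The steady-state field jρ (here 1e8 V/m) can exceed typical dielectric breakdown strengths, which is exactly the discharge risk the real-time tool monitors.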

  15. STRETCH IN A BOREHOLE CABLE AND ITS POSSIBLE EFFECTS ON THE OPERATION OF BOREHOLE LOGGING EQUIPMENT TYPE 1417A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, D.; Cole, H.A.G.

    1958-11-01

    A curious effect was observed during the operation of Borehole Logging Equipment Type 1417A in Rhodesia and Swaziland. With the probe submerged in water at depths greater than 150 ft, vertical motion or light jerking of the cable produced spurious high count rates on the ratemeter. It has been suggested that the voltage pulses which would be necessary to produce this effect may be caused by changes in the length, and hence capacitance, of the charged concentric cable connector produced under load conditions. This memorandum analyzes possible effects of cable stretch on instrument performance. (auth)

  16. Blazar Duty-Cycle at Gamma-Ray Frequencies: Constraints From Extragalactic Background Radiation And Prospects for AGILE And GLAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pittori, Carlotta; Cavazzuti, Elisabetta; Colafrancesco, Sergio

    2011-11-29

    We take into account the constraints from the observed extragalactic γ-ray background to estimate the maximum duty cycle allowed for a selected sample of WMAP Blazars, in order for them to be detectable by the AGILE and GLAST γ-ray experiments. For the nominal sensitivity values of both instruments, we identify a subset of sources which can in principle be detectable also in a steady state without over-predicting the extragalactic background. This work is based on the results of a recently derived Blazar radio LogN-LogS obtained by combining several multi-frequency surveys.

  17. Synopsis of a computer program designed to interface a personal computer with the fast data acquisition system of a time-of-flight mass spectrometer

    NASA Technical Reports Server (NTRS)

    Bechtel, R. D.; Mateos, M. A.; Lincoln, K. A.

    1988-01-01

    Briefly described are the essential features of a computer program designed to interface a personal computer with the fast, digital data acquisition system of a time-of-flight mass spectrometer. The instrumentation was developed to provide a time-resolved analysis of individual vapor pulses produced by the incidence of a pulsed laser beam on an ablative material. The high repetition rate spectrometer coupled to a fast transient recorder captures complete mass spectra every 20 to 35 microsecs, thereby providing the time resolution needed for the study of this sort of transient event. The program enables the computer to record the large amount of data generated by the system in short time intervals, and it provides the operator the immediate option of presenting the spectral data in several different formats. Furthermore, the system does this with a high degree of automation, including the tasks of mass labeling the spectra and logging pertinent instrumental parameters.
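    One of the automated tasks mentioned, mass labeling the spectra, rests on the TOF relation t = t0 + c·√(m/z): two known calibrant peaks fix t0 and c, after which every peak time gets a mass label. The sketch below illustrates this standard two-point calibration with invented peak times, not values from the instrument described:

    ```python
    import math

    def calibrate(t1, m1, t2, m2):
        """Solve t = t0 + c*sqrt(m) from two known (time, mass) calibrant
        peaks; returns (t0, c)."""
        c = (t2 - t1) / (math.sqrt(m2) - math.sqrt(m1))
        t0 = t1 - c * math.sqrt(m1)
        return t0, c

    def mass_of(t, t0, c):
        """Mass label for a peak observed at flight time t."""
        return ((t - t0) / c) ** 2

    # Hypothetical calibrants: m/z 18 at 4.35 us and m/z 44 at 6.70 us.
    t0, c = calibrate(4.35, 18.0, 6.70, 44.0)
    print(round(mass_of(5.0, t0, c), 1))  # label for an unknown 5.0-us peak
    ```

    In practice the calibration is applied to every spectrum in the run, which is what lets the software label masses automatically across thousands of 20 to 35 microsecond acquisition windows.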

  18. A new, low-cost sun photometer for student use

    NASA Astrophysics Data System (ADS)

    Espinoza, A.; Pérez-Álvarez, H.; Parra-Vilchis, J. I.; Fauchey-López, E.; Fernando-González, L.; Faus-Landeros, G. E.; Celarier, E. A.; Robinson, D. Q.; Zepeda-Galbez, R.

    2011-12-01

    We have designed a sun photometer for the measurement of aerosol optical thickness (AOT) at 505 nm and 620 nm, using custom-made glass filters (9.5 nm bandpass, FWHM) and photodiodes. The recommended price-point (US$150-US$200) allowed us to incorporate technologies such as microcontrollers, a sun target, a USB port for data uploading, nonvolatile memory to contain tables of up to 127 geolocation profiles, extensive calibration data, and a log of up to 2,000 measurements. The instrument is designed to be easy to use, and to provide instant display of AOT estimates. A diffuser in the fore-optics limits the sensitivity to pointing error. We have developed postprocessing software to refine the AOT estimates, format a spreadsheet file, and upload the data to the GLOBE website. We are currently finalizing hardware and firmware, and conducting extensive calibration/validation experiments. These instruments will soon be in production and available to the K-12 education community, including and especially the GLOBE program.
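    Sun-photometer AOT retrievals generally follow Beer's law: V = V0·exp(−m·τ_total), so τ_total = ln(V0/V)/m, and subtracting the calculable Rayleigh optical thickness leaves the aerosol part. A sketch of that arithmetic — the readings, calibration constant V0, and Rayleigh value below are illustrative, not this instrument's calibration:

    ```python
    import math

    def total_optical_thickness(v, v0, airmass):
        """Total vertical optical thickness from Beer's law:
        tau = ln(V0 / V) / m."""
        return math.log(v0 / v) / airmass

    def aot(v, v0, airmass, tau_rayleigh):
        """Aerosol optical thickness: total minus molecular (Rayleigh)
        scattering; gas absorption is neglected in this sketch."""
        return total_optical_thickness(v, v0, airmass) - tau_rayleigh

    # Example: detector reads 1.10 V against an extraterrestrial constant
    # V0 = 2.20 V at relative air mass 1.5; Rayleigh optical thickness is
    # roughly 0.14 near 505 nm at sea level.
    print(round(aot(1.10, 2.20, 1.5, 0.14), 3))
    ```

    V0 itself is typically obtained by Langley calibration (extrapolating ln V versus air mass to m = 0 on a clear, stable day), which is presumably part of the calibration/validation work described.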

  19. Development of regional stump-to-mill logging cost estimators

    Treesearch

    Chris B. LeDoux; John E. Baumgras

    1989-01-01

    Planning logging operations requires estimating the logging costs for the sale or tract being harvested. Decisions need to be made on equipment selection and its application to terrain. In this paper a methodology is described that has been developed and implemented to solve the problem of accurately estimating logging costs by region. The methodology blends field time...

  20. Tangential scanning of hardwood logs: developing an industrial computer tomography scanner

    Treesearch

    Nand K. Gupta; Daniel L. Schmoldt; Bruce Isaacson

    1999-01-01

    It is generally believed that noninvasive scanning of hardwood logs such as computer tomography (CT) scanning prior to initial breakdown will greatly improve the processing of logs into lumber. This belief, however, has not translated into rapid development and widespread installation of industrial CT scanners for log processing. The roadblock has been more operational...

  1. 14 CFR 61.129 - Aeronautical experience.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Aeronautical experience. (a) For an airplane single-engine rating. Except as provided in paragraph (i) of this...-engine class rating must log at least 250 hours of flight time as a pilot that consists of at least: (1... required on instrument training must be in a single engine airplane; (ii) 10 hours of training in an...

  2. 14 CFR 61.129 - Aeronautical experience.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Aeronautical experience. (a) For an airplane single-engine rating. Except as provided in paragraph (i) of this...-engine class rating must log at least 250 hours of flight time as a pilot that consists of at least: (1... required on instrument training must be in a single engine airplane; (ii) 10 hours of training in an...

  3. 14 CFR 61.129 - Aeronautical experience.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Aeronautical experience. (a) For an airplane single-engine rating. Except as provided in paragraph (i) of this...-engine class rating must log at least 250 hours of flight time as a pilot that consists of at least: (1... required on instrument training must be in a single engine airplane; (ii) 10 hours of training in an...

  4. 14 CFR 61.129 - Aeronautical experience.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Aeronautical experience. (a) For an airplane single-engine rating. Except as provided in paragraph (i) of this...-engine class rating must log at least 250 hours of flight time as a pilot that consists of at least: (1... required on instrument training must be in a single engine airplane; (ii) 10 hours of training in an...

  5. 14 CFR 61.129 - Aeronautical experience.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Aeronautical experience. (a) For an airplane single-engine rating. Except as provided in paragraph (i) of this...-engine class rating must log at least 250 hours of flight time as a pilot that consists of at least: (1... required on instrument training must be in a single engine airplane; (ii) 10 hours of training in an...

  6. 78 FR 16698 - Chemical Facility Anti-Terrorism Standards (CFATS) Chemical-Terrorism Vulnerability Information...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-18

    ...: Interested persons are invited to submit written comments on the proposed information collection to the... Standards. In 6 CFR 27.400, CFATS also establishes the requirements that covered persons must follow to... Tracking Log.'' The commenter assumed that the number of responses per respondent for this instrument was...

  7. Climate logging with a new rapid optical technique at siple dome

    USGS Publications Warehouse

    Bay, R.C.; Price, P.B.; Clow, G.D.; Gow, A.J.

    2001-01-01

    The dust logger design is based on a decade of experience in the use of light sources to measure optical properties of deep Antarctic ice. Light is emitted at the top of the instrument by side-directed LEDs, scattered or absorbed by dust in the ice surrounding the borehole, and collected in a downhole-pointing photomultiplier tube (PMT) a meter below. With this method the ice is sampled at ambient pressure in a much larger volume than is the case in a core study, and the entire length can be logged in one day. In ice in which scattering is dominated by bubbles, the absorption from dust impurities is perceived as a drop in signal, whereas in bubble-free ice the scattering from dust increases the light collected. We report on results obtained in Siple Dome Hole A in December 2000. The instrument measured increases in dust concentration extending over many meters during glacial maxima, as well as narrow spikes due to ~1 cm thick ash and dust bands of volcanic origin. Monte Carlo simulation is employed to clarify data analysis and predict the capabilities of future designs.

  8. The Motor Activity Log-28: assessing daily use of the hemiparetic arm after stroke.

    PubMed

    Uswatte, G; Taub, E; Morris, D; Light, K; Thompson, P A

    2006-10-10

    Data from monkeys with deafferented forelimbs and humans after stroke indicate that tests of the motor capacity of impaired extremities can overestimate their spontaneous use. Before the Motor Activity Log (MAL) was developed, no instruments assessed spontaneous use of a hemiparetic arm outside the treatment setting. To study the MAL's reliability and validity for assessing real-world quality of movement (QOM scale) and amount of use (AOU scale) of the hemiparetic arm in stroke survivors. Participants in a multisite clinical trial completed a 30-item MAL before and after treatment (n = 106) or an equivalent no-treatment period (n = 116). Participants also completed the Stroke Impact Scale (SIS) and wore accelerometers that monitored arm movement for three consecutive days outside the laboratory. All were 3 to 12 months post-stroke and had mild to moderate paresis of an upper extremity. After an item analysis, two MAL tasks were eliminated. Revised participant MAL QOM scores were reliable (r = 0.82). Validity was also supported. During the first observation period, the correlation between QOM and SIS Hand Function scale scores was 0.72. The corresponding correlation for QOM and accelerometry values was 0.52. Participant QOM and AOU scores were highly correlated (r = 0.92). The participant Motor Activity Log is reliable and valid in individuals with subacute stroke. It might be employed to assess the real-world effects of upper extremity neurorehabilitation and detect deficits in spontaneous use of the hemiparetic arm in daily life.

  9. ALOG: A spreadsheet-based program for generating artificial logs

    Treesearch

    Matthew F. Winn; Randolph H. Wynne; Philip A. Araman

    2004-01-01

    Log sawing simulation computer programs can be valuable tools for training sawyers as well as for testing different sawing patterns. Most available simulation programs rely on databases from which to draw logs and can be very costly and time-consuming to develop. ALOG (Artificial LOg Generator) is a Microsoft Excel®-based computer program that was developed to...

  10. Study on fracture identification of shale reservoir based on electrical imaging logging

    NASA Astrophysics Data System (ADS)

    Yu, Zhou; Lai, Fuqiang; Xu, Lei; Liu, Lin; Yu, Tong; Chen, Junyu; Zhu, Yuantong

    2017-05-01

    In recent years, shale gas exploration has made important progress and achieved major breakthroughs, in which the study of mud-shale fractures is extremely important: fracture development plays an important role in the development of gas reservoirs. Based on core observation and the analysis of laboratory thin sections and related laboratory data, this paper classifies the lithology of the shale reservoirs of the XX well in the Zhanhua Depression. Starting from the response of mudstone fractures on logging curves, the relationship between fracture development and logging response is established, and conventional logging is combined with electrical imaging logging to identify fractures, determine the fracture types in the area, and quantitatively calculate fracture parameters. Analysis of the XX well in the Zhanhua Depression indicates that high-angle fractures and microfractures are developed in the study area, and that fracture morphology can be clearly observed with imaging logging technology to determine fracture type.

  11. Rolling Deck to Repository I: Designing a Database Infrastructure

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Miller, S. P.; Chandler, C. L.; Ferrini, V. L.; O'Hara, S. H.

    2008-12-01

    The NSF-supported academic research fleet collectively produces a large and diverse volume of scientific data, which are increasingly being shared across disciplines and contributed to regional and global syntheses. As both Internet connectivity and storage technology improve, it becomes practical for ships to routinely deliver data and documentation for a standard suite of underway instruments to a central shoreside repository. Routine delivery will facilitate data discovery and integration, quality assessment, cruise planning, compliance with funding agency and clearance requirements, and long-term data preservation. We are working collaboratively with ship operators and data managers to develop a prototype "data discovery system" for NSF-supported research vessels. Our goal is to establish infrastructure for a central shoreside repository, and to develop and test procedures for the routine delivery of standard data products and documentation to the repository. Related efforts are underway to identify tools and criteria for quality control of standard data products, and to develop standard interfaces and procedures for maintaining an underway event log. Development of a shoreside repository infrastructure will include: 1. Deployment and testing of a central catalog that holds cruise summaries and vessel profiles. A cruise summary will capture the essential details of a research expedition (operating institution, ports/dates, personnel, data inventory, etc.), as well as related documentation such as event logs and technical reports. A vessel profile will capture the essential details of a ship's installed instruments (manufacturer, model, serial number, reference location, etc.), with version control as the profile changes through time. The catalog's relational database schema will be based on the UNOLS Data Best Practices Committee's recommendations, and published as a formal XML specification. 2. 
Deployment and testing of a central repository that holds navigation and routine underway data. Based on discussion with ship operators and data managers at a workgroup meeting in September 2008, we anticipate that a subset of underway data could be delivered from ships to the central repository in near- realtime - enabling the integrated display of ship tracks at a public Web portal, for example - and a full data package could be delivered post-cruise by network transfer or disk shipment. Once ashore, data sets could be distributed to assembly centers such as the Shipboard Automated Meteorological and Oceanographic System (SAMOS) for routine processing, quality assessment, and synthesis efforts - as well as transmitted to national data centers such as NODC and NGDC for permanent archival. 3. Deployment and testing of a basic suite of Web services to make cruise summaries, vessel profiles, event logs, and navigation data easily available. A standard set of catalog records, maps, and navigation features will be published via the Open Archives Initiative (OAI) and Open Geospatial Consortium (OGC) protocols, which can then be harvested by partner data centers and/or embedded in client applications.

  12. Characterization and Application of Passive Samplers for Monitoring of Pesticides in Water.

    PubMed

    Ahrens, Lutz; Daneshvar, Atlasi; Lau, Anna E; Kreuger, Jenny

    2016-08-03

    Five different water passive samplers were calibrated under laboratory conditions for the measurement of 124 legacy and currently used pesticides. This study provides a protocol for passive sampler preparation, calibration, extraction, and instrumental analysis. Sampling rates (RS) and passive sampler-water partition coefficients (KPW) were calculated for silicone rubber, the polar organic chemical integrative samplers POCIS-A and POCIS-B, SDB-RPS, and C18 disks. The uptake of the selected compounds depended on their physicochemical properties; i.e., silicone rubber showed better uptake for more hydrophobic compounds (log octanol-water partition coefficient (KOW) > 5.3), whereas the POCIS-A, POCIS-B and SDB-RPS disks were more suitable for hydrophilic compounds (log KOW < 0.70).
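In the linear (integrative) uptake regime that such calibrations target, a time-weighted average water concentration follows from the sampling rate as Cw = N / (RS · t). A minimal sketch of this relation; the accumulated mass, RS value, and deployment time below are hypothetical illustrations, not data from the study:

```python
def twa_concentration(n_accumulated_ng, sampling_rate_l_per_day, deploy_days):
    """Time-weighted average water concentration (ng/L) in the
    linear (integrative) uptake regime: Cw = N / (Rs * t)."""
    if sampling_rate_l_per_day <= 0 or deploy_days <= 0:
        raise ValueError("Rs and t must be positive")
    return n_accumulated_ng / (sampling_rate_l_per_day * deploy_days)

# Hypothetical example: 42 ng accumulated over 14 days at Rs = 0.15 L/day
cw = twa_concentration(42.0, 0.15, 14)   # ~20 ng/L
```

Outside the linear regime (as equilibrium with KPW is approached), this simple quotient no longer applies.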

  13. Reinforcement of drinking by running: effect of fixed ratio and reinforcement time

    PubMed Central

    Premack, David; Schaeffer, Robert W.; Hundt, Alan

    1964-01-01

    Rats were required to complete varying numbers of licks (FR), ranging from 10 to 300, in order to free an activity wheel for predetermined times (CT) ranging from 2 to 20 sec. The reinforcement of drinking by running was shown both by an increased frequency of licking, and by changes in length of the burst of licking relative to operant-level burst length. In log-log coordinates, instrumental licking tended to be a linear increasing function of FR for the range tested, a linear decreasing function of CT for the range tested. Pause time was implicated in both of the above relations, being a generally increasing function of both FR and CT. PMID:14120150

  14. REINFORCEMENT OF DRINKING BY RUNNING: EFFECT OF FIXED RATIO AND REINFORCEMENT TIME.

    PubMed

    PREMACK, D; SCHAEFFER, R W; HUNDT, A

    1964-01-01

    Rats were required to complete varying numbers of licks (FR), ranging from 10 to 300, in order to free an activity wheel for predetermined times (CT) ranging from 2 to 20 sec. The reinforcement of drinking by running was shown both by an increased frequency of licking, and by changes in length of the burst of licking relative to operant-level burst length. In log-log coordinates, instrumental licking tended to be a linear increasing function of FR for the range tested, a linear decreasing function of CT for the range tested. Pause time was implicated in both of the above relations, being a generally increasing function of both FR and CT.

  15. Spectral density measurements of gyro noise

    NASA Technical Reports Server (NTRS)

    Truncale, A.; Koenigsberg, W.; Harris, R.

    1972-01-01

    Power spectral density (PSD) was used to analyze the outputs of several gyros in the frequency range from 0.01 to 200 Hz. Data were accumulated on eight inertial-quality instruments. The results are described in terms of input angle noise (arcsec^2/Hz) and are presented on log-log plots of PSD. These data show that the standard deviation of measurement noise was 0.01 arcsec or less for some gyros in the passband from 1 Hz down to 0.01 Hz, and probably down to 0.001 Hz for at least one gyro. For the passband between 1 and 100 Hz, uncertainties in the 0.01 to 0.05 arcsec region were observed.

  16. An integrated 3D log processing optimization system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang

    2013-01-01

    An integrated 3D log processing optimization system was developed to perform 3D log generation, opening face determination, headrig log sawing simulation, flitch edging and trimming simulation, cant resawing, and lumber grading. A circular cross-section model, together with 3D modeling techniques, was used to reconstruct 3D virtual logs. Internal log defects (knots)...

  17. Portable Chemical Sterilizer for Microbial Decontamination of Surgical Instruments, Fruits and Vegetables, and Field Feeding Equipment

    DTIC Science & Technology

    2006-11-01

    spores of B. stearothermophilus . For all of the test organisms, conditions were found that effected sterilization (6-log kill of contaminating...kill 106 E. coli, L. monocytogenes, S. aureus, and bacterial spores of B. atrophaeus and B. stearothermophilus and to sterilize high-grade...Portable Chemical Sterilizer for Microbial Decontamination of

  18. Assessing the Validity of an Annual Survey for Measuring the Enacted Literacy Curriculum

    ERIC Educational Resources Information Center

    Camburn, Eric M.; Han, Seong Won; Sebastian, James

    2017-01-01

    Surveys are frequently used to inform consequential decisions about teachers, policies, and programs. Consequently, it is important to understand the validity of these instruments. This study assesses the validity of measures of instruction captured by an annual survey by comparing survey data with those of a validated daily log. The two…

  19. nStudy: A System for Researching Information Problem Solving

    ERIC Educational Resources Information Center

    Winne, Philip H.; Nesbit, John C.; Popowich, Fred

    2017-01-01

    A bottleneck in gathering big data about learning is instrumentation designed to record data about processes students use to learn and information on which those processes operate. The software system nStudy fills this gap. nStudy is an extension to the Chrome web browser plus a server side database for logged trace data plus peripheral modules…

  20. Absolute Risk Aversion and the Returns to Education.

    ERIC Educational Resources Information Center

    Brunello, Giorgio

    2002-01-01

    Uses 1995 Italian household income and wealth survey to measure individual absolute risk aversion of 1,583 married Italian male household heads. Uses this measure as an instrument for attained education in a standard log-earnings equation. Finds that the IV estimate of the marginal return to schooling is much higher than the ordinary least squares…

  1. Frequency-agile wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Arms, Steven W.; Townsend, Christopher P.; Churchill, David L.; Hamel, Michael J.; Galbreath, Jacob H.; Mundell, Steven W.

    2004-07-01

    Our goal was to demonstrate a wireless communications system capable of simultaneous, high-speed data communications from a variety of sensors. We have previously reported on the design and application of 2 kHz data-logging transceiver nodes; however, only one node may stream data at a time, since all nodes on the network use the same communications frequency. To overcome these limitations, second-generation data-logging transceivers were developed with software-programmable radio frequency (RF) communications. Each node contains on-board memory (2 Mbytes), sensor excitation, instrumentation amplifiers with programmable gains and offsets, a multiplexer, a 16-bit A/D converter, a microcontroller, and a frequency-agile, bi-directional, frequency-shift-keyed (FSK) RF serial data link. These systems are capable of continuous data transmission from 26 distinct nodes (902-928 MHz band, 75 kbaud). The system was demonstrated in a compelling structural monitoring application. The National Park Service requested a means for continual monitoring and recording of sensor data from the Liberty Bell during a move to a new location (Philadelphia, October 2003). Three distinct, frequency-agile, wireless sensing nodes were used to detect visible crack shear/opening micromotions, triaxial accelerations, and hairline crack tip strains. The wireless sensors proved to be useful in protecting the Liberty Bell.

  2. An Interactive Multi-instrument Database of Solar Flares

    NASA Astrophysics Data System (ADS)

    Sadykov, Viacheslav M.; Kosovichev, Alexander G.; Oria, Vincent; Nita, Gelu M.

    2017-07-01

    Solar flares are complicated physical phenomena that are observable in a broad range of the electromagnetic spectrum, from radio waves to γ-rays. For a more comprehensive understanding of flares, it is necessary to perform a combined multi-wavelength analysis using observations from many satellites and ground-based observatories. For an efficient data search, integration of different flare lists, and representation of observational data, we have developed the Interactive Multi-Instrument Database of Solar Flares (IMIDSF, https://solarflare.njit.edu/). The web-accessible database is fully functional and allows the user to search for uniquely identified flare events based on their physical descriptors and the availability of observations by a particular set of instruments. Currently, the data from three primary flare lists (Geostationary Operational Environmental Satellites, RHESSI, and HEK) and a variety of other event catalogs (Hinode, Fermi GBM, Konus-WIND, the OVSA flare catalogs, the CACTus CME catalog, the Filament eruption catalog) and observing logs (IRIS and Nobeyama coverage) are integrated, and an additional set of physical descriptors (temperature and emission measure) is provided along with an observing summary, data links, and multi-wavelength light curves for each flare event since 2002 January. We envision that this new tool will allow researchers to significantly speed up the search of events of interest for statistical and case studies.

  3. Log Books and the Law of Storms: Maritime Meteorology and the British Admiralty in the Nineteenth Century.

    PubMed

    Naylor, Simon

    2015-12-01

    This essay contributes to debates about the relationship between science and the military by examining the British Admiralty's participation in meteorological projects in the first half of the nineteenth century. It focuses on attempts to transform Royal Navy log books into standardized meteorological registers that would be of use to both science and the state. The essay begins with a discussion of Admiralty Hydrographer Francis Beaufort, who promoted the use of standardized systems for the observation of the weather at sea. It then examines the application of ships' logs to the science of storms. The essay focuses on the Army engineer William Reid, who studied hurricanes while stationed in Barbados and Bermuda. Reid was instrumental in persuading the Admiralty to implement a naval meteorological policy, something the Admiralty Hydrographer had struggled to achieve. The essay uses the reception and adoption of work on storms at sea to reflect on the means and ends of maritime meteorology in the mid-nineteenth century.

  4. Program Instrumentation and Trace Analysis

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Goldberg, Allen; Filman, Robert; Rosu, Grigore; Koga, Dennis (Technical Monitor)

    2002-01-01

    Several attempts have been made recently to apply techniques such as model checking and theorem proving to the analysis of programs. This reflects a current trend to analyze real software systems instead of just their designs. It includes our own effort to develop a model checker for Java, the Java PathFinder 1, one of the very first of its kind in 1998. However, model checking cannot handle very large programs without some kind of abstraction of the program. This paper describes a complementary, scalable technique for handling such large programs. Our interest is in the observation part of the equation: how much information can be extracted about a program from observing a single execution trace? Our intention is to develop a technology that can be applied automatically to large, full-size applications with minimal modification to the code. We present a tool, Java PathExplorer (JPaX), for exploring execution traces of Java programs. The tool prioritizes scalability over completeness and is directed towards detecting errors in programs, not proving correctness. One core element of JPaX is an instrumentation package that allows Java byte code files to be instrumented to log various events when executed. The instrumentation is driven by a user-provided script that specifies what information to log. Examples of instructions that such a script can contain are: 'report the name and arguments of all called methods defined in class C, together with a timestamp'; 'report all updates to all variables'; and 'report all acquisitions and releases of locks'. In more complex instructions one can specify that certain expressions should be evaluated, and even that certain code should be executed, under various conditions. 
The instrumentation package can hence be seen as implementing aspect-oriented programming for Java, in the sense that one can add functionality to a Java program without explicitly changing the code of the original program: one writes an aspect and compiles it into the original program using the instrumentation. Another core element of JPaX is an observation package that supports the analysis of the generated event stream. Two kinds of analysis are currently supported. In temporal analysis the execution trace is evaluated against formulae written in temporal logic. We have implemented a temporal logic evaluator on finite traces using the Maude rewriting system from SRI International, USA. Temporal logic is defined in Maude by giving its syntax as a signature and its semantics as rewrite equations. The resulting semantics is extremely efficient and can handle event streams of hundreds of millions of events in a few minutes. Furthermore, the implementation is very succinct. The second form of event stream analysis supported is error pattern analysis, in which an execution trace is analyzed using various error detection algorithms that can identify error-prone programming practices that may potentially lead to errors in other executions. Two such algorithms focusing on concurrency errors have been implemented in JPaX, one for deadlocks and the other for data races. It is important to note that a deadlock or data race does not need to occur in order for its potential to be detected with these algorithms. This is what makes them very scalable in practice. The data race algorithm implemented is the Eraser algorithm from Compaq, adapted to Java. The tool is currently being applied by the developers of a code base for controlling a spacecraft in order to evaluate its applicability.
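The error-pattern-analysis idea — flagging a deadlock potential from a single trace in which no deadlock actually occurred — can be illustrated with a simplified lock-order check over a logged event stream. JPaX itself is a Java tool; this Python sketch assumes a hypothetical event format of (thread, operation, lock) tuples, not JPaX's actual log format:

```python
from collections import defaultdict

def lock_order_cycles(trace):
    """Detect deadlock potential from a logged event stream.
    Events are (thread, op, lock) with op in {"acquire", "release"}.
    Builds a lock-order relation: A -> B when some thread acquires B
    while holding A; two locks taken in both orders mean a deadlock is
    possible even if none occurred on this particular run."""
    held = defaultdict(list)   # thread -> stack of currently held locks
    order = defaultdict(set)   # lock -> locks acquired while holding it
    for thread, op, lock in trace:
        if op == "acquire":
            for h in held[thread]:
                order[h].add(lock)
            held[thread].append(lock)
        elif op == "release":
            held[thread].remove(lock)
    # report every pair of locks taken in both orders (a 2-cycle)
    return sorted({tuple(sorted((a, b)))
                   for a in order for b in order[a] if a in order.get(b, ())})

trace = [("T1", "acquire", "L1"), ("T1", "acquire", "L2"),
         ("T1", "release", "L2"), ("T1", "release", "L1"),
         ("T2", "acquire", "L2"), ("T2", "acquire", "L1"),
         ("T2", "release", "L1"), ("T2", "release", "L2")]
# The two threads take L1 and L2 in opposite orders -> potential deadlock
cycles = lock_order_cycles(trace)
```

The JPaX deadlock algorithm also handles longer cycles and gate locks; this sketch shows only the core single-trace principle.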

  5. A Guide to Hardwood Log Grading

    Treesearch

    Everette D. Rast; David L. Sonderman; Glenn L. Gammon

    1973-01-01

    A guide to hardwood log grading (revised) was developed as a teaching aid and field reference in grading hardwood logs. Outlines basic principles and gives detailed practical applications, with illustrations, in grading hardwood logs. Includes standards for various use classes.

  6. Integrated NMR Core and Log Investigations With Respect to ODP LEG 204

    NASA Astrophysics Data System (ADS)

    Arnold, J.; Pechnig, R.; Clauser, C.; Anferova, S.; Blümich, B.

    2005-12-01

    NMR techniques are widely used in the oil industry and are among the most suitable methods for evaluating in-situ formation porosity and permeability. Recently, efforts have been directed towards adapting NMR methods to the Ocean Drilling Program (ODP) and the upcoming Integrated Ocean Drilling Program (IODP). We apply a newly developed light-weight, mobile NMR core scanner as a non-destructive instrument to routinely determine rock porosity and to estimate the pore size distribution. The NMR core scanner is used for transverse relaxation measurements on water-saturated core sections using a CPMG sequence with a short echo time. A regularized Laplace-transform analysis yields the distribution of transverse relaxation times T2. In homogeneous magnetic fields, T2 is proportional to the pore diameter of rocks. Hence, the T2 signal maps the pore-size distribution of the studied rock samples. For fully saturated samples, the integral of the distribution curve and the CPMG echo amplitude extrapolated to zero echo time are proportional to porosity. Preliminary results show that the NMR core scanner is a suitable tool to determine rock porosity and to estimate the pore size distribution of limestones and sandstones. Our investigations presently focus on Leg 204, where NMR Logging-While-Drilling (LWD) was performed for the first time in ODP. Leg 204 was drilled into Hydrate Ridge on the Cascadia accretionary margin, offshore Oregon. All drilling and logging operations were highly successful, providing excellent core, wireline, and LWD data from adjacent boreholes. Cores recovered during Leg 204 consist mainly of clay and claystone. As the NMR core scanner operates at frequencies higher than that of the well-logging sensor, it has a shorter dead time. This advantage makes the NMR core scanner sensitive to signals with T2 values down to 0.1 ms, as compared to 3 ms in NMR logging. Hence, we can study even rocks with small pores, such as the mud cores recovered during Leg 204. 
We present a comparison of data from core scanning and NMR logging. Future integration of conventional wireline data and electrical borehole wall images (RAB/FMS) will provide a detailed characterization of the sediments in terms of lithology, petrophysics, and fluid-flow properties.
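The extrapolation of the CPMG echo amplitude to zero echo time can be illustrated with a single-exponential fit — a deliberate simplification of the regularized Laplace-transform analysis described above, which resolves a full T2 distribution rather than one decay constant:

```python
import math

def extrapolate_m0(times_ms, amplitudes):
    """Estimate the CPMG amplitude at zero echo time (proportional to
    porosity for a fully saturated sample) by a log-linear least-squares
    fit, assuming a single-exponential decay M(t) = M0 * exp(-t/T2).
    Returns (M0, T2_ms)."""
    xs = list(times_ms)
    ys = [math.log(a) for a in amplitudes]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    intercept = ybar - slope * xbar
    return math.exp(intercept), -1.0 / slope

# Synthetic echo train: M0 = 100, T2 = 50 ms, echoes every 2 ms
times = [2.0 * k for k in range(1, 11)]
echoes = [100.0 * math.exp(-t / 50.0) for t in times]
m0, t2 = extrapolate_m0(times, echoes)
```

Real mud and claystone cores decay multi-exponentially, which is exactly why the scanner's short dead time and the Laplace inversion matter; the mono-exponential fit is only the conceptual skeleton.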

  7. Application of Fracture Distribution Prediction Model in Xihu Depression of East China Sea

    NASA Astrophysics Data System (ADS)

    Yan, Weifeng; Duan, Feifei; Zhang, Le; Li, Ming

    2018-02-01

    Fractures produce characteristic responses and outliers in each type of logging data as formation characteristics change. The development of fractures in a formation can therefore be characterized by fine analysis of the logging curves. Conventional well logs such as resistivity, sonic transit time, density, neutron porosity, and gamma ray are sensitive to formation fractures. The traditional fracture prediction model, which computes a comprehensive fracture index as a simple weighted average of different logging data, is susceptible to subjective factors and can show large deviations; a statistical method is therefore introduced. Combining the responses of conventional logging data to fracture development, a prediction model based on membership functions is established; in essence, the logging data are analyzed with fuzzy mathematics. Fracture predictions for a well in the NX block of the Xihu Depression from the two models were compared with imaging logging, which shows that the membership-function model is more accurate than the traditional model. Furthermore, its predictions are highly consistent with the imaging logs and better reflect fracture development. It can provide a reference for engineering practice.
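The membership-function approach can be sketched as follows. The curve names, thresholds, and equal weighting below are illustrative assumptions for the sketch, not the model calibrated in the study:

```python
def membership(value, lo, hi, increasing=True):
    """Linear (ramp) membership in [0, 1]; the lo/hi thresholds are
    illustrative and would be calibrated per curve in practice."""
    if hi == lo:
        raise ValueError("hi must differ from lo")
    m = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return m if increasing else 1.0 - m

def fracture_index(point):
    """Combine per-curve memberships into a comprehensive fracture index.
    Hypothetical inputs: resistivity drop (%), sonic transit time
    (us/ft, increases near fractures), bulk density (g/cc, decreases)."""
    m_rt  = membership(point["res_drop_pct"], 5, 40)
    m_dt  = membership(point["dt_us_per_ft"], 60, 120)
    m_den = membership(point["rhob_g_cc"], 2.2, 2.7, increasing=False)
    return (m_rt + m_dt + m_den) / 3.0   # equal weights for the sketch

idx = fracture_index({"res_drop_pct": 30.0,
                      "dt_us_per_ft": 90.0,
                      "rhob_g_cc": 2.45})
```

Unlike a fixed weighted average of raw curve values, the memberships first map each response onto a common [0, 1] scale, which is the point of the fuzzy-mathematics treatment.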

  8. Reduction of lithologic-log data to numbers for use in the digital computer

    USGS Publications Warehouse

    Morgan, C.O.; McNellis, J.M.

    1971-01-01

    The development of a standardized system for conveniently coding lithologic-log data for use in the digital computer has long been needed. The technique suggested involves a reduction of the original written alphanumeric log to a numeric log by use of computer programs. This numeric log can then be retrieved as a written log, interrogated for pertinent information, or analyzed statistically. ?? 1971 Plenum Publishing Corporation.
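The reduction of a written log to numeric records can be sketched as follows. The code table below is a hypothetical illustration and does not reproduce the report's actual numbering scheme:

```python
# Hypothetical lithology code table (the report's real scheme differs).
LITHOLOGY_CODES = {"clay": 10, "silt": 20, "sand": 30,
                   "gravel": 40, "limestone": 50, "shale": 60}

def encode_log(intervals):
    """Reduce a written lithologic log of (top_ft, bottom_ft, description)
    entries to numeric (top, bottom, code) records suitable for machine
    storage, retrieval as a written log, or statistical analysis."""
    records = []
    for top, bottom, desc in intervals:
        word = desc.lower().split(",")[0].strip()   # leading lithology term
        code = LITHOLOGY_CODES.get(word, 0)         # 0 = unclassified
        records.append((top, bottom, code))
    return records

log = [(0, 12, "clay, brown, soft"),
       (12, 30, "sand, fine"),
       (30, 41, "shale, gray")]
numeric = encode_log(log)
```

The inverse step — regenerating a written log from the numeric records — is a lookup in the same table, which is what makes the encoding reversible.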

  9. Development of telescope control system for the 50cm telescope of UC Observatory Santa Martina

    NASA Astrophysics Data System (ADS)

    Shen, Tzu-Chiang; Soto, Ruben; Reveco, Johnny; Vanzi, Leonardo; Fernández, Jose M.; Escarate, Pedro; Suc, Vincent

    2012-09-01

    The main telescope of the UC Observatory Santa Martina is a 50cm optical telescope donated by ESO to Pontificia Universidad Catolica de Chile. During the past years the telescope has been refurbished and used as the main facility for testing and validating new instruments under construction by the Center of Astro-Engineering UC. As part of this work, the need to develop a more efficient and flexible control system arose. The new distributed control system has been developed on top of the Internet Communication Engine (ICE), a framework developed by ZeroC Inc. This framework features a lightweight but powerful and flexible inter-process communication infrastructure and provides bindings to classic and modern programming languages such as C/C++, Java, C#, Ruby, Objective-C, etc. The result of this work shows ICE to be a real alternative to CORBA and other de facto distributed programming frameworks. A classical control software architecture has been chosen, comprising an observation control system (OCS), the orchestrator of the observation, which controls the telescope control system (TCS) and the detector control system (DCS). The real-time control and monitoring system is deployed and running on ARM-based single-board computers. Other features, such as logging and configuration services, have been developed as well. Interoperation with other main astronomical control frameworks is foreseen in order to achieve a smooth integration of instruments when they are integrated into the main observatories in the north of Chile.

  10. The Seven-Segment Data Logger

    NASA Astrophysics Data System (ADS)

    Bates, Alan

    2015-12-01

    Instruments or digital meters with data values visible on a seven-segment display can easily be found in the physics lab. Examples include multimeters, sound level meters, Geiger-Müller counters and electromagnetic field meters, where the display is used to show numerical data. Such instruments, without the ability to connect to computers or data loggers, can measure and display data at a particular instant in time. The user should be present to read the display and to record the data. Unlike these digital meters, the sensor-data logger system has the advantage of automatically measuring and recording data at selectable sample rates over a desired sample time. The process of adding data logging features to a digital meter with a seven-segment display can be achieved with Seven Segment Optical Character Recognition (SSOCR) software. One might ask, why not just purchase a field meter with data logging features? They are relatively inexpensive, reliable, available online, and can be delivered within a few days. But then there is the challenge of making your own instrument, the excitement of implementing a design, the pleasure of experiencing an entire process from concept to product, and the satisfaction of avoiding costs by taking advantage of available technology. This experiment makes use of an electromagnetic field meter with a seven-segment liquid crystal display to measure background electromagnetic field intensity. Images of the meter display are automatically captured with a camera and analyzed using SSOCR to produce a text file containing meter display values.
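Once the image-processing step has classified each of a digit's seven segments as lit or unlit, the recognition itself reduces to a table lookup. A minimal sketch; the segment ordering is one common convention, not necessarily the one used by any particular SSOCR package:

```python
# Segment order: (a, b, c, d, e, f, g) = top, upper-right, lower-right,
# bottom, lower-left, upper-left, middle.
SEGMENT_DIGITS = {
    (1, 1, 1, 1, 1, 1, 0): "0", (0, 1, 1, 0, 0, 0, 0): "1",
    (1, 1, 0, 1, 1, 0, 1): "2", (1, 1, 1, 1, 0, 0, 1): "3",
    (0, 1, 1, 0, 0, 1, 1): "4", (1, 0, 1, 1, 0, 1, 1): "5",
    (1, 0, 1, 1, 1, 1, 1): "6", (1, 1, 1, 0, 0, 0, 0): "7",
    (1, 1, 1, 1, 1, 1, 1): "8", (1, 1, 1, 1, 0, 1, 1): "9",
}

def decode_display(segment_states):
    """Map per-digit segment on/off tuples to a reading string;
    '?' marks an unrecognized pattern (e.g. a frame captured while
    the liquid crystal display was mid-update)."""
    return "".join(SEGMENT_DIGITS.get(tuple(s), "?") for s in segment_states)

reading = decode_display([(0, 1, 1, 0, 0, 0, 0),   # digit 1
                          (1, 1, 0, 1, 1, 0, 1),   # digit 2
                          (1, 1, 1, 1, 0, 1, 1)])  # digit 9
```

The hard part of SSOCR is locating the digits and thresholding the segments in the camera image; the lookup above is the final, trivial step.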

  11. Well network installation and hydrogeologic data collection, Assateague Island National Seashore, Worcester County, Maryland, 2010

    USGS Publications Warehouse

    Banks, William S.L.; Masterson, John P.; Johnson, Carole D.

    2012-01-01

    The U.S. Geological Survey, as part of its Climate and Land Use Change Research and Development Program, is conducting a multi-year investigation to assess potential impacts on the natural resources of Assateague Island National Seashore, Maryland that may result from changes in the hydrologic system in response to projected sea-level rise. As part of this effort, 26 monitoring wells were installed in pairs along five east-west trending transects. Each of the five transects has between two and four pairs of wells, consisting of a shallow well and a deeper well. The shallow well typically was installed several feet below the water table—usually in freshwater about 10 feet below land surface (ft bls)—to measure water-level changes in the shallow groundwater system. The deeper well was installed below the anticipated depth to the freshwater-saltwater interface—usually in saltwater about 45 to 55 ft bls—for the purpose of borehole geophysical logging to characterize local differences in lithology and salinity and to monitor tidal influences on groundwater. Four of the 13 shallow wells and 5 of the 13 deeper wells were instrumented with water-level recorders that collected water-level data at 15-minute intervals from August 12 through September 28, 2010. Data collected from these instrumented wells were compared with tide data collected north of Assateague Island at the Ocean City Inlet tide gage, and precipitation data collected by National Park Service staff on Assateague Island. These data indicate that precipitation events coupled with changes in ambient sea level had the largest effect on groundwater levels in all monitoring wells near the Atlantic Ocean and Chincoteague and Sinepuxent Bays, whereas precipitation events alone had the greatest impact on shallow groundwater levels near the center of the island. 
Daily and bi-monthly tidal cycles appeared to have minimal influence on groundwater levels throughout the island and the water-level changes that were observed appeared to vary among well sites, indicating that changes in lithology and salinity also may affect the response of water levels in the shallow and deeper groundwater systems throughout the island. Borehole geophysical logs were collected at each of the 13 deeper wells along the 5 transects. Electromagnetic induction logs were collected to identify changes in lithology; determine the approximate location of the freshwater-saltwater interface; and characterize the distribution of fresh and brackish water in the shallow aquifer, and the geometry of the fresh groundwater lens beneath the island. Natural gamma logs were collected to provide information on the geologic framework of the island including the presence and thickness of finer-grained deposits found in the subsurface throughout the island during previous investigations. Results of this investigation show the need for collection of continuous water-level data in both the shallow and deeper parts of the flow system and electromagnetic induction and natural gamma geophysical logging data to better understand the response of this groundwater system to changes in precipitation and tidal forcing. Hydrologic data collected as part of this investigation will serve as the foundation for the development of numerical flow models to assess the potential effects of climate change on the coastal groundwater system of Assateague Island.

  12. An Examination of Faculty and Student Online Activity: Predictive Relationships of Student Academic Success in a Learning Management System (LMS)

    ERIC Educational Resources Information Center

    Stamm, Randy Lee

    2013-01-01

    The purpose of this mixed method research study was to examine relationships in student and instructor activity logs and student performance benchmarks specific to enabling early intervention by the instructor in a Learning Management System (LMS). Instructor feedback was collected through a survey instrument to demonstrate perceived importance of…

  13. 40 CFR 796.1050 - Absorption in aqueous solution: Ultraviolet/visible spectra.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ....50 λ in nm 235 257 313 350 (2) Fluoranthene (in methanol) from C.R.C. Atlas of Spectral Data...-nitrophenol (in methanol) from C.R.C. Atlas of Spectral Data, paragraph (d)(3) of this section: log ε 3.88 4... to obtain the UV-VIS spectrum of the test compound. Such an instrument should have a photometric...

  14. Multi-sensor Array for High Altitude Balloon Missions to the Stratosphere

    NASA Astrophysics Data System (ADS)

    Davis, Tim; McClurg, Bryce; Sohl, John

    2008-10-01

    We have designed and built a microprocessor-controlled, expandable multi-sensor array for data collection on near-space missions. Weber State University has started a high altitude research balloon program called HARBOR. The array has been designed to log a base set of measurements on every flight and has room for six guest instruments. The base measurements are absolute pressure, on-board temperature, 3-axis acceleration for attitude measurement, and a 2-axis compensated magnetic compass. The system also contains a real-time clock and circuitry for logging data directly to a USB memory stick. In typical operation the measurements are cycled through in sequence and saved to the memory stick along with the clock's time stamp. The microprocessor can be reprogrammed to adapt to guest experiments with either analog or digital interfacing. This system will fly on every mission and will provide backup data collection for other instrumentation whose primary task is measuring atmospheric pressure and temperature. The attitude data will be used to determine the orientation of the onboard camera systems to aid in identifying features in the images. This will make the images easier to use for any future GIS (geographic information system) remote sensing missions.
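The sample-in-sequence-and-timestamp cycle described above can be sketched as follows. The sensor names and fixed readings are stand-ins, since the real firmware reads its ADC channels and writes to the USB stick:

```python
import csv
import io
from datetime import datetime, timezone

# Hypothetical stand-ins for the base sensor set; a flight build would
# sample the ADC channels instead of returning constants.
SENSORS = {
    "pressure_kpa": lambda: 101.3,
    "temp_c":       lambda: 22.5,
    "accel_x_g":    lambda: 0.01,
    "heading_deg":  lambda: 187.0,
}

def log_cycle(writer, read_clock=lambda: datetime.now(timezone.utc)):
    """One pass through the sensor sequence: sample each channel,
    prepend the real-time-clock timestamp, write one record."""
    row = [read_clock().isoformat()] + [read() for read in SENSORS.values()]
    writer.writerow(row)
    return row

buf = io.StringIO()                       # stands in for the memory-stick file
w = csv.writer(buf)
w.writerow(["timestamp"] + list(SENSORS))
row = log_cycle(w)
```

Guest instruments would extend the `SENSORS` table, matching the article's point that the microprocessor is reprogrammed per experiment.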

  15. Hardwood log grading scale stick improved

    Treesearch

    M. D. Ostrander; G. H. Englerth

    1953-01-01

    In February 1952 the Northeastern Forest Experiment Station described (Research Note 13) a new log-grading scale stick developed by the Station for use as a visual aid in grading hardwood factory logs. It was based on the U.S. Forest Products Laboratory's log-grade specifications.

  16. Software development kit for a compact cryo-refrigerator

    NASA Astrophysics Data System (ADS)

    Gardiner, J.; Hamilton, J.; Lawton, J.; Knight, K.; Wilson, A.; Spagna, S.

    2017-12-01

    This paper introduces a Software Development Kit (SDK) that enables the creation of custom software applications that automate the control of a cryo-refrigerator (Quantum Design model GA-1) in third party instruments. A remote interface allows real time tracking and logging of critical system diagnostics such as pressures, temperatures, valve states and run modes. The helium compressor scroll capsule speed and Gifford-McMahon (G-M) cold head speed can be manually adjusted over a serial communication line via a CAN interface. This configuration optimizes cooling power, while reducing wear on moving components thus extending service life. Additionally, a proportional speed control mode allows for automated throttling of speeds based on temperature or pressure feedback from a 3rd party device. Warm up and cool down modes allow 1st and 2nd stage temperatures to be adjusted without the use of external heaters.

  17. Utilization and cost for animal logging operations

    Treesearch

    Suraj P. Shrestha; Bobby L. Lanford

    2001-01-01

    Forest harvesting with animals is a labor-intensive operation. Driven by the development of efficient machines and high volume demands from the forest products industry, the mechanization of logging advanced rapidly, leaving traditional horse and mule logging behind. It is expensive to use machines on smaller woodlots, which require frequent moves if mechanically...

  18. Combined Log Inventory and Process Simulation Models for the Planning and Control of Sawmill Operations

    Treesearch

    Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold

    1991-01-01

    A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...

  19. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

    There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system must be developed before aerial logging can become effective and accepted in the logging industry. This paper presents such a system, designed on simple principles with realistic costs and ecological benefits.

  20. Evaluation of electronic logging and gamma ray device for bridge boring interpretation.

    DOT National Transportation Integrated Search

    1971-07-01

    Since shallow electric logging devices and gamma ray devices (for use in holes of less than 200 feet in depth) have recently been developed, it was the aim of this work to ascertain if correlation between electric logs and/or gamma ray logs and known...

  1. T-H-A-T-S: timber-harvesting-and-transport-simulator: with subroutines for Appalachian logging

    Treesearch

    A. Jeff Martin

    1975-01-01

    A computer program for simulating harvesting operations is presented. Written in FORTRAN IV, the program contains subroutines that were developed for Appalachian logging conditions. However, with appropriate modifications, the simulator would be applicable for most logging operations and locations. The details of model development and its methodology are presented,...

  2. Ubiquitous Learning Project Using Life-Logging Technology in Japan

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Hou, Bin; Li, Mengmeng; Uosaki, Noriko; Mouri, Kosuke; Liu, Songran

    2014-01-01

    A Ubiquitous Learning Log (ULL) is defined as a digital record of what a learner has learned in daily life using ubiquitous computing technologies. In this paper, a project which developed a system called SCROLL (System for Capturing and Reusing Of Learning Log) is presented. The aim of developing SCROLL is to help learners record, organize,…

  3. Keystroke Analysis: Reflections on Procedures and Measures

    ERIC Educational Resources Information Center

    Baaijen, Veerle M.; Galbraith, David; de Glopper, Kees

    2012-01-01

    Although keystroke logging promises to provide a valuable tool for writing research, it can often be difficult to relate logs to underlying processes. This article describes the procedures and measures that the authors developed to analyze a sample of 80 keystroke logs, with a view to achieving a better alignment between keystroke-logging measures…

  4. Standing timber coefficients for Indiana walnut log production.

    Treesearch

    James E. Blyth; Edwin Kallio; John C. Callahan

    1969-01-01

    If the volume of walnut veneer logs and saw logs received at processing plants from Indiana forests is known, conversion factors developed in this paper can be used to determine how much timber was cut to provide these logs and the kinds of timber that were cut (sawtimber, cull trees, trees on nonforest land, etc.).

  5. LogSafe and Smart: Minnesota OSHA's LogSafe Program Takes Root.

    ERIC Educational Resources Information Center

    Honerman, James

    1999-01-01

    Logging is now the most dangerous U.S. occupation. The Occupational Safety and Health Administration (OSHA) developed specialized safety training for the logging industry but has been challenged to reach small operators. An OSHA-approved state program in Minnesota provides annual safety seminars to about two-thirds of the state's full-time…

  6. Selective logging in the Brazilian Amazon.

    Treesearch

    G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva

    2005-01-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...

  7. Critical length sampling: a method to estimate the volume of downed coarse woody debris

    Treesearch

    Göran Ståhl; Jeffrey H. Gove; Michael S. Williams; Mark J. Ducey

    2010-01-01

    In this paper, critical length sampling for estimating the volume of downed coarse woody debris is presented. Using this method, the volume of downed wood in a stand can be estimated by summing the critical lengths of down logs included in a sample obtained using a relascope or wedge prism; typically, the instrument should be tilted 90° from its usual...
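    Assuming the estimator takes the form commonly used in relascope-based critical length sampling (volume per unit area equals the instrument's basal area factor times the sum of the sampled logs' critical lengths), the computation reduces to a one-line sum. This is a sketch of the estimator's shape under that assumption, not the authors' code; the lengths and factor below are invented.

    ```python
    def cls_volume_per_ha(critical_lengths_m, basal_area_factor=2.0):
        """Critical length sampling estimate of downed-wood volume (m^3/ha).

        Assumes V = F * sum(critical lengths), where F is the relascope's
        basal area factor (m^2/ha) and each critical length (m) is the
        portion of a sampled log over which its cross-section exceeds the
        limiting size set by the instrument's angle.
        """
        return basal_area_factor * sum(critical_lengths_m)

    # e.g. three sampled logs with critical lengths 1.2, 0.8, and 2.5 m
    # at a basal area factor of 2 m^2/ha give 9.0 m^3/ha.
    ```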

  8. Geothermal technology publications and related reports: a bibliography, January 1977-December 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudson, S.R.

    1981-04-01

    This bibliography lists titles, authors, abstracts, and reference information for publications in the areas of drilling technology, logging instrumentation, and magma energy during the period 1977-1980. These publications are the results of work carried out at Sandia National Laboratories and by their subcontractors. Some work was also done in conjunction with the Morgantown, Bartlesville, and Pittsburgh Energy Technology Centers.

  9. Changes in soil moisture and pore pressure after harvesting a forested hillslope in northern California

    Treesearch

    Elizabeth T. Keppeler; Robert R. Ziemer; Peter H. Cafferata

    1994-01-01

    Abstract - In 1987, a 0.83-ha zero-order swale was instrumented with 58 piezometers and 25 tensiometers along several hillslope transects. Through 1993, soil moisture conditions were measured by pressure transducers connected to a digital data logger recording at 15-minute intervals. In August 1989, the 100-year-old second-growth forest in the swale was felled. Logs...

  10. The quality and availability of hardwood logging residue based on developed quality levels

    Treesearch

    Floyd G. Timson

    1980-01-01

    Hardwood logging residue was examined for salvageable quality material. Four quality levels (QL 1 to QL 4), based on four sets of specifications, were developed. The specifications used surface indicators, sweep, center decay, and piece size to determine quality. Twenty-six percent of the total logging residue (residue ≥ 4 inches in diameter outside bark at...

  11. Using parallel computing methods to improve log surface defect detection methods

    Treesearch

    R. Edward Thomas; Liya Thomas

    2013-01-01

    Determining the size and location of surface defects is crucial to evaluating the potential yield and value of hardwood logs. Recently a surface defect detection algorithm was developed using the Java language. This algorithm was developed around an earlier laser scanning system that had poor resolution along the length of the log (15 scan lines per foot). A newer...

  12. The Development and Validation of the Instructional Practices Log in Science: A Measure of K-5 Science Instruction

    ERIC Educational Resources Information Center

    Adams, Elizabeth L.; Carrier, Sarah J.; Minogue, James; Porter, Stephen R.; McEachin, Andrew; Walkowiak, Temple A.; Zulli, Rebecca A.

    2017-01-01

    The Instructional Practices Log in Science (IPL-S) is a daily teacher log developed for K-5 teachers to self-report their science instruction. The items on the IPL-S are grouped into scales measuring five dimensions of science instruction: "Low-level Sense-making," "High-level Sense-making," "Communication,"…

  13. Progress in alternative neutron detection to address the helium-3 shortage

    NASA Astrophysics Data System (ADS)

    Kouzes, Richard T.; Lintereur, Azaree T.; Siciliano, Edward R.

    2015-06-01

    One of the main uses for 3He is in gas proportional counters for neutron detection. Such detectors are used at neutron scattering science facilities and in radiation portal monitors deployed for homeland security and non-proliferation applications. Other uses of 3He are for research detectors, commercial instruments, well logging detectors, dilution refrigerators, lung imaging, for targets in nuclear research, and for basic research in condensed matter physics. The supply of 3He comes entirely from the decay of tritium produced for nuclear weapons in the U.S. and Russia. Due to the large increase in use of 3He for science and homeland security (since 2002), the supply could no longer meet the demand. This has led to the development of a number of alternative neutron detection schemes.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kouzes, Richard T.; Lintereur, Azaree T.; Siciliano, Edward R.

    One of the main uses for 3He is in gas proportional counters for neutron detection. Such detectors are used at neutron scattering science facilities and in radiation portal monitors deployed for homeland security and non-proliferation applications. Other uses of 3He are for research detectors, commercial instruments, well logging detectors, dilution refrigerators, lung imaging, for targets in nuclear research, and for basic research in condensed matter physics. The supply of 3He comes entirely from the decay of tritium produced for nuclear weapons in the U.S. and Russia. Due to the large increase in use of 3He for science and homeland security (since 2002), the supply has dwindled and can no longer meet the demand. This has led to the development of a number of alternative neutron detection schemes.

  15. Evaluation of residual oil saturation after waterflood in a carbonate reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verma, M.K.; Boucherit, M.; Bouvier, L.

    Four different approaches, including special core analysis (SCAL), log-inject-log, thermal-decay-time (TDT) logs, and material balance, were used to narrow the range of residual oil saturation (ROS) after waterflood, S[sub orw], in a carbonate reservoir in Qatar to between 23% and 27%. An equation was developed that relates S[sub orw] with connate-water saturation, S[sub wi], and porosity. This paper presents the results of S[sub orw] determinations with four different techniques: core waterflood followed by centrifuging, log-inject-log, TDT logging, and material balance.
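    The abstract reports that an equation was developed relating S_orw to connate-water saturation S_wi and porosity, but does not give its form. Purely as an illustration of how such a correlation could be fitted, the sketch below performs ordinary least squares for a hypothetical linear model S_orw = a + b·S_wi + c·φ; the data points and coefficients are invented, not the Qatar reservoir values.

    ```python
    # Illustrative only: the functional form and data are hypothetical.

    def solve3(A, b):
        """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
        M = [row[:] + [bi] for row, bi in zip(A, b)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, 3):
                f = M[r][col] / M[col][col]
                for c in range(col, 4):
                    M[r][c] -= f * M[col][c]
        x = [0.0, 0.0, 0.0]
        for r in (2, 1, 0):
            x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
        return x

    def fit_sorw(samples):
        """Fit S_orw = a + b*S_wi + c*phi by least squares.

        samples: list of (S_wi, phi, S_orw) tuples; returns (a, b, c).
        """
        X = [(1.0, swi, phi) for swi, phi, _ in samples]
        y = [sorw for _, _, sorw in samples]
        XtX = [[sum(xr[i] * xr[j] for xr in X) for j in range(3)] for i in range(3)]
        Xty = [sum(xr[i] * yi for xr, yi in zip(X, y)) for i in range(3)]
        return solve3(XtX, Xty)
    ```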

  16. An Interactive Multi-instrument Database of Solar Flares

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadykov, Viacheslav M; Kosovichev, Alexander G; Oria, Vincent

    Solar flares are complicated physical phenomena that are observable in a broad range of the electromagnetic spectrum, from radio waves to γ-rays. For a more comprehensive understanding of flares, it is necessary to perform a combined multi-wavelength analysis using observations from many satellites and ground-based observatories. For an efficient data search, integration of different flare lists, and representation of observational data, we have developed the Interactive Multi-Instrument Database of Solar Flares (IMIDSF, https://solarflare.njit.edu/). The web-accessible database is fully functional and allows the user to search for uniquely identified flare events based on their physical descriptors and the availability of observations by a particular set of instruments. Currently, the data from three primary flare lists (Geostationary Operational Environmental Satellites, RHESSI, and HEK) and a variety of other event catalogs (Hinode, Fermi GBM, Konus-WIND, the OVSA flare catalogs, the CACTus CME catalog, the Filament eruption catalog) and observing logs (IRIS and Nobeyama coverage) are integrated, and an additional set of physical descriptors (temperature and emission measure) is provided along with an observing summary, data links, and multi-wavelength light curves for each flare event since 2002 January. We envision that this new tool will allow researchers to significantly speed up the search of events of interest for statistical and case studies.

  17. A Maximum Likelihood Ensemble Data Assimilation Method Tailored to the Inner Radiation Belt

    NASA Astrophysics Data System (ADS)

    Guild, T. B.; O'Brien, T. P., III; Mazur, J. E.

    2014-12-01

    The Earth's radiation belts are composed of energetic protons and electrons whose fluxes span many orders of magnitude, whose distributions are log-normal, and where data-model differences can be large and also log-normal. This physical system thus challenges standard data assimilation methods relying on underlying assumptions of Gaussian distributions of measurements and data-model differences, where innovations to the model are small. We have therefore developed a data assimilation method tailored to these properties of the inner radiation belt, analogous to the ensemble Kalman filter but for the unique cases of non-Gaussian model and measurement errors, and non-linear model and measurement distributions. We apply this method to the inner radiation belt proton populations, using the SIZM inner belt model [Selesnick et al., 2007] and SAMPEX/PET and HEO proton observations to select the most likely ensemble members contributing to the state of the inner belt. We will describe the algorithm, the method of generating ensemble members, and our choice of minimizing differences in instrument counts rather than phase space densities, and will demonstrate the method with our reanalysis of the inner radiation belt throughout solar cycle 23. We will report on progress to continue our assimilation into solar cycle 24 using the Van Allen Probes/RPS observations.
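    The member-selection idea above (score each ensemble member by the likelihood of the observed counts under log-normal, i.e. multiplicative, errors, rather than by Gaussian differences) can be sketched as follows. The error scale and data are invented for illustration and are not the SIZM/RPS configuration.

    ```python
    import math

    def log_likelihood(observed_counts, predicted_counts, sigma_log=0.5):
        """Log-likelihood of observations under multiplicative (log-normal) errors.

        Residuals are taken in log space, consistent with log-normally
        distributed fluxes; sigma_log is an assumed, illustrative standard
        deviation of the log-residuals.
        """
        ll = 0.0
        for obs, pred in zip(observed_counts, predicted_counts):
            r = math.log(obs) - math.log(pred)
            ll += -0.5 * (r / sigma_log) ** 2 \
                  - math.log(sigma_log * math.sqrt(2 * math.pi))
        return ll

    def most_likely_member(observed, ensemble):
        """Index of the ensemble member maximizing the log-likelihood."""
        scores = [log_likelihood(observed, member) for member in ensemble]
        return max(range(len(ensemble)), key=scores.__getitem__)
    ```

    Scoring in log space means a member that over-predicts by a factor of two is penalized the same as one that under-predicts by a factor of two, which is the natural symmetry for log-normal data.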

  18. Comparative analytical evaluation of the respiratory TaqMan Array Card with real-time PCR and commercial multi-pathogen assays.

    PubMed

    Harvey, John J; Chester, Stephanie; Burke, Stephen A; Ansbro, Marisela; Aden, Tricia; Gose, Remedios; Sciulli, Rebecca; Bai, Jing; DesJardin, Lucy; Benfer, Jeffrey L; Hall, Joshua; Smole, Sandra; Doan, Kimberly; Popowich, Michael D; St George, Kirsten; Quinlan, Tammy; Halse, Tanya A; Li, Zhen; Pérez-Osorio, Ailyn C; Glover, William A; Russell, Denny; Reisdorf, Erik; Whyte, Thomas; Whitaker, Brett; Hatcher, Cynthia; Srinivasan, Velusamy; Tatti, Kathleen; Tondella, Maria Lucia; Wang, Xin; Winchell, Jonas M; Mayer, Leonard W; Jernigan, Daniel; Mawle, Alison C

    2016-02-01

    In this study, a multicenter evaluation of the Life Technologies TaqMan(®) Array Card (TAC) with 21 custom viral and bacterial respiratory assays was performed on the Applied Biosystems ViiA™ 7 Real-Time PCR System. The goal of the study was to demonstrate the analytical performance of this platform when compared to identical individual pathogen specific laboratory developed tests (LDTs) designed at the Centers for Disease Control and Prevention (CDC), equivalent LDTs provided by state public health laboratories, or to three different commercial multi-respiratory panels. CDC and Association of Public Health Laboratories (APHL) LDTs had similar analytical sensitivities for viral pathogens, while several of the bacterial pathogen APHL LDTs demonstrated sensitivities one log higher than the corresponding CDC LDT. When compared to CDC LDTs, TAC assays were generally one to two logs less sensitive depending on the site performing the analysis. Finally, TAC assays were generally more sensitive than their counterparts in three different commercial multi-respiratory panels. TAC technology allows users to spot customized assays and design TAC layout, simplify assay setup, conserve specimen, dramatically reduce contamination potential, and as demonstrated in this study, analyze multiple samples in parallel with good reproducibility between instruments and operators. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Locating knots by industrial tomography- A feasibility study

    Treesearch

    Fred W. Taylor; Francis G. Wagner; Charles W. McMillin; Ira L. Morgan; Forrest F. Hopkins

    1984-01-01

    Industrial photon tomography was used to scan four southern pine logs and one red oak log. The logs were scanned at 16 cross-sectional slice planes located 1 centimeter apart along their longitudinal axes. Tomographic reconstructions were made from the scan data collected at these slice planes, and a cursory image analysis technique was developed to locate the log...

  20. A three-dimensional optimal sawing system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang; R. Edward. Thomas

    2011-01-01

    A three-dimensional (3D) log sawing optimization system was developed to perform 3D log generation, opening face determination, sawing simulation, and lumber grading. Superficial characteristics of logs such as length, large-end and small-end diameters, and external defects were collected from local sawmills. Internal log defect positions and shapes were predicted...

  1. ALOG user's manual: A Guide to using the spreadsheet-based artificial log generator

    Treesearch

    Matthew F. Winn; Philip A. Araman; Randolph H. Wynne

    2012-01-01

    Computer programs that simulate log sawing can be valuable training tools for sawyers, as well as a means of testing different sawing patterns. Most available simulation programs rely on diagrammed-log databases, which can be very costly and time consuming to develop. Artificial Log Generator (ALOG) is a user-friendly Microsoft® Excel®...

  2. Software systems for operation, control, and monitoring of the EBEX instrument

    NASA Astrophysics Data System (ADS)

    Milligan, Michael; Ade, Peter; Aubin, François; Baccigalupi, Carlo; Bao, Chaoyun; Borrill, Julian; Cantalupo, Christopher; Chapman, Daniel; Didier, Joy; Dobbs, Matt; Grainger, Will; Hanany, Shaul; Hillbrand, Seth; Hubmayr, Johannes; Hyland, Peter; Jaffe, Andrew; Johnson, Bradley; Kisner, Theodore; Klein, Jeff; Korotkov, Andrei; Leach, Sam; Lee, Adrian; Levinson, Lorne; Limon, Michele; MacDermid, Kevin; Matsumura, Tomotake; Miller, Amber; Pascale, Enzo; Polsgrove, Daniel; Ponthieu, Nicolas; Raach, Kate; Reichborn-Kjennerud, Britt; Sagiv, Ilan; Tran, Huan; Tucker, Gregory S.; Vinokurov, Yury; Yadav, Amit; Zaldarriaga, Matias; Zilic, Kyle

    2010-07-01

    We present the hardware and software systems implementing autonomous operation, distributed real-time monitoring, and control for the EBEX instrument. EBEX is a NASA-funded balloon-borne microwave polarimeter designed for a 14 day Antarctic flight that circumnavigates the pole. To meet its science goals the EBEX instrument autonomously executes several tasks in parallel: it collects attitude data and maintains pointing control in order to adhere to an observing schedule; tunes and operates up to 1920 TES bolometers and 120 SQUID amplifiers controlled by as many as 30 embedded computers; coordinates and dispatches jobs across an onboard computer network to manage this detector readout system; logs over 3 GiB/hour of science and housekeeping data to an onboard disk storage array; responds to a variety of commands and exogenous events; and downlinks multiple heterogeneous data streams representing a selected subset of the total logged data. Most of the systems implementing these functions have been tested during a recent engineering flight of the payload, and have proven to meet the target requirements. The EBEX ground segment couples uplink and downlink hardware to a client-server software stack, enabling real-time monitoring and command responsibility to be distributed across the public internet or other standard computer networks. Using the emerging dirfile standard as a uniform intermediate data format, a variety of front end programs provide access to different components and views of the downlinked data products. This distributed architecture was demonstrated operating across multiple widely dispersed sites prior to and during the EBEX engineering flight.

  3. What's new in well logging and formation evaluation

    USGS Publications Warehouse

    Prensky, S.

    2011-01-01

    A number of significant new developments are emerging in well logging and formation evaluation, including an ultrasonic wireline imager, an electromagnetic free-point indicator, wired and fiber-optic coiled-tubing systems, and extreme-temperature logging-while-drilling (LWD) tools. The continued consolidation of logging and petrophysical service providers in 2010 means that these innovations are increasingly provided by a few large companies. Weatherford International has launched a slimhole cross-dipole tool as part of the company's line of compact logging tools. The 26-ft-long Compact Cross-Dipole Sonic (CXD) tool can be run as part of a quad-combo compact logging string. Halliburton has introduced a version of its circumferential acoustic scanning tool (CAST) that runs on monoconductor cable (CAST-M) to provide high-resolution images in open hole and in cased hole for casing and cement evaluation.

  4. Sea Bed Drilling Technology MARUM-MeBo: Overview on recent scientific drilling campaigns and technical developments

    NASA Astrophysics Data System (ADS)

    Freudenthal, Tim; Bergenthal, Markus; Bohrmann, Gerhard; Pape, Thomas; Kopf, Achim; Huhn-Frehers, Katrin; Gohl, Karsten; Wefer, Gerold

    2017-04-01

    The MARUM-MeBo (short for Meeresboden-Bohrgerät, German for seafloor drill rig) is a robotic drilling system that has been under development since 2004 at the MARUM Center for Marine Environmental Sciences at the University of Bremen, in close cooperation with Bauer Maschinen GmbH and other industry partners. The MARUM-MeBo drill rigs can be deployed from multipurpose research vessels such as RV MARIA S. MERIAN, RV METEOR, RV SONNE, and RV POLARSTERN, and are used to recover long cores in both soft sediments and hard rock in the deep sea. The first-generation drill rig, the MARUM-MeBo70, is designed for drilling depths of more than 70 m (Freudenthal and Wefer, 2013). Between 2005 and 2016 it was deployed on 17 research expeditions and drilled about 3 km through different types of geology, including carbonate and crystalline rocks, gas hydrates, glacial tills, sands and gravel, and hemipelagic mud, with an average recovery rate of about 70%. We used the development and operational experience with MeBo70 in the design of a second-generation drill rig, the MARUM-MeBo200, intended for core drilling down to 200 m below the sea floor. After successful sea trials in the North Sea in October 2014, the MeBo200 was used on a scientific expedition on the research vessel RV SONNE (SO247) in March/April 2016. During 12 deployments we drilled a total of 514 m in hemipelagic sediments with volcanic ashes as well as in muddy and sandy slide deposits off New Zealand. The average core recovery was about 54%. The maximum drilling depth was 105 m below the sea floor. Developments of the MeBo drilling technology include a pressure core barrel that has been deployed successfully on two research expeditions so far. Borehole logging adds to the coring capability.
Several autonomous logging probes have been developed in recent years for deployment with MeBo in logging-while-tripping mode, a sonic probe measuring in situ p-wave velocity being the latest development. Various borehole monitoring systems were developed and deployed with the MeBo system; they allow long-term monitoring of pressure variability within the sealed boreholes. References: Freudenthal, T and Wefer, G (2013) Drilling cores on the sea floor with the remote-controlled sea floor drilling rig MeBo. Geoscientific Instrumentation, Methods and Data Systems, 2(2). 329-337. doi:10.5194/gi-2-329-2013

  5. Mechanical reduction of the intracanal Enterococcus faecalis population by Hyflex CM, K3XF, ProTaper Next, and two manual instrument systems: an in vitro comparative study.

    PubMed

    Tewari, Rajendra K; Ali, Sajid; Mishra, Surendra K; Kumar, Ashok; Andrabi, Syed Mukhtar-Un-Nisar; Zoya, Asma; Alam, Sharique

    2016-05-01

    In the present study, the effectiveness of three rotary and two manual nickel-titanium instrument systems in mechanically reducing the intracanal Enterococcus faecalis population was evaluated. Mandibular premolars with straight roots were selected. Teeth were decoronated, instrumented up to a size 20 K-file, and irrigated with physiological saline. After sterilization by ethylene oxide gas, root canals were inoculated with Enterococcus faecalis. The specimens were randomly divided into five groups for canal instrumentation: manual Nitiflex and Hero Shaper nickel-titanium files, and rotary Hyflex CM, ProTaper Next, and K3XF nickel-titanium files. Intracanal bacterial sampling was done before and after instrumentation. After serial dilution, samples were plated onto Mitis Salivarius agar. The c.f.u. were counted, and the log10 transformation was calculated. All instrumentation systems significantly reduced the intracanal bacterial population after root canal preparation. ProTaper Next was found to be significantly more effective than Hyflex CM and the manual Nitiflex and Hero Shaper files, but showed no significant difference from K3XF. Canal instrumentation with all the file systems significantly reduced intracanal Enterococcus faecalis counts, with ProTaper Next the most effective at reducing the number of bacteria among the rotary and hand instruments tested. © 2014 Wiley Publishing Asia Pty Ltd.
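    The log10 transformation used above is simply log base 10 of the colony-forming-unit concentration recovered from the serial-dilution plate count; the dilution factors and counts below are invented for illustration, not the study's data.

    ```python
    import math

    def log10_cfu(colonies, dilution_factor, plated_volume_ml=0.1):
        """CFU/mL on a log10 scale from a plate count at a known dilution.

        colonies: colonies counted on the plate
        dilution_factor: e.g. 1000 for a 10^-3 serial dilution
        plated_volume_ml: volume spread on the plate (illustrative default)
        """
        cfu_per_ml = colonies * dilution_factor / plated_volume_ml
        return math.log10(cfu_per_ml)

    def log_reduction(pre_log10, post_log10):
        """Reduction in bacterial load between samplings, in log10 units."""
        return pre_log10 - post_log10

    # e.g. 150 colonies at a 10^-3 dilution before instrumentation and
    # 30 colonies at a 10^-1 dilution after gives a ~2.7-log reduction.
    ```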

  6. Modelling uveal melanoma

    PubMed Central

    Foss, A.; Cree, I.; Dolin, P.; Hungerford, J.

    1999-01-01

    BACKGROUND/AIM—There has been no consistent pattern reported on how mortality for uveal melanoma varies with age. This information can be useful to model the complexity of the disease. The authors have examined ocular cancer trends, as an indirect measure for uveal melanoma mortality, to see how rates vary with age and to compare the results with their other studies on predicting metastatic disease.
METHODS—Age specific mortality was examined for England and Wales, the USA, and Canada. A log-log model was fitted to the data. The slopes of the log-log plots were used as a measure of disease complexity and compared with the results of previous work on predicting metastatic disease.
RESULTS—The log-log model provided a good fit for the US and Canadian data, but the observed rates deviated for England and Wales among people over the age of 65 years. The log-log model for mortality data suggests that the underlying process depends upon four rate limiting steps, while a similar model for the incidence data suggests between three and four rate limiting steps. Further analysis of previous data on predicting metastatic disease on the basis of tumour size and blood vessel density would indicate a single rate limiting step between developing the primary tumour and developing metastatic disease.
CONCLUSIONS—There is significant underreporting or underdiagnosis of ocular melanoma for England and Wales in those over the age of 65 years. In those under the age of 65, a model is presented for ocular melanoma oncogenesis requiring three rate limiting steps to develop the primary tumour and a fourth rate limiting step to develop metastatic disease. The three steps in the generation of the primary tumour involve two key processes—namely, growth and angiogenesis within the primary tumour. The step from development of the primary to development of metastatic disease is likely to involve a single rate limiting process.

 PMID:10216060
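    Under the multistage reading implicit in the abstract above, a rate proportional to age^(n−1) gives a log-log plot with slope n−1, so a fitted slope near 3 corresponds to four rate-limiting steps. A sketch of the fit using pure-Python least squares on synthetic power-law rates (not the actual registry data):

    ```python
    import math

    def loglog_slope(ages, rates):
        """Slope of log(rate) vs log(age) by simple least squares."""
        xs = [math.log(a) for a in ages]
        ys = [math.log(r) for r in rates]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
               sum((x - mx) ** 2 for x in xs)

    # Synthetic rates following rate = k * age^3, i.e. slope 3,
    # which under the multistage model means four rate-limiting steps:
    ages = [30, 40, 50, 60, 70, 80]
    rates = [1e-6 * a ** 3 for a in ages]
    steps = loglog_slope(ages, rates) + 1  # ~4.0
    ```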

  7. Financial Crisis: A New Measure for Risk of Pension Fund Portfolios

    PubMed Central

    Cadoni, Marinella; Melis, Roberta; Trudda, Alessandro

    2015-01-01

    It has been argued that pension funds should have limitations on their asset allocation, based on the risk profile of the different financial instruments available on the financial markets. This issue proves to be highly relevant at times of market crisis, when a regulation establishing limits to risk taking for pension funds could prevent defaults. In this paper we present a framework for evaluating the risk level of a single financial instrument or a portfolio. By assuming that the log asset returns can be described by a multifractional Brownian motion, we evaluate the risk using the time-dependent Hurst parameter H(t), which models volatility. To provide a measure of the risk, we model the Hurst parameter as a random variable following a mixture of beta distributions. We demonstrate the efficacy of the methodology by implementing it on financial instruments and portfolios with different risk levels. PMID:26086529
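    Estimating the time-dependent H(t) of a multifractional Brownian motion is beyond a short sketch, but the constant-H estimate underlying it, based on the variance of increments at lag τ scaling as τ^(2H), can be illustrated. The random walk used as input here is a seeded stand-in for log asset returns, not market data.

    ```python
    import math
    import random

    def hurst_from_increments(series, max_lag=50):
        """Estimate H from the scaling Var[X(t+tau) - X(t)] ~ tau^(2H).

        Regresses log variance of lagged increments on log lag; the
        slope equals 2H for (multi)fractional-Brownian-type paths.
        """
        xs, ys = [], []
        for lag in range(1, max_lag + 1):
            diffs = [series[i + lag] - series[i]
                     for i in range(len(series) - lag)]
            m = sum(diffs) / len(diffs)
            var = sum((d - m) ** 2 for d in diffs) / len(diffs)
            xs.append(math.log(lag))
            ys.append(math.log(var))
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                sum((x - mx) ** 2 for x in xs)
        return slope / 2  # slope = 2H

    # Ordinary Brownian motion (H = 0.5) as a sanity check:
    random.seed(0)
    walk = [0.0]
    for _ in range(20000):
        walk.append(walk[-1] + random.gauss(0, 1))
    ```

    In the paper's risk framework, higher estimated H (persistence) flags a riskier instrument, so this estimator is the measurement step that the beta-mixture model is then fitted to.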

  8. Financial Crisis: A New Measure for Risk of Pension Fund Portfolios.

    PubMed

    Cadoni, Marinella; Melis, Roberta; Trudda, Alessandro

    2015-01-01

    It has been argued that pension funds should have limitations on their asset allocation, based on the risk profile of the different financial instruments available on the financial markets. This issue proves to be highly relevant at times of market crisis, when a regulation establishing limits to risk taking for pension funds could prevent defaults. In this paper we present a framework for evaluating the risk level of a single financial instrument or a portfolio. By assuming that the log asset returns can be described by a multifractional Brownian motion, we evaluate the risk using the time-dependent Hurst parameter H(t), which models volatility. To provide a measure of the risk, we model the Hurst parameter as a random variable following a mixture of beta distributions. We demonstrate the efficacy of the methodology by implementing it on financial instruments and portfolios with different risk levels.

  9. Three-parameter optical studies in Scottish coastal waters

    NASA Astrophysics Data System (ADS)

    McKee, David; Cunningham, Alex; Jones, Ken

    1997-02-01

    A new submersible optical instrument has been constructed which allows chlorophyll fluorescence, attenuation and wide-angle scattering measurements to be made simultaneously at the same point in a body of water. The instrument uses a single xenon flashlamp as the light source, and incorporates its own power supply and microprocessor-based data logging system. It has been cross-calibrated against commercial single-parameter instruments using a range of non-algal particles and phytoplankton cultures. The equipment has been deployed at sea in the Firth of Clyde and Loch Linnhe, where it has been used to study seasonal variability in optical water column structure. Results will be presented to illustrate how ambiguity in the interpretation of measurements of a single optical parameter can be alleviated by measuring several parameters simultaneously. Comparative studies of differences in winter and spring relationships between optical variables have also been carried out.

  10. Bringing Undergraduates and Geoscientists Together for Field-Based Geophysical Education and Research at an On-Campus Well Field

    NASA Astrophysics Data System (ADS)

    Day-Lewis, F. D.; Gray, M. B.

    2004-12-01

    Development of our Hydrogeophysics Well Field has enabled new opportunities for field-based undergraduate research and active-learning at Bucknell University. Installed in 2001-2002, the on-campus well field has become a cornerstone of field labs for hydrogeology and applied geophysics courses, and for introductory labs in engineering and environmental geology. In addition to enabling new field experiences, the well field serves as a meeting place for students and practicing geoscientists. In the last three years, we have hosted field demonstrations by alumni working in the environmental, geophysical, and water-well drilling industries; researchers from government agencies; graduate students from other universities; and geophysical equipment vendors seeking to test and demonstrate new instruments. Coordinating undergraduate research and practical course labs with field experiments led by alumni and practicing geoscientists provides students hands-on experience with new technology while educating them about career and graduate-school opportunities. In addition to being an effective pedagogical strategy, these experiences are well received by students -- enrollment in our geophysics course has tripled in three years. The Bucknell Hydrogeophysics Well Field consists of five bedrock wells, installed in a fractured-rock aquifer in the Wills Creek Shale. The wells are open in the bedrock, facilitating geophysical and hydraulic measurements. To date, students have helped acquire from one or more wells: (1) open-hole slug- and aquifer-test data; (2) packer test data from isolated borehole intervals; (3) flow-meter logs; (4) acoustic and optical televiewer logs; (5) standard borehole logs including single-point resistance, caliper, and natural-gamma; (6) borehole video footage; (7) electrical resistivity tomograms; (8) water levels while drilling; and (9) water chemistry and temperature logs. 
Preliminary student-led data analysis indicates that sparse discrete fractures dominate the response of water levels to pumping. The three sets of fractures observed in the wells are consistent with those observed in outcrops around Bucknell: (1) bedding sub-parallel fractures; (2) joints; and (3) fractures parallel to rock cleavage. Efforts are ongoing to develop a CD-ROM of field data, photographs and video footage documenting the site and experiments; the CD is intended for publication as a "Virtual Field Laboratory" teaching tool for undergraduate hydrogeology and applied geophysics. We have seen the benefits of merging theory and practice in our undergraduate curriculum, and we seek to make these benefits available to other schools.

  11. Relative validation of Block Kids Food Screener for dietary assessment in children and adolescents.

    PubMed

    Hunsberger, Monica; O'Malley, Jean; Block, Torin; Norris, Jean C

    2015-04-01

    Food frequency questionnaires (FFQs) are less time-consuming and less expensive instruments for collecting dietary intake data than 24-h dietary recalls or doubly-labelled water; however, validation of FFQs is important, as incorrect information may lead to biased conclusions about associations. Therefore, the relative validity of the Block Kids Food Screener (BKFS), developed for use with children, was examined in a convenience sample of 99 youth recruited from the Portland, OR metropolitan area. Three 24-h dietary recalls served as the reference. The relative validity was analysed after natural log transformation of all variables except glycaemic index prior to correlation analysis. Daily cup equivalent totals from the BKFS and 'servings' from 24-h recalls were used to compute average daily intake of fruits, vegetables, potatoes, whole grains, legumes, meat/fish/poultry and dairy. Protein grams (g), total kilocalories, glycaemic index (glucose reference), glycaemic load (glucose reference), total saturated fat (g) and added sugar (g) were also calculated by each instrument. The correlation between data obtained from the two instruments was corrected for the within-subject variation in food intake reported by the 24-h recalls using standard nutritional assessment methodology. The de-attenuated correlations in nutritional intake between the two dietary assessment instruments ranged from 0.526 for vegetables to 0.878 for potatoes. The 24-h recall estimated higher levels of saturated fat and added sugar consumption, and higher glycaemic loads and glycaemic indices; the de-attenuated correlations of these measures ranged from 0.478 to 0.768. Assessment of Bland-Altman plots indicated no systematic difference between the two instruments for vegetable, dairy and meat/fish/poultry fat consumption. BKFS is a useful dietary assessment instrument for the nutrients and food groups it was designed to assess in children aged 10-17 years. © 2012 Blackwell Publishing Ltd.
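
The within-subject correction mentioned above has a simple closed form in the standard methodology: the observed FFQ-vs-recall correlation is inflated by sqrt(1 + lambda/k), where lambda is the ratio of within- to between-person variance in the recalls and k the number of recall days. A minimal sketch — the variance figures below are hypothetical, not from this study:

```python
import math

def deattenuated_r(r_observed, s2_within, s2_between, n_recalls):
    """Correct an FFQ-vs-recall correlation for day-to-day (within-person)
    variation in the reference recalls: r_true ~= r_obs * sqrt(1 + lam/k),
    where lam is the within/between variance ratio and k the recall days.
    """
    lam = s2_within / s2_between
    return r_observed * math.sqrt(1.0 + lam / n_recalls)

# hypothetical figures: observed r = 0.40, within/between ratio 2.0, 3 recall days
r_corrected = deattenuated_r(0.40, s2_within=2.0, s2_between=1.0, n_recalls=3)
```

With no within-person variation (lam = 0) the correction leaves the observed correlation unchanged, as expected.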

  12. Development of the Probing In-Situ with Neutron and Gamma Rays (PING) Instrument for Planetary Science Applications

    NASA Technical Reports Server (NTRS)

    Parsons, A.; Bodnarik, J.; Burger, D.; Evans, L.; Floyd, S.; Lim, L.; McClanahan, T.; Namkung, M.; Nowicki, S.; Schweitzer, J.

    2011-01-01

    The Probing In situ with Neutrons and Gamma rays (PING) instrument is a promising planetary science application of the active neutron-gamma ray technology that has been used successfully in oil field well logging and mineral exploration on Earth for decades. Similar techniques can be very powerful for non-invasive in situ measurements of the subsurface elemental composition on other planets. The objective of our active neutron-gamma ray technology program at NASA Goddard Space Flight Center (NASA/GSFC) is to bring instruments using this technology to the point where they can be flown on a variety of surface lander or rover missions to the Moon, Mars, Venus, asteroids, comets and the satellites of the outer planets. PING combines a 14 MeV deuterium-tritium pulsed neutron generator with a gamma ray spectrometer and two neutron detectors to produce a landed instrument that can determine the elemental composition of a planet down to 30 - 50 cm below the planet's surface. The penetrating nature of 0.5 - 10 MeV gamma rays and 14 MeV neutrons allows such sub-surface composition measurements to be made without the need to drill into or otherwise disturb the planetary surface, thus greatly simplifying the lander design. We are currently testing a PING prototype at a unique outdoor neutron instrumentation test facility at NASA/GSFC that provides two large (1.8 m x 1.8 m x 0.9 m) granite and basalt test formations placed outdoors in an empty field. Since an independent trace elemental analysis has been performed on both the Columbia River basalt and Concord Gray granite materials, these samples present two known standards with which to compare PING's experimentally measured elemental composition results. We will present experimental results from PING measurements of both the granite and basalt test formations and show how and why the optimum PING instrument operating parameters differ for studying the two materials.

  13. Green lumber grade yields from factory grade logs of three oak species

    Treesearch

    Daniel A. Yaussy

    1986-01-01

    Multivariate regression models were developed to predict green board foot yields for the seven common factory lumber grades processed from white, black, and chestnut oak factory grade logs. These models use the standard log measurements of grade, scaling diameter, log length, and proportion of scaling defect. Any combination of lumber grades (such as 1 Common and...

  14. HW Buck for Windows: the optimal hardwood log bucking decision simulator with expanded capabilities

    Treesearch

    James B. Pickens; Scott Noble; Blair Orr; Philip A. Araman; John E. Baumgras; Al Steele

    2006-01-01

    It has long been recognized that inappropriate placement of crosscuts when manufacturing hardwood logs from harvested stems (log bucking) reduces the value of logs produced. Recent studies have estimated losses in the range from 28% to 38% in the Lake States region. These estimates were developed by evaluating the bucking cuts chosen by harvesting crews and comparing...

  15. RAYSAW: a log sawing simulator for 3D laser-scanned hardwood logs

    Treesearch

    R. Edward Thomas

    2013-01-01

    Laser scanning of hardwood logs provides detailed high-resolution imagery of log surfaces. Characteristics such as sweep, taper, and crook, as well as most surface defects, are visible to the eye in the scan data. In addition, models have been developed that predict interior knot size and position based on external defect information. Computerized processing of...

  16. Spectral performance of Square Kilometre Array Antennas - II. Calibration performance

    NASA Astrophysics Data System (ADS)

    Trott, Cathryn M.; de Lera Acedo, Eloy; Wayth, Randall B.; Fagnoni, Nicolas; Sutinjo, Adrian T.; Wakley, Brett; Punzalan, Chris Ivan B.

    2017-09-01

    We test the bandpass smoothness performance of two prototype Square Kilometre Array (SKA) SKA1-Low log-periodic dipole antennas, SKALA2 and SKALA3 ('SKA Log-periodic Antenna'), and the current dipole from the Murchison Widefield Array (MWA) precursor telescope. Throughout this paper, we refer to the output complex-valued voltage response of an antenna when connected to a low-noise amplifier, as the dipole bandpass. In Paper I, the bandpass spectral response of the log-periodic antenna being developed for the SKA1-Low was estimated using numerical electromagnetic simulations and analysed using low-order polynomial fittings, and it was compared with the HERA antenna against the delay spectrum metric. In this work, realistic simulations of the SKA1-Low instrument, including frequency-dependent primary beam shapes and array configuration, are used with a weighted least-squares polynomial estimator to assess the ability of a given prototype antenna to perform the SKA Epoch of Reionisation (EoR) statistical experiments. This work complements the ideal estimator tolerances computed for the proposed EoR science experiments in Trott & Wayth, with the realized performance of an optimal and standard estimation (calibration) procedure. With a sufficient sky calibration model at higher frequencies, all antennas have bandpasses that are sufficiently smooth to meet the tolerances described in Trott & Wayth to perform the EoR statistical experiments, and these are primarily limited by an adequate sky calibration model and the thermal noise level in the calibration data. At frequencies of the Cosmic Dawn, which is of principal interest to SKA as one of the first next-generation telescopes capable of accessing higher redshifts, the MWA dipole and SKALA3 antenna have adequate performance, while the SKALA2 design will impede the ability to explore this era.
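
Paper I fits the simulated bandpass with low-order polynomials, and this work uses a weighted least-squares polynomial estimator. A toy version — not the SKA calibration pipeline; the normal-equation solve below is simply the textbook route, with weights taken as inverse noise variance:

```python
def weighted_polyfit(freqs, bandpass, sigma, degree=2):
    """Fit bandpass(f) with a low-order polynomial by weighted least squares,
    weighting each channel by 1/sigma^2 (inverse noise variance).
    Solves the normal equations (X^T W X) c = X^T W y by Gaussian elimination.
    """
    w = [1.0 / (s * s) for s in sigma]
    n = degree + 1
    # weighted power sums form the normal-equation system
    A = [[sum(wi * f ** (i + j) for wi, f in zip(w, freqs)) for j in range(n)]
         for i in range(n)]
    b = [sum(wi * f ** i * y for wi, f, y in zip(w, freqs, bandpass)) for i in range(n)]
    for col in range(n):                      # elimination with partial pivoting
        piv = max(range(col, n), key=lambda row: abs(A[row][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for row in range(col + 1, n):
            factor = A[row][col] / A[col][col]
            for c in range(col, n):
                A[row][c] -= factor * A[col][c]
            b[row] -= factor * b[col]
    coeffs = [0.0] * n
    for row in range(n - 1, -1, -1):          # back substitution
        tail = sum(A[row][c] * coeffs[c] for c in range(row + 1, n))
        coeffs[row] = (b[row] - tail) / A[row][row]
    return coeffs  # [c0, c1, c2, ...] for c0 + c1*f + c2*f^2 + ...

# sanity check: an exactly quadratic "bandpass" is recovered
freqs = [0.0, 1.0, 2.0, 3.0, 4.0]
bp = [1.0 + 2.0 * f + 3.0 * f * f for f in freqs]
coeffs = weighted_polyfit(freqs, bp, sigma=[1.0] * len(freqs))
```

In practice the residual between the bandpass and its low-order fit, relative to the thermal noise in the calibration data, is what sets the spectral-smoothness tolerance the paper evaluates.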

  17. The Swift/BAT Hard X-Ray Survey

    NASA Technical Reports Server (NTRS)

    Tueller, Jack; Markwardt, C. B.; Mushotzky, R. F.; Barthelmy, S. D.; Gehrels, N.; Krimm, H. A.; Skinner, G. K.; Falcone, A.; Kennea, J. A.

    2006-01-01

    The BAT instrument on Swift is a wide field (70 deg. x 100 deg.) coded aperture instrument with a CdZnTe detector array sensitive to energies of 14-200 keV. Each day, the BAT survey typically covers 60% of the sky to a detection limit of 30 millicrab. BAT makes hard X-ray light curves of similar sensitivity and coverage to the X-ray light curves from XTE/ASM, but in an energy range where sources show remarkably different behavior. Integrating the BAT data produces an all-sky map with a source detection limit at 15 months of a few x 10(exp -11) ergs per square centimeter per second, depending on the exposure. This is the first uniform all-sky survey at energies high enough to be unaffected by absorption since HEAO 1 in 1977-8. BAT has detected greater than 200 AGN and greater than 180 galactic sources. At high galactic latitudes, the BAT sources are usually easy to identify, but many are heavily absorbed and there are a few quite surprising identifications. The BAT-selected galaxies can be used to calculate LogN/LogS and the luminosity function for AGN, which are complete and free from common systematics. Several crucial parameters for understanding the cosmic hard X-ray background are now determined.
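
The LogN/LogS relation mentioned here is just the cumulative source count N(>=S) plotted against flux on log axes. A minimal sketch — generic, not the BAT survey code, and the fluxes below are made up:

```python
import math

def log_n_log_s(fluxes):
    """Return (log10 S, log10 N(>=S)) points for a list of source fluxes.

    Sorting fluxes in descending order makes the cumulative count at the
    i-th brightest source simply i+1; a Euclidean, non-evolving population
    follows a slope of -1.5 in these coordinates.
    """
    ordered = sorted(fluxes, reverse=True)
    return [(math.log10(s), math.log10(i + 1)) for i, s in enumerate(ordered)]

# made-up fluxes in arbitrary units
points = log_n_log_s([10.0, 5.0, 2.0, 1.0])
```

A real survey analysis would additionally weight each source by the sky area over which it could have been detected, since the exposure (and hence the flux limit) varies across the map.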

  18. Aspects of fluency in writing.

    PubMed

    Uppstad, Per Henning; Solheim, Oddny Judith

    2007-03-01

    The notion of 'fluency' is most often associated with spoken-language phenomena such as stuttering. The present article investigates the relevance of considering fluency in writing. The basic argument for raising this question is empirical: it follows from a focus on difficulties in written and spoken language as manifestations of different problems which should be investigated separately on the basis of their symptoms. Key-logging instruments provide new possibilities for the study of writing. The obvious use of this new technology is to study writing as it unfolds in real time, instead of focusing only on aspects of the end product. A more sophisticated application is to exploit the key-logging instrument in order to test basic assumptions of contemporary theories of spelling. The present study is a dictation task involving words and non-words, intended to investigate spelling in nine-year-old pupils with regard to their mastery of the doubling of consonants in Norwegian. In this study, we report on differences in temporal measures between a group of strong writers and a group of poor ones. On the basis of these pupils' writing behavior, the relevance of the concept of 'fluency' in writing is highlighted. The interpretation of the results questions basic assumptions of the cognitive hypothesis about spelling; the article concludes by hypothesizing a different conception of spelling.

  19. Validation of a computerized 24-hour physical activity recall (24PAR) instrument with pattern-recognition activity monitors.

    PubMed

    Calabro, Miguel A; Welk, Gregory J; Carriquiry, Alicia L; Nusser, Sarah M; Beyler, Nicholas K; Mathews, Charles E

    2009-03-01

    The purpose of this study was to examine the validity of a computerized 24-hour physical activity recall instrument (24PAR). Participants (n=20) wore 2 pattern-recognition activity monitors (an IDEEA and a SenseWear Pro Armband) for a 24-hour period and then completed the 24PAR the following morning. Participants completed 2 trials, 1 while maintaining a prospective diary of their activities and 1 without a diary. The trials were counterbalanced and completed within a week from each other. Estimates of energy expenditure (EE) and minutes of moderate-to-vigorous physical activity (MVPA) were compared with the criterion measures using 3-way (method by gender by trial) mixed-model ANOVA analyses. For EE, pairwise correlations were high (r>.88), and there were no differences in estimates across methods. Estimates of MVPA were more variable, but correlations were still in the moderate to high range (r>.57). Average activity levels were significantly higher on the logging trial, but there was no significant difference in the accuracy of self-report on days with and without logging. The results of this study support the overall utility of the 24PAR for group-level estimates of daily EE and MVPA.

  20. Developments in Quantitative Structure-Activity Relationships (QSAR). A Review

    DTIC Science & Technology

    1976-07-01

    Recoverable index fragments: growth inhibition of Aspergillus niger and Botrytis cinerea by nitrostyrenes and phenyl methacrylates; binding to bovine hemoglobin and bovine serum albumin; correlations of log 1/I with log P and log kw.

  1. NanTroSEIZE observatories: Installation of long-term borehole monitoring systems offshore the Kii Peninsula, Japan

    NASA Astrophysics Data System (ADS)

    Kopf, A.; Saffer, D. M.; Davis, E. E.; Araki, E.; Kinoshita, M.; Lauer, R. M.; Wheat, C. G.; Kitada, K.; Kimura, T.; Toczko, S.; Eguchi, N. O.; Science Parties, E.

    2010-12-01

    The IODP Nankai Trough Seismogenic Zone Experiment (NanTroSEIZE) is a multi-expedition drilling program designed to investigate fault mechanics, fault slip behavior, and strain accumulation along subduction megathrusts, through coring, logging, and long-term monitoring experiments. One key objective is the development and installation of a borehole observatory network extending from locations above the outer, presumably aseismic accretionary wedge to the seismogenic and interseismically locked plate interface, to record seismicity and slip transients, monitor strain accumulation, document hydraulic transients associated with deformation events, and quantify in situ pore fluid pressure and temperature. As part of recent NanTroSEIZE operations, borehole instruments have been developed for deployment at two sites: (1) Site C0010, which penetrates a major out-of-sequence thrust fault termed the "megasplay" at ca. 400 mbsf, and (2) Site C0002 in the Kumano forearc basin at a location that overlies both the updip edge of the inferred interseismically locked portion of the plate interface, and clusters of very low frequency thrust and reverse earthquakes located within the accretionary prism and potentially on the megasplay fault. In 2009, Site C0010 was drilled and cased with screens to access the megasplay fault, and a simple pore pressure and temperature monitoring system (a "smartplug") was installed. The observatory unit comprises pressure and temperature sensors and a data logging package mounted beneath a mechanically set retrievable casing packer, with two pressure sensors: one in hydraulic communication with the formation through the casing screens below the packer, and the other open to the borehole above the packer to record hydrostatic reference pressure and ocean loading signals. Temperatures are recorded within the instrument package using a platinum thermometer and by a self-contained miniature temperature logger (MTL). 
In fall 2010, the smartplug will be retrieved and replaced with an upgraded instrument package that also includes an autonomous osmotic geochemical sampling system and a microbial colonization experiment. Fall 2010 operations will also drill and case Site C0002 to ca. 1000 m depth and install a newly developed multi-sensor permanent observatory system, which includes a volumetric strainmeter, a broadband seismometer, a tiltmeter, a thermistor string, and multi-level pore-pressure sensors. The strain, seismometer, and tilt sensors will be cemented within the basal mudstones of the Kumano basin, and pore pressure will be monitored both within the underlying accretionary prism and within the lower basin sediments. The observatory will ultimately be connected to the seafloor fiber-optic cable network DONET. Here, we report on the retrieval of the smartplug, installation and configuration of the new multi-sensor permanent observatory, and preliminary data obtained from the smartplug deployment.

  2. The influence on response of axial rotation of a six-group local-conductance probe in horizontal oil-water two-phase flow

    NASA Astrophysics Data System (ADS)

    Weihang, Kong; Lingfu, Kong; Lei, Li; Xingbin, Liu; Tao, Cui

    2017-06-01

    Water volume fraction is an important parameter in two-phase flow measurement, and its accurate measurement is an urgent task in horizontal oil field development and the optimization of oil production. Previous ring-shaped conductance water-cut meters cannot obtain response values corresponding to the oil field water conductivity for oil-water two-phase flow in horizontal oil-producing wells characterized by low liquid yield, low velocity and high water cut. Hence, an inserted axisymmetric array structure sensor, i.e. a six-group local-conductance probe (SGLCP), is proposed in this paper. Firstly, the electric field distributions generated by the exciting electrodes of the SGLCP are investigated by the finite element method (FEM), and the spatial sensitivity distributions of the SGLCP are analyzed for different separations between two electrodes and different axial rotation angles. Secondly, the numerical simulation responses of the SGLCP in horizontal segregated flow are calculated for different water cuts and heights of the water layer. Lastly, an SGLCP-based well logging instrument was developed, and experiments were carried out in a horizontal pipe with an inner diameter of 125 mm on the industrial-scale experimental multiphase flow setup in the Daqing Oilfield, China. In the experiments, different oil-water two-phase flows, mineralization degrees, temperatures and pressures were tested. The results obtained from the simulation experiments and simulated well experiments demonstrate that the SGLCP-based instrument has good response characteristics for measuring water conductivity under the different conditions mentioned above. The validity and reliability of obtaining response values corresponding to the water conductivity with the SGLCP-based instrument are verified by the experimental results. This work provides an effective technique for measuring the water volume fraction of oil-water two-phase flow in horizontal oil-producing wells.

  3. Impact of Lean on surgical instrument reduction: Less is more.

    PubMed

    Wannemuehler, Todd J; Elghouche, Alhasan N; Kokoska, Mimi S; Deig, Christopher R; Matt, Bruce H

    2015-12-01

    To determine whether instrument sets that are frequently used by multiple surgeons can be substantially reduced in size with consensus. Prospective quality improvement study using Lean Six Sigma for purposeful and consensual reduction of non-value-added instruments in adenotonsillectomy instrument sets. Value stream mapping was utilized to determine instrumentation usage and reprocessing workflow. Preintervention instrument utilization surveys allowed consensual and intelligent set reduction. Non-value-added instruments were targeted for waste elimination by placement in a supplemental set. Times for pre- and postintervention instrument assembly, Mayo setup, and surgery were collected for adenotonsillectomies. Postintervention satisfaction surveys of surgeons and staff were conducted. Adenotonsillectomy sets were reduced from 52 to 24 instruments. Median assembly times were significantly reduced from 8.4 to 4.7 minutes (P < .0001) with a set assembly cost reduction of 44%. Following natural log transformations, mean Mayo setup times were significantly reduced from 97.6 to 76.1 seconds (P < .0001), and mean operative times were not significantly affected (1,773 vs. 1,631 seconds, P > .05). The supplemental set was opened in only 3.6% of cases. Satisfaction was >90% regarding the intervention. Set build cost was reduced by $1,468.99 per set. Lean Six Sigma improves efficiency and reduces waste by empowering team members to improve their environment. Instrument set reduction is ideal for waste elimination because of tool accumulation over time and instrument obsolescence as newer technologies are adopted. Similar interventions could easily be applied to larger sinus, mastoidectomy, and spine sets. NA. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.

  4. Log evaluation in wells drilled with inverted oil emulsion mud. [GLOBAL program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, D.P.; Lacour-Gayet, P.J.; Suau, J.

    1981-01-01

    As greater use is made of inverted oil emulsion muds in the development of North Sea oil fields, the need for more precise log evaluation in this environment becomes apparent. This paper demonstrates an approach using the Dual Induction Log, taking into account invasion and boundary effects. Lithology and porosity are derived from the Formation Density or Litho-Density Log, Compensated Neutron Log, Sonic Log and the Natural Gamma Ray Spectrometry Log. The effect of invasion by the oil component of the mud filtrate is treated in the evaluation, and a measurement of Moved Water is made. Computations of petrophysical properties are implemented by means of the GLOBAL interpretation program, taking advantage of its capability of adaptation to any combination of logging sensors. 8 refs.

  5. Recent developments in blast furnace process control within British Steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, P.W.

    1995-12-01

    British Steel generally operates seven blast furnaces on four integrated works. All furnaces have been equipped with comprehensive instrumentation and data logging computers over the past eight years. The four Scunthorpe furnaces practice coal injection up to 170 kg/tHM (340 lb/THM), the remainder injecting oil at up to 100 kg/tHM (200 lb/THM). Distribution control is effected by Paul Wurth Bell-Less Tops on six of the seven furnaces, and Movable Throat Armour with bells on the remaining one. All have at least one sub-burden probe. The blast furnace operator has a vast quantity of data and signals to consider and evaluate when attempting to achieve the objective of providing a consistent supply of hot metal. Techniques have been, and are being, developed to assist the operator to interpret large numbers of signals. A simple operator guidance system has been developed to provide advice, based on current operating procedures and interpreted data. Further development will involve the use of a sophisticated Expert System software shell.

  6. Proposed standard-weight (W(s)) equations for kokanee, golden trout and bull trout

    USGS Publications Warehouse

    Hyatt, M.H.; Hubert, W.A.

    2000-01-01

    We developed standard-weight (W(s)) equations for kokanee (lacustrine Oncorhynchus nerka), golden trout (O. aguabonita), and bull trout (Salvelinus confluentus) using the regression-line-percentile technique. The W(s) equation for kokanee of 120-550 mm TL is log10 W(s) = -5.062 + 3.033 log10 TL, when W(s) is in grams and TL is total length in millimeters; the English-unit equivalent is log10 W(s) = -3.458 + 3.033 log10 TL, when W(s) is in pounds and TL is total length in inches. The W(s) equation for golden trout of 120-530 mm TL is log10 W(s) = -5.088 + 3.041 log10 TL, with the English-unit equivalent being log10 W(s) = -3.473 + 3.041 log10 TL. The W(s) equation for bull trout of 120-850 mm TL is log10 W(s) = -5.327 + 3.115 log10 TL, with the English-unit equivalent being log10 W(s) = -3.608 + 3.115 log10 TL.
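
The metric equations above apply directly; a small sketch (the helper name `ws_grams` and the example length are ours, not from the paper):

```python
import math

def ws_grams(total_length_mm, intercept, slope):
    """Standard weight in grams from log10 Ws = intercept + slope * log10 TL."""
    return 10.0 ** (intercept + slope * math.log10(total_length_mm))

# (intercept, slope) pairs from the metric equations in the abstract
KOKANEE = (-5.062, 3.033)
GOLDEN_TROUT = (-5.088, 3.041)
BULL_TROUT = (-5.327, 3.115)

w300 = ws_grams(300, *KOKANEE)  # standard weight of a 300-mm kokanee, ~283 g
```

Relative weight then follows as Wr = 100 * W / Ws for an observed weight W, which is how these equations are typically used to assess fish condition.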

  7. The NSF Earthscope USArray Instrumentation Network

    NASA Astrophysics Data System (ADS)

    Davis, G. A.; Vernon, F.

    2012-12-01

    Since 2004, the Transportable Array component of the USArray Instrumentation Network has collected high resolution seismic data in near real-time from over 400 geographically distributed seismic stations. The deployed footprint of the array has steadily migrated across the continental United States, starting on the west coast and gradually moving eastward. As the network footprint shifts, stations from various regional seismic networks have been incorporated into the dataset. In 2009, an infrasound and barometric sensor component was added to existing core stations and to all new deployments. The ongoing success of the project can be attributed to a number of factors, including reliable communications to each site, on-site data buffering, largely homogenous data logging hardware, and a common phase-locked time reference between all stations. Continuous data quality is ensured by thorough human and automated review of data from the primary sensors and over 24 state-of-health parameters from each station. The staff at the Array Network Facility have developed a number of tools to visualize data and troubleshoot problematic stations remotely. In the event of an emergency or maintenance on the server hardware, data acquisition can be shifted to alternate data centers through the use of virtualization technologies.

  8. Perspectives for on-line analysis of bauxite by neutron irradiation

    NASA Astrophysics Data System (ADS)

    Beurton, Gabriel; Ledru, Bertrand; Letourneur, Philippe

    1995-03-01

    The interest in bauxite as a major source of alumina results in a strong demand for on-line instrumentation suitable for sorting, blending, and processing operations at the bauxite mine and for monitoring instrumentation in the Bayer process. The results of laboratory experiments based on neutron interactions with bauxite are described. The technique was chosen in order to overcome the problem of spatial heterogeneity in bulk mineral analysis. The evaluated elements contributed to approximately 99.5% of the sample weight. In addition, the measurements provide valuable information on physical parameters such as density, hygrometry, and material flow. Using a pulsed generator, the analysis system offers potential for on-line measurements (borehole logging or conveyor belt). An overall description of the experimental set-up is given. The experimental data include measurements of natural radioactivity, delayed radioactivity induced by activation, and prompt gamma rays following neutron reaction. In situ applications of neutron interactions provide continuous analysis and produce results which are more statistically significant. The key factors contributing to advances in industrial applications are the development of high count rate gamma spectroscopy and computational tools to design measurement systems and interpret their results.

  9. Tractor-logging costs and production in old-growth redwood forests

    Treesearch

    Kenneth N. Boe

    1963-01-01

    A cost accounting analysis of full-scale logging operations in old-growth redwood during 2 years revealed that it cost $12.24 per M bd. ft. (gross Scribner log scale) to get logs on trucks. Road development costs averaged another $5.19 per M bd. ft. Felling-bucking production was calculated by average tree d.b.h. Both skidding and loading outputs per hour were...

  10. The presence and nature of ellipticity in Appalachian hardwood logs

    Treesearch

    R. Edward Thomas; John S. Stanovick; Deborah Conner

    2017-01-01

    The ellipticity of hardwood logs is most often observed and measured from either end of a log. However, due to the nature of hardwood tree growth and bucking practices, assessing ellipticity in this manner may not be accurate. Trees grown on hillsides often develop supporting wood that gives the first few feet of the log butt a significant degree of...

  11. Defects in Hardwood Veneer Logs: Their Frequency and Importance

    Treesearch

    E.S. Harrar

    1954-01-01

    Most southern hardwood veneer and plywood plants have some method of classifying logs by grade to control the purchase price paid for logs bought on the open market. Such log-grading systems have been developed by experience and are dependent to a large extent upon the ability of the grader and his knowledge of veneer grades and yields required for the specific product...

  12. Foundations of Intervention Research in Instrumental Practice

    PubMed Central

    Hatfield, Johannes L.; Lemyre, Pierre-Nicolas

    2016-01-01

    The goals of the present study are to evaluate, implement, and adapt psychological skills used in the realm of sports into music performance. This research project also aims to build foundations on how to implement future interventions to guide music students on how to optimize practice toward performance. A 2-month psychological skills intervention was provided to two students from the national music academy's bachelor program in music performance to better understand how to adapt and construct psychological skills training programs for performing music students. The program evaluated multiple intervention tools including the use of questionnaires, performance profiling, iPads, electronic practice logs, recording the perceived value of individual and combined work, as well as the effectiveness of different communication forms. Perceived effects of the intervention were collected through semi-structured interviews, observations, and logs. PMID:26834660

  13. From Ions to Bits - Developing the IT infrastructure around the CAMECA IMS 1280-HR SIMS lab at GFZ Potsdam

    NASA Astrophysics Data System (ADS)

    Galkin, A.; Klump, J.; Wiedenbeck, M.

    2012-04-01

    Secondary ion mass spectrometry (SIMS) is a highly sensitive technique for analyzing the surfaces of solids and thin-film samples, but it has the major drawback that such instruments are both rare and expensive. The Virtual SIMS project aims to design, develop, and operate the IT infrastructure around the CAMECA IMS 1280-HR SIMS at GFZ Potsdam. The system will cover the whole spectrum of procedures in the lab - from the online application for measurement time, to remote access to the instrument, and finally the maintenance of the data for publishing and future re-use. A virtual lab infrastructure around the IMS 1280 will enable remote access to the instrument and make measurement time available to the broadest possible user community. The envisioned IT infrastructure consists of the following: a web portal, a data repository, a sample repository, project management software, communication arrangements between the lab staff and the distant researcher, and remote access to the instruments. The web portal will handle online applications for measurement time. The data from the experiments, the monitoring sensor logs, and the lab logbook entries are to be stored and archived. Researchers will be able to access their data remotely in real time, which requires a user rights management structure. Also planned is that all samples and standards will be assigned a unique International GeoSample Number (IGSN) and that images of the samples will be stored and made accessible, along with any additional documents uploaded by the researcher. The project management application will schedule the application process, the measurement times, notifications, and alerts. A video conference capability is foreseen for communication between the Potsdam staff and the remote researcher. Remote access to the instruments requires a sophisticated client-server solution. 
This highly sensitive instrument has to be controlled in real time with latencies kept to a minimum. Failures and shortages of the internet connection, as well as possible outages on the client side, also have to be considered, and safe fallbacks for such events must be provided. The level of skill of the researcher remotely operating the instrument will define the scope of control given during an operating session. An important aspect of the project is the design of the virtual lab system in collaboration with the laboratory operators and the researchers who will use the instrument and its peripherals. Different approaches for the IT solutions will be tested and evaluated, so that improved guidelines can evolve from observed operating performance.

  14. Suomi Npp and Jpss Pre-Launch Test Data Collection and Archive

    NASA Astrophysics Data System (ADS)

    Denning, M.; Ullman, R.; Guenther, B.; Kilcoyne, H.; Chandler, C.; Adameck, J.

    2012-12-01

    During the development of each Suomi National Polar-orbiting Partnership (Suomi NPP) instrument, significant testing was performed, both in ambient and simulated orbital (thermal-vacuum) conditions, at the instrument factory, and again after integration with the spacecraft. The NPOESS Integrated Program Office (IPO), and later the NASA Joint Polar Satellite System (JPSS) Program Office, defined two primary objectives with respect to capturing instrument and spacecraft test data during these test events. The first objective was to disseminate test data and auxiliary documentation to an often distributed network of scientists to permit timely production of independent assessments of instrument performance, calibration, data quality, and test progress. The second goal was to preserve the data and documentation in a catalogued government archive for the life of the mission, to aid in the resolution of anomalies and to facilitate the comparison of on-orbit instrument operating characteristics to those observed prior to launch. In order to meet these objectives, Suomi NPP pre-launch test data collection, distribution, processing, and archive methods included adaptable support infrastructures to quickly and completely transfer test data and documentation from the instrument and spacecraft factories to sensor scientist teams on-site at the factory and around the country. These methods were unique, effective, and low in cost. These efforts supporting pre-launch instrument calibration permitted timely data quality assessments and technical feedback from contributing organizations within the government, academia, and industry, and were critical in supporting timely sensor development. 
Second, in parallel to data distribution to the sensor science teams, pre-launch test data were transferred and ingested into the central Suomi NPP calibration and validation (cal/val) system, known as the Government Resource for Algorithm Verification, Independent Testing, and Evaluation (GRAVITE), where they will reside for the life of the mission. As a result, data and documentation are available for query, analysis, and download by the cal/val community via the command-line GRAVITE Transfer Protocol (GTP) tool or via the NOAA-collaborative website "CasaNOSA". Instrument and spacecraft test data, telemetry, and ground support equipment information were collected and organized with detailed test procedures, logs, analyses, characterizations, and reports. This 45-terabyte archive facilitates the comparison of on-orbit Suomi NPP operating characteristics with those observed prior to launch, and will serve as a resource to aid in the assessment of pre-launch JPSS-1 sensor performance. In summary, this paper will present the innovative pre-launch test data campaign infrastructures employed for Suomi NPP and planned for JPSS-1.

  15. The XMM-Newton Wide-Field Survey in the COSMOS Field. II. X-Ray Data and the logN-logS Relations

    NASA Astrophysics Data System (ADS)

    Cappelluti, N.; Hasinger, G.; Brusa, M.; Comastri, A.; Zamorani, G.; Böhringer, H.; Brunner, H.; Civano, F.; Finoguenov, A.; Fiore, F.; Gilli, R.; Griffiths, R. E.; Mainieri, V.; Matute, I.; Miyaji, T.; Silverman, J.

    2007-09-01

    We present data analysis and X-ray source counts for the first season of XMM-Newton observations in the COSMOS field. The survey covers ~2 deg2 within the region of sky bounded by 09h57m30s

  16. Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.

    PubMed

    Sugino, T; Kawahira, H; Nakamura, R

    2014-09-01

    Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. Division of the surgical procedure, task progress analysis, and task efficiency analysis were performed. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate task efficiency during each stage. These analysis methods were evaluated in an experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis, developed for quantitative assessment of surgical procedures and surgical performance, may provide practical methods and metrics for objective evaluation of surgical expertise.
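The motion metrics named in the abstract (mean instrument speed and the approximate ellipse of the location log distribution) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the `motion_metrics` helper, the sampling interval `dt`, and the 1-sigma covariance-ellipse convention are all assumptions.

```python
import math

def motion_metrics(track, dt=1.0):
    """Mean speed and 1-sigma covariance-ellipse area for a 2-D
    instrument-tip track sampled at a fixed interval dt
    (hypothetical units; a stand-in for the paper's metrics)."""
    n = len(track)
    # mean speed from successive displacements
    speed = sum(
        math.dist(track[i], track[i + 1]) for i in range(n - 1)
    ) / ((n - 1) * dt)
    # covariance of the location log distribution
    mx = sum(p[0] for p in track) / n
    my = sum(p[1] for p in track) / n
    sxx = sum((p[0] - mx) ** 2 for p in track) / n
    syy = sum((p[1] - my) ** 2 for p in track) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in track) / n
    # eigenvalues of the 2x2 covariance matrix give the squared semi-axes
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc
    area = math.pi * math.sqrt(max(lam1, 0.0)) * math.sqrt(max(lam2, 0.0))
    return speed, area
```

For a square track of unit-length segments sampled once per second, the mean speed is 1.0 and the ellipse area is pi/4 in these hypothetical units.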

  17. The NetQuakes Project - Research-quality Seismic Data Transmitted via the Internet from Citizen-hosted Instruments (Invited)

    NASA Astrophysics Data System (ADS)

    Luetgert, J. H.; Oppenheimer, D. H.; Hamilton, J.

    2010-12-01

    The USGS seeks accelerograph spacing of 5-10 km in selected urban areas of the US to obtain spatially un-aliased recordings of strong ground motions during large earthquakes. These dense measurements will improve our ability to make rapid post-earthquake assessments of expected damage and contribute to the continuing development of engineering standards for construction. To achieve this goal the USGS and its university partners are deploying “NetQuakes” seismographs, designed to record moderate to large earthquakes from the near field to about 100 km. The instruments have tri-axial Colibrys 2005SF MEMS sensors, clip at 3g, and have 18-bit resolution. These instruments are uniquely designed for deployment in private homes, businesses, public buildings and schools where there is an existing Broadband connection to the Internet. The NetQuakes instruments connect to a local network using WiFi and then via the Internet to USGS servers to a) upload triggered accelerograms in miniSEED format, P arrival times, and computed peak ground motion parameters immediately after an earthquake; b) download software updates; c) respond to requests for log files, execute UNIX scripts, and upload waveforms from long-term memory for quakes with peak motions below the trigger threshold; d) send state-of-health (SOH) information in XML format every 10 minutes; and e) synchronize instrument clocks to 1ms accuracy using the Network Time Protocol. NetQuakes instruments cost little to operate and save about $600/yr/site compared to instruments that transmit data via leased telemetry. After learning about the project through press releases, thousands of citizens have registered to host an instrument at http://earthquake.usgs.gov/netquakes using a Google Map interface that depicts where we seek instrument sites. The website also provides NetQuakes hosts access to waveform images recorded by instruments installed in their building. 
Since 3/2009, the NetQuakes project has installed over 100 instruments in the San Francisco Bay area, over 30 in the Seattle region, and 20 elsewhere in the US. Five instruments are also deployed in the San Francisco Bay region on San Pablo Dam, operated by the East Bay Municipal Utility District (EBMUD). These instruments provide cost-effective monitoring for EBMUD through free Internet telemetry: the USGS monitors instrument SOH, performs all data processing and archiving, and transmits recorded shaking levels to the dam operators via ShakeCast. EBMUD allows the strong motion data from their instruments to be freely available for use by the seismological and engineering communities. The NetQuakes project expects to install 350 instruments by the end of 2011.

  18. Automated lithology prediction from PGNAA and other geophysical logs.

    PubMed

    Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T

    2006-02-01

    Different methods of lithology prediction from geophysical data have been developed in the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma), and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, which is based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology: a success rate of 73% was achieved from PGNAA logging data alone. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.
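The record does not describe LogTrans's internals; as a hedged stand-in for statistical lithology prediction from log responses, a minimal nearest-centroid classifier over log-feature vectors might look like the sketch below. The lithology names and feature values are invented for illustration.

```python
import math

def train_centroids(samples):
    """samples: {lithology: [feature vectors]} -> per-class centroids.
    Features could be any log responses (e.g. density, sonic, PGNAA
    element ratios); here they are unlabeled illustrative numbers."""
    return {
        lith: [sum(col) / len(vecs) for col in zip(*vecs)]
        for lith, vecs in samples.items()
    }

def classify(centroids, vec):
    """Assign a depth sample to the lithology with the nearest centroid."""
    return min(centroids, key=lambda lith: math.dist(centroids[lith], vec))
```

A program such as LogTrans uses a more sophisticated statistical discriminant, but the workflow (train on labeled intervals, then classify each logged depth) is the same shape.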

  19. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is offline post-processing of such logs after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed specifically to assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
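As a rough illustration of the kind of data-parameterized, rule-based checking such a monitor performs (this is not LogScope's actual specification language; `check_log` and the event names are hypothetical), a monitor for the property "every trigger event must eventually be acknowledged with the same parameter" could be sketched as:

```python
def check_log(events, trigger, ack):
    """Tiny stand-in for a rule-based log monitor: every `trigger`
    event must be followed by a matching `ack` carrying the same data
    parameter before the end of the log. Returns violation messages."""
    pending = {}          # parameter value -> index of unacknowledged trigger
    violations = []
    for i, (name, param) in enumerate(events):
        if name == trigger:
            if param in pending:  # re-triggered before acknowledgment
                violations.append(
                    f"{trigger}({param}) at {pending[param]} never acknowledged"
                )
            pending[param] = i
        elif name == ack:
            pending.pop(param, None)
    violations += [
        f"{trigger}({p}) at {i} never acknowledged" for p, i in pending.items()
    ]
    return violations
```

Tracking the parameter value per pending trigger is what a plain state machine cannot do directly, which is the expressiveness gap the abstract alludes to.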

  20. Development of Cross-Platform Software for Well Logging Data Visualization

    NASA Astrophysics Data System (ADS)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil and gas field analysis and is of great importance in field development and operation. It is therefore important to have software that accurately and clearly presents the processed data to the user in the form of well logs. In this work, a software product has been developed that not only has the basic functionality for this task (loading data from .las files, well log curve display, etc.) but can also run on different operating systems and devices. In the article, a subject field analysis and task formulation are performed, and the software design stage is considered. At the end of the work, the resulting software product's interface is described.
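The loading step mentioned in the abstract (reading curve data from .las files) can be sketched as below. This is a deliberately minimal reader for the ~A (ASCII data) section of a LAS 2.0 file only; a real implementation must also parse the header sections (~V, ~W, ~C) for curve names, units, and the NULL value.

```python
def read_las_curves(text):
    """Return the ~A section of a LAS 2.0 well-log file as rows of
    floats. Minimal sketch: ignores header sections, NULL values,
    and wrapped-line mode, all of which real files require."""
    rows, in_data = [], False
    for line in text.splitlines():
        line = line.strip()
        if line.upper().startswith("~A"):   # start of the ASCII data block
            in_data = True
            continue
        if in_data and line and not line.startswith("#"):
            rows.append([float(v) for v in line.split()])
    return rows
```

Each row is one depth step; the first column is conventionally depth, with the remaining columns in the order declared in the ~C section.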

  1. Prediction of gas/particle partitioning of polybrominated diphenyl ethers (PBDEs) in global air: a theoretical study

    NASA Astrophysics Data System (ADS)

    Li, Y.-F.; Ma, W.-L.; Yang, M.

    2014-09-01

    Gas/particle (G / P) partitioning of most semivolatile organic compounds (SVOCs) is an important process that primarily governs their atmospheric fate, long-range atmospheric transport potential, and their routes into the human body. All previous studies of this issue have assumed equilibrium conditions, and their results do not predict monitoring results well in most cases. In this study, a steady-state model instead of an equilibrium-state model for the investigation of the G / P partitioning behavior of polybrominated diphenyl ethers (PBDEs) was established, and an equation for calculating the partition coefficients under steady state (KPS) for PBDE congeners (log KPS = log KPE + logα) was developed, in which an equilibrium term (log KPE = log KOA + logfOM - 11.91, where fOM is the organic matter content of the particles) and a nonequilibrium term (logα, mainly caused by dry and wet deposition of particles), both being functions of log KOA (the octanol-air partition coefficient), are included; equilibrium is a special case of steady state in which the nonequilibrium term equals zero. A criterion to classify the equilibrium and nonequilibrium status of PBDEs was also established using two threshold values of log KOA, log KOA1 and log KOA2, which divide the range of log KOA into 3 domains: equilibrium, nonequilibrium, and maximum partition domains; accordingly, two threshold values of temperature t, tTH1 when log KOA = log KOA1 and tTH2 when log KOA = log KOA2, were identified, which divide the range of temperature into the same 3 domains for each BDE congener. We predicted the existence of the maximum partition domain (in which log KPS reaches a maximum constant of -1.53) that every PBDE congener can reach when log KOA ≥ log KOA2, or t ≤ tTH2. 
The novel equation developed in this study was applied to predict the G / P partition coefficients of PBDEs for the published monitoring data worldwide, including Asia, Europe, North America, and the Arctic, and the results matched all the monitoring data well, except those obtained at e-waste sites, due to the unpredictable PBDE emissions at these sites. This study provided evidence that the newly developed steady-state-based equation is superior to the equilibrium-state-based equation that has been used for decades to describe G / P partitioning behavior. We suggest that the investigation of G / P partitioning behavior for PBDEs should be based on steady state, not equilibrium state, and that equilibrium is just a special case of steady state in which nonequilibrium factors can be ignored. We also believe that our new equation provides a useful tool for environmental scientists in both monitoring and modeling research on G / P partitioning of PBDEs and can be extended to predict G / P partitioning behavior for other SVOCs as well.
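The abstract's steady-state equation can be turned into a small calculator. The equilibrium constant -11.91 and the maximum-partition cap of -1.53 are taken directly from the abstract; the functional form of the nonequilibrium term logα is not given in this record, so the sketch takes it as a caller-supplied value (0 recovers the pure equilibrium case), and the function name `log_kps` is an assumption.

```python
import math

def log_kps(log_koa, f_om, log_alpha=0.0, log_kp_max=-1.53):
    """Steady-state gas/particle partition coefficient:
        log KPS = log KPE + log(alpha)
    with the equilibrium term log KPE = log KOA + log fOM - 11.91
    and the maximum-partition-domain ceiling of -1.53 (both constants
    from the abstract). log(alpha) must be supplied by the caller."""
    log_kpe = log_koa + math.log10(f_om) - 11.91
    return min(log_kpe + log_alpha, log_kp_max)
```

For a high-brominated congener (large log KOA) the ceiling binds and log KPS stays at -1.53, which is the "maximum partition domain" the study predicts.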

  2. Engineering changes to the 0.1m cryogenic wind tunnel at Southampton University

    NASA Technical Reports Server (NTRS)

    Goodyer, M. J.

    1984-01-01

    The more important changes to the 0.1 m cryogenic wind tunnel since its completion in 1977 are outlined. These include detailed improvements in the fan drive to allow higher speeds, and the provision of a test section leg suitable for use with a magnetic suspension and balance system. The instrumentation, data logging, data reduction and tunnel controls were also improved and modernized. A tunnel performance summary is given.

  3. Long-Term Autonomous Measurement of Ocean Dissipation with EPS-MAPPER

    DTIC Science & Technology

    2002-09-30

    profiler merges two well-established instruments, EPSONDE (Oakey, 1988) and Seahorse (Hamilton et al, 1999). The EPSONDE ocean-microstructure technology...will be repackaged with modernized electronics and data logging memory and used as the payload for the Seahorse moored profiler. APPROACH The...mounting to decouple the SeaHorse motions from the profiler. SeaHorseTM uses wave energy to move the profiler down a mooring wire to a docked

  4. History of the Combat Zone Tax Exclusion

    DTIC Science & Technology

    2011-09-01

    and Accounting Service (DFAS), Military Pay Tables, 1943 and 1945. Note: Minimum and maximum pay values vary within grades due to a member’s years of...Horowitz, Task Leader Log: H 11-001279 Approved for public release; distribution is unlimited. The Institute for Defense Analyses is a non-profit ...instrumental to the functioning of a fair tax system for members of the armed services. Despite its historical ties to wartime finance, the income tax

  5. PBF Reactor Building (PER620). After lowering reactor vessel onto blocks, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). After lowering reactor vessel onto blocks, it is rolled on logs into PBF. Metal framework under vessel is handling device. Various penetrations in reactor bottom were for instrumentation, poison injection, drains. Large one, below center "manhole" was for primary coolant. Photographer: Larry Page. Date: February 13, 1970. INEEL negative no. 70-736 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  6. Veneer recovery from peeler-grade Douglas-fir logs in northwestern Oregon.

    Treesearch

    E.H. Clarke; A.C. Knauss

    1957-01-01

    Continued expansion of Douglas-fir plywood production develops a greater demand for all grades of peelable logs to supply the industry. Thus, it becomes increasingly important to know the veneer-grade recovery expected so that timber and log values can be more accurately appraised.

  7. Petrophysical analysis of geophysical logs of the National Drilling Company-U.S. Geological Survey ground-water research project for Abu Dhabi Emirate, United Arab Emirates

    USGS Publications Warehouse

    Jorgensen, Donald G.; Petricola, Mario

    1994-01-01

    A program of borehole-geophysical logging was implemented to supply geologic and geohydrologic information for a regional ground-water investigation of Abu Dhabi Emirate. Analysis of geophysical logs was essential to provide information on geohydrologic properties because drill cuttings were not always adequate to define lithologic boundaries. The standard suite of logs obtained at most project test holes consisted of caliper, spontaneous potential, gamma ray, dual induction, microresistivity, compensated neutron, compensated density, and compensated sonic. Ophiolitic detritus from the nearby Oman Mountains has unusual petrophysical properties that complicated the interpretation of geophysical logs. The density of coarse ophiolitic detritus is typically greater than 3.0 grams per cubic centimeter, porosity values are large, often exceeding 45 percent, and the clay fraction includes unusual clays, such as lizardite. Neither the spontaneous-potential log nor the natural gamma-ray log was a usable clay indicator. Because intrinsic permeability is a function of clay content, additional research into determining clay content was critical. A research program of geophysical logging was conducted to determine the petrophysical properties of the shallow subsurface formations. The logging included spectral-gamma and thermal-decay-time logs. These logs, along with the standard geophysical logs, were correlated to mineralogy and whole-rock chemistry as determined from sidewall cores. Thus, interpretation of lithology and fluids was accomplished. Permeability and specific yield were calculated from geophysical-log data and correlated to results from an aquifer test. On the basis of results from the research logging, a method of lithologic and water-resistivity interpretation was developed for the test holes at which the standard suite of logs was obtained. In addition, a computer program was developed to assist in the analysis of log data. 
Geohydrologic properties were estimated, including volume of clay matrix, volume of matrix other than clay, density of matrix other than clay, density of matrix, intrinsic permeability, specific yield, and specific storage. Geophysical logs were used to (1) determine lithology, (2) correlate lithologic and permeable zones, (3) calibrate seismic reprocessing, (4) calibrate transient-electromagnetic surveys, and (5) calibrate uphole-survey interpretations. Logs were used at the drill site to (1) determine permeability zones, (2) determine dissolved-solids content, which is a function of water resistivity, and (3) design wells accordingly. Data and properties derived from logs were used to determine transmissivity and specific yield of aquifer materials.
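One of the simplest relations behind the matrix-density estimates listed above is the standard density-log response equation. The sketch below solves it for matrix density; it is a generic textbook relation, not the report's calibrated Abu Dhabi workflow, and the default fluid density of 1.0 g/cm3 is an assumption.

```python
def matrix_density(rho_bulk, porosity, rho_fluid=1.0):
    """Solve the standard density-log response
        rho_bulk = porosity * rho_fluid + (1 - porosity) * rho_matrix
    for the matrix density, in g/cm^3. Generic relation only; the
    report's workflow also used neutron, sonic, and spectral logs."""
    return (rho_bulk - porosity * rho_fluid) / (1.0 - porosity)
```

With a bulk density of 2.2 g/cm3 and 45 percent water-filled porosity, the implied matrix density is about 3.18 g/cm3, consistent with the dense ophiolitic detritus described in the abstract.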

  8. Scientific results from Gulf of Mexico Gas Hydrates Joint Industry Project Leg 1 drilling: Introduction and overview

    USGS Publications Warehouse

    Ruppel, C.; Boswell, R.; Jones, E.

    2008-01-01

    The Gulf of Mexico Gas Hydrates Joint Industry Project (JIP) is a consortium of production and service companies and some government agencies formed to address the challenges that gas hydrates pose for deepwater exploration and production. In partnership with the U.S. Department of Energy and with scientific assistance from the U.S. Geological Survey and academic partners, the JIP has focused on studies to assess hazards associated with drilling the fine-grained, hydrate-bearing sediments that dominate much of the shallow subseafloor in the deepwater (>500 m) Gulf of Mexico. In preparation for an initial drilling, logging, and coring program, the JIP sponsored a multi-year research effort that included: (a) the development of borehole stability models for hydrate-bearing sediments; (b) exhaustive laboratory measurements of the physical properties of hydrate-bearing sediments; (c) refinement of new techniques for processing industry-standard 3-D seismic data to constrain gas hydrate saturations; and (d) construction of instrumentation to measure the physical properties of sediment cores that had never been removed from in situ hydrostatic pressure conditions. Following review of potential drilling sites, the JIP launched a 35-day expedition in Spring 2005 to acquire well logs and sediment cores at sites in Atwater Valley lease blocks 13/14 and Keathley Canyon lease block 151 in the northern Gulf of Mexico minibasin province. The Keathley Canyon site has a bottom simulating reflection at ~392 m below the seafloor, while the Atwater Valley location is characterized by seafloor mounds with an underlying upwarped seismic reflection consistent with upward fluid migration and possible shoaling of the base of the gas hydrate stability (BGHS). 
No gas hydrate was recovered at the drill sites, but logging data, and to some extent cores, suggest the occurrence of gas hydrate in inferred coarser-grained beds and fractures, particularly between 220 and 330 m below the seafloor at the Keathley Canyon site. This paper provides an overview of the results of the initial phases of the JIP work and introduces the 15 papers that make up this special volume on the scientific results related to the 2005 logging and drilling expedition.

  9. Mathematical model of a smoldering log.

    Treesearch

    Fernando de Souza Costa; David Sandberg

    2004-01-01

    A mathematical model is developed describing the natural smoldering of logs. The model considers the steady one-dimensional propagation of infinitesimally thin fronts of drying, pyrolysis, and char oxidation in a horizontal semi-infinite log. Expressions for the burn rates, distribution profiles of temperature, and positions of the drying, pyrolysis, and smoldering fronts...

  10. Development of Mechanistic Flexible Pavement Design Concepts for the Heavyweight F-15 Aircraft

    DTIC Science & Technology

    1986-01-01

    UPON REPETITION - VERTICAL SUBGRADE STRAIN RESULTS 131 B-1 AC TENSILE STRAIN VERSUS AC THICKNESS, VARYING GRANULAR BASE THICKNESS 204 B... Log SR = 0.8243 - .4095(Log TAC)(Log E*AC) - .0110(TGR/Log TAC) + .0132(ERi) - .3811(Log ERi) R2 = .947 SEE = .0506 (1.124) R2 = .923 SEE = .06 Log DO...Thickness. 204

  11. Time-location analysis for exposure assessment studies of children using a novel global positioning system instrument.

    PubMed Central

    Elgethun, Kai; Fenske, Richard A; Yost, Michael G; Palcisko, Gary J

    2003-01-01

    Global positioning system (GPS) technology is used widely for business and leisure activities and offers promise for human time-location studies to evaluate potential exposure to environmental contaminants. In this article we describe the development of a novel GPS instrument suitable for tracking the movements of young children. Eleven children in the Seattle area (2-8 years old) wore custom-designed data-logging GPS units integrated into clothing. Location data were transferred into geographic information systems software for map overlay, visualization, and tabular analysis. Data were grouped into five location categories (in vehicle, inside house, inside school, inside business, and outside) to determine time spent and percentage reception in each location. Additional experiments focused on spatial resolution, reception efficiency in typical environments, and sources of signal interference. Significant signal interference occurred only inside concrete/steel-frame buildings and inside a power substation. The GPS instruments provided adequate spatial resolution (typically about 2-3 m outdoors and 4-5 m indoors) to locate subjects within distinct microenvironments and distinguish a variety of human activities. Reception experiments showed that location could be tracked outside, proximal to buildings, and inside some buildings. Specific location information could identify movement in a single room inside a home, on a playground, or along a fence line. The instrument, worn in a vest or in bib overalls, was accepted by children and parents. Durability of the wiring was improved early in the study to correct breakage problems. The use of GPS technology offers a new level of accuracy for direct quantification of time-location activity patterns in exposure assessment studies. PMID:12515689

  12. The development of a full-digital and networkable multi-media based highway information system : phase 1

    DOT National Transportation Integrated Search

    1999-07-26

    This report covers the development of a Multimedia Based Highway Information System (MMHIS). MMHIS extends the capabilities of current photo logging facilities. Photographic logging systems used by highway agencies provide engineers with information ...

  13. Research and development of improved geothermal well logging techniques, tools and components (current projects, goals and status). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamers, M.D.

    One of the key needs in the advancement of geothermal energy is availability of adequate subsurface measurements to aid the reservoir engineer in the development and operation of geothermal wells. Some current projects being sponsored by the U. S. Department of Energy's Division of Geothermal Energy pertaining to the development of improved well logging techniques, tools and components are described. An attempt is made to show how these projects contribute to improvement of geothermal logging technology in forming key elements of the overall program goals.

  14. Subsurface Rock Physical Properties by Downhole Loggings - Case Studies of Continental Deep Drilling in Kanto District, Japan

    NASA Astrophysics Data System (ADS)

    Omura, K.

    2014-12-01

    In recent years, many examples of physical logging have been carried out in deep boreholes. Loggings are direct in-situ measurements of rock physical properties underground, and they provide significant basic data for geological, geophysical, and geotechnical investigations, e.g., tectonic history, seismic wave propagation, and ground-motion prediction. Since about the 1980s, the National Research Institute for Earth Science and Disaster Prevention (NIED) has drilled deep boreholes (200 m to 3000 m depth) in the sedimentary basin of the Kanto district, Japan, to install seismographs and hydrological instruments and to measure in-situ stress and pore pressure. Downhole physical loggings were conducted in these boreholes: spontaneous potential, electrical resistance, elastic wave velocity, formation density, neutron porosity, total gamma ray, caliper, and temperature loggings. In many cases, digital data values were provided every 2 m, 1 m, or 0.1 m; in other cases, we obtained digital values by reading printed graphs of the logging plots. Data from about 30 boreholes are compiled. In particular, changes in the logging data at the interface between the shallow part (soft sedimentary rock) and the base rock (equivalent to hard pre-Neogene rock) are examined. In this presentation, the correlations among rock physical properties (especially formation density, elastic wave velocity, and electrical resistance) are introduced and their relation to lithology is discussed. The formation density, elastic wave velocity, and electrical resistance data divide into two groups, above and below a formation density of 2.5 g/cm3: one corresponds to the shallow part and the other to the base rock. In each group, elastic wave velocity and electrical resistance increase with increasing formation density, but the rates of increase in the shallow part are smaller than in the base rock. 
The shallow part has a lower degree of solidification and higher porosity than the base rock; these differences appear to be related to the differing rates of increase. The present data show that physical logging data are effective for determining where the base rock is and how its properties differ from those of the shallow part.

  15. Detecting well casing leaks in Bangladesh using a salt spiking method

    USGS Publications Warehouse

    Stahl, M.O.; Ong, J.B.; Harvey, C.F.; Johnson, C.D.; Badruzzaman, A.B.M.; Tarek, M.H.; VanGeen, A.; Anderson, J.A.; Lane, J.W.

    2014-01-01

    We apply fluid-replacement logging in arsenic-contaminated regions of Bangladesh using a low-cost, down-well fluid conductivity logging tool to detect leaks in the cased section of wells. The fluid-conductivity tool is designed for the developing world: it is lightweight and easily transportable, operable by one person, and can be built for minimal cost. The fluid-replacement test identifies leaking casing by comparison of fluid conductivity logs collected before and after spiking the wellbore with a sodium chloride tracer. Here, we present results of fluid-replacement logging tests from both leaking and non-leaking casing from wells in Araihazar and Munshiganj, Bangladesh, and demonstrate that the low-cost tool produces measurements comparable to those obtained with a standard geophysical logging tool. Finally, we suggest well testing procedures and approaches for preventing casing leaks in Bangladesh and other developing countries.

  16. Detecting well casing leaks in Bangladesh using a salt spiking method.

    PubMed

    Stahl, M O; Ong, J B; Harvey, C F; Johnson, C D; Badruzzaman, A B M; Tarek, M H; van Geen, A; Anderson, J A; Lane, J W

    2014-09-01

    We apply fluid-replacement logging in arsenic-contaminated regions of Bangladesh using a low-cost, down-well fluid conductivity logging tool to detect leaks in the cased section of wells. The fluid-conductivity tool is designed for the developing world: it is lightweight and easily transportable, operable by one person, and can be built for minimal cost. The fluid-replacement test identifies leaking casing by comparison of fluid conductivity logs collected before and after spiking the wellbore with a sodium chloride tracer. Here, we present results of fluid-replacement logging tests from both leaking and non-leaking casing from wells in Araihazar and Munshiganj, Bangladesh, and demonstrate that the low-cost tool produces measurements comparable to those obtained with a standard geophysical logging tool. Finally, we suggest well testing procedures and approaches for preventing casing leaks in Bangladesh and other developing countries. © 2014, National Ground Water Association.
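
    The before/after comparison at the heart of the fluid-replacement test can be sketched as a simple depth-by-depth screen. This is an illustrative function only (the name, threshold, and median-based baseline are assumptions; the authors' actual interpretation procedure is not given in the abstract): where the measured salt spike is much weaker than the typical spike along the well, inflowing formation water may be diluting the tracer through a casing leak.

```python
import statistics

def leak_candidates(depths, pre_log, post_log, frac=0.5):
    """Flag depths where the NaCl spike (post minus pre conductivity)
    is much smaller than the typical spike along the well, hinting
    that formation water entering a casing leak dilutes the tracer."""
    spikes = [post - pre for pre, post in zip(pre_log, post_log)]
    typical = statistics.median(spikes)
    return [d for d, s in zip(depths, spikes) if s < frac * typical]
```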

  17. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user logs in to a website, the user leaves behind impressions, usage patterns, and access patterns in the web server's log file. A web usage mining agent can analyze these logs to help web developers improve the organization and presentation of their websites, and to help system administrators improve system performance. Web logs are also invaluable for creating adaptive websites and for analyzing network traffic. This paper presents the design and implementation of a web usage mining agent for digging into web log files.
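
    One basic access-pattern summary such an agent derives from a server log is page popularity. A minimal sketch, assuming requests arrive in Common Log Format (the regex and function name are illustrative, not taken from the paper):

```python
import re
from collections import Counter

# Matches the request portion of a Common Log Format line, e.g.
# 127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
REQUEST = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

def page_counts(log_lines):
    """Count how often each URL was requested across the log."""
    counts = Counter()
    for line in log_lines:
        match = REQUEST.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts
```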

  18. Assessment of Physical Activity in Chronic Kidney Disease

    PubMed Central

    Robinson-Cohen, Cassianne; Littman, Alyson J; Duncan, Glen E; Roshanravan, Baback; Ikizler, T. Alp; Himmelfarb, Jonathan; Kestenbaum, Bryan R

    2012-01-01

    Background Physical activity (PA) plays important roles in the development of kidney disease and its complications; however, the validity of standard tools for measuring PA is not well understood. Study Design We investigated the performance of several readily-available and widely-used PA and physical function questionnaires, individually and in combination, against accelerometry among a cohort of CKD participants. Setting and Participants Forty-six participants from the Seattle Kidney Study, an observational cohort study of persons with CKD, completed the PA Scale for the Elderly, Human Activity Profile (HAP), Medical Outcomes Study SF-36 questionnaire, and the Four Week PA History Questionnaire (FWH). We simultaneously measured PA using an Actigraph GT3X accelerometer over a 14-day period. We estimated the validity of each instrument by testing its associations with log-transformed accelerometry counts. We used the Akaike information criterion to investigate the performance of combinations of questionnaires. Results All questionnaire scores were significantly associated with log-transformed accelerometry counts. The HAP correlated best with accelerometry counts (r2=0.32) followed by the SF-36 (r2=0.23). Forty-three percent of the variability in accelerometry counts data was explained by a model that combined the HAP, SF-36 and FWH. Conclusion A combination of measurement tools can account for a modest component of PA in patients with CKD; however, a substantial proportion of physical activity is not captured by standard assessments. PMID:22739659

  19. Operationalizing quality improvement in a pediatric surgical practice.

    PubMed

    Arca, Marjorie J; Enters, Jessica; Christensen, Melissa; Jeziorczak, Paul; Sato, Thomas T; Thielke, Robert; Oldham, Keith T

    2014-01-01

    Quality improvement (QI) is critical to enhancing patient care. It is necessary to prioritize which QI initiatives are relevant to one's institution and practice, as implementation is resource-intensive. We have developed and implemented a streamlined process to identify QI opportunities in our practice. We designed a web-based Pediatric and Infant Case Log and Outcomes (PICaLO) instrument using Research Electronic Data Capture (REDCap™) to record all surgical procedures for our practice. At the time of operation, a surgeon completes a case report form. An administrative assistant enters the data in PICaLO within 5-7 days. Outcomes such as complications, deaths, and "occurrences" (readmissions, reoperations, transfers to ICU, ER visits, additional clinic visits) are recorded at the time of encounter, during M & M conferences, and during follow-up clinic visits. Variables were chosen and defined based on national standards from the American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) and the Patient Based Learning Log. Occurrences are queried for potential QI initiatives. In 2012, 3597 patients were entered, totaling 5177 procedures. There were 220 complications, 278 occurrences, and 16 deaths. Specific QI opportunities were identified and put into place. Data on procedures and outcomes can be collected effectively in a pediatric surgery practice to delineate pertinent QI initiatives. PICaLO is recognized by the American Board of Surgery as a mechanism to meet Maintenance of Certification 4 criteria. © 2014.

  20. Rapid Development of Bespoke Unmanned Platforms for Atmospheric Science

    NASA Astrophysics Data System (ADS)

    Sobester, A.; Johnston, S. J.; Scanlan, J. P.; Hart, E. E.; O'Brien, N. S.

    2012-04-01

    The effective deployment of airborne atmospheric science instruments often hinges on the development cycle time of a suitable platform, one that is capable of delivering them to the desired altitude range for a specified amount of time, along a pre-determined trajectory. This could be driven by the need to respond rapidly to sudden, unexpected events (e.g., volcano eruptions, nuclear fallout, etc.) or simply to accommodate the iterative design and flight test cycle of the instrument developer. A shorter development cycle time would also afford us the ability to quickly adapt the hardware and control logic in response to unexpected results during an experimental campaign. We report on recent developments aimed at meeting this demand. As part of the Atmospheric Science Through Robotic Aircraft (ASTRA) initiative we have investigated the use of rapid prototyping technologies to this end, both on the 'airframe' of the platform itself and on the on-board systems. We show how fast multi-disciplinary design optimization techniques, coupled with computer-controlled additive manufacturing (3D printing) and laser cutting methods and electronic prototyping (using standard, modular, programmable building blocks) can lead to the delivery of a fully customized platform integrating a given instrument in a timescale of the order of ten days. Specific examples include the design and testing of a balloon-launched glider sensorcraft and a stratospheric balloon system. The 'vehicle' for the latter was built on a 3D printer using a copolymer thermoplastic material and fitted with a sacrificial protective 'cage' laser-cut from an open-cell foam. The data logging, tracking, sensor integration and communications services of the platform were constructed using the .NET Gadgeteer open-source hardware kit. The flight planning and eventual post-flight recovery of the system is enabled by a generic, stochastic trajectory simulation tool, also developed as part of the ASTRA initiative. 
This also demonstrated the feasibility of retrieving instrument platforms after the observations are complete, either through self-recovery (in the case of the glider) or accurate pre-flight prediction and real-time tracking, in the case of the balloon platform. We also review developments in progress, including a balloon-launched flock of sensorcraft designed for the effective mapping of aerosol concentrations or other atmospheric measurements across a target airspace block. At the heart of this effort lies the optimization of the (pre-programmed or dynamically re-designed) trajectories such that they combine to approximate space-filling curves that maximize sampling efficiency (a 3D 'travelling salesman'-type calculus of variations problem).

  1. Repairing casing at a gas storage field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollenbaugh, B.

    1992-09-01

    This paper reports on the Leyden gas storage field, a 1.5-Bcf working volume underground gas storage facility located at the northwest edge of the Denver, Colo., metropolitan area. The field is owned by Public Service Co. of Colorado and operated by its wholly owned subsidiary, Western Gas Supply Co. Logging technology was instrumental in locating casing damage at two wells, identifying the extent of the damage and ensuring a successful repair. The well casings were repaired by installing a liner between two packers, with one packer set above the damage and the other set below it. Special equipment and procedures were required for workover and drilling operations because of the complications associated with cavern storage. Logging technology can locate damaged casing, evaluate the type and extent of the damage, and predict the probability of gas migration behind the casing.

  2. Development of Enabling Scientific Tools to Characterize the Geologic Subsurface at Hanford

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenna, Timothy C.; Herron, Michael M.

    2014-07-08

    This final report to the Department of Energy summarizes activities conducted under our exploratory grant, funded through the U.S. DOE Subsurface Biogeochemical Research Program in the category of enabling scientific tools, covering the period from July 15, 2010 to July 14, 2013. The main goal of this exploratory project was to determine the parameters necessary to translate existing borehole log data into reservoir properties following scientifically sound petrophysical relationships. For this study, we focused on samples and Ge-based spectral gamma logging system (SGLS) data collected from wells located in the Hanford 300 Area. The main activities consisted of 1) analysis of available core samples for a variety of mineralogical, chemical and physical properties; 2) evaluation of selected spectral gamma logs, environmental corrections, and calibration; and 3) development of algorithms and a proposed workflow that permit translation of log responses into useful reservoir properties such as lithology, matrix density, porosity, and permeability. These techniques have been successfully employed in the petroleum industry; however, the approach is relatively new when applied to subsurface remediation. This exploratory project met its stated objectives. We have demonstrated that our approach can lead to an improved interpretation of existing well log data. The algorithms we developed can utilize available log data, in particular gamma and spectral gamma logs, and continued optimization will improve their application to ERSP goals of understanding subsurface properties.

  3. Prediction of gas/particle partitioning of polybrominated diphenyl ethers (PBDEs) in global air: A theoretical study

    NASA Astrophysics Data System (ADS)

    Li, Y.-F.; Ma, W.-L.; Yang, M.

    2015-02-01

    Gas/particle (G/P) partitioning of semi-volatile organic compounds (SVOCs) is an important process that largely governs their atmospheric fate, long-range atmospheric transport, and routes of entry into the human body. Previous studies on this issue have assumed equilibrium conditions, and their results do not predict monitoring results well in most cases. In this study, a steady-state model, rather than an equilibrium-state model, was established for investigating the G/P partitioning behavior of polybrominated diphenyl ethers (PBDEs), and an equation for calculating the partition coefficient under steady state (KPS) of PBDEs (log KPS = log KPE + log α) was developed, in which an equilibrium term (log KPE = log KOA + log fOM − 11.91, where fOM is the organic matter content of the particles) and a non-equilibrium term (log α, caused by dry and wet deposition of particles), both functions of log KOA (the octanol-air partition coefficient), are included. Equilibrium is a special case of steady state in which the non-equilibrium term equals zero. A criterion to classify the equilibrium and non-equilibrium status of PBDEs was also established using two threshold values of log KOA, log KOA1 and log KOA2, which divide the range of log KOA into three domains: equilibrium, non-equilibrium, and maximum partition. Accordingly, two threshold temperatures, tTH1 (where log KOA = log KOA1) and tTH2 (where log KOA = log KOA2), were identified, which divide the temperature range into the same three domains for each PBDE congener. We predicted the existence of the maximum partition domain, in which log KPS reaches a maximum constant value of −1.53, which every PBDE congener reaches when log KOA ≥ log KOA2, or t ≤ tTH2. 
The novel equation developed in this study was applied to predict the G/P partition coefficients of PBDEs for our Chinese persistent organic pollutants (POPs) Soil and Air Monitoring Program, Phase 2 (China-SAMP-II) and for other monitoring programs worldwide, including in Asia, Europe, North America, and the Arctic; the results matched the monitoring data well, except those obtained at e-waste sites, where PBDE emissions are unpredictable. This study provides evidence that the newly developed steady-state equation is superior to the equilibrium-state equation that has been used to describe G/P partitioning behavior for decades. We suggest that investigations of G/P partitioning behavior for PBDEs should be based on steady state rather than equilibrium, equilibrium being just a special case of steady state in which non-equilibrium factors can be ignored. We also believe that our new equation provides a useful tool for environmental scientists in both monitoring and modeling research on G/P partitioning of PBDEs and can be extended to predict G/P partitioning behavior for other SVOCs as well.
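
    The two quoted equations compose directly. A minimal numeric sketch, assuming only what the abstract states (the function name and default log α = 0, i.e. the equilibrium special case, are illustrative; the paper's actual expression for α in terms of particle deposition and log KOA is not reproduced here):

```python
import math

def log_kps(log_koa, f_om, log_alpha=0.0):
    """Steady-state gas/particle partition coefficient (log10) per the
    abstract's equations:
        log K_PS = log K_PE + log(alpha)
        log K_PE = log K_OA + log10(f_OM) - 11.91
    log_alpha = 0 recovers the equilibrium special case."""
    log_kpe = log_koa + math.log10(f_om) - 11.91
    # In the maximum partition domain, log K_PS plateaus at -1.53.
    return min(log_kpe + log_alpha, -1.53)
```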

  4. A prospective microbiome-wide association study of food sensitization and food allergy in early childhood.

    PubMed

    Savage, Jessica H; Lee-Sarwar, Kathleen A; Sordillo, Joanne; Bunyavanich, Supinda; Zhou, Yanjiao; O'Connor, George; Sandel, Megan; Bacharier, Leonard B; Zeiger, Robert; Sodergren, Erica; Weinstock, George M; Gold, Diane R; Weiss, Scott T; Litonjua, Augusto A

    2018-01-01

    Alterations in the intestinal microbiome are prospectively associated with the development of asthma; less is known regarding the role of microbiome alterations in food allergy development. Intestinal microbiome samples were collected at age 3-6 months in children participating in the follow-up phase of an interventional trial of high-dose vitamin D given during pregnancy. At age 3, sensitization to foods (milk, egg, peanut, soy, wheat, walnut) was assessed. Food allergy was defined as caretaker report of healthcare provider-diagnosed allergy to the above foods prior to age 3 with evidence of IgE sensitization. Analysis was performed using Phyloseq and DESeq2; P-values were adjusted for multiple comparisons. Complete data were available for 225 children; there were 87 cases of food sensitization and 14 cases of food allergy. Microbial diversity measures did not differ between food sensitization and food allergy cases and controls. The genera Haemophilus (log2 fold change -2.15, P=.003), Dialister (log2 fold change -2.22, P=.009), Dorea (log2 fold change -1.65, P=.02), and Clostridium (log2 fold change -1.47, P=.002) were underrepresented among subjects with food sensitization. The genera Citrobacter (log2 fold change -3.41, P=.03), Oscillospira (log2 fold change -2.80, P=.03), Lactococcus (log2 fold change -3.19, P=.05), and Dorea (log2 fold change -3.00, P=.05) were underrepresented among subjects with food allergy. The temporal association between bacterial colonization and food sensitization and allergy suggests that the microbiome may have a causal role in the development of food allergy. Our findings have therapeutic implications for the prevention and treatment of food allergy. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.

  5. Determination of real-time polymerase chain reaction uncertainty of measurement using replicate analysis and a graphical user interface with Fieller’s theorem

    PubMed Central

    Stuart, James Ian; Delport, Johan; Lannigan, Robert; Zahariadis, George

    2014-01-01

    BACKGROUND: Disease monitoring of viruses using real-time polymerase chain reaction (PCR) requires knowledge of the precision of the test to determine what constitutes a significant change. Calculation of quantitative PCR confidence limits requires bivariate statistical methods. OBJECTIVE: To develop a simple-to-use graphical user interface to determine the uncertainty of measurement (UOM) of BK virus, cytomegalovirus (CMV) and Epstein-Barr virus (EBV) real-time PCR assays. METHODS: Thirty positive clinical samples for each of the three viral assays were repeated once. A graphical user interface was developed using a spreadsheet (Excel, Microsoft Corporation, USA) to enable data entry and calculation of the UOM (according to Fieller’s theorem) and PCR efficiency. RESULTS: The confidence limits for the BK virus, CMV and EBV tests were ∼0.5 log, 0.5 log to 1.0 log, and 0.5 log to 1.0 log, respectively. The efficiencies of these assays, in the same order, were 105%, 119% and 90%. The confidence limits remained stable over the linear range of all three tests. DISCUSSION: A >5-fold (0.7 log) and a >3-fold (0.5 log) change in viral load were significant for CMV and EBV when the results were ≤1000 copies/mL and >1000 copies/mL, respectively. A >3-fold (0.5 log) change in viral load was significant for BK virus over its entire linear range. PCR efficiency was ideal for BK virus and EBV but not CMV. Standardized international reference materials and shared reporting of UOM among laboratories are required for the development of treatment guidelines for BK virus, CMV and EBV in the context of changes in viral load. PMID:25285125

  6. Determination of real-time polymerase chain reaction uncertainty of measurement using replicate analysis and a graphical user interface with Fieller's theorem.

    PubMed

    Stuart, James Ian; Delport, Johan; Lannigan, Robert; Zahariadis, George

    2014-07-01

    Disease monitoring of viruses using real-time polymerase chain reaction (PCR) requires knowledge of the precision of the test to determine what constitutes a significant change. Calculation of quantitative PCR confidence limits requires bivariate statistical methods. To develop a simple-to-use graphical user interface to determine the uncertainty of measurement (UOM) of BK virus, cytomegalovirus (CMV) and Epstein-Barr virus (EBV) real-time PCR assays. Thirty positive clinical samples for each of the three viral assays were repeated once. A graphical user interface was developed using a spreadsheet (Excel, Microsoft Corporation, USA) to enable data entry and calculation of the UOM (according to Fieller's theorem) and PCR efficiency. The confidence limits for the BK virus, CMV and EBV tests were ∼0.5 log, 0.5 log to 1.0 log, and 0.5 log to 1.0 log, respectively. The efficiencies of these assays, in the same order, were 105%, 119% and 90%. The confidence limits remained stable over the linear range of all three tests. A >5-fold (0.7 log) and a >3-fold (0.5 log) change in viral load were significant for CMV and EBV when the results were ≤1000 copies/mL and >1000 copies/mL, respectively. A >3-fold (0.5 log) change in viral load was significant for BK virus over its entire linear range. PCR efficiency was ideal for BK virus and EBV but not CMV. Standardized international reference materials and shared reporting of UOM among laboratories are required for the development of treatment guidelines for BK virus, CMV and EBV in the context of changes in viral load.
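
    Fieller's theorem gives confidence limits for a ratio of two jointly normal estimates, which is the bivariate calculation the spreadsheet interface wraps. A generic sketch of the standard Fieller formula (the function and its arguments are illustrative; the paper's spreadsheet internals are not described in the abstract):

```python
import math

def fieller_limits(mx, my, vx, vy, cov, t):
    """Confidence limits for the ratio mx/my by Fieller's theorem.
    mx, my: estimated means; vx, vy: their variances; cov: their
    covariance; t: critical t value for the desired confidence level."""
    g = t * t * vy / (my * my)
    if g >= 1:
        return None  # denominator not distinguishable from zero; limits unbounded
    r = mx / my
    centre = (r - g * cov / vy) / (1 - g)
    half = (t / abs(my)) * math.sqrt(
        vx - 2 * r * cov + r * r * vy - g * (vx - cov * cov / vy)
    ) / (1 - g)
    return centre - half, centre + half
```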

  7. Summary goodness-of-fit statistics for binary generalized linear models with noncanonical link functions.

    PubMed

    Canary, Jana D; Blizzard, Leigh; Barry, Ronald P; Hosmer, David W; Quinn, Stephen J

    2016-05-01

    Generalized linear models (GLM) with a canonical logit link function are the primary modeling technique used to relate a binary outcome to predictor variables. However, noncanonical links can offer more flexibility, producing convenient analytical quantities (e.g., probit GLMs in toxicology) and desired measures of effect (e.g., relative risk from log GLMs). Many summary goodness-of-fit (GOF) statistics exist for logistic GLM. Their properties make the development of GOF statistics relatively straightforward, but it can be more difficult under noncanonical links. Although GOF tests for logistic GLM with continuous covariates (GLMCC) have been applied to GLMCCs with log links, we know of no GOF tests in the literature specifically developed for GLMCCs that can be applied regardless of the link function chosen. We generalize the Tsiatis GOF statistic (TG), originally developed for logistic GLMCCs, so that it can be applied under any link function. Further, we show that the algebraically related Hosmer-Lemeshow (HL) and Pigeon-Heyse (J2) statistics can be applied directly. In a simulation study, TG, HL, and J2 were used to evaluate the fit of probit, log-log, complementary log-log, and log models, all calculated with a common grouping method. The TG statistic consistently maintained Type I error rates, while those of HL and J2 were often lower than expected if terms with little influence were included. Generally, the statistics had similar power to detect an incorrect model. An exception occurred when a log GLMCC was incorrectly fit to data generated from a logistic GLMCC; in this case, TG had more power than HL or J2. © 2015 John Wiley & Sons Ltd/London School of Economics.
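
    The Hosmer-Lemeshow statistic compared above contrasts observed and expected event counts within groups of fitted probabilities. A minimal sketch using the common decile-of-risk grouping (illustrative only; the paper applies its own common grouping method across link functions, which is not detailed in the abstract):

```python
def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow chi-square: sort cases by fitted probability,
    split into roughly equal groups, and sum (observed - expected)^2
    standardised by the binomial variance within each group."""
    pairs = sorted(zip(p, y))
    n = len(pairs)
    chi2 = 0.0
    for k in range(groups):
        grp = pairs[k * n // groups:(k + 1) * n // groups]
        if not grp:
            continue
        m = len(grp)
        observed = sum(yi for _, yi in grp)
        expected = sum(pi for pi, _ in grp)
        pbar = expected / m
        if 0.0 < pbar < 1.0:
            chi2 += (observed - expected) ** 2 / (m * pbar * (1.0 - pbar))
    return chi2
```

    Under the usual asymptotics the statistic is referred to a chi-square distribution with groups minus 2 degrees of freedom.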

  8. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O' Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. It also provides a means of comparing two or more grading systems, or of comparing a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  9. Green Lumber Grade Yields for Subfactory Class Hardwood Logs

    Treesearch

    Leland F. Hanks

    1973-01-01

    Data on lumber grade yields for subfactory class logs are presented for ten species of hardwoods. Logs of this type are expected to assume greater importance in the market. The yields, when coupled with lumber prices, will be useful to sawmill operators for developing log prices in terms of standard factory lumber.

  10. VizieR Online Data Catalog: CCD {Delta}a-photometry of 5 open clusters (Paunzen+, 2003)

    NASA Astrophysics Data System (ADS)

    Paunzen, E.; Pintado, O. I.; Maitzen, H. M.

    2004-01-01

    Observations of the five open clusters were performed with the Bochum 61cm (ESO-La Silla), the Helen-Sawyer-Hogg 61cm telescope (UTSO-Las Campanas Observatory), the 2.15m telescope at the Complejo Astronomico el Leoncito (CASLEO) and the L. Figl Observatory (FOA) with the 150cm telescope on Mt. Schopfl (Austria) using the multimode instrument OEFOSC (see the observation log in Table 1). (5 data files).

  11. European climate reconstructed for the past 500 years based on documentary and instrumental evidence

    NASA Astrophysics Data System (ADS)

    Wheeler, Dennis; Brazdil, Rudolf; Pfister, Christian

    2010-05-01

    The paper summarises the results of historical-climatological research conducted as part of the EU-funded 6th Framework Programme project MILLENNIUM, the principal focus of which was the investigation of European climate during the past one thousand years (http://www.millenniumproject.net/). The project represents a major advance in bringing together, for the first time on such a scale, historical climatologists with other palaeoclimatological communities and climate modellers from many European countries. As part of MILLENNIUM, a sub-group (SG1) of historical climatologists from ten countries had the responsibility of collating and comprehensively analysing evidence from instrumental and documentary archives. This paper presents the main results of this undertaking but confines its attention to the climate of the past 500 years, summarising 10 themed papers submitted for a special issue of Climatic Change. They range across a variety of topics: newly studied documentary data sources (e.g., early instrumental records, the opening of the Stockholm harbour, ship logbook data); temperature reconstructions for Central Europe, the Stockholm area, and the Mediterranean based on different types of documentary evidence; the application of standard palaeoclimatological approaches to reconstructions based on index series derived from the documentary data; the influence of circulation dynamics on January-April climate; a comparison of reconstructions based on documentary data with model runs (ECHO-G); a study of the quality of instrumental data in climate reconstructions; a 500-year flood chronology in Europe; and selected disastrous European windstorms and their reflection in documentary evidence and human memory. 
Finally, perspectives on historical-climatological research and future challenges and directions in this rapidly developing and important field are presented, together with an overview of the potential of documentary sources for climatic reconstructions.

  12. Gas hydrate environmental monitoring program in the Ulleung Basin, East Sea of Korea

    NASA Astrophysics Data System (ADS)

    Ryu, Byong-Jae; Chun, Jong-Hwa; McLean, Scott

    2013-04-01

    As a part of the Korean National Gas Hydrate Program, the Korea Institute of Geoscience and Mineral Resources (KIGAM) has planned and conducted an environmental monitoring program for the gas hydrate production test scheduled in the Ulleung Basin, East Sea of Korea, in 2014. This program includes a baseline survey using the KIGAM Seafloor Observation System (KISOS) and KIGAM's R/V TAMHAE II, development of the KIGAM Seafloor Monitoring System (KIMOS), and seafloor monitoring of the various potential hazards associated with gas dissociated from gas hydrates during the production test. KIGAM also plans to conduct a geophysical survey to determine changes in the gas hydrate reservoir and in production efficiency around the production well before and after the production test. During the production test, release of gas dissociated from the gas hydrate into the water column, seafloor deformation, changes in the chemical characteristics of bottom water, changes in seafloor turbidity, and other indicators will be monitored using various instruments. The KIMOS consists of a near-field observation array and a far-field array. The near-field array comprises four remote sensor platforms, each cabled to the primary node. The far-field array will consist of four autonomous instrument pods. A scientific Remotely Operated Vehicle (ROV) will be used to deploy the sensor arrays and to connect the cables between each field instrument package and the primary node. The ROV will also collect water and/or gas samples and identify any gas (bubble) plumes from the seafloor using a high-frequency sector-scanning sonar. Power to the near-field instrument packages will be supplied by battery units located on the seafloor near the primary node. Data obtained from the instruments on the near-field array will be logged and downloaded in situ at the primary node, and transmitted in real time to the support vessel via the ROV. 
These data will also be transmitted in real time to the drilling vessel via satellite.

  13. Choice of Stimulus Range and Size Can Reduce Test-Retest Variability in Glaucomatous Visual Field Defects

    PubMed Central

    Swanson, William H.; Horner, Douglas G.; Dul, Mitchell W.; Malinovsky, Victor E.

    2014-01-01

    Purpose To develop guidelines for engineering perimetric stimuli to reduce test-retest variability in glaucomatous defects. Methods Perimetric testing was performed on one eye for 62 patients with glaucoma and 41 age-similar controls on size III and frequency-doubling perimetry and three custom tests with Gaussian blob and Gabor sinusoid stimuli. Stimulus range was controlled by values for ceiling (maximum sensitivity) and floor (minimum sensitivity). Bland-Altman analysis was used to derive 95% limits of agreement on test and retest, and bootstrap analysis was used to test the hypotheses about peak variability. Results Limits of agreement for the three custom stimuli were similar in width (0.72 to 0.79 log units) and peak variability (0.22 to 0.29 log units) for a stimulus range of 1.7 log units. The width of the limits of agreement for size III decreased from 1.78 to 1.37 to 0.99 log units for stimulus ranges of 3.9, 2.7, and 1.7 log units, respectively (F = 3.23, P < 0.001); peak variability was 0.99, 0.54, and 0.34 log units, respectively (P < 0.01). For a stimulus range of 1.3 log units, limits of agreement were narrowest with Gabor and widest with size III stimuli, and peak variability was lower (P < 0.01) with Gabor (0.18 log units) and frequency-doubling perimetry (0.24 log units) than with size III stimuli (0.38 log units). Conclusions Test-retest variability in glaucomatous visual field defects was substantially reduced by engineering the stimuli. Translational Relevance The guidelines should allow developers to choose from a wide range of stimuli. PMID:25371855

  14. Choice of Stimulus Range and Size Can Reduce Test-Retest Variability in Glaucomatous Visual Field Defects.

    PubMed

    Swanson, William H; Horner, Douglas G; Dul, Mitchell W; Malinovsky, Victor E

    2014-09-01

    To develop guidelines for engineering perimetric stimuli to reduce test-retest variability in glaucomatous defects. Perimetric testing was performed on one eye for 62 patients with glaucoma and 41 age-similar controls on size III and frequency-doubling perimetry and three custom tests with Gaussian blob and Gabor sinusoid stimuli. Stimulus range was controlled by values for ceiling (maximum sensitivity) and floor (minimum sensitivity). Bland-Altman analysis was used to derive 95% limits of agreement on test and retest, and bootstrap analysis was used to test the hypotheses about peak variability. Limits of agreement for the three custom stimuli were similar in width (0.72 to 0.79 log units) and peak variability (0.22 to 0.29 log units) for a stimulus range of 1.7 log units. The width of the limits of agreement for size III decreased from 1.78 to 1.37 to 0.99 log units for stimulus ranges of 3.9, 2.7, and 1.7 log units, respectively (F = 3.23, P < 0.001); peak variability was 0.99, 0.54, and 0.34 log units, respectively (P < 0.01). For a stimulus range of 1.3 log units, limits of agreement were narrowest with Gabor and widest with size III stimuli, and peak variability was lower (P < 0.01) with Gabor (0.18 log units) and frequency-doubling perimetry (0.24 log units) than with size III stimuli (0.38 log units). Test-retest variability in glaucomatous visual field defects was substantially reduced by engineering the stimuli. The guidelines should allow developers to choose from a wide range of stimuli.
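    The 95% limits of agreement reported in this record follow the standard Bland-Altman construction: the mean test-retest difference plus or minus 1.96 standard deviations of the differences. A minimal sketch, not the authors' code; the paired sensitivity values below are hypothetical:

```python
import math

def limits_of_agreement(test, retest):
    """Bland-Altman 95% limits of agreement for paired test/retest values.

    Assumes the differences are approximately normal; the limits are
    mean difference +/- 1.96 * SD of the differences.
    """
    diffs = [t - r for t, r in zip(test, retest)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical paired sensitivities, in log units
test = [1.0, 2.0, 3.0, 4.0]
retest = [1.1, 1.9, 3.2, 3.8]
lo, hi = limits_of_agreement(test, retest)
```

    The "width of the limits of agreement" quoted in the abstract corresponds to hi - lo.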

  15. Selective logging in the Brazilian Amazon.

    PubMed

    Asner, Gregory P; Knapp, David E; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Silva, Jose N

    2005-10-21

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of approximately 0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.

  16. Development of the functional vision questionnaire for children and young people with visual impairment: the FVQ_CYP.

    PubMed

    Tadić, Valerija; Cooper, Andrew; Cumberland, Phillippa; Lewando-Hundt, Gillian; Rahi, Jugnoo S

    2013-12-01

    To develop a novel age-appropriate measure of functional vision (FV) for self-reporting by visually impaired (VI) children and young people. Questionnaire development. A representative patient sample of VI children and young people aged 10 to 15 years with visual acuity worse than 0.48 logarithm of the minimum angle of resolution (logMAR), and a school-based (nonrandom) expert group sample of VI students aged 12 to 17 years. A total of 32 qualitative semistructured interviews supplemented by narrative feedback from 15 eligible VI children and young people were used to generate draft instrument items. Seventeen VI students were consulted individually on item relevance and comprehensibility, instrument instructions, format, and administration methods. The resulting draft instrument was piloted with 101 VI children and young people comprising a nationally representative sample, drawn from 21 hospitals in the United Kingdom. Initial item reduction was informed by presence of missing data and individual item response pattern. Exploratory factor analysis (FA), parallel analysis (PA), and Rasch analysis (RA) were applied to test the instrument's psychometric properties. Psychometric indices and validity assessment of the Functional Vision Questionnaire for Children and Young People (FVQ_CYP). A total of 712 qualitative statements became a 56-item draft scale, capturing the level of difficulty in performing vision-dependent activities. After piloting, items were removed iteratively as follows: 11 for a high percentage of missing data, 4 for skewness, 1 for inadequate item infit and outfit values in RA, 3 for differential item functioning across age groups in RA, and 1 for differential item functioning across gender in RA. The remaining 36 items showed item fit values within acceptable limits, good measurement precision and targeting, and ordered response categories.
The reduced scale has a clear unidimensional structure, with all items having a high factor loading on the single factor in FA and PA. The summary scores correlated significantly with visual acuity. We have developed a novel, psychometrically robust self-report questionnaire for children and young people-the FVQ_CYP-that captures the functional impact of visual disability from their perspective. The 36-item, 4-point unidimensional scale has potential as a complementary adjunct to objective clinical assessments in routine pediatric ophthalmology practice and in research. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  17. MAIL LOG, program theory, volume 1. [Scout project automatic data system

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.

  18. Emergent Intelligent Behavior through Integrated Investigation of Embodied Natural Language, Reasoning, Learning, Computer Vision, and Robotic Manipulation

    DTIC Science & Technology

    2011-10-11

    developed a method for determining the structure (component logs and their 3D placement) of a LINCOLN LOG assembly from a single image from an uncalibrated...small a class of components. Moreover, we focus on determining the precise pose and structure of an assembly, including the 3D pose of each...medial axes are parallel to the work surface. Thus valid structures have logs on [Fig. 1: The 3D geometric shape parameters of LINCOLN LOGS.]

  19. Hardwood sawyer trainer

    Treesearch

    Luis G. Occeña; Eknarin Santitrakul; Daniel L. Schmoldt

    2000-01-01

    It is well understood by now that the initial breakdown of hardwood logs into lumber has a tremendous impact on the total lumber value and conversion efficiency. The focus of this research project is the development of a computer-aided sawing trainer tool for the primary breakdown of hardwood logs. Maximum lumber recovery is dependent on the proper log orientation as...

  20. Web Log Analysis: A Study of Instructor Evaluations Done Online

    ERIC Educational Resources Information Center

    Klassen, Kenneth J.; Smith, Wayne

    2004-01-01

    This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…

  1. West Virginia wood waste from uncharted sources: log landings and active surface mines

    Treesearch

    Shawn T. Grushecky; Lawrence E. Osborn

    2013-01-01

    Traditionally, biomass availability estimates from West Virginia have focused on primary and secondary mill byproducts and logging residues. Other sources of woody biomass are available that have not been surveyed. Through a series of field studies during 2010 and 2011, biomass availability estimates were developed for surface mine sites and log landings in West...

  2. Project LOGgED ON: Advanced Science Online for Gifted Learners

    ERIC Educational Resources Information Center

    Reed, Christine; Urquhart, Jill

    2007-01-01

    Gifted students are often underserved because they do not have access to highly challenging curriculum. In October, 2002, Project LOGgED ON (www.scrolldown.com/loggedon/) at University of Virginia received federal funding from the Jacob Javits Act to tackle this issue. Those who were part of the LOGgED ON project developed advanced science…

  3. Assessing the feasibility and profitability of cable logging in southern upland hardwood forests

    Treesearch

    Chris B. LeDoux; Dennis M. May; Tony Johnson; Richard H. Widmann

    1995-01-01

    Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the USDA Forest Services' Forest Inventory and Analysis unit were modified to assess the feasibility and profitability of cable logging in southern upland hardwood forests. Depending on the harvest system and yarding distance used, cable logging can be...

  4. Rule-driven defect detection in CT images of hardwood logs

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2000-01-01

    This paper deals with automated detection and identification of internal defects in hardwood logs using computed tomography (CT) images. We have developed a system that employs artificial neural networks to perform tentative classification of logs on a pixel-by-pixel basis. This approach achieves a high level of classification accuracy for several hardwood species (...

  5. Influence of pressure change during hydraulic tests on fracture aperture.

    PubMed

    Ji, Sung-Hoon; Koh, Yong-Kwon; Kuhlman, Kristopher L; Lee, Moo Yul; Choi, Jong Won

    2013-03-01

    In a series of field experiments, we evaluate the influence of a small water pressure change on fracture aperture during a hydraulic test. An experimental borehole was instrumented at the Korea Atomic Energy Research Institute (KAERI) Underground Research Tunnel (KURT). The target fracture for testing was identified from analyses of borehole logging and hydraulic tests. A double packer system was developed and installed in the test borehole to directly observe the aperture change due to water pressure change. Using this packer system, both aperture and flow rate are directly observed under various water pressures. Results indicate that a slight change in fracture hydraulic head leads to an observable change in aperture. This suggests that aperture change should be considered when analyzing hydraulic test data from a sparsely fractured rock aquifer. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.

  6. Solutions for acceleration measurement in vehicle crash tests

    NASA Astrophysics Data System (ADS)

    Dima, D. S.; Covaciu, D.

    2017-10-01

    Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms the crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller, and a data logging unit with an SD card. Data collected on the card, as text files, are processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in standards.
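    The filters the impact-test standards prescribe are specific phaseless channel-frequency-class (CFC) designs; as a simplified stand-in only, not the CFC algorithm itself, a first-order IIR low-pass illustrates the basic smoothing step applied to a logged acceleration channel. Cutoff and sample rate below are hypothetical:

```python
import math

def low_pass(samples, fc, fs):
    """First-order IIR low-pass (simplified stand-in for standardized
    crash-test filters): y += alpha * (x - y), alpha from cutoff fc and
    sample rate fs."""
    rc = 1.0 / (2.0 * math.pi * fc)
    dt = 1.0 / fs
    alpha = dt / (rc + dt)
    out = []
    y = None
    for x in samples:
        if y is None:
            y = x                 # seed with the first sample
        else:
            y += alpha * (x - y)  # exponential smoothing update
        out.append(y)
    return out

# Hypothetical acceleration step, sampled at 1 kHz, 10 Hz cutoff
filtered = low_pass([0.0] + [1.0] * 499, fc=10.0, fs=1000.0)
```

    A real processing chain would instead apply the filter specified by the relevant standard, typically run forward and backward to cancel phase lag.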

  7. Estimating sub-surface dispersed oil concentration using acoustic backscatter response.

    PubMed

    Fuller, Christopher B; Bonner, James S; Islam, Mohammad S; Page, Cheryl; Ojo, Temitope; Kirkey, William

    2013-05-15

    The recent Deepwater Horizon disaster resulted in a dispersed oil plume at an approximate depth of 1000 m. Several methods were used to characterize this plume with respect to concentration and spatial extent, including surface-supported sampling and autonomous underwater vehicles with in situ instrument payloads. Additionally, echo sounders were used to track the plume location, demonstrating the potential for remote detection using acoustic backscatter (ABS). This study evaluated use of an Acoustic Doppler Current Profiler (ADCP) to quantitatively detect oil-droplet suspensions from the ABS response in a controlled laboratory setting. Results from this study showed log-linear ABS responses to oil-droplet volume concentration. However, the inability to reproduce ABS response factors suggests the difficulty of developing meaningful calibration factors for quantitative field analysis. Evaluation of theoretical ABS intensity derived from the particle size distribution provided insight regarding method sensitivity in the presence of interfering ambient particles. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Design and performance of a high resolution, low latency stripline beam position monitor system

    NASA Astrophysics Data System (ADS)

    Apsimon, R. J.; Bett, D. R.; Blaskovic Kraljevic, N.; Burrows, P. N.; Christian, G. B.; Clarke, C. I.; Constance, B. D.; Dabiri Khah, H.; Davis, M. R.; Perry, C.; Resta López, J.; Swinson, C. J.

    2015-03-01

    A high-resolution, low-latency beam position monitor (BPM) system has been developed for use in particle accelerators and beam lines that operate with trains of particle bunches with bunch separations as low as several tens of nanoseconds, such as future linear electron-positron colliders and free-electron lasers. The system was tested with electron beams in the extraction line of the Accelerator Test Facility at the High Energy Accelerator Research Organization (KEK) in Japan. It consists of three stripline BPMs instrumented with analogue signal-processing electronics and a custom digitizer for logging the data. The design of the analogue processor units is presented in detail, along with measurements of the system performance. The processor latency is 15.6 ± 0.1 ns. A single-pass beam position resolution of 291 ± 10 nm has been achieved, using a beam with a bunch charge of approximately 1 nC.

  9. Development of bovine serum albumin-water partition coefficients predictive models for ionogenic organic chemicals based on chemical form adjusted descriptors.

    PubMed

    Ding, Feng; Yang, Xianhai; Chen, Guosong; Liu, Jining; Shi, Lili; Chen, Jingwen

    2017-10-01

    The partition coefficients between bovine serum albumin (BSA) and water (K(BSA/w)) for ionogenic organic chemicals (IOCs) differ greatly from those of neutral organic chemicals (NOCs). For NOCs, several excellent models have been developed to predict log K(BSA/w). However, the conventional descriptors were found to be inappropriate for modeling log K(BSA/w) of IOCs, so alternative approaches are urgently needed to develop predictive models for K(BSA/w) of IOCs. In this study, molecular descriptors that can characterize ionization effects (i.e., chemical form adjusted descriptors) were calculated and used to develop predictive models for log K(BSA/w) of IOCs. The models developed had high goodness-of-fit, robustness, and predictive ability. The predictor variables selected to construct the models included the chemical form adjusted average of the negative potentials on the molecular surface (V(s,adj)-), the chemical form adjusted molecular dipole moment (dipole moment(adj)), and the logarithm of the n-octanol/water distribution coefficient (log D). As these molecular descriptors can be calculated directly from molecular structures, the developed model can easily be used to fill the log K(BSA/w) data gap for other IOCs within the applicability domain. Furthermore, the chemical form adjusted descriptors calculated in this study could also be used to construct predictive models for other endpoints of IOCs. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Improved method estimating bioconcentration/bioaccumulation factor from octanol/water partition coefficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meylan, W.M.; Howard, P.H.; Aronson, D.

    1999-04-01

    A compound's bioconcentration factor (BCF) is the most commonly used indicator of its tendency to accumulate in aquatic organisms from the surrounding medium. Because it is expensive to measure, the BCF is generally estimated from the octanol/water partition coefficient (K(ow)), but currently used regression equations were developed from small data sets that do not adequately represent the wide range of chemical substances now subject to review. To develop an improved method, the authors collected BCF data in a file that contained information on measured BCFs and other key experimental details for 694 chemicals. Log BCF was then regressed against log K(ow), and chemicals with significant deviations from the line of best fit were analyzed by chemical structure. The resulting algorithm classifies a substance as either nonionic or ionic, the latter group including carboxylic acids, sulfonic acids and their salts, and quaternary N compounds. Log BCF for nonionics is estimated from log K(ow) and a series of correction factors if applicable; different equations apply for log K(ow) 1.0 to 7.0 and >7.0. For ionics, chemicals are categorized by log K(ow) and a log BCF in the range 0.5 to 1.75 is assigned. Organometallics, nonionics with long alkyl chains, and aromatic azo compounds receive special treatment. The correlation coefficient and mean error for log BCF indicate that the new method fits existing data significantly better than other methods.
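    The piecewise scheme described (nonionic vs. ionic classification, separate equations by log K(ow) regime, assigned values for ionics) can be sketched as below. The slopes, intercepts, and ionic bins here are illustrative placeholders only, not the fitted coefficients from the study:

```python
def estimate_log_bcf(log_kow, ionic=False, correction=0.0):
    """Sketch of a piecewise log BCF estimator driven by log Kow.

    All numeric constants are illustrative placeholders, not the
    regression coefficients reported in the study.
    """
    if ionic:
        # Ionics: categorized by log Kow, assigned a log BCF in 0.5-1.75
        if log_kow < 5.0:
            return 0.5
        if log_kow < 6.5:
            return 1.0
        return 1.75
    if 1.0 <= log_kow <= 7.0:
        return 0.77 * log_kow - 0.70 + correction   # placeholder fit
    if log_kow > 7.0:
        return -1.37 * log_kow + 14.4 + correction  # placeholder fit
    return 0.50  # below the regression range: floor value (placeholder)
```

    Note the second regime has a negative slope, reflecting the drop-off in bioconcentration for very hydrophobic chemicals that motivates the split at log K(ow) = 7.0.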

  11. Host range of the emerald ash borer (Agrilus planipennis Fairmaire) (Coleoptera: Buprestidae) in North America: results of multiple-choice field experiments.

    PubMed

    Anulewicz, Andrea C; McCullough, Deborah G; Cappaert, David L; Poland, Therese M

    2008-02-01

    Emerald ash borer (Agrilus planipennis Fairmaire) (Coleoptera: Buprestidae), an invasive phloem-feeding pest, was identified as the cause of widespread ash (Fraxinus) mortality in southeast Michigan and Windsor, Ontario, Canada, in 2002. A. planipennis reportedly colonizes other genera in its native range in Asia, including Ulmus L., Juglans L., and Pterocarya Kunth. Attacks on nonash species have not been observed in North America to date, but there is concern that other genera could be colonized. From 2003 to 2005, we assessed adult A. planipennis landing rates, oviposition, and larval development on North American ash species and congeners of its reported hosts in Asia in multiple-choice field studies conducted at several southeast Michigan sites. Nonash species evaluated included American elm (U. americana L.), hackberry (Celtis occidentalis L.), black walnut (J. nigra L.), shagbark hickory [Carya ovata (Mill.) K.Koch], and Japanese tree lilac (Syringa reticulata Bl.). In studies with freshly cut logs, adult beetles occasionally landed on nonash logs but generally laid fewer eggs than on ash logs. Larvae fed and developed normally on ash logs, which were often heavily infested. No larvae were able to survive, grow, or develop on any nonash logs, although failed first-instar galleries occurred on some walnut logs. High densities of larvae developed on live green ash and white ash nursery trees, but there was no evidence of larval survival or development on Japanese tree lilac and black walnut trees in the same plantation. We felled, debarked, and intensively examined >28 m2 of phloem area on nine American elm trees growing in contact with or adjacent to heavily infested ash trees. We found no sign of A. planipennis feeding on any elm.

  12. Calculation and evaluation of log-based physical properties in the inner accretionary prism, NanTroSEIZE Site C0002, Nankai Trough, Japan

    NASA Astrophysics Data System (ADS)

    Webb, S. I.; Tudge, J.; Tobin, H. J.

    2013-12-01

    Integrated Ocean Drilling Program (IODP) Expedition 338, the most recently completed drilling stage of the NanTroSEIZE project, targeted the Miocene inner accretionary prism off the coast of southwest Japan. NanTroSEIZE is a multi-stage project in which the main objective is to characterize, sample, and instrument the potentially seismogenic region of the Nankai Trough, an active subduction zone. Understanding the physical properties of the inner accretionary prism will aid in the characterization of the deformation that has taken place and the evolution of stress, fluid pressure, and strain over the deformational history of these sediments and rocks. This study focuses on the estimation of porosity and density from available logs to inform solid and fluid volume estimates at Site C0002 from the sea floor through the Kumano Basin into the accretionary prism. Gamma ray, resistivity, and sonic logs were acquired at Hole C0002F, to a total depth of 2005 mbsf into the inner accretionary prism. Because a density and neutron porosity tool could not be deployed, porosity and density must be estimated using a variety of largely empirical methods. In this study, we calculate estimated porosity and density from both the electrical resistivity and sonic (P-wave velocity) logs collected in Hole C0002F. However, the relationship of these physical properties to the available logs is not straightforward and can be affected by changes in fluid type, salinity, temperature, presence of fractures, and clay mineralogy. To evaluate and calibrate the relationships among these properties, we take advantage of the more extensive suite of LWD data recorded in Hole C0002A at the same drill site, including density and neutron porosity measurements. Data collected in both boreholes overlaps in the interval from 875 - 1400 mbsf in the lower Kumano Basin and across the basin-accretionary wedge boundary. Core-based physical properties are also available across this interval. 
Through comparison of density and porosity values in intervals where core and LWD data overlap, we calculate porosity and density values and evaluate their uncertainties, developing a best estimate given the specific lithology and pore fluid at this tectonic setting. We then propagate this calibrated estimate to the deeper portions of C0002F where core and LWD density and porosity measurements are unavailable, using the sonic and resistivity data alone.
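    Two standard petrophysical transforms of the kind used for such log-based estimates are Archie's equation (porosity from resistivity) and the Wyllie time-average equation (porosity from sonic slowness). These are generic textbook relations, not necessarily the exact calibrations developed at Site C0002, and the default parameter values below are common illustrative choices:

```python
def porosity_from_resistivity(rt, rw, a=1.0, m=2.0):
    """Archie's equation for a fully water-saturated formation:
    Rt = a * Rw / phi**m, so phi = (a * Rw / Rt) ** (1/m).
    a (tortuosity factor) and m (cementation exponent) require
    local calibration, e.g. against LWD density-porosity data.
    """
    return (a * rw / rt) ** (1.0 / m)

def porosity_from_sonic(dt, dt_matrix=55.5, dt_fluid=189.0):
    """Wyllie time-average equation (slownesses in us/ft):
    dt = phi * dt_fluid + (1 - phi) * dt_matrix.
    Defaults are generic sandstone-matrix and seawater values.
    """
    return (dt - dt_matrix) / (dt_fluid - dt_matrix)
```

    Both relations are sensitive to the factors the abstract lists (fluid salinity, temperature, fractures, clay content), which is why calibration against the overlapping LWD and core interval is needed before extrapolating downhole.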

  13. Effects of U.S. Navy Diver Training on Physiological Parameters, Time of Useful Consciousness and Cognitive Performance During Periods of Normobaric Hypoxia

    DTIC Science & Technology

    2014-04-01

    from the pulse oximeter were integrated, digitized, and displayed graphically in real time in LabView (National Instruments) and logged at 20 Hz...Peripheral oxygenation monitoring: Fg-SpO2 levels were measured using a pulse oximeter placed on the left index finger (ROBD-2; Series 6202, Environics...Tolland, CT). Heart rate monitoring: HR was measured using a pulse oximeter placed on the left index finger (ROBD-2; Series 6202, Environics

  14. Effects of U.S. Navy Diver Training on Physiological Parameters, Time of Useful Consciousness, and Cognitive Performance During Periods of Normobaric Hypoxia

    DTIC Science & Technology

    2014-04-01

    from the pulse oximeter were integrated, digitized, and displayed graphically in real time in LabView (National Instruments) and logged at 20 Hz...Peripheral oxygenation monitoring: Fg-SpO2 levels were measured using a pulse oximeter placed on the left index finger (ROBD-2; Series 6202, Environics...Tolland, CT). Heart rate monitoring: HR was measured using a pulse oximeter placed on the left index finger (ROBD-2; Series 6202, Environics

  15. Kalman Filter Chemical Data Assimilation: A Case Study in January 1992

    NASA Technical Reports Server (NTRS)

    Lary, D. J.; Khattatov, B.; Atlas, Robert; Mussa, H.

    2002-01-01

    This paper describes a Kalman filter chemical data assimilation system and its use for analysing a vertical atmospheric profile during January 1992. The vertical profile was at an equivalent PV latitude (phi(sub e)) of 55 deg S and consisted of 21 potential temperature (theta) levels spaced equally in log(theta) between 400 K and 2000 K. This equivalent latitude was chosen as it was well observed during January 1992 by instruments on board the Upper Atmosphere Research Satellite (UARS).

  16. ULF/VLF (0.001 to 50 Hz) Seismo-Acoustic Noise in the Ocean. Proceedings of a Workshop Held at Austin, Texas on November 29-December 1, 1988

    DTIC Science & Technology

    1989-08-03

    holes drilled in the seafloor from the D/V JOIDES Resolution through petrological, geochemical and paleomagnetic studies of the samples and logging...seismometers and/or hydrophones (or differential pressure gauges, DPG). Testing of the new instruments at very early stages is important to ensure...resolved using ocean bottom seismometers, suspended hydrophones and differential pressure gauges assisted by an orbiting radar altimeter (GEOSAT

  17. MODIS. Volume 1: MODIS level 1A software baseline requirements

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Fleig, Albert; Ardanuy, Philip; Goff, Thomas; Carpenter, Lloyd; Solomon, Carl; Storey, James

    1994-01-01

    This document describes the level 1A software requirements for the moderate resolution imaging spectroradiometer (MODIS) instrument. This includes internal and external requirements. Internal requirements include functional, operational, and data processing as well as performance, quality, safety, and security engineering requirements. External requirements include those imposed by data archive and distribution systems (DADS); scheduling, control, monitoring, and accounting (SCMA); product management (PM) system; MODIS log; and product generation system (PGS). Implementation constraints and requirements for adapting the software to the physical environment are also included.

  18. Automated decontamination of surface-adherent prions.

    PubMed

    Schmitt, A; Westner, I M; Reznicek, L; Michels, W; Mitteregger, G; Kretzschmar, H A

    2010-09-01

    At present there is no routinely available decontamination procedure in washer-disinfectors that allows the reliable inactivation and/or elimination of prions present on reusable surgical instruments. This means it is not possible to provide assurance that iatrogenic transmission of prion diseases will be prevented. We need effective prion decontamination procedures that can be integrated into the usual routine of reprocessing surgical instruments. This article reports on the evaluation of an automated process designed to decontaminate prions in washer-disinfectors, using a quantitative, highly sensitive in vivo assay for surface-adherent 22L prions. The automated process showed great advantages when compared with conventional alkaline cleaning. The new process was as effective as autoclaving at 134 degrees C for 2 h and left no detectable prion infectivity, even for heavily contaminated surfaces. This indicates a reduction of surface-adherent prion infectivity of >7 log units. Due to its compatibility with even delicate surgical instruments, the process can be integrated into the large-scale reprocessing of instruments in a central sterile supply department. The system could potentially make an important contribution to the prevention of iatrogenic transmission of prions. Copyright 2010 The Hospital Infection Society. Published by Elsevier Ltd. All rights reserved.

  19. Global and Local Approaches Describing Critical Phenomena on the Developing and Developed Financial Markets

    NASA Astrophysics Data System (ADS)

    Grech, Dariusz

We define and confront global and local methods of analyzing crash-like events on financial markets from the critical-phenomena point of view. These methods are based, respectively, on the analysis of log-periodicity and on the local fractal properties of financial time series in the vicinity of phase transitions (crashes). The log-periodicity analysis is performed on a daily time horizon for the whole history (1991-2008) of the Warsaw Stock Exchange Index (WIG), associated with the largest developing financial market in Europe. We find that crash-like events on the Polish financial market are described better by the log-divergent price model decorated with log-periodic behavior than by the power-law-divergent price model usually discussed in log-periodic scenarios for developed markets. Predictions from the log-periodicity scenario are verified for all main crashes in WIG history. It is argued that crash predictions within the log-periodicity model depend strongly on the amount of data used in the fit and are therefore likely to contain large inaccuracies. This global analysis is then confronted with the local fractal description. To do so, we calculate the local (time-dependent) Hurst exponent H_loc for the WIG time series and for the main US stock market indices, DJIA and S&P 500. We point out the dependence between the behavior of the local fractal properties of financial time series and the appearance of crashes on financial markets. We conclude that the local fractal method seems to work better than the global approach, for developing and developed markets alike. The very recent situation on the market, particularly the Fed intervention in September 2007 and the period immediately afterwards, is also analyzed within the fractal approach. It is shown in this context how the financial market evolves through different phases of fractional Brownian motion.
Finally, the current situation on the American market is analyzed in fractal language, to show how far we still are from the end of the recession and from the beginning of a new boom on the US financial market and other leading world stocks.
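
The local (time-dependent) Hurst exponent H_loc described above can be sketched with a simple sliding-window rescaled-range (R/S) estimator. This is a generic illustration of the technique, not the authors' exact procedure; the window size, step, and sub-series lengths below are arbitrary choices.

```python
import numpy as np

def rs_hurst(x):
    """Estimate the Hurst exponent of a series x by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Use dyadic sub-series lengths up to half the window length.
    sizes = [s for s in (8, 16, 32, 64, 128, 256) if s <= n // 2]
    log_s, log_rs = [], []
    for s in sizes:
        rs_vals = []
        for start in range(0, n - s + 1, s):
            seg = x[start:start + s]
            z = np.cumsum(seg - seg.mean())
            r = z.max() - z.min()          # range of cumulative deviations
            sd = seg.std()
            if sd > 0:
                rs_vals.append(r / sd)
        log_s.append(np.log(s))
        log_rs.append(np.log(np.mean(rs_vals)))
    # The slope of log(R/S) versus log(s) estimates H.
    return np.polyfit(log_s, log_rs, 1)[0]

def local_hurst(returns, window=256, step=32):
    """Time-dependent Hurst exponent H_loc over sliding windows."""
    return [rs_hurst(returns[i:i + window])
            for i in range(0, len(returns) - window + 1, step)]

# Illustration on synthetic uncorrelated returns: H_loc should hover near 0.5
# (small-sample R/S estimates are biased slightly above 0.5).
rng = np.random.default_rng(0)
h = local_hurst(rng.standard_normal(2000))
print(min(h), max(h))
```

Persistent phases of fractional Brownian motion would show up as windows with H_loc clearly above 0.5, anti-persistent phases below it.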

  20. Green lumber grade yields from black cherry and red maple factory grade logs sawed at band and circular mills

    Treesearch

    Daniel A. Yaussy

    1989-01-01

Multivariate regression models were developed to predict green board-foot yields (1 board ft = 2.360 dm³) for the standard factory lumber grades processed from black cherry (Prunus serotina Ehrh.) and red maple (Acer rubrum L.) factory grade logs sawed at band and circular sawmills. The models use log...

  1. Adjusting Quality index Log Values to Represent Local and Regional Commercial Sawlog Product Values

    Treesearch

    Orris D. McCauley; Joseph J. Mendel

    1969-01-01

    The primary purpose of this paper is not only to report the results of a comparative analysis of how well the Q.I. method predicts log product values relative to commercial sawmill log output values, but also to develop a methodology that facilitates the comparison and provides the adjustments needed by the sawmill operator.

  2. A Manual of Instruction for Log Scaling and the Measurement of Timber Products.

    ERIC Educational Resources Information Center

    Idaho State Board of Vocational Education, Boise. Div. of Trade and Industrial Education.

    This manual was developed by a state advisory committee in Idaho to improve and standardize log scaling and provide a reference in training men for the job of log scaling in timber measurement. The content includes: (1) an introduction containing the scope of the manual, a definition and history of scaling, the reasons for scaling, and the…

  3. British Columbia log export policy: historical review and analysis.

    Treesearch

    Craig W. Shinn

    1993-01-01

    Log exports have been restricted in British Columbia for over 100 years. The intent of the restriction is to use the timber in British Columbia to encourage development of forest industry, employment, and well-being in the Province. Logs have been exempted from the within-Province manufacturing rule at various times, in varying amounts, for different reasons, and by...

  4. When Smart People Fail: An Analysis of the Transaction Log of an Online Public Access Catalog.

    ERIC Educational Resources Information Center

    Peters, Thomas A.

    1989-01-01

    Describes a low cost study of the transaction logs of an online catalog at an academic library that examined failure rates, usage patterns, and probable causes of patron problems. The implications of the findings for bibliographic instruction and collection development are discussed and the benefits of analyzing transaction logs are identified.…

  5. Butt-log grade distributions for five Appalachian hardwood species

    Treesearch

    John R. Myers; Gary W. Miller; Harry V. Wiant, Jr.; Joseph E. Barnard

    1986-01-01

    Tree quality is an important factor in determining the market value of hardwood timber stands, but many forest inventories do not include estimates of tree quality. Butt-log grade distributions were developed for northern red oak, black oak, white oak, chestnut oak, and yellow-poplar using USDA Forest Service log grades on more than 4,700 trees in West Virginia. Butt-...

  6. Multivariate regression model for predicting yields of grade lumber from yellow birch sawlogs

    Treesearch

    Andrew F. Howard; Daniel A. Yaussy

    1986-01-01

    A multivariate regression model was developed to predict green board-foot yields for the common grades of factory lumber processed from yellow birch factory-grade logs. The model incorporates the standard log measurements of scaling diameter, length, proportion of scalable defects, and the assigned USDA Forest Service log grade. Differences in yields between band and...
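
The modelling approach can be sketched as an ordinary-least-squares fit with a multivariate (multi-column) response, one column per lumber grade, regressed on the standard log measurements named above. All data and coefficients below are synthetic illustrations, not the paper's model.

```python
import numpy as np

# Synthetic predictors mimicking standard log measurements (hypothetical units).
rng = np.random.default_rng(1)
n = 200
diameter = rng.uniform(10, 24, n)   # scaling diameter, inches
length = rng.uniform(8, 16, n)      # log length, feet
defect = rng.uniform(0.0, 0.3, n)   # proportion of scalable defects
grade = rng.integers(1, 4, n)       # assigned log grade, 1-3

X = np.column_stack([np.ones(n), diameter, length, defect, grade])
# Two response columns, e.g. yields of two lumber grades (synthetic "truth").
B_true = np.array([[5.0, 2.0],      # intercepts
                   [3.0, 1.0],      # diameter effects
                   [2.0, 1.5],      # length effects
                   [-40.0, -10.0],  # defect effects
                   [-6.0, -3.0]])   # grade effects
Y = X @ B_true + rng.normal(0, 5, (n, 2))

# Ordinary least squares fitted jointly for all response columns.
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(B_hat.round(2))
```

With the coefficient matrix in hand, predicted yields for a new log are just `new_x @ B_hat`, one prediction per lumber grade.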

  7. The modelling of carbon-based supercapacitors: Distributions of time constants and Pascal Equivalent Circuits

    NASA Astrophysics Data System (ADS)

    Fletcher, Stephen; Kirkpatrick, Iain; Dring, Roderick; Puttock, Robert; Thring, Rob; Howroyd, Simon

    2017-03-01

    Supercapacitors are an emerging technology with applications in pulse power, motive power, and energy storage. However, their carbon electrodes show a variety of non-ideal behaviours that have so far eluded explanation. These include Voltage Decay after charging, Voltage Rebound after discharging, and Dispersed Kinetics at long times. In the present work, we establish that a vertical ladder network of RC components can reproduce all these puzzling phenomena. Both software and hardware realizations of the network are described. In general, porous carbon electrodes contain random distributions of resistance R and capacitance C, with a wider spread of log R values than log C values. To understand what this implies, a simplified model is developed in which log R is treated as a Gaussian random variable while log C is treated as a constant. From this model, a new family of equivalent circuits is developed in which the continuous distribution of log R values is replaced by a discrete set of log R values drawn from a geometric series. We call these Pascal Equivalent Circuits. Their behaviour is shown to resemble closely that of real supercapacitors. The results confirm that distributions of RC time constants dominate the behaviour of real supercapacitors.
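
The geometric-series construction can be illustrated with a toy model: a simplified parallel-branch analogue (rather than the paper's vertical ladder network) of N series-RC branches whose resistances form a geometric progression, i.e. equally spaced in log R, while C is held constant, mirroring the simplification that log R is the random variable and log C a constant. The component values and branch count are hypothetical, chosen only to show the dispersed-kinetics effect.

```python
import numpy as np

N, C, V = 10, 1.0, 1.0           # branch count, capacitance (F), step voltage (V)
R = 1.0 * 2.0 ** np.arange(N)    # R_k = R0 * r^k, geometric series with ratio 2

def step_current(t):
    """Total current into N parallel series-RC branches after a voltage step at t=0."""
    return sum((V / Rk) * np.exp(-t / (Rk * C)) for Rk in R)

t = np.logspace(-2, 4, 200)
i = step_current(t)
# Dispersed kinetics: because the RC time constants span many decades,
# the total current decays far more slowly than any single exponential.
print(i[0], i[-1])
```

Plotted against log t, such a superposition of exponentials relaxes quasi-linearly over several decades, which is the kind of slow, non-ideal behaviour (voltage decay, dispersed kinetics at long times) the abstract attributes to distributed RC networks.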

  8. The development and validation of the Instructional Practices Log in Science: a measure of K-5 science instruction

    NASA Astrophysics Data System (ADS)

    Adams, Elizabeth L.; Carrier, Sarah J.; Minogue, James; Porter, Stephen R.; McEachin, Andrew; Walkowiak, Temple A.; Zulli, Rebecca A.

    2017-02-01

    The Instructional Practices Log in Science (IPL-S) is a daily teacher log developed for K-5 teachers to self-report their science instruction. The items on the IPL-S are grouped into scales measuring five dimensions of science instruction: Low-level Sense-making, High-level Sense-making, Communication, Integrated Practices, and Basic Practices. As part of the current validation study, 206 elementary teachers completed 4137 daily log entries. The purpose of this paper is to provide evidence of validity for the IPL-S's scales, including (a) support for the theoretical framework; (b) cognitive interviews with logging teachers; (c) item descriptive statistics; (d) comparisons of 28 pairs of teacher and rater logs; and (e) an examination of the internal structure of the IPL-S. We present evidence to describe the extent to which the items and the scales are completed accurately by teachers and differentiate various types of science instructional strategies employed by teachers. Finally, we point to several practical implications of our work and potential uses for the IPL-S. Overall, results provide neutral to positive support for the validity of the groupings of items or scales.

  9. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    PubMed

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have delayed the onset of rigor mortis and thus created a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory, and microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.001) contraction (-7.9% ± 0.9%) along the caudal-cranial axis. No significant differences in instrumental color (a*, b*, C*, or h*), texture (hardness), or sensory traits (aroma, color, taste, and texture) were observed between pre- and post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had significantly (P<0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.01) lighter (L*, 39.7 ± 1.0) than post-rigor fillets (37.8 ± 0.8) and had significantly (P<0.05) lower aerobic plate counts (APC; 1.4 ± 0.4 log CFU/g vs. 2.6 ± 0.6 log CFU/g) and psychrotrophic counts (PC; 2.1 ± 0.2 log CFU/g vs. 3.0 ± 0.5 log CFU/g) than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when suitable injection salting protocols and smoking techniques are used. © 2010 Institute of Food Technologists®

  10. Predicting landslides in clearcut patches

    Treesearch

    Raymond M. Rice; Norman H. Pillsbury

    1982-01-01

    Accelerated erosion in the form of landslides can be an undesirable consequence of clearcut logging on steep slopes. Forest managers need a method of predicting the risk of such erosion. Data collected after logging in a granitic area of northwestern California were used to develop a predictive equation. A linear discriminant function was developed that...

  11. Volume, value, and thinning: logs for the future.

    Treesearch

    Sally Duncan

    2002-01-01

    Thinning is one of our most important ways to influence tree and stand development. The objectives may include increasing the volume, size, and quality of wood produced from a forest and developing particular stand structures and characteristics for other values, such as wildlife or aesthetics. The Levels-of-Growing-Stock (LOGS) Cooperative was initiated in...

  12. Particle Number Concentrations for HI-SCALE Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hering, Susanne V

    In support of the Holistic Interactions of Shallow Clouds, Aerosols, and Ecosystems (HI-SCALE) project to study new particle formation in the atmosphere, a pair of custom water condensation particle counters were provided to the second intensive field campaign, from mid-August through mid-September 2017, at the U.S. Department of Energy Southern Great Plains Atmospheric Radiation Measurement (ARM) Climate Research Facility observatory. These custom instruments were developed by Aerosol Dynamics, Inc. (Hering et al. 2017) to detect particles into the nanometer size range. Referred to as “versatile water condensation particle counters (vWCPCs)”, they are water-based, laminar-flow condensational growth instruments whose lower particle size threshold can be set based on user-selected operating temperatures. For HI-SCALE, the vWCPCs were configured to measure airborne particle number concentrations in the size range from approximately 2 nm to 2 μm. Both were installed in the particle sizing system operated by Chongai Kuang of Brookhaven National Laboratory (BNL). One of these was operated in parallel to a TSI Model 3776, upstream of the mobility particle sizing system, to measure total ambient particle concentrations. The airborne particle concentration data from this “total particle number vWCPC” (Ntot-vWCPC) system have been reported to the ARM database with one-second resolution. The second vWCPC was operated in parallel with the BNL diethylene glycol instrument to count particles downstream of a separate differential mobility size analyzer. Data from this “DMA-vWCPC” system were logged by BNL and will eventually be provided by that laboratory.

  13. Simultaneous ground-based and airborne measurements of biogenic VOC oxidation products using iodide-adduct HR-ToF-CIMS in the Southeast U.S

    NASA Astrophysics Data System (ADS)

    Lee, B.; Mohr, C.; Lopez-Hilfiker, F.; Warneke, C.; Graus, M.; Gilman, J.; Lerner, B. M.; Pollack, I. B.; Ryerson, T. B.; Roberts, J. M.; Edwards, P. M.; Brown, S. S.; Holloway, J.; Aikin, K.; Dube, W. P.; Liao, J.; Welti, A.; Middlebrook, A. M.; Nowak, J. B.; Neuman, J. A.; Brioude, J. F.; McKeen, S. A.; Hanisco, T. F.; Kaiser, J.; Keutsch, F. N.; Wolfe, G. M.; Hallquist, M.; Trainer, M.; De Gouw, J. A.; Thornton, J. A.

    2013-12-01

    We present measurements by two high-resolution time-of-flight chemical-ionization mass spectrometers (HR-ToF-CIMS) during the Southeast Atmosphere Study in June and July of 2013. Both instruments used iodide as the reagent ion, which minimizes fragmentation during ionization. Isoprene and monoterpene oxidation products such as hydroxy hydroperoxides, carboxylic acids, and organic nitrates were ubiquitous in the mass spectra. In addition, we observed select inorganic gases such as N2O5 and ClNO2. The flight instrument was deployed aboard the NOAA WP-3D during SENEX, which explored the lower atmosphere over the Southeast U.S., logging a total of 125 flight hours. These measurements provide insight into the spatial and temporal variation of these compounds and the influence of natural gas fields, power plants, biomass burning, and urban and biogenic emissions on their abundance under both day and nighttime conditions. The ground-based instrument was located near Brent, Alabama as part of the SOAS campaign and utilized the Filter Inlet for Gases and AEROsols (FIGAERO), developed at the University of Washington, which allows measurement of both the gas and particle phases. Continuous observations spanning more than 4 weeks show the diurnal variability and the influence of meteorology and anthropogenic emissions on gas-particle partitioning. Flights during SENEX over the ground site provide a unique opportunity to investigate the vertical distribution of a whole suite of chemical species measured by these two cross-calibrated, nearly identical instruments.

  14. Contrast Invariant Interest Point Detection by Zero-Norm LoG Filter.

    PubMed

    Zhenwei Miao; Xudong Jiang; Kim-Hui Yap

    2016-01-01

    The Laplacian of Gaussian (LoG) filter is widely used in interest point detection. However, low-contrast image structures, though stable and significant, are often submerged by high-contrast ones in the response image of the LoG filter, and hence are difficult to detect. To solve this problem, we derive a generalized LoG filter and propose a zero-norm LoG filter. The response of the zero-norm LoG filter is proportional to the weighted number of bright/dark pixels in a local region, which makes the filter invariant to image contrast. Based on the zero-norm LoG filter, we develop an interest point detector to extract local structures from images. Compared with contrast-dependent detectors, such as the popular scale-invariant feature transform (SIFT) detector, the proposed detector is robust to illumination changes and abrupt variations in images. Experiments on benchmark databases demonstrate the superior performance of the proposed zero-norm LoG detector in terms of the repeatability and matching score of the detected points, as well as the image recognition rate under different conditions.
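
The conventional detector that the paper generalizes can be sketched as follows: filter the image with a scale-normalized LoG and keep local response maxima above a threshold. This is the standard contrast-dependent baseline, not the zero-norm variant, which is specific to the paper; the threshold and neighborhood size below are arbitrary.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, maximum_filter

def log_blobs(image, sigma=2.0, threshold=0.05):
    """Detect dark-blob interest points as local maxima of the
    scale-normalized Laplacian-of-Gaussian response."""
    # Multiplying by sigma**2 normalizes responses across scales.
    response = sigma ** 2 * gaussian_laplace(image.astype(float), sigma)
    # A pixel is an interest point if it is a local maximum above threshold
    # (dark blobs give positive LoG responses).
    peaks = (response == maximum_filter(response, size=5)) & (response > threshold)
    return np.argwhere(peaks)

# Synthetic test image: one dark Gaussian blob on a bright background.
img = np.ones((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
img -= 0.8 * np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 3.0 ** 2))
pts = log_blobs(img, sigma=3.0)
print(pts)
```

Because the threshold acts on the raw LoG magnitude, a low-contrast blob (say depth 0.05 instead of 0.8) would fall below it and be missed, which is exactly the contrast dependence the zero-norm filter is designed to remove.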

  15. Hydrogeomorphological and water quality impacts of oil palm conversion and logging in Sabah, Malaysian Borneo: a multi-catchment approach

    NASA Astrophysics Data System (ADS)

    Walsh, Rory; Nainar, Anand; Bidin, Kawi; Higton, Sam; Annammala, Kogilavani; Blake, William; Luke, Sarah; Murphy, Laura; Perryman, Emily; Wall, Katy; Hanapi, Jamil

    2016-04-01

    The last three decades have seen a combination of logging and land-use change across most of the rainforest tropics. This has involved conversion to oil palm across large parts of SE Asia. Although much is now known about the hydrological and sediment transport impacts of logging, relatively little is known about how impacts of oil palm conversion compare with those of logging. Furthermore little is known about the impacts of both on river morphology and water quality. This paper reports some findings of the first phase of a ten-year large-scale manipulative multi-catchment experiment (part of the SAFE - Stability of Altered Forest Ecosystems - Project), based in the upper part of the Brantian Catchment in Sabah, Malaysian Borneo; the project is designed to assess the degree to which adverse impacts of oil palm conversion (on erosion, downstream channel change, water quality and river ecology) might be reduced by retaining buffer zones of riparian forest of varying width from zero to 120 metres. Ten 2 km2 catchments of contrasting land use history have been instrumented since 2011 to record discharge, turbidity, conductivity and water temperature at 5-minute intervals. These comprise 6 repeat-logged catchments being subjected in 2015-16 to conversion to oil palm with varying riparian forest widths; a repeat-logged 'control' catchment; an old regrowth catchment; an oil palm catchment; and a primary forest catchment. 
In addition, (1) monthly water samples from the catchments have been analysed for nitrates and phosphates, (2) channel cross-sectional change along each stream has been monitored at six-monthly intervals and (3) supplementary surveys have been made of downstream bankfull channel cross-sectional size and water chemistry at a wider range of catchment sites, and (4) sediment cores have been taken and contemporary deposition monitored at a hierarchical network of sites in the large Brantian catchment for geochemical analysis and dating to establish the history of sedimentation and inferred changes in upstream sediment sources. Effects on river ecology were also assessed. This paper summarises the key findings to date, focussing on differences in suspended sediment dynamics, downstream bankfull channel size and shape, and pollution between oil palm catchments, and catchments under post-logging and primary rainforest.

  16. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup--The hardware, firmware, and software implementation.

    PubMed

    Antony, Joby; Mathuria, D S; Datta, T S; Maity, Tanmoy

    2015-12-01

    The power of Ethernet for control and automation technology has become widely appreciated by the automation industry in recent years. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today, and Ethernet is best known for enabling control over the Internet from anywhere in the world. An Ethernet interface with built-in on-chip embedded servers ensures global connectivity for a crate-less model of control and data acquisition systems, which has several advantages over traditional crate-based control architectures for slow applications. This architecture completely eliminates the use of any extra PLC (Programmable Logic Controller) or similar control hardware in the automation network, as the control functions are coded in firmware inside the intelligent meters themselves. Here, we describe the indigenously built cryogenic control system for the linear accelerator at the Inter University Accelerator Centre, known as "CADS," which stands for "Complete Automation of Distribution System." CADS comprises the complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system, using many Ethernet-based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called a device-server, with the control functions and control loops built into the firmware itself. Dedicated meters with built-in servers were designed around ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercial Off-the-Shelf) SMD (Surface Mount Device) components, with an analog sensor front-end and a digital back-end web server implementing remote procedure calls over HTTP for digital control and readout functions. 
At present, 24 instruments running 58 embedded servers, each specific to a particular sensor-actuator combination for closed-loop operation, are deployed and distributed across the control LAN (Local Area Network). Six categories of such instruments were identified to cover all cryogenic applications required for linac operation and were designed to build this medium-scale cryogenic automation setup. These devices have special features such as remote rebooters and daughter boards for PIDs (Proportional Integral Derivative controllers) to allow remote operation in radiation areas, and also have emergency switches by which each device can temporarily be put into an emergency mode. Finally, all the data are monitored, logged, controlled, and analyzed online in a central control room through a user-friendly control interface developed in LabVIEW(®). This paper discusses the overall hardware, firmware, and software design and implementation for the cryogenics setup.
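
The device-server idea, an instrument exposing readout and control directly over HTTP so that no PLC or crate is needed, can be sketched in a few lines. The endpoints, JSON payloads, and simulated sensor below are hypothetical illustrations of the architecture, not the CADS API.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Simulated instrument state (hypothetical sensor and actuator).
STATE = {"temperature_K": 4.5, "heater_on": False}

class DeviceHandler(BaseHTTPRequestHandler):
    def _send(self, payload, status=200):
        body = json.dumps(payload).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        # Readout endpoint: report the current (simulated) sensor state.
        if self.path == "/readout":
            self._send(STATE)
        else:
            self._send({"error": "unknown endpoint"}, 404)

    def do_POST(self):
        # Control endpoint: a remote-procedure-call-style heater toggle.
        if self.path == "/control/heater":
            n = int(self.headers.get("Content-Length", 0))
            cmd = json.loads(self.rfile.read(n))
            STATE["heater_on"] = bool(cmd.get("on", False))
            self._send({"ok": True, "heater_on": STATE["heater_on"]})
        else:
            self._send({"error": "unknown endpoint"}, 404)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), DeviceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# Client side (e.g. a control-room GUI): read out, then issue a control call.
readout = json.loads(urllib.request.urlopen(base + "/readout").read())
req = urllib.request.Request(base + "/control/heater",
                             data=json.dumps({"on": True}).encode(),
                             headers={"Content-Type": "application/json"})
reply = json.loads(urllib.request.urlopen(req).read())
server.shutdown()
print(readout, reply)
```

In a real device-server the handler logic would live in the instrument's firmware, with the control loop running locally and HTTP used only for supervision and logging.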

  17. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup—The hardware, firmware, and software implementation

    NASA Astrophysics Data System (ADS)

    Antony, Joby; Mathuria, D. S.; Datta, T. S.; Maity, Tanmoy

    2015-12-01

    The power of Ethernet for control and automation technology has become widely appreciated by the automation industry in recent years. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today, and Ethernet is best known for enabling control over the Internet from anywhere in the world. An Ethernet interface with built-in on-chip embedded servers ensures global connectivity for a crate-less model of control and data acquisition systems, which has several advantages over traditional crate-based control architectures for slow applications. This architecture completely eliminates the use of any extra PLC (Programmable Logic Controller) or similar control hardware in the automation network, as the control functions are coded in firmware inside the intelligent meters themselves. Here, we describe the indigenously built cryogenic control system for the linear accelerator at the Inter University Accelerator Centre, known as "CADS," which stands for "Complete Automation of Distribution System." CADS comprises the complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system, using many Ethernet-based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called a device-server, with the control functions and control loops built into the firmware itself. Dedicated meters with built-in servers were designed around ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercial Off-the-Shelf) SMD (Surface Mount Device) components, with an analog sensor front-end and a digital back-end web server implementing remote procedure calls over HTTP for digital control and readout functions. 
At present, 24 instruments running 58 embedded servers, each specific to a particular sensor-actuator combination for closed-loop operation, are deployed and distributed across the control LAN (Local Area Network). Six categories of such instruments were identified to cover all cryogenic applications required for linac operation and were designed to build this medium-scale cryogenic automation setup. These devices have special features such as remote rebooters and daughter boards for PIDs (Proportional Integral Derivative controllers) to allow remote operation in radiation areas, and also have emergency switches by which each device can temporarily be put into an emergency mode. Finally, all the data are monitored, logged, controlled, and analyzed online in a central control room through a user-friendly control interface developed in LabVIEW®. This paper discusses the overall hardware, firmware, and software design and implementation for the cryogenics setup.

  18. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup—The hardware, firmware, and software implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antony, Joby; Mathuria, D. S.; Datta, T. S.

    The power of Ethernet for control and automation technology has become widely appreciated by the automation industry in recent years. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today, and Ethernet is best known for enabling control over the Internet from anywhere in the world. An Ethernet interface with built-in on-chip embedded servers ensures global connectivity for a crate-less model of control and data acquisition systems, which has several advantages over traditional crate-based control architectures for slow applications. This architecture completely eliminates the use of any extra PLC (Programmable Logic Controller) or similar control hardware in the automation network, as the control functions are coded in firmware inside the intelligent meters themselves. Here, we describe the indigenously built cryogenic control system for the linear accelerator at the Inter University Accelerator Centre, known as “CADS,” which stands for “Complete Automation of Distribution System.” CADS comprises the complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system, using many Ethernet-based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called a device-server, with the control functions and control loops built into the firmware itself. Dedicated meters with built-in servers were designed around ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercial Off-the-Shelf) SMD (Surface Mount Device) components, with an analog sensor front-end and a digital back-end web server implementing remote procedure calls over HTTP for digital control and readout functions. 
At present, 24 instruments running 58 embedded servers, each specific to a particular sensor-actuator combination for closed-loop operation, are deployed and distributed across the control LAN (Local Area Network). Six categories of such instruments were identified to cover all cryogenic applications required for linac operation and were designed to build this medium-scale cryogenic automation setup. These devices have special features such as remote rebooters and daughter boards for PIDs (Proportional Integral Derivative controllers) to allow remote operation in radiation areas, and also have emergency switches by which each device can temporarily be put into an emergency mode. Finally, all the data are monitored, logged, controlled, and analyzed online in a central control room through a user-friendly control interface developed in LabVIEW®. This paper discusses the overall hardware, firmware, and software design and implementation for the cryogenics setup.

  19. How log-normal is your country? An analysis of the statistical distribution of the exported volumes of products

    NASA Astrophysics Data System (ADS)

    Annunziata, Mario Alberto; Petri, Alberto; Pontuale, Giorgio; Zaccaria, Andrea

    2016-10-01

    We have considered the statistical distributions of the volumes of 1131 products exported by 148 countries. We have found that the form of these distributions is not unique but heavily depends on the level of development of the nation, as expressed by macroeconomic indicators like GDP, GDP per capita, total export and a recently introduced measure for countries' economic complexity called fitness. We have identified three major classes: a) an incomplete log-normal shape, truncated on the left side, for the less developed countries, b) a complete log-normal, with a wider range of volumes, for nations characterized by intermediate economy, and c) a strongly asymmetric shape for countries with a high degree of development. Finally, the log-normality hypothesis has been checked for the distributions of all the 148 countries through different tests, Kolmogorov-Smirnov and Cramér-Von Mises, confirming that it cannot be rejected only for the countries of intermediate economy.
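
The testing step can be sketched as follows: if volumes are log-normal, their logarithms are normal, so standardized log-volumes can be compared against the standard normal with Kolmogorov-Smirnov and Cramér-von Mises tests. The data below are synthetic stand-ins, and note that p-values are only approximate when the distribution's parameters are estimated from the same sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical stand-in for one country's exported volumes (truly log-normal).
volumes = rng.lognormal(mean=10.0, sigma=2.0, size=1131)

# Under the log-normal hypothesis, log-volumes are normal; standardize them.
logs = np.log(volumes)
z = (logs - logs.mean()) / logs.std(ddof=1)

# Kolmogorov-Smirnov and Cramér-von Mises tests against the standard normal.
# Caveat: with parameters estimated from the data, these p-values are biased
# upward (the Lilliefors problem), so they are indicative only.
ks = stats.kstest(z, "norm")
cvm = stats.cramervonmises(z, "norm")
print(ks.pvalue, cvm.pvalue)
```

For a genuinely log-normal sample like this one, neither test rejects; a truncated or strongly asymmetric distribution of the kind described for the least and most developed countries would drive both p-values toward zero.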

  20. New Technique for TOC Estimation Based on Thermal Core Logging in Low-Permeable Formations (Bazhen fm.)

    NASA Astrophysics Data System (ADS)

    Popov, Evgeny; Popov, Yury; Spasennykh, Mikhail; Kozlova, Elena; Chekhonin, Evgeny; Zagranovskaya, Dzhuliya; Belenkaya, Irina; Alekseev, Aleksey

    2016-04-01

    A practical method for identifying organic-rich intervals within low-permeable dispersive rocks, based on thermal conductivity measurements along the core, is presented. Non-destructive, non-contact thermal core logging was performed with the optical scanning technique on 4,685 full-size core samples from 7 wells drilled in four low-permeable zones of the Bazhen formation (B.fm.) in Western Siberia (Russia). The method employs continuous simultaneous measurements of thermal conductivity, volumetric heat capacity, thermal anisotropy coefficient, and thermal heterogeneity factor along the cores, allowing high vertical resolution (up to 1-2 mm). B.fm. rock matrix thermal conductivity was observed to be essentially stable within the range of 2.5-2.7 W/(m*K). Stable matrix thermal conductivity along with a high thermal anisotropy coefficient is characteristic of B.fm. sediments due to the low rock porosity values. It is shown experimentally that the measured thermal parameters relate linearly to organic richness rather than to deviations in the porosity coefficient. Thus, a new technique was developed that transforms the thermal conductivity profiles into continuous profiles of total organic carbon (TOC) values along the core. Comparison of TOC values estimated from thermal conductivity with pyrolytic TOC estimations of 665 core samples using the Rock-Eval and HAWK instruments demonstrated the high efficiency of the new technique for separating organic-rich intervals. The data obtained with the new technique are essential for assessing source-rock hydrocarbon generation potential, for basin and petroleum system modeling, and for estimating hydrocarbon reserves. The method allows TOC richness to be accurately assessed using thermal well logs. This research was done with the financial support of the Russian Ministry of Education and Science (unique identification number RFMEFI58114X0008).
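
The transformation step can be sketched as a linear calibration: fit TOC against thermal conductivity on the pyrolysis-calibrated samples, then apply the fitted relation to the continuous conductivity log. All numbers below are synthetic illustrations, not the B.fm. data.

```python
import numpy as np

# Synthetic calibration set: conductivity (W/(m*K)) vs pyrolysis TOC (wt%),
# with an assumed inverse linear relation (more organics, lower conductivity).
rng = np.random.default_rng(7)
lam_cal = rng.uniform(0.8, 2.5, 60)                        # measured conductivity
toc_cal = 18.0 - 6.0 * lam_cal + rng.normal(0, 0.5, 60)    # pyrolytic TOC

# Least-squares linear calibration.
slope, intercept = np.polyfit(lam_cal, toc_cal, 1)

def toc_from_conductivity(lam):
    """Transform a continuous thermal-conductivity profile into a TOC profile."""
    return intercept + slope * np.asarray(lam)

# Apply to a (hypothetical) continuous conductivity log.
profile = toc_from_conductivity([1.0, 1.5, 2.0])
print(slope, intercept, profile)
```

The same inversion applied sample-by-sample along the optical-scanning log is what yields a continuous TOC profile at millimetre-scale resolution.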

  1. Measuring ecological impacts from logging in natural forests of the eastern Amazonia as a tool to assess forest degradation

    Treesearch

    Marco W Lentini; Johan C Zweede; Thomas P Holmes

    2010-01-01

    Sound forest management practices have been seen as a promising strategy for reconciling forest conservation with rural economic development in Amazônia. However, implementation of Reduced Impact Logging (RIL) techniques in the field has been incipient, and most Amazonian timber production is generated through predatory and illegal logging. Despite several...

  2. An interactive machine-learning approach for defect detection in computed tomography (CT) images of hardwood logs

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt; Philip A. Araman

    2005-01-01

    This paper describes recent progress in the analysis of computed tomography (CT) images of hardwood logs. The long-term goal of the work is to develop a system that is capable of autonomous (or semiautonomous) detection of internal defects, so that log breakdown decisions can be optimized based on defect locations. The problem is difficult because wood exhibits large...

  3. Application of a novel antimicrobial coating on roast beef for inactivation and inhibition of Listeria monocytogenes during storage.

    PubMed

    Wang, Luxin; Zhao, Liang; Yuan, Jing; Jin, Tony Z

    2015-10-15

    The antilisterial efficacy of novel coating solutions made with organic acids, lauric arginate ester, and chitosan was evaluated for the first time in a three-stage study on inoculated roast beef. Ready-to-eat roast beef was specially ordered from the manufacturer. The meat surface was inoculated with a five-strain Listeria monocytogenes cocktail at two levels, ~3 and 6 log CFU/cm², and treated with the stock solution (HAMS), the 1:5 diluted solution (MAMS), or the 1:10 diluted solution (LAMS) (stage 1). During the 20 min contact time, the antimicrobial coatings reduced the Listeria populations by approximately 0.3-0.9 log CFU/cm²; the higher the concentration of the antimicrobial solution, the greater the antilisterial effect. The treated inoculated beef samples were then stored at 4 °C for 30 days, during which inhibition of Listeria growth was observed. While no growth was seen on the HAMS-treated samples, increases of 1.6, 4.6, and 5.7 log CFU/cm² were seen on the MAMS-, LAMS-, and NoAMS-treated samples, respectively, by day 30 (~3 log CFU/cm² inoculation level). In the second stage, the impact of roast beef storage time on the solution's antilisterial effect was evaluated; results showed that the effect of the antimicrobial solution depended on both the initial inoculation level and the storage time. In stage 3, the effect of the antimicrobial solution on roast beef quality was studied with both instrumental measurements and sensory evaluation. Minor changes in color, pH, and water activity were found, and only limited sensory differences were seen between treated and untreated samples. When panelists could accurately detect color differences between samples, they preferred the treated samples. The findings of this research demonstrate the antilisterial efficacy of the novel antimicrobial solution and its potential for use as a surface coating on roast beef cuts to control Listeria contamination and protect color. Copyright © 2015 Elsevier B.V. All rights reserved.
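    The log-scale counts in this record follow standard microbiology arithmetic: a change in surface density is the base-10 logarithm of the final over the initial CFU count. A minimal sketch with illustrative numbers (not the study's raw data):

    ```python
    import math

    def log10_change(cfu_initial: float, cfu_final: float) -> float:
        """Change in surface density expressed in log10 CFU/cm^2.
        Negative values are reductions; positive values are growth."""
        return math.log10(cfu_final) - math.log10(cfu_initial)

    # Hypothetical counts: inoculated at ~10^3 CFU/cm^2, reduced to ~10^2.5
    # after a 20 min coating contact -- a ~0.5-log reduction.
    delta = log10_change(1.0e3, 10 ** 2.5)
    print(round(delta, 2))  # -0.5
    ```

    The same function reports storage-phase growth as a positive log increase, matching how the abstract states its day-30 results.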

  4. The PsyLOG mobile application: development of a tool for the assessment and monitoring of side effects of psychotropic medication.

    PubMed

    Kuzman, Martina Rojnic; Andlauer, Olivier; Burmeister, Kai; Dvoracek, Boris; Lencer, Rebekka; Koelkebeck, Katja; Nawka, Alexander; Riese, Florian

    2017-06-01

    Mobile health interventions are regarded as affordable and accessible tools that can enhance standard psychiatric care. As part of the mHealth Psycho-Educational Intervention Versus Antipsychotic-Induced Side Effects (mPIVAS) project (www.psylog.eu), we developed the mobile application "PsyLOG", based on smartphone technology, to monitor antipsychotic-induced side effects. The aim of this paper is to describe the rationale behind and development of PsyLOG and its clinical use. The PsyLOG application runs on smartphones with the Android operating system. The application is currently available in seven languages (Croatian, Czech, English, French, German, Japanese and Serbian). It consists of several categories: "My Drug Effects", "My Life Styles", "My Charts", "My Medication", "My Strategies", "My Supporters", "Settings" and "About". The main category "My Drug Effects" includes a list of 30 side effects, with the possibility to add three more. Each side effect is accompanied by an appropriate description and can be rated for severity on a visual analogue scale from 0 to 100%. The PsyLOG application is intended to enhance the link between patients and mental health professionals, serving as a tool that monitors side effects more objectively over time. To the best of our knowledge, no such applications have so far been developed for patients taking antipsychotic medication or for their therapists.

  5. Health Screenings at School

    MedlinePlus

    ... others develop later. A child who has difficulty reading the blackboard may not know that she is ...

  6. 1987 Nuclear Science Symposium, 34th, and 1987 Symposium on Nuclear Power Systems, 19th, San Francisco, CA, Oct. 21-23, 1987, Proceedings

    NASA Astrophysics Data System (ADS)

    Armantrout, Guy A.

    1988-02-01

    The present conference considers topics in radiation detectors, advanced electronic circuits, data acquisition systems, radiation detector systems, high-energy and nuclear physics radiation detection, spaceborne instrumentation, health physics and environmental radiation detection, nuclear medicine, nuclear well logging, and nuclear reactor instrumentation. Attention is given to the response of scintillators to heavy ions, phonon-mediated particle detection, ballistic deficits in pulse-shaping amplifiers, fast analog ICs for particle physics, logic cell arrays, the CERN host interface, high performance data buses, a novel scintillating glass for high-energy physics applications, background events in microchannel plates, a tritium accelerator mass spectrometer, a novel positron tomograph, advancements in PET, cylindrical positron tomography, nuclear techniques in subsurface geology, REE borehole neutron activation, and a continuous tritium monitor for aqueous process streams.

  7. Matrix form for the instrument line shape of Fourier-transform spectrometers yielding a fast integration algorithm to theoretical spectra.

    PubMed

    Desbiens, Raphaël; Tremblay, Pierre; Genest, Jérôme; Bouchard, Jean-Pierre

    2006-01-20

    The instrument line shape (ILS) of a Fourier-transform spectrometer is expressed in a matrix form. For all line shape effects that scale with wavenumber, the ILS matrix is shown to be transposed in the spectral and interferogram domains. The novel representation of the ILS matrix in the interferogram domain yields an insightful physical interpretation of the underlying process producing self-apodization. Working in the interferogram domain circumvents the problem of taking into account the effects of finite optical path difference and permits a proper discretization of the equations. A fast algorithm in O(N log2 N), based on the fractional Fourier transform, is introduced that permits the application of a constant resolving power line shape to theoretical spectra or forward models. The ILS integration formalism is validated with experimental data.
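    The wavenumber-scaling ILS the abstract describes can be illustrated with a naive matrix version: each row applies a line shape whose width grows with that row's wavenumber (constant resolving power). This O(N²) sketch with an assumed Gaussian shape is only a stand-in; the paper's contribution is the fast O(N log N) fractional-Fourier-transform algorithm, which is not reproduced here.

    ```python
    import numpy as np

    def ils_matrix(wavenumbers: np.ndarray, resolving_power: float) -> np.ndarray:
        """Naive O(N^2) constant-resolving-power ILS matrix: each row is a
        Gaussian line shape whose width scales with that row's wavenumber,
        mimicking line-shape effects that scale with wavenumber.
        (Illustrative Gaussian stand-in, not the paper's fast algorithm.)"""
        M = np.empty((wavenumbers.size, wavenumbers.size))
        for k, sk in enumerate(wavenumbers):
            fwhm = sk / resolving_power          # FWHM proportional to wavenumber
            sigma = fwhm / 2.355
            row = np.exp(-0.5 * ((wavenumbers - sk) / sigma) ** 2)
            M[k] = row / row.sum()               # normalize each row to unit area
        return M

    nu = np.linspace(900.0, 1100.0, 401)                  # cm^-1 grid, 0.5 cm^-1 step
    spectrum = np.exp(-0.5 * ((nu - 1000.0) / 0.5) ** 2)  # narrow test line
    observed = ils_matrix(nu, resolving_power=2000.0) @ spectrum
    ```

    Applying the matrix broadens and slightly lowers the test line while leaving its center in place, which is the behavior a forward model needs before comparison with measured spectra.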

  8. WinTICS-24 --- A Telescope Control Interface for MS Windows

    NASA Astrophysics Data System (ADS)

    Hawkins, R. Lee

    1995-12-01

    WinTICS-24 is a telescope control system interface and observing assistant written in Visual Basic for MS Windows. It provides the ability to control a telescope and up to 3 other instruments via the serial ports on an IBM-PC compatible computer, all from one consistent user interface. In addition to telescope control, WinTICS contains an observing logbook, trouble log (which can automatically email its entries to a responsible person), lunar phase display, object database (which allows the observer to type in the name of an object and automatically slew to it), a time of minimum calculator for eclipsing binary stars, and an interface to the Guide CD-ROM for bringing up finder charts of the current telescope coordinates. Currently WinTICS supports control of DFM telescopes, but is easily adaptable to other telescopes and instrumentation.

  9. MIST VR. A laparoscopic surgery procedures trainer and evaluator.

    PubMed

    Sutton, C; McCloy, R; Middlebrook, A; Chater, P; Wilson, M; Stone, R

    1997-01-01

    The key bimanual instrument tasks involved in laparoscopic surgery have been abstracted for use in a virtual reality surgical skills evaluator and trainer. The trainer uses two laparoscopic instruments mounted on a frame with position sensors which provide instrument movement data that is translated into interactive real-time graphics on a PC (P133, 16 Mb RAM, graphics acceleration card). An accurately scaled operating volume of 10 cm³ is represented by a 3D cube on the computer screen. "Camera" position and size of target objects can be varied for different skill levels. Targets appear randomly within the operating volume according to the skill task and can be grasped and manipulated with the instruments. Accuracy and errors during the tasks and time to completion are logged. MIST VR has tutorial, training, examination, analysis and configuration modes. Six tasks have been selected and include combinations of instrument approach, target acquisition, target manipulation and placement, transfer between instruments, target contact with optional diathermy, and controlled instrument withdrawal/replacement. Tasks can be configured for varying degrees of difficulty and the configurations saved to a library for reuse. Specific task configurations can be assigned to individual students. In the examination mode the supervisor can select the tasks, repetitions and order and save to a specific file for that trainee. Progress can be assessed and there is the option for playback of the training session or examination. Data analyses permit overall, per-task, and right- or left-hand performances to be quantified. MIST VR represents a significant advance over the subjective assessment of training performances with existing "plastic box" basic trainers.
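    The per-task logging described here (errors, time to completion, right- versus left-hand breakdown) amounts to a simple session log with per-hand aggregation. A minimal sketch; all field names and numbers are hypothetical, not MIST VR's actual data model:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class TaskRecord:
        """One trainee attempt at a skill task (hypothetical fields)."""
        task: str
        hand: str        # "left" or "right"
        errors: int
        seconds: float

    @dataclass
    class SessionLog:
        records: list = field(default_factory=list)

        def add(self, rec: TaskRecord) -> None:
            self.records.append(rec)

        def summary(self, hand: str) -> dict:
            """Aggregate attempts, errors, and time for one hand."""
            sel = [r for r in self.records if r.hand == hand]
            return {
                "attempts": len(sel),
                "errors": sum(r.errors for r in sel),
                "seconds": sum(r.seconds for r in sel),
            }

    log = SessionLog()
    log.add(TaskRecord("target acquisition", "right", errors=2, seconds=14.2))
    log.add(TaskRecord("transfer between instruments", "left", errors=1, seconds=21.7))
    print(log.summary("right"))  # {'attempts': 1, 'errors': 2, 'seconds': 14.2}
    ```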

  10. Automated potentiometric titrations in KCl/water-saturated octanol: method for quantifying factors influencing ion-pair partitioning.

    PubMed

    Scherrer, Robert A; Donovan, Stephen F

    2009-04-01

    The knowledge base of factors influencing ion pair partitioning is very sparse, primarily because of the difficulty in determining accurate log P(I) values of desirable low molecular weight (MW) reference compounds. We have developed a potentiometric titration procedure in KCl/water-saturated octanol that provides a link to log P(I) through the thermodynamic cycle of ionization and partitioning. These titrations have the advantage of being independent of the magnitude of log P, while maintaining a reproducibility of a few hundredths of a log P in the calculated difference between log P neutral and log P ion pair (diff (log P(N - I))). Simple model compounds can be used. The titration procedure is described in detail, along with a program for calculating pK(a)'' values incorporating the ionization of water in octanol. Hydrogen bonding and steric factors have a greater influence on ion pairs than they do on neutral species, yet these factors are missing from current programs used to calculate log P(I) and log D. In contrast to the common assumption that diff (log P(N - I)) is the same for all amines, they can actually vary more than 3 log units, as in our examples. A major factor affecting log P(I) is the ability of water and the counterion to approach the charge center. Bulky substituents near the charge center have a negative influence on log P(I). On the other hand, hydrogen bonding groups near the charge center have the opposite effect by lowering the free energy of the ion pair. The use of this titration method to determine substituent ion pair stabilization values (IPS) should bring about more accurate log D calculations and encourage species-specific QSAR involving log D(N) and log D(I). This work also brings attention to the fascinating world of nature's highly stabilized ion pairs.
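    The "link to log P(I) through the thermodynamic cycle" can be sketched numerically: because partitioning and ionization form a closed cycle, the shift in apparent pKa between water and water-saturated octanol equals diff (log P(N - I)). The sketch below uses one common sign convention for a monoprotic acid (the sign reverses for bases) and entirely hypothetical numbers; it is not the authors' calculation program.

    ```python
    def diff_logp_acid(pka_octanol: float, pka_water: float) -> float:
        """diff (log P(N - I)) for a monoprotic acid: the apparent pKa shift on
        moving from water to KCl/water-saturated octanol. Sketch of the
        thermodynamic-cycle relation; sign convention reverses for bases."""
        return pka_octanol - pka_water

    def logp_ion_pair(logp_neutral: float, diff_n_i: float) -> float:
        """log P of the ion pair from log P(neutral) and diff (log P(N - I))."""
        return logp_neutral - diff_n_i

    # Hypothetical acid: pKa 4.2 in water, apparent pKa 7.4 in octanol,
    # log P(neutral) = 2.1  ->  log P(ion pair) ~ 2.1 - 3.2 = -1.1
    diff_ni = diff_logp_acid(7.4, 4.2)
    print(logp_ion_pair(2.1, diff_ni))  # approximately -1.1
    ```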

  11. Automated Potentiometric Titrations in KCl/Water-Saturated Octanol: Method for Quantifying Factors Influencing Ion-Pair Partitioning

    PubMed Central

    2009-01-01

    The knowledge base of factors influencing ion pair partitioning is very sparse, primarily because of the difficulty in determining accurate log PI values of desirable low molecular weight (MW) reference compounds. We have developed a potentiometric titration procedure in KCl/water-saturated octanol that provides a link to log PI through the thermodynamic cycle of ionization and partitioning. These titrations have the advantage of being independent of the magnitude of log P, while maintaining a reproducibility of a few hundredths of a log P in the calculated difference between log P neutral and log P ion pair (diff (log PN − I)). Simple model compounds can be used. The titration procedure is described in detail, along with a program for calculating pKa′′ values incorporating the ionization of water in octanol. Hydrogen bonding and steric factors have a greater influence on ion pairs than they do on neutral species, yet these factors are missing from current programs used to calculate log PI and log D. In contrast to the common assumption that diff (log PN − I) is the same for all amines, they can actually vary more than 3 log units, as in our examples. A major factor affecting log PI is the ability of water and the counterion to approach the charge center. Bulky substituents near the charge center have a negative influence on log PI. On the other hand, hydrogen bonding groups near the charge center have the opposite effect by lowering the free energy of the ion pair. The use of this titration method to determine substituent ion pair stabilization values (IPS) should bring about more accurate log D calculations and encourage species-specific QSAR involving log DN and log DI. This work also brings attention to the fascinating world of nature’s highly stabilized ion pairs. PMID:19265385

  12. Towards Determining the Upper Temperature Limits to Life on Earth: An In-situ Sulfide-Microbial Incubator

    NASA Astrophysics Data System (ADS)

    Kelley, D.; Baross, J.; Delaney, J.; Girguis, P.; Schrenk, M.

    2004-12-01

    Determining the maximum conditions under which life thrives, survives, and expires is critical to understanding how and where life might have evolved on our planet and for investigating life in extraterrestrial environments. Submarine black smoker systems are optimal sites to study such questions because thermal gradients within the chimney walls are extreme and accessible under high-pressure conditions. Intact cells containing DNA and ribosomes have been observed even within the most extreme environments of sulfide structure walls bounded by 300°C fluids. Membrane lipids from archaea have been detected in sulfide flanges and chimneys where temperatures are believed to be 200-300°C. However, a balanced inquiry into the limits of life must focus on characterizing the actual conditions in a given system that favor reactions necessary to initiate and/or sustain life. At present, in-situ instrumentation of sulfide deposits is the only effective way to gain direct access to these natural high-temperature environments for documentation and experimentation. With this goal in mind, three prototype microbial incubators were developed with funding from the NSF, the University of Washington, and the W.M. Keck Foundation. The incubators were deployed in 2003 in the walls of active black smoker chimneys in the Mothra Hydrothermal Field, Endeavour Segment of the Juan de Fuca Ridge. All instruments were successfully recovered in 2004, and one was redeployed for a short time-series experiment. Each 53-cm-long titanium assembly houses 27 temperature sensors that record temperatures from 0 to 500°C within three discrete incubation chambers. Data are logged in a separate housing, and inductively coupled links provide access to the data loggers without removal of the instruments. During the initial deployment, data were collected for 189 to 245 days, with up to ~478 K temperature measurements completed for an individual instrument. Temperatures within the chimney walls ranged from near-ambient conditions to ~280°C. Distinct thermal gradients were delineated extremely well in each of the three discrete environmental chambers in all instruments. In one instrument, numerous perturbations were recorded simultaneously on all 27 probes, showing temperature increases of up to ~30°C. Smaller-scale fluctuations resulting from tidal perturbations were ubiquitous in all instruments. Tidal pumping that mixes oxygenated seawater and reduced, volatile-rich hydrothermal fluids may be critical for the development of dense and diverse microbial communities within the outer chimney walls. Preliminary examination of some sterile mineral surfaces emplaced within the chambers shows extensive biofilm development. Culturing experiments are ongoing, and DNA has been successfully extracted from many of the chambers for genetic characterization. This experiment is a component of the W.M. Keck Foundation-funded proto-NEPTUNE Observatory and Ridge R2K program at Endeavour.

  13. An oscillating microbalance for meteorological measurements of ice and volcanic ash accumulation from a weather balloon platform

    NASA Astrophysics Data System (ADS)

    Airey, Martin; Harrison, Giles; Nicoll, Keri; Williams, Paul; Marlton, Graeme

    2017-04-01

    A new, low-cost instrument has been developed for meteorological measurements of the accumulation of ice and volcanic ash that can be readily deployed using commercial radiosondes and weather balloons. It is based on principles used by [1], an instrument originally developed to measure supercooled liquid water profiles in clouds; this new instrument introduces numerous improvements in reduced complexity and cost. It uses the oscillating-microbalance principle, whereby a wire vibrating at its natural frequency is subjected to increased loading by the material to be measured. The added mass modifies the wire's properties such that its natural frequency of oscillation changes; by measuring this frequency, the mass increase can be inferred and transmitted to a ground station through the radiosonde's UHF antenna via the PANDORA interface [2], which was previously developed to provide power and a connection to the radiosonde telemetry. The device consists of a simple circuit board controlled by an ATMEGA microcontroller. For calibration, the controller can drive the wire at specified frequencies via excitation by a piezo sounder upon which the wire is mounted. The same piezo sounder is also used during active operation to measure the frequency of the wire in its non-driven state in order to infer the mass change on the wire. A phase-locked loop implemented on the board identifies when resonance occurs and the measured frequency is stable, prompting the microcontroller to send the measurement through the data interface. The device may be used for any application that requires measuring incremental mass variation, e.g., ice accumulation, frosting, or the accumulation of particles such as dust and volcanic ash. For solid-particle accumulation, a low-temperature, high-tack adhesive may be applied to the wire before deployment to collect the material. In addition, the same instrument may be used for ground-based applications, such as ice accumulation, with direct monitoring via a serial connection or logging to removable storage media in the absence of the radiosonde. References: [1] Hill, G.E. and Woffinden, D.S. (1980) Journal of Applied Meteorology, 19, 11, 1285-1292. [2] Harrison, R.G., et al. (2012) Rev. Sci. Instrum., 83, 3.
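    The oscillating-microbalance principle can be sketched quantitatively: for an ideal taut wire, the fundamental frequency is f = (1/2L)·√(T/μ), so accreted mass raises the effective linear density μ and lowers f, and the mass increment follows from the frequency ratio. A minimal sketch under those idealized assumptions (real instruments like this one are calibrated empirically; the numbers below are hypothetical):

    ```python
    def added_mass(f0_hz: float, f_hz: float,
                   mu0_kg_per_m: float, length_m: float) -> float:
        """Mass accreted on a vibrating wire, inferred from the drop in its
        natural frequency. Assumes an ideal taut string f = (1/(2L))*sqrt(T/mu)
        with constant tension and uniform loading -- a sketch, not the
        instrument's calibrated conversion."""
        mu_loaded = mu0_kg_per_m * (f0_hz / f_hz) ** 2   # f scales as 1/sqrt(mu)
        return (mu_loaded - mu0_kg_per_m) * length_m     # kg over the whole wire

    # Hypothetical wire: 10 cm long, 1 g/m, 440 Hz unloaded, 430 Hz after icing.
    m = added_mass(440.0, 430.0, 1.0e-3, 0.10)
    print(f"{m * 1e6:.2f} milligrams accreted")
    ```

    The inverse-square dependence means small frequency shifts resolve small mass increments, which is why the phase-locked loop waits for a stable resonance before reporting.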

  14. ANDRILL Borehole AND-1B: Well Log Analysis of Lithofacies and Glacimarine Cycles.

    NASA Astrophysics Data System (ADS)

    Jackolski, C. L.; Williams, T.; Powell, R. D.; Jarrard, R.; Morin, R. H.; Talarico, F. M.; Niessen, F.; Kuhn, G.

    2008-12-01

    During the 2006-2007 austral summer, the Antarctic geological drilling program ANDRILL recovered cores of sedimentary rock from a 1285-m-deep borehole below the McMurdo Ice Shelf. Well logging instruments were deployed to a depth of 1017 mbsf after core recovery. This study focuses on two intervals of the AND-1B borehole: upper HQ (238-343 mbsf; Pliocene) and NQ (698-1017 mbsf; upper Miocene), which were logged with natural gamma ray, induction resistivity, and magnetic susceptibility tools. To understand how the well logs fit into a more complete physical-properties data set, we performed factor and cluster analyses on a suite of well logs and core logs in the upper HQ and NQ intervals. In both intervals, factor analysis groups resistivity and core P-velocity into a factor that we interpret as inversely proportional to porosity. It also groups natural gamma and potassium (from the XRF core scanner) into a factor that we interpret as a particle-size or lithology index. An additional factor in the NQ interval, influenced by clast number and magnetic susceptibility, distinguishes subglacial diamictites from other lithofacies. The factors in each interval (2 in HQ, 3 in NQ) are used as input to cluster analysis; the result is log data objectively organized into clusters, or electrofacies. We compare these electrofacies to the lithofacies, well logs, and unconformity-bounded glacimarine cycles of AND-1B. Patterns in the glacimarine cycles are observed in the well logs and electrofacies. In the NQ glacimarine sediments, an electrofacies pattern emerges between the subglacial diamictites at the bottom of each sequence and the glacial retreat facies above: subglacial diamictites have higher values for the additional NQ factor, corresponding to clast number and magnetic susceptibility, than the muds and sands that form the retreat facies. Differences in the porosity factor are not observed in any electrofacies pattern in the NQ interval, but subtle patterns in the resistivity well log are: subglacial diamictites have greater resistivities than most retreat facies. In the HQ interval, there is only one glacimarine cycle that resembles those in the NQ interval, and most of the interval is subglacial or ice-proximal diamictite. There are only two and a half cycles in the HQ interval, but they contain an incipient electrofacies pattern: in the lower two cycles, the potassium/gamma factor is low at the bottom and high toward the top, and porosity, as indicated by the porosity factor, is low at the bottom and high toward the top. Throughout most of the HQ interval, potassium and natural gamma correlate with porosity. Two exceptions are the lower half of the top cycle, in which resistivity increases toward the top, and the two diatomite beds at the top of the two lower cycles, in which potassium/gamma is low and porosity is very high.
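    The workflow this record describes (reduce multiple logs to a few factors, then cluster the factor scores into electrofacies) can be sketched generically. A numpy-only stand-in using PCA-style factors and a minimal two-cluster k-means; the log names and synthetic data are placeholders, not the AND-1B measurements:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for depth-sampled logs (placeholder values, not AND-1B):
    n = 200
    facies = rng.integers(0, 2, n)                        # hidden facies label
    logs = np.column_stack([
        1.0 + 2.0 * facies + 0.3 * rng.standard_normal(n),  # resistivity-like
        5.0 - 3.0 * facies + 0.5 * rng.standard_normal(n),  # gamma-like
        0.5 + 1.5 * facies + 0.4 * rng.standard_normal(n),  # mag.-susc.-like
    ])

    # Stand-in for factor analysis: PCA scores of the standardized logs.
    z = (logs - logs.mean(0)) / logs.std(0)
    _, _, vt = np.linalg.svd(z, full_matrices=False)
    scores = z @ vt[:2].T                                 # two "factors" per sample

    # Minimal two-cluster k-means on the factor scores -> electrofacies.
    centers = np.array([scores[scores[:, 0].argmin()], scores[scores[:, 0].argmax()]])
    for _ in range(25):
        labels = ((scores[:, None, :] - centers) ** 2).sum(-1).argmin(1)
        centers = np.array([scores[labels == k].mean(0) for k in (0, 1)])

    print(np.bincount(labels))  # depth samples per electrofacies
    ```

    With well-separated synthetic facies the clusters recover the hidden labels almost exactly; on real logs the clusters are compared back to core lithofacies, as the abstract does.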

  15. Low-Cost Evaluation of EO-1 Hyperion and ALI for Detection and Biophysical Characterization of Forest Logging in Amazonia (NCC5-481)

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P.; Keller, Michael M.; Silva, Jose Natalino; Zweede, Johan C.; Pereira, Rodrigo, Jr.

    2002-01-01

    Major uncertainties exist regarding the rate and intensity of logging in tropical forests worldwide: these uncertainties severely limit economic, ecological, and biogeochemical analyses of these regions. Recent sawmill surveys in the Amazon region of Brazil show that the area logged is nearly equal to total area deforested annually, but conversion of survey data to forest area, forest structural damage, and biomass estimates requires multiple assumptions about logging practices. Remote sensing could provide an independent means to monitor logging activity and to estimate the biophysical consequences of this land use. Previous studies have demonstrated that the detection of logging in Amazon forests is difficult and no studies have developed either the quantitative physical basis or remote sensing approaches needed to estimate the effects of various logging regimes on forest structure. A major reason for these limitations has been a lack of sufficient, well-calibrated optical satellite data, which in turn, has impeded the development and use of physically-based, quantitative approaches for detection and structural characterization of forest logging regimes. We propose to use data from the EO-1 Hyperion imaging spectrometer to greatly increase our ability to estimate the presence and structural attributes of selective logging in the Amazon Basin. Our approach is based on four "biogeophysical indicators" not yet derived simultaneously from any satellite sensor: 1) green canopy leaf area index; 2) degree of shadowing; 3) presence of exposed soil and; 4) non-photosynthetic vegetation material. Airborne, field and modeling studies have shown that the optical reflectance continuum (400-2500 nm) contains sufficient information to derive estimates of each of these indicators. Our ongoing studies in the eastern Amazon basin also suggest that these four indicators are sensitive to logging intensity. 
Satellite-based estimates of these indicators should provide a means to quantify both the presence and degree of structural disturbance caused by various logging regimes. Our quantitative assessment of Hyperion hyperspectral and ALI multi-spectral data for the detection and structural characterization of selective logging in Amazonia will benefit from data collected through an ongoing project run by the Tropical Forest Foundation, within which we have developed a study of the canopy and landscape biophysics of conventional and reduced-impact logging. We will add to our base of forest structural information in concert with an EO-1 overpass. Using a photon transport model inversion technique that accounts for non-linear mixing of the four biogeophysical indicators, we will estimate these parameters across a gradient of selective logging intensity provided by conventional and reduced-impact logging sites. We will also compare our physically-based approach to both conventional (e.g., NDVI) and novel (e.g., SWIR-channel) vegetation indices as well as to linear mixture modeling methods. We will cross-compare these approaches using the Hyperion and ALI imagers to determine the strengths and limitations of these two sensors for applications of forest biophysics. This effort will yield the first physically-based, quantitative analysis of the detection and intensity of selective logging in Amazonia, comparing hyperspectral and improved multi-spectral approaches as well as inverse modeling, linear mixture modeling, and vegetation index techniques.
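    The linear-mixture baseline mentioned above treats each pixel spectrum as a weighted sum of endmember spectra (green canopy, shade, exposed soil, non-photosynthetic vegetation) and solves for the weights. A toy least-squares illustration with made-up endmembers over five hypothetical bands, not EO-1 data:

    ```python
    import numpy as np

    # Toy endmember spectra (rows) over 5 hypothetical bands -- placeholders,
    # not real Hyperion/ALI reflectances.
    endmembers = np.array([
        [0.05, 0.08, 0.45, 0.50, 0.30],   # green canopy
        [0.02, 0.02, 0.05, 0.06, 0.04],   # shade
        [0.20, 0.25, 0.30, 0.35, 0.40],   # exposed soil
        [0.15, 0.18, 0.25, 0.30, 0.35],   # non-photosynthetic vegetation
    ])

    def unmix(pixel: np.ndarray) -> np.ndarray:
        """Least-squares endmember fractions for one pixel spectrum.
        Sketch only: real workflows constrain fractions to be non-negative
        and sum to one, and the study above inverts a photon transport model
        rather than a purely linear mixture."""
        frac, *_ = np.linalg.lstsq(endmembers.T, pixel, rcond=None)
        return frac

    true_frac = np.array([0.6, 0.2, 0.1, 0.1])
    pixel = endmembers.T @ true_frac          # noiseless synthetic mixture
    print(np.round(unmix(pixel), 3))
    ```

    On this noiseless mixture the fractions are recovered exactly; elevated soil and NPV fractions with reduced canopy fraction are the kind of signature the study associates with logging disturbance.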

  16. A portable data-logging system for industrial hygiene personal chlorine monitoring.

    PubMed

    Langhorst, M L; Illes, S P

    1986-02-01

    The combination of suitable portable sensors or instruments with small microprocessor-based data-logger units has made it possible to obtain detailed monitoring data for many health and environmental applications. Following data acquisition in field use, the logged data may be transferred to a desk-top personal computer for complete flexibility in manipulation of data and formatting of results. A system has been assembled from commercial components and demonstrated for chlorine personal monitoring applications. The system consists of personal chlorine sensors, a Metrosonics data-logger and reader unit, and an Apple II Plus personal computer. The computer software was developed to handle sensor calibration, data evaluation and reduction, report formatting and long-term storage of raw data on a disk. This system makes it possible to generate time-concentration profiles, evaluate dose above a threshold, quantitate short-term excursions and summarize time-weighted average (TWA) results. Field data from plant trials demonstrated feasibility of use, ruggedness and reliability. No significant differences were found between the time-weighted average chlorine concentrations determined by the sensor/logger system and two other methods: the sulfamic acid bubbler reference method and the 3M Poroplastic diffusional dosimeter. The sensor/data-logger system, however, provided far more information than the other two methods in terms of peak excursions, TWAs and exposure doses. For industrial hygiene applications, the system allows better definition of employee exposures, particularly for chemicals with acute as well as chronic health effects.(ABSTRACT TRUNCATED AT 250 WORDS)
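    The data reductions this system performs (time-weighted averages, dose above a threshold, short-term excursions) are simple sums over the logged time-concentration profile. A minimal sketch on a hypothetical equally spaced chlorine log, not the paper's field data:

    ```python
    def twa(conc_ppm, minutes_per_sample, shift_minutes=480.0):
        """8-hour time-weighted average from equally spaced logger samples."""
        return sum(c * minutes_per_sample for c in conc_ppm) / shift_minutes

    def dose_above(conc_ppm, minutes_per_sample, threshold_ppm):
        """ppm-minutes accumulated above a threshold (excursion dose)."""
        return sum((c - threshold_ppm) * minutes_per_sample
                   for c in conc_ppm if c > threshold_ppm)

    # Hypothetical chlorine log: one sample per minute over an 8 h shift,
    # baseline 0.1 ppm with a 10 min excursion to 1.0 ppm.
    profile = [0.1] * 470 + [1.0] * 10
    print(round(twa(profile, 1.0), 3))    # 0.119 ppm TWA
    print(dose_above(profile, 1.0, 0.5))  # 5.0 ppm-minutes above 0.5 ppm
    ```

    This is exactly the extra information the abstract credits the logger with: the bubbler and dosimeter methods yield only the TWA, while the logged profile also exposes the excursion.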

  17. Assessment of physical activity in chronic kidney disease.

    PubMed

    Robinson-Cohen, Cassianne; Littman, Alyson J; Duncan, Glen E; Roshanravan, Baback; Ikizler, T Alp; Himmelfarb, Jonathan; Kestenbaum, Bryan R

    2013-03-01

    Physical inactivity plays an important role in the development of kidney disease and its complications; however, the validity of standard tools for measuring physical activity (PA) is not well understood. We investigated the performance of several readily available and widely used PA and physical function questionnaires, individually and in combination, against accelerometry among a cohort of chronic kidney disease (CKD) participants. Forty-six participants from the Seattle Kidney Study, an observational cohort study of persons with CKD, completed the Physical Activity Scale for the Elderly, Human Activity Profile (HAP), Medical Outcomes Study SF-36 questionnaire, and the Four-week Physical Activity History questionnaires. We simultaneously measured PA using an Actigraph GT3X accelerometer during a 14-day period. We estimated the validity of each instrument by testing its associations with log-transformed accelerometry counts. We used the Akaike information criterion to investigate the performance of combinations of questionnaires. All questionnaire scores were significantly associated with log-transformed accelerometry counts. The HAP correlated best with accelerometry counts (r² = 0.32), followed by the SF-36 (r² = 0.23). Forty-three percent of the variability in accelerometry counts was explained by a model that combined the HAP, SF-36, and Four-week Physical Activity History questionnaires. A combination of measurement tools can account for a modest component of PA in patients with CKD; however, a substantial proportion of PA is not captured by standard assessments. Copyright © 2013 National Kidney Foundation, Inc. All rights reserved.
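    The validity metric used here (r² between a questionnaire score and log-transformed accelerometry counts) is a standard simple-regression computation. A generic numpy sketch on synthetic data with 46 simulated participants, not the Seattle Kidney Study measurements:

    ```python
    import numpy as np

    def r_squared(x: np.ndarray, y: np.ndarray) -> float:
        """Coefficient of determination of the simple linear fit of y on x."""
        slope, intercept = np.polyfit(x, y, 1)
        resid = y - (slope * x + intercept)
        return 1.0 - resid.var() / y.var()

    rng = np.random.default_rng(1)
    counts = rng.lognormal(mean=12.0, sigma=0.6, size=46)  # accelerometry counts
    log_counts = np.log(counts)                            # log-transform, as in the study
    # Hypothetical questionnaire score loosely tracking activity:
    score = 20.0 + 8.0 * log_counts + rng.normal(0.0, 5.0, size=46)

    print(round(r_squared(log_counts, score), 2))
    ```

    For a simple fit with intercept this r² equals the squared Pearson correlation, which is the sense in which the abstract reports the HAP as "correlating best".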

  18. Modeling strategic use of human computer interfaces with novel hidden Markov models

    PubMed Central

    Mariano, Laura J.; Poore, Joshua C.; Krum, David M.; Schwartz, Jana L.; Coskren, William D.; Jones, Eric M.

    2015-01-01

    Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game's functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic task pursuit. 
PMID:26191026

  19. JBFA - buoyant flight

    NASA Technical Reports Server (NTRS)

    Ohari, T.

    1982-01-01

    A method was developed whereby a balloon carries lumber out of a forest, allowing lumber production to continue without destroying the natural environment and appearance of the forest. Emphasis was placed on the optimal shape for a logging balloon; development of a balloon logging system suitable for lumber harvesting, including safety plans; tests of balloon construction and development of netting; and the weather of mountainous areas, especially solutions to problems caused by wind.

  20. Fatal injuries caused by logs rolling off trucks: Kentucky 1994-1998.

    PubMed

    Struttmann, T W; Scheerer, A L

    2001-02-01

    Logging is one of the most hazardous occupations and fatality rates are consistently among the highest of all industries. A review of fatalities caused by logs rolling off trucks is presented. The Kentucky Fatality Assessment and Control Evaluation Project is a statewide surveillance system for occupational fatalities. Investigations are conducted on selected injuries with an emphasis on prevention strategy development. Logging was an area of high priority for case investigation. During 1994-1998, we identified seven incidents in which a worker was killed by a log rolling off a truck at a sawmill, accounting for 15% of the 45 deaths related to logging activities. These cases were reviewed to identify similar characteristics and risk factors. Investigations led to recommendations for behavioral, administrative, and engineering controls. Potential interventions include limiting load height on trucks, installing unloading cages at sawmills and prohibiting overloaded trucks on public roadways. Copyright 2001 Wiley-Liss, Inc.

  1. Developing Surveillance Methodology for Agricultural and Logging Injury in New Hampshire Using Electronic Administrative Data Sets.

    PubMed

    Scott, Erika E; Hirabayashi, Liane; Krupa, Nicole L; Sorensen, Julie A; Jenkins, Paul L

    2015-08-01

    Agriculture and logging rank among industries with the highest rates of occupational fatality and injury. Establishing a nonfatal injury surveillance system is a top priority in the National Occupational Research Agenda. Sources of data such as patient care reports (PCRs) and hospitalization data have recently transitioned to electronic databases. Using narrative and location codes from PCRs, along with International Classification of Diseases, 9th Revision, external cause of injury codes (E-codes) in hospital data, researchers are designing a surveillance system to track farm and logging injury. A total of 357 true agricultural or logging cases were identified. These data indicate that it is possible to identify agricultural and logging injury events in PCR and hospital data. Multiple data sources increase catchment; nevertheless, limitations in methods of identification of agricultural and logging injury contribute to the likely undercount of injury events.

  2. Language Development: 2 Year Olds

    MedlinePlus

... enrich his vocabulary and language skills by making reading a part of your everyday routine. At this ...

  3. Social Development: 1 Year Olds

    MedlinePlus

... re doing around the house. Whether you’re reading the paper, sweeping the floors, mowing the lawn, ...

  4. Publications - GMC 181 | Alaska Division of Geological & Geophysical

    Science.gov Websites

DGGS GMC 181 Publication Details. Title: Geologic logs and core assays of 14 nickel, copper, and cobalt ... Bibliographic Reference: Inspiration Development Company, 1991, Geologic logs and core assays of ...

  5. Accuracy and borehole influences in pulsed neutron gamma density logging while drilling.

    PubMed

    Yu, Huawei; Sun, Jianmeng; Wang, Jiaxin; Gardner, Robin P

    2011-09-01

A new pulsed neutron gamma density (NGD) logging method has been developed to replace radioactive chemical sources in oil logging tools. The present paper describes studies of the near and far density measurement accuracy of NGD logging at two spacings, and of borehole influences, using Monte Carlo simulation. The results show that the accuracy of the near density is not as good as that of the far density. It is difficult to correct for borehole effects using conventional methods because both near and far density measurements are significantly sensitive to standoffs and mud properties. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Development of a Multi-Species Biotic Ligand Model Predicting the Toxicity of Trivalent Chromium to Barley Root Elongation in Solution Culture

    PubMed Central

    Song, Ningning; Zhong, Xu; Li, Bo; Li, Jumei; Wei, Dongpu; Ma, Yibing

    2014-01-01

    Little knowledge is available about the influence of cation competition and metal speciation on trivalent chromium (Cr(III)) toxicity. In the present study, the effects of pH and selected cations on the toxicity of trivalent chromium (Cr(III)) to barley (Hordeum vulgare) root elongation were investigated to develop an appropriate biotic ligand model (BLM). Results showed that the toxicity of Cr(III) decreased with increasing activity of Ca2+ and Mg2+ but not with K+ and Na+. The effect of pH on Cr(III) toxicity to barley root elongation could be explained by H+ competition with Cr3+ bound to a biotic ligand (BL) as well as by the concomitant toxicity of CrOH2+ in solution culture. Stability constants were obtained for the binding of Cr3+, CrOH2+, Ca2+, Mg2+ and H+ with binding ligand: log KCrBL 7.34, log KCrOHBL 5.35, log KCaBL 2.64, log KMgBL 2.98, and log KHBL 4.74. On the basis of those estimated parameters, a BLM was successfully developed to predict Cr(III) toxicity to barley root elongation as a function of solution characteristics. PMID:25119269
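The stability constants reported above plug directly into the standard biotic ligand model competition equation, in which the fraction of biotic-ligand sites occupied by the toxic species determines the response. The sketch below uses the paper's log K values but hypothetical free-ion activities; the functional form is the generic BLM site-competition expression, not necessarily the authors' exact parameterization:

```python
# Stability constants reported in the abstract (log K values).
logK = {"CrBL": 7.34, "CrOHBL": 5.35, "CaBL": 2.64, "MgBL": 2.98, "HBL": 4.74}
K = {name: 10 ** v for name, v in logK.items()}

def fraction_toxic(act):
    """Fraction of biotic-ligand sites bound by the toxic species Cr3+ and CrOH2+,
    under competition from Ca2+, Mg2+ and H+.
    `act` maps species -> free-ion activity (mol/L)."""
    bound_toxic = K["CrBL"] * act["Cr3"] + K["CrOHBL"] * act["CrOH"]
    denom = (1 + bound_toxic
             + K["CaBL"] * act["Ca"] + K["MgBL"] * act["Mg"] + K["HBL"] * act["H"])
    return bound_toxic / denom

# Hypothetical solution: 1 uM Cr3+, 0.1 uM CrOH2+, 1 mM Ca2+, 0.5 mM Mg2+, pH 6.
activities = {"Cr3": 1e-6, "CrOH": 1e-7, "Ca": 1e-3, "Mg": 5e-4, "H": 1e-6}
f = fraction_toxic(activities)
```

Raising the Ca2+ activity lowers the toxic fraction, reproducing the protective cation competition the abstract describes.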

  7. Development of a multi-species biotic ligand model predicting the toxicity of trivalent chromium to barley root elongation in solution culture.

    PubMed

    Song, Ningning; Zhong, Xu; Li, Bo; Li, Jumei; Wei, Dongpu; Ma, Yibing

    2014-01-01

    Little knowledge is available about the influence of cation competition and metal speciation on trivalent chromium (Cr(III)) toxicity. In the present study, the effects of pH and selected cations on the toxicity of trivalent chromium (Cr(III)) to barley (Hordeum vulgare) root elongation were investigated to develop an appropriate biotic ligand model (BLM). Results showed that the toxicity of Cr(III) decreased with increasing activity of Ca(2+) and Mg(2+) but not with K(+) and Na(+). The effect of pH on Cr(III) toxicity to barley root elongation could be explained by H(+) competition with Cr(3+) bound to a biotic ligand (BL) as well as by the concomitant toxicity of CrOH(2+) in solution culture. Stability constants were obtained for the binding of Cr(3+), CrOH(2+), Ca(2+), Mg(2+) and H(+) with binding ligand: log KCrBL 7.34, log KCrOHBL 5.35, log KCaBL 2.64, log KMgBL 2.98, and log KHBL 4.74. On the basis of those estimated parameters, a BLM was successfully developed to predict Cr(III) toxicity to barley root elongation as a function of solution characteristics.

  8. Short-term impact of post-fire salvage logging on regeneration, hazardous fuel accumulation, and understorey development in ponderosa pine forest of the Black Hills, SD, USA

    Treesearch

    Tara L Keyser; Fredrick W Smith; Wayne D. Shepperd

    2009-01-01

    We examined the impacts of post-fire salvage logging on regeneration, fuel accumulation, and understorey vegetation and assessed whether the effects of salvage logging differed between stands burned under moderate and high fire severity following the 2000 Jasper Fire in the Black Hills. In unsalvaged sites, fire-related tree mortality...

  9. Use of polynomial expressions to describe the bioconcentration of hydrophobic chemicals by fish

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connell, D.W.; Hawker, D.W.

    1988-12-01

For the bioconcentration of hydrophobic chemicals by fish, relationships have been previously established between uptake rate constants (k1) and the octanol/water partition coefficient (Kow), and also between the clearance rate constant (k2) and Kow. These have been refined and extended on the basis of data for chlorinated hydrocarbons and closely related compounds, including polychlorinated dibenzodioxins, that covered a wider range of hydrophobicity (2.5 < log Kow < 9.5). This has allowed the development of new relationships between log Kow and various factors, including the bioconcentration factor (as log KB), equilibrium time (as log teq), and maximum biotic concentration (as log CB), which include extremely hydrophobic compounds previously not taken into account. The shapes of the curves generated by these equations are in qualitative agreement with theoretical prediction and are described by polynomial expressions which are generally approximately linear over the more limited range of log Kow values used to develop previous relationships. The influences of factors such as hydrophobicity, aqueous solubility, molecular weight, lipid solubility, and exposure time were considered. Decreasing lipid solubilities of extremely hydrophobic chemicals were found to result in increasing clearance rate constants, as well as decreasing equilibrium times and bioconcentration factors.
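The polynomial relationships described above can be sketched with a simple least-squares fit. The data below are synthetic (an assumed parabolic trend in log KB that rises and then falls with log Kow), not the authors' measurements:

```python
import numpy as np

# Illustrative bioconcentration trend: log KB rises with log Kow, then declines
# for extremely hydrophobic chemicals. Synthetic data, assumed parabolic.
log_kow = np.array([2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5])
log_kb = -0.15 * (log_kow - 6.0) ** 2 + 4.4

# Fit a second-order polynomial; over the narrower log Kow range used in
# earlier studies the curve is approximately linear, matching the abstract.
coeffs = np.polyfit(log_kow, log_kb, deg=2)
poly = np.poly1d(coeffs)

# Vertex of the fitted parabola: the log Kow at which log KB peaks.
peak_log_kow = -coeffs[1] / (2 * coeffs[0])
```

With these assumed data the fitted curve peaks near log Kow = 6, beyond which decreasing lipid solubility depresses the bioconcentration factor, as the abstract describes qualitatively.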

  10. Partition of volatile organic compounds from air and from water into plant cuticular matrix: An LFER analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Platts, J.A.; Abraham, M.H.

The partitioning of organic compounds between air and foliage and between water and foliage is of considerable environmental interest. The purpose of this work is to show that partitioning into the cuticular matrix of one particular species can be satisfactorily modeled by general equations the authors have previously developed and, hence, that the same general equations could be used to model partitioning into other plant materials of the same or different species. The general equations are linear free energy relationships that employ descriptors for polarity/polarizability, hydrogen bond acidity and basicity, dispersive effects, and volume. They have been applied to the partition of 62 very varied organic compounds between cuticular matrix of the tomato fruit, Lycopersicon esculentum, and either air (MXa) or water (MXw). Values of log MXa covering a range of 12.4 log units are correlated with a standard deviation of 0.232 log unit, and values of log MXw covering a range of 7.6 log units are correlated with an SD of 0.236 log unit. Possibilities are discussed for the prediction of new air-plant cuticular matrix and water-plant cuticular matrix partition values on the basis of the equations developed.
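Linear free energy relationships of this family take the Abraham form log SP = c + eE + sS + aA + bB + vV, where E, S, A, B and V are solute descriptors. The sketch below evaluates that form with placeholder coefficients and descriptors; none of the numbers are the fitted cuticular-matrix values from the paper:

```python
# Abraham-type LFER: log SP = c + e*E + s*S + a*A + b*B + v*V, where the
# descriptors are excess molar refraction (E), polarity/polarizability (S),
# H-bond acidity (A), H-bond basicity (B), and McGowan volume (V).

def lfer(desc, c, e, s, a, b, v):
    """Predicted log partition value for a solute descriptor tuple (E, S, A, B, V)."""
    E, S, A, B, V = desc
    return c + e * E + s * S + a * A + b * B + v * V

# Hypothetical water-to-matrix equation coefficients and an illustrative
# descriptor set for a toluene-like solute (placeholder values).
coeffs = dict(c=0.3, e=0.5, s=-0.4, a=-0.2, b=-3.0, v=3.2)
toluene_like = (0.6, 0.5, 0.0, 0.14, 0.857)
log_mx = lfer(toluene_like, **coeffs)
```

The large positive volume term and large negative basicity term are the usual drivers of water-to-organic-phase partitioning in equations of this type.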

  11. A small-diameter NMR logging tool for groundwater investigations

    USGS Publications Warehouse

    Walsh, David; Turner, Peter; Grunewald, Elliot; Zhang, Hong; Butler, James J.; Reboulet, Ed; Knobbe, Steve; Christy, Tom; Lane, John W.; Johnson, Carole D.; Munday, Tim; Fitzpatrick, Andrew

    2013-01-01

    A small-diameter nuclear magnetic resonance (NMR) logging tool has been developed and field tested at various sites in the United States and Australia. A novel design approach has produced relatively inexpensive, small-diameter probes that can be run in open or PVC-cased boreholes as small as 2 inches in diameter. The complete system, including surface electronics and various downhole probes, has been successfully tested in small-diameter monitoring wells in a range of hydrogeological settings. A variant of the probe that can be deployed by a direct-push machine has also been developed and tested in the field. The new NMR logging tool provides reliable, direct, and high-resolution information that is of importance for groundwater studies. Specifically, the technology provides direct measurement of total water content (total porosity in the saturated zone or moisture content in the unsaturated zone), and estimates of relative pore-size distribution (bound vs. mobile water content) and hydraulic conductivity. The NMR measurements show good agreement with ancillary data from lithologic logs, geophysical logs, and hydrogeologic measurements, and provide valuable information for groundwater investigations.

  12. Use of biopartitioning micellar chromatography and RP-HPLC for the determination of blood-brain barrier penetration of α-adrenergic/imidazoline receptor ligands, and QSPR analysis.

    PubMed

    Vucicevic, J; Popovic, M; Nikolic, K; Filipic, S; Obradovic, D; Agbaba, D

    2017-03-01

For this study, 31 compounds, including 16 imidazoline/α-adrenergic receptor (IRs/α-ARs) ligands and 15 central nervous system (CNS) drugs, were characterized in terms of the retention factors (k) obtained using biopartitioning micellar and classical reversed-phase chromatography (log kBMC and log kwRP, respectively). Based on the retention factor (log kwRP) and the slope of the linear curve (S), the isocratic parameter (φ0) was calculated. The obtained retention factors were correlated with experimental log BB values for the group of examined compounds. High correlations were obtained between the logarithm of the biopartitioning micellar chromatography (BMC) retention factor and effective permeability (r(log kBMC/log BB): 0.77), while for the RP-HPLC system the correlations were lower (r(log kwRP/log BB): 0.58; r(S/log BB): -0.50; r(φ0/Pe): 0.61). Based on the log kBMC retention data and calculated molecular parameters of the examined compounds, quantitative structure-permeability relationship (QSPR) models were developed using partial least squares, stepwise multiple linear regression, support vector machine and artificial neural network methodologies. A high degree of structural diversity of the analysed IRs/α-ARs ligands and CNS drugs provides a wide applicability domain of the QSPR models for estimation of blood-brain barrier penetration of related compounds.

  13. Visual texture for automated characterisation of geological features in borehole televiewer imagery

    NASA Astrophysics Data System (ADS)

    Al-Sit, Waleed; Al-Nuaimy, Waleed; Marelli, Matteo; Al-Ataby, Ali

    2015-08-01

    Detailed characterisation of the structure of subsurface fractures is greatly facilitated by digital borehole logging instruments, the interpretation of which is typically time-consuming and labour-intensive. Despite recent advances towards autonomy and automation, the final interpretation remains heavily dependent on the skill, experience, alertness and consistency of a human operator. Existing computational tools fail to detect layers between rocks that do not exhibit distinct fracture boundaries, and often struggle characterising cross-cutting layers and partial fractures. This paper presents a novel approach to the characterisation of planar rock discontinuities from digital images of borehole logs. Multi-resolution texture segmentation and pattern recognition techniques utilising Gabor filters are combined with an iterative adaptation of the Hough transform to enable non-distinct, partial, distorted and steep fractures and layers to be accurately identified and characterised in a fully automated fashion. This approach has successfully detected fractures and layers with high detection accuracy and at a relatively low computational cost.
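A planar fracture intersecting a cylindrical borehole traces a sinusoid in the unwrapped televiewer image, depth(θ) = d0 + A·sin(θ + φ), which is what a Hough-style transform searches for. The following toy grid search (a simplified stand-in for the paper's iterative Hough adaptation, on synthetic data) recovers a planted fracture trace:

```python
import numpy as np

# Synthetic unwrapped borehole image: depth samples x azimuth samples.
H, W = 60, 90
theta = np.linspace(0, 2 * np.pi, W, endpoint=False)
img = np.zeros((H, W))

# Plant a fracture sinusoid at d0=30, A=8, phi=0.5.
trace = np.round(30 + 8 * np.sin(theta + 0.5)).astype(int)
img[trace, np.arange(W)] = 1.0

def hough_sinusoid(image):
    """Return the (d0, A, phi) grid point maximizing summed intensity
    along the candidate sinusoid."""
    best, best_score = None, -1.0
    for d0 in range(15, 45):
        for A in range(0, 15):
            for phi in np.linspace(0, 2 * np.pi, 36, endpoint=False):
                rows = np.round(d0 + A * np.sin(theta + phi)).astype(int)
                score = image[rows, np.arange(W)].sum()
                if score > best_score:
                    best_score, best = score, (d0, A, phi)
    return best

d0, A, phi = hough_sinusoid(img)
```

Real logs add texture, partial fractures and noise, which is why the paper combines the Hough step with Gabor-filter texture segmentation rather than intensity alone.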

  14. Effects of high hydrostatic pressure and varying concentrations of sodium nitrite from traditional and vegetable-based sources on the growth of Listeria monocytogenes on ready-to-eat (RTE) sliced ham.

    PubMed

    Myers, Kevin; Cannon, Jerry; Montoya, Damian; Dickson, James; Lonergan, Steven; Sebranek, Joseph

    2013-05-01

The objective of this study was to determine the effect of the source of added nitrite and of high hydrostatic pressure (HHP) on the growth of Listeria monocytogenes on ready-to-eat (RTE) sliced ham. Use of 600 MPa HHP for 3 min resulted in an immediate 3.9-4.3 log CFU/g reduction in L. monocytogenes numbers, while use of 400 MPa HHP (3 min) provided less than a 1 log CFU/g reduction. With the 600 MPa HHP treatment, sliced ham with a conventional concentration of sodium nitrite (200 ppm) was not different in L. monocytogenes growth from ham made with 50 or 100 ppm of sodium nitrite from pre-converted celery powder. Instrumental color values as well as residual nitrite and residual nitrate concentrations for cured (sodium nitrite and nitrite from celery powder) and uncured ham formulations are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Surveys with Athena: results from detailed SIXTE simulations

    NASA Astrophysics Data System (ADS)

    Lanzuisi, G.; Comastri, A.; Aird, J.; Brusa, M.; Cappelluti, N.; Gilli, R.; Matute, I.

    2017-10-01

"Formation and early growth of BH" and "Accretion by supermassive BH through cosmic time" are two of the scientific objectives of the Athena mission. A large fraction (20-25%) of the Athena Mock Observing Plan is devoted to these and other topics (e.g. the first galaxy groups, cold and warm obscuration, and feedback signatures in AGN at high z), in the form of a multi-tiered (deep-medium-wide) survey with the WFI. We used the flexible SIXTE simulator to study the impact of different instrumental configurations, in terms of WFI FOV, mirror PSF, and background levels, on the performance of the three layers of the WFI survey. We mainly focus on the scientific objective that drives the survey configuration: the detection of at least 10 AGN at z=6-8 with Log(LX)=43-43.5 erg/s and 10 at z=8-10 with Log(LX)=44-44.5 erg/s. Implications for other scientific objectives involved in the survey are also discussed.

  16. EARS : Repositioning data management near data acquisition.

    NASA Astrophysics Data System (ADS)

    Sinquin, Jean-Marc; Sorribas, Jordi; Diviacco, Paolo; Vandenberghe, Thomas; Munoz, Raquel; Garcia, Oscar

    2016-04-01

The EU FP7 projects Eurofleets and Eurofleets2 are a Europe-wide alliance of marine research centers that aim to share their research vessels, to improve information sharing on planned, current and completed cruises and on details of ocean-going research vessels and specialized equipment, and to durably improve the cost-effectiveness of cruises. Within this context, logging information on how, when and where anything happens on board the vessel is crucial for data users at a later stage. It forms an essential step in the process of data quality control, as it can assist in the understanding of anomalies and unexpected trends recorded in the acquired data sets. In this way the completeness of the metadata is improved, as it is recorded accurately at the origin of the measurement. The collection of this crucial information has been done in very different ways, using different procedures, formats and pieces of software, across the European Research Fleet. At the time the Eurofleets project started, every institution and country had adopted different strategies and approaches, which complicated the task of users who need to log general-purpose information and events on board whenever they access a different platform, losing the opportunity to produce this valuable metadata on board. Among the many goals of the Eurofleets project, a very important task is the development of an "event log software" called EARS (Eurofleets Automatic Reporting System) that enables scientists and operators to record what happens during a survey. EARS will allow users to fill, in a standardized way, the gap that currently exists in metadata description, which only very seldom links data with its history. Events generated automatically by acquisition instruments will also be handled, enhancing the granularity and precision of the event annotation.
The adoption of a common procedure to log survey events, and a common terminology to describe them, is crucial to providing a friendly and successful on-board metadata creation procedure for the whole European Fleet. The possibility of automatically reporting metadata and general-purpose data will simplify the work of scientists and data managers with regard to data transmission. Improved accuracy and completeness of metadata are expected when events are recorded at acquisition time. This will also enhance multiple uses of the data, as it allows verification of the different requirements existing in different disciplines.

  17. Decontamination of materials contaminated with Francisella philomiragia or MS2 bacteriophage using PES-Solid, a solid source of peracetic acid.

    PubMed

    Buhr, T L; Young, A A; Johnson, C A; Minter, Z A; Wells, C M

    2014-08-01

The aim of the study was to develop test methods and evaluate survival of Francisella philomiragia cells and MS2 bacteriophage after exposure to PES-Solid (a solid source of peracetic acid) formulations with or without surfactants. Francisella philomiragia cells (≥7.6 log10 CFU) or MS2 bacteriophage (≥6.8 log10 PFU) were deposited on seven different test materials and treated with three different PES-Solid formulations, three different preneutralized samples and filter controls at room temperature for 15 min. There were 0-1.3 log10 CFU (<20 cells) of cell survival, or 0-1.7 log10 (<51 PFU) of bacteriophage survival, in all 21 test combinations (organism, formulation and substrate) containing reactive PES-Solid. In addition, the microemulsion (Dahlgren Surfactant System) showed ≤2 log10 (100 cells) of viable F. philomiragia cells, indicating the microemulsion achieved <2 log10 CFU on its own. Three PES-Solid formulations and one microemulsion system (DSS) inactivated F. philomiragia cells and/or MS2 bacteriophage that were deposited on seven different materials. A test method was developed to show that reactive PES-Solid formulations and a microemulsion system (DSS) inactivated >6 log10 CFU/PFU of F. philomiragia cells and/or MS2 bacteriophage on different materials. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
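The log10 reduction figures quoted in decontamination studies like this one follow from a simple difference of logarithms of viable counts before and after treatment. The counts below are illustrative, chosen to echo the magnitudes in the abstract:

```python
import math

# Log10 reduction from viable counts before and after treatment.
# The abstract reports challenge levels of >= 7.6 log10 CFU and survivors of
# 0-1.3 log10 CFU; the numbers here are illustrative, not measured data.

def log10_reduction(n_before, n_after):
    """Log10 reduction; counts in CFU (or PFU for bacteriophage)."""
    return math.log10(n_before) - math.log10(n_after)

challenge = 10 ** 7.6   # ~4.0e7 CFU deposited
survivors = 10 ** 1.3   # ~20 CFU recovered
r = log10_reduction(challenge, survivors)
print(round(r, 1))      # -> 6.3, i.e. exceeds a 6-log inactivation target
```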

  18. The HST/WFC3 Quicklook Project: A User Interface to Hubble Space Telescope Wide Field Camera 3 Data

    NASA Astrophysics Data System (ADS)

    Bourque, Matthew; Bajaj, Varun; Bowers, Ariel; Dulude, Michael; Durbin, Meredith; Gosmeyer, Catherine; Gunning, Heather; Khandrika, Harish; Martlin, Catherine; Sunnquist, Ben; Viana, Alex

    2017-06-01

    The Hubble Space Telescope's Wide Field Camera 3 (WFC3) instrument, comprised of two detectors, UVIS (Ultraviolet-Visible) and IR (Infrared), has been acquiring ~ 50-100 images daily since its installation in 2009. The WFC3 Quicklook project provides a means for instrument analysts to store, calibrate, monitor, and interact with these data through the various Quicklook systems: (1) a ~ 175 TB filesystem, which stores the entire WFC3 archive on disk, (2) a MySQL database, which stores image header data, (3) a Python-based automation platform, which currently executes 22 unique calibration/monitoring scripts, (4) a Python-based code library, which provides system functionality such as logging, downloading tools, database connection objects, and filesystem management, and (5) a Python/Flask-based web interface to the Quicklook system. The Quicklook project has enabled large-scale WFC3 analyses and calibrations, such as the monitoring of the health and stability of the WFC3 instrument, the measurement of ~ 20 million WFC3/UVIS Point Spread Functions (PSFs), the creation of WFC3/IR persistence calibration products, and many others.

  19. Log export and import restrictions of the U.S. Pacific Northwest and British Columbia: past and present.

    Treesearch

    Christine L. Lane

    1998-01-01

    Export constraints affecting North American west coast logs have existed intermittently since 1831. Recent developments have tended toward tighter restrictions. National, Provincial, and State rules are described.

  20. Sight Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-01

Enables applications to emit log information into an output file and produces a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.

  1. Geophysical evaluation of sandstone aquifers in the Reconcavo-Tucano Basin, Bahia -- Brazil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lima, O.A.L. de

    1993-11-01

The upper clastic sediments in the Reconcavo-Tucano basin comprise a multilayer aquifer system of Jurassic age. Its groundwater is normally fresh down to depths of more than 1,000 m. Locally, however, there are zones producing high-salinity or sulfurous geothermal water. Analysis of electrical logs of more than 150 wells enabled the identification of the most typical sedimentary structures and the gross geometries of the sandstone units in selected areas of the basin. Based on this information, the thick sands are interpreted as coalescent point bars and the shales as flood-plain deposits of a large fluvial environment. The resistivity logs and core laboratory data are combined to develop empirical equations relating aquifer porosity and permeability to log-derived parameters such as formation factor and cementation exponent. Temperature logs of 15 wells were useful to quantify the water leakage through semiconfining shales. The groundwater quality was inferred from spontaneous potential (SP) log deflections under control of chemical analysis of water samples. An empirical chart is developed that relates the SP-derived water resistivity to the true water resistivity within the formations. The patterns of salinity variation with depth inferred from SP logs were helpful in identifying subsurface flows along major fault zones, where extensive mixing of water is taking place. A total of 49 vertical Schlumberger resistivity soundings aid in defining aquifer structures and in extrapolating the log-derived results. Transition zones between fresh and saline waters have also been detected based on a combination of logging and surface sounding data. Ionic filtering by water leakage across regional shales, local convection and mixing along major faults, and hydrodynamic dispersion away from lateral permeability contrasts are the main mechanisms controlling the observed distributions of salinity and temperature within the basin.
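Empirical equations linking porosity to the log-derived formation factor and cementation exponent are generally of the Archie family, F = a·φ^(-m). The sketch below uses generic illustrative coefficients (a=1, m=2), not the fits developed in the study:

```python
# Archie's relation between formation factor and porosity, F = a * phi**(-m),
# the standard family of empirical equations linking resistivity-log
# parameters to aquifer porosity. Coefficients a=1, m=2 are generic
# illustrative choices, not the paper's fitted values.

def formation_factor(phi, a=1.0, m=2.0):
    """Formation factor F = Ro/Rw for fractional porosity phi."""
    return a * phi ** (-m)

def porosity_from_F(F, a=1.0, m=2.0):
    """Invert Archie's law: phi = (a / F)**(1/m)."""
    return (a / F) ** (1.0 / m)

# A sandstone with 25% porosity:
F = formation_factor(0.25)     # formation factor of 16
phi = porosity_from_F(F)       # recovers the input porosity
```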

  2. Interspecies quantitative structure-activity relationships (QSARs) for eco-toxicity screening of chemicals: the role of physicochemical properties.

    PubMed

    Furuhama, A; Hasunuma, K; Aoki, Y

    2015-01-01

    In addition to molecular structure profiles, descriptors based on physicochemical properties are useful for explaining the eco-toxicities of chemicals. In a previous study we reported that a criterion based on the difference between the partition coefficient (log POW) and distribution coefficient (log D) values of chemicals enabled us to identify aromatic amines and phenols for which interspecies relationships with strong correlations could be developed for fish-daphnid and algal-daphnid toxicities. The chemicals that met the log D-based criterion were expected to have similar toxicity mechanisms (related to membrane penetration). Here, we investigated the applicability of log D-based criteria to the eco-toxicity of other kinds of chemicals, including aliphatic compounds. At pH 10, use of a log POW - log D > 0 criterion and omission of outliers resulted in the selection of more than 100 chemicals whose acute fish toxicities or algal growth inhibition toxicities were almost equal to their acute daphnid toxicities. The advantage of log D-based criteria is that they allow for simple, rapid screening and prioritizing of chemicals. However, inorganic molecules and chemicals containing certain structural elements cannot be evaluated, because calculated log D values are unavailable.
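The screening step the abstract describes reduces to a one-line filter: retain chemicals whose partition coefficient exceeds their pH-specific distribution coefficient. The entries below are hypothetical illustrative values, not chemicals from the study:

```python
# Screening sketch based on the abstract's criterion: retain chemicals with
# log POW - log D > 0 at the chosen pH (i.e., chemicals that are at least
# partly ionized there). Entries are hypothetical: (name, log POW, log D at pH 10).

chemicals = [
    ("aniline-like",   1.1, 1.1),    # neutral at pH 10: log POW == log D, excluded
    ("phenol-like",    1.5, 0.6),    # partly ionized: log POW - log D > 0, selected
    ("aliphatic-acid", 2.0, -1.2),   # strongly ionized: selected
]

selected = [name for name, log_pow, log_d in chemicals if log_pow - log_d > 0]
```

As the abstract notes, the filter's appeal is exactly this simplicity, at the cost of excluding chemicals for which no calculated log D is available.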

  3. Predicting the Rate of River Bank Erosion Caused by Large Wood Log

    NASA Astrophysics Data System (ADS)

    Zhang, N.; Rutherfurd, I.; Ghisalberti, M.

    2016-12-01

When a single tree falls into a river channel, flow is deflected and accelerated between the tree roots and the bank face, increasing shear stress and scouring the bank. The scallop-shaped erosion increases the diversity of the channel morphology, but also causes concern for adjacent landholders. Concern about increased bank erosion is one of the main reasons why large wood is still removed from channels in SE Australia. Further, the hydraulic effect of many logs in the channel can reduce overall bank erosion rates. Although both phenomena have been described before, this research develops a hydraulic model that estimates their magnitude, and tests and calibrates this model with flume and field measurements, using logs of various configurations and sizes. Specifically, the model estimates the change in excess shear stress on the bank associated with a log. The model addresses the effects of log angle, distance from the bank, log size, and flow condition by solving mass continuity and energy conservation between the cross sections of the approaching flow and the contracted flow. We then evaluate our model against flume experiments performed with semi-realistic log models representing logs of different sizes and decay stages, comparing the measured and simulated velocity increase in the gap between the log and the bank. The log angle, distance from bank, and flow condition are systematically varied for each log model during the experiment. Finally, the calibrated model is compared with field data collected in anabranching channels of the Murray River in SE Australia, where there are abundant instream logs and regulated, consistently high flow for irrigation. Preliminary results suggest that a log can significantly increase the shear stress on the bank, especially when it is positioned perpendicular to the flow. The shear stress increases with the log angle in a rising curve (the log angle is the angle between the log trunk and the flow direction; 0° means the log is parallel to the flow with the canopy pointing downstream). However, the shear stress shows insignificant change as the log is moved closer to the bank.
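The continuity-plus-energy argument above can be sketched in a few lines: the log's projected width narrows the flow cross section, mass conservation accelerates the gap flow, and bank shear stress scales roughly with velocity squared. The rectangular-contraction geometry and quadratic stress law below are simplifying assumptions, not the calibrated model from the study:

```python
import math

# Minimal continuity-based sketch of the flow contraction a fallen log creates
# between itself and the bank. Gap velocity follows from mass conservation;
# shear stress is assumed to scale with velocity squared (tau ~ U^2).

def excess_shear_ratio(channel_width, log_length, angle_deg):
    """Ratio of bank shear stress with the log present to that without it.
    angle_deg is the angle between the log trunk and the flow direction."""
    blockage = log_length * math.sin(math.radians(angle_deg))
    if blockage >= channel_width:
        raise ValueError("log blocks the full channel")
    velocity_ratio = channel_width / (channel_width - blockage)
    return velocity_ratio ** 2   # tau ~ U^2

# A 10 m log in a 30 m channel: shear rises steeply with log angle,
# mirroring the "rising curve" the abstract describes.
ratios = {angle: excess_shear_ratio(30.0, 10.0, angle) for angle in (15, 45, 90)}
```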

  4. A Multi-temporal Analysis of Logging Impacts on Tropical Forest Structure Using Airborne Lidar Data

    NASA Astrophysics Data System (ADS)

    Keller, M. M.; Pinagé, E. R.; Duffy, P.; Longo, M.; dos-Santos, M. N.; Leitold, V.; Morton, D. C.

    2017-12-01

The long-term impacts of selective logging on carbon cycling and ecosystem function in tropical forests are still uncertain. Despite improvements in selective logging detection using satellite data, quantifying changes in forest structure from logging and recovery following logging is difficult using orbital data. We analyzed the dynamics of forest structure comparing logged and unlogged forests in the Eastern Brazilian Amazon (Paragominas Municipality, Pará State) using small-footprint, discrete-return airborne lidar data acquired in 2012 and 2014. Logging operations were conducted at the 1200 ha study site from 2006 through 2013 using reduced impact logging techniques, management practices that minimize canopy and ground damage compared to more common conventional logging. Nevertheless, logging still reduced aboveground biomass by 10% to 20% in logged areas compared to intact forests. We aggregated lidar point-cloud data at spatial scales ranging from 50 m to 250 m and developed a binomial classification model based on the height distribution of lidar returns in 2012, and validated the model against the 2014 lidar acquisition. We accurately classified intact and logged forest classes compared with field data. Classification performance improved as spatial resolution increased (AUC = 0.974 at 250 m). We analyzed the differences in canopy gaps, understory damage (based on a relative density model), and biomass (estimated from total canopy height) between intact and logged classes. As expected, logging greatly increased both canopy gap formation and understory damage. However, while the areas identified as canopy gaps persisted for at least 8 years (from the oldest logging treatments in 2006 to the most recent lidar acquisition in 2014), the effects of ground damage were mostly erased by vigorous understory regrowth after about 5 years. The rate of new gap formation was 6 to 7 times greater in recently logged forests than in undisturbed forests. New gaps opened at a rate 1.8 times greater than background even 8 years after logging, demonstrating the occurrence of delayed tree mortality. Our study showed that even low-intensity anthropogenic disturbances can cause persistent changes in tropical forest structure and dynamics.

  5. An Integrated System for Wildlife Sensing

    DTIC Science & Technology

    2014-08-14

... design requirement. "Sensor Controller" software: a custom Sensor Controller application was developed for the Android device in order to collect ... and log readings from that device's sensors. "Camera Controller" software: a custom Camera Controller application was developed for the Android device ... into 2 separate Android applications (Figure 4). The Sensor Controller logs readings periodically from the Android device's organic sensors, and ...

  6. Initial fungal colonizer affects mass loss and fungal community development in Picea abies logs 6 yr after inoculation

    Treesearch

    Daniel L. Lindner; Rimvydas Vasaitis; Ariana Kubartova; Johan Allmer; Hanna Johannesson; Mark T. Banik; Jan Stenlid

    2011-01-01

    Picea abies logs were inoculated with Resinicium bicolor, Fomitopsis pinicola or left un-inoculated and placed in an old-growth boreal forest. Mass loss and fungal community data were collected after 6 yr to test whether simplification of the fungal community via inoculation affects mass loss and fungal community development. Three...

  7. A graphical automated detection system to locate hardwood log surface defects using high-resolution three-dimensional laser scan data

    Treesearch

    Liya Thomas; R. Edward Thomas

    2011-01-01

    We have developed an automated defect detection system and a state-of-the-art Graphic User Interface (GUI) for hardwood logs. The algorithm identifies defects at least 0.5 inch high and at least 3 inches in diameter on barked hardwood log and stem surfaces. To summarize defect features and to build a knowledge base, hundreds of defects were measured, photographed, and...

  8. Meta-Analysis of the Reduction of Norovirus and Male-Specific Coliphage Concentrations in Wastewater Treatment Plants.

    PubMed

    Pouillot, Régis; Van Doren, Jane M; Woods, Jacquelina; Plante, Daniel; Smith, Mark; Goblick, Gregory; Roberts, Christopher; Locas, Annie; Hajen, Walter; Stobo, Jeffrey; White, John; Holtzman, Jennifer; Buenaventura, Enrico; Burkhardt, William; Catford, Angela; Edwards, Robyn; DePaola, Angelo; Calci, Kevin R

    2015-07-01

    Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Wastewater treatment plant (WWTP) effluents impacting bivalve mollusk-growing areas are potential sources of NoV contamination. We have developed a meta-analysis that evaluates WWTP influent concentrations and log10 reductions of NoV genotype I (NoV GI; in numbers of genome copies per liter [gc/liter]), NoV genotype II (NoV GII; in gc/liter), and male-specific coliphage (MSC; in number of PFU per liter), a proposed viral surrogate for NoV. The meta-analysis included relevant data (2,943 measurements) reported in the scientific literature through September 2013 and previously unpublished surveillance data from the United States and Canada. Model results indicated that the mean WWTP influent concentration of NoV GII (3.9 log10 gc/liter; 95% credible interval [CI], 3.5, 4.3 log10 gc/liter) is larger than the value for NoV GI (1.5 log10 gc/liter; 95% CI, 0.4, 2.4 log10 gc/liter), with large variations occurring from one WWTP to another. For WWTPs with mechanical systems and chlorine disinfection, mean log10 reductions were -2.4 log10 gc/liter (95% CI, -3.9, -1.1 log10 gc/liter) for NoV GI, -2.7 log10 gc/liter (95% CI, -3.6, -1.9 log10 gc/liter) for NoV GII, and -2.9 log10 PFU per liter (95% CI, -3.4, -2.4 log10 PFU per liter) for MSCs. Comparable values for WWTPs with lagoon systems and chlorine disinfection were -1.4 log10 gc/liter (95% CI, -3.3, 0.5 log10 gc/liter) for NoV GI, -1.7 log10 gc/liter (95% CI, -3.1, -0.3 log10 gc/liter) for NoV GII, and -3.6 log10 PFU per liter (95% CI, -4.8, -2.4 log10 PFU per liter) for MSCs. Within WWTPs, correlations exist between mean NoV GI and NoV GII influent concentrations and between the mean log10 reduction in NoV GII and the mean log10 reduction in MSCs. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
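
    The log10 reductions reported above are differences of log-scaled concentrations, so a negative value indicates removal. A minimal sketch, using hypothetical concentrations chosen to echo the reported NoV GII means:

```python
import math

def log10_reduction(influent_conc, effluent_conc):
    """Log10 reduction across a WWTP: negative values mean removal,
    matching the sign convention in the abstract (e.g. -2.7 log10 gc/liter)."""
    return math.log10(effluent_conc) - math.log10(influent_conc)

# Hypothetical NoV GII concentrations (gc/liter), echoing the reported means:
influent = 10 ** 3.9               # mean influent from the meta-analysis
effluent = influent * 10 ** -2.7   # applying the mean mechanical-system reduction
print(round(log10_reduction(influent, effluent), 1))  # -> -2.7
```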

  9. Understanding Systematics in ZZ Ceti Model Fitting to Enable Differential Seismology

    NASA Astrophysics Data System (ADS)

    Fuchs, J. T.; Dunlap, B. H.; Clemens, J. C.; Meza, J. A.; Dennihy, E.; Koester, D.

    2017-03-01

    We are conducting a large spectroscopic survey of over 130 Southern ZZ Cetis with the Goodman Spectrograph on the SOAR Telescope. Because it employs a single instrument with high UV throughput, this survey will both improve the signal-to-noise of the sample of SDSS ZZ Cetis and provide a uniform dataset for model comparison. We are paying special attention to systematics in the spectral fitting and quantify three of those systematics here. We show that relative positions in the log g-Teff plane are consistent for these three systematics.

  10. Scanner observations of cool stars from 3400 to 11,000 A.

    NASA Technical Reports Server (NTRS)

    Fay, T.; Honeycutt, R. K.

    1972-01-01

    Evaluation of photoelectric scans of the M supergiant alpha Ori and the carbon stars 19 Psc, W Ori, and DS Peg made at 20-A resolution from 3400 to 6000 A and at 40-A resolution from 6000 to 11,000 A. The data are corrected for atmospheric extinction and for the instrumental response to obtain plots of log flux per unit frequency interval versus wavelength. The dominant spectral features are due to C2, CN, and TiO; the variation of these features with spectral class is pointed out.

  11. Results of investigations at the Zunil geothermal field, Guatemala: Well logging and brine geochemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, A.; Dennis, B.; Van Eeckhout, E.

    1991-07-01

    The well logging team from Los Alamos and its counterpart from Central America were tasked to investigate the condition of four producing geothermal wells in the Zunil Geothermal Field. The information obtained would be used to help evaluate the Zunil geothermal reservoir in terms of possible additional drilling and future power plant design. The field activities focused on downhole measurements in four production wells (ZCQ-3, ZCQ-4, ZCQ-5, and ZCQ-6). The teams took measurements of the wells in both static (shut-in) and flowing conditions, using the high-temperature well logging tools developed at Los Alamos National Laboratory. Two well logging missions were conducted in the Zunil field. In October 1988 measurements were made in wells ZCQ-3, ZCQ-5, and ZCQ-6. In December 1989 the second field operation logged ZCQ-4 and repeated logs in ZCQ-3. Both field operations included not only well logging but also the collection of numerous fluid samples from both thermal and nonthermal waters. 18 refs., 22 figs., 7 tabs.

  12. Reliability and One-Year Stability of the PIN3 Neighborhood Environmental Audit in Urban and Rural Neighborhoods.

    PubMed

    Porter, Anna K; Wen, Fang; Herring, Amy H; Rodríguez, Daniel A; Messer, Lynne C; Laraia, Barbara A; Evenson, Kelly R

    2018-06-01

    Reliable and stable environmental audit instruments are needed to successfully identify the physical and social attributes that may influence physical activity. This study described the reliability and stability of the PIN3 environmental audit instrument in both urban and rural neighborhoods. Four randomly sampled road segments in and around a one-quarter mile buffer of participants' residences from the Pregnancy, Infection, and Nutrition (PIN3) study were rated twice, approximately 2 weeks apart. One year later, 253 of the year 1 sampled roads were re-audited. The instrument included 43 measures that resulted in 73 item scores for calculation of percent overall agreement, kappa statistics, and log-linear models. For same-day reliability, 81% of items had moderate to outstanding kappa statistics (kappas ≥ 0.4). Two-week reliability was slightly lower, with 77% of items having moderate to outstanding agreement using kappa statistics. One-year stability had 68% of items showing moderate to outstanding agreement using kappa statistics. The reliability of the audit measures was largely consistent when comparing urban to rural locations, with only 8% of items exhibiting significant differences (α < 0.05) by urbanicity. The PIN3 instrument is a reliable and stable audit tool for studies assessing neighborhood attributes in urban and rural environments.
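
    The kappa statistics used above correct raw percent agreement for the agreement expected by chance. Below is a minimal sketch of Cohen's kappa for a single audit item rated on two visits; the item and ratings are invented for illustration:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two categorical ratings of the same road segments:
    observed agreement corrected for agreement expected by chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical audit item rated on two visits ("y" = sidewalk present):
visit1 = ["y", "y", "n", "y", "n", "n", "y", "y"]
visit2 = ["y", "y", "n", "n", "n", "n", "y", "y"]
print(round(cohens_kappa(visit1, visit2), 2))  # -> 0.75
```

    A value of 0.75 would fall in the "moderate to outstanding" band (kappa >= 0.4) used in the abstract.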

  13. Use of NMR logging to obtain estimates of hydraulic conductivity in the High Plains aquifer, Nebraska, USA

    USGS Publications Warehouse

    Dlubac, Katherine; Knight, Rosemary; Song, Yi-Qiao; Bachman, Nate; Grau, Ben; Cannia, Jim; Williams, John

    2013-01-01

    Hydraulic conductivity (K) is one of the most important parameters of interest in groundwater applications because it quantifies the ease with which water can flow through an aquifer material. Hydraulic conductivity is typically measured by conducting aquifer tests or wellbore flow (WBF) logging. Of interest in our research is the use of proton nuclear magnetic resonance (NMR) logging to obtain information about water-filled porosity and pore space geometry, the combination of which can be used to estimate K. In this study, we acquired a suite of advanced geophysical logs, aquifer tests, WBF logs, and sidewall cores at the field site in Lexington, Nebraska, which is underlain by the High Plains aquifer. We first used two empirical equations developed for petroleum applications to predict K from NMR logging data: the Schlumberger Doll Research equation (KSDR) and the Timur-Coates equation (KT-C), with the standard empirical constants determined for consolidated materials. We upscaled our NMR-derived K estimates to the scale of the WBF-logging K (KWBF-logging) estimates for comparison. All the upscaled KT-C estimates were within an order of magnitude of KWBF-logging and all of the upscaled KSDR estimates were within 2 orders of magnitude of KWBF-logging. We optimized the fit between the upscaled NMR-derived K and KWBF-logging estimates to determine a set of site-specific empirical constants for the unconsolidated materials at our field site. We conclude that reliable estimates of K can be obtained from NMR logging data, thus providing an alternate method for obtaining estimates of K at high levels of vertical resolution.
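
    The two petroleum-industry equations named above are commonly written as power laws in NMR quantities. The sketch below shows the functional forms only; the prefactors b and C are placeholders, consistent with the study's point that the standard constants must be recalibrated for unconsolidated materials:

```python
def k_sdr(phi, t2ml, b=4.0):
    """Schlumberger-Doll Research (SDR) form: K proportional to phi^4 * T2ML^2,
    with phi the NMR water-filled porosity (fraction) and T2ML the log-mean T2.
    The prefactor b is a placeholder; the study refits such constants per site."""
    return b * phi ** 4 * t2ml ** 2

def k_timur_coates(phi, ffi, bvi, c=10.0):
    """Timur-Coates form: K proportional to (phi/C)^4 * (FFI/BVI)^2, with
    FFI/BVI the free-fluid to bound-fluid ratio from the T2 distribution.
    C is likewise a placeholder constant."""
    return (phi / c) ** 4 * (ffi / bvi) ** 2

# Scaling behavior, not calibrated values: doubling T2ML quadruples K_SDR.
print(k_sdr(0.30, 200.0) / k_sdr(0.30, 100.0))  # -> 4.0
```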

  14. Technical note: An improved approach to determining background aerosol concentrations with PILS sampling on aircraft

    NASA Astrophysics Data System (ADS)

    Fukami, Christine S.; Sullivan, Amy P.; Ryan Fulgham, S.; Murschell, Trey; Borch, Thomas; Smith, James N.; Farmer, Delphine K.

    2016-07-01

    Particle-into-Liquid Samplers (PILS) have become a standard aerosol collection technique, and are widely used in both ground and aircraft measurements in conjunction with off-line ion chromatography (IC) measurements. Accurate and precise background samples are essential to account for gas-phase components not efficiently removed and any interference in the instrument lines, collection vials or off-line analysis procedures. For aircraft sampling with PILS, backgrounds are typically taken with in-line filters to remove particles prior to sample collection once or twice per flight, with more numerous backgrounds taken on the ground. Here, we use data collected during the Front Range Air Pollution and Photochemistry Éxperiment (FRAPPÉ) to demonstrate not only that multiple background filter samples are essential to attain a representative background, but also that the chemical background signals do not follow the Gaussian statistics typically assumed. Instead, the background signals for all chemical components analyzed from 137 background samples (taken from ∼78 total sampling hours over 18 flights) follow a log-normal distribution, meaning that the typical approaches of averaging background samples and/or assuming a Gaussian distribution cause an over-estimation of background samples - and thus an underestimation of sample concentrations. Our approach of deriving backgrounds from the peak of the log-normal distribution results in detection limits of 0.25, 0.32, 3.9, 0.17, 0.75 and 0.57 μg m-3 for sub-micron aerosol nitrate (NO3-), nitrite (NO2-), ammonium (NH4+), sulfate (SO42-), potassium (K+) and calcium (Ca2+), respectively. The difference in backgrounds calculated from assuming a Gaussian distribution versus a log-normal distribution was most extreme for NH4+, resulting in a background that was 1.58× that determined from fitting a log-normal distribution.
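
    Taking the background from the peak (mode) of a log-normal fit, rather than from a Gaussian mean, can be sketched as follows; the distribution parameters and sample values are invented, not FRAPPÉ data:

```python
import math
import random

def lognormal_mode(samples):
    """Mode (peak) of a log-normal fit: exp(mu - sigma^2), with mu and sigma
    estimated from the natural logs of the background samples."""
    logs = [math.log(x) for x in samples]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / (len(logs) - 1)
    return math.exp(mu - var)

random.seed(1)
# Hypothetical background signals (arbitrary units), log-normally distributed:
bg = [random.lognormvariate(0.0, 0.8) for _ in range(137)]
gaussian_mean = sum(bg) / len(bg)
# Ratio of the Gaussian-style mean to the log-normal peak: values above 1
# illustrate how averaging overestimates the background.
print(round(gaussian_mean / lognormal_mode(bg), 2))
```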

  15. Testing compression strength of wood logs by drilling resistance

    NASA Astrophysics Data System (ADS)

    Kalny, Gerda; Rados, Kristijan; Rauch, Hans Peter

    2017-04-01

    Soil bioengineering is a construction technique that uses biological components for hydraulic and civil engineering solutions, based on the application of living plants and other auxiliary materials including, among others, log wood. Considering the reliability of the construction, it is important to know about the durability and the degradation process of the wooden logs to estimate and retain the integral performance of a soil bioengineering system. An important performance indicator is the compression strength, but this parameter is not easy to examine by non-destructive methods. The Rinntech Resistograph is an instrument that measures drilling resistance with a 3 mm wide needle in a wooden log. It is a quasi-non-destructive method, as the remaining hole has no weakening effect on the wood. This is an easy procedure but results in values that are hard to interpret. To assign drilling resistance values to specific compression strengths, wooden specimens were tested in an experiment and analysed with the Resistograph. Afterwards, compression tests were done on the same specimens. This should allow an easier interpretation of drilling resistance curves in the future. For detailed analyses, specimens were examined for branch inclusions, cracks and distances between annual rings. Wood specimens were tested perpendicular to the grain. First results show a correlation between drilling resistance and compression strength when using the mean drilling resistance, the average width of the annual rings and the mean range of the minima and maxima values as factors for the drilling resistance. The extended limit of proportionality, the offset yield strength and the maximum strength were taken as parameters for compression strength. Further investigations at a second point in time strengthened these results.

  16. Spent Fuel Test-Climax: core logging for site investigation and instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilder, D.G.; Yow, J.L. Jr.; Thorpe, R.K.

    1982-05-28

    As an integral part of the Spent Fuel Test-Climax, 5150 ft (1570 m) of granite core was obtained. This core was diamond drilled in various sizes, mainly 38-mm and 76-mm diameters. The core was taken with single tube core barrels and was unoriented. Techniques used to drill and log this core are discussed, as well as techniques to orient the core. Of the 5150 ft (1570 m) of core, more than 3645 ft (1111 m) was retained and logged in some detail. As a result of the core logging, geologic discontinuities were identified, and joint frequency and spacing were characterized. Discontinuities identified included several joint sets, shear zones and faults. Correlations based on coring alone were generally found to be impossible, even for the more prominent features. The only feature properly correlated from the exploratory drilling was the fault system at the end of the facility, but it was not identified from the exploratory core as a fault. Identification of discontinuities was later helped by underground mapping that identified several different joint sets with different characteristics. It was found that joint frequency varied from 0.3 to 1.1 joints per foot of core for open fractures and from 0.3 to 3.3/ft for closed or healed fractures. Histograms of fracture spacing indicate that there is likely a random distribution of spacing superimposed upon uniformly spaced fractures. It was found that a low angle joint set had a persistent mean orientation. These joints were healed and had pervasive wall rock alteration, which made identification of joints in this set possible. The recognition of a joint set with known attitude allowed orientation of much of the core. This orientation technique was found to be effective. 10 references, 25 figures, 4 tables.

  17. Introducing high performance distributed logging service for ACS

    NASA Astrophysics Data System (ADS)

    Avarias, Jorge A.; López, Joao S.; Maureira, Cristián; Sommer, Heiko; Chiozzi, Gianluca

    2010-07-01

    The ALMA Common Software (ACS) is a software framework that provides the infrastructure for the Atacama Large Millimeter Array and other projects. ACS, based on CORBA, offers basic services and common design patterns for distributed software. Every properly built system needs to be able to log status and error information. Logging in a single computer scenario can be as easy as using fprintf statements. However, in a distributed system, it must provide a way to centralize all logging data in a single place without overloading the network or complicating the applications. ACS provides a complete logging service infrastructure in which every log has an associated priority and timestamp, allowing filtering at different levels of the system (application, service and clients). Currently the ACS logging service uses an implementation of the CORBA Telecom Log Service in a customized way, using only a minimal subset of the features provided by the standard. The most relevant feature used by ACS is the ability to treat the logs as event data that gets distributed over the network in a publisher-subscriber paradigm. For this purpose the CORBA Notification Service, which is resource intensive, is used. On the other hand, the Data Distribution Service (DDS) provides an alternative standard for publisher-subscriber communication for real-time systems, offering better performance and featuring decentralized message processing. This document describes how the new high performance logging service of ACS has been modeled and developed using DDS, replacing the Telecom Log Service. Benefits and drawbacks are analyzed. A benchmark is presented comparing the differences between the implementations.
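
    The priority-and-timestamp filtering described above can be illustrated with a toy publisher-subscriber channel. This is a generic Python sketch, not the ACS, CORBA Telecom Log Service, or DDS API; all names below are invented:

```python
import time
from dataclasses import dataclass, field

@dataclass
class LogRecord:
    priority: int              # higher = more severe
    source: str
    message: str
    timestamp: float = field(default_factory=time.time)

class LogChannel:
    """Toy publisher-subscriber channel: each subscriber registers a minimum
    priority, so filtering happens at delivery rather than at the source."""
    def __init__(self):
        self.subscribers = []  # list of (min_priority, callback) pairs
    def subscribe(self, min_priority, callback):
        self.subscribers.append((min_priority, callback))
    def publish(self, record):
        for min_priority, callback in self.subscribers:
            if record.priority >= min_priority:
                callback(record)

channel = LogChannel()
central = []
channel.subscribe(0, central.append)   # central archive keeps everything
alarms = []
channel.subscribe(8, alarms.append)    # operator console: high priority only
channel.publish(LogRecord(2, "antenna-01", "status ok"))
channel.publish(LogRecord(9, "antenna-02", "drive fault"))
print(len(central), len(alarms))  # -> 2 1
```

    Per-subscriber filtering like this is the property the abstract attributes to both the Telecom Log Service setup and its DDS replacement: logs fan out as events, and each consumer sees only the priorities it asked for.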

  18. First Report on a Randomized Investigation of Antimicrobial Resistance in Fecal Indicator Bacteria from Livestock, Poultry, and Humans in Tanzania.

    PubMed

    Katakweba, Abdul A S; Muhairwa, Amandus P; Lupindu, Athumani M; Damborg, Peter; Rosenkrantz, Jesper T; Minga, Uswege M; Mtambo, Madundo M A; Olsen, John E

    2018-04-01

    This study provides an estimate of antimicrobial resistance in intestinal indicator bacteria from humans (n = 97) and food animals (n = 388) in Tanzania. More than 70% of all fecal samples contained tetracycline (TE)-, sulfamethoxazole (STX)-, and ampicillin (AMP)-resistant coliforms, while cefotaxime (CTX)-resistant coliforms were observed in 40% of all samples. The average log10 colony forming units/g of CTX-resistant coliforms in samples from humans was 2.20. Of 390 Escherichia coli tested, 66.4% were resistant to TE, 54.9% to STX, 54.9% to streptomycin, and 36.4% to CTX. Isolates were commonly (65.1%) multiresistant. All CTX-resistant isolates contained a bla CTX-M gene type. AMP- and vancomycin-resistant enterococci were rare, and the average concentrations in positive samples were low (log10 0.9 and 0.4, respectively). A low-to-moderate resistance (2.1-15%) was detected in 240 enterococci isolates to the drugs tested, except for rifampicin resistance (75.2% of isolates). The average number of sulII gene copies varied between log10 5.37 and 5.68 with no significant difference between sample sources, while cattle had a significantly higher number of tetW genes than humans. These findings, based on randomly obtained samples, will be instrumental in designing antimicrobial resistance (AMR) intervention strategies for Tanzania.

  19. Instructional Conversations and Literature Logs. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2006

    2006-01-01

    This What Works Clearinghouse (WWC) report examines the effect of "Instructional Conversations and Literature Logs" used in combination. The goal of "Instructional Conversations" is to help English language learners develop reading comprehension ability along with English language proficiency. "Instructional…

  20. Precision gravity measurement utilizing Accelerex vibrating beam accelerometer technology

    NASA Astrophysics Data System (ADS)

    Norling, Brian L.

    Tests run using Sundstrand vibrating beam accelerometers to sense microgravity are described. Lunar-solar tidal effects were used as a highly predictable signal which varies by approximately 200 billionths of the full-scale gravitation level. Test runs of 48-h duration were used to evaluate stability, resolution, and noise. Test results on the Accelerex accelerometer show accuracies suitable for precision applications such as gravity mapping and gravity density logging. The test results indicate that Accelerex technology, even with an instrument design and signal processing approach not optimized for microgravity measurement, can achieve 48-nano-g (1 sigma) or better accuracy over a 48-h period. This value includes contributions from instrument noise and random walk, combined bias and scale factor drift, and thermal modeling errors as well as external contributions from sampling noise, test equipment inaccuracies, electrical noise, and cultural noise induced acceleration.

  1. Required Operational Capability, USMC-ROC-LOG-216.3.5 for the Ration, Cold Weather.

    DTIC Science & Technology

    1987-05-06

    in operations or training in an arctic environment. b. Organizational Concept. The ration, cold weather will be issued in accordance with established...all services. 2 ROC-ARCTIC 7. TECHNICAL FEASIBILITY AND ENERGY/ ENVIRONMENTAL IMPACTS a. Technical Feasibility. The risk of developing the ration ...

  2. Bushmeat supply and consumption in a tropical logging concession in northern Congo.

    PubMed

    Poulsen, J R; Clark, C J; Mavah, G; Elkan, P W

    2009-12-01

    Unsustainable hunting of wildlife for food empties tropical forests of many species critical to forest maintenance and livelihoods of forest people. Extractive industries, including logging, can accelerate exploitation of wildlife by opening forests to hunters and creating markets for bushmeat. We monitored human demographics, bushmeat supply in markets, and household bushmeat consumption in five logging towns in the northern Republic of Congo. Over 6 years we recorded 29,570 animals in town markets and collected 48,920 household meal records. Development of industrial logging operations led to a 69% increase in the population of logging towns and a 64% increase in bushmeat supply. The immigration of workers, jobseekers, and their families altered hunting patterns and was associated with increased use of wire snares and increased diversity in the species hunted and consumed. Immigrants hunted 72% of all bushmeat, which suggests the short-term benefits of hunting accrue disproportionately to "outsiders" to the detriment of indigenous peoples who have prior, legitimate claims to wildlife resources. Our results suggest that the greatest threat of logging to biodiversity may be the permanent urbanization of frontier forests. Although enforcement of hunting laws and promotion of alternative sources of protein may help curb the pressure on wildlife, the best strategy for biodiversity conservation may be to keep saw mills and the towns that develop around them out of forests.

  3. Internet-based early intervention to prevent posttraumatic stress disorder in injury patients: randomized controlled trial.

    PubMed

    Mouthaan, Joanne; Sijbrandij, Marit; de Vries, Giel-Jan; Reitsma, Johannes B; van de Schoot, Rens; Goslings, J Carel; Luitse, Jan S K; Bakker, Fred C; Gersons, Berthold P R; Olff, Miranda

    2013-08-13

    Posttraumatic stress disorder (PTSD) develops in 10-20% of injury patients. We developed a novel, self-guided Internet-based intervention (called Trauma TIPS) based on techniques from cognitive behavioral therapy (CBT) to prevent the onset of PTSD symptoms. To determine whether Trauma TIPS is effective in preventing the onset of PTSD symptoms in injury patients. Adult, level 1 trauma center patients were randomly assigned to receive the fully automated Trauma TIPS Internet intervention (n=151) or to receive no early intervention (n=149). Trauma TIPS consisted of psychoeducation, in vivo exposure, and stress management techniques. Both groups were free to use care as usual (nonprotocolized talks with hospital staff). PTSD symptom severity was assessed at 1, 3, 6, and 12 months post injury with a clinical interview (Clinician-Administered PTSD Scale) by blinded trained interviewers and self-report instrument (Impact of Event Scale-Revised). Secondary outcomes were acute anxiety and arousal (assessed online), self-reported depressive and anxiety symptoms (Hospital Anxiety and Depression Scale), and mental health care utilization. Intervention usage was documented. The mean number of intervention logins was 1.7, SD 2.5, median 1, interquartile range (IQR) 1-2. Thirty-four patients in the intervention group did not log in (22.5%), 63 (41.7%) logged in once, and 54 (35.8%) logged in multiple times (mean 3.6, SD 3.5, median 3, IQR 2-4). On clinician-assessed and self-reported PTSD symptoms, both the intervention and control group showed a significant decrease over time (P<.001) without significant differences in trend. PTSD at 12 months was diagnosed in 4.7% of controls and 4.4% of intervention group patients. There were no group differences on anxiety or depressive symptoms over time. Post hoc analyses using latent growth mixture modeling showed a significant decrease in PTSD symptoms in a subgroup of patients with severe initial symptoms (n=20) (P<.001). 
Our results do not support the efficacy of the Trauma TIPS Internet-based early intervention in the prevention of PTSD symptoms for an unselected population of injury patients. Moreover, uptake was relatively low since one-fifth of individuals did not log in to the intervention. Future research should therefore focus on innovative strategies to increase intervention usage, for example, adding gameplay, embedding it in a blended care context, and targeting high-risk individuals who are more likely to benefit from the intervention. International Standard Randomized Controlled Trial Number (ISRCTN): 57754429; http://www.controlled-trials.com/ISRCTN57754429 (Archived by WebCite at http://webcitation.org/6FeJtJJyD).

  4. 3D Extended Logging for Geothermal Resources: Field Trials with the Geo-Bilt System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mallan, R; Wilt, M; Kirkendall, B

    2002-05-29

    Geo-BILT (Geothermal Borehole Induction Logging Tool) is an extended induction logging tool designed for 3D resistivity imaging around a single borehole. The tool was developed for deployment in high temperature geothermal wells under a joint program funded by the California Energy Commission, Electromagnetic Instruments (EMI) and the U.S. Department of Energy. EMI was responsible for tool design and manufacture, and numerical modeling efforts were addressed at Lawrence Livermore National Laboratory (LLNL) and other contractors. The field deployment was done by EMI and LLNL. The tool operates at frequencies from 2 to 42 kHz, and its design features a series of three-component magnetic sensors offset at 2 and 5 meters from a three-component magnetic source. The combined package makes it possible to do 3D resistivity imaging, deep into the formation, from a single well. The manufacture and testing of the tool were completed in spring of 2001, and the initial deployment of Geo-BILT occurred in May 2001 at the Lost Hills oil field in southern California at leases operated by Chevron USA. This site was chosen for the initial field test because of the favorable geological conditions and the availability of a number of wells suitable for tool deployment. The second deployment occurred in April 2002 at the Dixie Valley geothermal field, operated by Caithness Power LLC, in central Nevada. This constituted the first test in a high temperature environment. The Chevron site features a fiberglass-cased observation well in the vicinity of a water injector. The injected water, which is used for pressure maintenance and for secondary sweep of the heavy oil formation, has a much lower resistivity than the oil bearing formation. This, in addition to the non-uniform flow of this water, creates a 3D resistivity structure, which is analogous to conditions produced from flowing fractures adjacent to geothermal boreholes.
Therefore, it is an excellent site for testing the 3D capability of the tool in a low risk environment. The Dixie Valley site offered an environment where the tool could locate near-well fractures associated with steam development. The Lost Hills field measurements yielded a data set suitable for 3D imaging. The Geo-BILT data corresponded to existing conventional logging data and showed clear indications, in several depth intervals, of near-well 3D structure. Subsequent 3D inversion of these data produced a model consistent with non-planar water flow in specific layers. The Dixie Valley measurements identified structures associated with dike intrusions and water inflow at particular depths. Preliminary analysis suggests these structures are steeply dipping, which is consistent with the geology.

  5. QSPR study of polychlorinated diphenyl ethers by molecular electronegativity distance vector (MEDV-4).

    PubMed

    Sun, Lili; Zhou, Liping; Yu, Yu; Lan, Yukun; Li, Zhiliang

    2007-01-01

    Polychlorinated diphenyl ethers (PCDEs) have received increasing attention as a group of ubiquitous potential persistent organic pollutants (POPs). By using the molecular electronegativity distance vector (MEDV-4), multiple linear regression (MLR) models are developed for sub-cooled liquid vapor pressures (P(L)), n-octanol/water partition coefficients (K(OW)) and sub-cooled liquid water solubilities (S(W,L)) of 209 PCDEs and diphenyl ether. The correlation coefficients (R) and the leave-one-out cross-validation (LOO) correlation coefficients (R(CV)) of all the 6-descriptor models for logP(L), logK(OW) and logS(W,L) are more than 0.98. By using stepwise multiple regression (SMR), descriptors are selected, and the resulting models are a 5-descriptor model for logP(L), a 4-descriptor model for logK(OW), and a 6-descriptor model for logS(W,L), respectively. All these models exhibit excellent estimation capabilities for the internal sample set and good predictive capabilities for the external sample set. The consistency between observed and estimated/predicted values for logP(L) is the best (R=0.996, R(CV)=0.996), followed by logK(OW) (R=0.992, R(CV)=0.992) and logS(W,L) (R=0.983, R(CV)=0.980). By using MEDV-4 descriptors, the QSPR models can be used for prediction, and the model predictions can hence extend the current database of experimental values.
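
    The leave-one-out (LOO) statistic R(CV) refits the model with each sample held out, predicts that sample, and correlates the held-out predictions with the observations. Below is a one-descriptor sketch; the descriptor (chlorine count) and property values are hypothetical, not MEDV-4 results:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for a one-descriptor linear model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def correlation(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def loo_r(xs, ys):
    """Leave-one-out cross-validated R: refit with sample i held out,
    predict it, then correlate the held-out predictions with observations."""
    preds = []
    for i in range(len(xs)):
        slope, intercept = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        preds.append(slope * xs[i] + intercept)
    return correlation(preds, ys)

# Hypothetical descriptor (chlorine count) vs. logK_OW for a congener series:
cl_count = [0, 1, 2, 3, 4, 5, 6, 7, 8]
log_kow = [4.2, 4.9, 5.4, 5.9, 6.4, 6.9, 7.3, 7.7, 8.1]
print(round(loo_r(cl_count, log_kow), 3))
```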

  6. The Richter scale: its development and use for determining earthquake source parameters

    USGS Publications Warehouse

    Boore, D.M.

    1989-01-01

    The ML scale, introduced by Richter in 1935, is the antecedent of every magnitude scale in use today. The scale is defined such that a magnitude-3 earthquake recorded on a Wood-Anderson torsion seismometer at a distance of 100 km would write a record with a peak excursion of 1 mm. To be useful, some means are needed to correct recordings to the standard distance of 100 km. Richter provides a table of correction values, which he terms -log A0, the latest of which is contained in his 1958 textbook. A new analysis of over 9000 readings from almost 1000 earthquakes in the southern California region was recently completed to redetermine the -log A0 values. Although some systematic differences were found between this analysis and Richter's values (such that using Richter's values would lead to under- and overestimates of ML at distances less than 40 km and greater than 200 km, respectively), the accuracy of his values is remarkable in view of the small number of data used in their determination. Richter's corrections for the distance attenuation of the peak amplitudes on Wood-Anderson seismographs apply only to the southern California region, of course, and should not be used in other areas without first checking to make sure that they are applicable. Often in the past this has not been done, but recently a number of papers have been published determining the corrections for other areas. If there are significant differences in the attenuation within 100 km between regions, then the definition of the magnitude at 100 km could lead to difficulty in comparing the sizes of earthquakes in various parts of the world. To alleviate this, it is proposed that the scale be defined such that a magnitude 3 corresponds to 10 mm of motion at 17 km. This is consistent both with Richter's definition of ML at 100 km and with the newly determined distance corrections in the southern California region.
Aside from the obvious (and original) use as a means of cataloguing earthquakes according to size, ML has been used in predictions of ground shaking as a function of distance and magnitude; it has also been used in estimating energy and seismic moment. There is a good correlation of peak ground velocity and the peak motion on a Wood-Anderson instrument at the same location, as well as an observationally defined (and theoretically predicted) nonlinear relation between ML and seismic moment. An important byproduct of the establishment of the ML scale is the continuous operation of the network of Wood-Anderson seismographs on which the scale is based. The records from these instruments can be used to make relative comparisons of amplitudes and waveforms of recent and historic earthquakes; furthermore, because of the moderate gain, the instruments can write onscale records from great earthquakes at teleseismic distances and thus can provide important information about the energy radiated from such earthquakes at frequencies where many instruments have saturated. © 1989.
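
The distance-correction idea above can be sketched numerically. A minimal illustration in Python: the 100 km and 17 km entries below follow from the two anchor points stated in the abstract, while the other -log Ao values are placeholders, not Richter's actual table.

```python
import math

# -log A0 distance corrections (distance in km -> correction).
# The 100 km value (3.0) and 17 km value (2.0) follow from the two
# anchors described above; the remaining entries are illustrative only.
LOG_A0 = {10: 1.5, 17: 2.0, 50: 2.6, 100: 3.0, 200: 3.5}

def ml(peak_amplitude_mm: float, distance_km: int) -> float:
    """Local magnitude: ML = log10(A) + (-log A0)(distance)."""
    return math.log10(peak_amplitude_mm) + LOG_A0[distance_km]

# The classic definition: 1 mm at 100 km gives ML = 3.
assert abs(ml(1.0, 100) - 3.0) < 1e-9
# The proposed alternative anchor: 10 mm at 17 km also gives ML = 3.
assert abs(ml(10.0, 17) - 3.0) < 1e-9
```

The two assertions show why the proposed 17 km anchor is consistent with the original 100 km definition: both pin the same ML = 3 reference event.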

  7. A Computer Vision System for Locating and Identifying Internal Log Defects Using CT Imagery

    Treesearch

    Dongping Zhu; Richard W. Conners; Frederick Lamb; Philip A. Araman

    1991-01-01

    A number of researchers have shown the ability of magnetic resonance imaging (MRI) and computed tomography (CT) imaging to detect internal defects in logs. However, if these devices are ever to play a role in the forest products industry, automatic methods for analyzing data from these devices must be developed. This paper reports research aimed at developing a...

  8. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, which identifies the risk of disease mortality, helps healthcare providers to effectively manage their patients by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models to predict 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms the functions used in previous studies by a 1% improvement in AUC. This study revealed that building prediction models from EHR data can be very challenging for existing classification methods due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations in their diagnosis and treatment.
Our risk models provided two valuable insights for application of predictive modeling techniques in biomedicine: Logistic risk models often make systematic prediction errors, and it is prudent to use subgroup based prediction models such as those given by CPXR(Log) when investigating heterogeneous diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
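
The pattern-aided structure described above, several patterns each carrying its own local logistic model plus a default model, can be sketched as follows. The patterns, features, and coefficients here are invented for illustration; CPXR(Log) learns both the patterns and the local models from data.

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# A pattern is a predicate over a patient record; each pattern has its
# own local logistic model. These patterns and coefficients are made up
# for illustration -- the real algorithm mines contrast patterns from
# the large-error vs. small-error instances of a baseline model.
patterns = [
    (lambda x: x["age"] > 75 and x["diabetes"], {"bias": -0.2, "ef": -0.05}),
    (lambda x: x["ef"] < 30,                    {"bias":  0.5, "ef": -0.08}),
]
default_model = {"bias": -1.0, "ef": -0.02}

def predict_risk(x: dict) -> float:
    """Route each instance to the first matching pattern's local logistic
    model, falling back to the default model otherwise."""
    for matches, coef in patterns:
        if matches(x):
            return sigmoid(coef["bias"] + coef["ef"] * x["ef"])
    return sigmoid(default_model["bias"] + default_model["ef"] * x["ef"])

patient = {"age": 80, "diabetes": True, "ef": 25}
risk = predict_risk(patient)  # first local model applies
```

The subgroup routing is the point: different HF subgroups get different coefficients, which is how a pattern-aided model captures a heterogeneous disease that a single global logistic model flattens out.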

  9. Thorium normalization as a hydrocarbon accumulation indicator for Lower Miocene rocks in Ras Ghara area, Gulf of Suez, Egypt

    NASA Astrophysics Data System (ADS)

    El-Khadragy, A. A.; Shazly, T. F.; AlAlfy, I. M.; Ramadan, M.; El-Sawy, M. Z.

    2018-06-01

    An exploration method has been developed that uses surface and aerial gamma-ray spectral measurements for petroleum prospecting in stratigraphic and structural traps. The Gulf of Suez is an important region for studying hydrocarbon potentiality in Egypt. The thorium normalization technique was applied to the sandstone reservoirs in the region to determine the hydrocarbon potentiality zones using the three spectrometric radioactive gamma-ray logs (eU, eTh, and K% logs). This method was applied to the recorded gamma-ray spectrometric logs for the Rudeis and Kareem Formations in the Ras Ghara Oil Field, Gulf of Suez, Egypt. The conventional well logs (gamma-ray, resistivity, neutron, density, and sonic logs) were analyzed to determine the net pay zones in the study area. The agreement ratios between the thorium normalization technique and the results of the well-log analyses are high, so the thorium normalization technique can be used as a guide to hydrocarbon accumulation in the studied reservoir rocks.
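
A minimal sketch of the thorium normalization idea: under the common assumption that thorium is relatively immobile, uranium in excess of a Th-predicted background can flag altered zones of interest. The readings, the interval-ratio background model, and the cutoff below are illustrative assumptions, not the authors' exact formulation.

```python
# Invented spectrometric readings (ppm) at five depth samples.
eth = [8.0, 7.5, 8.2, 7.9, 8.1]   # eTh log (assumed stable background)
eu  = [2.0, 1.9, 3.4, 2.1, 3.6]   # eU log

# Background U predicted from the interval-average U/Th ratio.
ratio = sum(eu) / sum(eth)

def anomalies(eu, eth, ratio, cutoff=0.5):
    """Depth indices where measured eU exceeds the Th-predicted
    background by more than the cutoff (illustrative threshold)."""
    return [i for i, (u, th) in enumerate(zip(eu, eth))
            if u - ratio * th > cutoff]

flagged = anomalies(eu, eth, ratio)  # candidate zones for follow-up
```

In the study, such gamma-ray-spectrometric anomalies were then cross-checked against net pay zones from the conventional well-log analysis.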

  10. Parallel algorithms for computation of the manipulator inertia matrix

    NASA Technical Reports Server (NTRS)

    Amin-Javaheri, Masoud; Orin, David E.

    1989-01-01

    The development of an O(log2 N) parallel algorithm for the manipulator inertia matrix is presented. It is based on the most efficient serial algorithm, which uses the composite rigid-body method. Recursive doubling is used to reformulate the linear recurrence equations required to compute the diagonal elements of the matrix, resulting in O(log2 N) levels of computation. Computation of the off-diagonal elements involves N linear recurrences of varying size, and a new method, which avoids redundant computation of position and orientation transforms for the manipulator, is developed. The O(log2 N) algorithm is presented in both equation and graphic forms, which clearly show the parallelism inherent in the algorithm.
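
Recursive doubling of a first-order linear recurrence, the reformulation the abstract describes for the diagonal elements, can be sketched as follows. The recurrence and coefficients are generic placeholders, not the inertia-matrix equations themselves; each level's inner loop is a parallel-for in hardware, and the level count is O(log2 N) rather than the O(N) of a serial sweep.

```python
# Recursive doubling for the first-order linear recurrence
#   x[i] = a[i] * x[i-1] + b[i],  with x[0] = b[0] (so a[0] = 0).

def recurrence_by_doubling(a, b):
    n = len(b)
    # Represent x[i] as the affine map x[i] = A[i] * x[-1] + B[i],
    # then compose adjacent maps pairwise, doubling the span per level.
    A, B = list(a), list(b)
    step = 1
    while step < n:                        # O(log2 N) levels
        newA, newB = A[:], B[:]
        for i in range(step, n):           # parallel-for in hardware
            newA[i] = A[i] * A[i - step]
            newB[i] = A[i] * B[i - step] + B[i]
        A, B = newA, newB
        step *= 2
    return B  # with x[-1] treated as 0, x[i] = B[i]

# Cross-check against the serial recurrence.
a = [0, 2, 3, 1, 2]
b = [1, 1, 1, 1, 1]
x = [b[0]]
for i in range(1, len(b)):
    x.append(a[i] * x[-1] + b[i])
assert recurrence_by_doubling(a, b) == x
```

The same affine-map composition is associative, which is what lets the pairwise combines at each level run independently of one another.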

  11. Is skin penetration a determining factor in skin sensitization ...

    EPA Pesticide Factsheets

    Summary: Background. It is widely accepted that substances that cannot penetrate the skin will not be sensitisers. Thresholds based on relevant physicochemical parameters, such as LogKow and molecular weight, have been proposed; this study examines whether LogKow > 1 is a true requirement for sensitisation. Methods. A large dataset of substances that had been evaluated for their skin sensitisation potential, together with measured LogKow values, was compiled from the REACH database. The incidence of skin sensitisers relative to non-skin sensitisers below and above the LogKow = 1 threshold was evaluated. Results. 1482 substances with associated skin sensitisation outcomes and measured LogKow values were identified. 305 substances had a measured LogKow < 0 and, of those, 38 were sensitisers. Conclusions. There was no significant difference in the incidence of skin sensitisation above and below the LogKow = 1 threshold. Reaction chemistry considerations could explain the skin sensitisation observed for the 38 sensitisers with a LogKow < 0. The LogKow threshold appears to be an assumption born of the widespread misconception that the ability to efficiently penetrate the stratum corneum is a key determinant of skin sensitisation potential and potency. The study uses the extracted REACH data to test the validity of common assumptions in the skin sensitisation AOP and contributes to the development of a proof-of-concept IATA.
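
The incidence comparison described above reduces to a simple contingency count on either side of the LogKow = 1 threshold. A sketch with invented records (the actual study used 1482 REACH substances with measured LogKow values):

```python
# Invented substance records for illustration; each has a measured
# LogKow value and a binary sensitisation outcome.
records = [
    {"logkow": -0.5, "sensitiser": True},
    {"logkow": 0.3,  "sensitiser": False},
    {"logkow": 1.8,  "sensitiser": True},
    {"logkow": 2.4,  "sensitiser": False},
    {"logkow": 3.1,  "sensitiser": True},
]

def incidence(subset):
    """Fraction of sensitisers in a subset of records."""
    return sum(r["sensitiser"] for r in subset) / len(subset) if subset else 0.0

below = [r for r in records if r["logkow"] <= 1]
above = [r for r in records if r["logkow"] > 1]
print(f"incidence below threshold: {incidence(below):.2f}")  # 0.50
print(f"incidence above threshold: {incidence(above):.2f}")  # 0.67
```

The study's conclusion amounts to these two incidences not differing significantly on the real dataset, which is the evidence against the threshold.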

  12. The SHOLO mill: return on investment versus mill design

    Treesearch

    Hugh W. Reynolds; Charles J. Gatchell; Charles J. Gatchell

    1971-01-01

    The newly developed SHOLO (from SHOrt Log) process can be used to convert low-grade hardwood logs into parts for standard warehouse pallets and pulp chips. Should you build a SHOLO mill? This paper has been prepared to help you decide.

  13. Designing A Robust Command, Communications and Data Acquisition System For Autonomous Sensor Platforms Using The Data Transport Network

    NASA Astrophysics Data System (ADS)

    Valentic, T. A.

    2012-12-01

    The Data Transport Network is designed for the delivery of data from scientific instruments located at remote field sites with limited or unreliable communications. Originally deployed at the Sondrestrom Research Facility in Greenland over a decade ago, the system supports the real-time collection and processing of data from large instruments such as incoherent scatter radars and lidars. In recent years, the Data Transport Network has been adapted to small, low-power embedded systems controlling remote instrumentation platforms deployed throughout the Arctic. These projects include multiple buoys from the O-Buoy, IceLander and IceGoat programs, renewable energy monitoring at the Imnavait Creek and Ivotuk field sites in Alaska and remote weather observation stations in Alaska and Greenland. This presentation will discuss the common communications controller developed for these projects. Although varied in their application, each of these systems shares a number of common features. Multiple instruments are attached, each of which must be power-controlled, sampled, and have its files transmitted offsite. In addition, the power usage of the overall system must be minimized to handle the limited energy available from sources such as solar, wind and fuel cells. The communications links are satellite based. The buoys and weather stations utilize Iridium, requiring the system to handle the frequent dropouts and the high-latency, low-bandwidth nature of the link. The communications controller is an off-the-shelf, low-power, single board computer running a customized version of the Linux operating system. The Data Transport Network provides a Python-based software framework for writing individual data collection programs and supplies a number of common services for configuration, scheduling, logging, data transmission and resource management. Adding a new instrument involves writing only the necessary code for interfacing to the hardware.
Individual programs communicate with the system services using XML-RPC. The scheduling algorithms have access to the current position and power levels, allowing instruments such as cameras to be run only during daylight hours or when sufficient power is available. The resource manager monitors the use of common devices such as the USB bus or Ethernet ports and can power them down when they are not being used. This management lets us drop the average power consumption from 1 W to 250 mW.
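
The daylight- and power-aware gating described above can be sketched as follows. The names and thresholds are assumptions for illustration, not the Data Transport Network's actual API.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    battery_volts: float        # current battery bank voltage
    sun_elevation_deg: float    # derived from current position and time

def should_run(instrument: str, platform: Platform) -> bool:
    """Gate an instrument on power level and, for daylight-only
    instruments, on sun elevation. Thresholds are illustrative."""
    if platform.battery_volts < 11.5:   # preserve the battery bank
        return False
    if instrument == "camera":          # daylight-only instrument
        return platform.sun_elevation_deg > 0
    return True

noon  = Platform(battery_volts=12.6, sun_elevation_deg=15.0)
night = Platform(battery_volts=12.6, sun_elevation_deg=-20.0)
assert should_run("camera", noon)
assert not should_run("camera", night)
assert should_run("weather", night)
```

A real scheduler would query the position and power services over XML-RPC rather than receive a struct, but the decision logic has this shape.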

  14. Development of a Mass Spectrometer-based Instrument for Volcanic Gas Monitoring

    NASA Astrophysics Data System (ADS)

    McMurtry, G. M.; Hilton, D. R.; Fischer, T.; Sutton, A. J.; Elias, T.

    2007-05-01

    We have developed and field tested an instrument that is capable of acquiring multiple-species gas chemistry data at active volcanoes and hydrothermal systems. The current prototype consists of a quadrupole mass spectrometer and a series of pumps, valves, and control/data-logging electronics housed in a corrosion-resistant container. We tested the instrument at the summit of Kilauea volcano in March 2006, collecting time-series data from a 96°C fumarole (Sulphur Banks) at 15-minute intervals for nearly 3 days. Two temperature probes were utilized: a thermocouple placed in the gas stream and a thermistor that recorded ambient air temperatures inside the instrument housing. Of these, the thermistor produced the more reliable trace, as the thermocouple pegged near 45°C shortly after the instrument began sampling fumarole gas. Arrival of fumarole gas was indicated by sharp drops in the instrument response for N2, O2, Ar, and water vapor, and increases in CO2 and SO2, at about 6.5 hours elapsed time. The two most obvious gas/temperature trends in this brief time series are: (1) sharp discontinuities caused by two of the standard "Giggenbach" bottle sampling interludes (despite some care given not to vent the gas line to atmosphere); and (2) two distinct types of thermal events. The two sampling interruptions caused decreases in temperature and caused the responses of CO2, N2, O2, Ar, and water vapor and the ratio of CO2/He to rise sharply. This appears consistent with contamination by cooler ambient air enriched in CO2 relative to normal air (solfatara air). The two types of thermal events are similar in that both generally show enrichments of SO2 and He and decreases in CO2/He, whereas the last, much hotter event displays increases in CO2, N2, O2, Ar, and water vapor, in contrast to decreases in these gases during the two former events. The last thermal event correlates with a brief dry period on 17 March, after a previous week of almost continuous rainfall.
An interesting increase in the HD/H2 ratio suggests either HD-enriched H2 gas or water vapor was introduced during the last thermal event, which is consistent with a fumarole influenced by evaporated, boiling water and atmospheric gases at depth. During our tests we have discovered several problematic issues that need to be overcome if the instrument is to be deployed for extended periods of time (months to years) in harsh and remote locations of active volcanoes. One of the main obstacles is the large amount of water vapor in fumaroles and the need for keeping that water out of the mass spectrometer. We have successfully achieved this using a series of traps and a condenser that still allow the other species to enter the instrument. Related problems are loss of some of the reactive gases within the instrument and/or traps and the precipitation of elemental sulfur in the pre-mass spec inlet system. Another issue that we are currently addressing is the relatively high power consumption of the instrument and condenser.

  15. The development of an Ada programming support environment database: SEAD (Software Engineering and Ada Database), user's manual

    NASA Technical Reports Server (NTRS)

    Liaw, Morris; Evesson, Donna

    1988-01-01

    This is a manual for users of the Software Engineering and Ada Database (SEAD). SEAD was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities that are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce the duplication of effort while improving quality in the development of future software systems. The manual describes the organization of the data in SEAD, the user interface from logging in to logging out, and concludes with a ten-chapter tutorial on how to use the information in SEAD. Two appendices provide quick reference for logging into SEAD and using the keyboard of an IBM 3270 or VT100 computer terminal.

  16. The Moulin Explorer: A Novel Instrument to Study Greenland Ice Sheet Melt-Water Flow.

    NASA Astrophysics Data System (ADS)

    Behar, A.; Wang, H.; Elliott, A.; O'Hern, S.; Martin, S.; Lutz, C.; Steffen, K.; McGrath, D.; Phillips, T.

    2008-12-01

    Recent data show that the Greenland ice sheet has been melting at an accelerated rate over the past decade. This melt water flows from the surface of the glacier to the bedrock below by draining into tubular crevasses known as moulins. Some believe these pathways eventually converge to nearby lakes and possibly the ocean. The Moulin Explorer Probe has been developed to traverse autonomously through these moulins. It uses in-situ pressure, temperature, and three-axis accelerometer sensors to log data. At the end of its journey, the probe will surface and send GPS coordinates using an Iridium satellite tracker so it may be retrieved via helicopter or boat. The information gathered when retrieved can be used to map the pathways and water flow rate through the moulins. This work was performed at the Jet Propulsion Laboratory- California Institute of Technology, under contract to NASA. Support was provided by the NASA Earth Science, Cryosphere program.

  17. Vadose Zone Transport Field Study: Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gee, Glendon W.; Ward, Anderson L.

    2001-11-30

    Studies were initiated at the Hanford Site to evaluate the process controlling the transport of fluids in the vadose zone and to develop a reliable database upon which vadose-zone transport models can be calibrated. These models are needed to evaluate contaminant migration through the vadose zone to underlying groundwaters at Hanford. A study site that had previously been extensively characterized using geophysical monitoring techniques was selected in the 200 E Area. Techniques used previously included neutron probe for water content, spectral gamma logging for radionuclide tracers, and gamma scattering for wet bulk density. Building on the characterization efforts of the past 20 years, the site was instrumented to facilitate the comparison of nine vadose-zone characterization methods: advanced tensiometers, neutron probe, electrical resistance tomography (ERT), high-resolution resistivity (HRR), electromagnetic induction imaging (EMI), cross-borehole radar (XBR), and cross-borehole seismic (XBS). Soil coring was used to obtain soil samples for analyzing ionic and isotopic tracers.

  18. Emerging Technologies for Real-Time Continuous Monitoring of Wellbore Integrity

    NASA Astrophysics Data System (ADS)

    Freifeld, B. M.

    2017-12-01

    Assessment of a well's integrity has traditionally been carried out through periodic wireline logging, often performed only when an operational problem was noted at the surface. There are several emerging technologies that can be installed permanently as part of the well completion and offer the ability to monitor operations while providing continuous indicators to evaluate the structural health of a well. Permanent behind casing instrumentation, such as pressure and temperature gauges can monitor for behind casing leakage. Similarly, fiber-optic distributed temperature and acoustic sensing provide additional information for assessing unwanted movement of fluid, which is indicative of problems either inside or outside of casing. Furthermore, these technologies offer the benefit of providing real-time continuous streams of information that serve as leading-indicators of wellbore problems to allow for early intervention. Additional research is still needed to develop best practices for the installation and operation of these technologies, as they increase cost and add additional risks that must be managed.

  19. Application of seismic interpretation in the development of Jerneh Field, Malay Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yusoff, Z.

    1994-07-01

    Development of the Jerneh gas field has been significantly aided by the use of 3-D and site survey seismic interpretations. The two aspects that have been of particular importance are identification of sea-floor and near-surface safety hazards for safe platform installation/development drilling and mapping of reservoirs/hydrocarbons within gas-productive sands of the Miocene groups B, D, and E. Choice of platform location as well as casing design require detailed analysis of sea-floor and near-surface safety hazards. At Jerneh, sea-floor pockmarks, near-surface high amplitudes, distributary channels, and minor faults were recognized as potential operational safety hazards. The integration of conventional 3-D and site survey seismic data enabled comprehensive understanding of the occurrence and distribution of potential hazards to platform installation and development well drilling. Three-dimensional seismic interpretation has been instrumental not only in the field structural definition but also in recognition of reservoir trends and hydrocarbon distribution. Additional gas reservoirs were identified by their DHI characteristics and subsequently confirmed by development wells. The innovative use of seismic attribute mapping techniques has been very important in defining both fluid and reservoir distribution in groups B and D. Integration of 3-D seismic data and well-log interpretations has helped in optimal field development, including the planning of well locations and drilling sequence.

  20. The Great Observatories All-Sky LIRG Survey: Herschel Image Atlas and Aperture Photometry

    NASA Astrophysics Data System (ADS)

    Chu, Jason K.; Sanders, D. B.; Larson, K. L.; Mazzarella, J. M.; Howell, J. H.; Díaz-Santos, T.; Xu, K. C.; Paladini, R.; Schulz, B.; Shupe, D.; Appleton, P.; Armus, L.; Billot, N.; Chan, B. H. P.; Evans, A. S.; Fadda, D.; Frayer, D. T.; Haan, S.; Ishida, C. M.; Iwasawa, K.; Kim, D.-C.; Lord, S.; Murphy, E.; Petric, A.; Privon, G. C.; Surace, J. A.; Treister, E.

    2017-04-01

    Far-infrared images and photometry are presented for 201 Luminous and Ultraluminous Infrared Galaxies [LIRGs: log(L_IR/L_⊙) = 11.00–11.99, ULIRGs: log(L_IR/L_⊙) = 12.00–12.99] in the Great Observatories All-Sky LIRG Survey (GOALS), based on observations with the Herschel Space Observatory Photodetector Array Camera and Spectrometer (PACS) and the Spectral and Photometric Imaging Receiver (SPIRE) instruments. The image atlas displays each GOALS target in the three PACS bands (70, 100, and 160 μm) and the three SPIRE bands (250, 350, and 500 μm), optimized to reveal structures at both high and low surface brightness levels, with images scaled to simplify comparison of structures in the same physical areas of ~100 × 100 kpc². Flux densities of companion galaxies in merging systems are provided where possible, depending on their angular separation and the spatial resolution in each passband, along with integrated system fluxes (sum of components). This data set constitutes the imaging and photometric component of the GOALS Herschel OT1 observing program, and is complementary to atlases presented for the Hubble Space Telescope, Spitzer Space Telescope, and Chandra X-ray Observatory. Collectively, these data will enable a wide range of detailed studies of active galactic nucleus and starburst activity within the most luminous infrared galaxies in the local universe. Based on Herschel Space Observatory observations. Herschel is an ESA space observatory with science instruments provided by the European-led Principal Investigator consortia, and important participation from NASA.

  1. Drilling and geophysical logs of the tophole at an oil-and-gas well site, Central Venango County, Pennsylvania

    USGS Publications Warehouse

    Williams, John H.; Bird, Philip H.; Conger, Randall W.; Anderson, J. Alton

    2014-01-01

    Collection and integrated analysis of drilling and geophysical logs provided an efficient and effective means for characterizing the geohydrologic framework and conditions penetrated by the tophole at the selected oil-and-gas well site. The logging methods and lessons learned at this well site could be applied at other oil-and-gas drilling sites to better characterize the shallow subsurface with the overall goal of protecting freshwater aquifers during hydrocarbon development.

  2. Quality control and assurance for validation of DOS/I measurements

    NASA Astrophysics Data System (ADS)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  3. Automated knot detection with visual post-processing of Douglas-fir veneer images

    Treesearch

    C.L. Todoroki; Eini C. Lowell; Dennis Dykstra

    2010-01-01

    Knots on digital images of 51 full veneer sheets, obtained from nine peeler blocks crosscut from two 35-foot (10.7 m) long logs and one 18-foot (5.5 m) log from a single Douglas-fir tree, were detected using a two-phase algorithm. The algorithm was developed using one image, the Development Sheet, refined on five other images, the Training Sheets, and then applied to...

  4. New Factorization Techniques and Parallel (log N) Algorithms for Forward Dynamics Solution of Single Closed-Chain Robot Manipulators

    NASA Technical Reports Server (NTRS)

    Fijany, Amir

    1993-01-01

    In this paper, parallel O(log N) algorithms are developed for the dynamic simulation of a single closed-chain rigid multibody system, specialized to the case of a robot manipulator in contact with the environment.

  5. Analyzing Decision Logs to Understand Decision Making in Serious Crime Investigations.

    PubMed

    Dando, Coral J; Ormerod, Thomas C

    2017-12-01

    Objective: To study decision making by detectives when investigating serious crime through the examination of decision logs to explore hypothesis generation and evidence selection. Background: Decision logs are used to record and justify decisions made during serious crime investigations. The complexity of investigative decision making is well documented, as are the errors associated with miscarriages of justice and inquests. The use of decision logs has not been the subject of an empirical investigation, yet they offer an important window into the nature of investigative decision making in dynamic, time-critical environments. Method: A sample of decision logs from British police forces was analyzed qualitatively and quantitatively to explore hypothesis generation and evidence selection by police detectives. Results: Analyses revealed diversity in documentation of decisions that did not correlate with case type and identified significant limitations of the decision log approach to supporting investigative decision making. Differences emerged between experienced and less experienced officers' decision log records in exploration of alternative hypotheses, generation of hypotheses, and sources of evidential inquiry opened over phase of investigation. Conclusion: The practical use of decision logs is highly constrained by their format and context of use. Despite this, decision log records suggest that experienced detectives display strategic decision making to avoid confirmation and satisficing, which affect less experienced detectives. Application: Potential applications of this research include both training in case documentation and the development of new decision log media that encourage detectives, irrespective of experience, to generate multiple hypotheses and optimize the timely selection of evidence to test them.

  6. Parental perceptions of the learner driver log book system in two Australian states.

    PubMed

    Bates, Lyndel; Watson, Barry; King, Mark Johann

    2014-01-01

    Though many jurisdictions internationally now require learner drivers to complete a specified number of hours of supervised driving practice before being able to drive unaccompanied, very few require learner drivers to complete a log book to record this practice and then present it to the licensing authority. Learner drivers in most Australian jurisdictions must complete a log book that records their practice, thereby confirming to the licensing authority that they have met the mandated hours of practice requirement. These log books facilitate the management and enforcement of minimum supervised hours of driving requirements. Parents of learner drivers in 2 Australian states, Queensland and New South Wales, completed an online survey assessing a range of factors, including their perceptions of the accuracy of their child's learner log book and the effectiveness of the log book system. The study indicates that the large majority of parents believe that their child's learner log book is accurate. However, they generally report that the log book system is only moderately effective as a system to measure the number of hours of supervised practice a learner driver has completed. The results of this study suggest the presence of a paradox, with many parents possibly believing that others are not as diligent in the use of log books as they are or that the system is too open to misuse. Given that many parents report that their child's log book is accurate, this study has important implications for the development and ongoing monitoring of hours of practice requirements in graduated driver licensing systems.

  7. The human factors of implementing shift work in logging operations.

    PubMed

    Mitchell, D L; Gallagher, T V; Thomas, R E

    2008-10-01

    A fairly recent development in the forest industry is the use of shift work in logging in the southeastern U.S. Logging company owners are implementing shift work as an opportunity to increase production and potentially reduce the cost of producing each unit of wood, without consideration of the potential impacts on the logging crew. There are many documented physiological and psychological impacts on workers from shift work in a variety of industries, although few address forestry workers in the U.S. Semi-structured interviews were performed to gather information about how logging company owners were implementing shift work in seven southeastern states. Data collected during the interviews included employee turnover, shift hours, shift scheduling, safety considerations, and production impacts. Various work schedules were employed. The majority of the schedules encompassed less than 24 hours per day. Permanent and rotating shift schedules were found. None of the logging company owners used more than two crews in a 24-hour period. Additional safety precautions were implemented as a result of working after dark. No in-woods worker accidents or injuries were reported by any of those interviewed. Results indicate that a variety of work schedules can be successfully implemented in the southeastern logging industry.

  8. Selective‐logging and oil palm: multitaxon impacts, biodiversity indicators, and trade‐offs for conservation planning.

    PubMed

    Edwards, David P; Magrach, Ainhoa; Woodcock, Paul; Ji, Yinqiu; Lim, Norman T -L; Edwards, Felicity A; Larsen, Trond H; Hsu, Wayne W; Benedick, Suzan; Khen, Chey Vun; Chung, Arthur Y C; Reynolds, Glen; Fisher, Brendan; Laurance, William F; Wilcove, David S; Hamer, Keith C; Yu, Douglas W

    Strong global demand for tropical timber and agricultural products has driven large-scale logging and subsequent conversion of tropical forests. Given that the majority of tropical landscapes have been or will likely be logged, the protection of biodiversity within tropical forests thus depends on whether species can persist in these economically exploited lands, and if species cannot persist, whether we can protect enough primary forest from logging and conversion. However, our knowledge of the impact of logging and conversion on biodiversity is limited to a few taxa, often sampled in different locations with complex land-use histories, hampering attempts to plan cost-effective conservation strategies and to draw conclusions across taxa. Spanning a land-use gradient of primary forest, once- and twice-logged forests, and oil palm plantations, we used traditional sampling and DNA metabarcoding to compile an extensive data set in Sabah, Malaysian Borneo for nine vertebrate and invertebrate taxa to quantify the biological impacts of logging and oil palm, develop cost-effective methods of protecting biodiversity, and examine whether there is congruence in response among taxa. Logged forests retained high species richness, including, on average, 70% of species found in primary forest. In contrast, conversion to oil palm dramatically reduced species richness, with significantly fewer primary-forest species than found on logged-forest transects for seven taxa. Using a systematic conservation planning analysis, we show that efficient protection of primary-forest species is achieved with land portfolios that include a large proportion of logged-forest plots. Protecting logged forests is thus a cost-effective method of protecting an ecologically and taxonomically diverse range of species, particularly when conservation budgets are limited. Six indicator groups (birds, leaf-litter ants, beetles, aerial hymenopterans, flies, and true bugs) proved to be consistently good predictors of the response of the other taxa to logging and oil palm. Our results confidently establish the high conservation value of logged forests and the low value of oil palm. Cross-taxon congruence in responses to disturbance also suggests that the practice of focusing on key indicator taxa yields important information about general biodiversity in studies of logging and oil palm.

  9. A New Approach to Logging.

    ERIC Educational Resources Information Center

    Miles, Donna

    2001-01-01

    In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…

  10. NASA Planetary Science Division's Instrument Development Programs, PICASSO and MatISSE

    NASA Technical Reports Server (NTRS)

    Gaier, James R.

    2016-01-01

    The Planetary Science Division (PSD) has combined several legacy instrument development programs into just two. The Planetary Instrument Concepts Advancing Solar System Observations (PICASSO) program funds the development of low-TRL instruments and components. The Maturation of Instruments for Solar System Observations (MatISSE) program funds the development of instruments in the mid-TRL range. PSD's instrument development strategy is to mature instruments from PICASSO through MatISSE until they are ready to be proposed for mission development.

  11. Inhibition of biofilm formation on the surface of water storage containers using biosand zeolite silver-impregnated clay granular and silver impregnated porous pot filtration systems

    PubMed Central

    Moropeng, Resoketswe Charlotte; Mpenyana-Monyatsi, Lizzy; Momba, Maggie Ndombo Benteke

    2018-01-01

    Development of biofilms occurring on the inner surface of storage vessels offers a suitable medium for the growth of microorganisms and consequently contributes to the deterioration of treated drinking water quality in homes. The aim of this study was to determine whether the two point-of-use technologies (biosand zeolite silver-impregnated clay granular (BSZ-SICG) filter and silver-impregnated porous pot (SIPP) filter) deployed in a rural community of South Africa could inhibit the formation of biofilm on the surface of plastic-based containers generally used by rural households for the storage of their drinking water. Culture-based methods and molecular techniques were used to detect the indicator bacteria (Total coliforms, faecal coliform, E. coli) and pathogenic bacteria (Salmonella spp., Shigella spp. and Vibrio cholerae) in intake water and on the surface of storage vessels containing treated water. Scanning electron microscopy was also used to visualize the development of biofilm. Results revealed that the surface water source used by the Makwane community was heavily contaminated and harboured unacceptably high counts of bacteria (heterotrophic plate count: 4.4–4.3 Log10 CFU/100mL, total coliforms: 2.2 Log10 CFU/100 mL—2.1 Log10 CFU/100 mL, faecal coliforms: 1.9 Log10 CFU/100 mL—1.8 Log10 CFU/100 mL, E. coli: 1.7 Log10 CFU/100 mL—1.6 Log10 CFU/100 mL, Salmonella spp.: 3 Log10 CFU/100 mL -8 CFU/100 mL; Shigella spp. and Vibrio cholerae had 1.0 Log10 CFU/100 mL and 0.8 Log10 CFU/100 mL respectively). Biofilm formation was apparent on the surface of the storage containers with untreated water within 24 h. The silver nanoparticles embedded in the clay of the filtration systems provided an effective barrier for the inhibition of biofilm formation on the surface of household water storage containers. 
    Biofilm formation occurred on the surface of plastic storage vessels containing drinking water treated with the SIPP filter between 14 and 21 days, and on those containing drinking water treated with the BSZ-SICG filter between 3 and 14 days. The attachment of target bacteria on the surface of the coupons inoculated in storage containers ranged from 0.07 to 227.8 CFU/cm2. To effectively prevent the development of biofilms on the surface of container-stored water, which can lead to the recontamination of treated water, plastic storage containers should be washed within 14 days for water treated with the SIPP filter and within 3 days for water treated with the BSZ-SICG filter. PMID:29621296

  12. Inhibition of biofilm formation on the surface of water storage containers using biosand zeolite silver-impregnated clay granular and silver impregnated porous pot filtration systems.

    PubMed

    Budeli, Phumudzo; Moropeng, Resoketswe Charlotte; Mpenyana-Monyatsi, Lizzy; Momba, Maggie Ndombo Benteke

    2018-01-01

    Development of biofilms occurring on the inner surface of storage vessels offers a suitable medium for the growth of microorganisms and consequently contributes to the deterioration of treated drinking water quality in homes. The aim of this study was to determine whether the two point-of-use technologies (biosand zeolite silver-impregnated clay granular (BSZ-SICG) filter and silver-impregnated porous pot (SIPP) filter) deployed in a rural community of South Africa could inhibit the formation of biofilm on the surface of plastic-based containers generally used by rural households for the storage of their drinking water. Culture-based methods and molecular techniques were used to detect the indicator bacteria (Total coliforms, faecal coliform, E. coli) and pathogenic bacteria (Salmonella spp., Shigella spp. and Vibrio cholerae) in intake water and on the surface of storage vessels containing treated water. Scanning electron microscopy was also used to visualize the development of biofilm. Results revealed that the surface water source used by the Makwane community was heavily contaminated and harboured unacceptably high counts of bacteria (heterotrophic plate count: 4.4-4.3 Log10 CFU/100mL, total coliforms: 2.2 Log10 CFU/100 mL-2.1 Log10 CFU/100 mL, faecal coliforms: 1.9 Log10 CFU/100 mL-1.8 Log10 CFU/100 mL, E. coli: 1.7 Log10 CFU/100 mL-1.6 Log10 CFU/100 mL, Salmonella spp.: 3 Log10 CFU/100 mL -8 CFU/100 mL; Shigella spp. and Vibrio cholerae had 1.0 Log10 CFU/100 mL and 0.8 Log10 CFU/100 mL respectively). Biofilm formation was apparent on the surface of the storage containers with untreated water within 24 h. The silver nanoparticles embedded in the clay of the filtration systems provided an effective barrier for the inhibition of biofilm formation on the surface of household water storage containers. 
    Biofilm formation occurred on the surface of plastic storage vessels containing drinking water treated with the SIPP filter between 14 and 21 days, and on those containing drinking water treated with the BSZ-SICG filter between 3 and 14 days. The attachment of target bacteria on the surface of the coupons inoculated in storage containers ranged from 0.07 to 227.8 CFU/cm2. To effectively prevent the development of biofilms on the surface of container-stored water, which can lead to the recontamination of treated water, plastic storage containers should be washed within 14 days for water treated with the SIPP filter and within 3 days for water treated with the BSZ-SICG filter.
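
    The "log unit" reductions quoted in this record are log10 ratios of counts before and after treatment. A minimal helper (illustrative only, not part of the study) makes that bookkeeping explicit:

```python
import math

def log_reduction(count_before, count_after):
    """Log10 reduction value (LRV) between untreated and treated counts.

    A 2-log reduction means the treated count is 1% of the untreated one.
    """
    return math.log10(count_before / count_after)

# e.g. 1e5 CFU/100 mL reduced to 1e3 CFU/100 mL:
# log_reduction(1e5, 1e3) -> 2.0
```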

  13. Contextualising impacts of logging on tropical rainforest catchment sediment dynamics using the stratigraphic record of in-channel bench deposits

    NASA Astrophysics Data System (ADS)

    Blake, Will; Walsh, Rory; Bidin, Kawi; Annammala, Kogila

    2015-04-01

    It is widely recognised that commercial logging and conversion of tropical rainforest to oil palm plantation leads to enhanced fluvial sediment flux to the coastal zone, but the dynamics of delivery and the mechanisms that act to retain sediment and nutrients within rainforest ecosystems, e.g. riparian zone and floodplain storage, are poorly understood and underexploited as a management tool. While accretion of lateral in-channel bench deposits in response to forest clearance has been demonstrated in temperate landscapes, their development and value as sedimentary archives of catchment response to human disturbance remain largely unexplored in tropical rainforest river systems. Working within the Segama River basin, Sabah, Malaysian Borneo, this study aimed to test the hypotheses that (1) lateral bench development in tropical rainforest river systems is enhanced by upstream catchment disturbance and that (2) the sedimentary record of these deposits can be used to infer changes in sediment provenance and intensification of sediment flux associated with logging activities. Sediment cores were taken from in-channel bench deposits with upstream catchment contributing areas of 721 km2 and 2800 km2 respectively. Accretion rates were determined using fallout 210Pb and 137Cs, and the timing of peak accumulation was shown to correspond exactly with the known temporal pattern of logging and associated fluvial sediment response over the period 1980 to present, following low pre-logging rates. Major and minor element geochemistry of the deposits was used to assess the degree of weathering that deposited sediment had experienced. This was linked to surface (heavily weathered) and subsurface (less weathered) sediment sources relating to initial disturbance by logging and post-logging landsliding responses respectively. A shift in the dominant source of deposited material from surface (i.e. topsoil) to subsurface (i.e. relatively unweathered subsoil close to bedrock) origin was observed to coincide with the increase in accretion rates following logging of steep headwater slopes. Coherence of sedimentary, monitoring and observational evidence demonstrates that in-channel bench deposits offer a previously unexplored sedimentary archive of catchment response to logging in tropical rainforest systems and a tool for evaluating the erosional responses of ungauged basins. In-channel bench development due to catchment disturbance may augment ecosystem services provided by the riparian corridors of larger rivers, and process knowledge gained from sedimentary archives can be used to underpin future riparian and catchment forest management strategies.

  14. Correlating Petrophysical Well Logs Using Fractal-based Analysis to Identify Changes in the Signal Complexity Across Neutron, Density, Dipole Sonic, and Gamma Ray Tool Types

    NASA Astrophysics Data System (ADS)

    Matthews, L.; Gurrola, H.

    2015-12-01

    Typical petrophysical well log correlation is accomplished by manual pattern recognition, leading to subjective correlations. The change in character in a well log is dependent upon the change in the response of the tool to lithology. The petrophysical interpreter looks for a change in one log type that would correspond to the way a different tool responds to the same lithology. To develop an objective way to pick changes in well log characteristics, we adapt a method of first-arrival picking used in seismic data to analyze changes in the character of well logs. We chose the fractal method developed by Boschetti et al. (1996) [1]. This method worked better than we expected, and we found similar changes in the fractal dimension across very different tool types (sonic vs. density vs. gamma ray). We reason that the fractal response of the log is not dependent on the physics of the tool response but rather on the change in the complexity of the log data. When a formation changes physical character in time or space, the recorded magnitude in tool data changes complexity at the same time, even if the original tool responses are very different. The relative complexity of the data, regardless of the tool used, is dependent upon the complexity of the medium relative to the tool measurement. The relative complexity of the recorded magnitude data changes as a tool transitions from one character type to another. The character we are measuring is the roughness, or complexity, of the petrophysical curve. Our method provides a way to directly compare different log types based on a quantitative change in signal complexity. For example, changes in data complexity allow us to correlate gamma ray suites with sonic logs within a well and then across to an adjacent well with similar signatures. Our method allows reliable, automatic correlations to be made in data sets beyond the reasonable cognitive limits of geoscientists, in both speed and consistent pattern recognition.
    [1] Boschetti, F., Dentith, M. D., and List, R. D. (1996). A fractal-based algorithm for detecting first arrivals on seismic traces. Geophysics, 61(4), 1095-1102.
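
    The idea of picking boundaries where a curve's complexity shifts can be sketched as below. Higuchi's fractal-dimension estimator is used here as a stand-in for the published algorithm (which differs in detail); the window length, step, and kmax values are illustrative choices:

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi's estimate of the fractal dimension of a 1-D signal."""
    n = len(x)
    log_len, log_inv_k = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # curve length of the subsampled series, normalised so that
            # all subsampling intervals k share a common scale
            dist = np.abs(np.diff(x[idx])).sum()
            lengths.append(dist * (n - 1) / ((len(idx) - 1) * k * k))
        log_len.append(np.log(np.mean(lengths)))
        log_inv_k.append(np.log(1.0 / k))
    # L(k) ~ k**(-D), so the slope of log L vs log(1/k) estimates D
    slope, _ = np.polyfit(log_inv_k, log_len, 1)
    return slope

def sliding_fd(curve, win=64, step=16, kmax=8):
    """Fractal dimension in sliding windows down a log curve; shifts in
    the returned series suggest changes in signal character."""
    centres, dims = [], []
    for start in range(0, len(curve) - win + 1, step):
        centres.append(start + win // 2)
        dims.append(higuchi_fd(curve[start:start + win], kmax))
    return np.array(centres), np.array(dims)
```

    A smooth interval yields a dimension near 1 and a noisy interval a dimension near 2, so the sliding-window series jumps where the curve's roughness changes, independently of which tool recorded it.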

  15. Headwater streams and forest management: does ecoregional context influence logging effects on benthic communities?

    USGS Publications Warehouse

    Medhurst, R. Bruce; Wipfli, Mark S.; Binckley, Chris; Polivka, Karl; Hessburg, Paul F.; Salter, R. Brion

    2010-01-01

    Effects of forest management on stream communities have been widely documented, but the role that climate plays in the disturbance outcomes is not understood. In order to determine whether the effect of disturbance from forest management on headwater stream communities varies by climate, we evaluated benthic macroinvertebrate communities in 24 headwater streams that differed in forest management (logged-roaded vs. unlogged-unroaded, hereafter logged and unlogged) within two ecological sub-regions (wet versus dry) within the eastern Cascade Range, Washington, USA. In both ecoregions, total macroinvertebrate density was highest at logged sites (P = 0.001) with gathering-collectors and shredders dominating. Total taxonomic richness and diversity did not differ between ecoregions or forest management types. Shredder densities were positively correlated with total deciduous and Sitka alder (Alnus sinuata) riparian cover. Further, differences in shredder density between logged and unlogged sites were greater in the wet ecoregion (logging × ecoregion interaction; P = 0.006) suggesting that differences in post-logging forest succession between ecoregions were responsible for differences in shredder abundance. Headwater stream benthic community structure was influenced by logging and regional differences in climate. Future development of ecoregional classification models at the subbasin scale, and use of functional metrics in addition to structural metrics, may allow for more accurate assessments of anthropogenic disturbances in mountainous regions where mosaics of localized differences in climate are common.

  16. Borehole geophysical logs at Naval Weapons Industrial Reserve Plant, Dallas, Texas

    USGS Publications Warehouse

    Braun, Christopher L.; Anaya, Roberto; Kuniansky, Eve L.

    2000-01-01

    A shallow alluvial aquifer at the Naval Weapons Industrial Reserve Plant near Dallas, Texas, has been contaminated by organic solvents used in the fabrication and assembly of aircraft and aircraft parts. Natural gamma-ray and electromagnetic-induction borehole geophysical logs were obtained from 162 polyvinyl chloride-cased wells at the plant and were integrated with existing lithologic data to improve site characterization of the subsurface alluvium. Software was developed for filtering and classifying the log data and for processing, analyzing, and creating graphical output of the digital data. The alluvium consists of mostly fine-grained low-permeability sediments; however, for this study, the alluvium was classified into low, intermediate, and high clay-content sediments on the basis of the gamma-ray logs. The low clay-content sediments were interpreted as being relatively permeable, whereas the high clay-content sediments were interpreted as being relatively impermeable. Simple statistics were used to identify zones of potentially contaminated sediments on the basis of the gamma-ray log classifications and the electromagnetic-induction log conductivity data.
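
    A classification-plus-screening workflow of this kind can be sketched with simple statistics; the quantile cut-offs and the z-score threshold below are illustrative assumptions, not the values used at the plant:

```python
import numpy as np

def classify_clay(gamma, low_q=0.33, high_q=0.66):
    """Bin gamma-ray readings into low/intermediate/high clay content
    using quantile cut-offs (hypothetical thresholds for illustration)."""
    lo, hi = np.quantile(gamma, [low_q, high_q])
    classes = np.full(gamma.shape, "intermediate", dtype=object)
    classes[gamma <= lo] = "low"
    classes[gamma >= hi] = "high"
    return classes

def flag_conductive_zones(conductivity, classes, z=2.0):
    """Flag depths whose induction-log conductivity exceeds the mean of
    their clay class by z standard deviations -- a simple screen for
    potentially contaminated zones."""
    flags = np.zeros(len(conductivity), dtype=bool)
    for c in ("low", "intermediate", "high"):
        sel = classes == c
        if sel.sum() < 2:
            continue
        mu, sd = conductivity[sel].mean(), conductivity[sel].std()
        flags |= sel & (conductivity > mu + z * sd)
    return flags
```

    Screening within each clay class, rather than against a single global baseline, keeps naturally conductive clay-rich intervals from masking anomalies in the permeable low-clay zones.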

  17. Single and multiple object tracking using log-euclidean Riemannian subspace and block-division appearance model.

    PubMed

    Hu, Weiming; Li, Xi; Luo, Wenhan; Zhang, Xiaoqin; Maybank, Stephen; Zhang, Zhongfei

    2012-12-01

    Object appearance modeling is crucial for tracking objects, especially in videos captured by nonstationary cameras and for reasoning about occlusions between multiple moving objects. Based on the log-euclidean Riemannian metric on symmetric positive definite matrices, we propose an incremental log-euclidean Riemannian subspace learning algorithm in which covariance matrices of image features are mapped into a vector space with the log-euclidean Riemannian metric. Based on the subspace learning algorithm, we develop a log-euclidean block-division appearance model which captures both the global and local spatial layout information about object appearances. Single object tracking and multi-object tracking with occlusion reasoning are then achieved by particle filtering-based Bayesian state inference. During tracking, incremental updating of the log-euclidean block-division appearance model captures changes in object appearance. For multi-object tracking, the appearance models of the objects can be updated even in the presence of occlusions. Experimental results demonstrate that the proposed tracking algorithm obtains more accurate results than six state-of-the-art tracking algorithms.
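
    At the heart of the approach is the log-Euclidean idea: the matrix logarithm maps SPD covariance matrices into a flat vector space where ordinary Euclidean operations apply. A minimal sketch of that mapping follows (the paper's incremental subspace learning and block-division appearance model are not reproduced here):

```python
import numpy as np

def spd_log(S):
    """Matrix logarithm of a symmetric positive definite matrix,
    computed via its eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def spd_exp(L):
    """Inverse map: matrix exponential of a symmetric matrix."""
    w, V = np.linalg.eigh(L)
    return (V * np.exp(w)) @ V.T

def log_euclidean_distance(S1, S2):
    """Log-Euclidean distance: the Frobenius distance between the
    matrix logarithms of two SPD matrices."""
    return np.linalg.norm(spd_log(S1) - spd_log(S2))
```

    Once covariance descriptors are mapped through spd_log, they can be vectorised and fed to standard linear subspace methods (e.g. incremental PCA), which is what makes the subspace learning in the paper tractable.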

  18. Petrophysical rock properties of the Bazhenov Formation of the South-Eastern part of Kaymysovsky Vault (Tomsk Region)

    NASA Astrophysics Data System (ADS)

    Gorshkov, A. M.; Kudryashova, L. K.; Lee-Van-Khe, O. S.

    2016-09-01

    The article presents the results of studying the petrophysical rock properties of the Bazhenov Formation of the South-Eastern part of Kaymysovsky Vault with the Gas Research Institute (GRI) method. The authors have constructed dependence charts for bulk and grain density, open porosity, and matrix permeability vs. depth. The results of studying petrophysical properties with the GRI method, together with core description, have allowed dividing the entire section into three intervals, each of which is characterized by different formation conditions of the Bazhenov Formation rocks. The authors have determined a correlation between the compensated neutron log and the rock density vs. depth chart on the basis of complex well logging and petrophysical section analysis. They have determined a promising interval for producing hydrocarbons from the Bazhenov Formation in the well under study. Besides, they have determined the typical behavior of compensated neutron logs and SP logs on well logs for this interval. These studies will allow re-interpreting available well logs in order to determine the most promising interval to be involved in Bazhenov Formation development in the Tomsk Region.

  19. Estimating the footprint of pollution on coral reefs with models of species turnover.

    PubMed

    Brown, Christopher J; Hamilton, Richard J

    2018-01-15

    Ecological communities typically change along gradients of human impact, although it is difficult to estimate the footprint of impacts for diffuse threats such as pollution. We developed a joint model (i.e., one that includes multiple species and their interactions with each other and environmental covariates) of benthic habitats on lagoonal coral reefs and used it to infer change in benthic composition along a gradient of distance from logging operations. The model estimated both changes in abundances of benthic groups and their compositional turnover, a type of beta diversity. We used the model to predict the footprint of turbidity impacts from past and recent logging. Benthic communities far from logging were dominated by branching corals, whereas communities close to logging had higher cover of dead coral, massive corals, and soft sediment. Recent impacts were predicted to be small relative to the extensive impacts of past logging because recent logging has occurred far from lagoonal reefs. Our model can be used more generally to estimate the footprint of human impacts on ecosystems and evaluate the benefits of conservation actions for ecosystems. © 2018 Society for Conservation Biology.

  20. Contributed Review: Nuclear magnetic resonance core analysis at 0.3 T

    NASA Astrophysics Data System (ADS)

    Mitchell, Jonathan; Fordham, Edmund J.

    2014-11-01

    Nuclear magnetic resonance (NMR) provides a powerful toolbox for petrophysical characterization of reservoir core plugs and fluids in the laboratory. Previously, there has been considerable focus on low field magnet technology for well log calibration. Now there is renewed interest in the study of reservoir samples using stronger magnets to complement these standard NMR measurements. Here, the capabilities of an imaging magnet with a field strength of 0.3 T (corresponding to 12.9 MHz for proton) are reviewed in the context of reservoir core analysis. Quantitative estimates of porosity (saturation) and pore size distributions are obtained under favorable conditions (e.g., in carbonates), with the added advantage of multidimensional imaging, detection of lower gyromagnetic ratio nuclei, and short probe recovery times that make the system suitable for shale studies. Intermediate field instruments provide quantitative porosity maps of rock plugs that cannot be obtained using high field medical scanners due to the field-dependent susceptibility contrast in the porous medium. Example data are presented that highlight the potential applications of an intermediate field imaging instrument as a complement to low field instruments in core analysis and for materials science studies in general.
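
    For context, the link from a measured T2 relaxation time to pore size rests on the fast-diffusion surface-relaxation relation 1/T2 ≈ ρ2(S/V); for a spherical pore S/V = 3/r, so r ≈ 3ρ2T2. A one-line helper makes the relation concrete (the relaxivity value in the example is purely illustrative):

```python
def pore_radius_from_t2(t2_seconds, rho2_m_per_s, shape_factor=3.0):
    """Estimate a pore radius from a T2 relaxation time using the
    fast-diffusion relation 1/T2 = rho2 * (S/V).

    shape_factor = 3 corresponds to a spherical pore (S/V = 3/r).
    """
    return shape_factor * rho2_m_per_s * t2_seconds

# Illustrative: T2 = 100 ms and rho2 = 10 um/s give a radius of ~3 um.
radius = pore_radius_from_t2(0.1, 10e-6)
```

    In practice ρ2 must be calibrated per lithology (e.g. against mercury injection), which is one reason the abstract notes that quantitative pore-size estimates hold only "under favorable conditions".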

  1. Forest structure following tornado damage and salvage logging in northern Maine, USA

    Treesearch

    Shawn Fraver; Kevin J. Dodds; Laura S. Kenefic; Rick Morrill; Robert S. Seymour; Eben Sypitkowski

    2017-01-01

    Understanding forest structural changes resulting from postdisturbance management practices such as salvage logging is critical for predicting forest recovery and developing appropriate management strategies. In 2013, a tornado and subsequent salvage operations in northern Maine, USA, created three conditions (i.e., treatments) with contrasting forest structure:...

  2. Including operational data in QMRA model: development and impact of model inputs.

    PubMed

    Jaidi, Kenza; Barbeau, Benoit; Carrière, Annie; Desjardins, Raymond; Prévost, Michèle

    2009-03-01

    A Monte Carlo model, based on the Quantitative Microbial Risk Analysis approach (QMRA), has been developed to assess the relative risks of infection associated with the presence of Cryptosporidium and Giardia in drinking water. The impact of various approaches for modelling the initial parameters of the model on the final risk assessments is evaluated. The Monte Carlo simulations that we performed showed that the occurrence of parasites in raw water was best described by a mixed distribution: log-Normal for concentrations > detection limit (DL), and a uniform distribution for concentrations < DL. The selection of process performance distributions for modelling the performance of treatment (filtration and ozonation) influences the estimated risks significantly. The mean annual risks for conventional treatment are: 1.97E-03 (removal credit adjusted by log parasite = log spores), 1.58E-05 (log parasite = 1.7 x log spores) or 9.33E-03 (regulatory credits based on the turbidity measurement in filtered water). Using full scale validated SCADA data, the simplified calculation of CT performed at the plant was shown to largely underestimate the risk relative to a more detailed CT calculation, which takes into consideration the downtime and system failure events identified at the plant (1.46E-03 vs. 3.93E-02 for the mean risk).
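
    The mixed source-water distribution and the compounding of daily risks can be sketched as follows; every parameter value (detection limit, lognormal parameters, fraction below the DL, dose-response coefficient r) is an illustrative placeholder, not a fitted value from the study:

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_source_concentration(n, dl=0.1, mu=0.0, sigma=1.0, p_below_dl=0.3):
    """Mixed distribution for raw-water parasite concentrations:
    uniform on (0, DL) below the detection limit, lognormal (shifted
    above the DL) otherwise."""
    below = rng.random(n) < p_below_dl
    conc = np.empty(n)
    conc[below] = rng.uniform(0.0, dl, below.sum())
    conc[~below] = dl + rng.lognormal(mu, sigma, (~below).sum())
    return conc

def annual_infection_risk(conc, log_removal, r=0.02, litres_per_day=1.0):
    """Exponential dose-response on the treated-water dose, with the
    daily risk compounded over 365 days of consumption."""
    dose = conc * 10.0 ** (-log_removal) * litres_per_day
    daily = 1.0 - np.exp(-r * dose)
    return 1.0 - (1.0 - daily) ** 365  # per-sample annual risk
```

    Averaging the per-sample annual risks over many Monte Carlo draws gives the mean annual risk; rerunning with different treatment-credit assumptions reproduces the kind of sensitivity comparison reported in the abstract.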

  3. New Era of Scientific Ocean Drilling

    NASA Astrophysics Data System (ADS)

    Eguchi, N.; Toczko, S.; Sanada, Y.; Igarashi, C.; Kubo, Y.; Maeda, L.; Sawada, I.; Takase, K.; Kyo, N.

    2014-12-01

    The D/V Chikyu, committed to scientific ocean drilling since 2007, has completed thirteen IODP expeditions; Chikyu's enhanced drilling technology gives us the means to reach deep targets, along with enhanced well logging, deep-water riserless drilling, and a state-of-the-art laboratory. Chikyu recovered core samples from 2466 meters below sea floor (mbsf) in IODP Exp. 337 and drilled to 3058.5 mbsf in IODP Exp. 348, but these are still not the limit of Chikyu's capability. As deep as these depths are, they are just halfway to the 5200 mbsf plate boundary target for the NanTroSEIZE deep riser borehole. There are several active IODP proposals in the pipeline. Each has scientific targets requiring several thousand meters of penetration below the sea floor. Riser technology is the only way to collect samples and data from that depth. Well logging has been enhanced with the adoption of riser drilling, especially for logging-while-drilling (LWD). LWD has several advantages over wireline logging and provides more opportunities for continuous measurements, even in unstable boreholes. Because of the larger diameter of riser pipes and enhanced borehole stability, Chikyu can use several state-of-the-art downhole tools, e.g. a fracture tester, a fluid sampling tool, wider borehole imaging, and the latest sonic tools. These new technologies and tools can potentially expand the envelope of scientific ocean drilling. Chikyu gives us access to ultra-deep-water riserless drilling. IODP Exp. 343/343T, investigating the March 2011 Tohoku-Oki Earthquake, explored the toe of the landward slope of the Japan Trench. This expedition reached the plate boundary fault target at more than 800 mbsf in water depths over 6900 m for logging-while-drilling, coring, and observatory installation. This deep-water drilling capability also expands the scientific ocean drilling envelope and provides access to previously unreachable targets.
On top of these operational capabilities, Chikyu's onboard laboratory is equipped with state-of-the-art instruments to analyze all science samples. X-ray CT creates non-destructive 3D images of core samples providing high resolution structural detail. The microbiology laboratory offers clean and contamination-free work environments required for microbiological samples.

  4. Spreadsheet log analysis in subsurface geology

    USGS Publications Warehouse

    Doveton, J.H.

    2000-01-01

    Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.
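
    The simplest spreadsheet-style transform from a log trace to a compositional estimate is the linear gamma-ray index, commonly used as a first-pass shale-volume indicator. The default endpoint values below are illustrative; in practice they are read off the clean-sand and shale baselines of the log in hand:

```python
def gamma_ray_index(gr, gr_clean=20.0, gr_shale=130.0):
    """Linear gamma-ray index IGR = (GR - GRclean) / (GRshale - GRclean),
    clamped to [0, 1]; often taken directly as a first-pass shale volume."""
    igr = (gr - gr_clean) / (gr_shale - gr_clean)
    return min(max(igr, 0.0), 1.0)

# A reading halfway between the baselines gives an index of 0.5:
# gamma_ray_index(75.0) -> 0.5
```

    Being a single arithmetic expression per depth sample, the transform maps directly onto one spreadsheet column, which is exactly the workflow the abstract describes.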

  5. Lactic acid concentrations that reduce microbial load yet minimally impact colour and sensory characteristics of beef.

    PubMed

    Rodríguez-Melcón, Cristina; Alonso-Calleja, Carlos; Capita, Rosa

    2017-07-01

    Lactic acid (LA) has recently been approved in the EU as beef decontaminant. In order to identify the most appropriate concentration, beef samples were spray-treated with LA (2%, 3%, 4% or 5%) or left untreated (control). Microbial load (aerobic plate counts, psychrotrophs and Enterobacteriaceae), pH, instrumental colour and sensory properties were investigated at 0, 24, 72 and 120h of refrigerated storage. The reductions in bacteria after spraying ranged from 0.57 to 0.95 log units. A residual antimicrobial effect was observed so that at 120h LA reduced microbial load by up to 2 log units compared with the control samples. Samples treated with 5% LA showed the lowest redness value (a*) and hedonic scores at all sampling times. Only for samples treated with 4% LA did the sensorial shelf-life limit extend beyond 120h. It is suggested that treatment of beef with 4% LA not only may improve microbiological quality, but also may enhance sensory properties and shelf-life. Copyright © 2017. Published by Elsevier Ltd.

  6. A Nanoflare-Based Cellular Automaton Model and the Observed Properties of the Coronal Plasma

    NASA Technical Reports Server (NTRS)

    Lopez-Fuentes, Marcelo; Klimchuk, James Andrew

    2016-01-01

    We use the cellular automaton model described in Lopez Fuentes and Klimchuk to study the evolution of coronal loop plasmas. The model, based on the idea of a critical misalignment angle in tangled magnetic fields, produces nanoflares of varying frequency with respect to the plasma cooling time. We compare the results of the model with active region (AR) observations obtained with the Hinode/XRT and SDO/AIA instruments. The comparison is based on the statistical properties of synthetic and observed loop light curves. Our results show that the model reproduces the main observational characteristics of the evolution of the plasma in AR coronal loops. The typical intensity fluctuations have amplitudes of 10-15 percent both for the model and the observations. The sign of the skewness of the intensity distributions indicates the presence of cooling plasma in the loops. We also study the emission measure (EM) distribution predicted by the model and obtain slopes in log(EM) versus log(T) between 2.7 and 4.3, in agreement with published observational values.
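
    Slopes of this kind come from fitting log(EM) against log(T) over the cool side of the emission-measure distribution; the fit itself is a short regression. The sketch below uses synthetic data, not the paper's:

```python
import numpy as np

def em_coolside_slope(temps, em, t_peak):
    """Slope a of EM ~ T**a, fitted by linear regression in log-log
    space over temperatures at or below the EM peak."""
    sel = (temps > 0) & (em > 0) & (temps <= t_peak)
    slope, _intercept = np.polyfit(np.log10(temps[sel]), np.log10(em[sel]), 1)
    return slope
```

    Applied to a modeled or observed EM(T) distribution, the fitted slope is directly comparable to the 2.7-4.3 range quoted in the abstract, and is a standard diagnostic of nanoflare heating frequency.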

  7. Visualization of nuclear particle trajectories in nuclear oil-well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Case, C.R.; Chiaramonte, J.M.

    Nuclear oil-well logging measures specific properties of subsurface geological formations as a function of depth in the well. The knowledge gained is used to evaluate the hydrocarbon potential of the surrounding oil field. The measurements are made by lowering an instrument package into an oil well and slowly extracting it at a constant speed. During the extraction phase, neutrons or gamma rays are emitted from the tool, interact with the formation, and scatter back to the detectors located within the tool. Even though only a small percentage of the emitted particles ever reach the detectors, mathematical modeling has been very successful in the accurate prediction of these detector responses. The two dominant methods used to model these devices have been the two-dimensional discrete ordinates method and the three-dimensional Monte Carlo method; the latter has routinely been used to investigate the response characteristics of nuclear tools. A special Los Alamos National Laboratory version of their standard MCNP Monte Carlo code retains the details of each particle history for later viewing within SABRINA, a companion three-dimensional geometry modeling and debugging code.

  8. A NANOFLARE-BASED CELLULAR AUTOMATON MODEL AND THE OBSERVED PROPERTIES OF THE CORONAL PLASMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuentes, Marcelo López; Klimchuk, James A., E-mail: lopezf@iafe.uba.ar

    2016-09-10

    We use the cellular automaton model described in López Fuentes and Klimchuk to study the evolution of coronal loop plasmas. The model, based on the idea of a critical misalignment angle in tangled magnetic fields, produces nanoflares of varying frequency with respect to the plasma cooling time. We compare the results of the model with active region (AR) observations obtained with the Hinode/XRT and SDO/AIA instruments. The comparison is based on the statistical properties of synthetic and observed loop light curves. Our results show that the model reproduces the main observational characteristics of the evolution of the plasma in AR coronal loops. The typical intensity fluctuations have amplitudes of 10%–15% both for the model and the observations. The sign of the skewness of the intensity distributions indicates the presence of cooling plasma in the loops. We also study the emission measure (EM) distribution predicted by the model and obtain slopes in log(EM) versus log(T) between 2.7 and 4.3, in agreement with published observational values.

  9. Construction, completion, and testing of replacement monitoring wells MW 3-2, MW 6-2, MW 7-2, and MW 11-2, Mountain Home Air Force Base, Idaho, February through April 2000

    USGS Publications Warehouse

    Parliman, D.J.

    2000-01-01

    In February and March 2000, the U.S. Geological Survey Western Regional Research Drilling Operation constructed replacement monitoring wells MW 3–2, MW 6–2, MW 7–2, and MW 11–2 as part of a regional ground-water monitoring network for the Mountain Home Air Force Base, Elmore County, Idaho. Total well depths ranged from 435.5 to 456.5 feet, and initial depth-to-water measurements ranged from about 350 to 375 feet below land surface. After completion, wells were pumped and onsite measurements were made of water temperature, specific conductance, pH, and dissolved oxygen. At each well, natural gamma, spontaneous potential, resistivity, caliper, and temperature logs were obtained from instruments placed in open boreholes. A three-dimensional borehole flow analysis was completed for MW 3–2 and MW 11–2, and a video log was obtained for MW 11–2 to annotate lithology and note wet zones in the borehole above saturated rock.

  10. Serum Spot 14 concentration is negatively associated with thyroid-stimulating hormone level

    PubMed Central

    Chen, Yen-Ting; Tseng, Fen-Yu; Chen, Pei-Lung; Chi, Yu-Chao; Han, Der-Sheng; Yang, Wei-Shiung

    2016-01-01

    Abstract Spot 14 (S14) is a protein involved in fatty acid synthesis and was shown to be induced by thyroid hormone in rat liver. However, the presence of S14 in human serum and its relations with thyroid function status have not been investigated. The objectives of this study were to compare serum S14 concentrations in patients with hyperthyroidism or euthyroidism and to evaluate the associations between serum S14 and free thyroxine (fT4) or thyroid-stimulating hormone (TSH) levels. We set up an immunoassay for human serum S14 concentrations and compared its levels between hyperthyroid and euthyroid subjects. Twenty-six hyperthyroid patients and 29 euthyroid individuals were recruited. Data of all patients were pooled for the analysis of the associations between the levels of S14 and fT4, TSH, or quartile of TSH. The hyperthyroid patients had significantly higher serum S14 levels than the euthyroid subjects (median [Q1, Q3]: 975 [669, 1612] ng/mL vs 436 [347, 638] ng/mL, P < 0.001). In univariate linear regression, the log-transformed S14 level (logS14) was positively associated with fT4 but negatively associated with creatinine (Cre), total cholesterol (T-C), triglyceride (TG), low-density lipoprotein cholesterol (LDL-C), and TSH. The positive association between logS14 and fT4 and the negative associations between logS14 and Cre, TG, T-C, or TSH remained significant after adjustment for sex and age. These associations were prominent in females but not in males. The logS14 levels were negatively associated with the TSH levels grouped by quartile (β = −0.3020, P < 0.001). The association between logS14 and TSH quartile persisted after adjustment for sex and age (β = −0.2828, P = 0.001). In stepwise multivariate regression analysis, only TSH grouped by quartile remained significantly associated with logS14 level. We developed an ELISA to measure serum S14 levels in humans. 
Female patients with hyperthyroidism had higher serum S14 levels than the female subjects with euthyroidism. The serum logS14 concentrations were negatively associated with TSH levels. Changes of serum S14 level in the whole thyroid function spectrum deserve further investigation. PMID:27749565

  11. Evaluation and Comparison of Methods for Measuring Ozone ...

    EPA Pesticide Factsheets

    Ambient evaluations of the various ozone and NO2 methods were conducted during field intensive studies as part of the NASA DISCOVER-AQ project conducted during July 2011 near Baltimore, MD; January – February 2013 in the San Joaquin Valley, CA; September 2013 in Houston, TX; and July – August 2014 near Denver, CO. During field intensive studies, instruments were calibrated according to manufacturers’ operation manuals and in accordance with FRM requirements listed in 40 CFR 50. During the ambient evaluation campaigns, nightly automated zero and span checks were performed to monitor the validity of the calibration and control for drifts or variations in the span and/or zero response. Both the calibration gas concentrations and the nightly zero and span gas concentrations were delivered using a dynamic dilution calibration system (T700U/T701H, Teledyne API). The analyzers were housed within a temperature-controlled shelter during the sampling campaigns. A glass inlet with sampling height located approximately 5 m above ground level and a subsequent sampling manifold were shared by all instruments. Data generated by all analyzers were collected and logged using a field deployable data acquisition system (Envidas Ultimate). A summary of the instruments used during the DISCOVER-AQ deployments is listed in Table 1. Figure 1 shows a typical DISCOVER-AQ site (Houston 2013) where EPA (and others) instrumentation was deployed. Under the Clean Air Act, the U.S. EPA has estab

  12. Characterization and analysis of diesel exhaust odor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shala, F.J.

    1983-01-01

    An instrumental method known as the Diesel Odor Analysis System (DOAS) has been developed at A.D. Little, Inc. for measuring diesel exhaust odor. It was of interest to determine which compound or compounds in the oxygenated fraction of the exhaust were primarily responsible for the odor correlation developed at A.D. Little, Inc. This was accomplished by observing how the measured exhaust odor intensity and the chemical constituents of the oxygenated fraction changed with respect to the odor values as measured by the DOAS. Benzaldehyde was found to give the best correlation (R = 0.98) with odor. A quantitative relationship between exhaust odor as measured by the total intensity of aroma (TIA) and the benzaldehyde concentration (B) in ppm in the exhaust is given by: TIA = 1.11 log10(B) + 4.10. This correlation was supported by results obtained from two other diesel engine exhaust sources. A methyl benzaldehyde isomer also yielded a good correlation (R = 0.90) with odor. Air-to-fuel ratio correlations were determined for the tentatively identified compounds cinnamaldehyde (R = 0.94) and a C2-benzaldehyde isomer (R = 0.94).
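The reported DOAS correlation is directly computable. The sketch below implements the abstract's equation TIA = 1.11 log10(B) + 4.10; the input concentration is a hypothetical value chosen for illustration:

```python
import math

def tia_from_benzaldehyde(b_ppm):
    """Total intensity of aroma (TIA) from benzaldehyde concentration
    B in ppm, per the DOAS correlation TIA = 1.11*log10(B) + 4.10."""
    return 1.11 * math.log10(b_ppm) + 4.10

# Illustrative input: at B = 1 ppm the log term vanishes, so TIA = 4.10.
print(round(tia_from_benzaldehyde(1.0), 2))  # 4.1
```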

  13. Diverse Applications of Electronic-Nose Technologies in Agriculture and Forestry

    PubMed Central

    Wilson, Alphus D.

    2013-01-01

    Electronic-nose (e-nose) instruments, derived from numerous types of aroma-sensor technologies, have been developed for a diversity of applications in the broad fields of agriculture and forestry. Recent advances in e-nose technologies within the plant sciences, including improvements in gas-sensor designs, innovations in data analysis and pattern-recognition algorithms, and progress in material science and systems integration methods, have led to significant benefits to both industries. Electronic noses have been used in a variety of commercial agricultural-related industries, including the agricultural sectors of agronomy, biochemical processing, botany, cell culture, plant cultivar selections, environmental monitoring, horticulture, pesticide detection, plant physiology and pathology. Applications in forestry include uses in chemotaxonomy, log tracking, wood and paper processing, forest management, forest health protection, and waste management. These aroma-detection applications have improved plant-based product attributes, quality, uniformity, and consistency in ways that have increased the efficiency and effectiveness of production and manufacturing processes. This paper provides a comprehensive review and summary of a broad range of electronic-nose technologies and applications, developed specifically for the agriculture and forestry industries over the past thirty years, which have offered solutions that have greatly improved worldwide agricultural and agroforestry production systems. PMID:23396191

  14. Diverse applications of electronic-nose technologies in agriculture and forestry.

    PubMed

    Wilson, Alphus D

    2013-02-08

    Electronic-nose (e-nose) instruments, derived from numerous types of aroma-sensor technologies, have been developed for a diversity of applications in the broad fields of agriculture and forestry. Recent advances in e-nose technologies within the plant sciences, including improvements in gas-sensor designs, innovations in data analysis and pattern-recognition algorithms, and progress in material science and systems integration methods, have led to significant benefits to both industries. Electronic noses have been used in a variety of commercial agricultural-related industries, including the agricultural sectors of agronomy, biochemical processing, botany, cell culture, plant cultivar selections, environmental monitoring, horticulture, pesticide detection, plant physiology and pathology. Applications in forestry include uses in chemotaxonomy, log tracking, wood and paper processing, forest management, forest health protection, and waste management. These aroma-detection applications have improved plant-based product attributes, quality, uniformity, and consistency in ways that have increased the efficiency and effectiveness of production and manufacturing processes. This paper provides a comprehensive review and summary of a broad range of electronic-nose technologies and applications, developed specifically for the agriculture and forestry industries over the past thirty years, which have offered solutions that have greatly improved worldwide agricultural and agroforestry production systems.

  15. MAIL LOG, program summary and specifications

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The summary and specifications needed to obtain the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records in the data base; and (4) archive - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.
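The four MAIL LOG modes correspond to basic operations on a record store. A toy sketch of that interface (the class name and record fields are hypothetical, not the SPADS implementation):

```python
class MailLog:
    """Toy record store mirroring the four MAIL LOG modes:
    input, revise, search, archive."""

    def __init__(self):
        self.records = {}   # active records keyed by id
        self.archive = {}   # archived ("put away") records

    def input(self, rec_id, data):          # mode 1: add a new record
        self.records[rec_id] = dict(data)

    def revise(self, rec_id, **changes):    # mode 2: modify an existing record
        self.records[rec_id].update(changes)

    def search(self, **criteria):           # mode 3: find matching records
        return [r for r in self.records.values()
                if all(r.get(k) == v for k, v in criteria.items())]

    def archive_record(self, rec_id):       # mode 4: put a record away
        self.archive[rec_id] = self.records.pop(rec_id)

log = MailLog()
log.input(1, {"sender": "LaRC", "subject": "Scout telemetry"})
log.revise(1, subject="Scout telemetry tapes")
print(len(log.search(sender="LaRC")))  # 1
log.archive_record(1)
```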

  16. Cosmic dance at z 3: Detecting the host galaxies of the dual AGN system LBQS 0302-0019 and Jil with HAWK-I+GRAAL

    NASA Astrophysics Data System (ADS)

    Husemann, B.; Bielby, R.; Jahnke, K.; Arrigoni-Battaia, F.; Worseck, G.; Shanks, T.; Wardlow, J.; Scholtz, J.

    2018-06-01

    We recently discovered that the luminous radio-quiet quasi-stellar object (QSO) LBQS 0302-0019 at z = 3.286 is likely accompanied by an obscured AGN at 20 kpc projected distance, which we dubbed Jil. It represents the tightest candidate system of an obscured and unobscured dual AGN at z > 3. To verify the dual AGN scenario, we obtained deep Ks band (rest-frame V band) imaging with the VLT/HAWK-I+GRAAL instrument at 0.″4 resolution during science verification in January 2018. We detect the individual host galaxies of the QSO and Jil with estimated stellar masses of log(M⋆/M⊙) = 11.4 ± 0.5 and log(M⋆/M⊙) = 10.9 ± 0.5, respectively. Near-IR spectra obtained with the Very Large Telescope K-band Multi Object Spectrograph (VLT-KMOS) reveal a clear [O III] λ5007 line detection at the location of Jil that does not contribute significantly to the Ks band flux. Both observations therefore corroborate the dual AGN scenario. A comparison to Illustris simulations suggests a parent halo mass of log(Mhalo/M⊙) = 13.2 ± 0.5 for this interacting galaxy system, corresponding to a massive dark matter halo at that epoch. Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere under ESO programme(s) 60.A-9471(A) and 100.A-0134(B).

  17. The use of optical fiber in endodontic photodynamic therapy. Is it really relevant?

    PubMed

    Garcez, Aguinaldo S; Fregnani, Eduardo R; Rodriguez, Helena M; Nunez, Silvia C; Sabino, Caetano P; Suzuki, Hideo; Ribeiro, Martha S

    2013-01-01

    This study analyzed the necessity of using an optical fiber/diffusor when performing antimicrobial photodynamic therapy (PDT) associated with endodontic therapy. Fifty freshly extracted human single-rooted teeth were used. Conventional endodontic treatment was performed using a sequence of ProTaper instruments (Dentsply Maillefer Instruments), the teeth were sterilized, and the canals were contaminated with a 3-day Enterococcus faecalis biofilm. The samples were divided into five groups: in group 1, ten roots were irradiated with a laser tip (area of 0.04 cm(2)); in group 2, ten roots were irradiated with a smaller laser tip (area of 0.028 cm(2)); and in group 3, ten teeth with the crown were irradiated with the laser tip with 0.04 cm(2) of area. The fourth group (G4) followed the same methodology as group 3, but the irradiation was performed with the smaller tip (area of 0.028 cm(2)); in group 5 (G5), ten teeth with crowns were irradiated using a 200-μm-diameter fiber/diffusor coupled to a diode laser. Microbiological samples were taken after accessing the canal, after endodontic therapy, and after PDT. Groups 1 and 2 showed a reduction of two logs (99%), groups 3 and 4 of one log (85% and 97%, respectively), and group 5 of four logs (99.99%). The results suggest that PDT added to endodontic treatment of root canals infected with E. faecalis is more effective with the optical fiber/diffusor than when the laser light is directed at the access cavity.

  18. Environmental effects and characterization of the Egyptian radioactive well logging calibration pad facility.

    PubMed

    Al Alfy, Ibrahim Mohammad

    2013-12-01

    A set of ten radioactive well-logging calibration pads was constructed on one of the premises of the Nuclear Materials Authority (NMA), Egypt, at 6th October city. These pads were built for calibrating geophysical well-logging instruments. The calibration facility was constructed with the technical assistance and practical support of the International Atomic Energy Agency (IAEA) and the ARCN. There are five uranium pads with three different uranium concentrations and borehole diameters. The other five calibration pads include one each of the following: blank, potassium, thorium, multi-layer and mixed. More than 22 t of various selected Egyptian raw materials were gathered for pad construction from different locations in Egypt. The pad site and the surrounding area were spectrometrically surveyed before excavation for the construction of the pad-basin floor. They yielded negligible radiation values, very near the detected general background. After pad construction, spectrometric measurements were carried out again at the same locations with the exposed boreholes of the pads closed. No radioactivity leakage was noticed from the pads. Meanwhile, dose rate values, measured while the boreholes of the pads were open, were found to range from 0.12 to 1.26 mSv/y. These values depend mainly upon the type and concentration of the pads as well as their borehole diameters. The results of the radiospectrometric survey show that the top layers of the pads were constructed according to international standards. © 2013 Elsevier Ltd. All rights reserved.

  19. Neural Network Model for Survival and Growth of Salmonella enterica Serotype 8,20:-:z6 in Ground Chicken Thigh Meat during Cold Storage: Extrapolation to Other Serotypes.

    PubMed

    Oscar, T P

    2015-10-01

    Mathematical models that predict the behavior of human bacterial pathogens in food are valuable tools for assessing and managing this risk to public health. A study was undertaken to develop a model for predicting the behavior of Salmonella enterica serotype 8,20:-:z6 in chicken meat during cold storage and to determine how well the model would predict the behavior of other serotypes of Salmonella stored under the same conditions. To develop the model, ground chicken thigh meat (0.75 cm(3)) was inoculated with 1.7 log Salmonella 8,20:-:z6 and then stored for 0 to 8 days at -8 to 16°C. An automated miniaturized most-probable-number (MPN) method was developed and used for the enumeration of Salmonella. Commercial software (Excel and the add-in program NeuralTools) was used to develop a multilayer feedforward neural network model with one hidden layer of two nodes. The performance of the model was evaluated using the acceptable prediction zone (APZ) method. The number of Salmonella in ground chicken thigh meat stayed the same (P > 0.05) during 8 days of storage at -8 to 8°C but increased (P < 0.05) during storage at 9°C (+0.6 log) to 16°C (+5.1 log). The proportion of residual values (observed minus predicted values) in an APZ (pAPZ) from -1 log (fail-safe) to 0.5 log (fail-dangerous) was 0.939 for the data (n = 426 log MPN values) used in the development of the model. The model had a pAPZ of 0.944 or 0.954 when it was extrapolated to test data (n = 108 log MPN per serotype) for other serotypes (S. enterica serotype Typhimurium var 5-, Kentucky, Typhimurium, and Thompson) of Salmonella in ground chicken thigh meat stored for 0 to 8 days at -4, 4, 12, or 16°C under the same experimental conditions. A pAPZ of ≥0.7 indicates that a model provides predictions with acceptable bias and accuracy. 
Thus, the results indicated that the model provided valid predictions of the survival and growth of Salmonella 8,20:-:z6 in ground chicken thigh meat stored for 0 to 8 days at -8 to 16°C and that the model was validated for extrapolation to four other serotypes of Salmonella.
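The pAPZ statistic described in this abstract is simply the fraction of residuals (observed minus predicted log counts) falling inside the zone from -1 log (fail-safe) to 0.5 log (fail-dangerous). A minimal sketch with made-up residual data:

```python
def p_apz(observed, predicted, lower=-1.0, upper=0.5):
    """Proportion of residuals (observed - predicted, in log units)
    that fall inside the acceptable prediction zone [lower, upper]."""
    residuals = [o - p for o, p in zip(observed, predicted)]
    inside = sum(1 for r in residuals if lower <= r <= upper)
    return inside / len(residuals)

# Illustrative data: residuals are -0.2, 0.1, -1.2, 0.2, so
# 3 of 4 fall inside the zone.
obs = [2.0, 3.1, 4.0, 5.5]
pred = [2.2, 3.0, 5.2, 5.3]
print(p_apz(obs, pred))  # 0.75
```

A model is considered validated when pAPZ is at least 0.7, as the abstract notes.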

  20. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.
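One of the simpler unit transformations mentioned above, converting sonic delta-t to velocity in kilometers per second, can be sketched as follows (the function name is illustrative; the conversion itself is the standard foot-to-meter relation):

```python
def sonic_velocity_km_s(delta_t_us_per_ft):
    """Convert a sonic log interval transit time (microseconds per foot)
    to velocity in kilometers per second. One foot is 0.3048 m, so
    v = 0.3048 m / (delta_t * 1e-6 s) = 304.8 / delta_t km/s."""
    return 304.8 / delta_t_us_per_ft

# Illustrative: a transit time of 100 us/ft corresponds to 3.048 km/s.
print(round(sonic_velocity_km_s(100.0), 3))  # 3.048
```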

  1. Effects of Thalassinoides ichnofabrics on the petrophysical properties of the Lower Cretaceous Lower Glen Rose Limestone, Middle Trinity Aquifer, Northern Bexar County, Texas

    NASA Astrophysics Data System (ADS)

    Golab, James A.; Smith, Jon J.; Clark, Allan K.; Blome, Charles D.

    2017-04-01

    The combined Edwards and Trinity aquifer system is the primary source of freshwater for the rapidly growing San Antonio and Austin metropolitan areas. The karstic Lower Cretaceous (Aptian-Albian) Lower Glen Rose Limestone (GRL) contains the middle Trinity aquifer and has been subdivided into six hydrostratigraphic units (HSUs) with distinct hydrologic characteristics. These HSUs were first identified in the subsurface via core examination at the Camp Stanley Storage Activity (CSSA) in northern Bexar County, Texas, and were then correlated to associated gamma-ray and resistivity logs. The Trinity aquifer system is a telogenetic karst, and fluid flow is directed primarily through solution-enhanced faults, fractures, and pervasive Thalassinoides networks because the matrix porosity of both transmissive and confining HSUs is very low. Meteoric water infiltrates the Trinity aquifer through vertically oriented faults and likely moves laterally through biogenic pores. Two 7.62 cm diameter GRL cores and well logs from monitoring wells CS-MW9-CC and CS-MW5-LGR recovered from the CSSA were used to characterize the effect such large-scale Thalassinoides networks have on the petrophysical properties (resistivity and natural gamma-ray) of four HSUs (Honey Creek, Rust, Doeppenschmidt, and Twin Sisters HSUs). Resistivity logs show that resistance values > 300 Ω-m correlate with well-developed biogenic porosity and values of 650 Ω-m are associated with solution enhancement of the Thalassinoides networks. These high-resistivity zones are cyclical and are identified in muddy confining units, even when no changes in lithology or karstic development are identified. Pervasive Thalassinoides networks act as starting points for widespread dissolution and lead to advanced karst development in transmissive HSUs. Natural gamma-ray logs do not reflect hydrologic characteristics directly, but are inversely correlated to resistivity logs and display m-scale cyclicity. 
Resistivity logs suggest that Thalassinoides networks are interconnected throughout strata within the GRL, and when coupled with natural gamma-ray logs, the lateral distribution of these networks within HSUs can be correlated. Identifying such fluid pathways is of particular importance for wells not located in proximity to major faults and karstic features.

  2. Linear solvation energy relationships regarding sorption and retention properties of hydrophobic organic compounds in soil leaching column chromatography.

    PubMed

    Xu, Feng; Liang, Xinmiao; Lin, Bingcheng; Su, Fan; Schramm, Karl-Werner; Kettrup, Antonius

    2002-08-01

    The capacity factors of a series of hydrophobic organic compounds (HOCs) were measured in soil leaching column chromatography (SLCC) on a soil column, and in reversed-phase liquid chromatography on a C18 column with different volumetric fractions (φ) of methanol in methanol-water mixtures. A general equation of linear solvation energy relationships, log(XYZ) = XYZ0 + mVI/100 + sπ* + bβm + aαm, was applied to analyze capacity factors (k'), soil organic partition coefficients (Koc) and octanol-water partition coefficients (P). The analyses exhibited high accuracy. The chief solute factors that control log Koc, log P, and log k' (on soil and on C18) are the solute size (VI/100) and hydrogen-bond basicity (βm). Less important solute factors are the dipolarity/polarizability (π*) and hydrogen-bond acidity (αm). Log k' on soil and log Koc have similar signs in the four fitting coefficients (m, s, b and a) and similar ratios (m:s:b:a), while log k' on C18 and log P have similar signs in the coefficients and similar ratios. Consequently, log k' values on C18 correlate well with log P (r > 0.97), while log k' values on soil correlate well with log Koc (r > 0.98). Two Koc estimation methods were developed, one through solute solvatochromic parameters, and the other through correlations with k' on soil. For HOCs, a linear relationship between the logarithmic capacity factor and the methanol composition of methanol-water mixtures could also be derived in SLCC.
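The general LSER equation is a linear combination of solute descriptors. The sketch below evaluates it with hypothetical coefficients and descriptor values, not the fitted values from this study:

```python
def lser_log_xyz(v_i, pi_star, beta_m, alpha_m, xyz0, m, s, b, a):
    """General LSER: log(XYZ) = XYZ0 + m*V_I/100 + s*pi* + b*beta_m + a*alpha_m.
    v_i is the solute intrinsic volume; pi_star, beta_m, alpha_m are the
    dipolarity/polarizability, hydrogen-bond basicity and acidity."""
    return xyz0 + m * v_i / 100 + s * pi_star + b * beta_m + a * alpha_m

# Hypothetical coefficients and descriptors, for illustration only.
print(round(lser_log_xyz(v_i=80.0, pi_star=0.5, beta_m=0.3, alpha_m=0.0,
                         xyz0=0.2, m=4.0, s=-0.5, b=-3.0, a=-0.1), 2))  # 2.25
```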

  3. SpaceOps 2012 Plus 2: Social Tools to Simplify ISS Flight Control Communications and Log Keeping

    NASA Technical Reports Server (NTRS)

    Cowart, Hugh S.; Scott, David W.

    2014-01-01

    A paper written for the SpaceOps 2012 Conference (Simplify ISS Flight Control Communications and Log Keeping via Social Tools and Techniques) identified three innovative concepts for real-time flight control communications tools based on social mechanisms: a) Console Log Tool (CoLT) - A log keeping application at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) that provides "anywhere" access, comment and notification features similar to those found in Social Networking Systems (SNS), b) Cross-Log Communication via Social Techniques - A concept from Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) that would use microblogging's @tag and #tag protocols to make information/requests visible and/or discoverable in logs owned by @Destination addressees, and c) Communications Dashboard (CommDash) - An MSFC concept for a Facebook-like interface to visually integrate and manage basic console log content, text chat streams analogous to voice loops, text chat streams dedicated to particular conversations, generic and position-specific status displays/streams, and a graphically based hailing display. CoLT was deployed operationally at nearly the same time as SpaceOps 2012, the Cross-Log Communications idea is currently waiting for a champion to carry it forward, and CommDash was approved as a NASA Information Technology (IT) Labs project. This paper discusses lessons learned from two years of actual CoLT operations, updates CommDash prototype development status, discusses the potential for using Cross-Log Communications in MCC-H and/or POIC environments, and considers other ways of synergizing console applications.

  4. Psychometric assessment of the IBS-D Daily Symptom Diary and Symptom Event Log.

    PubMed

    Rosa, Kathleen; Delgado-Herrera, Leticia; Zeiher, Bernie; Banderas, Benjamin; Arbuckle, Rob; Spears, Glen; Hudgens, Stacie

    2016-12-01

    Diarrhea-predominant irritable bowel syndrome (IBS-D) can considerably impact patients' lives. Patient-reported symptoms are crucial in understanding the diagnosis and progression of IBS-D. This study psychometrically evaluates the newly developed IBS-D Daily Symptom Diary and Symptom Event Log (hereafter, "Event Log") according to US regulatory recommendations. A US-based observational field study was conducted to understand cross-sectional psychometric properties of the IBS-D Daily Symptom Diary and Event Log. Analyses included item descriptive statistics, item-to-item correlations, reliability, and construct validity. The IBS-D Daily Symptom Diary and Event Log had no items with excessive missing data. With the exception of two items ("frequency of gas" and "accidents"), moderate to high inter-item correlations were observed among all items of the IBS-D Daily Symptom Diary and Event Log (day 1 range 0.67-0.90). Item scores demonstrated reliability, with the exception of the "frequency of gas" and "accidents" items of the Diary and "incomplete evacuation" item of the Event Log. The pattern of correlations of the IBS-D Daily Symptom Diary and Event Log item scores with generic and disease-specific measures was as expected, moderate for similar constructs and low for dissimilar constructs, supporting construct validity. Known-groups methods showed statistically significant differences and monotonic trends in each of the IBS-D Daily Symptom Diary item scores among groups defined by patients' IBS-D severity ratings ("none"/"mild," "moderate," or "severe"/"very severe"), supporting construct validity. Initial psychometric results support the reliability and validity of the items of the IBS-D Daily Symptom Diary and Event Log.

  5. Online molecular image repository and analysis system: A multicenter collaborative open-source infrastructure for molecular imaging research and application.

    PubMed

    Rahman, Mahabubur; Watabe, Hiroshi

    2018-05-01

    Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are used either individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the requirement for reliable and robust image analysis algorithms, effective quality control of imaging facilities, and challenges related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to these challenges, we developed a flexible, transparent, and secure infrastructure, named MIRA (Molecular Imaging Repository and Analysis), built primarily in the Python programming language with a MySQL relational database deployed on a Linux server. MIRA is designed around a centralized image-archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built on top of it. The ability to handle metadata, normalize image file formats, and store and view different types of documents and multimedia files makes MIRA considerably flexible. With features such as logging, auditing, commenting, sharing, and searching, MIRA also serves as an Electronic Laboratory Notebook for effective knowledge management. In addition, the centralized approach gives on-the-fly access to all of MIRA's features remotely through any web browser, and the open-source approach provides the opportunity for sustainable continued development. MIRA thus offers an infrastructure that can be used as a cross-boundary collaborative molecular imaging research platform to accelerate progress in cancer diagnosis and therapeutics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. DAN instrument for NASA's MSL mission: fast science data processing and instrument commanding for Mars surface operations and for field tests

    NASA Astrophysics Data System (ADS)

    Vostrukhin, A.; Kozyrev, A.; Litvak, M.; Malakhov, A.; Mitrofanov, I.; Mokrousov, M.; Sanin, A.; Tretyakov, V.

    2009-04-01

    The Dynamic Albedo of Neutrons (DAN) instrument is contributed by the Russian Space Agency to NASA for the Mars Science Laboratory mission, which was originally scheduled for 2009 and has now shifted to 2011. The design of DAN is partially inherited from the HEND instrument on NASA's Mars Odyssey, which still operates successfully, providing global mapping of the martian neutron albedo, mapping the distribution of martian water, and observing the martian seasonal cycles. DAN is specially designed as an active neutron instrument for surface operations onboard mobile platforms. It can focus science investigations on the local surface area around the rover, with a horizontal resolution of about 1 meter and a vertical penetration of about 0.5 m. The primary goal of DAN is the exploration of the hydrogen content of the bulk Martian subsurface material. These data will be used to estimate the content of chemically bound water in hydrated minerals. The concept of DAN operations combines neutron activation analysis with the neutron well-logging technique, both commonly used in terrestrial geological applications. DAN consists of a block of Detectors and Electronics (DE) and a Pulse Neutron Generator (PNG). The PNG irradiates the martian subsurface with pulses of 14 MeV neutrons at a changeable frequency of up to 10 Hz. The DE detects the post-pulse afterglow of neutrons after they have been thermalized down to epithermal and thermal energies within the martian subsurface. The results of these detections are so-called die-away curves of the neutron afterglow, which show the flux and time profile of thermalized neutrons and carry the observational signature of the layered structure of the martian regolith, in particular the depth distribution of hydrogen (the most effective element for thermalizing neutrons). In this study we focus on the development, verification, and validation of DAN fast data processing and commanding. It is necessary to perform a deconvolution from the counting statistics in the DAN detectors (raw data) to real science products, such as the estimated average hydrogen content or its depth distribution along the rover's trace. For rover surface operations it is necessary to provide real-time data analysis so that DAN data can be combined with data from all the other science instruments and the best observation strategy can be developed for future periods of operational activity. Our approach includes: 1) onboard FPGA data processing to record neutron die-away curves for epithermal and thermal neutrons of the post-pulse afterglow; 2) retrieval of DAN raw data at the mission operations center; 3) validation of instrument parameters and operational performance; 4) fast first-level science data processing (statistical analysis, background subtraction, normalization); 5) fast deconvolution of detector counts into hydrogen content (including numerical simulation and comparison with known standard models of regolith); 6) comparison with known information obtained by other instruments; 7) development of near-term and long-term strategies for subsequent DAN operations onboard MSL; and 8) generation and testing of commanding sequences for the next period of MSL autonomous operations. All this activity must be carried out in near real time, so steps 2-8 must not exceed 2-3 hours. Before launch we plan to validate this approach through instrument calibrations, field tests, and MSL science group activity. The first experience of fast data analysis and commanding will be presented for the DAN field tests performed at the testing facility of the Joint Institute for Nuclear Research (Russia). We will also discuss our plans for DAN operations during coming field tests in Antarctica.
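
    A minimal sketch of step 4, the first-level processing (background subtraction plus extraction of a die-away decay constant), might look as follows. The helper below is hypothetical and fits a single-exponential afterglow by linear least squares on the log of background-subtracted counts; the actual DAN pipeline performs a full deconvolution against regolith models.

```python
import math

def fit_die_away(times, counts, background):
    """Fit counts(t) ~ A * exp(-t / tau) after subtracting a constant background.

    Linear least squares on log(counts - background); `times` and `tau`
    share the same time unit. A simplified, hypothetical helper.
    """
    xs, ys = [], []
    for t, c in zip(times, counts):
        net = c - background
        if net > 0:  # keep only bins with signal above background
            xs.append(t)
            ys.append(math.log(net))
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx                      # slope of log-counts = -1/tau
    tau = -1.0 / slope                     # die-away time constant
    amplitude = math.exp(my - slope * mx)  # counts extrapolated to t = 0
    return amplitude, tau
```

    The shape and decay time of the fitted die-away curve carry the hydrogen signature described above.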

  7. The evolution of the dust temperatures of galaxies in the SFR-M∗ plane up to z ∼ 2

    NASA Astrophysics Data System (ADS)

    Magnelli, B.; Lutz, D.; Saintonge, A.; Berta, S.; Santini, P.; Symeonidis, M.; Altieri, B.; Andreani, P.; Aussel, H.; Béthermin, M.; Bock, J.; Bongiovanni, A.; Cepa, J.; Cimatti, A.; Conley, A.; Daddi, E.; Elbaz, D.; Förster Schreiber, N. M.; Genzel, R.; Ivison, R. J.; Le Floc'h, E.; Magdis, G.; Maiolino, R.; Nordon, R.; Oliver, S. J.; Page, M.; Pérez García, A.; Poglitsch, A.; Popesso, P.; Pozzi, F.; Riguccini, L.; Rodighiero, G.; Rosario, D.; Roseboom, I.; Sanchez-Portal, M.; Scott, D.; Sturm, E.; Tacconi, L. J.; Valtchanov, I.; Wang, L.; Wuyts, S.

    2014-01-01

    We study the evolution of the dust temperature of galaxies in the SFR-M∗ plane up to z ~ 2 using far-infrared and submillimetre observations from the Herschel Space Observatory taken as part of the PACS Evolutionary Probe (PEP) and Herschel Multi-tiered Extragalactic Survey (HerMES) guaranteed time key programmes. Starting from a sample of galaxies with reliable star-formation rates (SFRs), stellar masses (M∗) and redshift estimates, we grid the SFR-M∗ parameter space in several redshift ranges and estimate the mean dust temperature (Tdust) of each SFR-M∗-z bin. Dust temperatures are inferred using the stacked far-infrared flux densities (100-500 μm) of our SFR-M∗-z bins. At all redshifts, the dust temperature of galaxies smoothly increases with rest-frame infrared luminosity (LIR), specific SFR (SSFR; i.e., SFR/M∗), and distance with respect to the main sequence (MS) of the SFR-M∗ plane (i.e., Δlog (SSFR)MS = log [SSFR(galaxy)/SSFRMS(M∗,z)]). The Tdust-SSFR and Tdust-Δlog (SSFR)MS correlations are statistically much more significant than the Tdust-LIR one. While the slopes of these three correlations are redshift-independent, their normalisations evolve smoothly from z = 0 to z ~ 2. We convert these results into a recipe to derive Tdust from SFR, M∗ and z, valid out to z ~ 2 and for the stellar mass and SFR range covered by our stacking analysis. The existence of a strong Tdust-Δlog (SSFR)MS correlation provides us with several pieces of information on the dust and gas content of galaxies. Firstly, the slope of the Tdust-Δlog (SSFR)MS correlation can be explained by the increase in the star-formation efficiency (SFE; SFR/Mgas) with Δlog (SSFR)MS found locally by molecular gas studies. Secondly, at fixed Δlog (SSFR)MS, the constant dust temperature observed in galaxies probing wide ranges in SFR and M∗ can be explained by an increase or decrease in the number of star-forming regions with comparable SFE enclosed in them. Thirdly, at high redshift, the shift of the Tdust-Δlog (SSFR)MS correlation towards hotter dust temperatures can be explained by the decrease in the metallicities of galaxies or by the increase in the SFE of MS galaxies. All these results support the hypothesis that the conditions prevailing in the star-forming regions of MS and far-above-MS galaxies are different. MS galaxies have star-forming regions with low SFEs and thus cold dust, while galaxies situated far above the MS seem to be in a starbursting phase characterised by star-forming regions with high SFEs and thus hot dust. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA. Appendices are available in electronic form at http://www.aanda.org
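
    The main-sequence offset used throughout, Δlog(SSFR)MS = log10[SSFR(galaxy)/SSFRMS(M∗,z)], is a one-line computation once a main-sequence parametrization SSFRMS(M∗,z) is chosen; the sketch below takes that parametrized value as an input rather than reproducing the paper's own calibration.

```python
import math

def delta_log_ssfr_ms(sfr, mstar, ssfr_ms):
    """Offset from the star-forming main sequence (MS), in dex:
    Δlog(SSFR)_MS = log10[(SFR / M*) / SSFR_MS(M*, z)].

    `ssfr_ms` is the MS specific SFR at this stellar mass and redshift;
    its parametrization is survey-dependent and left to the caller.
    """
    ssfr = sfr / mstar
    return math.log10(ssfr / ssfr_ms)
```

    A galaxy with Δlog(SSFR)MS ≈ 0 sits on the main sequence; values well above zero correspond to the starbursting regime discussed above.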

  8. Deriving Criteria-supporting Benchmark Values from Empirical Response Relationships: Comparison of Statistical Techniques and Effect of Log-transforming the Nutrient Variable

    EPA Science Inventory

    In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...

  9. Physiological and psychological impacts of extended work hours in logging operations

    Treesearch

    Dana Mitchell; Tom Gallagher

    2007-01-01

    A study was initiated in 2006 to develop an understanding of the considerations of using extended work hours in the logging industry in the southeastern United States. Through semistructured interviews, it was obvious that loggers were individually creating ways of successfully implementing extended working hours without understanding the impacts that extended working...

  10. Heartwood formation in four black walnut plantations

    Treesearch

    Keith Woeste; Brian Beheler

    2003-01-01

    The amount of heartwood in black walnut (Juglans nigra L.) logs can vary widely, even among trees of the same age growing at the same location. There is little published data on the genetics, physiology, and development of heartwood in hardwoods, even though the volume of heartwood in a log can significantly influence its value.

  11. Sprinkling to prevent decay in decked western hemlock logs.

    Treesearch

    Ernest Wright; A.C. Knauss; R.M. Lindgren

    1959-01-01

    Decay developing in decked western hemlock (Tsuga heterophylla) has caused considerable loss in pulp yields in the Pacific Northwest. The Oregon Pulp and Paper Co., which commonly decks hemlock logs through one summer and occasionally through two summers, cooperated with the Pacific Northwest Forest and Range Experiment Station in a study to...

  12. Acoustic measurements on trees and logs: a review and analysis

    Treesearch

    Xiping Wang

    2013-01-01

    Acoustic technologies have been well established as material evaluation tools in the past several decades, and their use has become widely accepted in the forest products industry for online quality control and products grading. Recent research developments on acoustic sensing technology offer further opportunities to evaluate standing trees and logs for general wood...

  13. Minimizing bias in biomass allometry: Model selection and log transformation of data

    Treesearch

    Joseph Mascaro; Flint Hughes; Amanda Uowolo; Stefan A. Schnitzer

    2011-01-01

    Nonlinear regression is increasingly used to develop allometric equations for forest biomass estimation (i.e., as opposed to the traditional approach of log-transformation followed by linear regression). Most statistical software packages, however, assume additive errors by default, violating a key assumption of allometric theory and possibly producing spurious models....

  14. Outdoor Education Student Log Book.

    ERIC Educational Resources Information Center

    Garbutt, Barbara; And Others.

    A student log book for outdoor education was developed to aid Oakland County (Michigan) teachers and supervisors of outdoor education in preparing student campers for their role and responsibilities in the total program. A sample letter to sixth graders explains the purpose of the booklet. General camp rules (10) are presented, followed by 6 woods…

  15. PACALL: Supporting Language Learning Using SenseCam

    ERIC Educational Resources Information Center

    Hou, Bin; Ogata, Hiroaki; Kunita, Toma; Li, Mengmeng; Uosaki, Noriko

    2013-01-01

    The authors' research defines a ubiquitous learning log (ULLO) as a digital record of what a learner has learned in daily life using ubiquitous technologies. In their previous works, the authors proposed a model named LORE (Log--Organize--Recall--Evaluate) to describe the learning process of ULLO and developed a system named SCROLL to…

  16. A Comparison of Video-Based and Interaction-Based Affect Detectors in Physics Playground

    ERIC Educational Resources Information Center

    Kai, Shiming; Paquette, Luc; Baker, Ryan S.; Bosch, Nigel; D'Mello, Sidney; Ocumpaugh, Jaclyn; Shute, Valerie; Ventura, Matthew

    2015-01-01

    Increased attention to the relationships between affect and learning has led to the development of machine-learned models that are able to identify students' affective states in computerized learning environments. Data for these affect detectors have been collected from multiple modalities including physical sensors, dialogue logs, and logs of…

  17. Chromatographic behaviour predicts the ability of potential nootropics to permeate the blood-brain barrier.

    PubMed

    Farsa, Oldřich

    2013-01-01

    The log BB parameter is the logarithm of the ratio of a compound's equilibrium concentrations in the brain tissue versus the blood plasma. This parameter is a useful descriptor in assessing the ability of a compound to permeate the blood-brain barrier. The aim of this study was to develop a Hansch-type linear regression QSAR model that correlates the parameter log BB and the retention time of drugs and other organic compounds on a reversed-phase HPLC containing an embedded amide moiety. The retention time was expressed by the capacity factor log k'. The second aim was to estimate the brain's absorption of 2-(azacycloalkyl)acetamidophenoxyacetic acids, which are analogues of piracetam, nefiracetam, and meclofenoxate. Notably, these acids may be novel nootropics. Two simple regression models that relate log BB and log k' were developed from an assay performed using a reversed-phase HPLC that contained an embedded amide moiety. Both the quadratic and linear models yielded statistical parameters comparable to previously published models of log BB dependence on various structural characteristics. The models predict that four members of the substituted phenoxyacetic acid series have a strong chance of permeating the barrier and being absorbed in the brain. The results of this study show that a reversed-phase HPLC system containing an embedded amide moiety is a functional in vitro surrogate of the blood-brain barrier. These results suggest that racetam-type nootropic drugs containing a carboxylic moiety could be more poorly absorbed than analogues devoid of the carboxyl group, especially if the compounds penetrate the barrier by a simple diffusion mechanism.
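
    The linear variant of such a model is ordinary least squares relating log BB to the capacity factor log k'. The sketch below shows only the fit mechanics; the calibration pairs are illustrative placeholders, not the study's measurements or coefficients.

```python
def fit_linear(xs, ys):
    """Ordinary least squares slope and intercept, used here to relate
    log BB to the HPLC capacity factor log k' (linear model variant)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

# Illustrative calibration pairs (log k', log BB) -- NOT the study's data.
calibration = [(-0.5, -0.6), (0.0, -0.2), (0.5, 0.2), (1.0, 0.6)]
a, b = fit_linear([p[0] for p in calibration], [p[1] for p in calibration])
```

    A new compound's predicted log BB is then simply a * log_k + b, which is how the retention-based surrogate screens candidates for brain permeation.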

  18. Serum α-fetoprotein concentration in extremely low-birthweight infants.

    PubMed

    Maruyama, Kenichi

    2017-02-01

    Extremely low-birthweight infants (ELBWI) are at greater risk of developing hepatoblastoma than are normal-weight infants. Serum α-fetoprotein (AFP) plays an important role as a tumor marker in the diagnosis of hepatoblastoma, therefore the aim of this study was to determine the changes in serum AFP concentration after birth in ELBWI. Data were obtained for infants born between January 2005 and March 2008 with birthweight <1000 g who were followed up at Gunma Children's Medical Center with clinical examinations, including monitoring of the development of hepatoblastoma. The relationship between serum AFP concentration and age was analyzed up to 730 days after birth. Overall, 95 serum AFP measurements were obtained from 23 infants 30-730 days of age, with gestational age 24-32 weeks, and birthweight 498-982 g. log10(AFP [ng/mL]) was significantly correlated with log10(age [days]) (r = -0.961, P = 0.000, n = 95), with the following regression formula: log10(AFP [ng/mL]) = 11.063 - 3.752 log10(age [days]) (adjusted R² = 0.923, n = 95). The standard error of the estimate, mean log10(age [days]), and the sum of squares for log10(age [days]) were 0.363, 2.503, and 10.579, respectively. A correlation was found between serum AFP concentration and age in ELBWI, and the 95%CI of serum AFP concentration was determined for ELBWI up to 2 years after birth. © 2016 Japan Pediatric Society.
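
    The reported regression can be applied directly to obtain a point estimate of the expected AFP concentration at a given age; the helper below simply encodes the published formula (it does not reproduce the confidence interval).

```python
import math

def predicted_afp(age_days):
    """Point estimate of serum AFP (ng/mL) in ELBWI from the reported
    regression: log10(AFP) = 11.063 - 3.752 * log10(age in days)."""
    return 10 ** (11.063 - 3.752 * math.log10(age_days))
```

    The steep negative slope means the expected concentration falls by a factor of about 10^3.752 ≈ 5650 for every tenfold increase in age.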

  19. Development of a bifunctional filter for prion protein and leukoreduction of red blood cell components.

    PubMed

    Yokomizo, Tomo; Kai, Takako; Miura, Morikazu; Ohto, Hitoshi

    2015-02-01

    Leukofiltration of blood components is currently implemented worldwide as a precautionary measure against white blood cell-associated adverse effects and the potential transmission of variant Creutzfeldt-Jakob disease (vCJD). A newly developed bifunctional filter (Sepacell Prima, Asahi Kasei Medical) was assessed for prion removal, leukoreduction (LR), and whether the filter significantly affected red blood cells (RBCs). Sepacell Prima's postfiltration effects on RBCs, including hemolysis, complement activation, and RBC chemistry, were compared with those of a conventional LR filter (Sepacell Pure RC). Prion removal was measured by Western blot after spiking RBCs with microsomal fractions derived from scrapie-infected hamster brain homogenate. Serially diluted exogenous prion solutions (0.05 mL), with or without filtration, were injected intracerebrally into Golden Syrian hamsters. LR efficiency of 4.44 log with the Sepacell Prima was comparable to 4.11 log with the conventional LR filter. There were no significant differences between the two filters in hemoglobin loss, hemolysis, complement activation, and RBC biomarkers. In vitro reduction of exogenously spiked prions by the filter exceeded 3 log. The titer, 6.63 (log ID50 /mL), of prefiltration infectivity of healthy hamsters was reduced to 2.52 (log ID50 /mL) after filtration. The reduction factor was calculated as 4.20 (log ID50 ). With confirmed removal efficacy for exogenous prion protein, this new bifunctional prion and LR filter should reduce the residual risk of vCJD transmission through blood transfusion without adding complexity to component processing. © 2014 AABB.
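
    The log reduction values quoted above follow the standard definition: the log10 ratio of pre- to post-filtration concentration (or infectivity titer per mL). A minimal helper, with illustrative numbers rather than the study's raw data:

```python
import math

def log_reduction_value(conc_before, conc_after):
    """Log10 reduction value (LRV) of a removal step, from concentrations
    (or titers per mL) measured before and after filtration."""
    return math.log10(conc_before / conc_after)
```

    For example, reducing a challenge organism from 10^6 to 10^2 per mL is an LRV of 4.0; when titers are already expressed in log10 units, the LRV is simply their difference.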

  20. Comparison study of global and local approaches describing critical phenomena on the Polish stock exchange market

    NASA Astrophysics Data System (ADS)

    Czarnecki, Łukasz; Grech, Dariusz; Pamuła, Grzegorz

    2008-12-01

    We confront global and local methods of analyzing crash-like events on the Polish financial market from the critical-phenomena point of view. These methods are based on the analysis of log-periodicity and of the local fractal properties of financial time series in the vicinity of phase transitions (crashes). The whole history (1991-2008) of the Warsaw Stock Exchange Index (WIG), describing the largest developing financial market in Europe, is analyzed in a daily time horizon. We find that crash-like events on the Polish financial market are described better by the log-divergent price model decorated with log-periodic behavior than by the corresponding power-law-divergent price model. Predictions from the log-periodicity scenario are verified for all the main crashes in WIG history. It is argued that crash predictions within the log-periodicity model depend strongly on the amount of data taken for the fit and are therefore likely to contain large inaccuracies. Turning to the local fractal description, we calculate the so-called local (time-dependent) Hurst exponent H for the WIG time series and find a dependence between the behavior of the local fractal properties of the WIG time series and the appearance of crashes on the financial market. The latter method seems to work better than the global approach, for developing as well as developed markets. The current situation on the market, particularly related to the Fed intervention in September '07 and the situation immediately after it, is also analyzed from the fractional Brownian motion point of view.
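
    The local Hurst exponent used in the second approach is typically estimated over a sliding window of the series. The sketch below is a generic textbook rescaled-range (R/S) estimator for one window, in pure Python; it is not the authors' exact procedure, and in practice it would be applied to the returns within each window.

```python
import math

def rs_hurst(series):
    """Hurst exponent via rescaled-range (R/S) analysis on one window.

    Splits the series into subseries of length n, computes the mean
    rescaled range R/S for each n, and fits log(R/S) ~ H * log(n).
    """
    n_total = len(series)
    log_n, log_rs = [], []
    for n in (8, 16, 32, 64):
        if n > n_total:
            continue
        rs_vals = []
        for start in range(0, n_total - n + 1, n):
            chunk = series[start:start + n]
            mean = sum(chunk) / n
            dev = [x - mean for x in chunk]
            cum, s = [], 0.0
            for d in dev:            # cumulative deviations from the mean
                s += d
                cum.append(s)
            r = max(cum) - min(cum)  # range of cumulative deviations
            sd = math.sqrt(sum(d * d for d in dev) / n)
            if sd > 0:
                rs_vals.append(r / sd)
        if rs_vals:
            log_n.append(math.log(n))
            log_rs.append(math.log(sum(rs_vals) / len(rs_vals)))
    # least-squares slope of log(R/S) vs log(n) is the Hurst exponent
    m = len(log_n)
    mx = sum(log_n) / m
    my = sum(log_rs) / m
    sxx = sum((x - mx) ** 2 for x in log_n)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs))
    return sxy / sxx
```

    Sliding this estimator along the series yields the time-dependent H(t) whose behavior near crashes is what the local approach examines.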

  1. Testing the Quick Seismic Event Locator and Magnitude Calculator (SSL_Calc) by Marsite Project Data Base

    NASA Astrophysics Data System (ADS)

    Tunc, Suleyman; Tunc, Berna; Caka, Deniz; Baris, Serif

    2016-04-01

    Locating seismic events and calculating their size quickly is one of the most important and challenging issues, especially in real-time seismology. In this study, we developed a Matlab application, called SSL_Calc, to locate seismic events and calculate their magnitudes (local magnitude and empirical moment magnitude) using a single station. This newly developed software has been tested on all stations of the Marsite project "New Directions in Seismic Hazard Assessment through Focused Earth Observation in the Marmara Supersite-MARsite". The SSL_Calc algorithm is suitable for both velocity and acceleration sensors. Data must be in GCF (Güralp Compressed Format). Online or offline data can be selected in the SCREAM software (from Guralp Systems Limited) and transferred to SSL_Calc. To locate an event, P- and S-wave picks must be marked manually in the SSL_Calc window. During magnitude calculation, the instrument response is removed and the record is converted to real displacement in millimeters. The displacement data are then converted to Wood-Anderson seismometer output using the parameters Z=[0;0]; P=[-6.28+4.71j; -6.28-4.71j]; A0=[2080]. For local magnitude calculation, the maximum displacement amplitude (A) and distance (dist) are used in formula (1) for distances up to 200 km and formula (2) beyond 200 km. ML=log10(A)-(-1.118-0.0647*dist+0.00071*dist^2-3.39E-6*dist^3+5.71e-9*dist^4) (1) ML=log10(A)+(2.1173+0.0082*dist-0.0000059628*dist^2) (2) Following the local magnitude calculation, the code calculates two empirical moment magnitudes using formulas (3), from Akkar et al. (2010), and (4), from Ulusay et al. (2004): Mw=0.953*ML+0.422 (3) Mw=0.7768*ML+1.5921 (4) SSL_Calc is easy to implement and user friendly, and offers a practical solution for individual users to locate events and calculate ML and Mw.
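
    The magnitude formulas quoted in the abstract translate directly into code. The sketch below assumes the maximum amplitude has already been converted to Wood-Anderson displacement in millimeters, as described above.

```python
import math

def local_magnitude(amp_mm, dist_km):
    """ML from maximum Wood-Anderson displacement amplitude (mm) and
    epicentral distance (km), using the two formulas quoted above."""
    if dist_km <= 200:
        return math.log10(amp_mm) - (-1.118 - 0.0647 * dist_km
                                     + 0.00071 * dist_km ** 2
                                     - 3.39e-6 * dist_km ** 3
                                     + 5.71e-9 * dist_km ** 4)
    return math.log10(amp_mm) + (2.1173 + 0.0082 * dist_km
                                 - 0.0000059628 * dist_km ** 2)

def moment_magnitudes(ml):
    """Empirical Mw estimates from ML, after Akkar et al. (2010)
    and Ulusay et al. (2004), respectively."""
    return 0.953 * ml + 0.422, 0.7768 * ml + 1.5921
```

    Note the branch at 200 km, where the distance-correction polynomial changes form.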

  2. Exploration of spatio-temporal patterns of students' movement in field trip by visualizing the log data

    NASA Astrophysics Data System (ADS)

    Cho, Nahye; Kang, Youngok

    2018-05-01

    Numerous log data, in addition to user input data, are being generated as mobile and web users continue to increase, and studies that explore the patterns and meanings of various movement activities by making use of these log data are rising rapidly as well. Meanwhile, in the field of education, the importance of field trips has been recognized as creative education is emphasized, and examples that utilize mobile devices during field trips are growing with the development of information technology. In this study, we explore the patterns of student activity by visualizing the log data generated from high school students' field trips with mobile devices.

  3. Geophysical Log Database for the Mississippi Embayment Regional Aquifer Study (MERAS)

    USGS Publications Warehouse

    Hart, Rheannon M.; Clark, Brian R.

    2008-01-01

    The Mississippi Embayment Regional Aquifer Study (MERAS) is an investigation of ground-water availability and sustainability within the Mississippi embayment as part of the U.S. Geological Survey Ground-Water Resources Program. The MERAS area consists of approximately 70,000 square miles and encompasses parts of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. More than 2,600 geophysical logs of test holes and wells within the MERAS area were compiled into a database and were used to develop a digital hydrogeologic framework from land surface to the top of the Midway Group of upper Paleocene age. The purpose of this report is to document, present, and summarize the geophysical log database, as well as to preserve the geophysical logs in a digital image format for online access.

  4. Estimating pore-space gas hydrate saturations from well log acoustic data

    NASA Astrophysics Data System (ADS)

    Lee, Myung W.; Waite, William F.

    2008-07-01

    Relating pore-space gas hydrate saturation to sonic velocity data is important for remotely estimating gas hydrate concentration in sediment. In the present study, sonic velocities of gas hydrate-bearing sands are modeled using a three-phase Biot-type theory in which sand, gas hydrate, and pore fluid form three homogeneous, interwoven frameworks. This theory is developed using well log compressional and shear wave velocity data from the Mallik 5L-38 permafrost gas hydrate research well in Canada and applied to well log data from hydrate-bearing sands in the Alaskan permafrost, Gulf of Mexico, and northern Cascadia margin. Velocity-based gas hydrate saturation estimates are in good agreement with nuclear magnetic resonance and resistivity log estimates over the complete range of observed gas hydrate saturations.

  5. Mail LOG: Program operating instructions

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The operating instructions are provided for the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS). The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has four modes of operation: (1) INPUT - putting new records into the data base; (2) REVISE - changing or modifying existing records in the data base; (3) SEARCH - finding specific records in the data base; and (4) ARCHIVE - storing existing records away in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.

  6. Estimating pore-space gas hydrate saturations from well log acoustic data

    USGS Publications Warehouse

    Lee, Myung W.; Waite, William F.

    2008-01-01

    Relating pore-space gas hydrate saturation to sonic velocity data is important for remotely estimating gas hydrate concentration in sediment. In the present study, sonic velocities of gas hydrate–bearing sands are modeled using a three-phase Biot-type theory in which sand, gas hydrate, and pore fluid form three homogeneous, interwoven frameworks. This theory is developed using well log compressional and shear wave velocity data from the Mallik 5L-38 permafrost gas hydrate research well in Canada and applied to well log data from hydrate-bearing sands in the Alaskan permafrost, Gulf of Mexico, and northern Cascadia margin. Velocity-based gas hydrate saturation estimates are in good agreement with nuclear magnetic resonance and resistivity log estimates over the complete range of observed gas hydrate saturations.

  7. Microbial Removals by a Novel Biofilter Water Treatment System

    PubMed Central

    Wendt, Christopher; Ives, Rebecca; Hoyt, Anne L.; Conrad, Ken E.; Longstaff, Stephanie; Kuennen, Roy W.; Rose, Joan B.

    2015-01-01

    Two point-of-use drinking water treatment systems designed using a carbon filter and foam material as a possible alternative to traditional biosand systems were evaluated for removal of bacteria, protozoa, and viruses. Two configurations were tested: the foam material was positioned vertically around the carbon filter in the sleeve unit or horizontally in the disk unit. The filtration systems were challenged with Cryptosporidium parvum, Raoultella terrigena, and bacteriophages P22 and MS2 before and after biofilm development to determine the average log reduction (ALR) for each organism and the role of the biofilm. There was no significant difference in performance between the two designs, and both designs showed significant levels of removal (at least 4 log10 reduction in viruses, 6 log10 for protozoa, and 8 log10 for bacteria). Removal levels met or exceeded Environmental Protection Agency (EPA) standards for microbial purifiers. Exploratory test results suggested that mature biofilm formation contributed 1–2 log10 reductions. Future work is recommended to determine field viability. PMID:25758649

  8. Displacement chromatography on cyclodextrin silicas. IV. Separation of the enantiomers of ibuprofen.

    PubMed

    Farkas, G; Irgens, L H; Quintero, G; Beeson, M D; al-Saeed, A; Vigh, G

    1993-08-13

    A displacement chromatographic method has been developed for the preparative separation of the enantiomers of ibuprofen using a beta-cyclodextrin silica stationary phase. The retention behavior of ibuprofen was studied in detail: the log k' vs. polar organic modifier concentration, the log k' vs. pH, the log k' vs. buffer concentration and the log k' vs. 1/T relationships; also, the alpha vs. polar organic modifier concentration, the alpha vs. pH, the alpha vs. buffer concentration and the log alpha vs. 1/T relationships have been determined in order to find the carrier solution composition which results in maximum chiral selectivity and sufficient, but not excessive solute retention (1 < k' < 30). 4-tert.-Butylcyclohexanol, a structurally similar but more retained compound than ibuprofen, was selected as displacer for the separation. Even with an alpha value as small as 1.08, good preparative chiral separations were observed both in the displacement mode and in the overloaded elution mode, up to a sample load of 0.5 mg.

  9. Reliability and validity of the instrument used in BRFSS to assess physical activity.

    PubMed

    Yore, Michelle M; Ham, Sandra A; Ainsworth, Barbara E; Kruger, Judy; Reis, Jared P; Kohl, Harold W; Macera, Caroline A

    2007-08-01

    State-level statistics of adherence to the physical activity objectives in Healthy People 2010 are derived from the Behavioral Risk Factor Surveillance System (BRFSS) data. BRFSS physical activity questions were updated in 2001 to include domains of leisure-time, household, and transportation-related activity of moderate and vigorous intensity, and walking questions. This article reports the reliability and validity of these questions. The BRFSS Physical Activity Study (BPAS) was conducted from September 2000 to May 2001 in Columbia, SC. Sixty participants were followed for 22 d; they answered the physical activity questions three times via telephone, wore a pedometer and accelerometer, and completed a daily physical activity log for 1 wk. Measures for moderate, vigorous, recommended (i.e., met the criteria for moderate or vigorous), and strengthening activities were created according to Healthy People 2010 operational definitions. Reliability and validity were assessed using Cohen's kappa (kappa) and Pearson correlation coefficients. Seventy-three percent of participants met the recommended activity criteria compared with 45% in the total U.S. population. Test-retest reliability (kappa) was 0.35-0.53 for moderate activity, 0.80-0.86 for vigorous activity, 0.67-0.84 for recommended activity, and 0.85-0.92 for strengthening. Validity (kappa) of the survey (using the accelerometer as the standard) was 0.17-0.22 for recommended activity. Validity (kappa) of the survey (using the physical activity log as the standard) was 0.40-0.52 for recommended activity. The validity and reliability of the BRFSS physical activity questions suggest that this instrument can classify groups of adults into the levels of recommended and vigorous activity as defined by Healthy People 2010. Repeated administration of these questions over time will help to identify trends in physical activity.
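
    Cohen's kappa, the agreement statistic behind the reliability and validity figures above, corrects raw observed agreement for the agreement expected by chance. A minimal two-rater implementation:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = 0.0
    for c in set(rater_a) | set(rater_b):
        # chance agreement on category c = product of marginal proportions
        expected += (rater_a.count(c) / n) * (rater_b.count(c) / n)
    return (observed - expected) / (1 - expected)
```

    Kappa is 1 for perfect agreement and 0 for chance-level agreement, so the test-retest values of 0.80-0.92 reported above indicate substantial to almost-perfect reliability on conventional benchmarks.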

  10. Design of a simple low cost tethersonde data acquisition system for meteorological measurements

    NASA Astrophysics Data System (ADS)

    John, Thomas; Garg, S. C.; Maini, H. K.; Chaunal, D. S.; Yadav, V. S.

    2005-08-01

    A tethersonde instrument for vertical sounding of the lowest region of the atmosphere, designed using commercially available solid-state sensors, is presented. The instrumentation was developed for high-resolution measurements of the vertical profiles of atmospheric temperature, humidity, and pressure to study the evolution of radiation fog in winter over Delhi. It measures pressure (800-1100 mbar), temperature (0-35°C), and relative humidity (0-100% RH), and includes an eight-channel, 12-bit data acquisition unit and a low-power digital FM telemetry system operating at 173 MHz. It uses a PC printer-port connection for real-time data logging and operates from a single 9 V supply. It provides resolutions better than 0.01°C for temperature, 0.07 mb for pressure, and 0.1% for relative humidity, and the system has a response time of ~50 s. The sonde can accommodate up to six different sensors according to sounding requirements. The data are serially streamed into the PC, and data from the different sensors are identified by a sync word included as part of the data stream. The sonde has been flown successfully a number of times, and the flights made thus far have shown excellent performance. The design facilitates easy recalibration of the individual sensors and their replacement upon failure.

  11. The application of ANN for zone identification in a complex reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, A.C.; Molnar, D.; Aminian, K.

    1995-12-31

    Reservoir characterization plays a critical role in appraising the economic success of reservoir management and development methods. Nearly all reservoirs show some degree of heterogeneity, which invariably impacts production. As a result, the production performance of a complex reservoir cannot be realistically predicted without an accurate reservoir description. Characterization of a heterogeneous reservoir is a complex problem. The difficulty stems from the fact that sufficient data to accurately predict the distribution of the formation attributes are not usually available. Generally, geophysical logs are available from a considerable number of wells in the reservoir. Therefore, a methodology for reservoir description and characterization utilizing only well log data represents a significant technical as well as economic advantage. One of the key issues in the description and characterization of heterogeneous formations is the distribution of the various zones and their properties. In this study, several artificial neural networks (ANN) were successfully designed and developed for zone identification in a heterogeneous formation from geophysical well logs. Granny Creek Field in West Virginia was selected as the study area. This field has produced oil from the Big Injun Formation since the early 1900s. Water flooding operations were initiated in the 1970s and are still in progress. Well log data from a substantial number of wells in this reservoir were collected, and core analysis results were available from a few wells. The log data from 3 wells, along with the various zone definitions, were utilized to train the networks for zone recognition. The data from 2 other wells with previously determined zones, based on the core and log data, were then utilized to verify the developed networks' predictions. The results indicated that ANN can be a useful tool for accurately identifying the zones in complex reservoirs.
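
    The workflow described in this record (train a network on logs from wells with known zones, then predict zones in held-out wells) can be sketched with a toy classifier. The readings and the single-layer logistic model below are illustrative assumptions, not the networks or log suite used in the study:

```python
import math

def train_logistic(samples, labels, lr=0.5, epochs=2000):
    """Train a tiny logistic unit mapping normalized log readings -> zone label."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
            g = p - y                       # gradient of log-loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical normalized (gamma-ray, bulk-density) pairs from "training wells";
# label 1 marks the zone of interest.
train_x = [(0.1, 0.2), (0.2, 0.1), (0.8, 0.9), (0.9, 0.8)]
train_y = [0, 0, 1, 1]
w, b = train_logistic(train_x, train_y)
# Readings from a hypothetical "verification well":
print([predict(w, b, x) for x in [(0.15, 0.15), (0.85, 0.85)]])  # → [0, 1]
```

    The study's ANNs were multi-layer and multi-log, but the train-on-known-zones / verify-on-held-out-wells structure is the same.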

  12. REAL-TIME TRACER MONITORING OF RESERVOIR STIMULATION PROCEDURES VIA ELECTRONIC WIRELINE AND TELEMETRY DATA TRANSMISSION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George L. Scott III

    2005-01-01

    Finalized Phase 2-3 project work has field-proven two separate real-time reservoir processes that were co-developed with funding from the National Energy Technology Laboratory (NETL). Both technologies, a downhole-commingled reservoir stimulation procedure and a real-time tracer-logged fracturing diagnostic system, are presently patented in the United States and select foreign markets. Phase 2 and early Phase 3 project work included the research, development, and well testing of a U.S.-patented gamma tracer fracturing diagnostic system. This stimulation logging process was successfully field-demonstrated; real-time tracer measurement of fracture height while fracturing was accomplished and proven technically possible. However, after the initial well tests, several licensing issues developed between service providers that restricted and minimized Realtimezone's (RTZ) ability to field-test the real-time gamma diagnostic system as originally outlined for this project. These restrictions were encountered after one major provider agreed to license its gamma logging tools to another. Both of these companies had previously promised contributory support toward Realtimezone's DE-FC26-99FT40129 project work; however, actual support was less than desired when newly licensed wireline gamma logging tools from one company were converted by the other from electric wireline into slickline, battery-powered "memory" tools for post-stimulation logging purposes. Unfortunately, the converted post-fracture measurement memory tools have no application in experimentally monitoring real-time movement of tracers in the reservoir concurrent with the fracturing treatment. RTZ subsequently worked with other tracer gamma-logging tool companies for basic gamma logging services, but with lessened results due to a lack of multiple-isotope detection capability. In addition to real-time logging system development and well testing, final Phase 2 and Phase 3 project work included the development of a real-time reservoir stimulation procedure, which was successfully field-demonstrated and is presently patented in the U.S. and select foreign countries, including Venezuela, Brazil, and Canada. These patents are co-owned by RTZ and the NETL. In 2002, Realtimezone and the NETL licensed the patents to Halliburton Energy Services (HES). Additional licensing agreements are anticipated with other service industry companies in 2005. Final Phase 3 work has led to commercial applications of the real-time reservoir stimulation procedure. Four successful downhole-mixed well tests were conducted, with commercially expected production results. The most recent, fourth field test was a downhole-mixed stimulated well completed in June 2004, which currently produces 11 BOPD with 90 barrels of water per day. Phase 2 and Phase 3 field-test work to date has resulted in the fine-tuning of a real-time enhanced stimulation system that will significantly increase future petroleum well recoveries in the United States and foreign petroleum fields, both onshore and offshore, and in vertical and horizontal wells.

  13. Little effects of reduced-impact logging on insect communities in eastern Amazonia.

    PubMed

    Nogueira, Denis Silva; Calvão, Lenize Batista; de Assis Montag, Luciano Fogaça; Juen, Leandro; De Marco, Paulo

    2016-07-01

    Selective logging has become a major threat to tropical forest, challenging both ecologists and managers to develop low-impact forestry. Reduced-impact logging (RIL) is a prominent forestry practice intended to prevent strong forest disturbance. Our aims were to evaluate the effects of RIL on insect communities of forested streams in the Eastern Amazon and to test the hypothesis that RIL negatively affects species richness, abundance, and functional feeding groups of aquatic insect assemblages. None of the evaluated assemblage metrics was negatively affected by RIL. Environmental metrics, such as substrate heterogeneity, woody canopy cover, and hill slope height, varied more among RIL streams than among reference (REF) streams, indicating a gradient according to logging impacts, and are suitable candidates for monitoring RIL impacts in Amazonian streams. In addition, the PHI index also varied between REF and RIL streams according to age class and year of logging, which could reflect a trend toward recovery of forest structure within only 10 years after logging. We conclude that RIL has not had detrimental impacts on insect communities and has changed the environmental conditions, especially the riparian vegetation around streams, only a little.

  14. Halo vest instrumentation

    NASA Astrophysics Data System (ADS)

    Huston, Dryver R.; Krag, Martin

    1996-05-01

    The halo vest is a head and neck immobilization system that is often used on patients who are recovering from cervical trauma or surgery. The halo vest system consists of a rigid halo that is firmly attached to the skull, an upright support structure for stabilization and immobilization, and a torso-enveloping vest. The main purpose of this study was to measure the forces carried by the halo-vest structure as the subject undergoes various activities of daily living and external loading, for different vest designs. A tethered, strain-gage load-cell instrumentation system was used to take these load measurements on ten different subjects. Three different halo-vest systems were evaluated; the primary differences between the vests were the amount of torso coverage and the use of shoulder straps. The loads were measured, analyzed, and used to compare the vests and to create a model of halo-vest-neck mechanics. Future applications of this technology to standalone data logging, pin-load measurement, and biofeedback are discussed.

  15. The Flux of Carbon from Selective Logging, Fire, and Regrowth in Amazonia

    NASA Technical Reports Server (NTRS)

    Houghton, R. A.

    2004-01-01

    The major goal of this work was to develop a spatial, process-based model (CARLUC) that would calculate sources and sinks of carbon from changes in land use, including logging and fire. The work also included Landsat data, together with fieldwork, to investigate fire and logging in three different forest types within Brazilian Amazonia. Results from these three activities (modeling, fieldwork, and remote sensing) are described, individually, below. The work and some of the personnel overlapped with research carried out by Dr. Daniel Nepstad's LBA team, and thus some of the findings are also reported in his summaries.

  16. Fuzzy inference system for identification of geological stratigraphy off Prydz Bay, East Antarctica

    NASA Astrophysics Data System (ADS)

    Singh, Upendra K.

    2011-12-01

    The analysis of well logging data plays a key role in the exploration and development of hydrocarbon reservoirs. Well log parameters such as porosity, gamma ray, density, transit time, and resistivity help in classifying strata and estimating the physical, electrical, and acoustical properties of the subsurface lithology. Strong and conspicuous changes in some of the log parameters associated with a particular stratigraphic formation are a function of its composition and physical properties, and such changes aid classification. However, some substrata show moderate values in the respective log parameters, making it difficult to identify the kind of strata from the standard variability ranges of the log parameters and visual inspection alone. The complexity increases further as more sensors are involved. An attempt is made to identify stratigraphy from well logs over the Prydz Bay basin, East Antarctica, using a fuzzy inference system. A model is built from a few data sets of known stratigraphy, and the model is then used to infer the lithology of a borehole from geophysical logs not used in simulation. The fuzzy-based algorithm is first trained, validated, and tested on well log data, and finally identifies the formation lithology of a hydrocarbon reservoir system in the study area. The effectiveness of this technique is demonstrated by comparing the results against actual lithologs and coring data of ODP Leg 188. The fuzzy results show a training performance of 82.95% and a prediction ability of 87.69%. The results are very encouraging, and the model is able to decipher even thin seams and other strata from geophysical logs. The result identifies a significant sand formation in the depth range 316.0-341.0 m, where core recovery is incomplete.
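
    A fuzzy inference system of the kind this record describes assigns each log reading graded memberships in candidate lithologies rather than hard thresholds. A minimal sketch with hypothetical triangular membership functions on a single gamma-ray input (the study's actual rule base and multi-log inputs are far richer):

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_gamma(gr):
    """Pick the lithology with the highest membership for a gamma-ray reading (API).
    The breakpoints below are illustrative, not calibrated values."""
    memberships = {
        "sand": tri(gr, 0, 30, 80),
        "shale": tri(gr, 50, 110, 200),
    }
    return max(memberships, key=memberships.get)

print(classify_gamma(25), classify_gamma(120))  # → sand shale
```

    The overlap between membership functions (here 50-80 API) is what lets the system handle the "moderate values" that defeat crisp variability ranges.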

  17. Longer-term Stream Nitrogen Dynamics after Wildfire and Salvage Harvesting: Implications for Management Concepts based on Trajectories of Post-disturbance Watershed Recovery.

    NASA Astrophysics Data System (ADS)

    Silins, U.; Emelko, M. B.; Bladon, K. D.; Stone, M.; Williams, C.; Martens, A. M.; Wagner, M. J.

    2015-12-01

    Biogeochemical processes reflecting the interaction of vegetation and hydrology govern long-term export of nutrients such as nitrogen, phosphorus, and carbon over successional time scales. While management concepts of watershed "recovery" from disturbance back toward pre-disturbance conditions are often considered over much shorter timescales, few studies have tracked watershed biogeochemical responses long enough to directly document the longer-term trajectory of nitrogen export after severe land disturbance. The objectives of this study were to document both the initial magnitude and the longer-term recovery of stream nitrogen over nine years after the 2003 Lost Creek wildfire in the front ranges of the Rocky Mountains in southwest Alberta, Canada. The study was conducted in seven instrumented catchments (4-14 km2) encompassing burned, burned and salvage-logged, and unburned (reference) conditions since 2004. Total nitrogen (TN) and nitrate (NO3-) concentrations and area-normalized yields were greater and more variable in burned and post-fire salvage-logged catchments than in unburned catchments. Large initial increases in stream TN and NO3- production 1-3 years after both wildfire and post-fire salvage logging declined strongly to levels similar to, or below, those of unburned watersheds 4-6 years after the fire, and continued to decline (although more slowly) 7-9 years after the wildfire. Post-fire salvage logging produced lower impacts on TN and NO3- in streams, and these effects declined even more rapidly than the effects of wildfire alone. These changes closely corresponded to the early establishment and rapid juvenile growth of post-fire regenerating forest vegetation in both catchment groups. While hydrologic recovery from disturbance is both a practical and meaningful concept for integrated landscape management to protect forest water resources, the benchmark for "recovery" based on present conditions in undisturbed forests may vary widely depending on forest age and successional status.

  18. Deuterium Abundance Toward G191-B2B: Results from the Far Ultraviolet Spectroscopic Explorer (FUSE) Mission

    NASA Technical Reports Server (NTRS)

    Lemoine, M.; Vidal-Madjar, A.; Hebrard, G.; Desert, J.-M.; Ferlet, R.; LecavelierdesEtangs, A.; Howk, J. C.; Andre, M.; Blair, W. P.; Friedman, S. D.; hide

    2002-01-01

    High-resolution spectra of the hot white dwarf G191-B2B covering the wavelength region 905-1187 A were obtained with the Far Ultraviolet Spectroscopic Explorer (FUSE). These data were used in conjunction with existing high-resolution Hubble Space Telescope STIS observations to evaluate the total H I, D I, O I, and N I column densities along the line of sight. Previous determinations of N(D I) based upon GHRS (Goddard High Resolution Spectrograph) and STIS (Space Telescope Imaging Spectrograph) observations were controversial due to the saturated strength of the D I Lyman alpha line. In the present analysis the column density of D I has been measured using only the unsaturated Lyman beta and Lyman gamma lines observed by FUSE. A careful inspection of possible systematic uncertainties tied to the modeling of the stellar continuum or to uncertainties in the FUSE instrumental characteristics has been performed. The column densities derived are: log N(D I) = 13.40+/-0.07, log N(O I) = 14.86+/-0.07, and log N(N I) = 13.87+/-0.07, quoted with 2-sigma uncertainties. The measurement of the H I column density by profile fitting of the Lyman alpha line has been found to be unreliable: if additional weak hot interstellar components are added to the three detected clouds along the line of sight, the H I column density can be reduced quite significantly, even though the signal-to-noise ratio and spectral resolution at Lyman alpha are excellent. The new estimate of N(H I) toward G191-B2B is log N(H I) = 18.18+/-0.18 (2-sigma uncertainty), so that the average D/H ratio on the line of sight is D/H = 1.66(+0.9/-0.6) x 10^-5 (2-sigma uncertainty).

  19. Development of spruce-fir stands following spruce beetle outbreaks

    Treesearch

    J. M. Schmid; T. E. Hinds

    1974-01-01

    Logged and unlogged stands of Engelmann spruce-subalpine fir were evaluated in spruce beetle outbreak areas infested about 15, 25, 50, and 100 years ago. Seedling regeneration was generally adequate except in heavily logged areas, although seedlings were often damaged, apparently by animals. Species composition was dramatically altered in favor of fir in the unlogged...

  20. School Principals at Their Lonely Work: Recording Workday Practices through ESM Logs

    ERIC Educational Resources Information Center

    Lopez, Veronica; Ahumada, Luis; Galdames, Sergio; Madrid, Romina

    2012-01-01

    This study used portable technology based on Experience Sampling Methodology (ESM log) to register workday practices of school principals and heads from Chilean schools who were implementing school improvement plans aimed at developing a culture of organizational learning. For a week, Smartphone devices which beeped seven times a day were given to…

  1. Predicting logging residues: an interim equation for Appalachian oak sawtimber

    Treesearch

    A. Jeff Martin

    1975-01-01

    An equation, using dbh, dbh², bole length, and sawlog height to predict the cubic-foot volume of logging residue per tree, was developed from data collected on 36 mixed oaks in southwestern Virginia. The equation produced reliable results for small sawtimber trees, but additional research is needed for other species, sites, and utilization practices.

  2. The Economics of Reduced Impact Logging in the American Tropics: A Review of Recent Initiatives

    Treesearch

    Frederick Boltz; Thomas P. Holmes; Douglas R. Carter

    1999-01-01

    Programs aimed at developing and implementing reduced-impact logging (RIL) techniques are currently underway in important forest regions of Latin America, given the importance of timber production in the American tropics to national and global markets. RIL efforts focus upon planning and extraction methods which lessen harvest impact on residual commercial timber...

  3. Role of Passive Capturing in a Ubiquitous Learning Environment

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Hou, Bin; Li, MengMeng; Uosaki, Noriko; Mouri, Kousuke

    2013-01-01

    Ubiquitous Learning Log (ULL) is defined as a digital record of what you have learned in the daily life using ubiquitous technologies. This paper focuses on how to capture learning experiences in our daily life for vocabulary learning. In our previous works, we developed a system named SCROLL (System for Capturing and Reminding Of Learning Log) in…

  4. An ecosystem model for tropical forest disturbance and selective logging

    Treesearch

    Maoyi Huang; Gregory P. Asner; Michael Keller; Joseph A. Berry

    2008-01-01

    [1] A new three-dimensional version of the Carnegie-Ames-Stanford Approach (CASA) ecosystem model (CASA-3D) was developed to simulate regional carbon cycling in tropical forest ecosystems after disturbances such as logging. CASA-3D has the following new features: (1) an alternative approach for calculating absorbed photosynthetically active radiation (APAR) using new...

  5. Logging residues on Douglas-fir region clearcuts—weights and volumes.

    Treesearch

    John D. Dell; Franklin R. Ward

    1971-01-01

    This paper presents the results of ground measurements of logging residue weights and volumes on 30 clearcut units in Douglas-fir forests of western Oregon and Washington. Additional information is given on quantities of material left as slash which might be utilized. These measurements were made on public lands, using a method developed in Canada.

  6. Effects of logging debris treatments on five-year development of competing vegetation and planted Douglas-fir

    Treesearch

    Timothy B. Harrington; Stephen H. Schoenholtz

    2010-01-01

    Although considerable research has focused on the influences of logging debris treatments on soil and forest regeneration responses, few studies have identified whether debris effects are mediated by associated changes in competing vegetation abundance. At sites near Matlock, Washington, and Molalla, Oregon, studies were initiated after timber harvest to quantify the...

  7. Estimating risk of debris slides after timber harvest in northwestern California

    Treesearch

    M. K. Neely; R. M. Rice

    1990-01-01

    Abstract - The risk of debris slides resulting from logging must often be appraised by foresters planning timber harvests. The Board of Forestry of the State of California commissioned the development of a Mass Movement Checklist to guide foresters making such appraisals. The classification accuracy of the Checklist, tested using 50 logging-related debris slides and 50...

  8. Discovering Decision Knowledge from Web Log Portfolio for Managing Classroom Processes by Applying Decision Tree and Data Cube Technology.

    ERIC Educational Resources Information Center

    Chen, Gwo-Dong; Liu, Chen-Chung; Ou, Kuo-Liang; Liu, Baw-Jhiune

    2000-01-01

    Discusses the use of Web logs to record student behavior that can assist teachers in assessing performance and making curriculum decisions for distance learning students who are using Web-based learning systems. Adopts decision tree and data cube information processing methodologies for developing more effective pedagogical strategies. (LRW)

  9. Testing and analysis of internal hardwood log defect prediction models

    Treesearch

    R. Edward Thomas

    2011-01-01

    The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...

  10. A logging residue "yield" table for Appalachian hardwoods

    Treesearch

    A. Jeff Martin

    1976-01-01

    An equation for predicting logging-residue volume per acre for Appalachian hardwoods was developed from data collected on 20 timber sales in national forests in West Virginia and Virginia. The independent variables of type-of-cut, products removed, basal area per acre, and stand age explained 95 percent of the variation in residue volume per acre. A "yield"...

  11. Impact of in-woods product merchandizing on profitable logging opportunities in southern upland hardwood forests

    Treesearch

    Dennis M. May; Chris B. LeDoux; John B. Tansey; Richard Widmann

    1994-01-01

    Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) units, were modified to demonstrate the impact of three in-woods product-merchandizing options on profitable logging opportunities in upland hardwood forests in 14 Southern...

  12. Effect of temperature on Acoustic Evaluation of standing trees and logs: Part 2: Field Investigation

    Treesearch

    Shan Gao; Xiping Wang; Lihai Wang; R. Bruce Allison

    2013-01-01

    The objectives of this study were to investigate the effect of seasonal temperature changes on acoustic velocity measured on standing trees and green logs and to develop models for compensating temperature differences because acoustic measurements are performed in different climates and seasons. Field testing was conducted on 20 red pine (Pinus resinosa...

  13. Statistically-Driven Visualizations of Student Interactions with a French Online Course Video

    ERIC Educational Resources Information Center

    Youngs, Bonnie L.; Prakash, Akhil; Nugent, Rebecca

    2018-01-01

    Logged tracking data for online courses are generally not available to instructors, students, and course designers and developers, and even if these data were available, most content-oriented instructors do not have the skill set to analyze them. Learning analytics, mined from logged course data and usually presented in the form of learning…

  14. Historic Mining and Agriculture as Indicators of Occurrence and Abundance of Widespread Invasive Plant Species

    PubMed Central

    Calinger, Kellen; Calhoon, Elisabeth; Chang, Hsiao-chi; Whitacre, James; Wenzel, John; Comita, Liza; Queenborough, Simon

    2015-01-01

    Anthropogenic disturbances often change ecological communities and provide opportunities for non-native species invasion. Understanding the impacts of disturbances on species invasion is therefore crucial for invasive species management. We used generalized linear mixed effects models to explore the influence of land-use history and distance to roads on the occurrence and abundance of two invasive plant species (Rosa multiflora and Berberis thunbergii) in a 900-ha deciduous forest in the eastern U.S.A., the Powdermill Nature Reserve. Although much of the reserve has been continuously forested since at least 1939, aerial photos revealed a variety of land-uses since then including agriculture, mining, logging, and development. By 2008, both R. multiflora and B. thunbergii were widespread throughout the reserve (occurring in 24% and 13% of 4417 10-m diameter regularly-placed vegetation plots, respectively) with occurrence and abundance of each varying significantly with land-use history. Rosa multiflora was more likely to occur in historically farmed, mined, logged or developed plots than in plots that remained forested, (log odds of 1.8 to 3.0); Berberis thunbergii was more likely to occur in plots with agricultural, mining, or logging history than in plots without disturbance (log odds of 1.4 to 2.1). Mining, logging, and agriculture increased the probability that R. multiflora had >10% cover while only past agriculture was related to cover of B. thunbergii. Proximity to roads was positively correlated with the occurrence of R. multiflora (a 0.26 increase in the log odds for every 1-m closer) but not B. thunbergii, and roads had no impact on the abundance of either species. Our results indicated that a wide variety of disturbances may aid the introduction of invasive species into new habitats, while high-impact disturbances such as agriculture and mining increase the likelihood of high abundance post-introduction. PMID:26046534
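
    The land-use effects reported in this record are on the log-odds scale of the mixed-effects models. A short sketch of how log odds translate into odds ratios and occurrence probabilities; the baseline probability below is hypothetical, chosen only to illustrate the arithmetic:

```python
import math

def prob_from_log_odds(log_odds):
    """Convert log odds to probability via the logistic function."""
    return 1.0 / (1.0 + math.exp(-log_odds))

# An effect of +1.8 on the log-odds scale multiplies the odds by e^1.8.
odds_ratio = math.exp(1.8)
print(round(odds_ratio, 1))  # → 6.0

# If the baseline occurrence probability in undisturbed plots were 0.10
# (hypothetical), the disturbed-plot probability would be:
base = 0.10
base_log_odds = math.log(base / (1 - base))
print(round(prob_from_log_odds(base_log_odds + 1.8), 2))  # → 0.4
```

    Because the transform is nonlinear, the same log-odds shift implies a different probability change at each baseline, which is why such results are reported on the log-odds scale.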

  15. Speech Enhancement Using Gaussian Scale Mixture Models

    PubMed Central

    Hao, Jiucang; Lee, Te-Won; Sejnowski, Terrence J.

    2011-01-01

    This paper presents a novel probabilistic approach to speech enhancement. Instead of a deterministic logarithmic relationship, we assume a probabilistic relationship between the frequency coefficients and the log-spectra. The speech model in the log-spectral domain is a Gaussian mixture model (GMM). The frequency coefficients obey a zero-mean Gaussian whose covariance equals the exponential of the log-spectra. This results in a Gaussian scale mixture model (GSMM) for the speech signal in the frequency domain, since the log-spectra can be regarded as scaling factors. The probabilistic relation between frequency coefficients and log-spectra allows both to be treated as random variables to be estimated from the noisy signals. Expectation-maximization (EM) was used to train the GSMM, and Bayesian inference was used to compute the posterior signal distribution. Because exact inference of this full probabilistic model is computationally intractable, we developed two approaches to enhance efficiency: the Laplace method and a variational approximation. The proposed methods were applied to enhance speech corrupted by Gaussian noise and speech-shaped noise (SSN). For both approximations, signals reconstructed from the estimated frequency coefficients provided a higher signal-to-noise ratio (SNR), while those reconstructed from the estimated log-spectra produced lower word recognition error rates because the log-spectra better fit the inputs to the recognizer. Our algorithms effectively reduced the SSN, which algorithms based on spectral analysis were not able to suppress. PMID:21359139

  16. Semi-quantitative assay for polyketide prymnesins isolated from Prymnesium parvum (Haptophyta) cultures.

    PubMed

    La Claire, J W; Manning, S R; Talarski, A E

    2015-08-01

    A fluorometric assay was developed to semi-quantify co-purified polyketide prymnesins-1 and -2 (PPs) from Prymnesium parvum cultures. Evaluations performed throughout the growth cycle of 5 practical salinity unit (PSU) cultures detected roughly 8-10 times more PPs in the culture medium (exotoxins) than in cells (endotoxins). The [exotoxin] remained stable and relatively low until post-log growth, when it increased significantly. On a per-cell basis, however, [exotoxin] declined throughout log phase and subsequently increased dramatically during late- and post-log phases. The [endotoxin] remained stable until late- and post-log phases, when it reached its highest level before declining sharply. Shaking cultures of strains from Texas, South Carolina, and the United Kingdom displayed dramatically different [exotoxin] during post-log decline. Cultures adapted to 30 PSU had significantly lower [exotoxin] over the course of cultivation than those grown at 5 PSU. Phosphate limitation enhanced [exotoxin] on a per-cell basis, especially in late- and post-log cultures. Media containing streptomycin exhibited a ~20% increase in [exotoxin] in post-log cultures vs. control treatments, but streptomycin had only negligible effects on endotoxin levels. Brefeldin A had minimal effects on [exotoxin], suggesting that the presence of PPs in the medium may be largely derived from cell lysis or some other passive means.

  17. Evaluation of the effects of ultraviolet light on bacterial contaminants inoculated into whole milk and colostrum, and on colostrum immunoglobulin G

    PubMed Central

    Pereira, R. V.; Bicalho, M. L.; Machado, V. S.; Lima, S.; Teixeira, A. G.; Warnick, L. D.; Bicalho, R. C.

    2015-01-01

    Raw milk and colostrum can harbor dangerous micro-organisms that can pose serious health risks for animals and humans. According to the USDA, more than 58% of calves in the United States are fed unpasteurized milk. The aim of this study was to evaluate the effect of UV light on reduction of bacteria in milk and colostrum, and on colostrum IgG. A pilot-scale UV light continuous (UVC) flow-through unit (45 J/cm2) was used to treat milk and colostrum. Colostrum and sterile whole milk were inoculated with Listeria innocua, Mycobacterium smegmatis, Salmonella serovar Typhimurium, Escherichia coli, Staphylococcus aureus, Streptococcus agalactiae, and Acinetobacter baumannii before being treated with UVC. During UVC treatment, samples were collected at 5 time points and bacteria were enumerated using selective media. The effect of UVC on IgG was evaluated using raw colostrum from a nearby dairy farm without the addition of bacteria. For each colostrum batch, samples were collected at several different time points and IgG was measured using ELISA. The UVC treatment of milk resulted in a significant final count (log cfu/mL) reduction of Listeria monocytogenes (3.2 ± 0.3 log cfu/mL reduction), Salmonella spp. (3.7 ± 0.2 log cfu/mL reduction), Escherichia coli (2.8 ± 0.2 log cfu/mL reduction), Staph. aureus (3.4 ± 0.3 log cfu/mL reduction), Streptococcus spp. (3.4 ± 0.4 log cfu/mL reduction), and A. baumannii (2.8 ± 0.2 log cfu/mL reduction). The UVC treatment of milk did not result in a significant final count (log cfu/mL) reduction for M. smegmatis (1.8 ± 0.5 log cfu/mL reduction). The UVC treatment of colostrum was significantly associated with a final reduction of bacterial count (log cfu/mL) of Listeria spp. (1.4 ± 0.3 log cfu/mL reduction), Salmonella spp. (1.0 ± 0.2 log cfu/mL reduction), and Acinetobacter spp. (1.1 ± 0.3 log cfu/mL reduction), but not of E. coli (0.5 ± 0.3 log cfu/mL reduction), Strep. agalactiae (0.8 ± 0.2 log cfu/mL reduction), and Staph. aureus (0.4 ± 0.2 log cfu/mL reduction). The UVC treatment of colostrum significantly decreased the IgG concentration, with an observed final mean IgG reduction of approximately 50%. Development of new methods to reduce bacterial contaminants in colostrum must take into consideration the barriers imposed by its opacity and organic components, and account for the incidental damage to IgG caused by manipulating colostrum. PMID:24582452
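
    The bacterial reductions in this record are expressed in log10 units. A quick sketch of that arithmetic (the plate counts below are hypothetical):

```python
import math

def log_reduction(cfu_before, cfu_after):
    """Log10 reduction between pre- and post-treatment plate counts (CFU/mL)."""
    return math.log10(cfu_before) - math.log10(cfu_after)

# A 3.2-log reduction means counts fell by a factor of 10^3.2 (~1585-fold),
# i.e. about 99.94% of the organisms were inactivated.
before = 1e6
after = before / 10**3.2
print(round(log_reduction(before, after), 1))  # → 3.2
print(round((1 - after / before) * 100, 2))    # → 99.94
```

    Reporting on a log scale keeps effect sizes comparable across inocula that span several orders of magnitude.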

  18. Diarrhea-predominant irritable bowel syndrome: creation of an electronic version of a patient-reported outcome instrument by conversion from a pen-and-paper version and evaluation of their equivalence.

    PubMed

    Delgado-Herrera, Leticia; Banderas, Benjamin; Ojo, Oluwafunke; Kothari, Ritesh; Zeiher, Bernhardt

    2017-01-01

    Subjects with diarrhea-predominant irritable bowel syndrome (IBS-D) experience abdominal cramping, bloating, pressure, and pain. In the absence of clinical biomarkers for IBS-D severity, evaluation of clinical therapy benefits depends on valid and reliable symptom assessments. A patient-reported outcome (PRO) instrument has been developed, comprising two questionnaires - the IBS-D Daily Symptom Diary and the IBS-D Symptom Event Log - suitable for clinical trials and real-world settings. This program aimed to support conversion of the instrument from pen-and-paper to electronic format. Digital technology (Android/iOS) and a traditional mode-of-administration study in the target population were used to migrate the validated pen-and-paper IBS-D PRO measure to an electronic format. Equivalence interviews, conducted in three waves, each had three parts: 1) conceptual equivalence testing between formats, 2) cognitive debriefing of the electronic version's report history, and 3) usability evaluation of the electronic version. After each interview wave, preliminary analyses were conducted and modifications made to the electronic version before the next wave. Final revisions were based on a full analysis of the equivalence interviews. The final analysis evaluated subjects' ability to read, understand, and provide meaningful responses to the instruments across both formats. Responses were classified according to conceptual equivalence between formats, and mobile-format usability was assessed with a questionnaire and open-ended probes. Equivalence interviews (n=25) demonstrated conceptual equivalence between formats. Cognitive debriefing of the mobile application showed that some subjects had difficulty with font/screen visibility and with understanding or reading some report-history charts and summary screens. To address these difficulties, minor revisions were made, and landscape orientation and zoom-in/zoom-out features were incorporated. 
This study indicates that the two administration modes are conceptually equivalent. Because the formats are conceptually equivalent, the psychometric reliability established for the pen-and-paper version extends to the electronic version. Subjects found that the mobile applications (Android/iOS) offered several advantages over the paper version, such as real-time assessment of their experience.

  19. Enabling High Spectral Resolution Thermal Imaging from CubeSat and MicroSatellite Platforms Using Uncooled Microbolometers and a Fabry-Perot Interferometer

    NASA Astrophysics Data System (ADS)

    Wright, R.; Lucey, P. G.; Crites, S.; Garbeil, H.; Wood, M.; Pilger, E. J.; Honniball, C.; Gabrieli, A.

    2016-12-01

    Measurements of reflectance or emittance in tens of narrow, contiguous wavebands allow for the derivation of laboratory-quality spectra remotely, from which the chemical composition and physical properties of targets can be determined. Although spaceborne (e.g. EO-1 Hyperion) hyperspectral data in the 0.4-2.5 micron (VSWIR) region are available, the provision of equivalent data in the long-wave infrared has lagged behind, there being no currently operational high spatial resolution LWIR imaging spectrometer on orbit. This is attributable to two factors. Firstly, Earth emits less light than it reflects, reducing the signal available to measure in the TIR, and secondly, instruments designed to measure (and spectrally decompose) this signal are more complex, massive, and expensive than their VSWIR counterparts, largely due to the need to cryogenically cool the detector and optics. However, this measurement gap needs to be filled, as LWIR data provide fundamentally different information than VSWIR measurements. The TIRCIS instrument (Thermal Infra-Red Compact Imaging Spectrometer), developed at the Hawaii Institute of Geophysics and Planetology, uses a Fabry-Perot interferometer, an uncooled microbolometer array, and push-broom scanning to acquire hyperspectral image data in the 8-14 micron spectral range. Radiometric calibration is provided by blackbody targets, while spectral calibration is achieved using monochromatic light sources. The instrument has a mass of <15 kg and dimensions of 53 cm × 25 cm × 22 cm, and has been designed to be compatible with integration into a micro-satellite platform. (A precursor to this instrument was launched onboard a 55 kg microsatellite as part of the ORS-4 mission in October 2015.) The optical design yields a 120 m ground sample size given an orbit of 500 km. Over the wavelength interval of 7.5 to 14 microns, up to 50 spectral samples are possible (the accompanying image shows a quartz spectrum composed of 17 spectral samples). 
Our performance model indicates signal-to-noise ratios of 400-800:1.
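
    Radiometric calibration against blackbody targets rests on Planck's law. A short sketch of the spectral radiance such a target supplies in the LWIR window; the 300 K temperature is an illustrative assumption, not a parameter stated above:

```python
import math

# Standard SI physical constants
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance in W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return a / b

# A ~300 K terrestrial scene peaks near 10 microns (Wien's law:
# 2898/300 ≈ 9.7 micron), which is why the 8-14 micron window is
# well matched to Earth's thermal emission.
print(planck_radiance(10e-6, 300.0))
```

    Comparing this emitted radiance with detector noise-equivalent power is the basis of signal-to-noise estimates like the 400-800:1 figure quoted above.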

  20. Practical solutions to implementing "Born Semantic" data systems

    NASA Astrophysics Data System (ADS)

    Leadbetter, A.; Buck, J. J. H.; Stacey, P.

    2015-12-01

    The concept of data being "Born Semantic" has been proposed in recent years as a Semantic Web analogue to the idea of data being "born digital" [1], [2]. Within the "Born Semantic" concept, data are captured digitally and, at a point close to the time of creation, are annotated with markup terms from Semantic Web resources (controlled vocabularies, thesauri, or ontologies). This allows heterogeneous data to be more easily ingested and amalgamated in near real-time due to the standards-compliant annotation of the data. In taking the "Born Semantic" proposal from concept to operation, a number of difficulties have been encountered. For example, although there are recognised methods such as Header, Dictionary, Triples (HDT) [3] for the compression, publication, and dissemination of large volumes of triples, these systems are not practical to deploy in the field on low-powered (both electrically and computationally) devices. Similarly, it is not practical for instruments to output fully formed, semantically annotated data files if they are designed to be plugged into a modular system with the data logged centrally in the field, as is the case on Argo floats and oceanographic gliders, where internal bandwidth becomes an issue [2]. In light of these issues, this presentation will concentrate on pragmatic solutions being developed to the problem of generating Linked Data in near real-time systems. Specific examples will be highlighted from the European Commission SenseOCEAN project, where Linked Data systems are being developed for autonomous underwater platforms, and from work being undertaken in the streaming of data from the Irish Galway Bay Cable Observatory initiative. Further, developments of a set of tools for the Logstash-Elasticsearch software ecosystem to allow the storing and retrieval of Linked Data will be introduced.

    References
    [1] A. Leadbetter & J. Fredericks, We have "born digital" - now what about "born semantic"?, European Geosciences Union General Assembly, 2014.
    [2] J. Buck & A. Leadbetter, Born semantic: linking data from sensors to users and balancing hardware limitations with data standards, European Geosciences Union General Assembly, 2015.
    [3] J. Fernández et al., Binary RDF Representation for Publication and Exchange (HDT), Web Semantics 19:22-41, 2013.
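
    A minimal sketch of point-of-capture annotation in the "Born Semantic" style, emitting plain N-Triples so that no RDF library is needed on a low-powered field device. The platform URI, the SOSA observation predicates, and the NERC vocabulary term are illustrative assumptions, not details taken from the systems described above:

```python
from datetime import datetime, timezone

def annotate_reading(sensor_uri, property_uri, value):
    """Emit N-Triples lines annotating a single sensor observation."""
    now = datetime.now(timezone.utc)
    # Mint an observation URI from the sensor URI and a capture timestamp
    obs = f"{sensor_uri}/obs/{int(now.timestamp())}"
    return [
        f"<{obs}> <http://www.w3.org/ns/sosa/madeBySensor> <{sensor_uri}> .",
        f"<{obs}> <http://www.w3.org/ns/sosa/observedProperty> <{property_uri}> .",
        f'<{obs}> <http://www.w3.org/ns/sosa/hasSimpleResult> "{value}" .',
        f'<{obs}> <http://www.w3.org/ns/sosa/resultTime> '
        f'"{now.isoformat()}"^^<http://www.w3.org/2001/XMLSchema#dateTime> .',
    ]

# Hypothetical glider temperature reading, tagged with a NERC P01 vocabulary URI
for line in annotate_reading(
    "http://example.org/glider/42",
    "http://vocab.nerc.ac.uk/collection/P01/current/TEMPPR01/",
    8.4,
):
    print(line)
```

    Emitting flat triples line-by-line keeps the on-device footprint small; compression and aggregation into HDT or a triple store can then happen shoreside.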

Top