Sample records for "technique involves measuring"

  1. WE-A-BRE-01: Debate: To Measure or Not to Measure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moran, J; Miften, M; Mihailidis, D

    2014-06-15

    Recent studies have highlighted some of the limitations of patient-specific pre-treatment IMRT QA measurements with respect to assessing plan deliverability. Pre-treatment QA measurements are frequently performed with detectors in phantoms that do not involve any patient heterogeneities or with an EPID without a phantom. Other techniques have been developed where measurement results are used to recalculate the patient-specific dose volume histograms. Measurements continue to play a fundamental role in understanding the initial and continued performance of treatment planning and delivery systems. Less attention has been focused on the role of computational techniques in a QA program, such as calculation with independent dose calculation algorithms or recalculation of the delivery with machine log files or EPID measurements. This session will explore the role of pre-treatment measurements compared to other methods such as computational and transit dosimetry techniques. Efficiency and practicality of the two approaches will also be presented and debated. The speakers will present a history of IMRT quality assurance and debate each other regarding which types of techniques are needed today and for future quality assurance. Examples will be shared of situations where overall quality needed to be assessed with calculation techniques in addition to measurements. Elements where measurements continue to be crucial, such as for a thorough end-to-end test involving measurement, will be discussed. Operational details that can reduce the gamma tool effectiveness and accuracy for patient-specific pre-treatment IMRT/VMAT QA will be described. Finally, a vision for the future of IMRT and VMAT plan QA will be discussed from a safety perspective. Learning Objectives: Understand the advantages and limitations of measurement and calculation approaches for pre-treatment measurements for IMRT and VMAT planning. Learn about the elements of a balanced quality assurance program involving modulated techniques. Learn how to use tools and techniques such as an end-to-end test to enhance your IMRT and VMAT QA program.

  2. Mass balance for on-line alphakLa estimation in activated sludge oxidation ditch.

    PubMed

    Chatellier, P; Audic, J M

    2001-01-01

    The capacity of an aeration system to transfer oxygen to a given activated sludge oxidation ditch is characterised by the alphakLa parameter. This parameter is difficult to measure under normal plant working conditions. Usually this measurement involves off-gas techniques or a static mass balance. Therefore an on-line technique has been developed and tested in order to evaluate alphakLa. This technique deduces alphakLa from data analysis of low-cost sensor measurements: two flow meters and one oxygen probe. It involves a dynamic mass balance applied to aeration cycles selected according to given criteria. This technique has been applied to a wastewater treatment plant for four years. Significant variations of the alphakLa values have been detected as the number of blowers changed. The technique has also been applied to another plant for two months.
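
    A minimal illustration of the dynamic dissolved-oxygen mass balance that underlies this record (not the authors' algorithm, which also uses the flow-meter data and cycle-selection criteria): the sketch below fits alphakLa and a constant oxygen uptake rate (OUR) to a DO time series by least squares. Python with NumPy is assumed, and all names and numbers are hypothetical.

      import numpy as np

      def estimate_alpha_kla(t, c_do, c_sat):
          """Estimate alpha*kLa from a dissolved-oxygen time series recorded
          during an aeration cycle, assuming the lumped mass balance
              dC/dt = alpha_kLa * (C_sat - C) - OUR
          with the oxygen uptake rate (OUR) treated as constant over the cycle.
          t [h], c_do [mg/L], c_sat [mg/L]; returns (alpha_kLa [1/h], OUR [mg/L/h]).
          """
          dcdt = np.gradient(c_do, t)            # numerical derivative of DO
          deficit = c_sat - c_do                 # oxygen-transfer driving force
          # Linear regression: dC/dt = alpha_kLa * deficit - OUR
          A = np.column_stack([deficit, -np.ones_like(deficit)])
          (alpha_kla, our), *_ = np.linalg.lstsq(A, dcdt, rcond=None)
          return alpha_kla, our

      # Synthetic aeration cycle: alpha_kLa = 6 /h, OUR = 30 mg/L/h, C_sat = 9 mg/L
      t = np.linspace(0.0, 1.0, 200)
      c_eq = 9.0 - 30.0 / 6.0
      c = c_eq + (2.0 - c_eq) * np.exp(-6.0 * t)
      print(estimate_alpha_kla(t, c, 9.0))       # approximately (6.0, 30.0)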

  3. 40 CFR 86.1308-84 - Dynamometer and engine equipment specifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... technique involves the calibration of a master load cell (i.e., dynamometer case load cell). This... hydraulically actuated precalibrated master load cell. This calibration is then transferred to the flywheel torque measuring device. The technique involves the following steps: (i) A master load cell shall be...

  4. Research relative to weather radar measurement techniques

    NASA Technical Reports Server (NTRS)

    Smith, Paul L.

    1992-01-01

    Research relative to weather radar measurement techniques, which involves some investigations related to measurement techniques applicable to meteorological radar systems in Thailand, is reported. A major part of the activity was devoted to instruction and discussion with Thai radar engineers, technicians, and meteorologists concerning the basic principles of radar meteorology and applications to specific problems, including measurement of rainfall and detection of wind shear/microburst hazards. Weather radar calibration techniques were also considered during this project. Most of the activity took place during two visits to Thailand, in December 1990 and February 1992.

  5. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating the existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.

  6. Measurement of Human Blood and Plasma Volumes

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Szalkay, H. G. H.

    1987-01-01

    Report reviews techniques for measuring blood-plasma volume in humans. Common technique of using radioactive iodine isotope to label plasma albumin involves unwarranted risks from low-level radiation. Report emphasizes techniques using Evans-blue-dye (T-1824) labeling of albumin, hematocrit or hemoglobin/hematocrit measurements, or blood densitometry. In Evans-blue-dye technique, plasma volume determined from decrease in dye concentration occurring after small amount of dye solution injected into circulatory system. Subjection of Evans blue dye to test for carcinogenicity gave negative results.
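
    As a hedged illustration of the indicator-dilution principle behind the Evans-blue-dye technique (a simplified sketch, not the full clinical protocol): post-mixing dye concentrations are back-extrapolated to the injection time, and plasma volume is the injected dose divided by that initial concentration. All values below are hypothetical; Python with NumPy is assumed.

      import numpy as np

      def plasma_volume_evans_blue(t_min, conc_mg_per_ml, dose_mg):
          """Back-extrapolate the roughly exponential post-mixing decay of dye
          concentration to t = 0 and divide the injected dose by C(0)."""
          slope, intercept = np.polyfit(t_min, np.log(conc_mg_per_ml), 1)
          c0 = np.exp(intercept)         # extrapolated concentration at injection
          return dose_mg / c0            # plasma volume in mL

      # Hypothetical samples drawn 10-40 min after injecting 20 mg of dye
      t = np.array([10.0, 20.0, 30.0, 40.0])
      c = 0.00667 * np.exp(-0.002 * t)   # mg/mL, synthetic decay
      print(round(plasma_volume_evans_blue(t, c, dose_mg=20.0)))  # ~3000 mL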

  7. Neutron total cross section measurement at WNR. [215 to 250 MeV experimental techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lisowski, P.W.; Moore, M.S.; Morgan, G.L.

    1979-01-01

    The techniques involved in measuring fast-neutron total cross sections at the Weapons Neutron Facility (WNR) of the Los Alamos Scientific Laboratory are described. Results of total cross section measurements on natural carbon covering the range 2.5 to 250 MeV are presented. 16 references.

  8. COAL SULFUR MEASUREMENTS

    EPA Science Inventory

    The report describes a new technique for sulfur forms analysis based on low-temperature oxygen plasma ashing. The technique involves analyzing the low-temperature plasma ash by modified ASTM techniques after selectively removing the organic material. The procedure has been tested...

  9. Momentum--"Evaluating Your Marketing Program: Measuring and Tracking Techniques."

    ERIC Educational Resources Information Center

    Meservey, Lynne D.

    1990-01-01

    Suggests 10 tracking techniques for evaluating marketing performance. Techniques involve utilization rate, inquiry and source of inquiry tracking, appointment and interview tracking, enrollment conversion, cost per inquiry and per enrollment, retention rate, survey results, and "mystery shopper." (RJC)

  10. Exploring Uncertainty with Projectile Launchers

    ERIC Educational Resources Information Center

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

  11. General solution for high dynamic range three-dimensional shape measurement using the fringe projection technique

    NASA Astrophysics Data System (ADS)

    Feng, Shijie; Zhang, Yuzhen; Chen, Qian; Zuo, Chao; Li, Rubin; Shen, Guochen

    2014-08-01

    This paper presents a general solution for realizing high dynamic range three-dimensional (3-D) shape measurement based on fringe projection. Three concrete techniques are involved in the solution for measuring objects with a large range of reflectivity (LRR) or with shiny specular surfaces. In the first technique, the measured surface reflectivities are sub-divided into several groups based on their histogram distribution, and the optimal exposure time for each group is then predicted adaptively so that both bright and dark areas on the measured surface can be handled without compromise. Phase-shifted images are then captured at the calculated exposure times and a composite phase-shifted image is generated by extracting the optimally exposed pixels from the raw fringe images. The second technique adds two orthogonal polarizers, placed in front of the camera and the projector respectively, to the first technique, and the third is developed by combining the second technique with the strategy of properly altering the angle between the transmission axes of the two polarizers. Experimental results show that the first technique can effectively improve the measurement accuracy of diffuse objects with LRR, the second is capable of measuring objects with weak specular reflection (WSR: e.g. shiny plastic surfaces), and the third can precisely inspect surfaces with strong specular reflection (SSR: e.g. highlights on aluminum alloy). Further, more complex scenes, such as those with LRR and WSR, or even those simultaneously involving LRR, WSR and SSR, can be measured accurately by the proposed solution.
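
    As a rough sketch of the multi-exposure idea in the first technique above (simplified to a single fringe image and not the authors' exact selection rule), the code below keeps, for every pixel, the brightest non-saturated value across a stack of exposures, so dark regions come from long exposures and shiny regions from short ones. Python with NumPy is assumed; the data are synthetic.

      import numpy as np

      def composite_fringe(stack, saturation=250):
          """stack: (n_exposures, H, W) 8-bit images, shortest exposure first.
          Keep the brightest value below the saturation threshold per pixel;
          fall back to the shortest exposure where every exposure saturates."""
          masked = np.where(stack < saturation, stack, -1)  # drop saturated pixels
          best = masked.max(axis=0)                         # brightest valid exposure
          return np.where(best >= 0, best, stack[0]).astype(np.uint8)

      # Hypothetical 3-exposure stack (exposure ratios 1:2:4) of a 4x4 scene
      rng = np.random.default_rng(0)
      base = rng.integers(0, 120, (1, 4, 4))
      stack = np.clip(base * np.array([1, 2, 4])[:, None, None], 0, 255)
      print(composite_fringe(stack))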

  12. Archimedes Revisited: A Faster, Better, Cheaper Method of Accurately Measuring the Volume of Small Objects

    ERIC Educational Resources Information Center

    Hughes, Stephen W.

    2005-01-01

    A little-known method of measuring the volume of small objects based on Archimedes' principle is described, which involves suspending an object in a water-filled container placed on electronic scales. The suspension technique is a variation on the hydrostatic weighing technique used for measuring volume. The suspension method was compared with two…

  13. Measurement techniques and applications of charge transfer to aerospace research

    NASA Technical Reports Server (NTRS)

    Smith, A.

    1978-01-01

    A technique of developing high-velocity low-intensity neutral gas beams for use in aerospace research problems is described. This technique involves ionization of gaseous species with a mass spectrometer and focusing the resulting primary ion beam into a collision chamber containing a static gas at a known pressure and temperature. Equations are given to show how charge-transfer cross sections are obtained from a total-current measurement technique. Important parameters are defined for the charge-transfer process.

  14. Interferometric Methods of Measuring Refractive Indices and Double-Refraction of Fibres.

    ERIC Educational Resources Information Center

    Hamza, A. A.; El-Kader, H. I. Abd

    1986-01-01

    Presents two methods used to measure the refractive indices and double-refraction of fibers. Experiments are described, with one involving the use of Pluta microscope in the double-beam interference technique, the other employing the multiple-beam technique. Immersion liquids are discussed that can be used in the experiments. (TW)

  15. APPLICATION OF ADVANCED IN VITRO TECHNIQUES TO MEASURE, UNDERSTAND AND PREDICT THE KINETICS AND MECHANISMS OF XENOBIOTIC METABOLISM

    EPA Science Inventory

    We have developed a research program in metabolism that involves numerous collaborators across EPA as well as other federal and academic labs. A primary goal is to develop and apply advanced in vitro techniques to measure, understand and predict the kinetics and mechanisms of xen...

  16. Aperture synthesis for microwave radiometers in space

    NASA Technical Reports Server (NTRS)

    Levine, D. M.; Good, J. C.

    1983-01-01

    A technique is described for obtaining passive microwave measurements from space with high spatial resolution for remote sensing applications. The technique involves measuring the product of the signal from pairs of antennas at many different antenna spacings, thereby mapping the correlation function of antenna voltage. The intensity of radiation at the source can be obtained from the Fourier transform of this correlation function. Theory is presented to show how the technique can be applied to large extended sources such as the Earth when observed from space. Details are presented for a system with uniformly spaced measurements.
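
    The core relationship in this record is that the correlated antenna-pair outputs sample the Fourier transform of the source brightness, so an inverse transform recovers the scene. The noiseless, one-dimensional sketch below illustrates only that Fourier relationship with uniformly spaced samples (it is not the full imaging chain described in the paper); Python with NumPy is assumed.

      import numpy as np

      n = 64
      x = np.linspace(-1.0, 1.0, n)                  # angular coordinate (arbitrary units)
      brightness = np.exp(-((x - 0.2) / 0.15) ** 2)  # synthetic source scene

      # "Measured" visibilities: correlation samples at uniformly spaced baselines
      visibility = np.fft.fft(brightness)

      # Brightness reconstructed from the visibility samples
      reconstructed = np.fft.ifft(visibility).real
      print(np.allclose(reconstructed, brightness))  # True in this idealized case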

  17. A review on creatinine measurement techniques.

    PubMed

    Mohabbati-Kalejahi, Elham; Azimirad, Vahid; Bahrami, Manouchehr; Ganbari, Ahmad

    2012-08-15

    This paper reviews the entire recent global tendency for creatinine measurement. Creatinine biosensors involve complex relationships between biology and micro-mechatronics to which the blood is subjected. Comparison between new and old methods shows that new techniques (e.g. Molecular Imprinted Polymer based algorithms) are better than old methods (e.g. ELISA) in terms of stability and linear range. All methods and their details for serum, plasma, urine and blood samples are surveyed. They are categorized into five main algorithms: optical, electrochemical, impedometrical, Ion Selective Field-Effect Transistor (ISFET) based technique and chromatography. Response time, detection limit, linear range and selectivity of reported sensors are discussed. The potentiometric measurement technique has the lowest response time of 4-10 s, and the lowest detection limit of 0.28 nmol L(-1) belongs to the chromatographic technique. Comparison between various techniques of measurement indicates that the best selectivity belongs to MIP based and chromatographic techniques. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Comments on: Accuracy of Raman Lidar Water Vapor Calibration and its Applicability to Long-Term Measurements

    NASA Technical Reports Server (NTRS)

    Whiteman, David N.; Venable, Demetrius; Landulfo, Eduardo

    2012-01-01

    In a recent publication, LeBlanc and McDermid proposed a hybrid calibration technique for Raman water vapor lidar involving a tungsten lamp and radiosondes. Measurements made with the lidar telescope viewing the calibration lamp were used to stabilize the lidar calibration determined by comparison with radiosonde. The technique provided a significantly more stable calibration constant than radiosondes used alone. The technique involves the use of a calibration lamp in a fixed position in front of the lidar receiver aperture. We examine this configuration and find that such a configuration likely does not properly sample the full lidar system optical efficiency. While the technique is a useful addition to the use of radiosondes alone for lidar calibration, it is important to understand the scenarios under which it will not provide an accurate quantification of system optical efficiency changes. We offer examples of these scenarios.

  19. Laser Doppler measurement techniques for spacecraft

    NASA Technical Reports Server (NTRS)

    Kinman, Peter W.; Gagliardi, Robert M.

    1986-01-01

    Two techniques are proposed for using laser links to measure the relative radial velocity of two spacecraft. The first technique determines the relative radial velocity from a measurement of the two-way Doppler shift on a transponded radio-frequency subcarrier. The subcarrier intensity-modulates reciprocating laser beams. The second technique determines the relative radial velocity from a measurement of the two-way Doppler shift on an optical frequency carrier which is transponded between spacecraft using optical Costas loops. The first technique might be used in conjunction with noncoherent optical communications, while the second technique is compatible with coherent optical communications. The first technique simultaneously exploits the diffraction advantage of laser beams and the maturity of radio-frequency phase-locked loop technology. The second technique exploits both the diffraction advantage of laser beams and the large Doppler effect at optical frequencies. The second technique has the potential for greater accuracy; unfortunately, it is more difficult to implement since it involves optical Costas loops.

  20. Computer assessment of atherosclerosis from angiographic images

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Brooks, S. H.; Crawford, D. W.; Cashin, W. L.

    1982-01-01

    A computer method for detection and quantification of atherosclerosis from angiograms has been developed and used to measure lesion change in human clinical trials. The technique involves tracking the vessel edges and measuring individual lesions as well as the overall irregularity of the arterial image. Application of the technique to conventional arterial-injection femoral and coronary angiograms is outlined, and an experimental study to extend the technique to analysis of intravenous angiograms of the carotid and coronary arteries is described.

  1. Blade pressure measurements

    NASA Astrophysics Data System (ADS)

    Chivers, J. W. H.

    Three measurement techniques which enable rotating pressures to be measured during the normal operation of a gas turbine or a component test rig are described. The first technique was developed specifically to provide steady and transient blade surface pressure data to aid both fan flutter research and general fan performance development. This technique involves the insertion of miniature high frequency response pressure transducers into the fan blades of a large civil gas turbine. The other two techniques were developed to measure steady rotating pressures inside and on the surface of engine or rig turbine blades and also rotating pressures in cooling feed systems. These two low frequency response systems are known as the "pressure pineapple" (a name which resulted from the shape of the original prototype) and the rotating scanivalve.

  2. Observations of the global structure of the stratosphere and mesosphere with sounding rockets and with remote sensing techniques from satellites

    NASA Technical Reports Server (NTRS)

    Heath, D. F.; Hilsenrath, E.; Krueger, A. J.; Nordberg, W.; Prabhakara, C.; Theon, J. S.

    1972-01-01

    Brief descriptions are given of the techniques involved in determining the global structure of the mesosphere and stratosphere based on sounding rocket observations and satellite remotely sensed measurements.

  3. A technique for fast and accurate measurement of hand volumes using Archimedes' principle.

    PubMed

    Hughes, S; Lau, J

    2008-03-01

    A new technique for measuring hand volumes using Archimedes' principle is described. The technique involves the immersion of a hand in a water container placed on an electronic balance. The volume is given by the change in weight divided by the density of water. This technique was compared with the more conventional technique of immersing an object in a container with an overflow spout and collecting and weighing the volume of overflow water. The hand volume of two subjects was measured. Hand volumes were 494 +/- 6 ml and 312 +/- 7 ml for the immersion method and 476 +/- 14 ml and 302 +/- 8 ml for the overflow method for the two subjects respectively. Using plastic test objects, the mean difference between the actual and measured volume was -0.3% and 2.0% for the immersion and overflow techniques respectively. This study shows that hand volumes can be obtained more quickly with the immersion method than with the overflow method. The technique could find an application in clinics where frequent hand volume measurements are required.
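
    A minimal sketch of the calculation behind the immersion method (assuming a water density near room temperature; the readings are hypothetical, chosen to resemble the first subject above), written in Python:

      RHO_WATER = 0.998  # g/mL, assumed water density at ~21 degC

      def immersed_volume_ml(reading_before_g, reading_during_g, rho=RHO_WATER):
          """Hand volume from the rise in the balance reading on immersion:
          the balance registers the buoyant force, so volume = delta_mass / rho."""
          return (reading_during_g - reading_before_g) / rho

      print(immersed_volume_ml(2500.0, 2993.0))   # ~494 mL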

  4. Enterprise Professional Development--Evaluating Learning

    ERIC Educational Resources Information Center

    Murphy, Gerald A.; Calway, Bruce A.

    2010-01-01

    Whilst professional development (PD) is an activity required by many regulatory authorities, the value that enterprises obtain from PD is often unknown, particularly when it involves development of knowledge. This paper discusses measurement techniques and processes and provides a review of established evaluation techniques, highlighting…

  5. Determination of gas volume trapped in a closed fluid system

    NASA Technical Reports Server (NTRS)

    Hunter, W. F.; Jolley, J. E.

    1971-01-01

    Technique involves extracting known volume of fluid and measuring system pressure before and after extraction; volume of entrapped gas is then computed. Formula derived from ideal gas laws is basis of this method. Technique is applicable to thermodynamic cycles and hydraulic systems.
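
    A one-line sketch of the ideal-gas calculation this record describes (assuming an isothermal process, incompressible liquid, and absolute pressures; all numbers are hypothetical), in Python:

      def trapped_gas_volume(p1, p2, extracted_volume):
          """Boyle's law: p1 * V = p2 * (V + extracted_volume), solved for the
          entrapped gas volume V after withdrawing a known fluid volume."""
          return p2 * extracted_volume / (p1 - p2)

      # 300 kPa falls to 250 kPa (absolute) after withdrawing 20 mL of fluid
      print(trapped_gas_volume(300e3, 250e3, 20.0))   # 100 mL of entrapped gas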

  6. Appraising Two Techniques for Increasing the Honesty of Students' Answers to Self-Report Assessment Devices.

    ERIC Educational Resources Information Center

    Popham, W. James

    1993-01-01

    Techniques for increasing honesty of student self-report measures, the inaccessible coding system and the alphabet-soup response form, were investigated in a study involving over 1,200 high school students. Both techniques were regarded favorably by students. Because both enhance anonymity, it appears that they could be used jointly. (SLD)

  7. A drag measurement technique for free piston shock tunnels

    NASA Technical Reports Server (NTRS)

    Sanderson, S. R.; Simmons, J. M.; Tuttle, S. L.

    1991-01-01

    A new technique is described for measuring drag with 100-microsecond rise time on a nonlifting model in a free piston shock tunnel. The technique involves interpretation of the stress waves propagating within the model and its support. A finite element representation and spectral methods are used to obtain a mean square optimal estimate of the time history of the aerodynamic loading. Thus, drag is measured instantaneously and the previous restriction caused by the mechanical time constant of balances is overcome. The effectiveness of the balance is demonstrated by measuring the drag on cones with 5 and 15 deg semi-vertex angles in nominally Mach 5.6 flow with stagnation enthalpies from 2.6 to 33 MJ/kg.

  8. Time-resolved brightness measurements by streaking

    NASA Astrophysics Data System (ADS)

    Torrance, Joshua S.; Speirs, Rory W.; McCulloch, Andrew J.; Scholten, Robert E.

    2018-03-01

    Brightness is a key figure of merit for charged particle beams, and time-resolved brightness measurements can elucidate the processes involved in beam creation and manipulation. Here we report on a simple, robust, and widely applicable method for the measurement of beam brightness with temporal resolution by streaking one-dimensional pepperpots, and demonstrate the technique to characterize electron bunches produced from a cold-atom electron source. We demonstrate brightness measurements with 145 ps temporal resolution and a minimum resolvable emittance of 40 nm rad. This technique provides an efficient method of exploring source parameters and will prove useful for examining the efficacy of techniques to counter space-charge expansion, a critical hurdle to achieving single-shot imaging of atomic scale targets.

  9. Statewide planning scenario synthesis : transportation congestion measurement and management.

    DOT National Transportation Integrated Search

    2005-09-01

    This study is a review of current practices in 13 states to: (1) measure traffic congestion and its costs; and (2) manage congestion with programs and techniques that do not involve the building of new highway capacity. In regard to the measures of c...

  10. An improved dual-frequency technique for the remote sensing of ocean currents and wave spectra

    NASA Technical Reports Server (NTRS)

    Schuler, D. L.; Eng, W. P.

    1984-01-01

    A two frequency microwave radar technique for the remote sensing of directional ocean wave spectra and surface currents is investigated. This technique is conceptually attractive because its operational physical principle involves a spatial electromagnetic scattering resonance with a single, but selectable, long gravity wave. Multiplexing of signals having different spacing of the two transmitted frequencies allows measurements of the entire long wave ocean spectrum to be carried out. A new scatterometer is developed and experimentally tested which is capable of making measurements having much larger signal/background values than previously possible. This instrument couples the resonance technique with coherent, frequency agility radar capabilities. This scatterometer is presently configured for supporting a program of surface current measurements.
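
    The resonance at the heart of the two-frequency technique selects the long gravity wave whose wavelength matches the beat of the two transmitted frequencies. A hedged sketch of that selection rule (the sin(theta) incidence factor is an assumption for non-grazing geometry, and the numbers are illustrative) in Python:

      import numpy as np

      C = 2.998e8  # speed of light, m/s

      def resonant_wavelength(delta_f_hz, incidence_deg=90.0):
          """Ocean wavelength resonant with a frequency spacing delta_f:
          Lambda = c / (2 * delta_f * sin(theta_incidence))."""
          return C / (2.0 * delta_f_hz * np.sin(np.radians(incidence_deg)))

      # Multiplexing different spacings probes different long waves
      for df in (1e6, 2e6, 5e6):   # 1, 2, 5 MHz
          print(f"delta_f = {df/1e6:.0f} MHz -> resonant wave ~ {resonant_wavelength(df):.0f} m")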

  11. Ethylene Trace-gas Techniques for High-speed Flows

    NASA Technical Reports Server (NTRS)

    Davis, David O.; Reichert, Bruce A.

    1994-01-01

    Three applications of the ethylene trace-gas technique to high-speed flows are described: flow-field tracking, air-to-air mixing, and bleed mass-flow measurement. The technique involves injecting a non-reacting gas (ethylene) into the flow field and measuring the concentration distribution in a downstream plane. From the distributions, information about flow development, mixing, and mass-flow rates can be determined. The trace-gas apparatus and special considerations for use in high-speed flow are discussed. A description of each application, including uncertainty estimates, is followed by a demonstrative example.

  12. Measurement of Density, Sound Velocity, Surface Tension, and Viscosity of Freely Suspended Supercooled Liquids

    NASA Technical Reports Server (NTRS)

    Trinh, E. H.

    1995-01-01

    Non-contact methods have been implemented in conjunction with levitation techniques to carry out the measurement of the macroscopic properties of liquids significantly cooled below their nominal melting point. Free suspension of the sample and remote methods allow the deep excursion into the metastable liquid state and the determination of its thermophysical properties. We used this approach to investigate common substances such as water, o-terphenyl, and succinonitrile, as well as higher temperature melts such as molten indium, aluminum and other metals. Although these techniques have thus far involved ultrasonic, electromagnetic, and more recently electrostatic levitation, we restrict our attention to ultrasonic methods in this paper. The resulting magnitude of maximum thermal supercooling achieved has ranged between 10 and 15% of the absolute temperature of the melting point for the materials mentioned above. The physical properties measurement methods have been mostly novel approaches, and the typical accuracy achieved has not yet matched that of the standard equivalent techniques involving contained samples and invasive probing. They are currently being refined, however, as the levitation techniques become more widespread, and as we gain a better understanding of the physics of levitated liquid samples.

  13. Measurement of density, sound velocity, surface tension, and viscosity of freely suspended supercooled liquids

    NASA Astrophysics Data System (ADS)

    Trinh, E. H.; Ohsaka, K.

    1995-03-01

    Noncontact methods have been implemented in conjunction with levitation techniques to carry out the measurement of the macroscopic properties of liquids significantly cooled below their nominal melting point. Free suspension of the sample and remote methods allow the deep excursion into the metastable liquid state and the determination of its thermophysical properties. We used this approach to investigate common substances such as water, o-terphenyl, and succinonitrile, as well as higher temperature melts such as molten indium, aluminum, and other metals. Although these techniques have thus far involved ultrasonic, electromagnetic, and more recently electrostatic levitation, we restrict our attention to ultrasonic methods in this paper. The resulting magnitude of maximum thermal supercooling achieved has ranged between 10% and 15% of the absolute temperature of the melting point for the materials mentioned above. The methods for measuring the physical properties have been mostly novel approaches, and the typical accuracy achieved has not yet matched the standard equivalent techniques involving contained samples and invasive probing. They are currently being refined, however, as the levitation techniques become more widespread and as we gain a better understanding of the physics of levitated liquid samples.

  14. Concepts and techniques for ultrasonic evaluation of material mechanical properties

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic methods that can be used for assessing material strength are reviewed. Emerging technology involving advanced ultrasonic techniques and associated measurements is described. It is shown that ultrasonic NDE is particularly useful in this area because it involves mechanical elastic waves that are strongly modulated by morphological factors that govern mechanical strength and also dynamic failure modes. These aspects of ultrasonic NDE are described in conjunction with advanced approaches and theoretical concepts for signal acquisition and analysis for materials characterization. It is emphasized that the technology is in its infancy and that much effort is still required before the techniques and concepts can be transferred from laboratory to field conditions.

  15. Acoustical properties of materials and muffler configurations for the 80 by 120 foot wind tunnel

    NASA Technical Reports Server (NTRS)

    Scharton, T. D.; Sneddon, M. D.

    1977-01-01

    Techniques for measuring the impedance of the muffler configurations and of porous plates with grazing flow were investigated and changes in the configuration parameters to enhance acoustic performance are explored. The feasibility of a pulse reflection technique for measuring the impedance of built-up structures in situ was demonstrated. A second technique involving the use of an open-end impedance tube with grazing flow was used to obtain detailed design data for the perforated plate configuration. Acoustic benefits associated with configuration changes such as curving the baffles, spacing and staggering baffle partitions, and techniques for alleviating baffle self-generated noise are described.

  16. Description and Evaluation of a Measurement Technique for Assessment of Performing Gender

    PubMed Central

    Harris, Kathleen Mullan; Halpern, Carolyn Tucker

    2016-01-01

    The influence of masculinity and femininity on behaviors and outcomes has been extensively studied in social science research using various measurement strategies. In the present paper, we describe and evaluate a measurement technique that uses existing survey items to capture the extent to which an individual behaves similarly to their same-gender peers. We use data from the first four waves of The National Longitudinal Study of Adolescent to Adult Health (Add Health), a nationally representative sample of adolescents (age 12–18) in the United States who were re-interviewed at ages 13–19, 18–26, and 24–32. We estimate split-half reliability and provide evidence that supports the validity of this measurement technique. We demonstrate that the resulting measure does not perform as a trait measure and is associated with involvement in violent fights, a pattern consistent with theory and empirical findings. This measurement technique represents a novel approach for gender researchers with the potential for expanding our current knowledge base. PMID:28630528

  17. Organizational Decision Making

    DTIC Science & Technology

    1975-08-01

    ... the lack of formal techniques typically used by large organizations, digress on the advantages of formal over informal ... optimization; for example, one might do a number of optimization calculations, each time using a different measure of effectiveness as the optimized ... final decision. The next level of computer application involves the use of computerized optimization techniques. Optimization ...

  18. New Technique Identifies First Events in Translocations | Center for Cancer Research

    Cancer.gov

    A novel technique that enables scientists to measure and document tumor-inducing changes in genomic DNA is providing new insight into the earliest events involved in the formation of leukemias, lymphomas and sarcomas, and could potentially lead to the discovery of ways to stop those events.

  19. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1977-01-01

    Models, measures, and techniques were developed for evaluating the effectiveness of aircraft computing systems. The concept of effectiveness involves aspects of system performance, reliability, and worth. Specifically, a detailed model hierarchy was developed at the mission, functional-task, and computational-task levels. An appropriate class of stochastic models was investigated to serve as bottom-level models in the hierarchical scheme. A unified measure of effectiveness called 'performability' was defined and formulated.

  20. Use of an ultrasonic-acoustic technique for nondestructive evaluation of fiber composite strength

    NASA Technical Reports Server (NTRS)

    Vary, A.; Bowles, K. J.

    1978-01-01

    Details of the method used to measure the stress wave factor are described. Frequency spectra of the stress waves are analyzed in order to clarify the nature of the wave phenomena involved. The stress wave factor was measured with simple contact probes requiring only one-sided access to a part. This is beneficial in nondestructive evaluations because the waves can run parallel to fiber directions and thus measure material properties in directions assumed by actual loads. The technique can be applied where conventional through transmission techniques are impractical or where more quantitative data are required. The stress wave factor was measured for a series of graphite/polyimide composite panels, and results obtained are compared with through transmission immersion ultrasonic scans.

  1. Short communication: milk output in llamas (Lama glama) in relation to energy intake and water turnover measured by an isotope dilution technique.

    PubMed

    Riek, A; Klinkert, A; Gerken, M; Hummel, J; Moors, E; Südekum, K-H

    2013-03-01

    Despite the fact that llamas have become increasingly popular as companion and farm animals in both Europe and North America, scientific knowledge on their nutrient requirements is scarce. Compared with other livestock species, relatively little is known especially about the nutrient and energy requirements for lactating llamas. Therefore, we aimed to measure milk output in llama dams using an isotope dilution technique and relate it to energy intakes at different stages of lactation. We also validated the dilution technique by measuring total water turnover (TWT) directly and comparing it with values estimated by the isotope dilution technique. Our study involved 5 lactating llama dams and their suckling young. Milk output and TWT were measured at 4 stages of lactation (wk 3, 10, 18, and 26 postpartum). The method involved the application of the stable hydrogen isotope deuterium ((2)H) to the lactating dam. Drinking water intake and TWT decreased significantly with lactation stage, whether estimated by the isotope dilution technique or calculated from drinking water and water ingested from feeds. In contrast, lactation stage had no effect on dry matter intake, metabolizable energy (ME) intake, or the milk water fraction (i.e., the ratio between milk water excreted and TWT). The ratios between TWT measured and TWT estimated (by isotope dilution) did not differ with lactation stage and were close to 100% in all measurement weeks, indicating that the D(2)O dilution technique estimated TWT with high accuracy and only small variations. Calculating the required ME intakes for lactation from milk output data and gross energy content of milk revealed that, with increasing lactation stage, ME requirements per day for lactation decreased but remained constant per kilogram of milk output. Total measured ME intakes at different stages of lactation were similar to calculated ME intakes from published recommendation models for llamas. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
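
    As an illustration of the deuterium-dilution arithmetic underlying this record (a simplified sketch, not the validated protocol of the study): the body-water pool follows from the back-extrapolated initial enrichment, and water turnover from the exponential washout of the label. The conversion factor and all numbers are hypothetical; Python with NumPy is assumed.

      import numpy as np

      def water_turnover(t_days, enrichment, dose_mol, enrich_per_mol=1.0):
          """Pool size N = dose / E(0); turnover = k * N, with k and E(0) from a
          log-linear fit to the label washout. 'enrich_per_mol' is a placeholder
          for the real dose-to-enrichment calibration."""
          slope, log_e0 = np.polyfit(t_days, np.log(enrichment), 1)
          k = -slope                                   # elimination rate, 1/day
          n_pool = dose_mol * enrich_per_mol / np.exp(log_e0)
          return n_pool, k * n_pool                    # pool (mol), turnover (mol/day)

      # Synthetic washout: 1500 mol water pool, 10% turned over per day
      t = np.arange(1, 15)
      e = (0.5 / 1500) * np.exp(-0.10 * t)             # enrichment above background
      print(water_turnover(t, e, dose_mol=0.5))        # ~ (1500, 150)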

  2. Correlation of engineering parameters of the presumpscot formation to the seismic cone penetration test (SCPTU).

    DOT National Transportation Integrated Search

    2015-08-01

    The seismic cone penetration test with pore pressure measurement (SCPTu) is a geotechnical investigation technique which involves pushing a sensitized cone into the subsurface at a constant rate while continuously measuring tip resistance, sleeve ...

  3. Measurement of relative cross sections for simultaneous ionization and excitation of the helium 4²S and 4²P states

    NASA Technical Reports Server (NTRS)

    Sutton, J. F.

    1972-01-01

    The relative cross sections for simultaneous ionization and excitation of helium by 200-eV electrons into the 4²S and 4²P states were measured via a fast delayed coincidence technique. Results show good agreement with the relative cross sections for single electron excitation of helium and hydrogen. An application of the results of the measurement to the development of an ultraviolet intensity standard is suggested. This technique involves the use of known branching ratios, a visible light flux reference, and the measured relative cross sections.

  4. A new technique for measuring listening and reading literacy in developing countries

    NASA Astrophysics Data System (ADS)

    Greene, Barbara A.; Royer, James M.; Anzalone, Stephen

    1990-03-01

    One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high ability students in all three standards performed better than those identified as low ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.

  5. In-Vivo Techniques for Measuring Electrical Properties of Tissues.

    DTIC Science & Technology

    1980-09-01

    Keywords: probe; electromagnetic energy; dielectric properties; monopole antenna; in-situ tissues; antemortem/postmortem studies; renal blood flow. Abstract (fragment): ... mice or rats, which were positioned beneath a fixed measurement probe. Several alternative methods involving the use of semi-rigid or flexible coaxial ...

  6. Convergence of Chahine's nonlinear relaxation inversion method used for limb viewing remote sensing

    NASA Technical Reports Server (NTRS)

    Chu, W. P.

    1985-01-01

    The application of Chahine's (1970) inversion technique to remote sensing problems utilizing the limb viewing geometry is discussed. The problem considered here involves occultation-type measurements and limb radiance-type measurements from either spacecraft or balloon platforms. The kernel matrix of the inversion problem is either an upper or lower triangular matrix. It is demonstrated that the Chahine inversion technique always converges, provided the diagonal elements of the kernel matrix are nonzero.
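
    A minimal sketch of the relaxation iteration discussed in this record (assuming, purely for illustration, a linear forward model and a diagonally dominant triangular kernel of the limb-viewing type; this is not the paper's radiative-transfer setup), in Python:

      import numpy as np

      def chahine_invert(kernel, y_meas, x0, n_iter=200):
          """Chahine-style nonlinear relaxation: pair each channel with the level
          where its kernel peaks and scale the state there by the ratio of the
          measured to the computed signal, x_j <- x_j * y_meas_i / y_calc_i."""
          x = x0.astype(float)
          peak_level = np.argmax(kernel, axis=1)     # level paired with each channel
          for _ in range(n_iter):
              y_calc = kernel @ x
              for i, j in enumerate(peak_level):
                  x[j] *= y_meas[i] / y_calc[i]
          return x

      # Synthetic lower-triangular kernel (limb-geometry-like) and true profile
      n = 6
      K = np.tril(np.full((n, n), 0.3)) + np.eye(n)
      x_true = np.array([1.0, 2.0, 4.0, 3.0, 2.0, 1.0])
      print(np.round(chahine_invert(K, K @ x_true, np.ones(n)), 3))  # -> x_true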

  7. Review of chemical separation techniques applicable to alpha spectrometric measurements

    NASA Astrophysics Data System (ADS)

    de Regge, P.; Boden, R.

    1984-06-01

    Prior to alpha-spectrometric measurements several chemical manipulations are usually required to obtain alpha-radiating sources with the desired radiochemical and chemical purity. These include sampling, dissolution or leaching of the elements of interest, conditioning of the solution, chemical separation and preparation of the alpha-emitting source. The choice of a particular method is dependent on different criteria but always involves aspects of the selectivity or the quantitative nature of the separations. The availability of suitable tracers or spikes and modern high resolution instruments resulted in the widespread application of isotopic dilution techniques to the problems associated with quantitative chemical separations. This enhanced the development of highly selective methods and reagents which led to important simplifications in the separation schemes. The chemical separation methods commonly used in connection with alpha-spectrometric measurements involve precipitation with selected scavenger elements, solvent extraction, ion exchange and electrodeposition techniques, or any combination of them. Depending on the purpose of the final measurement and the type of sample available, the chemical separation methods have to be adapted to the particular needs of environment monitoring, nuclear chemistry and metrology, safeguards and safety, waste management and requirements in the nuclear fuel cycle. Against the background of separation methods available in the literature, the present paper highlights the current developments and trends in the chemical techniques applicable to alpha spectrometry.

  8. A United Effort for Crystal Growth, Neutron Scattering, and X-ray Scattering Studies of Novel Correlated Electron Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Young S.

    2015-02-12

    The research accomplishments during the award involved experimental studies of correlated electron systems and quantum magnetism. The techniques of crystal growth, neutron scattering, x-ray scattering, and thermodynamic & transport measurements were employed, and graduate students and postdoctoral research associates were trained in these techniques.

  9. Mexican Immigrants and the Use of Cognitive Assessment Techniques in Questionnaire Development

    ERIC Educational Resources Information Center

    Agans, Robert P.; Deeb-Sossa, Natalia; Kalsbeek, William

    2006-01-01

    The aim of this article is to identify the measurement challenges involved in obtaining sensitive health outcomes from Mexican women in both settled and unsettled segments of the United States population and to suggest how cognitive assessment techniques might be better employed to construct culturally and linguistically appropriate survey…

  10. Feasibility Study of a Rotorcraft Health and Usage Monitoring System (HUMS): Usage and Structural Life Monitoring Evaluation

    NASA Technical Reports Server (NTRS)

    Dickson, B.; Cronkhite, J.; Bielefeld, S.; Killian, L.; Hayden, R.

    1996-01-01

    The objective of this study was to evaluate two techniques, Flight Condition Recognition (FCR) and Flight Load Synthesis (FLS), for usage monitoring and assess the potential benefits of extending the retirement intervals of life-limited components, thus reducing the operator's maintenance and replacement costs. Both techniques involve indirect determination of loads using measured flight parameters and subsequent fatigue analysis to calculate the life expended on the life-limited components. To assess the potential benefit of usage monitoring, the two usage techniques were compared to current methods of component retirement. In addition, comparisons were made with direct load measurements to assess the accuracy of the two techniques.

  11. The 'sniffer-patch' technique for detection of neurotransmitter release.

    PubMed

    Allen, T G

    1997-05-01

    A wide variety of techniques have been employed for the detection and measurement of neurotransmitter release from biological preparations. Whilst many of these methods offer impressive levels of sensitivity, few are able to combine sensitivity with the necessary temporal and spatial resolution required to study quantal release from single cells. One detection method that is seeing a revival of interest and has the potential to fill this niche is the so-called 'sniffer-patch' technique. In this article, specific examples of the practical aspects of using this technique are discussed along with the procedures involved in calibrating these biosensors to extend their applications to provide quantitative, in addition to simple qualitative, measurements of quantal transmitter release.

  12. Calibrated LCD/TFT stimulus presentation for visual psychophysics in fMRI.

    PubMed

    Strasburger, H; Wüstenberg, T; Jäncke, L

    2002-11-15

    Standard projection techniques using liquid crystal (LCD) or thin-film transistor (TFT) technology show drastic distortions in luminance and contrast characteristics across the screen and across grey levels. Common luminance measurement and calibration techniques are not applicable in the vicinity of MRI scanners. With the aid of a fibre optic, we measured screen luminances for the full space of screen position and image grey values and on that basis developed a compensation technique that involves both luminance homogenisation and position-dependent gamma correction. By the technique described, images displayed to a subject in functional MRI can be specified with high precision by a matrix of desired luminance values rather than by local grey value.
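
    As a sketch of the position-dependent linearisation the compensation technique implies (a minimal illustration at a single screen position, with a hypothetical gamma-like response; not the authors' full homogenisation procedure), in Python:

      import numpy as np

      def grey_for_luminance(desired_lum, grey_levels, measured_lum_at_pos):
          """Return the grey value producing a desired luminance at one screen
          position, by inverse interpolation of the measured (monotonic)
          grey-to-luminance curve for that position."""
          return np.interp(desired_lum, measured_lum_at_pos, grey_levels)

      # Hypothetical calibration at one position: gamma ~ 2.2, peak 200 cd/m^2
      greys = np.arange(0, 256, 15)
      lums = 200.0 * (greys / 255.0) ** 2.2
      print(grey_for_luminance(50.0, greys, lums))   # grey level giving 50 cd/m^2

    Repeating the interpolation with a separately measured curve for each screen position would give the luminance-specified images the record describes.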

  13. Innovative acoustic techniques for studying new materials and new developments in solid state physics

    NASA Astrophysics Data System (ADS)

    Maynard, Julian D.

    1994-06-01

    The goals of this project involve the use of innovative acoustic techniques to study new materials and new developments in solid state physics. Major accomplishments include (a) the preparation and publication of a number of papers and book chapters, (b) the measurement and new analysis of more samples of aluminum quasicrystal and its cubic approximant to eliminate the possibility of sample artifacts, (c) the use of resonant ultrasound to measure acoustic attenuation and determine the effects of heat treatment on ceramics, (d) the extension of our technique for measuring even lower (possibly the lowest) infrared optical absorption coefficient, and (e) the measurement of the effects of disorder on the propagation of a nonlinear pulse, and (f) the observation of statistical effects in measurements of individual bond breaking events in fracture.

  14. Development of a new test cell to measure cumulative permeation of water-insoluble pesticides with low vapor pressure through protective clothing and glove materials

    PubMed Central

    SHAW, Anugrah; COLEONE-CARVALHO, Ana Carla; HOLLINGSHURST, Julien; DRAPER, Michael; MACHADO NETO, Joaquim Gonçalves

    2017-01-01

    A collaborative approach, involving resources and expertise from several countries, was used to develop a test cell to measure cumulative permeation by a solid-state collection technique. The new technique was developed to measure the permeation of pesticide active ingredients and other chemicals with low vapor pressure that would otherwise be difficult to test via standard techniques. The development process is described and the results from the final chosen test method are reported. Inter-laboratory studies were conducted to further refine the new method and determine repeatability and reliability. The revised test method has been approved as a new ISO/EN standard to measure permeation of chemicals with low vapor pressure and/or solubility in water. PMID:29033403

  15. Measuring liquid density using Archimedes' principle

    NASA Astrophysics Data System (ADS)

    Hughes, Stephen W.

    2006-09-01

    A simple technique is described for measuring absolute and relative liquid density based on Archimedes' principle. The technique involves placing a container of the liquid under test on an electronic balance and suspending a probe (e.g. a glass marble) attached to a length of line beneath the surface of the liquid. If the volume of the probe is known, the density of liquid is given by the difference between the balance reading before and after immersion of the probe divided by the volume of the probe. A test showed that the density of water at room temperature could be measured to an accuracy and precision of 0.01 ± 0.1%. The probe technique was also used to measure the relative density of milk, Coca-Cola, fruit juice, olive oil and vinegar.
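
    A minimal sketch of the suspended-probe arithmetic (hypothetical readings; the probe volume must be known independently), in Python:

      def liquid_density(reading_before_g, reading_during_g, probe_volume_ml):
          """The balance reading rises by the buoyant force on the immersed probe,
          so liquid density = (rise in reading) / (probe volume), in g/mL."""
          return (reading_during_g - reading_before_g) / probe_volume_ml

      # Hypothetical 8.00 mL glass marble suspended in the liquid under test
      print(liquid_density(352.10, 360.08, 8.00))   # ~0.998 g/mL, water-like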

  16. In vitro measurement of nucleus pulposus swelling pressure: A new technique for studies of spinal adaptation to gravity

    NASA Technical Reports Server (NTRS)

    Hargens, A. R.; Glover, M. G.; Mahmood, M. M.; Gott, S.; Garfin, S. R.; Ballard, R.; Murthy, G.; Brown, M. D.

    1992-01-01

    Swelling of the intervertebral disc nucleus pulposus is altered by posture and gravity. We have designed and tested a new osmometer for in vitro determination of nucleus pulposus swelling pressure. The functional principle of the osmometer involves compressing a sample of nucleus pulposus with nitrogen gas until saline pressure gradients across a 0.45 microns Millipore filter are eliminated. Swelling pressure of both pooled dog and pooled pig lumbar disc nucleus pulposus were measured on the new osmometer and compared to swelling pressures determined using the equilibrium dialysis technique. The osmometer measured swelling pressures comparable to those obtained by the dialysis technique. This osmometer provides a rapid, direct, and accurate measurement of swelling pressure of the nucleus pulposus.

  17. The construct of food involvement in behavioral research: scale development and validation.

    PubMed

    Bell, Rick; Marshall, David W

    2003-06-01

    The construct of involvement has been found to influence brand loyalty, product information search processing, responses to advertising communications, diffusion of innovations, and ultimately, product choice decisions. Traditionally, involvement has been defined as being a characteristic of either a product or of an individual. In the present research, we make an assumption that an individual's 'food involvement' is a somewhat stable characteristic and we hypothesized that involvement with foods would vary between individuals, that individuals who are more highly involved with food would be better able to discriminate between a set of food samples than would less food involved individuals, and that this discrimination would operate both in affective and perceptive relative judgments. Using standard scale construction techniques, we developed a measure of the characteristic of food involvement, based on activities relating to food acquisition, preparation, cooking, eating and disposal. After several iterations, a final 12-item measure was found to have good test-retest reliability and internal consistency within two subscales. A behavioral validation study demonstrated that measures of food involvement were associated with discrimination and hedonic ratings for a range of foods in a laboratory setting. These findings suggest that food involvement, as measured by the Food Involvement Scale, may be an important mediator to consider when undertaking research with food and food habits.

  18. Integration of Scale Invariant Generator Technique and S-A Technique for Characterizing 2-D Patterns for Information Retrieve

    NASA Astrophysics Data System (ADS)

    Cao, L.; Cheng, Q.

    2004-12-01

    The scale invariant generator technique (SIG) and the spectrum-area analysis technique (S-A) were developed independently in relation to the concept of generalized scale invariance (GSI). The former was developed for characterizing the parameters involved in GSI for characterizing and simulating multifractal measures, whereas the latter was for identifying scaling breaks for decomposition of superimposed multifractal measures caused by multiple geophysical processes. A natural integration of these two techniques may yield a new technique serving two purposes: on the one hand, it can enrich the power of S-A by increasing the interpretability of decomposed patterns in some applications of S-A; on the other hand, it can provide a means to test the uniqueness of multifractality of measures, which is essential for application of the SIG technique in more complicated environments. The implementation of the proposed technique has been done as a Dynamic Link Library (DLL) in Visual C++. The program can be readily used for method validation and application in different fields.

  19. Study to validate the Non-Interference Performance Assessment (NIPA) technique

    NASA Technical Reports Server (NTRS)

    Seeman, J. S.; Murphy, G. L.

    1973-01-01

    The NIPA (Non-Interference Performance Assessment) technique involves direct observation of group verbal activities by trained observers who rate the emotional content (affect) of each verbal interaction as either positive, negative, or neutral. During the test, in which four men were confined for 90 consecutive days, feasibility of the NIPA technique was demonstrated and observer reliability was verified. However, the validity of the test was not proved because an independent criterion measure of morale for the confined crew was lacking. There were indications, however, that NIPA measures were tracking changes in crew morale. At approximately the two-thirds point (Days 60 to 70), morale apparently fell dramatically for a period of about ten days, and simultaneously the NIPA measure of positive verbalizations decreased in number. A need was indicated for a separate study to apply the NIPA technique under experimental conditions, using a clearly defined criterion measure against which the ability of NIPA observations to truly measure morale changes could be determined.

  20. Correcting For Seed-Particle Lag In LV Measurements

    NASA Technical Reports Server (NTRS)

    Jones, Gregory S.; Gartrell, Luther R.; Kamemoto, Derek Y.

    1994-01-01

    Two experiments conducted to evaluate effects of sizes of seed particles on errors in LV measurements of mean flows. Both theoretical and conventional experimental methods used to evaluate errors. First experiment focused on measurement of decelerating stagnation streamline of low-speed flow around circular cylinder with two-dimensional afterbody. Second performed in transonic flow and involved measurement of decelerating stagnation streamline of hemisphere with cylindrical afterbody. Concluded, mean-quantity LV measurements subject to large errors directly attributable to sizes of particles. Predictions of particle-response theory showed good agreement with experimental results, indicating velocity-error-correction technique used in study viable for increasing accuracy of laser velocimetry measurements. Technique simple and useful in any research facility in which flow velocities measured.
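
    A hedged sketch of a first-order particle-lag correction along a streamline (the general Stokes-drag relation u = v_p + tau_p * dv_p/dt, not necessarily the exact correction used in the study; the response time and profile below are hypothetical), in Python:

      import numpy as np

      def correct_particle_lag(s, v_particle, tau_p):
          """Recover fluid velocity from measured seed-particle velocity along a
          streamline: with dv_p/dt = (u - v_p)/tau_p and d/dt = v_p * d/ds for the
          particle, u = v_p + tau_p * v_p * dv_p/ds."""
          dvds = np.gradient(v_particle, s)
          return v_particle + tau_p * v_particle * dvds

      # Hypothetical decelerating stagnation streamline and seed response time
      s = np.linspace(0.0, 0.05, 100)            # m
      v_p = 20.0 * (1.0 - s / 0.05) + 2.0        # measured particle velocity, m/s
      print(correct_particle_lag(s, v_p, tau_p=3e-4)[:3])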

  1. Parametric studies and characterization measurements of x-ray lithography mask membranes

    NASA Astrophysics Data System (ADS)

    Wells, Gregory M.; Chen, Hector T. H.; Engelstad, Roxann L.; Palmer, Shane R.

    1991-08-01

    The techniques used in the experimental characterization of thin membranes are considered for their potential use as mask blanks for x-ray lithography. Among the parameters of interest for this evaluation are the film's stress, fracture strength, uniformity of thickness, absorption in the x-ray and visible spectral regions, and the modulus and grain structure of the material. The experimental techniques used for measuring these properties are described. The accuracy and applicability of the assumptions used to derive the formulas that relate the experimental measurements to the parameters of interest are considered. Experimental results for silicon carbide and diamond films are provided. Another characteristic needed for an x-ray mask carrier is radiation stability. The number of x-ray exposures expected to be performed in the lifetime of an x-ray mask on a production line is on the order of 10^7. The dimensional stability requirements placed on the membranes during this period are discussed. Interferometric techniques that provide sufficient sensitivity for these stability measurements are described. A comparison is made between the different techniques that have been developed in terms of the information that each technique provides, the accuracy of the various techniques, and the implementation issues that are involved with each technique.

  2. An Undergraduate Experiment on Nuclear Lifetime Measurement Using the Doppler Effect

    ERIC Educational Resources Information Center

    Campbell, J. L.; And Others

    1972-01-01

    While designed for a senior undergraduate laboratory, the experiment illustrates the principles involved in the various Doppler techniques currently used in nuclear lifetime studies and demonstrates the versatility of the Ge(Li) detector in applications other than direct energy or intensity measurement. (Author/TS)

  3. Alternative Models for Small Samples in Psychological Research: Applying Linear Mixed Effects Models and Generalized Estimating Equations to Repeated Measures Data

    ERIC Educational Resources Information Center

    Muth, Chelsea; Bales, Karen L.; Hinde, Katie; Maninger, Nicole; Mendoza, Sally P.; Ferrer, Emilio

    2016-01-01

    Unavoidable sample size issues beset psychological research that involves scarce populations or costly laboratory procedures. When longitudinal designs are incorporated, these samples are further reduced by traditional modeling techniques, which perform listwise deletion for any instance of missing data. Moreover, these techniques are limited in their…

  4. Measurement of absolute gamma emission probabilities

    NASA Astrophysics Data System (ADS)

    Sumithrarachchi, Chandana S.; Rengan, Krish; Griffin, Henry C.

    2003-06-01

    The energies and emission probabilities (intensities) of gamma-rays emitted in radioactive decays of particular nuclides are the most important characteristics by which to quantify mixtures of radionuclides. Often, quantification is limited by uncertainties in measured intensities. A technique was developed to reduce these uncertainties. The method involves obtaining a pure sample of a nuclide using radiochemical techniques, and using appropriate fractions for beta and gamma measurements. The beta emission rates were measured using a liquid scintillation counter, and the gamma emission rates were measured with a high-purity germanium detector. Results were combined to obtain absolute gamma emission probabilities. All sources of uncertainties greater than 0.1% were examined. The method was tested with ³⁸Cl and ⁸⁸Rb.
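
    As a minimal numerical sketch of the final combination step (not the authors' code), the absolute emission probability can be formed as the ratio of the efficiency-corrected gamma counting rate to the beta (decay) rate, with Poisson counting uncertainties propagated in quadrature; the counts, efficiencies, and counting time below are hypothetical.

```python
import math

def emission_probability(n_gamma, eff_gamma, n_beta, eff_beta, live_time):
    """Absolute gamma emission probability as the ratio of the efficiency-corrected
    gamma rate to the beta (decay) rate; Poisson counting uncertainties only."""
    r_gamma = n_gamma / (eff_gamma * live_time)        # gamma emission rate, 1/s
    r_beta = n_beta / (eff_beta * live_time)           # beta decay rate, 1/s
    p = r_gamma / r_beta
    rel_unc = math.sqrt(1.0 / n_gamma + 1.0 / n_beta)  # relative statistical uncertainty
    return p, p * rel_unc

# Hypothetical counts and detector efficiencies, for illustration only; a real
# analysis would also propagate the efficiency-calibration uncertainties.
p, sigma = emission_probability(n_gamma=48_500, eff_gamma=0.021,
                                n_beta=2_430_000, eff_beta=0.98, live_time=600.0)
print(f"P_gamma = {p:.4f} +/- {sigma:.4f}")
```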

  5. Noncontact Measurement Of Critical Current In Superconductor

    NASA Technical Reports Server (NTRS)

    Israelsson, Ulf E.; Strayer, Donald M.

    1992-01-01

    Critical current measured indirectly via flux-compression technique. Magnetic flux compressed into gap between superconductive hollow cylinder and superconductive rod when rod inserted in hole in cylinder. Hall-effect probe measures flux density before and after compression. Method does not involve any electrical contact with superconductor. Therefore, does not cause resistive heating and consequent premature loss of superconductivity.

  6. Report to TRMM

    NASA Technical Reports Server (NTRS)

    Jameson, Arthur R.

    1997-01-01

    The effort involved three elements all related to the measurement of rain and clouds using microwaves: (1) Examine recently proposed techniques for measuring rainfall rate and rain water content using data from ground-based radars and the TRMM microwave link in order to develop improved ground validation and radar calibration techniques; (2) Develop dual-polarization, multiple frequency radar techniques for estimating rain water content and cloud water content to interpret the vertical profiles of radar reflectivity factors (Z) measured by the TRMM Precipitation Radar; and (3) Investigate theoretically and experimentally the potential biases in TRMM Z measurements due to spatial inhomogeneities in precipitation. The research succeeded in addressing all of these topics, resulting in several refereed publications. In addition, the research indicated that the effects of non-Rayleigh statistics resulting from the nature of the precipitation inhomogeneities will probably not result in serious errors for the TRMM radar measurements, but the TRMM radiometers may be subject to significant bias due to the inhomogeneities.

  7. Laser-Induced Fluorescence Velocity Measurements in Supersonic Underexpanded Impinging Jets

    NASA Technical Reports Server (NTRS)

    Inman, Jennifer A.; Danehy, Paul M.; Barthel, Brett; Alderfer, David W.; Novak, Robert J.

    2010-01-01

    We report on an application of nitric oxide (NO) flow-tagging velocimetry to impinging underexpanded jet flows issuing from a Mach 2.6 nozzle. The technique reported herein utilizes a single laser, single camera system to obtain planar maps of the streamwise component of velocity. Whereas typical applications of this technique involve comparing two images acquired at different time delays, this application uses a single image and time delay. The technique extracts velocity by assuming that particular regions outside the jet flowfield have negligible velocity and may therefore serve as a stationary reference against which to measure motion of the jet flowfield. By taking the average of measurements made in 100 single-shot images for each flow condition, streamwise velocities of between -200 and +1,000 m/s with accuracies of between 15 and 50 m/s are reported within the jets. Velocity measurements are shown to explain otherwise seemingly anomalous impingement surface pressure measurements.

  8. Torque measurement at the single-molecule level.

    PubMed

    Forth, Scott; Sheinin, Maxim Y; Inman, James; Wang, Michelle D

    2013-01-01

    Methods for exerting and measuring forces on single molecules have revolutionized the study of the physics of biology. However, it is often the case that biological processes involve rotation or torque generation, and these parameters have been more difficult to access experimentally. Recent advances in the single-molecule field have led to the development of techniques that add the capability of torque measurement. By combining force, displacement, torque, and rotational data, a more comprehensive description of the mechanics of a biomolecule can be achieved. In this review, we highlight a number of biological processes for which torque plays a key mechanical role. We describe the various techniques that have been developed to directly probe the torque experienced by a single molecule, and detail a variety of measurements made to date using these new technologies. We conclude by discussing a number of open questions and propose systems of study that would be well suited for analysis with torsional measurement techniques.

  9. Report to TRMM

    NASA Technical Reports Server (NTRS)

    Jameson, Arthur R.

    1997-01-01

    The effort involved three elements all related to the measurement of rain and clouds using microwaves: (1) Examine recently proposed techniques for measuring rainfall rate and rain water content using data from ground-based radars and the TRMM microwave link in order to develop improved ground validation and radar calibration techniques; (2) Develop dual-polarization, multiple frequency radar techniques for estimating rain water content and cloud water content to interpret the vertical profiles of radar reflectivity factors (Z) measured by the TRMM Precipitation Radar; and (3) Investigate theoretically and experimentally the potential biases in TRMM Z measurements due to spatial inhomogeneities in precipitation. The research succeeded in addressing all of these topics, resulting in several refereed publications. In addition, the research indicated that the effects of non-Rayleigh statistics resulting from the nature of the precipitation inhomogeneities will probably not result in serious errors for the TRMM radar measurements, but the TRMM radiometers may be subject to significant bias due to the inhomogeneities.

  10. Measurement of lung volumes from supine portable chest radiographs.

    PubMed

    Ries, A L; Clausen, J L; Friedman, P J

    1979-12-01

    Lung volumes in supine nonambulatory patients are physiological parameters often difficult to measure with current techniques (plethysmograph, gas dilution). Existing radiographic methods for measuring lung volumes require standard upright chest radiographs. Accordingly, in 31 normal supine adults, we determined helium-dilution functional residual and total lung capacities and measured planimetric lung field areas (LFA) from corresponding portable anteroposterior and lateral radiographs. Low radiation dose methods, which delivered less than 10% of that from standard portable X-ray technique, were utilized. Correlation between lung volume and radiographic LFA was highly significant (r = 0.96, SEE = 10.6%). Multiple-step regressions using height and chest diameter correction factors reduced variance, but weight and radiographic magnification factors did not. In 17 additional subjects studied for validation, the regression equations accurately predicted radiographic lung volume. Thus, this technique can provide accurate and rapid measurement of lung volume in studies involving supine patients.
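
    As an illustrative sketch of the regression approach described (not the published equations, whose coefficients are not reproduced here), a multiple linear regression of helium-dilution lung volume on planimetric lung field area and height can be fit and then applied to new supine radiographs; all data values below are hypothetical.

```python
import numpy as np

# Hypothetical calibration data: helium-dilution lung volume (L),
# planimetric lung field area (cm^2), and subject height (cm).
volume = np.array([3.1, 3.8, 4.4, 5.0, 5.6, 6.1])
lfa    = np.array([410., 470., 530., 600., 660., 710.])
height = np.array([158., 165., 170., 176., 181., 186.])

# Multiple linear regression: volume ~ a*LFA + b*height + c
X = np.column_stack([lfa, height, np.ones_like(lfa)])
coef, *_ = np.linalg.lstsq(X, volume, rcond=None)

# Predict radiographic lung volume for a new supine subject.
predicted = coef @ np.array([555.0, 172.0, 1.0])
print(f"predicted lung volume ~ {predicted:.2f} L")
```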

  11. Estimating the effect of gang membership on nonviolent and violent delinquency: a counterfactual analysis.

    PubMed

    Barnes, J C; Beaver, Kevin M; Miller, J Mitchell

    2010-01-01

    This study reconsiders the well-known link between gang membership and criminal involvement. Recently developed analytical techniques enabled the approximation of an experimental design to determine whether gang members, after being matched with similarly situated nongang members, exhibited greater involvement in nonviolent and violent delinquency. Findings indicated that while gang membership is a function of self-selection, selection effects alone do not account for the greater involvement in delinquency exhibited by gang members. After propensity score matching was employed, gang members maintained a greater involvement in both nonviolent and violent delinquency when measured cross-sectionally, but only violent delinquency when measured longitudinally. Additional analyses using inverse probability of treatment weights reaffirmed these conclusions. © 2010 Wiley-Liss, Inc.
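
    A minimal sketch of the propensity-score workflow described above (logistic propensity model, 1:1 nearest-neighbor matching, then a matched comparison), using synthetic data and generic scikit-learn calls rather than the authors' actual variables or software:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Synthetic covariates, treatment indicator (gang membership), and outcome (delinquency score).
n = 1000
X = rng.normal(size=(n, 4))
gang = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X[:, 0] + 0.5 * X[:, 1]))))
delinquency = 2.0 * gang + X[:, 0] + rng.normal(size=n)

# 1) Propensity scores: modeled probability of membership given the covariates.
ps = LogisticRegression(max_iter=1000).fit(X, gang).predict_proba(X)[:, 1]

# 2) 1:1 nearest-neighbor matching on the propensity score (with replacement).
treated, control = np.where(gang == 1)[0], np.where(gang == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_control = control[idx.ravel()]

# 3) Effect of membership among the matched samples (average treatment effect on the treated).
att = delinquency[treated].mean() - delinquency[matched_control].mean()
print(f"estimated effect on delinquency after matching: {att:.2f}")
```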

  12. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives in bio-analytical chemistry involve applying analytical tools to relevant medical and biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques for their measurement. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The review reveals that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, protein binding, and drastic interferences from the complex matrices of real samples such as blood plasma and serum.

  13. Short hold times in dynamic vapor sorption measurements mischaracterize the equilibrium moisture content of wood

    Treesearch

    Samuel V. Glass; Charles R. Boardman; Samuel L. Zelinka

    2017-01-01

    Recently, the dynamic vapor sorption (DVS) technique has been used to measure sorption isotherms and develop moisture-mechanics models for wood and cellulosic materials. This method typically involves measuring the time-dependent mass response of a sample following step changes in relative humidity (RH), fitting a kinetic model to the data, and extrapolating the...

  14. Analytical solutions for determining residual stresses in two-dimensional domains using the contour method

    PubMed Central

    Kartal, Mehmet E.

    2013-01-01

    The contour method is one of the most prevalent destructive techniques for residual stress measurement. Up to now, the method has involved the use of the finite-element (FE) method to determine the residual stresses from the experimental measurements. This paper presents analytical solutions, obtained for a semi-infinite strip and a finite rectangle, which can be used to calculate the residual stresses directly from the measured data; thereby, eliminating the need for an FE approach. The technique is then used to determine the residual stresses in a variable-polarity plasma-arc welded plate and the results show good agreement with independent neutron diffraction measurements. PMID:24204187

  15. A Doppler-Cancellation Technique for Determining the Altitude Dependence of Gravitational Red Shift in an Earth Satellite

    NASA Technical Reports Server (NTRS)

    Badessa, R. S.; Kent, R. L.; Nowell, J. C.; Searle, C. L.

    1960-01-01

    A cancellation technique permits measurement of the frequency of a source moving relative to an observer without the obscuring effect of first-order Doppler shifts. The application of this method to a gravitational red shift experiment involving the use of an earth satellite containing a highly stable oscillator is described. The rapidity with which a measurement can be made permits the taking of data at various altitudes in a given elliptical orbit. Tropospheric and ionospheric effects upon the accuracy of results are estimated.

  16. The energetics of mesopore formation in zeolites with surfactants.

    PubMed

    Linares, Noemi; Jardim, Erika de Oliveira; Sachse, Alexander; Serrano, Elena; Garcia-Martinez, Javier

    2018-05-02

    Mesoporosity can be conveniently introduced in zeolites by treating them in basic surfactant solutions. The apparent activation energy involved in the formation of mesopores in USY via surfactant-templating was obtained through the combination of in situ synchrotron XRD and ex situ gas adsorption. Additionally, techniques such as pH measurements and TG/DTA were employed to determine the OH⁻ evolution and the CTA⁺ uptake during the development of mesoporosity, providing information about the different steps involved. By combining both in situ and ex situ techniques, we have been able, for the first time, to determine the apparent activation energies of the different processes involved in the mesostructuring of USY zeolites, which are of the same order of magnitude (30–65 kJ mol⁻¹) as those involved in the crystallization of zeolites. Hence, important mechanistic insights on the surfactant-templating method were obtained. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
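
    The abstract does not give the kinetic analysis itself; a standard way such apparent activation energies are extracted is an Arrhenius fit of apparent rate constants measured at several temperatures, as in the hedged sketch below (the rate constants are hypothetical placeholders).

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

# Hypothetical apparent rate constants for a mesostructuring step at several temperatures.
T = np.array([323.0, 333.0, 343.0, 353.0])       # K
k = np.array([2.1e-4, 4.0e-4, 7.3e-4, 1.3e-3])   # s^-1

# Arrhenius: ln k = ln A - Ea/(R*T), so the slope of ln k versus 1/T gives -Ea/R.
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R
print(f"apparent activation energy ~ {Ea / 1000:.0f} kJ/mol")
```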

  17. Measuring and Estimating Normalized Contrast in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2013-01-01

    Infrared flash thermography (IRFT) is used to detect void-like flaws in a test object. The IRFT technique involves heating up the part surface using a flash of flash lamps. The post-flash evolution of the part surface temperature is sensed by an IR camera in terms of pixel intensity of image pixels. The IR technique involves recording of the IR video image data and analysis of the data using the normalized pixel intensity and temperature contrast analysis method for characterization of void-like flaws for depth and width. This work introduces a new definition of the normalized IR pixel intensity contrast and normalized surface temperature contrast. A procedure is provided to compute the pixel intensity contrast from the camera pixel intensity evolution data. The pixel intensity contrast and the corresponding surface temperature contrast differ but are related. This work provides a method to estimate the temperature evolution and the normalized temperature contrast from the measured pixel intensity evolution data and some additional measurements during data acquisition.
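
    The report's new contrast definitions are not reproduced in the abstract; as a hedged illustration, a commonly used running normalized contrast compares the flash-induced pixel-intensity rise over a suspected flaw with that over a sound reference region, as sketched below with synthetic evolution data.

```python
import numpy as np

def normalized_contrast(I_defect, I_ref, I_pre_defect, I_pre_ref):
    """A commonly used running contrast for flash thermography (not necessarily the
    definition introduced in the report): subtract the pre-flash (ambient) level from
    each pixel-intensity evolution and normalize the defect-minus-reference difference
    by the reference response."""
    d = I_defect - I_pre_defect   # flash-induced rise over the flaw
    r = I_ref - I_pre_ref         # flash-induced rise over sound material
    return (d - r) / r

# Synthetic post-flash evolutions (arbitrary camera counts); a subsurface void traps
# heat, so the defect pixel cools more slowly than the reference pixel.
t = np.linspace(0.05, 5.0, 100)                                          # s after flash
I_ref_evo = 800.0 + 900.0 / np.sqrt(t)                                   # ~ t^(-1/2) decay
I_def_evo = 800.0 + 900.0 / np.sqrt(t) * (1.0 + 0.25 * np.exp(-0.5 / t))

C = normalized_contrast(I_def_evo, I_ref_evo, 800.0, 800.0)
print(f"peak contrast {C.max():.3f} at t = {t[np.argmax(C)]:.2f} s")
```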

  18. The Inverse-Square Law with Data Loggers

    ERIC Educational Resources Information Center

    Bates, Alan

    2013-01-01

    The inverse-square law for the intensity of light received at a distance from a light source has been verified using various experimental techniques. Typical measurements involve a manual variation of the distance between a light source and a light sensor, usually by sliding the sensor or source along a bench, measuring the source-sensor distance…
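
    A minimal sketch of how such data-logger measurements are typically reduced: fit the recorded intensity against 1/d² (or fit the exponent on a log-log scale) and check agreement with the inverse-square prediction; the readings below are hypothetical.

```python
import numpy as np

# Hypothetical data-logger readings: source-sensor distance (m) and illuminance (lux).
d = np.array([0.10, 0.15, 0.20, 0.30, 0.40, 0.60])
I = np.array([505.0, 223.0, 126.0, 56.0, 31.4, 14.1])

# Inverse-square law: I = k/d^2, so I versus 1/d^2 should be a straight line through the origin.
slope, intercept = np.polyfit(1.0 / d**2, I, 1)
print(f"fit: I = {slope:.2f}/d^2 + {intercept:.2f}  (intercept expected near 0)")

# Equivalent check on a log-log scale: the fitted exponent should be close to -2.
exponent, _ = np.polyfit(np.log(d), np.log(I), 1)
print(f"fitted exponent: {exponent:.2f}")
```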

  19. Techniques for Developing Health Quality of Life Scales for Point of Service Use

    ERIC Educational Resources Information Center

    Lee, Young-Sun; Douglas, Jeffrey; Chewning, Betty

    2007-01-01

    Clinical and health policy research frequently involves health status measurement using generic or disease specific instruments. These instruments are generally developed to arrive at several scales, each measuring a distinct domain of health quality of life (HQOL). Clinical settings are starting to explore how to integrate patient perspectives of…

  20. Absorption Filter Based Optical Diagnostics in High Speed Flows

    NASA Technical Reports Server (NTRS)

    Samimy, Mo; Elliott, Gregory; Arnette, Stephen

    1996-01-01

    Two major regimes where laser light scattered by molecules or particles in a flow contains significant information about the flow are Mie scattering and Rayleigh scattering. Mie scattering is used to obtain only velocity information, while Rayleigh scattering can be used to measure both the velocity and the thermodynamic properties of the flow. Recently introduced (1990, 1991) absorption-filter-based diagnostic techniques have started a new era in flow visualization, simultaneous velocity and thermodynamic measurements, and planar velocity measurements. Using a filtered planar velocimetry (FPV) technique, we have modified the optically thick iodine filter profile of Miles et al. and used it in the pressure-broadened regime, which accommodates measurements over a wide range of velocity applications. Measuring velocity and thermodynamic properties simultaneously with absorption-filter-based Rayleigh scattering involves measuring not only the Doppler shift but also the spectral profile of the Rayleigh scattering signal. Using multiple observation angles, one velocity component and the thermodynamic properties of a supersonic jet were measured simultaneously. Presently, the technique is being extended for simultaneous measurements of all three components of velocity and thermodynamic properties.

  1. Advanced ultrasonic measurement methodology for non-invasive interrogation and identification of fluids in sealed containers

    NASA Astrophysics Data System (ADS)

    Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.

    2006-03-01

    Government agencies and homeland security related organizations have identified the need to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a portable, hand-held, hazardous materials acoustic inspection prototype that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials encountered in various law enforcement inspection activities, the need for high measurement sensitivity and advanced ultrasonic measurement techniques was identified. The prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the prototype. High bandwidth ultrasonic transducers combined with an advanced pulse compression technique allowed researchers to 1) obtain high signal-to-noise ratios and 2) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of work conducted in the laboratory have demonstrated that the prototype experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.
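
    The prototype's actual pulse-compression implementation is not described in the abstract; the generic idea, sketched below with hypothetical waveform parameters, is to transmit a long coded waveform (here a linear chirp) and cross-correlate the received signal with it, which concentrates the signal energy into a sharp correlation peak whose position gives the time of flight.

```python
import numpy as np
from scipy.signal import chirp, correlate

fs = 50e6                                    # sampling rate, Hz (hypothetical)
t = np.arange(0, 20e-6, 1 / fs)              # 20 us transmit window
tx = chirp(t, f0=1e6, f1=5e6, t1=t[-1])      # linear FM chirp, 1-5 MHz

# Simulate a weak received echo: the chirp delayed by the true TOF, buried in noise.
tof_true = 65e-6
rx = np.zeros(round(100e-6 * fs))
start = round(tof_true * fs)
rx[start:start + tx.size] += 0.05 * tx
rx += 0.02 * np.random.default_rng(1).normal(size=rx.size)

# Pulse compression: matched filtering by cross-correlation with the transmitted chirp.
corr = correlate(rx, tx, mode="valid")
tof_est = np.argmax(np.abs(corr)) / fs
print(f"estimated TOF = {tof_est * 1e6:.2f} us (true {tof_true * 1e6:.2f} us)")
```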

  2. Assessment of Robotic Patient Simulators for Training in Manual Physical Therapy Examination Techniques

    PubMed Central

    Ishikawa, Shun; Okamoto, Shogo; Isogai, Kaoru; Akiyama, Yasuhiro; Yanagihara, Naomi; Yamada, Yoji

    2015-01-01

    Robots that simulate patients suffering from joint resistance caused by biomechanical and neural impairments are used to aid the training of physical therapists in manual examination techniques. However, there are few methods for assessing such robots. This article proposes two types of assessment measures based on typical judgments of clinicians. One of the measures involves the evaluation of how well the simulator presents different severities of a specified disease. Experienced clinicians were requested to rate the simulated symptoms in terms of severity, and the consistency of their ratings was used as a performance measure. The other measure involves the evaluation of how well the simulator presents different types of symptoms. In this case, the clinicians were requested to classify the simulated resistances in terms of symptom type, and the average ratios of their answers were used as performance measures. For both types of assessment measures, a higher index implied higher agreement among the experienced clinicians that subjectively assessed the symptoms based on typical symptom features. We applied these two assessment methods to a patient knee robot and achieved positive appraisals. The assessment measures have potential for use in comparing several patient simulators for training physical therapists, rather than as absolute indices for developing a standard. PMID:25923719

  3. Phased Array Ultrasound System for Planar Flow Mapping in Liquid Metals.

    PubMed

    Mader, Kevin; Nauber, Richard; Galindo, Vladimir; Beyer, Hannes; Buttner, Lars; Eckert, Sven; Czarske, Jurgen

    2017-09-01

    Controllable magnetic fields can be used to optimize flows in technical and industrial processes involving liquid metals in order to improve quality and yield. However, experimental studies in magnetohydrodynamics often involve complex, turbulent flows and require planar, two-component (2c) velocity measurements through only one acoustical access. We present the phased array ultrasound Doppler velocimeter as a modular research platform for flow mapping in liquid metals. It combines the pulse wave Doppler method with the phased array technique to adaptively focus the ultrasound beam. This makes it possible to resolve smaller flow structures in planar measurements compared with fixed-beam sensors and enables 2c flow mapping with only one acoustical access via the cross beam technique. From simultaneously measured 2-D velocity fields, quantities for turbulence characterization can be derived. The capabilities of this measurement system are demonstrated through measurements in the alloy gallium-indium-tin at room temperature. The 2-D, 2c velocity measurements of a flow in a cubic vessel driven by a rotating magnetic field (RMF) with a spatial resolution of up to 2.2 mm are presented. The measurement results are in good agreement with a semianalytical simulation. As a highlight, two-point correlation functions of the velocity field for different magnitudes of the RMF are presented.

  4. Pulse echo and combined resonance techniques: a full set of LGT acoustic wave constants and temperature coefficients.

    PubMed

    Sturtevant, Blake T; Davulis, Peter M; da Cunha, Mauricio Pereira

    2009-04-01

    This work reports on the determination of langatate elastic and piezoelectric constants and their associated temperature coefficients employing 2 independent methods, the pulse echo overlap (PEO) and a combined resonance technique (CRT), to measure bulk acoustic wave (BAW) phase velocities. Details of the measurement techniques are provided and discussed, including the analysis of the couplant material used in the PEO technique to couple the signal to the sample, which proved to be an order of magnitude more significant than the experimental errors involved in the data extraction. At room temperature, elastic and piezoelectric constants were extracted by the PEO and the CRT methods and showed results consistent to within a few percent for the elastic constants. Both raw acquired data and optimized constants, based on minimization routines applied to all the modes involved in the measurements, are provided and discussed. Comparison of the elastic constants and their temperature behavior with the literature reflects the recent efforts toward the consistent growth and characterization of LGT, in spite of significant variations (between 1 and 30%) among the constants extracted by different groups at room temperature. The density, dielectric permittivity constants, and respective temperature coefficients used in this work have also been independently determined based on samples from the same crystal boule. The temperature behavior of the BAW modes was extracted using the CRT technique, which has the advantage of not relying on temperature-dependent acoustic couplants. Finally, the extracted temperature coefficients for the elastic and piezoelectric constants between room temperature and 120 degrees C are reported and discussed in this work.

  5. Neutron spectrum measurements using proton recoil proportional counters: results of measurements of leakage spectra for the Little Boy assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, E.F.; Yule, T.J.

    1984-01-01

    Measurements of degraded fission-neutron spectra using recoil proportional counters are done routinely for studies involving fast reactor mockups. The same techniques are applicable to measurements of neutron spectra required for personnel dosimetry in fast neutron environments. A brief discussion of current applications of these methods, together with the results of a measurement made on the LITTLE BOY assembly at Los Alamos, is presented here.

  6. The Evolving Field of Wound Measurement Techniques: A Literature Review.

    PubMed

    Khoo, Rachel; Jansen, Shirley

    2016-06-01

    Wound healing is a complex and multifactorial process that requires the involvement of a multidisciplinary approach. Methods of wound measurement have been developed and continually refined with the purpose of ensuring precision in wound measurement and documentation as the primary indicator of healing. This review aims to ascertain the efficacies of current wound area measurement techniques, and to highlight any perceived gaps in the literature so as to develop suggestions for future studies and practice. Medline, PubMed, CliniKey, and CINAHL were searched using the terms "wound/ulcer measurement techniques," "wound assessment," "digital planimetry," and "structured light." Articles between 2000 and 2014 were selected, and secondary searches were carried out by examining the references of relevant articles. Only papers written in English were included. A universal, standardized method of wound assessment has not been established or proposed. At present, techniques range from the simple to the more complex - most of which have characteristics that allow for applicability in both rural and urban settings. Techniques covered are: ruler measurements, acetate tracings/contact planimetry, digital planimetry, and structured light devices. Conclusion. In reviewing the literature, the precision and reliability of digital planimetry over the more conventional methods of ruler measurements and acetate tracings are consistently demonstrated. The advent and utility of the laser or structured light approach, however, is promising, has only been analyzed by a few, and opens up the scope for further evaluation of this technique.

  7. Psychrometric chart for physiological research

    NASA Technical Reports Server (NTRS)

    Chambers, A. B.

    1971-01-01

    Chart facilitates use of graphical techniques for solving problems involving thermodynamic properties of moist air. The properties are presented, and their units of measurement are listed. Chart presenting conditions at standard atmosphere pressure at sea level is most useful.

  8. Evaluation, Feasibility, and Design of a Three-Wavelength Infrared Atmospheric Aerosol Extinctiometer.

    DTIC Science & Technology

    1980-09-02

    laser or searchlight measurements. The study program consisted of three basic tasks: (1) a review of existing techniques for measuring aerosol extinction ...to aerosol extinction along a path can be deduced. Solutions to this problem fall into several classes. One class of solutions involves measuring ...employed such a windowless system to measure the absorption of an artificial aerosol consisting of quartz particles, using a CO2 laser in the

  9. Torque Measurement at the Single Molecule Level

    PubMed Central

    Forth, Scott; Sheinin, Maxim Y.; Inman, James; Wang, Michelle D.

    2017-01-01

    Methods for exerting and measuring forces on single molecules have revolutionized the study of the physics of biology. However, it is often the case that biological processes involve rotation or torque generation, and these parameters have been more difficult to access experimentally. Recent advances in the single molecule field have led to the development of techniques which add the capability of torque measurement. By combining force, displacement, torque, and rotational data, a more comprehensive description of the mechanics of a biomolecule can be achieved. In this review, we highlight a number of biological processes for which torque plays a key mechanical role. We describe the various techniques that have been developed to directly probe the torque experienced by a single molecule, and detail a variety of measurements made to date using these new technologies. We conclude by discussing a number of open questions and propose systems of study which would be well suited for analysis with torsional measurement techniques. PMID:23541162

  10. Prototype instrument for noninvasive ultrasonic inspection and identification of fluids in sealed containers

    NASA Astrophysics Data System (ADS)

    Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.

    2006-05-01

    Government agencies and homeland security related organizations have identified the need to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a portable, handheld, hazardous materials acoustic inspection prototype that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials encountered in various law enforcement inspection activities, the need for high measurement sensitivity and advanced ultrasonic measurement techniques was identified. The prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the prototype. High bandwidth ultrasonic transducers combined with an advanced pulse compression technique allowed researchers to 1) obtain high signal-to-noise ratios and 2) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of work conducted in the laboratory have demonstrated that the prototype experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.

  11. Computer quantitation of coronary angiograms

    NASA Technical Reports Server (NTRS)

    Ledbetter, D. C.; Selzer, R. H.; Gordon, R. M.; Blankenhorn, D. H.; Sanmarco, M. E.

    1978-01-01

    A computer technique is being developed at the Jet Propulsion Laboratory to automate the measurement of coronary stenosis. A Vanguard 35mm film transport is optically coupled to a Spatial Data System vidicon/digitizer which in turn is controlled by a DEC PDP 11/55 computer. Programs have been developed to track the edges of the arterial shadow, to locate normal and atherosclerotic vessel sections and to measure percent stenosis. Multiple frame analysis techniques are being investigated that involve on the one hand, averaging stenosis measurements from adjacent frames, and on the other hand, averaging adjacent frame images directly and then measuring stenosis from the averaged image. For the latter case, geometric transformations are used to force registration of vessel images whose spatial orientation changes.

  12. The Q-Sort method: use in landscape assessment research and landscape planning

    Treesearch

    David G. Pitt; Ervin H. Zube

    1979-01-01

    The assessment of visual quality inherently involves the measurement of perceptual response to landscape. The Q-Sort Method is a psychometric technique which produces reliable and valid interval measurements of people's perceptions of landscape visual quality as depicted in photographs. It is readily understood by participants across a wide range of age groups and...

  13. Use of an ultrasonic reflectance technique to examine bubble size changes in dough

    NASA Astrophysics Data System (ADS)

    Strybulevych, A.; Leroy, V.; Shum, A. L.; Koksel, H. F.; Scanlon, M. G.; Page, J. H.

    2012-12-01

    Bread quality largely depends on the manner in which bubbles are created and manipulated in the dough during processing. We have developed an ultrasonic reflectance technique to monitor bubbles in dough, even at high volume fractions, where near the bubble resonances it is difficult to make measurements using transmission techniques. A broadband transducer centred at 3.5 MHz in a normal incidence wave reflection set-up is used to measure longitudinal velocity and attenuation from acoustic impedance measurements. The technique is illustrated by examining changes in bubbles in dough due to two very different physical effects. In dough made without yeast, a peak in attenuation due to bubble resonance is observed at approximately 2 MHz. This peak diminishes rapidly and shifts to lower frequencies, indicative of Ostwald ripening of bubbles within the dough. The second effect involves the growth of bubble sizes due to gas generated by yeast during fermentation. This process is experimentally challenging to investigate with ultrasound because of very high attenuation. The reflectance technique allows the changes of the velocity and attenuation during fermentation to be measured as a function of frequency and time, indicating bubble growth effects that can be monitored even at high volume fractions of bubbles.

  14. A simple technique for measuring buoyant weight increment of entire, transplanted coral colonies in the field.

    PubMed

    Herler, Jürgen; Dirnwöber, Markus

    2011-10-31

    Estimating the impacts of global and local threats on coral reefs requires monitoring reef health and measuring coral growth and calcification rates at different time scales. This has traditionally been mostly performed in short-term experimental studies in which coral fragments were grown in the laboratory or in the field but measured ex situ. Practical techniques in which growth and measurements are performed over the long term in situ are rare. Apart from photographic approaches, weight increment measurements have also been applied. Past buoyant weight measurements under water involved a complicated and little-used apparatus. We introduce a new method that combines previous field and laboratory techniques to measure the buoyant weight of entire, transplanted corals under water. This method uses an electronic balance fitted into an acrylic glass underwater housing and placed atop an acrylic glass cube. Within this cube, corals transplanted onto artificial bases can be attached to the balance and weighed at predetermined intervals while they continue growth in the field. We also provide a set of simple equations for the volume and weight determinations required to calculate net growth rates. The new technique is highly accurate: low error of weight determinations due to variation of coral density (< 0.08%) and low standard error (< 0.01%) for repeated measurements of the same corals. We outline a transplantation technique for properly preparing corals for such long-term in situ experiments and measurements.
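
    The authors' own equations are not reproduced in the abstract; a standard Archimedes-based relation, sketched below, converts the buoyant (underwater) weight of an aragonite skeleton to dry weight using the densities of seawater and aragonite, with the density values and weighings shown here being illustrative assumptions.

```python
def dry_weight_from_buoyant(w_buoyant_g, rho_water=1.025, rho_skeleton=2.93):
    """Archimedes-based conversion of buoyant (underwater) weight to dry skeletal weight,
    assuming an aragonite skeleton (~2.93 g/cm^3) and neglecting the small tissue
    contribution. Densities in g/cm^3, weights in g."""
    return w_buoyant_g / (1.0 - rho_water / rho_skeleton)

# Growth between two in situ weighings of the same transplanted colony (hypothetical values).
w_initial, w_final = 31.2, 34.9    # buoyant weights, g
growth = dry_weight_from_buoyant(w_final) - dry_weight_from_buoyant(w_initial)
print(f"net skeletal growth ~ {growth:.1f} g dry weight")
```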

  15. Differential thermal voltammetry for tracking of degradation in lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Wu, Billy; Yufit, Vladimir; Merla, Yu; Martinez-Botas, Ricardo F.; Brandon, Nigel P.; Offer, Gregory J.

    2015-01-01

    Monitoring of lithium-ion batteries is of critical importance in electric vehicle applications in order to manage the operational condition of the cells. Measurements on a vehicle often involve current, voltage and temperature which enable in-situ diagnostic techniques. This paper presents a novel diagnostic technique, termed differential thermal voltammetry, which is capable of monitoring the state of the battery using voltage and temperature measurements in galvanostatic operating modes. This tracks battery degradation through phase transitions, and the resulting entropic heat, occurring in the electrodes. Experiments to monitor battery degradation using the new technique are compared with a pseudo-2D cell model. Results show that the differential thermal voltammetry technique provides information comparable to that of slow rate cyclic voltammetry at shorter timescale and with load conditions easier to replicate in a vehicle.
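
    Numerically, the quantity tracked by differential thermal voltammetry is the derivative of cell temperature with respect to voltage during galvanostatic operation, which can be formed from synchronized time series as dT/dV = (dT/dt)/(dV/dt); the sketch below uses synthetic voltage and temperature records, not measured cell data.

```python
import numpy as np

# Synthetic galvanostatic discharge records (for illustration only).
t = np.linspace(0.0, 3600.0, 721)                                      # s
V = 4.1 - 0.9 * (t / t[-1]) - 0.02 * np.sin(6 * np.pi * t / t[-1])     # cell voltage, V
T = 25.0 + 2.0 * (t / t[-1]) + 0.3 * np.sin(6 * np.pi * t / t[-1])     # cell temperature, C

# Differential thermal voltammetry: dT/dV = (dT/dt) / (dV/dt).
dtv = np.gradient(T, t) / np.gradient(V, t)

# Peaks in dT/dV plotted against V mark phase transitions and their entropic heat signatures.
print(f"dT/dV range: {dtv.min():.1f} to {dtv.max():.1f} K/V")
```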

  16. On-Line Measurement of Heat of Combustion of Gaseous Hydrocarbon Fuel Mixtures

    NASA Technical Reports Server (NTRS)

    Sprinkle, Danny R.; Chaturvedi, Sushil K.; Kheireddine, Ali

    1996-01-01

    A method for the on-line measurement of the heat of combustion of gaseous hydrocarbon fuel mixtures has been developed and tested. The method involves combustion of a test gas with a measured quantity of air to achieve a preset concentration of oxygen in the combustion products. A controller maintains the fuel (gas) volumetric flow rate at a level consistent with the desired oxygen concentration in the combustion products. The heat of combustion is determined from a known correlation with the fuel flow rate. An on-line computer accesses the fuel flow data and displays the heat of combustion measurement at desired time intervals. This technique appears to be especially applicable for measuring heats of combustion of hydrocarbon mixtures of unknown composition such as natural gas.

  17. Fabrication of glass gas cells for the HALOE and MAPS satellite experiments

    NASA Technical Reports Server (NTRS)

    Sullivan, E. M.; Walthall, H. G.

    1984-01-01

    The Halogen Occultation Experiment (HALOE) and the Measurement of Air Pollution from Satellites (MAPS) experiment are satellite-borne experiments which measure trace constituents in the Earth's atmosphere. The instruments which obtain the data for these experiments are based on the gas filter correlation radiometer measurement technique. In this technique, small samples of the gases of interest are encapsulated in glass cylinders, called gas cells, which act as very selective optical filters. This report describes the techniques employed in the fabrication of the gas cells for the HALOE and MAPS instruments. Details of the method used to fuse the sapphire windows (required for IR transmission) to the glass cell bodies are presented along with detailed descriptions of the jigs and fixtures used during the assembly process. The techniques and equipment used for window inspection and for pairing the HALOE windows are discussed. Cell body materials and the steps involved in preparing the cell bodies for the glass-to-sapphire fusion process are given.

  18. Mischel Technique; Technical Report 12. Disadvantaged Children and Their First School Experiences. ETS-Head Start Longitudinal Study. Technical Report Series.

    ERIC Educational Resources Information Center

    Lindstrom, David R.; Shipman, Virginia C.

    An adaptation of a technique devised by Mischel (1958) was used in the longitudinal study to measure delay of gratification. Adaptations involved (1) asking the child to identify the larger of two pieces of candy to facilitate comprehension of the rewards, (2) specifying a standard time limit for receipt of the delayed reward which would be…

  19. Erosion measurement techniques for plasma-driven railgun barrels

    NASA Astrophysics Data System (ADS)

    Jamison, K. A.; Niiler, Andrus

    1987-04-01

    Plasma-driven railguns are now in operation at several locations throughout the world. All share common problems in barrel erosion arising from the fact that the bore surface must contain a high temperature plasma armature which transmits the acceleration force to a projectile. The plasma temperature at the core of the armature is estimated to be 30 000 K or higher. Such conditions are erosive to most materials even when the exposure time is 100 μs or less. We have adapted two accelerator based techniques to aid in the study of this erosion. The first technique involves the collection and analysis of material ablated and left behind by the plasma. This analysis is based on the unfolding of the Rutherford backscattered (RBS) spectra of 1 MeV deuterons incident on residue collected from a railgun bore. The second technique is an erosion measurement involving thin layer activation (TLA) of surfaces. In this process, the copper rail surface is activated by 2.4 MeV protons creating a relatively thin (3 μm) layer sparsely seeded with a long lived zinc isotope. Monitoring the decay of the activated sample before and after a firing can detect surface wear of about 0.1 μm. Results from the RBS and TLA experiments on the BRL plasma driven railgun are described.

  20. Disability: a model and measurement technique.

    PubMed Central

    Williams, R G; Johnston, M; Willis, L A; Bennett, A E

    1976-01-01

    Current methods of ranking or scoring disability tend to be arbitrary. A new method is put forward on the hypothesis that disability progresses in regular, cumulative patterns. A model of disability is defined and tested with the use of Guttman scale analysis. Its validity is indicated on data from a survey in the community and from postsurgical patients, and some factors involved in scale variation are identified. The model provides a simple measurement technique and has implications for the assessment of individual disadvantage, for the prediction of progress in recovery or deterioration, and for evaluation of the outcome of treatment regimes. PMID:953379

  1. Astrometric observations of Saturn's satellites from McDonald Observatory, 1972. [using reference stars

    NASA Technical Reports Server (NTRS)

    Abbot, R. I.; Mulholland, J. D.; Shelus, P. J.

    1974-01-01

    Observations of Saturn's satellites were reduced by means of secondary reference stars obtained by reduction of Palomar Sky Survey (PSS) plates. This involved the use of 39 SAO stars and plate overlap technique to determine the coordinates of 59 fainter stars in the satellite field. Fourteen plate constants were determined for each of the two PSS plates. Comparison of two plate measurement and reduction techniques on the satellite measurements demonstrates the existence of a serious background gradient effect and the utility of microdensitometry to eliminate this error source in positional determinations of close satellites.

  2. Astrometric observations of Saturn's satellites from McDonald Observatory, 1972

    NASA Technical Reports Server (NTRS)

    Abbot, R. I.; Mulholland, J. D.; Shelus, P. J.

    1975-01-01

    Observations of Saturn's satellites have been reduced by means of secondary reference stars obtained by reduction of Palomar Sky Survey plates. This involved the use of 29 SAO stars and plate overlap technique to determine the coordinates of 59 fainter stars in the satellite field. Fourteen plate constants were determined for each of the two PSS plates. Comparison of two plate measurement and reduction techniques on the satellite measures appears to demonstrate the existence of a serious background gradient effect and the utility of microdensitometry to eliminate this error source in positional determinations of close satellites.

  3. Material Measurements Using Groundplane Apertures

    NASA Technical Reports Server (NTRS)

    Komisarek, K.; Dominek, A.; Wang, N.

    1995-01-01

    A technique for material parameter determination using an aperture in a groundplane is studied. The material parameters are found by relating the measured reflected field in the aperture to a numerical model. Two apertures are studied, which can have a variety of different material configurations covering the aperture. The aperture cross sections studied are rectangular and coaxial. The material configurations involved combinations of single and dual layers with or without an exterior resistive sheet. The resistivity of the sheet can be specified to simulate anything from a perfect electric conductor (PEC) backing (0 Ohms/square) to a free-space backing (infinite Ohms/square). Numerical parameter studies and measurements were performed to assess the feasibility of the technique.

  4. A technique for estimating dry deposition velocities based on similarity with latent heat flux

    NASA Astrophysics Data System (ADS)

    Pleim, Jonathan E.; Finkelstein, Peter L.; Clarke, John F.; Ellestad, Thomas G.

    Field measurements of chemical dry deposition are needed to assess impacts and trends of airborne contaminants on the exposure of crops and unmanaged ecosystems as well as for the development and evaluation of air quality models. However, accurate measurements of dry deposition velocities require expensive eddy correlation measurements and can only be practically made for a few chemical species such as O 3 and CO 2. On the other hand, operational dry deposition measurements such as those used in large area networks involve relatively inexpensive standard meteorological and chemical measurements but rely on less accurate deposition velocity models. This paper describes an intermediate technique which can give accurate estimates of dry deposition velocity for chemical species which are dominated by stomatal uptake such as O 3 and SO 2. This method can give results that are nearly the quality of eddy correlation measurements of trace gas fluxes at much lower cost. The concept is that bulk stomatal conductance can be accurately estimated from measurements of latent heat flux combined with standard meteorological measurements of humidity, temperature, and wind speed. The technique is tested using data from a field experiment where high quality eddy correlation measurements were made over soybeans. Over a four month period, which covered the entire growth cycle, this technique showed very good agreement with eddy correlation measurements for O 3 deposition velocity.
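
    A minimal sketch of the resistance-analogy reasoning behind this approach (not the authors' parameterization): the bulk stomatal resistance is inferred by inverting the measured latent heat flux, scaled by the ratio of the molecular diffusivities of water vapor and the trace gas, and the deposition velocity then follows from the series sum of the aerodynamic, quasi-laminar boundary-layer, and surface resistances. All parameter values below are hypothetical.

```python
import numpy as np

LAMBDA = 2.45e6        # latent heat of vaporization, J kg^-1
RHO_A = 1.2            # air density, kg m^-3
P_KPA = 101.3          # surface pressure, kPa
D_RATIO_O3 = 1.6       # approximate ratio of molecular diffusivities, D_H2O / D_O3

def q_sat(T_c):
    """Saturation specific humidity (kg/kg) from Tetens' formula."""
    e_s = 0.6108 * np.exp(17.27 * T_c / (T_c + 237.3))   # kPa
    return 0.622 * e_s / (P_KPA - 0.378 * e_s)

def o3_deposition_velocity(LE, T_surf_c, q_air, r_a, r_b):
    """Big-leaf estimate of the O3 deposition velocity (m/s) for a stomatally
    dominated surface, with bulk stomatal resistance inverted from the measured
    latent heat flux LE (W m^-2). A simplified sketch, not an operational model."""
    E = LE / LAMBDA                                         # evaporation rate, kg m^-2 s^-1
    r_s_h2o = RHO_A * (q_sat(T_surf_c) - q_air) / E - r_a   # bulk stomatal resistance, s/m
    r_c = r_s_h2o * D_RATIO_O3                              # scale to ozone diffusivity
    return 1.0 / (r_a + r_b + r_c)

# Hypothetical half-hour averages over a soybean canopy.
vd = o3_deposition_velocity(LE=350.0, T_surf_c=27.0, q_air=0.012, r_a=30.0, r_b=15.0)
print(f"estimated O3 deposition velocity ~ {100 * vd:.2f} cm/s")
```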

  5. NASA EPSCoR Preparation Grant

    NASA Technical Reports Server (NTRS)

    Sukanek, Peter C.

    2002-01-01

    The NASA EPSCoR project in Mississippi involved investigations into three areas of interest to NASA by researchers at the four comprehensive universities in the state. These areas involved: (1) Noninvasive Flow Measurement Techniques, (2) Spectroscopic Exhaust Plume Measurements of Hydrocarbon Fueled Rocket Engines and (3) Integration of Remote Sensing and GIS data for Flood Forecasting on the Mississippi Gulf Coast. Each study supported a need at the Stennis Space Center in Mississippi. The first two addressed needs in rocket testing, and the third, in commercial remote sensing. Students from three of the institutions worked with researchers at Stennis Space Center on the projects.

  6. Thermal conductivity measurements on Mullite and Silica REI for the Space Shuttle - Complementary guarded hot plate and radial outflow measurements. [Reusable External Insulators]

    NASA Technical Reports Server (NTRS)

    Brazel, J. P.; Kennedy, B. S.

    1974-01-01

    The materials studied are described along with the apparatus and the experimental techniques employed. The results of the measurements involving two REI Silica materials and a Mod 1 B REI Mullite are listed in a table. Measurements were conducted at unusually high temperature differences to detect 'shine-through' radiation transparency. Photographs are presented of the high-temperature guarded hot plate assembly.

  7. Data selection techniques in the interpretation of MAGSAT data over Australia

    NASA Technical Reports Server (NTRS)

    Johnson, B. D.; Dampney, C. N. G.

    1983-01-01

    The MAGSAT data require critical selection in order to produce a self-consistent data set suitable for map construction and subsequent interpretation. Interactive data selection techniques are described which involve the use of a special-purpose profile-oriented data base and a colour graphics display. The careful application of these data selection techniques permits validation of every data value and ensures that the best possible self-consistent data set is used to construct the maps of the magnetic field measured at satellite altitudes over Australia.

  8. Advanced Ultrasonic Measurement Methodology for Non-Invasive Interrogation and Identification of Fluids in Sealed Containers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.

    2006-03-16

    The Hazardous Materials Response Unit (HMRU) and the Counterterrorism and Forensic Science Research Unit (CTFSRU), Laboratory Division, Federal Bureau of Investigation (FBI) have been mandated to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a portable, hand-held, hazardous materials acoustic inspection device (HAZAID) that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials, the need for high measurement sensitivity and advanced ultrasonic measurement techniques was identified. The HAZAID prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the HAZAID prototype. High bandwidth ultrasonic transducers combined with the advanced pulse compression technique allowed researchers to 1) impart large amounts of energy, 2) obtain high signal-to-noise ratios, and 3) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of this feasibility study demonstrated that the HAZAID experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.

  9. Techniques For Measuring Absorption Coefficients In Crystalline Materials

    NASA Astrophysics Data System (ADS)

    Klein, Philipp H.

    1981-10-01

    Absorption coefficients smaller than 0.001 cm⁻¹ can, with more or less difficulty, be measured by several techniques. With diligence, all methods can be refined to permit measurement of absorption coefficients as small as 0.00001 cm⁻¹. Spectral data are most readily obtained by transmission (spectrophotometric) methods, using multiple internal reflection to increase effective sample length. Emissivity measurements, requiring extreme care in the elimination of detector noise and stray light, nevertheless afford the most accessible spectral data in the 0.0001 to 0.00001 cm⁻¹ range. Single-wavelength information is most readily obtained with modifications of laser calorimetry. Thermocouple detection of energy absorbed from a laser beam is convenient, but involves dc amplification techniques and is susceptible to stray-light problems. Photoacoustic detection, using ac methods, tends to diminish errors of these types, but at some expense in experimental complexity. Laser calorimetry has been used for measurements of absorption coefficients as small as 0.000003 cm⁻¹. Both transmission and calorimetric data, taken as functions of intensity, have been used for measurement of nonlinear absorption coefficients.
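
    As a hedged numerical illustration of the laser-calorimetry approach mentioned above (not a specific published procedure), the absorbed power can be estimated from the initial heating slope of the sample under constant laser power, and the absorption coefficient follows when the total absorption is small (alpha*L << 1); every value below is a hypothetical placeholder.

```python
# Minimal laser-calorimetry estimate of a small absorption coefficient (alpha*L << 1),
# using the initial heating slope of the sample under constant incident laser power.
m_sample = 0.012     # kg, sample mass
c_p = 750.0          # J kg^-1 K^-1, specific heat of the crystal
P_laser = 40.0       # W, incident laser power
L = 0.05             # m, optical path length through the sample
dT_dt = 2.4e-5       # K/s, initial temperature-rise rate from the thermocouple record

P_absorbed = m_sample * c_p * dT_dt        # W, power deposited in the sample
alpha = P_absorbed / (P_laser * L)         # m^-1, since the absorbed fraction ~ alpha*L
print(f"absorption coefficient ~ {alpha * 1e-2:.1e} cm^-1")
```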

  10. Measurements of gluconeogenesis and glycogenolysis: A methodological review

    USDA-ARS?s Scientific Manuscript database

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to deter...

  11. High-Speed Imaging Optical Pyrometry for Study of Boron Nitride Nanotube Generation

    NASA Technical Reports Server (NTRS)

    Inman, Jennifer A.; Danehy, Paul M.; Jones, Stephen B.; Lee, Joseph W.

    2014-01-01

    A high-speed imaging optical pyrometry system is designed for making in-situ measurements of boron temperature during the boron nitride nanotube synthesis process. Spectrometer measurements show molten boron emission to be essentially graybody in nature, lacking spectral emission fine structure over the visible range of the electromagnetic spectrum. Camera calibration experiments are performed and compared with theoretical calculations to quantitatively establish the relationship between observed signal intensity and temperature. The one-color pyrometry technique described herein involves measuring temperature based upon the absolute signal intensity observed through a narrowband spectral filter, while the two-color technique uses the ratio of the signals through two spectrally separated filters. The present study calibrated both the one- and two-color techniques at temperatures between 1,173 K and 1,591 K using a pco.dimax HD CMOS-based camera along with three such filters having transmission peaks near 550 nm, 632.8 nm, and 800 nm.
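
    The camera calibration constants themselves are not given in the abstract; as a hedged sketch of the two-color (ratio) principle, under a graybody assumption and the Wien approximation the temperature follows from the ratio of the signals through two filters, as below. The wavelengths correspond to two of the filters mentioned (632.8 nm and 800 nm); the signal values and the unit channel-response ratio are assumptions.

```python
import math

C2 = 1.4388e-2   # second radiation constant, m*K

def two_color_temperature(s1, s2, lam1, lam2, k_ratio=1.0):
    """Two-color (ratio) pyrometry in the Wien approximation for a graybody.
    s1, s2: filtered signals at wavelengths lam1, lam2 (m); k_ratio is the relative
    instrument response of the two channels (from calibration; taken as 1 here)."""
    R = (s1 / s2) / k_ratio
    return C2 * (1.0 / lam2 - 1.0 / lam1) / (math.log(R) - 5.0 * math.log(lam2 / lam1))

# Hypothetical filtered signals from molten boron (arbitrary camera counts).
T = two_color_temperature(s1=1360.0, s2=10000.0, lam1=632.8e-9, lam2=800e-9)
print(f"two-color temperature ~ {T:.0f} K")   # ~1500 K for these assumed signals
```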

  12. Development of basic theories and techniques for determining stresses in rotating turbine or compressor blades

    NASA Technical Reports Server (NTRS)

    Chien, C. H.; Swinson, W. F.; Turner, J. L.; Moslehy, F. A.; Ranson, W. F.

    1980-01-01

    A method for measuring in-plane displacement of a rotating structure by using two laser speckle photographs is described. From the displacement measurements one can calculate strains and stresses due to a centrifugal load. This technique involves making separate speckle photographs of a test model. One photograph is made with the model loaded (model is rotating); the second photograph is made with no load on the model (model is stationary). A sandwich is constructed from the two speckle photographs and data are recovered in a manner similar to that used with conventional speckle photography. The basic theory, experimental procedures of this method, and data analysis of a simple rotating specimen are described. In addition the measurement of in-plane surface displacement components of a deformed solid, and the application of the coupled laser speckle interferometry and boundary-integral solution technique to two dimensional elasticity problems are addressed.

  13. A simple technique for measurement of pressure in the tympanitic rumen of cattle.

    PubMed

    Turner, C B; Whyte, T D

    1978-05-13

    The construction and method of use of a simple device for the non-invasive measurement of intra-rumenal pressure is outlined. Results obtained from calves suffering from increased intra-rumenal pressure (bloat) are shown. The method is capable of quantifying pressures involved in bloat and could be used to augment the visual assessment of bloat scoring.

  14. Lidar Measurements of Tropospheric Wind Profiles with the Double Edge Technique

    NASA Technical Reports Server (NTRS)

    Gentry, Bruce M.; Li, Steven X.; Korb, C. Laurence; Mathur, Savyasachee; Chen, Huailin

    1998-01-01

    Research has established the importance of global tropospheric wind measurements for large scale improvements in numerical weather prediction. In addition, global wind measurements provide data that are fundamental to the understanding and prediction of global climate change. These tasks are closely linked with the goals of the NASA Earth Science Enterprise and Global Climate Change programs. NASA Goddard has been actively involved in the development of direct detection Doppler lidar methods and technologies to meet the wind observing needs of the atmospheric science community. A variety of direct detection Doppler wind lidar measurements have recently been reported indicating the growing interest in this area. Our program at Goddard has concentrated on the development of the edge technique for lidar wind measurements. Implementations of the edge technique using either the aerosol or molecular backscatter for the Doppler wind measurement have been described. The basic principles have been verified in lab and atmospheric lidar wind experiments. The lidar measurements were obtained with an aerosol edge technique lidar operating at 1064 nm. These measurements demonstrated high spatial resolution (22 m) and high velocity sensitivity (rms variances of 0.1 m/s) in the planetary boundary layer (PBL). The aerosol backscatter is typically high in the PBL and the effects of the molecular backscatter can often be neglected. However, as was discussed in the original edge technique paper, the molecular contribution to the signal is significant above the boundary layer and a correction for the effects of molecular backscatter is required to make wind measurements. In addition, the molecular signal is a dominant source of noise in regions where the molecular to aerosol ratio is large since the energy monitor channel used in the single edge technique measures the sum of the aerosol and molecular signals. To extend the operation of the edge technique into the free troposphere we have developed a variation of the edge technique called the double edge technique. In this paper a ground based aerosol double edge lidar is described and the first measurements of wind profiles in the free troposphere obtained with this lidar will be presented.
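
    A highly simplified sketch of the double-edge idea follows: two filters placed on opposite slopes of the backscatter spectrum give signals whose normalized difference is monotonic in the Doppler shift, which is then converted to line-of-sight speed via v = lambda*dnu/2. The Gaussian filter shapes, their placement, and the purely aerosol return (no molecular background, which the paper corrects for) are illustrative assumptions, not the instrument parameters.

```python
import numpy as np

LAMBDA = 1064e-9  # m, aerosol-system wavelength quoted in the text

def gaussian(nu, center, fwhm):
    return np.exp(-0.5 * ((nu - center) / (fwhm / 2.3548)) ** 2)

def edge_response(dnu, grid, f1=+1.5e9, f2=-1.5e9, filt_fwhm=2e9, sig_fwhm=0.2e9):
    """Normalized double-edge response R = (S1 - S2)/(S1 + S2) for shift dnu (Hz).

    The constant grid spacing cancels in the ratio, so plain sums suffice.
    """
    spectrum = gaussian(grid, dnu, sig_fwhm)                # Doppler-shifted aerosol return
    s1 = np.sum(gaussian(grid, f1, filt_fwhm) * spectrum)   # edge filter 1
    s2 = np.sum(gaussian(grid, f2, filt_fwhm) * spectrum)   # edge filter 2
    return (s1 - s2) / (s1 + s2)

grid = np.linspace(-6e9, 6e9, 4001)
shifts = np.linspace(-1e9, 1e9, 201)
calibration = np.array([edge_response(d, grid) for d in shifts])   # monotonic response curve

measured_R = edge_response(10e6, grid)                  # pretend measurement (10 MHz shift)
dnu_est = np.interp(measured_R, calibration, shifts)    # invert the calibration curve
print(f"{dnu_est/1e6:.1f} MHz -> v_los = {LAMBDA * dnu_est / 2:.2f} m/s")   # ~5.3 m/s
```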

  15. Feasibility Study of a Rotorcraft Health and Usage Monitoring System ( HUMS): Usage and Structural Life Monitoring Evaluation

    NASA Technical Reports Server (NTRS)

    Dickson, B.; Cronkhite, J.; Bielefeld, S.; Killian, L.; Hayden, R.

    1996-01-01

    The objective of this study was to evaluate two techniques, Flight Condition Recognition (FCR) and Flight Load Synthesis (FLS), for usage monitoring and assess the potential benefits of extending the retirement intervals of life-limited components, thus reducing the operator's maintenance and replacement costs. Both techniques involve indirect determination of loads using measured flight parameters and subsequent fatigue analysis to calculate the life expended on the life-limited components. To assess the potential benefit of usage monitoring, the two usage techniques were compared to current methods of component retirement. In addition, comparisons were made with direct load measurements to assess the accuracy of the two techniques. The data used for the evaluation of the usage monitoring techniques were collected under an independent HUMS flight trial program, using a commercially available HUMS and data recording system. The usage data collected from the HUMS trial aircraft were analyzed off-line using PC-based software that included the FCR and FLS techniques. In the future, if the techniques prove feasible, usage monitoring would be incorporated into the onboard HUMS.
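
    Both approaches end in a fatigue calculation of the life expended; a minimal sketch of the standard linear (Miner's rule) damage accumulation is shown below. The power-law S-N constants and the per-condition cycle counts are placeholders, since the record does not give the actual fatigue methodology or component data.

```python
def fatigue_damage_fraction(cycle_counts, load_amplitudes, A, m):
    """Linear (Miner's rule) damage fraction for counted load cycles.

    Assumes a power-law S-N curve, allowable cycles N(S) = A * S**(-m);
    a damage fraction of 1.0 corresponds to nominal retirement of the part.
    """
    damage = 0.0
    for n, S in zip(cycle_counts, load_amplitudes):
        damage += n / (A * S ** (-m))
    return damage

# Placeholder example: cycles counted per recognized flight condition (FCR) or per
# synthesized load level (FLS), with hypothetical S-N constants A and m.
print(fatigue_damage_fraction([1.2e4, 3.0e3, 150.0], [40.0, 70.0, 120.0], A=1e12, m=4.0))
```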

  16. Compaction behavior of surrogate degraded emplaced WIPP waste.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broome, Scott Thomas; Bronowski, David R.; Kuthakun, Souvanny James

    The present study results are focused on laboratory testing of surrogate waste materials. The surrogate wastes correspond to a conservative estimate of degraded Waste Isolation Pilot Plant (WIPP) containers and TRU waste materials at the end of the 10,000 year regulatory period. Testing consists of hydrostatic, triaxial, and uniaxial strain tests performed on surrogate waste recipes that were previously developed by Hansen et al. (1997). These recipes can be divided into materials that simulate 50% and 100% degraded waste by weight. The percent degradation indicates the anticipated amount of iron corrosion, as well as the decomposition of cellulosics, plastics, and rubbers (CPR). Axial, lateral, and volumetric strain and axial, lateral, and pore stress measurements were made. Two unique testing techniques were developed during the course of the experimental program. The first involves the use of dilatometry to measure sample volumetric strain under a hydrostatic condition. Bulk moduli of the samples measured using this technique were consistent with those measured using more conventional methods. The second technique involved performing triaxial tests under lateral strain control. By limiting the lateral strain to zero by controlling the applied confining pressure while loading the specimen axially in compression, one can maintain a right-circular cylindrical geometry even under large deformations. This technique is preferred over standard triaxial testing methods, which result in inhomogeneous deformation or "barreling." Manifestations of the inhomogeneous deformation included non-uniform stress states, as well as unrealistic Poisson's ratios (> 0.5) or those that vary significantly along the length of the specimen. Zero lateral strain controlled tests yield a more uniform stress state, and admissible and uniform values of Poisson's ratio.

  17. Intercomparison of HONO Measurements Made Using Wet-Chemical (NITROMAC) and Spectroscopic (IBBCEAS & LP/FAGE) Techniques

    NASA Astrophysics Data System (ADS)

    Dusanter, S.; Lew, M.; Bottorff, B.; Bechara, J.; Mielke, L. H.; Berke, A.; Raff, J. D.; Stevens, P. S.; Afif, C.

    2013-12-01

    A good understanding of the oxidative capacity of the atmosphere is important to tackle fundamental issues related to climate change and air quality. The hydroxyl radical (OH) is the dominant oxidant in the daytime troposphere and an accurate description of its sources in atmospheric models is of utmost importance. Recent field studies indicate higher-than-expected concentrations of HONO during the daytime, suggesting that the photolysis of HONO may be an important underestimated source of OH. Understanding the tropospheric HONO budget requires confidence in analytical instrumentation capable of selectively measuring HONO. In this presentation, we discuss an intercomparison study of HONO measurements performed during summer 2013 at the edge of a hardwood forest in Southern Indiana. This exercise involved a wet chemical technique (NITROMAC), an Incoherent Broad-Band Cavity Enhanced Absorption Spectroscopy instrument (IBBCEAS), and a Laser-Photofragmentation/Fluorescence Assay by Gas Expansion instrument (LP/FAGE). The agreement observed between the three techniques will be discussed for both ambient measurements and cross calibration experiments.

  18. Thermoluminescence Dosimetry (TLD) and its Application in Medical Physics

    NASA Astrophysics Data System (ADS)

    Azorín Nieto, Juan

    2004-09-01

    Radiation dosimetry is fundamental in Medical Physics, involving patient and phantom dosimetry. In both cases, thermoluminescence dosimetry (TLD) is the most appropriate technique for measuring the absorbed dose. In this paper, the thermoluminescence phenomenon, as well as the use of TLD in radiodiagnosis and radiotherapy for in vivo or in-phantom measurements, is discussed. Some results of measurements made in radiotherapy and radiodiagnosis using homemade LiF:Mg,Cu,P+PTFE TLDs are presented.

  19. The Synthesis of Proteins-A Simple Experiment To Show the Procedures and Problems of Using Radioisotopes in Biochemical Studies

    NASA Astrophysics Data System (ADS)

    Hawcroft, David M.

    1996-11-01

    Courses of organic chemistry frequently include studies of biochemistry and hence of biochemical techniques. Radioisotopes have played a major role in the understanding of metabolic pathways, transport, enzyme activity and other processes. The experiment described in this paper uses simple techniques to illustrate the procedures involved in working with radioisotopes when following a simplified metabolic pathway. Safety considerations are discussed and a list of safety rules is provided, but the experiment itself uses very low levels of a weak beta-emitting isotope (tritium). Plant material is suggested to reduce legal, financial and emotive problems, but the techniques are applicable to all soft-tissued material. The problems involved in data interpretation in radioisotope experiments resulting from radiation quenching are resolved by simple correction calculations, and the merits of using radioisotopes shown by a calculation of the low mass of material being measured. Suggestions for further experiments are given.

  20. Problems of millipound thrust measurement. The "Hansen Suspension"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carta, David G.

    Considered in detail are problems which led to the need and use of the 'Hansen Suspension'. Also discussed are problems which are likely to be encountered in any low level thrust measuring system. The methods of calibration and the accuracies involved are given careful attention. With all parameters optimized and calibration techniques perfected, the system was found capable of a resolution of 10 μlb. A comparison of thrust measurements made by the 'Hansen Suspension' with measurements of a less sophisticated device leads to some surprising results.

  1. Sensitivity studies and laboratory measurements for the laser heterodyne spectrometer experiment

    NASA Technical Reports Server (NTRS)

    Allario, F.; Katzberg, S. J.; Larsen, J. C.

    1980-01-01

    Several experiments involving spectral scanning interferometers and gas filter correlation radiometers (ref. 2) using limb scanning solar occultation techniques under development for measurements of stratospheric trace gases from Spacelab and satellite platforms are described. An experiment to measure stratospheric trace constituents by Laser Heterodyne Spectroscopy, a summary of sensitivity analyses, and supporting laboratory measurements are presented for O3, ClO, and H2O2 in which the instrument transfer function is modeled using a detailed optical receiver design.

  2. 40 CFR 1065.150 - Continuous sampling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Continuous sampling. 1065.150 Section... ENGINE-TESTING PROCEDURES Equipment Specifications § 1065.150 Continuous sampling. You may use continuous sampling techniques for measurements that involve raw or dilute sampling. Make sure continuous sampling...

  3. 40 CFR 1065.150 - Continuous sampling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Continuous sampling. 1065.150 Section... ENGINE-TESTING PROCEDURES Equipment Specifications § 1065.150 Continuous sampling. You may use continuous sampling techniques for measurements that involve raw or dilute sampling. Make sure continuous sampling...

  4. Highway-railway at-grade crossing structures : long term settlement measurements and assessments.

    DOT National Transportation Integrated Search

    2016-03-22

    A common maintenance technique to correct track geometry at bridge transitions is hand tamping. The first section presents a non-invasive track monitoring system involving high-speed video cameras that evaluates the change in track behavior before an...

  5. EVALUATION OF ACID DEPOSITION MODELS USING PRINCIPAL COMPONENT SPACES

    EPA Science Inventory

    An analytical technique involving principal components analysis is proposed for use in the evaluation of acid deposition models. Relationships among model predictions are compared to those among measured data, rather than the more common one-to-one comparison of predictions to mea...
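
    A minimal sketch of the idea, comparing the principal-component structure of predictions with that of observations rather than pairing values one-to-one; the data arrays, variable scalings, and the single cosine-based similarity score used here are illustrative assumptions, not the evaluation protocol of the abstract.

```python
import numpy as np

def leading_components(X, k=2):
    """Leading k principal directions (rows) of a data matrix X (samples x variables)."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return vt[:k]

def pc_similarity(observed, predicted, k=2):
    """Mean |cosine| between corresponding leading PCs of observed and predicted data."""
    v_o, v_p = leading_components(observed, k), leading_components(predicted, k)
    return float(np.mean(np.abs(np.sum(v_o * v_p, axis=1))))

# Illustrative data: rows are sites/events, columns are deposition variables with
# distinct variances so the principal directions are well defined.
rng = np.random.default_rng(0)
observed = rng.normal(size=(200, 5)) * np.array([3.0, 2.0, 1.0, 0.5, 0.2])
predicted = observed * np.array([1.1, 0.9, 1.0, 1.05, 0.95]) + 0.1 * rng.normal(size=(200, 5))
print(pc_similarity(observed, predicted))   # near 1.0 indicates similar correlation structure
```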

  6. Blood doping by cobalt. Should we measure cobalt in athletes?

    PubMed

    Lippi, Giuseppe; Franchini, Massimo; Guidi, Gian Cesare

    2006-07-24

    Blood doping is commonplace in competitive athletes who seek to enhance their aerobic performances through illicit techniques. Cobalt, a naturally-occurring element with properties similar to those of iron and nickel, induces a marked and stable polycythemic response through a more efficient transcription of the erythropoietin gene. Although little information is available so far on cobalt metabolism, reference value ranges or supplementation in athletes, there is emerging evidence that cobalt is used as a supplement and increased serum concentrations are occasionally observed in athletes. Therefore, given the athlete's connatural inclination to experiment with innovative, unfair and potentially unhealthy doping techniques, cobalt administration might soon become the most suited complement or surrogate for erythropoiesis-stimulating substances. Nevertheless, cobalt administration is not free from unsafe consequences, which involve toxic effects on heart, liver, kidney, thyroid and cancer promotion. Cobalt is easily purchasable, inexpensive and not currently included in the World Anti-Doping Agency prohibited list. Moreover, available techniques for measuring whole blood, serum, plasma or urinary cobalt involve analytic approaches which are currently not practical for antidoping laboratories. Thus more research on cobalt metabolism in athletes is compelling, along with implementation of effective strategies to unmask this potentially deleterious doping practice.

  7. Determination of minor and trace elements in kidney stones by x-ray fluorescence analysis

    NASA Astrophysics Data System (ADS)

    Srivastava, Anjali; Heisinger, Brianne J.; Sinha, Vaibhav; Lee, Hyong-Koo; Liu, Xin; Qu, Mingliang; Duan, Xinhui; Leng, Shuai; McCollough, Cynthia H.

    2014-03-01

    The determination of the accurate material composition of a kidney stone is crucial for understanding its formation as well as for preventive therapeutic strategies. Radiation-probing instrumental activation analysis techniques are excellent tools for identifying the materials present in a kidney stone. In particular, x-ray fluorescence (XRF) can be very useful for the determination of minor and trace materials in the kidney stone. The X-ray fluorescence measurements were performed at the Radiation Measurements and Spectroscopy Laboratory (RMSL) of the Department of Nuclear Engineering at the Missouri University of Science and Technology, and the kidney stones were acquired from the Mayo Clinic, Rochester, Minnesota. In the present work, experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stones. A new type of experimental set-up was developed and utilized for XRF analysis of the kidney stones. The correlation of the applied radiation source intensity, the X-ray spectrum emitted from the elements involved, and the absorption coefficient characteristics was analyzed. To verify the experimental results against analytical calculations, several sets of kidney stones were analyzed using the XRF technique. The elements identified with this technique are silver (Ag), arsenic (As), bromine (Br), chromium (Cr), copper (Cu), gallium (Ga), germanium (Ge), molybdenum (Mo), niobium (Nb), rubidium (Rb), selenium (Se), strontium (Sr), yttrium (Y), and zirconium (Zr). This paper presents a new approach for the accurate determination of the material composition of kidney stones using the XRF instrumental activation analysis technique.

  8. Fibre Optic Sensors for Selected Wastewater Characteristics

    PubMed Central

    Chong, Su Sin; Abdul Aziz, A. R.; Harun, Sulaiman W.

    2013-01-01

    Demand for online and real-time measurement techniques to meet environmental regulation and treatment compliance is increasing. However, the conventional techniques, which involve scheduled sampling and chemical analysis, can be expensive and time consuming. Therefore, cheaper and faster alternatives to conventional methods are required for monitoring wastewater characteristics. This paper reviews existing conventional techniques and optical and fibre optic sensors for determining selected wastewater characteristics, namely colour, Chemical Oxygen Demand (COD) and Biological Oxygen Demand (BOD). The review confirms that with appropriate configuration, calibration and fibre features, these parameters can be determined with accuracy comparable to conventional methods. With more research in this area, the potential for using FOS for online and real-time measurement of more wastewater parameters for various types of industrial effluent is promising. PMID:23881131

  9. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  10. ELISA test for anti-neutrophil cytoplasm antibodies detection evaluated by a computer screen photo-assisted technique.

    PubMed

    Filippini, D; Tejle, K; Lundström, I

    2005-08-15

    The computer screen photo-assisted technique (CSPT), a method for substance classification based on spectral fingerprinting that involves just a computer screen and a web camera as the measuring platform, is used here for the evaluation of a prospective enzyme-linked immunosorbent assay (ELISA). An anti-neutrophil cytoplasm antibody (ANCA) ELISA test, typically used for diagnosing patients suffering from chronic inflammatory disorders in the skin, joints, blood vessels and other tissues, is comparatively tested with a standard microplate reader and CSPT, yielding equivalent results at a fraction of the instrumental cost. The CSPT approach is discussed as a distributed measuring platform allowing decentralized measurements in routine applications, while keeping centralized information management due to its natural network-embedded operation.

  11. Microfluidic ultralow interfacial tensiometry with magnetic particles.

    PubMed

    Tsai, Scott S H; Wexler, Jason S; Wan, Jiandi; Stone, Howard A

    2013-01-07

    We describe a technique that measures ultralow interfacial tensions using paramagnetic spheres in a co-flow microfluidic device designed with a magnetic section. Our method involves tuning the distance between the co-flowing interface and the magnet's center, and observing the behavior of the spheres as they approach the liquid-liquid interface: the particles either pass through or are trapped by the interface. Using threshold values of the magnet-to-interface distance, we make estimates of the two-fluid interfacial tension. We demonstrate the effectiveness of this technique for measuring very low interfacial tensions, O(10⁻⁶-10⁻⁵) N m⁻¹, by testing solutions of different surfactant concentrations, and we show that our results are comparable with measurements made using a spinning drop tensiometer.

  12. Ultraviolet absorption: Experiment MA-059. [measurement of atmospheric species concentrations

    NASA Technical Reports Server (NTRS)

    Donahue, T. M.; Hudson, R. D.; Rawlins, W. T.; Anderson, J.; Kaufman, F.; Mcelroy, M. B.

    1977-01-01

    A technique devised to permit the measurement of atmospheric species concentrations is described. This technique involves the application of atomic absorption spectroscopy and the quantitative observation of resonance fluorescence in which atomic or molecular species scatter resonance radiation from a light source into a detector. A beam of atomic oxygen and atomic nitrogen resonance radiation, strong unabsorbable oxygen and nitrogen radiation, and visual radiation was sent from Apollo to Soyuz. The density of atomic oxygen and atomic nitrogen between the two spacecraft was measured by observing the amount of resonance radiation absorbed when the line joining Apollo and Soyuz was perpendicular to their velocity with respect to the ambient atmosphere. Results of postflight analysis of the resonance fluorescence data are discussed.

  13. The vibro-acoustic mapping of low gravity trajectories on a Learjet aircraft

    NASA Technical Reports Server (NTRS)

    Grodsinsky, C. M.; Sutliff, T. J.

    1990-01-01

    Terrestrial low gravity research techniques have been employed to gain a more thorough understanding of basic science and technology concepts. One technique frequently used involves flying parabolic trajectories aboard the NASA Lewis Research Center Learjet aircraft. A measurement program was developed to support an isolation system conceptual design. This program primarily was intended to measure time correlated high frequency accelerations (up to 100 Hz) present at various locations throughout the Learjet during a series of trajectories and flights. As suspected, the measurements obtained revealed that the environment aboard such an aircraft cannot simply be described in terms of the static level low gravity g vector obtained, but that it also must account for both rigid body and high frequency vibro-acoustic dynamics.

  14. Development of metrology for freeform optics in reflection mode

    NASA Astrophysics Data System (ADS)

    Burada, Dali R.; Pant, Kamal K.; Mishra, Vinod; Bichra, Mohamed; Khan, Gufran S.; Sinzinger, Stefan; Shakher, Chandra

    2017-06-01

    The increased range of manufacturable freeform surfaces offered by new fabrication techniques is creating opportunities to incorporate them in optical systems. However, the success of these fabrication techniques depends on the capabilities of metrology procedures and a feedback mechanism to CNC machines for optimizing the manufacturing process. Therefore, a precise and in-situ metrology technique for freeform optics is in demand. Although the techniques available for aspheres have been extended to freeform surfaces by researchers, none of them has yet been incorporated into the manufacturing machine for in-situ measurement. The most obvious reason is the complexity involved in the optical setups to be integrated in the manufacturing platforms. The Shack-Hartmann sensor offers the potential to be incorporated into the machine environment due to its vibration insensitivity, compactness and 3D shape measurement capability from slope data. In the present work, a measurement scheme is reported in which a scanning Shack-Hartmann sensor has been employed and used as a metrology tool for measurement of freeform surfaces in reflection mode. Simulation studies are conducted for analyzing the stitching accuracy in the presence of various misalignment errors. The proposed scheme is experimentally verified on a freeform surface with a cubic phase profile.
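
    The reconstruction step, surface height from measured slope maps, can be illustrated with a generic Fourier-domain least-squares integration (Frankot-Chellappa style). This is not the stitching algorithm of the paper; the periodic boundary assumption and the synthetic test surface are assumptions made only for the sketch.

```python
import numpy as np

def integrate_slopes(sx, sy, dx=1.0, dy=1.0):
    """Least-squares surface from slope maps sx = dz/dx and sy = dz/dy.

    Fourier (Frankot-Chellappa) solution; assumes periodic boundaries and
    returns a zero-mean surface (the piston term is arbitrary).
    """
    ny, nx = sx.shape
    kx = 2j * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2j * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    denom = np.abs(KX) ** 2 + np.abs(KY) ** 2
    denom[0, 0] = 1.0                                   # avoid divide-by-zero at DC
    Z = (np.conj(KX) * np.fft.fft2(sx) + np.conj(KY) * np.fft.fft2(sy)) / denom
    Z[0, 0] = 0.0                                       # remove piston
    return np.real(np.fft.ifft2(Z))

# Round-trip check on a smooth periodic test surface with analytic slopes.
y, x = np.mgrid[0:128, 0:128] / 128.0
z = 0.5 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)
sx = np.pi * np.cos(2 * np.pi * x) * np.cos(2 * np.pi * y)
sy = -np.pi * np.sin(2 * np.pi * x) * np.sin(2 * np.pi * y)
print(np.max(np.abs(z - integrate_slopes(sx, sy, dx=1 / 128, dy=1 / 128))))  # ~1e-15
```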

  15. Attaching Thermocouples by Peening or Crimping

    NASA Technical Reports Server (NTRS)

    Murtland, Kevin; Cox, Robert; Immer, Christopher

    2006-01-01

    Two simple, effective techniques for attaching thermocouples to metal substrates have been devised for high-temperature applications in which attachment by such conventional means as welding, screws, epoxy, or tape would not be effective. The techniques have been used successfully to attach 0.005-in. (0.127-mm)-diameter type-S thermocouples to substrates of niobium alloy C-103 and stainless steel 416 for measuring temperatures up to 2,600 °F (1,427 °C). The techniques are equally applicable to other thermocouple and substrate materials. In the first technique, illustrated in the upper part of the figure, a hole slightly wider than twice the diameter of one thermocouple wire is drilled in the substrate. The thermocouple is placed in the hole, then the edge of the hole is peened in one or more places by use of a punch (see figure). The deformed material at the edge secures the thermocouple in the hole. In the second technique, a hole is drilled as in the first technique, then an annular relief area is machined around the hole, resulting in a structure reminiscent of a volcano in a crater. The thermocouple is placed in the hole as in the first technique, then the "volcano" material is either peened by use of a punch or crimped by use of sidecutters to secure the thermocouple in place. This second technique is preferable for very thin thermocouples [wire diameter ≤0.005 in. (0.127 mm)] because standard peening poses a greater risk of clipping one or both of the thermocouple wires. These techniques offer the following advantages over prior thermocouple-attachment techniques: Because they involve drilling of very small holes, they are minimally invasive, an important advantage in that, to a first approximation, the thermal properties of surrounding areas are not appreciably affected. They do not involve introduction of any material, other than the substrate and thermocouple materials, that could cause contamination, decompose, or oxidize at high measurement temperatures. Their simplicity makes it possible to attach thermocouples quickly. Finally, they can be used to attach thermocouples at locations where access is somewhat restricted by surrounding objects.

  16. Hazards and Health Risks Encountered by Manual Sand Dredgers from Udupi, India: A Cross-sectional Study

    PubMed Central

    Shaikh, Alfiya; Nayak, Priyanka; Navada, Rajesh

    2017-01-01

    Introduction Globalization and urbanization have resulted in an increased demand for sand dredging. Legal and environmental restrictions on automated dredging have led to a rise in the manual technique. The working techniques and environment involved in manual sand dredging may expose the workers to multiple work-related disorders. Aim To determine the health risks and occupational hazards involved in manual sand dredging. Materials and Methods An assessment schedule was developed and its content was validated by five experts for the study. A cross-sectional study was then conducted using this assessment schedule. Thirty manual sand dredgers were recruited from three randomly selected docks on the Swarna riverbed in Udupi district, Karnataka, India. Detailed work and worksite assessments were conducted using systematic observation and close-ended questions. Work-related health risk evaluation included onsite evaluation and self-reported health complaints. Results The prevalence of musculoskeletal pain and discomfort was 93.34%, with the lower back (70%), shoulder (56.7%) and neck (46.7%) being the most commonly involved regions. The prevalence of sensory deficits at multiple sites and of ear pain was 66.6% and 76.6%, respectively. All the workers recruited complained of dermatological and ophthalmic involvement. A lack of health and safety measures, such as personal protective devices and security schemes, was also identified. Conclusion This study shows a high prevalence of multiple work-related disorders and hazards involved in manual sand dredging, a highly demanding job in coastal Karnataka. A lack of health and safety measures was also identified. PMID:28892936

  17. Nuclear Resonance Fluorescence Response of U-235

    NASA Astrophysics Data System (ADS)

    Warren, Glen

    2008-05-01

    Nuclear resonance fluorescence (NRF) is a physical process that provides an isotopic-specific signature that could be used for the identification and characterization of materials. The technique involves the detection of prompt discrete-energy photons emitted from a sample, which is exposed to photons in the MeV energy range. Potential applications of the technique range from detection of high explosives to characterization of special nuclear materials. Pacific Northwest National Laboratory and Passport Systems have collaboratively conducted a set of measurements to search for an NRF response of U-235 in the 1.5 to 9 MeV energy range. Results from these measurements will be presented.

  18. Flow-Tagging Velocimetry for Hypersonic Flows Using Fluorescence of Nitric Oxide

    NASA Technical Reports Server (NTRS)

    Danehy, P. M.; O'Byrne, S.; Houwing, A. F. P.

    2001-01-01

    We investigate a new type of flow-tagging velocimetry technique for hypersonic flows. The technique involves exciting a thin line of nitric oxide molecules with a laser beam and then, after some delay, acquiring an image of the displaced line. One component of velocity is determined from the time of flight. This method is applied to measure the velocity profile in a Mach 8.5 laminar, hypersonic boundary layer in the Australian National University's T2 free-piston shock tunnel. The velocity is measured with an uncertainty of approximately 2%. Comparison with a CFD simulation of the flow shows reasonable agreement.
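
    The time-of-flight reduction itself is a one-line calculation; the sketch below also propagates the two measurement uncertainties to first order. The displacement, delay, and uncertainty values are placeholders, not the Mach 8.5 boundary-layer data.

```python
def tof_velocity(displacement_m, delay_s, sigma_disp_m=0.0, sigma_delay_s=0.0):
    """One velocity component from tagged-line displacement and probe delay,
    with first-order propagation of the displacement and timing uncertainties."""
    v = displacement_m / delay_s
    rel = ((sigma_disp_m / displacement_m) ** 2 + (sigma_delay_s / delay_s) ** 2) ** 0.5
    return v, v * rel

# Placeholder example: 1.2 mm line displacement over a 500 ns delay.
v, dv = tof_velocity(1.2e-3, 500e-9, sigma_disp_m=20e-6, sigma_delay_s=5e-9)
print(f"u = {v:.0f} +/- {dv:.0f} m/s")   # ~2400 m/s with roughly 2% uncertainty
```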

  19. Investigation of large α production in reactions involving weakly bound 7Li

    NASA Astrophysics Data System (ADS)

    Pandit, S. K.; Shrivastava, A.; Mahata, K.; Parkar, V. V.; Palit, R.; Keeley, N.; Rout, P. C.; Kumar, A.; Ramachandran, K.; Bhattacharyya, S.; Nanal, V.; Palshetkar, C. S.; Nag, T. N.; Gupta, Shilpi; Biswas, S.; Saha, S.; Sethi, J.; Singh, P.; Chatterjee, A.; Kailas, S.

    2017-10-01

    The origin of the large α-particle production cross sections in systems involving weakly bound 7Li projectiles has been investigated by measuring the cross sections of all possible fragment-capture as well as complete fusion using the particle-γ coincidence, in-beam, and off-beam γ-ray counting techniques for the 7Li+93Nb system at near Coulomb barrier energies. Almost all of the inclusive α-particle yield has been accounted for. While the t-capture mechanism is found to be dominant (~70%), compound nuclear evaporation and breakup processes contribute ~15% each to the inclusive α-particle production in the measured energy range. Systematic behavior of the t-capture and inclusive α cross sections for reactions involving 7Li over a wide mass range is also reported.

  20. Improved format for radiocardiographic data

    NASA Technical Reports Server (NTRS)

    Dimeff, J.; Sevelius, G.

    1973-01-01

    Technique involves introduction of radioactive sample into antecubital vein. Scintillation crystal mounted in collimating housing views portion of right and left hearts. As radioactive sample passes through heart, counting rate is measured by crystal and recorded on strip chart. Data is insensitive to geometric effects and other parameters.

  1. Design of a low cost Zimm-Crothers viscometer: From theory to experiment

    NASA Astrophysics Data System (ADS)

    Courbin, L.; Cristobal, G.; Winckert, M.; Panizza, P.

    2005-09-01

    To accurately measure low viscosities of liquids, we describe how a Zimm-Crothers viscometer works and how to build it. The viscometer involves the action of a rotating magnetic field on a metallic cylinder floating on the liquid to be studied. The principles of electromagnetism and fluid mechanics involved make the viscometer an excellent tool for undergraduate laboratory courses and for measuring the shear viscosity of low viscous fluids. We discuss the advantages and limitations of this inexpensive and easy to use apparatus compared to other classical techniques. Calibrations with Newtonian fluids are explained and experiments with non-Newtonian materials are discussed.

  2. Two-step fabrication technique of gold tips for use in point-contact spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narasiwodeyar, S.; Dwyer, M.; Liu, M.

    For a successful point-contact spectroscopy (PCS) measurement, metallic tips of proper shape and smoothness are essential to ensure the ballistic nature of a point-contact junction. Until recently, the fabrication of Au tips suitable for use in point-contact spectroscopy has remained more of an art involving a trial and error method rather than an automated scientific process. To address these issues, we have developed a technique with which one can prepare high quality Au tips reproducibly and systematically. It involves an electronic control of the driving voltages used for an electrochemical etching of a gold wire in an HCl-glycerol mixture or an HCl solution. We find that a stopping current, below which the circuit is set to shut off, is a single, very important parameter to produce an Au tip of desired shape. We present detailed descriptions for a two-step etching process for Au tips and also test results from PCS measurements using them.

  3. Ion beam analysis of diffusion in heterogeneous materials

    NASA Astrophysics Data System (ADS)

    Clough, A. S.; Jenneson, P. M.

    1998-04-01

    Ion-beam analysis has been applied to a variety of problems involving diffusion in heterogeneous materials. An energy loss technique has been used to study both the diffusion of water and the surface segregation of fluoropolymers in polymeric matrices. A scanning micro-beam technique has been developed to allow water concentrations in hydrophilic polymers and cements to be measured together with associated solute elements. It has also been applied to the diffusion of shampoo into hair.

  4. Recent advances in the characterization of high temperature industrial materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meadowcroft, D.B.; Tomkings, A.B.

    1995-12-31

    This paper reviews several techniques, under development or recently commercialized, which aid the characterization of high temperature plant components when carrying out lifetime predictions. Temperature measurements are frequently limited because of the limited lifetime and cost of thermocouples in aggressive environments, and three alternative methods of assessing the "average effective" temperature of a component being evaluated by the authors are described: steam-side oxide thickness (specifically for ferritic superheater tubes), copper-gold diffusion couples ("PETIT"), and the measurement of ferrite in duplex steels ("FEROPLUG"). Advances are described which have been made recently in the measurement techniques available for making plant measurements on components, to reduce the time needed for significant values of wastage rates to be established. In addition, on-line high-temperature corrosion monitors are becoming available which allow wastage rates to be assessed over periods of hours or days. These involve electrical resistance or electrochemical techniques. Finally, the use of thin layer activation by a radioactive isotope is highlighted, which enables the wastage of components to be assessed remotely without direct contact. Whilst available for a long time for laboratory and pilot plant studies, the authors are actively concerned with introducing the technique into operational boiler plant.

  5. The measurement of ultrafine particles: A pilot study using a portable particle counting technique to measure generated particles during a micromachining process

    NASA Astrophysics Data System (ADS)

    Handy, Rodney G.; Jackson, Mark J.; Robinson, Grant M.; Lafreniere, Michael D.

    2006-04-01

    The accurate measurement of airborne particles in the nanometer range is a challenging task. Because several studies have linked exposures to airborne ultrafine particles to elevated human health risks, the need to assess the concentrations of particles in the workplace that are below 100 nm in diameter is imperative. Several different techniques for monitoring nanoparticles are now available, and others are currently being tested for their merit. Laboratory condensation particle counters (CPC), field-portable CPC, nanometer differential mobility analyzers, electron microscopy, and other novel and experimental approaches to measuring nanoparticles have been recently used in investigations. The first part of this article gives an overview of these techniques, and provides the advantages and disadvantages for each. The second part of this article introduces a portable technique, coupling two particle measurement devices that are capable of characterizing microscale and nanoscale particles in the field environment. Specifically, this pilot study involved the use of a direct-reading CPC and a laser particle counter to measure airborne concentrations of ultrafine particles during a laboratory machining process. The measurements were evaluated in real time, and subsequently, decisions regarding human exposure could be made in an efficient and effective manner. Along with the results from this study, further research efforts in related areas are discussed.

  6. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
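
    A compact sketch of the two measurements named above, a normalized contrast evolution and a half-max width, is given below. It uses one common normalization (anomaly minus reference, divided by the reference rise above the pre-flash frame); the exact normalization, calibration, and data handling of the IR Contrast software are not reproduced, and all traces are synthetic placeholders.

```python
import numpy as np

def normalized_contrast(T_anom, T_ref):
    """Contrast evolution, (T_anom - T_ref) / (T_ref - T_ref[0]), frame 0 pre-flash."""
    rise = T_ref - T_ref[0]
    rise = np.where(rise == 0, np.nan, rise)             # undefined before the flash rise
    return (T_anom - T_ref) / rise

def half_max_width(profile, dx):
    """Indication width from a 1D contrast profile using the half-max criterion."""
    prof = profile - np.nanmin(profile)
    above = np.where(prof >= 0.5 * np.nanmax(prof))[0]
    return (above[-1] - above[0]) * dx if above.size else 0.0

# Synthetic placeholders: a cooling reference trace, a weak late-time anomaly
# signature, and a Gaussian-like spatial profile across the indication.
frames = np.arange(200, dtype=float)
T_ref = np.full_like(frames, 20.0)
T_ref[1:] = 20.0 + 50.0 / np.sqrt(frames[1:])            # post-flash surface cooling
T_anom = T_ref + np.where(frames > 0, 0.8 * np.exp(-((frames - 60) / 40.0) ** 2), 0.0)
x = np.arange(-5.0, 5.0, 0.05)                           # mm across the indication
profile = np.exp(-(x / 1.2) ** 2)                        # FWHM of about 2 mm

print(np.nanmax(normalized_contrast(T_anom, T_ref)), half_max_width(profile, dx=0.05))
```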

  7. Direct Determination of the Dependence of the Surface Shear and Dilatational Viscosities on the Thermodynamic State of the Interface: Theoretical Foundations.

    PubMed

    Lopez; Hirsa

    1998-10-01

    Recent developments in nonlinear optical techniques for noninvasive probing of a surfactant influenced gas/liquid interface allow for the measurement of the surfactant surface concentration, c, and thus provide new opportunities for the direct determination of its intrinsic viscosities. Here, we present the theoretical foundations, based on the Boussinesq-Scriven surface model without the usual simplification of constant viscosities, for an experimental technique to directly measure the surface shear (μs) and dilatational (κs) viscosities of a Newtonian interface as functions of the surfactant surface concentration. This ability to directly measure the surfactant concentration permits the use of a simple surface flow for the measurement of the surface viscosities. The requirements are that the interface must be nearly flat, and the flow steady, axisymmetric, and swirling; these flow conditions can be achieved in the deep-channel viscometer driven at relatively fast rates. The tangential stress balance on such an interface leads to two equations; the balance in the azimuthal direction involves only μs and its gradients, and the balance in the radial direction involves both μs and κs and their gradients. By further exploiting recent developments in laser-based flow measuring techniques, the surface velocities and their gradients which appear in the two equations can be measured directly. The surface tension gradient, which appears in the radial balance equation, is incorporated from the equation of state for the surfactant system and direct measurements of the surfactant surface concentration distribution. The stress balance equations are then ordinary differential equations in the surface viscosities as functions of radial position, which can be readily integrated. Since c is measured as a function of radial position, we then have a direct measurement of μs and κs as functions of c. Numerical computations of the Navier-Stokes equations are performed to determine the appropriate conditions to achieve the requisite secondary flow. Copyright 1998 Academic Press.

  8. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    The paper deals with the infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge detection technique called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique and the IR Contrast feature imaging application, which are based on the models provided in this paper.

  9. A preliminary look at techniques used to obtain airdata from flight at high angles of attack

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.; Whitmore, Stephen A.

    1990-01-01

    Flight research at high angles of attack has posed new problems for airdata measurements. New sensors and techniques for measuring the standard airdata quantities of static pressure, dynamic pressure, angle of attack, and angle of sideslip were subsequently developed. The ongoing airdata research supporting NASA's F-18 high alpha research program is updated. Included are the techniques used and the preliminary results. The F-18 aircraft was flown with three research airdata systems: a standard airdata probe on the right wingtip, a self-aligning airdata probe on the left wingtip, and a flush airdata system on the nose cone. The primary research goal was to obtain steady-state calibrations for each airdata system up to an angle of attack of 50 deg. This goal was accomplished and preliminary accuracies of the three airdata systems were assessed and are presented. An effort to improve the fidelity of the airdata measurements during dynamic maneuvering is also discussed. This involved enhancement of the aerodynamic data with data obtained from linear accelerometers, rate gyros, and attitude gyros. Preliminary results of this technique are presented.

  10. Nonlinear ultrasonics for material state awareness

    NASA Astrophysics Data System (ADS)

    Jacobs, L. J.

    2014-02-01

    Predictive health monitoring of structural components will require the development of advanced sensing techniques capable of providing quantitative information on the damage state of structural materials. By focusing on nonlinear acoustic techniques, it is possible to measure absolute, strength based material parameters that can then be coupled with uncertainty models to enable accurate and quantitative life prediction. Starting at the material level, this review will present current research that involves a combination of sensing techniques and physics-based models to characterize damage in metallic materials. In metals, these nonlinear ultrasonic measurements can sense material state, before the formation of micro- and macro-cracks. Typically, cracks of a measurable size appear quite late in a component's total life, while the material's integrity in terms of toughness and strength gradually decreases due to the microplasticity (dislocations) and associated change in the material's microstructure. This review focuses on second harmonic generation techniques. Since these nonlinear acoustic techniques are acoustic wave based, component interrogation can be performed with bulk, surface and guided waves using the same underlying material physics; these nonlinear ultrasonic techniques provide results which are independent of the wave type used. Recent physics-based models consider the evolution of damage due to dislocations, slip bands, interstitials, and precipitates in the lattice structure, which can lead to localized damage.
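
    Second-harmonic generation data are commonly reduced to an acoustic nonlinearity parameter, often written beta = 8*A2 / (k^2 * x * A1^2), where A1 and A2 are the fundamental and second-harmonic displacement amplitudes, k the wavenumber, and x the propagation distance. The FFT-based sketch below uses that standard relation with entirely synthetic signal values; it is not tied to any specific measurement reported in this review.

```python
import numpy as np

def nonlinearity_parameter(signal, fs, f0, k, x):
    """Acoustic nonlinearity parameter beta = 8*A2 / (k**2 * x * A1**2).

    A1 and A2 are FFT magnitudes of the displacement signal at the fundamental
    f0 and at 2*f0; fs is the sampling rate in Hz, x the propagation distance in m.
    """
    spec = np.abs(np.fft.rfft(signal)) * 2.0 / signal.size
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    A1 = spec[np.argmin(np.abs(freqs - f0))]
    A2 = spec[np.argmin(np.abs(freqs - 2 * f0))]
    return 8.0 * A2 / (k ** 2 * x * A1 ** 2)

# Synthetic example: 5 MHz fundamental plus a weak second harmonic, 2 cm path,
# wavenumber for a ~5900 m/s longitudinal wave speed (all values hypothetical).
fs, f0 = 100e6, 5e6
t = np.arange(2000) / fs
u = 1e-9 * np.sin(2 * np.pi * f0 * t) + 5e-13 * np.sin(2 * np.pi * 2 * f0 * t)
k = 2 * np.pi * f0 / 5900.0
print(nonlinearity_parameter(u, fs, f0, k, x=0.02))
```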

  11. Wind-instrument reflection function measurements in the time domain.

    PubMed

    Keefe, D H

    1996-04-01

    Theoretical and computational analyses of wind-instrument sound production in the time domain have emerged as useful tools for understanding musical instrument acoustics, yet there exist few experimental measurements of the air-column response directly in the time domain. A new experimental, time-domain technique is proposed to measure the reflection function response of woodwind and brass-instrument air columns. This response is defined at the location of sound regeneration in the mouthpiece or double reed. A probe assembly comprised of an acoustic source and microphone is inserted directly into the air column entryway using a foam plug to ensure a leak-free fit. An initial calibration phase involves measurements on a single cylindrical tube of known dimensions. Measurements are presented on an alto saxophone and euphonium. The technique has promise for testing any musical instrument air columns using a single probe assembly and foam plugs over a range of diameters typical of air-column entryways.

  12. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review

    PubMed Central

    Chung, Stephanie T.; Chacko, Shaji K.; Sunehag, Agneta L.

    2015-01-01

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotopes methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions. PMID:26604176

  13. 24 CFR 91.105 - Citizen participation plan; local governments.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... met in the case of public hearings where a significant number of non-English speaking residents can be... encourage the participation of all its citizens, including minorities and non-English speaking persons, as... jurisdiction should also explore alternative public involvement techniques and quantitative ways to measure...

  14. Thermal Nondestructive Characterization of Corrosion in Boiler Tubes by Application of a Moving Line Heat Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Wall thinning in utility boiler waterwall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proved to be very labor intensive and slow. This has resulted in a "spot check" approach to inspections, making thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source, coupled with this analysis technique, represents a significant improvement in the inspection speed for large structures such as boiler waterwalls while still providing high-resolution thickness measurements. A theoretical basis for the technique will be presented, thus demonstrating the quantitative nature of the technique. Further, results of laboratory experiments on flat panel specimens with fabricated material loss regions will be presented.

  15. Total body water measurements using resonant cavity perturbation techniques.

    PubMed

    Stone, Darren A; Robinson, Martin P

    2004-05-07

    A recent paper proposed a novel technique for determining the total body water (TBW) of patients suffering with abnormal hydration levels, using a resonant cavity perturbation method. Current techniques to measure TBW are limited by resolution and technical constraints. However, this new method involves measuring the dielectric properties of the body, by placing a subject in a large cavity resonator and measuring the subsequent change in its resonant frequency, fres and its Q-factor. Utilizing the relationship that water content correlates to these dielectric properties, it has been shown that the measured response of these parameters enables determination of TBW. Results are presented for a preliminary study using data estimated from anthropometric measurements, where volunteers were asked to lie and stand in an electromagnetic screened room, before and after drinking between 1 and 2 l of water, and in some cases, after voiding the bladder. Notable changes in the parameters were observed; fres showed a negative shift and Q was reduced. Preliminary calibration curves using estimated values of water content have been developed from these results, showing that for each subject the measured resonant frequency is a linear function of TBW. Because the gradients of these calibration curves correlate to the mass-to-height-ratio of the volunteers, it has proved that a system in which TBW can be unequivocally obtained is possible. Measured values of TBW have been determined using this new pilot-technique, and the values obtained correlate well with theoretical values of body water (r = 0.87) and resolution is very good (750 ml). The results obtained are measurable, repeatable and statistically significant. This leads to confidence in the integrity of the proposed technique.

  16. Total body water measurements using resonant cavity perturbation techniques

    NASA Astrophysics Data System (ADS)

    Stone, Darren A.; Robinson, Martin P.

    2004-05-01

    A recent paper proposed a novel technique for determining the total body water (TBW) of patients suffering with abnormal hydration levels, using a resonant cavity perturbation method. Current techniques to measure TBW are limited by resolution and technical constraints. However, this new method involves measuring the dielectric properties of the body, by placing a subject in a large cavity resonator and measuring the subsequent change in its resonant frequency, fres and its Q-factor. Utilizing the relationship that water content correlates to these dielectric properties, it has been shown that the measured response of these parameters enables determination of TBW. Results are presented for a preliminary study using data estimated from anthropometric measurements, where volunteers were asked to lie and stand in an electromagnetic screened room, before and after drinking between 1 and 2 l of water, and in some cases, after voiding the bladder. Notable changes in the parameters were observed; fres showed a negative shift and Q was reduced. Preliminary calibration curves using estimated values of water content have been developed from these results, showing that for each subject the measured resonant frequency is a linear function of TBW. Because the gradients of these calibration curves correlate to the mass-to-height-ratio of the volunteers, it has proved that a system in which TBW can be unequivocally obtained is possible. Measured values of TBW have been determined using this new pilot-technique, and the values obtained correlate well with theoretical values of body water (r = 0.87) and resolution is very good (750 ml). The results obtained are measurable, repeatable and statistically significant. This leads to confidence in the integrity of the proposed technique.
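
    A minimal sketch of the per-subject linear calibration described in this record and the preceding one: fit the measured resonant frequency against estimated TBW, then invert the fit for a new measurement. The calibration points below are invented for illustration and are not the study data; the reported scaling of the slope with mass-to-height ratio is not modeled here.

```python
import numpy as np

# Hypothetical calibration points for one subject: estimated TBW (litres) versus
# measured cavity resonant frequency (MHz); fres falls as body water increases.
tbw_est = np.array([40.0, 41.0, 41.5, 42.0])     # e.g. before/after drinking, after voiding
fres = np.array([82.10, 81.96, 81.89, 81.82])

slope, intercept = np.polyfit(tbw_est, fres, 1)  # fres ~ slope * TBW + intercept

def tbw_from_fres(f_measured):
    """Invert the per-subject linear calibration to estimate TBW from fres."""
    return (f_measured - intercept) / slope

print(f"slope = {slope:.3f} MHz/l; TBW at 81.93 MHz ~ {tbw_from_fres(81.93):.2f} l")
```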

  17. Detection of electrically neutral and nonpolar molecules in ionic solutions using silicon nanowires

    NASA Astrophysics Data System (ADS)

    Wu, Ying-Pin; Chu, Chia-Jung; Tsai, Li-Chu; Su, Ya-Wen; Chen, Pei-Hua; Moodley, Mathew K.; Huang, Ding; Chen, Yit-Tsong; Yang, Ying-Jay; Chen, Chii-Dong

    2017-04-01

    We report on a technique that can extend the use of nanowire sensors to the detection of interactions involving nonpolar and neutral molecules in an ionic solution environment. This technique makes use of the fact that molecular interactions result in a change in the permittivity of the molecules involved. For the interactions taking place at the surface of nanowires, this permittivity change can be determined from the analysis of the measured complex impedance of the nanowire. To demonstrate this technique, histidine was detected using different charge polarities controlled by the pH value of the solution. This included the detection of electrically neutral histidine at a sensitivity of 1 pM. Furthermore, it is shown that nonpolar molecules, such as hexane, can also be detected. The technique is applicable to the use of nanowires with and without a surface-insulating oxide. We show that information about the changes in amplitude and the phase of the complex impedance reveals the fundamental characteristics of the molecular interactions, including the molecular field and the permittivity.

  18. Airborne sulfur trace species intercomparison campaign: Sulfur dioxide, dimethylsulfide, hydrogen sulfide, carbon disulfide, and carbonyl sulfide

    NASA Technical Reports Server (NTRS)

    Gregory, Gerald L.; Hoell, James M., Jr.; Davis, Douglas D.

    1991-01-01

    Results from an airborne intercomparison of techniques to measure tropospheric levels of sulfur trace gases are presented. The intercomparison was part of the NASA Global Tropospheric Experiment (GTE) and was conducted during the summer of 1989. The intercomparisons were conducted on the Wallops Electra aircraft during flights from Wallops Island, Virginia, and Natal, Brazil. Sulfur measurements intercompared included sulfur dioxide (SO2), dimethylsulfide (DMS), hydrogen sulfide (H2S), carbon disulfide (CS2), and carbonyl sulfide (OCS). Measurement techniques ranged from filter collection systems with post-flight analyses to mass spectrometer and gas chromatograph systems employing various methods for measuring and identifying the sulfur gases during flight. Sampling schedules for the techniques ranged from integrated collections over periods as long as 50 minutes to one- to three-minute samples every ten or fifteen minutes. Several of the techniques provided measurements of more than one sulfur gas. Instruments employing different detection principles were involved in each of the sulfur intercomparisons. Also included in the intercomparison measurement scenario were a host of supporting measurements (i.e., ozone, nitrogen oxides, carbon monoxide, total sulfur, aerosols, etc.) for purposes of: (1) interpreting results (i.e., correlation of any noted instrument disagreement with the chemical composition of the measurement environment); and (2) providing supporting chemical data to meet CITE-3 science objectives of studying ozone/sulfur photochemistry, diurnal cycles, etc. The results of the intercomparison study are briefly discussed.

  19. Constitutive parameter measurements of lossy materials

    NASA Technical Reports Server (NTRS)

    Dominek, A.; Park, A.

    1989-01-01

    The electrical constitutive parameters of lossy materials are considered. A discussion of the NRL arch for lossy coatings is presented involving analytical analyses of the reflected field using the geometrical theory of diffraction (GTD) and physical optics (PO). The actual values for these parameters can be obtained through a traditional transmission technique which is examined from an error analysis standpoint. Alternate sample geometries are suggested for this technique to reduce sample tolerance requirements for accurate parameter determination. The performance for one alternate geometry is given.

  20. An introduction to the processes, problems, and management of urban lakes

    USGS Publications Warehouse

    Britton, L.J.; Averett, R.C.; Ferreira, R.F.

    1975-01-01

    As lake studies become more common, sampling techniques for data collection need increased accuracy and consistency, in order to make meaningful comparisons between different lakes. Therefore, the report discusses the main factors involved in conducting lake studies. These factors include the types and frequency of measurements useful in lake reconnaissance studies and a review of literature on sampling equipment and techniques. A glossary of selected terms begins the report, which is intended for guideline use by urban planners and managers.

  1. Information prioritization for control and automation of space operations

    NASA Technical Reports Server (NTRS)

    Ray, Asock; Joshi, Suresh M.; Whitney, Cynthia K.; Jow, Hong N.

    1987-01-01

    The applicability of a real-time information prioritization technique to the development of a decision support system for control and automation of Space Station operations is considered. The steps involved in the technique are described, including the definition of abnormal scenarios and of attributes, measures of individual attributes, formulation and optimization of a cost function, simulation of test cases on the basis of the cost function, and examination of the simulation scenarios. A list is given comparing the intrinsic importances of various Space Station information data.

  2. Eddy Correlation Flux Measurement System (ECOR) Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, DR

    2011-01-31

    The eddy correlation (ECOR) flux measurement system provides in situ, half-hour measurements of the surface turbulent fluxes of momentum, sensible heat, latent heat, and carbon dioxide (CO2) (and methane at one Southern Great Plains extended facility (SGP EF) and the North Slope of Alaska Central Facility (NSA CF)). The fluxes are obtained with the eddy covariance technique, which involves correlation of the vertical wind component with the horizontal wind component, the air temperature, the water vapor density, and the CO2 concentration.
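
    At its core, the eddy covariance calculation is the time-averaged product of the fluctuating vertical wind and the fluctuating scalar over each half-hour block. The minimal sketch below shows that Reynolds-averaging step only; the despiking, coordinate rotation, and density (WPL) corrections applied by the real ECOR processing are deliberately omitted, and the synthetic data are purely illustrative.

    ```python
    import numpy as np

    def eddy_covariance_flux(w, c):
        """Kinematic eddy flux <w'c'> from co-sampled time series.

        w : vertical wind component (m/s)
        c : scalar of interest (e.g. CO2 density, temperature, humidity)
        Fluctuations are defined against the block mean (Reynolds averaging);
        coordinate rotation, despiking and density (WPL) corrections that a
        real ECOR system applies are omitted here.
        """
        w = np.asarray(w, dtype=float)
        c = np.asarray(c, dtype=float)
        w_prime = w - w.mean()
        c_prime = c - c.mean()
        return np.mean(w_prime * c_prime)

    # Synthetic half-hour block sampled at 10 Hz
    rng = np.random.default_rng(0)
    n = 30 * 60 * 10
    w = rng.normal(0.0, 0.3, n)
    co2 = 400.0 + 0.5 * w + rng.normal(0.0, 0.2, n)   # scalar correlated with w
    print("flux <w'c'> =", eddy_covariance_flux(w, co2))
    ```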

  3. Low level measurements of atmospheric DMS, H2S, and SO2 for GTE/CITE-3

    NASA Technical Reports Server (NTRS)

    Saltzman, Eric; Cooper, David

    1991-01-01

    This project involved the measurement of atmospheric dimethylsulfide (DMS) and hydrogen sulfide (H2S) as part of the GTE/CITE-3 instrument intercomparison program. The two instruments were adapted for use on the NASA Electra aircraft and participated in all phases of the mission. This included ground-based measurements of NIST-provided standard gases and a series of airborne missions over the Western Atlantic Ocean. Analytical techniques used are described and the results are summarized.

  4. A demonstration of an independent-station radio interferometry system with 4-cm precision on a 16-km base line. [for geodesy

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.; Fanselow, J. L.; Macdoran, P. F.; Skjerve, L. J.; Spitzmesser, D. J.; Fliegel, H. F.

    1976-01-01

    Radio interferometry promises eventually to measure directly, with accuracies of a few centimeters, both whole earth motions and relative crustal motions with respect to an 'inertial' reference frame. Interferometry measurements of arbitrarily long base lines require, however, the development of new techniques for independent-station observation. In connection with the development of such techniques, a series of short base line demonstration experiments has been conducted between two antennas. The experiments were related to a program involving the design of independent-station instrumentation capable of making three-dimensional earth-fixed base line measurements with an accuracy of a few centimeters. Attention is given to the instrumentation used in the experiments, aspects of data analysis, and the experimental results.

  5. Innovative acoustic technique for studying new materials and new developments in solid state physics

    NASA Astrophysics Data System (ADS)

    Maynard, Julian D.

    1993-10-01

    The goals of this project involve the use of innovative acoustic techniques to study new materials and new developments in solid state physics, such as effects in mesoscopic electronic systems. Major accomplishments include (1) the preparation and publication of a number of major papers and chapters in books, (2) the comparison of the anisotropy of an aluminum alloy quasicrystal with that of its cubic approximant, (3) the measurement of the elastic constants of a diamond substitute material, TiB2, (4) the measurement of an extremely low (possibly the lowest) infrared optical-absorption coefficient, (5) the measurement of the effects of disorder on the propagation of a nonlinear pulse, and (6) the acquisition of initial data in an experiment on the onset of fracture.

  6. UF6 Density and Mass Flow Measurements for Enrichment Plants using Acoustic Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Good, Morris S.; Smith, Leon E.; Warren, Glen A.

    A key enabling capability for enrichment plant safeguards being considered by the International Atomic Energy Agency (IAEA) is high-accuracy, noninvasive, unattended measurement of UF6 gas density and mass flow rate. Acoustic techniques are currently used to noninvasively monitor gas flow in industrial applications; however, the operating pressures at gaseous centrifuge enrichment plants (GCEPs) are roughly two orders of magnitude below the capabilities of commercial instrumentation. Pacific Northwest National Laboratory is refining acoustic techniques for estimating density and mass flow rate of UF6 gas in scenarios typical of GCEPs, with the goal of achieving 1% measurement accuracy. Proof-of-concept laboratory measurements using a surrogate gas for UF6 have demonstrated signatures sensitive to gas density at low operating pressures such as 10–50 Torr, which were observed over the background acoustic interference. Current efforts involve developing a test bed for conducting acoustic measurements on flowing SF6 gas at representative flow rates and pressures to ascertain the viability of conducting gas flow measurements under these conditions. Density and flow measurements will be conducted to support the evaluation. If successful, the approach could enable an unattended, noninvasive approach to measure mass flow in unit header pipes of GCEPs.
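
    The abstract does not specify the signal-processing scheme, so the sketch below uses a generic ultrasonic transit-time relation, one common acoustic approach to non-invasive flow metering, to show how upstream/downstream transit times could be turned into a mean velocity and mass flow rate. It should be read as an assumed, textbook relation rather than the PNNL method, and all numbers are placeholders.

    ```python
    import math

    def transit_time_velocity(t_down, t_up, path_length, path_angle_deg):
        """Mean axial gas velocity from ultrasonic transit times.

        t_down, t_up   : transit times with and against the flow (s)
        path_length    : acoustic path length between transducers (m)
        path_angle_deg : angle between the acoustic path and the pipe axis
        Generic transit-time relation; not specific to the UF6 work above.
        """
        cos_theta = math.cos(math.radians(path_angle_deg))
        return path_length * (t_up - t_down) / (2.0 * t_up * t_down * cos_theta)

    def mass_flow_rate(velocity, pipe_diameter, gas_density):
        """Mass flow = rho * v * A for a circular pipe."""
        area = math.pi * (pipe_diameter / 2.0) ** 2
        return gas_density * velocity * area

    v = transit_time_velocity(t_down=1.4975e-3, t_up=1.5025e-3,
                              path_length=0.20, path_angle_deg=45.0)
    print("velocity [m/s]:", v, " mass flow [kg/s]:",
          mass_flow_rate(v, pipe_diameter=0.10, gas_density=0.5))
    ```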

  7. Inclusion for People with Developmental Disabilities: Measuring an Elusive Construct.

    PubMed

    Neely-Barnes, Susan Louise; Elswick, Susan E

    2016-01-01

    The philosophy of inclusion for people with intellectual and developmental disabilities (IDD) has evolved over the last 50 years. Over time, inclusion research has shifted from a focus on deinstitutionalization to understanding the extent to which individuals with IDD are meaningfully involved in the community and social relationships. Yet, there has been no agreed-upon way to measure inclusion. Many different measurement and data collection techniques have been used in the literature. This study proposes a brief measure of inclusion that can be used with family members and on survey instruments.

  8. Assessment on the methods of measuring the tyre-road contact patch stresses

    NASA Astrophysics Data System (ADS)

    Anghelache, G.; Moisescu, A.-R.; Buretea, D.

    2017-08-01

    The paper reviews established and modern methods for investigating tri-axial stress distributions in the tyre-road contact patch. The authors used three methods of measuring stress distributions: the strain gauge method; the force sensing technique; acceleration measurements. Four prototypes of instrumented-pin transducers based on the aforementioned measuring methods were developed. Data acquisition of the contact patch stress distributions was performed using each instrumented-pin transducer. The results are analysed and compared, underlining the advantages and drawbacks of each method. The experimental results indicate that the three methods are valuable.

  9. Measuring snow cover using satellite imagery during 1973 and 1974 melt season: North Santiam, Boise, and Upper Snake Basins, phase 1. [LANDSAT satellites, imaging techniques

    NASA Technical Reports Server (NTRS)

    Wiegman, E. J.; Evans, W. E.; Hadfield, R.

    1975-01-01

    Measurements of snow coverage during the snow-melt seasons of 1973 and 1974, derived from LANDSAT imagery, are examined for the three Columbia River subbasins. Satellite-derived snow cover inventories for the three test basins were obtained as an alternative to inventories performed with the current operational practice of using small aircraft flights over selected snow fields. The accuracy and precision versus cost for several different interactive image analysis procedures were investigated using a display device, the Electronic Satellite Image Analysis Console. Single-band radiance thresholding was the principal technique employed in the snow detection, although this technique was supplemented by an editing procedure involving reference to hand-generated elevation contours. For each date and view measured, a binary thematic map or "mask" depicting the snow cover was generated by a combination of objective and subjective procedures. Photographs of data analysis equipment (displays) are shown.
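
    The snow-detection logic described above, single-band radiance thresholding edited by elevation information, can be sketched in a few lines of NumPy; the threshold and elevation cut-off below are arbitrary illustrative values, and the hand-generated contour editing is replaced by a simple elevation mask.

    ```python
    import numpy as np

    def snow_mask(band_radiance, elevation_m, radiance_threshold, min_elevation_m):
        """Binary snow map: bright pixels above a radiance threshold,
        edited by a simple elevation cut-off (a stand-in for the
        hand-generated contour editing described above)."""
        bright = band_radiance >= radiance_threshold
        high_enough = elevation_m >= min_elevation_m
        return bright & high_enough

    # Toy 4x4 scene
    radiance = np.array([[10, 80, 85, 12],
                         [11, 90, 88, 15],
                         [ 9, 20, 86, 14],
                         [ 8, 10, 12, 13]], dtype=float)
    elevation = np.full_like(radiance, 1500.0)
    elevation[2:, :] = 600.0   # lowlands in the bottom rows

    mask = snow_mask(radiance, elevation, radiance_threshold=60.0,
                     min_elevation_m=1000.0)
    print("snow-covered fraction:", mask.mean())
    ```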

  10. Digital stereo-holographic microscopy for studying three-dimensional particle dynamics

    NASA Astrophysics Data System (ADS)

    Byeon, Hyeokjun; Go, Taesik; Lee, Sang Joon

    2018-06-01

    Digital stereo-holographic microscopy (DsHM) with two viewing angles is proposed to measure 3D information of microscale particles. This approach includes two volumetric recordings and numerical reconstruction, and it involves the combination of separately reconstructed holograms. The 3D positional information of a particle was determined by searching for the center of the overlapped reconstructed volume. After confirming the proposed technique using static spherical particles, the 3D information of moving particles suspended in a Hagen-Poiseuille flow was successfully obtained. Moreover, the 3D information of nonspherical particles, including ellipsoidal particles and red blood cells, was measured using the proposed technique. In addition to 3D positional information, the orientation and shape of the test samples were obtained from the plane images by slicing the overlapped volume perpendicular to the directions of the image recordings. This DsHM technique will be useful in analyzing the 3D dynamic behavior of various nonspherical particles, which cannot be measured by conventional digital holographic microscopy.
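
    The positioning step, intersecting the two separately reconstructed volumes and taking the centre of the overlap, can be illustrated with a short NumPy sketch. It assumes the two reconstructions are already available as binary occupancy volumes on a common grid, which is a simplification of the full holographic reconstruction chain.

    ```python
    import numpy as np

    def particle_position(volume_a, volume_b, voxel_size):
        """Centre of the overlap of two binary reconstruction volumes.

        volume_a, volume_b : boolean arrays on the same grid, one per view
        voxel_size         : physical size of a voxel (sets the output units)
        Returns the 3D centroid of the overlapped region, or None if the
        two reconstructions do not intersect.
        """
        overlap = volume_a & volume_b
        if not overlap.any():
            return None
        indices = np.argwhere(overlap)          # (N, 3) voxel coordinates
        return indices.mean(axis=0) * voxel_size

    # Two toy reconstructions that share a small blob
    a = np.zeros((32, 32, 32), dtype=bool)
    b = np.zeros_like(a)
    a[10:14, 10:14, 8:20] = True   # elongated along one viewing direction
    b[8:20, 10:14, 10:14] = True   # elongated along the other
    print("particle centre:", particle_position(a, b, voxel_size=1e-6))
    ```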

  11. Psychrometric Field Measurement of Water Potential Changes following Leaf Excision.

    PubMed

    Savage, M J; Cass, A

    1984-01-01

    In situ measurement of sudden leaf water potential changes has not been performed under field conditions. A laboratory investigation involving the measurement of leaf water potential prior to and 2 to 200 minutes after excision of citrus leaves (Citrus jambhiri) showed good linear correlation (r = 0.99) between in situ leaf psychrometer and Scholander pressure chamber measurements. Following this, a field investigation was conducted which involved psychrometric measurement prior to petiole excision and 1 minute after excision. Simultaneous pressure chamber measurements were performed on neighboring leaves prior to the time of excision and then on the psychrometer leaf about 2 minutes after excision. These data indicate that within the first 2 minutes after excision, psychrometer and pressure chamber measurements were linearly correlated (r = 0.97). Under high evaporative demand conditions, the rate of water potential decrease was between 250 and 700 kilopascals in the first minute after excision. These results show that the thermocouple psychrometer can be used as a dynamic and nondestructive field technique for monitoring leaf water potential.

  12. Psychrometric Field Measurement of Water Potential Changes following Leaf Excision 1

    PubMed Central

    Savage, Michael J.; Cass, Alfred

    1984-01-01

    In situ measurement of sudden leaf water potential changes has not been performed under field conditions. A laboratory investigation involving the measurement of leaf water potential prior to and 2 to 200 minutes after excision of citrus leaves (Citrus jambhiri) showed good linear correlation (r = 0.99) between in situ leaf psychrometer and Scholander pressure chamber measurements. Following this, a field investigation was conducted which involved psychrometric measurement prior to petiole excision and 1 minute after excision. Simultaneous pressure chamber measurements were performed on neighboring leaves prior to the time of excision and then on the psychrometer leaf about 2 minutes after excision. These data indicate that within the first 2 minutes after excision, psychrometer and pressure chamber measurements were linearly correlated (r = 0.97). Under high evaporative demand conditions, the rate of water potential decrease was between 250 and 700 kilopascals in the first minute after excision. These results show that the thermocouple psychrometer can be used as a dynamic and nondestructive field technique for monitoring leaf water potential. PMID:16663394

  13. Land utilization and water resource inventories over extended test sites

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1972-01-01

    In addition to the work on the corn blight this year, several other analysis tests were completed which resulted in significant findings. These aspects are discussed as follows: (1) field spectral measurements of soil conditions; (2) analysis of extended test site data; this discussion involves three different sets of data analysis sequences; (3) urban land use analysis, for studying water runoff potentials; and (4) thermal data quality study, as an expansion of our water resources studies involving temperature calibration techniques.

  14. US EPA'S LANDSCAPE ECOLOGY RESEARCH: ASSESSING TRENDS FOR WETLANDS AND SURFACE WATERS USING REMOTE SENSING, GIS, AND FIELD-BASED TECHNIQUES

    EPA Science Inventory

    The US EPA, Environmental Sciences Division-Las Vegas is using a variety of geospatial and statistical modeling approaches to locate and assess the complex functions of wetland ecosystems. These assessments involve measuring landscape characteristics and change, at multiple s...

  15. Lake trophic applications: Wisconsin

    NASA Technical Reports Server (NTRS)

    Scarpace, F.

    1981-01-01

    Efforts to classify the water quality characteristics of lakes using LANDSAT imagery are reported. Image processing and registration techniques are described. A lake classification scheme which involves the assignment of a trophic class number was used in the data analysis. The resulting values were compared to the corresponding rank assignment derived from field measurements.

  16. Radical Recombination Kinetics: An Experiment in Physical Organic Chemistry.

    ERIC Educational Resources Information Center

    Pickering, Miles

    1980-01-01

    Describes a student kinetic experiment involving second order kinetics as well as displaying photochromism using a wide variety of techniques from both physical and organic chemistry. Describes measurement of (1) the rate of the recombination reaction; (2) the extinction coefficient; and (3) the ESR spectrometer signal. (Author/JN)

  17. All the noncontextuality inequalities for arbitrary prepare-and-measure experiments with respect to any fixed set of operational equivalences

    NASA Astrophysics Data System (ADS)

    Schmid, David; Spekkens, Robert W.; Wolfe, Elie

    2018-06-01

    Within the framework of generalized noncontextuality, we introduce a general technique for systematically deriving noncontextuality inequalities for any experiment involving finitely many preparations and finitely many measurements, each of which has a finite number of outcomes. Given any fixed sets of operational equivalences among the preparations and among the measurements as input, the algorithm returns a set of noncontextuality inequalities whose satisfaction is necessary and sufficient for a set of operational data to admit of a noncontextual model. Additionally, we show that the space of noncontextual data tables always defines a polytope. Finally, we provide a computationally efficient means for testing whether any set of numerical data admits of a noncontextual model, with respect to any fixed operational equivalences. Together, these techniques provide complete methods for characterizing arbitrary noncontextuality scenarios, both in theory and in practice. Because a quantum prepare-and-measure experiment admits of a noncontextual model if and only if it admits of a positive quasiprobability representation, our techniques also determine the necessary and sufficient conditions for the existence of such a representation.

  18. A device to improve the Schleger and Turner method for sweating rate measurements

    NASA Astrophysics Data System (ADS)

    Pereira, Alfredo Manuel Franco; Alves, Alexandre; Infante, Paulo; Titto, Evaldo A. L.; Baccari, Flávio; Almeida, J. A. Afonso

    2010-01-01

    The objective of this study was to test a device developed to improve the functionality, accuracy and precision of the original technique for sweating rate measurements proposed by Schleger and Turner [Schleger AV, Turner HG (1965) Aust J Agric Res 16:92-106]. A device was built for this purpose and tested against the original Schleger and Turner technique. Testing was performed by measuring sweating rates in an experiment involving six Mertolenga heifers subjected to four different thermal levels in a climatic chamber. The device exhibited no functional problems and the results obtained with its use were more consistent than with the Schleger and Turner technique. There was no difference in the reproducibility of the two techniques (same accuracy), but measurements performed with the new device had lower repeatability, corresponding to lower variability and, consequently, to higher precision. When utilizing this device, there is no need for physical contact between the operator and the animal to maintain the filter paper discs in position. This has important advantages: the animals stay quieter, and several animals can be evaluated simultaneously. This is a major advantage because it allows more measurements to be taken in a given period of time, increasing the precision of the observations and diminishing the error associated with temporal hiatus (e.g., the solar angle during field studies). The new device has higher functional versatility when taking measurements in large-scale studies (many animals) under field conditions. The results obtained in this study suggest that the technique using the device presented here could represent an advantageous alternative to the original technique described by Schleger and Turner.

  19. Installation Status of the Electron Beam Profiler for the Fermilab Main Injector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurman-Keup, R.; Alvarez, M.; Fitzgerald, J.

    2015-11-06

    The planned neutrino program at Fermilab requires large proton beam intensities in excess of 2 MW. Measuring the transverse profiles of these high intensity beams is challenging and often depends on non-invasive techniques. One such technique involves measuring the deflection of a probe beam of electrons with a trajectory perpendicular to the proton beam. A device such as this is already in use at the Spallation Neutron Source at ORNL and the installation of a similar device is underway in the Main Injector at Fermilab. The present installation status of the electron beam profiler for the Main Injector will be discussed together with some simulations and test stand results.

  20. Spectral emissivity of cirrus clouds

    NASA Technical Reports Server (NTRS)

    Beck, Gordon H.; Davis, John M.; Cox, Stephen K.

    1993-01-01

    The inference of cirrus cloud properties has many important applications including global climate studies, radiation budget determination, remote sensing techniques and oceanic studies from satellites. Data taken at the Parsons, Kansas, site during the FIRE II project are used for this study. On November 26 there were initially clear sky conditions gradually giving way to a progressively thickening cirrus shield over a period of a few hours. Interferometer, radiosonde, and lidar data were taken throughout this event. Two techniques are used to infer the downward spectral emittance of the observed cirrus layer. One uses only measurements and the other involves measurements and FASCODE III calculations. FASCODE III is a line-by-line radiance/transmittance model developed at the Air Force Geophysics Laboratory.
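
    One conventional way to express the downward effective emittance of a cloud layer is to compare the cloudy-sky radiance with a clear-sky radiance and the blackbody radiance at the cloud temperature. The sketch below encodes that textbook relation; it is an assumed form for illustration and not necessarily the exact algorithm used with the FIRE II data.

    ```python
    import numpy as np

    def effective_emissivity(radiance_cloudy, radiance_clear, radiance_blackbody):
        """Spectral effective cloud emittance.

        eps = (I_cloudy - I_clear) / (B(T_cloud) - I_clear)
        where B(T_cloud) is the blackbody radiance at cloud temperature.
        This is the conventional relation; the cited study derives the
        clear-sky term either from measurements or from FASCODE III.
        """
        num = np.asarray(radiance_cloudy) - np.asarray(radiance_clear)
        den = np.asarray(radiance_blackbody) - np.asarray(radiance_clear)
        return num / den

    # Illustrative single-channel numbers (arbitrary radiance units)
    print(effective_emissivity(radiance_cloudy=68.0,
                               radiance_clear=55.0,
                               radiance_blackbody=95.0))   # ~0.33
    ```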

  1. Rarefied-flow pitching moment coefficient measurements of the Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Blanchard, R. C.; Hinson, E. W.

    1988-01-01

    An overview of the process for obtaining the Shuttle Orbiter rarefied-flow pitching moment from flight gyro data is presented. The extraction technique involves differentiation of the output of the pitch gyro after accounting for nonaerodynamic torques, such as those produced by the gravity gradient and the Orbiter's auxiliary power unit, and adjusting for drift biases. The overview of the extraction technique includes examples of results from each of the steps involved in the process, using the STS-32 mission as a typical sample case. The total pitching moment and moment coefficient (Cm) for that flight are calculated and compared with preflight predictions. The flight results show the anticipated decrease in Cm with increasing altitude. However, the total moment coefficient is less than predicted using preflight estimates.
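
    A simplified version of the final step, once the aerodynamic moment has been isolated from the gyro-derived pitch acceleration and the non-aerodynamic torques, is shown below: Cm = M / (q S c). The inertia, dynamic pressure, and reference dimensions are placeholders, not Orbiter or STS-32 values, and products of inertia are ignored.

    ```python
    import numpy as np

    def pitching_moment_coefficient(pitch_rate, time_s, inertia_yy,
                                    nonaero_torque, q_bar, ref_area, ref_chord):
        """Cm from a pitch-rate history.

        M_total = Iyy * d(pitch_rate)/dt      (rigid body, products of inertia ignored)
        M_aero  = M_total - nonaero torques   (gravity gradient, APU, ...)
        Cm      = M_aero / (q_bar * S * c)
        All values used below are placeholders, not actual STS-32 numbers.
        """
        pitch_accel = np.gradient(pitch_rate, time_s)         # rad/s^2
        m_aero = inertia_yy * pitch_accel - nonaero_torque    # N*m
        return m_aero / (q_bar * ref_area * ref_chord)

    t = np.linspace(0.0, 10.0, 101)
    pitch_rate = 1e-5 * t                  # slowly varying pitch rate, rad/s
    cm = pitching_moment_coefficient(pitch_rate, t, inertia_yy=8.0e6,
                                     nonaero_torque=5.0, q_bar=0.02,
                                     ref_area=250.0, ref_chord=12.0)
    print("mean Cm:", cm.mean())
    ```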

  2. Thermodynamic Activity Measurements with Knudsen Cell Mass Spectrometry

    NASA Technical Reports Server (NTRS)

    Copland, Evan H.; Jacobson, Nathan S.

    2001-01-01

    Coupling the Knudsen effusion method with mass spectrometry has proven to be one of the most useful experimental techniques for studying the equilibrium between condensed phases and complex vapors. The Knudsen effusion method involves placing a condensed sample in a Knudsen cell, a small "enclosure" that is uniformly heated and held until equilibrium is attained between the condensed and vapor phases. The vapor is continuously sampled by effusion through a small orifice in the cell. A molecular beam is formed from the effusing vapor and directed into a mass spectrometer for identification and pressure measurement of the species in the vapor phase. Knudsen cell mass spectrometry (KCMS) has been used for nearly fifty years now and continues to be a leading technique for obtaining thermodynamic data. Indeed, much of the well-established vapor species data in the JANAF tables has been obtained from this technique. This is due to the extreme versatility of the technique. All classes of materials can be studied and all constituents of the vapor phase can be measured over a wide range of pressures (approximately 10^-4 to 10^-11 bar) and temperatures (500-2800 K). The ability to selectively measure different vapor species makes KCMS a very powerful tool for the measurement of component activities in metallic and ceramic solutions. Today several groups are applying KCMS to measure thermodynamic functions in multicomponent metallic and ceramic systems. Thermodynamic functions, especially component activities, are extremely important in the development of CALPHAD (Calculation of Phase Diagrams) type thermodynamic descriptions. These descriptions, in turn, are useful for modeling materials processing and predicting reactions such as oxide formation and fiber/matrix interactions. The leading experimental methods for measuring activities are the galvanic cell or electromotive force (EMF) technique and the KCMS technique. Each has specific advantages, depending on material and conditions. The EMF technique is suitable for lower temperature measurements, provided a suitable cell can be constructed. KCMS is useful for higher temperature measurements in a system with volatile components. In this paper, we briefly review the KCMS technique and identify the major experimental issues that must be addressed for precise measurements. These issues include temperature measurements, cell material and cell design, and absolute pressure calibration. The resolution of these issues is discussed together with some recent examples of measured thermodynamic data.
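
    The activity measurement itself reduces to a ratio of partial pressures, which in KCMS are proportional to ion intensity times cell temperature for a given species, so the instrument constant cancels. The sketch below shows that ratio for an alloy measured against a pure-element reference; the numbers are invented for illustration.

    ```python
    def component_activity(ion_intensity_alloy, ion_intensity_pure,
                           temperature_alloy_k, temperature_pure_k):
        """Raoultian activity a_i = p_i(alloy) / p_i(pure element).

        In Knudsen cell mass spectrometry p_i is proportional to I_i * T for
        the same species (same ionization cross section and detector gain),
        so the proportionality constant cancels in the ratio.
        """
        p_alloy = ion_intensity_alloy * temperature_alloy_k
        p_pure = ion_intensity_pure * temperature_pure_k
        return p_alloy / p_pure

    # Illustrative numbers: same cell temperature for alloy and pure standard
    a_ni = component_activity(ion_intensity_alloy=3.2e4, ion_intensity_pure=5.0e4,
                              temperature_alloy_k=1700.0, temperature_pure_k=1700.0)
    print(f"a(Ni) = {a_ni:.2f}")   # 0.64 in this made-up example
    ```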

  3. Blood doping by cobalt. Should we measure cobalt in athletes?

    PubMed Central

    Lippi, Giuseppe; Franchini, Massimo; Guidi, Gian Cesare

    2006-01-01

    Background: Blood doping is commonplace among competitive athletes who seek to enhance their aerobic performance through illicit techniques. Presentation of the hypothesis: Cobalt, a naturally-occurring element with properties similar to those of iron and nickel, induces a marked and stable polycythemic response through a more efficient transcription of the erythropoietin gene. Testing the hypothesis: Although little information is available so far on cobalt metabolism, reference value ranges or supplementation in athletes, there is emerging evidence that cobalt is used as a supplement and increased serum concentrations are occasionally observed in athletes. Therefore, given the athlete's connatural inclination to experiment with innovative, unfair and potentially unhealthy doping techniques, cobalt administration might soon become the most suited complement or surrogate for erythropoiesis-stimulating substances. Nevertheless, cobalt administration is not free from unsafe consequences, which involve toxic effects on the heart, liver, kidney and thyroid, and cancer promotion. Implications of the hypothesis: Cobalt is easily purchasable, inexpensive and not currently included in the World Anti-Doping Agency prohibited list. Moreover, available techniques for measuring whole blood, serum, plasma or urinary cobalt involve analytic approaches which are currently not practical for antidoping laboratories. Thus more research on cobalt metabolism in athletes is needed, along with implementation of effective strategies to unmask this potentially deleterious doping practice. PMID:16863591

  4. Monitoring beach changes using GPS surveying techniques

    USGS Publications Warehouse

    Morton, Robert; Leach, Mark P.; Paine, Jeffrey G.; Cardoza, Michael A.

    1993-01-01

    The adaptation of Global Positioning System (GPS) surveying techniques to beach monitoring activities is a promising response to this challenge. An experiment that employed both GPS and conventional beach surveying was conducted, and a new beach monitoring method employing kinematic GPS surveys was devised. This new method involves the collection of precise shore-parallel and shore-normal GPS positions from a moving vehicle so that an accurate two-dimensional beach surface can be generated. Results show that the GPS measurements agree with conventional shore-normal surveys at the 1 cm level, and repeated GPS measurements employing the moving vehicle demonstrate a precision of better than 1 cm. In addition, the nearly continuous sampling and increased resolution provided by the GPS surveying technique reveals alongshore changes in beach morphology that are undetected by conventional shore-normal profiles. The application of GPS surveying techniques combined with the refinement of appropriate methods for data collection and analysis provides a better understanding of beach changes, sediment transport, and storm impacts.
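
    Turning the shore-parallel and shore-normal GPS tracks into a two-dimensional beach surface is essentially a scattered-data gridding problem. The sketch below uses SciPy's griddata for that step; the grid spacing and the choice of linear interpolation are assumptions for illustration, not details taken from the cited survey method.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    def grid_beach_surface(x, y, z, cell_size=1.0):
        """Interpolate scattered kinematic GPS points onto a regular grid.

        x, y : horizontal coordinates of the GPS fixes (m)
        z    : heights (m)
        Returns (grid_x, grid_y, grid_z); linear interpolation is one
        reasonable choice, not necessarily what the cited survey used.
        """
        xi = np.arange(x.min(), x.max() + cell_size, cell_size)
        yi = np.arange(y.min(), y.max() + cell_size, cell_size)
        grid_x, grid_y = np.meshgrid(xi, yi)
        grid_z = griddata((x, y), z, (grid_x, grid_y), method="linear")
        return grid_x, grid_y, grid_z

    # Synthetic shore-parallel and shore-normal tracks
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 100, 500)
    y = rng.uniform(0, 40, 500)
    z = 2.0 - 0.05 * y + 0.01 * np.sin(x / 5.0)     # gently sloping beach
    gx, gy, gz = grid_beach_surface(x, y, z)
    print("gridded surface shape:", gz.shape)
    ```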

  5. Development of Rayleigh Doppler lidar for measuring middle atmosphere winds

    NASA Astrophysics Data System (ADS)

    Raghunath, K.; Patra, A. K.; Narayana Rao, D.

    Interpretation of most of the middle and upper atmospheric dynamical and chemical data relies on the climatological description of the wind field. Rayleigh Doppler lidar is one instrument which monitors wind profiles continuously, though continuity is limited to clear meteorological conditions in the middle atmosphere. A Doppler wind lidar operating in incoherent mode gives excellent wind and temperature information at these altitudes with the necessary spectral sensitivity. It observes atmospheric winds by measuring the spectral shift of the scattered light due to the motions of atmospheric molecules with background winds, and temperature by spectral broadening. The presentation is about the design and development of an incoherent Doppler lidar to obtain wind information in the height region of 30-65 km. The paper analyses and describes the various types of techniques that can be adopted, viz. the edge technique and the fringe imaging technique. The paper brings out the scientific objectives, configuration, simulations, error sources and technical challenges involved in the development of the Rayleigh Doppler lidar. The presentation also gives a novel technique for calibrating the lidar.
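
    The wind retrieval rests on the proportionality between the line-of-sight velocity and the measured Doppler shift of the backscattered light, v = lambda * delta_f / 2. The sketch below evaluates that relation; the wavelength and shift are illustrative numbers, not system specifications.

    ```python
    def line_of_sight_wind(doppler_shift_hz, wavelength_m):
        """Radial wind speed from the Doppler shift of backscattered light.

        For backscatter the shift is delta_f = 2 * v / lambda, so
        v = lambda * delta_f / 2.  Positive shift is taken here as motion
        toward the lidar.
        """
        return 0.5 * wavelength_m * doppler_shift_hz

    # A 532 nm lidar and a 75 MHz shift (illustrative values only)
    v = line_of_sight_wind(doppler_shift_hz=75.0e6, wavelength_m=532e-9)
    print(f"line-of-sight wind = {v:.1f} m/s")   # ~20 m/s
    ```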

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casperson, R. J.; Burke, J. T.; Hughes, R. O.

    Directly measuring (n,2n) cross sections on short-lived actinides presents a number of experimental challenges. The surrogate reaction technique is an experimental method for measuring cross sections on short-lived isotopes, and it provides a unique solution for measuring (n,2n) cross sections. This technique involves measuring a charged-particle reaction cross section, where the reaction populates the same compound nucleus as the reaction of interest. To perform these surrogate (n,2n) cross section measurements, a silicon telescope array has been placed along a beam line at the Texas A&M University Cyclotron Institute, which is surrounded by a large tank of gadolinium-doped liquid scintillator, which acts as a neutron detector. The combination of the charged-particle and neutron-detector arrays is referred to as NeutronSTARS. In the analysis procedure for calculating the (n,2n) cross section, the neutron detection efficiency and time structure play an important role. Due to the lack of availability of isotropic, mono-energetic neutron sources, modeling is an important component in establishing this efficiency and time structure. This report describes the NeutronSTARS array, which was designed and commissioned during this project. It also describes the surrogate reaction technique, specifically referencing a 235U(n,2n) commissioning measurement that was fielded during the past year. Advanced multiplicity analysis techniques have been developed for this work, which should allow for efficient analysis of 241Pu(n,2n) and 239Pu(n,2n) cross section measurements.

  7. Surface Tension and Viscosity Measurements in Microgravity: Some Results and Fluid Flow Observations during MSL-1

    NASA Technical Reports Server (NTRS)

    Hyer, Robert W.; Trapaga, G.; Flemings, M. C.

    1999-01-01

    The viscosity of a liquid metal was successfully measured for the first time by a containerless method, the oscillating drop technique. This method also provides a means to obtain a precise, non-contact measurement of the surface tension of the droplet. This technique involves exciting the surface of the molten sample and then measuring the resulting oscillations; the natural frequency of the oscillating sample is determined by its surface tension, and the damping of the oscillations by the viscosity. These measurements were performed in TEMPUS, a microgravity electromagnetic levitator (EML), on the Space Shuttle as a part of the First Microgravity Science Laboratory (MSL-1), which flew in April and July 1997 (STS-83 and STS-94). Some results of the surface tension and viscosity measurements are presented for Pd82Si18. Some observations of the fluid dynamic characteristics (dominant flow patterns, turbulent transition, cavitation, etc.) of levitated droplets are presented and discussed together with magnetohydrodynamic calculations, which were performed to justify these findings.
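
    For an electromagnetically excited, levitated drop, surface tension is commonly obtained from the l = 2 Rayleigh frequency and viscosity from the damping time via Lamb's relation. The sketch below uses the classical force-free formulas, sigma = 3*pi*m*f^2/8 and eta = 3*m/(20*pi*R*tau); in practice corrections for the levitation fields are needed, and the sample values are illustrative only.

    ```python
    import math

    def surface_tension(mass_kg, rayleigh_freq_hz):
        """Rayleigh relation for the l = 2 mode of a free spherical drop:
        sigma = 3 * pi * m * f^2 / 8."""
        return 3.0 * math.pi * mass_kg * rayleigh_freq_hz ** 2 / 8.0

    def viscosity(mass_kg, radius_m, damping_time_s):
        """Lamb's relation for the l = 2 mode:
        eta = 3 * m / (20 * pi * R * tau)."""
        return 3.0 * mass_kg / (20.0 * math.pi * radius_m * damping_time_s)

    # Illustrative droplet: ~1 g sample with a ~3 mm radius
    m, r = 1.0e-3, 3.0e-3
    print("sigma [N/m]:", surface_tension(m, rayleigh_freq_hz=35.0))
    print("eta [Pa*s] :", viscosity(m, r, damping_time_s=2.0))
    ```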

  8. An Empirical State Error Covariance Matrix for the Weighted Least Squares Estimation Method

    NASA Technical Reports Server (NTRS)

    Frisbee, Joseph H., Jr.

    2011-01-01

    State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. This proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. Results based on the proposed technique will be presented for a simple, two observer, measurement error only problem.

  9. CARES: Completely Automated Robust Edge Snapper for carotid ultrasound IMT measurement on a multi-institutional database of 300 images: a two stage system combining an intensity-based feature approach with first order absolute moments

    NASA Astrophysics Data System (ADS)

    Molinari, Filippo; Acharya, Rajendra; Zeng, Guang; Suri, Jasjit S.

    2011-03-01

    The carotid intima-media thickness (IMT) is the most used marker for the progression of atherosclerosis and the onset of cardiovascular diseases. Computer-aided measurements improve accuracy, but usually require user interaction. In this paper we characterized a new and completely automated technique for carotid segmentation and IMT measurement based on the merits of two previously developed techniques. We used an integrated approach of intelligent image feature extraction and line fitting for automatically locating the carotid artery in the image frame, followed by wall interface extraction based on a Gaussian edge operator. We called our system CARES. We validated CARES on a multi-institutional database of 300 carotid ultrasound images. The IMT measurement bias was 0.032 +/- 0.141 mm, better than other automated techniques and comparable to that of user-driven methodologies. Our novel approach processed 96% of the images, leading to a figure of merit of 95.7%. CARES ensured complete automation and high accuracy in IMT measurement; hence it could be a suitable clinical tool for processing of large datasets in multicenter studies involving atherosclerosis.

  10. One way Doppler Extractor. Volume 2: Digital VCO technique

    NASA Technical Reports Server (NTRS)

    Nossen, E. J.; Starner, E. R.

    1974-01-01

    A feasibility analysis and trade-offs for a one-way Doppler extractor using digital VCO techniques is presented. The method of Doppler measurement involves the use of a digital phase lock loop; once this loop is locked to the incoming signal, the precise frequency and hence the Doppler component can be determined directly from the contents of the digital control register. The only serious error source is due to internally generated noise. Techniques are presented for minimizing this error source and achieving an accuracy of 0.01 Hz in a one second averaging period. A number of digitally controlled oscillators were analyzed from a performance and complexity point of view. The most promising technique uses an arithmetic synthesizer as a digital waveform generator.

  11. An experimental facility for the visual study of turbulent flows.

    NASA Technical Reports Server (NTRS)

    Brodkey, R. S.; Hershey, H. C.; Corino, E. R.

    1971-01-01

    An experimental technique which allows visual observations of the wall area in turbulent pipe flow is described in detail. It requires neither the introduction of any injection or measuring device into the flow nor the presence of a two-phase flow or of a non-Newtonian fluid. The technique involves suspending solid MgO particles of colloidal size in trichloroethylene and photographing their motions near the wall with a high speed movie camera moving with the flow. Trichloroethylene was chosen in order to eliminate the index of refraction problem in a curved wall. Evaluation of the technique including a discussion of limitations is included. Also the technique is compared with previous methods of visual observations of turbulent flow.

  12. A technique based on droplet evaporation to recognize alcoholic drinks

    NASA Astrophysics Data System (ADS)

    González-Gutiérrez, Jorge; Pérez-Isidoro, Rosendo; Ruiz-Suárez, J. C.

    2017-07-01

    Chromatography is, at present, the most used technique to determine the purity of alcoholic drinks. This involves a careful separation of the components of the liquid elements. However, since this technique requires sophisticated instrumentation, there are alternative techniques such as conductivity measurements and UV-Vis and infrared spectrometries. We report here a method based on salt-induced crystallization patterns formed during the evaporation of alcoholic drops. We found that droplets of different samples form different structures upon drying, which we characterize by their radial density profiles. We prove that using the dried deposit of a spirit as a control sample, our method allows us to differentiate between pure and adulterated drinks. As a proof of concept, we study tequila.
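
    The radial density profile used to characterize the dried deposits can be computed by averaging image intensity in concentric annuli about the deposit centre. The NumPy sketch below does exactly that for a synthetic ring-shaped deposit; the binning and the assumption that the centre is known are simplifications.

    ```python
    import numpy as np

    def radial_profile(image, center=None, n_bins=50):
        """Mean intensity versus radius for a grayscale deposit image.

        image  : 2-D array of pixel intensities
        center : (row, col) of the deposit centre; defaults to the image centre
        Returns (bin_radii, mean_intensity) for n_bins concentric annuli.
        """
        image = np.asarray(image, dtype=float)
        rows, cols = np.indices(image.shape)
        if center is None:
            center = ((image.shape[0] - 1) / 2.0, (image.shape[1] - 1) / 2.0)
        r = np.hypot(rows - center[0], cols - center[1])
        edges = np.linspace(0.0, r.max(), n_bins + 1)
        which = np.digitize(r.ravel(), edges) - 1
        which = np.clip(which, 0, n_bins - 1)
        sums = np.bincount(which, weights=image.ravel(), minlength=n_bins)
        counts = np.bincount(which, minlength=n_bins)
        return 0.5 * (edges[:-1] + edges[1:]), sums / np.maximum(counts, 1)

    # Synthetic "coffee-ring"-like deposit: a bright rim at radius ~70 px
    y, x = np.indices((201, 201))
    rr = np.hypot(y - 100, x - 100)
    deposit = np.exp(-((rr - 70.0) / 8.0) ** 2)
    radii, profile = radial_profile(deposit)
    print("radius of maximum density:", radii[np.argmax(profile)])
    ```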

  13. User-controlled photographic animations, photograph-based questions, and questionnaires: three Internet-based instruments for measuring drivers' risk-taking behavior.

    PubMed

    Horswill, M S; Coster, M E

    2001-02-01

    The Internet has been exploited successfully in the past as a medium for behavioral research. This paper presents a series of studies designed to assess Internet-based measures of drivers' risk-taking behavior. First, we compared responses from an Internet sample with a traditional pencil-and-paper sample using established questionnaire measures of risk taking. No significant differences were found. Second, we assessed the validity of new Internet-based instruments, involving photographs and photographic animations, that measured speed, gap acceptance, and passing. Responses were found to reflect known demographic patterns of actual behavior to some degree. Also, a roadside survey of speeds was carried out at the locations depicted in the photographic measure of speeding and, with certain exceptions, differences between the two appeared to be constant. Third, a between-subject experimental manipulation involving the photographic animation measure of gap acceptance was used to demonstrate one application of these techniques.

  14. Real-time monitoring of CO2 storage sites: Application to Illinois Basin-Decatur Project

    USGS Publications Warehouse

    Picard, G.; Berard, T.; Chabora, E.; Marsteller, S.; Greenberg, S.; Finley, R.J.; Rinck, U.; Greenaway, R.; Champagnon, C.; Davard, J.

    2011-01-01

    Optimization of carbon dioxide (CO2) storage operations for efficiency and safety requires use of monitoring techniques and implementation of control protocols. The monitoring techniques consist of permanent sensors and tools deployed for measurement campaigns. Large amounts of data are thus generated. These data must be managed and integrated for interpretation at different time scales. A fast interpretation loop involves combining continuous measurements from permanent sensors as they are collected to enable a rapid response to detected events; a slower loop requires combining large datasets gathered over longer operational periods from all techniques. The purpose of this paper is twofold. First, it presents an analysis of the monitoring objectives to be performed in the slow and fast interpretation loops. Second, it describes the implementation of the fast interpretation loop with a real-time monitoring system at the Illinois Basin-Decatur Project (IBDP) in Illinois, USA. © 2011 Published by Elsevier Ltd.

  15. Application of Functional Near-Infrared Spectroscopy to the Study of Brain Function in Humans and Animal Models

    PubMed Central

    Kim, Hak Yeong; Seo, Kain; Jeon, Hong Jin; Lee, Unjoo; Lee, Hyosang

    2017-01-01

    Functional near-infrared spectroscopy (fNIRS) is a noninvasive optical imaging technique that indirectly assesses neuronal activity by measuring changes in oxygenated and deoxygenated hemoglobin in tissues using near-infrared light. fNIRS has been used not only to investigate cortical activity in healthy human subjects and animals but also to reveal abnormalities in brain function in patients suffering from neurological and psychiatric disorders and in animals that exhibit disease conditions. Because of its safety, quietness, resistance to motion artifacts, and portability, fNIRS has become a tool to complement conventional imaging techniques in measuring hemodynamic responses while a subject performs diverse cognitive and behavioral tasks in test settings that are more ecologically relevant and involve social interaction. In this review, we introduce the basic principles of fNIRS and discuss the application of this technique in human and animal studies. PMID:28835022
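
    The conversion from measured light attenuation to oxy- and deoxy-hemoglobin changes is usually done with the modified Beer-Lambert law at two or more wavelengths. The sketch below inverts that relation with a least-squares solve; the extinction coefficients and differential pathlength factors are placeholders, not calibrated values.

    ```python
    import numpy as np

    def hemoglobin_changes(delta_od, extinction, distance_cm, dpf):
        """Modified Beer-Lambert law inversion at two (or more) wavelengths.

        delta_od   : optical density changes, one per wavelength
        extinction : matrix [[eps_HbO(l1), eps_HbR(l1)],
                             [eps_HbO(l2), eps_HbR(l2)]]  (1/(mM*cm))
        distance_cm, dpf : source-detector separation and differential
                           pathlength factor per wavelength
        Returns (dHbO, dHbR) in mM.  All coefficients here are placeholders.
        """
        delta_od = np.asarray(delta_od, dtype=float)
        extinction = np.asarray(extinction, dtype=float)
        path = distance_cm * np.asarray(dpf, dtype=float)
        # delta_od = (extinction @ [dHbO, dHbR]) * path, solved by least squares
        lhs = extinction * path[:, None]
        dhb, *_ = np.linalg.lstsq(lhs, delta_od, rcond=None)
        return dhb[0], dhb[1]

    d_hbo, d_hbr = hemoglobin_changes(delta_od=[0.012, 0.009],
                                      extinction=[[1.5, 3.8],   # ~690 nm (placeholder)
                                                  [2.5, 1.8]],  # ~830 nm (placeholder)
                                      distance_cm=3.0, dpf=[6.0, 6.0])
    print(f"dHbO = {d_hbo:.4f} mM, dHbR = {d_hbr:.4f} mM")
    ```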

  16. Feasibility study of a swept frequency electromagnetic probe (SWEEP) using inductive coupling for the determination of subsurface conductivity of the earth and water prospecting in arid regions

    NASA Technical Reports Server (NTRS)

    Latorraca, G. A.; Bannister, L. H.

    1974-01-01

    Techniques developed for electromagnetic probing of the lunar interior, and techniques developed for the generation of high power audio frequencies were combined to make practical a magnetic inductive coupling system for the rapid measurement of ground conductivity profiles which are helpful when prospecting for the presence and quality of subsurface water. A system which involves the measurement of the direction, intensity, and time phase of the magnetic field observed near the surface of the earth at a distance from a horizontal coil energized so as to create a field that penetrates the earth was designed and studied to deduce the conductivity and stratification of the subsurface. Theoretical studies and a rudimentary experiment in an arid region showed that the approach is conceptually valid and that this geophysical prospecting technique can be developed for the economical exploration of subterranean water resources.

  17. Quantitative Proton Magnetic Resonance Techniques for Measuring Fat

    PubMed Central

    Harry, Houchun; Kan, Hermien E.

    2014-01-01

    Accurate, precise, and reliable techniques for quantifying body and organ fat distributions are important tools in physiology research. They are critically needed in studies of obesity and diseases involving excess fat accumulation. Proton magnetic resonance methods address this need by providing an array of relaxometry-based (T1, T2) and chemical-shift-based approaches. These techniques can generate informative visualizations of regional and whole-body fat distributions, yield measurements of fat volumes within specific body depots, and quantify fat accumulation in abdominal organs and muscles. MR methods are commonly used to investigate the role of fat in nutrition and metabolism, to measure the efficacy of short and long-term dietary and exercise interventions, to study the implications of fat in organ steatosis and muscular dystrophies, and to elucidate pathophysiological mechanisms in the context of obesity and its comorbidities. The purpose of this review is to provide a summary of mainstream MR strategies for fat quantification. The article will succinctly describe the principles that differentiate water and fat proton signals, summarize advantages and limitations of various techniques, and offer a few illustrative examples. The article will also highlight recent efforts in MR of brown adipose tissue and conclude by briefly discussing some future research directions. PMID:24123229
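
    As a concrete instance of a chemical-shift-based approach, a naive two-point Dixon separation derives water and fat signals, and a signal fat fraction, directly from in-phase and opposed-phase images. The sketch below shows that arithmetic only; real implementations additionally handle T2* decay, field inhomogeneity, water-fat swaps, and noise bias.

    ```python
    import numpy as np

    def two_point_dixon(in_phase, opposed_phase):
        """Naive magnitude-based two-point Dixon separation.

        W = (IP + OP) / 2,  F = (IP - OP) / 2,  fat fraction = F / (W + F).
        Assumes water is the dominant signal in every voxel; real pipelines
        also correct for T2* decay, field inhomogeneity and noise bias.
        """
        ip = np.asarray(in_phase, dtype=float)
        op = np.asarray(opposed_phase, dtype=float)
        water = 0.5 * (ip + op)
        fat = 0.5 * (ip - op)
        with np.errstate(divide="ignore", invalid="ignore"):
            fat_fraction = np.where(ip > 0, fat / ip, 0.0)   # F / (W + F) = F / IP
        return water, fat, fat_fraction

    ip = np.array([[100.0, 80.0], [60.0, 120.0]])
    op = np.array([[ 60.0, 70.0], [50.0,  20.0]])
    _, _, ff = two_point_dixon(ip, op)
    print(ff)   # e.g. 0.2 in the voxel where IP = 100 and OP = 60
    ```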

  18. Molecular-Based Optical Measurement Techniques for Transition and Turbulence in High-Speed Flow

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Cutler, Andrew D.

    2013-01-01

    High-speed laminar-to-turbulent transition and turbulence affect the control of flight vehicles, the heat transfer rate to a flight vehicle's surface, the material selected to protect such vehicles from high heating loads, the ultimate weight of a flight vehicle due to the presence of thermal protection systems, the efficiency of fuel-air mixing processes in high-speed combustion applications, etc. Gaining a fundamental understanding of the physical mechanisms involved in the transition process will lead to the development of predictive capabilities that can identify transition location and its impact on parameters like surface heating. Currently, there is no general theory that can completely describe the transition-to-turbulence process. However, transition research has led to the identification of the predominant pathways by which this process occurs. For a truly physics-based model of transition to be developed, the individual stages in the paths leading to the onset of fully turbulent flow must be well understood. This requires that each pathway be computationally modeled and experimentally characterized and validated. This may also lead to the discovery of new physical pathways. This document is intended to describe molecular based measurement techniques that have been developed, addressing the needs of the high-speed transition-to-turbulence and high-speed turbulence research fields. In particular, we focus on techniques that have either been used to study high speed transition and turbulence or techniques that show promise for studying these flows. This review is not exhaustive. In addition to the probe-based techniques described in the previous paragraph, several other classes of measurement techniques that are, or could be, used to study high speed transition and turbulence are excluded from this manuscript. For example, surface measurement techniques such as pressure and temperature paint, phosphor thermography, skin friction measurements and photogrammetry (for model attitude and deformation measurement) are excluded to limit the scope of this report. Other physical probes such as heat flux gauges, total temperature probes are also excluded. We further exclude measurement techniques that require particle seeding though particle based methods may still be useful in many high speed flow applications. This manuscript details some of the more widely used molecular-based measurement techniques for studying transition and turbulence: laser-induced fluorescence (LIF), Rayleigh and Raman Scattering and coherent anti-Stokes Raman scattering (CARS). These techniques are emphasized, in part, because of the prior experience of the authors. Additional molecular based techniques are described, albeit in less detail. Where possible, an effort is made to compare the relative advantages and disadvantages of the various measurement techniques, although these comparisons can be subjective views of the authors. Finally, the manuscript concludes by evaluating the different measurement techniques in view of the precision requirements described in this chapter. Additional requirements and considerations are discussed to assist with choosing an optical measurement technique for a given application.

  19. Study of Lamb Waves for Non-Destructive Testing Behind Screens

    NASA Astrophysics Data System (ADS)

    Kauffmann, P.; Ploix, M.-A.; Chaix, J.-F.; Gueudré, C.; Corneloup, G.; Baqué, F.

    2018-01-01

    The inspection and control of sodium-cooled fast reactors (SFR) is a major issue for the nuclear industry. Ultrasonic solutions are under study because of the opacity of liquid sodium. In this paper, the use of leaky Lamb waves is considered for non-destructive testing (NDT) of parallel and immersed structures that can be treated as plates. The first phase of our approach involved studying the propagation properties of leaky Lamb waves. Equations that model the propagation of Lamb waves in an immersed plate were solved numerically. The phase velocity can be experimentally measured using a two-dimensional Fourier transform. The group velocity can be experimentally measured using a short-time Fourier transform technique. Attenuation of leaky Lamb waves is mostly due to the re-emission of energy into the surrounding fluid, and it can be measured by these two techniques.
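
    The phase-velocity measurement by two-dimensional Fourier transform amounts to transforming the space-time wavefield u(x, t) into the frequency-wavenumber domain and reading off c_p = 2*pi*f / k along the dominant ridge. The sketch below extracts the strongest (f, k) component from a single-mode synthetic wave; the mode parameters are illustrative.

    ```python
    import numpy as np

    def phase_velocity_from_wavefield(u_xt, dx, dt):
        """Dominant phase velocity from a 2-D FFT of the wavefield u(x, t).

        u_xt   : array of shape (n_x, n_t) sampled along the plate and in time
        dx, dt : spatial and temporal sampling steps
        The strongest frequency-wavenumber component gives c_p = 2*pi*f / k,
        as in the two-dimensional Fourier transform method mentioned above.
        """
        n_x, n_t = u_xt.shape
        magnitude = np.abs(np.fft.fft2(u_xt))
        k = 2.0 * np.pi * np.fft.fftfreq(n_x, d=dx)   # angular wavenumber, rad/m
        f = np.fft.fftfreq(n_t, d=dt)                 # frequency, Hz
        magnitude[0, :] = 0.0                         # suppress the k = 0 line
        magnitude[:, 0] = 0.0                         # suppress the f = 0 line
        ik, it = np.unravel_index(np.argmax(magnitude), magnitude.shape)
        return abs(2.0 * np.pi * f[it] / k[ik])

    # Synthetic single mode: 200 kHz propagating at a phase velocity of 3000 m/s
    x = np.arange(0.0, 0.5, 1e-3)[:, None]          # 0.5 m scan, 1 mm steps
    t = np.arange(0.0, 200e-6, 0.5e-6)[None, :]     # 200 us record, 0.5 us steps
    f0, c_p = 200e3, 3000.0
    u = np.cos(2.0 * np.pi * f0 * (t - x / c_p))
    print("estimated c_p [m/s]:", phase_velocity_from_wavefield(u, dx=1e-3, dt=0.5e-6))
    ```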

  20. Tapping the Potential of Skill Integration as a Conduit for Communicative Language Teaching

    ERIC Educational Resources Information Center

    Wu, Shu-hua; Alrabah, Sulaiman

    2014-01-01

    The purpose of this classroom-based study was to discover the kinds of skill integration tasks that were employed by English teachers in Kuwait and to measure their attitudes toward implementing the skill integration technique in their classrooms. Data collection involved recording 25 hours of classroom-based observations, conducting interviews…

  1. Stakeholder Partnerships as Collaborative Policymaking: Evaluation Criteria Applied to Watershed Management in California and Washington

    ERIC Educational Resources Information Center

    Leach, William D.; Pelkey, Neil W.; Sabatier, Paul A.

    2002-01-01

    Public policymaking and implementation in the United States are increasingly handled through local, consensus-seeking partnerships involving most affected stakeholders. This paper formalizes the concept of a stakeholder partnership, and proposes techniques for using interviews, surveys, and documents to measure each of six evaluation criteria.…

  2. VALIDATION OF A METHOD FOR ESTIMATING POLLUTION EMISSION RATES FROM AREA SOURCES USING OPEN-PATH FTIR SPECTROSCOPY AND DISPERSION MODELING TECHNIQUES

    EPA Science Inventory

    The paper describes a methodology developed to estimate emission factors for a variety of different area sources in a rapid, accurate, and cost effective manner. The methodology involves using an open-path Fourier transform infrared (FTIR) spectrometer to measure concentrations o...

  3. Interference detection and correction applied to incoherent-scatter radar power spectrum measurement

    NASA Technical Reports Server (NTRS)

    Ying, W. P.; Mathews, J. D.; Rastogi, P. K.

    1986-01-01

    A median filter based interference detection and correction technique is evaluated and the method applied to the Arecibo incoherent scatter radar D-region ionospheric power spectrum is discussed. The method can be extended to other kinds of data when the statistics involved in the process are still valid.
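
    A minimal version of a median-filter-based interference detector flags samples that deviate from a running median of the power spectrum by more than a robust threshold and replaces them with the median estimate. The window length, threshold, and synthetic spectrum below are illustrative choices, not the parameters of the cited Arecibo processing.

    ```python
    import numpy as np
    from scipy.signal import medfilt

    def correct_interference(spectrum, kernel_size=11, n_sigma=4.0):
        """Flag and replace interference spikes in a power spectrum.

        A running median estimates the interference-free background; samples
        deviating from it by more than n_sigma robust standard deviations are
        replaced with the median.  Window and threshold are illustrative only.
        """
        spectrum = np.asarray(spectrum, dtype=float)
        background = medfilt(spectrum, kernel_size=kernel_size)
        residual = spectrum - background
        robust_sigma = 1.4826 * np.median(np.abs(residual - np.median(residual)))
        spikes = np.abs(residual) > n_sigma * robust_sigma
        cleaned = spectrum.copy()
        cleaned[spikes] = background[spikes]
        return cleaned, spikes

    rng = np.random.default_rng(2)
    freq = np.linspace(-500, 500, 401)
    spec = np.exp(-(freq / 120.0) ** 2) + 0.05 * rng.standard_normal(freq.size)
    spec[150] += 3.0          # narrowband interference spike
    clean, flags = correct_interference(spec)
    print("samples flagged as interference:", int(flags.sum()))
    ```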

  4. Estimating a Meaningful Point of Change: A Comparison of Exploratory Techniques Based on Nonparametric Regression

    ERIC Educational Resources Information Center

    Klotsche, Jens; Gloster, Andrew T.

    2012-01-01

    Longitudinal studies are increasingly common in psychological research. Characterized by repeated measurements, longitudinal designs aim to observe phenomena that change over time. One important question involves identification of the exact point in time when the observed phenomena begin to meaningfully change above and beyond baseline…

  5. Temperature and heat flux measurements: Challenges for high temperature aerospace application

    NASA Technical Reports Server (NTRS)

    Neumann, Richard D.

    1992-01-01

    The measurement of high temperatures and the influence of heat transfer data is not strictly a problem of either the high temperatures involved or the level of the heating rates to be measured at those high temperatures. It is a problem of duration during which measurements are made and the nature of the materials in which the measurements are made. Thermal measurement techniques for each application must respect and work with the unique features of that application. Six challenges in the development of measurement technology are discussed: (1) to capture the character and localized peak values within highly nonuniform heating regions; (2) to manage large volumes of thermal instrumentation in order to efficiently derive critical information; (3) to accommodate thermal sensors into practical flight structures; (4) to broaden the capabilities of thermal survey techniques to replace discrete gages in flight and on the ground; (5) to provide supporting instrumentation conduits which connect the measurement points to the thermally controlled data acquisition system; and (6) to develop a class of 'vehicle tending' thermal sensors to assure the integrity of flight vehicles in an efficient manner.

  6. Outcome of Vaginoplasty in Male-to-Female Transgenders: A Systematic Review of Surgical Techniques.

    PubMed

    Horbach, Sophie E R; Bouman, Mark-Bram; Smit, Jan Maerten; Özer, Müjde; Buncamper, Marlon E; Mullender, Margriet G

    2015-06-01

    Gender reassignment surgery is the keystone of the treatment of transgender patients. For male-to-female transgenders, this involves the creation of a neovagina. Many surgical methods for vaginoplasty have been proposed. The penile skin inversion technique is the method of choice for most gender surgeons. However, the optimal surgical technique for vaginoplasty in transgender women has not yet been identified, as outcomes of the different techniques have never been compared. With this systematic review, we aim to give a detailed overview of the published outcomes of all currently available techniques for vaginoplasty in male-to-female transgenders. A PubMed and EMBASE search was performed for relevant publications (1995-present) which provided data on the outcome of techniques for vaginoplasty in male-to-female transgender patients. Main outcome measures are complications, neovaginal depth and width, sexual function, patient satisfaction, and improvement in quality of life (QoL). Twenty-six studies satisfied the inclusion criteria. The majority of these studies were retrospective case series of low to intermediate quality. Outcome of the penile skin inversion technique was reported in 1,461 patients, bowel vaginoplasty in 102 patients. Neovaginal stenosis was the most frequent complication in both techniques. Sexual function and patient satisfaction were overall acceptable, but many different outcome measures were used. QoL was only reported in one study. Comparison between techniques was difficult due to the lack of standardization. The penile skin inversion technique is the most researched surgical procedure. Outcome of bowel vaginoplasty has been reported less frequently but does not seem to be inferior. The available literature is heterogeneous in patient groups, surgical procedure, outcome measurement tools, and follow-up. Standardized protocols and prospective study designs are mandatory for correct interpretation and comparability of data. © 2015 International Society for Sexual Medicine.

  7. Martial arts: time needed for training.

    PubMed

    Burke, David T; Protopapas, Marina; Bonato, Paolo; Burke, John T; Landrum, Robert F

    2011-03-01

    To measure the time needed to teach a series of martial arts techniques to proficiency. Fifteen volunteer subjects without any prior martial arts or self-defense experience were recruited. A panel of martial arts experts selected 21 different techniques including defensive stances, arm blocks, elbow strikes, palm strikes, thumbs to eyes, instep kicks and a carotid neck restraint. The critical elements of each technique were identified by the panel and incorporated into a teaching protocol, and then into a scoring system. Two black belt martial arts instructors directed a total of forty-five 45-minute training sessions. Videotaped proficiency testing was performed weekly. The videotapes were reviewed by the investigators to determine the proficiency levels of each subject for each technique. The techniques were rated by the average number of training sessions needed for an individual to develop proficiency in that technique. The mean number of sessions necessary to train individuals to proficiency ranged from 27 to 38.3. Using this system, the most difficult techniques seemed to be elbow strikes to the rear, striking with thumbs to the eyes and arm blocking. In this study 29 hours of training was necessary to train novice students to be proficient in 21 offensive and defensive martial arts techniques. To our knowledge, this is the first study that attempts to measure the learning curves involved when teaching martial arts techniques.

  8. Martial Arts: Time Needed for Training

    PubMed Central

    Burke, David T.; Protopapas, Marina; Bonato, Paolo; Burke, John T.; Landrum, Robert F.

    2011-01-01

    Purpose To measure the time needed to teach a series of martial arts techniques to proficiency. Methods Fifteen volunteer subjects without any prior martial arts or self-defense experience were recruited. A panel of martial arts experts selected 21 different techniques including defensive stances, arm blocks, elbow strikes, palm strikes, thumbs to eyes, instep kicks and a carotid neck restraint. The critical elements of each technique were identified by the panel and incorporated into a teaching protocol, and then into a scoring system. Two black belt martial arts instructors directed a total of forty-five 45-minute training sessions. Videotaped proficiency testing was performed weekly. The videotapes were reviewed by the investigators to determine the proficiency levels of each subject for each technique. Results The techniques were rated by the average number of training sessions needed for an individual to develop proficiency in that technique. The mean number of sessions necessary to train individuals to proficiency ranged from 27 to 38.3. Using this system, the most difficult techniques seemed to be elbow strikes to the rear, striking with thumbs to the eyes and arm blocking. Conclusions In this study 29 hours of training was necessary to train novice students to be proficient in 21 offensive and defensive martial arts techniques. To our knowledge, this is the first study that attempts to measure the learning curves involved when teaching martial arts techniques. PMID:22375215

  9. Self-calibrating d-scan: measuring ultrashort laser pulses on-target using an arbitrary pulse compressor.

    PubMed

    Alonso, Benjamín; Sola, Íñigo J; Crespo, Helder

    2018-02-19

    In most applications of ultrashort pulse lasers, temporal compressors are used to achieve a desired pulse duration in a target or sample, and precise temporal characterization is important. The dispersion-scan (d-scan) pulse characterization technique usually involves using glass wedges to impart variable, well-defined amounts of dispersion to the pulses, while measuring the spectrum of a nonlinear signal produced by those pulses. This works very well for broadband few-cycle pulses, but longer, narrower-bandwidth pulses are much more difficult to measure this way. Here we demonstrate the concept of self-calibrating d-scan, which extends the applicability of the d-scan technique to pulses of arbitrary duration, enabling their complete measurement without prior knowledge of the introduced dispersion. In particular, we show that the pulse compressors already employed in chirped pulse amplification (CPA) systems can be used to simultaneously compress and measure the temporal profile of the output pulses on-target in a simple way, without the need for additional diagnostics or calibrations, while at the same time calibrating the often-unknown differential dispersion of the compressor itself. We demonstrate the technique through simulations and experiments under known conditions. Finally, we apply it to the measurement and compression of 27.5 fs pulses from a CPA laser.
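
    The core of a conventional d-scan measurement, as described above, is a two-dimensional trace of nonlinear (e.g., second-harmonic) spectra versus inserted dispersion. The following minimal numerical sketch illustrates that idea under simplifying assumptions: a Gaussian test spectrum, a purely quadratic glass phase with an assumed GDD per millimetre, and an idealized SHG signal. It is not the authors' self-calibrating algorithm, only an illustration of how a d-scan trace is built up.

    ```python
    import numpy as np

    # Minimal sketch of an SHG dispersion-scan (d-scan) trace: a pulse acquires a
    # variable quadratic spectral phase (standing in for glass insertion) and the
    # second-harmonic spectrum is recorded for each insertion.  All numbers
    # (bandwidth, GDD per mm, insertion range) are illustrative assumptions.

    N = 2048
    t = np.linspace(-400e-15, 400e-15, N)        # time grid, s
    dt = t[1] - t[0]
    w = 2 * np.pi * np.fft.fftfreq(N, d=dt)      # baseband angular frequency grid

    E0 = np.exp(-(t / 10e-15) ** 2)              # transform-limited Gaussian envelope
    S0 = np.fft.fft(E0)                          # its spectrum

    gdd_per_mm = 45e-30                          # assumed group-delay dispersion, s^2/mm
    insertions = np.linspace(-4, 4, 81)          # glass insertion, mm

    trace = []
    for z in insertions:
        phase = 0.5 * gdd_per_mm * z * w ** 2            # quadratic phase from glass
        Ez = np.fft.ifft(S0 * np.exp(1j * phase))        # dispersed field envelope
        shg = np.abs(np.fft.fft(Ez ** 2)) ** 2           # idealized SHG spectrum
        trace.append(shg)

    trace = np.array(trace)      # d-scan trace: rows = insertion, columns = frequency
    print(trace.shape)
    ```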

  10. The investigation of advanced remote sensing, radiative transfer and inversion techniques for the measurement of atmospheric constituents

    NASA Technical Reports Server (NTRS)

    Deepak, Adarsh; Wang, Pi-Huan

    1985-01-01

    The research program is documented for developing space and ground-based remote sensing techniques performed during the period from December 15, 1977 to March 15, 1985. The program involved the application of sophisticated radiative transfer codes and inversion methods to various advanced remote sensing concepts for determining atmospheric constituents, particularly aerosols. It covers detailed discussions of the solar aureole technique for monitoring columnar aerosol size distribution, and the multispectral limb scattered radiance and limb attenuated radiance (solar occultation) techniques, as well as the upwelling scattered solar radiance method for determining the aerosol and gaseous characteristics. In addition, analytical models of aerosol size distribution and simulation studies of the limb solar aureole radiance technique and the variability of ozone at high altitudes during satellite sunrise/sunset events are also described in detail.

  11. A constrained modulus reconstruction technique for breast cancer assessment.

    PubMed

    Samani, A; Bishop, J; Plewes, D B

    2001-09-01

    A reconstruction technique for the breast tissue elasticity modulus is described. This technique assumes that the geometry of normal and suspicious tissues is available from a contrast-enhanced magnetic resonance image. Furthermore, it is assumed that the modulus is constant throughout each tissue volume. The technique, which uses quasi-static strain data, is iterative, with each iteration involving modulus updating followed by stress calculation. Breast mechanical stimulation is assumed to be applied by two rigid compression plates. As a result, stress is calculated using the finite element method based on the well-controlled boundary conditions of the compression plates. Using the calculated stress and the measured strain, modulus updating is done element-by-element based on Hooke's law. Breast tissue modulus reconstruction using simulated data and phantom modulus reconstruction using experimental data indicate that the technique is robust.
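
    As a toy illustration of the element-by-element update described above, the sketch below assumes a 1-D column of elements compressed between rigid plates with a known applied pressure, so the "finite element" stress calculation collapses to a uniform stress. The numbers, and the assumption of a known plate pressure, are illustrative only; the published method works on full 3-D breast geometries.

    ```python
    import numpy as np

    # Toy sketch of the iterative modulus update: a "stress calculation" step
    # followed by an element-by-element Hooke's-law update, repeated to convergence.
    # In this 1-D example the stress step is trivial (uniform stress equal to the
    # known plate pressure), so the loop converges immediately, but the structure
    # mirrors the update rule described in the abstract.

    applied_pressure = 2.0e3                                    # Pa, assumed known
    eps_measured = np.array([1.0e-3, 4.0e-4, 1.0e-3, 2.5e-4])   # quasi-static strains

    E = np.full_like(eps_measured, 1.0e6)        # initial modulus guess per element

    for iteration in range(20):
        # Stress calculation: here simply the uniform applied pressure.
        sigma = np.full_like(E, applied_pressure)

        # Modulus update: Hooke's law applied element by element.
        E_new = sigma / eps_measured

        if np.max(np.abs(E_new - E) / E) < 1e-6:
            E = E_new
            break
        E = E_new

    print("reconstructed moduli (Pa):", E)   # stiffer elements mimic suspicious tissue
    ```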

  12. Technical support for creating an artificial intelligence system for feature extraction and experimental design

    NASA Technical Reports Server (NTRS)

    Glick, B. J.

    1985-01-01

    Techniques for classifying objects into groups or classes go under many different names including, most commonly, cluster analysis. Mathematically, the general problem is to find a best mapping of objects into an index set consisting of class identifiers. When an a priori grouping of objects exists, the process of deriving the classification rules from samples of classified objects is known as discrimination. When such rules are applied to objects of unknown class, the process is denoted classification. The specific problem addressed involves the group classification of a set of objects that are each associated with a series of measurements (ratio, interval, ordinal, or nominal levels of measurement). Each measurement produces one variable in a multidimensional variable space. Cluster analysis techniques are reviewed and methods for including geographic location, distance measures, and spatial pattern (distribution) as parameters in clustering are examined. For the case of patterning, measures of spatial autocorrelation are discussed in terms of the kind of data (nominal, ordinal, or interval scaled) to which they may be applied.
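
    As an illustration of a spatial autocorrelation measure suitable for interval-scaled data, the sketch below computes Moran's I for a small synthetic example. The abstract does not name a specific statistic, so the choice of Moran's I and the neighbour weights are assumptions.

    ```python
    import numpy as np

    def morans_i(values, weights):
        """Moran's I spatial autocorrelation for a 1-D array of observations and a
        symmetric weight matrix (w[i, j] > 0 when sites i and j are neighbours)."""
        x = np.asarray(values, dtype=float)
        w = np.asarray(weights, dtype=float)
        n = x.size
        z = x - x.mean()
        numerator = np.sum(w * np.outer(z, z))      # sum_ij w_ij z_i z_j
        denominator = np.sum(z ** 2)
        return (n / w.sum()) * numerator / denominator

    # Illustrative example: four sites on a line with rook-style neighbour weights.
    vals = [1.0, 2.0, 8.0, 9.0]
    W = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)

    print(round(morans_i(vals, W), 3))   # positive value: similar values cluster together
    ```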

  13. An intercomparison of carbon monoxide, nitric oxide, and hydroxyl measurement techniques - Overview of results

    NASA Technical Reports Server (NTRS)

    Hoell, J. M.; Gregory, G. L.; Carroll, M. A.; Mcfarland, M.; Ridley, B. A.; Davis, D. D.; Bradshaw, J.; Rodgers, M. O.; Torres, A. L.; Condon, E. P.

    1984-01-01

    Results from an intercomparison of methods to measure carbon monoxide (CO), nitric oxide (NO), and the hydroxyl radical (OH) are discussed. The intercomparison was conducted at Wallops Island, Virginia, in July 1983 and included a laser differential absorption and three grab sample/gas chromatograph methods for CO, a laser-induced fluorescence (LIF) and two chemiluminescence methods for NO, and two LIF methods and a radiocarbon tracer method for OH. The intercomparison was conducted as a field measurement program involving ambient measurements of CO (150-300 ppbv) and NO (10-180 pptv) from a common manifold with controlled injection of CO in incremental steps from 20 to 500 ppbv and NO in steps from 10 to 220 pptv. Only ambient measurements of OH were made. The agreement between the techniques was on the order of 14 percent for CO and 17 percent for NO. Hardware difficulties during the OH tests resulted in a data base with insufficient data and uncertainties too large to permit a meaningful intercomparison.

  14. Red blood cell-deformability measurement: review of techniques.

    PubMed

    Musielak, M

    2009-01-01

    Cell-deformability characterization involves measurement of the highly complex relationships between cell biology and the physical forces to which the cell is subjected. The review takes account of modern technical solutions that simulate the action of the forces applied to the red blood cell in the macro- and microcirculation. Diffraction ektacytometers and rheoscopes measure the mean deformability value for the total red blood cell population investigated and the deformation distribution index of individual cells, respectively. Deformation assays of a whole single cell are possible by means of optical tweezers. The single-cell measuring setups for micropipette aspiration and atomic force microscopy allow a selective investigation of deformation parameters (e.g., cytoplasm viscosity, viscoelastic membrane properties). The distinction between instrument sensitivity to various RBC rheological features, as well as the influence of temperature on the measurement, are discussed. The reports quoted confront the fascinating possibilities of these techniques with their medical applications, since RBC deformability plays a key role in the etiology of a wide range of conditions.

  15. Measurement of magnetic field gradients using Raman spectroscopy in a fountain

    NASA Astrophysics Data System (ADS)

    Srinivasan, Arvind; Zimmermann, Matthias; Efremov, Maxim A.; Davis, Jon P.; Narducci, Frank A.

    2017-02-01

    In many experiments involving cold atoms, it is crucial to know the strength of the magnetic field and/or the magnetic field gradient at the precise location of a measurement. While auxiliary sensors can provide some of this information, the sensors are usually not perfectly co-located with the atoms and so can only provide an approximation to the magnetic field strength. In this article, we describe a technique to measure the magnetic field, based on Raman spectroscopy, using the same atomic fountain source that will be used in future magnetically sensitive measurements.

  16. On the consistency among different approaches for nuclear track scanning and data processing

    NASA Astrophysics Data System (ADS)

    Inozemtsev, K. O.; Kushin, V. V.; Kodaira, S.; Shurshakov, V. A.

    2018-04-01

    The article describes various approaches for space radiation track measurement using the CR-39™ (Tastrak) detector. The results of comparing different methods for track scanning and data processing are presented, and the basic algorithms for determination of track parameters are described. Each approach involves its own set of measured track parameters. For two of the sets, track scanning in the plane of the detector surface is sufficient (2-D measurement); the third set requires scanning in an additional projection (3-D measurement). An experimental comparison of the considered techniques was made with the use of accelerated heavy ions (Ar, Fe, and Kr).

  17. Absolute photon-flux measurements in the vacuum ultraviolet

    NASA Technical Reports Server (NTRS)

    Samson, J. A. R.; Haddad, G. N.

    1974-01-01

    Absolute photon-flux measurements in the vacuum ultraviolet have been extended to short wavelengths by use of rare-gas ionization chambers. The technique involves the measurement of the ion current as a function of the gas pressure in the ion chamber. The true value of the ion current, and hence the absolute photon flux, is obtained by extrapolating the ion current to zero gas pressure. Examples are given at 162 and 266 A. The short-wavelength limit is determined only by the sensitivity of the current-measuring apparatus and by present knowledge of the photoionization processes that occur in the rare gases.
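
    The zero-pressure extrapolation described above reduces to a simple line fit of ion current against chamber pressure, with the intercept converted to a photon flux through the elementary charge. The sketch below illustrates this with fabricated numbers and assumes unit photoionization yield (one ion per absorbed photon).

    ```python
    import numpy as np

    # Fit ion current versus pressure and extrapolate to zero pressure; the
    # intercept I0 divided by the elementary charge gives the photon flux,
    # assuming one ion is produced per absorbed photon.  Data are made up.

    e = 1.602e-19                                                 # elementary charge, C

    pressure_torr = np.array([5.0, 10.0, 20.0, 40.0, 60.0])       # chamber pressure
    ion_current_a = np.array([3.02e-11, 2.98e-11, 2.91e-11, 2.77e-11, 2.63e-11])

    slope, intercept = np.polyfit(pressure_torr, ion_current_a, 1)
    photon_flux = intercept / e                                   # photons per second

    print(f"extrapolated ion current I0 = {intercept:.2e} A")
    print(f"absolute photon flux       = {photon_flux:.2e} photons/s")
    ```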

  18. Microbial detection method based on sensing molecular hydrogen

    NASA Technical Reports Server (NTRS)

    Wilkins, J. R.; Stoner, G. E.; Boykin, E. H.

    1974-01-01

    An approach involving the measurement of hydrogen evolution by test organisms was used to detect and enumerate various members of the Enterobacteriaceae group. The experimental setup for measuring hydrogen evolution consisted of a test tube containing two electrodes plus broth and organisms. The test tube was kept in a water bath at a temperature of 35 C. It is pointed out that the hydrogen-sensing method, coupled with the pressure transducer technique reported by Wilkins (1974) could be used in various experiments in which gas production by microorganisms is being measured.

  19. Studies of Particle Sedimentation by Novel Scattering Techniques

    NASA Technical Reports Server (NTRS)

    Tong, Penger

    2000-01-01

    The four-year grant began May 1, 1996 (5-1-96 to 4-30-00, $100,000/year). We have finished 4 major research projects and published 10 papers during this grant period. An important aspect of this research has been the education of students at graduate and undergraduate levels. They have been fully involved in the research described below: 1. Polymer-induced depletion interaction in colloid-polymer mixtures. 2. Colloidal sedimentation in polymer solutions. 3. Velocity fluctuations in particle sedimentation. 4. New laser light scattering techniques for velocity difference measurements.

  20. Fused Deposition Technique for Continuous Fiber Reinforced Thermoplastic

    NASA Astrophysics Data System (ADS)

    Bettini, Paolo; Alitta, Gianluca; Sala, Giuseppe; Di Landro, Luca

    2017-02-01

    A simple technique for the production of continuous fiber reinforced thermoplastic by fused deposition modeling, which involves a common 3D printer with quite limited modifications, is presented. An adequate setting of processing parameters and deposition path yields components with substantially enhanced mechanical characteristics compared to conventional 3D printed items. The most relevant problems related to the simultaneous feeding of fibers and polymer are discussed. The properties of the obtained aramid fiber reinforced polylactic acid (PLA), in terms of impregnation quality and mechanical response, are measured.

  1. The emergence of optical elastography in biomedicine

    NASA Astrophysics Data System (ADS)

    Kennedy, Brendan F.; Wijesinghe, Philip; Sampson, David D.

    2017-04-01

    Optical elastography, the use of optics to characterize and map the mechanical properties of biological tissue, involves measuring the deformation of tissue in response to a load. Such measurements may be used to form an image of a mechanical property, often elastic modulus, with the resulting mechanical contrast complementary to the more familiar optical contrast. Optical elastography is experiencing new impetus in response to developments in the closely related fields of cell mechanics and medical imaging, aided by advances in photonics technology, and through probing the microscale between that of cells and whole tissues. Two techniques -- optical coherence elastography and Brillouin microscopy -- have recently shown particular promise for medical applications, such as in ophthalmology and oncology, and as new techniques in cell mechanics.

  2. Computation of transonic flow past projectiles at angle of attack

    NASA Technical Reports Server (NTRS)

    Reklis, R. P.; Sturek, W. B.; Bailey, F. R.

    1978-01-01

    Aerodynamic properties of artillery shell such as normal force and pitching moment reach peak values in a narrow transonic Mach number range. In order to compute these quantities, numerical techniques have been developed to obtain solutions to the three-dimensional transonic small disturbance equation about slender bodies at angle of attack. The computation is based on a plane relaxation technique involving Fourier transforms to partially decouple the three-dimensional difference equations. Particular care is taken to assure accurate solutions near corners found in shell designs. Computed surface pressures are compared to experimental measurements for circular arc and cone cylinder bodies which have been selected as test cases. Computed pitching moments are compared to range measurements for a typical projectile shape.

  3. Commissioning and First Results of the Electron Beam Profiler in the Main Injector at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurman-Keup, R.; Alvarez, M.; Fitzgerald, J.

    2017-08-01

    The planned neutrino program at Fermilab requires large proton beam intensities in excess of 2 MW. Measuring the transverse profiles of these high intensity beams is challenging and often depends on non-invasive techniques. One such technique involves measuring the deflection of a probe beam of electrons with a trajectory perpendicular to the proton beam. A device such as this is already in use at the Spallation Neutron Source at ORNL and a similar device has been installed in the Main Injector at Fermilab. Commissioning of the device is in progress with the goal of having it operational by the end of the year. The status of the commissioning and initial results will be presented.

  4. Development of a morphology-based modeling technique for tracking solid-body displacements: examining the reliability of a potential MRI-only approach for joint kinematics assessment.

    PubMed

    Mahato, Niladri K; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian

    2016-05-18

    Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three-dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with a T1 and a fast contrast-enhanced pulse sequence. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences. Specifically, the average CVs for translation were 4.31 % and 5.26 % for the two pulse sequences, respectively, while the ICCs were 0.99 for both. For rotation measures, the CVs were 3.19 % and 2.44 % for the two pulse sequences, with ICCs of 0.98 and 0.97, respectively. A novel biplanar imaging approach also yielded high reliability, with mean CVs of 2.66 % and 3.39 % for translation in the x- and z-planes, respectively, and ICCs of 0.97 in both planes. This work provides basic proof-of-concept for a reliable, marker-less, non-ionizing-radiation-based quasi-dynamic motion quantification technique that can potentially be developed into a tool for real-time joint kinematics analysis.
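
    The reliability statistics named above (CV, ICC, and Bland-Altman limits of agreement) can be reproduced with a few lines of code. The sketch below uses fabricated two-session data and the two-way consistency form ICC(3,1); the exact ICC variant used by the authors is not stated in the abstract, so that choice is an assumption.

    ```python
    import numpy as np

    # Between-session reliability sketch: coefficient of variation, ICC(3,1) from
    # two-way ANOVA mean squares, and Bland-Altman 95 % limits of agreement.
    # The displacement values are invented for illustration.

    session1 = np.array([2.10, 3.95, 6.02, 8.11, 10.05])   # e.g. translations, mm
    session2 = np.array([2.21, 3.90, 6.10, 7.95, 10.20])
    data = np.column_stack([session1, session2])           # subjects x sessions
    n, k = data.shape

    # Mean coefficient of variation across subjects (percent)
    cv = np.mean(data.std(axis=1, ddof=1) / data.mean(axis=1)) * 100

    # Two-way ANOVA mean squares for the consistency ICC(3,1)
    grand = data.mean()
    ss_total = np.sum((data - grand) ** 2)
    ss_rows = k * np.sum((data.mean(axis=1) - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((data.mean(axis=0) - grand) ** 2)   # between sessions
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    icc = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

    # Bland-Altman limits of agreement between the two sessions
    diff = session1 - session2
    loa = (diff.mean() - 1.96 * diff.std(ddof=1), diff.mean() + 1.96 * diff.std(ddof=1))

    print(f"mean CV  = {cv:.2f} %")
    print(f"ICC(3,1) = {icc:.3f}")
    print(f"95% limits of agreement: {loa[0]:.3f} to {loa[1]:.3f}")
    ```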

  5. Measuring contemporary crustal motions; NASA’s Crustal Dynamics Project

    USGS Publications Warehouse

    Frey, H. V.; Bosworth, J. M.

    1988-01-01

    In this article we describe briefly the two space geodetic techniques and how they are used by the Crustal Dynamics Project, show some of the very exciting results that have emerged at the halfway point in the project's life, describe the availability and utilization of the data being collected, and consider what the future may hold when measurement accuracies eventually exceed even those now available and when other international groups become more heavily involved.   

  6. Apparatus Translates Crossed-Laser-Beam Probe Volume

    NASA Technical Reports Server (NTRS)

    Herring, Gregory C.; South, Bruce W.; Exton, Reginald J.

    1994-01-01

    Optomechanical apparatus translates probe volume of crossed-beam laser velocimeter or similar instrument while maintaining optical alignment of beams. Measures velocity, pressure, and temperature of flowing gas at several locations. Repeated tedious realignments no longer necessary. Designed to accommodate stimulated-Raman-gain spectrometer for noninvasive measurement of local conditions in flowing gas in supersonic wind tunnel. Applicable to other techniques like coherent anti-Stokes Raman spectroscopy involving use of laser beams crossed at small angles (10 degrees or less).

  7. Real-time three-dimensional digital image correlation for biomedical applications

    NASA Astrophysics Data System (ADS)

    Wu, Rong; Wu, Hua; Arola, Dwayne; Zhang, Dongsheng

    2016-10-01

    Digital image correlation (DIC) has been successfully applied for evaluating the mechanical behavior of biological tissues. A three-dimensional (3-D) DIC system has been developed and applied to examining the motion of bones in the human foot. To achieve accurate, real-time displacement measurements, an algorithm including matching between sequential images and image pairs has been developed. The system was used to monitor the movement of markers attached to a precisely motorized stage. The accuracy of the proposed technique for in-plane and out-of-plane measurements was found to be -0.25% and 1.17%, respectively. Two biomedical applications are presented. In the experiment involving the foot arch, a human cadaver lower leg and foot specimen was subjected to vertical compressive loads up to 700 N at a rate of 10 N/s and the 3-D motions of bones in the foot were monitored in real time. In the experiment involving the distal tibiofibular syndesmosis, a human cadaver lower leg and foot specimen was subjected to a monotonic rotational torque up to 5 Nm at a speed of 5 deg per min and the relative displacements of the tibia and fibula were monitored in real time. Results showed that the system could reach a frequency of up to 16 Hz with 6 points measured simultaneously. This technique sheds new light on measuring the 3-D motion of bones in biomechanical studies.

  8. Miscellaneous methods for measuring matric or water potential

    USGS Publications Warehouse

    Scanlon, Bridget R.; Andraski, Brian J.; Bilskie, Jim; Dane, Jacob H.; Topp, G. Clarke

    2002-01-01

    A variety of techniques to measure matric potential or water potential in the laboratory and in the field are described in this section. The techniques described herein require equilibration of some medium whose matric or water potential can be determined from previous calibration or can be measured directly. Under equilibrium conditions the matric or water potential of the medium is equal to that of the soil. The techniques can be divided into: (i) those that measure matric potential and (ii) those that measure water potential (sum of matric and osmotic potentials). Matric potential is determined when the sensor matrix is in direct contact with the soil, so salts are free to diffuse in or out of the sensor matrix, and the equilibrium measurement therefore reflects matric forces acting on the water. Water potential is determined when the sensor is separated from the soil by a vapor gap, so salts are not free to move in or out of the sensor, and the equilibrium measurement reflects the sum of the matric and osmotic forces acting on the water. Seven different techniques are described in this section. Those that measure matric potential include (i) heat dissipation sensors, (ii) electrical resistance sensors, (iii) frequency domain and time domain sensors, and (iv) electro-optical switches. A method that can be used to measure matric potential or water potential is the (v) filter paper method. Techniques that measure water potential include (vi) the Dew Point Potentiameter (Decagon Devices, Inc., Pullman, WA) (water activity meter) and (vii) vapor equilibration. The first four techniques are electronically based methods for measuring matric potential. Heat dissipation sensors and electrical resistance sensors infer matric potential from previously determined calibration relations between sensor heat dissipation or electrical resistance and matric potential. Frequency-domain and time-domain matric potential sensors measure water content, which is related to matric potential of the sensor through calibration. Electro-optical switches measure changes in light transmission through thin nylon filters as they absorb or desorb water in response to changes in matric potential. Heat dissipation sensors and electrical resistance sensors are used primarily in the field to provide information on matric potential. Frequency-domain matric potential sensors are new and have not been widely used. Time-domain matric potential sensors and electro-optical switches are new and have not been commercialized. For the fifth technique, filter paper is used as the standard matrix. The filter paper technique measures matric potential when the filter paper is in direct contact with soil, or water potential when separated from soil by a vapor gap. The Dew Point Potentiameter calculates water potential from the measured dew point and sample temperature. The vapor equilibration technique involves equilibration of soil samples with salt solutions of known osmotic potential. The filter paper, Dew Point Potentiameter, and vapor equilibration techniques are generally used in the laboratory to measure water potential of disturbed field samples or to measure water potential for water retention functions.

  9. Development of a Two-Photon Pump, Polarization Spectroscopy Probe Technique (TPP-PSP) for Measurements of Atomic Hydrogen.

    NASA Astrophysics Data System (ADS)

    Satija, Aman; Lucht, Robert P.

    2015-06-01

    Atomic hydrogen (H) is a key radical in combustion and plasmas. Accurate knowledge of its concentration can be used to better understand transient phenomena such as ignition and extinction in combustion environments. Laser-induced polarization spectroscopy is a spatially resolved absorption technique which we have adapted for quantitative measurements of the H atom. This adaptation is called the two-photon pump, polarization spectroscopy probe (TPP-PSP) technique, and it has been implemented using two different laser excitation schemes. The first scheme involves the two-photon excitation of 1S-2S transitions using a linearly polarized 243-nm beam. An anisotropy is created amongst Zeeman states in 2S-3P levels using a circularly polarized 656-nm pump beam. This anisotropy rotates the polarization of a weak, linearly polarized probe beam at 656 nm. As a result, the weak probe beam "leaks" past an analyzer in the detection channel and is measured using a PMT. This signal can be related to the H atom density in the probe volume. The laser beams were created by optical parametric generation followed by multiple pulse dye amplification stages. This resulted in narrow-linewidth beams which could be scanned in the frequency domain and varied in energy. This allowed us to systematically investigate saturation and the Stark effect in 2S-3P transitions with the goal of developing a quantitative H atom measurement technique. The second scheme involves the two-photon excitation of 1S-2S transitions using a linearly polarized 243-nm beam. An anisotropy is created amongst Zeeman states in 2S-4P transitions using a circularly polarized 486-nm pump beam. This anisotropy rotates the polarization of a weak, linearly polarized probe beam at 486 nm. As a result, the weak probe beam "leaks" past an analyzer in the detection channel and is measured using a PMT. This signal can be related to the H atom density in the probe volume. A dye laser was pumped by the third harmonic of a Nd:YAG laser to create a laser beam at 486 nm. The 486-nm beam was frequency doubled to a 243-nm beam. Use of the second scheme simplifies the TPP-PSP technique, making it more convenient for diagnostics in practical systems.

  10. Combining Temporal and Spectral Information with Spatial Mapping to Identify Differences between Phonological and Semantic Networks: A Magnetoencephalographic Approach.

    PubMed

    McNab, Fiona; Hillebrand, Arjan; Swithenby, Stephen J; Rippon, Gina

    2012-01-01

    Early, lesion-based models of language processing suggested that semantic and phonological processes are associated with distinct temporal and parietal regions respectively, with frontal areas more indirectly involved. Contemporary spatial brain mapping techniques have not supported such clear-cut segregation, with strong evidence of activation in left temporal areas by both processes and disputed evidence of involvement of frontal areas in both processes. We suggest that combining spatial information with temporal and spectral data may allow a closer scrutiny of the differential involvement of closely overlapping cortical areas in language processing. Using beamforming techniques to analyze magnetoencephalography data, we localized the neuronal substrates underlying primed responses to nouns requiring either phonological or semantic processing, and examined the associated measures of time and frequency in those areas where activation was common to both tasks. Power changes in the beta (14-30 Hz) and gamma (30-50 Hz) frequency bands were analyzed in pre-selected time windows of 350-550 and 500-700 ms. In left temporal regions, both tasks elicited power changes in the same time window (350-550 ms), but with different spectral characteristics, low beta (14-20 Hz) for the phonological task and high beta (20-30 Hz) for the semantic task. In frontal areas (BA10), both tasks elicited power changes in the gamma band (30-50 Hz), but in different time windows, 500-700 ms for the phonological task and 350-550 ms for the semantic task. In the left inferior parietal area (BA40), both tasks elicited changes in the 20-30 Hz beta frequency band but in different time windows, 350-550 ms for the phonological task and 500-700 ms for the semantic task. Our findings suggest that, where spatial measures may indicate overlapping areas of involvement, additional beamforming techniques can demonstrate differential activation in time and frequency domains.

  11. A new approach to measuring tortuosity

    NASA Astrophysics Data System (ADS)

    Wert, Amanda; Scott, Sherry E.

    2012-03-01

    The detection and measurement of the tortuosity - i.e. the bending and winding - of vessels has been shown to be potentially useful in the assessment of cancer progression and treatment response. Although several metrics for tortuosity are used, no single measure is able to capture all types of tortuosity. This report presents a new multiscale technique for measuring vessel tortuosity. The approach is based on a method - called the ergodicity defect - which gives a scale-dependent measure of deviation from ergodicity. Ergodicity is a concept that captures the manner in which trajectories or signals sample the space; thus, ergodicity and vessel tortuosity both involve the notion of how a signal samples space. Here we begin to explore this connection. We first apply the ergodicity defect tortuosity measure to both 2D and 3D synthetic data in order to demonstrate the response of the method to three types of tortuosity observed in clinical patterns. We then implement the technique on segmented vessels extracted from brain tumor MRA images. Results indicate that the method can be effectively used to detect and measure several types of vessel tortuosity.

  12. Novel Technique for Making Measurements of SO2 with a Standalone Sonde

    NASA Astrophysics Data System (ADS)

    Flynn, J. H., III; Morris, G. A.; Kotsakis, A.; Alvarez, S. L.

    2017-12-01

    A novel technique has been developed to measure SO2 using the existing electrochemical concentration cell (ECC) ozonesonde technology. An interference in the ozone measurement occurs when SO2 is introduced to the iodide redox reaction, causing the signal to decrease and go to zero when [O3] < [SO2]. The original method of measuring SO2 with ozonesondes involves launching two ozonesondes together, with one ozonesonde unmodified and one with an SO2 filter [Morris et al., 2010]. By taking the difference between these profiles, the SO2 profile could be determined as long as [O3] > [SO2]. A new method allows for making a direct measurement of SO2 without the need for the dual payload by modifying the existing design. The ultimate goal is to be able to measure SO2 vertical profiles in the atmosphere, such as in plumes from anthropogenic or natural sources (e.g., volcanic eruptions). The benefits of an SO2 sonde include the ability to make measurements where aircraft cannot safely fly, such as in volcanic plumes, and to provide validation of SO2 columns from satellites.
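
    The dual-payload method summarized above amounts to differencing the two sonde profiles. The sketch below illustrates this with fabricated profiles, assuming the SO2 interference removes signal roughly one-for-one so that the filtered sonde reads O3 and the unmodified sonde reads approximately O3 minus SO2.

    ```python
    import numpy as np

    # Dual-sonde differencing sketch: the SO2-filtered sonde approximates the true
    # ozone profile, the unmodified sonde reads roughly O3 - SO2, so their
    # difference approximates SO2 wherever [O3] > [SO2].  Profiles are fabricated.

    altitude_km = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
    o3_filtered_ppbv = np.array([42.0, 45.0, 48.0, 50.0, 52.0])    # SO2-filtered sonde
    o3_unfiltered_ppbv = np.array([30.0, 38.0, 46.0, 50.0, 52.0])  # unmodified sonde

    so2_ppbv = np.clip(o3_filtered_ppbv - o3_unfiltered_ppbv, 0.0, None)
    for z, s in zip(altitude_km, so2_ppbv):
        print(f"{z:4.1f} km  SO2 ~ {s:4.1f} ppbv")
    ```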

  13. STORM-SEWER FLOW MEASUREMENT AND RECORDING SYSTEM.

    USGS Publications Warehouse

    Kilpatrick, Frederick A.; Kaehrle, William R.

    1986-01-01

    A comprehensive study and development of instruments and techniques for measuring all components of flow in a storm-sewer drainage system were undertaken by the U. S. Geological Survey under the sponsorship of FHWA. The study involved laboratory and field calibration and testing of measuring flumes, pipe insert meters, weirs, and electromagnetic velocity meters as well as the development and calibration of pneumatic bubbler and pressure transducer head-measuring systems. Tracer dilution and acoustic-flowmeter measurements were used in field verification tests. A single micrologger was used to record data from all the instruments and also to activate on command the electromagnetic velocity meter and tracer dilution systems.

  14. A novel, eco-friendly technique for covalent functionalization of graphene nanoplatelets and the potential of their nanofluids for heat transfer applications

    NASA Astrophysics Data System (ADS)

    Sadri, Rad; Hosseini, Maryam; Kazi, S. N.; Bagheri, Samira; Zubir, Nashrul; Ahmadi, Goodarz; Dahari, Mahidzal; Zaharinie, Tuan

    2017-05-01

    In this study, a facile and eco-friendly covalent functionalization technique is developed to synthesize highly stable graphene nanoplatelets (GNPs) in aqueous media. This technique involves free radical grafting of gallic acid onto the surface of GNPs rather than corrosive inorganic acids. Raman spectroscopy, X-ray photoelectron spectroscopy and transmission electron microscopy are used to confirm the covalent functionalization of GNPs with gallic acid (GAGNPs). The solubility of the GAGNPs in aqueous media is verified using zeta potential and UV-vis spectra measurements. The nanofluid shows significant improvement in thermo-physical properties, indicating its superb potential for various thermal applications.

  15. A technology roadmap of smart biosensors from conventional glucose monitoring systems.

    PubMed

    Shende, Pravin; Sahu, Pratiksha; Gaud, Ram

    2017-06-01

    The objective of this review article is to trace the technology roadmap from conventional glucose monitoring systems to smart biosensors. The estimation of glucose with commercially available devices involves analysis of blood samples obtained by pricking the finger or extracting blood from the forearm. Since pain and discomfort are associated with invasive methods, non-invasive measurement techniques have been investigated. The non-invasive methods avoid exposure to sharp objects such as needles and syringes, which increases testing frequency, improves control of glucose concentration, and eliminates pain and biohazardous materials. This review describes recent invasive techniques and the major non-invasive techniques, viz. biosensors, optical techniques, and sensor-embedded contact lenses for glucose estimation.

  16. Slow neutron mapping technique for level interface measurement

    NASA Astrophysics Data System (ADS)

    Zain, R. M.; Ithnin, H.; Razali, A. M.; Yusof, N. H. M.; Mustapha, I.; Yahya, R.; Othman, N.; Rahman, M. F. A.

    2017-01-01

    Modern industrial plant operations often require accurate level measurement of process liquids in production and storage vessels. A variety of advanced level indicators are commercially available to meet this demand, but they may not suit the specific needs of every situation. The neutron backscatter technique is exceptionally useful for occasional and routine determinations, particularly in situations such as pressure vessels with wall thicknesses up to 10 cm, toxic and corrosive chemicals in sealed containers, and liquefied petroleum gas storage vessels. In level measurement, high-energy neutrons from a 241Am-Be radioactive source are beamed onto a vessel. The fast neutrons are slowed down mostly by collisions with hydrogen atoms of the material inside the vessel, and part of the resulting thermal neutrons are bounced back towards the source. By placing a thermal neutron detector next to the source, these backscattered neutrons can be measured. The number of backscattered neutrons is directly proportional to the concentration of hydrogen atoms in front of the detector. As the source and detector are moved along the side of the vessel, interfaces can be determined as long as they involve a change in hydrogen atom concentration. This paper presents the slow neutron mapping technique used to indicate the level interface of a test vessel.
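
    One simple way to turn a backscatter scan into a level reading, not described in the abstract but consistent with the principle above, is to locate the height at which the count profile changes most steeply. The sketch below does this for a fabricated profile.

    ```python
    import numpy as np

    # Locate a liquid level from a neutron-backscatter scan: counts are high where
    # hydrogen-rich liquid sits behind the wall and drop in the vapour space, so the
    # interface is taken where the count profile changes fastest.  Data are fabricated.

    height_cm = np.arange(0, 200, 10)
    counts = np.where(height_cm < 120, 950, 210) + np.random.randint(-20, 20, height_cm.size)

    interface_index = np.argmax(np.abs(np.diff(counts.astype(float))))
    interface_cm = 0.5 * (height_cm[interface_index] + height_cm[interface_index + 1])
    print(f"estimated liquid level ~ {interface_cm:.0f} cm")
    ```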

  17. Reconsideration of dynamic force spectroscopy analysis of streptavidin-biotin interactions.

    PubMed

    Taninaka, Atsushi; Takeuchi, Osamu; Shigekawa, Hidemi

    2010-05-13

    To understand and design molecular functions on the basis of molecular recognition processes, the microscopic probing of the energy landscapes of individual interactions in a molecular complex and their dependence on the surrounding conditions is of great importance. Dynamic force spectroscopy (DFS) is a technique that enables us to study the interaction between molecules at the single-molecule level. However, the results obtained have differed among previous studies, which is thought to be caused by differences in the measurement conditions. We have developed an atomic force microscopy technique that enables the precise analysis of molecular interactions on the basis of DFS. After verifying the performance of this technique, we carried out measurements to determine the landscapes of streptavidin-biotin interactions. The obtained results showed good agreement with theoretical predictions. Lifetimes were also well analyzed. Using a combination of cross-linkers and the atomic force microscope that we developed, site-selective measurement was carried out, and the steps involved in bonding due to microscopic interactions are discussed using the results obtained by site-selective analysis.
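
    DFS data of this kind are commonly interpreted with the Bell-Evans model, in which the most probable rupture force grows logarithmically with the loading rate. The abstract does not state which model the authors used, so the sketch below, with illustrative parameter values rather than the streptavidin-biotin values from the study, is only a generic example of that analysis.

    ```python
    import numpy as np

    # Bell-Evans sketch: F* = (kB*T / x_beta) * ln( r * x_beta / (k_off * kB*T) ),
    # where r is the loading rate.  Fitting F* versus ln(r) yields the barrier
    # position x_beta and the zero-force off-rate k_off.  Parameters are illustrative.

    kB_T = 4.11e-21            # thermal energy at ~298 K, J
    x_beta = 0.5e-9            # assumed barrier distance, m
    k_off = 1.0e-3             # assumed zero-force dissociation rate, 1/s

    loading_rates_pn = np.logspace(2, 5, 7)     # pN/s, typical AFM range
    loading_rates_si = loading_rates_pn * 1e-12 # N/s

    f_star = (kB_T / x_beta) * np.log(loading_rates_si * x_beta / (k_off * kB_T))
    for r, f in zip(loading_rates_pn, f_star):
        print(f"loading rate {r:8.0f} pN/s -> most probable rupture force {f*1e12:6.1f} pN")
    ```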

  18. New optoelectronic methodology for nondestructive evaluation of MEMS at the wafer level

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Ferguson, Curtis F.; Melson, Michael J.

    2004-02-01

    One of the approaches to fabrication of MEMS involves surface micromachining to define dies on single crystal silicon wafers, dicing of the wafers to separate the dies, and electronic packaging of the individual dies. Dicing and packaging of MEMS accounts for a large fraction of the fabrication costs, therefore, nondestructive evaluation at the wafer level, before dicing, can have significant implications on improving production yield and costs. In this paper, advances in development of optoelectronic holography (OEH) techniques for nondestructive, noninvasive, full-field of view evaluation of MEMS at the wafer level are described. With OEH techniques, quantitative measurements of shape and deformation of MEMS, as related to their performance and integrity, are obtained with sub-micrometer spatial resolution and nanometer measuring accuracy. To inspect an entire wafer with OEH methodologies, measurements of overlapping regions of interest (ROI) on a wafer are recorded and adjacent ROIs are stitched together through efficient 3D correlation analysis algorithms. Capabilities of the OEH techniques are illustrated with representative applications, including determination of optimal inspection conditions to minimize inspection time while achieving sufficient levels of accuracy and resolution.

  19. Electroencephalogram (EEG) and Magnetoencephalogram (MEG) as Tools for Evaluation of Cognitive Function

    NASA Technical Reports Server (NTRS)

    Fender, Derek H.; Hestenes, John D.

    1985-01-01

    We have developed computerized analysis and display techniques to help identify the origins of visually evoked scalp potentials (VESP). The potentials are recorded simultaneously from many electrodes (usually 40 to 48) spaced over the region of the scalp where appreciable evoked potentials are found in response to a particular stimulus. Contour mapping algorithms are then used to display the time behavior of equipotential surfaces on the scalp during the VESP. We then use an optimization technique to select the parameters of arrays of current dipole sources within the model until the model equipotential field distribution closely fits the measured data. Computer graphics are then used to display, as a movie, the actual and model scalp potential fields and the parameters of the dipole generators within the model head during the course of VESP activity. We have devised reaction time tests that involve potentially separable stages of cognitive processing and utilize stimuli that produce measurable cognition-related features in the late component of the evoked potential. We have used these techniques to determine the loci in the brain where known cognition-related features in the evoked potential are generated, and we have explored the extent to which each of these features can be related to the reaction time tasks. We have also examined the temporal-spatial aspects of their cerebral involvement. Our current work is planned to characterize the age-related changes in the processes performed by such sources. We also use a neuromagnetometer to measure the evoked magnetic fields in similar circumstances; we will discuss the relative merits of the two methodologies.

  20. A balloon system for profiling smoke plumes from forest fires

    Treesearch

    Paul W. Ryan; Charles D. Tangren; Charles K. McMahon

    1979-01-01

    This paper is directed to those interested in techniques for measuring emission rates and emission factors for forest fires and other open combustion sources. A source-sampling procedure that involved the use of a vertical array of lightweight, battery-operated instruments suspended from a helium-filled aerodynamic balloon is described. In this procedure, plume...

  1. Developing Management Techniques For Black Walnut to Stabilize the Annual Nut Supply

    Treesearch

    Felix Ponder, Jr.; James E. Jones; Rita Mueller

    2001-01-01

    Two studies involving cultural methods to increase nut production of plantation black walnut are presented. In the first study, nut production was measured for 5 years to determine the effect of nitrogen (N) and potassium (K) fertilization separately, in combination, and with and without phosphorus (P) broadcast annually for 4 years at two rates. Fertilization...

  2. Measuring the Effectiveness of Visual Analytics and Data Fusion Techniques on Situation Awareness in Cyber-Security

    ERIC Educational Resources Information Center

    Giacobe, Nicklaus A.

    2013-01-01

    Cyber-security involves the monitoring of a complex network of inter-related computers to prevent, identify, and remediate undesired actions. This work is performed in organizations by human analysts. These analysts monitor cyber-security sensors to develop and maintain situation awareness (SA) of both normal and abnormal activities that occur on…

  3. Extracting the redox orbitals in Li battery materials with high-resolution x-ray compton scattering spectroscopy.

    PubMed

    Suzuki, K; Barbiellini, B; Orikasa, Y; Go, N; Sakurai, H; Kaprzyk, S; Itou, M; Yamamoto, K; Uchimoto, Y; Wang, Yung Jui; Hafiz, H; Bansil, A; Sakurai, Y

    2015-02-27

    We present an incisive spectroscopic technique for directly probing redox orbitals based on bulk electron momentum density measurements via high-resolution x-ray Compton scattering. Application of our method to spinel Li_{x}Mn_{2}O_{4}, a lithium ion battery cathode material, is discussed. The orbital involved in the lithium insertion and extraction process is shown to mainly be the oxygen 2p orbital. Moreover, the manganese 3d states are shown to experience spatial delocalization involving 0.16±0.05 electrons per Mn site during the battery operation. Our analysis provides a clear understanding of the fundamental redox process involved in the working of a lithium ion battery.

  4. Evaluating the methods used for measuring cerebral blood flow at rest and during exercise in humans.

    PubMed

    Tymko, Michael M; Ainslie, Philip N; Smith, Kurt J

    2018-05-16

    The first accounts of measuring cerebral blood flow (CBF) in humans were made by Angelo Mosso in ~1880, who recorded brain pulsations in patients with skull defects. In 1890, Charles Roy and Charles Sherrington determined in animals that brain pulsations, assessed via a method similar to Mosso's, were altered during a variety of stimuli including sensory nerve stimulation, asphyxia, and pharmacological interventions. Between 1880 and 1944, measurements of CBF in humans typically relied on skull abnormalities. Thereafter, Kety and Schmidt introduced a new methodological approach in 1945 that involved nitrous oxide dilution combined with serial arterial and jugular venous blood sampling. Less than a decade later (1950's), several research groups employed the Kety-Schmidt technique to assess the effects of exercise on global CBF and metabolism; these studies demonstrated an uncoupling of CBF and metabolism during exercise, which was contrary to early hypotheses. However, there were several limitations to this technique related to low temporal resolution and the inability to measure regional CBF. These limitations were overcome in the 1960's when transcranial Doppler ultrasound (TCD) was developed as a method to measure beat-by-beat cerebral blood velocity. Between 1990 and 2010, TCD further progressed our understanding of CBF regulation and allowed for insight into other mechanistic factors, independent of local metabolism, involved in regulating CBF during exercise. Recently, it was discovered that TCD may not be accurate under several physiological conditions. Other measures of indexing CBF, such as Duplex ultrasound and magnetic resonance imaging, although not without some limitations, may be more applicable for future investigations.

  5. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There are a large number of such procedures, involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very small amounts (only a few microliters) of organic solvents, or none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, with no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autonóma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  6. The development of experimental techniques for the study of helicopter rotor noise

    NASA Technical Reports Server (NTRS)

    Widnall, S. E.; Harris, W. L.; Lee, Y. C. A.; Drees, H. M.

    1974-01-01

    The features of existing wind tunnels involved in noise studies are discussed. The acoustic characteristics of the MIT low noise open jet wind tunnel are obtained by employing calibration techniques: one technique is to measure the decay of sound pressure with distance in the far field; the other technique is to utilize a speaker, which was calibrated, as a sound source. The sound pressure level versus frequency was obtained in the wind tunnel chamber and compared with the corresponding calibrated values. Fiberglas board-block units were installed on the chamber interior. The free field was increased significantly after this treatment and the chamber cut-off frequency was reduced to 160 Hz from the original designed 250 Hz. The flow field characteristics of the rotor-tunnel configuration were studied by using flow visualization techniques. The influence of open-jet shear layer on the sound transmission was studied by using an Aeolian tone as the sound source. A dynamometer system was designed to measure the steady and low harmonics of the rotor thrust. A theoretical Mach number scaling formula was developed to scale the rotational noise and blade slap noise data of model rotors to full scale helicopter rotors.

  7. Application of thin layer activation technique for monitoring corrosion of carbon steel in hydrocarbon processing environment.

    PubMed

    Saxena, R C; Biswal, Jayashree; Pant, H J; Samantray, J S; Sharma, S C; Gupta, A K; Ray, S S

    2018-05-01

    Acidic crude oil transportation and processing in petroleum refining and petrochemical operations cause corrosion in the pipelines and associated components. Corrosion monitoring is invariably required to test and prove operational reliability. Thin Layer Activation (TLA) is a nuclear technique used for measurement of corrosion and erosion of materials. The technique involves irradiation of the material with a high-energy ion beam from an accelerator and measurement of the loss of radioactivity after the material is subjected to a corrosive environment. In the present study, the TLA technique has been used to monitor corrosion of carbon steel (CS) in a crude oil environment at high temperature. Different CS coupons were irradiated with a 13 MeV proton beam to produce the Cobalt-56 radioisotope on the surface of the coupons. The corrosion studies were carried out by subjecting the irradiated coupons to a corrosive environment, i.e., uninhibited straight run gas oil (SRGO) containing a known amount of naphthenic acid (NA) at high temperature. The effects of different parameters, such as concentration of NA, temperature, and fluid velocity (rpm), on the corrosion behaviour of CS were studied. Copyright © 2018 Elsevier Ltd. All rights reserved.
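
    A typical TLA data reduction, sketched below with fabricated numbers, first removes the radioactive decay of the 56Co layer and then converts the remaining fractional activity loss to a thickness loss through a calibration curve. The linear activity-depth calibration assumed here is purely illustrative; in practice the calibration follows the measured depth profile of the activated layer.

    ```python
    import numpy as np

    # Decay-correct the measured 56Co count rate, then convert the remaining
    # fractional activity loss into a thickness loss via an assumed linear
    # activity-depth calibration.  All input values are illustrative.

    half_life_days = 77.2                    # 56Co half-life
    lam = np.log(2) / half_life_days

    activity_initial = 1.00e4                # counts/s at the start of exposure
    activity_measured = 8.10e3               # counts/s after the exposure
    elapsed_days = 14.0

    # Remove the purely radioactive decay from the measured value
    activity_decay_corrected = activity_measured * np.exp(lam * elapsed_days)
    fraction_removed = 1.0 - activity_decay_corrected / activity_initial

    # Assumed calibration: activity falls by 1 % per micrometre of material removed
    calibration_um_per_percent = 1.0
    thickness_loss_um = fraction_removed * 100.0 * calibration_um_per_percent
    corrosion_rate_um_per_year = thickness_loss_um / elapsed_days * 365.25

    print(f"thickness loss ~ {thickness_loss_um:.1f} um, "
          f"corrosion rate ~ {corrosion_rate_um_per_year:.0f} um/yr")
    ```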

  8. Use of the single-breath method of estimating cardiac output during exercise-stress testing.

    NASA Technical Reports Server (NTRS)

    Buderer, M. C.; Rummel, J. A.; Sawin, C. F.; Mauldin, D. G.

    1973-01-01

    The single-breath cardiac output measurement technique of Kim et al. (1966) has been modified for use in obtaining cardiac output measurements during exercise-stress tests on Apollo astronauts. The modifications involve the use of a respiratory mass spectrometer for data acquisition and a digital computer program for data analysis. The variation of the modified method for triplicate steady-state cardiac output measurements was plus or minus 1 liter/min. The combined physiological and methodological variation seen during a set of three exercise tests on a series of subjects was 1 to 2.5 liter/min. Comparison of the modified method with the direct Fick technique showed that although the single-breath values were consistently low, the scatter of data was small and the correlation between the two methods was high. Possible reasons for the low single-breath cardiac output values are discussed.

  9. Atmospheric Backscatter Model Development for CO2 Wavelengths

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Kent, G.; Yue, G. K.

    1982-01-01

    The results of investigations into the problems of modeling atmospheric backscatter from aerosols, in the lowest 20 km of the atmosphere, at CO2 wavelengths are presented, along with a summary of the relevant aerosol characteristics and their variability, and a discussion of the measurement techniques and errors involved. The different methods of calculating the aerosol backscattering function, both from measured aerosol characteristics and from optical measurements made at other wavelengths, are discussed in detail, and limits are placed on the accuracy of these methods. The effects of changing atmospheric humidity and temperature on the backscatter are analyzed and related to the actual atmosphere. Finally, the results of modeling CO2 backscatter in the atmosphere are presented and the variation with height and geographic location discussed, and limits placed on the magnitude of the backscattering function. Conclusions regarding modeling techniques and modeled atmospheric backscatter values are presented in tabular form.

  10. Characterization of urania vaporization with transpiration coupled thermogravimetry

    DOE PAGES

    McMurray, J. W.

    2015-12-05

    Determining equilibrium vapor pressures of materials is made easier by transpiration measurements. However, the traditional technique involves condensing the volatiles entrained in a carrier gas outside of the hot measurement zone. One potential problem is deposition en route to a cooled collector. Thermogravimetric analysis (TGA) can be used to measure in situ mass loss due to vaporization and therefore obviate the need to analyze the entire gas train due to premature plating of vapor species. Therefore, a transpiration coupled TGA technique was used to determine equilibrium pressures of UO3 gas over fluorite structure UO2+x and U3O8 at T = (1573 and 1773) K. Moreover, we compared the results to calculations from models and databases in the open literature. Our study gives clarity to the thermochemical data for UO3 gas and validates the mass loss transpiration method using thermogravimetry for determining equilibrium vapor pressures of non-stoichiometric oxides.
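
    In a transpiration measurement the vapor pressure follows from the moles of vapor carried off per mole of carrier gas, with the TGA supplying the mass loss directly. A minimal sketch assuming ideal-gas behaviour, a single vapor species (UO3, molar mass about 286 g/mol), and carrier-gas metering at room temperature; the numbers are placeholders rather than values from the study:

    ```python
    R = 8.314           # J mol-1 K-1

    def transpiration_pressure(mass_loss_g, molar_mass_g, carrier_flow_L_min,
                               duration_min, p_total_pa=101325.0, t_carrier_k=298.15):
        """Equilibrium vapor pressure (Pa) from the TGA mass loss during transpiration."""
        n_vap = mass_loss_g / molar_mass_g                         # mol of vapor carried away
        v_carrier_m3 = carrier_flow_L_min * duration_min / 1000.0  # carrier volume at metering T
        n_carrier = p_total_pa * v_carrier_m3 / (R * t_carrier_k)  # ideal-gas moles of carrier
        return p_total_pa * n_vap / (n_vap + n_carrier)

    # Illustrative: 2 mg lost over 600 min with a 20 mL/min carrier flow
    print(transpiration_pressure(2e-3, 286.03, 0.020, 600))   # ~1.4 Pa
    ```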

  11. Dating human skeletal remains: investigating the viability of measuring the equilibrium between 210Po and 210Pb as a means of estimating the post-mortem interval.

    PubMed

    Swift, B

    1998-11-30

    Estimating the post-mortem interval in skeletal remains is a notoriously difficult task; forensic pathologists often rely heavily upon experience in recognising morphological appearances. Previous techniques have involved measuring physical or chemical changes within the hydroxyapatite matrix, radiocarbon dating and 90Sr dating, though no individual test has been advocated. Within this paper it is proposed that measuring the equilibrium between two naturally occurring radio-isotopes, 210Po and 210Pb, and comparison with post-mortem examination samples would produce a new method of dating human skeletal remains. Possible limitations exist, notably the effect of diagenesis, time limitations and relative cost, though this technique could provide a relatively accurate means of determining the post-mortem interval. It is therefore proposed that a large study be undertaken to provide a calibration scale against which bones uncovered can be dated.
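
    The dating idea rests on the 210Po activity (half-life about 138 days) relaxing toward secular equilibrium with the much longer-lived 210Pb (half-life about 22 years) after death. A minimal sketch, assuming the 210Pb activity is effectively constant over the interval of interest and that the Po/Pb activity ratio at death is known from comparison samples; both assumptions are simplifications of what such a study would need to calibrate:

    ```python
    import numpy as np

    T_HALF_PO210_D = 138.4                   # days
    LAM_PO = np.log(2) / T_HALF_PO210_D

    def postmortem_interval_days(ratio_measured, ratio_at_death=0.5):
        """Elapsed time from the measured 210Po/210Pb activity ratio.

        Assumes the 210Pb activity stays constant and the ratio relaxes
        exponentially toward secular equilibrium (ratio -> 1) with the 210Po
        decay constant.  ratio_at_death is a hypothetical calibration value.
        """
        if not ratio_at_death < ratio_measured < 1.0:
            raise ValueError("ratio must lie between the at-death value and 1")
        return -np.log((1.0 - ratio_measured) / (1.0 - ratio_at_death)) / LAM_PO

    print(postmortem_interval_days(0.9, ratio_at_death=0.5))   # ~321 days
    ```

    Because the ratio saturates within a few 210Po half-lives, this simple form is only informative over intervals of up to roughly two years, consistent with the time limitations noted in the abstract.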

  12. Error Propagation Dynamics of PIV-based Pressure Field Calculations: How well does the pressure Poisson solver perform inherently?

    PubMed

    Pan, Zhao; Whitehead, Jared; Thomson, Scott; Truscott, Tadd

    2016-08-01

    Obtaining pressure field data from particle image velocimetry (PIV) is an attractive technique in fluid dynamics due to its noninvasive nature. The application of this technique generally involves integrating the pressure gradient or solving the pressure Poisson equation using a velocity field measured with PIV. However, very little research has been done to investigate the dynamics of error propagation from PIV-based velocity measurements to the pressure field calculation. Rather than measure the error through experiment, we investigate the dynamics of the error propagation by examining the Poisson equation directly. We analytically quantify the error bound in the pressure field, and are able to illustrate the mathematical roots of why and how the Poisson equation based pressure calculation propagates error from the PIV data. The results show that the error depends on the shape and type of boundary conditions, the dimensions of the flow domain, and the flow type.
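
    For two-dimensional incompressible flow the pressure Poisson equation reduces to lap(p) = -rho * [(du/dx)^2 + 2 (du/dy)(dv/dx) + (dv/dy)^2], so errors in the PIV velocities enter through this source term and through the boundary conditions. A minimal sketch of the calculation itself (not of the paper's error analysis), assuming a uniform grid, p = 0 Dirichlet boundaries, and a plain Jacobi iteration:

    ```python
    import numpy as np

    def pressure_poisson(u, v, dx, dy, rho=1.0, iters=5000):
        """Solve lap(p) = -rho*div((u.grad)u) on a uniform grid; p = 0 on the boundary.

        Convention: axis 0 of the arrays is x (spacing dx), axis 1 is y (spacing dy).
        """
        du_dx, du_dy = np.gradient(u, dx, dy, edge_order=2)
        dv_dx, dv_dy = np.gradient(v, dx, dy, edge_order=2)
        rhs = -rho * (du_dx**2 + 2.0 * du_dy * dv_dx + dv_dy**2)
        p = np.zeros_like(u)
        for _ in range(iters):                       # Jacobi iteration
            p[1:-1, 1:-1] = (
                (p[2:, 1:-1] + p[:-2, 1:-1]) * dy**2
                + (p[1:-1, 2:] + p[1:-1, :-2]) * dx**2
                - rhs[1:-1, 1:-1] * dx**2 * dy**2
            ) / (2.0 * (dx**2 + dy**2))
        return p

    # Illustrative use on a divergence-free synthetic velocity field
    x = np.linspace(0.0, 1.0, 41); y = np.linspace(0.0, 1.0, 41)
    X, Y = np.meshgrid(x, y, indexing="ij")
    u = np.sin(np.pi * X) * np.cos(np.pi * Y)
    v = -np.cos(np.pi * X) * np.sin(np.pi * Y)
    p = pressure_poisson(u, v, x[1] - x[0], y[1] - y[0])
    ```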

  13. Imaging high-speed friction at the nanometer scale

    PubMed Central

    Thorén, Per-Anders; de Wijn, Astrid S.; Borgani, Riccardo; Forchheimer, Daniel; Haviland, David B.

    2016-01-01

    Friction is a complicated phenomenon involving nonlinear dynamics at different length and time scales. Understanding its microscopic origin requires methods for measuring force on nanometer-scale asperities sliding at velocities reaching centimetres per second. Despite enormous advances in experimental technique, this combination of small length scale and high velocity remains elusive. We present a technique for rapidly measuring the frictional forces on a single asperity over a velocity range from zero to several centimetres per second. At each image pixel we obtain the velocity dependence of both conservative and dissipative forces, revealing the transition from stick-slip to smooth sliding friction. We explain measurements on graphite using a modified Prandtl–Tomlinson model, including the damped elastic deformation of the asperity. With its improved force sensitivity and small sliding amplitude, our method enables rapid and detailed surface mapping of the velocity dependence of frictional forces with less than 10 nm spatial resolution. PMID:27958267
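
    A minimal sketch of the kind of Prandtl-Tomlinson calculation such measurements are compared against: a point asperity pulled through a sinusoidal substrate potential by a spring moving at constant velocity, integrated with a semi-implicit Euler scheme. The parameters are illustrative only, and the authors' specific modification (damped elastic deformation of the asperity) is not included:

    ```python
    import numpy as np

    def prandtl_tomlinson(v_drive, a=0.25e-9, U0=0.5e-19, k=2.0, m=1e-12,
                          gamma=2e-6, dt=1e-9, n_steps=300_000):
        """Integrate m*x'' = -dU/dx - gamma*x' + k*(v*t - x), U(x) = -U0*cos(2*pi*x/a).

        Returns the mean spring (friction) force over the run.  All parameters
        are illustrative; units are SI.
        """
        x, xdot = 0.0, 0.0
        forces = np.empty(n_steps)
        for i in range(n_steps):
            t = i * dt
            f_spring = k * (v_drive * t - x)
            f_substrate = -(2.0 * np.pi * U0 / a) * np.sin(2.0 * np.pi * x / a)
            xdot += dt * (f_substrate - gamma * xdot + f_spring) / m
            x += dt * xdot
            forces[i] = f_spring
        return forces.mean()

    for v in (1e-3, 3e-3, 1e-2):          # m/s: from mm/s up to 1 cm/s
        print(f"{v:.0e} m/s -> mean friction {prandtl_tomlinson(v):.2e} N")
    ```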

  14. Evaluation of Am–Li neutron spectra data for active well type neutron multiplicity measurements of uranium

    DOE PAGES

    Goddard, Braden; Croft, Stephen; Lousteau, Angela; ...

    2016-05-25

    Safeguarding nuclear material is an important and challenging task for the international community. One particular safeguards technique commonly used for uranium assay is active neutron correlation counting. This technique involves irradiating unused uranium with (α,n) neutrons from an Am-Li source and recording the resultant neutron pulse signal, which includes induced fission neutrons. Although this non-destructive technique is widely employed in safeguards applications, the neutron energy spectrum from an Am-Li source is not well known. Several measurements over the past few decades have been made to characterize this spectrum; however, little work has been done comparing the measured spectra of various Am-Li sources to each other. This paper examines fourteen different Am-Li spectra, focusing on how these spectra affect simulated neutron multiplicity results using the code Monte Carlo N-Particle eXtended (MCNPX). Two measurement and simulation campaigns were completed using Active Well Coincidence Counter (AWCC) detectors and uranium standards of varying enrichment. The results of this work indicate that for standard AWCC measurements, the fourteen Am-Li spectra produce similar doubles and triples count rates. The singles count rates, however, varied by as much as 20% between the different spectra, although they are usually not used in quantitative analysis.

  15. The potential for actigraphy to be used as an indicator of sitting discomfort.

    PubMed

    Telfer, Scott; Spence, William D; Solomonidis, Stephan E

    2009-10-01

    A novel technique that uses actigraphy, the study of activity involving the use of body-mounted accelerometers, to detect the discomfort-related movements of a sitting individual has been proposed as a potential indicator of sitting discomfort, and the purpose of this study was to test its validity. Objective measurement of sitting discomfort has always been challenging for researchers. Electromyographic measurements, pressure mapping, and a wide range of other techniques have all been investigated with limited success. The activity monitor's ability to detect and measure seated movement was assessed, and 12 participants were tested on four different chairs (100-min sessions for each). The activity monitor was able to detect participants' sitting movements (Pearson coefficients > 0.9). The chairs were shown to have significantly different subjective discomfort ratings, all of which increased over time. The movements detected by the activity monitor also increased significantly with time, and the amount measured was greater in the chairs rated as most uncomfortable. Regression analysis indicated that the actigraphy data were able to account for 29.6% of the variation in perceived discomfort ratings. Actigraphy can reliably detect sitting movements and may be of use in measuring sitting discomfort. Potential applications of this technique exist for seating research in the automotive industry, health care, and office and leisure chairs.

  16. Preliminary clinical investigations of a new noninvasive venous pulse oximeter

    NASA Astrophysics Data System (ADS)

    Chan, Daniel; Smith, Peter R.; Caine, Michael P.; Spyt, Tomasz; Boehm, Maria; Machin, David

    2003-10-01

    For decades, the monitoring of mixed venous oxygen saturation, SvO2, has been performed invasively using fibre-optic catheters. This procedure is not without risk, as complications may arise from catheterisation. The group has devised a new non-invasive venous oximetry method which involves inducing regular modulations of the venous blood volume and measuring those modulations by optical means. A clinical investigation was conducted in Glenfield Hospital, UK to evaluate the sensitivity of the new technique to haemodynamic changes such as Cardiac Output (CO) in intraoperative and postoperative cardiac patients. Preliminary trials on patients recovering from cardiac surgery yielded an average correlation of r = 0.72 between CO at different Intra Aortic Balloon Pump (IABP) augmentation levels and SvO2 measured by the new venous oximeter. In intraoperative patients undergoing off-pump cardiac surgery, SvO2 recorded by the new technique responded to unplanned events such as a cardiac arrest. CONCLUSION: The new venous oximetry technique is promising; it responds to haemodynamic changes such as CO and, with further development, might offer an alternative means of monitoring SvO2 non-invasively.
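
    The abstract does not spell out how SvO2 is extracted from the induced modulations; a common approach in oximetry is the "ratio of ratios" of the modulated (AC) to steady (DC) signal at two wavelengths, mapped to saturation through an empirical calibration. A minimal sketch under that assumption, with the artificially induced venous modulation playing the role normally played by the arterial pulse; the calibration coefficients and signals below are placeholders, not values from the study:

    ```python
    import numpy as np

    def ratio_of_ratios(red, infrared):
        """AC/DC ratio at each wavelength, then their quotient."""
        r_red = (red.max() - red.min()) / red.mean()
        r_ir = (infrared.max() - infrared.min()) / infrared.mean()
        return r_red / r_ir

    def saturation_from_ratio(R, a=110.0, b=25.0):
        """Hypothetical linear calibration: saturation (%) = a - b*R."""
        return a - b * R

    # Illustrative synthetic signals modulated at the induced (venous) frequency
    t = np.linspace(0.0, 10.0, 5000)
    red = 1.0 + 0.02 * np.sin(2 * np.pi * 1.0 * t)
    ir = 1.0 + 0.03 * np.sin(2 * np.pi * 1.0 * t)
    print(saturation_from_ratio(ratio_of_ratios(red, ir)))
    ```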

  17. Instrumentation for air quality measurements.

    NASA Technical Reports Server (NTRS)

    Loewenstein, M.

    1973-01-01

    Comparison of the new generation of air quality monitoring instruments with some more traditional methods. The first generation of air quality measurement instruments, based on the use of oxidant coulometric cells, nitrogen oxide colorimetry, carbon monoxide infrared analyzers, and other types of detectors, is compared with new techniques now coming into wide use in the air monitoring field and involving the use of chemiluminescent reactions, optical absorption detectors, a refinement of the carbon monoxide infrared analyzer, electrochemical cells based on solid electrolytes, and laser detectors.

  18. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses

    PubMed Central

    Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
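
    As a rough illustration of trial-based resampling for network uncertainty (not the authors' exact procedure), the sketch below treats zero-lag correlation as the coupling measure, builds the network by averaging per-trial correlation matrices, and bootstraps over trials to attach confidence intervals to every edge:

    ```python
    import numpy as np

    def bootstrap_network(trials, n_boot=1000, alpha=0.05, seed=None):
        """trials: array (n_trials, n_channels, n_samples).

        Returns the mean edge-correlation matrix and bootstrap confidence
        intervals obtained by resampling trials with replacement.
        """
        rng = np.random.default_rng(seed)
        n_trials = trials.shape[0]

        def edges(idx):
            # average per-trial correlation matrices over the selected trials
            return np.mean([np.corrcoef(trials[i]) for i in idx], axis=0)

        boots = np.array([edges(rng.integers(0, n_trials, n_trials))
                          for _ in range(n_boot)])
        lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
        return edges(np.arange(n_trials)), lo, hi

    # Illustrative use with synthetic data: 50 trials, 4 channels, 200 samples
    data = np.random.default_rng(0).standard_normal((50, 4, 200))
    mean_net, ci_lo, ci_hi = bootstrap_network(data, n_boot=200, seed=1)
    ```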

  19. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.

    PubMed

    Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty-both in the functional network edges and the corresponding aggregate measures of network topology-are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here-appropriate for static and dynamic network inference and different statistical measures of coupling-permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.

  20. Constructing Inexpensive, Flexible, and Versatile Microdialysis Probes in an Undergraduate Microdialysis Research Lab

    PubMed Central

    Steffes, Sally; Sandstrom, Michael

    2008-01-01

    Several challenges await new assistant professors setting up a neuroscience lab, and obtaining sufficient research help is typically a top priority. A secondary, but no less daunting, challenge is juggling accuracy and reliability with costs and limited start-up funds. These concerns are particularly crucial for those engaging technically sophisticated measurements, such as microdialysis. We have developed straightforward procedures that our undergraduate students have utilized to successfully construct high-quality, low-cost microdialysis probes. Students mastering the various steps involved have also gained valuable insight into their use, troubleshooting, and the implications of data obtained from these constructed probes. These procedures are explained here to foster increased use in neuroscience labs that involve undergraduates, along with pointers about teaching the technique to newcomers. Students who master the techniques can pass them on to new students easily. These procedures train students in the overall research technique of microdialysis more thoroughly than when manufactured probes are used, they save money, and will eventually save the principal investigator time when students develop independence with troubleshooting and repairs. PMID:23493044

  1. Measurement potential of laser speckle velocimetry

    NASA Technical Reports Server (NTRS)

    Adrian, R. J.

    1982-01-01

    Laser speckle velocimetry, the measurement of fluid velocity by measuring the translation of the speckle pattern or of individual particles moving with the fluid, is described. The measurement is accomplished by illuminating the fluid with consecutive pulses of laser light and recording the images of the particles or the speckles on a double-exposed photographic plate. The plate contains flow information throughout the image plane, so that a single double exposure may provide data at hundreds or thousands of points in the illuminated region of the fluid. Conventional interrogation of the specklegram involves illuminating the plate to form Young's fringes, whose spacing is inversely proportional to the speckle separation. Subsequently the fringes are digitized and analyzed in a computer to determine their frequency and orientation, yielding the velocity magnitude and orientation. The Young's fringe technique is equivalent to performing a 2-D spatial correlation of the double-exposed specklegram intensity pattern, and this observation suggests that correlation should be considered as an alternative processing method. The principle of the correlation technique is examined.
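
    The correlation alternative mentioned at the end is, in digital form, the standard FFT cross-correlation used in particle image velocimetry: the displacement of an interrogation window between exposures is read off from the location of the correlation peak. A minimal sketch for a single window pair (integer-pixel accuracy only, and not the optical Young's-fringe analysis itself):

    ```python
    import numpy as np

    def displacement_by_correlation(win_a, win_b):
        """Integer-pixel shift of win_b relative to win_a via FFT cross-correlation."""
        fa = np.fft.fft2(win_a - win_a.mean())
        fb = np.fft.fft2(win_b - win_b.mean())
        corr = np.fft.fftshift(np.real(np.fft.ifft2(fa.conj() * fb)))
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        center = np.array(corr.shape) // 2
        return np.array(peak) - center          # (d_row, d_col) in pixels

    # Illustrative: shift a random "particle image" by (3, -5) pixels
    rng = np.random.default_rng(1)
    img = rng.random((64, 64))
    shifted = np.roll(img, (3, -5), axis=(0, 1))
    print(displacement_by_correlation(img, shifted))   # -> [ 3 -5]
    ```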

  2. High-precision double-frequency interferometric measurement of the cornea shape

    NASA Astrophysics Data System (ADS)

    Molebny, Vasyl V.; Pallikaris, Ioannis G.; Naoumidis, Leonidas P.; Smirnov, Eugene M.; Ilchenko, Leonid M.; Goncharov, Vadym O.

    1996-11-01

    To measure the shape of the cornea and its deviations from the required values before and after a PRK operation, as well as the shape of other spherical objects such as an artificial pupil, a double-frequency dual-beam interferometric technique was used. The technique is based on determining the optical path difference between two neighboring laser beams reflected from the cornea or other surface under investigation. Knowing the distance between the beams, the local slope of the investigated surface is obtained, and the shape itself is reconstructed by along-line integration. To adjust the wavefront orientation of the laser beam to the spherical shape of the cornea or artificial pupil in the course of scanning, an additional lens is used. The signal-to-noise ratio is improved by excluding losses in the acousto-optic deflectors. Polarization selection is employed to isolate the signal needed for the measurement. 2D image presentation is accompanied by convenient PC accessories, permitting precise cross-section measurements along selected directions. A sensitivity of the order of 10^-2 micrometers is achieved.

  3. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review.

    PubMed

    Chung, Stephanie T; Chacko, Shaji K; Sunehag, Agneta L; Haymond, Morey W

    2015-12-01

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotopes methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions. © 2015 by the American Diabetes Association. Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered.

  4. Chromatic changes to artificial irises produced using different techniques

    NASA Astrophysics Data System (ADS)

    Bannwart, Lisiane Cristina; Goiato, Marcelo Coelho; dos Santos, Daniela Micheline; Moreno, Amália; Pesqueira, Aldiéris Alves; Haddad, Marcela Filié; Andreotti, Agda Marobo; de Medeiros, Rodrigo Antonio

    2013-05-01

    Ocular prostheses are important determinants of their users' aesthetic recovery and self-esteem. Because of use, ocular prostheses longevity is strongly affected by instability of the iris color due to polymerization. The goal of this study is to examine how the color of the artificial iris button is affected by different techniques of artificial wear and by the application of varnish following polymerization of the colorless acrylic resin that covers the colored paint. We produce 60 samples (n=10) according to the wear technique applied: conventional technique without varnish (PE); conventional technique with varnish (PEV); technique involving a prefabricated cap without varnish (CA); technique involving a prefabricated cap with varnish (CAV); technique involving inverted painting without varnish (PI); and technique involving inverted painting with varnish (PIV). Color readings using a spectrophotometer are taken before and after polymerization. We submitted the data obtained to analyses of variance and Tukey's test (P<0.05). The color test shows significant changes after polymerization in all groups. The PE and PI techniques have clinically acceptable values of ΔE, independent of whether we apply varnish to protect the paint. The PI technique produces the least color change, whereas the PE and CA techniques significantly improve color stability.
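
    The ΔE values reported are presumably CIELAB colour differences computed from the spectrophotometer readings; for reference, the basic CIE76 form is simply the Euclidean distance between the before and after (L*, a*, b*) coordinates (the numbers below are illustrative, not from the study):

    ```python
    import math

    def delta_e_ab(lab_before, lab_after):
        """CIE76 colour difference between two (L*, a*, b*) readings."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab_before, lab_after)))

    # Illustrative: a 2-unit shift in L* and a 1-unit shift in b*
    print(delta_e_ab((62.0, 14.5, 18.0), (60.0, 14.5, 19.0)))   # ~2.24
    ```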

  5. Integration of Quartz Crystal Microbalance-Dissipation and Reflection-Mode Localized Surface Plasmon Resonance Sensors for Biomacromolecular Interaction Analysis.

    PubMed

    Ferhan, Abdul Rahim; Jackman, Joshua A; Cho, Nam-Joon

    2016-12-20

    The combination of label-free, surface-sensitive measurement techniques based on different physical principles enables detailed characterization of biomacromolecular interactions at solid-liquid interfaces. To date, most combined measurement systems have involved experimental techniques with similar probing volumes, whereas the potential of utilizing techniques with different surface sensitivities remains largely unexplored, especially for data interpretation. Herein, we report a combined measurement approach that integrates a conventional quartz crystal microbalance-dissipation (QCM-D) setup with a reflection-mode localized surface plasmon (LSPR) sensor. Using this platform, we investigate vesicle adsorption on a titanium oxide-coated sensing substrate along with the amphipathic, α-helical (AH) peptide-induced structural transformation of surface-adsorbed lipid vesicles into a supported lipid bilayer (SLB) as a model biomacromolecular interaction. While the QCM-D and LSPR signals both detected mass uptake arising from vesicle adsorption, tracking the AH peptide-induced structural transformation revealed more complex measurement responses based on the different surface sensitivities of the two techniques. In particular, the LSPR signal recorded an increase in optical mass near the sensor surface which indicated SLB formation, whereas the QCM-D signals detected a significant loss in net acoustic mass due to excess lipid and coupled solvent leaving the probing volume. Importantly, these measurement capabilities allowed us to temporally distinguish the process of SLB formation at the sensor surface from the overall structural transformation process. Looking forward, these label-free measurement capabilities to simultaneously probe adsorbates at multiple length scales will provide new insights into complex biomacromolecular interactions.
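
    For context on the "acoustic mass" side of such a comparison: for thin, rigid films the QCM-D frequency shift is commonly converted to an areal mass through the Sauerbrey relation, Δm = -C·Δf/n, with C about 17.7 ng cm^-2 Hz^-1 for a 5 MHz crystal. A minimal sketch (the abstract does not state that this conversion was used here, and the relation underestimates mass for soft, solvated layers such as intact vesicles):

    ```python
    def sauerbrey_mass(delta_f_hz, overtone_n, c_ng_cm2_hz=17.7):
        """Areal mass change (ng/cm^2) from the QCM-D frequency shift at overtone n."""
        return -c_ng_cm2_hz * delta_f_hz / overtone_n

    # Illustrative: a -50 Hz shift at the 3rd overtone
    print(sauerbrey_mass(-50.0, 3))   # ~295 ng/cm^2
    ```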

  6. A Measure of Perceived Argument Strength: Reliability and Validity

    PubMed Central

    Zhao, Xiaoquan; Strasser, Andrew; Cappella, Joseph N.; Lerman, Caryn; Fishbein, Martin

    2014-01-01

    Studies of the content of persuasive messages in which the central arguments of the message are scrutinized have traditionally relied on the technique of thought-listing to assess argument strength. Although the validity of the thought-listing procedure is well documented, its utility can be limited in situations involving non-adult populations and sensitive topics. In this paper we present a self-reported scale that can be used to assess perceived argument strength in contexts where thought-listing may be less appropriate. This scale taps into perceived argument strength from multiple points of view, including but also extending beyond the potential of the argument to elicit positive and negative thoughts. Reliability and validity of this scale were assessed in health communication contexts involving anti-drug PSAs directed at adolescents and anti-smoking PSAs targeting adults. Evidence of convergence between this scale and the thought-listing technique was also obtained using the classical comprehensive exam arguments. PMID:25568663

  7. Cell Signaling Experiments Driven by Optical Manipulation

    PubMed Central

    Difato, Francesco; Pinato, Giulietta; Cojoc, Dan

    2013-01-01

    Cell signaling involves complex transduction mechanisms in which information released by nearby cells or extracellular cues are transmitted to the cell, regulating fundamental cellular activities. Understanding such mechanisms requires cell stimulation with precise control of low numbers of active molecules at high spatial and temporal resolution under physiological conditions. Optical manipulation techniques, such as optical tweezing, mechanical stress probing or nano-ablation, allow handling of probes and sub-cellular elements with nanometric and millisecond resolution. PicoNewton forces, such as those involved in cell motility or intracellular activity, can be measured with femtoNewton sensitivity while controlling the biochemical environment. Recent technical achievements in optical manipulation have new potentials, such as exploring the actions of individual molecules within living cells. Here, we review the progress in optical manipulation techniques for single-cell experiments, with a focus on force probing, cell mechanical stimulation and the local delivery of active molecules using optically manipulated micro-vectors and laser dissection. PMID:23698758

  8. Boiler Tube Corrosion Characterization with a Scanning Thermal Line

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Jacobstein, Ronald; Reilly, Thomas

    2001-01-01

    Wall thinning due to corrosion in utility boiler water wall tubing is a significant operational concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. Unfortunately, ultrasonic inspection is very manpower intense and slow. Therefore, thickness measurements are typically taken over a relatively small percentage of the total boiler wall and statistical analysis is used to determine the overall condition of the boiler tubing. Other inspection techniques, such as electromagnetic acoustic transducer (EMAT), have recently been evaluated, however they provide only a qualitative evaluation - identifying areas or spots where corrosion has significantly reduced the wall thickness. NASA Langley Research Center, in cooperation with ThermTech Services, has developed a thermal NDE technique designed to quantitatively measure the wall thickness and thus determine the amount of material thinning present in steel boiler tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed and accuracy for large structures such as boiler water walls. A theoretical basis for the technique will be presented to establish the quantitative nature of the technique. Further, a dynamic calibration system will be presented for the technique that allows the extraction of thickness information from the temperature data. Additionally, the results of the application of this technology to actual water wall tubing samples and in-situ inspections will be presented.

  9. Technical Errors May Affect Accuracy of Torque Limiter in Locking Plate Osteosynthesis.

    PubMed

    Savin, David D; Lee, Simon; Bohnenkamp, Frank C; Pastor, Andrew; Garapati, Rajeev; Goldberg, Benjamin A

    2016-01-01

    In locking plate osteosynthesis, proper surgical technique is crucial in reducing potential pitfalls, and use of a torque limiter makes it possible to control insertion torque. We conducted a study of the ways in which different techniques can alter the accuracy of torque limiters. We tested 22 torque limiters (1.5 Nm) for accuracy using hand and power tools under different rotational scenarios: hand power at low and high velocity and drill power at low and high velocity. We recorded the maximum torque reached after each torque-limiting event. Use of torque limiters under hand power at low velocity and high velocity resulted in significantly (P < .0001) different mean (SD) measurements: 1.49 (0.15) Nm and 3.73 (0.79) Nm. Use under drill power at controlled low velocity and at high velocity also resulted in significantly (P < .0001) different mean (SD) measurements: 1.47 (0.14) Nm and 5.37 (0.90) Nm. Maximum single measurement obtained was 9.0 Nm using drill power at high velocity. Locking screw insertion with improper technique may result in higher than expected torque and subsequent complications. For torque limiters, the most reliable technique involves hand power at slow velocity or drill power with careful control of insertion speed until 1 torque-limiting event occurs.

  10. The eradication of Simulium neavei from Kenya

    PubMed Central

    McMahon, J. P.; Highton, R. B.; Goiny, H.

    1958-01-01

    S. neavei, the vector of onchocerciasis, has been virtually eradicated from Kenya by larviciding measures in which DDT was used. Only a very small area remains infested and this is in the course of being treated by the Uganda medical authorities as it is part of a much larger focus occurring in that country. An account is given of the various surveys which have been carried out during the last ten years in Nyanza Province, involving 15 000 square miles (about 40 000 km2), and survey techniques are described. An account is given of the eradication measures carried out in North and South Nyanza, and techniques in connexion with dosing and checking operations are described. Costs for both surveys and eradication schemes are given, and minimum requirements for transport are indicated. PMID:13585062

  11. Next generation initiation techniques

    NASA Technical Reports Server (NTRS)

    Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans

    1993-01-01

    Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. The third kind of next-generation technique involves strategies to initialize convective scale (non-hydrostatic) models.
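
    The cost function referred to for the variational approaches typically takes the standard quadratic (3D-Var) form, written here for illustration with xb the background state, y the observations, H the observation operator, and B and R the background- and observation-error covariances:

    ```latex
    J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
                  + \tfrac{1}{2}\,\bigl(H(\mathbf{x})-\mathbf{y}\bigr)^{\mathrm{T}}\mathbf{R}^{-1}\bigl(H(\mathbf{x})-\mathbf{y}\bigr)
    ```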

  12. Fractional exhaled nitric oxide-measuring devices: technology update

    PubMed Central

    Maniscalco, Mauro; Vitale, Carolina; Vatrella, Alessandro; Molino, Antonio; Bianco, Andrea; Mazzarella, Gennaro

    2016-01-01

    The measurement of exhaled nitric oxide (NO) has been employed in the diagnosis of specific types of airway inflammation and in guiding treatment, by predicting and assessing the response to anti-inflammatory therapy, monitoring compliance, and detecting relapse. Various techniques are currently used to analyze exhaled NO concentrations under a range of conditions for both health and disease. These include chemiluminescence and electrochemical sensor devices. The cost effectiveness and the ability to achieve adequate flexibility in sensitivity and selectivity of NO measurement for these methods are evaluated alongside the potential for use of laser-based technology. This review explores the technologies involved in the measurement of exhaled NO. PMID:27382340

  13. Advantage of the modified Lunn-McNeil technique over Kalbfleisch-Prentice technique in competing risks

    NASA Astrophysics Data System (ADS)

    Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.

    2002-03-01

    Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas; for example, it can be used to analyze the failure times of aircraft crashes. Another survival analysis tool is competing risks, where more than one cause of failure acts simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating the other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study was conducted by comparing the inference of models, using the Root Mean Square Error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and the Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on the power test measurements.
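
    A minimal sketch of the data layout behind a Lunn-McNeil-style analysis: each subject is duplicated once per failure cause, with an event indicator and cause-specific covariate interactions, so that a single Cox fit yields cause-specific effects. This is a generic two-cause illustration (assuming the lifelines package for the fit, and fully synthetic data), not the modified technique studied in the paper:

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    def lunn_mcneil_augment(df, causes=(1, 2)):
        """Duplicate each subject once per cause; 'status' holds the observed cause (0 = censored)."""
        rows = []
        for _, r in df.iterrows():
            for c in causes:
                rows.append({"time": r["time"],
                             "event": int(r["status"] == c),
                             "cause2": int(c == causes[1]),              # failure-type indicator
                             "x": r["x"],
                             "x_cause2": r["x"] * int(c == causes[1])})  # cause-specific effect of x
        return pd.DataFrame(rows)

    # Illustrative synthetic data: time, observed cause (0 = censored), one covariate
    rng = np.random.default_rng(0)
    n = 200
    data = pd.DataFrame({"time": rng.exponential(10.0, n).round(2),
                         "status": rng.choice([0, 1, 2], n, p=[0.2, 0.4, 0.4]),
                         "x": rng.normal(size=n)})

    cph = CoxPHFitter().fit(lunn_mcneil_augment(data), duration_col="time", event_col="event")
    cph.print_summary()
    ```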

  14. Optimum projection pattern generation for grey-level coded structured light illumination systems

    NASA Astrophysics Data System (ADS)

    Porras-Aguilar, Rosario; Falaggis, Konstantinos; Ramos-Garcia, Ruben

    2017-04-01

    Structured light illumination (SLI) systems are well-established optical inspection techniques for noncontact 3D surface measurements. A common technique is multi-frequency sinusoidal SLI that obtains the phase map at various fringe periods in order to estimate the absolute phase, and hence, the 3D surface information. Nevertheless, multi-frequency SLI systems employ multiple measurement planes (e.g. four phase-shifted frames) to obtain the phase at a given fringe period. It is therefore an age-old challenge to obtain the absolute surface information using fewer measurement frames. Grey level (GL) coding techniques have been developed as an attempt to reduce the number of planes needed, because a spatio-temporal GL sequence employing p discrete grey-levels and m frames has the potential to unwrap up to p^m fringes. Nevertheless, one major disadvantage of GL based SLI techniques is that there are often errors near the border of each stripe, because an ideal stepwise intensity change cannot be measured. If the step-change in intensity is a single discrete grey-level unit, this problem can usually be overcome by applying an appropriate threshold. However, severe errors occur if the intensity change at the border of the stripe exceeds several discrete grey-level units. In this work, an optimum GL based technique is presented that generates a series of projection patterns with a minimal gradient in the intensity. It is shown that when using this technique, the errors near the border of the stripes can be significantly reduced. This improvement is achieved through the choice of generated patterns and does not involve additional hardware or special post-processing techniques. The performance of the method is validated using both simulations and experiments. The reported technique is generic, works with an arbitrary number of frames, and can employ an arbitrary number of grey-levels.

  15. Measurements in the Turbulent Boundary Layer at Constant Pressure in Subsonic and Supersonic Flow. Part 2: Laser-Doppler Velocity Measurements

    NASA Technical Reports Server (NTRS)

    Dimotakis, P. E.; Collins, D. J.; Lang, D. B.

    1979-01-01

    A description of both the mean and the fluctuating components of the flow, and of the Reynolds stress, as observed using a dual forward-scattering laser-Doppler velocimeter, is presented. A detailed description of the instrument and of the data analysis techniques was included in order to fully document the data. A detailed comparison was made between the laser-Doppler results and those presented in Part 1, and an assessment was made of the ability of the laser-Doppler velocimeter to measure the details of the flows involved.

  16. [Inpatient rehabilitation of adults with atopic dermatitis].

    PubMed

    Breuer, K; Kapp, A

    2006-07-01

    Atopic dermatitis is a chronic inflammatory skin disease which often persists until adulthood. In severe cases, eczematous lesions and pruritus are resistant to therapy and result in depression, impairment of professional activities and social withdrawal. The goal of inpatient rehabilitation measures is to keep the patient involved and active in professional and social activities. Rehabilitative measures include diagnostics and medical therapy according to current guidelines, instruction in basic medical information, psychological intervention (relaxation techniques, improvement of self-confidence), dietetic measures, exercise, and social advice. Patients with atopic dermatitis often have work-related problems which should be identified as early as possible during rehabilitation.

  17. Josephson frequency meter for millimeter and submillimeter wavelengths

    NASA Technical Reports Server (NTRS)

    Anischenko, S. E.; Larkin, S. Y.; Chaikovsky, V. I.; Kabayev, P. V.; Kamyshin, V. V.

    1995-01-01

    Frequency measurements of electromagnetic oscillations in the millimeter and submillimeter wavebands become increasingly difficult as the frequency grows, for a number of reasons. First, these frequencies are considered to be cutoffs for semiconductor converting devices, so optical measurement methods must be used instead of traditional ones based on frequency transfer. Second, resonance measurement methods operate over relatively narrow bands, while optical methods are limited in frequency and time resolution by the limited range and velocity of movement of their mechanical elements; moreover, the efficiency of these optical techniques decreases with increasing wavelength because of diffraction losses. Such methods also require a priori information on the radiation frequency band of the source involved. A method of measuring the frequency of harmonic microwave signals in the millimeter and submillimeter wavebands, based on the ac Josephson effect in superconducting contacts, is devoid of all the above drawbacks. This approach offers a number of major advantages over the more traditional measurement methods, that is, those based on frequency conversion, resonance, and interferometric techniques. It is characterized by high potential accuracy, a wide range of measured frequencies, prompt measurement, the opportunity to obtain a panoramic display of the results, and full automation of the measuring process.
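
    The underlying relation is the Josephson frequency-voltage proportionality f = 2eV/h, roughly 483.6 GHz per millivolt, which is what lets a superconducting contact act as a frequency-to-voltage transducer; a two-line illustration:

    ```python
    E_CHARGE = 1.602176634e-19   # C
    PLANCK_H = 6.62607015e-34    # J s

    def josephson_frequency_hz(voltage_v):
        """Josephson frequency f = 2eV/h for a junction biased at constant voltage V."""
        return 2.0 * E_CHARGE * voltage_v / PLANCK_H

    print(josephson_frequency_hz(1e-3) / 1e9)   # ~483.6 GHz at 1 mV
    ```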

  18. Application of the ultrasonic technique and high-speed filming for the study of the structure of air-water bubbly flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carvalho, R.D.M.; Venturini, O.J.; Tanahashi, E.I.

    2009-10-15

    Multiphase flows are very common in industry, oftentimes involving very harsh environments and fluids. Accordingly, there is a need to determine the dispersed phase holdup using noninvasive fast responding techniques; besides, knowledge of the flow structure is essential for the assessment of the transport processes involved. The ultrasonic technique fulfills these requirements and could have the capability to provide the information required. In this paper, the potential of the ultrasonic technique for application to two-phase flows was investigated by checking acoustic attenuation data against experimental data on the void fraction and flow topology of vertical, upward, air-water bubbly flows in the zero to 15% void fraction range. The ultrasonic apparatus consisted of one emitter/receiver transducer and three other receivers at different positions along the pipe circumference; simultaneous high-speed motion pictures of the flow patterns were made at 250 and 1000 fps. The attenuation data for all sensors exhibited a systematic interrelated behavior with void fraction, thereby testifying to the capability of the ultrasonic technique to measure the dispersed phase holdup. From the motion pictures, basic gas phase structures and different flow patterns were identified that corroborated several features of the acoustic attenuation data. Finally, the acoustic wave transit time was also investigated as a function of void fraction. (author)

  19. Experimental study using Nearfield Acoustical Holography of sound transmission fuselage sidewall structures

    NASA Technical Reports Server (NTRS)

    Maynard, J. D.

    1983-01-01

    This project involves the development of the Nearfield Acoustic Holography (NAH) technique (in particular its extension from single frequency to wideband noise measurement) and its application in a detailed study of the noise radiation characteristics of several samples of aircraft sidewall panels. With the extensive amount of information provided by the NAH technique, the properties of the sound field radiated by the panels may be correlated with their structure, mounting, and excitation (single frequency or wideband, spatially correlated or uncorrelated, structure-borne). The work accomplished at the beginning of this grant period included: (1) Calibration of the 256 microphone array and test of its accuracy. (2) Extension of the facility to permit measurements on wideband noise sources. The extensions included the addition of high-speed data acquisition hardware and an array processor, and the development of new software. (3) Installation of motion picture graphics for correlating panel motion with structure, mounting, radiation, etc. (4) Development of new holographic data processing techniques.

  20. Comparison of oral surgery task performance in a virtual reality surgical simulator and an animal model using objective measures.

    PubMed

    Ioannou, Ioanna; Kazmierczak, Edmund; Stern, Linda

    2015-01-01

    The use of virtual reality (VR) simulation for surgical training has gathered much interest in recent years. Despite increasing popularity and usage, limited work has been carried out in the use of automated objective measures to quantify the extent to which performance in a simulator resembles performance in the operating theatre, and the effects of simulator training on real world performance. To this end, we present a study exploring the effects of VR training on the performance of dentistry students learning a novel oral surgery task. We compare the performance of trainees in a VR simulator and in a physical setting involving ovine jaws, using a range of automated metrics derived by motion analysis. Our results suggest that simulator training improved the motion economy of trainees without adverse effects on task outcome. Comparison of surgical technique on the simulator with the ovine setting indicates that simulator technique is similar, but not identical to real world technique.

  1. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    Nuclear reaction analysis techniques are mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for the calculated cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the concentrations of the various constituents of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, and is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. Called the absolute method, it allows a measurement as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
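
    A minimal sketch of the absolute-method bookkeeping, assuming a single activation reaction, full-energy-peak counting, and no self-shielding or flux-depression corrections; the standard saturation, decay and counting-time factors are written out explicitly:

    ```python
    import numpy as np

    N_AVOGADRO = 6.02214076e23

    def element_mass_g(counts, efficiency, gamma_yield, flux, sigma_cm2,
                       half_life_s, t_irr, t_decay, t_count, molar_mass, abundance=1.0):
        """Absolute-method NAA: element mass from the measured photopeak counts.

        flux in n cm-2 s-1, sigma_cm2 the activation cross section, times in seconds.
        """
        lam = np.log(2) / half_life_s
        S = 1.0 - np.exp(-lam * t_irr)                        # saturation during irradiation
        D = np.exp(-lam * t_decay)                            # decay before counting
        C = (1.0 - np.exp(-lam * t_count)) / (lam * t_count)  # decay during counting
        activity = counts / (efficiency * gamma_yield * t_count)   # disintegrations per second
        n_target = activity / (flux * sigma_cm2 * S * D * C)       # atoms of the target isotope
        return n_target * molar_mass / (N_AVOGADRO * abundance)
    ```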

  2. A class of temporal boundaries derived by quantifying the sense of separation.

    PubMed

    Paine, Llewyn Elise; Gilden, David L

    2013-12-01

    The perception of moment-to-moment environmental flux as being composed of meaningful events requires that memory processes coordinate with cues that signify beginnings and endings. We have constructed a technique that allows this coordination to be monitored indirectly. This technique works by embedding a sequential priming task into the event under study. Memory and perception must be coordinated to resolve temporal flux into scenes. The implicit memory processes inherent in sequential priming are able to effectively shadow then mirror scene-forming processes. Certain temporal boundaries are found to weaken the strength of irrelevant feature priming, a signal which can then be used in more ambiguous cases to infer how people segment time. Over the course of 13 independent studies, we were able to calibrate the technique and then use it to measure the strength of event segmentation in several instructive contexts that involved both visual and auditory modalities. The signal generated by sequential priming may permit the sense of separation between events to be measured as an extensive psychophysical quantity.

  3. Assessment of upper airway mechanics during sleep.

    PubMed

    Farré, Ramon; Montserrat, Josep M; Navajas, Daniel

    2008-11-30

    Obstructive sleep apnea, which is the most prevalent sleep breathing disorder, is characterized by recurrent episodes of upper airway collapse and reopening. However, the mechanical properties of the upper airway are not directly measured in routine polysomnography because only qualitative sensors (thermistors for flow and thoraco-abdominal bands for pressure) are used. This review focuses on two techniques that quantify upper airway obstruction during sleep. A Starling model of collapsible conduit allows us to interpret the mechanics of the upper airway by means of two parameters: the critical pressure (Pcrit) and the upstream resistance (Rup). A simple technique to measure Pcrit and Rup involves the application of different levels of continuous positive airway pressure (CPAP) during sleep. The forced oscillation technique is another non-invasive procedure for quantifying upper airway impedance during the breathing cycle in sleep studies. The latest developments in these two methods allow them to be easily applied on a routine basis in order to more fully characterize upper airway mechanics in patients with sleep breathing disorders.
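
    In the Starling-resistor picture, the maximal inspiratory flow during flow limitation varies linearly with the applied nasal (CPAP) pressure, Vmax = (Pn - Pcrit)/Rup, so Pcrit and Rup follow from a straight-line fit of flow against pressure. A minimal sketch with illustrative numbers (not patient data):

    ```python
    import numpy as np

    def pcrit_rup(cpap_pressure_cmh2o, max_flow_ml_s):
        """Fit Vmax = (Pn - Pcrit)/Rup; returns (Pcrit, Rup)."""
        slope, intercept = np.polyfit(cpap_pressure_cmh2o, max_flow_ml_s, 1)
        rup = 1.0 / slope                      # cmH2O per (mL/s)
        pcrit = -intercept / slope             # pressure at which flow ceases
        return pcrit, rup

    pn = np.array([2.0, 4.0, 6.0, 8.0, 10.0])            # applied CPAP levels, cmH2O
    vmax = np.array([40.0, 110.0, 180.0, 260.0, 330.0])  # flow-limited inspiratory flow, mL/s
    print(pcrit_rup(pn, vmax))
    ```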

  4. Numerical analysis of thermal drilling technique on titanium sheet metal

    NASA Astrophysics Data System (ADS)

    Kumar, R.; Hynes, N. Rajesh Jesudoss

    2018-05-01

    Thermal drilling is a technique used for drilling sheet metal in various applications. It involves a conical tool rotating at high speed that drills the sheet metal and forms a hole with a bushing below the surface of the sheet. This article investigates the finite element analysis of thermal drilling of Ti6Al4V alloy sheet metal. The analysis was carried out by means of the DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Because high-temperature deformation dominates this technique, output characteristics that are difficult to measure experimentally can be obtained successfully by the finite element method. Therefore, the modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution and temperature of the workpiece.

  5. Temperature measurements behind reflected shock waves in air. [radiometric measurement of gas temperature in self-absorbing gas flow

    NASA Technical Reports Server (NTRS)

    Bader, J. B.; Nerem, R. M.; Dann, J. B.; Culp, M. A.

    1972-01-01

    A radiometric method for the measurement of gas temperature in self-absorbing gases has been applied in the study of shock tube generated flows. This method involves making two absolute intensity measurements at identical wavelengths, but for two different pathlengths in the same gas sample. Experimental results are presented for reflected shock waves in air at conditions corresponding to incident shock velocities from 7 to 10 km/s and an initial driven tube pressure of 1 torr. These results indicate that, with this technique, temperature measurements with an accuracy of + or - 5 percent can be carried out. The results also suggest certain facility related problems.
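
    For a uniform, self-absorbing slab the observed intensity is I(L) = B_lambda(T) * (1 - exp(-kappa*L)), so two absolute measurements at the same wavelength for path lengths L and 2L determine both kappa*L and B_lambda(T), and the temperature follows by inverting the Planck function. A minimal sketch under those assumptions (uniform gas, the second path exactly twice the first; the example numbers are illustrative):

    ```python
    import numpy as np

    H = 6.62607015e-34; C = 2.99792458e8; KB = 1.380649e-23

    def planck_inverse(radiance, wavelength_m):
        """Temperature from spectral radiance B_lambda (W m-2 sr-1 m-1)."""
        a = 2.0 * H * C**2 / wavelength_m**5
        b = H * C / (wavelength_m * KB)
        return b / np.log(1.0 + a / radiance)

    def two_path_temperature(i_single, i_double, wavelength_m):
        """Gas temperature from intensities at path lengths L and 2L, same wavelength."""
        exp_neg_kl = i_double / i_single - 1.0       # equals exp(-kappa*L)
        radiance = i_single / (1.0 - exp_neg_kl)     # equals B_lambda(T)
        return planck_inverse(radiance, wavelength_m)

    # Illustrative check: T = 9000 K, kappa*L = 0.5, wavelength 500 nm
    wl = 500e-9
    b_true = (2 * H * C**2 / wl**5) / (np.exp(H * C / (wl * KB * 9000.0)) - 1.0)
    i1 = b_true * (1 - np.exp(-0.5)); i2 = b_true * (1 - np.exp(-1.0))
    print(two_path_temperature(i1, i2, wl))   # ~9000 K
    ```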

  6. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    NASA Technical Reports Server (NTRS)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically utilized sub-mm bead immersion techniques extensively, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and large processing time for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.

  7. Laser-Induced Fluorescence Helps Diagnose Plasma Processes

    NASA Technical Reports Server (NTRS)

    Beattie, J. R.; Mattosian, J. N.; Gaeta, C. J.; Turley, R. S.; Williams, J. D.; Williamson, W. S.

    1994-01-01

    Technique developed to provide in situ monitoring of rates of ion sputter erosion of accelerator electrodes in ion thrusters also used for ground-based applications to monitor, calibrate, and otherwise diagnose plasma processes in fabrication of electronic and optical devices. Involves use of laser-induced-fluorescence measurements, which provide information on rates of ion etching, inferred rates of sputter deposition, and concentrations of contaminants.

  8. Empirical State Error Covariance Matrix for Batch Estimation

    NASA Technical Reports Server (NTRS)

    Frisbee, Joe

    2015-01-01

    State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted batch least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. The proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. This empirical error covariance matrix may be calculated as a side computation for each unique batch solution. Results based on the proposed technique will be presented for a simple, two observer and measurement error only problem.
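
    For orientation, a minimal weighted batch least-squares sketch showing the usual formal covariance (H'WH)^-1 next to a simple residual-scaled alternative; the latter is only a generic illustration of using the fit residuals to temper the formal matrix, not the empirical state error covariance matrix constructed in the paper:

    ```python
    import numpy as np

    def weighted_batch_lsq(H, y, W):
        """Weighted batch least squares: estimate, formal and residual-scaled covariances."""
        normal = H.T @ W @ H
        x_hat = np.linalg.solve(normal, H.T @ W @ y)
        p_formal = np.linalg.inv(normal)                 # theoretical covariance
        resid = y - H @ x_hat
        dof = H.shape[0] - H.shape[1]
        # simple residual-based rescaling (one common, generic adjustment)
        p_scaled = p_formal * (resid @ W @ resid) / dof
        return x_hat, p_formal, p_scaled

    # Illustrative: fit a line to noisy data with unit weights
    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 10.0, 30)
    H = np.column_stack([np.ones_like(t), t])
    y = 1.0 + 0.5 * t + 0.3 * rng.standard_normal(t.size)
    print(weighted_batch_lsq(H, y, np.eye(t.size))[0])
    ```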

  9. Key techniques and risk management for the application of the Pile-Beam-Arch (PBA) excavation method: a case study of the Zhongjie subway station.

    PubMed

    Guan, Yong-ping; Zhao, Wen; Li, Shen-gang; Zhang, Guo-bin

    2014-01-01

    The design and construction of shallow-buried tunnels in densely populated urban areas involve many challenges. The ground movements induced by tunneling effects pose potential risks to infrastructure such as surface buildings, pipelines, and roads. In this paper, a case study of the Zhongjie subway station located in Shenyang, China, is examined to investigate the key construction techniques and the influence of the Pile-Beam-Arch (PBA) excavation method on the surrounding environment. This case study discusses the primary risk factors affecting the environmental safety and summarizes the corresponding risk mitigation measures and key techniques for subway station construction using the PBA excavation method in a densely populated urban area.

  10. NASA Glenn Research Center Experience with LENR Phenomenon

    NASA Technical Reports Server (NTRS)

    Wrbanek, Susan Y.; Fralick, Gustave C.; Wrbanek, John D.; Niedra, Janis M.

    2012-01-01

    Since 1989 NASA Glenn Research Center (GRC) has performed some small-scale limited experiments that show evidence of effects claimed by some to be evidence of Low Energy Nuclear Reactions (LENR). The research at GRC has involved observations and work on measurement techniques for observing the temperature effects in reactions of isotopes of hydrogen with palladium hydrides. The various experiments performed involved loading Pd with gaseous H2 and D2, and exposing Pd thin films to multi-bubble sonoluminescence in regular and deuterated water. An overview of these experiments and their results will be presented.

  11. NASA Glenn Research Center Experience with "LENR Phenomenon"

    NASA Technical Reports Server (NTRS)

    Wrbanek, Susan Y.; Fralick, Gustave C.; Wrbanek, John D.; Niedra, Janis M.

    2012-01-01

    Since 1989, NASA Glenn Research Center (GRC) has performed small-scale, limited experiments that show effects claimed by some to be evidence of Low Energy Nuclear Reactions (LENR). The research at GRC has involved observations and work on measurement techniques for observing the temperature effects in reactions of isotopes of hydrogen with palladium hydrides. The various experiments performed involved loading Pd with gaseous H2 and D2, and exposing Pd thin films to multi-bubble sonoluminescence in regular and deuterated water. An overview of these experiments and their results will be presented.

  12. Atomic Oxygen Treatment as a Method of Recovering Smoke Damaged Paintings. Revised

    NASA Technical Reports Server (NTRS)

    Rutledge, Sharon K.; Banks, Bruce A.; Forkapa, Mark; Stueber, Thomas; Sechkar, Edward; Malinowski, Kevin

    1999-01-01

    A noncontact technique is described that uses atomic oxygen, generated under low pressure in the presence of nitrogen, to remove soot and charred varnish from the surface of a painting. The process, which involves surface oxidation, permits control of the amount of surface material removed. The effectiveness of the process was evaluated by reflectance measurements from selected areas made during the removal of soot from acrylic gesso, ink on paper, and varnished oil paint substrates. For the latter substrate, treatment also involved the removal of damaged varnish and paint binder from the surface.

  13. Column ratio mapping: a processing technique for atomic resolution high-angle annular dark-field (HAADF) images.

    PubMed

    Robb, Paul D; Craven, Alan J

    2008-12-01

    An image processing technique is presented for atomic resolution high-angle annular dark-field (HAADF) images that have been acquired using scanning transmission electron microscopy (STEM). This technique is termed column ratio mapping and involves the automated measurement of atomic column intensity ratios in high-resolution HAADF images. It was developed to provide a fuller analysis of HAADF images than the usual method of drawing single intensity line profiles across a few areas of interest. For instance, column ratio mapping reveals the compositional distribution across the whole HAADF image and allows a statistical analysis and an estimation of errors. This has proven to be a very valuable technique, as it can provide a more detailed assessment of the sharpness of interfacial structures from HAADF images. The technique of column ratio mapping is described in terms of a [110]-oriented zinc-blende structured AlAs/GaAs superlattice, using the 1 Å-scale resolution capability of the aberration-corrected SuperSTEM 1 instrument.

  14. A participatory approach for selecting cost-effective measures in the WFD context: the Mar Menor (SE Spain).

    PubMed

    Perni, Angel; Martínez-Paz, José M

    2013-08-01

    Achieving a good ecological status in water bodies by 2015 is one of the objectives established in the European Water Framework Directive. Cost-effectiveness analysis (CEA) has been applied for selecting measures to achieve this goal, but this appraisal technique requires technical and economic information that is not always available. In addition, there are often local insights that can only be identified by engaging multiple stakeholders in a participatory process. This paper proposes to combine CEA with the active involvement of stakeholders for selecting cost-effective measures. This approach has been applied to the case study of one of the main coastal lagoons in the European Mediterranean Sea, the Mar Menor, which presents eutrophication problems. Firstly, face-to-face interviews were conducted to estimate the relative effectiveness and relative impacts of a set of measures by means of the pairwise comparison technique. Secondly, relative effectiveness was used to estimate cost-effectiveness ratios. The most cost-effective measures were the restoration of watercourses that drain into the lagoon and the treatment of polluted groundwater. Although in general the stakeholders approved the former, most of them stated that the latter involved some uncertainties, which must be addressed before implementing it. Stakeholders pointed out that the programme of measures (PoM) would have a positive impact not only on water quality, but also on fishing, agriculture and tourism in the area. This approach can be useful to evaluate other programmes, plans or projects related to other European environmental strategies. Copyright © 2013 Elsevier B.V. All rights reserved.
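
    A minimal sketch of how pairwise-comparison judgements can be turned into relative effectiveness scores and then into cost-effectiveness ratios (an AHP-style principal-eigenvector weighting is assumed here, and the measures, judgements and costs are purely illustrative):

```python
import numpy as np

# Hypothetical pairwise comparison matrix from stakeholder interviews:
# entry [i, j] = how much more effective measure i is judged to be than measure j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
effectiveness = w / w.sum()              # relative effectiveness scores (sum to 1)

costs = np.array([4.0, 1.5, 2.0])        # illustrative annualised costs
ce_ratios = costs / effectiveness        # lower ratio = more cost-effective
measures = ["watercourse restoration", "groundwater treatment", "other measure"]
for name, ce in zip(measures, ce_ratios):
    print(f"{name}: cost-effectiveness ratio {ce:.1f}")
```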

  15. Ring-Down Spectroscopy for Characterizing a CW Raman Laser

    NASA Technical Reports Server (NTRS)

    Matsko, Andrey; Savchenkov, Anatoliy; Maleki, Lute

    2007-01-01

    A relatively simple technique for characterizing an all-resonant intracavity continuous-wave (CW) solid-state Raman laser involves the use of ring-down spectroscopy. As used here, characterizing signifies determining such parameters as threshold pump power, Raman gain, conversion efficiency, and quality factors (Q values) of the pump and Stokes cavity modes. Heretofore, in order to characterize resonant-cavity-based Raman lasers, it has usually been necessary to manipulate the frequencies and power levels of pump lasers and, in each case, to take several sets of measurements. In cases involving ultra-high-Q resonators, it also has been desirable to lock pump lasers to resonator modes to ensure the quality of measurement data. Simpler techniques could be useful. In the present ring-down spectroscopic technique, one infers the parameters of interest from the decay of the laser out of its steady state. This technique does not require changing the power or frequency of the pump laser or locking the pump laser to the resonator mode. The technique is based on a theoretical analysis of what happens when the pump laser is abruptly switched off after the Raman generation reaches the steady state. The analysis starts with differential equations for the evolution of the amplitudes of the pump and Stokes electric fields, leading to solutions for the power levels of the pump and Stokes fields as functions of time and of the aforementioned parameters. Among other things, these solutions show how the ring-down time depends, to some extent, on the electromagnetic energy accumulated in the cavity. The solutions are readily converted to relatively simple equations for the parameters as functions of quantities that can be determined from measurements of the time-dependent power levels. For example, the steady-state intracavity conversion efficiency is given by G1/G2 - 1 and the threshold power is given by Pin(G2/G1)^2, where Pin is the steady-state input pump power immediately prior to abrupt switch-off, G1 is the initial rate of decay of the pump field, and G2 is the final rate of decay of the pump field. Hence, it is possible to determine all the parameters from a single ring-down scan, provided that the measurements taken in that scan are sufficiently accurate and complete.
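
    Given the two decay rates extracted from a single ring-down trace, the relations quoted above reduce to a few lines of arithmetic (the numbers below are illustrative, not measured values):

```python
# Decay rates of the pump field immediately after switch-off (G1) and after
# the Stokes field has died away (G2), plus the pre-switch-off pump power Pin.
G1 = 2.0e6        # initial pump decay rate, 1/s (illustrative)
G2 = 1.2e6        # final pump decay rate, 1/s (illustrative)
Pin = 5.0e-3      # steady-state input pump power, W (illustrative)

efficiency = G1 / G2 - 1.0          # steady-state intracavity conversion efficiency
P_threshold = Pin * (G2 / G1)**2    # threshold pump power

print(f"conversion efficiency ~ {efficiency:.2f}")
print(f"threshold power ~ {P_threshold * 1e3:.2f} mW")
```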

  16. In situ strain and temperature measurement and modelling during arc welding

    DOE PAGES

    Chen, Jian; Yu, Xinghua; Miller, Roger G.; ...

    2014-12-26

    In this study, experiments and numerical models were applied to investigate the thermal and mechanical behaviours of materials adjacent to the weld pool during arc welding. In the experiment, a new high temperature strain measurement technique based on digital image correlation (DIC) was developed and applied to measure the in situ strain evolution. In contrast to the conventional DIC method, which is vulnerable to the high temperature and intense arc light involved in fusion welding processes, the new technique utilised a special surface preparation method to produce high temperature sustaining speckle patterns required by the DIC algorithm, as well as a unique optical illumination and filtering system to suppress the influence of the intense arc light. These efforts made it possible for the first time to measure in situ the strain field 1 mm away from the fusion line. The temperature evolution in the weld and the adjacent regions was simultaneously monitored by an infrared camera. Finally, a thermal–mechanical finite element model was applied to substantiate the experimental measurement.

  17. Optical measurement of sound using time-varying laser speckle patterns

    NASA Astrophysics Data System (ADS)

    Leung, Terence S.; Jiang, Shihong; Hebden, Jeremy

    2011-02-01

    In this work, we introduce an optical technique to measure sound. The technique involves pointing a coherent pulsed laser beam at the surface of the measurement site and capturing the time-varying speckle patterns using a CCD camera. Sound manifests itself as vibrations on the surface, which induce a periodic translation of the speckle pattern over time. Using a parallel speckle detection scheme, the dynamics of the time-varying speckle patterns can be captured and processed to produce spectral information of the sound. One potential clinical application is to measure pathological sounds from the brain as a screening test. We performed experiments to demonstrate the principle of the detection scheme using head phantoms. The results show that the detection scheme can measure the spectra of single-frequency sounds between 100 and 2000 Hz. The detection scheme worked equally well in both a flat geometry and an anatomical head geometry. However, the current detection scheme is too slow for use in living biological tissues, which have a decorrelation time of a few milliseconds. Further improvements have been suggested.
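
    A rough sketch of the processing chain implied here, with synthetic frames standing in for camera data: each frame is cross-correlated with a reference frame to track the speckle translation, and the resulting displacement time series is Fourier transformed to recover the sound spectrum. The frame rate, frequencies and pattern are invented for illustration:

```python
import numpy as np

def frame_shift(frame_a, frame_b):
    """Integer-pixel translation between two speckle frames, from the peak of
    their FFT-based cross-correlation (sign convention is irrelevant here)."""
    xcorr = np.fft.ifft2(np.fft.fft2(frame_a) * np.conj(np.fft.fft2(frame_b))).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    return [p if p < s // 2 else p - s for p, s in zip(peak, xcorr.shape)]

# Synthetic stand-in: a speckle pattern vibrating sinusoidally at 500 Hz,
# sampled by the camera at 5 kHz.
rng = np.random.default_rng(0)
base = rng.random((64, 64))
fs, f0, n_frames = 5000.0, 500.0, 256
shifts = [frame_shift(base, np.roll(base, int(round(3 * np.sin(2 * np.pi * f0 * k / fs))), axis=1))[1]
          for k in range(n_frames)]

spectrum = np.abs(np.fft.rfft(np.asarray(shifts, dtype=float)))
freqs = np.fft.rfftfreq(n_frames, d=1.0 / fs)
print(f"dominant frequency ~ {freqs[np.argmax(spectrum[1:]) + 1]:.0f} Hz")
```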

  18. Patch nearfield acoustic holography combined with sound field separation technique applied to a non-free field

    NASA Astrophysics Data System (ADS)

    Bi, ChuanXing; Jing, WenQian; Zhang, YongBin; Xu, Liang

    2015-02-01

    The conventional nearfield acoustic holography (NAH) is usually based on the assumption of free-field conditions, and it also requires that the measurement aperture be larger than the actual source. This paper focuses on the case in which neither of the above-mentioned requirements can be met, and examines the feasibility of reconstructing the sound field radiated by a partial source from double-layer pressure measurements made in a non-free field, using patch NAH combined with a sound field separation technique. The sensitivity of the reconstructed result to measurement error is also analyzed in detail. Two experiments, involving two speakers in an exterior space and one speaker inside a car cabin, are presented. The experimental results demonstrate that patch NAH based on single-layer pressure measurement cannot obtain a satisfactory result due to the influences of disturbing sources and reflections, while patch NAH based on double-layer pressure measurements can successfully remove these influences and reconstruct the patch sound field effectively.

  19. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    NASA Technical Reports Server (NTRS)

    Raiman, Laura B.

    1992-01-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  20. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    NASA Astrophysics Data System (ADS)

    Raiman, Laura B.

    1992-12-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  1. Molecular biology of myopia.

    PubMed

    Schaeffel, Frank; Simon, Perikles; Feldkaemper, Marita; Ohngemach, Sibylle; Williams, Robert W

    2003-09-01

    Experiments in animal models of myopia have emphasised the importance of visual input in emmetropisation but it is also evident that the development of human myopia is influenced to some degree by genetic factors. Molecular genetic approaches can help to identify both the genes involved in the control of ocular development and the potential targets for pharmacological intervention. This review covers a variety of techniques that are being used to study the molecular biology of myopia. In the first part, we describe techniques used to analyse visually induced changes in gene expression: Northern blot, polymerase chain reaction (PCR) and real-time PCR to obtain semi-quantitative and quantitative measures of changes in transcription level of a known gene, differential display reverse transcription PCR (DD-RT-PCR) to search for new genes that are controlled by visual input, rapid amplification of 5' cDNA ends (5'-RACE) to extend the 5' end of sequences that are regulated by visual input, in situ hybridisation to localise the expression of a given gene in a tissue and oligonucleotide microarray assays to simultaneously test visually induced changes in thousands of transcripts in single experiments. In the second part, we describe techniques that are used to localise regions in the genome that contain genes that are involved in the control of eye growth and refractive errors in mice and humans. These include quantitative trait loci (QTL) mapping, exploiting experimental test crosses of mice and transmission disequilibrium tests (TDT) in humans to find chromosomal intervals that harbour genes involved in myopia development. We review several successful applications of this battery of techniques in myopia research.

  2. Three-dimensional venous visualization with phase-lag computed tomography angiography for reconstructive microsurgery.

    PubMed

    Sakakibara, Shunsuke; Onishi, Hiroyuki; Hashikawa, Kazunobu; Akashi, Masaya; Sakakibara, Akiko; Nomura, Tadashi; Terashi, Hiroto

    2015-05-01

    Most free flap reconstruction complications involve vascular compromise. Evaluation of vascular anatomy provides considerable information that can potentially minimize these complications. Previous reports have shown that contrast-enhanced computed tomography is effective for understanding three-dimensional arterial anatomy. However, most vascular complications result from venous thromboses, making imaging of venous anatomy highly desirable. The phase-lag computed tomography angiography (pl-CTA) technique involves 64-channel (virtually, 128-channel) multidetector CT and is used to acquire arterial images using conventional CTA. Venous images are three-dimensionally reconstructed using a subtraction technique involving combined venous phase and arterial phase images, using a computer workstation. This technique was used to examine 48 patients (12 lower leg reconstructions, 34 head and neck reconstructions, and 2 upper extremity reconstructions) without complications. The pl-CTA technique can be used for three-dimensional visualization of peripheral veins measuring approximately 1 mm in diameter. The pl-CTA information was especially helpful for secondary free flap reconstructions in the head and neck region after malignant tumor recurrence. In such cases, radical dissection of the neck was performed as part of the first operation, and many vessels, including veins, were resected and used in the first free-tissue transfer. The pl-CTA images also allowed visualization of varicose changes in the lower leg region and helped us avoid selecting those vessels for anastomosis. Thus, the pl-CTA-derived venous anatomy information was useful for exact evaluations during the planning of free-tissue transfers.

  3. Rare-Earth Oxide (Yb2O3) Selective Emitter Fabrication and Evaluation

    NASA Technical Reports Server (NTRS)

    Jennette, Bryan; Gregory, Don A.; Herren, Kenneth; Tucker, Dennis; Smith, W. Scott (Technical Monitor)

    2001-01-01

    This investigation involved the fabrication and evaluation of rare-earth oxide selective emitters. The first goal of this study was to successfully fabricate the selective emitter samples using paper and ceramic materials processing techniques. The resulting microstructure was also analyzed using a Scanning Electron Microscope. All selective emitter samples fabricated for this study were made with ytterbium oxide (Yb2O3). The second goal of this study involved the measurement of the spectral emission and the radiated power of all the selective emitter samples. The final goal of this study involved the direct comparison of the radiated power emitted by the selective emitter samples to that of a standard blackbody at the same temperature and within the same wavelength range.

  4. Multi-wavelength dual polarisation lidar for monitoring precipitation process in the cloud seeding technique

    NASA Astrophysics Data System (ADS)

    Sudhakar, P.; Sheela, K. Anitha; Ramakrishna Rao, D.; Malladi, Satyanarayana

    2016-05-01

    In recent years, weather modification activities have been pursued in many countries through cloud seeding techniques to facilitate increased and timely precipitation from clouds. In order to induce and accelerate the precipitation process, clouds are artificially seeded with suitable materials like silver iodide, sodium chloride or other hygroscopic materials. The success of cloud seeding can be predicted with confidence if the precipitation process involving aerosol, the ice water balance, water vapor content and the size of the seeding material in relation to aerosol in the cloud is monitored in real time and optimized. A project on the enhancement of rainfall through cloud seeding is being implemented jointly with Kerala State Electricity Board Ltd., Trivandrum, Kerala, India, at the catchment areas of the reservoir of one of the hydroelectric projects. The dual polarization lidar is being used to monitor and measure the microphysical properties, the extinction coefficient, size distribution and related parameters of the clouds. The lidar makes use of the Mie, Rayleigh and Raman scattering techniques for the various measurements proposed. The measurements with the dual polarization lidar are being carried out in real time to obtain the various parameters during cloud seeding operations. In this paper we present the details of the multi-wavelength dual polarization lidar being used and the methodology to monitor the various cloud parameters involved in the precipitation process. The necessary retrieval algorithms for deriving the microphysical properties of clouds, aerosol characteristics and water vapor profiles are incorporated as a software package working under LabVIEW for online and offline analysis. Details of the simulation studies and the theoretical model developed in this regard for the optimization of various parameters are discussed.

  5. A portable meter for measuring low frequency currents in the human body.

    PubMed

    Niple, J C; Daigle, J P; Zaffanella, L E; Sullivan, T; Kavet, R

    2004-07-01

    A portable meter has been developed for measuring low frequency currents that flow in the human body. Although the present version of the meter was specifically designed to measure 50/60 Hz "contact currents," the principles involved can be used with other low frequency body currents. Contact currents flow when the human body provides a conductive path between objects in the environment with different electrical potentials. The range of currents the meter detects is approximately 0.4-800 microA. This provides measurements of currents from the threshold of human perception (approximately 500 microA(RMS)) down to single microampere levels. The meter has a unique design, which utilizes the human subject's body impedance as the sensing element. Some of the advantages of this approach are high sensitivity, the ability to measure current flow in the majority of the body, and relative insensitivity to the current path connection points. Current measurement accuracy varies with the accuracy of the body impedance (resistance) measurement and different techniques can be used to obtain a desired level of accuracy. Techniques are available to achieve an estimated +/-20% accuracy. Copyright 2004 Wiley-Liss, Inc.

  6. Optimal plane search method in blood flow measurements by magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Bargiel, Pawel; Orkisz, Maciej; Przelaskowski, Artur; Piatkowska-Janko, Ewa; Bogorodzki, Piotr; Wolak, Tomasz

    2004-07-01

    This paper offers an algorithm for determining the blood flow parameters in the neck vessel segments using a single (optimal) measurement plane instead of the usual approach involving four planes orthogonal to the artery axis. This new approach aims at significantly shortening the time required to complete measurements using Nuclear Magnetic Resonance techniques. Based on a defined error function, the algorithm scans the solution space to find the minimum of the error function, and thus to determine a single plane characterized by a minimum measurement error, which allows for an accurate measurement of blood flow in the four carotid arteries. The paper also comprises a practical implementation of this method (as a module of a larger imaging-measuring system), including preliminary research results.

  7. Temperature of the plasmasphere from Van Allen Probes HOPE

    NASA Astrophysics Data System (ADS)

    Genestreti, K. J.; Goldstein, J.; Corley, G. D.; Farner, W.; Kistler, L. M.; Larsen, B. A.; Mouikis, C. G.; Ramnarace, C.; Skoug, R. M.; Turner, N. E.

    2017-01-01

    We introduce two novel techniques for estimating temperatures of very low energy space plasmas using, primarily, in situ data from an electrostatic analyzer mounted on a charged and moving spacecraft. The techniques are used to estimate proton temperatures during intervals where the bulk of the ion plasma is well below the energy bandpass of the analyzer. Both techniques assume that the plasma may be described by a one-dimensional E×B drifting Maxwellian and that the potential field and motion of the spacecraft may be accounted for in the simplest possible manner, i.e., by a linear shift of coordinates. The first technique involves the application of a constrained theoretical fit to a measured distribution function. The second technique involves the comparison of total and partial-energy number densities. Both techniques are applied to Van Allen Probes Helium, Oxygen, Proton, and Electron (HOPE) observations of the proton component of the plasmasphere during two orbits on 15 January 2013. We find that the temperatures calculated from these two order-of-magnitude-type techniques are in good agreement with typical ranges of the plasmaspheric temperature calculated using retarding potential analyzer-based measurements, generally between 0.2 and 2 eV (2000-20,000 K). We also find that the temperature is correlated with L shell and hot plasma density and is negatively correlated with the cold plasma density. We posit that the latter of these three relationships may be indicative of collisional or wave-driven heating of the plasmasphere in the ring current overlap region. We note that these techniques may be easily applied to similar data sets or used for a variety of purposes.
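
    A rough sketch of the first technique, a constrained fit of a shifted one-dimensional Maxwellian to a measured distribution function. The functional form, the fixed energy shift standing in for spacecraft potential and motion, and the use of scipy.optimize.curve_fit are illustrative assumptions, not the HOPE analysis pipeline:

```python
import numpy as np
from scipy.optimize import curve_fit

KB_EV_PER_K = 8.617e-5    # Boltzmann constant, eV/K
M_P = 1.673e-27           # proton mass, kg
EV = 1.602e-19            # J per eV

def shifted_maxwellian(E_eV, n, T_eV, E_shift_eV=1.5):
    """1-D Maxwellian distribution function versus energy, after a linear
    energy shift (assumed known) accounting for spacecraft potential/motion."""
    Ek = np.clip(E_eV - E_shift_eV, 1e-6, None) * EV
    T = T_eV * EV
    return n * np.sqrt(M_P / (2.0 * np.pi * T)) * np.exp(-Ek / T)

# Synthetic "measured" distribution at the analyzer's lowest energy steps.
E = np.linspace(2.0, 15.0, 12)    # eV
noise = 1.0 + 0.05 * np.random.default_rng(1).standard_normal(E.size)
f_meas = shifted_maxwellian(E, 5.0e7, 0.8) * noise

(n_fit, T_fit), _ = curve_fit(shifted_maxwellian, E, f_meas, p0=(1.0e7, 1.0))
print(f"fitted temperature ~ {T_fit:.2f} eV (~{T_fit / KB_EV_PER_K:.0f} K)")
```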

  8. Accuracy of 3 different impression techniques for internal connection angulated implants.

    PubMed

    Tsagkalidis, George; Tortopidis, Dimitrios; Mpikos, Pavlos; Kaisarlis, George; Koidis, Petros

    2015-10-01

    Making implant impressions with different angulations requires a more precise and time-consuming impression technique. The purpose of this in vitro study was to compare the accuracy of nonsplinted, splinted, and snap-fit impression techniques of internal connection implants with different angulations. An experimental device was used to allow a clinical simulation of impression making by means of open and closed tray techniques. Three different impression techniques (nonsplinted, acrylic-resin splinted, and indirect snap-fit) for 6 internal-connected implants at different angulations (0, 15, 25 degrees) were examined using polyether. Impression accuracy was evaluated by measuring the differences in 3-dimensional (3D) position deviations between the implant body/impression coping before the impression procedure and the coping/laboratory analog positioned within the impression, using a coordinate measuring machine. Data were analyzed by 2-way ANOVA. Means were compared with the least significant difference criterion at P<.05. Results showed that at 25 degrees of implant angulation, the highest accuracy was obtained with the splinted technique (mean ±SE: 0.39 ±0.05 mm) and the lowest with the snap-fit technique (0.85 ±0.09 mm); at 15 degrees of angulation, there were no significant differences between the splinted (0.22 ±0.04 mm) and nonsplinted (0.15 ±0.02 mm) techniques, and the lowest accuracy was obtained with the snap-fit technique (0.95 ±0.15 mm); and no significant differences were found between the nonsplinted and splinted techniques at 0 degrees of implant placement. The splinted impression technique exhibited higher accuracy than the other techniques studied when increased implant angulations of 25 degrees were involved. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  9. Practical exergy analysis of centrifugal compressor performance using ASME-PTC-10 data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carranti, F.J.

    1997-07-01

    It has been shown that measures of performance currently in use for industrial and process compressors do not give a true measure of energy utilization, and that the required assumptions of isentropic or adiabatic behavior are not always valid. A better indication of machine or process performance can be achieved using exergetic (second law) efficiencies and by employing the second law of thermodynamics to indicate the nature of irreversibilities and entropy generation in the compression process. In this type of analysis, performance is related to an environmental equilibrium condition, or dead state. Often, the differences between avoidable and unavoidable irreversibilities can be interpreted from these results. A general overview of the techniques involved in exergy analysis as applied to compressors and blowers is presented. A practical method to allow the calculation of exergetic efficiencies by manufacturers and end users is demonstrated using data from ASME Power Test Code input. These data are often readily available from compressor manufacturers for both design and off-design conditions, or can sometimes be obtained from field measurements. The calculations involved are simple and straightforward, and can demonstrate the energy usage situation for a variety of conditions. Here off-design is taken to mean at different rates of flow, as well as at different environmental states. The techniques presented are also applicable to many other equipment and process types.

  10. Characterization of Bond Strength of U-Mo Fuel Plates Using the Laser Shockwave Technique: Capabilities and Preliminary Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. A. Smith; D. L. Cottle; B. H. Rabin

    2013-09-01

    This report summarizes work conducted to date on the implementation of new laser-based capabilities for characterization of bond strength in nuclear fuel plates, and presents preliminary results obtained from fresh fuel studies on as-fabricated monolithic fuel consisting of uranium-10 wt.% molybdenum alloys clad in 6061 aluminum by hot isostatic pressing. Characterization involves application of two complementary experimental methods, laser-shock testing and laser-ultrasonic imaging, collectively referred to as the Laser Shockwave Technique (LST), that allows the integrity, physical properties and interfacial bond strength in fuel plates to be evaluated. Example characterization results are provided, including measurement of layer thicknesses, elastic properties of the constituents, and the location and nature of generated debonds (including kissing bonds). LST provides spatially localized, non-contacting measurements with minimum specimen preparation, and is ideally suited for applications involving radioactive materials, including irradiated materials. The theoretical principles and experimental approaches employed in characterizing nuclear fuel plates are described, and preliminary bond strength measurement results are discussed, with emphasis on demonstrating the capabilities and limitations of these methods. These preliminary results demonstrate the ability to distinguish bond strength variations between different fuel plates. Although additional development work is necessary to validate and qualify the test methods, these results suggest LST is viable as a method to meet fuel qualification requirements to demonstrate acceptable bonding integrity.

  11. Measurements of optical underwater turbulence under controlled conditions

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.; Gladysz, S.; Almeida de Sá Barros, R.; Matt, S.; Nootz, G. A.; Josset, D. B.; Hou, W.

    2016-05-01

    Laser beam propagation underwater is becoming an important research topic because of high demand for its potential applications. Namely, the ability to image underwater at long distances is highly desired for scientific and military purposes, including submarine awareness, diver visibility, and mine detection. Optical communication in the ocean can provide covert data transmission with much higher rates than those available with acoustic techniques, and it is now desired for certain military and scientific applications that involve sending large quantities of data. Unfortunately, the underwater environment presents serious challenges for the propagation of laser beams. Even in clean ocean water, the extinction due to absorption and scattering theoretically limits the useful range to a few attenuation lengths. However, extending the laser light propagation range to the theoretical limit leads to significant beam distortions due to optical underwater turbulence. Experiments show that the magnitude of the distortions caused by water temperature and salinity fluctuations can significantly exceed the magnitude of the beam distortions due to atmospheric turbulence, even for relatively short propagation distances. We present direct measurements of optical underwater turbulence under the controlled conditions of a laboratory water tank using two separate techniques involving a wavefront sensor and an LED array. These independent approaches will enable development of an underwater turbulence power spectrum model based directly on spatial domain measurements and will lead to accurate predictions of underwater beam propagation.

  12. Surface topography characterization using 3D stereoscopic reconstruction of SEM images

    NASA Astrophysics Data System (ADS)

    Vedantha Krishna, Amogh; Flys, Olena; Reddy, Vijeth V.; Rosén, B. G.

    2018-06-01

    A major drawback of the optical microscope is its limitation in resolving finer details. Many microscopes have been developed to overcome the limitations set by the diffraction of visible light. The scanning electron microscope (SEM) is one such alternative: it uses electrons for imaging, which have a much smaller wavelength than photons. As a result, high magnification with superior image resolution can be achieved. However, SEM generates 2D images, which provide limited data for surface measurements and analysis. Many research areas require knowledge of 3D structures, as they contribute to a comprehensive understanding of the microstructure by allowing effective measurements and qualitative visualization of the samples under study. For this reason, the stereo photogrammetry technique is employed to convert SEM images into 3D measurable data. This paper aims to utilize a stereoscopic reconstruction technique as a reliable method for characterization of surface topography. Reconstructed results from SEM images are compared with coherence scanning interferometer (CSI) results obtained by measuring a roughness reference standard sample. This paper presents a method to select the most robust/consistent surface texture parameters that are insensitive to the uncertainties involved in the reconstruction technique itself. Results from two stereoscopic reconstruction algorithms are also documented in this paper.

  13. Experimental and data analysis techniques for deducing collision-induced forces from photographic histories of engine rotor fragment impact/interaction with a containment ring

    NASA Technical Reports Server (NTRS)

    Yeghiayan, R. P.; Leech, J. W.; Witmer, E. A.

    1973-01-01

    An analysis method termed TEJ-JET is described whereby measured transient elastic and inelastic deformations of an engine-rotor fragment-impacted structural ring are analyzed to deduce the transient external forces experienced by that ring as a result of fragment impact and interaction with the ring. Although the theoretical feasibility of the TEJ-JET concept was established, its practical feasibility when utilizing experimental measurements of limited precision and accuracy remains to be established. The experimental equipment and the techniques (high-speed motion photography) employed to measure the transient deformations of fragment-impacted rings are described. Sources of error and data uncertainties are identified. Techniques employed to reduce data reading uncertainties and to correct the data for optical-distortion effects are discussed. These procedures, including spatial smoothing of the deformed ring shape by Fourier series and timewise smoothing by Gram polynomials, are applied illustratively to recent measurements involving the impact of a single T58 turbine rotor blade against an aluminum containment ring. Plausible predictions of the fragment-ring impact/interaction forces are obtained by one branch of this TEJ-JET method; however, a second branch of this method, which provides an independent estimate of these forces, remains to be evaluated.
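
    The spatial-smoothing step lends itself to a compact illustration: fit a truncated Fourier series to the digitized ring deflections and keep only the low-order harmonics. The data below are synthetic, not the T58 test measurements:

```python
import numpy as np

def fourier_smooth(theta, r, n_harmonics=5):
    """Least-squares fit of a truncated Fourier series
    r(theta) ~ a0 + sum_k [a_k cos(k theta) + b_k sin(k theta)]."""
    cols = [np.ones_like(theta)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * theta), np.sin(k * theta)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, r, rcond=None)
    return A @ coeffs

# Synthetic "deformed ring": mean radius 150 mm with low-order bulges plus
# film-reading noise.
theta = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
r_true = 150.0 + 4.0 * np.cos(2 * theta) + 1.5 * np.sin(3 * theta)
r_meas = r_true + np.random.default_rng(2).normal(0.0, 0.5, theta.size)

r_smooth = fourier_smooth(theta, r_meas)
print(f"rms residual after smoothing: {np.std(r_meas - r_smooth):.2f} mm")
```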

  14. Utility of the theory of planned behavior to predict nursing staff blood pressure monitoring behaviours.

    PubMed

    Nelson, Joan M; Cook, Paul F; Ingram, Jennifer C

    2014-02-01

    To evaluate constructs from the theory of planned behavior (TPB, Ajzen 2002) - attitudes, sense of control, subjective norms and intentions - as predictors of accuracy in blood pressure monitoring. Despite numerous initiatives aimed at teaching blood pressure measurement techniques, many healthcare providers measure blood pressures incorrectly. Descriptive, cohort design. Medical assistants (MAs) and licensed practical nurses (LPNs) were asked to complete a questionnaire on TPB variables. These nursing staff's patients had their blood pressures measured and completed a survey about the techniques used to measure their blood pressure. We correlated nursing staff's responses on the TPB questionnaire with their intention to measure an accurate blood pressure and with the difference between their actual blood pressure measurement and a second measurement taken by a researcher immediately after the clinic visit. Patients' perceptions of MAs' and LPNs' blood pressure measurement techniques were examined descriptively. Perceived control and social norm predicted intention to measure an accurate blood pressure, with a negative relationship between knowledge and intention. Consistent with the TPB, intention was the only significant predictor of blood pressure measurement accuracy. Theory of planned behavior constructs predicted the healthcare providers' intention to measure blood pressure accurately, and intention predicted the actual accuracy of systolic blood pressure measurement. However, participants' knowledge about blood pressure measurement had an unexpected negative relationship with their intentions. These findings have important implications for nursing education departments and organisations, which traditionally invest significant time and effort in annual competency training focused on knowledge enhancement by staff. This study suggests that a better strategy might involve efforts to enhance providers' intention to change, particularly by changing social norms or increasing perceived control of the behaviour by nursing staff. © 2013 Blackwell Publishing Ltd.

  15. Techniques and recommendations for the inclusion of users with autism in the design of assistive technologies.

    PubMed

    Francis, Peter; Mellor, David; Firth, Lucy

    2009-01-01

    The increasing numbers of technology platforms offer opportunities to develop new visual assistive aids for people with autism. However, their involvement in the design of such aids is critical to their short-term uptake and longer term use. Using a three-round Delphi study involving seven Australian psychologists specializing in treating people with autism, the authors explored the utility of four techniques that might be implemented to involve users with autism in the design process. The authors found that individual users from the target group would be likely to respond differently to the techniques and that no technique was clearly better than any other. Recommendations for using these techniques to involve individuals with autism in the design of assistive technologies are suggested.

  16. Methods of photoelectrode characterization with high spatial and temporal resolution

    DOE PAGES

    Esposito, Daniel V.; Baxter, Jason B.; John, Jimmy; ...

    2015-06-19

    Here, materials and photoelectrode architectures that are highly efficient, extremely stable, and made from low cost materials are required for commercially viable photoelectrochemical (PEC) water-splitting technology. A key challenge is the heterogeneous nature of real-world materials, which often possess spatial variation in their crystal structure, morphology, and/or composition at the nano-, micro-, or macro-scale. Different structures and compositions can have vastly different properties and can therefore strongly influence the overall performance of the photoelectrode through complex structure–property relationships. A complete understanding of photoelectrode materials would also involve elucidation of processes such as carrier collection and electrochemical charge transfer that occur at very fast time scales. We present herein an overview of a broad suite of experimental and computational tools that can be used to define the structure–property relationships of photoelectrode materials at small dimensions and on fast time scales. A major focus is on in situ scanning-probe measurement (SPM) techniques that possess the ability to measure differences in optical, electronic, catalytic, and physical properties with nano- or micro-scale spatial resolution. In situ ultrafast spectroscopic techniques, used to probe carrier dynamics involved with processes such as carrier generation, recombination, and interfacial charge transport, are also discussed. Complementing all of these experimental techniques are computational atomistic modeling tools, which can be invaluable for interpreting experimental results, aiding in materials discovery, and interrogating PEC processes at length and time scales not currently accessible by experiment. In addition to reviewing the basic capabilities of these experimental and computational techniques, we highlight key opportunities and limitations of applying these tools for the development of PEC materials.

  17. A field technique for estimating aquifer parameters using flow log data

    USGS Publications Warehouse

    Paillet, Frederick L.

    2000-01-01

    A numerical model is used to predict flow along intervals between producing zones in open boreholes for comparison with measurements of borehole flow. The model gives flow under quasi-steady conditions as a function of the transmissivity and hydraulic head in an arbitrary number of zones communicating with each other along open boreholes. The theory shows that the amount of inflow to or outflow from the borehole under any one flow condition may not indicate relative zone transmissivity. A unique inversion for both hydraulic-head and transmissivity values is possible if flow is measured under two different conditions such as ambient and quasi-steady pumping, and if the difference in open-borehole water level between the two flow conditions is measured. The technique is shown to give useful estimates of water levels and transmissivities of two or more water-producing zones intersecting a single interval of open borehole under typical field conditions. Although the modeling technique involves some approximation, the principal limit on the accuracy of the method under field conditions is the measurement error in the flow log data. Flow measurements and pumping conditions are usually adjusted so that transmissivity estimates are most accurate for the most transmissive zones, and relative measurement error is proportionately larger for less transmissive zones. The most effective general application of the borehole-flow model results when the data are fit to models that systematically include more production zones of progressively smaller transmissivity values until model results show that all accuracy in the data set is exhausted.
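
    A deliberately simplified sketch of the two-condition idea for a single producing zone, assuming borehole inflow proportional to transmissivity times the head difference between the zone and the open borehole (the real inversion is carried out against the numerical model described above, and the numbers here are illustrative):

```python
def zone_head_and_relative_T(q_ambient, q_pumping, drawdown, hw_ambient=0.0):
    """Invert one zone's ambient and pumping inflows for its far-field head and
    a relative transmissivity, assuming q = c * T * (h_zone - h_borehole) with
    the same constant c under both quasi-steady conditions."""
    dq = q_pumping - q_ambient             # extra inflow caused by the drawdown
    T_rel = dq / drawdown                  # proportional to c * T
    h_zone = hw_ambient + drawdown * q_ambient / dq
    return h_zone, T_rel

# Illustrative numbers: 0.5 L/min inflow at ambient conditions, 3.5 L/min while
# pumping, with 2.0 m of drawdown between the two open-borehole water levels.
h_zone, T_rel = zone_head_and_relative_T(0.5, 3.5, 2.0)
print(f"zone head ~ {h_zone:.2f} m above the ambient water level, relative T ~ {T_rel:.2f}")
```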

  18. Vapor pressures of acetylene at low temperatures

    NASA Technical Reports Server (NTRS)

    Masterson, C. M.; Allen, John E., Jr.; Kraus, G. F.; Khanna, R. K.

    1990-01-01

    The atmospheres of many of the outer planets and their satellites contain a large number of hydrocarbon species. In particular, acetylene (C2H2) has been identified at Jupiter, Saturn and its satellite Titan, Uranus and Neptune. In the lower atmospheres of these planets, where colder temperatures prevail, the condensation and/or freezing of acetylene is probable. In order to obtain accurate models of the acetylene in these atmospheres, it is necessary to have a complete understanding of its vapor pressures at low temperatures. Vapor pressures at low temperatures for acetylene are being determined. The vapor pressures are measured with two different techniques in order to cover a wide range of temperatures and pressures. In the first, the acetylene is placed in a sample tube which is immersed in a low temperature solvent/liquid nitrogen slush bath whose temperature is measured with a thermocouple. The vapor pressure is then measured directly with a capacitance manometer. For lower pressures, a second technique, called the thin-film infrared (TFIR) method, was developed. It involves measuring the disappearance rate of a thin film of acetylene at a particular temperature. The spectra are then analyzed, using previously determined extinction coefficient values, to determine the disappearance rate R (where R = Δn/Δt, the number of molecules that disappear per unit time). This can be related to the vapor pressure directly. This technique facilitates measurement at the lower temperatures and pressures. Both techniques have been calibrated using CO2, and have shown good agreement with the existing literature data.

  19. Beampattern control of a microphone array to minimize secondary source contamination.

    PubMed

    Jordan, Peter; Fitzpatrick, John A; Meskell, Craig

    2003-10-01

    A null-steering technique is adapted and applied to a linear delay-and-sum beamformer in order to measure the noise generated by one of the propellers of a 1/8 scale twin propeller aircraft model. The technique involves shading the linear array using a set of weights, which are calculated according to the locations onto which the nulls need to be steered (in this case onto the second propeller). The technique is based on an established microwave antenna theory, and uses a plane-wave, or far field formulation in order to represent the response of the array by an nth-order polynomial, where n is the number of array elements. The roots of this polynomial correspond to the minima of the array response, and so by an appropriate choice of roots, a polynomial can be generated, the coefficients of which are the weights needed to achieve the prespecified set of null positions. It is shown that, for the technique to work with actual data, the cross-spectral matrix must be conditioned before array shading is implemented. This ensures that the shading function is not distorted by the intrinsic element weighting which can occur as a result of the directional nature of aeroacoustic systems. A difference of 6 dB between measurements before and after null steering shows the technique to have been effective in eliminating the contribution from one of the propellers, thus providing a quantitative measure of the acoustic energy from the other.
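
    A minimal Schelkunoff-style sketch of the shading step: place roots of the array polynomial at the directions to be nulled, pad the remaining roots, expand the polynomial, and use its coefficients as element weights. The array geometry, steering direction and null angles below are made up for illustration:

```python
import numpy as np

def null_steering_weights(n_elements, d_over_lambda, null_angles_deg, steer_deg=90.0):
    """Weights for a uniform linear array with nulls at the given angles
    (degrees from endfire), via roots of the array polynomial."""
    psi = lambda a: 2.0 * np.pi * d_over_lambda * np.cos(np.radians(a))
    # Roots on the unit circle at the desired null directions ...
    roots = [np.exp(1j * (psi(a) - psi(steer_deg))) for a in null_angles_deg]
    # ... padded with uniformly spaced "default" roots up to n_elements - 1.
    n_free = n_elements - 1 - len(roots)
    roots += [np.exp(2j * np.pi * (k + 1) / n_elements) for k in range(n_free)]
    w = np.poly(roots)[::-1]            # coefficients of prod(z - z_k), low order first
    return w / np.abs(w).sum()

def array_factor(w, d_over_lambda, angle_deg, steer_deg=90.0):
    n = np.arange(len(w))
    psi = 2.0 * np.pi * d_over_lambda * (np.cos(np.radians(angle_deg)) - np.cos(np.radians(steer_deg)))
    return abs(np.sum(w * np.exp(1j * n * psi)))

w = null_steering_weights(n_elements=8, d_over_lambda=0.5, null_angles_deg=[60.0, 70.0])
for a in (60.0, 70.0, 90.0):            # response is suppressed at 60 and 70 degrees
    print(f"{a:5.1f} deg: |AF| = {array_factor(w, 0.5, a):.2e}")
```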

  20. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
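
    To make the importance-sampling idea concrete, here is a toy sketch that estimates a small failure probability by sampling lifetimes from a biased (higher-failure-rate) distribution and reweighting with the likelihood ratio. The failure model and rates are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy model: a component "fails" if its exponential lifetime falls below the
# mission duration; with the true rate this is a rare event.
lam_true, lam_bias, t_mission = 1e-4, 1e-2, 10.0
p_exact = 1.0 - np.exp(-lam_true * t_mission)      # ~1e-3, analytic reference

n = 20_000
t = rng.exponential(1.0 / lam_bias, n)             # sample from the biased distribution
fail = t < t_mission
# Likelihood ratio of the true density to the biased density at each sample.
w = (lam_true * np.exp(-lam_true * t)) / (lam_bias * np.exp(-lam_bias * t))
p_is = np.mean(fail * w)                           # importance-sampling estimate

p_plain = np.mean(rng.exponential(1.0 / lam_true, n) < t_mission)
print(f"exact {p_exact:.2e}, importance sampling {p_is:.2e}, plain Monte Carlo {p_plain:.2e}")
```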

  1. Comparison of Minimally and More Invasive Methods of Determining Mixed Venous Oxygen Saturation.

    PubMed

    Smit, Marli; Levin, Andrew I; Coetzee, Johan F

    2016-04-01

    To investigate the accuracy of a minimally invasive, 2-step, lookup method for determining mixed venous oxygen saturation compared with conventional techniques. Single-center, prospective, nonrandomized, pilot study. Tertiary care hospital, university setting. Thirteen elective cardiac and vascular surgery patients. All participants received intra-arterial and pulmonary artery catheters. Minimally invasive oxygen consumption and cardiac output were measured using a metabolic module and lithium-calibrated arterial waveform analysis (LiDCO; LiDCO, London), respectively. For the minimally invasive method, Step 1 involved these minimally invasive measurements, and arterial oxygen content was entered into the Fick equation to calculate mixed venous oxygen content. Step 2 used an oxyhemoglobin curve spreadsheet to look up mixed venous oxygen saturation from the calculated mixed venous oxygen content. The conventional "invasive" technique used pulmonary artery intermittent thermodilution cardiac output, direct sampling of mixed venous and arterial blood, and the "reverse-Fick" method of calculating oxygen consumption. LiDCO overestimated thermodilution cardiac output by 26%. Pulmonary artery catheter-derived oxygen consumption underestimated metabolic module measurements by 27%. Mixed venous oxygen saturation differed between techniques; the calculated values underestimated the direct measurements by between 12% to 26.3%, this difference being statistically significant. The magnitude of the differences between the minimally invasive and invasive techniques was too great for the former to act as a surrogate of the latter and could adversely affect clinical decision making. Copyright © 2016 Elsevier Inc. All rights reserved.
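
    A bare-bones sketch of the two-step arithmetic. A simplified oxygen-content relation (1.34 mL O2 per gram of haemoglobin, dissolved oxygen neglected) stands in for the study's oxyhaemoglobin-curve lookup spreadsheet, and all values are illustrative:

```python
def mixed_venous_o2_content(vo2_ml_min, cardiac_output_l_min, cao2_ml_dl):
    """Step 1: rearranged Fick equation, CvO2 = CaO2 - VO2 / (10 * CO)."""
    return cao2_ml_dl - vo2_ml_min / (10.0 * cardiac_output_l_min)

def svo2_from_content(cvo2_ml_dl, hb_g_dl):
    """Step 2: invert a simplified O2-content relation instead of the
    oxyhaemoglobin-curve lookup (dissolved O2 ignored)."""
    return cvo2_ml_dl / (1.34 * hb_g_dl)

# Illustrative values: VO2 250 mL/min, CO 5 L/min, Hb 14 g/dL, SaO2 98%.
hb = 14.0
cao2 = 1.34 * hb * 0.98                            # ~18.4 mL O2/dL
cvo2 = mixed_venous_o2_content(250.0, 5.0, cao2)
print(f"CvO2 ~ {cvo2:.1f} mL/dL, SvO2 ~ {svo2_from_content(cvo2, hb):.0%}")
```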

  2. STRATEGIES FOR QUANTIFYING PET IMAGING DATA FROM TRACER STUDIES OF BRAIN RECEPTORS AND ENZYMES.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logan, J.

    2001-04-02

    A description of some of the methods used in neuroreceptor imaging to distinguish changes in receptor availability has been presented in this chapter. It is necessary to look beyond regional uptake of the tracer since uptake generally is affected by factors other than the number of receptors for which the tracer has affinity. An exception is the infusion method producing an equilibrium state. The techniques vary in complexity, some requiring arterial blood measurements of unmetabolized tracer and multiple time uptake data. Others require only a few plasma and uptake measurements, and those based on a reference region require no plasma measurements. We have outlined some of the limitations of the different methods. Laruelle (1999) has pointed out that test/retest studies to which various methods can be applied are crucial in determining the optimal method for a particular study. The choice of method will also depend upon the application. In a clinical setting, methods not involving arterial blood sampling are generally preferred. In the future, techniques for externally measuring arterial plasma radioactivity with only a few blood samples for metabolite correction will extend the modeling options of clinical PET. Also, since parametric images can provide information beyond that of ROI analysis, improved techniques for generating such images will be important, particularly for ligands requiring more than a one-compartment model. Techniques such as the wavelet transform proposed by Turkheimer et al. (2000) may prove to be important in reducing noise and improving quantitation.

  3. Laser fringe anemometry for aero engine components

    NASA Technical Reports Server (NTRS)

    Strazisar, A. J.

    1986-01-01

    Advances in flow measurement techniques in turbomachinery continue to be paced by the need to obtain detailed data for use in validating numerical predictions of the flowfield and for use in the development of empirical models for those flow features which cannot be readily modelled numerically. The use of laser anemometry in turbomachinery research has grown over the last 14 years in response to these needs. Based on past applications and current developments, this paper reviews the key issues which are involved when considering the application of laser anemometry to the measurement of turbomachinery flowfields. Aspects of laser fringe anemometer optical design which are applicable to turbomachinery research are briefly reviewed. Application problems which are common to both laser fringe anemometry (LFA) and laser transit anemometry (LTA) such as seed particle injection, optical access to the flowfield, and measurement of rotor rotational position are covered. The efficiency of various data acquisition schemes is analyzed and issues related to data integrity and error estimation are addressed. Real-time data analysis techniques aimed at capturing flow physics in real time are discussed. Finally, data reduction and analysis techniques are discussed and illustrated using examples taken from several LFA turbomachinery applications.

  4. Prospects for x-ray polarimetry measurements of magnetic fields in magnetized liner inertial fusion plasmas.

    PubMed

    Lynn, Alan G; Gilmore, Mark

    2014-11-01

    Magnetized Liner Inertial Fusion (MagLIF) experiments, in which a metal liner is imploded to compress a magnetized seed plasma, may generate peak magnetic fields of ∼10^4 T (100 megagauss) over small volumes (∼10^-10 m^3) at high plasma densities (∼10^28 m^-3) on 100 ns time scales. Such conditions are extremely challenging to diagnose. We discuss the possibility of, and issues involved in, using polarimetry techniques at x-ray wavelengths to measure magnetic fields under these extreme conditions.

  5. Radiative lifetimes, branching fractions, and oscillator strengths of some levels in Be I

    NASA Astrophysics Data System (ADS)

    Wang, Xinghao; Quinet, Pascal; Li, Qiu; Yu, Qi; Li, Yongfan; Wang, Qian; Gong, Yimin; Dai, Zhenwen

    2018-06-01

    Radiative lifetimes of five levels in Be I lying in the energy range 64,506.45-71,160.52 cm^-1 were measured by the time-resolved laser-induced fluorescence technique. These new data, together with previously measured radiative lifetimes and two reliable calculated lifetimes, were combined with branching fractions obtained from pseudo-relativistic Hartree-Fock calculations to deduce semi-empirical transition probabilities and oscillator strengths for 90 Be I spectral lines involving upper levels ranging from 42,565.35 to 72,251.27 cm^-1.

  6. Description of the meteoroid detection experiment flown on the Pioneer 10 and 11 Jupiter flyby missions

    NASA Technical Reports Server (NTRS)

    Oneal, R. L. (Compiler)

    1974-01-01

    The meteoroid detection experiment has the objective of measuring the population of 10^-9 and 10^-8 gram mass particles in interplanetary space, with emphasis on making these measurements in the Asteroid Belt. The instrument design, which uses the pressurized-cell-penetration detection technique, and the tests involved in obtaining a flight-qualified instrument are described. The successful demonstration that flight-quality penetration detectors function properly under long-term simulated space environments is also described.

  7. Frequency and time resolved measurements at rotating ring-disk electrodes for studying localized corrosion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huet, F.; Keddam, M.; Takenouti, H.

    1993-07-01

    By conferring frequency and time resolution on the rotating ring-disk electrode technique, original information can be obtained on the mechanism of corrosion processes involving the formation of intermediate, passive, or corrosion product layers. The methodology that allows the measurement of the actual flux of chemical species generated by a localized corrosion site is described, which takes into account the usual parameters of the RRDE and the location of the active spot on the disk surface. Application to pitting corrosion of iron by Cl^-

  8. Radar cross section studies

    NASA Technical Reports Server (NTRS)

    Burnside, W. D.; Dominek, A. K.; Gupta, I. J.; Newman, E. H.; Pathak, P. H.; Peters, L., Jr.

    1987-01-01

    The ultimate goal is to generate experimental techniques and computer codes of rather general capability that would enable the aerospace industry to evaluate the scattering properties of aerodynamic shapes. Another goal involves developing an understanding of scattering mechanisms so that modification of the vehicular structure could be introduced within constraints set by aerodynamics. The development of indoor scattering measurement systems with special attention given to the compact range is another goal. There has been considerable progress in advancing state-of-the-art scattering measurements and control and analysis of the electromagnetic scattering from general targets.

  9. Electrical properties of CZTS thin films

    NASA Astrophysics Data System (ADS)

    Rao, M. C.; Kumar, M. Seshu; Lakshmi, K.; Rao, K. Koteswara; Parimala, M. P. D.; Basha, S. K. Shahenoor

    2018-05-01

    CZTS (Cu2ZnSnS4) thin films have been coated onto FTO and MO glass substrates by a single-step electrodeposition process. Different characterization techniques, such as DSC and Raman studies, were performed on the prepared samples. The phase transition and weight loss of the precursors were measured by DSC analysis. The Raman spectrum is used to identify the functional groups and chemical structure involved in the materials. Electrical measurements confirm the nature of the film and also depend on the charge concentration present in the samples.

  10. New Challenges in Tribology: Wear Assessment Using 3D Optical Scanners

    PubMed Central

    Valigi, Maria Cristina; Logozzo, Silvia; Affatato, Saverio

    2017-01-01

    Wear is a significant mechanical and clinical problem. To acquire further knowledge on the tribological phenomena that involve freeform mechanical components or medical prostheses, wear tests are performed on biomedical and industrial materials in order to solve or reduce failures or malfunctions due to material loss. Scientific and technological advances in the field of optical scanning allow the application of innovative devices for wear measurements, leading to improvements that were unimaginable until a few years ago. It is therefore important to develop techniques, based on new instrumentations, for more accurate and reproducible measurements of wear. The aim of this work is to discuss the use of innovative 3D optical scanners and an experimental procedure to detect and evaluate wear, comparing this technique with other wear evaluation methods for industrial components and biomedical devices. PMID:28772905

  11. New Challenges in Tribology: Wear Assessment Using 3D Optical Scanners.

    PubMed

    Valigi, Maria Cristina; Logozzo, Silvia; Affatato, Saverio

    2017-05-18

    Wear is a significant mechanical and clinical problem. To acquire further knowledge on the tribological phenomena that involve freeform mechanical components or medical prostheses, wear tests are performed on biomedical and industrial materials in order to solve or reduce failures or malfunctions due to material loss. Scientific and technological advances in the field of optical scanning allow the application of innovative devices for wear measurements, leading to improvements that were unimaginable until a few years ago. It is therefore important to develop techniques, based on new instrumentations, for more accurate and reproducible measurements of wear. The aim of this work is to discuss the use of innovative 3D optical scanners and an experimental procedure to detect and evaluate wear, comparing this technique with other wear evaluation methods for industrial components and biomedical devices.

  12. Simultaneous optical and electrical recording of a single ion-channel.

    PubMed

    Ide, Toru; Takeuchi, Yuko; Aoki, Takaaki; Yanagida, Toshio

    2002-10-01

    In recent years, the single-molecule imaging technique has proven to be a valuable tool in solving many basic problems in biophysics. The technique used to measure single-molecule functions was initially developed to study electrophysiological properties of channel proteins. However, the technology to visualize single channels at work has not received as much attention. In this study, we have, for the first time, simultaneously measured the optical and electrical properties of single-channel proteins. The large conductance calcium-activated potassium channel (BK-channel) labeled with fluorescent dye molecules was incorporated into a planar bilayer membrane and the fluorescent image captured with a total internal reflection fluorescence microscope simultaneously with single-channel current recording. This innovative technology will greatly advance the study of channel proteins as well as signal transduction processes that involve ion permeation.

  13. Research in cosmic and gamma ray astrophysics

    NASA Technical Reports Server (NTRS)

    Stone, Edward C.; Mewaldt, Richard A.; Prince, Thomas A.

    1992-01-01

    Discussed here is research in cosmic ray and gamma ray astrophysics at the Space Radiation Laboratory (SRL) of the California Institute of Technology. The primary activities discussed involve the development of new instrumentation and techniques for future space flight. In many cases these instrumentation developments were tested in balloon flight instruments designed to conduct new investigations in cosmic ray and gamma ray astrophysics. The results of these investigations are briefly summarized. Specific topics include a quantitative investigation of the solar modulation of cosmic ray protons and helium nuclei, a study of cosmic ray positron and electron spectra in interplanetary and interstellar space, the solar modulation of cosmic rays, an investigation of techniques for the measurement and interpretation of cosmic ray isotopic abundances, and a balloon measurement of the isotopic composition of galactic cosmic ray boron, carbon, and nitrogen.

  14. High-speed technique based on a parallel projection correlation procedure for digital image correlation

    NASA Astrophysics Data System (ADS)

    Zaripov, D. I.; Renfu, Li

    2018-05-01

    The implementation of high-efficiency digital image correlation methods based on a zero-normalized cross-correlation (ZNCC) procedure for high-speed, time-resolved measurements using a high-resolution digital camera is associated with big data processing and is often time consuming. In order to speed up ZNCC computation, a high-speed technique based on a parallel projection correlation procedure is proposed. The proposed technique involves the use of interrogation window projections instead of its two-dimensional field of luminous intensity. This simplification allows acceleration of ZNCC computation by up to 28.8 times compared to ZNCC calculated directly, depending on the size of the interrogation window and region of interest. The results of three synthetic test cases, such as a one-dimensional uniform flow, a linear shear flow and a turbulent boundary-layer flow, are discussed in terms of accuracy. In the latter case, the proposed technique is implemented together with an iterative window-deformation technique. On the basis of the results of the present work, the proposed technique is recommended to be used for initial velocity field calculation, with further correction using more accurate techniques.
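
    To make the projection idea concrete, here is a minimal sketch (not the authors' implementation) of estimating the displacement between two interrogation windows by applying a 1-D ZNCC to their row and column projections rather than to the full two-dimensional intensity field; the window size, search range, and function names are illustrative assumptions.

    ```python
    import numpy as np

    def zncc_1d(a, b):
        """Zero-normalized cross-correlation of two equal-length 1-D signals."""
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float(np.mean(a * b))

    def projection_displacement(win0, win1, max_shift=5):
        """Estimate (dy, dx) between two same-sized interrogation windows by
        correlating their row and column intensity projections."""
        result = {}
        for axis, name in ((1, "dy"), (0, "dx")):      # axis=1 sums rows -> profile along y
            p0, p1 = win0.sum(axis=axis), win1.sum(axis=axis)
            scores = []
            for s in range(-max_shift, max_shift + 1): # test integer shifts of the profiles
                seg0 = p0[max(0, -s): len(p0) - max(0, s)]
                seg1 = p1[max(0, s): len(p1) - max(0, -s)]
                scores.append(zncc_1d(seg0, seg1))
            result[name] = int(np.argmax(scores)) - max_shift
        return result["dy"], result["dx"]

    # Example with a synthetic 3-pixel shift in x:
    img = np.random.rand(64, 64)
    print(projection_displacement(img, np.roll(img, 3, axis=1)))   # -> (0, 3)
    ```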

  15. Active Interrogation using Photofission Technique for Nuclear Materials Control and Accountability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Haori

    2016-03-31

    Innovative systems with increased sensitivity and resolution are in great demand to detect diversion and to prevent misuse in support of nuclear materials management for the U.S. fuel cycle. Nuclear fission is the most important multiplicative process involved in non-destructive active interrogation. This process produces the most easily recognizable signature for nuclear materials. In addition to thermal or high-energy neutrons, high-energy gamma rays can also excite a nucleus and cause fission through a process known as photofission. Electron linear accelerators (linacs) are widely used as the interrogating photon sources for inspection methods involving the photofission technique. After photofission reactions, prompt signals are much stronger than the delayed signals, but it is difficult to quantify them in practical measurements. Delayed signals are easily distinguishable from the interrogating radiation. Linac-based, advanced inspection techniques utilizing the delayed signals after photofission have been extensively studied for homeland security applications. Previous research also showed that a unique delayed gamma ray energy spectrum exists for each fissionable isotope. In this work, high-energy delayed γ-rays were demonstrated to be signatures for detection, identification, and quantification of special nuclear materials. Such γ-rays were measured in between linac pulses using independent data acquisition systems. A list-mode system was developed to measure low-energy delayed γ-rays after irradiation. Photofission product yields of 238U and 239Pu were determined based on the measured delayed γ-ray spectra. The differential yields of delayed γ-rays were also proven to be able to discriminate nuclear from non-nuclear materials. The measurement outcomes were compared with Monte Carlo simulation results. It was demonstrated that the currently available codes have capabilities and limitations in the simulation of the photofission process. A two-fold approach was used to address the high-rate challenge in used nuclear fuel assay based on the photofission technique. First, a standard HPGe preamplifier was modified to improve its capabilities in a high-rate pulsed photofission environment. Second, advanced pulse processing algorithms were shown to greatly improve throughput rate without a large sacrifice in energy resolution at ultra-high input count rates. Two customized gamma spectroscopy systems were also developed in real-time on FPGAs. They were shown to have promising performance matching available commercial units.

  16. Electrochemical Assay of Gold-Plating Solutions

    NASA Technical Reports Server (NTRS)

    Chiodo, R.

    1982-01-01

    Gold content of a plating solution is assayed by a simple method that requires only ordinary electrochemical laboratory equipment and materials. The technique involves electrodeposition of gold from solution onto an electrode, the weight gain of which is measured. Suitable fast assay methods are economically and practically necessary in the electronics and decorative-plating industries. If the gold content in the plating bath is too low, poor plating may result, with consequent economic loss to the user.

  17. Mu Wave Suppression during the Perception of Meaningless Syllables: EEG Evidence of Motor Recruitment

    ERIC Educational Resources Information Center

    Crawcour, Stephen; Bowers, Andrew; Harkrider, Ashley; Saltuklaroglu, Tim

    2009-01-01

    Motor involvement in speech perception has been recently studied using a variety of techniques. In the current study, EEG measurements from Cz, C3 and C4 electrodes were used to examine the relative power of the mu rhythm (i.e., 8-13 Hz) in response to various audio-visual speech and non-speech stimuli, as suppression of these rhythms is…

  18. Measuring nursing care and compassion: the McDonaldised nurse?

    PubMed

    Bradshaw, A

    2009-08-01

    In June 2008 the UK government, supported by the Royal College of Nursing, stated that nursing care would be measured for compassion. This paper considers the implications of this statement by critically examining the relationship of compassion to care from a variety of perspectives. It is argued that the current market-driven approaches to healthcare involve redefining care as a pale imitation, even parody, of the traditional approach of the nurse as "my brother's keeper". Attempts to measure such parody can only measure artificial techniques and give rise to a McDonald's-type nursing care rather than heartfelt care. The arguments of this paper, although applied to nursing, also apply to medicine and healthcare generally.

  19. Quantification of sauter mean diameter in diesel sprays using scattering-absorption extinction measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinez, Gabrielle L; Magnotti, Gina M; Knox, Benjamin W

    Quantitative measurements of the primary breakup process in diesel sprays are lacking due to a range of experimental and diagnostic challenges, including: high droplet number density environments, very small characteristic drop size scales (~1-10 μm), and high characteristic velocities in the primary breakup region (~600 m/s). Due to these challenges, existing measurement techniques have failed to resolve a sufficient range of the temporal and spatial scales involved and much remains unknown about the primary atomization process in practical diesel sprays. To gain a better insight into this process, we have developed a joint visible and x-ray extinction measurement technique to quantify axial and radial distributions of the path-integrated Sauter Mean Diameter (SMD) and Liquid Volume Fraction (LVF) for diesel-like sprays. This technique enables measurement of the SMD in regions of moderate droplet number density, enabling construction of the temporal history of drop size development within practical diesel sprays. The experimental campaign was conducted jointly at the Georgia Institute of Technology and Argonne National Laboratory using the Engine Combustion Network "Spray D" injector. X-ray radiography liquid absorption measurements, conducted at the Advanced Photon Source at Argonne, quantify the liquid-fuel mass and volume distribution in the spray. Diffused back-illumination liquid scattering measurements were conducted at Georgia Tech to quantify the optical thickness throughout the spray. By application of Mie-scatter equations, the ratio of the absorption and scattering extinction measurements is demonstrated to yield solutions for the SMD. This work introduces the newly developed scattering-absorption measurement technique and highlights the important considerations that must be taken into account when jointly processing these measurements to extract the SMD. These considerations include co-alignment of measurements taken at different institutions, identification of viable regions where the measurement ratio can be accurately interpreted, and uncertainty analysis in the measurement ratio and resulting SMD. Because the measurement technique provides the spatial history of the SMD development, it is expected to be especially informative to the diesel spray modeling community. Results from this work will aid in understanding the effect of ambient densities and injection pressures on primary breakup and help assess the appropriateness of spray submodels for engine computational fluid dynamics codes.
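
    The core of the ratio technique can be sketched in a few lines. The snippet below is an illustration under simplifying assumptions (the x-ray measurement is reduced to a path-integrated liquid volume per unit area, the visible measurement to an optical thickness, and the Mie extinction efficiency is taken as a constant Q_ext ≈ 2); it is not the authors' processing chain and the numbers are not calibrated values.

    ```python
    def path_integrated_smd(plv_m, tau_vis, q_ext=2.0):
        """SMD = sum(d^3)/sum(d^2) = (3/2) * Q_ext * PLV / tau for a polydisperse
        droplet field, since PLV ~ (pi/6)*sum(n d^3) and tau ~ (pi/4)*Q_ext*sum(n d^2)
        along the line of sight."""
        return 1.5 * q_ext * plv_m / tau_vis

    # Example (hypothetical reading): 30 um of projected liquid path and an optical
    # thickness of 9 give an SMD of about 10 um.
    print(path_integrated_smd(plv_m=30e-6, tau_vis=9.0) * 1e6, "um")
    ```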

  20. Experimental investigations of helium cryotrapping by argon frost

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mack, A.; Perinic, D.; Murdoch, D.

    1992-03-01

    At the Karlsruhe Nuclear Research Centre (KfK) cryopumping techniques are being investigated by which the gaseous exhausts from the NET/ITER reactor can be pumped out during the burn and dwell times. Cryosorption and cryotrapping are techniques which are suitable for this task. The target of the investigations is to test the techniques under NET/ITER conditions and to determine optimum design data for a prototype. They involve measurement of the pumping speed as a function of the gas composition, gas flow and loading condition of the pump surfaces. The following parameters are subjected to variations: Ar/He ratio, specific helium volume flow rate, cryosurface temperature, process gas composition, impurities in argon trapping gas, three-stage operation and two-stage operation. This paper is a description of the experiments on argon trapping techniques started in 1990. Eleven tests as well as the results derived from them are described.

  1. Surface Plasmon Resonance: New Biointerface Designs and High-Throughput Affinity Screening

    NASA Astrophysics Data System (ADS)

    Linman, Matthew J.; Cheng, Quan Jason

    Surface plasmon resonance (SPR) is a surface optical technique that measures minute changes in refractive index at a metal-coated surface. It has become increasingly popular in the study of biological and chemical analytes because of its label-free measurement feature. In addition, SPR allows for both quantitative and qualitative assessment of binding interactions in real time, making it ideally suited for probing weak interactions that are often difficult to study with other methods. This chapter presents the biosensor development in the last 3 years or so utilizing SPR as the principal analytical technique, along with a concise background of the technique itself. While SPR has demonstrated many advantages, it is a nonselective method and so, building reproducible and functional interfaces is vital to sensing applications. This chapter, therefore, focuses mainly on unique surface chemistries and assay approaches to examine biological interactions with SPR. In addition, SPR imaging for high-throughput screening based on microarrays and novel hyphenated techniques involving the coupling of SPR to other analytical methods are discussed. The chapter concludes with a commentary on the current state of SPR biosensing technology and the general direction of future biosensor research.

  2. Measurements of a potential interference with laser-induced fluorescence measurements of ambient OH from the ozonolysis of biogenic alkenes

    NASA Astrophysics Data System (ADS)

    Rickly, Pamela; Stevens, Philip S.

    2018-01-01

    Reactions of the hydroxyl radical (OH) play a central role in the chemistry of the atmosphere, and measurements of its concentration can provide a rigorous test of our understanding of atmospheric oxidation. Several recent studies have shown large discrepancies between measured and modeled OH concentrations in forested areas impacted by emissions of biogenic volatile organic compounds (BVOCs), where modeled concentrations were significantly lower than measurements. A potential reason for some of these discrepancies involves interferences associated with the measurement of OH using the laser-induced fluorescence-fluorescence assay by gas expansion (LIF-FAGE) technique in these environments. In this study, a turbulent flow reactor operating at atmospheric pressure was coupled to a LIF-FAGE cell and the OH signal produced from the ozonolysis of α-pinene, β-pinene, ocimene, isoprene, and 2-methyl-3-buten-2-ol (MBO) was measured. To distinguish between OH produced from the ozonolysis reactions and any OH artifact produced inside the LIF-FAGE cell, an external chemical scrubbing technique was used, allowing for the direct measurement of any interference. An interference under high ozone (between 2 × 10^13 and 10 × 10^13 cm^-3) and BVOC concentrations (between approximately 0.1 × 10^12 and 40 × 10^12 cm^-3) was observed that was not laser generated and was independent of the ozonolysis reaction time. For the ozonolysis of α- and β-pinene, the observed interference accounted for approximately 40 % of the total OH signal, while for the ozonolysis of ocimene the observed interference accounted for approximately 70 % of the total OH signal. Addition of acetic acid to the reactor eliminated the interference, suggesting that the source of the interference in these experiments involved the decomposition of stabilized Criegee intermediates (SCIs) inside the FAGE detection cell. Extrapolation of these measurements to ambient concentrations suggests that these interferences should be below the detection limit of the instrument.

  3. Picturing pathogen infection in plants.

    PubMed

    Barón, Matilde; Pineda, Mónica; Pérez-Bueno, María Luisa

    2016-09-01

    Several imaging techniques have provided valuable tools to evaluate the impact of biotic stress on host plants. The use of these techniques enables the study of plant-pathogen interactions by analysing the spatial and temporal heterogeneity of foliar metabolism during pathogenesis. In this work we review the use of imaging techniques based on chlorophyll fluorescence, multicolour fluorescence and thermography for the study of virus, bacteria and fungi-infected plants. These studies have revealed the impact of pathogen challenge on photosynthetic performance, secondary metabolism, as well as leaf transpiration as a promising tool for field and greenhouse management of diseases. Images of standard chlorophyll fluorescence (Chl-F) parameters obtained during Chl-F induction kinetics related to photochemical processes and those involved in energy dissipation, could be good stress indicators to monitor pathogenesis. Changes on UV-induced blue (F440) and green fluorescence (F520) measured by multicolour fluorescence imaging in pathogen-challenged plants seem to be related with the up-regulation of the plant secondary metabolism and with an increase in phenolic compounds involved in plant defence, such as scopoletin, chlorogenic or ferulic acids. Thermal imaging visualizes the leaf transpiration map during pathogenesis and emphasizes the key role of stomata on innate plant immunity. Using several imaging techniques in parallel could allow obtaining disease signatures for a specific pathogen. These techniques have also turned out to be very useful for presymptomatic pathogen detection, and powerful non-destructive tools for precision agriculture. Their applicability at lab-scale, in the field by remote sensing, and in high-throughput plant phenotyping, makes them particularly useful. Thermal sensors are widely used in crop fields to detect early changes in leaf transpiration induced by both air-borne and soil-borne pathogens. The limitations of measuring photosynthesis by Chl-F at the canopy level are being solved, while the use of multispectral fluorescence imaging is very challenging due to the type of light excitation that is used.

  4. Reliability of System Identification Techniques to Assess Standing Balance in Healthy Elderly

    PubMed Central

    Maier, Andrea B.; Aarts, Ronald G. K. M.; van Gerven, Joop M. A.; Arendzen, J. Hans; Schouten, Alfred C.; Meskers, Carel G. M.; van der Kooij, Herman

    2016-01-01

    Objectives System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed loop system identification techniques. Methods In twelve healthy elderly subjects, balance tests were performed twice a day on three days. Body sway was measured during two minutes of standing with eyes closed, and the Balance test Room (BalRoom) was used to apply four disturbances simultaneously: two sensory disturbances, to the proprioceptive and the visual system, and two mechanical disturbances applied at the leg and trunk segment. Using system identification techniques, sensitivity functions of the sensory disturbances and the neuromuscular controller were estimated. Based on generalizability theory (G theory), systematic errors and sources of variability were assessed using linear mixed models, and reliability was assessed by computing indexes of dependability (ID), standard error of measurement (SEM) and minimal detectable change (MDC). Results A systematic error was found between the first and second trial in the sensitivity functions. No systematic error was found in the neuromuscular controller and body sway. The reliability of 15 of 25 parameters and body sway was moderate to excellent when the results of two trials on three days were averaged. To reach an excellent reliability on one day in 7 out of 25 parameters, it was predicted that at least seven trials must be averaged. Conclusion This study shows that system identification techniques are a promising method to assess the underlying systems involved in standing balance in elderly people. However, most of the parameters do not appear to be reliable unless a large number of trials are collected across multiple days. To reach an excellent reliability in one third of the parameters, a training session for participants is needed and at least seven trials of two minutes must be performed on one day. PMID:26953694
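
    As a pointer to how such reliability indices are typically computed, the sketch below uses the common definitions SEM = SD·sqrt(1 − R) and MDC95 = 1.96·sqrt(2)·SEM, with R standing in for a reliability or dependability coefficient; the values are illustrative and the study's G-theory mixed-model estimates are not reproduced here.

    ```python
    import math

    def sem(sd_between, reliability):
        """Standard error of measurement from between-subject SD and reliability R."""
        return sd_between * math.sqrt(1.0 - reliability)

    def mdc95(sem_value):
        """Minimal detectable change at the 95% confidence level."""
        return 1.96 * math.sqrt(2.0) * sem_value

    s = sem(sd_between=1.2, reliability=0.80)   # hypothetical parameter values
    print(f"SEM = {s:.2f}, MDC95 = {mdc95(s):.2f}")
    ```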

  5. Combining Temporal and Spectral Information with Spatial Mapping to Identify Differences between Phonological and Semantic Networks: A Magnetoencephalographic Approach

    PubMed Central

    McNab, Fiona; Hillebrand, Arjan; Swithenby, Stephen J.; Rippon, Gina

    2012-01-01

    Early, lesion-based models of language processing suggested that semantic and phonological processes are associated with distinct temporal and parietal regions respectively, with frontal areas more indirectly involved. Contemporary spatial brain mapping techniques have not supported such clear-cut segregation, with strong evidence of activation in left temporal areas by both processes and disputed evidence of involvement of frontal areas in both processes. We suggest that combining spatial information with temporal and spectral data may allow a closer scrutiny of the differential involvement of closely overlapping cortical areas in language processing. Using beamforming techniques to analyze magnetoencephalography data, we localized the neuronal substrates underlying primed responses to nouns requiring either phonological or semantic processing, and examined the associated measures of time and frequency in those areas where activation was common to both tasks. Power changes in the beta (14–30 Hz) and gamma (30–50 Hz) frequency bands were analyzed in pre-selected time windows of 350–550 and 500–700 ms. In left temporal regions, both tasks elicited power changes in the same time window (350–550 ms), but with different spectral characteristics, low beta (14–20 Hz) for the phonological task and high beta (20–30 Hz) for the semantic task. In frontal areas (BA10), both tasks elicited power changes in the gamma band (30–50 Hz), but in different time windows, 500–700 ms for the phonological task and 350–550 ms for the semantic task. In the left inferior parietal area (BA40), both tasks elicited changes in the 20–30 Hz beta frequency band but in different time windows, 350–550 ms for the phonological task and 500–700 ms for the semantic task. Our findings suggest that, where spatial measures may indicate overlapping areas of involvement, additional beamforming techniques can demonstrate differential activation in time and frequency domains. PMID:22908001

  6. The OH + HBr reaction revisited

    NASA Technical Reports Server (NTRS)

    Ravishankara, A. R.; Wine, P. H.; Wells, J. R.

    1985-01-01

    Variable-temperature measurements of the rate coefficient k(1) for the reaction OH + HBr → Br + H2O are presented. The measurements are verified by two techniques: one involved a 266-nm pulsed-laser photolysis of O3/H2O/HBr/He mixtures in conjunction with time-resolved resonance fluorescence detection of OH, the second comprised pulsed laser-induced fluorescence detection of OH following 248-nm pulsed-laser photolysis of H2O2/HBr/Ar mixtures. It is reported that k(1) = (11.9 ± 1.4) × 10^-12 cm^3 molecule^-1 s^-1, independent of temperature. The measurements are compared with other available results.
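
    Such rate coefficients are conventionally extracted from pseudo-first-order OH decays. The sketch below illustrates that analysis with synthetic numbers (the slope of the measured decay rate against [HBr] gives k(1)); it is not the paper's data reduction, and the loss rate and concentrations are assumed for illustration.

    ```python
    import numpy as np

    # With HBr in large excess, [OH](t) decays as exp(-k' t) with
    # k' = k1*[HBr] + k_loss, so k1 is the slope of k' versus [HBr].
    hbr = np.array([0.5, 1.0, 2.0, 4.0]) * 1e14        # molecule cm^-3 (assumed)
    k_loss = 80.0                                        # s^-1, background OH loss (assumed)
    k1_true = 1.19e-11                                   # cm^3 molecule^-1 s^-1
    k_prime = k1_true * hbr + k_loss                     # synthetic "measured" decay rates

    slope, intercept = np.polyfit(hbr, k_prime, 1)       # linear fit of k' vs [HBr]
    print(f"k1 = {slope:.2e} cm^3 molecule^-1 s^-1, loss = {intercept:.0f} s^-1")
    ```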

  7. Inversion of solar extinction data from the Apollo-Soyuz Test Project Stratospheric Aerosol Measurement (ASTP/SAM) experiment

    NASA Technical Reports Server (NTRS)

    Pepin, T. J.

    1977-01-01

    The inversion methods that have been used to determine the vertical profile of the extinction coefficient due to stratospheric aerosols from data measured during the ASTP/SAM solar occultation experiment are reported. Inversion methods include the onion skin peel technique and methods of solving the Fredholm equation for the problem subject to smoothing constraints. The latter of these approaches involves a double inversion scheme. Comparisons are made between the inverted results from the SAM experiment and near-simultaneous measurements made by lidar and a balloon-borne dustsonde. The results are used to demonstrate the assumptions required to perform the inversions for aerosols.

  8. Evidence from a partial report task for forgetting in dynamic spatial memory.

    PubMed

    Gugerty, L

    1998-09-01

    G. Sperling (1960) and others have investigated memory for briefly presented stimuli by using a partial versus whole report technique in which participants sometimes reported part of a stimulus array and sometimes reported all of it. For simple, static stimulus displays, the partial report technique showed that participants could recall most of the information in the stimulus array but that this information faded quickly when participants engaged in whole report recall. An experiment was conducted that applied the partial report method to a task involving complex displays of moving objects. In the experiment, 26 participants viewed cars in a low-fidelity driving simulator and then reported the locations of some or all of the cars in each scene. A statistically significant advantage was found for the partial report trials. This finding suggests that detailed spatial location information was forgotten from dynamic spatial memory over the 14 s that it took participants to recall whole report trials. The experiment results suggest better ways of measuring situation awareness. Partial report recall techniques may give a more accurate measure of people's momentary situation awareness than whole report techniques. Potential applications of this research include simulator-based measures of situation awareness ability that can be part of inexpensive test batteries to select people for real-time tasks (e.g., in a driver licensing battery) and to identify people who need additional training.

  9. An overview assessment of the effectiveness and global popularity of some methods used in measuring riverbank filtration

    NASA Astrophysics Data System (ADS)

    Umar, Da'u. Abba; Ramli, Mohammad Firuz; Aris, Ahmad Zaharin; Sulaiman, Wan Nor Azmin; Kura, Nura Umar; Tukur, Abubakar Ibrahim

    2017-07-01

    This paper presents an overview assessment of the effectiveness and popularity of some methods adopted in measuring riverbank filtration (RBF). The review is aimed at understanding some of the appropriate methods used in measuring riverbank filtration, their frequencies of use, and their spatial applications worldwide. The most commonly used methods and techniques in riverbank filtration studies are: Geographical Information System (GIS) (site suitability/surface characterization), Geophysical, Pumping Test and borehole logging (sub-surface), Hydrochemical, Geochemical, and Statistical techniques (hydrochemistry of water), Numerical modelling, Tracer techniques and Stable Isotope Approaches (degradation and contaminant attenuation processes). From the summary in Table 1, hydrochemical, numerical modelling and pumping test methods are the most frequently used and popular, while geophysical, GIS and statistical techniques are less attractive. However, many researchers prefer an integrated approach, especially since riverbank filtration studies involve diverse and interrelated components. In terms of spatial popularity and successful implementation of riverbank filtration, it is explicitly clear that the popularity and success of the technology are more pronounced in developed countries like the U.S. and most European countries. However, it is gradually gaining ground in Asia and Africa, although it is still in its infancy in Africa, where the technology could be more important considering the economic status of the region and its peculiar water resources predicaments.

  10. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the member's own Westinghouse department as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  11. Nuclear constraints on the age of the universe

    NASA Technical Reports Server (NTRS)

    Schramm, D. N.

    1982-01-01

    A review is made of how one can use nuclear physics to put rather stringent limits on the age of the universe and thus the cosmic distance scale. The age can be estimated to a fair degree of accuracy. No single measurement of the time since the Big Bang gives a specific, unambiguous age. There are several methods that together fix the age with surprising precision. In particular, there are three totally independent techniques for estimating an age and a fourth technique which involves finding consistency of the other three in the framework of the standard Big Bang cosmological model. The three independent methods are: cosmological dynamics, the age of the oldest stars, and radioactive dating. This paper concentrates on the third of the three methods, and the consistency technique.

  12. Differential dynamic microscopy to characterize Brownian motion and bacteria motility

    NASA Astrophysics Data System (ADS)

    Germain, David; Leocmach, Mathieu; Gibaud, Thomas

    2016-03-01

    We have developed a lab module for undergraduate students, which involves the process of quantifying the dynamics of a suspension of microscopic particles using Differential Dynamic Microscopy (DDM). DDM is a relatively new technique that constitutes an alternative method to more classical techniques such as dynamic light scattering (DLS) or video particle tracking (VPT). The technique consists of imaging a particle dispersion with a standard light microscope and a camera and analyzing the images using a digital Fourier transform to obtain the intermediate scattering function, an autocorrelation function that characterizes the dynamics of the dispersion. We first illustrate DDM in the textbook case of colloids under Brownian motion, where we measure the diffusion coefficient. Then we show that DDM is a pertinent tool to characterize biological systems such as motile bacteria.
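
    A minimal sketch of a DDM analysis along these lines is shown below, assuming a (T, N, N) image stack from the camera; the image structure function is the time-averaged power spectrum of frame differences, and for Brownian motion its relaxation rate at wavevector q is D·q², from which the diffusion coefficient follows. Function names and binning choices are illustrative, not the authors' lab-module code.

    ```python
    import numpy as np

    def ddm_structure_function(frames, lag):
        """Average |FFT(I(t+lag) - I(t))|^2 over all frame pairs at this lag."""
        frames = frames.astype(float)               # avoid integer wrap-around
        diffs = frames[lag:] - frames[:-lag]
        return (np.abs(np.fft.fft2(diffs)) ** 2).mean(axis=0)

    def radial_average(power, n_bins=40):
        """Azimuthal average of a 2-D power spectrum versus |q| (pixel^-1 units)."""
        n = power.shape[0]
        qy, qx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
        q = np.hypot(qx, qy).ravel()
        p = power.ravel()
        edges = np.linspace(0.0, 0.5, n_bins + 1)
        idx = np.digitize(q, edges)
        centers = 0.5 * (edges[:-1] + edges[1:])
        means = np.array([p[idx == i].mean() if np.any(idx == i) else np.nan
                          for i in range(1, n_bins + 1)])
        return centers, means

    # For Brownian particles, fitting D(q, tau) ~ A(q)*(1 - exp(-D*q**2*tau)) + B(q)
    # over several lags tau then yields the diffusion coefficient D.
    ```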

  13. Physiological correlates of mental workload

    NASA Technical Reports Server (NTRS)

    Zacharias, G. L.

    1980-01-01

    A literature review was conducted to assess the basis of and techniques for physiological assessment of mental workload. The study findings reviewed had shortcomings involving one or more of the following basic problems: (1) physiologic arousal can be easily driven by nonworkload factors, confounding any proposed metric; (2) the profound absence of underlying physiologic models has promulgated a multiplicity of seemingly arbitrary signal processing techniques; (3) the unspecified multidimensional nature of physiological "state" has given rise to a broad spectrum of competing noncommensurate metrics; and (4) the lack of an adequate definition of workload compels physiologic correlations to suffer either from the vagueness of implicit workload measures or from the variance of explicit subjective assessments. Using specific studies as examples, two basic signal processing/data reduction techniques in current use, time and ensemble averaging, are discussed.

  14. Comparative Study on the Different Testing Techniques in Tree Classification for Detecting the Learning Motivation

    NASA Astrophysics Data System (ADS)

    Juliane, C.; Arman, A. A.; Sastramihardja, H. S.; Supriana, I.

    2017-03-01

    Having motivation to learn is a requirement for a successful learning process, and it needs to be maintained properly. This study aims to measure learning motivation, especially in the process of electronic learning (e-learning). Here, a data mining approach was chosen as the research method. For the testing process, a comparative study of accuracy under different testing techniques was conducted, involving Cross Validation and Percentage Split. The best accuracy was generated by the J48 algorithm with the percentage split technique, reaching 92.19%. This study provided an overview of how to detect the presence of learning motivation in the context of e-learning. It is expected to be a good contribution to education and to help teachers identify the students for whom they have to provide motivation.
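
    The two testing techniques compared in the study can be reproduced in outline as follows. The sketch uses scikit-learn's CART decision tree as a stand-in for Weka's J48 (C4.5) and a public dataset in place of the learning-motivation data, which is not available here; the 10-fold and roughly 66/34 split choices are assumptions.

    ```python
    from sklearn.datasets import load_iris            # stand-in dataset (illustrative only)
    from sklearn.model_selection import cross_val_score, train_test_split
    from sklearn.tree import DecisionTreeClassifier   # CART tree as a J48 (C4.5) analogue

    X, y = load_iris(return_X_y=True)
    tree = DecisionTreeClassifier(random_state=0)

    # Testing technique 1: k-fold cross-validation (10 folds assumed)
    cv_acc = cross_val_score(tree, X, y, cv=10).mean()

    # Testing technique 2: percentage split (about 66% train / 34% test assumed)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.34, random_state=0)
    split_acc = tree.fit(X_tr, y_tr).score(X_te, y_te)

    print(f"Cross-validation: {cv_acc:.2%}   Percentage split: {split_acc:.2%}")
    ```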

  15. Key Techniques and Risk Management for the Application of the Pile-Beam-Arch (PBA) Excavation Method: A Case Study of the Zhongjie Subway Station

    PubMed Central

    Guan, Yong-ping; Zhao, Wen; Li, Shen-gang; Zhang, Guo-bin

    2014-01-01

    The design and construction of shallow-buried tunnels in densely populated urban areas involve many challenges. The ground movements induced by tunneling effects pose potential risks to infrastructure such as surface buildings, pipelines, and roads. In this paper, a case study of the Zhongjie subway station located in Shenyang, China, is examined to investigate the key construction techniques and the influence of the Pile-Beam-Arch (PBA) excavation method on the surrounding environment. This case study discusses the primary risk factors affecting the environmental safety and summarizes the corresponding risk mitigation measures and key techniques for subway station construction using the PBA excavation method in a densely populated urban area. PMID:25221783

  16. Decoupling pipeline influences in soil resistivity measurements with finite element techniques

    NASA Astrophysics Data System (ADS)

    Deo, R. N.; Azoor, R. M.; Zhang, C.; Kodikara, J. K.

    2018-03-01

    Periodic inspection of pipeline conditions is an important asset management strategy conducted by water and sewer utilities for efficient and economical operation of their assets in the field. The Level 1 pipeline condition assessment involving resistivity profiling along the pipeline right-of-way is a common technique for delineating pipe sections that might be installed in a highly corrosive soil environment. However, the technique can suffer from significant perturbations arising from the buried pipe itself, resulting in errors in native soil characterisation. To address this problem, a finite element model was developed to investigate the degree to which pipes of different a) diameters, b) burial depths, and c) surface conditions (bare or coated) can influence in-situ soil resistivity measurements using Wenner methods. It was found that the greatest errors can arise when conducting measurements over a bare pipe with the array aligned parallel to the pipe. Depending upon the pipe surface conditions, in-situ resistivity measurements can either underestimate or overestimate the true soil resistivities. Following results based on simulations and decoupling equations, a guiding framework for removing pipe influences from soil resistivity measurements was developed that can be easily used to correct measurements. The equations require simple a-priori information on the pipe diameter, burial depth, surface condition, and the array length and orientation used. Findings from this study have immediate application and are envisaged to be useful for critical civil infrastructure monitoring and assessment.
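
    For reference, the Wenner measurement that such a correction framework operates on reduces to a single formula, sketched below; the reading used in the example is hypothetical and the paper's decoupling equations themselves are not reproduced.

    ```python
    import math

    def wenner_apparent_resistivity(spacing_m, voltage_v, current_a):
        """Apparent resistivity rho_a = 2*pi*a*V/I for a Wenner array with electrode
        spacing a, injected current I, and potential V across the inner electrodes."""
        return 2.0 * math.pi * spacing_m * voltage_v / current_a

    # Illustrative reading: a = 1 m, V = 25 mV, I = 10 mA  ->  about 15.7 ohm-m
    print(wenner_apparent_resistivity(1.0, 0.025, 0.010))
    ```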

  17. Spatial calibration of an optical see-through head mounted display

    PubMed Central

    Gilson, Stuart J.; Fitzgibbon, Andrew W.; Glennerster, Andrew

    2010-01-01

    We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry. PMID:18599125

  18. Graphene-based terahertz photodetector by noise thermometry technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ming-Jye, E-mail: mingjye@asiss.sinica.edu.tw; Institute of Physics, Academia Sinica, Taipei 11529, Taiwan; Wang, Ji-Wun

    2014-01-20

    We report the characteristics of a graphene-based terahertz (THz) photodetector based on the noise thermometry technique, measuring its noise power at frequencies from 4 to 6 GHz. A hot electron system is generated in the graphene microbridge after THz photon pumping and creates extra noise power. The equivalent noise temperature and electron temperature increase rapidly in the low THz pumping regime and saturate gradually in the high THz power regime, which is attributed to a faster energy relaxation process involving stronger electron-phonon interaction. Based on this detector, a conversion efficiency of around 0.15 from THz power to noise power in the 4–6 GHz span has been achieved.

  19. Comet composition and density analyzer

    NASA Technical Reports Server (NTRS)

    Clark, B. C.

    1982-01-01

    Distinctions between cometary material and other extraterrestrial materials (meteorite suites and stratospherically-captured cosmic dust) are addressed. The technique of X-ray fluorescence (XRF) is employed for analysis of elemental composition. Concomitant with these investigations, the problem of collecting representative samples of comet dust (for rendezvous missions) was solved, and several related techniques such as mineralogic analysis (X-ray diffraction), direct analysis of the nucleus without docking (electron macroprobe), dust flux rate measurement, and test sample preparation were evaluated. An explicit experiment concept based upon X-ray fluorescence analysis of biased and unbiased sample collections was scoped and proposed for a future rendezvous mission with a short-period comet.

  20. Calibration of strontium-90 eye applicator using a strontium external beam standard.

    PubMed

    Siddle, D; Langmack, K

    1999-07-01

    Four techniques for measuring the dose rate from Sr-90 concave eye plaques are presented. The techniques involve calibrating a concave eye plaque against a Sr-90 teletherapy unit using X-Omat film, radiochromic film, black LiF TLD discs and LiF chips. The mean dose rate predicted by these dosimeters is 7.5 cGy s^-1. The dose rate quoted by the manufacturer is 33% lower than this value, which is consistent with discrepancies reported by other authors. Calibration against a 6 MV linear accelerator was also carried out using each of the above dosimetric devices, and appropriate sensitivity correction factors have been presented.

  1. Calibration of strontium-90 eye applicator using a strontium external beam standard

    NASA Astrophysics Data System (ADS)

    Siddle, D.; Langmack, K.

    1999-07-01

    Four techniques for measuring the dose rate from Sr-90 concave eye plaques are presented. The techniques involve calibrating a concave eye plaque against a Sr-90 teletherapy unit using X-Omat film, radiochromic film, black LiF TLD discs and LiF chips. The mean dose rate predicted by these dosimeters is 7.5 cGy s^-1. The dose rate quoted by the manufacturer is 33% lower than this value, which is consistent with discrepancies reported by other authors. Calibration against a 6 MV linear accelerator was also carried out using each of the above dosimetric devices, and appropriate sensitivity correction factors have been presented.

  2. Detecting gas hydrate behavior in crude oil using NMR.

    PubMed

    Gao, Shuqiang; House, Waylon; Chapman, Walter G

    2006-04-06

    Because of the associated experimental difficulties, natural gas hydrate behavior in black oil is poorly understood despite its grave importance in deep-water flow assurance. Since the hydrate cannot be visually observed in black oil, traditional methods often rely on gas pressure changes to monitor hydrate formation and dissociation. Because gases have to diffuse through the liquid phase for hydrate behavior to create pressure responses, the complication of gas mass transfer is involved and hydrate behavior is only indirectly observed. This pressure monitoring technique encounters difficulties when the oil phase is too viscous, the amount of water is too small, or the gas phase is absent. In this work we employ proton nuclear magnetic resonance (NMR) spectroscopy to observe directly the liquid-to-solid conversion of the water component in black oil emulsions. The technique relies on two facts. The first, well-known, is that water becomes essentially invisible to liquid state NMR as it becomes immobile, as in hydrate or ice formation. The second, our recent finding, is that in high magnetic fields of sufficient homogeneity, it is possible to distinguish water from black oil spectrally by their chemical shifts. By following changes in the area of the water peak, the process of hydrate conversion can be measured, and, at lower temperatures, the formation of ice. Taking only seconds to accomplish, this measurement is nearly direct in contrast to conventional techniques that measure the pressure changes of the whole system and assume these changes represent formation or dissociation of hydrates - rather than simply changes in solubility. This new technique clearly can provide accurate hydrate thermodynamic data in black oils. Because the technique measures the total mobile water with rapidity, extensions should prove valuable in studying the dynamics of phase transitions in emulsions.

  3. Basic investigation of dual-energy x-ray absorptiometry for bone densitometry using computed radiography

    NASA Astrophysics Data System (ADS)

    Shimura, Kazuo; Nakajima, Nobuyoshi; Tanaka, Hiroshi; Ishida, Masamitsu; Kato, Hisatoyo

    1993-09-01

    Dual-energy X-ray absorptiometry (DXA) is one of the bone densitometry techniques used to diagnose osteoporosis, and it has gradually been gaining popularity due to its high degree of precision. However, DXA involves a time-consuming examination because of its pencil-beam scan, and the equipment is expensive. In this study, we examined a new bone densitometry technique (CR-DXA) utilizing an X-ray imaging system and Computed Radiography (CR) used for medical X-ray image diagnosis. A high level of measurement precision and accuracy could be achieved by X-ray tube voltage/filter optimization and various nonuniformity corrections based on simulation and experiment. The phantom study using a bone mineral block showed a precision of 0.83% c.v. (coefficient of variation) and an accuracy of 0.01 g/cm2, suggesting that a degree of measurement precision and accuracy practically equivalent to that of the DXA approach is achieved. CR-DXA is considered to facilitate simple, quick and precise bone mineral density measurement.
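
    The dual-energy decomposition at the heart of DXA can be sketched as a two-by-two linear system, as below; the mass attenuation coefficients and attenuation readings are illustrative assumptions, not calibrated values for this CR system.

    ```python
    import numpy as np

    # At each energy, ln(I0/I) = mu_bone*sigma_bone + mu_soft*sigma_soft, where the
    # sigmas are areal densities (g/cm^2); two energies give a solvable 2x2 system.
    MU = np.array([[0.573, 0.264],     # low energy:  [mu_bone, mu_soft] in cm^2/g (assumed)
                   [0.182, 0.154]])    # high energy: [mu_bone, mu_soft] in cm^2/g (assumed)

    def areal_densities(log_atten_low, log_atten_high):
        """Return (bone, soft tissue) areal densities in g/cm^2."""
        return np.linalg.solve(MU, [log_atten_low, log_atten_high])

    bone, soft = areal_densities(1.20, 0.55)   # hypothetical attenuation readings
    print(f"BMD ~ {bone:.2f} g/cm^2, soft tissue ~ {soft:.1f} g/cm^2")
    ```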

  4. The study of frequency-scan photothermal reflectance technique for thermal diffusivity measurement

    DOE PAGES

    Hua, Zilong; Ban, Heng; Hurley, David H.

    2015-05-05

    A frequency scan photothermal reflectance technique to measure thermal diffusivity of bulk samples is studied in this manuscript. Similar to general photothermal reflectance methods, an intensity-modulated heating laser and a constant intensity probe laser are used to determine the surface temperature response under sinusoidal heating. The approach involves fixing the distance between the heating and probe laser spots, recording the phase lag of the reflected probe laser intensity with respect to the heating laser frequency modulation, and extracting thermal diffusivity using the phase lag – (frequency)^1/2 relation. The experimental validation is performed on three samples (SiO2, CaF2 and Ge), which have a wide range of thermal diffusivities. The measured thermal diffusivity values agree closely with literature values. Lastly, compared to the commonly used spatial scan method, the experimental setup and operation of the frequency scan method are simplified, and the uncertainty level is equal to or smaller than that of the spatial scan method.
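
    A minimal sketch of the final extraction step is given below, assuming the usual thermal-wave relation in which the phase lag grows linearly with the square root of the modulation frequency for a fixed pump-probe offset; the offset, frequency range, and diffusivity value are illustrative, not the paper's experimental parameters.

    ```python
    import numpy as np

    def diffusivity_from_phase_scan(freqs_hz, phase_rad, offset_m):
        """Assuming phi = r*sqrt(pi*f/alpha) + const, a linear fit of phase lag
        against sqrt(f) gives alpha = pi*r^2/slope^2."""
        slope, _ = np.polyfit(np.sqrt(freqs_hz), phase_rad, 1)
        return np.pi * offset_m**2 / slope**2

    # Synthetic check with alpha = 8.7e-7 m^2/s (roughly fused silica) and r = 10 um:
    alpha_true, r = 8.7e-7, 10e-6
    f = np.linspace(1e3, 1e5, 20)
    phi = r * np.sqrt(np.pi * f / alpha_true) + 0.3
    print(diffusivity_from_phase_scan(f, phi, r))   # recovers ~8.7e-7
    ```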

  5. Three-dimensional displacements of a large volcano flank movement during the May 2010 eruptions at Pacaya Volcano, Guatemala

    NASA Astrophysics Data System (ADS)

    Schaefer, L. N.; Wang, T.; Escobar-Wolf, R.; Oommen, T.; Lu, Z.; Kim, J.; Lundgren, P. R.; Waite, G. P.

    2017-01-01

    Although massive flank failure is fairly common in the evolution of volcanoes, measurements of flank movement indicative of instability are rare. Here 3-D displacements from airborne radar amplitude images, derived using an amplitude image pixel offset tracking technique, show that the west and southwest flanks of Pacaya Volcano in Guatemala experienced large (∼4 m), discrete landsliding that was ultimately aborted. Pixel offset tracking improved measurement recovery by nearly 50% over classic interferometric synthetic aperture radar techniques, providing unique measurements of the event. The 3-D displacement field shows that the flank moved coherently downslope along a complex failure surface involving both rotational and along-slope movement. Notably, the lack of continuous movement of the slide in the years leading up to the event emphasizes that active movement should not always be expected at volcanoes for which triggering factors (e.g., magmatic intrusions and eruptions) could precipitate sudden major flank instability.

  6. The study of frequency-scan photothermal reflectance technique for thermal diffusivity measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hua, Zilong; Ban, Heng; Hurley, David H.

    A frequency scan photothermal reflectance technique to measure thermal diffusivity of bulk samples is studied in this manuscript. Similar to general photothermal reflectance methods, an intensity-modulated heating laser and a constant intensity probe laser are used to determine the surface temperature response under sinusoidal heating. The approach involves fixing the distance between the heating and probe laser spots, recording the phase lag of the reflected probe laser intensity with respect to the heating laser frequency modulation, and extracting thermal diffusivity using the phase lag – (frequency)^1/2 relation. The experimental validation is performed on three samples (SiO2, CaF2 and Ge), which have a wide range of thermal diffusivities. The measured thermal diffusivity values agree closely with literature values. Lastly, compared to the commonly used spatial scan method, the experimental setup and operation of the frequency scan method are simplified, and the uncertainty level is equal to or smaller than that of the spatial scan method.

  7. Why You Should Believe Cold Fusion is Real

    NASA Astrophysics Data System (ADS)

    Storms, Edmund K.

    2005-03-01

    Nuclear reactions are now claimed to be initiated in certain solid materials at an energy too low to overcome the Coulomb barrier. These reactions include fusion, accelerated radioactive decay, and transmutation involving heavy elements. Evidence is based on hundreds of measurements of anomalous energy using a variety of calorimeters at levels far in excess of error, measurement of nuclear products using many normally accepted techniques, observations of many patterns of behavior common to all studies, measurement of anomalous energetic emissions using accepted techniques, and an understanding of most variables that have hindered reproducibility in the past. This evidence can be found at www.LENR-CANR.org. Except for an accepted theory, the claims have met all requirements normally imposed before a new idea is accepted by conventional science, yet rejection continues. How long can the US afford to reject a clean and potentially cheap source of energy, especially when other nations are attempting to develop this energy and the need for such an energy source is so great?

  8. A low-cost three-dimensional laser surface scanning approach for defining body segment parameters.

    PubMed

    Pandis, Petros; Bull, Anthony Mj

    2017-11-01

    Body segment parameters are used in many different applications in ergonomics as well as in dynamic modelling of the musculoskeletal system. Body segment parameters can be defined using different methods, including techniques that involve time-consuming manual measurements of the human body, used in conjunction with models or equations. In this study, a scanning technique for measuring subject-specific body segment parameters in an easy, fast, accurate and low-cost way was developed and validated. The scanner can obtain the body segment parameters in a single scanning operation, which takes between 8 and 10 s. The results obtained with the system show a standard deviation of 2.5% in volumetric measurements of the upper limb of a mannequin and 3.1% difference between scanning volume and actual volume. Finally, the maximum mean error for the moment of inertia by scanning a standard-sized homogeneous object was 2.2%. This study shows that a low-cost system can provide quick and accurate subject-specific body segment parameter estimates.
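
    The record does not spell out how volume and inertial parameters are computed from the scanned surface; one common route is to voxelize the closed surface and sum over occupied voxels under an assumed uniform density. A minimal sketch of that approach (toy geometry and density, not the authors' pipeline):

```python
# Sketch: segment mass, centre of mass and moment of inertia from a boolean
# voxel grid, assuming uniform tissue density. Toy cylinder geometry.
import numpy as np

voxel = 0.005                                   # voxel edge length [m]
rho = 1050.0                                    # assumed tissue density [kg/m^3]

# Toy "segment": solid cylinder, radius 0.05 m, length 0.3 m, axis along z.
z, y, x = np.mgrid[0:60, -12:13, -12:13] * voxel
inside = (x**2 + y**2) <= 0.05**2               # boolean occupancy grid

dV = voxel**3
mass = rho * dV * inside.sum()
com = np.array([(c * inside).sum() / inside.sum() for c in (x, y, z)])
r2 = (x - com[0])**2 + (y - com[1])**2          # distance^2 from the COM z-axis
Izz = rho * dV * (r2 * inside).sum()            # inertia about that axis
print(f"mass {mass:.2f} kg, Izz {Izz:.4f} kg m^2")   # cf. 0.5*m*R^2 for a cylinder
```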

  9. Techniques for hot structures testing

    NASA Technical Reports Server (NTRS)

    Deangelis, V. Michael; Fields, Roger A.

    1990-01-01

    Hot structures testing has been going on since the early 1960s, beginning with the Mach 6 X-15 airplane. Early hot structures test programs at NASA-Ames-Dryden focused on the operational testing required to support the X-15 flight test program, and early hot structures research projects focused on developing lab test techniques to simulate flight thermal profiles. More recent efforts involved numerous large and small hot structures test programs that served to develop test methods and measurement techniques and to provide data that promoted the correlation of test data with results from analytical codes. In Nov. 1988 a workshop was sponsored that focused on the correlation of hot structures test data with analysis. Limited material is drawn from the workshop, and more formal documentation is provided of topics that focus on hot structures test techniques used at NASA-Ames-Dryden. Topics covered include the data acquisition and control of testing, the quartz lamp heater systems, current strain and temperature sensors, and hot structures test techniques used to simulate the flight thermal environment in the lab.

  10. Measurement of the ^235mU Production Cross Section Using a Critical Assembly*

    NASA Astrophysics Data System (ADS)

    Macri, Robert; Authier, Nicolas; Becker, John; Belier, Gilbert; Bond, Evelyn; Bredeweg, Todd; Glover, S.; Meot, Vincent; Rundberg, Robert; Vieira, David; Wilhelmy, Jerry

    2006-10-01

    Measurements of the creation and destruction cross sections for actinide nuclei constitute an important experimental effort in support of Stockpile Stewardship. In this talk I will give a progress report on the effort to measure the production cross section of the ^235mU isomer integrated over a fission neutron spectrum. This ongoing experiment is fielded at CEA in Valduc, France, taking advantage of the CALIBAN critical assembly. This effort is performed in collaboration with LANL, LLNL, Bruyeres le Chatel, and Valduc staff. This experiment utilizes a technique to measure internal conversion electrons from the ^235mU isomer with the French BIII detector (Bruyeres le Chatel), and involves a substantial chemistry effort (LANL) to prepare targets for irradiation and counting, as well as to remove fission fragments after irradiation. Experimental techniques will be discussed and preliminary data presented. *Work performed under the auspices of the U.S. Department of Energy by Los Alamos National Laboratory (W-7405-ENG-36) and Lawrence Livermore National Laboratory (W-7405-ENG-48), and CEA-DAM under CEA-DAM NNSA-DOE agreement.

  11. Wind tunnel measurements of pollutant turbulent fluxes in urban intersections

    NASA Astrophysics Data System (ADS)

    Carpentieri, Matteo; Hayden, Paul; Robins, Alan G.

    2012-01-01

    Wind tunnel experiments have been carried out at the EnFlo laboratory to measure mean and turbulent tracer fluxes in geometries of real street canyon intersections. The work was part of the major DAPPLE project, focussing on the area surrounding the intersection between Marylebone Road and Gloucester Place in Central London, UK. Understanding flow and dispersion in urban streets is a very important issue for air quality management and planning, and turbulent mass exchange processes are important phenomena that are very often neglected in urban modelling studies. The adopted methodology involved the combined use of laser Doppler anemometry and tracer concentration measurements. This methodology was applied to quantify the mean and turbulent flow and dispersion fields within several street canyon intersections. Vertical profiles of turbulent tracer flux were also measured. The technique, despite a number of limitations, proved reliable and allowed tracer balance calculations to be undertaken in the selected street canyon intersections. The experience gained in this work will enable much more precise studies in the future as issues affecting the accuracy of the experimental technique have been identified and resolved.
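
    For orientation, the turbulent tracer flux referred to here is conventionally the covariance of the velocity and concentration fluctuations, <u'c'>, evaluated from simultaneous time series. A minimal sketch with synthetic data (not the EnFlo analysis code):

```python
# Sketch: mean (advective) and turbulent tracer fluxes from simultaneous
# velocity and concentration time series; turbulent flux = <u'c'>.
import numpy as np

rng = np.random.default_rng(1)
n = 20000
u = 2.0 + 0.3 * rng.standard_normal(n)        # streamwise velocity [m/s]
c = 50.0 + 5.0 * rng.standard_normal(n)       # tracer concentration [ppm]
c += 2.0 * (u - u.mean())                     # impose a correlated component

u_f, c_f = u - u.mean(), c - c.mean()
mean_flux = u.mean() * c.mean()               # advective flux  U*C
turb_flux = np.mean(u_f * c_f)                # turbulent flux  <u'c'>
print(f"mean flux {mean_flux:.1f} ppm m/s, turbulent flux {turb_flux:.2f} ppm m/s")
```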

  12. Measurement of methane emissions from ruminant livestock using a SF[sub 6] tracer technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, K.; Huyler, M.; Westberg, H.

    1994-02-01

    The purpose of this paper is to describe a method for determining methane emission factors for cattle. The technique involves the direct measurement of methane emissions from livestock in their natural environment. A small permeation tube containing SF[sub 6] is placed in the cow's rumen, and SF[sub 6] and CH[sub 4] concentrations are measured near the mouth and nostrils of the cow. The SF[sub 6] release provides a way to account for the dilution of gases near the animal's mouth. The CH[sub 4] emission rate can be calculated from the known SF[sub 6] emission rate and the measured SF[sub 6] and CH[sub 4] concentrations. The tracer method described provides an easy means for acquiring a large methane emissions data base from domestic livestock. The low cost and simplicity should make it possible to monitor a large number of animals in countries throughout the world. An expanded data base of this type helps to reduce uncertainty in the ruminant contribution to the global methane budget. 18 refs., 3 figs., 3 tabs.
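
    The emission-rate calculation described reduces to a simple ratio: the CH4 emission rate equals the known SF6 release rate scaled by the background-corrected CH4/SF6 mixing-ratio excess (and by the molar-mass ratio if mass rates are wanted). A small sketch with illustrative numbers (not data from the paper):

```python
# Sketch: CH4 emission rate from the SF6 tracer-ratio technique,
# Q_CH4 = Q_SF6 * (M_CH4/M_SF6) * (dCH4/dSF6), with mole-fraction
# concentrations corrected for background. All numbers are illustrative.
M_CH4, M_SF6 = 16.04, 146.06           # molar masses [g/mol]

q_sf6 = 2.0                            # permeation-tube release rate [mg/day] (assumed)
ch4_sample, ch4_bg = 50.0, 2.0         # CH4 mixing ratios [ppm]
sf6_sample, sf6_bg = 32.0, 2.0         # SF6 mixing ratios [ppt]

ratio = (ch4_sample - ch4_bg) * 1e-6 / ((sf6_sample - sf6_bg) * 1e-12)
q_ch4 = q_sf6 * (M_CH4 / M_SF6) * ratio          # [mg/day]
print(f"CH4 emission rate ~ {q_ch4 / 1000:.0f} g/day")
```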

  13. Josephson frequency meter for millimeter and submillimeter wavelengths

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anischenko, S.E.; Larkin, S.Y.; Chaikovsky, V.I.

    1994-12-31

    Frequency measurements of electromagnetic oscillations in the millimeter and submillimeter wavebands become increasingly difficult as frequency grows, for a number of reasons. First, these frequencies are considered to be the cutoff for semiconductor converting devices, so optical measurement methods must be used instead of traditional methods based on frequency transfer. Second, resonance measurement methods are characterized by relatively narrow bands, while optical methods are limited in frequency and time resolution by the limited range and velocity of movement of their mechanical elements; moreover, the efficiency of these optical techniques decreases with increasing wavelength due to diffraction losses. This requires a priori information on the radiation frequency band of the source involved. A method of measuring the frequency of harmonic microwave signals in the millimeter and submillimeter wavebands based on the ac Josephson effect in superconducting contacts is free of all the above drawbacks. This approach offers a number of major advantages over the more traditional measurement methods, that is, those based on frequency conversion, resonance and interferometric techniques. It can be characterized by high potential accuracy, a wide range of measurable frequencies, prompt measurement, and the opportunity to obtain a panoramic display of the results as well as full automation of the measuring process.
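
    For context, the measurement principle rests on the ac Josephson relation f = 2eV/h, so the junction voltage maps directly to the radiation frequency. A small illustrative calculation:

```python
# Sketch: the ac Josephson relation f = 2eV/h that underpins the frequency
# meter; the junction voltage maps directly to the measured frequency.
from scipy.constants import e, h

K_J = 2 * e / h                        # Josephson constant, ~483.6 GHz per mV
for v_mV in (0.1, 0.5, 1.0, 2.0):      # example junction voltages [mV]
    f_Hz = K_J * v_mV * 1e-3
    print(f"{v_mV:4.1f} mV  ->  {f_Hz / 1e9:7.1f} GHz")
```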

  14. A Hydrogen Exchange Method Using Tritium and Sephadex: Its Application to Ribonuclease*

    PubMed Central

    Englander, S. Walter

    2012-01-01

    A new method for measuring the hydrogen exchange of macromolecules in solution is described. The method uses tritium to trace the movement of hydrogen, and utilizes Sephadex columns to effect, in about 2 minutes, a separation between tritiated macromolecule and tritiated solvent great enough to allow the measurement of bound tritium. High sensitivity and freedom from artifact is demonstrated and the possible value of the technique for investigation of other kinds of colloid-small molecule interaction is indicated. Competition experiments involving tritium, hydrogen, and deuterium indicate the absence of any equilibrium isotope effect in the ribonuclease-hydrogen isotope system, though a secondary kinetic isotope effect is apparent when ribonuclease is largely deuterated. Ribonuclease shows four clearly distinguishable kinetic classes of exchangeable hydrogens. Evidence is marshaled to suggest the independently measurable classes II, III, and IV (in order of decreasing rate of exchange) to represent “random-chain” peptides, peptides involved in α-helix, and otherwise shielded side-chain and peptide hydrogens, respectively. PMID:14075117

  15. Soot and Radiation Measurements in Microgravity Jet Diffusion Flames

    NASA Technical Reports Server (NTRS)

    Ku, Jerry C.

    1996-01-01

    The subject of soot formation and radiation heat transfer in microgravity jet diffusion flames is important not only for the understanding of fundamental transport processes involved but also for providing findings relevant to spacecraft fire safety and soot emissions and radiant heat loads of combustors used in air-breathing propulsion systems. Our objectives are to measure and model soot volume fraction, temperature, and radiative heat fluxes in microgravity jet diffusion flames. For this four-year project, we have successfully completed three tasks, which have resulted in new research methodologies and original results. First is the implementation of a thermophoretic soot sampling technique for measuring particle size and aggregate morphology in drop-tower and other reduced gravity experiments. In those laminar flames studied, we found that microgravity soot aggregates typically consist of more primary particles and primary particles are larger in size than those under normal gravity. Comparisons based on data obtained from limited samples show that the soot aggregate's fractal dimension varies within +/- 20% of its typical value of 1.75, with no clear trends between normal and reduced gravity conditions. Second is the development and implementation of a new imaging absorption technique. By properly expanding and spatially-filtering the laser beam to image the flame absorption on a CCD camera and applying numerical smoothing procedures, this technique is capable of measuring instantaneous full-field soot volume fractions. Results from this technique have shown the significant differences in local soot volume fraction, smoking point, and flame shape between normal and reduced gravity flames. We observed that some laminar flames become open-tipped and smoking under microgravity. The third task we completed is the development of a computer program which integrates and couples flame structure, soot formation, and flame radiation analyses together. We found good agreements between model predictions and experimental data for laminar and turbulent flames under both normal and reduced gravity. We have also tested in the laboratory the techniques of rapid-insertion fine-wire thermocouples and emission pyrometry for temperature measurements. These techniques as well as laser Doppler velocimetry and spectral radiative intensity measurement have been proposed to provide valuable data and improve the modeling analyses.

  16. The return of Phineas Gage: clues about the brain from the skull of a famous patient.

    PubMed

    Damasio, H; Grabowski, T; Frank, R; Galaburda, A M; Damasio, A R

    1994-05-20

    When the landmark patient Phineas Gage died in 1861, no autopsy was performed, but his skull was later recovered. The brain lesion that caused the profound personality changes for which his case became famous has been presumed to have involved the left frontal region, but questions have been raised about the involvement of other regions and about the exact placement of the lesion within the vast frontal territory. Measurements from Gage's skull and modern neuroimaging techniques were used to reconstitute the accident and determine the probable location of the lesion. The damage involved both left and right prefrontal cortices in a pattern that, as confirmed by Gage's modern counterparts, causes a defect in rational decision making and the processing of emotion.

  17. Development and testing of highway storm-sewer flow measurement and recording system

    USGS Publications Warehouse

    Kilpatrick, F.A.; Kaehrle, W.R.; Hardee, Jack; Cordes, E.H.; Landers, M.N.

    1985-01-01

    A comprehensive study and development of measuring instruments and techniques for measuring all components of flow in a storm-sewer drainage system was undertaken by the U.S. Geological Survey under the sponsorship of the Federal Highway Administration. The study involved laboratory and field calibration and testing of measuring flumes, pipe insert meters, weirs, electromagnetic velocity meters as well as the development and calibration of pneumatic-bubbler pressure transducer head measuring systems. Tracer-dilution and acoustic flow meter measurements were used in field verification tests. A single micrologger was used to record data from all the above instruments as well as from a tipping-bucket rain gage and also to activate on command the electromagnetic velocity meter and tracer-dilution systems. (Author's abstract)

  18. A new measure for gene expression biclustering based on non-parametric correlation.

    PubMed

    Flores, Jose L; Inza, Iñaki; Larrañaga, Pedro; Calvo, Borja

    2013-12-01

    One of the emerging techniques for the analysis of DNA microarray data, known as biclustering, is the search for subsets of genes and conditions that are coherently expressed. These subgroups provide clues about the main biological processes. Until now, different approaches to this problem have been proposed. Most of them use the mean squared residue as a quality measure, but relevant and interesting patterns, such as shifting or scaling patterns, cannot be detected. Furthermore, recent papers show that there exist new coherence patterns involved in different kinds of cancer and tumors, such as inverse relationships between genes, which cannot be captured. The proposed measure is called Spearman's biclustering measure (SBM), which estimates the quality of a bicluster based on the non-linear correlation among genes and conditions simultaneously. The search for biclusters is performed using an evolutionary technique called estimation of distribution algorithms, which uses the SBM measure as its fitness function. This approach has been examined from different points of view using artificial and real microarrays. The assessment process has involved the use of quality indexes, a set of reference bicluster patterns including new patterns, and a set of statistical tests. Performance has also been examined using real microarrays and compared with different algorithmic approaches such as Bimax, CC, OPSM, Plaid and xMotifs. SBM shows several advantages, such as the ability to recognize more complex coherence patterns such as shifting, scaling and inversion, and the capability to selectively marginalize genes and conditions depending on statistical significance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
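
    The abstract does not reproduce the published definition of SBM; as a reader aid, the sketch below computes a score in the same spirit, the average absolute Spearman correlation over all gene pairs of a candidate bicluster, which rewards shifting, scaling, and inverted patterns alike. It is an illustrative stand-in, not the paper's formula:

```python
# Sketch: a Spearman-based coherence score for a candidate bicluster
# (rows = genes, columns = conditions): the mean |rho| over all gene pairs,
# so inverted profiles score as highly as directly correlated ones.
# Illustrative stand-in for SBM, not the published definition.
import numpy as np
from itertools import combinations
from scipy.stats import spearmanr

def coherence_score(bicluster):
    pairs = combinations(range(bicluster.shape[0]), 2)
    return float(np.mean([abs(spearmanr(bicluster[i], bicluster[j])[0])
                          for i, j in pairs]))

base = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
coherent = np.vstack([base, 2 * base + 1, 10 - base])    # shifted, scaled, inverted
random_sub = np.random.default_rng(0).random((3, 5))
print("coherent bicluster:", coherence_score(coherent))    # ~1.0
print("random submatrix  :", coherence_score(random_sub))  # much lower
```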

  19. Thermographic Imaging of Material Loss in Boiler Water-Wall Tubing by Application of Scanning Line Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Localized wall thinning due to corrosion in utility boiler water-wall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proven to be very manpower and time intensive. This has resulted in a spot check approach to inspections, documenting thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed for large structures such as boiler water-walls. A theoretical basis for the technique will be presented which explains the quantitative nature of the technique. Further, a dynamic calibration system will be presented for the technique that allows the extraction of thickness information from the temperature data. Additionally, the results of applying this technology to actual water-wall tubing samples and in situ inspections will be presented.

  20. Fundamental phenomena on fuel decomposition and boundary-layer combustion processes with applications to hybrid rocket motors

    NASA Technical Reports Server (NTRS)

    Kuo, Kenneth K.; Lu, Yeu-Cherng; Chiaverini, Martin J.; Harting, George C.; Johnson, David K.; Serin, Nadir

    1995-01-01

    The experimental study on the fundamental processes involved in fuel decomposition and boundary-layer combustion in hybrid rocket motors is continuously being conducted at the High Pressure Combustion Laboratory of The Pennsylvania State University. This research will provide a useful engineering technology base in the development of hybrid rocket motors as well as a fundamental understanding of the complex processes involved in hybrid propulsion. A high-pressure, 2-D slab motor has been designed, manufactured, and utilized for conducting seven test firings using HTPB fuel processed at PSU. A total of 20 fuel slabs have been received from the McDonnell Douglas Aerospace Corporation. Ten of these fuel slabs contain an array of fine-wire thermocouples for measuring solid fuel surface and subsurface temperatures. Diagnostic instrumentation used in the tests includes high-frequency pressure transducers for measuring static and dynamic motor pressures and fine-wire thermocouples for measuring solid fuel surface and subsurface temperatures. The ultrasonic pulse-echo technique as well as a real-time x-ray radiography system have been used to obtain independent measurements of instantaneous solid fuel regression rates.

  1. Fundamental phenomena on fuel decomposition and boundary-layer combustion processes with applications to hybrid rocket motors

    NASA Astrophysics Data System (ADS)

    Kuo, Kenneth K.; Lu, Yeu-Cherng; Chiaverini, Martin J.; Harting, George C.; Johnson, David K.; Serin, Nadir

    The experimental study on the fundamental processes involved in fuel decomposition and boundary-layer combustion in hybrid rocket motors is continuously being conducted at the High Pressure Combustion Laboratory of The Pennsylvania State University. This research will provide a useful engineering technology base in the development of hybrid rocket motors as well as a fundamental understanding of the complex processes involved in hybrid propulsion. A high-pressure, 2-D slab motor has been designed, manufactured, and utilized for conducting seven test firings using HTPB fuel processed at PSU. A total of 20 fuel slabs have been received from the McDonnell Douglas Aerospace Corporation. Ten of these fuel slabs contain an array of fine-wire thermocouples for measuring solid fuel surface and subsurface temperatures. Diagnostic instrumentation used in the tests includes high-frequency pressure transducers for measuring static and dynamic motor pressures and fine-wire thermocouples for measuring solid fuel surface and subsurface temperatures. The ultrasonic pulse-echo technique as well as a real-time x-ray radiography system have been used to obtain independent measurements of instantaneous solid fuel regression rates.

  2. System for routine surface anthropometry using reprojection registration

    NASA Astrophysics Data System (ADS)

    Sadleir, R. J.; Owens, R. A.; Hartmann, P. E.

    2003-11-01

    Range data measurement can be usefully applied to non-invasive monitoring of anthropometric changes due to disease, healing or during normal physiological processes. We have developed a computer vision system that allows routine capture of biological surface shapes and accurate measurement of anthropometric changes, using a structured light stripe triangulation system. In many applications involving relocation of soft tissue for image-guided surgery or anthropometry it is neither accurate nor practical to apply fiducial markers directly to the body. This system features a novel method of achieving subject re-registration that involves application of fiducials by a standard data projector. Calibration of this reprojector is achieved using a variation of structured lighting techniques. The method allows accurate and comparable repositioning of elastic surfaces. Tests of repositioning using the reprojector found a significant improvement in subject registration compared to an earlier method which used video overlay comparison only. It has a current application to the measurement of breast volume changes in lactating mothers, but may be extended to any application where repeatable positioning and measurement is required.

  3. Full field gas phase velocity measurements in microgravity

    NASA Technical Reports Server (NTRS)

    Griffin, Devon W.; Yanis, William

    1995-01-01

    Measurement of full-field velocities via Particle Imaging Velocimetry (PIV) is common in research efforts involving fluid motion. While such measurements have been successfully performed in the liquid phase in a microgravity environment, gas-phase measurements have been beset by difficulties with seeding and laser strength. A synthesis of techniques developed at NASA LeRC exhibits promise in overcoming these difficulties. Typical implementation of PIV involves forming the light from a pulsed laser into a sheet that is some fraction of a millimeter thick and 50 or more millimeters wide. When a particle enters this sheet during a pulse, light scattered from the particle is recorded by a detector, which may be a film plane or a CCD array. Assuming that the particle remains within the boundaries of the sheet for the second pulse and can be distinguished from neighboring particles, comparison of the two images produces an average velocity vector for the time between the pulses. If the concentration of particles in the sampling volume is sufficiently large but the particles remain discrete, a full field map may be generated.
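
    As described above, the PIV estimate for one interrogation window reduces to finding the particle-image displacement between the two exposures and dividing by the pulse separation. A minimal sketch using an FFT-based cross-correlation on synthetic data (not the flight hardware's software):

```python
# Sketch: velocity for one PIV interrogation window = displacement of the
# cross-correlation peak between the two exposures divided by the pulse
# separation. Synthetic data; window size and scales are illustrative.
import numpy as np

def window_velocity(win1, win2, dt, pixel_size):
    a, b = win1 - win1.mean(), win2 - win2.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:      # map wrap-around lags to negative shifts
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return np.array([dx, dy]) * pixel_size / dt          # (u, v) in m/s

rng = np.random.default_rng(2)
frame1 = rng.random((32, 32))
frame2 = np.roll(frame1, shift=(2, 3), axis=(0, 1))      # 3 px in x, 2 px in y
print(window_velocity(frame1, frame2, dt=1e-3, pixel_size=10e-6))  # ~[0.03 0.02]
```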

  4. EXTENSION METHODS IDEAS FOR RURAL CIVIL DEFENSE.

    ERIC Educational Resources Information Center

    Department of Agriculture, Washington, DC.

    Techniques for involving the rural population in civil defense planning are the subject of this document. An initial step involves determining the various communication skills to be used. Methods of working with community organizations, mass media techniques, and construction of exhibits are described. Small group discussion techniques explained…

  5. Hush now baby: mothers' and fathers' strategies for soothing their infants and associated parenting outcomes.

    PubMed

    Dayton, Carolyn Joy; Walsh, Tova B; Oh, Wonjung; Volling, Brenda

    2015-01-01

    The purpose of this study was to examine the types of soothing behaviors used by mothers and fathers of infants, differences in use trajectories over time, and associated parenting outcomes. A longitudinal study of 241 families expecting their second child was performed. Data were collected at 1, 4, and 8 postnatal months and included measures of parental soothing techniques, involvement in soothing, distress in response to infant crying, and parenting self-efficacy. The average number of soothing techniques used was 7.7 for mothers and 5.9 for fathers. Soothing frequency decreased over time, and change patterns of soothing differed over time by gender. In couples who shared responsibility for soothing, fathers felt more efficacious in parenting and mothers were less upset by infant crying. Clinicians are encouraged to support fathers' engagement in infant soothing, facilitate the development of fathers' parenting confidence, and promote fathers' involvement in children's health and health care. Copyright © 2015 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.

  6. Hush Now Baby: Mothers’ and Fathers’ Strategies for Soothing Their Infants and Associated Parenting Outcomes

    PubMed Central

    Dayton, Carolyn Joy; Walsh, Tova B.; Oh, Wonjung; Volling, Brenda

    2014-01-01

    Objectives The purpose of this study was to examine the types of soothing behaviors used by mothers and fathers of infants, differences in use trajectories over time, and associated parenting outcomes. Methods Longitudinal study of 241 families expecting their second child. Data were collected at 1, 4 and 8 postnatal months and included measures of parental soothing techniques, involvement in soothing, distress in response to infant crying, and parenting self-efficacy. Results Average number of soothing techniques used was 7.7 for mothers and 5.9 for fathers. Soothing frequency decreased over time and change patterns of soothing differed over time by gender. In couples who shared responsibility for soothing fathers felt more efficacious in parenting and mothers were less upset by infant crying. Discussion Clinicians are encouraged to support fathers’ engagement in infant soothing, to facilitate the development of fathers’ parenting confidence, and to promote fathers’ involvement in children’s health and healthcare. PMID:25440811

  7. Comparison of laboratory and in-situ measurements of waterflood residual oil saturations for the Cormorant field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Poelgeest, F.; Niko, H.; Modwid, A.R.

    1991-03-01

    Shell Expro and Koninklijke/Shell E and P Laboratorium (KSEPL) have been engaged in a multidisciplinary effort to determine the water flood residual oil saturation (ROS) in two principal reservoirs of the Cormorant oil field in the U.K. sector of the North Sea. Data acquisition included special coring and testing. The study, which involved new reservoir-engineering and petrophysical techniques, was aimed at establishing consistent ROS values. This paper reports that reservoir-engineering work centered on reservoir-condition corefloods in the relative-permeability-at-reservoir-conditions (REPARC) apparatus, in which restoration of representative wettability condition was attempted with the aging technique. Aging results in a consistent reduction of water-wetness of all core samples. The study indicated that ROS values obtained on aged cores at water throughputs of at least 5 PV represented reservoir conditions. The petrophysical part of the study involved ROS estimation from sponge-core analysis and log evaluation.

  8. Minimizing target interference in PK immunoassays: new approaches for low-pH-sample treatment.

    PubMed

    Partridge, Michael A; Pham, John; Dziadiv, Olena; Luong, Onson; Rafique, Ashique; Sumner, Giane; Torri, Albert

    2013-08-01

    Quantitating total levels of monoclonal antibody (mAb) biotherapeutics in serum using ELISA may be hindered by soluble targets. We developed two low-pH-sample-pretreatment techniques to minimize target interference. The first procedure involves sample pretreatment at pH <3.0 before neutralization and analysis in a target capture ELISA. Careful monitoring of acidification time is required to minimize potential impact on mAb detection. The second approach involves sample dilution into mild acid (pH ∼4.5) before transferring to an anti-human capture-antibody-coated plate without neutralization. Analysis of target-drug and drug-capture antibody interactions at pH 4.5 indicated that the capture antibody binds to the drug, while the drug and the target were dissociated. Using these procedures, total biotherapeutic levels were accurately measured when soluble target was >30-fold molar excess. These techniques provide alternatives for quantitating mAb biotherapeutics in the presence of a target when standard acid-dissociation procedures are ineffective.

  9. Solid state tritium detector for biomedical applications

    NASA Astrophysics Data System (ADS)

    Gordon, J. S.; Farrell, R.; Daley, K.; Oakes, C. E.

    1994-08-01

    Radioactive labeling of proteins is a very important technique used in biomedical research to identify, isolate, and investigate the expression and properties of proteins in biological systems. In such procedures, the preferred radiolabel is often tritium. Presently, binding assays involving tritium are carried out using inconvenient and expensive techniques which rely on the use of scintillation fluid counting systems. This traditional method involves both time-consuming laboratory protocols and the generation of substantial quantities of radioactive and chemical waste. We have developed a novel technology to measure the tritium content of biological specimens that does not rely on scintillation fluids. The tritiated samples can be positioned directly under a large area, monolithic array of specially prepared avalanche photodiodes (APDs) which record the tritium activity distribution at each point within the field of view of the array. The 1 mm² sensing elements exhibit an intrinsic tritium beta detection efficiency of 27% with high gain uniformity and very low cross talk.

  10. Neutron Measurements for Radiation Protection in Low Earth Orbit - History and Future

    NASA Technical Reports Server (NTRS)

    Golightly, M. J.; Semones, E.

    2003-01-01

    The neutron environment inside spacecraft has been of interest from a scientific and radiation protection perspective since early in the history of manned spaceflight. With the exception of a few missions which carried plutonium-fueled radioisotope thermoelectric generators, all of the neutrons inside the spacecraft are secondary radiations resulting from interactions of high-energy charged particles with nuclei in the Earth's atmosphere, spacecraft structural materials, and the astronauts' own bodies. Although of great interest, definitive measurements of the spacecraft neutron field have been difficult due to the wide particle energy range and the limited available volume and power for traditional techniques involving Bonner spheres. A multitude of measurements, however, have been made of the neutron environment inside spacecraft. The majority of measurements were made using passive techniques including metal activation foils, fission foils, nuclear photoemulsions, plastic track detectors, and thermoluminescent detectors. Active measurements have utilized proton recoil spectrometers (stilbene), Bonner spheres (3He proportional counter based), and LiI(Eu) phoswich scintillation detectors. For the International Space Station (ISS), only the plastic track/thermoluminescent detectors are used with any regularity. A monitoring program utilizing a set of active Bonner spheres was carried out in the ISS Lab module from March to December 2001. These measurements provide a very limited look at the crew neutron exposure, both in time coverage and neutron energy coverage. A review of the currently published data from past flights will be made and compared with the more recent results from the ISS. Future measurement efforts using currently available techniques and those in development will also be discussed.

  11. FAA Aviation Forecast Conference Proceedings (16th)

    DTIC Science & Technology

    1991-02-01

    FORECASTS The FAA forecasting process is a continuous one which involves FAA Forecast Branch’s interaction with various FAA Offices and Services... process uses various economic and aviation data bases, the outputs of several econometric models and equations, and other analytical techniques. The FAA...workload measures, summarized numerically in the table on page 8, are the resultant forecasts of this process and are used annually by the agency for

  12. Where and How Does Grammatically Geared Processing Take Place--And Why Is Broca's Area Often Involved. A Coordinated fMRI/ERBP Study of Language Processing

    ERIC Educational Resources Information Center

    Dogil, Grzegorz; Frese, Inga; Haider, Hubert; Rohm, Dietmar; Wokurek, Wolfgang

    2004-01-01

    We address the possibility of combining the results from hemodynamic and electrophysiological methods for the study of cognitive processing of language. The hemodynamic method we use is Event-Related fMRI, and the electrophysiological method measures Event-Related Band Power (ERBP) of the EEG signal. The experimental technique allows us to…

  13. Trace Elements in Ovaries: Measurement and Physiology.

    PubMed

    Ceko, Melanie J; O'Leary, Sean; Harris, Hugh H; Hummitzsch, Katja; Rodgers, Raymond J

    2016-04-01

    Traditionally, research in the field of trace element biology and human and animal health has largely depended on epidemiological methods to demonstrate involvement in biological processes. These studies were typically followed by trace element supplementation trials or attempts at identification of the biochemical pathways involved. With the discovery of biological molecules that contain the trace elements, such as matrix metalloproteinases containing zinc (Zn), cytochrome P450 enzymes containing iron (Fe), and selenoproteins containing selenium (Se), much of the current research focuses on these molecules, and, hence, only indirectly on trace elements themselves. This review focuses largely on two synchrotron-based x-ray techniques: X-ray absorption spectroscopy and x-ray fluorescence imaging that can be used to identify the in situ speciation and distribution of trace elements in tissues, using our recent studies of bovine ovaries, where the distribution of Fe, Se, Zn, and bromine were determined. It also discusses the value of other techniques, such as inductively coupled plasma mass spectrometry, used to garner information about the concentrations and elemental state of the trace elements. These applications to measure trace elemental distributions in bovine ovaries at high resolutions provide new insights into possible roles for trace elements in the ovary. © 2016 by the Society for the Study of Reproduction, Inc.

  14. Mosquitoes meet microfluidics: High-throughput microfluidic tools for insect-parasite ecology in field conditions

    NASA Astrophysics Data System (ADS)

    Prakash, Manu; Mukundarajan, Haripriya

    2013-11-01

    A simple bite from an insect is the transmission mechanism for many deadly diseases worldwide, including malaria, yellow fever, West Nile and dengue. Very little is known about how populations of numerous insect species and disease-causing parasites interact in their natural habitats due to a lack of measurement techniques. At present, vector surveillance techniques involve manual capture by using humans as live bait, which is hard to justify on ethical grounds. Individual mosquitoes are manually dissected to isolate salivary glands to detect sporozoites. With typical vector infection rates being very low even in endemic areas, it is almost impossible to get an accurate picture of disease distribution, in both space and time. Here we present novel high-throughput microfluidic tools for vector surveillance, specifically mosquitoes. A two-dimensional high density array with baits provides an integrated platform for multiplex PCR for detection of both vector and parasite species. Combining techniques from engineering and field ecology, methods and tools developed here will enable high-throughput measurement of infection rates for a number of diseases in mosquito populations in field conditions. Pew Foundation.

  15. Ultrasonic Evaluation and Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, Susan L.; Anderson, Michael T.; Diaz, Aaron A.

    2015-10-01

    Ultrasonic evaluation of materials for material characterization and flaw detection is as simple as manually moving a single-element probe across a specimen and looking at an oscilloscope display in real time or as complex as automatically (under computer control) scanning a phased-array probe across a specimen and collecting encoded data for immediate or off-line data analyses. The reliability of the results in the second technique is greatly increased because of a higher density of measurements per scanned area and measurements that can be more precisely related to the specimen geometry. This chapter will briefly discuss applications of the collection of spatially encoded data and focus primarily on the off-line analyses in the form of data imaging. Pacific Northwest National Laboratory (PNNL) has been involved with assessing and advancing the reliability of inservice inspections of nuclear power plant components for over 35 years. Modern ultrasonic imaging techniques such as the synthetic aperture focusing technique (SAFT), phased-array (PA) technology and sound field mapping have undergone considerable improvements to effectively assess and better understand material constraints.

  16. Weak-value amplification and optimal parameter estimation in the presence of correlated noise

    NASA Astrophysics Data System (ADS)

    Sinclair, Josiah; Hallaji, Matin; Steinberg, Aephraim M.; Tollaksen, Jeff; Jordan, Andrew N.

    2017-11-01

    We analytically and numerically investigate the performance of weak-value amplification (WVA) and related parameter estimation methods in the presence of temporally correlated noise. WVA is a special instance of a general measurement strategy that involves sorting data into separate subsets based on the outcome of a second "partitioning" measurement. Using a simplified correlated noise model that can be analyzed exactly together with optimal statistical estimators, we compare WVA to a conventional measurement method. We find that WVA indeed yields a much lower variance of the parameter of interest than the conventional technique does, optimized in the absence of any partitioning measurements. In contrast, a statistically optimal analysis that employs partitioning measurements, incorporating all partitioned results and their known correlations, is found to yield an improvement—typically slight—over the noise reduction achieved by WVA. This result occurs because the simple WVA technique is not tailored to any specific noise environment and therefore does not make use of correlations between the different partitions. We also compare WVA to traditional background subtraction, a familiar technique where measurement outcomes are partitioned to eliminate unknown offsets or errors in calibration. Surprisingly, for the cases we consider, background subtraction turns out to be a special case of the optimal partitioning approach, possessing a similar typically slight advantage over WVA. These results give deeper insight into the role of partitioning measurements (with or without postselection) in enhancing measurement precision, which some have found puzzling. They also resolve previously made conflicting claims about the usefulness of weak-value amplification to precision measurement in the presence of correlated noise. We finish by presenting numerical results to model a more realistic laboratory situation of time-decaying correlations, showing that our conclusions hold for a wide range of statistical models.

  17. Measurement of charge transport through organic semiconducting devices

    NASA Astrophysics Data System (ADS)

    Klenkler, Richard A.

    2007-12-01

    In this thesis, two important and unexplored areas of organic semiconductor device physics are investigated: The first area involves determining the effect of energy barriers and intermixing at the interfaces between hole transport layers (HTLs). This effect was discerned by first establishing a method of pressure-laminating successive solution coated HTLs together. It was found that in the range of 0.8--3.0 MPa a pressure-laminated interface between two identical HTLs causes no measurable perturbation to charge transport. By this method, 2 different HTLs can be sandwiched together to create a discrete interface, and by inserting a mixed HTL in the middle an intermixed interface between the 2 HTLs can be simulated. With these sandwiched devices, charge injection across discrete versus intermixed interfaces was compared using time-of-flight measurements. For the hole transport materials investigated, no perturbation to the overall charge transport was observed with the discrete interface; in contrast, the rate of charge transport was clearly reduced through the intermixed interface. The second area that was investigated pertains to the development of a bulk mobility measurement technique that has a higher resolution than existing methods. The approach that was used involved decoupling the charge carrier transient signal from the device charging circuit. With this approach, the RC time constant constraint that limits the resolution of existing methods is eliminated. The resulting method, termed the photoinduced electroluminescence (EL) mobility measurement technique, was then used to compare the electron mobility of the metal chelate, AlQ3 to that of the novel triazine material, BTB. Results showed that BTB demonstrated an order of magnitude higher mobility than AlQ3. Overall, these findings have broad implications regarding device design. The pressure-lamination method could be used, e.g., as a diagnostic tool to help in the design of multilayer xerographic photoreceptors, such as those that include an abrasion resistant overcoat. Further, the photoinduced EL technique could be used as a tool to help characterize charge flow and balance in organic light emitting devices amongst others.
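
    For context, the time-of-flight measurements mentioned here yield the mobility from a simple relation once the transit time is read off the photocurrent transient, mu = d^2/(V*t_transit). A small illustrative calculation with assumed values (not the thesis data):

```python
# Sketch: carrier mobility from a time-of-flight transient,
# mu = d^2 / (V * t_transit). All values below are assumed for illustration.
d = 10e-6            # transport-layer thickness [m]
V = 50.0             # applied bias [V]
t_transit = 2.0e-6   # transit time read from the photocurrent transient [s]

mu = d**2 / (V * t_transit)            # [m^2 / (V s)]
print(f"mobility = {mu * 1e4:.1e} cm^2/(V s)")
```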

  18. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is sealed by rolling its ends into the rolled joint area. During reactor refurbishment, the original FC calandria tubes are removed, potentially scratching the rolled joint area and, thereby, compromising the seal with the new FC calandria tube. The procedure involves delivering an inspection module having a radiation-resistant camera, standard lighting, and a structured lighting projector. The surface is inspected by rotating the module within the rolled joint area. If a flaw is detected, its depth and width are gauged from the profile variation of the structured lighting in a captured image. As well, the diameter profile of the area is measured from the analysis of a series of captured circumferential images of the structured lighting profiles on the surface.

  19. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bulutsuz, A. G., E-mail: asligunaya@gmail.com; Demircioglu, P., E-mail: pinar.demircioglu@adu.edu.tr; Bogrekci, I., E-mail: ismail.bogrekci@adu.edu.tr

    The interaction between a foreign substance placed into the jaw to address tooth loss and the surrounding organic tissue is a highly complex process, shaped by many biological reactions as well as by biomechanical forces. Osseointegration denotes the direct structural and functional association between living bone and the surface of a load-bearing artificial implant. Taking into account the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. Detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods together with scanning electron microscopy (SEM) for 3D morphological properties and a Taylor Hobson stylus profilometer for 2D roughness properties. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, underlining the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the Image Processing Toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increasing surface roughness.
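
    The black/white pixel relationship mentioned above can be illustrated with a simple threshold-and-count step; the sketch below (in Python rather than the Matlab toolbox used in the study, with synthetic images) shows the kind of computation implied:

```python
# Sketch: fraction of dark pixels in a thresholded SEM-like image as a crude
# roughness indicator, in the spirit of the black/white pixel count described.
# Synthetic images; not the study's Matlab pipeline.
import numpy as np

def dark_pixel_fraction(img, threshold=0.5):
    return float((img < threshold).mean())

rng = np.random.default_rng(4)
smooth = 0.7 + 0.05 * rng.standard_normal((256, 256))   # bright, low-contrast surface
rough = 0.5 + 0.30 * rng.standard_normal((256, 256))    # more dark, shadowed regions
print("smooth surface:", dark_pixel_fraction(smooth))   # near zero
print("rough surface :", dark_pixel_fraction(rough))    # substantially larger
```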

  20. Division B Commission 25: Astronomical Photometry and Polarimetry

    NASA Astrophysics Data System (ADS)

    Walker, Alistair; Adelman, Saul; Milone, Eugene; Anthony-Twarog, Barbara; Bastien, Pierre; Chen, Wen Ping; Howell, Steve; Knude, Jens; Kurtz, Donald; Magalhães, Antonio Mario; Menzies, John; Smith, Allyn; Volk, Kevin

    2016-04-01

    Commission 25 (C25) deals with the techniques and issues involved with the measurement of optical and infrared radiation intensities and polarization from astronomical sources. As such, in recent years attention has focused on photometric standard stars, atmospheric extinction, photometric passbands, transformation between systems, nomenclature, and observing and reduction techniques. At the start of the triennium C25 changed its name from Stellar Photometry and Polarization to Astronomical Photometry and Polarization so as to explicitly include in its mandate particular issues arising from the measurement of resolved sources, given the importance of photometric redshifts of distant galaxies for many of the large photometric surveys now underway. We begin by summarizing commission activities over the 2012-2014 period, follow with a report on Polarimetry, continue with Photometry topics that have been of interest to C25 members, and conclude with a Vision for the Future.

  1. Measurement of free radical kinetics in pulsed plasmas by UV and VUV absorption spectroscopy and by modulated beam mass spectrometry

    NASA Astrophysics Data System (ADS)

    Cunge, G.; Bodart, P.; Brihoum, M.; Boulard, F.; Chevolleau, T.; Sadeghi, N.

    2012-04-01

    This paper reviews recent progress in the development of time-resolved diagnostics to probe high-density pulsed plasma sources. We focus on time-resolved measurements of radicals' densities in the afterglow of pulsed discharges to provide useful information on production and loss mechanisms of free radicals. We show that broad-band absorption spectroscopy in the ultraviolet and vacuum ultraviolet spectral domain and threshold ionization modulated beam mass spectrometry are powerful techniques for the determination of the time variation of the radicals' densities in pulsed plasmas. The combination of these complementary techniques allows detection of most of the reactive species present in industrial etching plasmas, giving insights into the physico-chemical reactions involving these species. As an example, we discuss briefly the radicals' kinetics in the afterglow of a SiCl4/Cl2/Ar discharge.
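
    For orientation, absorption measurements of this kind typically convert transmitted intensity into an absolute radical density via the Beer-Lambert law, n = ln(I0/I)/(sigma*L). A small hedged sketch with placeholder values (not the authors' calibration):

```python
# Sketch: radical density from an absorption measurement via the
# Beer-Lambert law, n = ln(I0/I) / (sigma * L). Values are placeholders.
import numpy as np

I0, I = 1.00, 0.92     # reference and transmitted intensities (relative)
sigma = 3.0e-17        # absorption cross-section [cm^2] (assumed)
L = 30.0               # absorption path length through the plasma [cm]

n = np.log(I0 / I) / (sigma * L)
print(f"radical density ~ {n:.2e} cm^-3")
```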

  2. Nuclear Resonance Fluorescence of U-235

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, Glen A.; Caggiano, Joseph A.; Hensley, Walter K.

    Nuclear resonance fluorescence is a physical process that provides an isotopic-specific signature that could be used for the identification and characterization of materials. The technique involves the detection of prompt discrete-energy photons emitted from a sample which is exposed to photons in the MeV energy range. Potential applications of the technique range from detection of high explosives to characterization of special nuclear materials. One isotope of significant interest is 235U. Pacific Northwest National Laboratory and Passport Systems have collaborated to conduct measurements to search for a nuclear resonance fluorescence response of 235U below 3 MeV using a 200 g sample of highly enriched uranium. Nine 235U resonances between 1650 and 2010 keV were identified in the preliminary analysis. Analysis of the measurement data to determine the integrated cross sections of the resonances is in progress.

  3. Brain Tissue Oxygen: In Vivo Monitoring with Carbon Paste Electrodes

    PubMed Central

    Bolger, Fiachra B.; Lowry, John P.

    2005-01-01

    In this communication we review selected experiments involving the use of carbon paste electrodes (CPEs) to monitor and measure brain tissue O2 levels in awake freely-moving animals. Simultaneous measurements of rCBF were performed using the H2 clearance technique. Voltammetric techniques used include both differential pulse (O2) and constant potential amperometry (rCBF). Mild hypoxia and hyperoxia produced rapid changes (decrease and increase respectively) in the in vivo O2 signal. Neuronal activation (tail pinch and stimulated grooming) produced similar increases in both O2 and rCBF, indicating that CPE O2 currents provide an index of increases in rCBF when such increases exceed O2 utilization. Saline injection produced a transient increase in the O2 signal while chloral hydrate produced slower, more long-lasting changes that accompanied the behavioral changes associated with anaesthesia. Acetazolamide increased O2 levels through an increase in rCBF.

  4. H2/O2 three-body rates at high temperatures

    NASA Technical Reports Server (NTRS)

    Marinelli, William J.; Kessler, William J.; Piper, Lawrence G.; Rawlins, W. Terry

    1990-01-01

    The extraction of thrust from air breathing hypersonic propulsion systems is critically dependent on the degree to which chemical equilibrium is reached in the combustion process. In the combustion of H2/Air mixtures, slow three-body chemical reactions involving H-atoms, O-atoms, and the OH radical play an important role in energy extraction. A first-generation high temperature and pressure flash-photolysis/laser-induced fluorescence reactor was designed and constructed to measure these important three-body rates. The system employs a high power excimer laser to produce these radicals via the photolysis of stable precursors. A novel two-photon laser-induced fluorescence technique is employed to detect H-atoms without optical thickness or O2 absorption problems. To demonstrate the feasibility of the technique the apparatus in the program is designed to perform preliminary measurements on the H + O2 + M reaction at temperatures from 300 to 835 K.

  5. Tip-enhanced Raman mapping with top-illumination AFM.

    PubMed

    Chan, K L Andrew; Kazarian, Sergei G

    2011-04-29

    Tip-enhanced Raman mapping is a powerful, emerging technique that offers rich chemical information and high spatial resolution. Currently, most of the successes in tip-enhanced Raman scattering (TERS) measurements are based on the inverted configuration where tips and laser are approaching the sample from opposite sides. This results in the limitation of measurement for transparent samples only. Several approaches have been developed to obtain tip-enhanced Raman mapping in reflection mode, many of which involve certain customisations of the system. We have demonstrated in this work that it is also possible to obtain TERS nano-images using an upright microscope (top-illumination) with a gold-coated Si atomic force microscope (AFM) cantilever without significant modification to the existing integrated AFM/Raman system. A TERS image of a single-walled carbon nanotube has been achieved with a spatial resolution of ∼ 20-50 nm, demonstrating the potential of this technique for studying non-transparent nanoscale materials.

  6. Acousto-ultrasonic nondestructive evaluation of materials using laser beam generation and detection. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Huber, Robert D.; Green, Robert E., Jr.

    1990-01-01

    The acousto-ultrasonic method has proven to be a most interesting technique for nondestructive evaluation of the mechanical properties of a variety of materials. Use of the technique or a modification thereof, has led to correlation of the associated stress wave factor with mechanical properties of both metals and composite materials. The method is applied to the nondestructive evaluation of selected fiber reinforced structural composites. For the first time, conventional piezoelectric transducers were replaced with laser beam ultrasonic generators and detectors. This modification permitted true non-contact acousto-ultrasonic measurements to be made, which yielded new information about the basic mechanisms involved as well as proved the feasibility of making such non-contact measurements on terrestrial and space structures and heat engine components. A state-of-the-art laser based acousto-ultrasonic system, incorporating a compact pulsed laser and a fiber-optic heterodyne interferometer, was delivered to the NASA Lewis Research Center.

  7. Coating and characterizing graphene oxide thin films via electrodeposition technique for possible application in photoelectrochemical splitting of water

    NASA Astrophysics Data System (ADS)

    Singh, Nirupama; Kumar, Pushpendra; Upadhyay, Sumant; Choudhary, Surbhi; Satsangi, Vibha R.; Dass, Sahab; Shrivastav, Rohit

    2013-06-01

    In the present study, ready-made graphene oxide (GO) was coated onto a conducting glass (ITO) substrate using an electrochemical deposition technique [1]. The D and G peaks in the Raman spectra, observed at 1346 and 1575 cm-1, confirmed the presence of GO [2]. UV-visible absorption measurements showed an absorption peak at 262 nm, and Tauc plots yielded a band-gap energy of about 3.9 eV. The PEC measurements involved determination of current-voltage (I-V) characteristics, both in darkness and under illumination. A photocurrent of 1.21 mA cm-2 at 0.5 V applied voltage (vs. saturated calomel electrode) was recorded under illumination from a 150 W xenon arc lamp (Oriel, USA). The photocurrent values were used to calculate the applied-bias photon-to-current efficiency (ABPE), which was estimated to be 0.98% at 0.5 V bias.
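
    The applied-bias photon-to-current efficiency quoted above is conventionally computed as ABPE(%) = J x (1.23 V - V_bias) / P_in x 100, with J the photocurrent density and P_in the incident illumination power density. The snippet below is a minimal sketch of that standard formula; the illumination power density used here is an assumed placeholder rather than a value reported in the abstract, so the result differs slightly from the quoted 0.98%.

```python
def abpe_percent(j_photo_ma_cm2: float, v_bias: float, p_in_mw_cm2: float) -> float:
    """Applied-bias photon-to-current efficiency (%) for photoelectrochemical water splitting.

    j_photo_ma_cm2 : photocurrent density in mA/cm^2
    v_bias         : applied bias in V (note: the reference-electrode convention matters)
    p_in_mw_cm2    : incident illumination power density in mW/cm^2 (assumed placeholder)
    """
    return j_photo_ma_cm2 * (1.23 - v_bias) / p_in_mw_cm2 * 100.0

# Example with the reported photocurrent and bias, and a hypothetical 100 mW/cm^2 source;
# with these placeholder inputs the result is ~0.9%, not the reported 0.98%.
print(f"ABPE = {abpe_percent(1.21, 0.5, 100.0):.2f} %")
```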

  8. Containerless Studies of Nucleation and Undercooling

    NASA Technical Reports Server (NTRS)

    Trinh, E. H.

    1985-01-01

    The long-term research goals are to perform experiments to determine the achievable limits of undercooling, the characteristics of heterogeneous nucleation, and the physical properties of significantly undercooled melts. The techniques used are based on the newly developed containerless manipulation methods afforded by acoustic levitation. Ground-based investigations involved 0.1 to 2 mm specimens of pure metals and alloys (In, Ga, Sn, Ga-In, ...) as well as glass-forming organic compounds (o-terphenyl). A currently operating ultrasonic high-temperature apparatus has allowed the ground-based levitation of 1 to 2 mm samples of solid aluminum at 550 deg C in an argon atmosphere. Present work is concentrating on the undercooling of pure metal samples (In, Sn) and on measurements of the surface tension and viscosity of the undercooled melts via shape-oscillation techniques monitored through optical detection methods. The sound velocity of undercooled o-terphenyl is being measured in an immiscible-liquid levitation cell.

  9. Detection of essential hypertension with physiological signals from wearable devices.

    PubMed

    Ghosh, Arindam; Torres, Juan Manuel Mayor; Danieli, Morena; Riccardi, Giuseppe

    2015-08-01

    Early detection of essential hypertension can support the prevention of cardiovascular disease, a leading cause of death. The traditional method of identifying hypertension involves periodic blood pressure measurement using brachial cuff-based devices. While these devices are non-invasive, they require manual setup for each measurement and are not suitable for continuous monitoring. Research has shown that physiological signals such as Heart Rate Variability, a measure of cardiac autonomic activity, are correlated with blood pressure. Wearable devices capable of measuring physiological signals such as Heart Rate, Galvanic Skin Response, and Skin Temperature have recently become ubiquitous. However, these signals are not accurate and are prone to noise from various artifacts. In this paper a) we present a data collection protocol for continuous non-invasive monitoring of physiological signals from wearable devices; b) we implement signal processing techniques for signal estimation; c) we explore how the continuous monitoring of these physiological signals can be used to identify hypertensive patients; and d) we conduct a pilot study with a group of normotensive and hypertensive patients to test our techniques. We show that physiological signals extracted from wearable devices can distinguish between these two groups with high accuracy.
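
    As one concrete illustration of the kind of signal processing such studies involve, time-domain heart rate variability features (SDNN, RMSSD, pNN50) can be computed from a series of inter-beat (RR) intervals and then passed to a classifier. The sketch below is generic, not the authors' pipeline, and the RR values are placeholders.

```python
import numpy as np

def hrv_time_domain(rr_ms: np.ndarray) -> dict:
    """Basic time-domain HRV features from RR intervals given in milliseconds."""
    diffs = np.diff(rr_ms)
    return {
        "mean_hr_bpm": 60000.0 / rr_ms.mean(),       # average heart rate
        "sdnn_ms": rr_ms.std(ddof=1),                # overall variability
        "rmssd_ms": np.sqrt(np.mean(diffs ** 2)),    # short-term variability
        "pnn50": np.mean(np.abs(diffs) > 50.0),      # fraction of successive diffs > 50 ms
    }

# Placeholder RR series (ms) for one analysis window from a wearable sensor.
rr = np.array([812, 790, 805, 798, 820, 776, 801, 815, 793, 808], dtype=float)
print(hrv_time_domain(rr))

# A downstream classifier (e.g., logistic regression over many such windows) would then
# attempt to separate normotensive from hypertensive recordings using these features.
```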

  10. Measurement of vibration using phase only correlation technique

    NASA Astrophysics Data System (ADS)

    Balachandar, S.; Vipin, K.

    2017-08-01

    A novel method for the measurement of vibration is proposed and demonstrated. The experiment is based on laser triangulation and consists of a line laser, the object under test, and a high-speed camera remotely controlled by software. It involves launching a line-laser probe beam perpendicular to the axis of the vibrating object; the reflected probe beam is recorded by the high-speed camera. The dynamic position of the laser line in the camera plane is governed by the magnitude and frequency of vibration of the test object. Using a phase-only correlation technique, the maximum distance travelled by the probe beam in the CCD plane is measured in pixels using MATLAB, and the actual displacement of the object in mm is obtained by calibration. From the displacement time history, other vibration-related quantities such as acceleration, velocity, and frequency are evaluated. Preliminary results of the proposed method are reported for accelerations from 1 g to 3 g and frequencies from 6 Hz to 26 Hz, and they closely match theoretical values. The advantage of the proposed method is that it is non-destructive, and the phase correlation algorithm allows subpixel displacements in the CCD plane to be measured with high accuracy.
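
    Phase correlation locates the shift between two frames as the peak of the inverse FFT of the normalized cross-power spectrum. The sketch below implements the generic algorithm in Python (the paper used MATLAB) on synthetic one-dimensional line profiles; the profiles and the parabolic sub-pixel refinement are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def phase_correlation_shift(a: np.ndarray, b: np.ndarray) -> float:
    """Estimate d (in samples) such that b[n] is approximately a[n - d]."""
    A, B = np.fft.fft(a), np.fft.fft(b)
    cross = np.conj(A) * B
    cross /= np.abs(cross) + 1e-12            # keep phase information only
    corr = np.real(np.fft.ifft(cross))
    k = int(np.argmax(corr))                  # integer-pixel peak location
    # Parabolic interpolation around the peak for sub-pixel resolution.
    y0, y1, y2 = corr[k - 1], corr[k], corr[(k + 1) % len(corr)]
    denom = y0 - 2 * y1 + y2
    frac = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    shift = k + frac
    if shift > len(a) / 2:                    # wrap to a signed shift
        shift -= len(a)
    return shift

# Synthetic laser-line intensity profile and a copy shifted by 3.4 pixels.
x = np.arange(512)
profile = np.exp(-0.5 * ((x - 256.0) / 6.0) ** 2)
shifted = np.exp(-0.5 * ((x - 259.4) / 6.0) ** 2)
print(f"estimated shift: {phase_correlation_shift(profile, shifted):.2f} px")
```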

  11. Purdue Rare Isotope Measurement Laboratory

    NASA Astrophysics Data System (ADS)

    Caffee, M.; Elmore, D.; Granger, D.; Muzikar, P.

    2002-12-01

    The Purdue Rare Isotope Measurement Laboratory (PRIME Lab) is a dedicated research and service facility for accelerator mass spectrometry (AMS). AMS is an ultra-sensitive analytical technique used to measure low levels of long-lived cosmic-ray-produced and anthropogenic radionuclides, and rare trace elements. We measure 10Be (T1/2 = 1.5 My), 26Al (0.702 My), 36Cl (0.301 My), and 129I (16 My) in geologic samples. Applications include dating the cosmic-ray-exposure time of rocks on Earth's surface, determining rock and sediment burial ages, measuring the erosion rates of rocks and soils, and tracing and dating ground water. We perform sample preparation and separation chemistries for these radionuclides for our internal research activities and for external researchers who do not have this capability. Our chemical preparation laboratories also serve as training sites for members of the geoscience community developing these techniques at their institutions. Research at Purdue involves collaborators from the Purdue Departments of Physics, Earth and Atmospheric Sciences, Chemistry, Agronomy, and Anthropology. We also collaborate with and serve numerous scientists from other institutions. We are currently modernizing the facility with the goals of higher precision for routinely measured radionuclides, increased sample throughput, and the development of new measurement capabilities for the geoscience community.
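
    For surface-exposure dating with a cosmogenic nuclide such as 10Be, the simplest textbook model (constant production rate, no erosion, no inherited nuclides) gives N = (P/lambda)(1 - exp(-lambda t)), which can be inverted for the exposure age t. The sketch below applies that relation with placeholder concentration and production-rate values; it is an illustration of the idea, not a description of PRIME Lab's actual data reduction.

```python
import math

def simple_exposure_age(n_atoms_g: float, prod_rate: float, half_life_yr: float) -> float:
    """Exposure age (yr) from a nuclide concentration, assuming constant production,
    no erosion and no inheritance: N = (P/lambda) * (1 - exp(-lambda * t))."""
    lam = math.log(2.0) / half_life_yr
    ratio = n_atoms_g * lam / prod_rate
    if ratio >= 1.0:
        raise ValueError("concentration at or above saturation for these parameters")
    return -math.log(1.0 - ratio) / lam

# Placeholder values: 10Be concentration (atoms/g quartz) and local production rate (atoms/g/yr).
age = simple_exposure_age(n_atoms_g=2.5e5, prod_rate=5.0, half_life_yr=1.5e6)
print(f"apparent exposure age ~ {age / 1e3:.1f} kyr")
```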

  12. Jet Mixing Enhancement by Feedback Control

    NASA Technical Reports Server (NTRS)

    Glauser, Mark; Taylor, Jeffrey

    1999-01-01

    The objective of this work has been to produce methodologies for high-speed jet noise reduction based on natural mechanisms and enhanced feedback control to affect frequencies and structures in a prescribed manner. In this effort, the two-point hot-wire measurements obtained in the Langley jet facility by Ukeiley were used in conjunction with linear stochastic estimation (LSE) to implement the LSE component of the complementary technique. This method combines the Proper Orthogonal Decomposition (POD) and LSE to provide an experimental low-dimensional, time-dependent description of the flow field. From such a description it should be possible to identify short-time, high-strain-rate events in the jet which contribute to the noise. The main task completed for this effort is summarized as follows: LSE experiments were performed at the downstream locations where the two-point hot-wire measurements had been obtained by Ukeiley. These experiments involved simultaneously sampling hot-wire signals from a relatively coarse spatial grid in gamma and theta. From these simultaneous data, coupled with the two-point measurements of Ukeiley via the LSE component of the complementary technique, an experimental low-dimensional description of the jet at 4, 5, 6, 7 and 8 diameters downstream was obtained for Mach numbers of 0.3 and 0.6. We first present an overview of the theory involved, then a statement of the work performed, and finally provide charts from a 1999 APS talk which summarize the results.
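
    Linear stochastic estimation approximates the velocity at an unmeasured point as a linear combination of simultaneously sampled reference-probe signals, with coefficients obtained from two-point correlations by solving R_uu a = R_uv. The sketch below shows that least-squares construction on synthetic signals; the probe count, weights, and noise level are placeholders, not the Langley jet data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic simultaneous reference-probe signals u (n_probes x n_samples)
# and a conditional signal v at the point to be estimated.
n_probes, n_samples = 8, 20000
u = rng.standard_normal((n_probes, n_samples))
true_weights = rng.standard_normal(n_probes)
v = true_weights @ u + 0.3 * rng.standard_normal(n_samples)   # correlated target plus noise

# LSE coefficients from two-point correlations: solve R_uu a = R_uv.
R_uu = u @ u.T / n_samples          # probe-probe correlation matrix
R_uv = u @ v / n_samples            # probe-target correlation vector
a = np.linalg.solve(R_uu, R_uv)

v_est = a @ u                       # low-dimensional linear estimate of v(t)
corr = np.corrcoef(v, v_est)[0, 1]
print(f"correlation between estimated and true signal: {corr:.3f}")
```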

  13. Comparison of heliox and oxygen as washing gases for the nitrogen washout technique in preterm infants.

    PubMed

    Poets, C F; Rau, G A; Gappa, M; Seidenberg, J

    1996-06-01

    The nitrogen washout technique usually involves exposure of the patient to 100% oxygen for several minutes. This may be dangerous in preterm infants who are at risk of retinopathy of prematurity (ROP). We wanted to know whether heliox (79% He, 21% O2) can be used instead of oxygen when determining functional residual capacity (FRC). FRC measurements were made in 14 preterm infants [median (range) gestational age at birth 34 wk (27-37 wk), and at time of study 36 wk (33-40 wk)] who were breathing room air. FRC was measured using a computerized infant pulmonary function system, beginning in random order with either 100% O2 followed by heliox or vice versa. There was no systematic difference between the two methods with regard to lung volume measurements: mean (SD) FRC values, corrected for body weight, were 22.9 (7.1) mL/kg for O2 and 23.4 (7.0) mL/kg for heliox. We did not observe a systematic influence of the type of washing gas used (heliox or oxygen) on FRC in these infants. Our results suggest that the use of heliox instead of pure oxygen may be a suitable and safer alternative for FRC measurements with the nitrogen washout technique in preterm infants who are breathing low concentrations of inspired oxygen and are still at risk of ROP.

  14. Using frequency response functions to manage image degradation from equipment vibration in the Daniel K. Inouye Solar Telescope

    NASA Astrophysics Data System (ADS)

    McBride, William R.; McBride, Daniel R.

    2016-08-01

    The Daniel K. Inouye Solar Telescope (DKIST) will be the largest solar telescope in the world, providing a significant increase in the resolution of solar data available to the scientific community. Vibration mitigation is critical in long focal-length telescopes such as the Inouye Solar Telescope, especially when adaptive optics are employed to correct for atmospheric seeing. For this reason, a vibration error budget has been implemented. Initially, the frequency response functions (FRFs) for the various mounting points of ancillary equipment were estimated using finite element analysis (FEA) of the telescope structures. FEA is well documented and understood; the focus of this paper is on the methods involved in estimating a set of experimental (measured) transfer functions of the as-built telescope structure for the purpose of vibration management. Techniques to measure low-frequency single-input-single-output (SISO) FRFs between vibration source locations and image motion on the focal plane are described. The measurement equipment includes an instrumented inertial-mass shaker capable of operation down to 4 Hz along with seismic accelerometers. The measurement of vibration at frequencies below 10 Hz with good signal-to-noise ratio (SNR) requires several noise reduction techniques, including high-performance windows, noise averaging, tracking filters, and spectral estimation. These signal-processing techniques are described in detail.
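
    For SISO measurements of this kind, the FRF is commonly formed with the H1 estimator, the ratio of the input-output cross-spectral density to the input auto-spectral density, computed with windowing and Welch-style averaging. The sketch below uses scipy on a synthetic single-resonance system; the shaker signal, resonance frequency, and noise level are placeholders, not DKIST data.

```python
import numpy as np
from scipy import signal

fs = 200.0                                    # sample rate, Hz
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(2)

# Broadband "shaker" force input and the response of a lightly damped resonance near 12 Hz.
x = rng.standard_normal(t.size)
sos = signal.butter(2, [10, 14], btype="bandpass", fs=fs, output="sos")
y = signal.sosfilt(sos, x) + 0.05 * rng.standard_normal(t.size)   # output plus sensor noise

# H1 estimator: H1(f) = S_xy(f) / S_xx(f), with Hann windows and segment averaging.
nperseg = 2048
f, S_xy = signal.csd(x, y, fs=fs, window="hann", nperseg=nperseg)
_, S_xx = signal.welch(x, fs=fs, window="hann", nperseg=nperseg)
_, coh = signal.coherence(x, y, fs=fs, window="hann", nperseg=nperseg)
H1 = S_xy / S_xx

ipeak = int(np.argmax(np.abs(H1)))
print(f"FRF magnitude peaks near {f[ipeak]:.1f} Hz; coherence there ~ {coh[ipeak]:.2f}")
```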

  15. Cervical motion testing: methodology and clinical implications.

    PubMed

    Prushansky, Tamara; Dvir, Zeevi

    2008-09-01

    Measurement of cervical motion (CM) is probably the most commonly applied functional outcome measure in assessing the status of patients with cervical pathology. In general terms, CM refers to motion of the head relative to the trunk as well as conjunct motions within the cervical spine. Multiple techniques and instruments have been used for assessing CM. These were associated with a wide variety of parameters relating to accuracy, reproducibility, and validity. Modern measurement systems enable recording, processing, and documentation of CM with a high degree of precision. Cervical motion measures provide substantial information regarding the severity of motion limitation and level of effort in cervically involved patients. They may also be used for following up performance during and after conservative or invasive interventions.

  16. Process modelling for materials preparation experiments

    NASA Technical Reports Server (NTRS)

    Rosenberger, Franz; Alexander, J. Iwan D.

    1994-01-01

    The main goals of the research under this grant consist of the development of mathematical tools and measurement techniques for transport properties necessary for high fidelity modelling of crystal growth from the melt and solution. Of the tasks described in detail in the original proposal, two remain to be worked on: development of a spectral code for moving boundary problems, and development of an expedient diffusivity measurement technique for concentrated and supersaturated solutions. We have focused on developing a code to solve for interface shape, heat and species transport during directional solidification. The work involved the computation of heat, mass and momentum transfer during Bridgman-Stockbarger solidification of compound semiconductors. Domain decomposition techniques and preconditioning methods were used in conjunction with Chebyshev spectral methods to accelerate convergence while retaining the high-order spectral accuracy. During the report period we have further improved our experimental setup. These improvements include: temperature control of the measurement cell to 0.1 C between 10 and 60 C; enclosure of the optical measurement path outside the ZYGO interferometer in a metal housing that is temperature controlled to the same temperature setting as the measurement cell; simultaneous dispensing and partial removal of the lower concentration (lighter) solution above the higher concentration (heavier) solution through independently motor-driven syringes; three-fold increase in data resolution by orientation of the interferometer with respect to diffusion direction; and increase of the optical path length in the solution cell to 12 mm.
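
    Chebyshev spectral methods of the kind mentioned above represent derivatives with a dense differentiation matrix built on Chebyshev-Gauss-Lobatto points. The sketch below constructs the standard matrix (following the well-known Trefethen formulation) and checks it on a smooth test function; it illustrates the numerical machinery in general, not the authors' solidification code.

```python
import numpy as np

def cheb(n: int):
    """Chebyshev differentiation matrix D and Gauss-Lobatto points x on [-1, 1]."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))   # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                        # diagonal: negative row sums
    return D, x

# Spectral accuracy check on a smooth function.
D, x = cheb(24)
u = np.exp(x) * np.sin(5 * x)
du_exact = np.exp(x) * (np.sin(5 * x) + 5 * np.cos(5 * x))
print(f"max derivative error: {np.max(np.abs(D @ u - du_exact)):.2e}")
```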

  17. Absolute measurement of hadronic branching fractions of the Ds+ meson.

    PubMed

    Alexander, J P; Berkelman, K; Cassel, D G; Duboscq, J E; Ehrlich, R; Fields, L; Gibbons, L; Gray, R; Gray, S W; Hartill, D L; Heltsley, B K; Hertz, D; Jones, C D; Kandaswamy, J; Kreinick, D L; Kuznetsov, V E; Mahlke-Krüger, H; Mohapatra, D; Onyisi, P U E; Patterson, J R; Peterson, D; Riley, D; Ryd, A; Sadoff, A J; Shi, X; Stroiney, S; Sun, W M; Wilksen, T; Athar, S B; Patel, R; Yelton, J; Rubin, P; Eisenstein, B I; Karliner, I; Mehrabyan, S; Lowrey, N; Selen, M; White, E J; Wiss, J; Mitchell, R E; Shepherd, M R; Besson, D; Pedlar, T K; Cronin-Hennessy, D; Gao, K Y; Hietala, J; Kubota, Y; Klein, T; Lang, B W; Poling, R; Scott, A W; Zweber, P; Dobbs, S; Metreveli, Z; Seth, K K; Tomaradze, A; Libby, J; Powell, A; Wilkinson, G; Ecklund, K M; Love, W; Savinov, V; Lopez, A; Mendez, H; Ramirez, J; Ge, J Y; Miller, D H; Sanghi, B; Shipsey, I P J; Xin, B; Adams, G S; Anderson, M; Cummings, J P; Danko, I; Hu, D; Moziak, B; Napolitano, J; He, Q; Insler, J; Muramatsu, H; Park, C S; Thorndike, E H; Yang, F; Artuso, M; Blusk, S; Khalil, S; Li, J; Mountain, R; Nisar, S; Randrianarivony, K; Sultana, N; Skwarnicki, T; Stone, S; Wang, J C; Zhang, L M; Bonvicini, G; Cinabro, D; Dubrovin, M; Lincoln, A; Rademacker, J; Asner, D M; Edwards, K W; Naik, P; Briere, R A; Ferguson, T; Tatishvili, G; Vogel, H; Watkins, M E; Rosner, J L

    2008-04-25

    The branching fractions of Ds± meson decays serve to normalize many measurements of processes involving charm quarks. Using 298 pb-1 of e+e- collisions recorded at a center of mass energy of 4.17 GeV, we determine absolute branching fractions for eight Ds± decays with a double tag technique. In particular we determine the branching fraction B(Ds+ → K-K+π+) = (5.50 ± 0.23 ± 0.16)%, where the uncertainties are statistical and systematic, respectively. We also provide partial branching fractions for kinematic subsets of the K-K+π+ decay mode.

  18. Pathogenesis and treatment of pseudofolliculitis barbae.

    PubMed

    Brown, L A

    1983-10-01

    Pseudofolliculitis barbae is a condition in which a foreign body inflammatory reaction surrounds an ingrown hair. Shaving is the major cause, especially in persons with wavy or curly hair. Among black men who shave, the disease is of particular concern because of both social pressures and limited medical understanding concerning its treatment. The pathogenesis of the disease as well as treatment modalities are presented. General measures, as well as specific techniques involving use of electric clippers, chemical depilatories, manual razors, and complete epilation are discussed. Adjuvant measures are presented such as antibiotics and, in very special cases, retinoic acid.

  19. Differential surface stress sensor for detection of chemical and biological species

    NASA Astrophysics Data System (ADS)

    Kang, K.; Nilsen-Hamilton, M.; Shrotriya, P.

    2008-10-01

    We report a sensor consisting of two micromachined cantilevers (a sensing/reference pair) that is suitable for the detection of chemical and biological species. The sensing strategy involves coating the sensing cantilever with receptors that have high affinities for the analyte. The presence of the analyte is detected by determining the differential surface stress associated with its adsorption/absorption onto the sensing cantilever. An interferometric technique is utilized to measure the differential bending of the sensing cantilever with respect to the reference. Surface stress associated with hybridization of single-stranded DNA is measured to demonstrate the unique advantages of the sensor.

  20. Absolute Measurement of Hadronic Branching Fractions of the Ds+ Meson

    NASA Astrophysics Data System (ADS)

    Alexander, J. P.; Berkelman, K.; Cassel, D. G.; Duboscq, J. E.; Ehrlich, R.; Fields, L.; Gibbons, L.; Gray, R.; Gray, S. W.; Hartill, D. L.; Heltsley, B. K.; Hertz, D.; Jones, C. D.; Kandaswamy, J.; Kreinick, D. L.; Kuznetsov, V. E.; Mahlke-Krüger, H.; Mohapatra, D.; Onyisi, P. U. E.; Patterson, J. R.; Peterson, D.; Riley, D.; Ryd, A.; Sadoff, A. J.; Shi, X.; Stroiney, S.; Sun, W. M.; Wilksen, T.; Athar, S. B.; Patel, R.; Yelton, J.; Rubin, P.; Eisenstein, B. I.; Karliner, I.; Mehrabyan, S.; Lowrey, N.; Selen, M.; White, E. J.; Wiss, J.; Mitchell, R. E.; Shepherd, M. R.; Besson, D.; Pedlar, T. K.; Cronin-Hennessy, D.; Gao, K. Y.; Hietala, J.; Kubota, Y.; Klein, T.; Lang, B. W.; Poling, R.; Scott, A. W.; Zweber, P.; Dobbs, S.; Metreveli, Z.; Seth, K. K.; Tomaradze, A.; Libby, J.; Powell, A.; Wilkinson, G.; Ecklund, K. M.; Love, W.; Savinov, V.; Lopez, A.; Mendez, H.; Ramirez, J.; Ge, J. Y.; Miller, D. H.; Sanghi, B.; Shipsey, I. P. J.; Xin, B.; Adams, G. S.; Anderson, M.; Cummings, J. P.; Danko, I.; Hu, D.; Moziak, B.; Napolitano, J.; He, Q.; Insler, J.; Muramatsu, H.; Park, C. S.; Thorndike, E. H.; Yang, F.; Artuso, M.; Blusk, S.; Khalil, S.; Li, J.; Mountain, R.; Nisar, S.; Randrianarivony, K.; Sultana, N.; Skwarnicki, T.; Stone, S.; Wang, J. C.; Zhang, L. M.; Bonvicini, G.; Cinabro, D.; Dubrovin, M.; Lincoln, A.; Rademacker, J.; Asner, D. M.; Edwards, K. W.; Naik, P.; Briere, R. A.; Ferguson, T.; Tatishvili, G.; Vogel, H.; Watkins, M. E.; Rosner, J. L.

    2008-04-01

    The branching fractions of Ds± meson decays serve to normalize many measurements of processes involving charm quarks. Using 298 pb-1 of e+e- collisions recorded at a center of mass energy of 4.17 GeV, we determine absolute branching fractions for eight Ds± decays with a double tag technique. In particular we determine the branching fraction B(Ds+→K-K+π+)=(5.50±0.23±0.16)%, where the uncertainties are statistical and systematic, respectively. We also provide partial branching fractions for kinematic subsets of the K-K+π+ decay mode.

  1. Measurement techniques for analysis of fission fragment excited gases

    NASA Technical Reports Server (NTRS)

    Schneider, R. T.; Carroll, E. E.; Davis, J. F.; Davie, R. N.; Maguire, T. C.; Shipman, R. G.

    1976-01-01

    Spectroscopic analyses of fission-fragment-excited He, Ar, Xe, N2, Ne, Ar-N2, and Ne-N2 have been conducted. Boltzmann plot analysis of He, Ar, and Xe has indicated a nonequilibrium, recombining plasma, and population inversions have been found in these gases. The observed radiating species in helium have been adequately described by a simple kinetic model. A more extensive model for argon, nitrogen, and Ar-N2 mixtures was developed which adequately describes the energy flow in the system and compares favorably with experimental measurements. The kinetic processes involved in these systems are discussed.
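
    A Boltzmann plot extracts an excitation temperature from relative line intensities: for each line, ln(I λ / (g A)) is plotted against the upper-level energy, and the slope equals -1/(k_B T) for a Boltzmann-distributed population; strong departures from a straight line are one signature of a nonequilibrium, recombining plasma. The sketch below performs that fit on invented line data, not on the measurements described above.

```python
import numpy as np

K_B_EV = 8.617e-5   # Boltzmann constant, eV/K

def boltzmann_plot_temperature(intensity, wavelength_nm, g_upper, a_ul, e_upper_ev):
    """Excitation temperature (K) from the slope of ln(I*lambda/(g*A)) vs upper-level energy."""
    y = np.log(np.asarray(intensity) * np.asarray(wavelength_nm)
               / (np.asarray(g_upper) * np.asarray(a_ul)))
    slope, _ = np.polyfit(np.asarray(e_upper_ev), y, 1)
    return -1.0 / (slope * K_B_EV)

# Placeholder emission-line data (arbitrary intensity units, hypothetical atomic constants).
e_upper = np.array([13.0, 13.3, 13.6, 14.1])      # eV
g = np.array([3, 5, 3, 7])
A = np.array([2.0e7, 1.1e7, 3.4e7, 0.9e7])        # s^-1
lam = np.array([750.4, 763.5, 772.4, 794.8])      # nm
T_true = 9000.0                                    # K, used to synthesize the intensities
I = g * A / lam * np.exp(-e_upper / (K_B_EV * T_true))

print(f"fitted excitation temperature ~ {boltzmann_plot_temperature(I, lam, g, A, e_upper):.0f} K")
```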

  2. Shuttle GPS R/PA configuration and specification study

    NASA Technical Reports Server (NTRS)

    Booth, R. W. D.

    1979-01-01

    Changes in the technical specifications for a global positioning system (GPS) receiving system dedicated to space shuttle use are presented. Various hardware functions, including acquisition, tracking, and measurement, are emphasized. The anti-jam performance of the baseline GPS systems is evaluated. Other topics addressed include: the impact on R/PA design of the use of ground-based transmitters; problems involved with the use of single-channel test sets; the utility of various R/PA antenna interconnection topologies; the choice of the averaging interval for delta-range measurements; and the use of interferometry techniques for the computation of orbiter attitude.

  3. A review of experimental investigations on thermal phenomena in nanofluids

    PubMed Central

    2011-01-01

    Nanoparticle suspensions (nanofluids) have been recommended as a promising option for various engineering applications, due to the observed enhancement of thermophysical properties and improvement in the effectiveness of thermal phenomena. A number of investigations have been reported in the recent past, in order to quantify the thermo-fluidic behavior of nanofluids. This review is focused on examining and comparing the measurements of convective heat transfer and phase change in nanofluids, with an emphasis on the experimental techniques employed to measure the effective thermal conductivity, as well as to characterize the thermal performance of systems involving nanofluids. PMID:21711918

  4. Nuclear data for r-process models from ion trap measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Jason, E-mail: jclark@anl.gov

    2016-06-21

    To truly understand how elements are created in the universe via the astrophysical r process, accurate nuclear data are required. Historically, the isotopes involved in the r process have been difficult to access for study, but the development of new facilities and measurement techniques has put many of the r-process isotopes within reach. This paper discusses the new CARIBU facility at Argonne National Laboratory and two pieces of experimental equipment, the Beta-decay Paul Trap and the Canadian Penning Trap, that will dramatically increase the nuclear data available for models of the astrophysical r process.

  5. Measurement requirements and techniques for degradation studies and lifetime prediction testing of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Noel, G. T.; Sliemers, F. A.; Derringer, G. C.; Wood, V. E.; Wilkes, K. E.; Gaines, G. B.; Carmichael, D. C.

    1978-01-01

    Tests of weathering and aging behavior are being developed to characterize the degradation and predict the lifetimes of low-cost photovoltaic arrays. Environmental factors which affect array performance include UV radiation, thermal energy, water, oxygen (generally involved in synergistic effects with UV radiation or high temperatures), physical stress, pollutants (oxides of nitrogen, sulfur dioxide and ozone), abrasives and dirt. A survey of photovoltaic array testing has shown the need to establish quantitative correlations between certain measurable properties (carbonyl formation, glass transition temperature, and molecular weight change) and modes of degradation and failure.

  6. Size and shape measurement in contemporary cephalometrics.

    PubMed

    McIntyre, Grant T; Mossey, Peter A

    2003-06-01

    The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.

  7. Coverage by land, sea, and airplane surveys, 1900-1967.

    NASA Technical Reports Server (NTRS)

    Fabiano, E.; Cain, S. J.

    1971-01-01

    The worldwide coverage of the earth by land, sea, and aircraft magnetic surveys since the beginning of the 20th century is shown on three world maps for surface surveys spanning the periods of 1900-1930, 1930-1955, and 1955-1967, respectively, on a fourth map for ship-towed magnetometer surveys performed after 1956, and on a fifth map for 1953-1966 airborne survey data. The technique used, involving a position plotting of each measurement with a microfilm plotter, results in the appearance of heavily surveyed regions as completely darkened areas. The coverage includes measurements at about 100,000 land stations, airborne measurements at over 90,000 points, and marine measurements at over 25,000 points. The marine measurements cover over 1,000,000 km of trackline.

  8. Laser Calorimetry Spectroscopy for ppm-level Dissolved Gas Detection and Analysis

    PubMed Central

    K. S., Nagapriya; Sinha, Shashank; R., Prashanth; Poonacha, Samhitha; Chaudhry, Gunaranjan; Bhattacharya, Anandaroop; Choudhury, Niloy; Mahalik, Saroj; Maity, Sandip

    2017-01-01

    In this paper we report a newly developed technique, laser calorimetry spectroscopy (LCS), which is a combination of laser absorption spectroscopy and calorimetry, for the detection of gases dissolved in liquids. The technique involves determination of the concentration of a dissolved gas by irradiating the liquid with light of a wavelength at which the gas absorbs, and measuring the temperature change caused by the absorbance. Conventionally, detection of dissolved gases with sufficient sensitivity and specificity was done by first extracting the gases from the liquid and then analyzing them using techniques such as gas chromatography. Using LCS, we have been able to detect ppm levels of dissolved gases without extracting them from the liquid. In this paper, we show the detection of dissolved acetylene in transformer oil in the mid-infrared (MIR) wavelength region (3021 nm). PMID:28218304
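
    The measurement principle couples Beer-Lambert absorption to a simple heat balance: the optical power absorbed by the dissolved gas raises the liquid temperature, and for a calibrated cell the temperature rise maps back to concentration. The sketch below is a back-of-the-envelope forward model under idealized assumptions (no heat loss, uniform heating); all numbers are placeholders, not values from the paper.

```python
def temperature_rise(p_laser_w, epsilon_l_per_mol_cm, conc_mol_l, path_cm,
                     mass_g, cp_j_per_g_k, exposure_s):
    """Idealized temperature rise of an irradiated liquid sample.

    Absorbed power follows Beer-Lambert; all absorbed energy is assumed to remain
    in the sample (no conduction or convection losses) and to heat it uniformly.
    """
    absorbance = epsilon_l_per_mol_cm * conc_mol_l * path_cm
    p_absorbed = p_laser_w * (1.0 - 10.0 ** (-absorbance))
    return p_absorbed * exposure_s / (mass_g * cp_j_per_g_k)

# Placeholder numbers for a ppm-level dissolved gas in oil irradiated near an absorption band.
dT = temperature_rise(p_laser_w=0.05, epsilon_l_per_mol_cm=30.0, conc_mol_l=2e-5,
                      path_cm=5.0, mass_g=10.0, cp_j_per_g_k=1.8, exposure_s=30.0)
print(f"expected temperature rise ~ {dT * 1e3:.2f} mK")
```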

  9. Nuclear constraints on the age of the universe

    NASA Technical Reports Server (NTRS)

    Schramm, D. N.

    1983-01-01

    A review is made of how one can use nuclear physics to put rather stringent limits on the age of the universe and thus the cosmic distance scale. The age can be estimated to a fair degree of accuracy. No single measurement of the time since the Big Bang gives a specific, unambiguous age. There are several methods that together fix the age with surprising precision. In particular, there are three totally independent techniques for estimating an age and a fourth technique which involves finding consistency of the other three in the framework of the standard Big Bang cosmological model. The three independent methods are: cosmological dynamics, the age of the oldest stars, and radioactive dating. This paper concentrates on the third of the three methods, and the consistency technique. Previously announced in STAR as N83-34868

  10. Nanoscale visualization of redox activity at lithium-ion battery cathodes.

    PubMed

    Takahashi, Yasufumi; Kumatani, Akichika; Munakata, Hirokazu; Inomata, Hirotaka; Ito, Komachi; Ino, Kosuke; Shiku, Hitoshi; Unwin, Patrick R; Korchev, Yuri E; Kanamura, Kiyoshi; Matsue, Tomokazu

    2014-11-17

    Intercalation and deintercalation of lithium ions at electrode surfaces are central to the operation of lithium-ion batteries. Yet, on the most important composite cathode surfaces, this is a rather complex process involving spatially heterogeneous reactions that have proved difficult to resolve with existing techniques. Here we report a scanning electrochemical cell microscope based approach to define a mobile electrochemical cell that is used to quantitatively visualize electrochemical phenomena at the battery cathode material LiFePO4, with resolution of ~100 nm. The technique measures electrode topography and different electrochemical properties simultaneously, and the information can be combined with complementary microscopic techniques to reveal new perspectives on structure and activity. These electrodes exhibit highly spatially heterogeneous electrochemistry at the nanoscale, both within secondary particles and at individual primary nanoparticles, which is highly dependent on the local structure and composition.

  11. Laser Calorimetry Spectroscopy for ppm-level Dissolved Gas Detection and Analysis.

    PubMed

    K S, Nagapriya; Sinha, Shashank; R, Prashanth; Poonacha, Samhitha; Chaudhry, Gunaranjan; Bhattacharya, Anandaroop; Choudhury, Niloy; Mahalik, Saroj; Maity, Sandip

    2017-02-20

    In this paper we report a newly developed technique, laser calorimetry spectroscopy (LCS), which is a combination of laser absorption spectroscopy and calorimetry, for the detection of gases dissolved in liquids. The technique involves determination of the concentration of a dissolved gas by irradiating the liquid with light of a wavelength at which the gas absorbs, and measuring the temperature change caused by the absorbance. Conventionally, detection of dissolved gases with sufficient sensitivity and specificity was done by first extracting the gases from the liquid and then analyzing them using techniques such as gas chromatography. Using LCS, we have been able to detect ppm levels of dissolved gases without extracting them from the liquid. In this paper, we show the detection of dissolved acetylene in transformer oil in the mid-infrared (MIR) wavelength region (3021 nm).

  12. Heterodyne method for high specificity gas detection.

    NASA Technical Reports Server (NTRS)

    Dimeff, J.; Donaldson, R. W.; Gunter, W. D., Jr.; Jaynes, D. N.; Margozzi, A. P.; Deboo, G. J.; Mcclatchie, E. A.; Williams, K. G.

    1971-01-01

    This paper describes a new technique for measuring trace quantities of gases. The technique involves the use of a reference cell (containing a known amount of the gas being sought) and a sample cell (containing an unknown amount of the same gas) wherein the gas densities are modulated. Light passing through the two cells in sequence is modulated in intensity at the vibrational-rotational lines characteristic of the absorption spectrum of the gas of interest. Since the absorption process is nonlinear, modulating the two absorption cells at two different frequencies gives rise to a heterodyning effect, which in turn introduces sum and difference frequencies in the detected signal. Measuring the ratio of, for example, the difference-frequency signal to the signal introduced by the reference cell provides a normalized measure of the amount of the gas in the sample cell. The readings produced are thereby independent of source intensity, window transparency, and detector sensitivity. Experimental evaluation of the technique suggests that it should be applicable to a wide range of gases, that it should be able to reject spurious signals due to unwanted gases, and that it should be sensitive to concentrations of the order of 10^-8 when used with a sample cell of only 20 cm length.
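
    Because transmission is exponential (hence nonlinear) in the summed optical depths, modulating the reference and sample cells at two different frequencies produces intermodulation components at the sum and difference frequencies whose size scales with the product of the two absorbances. The sketch below simulates that effect for an idealized two-cell system and reads the difference-frequency line from an FFT; the absorbances, frequencies, and modulation depths are invented for illustration only.

```python
import numpy as np

fs = 5000.0                           # sample rate, Hz
t = np.arange(0, 10, 1 / fs)
f_ref, f_sam = 170.0, 130.0           # modulation frequencies of reference and sample cells

# Optical depths of the two cells, modulated at different frequencies.
alpha_ref = 0.20 * (1 + 0.8 * np.sin(2 * np.pi * f_ref * t))   # known reference gas
alpha_sam = 0.02 * (1 + 0.8 * np.sin(2 * np.pi * f_sam * t))   # unknown trace gas

# Transmitted intensity through both cells in sequence (Beer-Lambert, nonlinear in alpha).
intensity = np.exp(-(alpha_ref + alpha_sam))

spec = np.abs(np.fft.rfft(intensity)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def line(f0):
    """Spectral amplitude at the bin nearest f0."""
    return spec[np.argmin(np.abs(freqs - f0))]

# Normalizing the difference-frequency line by the reference line cancels source power,
# window transmission and detector gain, leaving a measure of the sample absorbance.
print(f"difference-frequency ({f_ref - f_sam:.0f} Hz) line: {line(f_ref - f_sam):.3e}")
print(f"reference ({f_ref:.0f} Hz) line: {line(f_ref):.3e}")
print(f"normalized ratio: {line(f_ref - f_sam) / line(f_ref):.4f}")
```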

  13. Facial Soft Tissue Measurement in Microgravity-induces Fluid Shifts

    NASA Technical Reports Server (NTRS)

    Marshburn, Thomas; Cole, Richard; Pavela, James; Garcia, Kathleen; Sargsyan, Ashot

    2014-01-01

    Fluid shifts are a well-known phenomenon in microgravity, and one result is facial edema. Objective measurement of tissue thickness in a standardized location could provide a correlate with the severity of the fluid shift. Previous studies of forehead tissue thickness (TTf) suggest that when exposed to environments that cause fluid shifts, including hypergravity, head-down tilt, and high-altitude/low-pressure exposure, TTf changes in a consistent and measurable fashion. However, the technique used in past studies is not well described or standardized. The International Space Station (ISS) houses an ultrasound (US) system capable of accurate sub-millimeter measurements of TTf. We undertook to measure TTf during long-duration space flight using a new accurate, repeatable, and transferable technique. Methods: In-flight and post-flight B-mode ultrasound images of a single astronaut's facial soft tissues were obtained using a Vivid-q US system with a 12L-RS high-frequency linear array probe (General Electric, USA). Strictly mid-sagittal images were obtained involving the lower frontal bone, the nasofrontal angle, and the osseo-cartilaginous junction below. Single images were chosen for comparison that contained identical views of the bony landmarks and identical acoustical interface between the probe and skin. Using Gingko CADx DICOM viewing software, soft tissue thickness was measured at a right angle from the most prominent point of the inferior frontal bone to the epidermis. Four independent thickness measurements were made. Conclusions: Forehead tissue thickness measurement by ultrasound in microgravity is feasible, and our data suggest a decrease in tissue thickness upon return from the microgravity environment, which is likely related to the cessation of fluid shifts. Further study is warranted to standardize the technique with regard to the individual variability of the local anatomy in this area.

  14. Lidar In-space Technology Experiment: Overview and early results

    NASA Technical Reports Server (NTRS)

    McCormick, M. Patrick

    1995-01-01

    The September 1994 Shuttle flight of the Lidar In-space Technology Experiment (LITE) brought to fruition 10 years of effort at NASA's Langley Research Center where it was built. Being the first flight of a spaceborne lidar to measure atmospheric constituents and parameters and surface properties, it culminates the efforts of many worldwide over the last 20 years to usher in this new remote sensing technique from space. This paper will describe the LITE instrument, the in-orbit performance, and initial results. In addition, the global correlative measurements program will be outlined which involved 60 groups in 20 countries who made various simultaneous ground-based or aircraft measurements as LITE flew overhead.

  15. Resolving phase information of the optical local density of state with scattering near-field probes

    NASA Astrophysics Data System (ADS)

    Prasad, R.; Vincent, R.

    2016-10-01

    We theoretically discuss the link between the phase measured using scattering-type scanning near-field optical microscopy (s-SNOM) and the local density of optical states (LDOS). A remarkable result is that the LDOS information is directly contained in the phase of the probe signal. Therefore, by monitoring the spatial variation of the trans-scattering phase, we locally measure the phase modulation associated with the probe and the optical paths. We demonstrate numerically that a technique involving two-phase imaging of a sample with two tips of different sizes should allow the partial LDOS (pLDOS) to be imaged. For this imaging method, numerical comparison with extinction probe measurements shows significant qualitative and quantitative improvement.

  16. Picosecond pulse measurements using the active laser medium

    NASA Technical Reports Server (NTRS)

    Bernardin, James P.; Lawandy, N. M.

    1990-01-01

    A simple method for measuring the pulse lengths of synchronously pumped dye lasers which does not require the use of an external nonlinear medium, such as a doubling crystal or two-photon fluorescence cell, to autocorrelate the pulses is discussed. The technique involves feeding the laser pulses back into the dye jet, thus correlating the output pulses with the intracavity pulses to obtain pulse length signatures in the resulting time-averaged laser power. Experimental measurements were performed using a rhodamine 6G dye laser pumped by a mode-locked frequency-doubled Nd:YAG laser. The results agree well with numerical computations, and the method proves effective in determining lengths of picosecond laser pulses.

  17. An Overview of Unsteady Pressure Measurements in the Transonic Dynamics Tunnel

    NASA Technical Reports Server (NTRS)

    Schuster, David M.; Edwards, John W.; Bennett, Robert M.

    2000-01-01

    The NASA Langley Transonic Dynamics Tunnel has served as a unique national facility for aeroelastic testing for over forty years. A significant portion of this testing has been to measure unsteady pressures on models undergoing flutter, forced oscillations, or buffet. These tests have ranged from early launch vehicle buffet to flutter of a generic high-speed transport. This paper will highlight some of the test techniques, model design approaches, and the many unsteady pressure tests conducted in the TDT. The objectives and results of the data acquired during these tests will be summarized for each case and a brief discussion of ongoing research involving unsteady pressure measurements and new TDT capabilities will be presented.

  18. Comparative analysis of quantitative methodologies for Vibrionaceae biofilms.

    PubMed

    Chavez-Dozal, Alba A; Nourabadi, Neda; Erken, Martina; McDougald, Diane; Nishiguchi, Michele K

    2016-11-01

    Multiple symbiotic and free-living Vibrio spp. grow as a form of microbial community known as a biofilm. In the laboratory, methods to quantify Vibrio biofilm mass include crystal violet staining, direct colony-forming unit (CFU) counting, dry biofilm cell mass measurement, and observation of development of wrinkled colonies. Another approach for bacterial biofilms also involves the use of tetrazolium (XTT) assays (used widely in studies of fungi) that are an appropriate measure of metabolic activity and vitality of cells within the biofilm matrix. This study systematically tested five techniques, among which the XTT assay and wrinkled colony measurement provided the most reproducible, accurate, and efficient methods for the quantitative estimation of Vibrionaceae biofilms.

  19. Exploration of a variation of the bottle buoyancy technique for the assessment of body composition.

    PubMed

    Gulick, Dawn T; Geigle, Paula Richley

    2003-05-01

    Hydrostatic weighing has long been recognized as a reliable and valid method for the assessment of body composition. An alternative method known as bottle buoyancy (BB) was introduced by Katch, Hortobagyi, and Denahan in 1989. The purpose of this clinical investigation was to determine the accuracy of the BB technique using an 11-L container. Sixteen individuals (8 men, 8 women) were weighed hydrostatically using a chair/scale and the BB technique. The overall intraclass correlation coefficient for the two techniques was 0.9537. A 2-variable ANOVA was significant for gender but not for technique, and there was no interaction between variables. Thus, the BB technique appears to be an accurate substitute for the chair/scale technique for hydrostatic weighing. The BB method does not involve elaborate equipment and is portable. It could be improved with the use of multiple bottles of various volumes or a calibrated bottle to minimize the number of trials needed for accurate measurements. BB is a valuable, simple clinical tool for assessing body composition based on the principles of hydrostatic weighing and can be performed in any high school, college, or community swimming pool.
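
    Both the chair/scale and bottle-buoyancy approaches ultimately feed the same hydrostatic relation: body density is computed from dry weight, underwater weight, water density, and residual lung volume, and percent body fat follows from a conversion such as the Siri equation. The sketch below shows that standard hydrodensitometry arithmetic with placeholder inputs; it is generic, not the authors' BB protocol.

```python
def body_density(dry_kg: float, underwater_kg: float,
                 water_density: float = 0.9965, residual_volume_l: float = 1.2,
                 gi_gas_l: float = 0.1) -> float:
    """Body density (kg/L) from hydrostatic weighing.

    Db = Wa / ((Wa - Ww)/Dw - RV - GV), with residual lung volume RV and an
    assumed gastrointestinal gas volume GV, both in litres.
    """
    displaced_volume = (dry_kg - underwater_kg) / water_density
    return dry_kg / (displaced_volume - residual_volume_l - gi_gas_l)

def siri_percent_fat(density: float) -> float:
    """Siri (1961) two-compartment conversion from body density to percent body fat."""
    return 495.0 / density - 450.0

# Placeholder subject: 70 kg in air, 3.1 kg underwater, residual volume estimated at 1.3 L.
db = body_density(dry_kg=70.0, underwater_kg=3.1, residual_volume_l=1.3)
print(f"body density = {db:.4f} kg/L, percent fat = {siri_percent_fat(db):.1f} %")
```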

  20. Mental health nurses' emotions, exposure to patient aggression, attitudes to and use of coercive measures: Cross sectional questionnaire survey.

    PubMed

    Jalil, Rahul; Huber, Jorg W; Sixsmith, Judith; Dickens, Geoffrey L

    2017-10-01

    Mental health nurses are exposed to patient aggression and required to manage and de-escalate aggressive incidents; coercive measures such as restraint and seclusion should only be used as a last resort. An improved understanding of the links between nurses' exposure to aggression, their attitudes to and actual involvement in coercive measures, and their emotions (anger, guilt, fear, fatigue, sadness) could inform preparation and education for prevention and management of violence. The aim was to identify relationships between mental health nurses' exposure to patient aggression, their emotions, their attitudes towards coercive containment measures, and their involvement in incidents involving seclusion and restraint. A cross-sectional, correlational, observational study was conducted on low and medium secure wards for men and women with mental disorder in three secure mental health hospitals in England. Sixty-eight mental health nurses who were designated keyworkers for patients enrolled into a related study participated. Participants completed a questionnaire battery comprising measures of their exposure to various types of aggression, their attitudes towards seclusion and restraint, and their emotions. Information about their involvement in restraint and/or restraint-plus-seclusion incidents was gathered for the three-month periods before and after their participation. Linear and logistic regression analyses were performed to test the study hypotheses. Nurses who reported greater exposure to a related set of aggressive behaviours, mostly verbal in nature, which seemed personally derogatory, targeted, or humiliating, also reported higher levels of anger-related provocation. Exposure to mild and severe physical aggression was unrelated to nurses' emotions. Nurses' reported anger was significantly positively correlated with their endorsement of restraint as a management technique, but not with their actual involvement in restraint episodes. Significant differences in scores related to anger and fatigue, and to fatigue and guilt, were detected between those involved and not involved in physical restraint and in physical restraint plus seclusion, respectively. In regression analyses, models comprising significant variables, but not the variables themselves, predicted involvement or non-involvement in coercive measures. Verbal aggression which appears targeted, demeaning or humiliating is associated with higher experienced anger provocation. Nurses may benefit from interventions which aim to improve their skills and coping strategies for dealing with this specific aggressive behaviour. Nurse-reported anger predicted approval of coercive violence management interventions; this may have implications for staff deployment and support. However, anger did not predict actual involvement in such incidents. Possible explanations are that nurses experiencing anger are sufficiently self-aware to avoid involvement or that teams are successful in supporting colleagues who they perceive to be 'at risk'. Future research priorities are considered. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Effect of atmosphere on the surface tension and viscosity of molten LiNbO3 measured using the surface laser-light scattering method

    NASA Astrophysics Data System (ADS)

    Nagasaka, Yuji; Kobayashi, Yusuke

    2007-09-01

    The surface tension and the viscosity of molten LiNbO3 (LN) having the congruent composition have been measured simultaneously in a temperature range from 1537 to 1756 K under argon gas and dry-air atmospheres. The present measurement technique involves surface laser-light scattering (SLLS) that detects nanometer-order-amplitude surface waves usually regarded as ripplons excited by thermal fluctuations. This technique's non-invasive nature allows it to avoid the experimental difficulties of conventional techniques resulting from the insertion of an actuator in the melt. The results of surface tension measurement obtained under a dry-air atmosphere are about 5% smaller than those obtained under an argon atmosphere near the melting temperature, and the temperature dependence of the surface tension under a dry-air atmosphere is twice that under an argon atmosphere. The uncertainty of surface tension measurement is estimated to be ±2.6% under argon and ±1.9% under dry air. The temperature dependence of viscosity can be well correlated with the results of Arrhenius-type equations without any anomalous behavior near the melting point. The viscosities obtained under a dry-air atmosphere were slightly smaller than those obtained under an argon atmosphere. The uncertainty of viscosity measurement is estimated to be ±11.1% for argon and ±14.3% for dry air. Moreover, we observed the real-time dynamic behavior of the surface tension and the viscosity of molten LN in response to argon and dry-air atmospheres.
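
    In surface laser-light scattering, the thermally excited ripplon at a selected wavenumber q produces a spectrum whose peak frequency and linewidth are tied, to first order for a deep, low-viscosity melt, to surface tension and viscosity through ω0² ≈ σq³/ρ and Γ ≈ 2νq². The sketch below inverts those first-order relations for placeholder spectral parameters; it ignores the higher-order corrections a full SLLS analysis applies, and the density and fitted spectral values are assumptions, not data from this study.

```python
import numpy as np

def surface_tension_and_viscosity(f0_hz: float, hwhm_hz: float,
                                  q_per_m: float, density: float):
    """First-order ripplon relations for a deep, low-viscosity melt.

    omega0^2 = sigma * q^3 / rho   ->  sigma = rho * omega0^2 / q^3
    Gamma    = 2 * nu * q^2        ->  nu    = Gamma / (2 q^2),  mu = rho * nu
    """
    omega0 = 2 * np.pi * f0_hz          # peak angular frequency, rad/s
    gamma = 2 * np.pi * hwhm_hz         # angular half-width, rad/s
    sigma = density * omega0 ** 2 / q_per_m ** 3       # surface tension, N/m
    nu = gamma / (2 * q_per_m ** 2)                    # kinematic viscosity, m^2/s
    return sigma, density * nu                         # surface tension, dynamic viscosity

# Placeholder values: ripplon wavenumber set by the optics, peak and half-width from a spectral fit.
q = 2.0e5                   # 1/m
rho = 3.7e3                 # kg/m^3, assumed melt density
sigma, mu = surface_tension_and_viscosity(f0_hz=1.0e5, hwhm_hz=2.0e4, q_per_m=q, density=rho)
print(f"surface tension ~ {sigma * 1e3:.0f} mN/m, viscosity ~ {mu * 1e3:.1f} mPa*s")
```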

  2. A new technique for the measurement of surface shear stress vectors using liquid crystal coatings

    NASA Technical Reports Server (NTRS)

    Reda, Daniel C.; Muratore, J. J., Jr.

    1994-01-01

    Research has recently shown that liquid crystal coating (LCC) color-change response to shear depends on both shear stress magnitude and direction. Additional research was thus conducted to extend the LCC method from a flow-visualization tool to a surface shear stress vector measurement technique. A shear-sensitive LCC was applied to a planar test surface and illuminated by white light from the normal direction. A fiber-optic probe was used to capture light scattered by the LCC from a point on the centerline of a turbulent, tangential-jet flow. Both the relative shear stress magnitude and the relative in-plane view angle between the sensor and the centerline shear vector were systematically varied. A spectrophotometer was used to obtain scattered-light spectra which were used to quantify the LCC color (dominant wavelength) as a function of shear stress magnitude and direction. At any fixed shear stress magnitude, the minimum dominant wavelength was measured when the shear vector was aligned with and directed away from the observer; changes in the relative in-plane view angle to either side of this vector/observer aligned position resulted in symmetric Gaussian increases in measured dominant wavelength. Based on these results, a vector measurement methodology, involving multiple oblique-view observations of the test surface, was formulated. Under present test conditions, the measurement resolution of this technique was found to be +/- 1 deg for vector orientations and +/- 5% for vector magnitudes. An approach to extend the present methodology to full-surface applications is proposed.
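
    The vector methodology described above reduces to fitting the symmetric, roughly Gaussian dependence of dominant wavelength on relative in-plane view angle: the angle of the wavelength minimum gives the shear direction, and the depth of the minimum tracks the magnitude. The sketch below fits such a response to synthetic multi-view observations; the functional form and all numbers are illustrative assumptions, not the calibration used in the experiment.

```python
import numpy as np
from scipy.optimize import curve_fit

def lcc_response(angle_deg, center_deg, depth_nm, width_deg, baseline_nm):
    """Assumed Gaussian-dip model: dominant wavelength vs relative in-plane view angle."""
    return baseline_nm - depth_nm * np.exp(-0.5 * ((angle_deg - center_deg) / width_deg) ** 2)

# Synthetic multi-view observations of one surface point (angles in degrees, wavelength in nm).
angles = np.linspace(-90, 90, 13)
true = lcc_response(angles, center_deg=22.0, depth_nm=35.0, width_deg=40.0, baseline_nm=610.0)
measured = true + np.random.default_rng(3).normal(0.0, 1.0, angles.size)

popt, _ = curve_fit(lcc_response, angles, measured, p0=(0.0, 20.0, 30.0, 600.0))
center, depth = popt[0], popt[1]
print(f"shear vector direction ~ {center:.1f} deg; dip depth (magnitude proxy) ~ {depth:.1f} nm")
```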

  3. Deep frequency modulation interferometry.

    PubMed

    Gerberding, Oliver

    2015-06-01

    Laser interferometry with pm/Hz precision and multi-fringe dynamic range at low frequencies is a core technology to measure the motion of various objects (test masses) in space and ground based experiments for gravitational wave detection and geodesy. Even though available interferometer schemes are well understood, their construction remains complex, often involving, for example, the need to build quasi-monolithic optical benches with dozens of components. In recent years techniques have been investigated that aim to reduce this complexity by combining phase modulation techniques with sophisticated digital readout algorithms. This article presents a new scheme that uses strong laser frequency modulations in combination with the deep phase modulation readout algorithm to construct simpler and easily scalable interferometers.

  4. A measurement technique of time-dependent dielectric breakdown in MOS capacitors

    NASA Technical Reports Server (NTRS)

    Li, S. P.

    1974-01-01

    The statistical nature of time-dependent dielectric breakdown characteristics in MOS capacitors was evidenced by testing large numbers of capacitors fabricated on single wafers. A multipoint probe and automatic electronic visual display technique are introduced that will yield statistical results which are necessary for the investigation of temperature, electric field, thermal annealing, and radiation effects in the breakdown characteristics, and an interpretation of the physical mechanisms involved. It is shown that capacitors of area greater than 0.002 sq cm may yield worst-case results, and that a multipoint probe of capacitors of smaller sizes can be used to obtain a profile of nonuniformities in the SiO2 films.

  5. Veterinary Aspects of Bird of Prey Reproduction.

    PubMed

    Bailey, Tom A; Lierz, Michael

    2017-05-01

    Captive breeding has contributed to successful restoration of many species of birds of prey. Avicultural techniques pioneered by raptor breeders include double clutching, direct fostering, cross-fostering, hatch and switch, hacking, imprinting male and female falcons for semen collection, and artificial insemination techniques. However, reproductive failure occurs related to management problems, including hygiene measures, food quality issues, breeding flock structure, or individual health issues of breeding birds. These may result in non-egg laying females, low-quality eggs, or infertile eggs caused by male infertility. Veterinary care of breeding collections is extremely important. This article provides an overview of veterinary involvement in raptor breeding projects. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Streamflow data

    USGS Publications Warehouse

    Holmes, Robert R.; Singh, Vijay P.

    2016-01-01

    The importance of streamflow data to the world’s economy, environmental health, and public safety continues to grow as the population increases. The collection of streamflow data is often an involved and complicated process. The quality of streamflow data hinges on such things as site selection, instrumentation selection, streamgage maintenance and quality assurance, proper discharge measurement techniques, and the development and continued verification of the streamflow rating. This chapter serves only as an overview of the streamflow data collection process as proper treatment of considerations, techniques, and quality assurance cannot be addressed adequately in the space limitations of this chapter. Readers with the need for the detailed information on the streamflow data collection process are referred to the many references noted in this chapter. 

  7. Introduction to the virtual special issue on super-resolution imaging techniques

    NASA Astrophysics Data System (ADS)

    Cao, Liangcai; Liu, Zhengjun

    2017-12-01

    Until quite recently, the resolution of optical imaging instruments, including telescopes, cameras and microscopes, was considered to be limited by the diffraction of light and by image sensors. In the past few years, many exciting super-resolution approaches have emerged that demonstrate intriguing ways to bypass the classical limit in optics and detectors. More and more research groups are engaged in the study of advanced super-resolution schemes, devices, algorithms, systems, and applications [1-6]. Super-resolution techniques involve new methods in science and engineering of optics [7,8], measurements [9,10], chemistry [11,12] and information [13,14]. Promising applications, particularly in biomedical research and semiconductor industry, have been successfully demonstrated.

  8. Statistical Analysis of a Round-Robin Measurement Survey of Two Candidate Materials for a Seebeck Coefficient Standard Reference Material

    PubMed Central

    Lu, Z. Q. J.; Lowhorn, N. D.; Wong-Ng, W.; Zhang, W.; Thomas, E. L.; Otani, M.; Green, M. L.; Tran, T. N.; Caylor, C.; Dilley, N. R.; Downey, A.; Edwards, B.; Elsner, N.; Ghamaty, S.; Hogan, T.; Jie, Q.; Li, Q.; Martin, J.; Nolas, G.; Obara, H.; Sharp, J.; Venkatasubramanian, R.; Willigan, R.; Yang, J.; Tritt, T.

    2009-01-01

    In an effort to develop a Standard Reference Material (SRM™) for Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials—undoped Bi2Te3 and Constantan (55 % Cu and 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses on the interlaboratory measurement results and the statistical methodology for analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations. PMID:27504212

  9. Remote Sensing of Precipitation from Airborne and Spaceborne Radar. Chapter 13

    NASA Technical Reports Server (NTRS)

    Munchak, S. Joseph

    2017-01-01

    Weather radar measurements from airborne or satellite platforms can be an effective remote sensing tool for examining the three-dimensional structures of clouds and precipitation. This chapter describes some fundamental properties of radar measurements and their dependence on the particle size distribution (PSD) and radar frequency. The inverse problem of solving for the vertical profile of PSD from a profile of measured reflectivity is stated as an optimal estimation problem for single- and multi-frequency measurements. Phenomena that can change the measured reflectivity Z(sub m) from its intrinsic value Z(sub e), namely attenuation, non-uniform beam filling, and multiple scattering, are described and mitigation of these effects in the context of the optimal estimation framework is discussed. Finally, some techniques involving the use of passive microwave measurements to further constrain the retrieval of the PSD are presented.

  10. Electrical resistance tomography from measurements inside a steel cased borehole

    DOEpatents

    Daily, William D.; Schenkel, Clifford; Ramirez, Abelardo L.

    2000-01-01

    Electrical resistance tomography (ERT) produced from measurements taken inside a steel cased borehole. A tomographic inversion of electrical resistance measurements made within a steel casing was then made for the purpose of imaging the electrical resistivity distribution in the formation remotely from the borehole. The ERT method involves combining electrical resistance measurements made inside a steel casing of a borehole to determine the electrical resistivity in the formation adjacent to the borehole; and the inversion of electrical resistance measurements made from a borehole not cased with an electrically conducting casing to determine the electrical resistivity distribution remotely from a borehole. It has been demonstrated that by using these combined techniques, highly accurate current injection and voltage measurements, made at appropriate points within the casing, can be tomographically inverted to yield useful information outside the borehole casing.

  11. Studies of nonlinear femtosecond pulse propagation in bulk materials

    NASA Astrophysics Data System (ADS)

    Eaton, Hilary Kaye

    2000-10-01

    Femtosecond pulse lasers are finding widespread application in a variety of fields including medical research, optical switching and communications, plasma formation, high harmonic generation, and wavepacket formation and control. As the number of applications for femtosecond pulses increases, so does the need to fully understand the linear and nonlinear processes involved in propagating these pulses through materials under various conditions. Recent advances in pulse measurement techniques, such as frequency-resolved optical gating (FROG), allow measurement of the full electric field of the pulse and have made detailed investigations of short-pulse propagation effects feasible. In this thesis, I present detailed experimental studies of my work involving nonlinear propagation of femtosecond pulses in bulk media. Studies of plane-wave propagation in fused silica extend the SHG form of FROG from a simple pulse diagnostic to a useful method of interrogating the nonlinear response of a material. Studies of nonlinear propagation are also performed in a regime where temporal pulse splitting occurs. Experimental results are compared with a three-dimensional nonlinear Schrödinger equation. This comparison fuels the development of a more complete model for pulse splitting. Experiments are also performed at peak input powers above those at which pulse splitting is observed. At these higher intensities, a broadband continuum is generated. This work presents a detailed study of continuum behavior and power loss as well as the first near-field spatial-spectral measurements of the generated continuum light. Nonlinear plane-wave propagation of short pulses in liquids is also investigated, and a non-instantaneous nonlinearity with a surprisingly short response time of 10 fs is observed in methanol. Experiments in water confirm that this effect in methanol is indeed real. Possible explanations for the observed effect are discussed and several are experimentally rejected. This thesis applies FROG as a powerful tool for science and not just a useful pulse diagnostic technique. Studies of three-dimensional propagation provide an in-depth understanding of the processes involved in femtosecond pulse splitting. In addition, the experimental investigations of continuum generation and pulse propagation in liquids provide new insights into the possible processes involved and should provide a useful comparison for developing theories.
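
    For reference, a commonly used form of the three-dimensional nonlinear Schrödinger equation for the pulse envelope A(x, y, tau, z) in a Kerr medium is shown below; the thesis' actual model may include additional terms (for example, higher-order dispersion or nonlinear losses), so this is only the canonical starting point:

        i\frac{\partial A}{\partial z} + \frac{1}{2k_0}\nabla_{\perp}^{2} A - \frac{k''}{2}\frac{\partial^{2} A}{\partial \tau^{2}} + k_0 n_2 |A|^{2} A = 0

    where k_0 is the central wavenumber, k'' the group-velocity dispersion, n_2 the nonlinear refractive index, and tau the retarded time.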

  12. Technique Feature Analysis or Involvement Load Hypothesis: Estimating Their Predictive Power in Vocabulary Learning.

    PubMed

    Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan

    2018-02-05

    Vocabulary learning has always been a great concern and has attracted the attention of many researchers. Among the vocabulary learning hypotheses, involvement load hypothesis and technique feature analysis have been proposed which attempt to bring some concepts like noticing, motivation, and generation into focus. In the current study, 90 high proficiency EFL students were assigned into three vocabulary tasks of sentence making, composition, and reading comprehension in order to examine the power of involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. It was unraveled that involvement load hypothesis cannot be a good predictor, and technique feature analysis was a good predictor in pretest to posttest score change and not in during-task activity. The implications of the results will be discussed in the light of preparing vocabulary tasks.

  13. Development and applications of laser-induced incandescence

    NASA Technical Reports Server (NTRS)

    Vanderwal, Randy L.; Dietrich, Daniel L.; Zhou, Zhiquang; Choi, Mun Y.

    1995-01-01

    Several NASA-funded investigations focus on soot processes and radiative influences of soot in diffusion flames given their simplicity, practical significance, and potential for theoretical modeling. Among the physical parameters characterizing soot, soot volume fraction, f(sub v), a function of particle size and number density, is often of chief practical interest in these investigations, as this is the geometrical property that directly impacts radiative characteristics and the temperature field of the flame and is basic to understanding soot growth and oxidation processes. Diffusion flames, however, present a number of challenges to the determination of f(sub v) via traditional extinction measurements. Laser-induced incandescence (LII) possesses several advantages compared to line-of-sight extinction techniques for determination of f(sub v). Since LII is not a line-of-sight technique, similar to fluorescence, it possesses geometric versatility allowing spatially resolved measurements of f(sub v) in real time in nonaxisymmetric systems without using deconvolution techniques. The spatial resolution of LII is determined by the detector and imaging magnification used. Neither absorption by polycyclic aromatic hydrocarbons (PAH's) nor scattering contributes to the signal. Temporal capabilities are limited only by the laser pulse and camera gate duration, with measurements having been demonstrated with 10 ns resolution. Because of these advantages, LII should be applicable to a variety of combustion processes involving both homogeneous and heterogeneous phases. Our work has focussed on characterization of the technique as well as exploration of its capabilities and is briefly described.

  14. Maintaining ear aesthetics in helical rim reconstruction.

    PubMed

    Taylor, James M; Rajan, Ruchika; Dickson, John K; Mahajan, Ajay L

    2014-03-01

    Wedge resections of the helical rim may result in a significant deformity of the ear, leaving the ear not only smaller but also cupped and prominent. Our technique involves resection of the wedge in the scaphal area without extending into the concha, followed by advancement of the helical rim into the defect. This technique is most suitable for peripheral defects of the helical rim in the middle third. Our modified surgical technique was applied to reconstruction of the pinna after resection of the tumor in 12 patients. Free cartilaginous helical rim, length of helical rim to be resected, and projection of the ear from the mastoid were measured. These values were then compared with measurements after the operation, and patient satisfaction was assessed with a visual analog scale. The free cartilaginous rim was 91.67 ± 5.61 mm. Of this, 21.92 ± 3.78 mm was resected, which amounted to 23.84% ± 3.35% of the rim. Although this resulted in a mean increase in ear projection of 6.42 ± 1.68 mm, the aesthetic outcome was good (visual analog scale, 9.08 ± 0.9). This technique reduces cupping, does not make the ear as prominent as it can become after a conventional wedge resection, and results in high patient satisfaction.

  15. Scintillation-based Search for Off-pulse Radio Emission from Pulsars

    NASA Astrophysics Data System (ADS)

    Ravi, Kumar; Deshpande, Avinash A.

    2018-05-01

    We propose a new method to detect off-pulse (unpulsed and/or continuous) emission from pulsars using the intensity modulations associated with interstellar scintillation. Our technique involves obtaining the dynamic spectra, separately for on-pulse window and off-pulse region, with time and frequency resolutions to properly sample the intensity variations due to diffractive scintillation and then estimating their mutual correlation as a measure of off-pulse emission, if any. We describe and illustrate the essential details of this technique with the help of simulations, as well as real data. We also discuss the advantages of this method over earlier approaches to detect off-pulse emission. In particular, we point out how certain nonidealities inherent to measurement setups could potentially affect estimations in earlier approaches and argue that the present technique is immune to such nonidealities. We verify both of the above situations with relevant simulations. We apply this method to the observation of PSR B0329+54 at frequencies of 730 and 810 MHz made with the Green Bank Telescope and present upper limits for the off-pulse intensity at the two frequencies. We expect this technique to pave the way for extensive investigations of off-pulse emission with the help of existing dynamic spectral data on pulsars and, of course, with more sensitive long-duration data from new observations.
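
    A minimal sketch of the core correlation step is given below, assuming Python with NumPy; the simulated dynamic spectra, array sizes, and noise levels are hypothetical stand-ins for properly calibrated on-pulse and off-pulse spectra:

        import numpy as np

        def dynamic_spectrum_correlation(on_dyn, off_dyn):
            """Zero-lag correlation between on-pulse and off-pulse dynamic
            spectra (2-D arrays, time x frequency), sampled finely enough to
            resolve the diffractive scintles. A significant positive value
            suggests off-pulse emission sharing the on-pulse scintillation."""
            on = (on_dyn - on_dyn.mean()) / on_dyn.std()
            off = (off_dyn - off_dyn.mean()) / off_dyn.std()
            return float(np.mean(on * off))

        # Hypothetical data: a common scintillation gain pattern plus noise.
        rng = np.random.default_rng(1)
        scint = rng.gamma(2.0, size=(64, 256))
        on_dyn = scint + 0.1 * rng.normal(size=scint.shape)
        off_dyn = 0.02 * scint + 0.1 * rng.normal(size=scint.shape)
        print(dynamic_spectrum_correlation(on_dyn, off_dyn))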

  16. Modal content of noise generated by a coaxial jet in a pipe

    NASA Technical Reports Server (NTRS)

    Kerschen, E. J.; Johnston, J. P.

    1978-01-01

    Noise generated by air flow through a coaxial obstruction in a long, straight pipe was investigated with concentration on the modal characteristics of the noise field inside the pipe and downstream of the restriction. Two measurement techniques were developed for separation of the noise into the acoustic duct modes. The instantaneous mode separation technique uses four microphones, equally spaced in the circumferential direction, at the same axial location. The time-averaged mode separation technique uses three microphones mounted at the same axial location. A matrix operation on time-averaged data produces the modal pressure levels. This technique requires the restrictive assumption that the acoustic modes are uncorrelated with each other. The measured modal pressure spectra were converted to modal power spectra and integrated over the frequency range 200-6000 Hz. The acoustic efficiency levels (acoustic power normalized by jet kinetic energy flow), when plotted vs. jet Mach number, showed a strong dependence on the ratio of restriction diameter to pipe diameter. The acoustic energy flow analyses based on the thermodynamic energy equation and on the results of Mohring both resulted in orthogonality properties for the eigenfunctions of the radial mode shape equation. These orthogonality relationships involve the eigenvalues and derivatives of the radial mode shape functions.
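
    As a hedged illustration of the instantaneous approach with equally spaced wall microphones, circumferential mode amplitudes can be obtained from a spatial DFT of the simultaneous pressure samples; the sketch below is generic and not necessarily the paper's exact formulation:

        import numpy as np

        def circumferential_modes(p_mics):
            """Decompose simultaneous wall-pressure samples from M equally
            spaced circumferential microphones into circumferential mode
            amplitudes via a spatial DFT. With four microphones, modes
            m = 0, +/-1, and 2 are separated (higher modes alias)."""
            p_mics = np.asarray(p_mics, dtype=float)
            M = p_mics.shape[-1]
            return np.fft.fft(p_mics, axis=-1) / M

        # Hypothetical snapshot from microphones at 0, 90, 180, 270 degrees:
        theta = np.deg2rad([0.0, 90.0, 180.0, 270.0])
        p = 1.0 + 0.5 * np.cos(theta + 0.3)        # plane wave plus m = 1 content
        print(np.abs(circumferential_modes(p)))    # |A_0|, |A_1|, |A_2|, |A_-1|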

  17. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    PubMed Central

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique using a modified measure and blending of cuckoo search and particle swarm optimization (CS-PSO) for low contrast images to enhance image adaptively. In this way, contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three factors which are threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary computing based image enhancement methods like backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper. PMID:25784928

  18. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    PubMed

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique using a modified measure and blending of cuckoo search and particle swarm optimization (CS-PSO) for low contrast images to enhance image adaptively. In this way, contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three factors which are threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary computing based image enhancement methods like backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.
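
    A minimal sketch of the underlying intensity transformation is shown below, assuming Python with SciPy; in the method described above the shape parameters (a, b) would be selected by the CS-PSO search to maximize the quality criterion, whereas here they are fixed by hand and the test image is hypothetical:

        import numpy as np
        from scipy.special import betainc

        def beta_transform(image, a, b):
            """Global contrast transformation through the regularized
            incomplete Beta function: normalize intensities to [0, 1],
            map them through betainc(a, b, x), and rescale to 8 bits."""
            x = (image - image.min()) / max(float(np.ptp(image)), 1e-12)
            return (255.0 * betainc(a, b, x)).astype(np.uint8)

        # Hypothetical low-contrast image:
        rng = np.random.default_rng(2)
        img = 100.0 + 30.0 * rng.random((64, 64))
        enhanced = beta_transform(img, a=2.0, b=3.0)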

  19. Immunocytochemical localization of a histone H2A variant in the mammalian nucleolar chromatin.

    PubMed

    Bhatnagar, Y M; McCullar, M K; Chronister, R B

    1984-11-01

    The distribution of protein "A", a minor variant of H2A present in the mouse testis, was studied in liver and brain nuclei using the peroxidase-antiperoxidase technique. The data presented here suggest that nucleolar-associated chromatin is highly enriched in protein "A". Microspectrophotometric measurements corroborate the immunocytochemical data. The regional differentiation in the eukaryotic chromatin, therefore, may involve qualitative changes in the histone composition.

  20. A Real Attention-Getter

    NASA Technical Reports Server (NTRS)

    2003-01-01

    While most parents would agree that playing videos games is the antithesis of time well spent for their children, recent advances involving NASA biofeedback technology are proving otherwise. The same techniques used to measure brain activity in NASA pilots during flight simulation exercises are now a part of a revolutionary video game system that is helping to improve overall mental awareness for Americans of all ages, including those who suffer from Attention Deficit Hyperactivity Disorder (ADHD).

  1. Pulsed Photothermal Radiometry for Noncontact Spectroscopy, Material Testing and Inspection Measurement.

    DTIC Science & Technology

    1984-08-08

    transmission PTR signal changes whenever the transmitted thermal wave crosses a void. This provides a means of nondestructive subsurface imaging of defects...and Busse and Renk(22) have demonstrated a new stereoscopic subsurface imaging technique involving two adjacent modulated PT sources for...modulation frequencies. In all cases of subsurface imaging, the authors preferred to use the shape or the phase of the PTR signal rather than the amplitude

  2. Work performed on velocity profiles in a hot jet by simplified RELIEF

    NASA Technical Reports Server (NTRS)

    Miles, Richard B.; Lempert, Walter R.

    1991-01-01

    The Raman Excitation + Laser Induced Electronic Fluorescence (RELIEF) velocity measurement method is based on vibrationally tagging oxygen molecules and observing their displacement after a short period of time. Two papers that discuss the use and implementation of the RELIEF technique are presented in this final report. Additionally, the end of the report contains a listing of the personnel involved and the reference documents used in the production of this final report.

  3. X-ray astronomical spectroscopy

    NASA Technical Reports Server (NTRS)

    Holt, Stephen S.

    1987-01-01

    The contributions of the Goddard group to the history of X-ray astronomy are numerous and varied. One role that the group has continued to play involves the pursuit of techniques for the measurement and interpretation of the X-ray spectra of cosmic sources. The latest development is the selection of the X-ray microcalorimeter for the Advanced X-ray Astrophysics Facility (AXAF) study payload. This technology is likely to revolutionize the study of cosmic X-ray spectra.

  4. Synchrotron-based dynamic computed tomography of tissue motion for regional lung function measurement

    PubMed Central

    Dubsky, Stephen; Hooper, Stuart B.; Siu, Karen K. W.; Fouras, Andreas

    2012-01-01

    During breathing, lung inflation is a dynamic process involving a balance of mechanical factors, including trans-pulmonary pressure gradients, tissue compliance and airway resistance. Current techniques lack the capacity for dynamic measurement of ventilation in vivo at sufficient spatial and temporal resolution to allow the spatio-temporal patterns of ventilation to be precisely defined. As a result, little is known of the regional dynamics of lung inflation, in either health or disease. Using fast synchrotron-based imaging (up to 60 frames s−1), we have combined dynamic computed tomography (CT) with cross-correlation velocimetry to measure regional time constants and expansion within the mammalian lung in vivo. Additionally, our new technique provides estimation of the airflow distribution throughout the bronchial tree during the ventilation cycle. Measurements of lung expansion and airflow in mice and rabbit pups are shown to agree with independent measures. The ability to measure lung function at a regional level will provide invaluable information for studies into normal and pathological lung dynamics, and may provide new pathways for diagnosis of regional lung diseases. Although proof-of-concept data were acquired on a synchrotron, the methodology developed potentially lends itself to clinical CT scanning and therefore offers translational research opportunities. PMID:22491972

  5. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion.

    PubMed

    Guo, Zhi-Jun; Lin, Qiang; Liu, Hai-Tao; Lu, Jun-Ying; Zeng, Yan-Hong; Meng, Fan-Jie; Cao, Bin; Zi, Xue-Rong; Han, Shu-Ming; Zhang, Yu-Huan

    2013-09-01

    Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, precise measurement of pleural effusion volume still involves many challenges, and no accurate measurement method is generally recognized. The aim was to explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume and then to analyze the correlation between the volume of the free pleural effusion and the different diameters of the effusion. The 64-slice CT volume-rendering technique was used to measure and analyze three data sets. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the drained fluid. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameters of the effusion, which was then used to calculate the regression equations. When the 64-slice CT volume-rendering technique was used to measure the fluid volume of the self-made thoracic model, the results did not differ significantly from the actual injection volume (P = 0.836). For the 25 patients with drained pleural effusions, the reduction in measured volume did not differ significantly from the actual volume of the drained fluid (P = 0.989). The following linear regression equation related the pleural effusion volume (V), measured by the CT volume-rendering technique, to the greatest depth of the effusion (d): V = 158.16 × d - 116.01 (r = 0.91, P = 0.000). A second linear regression related the volume to the product of the pleural effusion diameters (l × h × d): V = 0.56 × (l × h × d) + 39.44 (r = 0.92, P = 0.000). The 64-slice CT volume-rendering technique can accurately measure the volume in patients with pleural effusion, and the linear regression equations can be used to estimate the volume of a free pleural effusion.
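
    The reported regressions can be applied directly, as in the sketch below (Python); the function names are arbitrary and the input values are hypothetical, with lengths expressed in the same units used in the study (the abstract does not state them explicitly):

        def effusion_volume_from_depth(d):
            """Estimate free pleural effusion volume from the greatest
            effusion depth d via V = 158.16 * d - 116.01 (r = 0.91)."""
            return 158.16 * d - 116.01

        def effusion_volume_from_diameters(l, h, d):
            """Estimate volume from the product of the three effusion
            diameters via V = 0.56 * (l * h * d) + 39.44 (r = 0.92)."""
            return 0.56 * (l * h * d) + 39.44

        # Hypothetical case, comparing the two estimates:
        print(effusion_volume_from_depth(5.0))
        print(effusion_volume_from_diameters(12.0, 8.0, 5.0))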

  6. ANALYSIS OF AIRCRAFT MOTIONS

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1994-01-01

    This program was developed by Ames Research Center, in cooperation with the National Transportation Safety Board, as a technique for deriving time histories of an aircraft's motion from Air Traffic Control (ATC) radar records. This technique uses the radar range and azimuth data, along with the downlinked altitude data, to derive an expanded set of data which includes airspeed, lift, attitude angles (pitch, roll, and heading), etc. This technique should prove useful as a source of data in the investigation of commercial airline accidents and in the analysis of accidents involving aircraft which do not have onboard data recorders (e.g., military, short-haul, and general aviation). The technique used to determine the aircraft motions involves smoothing of raw radar data. These smoothed results, in combination with other available information (wind profiles and aircraft performance data), are used to derive the expanded set of data. This program uses a cubic least-square fit to smooth the raw data. This moving-arc procedure provides a smoothed time history of the aircraft position, the inertial velocities, and accelerations. Using known winds, these inertial data are transformed to aircraft stability axes to provide true airspeed, thrust-drag, lift, and roll angle. Further derivation, based on aircraft dependent performance data, can determine the aircraft angle of attack, pitch, and heading angle. Results of experimental tests indicate that values derived from ATC radar records using this technique agree favorably with airborne measurements. This program is written in FORTRAN IV to be executed in the batch mode, and has been implemented on a CDC 6000 series computer with a central memory requirement of 64k (octal) of 60 bit words.
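
    A minimal sketch of the moving-arc idea, written in Python rather than the original FORTRAN IV, is given below; the window length, sample interval, and synthetic track are illustrative choices only:

        import numpy as np

        def moving_arc_smooth(t, x, half_window=5):
            """Fit a cubic least-squares polynomial to each local window of
            raw position data and evaluate smoothed position, velocity, and
            acceleration at the window center (moving-arc procedure)."""
            pos, vel, acc = np.empty_like(x), np.empty_like(x), np.empty_like(x)
            n = len(t)
            for i in range(n):
                lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
                c = np.polyfit(t[lo:hi] - t[i], x[lo:hi], deg=3)
                pos[i] = np.polyval(c, 0.0)
                vel[i] = np.polyval(np.polyder(c, 1), 0.0)
                acc[i] = np.polyval(np.polyder(c, 2), 0.0)
            return pos, vel, acc

        # Hypothetical noisy 1-D radar track sampled every 0.5 s:
        t = np.linspace(0.0, 60.0, 121)
        rng = np.random.default_rng(3)
        x = 200.0 * t + 0.75 * t**2 + rng.normal(0.0, 30.0, t.size)
        pos, vel, acc = moving_arc_smooth(t, x)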

  7. Measurement of tissue optical properties with optical coherence tomography: Implication for noninvasive blood glucose concentration monitoring

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.

    Approximately 14 million people in the USA and more than 140 million people worldwide suffer from diabetes mellitus. The current glucose sensing technique involves a finger puncture several times a day to obtain a droplet of blood for analysis. There have been enormous efforts by many scientific groups and companies to quantify glucose concentration noninvasively using different optical techniques. However, these techniques face limitations associated with low sensitivity, accuracy, and insufficient specificity of glucose concentrations over a physiological range. Optical coherence tomography (OCT), a new technology, is being applied for noninvasive imaging in tissues with high resolution. OCT utilizes sensitive detection of photons coherently scattered from tissue. The high resolution of this technique allows for exceptionally accurate measurement of tissue scattering from a specific layer of skin compared with other optical techniques and, therefore, may provide noninvasive and continuous monitoring of blood glucose concentration with high accuracy. In this dissertation work I experimentally and theoretically investigate feasibility of noninvasive, real-time, sensitive, and specific monitoring of blood glucose concentration using an OCT-based biosensor. The studies were performed in scattering media with stable optical properties (aqueous suspensions of polystyrene microspheres and milk), animals (New Zealand white rabbits and Yucatan micropigs), and normal subjects (during oral glucose tolerance tests). The results of these studies demonstrated: (1) capability of the OCT technique to detect changes in scattering coefficient with the accuracy of about 1.5%; (2) a sharp and linear decrease of the OCT signal slope in the dermis with the increase of blood glucose concentration; (3) the change in the OCT signal slope measured during bolus glucose injection experiments (characterized by a sharp increase of blood glucose concentration) is higher than that measured in the glucose clamping experiments (characterized by slow, controlled increase of the blood glucose concentration); and (4) the accuracy of glucose concentration monitoring may substantially be improved if optimal dimensions of the probed skin area are used. The results suggest that high-resolution OCT technique has a potential for noninvasive, accurate, and continuous glucose monitoring with high sensitivity.

  8. The determination of solubility and diffusion coefficient for solids in liquids by an inverse measurement technique using cylinders of amorphous glucose as a model compound

    NASA Astrophysics Data System (ADS)

    Hu, Chengyao; Huang, Pei

    2011-05-01

    The importance of sugar and sugar-containing materials is well recognized nowadays, owing to their application in industrial processes, particularly in the food, pharmaceutical and cosmetic industries. Because of the large numbers of those compounds involved and the relatively small number of solubility and/or diffusion coefficient data for each compound available, it is highly desirable to measure the solubility and/or diffusion coefficient as efficiently as possible and to be able to improve the accuracy of the methods used. In this work, a new technique was developed for the measurement of the diffusion coefficient of a stationary solid solute in a stagnant solvent which simultaneously measures solubility based on an inverse measurement problem algorithm with the real-time dissolved amount profile as a function of time. This study differs from established techniques in both the experimental method and the data analysis. The experimental method was developed in which the dissolved amount of solid solute in quiescent solvent was investigated using a continuous weighing technique. In the data analysis, the hybrid genetic algorithm is used to minimize an objective function containing a calculated and a measured dissolved amount with time. This is measured on a cylindrical sample of amorphous glucose in methanol or ethanol. The calculated dissolved amount, that is a function of the unknown physical properties of the solid solute in the solvent, is calculated by the solution of the two-dimensional nonlinear inverse natural convection problem. The estimated values of the solubility of amorphous glucose in methanol and ethanol at 293 K were respectively 32.1 g/100 g methanol and 1.48 g/100 g ethanol, in agreement with the literature values, and support the validity of the simultaneously measured diffusion coefficient. These results show the efficiency and the stability of the developed technique to simultaneously estimate the solubility and diffusion coefficient. Also the influence of the solution density change and the initial concentration conditions on the dissolved amount was investigated by the numerical results using the estimated parameters. It is found that the theoretical assumption to simplify the inverse measurement problem algorithm is reasonable for low solubility.
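
    In outline, the inverse step amounts to searching the two unknowns for the values that best reproduce the weighed dissolution record; with D the diffusion coefficient and c_sat the solubility (generic symbols, not the paper's notation), the objective minimized by the hybrid genetic algorithm can be written as

        S(D, c_{\mathrm{sat}}) = \sum_{i=1}^{N} \left[ m^{\mathrm{meas}}(t_i) - m^{\mathrm{calc}}(t_i; D, c_{\mathrm{sat}}) \right]^{2}

    where m^meas(t_i) is the measured dissolved amount at time t_i and m^calc is the amount predicted by solving the two-dimensional nonlinear natural-convection dissolution problem for a trial pair (D, c_sat).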

  9. In-flight acoustic testing techniques using the YO-3A Acoustic Research Aircraft

    NASA Technical Reports Server (NTRS)

    Cross, J. L.; Watts, M. E.

    1984-01-01

    This report discusses the flight testing techniques and equipment employed during air-to-air acoustic testing of helicopters at Ames Research Center. The in-flight measurement technique used enables acoustic data to be obtained without the limitations of anechoic chambers or the multitude of variables encountered in ground based flyover testing. The air-to-air testing is made possible by the NASA YO-3A Acoustic Research Aircraft. This 'Quiet Aircraft' is an acoustically instrumented version of a quiet observation aircraft manufactured for the military. To date, tests with the following aircraft have been conducted: YO-3A background noise; Hughes 500D; Hughes AH-64; Bell AH-1S; Bell AH-1G. Several system upgrades are being designed and implemented to improve the quality of data. This report will discuss not only the equipment involved and aircraft tested, but also the techniques used in these tests. In particular, formation flying, position locations, and the test matrices will be discussed. Examples of data taken will also be presented.

  10. In-flight acoustic testing techniques using the YO-3A acoustic research aircraft

    NASA Technical Reports Server (NTRS)

    Cross, J. L.; Watts, M. E.

    1983-01-01

    This report discusses the flight testing techniques and equipment employed during air-to-air acoustic testing of helicopters at Ames Research Center. The in-flight measurement technique used enables acoustic data to be obtained without the limitations of anechoic chambers or the multitude of variables encountered in ground based flyover testing. The air-to-air testing is made possible by the NASA YO-3A Acoustic Research Aircraft. This 'Quiet Aircraft' is an acoustically instrumented version of a quiet observation aircraft manufactured for the military. To date, tests with the following aircraft have been conducted: YO-3A background noise; Hughes 500D; Hughes AH-64; Bell AH-1S; Bell AH-1G. Several system upgrades are being designed and implemented to improve the quality of data. This report will discuss not only the equipment involved and aircraft tested, but also the techniques used in these tests. In particular, formation flying, position locations, and the test matrices will be discussed. Examples of data taken will also be presented.

  11. Time-Domain Microfluidic Fluorescence Lifetime Flow Cytometry for High-Throughput Förster Resonance Energy Transfer Screening

    PubMed Central

    Nedbal, Jakub; Visitkul, Viput; Ortiz-Zapater, Elena; Weitsman, Gregory; Chana, Prabhjoat; Matthews, Daniel R; Ng, Tony; Ameer-Beg, Simon M

    2015-01-01

    Sensing ion or ligand concentrations, physico-chemical conditions, and molecular dimerization or conformation change is possible by assays involving fluorescent lifetime imaging. The inherent low throughput of imaging impedes rigorous statistical data analysis on large cell numbers. We address this limitation by developing a fluorescence lifetime-measuring flow cytometer for fast fluorescence lifetime quantification in living or fixed cell populations. The instrument combines a time-correlated single photon counting epifluorescent microscope with microfluidics cell-handling system. The associated computer software performs burst integrated fluorescence lifetime analysis to assign fluorescence lifetime, intensity, and burst duration to each passing cell. The maximum safe throughput of the instrument reaches 3,000 particles per minute. Living cells expressing spectroscopic rulers of varying peptide lengths were distinguishable by Förster resonant energy transfer measured by donor fluorescence lifetime. An epidermal growth factor (EGF)-stimulation assay demonstrated the technique's capacity to selectively quantify EGF receptor phosphorylation in cells, which was impossible by measuring sensitized emission on a standard flow cytometer. Dual-color fluorescence lifetime detection and cell-specific chemical environment sensing were exemplified using di-4-ANEPPDHQ, a lipophilic environmentally sensitive dye that exhibits changes in its fluorescence lifetime as a function of membrane lipid order. To our knowledge, this instrument opens new applications in flow cytometry which were unavailable due to technological limitations of previously reported fluorescent lifetime flow cytometers. The presented technique is sensitive to lifetimes of most popular fluorophores in the 0.5–5 ns range including fluorescent proteins and is capable of detecting multi-exponential fluorescence lifetime decays. This instrument vastly enhances the throughput of experiments involving fluorescence lifetime measurements, thereby providing statistically significant quantitative data for analysis of large cell populations. © 2014 International Society for Advancement of Cytometry PMID:25523156

  12. Methods of blood flow measurement in the arterial circulatory system.

    PubMed

    Tabrizchi, R; Pugsley, M K

    2000-01-01

    The most commonly employed techniques for the in vivo measurement of arterial blood flow to individual organs involve the use of flow probes or sensors. Commercially available systems for the measurement of in vivo blood flow can be divided into two categories: ultrasonic and electromagnetic. Two types of ultrasonic probes are used. The first type of flow probe measures blood flow-mediated Doppler shifts (Doppler flowmetry) in a vessel. The second type measures the "transit time" required by an emitted ultrasound wave to traverse the vessel; these are transit-time volume flow sensors. Measurement of blood flow in any vessel requires that the flow probe or sensor be highly accurate and exhibit signal linearity over the flow range in the vessel of interest. Additional desirable features include compact design, size, and weight. Another important requirement for flow probes is good biocompatibility; it is imperative for the sensor to behave in an inert manner towards the biological system. A sensitive and reliable method to assess blood flow in individual organs, other than by the use of probes/sensors, is the reference sample method that utilizes hematogenously delivered microspheres. This method has been utilized to a large extent to assess regional blood flow in the entire body. The purpose of measuring blood flow is to determine the amount of blood delivered to a given region per unit time (milliliters per minute), and it is desirable to achieve this goal by noninvasive methodologies. This, however, is not always possible. This review offers an overview of some of the techniques available for the assessment of regional blood flow in the arterial circulatory system and discusses the advantages and disadvantages of these common techniques.

  13. Factors predicting health practitioners' awareness of UNHS program in Malaysian non-public hospitals.

    PubMed

    Ismail, Abdussalaam Iyanda; Abdul Majid, Abdul Halim; Zakaria, Mohd Normani; Abdullah, Nor Azimah Chew; Hamzah, Sulaiman; Mukari, Siti Zamratol-Mai Sarah

    2018-06-01

    The current study examines the effects of human resource (measured as health workers' perception of UNHS), screening equipment, program layout, and screening techniques on healthcare practitioners' awareness (measured as knowledge) of universal newborn hearing screening (UNHS) in Malaysian non-public hospitals. Using a cross-sectional approach, data were collected with a validated questionnaire to assess awareness of the UNHS program among health practitioners and to test the formulated hypotheses. Of the 63 questionnaires distributed to health professionals, 51 (an 81% response rate) were returned and usable for statistical analysis. The survey instruments covering healthcare practitioners' awareness, human resource, program layout, screening instrument, and screening techniques were adapted and scaled on a 7-point Likert scale ranging from 1 (little) to 7 (many). The Partial Least Squares (PLS) algorithm and bootstrapping techniques were employed to test the hypotheses. Based on the beta values, t-values, and p-values (β = 0.478, t = 1.904, p < 0.10; β = 0.809, t = 3.921, p < 0.01; β = -0.436, t = 1.870, p < 0.10), human resource (measured through training), functional equipment, and program layout were significant predictors of enhanced knowledge among health practitioners. Together, program layout, human resource, screening technique, and screening instrument explained 71% of the variance in health practitioners' awareness. Program layout, human resource, and screening instrument had effect sizes (f2) of 0.065, 0.621, and 0.211 respectively, corresponding to small, large, and medium effects on health practitioners' awareness. Screening technique, however, had essentially no effect on health practitioners' awareness, consistent with its non-significant t-statistic. Having started the UNHS program in 2003, non-public hospitals have more experienced and well-trained employees handling the screening tools and instruments, and the program layout is well structured in these hospitals; nevertheless, the issue of homogeneity remains. Non-public hospitals charge for the services they render and, being profit-driven establishments, have little option but to provide value-added and innovative services to ensure quality. Their employees also perform fewer screenings, given the lower number of babies delivered in private hospitals. In addition, the non-significant relationship between screening techniques and healthcare practitioners' awareness of the UNHS program reflects the fact that the techniques practiced in public and non-public hospitals are similar and standardized. Limitations and suggestions are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Upper limb joint kinetics of three sitting pivot wheelchair transfer techniques in individuals with spinal cord injury.

    PubMed

    Kankipati, Padmaja; Boninger, Michael L; Gagnon, Dany; Cooper, Rory A; Koontz, Alicia M

    2015-07-01

    Repeated measures design. This study compared the upper extremity (UE) joint kinetics between three transfer techniques. Research laboratory. Twenty individuals with spinal cord injury performed three transfer techniques from their wheelchair to a level tub bench. Two of the techniques involved a head-hips method with leading hand position close (HH-I) and far (HH-A) from the body, and the third technique with the trunk upright (TU) and hand far from body. Motion analysis equipment recorded upper body movements and force sensors recorded their hand and feet reaction forces during the transfers. Several significant differences were found between HH-A and HH-I and TU and HH-I transfers indicating that hand placement was a key factor influencing the UE joint kinetics. Peak resultant hand, elbow, and shoulder joint forces were significantly higher for the HH-A and TU techniques at the trailing arm (P < 0.036) and lower at the leading arm (P < 0.021), compared to the HH-I technique. Always trailing with the same arm if using HH-A or TU could predispose that arm to overuse related pain and injuries. Technique training should focus on initial hand placement close to the body followed by the amount of trunk flexion needed to facilitate movement.

  15. Effects of interfacial interaction on the properties of poly(vinyl chloride)/styrene-butadiene rubber blends

    NASA Astrophysics Data System (ADS)

    Zhu, Shuihan

    PVC/SBR blends, a new thermoplastic elastomer material, were developed. They have potential applications due to low costs and low-temperature elasticity. A unique compatibilization method was employed to enhance the mechanical properties of the materials: a compatibilizer miscible with one of the blend components reacts chemically with the other component(s). Improvements in tensile and impact behavior were observed as a result of the compatibilization. A novel characterization technique to study the interface of PVC/SBR blends was developed. This technique involves the observation of the unstained sample under electron beam irradiation by a transmission electron microscope (TEM). An enrichment of rubber at the interface between PVC and SBR was detected in the compatibilized PVC/SBR blends. Magnetic relaxation measurements show that the rubber concentration in the proximity of PVC increases with the degree of covulcanization between NBR and SBR. The interface development and the rheological effect during processing were investigated. The interfacial concentration profile and the interfacial thickness were obtained by grayscale measurements on TEM micrographs, evaluation of SIMS images, and measurements of micromechanical properties.

  16. Pre-Adult MRI of Brain Cancer and Neurological Injury: Multivariate Analyses

    PubMed Central

    Levman, Jacob; Takahashi, Emi

    2016-01-01

    Brain cancer and neurological injuries, such as stroke, are life-threatening conditions for which further research is needed to overcome the many challenges associated with providing optimal patient care. Multivariate analysis (MVA) is a class of pattern recognition techniques involving the processing of data that contains multiple measurements per sample. MVA can be used to address a wide variety of neuroimaging challenges, including identifying variables associated with patient outcomes; understanding an injury's etiology, development, and progression; creating diagnostic tests; assisting in treatment monitoring; and more. Compared with adult imaging, imaging of the developing brain has attracted less attention from MVA researchers; however, remarkable growth has occurred in recent years. This paper presents the results of a systematic review of the literature focusing on MVA technologies applied to brain injury and cancer in neurological fetal, neonatal, and pediatric magnetic resonance imaging (MRI). With a wide variety of MRI modalities providing physiologically meaningful biomarkers and new biomarker measurements constantly under development, MVA techniques hold enormous potential for combining available measurements to improve basic research and to create technologies that contribute to improving patient care. PMID:27446888

  17. Insights into the Interactions of Amino Acids and Peptides with Inorganic Materials Using Single-Molecule Force Spectroscopy.

    PubMed

    Das, Priyadip; Duanias-Assaf, Tal; Reches, Meital

    2017-03-06

    The interactions between proteins or peptides and inorganic materials lead to several interesting processes. For example, combining proteins with minerals leads to the formation of composite materials with unique properties. In addition, the undesirable process of biofouling is initiated by the adsorption of biomolecules, mainly proteins, on surfaces. This organic layer is an adhesion layer for bacteria and allows them to interact with the surface. Understanding the fundamental forces that govern the interactions at the organic-inorganic interface is therefore important for many areas of research and could lead to the design of new materials for optical, mechanical and biomedical applications. This paper demonstrates a single-molecule force spectroscopy technique that utilizes an AFM to measure the adhesion force between either peptides or amino acids and well-defined inorganic surfaces. This technique involves a protocol for attaching the biomolecule to the AFM tip through a covalent flexible linker and single-molecule force spectroscopy measurements by atomic force microscope. In addition, an analysis of these measurements is included.

  18. Extractive sampling and optical remote sensing of F100 aircraft engine emissions.

    PubMed

    Cowen, Kenneth; Goodwin, Bradley; Joseph, Darrell; Tefend, Matthew; Satola, Jan; Kagann, Robert; Hashmonay, Ram; Spicer, Chester; Holdren, Michael; Mayfield, Howard

    2009-05-01

    The Strategic Environmental Research and Development Program (SERDP) has initiated several programs to develop and evaluate techniques to characterize emissions from military aircraft to meet increasingly stringent regulatory requirements. This paper describes the results of a recent field study using extractive and optical remote sensing (ORS) techniques to measure emissions from six F-15 fighter aircraft. Testing was performed between November 14 and 16, 2006 on the trim-pad facility at Tyndall Air Force Base in Panama City, FL. Measurements were made on eight different F100 engines, and the engines were tested on-wing of in-use aircraft. A total of 39 test runs were performed at engine power levels that ranged from idle to military power. The approach adopted for these tests involved extractive sampling with collocated ORS measurements at a distance of approximately 20-25 nozzle diameters downstream of the engine exit plane. The emission indices calculated for carbon dioxide, carbon monoxide, nitric oxide, and several volatile organic compounds showed very good agreement when comparing the extractive and ORS sampling methods.

  19. A technique for studying cardiac myosin dynamics using optical tweezers

    NASA Astrophysics Data System (ADS)

    Paolino, Michael; Migirditch, Sam; Nesmelov, Yuri; Hester, Brooke; Appalachian State Biophysics; Optical Sciences Facility Team

    A primary protein involved in human muscle contraction is myosin, which exists in α- and β- isoforms. Myosin exerts forces on actin filaments when ATP is present, driving muscle contraction. A significant decrease in the population of cardiac α-myosin has been linked to heart failure. It is proposed that slow β-myosin in a failing heart could, through introduction of a drug, be made to mimic the action of α-myosin, thereby improving cardiac muscle performance. In working towards testing this hypothesis, the focus of this work is to develop a technique to measure forces exerted by myosin on actin using optical tweezers. An actin-myosin arrangement is constructed between two optically trapped polystyrene microspheres. The displacement of a microsphere is monitored when ATP is introduced, and the force responsible is measured. With this achieved, we can then modify the actin-myosin arrangement, for example with varying amounts of α- and β- myosin and test the effects on forces exerted. In this work, assemblies of actin and myosin molecules and preliminary force measurements are discussed. North Carolina Space Grant.

  20. Integration of fringe projection and two-dimensional digital image correlation for three-dimensional displacements measurements

    NASA Astrophysics Data System (ADS)

    Felipe-Sesé, Luis; López-Alba, Elías; Siegmann, Philip; Díaz, Francisco A.

    2016-12-01

    A low-cost approach for three-dimensional (3-D) full-field displacement measurement is applied for the analysis of large displacements involved in two different mechanical events. The method is based on a combination of fringe projection and two-dimensional digital image correlation (DIC) techniques. The two techniques have been employed simultaneously using an RGB camera and a color encoding method; therefore, it is possible to measure in-plane and out-of-plane displacements at the same time with only one camera, even at high speeds. The proposed methodology has been employed for the analysis of large displacements during contact experiments in a soft material block. Displacement results have been successfully compared with those obtained using a 3D-DIC commercial system. Moreover, the analysis of displacements during an impact test on a metal plate was performed to emphasize the application of the methodology to dynamic events. Results show a good level of agreement, highlighting the potential of FP + 2D DIC as a low-cost alternative for the analysis of large-deformation problems.

  1. Response of Sap-Flow Measurements on Environmental Forcings

    NASA Astrophysics Data System (ADS)

    Howe, J. A.; Dragoni, D.; Schmid, H.

    2005-05-01

    The exchange of water between the atmosphere and biosphere is an important determinant of climate and the productivity of vegetation. Both evaporation and transpiration involve substantial amounts of energy exchange at the interface of the biosphere and atmosphere. Knowing how transpiration changes throughout the seasonal and diurnal cycles can help increase the understanding of how a forest reacts to changes in the biosphere and atmosphere. A common way to estimate transpiration is by measuring the sap flowing through the living tissues of trees. A study was conducted at Morgan-Monroe State Forest, a mixed deciduous forest in south central Indiana (USA), to investigate how sap flow in trees responds to changes in meteorological and environmental conditions. The heat-dissipation technique was used to estimate sap velocities from two Big Tooth Aspen (Populus grandidentata) and two Tulip Poplars (Liriodendron tulipifera). Sap velocity patterns (normalized by a reference potential evapotranspiration) were directly compared with meteorological and ecological measurements, such as vapor pressure deficits, photosynthetically active radiation (PAR), rainfall, and soil moisture content. In this study, we also investigated the uncertainties and problems that arise in using the heat-dissipation technique to extrapolate the single-tree measurements to the forest scale.

  2. The role of suppression in amblyopia.

    PubMed

    Li, Jingrong; Thompson, Benjamin; Lam, Carly S Y; Deng, Daming; Chan, Lily Y L; Maehara, Goro; Woo, George C; Yu, Minbin; Hess, Robert F

    2011-06-13

    This study had three main goals: to assess the degree of suppression in patients with strabismic, anisometropic, and mixed amblyopia; to establish the relationship between suppression and the degree of amblyopia; and to compare the degree of suppression across the clinical subgroups within the sample. Using both standard measures of suppression (Bagolini lenses and neutral density [ND] filters, Worth 4-Dot test) and a new approach involving the measurement of dichoptic motion thresholds under conditions of variable interocular contrast, the degree of suppression in 43 amblyopic patients with strabismus, anisometropia, or a combination of both was quantified. There was good agreement between the quantitative measures of suppression made with the new dichoptic motion threshold technique and measurements made with standard clinical techniques (Bagolini lenses and ND filters, Worth 4-Dot test). The degree of suppression was found to correlate directly with the degree of amblyopia within our clinical sample, whereby stronger suppression was associated with a greater difference in interocular acuity and poorer stereoacuity. Suppression was not related to the type or angle of strabismus when this was present or the previous treatment history. These results suggest that suppression may have a primary role in the amblyopia syndrome and therefore have implications for the treatment of amblyopia.

  3. Use of scatterometry for resist process control

    NASA Astrophysics Data System (ADS)

    Bishop, Kenneth P.; Milner, Lisa-Michelle; Naqvi, S. Sohail H.; McNeil, John R.; Draper, B. L.

    1992-06-01

    The formation of resist lines having submicron critical dimensions (CDs) is a complex multistep process, requiring precise control of each processing step. Optimization of parameters for each processing step may be accomplished through theoretical modeling techniques and/or the use of send-ahead wafers followed by scanning electron microscope measurements. Once the optimum parameters for a process have been selected (e.g., the duration and temperature of the post-exposure bake), no in-situ CD measurements are made. In this paper we describe the use of scatterometry to provide this essential metrology capability. It involves focusing a laser beam on a periodic grating and predicting the shape of the grating lines from a measurement of the scattered power in the diffraction orders. The inverse prediction of lineshape from a measurement of the scattered power is based on a vector diffraction analysis used in conjunction with photolithography simulation tools to provide an accurate scatter model for latent image gratings. This diffraction technique has previously been applied to observing latent image grating formation as exposure takes place. We have broadened the scope of the application and consider the problem of determination of optimal focus.

  4. Metrology conditions for thin layer activation in wear and corrosion studies

    NASA Astrophysics Data System (ADS)

    Lacroix, O.; Sauvage, T.; Blondiaux, G.; Racolta, P. M.; Popa-Simil, L.; Alexandreanu, B.

    1996-02-01

    Thin Layer Activation (TLA) is an ion beam technique. This method consists of an accelerated ion bombardment of the surface of interest of a machine part subjected to wear. Radioactive tracers are created by nuclear reactions in a well defined volume of material. Loss of material owing to wear, corrosion or abrasion phenomena is characterized by monitoring the resulting changes in radioactivity. For the industrial application of this method, special attention has been paid during irradiation to the range of activated thickness, yields and activation homogeneity and to on-line radioactivity measurements. There are two basic methods for measuring the material loss by TLA technique. One of them is based on remanant radioactivity measurements using a previously obtained calibration curve. The second is based on measuring the increasing radioactivity in the lubricant due to suspended wear particles. In this paper, we have chosen to present some calibration curves for both proton and deuteron irradiation of Fe, Cr, Cu, Ti and Ni samples. Thickness ranges are indicated and intrinsic error checking and calculational procedures are also presented. The article ends with a review of some typical experiments involving running-in programme optimization and lubricants certifying procedures.

  5. Partial Least Squares Regression Can Aid in Detecting Differential Abundance of Multiple Features in Sets of Metagenomic Samples

    PubMed Central

    Libiger, Ondrej; Schork, Nicholas J.

    2015-01-01

    It is now feasible to examine the composition and diversity of microbial communities (i.e., “microbiomes”) that populate different human organs and orifices using DNA sequencing and related technologies. To explore the potential links between changes in microbial communities and various diseases in the human body, it is essential to test associations involving different species within and across microbiomes, environmental settings and disease states. Although a number of statistical techniques exist for carrying out relevant analyses, it is unclear which of these techniques exhibit the greatest statistical power to detect associations given the complexity of most microbiome datasets. We compared the statistical power of principal component regression, partial least squares regression, regularized regression, distance-based regression, Hill's diversity measures, and a modified test implemented in the popular and widely used microbiome analysis methodology “Metastats” across a wide range of simulated scenarios involving changes in feature abundance between two sets of metagenomic samples. For this purpose, simulation studies were used to change the abundance of microbial species in a real dataset from a published study examining human hands. Each technique was applied to the same data, and its ability to detect the simulated change in abundance was assessed. We hypothesized that a small subset of methods would outperform the rest in terms of the statistical power. Indeed, we found that the Metastats technique modified to accommodate multivariate analysis and partial least squares regression yielded high power under the models and data sets we studied. The statistical power of diversity measure-based tests, distance-based regression and regularized regression was significantly lower. Our results provide insight into powerful analysis strategies that utilize information on species counts from large microbiome data sets exhibiting skewed frequency distributions obtained on a small to moderate number of samples. PMID:26734061
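
    As a hedged sketch of how PLS regression might be applied to such data, the following assumes Python with scikit-learn and a small synthetic abundance table; the sample sizes, taxa counts, and log transform are illustrative choices, not the exact pipeline compared in the paper:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # Hypothetical data: 40 samples x 200 taxa, with the first 5 taxa
        # shifted in abundance in the second group of samples.
        rng = np.random.default_rng(4)
        counts = rng.poisson(5.0, size=(40, 200)).astype(float)
        group = np.repeat([0.0, 1.0], 20)
        counts[20:, :5] += rng.poisson(8.0, size=(20, 5))

        X = np.log1p(counts)                    # simple variance-stabilizing transform
        pls = PLSRegression(n_components=2)
        pls.fit(X, group)
        loadings = pls.x_weights_[:, 0]         # taxa driving the first component
        print(np.argsort(np.abs(loadings))[::-1][:10])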

  6. Correlating Detergent Fiber Analysis and Dietary Fiber Analysis Data for Corn Stover

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfrum, E. J.; Lorenz, A. J.; deLeon, N.

    There exist large amounts of detergent fiber analysis data [neutral detergent fiber (NDF), acid detergent fiber (ADF), acid detergent lignin (ADL)] for many different potential cellulosic ethanol feedstocks, since these techniques are widely used for the analysis of forages. Researchers working in the area of cellulosic ethanol are interested in the structural carbohydrates in a feedstock (principally glucan and xylan), which are typically determined by acid hydrolysis of the structural fraction after multiple extractions of the biomass. These so-called dietary fiber analysis methods are significantly more involved than detergent fiber analysis methods. The purpose of this study was to determine whether it is feasible to correlate detergent fiber analysis values to glucan and xylan content determined by dietary fiber analysis methods for corn stover. In the detergent fiber analysis literature cellulose is often estimated as the difference between ADF and ADL, while hemicellulose is often estimated as the difference between NDF and ADF. Examination of a corn stover dataset containing both detergent fiber analysis data and dietary fiber analysis data predicted using near infrared spectroscopy shows that correlations between structural glucan measured using dietary fiber techniques and cellulose estimated using detergent techniques, and between structural xylan measured using dietary fiber techniques and hemicellulose estimated using detergent techniques are high, but are driven largely by the underlying correlation between total extractives measured by fiber analysis and NDF/ADF. That is, detergent analysis data is correlated to dietary fiber analysis data for structural carbohydrates, but only indirectly; the main correlation is between detergent analysis data and solvent extraction data produced during the dietary fiber analysis procedure.

  7. Artificial intelligence in sports on the example of weight training.

    PubMed

    Novatchkov, Hristo; Baca, Arnold

    2013-01-01

    The overall goal of the present study was to illustrate the potential of artificial intelligence (AI) techniques in sports, using weight training as an example. The research focused in particular on the implementation of pattern recognition methods for the evaluation of exercises performed on training machines. Data acquisition was carried out using displacement ("way") and cable force sensors attached to various weight machines, enabling the measurement of essential displacement and force determinants during training. On the basis of the gathered data, it was possible to derive other significant characteristics such as time periods and movement velocities. These parameters were used to develop intelligent methods adapted from conventional machine learning concepts, allowing automatic assessment of exercise technique and providing individuals with appropriate feedback. In practice, the implementation of such techniques could be crucial for investigating the quality of execution, for assisting athletes as well as coaches, for optimizing training, and for prevention purposes. For the current study, the data were based on measurements from 15 rather inexperienced participants performing 3-5 sets of 10-12 repetitions on a leg press machine. The preprocessed data were used to extract significant features, to which supervised modeling methods were applied. Professional trainers were involved in the assessment and classification processes by analyzing the video-recorded executions. The modeling results obtained so far showed good performance and prediction outcomes, indicating the feasibility and potential of AI techniques for automatically assessing performance on weight training equipment and providing athletes with prompt advice. Key points: Artificial intelligence is a promising field for sport-related analysis. Implementations integrating pattern recognition techniques enable the automatic evaluation of data measurements. Artificial neural networks applied to the analysis of weight training data show good performance and high classification rates.
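
    As an illustration of the kind of pipeline described (not the authors' implementation), the sketch below extracts a few per-repetition features from a displacement trace and fits a small neural-network classifier to trainer-assigned quality labels; the sampling rate, feature choices and classifier are assumptions.

    ```python
    # Illustrative sketch (not the authors' pipeline): per-repetition features from
    # a displacement trace, plus a small neural-network classifier trained on
    # trainer-assigned quality labels. Sampling rate and features are assumptions.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def repetition_features(displacement, sample_rate_hz):
        """Duration, range of motion and mean absolute velocity of one repetition."""
        d = np.asarray(displacement, float)
        duration = d.size / sample_rate_hz
        range_of_motion = d.max() - d.min()
        velocity = np.gradient(d) * sample_rate_hz
        return np.array([duration, range_of_motion, np.mean(np.abs(velocity))])

    def train_quality_model(repetitions, labels, sample_rate_hz=100.0):
        """Fit a classifier on labelled repetitions (e.g. 'good' / 'faulty')."""
        X = np.vstack([repetition_features(r, sample_rate_hz) for r in repetitions])
        model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        return model.fit(X, labels)
    ```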

  8. Contribution of multiple inert gas elimination technique to pulmonary medicine. 1. Principles and information content of the multiple inert gas elimination technique.

    PubMed Central

    Roca, J.; Wagner, P. D.

    1994-01-01

    This introductory review summarises four aspects of the multiple inert gas elimination technique (MIGET). The first is the historical background that facilitated, in the mid-1970s, the development of the MIGET as a tool for obtaining more information about the entire spectrum of the VA/Q distribution in the lung by measuring the exchange of six gases of different solubility administered in trace concentrations. Its principle is based on the observation that the retention (or excretion) of any gas depends on the solubility (lambda) of that gas and on the VA/Q distribution. A second major aspect is the analysis of the information content and limitations of the technique. During the last 15 years a substantial amount of clinical research using the MIGET has been generated by several groups around the world. The technique has proved adequate not only for understanding the mechanisms of hypoxaemia in different forms of pulmonary disease and the effects of therapeutic interventions, but also for separately determining the quantitative role of each extrapulmonary factor in systemic arterial PO2 when these factors change between two conditions of MIGET measurement. This information will be reviewed extensively in the forthcoming articles of this series. Next, the different modalities of the MIGET, the practical considerations involved in the measurements, and the guidelines for quality control are outlined. Finally, a section is devoted to the analysis of available data in healthy subjects under different conditions. The lack of systematic information on the VA/Q distributions of older healthy subjects is emphasised, since such data will be required to fully understand the changes brought about by diseases that affect the older population. PMID:8091330
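
    The compartmental relation underlying the technique can be written down compactly; the sketch below uses the standard steady-state result that a compartment's retention is lambda/(lambda + VA/Q), with whole-lung retention and excretion obtained as perfusion- and ventilation-weighted sums. The solubility and compartment values in the example are placeholders, not the actual MIGET gas set.

    ```python
    # Sketch of the steady-state inert-gas relation behind the MIGET: for a single
    # compartment, retention = lambda / (lambda + VA/Q); whole-lung retention and
    # excretion are perfusion- and ventilation-weighted sums over compartments.
    # Solubility and compartment values below are placeholders, not the MIGET gas set.
    import numpy as np

    def retention_excretion(solubility, va, q):
        """Whole-lung retention and excretion of one gas over a VA/Q distribution.

        solubility : blood-gas partition coefficient (lambda) of the gas
        va, q      : per-compartment ventilation and perfusion (same units, all > 0)
        """
        va, q = np.asarray(va, float), np.asarray(q, float)
        compartment = solubility / (solubility + va / q)
        retention = np.sum((q / q.sum()) * compartment)
        excretion = np.sum((va / va.sum()) * compartment)
        return retention, excretion

    # Example: a two-compartment lung and a hypothetical low-solubility gas.
    print(retention_excretion(solubility=0.01, va=[0.5, 4.5], q=[4.5, 0.5]))
    ```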

  9. Artificial Intelligence in Sports on the Example of Weight Training

    PubMed Central

    Novatchkov, Hristo; Baca, Arnold

    2013-01-01

    The overall goal of the present study was to illustrate the potential of artificial intelligence (AI) techniques in sports, using weight training as an example. The research focused in particular on the implementation of pattern recognition methods for the evaluation of exercises performed on training machines. Data acquisition was carried out using displacement ("way") and cable force sensors attached to various weight machines, enabling the measurement of essential displacement and force determinants during training. On the basis of the gathered data, it was possible to derive other significant characteristics such as time periods and movement velocities. These parameters were used to develop intelligent methods adapted from conventional machine learning concepts, allowing automatic assessment of exercise technique and providing individuals with appropriate feedback. In practice, the implementation of such techniques could be crucial for investigating the quality of execution, for assisting athletes as well as coaches, for optimizing training, and for prevention purposes. For the current study, the data were based on measurements from 15 rather inexperienced participants performing 3-5 sets of 10-12 repetitions on a leg press machine. The preprocessed data were used to extract significant features, to which supervised modeling methods were applied. Professional trainers were involved in the assessment and classification processes by analyzing the video-recorded executions. The modeling results obtained so far showed good performance and prediction outcomes, indicating the feasibility and potential of AI techniques for automatically assessing performance on weight training equipment and providing athletes with prompt advice. Key points: Artificial intelligence is a promising field for sport-related analysis. Implementations integrating pattern recognition techniques enable the automatic evaluation of data measurements. Artificial neural networks applied to the analysis of weight training data show good performance and high classification rates. PMID:24149722

  10. Noninvasive measurement of central venous pressure

    NASA Technical Reports Server (NTRS)

    Webster, J. G.; Mastenbrook, S. M., Jr.

    1972-01-01

    A technique for the noninvasive measurement of CVP in man was developed. The method involves monitoring venous velocity at a peripheral site with a transcutaneous Doppler ultrasonic velocity meter while the patient performs a forced expiratory maneuver. The idea is that CVP is related to the mouth pressure that just stops flow in the vein. Two improvements were made over the original procedure. First, the site of venous velocity measurement was moved from a vein at the antecubital fossa (elbow) to the right external jugular vein in the neck, which makes events occurring in the central veins easier to sense. Second, and perhaps most significantly, a procedure was developed for obtaining a curve of relative mean venous velocity versus mouth pressure.
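
    A minimal sketch of how such a velocity-versus-mouth-pressure curve could be reduced to a CVP estimate, assuming a monotonic upward pressure sweep; the near-zero velocity threshold is an illustrative choice, not part of the original procedure.

    ```python
    # Minimal sketch: sweep mouth pressure upward while recording mean venous
    # velocity, and take the CVP estimate as the lowest mouth pressure at which
    # flow is (nearly) stopped. The near-zero threshold is an illustrative choice.
    import numpy as np

    def estimate_cvp(mouth_pressure, mean_velocity, zero_fraction=0.05):
        """Mouth pressure (same units as input) at which venous flow first stops."""
        p = np.asarray(mouth_pressure, float)
        v = np.asarray(mean_velocity, float)
        threshold = zero_fraction * v.max()      # "near zero" relative to peak velocity
        stopped = np.flatnonzero(v <= threshold)
        return p[stopped[0]] if stopped.size else None
    ```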

  11. Viscoelastic properties of cell walls of single living plant cells determined by dynamic nanoindentation

    PubMed Central

    Hayot, Céline M.; Forouzesh, Elham; Goel, Ashwani; Avramova, Zoya; Turner, Joseph A.

    2012-01-01

    Plant development results from controlled cell divisions, structural modifications, and reorganizations of the cell wall. Regulation of cell wall behaviour therefore takes place at multiple length scales, involving compositional and architectural aspects in addition to various developmental and/or environmental factors. The physical properties of the primary wall are largely determined by the nature of its complex polymer network, which exhibits the time-dependent behaviour characteristic of viscoelastic materials. Here, a dynamic nanoindentation technique is used to measure the time-dependent response and the viscoelastic behaviour of the cell wall in single living cells at a micron or sub-micron scale. With this approach, significant changes in storage (stiffness) and loss (energy dissipation) moduli are captured among the tested cells. The results reveal hitherto unknown differences in the viscoelastic parameters of the walls of same-age, similarly positioned cells of the Arabidopsis ecotypes Col 0 and Ws 2. The technique is also shown to be sensitive enough to detect changes in cell wall properties in cells deficient in the activity of the chromatin modifier ATX1. Extensive computational modelling of the experimental measurements (i.e. modelling the cell as a viscoelastic pressure vessel) is used to analyse the influence of the wall thickness, as well as the turgor pressure, at the positions of our measurements. By combining the nanoDMA technique with finite element simulations, quantifiable measurements of the viscoelastic properties of plant cell walls are achieved. Such techniques are expected to find broader applications in quantifying the influence of genetic, biological, and environmental factors on the nanoscale mechanical properties of the cell wall. PMID:22291130
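
    As a rough illustration of how dynamic indentation separates the two responses, the sketch below converts a force amplitude, displacement amplitude and phase lag into storage and loss stiffnesses and tan delta; instrument spring/damping corrections and the contact-area conversion to moduli, which the actual nanoDMA analysis requires, are deliberately omitted.

    ```python
    # Back-of-the-envelope sketch of how a dynamic (nanoDMA) measurement separates
    # the two responses: with force amplitude F0, displacement amplitude h0 and
    # phase lag delta, the in-phase (storage) and out-of-phase (loss) stiffnesses
    # follow from cos(delta) and sin(delta). Instrument spring/damping corrections
    # and the contact-area conversion to moduli are deliberately omitted.
    import numpy as np

    def storage_loss_stiffness(force_amplitude, displacement_amplitude, phase_lag_rad):
        """Return (storage stiffness, loss stiffness, tan delta)."""
        dynamic_stiffness = force_amplitude / displacement_amplitude
        k_storage = dynamic_stiffness * np.cos(phase_lag_rad)
        k_loss = dynamic_stiffness * np.sin(phase_lag_rad)
        return k_storage, k_loss, k_loss / k_storage
    ```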

  12. Features of the non-contact carotid pressure waveform: Cardiac and vascular dynamics during rebreathing

    NASA Astrophysics Data System (ADS)

    Casaccia, S.; Sirevaag, E. J.; Richter, E. J.; O'Sullivan, J. A.; Scalise, L.; Rohrbaugh, J. W.

    2016-10-01

    This report amplifies and extends prior descriptions of the use of laser Doppler vibrometry (LDV) as a method for assessing cardiovascular activity, on a non-contact basis. A rebreathing task (n = 35 healthy individuals) was used to elicit multiple effects associated with changes in autonomic drive as well as blood gases including hypercapnia. The LDV pulse was obtained from two sites overlying the carotid artery, separated by 40 mm. A robust pulse signal was obtained from both sites, in accord with the well-described changes in carotid diameter over the blood pressure cycle. Emphasis was placed on extracting timing measures from the LDV pulse, which could serve as surrogate measures of pulse wave velocity (PWV) and the associated arterial stiffness. For validation purposes, a standard measure of pulse transit time (PTT) to the radial artery was obtained using a tonometric sensor. Two key measures of timing were extracted from the LDV pulse. One involved the transit time along the 40 mm distance separating the two LDV measurement sites. A second measure involved the timing of a late feature of the LDV pulse contour, which was interpreted as reflection wave latency and thus a measure of round-trip travel time. Both LDV measures agreed with the conventional PTT measure, in disclosing increased PWV during periods of active rebreathing. These results thus provide additional evidence that measures based on the non-contact LDV technique might provide surrogate measures for those obtained using conventional, more obtrusive assessment methods that require attached sensors.
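
    A minimal sketch of the first timing measure, assuming the two LDV pulse signals are sampled synchronously: the transit time is taken as the cross-correlation lag between the two sites and converted to a local pulse wave velocity over the 40 mm separation. Sub-sample interpolation and beat-by-beat averaging, which a real analysis would need, are left out.

    ```python
    # Illustrative sketch: estimate the transit time between the two LDV sites
    # (40 mm apart) as the cross-correlation lag between the two pulse signals,
    # then convert it to a local pulse wave velocity. Sub-sample interpolation and
    # beat-by-beat averaging, which a real analysis would need, are omitted.
    import numpy as np

    def local_pwv(proximal, distal, sample_rate_hz, separation_m=0.040):
        """PWV from the lag (in samples) that best aligns the distal signal to the proximal one."""
        a = np.asarray(proximal, float) - np.mean(proximal)
        b = np.asarray(distal, float) - np.mean(distal)
        xcorr = np.correlate(b, a, mode="full")    # positive lag: distal arrives later
        lag_samples = int(np.argmax(xcorr)) - (len(a) - 1)
        transit_time_s = lag_samples / sample_rate_hz
        return separation_m / transit_time_s if transit_time_s > 0 else None
    ```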

  13. Polarization spectroscopy of the sodium dimer utilizing a triple-resonance technique in the presence of argon

    NASA Astrophysics Data System (ADS)

    Arndt, Phillip; Horton, Timothy; McFarland, Jacob; Bayram, Burcin; Miami University Spectroscopy Team

    2015-05-01

    The collisional dynamics of molecular sodium in the 6¹Σg electronic state is under investigation using a triple-resonance technique in the presence of argon. A continuous-wave ring dye laser is used to populate specific rovibrational levels of the A¹Σu electronic state. A pump-probe technique is then employed, where the pump laser populates the 6¹Σg state and the probe laser dumps the population to the B¹Σu state. From this level, fluorescence is detected as the system decays to the X¹Σg state. We measure the polarization of this signal in the presence of various argon pressures. We will present our current work as well as the processes involved in the experiment. Financial support from the National Science Foundation (Grant No. NSF-PHY-1309571) is gratefully acknowledged.

  14. Rule groupings in expert systems using nearest neighbour decision rules, and convex hulls

    NASA Technical Reports Server (NTRS)

    Anastasiadis, Stergios

    1991-01-01

    Expert system shells are lacking in many areas of software engineering. Large rule-based systems are not semantically comprehensible, are difficult to debug, and are impossible to modify or validate. Partitioning a set of rules found in CLIPS (C Language Integrated Production System) into groups of rules that reflect the underlying semantic subdomains of the problem will adequately address these concerns. Techniques are introduced to structure a CLIPS rule base into groups of rules that inherently share common semantic information. The concepts involved are imported from the fields of AI, pattern recognition, and statistical inference. The techniques focus on feature selection, classification, and a criterion, based on Bayesian decision theory, for how 'good' the classification is. A variety of distance metrics are discussed for measuring the 'closeness' of CLIPS rules, and several nearest neighbour classification algorithms based on these metrics are described.
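
    A hypothetical sketch of the general idea (the report's own metrics and grouping criteria are not reproduced here): represent each rule by the set of tokens it references, measure closeness with a Jaccard distance, and attach each rule to its nearest neighbour.

    ```python
    # Hypothetical sketch: represent each rule by the set of tokens it references,
    # measure "closeness" with a Jaccard distance, and attach each rule to its
    # nearest neighbour. The tokenisation and metric are illustrative only.
    def jaccard_distance(tokens_a, tokens_b):
        """1 - |intersection| / |union| of the two token sets."""
        a, b = set(tokens_a), set(tokens_b)
        return 1.0 - len(a & b) / len(a | b) if (a | b) else 0.0

    def nearest_neighbour_grouping(rule_tokens):
        """Map each rule name to the name of its closest rule (needs two or more rules)."""
        names = list(rule_tokens)
        return {
            name: min((other for other in names if other != name),
                      key=lambda other: jaccard_distance(rule_tokens[name],
                                                         rule_tokens[other]))
            for name in names
        }
    ```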

  15. Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry

    NASA Astrophysics Data System (ADS)

    Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien

    2015-04-01

    Quantifying the uncertainty of streamflow data is key for the hydrological sciences. Conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments have recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique in given measurement conditions. The two uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic Doppler current profilers (ADCPs), current meters and handheld surface velocity radars (SVRs). Uncertainty analysis was not always their primary goal: most often, the original motivation was to test the proficiency and homogeneity of instruments, makes and models, procedures and operators. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method for computing the uncertainties of streamgauging techniques according to the authoritative international documents (ISO standards). Several specific issues arise from the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference, or a sensitivity analysis to the fixed parameters of the streamgauging technique, remains very useful for estimating the uncertainty related to the (non-quantified) bias correction. In the absence of a reference, the uncertainty estimate is referenced to the average of all discharge measurements in the interlaboratory experiment, ignoring the technique bias. Simple equations can be used to assess the uncertainty of the uncertainty results as a function of the number of participants and of repeated measurements. The interlaboratory method was applied to several interlaboratory experiments on ADCPs and on current meters mounted on wading rods, in streams of different sizes and aspects, with typically 10 to 30 instruments. The uncertainty results were consistent with usual expert judgment and depended strongly on the measurement environment. Approximately, the expanded uncertainties (within the 95% probability interval) were ±5% to ±10% for ADCPs in good or poor conditions, and ±10% to ±15% for current meters in shallow creeks. Due to the specific limitations related to a slow measurement process and to small, natural streams, the uncertainty results for current meters were more uncertain than those for ADCPs, for which the site-specific errors were clearly evidenced. The proposed method can be applied, in a standardized way, to a wide range of interlaboratory experiments conducted in contrasted environments for different streamgauging techniques.
Ideally, an international open database would enhance the investigation of hydrological data uncertainties, according to the characteristics of the measurement conditions and procedures. Such a dataset could be used for implementing and validating uncertainty propagation methods in hydrometry.
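
    A simplified sketch, in the spirit of the ISO interlaboratory approach but ignoring any bias term: split the variance of simultaneous gaugings into repeatability and between-participant components and report an expanded (k = 2) uncertainty relative to the grand mean; each participant is assumed to provide at least two repeated measurements.

    ```python
    # Simplified sketch in the spirit of ISO 5725 (any bias term is ignored): split
    # the variance of simultaneous gaugings into repeatability (within-participant)
    # and between-participant components, and report an expanded uncertainty (k = 2)
    # relative to the grand mean of all discharge measurements.
    import numpy as np

    def interlaboratory_uncertainty(measurements):
        """measurements[i] = repeated discharges by participant i (at least two each)."""
        groups = [np.asarray(g, float) for g in measurements]
        grand_mean = np.mean(np.concatenate(groups))
        mean_reps = np.mean([g.size for g in groups])
        s_r2 = np.mean([g.var(ddof=1) for g in groups])              # repeatability
        s_L2 = max(np.var([g.mean() for g in groups], ddof=1) - s_r2 / mean_reps, 0.0)
        s_R = np.sqrt(s_r2 + s_L2)                                   # reproducibility
        return 2.0 * s_R / grand_mean * 100.0                        # expanded, in % of Q
    ```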

  16. Quantity yields quality when it comes to creativity: a brain and behavioral test of the equal-odds rule

    PubMed Central

    Jung, Rex E.; Wertz, Christopher J.; Meadows, Christine A.; Ryman, Sephira G.; Vakhtin, Andrei A.; Flores, Ranee A.

    2015-01-01

    The creativity research community is in search of a viable cognitive measure to support the behavioral observation that higher ideational output is often associated with higher creativity (known as the equal-odds rule). One such measure has been divergent thinking: the production of many examples or uses for a common or single object or image. We sought to test the equal-odds rule using a measure of divergent thinking, and applied the consensual assessment technique to identify creative responses as opposed to merely original ones. We also sought to determine structural brain correlates of both ideational fluency and ideational creativity. Two hundred forty-six subjects completed a broad battery of behavioral measures, including a core measure of divergent thinking (Foresight), and measures of intelligence, creative achievement, and personality (i.e., Openness to Experience). Cortical thickness and subcortical volumes (e.g., thalamus) were measured using automated techniques (FreeSurfer). We found that a higher number of responses on the divergent thinking task was significantly associated with higher creativity (r = 0.73) as independently assessed by three judges. Moreover, we found that creativity was predicted by cortical thickness in regions including the left frontal pole and left parahippocampal gyrus. These results support the equal-odds rule and provide neuronal evidence implicating brain regions involved with “thinking about the future” and “extracting future prospects.” PMID:26161075

  17. Application of an in vitro DNA protection assay to visualize stress mediation properties of the Dps protein.

    PubMed

    Karas, Vlad O; Westerlaken, Ilja; Meyer, Anne S

    2013-05-31

    Oxidative stress is an unavoidable byproduct of aerobic life. Molecular oxygen is essential for terrestrial metabolism, but it also takes part in many damaging reactions within living organisms. The combination of aerobic metabolism and iron, another element vital for life, is enough to produce radicals through Fenton chemistry and degrade cellular components. DNA degradation is arguably the most damaging process involving intracellular radicals, as DNA repair is far from trivial. The assay presented in this article offers a quantitative technique to measure and visualize the effect of molecules and enzymes on radical-mediated DNA damage. The DNA protection assay is a simple, quick, and robust tool for the in vitro characterization of the protective properties of proteins or chemicals. It involves exposing DNA to a damaging oxidative reaction and adding varying concentrations of the compound of interest. The reduction or increase of DNA damage as a function of compound concentration is then visualized using gel electrophoresis. In this article we demonstrate the technique by measuring the protective properties of the DNA-binding protein from starved cells (Dps), a mini-ferritin that is utilized by more than 300 bacterial species to powerfully combat environmental stressors. We present the Dps purification protocol and the optimized assay conditions for evaluating DNA protection by Dps.
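
    One possible way to turn the gel readout into numbers (not specified in the protocol itself) is to express protection as the intact-band intensity at each compound concentration, scaled between a fully exposed lane and an undamaged control lane, as sketched below.

    ```python
    # Possible quantification of the gel readout (not part of the published
    # protocol): protection = intact-band intensity scaled between a fully exposed
    # (no-compound) lane and an undamaged control lane.
    import numpy as np

    def protection_fraction(intact_band, undamaged_control, no_compound_lane):
        """0 = like the exposed lane (no protection), 1 = like the control (full protection)."""
        intact = np.asarray(intact_band, float)
        return (intact - no_compound_lane) / (undamaged_control - no_compound_lane)
    ```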

  18. Estimation of Fine and Oversize Particle Ratio in a Heterogeneous Compound with Acoustic Emissions.

    PubMed

    Nsugbe, Ejay; Ruiz-Carcel, Cristobal; Starr, Andrew; Jennions, Ian

    2018-03-13

    The final phase of powder production typically involves a mixing process in which all of the particles are combined and agglomerated with a binder to form a single compound. The traditional means of inspecting the physical properties of the final product is an offline sieving and weighing check of the particle sizes. The main downside of this technique, in addition to being an offline-only measurement procedure, is its inability to characterise large agglomerates of powder due to sieve blockage. This work assesses the feasibility of a real-time monitoring approach, using a benchtop test rig and a prototype acoustic-based measurement method, to provide information that can be correlated to product quality and open the way to future process optimisation. Acoustic emission (AE) was chosen as the sensing method because of its low cost, simple setup, and ease of implementation. The performance of the proposed method was assessed in a series of experiments in which the offline quality-check results were compared to the AE-based real-time estimates, using data acquired from a benchtop powder free-flow rig. A purpose-designed, time-domain signal processing method was used to extract particle size information from the acquired AE signal, and the results show that this technique is capable of estimating the required ratio in the washing powder compound with an average absolute error of 6%.
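
    An illustrative sketch of the overall approach, not the authors' exact signal processing chain: summarise each AE window with a few time-domain features and fit a least-squares model mapping those features to the sieve-measured fine/oversize ratio.

    ```python
    # Illustrative sketch (not the authors' exact method): summarise each AE window
    # with a few time-domain features and fit a least-squares model mapping the
    # features to the offline sieve-measured fine/oversize ratio.
    import numpy as np

    def time_domain_features(window):
        """RMS, peak amplitude, crest factor and kurtosis of one AE window."""
        x = np.asarray(window, float)
        rms = np.sqrt(np.mean(x ** 2))
        peak = np.max(np.abs(x))
        kurtosis = np.mean((x - x.mean()) ** 4) / (x.var() ** 2)
        return np.array([rms, peak, peak / rms, kurtosis])

    def fit_ratio_model(windows, sieve_ratios):
        """Least-squares coefficients mapping window features to the measured ratio."""
        X = np.vstack([time_domain_features(w) for w in windows])
        X = np.column_stack([X, np.ones(len(X))])        # intercept term
        coeffs, *_ = np.linalg.lstsq(X, np.asarray(sieve_ratios, float), rcond=None)
        return coeffs
    ```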

  19. A Real-Time Position-Locating Algorithm for CCD-Based Sunspot Tracking

    NASA Technical Reports Server (NTRS)

    Taylor, Jaime R.

    1996-01-01

    NASA Marshall Space Flight Center's (MSFC) EXperimental Vector Magnetograph (EXVM) polarimeter measures the sun's vector magnetic field. These measurements are taken to improve understanding of the sun's magnetic field in the hope of better predicting solar flares. Part of the procedure for the EXVM requires image motion stabilization over a period of a few minutes. A high-speed tracker can be used to reduce image motion produced by wind loading on the EXVM, fluctuations in the atmosphere, and other vibrations. The tracker consists of two elements, an image motion detector and a control system. The image motion detector determines the image movement from one frame to the next and sends an error signal to the control system. For the ground-based application, reducing image motion due to atmospheric fluctuations requires error determination at a rate of at least 100 Hz; a rate of 1 kHz would be desirable to ensure that higher-rate image motion is also reduced and to increase the stability of the control system. Two algorithms typically used for tracking are presented. These algorithms are examined for their applicability to tracking sunspots, specifically their accuracy when only one column and one row of CCD pixels are used. Two techniques are used to examine this accuracy. One involves moving a sunspot image a known distance in software and then applying each algorithm to see how accurately it recovers this movement. The second involves using a rate table to control the object motion and then applying the algorithms to see how accurately each determines the actual motion. Results from these two techniques are presented.
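
    A minimal sketch of the single-row/single-column idea under test, assuming integer-pixel shifts: estimate the horizontal and vertical displacement by cross-correlating one CCD row and one column against reference profiles. Sub-pixel interpolation, which a practical tracker would add, is omitted.

    ```python
    # Sketch of the single-row/single-column idea, assuming integer-pixel shifts:
    # estimate horizontal and vertical displacement by cross-correlating one CCD
    # row and one column against the corresponding reference profiles. Sub-pixel
    # interpolation, which a practical tracker would add, is omitted.
    import numpy as np

    def profile_shift(reference_profile, current_profile):
        """Integer shift (in pixels) that best aligns the current profile to the reference."""
        r = np.asarray(reference_profile, float) - np.mean(reference_profile)
        c = np.asarray(current_profile, float) - np.mean(current_profile)
        xcorr = np.correlate(c, r, mode="full")    # positive shift: image moved to larger index
        return int(np.argmax(xcorr)) - (len(r) - 1)

    def image_shift(reference_image, current_image, row, col):
        """(dx, dy) from one row (horizontal shift) and one column (vertical shift) of 2-D arrays."""
        dx = profile_shift(reference_image[row, :], current_image[row, :])
        dy = profile_shift(reference_image[:, col], current_image[:, col])
        return dx, dy
    ```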

  20. Situation awareness measures for simulated submarine track management.

    PubMed

    Loft, Shayne; Bowden, Vanessa; Braithwaite, Janelle; Morrell, Daniel B; Huf, Samuel; Durso, Francis T

    2015-03-01

    The aim of this study was to examine whether the Situation Present Assessment Method (SPAM) and the Situation Awareness Global Assessment Technique (SAGAT) predict incremental variance in performance on a simulated submarine track management task and to measure the potential disruptive effect of these situation awareness (SA) measures. Submarine track managers use various displays to localize and track contacts detected by own-ship sensors. The measurement of SA is crucial for designing effective submarine display interfaces and training programs. Participants monitored a tactical display and sonar bearing-history display to track the cumulative behaviors of contacts in relationship to own-ship position and landmarks. SPAM (or SAGAT) and the Air Traffic Workload Input Technique (ATWIT) were administered during each scenario, and the NASA Task Load Index (NASA-TLX) and Situation Awareness Rating Technique were administered postscenario. SPAM and SAGAT predicted variance in performance after controlling for subjective measures of SA and workload, and SA for past information was a stronger predictor than SA for current/future information. The NASA-TLX predicted performance on some tasks. Only SAGAT predicted variance in performance on all three tasks but marginally increased subjective workload. SPAM, SAGAT, and the NASA-TLX can predict unique variance in submarine track management performance. SAGAT marginally increased subjective workload, but this increase did not lead to any performance decrement. Defense researchers have identified SPAM as an alternative to SAGAT because it would not require field exercises involving submarines to be paused. SPAM was not disruptive, but it is potentially problematic that SPAM did not predict variance in all three performance tasks. © 2014, Human Factors and Ergonomics Society.
