Science.gov

Sample records for laue case analyzer

  1. Mammography imaging studies using a Laue crystal analyzer

    SciTech Connect

    Chapman, D.; Thomlinson, W.; Arfelli, F.

    1995-12-31

    Synchrotron-based mammography imaging experiments have been performed with monochromatic x-rays in which a Laue crystal placed after the object being imaged has been used to split the beam transmitted through the object. The X27C R&D beamline at the National Synchrotron Light Source was used, with the white beam monochromatized by a double-crystal Si(111) monochromator tuned to 18 keV. The imaging beam was a thin horizontal line approximately 0.5 mm high by 100 mm wide. Images were acquired in line-scan mode with the phantom and detector scanned together. The detector for these experiments was an image plate. A thin Si(111) Laue analyzer was used to diffract a portion of the beam transmitted through the phantom before the image plate detector. This 'scatter-free' diffracted beam was then recorded on the image plate during the phantom scan. Since the thin Laue crystal also transmitted a fraction of the incident beam, this beam was simultaneously recorded on the image plate. The imaging results are interpreted in terms of x-ray schlieren, i.e. refractive-index inhomogeneities. Analyzer images taken at various points in the rocking curve will be presented.
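
    For orientation, the analyzer geometry above is set by Bragg's law. The following minimal sketch (not from the paper; the Si(111) d-spacing is a textbook value) computes the Bragg angle at which an 18 keV beam is diffracted by the Si(111) reflection used in this setup.

    ```python
    import math

    # Bragg's law: lambda = 2 * d * sin(theta)
    HC_KEV_ANGSTROM = 12.398            # h*c in keV*Angstrom (approximate)
    D_SI_111 = 5.4309 / math.sqrt(3)    # Si(111) spacing from a = 5.4309 A (textbook value), ~3.14 A

    def bragg_angle_deg(energy_kev, d_spacing_angstrom):
        """Bragg angle in degrees for a given photon energy and lattice-plane spacing."""
        wavelength = HC_KEV_ANGSTROM / energy_kev
        return math.degrees(math.asin(wavelength / (2.0 * d_spacing_angstrom)))

    print(f"Si(111) Bragg angle at 18 keV: {bragg_angle_deg(18.0, D_SI_111):.2f} deg")  # ~6.3 deg
    ```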

  2. LauePt, a graphical-user-interface program for simulating and analyzing white-beam x-ray diffraction Laue patterns.

    SciTech Connect

    Huang, X.

    2010-08-01

    LauePt is a robust and extremely easy-to-use Windows application for accurately simulating, indexing and analyzing white-beam X-ray diffraction Laue patterns of any crystal under arbitrary diffraction geometry. The program has a user-friendly graphical interface and can be conveniently used by nonspecialists with little X-ray diffraction or crystallography knowledge. Its wide range of applications includes (1) determination of single-crystal orientation with the Laue method, (2) white-beam topography, (3) white-beam microdiffraction, (4) X-ray studies of twinning, domains and heterostructures, (5) verification or determination of crystal structures from white-beam diffraction, and (6) teaching of X-ray crystallography.
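
    For context on what such a simulator has to compute: in the white-beam Laue method every reciprocal-lattice vector G selects its own wavelength from the incident spectrum, and the diffracted beam is the mirror reflection of the incident direction in the corresponding lattice plane. The sketch below is a generic geometric illustration under assumed values (simple-cubic cell, identity orientation matrix, band-pass, detector distance); it is not the LauePt code.

    ```python
    import numpy as np

    # Laue condition |k + G| = |k| with k = (2*pi/lambda) * s0 selects
    #   lambda = -4*pi * (s0 . g_hat) / |G|,
    # and the diffracted unit vector is the mirror of s0 in the lattice plane:
    #   s = s0 - 2 * (s0 . g_hat) * g_hat.

    a = 5.431                        # assumed cubic lattice parameter (Angstrom), illustrative only
    B = 2 * np.pi * np.eye(3) / a    # reciprocal basis of a simple-cubic cell
    U = np.eye(3)                    # assumed orientation matrix
    s0 = np.array([0.0, 0.0, 1.0])   # incident beam along +z
    lam_min, lam_max = 0.3, 1.5      # assumed white-beam band-pass (Angstrom)
    L = 100.0                        # assumed crystal-to-detector distance (mm), transmission geometry

    spots = []
    for h in range(-5, 6):
        for k in range(-5, 6):
            for l in range(-5, 6):
                if (h, k, l) == (0, 0, 0):
                    continue
                G = U @ B @ np.array([h, k, l], float)
                G_mag = np.linalg.norm(G)
                g_hat = G / G_mag
                lam = -4.0 * np.pi * (s0 @ g_hat) / G_mag
                if not (lam_min <= lam <= lam_max):
                    continue
                s = s0 - 2.0 * (s0 @ g_hat) * g_hat        # diffracted direction
                if s[2] <= 0:                              # keep forward-diffracted beams
                    continue
                x, y = L * s[0] / s[2], L * s[1] / s[2]    # projection onto a flat detector
                spots.append(((h, k, l), round(lam, 3), round(x, 1), round(y, 1)))

    print(len(spots), "spots;", spots[:3])
    ```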

  3. Research on a logarithmically bent Laue crystal analyzer for X-ray monochromatic backlight imaging

    SciTech Connect

    Wu, Yufen; Xiao, Shali; Lu, Jian; Liu, Lifeng; Yang, Qingguo; Huang, Xianbin

    2013-07-15

    A new logarithmically bent Laue imaging crystal analyzer (LBLICA) is proposed to obtain monochromatic images of plasmas; it shows great potential for application in inertial confinement fusion experiments, offering a large field of view (FOV) together with high spatial resolution. The imaging geometry of the LBLICA is discussed. According to the Bragg condition and the equation of the logarithmic spiral, the key imaging parameters of the crystal analyzer, including the system magnification, the spatial resolution, and the FOV, have been analyzed theoretically. An experiment has been performed with a Cu target X-ray tube as a backlighter to backlight a mesh grid consisting of 50-μm Cu wires, and a monochromatic image of the grid has been obtained with a spatial resolution of approximately 30 μm.
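
    For reference, the geometric property being exploited (standard logarithmic-spiral geometry, not specific parameter values from this paper) is that a logarithmic spiral crosses every ray drawn from its pole at the same angle, so radiation from a point source at the pole meets the diffracting planes at a fixed angle over the whole crystal:

        r(\varphi) = r_0 \, e^{\varphi \cot\chi}, \qquad
        \tan\psi = \frac{r}{\mathrm{d}r/\mathrm{d}\varphi} = \tan\chi
        \;\;\Rightarrow\;\; \psi = \chi \ \text{for all } \varphi .

    Choosing χ so that the source rays meet the chosen lattice planes at the Bragg angle keeps 2d·sin(θ_B) = λ satisfied across the analyzer, which is what selects a single backlighter energy; the spiral parameters then set the magnification and field of view discussed above.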

  4. High resolution short focal distance Bent Crystal Laue Analyzer for copper K edge x-ray absorption spectroscopy

    SciTech Connect

    Kujala, N. G.; Barrea, R. A.; Karanfil, C.

    2011-06-15

    We have developed a compact short focal distance Bent Crystal Laue Analyzer (BCLA) for Cu speciation studies of biological systems with specific applications to cancer biology. The system provides high energy resolution and high background rejection. The system is composed of an aluminum block serving as a log spiral bender for a 15 μm thick Si(111) crystal and a set of Soller slits. The energy resolution of the BCLA, about 14 eV at the Cu Kα line, allows resolution of the Cu Kα1 and Cu Kα2 lines. The system is easily aligned using a set of motorized XYZ linear stages. Two operation modes are available: incident energy scans (IES) and emission energy scans (EES). In IES the incident energy is scanned while the BCLA system is maintained at a preselected fixed position, typically the Cu Kα1 line. EES is used when the incident energy is fixed and the analyzer is scanned to provide the peak profile of the Cu emission lines.
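
    As a quick plausibility check on the quoted 14 eV resolution (using approximate textbook Cu emission-line energies, which are not given in the abstract):

    ```python
    # Approximate textbook Cu K-alpha emission energies:
    E_KA1 = 8047.8   # eV, Cu K-alpha1
    E_KA2 = 8027.8   # eV, Cu K-alpha2

    splitting = E_KA1 - E_KA2          # ~20 eV
    analyzer_resolution_ev = 14.0      # value quoted for the BCLA above

    print(f"Cu Kα1-Kα2 splitting: {splitting:.1f} eV")
    print("Splitting resolvable with the BCLA:", analyzer_resolution_ev < splitting)
    ```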

  5. Hemispherical Laue camera

    DOEpatents

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.

  6. Formulation of dynamical theory of X-ray diffraction for perfect crystals in the Laue case using the Riemann surface.

    PubMed

    Saka, Takashi

    2016-05-01

    The dynamical theory for perfect crystals in the Laue case was reformulated using the Riemann surface, as used in complex analysis. In the two-beam approximation, each branch of the dispersion surface is specified by one sheet of the Riemann surface. The characteristic features of the dispersion surface are analytically revealed using four parameters, which are the real and imaginary parts of two quantities specifying the degree of departure from the exact Bragg condition and the reflection strength. By representing these parameters on complex planes, these characteristics can be graphically depicted on the Riemann surface. In the conventional case, the absorption is small and the real part of the reflection strength is large, so the formulation is the same as the traditional analysis. However, when the real part of the reflection strength is small or zero, the two branches of the dispersion surface cross, and the dispersion relationship becomes similar to that of the Bragg case. This is because the geometrical relationships among the parameters are similar in both cases. The present analytical method is generally applicable, irrespective of the magnitudes of the parameters. Furthermore, the present method analytically revealed many characteristic features of the dispersion surface and will be quite instructive for further numerical calculations of rocking curves. PMID:27126110
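
    For readers less familiar with the object being mapped onto the Riemann surface here: in the two-beam approximation the dispersion surface is the locus of wavevectors permitted inside the crystal, commonly written (in a conventional textbook notation, not the specific parametrization of this paper) as

        \xi_0 \, \xi_h \;=\; \tfrac{1}{4}\, k^{2} C^{2} \chi_h \chi_{\bar h},

    where ξ0 and ξh are the deviations of the forward-diffracted and diffracted internal wavevectors from their refraction-corrected vacuum values, k is the vacuum wavenumber, C the polarization factor, and χh, χh̄ the Fourier components of the electric susceptibility. The two complex quantities mentioned in the abstract (departure from the exact Bragg condition and reflection strength) are built from these ingredients.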

  8. Laue: right or wrong?

    NASA Astrophysics Data System (ADS)

    Datta, Timir

    2015-03-01

    In 1912, Laue spots were discovered in x-ray scattering ‘photograms’ of crystals, amongst the most consequential experimental findings of the 20th century. Inter alia, the spots established the wave nature of x-rays and the reality of the crystal lattice and, for the first time, revealed atoms as real physical objects. Laue, a protégé of Planck and a wave-optics expert, had theoretically predicted these spots, and promptly won the Physics Nobel Prize for 1914. The prize did not come easy: executing his experimentum crucis, over the judgments of Sommerfeld and Wien, required force of will and a certain amount of diplomacy. Moreover, his explanations for the missing spots and for x-ray diffraction were proven wrong by Moseley, Darwin and the two Braggs. Traditionally, Laue’s three-dimensional diffraction model is reconciled with Bragg’s reflection formula by Ewald’s construction in reciprocal lattice space. Laue had overlooked that his fundamental equations violate Euclidean length invariance. This article shows that imposing this invariance consolidates Laue’s system of three (multi-parameter) equations into a single formula containing one integer, one angle and two distances, and validates Bragg’s conjecture of reflection. This new derivation demonstrates that the mechanism behind Laue spots is akin to that of anti-reflection coatings and the colour plays in soap bubbles and oil slicks: reflection and interference, not diffraction. Yet Laue stimulated countless breakthroughs, Nobel Prizes and scientific innovations, with an enduring legacy of inspiration a century later.

  9. The LAUE project: latest developments

    NASA Astrophysics Data System (ADS)

    Liccardo, V.; Virgilli, E.; Frontera, F.; Rosati, P.

    2016-04-01

    We present the status of the LAUE project, devoted to developing a technology for building long focal length Laue lenses for hard X-ray/soft gamma-ray astronomy (80-600 keV). The Laue lens will be composed of bent crystals of gallium arsenide (GaAs, 220) and germanium (Ge, 111), and, for the first time, the focusing property of bent crystals will be exploited in this field of application. At present, the goal of the project is to build a 20 m focal length Laue lens petal capable of focusing X-rays in the energy range 90-300 keV.
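
    To illustrate the geometry such a petal must cover (a sketch using textbook d-spacings and the small-angle Laue-lens relation, not figures from the LAUE project itself): a crystal at distance r from the lens axis deflects photons by 2·θ_B toward the focus, so r ≈ f · tan(2·θ_B).

    ```python
    import math

    HC = 12.398          # keV * Angstrom (approximate)
    FOCAL_M = 20.0       # focal length quoted above, in meters

    # Approximate textbook d-spacings: Ge(111) from a = 5.658 A, GaAs(220) from a = 5.653 A
    D_SPACINGS = {"Ge(111)": 5.658 / math.sqrt(3), "GaAs(220)": 5.653 / math.sqrt(8)}

    def ring_radius_m(energy_kev, d_angstrom, focal_m=FOCAL_M):
        """Distance from the lens axis at which a crystal diffracts this energy into the focus."""
        theta_b = math.asin(HC / energy_kev / (2.0 * d_angstrom))
        return focal_m * math.tan(2.0 * theta_b)

    for name, d in D_SPACINGS.items():
        radii = [round(ring_radius_m(E, d), 2) for E in (90, 150, 300)]
        print(f"{name}: {radii} m for 90/150/300 keV")
    ```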

  11. Multilayer Laue Lens Sequence Compiler

    SciTech Connect

    Conley, Roy; Liu, Chian

    2005-10-01

    For the growth of a new kind of x-ray focusing optic called a multilayer Laue lens, a device is constructed in which each layer of alternating high-Z and low-Z material is placed at the appropriate position according to the Fresnel zone plate law. This requires that each layer have a different thickness. Because each layer is grown using DC magnetron sputter deposition, these layer thicknesses are not only dictated by the zone plate law, but are also adjusted to account for various drifts in the growth chamber due to target erosion, etc.
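
    For context on what such a "sequence compiler" has to produce: the Fresnel zone plate law fixes every zone boundary, and hence every layer thickness, once the design wavelength and focal length are chosen. A minimal sketch follows (with made-up design values, and without the drift corrections mentioned above):

    ```python
    import math

    def zone_radii(n_zones, wavelength_nm, focal_mm):
        """Zone plate law: r_n = sqrt(n*lambda*f + (n*lambda/2)**2); all lengths in nm."""
        f_nm = focal_mm * 1e6
        lam = wavelength_nm
        return [math.sqrt(n * lam * f_nm + (n * lam / 2.0) ** 2) for n in range(n_zones + 1)]

    def layer_thicknesses(n_zones, wavelength_nm, focal_mm):
        """Thickness of each deposited layer = spacing between successive zone boundaries."""
        r = zone_radii(n_zones, wavelength_nm, focal_mm)
        return [r[n] - r[n - 1] for n in range(1, n_zones + 1)]

    # Illustrative design values only (not the parameters of the device described above):
    t = layer_thicknesses(n_zones=1000, wavelength_nm=0.064, focal_mm=4.0)
    print(f"innermost layer ~{t[0]:.0f} nm, outermost layer ~{t[-1]:.1f} nm")
    ```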

  12. Fabrication of wedged multilayer Laue lenses

    DOE PAGES

    Prasciolu, M.; Leontowich, A. F. G.; Krzywinski, J.; Andrejczuk, A.; Chapman, H. N.; Bajt, S.

    2015-01-01

    We present a new method to fabricate wedged multilayer Laue lenses, in which the angle of diffracting layers smoothly varies in the lens to achieve optimum diffracting efficiency across the entire pupil of the lens. This was achieved by depositing a multilayer onto a flat substrate placed in the penumbra of a straight-edge mask. The distance between the mask and the substrate was calibrated and the multilayer Laue lens was cut in a position where the varying layer thickness and the varying layer tilt simultaneously satisfy the Fresnel zone plate condition and Bragg’s law for all layers in the stack. This method can be used to extend the achievable numerical aperture of multilayer Laue lenses to reach considerably smaller focal spot sizes than achievable with lenses composed of parallel layers.

  13. Fabrication of wedged multilayer Laue lenses

    SciTech Connect

    Prasciolu, M.; Leontowich, A. F. G.; Krzywinski, J.; Andrejczuk, A.; Chapman, H. N.; Bajt, S.

    2015-01-01

    We present a new method to fabricate wedged multilayer Laue lenses, in which the angle of diffracting layers smoothly varies in the lens to achieve optimum diffracting efficiency across the entire pupil of the lens. This was achieved by depositing a multilayer onto a flat substrate placed in the penumbra of a straight-edge mask. The distance between the mask and the substrate was calibrated and the multilayer Laue lens was cut in a position where the varying layer thickness and the varying layer tilt simultaneously satisfy the Fresnel zone plate condition and Bragg’s law for all layers in the stack. This method can be used to extend the achievable numerical aperture of multilayer Laue lenses to reach considerably smaller focal spot sizes than achievable with lenses composed of parallel layers.
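
    A brief note on the geometry being satisfied here (the usual small-angle flat/wedged-MLL reasoning, not figures from the paper): at distance r from the optical axis the zone plate law requires a local period d(r) ≈ λf/(2r), whose Bragg angle is θ_B(r) ≈ r/(2f); with the incident beam parallel to the axis, Bragg's law is therefore met only if the layer at r is tilted by about r/(2f), i.e. the layers converge toward a point on the axis roughly 2f away:

        d(r) \approx \frac{\lambda f}{2r}, \qquad
        \sin\theta_B(r) = \frac{\lambda}{2\,d(r)} \approx \frac{r}{2f}
        \;\;\Rightarrow\;\; \text{layer tilt } \alpha(r) \approx \frac{r}{2f}.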

  14. Automation Enhancement of Multilayer Laue Lenses

    SciTech Connect

    Lauer K. R.; Conley R.

    2010-12-01

    X-ray optics fabrication at Brookhaven National Laboratory has been facilitated by a new, state-of-the-art magnetron sputtering physical deposition system. With its nine magnetron sputtering cathodes and a substrate carrier that moves on a linear rail via a UHV brushless linear servo motor, the system is capable of accurately depositing the many thousands of layers necessary for multilayer Laue lenses. I have engineered a versatile and automated control program from scratch for the base system and many subsystems. Its main features include a custom scripting language, a fully customizable graphical user interface, wireless and remote control, and a terminal-based interface. This control system has already been successfully used in the creation of many types of x-ray optics, including several-thousand-layer multilayer Laue lenses. Before reaching the point at which a deposition can be run, stencil-like masks for the sputtering cathodes must be created to ensure the proper distribution of sputtered atoms. The quality of multilayer Laue lenses can also be difficult to measure, given the size of the thin film layers. I employ my knowledge of software and algorithms to further ease these previously painstaking processes with custom programs. Additionally, I will give an overview of an x-ray optic simulator package I helped develop during the summer of 2010. In the interest of keeping my software free and open, I have worked mostly with the multiplatform Python and the PyQt application framework, utilizing C and C++ where necessary.

  15. Single Hit Energy-resolved Laue Diffraction.

    PubMed

    Patel, Shamim; Suggit, Matthew J; Stubley, Paul G; Hawreliak, James A; Ciricosta, Orlando; Comley, Andrew J; Collins, Gilbert W; Eggert, Jon H; Foster, John M; Wark, Justin S; Higginbotham, Andrew

    2015-05-01

    In situ white light Laue diffraction has been successfully used to interrogate the structure of single crystal materials undergoing rapid (nanosecond) dynamic compression up to megabar pressures. However, information on strain state accessible via this technique is limited, reducing its applicability for a range of applications. We present an extension to the existing Laue diffraction platform in which we record the photon energy of a subset of diffraction peaks. This allows for a measurement of the longitudinal and transverse strains in situ during compression. Consequently, we demonstrate measurement of volumetric compression of the unit cell, in addition to the limited aspect ratio information accessible in conventional white light Laue. We present preliminary results for silicon, where only an elastic strain is observed. VISAR measurements show the presence of a two wave structure and measurements show that material downstream of the second wave does not contribute to the observed diffraction peaks, supporting the idea that this material may be highly disordered, or has undergone large scale rotation.
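
    To make the added information concrete: once the photon energy E of a Laue spot is measured at a known scattering angle 2θ, Bragg's law gives the lattice-plane spacing directly, and comparison with the ambient spacing yields the strain along that plane normal. A generic sketch with illustrative numbers (not data from the paper):

    ```python
    import math

    HC = 12.398  # keV * Angstrom (approximate)

    def d_spacing(energy_kev, two_theta_deg):
        """Plane spacing from Bragg's law, 2*d*sin(theta) = lambda."""
        theta = math.radians(two_theta_deg) / 2.0
        return HC / energy_kev / (2.0 * math.sin(theta))

    def lattice_strain(energy_kev, two_theta_deg, d0_angstrom):
        """Strain of the probed planes relative to the unstrained spacing d0."""
        return d_spacing(energy_kev, two_theta_deg) / d0_angstrom - 1.0

    # Illustrative only: Si(220) with d0 from a = 5.431 A, a spot fixed at 2*theta = 40 deg.
    d0 = 5.431 / math.sqrt(8)
    e0 = HC / (2.0 * d0 * math.sin(math.radians(20.0)))   # photon energy the unstrained spot selects
    print(f"unstrained spot energy ~{e0:.2f} keV")
    print(f"apparent strain for a +0.2 keV shift: {lattice_strain(e0 + 0.2, 40.0, d0):.4f}")
    ```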

  16. Single Hit Energy-resolved Laue Diffraction

    SciTech Connect

    Patel, Shamim; Suggit, Matthew J.; Stubley, Paul G.; Ciricosta, Orlando; Wark, Justin S.; Higginbotham, Andrew; Hawreliak, James A.; Collins, Gilbert W.; Eggert, Jon H.; Comley, Andrew J.; Foster, John M.

    2015-05-15

    In situ white light Laue diffraction has been successfully used to interrogate the structure of single crystal materials undergoing rapid (nanosecond) dynamic compression up to megabar pressures. However, information on strain state accessible via this technique is limited, reducing its applicability for a range of applications. We present an extension to the existing Laue diffraction platform in which we record the photon energy of a subset of diffraction peaks. This allows for a measurement of the longitudinal and transverse strains in situ during compression. Consequently, we demonstrate measurement of volumetric compression of the unit cell, in addition to the limited aspect ratio information accessible in conventional white light Laue. We present preliminary results for silicon, where only an elastic strain is observed. VISAR measurements show the presence of a two wave structure and measurements show that material downstream of the second wave does not contribute to the observed diffraction peaks, supporting the idea that this material may be highly disordered, or has undergone large scale rotation.

  17. Extraction of accurate structure-factor amplitudes from Laue data: wavelength normalization with wiggler and undulator X-ray sources.

    PubMed

    Srajer, V; Crosson, S; Schmidt, M; Key, J; Schotte, F; Anderson, S; Perman, B; Ren, Z; Teng, T Y; Bourgeois, D; Wulff, M; Moffat, K

    2000-07-01

    Wavelength normalization is an essential part of processing of Laue X-ray diffraction data and is critically important for deriving accurate structure-factor amplitudes. The results of wavelength normalization for Laue data obtained in nanosecond time-resolved experiments at the ID09 beamline at the European Synchrotron Radiation Facility, Grenoble, France, are presented. Several wiggler and undulator insertion devices with complex spectra were used. The results show that even in the most challenging cases, such as wiggler/undulator tandems or single-line undulators, accurate wavelength normalization does not require unusually redundant Laue data and can be accomplished using typical Laue data sets. Single-line undulator spectra derived from Laue data compare well with the measured incident X-ray spectra. Successful wavelength normalization of the undulator data was also confirmed by the observed signal in nanosecond time-resolved experiments. Single-line undulators, which are attractive for time-resolved experiments due to their high peak intensity and low polychromatic background, are compared with wigglers, based on data obtained on the same crystal. PMID:16609201
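
    As a rough illustration of what wavelength normalization involves: each recorded intensity is weighted by the (initially unknown) spectral curve at the wavelength that stimulated it, and that curve can be recovered from redundant measurements of symmetry-equivalent reflections recorded at different wavelengths. The sketch below is a deliberately simplified iterative scheme over assumed data structures; it is not the algorithm used at ID09.

    ```python
    import numpy as np

    def estimate_lambda_curve(intensities, wavelengths, group_ids, n_bins=30, n_iter=20):
        """
        Crude wavelength-normalization sketch.

        intensities : 1D array of recorded Laue intensities
        wavelengths : 1D array, stimulating wavelength of each observation
        group_ids   : 1D int array, label of the symmetry-equivalence group of each observation

        Returns wavelength-bin edges and a per-bin multiplicative curve such that
        intensities / curve agree (approximately) within each equivalence group.
        """
        bins = np.linspace(wavelengths.min(), wavelengths.max(), n_bins + 1)
        bin_idx = np.clip(np.digitize(wavelengths, bins) - 1, 0, n_bins - 1)
        curve = np.ones(n_bins)

        for _ in range(n_iter):
            normalized = intensities / curve[bin_idx]
            # Reference value per equivalence group = mean of its normalized observations.
            ref = np.zeros(group_ids.max() + 1)
            counts = np.bincount(group_ids, minlength=ref.size)
            np.add.at(ref, group_ids, normalized)
            ref /= np.maximum(counts, 1)
            # Update each wavelength bin as the mean ratio observed / group reference.
            ratio = intensities / np.maximum(ref[group_ids], 1e-12)
            num = np.bincount(bin_idx, weights=ratio, minlength=n_bins)
            den = np.bincount(bin_idx, minlength=n_bins)
            curve = np.where(den > 0, num / np.maximum(den, 1), curve)
            curve /= curve.max()   # fix the arbitrary overall scale
        return bins, curve
    ```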

  18. Characterization of ion beam irradiated 304 stainless steel utilizing nanoindentation and Laue microdiffraction

    NASA Astrophysics Data System (ADS)

    Lupinacci, A.; Chen, K.; Li, Y.; Kunz, M.; Jiao, Z.; Was, G. S.; Abad, M. D.; Minor, A. M.; Hosemann, P.

    2015-03-01

    Characterizing irradiation damage in materials used in light water reactors is critical for both material development and application reliability. Here we use nanoindentation and Laue microdiffraction to characterize both the mechanical response and the microstructure evolution due to irradiation. Two different irradiation conditions were considered in 304 stainless steel: 1 dpa and 10 dpa. In addition, the 10 dpa specimen annealed for 1 h at 500 °C was evaluated. Nanoindentation revealed an increase in hardness due to irradiation and also showed that hardness saturated in the 10 dpa case. Broadening of the Laue microdiffraction peaks indicates significant plastic deformation in the irradiated area, in good agreement with both the SRIM calculations and the nanoindentation results.

  19. High numerical aperture multilayer Laue lenses

    SciTech Connect

    Morgan, Andrew J.; Prasciolu, Mauro; Andrejczuk, Andrzej; Krzywinski, Jacek; Meents, Alke; Pennicard, David; Graafsma, Heinz; Barty, Anton; Bean, Richard J.; Barthelmess, Miriam; Oberthuer, Dominik; Yefanov, Oleksandr; Aquila, Andrew; Chapman, Henry N.; Bajt, Saša

    2015-06-01

    The ever-increasing brightness of synchrotron radiation sources demands improved X-ray optics to utilise their capability for imaging and probing biological cells, nanodevices, and functional matter on the nanometer scale with chemical sensitivity. Here we demonstrate focusing a hard X-ray beam to an 8 nm focus using a volume zone plate (also referred to as a wedged multilayer Laue lens). This lens was constructed using a new deposition technique that enabled the independent control of the angle and thickness of diffracting layers to microradian and nanometer precision, respectively. This ensured that the Bragg condition is satisfied at each point along the lens, leading to a high numerical aperture that is limited only by its extent. We developed a phase-shifting interferometric method based on ptychography to characterise the lens focus. The precision of the fabrication and characterisation demonstrated here provides the path to efficient X-ray optics for imaging at 1 nm resolution.
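
    A back-of-the-envelope link between the quoted 8 nm focus and the numerical aperture (the photon energy below is an assumption for illustration; it is not stated in the abstract): for a diffraction-limited focus the spot size is roughly λ/(2·NA).

    ```python
    HC_KEV_NM = 1.2398           # h*c in keV*nm (approximate)

    energy_kev = 17.0            # assumed photon energy, illustration only
    focus_nm = 8.0               # focal spot size quoted above

    wavelength_nm = HC_KEV_NM / energy_kev
    numerical_aperture = wavelength_nm / (2.0 * focus_nm)
    print(f"lambda ~{wavelength_nm:.4f} nm  ->  NA ~{numerical_aperture:.1e}")
    ```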

  20. Bent Laue X-ray Fluorescence Imaging of Manganese in Biological Tissues--Preliminary Results

    SciTech Connect

    Zhu Ying; Zhang Honglin; Bewer, Brian; Nichol, Helen; Chapman, Dean; Thomlinson, Bill

    2010-06-23

    Manganese (Mn) is not abundant in human brain tissue, but it is recognized as a neurotoxin. The symptoms of manganese intoxication are similar to Parkinson's disease (PD), but the link between environmental, occupational or dietary Mn exposure and PD in humans is not well established. X-ray Absorption Spectroscopy (XAS) and in particular X-ray fluorescence can provide precise information on the distribution, concentration and chemical form of metals. However the scattered radiation and fluorescence from the adjacent abundant element, iron (Fe), may interfere with and limit the ability to detect ultra-dilute Mn. A bent Laue analyzer based Mn fluorescence detection system has been designed and fabricated to improve elemental specificity in XAS imaging. This bent Laue analyzer of logarithmic spiral shape placed upstream of an energy discriminating detector should improve the energy resolution from hundreds of eV to several eV. The bent Laue detection system was validated by imaging Mn fluorescence from Mn foils, gelatin calibration samples and adult Drosophila at the Hard X-ray MicroAnalysis (HXMA) beamline at the Canadian Light Source (CLS). Optimization of the design parameters, fabrication procedures and preliminary experimental results are presented along with future plans.

  1. Bent Laue X-ray Fluorescence Imaging of Manganese in Biological Tissues—Preliminary Results

    NASA Astrophysics Data System (ADS)

    Zhu, Ying; Bewer, Brian; Zhang, Honglin; Nichol, Helen; Thomlinson, Bill; Chapman, Dean

    2010-06-01

    Manganese (Mn) is not abundant in human brain tissue, but it is recognized as a neurotoxin. The symptoms of manganese intoxication are similar to Parkinson's disease (PD), but the link between environmental, occupational or dietary Mn exposure and PD in humans is not well established. X-ray Absorption Spectroscopy (XAS) and in particular X-ray fluorescence can provide precise information on the distribution, concentration and chemical form of metals. However the scattered radiation and fluorescence from the adjacent abundant element, iron (Fe), may interfere with and limit the ability to detect ultra-dilute Mn. A bent Laue analyzer based Mn fluorescence detection system has been designed and fabricated to improve elemental specificity in XAS imaging. This bent Laue analyzer of logarithmic spiral shape placed upstream of an energy discriminating detector should improve the energy resolution from hundreds of eV to several eV. The bent Laue detection system was validated by imaging Mn fluorescence from Mn foils, gelatin calibration samples and adult Drosophila at the Hard X-ray MicroAnalysis (HXMA) beamline at the Canadian Light Source (CLS). Optimization of the design parameters, fabrication procedures and preliminary experimental results are presented along with future plans.

  2. The LaueUtil toolkit for Laue photocrystallography. II. Spot finding and integration

    SciTech Connect

    Kalinowski, Jaroslaw A.; Fournier, Bertrand; Makal, Anna; Coppens, Philip

    2015-10-15

    A spot-integration method is described which does not require prior indexing of the reflections. It is based on statistical analysis of the values from each of the pixels on successive frames, followed for each frame by morphological analysis to identify clusters of high value pixels which form an appropriate mask corresponding to a reflection peak. The method does not require prior assumptions such as fitting of a profile or definition of an integration box. The results are compared with those of the seed-skewness method which is based on minimizing the skewness of the intensity distribution within a peak's integration box. Applications in Laue photocrystallography are presented.
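
    A generic sketch of the kind of pipeline described (per-pixel statistics over the frame series, then morphological clustering on each frame); thresholds and helper names are illustrative, and this is not the LaueUtil implementation:

    ```python
    import numpy as np
    from scipy import ndimage

    def find_spot_masks(frames, k_sigma=5.0, min_pixels=4):
        """
        frames : 3D array (n_frames, ny, nx) of detector images.

        Robust per-pixel statistics over the series flag 'high' pixels on each frame;
        connected clusters of flagged pixels become candidate reflection masks.
        Returns, per frame, the label image and background-subtracted integrated intensities.
        """
        frames = np.asarray(frames, dtype=float)
        median = np.median(frames, axis=0)
        mad = np.median(np.abs(frames - median), axis=0)
        sigma = 1.4826 * np.maximum(mad, 1e-6)        # robust per-pixel scale estimate

        results = []
        for frame in frames:
            hot = frame > median + k_sigma * sigma    # statistically high pixels on this frame
            labels, n = ndimage.label(hot)            # morphological clustering of hot pixels
            sizes = ndimage.sum(hot, labels, index=np.arange(1, n + 1))
            keep = np.flatnonzero(sizes >= min_pixels) + 1
            intensities = ndimage.sum(frame - median, labels, index=keep)
            results.append((labels, dict(zip(keep.tolist(), np.atleast_1d(intensities).tolist()))))
        return results
    ```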

  3. High numerical aperture multilayer Laue lenses

    DOE PAGES

    Morgan, Andrew J.; Prasciolu, Mauro; Andrejczuk, Andrzej; Krzywinski, Jacek; Meents, Alke; Pennicard, David; Graafsma, Heinz; Barty, Anton; Bean, Richard J.; Barthelmess, Miriam; et al

    2015-06-01

    The ever-increasing brightness of synchrotron radiation sources demands improved X-ray optics to utilise their capability for imaging and probing biological cells, nanodevices, and functional matter on the nanometer scale with chemical sensitivity. Here we demonstrate focusing a hard X-ray beam to an 8 nm focus using a volume zone plate (also referred to as a wedged multilayer Laue lens). This lens was constructed using a new deposition technique that enabled the independent control of the angle and thickness of diffracting layers to microradian and nanometer precision, respectively. This ensured that the Bragg condition is satisfied at each point along the lens, leading to a high numerical aperture that is limited only by its extent. We developed a phase-shifting interferometric method based on ptychography to characterise the lens focus. The precision of the fabrication and characterisation demonstrated here provides the path to efficient X-ray optics for imaging at 1 nm resolution.

  4. High numerical aperture multilayer Laue lenses.

    PubMed

    Morgan, Andrew J; Prasciolu, Mauro; Andrejczuk, Andrzej; Krzywinski, Jacek; Meents, Alke; Pennicard, David; Graafsma, Heinz; Barty, Anton; Bean, Richard J; Barthelmess, Miriam; Oberthuer, Dominik; Yefanov, Oleksandr; Aquila, Andrew; Chapman, Henry N; Bajt, Saša

    2015-01-01

    The ever-increasing brightness of synchrotron radiation sources demands improved X-ray optics to utilise their capability for imaging and probing biological cells, nanodevices, and functional matter on the nanometer scale with chemical sensitivity. Here we demonstrate focusing a hard X-ray beam to an 8 nm focus using a volume zone plate (also referred to as a wedged multilayer Laue lens). This lens was constructed using a new deposition technique that enabled the independent control of the angle and thickness of diffracting layers to microradian and nanometer precision, respectively. This ensured that the Bragg condition is satisfied at each point along the lens, leading to a high numerical aperture that is limited only by its extent. We developed a phase-shifting interferometric method based on ptychography to characterise the lens focus. The precision of the fabrication and characterisation demonstrated here provides the path to efficient X-ray optics for imaging at 1 nm resolution.

  5. High numerical aperture multilayer Laue lenses

    PubMed Central

    Morgan, Andrew J.; Prasciolu, Mauro; Andrejczuk, Andrzej; Krzywinski, Jacek; Meents, Alke; Pennicard, David; Graafsma, Heinz; Barty, Anton; Bean, Richard J.; Barthelmess, Miriam; Oberthuer, Dominik; Yefanov, Oleksandr; Aquila, Andrew; Chapman, Henry N.; Bajt, Saša

    2015-01-01

    The ever-increasing brightness of synchrotron radiation sources demands improved X-ray optics to utilise their capability for imaging and probing biological cells, nanodevices, and functional matter on the nanometer scale with chemical sensitivity. Here we demonstrate focusing a hard X-ray beam to an 8 nm focus using a volume zone plate (also referred to as a wedged multilayer Laue lens). This lens was constructed using a new deposition technique that enabled the independent control of the angle and thickness of diffracting layers to microradian and nanometer precision, respectively. This ensured that the Bragg condition is satisfied at each point along the lens, leading to a high numerical aperture that is limited only by its extent. We developed a phase-shifting interferometric method based on ptychography to characterise the lens focus. The precision of the fabrication and characterisation demonstrated here provides the path to efficient X-ray optics for imaging at 1 nm resolution. PMID:26030003

  7. In situ serial Laue diffraction on a microfluidic crystallization device

    PubMed Central

    Perry, Sarah L.; Guha, Sudipto; Pawate, Ashtamurthy S.; Henning, Robert; Kosheleva, Irina; Srajer, Vukica; Kenis, Paul J. A.; Ren, Zhong

    2014-01-01

    Renewed interest in room-temperature diffraction has been prompted by the desire to observe structural dynamics of proteins as they function. Serial crystallography, an experimental strategy that aggregates small pieces of data from a large uniform pool of crystals, has been demonstrated at synchrotrons and X-ray free-electron lasers. This work utilizes a microfluidic crystallization platform for serial Laue diffraction from macroscopic crystals and proposes that a collection of small slices of Laue data from many individual crystals is a realistic solution to the difficulties in dynamic studies of irreversible biochemical reactions. PMID:25484843

  8. The LaueUtil toolkit for Laue photocrystallography. II. Spot finding and integration

    PubMed Central

    Kalinowski, Jarosław A.; Fournier, Bertrand; Makal, Anna; Coppens, Philip

    2012-01-01

    A spot-integration method is described which does not require prior indexing of the reflections. It is based on statistical analysis of the values from each of the pixels on successive frames, followed for each frame by morphological analysis to identify clusters of high value pixels which form an appropriate mask corresponding to a reflection peak. The method does not require prior assumptions such as fitting of a profile or definition of an integration box. The results are compared with those of the seed-skewness method which is based on minimizing the skewness of the intensity distribution within a peak’s integration box. Applications in Laue photocrystallography are presented. PMID:22713901

  9. Dynamical focusing by bent, asymmetrically cut perfect crystals in Laue geometry.

    PubMed

    Guigay, J P; Ferrero, C

    2016-07-01

    A semi-analytical approach based on the influence functions of a point source located on the crystal surface has been adopted to show that the focusing ability of cylindrically bent Laue crystals may be strongly enhanced by replacing symmetrically cut crystals with asymmetrically cut crystals. This approach is generally applicable to any distance between the X-ray source and the focusing bent crystal. A mathematically straightforward method to simplify the derivation of the already known expression of the influence functions in the case of deformed crystals with a constant strain gradient (e.g. cylindrically bent crystals) is also presented. PMID:27357851

  11. The LaueUtil toolkit for Laue photocrystallography. I. Rapid orientation matrix determination for intermediate-size-unit-cell Laue data

    PubMed Central

    Kalinowski, Jarosław A.; Makal, Anna; Coppens, Philip

    2011-01-01

    A new method for determination of the orientation matrix of Laue X-ray data is presented. The method is based on matching of the experimental patterns of central reciprocal lattice rows projected on a unit sphere centered on the origin of the reciprocal lattice with the corresponding pattern of a monochromatic data set on the same material. This technique is applied to the complete data set and thus eliminates problems often encountered when single frames with a limited number of peaks are to be used for orientation matrix determination. Application of the method to a series of Laue data sets on organometallic crystals is described. The corresponding program is available under a Mozilla Public License-like open-source license. PMID:22199400

  12. The LaueUtil toolkit for Laue photocrystallography. I. Rapid orientation matrix determination for intermediate-size-unit-cell Laue data

    SciTech Connect

    Kalinowski, Jaroslaw A.; Makal, Anna; Coppens, Philip

    2015-10-15

    A new method for determination of the orientation matrix of Laue X-ray data is presented. The method is based on matching of the experimental patterns of central reciprocal lattice rows projected on a unit sphere centered on the origin of the reciprocal lattice with the corresponding pattern of a monochromatic data set on the same material. This technique is applied to the complete data set and thus eliminates problems often encountered when single frames with a limited number of peaks are to be used for orientation matrix determination. Application of the method to a series of Laue data sets on organometallic crystals is described. The corresponding program is available under a Mozilla Public License-like open-source license.

  13. Using Generalized Additive Models to Analyze Single-Case Designs

    ERIC Educational Resources Information Center

    Shadish, William; Sullivan, Kristynn

    2013-01-01

    Many analyses for single-case designs (SCDs), including nearly all the effect size indicators, currently assume no trend in the data. Regression and multilevel models allow for trend, but usually test only linear trend and have no principled way of knowing if higher order trends should be represented in the model. This paper shows how Generalized…

  14. Efficiency of a multilayer-Laue-lens with a 102 μm aperture

    SciTech Connect

    Macrander, Albert T.; Wojcik, Michael; Maser, Jorg; Kubec, Adam; Conley, Raymond; Bouet, Nathalie; Zhou, Juan

    2015-08-24

    A multilayer-Laue-lens (MLL) comprised of WSi2/Al layers stacked to a full thickness of 102 μm was characterized for its diffraction efficiency and dynamical diffraction properties by x-ray measurements made in the far field. The achieved aperture roughly doubles the previous maximum reported aperture for an MLL, thereby doubling the working distance. Negative and positive first orders were found to have 14.2% and 13.0% efficiencies, respectively. A section thickness of 9.6 μm was determined from Laue-case thickness fringes in the diffraction data. A background gas consisting of 90% Ar and 10% N2 was used for sputtering. This material system was chosen to reduce grown-in stress as the multilayer is deposited. Although some regions of the full MLL exhibited defects, the presently reported results were obtained for a region devoid of defects. The data compare well to dynamical diffraction calculations with Coupled Wave Theory (CWT) which provided confirmation of the optical constants and densities assumed for the CWT calculations.

  15. Efficiency of a multilayer-Laue-lens with a 102 μm aperture

    SciTech Connect

    Macrander, Albert T.; Kubec, Adam; Conley, Raymond; Bouet, Nathalie; Zhou, Juan; Wojcik, Michael; Maser, Jorg

    2015-08-25

    A multilayer-Laue-lens (MLL) comprised of WSi2/Al layers stacked to a full thickness of 102 microns was characterized for its diffraction efficiency and dynamical diffraction properties by x-ray measurements made in the far field. The achieved aperture roughly doubles the previous maximum reported aperture for an MLL, thereby doubling the working distance. Negative and positive first orders were found to have 14.2 % and 13.0 % efficiencies, respectively. A section thickness of 9.6 μm was determined from Laue-case thickness fringes in the diffraction data. A background gas consisting of 90 % Ar and 10 % N2 was used for sputtering. This material system was chosen to reduce grown-in stress as the multilayer is deposited. Although some regions of the full MLL exhibited defects, the presently reported results were obtained for a region devoid of defects. The data compare well to dynamical diffraction calculations with Coupled Wave Theory (CWT) which provided confirmation of the optical constants and densities assumed for the CWT calculations.

  16. Efficiency of a multilayer-Laue-lens with a 102 μm aperture

    DOE PAGES

    Macrander, Albert T.; Kubec, Adam; Conley, Raymond; Bouet, Nathalie; Zhou, Juan; Wojcik, Michael; Maser, Jorg

    2015-08-25

    A multilayer-Laue-lens (MLL) comprised of WSi2/Al layers stacked to a full thickness of 102 microns was characterized for its diffraction efficiency and dynamical diffraction properties by x-ray measurements made in the far field. The achieved aperture roughly doubles the previous maximum reported aperture for an MLL, thereby doubling the working distance. Negative and positive first orders were found to have 14.2 % and 13.0 % efficiencies, respectively. A section thickness of 9.6 μm was determined from Laue-case thickness fringes in the diffraction data. A background gas consisting of 90 % Ar and 10 % N2 was used for sputtering. This material system was chosen to reduce grown-in stress as the multilayer is deposited. Although some regions of the full MLL exhibited defects, the presently reported results were obtained for a region devoid of defects. The data compare well to dynamical diffraction calculations with Coupled Wave Theory (CWT) which provided confirmation of the optical constants and densities assumed for the CWT calculations.

  17. Laue diffraction protein crystallography at the National Synchrotron Light Source

    SciTech Connect

    Getzoff, E.D.; McRee, D.; Jones, K.W.; Spanne, P.; Sweet, R.M.; Moffat, K.; Ng, K.; Rivers, M.L.; Schildkamp, W.; Teng, T.Y.; Singer, P.T.; Westbrook, E.M.

    1992-01-01

    A new facility for the study of protein crystal structure using Laue diffraction has been established at the X26 beam line of the National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory. The characteristics of the beam line and diffraction apparatus are described. Selected results of some of the initial experiments are discussed briefly by beam line users to illustrate the scope of the experimental program. Because the Laue method permits the recording of large data sets in a single shot, one goal in establishing this facility has been to develop the means to study time-resolved structures within protein crystals. Systems being studied include: the reactions catalyzed by trypsin; photolysis of carbonmonoxy myoglobin; and the photocycle of photoactive yellow protein.

  18. Laue diffraction protein crystallography at the National Synchrotron Light Source

    SciTech Connect

    Getzoff, E.D.; McRee, D.; Jones, K.W.; Spanne, P.; Sweet, R.M.; Moffat, K.; Ng, K.; Rivers, M.L.; Schildkamp, W.; Teng, T.Y.; Singer, P.T.; Westbrook, E.M.

    1992-12-31

    A new facility for the study of protein crystal structure using Laue diffraction has been established at the X26 beam line of the National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory. The characteristics of the beam line and diffraction apparatus are described. Selected results of some of the initial experiments are discussed briefly by beam line users to illustrate the scope of the experimental program. Because the Laue method permits the recording of large data sets in a single shot, one goal in establishing this facility has been to develop the means to study time-resolved structures within protein crystals. Systems being studied include: the reactions catalyzed by trypsin; photolysis of carbonmonoxy myoglobin; and the photocycle of photoactive yellow protein.

  19. The RATIO method for time-resolved Laue crystallography

    PubMed Central

    Coppens, Philip; Pitak, Mateusz; Gembicky, Milan; Messerschmidt, Marc; Scheins, Stephan; Benedict, Jason; Adachi, Shin-ichi; Sato, Tokushi; Nozawa, Shunsuke; Ichiyanagi, Kohei; Chollet, Matthieu; Koshihara, Shin-ya

    2009-01-01

    A RATIO method for analysis of intensity changes in time-resolved pump–probe Laue diffraction experiments is described. The method eliminates the need for scaling the data with a wavelength curve representing the spectral distribution of the source and removes the effect of possible anisotropic absorption. It does not require relative scaling of series of frames and removes errors due to all but very short term fluctuations in the synchrotron beam. PMID:19240334
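
    The core idea can be stated compactly (a paraphrase in generic notation, not taken verbatim from the paper): because the light-ON and light-OFF intensities of a given reflection are recorded at the same wavelength, along the same path and on the same crystal, their ratio cancels the spectral curve, scaling and absorption factors, leaving the relative intensity change per reflection:

        R(hkl) \;=\; \frac{I_{\mathrm{ON}}(hkl)}{I_{\mathrm{OFF}}(hkl)}
        \;\approx\; \frac{|F_{\mathrm{ON}}(hkl)|^{2}}{|F_{\mathrm{OFF}}(hkl)|^{2}},
        \qquad \eta(hkl) = R(hkl) - 1 .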

  20. Beam-smiling in bent-Laue monochromators

    SciTech Connect

    Ren, B.; Dilmanian, F. A.; Wu, X. Y.; Huang, X.; Chapman, L. D.; Ivanov, I.; Zhong, Z.; Thomlinson, W. C.

    1997-07-01

    When a wide fan-shaped x-ray beam is diffracted by a bent crystal in the Laue geometry, the profile of the diffracted beam generally does not appear as a straight line, but as a line with its ends curved up or curved down. This effect, referred to as 'beam-smiling', has been a major obstacle in developing bent-Laue crystal monochromators for medical applications of synchrotron x-rays. We modeled a cylindrically bent crystal using the Finite Element Analysis (FEA) method, and we carried out experiments at the National Synchrotron Light Source and Cornell High Energy Synchrotron Source. Our studies show that, while beam-smiling exists in most of the crystal's area because of anticlastic bending effects, there is a region parallel to the bending axis of the crystal where the diffracted beam is 'smile-free'. By applying asymmetrical bending, this smile-free region can be shifted vertically away from the geometric center of the crystal, as desired. This leads to a novel method of compensating for beam-smiling. We will discuss the method of 'differential bending' for smile removal, beam-smiling in the Cauchois and the polychromatic geometry, and the implications of the method on developing single- and double-bent Laue monochromators. The experimental results will be discussed, concentrating on specific beam-smiling observation and removal as applied to the new monochromator of the Multiple Energy Computed Tomography [MECT] project of the Medical Department, Brookhaven National Laboratory.

  1. Time of flight Laue fiber diffraction studies of perdeuterated DNA

    SciTech Connect

    Forsyth, V.T.; Whalley, M.A.; Mahendrasingam, A.; Fuller, W.

    1994-12-31

    The diffractometer SXD at the Rutherford Appleton Laboratory ISIS pulsed neutron source has been used to record high resolution time-of-flight Laue fiber diffraction data from DNA. These experiments, which are the first of their kind, were undertaken using fibers of DNA in the A conformation, prepared from deuterated DNA in order to minimise incoherent background scattering. These studies complement previous experiments on instrument D19 at the Institut Laue-Langevin using monochromatic neutrons. Sample preparation involved drawing large numbers of these deuterated DNA fibers and mounting them in a parallel array. The strategy of data collection is discussed in terms of camera design, sample environment and data collection. The methods used to correct the recorded time-of-flight data and map it into the final reciprocal-space fiber diffraction dataset are also discussed. Difference Fourier maps showing the distribution of water around A-DNA calculated on the basis of these data are compared with results obtained using data recorded from hydrogenated A-DNA on D19. Since the methods used for sample preparation, data collection and data processing are fundamentally different for the monochromatic and Laue techniques, the results of these experiments also afford a valuable opportunity to independently test the data reduction and analysis techniques used in the two methods.
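
    For readers unfamiliar with the time-of-flight Laue technique: each detected neutron's wavelength is recovered from its flight time over a known path length via the de Broglie relation. A minimal conversion sketch (the constant is the approximate textbook value of h/m_n; the path length and arrival time are illustrative, not values for SXD):

    ```python
    H_OVER_MN = 3956.0   # h / m_neutron in Angstrom * m / s (approximate)

    def tof_wavelength_angstrom(flight_time_s, path_length_m):
        """Neutron de Broglie wavelength from time of flight: lambda = (h/m_n) * t / L."""
        return H_OVER_MN * flight_time_s / path_length_m

    # Illustrative numbers only: a 10 m flight path and a 5 ms arrival time.
    print(f"{tof_wavelength_angstrom(5e-3, 10.0):.2f} Angstrom")   # ~1.98 Angstrom
    ```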

  2. Beam-smiling in bent-Laue monochromators

    SciTech Connect

    Ren, B.; Dilmanian, F.A.; Wu, X.Y.; Huang, X.; Ivanov, I.; Thomlinson, W.C.

    1997-07-01

    When a wide fan-shaped x-ray beam is diffracted by a bent crystal in the Laue geometry, the profile of the diffracted beam generally does not appear as a straight line, but as a line with its ends curved up or curved down. This effect, referred to as 'beam-smiling', has been a major obstacle in developing bent-Laue crystal monochromators for medical applications of synchrotron x-rays. We modeled a cylindrically bent crystal using the Finite Element Analysis (FEA) method, and we carried out experiments at the National Synchrotron Light Source and Cornell High Energy Synchrotron Source. Our studies show that, while beam-smiling exists in most of the crystal's area because of anticlastic bending effects, there is a region parallel to the bending axis of the crystal where the diffracted beam is 'smile-free'. By applying asymmetrical bending, this smile-free region can be shifted vertically away from the geometric center of the crystal, as desired. This leads to a novel method of compensating for beam-smiling. We will discuss the method of 'differential bending' for smile removal, beam-smiling in the Cauchois and the polychromatic geometry, and the implications of the method on developing single- and double-bent Laue monochromators. The experimental results will be discussed, concentrating on specific beam-smiling observation and removal as applied to the new monochromator of the Multiple Energy Computed Tomography [MECT] project of the Medical Department, Brookhaven National Laboratory. © 1997 American Institute of Physics.

  3. Bragg-von Laue diffraction generalized to twisted X-rays.

    PubMed

    Jüstel, Dominik; Friesecke, Gero; James, Richard D

    2016-03-01

    A pervasive limitation of nearly all practical X-ray methods for the determination of the atomic scale structure of matter is the need to crystallize the molecule, compound or alloy in a sufficiently large (∼ 10 × 10 × 10 µm) periodic array. In this paper an X-ray method applicable to structure determination of some important noncrystalline structures is proposed. It is designed according to a strict mathematical analog of von Laue's method, but replacing the translation group by another symmetry group, and simultaneously replacing plane waves by different exact closed-form solutions of Maxwell's equations. Details are presented for helical structures like carbon nanotubes or filamentous viruses. In computer simulations the accuracy of the determination of structure is shown to be comparable to the periodic case. PMID:26919370

  4. Quantitative x-ray phase imaging at the nanoscale by multilayer Laue lenses.

    PubMed

    Yan, Hanfei; Chu, Yong S; Maser, Jörg; Nazaretski, Evgeny; Kim, Jungdae; Kang, Hyon Chol; Lombardo, Jeffrey J; Chiu, Wilson K S

    2013-01-01

    For scanning x-ray microscopy, many attempts have been made to image the phase contrast based on a concept of the beam being deflected by a specimen, the so-called differential phase contrast imaging (DPC). Despite the successful demonstration in a number of representative cases at moderate spatial resolutions, these methods suffer from various limitations that preclude applications of DPC for ultra-high spatial resolution imaging, where the emerging wave field from the focusing optic tends to be significantly more complicated. In this work, we propose a highly robust and generic approach based on a Fourier-shift fitting process and demonstrate quantitative phase imaging of a solid oxide fuel cell (SOFC) anode by multilayer Laue lenses (MLLs). The high sensitivity of the phase to structural and compositional variations makes our technique extremely powerful in correlating the electrode performance with its buried nanoscale interfacial structures that may be invisible to the absorption and fluorescence contrasts.
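
    A generic sketch of the kind of Fourier-based shift estimation such a scheme relies on: a displacement of the far-field pattern between two scan positions appears as a linear phase ramp in Fourier space, and the inverse transform of the normalized cross-power spectrum peaks at that displacement. This is plain phase correlation under assumed inputs, not the authors' fitting procedure.

    ```python
    import numpy as np

    def pattern_shift(ref, img):
        """
        Integer-pixel estimate of the (dy, dx) shift of `img` relative to `ref`
        (2D arrays of equal shape) by phase correlation.
        """
        cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
        cross /= np.maximum(np.abs(cross), 1e-12)        # keep phase only
        corr = np.abs(np.fft.ifft2(cross))
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Map wrapped peak indices to signed shifts.
        return tuple(int(p) if p <= s // 2 else int(p) - s for p, s in zip(peak, corr.shape))

    # Usage idea: the shift of each far-field frame, divided by the propagation distance,
    # is proportional to the beam deflection and hence to the local phase gradient of the sample.
    ```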

  5. Bragg-von Laue diffraction generalized to twisted X-rays.

    PubMed

    Jüstel, Dominik; Friesecke, Gero; James, Richard D

    2016-03-01

    A pervasive limitation of nearly all practical X-ray methods for the determination of the atomic scale structure of matter is the need to crystallize the molecule, compound or alloy in a sufficiently large (∼ 10 × 10 × 10 µm) periodic array. In this paper an X-ray method applicable to structure determination of some important noncrystalline structures is proposed. It is designed according to a strict mathematical analog of von Laue's method, but replacing the translation group by another symmetry group, and simultaneously replacing plane waves by different exact closed-form solutions of Maxwell's equations. Details are presented for helical structures like carbon nanotubes or filamentous viruses. In computer simulations the accuracy of the determination of structure is shown to be comparable to the periodic case.

  6. Sectioning of multilayers to make a multilayer Laue lens

    SciTech Connect

    Kang, Hyon Chol; Stephenson, G. Brian; Liu Chian; Conley, Ray; Khachatryan, Ruben; Wieczorek, Michael; Macrander, Albert T.; Yan Hanfei; Maser, Joerg; Hiller, Jon; Koritala, Rachel

    2007-04-15

    We report a process to fabricate multilayer Laue lenses (MLL's) by sectioning and thinning multilayer films. This method can produce a linear zone plate structure with a very large ratio of zone depth to width (e.g., >1000), orders of magnitude larger than can be attained with photolithography. Consequently, MLL's are advantageous for efficient nanofocusing of hard x rays. MLL structures prepared by the technique reported here have been tested at an x-ray energy of 19.5 keV, and a diffraction-limited performance was observed. The present article reports the fabrication techniques that were used to make the MLL's.

  7. The clinical diagnosis and treatment about 22 cases of limbic encephalitis were retrospectively analyzed.

    PubMed

    Zang, Weiping; Zhang, Zhijun; Feng, Laihui; Zhang, Ailing

    2016-03-01

    To summarize and analyze the clinical characteristics and treatment of limbic encephalitis, in order to provide a basis for clinical work, we retrospectively analyzed the clinical characteristics, magnetic resonance imaging (MRI), cerebrospinal fluid (CSF) and autoimmune antibody results of 22 patients with limbic encephalitis at Zhengzhou People's Hospital from March 2013 to May 2014. All 22 patients presented with psychiatric disturbances, such as hallucinations, as typical clinical manifestations; memory decline occurred in 18 cases, seizures in 13, altered level of consciousness in 10, movement disorders in 7, and fever in 9. Fourteen cases improved after antiviral and immunosuppressive therapy; 5 were left with memory decline, 2 with persistent excitement, and 1 with seizures. The clinical symptoms of limbic encephalitis are complicated, variable and nonspecific, so early diagnosis and treatment are very important for the prognosis of patients. PMID:27113304

  9. Phase contrast image segmentation using a Laue analyser crystal

    NASA Astrophysics Data System (ADS)

    Kitchen, Marcus J.; Paganin, David M.; Uesugi, Kentaro; Allison, Beth J.; Lewis, Robert A.; Hooper, Stuart B.; Pavlov, Konstantin M.

    2011-02-01

    Dual-energy x-ray imaging is a powerful tool enabling two-component samples to be separated into their constituent objects from two-dimensional images. Phase contrast x-ray imaging can render the boundaries between media of differing refractive indices visible, despite them having similar attenuation properties; this is important for imaging biological soft tissues. We have used a Laue analyser crystal and a monochromatic x-ray source to combine the benefits of both techniques. The Laue analyser creates two distinct phase contrast images that can be simultaneously acquired on a high-resolution detector. These images can be combined to separate the effects of x-ray phase, absorption and scattering and, using the known complex refractive indices of the sample, to quantitatively segment its component materials. We have successfully validated this phase contrast image segmentation (PCIS) using a two-component phantom, containing an iodinated contrast agent, and have also separated the lungs and ribcage in images of a mouse thorax. Simultaneous image acquisition has enabled us to perform functional segmentation of the mouse thorax throughout the respiratory cycle during mechanical ventilation.
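    As a rough illustration of how two simultaneously acquired analyser images can be separated into absorption and refraction maps, the sketch below solves the generic two-image, analyser-based system pixel by pixel. This follows the classic diffraction-enhanced-imaging style of recipe under assumed rocking-curve values and slopes; it is not the authors' PCIS algorithm, and all names are hypothetical.

    ```python
    import numpy as np

    def separate_absorption_refraction(I1, I2, R1, R2, dR1, dR2):
        """Solve I_k = I_R * (R_k + dR_k * dtheta) per pixel for k = 1, 2.

        I1, I2   : the two simultaneously recorded analyser images (2D arrays)
        R1, R2   : analyser reflectivities at the two working points (scalars)
        dR1, dR2 : rocking-curve slopes dR/dtheta at those points (scalars)
        Returns the apparent-absorption image I_R and the refraction-angle map dtheta.
        """
        det = R1 * dR2 - R2 * dR1              # determinant of the 2x2 system
        I_R = (I1 * dR2 - I2 * dR1) / det      # apparent absorption image
        dtheta = (R1 * I2 - R2 * I1) / (det * I_R)  # assumes non-zero transmission
        return I_R, dtheta
    ```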

  10. Laue-DIC: a new method for improved stress field measurements at the micrometer scale

    PubMed Central

    Petit, J.; Castelnau, O.; Bornert, M.; Zhang, F. G.; Hofmann, F.; Korsunsky, A. M.; Faurie, D.; Le Bourlot, C.; Micha, J. S.; Robach, O.; Ulrich, O.

    2015-01-01

    A better understanding of the effective mechanical behavior of polycrystalline materials requires an accurate knowledge of the behavior at a scale smaller than the grain size. The X-ray Laue microdiffraction technique available at beamline BM32 at the European Synchrotron Radiation Facility is ideally suited for probing elastic strains (and associated stresses) in deformed polycrystalline materials with a spatial resolution smaller than a micrometer. However, the standard technique used to evaluate local stresses from the distortion of Laue patterns lacks accuracy for many micromechanical applications, mostly due to (i) the fitting of Laue spots by analytical functions, and (ii) the necessary comparison of the measured pattern with the theoretical one from an unstrained reference specimen. In the present paper, a new method for the analysis of Laue images is presented. A Digital Image Correlation (DIC) technique, which is essentially insensitive to the shape of Laue spots, is applied to measure the relative distortion of Laue patterns acquired at two different positions on the specimen. The new method is tested on an in situ deformed Si single crystal, for which the prescribed stress distribution has been calculated by finite-element analysis. It is shown that the new Laue-DIC method allows determination of local stresses with a strain resolution of the order of 10⁻⁵. PMID:26134802
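    The core step of Laue-DIC, measuring the sub-pixel displacement of each Laue spot between two patterns by correlation rather than by peak fitting, can be sketched as follows. This is an illustration only (assumed window size, scikit-image's phase cross-correlation as the correlator), not the code used at BM32.

    ```python
    import numpy as np
    from skimage.registration import phase_cross_correlation

    def spot_displacements(pattern_ref, pattern_def, spot_centres, half_width=20):
        """Sub-pixel shift of each Laue spot between a reference and a deformed pattern.

        spot_centres : iterable of (row, col) spot positions in the reference pattern.
        Returns an (N, 2) array of (drow, dcol) shifts; these relative spot motions are
        what a Laue-DIC style analysis feeds into the strain/stress solver.
        """
        shifts = []
        for r, c in spot_centres:
            win = np.s_[r - half_width:r + half_width, c - half_width:c + half_width]
            shift, _, _ = phase_cross_correlation(
                pattern_ref[win], pattern_def[win], upsample_factor=100)
            shifts.append(shift)
        return np.asarray(shifts)
    ```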

  11. The NSLS-II Multilayer Laue Lens Deposition System

    SciTech Connect

    Conley, R.; Bouet, N.; Biancarosa, J.; Shen, Q.; Boas, L.; Feraca, J.; Rosenbaum, L.

    2009-08-02

    The NSLS-II[1] program has a requirement for an unprecedented level of x-ray nanofocusing and has selected the wedged multilayer Laue lens[2,3] (MLL) as the optic of choice to meet this goal. In order to fabricate the MLL, a deposition system is required that is capable of depositing depth-graded and laterally graded multilayers with precise thickness control over many thousands of layers, with total film growth in one run of up to 100 µm or greater. This machine design builds on the positive features of a rotary deposition system[4] constructed previously for MLLs and will contain multiple stationary, horizontally oriented magnetron sources; a transport will move a substrate back and forth in a linear fashion over shaped apertures at well-defined velocities to build up the multilayer coating.

  12. Comparison of DTR spectral-angular characteristics of divergent beam of relativistic electrons in scattering geometry of Laue and Bragg

    NASA Astrophysics Data System (ADS)

    Blazhevich, S. V.; Koskova, T. V.; Ligidov, A. Z.; Noskov, A. V.

    2016-07-01

    Diffracted transition radiation (DTR) generated by a divergent beam of relativistic electrons crossing a single-crystal plate in different (Laue, Bragg) scattering geometries has been considered for the general case of asymmetric reflection of the electron Coulomb field relative to the entrance target surface. Expressions for the spectral-angular density of DTR and parametric X-ray radiation (PXR) have been derived. DTR and PXR have then been considered in the case of a thin target, where multiple scattering of the electrons is negligibly small, which is important for divergence measurements in real time. Numerical calculations of the spectral-angular density of DTR from a beam of relativistic electrons have been made by averaging over a bivariate Gaussian distribution describing the angular distribution of the electrons in the beam. It has been shown that in Bragg scattering geometry the angular density of DTR is higher than in Laue geometry, which can be explained by the existence of a frequency range in which the incident-wave propagation vector takes complex values even in the absence of absorption. In this range, all photons are reflected in the Bragg direction, which means that the range of total reflection defines the width of the DTR spectrum.

  13. Neutron interferometric measurement and calculations of a phase shift induced by Laue transmission.

    PubMed

    Potocar, T; Zawisky, M; Lemmel, H; Springer, J; Suda, M

    2015-09-01

    This study investigates the phase shift induced by Laue transmission in a perfect Si crystal blade in unprecedented detail. This 'Laue phase' was measured at two wavelengths in the vicinity of the Bragg condition within a neutron interferometer. In particular, the sensitivity of the Laue phase to the alignment of the monochromator and interferometer (rocking angle) and to the beam divergence has been verified. However, the influence of fundamental quantities, such as the neutron-electron scattering length, on the Laue phase is rather small. A remarkably steep phase slope of 5.5° [(220) Bragg peak] and 11.5° [(440) Bragg peak] per 0.001 arcsec deviation from the Bragg angle was observed. The results are analysed using an upgraded simulation tool. PMID:26317196

  14. a Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis.

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Available from UMI in association with The British Library. Quantitative crystal structure analyses have been carried out on small-molecule crystals using synchrotron radiation and the Laue method. A variety of single-crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br·H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed an R-factor of 4.97 and 14.0% for the Mo Kα and Laue data respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C10H11ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kα, Cu Kα and Laue data sets respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenyl-prop-2-en-1-one (C25H20N2O2) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60 and 5.29% for the Mo Kα and Laue analyses respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of benzil (C6H5CO·COC6H5) is carried out using the synchrotron Laue method, firstly at room temperature and secondly at a low temperature of -114 °C. The structure shows an R-factor (on F) of 13.06% and 6.85% for each data set respectively. The synchrotron Laue method was used to collect data for ergocalciferol (vitamin D2). The same crystal was also used to record oscillation data with the synchrotron radiation monochromatic beam.

  15. A Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Quantitative crystal structure analyses have been carried out on small-molecule crystals using synchrotron radiation and the Laue method. A variety of single-crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br·H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed an R-factor of 4.97 and 14.0% for the Mo Kα and Laue data respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C10H11ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kα, Cu Kα and Laue data sets respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenyl-prop-2-en-1-one (C25H20N2O2) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60 and 5.29% for the Mo Kα and Laue analyses respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of benzil (C6H5CO·COC6H5) is carried out using the synchrotron Laue method, firstly at room temperature and secondly at low temperature. The structure shows an R-factor (on F) of 13.06% and 6.85% for each data set respectively. The synchrotron Laue method was used to collect data for ergocalciferol (vitamin D2). The same crystal was also used to record oscillation data with the synchrotron radiation monochromatic beam. A new

  16. Image-plate synchrotron laue data collection and subsequent structural analysis of a small test crystal of a nickel-containing aluminophosphate.

    PubMed

    Snell, E; Habash, J; Helliwell, M; Helliwell, J R; Raftery, J; Kaucic, V; Campbell, J W

    1995-01-01

    Image plates have advantages over photographic films, which include wider dynamic range, higher detector quantum efficiency, reduced exposure time and large size. In this study, an on-line image-plate system has been used to record crystallographic data from a small crystal. In particular, synchrotron Laue data were recorded with λmin = 0.455 Å and λmax = 1.180 Å, in 20 images 10 degrees apart and with an exposure time of 0.3 s each, from a crystal (0.02 x 0.05 x 0.25 mm) of a nickel-containing aluminophosphate, NiAPO. The Laue data were analyzed with the Daresbury Laue software, including the application of an absorption correction. The structure was solved by a combination of the Patterson method and successive difference Fourier calculations using SHELXS86 and SHELXL93; the final R value for 1934 unique reflections (all data) and 310 parameters was 7.90%. The structure agrees with that determined by monochromatic diffractometry using the same crystal and reported by Helliwell, Gallois, Kariuki, Kaucic & Helliwell [Acta Cryst. (1993), B49, 420-428] with an r.m.s. deviation of 0.03 Å. Hence, this study shows the image-plate device to be very effective for synchrotron data collection and subsequent structure analysis from small crystals, i.e. 0.02 x 0.05 x 0.25 mm, in chemical crystallography, as well as providing further confirmation of the practicability of Laue data in structure solution and refinement. PMID:16714782

  17. Effects of Professional Experience and Group Interaction on Information Requested in Analyzing IT Cases

    ERIC Educational Resources Information Center

    Lehmann, Constance M.; Heagy, Cynthia D.

    2008-01-01

    The authors investigated the effects of professional experience and group interaction on the information that information technology professionals and graduate accounting information system (AIS) students request when analyzing business cases related to information systems design and implementation. Understanding these effects can contribute to…

  18. Rate of occurrence of failures based on a nonhomogeneous Poisson process: an ozone analyzer case study.

    PubMed

    de Moura Xavier, José Carlos; de Andrade Azevedo, Irany; de Sousa Junior, Wilson Cabral; Nishikawa, Augusto

    2013-02-01

    Atmospheric pollutant monitoring constitutes a primordial activity in public policies concerning air quality. In São Paulo State, Brazil, the São Paulo State Environment Company (CETESB) maintains an automatic network which continuously monitors CO, SO2, NOx, O3, and particulate matter concentrations in the air. The accuracy of the monitoring process is a fundamental condition for the actions to be taken by CETESB. As one of the support systems, a preventive maintenance program for the different analyzers used is part of the data quality strategy. Knowledge of the behavior of analyzer failure times could help optimize the program. To achieve this goal, the failure times of an ozone analyzer, considered a repairable system, were modeled by means of the nonhomogeneous Poisson process. The rate of occurrence of failures (ROCOF) was estimated for the intervals 0-70,800 h and 0-88,320 h, in which six and seven failures were observed, respectively. The results showed that the ROCOF estimate is influenced by the choice of the observation period, t(0) = 70,800 h and t(7) = 88,320 h in the cases analyzed. Identification of preventive maintenance actions, mainly when parts replacement occurs in the last interval of observation, is highlighted, justifying the alteration in the behavior of the inter-arrival times. A follow-up on each analyzer is recommended in order to record the impact of the preventive maintenance program on the enhancement of its useful life.
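    The abstract does not state which NHPP intensity was fitted; a common choice for repairable systems is the power-law (Crow-AMSAA) model, whose time-truncated maximum-likelihood estimates are simple enough to sketch. The failure times in the commented example are hypothetical.

    ```python
    import numpy as np

    def powerlaw_nhpp_mle(failure_times, T):
        """MLE of the power-law NHPP intensity u(t) = lam * beta * t**(beta - 1)
        for a system observed over the time-truncated window (0, T]."""
        t = np.asarray(failure_times, dtype=float)
        n = t.size
        beta = n / np.sum(np.log(T / t))   # shape parameter
        lam = n / T**beta                  # scale parameter
        return lam, beta

    def rocof(t, lam, beta):
        """Rate of occurrence of failures at time t."""
        return lam * beta * t**(beta - 1)

    # Hypothetical example: six failures observed over 0-70,800 h
    # lam, beta = powerlaw_nhpp_mle([9000, 21000, 34000, 47000, 58000, 66000], 70_800)
    # print(rocof(70_800, lam, beta))
    ```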

  19. Feasibility of one-shot-per-crystal structure determination using Laue diffraction

    SciTech Connect

    Cornaby, Sterling; Szebenyi, Doletha M. E.; Smilgies, Detlef-M.; Schuller, David J.; Gillilan, Richard; Hao, Quan; Bilderback, Donald H.

    2010-01-01

    Structure determination was successfully carried out using single Laue exposures from a group of lysozyme crystals. The Laue method may be a viable option for collection of one-shot-per-crystal data from microcrystals. Crystal size is an important factor in determining the number of diffraction patterns which may be obtained from a protein crystal before severe radiation damage sets in. As crystal dimensions decrease this number is reduced, eventually falling to one, at which point a complete data set must be assembled using data from multiple crystals. When only a single exposure is to be collected from each crystal, the polychromatic Laue technique may be preferable to monochromatic methods owing to its simultaneous recording of a large number of fully recorded reflections per image. To assess the feasibility of solving structures using single Laue images from multiple crystals, data were collected using a ‘pink’ beam at the CHESS D1 station from groups of lysozyme crystals with dimensions of the order of 20–30 µm mounted on MicroMesh grids. Single-shot Laue data were used for structure determination by molecular replacement and correct solutions were obtained even when as few as five crystals were used.

  20. Designing and commissioning of a prototype double Laue monochromator at CHESS

    NASA Astrophysics Data System (ADS)

    Ko, J. Y. Peter; Oswald, Benjamin B.; Savino, James J.; Pauling, Alan K.; Lyndaker, Aaron; Revesz, Peter; Miller, Matthew P.; Brock, Joel D.

    2014-03-01

    High-energy X-rays are efficiently focused sagittally by a set of asymmetric Laue (transmission) crystals. We designed, built and commissioned a prototype double Laue monochromator ((111) reflection in Si(100)) optimized for high-energy X-rays (30-60 keV). Here, we report our design of novel prototype sagittal bender and highlight results from recent characterization experiments. The design of the bender combines the tuneable bending control afforded by previous leaf-spring designs with the stability and small size of a four-bar bender. The prototype monochromator focuses a 25 mm-wide white beam incident on the first monochromator crystal to a monochromatized 0.6 mm beam waist in the experimental station. Compared to the flux in the same focal spot with the Bragg crystal (without focusing), the prototype Laue monochromator delivered 85 times more at 30 keV.

  1. Laue crystal structure of Shewanella oneidensis cytochrome c nitrite reductase from a high-yield expression system

    SciTech Connect

    Youngblut, Matthew; Judd, Evan T.; Srajer, Vukica; Sayyed, Bilal; Goelzer, Tyler; Elliott, Sean J.; Schmidt, Marius; Pacheco, A. Andrew

    2012-09-11

    The high-yield expression and purification of Shewanella oneidensis cytochrome c nitrite reductase (ccNiR) and its characterization by a variety of methods, notably Laue crystallography, are reported. A key component of the expression system is an artificial ccNiR gene in which the N-terminal signal peptide from the highly expressed S. oneidensis protein 'small tetraheme c' replaces the wild-type signal peptide. This gene, inserted into the plasmid pHSG298 and expressed in S. oneidensis TSP-1 strain, generated approximately 20 mg crude ccNiR per liter of culture, compared with 0.5-1 mg/L for untransformed cells. Purified ccNiR has nitrite and hydroxylamine reductase activities comparable to those previously reported for Escherichia coli ccNiR, and is stable for over 2 weeks in pH 7 solution at 4 °C. UV/vis spectropotentiometric titrations and protein film voltammetry identified five independent one-electron reduction processes. Global analysis of the spectropotentiometric data also allowed determination of the extinction coefficient spectra for the five reduced ccNiR species. The characteristics of the individual extinction coefficient spectra suggest that, within each reduced species, the electrons are distributed among the various hemes, rather than being localized on specific heme centers. The purified ccNiR yielded good-quality crystals, with which the 2.59 Å resolution structure was solved at room temperature using the Laue diffraction method. The structure is similar to that of E. coli ccNiR, except in the region where the enzyme interacts with its physiological electron donor (CymA in the case of S. oneidensis ccNiR, NrfB in the case of the E. coli protein).

  2. Laue Crystal Structure of Shewanella oneidensis Cytochrome c Nitrite Reductase from a High-yield Expression System

    PubMed Central

    Youngblut, Matthew; Judd, Evan T.; Srajer, Vukica; Sayyed, Bilal; Goelzer, Tyler; Elliott, Sean J.; Schmidt, Marius; Pacheco, A. Andrew

    2012-01-01

    The high-yield expression and purification of Shewanella oneidensis cytochrome c nitrite reductase (ccNiR), and its characterization by a variety of methods, notably Laue crystallography, is reported. A key component of the expression system is an artificial ccNiR gene in which the N-terminal signal peptide from the highly expressed S. oneidensis protein “Small Tetra-heme c” replaces the wild-type signal peptide. This gene, inserted into the plasmid pHSG298 and expressed in S. oneidensis TSP-1 strain, generated ~20 mg crude ccNiR/L culture, compared with 0.5–1 mg/L for untransformed cells. Purified ccNiR has nitrite and hydroxylamine reductase activities comparable to those previously reported for E. coli ccNiR, and is stable for over two weeks in pH 7 solution at 4° C. UV/Vis spectropotentiometric titrations and protein film voltammetry identified 5 independent 1-electron reduction processes. Global analysis of the spectropotentiometric data also allowed determination of the extinction coefficient spectra for the 5 reduced ccNiR species. The characteristics of the individual extinction coefficient spectra suggest that, within each reduced species, the electrons are distributed amongst the various hemes, rather than being localized on specific heme centers. The purified ccNiR yielded good quality crystals, with which the 2.59 Å resolution structure was solved at room temperature using the Laue diffraction method. The structure is similar to that of E. coli ccNiR, except in the region where the enzyme interacts with its physiological electron donor (CymA in the case of S. oneidensis ccNiR, NrfB in the case of the E. coli protein). PMID:22382353

  3. Analyzing privacy requirements: A case study of healthcare in Saudi Arabia.

    PubMed

    Ebad, Shouki A; Jaha, Emad S; Al-Qadhi, Mohammed A

    2016-01-01

    Developing legally compliant systems is a challenging software engineering problem, especially in systems that are governed by law, such as healthcare information systems. This challenge comes from the ambiguities and domain-specific definitions that are found in governmental rules. Therefore, there is a significant business need to automatically analyze privacy texts, extract rules and subsequently enforce them throughout the supply chain. The existing works that analyze health regulations use the U.S. Health Insurance Portability and Accountability Act as a case study. In this article, we applied the Breaux and Antón approach to the text of the Saudi Arabian healthcare privacy regulations; in Saudi Arabia, privacy is among the top dilemmas for public and private healthcare practitioners. As a result, we extracted and analyzed 2 rights, 4 obligations, 22 constraints, and 6 rules. Our analysis can assist requirements engineers, standards organizations, compliance officers and stakeholders by ensuring that their systems conform to Saudi policy. In addition, this article discusses the threats to the study validity and suggests open problems for future research. PMID:25325796

  4. Analyzing and Comparing Biomass Feedstock Supply Systems in China: Corn Stover and Sweet Sorghum Case Studies

    SciTech Connect

    Ren, Lantian; Cafferty, Kara; Roni, Mohammad; Jacobson, Jacob; Xie, Guanghui; Ovard, Leslie; Wright, Christopher

    2015-06-11

    This paper analyzes the rural Chinese biomass supply system and models supply chain operations according to U.S. concepts of logistical unit operations: harvest and collection, storage, transportation, preprocessing, and handling and queuing. In this paper, we quantify the logistics cost of corn stover and sweet sorghum in China under different scenarios. We analyze three scenarios of corn stover logistics from northeast China and three scenarios of sweet sorghum stalk logistics from Inner Mongolia in China. The case study estimates the logistics cost of corn stover and sweet sorghum stalk to be $52.95/dry metric ton and $52.64/dry metric ton, respectively, for the current labor-based biomass logistics system. However, if the feedstock logistics operation is mechanized, the cost of corn stover and sweet sorghum stalk decreases to $36.01/dry metric ton and $35.76/dry metric ton, respectively. The study also includes a sensitivity analysis to identify the cost factors that cause logistics cost variation. Results of the sensitivity analysis show that labor price has the most influence on the logistics cost of corn stover and sweet sorghum stalk, with a variation of $6 to $12/dry metric ton.

  5. Analyzing and Comparing Biomass Feedstock Supply Systems in China: Corn Stover and Sweet Sorghum Case Studies

    DOE PAGES

    Ren, Lantian; Cafferty, Kara; Roni, Mohammad; Jacobson, Jacob; Xie, Guanghui; Ovard, Leslie; Wright, Christopher

    2015-06-11

    This paper analyzes the rural Chinese biomass supply system and models supply chain operations according to U.S. concepts of logistical unit operations: harvest and collection, storage, transportation, preprocessing, and handling and queuing. In this paper, we quantify the logistics cost of corn stover and sweet sorghum in China under different scenarios. We analyze three scenarios of corn stover logistics from northeast China and three scenarios of sweet sorghum stalk logistics from Inner Mongolia in China. The case study estimates the logistics cost of corn stover and sweet sorghum stalk to be $52.95/dry metric ton and $52.64/dry metric ton, respectively, for the current labor-based biomass logistics system. However, if the feedstock logistics operation is mechanized, the cost of corn stover and sweet sorghum stalk decreases to $36.01/dry metric ton and $35.76/dry metric ton, respectively. The study also includes a sensitivity analysis to identify the cost factors that cause logistics cost variation. Results of the sensitivity analysis show that labor price has the most influence on the logistics cost of corn stover and sweet sorghum stalk, with a variation of $6 to $12/dry metric ton.

  6. Analyzing the performance of the planning system by use of AAPM TG 119 test cases.

    PubMed

    Nithya, L; Raj, N Arunai Nambi; Rathinamuthu, Sasikumar; Pandey, Manish Bhushan

    2016-01-01

    Our objective in this study was to create AAPM TG 119 test plans for intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) in the Monaco planning system. The results were compared with the published studies, and the performance of the Monaco planning system was analyzed. AAPM TG 119 proposed a set of test cases called multi-target, mock prostate, mock head and neck and C-shape to ascertain the overall accuracy of IMRT planning, measurement, and analysis. We used these test cases to investigate the performance of the Monaco planning system for the complex plans. For these test cases, we created IMRT plans with static multi-leaf collimator (MLC) and dynamic MLC by using 7-9 static beams as explained in TG-119. VMAT plans were also created with a 320° arc length and a single or double arc. The planning objectives and dose were set as described in TG 119. The dose prescriptions for multi-target, mock prostate, mock head and neck, and C-shape were taken as 50, 75.6, 50 and 50 Gy, respectively. All plans were compared with the results of TG 119 and the study done by Mynampati et al. Point dose and fluence measurements were done with a CC13 chamber and ArcCHECK phantom, respectively. Gamma analysis was done for the calculated and measured dose. Using the Monaco planning system, we achieved the goals mentioned in AAPM TG-119, and the plans were comparable to those of other studies. A comparison of point dose and fluence showed good results. From these results, we conclude that the performance of the Monaco planning system is good for complex plans.
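    The gamma analysis mentioned above compares measured and calculated dose by combining a dose-difference criterion with a distance-to-agreement criterion. A brute-force 1D version (global 3%/3 mm, hypothetical profile spacing) is sketched below; clinical software such as that used with the ArcCHECK phantom performs this in 2D/3D with many refinements.

    ```python
    import numpy as np

    def gamma_1d(dose_ref, dose_eval, spacing_mm, dose_tol=0.03, dta_mm=3.0):
        """Brute-force global gamma index for two 1D dose profiles on the same grid."""
        dose_ref = np.asarray(dose_ref, dtype=float)
        dose_eval = np.asarray(dose_eval, dtype=float)
        x = np.arange(dose_ref.size) * spacing_mm
        d_norm = dose_tol * dose_ref.max()            # global dose criterion
        gamma = np.empty(dose_ref.size)
        for i in range(dose_ref.size):
            dist2 = ((x - x[i]) / dta_mm) ** 2        # spatial term
            dose2 = ((dose_eval - dose_ref[i]) / d_norm) ** 2  # dose term
            gamma[i] = np.sqrt(np.min(dist2 + dose2))
        return gamma

    # pass rate = np.mean(gamma_1d(ref_profile, eval_profile, spacing_mm=1.0) <= 1.0)
    ```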

  7. Developing a second generation Laue lens prototype: high-reflectivity crystals and accurate assembly

    NASA Astrophysics Data System (ADS)

    Barrière, Nicolas M.; Tomsick, John A.; Boggs, Steven E.; Lowell, Alexander; von Ballmoos, Peter

    2011-09-01

    Laue lenses are an emerging technology that will enhance gamma-ray telescope sensitivity by one to two orders of magnitude in selected energy bands of the ~100 keV to ~1.5 MeV range. This optic would be particularly well adapted to the observation of faint gamma ray lines, as required for the study of Supernovae and Galactic positron annihilation. It could also prove very useful for the study of hard X-ray tails from a variety of compact objects, especially making a difference by providing sufficient sensitivity for polarization to be measured by the focal plane detector. Our group has been addressing the two key issues relevant to improve performance with respect to the first generation of Laue lens prototypes: obtaining large numbers of efficient crystals and developing a method to fix them with accurate orientation and dense packing factor onto a substrate. We present preliminary results of an on-going study aiming to enable a large number of crystals suitable for diffraction at energies above 500 keV. In addition, we show the first results of the Laue lens prototype assembled using our beamline at SSL/UC Berkeley, which demonstrates our ability to orient and glue crystals with accuracy of a few arcsec, as required for an efficient Laue lens telescope.

  8. A formulation to analyze system-of-systems problems: A case study of airport metroplex operations

    NASA Astrophysics Data System (ADS)

    Ayyalasomayajula, Sricharan Kishore

    A system-of-systems (SoS) can be described as a collection of multiple, heterogeneous, distributed, independent components interacting to achieve a range of objectives. A generic formulation was developed to model component interactions in an SoS to understand their influence on overall SoS performance. The formulation employs a lexicon to aggregate components into hierarchical interaction networks and understand how their topological properties affect the performance of the aggregations. Overall SoS performance is evaluated by monitoring the changes in stakeholder profitability due to changes in component interactions. The formulation was applied to a case study in air transportation focusing on operations at airport metroplexes. Metroplexes are geographical regions with two or more airports in close proximity to one another. The case study explored how metroplex airports interact with one another, what dependencies drive these interactions, and how these dependencies affect metroplex throughput and capacity. Metrics were developed to quantify runway dependencies at a metroplex and were correlated with its throughput and capacity. Operations at the New York/New Jersey metroplex (NYNJ) airports were simulated to explore the feasibility of operating very large aircraft (VLA), such as the Airbus A380, as a delay-mitigation strategy at these airports. The proposed formulation was employed to analyze the impact of this strategy on different stakeholders in the national air transportation system (ATS), such as airlines and airports. The analysis results and their implications were used to compare the pros and cons of operating VLAs at NYNJ from the perspectives of airline profitability, and flight delays at NYNJ and across the ATS.

  9. Fair shares: a preliminary framework and case analyzing the ethics of offshoring.

    PubMed

    Gordon, Cameron; Zimmerman, Alan

    2010-06-01

    Much has been written about the offshoring phenomenon from an economic efficiency perspective. Most authors have attempted to measure the net economic effects of the strategy, and many purport to show that "in the long run" the benefits will outweigh the costs. There is also a relatively large literature on implementation which describes the best way to manage the offshoring process. But what is the morality of offshoring? What is its "rightness" or "wrongness"? Little analysis of the ethics of offshoring has been completed thus far. This paper develops a preliminary framework for analyzing the ethics of offshoring and then applies this framework to a basic case study of offshoring in the U.S. The remainder of the paper discusses the definition of offshoring; shifts to the basic philosophical grounding of the ethical concepts; develops a template for conducting an ethics analysis of offshoring; applies this template using basic data for offshoring in the United States; and conducts a preliminary ethical analysis of the phenomenon in that country, using a form of utilitarianism as an analytical baseline. The paper concludes with suggestions for further research.

  10. Fair shares: a preliminary framework and case analyzing the ethics of offshoring.

    PubMed

    Gordon, Cameron; Zimmerman, Alan

    2010-06-01

    Much has been written about the offshoring phenomenon from an economic efficiency perspective. Most authors have attempted to measure the net economic effects of the strategy, and many purport to show that "in the long run" the benefits will outweigh the costs. There is also a relatively large literature on implementation which describes the best way to manage the offshoring process. But what is the morality of offshoring? What is its "rightness" or "wrongness"? Little analysis of the ethics of offshoring has been completed thus far. This paper develops a preliminary framework for analyzing the ethics of offshoring and then applies this framework to a basic case study of offshoring in the U.S. The remainder of the paper discusses the definition of offshoring; shifts to the basic philosophical grounding of the ethical concepts; develops a template for conducting an ethics analysis of offshoring; applies this template using basic data for offshoring in the United States; and conducts a preliminary ethical analysis of the phenomenon in that country, using a form of utilitarianism as an analytical baseline. The paper concludes with suggestions for further research. PMID:19629753

  11. An X-ray Raman spectrometer for EXAFS studies on minerals: bent Laue spectrometer with 20 keV X-rays.

    PubMed

    Hiraoka, N; Fukui, H; Tanida, H; Toyokawa, H; Cai, Y Q; Tsuei, K D

    2013-03-01

    An X-ray Raman spectrometer for studies of local structures in minerals is discussed. In contrast to widely adopted back-scattering spectrometers using ≤10 keV X-rays, a spectrometer utilizing ~20 keV X-rays and a bent Laue analyzer is proposed. The 20 keV photons penetrate mineral samples much more deeply than 10 keV photons, so that high intensity is obtained owing to an enhancement of the scattering volume. Furthermore, a bent Laue analyzer provides a wide band-pass and a high reflectivity, leading to a much enhanced integrated intensity. A prototype spectrometer has been constructed and performance tests carried out. The oxygen K-edge in SiO2 glass and crystal (α-quartz) has been measured with energy resolutions of 4 eV (EXAFS mode) and 1.3 eV (XANES mode). Unlike previously adopted methods, it is proposed to determine the pre-edge curve based on a theoretical Compton profile and a Monte Carlo multiple-scattering simulation before extracting the EXAFS features. It is shown that the obtained EXAFS features are reproduced fairly well by a cluster model with a minimal set of fitting parameters. The spectrometer and the data processing proposed here are readily applicable to high-pressure studies.

  12. Framework for Unified Systems Engineering and Design of Wind Plants (FUSED-Wind) cost models and case analyzer

    SciTech Connect

    Dykes, Katherine; Graf, Peter

    2014-09-10

    Cost and case-analyzer components of the FUSED-Wind software. These are small pieces of code that define software interfaces for wind plant cost-of-energy analysis on the one hand and for the analysis of load cases with an aeroelastic code on the other.

  13. Genetic algorithm to design Laue lenses with optimal performance for focusing hard X- and γ-rays

    NASA Astrophysics Data System (ADS)

    Camattari, Riccardo; Guidi, Vincenzo

    2014-10-01

    To focus hard X- and γ-rays it is possible to use a Laue lens as a concentrator. With this optic it is possible to improve the detection of radiation for several applications, from the observation of the most violent phenomena in the sky to diagnostic and therapeutic applications in nuclear medicine. We implemented a code named LaueGen, which is based on a genetic algorithm and aims to design optimized Laue lenses. A genetic algorithm was selected because optimizing a Laue lens is a complex and discretized problem. The output of the code is the design of a Laue lens composed of diffracting crystals that are selected and arranged in such a way as to maximize the lens performance. The code can manage crystals of any material and crystallographic orientation. The program is structured in such a way that the user can control all the initial lens parameters. As a result, LaueGen is highly versatile and can be used to design very small lenses, for example for nuclear medicine, or very large lenses, for example for satellite-borne astrophysical missions.
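    To make the genetic-algorithm idea concrete, here is a toy sketch in which each gene assigns a (material, reflection) pair to one ring of a lens and a placeholder fitness scores the arrangement. The crystal pool, the encoding and the fitness function are invented for illustration; they are not LaueGen's actual figure of merit.

    ```python
    import random

    # Hypothetical pool of (material, reflection) choices for each ring of the lens
    CHOICES = [("Si", "111"), ("Ge", "111"), ("GaAs", "220"), ("Cu", "111")]

    def fitness(lens):
        # Placeholder figure of merit: reward using many distinct crystal types.
        # A real lens optimizer would score diffraction efficiency, focal spot size, etc.
        return len(set(lens))

    def evolve(n_rings=20, pop_size=40, generations=200, p_mut=0.05):
        pop = [[random.choice(CHOICES) for _ in range(n_rings)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:pop_size // 2]                 # truncation selection
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, n_rings)        # one-point crossover
                child = a[:cut] + b[cut:]
                child = [random.choice(CHOICES) if random.random() < p_mut else g
                         for g in child]                  # mutation
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)
    ```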

  14. A neutron image plate quasi-Laue diffractometer for protein crystallography

    SciTech Connect

    Cipriani, F.; Castagna, J.C.; Wilkinson, C.

    1994-12-31

    An instrument which is based on image plate technology has been constructed to perform cold neutron Laue crystallography on protein structures. The crystal is mounted at the center of a cylindrical detector which is 400 mm long and has a circumference of 1000 mm, with gadolinium oxide-containing image plates mounted on its exterior surface. Laue images registered on the plate are read out by rotating the drum and translating a laser read head parallel to the cylinder axis, giving a pixel size of 200 µm x 200 µm and a total read time of 5 minutes. Preliminary results indicate that it should be possible to obtain a complete data set from a protein crystal to atomic resolution in about two weeks.

  15. A neutron image plate quasi-Laue diffractometer for protein crystallography.

    PubMed

    Cipriani, F; Castagna, J C; Wilkinson, C; Lehmann, M S; Büldt, G

    1996-01-01

    An instrument which is based on image plate technology has been constructed to perform cold neutron Laue crystallography on protein structures. The crystal is mounted at the center of a cylindrical detector which is 400 mm long and has a circumference of 1000 mm, with gadolinium oxide-containing image plates mounted on its exterior surface. Laue images registered on the plate are read out by rotating the drum and translating a laser read head parallel to the cylinder axis, giving a pixel size of 200 µm x 200 µm and a total read time of 5 minutes. Preliminary results indicate that it should be possible to obtain a complete data set from a protein crystal to atomic resolution in about two weeks. PMID:9092460

  16. Zone compensated multilayer laue lens and apparatus and method of fabricating the same

    DOEpatents

    Conley, Raymond P.; Liu, Chian Qian; Macrander, Albert T.; Yan, Hanfei; Maser, Jorg; Kang, Hyon Chol; Stephenson, Gregory Brian

    2015-07-14

    A multilayer Laue lens includes a compensation layer formed between a first multilayer section and a second multilayer section. Each of the first and second multilayer sections includes a plurality of alternating layers made of a pair of different materials. The thickness of the layers of the first multilayer section increases monotonically, so that a layer adjacent to the substrate has the minimum thickness, and the thickness of the layers of the second multilayer section decreases monotonically, so that a layer adjacent to the compensation layer has the maximum thickness. In particular, the compensation layer of the multilayer Laue lens has an in-plane thickness gradient laterally offset by 90° compared to the other layers in the first and second multilayer sections, thereby eliminating the strict requirement on layer placement error.

  17. Dance Pedagogy Case Studies: A Grounded Theory Approach to Analyzing Qualitative Data

    ERIC Educational Resources Information Center

    Wilson, Margaret

    2009-01-01

    Combining traditional forms of research to fit unique populations contributes to understanding broad phenomena within the discipline of dance. This paper describes a methodological approach for understanding separate, but interrelated, case studies which illuminated a particular approach to teaching and learning about the body. In each case study…

  18. Analyzing the Roles, Activities, and Skills of Learning Technologists: A Case Study from City University London

    ERIC Educational Resources Information Center

    Fox, Olivia; Sumner, Neal

    2014-01-01

    This article reports on a case study carried out at City University London into the role of learning technologists. The article examines how the role developed by providing points of comparison with a report on the career development of learning technology staff in UK universities in 2001. This case study identified that learning technologists…

  19. Worked Examples Leads to Better Performance in Analyzing and Solving Real-Life Decision Cases

    ERIC Educational Resources Information Center

    Cevik, Yasemin Demiraslan; Andre, Thomas

    2012-01-01

    This study compared the impact of three types of case-based methods (worked example, faded worked example, and case-based reasoning) on preservice teachers' (n=71) decision making and reasoning related to realistic classroom management situations. Participants in this study received a short-term implementation of one of these three major…

  20. Focusing of hard x-rays to 16 nanometers with a multilayer Laue lens.

    SciTech Connect

    Kang, H. C.; Yan, H.; Maser, J.; Liu, C.; Conley, R.; Macrander , A. T.; Vogt, S.; Winarski, R.; Holt, M.; Stephenson, G. B.

    2008-06-01

    We report improved results for hard x-ray focusing using a multilayer Laue lens (MLL). We have measured a line focus of 16 nm width with an efficiency of 31% at a wavelength λ = 0.064 nm (19.5 keV) using a partial MLL structure with an outermost zone width of 5 nm. The results are in good agreement with the theoretically predicted performance.

  1. A Theoretical Study of the Two-Dimensional Point Focusing by Two Multilayer Laue Lenses.

    SciTech Connect

    Yan, H.; Maser, J.; Kang, H.C.; Macrander, A.; Stephenson, B.

    2008-08-10

    Hard x-ray point focusing by two crossed multilayer Laue lenses is studied using a full-wave modeling approach. This study shows that for a small numerical aperture, the two consecutive diffraction processes can be decoupled into two independent ones in respective directions. Using this theoretical tool, we investigated adverse effects of various misalignments on the 2D focus profile and discussed the tolerance to them. We also derived simple expressions that described the required alignment accuracy.
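    Read literally, the decoupling result above says that, at small numerical aperture, the focused two-dimensional field of the crossed lenses factorizes into the product of two independent one-dimensional focal amplitudes; schematically (our paraphrase of the abstract, not a formula quoted from the paper):

    ```latex
    % Schematic factorization assumed from the abstract's decoupling statement:
    \[
      U(x, y) \;\approx\; u_h(x)\, u_v(y),
    \]
    ```

    where u_h and u_v are the 1D diffraction amplitudes of the horizontally and vertically focusing multilayer Laue lenses.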

  2. Method and apparatus for producing monochromatic radiography with a bent laue crystal

    DOEpatents

    Zhong, Zhong; Chapman, Leroy Dean; Thomlinson, William C.

    2000-03-14

    A method and apparatus for producing a monochromatic beam. A plurality of beams are generated from a polyenergetic source. The beams are then transmitted through a bent crystal, preferably a bent Laue crystal, having a non-cylindrical shape. A position of the bent crystal is rocked with respect to the polyenergetic source until a plurality of divergent monochromatic beams are emitted from the bent crystal.

  3. Highly reproducible quasi-mosaic crystals as optical components for a Laue lens

    NASA Astrophysics Data System (ADS)

    Camattari, Riccardo; Battelli, Alessandro; Bellucci, Valerio; Guidi, Vincenzo

    2014-02-01

    The realization of a Laue lens for astronomical purposes involves the mass production of a series of crystalline tiles as optical components, allowing high-efficiency diffraction and high-resolution focusing of photons. Crystals with self-standing curved diffraction planes are a valid and promising solution. By exploiting the quasi-mosaic effect, it turns out to be possible to diffract radiation at higher resolution. In this paper we present the realization of 150 quasi-mosaic Ge samples, bent by grooving one of their largest surfaces. We show that the grooving method is a viable technique to manufacture such crystals in a simple and very reproducible way, and thus compatible with mass production. The realized samples present a very homogeneous curvature. Furthermore, with a specific chemical etch, it is possible to fine-tune the radius of curvature of the grooved samples one by one. The realized crystals were selected for the ASI's Laue project, which involves the implementation of a prototype of a Laue lens for hard X- and soft γ-ray astronomy.

  4. A cognitive systematic approach to analyzing preparation design for a difficult space management case.

    PubMed

    Bassett, Joyce L

    2007-11-01

    There are at least two different techniques for preparing teeth prior to bonded porcelain restorations. The first involves using depth cutters guided by the existing tooth structure. A more recently developed approach integrates an additive wax-up that represents the final volume of the teeth, with indices used to guide the preparation design. This article illustrates in detail a clinical case that was prepared by combining the earlier simplified depth cutter approach with recontouring and preparation design principles determined clinically by the dentist. The same case was prepared in the laboratory on plastic models, using labial and incisal reduction preparation guides fabricated from a diagnostic wax-up. This combination of techniques will simplify preparation design for difficult space management cases and facilitate predictable and repeatable results that meet current esthetic standards while staying conservative and preserving tooth structure.

  5. Analyzing Mathematics Textbooks through a Constructive-Empirical Perspective on Abstraction: The Case of Pythagoras' Theorem

    ERIC Educational Resources Information Center

    Yang, Kai-Lin

    2016-01-01

    This study aims at analyzing how Pythagoras' theorem is handled in three versions of Taiwanese textbooks using a conceptual framework of a constructive-empirical perspective on abstraction, which comprises three key attributes: the generality of the object, the connectivity of the subject and the functionality of diagrams as the focused semiotic…

  6. Analyzing Activities in the Course of Science Education, According to Activity Theory: The Case of Sound

    ERIC Educational Resources Information Center

    Theodoraki, Xarikleia; Plakitsi, Katerina

    2013-01-01

    In the present study, we analyze activities on the topic of sound, which are performed in the science education laboratory lessons in the third-year students of the Department of Early Childhood Education at the University of Ioannina. The analysis of the activities is based on one of the most modern learning theories of CHAT (Cultural Historical…

  7. Socioeconomic Indicators for Analyzing Convergence: The Case of Greece--1960-2004

    ERIC Educational Resources Information Center

    Liargovas, Panagiotis G.; Fotopoulos, Georgios

    2009-01-01

    The purpose of this paper is to use socioeconomic indicators for analyzing convergence within Greece at regional (NUTS II) and prefecture levels (NUTS III) since 1960. We use two alternative approaches. The first one is based on the coefficient of variation and the second one on quality of life rankings. We confirm the decline of regional…

  8. A Comparison of Mean Phase Difference and Generalized Least Squares for Analyzing Single-Case Data

    ERIC Educational Resources Information Center

    Manolov, Rumen; Solanas, Antonio

    2013-01-01

    The present study focuses on single-case data analysis specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least square regression analysis and is compared to a proposed non-regression technique, which allows obtaining similar information. The…

  9. Analyzing green/open space accessibility by using GIS: case study of northern Cyprus cities

    NASA Astrophysics Data System (ADS)

    Kara, Can; Akçit, Nuhcan

    2015-06-01

    It is well known that green spaces are vital for increasing the quality of life within the urban environment. The World Health Organization states that there should be at least 9 square meters of green space per person. The European Environment Agency specifies that 5000 square meters of green space should be accessible within a 300 meter distance of households. The green structure in Northern Cyprus is neither sufficient nor effective in this respect: green spaces have been neglected in the urban planning process and have started to lose significance and importance. The present work analyzes the accessibility of green spaces in the cities of Northern Cyprus; Kioneli, Famagusta, Kyrenia and the northern part of Nicosia are analyzed in this manner. To do so, the green space structure is analyzed using digital data, and the accessibility of green space is measured using 300-meter buffers for each city. Euclidean distances are computed from each building and accessibility maps are generated. Kyrenia and Famagusta have a shortage of green space per capita; the amount of green space in these cities is less than 4 square meters per person. The factors affecting the accessibility and utilization of public spaces are discussed in order to suggest better solutions for urban planning.
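    A minimal sketch of the 300 m accessibility test described above, assuming building and green-space layers in a projected (metric) CRS; the file names and the simple pairwise-distance approach are illustrative, not the authors' actual GIS workflow.

    ```python
    import geopandas as gpd

    buildings = gpd.read_file("buildings.shp")     # hypothetical input layers
    green = gpd.read_file("green_spaces.shp")

    # keep only green spaces of at least 5000 m^2, per the EEA criterion cited above
    green = green[green.geometry.area >= 5000]

    # a building is "served" if any qualifying green space lies within 300 m (Euclidean)
    served = buildings.geometry.apply(lambda b: green.distance(b).min() <= 300)
    print(f"{served.mean():.1%} of buildings have green space within 300 m")
    ```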

  10. A Case of Contested Cremains Analyzed Through Metric and Chemical Comparison.

    PubMed

    Bartelink, Eric J; Sholts, Sabrina B; Milligan, Colleen F; Van Deest, Traci L; Wärmländer, Sebastian K T S

    2015-07-01

    Since the 1980s, cremation has become the fastest growing area of the U.S. funeral industry. At the same time, the number of litigations against funeral homes and cremation facilities has increased. Forensic anthropologists are often asked to determine whether the contents of an urn are actually cremated bone, and to address questions regarding the identity of the remains. This study uses both metric and chemical analyses for resolving a case of contested cremains. A cremains weight of 2021.8 g was predicted based on the decedent's reported stature and weight. However, the urn contents weighed 4173.5 g. The urn contents also contained material inconsistent with cremains (e.g., moist sediment, stones, ferrous metal). Analysis using XRD and SEM demonstrated that the urn contained thermally altered bone as well as inorganic material consistent with glass fiber cement. Although forensically challenging, cremains cases such as this one can be resolved using a multidisciplinary approach.

  11. Toward models of surgical procedures: analyzing a database of neurosurgical cases

    NASA Astrophysics Data System (ADS)

    Raimbault, Melanie; Morandi, Xavier; Jannin, Pierre

    2005-04-01

    Image-guided surgery systems can be improved by knowledge of surgical expertise. The more the surgeon and the system know about the surgical procedure to be performed beforehand, the easier it will be to plan and perform. The main objective of this paper is to introduce an approach for predicting surgical performance according to input variables related to the patient. This prediction is a first step towards the inclusion of surgical expertise in image-guided surgery systems. We previously proposed a generic model for describing surgical procedures in the specific context of multimodal neuronavigation. In this paper, we present the preliminary results of the analysis of a neurosurgical case database built in accordance with the generic model and including 159 surgical cases concerning right-handed patients. We defined two queries on this surgical case database to illustrate how it could be used to extract relevant and conclusive information about the surgical procedures: How does the anatomical localization of the target influence patient positioning? How does the anatomical localization of the target influence the progress of the steps involved in the surgical procedure? The mid-term goal of our research is to semi-automatically extract information, a priori models or scenarios of specific surgical procedures that can make the decision-making process easier, both for planning and for surgery.

  12. Use of a miniature diamond-anvil cell in high-pressure single-crystal neutron Laue diffraction.

    PubMed

    Binns, Jack; Kamenev, Konstantin V; McIntyre, Garry J; Moggach, Stephen A; Parsons, Simon

    2016-05-01

    The first high-pressure neutron diffraction study in a miniature diamond-anvil cell of a single crystal of size typical for X-ray diffraction is reported. This is made possible by modern Laue diffraction using a large solid-angle image-plate detector. An unexpected finding is that even reflections whose diffracted beams pass through the cell body are reliably observed, albeit with some attenuation. The cell body does limit the range of usable incident angles, but the crystallographic completeness for a high-symmetry unit cell is only slightly less than for a data collection without the cell. Data collections for two sizes of hexamine single crystals, with and without the pressure cell, and at 300 and 150 K, show that sample size and temperature are the most important factors that influence data quality. Despite the smaller crystal size and dominant parasitic scattering from the diamond-anvil cell, the data collected allow a full anisotropic refinement of hexamine with bond lengths and angles that agree with literature data within experimental error. This technique is shown to be suitable for low-symmetry crystals, and in these cases the transmission of diffracted beams through the cell body results in much higher completeness values than are possible with X-rays. The way is now open for joint X-ray and neutron studies on the same sample under identical conditions.

  13. Use of a miniature diamond-anvil cell in high-pressure single-crystal neutron Laue diffraction

    PubMed Central

    Binns, Jack; Kamenev, Konstantin V.; McIntyre, Garry J.; Moggach, Stephen A.; Parsons, Simon

    2016-01-01

    The first high-pressure neutron diffraction study in a miniature diamond-anvil cell of a single crystal of size typical for X-ray diffraction is reported. This is made possible by modern Laue diffraction using a large solid-angle image-plate detector. An unexpected finding is that even reflections whose diffracted beams pass through the cell body are reliably observed, albeit with some attenuation. The cell body does limit the range of usable incident angles, but the crystallographic completeness for a high-symmetry unit cell is only slightly less than for a data collection without the cell. Data collections for two sizes of hexamine single crystals, with and without the pressure cell, and at 300 and 150 K, show that sample size and temperature are the most important factors that influence data quality. Despite the smaller crystal size and dominant parasitic scattering from the diamond-anvil cell, the data collected allow a full anisotropic refinement of hexamine with bond lengths and angles that agree with literature data within experimental error. This technique is shown to be suitable for low-symmetry crystals, and in these cases the transmission of diffracted beams through the cell body results in much higher completeness values than are possible with X-rays. The way is now open for joint X-ray and neutron studies on the same sample under identical conditions. PMID:27158503

  14. Laue diffraction in one-dimensional photonic crystals: The way for phase-matched second-harmonic generation

    NASA Astrophysics Data System (ADS)

    Novikov, V. B.; Maydykovskiy, A. I.; Mantsyzov, B. I.; Murzina, T. V.

    2016-06-01

    Phase-matched second-harmonic generation (SHG) under Bragg diffraction in the Laue geometry in a one-dimensional photonic crystal (PhC) is studied theoretically and experimentally. We demonstrate that phase-matched SHG can be realized in a PhC by compensating the material dispersion of the PhC constituent layers of adjustable thickness. The second-order nonlinear susceptibility is introduced into the porous-quartz-based PhC by infiltrating it with sodium nitrite. We observed that two second-harmonic (SH) beams appear after passing through the PhC under the phase-matched process, corresponding to the transmission and diffraction angular directions. The appearance of phase-matched SHG is confirmed by a pronounced SH spectral dependence and a narrow SH angular distribution, with the FWHM of the SH peak approximately 3.5 times smaller than in the case of non-phase-matched SHG.

  15. Hard x-ray broad band Laue lenses (80-600 keV): building methods and performances

    NASA Astrophysics Data System (ADS)

    Virgilli, E.; Frontera, F.; Rosati, P.; Liccardo, V.; Squerzanti, S.; Carassiti, V.; Caroli, E.; Auricchio, N.; Stephen, J. B.

    2015-09-01

    We present the status of the LAUE project, devoted to developing the technology for building a Laue lens with a 20 m focal length for hard X-/soft gamma-ray astronomy (80-600 keV). The Laue lens is composed of bent crystals of Gallium Arsenide (GaAs, 220) and Germanium (Ge, 111); for the first time, the focusing property of bent crystals has been exploited for this field of application. We show preliminary results concerning the adhesive employed to fix the crystal tiles onto the lens support, the positioning accuracy obtained, and possible further improvements. The Laue lens petal that will be completed in a few months has a pass band of 80-300 keV and is a fraction of an entire Laue lens capable of focusing X-rays up to 600 keV, possibly extendable down to ~20-30 keV with suitable low-absorption crystal materials and focal length. The final goal is to develop a focusing optics that can improve sensitivity by two orders of magnitude over current telescopes in this energy band.
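
    As a rough illustration of how crystal tiles in such a lens map onto photon energies, the Bragg condition ties each ring radius to the energy it diffracts toward the focus. The sketch below is a minimal back-of-the-envelope calculation assuming the 20 m focal length quoted above and nominal Ge(111) and GaAs(220) d-spacings; the numbers are illustrative and not taken from the LAUE project's actual lens design.

```python
import numpy as np

HC_KEV_A = 12.398  # hc in keV * Angstrom

def ring_radius(energy_kev, d_spacing_a, focal_length_m=20.0):
    """Radial position (m) of a crystal ring that diffracts photons of a given
    energy toward the focus of a Laue lens (diffracted beam deviated by 2*theta)."""
    theta = np.arcsin(HC_KEV_A / (2.0 * d_spacing_a * energy_kev))  # Bragg angle
    return focal_length_m * np.tan(2.0 * theta)

# Nominal d-spacings in Angstrom: Ge(111) ~ 3.27, GaAs(220) ~ 2.00
for energy in (80, 150, 300, 600):
    print(f"{energy:3d} keV: r = {ring_radius(energy, 3.27):.2f} m (Ge 111), "
          f"{ring_radius(energy, 2.00):.2f} m (GaAs 220)")
```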

  16. Analyzing the United States Department of Transportation's Implementation Strategy for High Speed Rail: Three Case Studies

    NASA Astrophysics Data System (ADS)

    Robinson, Ryan

    High-speed rail (HSR) has become a major contributor to the transportation sector, with a strong push by the Obama Administration and the Department of Transportation to implement high-speed rail in the United States. High-speed rail is a costly transportation alternative that has the potential to displace some car and air travel while increasing energy security and environmental sustainability. This thesis examines the United States high-speed rail implementation strategy by comparing it to the implementation strategies of France, Japan, and Germany in a multiple case study under four main criteria of success: economic profitability, reliability, safety, and ridership. The analysis concludes with lessons to be taken from the case studies and applied to the United States strategy. It is important to understand that this project has not been established to create a comprehensive implementation plan for high-speed rail in the United States; rather, it is intended to assess the depth and quality of the current United States implementation strategy and make additional recommendations by comparing it with those of France, Japan, and Germany.

  17. A Case of Contested Cremains Analyzed Through Metric and Chemical Comparison.

    PubMed

    Bartelink, Eric J; Sholts, Sabrina B; Milligan, Colleen F; Van Deest, Traci L; Wärmländer, Sebastian K T S

    2015-07-01

    Since the 1980s, cremation has become the fastest growing area of the U.S. funeral industry. At the same time, the number of litigations against funeral homes and cremation facilities has increased. Forensic anthropologists are often asked to determine whether the contents of an urn are actually cremated bone, and to address questions regarding the identity of the remains. This study uses both metric and chemical analyses for resolving a case of contested cremains. A cremains weight of 2021.8 g was predicted based on the decedent's reported stature and weight. However, the urn contents weighed 4173.5 g. The urn contents also contained material inconsistent with cremains (e.g., moist sediment, stones, ferrous metal). Analysis using XRD and SEM demonstrated that the urn contained thermally altered bone as well as inorganic material consistent with glass fiber cement. Although forensically challenging, cremains cases such as this one can be resolved using a multidisciplinary approach. PMID:25754694

  18. Employing the Hilbert-Huang Transform to analyze observed natural complex signals: Calm wind meandering cases

    NASA Astrophysics Data System (ADS)

    Martins, Luis Gustavo Nogueira; Stefanello, Michel Baptistella; Degrazia, Gervásio Annes; Acevedo, Otávio Costa; Puhales, Franciano Scremin; Demarco, Giuliano; Mortarini, Luca; Anfossi, Domenico; Roberti, Débora Regina; Denardin, Felipe Costa; Maldaner, Silvana

    2016-11-01

    In this study we analyze observed natural complex signals employing Hilbert-Huang spectral analysis. Specifically, low-wind meandering meteorological data are decomposed into turbulent and non-turbulent components. These non-turbulent motions, responsible for the absence of a preferential direction of the horizontal wind, produce negative lobes in the meandering autocorrelation functions. The meandering characteristic time scales (meandering periods) are determined from the spectral peak provided by the Hilbert-Huang marginal spectrum. The magnitudes of the temperature and horizontal-wind meandering periods obtained agree with the results found from the best fit of heuristic meandering autocorrelation functions. The approach therefore provides a new procedure for evaluating meandering periods that does not rely on prescribed mathematical expressions for the observed meandering autocorrelation functions.
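
    The meandering period in the record above is read off the peak of the Hilbert-Huang marginal spectrum. The following is a minimal sketch of that workflow, assuming the third-party PyEMD package for the empirical mode decomposition and using a synthetic wind-component series in place of the actual observations.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # assumed third-party package for empirical mode decomposition

# Synthetic stand-in for an observed horizontal wind component, sampled at 1 Hz
fs = 1.0
t = np.arange(0.0, 3600.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * t / 600.0) + 0.3 * rng.standard_normal(t.size)  # ~600 s oscillation

imfs = EMD().emd(signal)  # intrinsic mode functions

# Hilbert marginal spectrum: accumulate amplitude over instantaneous frequency
freq_bins = np.linspace(1e-4, 0.05, 500)
marginal = np.zeros_like(freq_bins)
for imf in imfs:
    analytic = hilbert(imf)
    amplitude = np.abs(analytic)[:-1]
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2.0 * np.pi)
    idx = np.digitize(inst_freq, freq_bins)
    valid = (idx > 0) & (idx < freq_bins.size)
    np.add.at(marginal, idx[valid], amplitude[valid])

peak_freq = freq_bins[np.argmax(marginal)]
print(f"estimated meandering period ~ {1.0 / peak_freq:.0f} s")
```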

  19. Analyzing patient's waiting time in emergency & trauma department in public hospital - A case study

    NASA Astrophysics Data System (ADS)

    Roslan, Shazwa; Tahir, Herniza Md; Nordin, Noraimi Azlin Mohd; Zaharudin, Zati Aqmar

    2014-09-01

    The Emergency and Trauma Department (ETD) is an important element of a hospital. It provides medical services that operate 24 hours a day in most hospitals. However, the ETD is not exempt from overcrowding. Overcrowding occurs partly because the services provided by public hospitals are affordable, since they are funded by the government. It is reported that a patient attending the ETD must be treated within 90 minutes in order to achieve the Key Performance Indicator (KPI). However, due to overcrowding, most patients have to wait longer than the KPI standard. In this paper, patients' average waiting time is analyzed. Using the Chi-Square goodness-of-fit test, patients' inter-arrivals per hour are also investigated. In conclusion, Monday to Wednesday were identified as the days that exceed the KPI standard, while the Chi-Square goodness-of-fit test showed that the patients' inter-arrivals are independent and random.
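
    One standard way to check that arrivals are "independent and random", as concluded above, is a Chi-Square goodness-of-fit test of hourly arrival counts against a Poisson model (equivalently, exponential inter-arrival times). The sketch below uses scipy on made-up hourly counts; the data, bin choices and rate are illustrative only, not the hospital's records.

```python
import numpy as np
from scipy import stats

# Hypothetical counts of patient arrivals in 200 one-hour windows
rng = np.random.default_rng(1)
counts = rng.poisson(lam=7.5, size=200)
lam_hat = counts.mean()  # fitted Poisson rate

# Observed vs expected frequencies for 0..14 arrivals, pooling the tail into the last bin
ks = np.arange(0, 15)
observed = np.array([(counts == k).sum() for k in ks], dtype=float)
expected = stats.poisson.pmf(ks, lam_hat) * counts.size
observed[-1] += (counts >= 15).sum()
expected[-1] += (1.0 - stats.poisson.cdf(14, lam_hat)) * counts.size

# One extra degree of freedom is lost for the estimated rate parameter
chi2, p = stats.chisquare(observed, expected, ddof=1)
verdict = "not rejected" if p > 0.05 else "rejected"
print(f"chi2 = {chi2:.2f}, p = {p:.3f} -> Poisson (random) arrivals {verdict} at the 5% level")
```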

  20. [Analyzing spatial-temporal dynamics of the ecological niche: a marten (Martes martes) population case study].

    PubMed

    Puzachenko, Iu G; Zheltukhin, A S; Sandlerskiĭ, R B

    2010-01-01

    The potential of discriminant analysis is demonstrated in a case study of the common marten (Martes martes L., 1758) ecological niche within the Central Forest Reserve and its buffer zone. The analysis aims to identify how the probability of encountering a marten's footprints along a walking route depends on relief and other environmental parameters derived from remote sensing. Analyses performed individually for each of eleven months from a three-year observation period revealed that the species' spatial distribution pattern and the strength of its association with the environment depend, to a large extent, on weather conditions. In general, associations with the environment increase under unfavorable conditions. Methods are suggested that integrate the outcomes of the monthly analyses into a general map of habitat types. The technique presented has wide applicability in studying the ecology of populations and in solving problems of practical ecology.
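
    For readers unfamiliar with the approach, the sketch below shows the general shape of such a discriminant analysis in Python: environmental covariates as predictors and footprint presence/absence as the class label. The covariates and data are hypothetical; the original study's variables, software and model details are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
n = 500

# Hypothetical environmental covariates for route segments
elevation = rng.normal(250.0, 40.0, n)   # m, e.g. from a digital elevation model
canopy = rng.uniform(0.0, 1.0, n)        # canopy fraction from remote sensing
snow_depth = rng.normal(30.0, 10.0, n)   # cm
X = np.column_stack([elevation, canopy, snow_depth])

# Hypothetical presence/absence of marten footprints on each segment
logit = 0.01 * (elevation - 250.0) + 2.0 * (canopy - 0.5) - 0.02 * (snow_depth - 30.0)
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("discriminant coefficients:", lda.coef_.ravel())
print("mean predicted encounter probability:", lda.predict_proba(X)[:, 1].mean().round(3))
```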

  1. Sex beyond species: the first genetically analyzed case of intergeneric fertile hybridization in pinnipeds.

    PubMed

    Franco-Trecu, Valentina; Abud, Carolina; Feijoo, Matías; Kloetzer, Guillermo; Casacuberta, Marcelo; Costa-Urrutia, Paula

    2016-01-01

    A species, according to the biological concept, is a natural group of potentially interbreeding individuals isolated by diverse mechanisms. Hybridization is considered the production of offspring resulting from the interbreeding of two genetically distinct taxa. It has been documented in over 10% of wild animals, and in at least 34 cases for Arctic marine mammals. In otariids, intergeneric hybridization has been reported, though without genetic confirmation or evidence of fertile offspring. In this study, we report the finding of a hybrid adult female between a South American fur seal (Arctocephalus australis) and a South American sea lion (Otaria byronia), and its offspring, a male pup, in Uruguay. Further, based on morphological constraints and breeding seasons, sex-biased hybridization between the two species is hypothesized. Morphological and genetic (nuclear and mitochondrial) results confirm the hybrid nature of the female-pup pair. We discuss a genetic dilution effect, considering that other hybridization events must be occurring, and how isolation mechanisms could be circumvented. Moreover, the results obtained from stable isotope analysis suggest that feeding habits may be a maternally transmitted trait, leading us to consider broader issues regarding hybridization as an evolutionary innovation phenomenon.

  2. Sex beyond species: the first genetically analyzed case of intergeneric fertile hybridization in pinnipeds.

    PubMed

    Franco-Trecu, Valentina; Abud, Carolina; Feijoo, Matías; Kloetzer, Guillermo; Casacuberta, Marcelo; Costa-Urrutia, Paula

    2016-01-01

    A species, according to the biological concept, is a natural group of potentially interbreeding individuals isolated by diverse mechanisms. Hybridization is considered the production of offspring resulting from the interbreeding of two genetically distinct taxa. It has been documented in over 10% of wild animals, and in at least 34 cases for Arctic marine mammals. In otariids, intergeneric hybridization has been reported, though without genetic confirmation or evidence of fertile offspring. In this study, we report the finding of a hybrid adult female between a South American fur seal (Arctocephalus australis) and a South American sea lion (Otaria byronia), and its offspring, a male pup, in Uruguay. Further, based on morphological constraints and breeding seasons, sex-biased hybridization between the two species is hypothesized. Morphological and genetic (nuclear and mitochondrial) results confirm the hybrid nature of the female-pup pair. We discuss a genetic dilution effect, considering that other hybridization events must be occurring, and how isolation mechanisms could be circumvented. Moreover, the results obtained from stable isotope analysis suggest that feeding habits may be a maternally transmitted trait, leading us to consider broader issues regarding hybridization as an evolutionary innovation phenomenon. PMID:26994861

  3. [Attention deficit hyperactivity disorder analyzed with array comparative genome hybridization method. Case report].

    PubMed

    Duga, Balázs; Czakó, Márta; Komlósi, Katalin; Hadzsiev, Kinga; Sümegi, Katalin; Kisfali, Péter; Melegh, Márton; Melegh, Béla

    2014-10-01

    One of the most common psychiatric disorders of childhood is attention deficit hyperactivity disorder, which affects 5-6% of children worldwide. Symptoms include attention deficit, hyperactivity, forgetfulness and weak impulse control. The exact mechanism behind the development of the disease is unknown; however, current data suggest that a strong genetic background is responsible, which explains the frequent occurrence within families. Literature data show that copy number variations are very common in patients with attention deficit hyperactivity disorder. The authors present a patient with attention deficit hyperactivity disorder who proved to have two approximately 400 kb heterozygous microduplications at the 6p25.2 and 15q13.3 chromosomal regions, detected by array comparative genomic hybridization. Both duplications affect genes (6p25.2: SLC22A23; 15q13.3: CHRNA7) that may play a role in the development of attention deficit hyperactivity disorder. This case illustrates the wide spectrum of indications for the array comparative genomic hybridization method.

  4. Focusing effect of bent GaAs crystals for γ-ray Laue lenses: Monte Carlo and experimental results

    NASA Astrophysics Data System (ADS)

    Virgilli, E.; Frontera, F.; Rosati, P.; Bonnini, E.; Buffagni, E.; Ferrari, C.; Stephen, J. B.; Caroli, E.; Auricchio, N.; Basili, A.; Silvestri, S.

    2016-02-01

    We report results on the observation of the focusing effect from the (220) planes of Gallium Arsenide (GaAs) crystals. We have compared the experimental results with Monte Carlo simulations of the focusing capability of GaAs tiles performed with a dedicated ray tracer. The GaAs tiles were bent using a lapping process developed at CNR/IMEM - Parma (Italy) in the framework of the LAUE project, funded by ASI, dedicated to building a broad-band Laue lens prototype for astrophysical applications in the hard X-/soft gamma-ray energy range (80-600 keV). We present and discuss the results obtained from their characterization, mainly in terms of focusing capability. Bent crystals will significantly increase the signal-to-noise ratio of a telescope based on a Laue lens, consequently leading to an unprecedented enhancement of sensitivity with respect to present non-focusing instrumentation.

  5. Unveiling Physical Processes in Type Ia Supernovae with a Laue Lens Telescope

    NASA Astrophysics Data System (ADS)

    Barriere, Nicolas; Boggs, S. E.; Tomsick, J. A.

    2010-03-01

    Despite their use as standard candles in cosmological studies, many fundamental aspects of Type Ia supernovae (SNIa) remain uncertain, including the progenitor systems, the explosion trigger and the detailed nuclear burning physics. The most popular model involves an accreting CO white dwarf undergoing a thermonuclear runaway, converting a substantial fraction of the stellar mass to 56Ni. The radioactive decay chain 56Ni -> 56Co -> 56Fe powers the SNIa optical light curve and produces several gamma-ray lines, including bright lines at 158 keV and 847 keV. Observations of the spectrum and light curve of any of these lines would be extremely valuable in constraining and discriminating between the currently competing models of SNIa. However, these lines are weak in flux and evolve relatively quickly by gamma-ray standards: to be able to study a handful of SNIa per year, the required sensitivity is about 10^-6 ph/cm^2/s at 847 keV and 10^-7 ph/cm^2/s at 158 keV for 3% broadened lines, and these levels must be achieved in 10^5 s. A Laue lens telescope offers a novel and powerful method of achieving these extremely challenging requirements. In this paper, we briefly introduce the Laue lens principle and state-of-the-art technologies, and we demonstrate how a space-borne telescope based on a Laue lens focusing onto a Compton camera could bring about the long-awaited observational clues leading to a better understanding of SNIa physics.

  6. Polarimetric performance of a Laue lens gamma-ray CdZnTe focal plane prototype

    SciTech Connect

    Curado da Silva, R. M.; Caroli, E.; Stephen, J. B.; Schiavone, F.; Donati, A.; Ventura, G.; Pisa, A.; Auricchio, N.; Frontera, F.; Del Sordo, S.; Honkimaeki, V.; Trindade, A. M. F.

    2008-10-15

    A gamma-ray telescope mission concept [gamma-ray imager (GRI)] based on Laue focusing techniques has been proposed in reply to the European Space Agency call for mission ideas within the framework of the next decade planning (Cosmic Vision 2015-2025). In order to optimize the design of a focal plane for this satellite mission, a CdZnTe detector prototype has been tested at the European Synchrotron Radiation Facility under an ~100% polarized gamma-ray beam. The spectroscopic, imaging, and timing performances were studied and in particular its potential as a polarimeter was evaluated. Polarization has been recognized as being a very important observational parameter in high energy astrophysics (>100 keV) and therefore this capability has been specifically included as part of the GRI mission proposal. The prototype detector tested was a 5 mm thick CdZnTe array with an 11 x 11 active pixel matrix (pixel area of 2.5 x 2.5 mm^2). The detector was irradiated by a monochromatic linearly polarized beam with a spot diameter of about 0.5 mm over the energy range between 150 and 750 keV. Polarimetric Q factors of 0.35 and a double-event relative detection efficiency of 20% were obtained. Further measurements were performed with a copper Laue monochromator crystal placed between the beam and the detector prototype. In this configuration we have demonstrated that a polarized beam does not change its polarization level and direction after undergoing a small-angle (<1 deg) Laue diffraction inside a crystal.
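
    In Compton polarimetry of this kind, the polarimetric response is usually quantified by fitting the azimuthal distribution of double events with a modulation curve; the Q factor is the modulation measured for a fully polarized beam. The sketch below illustrates such a fit with scipy on synthetic counts; it is a generic modulation-curve fit, not the analysis pipeline used for the prototype described above.

```python
import numpy as np
from scipy.optimize import curve_fit

def modulation(phi, amplitude, mu, phi0):
    """Azimuthal distribution of Compton double events for a polarized beam."""
    return amplitude * (1.0 + mu * np.cos(2.0 * (phi - phi0)))

# Synthetic 12-bin azimuthal histogram for a ~100% polarized beam with Q ~ 0.35
rng = np.random.default_rng(3)
phi_edges = np.linspace(0.0, 2.0 * np.pi, 13)
phi_mid = 0.5 * (phi_edges[:-1] + phi_edges[1:])
counts = rng.poisson(modulation(phi_mid, 500.0, 0.35, np.pi / 3.0))

popt, _ = curve_fit(modulation, phi_mid, counts,
                    p0=[counts.mean(), 0.2, 0.0],
                    sigma=np.sqrt(np.maximum(counts, 1)))
amplitude_fit, mu_fit, phi0_fit = popt
print(f"fitted modulation mu = {mu_fit:.3f} (equals the Q factor for a fully polarized beam)")
print(f"modulation phase     = {np.degrees(phi0_fit) % 180:.1f} deg (related to the polarization angle)")
```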

  7. A social network approach to analyzing water governance: The case of the Mkindo catchment, Tanzania

    NASA Astrophysics Data System (ADS)

    Stein, C.; Ernstson, H.; Barron, J.

    The governance dimension of water resources management is just as complex and interconnected as the hydrological processes it aims to influence. There is an increasing need (i) to understand the multi-stakeholder governance arrangements that emerge from the cross-scale nature and multifunctional role of water; and (ii) to develop appropriate research tools to analyze them. In this study we demonstrate how social network analysis (SNA), a well-established technique from sociology and organizational research, can be used to empirically map collaborative social networks between actors that either directly or indirectly influence water flows in the Mkindo catchment in Tanzania. We assess how these collaborative social networks affect the capacity to govern water in this particular catchment and explore how knowledge about such networks can be used to facilitate more effective or adaptive water resources management. The study is novel in that it applies social network analysis not only to organizations influencing blue water (the liquid water in rivers, lakes and aquifers) but also green water (the soil moisture used by plants). Using a questionnaire and semi-structured interviews, we generated social network data for 70 organizations, ranging from local resource users and village leaders to higher-level governmental agencies, universities and NGOs. Results show that no organization coordinates the various land- and water-related activities at the catchment scale. Furthermore, an important result is that village leaders play a crucial role in linking otherwise disconnected actors, but they are not adequately integrated into the formal water governance system. Water user associations (WUAs) are in the process of being established and could bring together actors currently not part of the formal governance system. However, the establishment of WUAs seems to follow a top-down approach that does not consider the existing informal organization of water users that is revealed
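
    Identifying bridging actors such as the village leaders mentioned above is typically done with centrality measures computed on the collaboration network. The sketch below shows the basic idea with networkx on a toy network; the actors and ties are invented and do not represent the Mkindo survey data.

```python
import networkx as nx

# Toy collaboration network: each edge is a reported working tie between two actors
edges = [
    ("farmer_group_A", "village_leader_1"), ("farmer_group_B", "village_leader_1"),
    ("village_leader_1", "district_water_office"), ("district_water_office", "basin_authority"),
    ("ngo_irrigation", "district_water_office"), ("farmer_group_C", "village_leader_2"),
    ("village_leader_2", "district_water_office"), ("university", "basin_authority"),
]
G = nx.Graph(edges)

# High betweenness flags actors that broker between otherwise disconnected parts of the network
betweenness = nx.betweenness_centrality(G)
for actor, score in sorted(betweenness.items(), key=lambda item: -item[1])[:5]:
    print(f"{actor:25s} betweenness = {score:.2f}")
```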

  8. Using climate response functions in analyzing electricity production variables. A case study from Norway.

    NASA Astrophysics Data System (ADS)

    Tøfte, Lena S.; Martino, Sara; Mo, Birger

    2016-04-01

    representation of hydropower is included and total hydropower production for each area is calculated, and the production is distributed among all available plants within each area. During simulation, the demand is affected by prices and temperatures. Six different infrastructure scenarios of wind and power line development are analyzed. The analyses are done by running EMPS, calibrated for today's situation, for 11 x 11 x 8 different combinations of altered weather variables (temperature, precipitation and wind) describing different climate change scenarios, finding the climate response function for every EMPS variable related to electricity production, such as prices and income, energy balances (supply, consumption and trade), overflow losses, probability of curtailment, etc.

  9. Distortion Measurement of Multi-Finger Transistor Using Split Higher-Order Laue Zone Lines Analysis

    NASA Astrophysics Data System (ADS)

    Uesugi, Fumihiko; Yamazaki, Takashi; Kuramochi, Koji; Hashimoto, Iwao; Kojima, Kenji; Takeno, Shiro

    2008-05-01

    Distortion in a region close to the interface between different materials in an LSI device is measured using a convergent-beam electron diffraction (CBED) pattern. Split higher-order Laue zone (HOLZ) lines emerge in the CBED pattern, indicating a strained region close to the interface. A calculation of the split HOLZ lines, based on the kinematical approximation together with a deformation model of the sample, reproduces the experimental results well. Split HOLZ line analysis using the present method shows that the distortion depends on the external form of the multi-finger transistor.

  10. Focusing performance of a multilayer Laue lens with layer placement error described by dynamical diffraction theory.

    PubMed

    Hu, Lingfei; Chang, Guangcai; Liu, Peng; Zhou, Liang

    2015-07-01

    The multilayer Laue lens (MLL) is essentially a linear zone plate with a large aspect ratio, which can theoretically focus hard X-rays to well below 1 nm with high efficiency when ideal structures are used. However, the focusing performance of an MLL depends heavily on the quality of the layers, especially the layer placement error that always exists in real MLLs. Here, a dynamical modeling approach, based on coupled wave theory, is proposed to study the focusing performance of an MLL with layer placement error. Simulation results show that this method can be applied to various forms of layer placement error.
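
    An MLL is deposited so that its layer boundaries follow the Fresnel zone-plate law; layer placement error is then the deviation of the actual boundaries from those ideal positions. The sketch below, a simplified illustration rather than the coupled-wave model of the record above, computes ideal zone boundaries and perturbs them with a random drift to mimic deposition error; all parameter values are illustrative.

```python
import numpy as np

def ideal_zone_positions(n_zones, wavelength_nm, focal_length_mm):
    """Zone-boundary positions x_n (nm) from the Fresnel zone-plate law:
    x_n^2 = n * lambda * f + (n * lambda / 2)^2."""
    n = np.arange(1, n_zones + 1)
    f_nm = focal_length_mm * 1e6
    return np.sqrt(n * wavelength_nm * f_nm + (n * wavelength_nm / 2.0) ** 2)

# Illustrative parameters: ~12 keV photons (lambda ~ 0.103 nm), 2 mm focal length
x_ideal = ideal_zone_positions(5000, 0.103, 2.0)

# Mimic layer placement error as an accumulating random thickness error per layer
rng = np.random.default_rng(4)
x_real = x_ideal + np.cumsum(rng.normal(0.0, 0.05, x_ideal.size))  # 0.05 nm rms per layer

print(f"outermost zone width : {x_ideal[-1] - x_ideal[-2]:.2f} nm")
print(f"max placement error  : {np.max(np.abs(x_real - x_ideal)):.2f} nm")
```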

  11. CONDENSED MATTER: STRUCTURE, MECHANICAL AND THERMAL PROPERTIES: An Accurate Image Simulation Method for High-Order Laue Zone Effects

    NASA Astrophysics Data System (ADS)

    Cai, Can-Ying; Zeng, Song-Jun; Liu, Hong-Rong; Yang, Qi-Bin

    2008-05-01

    A completely different formulation for simulating high-order Laue zone (HOLZ) diffraction is derived, referred to here as the Taylor series (TS) method. To check the validity and accuracy of the TS method, we take a polyvinylidene fluoride (PVDF) crystal as an example and calculate the exit wavefunction by both the conventional multi-slice (CMS) method and the TS method. The calculated results show that the TS method is much more accurate than the CMS method and is independent of the slice thickness. Moreover, the pure first-order Laue zone wavefunction obtained by the TS method reflects the main potential distribution of the first reciprocal-lattice plane.

  12. Oscillation Laue Analysis (OLA) - A new crystal structure determination method for mineral physics

    NASA Astrophysics Data System (ADS)

    Dera, P.; Downs, R. T.; Liermann, H.; Yang, W.

    2006-12-01

    We present a new approach for the collection and interpretation of polychromatic-radiation diffraction images, called Oscillation Laue Analysis (OLA), which combines the capabilities of single-crystal X-ray diffraction and X-ray absorption spectroscopy. The method is based on smearing Laue reflections into variable-energy curves by slight oscillation of the crystal during the exposure. The OLA method allows simple and precise peak-energy determination and harmonic-overlap deconvolution through measurement of the X-ray attenuation coefficients of metal foils inserted into the incident beam. The method provides an easy and reliable way of determining unit cells of unknown single-crystal phases, yields multiple monochromatic structure-factor sets covering a wide range of energies, which can be used for Multiple Anomalous Dispersion (MAD) based structure solution or for enhancing the contrast between neighboring elements in the periodic table, and allows the routine ab initio solution of unknown structures. The results of our first experiments, performed at Sector 16 of the Advanced Photon Source and aimed at determining the compression mechanism of eskolaite (Cr2O3), will be presented and discussed in the context of applying the new approach to micromineralogy, characterization of meteoritic samples, and high-pressure mineral physics.
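
    The link between a Laue reflection and the photon energy it records is Bragg's law: once the crystal orientation fixes the angle between a given (hkl) plane and the incident beam, the diffracted energy follows from the d-spacing. The short sketch below shows that conversion; the unit cell and angle are illustrative numbers, not values from the study above.

```python
import math

HC_KEV_A = 12.398  # hc in keV * Angstrom

def d_spacing_cubic(h, k, l, a):
    """d-spacing (Angstrom) of the (hkl) plane in a cubic cell of edge a."""
    return a / math.sqrt(h * h + k * k + l * l)

def laue_reflection_energy(h, k, l, a, bragg_angle_deg):
    """Photon energy (keV) selected by the (hkl) plane at a given Bragg angle."""
    d = d_spacing_cubic(h, k, l, a)
    return HC_KEV_A / (2.0 * d * math.sin(math.radians(bragg_angle_deg)))

# Illustrative example: (220) reflection of a cubic cell with a = 4.0 Angstrom at theta = 12 deg
print(f"{laue_reflection_energy(2, 2, 0, 4.0, 12.0):.1f} keV")
```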

  13. In-situ characterization of highly reversible phase transformation by synchrotron X-ray Laue microdiffraction

    NASA Astrophysics Data System (ADS)

    Chen, Xian; Tamura, Nobumichi; MacDowell, Alastair; James, Richard D.

    2016-05-01

    The alloy Cu25Au30Zn45 undergoes a huge first-order phase transformation (6% strain) and shows high reversibility under thermal cycling and an unusual martensitic microstructure, in sharp contrast to nearby compositions. This alloy was discovered by systematically tuning the composition so that its lattice parameters satisfy the cofactor conditions (i.e., the kinematic conditions of compatibility between phases). It was conjectured that satisfaction of these conditions is responsible for the enhanced reversibility as well as the observed unusual fluid-like microstructure during transformation, but so far there has been no direct evidence confirming that the observed microstructures are those predicted by the cofactor conditions. To verify this hypothesis, we use synchrotron X-ray Laue microdiffraction to measure the orientations and structural parameters of variants and phases near the austenite/martensite interface. Areas containing both austenite and multiple variants of martensite are scanned by micro-Laue diffraction. The cofactor conditions are examined from the kinematic relation of lattice vectors across the interface. The continuity condition at the interface is precisely verified from the corresponding lattice vectors of the two phases.

  14. Validation of a combined autosomal/Y-chromosomal STR approach for analyzing typical biological stains in sexual-assault cases.

    PubMed

    Purps, Josephine; Geppert, Maria; Nagy, Marion; Roewer, Lutz

    2015-11-01

    DNA testing is an established part of the investigation and prosecution of sexual assault. The primary purpose of DNA evidence is to identify a suspect and/or to demonstrate sexual contact. However, due to highly uneven proportions of female and male DNA in typical stains, routine autosomal analysis often fails to detect the DNA of the assailant. To evaluate the forensic efficiency of the combined application of autosomal and Y-chromosomal short tandem repeat (STR) markers, we present a large retrospective casework study of probative evidence collected in sexual-assault cases. We investigated up to 39 STR markers by testing combinations of the 16-locus NGMSElect kit with both the 23-locus PowerPlex Y23 and the 17-locus Yfiler kit. Using this dual approach we analyzed DNA extracts from 2077 biological stains collected in 287 cases over 30 months. To assess the outcome of the combined approach in comparison to stand-alone autosomal analysis we evaluated informative DNA profiles. Our investigation revealed that Y-STR analysis added up to 21% additional, highly informative (complete, single-source) profiles to the set of reportable autosomal STR profiles for typical stains collected in sexual-assault cases. Detection of multiple male contributors was approximately three times more likely with Y-chromosomal profiling than with autosomal STR profiling. In summary, 1/10 cases would have remained inconclusive (and could have been dismissed) if Y-STR analysis had been omitted from DNA profiling in sexual-assault cases.

  15. Real-time microstructure imaging by Laue microdiffraction: A sample application in laser 3D printed Ni-based superalloys

    NASA Astrophysics Data System (ADS)

    Zhou, Guangni; Zhu, Wenxin; Shen, Hao; Li, Yao; Zhang, Anfeng; Tamura, Nobumichi; Chen, Kai

    2016-06-01

    Synchrotron-based Laue microdiffraction has been widely applied to characterize the local crystal structure, orientation, and defects of inhomogeneous polycrystalline solids by raster scanning them under a micro/nano focused polychromatic X-ray probe. In a typical experiment, a large number of Laue diffraction patterns are collected, requiring novel data reduction and analysis approaches, especially for researchers who do not have access to fast parallel computing capabilities. In this article, a novel approach is developed by plotting the distributions of the average recorded intensity and the average filtered intensity of the Laue patterns. Visualization of the characteristic microstructural features is realized in real time during data collection. As an example, this method is applied to image key features such as microcracks, carbides, heat affected zone, and dendrites in a laser assisted 3D printed Ni-based superalloy, at a speed much faster than data collection. Such analytical approach remains valid for a wide range of crystalline solids, and therefore extends the application range of the Laue microdiffraction technique to problems where real-time decision-making during experiment is crucial (for instance time-resolved non-reversible experiments).
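
    The real-time imaging approach above reduces each Laue pattern to two scalars, the average recorded intensity and the average intensity after a filtering step that emphasizes the sharp Laue peaks, and maps them over the raster scan. The sketch below shows this reduction with numpy/scipy on randomly generated frames; the file handling and the exact filter used by the authors are not reproduced, and the median-subtraction here is just one simple choice.

```python
import numpy as np
from scipy.ndimage import median_filter

def reduce_pattern(frame, filter_size=5):
    """Reduce one detector frame to (average intensity, average filtered intensity).
    The filtered value subtracts a median-smoothed background so that sharp
    Laue peaks dominate; this is an illustrative filter choice."""
    background = median_filter(frame, size=filter_size)
    peaks = np.clip(frame - background, 0.0, None)
    return frame.mean(), peaks.mean()

# Synthetic 20 x 30 raster scan of small detector frames
rng = np.random.default_rng(5)
ny, nx = 20, 30
avg_map = np.zeros((ny, nx))
flt_map = np.zeros((ny, nx))
for iy in range(ny):
    for ix in range(nx):
        frame = rng.poisson(50.0, size=(64, 64)).astype(float)  # stand-in for a Laue pattern
        avg_map[iy, ix], flt_map[iy, ix] = reduce_pattern(frame)

print("average-intensity map range :", avg_map.min().round(2), "-", avg_map.max().round(2))
print("filtered-intensity map range:", flt_map.min().round(2), "-", flt_map.max().round(2))
```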

  16. Real-time microstructure imaging by Laue microdiffraction: A sample application in laser 3D printed Ni-based superalloys.

    PubMed

    Zhou, Guangni; Zhu, Wenxin; Shen, Hao; Li, Yao; Zhang, Anfeng; Tamura, Nobumichi; Chen, Kai

    2016-01-01

    Synchrotron-based Laue microdiffraction has been widely applied to characterize the local crystal structure, orientation, and defects of inhomogeneous polycrystalline solids by raster scanning them under a micro/nano focused polychromatic X-ray probe. In a typical experiment, a large number of Laue diffraction patterns are collected, requiring novel data reduction and analysis approaches, especially for researchers who do not have access to fast parallel computing capabilities. In this article, a novel approach is developed by plotting the distributions of the average recorded intensity and the average filtered intensity of the Laue patterns. Visualization of the characteristic microstructural features is realized in real time during data collection. As an example, this method is applied to image key features such as microcracks, carbides, heat affected zone, and dendrites in a laser assisted 3D printed Ni-based superalloy, at a speed much faster than data collection. Such analytical approach remains valid for a wide range of crystalline solids, and therefore extends the application range of the Laue microdiffraction technique to problems where real-time decision-making during experiment is crucial (for instance time-resolved non-reversible experiments). PMID:27302087

  17. Real-time microstructure imaging by Laue microdiffraction: A sample application in laser 3D printed Ni-based superalloys

    PubMed Central

    Zhou, Guangni; Zhu, Wenxin; Shen, Hao; Li, Yao; Zhang, Anfeng; Tamura, Nobumichi; Chen, Kai

    2016-01-01

    Synchrotron-based Laue microdiffraction has been widely applied to characterize the local crystal structure, orientation, and defects of inhomogeneous polycrystalline solids by raster scanning them under a micro/nano focused polychromatic X-ray probe. In a typical experiment, a large number of Laue diffraction patterns are collected, requiring novel data reduction and analysis approaches, especially for researchers who do not have access to fast parallel computing capabilities. In this article, a novel approach is developed by plotting the distributions of the average recorded intensity and the average filtered intensity of the Laue patterns. Visualization of the characteristic microstructural features is realized in real time during data collection. As an example, this method is applied to image key features such as microcracks, carbides, heat affected zone, and dendrites in a laser assisted 3D printed Ni-based superalloy, at a speed much faster than data collection. Such analytical approach remains valid for a wide range of crystalline solids, and therefore extends the application range of the Laue microdiffraction technique to problems where real-time decision-making during experiment is crucial (for instance time-resolved non-reversible experiments). PMID:27302087

  18. The development of Laue techniques for single-pulse diffraction of chemical complexes: time-resolved Laue diffraction on a binuclear rhodium metal-organic complex

    PubMed Central

    Makal, Anna; Trzop, Elzbieta; Sokolow, Jesse; Kalinowski, Jaroslaw; Benedict, Jason; Coppens, Philip

    2011-01-01

    A modified Laue method is shown to produce excited-state structures at atomic resolution of a quality competitive with those from monochromatic experiments. The much faster data collection allows the use of only one or a few X-ray pulses per data frame, which minimizes crystal damage caused by laser exposure of the samples and optimizes the attainable time resolution. The method has been applied to crystals of the α-modification of Rh2(μ-PNP)2(PNP)2 (BPh4)2 [PNP = CH3N(P(OCH3)2)2, Ph = phenyl]. The experimental results show a shortening of the Rh—Rh distance in the organometallic complex of 0.136 (8) Å on excitation and are quantitatively supported by quantum-mechanical (QM)/molecular-mechanics (MM) theoretical calculations which take into account the confining effect of the crystal environment, but not by theoretical results on the isolated complex, demonstrating the defining effect of the crystal matrix. PMID:21694470

  19. Conditioning adaptive combination of P-values method to analyze case-parent trios with or without population controls.

    PubMed

    Lin, Wan-Yu; Liang, Yun-Chieh

    2016-01-01

    Detection of rare causal variants can help uncover the etiology of complex diseases. Recruiting case-parent trios is a popular study design in family-based studies. If researchers can obtain data from population controls, utilizing them in trio analyses can improve the power of methods. The transmission disequilibrium test (TDT) is a well-known method to analyze case-parent trio data. It has been extended to rare-variant association testing (abbreviated as "rvTDT"), with the flexibility to incorporate population controls. The rvTDT method is robust to population stratification. However, power loss may occur in the conditioning process. Here we propose a "conditioning adaptive combination of P-values method" (abbreviated as "conADA"), to analyze trios with/without unrelated controls. By first truncating the variants with larger P-values, we decrease the vulnerability of conADA to the inclusion of neutral variants. Moreover, because the test statistic is developed by conditioning on parental genotypes, conADA generates valid statistical inference in the presence of population stratification. With regard to statistical methods for next-generation sequencing data analyses, validity may be hampered by population stratification, whereas power may be affected by the inclusion of neutral variants. We recommend conADA for its robustness to these two factors (population stratification and the inclusion of neutral variants). PMID:27341039
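
    The core of an "adaptive combination of P-values" approach is to rank the per-variant P-values, truncate away the larger (likely neutral) ones, and combine the remainder into a single statistic whose significance is calibrated by permutation or conditioning. The sketch below illustrates that generic idea with a Fisher-type combination over the k smallest P-values; it is not the conADA statistic itself, whose conditioning on parental genotypes is described above, and the permutation null here is a crude uniform stand-in.

```python
import numpy as np

def truncated_fisher(p_values, k):
    """Combine the k smallest per-variant P-values with Fisher's method.
    Truncation reduces the influence of neutral variants with large P-values."""
    p_sorted = np.sort(np.asarray(p_values))[:k]
    return -2.0 * np.sum(np.log(p_sorted))

def adaptive_combination(p_values, k_grid=(1, 2, 5, 10), n_perm=2000, seed=6):
    """Adaptively pick the truncation level: take the largest truncated statistic
    over a grid of k and calibrate it by permutation so that the adaptive
    selection does not inflate the type I error. Illustrative only."""
    rng = np.random.default_rng(seed)
    p_values = np.asarray(p_values)
    observed = max(truncated_fisher(p_values, k) for k in k_grid)
    null = np.empty(n_perm)
    for b in range(n_perm):
        p_null = rng.uniform(size=p_values.size)  # stand-in for P-values under a permuted null
        null[b] = max(truncated_fisher(p_null, k) for k in k_grid)
    return (1 + np.sum(null >= observed)) / (1 + n_perm)

# Hypothetical per-variant P-values: a few signals among mostly neutral variants
p = np.concatenate([[1e-4, 5e-4, 2e-3], np.random.default_rng(7).uniform(size=47)])
print("adaptive combined P-value:", adaptive_combination(p))
```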

  20. Conditioning adaptive combination of P-values method to analyze case-parent trios with or without population controls

    PubMed Central

    Lin, Wan-Yu; Liang, Yun-Chieh

    2016-01-01

    Detection of rare causal variants can help uncover the etiology of complex diseases. Recruiting case-parent trios is a popular study design in family-based studies. If researchers can obtain data from population controls, utilizing them in trio analyses can improve the power of methods. The transmission disequilibrium test (TDT) is a well-known method to analyze case-parent trio data. It has been extended to rare-variant association testing (abbreviated as “rvTDT”), with the flexibility to incorporate population controls. The rvTDT method is robust to population stratification. However, power loss may occur in the conditioning process. Here we propose a “conditioning adaptive combination of P-values method” (abbreviated as “conADA”), to analyze trios with/without unrelated controls. By first truncating the variants with larger P-values, we decrease the vulnerability of conADA to the inclusion of neutral variants. Moreover, because the test statistic is developed by conditioning on parental genotypes, conADA generates valid statistical inference in the presence of population stratification. With regard to statistical methods for next-generation sequencing data analyses, validity may be hampered by population stratification, whereas power may be affected by the inclusion of neutral variants. We recommend conADA for its robustness to these two factors (population stratification and the inclusion of neutral variants). PMID:27341039

  1. INTERFACE RESIDUAL STRESSES IN DENTAL ZIRCONIA USING LAUE MICRO-DIFFRACTION

    SciTech Connect

    Bale, H. A.; Tamura, N.; Coelho, P.G.; Hanan, J. C.

    2009-01-01

    Due to their aesthetic value and high compressive strength, ceramics have recently been employed by dentists as restoration materials. Among the ceramic materials, zirconia provides high toughness and crack-resistant characteristics. Residual stresses develop during processing due to factors including grain anisotropy and thermal-expansion coefficient mismatch. In the present study, polychromatic X-ray (Laue) micro-diffraction provided grain orientations and residual stresses on a clinically relevant zirconia model ceramic disk. A 0.5 mm x 0.024 mm region of the zirconia was examined on a 500 nm scale for residual stresses using a focused polychromatic synchrotron X-ray beam. Large stresses, ranging from -1 to +1 GPa, were observed in some grains. On average, the method suggests a relatively small compressive stress at the surface, between 47 and 75 MPa depending on direction.

  2. The new powder diffractometer D1B of the Institut Laue Langevin

    NASA Astrophysics Data System (ADS)

    Puente Orench, I.; Clergeau, J. F.; Martínez, S.; Olmos, M.; Fabelo, O.; Campo, J.

    2014-11-01

    D1B is a medium-resolution, high-flux powder diffractometer located at the Institut Laue Langevin (ILL). D1B is a suitable instrument for studying a large variety of polycrystalline materials. It has run since 1998 as a CRG (collaborating research group) instrument, exploited by the CNRS (Centre National de la Recherche Scientifique, France) and CSIC (Consejo Superior de Investigaciones Cientificas, Spain). In 2008 the Spanish CRG started an upgrade programme which included a new detector and a radial oscillating collimator (ROC). The detector, which has a sensitive height of 100 mm, covers an angular range of 128°. Its 1280 gold wires provide a neutron detection point every 0.1°. The ROC is made of 198 gadolinium-based absorbing collimation blades, regularly placed every 0.67°. Here the present characteristics of D1B are reviewed and its experimental performance is presented.

  3. A Quasi-Laue Neutron Crystallographic Study of D-Xylose Isomerase

    NASA Technical Reports Server (NTRS)

    Meilleur, Flora; Snell, Edward H.; vanderWoerd, Mark; Judge, Russell A.; Myles, Dean A. A.

    2006-01-01

    Hydrogen atom location and the determination of hydrogen-bonding interactions are often critical to explaining enzymatic mechanisms. Whilst it is difficult to determine the positions of hydrogen atoms using X-ray crystallography, even with subatomic (less than 1.0 Angstrom) resolution data available, neutron crystallography provides an experimental tool to directly localise hydrogen/deuterium atoms in biological macromolecules at resolutions of 1.5-2.0 Angstroms. Linearisation and isomerisation of xylose at the active site of D-xylose isomerase rely upon a complex hydrogen transfer. Neutron quasi-Laue data were collected on a Streptomyces rubiginosus D-xylose isomerase crystal using the LADI instrument at the ILL with the objective of providing insight into the enzymatic mechanism (Myles et al. 1998). The neutron structure unambiguously reveals the protonation state of His 53 in the active site, identifying the model for the enzymatic pathway.

  4. Characterization of Large Grain Nb Ingot Microstructure Using OIM and Laue Methods

    SciTech Connect

    D. Kang, D.C. Baars, T.R. Bieler, G. Ciovati, C. Compton, T.L. Grimm, A.A. Kolka

    2011-07-01

    Large grain niobium is being examined for fabricating superconducting radiofrequency cavities as an alternative to using rolled sheet with fine grains. It is desirable to know the grain orientations of a niobium ingot slice before fabrication, as this allows heterogeneous strain and surface roughness effects arising from etching to be anticipated. Characterization of grain orientations has been done using orientation imaging microscopy (OIM), which requires destructive extraction of pieces from an ingot slice. Use of a Laue camera allows nondestructive characterization of grain orientations, a process useful for evaluating slices and deformation during the manufacturing process. Five ingot slices from CBMM, Ningxia, and Heraeus are compared. One set of slices was deformed into a half cell and the deformation processes that cause crystal rotations have been investigated and compared with analytical predictions. The five ingot slices are compared in terms of their grain orientations and grain boundary misorientations, indicating no obvious commonalities, which suggests that grain orientations develop randomly during solidification.

  5. Achieving Hard X-ray Nanofocusing Using a Wedged Multilayer Laue Lens

    SciTech Connect

    Huang, Xiaojing; Conley, Raymond; Bouet, Nathalie; Zhou, Juan; Macrander, Albert; Maser, Jorg; Yan, Hanfei; Nazaretski, Evgeny; Lauer, Kenneth; Harder, Ross; Robinson, Ian K.; Kalbfleisch, Sebastian; Chu, Yong S.

    2015-05-04

    Here, we report on the fabrication and the characterization of a wedged multilayer Laue lens for x-ray nanofocusing. The lens was fabricated using a sputtering deposition technique, in which a specially designed mask was employed to introduce a thickness gradient in the lateral direction of the multilayer. X-ray characterization shows an efficiency of 27% and a focus size of 26 nm at 14.6 keV, in a good agreement with theoretical calculations. Our results indicate that the desired wedging is achieved in the fabricated structure. Furthermore, we anticipate that continuous development on wedged MLLs will advance x-ray nanofocusing optics to new frontiers and enrich capabilities and opportunities for hard X-ray microscopy.

  6. Strain anisotropy and shear strength of shock compressed tantalum from in-situ Laue diffraction

    NASA Astrophysics Data System (ADS)

    Wehrenberg, C.; Comley, A. J.; Rudd, R. E.; Terry, M.; Hawreliak, J.; Maddox, B. R.; Prisbrey, S. T.; Park, H.-S.; Remington, B. A.

    2014-05-01

    Laser-driven shock experiments were performed at the Omega facility to study the dynamic yield strength of ~5 μm thick single-crystal tantalum using in-situ Laue diffraction. Tantalum samples were shocked along the [001] direction to peak stresses of up to 50 GPa and probed using a 150 ps pulse of bremsstrahlung radiation from an imploding CH capsule X-ray source, timed for when the shock was halfway through the sample. The capsule implosion was monitored by a combination of pinhole cameras and DANTE X-ray diode scopes. Diffraction spots for both the undriven and driven regions of the sample were recorded simultaneously on image-plate detectors. The strain state of the material was found by combining the strain anisotropy derived from the driven diffraction pattern with simultaneous VISAR measurements.

  7. Single-pulse Laue diffraction, stroboscopic data collection and femtosecond flash photolysis on macromolecules

    NASA Astrophysics Data System (ADS)

    Wulff, Michael; Schotte, Friedrich; Naylor, Graham; Bourgeois, Dominique; Moffat, Keith; Mourou, Gerard

    1997-01-01

    We review the time structure of synchrotron radiation and its use for fast time-resolved diffraction experiments on macromolecular photo-cycles using flash photolysis to initiate the reaction. The source parameters and optics for ID09 at the ESRF are presented together with the phase-locked chopper and femtosecond laser. The chopper can set up a 900 Hz train of 100 ps pulses from the hybrid bunch mode and, in conjunction with a femtosecond laser, it can be used for stroboscopic data collection with both monochromatic and polychromatic beams. Single-pulse Laue data from cutinase, a 22 kDa lipolytic enzyme, are presented which show that the quality of single-pulse Laue patterns is sufficient to refine the excited state(s) in a reaction pathway from a known ground state. The flash photolysis technique is discussed and an example is given for heme proteins. The radiation damage from a laser pulse in the femto- and picosecond range can be reduced by triggering at a wavelength where the interaction is strong. We propose the use of microcrystals of 25-50 μm for efficient photolysis with femto- and picosecond pulses. The performance of circular storage rings is compared with the predicted performance of an X-ray free-electron laser (XFEL). The combination of micro beams, a gain of 10^5 photons per pulse and an ultrashort pulse length of 100 fs is likely to improve pulsed diffraction data very substantially. It may be used to image coherent nuclear motion at atomic resolution in ultrafast unimolecular reactions.

  8. Analyzing simulation-based PRA data through traditional and topological clustering: A BWR station blackout case study

    DOE PAGES

    Maljovec, D.; Liu, S.; Wang, B.; Mandelli, D.; Bremer, P. -T.; Pascucci, V.; Smith, C.

    2015-07-14

    Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.
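
    As a minimal illustration of the "traditional clustering" side of such an analysis, the sketch below groups simulated scenarios, each reduced to a feature vector, with k-means from scikit-learn. The features and data are invented; the RISMC/RAVEN toolchain and the topological clustering method of the record above are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: one row per simulated station-blackout scenario
rng = np.random.default_rng(8)
n_scenarios = 300
features = np.column_stack([
    rng.normal(1400.0, 150.0, n_scenarios),  # peak clad temperature (K)
    rng.normal(4.0, 1.0, n_scenarios),       # battery depletion time (h)
    rng.normal(6.0, 2.0, n_scenarios),       # AC power recovery time (h)
])

X = StandardScaler().fit_transform(features)  # put features on a common scale
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

for c in range(4):
    members = features[labels == c]
    print(f"cluster {c}: {len(members):3d} scenarios, "
          f"mean peak clad temperature = {members[:, 0].mean():.0f} K")
```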

  9. Analyzing simulation-based PRA data through traditional and topological clustering: A BWR station blackout case study

    SciTech Connect

    Maljovec, D.; Liu, S.; Wang, B.; Mandelli, D.; Bremer, P. -T.; Pascucci, V.; Smith, C.

    2015-07-14

    Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.

  10. Analyzing and modeling CRE in a changing climate and energy system - a case study from Mid-Norway

    NASA Astrophysics Data System (ADS)

    Tøfte, Lena S.; Sauterleute, Julian F.; Kolberg, Sjur A.; Warland, Geir

    2014-05-01

    Climate-related energy (CRE) is influenced by weather, the system for energy transport and market mechanisms. In the COMPLEX project, Mid-Norway is a case study in which we analyze co-fluctuations between wind and hydropower resources; how these co-fluctuations may change in the long term; the effects this has on power generation; and how the hydropower system can be operated optimally in this context. In Mid-Norway, nearly all power demand is met by hydro-electric facilities, and the region experiences an electricity deficit. This is due both to energy deficiency and to limitations in the power grid. In periods of low inflow and situations of high electricity demand (i.e. winter), power must be imported from neighboring regions. In the future, this situation might change with the development of renewable energy sources. The region is likely to experience considerable investment in wind power and small-scale hydropower. In relation to the deployment of wind power and small-scale hydropower and to security of supply, the transmission grid within and out of the region is being extended. With increasing production from intermittent energy sources such as wind and small-scale hydro, dependencies and co-fluctuations between rain and wind need to be analyzed across spatial and temporal scales, in the present and in a future climate. Climate change scenarios agree on higher temperatures, more precipitation in total and a larger portion of the precipitation falling as rain in this region, and the average wind speed as well as the frequency of storms along the coast are expected to increase slightly during winter. Changing temperatures will also change electricity needs, as electricity is the main source of heating in Norway. It is important to study whether, and to what extent, today's hydropower system and reservoirs are able to balance new intermittent energy sources in the region, in both today's and tomorrow's climate. The case study includes down-scaling of climate

  11. Laue diffraction as a tool in dynamic studies: Hydrolysis of a transiently stable intermediate in catalysis by trypsin

    SciTech Connect

    Singer, P.T.; Berman, L.E.; Cai, Z.; Mangel, W.F.; Jones, K.W.; Sweet, R.M. ); Carty, R.P. . Dept. of Biochemistry); Schlichting, I. . Rosenstiel Basic Medical Science Center); Stock, A. (Center for Advanced Biotechnology and Medicine, Piscataway, NJ (Un

    1992-01-01

    A transiently stable intermediate in trypsin catalysis, guanidinobenzoyl-Ser-195 trypsin, can be trapped and then released by controlling the pH in crystals of the enzyme. This effect has been investigated by static and dynamic white-beam Laue crystallography. Comparison of structures determined before and immediately after a pH jump reveals the nature of the concerted changes that accompany activation of the enzyme. Careful analysis of the results of several structure determinations gives information about the reliability of Laue results in general. A study of multiple exposures taken under differing conditions of beam intensity, crystal quality and temperature revealed information about ways to control damage of specimens by the X-ray beam.

  12. Observation of optical second-harmonic generation in porous-silicon-based photonic crystals in the Laue diffraction scheme

    NASA Astrophysics Data System (ADS)

    Kopylov, D. A.; Svyakhovskiy, S. E.; Dergacheva, L. V.; Bushuev, V. A.; Mantsyzov, B. I.; Murzina, T. V.

    2016-05-01

    Second-harmonic generation (SHG) in the Laue scheme of dynamical Bragg diffraction in a one-dimensional photonic crystal (PhC) is studied. The experiments are performed on a partially annealed porous-silicon PhC containing 250 periods of the structure. Our measurements confirm that phase-matched optical SHG is observed under the Bragg conditions, as evidenced by a narrow angular and spectral distribution of the diffracted SHG leaving the PhC. This is confirmed both by an analytical description of the SHG process in the two-wave approximation and by direct calculations of the PhC dispersion curves for the fundamental and SH wavelengths using the revised plane-wave method. Possible types of phase- and quasi-phase-matching realized in the studied PhC under the Laue diffraction scheme are discussed.

  13. Laue diffraction as a tool in dynamic studies: Hydrolysis of a transiently stable intermediate in catalysis by trypsin

    SciTech Connect

    Singer, P.T.; Berman, L.E.; Cai, Z.; Mangel, W.F.; Jones, K.W.; Sweet, R.M.; Carty, R.P.; Schlichting, I.; Stock, A.; Smalas, A.

    1992-11-01

    A transiently stable intermediate in trypsin catalysis, guanidinobenzoyl-Ser-195 trypsin, can be trapped and then released by controlling the pH in crystals of the enzyme. This effect has been investigated by static and dynamic white-beam Laue crystallography. Comparison of structures determined before and immediately after a pH jump reveals the nature of the concerted changes that accompany activation of the enzyme. Careful analysis of the results of several structure determinations gives information about the reliability of Laue results in general. A study of multiple exposures taken under differing conditions of beam intensity, crystal quality and temperature revealed information about ways to control damage of specimens by the X-ray beam.

  14. The geometric factor of electrostatic plasma analyzers: A case study from the Fast Plasma Investigation for the Magnetospheric Multiscale mission

    SciTech Connect

    Collinson, Glyn A.; Dorelli, John C.; Moore, Thomas E.; Pollock, Craig; Mariano, Al; Shappirio, Mark D.; Adrian, Mark L.; Avanov, Levon A.; Lewis, Gethyn R.; Kataria, Dhiren O.; Bedington, Robert; Owen, Christopher J.; Walsh, Andrew P.; Arridge, Chris S.; Gliese, Ulrik; Barrie, Alexander C.; Tucker, Corey

    2012-03-15

    We report our findings comparing the geometric factor (GF) as determined from simulations and laboratory measurements of the new Dual Electron Spectrometer (DES) being developed at NASA Goddard Space Flight Center as part of the Fast Plasma Investigation on NASA's Magnetospheric Multiscale mission. Particle simulations are increasingly playing an essential role in the design and calibration of electrostatic analyzers, facilitating the identification and mitigation of the many sources of systematic error present in laboratory calibration. While equations for laboratory measurement of the GF have been described in the literature, these are not directly applicable to simulation since the two are carried out under substantially different assumptions and conditions, making direct comparison very challenging. Starting from first principles, we derive generalized expressions for the determination of the GF in simulation and laboratory, and discuss how we have estimated errors in both cases. Finally, we apply these equations to the new DES instrument and show that the results agree within errors. Thus we show that the techniques presented here will produce consistent results between laboratory and simulation, and present the first description of the performance of the new DES instrument in the literature.
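
    For readers unfamiliar with the quantity being compared, the geometric factor relates an analyzer's count rate to the incident flux, and in simulation it is commonly estimated by Monte Carlo: launch test particles from a source plane with a cosine-weighted angular distribution and scale the accepted fraction by the source's total geometric factor (pi times the source area for an isotropic flux). The toy sketch below illustrates that estimator for a simple two-disk geometry; the acceptance test is a placeholder, and the expressions derived in the record above for the DES instrument are not reproduced.

```python
import numpy as np

def mc_geometric_factor(r_source_cm, r_aperture_cm, separation_cm, n=200_000, seed=9):
    """Monte Carlo estimate of the geometric factor (cm^2 sr) of a toy geometry:
    particles start on a source disk and are accepted if they pass through a
    coaxial aperture disk a distance L away. For an isotropic flux the source
    disk's total geometric factor is pi * A_source, so
    G ~ pi * A_source * (accepted / launched)."""
    rng = np.random.default_rng(seed)
    # Uniform start positions on the source disk
    r = r_source_cm * np.sqrt(rng.uniform(size=n))
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    x0, y0 = r * np.cos(phi), r * np.sin(phi)
    # Cosine-law (Lambertian) directions, appropriate for an isotropic incident flux
    sin_t = np.sqrt(rng.uniform(size=n))
    cos_t = np.sqrt(1.0 - sin_t**2)
    psi = rng.uniform(0.0, 2.0 * np.pi, n)
    # Propagate to the aperture plane and test acceptance (placeholder geometry)
    x1 = x0 + separation_cm * sin_t * np.cos(psi) / cos_t
    y1 = y0 + separation_cm * sin_t * np.sin(psi) / cos_t
    accepted = (x1**2 + y1**2) <= r_aperture_cm**2
    return np.pi * (np.pi * r_source_cm**2) * accepted.mean()

print(f"G ~ {mc_geometric_factor(1.0, 1.0, 10.0):.4f} cm^2 sr "
      f"(far-field approximation A1*A2/L**2 ~ 0.099)")
```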

  15. The Geometric Factor of Electrostatic Plasma Analyzers: A Case Study from the Fast Plasma Investigation for the Magnetospheric Multiscale mission

    NASA Technical Reports Server (NTRS)

    Collinson, Glyn A.; Dorelli, John Charles; Avanov, Leon A.; Lewis, Gethyn R.; Moore, Thomas E.; Pollock, Craig; Kataria, Dhiren O.; Bedington, Robert; Arridge, Chris S.; Chornay, Dennis J.; Gliese, Ulrik; Mariano, Al.; Barrie, Alexander C; Tucker, Corey; Owen, Christopher J.; Walsh, Andrew P.; Shappirio, Mark D.; Adrian, Mark L.

    2012-01-01

    We report our findings comparing the geometric factor (GF) as determined from simulations and laboratory measurements of the new Dual Electron Spectrometer (DES) being developed at NASA Goddard Space Flight Center as part of the Fast Plasma Investigation on NASA's Magnetospheric Multiscale mission. Particle simulations are increasingly playing an essential role in the design and calibration of electrostatic analyzers, facilitating the identification and mitigation of the many sources of systematic error present in laboratory calibration. While equations for laboratory measurement of the GF have been described in the literature, these are not directly applicable to simulation since the two are carried out under substantially different assumptions and conditions, making direct comparison very challenging. Starting from first principles, we derive generalized expressions for the determination of the GF in simulation and laboratory, and discuss how we have estimated errors in both cases. Finally, we apply these equations to the new DES instrument and show that the results agree within errors. Thus we show that the techniques presented here will produce consistent results between laboratory and simulation, and present the first description of the performance of the new DES instrument in the literature.

  16. WSi2/Si multilayer sectioning by reactive ion etching for multilayer Laue lens fabrication

    NASA Astrophysics Data System (ADS)

    Bouet, N.; Conley, R.; Biancarosa, J.; Divan, R.; Macrander, A. T.

    2010-09-01

    Reactive ion etching (RIE) has been employed in a wide range of fields such as semiconductor fabrication, MEMS (microelectromechanical systems), and refractive x-ray optics, with a large investment put towards the development of deep RIE. Due to the intrinsically differing chemistries related to reactivity, ion bombardment, and passivation of materials, the development of recipes for new materials or material systems can require intense effort and resources. For silicon in particular, methods have been developed to provide reliable anisotropic profiles with good dimensional control and high aspect ratios [1,2,3], high etch rates, and excellent material-to-mask etch selectivity. A multilayer Laue lens [4] is an x-ray focusing optic, which is produced by depositing many layers of two materials with differing electron density in a particular stacking sequence where each layer in the stack satisfies the Fresnel zone plate law. When this stack is sectioned to allow side-illumination with radiation, the diffracted exiting radiation will constructively interfere at the focal point. Since the first MLLs were developed at Argonne in the USA in 2006 [4], there have been published reports of MLL development efforts in Japan [5] and, very recently, also in Germany [6]. The traditional technique for sectioning a multilayer Laue lens (MLL) involves mechanical sectioning and polishing [7], which is labor intensive and can induce delamination or structure damage and thereby reduce yield. If a non-mechanical technique can be used to section MLLs, it may be possible to greatly shorten the fabrication cycle, create more usable optics from the same amount of deposition substrate, and perhaps develop more advanced structures to provide greater stability or flexibility. Plasma etching of high aspect-ratio multilayer structures will also expand the scope for other types of optics fabrication (such as gratings, zone plates, and so on). However, well-performing reactive ion etching recipes have been developed
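
    The Fresnel zone plate law mentioned above fixes where every interface in the deposited stack must sit. A minimal sketch of that geometry is given below; the wavelength, focal length, and zone count are illustrative values, not parameters from this work:

        # Sketch: Fresnel-zone boundary positions for a multilayer Laue lens stack.
        # The n-th boundary satisfies r_n^2 = n*lam*f + (n*lam/2)^2 (Fresnel zone plate law).
        import math

        lam = 0.0689e-9   # wavelength in m (~18 keV), illustrative
        f = 5.0e-3        # focal length in m, illustrative
        n_zones = 1000

        r = [math.sqrt(n * lam * f + (n * lam / 2.0) ** 2) for n in range(1, n_zones + 1)]

        # The layer thickness deposited for zone n is the spacing between consecutive
        # boundaries; it shrinks with n, which is why the finest (outermost) zones set
        # the deposition and sectioning tolerances.
        widths = [r[0]] + [r[n] - r[n - 1] for n in range(1, n_zones)]
        print(f"outermost zone width: {widths[-1] * 1e9:.1f} nm, aperture: {r[-1] * 1e6:.1f} um")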

  17. A bent Laue analyser crystal for Rayleigh-to-Compton computed tomography.

    PubMed

    Schulze, C; Kleuker, U

    1998-05-01

    A new optical system to perform tomography based on the Rayleigh-to-Compton (RC) method with high spatial and spectral resolution is presented. The RC technique allows the effective atomic number of a sample to be measured and finds application in bone mineral densitometry in medicine. It is particularly useful for the characterization of the distribution of biological materials which do not exhibit distinctive diffraction peaks. The system is based on the separation of the elastic line from the spectrum that is scattered by the sample by means of a bent Laue analyser crystal, and the subsequent independent detection of the elastic and inelastic parts of the spectrum with two large-area scintillation counters. The high energy resolution permits operation at low momentum transfer, where the RC method has its best contrast-to-noise ratio for low-Z materials. The geometrical and spectral requirements in terms of the incident beam and the conical analyser crystal are discussed. A first-generation tomographic imaging system (pencil beam, scanned sample) as implemented at the ESRF Compton-Scattering Station ID15B is described. A high-resolution tomographic reconstruction of a bone sample is presented. PMID:15263753

  18. A focal plane detector design for a wide-band Laue-lens telescope

    NASA Astrophysics Data System (ADS)

    Caroli, Ezio; Auricchio, Natalia; Amati, Lorenzo; Bezsmolnyy, Yuriy; Budtz-Jørgensen, Carl; da Silva, Rui M. Curado; Frontera, Filippo; Pisa, Alessandro; del Sordo, Stefano; Stephen, John B.; Ventura, Giulio

    2005-12-01

    The energy range above 60 keV is important for the study of many open problems in high-energy astrophysics, such as the role of Inverse Compton with respect to synchrotron or thermal processes in GRBs, non-thermal mechanisms in SNRs, the study of the high-energy cut-offs in AGN spectra, and the detection of nuclear and annihilation lines. Recently the development of high-energy Laue lenses with broad energy bandpasses from 60 to 600 keV has been proposed for a Hard X-ray focusing Telescope (HAXTEL) in order to study the X-ray continuum of celestial sources. The required focal plane detector should have high detection efficiency over the entire operative range, a spatial resolution of about 1 mm, an energy resolution of a few keV at 500 keV, and a sensitivity to linear polarization. We describe a possible configuration of the focal plane detector based on several CdTe/CZT pixelated layers stacked together to achieve the required detection efficiency at high energy. Each layer can operate both as a separate position-sensitive detector and polarimeter or work with other layers to increase the overall photopeak efficiency. Each layer has a hexagonal shape in order to minimize the detector surface required to cover the lens field of view. The pixels would have the same geometry so as to provide the best coupling with the lens point spread function and to increase the symmetry for polarimetric studies.

  19. A positron annihilation radiation telescope using Laue diffraction in a crystal lens

    SciTech Connect

    Smither, R.K.; von Ballmoos, P. (Centre d'Etude Spatiale des Rayonnements)

    1993-03-01

    We present a new type of gamma-ray telescope featuring a Laue diffraction lens, a detector module with a 3-by-3 germanium array, and a balloon gondola stabilized to 5 arc sec pointing accuracy. The instrument's lens is designed to collect 511 keV photons on its 150 cm[sup 2] effective area and focus them onto a small detector having only [approx]14 cm[sup 3] of equivalent volume for background noise. As a result, this telescope overcomes the mass-sensitivity impasse of present detectors in which the collection areas are identical to the detection area. The sensitivity of our instrument is anticipated to be 3 [times] 10[sup [minus]5] ph cm[sup [minus]2] s[sup [minus]1] at 511 keV with an angular resolution of 15 arc sec and an energy resolution of 2 keV. These features will allow a possibly energetically narrow 511 keV positron annihilation line to be resolved both energy-wise and spatially within a Galactic Center ``microquasar`` such as 1E1740.7-2942 or GRS1758-258. In addition to the galactic ``microquasars,`` other prime objectives include Cyg X-1, X-ray binaries, pulsars, and AGNs.

  20. A positron annihilation radiation telescope using Laue diffraction in a crystal lens

    SciTech Connect

    Smither, R.K.; von Ballmoos, P.

    1993-03-01

    We present a new type of gamma-ray telescope featuring a Laue diffraction lens, a detector module with a 3-by-3 germanium array, and a balloon gondola stabilized to 5 arc sec pointing accuracy. The instrument`s lens is designed to collect 511 keV photons on its 150 cm{sup 2} effective area and focus them onto a small detector having only {approx}14 cm{sup 3} of equivalent volume for background noise. As a result, this telescope overcomes the mass-sensitivity impasse of present detectors in which the collection areas are identical to the detection area. The sensitivity of our instrument is anticipated to be 3 {times} 10{sup {minus}5} ph cm{sup {minus}2} s{sup {minus}1} at 511 keV with an angular resolution of 15 arc sec and an energy resolution of 2 keV. These features will allow a possibly energetically narrow 511 keV positron annihilation line to be resolved both energy-wise and spatially within a Galactic Center ``microquasar`` such as 1E1740.7-2942 or GRS1758-258. In addition to the galactic ``microquasars,`` other prime objectives include Cyg X-1, X-ray binaries, pulsars, and AGNs.

  1. An improved method for calibrating time-of-flight Laue single-crystal neutron diffractometers

    PubMed Central

    Bull, Craig L.; Johnson, Michael W.; Hamidov, Hayrullo; Komatsu, Kazuki; Guthrie, Malcolm; Gutmann, Matthias J.; Loveday, John S.; Nelmes, Richard J.

    2014-01-01

    A robust and comprehensive method for determining the orientation matrix of a single-crystal sample using the neutron Laue time-of-flight (TOF) technique is described. The new method enables the measurement of the unit-cell parameters with an uncertainty in the range 0.015–0.06%, depending upon the crystal symmetry and the number of reflections measured. The improved technique also facilitates the location and integration of weak reflections, which are often more difficult to discern amongst the increased background at higher energies. The technique uses a mathematical model of the relative positions of all the detector pixels of the instrument, together with a methodology that establishes a reproducible reference frame and a method for determining the parameters of the instrument detector model. Since all neutron TOF instruments require precise detector calibration for their effective use, it is possible that the method described here may be of use on other instruments where the detector calibration cannot be determined by other means. PMID:24904244
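
    The calibration described above ultimately rests on the standard time-of-flight Laue relations, reproduced here as a reminder (this is the generic kinematics, not the paper's detector model):

        \lambda = \frac{h\,t}{m_n\,(L_1 + L_2)}, \qquad 2\, d_{hkl}\, \sin\theta = \lambda ,

    where t is the measured time of flight over the moderator-to-sample (L_1) plus sample-to-pixel (L_2) flight path, m_n is the neutron mass, and theta is half the scattering angle of the pixel. Once the detector model supplies (L_2, theta) for every pixel, each recorded reflection yields a d-spacing, and a least-squares fit over the indexed reflections gives the orientation (UB) matrix and the unit-cell parameters.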

  2. X-ray μ-Laue diffraction analysis of Cu through-silicon vias: A two-dimensional and three-dimensional study

    SciTech Connect

    Sanchez, Dario Ferreira; Weleguela, Monica Larissa Djomeni; Audoit, Guillaume; Grenier, Adeline; Gergaud, Patrice; Bleuet, Pierre; Ulrich, Olivier; Micha, Jean-Sébastien; Robach, Odile

    2014-10-28

    Here, white X-ray μ-beam Laue diffraction is developed and applied to investigate elastic strain distributions in three-dimensional (3D) materials, more specifically, for the study of strain in Cu through-silicon vias (TSVs) 10 μm in diameter and 80 μm deep. Two different approaches have been applied: (i) two-dimensional μ-Laue scanning and (ii) μ-beam Laue tomography. 2D μ-Laue scans provided maps of the deviatoric strain tensor integrated along the via length over an array of TSVs in a 100 μm thick sample prepared by Focused Ion Beam. The μ-beam Laue tomography analysis enabled the 3D grain and elemental distributions of both Cu and Si to be obtained. The position, size (about 3 μm), shape, and orientation of Cu grains were determined. Radial profiles of the equivalent deviatoric strain around the TSVs have been derived through both approaches. The results from both methods are compared and discussed.
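
    The "equivalent deviatoric strain" reported above is usually taken as the von Mises equivalent of the measured deviatoric strain tensor; one standard definition, given here as an assumption rather than the authors' exact convention, is:

        \varepsilon_{\mathrm{eq}} = \sqrt{\tfrac{2}{3}\,\varepsilon^{\mathrm{dev}}_{ij}\,\varepsilon^{\mathrm{dev}}_{ij}},
        \qquad
        \varepsilon^{\mathrm{dev}}_{ij} = \varepsilon_{ij} - \tfrac{1}{3}\,\varepsilon_{kk}\,\delta_{ij}.

    White-beam Laue patterns constrain only the deviatoric part, since a purely hydrostatic dilatation changes the lattice parameter but not the angles between reflections.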

  3. An input-output approach to analyze the ways to increase total output of energy sectors: The case of Japan

    NASA Astrophysics Data System (ADS)

    Zuhdi, Ubaidillah

    2014-03-01

    The purpose of this study is to analyze ways to increase the total output of Japanese energy sectors in the future. Input-Output (IO) analysis is employed as the tool of analysis, and the study focuses on petroleum refinery products and non-ferrous metals as the analyzed sectors. The results show that modifying exports and consumption outside households has a positive impact on total output, whereas modifying imports has the opposite effect. Based on these results, it is recommended that the Japanese government create conditions for the analyzed sectors' export activities to increase, and that it exercise care in conducting import activities related to these sectors.
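
    A minimal sketch of the Leontief input-output mechanics behind such an analysis is shown below; the technical-coefficient matrix and demand vector are toy numbers, not values from the Japanese IO tables:

        # Sketch: Leontief input-output model, x = (I - A)^-1 f.
        # Raising one sector's final demand (e.g. its exports) raises total output in
        # all sectors through the Leontief inverse. Illustrative numbers only.
        import numpy as np

        A = np.array([[0.10, 0.25, 0.05],   # technical coefficients (input per unit of output)
                      [0.20, 0.05, 0.30],
                      [0.15, 0.10, 0.05]])
        f = np.array([100.0, 80.0, 120.0])  # baseline final demand by sector

        L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse
        x0 = L @ f                          # baseline total output

        f_mod = f.copy()
        f_mod[0] += 10.0                    # hypothetical export increase for sector 0
        x1 = L @ f_mod

        print("change in total output by sector:", x1 - x0)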

  4. A space-time point process model for analyzing and predicting case patterns of diarrheal disease in northwestern Ecuador.

    PubMed

    Ahn, Jaeil; Johnson, Timothy D; Bhavnani, Darlene; Eisenberg, Joseph N S; Mukherjee, Bhramar

    2014-06-01

    We consider modeling case patterns under a complex spatial and longitudinal sampling design as conducted via a serial case-control study of diarrheal disease in northwestern Ecuador. We build a two-stage space-time model to understand the role of spatially and temporally referenced covariates that reflect social and natural environments in the sampled region, after accounting for unmeasured residual heterogeneities. All diarrheal case events are collected from 21 sampled communities in Esmeraldas Province in Ecuador, during seven sampling cycles from 2003 to 2008. The region of interest comprises 158 communities along a river basin. Prediction of case counts at unsampled communities at a future time is of interest along with estimation of risk-related parameters. We propose a computationally feasible two-stage Bayesian approach to estimate the risk-related parameters and conduct predictive inference. We first apply the log Gaussian Cox process (LGCP), commonly used to model spatial clustering of point patterns, to accommodate temporal variation within the sampled communities. Prediction of the number of cases at unsampled communities at a future time is obtained by a disease mapping model conditional on the expected case counts from Stage I.
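
    For concreteness, a first-stage log Gaussian Cox process of the kind described above typically has the form below; the covariates and covariance kernel are generic placeholders, not the study's exact specification:

        N(\cdot)\mid\lambda \sim \text{Poisson process with intensity } \lambda(s,t),
        \qquad
        \log\lambda(s,t) = x(s,t)^{\top}\beta + w(s,t),
        \qquad
        w \sim \mathcal{GP}\big(0,\,k(\cdot,\cdot)\big),

    so the expected case count for community i in sampling cycle j is the integral of lambda over that community and cycle, which then feeds the second-stage disease-mapping model used for prediction at unsampled communities.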

  5. Watching a signaling protein function in real time via 100-ps time-resolved Laue crystallography

    SciTech Connect

    Schotte, Friedrich; Cho, Hyun Sun; Kaila, Ville R.I.; Kamikubo, Hironari; Dashdorj, Naranbaatar; Henry, Eric R.; Graber, Timothy J.; Henning, Robert; Wulff, Michael; Hummer, Gerhard; Kataoka, Mikio; Anfinrud, Philip A.

    2012-11-06

    To understand how signaling proteins function, it is necessary to know the time-ordered sequence of events that lead to the signaling state. We recently developed on the BioCARS 14-IDB beamline at the Advanced Photon Source the infrastructure required to characterize structural changes in protein crystals with near-atomic spatial resolution and 150-ps time resolution, and have used this capability to track the reversible photocycle of photoactive yellow protein (PYP) following trans-to-cis photoisomerization of its p-coumaric acid (pCA) chromophore over 10 decades of time. The first of four major intermediates characterized in this study is highly contorted, with the pCA carbonyl rotated nearly 90° out of the plane of the phenolate. A hydrogen bond between the pCA carbonyl and the Cys69 backbone constrains the chromophore in this unusual twisted conformation. Density functional theory calculations confirm that this structure is chemically plausible and corresponds to a strained cis intermediate. This unique structure is short-lived (~600 ps), has not been observed in prior cryocrystallography experiments, and is the progenitor of intermediates characterized in previous nanosecond time-resolved Laue crystallography studies. The structural transitions unveiled during the PYP photocycle include trans/cis isomerization, the breaking and making of hydrogen bonds, formation/relaxation of strain, and gated water penetration into the interior of the protein. This mechanistically detailed, near-atomic resolution description of the complete PYP photocycle provides a framework for understanding signal transduction in proteins, and for assessing and validating theoretical/computational approaches in protein biophysics.

  6. Texture, residual strain, and plastic deformation around scratches in alloy 600 using synchrotron x-ray Laue micro-diffraction.

    SciTech Connect

    Suominen Fuller, M. L.; Klassen, R. J.; McIntyre, N. S.; Gerson, A. R.; Ramamurthy, S.; King, P. J.; Liu, W.; Univ. of Western Ontario; Univ. of South Australia; Babcock & Wilcox Canada

    2008-01-01

    Deformation around two scratches in Alloy 600 (A600) was studied nondestructively using synchrotron Laue differential aperture X-ray microscopy. The orientation of grains and elastic strain distribution around the scratches were measured. A complex residual deviatoric elastic strain state was found to exist around the scratches. Heavy plastic deformation was observed up to a distance of 20 {micro}m from the scratches. In the region 20-30 {micro}m from the scratches the diffraction spots were heavily streaked and split indicating misoriented dislocation cell structures.

  7. Strength of shock-loaded single-crystal tantalum [100] determined using in situ broadband x-ray Laue diffraction.

    PubMed

    Comley, A J; Maddox, B R; Rudd, R E; Prisbrey, S T; Hawreliak, J A; Orlikowski, D A; Peterson, S C; Satcher, J H; Elsholz, A J; Park, H-S; Remington, B A; Bazin, N; Foster, J M; Graham, P; Park, N; Rosen, P A; Rothman, S R; Higginbotham, A; Suggit, M; Wark, J S

    2013-03-15

    The strength of shock-loaded single crystal tantalum [100] has been experimentally determined using in situ broadband x-ray Laue diffraction to measure the strain state of the compressed crystal, and elastic constants calculated from first principles. The inferred strength reaches 35 GPa at a shock pressure of 181 GPa and is in excellent agreement with a multiscale strength model [N. R. Barton et al., J. Appl. Phys. 109, 073501 (2011)], which employs a hierarchy of simulation methods over a range of length scales to calculate strength from first principles.

  8. Scanning Transmission Electron Microscopy Using Selective High-Order Laue Zones: Three-Dimensional Atomic Ordering in Sodium Cobaltate

    NASA Astrophysics Data System (ADS)

    Huang, F.-T.; Gloter, A.; Chu, M.-W.; Chou, F. C.; Shu, G. J.; Liu, L.-K.; Chen, C. H.; Colliex, C.

    2010-09-01

    A new scanning transmission electron microscopy (STEM) imaging technique using high-order Laue zones (named HOLZ-STEM), a diffraction contrast which has been strenuously avoided or minimized in traditional STEM imaging, can be used to obtain the additional 1D periodic information along the electron propagation axis without sacrificing atomic resolution in the lateral (2D) dimension. HOLZ-STEM has been demonstrated to resolve the 3D long-range Na ordering of Na0.71CoO2. Direct evidence of spiral-like Na-trimer chains twisting along the c axis is unambiguously established in real space.

  9. First results of the (n,γ) EXILL campaigns at the Institut Laue Langevin using EXOGAM and FATIMA

    NASA Astrophysics Data System (ADS)

    Jolie, J.; Régis, J.-M.; Wilmsen, D.; Ahmed, S.; Pfeiffer, M.; Saed-Samii, N.; Warr, N.; Blanc, A.; Jentschel, M.; Köster, U.; Mutti, P.; Soldner, T.; Simpson, G.; de France, G.; Urban, W.; Bruce, A. M.; Roberts, O. J.; Fraile, L. M.; Paziy, V.; Ignatov, A.; Ilieva, S.; Kröll, Th; Scheck, M.; Thürauf, M.; Ivanova, D.; Kisyov, S.; Lalkovski, S.; Podolyak, Zs; Regan, P. H.; Korten, W.; Habs, D.; Thirolf, P. G.; Ur, C. A.

    2014-09-01

    At the PF1B cold neutron beam line at the Institut Laue Langevin the EXILL array consisting of EXOGAM, GASP and LOHENGRIN detectors was used to perform (n,γ) measurements under very high coincidence rates. About ten different reactions were then measured in autumn 2012. In spring 2013 the EXOGAM array was combined with 16 LaBr3(Ce) scintillators in the FATIMA@EXILL campaign for the measurement of lifetimes using the generalised centroid difference method. We report on the properties of both set-ups and present first results on Pt isotopes from both campaigns.

  10. Optimizing the accuracy and precision of the single-pulse Laue technique for synchrotron photo-crystallography

    PubMed Central

    Kamiński, Radosław; Graber, Timothy; Benedict, Jason B.; Henning, Robert; Chen, Yu-Sheng; Scheins, Stephan; Messerschmidt, Marc; Coppens, Philip

    2010-01-01

    The accuracy that can be achieved in single-pulse pump-probe Laue experiments is discussed. It is shown that with careful tuning of the experimental conditions a reproducibility of the intensity ratios of equivalent intensities obtained in different measurements of 3–4% can be achieved. The single-pulse experiments maximize the time resolution that can be achieved and, unlike stroboscopic techniques in which the pump-probe cycle is rapidly repeated, minimize the temperature increase due to the laser exposure of the sample. PMID:20567080

  11. Middle ear squamous papilloma: A report of four cases analyzed by HPV and EBV in situ hybridization

    PubMed Central

    Zhou, Han; Chen, Zhibin; Zhang, Weiming; Xing, Guangqian

    2014-01-01

    Squamous papilloma involving the middle ear as a primary lesion is an extremely rare occurrence. The aims of the present study were to investigate the presence of human papilloma virus (HPV) and Epstein-Barr virus (EBV) infections in primary middle ear squamous papilloma and to describe the clinical and pathological features of the disease along with therapeutic strategies. A retrospective review was conducted of four patients with clinical and pathological diagnoses of middle ear squamous papilloma. In situ hybridization (ISH) for a wide range of HPV DNA subtypes and EBV-encoded RNA was performed in the tissue samples obtained from these patients. Only two cases of primary squamous papilloma in the middle ear have been previously reported in the English literature. These papillomas developed in males of ~60 years of age, and otorrhea was the most frequent complaint. Premalignant changes were observed in two of the present cases, and ISH of HPV and EBV was negative in all four cases. The results of the present study indicated that chronic inflammatory stimulation, not HPV and EBV infection, is involved in the occurrence of middle ear squamous papilloma and its malignant transformation. Radical surgery and long-term postoperative follow-up are recommended due to its malignant and recurrent potential. Further genetic investigations with additional new cases are required to clarify the pathogenesis of squamous papilloma involving the middle ear. PMID:24348817

  12. Using coupled micropillar compression and micro-Laue diffraction to investigate deformation mechanisms in a complex metallic alloy Al13Co4

    NASA Astrophysics Data System (ADS)

    Bhowmik, Ayan; Dolbnya, Igor P.; Britton, T. Ben; Jones, Nicholas G.; Sernicola, Giorgio; Walter, Claudia; Gille, Peter; Dye, David; Clegg, William J.; Giuliani, Finn

    2016-03-01

    In this study, we have used in-situ micro-Laue diffraction combined with micropillar compression of focused ion beam milled Al13Co4 complex metallic alloy to investigate the evolution of deformation in Al13Co4. Streaking of the Laue spots shows that the onset of plastic flow occurs at stresses as low as 0.8 GPa, although macroscopic yield only becomes apparent at 2 GPa. The measured misorientations, obtained from peak splitting, enable the geometrically necessary dislocation density to be estimated as 1.1 × 10^13 m^-2.
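
    The dislocation-density figure above can be recovered from the measured misorientation with a Nye-type estimate; a minimal sketch, with symbols that are assumptions rather than the authors' exact formulation:

        \rho_{\mathrm{GND}} \approx \frac{\theta}{b\,x},

    where theta is the lattice misorientation accumulated over a distance x (here obtained from the splitting and streaking of the Laue spots) and b is the Burgers vector magnitude. For a misorientation of the order of a degree across a micrometre-scale pillar and b of a few tenths of a nanometre, this gives densities of order 10^13 m^-2, consistent with the reported value.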

  13. DIFFERENTIAL ANALYZER

    DOEpatents

    Sorensen, E.G.; Gordon, C.M.

    1959-02-10

    Improvements in analog computing machines of the class capable of evaluating differential equations, commonly termed differential analyzers, are described. In general form, the analyzer embodies a plurality of basic computer mechanisms for performing integration, multiplication, and addition, and means for directing the result of any one operation to another computer mechanism performing a further operation. In the device, numerical quantities are represented by the rotation of shafts, or the electrical equivalent of shafts.

  14. Gas Analyzer

    NASA Astrophysics Data System (ADS)

    1989-01-01

    The M200 originated in the 1970s under an Ames Research Center/Stanford University contract to develop a small, lightweight gas analyzer for the Viking Landers. Although the unit was not used on the spacecraft, it was further developed by the National Institute for Occupational Safety and Health (NIOSH). Three researchers from the project later formed Microsensor Technology, Inc. (MTI) to commercialize the analyzer. The original version (Micromonitor 500) was introduced in 1982, and the M200 in 1988. The M200, a more advanced version, features dual gas chromatographs, which separate a gaseous mixture into components and measure the concentration of each gas. It is useful for monitoring gas leaks, chemical spills, etc. Many analyses are completed in less than 30 seconds, and a wide range of mixtures can be analyzed.

  15. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. Spectral data are then processed by the ChemScan analyzer and compared with calibration files in the system's memory in order to calculate concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.

  16. Blood Analyzer

    NASA Technical Reports Server (NTRS)

    1992-01-01

    In the 1970's, NASA provided funding for development of an automatic blood analyzer for Skylab at the Oak Ridge National Laboratory (ORNL). ORNL devised "dynamic loading," which employed a spinning rotor to load, transfer, and analyze blood samples by centrifugal processing. A refined, commercial version of the system was produced by ABAXIS and is marketed as portable ABAXIS MiniLab MCA. Used in a doctor's office, the equipment can perform 80 to 100 chemical blood tests on a single drop of blood and report results in five minutes. Further development is anticipated.

  17. A Modified Actor-Power-Accountability Framework (MAPAF) for analyzing decentralized forest governance: case study from Ethiopia.

    PubMed

    Mohammed, Abrar Juhar; Inoue, Makoto

    2014-06-15

    This paper posits a Modified Actor-Power-Accountability Framework (MAPAF) that makes three major improvements on the Actor-Power-Accountability Framework (APAF) developed by Agrawal and Ribot (1999). These improvements emphasize the nature of decentralized property rights, linking the outputs of decentralization with its outcomes and the inclusion of contextual factors. Applying MAPAF to analyze outputs and outcomes from two major decentralized forest policies in Ethiopia, i.e., delegation and devolution, has demonstrated the following strengths of the framework. First, by incorporating vital bundles of property rights into APAF, MAPAF creates a common ground for exploring and comparing the extent of democratization achieved by different decentralizing reforms. Second, the inclusion of social and environmental outcomes in MAPAF makes it possible to link the output of decentralization with local level outcomes. Finally, the addition of contextual factors enhances MAPAF's explanatory power by providing room for investigating exogenous factors other than democratization that contribute to the outcomes of decentralization reforms.

  18. Defibrillator analyzers.

    PubMed

    1999-12-01

    Defibrillator analyzers automate the inspection and preventive maintenance (IPM) testing of defibrillators. They need to be able to test at least four basic defibrillator performance characteristics: discharge energy, synchronized-mode operation, automated external defibrillation, and ECG monitoring. We prefer that they also be able to test a defibrillator's external noninvasive pacing function--but this is not essential if a facility already has a pacemaker analyzer that can perform this testing. In this Evaluation, we tested seven defibrillator analyzers from six suppliers. All seven units accurately measure the energies of a variety of discharge waveforms over a wide range of energy levels--from 1 J for use in a neonatal intensive care unit to 360 J for use on adult patients requiring maximum discharge energy. Most of the analyzers are easy to use. However, only three of the evaluated units could perform the full range of defibrillator tests that we prefer. We rated these units Acceptable--Preferred. Three more units could perform four of the five tests; they could not test the pacing feature of a defibrillator. These units were rated Acceptable. The seventh unit could perform only discharge energy testing and synchronized-mode testing and was difficult to use. We rate that unit Acceptable--Not Recommended. PMID:10604089

  19. What is the Best Way to Analyze Less Frequent Forms of Violence? The Case of Sexual Aggression

    PubMed Central

    Swartout, Kevin M.; Thompson, Martie P.; Koss, Mary P.; Su, Nan

    2015-01-01

    Objective Most frequency data on violence are non-normally distributed, which can lead to faulty conclusions when not modeled appropriately. And, we can't prevent what we can't accurately predict. We therefore review a series of methods specifically suited to analyze frequency data, with specific reference to the psychological study of sexual aggression. In the process, we demonstrate a model comparison exercise using sample data on college men's sexual aggression. Method We used a subset (n=645) of a larger longitudinal dataset to demonstrate fitting and comparison of six analytic methods: OLS regression, OLS regression with a square-root-transformed outcome, Poisson regression, negative binomial regression, zero-inflated Poisson regression, and zero-inflated negative binomial regression. Risk and protective factors measured at Time 1 predicted frequency of SA at Time 2 (8 months later) within each model. Models were compared on overall fit, parsimony, and interpretability based upon previous findings and substantive theory. Results As we predicted, OLS regression assumptions were untenable. Of the count-based regression models, the negative binomial model fit the data best; it fit the data better than the Poisson and zero-inflated Poisson models, and it was more parsimonious than the zero-inflated negative binomial model without a significant degradation in model fit. Conclusion In addition to more accurately modeling violence frequency data, count-based models have clear interpretations that can be disseminated to a broad audience. We recommend analytic steps investigators can use when analyzing count outcomes as well as further avenues researchers can explore in working with non-normal data on violence. PMID:26925298
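
    A minimal sketch of the model-comparison exercise described above, using simulated over-dispersed counts and statsmodels; the predictor and simulation settings are illustrative, not the study's variables:

        # Sketch: compare OLS, Poisson, and negative binomial regressions on
        # over-dispersed count data by AIC. Illustrative data only.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 645
        risk = rng.normal(size=n)                              # stand-in risk factor at Time 1
        mu = np.exp(0.2 + 0.8 * risk)                          # true mean of the count outcome
        y = rng.negative_binomial(1.0, 1.0 / (1.0 + mu))       # over-dispersed counts at Time 2

        X = sm.add_constant(risk)
        ols = sm.OLS(y, X).fit()
        poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        negbin = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()

        # Lower AIC = better trade-off between fit and parsimony; zero-inflated variants
        # live in statsmodels.discrete.count_model and can be compared the same way.
        for name, res in [("OLS", ols), ("Poisson", poisson), ("NegBin", negbin)]:
            print(f"{name:8s} AIC = {res.aic:10.1f}")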

  20. A Case Study: Analyzing City Vitality with Four Pillars of Activity-Live, Work, Shop, and Play.

    PubMed

    Griffin, Matt; Nordstrom, Blake W; Scholes, Jon; Joncas, Kate; Gordon, Patrick; Krivenko, Elliott; Haynes, Winston; Higdon, Roger; Stewart, Elizabeth; Kolker, Natali; Montague, Elizabeth; Kolker, Eugene

    2016-03-01

    This case study evaluates and tracks vitality of a city (Seattle), based on a data-driven approach, using strategic, robust, and sustainable metrics. This case study was collaboratively conducted by the Downtown Seattle Association (DSA) and CDO Analytics teams. The DSA is a nonprofit organization focused on making the city of Seattle and its Downtown a healthy and vibrant place to Live, Work, Shop, and Play. DSA primarily operates through public policy advocacy, community and business development, and marketing. In 2010, the organization turned to CDO Analytics ( cdoanalytics.org ) to develop a process that can guide and strategically focus DSA efforts and resources for maximal benefit to the city of Seattle and its Downtown. CDO Analytics was asked to develop clear, easily understood, and robust metrics for a baseline evaluation of the health of the city, as well as for ongoing monitoring and comparisons of the vitality, sustainability, and growth. The DSA and CDO Analytics teams strategized on how to effectively assess and track the vitality of Seattle and its Downtown. The two teams filtered a variety of data sources, and evaluated the veracity of multiple diverse metrics. This iterative process resulted in the development of a small number of strategic, simple, reliable, and sustainable metrics across four pillars of activity: Live, Work, Shop, and Play. Data during the 5 years before 2010 were used for the development of the metrics and model and its training, and data during the 5 years from 2010 and on were used for testing and validation. This work enabled DSA to routinely track these strategic metrics, use them to monitor the vitality of Downtown Seattle, prioritize improvements, and identify new value-added programs. As a result, the four-pillar approach became an integral part of the data-driven decision-making and execution of the Seattle community's improvement activities. The approach described in this case study is actionable, robust, inexpensive

  1. A Case Study: Analyzing City Vitality with Four Pillars of Activity-Live, Work, Shop, and Play.

    PubMed

    Griffin, Matt; Nordstrom, Blake W; Scholes, Jon; Joncas, Kate; Gordon, Patrick; Krivenko, Elliott; Haynes, Winston; Higdon, Roger; Stewart, Elizabeth; Kolker, Natali; Montague, Elizabeth; Kolker, Eugene

    2016-03-01

    This case study evaluates and tracks vitality of a city (Seattle), based on a data-driven approach, using strategic, robust, and sustainable metrics. This case study was collaboratively conducted by the Downtown Seattle Association (DSA) and CDO Analytics teams. The DSA is a nonprofit organization focused on making the city of Seattle and its Downtown a healthy and vibrant place to Live, Work, Shop, and Play. DSA primarily operates through public policy advocacy, community and business development, and marketing. In 2010, the organization turned to CDO Analytics ( cdoanalytics.org ) to develop a process that can guide and strategically focus DSA efforts and resources for maximal benefit to the city of Seattle and its Downtown. CDO Analytics was asked to develop clear, easily understood, and robust metrics for a baseline evaluation of the health of the city, as well as for ongoing monitoring and comparisons of the vitality, sustainability, and growth. The DSA and CDO Analytics teams strategized on how to effectively assess and track the vitality of Seattle and its Downtown. The two teams filtered a variety of data sources, and evaluated the veracity of multiple diverse metrics. This iterative process resulted in the development of a small number of strategic, simple, reliable, and sustainable metrics across four pillars of activity: Live, Work, Shop, and Play. Data during the 5 years before 2010 were used for the development of the metrics and model and its training, and data during the 5 years from 2010 and on were used for testing and validation. This work enabled DSA to routinely track these strategic metrics, use them to monitor the vitality of Downtown Seattle, prioritize improvements, and identify new value-added programs. As a result, the four-pillar approach became an integral part of the data-driven decision-making and execution of the Seattle community's improvement activities. The approach described in this case study is actionable, robust, inexpensive

  2. Comparison between Windowed FFT and Hilbert-Huang Transform for Analyzing Time Series with Poissonian Fluctuations: A Case Study

    NASA Astrophysics Data System (ADS)

    Han, Dong; Zhang, Shuang-Nan

    2006-08-01

    Hilbert-Huang Transform (HHT) is a novel data analysis technique for nonlinear and non-stationary data. We present a time-frequency analysis of both simulated light curves and an X-ray burst from the X-ray burster 4U 1702-429 with both the HHT and the Windowed Fast Fourier Transform (WFFT) methods. Our results show that the HHT method has failed in all cases for light curves with Poissonian fluctuations, which are typical for all photon counting instruments used in astronomy, whereas the WFFT method can sensitively detect the periodic signals in the presence of Poissonian fluctuations; the only drawback of the WFFT method is that it cannot detect sharp frequency variations accurately.
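
    A minimal sketch of the windowed-FFT side of this comparison, applied to a simulated light curve with Poissonian fluctuations; the count rate, modulation amplitude, and frequency are illustrative, not the parameters of 4U 1702-429:

        # Sketch: recover a periodic signal from a Poisson-noise light curve with a
        # windowed FFT. Illustrative parameters only.
        import numpy as np

        dt = 1.0 / 2048.0                        # time resolution (s)
        t = np.arange(0.0, 4.0, dt)              # one 4 s window of the light curve
        rate = 3000.0 * (1.0 + 0.2 * np.sin(2 * np.pi * 330.0 * t))   # 330 Hz modulation
        counts = np.random.default_rng(1).poisson(rate * dt)          # Poissonian fluctuations

        window = np.hanning(counts.size)         # taper the window before the FFT
        power = np.abs(np.fft.rfft((counts - counts.mean()) * window)) ** 2
        freq = np.fft.rfftfreq(counts.size, dt)

        print("strongest frequency: %.2f Hz" % freq[np.argmax(power[1:]) + 1])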

  3. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Under a NASA Small Business Innovation Research (SBIR) contract, Axiomatics Corporation developed a shunting dielectric sensor to determine the nutrient level of, and analyze, plant nutrient solutions in the CELSS, NASA's space life support program. (CELSS is an experimental facility investigating closed-cycle plant growth and food processing for long duration manned missions.) The DiComp system incorporates a shunt electrode and is especially sensitive to dielectric property changes in materials, at levels much lower than conventional sensors can detect. The analyzer has exceptional capabilities for predicting the composition of liquid streams or reactions. It measures concentrations and solids content up to 100 percent in applications like agricultural products, petrochemicals, food and beverages. The sensor is easily installed; maintenance is low, and it can be calibrated on line. The software automates data collection and analysis.

  4. Oxygen analyzer

    DOEpatents

    Benner, W.H.

    1984-05-08

    An oxygen analyzer for identifying and classifying microgram quantities of oxygen in ambient particulate matter and for quantitating organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen obtained by decomposing the sample at 1135°C, or (2) temperature-programmed oxygen thermal analysis obtained by heating the sample from room temperature to 1135°C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  5. Oxygen analyzer

    DOEpatents

    Benner, William H.

    1986-01-01

    An oxygen analyzer for identifying and classifying microgram quantities of oxygen in ambient particulate matter and for quantitating organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen obtained by decomposing the sample at 1135°C, or (2) temperature-programmed oxygen thermal analysis obtained by heating the sample from room temperature to 1135°C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  6. Response in Electrostatic Analyzers due to Backscattered Electrons: Case Study Analysis with the Juno JADE-E Instrument

    NASA Astrophysics Data System (ADS)

    Clark, G. B.; Allegrini, F.; McComas, D. J.; Randol, B. M.; Valek, P. W.

    2012-12-01

    NASA's Juno mission includes the Jovian Auroral Distribution Experiment (JADE) that will measure in situ electrons and ions in Jupiter's polar magnetosphere. JADE consists of three nearly identical electron sensors (JADE-E) and one ion sensor (JADE-I). JADE-E measures full electron pitch angle distributions from 0.1 to 100 keV. The electro-optics are based on an electrostatic analyzer (ESA) with an Ebonol-C blackening surface treatment, and a detection system composed of a MCP stack and 16 individual anodes each with a 7.5 degree field of view. A non-ideal response was observed during calibration with count rates measured on anodes adjacent to the focal anode. The integrated non-ideal response contributes up to ~25%, relative to the ideal signal, for electron beam energies ~30 keV. We propose that this response is due to backscattered electrons (BSEs) from the electron beam off of the ESA coating. With a SIMION model, we explored the angular and energy distributions of BSEs and how they affect the response of JADE-E. Non-ideal responses occur at some level in electron plasma ESAs generally, likely due to the effect of BSEs discussed here.

  7. Using the social structure of markets as a framework for analyzing vaccination debates: The case of emergency polio vaccination.

    PubMed

    Connelly, Yaron; Ziv, Arnona; Goren, Uri; Tal, Orna; Kaplan, Giora; Velan, Baruch

    2016-07-01

    The framework of the social structure of markets was used to analyze an online debate revolving around an emergency poliovirus vaccination campaign in Israel. Examination of a representative sample of 200 discussions revealed the activity of three parties: authoritative agents promoting vaccination and alternative agents promoting anti-vaccination, both representing sellers, and impartial agents representing the customers, i.e., the general public deliberating whether or not to comply with vaccination. Both sellers interacted with consumers using mechanisms of luring and convincing. The authoritative agents conveyed their message by exhibiting professionalism, building trust and offering to share information. The alternative agents spread doubts and evoked negative emotions of distrust and fear. Among themselves, the alternative agents strived to discredit the authoritative agents, while the latter preferred to ignore the former. Content analysis of the discussions conducted by the general public reveals reiteration of the messages conveyed by the sellers, implying that the transaction of pro- and anti-vaccination ideas indeed took place. We suggest that the framework of the market as a social structure can be applied to the analysis of other vaccination debates, and thereby provide additional insights into vaccination polemics.

  8. Analyzing small data sets using Bayesian estimation: the case of posttraumatic stress symptoms following mechanical ventilation in burn survivors

    PubMed Central

    van de Schoot, Rens; Broere, Joris J.; Perryck, Koen H.; Zondervan-Zwijnenburg, Mariëlle; van Loey, Nancy E.

    2015-01-01

    Background The analysis of small data sets in longitudinal studies can lead to power issues and often suffers from biased parameter values. These issues can be solved by using Bayesian estimation in conjunction with informative prior distributions. By means of a simulation study and an empirical example concerning posttraumatic stress symptoms (PTSS) following mechanical ventilation in burn survivors, we demonstrate the advantages and potential pitfalls of using Bayesian estimation. Methods First, we show how to specify prior distributions, and by means of a sensitivity analysis we demonstrate how to check the exact influence of a prior (mis-)specification. Thereafter, we show by means of a simulation the situations in which the Bayesian approach outperforms the default maximum likelihood approach. Finally, we re-analyze empirical data on burn survivors which provided preliminary evidence of an aversive influence of a period of mechanical ventilation on the course of PTSS following burns. Results Not surprisingly, maximum likelihood estimation showed insufficient coverage as well as power with very small samples. Only when Bayesian analysis was used in conjunction with informative priors did power increase to acceptable levels. As expected, we showed that the smaller the sample size, the more the results rely on the prior specification. Conclusion We show that two issues often encountered during analysis of small samples, power and biased parameters, can be solved by including prior information in Bayesian analysis. We argue that the use of informative priors should always be reported together with a sensitivity analysis. PMID:25765534
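
    How an informative prior restores power at small n can be seen in the conjugate normal case; this is a generic illustration, not the growth-model priors used in the paper:

        \theta \sim \mathcal{N}(\mu_0, \tau_0^2), \quad
        \bar{y}\mid\theta \sim \mathcal{N}(\theta, \sigma^2/n)
        \;\Longrightarrow\;
        \theta \mid \bar{y} \sim \mathcal{N}\!\left(
            \frac{\mu_0/\tau_0^2 + n\bar{y}/\sigma^2}{1/\tau_0^2 + n/\sigma^2},
            \;\left(\frac{1}{\tau_0^2} + \frac{n}{\sigma^2}\right)^{-1}
        \right).

    The posterior precision is the sum of the prior precision and the data precision, so a well-chosen informative prior keeps estimates tight when n is small, while a misspecified prior mean pulls the estimate toward the wrong value; this is exactly why reporting a sensitivity analysis over the prior is recommended.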

  9. MULTICHANNEL ANALYZER

    DOEpatents

    Kelley, G.G.

    1959-11-10

    A multichannel pulse analyzer having several window amplifiers, each amplifier serving one group of channels, with a single fast pulse-lengthener and a single novel interrogation circuit serving all channels is described. A pulse followed too closely timewise by another pulse is disregarded by the interrogation circuit to prevent errors due to pulse pileup. The window amplifiers are connected to the pulse lengthener output, rather than the linear amplifier output, so need not have the fast response characteristic formerly required.

  10. Analyzing large-scale conservation interventions with Bayesian hierarchical models: a case study of supplementing threatened Pacific salmon.

    PubMed

    Scheuerell, Mark D; Buhle, Eric R; Semmens, Brice X; Ford, Michael J; Cooney, Tom; Carmichael, Richard W

    2015-05-01

    Myriad human activities increasingly threaten the existence of many species. A variety of conservation interventions such as habitat restoration, protected areas, and captive breeding have been used to prevent extinctions. Evaluating the effectiveness of these interventions requires appropriate statistical methods, given the quantity and quality of available data. Historically, analysis of variance has been used with some form of predetermined before-after control-impact design to estimate the effects of large-scale experiments or conservation interventions. However, ad hoc retrospective study designs or the presence of random effects at multiple scales may preclude the use of these tools. We evaluated the effects of a large-scale supplementation program on the density of adult Chinook salmon Oncorhynchus tshawytscha from the Snake River basin in the northwestern United States currently listed under the U.S. Endangered Species Act. We analyzed 43 years of data from 22 populations, accounting for random effects across time and space using a form of Bayesian hierarchical time-series model common in analyses of financial markets. We found that varying degrees of supplementation over a period of 25 years increased the density of natural-origin adults, on average, by 0-8% relative to nonsupplementation years. Thirty-nine of the 43 year effects were at least two times larger in magnitude than the mean supplementation effect, suggesting common environmental variables play a more important role in driving interannual variability in adult density. Additional residual variation in density varied considerably across the region, but there was no systematic difference between supplemented and reference populations. Our results demonstrate the power of hierarchical Bayesian models to detect the diffuse effects of management interventions and to quantitatively describe the variability of intervention success. Nevertheless, our study could not address whether ecological factors

  11. Analyzing large-scale conservation interventions with Bayesian hierarchical models: a case study of supplementing threatened Pacific salmon.

    PubMed

    Scheuerell, Mark D; Buhle, Eric R; Semmens, Brice X; Ford, Michael J; Cooney, Tom; Carmichael, Richard W

    2015-05-01

    Myriad human activities increasingly threaten the existence of many species. A variety of conservation interventions such as habitat restoration, protected areas, and captive breeding have been used to prevent extinctions. Evaluating the effectiveness of these interventions requires appropriate statistical methods, given the quantity and quality of available data. Historically, analysis of variance has been used with some form of predetermined before-after control-impact design to estimate the effects of large-scale experiments or conservation interventions. However, ad hoc retrospective study designs or the presence of random effects at multiple scales may preclude the use of these tools. We evaluated the effects of a large-scale supplementation program on the density of adult Chinook salmon Oncorhynchus tshawytscha from the Snake River basin in the northwestern United States currently listed under the U.S. Endangered Species Act. We analyzed 43 years of data from 22 populations, accounting for random effects across time and space using a form of Bayesian hierarchical time-series model common in analyses of financial markets. We found that varying degrees of supplementation over a period of 25 years increased the density of natural-origin adults, on average, by 0-8% relative to nonsupplementation years. Thirty-nine of the 43 year effects were at least two times larger in magnitude than the mean supplementation effect, suggesting common environmental variables play a more important role in driving interannual variability in adult density. Additional residual variation in density varied considerably across the region, but there was no systematic difference between supplemented and reference populations. Our results demonstrate the power of hierarchical Bayesian models to detect the diffuse effects of management interventions and to quantitatively describe the variability of intervention success. Nevertheless, our study could not address whether ecological factors

  12. Analyzing large-scale conservation interventions with Bayesian hierarchical models: a case study of supplementing threatened Pacific salmon

    PubMed Central

    Scheuerell, Mark D; Buhle, Eric R; Semmens, Brice X; Ford, Michael J; Cooney, Tom; Carmichael, Richard W

    2015-01-01

    Myriad human activities increasingly threaten the existence of many species. A variety of conservation interventions such as habitat restoration, protected areas, and captive breeding have been used to prevent extinctions. Evaluating the effectiveness of these interventions requires appropriate statistical methods, given the quantity and quality of available data. Historically, analysis of variance has been used with some form of predetermined before-after control-impact design to estimate the effects of large-scale experiments or conservation interventions. However, ad hoc retrospective study designs or the presence of random effects at multiple scales may preclude the use of these tools. We evaluated the effects of a large-scale supplementation program on the density of adult Chinook salmon Oncorhynchus tshawytscha from the Snake River basin in the northwestern United States currently listed under the U.S. Endangered Species Act. We analyzed 43 years of data from 22 populations, accounting for random effects across time and space using a form of Bayesian hierarchical time-series model common in analyses of financial markets. We found that varying degrees of supplementation over a period of 25 years increased the density of natural-origin adults, on average, by 0–8% relative to nonsupplementation years. Thirty-nine of the 43 year effects were at least two times larger in magnitude than the mean supplementation effect, suggesting common environmental variables play a more important role in driving interannual variability in adult density. Additional residual variation in density varied considerably across the region, but there was no systematic difference between supplemented and reference populations. Our results demonstrate the power of hierarchical Bayesian models to detect the diffuse effects of management interventions and to quantitatively describe the variability of intervention success. Nevertheless, our study could not address whether ecological

  13. Reflections of ions in electrostatic analyzers: a case study with New Horizons/Solar Wind Around Pluto.

    PubMed

    Randol, B M; Ebert, R W; Allegrini, F; McComas, D J; Schwadron, N A

    2010-11-01

    Electrostatic analyzers (ESAs), in various forms, are used to measure plasma in a range of applications. In this article, we describe how ions reflect from the interior surfaces of an ESA, the detection of which constitutes a fundamentally nonideal response of ESAs. We demonstrate this effect by comparing laboratory data from a real ESA-based space instrument, the Solar Wind Around Pluto (SWAP) instrument, aboard the NASA New Horizons spacecraft, to results from a model based on quantum mechanical simulations of particles reflected from the instrument's surfaces combined with simulations of particle trajectories through the instrument's applied electrostatic fields. Thus, we show, for the first time, how reflected ions in ESAs lead to nonideal effects that have important implications for understanding the data returned by these instruments, as well as for designing new low-background ESA-based instruments. Specifically, we show that the response of SWAP widens considerably below a level of 10(-3) of the peak response. Thus, a direct measurement of a plasma distribution with SWAP will have an energy-dependent background on the order of ≤10(-3) of the peak of the signal due to that distribution. We predict that this order of magnitude estimate for the background applies to a large number of ESA-based instruments because ESAs operate using a common principle. However, the exact shape of the energy-dependent response will be different for different instruments. The principle of operation is that ions outside the ideal range of energy-per-charge are deflected into the walls of the ESA. Therefore, we propose that a new design paradigm is necessary to mitigate the effect of ion reflections and thus accurately and directly measure the energy spectrum of a plasma using ESAs. In this article, we build a framework for minimizing the effect of ion reflections in the design of new ESAs. Through the use of existing computer simulation software, a design team can use our method

  14. Reflections of ions in electrostatic analyzers: a case study with New Horizons/Solar Wind Around Pluto.

    PubMed

    Randol, B M; Ebert, R W; Allegrini, F; McComas, D J; Schwadron, N A

    2010-11-01

    Electrostatic analyzers (ESAs), in various forms, are used to measure plasma in a range of applications. In this article, we describe how ions reflect from the interior surfaces of an ESA, the detection of which constitutes a fundamentally nonideal response of ESAs. We demonstrate this effect by comparing laboratory data from a real ESA-based space instrument, the Solar Wind Around Pluto (SWAP) instrument, aboard the NASA New Horizons spacecraft, to results from a model based on quantum mechanical simulations of particles reflected from the instrument's surfaces combined with simulations of particle trajectories through the instrument's applied electrostatic fields. Thus, we show, for the first time, how reflected ions in ESAs lead to nonideal effects that have important implications for understanding the data returned by these instruments, as well as for designing new low-background ESA-based instruments. Specifically, we show that the response of SWAP widens considerably below a level of 10(-3) of the peak response. Thus, a direct measurement of a plasma distribution with SWAP will have an energy-dependent background on the order of ≤10(-3) of the peak of the signal due to that distribution. We predict that this order of magnitude estimate for the background applies to a large number of ESA-based instruments because ESAs operate using a common principle. However, the exact shape of the energy-dependent response will be different for different instruments. The principle of operation is that ions outside the ideal range of energy-per-charge are deflected into the walls of the ESA. Therefore, we propose that a new design paradigm is necessary to mitigate the effect of ion reflections and thus accurately and directly measure the energy spectrum of a plasma using ESAs. In this article, we build a framework for minimizing the effect of ion reflections in the design of new ESAs. Through the use of existing computer simulation software, a design team can use our method

  15. Reflections of ions in electrostatic analyzers: A case study with New Horizons/Solar Wind Around Pluto

    SciTech Connect

    Randol, B. M.; Ebert, R. W.; Allegrini, F.; McComas, D. J.; Schwadron, N. A.

    2010-11-15

    Electrostatic analyzers (ESAs), in various forms, are used to measure plasma in a range of applications. In this article, we describe how ions reflect from the interior surfaces of an ESA, the detection of which constitutes a fundamentally nonideal response of ESAs. We demonstrate this effect by comparing laboratory data from a real ESA-based space instrument, the Solar Wind Around Pluto (SWAP) instrument, aboard the NASA New Horizons spacecraft, to results from a model based on quantum mechanical simulations of particles reflected from the instrument's surfaces combined with simulations of particle trajectories through the instrument's applied electrostatic fields. Thus, we show, for the first time, how reflected ions in ESAs lead to nonideal effects that have important implications for understanding the data returned by these instruments, as well as for designing new low-background ESA-based instruments. Specifically, we show that the response of SWAP widens considerably below a level of 10{sup -3} of the peak response. Thus, a direct measurement of a plasma distribution with SWAP will have an energy-dependent background on the order of {<=}10{sup -3} of the peak of the signal due to that distribution. We predict that this order of magnitude estimate for the background applies to a large number of ESA-based instruments because ESAs operate using a common principle. However, the exact shape of the energy-dependent response will be different for different instruments. The principle of operation is that ions outside the ideal range of energy-per-charge are deflected into the walls of the ESA. Therefore, we propose that a new design paradigm is necessary to mitigate the effect of ion reflections and thus accurately and directly measure the energy spectrum of a plasma using ESAs. In this article, we build a framework for minimizing the effect of ion reflections in the design of new ESAs. Through the use of existing computer simulation software, a design team can use

  16. Contamination Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  17. Stress Analyzer

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The SPATE 9000 Dynamic Stress Analyzer takes its name from Stress Pattern Analysis by Thermal Emission. It detects stress-induced temperature changes in a structure and indicates the degree of stress. Ometron, Inc.'s SPATE 9000 consists of a scan unit and a data display. The scan unit contains an infrared channel focused on the test structure to collect thermal radiation, and a visual channel used to set up the scan area and interrogate the stress display. Stress data are produced by detecting minute temperature changes, down to one-thousandth of a degree Centigrade, resulting from the application of dynamic loading to the structure. The electronic data processing system correlates the temperature changes with a reference signal to determine the stress level.

  18. Optical analyzer

    DOEpatents

    Hansen, A.D.

    1987-09-28

    An optical analyzer wherein a sample of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter is placed in a combustion tube, and light from a light source is passed through the sample. The temperature of the sample is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample is detected as the temperature is raised. A data processor, differentiator and a two pen recorder provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample. These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample. Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters. 7 figs.

  19. Preliminary 3D In-situ measurements of the texture evolution of strained H2O ice during annealing using neutron Laue diffractometry

    NASA Astrophysics Data System (ADS)

    Journaux, Baptiste; Montagnat, Maurine; Chauve, Thomas; Ouladdiaf, Bachir; Allibon, John

    2015-04-01

    Dynamic recrystallization (DRX) strongly affects the evolution of microstructure (grain size and shape) and texture (crystal preferred orientation) in materials during deformation at high temperature. Since texturing leads to anisotropic physical properties, predicting the effect of DRX is essential for industrial applications, for interpreting geophysical data and modeling geodynamic flows, and predicting ice sheet flow and climate evolution. A large amount of literature is available related to metallurgy, geology or glaciology, but fundamental questions remain about the relationship between nucleation, grain boundary migration and texture development at the microscopic scale. Previous measurements of DRX in ice were either conducted using 2D ex-situ techniques such as AITA [1,2] or Electron Backscattering Diffraction (EBSD) [3], or using 3D statistical ex-situ [4] or in-situ [5] techniques. Nevertheless, all these techniques failed to observe DRX at the scale of nucleation processes in full 3D. Here we present a new approach using neutron Laue diffraction, which enables 3D measurements of in-situ texture evolution of strained polycrystalline H2O ice (>2% at 266 K) during annealing at the microscopic scale. Thanks to the CYCLOPS instrument [6] (Institut Laue Langevin Grenoble, France) and the intrinsic low background of this setup, preliminary observations enabled us to follow, in H2O ice, the evolution of serrated grain boundaries and kink bands during annealing. Our observations show a significant evolution of the texture and internal misorientation over the course of a few hours at an annealing temperature of 268.5 K. In contrast, ice kink-band structures seem to be very stable over time at near melting temperatures. The same samples have been analyzed ex-situ using EBSD for comparison. These results represent a first step toward in-situ microscopic measurements of dynamic recrystallization processes in ice during strain. This

  20. Optical analyzer

    DOEpatents

    Hansen, Anthony D.

    1989-02-07

    An optical analyzer (10) wherein a sample (19) of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter (20) is placed in a combustion tube (11), and light from a light source (14) is passed through the sample (19). The temperature of the sample (19) is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample (19) is detected (18) as the temperature is raised. A data processor (23), differentiator (28) and a two pen recorder (24) provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample (19). These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample (19). Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters.

  1. Optical analyzer

    DOEpatents

    Hansen, Anthony D.

    1989-01-01

    An optical analyzer (10) wherein a sample (19) of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter (20) is placed in a combustion tube (11), and light from a light source (14) is passed through the sample (19). The temperature of the sample (19) is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample (19) is detected (18) as the temperature is raised. A data processor (23), differentiator (28) and a two pen recorder (24) provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample (19). These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample (19). Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters.

  2. ABSORPTION ANALYZER

    DOEpatents

    Brooksbank, W.A. Jr.; Leddicotte, G.W.; Strain, J.E.; Hendon, H.H. Jr.

    1961-11-14

    A means was developed for continuously computing and indicating the isotopic assay of a process solution and for automatically controlling the process output of isotope separation equipment to provide a continuous output of the desired isotopic ratio. A counter tube is surrounded with a sample to be analyzed so that the tube is exactly in the center of the sample. A source of fast neutrons is provided and is spaced from the sample. The neutrons from the source are thermalized by causing them to pass through a neutron moderator, and the neutrons are allowed to diffuse radially through the sample to actuate the counter. A reference counter in a known sample of pure solvent is also actuated by the thermal neutrons from the neutron source. The number of neutrons which actuate the detectors is a function of the concentration of the elements in solution and their neutron absorption cross sections. The pulses produced by the detectors responsive to each neutron passing therethrough are amplified and counted. The respective times required to accumulate a selected number of counts are measured by associated timing devices. The concentration of a particular element in solution may be determined by utilizing the following relation: T2/T1 = BCR, where B is a constant proportional to the absorption cross sections, T2 is the time of count collection for the unknown solution, T1 is the time of count collection for the pure solvent, R is the isotopic ratio, and C is the molar concentration of the element to be determined. Knowing the slope constant B for any element and when the chemical concentration is known, the isotopic concentration may be readily determined, and conversely, when the isotopic ratio is known, the chemical concentrations may be determined. (AEC)
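
    The relation T2/T1 = BCR can be rearranged to give the molar concentration directly once the slope constant B and the isotopic ratio R are known. A minimal sketch, using purely hypothetical values for B, R, and the count-collection times:

    ```python
    # Illustrative use of the relation T2/T1 = B*C*R stated in the abstract:
    # given the slope constant B and the isotopic ratio R, the molar concentration
    # C follows from the two count-collection times. All numbers are hypothetical.

    def molar_concentration(t_unknown, t_solvent, slope_b, isotopic_ratio):
        """Return C from T2/T1 = B*C*R."""
        return (t_unknown / t_solvent) / (slope_b * isotopic_ratio)

    T2 = 125.0   # time to accumulate the preset count for the unknown solution, s
    T1 = 100.0   # time to accumulate the same count for the pure solvent, s
    B = 2.5      # hypothetical slope constant
    R = 0.1      # hypothetical isotopic ratio
    print(f"C = {molar_concentration(T2, T1, B, R):.2f} mol/L (illustrative)")
    ```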

  3. High pressure Laue diffraction and its application to study microstructural changes during the α → β phase transition in Si

    SciTech Connect

    Popov, D.; Park, C.; Kenney-Benson, C.; Shen, G.

    2015-07-15

    An approach using polychromatic x-ray Laue diffraction is described for studying pressure induced microstructural changes of materials under pressure. The advantages of this approach with respect to application of monochromatic x-ray diffraction and other techniques are discussed. Experiments to demonstrate the applications of the method have been performed on the α → β phase transition in Si at high pressures using a diamond anvil cell. We present the characterization of microstructures across the α–β phase transition, such as morphology of both the parent and product phases, relative orientation of single-crystals, and deviatoric strains. Subtle inhomogeneous strain of the single-crystal sample caused by lattice rotations becomes detectable with the approach.

  4. A new method to do time resolved, x-ray diffraction studies: The rotating crystal Laue method

    SciTech Connect

    Knapp, G.S.; Beno, M.A.

    1991-07-01

    In order to achieve the ultimate time resolution of a synchrotron source we propose a new experimental technique by which time dependent structural changes can be monitored on the time scale of synchrotron pulse widths. Samples will be studied by a rotating crystal Laue diffraction technique where we rapidly spin the sample and observe the diffraction pattern from a broad band of incident x-rays. A computer simulation is presented of the diffraction pattern time evolution using the parameters for an APS undulator of a phase change in the YBa{sub 2}Cu{sub 3}O{sub 7{minus}x} superconductor. We will discuss the application of this and closely related techniques at other synchrotron sources including bending magnets and insertion devices at NSLS and CHESS. 11 refs., 4 figs., 1 tab.

  5. Crystallography from Haüy to Laue: controversies on the molecular and atomistic nature of solids.

    PubMed

    Kubbinga, Henk

    2012-01-01

    The history of crystallography has been assessed in the context of the emergence and spread of the molecular theory. The present paper focuses on the 19th century, which saw the emancipation of crystallography as a science sui generis. Around 1800, Laplace's molecularism called the tune in the various sciences (physics, chemistry, biology, crystallography). In crystallography, two schools opposed each other: that of Weiss, in Berlin, and that of Haüy, in Paris. Symmetry proved essential. It will be shown how the lattice theory arose in an essentially molecular framework and how group theory imposed itself. The salt hydrates suggested the idea of (two or more) superimposed molecular lattices. Gradually it became clear that an ultimate lattice theory ought to be atomic. The experiments of Laue, Friedrich and Knipping confirmed that atomic basis.

  6. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data.

    PubMed

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M

    2015-03-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method.

  7. [Development of a Lead-covered Case for a Wireless X-ray Output Analyzer to Perform CT Half-value Layer Measurements].

    PubMed

    Akaishi, Hirokazu; Takeda, Hiromitsu; Kanazawa, Yoshiyuki; Yoshii, Yuji; Asanuma, Osamu

    2016-03-01

    Measurement of the half-value layer (HVL) is a difficult task in computed tomography (CT) , because a nonrotating X-ray tube must be used. The purpose of this study is to develop a lead-covered case, which enables HVL measurements with a rotating CT X-ray tube. The lead-covered case was manufactured from acrylic and lead plates, which are 3 mm thick and have a slit. The slit-detector distance can be selected between 14 mm and 122 mm. HVL measurements were performed using a wireless X-ray output analyzer "Piranha." We used the following exposure conditions: tube voltages of 80, 100, and 120 kV; a tube current of 550 mA; and an exposure time of 1.0 s. The HVLs were measured by using the following two methods: (a) Nonrotating method-a conventional method that uses the nonrotating exposure mode. (b) Rotating method-a new method that uses the lead-covered case and the rotating exposure mode. As a result, when the slit-detector distance was 58 mm, the HVL values obtained by the nonrotating and rotating methods were 4.38 and 4.24 mmAl at 80 kV, 5.51 and 5.37 mmAl at 100 kV, 6.61 and 6.48 mmAl at 120 kV, respectively. A lead-covered case, which enables the measurement of the HVL in a rotating X-ray tube, was developed. The case is useful in measuring the HVLs at facilities that cannot fix the X-ray tube.
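
    The abstract reports measured HVLs but not the reduction step; one common way to obtain an HVL from a filtration series is log-linear interpolation around the half-transmission point. A hedged sketch with hypothetical readings:

    ```python
    # Hedged sketch: estimate the half-value layer (HVL) from detector readings taken
    # with increasing aluminium filtration, interpolating around the 50% point under
    # an assumed exponential fall-off between neighbouring points. Readings are hypothetical.
    import math

    def half_value_layer(thicknesses_mm, readings):
        target = readings[0] / 2.0
        for i in range(len(readings) - 1):
            x1, x2 = thicknesses_mm[i], thicknesses_mm[i + 1]
            y1, y2 = readings[i], readings[i + 1]
            if y1 >= target >= y2:                       # bracketing interval found
                mu = math.log(y1 / y2) / (x2 - x1)       # local attenuation coefficient
                return x1 + math.log(y1 / target) / mu
        raise ValueError("readings never fall to half of the unfiltered value")

    al_mm    = [0.0, 2.0, 4.0, 6.0, 8.0]                 # added Al thickness, mm
    readings = [100.0, 71.0, 52.0, 39.0, 30.0]           # detector readings, arbitrary units
    print(f"HVL = {half_value_layer(al_mm, readings):.2f} mm Al (illustrative)")
    ```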

  8. A new method for polychromatic X-ray μLaue diffraction on a Cu pillar using an energy-dispersive pn-junction charge-coupled device.

    PubMed

    Abboud, A; Kirchlechner, C; Send, S; Micha, J S; Ulrich, O; Pashniak, N; Strüder, L; Keckes, J; Pietsch, U

    2014-11-01

    μLaue diffraction with a polychromatic X-ray beam can be used to measure strain fields and crystal orientations of micro crystals. The hydrostatic strain tensor can be obtained once the energy profile of the reflections is measured. However, this remains a challenge both on the time scale and reproducibility of the beam position on the sample. In this review, we present a new approach to obtain the spatial and energy profiles of Laue spots by using a pn-junction charge-coupled device, an energy-dispersive area detector providing 3D resolution of incident X-rays. The morphology and energetic structure of various Bragg peaks from a single crystalline Cu micro-cantilever used as a test system were simultaneously acquired. The method facilitates the determination of the Laue spots' energy spectra without filtering the white X-ray beam. The synchrotron experiment was performed at the BM32 beamline of ESRF using polychromatic X-rays in the energy range between 5 and 25 keV and a beam size of 0.5 μm × 0.5 μm. The feasibility test on the well known system demonstrates the capabilities of the approach and introduces the "3D detector method" as a promising tool for material investigations to separate bending and strain for technical materials.

  9. A new method for polychromatic X-ray μLaue diffraction on a Cu pillar using an energy-dispersive pn-junction charge-coupled device

    SciTech Connect

    Abboud, A.; Send, S.; Pashniak, N.; Pietsch, U.; Kirchlechner, C.; Micha, J. S.; Ulrich, O.; Keckes, J.

    2014-11-15

    μLaue diffraction with a polychromatic X-ray beam can be used to measure strain fields and crystal orientations of micro crystals. The hydrostatic strain tensor can be obtained once the energy profile of the reflections is measured. However, this remains a challenge both on the time scale and reproducibility of the beam position on the sample. In this review, we present a new approach to obtain the spatial and energy profiles of Laue spots by using a pn-junction charge-coupled device, an energy-dispersive area detector providing 3D resolution of incident X-rays. The morphology and energetic structure of various Bragg peaks from a single crystalline Cu micro-cantilever used as a test system were simultaneously acquired. The method facilitates the determination of the Laue spots’ energy spectra without filtering the white X-ray beam. The synchrotron experiment was performed at the BM32 beamline of ESRF using polychromatic X-rays in the energy range between 5 and 25 keV and a beam size of 0.5 μm × 0.5 μm. The feasibility test on the well known system demonstrates the capabilities of the approach and introduces the “3D detector method” as a promising tool for material investigations to separate bending and strain for technical materials.
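
    For context, the photon energy selected by a given Laue reflection follows from the Bragg condition E = hc/(2 d sin θ); the lattice spacing and Bragg angle in the sketch below are hypothetical example values, not data from the experiment.

    ```python
    # Hedged sketch: photon energy diffracted by a Laue reflection via the Bragg
    # condition E = h*c / (2 * d * sin(theta)). Example numbers are hypothetical.
    import math

    H_C_KEV_ANGSTROM = 12.398  # h*c in keV*Angstrom (approximate)

    def laue_spot_energy_keV(d_angstrom, theta_deg):
        """Photon energy (keV) diffracted by planes of spacing d at Bragg angle theta."""
        return H_C_KEV_ANGSTROM / (2.0 * d_angstrom * math.sin(math.radians(theta_deg)))

    # Example: Cu(111) spacing of about 2.087 Angstrom at a 10 degree Bragg angle
    print(f"E = {laue_spot_energy_keV(2.087, 10.0):.1f} keV")
    ```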

  10. Diffraction properties of multilayer Laue lenses with an aperture of 102 µm and WSi₂/Al bilayers.

    PubMed

    Kubec, Adam; Kujala, Naresh; Conley, Raymond; Bouet, Nathalie; Zhou, Juan; Mooney, Tim M; Shu, Deming; Kirchman, Jeffrey; Goetze, Kurt; Maser, Jörg; Macrander, Albert

    2015-10-19

    We report on the characterization of a multilayer Laue lens (MLL) with large acceptance, made of a novel WSi2/Al bilayer system. Fabrication of multilayers with large deposition thickness is required to obtain MLL structures with sufficient apertures capable of accepting the full lateral coherence length of x-rays at typical nanofocusing beamlines. To date, the total deposition thickness has been limited by stress-buildup in the multilayer. We were able to grow WSi2/Al with low grown-in stress, and assess the degree of stress reduction. X-ray diffraction experiments were conducted at beamline 1-BM at the Advanced Photon Source. We used monochromatic x-rays with a photon energy of 12 keV and a bandwidth of ΔE/E=5.4·10(-4). The MLL was grown with parallel layer interfaces, and was designed to have a large focal length of 9.6 mm. The mounted lens was 2.7 mm in width. We found and quantified kinks and bending of sections of the MLL. Sections with bending were found to partly have a systematic progression in the interface angles. We observed kinking in some, but not all, areas. The measurements are compared with dynamic diffraction calculations made with Coupled Wave Theory. Data are plotted showing the diffraction efficiency as a function of the external tilting angle of the entire mounted lens. This way of plotting the data was found to provide an overview of the diffraction properties of the whole lens, and enabled the following layer tilt analyses.
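
    As a rough, illustrative companion to the quoted geometry (102 µm aperture, 9.6 mm focal length, 12 keV photons), the numerical aperture and an approximate diffraction-limited focus can be estimated with the common Δ ≈ λ/(2·NA) rule; this back-of-the-envelope calculation is not a result from the paper.

    ```python
    # Rough, illustrative estimate (not from the paper): numerical aperture and
    # diffraction-limited focus of a linear lens with the quoted aperture and focal length.

    def wavelength_angstrom(energy_keV):
        return 12.398 / energy_keV             # lambda = hc/E, hc ~ 12.398 keV*Angstrom

    aperture_m = 102e-6                        # full aperture from the title
    focal_m = 9.6e-3                           # focal length from the abstract
    wl_m = wavelength_angstrom(12.0) * 1e-10   # 12 keV photons

    na = (aperture_m / 2.0) / focal_m          # small-angle numerical aperture
    spot_m = wl_m / (2.0 * na)                 # common diffraction-limit estimate
    print(f"NA = {na:.2e}, diffraction-limited focus = {spot_m * 1e9:.1f} nm")
    ```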

  11. Computation of diffuse scattering arising from one-phonon excitations in a neutron time-of-flight single-crystal Laue diffraction experiment

    PubMed Central

    Gutmann, Matthias J.; Graziano, Gabriella; Mukhopadhyay, Sanghamitra; Refson, Keith; von Zimmerman, Martin

    2015-01-01

    Direct phonon excitation in a neutron time-of-flight single-crystal Laue diffraction experiment has been observed in a single crystal of NaCl. At room temperature both phonon emission and excitation leave characteristic features in the diffuse scattering and these are well reproduced using ab initio phonons from density functional theory (DFT). A measurement at 20 K illustrates the effect of thermal population of the phonons, leaving the features corresponding to phonon excitation and strongly suppressing the phonon annihilation. A recipe is given to compute these effects combining DFT results with the geometry of the neutron experiment. PMID:26306090
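
    The suppression of phonon annihilation at low temperature follows from the Bose-Einstein occupation factor: one-phonon emission scales as n + 1 and annihilation as n. A short sketch for a hypothetical 10 meV phonon:

    ```python
    # Bose-Einstein occupation n(E, T): one-phonon emission scales as (n + 1),
    # annihilation as n, so annihilation is strongly suppressed at low temperature.
    # The 10 meV phonon energy below is a hypothetical example value.
    import math

    K_B_MEV_PER_K = 0.08617  # Boltzmann constant in meV/K

    def bose_occupation(energy_meV, temperature_K):
        return 1.0 / math.expm1(energy_meV / (K_B_MEV_PER_K * temperature_K))

    for T in (300.0, 20.0):
        n = bose_occupation(10.0, T)
        print(f"T = {T:5.1f} K: emission weight {n + 1:.4f}, annihilation weight {n:.4f}")
    ```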

  12. High-energy transmission Laue micro-beam X-ray diffraction: a probe for intra-granular lattice orientation and elastic strain in thicker samples.

    PubMed

    Hofmann, Felix; Song, Xu; Abbey, Brian; Jun, Tea-Sung; Korsunsky, Alexander M

    2012-05-01

    An understanding of the mechanical response of modern engineering alloys to complex loading conditions is essential for the design of load-bearing components in high-performance safety-critical aerospace applications. A detailed knowledge of how material behaviour is modified by fatigue and the ability to predict failure reliably are vital for enhanced component performance. Unlike macroscopic bulk properties (e.g. stiffness, yield stress, etc.) that depend on the average behaviour of many grains, material failure is governed by `weakest link'-type mechanisms. It is strongly dependent on the anisotropic single-crystal elastic-plastic behaviour, local morphology and microstructure, and grain-to-grain interactions. For the development and validation of models that capture these complex phenomena, the ability to probe deformation behaviour at the micro-scale is key. The diffraction of highly penetrating synchrotron X-rays is well suited to this purpose and micro-beam Laue diffraction is a particularly powerful tool that has emerged in recent years. Typically it uses photon energies of 5-25 keV, limiting penetration into the material, so that only thin samples or near-surface regions can be studied. In this paper the development of high-energy transmission Laue (HETL) micro-beam X-ray diffraction is described, extending the micro-beam Laue technique to significantly higher photon energies (50-150 keV). It allows the probing of thicker sample sections, with the potential for grain-level characterization of real engineering components. The new HETL technique is used to study the deformation behaviour of individual grains in a large-grained polycrystalline nickel sample during in situ tensile loading. Refinement of the Laue diffraction patterns yields lattice orientations and qualitative information about elastic strains. After deformation, bands of high lattice misorientation can be identified in the sample. Orientation spread within individual scattering volumes is

  13. Shedding Light on the Photochemistry of Coinage-Metal Phosphorescent Materials: A Time-Resolved Laue Diffraction Study of an AgI–CuI Tetranuclear Complex

    PubMed Central

    Jarzembska, Katarzyna N.; Kamiński, Radosław; Fournier, Bertrand; Trzop, Elżbieta; Sokolow, Jesse D.; Henning, Robert; Chen, Yang; Coppens, Philip

    2015-01-01

    The triplet excited state of a new crystalline form of a tetranuclear coordination d10–d10-type complex, Ag2Cu2L4 (L = 2-diphenylphosphino-3-methylindole ligand), containing AgI and CuI metal centers has been explored using the Laue pump–probe technique with ≈80 ps time resolution. The relatively short lifetime of 1 μs is accompanied by significant photoinduced structural changes, as large as the Ag1···Cu2 distance shortening by 0.59(3) Å. The results show a pronounced strengthening of the argentophilic interactions and formation of new Ag···Cu bonds on excitation. Theoretical calculations indicate that the structural changes are due to a ligand-to-metal charge transfer (LMCT) strengthening the Ag···Ag interaction, mainly occurring from the methylindole ligands to the silver metal centers. QM/MM optimizations of the ground and excited states of the complex support the experimental results. Comparison with isolated molecule optimizations demonstrates the restricting effect of the crystalline matrix on photoinduced distortions. The work represents the first time-resolved Laue diffraction study of a heteronuclear coordination complex and provides new information on the nature of photoresponse of coinage metal complexes, which have been the subject of extensive studies. PMID:25238405

  14. Analyzing the impacts of final demand changes on total output using input-output approach: The case of Japanese ICT sectors

    NASA Astrophysics Data System (ADS)

    Zuhdi, Ubaidillah

    2014-03-01

    The purpose of this study is to analyze the impacts of final demand changes on the total output of the Japanese Information and Communication Technologies (ICT) sectors in the future. This study employs one of the analysis tools of Input-Output (IO) analysis, the demand-pull IO quantity model, to achieve that purpose. Three final demand changes are used in this study, namely (1) export, (2) import, and (3) outside households consumption changes. This study focuses on the "pure change" condition, in which final demand changes appear only in the analyzed sectors. The results show that the export and outside households consumption changes give a positive impact, while the opposite impact is seen for the import change.
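
    The demand-pull quantity model referred to here is the standard Leontief relation x = (I - A)^-1 f, so a final-demand change Δf maps to an output change Δx = (I - A)^-1 Δf. A minimal sketch with an entirely hypothetical three-sector coefficient matrix:

    ```python
    # Demand-pull Leontief quantity model: delta_x = inv(I - A) @ delta_f.
    # The 3-sector technical-coefficient matrix A and the final-demand change
    # below are hypothetical, used only to illustrate the calculation.
    import numpy as np

    A = np.array([[0.10, 0.05, 0.02],     # hypothetical input coefficients
                  [0.20, 0.15, 0.10],
                  [0.05, 0.10, 0.20]])
    leontief_inverse = np.linalg.inv(np.eye(3) - A)

    delta_f = np.array([10.0, 0.0, 0.0])  # e.g. an export increase in sector 1 only
    delta_x = leontief_inverse @ delta_f  # resulting change in total output by sector
    print("Output change by sector:", np.round(delta_x, 3))
    ```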

  15. A Method for Using Adjacency Matrices to Analyze the Connections Students Make within and between Concepts: The Case of Linear Algebra

    ERIC Educational Resources Information Center

    Selinski, Natalie E.; Rasmussen, Chris; Wawro, Megan; Zandieh, Michelle

    2014-01-01

    The central goals of most introductory linear algebra courses are to develop students' proficiency with matrix techniques, to promote their understanding of key concepts, and to increase their ability to make connections between concepts. In this article, we present an innovative method using adjacency matrices to analyze students'…
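
    As a generic illustration of the idea (the concepts and links below are hypothetical, not taken from the article), powers of an adjacency matrix count the indirect, multi-step connections between concepts:

    ```python
    # Hypothetical illustration: rows/columns are concepts, a 1 means a student
    # explicitly linked two concepts. A @ A counts two-step (indirect) connections.
    import numpy as np

    concepts = ["span", "basis", "linear independence", "rank"]
    A = np.array([[0, 1, 1, 0],    # hypothetical links drawn from an interview
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]])

    two_step = A @ A               # entry (i, j): number of two-step paths i -> j
    print("Direct links:\n", A)
    print("Two-step connections:\n", two_step)
    ```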

  16. Analyzing the Classroom Teachers' Levels of Creating a Constructivist Learning Environments in Terms of Various Variables: A Mersin Case

    ERIC Educational Resources Information Center

    Üredi, Lütfi

    2014-01-01

    In this research, it was aimed to analyze the classroom teachers' level of creating a constructivist learning environment in terms of various variables. For that purpose, relational screening model was used in the research. Classroom teachers' level of creating a constructivist learning environment was determined using the…

  17. Sagittal focusing Laue monochromator

    DOEpatents

    Zhong, Zhong; Hanson, Jonathan; Hastings, Jerome; Kao, Chi-Chang; Lenhard, Anthony; Siddons, David Peter; Zhong, Hui

    2009-03-24

    An x-ray focusing device generally includes a slide pivotable about a pivot point defined at a forward end thereof, a rail unit fixed with respect to the pivotable slide, a forward crystal for focusing x-rays disposed at the forward end of the pivotable slide and a rearward crystal for focusing x-rays movably coupled to the pivotable slide and the fixed rail unit at a distance rearward from the forward crystal. The forward and rearward crystals define reciprocal angles of incidence with respect to the pivot point, wherein pivoting of the slide about the pivot point changes the incidence angles of the forward and rearward crystals while simultaneously changing the distance between the forward and rearward crystals.

  18. Analyzing geographic clustered response

    SciTech Connect

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1991-08-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and improve the algorithm. 21 refs., 15 figs., 2 tabs.

  19. Feasibility analyses for HEU to LEU fuel conversion of the Laue Langevin Institute (ILL) High Flux Reactor (RHF).

    SciTech Connect

    Stevens, J.; Tentner, A.; Bergeron, A.; Nuclear Engineering Division

    2010-08-19

    The High Flux Reactor (RHF) of the Laue Langevin Institute (ILL), based in Grenoble, France, is a research reactor designed primarily for neutron beam experiments for fundamental science. It delivers one of the most intense neutron fluxes worldwide, with an unperturbed thermal neutron flux of 1.5 x 10{sup 15} n/cm{sup 2}/s in its reflector. The reactor has been conceived to operate at a nuclear power of 57 MW but currently operates at 52 MW. The reactor currently uses a Highly Enriched Uranium (HEU) fuel. In the framework of its non-proliferation policies, the international community presently aims to minimize the amount of nuclear material available that could be used for nuclear weapons. In this geopolitical context, most worldwide research and test reactors have already started a program of conversion to the use of Low Enriched Uranium (LEU) fuel. A new type of LEU fuel based on a mixture of uranium and molybdenum (UMo) is expected to allow the conversion of compact high performance reactors like the RHF. This report presents the results of reactor design, performance and steady state safety analyses for conversion of the RHF from the use of HEU fuel to the use of UMo LEU fuel. The objective of this work was to show that it is feasible, under a set of manufacturing assumptions, to design a new RHF fuel element that could safely replace the HEU element currently used. The new proposed design has been developed to maximize performance, minimize changes and preserve strong safety margins. Neutronics and thermal-hydraulics models of the RHF have been developed and qualified by benchmarking against experiments and/or against other codes and models. The models developed were then used to evaluate the RHF performance if LEU UMo were to replace the current HEU fuel 'meat' without any geometric change to the fuel plates. Results of these direct replacement analyses have shown a significant degradation of the RHF performance, in terms of both neutron flux and cycle length

  20. Response in electrostatic analyzers due to backscattered electrons: case study analysis with the Juno Jovian Auroral Distribution Experiment-Electron instrument.

    PubMed

    Clark, G; Allegrini, F; Randol, B M; McComas, D J; Louarn, P

    2013-10-01

    In this study, we introduce a model to characterize electron scattering in an electrostatic analyzer. We show that electrons between 0.5 and 30 keV scatter from internal surfaces to produce a response up to ~20% of the ideal, unscattered response. We compare our model results to laboratory data from the Jovian Auroral Distribution Experiment-Electron sensor onboard the NASA Juno mission. Our model reproduces the measured energy-angle response of the instrument well. Understanding and quantifying this scattering process is beneficial to the analysis of scientific data as well as future instrument optimization.

  1. Response in electrostatic analyzers due to backscattered electrons: case study analysis with the Juno Jovian Auroral Distribution Experiment-Electron instrument.

    PubMed

    Clark, G; Allegrini, F; Randol, B M; McComas, D J; Louarn, P

    2013-10-01

    In this study, we introduce a model to characterize electron scattering in an electrostatic analyzer. We show that electrons between 0.5 and 30 keV scatter from internal surfaces to produce a response up to ~20% of the ideal, unscattered response. We compare our model results to laboratory data from the Jovian Auroral Distribution Experiment-Electron sensor onboard the NASA Juno mission. Our model reproduces the measured energy-angle response of the instrument well. Understanding and quantifying this scattering process is beneficial to the analysis of scientific data as well as future instrument optimization. PMID:24182165

  2. Evaluation of misalignments within a concentrator photovoltaic module by the module optical analyzer: A case of study concerning temperature effects on the module performance

    NASA Astrophysics Data System (ADS)

    Herrero, Rebeca; Askins, Stephen; Antón, Ignacio; Sala, Gabriel

    2015-08-01

    The Instituto de Energía Solar, Universidad Politécnica de Madrid (IES-UPM) has developed a method [referred to as the luminescence inverse (LI) method] and equipment [called the module optical analyzer (MOA)] to rapidly measure the optical-angular properties of a CPV module without an illumination system or module movement. This paper presents how the MOA can investigate the optical-angular performance of concentrator photovoltaic (CPV) modules (in particular, misalignments between the optical components comprising the module) under different temperature conditions.

  3. Lattice-level observation of the elastic-to-plastic relaxation process with subnanosecond resolution in shock-compressed Ta using time-resolved in situ Laue diffraction

    DOE PAGES

    Wehrenberg, C. E.; Comley, A. J.; Barton, N. R.; Coppari, F.; Fratanduono, D.; Huntington, C. M.; Maddox, B. R.; Park, H. -S.; Plechaty, C.; Prisbrey, S. T.; et al

    2015-09-29

    We report direct lattice level measurements of plastic relaxation kinetics through time-resolved, in-situ Laue diffraction of shock-compressed single-crystal [001] Ta at pressures of 27-210 GPa. For a 50 GPa shock, a range of shear strains is observed extending up to the uniaxial limit for early data points (<0.6 ns) and the average shear strain relaxes to a near steady state over ~1 ns. For 80 and 125 GPa shocks, the measured shear strains are fully relaxed already at 200 ps, consistent with rapid relaxation associated with the predicted threshold for homogeneous nucleation of dislocations occurring at shock pressure ~65 GPa. The relaxation rate and shear stresses are used to estimate the dislocation density and these quantities are compared to the Livermore Multiscale Strength model as well as various molecular dynamics simulations.

  4. Time-resolved structures of macromolecules at the ESRF: Single-pulse Laue diffraction, stroboscopic data collection and femtosecond flash photolysis

    NASA Astrophysics Data System (ADS)

    Wulff, Michael; Schotte, Friedrich; Naylor, Graham; Bourgeois, Dominique; Moffat, Keith; Mourou, Gerard

    1997-10-01

    We review the time structure of synchrotron radiation and its use for fast time-resolved diffraction experiments in macromolecular photocycles using flash photolysis to initiate the reaction. The source parameters and optics for ID09 at ESRF are presented together with the phase-locked chopper and femtosecond laser. The chopper can set up a 900 Hz pulse train of 100 ps pulses from the hybrid bunch-mode and, in conjunction with a femtosecond laser, it can be used for stroboscopic data collection with both monochromatic and polychromatic beams. Single-pulse Laue data from cutinase, a 22 kD lipolytic enzyme, are presented which show that the quality of single-pulse Laue patterns is sufficient to refine the excited state(s) in a reaction pathway from a known ground state. The flash photolysis technique is discussed and an example is given for heme proteins. The radiation damage from a laser pulse in the femto and picosecond range can be reduced by triggering at a wavelength where the interaction is strong. We propose the use of microcrystals in the range 25-50 μm for efficient photolysis with femto and picosecond pulses. The performance of circular storage rings is compared with the predicted performance of an X-ray free electron laser (XFEL). The combination of micro beams, a gain of 10^5 photons per pulse and an ultrashort pulse length of 100 fs is likely to improve pulsed diffraction data very substantially. It may be used to image coherent nuclear motion at atomic resolution in ultrafast uni-molecular reactions.

  5. Methodology for analyzing stress states during in-situ thermomechanical cycling in individual lead free solder joints using synchrotron radiation

    SciTech Connect

    Zhou, Bite; Bieler, Thomas R.; Lee, Tae-Kyu; Liu, Kuo-Chuan

    2010-07-22

    To examine how a lead-free solder joint deforms in a thermal cycling environment, both the elastic and plastic stress and strain behavior must be understood. Methods to identify evolution of the internal strain (stress) state during thermal cycling are described. A slice of a package containing a single row of solder joints was thermally cycled from 0 C to 100 C with a period of about 1 h with concurrent acquisition of transmission Laue patterns using synchrotron radiation. These results indicated that most joints are single crystals, with some being multicrystals with no more than a few Sn grain orientations. Laue patterns were analyzed to estimate local strains in different crystal directions at different temperatures during a thermal cycle. While the strains perpendicular to various crystal planes all vary in a similar way, the magnitude of strain varies. The specimens were subsequently given several hundred additional thermal cycles and measured again to assess changes in the crystal orientations. These results show that modest changes in crystal orientations occur during thermal cycling.
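
    For reference, the elastic lattice strain inferred in a given crystal direction is typically the relative change in interplanar spacing, ε = (d − d₀)/d₀; the sketch below applies this to hypothetical spacings measured at two temperatures.

    ```python
    # Elastic lattice strain from a measured interplanar spacing relative to a
    # stress-free reference: eps = (d - d0) / d0. Numbers below are hypothetical.

    def lattice_strain(d_measured, d_reference):
        return (d_measured - d_reference) / d_reference

    d0 = 2.9150                                   # hypothetical stress-free spacing, Angstrom
    for T, d in [(0, 2.9159), (100, 2.9141)]:     # hypothetical readings at 0 C and 100 C
        print(f"{T:3d} C: strain = {lattice_strain(d, d0):+.2e}")
    ```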

  6. To adjust or not to adjust for baseline when analyzing repeated binary responses? The case of complete data when treatment comparison at study end is of interest.

    PubMed

    Jiang, Honghua; Kulkarni, Pandurang M; Mallinckrodt, Craig H; Shurzinske, Linda; Molenberghs, Geert; Lipkovich, Ilya

    2015-01-01

    The benefits of adjusting for baseline covariates are not as straightforward with repeated binary responses as with continuous response variables. Therefore, in this study, we compared different methods for analyzing repeated binary data through simulations when the outcome at the study endpoint is of interest. Methods compared included chi-square, Fisher's exact test, covariate adjusted/unadjusted logistic regression (Adj.logit/Unadj.logit), covariate adjusted/unadjusted generalized estimating equations (Adj.GEE/Unadj.GEE), covariate adjusted/unadjusted generalized linear mixed model (Adj.GLMM/Unadj.GLMM). All these methods preserved the type I error close to the nominal level. Covariate adjusted methods improved power compared with the unadjusted methods because of the increased treatment effect estimates, especially when the correlation between the baseline and outcome was strong, even though there was an apparent increase in standard errors. Results of the Chi-squared test were identical to those for the unadjusted logistic regression. Fisher's exact test was the most conservative test regarding the type I error rate and also had the lowest power. Without missing data, there was no gain in using a repeated measures approach over a simple logistic regression at the final time point. Analysis of results from five phase III diabetes trials of the same compound was consistent with the simulation findings. Therefore, covariate adjusted analysis is recommended for repeated binary data when the study endpoint is of interest. PMID:25866149
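
    A minimal sketch of the endpoint comparison on simulated data, assuming NumPy and statsmodels are available; it contrasts a logistic regression of the endpoint with and without the baseline covariate, in the spirit of the Adj.logit/Unadj.logit methods named above.

    ```python
    # Minimal sketch (assumes numpy and statsmodels): simulate a binary endpoint
    # correlated with a baseline covariate, then compare the treatment-effect
    # estimate with and without baseline adjustment. Effect sizes are hypothetical.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 400
    treatment = rng.integers(0, 2, n)                # 0 = control, 1 = active
    baseline = rng.normal(size=n)                    # baseline severity score
    logit = -0.5 + 0.8 * treatment + 1.2 * baseline  # true model (hypothetical)
    response = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    unadjusted = sm.Logit(response, sm.add_constant(treatment)).fit(disp=False)
    adjusted = sm.Logit(response, sm.add_constant(
        np.column_stack([treatment, baseline]))).fit(disp=False)

    print("Unadjusted treatment log-odds:", round(unadjusted.params[1], 3))
    print("Adjusted   treatment log-odds:", round(adjusted.params[1], 3))
    ```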

  7. Analyzing Two-Phase Single-Case Data with Non-overlap and Mean Difference Indices: Illustration, Software Tools, and Alternatives.

    PubMed

    Manolov, Rumen; Losada, José L; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2016-01-01

    Two-phase single-case designs, including baseline evaluation followed by an intervention, represent the most clinically straightforward option for combining professional practice and research. However, unless they are part of a multiple-baseline schedule, such designs do not allow demonstrating a causal relation between the intervention and the behavior. Although the statistical options reviewed here cannot help overcoming this methodological limitation, we aim to make practitioners and applied researchers aware of the available appropriate options for extracting maximum information from the data. In the current paper, we suggest that the evaluation of behavioral change should include visual and quantitative analyses, complementing the substantive criteria regarding the practical importance of the behavioral change. Specifically, we emphasize the need to use structured criteria for visual analysis, such as the ones summarized in the What Works Clearinghouse Standards, especially if such criteria are complemented by visual aids, as illustrated here. For quantitative analysis, we focus on the non-overlap of all pairs and the slope and level change procedure, as they offer straightforward information and have shown reasonable performance. An illustration is provided of the use of these three pieces of information: visual, quantitative, and substantive. To make the use of visual and quantitative analysis feasible, open source software is referred to and demonstrated. In order to provide practitioners and applied researchers with a more complete guide, several analytical alternatives are commented on pointing out the situations (aims, data patterns) for which these are potentially useful. PMID:26834691
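
    For concreteness, a small self-contained implementation of the non-overlap of all pairs (NAP) index mentioned above, assuming improvement corresponds to a decrease in the measured behavior; the phase data are hypothetical.

    ```python
    # Non-overlap of All Pairs (NAP): compare every (baseline, intervention) pair;
    # pairs showing improvement count 1, ties count 0.5. Here improvement is defined
    # as a decrease (e.g. problem-behavior counts). Data below are hypothetical.

    def nap(baseline, intervention):
        pairs = [(a, b) for a in baseline for b in intervention]
        score = sum(1.0 if b < a else 0.5 if b == a else 0.0 for a, b in pairs)
        return score / len(pairs)

    baseline_phase = [4, 5, 4, 6, 5]              # hypothetical counts per session
    intervention_phase = [3, 5, 2, 3, 1, 2]
    print(f"NAP = {nap(baseline_phase, intervention_phase):.2f}")   # 0.90 here
    ```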

  8. Feedbacks from Green House Gas Emissions on Roads: A General Methodology for Analyzing Global Warming on Linear Infrastructure with a Case Study in the Northeastern U.S

    NASA Astrophysics Data System (ADS)

    Jacobs, J. M.; Meagher, W.; Daniel, J.; Linder, E.

    2011-12-01

    The Intergovernmental Panel on Climate Change attributes the observed pattern of change to the influence of anthropogenic forcing, stating that it is extremely unlikely that the global pattern of warming can be explained without external forcing, and that it is very likely the greenhouse gases caused the warming globally over the last 50 years. Consequently, much effort has been focused on understanding the contribution of road transportation to the emissions of greenhouse gases. Strikingly little research has been conducted to understand the implications of climate change for the performance and design of road networks. When using water and energy balance approaches, climate is an integral part of modeling pavement deterioration processes including rutting, thermal cracking, frost heave, and thaw weakening. The potential of climate change raises the possibility that the frequency, duration, and severity of these deterioration processes may increase. This research explores the value of NARCCAP climate data sets in transportation infrastructure models. Here, we present a general methodology to demonstrate how built infrastructure might benefit from an effort to use various RCM climate scenarios and pavement designs to quantify the climate change impact on pavement performance using a case study approach. We present challenges and results in using the Regional Climate Model datasets as inputs, through intermediary hydrologic functions, into the Federal Department of Transportation's Mechanistic-Empirical Pavement Design Guide Model.

  9. Analyzing Two-Phase Single-Case Data with Non-overlap and Mean Difference Indices: Illustration, Software Tools, and Alternatives

    PubMed Central

    Manolov, Rumen; Losada, José L.; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2016-01-01

    Two-phase single-case designs, including baseline evaluation followed by an intervention, represent the most clinically straightforward option for combining professional practice and research. However, unless they are part of a multiple-baseline schedule, such designs do not allow demonstrating a causal relation between the intervention and the behavior. Although the statistical options reviewed here cannot help overcoming this methodological limitation, we aim to make practitioners and applied researchers aware of the available appropriate options for extracting maximum information from the data. In the current paper, we suggest that the evaluation of behavioral change should include visual and quantitative analyses, complementing the substantive criteria regarding the practical importance of the behavioral change. Specifically, we emphasize the need to use structured criteria for visual analysis, such as the ones summarized in the What Works Clearinghouse Standards, especially if such criteria are complemented by visual aids, as illustrated here. For quantitative analysis, we focus on the non-overlap of all pairs and the slope and level change procedure, as they offer straightforward information and have shown reasonable performance. An illustration is provided of the use of these three pieces of information: visual, quantitative, and substantive. To make the use of visual and quantitative analysis feasible, open source software is referred to and demonstrated. In order to provide practitioners and applied researchers with a more complete guide, several analytical alternatives are commented on pointing out the situations (aims, data patterns) for which these are potentially useful. PMID:26834691

  10. Analyzing the enrichment regularity of hydrocardon reservoirs in sequence stratigraphic framework of Tertiary: a case study on the Qikou sag in Huanghua depression, Bohai Bay Basin, Eastern China

    NASA Astrophysics Data System (ADS)

    Chuanyan, Huang; Hua, Wang; Yongpin, Wu; Shi, Chen; Jiahao, Wang; Peigang, Ren

    2010-05-01

    Hydrocarbon enrichment is related to definite geologic conditions. The enrichment of reservoirs shows a certain regularity within the basin sequence stratigraphic framework, generally associated with the unconformity interface or tectonic transformation surface, and shows different enrichment regularity at different basin margins. However, the enrichment regularity of different sags presents some differences. From the viewpoint of sequence stratigraphy, this paper analyzes the characteristics of hydrocarbon reservoir enrichment in the sequence stratigraphic framework of the Qikou sag in the Huanghua depression, Bohai Bay Basin, Eastern China, based on 1340 wells. The research shows that reservoirs in the Qikou sag are enriched near the second order sequence interface or tectonic transformation surface; the closer to the second order sequence interface, the more enriched the hydrocarbon. Within a sequence, hydrocarbon reservoirs are mainly enriched in the lacustrine expanding system tract (EST) and the low stand system tract (LST), although each sequence presents some differences. Above the second order sequence interface, hydrocarbon reservoirs are mainly enriched in the lacustrine expanding system tract and the low stand system tract below the maximum flooding surface. Below the second order sequence interface, hydrocarbon reservoirs are mainly enriched in the high stand system tract (HST). In plan view, hydrocarbon reservoirs mainly congregate near the steep slope zone controlled by sag marginal faults. The flexure slope break is the least enriched zone, but the exploration potential of lithologic reservoirs there is large. Accordingly, the authors propose that second order sequence interface (or tectonic transformation surface) + corresponding system tract in the third order sequence + correlatable sag slope break types = favorable exploration zone; it is the coupling of these three elements that controls the favorable exploration zone. The authors put forward some suggestions for the next exploration of

  11. Analyzing spatial clustering and the spatiotemporal nature and trends of HIV/AIDS prevalence using GIS: the case of Malawi, 1994-2010

    PubMed Central

    2014-01-01

    Background Although local spatiotemporal analysis can improve understanding of geographic variation of the HIV epidemic, its drivers, and the search for targeted interventions, it is limited in sub-Saharan Africa. Despite recent declines, Malawi’s estimated 10.0% HIV prevalence (2011) remained among the highest globally. Using data on pregnant women in Malawi, this study 1) examines spatiotemporal trends in HIV prevalence 1994-2010, and 2) for 2010, identifies and maps the spatial variation/clustering of factors associated with HIV prevalence at district level. Methods Inverse distance weighting was used within ArcGIS Geographic Information Systems (GIS) software to generate continuous surfaces of HIV prevalence from point data (1994, 1996, 1999, 2001, 2003, 2005, 2007, and 2010) obtained from surveillance antenatal clinics. From the surfaces prevalence estimates were extracted at district level and the results mapped nationally. Spatial dependency (autocorrelation) and clustering of HIV prevalence were also analyzed. Correlation and multiple regression analyses were used to identify factors associated with HIV prevalence for 2010 and their spatial variation/clustering mapped and compared to HIV clustering. Results Analysis revealed wide spatial variation in HIV prevalence at regional, urban/rural, district and sub-district levels. However, prevalence was spatially leveling out within and across ‘sub-epidemics’ while declining significantly after 1999. Prevalence exhibited statistically significant spatial dependence nationally following initial (1995-1999) localized, patchy low/high patterns as the epidemic spread rapidly. Locally, HIV “hotspots” clustered among eleven southern districts/cities while a “coldspot” captured configurations of six central region districts. Preliminary multiple regression of 2010 HIV prevalence produced a model with four significant explanatory factors (adjusted R2 = 0.688): mean distance to main roads, mean travel time
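
    A bare-bones sketch of inverse distance weighting as used to build such prevalence surfaces; the clinic coordinates and prevalence values are hypothetical, and real work would use a GIS package with proper geographic distances.

    ```python
    # Minimal inverse-distance-weighting (IDW) interpolation: the estimate at an
    # unsampled location is a weighted average of clinic values, with weights
    # 1/d**p. Clinic coordinates and prevalence values are hypothetical.
    import numpy as np

    def idw(query_xy, sample_xy, sample_values, power=2.0):
        d = np.linalg.norm(sample_xy - query_xy, axis=1)
        if np.any(d == 0):                       # query coincides with a sample point
            return float(sample_values[np.argmin(d)])
        w = 1.0 / d**power
        return float(np.sum(w * sample_values) / np.sum(w))

    clinics = np.array([[35.0, -15.8], [34.5, -14.9], [35.3, -15.2]])  # lon, lat (hypothetical)
    prevalence = np.array([12.0, 8.5, 15.0])                           # percent (hypothetical)
    print(f"Interpolated prevalence: {idw(np.array([35.1, -15.4]), clinics, prevalence):.1f}%")
    ```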

  12. Blood Gas Analyzers.

    PubMed

    Gonzalez, Anthony L; Waddell, Lori S

    2016-03-01

    Acid-base and respiratory disturbances are common in sick and hospitalized veterinary patients; therefore, blood gas analyzers have become integral diagnostic and monitoring tools. This article will discuss uses of blood gas analyzers, types of samples that can be used, sample collection methods, potential sources of error, and potential alternatives to blood gas analyzers and their limitations. It will also discuss the types of analyzers that are available, logistical considerations that should be taken into account when purchasing an analyzer, and the basic principles of how these analyzers work. PMID:27451046

  13. Wideband digital spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Morris, G. A., Jr.; Wilck, H. C.

    1979-01-01

    A modular spectrum analyzer consisting of an RF receiver, a fast Fourier transform spectrum analyzer, and a data processor samples stochastic signals in 220 channels. The modular construction reduces the design and fabrication costs of the assembled unit.

  14. Image quality analyzer

    NASA Astrophysics Data System (ADS)

    Lukin, V. P.; Botugina, N. N.; Emaleev, O. N.; Antoshkin, L. V.; Konyaev, P. A.

    2012-07-01

    An image quality analyzer (IQA), used as a device for analyzing the efficiency of adaptive optics applications, is described. The analyzer provides estimates of image quality according to three different criteria: contrast, sharpness, and a spectral criterion. At present the analyzer is in routine operation at the Big Solar Vacuum Telescope, where during observations it allows the most contrasting images of the Sun to be selected. It is further planned to use the analyzer as part of the ANGARA adaptive correction system.
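
    The abstract does not specify the exact metrics, so the sketch below uses generic stand-ins: an RMS contrast and a gradient-energy sharpness score evaluated on a synthetic test image.

    ```python
    # Illustrative image-quality metrics only; the IQA's actual definitions are not
    # given in the abstract. RMS contrast and mean gradient energy (a sharpness proxy)
    # are computed for a synthetic test image.
    import numpy as np

    def rms_contrast(img):
        return float(img.std() / img.mean())

    def gradient_sharpness(img):
        gy, gx = np.gradient(img.astype(float))
        return float(np.mean(gx**2 + gy**2))

    # Synthetic "granulation-like" test image
    y, x = np.mgrid[0:128, 0:128]
    img = 100 + 20 * np.sin(x / 5.0) * np.cos(y / 7.0)

    print(f"RMS contrast: {rms_contrast(img):.3f}")
    print(f"Gradient sharpness: {gradient_sharpness(img):.3f}")
    ```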

  15. Analyzing Software Piracy in Education.

    ERIC Educational Resources Information Center

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  16. Crew Activity Analyzer

    NASA Technical Reports Server (NTRS)

    Murray, James; Kirillov, Alexander

    2008-01-01

    The crew activity analyzer (CAA) is a system of electronic hardware and software for automatically identifying patterns of group activity among crew members working together in an office, cockpit, workshop, laboratory, or other enclosed space. The CAA synchronously records multiple streams of data from digital video cameras, wireless microphones, and position sensors, then plays back and processes the data to identify activity patterns specified by human analysts. The processing greatly reduces the amount of time that the analysts must spend in examining large amounts of data, enabling the analysts to concentrate on subsets of data that represent activities of interest. The CAA has potential for use in a variety of governmental and commercial applications, including planning for crews for future long space flights, designing facilities wherein humans must work in proximity for long times, improving crew training and measuring crew performance in military settings, human-factors and safety assessment, development of team procedures, and behavioral and ethnographic research. The data-acquisition hardware of the CAA (see figure) includes two video cameras: an overhead one aimed upward at a paraboloidal mirror on the ceiling and one mounted on a wall aimed in a downward slant toward the crew area. As many as four wireless microphones can be worn by crew members. The audio signals received from the microphones are digitized, then compressed in preparation for storage. Approximate locations of as many as four crew members are measured by use of a Cricket indoor location system. [The Cricket indoor location system includes ultrasonic/radio beacon and listener units. A Cricket beacon (in this case, worn by a crew member) simultaneously transmits a pulse of ultrasound and a radio signal that contains identifying information. Each Cricket listener unit measures the difference between the times of reception of the ultrasound and radio signals from an identified beacon

  17. Simplified Digital Spectrum Analyzer

    NASA Technical Reports Server (NTRS)

    Cole, Steven W.

    1992-01-01

    Spectrum analyzer computes approximate cross-correlations between a noisy input signal and a reference signal of known frequency, yielding a measure of the amplitude of the sinusoidal component of the input. Complexity and power consumption are lower than in other digital spectrum analyzers. It performs no multiplications, and because it processes data on each frequency independently, it can focus on a narrow spectral range without processing data on the rest of the spectrum.
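
    One common way for a digital analyzer to avoid multiplications is to correlate the input against +/-1 square-wave references at the frequency of interest, so that every "product" term reduces to an addition or subtraction. The Python sketch below illustrates that general approach with assumed parameters; it is a hedged illustration only, not the NASA implementation described above, whose internal details the abstract does not give.

      import numpy as np

      def square_wave_correlation(x, f0, fs):
          """Approximate amplitude of the sinusoidal component of x at f0.

          Correlates against +/-1 square waves so each term is only an
          addition or subtraction (no multiplications by the reference).
          Harmonics of the square wave introduce a small bias for
          broadband inputs -- this is a sketch, not a calibrated tool.
          """
          n = np.arange(len(x))
          phase = 2.0 * np.pi * f0 * n / fs
          ref_i = np.sin(phase) >= 0.0          # in-phase +/-1 reference (as a mask)
          ref_q = np.cos(phase) >= 0.0          # quadrature +/-1 reference
          ci = np.sum(np.where(ref_i, x, -x)) / len(x)   # add/subtract only
          cq = np.sum(np.where(ref_q, x, -x)) / len(x)
          # A sinusoid of amplitude A correlated with a +/-1 square wave of
          # the same frequency averages to (2/pi)*A, hence the rescaling.
          return np.hypot(ci, cq) * np.pi / 2.0

      # Example: a 0.5 V, 50 Hz sinusoid buried in noise, sampled at 1 kHz.
      fs, f0 = 1000.0, 50.0
      t = np.arange(2000) / fs
      x = 0.5 * np.sin(2 * np.pi * f0 * t + 0.3) + np.random.normal(0.0, 0.2, t.size)
      print(square_wave_correlation(x, f0, fs))   # close to 0.5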

  18. Analyzing Peace Pedagogies

    ERIC Educational Resources Information Center

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  19. Portable automatic blood analyzer

    NASA Technical Reports Server (NTRS)

    Coleman, R. L.

    1975-01-01

    Analyzer employs chemical-sensing electrodes for determination of blood, gas, and ion concentrations. It is rugged, easily serviced, and comparatively simple to operate. System can analyze up to eight parameters and can be modified to measure other blood constituents including nonionic species, such as urea, glucose, and oxygen.

  20. Analyzing Costs of Services.

    ERIC Educational Resources Information Center

    Cox, James O.; Black, Talbot

    A simplified method to gather and analyze cost data is presented for administrators of Handicapped Children's Early Education Programs, and specifically for members of the Technical Assistance Development System, North Carolina. After identifying benefits and liabilities associated with analyzing program costs, attention is focused on the internal…

  1. X-ray Laue micro diffraction and neutron diffraction analysis of residual elastic strains and plastic deformation in a 1% uniaxial tensile tested nickel alloy 600 sample

    SciTech Connect

    Chao, Jing; Mark, Alison; Fuller, Marina; Barabash, Rozaliya; McIntyre, Stewart; Holt, Richard A.; Klassen, Robert; Liu, W.

    2009-01-01

    The magnitude and distribution of elastic strain for a nickel alloy 600 (A600) sample that had been subjected to uniaxial tensile stress were measured by micro Laue diffraction (MLD) and neutron diffraction techniques. For a sample that had been dimensionally strained by 1%, both MLD and neutron diffraction data indicated that the global residual elastic strain was on the order of 10{sup -4}; however, the micro-diffraction data indicated considerable grain-to-grain variability amongst individual components of the residual strain tensor. A more precise comparison was made by finding those grains in the MLD map that were appropriately oriented in the specific directions matching those used in the neutron measurements, and the strains were found to agree within the uncertainty. Large variations in strain values across the grains were noted during the MLD measurements, which are reflected in the uncertainties. This is a possible explanation for the large uncertainty in the average strains measured from multiple grains during neutron diffraction.

  2. Real-time tracking of CO migration and binding in the α and β subunits of human hemoglobin via 150-ps time-resolved Laue crystallography

    PubMed Central

    Schotte, Friedrich; Cho, Hyun Sun; Soman, Jayashree; Wulff, Michael; Olson, John S.; Anfinrud, Philip A.

    2014-01-01

    We have developed the method of picosecond Laue crystallography and used this capability to probe ligand dynamics in tetrameric R-state hemoglobin (Hb). Time-resolved, 2 Å-resolution electron density maps of photolyzed HbCO reveal the time-dependent population of CO in the binding (A) and primary docking (B) sites of both α and β subunits from 100 ps to 10 μs. The B site in the β subunit is about 0.25 Å closer to its A binding site, and its kBA rebinding rate (~300 μs⁻¹) is six times faster, suggesting distal control of the rebinding dynamics. Geminate rebinding in the β subunit exhibits both prompt and delayed geminate phases. We developed a microscopic model to quantitatively explain the observed kinetics, with three states for the α subunit and four states for the β subunit. This model provides a consistent framework for interpreting rebinding kinetics reported in prior studies of both HbCO and HbO2. PMID:24839343

  3. A new macromolecular crystallography Station (9. 5) on the SRS wiggler beam line for very rapid Laue and rapidly tunable monochromatic measurements: Commissioning and first results

    SciTech Connect

    Thompson, A.W.; Habash, J.; Harrop, S.; Helliwell, J.R.; Nave, C.; Atkinson, P.; Hasnain, S.S.; Glover, I.D.; Moore, P.R.; Harris, N.; Kinder, S.; Buffey, S.

    1992-01-01

    A new instrument (Station 9.5) has been established on the wiggler line at the Daresbury Synchrotron Radiation Source (SRS). It extends the experimental capability at Daresbury for macromolecular crystallography beyond what is provided by Stations 7.2 (Ref. 1), 9.6 (Ref. 2), and 9.7 by providing a point-focused white beam (from a Pt-coated toroid mirror) and/or a rapidly tunable monochromatic beam (using a water-cooled double-crystal monochromator) (Ref. 3). The design principles of the new Station 9.5 have been published (Ref. 4). A CCD detector for the station is being developed (preliminary work is described in Ref. 5, or see the additional poster at this meeting) to allow time slices of part of a diffraction pattern to be measured. Laue patterns are currently recorded on film, but access to an image plate detector will shortly become available. Shutter speeds down to 50 {mu}s are routinely available using a rotating disk shutter (Ref. 6). Fluorescence detectors are available for optimized anomalous dispersion data collection. The experimental bench is long enough to accommodate a camera system and, downstream from it, an on-line image plate scanner. Data collected on the instrument in various modes of operation will be described for a variety of macromolecular and small-molecule crystal systems.

  4. Software Design Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows for expression of a design as a picture of the program.

  5. Automatic amino acid analyzer

    NASA Technical Reports Server (NTRS)

    Berdahl, B. J.; Carle, G. C.; Oyama, V. I.

    1971-01-01

    Analyzer operates unattended for up to 15 hours. It has an automatic sample injection system and can be programmed. All fluid-flow valve switching is accomplished pneumatically from miniature three-way solenoid pilot valves.

  6. Analyzing binding data.

    PubMed

    Motulsky, Harvey J; Neubig, Richard R

    2010-07-01

    Measuring the rate and extent of radioligand binding provides information on the number of binding sites and on the affinity and accessibility of these binding sites for various drugs. This unit explains how to design and analyze such experiments.

  7. Soil Rock Analyzer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A redesigned version of a soil/rock analyzer developed by Martin Marietta under a Langley Research Center contract is being marketed by Aurora Tech, Inc. Known as the Aurora ATX-100, it has self-contained power, an oscilloscope, a liquid crystal readout, and a multichannel spectrum analyzer. It measures energy emissions to determine what elements in what percentages a sample contains. It is lightweight and may be used for mineral exploration, pollution monitoring, etc.

  8. Total organic carbon analyzer

    NASA Technical Reports Server (NTRS)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    1991-01-01

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  9. Electrosurgical unit analyzers.

    PubMed

    1998-07-01

    Electrosurgical unit (ESU) analyzers automate the testing and inspection of the output circuits and safety features of ESUs. They perform testing that would otherwise require several other pieces of equipment, as well as considerably more time and greater technician expertise. They are used largely by clinical engineering departments for routine inspection and preventive maintenance (IPM) procedures and, less often, for accident investigations and troubleshooting. In this Evaluation, we tested three ESU analyzers from three suppliers. We rated all three analyzers Acceptable and ranked them in two groupings. In ranking the units, we placed the greatest weight on ease of use for routine ESU inspections, and gave additional consideration to versatility for advanced applications such as ESU research. The unit in Group 1 was the easiest to use, especially for infrequent users. The units in Group 2 were satisfactory but require more frequent use to maintain proficiency and to avoid user errors. PMID:9689540

  10. Analyzing radioligand binding data.

    PubMed

    Motulsky, Harvey; Neubig, Richard

    2002-08-01

    Radioligand binding experiments are easy to perform, and provide useful data in many fields. They can be used to study receptor regulation, discover new drugs by screening for compounds that compete with high affinity for radioligand binding to a particular receptor, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling, via measurements of agonist binding and its regulation by ions, nucleotides, and other allosteric modulators. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.

  11. Analyzing Bilingual Education Costs.

    ERIC Educational Resources Information Center

    Bernal, Joe J.

    This paper examines the particular problems involved in analyzing the costs of bilingual education and suggests that cost analysis of bilingual education requires a fundamentally different approach than that followed in other recent school finance studies. Focus of the discussion is the Intercultural Development Research Association's (IDRA)…

  12. List mode multichannel analyzer

    SciTech Connect

    Archer, Daniel E.; Luke, S. John; Mauger, G. Joseph; Riot, Vincent J.; Knapp, David A.

    2007-08-07

    A digital list mode multichannel analyzer (MCA) built around a programmable FPGA device for onboard data analysis and on-the-fly modification of system detection/operating parameters, capable of collecting and processing data in very small time bins (<1 millisecond) when used in histogramming mode, or of recording individual events when used as a list mode MCA.
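
    The difference between the two modes mentioned above can be made concrete in a few lines: list mode stores each event individually (time stamp plus pulse-height channel), and a histogram for any chosen time slice can then be rebuilt offline. The Python sketch below is a schematic illustration of that idea with made-up events; it is not the patented FPGA logic.

      import numpy as np

      # Illustrative list-mode events: (timestamp in seconds, ADC channel 0-1023).
      events = np.array([(1.0e-4, 310), (2.5e-4, 312), (9.0e-4, 640),
                         (1.2e-3, 309), (1.3e-3, 641)],
                        dtype=[("t", "f8"), ("ch", "i4")])

      def histogram_time_slice(events, t_start, t_stop, n_channels=1024):
          """Rebuild a pulse-height histogram for one time bin from list-mode data."""
          sel = events["ch"][(events["t"] >= t_start) & (events["t"] < t_stop)]
          return np.bincount(sel, minlength=n_channels)

      # Spectrum of everything recorded in the first millisecond.
      spectrum = histogram_time_slice(events, 0.0, 1.0e-3)
      print(spectrum[309:313], spectrum[640:642])   # [0 1 0 1] [1 0]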

  13. Analyzing Workforce Education. Monograph.

    ERIC Educational Resources Information Center

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  14. Electronic sleep analyzer

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.

    1970-01-01

    Electronic instrument automatically monitors the stages of sleep of a human subject. The analyzer provides a series of discrete voltage steps with each step corresponding to a clinical assessment of level of consciousness. It is based on the operation of an EEG and requires very little telemetry bandwidth or time.

  15. Micro acoustic spectrum analyzer

    DOEpatents

    Schubert, W. Kent; Butler, Michael A.; Adkins, Douglas R.; Anderson, Larry F.

    2004-11-23

    A micro acoustic spectrum analyzer for determining the frequency components of a fluctuating sound signal comprises a microphone to pick up the fluctuating sound signal and produce an alternating current electrical signal; at least one microfabricated resonator, each resonator having a different resonant frequency, that vibrates in response to the alternating current electrical signal; and at least one detector to detect the vibration of the microfabricated resonators. The micro acoustic spectrum analyzer can further comprise a mixer to mix a reference signal with the alternating current electrical signal from the microphone to shift the frequency spectrum to a frequency range that is better matched to the resonant frequencies of the microfabricated resonators. The micro acoustic spectrum analyzer can be designed specifically for portability, size, cost, accuracy, speed, power requirements, and use in a harsh environment. The micro acoustic spectrum analyzer is particularly suited for applications where size, accessibility, and power requirements are limited, such as the monitoring of industrial equipment and processes, detection of security intrusions, or evaluation of military threats.
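
    The frequency-shifting role of the mixer stage can be seen in a short numerical sketch: multiplying the microphone signal by a reference tone produces sum and difference frequencies, and the difference term can be placed in the band covered by the resonators. The Python snippet below is a generic illustration of heterodyning with assumed frequencies, not a model of the patented device.

      import numpy as np

      fs = 48_000.0                                 # sample rate, Hz (assumed)
      t = np.arange(4096) / fs
      sound = np.sin(2 * np.pi * 9_000.0 * t)       # 9 kHz acoustic component
      reference = np.sin(2 * np.pi * 7_500.0 * t)   # local reference tone

      # The product contains the 1.5 kHz difference and 16.5 kHz sum terms.
      mixed = sound * reference

      spectrum = np.abs(np.fft.rfft(mixed))
      freqs = np.fft.rfftfreq(mixed.size, d=1.0 / fs)
      print(np.sort(freqs[np.argsort(spectrum)[-2:]]))   # [ 1500. 16500.]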

  16. PULSE AMPLITUDE ANALYZER

    DOEpatents

    Greenblatt, M.H.

    1958-03-25

    This patent pertains to pulse amplitude analyzers for sorting and counting a series of pulses, and specifically discloses an analyzer which is simple in construction and presents the pulse height distribution visually on an oscilloscope screen. According to the invention, the pulses are applied to the vertical deflection plates of an oscilloscope and trigger the horizontal sweep. Each pulse starts at the same point on the screen and has a maximum amplitude substantially along the same vertical line. A mask is placed over the screen except for a slot running along the line where the maximum amplitudes of the pulses appear. After the slot has been scanned by a photocell in combination with a slotted rotating disk, the photocell signal is displayed on an auxiliary oscilloscope as vertical deflection along a horizontal time base to portray the pulse amplitude distribution.

  17. Analyzing radioligand binding data.

    PubMed

    Motulsky, H; Neubig, R

    2001-05-01

    A radioligand is a radioactively labeled drug that can associate with a receptor, transporter, enzyme, or any protein of interest. Measuring the rate and extent of binding provides information on the number of binding sites, and their affinity and accessibility for various drugs. Radioligand binding experiments are easy to perform, and provide useful data in many fields. For example, radioligand binding studies are used to study receptor regulation, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.

  18. Analyzing Optical Communications Links

    NASA Technical Reports Server (NTRS)

    Marshall, William K.; Burk, Brian D.

    1990-01-01

    Optical Communication Link Analysis Program, OPTI, analyzes optical and near-infrared communication links using pulse-position modulation (PPM) and direct detection. Link margins and design-control tables are generated from input parameters supplied by the user. Enables the user to save sets of input parameters that define a given link and read them back into the program later. Automatically alters any of the input parameters to achieve a desired link margin. Written in FORTRAN 77.

  19. Magnetoresistive emulsion analyzer.

    PubMed

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and nanomedicine including drug design and screening. PMID:23989504

  20. Analyzing Leakage Through Cracks

    NASA Technical Reports Server (NTRS)

    Romine, William D.

    1993-01-01

    Two related computer programs written for use in analyzing leakage through cracks. Leakage flow laminar or turbulent. One program used to determine dimensions of crack under given flow conditions and given measured rate of leakage. Other used to determine rate of leakage of gas through crack of given dimensions under given flow conditions. Programs, written in BASIC language, accelerate and facilitate iterative calculations and parametric analyses. Solve equations of Fanno flow. Enables rapid solution of leakage problem.

  1. Magnetoresistive Emulsion Analyzer

    PubMed Central

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G.

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and nanomedicine including drug design and screening. PMID:23989504

  2. Fractional channel multichannel analyzer

    DOEpatents

    Brackenbush, Larry W.; Anderson, Gordon A.

    1994-01-01

    A multichannel analyzer incorporating the features of the present invention obtains the effect of fractional channels, thus greatly reducing the number of actual channels necessary to record complex line spectra. This is accomplished by using an analog-to-digital converter in the asynchronous mode, i.e., the gate pulse from the pulse height-to-pulse width converter is not synchronized with the signal from a clock oscillator. This saves power and reduces the number of components required on the board to achieve the effect of radically expanding the number of channels without changing the circuit board.

  3. Portable Gas Analyzer

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The Michromonitor M500 universal gas analyzer contains a series of miniature modules, each of which is a complete gas chromatograph, an instrument which separates a gaseous mixture into its components and measures the concentrations of each gas in the mixture. The system is manufactured by Microsensor Technology, and is used for environmental analysis, monitoring for gas leaks and chemical spills, compliance with pollution laws, etc. The technology is based on a Viking attempt to detect life on Mars. Ames/Stanford miniaturized the system and NIOSH funded further development. Three Stanford researchers commercialized the technology, which can be operated by unskilled personnel.

  4. RELAP5 desktop analyzer

    SciTech Connect

    Beelman, R.J.; Grush, W.H.; Mortensen, G.A.; Snider, D.M.; Wagner, K.L.

    1989-01-01

    The previously mainframe bound RELAP5 reactor safety computer code has been installed on a microcomputer. A simple color-graphic display driver has been developed to enable the user to view the code results as the calculation advances. In order to facilitate future interactive desktop applications, the Nuclear Plant Analyzer (NPA), also previously mainframe bound, is being redesigned to encompass workstation applications. The marriage of RELAP5 simulation capabilities with NPA interactive graphics on a desktop workstation promises to revolutionize reactor safety analysis methodology. 8 refs.

  5. Fluorescence analyzer for lignin

    DOEpatents

    Berthold, John W.; Malito, Michael L.; Jeffers, Larry

    1993-01-01

    A method and apparatus for measuring lignin concentration in a sample of wood pulp or black liquor comprises a light emitting arrangement for emitting an excitation light through optical fiber bundles into a probe which has an undiluted sensing end facing the sample. The excitation light causes the lignin in the sample to produce fluorescent emission light, which is then conveyed through the probe to analyzing equipment that measures the intensity of the emission light. This invention was made with Government support under Contract Number DOE: DE-FC05-90CE40905 awarded by the Department of Energy (DOE). The Government has certain rights in this invention.

  6. Analyzing Aeroelasticity in Turbomachines

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Srivastava, R.

    2003-01-01

    ASTROP2-LE is a computer program that predicts flutter and forced responses of blades, vanes, and other components of such turbomachines as fans, compressors, and turbines. ASTROP2-LE is based on the ASTROP2 program, developed previously for analysis of stability of turbomachinery components. In developing ASTROP2-LE, ASTROP2 was modified to include a capability for modeling forced responses. The program was also modified to add a capability for analysis of aeroelasticity with mistuning and unsteady aerodynamic solutions from another program, LINFLX2D, that solves the linearized Euler equations of unsteady two-dimensional flow. Using LINFLX2D to calculate unsteady aerodynamic loads, it is possible to analyze effects of transonic flow on flutter and forced response. ASTROP2-LE can be used to analyze subsonic, transonic, and supersonic aerodynamics and structural mistuning for rotors with blades of differing structural properties. It calculates the aerodynamic damping of a blade system operating in airflow so that stability can be assessed. The code also predicts the magnitudes and frequencies of the unsteady aerodynamic forces on the airfoils of a blade row from incoming wakes. This information can be used in high-cycle fatigue analysis to predict the fatigue lives of the blades.

  7. Ring Image Analyzer

    NASA Technical Reports Server (NTRS)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton Rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software has been developed for the needs of an R&TD-funded project and has become an important asset for the future research proposal to NASA as well as other agencies.
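
    Recognizing an elliptical fringe ultimately comes down to fitting an ellipse to candidate edge points and reading off the centroid, axis ratio, and tilt. The Python sketch below shows one conventional way to do such a fit with OpenCV's fitEllipse; it is a generic illustration of the underlying geometry problem (OpenCV and the synthetic data are assumptions), not the NASA program itself.

      import numpy as np
      import cv2  # OpenCV, assumed available

      def ellipse_parameters(points_xy):
          """Fit an ellipse to (N, 2) points; return centroid, axis ratio, tilt."""
          (cx, cy), (d1, d2), angle_deg = cv2.fitEllipse(points_xy.astype(np.float32))
          short_axis, long_axis = sorted((d1, d2))
          return (cx, cy), short_axis / long_axis, angle_deg

      # Example: noisy samples of an ellipse with 2:1 axes, rotated 30 degrees.
      theta = np.linspace(0.0, 2.0 * np.pi, 200)
      x, y = 100.0 * np.cos(theta), 50.0 * np.sin(theta)
      rot = np.deg2rad(30.0)
      pts = np.column_stack([x * np.cos(rot) - y * np.sin(rot) + 320.0,
                             x * np.sin(rot) + y * np.cos(rot) + 240.0])
      pts += np.random.normal(0.0, 0.5, pts.shape)
      # Centroid ~(320, 240), axis ratio ~0.5; the tilt follows OpenCV's
      # angle convention for fitted ellipses.
      print(ellipse_parameters(pts))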

  8. Plutonium solution analyzer

    SciTech Connect

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  9. Multiple capillary biochemical analyzer

    DOEpatents

    Dovichi, Norman J.; Zhang, Jian Z.

    1995-01-01

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibres to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands.

  10. Multiple capillary biochemical analyzer

    DOEpatents

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  11. Field Deployable DNA analyzer

    SciTech Connect

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step in making an autonomous deployable instrument. We perform sample clean up and concentration in a flow-through packed bed. For small initial samples, whole genome amplification is performed in the packed bed, resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we sought to determine whether the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof of principle assay.

  12. Analyzing Water's Optical Absorption

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.
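
    The sensitivity gain from a long optical path follows directly from the Beer-Lambert law, A = epsilon * l * c: for a fixed absorptivity and concentration, absorbance scales linearly with path length, so a waveguide path of about a metre resolves far smaller concentrations than a 1 cm cuvette. The Python lines below only illustrate that scaling with assumed, generic numbers; they are not UltraPath specifications.

      # Beer-Lambert law: absorbance A = epsilon * path_length * concentration.
      # Illustrative values only -- not instrument specifications.

      def absorbance(epsilon_l_per_mol_cm, path_cm, conc_mol_per_l):
          return epsilon_l_per_mol_cm * path_cm * conc_mol_per_l

      epsilon = 300.0     # molar absorptivity, L mol^-1 cm^-1 (assumed)
      conc = 1.0e-7       # 100 nM analyte (assumed)

      for path_cm in (1.0, 100.0):   # standard cuvette vs ~1 m waveguide cell
          print(f"path {path_cm:6.1f} cm -> A = {absorbance(epsilon, path_cm, conc):.2e}")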

  13. Motion detector and analyzer

    DOEpatents

    Unruh, W.P.

    1987-03-23

    Method and apparatus are provided for deriving positive and negative Doppler spectra to enable analysis of objects in motion, and particularly, objects having rotary motion. First and second returned radar signals are mixed with internal signals to obtain an in-phase process signal and a quadrature process signal. A broad-band phase shifter shifts the quadrature signal through 90 degrees relative to the in-phase signal over a predetermined frequency range. A pair of signals is output from the broad-band phase shifter which are then combined to provide a first side band signal which is functionally related to a negative Doppler shift spectrum. The distinct positive and negative Doppler spectra may then be analyzed for the motion characteristics of the object being examined.
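
    In software terms, the separation of positive and negative Doppler components described above is the classic single-sideband trick: treat the in-phase and quadrature channels as the real and imaginary parts of one complex signal, and the sign of each FFT frequency bin then distinguishes approaching from receding motion. The Python sketch below is a generic numerical illustration with assumed frequencies, not a model of the patented analog circuit.

      import numpy as np

      fs = 8_192.0                            # sample rate, Hz (assumed)
      t = np.arange(4096) / fs
      # Two scatterers: +800 Hz (approaching) and -300 Hz (receding) Doppler shifts.
      i = np.cos(2 * np.pi * 800.0 * t) + 0.5 * np.cos(2 * np.pi * 300.0 * t)
      q = np.sin(2 * np.pi * 800.0 * t) - 0.5 * np.sin(2 * np.pi * 300.0 * t)

      z = i + 1j * q                          # complex signal built from I and Q
      spectrum = np.fft.fftshift(np.fft.fft(z))
      freqs = np.fft.fftshift(np.fft.fftfreq(z.size, d=1.0 / fs))

      # The two strongest bins land on opposite sides of zero frequency.
      top = np.argsort(np.abs(spectrum))[-2:]
      print(sorted(freqs[top]))               # approximately [-300.0, 800.0]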

  14. System performance analyzer

    NASA Technical Reports Server (NTRS)

    Helbig, H. R.

    1981-01-01

    The System Performance Analyzer (SPA), designed to provide accurate real-time information about the operation of complex systems and developed for use on the Airborne Data Analysis/Monitor System (ADAMS), a ROLM 1666-based system, is described. The system uses an external processor to operate an intelligent, simulated control panel. Also provided are functions to trace operations, determine frequency of use of memory areas, and time or count user tasks in a multitask environment. This augments the information available from the standard debugger and control panel, and reduces the time and effort needed by ROLM 1666 users in optimizing their systems, as well as providing documentation of the effect of any changes. The operation and state of the system are evaluated.

  15. Analyzing a Cometary 'Sneeze'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Figure 1: Analyzing a Cometary 'Sneeze'

    This display shows highly processed images of the outburst of comet Tempel 1 between June 22 and 23, 2005. The pictures were taken by Deep Impact's medium-resolution camera. An average image of the comet has been subtracted from each picture to provide an enhanced view of the outburst. The intensity has also been stretched to show the faintest parts. This processing enables measurement of the outflow speed and the details of the dissipation of the outburst. The left image was taken when the comet was very close to its normal, non-bursting state, so almost nothing is visible.

  16. Residual gas analyzer calibration

    NASA Technical Reports Server (NTRS)

    Lilienkamp, R. H.

    1972-01-01

    A technique which employs known gas mixtures to calibrate the residual gas analyzer (RGA) is described. The mass spectra from the RGA are recorded for each gas mixture. These mass spectra data and the mixture composition data each form a matrix. From the two matrices the calibration matrix may be computed. The matrix mathematics requires that the number of calibration gas mixtures be equal to or greater than the number of gases included in the calibration. This technique was evaluated using a mathematical model of an RGA to generate the mass spectra. This model included shot-noise errors in the mass spectra. Errors in the gas concentrations were also included in the evaluation. The effects of these errors were studied by varying their magnitudes and comparing the resulting calibrations. Several methods of evaluating an actual calibration are presented. The effects of the number of gases included, the composition of the calibration mixtures, and the number of mixtures used are discussed.
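
    The matrix step described above can be written compactly: stack the recorded calibration spectra into a matrix S and the known mixture compositions into a matrix C, estimate the per-gas response (cracking) patterns by least squares, and then invert those patterns against any later spectrum to recover gas concentrations. The Python snippet below is one schematic, equivalent formulation using synthetic data and assumed noise levels; it is not the original calibration code.

      import numpy as np

      rng = np.random.default_rng(0)
      n_masses, n_gases, n_mixtures = 12, 3, 5

      # Synthetic per-gas cracking patterns (rows = gases, columns = mass channels).
      patterns = rng.uniform(0.0, 1.0, size=(n_gases, n_masses))

      # Known compositions of the calibration mixtures (rows sum to 1).
      C = rng.dirichlet(np.ones(n_gases), size=n_mixtures)          # mixtures x gases

      # Simulated RGA spectra for those mixtures, with a little shot noise.
      S = C @ patterns + rng.normal(0.0, 0.01, size=(n_mixtures, n_masses))

      # Calibration: estimate the patterns from (C, S); this needs at least as
      # many calibration mixtures as gases, as noted in the abstract.
      P_est, *_ = np.linalg.lstsq(C, S, rcond=None)

      # Analyze an "unknown" spectrum by solving P_est.T @ conc = spectrum.
      true_comp = np.array([0.2, 0.5, 0.3])
      unknown = true_comp @ patterns + rng.normal(0.0, 0.01, n_masses)
      conc_est, *_ = np.linalg.lstsq(P_est.T, unknown, rcond=None)
      print(conc_est)                                               # near [0.2 0.5 0.3]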

  17. Managing healthcare information: analyzing trust.

    PubMed

    Söderström, Eva; Eriksson, Nomie; Åhlfeldt, Rose-Mharie

    2016-08-01

    Purpose - The purpose of this paper is to analyze two case studies with a trust matrix tool, to identify trust issues related to electronic health records. Design/methodology/approach - A qualitative research approach is applied using two case studies. The data analysis of these studies generated a problem list, which was mapped to a trust matrix. Findings - Results demonstrate flaws in current practices and point to achieving balance between organizational, person and technology trust perspectives. The analysis revealed three challenge areas, to: achieve higher trust in patient-focussed healthcare; improve communication between patients and healthcare professionals; and establish clear terminology. By taking trust into account, a more holistic perspective on healthcare can be achieved, where trust can be obtained and optimized. Research limitations/implications - A trust matrix is tested and shown to identify trust problems on different levels and relating to trusting beliefs. Future research should elaborate and more fully address issues within three identified challenge areas. Practical implications - The trust matrix's usefulness as a tool for organizations to analyze trust problems and issues is demonstrated. Originality/value - Healthcare trust issues are captured to a greater extent and from previously unchartered perspectives. PMID:27477934

  18. Analyzing crime scene videos

    NASA Astrophysics Data System (ADS)

    Cunningham, Cindy C.; Peloquin, Tracy D.

    1999-02-01

    Since late 1996 the Forensic Identification Services Section of the Ontario Provincial Police has been actively involved in state-of-the-art image capture and the processing of video images extracted from crime scene videos. The benefits and problems of this technology for video analysis are discussed. All analysis is being conducted on SUN Microsystems UNIX computers, networked to a digital disk recorder that is used for video capture. The primary advantage of this system over traditional frame grabber technology is reviewed. Examples from actual cases are presented and the successes and limitations of this approach are explored. Suggestions to companies implementing security technology plans for various organizations (banks, stores, restaurants, etc.) will be made. Future directions for this work and new technologies are also discussed.

  19. Lorentz force particle analyzer

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Thess, André; Moreau, René; Tan, Yanqing; Dai, Shangjun; Tao, Zhen; Yang, Wenzhi; Wang, Bo

    2016-07-01

    A new contactless technique is presented for the detection of micron-sized insulating particles in the flow of an electrically conducting fluid. A transverse magnetic field brakes this flow and tends to become entrained in the flow direction by a Lorentz force, whose reaction force on the magnetic-field-generating system can be measured. The presence of insulating particles suspended in the fluid produces changes in this Lorentz force, generating pulses in it; these pulses enable the particles to be counted and sized. A two-dimensional numerical model that employs a moving mesh method demonstrates the measurement principle when such a particle is present. Two prototypes and a three-dimensional numerical model are used to demonstrate the feasibility of a Lorentz force particle analyzer (LFPA). The findings of this study conclude that such an LFPA, which offers contactless and on-line quantitative measurements, can be applied to an extensive range of applications. These applications include measurements of the cleanliness of high-temperature and aggressive molten metal, such as aluminum and steel alloys, and the clean manufacturing of semiconductors.

  20. Analyzing nocturnal noise stratification.

    PubMed

    Rey Gozalo, Guillermo; Barrigón Morillas, Juan Miguel; Gómez Escobar, Valentín

    2014-05-01

    Pollution associated with traffic can be considered one of the most relevant pollution sources in our cities; noise is one of the major components of traffic pollution; thus, efforts are necessary to find adequate noise assessment methods and low-pollution city designs. Different methods have been proposed for the evaluation of noise in cities, including the categorization method, which is based on the functionality concept. Until now, this method has only been studied (with encouraging results) for short-term, diurnal measurements, but nocturnal noise presents a behavior clearly different from the diurnal one. In this work 45 continuous measurements of approximately one week each in duration are statistically analyzed to identify differences between the proposed categories. The results show that the five proposed categories highlight the noise stratification of the studied city in each period of the day (day, evening, and night). A comparison of the continuous measurements with previous short-term measurements indicates that the latter can be a good approximation of the former in the diurnal period, reducing the resource expenditure for noise evaluation. Annoyance estimated from the measured noise levels was compared with the response of the population obtained from a questionnaire, with good agreement. The categorization method can yield good information about the distribution of a pollutant associated with traffic in our cities in each period of the day and, therefore, is a powerful tool for town planning and the design of pollution prevention policies.

  1. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent on the manual process that each TEAMS modeler must perform in preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  2. PULSE HEIGHT ANALYZER

    DOEpatents

    Johnstone, C.W.

    1958-01-21

    An anticoincidence device is described for a pair of adjacent channels of a multi-channel pulse height analyzer for preventing the lower channel from generating a count pulse in response to an input pulse when the input pulse has sufficient magnitude to reach the upper level channel. The anticoincidence circuit comprises a window amplifier, upper and lower level discriminators, and a biased-off amplifier. The output of the window amplifier is coupled to the inputs of the discriminators, the output of the upper level discriminator is connected to the resistance end of a series R-C network, the output of the lower level discriminator is coupled to the capacitance end of the R-C network, and the grid of the biased-off amplifier is coupled to the junction of the R-C network. In operation each discriminator produces a negative pulse output when the input pulse traverses its voltage setting. As a result of the connections to the R-C network, a trigger pulse will be sent to the biased-off amplifier when the incoming pulse level is sufficient to trigger only the lower level discriminator.
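
    The anticoincidence idea has a straightforward software analogue: a pulse is counted in the lower channel only if its peak crosses the lower discriminator level but not the upper one, so large pulses do not also register in the lower channel. The Python sketch below mimics that window logic on digitized pulse peaks; it is a conceptual illustration, not the patented vacuum-tube circuit.

      def channel_counts(peak_heights, lower_level, upper_level):
          """Single-channel-analyzer style sorting of pulse peak heights."""
          in_window = 0       # lower discriminator fired, upper did not
          above_window = 0    # upper discriminator fired (vetoes the lower channel)
          for peak in peak_heights:
              if peak >= upper_level:
                  above_window += 1
              elif peak >= lower_level:
                  in_window += 1
          return in_window, above_window

      # Example: window set between 1.0 V and 1.5 V (assumed values).
      print(channel_counts([0.4, 1.1, 1.3, 1.7, 2.0], 1.0, 1.5))   # (2, 2)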

  3. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.

  4. PULSE HEIGHT ANALYZER

    DOEpatents

    Goldsworthy, W.W.

    1958-06-01

    A differential pulse-height discriminator circuit is described which is readily adaptable for operation in a single-channel pulse-height analyzer. The novel aspect of the circuit lies in the specific arrangement of the differential pulse-height discriminator, which includes two pulse-height discriminators having a common input and an anticoincidence circuit having two interconnected vacuum tubes with a common cathode resistor. Pulses from the output of one discriminator circuit are delayed and coupled to the grid of one of the anticoincidence tubes by a resistor. The output pulses from the other discriminator circuit are coupled through a cathode follower circuit, which has a cathode resistor of such value as to provide a long time constant with the interelectrode capacitance of the tube, to lengthen the output pulses. The pulses are then fed to the grid of the other anticoincidence tube. With such connections of the circuits, only when the incoming pulse has a peak value between the operating levels of the two discriminators does an output pulse occur from the anticoincidence circuit.

  5. Analyzing Atmospheric Neutrino Oscillations

    SciTech Connect

    Escamilla, J.; Ernst, D. J.; Latimer, D. C.

    2007-10-26

    We provide a pedagogic derivation of the formula needed to analyze atmospheric data and then derive, for the subset of the data that are fully-contained events, an analysis tool that is quantitative and numerically efficient. Results for the full set of neutrino oscillation data are then presented. We find the following preliminary results: 1.) the sub-dominant approximation provides reasonable values for the best fit parameters for {delta}{sub 32}, {theta}{sub 23}, and {theta}{sub 13} but does not quantitatively provide the errors for these three parameters; 2.) the size of the MSW effect is suppressed in the sub-dominant approximation; 3.) the MSW effect reduces somewhat the extracted error for {delta}{sub 32}, more so for {theta}{sub 23} and {theta}{sub 13}; 4.) atmospheric data alone constrain the allowed values of {theta}{sub 13} only in the sub-dominant approximation; the full three-neutrino calculation requires CHOOZ to get a clean constraint; 5.) the linear in {theta}{sub 13} terms are not negligible; and 6.) the minimum value of {theta}{sub 13} is found to be negative, but at a statistically insignificant level.
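
    For orientation, the sub-dominant (effectively two-flavor, vacuum) approximation referred to above reduces the muon-neutrino survival probability to the familiar textbook form P = 1 - sin^2(2*theta_23) * sin^2(1.27 * dm^2_32 * L / E), with dm^2 in eV^2, L in km, and E in GeV. The Python lines below merely evaluate that standard formula with assumed parameter values; they are not the authors' full three-neutrino, MSW-corrected analysis code.

      import numpy as np

      def survival_prob_mu(L_km, E_GeV, dm2_eV2=2.4e-3, sin2_2theta23=1.0):
          """Two-flavor vacuum survival probability P(nu_mu -> nu_mu).

          Sub-dominant approximation only: theta_13 and matter (MSW)
          effects are neglected; parameter defaults are assumptions.
          """
          phase = 1.267 * dm2_eV2 * L_km / E_GeV
          return 1.0 - sin2_2theta23 * np.sin(phase) ** 2

      # Upward-going atmospheric neutrino crossing roughly the Earth's diameter.
      print(survival_prob_mu(L_km=12_800.0, E_GeV=1.0))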

  6. Pseudostupidity and analyzability.

    PubMed

    Cohn, L S

    1989-01-01

    This paper seeks to heighten awareness of pseudostupidity and the potential analyzability of patients who manifest it by defining and explicating it, reviewing the literature, and presenting in detail the psychoanalytic treatment of a pseudostupid patient. Pseudostupidity is caused by an inhibition of the integration and synthesis of thoughts resulting in a discrepancy between intellectual capacity and apparent intellect. The patient's pseudostupidity was determined in part by his need to prevent his being more successful than father, i.e., defeating his oedipal rival. Knowing and learning were instinctualized. The patient libidinally and defensively identified with father's passive, masochistic position. He needed to frustrate the analyst as he had felt excited and frustrated by his parents' nudity and thwarted by his inhibitions. He wanted to cause the analyst to feel as helpless as he, the patient, felt. Countertransference frustration was relevant and clinically useful in the analysis. Interpretation of evolving relevant issues led to more anxiety and guilt, less pseudostupidity, a heightened alliance, and eventual working through. Negative therapeutic reactions followed the resolution of pseudostupidity. PMID:2708771

  7. Downhole Fluid Analyzer Development

    SciTech Connect

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  8. Shedding Light on the Photochemistry of Coinage-Metal Phosphorescent Materials: A Time-Resolved Laue Diffraction Study of an AgI-CuI Tetranuclear Complex

    SciTech Connect

    Jarzembska, Katarzyna N.; Kamiński, Radoslaw; Fournier, Bertrand; Trzop, Elżbieta; Sokolow, Jesse D.; Henning, Robert; Chen, Yang; Coppens, Philip

    2014-11-14

    The triplet excited state of a new crystalline form of a tetranuclear coordination d10–d10-type complex, Ag2Cu2L4 (L = 2-diphenylphosphino-3-methylindole ligand), containing AgI and CuI metal centers has been explored using the Laue pump–probe technique with ≈80 ps time resolution. The relatively short lifetime of 1 μs is accompanied by significant photoinduced structural changes, as large as the Ag1···Cu2 distance shortening by 0.59(3) Å. The results show a pronounced strengthening of the argentophilic interactions and formation of new Ag···Cu bonds on excitation. Theoretical calculations indicate that the structural changes are due to a ligand-to-metal charge transfer (LMCT) strengthening the Ag···Ag interaction, mainly occurring from the methylindole ligands to the silver metal centers. QM/MM optimizations of the ground and excited states of the complex support the experimental results. Comparison with isolated molecule optimizations demonstrates the restricting effect of the crystalline matrix on photoinduced distortions. The work represents the first time-resolved Laue diffraction study of a heteronuclear coordination complex and provides new information on the nature of photoresponse of coinage metal complexes, which have been the subject of extensive studies.

  9. Digital Microfluidics Sample Analyzer

    NASA Technical Reports Server (NTRS)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in their own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central labbased chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  10. Soft Decision Analyzer

    NASA Technical Reports Server (NTRS)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the content of the received data stream's relation to the transmitted data content and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence and understanding of receiver performance. Direct examination of data contents is performed by two different data techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4 bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the
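
    Schematically, the data-content slip check can be pictured as a short cross-correlation: compare hard decisions recovered from the received soft symbols against the known transmitted pattern at every offset inside the slip window, and the offset with the strongest agreement indicates how many bits the receiver has slipped. The Python sketch below illustrates that idea with a simple hard-decision agreement score over a +/-4 bit window; it is only a schematic stand-in, not the SDA's power or modified Massey correlators.

      import numpy as np

      def estimate_bit_slip(soft_rx, tx_bits, max_slip=4):
          """Estimate receiver bit slip by comparing hard decisions from the
          received soft symbols with the known transmitted bits over offsets
          in [-max_slip, +max_slip]."""
          rx = np.where(np.asarray(soft_rx) >= 0.0, 1, 0)   # hard decisions
          tx = np.asarray(tx_bits)
          n = min(len(rx), len(tx)) - 2 * max_slip          # comparison window
          best_offset, best_score = 0, -1.0
          for offset in range(-max_slip, max_slip + 1):
              a = rx[max_slip + offset : max_slip + offset + n]
              b = tx[max_slip : max_slip + n]
              score = float(np.mean(a == b))                # fraction of agreement
              if score > best_score:
                  best_offset, best_score = offset, score
          return best_offset, best_score

      # Example: the received stream starts two bits into the pattern.
      rng = np.random.default_rng(1)
      tx = rng.integers(0, 2, 256)
      soft = 2.0 * tx[2:] - 1.0 + rng.normal(0.0, 0.4, 254)
      print(estimate_bit_slip(soft, tx))   # ~(-2, ~0.99): rx is two bits behind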

  11. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data that is needed to understand the soil chemistry of an asteroid, which then aids in determining in-situ which materials should be selected for return to earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) Mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 C; and (2) Identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near earth asteroid. It is an autonomous instrument that is controlled from earth but does the analysis of regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, thereby sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  12. COBSTRAN - COMPOSITE BLADE STRUCTURAL ANALYZER

    NASA Technical Reports Server (NTRS)

    Aiello, R. A.

    1994-01-01

    The COBSTRAN (COmposite Blade STRuctural ANalyzer) program is a pre- and post-processor that facilitates the design and analysis of composite turbofan and turboprop blades, as well as composite wind turbine blades. COBSTRAN combines composite mechanics and laminate theory with a database of fiber and matrix properties. As a preprocessor for NASTRAN or another Finite Element Method (FEM) program, COBSTRAN generates an FEM model with anisotropic homogeneous material properties. Stress output from the FEM program is provided as input to the COBSTRAN postprocessor. The postprocessor then uses the composite mechanics and laminate theory routines to calculate individual ply stresses, strains, interply stresses, through-the-thickness stresses, and failure margins. COBSTRAN is designed to carry out the many linear analyses required to efficiently model and analyze blade-like structural components made of multilayered angle-plied fiber composites. Components made from isotropic or anisotropic homogeneous materials can also be modeled as a special case of COBSTRAN. NASTRAN MAT1 or MAT2 material cards are generated according to user-supplied properties. COBSTRAN is written in FORTRAN 77 and was implemented on a CRAY X-MP with a UNICOS 5.0.12 operating system. The program requires either COSMIC NASTRAN or MSC NASTRAN as a structural analysis package. COBSTRAN was developed in 1989, and has a memory requirement of 262,066 64-bit words.

  13. Study of optical Laue diffraction

    SciTech Connect

    Chakravarthy, Giridhar; Allam, Srinivasa Rao; Satyanarayana, S. V. M.; Sharan, Alok E-mail: aloksharan@email.com

    2014-10-15

    We present a study of the optical diffraction patterns of one- and two-dimensional gratings with defects, designed on a desktop PC and printed on an OHP sheet with a laser printer. Gratings prepared with this novel, low-cost technique provide a good visual aid in teaching. The diffraction pattern of monochromatic light (632.8 nm) from such a grating is similar to the x-ray diffraction pattern of a crystal lattice with point defects in one and two dimensions; in both cases the diffraction is of the Fraunhofer type. Information about the crystalline lattice structure and the defect size can be extracted.

  14. Lattice-level observation of the elastic-to-plastic relaxation process with subnanosecond resolution in shock-compressed Ta using time-resolved in situ Laue diffraction

    SciTech Connect

    Wehrenberg, C. E.; Comley, A. J.; Barton, N. R.; Coppari, F.; Fratanduono, D.; Huntington, C. M.; Maddox, B. R.; Park, H. -S.; Plechaty, C.; Prisbrey, S. T.; Remington, B. A.; Rudd, R. E.

    2015-09-29

    We report direct lattice level measurements of plastic relaxation kinetics through time-resolved, in-situ Laue diffraction of shock-compressed single-crystal [001] Ta at pressures of 27-210 GPa. For a 50 GPa shock, a range of shear strains is observed extending up to the uniaxial limit for early data points (<0.6 ns) and the average shear strain relaxes to a near steady state over ~1 ns. For 80 and 125 GPa shocks, the measured shear strains are fully relaxed already at 200 ps, consistent with rapid relaxation associated with the predicted threshold for homogeneous nucleation of dislocations occurring at shock pressure ~65 GPa. The relaxation rate and shear stresses are used to estimate the dislocation density and these quantities are compared to the Livermore Multiscale Strength model as well as various molecular dynamics simulations.

  15. Soft Decision Analyzer and Method

    NASA Technical Reports Server (NTRS)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2016-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze the operation thereof to detect symbol-wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  16. Soft Decision Analyzer and Method

    NASA Technical Reports Server (NTRS)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2015-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze the operation thereof to detect symbol-wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  17. Droplet actuator analyzer with cartridge

    NASA Technical Reports Server (NTRS)

    Smith, Gregory F. (Inventor); Sturmer, Ryan A. (Inventor); Paik, Philip Y. (Inventor); Srinivasan, Vijay (Inventor); Pollack, Michael G. (Inventor); Pamula, Vamsee K. (Inventor); Brafford, Keith R. (Inventor); West, Richard M. (Inventor)

    2011-01-01

    A droplet actuator with cartridge is provided. According to one embodiment, a sample analyzer is provided and includes an analyzer unit comprising electronic or optical receiving means, a cartridge comprising self-contained droplet handling capabilities, and wherein the cartridge is coupled to the analyzer unit by a means which aligns electronic and/or optical outputs from the cartridge with electronic or optical receiving means on the analyzer unit. According to another embodiment, a sample analyzer is provided and includes a sample analyzer comprising a cartridge coupled thereto and a means of electrical interface and/or optical interface between the cartridge and the analyzer, whereby electrical signals and/or optical signals may be transmitted from the cartridge to the analyzer.

  18. Analyzing the Teaching of Professional Practice

    ERIC Educational Resources Information Center

    Moss, Pamela A.

    2011-01-01

    Background/Context: Based on their case studies of preparation for professional practice in the clergy, teaching, and clinical psychology, Grossman and colleagues (2009) identified three key concepts for analyzing and comparing practice in professional education--representations, decomposition, and approximations--to support professional educators…

  19. Interpolation Errors in Spectrum Analyzers

    NASA Technical Reports Server (NTRS)

    Martin, J. L.

    1996-01-01

    To obtain the proper measurement amplitude with a spectrum analyzer, the correct frequency-dependent transducer factor must be added to the voltage measured by the transducer. This report examines how entering transducer factors into a spectrum analyzer can cause significant errors in field amplitude due to the misunderstanding of the analyzer's interpolation methods. It also discusses how to reduce these errors to obtain a more accurate field amplitude reading.
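
    To make the interpolation pitfall concrete, the hedged Python sketch below compares a transducer factor interpolated linearly in frequency against one interpolated linearly in log-frequency; the frequencies and dB values are invented, and neither rule is claimed to be what any particular analyzer implements.

```python
import numpy as np

# Transducer (antenna) factors in dB entered at sparse frequencies (made-up values).
freqs_mhz = np.array([10.0, 100.0, 1000.0])
factor_db = np.array([8.0, 18.0, 30.0])

f_query = 300.0  # MHz, a frequency between the entered points

# Interpolation linear in frequency.
lin_f = np.interp(f_query, freqs_mhz, factor_db)

# Interpolation linear in log10(frequency), often intended for per-decade data.
lin_logf = np.interp(np.log10(f_query), np.log10(freqs_mhz), factor_db)

print(f"factor at {f_query} MHz, linear in f:      {lin_f:.2f} dB")
print(f"factor at {f_query} MHz, linear in log(f): {lin_logf:.2f} dB")
print(f"difference (potential field-amplitude error): {abs(lin_f - lin_logf):.2f} dB")
```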

  20. Construction and design principles for microprocessor-based conductometric analyzers

    SciTech Connect

    Gerasimov, B.I.; Mishchenko, S.V.; Glinkin, E.I.

    1995-04-01

    We consider questions connected with design of microprocessor-based conductometric analyzers and cases of the utilization of microprocessor technology to design automated instruments for analytical control.

  1. Nuclear fuel microsphere gamma analyzer

    DOEpatents

    Valentine, Kenneth H.; Long, Jr., Ernest L.; Willey, Melvin G.

    1977-01-01

    A gamma analyzer system is provided for the analysis of nuclear fuel microspheres and other radioactive particles. The system consists of an analysis turntable with means for loading, in sequence, a plurality of stations within the turntable; a gamma ray detector for determining the spectrum of a sample at one station; means for analyzing the spectrum; and a receiver turntable to collect the analyzed material in stations according to the spectrum analysis. Accordingly, particles may be sorted according to their quality; e.g., fuel particles with fractured coatings may be separated from those that are not fractured, or according to other properties.

  2. The discovery of X-rays diffraction: From crystals to DNA. A case study to promote understanding of the nature of science and of its interdisciplinary character

    NASA Astrophysics Data System (ADS)

    Guerra, Francesco; Leone, Matteo; Robotti, Nadia

    2016-05-01

    The advantages of introducing history of science topics into the teaching of science have been advocated by a large number of scholars within the science education community. One of the main reasons given for using history of science in teaching is its power to promote understanding of the nature of science (NOS). In this respect, the historical case of X-ray diffraction, from the discovery by Max von Laue (1912) to the first X-ray diffraction photographs of DNA (1953), is a case in point for showing that a correct experimental strategy and a favourable theoretical context are not enough to make a scientific discovery.

  3. Market study: Whole blood analyzer

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A market survey was conducted to develop findings relative to the commercialization potential and key market factors of the whole blood analyzer which is being developed in conjunction with NASA's Space Shuttle Medical System.

  4. Molecular wake shield gas analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, J. H.

    1980-01-01

    Techniques for measuring and characterizing the ultrahigh vacuum in the wake of an orbiting spacecraft are studied. A high sensitivity mass spectrometer that contains a double mass analyzer consisting of an open source miniature magnetic sector field neutral gas analyzer and an identical ion analyzer is proposed. These are configured to detect and identify gas and ion species of hydrogen, helium, nitrogen, oxygen, nitric oxide, and carbon dioxide and any other gas or ion species in the 1 to 46 amu mass range. This range covers the normal atmospheric constituents. The sensitivity of the instrument is sufficient to measure ambient gases and ions with a particle density of the order of one per cc. A chemical pump, or getter, is mounted near the entrance aperture of the neutral gas analyzer which integrates the absorption of ambient gases for a selectable period of time for subsequent release and analysis. The sensitivity is realizable for all but rare gases using this technique.

  5. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input

  6. Application of the LI-COR CO2 analyzer to volcanic plumes: a case study, volcán Popocatépetl, Mexico, June 7 and 10, 1995

    USGS Publications Warehouse

    Gerlach, T.M.; Delgado, H.; McGee, K.A.; Doukas, M.P.; Venegas, J.J.; Cardenas, L.

    1997-01-01

    Volcanic CO2 emission rate data are sparse despite their potential importance for constraining the role of magma degassing in the biogeochemical cycle of carbon and for assessing volcanic hazards. We used a LI-COR CO2 analyzer to determine volcanic CO2 emission rates by airborne measurements in volcanic plumes at Popocatépetl volcano on June 7 and 10, 1995. LI-COR sample paths of ∼72 m, compared with ∼1 km for the analyzer customarily used, together with fast Fourier transforms to remove instrument noise from raw data greatly improve resolution of volcanic CO2 anomalies. Parametric models fit to background CO2 provide a statistical tool for distinguishing volcanic from ambient CO2. Global Positioning System referenced flight traverses provide vastly improved data on the shape, coherence, and spatial distribution of volcanic CO2 in plume cross sections and contrast markedly with previous results based on traverse stacking. The continuous escape of CO2 and SO2 from Popocatépetl was fundamentally noneruptive and represented quiescent magma degassing from the top of a magma chamber ∼5 km deep. The average CO2 emission rate for January-June 1995 is estimated to be at least 6400 t d−1, one of the highest determined for a quiescently degassing volcano, although correction for downwind dispersion effects on volcanic CO2 indicates a higher rate of ∼9000 t d−1. Analysis of random errors indicates emission rates have 95% confidence intervals of ∼±20%, with uncertainty contributed mostly by wind speed variance, although the variance of plume cross-sectional areas during traversing is poorly constrained and possibly significant.
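
    The following Python sketch illustrates, in highly simplified form, the general traverse-based estimate underlying such measurements: subtract a background model, integrate the CO2 anomaly over an assumed plume cross-section, and multiply by wind speed. All numbers, the constant background, and the rectangular cross-section are assumptions for illustration, not the paper's parametric models or data.

```python
import numpy as np

def plume_emission_rate(distance_m, co2_mg_m3, background_mg_m3, plume_height_m, wind_m_s):
    """Crude single-traverse estimate: integrate the above-background CO2
    concentration along the traverse, scale by an assumed plume thickness to
    get a cross-sectional burden, and multiply by wind speed. Returns t/day."""
    x = np.asarray(distance_m, float)
    anomaly = np.clip(np.asarray(co2_mg_m3, float) - background_mg_m3, 0.0, None)
    line_integral = np.sum(0.5 * (anomaly[1:] + anomaly[:-1]) * np.diff(x))  # mg/m^2
    flux_mg_s = line_integral * plume_height_m * wind_m_s                    # mg/s
    return flux_mg_s * 86400.0 / 1e9                                         # mg/s -> t/day

# toy traverse: 4 km across the plume with a Gaussian-shaped 8 mg/m^3 anomaly
x = np.linspace(0.0, 4000.0, 401)
co2 = 720.0 + 8.0 * np.exp(-((x - 2000.0) / 600.0) ** 2)     # mg/m^3, background ~720
print(f"{plume_emission_rate(x, co2, 720.0, plume_height_m=500.0, wind_m_s=6.0):.0f} t/d")
```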

  7. On-Demand Urine Analyzer

    NASA Technical Reports Server (NTRS)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  8. Rotor for centrifugal fast analyzers

    DOEpatents

    Lee, N.E.

    1984-01-01

    The invention is an improved photometric analyzer of the rotary cuvette type, the analyzer incorporating a multicuvette rotor of novel design. The rotor (a) is leaktight, (b) permits operation in the 90° and 180° excitation modes, (c) is compatible with extensively used Centrifugal Fast Analyzers, and (d) can be used thousands of times. The rotor includes an assembly comprising a top plate, a bottom plate, and a central plate, the rim of the central plate being formed with circumferentially spaced indentations. A UV-transmitting ring is sealably affixed to the indented rim to define with the indentations an array of cuvettes. The ring serves both as a sealing means and an end window for the cuvettes.

  9. Rotor for centrifugal fast analyzers

    DOEpatents

    Lee, Norman E.

    1985-01-01

    The invention is an improved photometric analyzer of the rotary cuvette type, the analyzer incorporating a multicuvette rotor of novel design. The rotor (a) is leaktight, (b) permits operation in the 90° and 180° excitation modes, (c) is compatible with extensively used Centrifugal Fast Analyzers, and (d) can be used thousands of times. The rotor includes an assembly comprising a top plate, a bottom plate, and a central plate, the rim of the central plate being formed with circumferentially spaced indentations. A UV-transmitting ring is sealably affixed to the indented rim to define with the indentations an array of cuvettes. The ring serves both as a sealing means and an end window for the cuvettes.

  10. Real time infrared aerosol analyzer

    DOEpatents

    Johnson, Stanley A.; Reedy, Gerald T.; Kumar, Romesh

    1990-01-01

    Apparatus for analyzing aerosols in essentially real time includes a virtual impactor which separates coarse particles from fine and ultrafine particles in an aerosol sample. The coarse and ultrafine particles are captured in PTFE filters, and the fine particles impact onto an internal light reflection element. The composition and quantity of the particles on the PTFE filter and on the internal reflection element are measured by alternately passing infrared light through the filter and the internal light reflection element, and analyzing the light through infrared spectrophotometry to identify the particles in the sample.

  11. Software-Design-Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  12. Strategies for Analyzing Tone Languages

    ERIC Educational Resources Information Center

    Coupe, Alexander R.

    2014-01-01

    This paper outlines a method of auditory and acoustic analysis for determining the tonemes of a language starting from scratch, drawing on the author's experience of recording and analyzing tone languages of north-east India. The methodology is applied to a preliminary analysis of tone in the Thang dialect of Khiamniungan, a virtually undocumented…

  13. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1982-01-01

    FORTRAN Static Source Code Analyzer program (SAP) automatically gathers and reports statistics on occurrences of statements and structures within FORTRAN program. Provisions are made for weighting each statistic, providing user with overall figure of complexity. Statistics, as well as figures of complexity, are gathered on module-by-module basis. Overall summed statistics are accumulated for complete input source file.
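
    A minimal sketch of the weighting idea, in Python rather than FORTRAN: per-module statement counts are combined into a single weighted figure of complexity. The statement categories and weights below are invented for illustration and are not SAP's actual statistics.

```python
# Illustrative only: combine per-module statement counts into a single weighted
# "figure of complexity", in the spirit of SAP's user-supplied weighting scheme.
weights = {"GOTO": 3.0, "IF": 1.5, "DO": 1.2, "CALL": 1.0, "ASSIGNMENT": 0.1}

modules = {
    "READIN": {"GOTO": 4, "IF": 10, "DO": 3, "CALL": 6, "ASSIGNMENT": 120},
    "SOLVER": {"GOTO": 0, "IF": 25, "DO": 12, "CALL": 14, "ASSIGNMENT": 300},
}

def complexity(counts, weights):
    """Weighted sum of statement-type counts for one module."""
    return sum(weights.get(stmt, 0.0) * n for stmt, n in counts.items())

for name, counts in modules.items():
    print(f"{name:8s} complexity = {complexity(counts, weights):6.1f}")
print(f"{'OVERALL':8s} complexity = {sum(complexity(c, weights) for c in modules.values()):6.1f}")
```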

  14. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1984-01-01

    FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within FORTRAN program and provides reports of those statistics. Provisions made for weighting each statistic and provide an overall figure of complexity.

  15. The Statistical Loop Analyzer (SLA)

    NASA Technical Reports Server (NTRS)

    Lindsey, W. C.

    1985-01-01

    The statistical loop analyzer (SLA) is designed to automatically measure the acquisition, tracking and frequency stability performance characteristics of symbol synchronizers, code synchronizers, carrier tracking loops, and coherent transponders. Automated phase lock and system level tests can also be made using the SLA. Standard baseband, carrier and spread spectrum modulation techniques can be accommodated. Through the SLA's phase error jitter and cycle slip measurements the acquisition and tracking thresholds of the unit under test are determined; any false phase and frequency lock events are statistically analyzed and reported in the SLA output in probabilistic terms. Automated signal dropout tests can be performed in order to troubleshoot algorithms and evaluate the reacquisition statistics of the unit under test. Cycle slip rates and cycle slip probabilities can be measured using the SLA. These measurements, combined with bit error probability measurements, are all that are needed to fully characterize the acquisition and tracking performance of a digital communication system.
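
    The sketch below shows one simple way phase-error jitter and net cycle slips could be extracted from a recorded phase-error series; the unwrap-based counting and the toy waveform are assumptions for illustration, not the SLA's measurement method.

```python
import numpy as np

def phase_stats(wrapped_phase_err_rad):
    """Return (rms_jitter_rad, net_cycle_slips) for a wrapped phase-error series.
    Net slips = number of full 2*pi cycles accumulated by the unwrapped error."""
    err = np.asarray(wrapped_phase_err_rad, dtype=float)
    unwrapped = np.unwrap(err)
    net_slips = int(np.round((unwrapped[-1] - unwrapped[0]) / (2.0 * np.pi)))
    return float(np.std(err)), net_slips

# toy record: small jitter, then the loop winds through one full cycle (a slip)
rng = np.random.default_rng(1)
err = 0.1 * rng.standard_normal(2000)
err[1000:1050] += np.linspace(0.0, 2.0 * np.pi, 50)
err[1050:] += 2.0 * np.pi
wrapped = (err + np.pi) % (2.0 * np.pi) - np.pi   # as a phase detector would report it
print(phase_stats(wrapped))                        # net slip count should be 1
```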

  16. Portable imaging polarized light analyzer

    NASA Astrophysics Data System (ADS)

    Shashar, Nadav; Cronin, Thomas W.; Johnson, George; Wolff, Lawrence B.

    1995-06-01

    Many animals, both marine and terrestrial, are sensitive to the orientation of the e-vector of partially linearly polarized light (PLPL). This sensitivity is used for navigation, spatial orientation, and detection of large bodies of water. However, it is not clear what other information animals may receive from polarized light. Natural light fields, both in the sky and underwater, are known to be partially polarized. Additionally, natural objects reflect light that is polarized at specific orientations. Sensors capable of measuring the characteristics of PLPL, namely partial polarization and orientation, throughout an image are not yet available. By placing 2 twisted nematic liquid crystals (TNLCs) and a fixed polarizing filter in series in front of a video camera, and by controlling the angles of rotation of the orientation of polarization produced by the TNLCs, we are able to fully analyze PLPL throughout a full image on a single pixel basis. As a recording device we use a small camcorder. The sensor can be operated autonomously, with the images analyzed at a later stage, or it can be connected (in a future phase) via a frame grabber to a personal computer which analyzes the information online. The analyzed image can be presented as a false color image, where hue represents orientation of polarization and saturation represents partial polarization. Field measurements confirm that PLPL is a characteristic distributed both under water and on land. Marine background light is strongly horizontally polarized. Light reflected from leaves is polarized mainly according to their spatial orientation. Differences between PLPL reflected from objects or animals and their background can be used to enhance contrast and break color camouflage. Our sensor presents a new approach for answering questions related to the ecology of vision and is a new tool for remote sensing.
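
    The per-pixel math behind such an imaging polarimeter can be sketched with Stokes parameters computed from intensities measured through three effective analyzer orientations (0°, 45°, 90°). The Python example below illustrates that calculation on a single synthetic pixel; it is not a model of the TNLC-based device itself.

```python
import numpy as np

def linear_polarization(i0, i45, i90):
    """Per-pixel degree and angle of linear polarization from intensities
    measured through analyzer orientations of 0, 45 and 90 degrees."""
    i0, i45, i90 = (np.asarray(a, dtype=float) for a in (i0, i45, i90))
    s0 = i0 + i90                                              # total intensity
    s1 = i0 - i90
    s2 = 2.0 * i45 - s0
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)      # partial polarization, 0..1
    angle = 0.5 * np.degrees(np.arctan2(s2, s1))               # e-vector orientation, degrees
    return dolp, angle

# toy pixel: 40% polarized light at 30 degrees on top of an unpolarized background
theta = np.radians(30.0)
I_tot, p = 1.0, 0.4

def measure(phi_deg):
    """Malus-law model of the intensity seen through an analyzer at phi_deg."""
    return 0.5 * I_tot * (1.0 - p) + p * I_tot * np.cos(np.radians(phi_deg) - theta) ** 2

dolp, ang = linear_polarization(measure(0), measure(45), measure(90))
print(f"DoLP = {float(dolp):.2f}, angle = {float(ang):.1f} deg")   # expected: 0.40, 30.0
```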

  17. Satellite-based interference analyzer

    NASA Technical Reports Server (NTRS)

    Varice, H.; Johannsen, K.; Sabaroff, S.

    1977-01-01

    System identifies terrestrial sources of radiofrequency interference and measures their frequency spectra and amplitudes. Designed to protect satellite communication networks, system measures entire noise spectrum over selected frequency band and can raster-scan geographical region to locate noise sources. Once interference is analyzed, realistic interference protection ratios are determined and mathematical models for predicting radio-frequency noise spectra are established. This enhances signal detection and helps locate optimum geographical positions and frequency bands for communication equipment.

  18. DEEP WATER ISOTOPIC CURRENT ANALYZER

    DOEpatents

    Johnston, W.H.

    1964-04-21

    A deepwater isotopic current analyzer, which employs radioactive isotopes for measurement of ocean currents at various levels beneath the sea, is described. The apparatus, which can determine the direction and velocity of liquid currents, comprises a shaft having a plurality of radiation detectors extending equidistant radially therefrom, means for releasing radioactive isotopes from the shaft, and means for determining the time required for the isotope to reach a particular detector. (AEC)

  19. Analyzing ion distributions around DNA.

    PubMed

    Lavery, Richard; Maddocks, John H; Pasi, Marco; Zakrzewska, Krystyna

    2014-07-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach is presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation. PMID:24906882

  20. Analyzing ion distributions around DNA

    PubMed Central

    Lavery, Richard; Maddocks, John H.; Pasi, Marco; Zakrzewska, Krystyna

    2014-01-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach is presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation. PMID:24906882
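
    A much-simplified, straight-axis version of the binning idea can be sketched in Python with ordinary cylindrical coordinates: histogram ion positions into radial shells and convert per-snapshot counts into molarity. The curvilinear helicoidal treatment of the actual method, and all bin sizes below, are replaced by illustrative assumptions.

```python
import numpy as np

AVOGADRO = 6.02214076e23

def radial_ion_molarity(ion_xyz_ang, r_max=20.0, n_bins=20, z_len_ang=34.0):
    """Histogram ion positions (Angstrom, helix axis assumed straight along z)
    into cylindrical radial shells and convert counts per snapshot to molarity."""
    xyz = np.asarray(ion_xyz_ang, dtype=float)
    n_snapshots = 1 if xyz.ndim == 2 else xyz.shape[0]
    xyz = xyz.reshape(-1, 3)
    r = np.hypot(xyz[:, 0], xyz[:, 1])
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts, _ = np.histogram(r, bins=edges)
    # shell volume in litres: pi*(r2^2 - r1^2)*L, with 1 A^3 = 1e-27 L
    shell_vol_l = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2) * z_len_ang * 1e-27
    molarity = counts / n_snapshots / (AVOGADRO * shell_vol_l)
    return 0.5 * (edges[1:] + edges[:-1]), molarity

# toy data: 100 snapshots of 20 ions scattered around a 34-Angstrom segment of axis
rng = np.random.default_rng(2)
xy = rng.uniform(-20.0, 20.0, size=(100, 20, 2))
z = rng.uniform(-17.0, 17.0, size=(100, 20, 1))
centers, molar = radial_ion_molarity(np.concatenate([xy, z], axis=-1))
print(np.round(molar, 3))
```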

  1. Remote Laser Diffraction PSD Analyzer

    SciTech Connect

    T. A. Batcheller; G. M. Huestis; S. M. Bolton

    2000-06-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not achievable - making this technology far superior to the traditional methods used previously. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the tremendously useful fundamental engineering data gained. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  2. Remote Laser Diffraction PSD Analyzer

    SciTech Connect

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not achievable - making this technology far superior to the traditional methods used previously. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the tremendously useful fundamental engineering data gained. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  3. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    PubMed

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  4. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    PubMed

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639

  5. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    PubMed Central

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
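
    The authors demonstrate the procedure in R; purely as an illustration of the same idea, the Python sketch below splits paired data into chunks, computes a Pearson correlation in each chunk, and pools the estimates with a fixed-effect meta-analysis on Fisher-z values. It is not the authors' code.

```python
import numpy as np

def split_analyze_meta(x, y, n_splits=10):
    """Split paired data into chunks, compute a Pearson correlation per chunk,
    then pool the estimates with a fixed-effect meta-analysis on Fisher-z values."""
    x_chunks = np.array_split(np.asarray(x, float), n_splits)
    y_chunks = np.array_split(np.asarray(y, float), n_splits)
    zs, ws = [], []
    for xc, yc in zip(x_chunks, y_chunks):
        r = np.corrcoef(xc, yc)[0, 1]            # analyze: per-chunk correlation
        zs.append(np.arctanh(r))                 # Fisher z-transform
        ws.append(len(xc) - 3)                   # inverse variance of z is n - 3
    zs, ws = np.array(zs), np.array(ws, float)
    z_pooled = np.sum(ws * zs) / np.sum(ws)      # meta-analyze: fixed-effect pooling
    se = 1.0 / np.sqrt(np.sum(ws))
    ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
    return np.tanh(z_pooled), tuple(ci)

# toy "big" dataset with a true correlation of about 0.3
rng = np.random.default_rng(3)
n = 100_000
x = rng.standard_normal(n)
y = 0.3 * x + np.sqrt(1 - 0.3**2) * rng.standard_normal(n)
print(split_analyze_meta(x, y, n_splits=20))
```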

  6. Analyzing lease/purchase options.

    PubMed

    Ciolek, D; Mace, J D

    1998-01-01

    The authors' previous article, "Equipment Acquisition Using Various Forms of Leasing," covers information necessary for selecting among the different kinds of leases. This article explains how to carry out a proper financial analysis, preferably using two phases. Using a representative example, the article guides the reader through the first phase and introduces the elements needing review in the second phase. Key elements include pretax, aftertax, and cash flow analyses. Different organizations use different yardsticks to measure the financials of a transaction, but in general, cash is king. Therefore, the most widely used comparison is the purchase versus lease IRR (internal rate of return) produced by measuring the cash flow of the purchase case compared to the cash flow of the lease case.
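
    A minimal sketch of that purchase-versus-lease IRR comparison follows; the cash-flow figures are invented, taxes and depreciation are ignored, and the bisection-based IRR routine is just one simple way to solve for the rate.

```python
def npv(rate, cash_flows):
    """Net present value of cash flows occurring at t = 0, 1, 2, ... periods."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.9, hi=10.0, tol=1e-7):
    """Internal rate of return by bisection (assumes one sign change in NPV)."""
    f_lo = npv(lo, cash_flows)
    if f_lo * npv(hi, cash_flows) > 0:
        raise ValueError("no sign change in the search interval")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, cash_flows) * f_lo > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative numbers only (pre-tax, no depreciation):
purchase = [-100_000, 0, 0, 0, 0, 20_000]                   # buy now, salvage in year 5
lease = [0, -24_000, -24_000, -24_000, -24_000, -24_000]    # annual lease payments
differential = [p - l for p, l in zip(purchase, lease)]     # purchase-minus-lease cash flow
print(f"purchase-versus-lease IRR: {irr(differential):.1%}")
```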

  7. The Aqueduct Global Flood Analyzer

    NASA Astrophysics Data System (ADS)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  8. Analyzing PICL trace data with MEDEA

    SciTech Connect

    Merlo, A.P.; Worley, P.H.

    1994-04-01

    Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic performance characteristics of performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.

  9. Miniature integrated-optical wavelength analyzer chip

    NASA Astrophysics Data System (ADS)

    Kunz, R. E.; Dübendorfer, J.

    1995-11-01

    A novel integrated-optical chip suitable for realizing compact miniature wavelength analyzers with high linear dispersion is presented. The chip performs the complete task of converting the spectrum of an input beam into a corresponding spatial irradiance distribution without the need for an imaging function. We demonstrate the feasibility of this approach experimentally by monitoring the changes in the mode spectrum of a laser diode on varying its case temperature. Comparing the results with simultaneous measurements by a commercial spectrometer yielded an rms wavelength deviation of 0.01 nm.

  10. Analyzing PICL trace data with MEDEA

    SciTech Connect

    Merlo, A.P.; Worley, P.H.

    1993-11-01

    Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic performance characteristics of performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). This report describes the integration of the PICL trace file format into MEDEA. A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.

  11. [Examination of the olfactory analyzer].

    PubMed

    Domrachev, A A; Afon'kin, V Iu

    2002-01-01

    A method of threshold olfactometry is proposed, consisting of the use of three olfactive substances (tincture of valerian, acetic acid, liquid ammonia) in selected concentrations. This allows the thresholds of each modality to be investigated. Each concentration of the olfactive substance is placed into a glass bottle (100 ml) and stored at a temperature of 18-20 degrees C. Examination of the state of the olfactory analyzer over a 24-h working day showed that threshold olfactometry remains stable when the organism is fatigued. Utilization of threshold olfactometry in some diagnostic areas is shown. PMID:12056163

  12. The OpenSHMEM Analyzer

    2014-07-30

    The OpenSHMEM Analyzer is a compiler-based tool that can help users detect errors and provide useful analyses about their OpenSHMEM applications. The tool is built on top of the OpenUH compiler (a branch of Open64 compiler) and presents OpenSHMEM information as feedback to the user. Some of the analyses it provides include checks for correct usage of symmetric variables in OpenSHMEM calls, out-of-bounds checks for symmetric data, checks for the correct initialization of pointers to symmetric data, and symmetric data alias information.

  13. MULTICHANNEL PULSE-HEIGHT ANALYZER

    DOEpatents

    Russell, J.T.; Lefevre, H.W.

    1958-01-21

    This patent deals with electronic computing circuits and more particularly with pulse-height analyzers used for classifying variable amplitude pulses into groups of different amplitudes. The device accomplishes this pulse allocation by converting the pulses into frequencies corresponding to the amplitudes of the pulses, which frequencies are filtered in channels individually pretuned to a particular frequency and then detected and recorded in the responsive channel. This circuit substantially overcomes the disadvantages of prior analyzers incorporating discriminators pre-set to respond to certain voltage levels, since small variation in component values is not as critical to satisfactory circuit operation.
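
    A software stand-in for the channel-allocation idea is sketched below: sampled pulse amplitudes are sorted into pre-set equal-width channels. The patent performs this in analogue hardware via amplitude-to-frequency conversion and tuned filters; the histogram here only illustrates the classification outcome, and the channel count and amplitudes are assumed.

```python
import numpy as np

def pulse_height_spectrum(pulse_amplitudes, n_channels=64, full_scale=10.0):
    """Sort pulse amplitudes (e.g. volts) into equal-width channels and return
    the per-channel counts -- a digital illustration of pulse-height analysis."""
    edges = np.linspace(0.0, full_scale, n_channels + 1)
    counts, _ = np.histogram(np.asarray(pulse_amplitudes, float), bins=edges)
    return edges, counts

# toy data: two spectral "lines" at 3.0 V and 7.5 V
rng = np.random.default_rng(4)
pulses = np.concatenate([rng.normal(3.0, 0.15, 5000), rng.normal(7.5, 0.2, 2000)])
edges, counts = pulse_height_spectrum(pulses)
peak_channels = np.argsort(counts)[-2:]
print("channels with most counts:", sorted(peak_channels.tolist()))
```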

  14. Method for analyzing microbial communities

    SciTech Connect

    Zhou, Jizhong; Wu, Liyou

    2010-07-20

    The present invention provides a method for quantitatively analyzing microbial genes, species, or strains in a sample that contains at least two species or strains of microorganisms. The method involves using an isothermal DNA polymerase to randomly and representatively amplify genomic DNA of the microorganisms in the sample, hybridizing the resultant polynucleotide amplification product to a polynucleotide microarray that can differentiate different genes, species, or strains of microorganisms of interest, and measuring hybridization signals on the microarray to quantify the genes, species, or strains of interest.

  15. Truck acoustic data analyzer system

    DOEpatents

    Haynes, Howard D.; Akerman, Alfred; Ayers, Curtis W.

    2006-07-04

    A passive vehicle acoustic data analyzer system having at least one microphone disposed in the acoustic field of a moving vehicle and a computer in electronic communication with the microphone(s). The computer detects and measures the frequency shift in the acoustic signature emitted by the vehicle as it approaches and passes the microphone(s). The acoustic signature of a truck driving by a microphone can provide enough information to estimate the truck speed in miles-per-hour (mph), engine speed in rotations-per-minute (RPM), turbocharger speed in RPM, and vehicle weight.
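
    The speed estimate rests on the acoustic Doppler shift between the approaching and receding tone; a short worked example follows, with the speed of sound and the two measured frequencies assumed for illustration.

```python
# Doppler-based speed estimate from a single stationary microphone:
# a tone of true frequency f0 is heard at f_approach = f0*c/(c - v) while the
# vehicle approaches and at f_recede = f0*c/(c + v) after it passes, so
#   v = c * (f_approach - f_recede) / (f_approach + f_recede).
C_SOUND = 343.0  # m/s near 20 C (assumed)

def speed_from_doppler(f_approach_hz, f_recede_hz, c=C_SOUND):
    return c * (f_approach_hz - f_recede_hz) / (f_approach_hz + f_recede_hz)

# illustrative numbers: an engine-related tone heard at 112 Hz approaching, 98 Hz receding
v = speed_from_doppler(112.0, 98.0)
print(f"estimated speed: {v:.1f} m/s  ({v * 2.23694:.1f} mph)")
```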

  16. Charged particle mobility refrigerant analyzer

    DOEpatents

    Allman, Steve L.; Chen, Chung-Hsuan; Chen, Fang C.

    1993-01-01

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.

  17. Charged particle mobility refrigerant analyzer

    DOEpatents

    Allman, S.L.; Chunghsuan Chen; Chen, F.C.

    1993-02-02

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.

  18. The OpenSHMEM Analyzer

    SciTech Connect

    Hernandez, Oscar

    2014-07-30

    The OpenSHMEM Analyzer is a compiler-based tool that can help users detect errors and provide useful analyses about their OpenSHMEM applications. The tool is built on top of the OpenUH compiler (a branch of Open64 compiler) and presents OpenSHMEM information as feedback to the user. Some of the analyses it provides include checks for correct usage of symmetric variables in OpenSHMEM calls, out-of-bounds checks for symmetric data, checks for the correct initialization of pointers to symmetric data, and symmetric data alias information.

  19. Automating a residual gas analyzer

    NASA Technical Reports Server (NTRS)

    Petrie, W. F.; Westfall, A. H.

    1982-01-01

    A residual gas analyzer (RGA), a device for measuring the amounts and species of various gases present in a vacuum system, is discussed. In a recent update of the RGA, it was shown that the use of microprocessors could revolutionize data acquisition and data reduction. This revolution is exemplified by the Inficon 1Q200 RGA which was selected to meet the needs of this update. The Inficon RGA and the Zilog microcomputer were interfaced in order to receive and format the digital data from the RGA. This automated approach is discussed in detail.

  20. Trace Gas Analyzer (TGA) program

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The design, fabrication, and test of a breadboard trace gas analyzer (TGA) is documented. The TGA is a gas chromatograph/mass spectrometer system. The gas chromatograph subsystem employs a recirculating hydrogen carrier gas. The recirculation feature minimizes the requirement for transport and storage of large volumes of carrier gas during a mission. The silver-palladium hydrogen separator which permits the removal of the carrier gas and its reuse also decreases vacuum requirements for the mass spectrometer since the mass spectrometer vacuum system need handle only the very low sample pressure, not sample plus carrier. System performance was evaluated with a representative group of compounds.

  1. Some easily analyzable convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R.; Dolinar, S.; Pollara, F.; Vantilborg, H.

    1989-01-01

    Convolutional codes have played and will play a key role in the downlink telemetry systems on many NASA deep-space probes, including Voyager, Magellan, and Galileo. One of the chief difficulties associated with the use of convolutional codes, however, is the notorious difficulty of analyzing them. Given a convolutional code as specified, say, by its generator polynomials, it is no easy matter to say how well that code will perform on a given noisy channel. The usual first step in such an analysis is to compute the code's free distance; this can be done with an algorithm whose complexity is exponential in the code's constraint length. The second step is often to calculate the transfer function in one, two, or three variables, or at least a few terms in its power series expansion. This step is quite hard, and for many codes of relatively short constraint lengths, it can be intractable. However, a large class of convolutional codes was discovered for which the free distance can be computed by inspection, and for which there is a closed-form expression for the three-variable transfer function. Although for large constraint lengths, these codes have relatively low rates, they are nevertheless interesting and potentially useful. Furthermore, the ideas developed here to analyze these specialized codes may well extend to a much larger class.
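
    As a concrete illustration of the first analysis step mentioned above, the Python sketch below computes the free distance of a rate-1/n convolutional code by a lowest-weight-path (Dijkstra) search over the encoder state diagram, shown for the familiar rate-1/2, constraint-length-3 (7,5) code. The register and bit-ordering conventions are assumptions of this sketch, not tied to any particular mission's encoder.

```python
from heapq import heappush, heappop

def free_distance(generators, constraint_length):
    """Free distance of a rate-1/n binary convolutional code, found as the
    minimum output Hamming weight over paths that leave the all-zero state
    and first return to it (Dijkstra over the state diagram)."""
    K = constraint_length
    m = K - 1
    gens = [int(g) for g in generators]

    def step(state, bit):
        reg = (bit << m) | state                       # newest bit in the MSB
        weight = sum(bin(reg & g).count("1") % 2 for g in gens)
        return reg >> 1, weight                        # next state, output weight

    start, w0 = step(0, 1)                             # diverge from the zero state
    best = float("inf")
    dist = {start: w0}
    heap = [(w0, start)]
    while heap:
        w, s = heappop(heap)
        if w > dist.get(s, float("inf")):
            continue
        for bit in (0, 1):
            ns, ow = step(s, bit)
            if ns == 0:                                # remerge with the zero state
                best = min(best, w + ow)
            elif w + ow < dist.get(ns, float("inf")):
                dist[ns] = w + ow
                heappush(heap, (w + ow, ns))
    return best

# the classic rate-1/2, K=3 code with generators (7, 5) octal has free distance 5
print(free_distance([0o7, 0o5], constraint_length=3))
```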

  2. Thermal and evolved gas analyzer

    NASA Technical Reports Server (NTRS)

    Williams, M. S.; Boynton, W. V.; James, R. L.; Verts, W. T.; Bailey, S. H.; Hamara, D. K.

    1998-01-01

    The Thermal and Evolved Gas Analyzer (TEGA) instrument will perform calorimetry and evolved gas analysis on soil samples collected from the Martian surface. TEGA is one of three instruments, along with a robotic arm, that form the Mars Volatile and Climate Survey (MVACS) payload. The other instruments are a stereo surface imager, built by Peter Smith of the University of Arizona and a meteorological station, built by JPL. The MVACS lander will investigate a Martian landing site at approximately 70 deg south latitude. Launch will take place from Kennedy Space Center in January, 1999. The TEGA project started in February, 1996. In the intervening 24 months, a flight instrument concept has been designed, prototyped, built as an engineering model and flight model, and tested. The instrument performs laboratory-quality differential-scanning calorimetry (DSC) over the temperature range of Mars ambient to 1400 K. Low-temperature volatiles (water and carbon dioxide ices) and the carbonates will be analyzed in this temperature range. Carbonates melt and evolve carbon dioxide at temperatures above 600 °C. Evolved oxygen (down to a concentration of 1 ppm) is detected, and CO2 and water vapor and the isotopic variations of CO2 and water vapor are detected and their concentrations measured. The isotopic composition provides important tests of the theory of solar system formation.

  3. Coaxial charged particle energy analyzer

    NASA Technical Reports Server (NTRS)

    Kelly, Michael A. (Inventor); Bryson, III, Charles E. (Inventor); Wu, Warren (Inventor)

    2011-01-01

    A non-dispersive electrostatic energy analyzer for electrons and other charged particles having a generally coaxial structure of sequentially arranged sections of an electrostatic lens to focus the beam through an iris and preferably including an ellipsoidally shaped input grid for collimating a wide acceptance beam from a charged-particle source, an electrostatic high-pass filter including a planar exit grid, and an electrostatic low-pass filter. The low-pass filter is configured to reflect low-energy particles back towards a charged particle detector located within the low-pass filter. Each section comprises multiple tubular or conical electrodes arranged about the central axis. The voltages on the lens are scanned to place a selected energy band of the accepted beam at a selected energy at the iris. Voltages on the high-pass and low-pass filters remain substantially fixed during the scan.

  4. Nonlinear Single Spin Spectrum Analyzer

    NASA Astrophysics Data System (ADS)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2014-03-01

    Qubits have been used as linear spectrum analyzers of their environments, through the use of decoherence spectroscopy. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis. Phys. Rev. Lett. 110, 110503 (2013). Synopsis at http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.110.110503 Current position: NIST, Boulder, CO.

  5. Analyze distillation columns with thermodynamics

    SciTech Connect

    Ognisty, T.P.

    1995-02-01

    In a distillation column, heat supplies the work for separating the components of a feed stream into products. Distillation columns consume some 95% of the total energy used in separations. This amounts to roughly 3% of the energy consumed in the US. Since distillation is so energy intensive and requires significant capital outlays, an endless quest to improve the economics has continued since the beginning of the industry. By analyzing the thermodynamics of a distillation column, an engineer can quantify the thermodynamic efficiency of the process, identify the regions where energy can be better utilized, and define the minimum targets for energy consumption. This article reviews the principles of distillation column thermodynamics and outlines the analysis of lost work profiles and column heat profiles. It then illustrates these concepts through three examples.

  6. Compact Microwave Fourier Spectrum Analyzer

    NASA Technical Reports Server (NTRS)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets to enable remote analysis and determination of chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state-of-the-art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  7. Using Toyota's A3 Thinking for Analyzing MBA Business Cases

    ERIC Educational Resources Information Center

    Anderson, Joe S.; Morgan, James N.; Williams, Susan K.

    2011-01-01

    A3 Thinking is fundamental to Toyota's benchmark management philosophy and to their lean production system. It is used to solve problems, gain agreement, mentor team members, and lead organizational improvements. A structured problem-solving approach, A3 Thinking builds improvement opportunities through experience. We used "The Toyota…

  8. Analyzing bioterror response logistics: the case of smallpox.

    PubMed

    Kaplan, Edward H; Craft, David L; Wein, Lawrence M

    2003-09-01

    To evaluate existing and alternative proposals for emergency response to a deliberate smallpox attack, we embed the key operational features of such interventions into a smallpox disease transmission model. We use probabilistic reasoning within an otherwise deterministic epidemic framework to model the 'race to trace', i.e., attempting to trace (via the infector) and vaccinate an infected person while (s)he is still vaccine-sensitive. Our model explicitly incorporates a tracing/vaccination queue, and hence can be used as a capacity planning tool. An approximate analysis of this large (16 ODE) system yields closed-form estimates for the total number of deaths and the maximum queue length. The former estimate delineates the efficacy (i.e., accuracy) and efficiency (i.e., speed) of contact tracing, while the latter estimate reveals how congestion makes the race to trace more difficult to win, thereby causing more deaths. A probabilistic analysis is also used to find an approximate closed-form expression for the total number of deaths under mass vaccination, in terms of both the basic reproductive ratio and the vaccination capacity. We also derive approximate thresholds for initially controlling the epidemic for more general interventions that include imperfect vaccination and quarantine.

  9. Analyzing Sustainability Themes in State Science Standards: Two Case Studies

    ERIC Educational Resources Information Center

    Miller, Hannah K.; Jones, Linda Cronin

    2014-01-01

    Due to the interdisciplinary nature of environmental education, addressing the range of socioscientific issues included under the umbrella of sustainability can be challenging for educators working within the context of mandated state subject area standards. Two states (Washington and Vermont) have been recognized as leaders in incorporating…

  10. Analyzing radial acceleration with a smartphone acceleration sensor

    NASA Astrophysics Data System (ADS)

    Vogt, Patrik; Kuhn, Jochen

    2013-03-01

    This paper continues the sequence of experiments using the acceleration sensor of smartphones (for description of the function and the use of the acceleration sensor, see Ref. 1) within this column, in this case for analyzing the radial acceleration.
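
    A one-line worked example of the relation being verified, a = ω²r = v²/r, is given below; the mounting radius and rotation rate are illustrative assumptions.

```python
import math

# Radial (centripetal) acceleration: a = omega^2 * r = v^2 / r.
# Example: a phone taped 0.25 m from the centre of a turntable spinning at 45 rpm.
r = 0.25                          # m (assumed mounting radius)
omega = 45 / 60 * 2 * math.pi     # rad/s
a = omega ** 2 * r
v = omega * r
print(f"omega = {omega:.2f} rad/s, a = {a:.2f} m/s^2, check v^2/r = {v**2 / r:.2f} m/s^2")
```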

  11. Application of the high-energy refocused Laue method to the study of recrystallization mechanisms in FCC metals after cold deformation

    NASA Astrophysics Data System (ADS)

    Chauveau, Th.; Gerber, Ph.; Bastie, P.; Hamelin, B.; Tarasiuk, J.; Bacroix, B.

    2002-07-01

    The high-energy Laue diffraction method is applied here to determine the kinetics of the recrystallization process in copper after cold rolling and interrupted annealing. First, the results obtained from classical reflection X-ray diffraction texture measurements are presented; a comparison then shows that the two techniques are complementary and that the transmission Laue method offers clear advantages. Several earlier hypotheses on the mechanisms governing recrystallization are verified in this work. In particular, at the very beginning of recrystallization the Cube orientation {100}<001> loses its growth advantage to the components of the deformation texture, which leads to a mixed "deformed-recrystallized" texture at the end of the process. Only the nucleation step is preferential, in the sense that it determines the final recrystallized texture.

  12. Preparing and Analyzing Iced Airfoils

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.; Choo, Yung K.; Coroneos, Rula M.; Pennline, James A.; Hackenberg, Anthony W.; Schilling, Herbert W.; Slater, John W.; Burke, Kevin M.; Nolan, Gerald J.; Brown, Dennis

    2004-01-01

    SmaggIce version 1.2 is a computer program for preparing and analyzing iced airfoils. It includes interactive tools for (1) measuring ice-shape characteristics, (2) controlled smoothing of ice shapes, (3) curve discretization, (4) generation of artificial ice shapes, and (5) detection and correction of input errors. Measurements of ice shapes are essential for establishing relationships between characteristics of ice and effects of ice on airfoil performance. The shape-smoothing tool helps prepare ice shapes for use with already available grid-generation and computational-fluid-dynamics software for studying the aerodynamic effects of smoothed ice on airfoils. The artificial ice-shape generation tool supports parametric studies, since ice-shape parameters can easily be controlled with the artificial ice. In such studies, artificial shapes generated by this program can supplement simulated ice obtained from icing research tunnels and real ice obtained from flight tests under icing weather conditions. SmaggIce also automatically detects geometry errors, such as tangles or duplicate points in the boundary, which may be introduced by digitization, and provides tools to correct them. By use of the interactive tools included in SmaggIce version 1.2, one can easily characterize ice shapes and prepare iced airfoils for grid generation and flow simulations.

  13. Thomson parabola ion energy analyzer

    SciTech Connect

    Cobble, James A; Flippo, Kirk A; Letzring, Samuel A; Lopez, Frank E; Offermann, Dustin T; Oertel, John A; Mastrosimone, Dino

    2010-01-01

    A new, versatile Thomson parabola ion energy (TPIE) analyzer has been designed and constructed for use at the OMEGA-EP facility. Multi-MeV ions from EP targets are transmitted through a W pinhole into a (5- or 8-kG) magnetic field and subsequently through a parallel electric field of up to 30 kV/cm. The ion drift region may have a user-selected length of 10, 50, or 80 cm. With the highest fields, 500-MeV C^6+ and C^5+ may be resolved. TPIE is TIM-mounted at OMEGA-EP and is qualified in all existing TIMs. The instrument runs on pressure-interlocked 15-VDC power available in EP TIM carts. It may be inserted to within several inches of the target to attain sufficient flux for a measurement. For additional flux control, the user may select a square-aperture W pinhole of 0.004 inch or 0.010 inch. The detector consists of CR-39 backed by an image plate. The fully relativistic design code and design features are discussed. Ion spectral results from first use at OMEGA-EP are expected.
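
    For context, the textbook small-deflection relations for a Thomson parabola (magnetic and electric field regions of lengths L_B and L_E followed by a drift distance D; the symbols are generic, not instrument specifications) show why each charge-to-mass ratio traces its own parabola:

        x \approx \frac{qBL_B D}{mv}, \qquad
        y \approx \frac{qEL_E D}{mv^2}
        \;\;\Longrightarrow\;\;
        y = \frac{m}{q}\,\frac{EL_E}{B^2 L_B^2 D}\,x^2 ,

    so ions with the same q/m fall on one parabolic trace and are spread along it by energy.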

  14. Analyzing and modeling heterogeneous behavior

    NASA Astrophysics Data System (ADS)

    Lin, Zhiting; Wu, Xiaoqing; He, Dongyue; Zhu, Qiang; Ni, Jixiang

    2016-05-01

    Recently it has been pointed out that non-Poisson statistics with heavy tails exist in many scenarios of human behavior, but most of these studies have claimed that a power law characterizes diverse aspects of human mobility patterns. In this paper, we suggest that human behavior may not be driven by identical mechanisms and can be modeled as a Semi-Markov Modulated Process. To verify this suggestion and the model, we analyzed a total of 1,619,934 records of library visits (by undergraduate and graduate students). We found that the distribution of visit intervals is well fitted by three linear sections in log-log scale rather than by the traditional single power-law distribution. The results confirm that some human behaviors cannot be expressed as a power law or any other simple function. We also divided the data into groups and extracted bursty events in each period. Careful analysis of the different groups leads to the conclusion that aggregate behavior may be composed of heterogeneous behaviors, and that even behaviors of the same type tend to differ from period to period; the aggregate behavior is thus formed by "heterogeneous groups". We performed a series of experiments, and the simulation results showed that a two-state Semi-Markov Modulated Process is sufficient to construct a proper representation of this heterogeneous behavior.
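
    A minimal generative sketch of the kind of two-state semi-Markov modulated process described above is given below; the state dwell times and per-state interval distributions are hypothetical choices for illustration, not the fitted model from the paper.

        import numpy as np

        def semi_markov_intervals(n_events=10_000, seed=0):
            """Alternate between a 'bursty' and a 'quiet' state; each state has its own
            inter-event-interval distribution and its own dwell-time distribution."""
            rng = np.random.default_rng(seed)
            rate = {"bursty": 2.0, "quiet": 0.05}        # events per hour in each state
            mean_dwell = {"bursty": 5.0, "quiet": 48.0}  # hours spent in each state
            state, t, t_state_end = "bursty", 0.0, 0.0
            intervals = []
            while len(intervals) < n_events:
                if t >= t_state_end:                     # switch state when the dwell expires
                    state = "quiet" if state == "bursty" else "bursty"
                    t_state_end = t + rng.exponential(mean_dwell[state])
                gap = rng.exponential(1.0 / rate[state])
                intervals.append(gap)
                t += gap
            return np.array(intervals)

    Mixing two (or more) such regimes already produces an interval histogram that bends into distinct sections on a log-log plot instead of following a single power law.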

  15. Analyzing plant defenses in nature

    PubMed Central

    Kautz, Stefanie; Heil, Martin; Hegeman, Adrian D

    2009-01-01

    A broad range of chemical plant defenses against herbivores has been studied extensively under laboratory conditions. In many of these cases there is still little understanding of their relevance in nature. In natural systems, functional analyses of plant traits are often complicated by an extreme variability, which affects the interaction with higher trophic levels. Successful analyses require consideration of the numerous sources of variation that potentially affect the plant trait of interest. In our recent study on wild lima bean (Phaseolus lunatus L.) in South Mexico, we applied an integrative approach combining analyses for quantitative correlations of cyanogenic potential (HCNp; the maximum amount of cyanide that can be released from a given tissue) and herbivory in the field with subsequent feeding trials under controlled conditions. This approach allowed us to causally explain the consequences of quantitative variation of HCNp on herbivore-plant interactions in nature and highlights the importance of combining data obtained in natural systems with analyses under controlled conditions. PMID:19820300

  16. Analyzing Dynamics of Cooperating Spacecraft

    NASA Technical Reports Server (NTRS)

    Hughes, Stephen P.; Folta, David C.; Conway, Darrel J.

    2004-01-01

    A software library has been developed to enable high-fidelity computational simulation of the dynamics of multiple spacecraft distributed over a region of outer space and acting with a common purpose. All of the modeling capabilities afforded by this software are available independently in other, separate software systems, but have not previously been brought together in a single system. A user can choose among several dynamical models, many high-fidelity environment models, and several numerical-integration schemes. The user can select whether to use models that assume weak coupling between spacecraft, or strong coupling in the case of feedback control or tethering of spacecraft to each other. For weak coupling, spacecraft orbits are propagated independently, and are synchronized in time by controlling the step size of the integration. For strong coupling, the orbits are integrated simultaneously. Among the integration schemes that the user can choose are Runge-Kutta Verner, Prince-Dormand, Adams-Bashforth-Moulton, and Bulirsch-Stoer. Comparisons of performance are included for both the weak- and strong-coupling dynamical models for all of the numerical integrators.
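
    A minimal sketch of the "strong coupling" style of propagation, in which the states of two spacecraft are integrated simultaneously in a single call, is shown below. It assumes simple point-mass Earth gravity, no coupling forces, and illustrative initial conditions; it is not the library described above.

        import numpy as np
        from scipy.integrate import solve_ivp

        MU = 3.986004418e14                    # Earth gravitational parameter, m^3/s^2

        def two_sat_rhs(t, y):
            """y = [r1 (3), v1 (3), r2 (3), v2 (3)]; both orbits advanced together."""
            dy = np.empty(12)
            for k in (0, 6):
                r, v = y[k:k + 3], y[k + 3:k + 6]
                dy[k:k + 3] = v
                dy[k + 3:k + 6] = -MU * r / np.linalg.norm(r) ** 3
            return dy

        r0 = 7.0e6
        v0 = np.sqrt(MU / r0)                  # circular orbital speed at r0
        y0 = np.array([r0, 0, 0, 0, v0, 0,            # spacecraft 1
                       r0 + 1.0e3, 0, 0, 0, v0, 0])   # spacecraft 2, offset by 1 km
        sol = solve_ivp(two_sat_rhs, (0.0, 6000.0), y0, method="DOP853",
                        rtol=1e-9, atol=1e-6)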

  17. Analyzes Data from Semiconductor Wafers

    2002-07-23

    This program analyzes reflectance data from semiconductor wafers taken during the deposition or evolution of a thin film, typically via chemical vapor deposition (CVD) or molecular beam epitaxy (MBE). It is used to determine the growth rate and optical constants of the deposited thin films using a virtual interface concept. Determination of growth rates and optical constants of multiple-layer structures is possible by selecting appropriate sections in the reflectance vs. time waveform. No prior information or estimates of growth rates and materials properties are required if an absolute reflectance waveform is used. If the optical constants of a thin film are known, then the growth rate may be extracted from a relative reflectance data set. The analysis is valid for either s- or p-polarized light at any incidence angle and wavelength. The analysis package is contained within an easy-to-use graphical user interface. The program is based on the algorithm described in the following two publications: W.G. Breiland and K.P. Killen, J. Appl. Phys. 78 (1995) 6726, and W. G. Breiland, H.Q. Hou, B.E. Hammons, and J.F. Klem, Proc. XXVIII SOTAPOCS Symp. Electrochem. Soc. San Diego, May 3-8, 1998. It relies on the fact that any multiple-layer system has a reflectance spectrum that is mathematically equivalent to a single-layer thin film on a virtual substrate. The program fits the thin film reflectance with five adjustable parameters: 1) growth rate, 2) real part of the complex refractive index, 3) imaginary part of the refractive index, 4) amplitude of the virtual interface reflectance, 5) phase of the virtual interface reflectance.
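
    A rough sketch of the virtual-interface idea as described above: the reflectance of the growing film is modeled with the usual single-film (Airy) formula, but the substrate is replaced by a complex "virtual interface" reflectance, which yields the five fit parameters named in the abstract. The wavelength, noise level, and all parameter values here are hypothetical, the exact functional form used by the actual program may differ, and the fit quality depends on the starting guess.

        import numpy as np
        from scipy.optimize import curve_fit

        LAM = 633e-9           # probe wavelength in meters (hypothetical)
        N_AMB = 1.0            # ambient refractive index

        def reflectance(t, growth, n, k, r_amp, r_phase):
            """Normal-incidence reflectance of a film growing at rate `growth` (m/s)
            on a 'virtual substrate' described only by a complex reflectance."""
            n_film = n + 1j * k
            r1 = (N_AMB - n_film) / (N_AMB + n_film)    # ambient/film Fresnel coefficient
            rv = r_amp * np.exp(1j * r_phase)           # virtual-interface reflectance
            beta = 2.0 * np.pi * n_film * growth * t / LAM
            r = (r1 + rv * np.exp(2j * beta)) / (1.0 + r1 * rv * np.exp(2j * beta))
            return np.abs(r) ** 2

        t = np.linspace(0.0, 3600.0, 600)               # seconds
        true = (0.3e-9, 3.5, 0.05, 0.4, 1.0)            # hypothetical parameter values
        rng = np.random.default_rng(1)
        data = reflectance(t, *true) + rng.normal(0.0, 2e-4, t.size)
        popt, _ = curve_fit(reflectance, t, data, p0=(0.25e-9, 3.4, 0.03, 0.35, 0.8))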

  18. HPT as a Manager's Tool for Analyzing Individual Employee Performance

    ERIC Educational Resources Information Center

    Kyle-Needs, Denise A.; Lindbeck, Robin

    2011-01-01

    Typically the human performance technology (HPT) process is regarded as a tool for use when analyzing performance gaps in functional or larger organizational units. This case study demonstrates the application of the HPT process in a one-to-one relationship between a manager and a direct report. Specifically, the process is used to analyze the…

  19. Buccal microbiology analyzed by infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans-JP2, and Aggregatibacter actinomycetemcomitans clinically isolated from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of the strains. This study demonstrated that FTIR can be used to decrease the identification time of fastidious buccal microorganisms associated with the etiology of periodontitis, compared to traditional methods.

  20. Orthopedic surgical analyzer for percutaneous vertebroplasty

    NASA Astrophysics Data System (ADS)

    Tack, Gye Rae; Choi, Hyung Guen; Lim, Do H.; Lee, Sung J.

    2001-05-01

    Since the spine is one of the most complex joint structures in the human body, its surgical treatment requires careful planning and a high degree of precision to avoid any unwanted neurological compromise. In addition, comprehensive biomechanical analysis can be very helpful because the spine is subject to a variety of loads. In the case of the osteoporotic spine, in which structural integrity has been compromised, the surgeon faces a double challenge, both clinical and biomechanical. Thus, we have been developing an integrated medical image system that is capable of doing both. This system is called the orthopedic surgical analyzer, and it combines the clinical results from image-guided examination with the biomechanical data from finite element analysis. In order to demonstrate its feasibility, this system was applied to percutaneous vertebroplasty. Percutaneous vertebroplasty is a surgical procedure that has recently been introduced for the treatment of compression fractures of osteoporotic vertebrae. It involves puncturing the vertebrae and filling them with polymethylmethacrylate (PMMA). Recent studies have shown that the procedure can provide structural reinforcement for osteoporotic vertebrae while being minimally invasive and safe, with immediate pain relief. However, treatment failures due to excessive PMMA injection volume have been reported as one of the complications. It is believed that control of the PMMA volume is one of the most critical factors that can reduce the incidence of complications. Since the degree of osteoporosis can influence the porosity of the cancellous bone in the vertebral body, the injection volume can differ from patient to patient. In this study, the optimal volume of PMMA injection for vertebroplasty was predicted based on image analysis of a given patient. In addition, biomechanical effects due to changes in PMMA volume and bone mineral density (BMD) level were investigated by constructing clinically

  1. Analyzing delay causes in Egyptian construction projects.

    PubMed

    Marzouk, Mohamed M; El-Rasas, Tarek I

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews, and a questionnaire survey was subsequently prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. The Frequency Index, Severity Index, and Importance Index are calculated, and from their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared to the most important delay causes identified in the research. Statistical analysis of the delay causes obtained from the survey is carried out using the analysis of variance (ANOVA) method. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes; finally, a roadmap for prioritizing delay-cause groups is presented.
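
    In much of the delay-analysis survey literature these indices follow a common convention along the lines shown below (a general convention, not necessarily the exact formulas used in this paper):

        \mathrm{FI}\,(\%) = \frac{\sum_i a_i\,n_i}{A\,N}\times 100,\qquad
        \mathrm{SI}\,(\%) = \frac{\sum_i a_i\,n_i}{A\,N}\times 100,\qquad
        \mathrm{II}\,(\%) = \frac{\mathrm{FI}\times\mathrm{SI}}{100},

    where a_i is the weight of response category i, n_i the number of respondents choosing it, A the highest weight, and N the total number of respondents; the frequency and severity indices are computed from their respective sets of questionnaire responses.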

  2. Stochastic Particle Real Time Analyzer (SPARTA) Validation and Verification Suite

    SciTech Connect

    Gallis, Michael A.; Koehler, Timothy P.; Plimpton, Steven J.

    2014-10-01

    This report presents the test cases used to verify, validate and demonstrate the features and capabilities of the first release of the 3D Direct Simulation Monte Carlo (DSMC) code SPARTA (Stochastic Real Time Particle Analyzer). The test cases included in this report exercise the most critical capabilities of the code like the accurate representation of physical phenomena (molecular advection and collisions, energy conservation, etc.) and implementation of numerical methods (grid adaptation, load balancing, etc.). Several test cases of simple flow examples are shown to demonstrate that the code can reproduce phenomena predicted by analytical solutions and theory. A number of additional test cases are presented to illustrate the ability of SPARTA to model flow around complicated shapes. In these cases, the results are compared to other well-established codes or theoretical predictions. This compilation of test cases is not exhaustive, and it is anticipated that more cases will be added in the future.

  3. Expert system for analyzing eddy current measurements

    DOEpatents

    Levy, Arthur J.; Oppenlander, Jane E.; Brudnoy, David M.; Englund, James M.; Loomis, Kent C.

    1994-01-01

    A method and apparatus (called DODGER) analyzes eddy current data for heat exchanger tubes or any other metallic object. DODGER uses an expert system to analyze eddy current data by reasoning with uncertainty and pattern recognition. The expert system permits DODGER to analyze eddy current data intelligently, and obviate operator uncertainty by analyzing the data in a uniform and consistent manner.

  4. Update on Integrated Optical Design Analyzer

    NASA Technical Reports Server (NTRS)

    Moore, James D., Jr.; Troy, Ed

    2003-01-01

    Updated information on the Integrated Optical Design Analyzer (IODA) computer program has become available. IODA was described in Software for Multidisciplinary Concurrent Optical Design (MFS-31452), NASA Tech Briefs, Vol. 25, No. 10 (October 2001), page 8a. To recapitulate: IODA facilitates multidisciplinary concurrent engineering of highly precise optical instruments. The architecture of IODA was developed by reviewing design processes and software in an effort to automate design procedures. IODA significantly reduces design iteration cycle time and eliminates many potential sources of error. IODA integrates the modeling efforts of a team of experts in different disciplines (e.g., optics, structural analysis, and heat transfer) working at different locations and provides seamless fusion of data among thermal, structural, and optical models used to design an instrument. IODA is compatible with data files generated by the NASTRAN structural-analysis program and the Code V (Registered Trademark) optical-analysis program, and can be used to couple analyses performed by these two programs. IODA supports multiple-load-case analysis for quickly accomplishing trade studies. IODA can also model the transient response of an instrument under the influence of dynamic loads and disturbances.

  5. Analyzing modified unimodular gravity via Lagrange multipliers

    NASA Astrophysics Data System (ADS)

    Sáez-Gómez, Diego

    2016-06-01

    The so-called unimodular version of general relativity is revisited. Unimodular gravity is constructed by fixing the determinant of the metric, which leads to the trace-free part of the equations instead of the usual Einstein field equations. Then a cosmological constant naturally arises as an integration constant. While unimodular gravity turns out to be equivalent to general relativity (GR) at the classical level, it provides important differences at the quantum level. Here we extend the unimodular constraint to some extensions of general relativity that have drawn a lot of attention over the last years: f(R) gravity (or its scalar-tensor picture) and Gauss-Bonnet gravity. The corresponding unimodular version of such theories is constructed, as well as the conformal transformation that relates the Einstein and Jordan frames for these nonminimally coupled theories. From the classical point of view, the unimodular versions of such extensions are completely equivalent to their originals, but an effective cosmological constant arises naturally, which may provide a richer description of the evolution of the Universe. Here we analyze the case of Starobinsky inflation and compare it with the original one.
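
    For reference, the trace-free field equations usually quoted for unimodular gravity, and the way the cosmological constant reappears as an integration constant, are:

        R_{\mu\nu}-\tfrac{1}{4}g_{\mu\nu}R
        = 8\pi G\left(T_{\mu\nu}-\tfrac{1}{4}g_{\mu\nu}T\right).

    Taking the covariant divergence and using the Bianchi identity together with energy-momentum conservation gives \partial_\nu\left(R + 8\pi G\,T\right) = 0, so R + 8\pi G\,T = 4\Lambda for some constant \Lambda; substituting back recovers the Einstein equations with a cosmological constant.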

  6. Analyzing the acoustic beat with mobile devices

    NASA Astrophysics Data System (ADS)

    Kuhn, Jochen; Vogt, Patrik; Hirth, Michael

    2014-04-01

    In this column, we have previously presented various examples of how physical relationships can be examined by analyzing acoustic signals using smartphones or tablet PCs. In this example, we will be exploring the acoustic phenomenon of beats, which is produced by the superposition of two tones with a small frequency difference Δf. The resulting auditory sensation is a tone with a volume that varies periodically. Acoustic beats can be perceived repeatedly in day-to-day life and have some interesting applications. For example, string instruments are still tuned with the help of an acoustic beat, even with modern technology. If a reference tone (e.g., 440 Hz) and, for example, a slightly out-of-tune violin string produce a tone simultaneously, a beat can be perceived. The more similar the frequencies, the longer the beat period. In the extreme case, when the frequencies are identical, a beat no longer arises; the string is therefore correctly tuned. Using the Oscilloscope app, it is possible to capture and save acoustic signals of this kind and determine the beat frequency f_S of the signal, which equals the frequency difference Δf of the two overlapping tones (for Android smartphones, the app OsciPrime Oscilloscope can be used).
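
    The underlying identity, for two equal-amplitude tones of frequencies f_1 and f_2, is:

        \sin(2\pi f_1 t)+\sin(2\pi f_2 t)
        = 2\cos\!\left(2\pi\,\frac{f_1-f_2}{2}\,t\right)\sin\!\left(2\pi\,\frac{f_1+f_2}{2}\,t\right),

    so the ear hears the mean frequency with a slowly varying envelope; since the loudness peaks twice per envelope period, the perceived beat frequency is f_S = |f_1 - f_2| = Δf.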

  7. The geometric factor of a cylindrical plate electrostatic analyzer

    NASA Technical Reports Server (NTRS)

    Johnstone, A. D.

    1971-01-01

    A method for calculating the geometric factor of cylindrical plate electrostatic energy analyzers with various detector geometries is described. The effects of the fringe-field are estimated. For a special simple case an exact geometric factor is calculated enabling an estimate of the inaccuracies of the approximations used in other cases. The results of some calculations are presented and a simple approximate expression for the geometric factor is deduced.

  8. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

    According to Complexity Theory, complexity exists in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory within the framework of Axiomatic Design, but they focus on reducing complexity rather than on methods for analyzing the complexity present in a system. This paper therefore puts forth a method for analyzing complexity that is intended to fill this gap. To discuss the method, which is based on the notion of additional effect, the paper introduces two concepts: ideal effect and additional effect. The method of analyzing complexity based on additional effect combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ), and it helps designers analyze complexity through additional effects. A case study shows the application of the process.

  9. A network of automatic atmospherics analyzer

    NASA Technical Reports Server (NTRS)

    Schaefer, J.; Volland, H.; Ingmann, P.; Eriksson, A. J.; Heydt, G.

    1980-01-01

    The design and function of an atmospheric analyzer which uses a computer are discussed. Mathematical models which show the method of measurement are presented. The data analysis and recording procedures of the analyzer are discussed.

  10. Development of an Infrared Fluorescent Gas Analyzer.

    ERIC Educational Resources Information Center

    McClatchie, E. A.

    A prototype low-level carbon monoxide analyzer was developed using a fluorescent cell and negative chopping techniques to achieve a device superior to state-of-the-art NDIR (nondispersive infrared) analyzers in stability and in cross-sensitivity to other gaseous species. It is clear that this type of analyzer has that capacity. The prototype…

  11. 46 CFR 154.1360 - Oxygen analyzer.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 5 2014-10-01 2014-10-01 false Oxygen analyzer. 154.1360 Section 154.1360 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS... Instrumentation § 154.1360 Oxygen analyzer. The vessel must have a portable analyzer that measures oxygen...

  12. 46 CFR 154.1360 - Oxygen analyzer.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Oxygen analyzer. 154.1360 Section 154.1360 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS... Instrumentation § 154.1360 Oxygen analyzer. The vessel must have a portable analyzer that measures oxygen...

  13. 46 CFR 154.1360 - Oxygen analyzer.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 5 2011-10-01 2011-10-01 false Oxygen analyzer. 154.1360 Section 154.1360 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS... Instrumentation § 154.1360 Oxygen analyzer. The vessel must have a portable analyzer that measures oxygen...

  14. 46 CFR 154.1360 - Oxygen analyzer.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 5 2013-10-01 2013-10-01 false Oxygen analyzer. 154.1360 Section 154.1360 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS... Instrumentation § 154.1360 Oxygen analyzer. The vessel must have a portable analyzer that measures oxygen...

  15. 46 CFR 154.1360 - Oxygen analyzer.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 5 2012-10-01 2012-10-01 false Oxygen analyzer. 154.1360 Section 154.1360 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS... Instrumentation § 154.1360 Oxygen analyzer. The vessel must have a portable analyzer that measures oxygen...

  16. Module optical analyzer: Identification of defects on the production line

    NASA Astrophysics Data System (ADS)

    Herrero, Rebeca; Askins, Stephen; Antón, Ignacio; Sala, Gabriel; Araki, Kenji; Nagai, Hirokazu

    2014-09-01

    The usefulness of the module optical analyzer (MOA) for identifying module defects on the production line is presented in this paper. Two case studies performed with two different kinds of CPV modules are presented to show the use of the MOA at both the IES-UPM and Daido Steel facilities.

  17. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
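
    A toy sketch of likelihood-ratio fusion of independent similarity scores is shown below, using Gaussian score models with made-up parameters; the paper itself fits a far richer joint distribution to the Indian program's performance data.

        import numpy as np

        # hypothetical per-modality score models: (genuine mean, std, impostor mean, std)
        MODELS = {"finger": (70.0, 10.0, 30.0, 10.0),
                  "iris":   (80.0,  8.0, 25.0, 12.0)}

        def log_gauss(x, mu, sigma):
            return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

        def log_likelihood_ratio(scores):
            """scores: dict mapping modality -> observed similarity score(s)."""
            llr = 0.0
            for mod, s in scores.items():
                mg, sg, mi, si = MODELS[mod]
                s = np.asarray(s, dtype=float)
                llr += np.sum(log_gauss(s, mg, sg) - log_gauss(s, mi, si))
            return llr

        def decide(scores, threshold=0.0):
            """Accept as genuine if the fused log-likelihood ratio exceeds a threshold
            tuned to meet the target false accept rate."""
            return log_likelihood_ratio(scores) > threshold

        print(decide({"finger": [65, 72, 58], "iris": [77]}))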

  18. Analyzing Literacy Practice: Grounded Theory to Model

    ERIC Educational Resources Information Center

    Purcell-Gates, Victoria; Perry, Kristen H.; Briseno, Adriana

    2011-01-01

    In this methodological and theoretical article, we address the need for more cross-case work on studies of literacy in use within different social and cultural contexts. The Cultural Practices of Literacy Study (CPLS) project has been working on a methodology for cross-case analyses that are principled in that the qualitative nature of each case,…

  19. Web-based multi-channel analyzer

    DOEpatents

    Gritzo, Russ E.

    2003-12-23

    The present invention provides an improved multi-channel analyzer designed to conveniently gather, process, and distribute spectrographic pulse data. The multi-channel analyzer may operate on a computer system having memory, a processor, and the capability to connect to a network and to receive digitized spectrographic pulses. The multi-channel analyzer may have a software module integrated with a general-purpose operating system that may receive digitized spectrographic pulses at rates of at least 10,000 pulses per second. The multi-channel analyzer may further have a user-level software module that may receive user-specified controls dictating the operation of the multi-channel analyzer, making the multi-channel analyzer customizable by the end-user. The user-level software may further categorize and conveniently distribute spectrographic pulse data employing non-proprietary, standard communication protocols and formats.
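
    The core bookkeeping of a multi-channel analyzer is essentially histogramming digitized pulse heights into channels; a minimal sketch follows, with an arbitrary channel count and full-scale value (not values from the patent):

        import numpy as np

        N_CHANNELS = 4096
        FULL_SCALE = 10.0      # volts corresponding to the top channel

        def accumulate(spectrum, pulse_heights):
            """Add a batch of digitized pulse heights (in volts) to the spectrum."""
            ch = np.clip((np.asarray(pulse_heights) / FULL_SCALE * N_CHANNELS).astype(int),
                         0, N_CHANNELS - 1)
            spectrum += np.bincount(ch, minlength=N_CHANNELS)
            return spectrum

        spectrum = np.zeros(N_CHANNELS, dtype=np.int64)
        rng = np.random.default_rng(0)
        spectrum = accumulate(spectrum, rng.normal(5.0, 0.1, 10_000))   # a fake photopeak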

  20. Using expert systems to analyze ATE data

    NASA Technical Reports Server (NTRS)

    Harrington, Jim

    1994-01-01

    The proliferation of automatic test equipment (ATE) is resulting in the generation of large amounts of component data. Some of this component data is not accurate due to the presence of noise. Analyzing this data requires the use of new techniques. This paper describes the process of developing an expert system to analyze ATE data and provides an example rule in the CLIPS language for analyzing trip thresholds for high gain/high speed comparators.

  1. Altitude characteristics of selected air quality analyzers

    NASA Technical Reports Server (NTRS)

    White, J. H.; Strong, R.; Tommerdahl, J. B.

    1979-01-01

    The effects of altitude (pressure) on the operation and sensitivity of various air quality analyzers frequently flown on aircraft were analyzed. Two ozone analyzers were studied at altitudes from 600 to 7500 m and a nitrogen oxides chemiluminescence detector and a sulfur dioxide flame photometric detector were studied at altitudes from 600 to 3000 m. Calibration curves for altitude corrections to the sensitivity of the instruments are presented along with discussion of observed instrument behavior.

  2. Modified chemiluminescent NO analyzer accurately measures NOX

    NASA Technical Reports Server (NTRS)

    Summers, R. L.

    1978-01-01

    Installation of molybdenum nitric oxide (NO)-to-higher oxides of nitrogen (NOx) converter in chemiluminescent gas analyzer and use of air purge allow accurate measurements of NOx in exhaust gases containing as much as thirty percent carbon monoxide (CO). Measurements using conventional analyzer are highly inaccurate for NOx if as little as five percent CO is present. In modified analyzer, molybdenum has high tolerance to CO, and air purge substantially quenches NOx destruction. In test, modified chemiluminescent analyzer accurately measured NO and NOx concentrations for over 4 months with no degradation in performance.

  3. A wideband, high-resolution spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Quirk, M. P.; Wilck, H. C.; Garyantes, M. F.; Grimm, M. J.

    1988-01-01

    A two-million-channel, 40 MHz bandwidth, digital spectrum analyzer under development at the Jet Propulsion Laboratory is described. The analyzer system will serve as a prototype processor for the sky survey portion of NASA's Search for Extraterrestrial Intelligence program and for other applications in the Deep Space Network. The analyzer digitizes an analog input, performs a 2^21-point Discrete Fourier Transform, accumulates the output power, normalizes the output to remove frequency-dependent gain, and automates simple signal detection algorithms. Due to its built-in frequency-domain processing functions and configuration flexibility, the analyzer is a very powerful tool for real-time signal analysis.
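
    The processing chain described (transform, power accumulation, gain normalization) can be sketched in a few lines of Python; the FFT length here is reduced from the instrument's 2^21 points to keep the example small, and the flat calibration curve is a placeholder:

        import numpy as np

        N_FFT = 2**14                        # the instrument uses 2**21 points
        flat_gain = np.ones(N_FFT // 2 + 1)  # frequency-dependent gain calibration (placeholder)

        def accumulate_power(chunks):
            """Average the windowed power spectrum over successive input chunks."""
            acc = np.zeros(N_FFT // 2 + 1)
            for chunk in chunks:
                spec = np.fft.rfft(chunk * np.hanning(N_FFT))
                acc += np.abs(spec) ** 2
            return acc / (len(chunks) * flat_gain)   # normalize out accumulations and gain

        rng = np.random.default_rng(0)
        chunks = [rng.normal(size=N_FFT) for _ in range(8)]
        power = accumulate_power(chunks)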

  4. Method for analyzing multilayer nonlinear optical waveguide.

    PubMed

    Wu, Yaw-Dong; Chen, Mao-Hsiung

    2005-10-01

    We propose a novel method for analyzing a multilayer optical waveguide structure with all nonlinear guiding films. This method can also be used to analyze a multibranch optical waveguide structure with all nonlinear guiding branches. The results show that agreement between theory and numerics is excellent.

  5. 40 CFR 92.109 - Analyzer specifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... shall be at least 60 percent of full-scale chart deflection. For NOX analyzers using a water trap, the response time increase due to the water trap and associated plumbing need not be included in the analyzer... use of linearizing circuits is permitted. (3) The minimum water rejection ratio (maximum...

  6. 40 CFR 91.313 - Analyzers required.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... (i) For Raw Gas Sampling, the hydrocarbon analyzer must be of the heated flame ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature of the...

  7. 40 CFR 90.313 - Analyzers required.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (HC) analysis. (i) For Raw Gas Sampling, the hydrocarbon analyzer shall be of the heated flame ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the...

  8. 40 CFR 91.313 - Analyzers required.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... (i) For Raw Gas Sampling, the hydrocarbon analyzer must be of the heated flame ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature of the...

  9. 40 CFR 90.313 - Analyzers required.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (HC) analysis. (i) For Raw Gas Sampling, the hydrocarbon analyzer shall be of the heated flame ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the...

  10. 40 CFR 91.313 - Analyzers required.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (i) For Raw Gas Sampling, the hydrocarbon analyzer must be of the heated flame ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature of the...

  11. 40 CFR 90.313 - Analyzers required.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (HC) analysis. (i) For Raw Gas Sampling, the hydrocarbon analyzer shall be of the heated flame ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the...

  12. 40 CFR 91.313 - Analyzers required.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... (i) For Raw Gas Sampling, the hydrocarbon analyzer must be of the heated flame ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature of the...

  13. 40 CFR 90.313 - Analyzers required.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (HC) analysis. (i) For Raw Gas Sampling, the hydrocarbon analyzer shall be of the heated flame ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the...

  14. 40 CFR 86.1422 - Analyzer calibration.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Analyzer calibration. 86.1422 Section... Trucks; Certification Short Test Procedures § 86.1422 Analyzer calibration. (a) Determine that the... receive calibration in accordance with § 85.2233 of this chapter and with good engineering practice....

  15. 40 CFR 86.1422 - Analyzer calibration.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Analyzer calibration. 86.1422 Section... Trucks; Certification Short Test Procedures § 86.1422 Analyzer calibration. (a) Determine that the... receive calibration in accordance with § 85.2233 of this chapter and with good engineering practice....

  16. 40 CFR 86.1422 - Analyzer calibration.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 20 2013-07-01 2013-07-01 false Analyzer calibration. 86.1422 Section... Trucks; Certification Short Test Procedures § 86.1422 Analyzer calibration. (a) Determine that the... receive calibration in accordance with § 85.2233 of this chapter and with good engineering practice....

  17. 40 CFR 86.1422 - Analyzer calibration.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Analyzer calibration. 86.1422 Section... Trucks; Certification Short Test Procedures § 86.1422 Analyzer calibration. (a) Determine that the... receive calibration in accordance with § 85.2233 of this chapter and with good engineering practice....

  18. Harmonic analysis utilizing a Phonodeik and an Henrici analyzer

    NASA Astrophysics Data System (ADS)

    Fickinger, William J.; Hanson, Roger J.; Hoekje, Peter L.

    2001-05-01

    Dayton C. Miller of the Case School of Applied Science assembled a series of instruments for accurate analysis of sound [D. C. Miller, J. Franklin Inst. 182, 285-322 (1916)]. He created the Phonodeik to display and record sound waveforms of musical instruments, voices, fog horns, and so on. Waveforms were analyzed with the Henrici harmonic analyzer, built in Switzerland by G. Coradi. In this device, the motion of a stylus along the curve to be analyzed causes a series of spheres to rotate; two moveable rollers in contact with the nth sphere record the contributions of the sine(nx) and cosine(nx) components of the wave. Corrections for the measured spectra are calculated from analysis of the response of the Phonodeik. Finally, the original waveform could be reconstructed from the corrected spectral amplitudes and phases by a waveform synthesizer, also built at Case. Videos will be presented that show the motion of the gears, spheres, and dials of a working Henrici analyzer, housed at the Department of Speech Pathology and Audiology at the University of Iowa. Operation of the Henrici analyzer and the waveform synthesizer will be explained.
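
    The quantities the Henrici analyzer evaluates mechanically are the ordinary Fourier coefficients of the recorded waveform f(x) over one period:

        a_n = \frac{1}{\pi}\int_0^{2\pi} f(x)\cos(nx)\,dx,\qquad
        b_n = \frac{1}{\pi}\int_0^{2\pi} f(x)\sin(nx)\,dx,\qquad
        f(x) \simeq \frac{a_0}{2}+\sum_{n=1}^{N}\bigl(a_n\cos nx + b_n\sin nx\bigr),

    with the amplitude and phase of the nth partial following directly from (a_n, b_n).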

  19. Properties of grain boundary networks in the NEEM ice core analyzed by combined transmission and reflection optical microscopy

    NASA Astrophysics Data System (ADS)

    Binder, Tobias; Weikusat, Ilka; Garbe, Christoph; Svensson, Anders; Kipfstuhl, Sepp

    2014-05-01

    Microstructure analysis of ice cores is vital to understand the processes controlling the flow of ice on the microscale. To quantify the microstructural variability (and thus the processes occurring) on the centimeter, meter, and kilometer scale along deep polar ice cores, a large number of sections has to be analyzed. In the last decade, two different methods have been applied: on the one hand, transmission optical microscopy of thin sections between crossed polarizers yields information on the distribution of crystal c-axes; on the other hand, reflection optical microscopy of polished and controlled-sublimated section surfaces allows characterization of the high-resolution properties of a single grain boundary, e.g. its length, shape or curvature (further developed by [1]). Along the entire NEEM ice core (North-West Greenland, 2537 m length), drilled in 2008-2011, we applied both methods to the same set of vertical sections. The data set comprises series of six consecutive 6 x 9 cm2 sections taken in steps of 20 m, in total about 800 images. A dedicated method for automatic processing and matching of both image types has recently been developed [2]. The high-resolution properties of the grain boundary network are analyzed. Furthermore, the automatic assignment of c-axis misorientations to visible sublimation grooves enables us to quantify the degree of similarity between the microstructure revealed by the two analysis techniques. The reliability of extracting grain boundaries from both image types, as well as the appearance of sublimation-groove patterns exhibiting low misorientations, is investigated. X-ray Laue diffraction measurements (yielding full crystallographic orientation) have validated the sensitivity of the surface sublimation method for sub-grain boundaries [3]. We introduce an approach for automatic extraction of sub-grain structures from sublimation grooves. A systematic analysis of sub-grain boundary densities indicates a possible influence of high impurity contents (amongst

  20. Systems Analyze Water Quality in Real Time

    NASA Technical Reports Server (NTRS)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  1. System for analyzing coal liquefaction products

    DOEpatents

    Dinsmore, Stanley R.; Mrochek, John E.

    1984-01-01

    A system for analyzing constituents of coal-derived materials comprises three adsorption columns and a flow-control arrangement which permits separation of both aromatic and polar hydrocarbons by use of two eluent streams.

  2. Analyzing IT Service Delivery in an ISP from Nicaragua

    NASA Astrophysics Data System (ADS)

    Flores, Johnny; Rusu, Lazar; Johanneson, Paul

    This paper presents a method for analyzing IT service delivery and its application in an Internet Service Provider (ISP). The proposed method is based on ITIL processes and the case study technique; it uses questionnaires, semi-structured interviews, focus groups, and documents as sources of factual information. Applying the method allows the ISP to determine its practices and the limitations of its IT service delivery.

  3. A traffic analyzer for multiple SpaceWire links

    NASA Astrophysics Data System (ADS)

    Liu, Scige J.; Giusi, Giovanni; Di Giorgio, Anna M.; Vertolli, Nello; Galli, Emanuele; Biondi, David; Farina, Maria; Pezzuto, Stefano; Spinoglio, Luigi

    2014-07-01

    Modern space missions are becoming increasingly complex: the interconnection of the units in a satellite is now a network of terminals linked together through routers, where devices with different levels of automation and intelligence share the same data network. The traceability of network transactions is performed mostly at the terminal level through log analysis, and hence it is difficult to verify in real time the reliability of the interconnections and the interchange protocols. To improve and ease traffic analysis in a SpaceWire network we implemented a low-level link analyzer, with the specific goal of simplifying the integration and test phases in the development of space instrumentation. The traffic analyzer collects signals coming from pod probes connected in series on the links of interest between two SpaceWire terminals. With respect to standard traffic analyzers, the design of this new tool includes the possibility of internally reshaping the LVDS signal. This improvement increases the robustness of the analyzer against environmental noise and guarantees a deterministic delay on all analyzed signals. The analyzer core is implemented on a Xilinx FPGA, programmed to decode the bidirectional LVDS signals at the link and network level. Subsequently, the core packetizes protocol characters into homogeneous sets of time-ordered events. The analyzer provides time-tagging functionality for each character set, with a precision down to the FPGA clock, i.e. about 20 ns in the adopted HW environment. The use of a common time reference for each character stream allows synchronous performance measurements. The collected information is then routed to an external computer for quick analysis via a high-speed USB 2.0 connection. With this analyzer it is possible to verify the link performance in terms of induced delays in the transmitted signals. A case study focused on the analysis of the Time-Code synchronization in the presence of a SpaceWire Router is

  4. Ka-me: a Voronoi image analyzer

    PubMed Central

    Khiripet, Noppadon; Khantuwan, Wongarnet; Jungck, John R.

    2012-01-01

    Summary: Ka-me is a Voronoi image analyzer that allows users to analyze any image with a convex polygonal tessellation or any spatial point distribution by fitting Voronoi polygons and their dual, Delaunay triangulations, to the pattern. The analytical tools include a variety of graph-theoretic and geometric tools that summarize the distribution of the numbers of edges per face, areas, perimeters, angles of Delaunay triangle edges (anglograms), Gabriel graphs, nearest neighbor graphs, minimal spanning trees, Ulam trees, Pitteway tests, circumcircles and convex hulls, as well as spatial statistics (Clark–Evans Nearest Neighborhood and Variance to Mean Ratio) and export functions for standard relationships (Lewis's Law, Desch's Law and Aboav–Weaire Law). Availability: Ka-me: a Voronoi image analyzer is available as an executable with documentation and sample applications from the BioQUEST Library (http://bioquest.org/downloads/kame_1.0.rar). Contact: noppadon.khiripet@nectec.or.th PMID:22556369
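
    A few of the summaries Ka-me reports can be reproduced for any point pattern with standard scientific-Python tools; the sketch below uses random points on the unit square and, for the Clark-Evans statistic, takes the bounding square as a simple stand-in for the study region (it is an illustration, not Ka-me's own code):

        import numpy as np
        from scipy.spatial import Voronoi, Delaunay, cKDTree

        rng = np.random.default_rng(0)
        pts = rng.random((200, 2))                  # points in the unit square

        vor, tri = Voronoi(pts), Delaunay(pts)
        n_triangles = tri.simplices.shape[0]        # size of the Delaunay triangulation

        # edges per finite Voronoi polygon (the "number of neighbors" distribution)
        edges_per_face = [len(vor.regions[i]) for i in vor.point_region
                          if -1 not in vor.regions[i] and len(vor.regions[i]) > 0]

        # Clark-Evans nearest-neighbor statistic R = observed / expected mean NN distance
        d, _ = cKDTree(pts).query(pts, k=2)
        r_obs = d[:, 1].mean()
        density = len(pts) / 1.0                    # unit-square area
        r_exp = 0.5 / np.sqrt(density)
        clark_evans_R = r_obs / r_exp               # ~1 random, <1 clustered, >1 regular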

  5. In-situ continuous water analyzing module

    DOEpatents

    Thompson, Cyril V.; Wise, Marcus B.

    1998-01-01

    An in-situ continuous liquid analyzing system for continuously analyzing volatile components contained in a water source comprises: a carrier gas supply, an extraction container and a mass spectrometer. The carrier gas supply continuously supplies the carrier gas to the extraction container and is mixed with a water sample that is continuously drawn into the extraction container. The carrier gas continuously extracts the volatile components out of the water sample. The water sample is returned to the water source after the volatile components are extracted from it. The extracted volatile components and the carrier gas are delivered continuously to the mass spectrometer and the volatile components are continuously analyzed by the mass spectrometer.

  6. The Deep Space Network stability analyzer

    NASA Technical Reports Server (NTRS)

    Breidenthal, Julian C.; Greenhall, Charles A.; Hamell, Robert L.; Kuhnle, Paul F.

    1995-01-01

    A stability analyzer for testing NASA Deep Space Network installations during flight radio science experiments is described. The stability analyzer provides realtime measurements of signal properties of general experimental interest: power, phase, and amplitude spectra; Allan deviation; and time series of amplitude, phase shift, and differential phase shift. Input ports are provided for up to four 100 MHz frequency standards and eight baseband analog (greater than 100 kHz bandwidth) signals. Test results indicate the following upper bounds to noise floors when operating on 100 MHz signals: -145 dBc/Hz for phase noise spectrum further than 200 Hz from carrier, 2.5 x 10(exp -15) (tau =1 second) and 1.5 x 10(exp -17) (tau =1000 seconds) for Allan deviation, and 1 x 10(exp -4) degrees for 1-second averages of phase deviation. Four copies of the stability analyzer have been produced, plus one transportable unit for use at non-NASA observatories.
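
    The Allan deviation reported by such an analyzer can be estimated offline from fractional-frequency samples with the standard non-overlapping estimator; a short sketch follows (the synthetic white-frequency noise is a stand-in for real data):

        import numpy as np

        def allan_deviation(y, m):
            """Non-overlapping Allan deviation at averaging factor m for
            fractional-frequency data y sampled at a fixed interval tau0."""
            n = (len(y) // m) * m
            ybar = y[:n].reshape(-1, m).mean(axis=1)     # averages over tau = m * tau0
            return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

        rng = np.random.default_rng(0)
        y = 1e-13 * rng.normal(size=100_000)             # synthetic fractional frequency
        for m in (1, 10, 100, 1000):
            print(m, allan_deviation(y, m))              # falls as 1/sqrt(tau) for white FM noise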

  7. OASIS: Organics Analyzer for Sampling Icy Surfaces

    NASA Technical Reports Server (NTRS)

    Getty, S. A.; Dworkin, J. P.; Glavin, D. P.; Martin, M.; Zheng, Y.; Balvin, M.; Southard, A. E.; Ferrance, J.; Malespin, C.

    2012-01-01

    Liquid chromatography mass spectrometry (LC-MS) is a well-established laboratory technique for detecting and analyzing organic molecules. This approach has been especially fruitful in the analysis of nucleobases and amino acids, and in establishing chiral ratios [1-3]. We are developing OASIS, Organics Analyzer for Sampling Icy Surfaces, for future in situ landed missions to astrochemically important icy bodies, such as asteroids, comets, and icy moons. The OASIS design employs a microfabricated, on-chip analytical column to chromatographically separate liquid analytes using known LC stationary phase chemistries. The elution products are then interfaced through electrospray ionization (ESI) and analyzed by a time-of-flight mass spectrometer (TOF-MS). A particular advantage of this design is its suitability for microgravity environments, such as on a primitive small body.

  8. The Cosmic Dust Analyzer for Cassini

    NASA Technical Reports Server (NTRS)

    Bradley, James G.; Gruen, Eberhard; Srama, Ralf

    1996-01-01

    The Cosmic Dust Analyzer (CDA) is designed to characterize the dust environment in interplanetary space and in the Jovian and Saturnian systems. The instrument consists of two major components, the Dust Analyzer (DA) and the High Rate Detector (HRD). The DA has a large aperture to provide a large cross section for detection in low-flux environments. The DA has the capability of determining dust particle mass, velocity, flight direction, charge, and chemical composition. The chemical composition is determined by the Chemical Analyzer system based on a time-of-flight mass spectrometer. The DA is capable of making full measurements up to one impact/second. The HRD contains two smaller PVDF detectors and electronics designed to characterize dust particle masses at impact rates up to 10^4 impacts/second. These high impact rates are expected during Saturn ring plane crossings.

  9. PARALYZER FOR PULSE HEIGHT DISTRIBUTION ANALYZER

    DOEpatents

    Fairstein, E.

    1960-01-19

    A paralyzer circuit is described for use with a pulse-height distribution analyzer to prevent the analyzer from counting overlapping pulses where they would serve to provide a false indication. The paralyzer circuit comprises a pair of cathode-coupled amplifiers for amplifying pulses of opposite polarity. Diodes are provided having their anodes coupled to the separate outputs of the amplifiers to produce only positive signals, and a trigger circuit is coupled to the diodes for operation by input pulses of either polarity from the amplifiers. A delay network couples the output of the trigger circuit for delaying the pulses.

  10. Frequency spectrum analyzer with phase-lock

    DOEpatents

    Boland, Thomas J.

    1984-01-01

    A frequency-spectrum analyzer with phase-lock for analyzing the frequency and amplitude of an input signal is comprised of a voltage controlled oscillator (VCO) which is driven by a ramp generator, and a phase error detector circuit. The phase error detector circuit measures the difference in phase between the VCO and the input signal, and drives the VCO locking it in phase momentarily with the input signal. The input signal and the output of the VCO are fed into a correlator which transfers the input signal to a frequency domain, while providing an accurate absolute amplitude measurement of each frequency component of the input signal.

  11. The integrated optic RF spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Pedinoff, M. E.; Ranganath, T. R.; Joseph, T. R.; Lee, J. Y.

    1981-01-01

    The results of measurements made on a fully integrated optic RF spectrum analyzer (IOSA) are reported. The performance of the device is presented: acousto-optic bandwidth, single-tone RF resolution, two-tone RF resolution, single-tone dynamic range, two-tone dynamic range, and single-tone RF response. The device parameters that control device performance are analyzed. These results demonstrate the viability of the IOSA for real-time spectrum analysis of pulsed and CW RF signals. Improvements in RF bandwidth resolution can be obtained by the use of larger collimated optical beams, which require larger optical lens elements and hence larger crystals.

  12. Empirical mode decomposition for analyzing acoustical signals

    NASA Technical Reports Server (NTRS)

    Huang, Norden E. (Inventor)

    2005-01-01

    The present invention discloses a computer-implemented signal analysis method through the Hilbert-Huang Transformation (HHT) for analyzing acoustical signals, which are assumed to be nonlinear and nonstationary. Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA) are used to obtain the HHT. Essentially, the acoustical signal is decomposed into Intrinsic Mode Function components (IMFs). Once the invention decomposes the acoustic signal into its constituent components, all operations such as analyzing, identifying, and removing unwanted signals can be performed on these components. Upon transforming the IMFs into the Hilbert spectrum, the acoustical signal may be compared with other acoustical signals.
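
    The Hilbert spectral analysis step applied to a single decomposed component can be sketched directly with standard tools; this illustrates only the instantaneous amplitude/frequency extraction, not the EMD sifting itself, and the chirp signal is a made-up example:

        import numpy as np
        from scipy.signal import hilbert

        fs = 8000.0
        t = np.arange(0.0, 1.0, 1.0 / fs)
        imf = np.sin(2 * np.pi * (200 * t + 150 * t**2))   # a chirp-like "IMF"

        analytic = hilbert(imf)
        amplitude = np.abs(analytic)                        # instantaneous amplitude
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) * fs / (2 * np.pi)       # instantaneous frequency in Hz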

  13. Imaging thermal plasma mass and velocity analyzer

    NASA Astrophysics Data System (ADS)

    Yau, Andrew W.; Howarth, Andrew

    2016-07-01

    We present the design and principle of operation of the imaging ion mass and velocity analyzer on the Enhanced Polar Outflow Probe (e-POP), which measures low-energy (1-90 eV/e) ion mass composition (1-40 AMU/e) and velocity distributions using a hemispherical electrostatic analyzer (HEA), a time-of-flight (TOF) gate, and a pair of toroidal electrostatic deflectors (TED). The HEA and TOF gate measure the energy-per-charge and azimuth of each detected ion and the ion transit time inside the analyzer, respectively, providing the 2-D velocity distribution of each major ionospheric ion species and resolving the minor ion species under favorable conditions. The TED are in front of the TOF gate and optionally sample ions at different elevation angles up to ±60°, for measurement of 3-D velocity distribution. We present examples of observation data to illustrate the measurement capability of the analyzer, and show the occurrence of enhanced densities of heavy "minor" O++, N+, and molecular ions and intermittent, high-velocity (a few km/s) upward and downward flowing H+ ions in localized regions of the quiet time topside high-latitude ionosphere.

  14. Analyzing the Information Economy: Tools and Techniques.

    ERIC Educational Resources Information Center

    Robinson, Sherman

    1986-01-01

    Examines methodologies underlying studies which measure the information economy and considers their applicability and limitations for analyzing policy issues concerning libraries and library networks. Two studies provide major focus for discussion: Porat's "The Information Economy: Definition and Measurement" and Machlup's "Production and…

  15. Fluidization quality analyzer for fluidized beds

    DOEpatents

    Daw, C.S.; Hawk, J.A.

    1995-07-25

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time varying pressure drop measurements. A fast-response pressure transducer measures the pressure drop across the overall bed, or across some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence. 9 figs.
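
    One simple way to turn a time-varying pressure-drop signal into a fluidization-quality index is a moving-window measure of its fluctuations. The Python sketch below is an illustrative digital stand-in for the analog signal processing described in the patent; the sample rate, window length, and synthetic signal are assumptions.

```python
import numpy as np

def turbulence_index(pressure_drop, fs, window_s=2.0):
    """Moving-window standard deviation of the bed pressure-drop signal,
    used here as a simple index of fluidization turbulence."""
    n = int(window_s * fs)
    out = np.empty(len(pressure_drop) - n + 1)
    for i in range(len(out)):
        out[i] = np.std(pressure_drop[i:i + n])   # fluctuation level in the window
    return out

if __name__ == "__main__":
    fs = 100.0                                    # Hz, fast-response transducer samples
    t = np.arange(0.0, 60.0, 1.0 / fs)
    rng = np.random.default_rng(0)
    # Synthetic signal: steady mean drop plus fluctuations that grow with time.
    dp = 5.0 + (0.05 + 0.25 * t / 60.0) * rng.standard_normal(len(t))
    index = turbulence_index(dp, fs)
    print(f"turbulence index near start: {index[0]:.3f}")
    print(f"turbulence index near end  : {index[-1]:.3f}")
```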

  16. A calibration means for spectrum analyzers

    NASA Technical Reports Server (NTRS)

    Larson, M. S.

    1967-01-01

    Spectrum analyzer calibration system is rapid and provides an accurate family of adjustable markers at any point in the spectrum. Pulse width controls determine the number of markers. The unit operates with a repetition rate from 300 cps to 40 kc at a center frequency from 10 kc to 2 Mc.

  17. Battery analyzer for electric golf carts

    SciTech Connect

    Sharber, J.M.

    1982-02-09

    A battery tester for testing individual batteries on an electrically driven vehicle such that, as the vehicle is in motion, the condition of each battery in a series-connected power source may be analyzed. The device briefly comprises an indicator device, such as a meter, and connection devices to isolate each individual battery so that its voltage can be determined as the vehicle is being driven.

  18. How to Analyze Company Using Social Network?

    NASA Astrophysics Data System (ADS)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. A new way to analyze company structure by utilizing the natural social network existing within the company, together with an example of its usage on the Enron company, is presented in this paper.

  19. Lightweight, broad-band spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Crook, G. M.

    1972-01-01

    Spectrum analyzer, utilizing techniques similar to those used to classify energy levels of nuclear particles, is incorporated into electric field detector. Primary advantage is ability to perform qualitative broad-band frequency analysis over a large dynamic amplitude range with minimum weight and electrical power requirements.

  20. 40 CFR 92.109 - Analyzer specifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) Hydrocarbon measurements are to be made with a heated flame ionization detector (HFID) analyzer. An overflow...-heated flame ionization detector (FID) that measures hydrocarbon emissions on a dry basis is permitted... methane consists of a gas chromatograph (GC) combined with a flame ionization detector (FID). (3)...

  1. Experiments using the Argonne Fragment Mass Analyzer

    SciTech Connect

    Davids, C.N.; Back, B.; Carpenter, M.P.; Henderson, D.J.; Henry, R.G.; Janssens, R.V.F.; Khoo, T.L.; Lauritsen, T.; Liang, Y.; Bindra, K.; Chung, W.; Soramel, F.; Bearden, I.G.; Daly, P.J.; Fornal, B.; Grabowski, Z.W.; Mayer, R.H.; Nisius, D.; Broda, R.; Ramayya, V.; Bingham, C.R.; Moltz, D.M.; Robertson, J.D.; Scarlassara, F.; Spolaore, P.; Toth, K.S.; Walters, W.B.

    1993-05-01

    The Fragment Mass Analyzer (FMA) at the ATLAS accelerator has been operational for about one year. During that period a number of test runs and experiments have been carried out. The test runs have verified that the ion optics of the FMA are essentially as calculated. A brief facility description is followed by recent experimental results.

  2. Experiments using the Argonne Fragment Mass Analyzer

    SciTech Connect

    Davids, C.N.; Back, B.; Carpenter, M.P.; Henderson, D.J.; Henry, R.G.; Janssens, R.V.F.; Khoo, T.L.; Lauritsen, T.; Liang, Y.; Bindra, K. (Vanderbilt Univ., Nashville, TN); Chung, W. (Notre Dame Univ., IN); Soramel, F. (Argonne National Lab., IL)

    1993-01-01

    The Fragment Mass Analyzer (FMA) at the ATLAS accelerator has been operational for about one year. During that period a number of test runs and experiments have been carried out. The test runs have verified that the ion optics of the FMA are essentially as calculated. A brief facility description is followed by recent experimental results.

  3. Analyzing Languages for Specific Purposes Discourse

    ERIC Educational Resources Information Center

    Bowles, Hugo

    2012-01-01

    In the last 20 years, technological advancement and increased multidisciplinarity has expanded the range of data regarded as within the scope of languages for specific purposes (LSP) research and the means by which they can be analyzed. As a result, the analytical work of LSP researchers has developed from a narrow focus on specialist terminology…

  4. Illinois Community College Grievance Procedure Analyzer.

    ERIC Educational Resources Information Center

    Lovell, Ned B.; And Others

    The study described in this report analyzed the status of grievance procedures in the Illinois community colleges that engage in collective bargaining. Following an introductory chapter offering a rationale for the study, chapter 2 provides an analysis of existing grievance procedures in the colleges based on a study of the collective bargaining…

  5. Studying Reliability Using Identical Handheld Lactate Analyzers

    ERIC Educational Resources Information Center

    Stewart, Mark T.; Stavrianeas, Stasinos

    2008-01-01

    Accusport analyzers were used to generate lactate performance curves in an investigative laboratory activity emphasizing the importance of reliable instrumentation. Both the calibration and testing phases of the exercise provided students with a hands-on opportunity to use laboratory-grade instrumentation while allowing for meaningful connections…

  6. Images & Issues: How to Analyze Election Rhetoric.

    ERIC Educational Resources Information Center

    Rank, Hugh

    Although it is impossible to know in advance the credibility of political messages, such persuasive discourse can be analyzed in a non-partisan, common sense way using predictable patterns in content and form. The content of a candidate's message can be summarized as "I am competent and trustworthy; from me, you'll get 'more good' and 'less bad.'"…

  7. Multichannel analyzers at high rates of input

    NASA Technical Reports Server (NTRS)

    Rudnick, S. J.; Strauss, M. G.

    1969-01-01

    Multichannel analyzer, used with a gating system incorporating pole-zero compensation, pile-up rejection, and baseline-restoration, achieves good resolution at high rates of input. It improves resolution, reduces tailing and rate-contributed continuum, and eliminates spectral shift.

  8. PLT and PDX perpendicular charge exchange analyzers

    NASA Astrophysics Data System (ADS)

    Mueller, D.; Hammett, G. W.; McCune, D. C.

    1986-08-01

    The perpendicular charge-exchange systems used on the poloidal divertor experiment and the Princeton large torus are comprised of ten-channel, mass-resolved, charge-exchange analyzers. Results from these systems indicate that instrumental effects can lead to erroneous temperature measurements during deuterium neutral beam injection or at low hydrogen concentrations.

  9. Analyzing volatile compounds in dairy products

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Volatile compounds give the first indication of the flavor in a dairy product. Volatiles are isolated from the sample matrix and then analyzed by chromatography, sensory methods, or an electronic nose. Isolation may be performed by solvent extraction or headspace analysis, and gas chromatography i...

  10. Fluidization quality analyzer for fluidized beds

    DOEpatents

    Daw, C. Stuart; Hawk, James A.

    1995-01-01

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time varying pressure drop measurements. A fast-response pressure transducer measures the pressure drop across the overall bed, or across some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence.

  11. Analyzing the Acoustic Beat with Mobile Devices

    ERIC Educational Resources Information Center

    Kuhn, Jochen; Vogt, Patrik; Hirth, Michael

    2014-01-01

    In this column, we have previously presented various examples of how physical relationships can be examined by analyzing acoustic signals using smartphones or tablet PCs. In this example, we will be exploring the acoustic phenomenon of small beats, which is produced by the overlapping of two tones with a low difference in frequency Δf. The…
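
    The beat rate equals the frequency difference of the two tones, which is easy to verify numerically. The sketch below generates two tones 4 Hz apart and estimates the beat rate from the amplitude envelope; the sampling rate and tone frequencies are illustrative choices, not values from the article.

```python
import numpy as np
from scipy.signal import hilbert

def beat_signal(f1, f2, fs=8000.0, duration=4.0):
    """Superpose two equal-amplitude tones; the envelope oscillates at |f1 - f2|."""
    t = np.arange(0.0, duration, 1.0 / fs)
    return np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t), fs

def beat_rate(x, fs):
    """Dominant frequency of the amplitude envelope, i.e. the audible beat rate."""
    envelope = np.abs(hilbert(x))
    envelope -= envelope.mean()
    spectrum = np.abs(np.fft.rfft(envelope))
    freqs = np.fft.rfftfreq(len(envelope), 1.0 / fs)
    return freqs[np.argmax(spectrum)]

if __name__ == "__main__":
    x, fs = beat_signal(440.0, 444.0)   # two tones 4 Hz apart
    print(f"estimated beat rate ~ {beat_rate(x, fs):.2f} Hz (expected |444 - 440| = 4 Hz)")
```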

  12. Strengthening 4-H by Analyzing Enrollment Data

    ERIC Educational Resources Information Center

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak…

  13. ITK and ANALYZE: a synergistic integration

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Robb, Richard A.

    2004-05-01

    The Insight Toolkit (ITK) is a C++ open-source software toolkit developed under sponsorship of the National Library of Medicine. It provides advanced algorithms for performing image registration and segmentation, but does not provide support for visualization and analysis, nor does it offer any graphical user interface (GUI). The purpose of this integration project is to make ITK readily accessible to end-users with little or no programming skills, and provide interactive processing, visualization and measurement capabilities. This is achieved through the integration of ITK with ANALYZE, a multi-dimension image visualization/analysis application installed in over 300 institutions around the world, with a user-base in excess of 4000. This integration is carried out at both the software foundation and GUI levels. The foundation technology upon which ANALYZE is built is a comprehensive C-function library called AVW. A new set of AVW-ITK functions has been developed and integrated into the AVW library, and four new ITK modules have been added to the ANALYZE interface. Since ITK is a software developer's toolkit, the only way to access its intrinsic power is to write programs that incorporate it. Integrating ITK with ANALYZE opens the ITK algorithms to end-users who otherwise might never be able to take advantage of the toolkit's advanced functionality. In addition, this integration provides end-to-end interactive problem solving capabilities which allow all users, including programmers, an integrated system to readily display and quantitatively evaluate the results from the segmentation and registration routines in ITK, regardless of the type or format of input images, which are comprehensively supported in ANALYZE.

  14. Calibration of optical particle-size analyzer

    DOEpatents

    Pechin, William H.; Thacker, Louis H.; Turner, Lloyd J.

    1979-01-01

    This invention relates to a system for the calibration of an optical particle-size analyzer of the light-intercepting type for spherical particles, wherein a rotary wheel or disc is provided with radially-extending wires of differing diameters, each wire corresponding to a particular equivalent spherical particle diameter. These wires are passed at an appropriate frequency between the light source and the light detector of the analyzer. The reduction of light as received at the detector is a measure of the size of the wire, and the electronic signal may then be adjusted to provide the desired signal for corresponding spherical particles. This calibrator may be operated at any time without interrupting other processing.

  15. Electric wind in a Differential Mobility Analyzer

    DOE PAGES

    Palo, Marus; Meelis Eller; Uin, Janek; Tamm, Eduard

    2015-10-25

    Electric wind -- the movement of gas, induced by ions moving in an electric field -- can be a distorting factor in size distribution measurements using Differential Mobility Analyzers (DMAs). The aim of this study was to determine the conditions under which electric wind occurs in the locally-built VLDMA (Very Long Differential Mobility Analyzer) and TSI Long-DMA (3081) and to describe the associated distortion of the measured spectra. Electric wind proved to be promoted by the increase of electric field strength, aerosol layer thickness, particle number concentration and particle size. The measured size spectra revealed three types of distortion: widening of the size distribution, shift of the mode of the distribution to smaller diameters and smoothing out the peaks of the multiply charged particles. Electric wind may therefore be a source of severe distortion of the spectrum when measuring large particles at high concentrations.

  16. Real time speech formant analyzer and display

    DOEpatents

    Holland, G.E.; Struve, W.S.; Homer, J.F.

    1987-02-03

    A speech analyzer for interpretation of sound includes a sound input which converts the sound into a signal representing the sound. The signal is passed through a plurality of frequency pass filters to derive a plurality of frequency formants. These formants are converted to voltage signals by frequency-to-voltage converters and then are prepared for visual display in continuous real time. Parameters from the inputted sound are also derived and displayed. The display may then be interpreted by the user. The preferred embodiment includes a microprocessor which is interfaced with a television set for displaying of the sound formants. The microprocessor software enables the sound analyzer to present a variety of display modes for interpretive and therapeutic use by the user. 19 figs.

  17. Real time speech formant analyzer and display

    DOEpatents

    Holland, George E.; Struve, Walter S.; Homer, John F.

    1987-01-01

    A speech analyzer for interpretation of sound includes a sound input which converts the sound into a signal representing the sound. The signal is passed through a plurality of frequency pass filters to derive a plurality of frequency formants. These formants are converted to voltage signals by frequency-to-voltage converters and then are prepared for visual display in continuous real time. Parameters from the inputted sound are also derived and displayed. The display may then be interpreted by the user. The preferred embodiment includes a microprocessor which is interfaced with a television set for displaying of the sound formants. The microprocessor software enables the sound analyzer to present a variety of display modes for interpretive and therapeutic use by the user.
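
    The chain described in this record (band-pass filter bank, rectification, and frequency-to-voltage conversion for display) can be approximated digitally. The Python sketch below is an illustrative analogue of that chain, not the patented circuit; the band edges, filter orders, and test signal are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def formant_tracks(x, fs, bands):
    """Band-pass filter bank followed by rectification and low-pass smoothing,
    a digital analogue of the filter / frequency-to-voltage chain."""
    smoother = butter(2, 30.0, btype="lowpass", fs=fs, output="sos")
    tracks = []
    for lo, hi in bands:
        bp = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfilt(bp, x)
        tracks.append(sosfilt(smoother, np.abs(band)))   # rectify + smooth
    return np.array(tracks)

if __name__ == "__main__":
    fs = 8000.0
    t = np.arange(0.0, 0.5, 1.0 / fs)
    # Crude vowel-like test signal with energy near 700 Hz and 1200 Hz.
    x = np.sin(2 * np.pi * 700 * t) + 0.6 * np.sin(2 * np.pi * 1200 * t)
    bands = [(300, 900), (900, 1500), (1500, 2500)]      # illustrative formant bands
    for (lo, hi), track in zip(bands, formant_tracks(x, fs, bands)):
        print(f"{lo:4d}-{hi:4d} Hz  steady-state level = {track[int(0.1 * fs):].mean():.3f}")
```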

  18. Compact fast analyzer of rotary cuvette type

    DOEpatents

    Thacker, Louis H.

    1976-01-01

    A compact fast analyzer of the rotary cuvette type is provided for simultaneously determining concentrations in a multiplicity of discrete samples using either absorbance or fluorescence measurement techniques. A rigid, generally rectangular frame defines optical passageways for the absorbance and fluorescence measurement systems. The frame also serves as a mounting structure for various optical components as well as for the cuvette rotor mount and drive system. A single light source and photodetector are used in making both absorbance and fluorescence measurements. Rotor removal and insertion are facilitated by a swing-out drive motor and rotor mount. The invention relates generally to concentration measuring instruments and more specifically to a compact fast analyzer of the rotary cuvette type which is suitable for making either absorbance or fluorescence measurements. It was made in the course of, or under, a contract with the U.S. Atomic Energy Commission.

  19. Modeling extreme ultraviolet suppression of electrostatic analyzers

    SciTech Connect

    Gershman, Daniel J.; Zurbuchen, Thomas H.

    2010-04-15

    In addition to analyzing energy-per-charge ratios of incident ions, electrostatic analyzers (ESAs) for spaceborne time-of-flight mass spectrometers must also protect detectors from extreme ultraviolet (EUV) photons from the Sun. The required suppression rate often exceeds 1:10^7 and is generally established in tests upon instrument design and integration. This paper describes a novel technique to model the EUV suppression of ESAs using photon ray tracing integrated into SIMION, the most commonly used ion optics design software for such instruments. The paper compares simulation results with measurements taken from the ESA of the Mass instrument flying onboard the Wind spacecraft. This novel technique enables an active inclusion of EUV suppression requirements in the ESA design process. Furthermore, the simulation results also motivate design rules for such instruments.

  20. Clinical laboratory data: acquire, analyze, communicate, liberate.

    PubMed

    Azzazy, Hassan M E; Elbehery, Ali H A

    2015-01-01

    The availability of portable healthcare devices, which can acquire and transmit medical data to remote experts, would dramatically affect healthcare in areas with poor infrastructure. Smartphones, which feature touchscreen computer capabilities and sophisticated cameras, have become widely available with over a billion units shipped in 2013. In the clinical laboratory, smartphones have recently brought the capabilities of key instruments such as spectrophotometers, fluorescence analyzers and microscopes into the palm of the hand. Several research groups have developed sensitive and low-cost smartphone-based diagnostic assay prototypes for testing cholesterol, albumin, vitamin D, tumor markers, and the detection of infectious agents. This review covers the use of smartphones to acquire, analyze, communicate, and liberate clinical laboratory data. Smartphones promise to dramatically improve the quality and quantity of healthcare offered in resource-limited areas.

  1. Electric wind in a Differential Mobility Analyzer

    SciTech Connect

    Palo, Marus; Meelis Eller; Uin, Janek; Tamm, Eduard

    2015-10-25

    Electric wind -- the movement of gas, induced by ions moving in an electric field -- can be a distorting factor in size distribution measurements using Differential Mobility Analyzers (DMAs). The aim of this study was to determine the conditions under which electric wind occurs in the locally-built VLDMA (Very Long Differential Mobility Analyzer) and TSI Long-DMA (3081) and to describe the associated distortion of the measured spectra. Electric wind proved to be promoted by the increase of electric field strength, aerosol layer thickness, particle number concentration and particle size. The measured size spectra revealed three types of distortion: widening of the size distribution, shift of the mode of the distribution to smaller diameters and smoothing out the peaks of the multiply charged particles. Electric wind may therefore be a source of severe distortion of the spectrum when measuring large particles at high concentrations.

  2. Scanning energy analyzer of charge exchange atoms

    SciTech Connect

    Rogozin, A.I.; Shikhovtsev, I.V.

    1994-12-31

    The construction, operation principle, and parameters of a multichannel scanning energy analyzer of charge exchange atoms are discussed. The analyzer is used to measure the splashing ion angular distribution and energy spectra, the ions being produced in a gas dynamic plasma trap (GDCS) during the injection of a powerful atomic hydrogen beam into preliminarily produced plasma with density n ≈ 5 × 10^13 cm^-3. The parameters of the hydrogen beam are as follows: particle energy 50 keV, equivalent current 250 A, pulse duration 1 ms. The device can also be used for measurements of energy spectra of atomic and charged particle beams in plasma diagnostics, beam physics, and physics of atomic collisions. 4 refs., 4 figs.

  3. Analyzing High-Dimensional Multispectral Data

    NASA Technical Reports Server (NTRS)

    Lee, Chulhee; Landgrebe, David A.

    1993-01-01

    In this paper, through a series of specific examples, we illustrate some characteristics encountered in analyzing high-dimensional multispectral data. The increased importance of the second-order statistics in analyzing high-dimensional data is illustrated, as is the shortcoming of classifiers such as the minimum distance classifier which rely on first-order variations alone. We also illustrate how inaccurate estimation of first- and second-order statistics, e.g., from use of training sets which are too small, affects the performance of a classifier. Recognizing the importance of second-order statistics on the one hand, but the increased difficulty in perceiving and comprehending information present in statistics derived from high-dimensional data on the other, we propose a method to aid visualization of high-dimensional statistics using a color coding scheme.
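
    The point about second-order statistics can be made concrete with a small experiment: two classes that share a mean but differ in covariance are inseparable for a minimum distance classifier, while a Gaussian quadratic classifier that uses the covariance separates them. The sketch below uses synthetic data with illustrative dimensions, not the paper's imagery.

```python
import numpy as np

def minimum_distance(X, means):
    """Assign each sample to the class with the nearest mean (first-order only)."""
    d = np.stack([np.sum((X - m) ** 2, axis=1) for m in means])
    return np.argmin(d, axis=0)

def gaussian_quadratic(X, means, covs):
    """Gaussian quadratic classifier; exploits second-order (covariance) structure."""
    scores = []
    for m, c in zip(means, covs):
        diff = X - m
        inv = np.linalg.inv(c)
        _, logdet = np.linalg.slogdet(c)
        scores.append(-0.5 * np.einsum("ij,jk,ik->i", diff, inv, diff) - 0.5 * logdet)
    return np.argmax(np.stack(scores), axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, n = 30, 500
    # Two classes with (nearly) identical means but different covariances:
    # first-order information alone cannot separate them.
    Xa = rng.multivariate_normal(np.zeros(dim), np.eye(dim), n)
    Xb = rng.multivariate_normal(np.zeros(dim), np.diag(np.linspace(0.2, 3.0, dim)), n)
    X = np.vstack([Xa, Xb])
    y = np.array([0] * n + [1] * n)
    means = [Xa.mean(axis=0), Xb.mean(axis=0)]
    covs = [np.cov(Xa.T), np.cov(Xb.T)]
    for name, pred in [("minimum distance           ", minimum_distance(X, means)),
                       ("quadratic (uses covariance)", gaussian_quadratic(X, means, covs))]:
        print(f"{name} accuracy = {(pred == y).mean():.2f}")
```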

  4. The EPOS Automated Selective Chemistry Analyzer evaluated.

    PubMed

    Moses, G C; Lightle, G O; Tuckerman, J F; Henderson, A R

    1986-01-01

    We evaluated the analytical performance of the EPOS (Eppendorf Patient Oriented System) Automated Selective Chemistry Analyzer, using the following tests for serum analytes: alanine and aspartate aminotransferases, lactate dehydrogenase, creatine kinase, gamma-glutamyltransferase, alkaline phosphatase, and glucose. Results from the EPOS correlated well with those from comparison instruments (r greater than or equal to 0.990). Precision and linearity limits were excellent for all tests; linearity of the optical and pipetting systems was satisfactory. Reagent carryover was negligible. Sample-to-sample carryover was less than 1% for all tests, but only lactate dehydrogenase was less than the manufacturer's specified 0.5%. Volumes aspirated and dispensed by the sample and reagent II pipetting systems differed significantly from preset values, especially at lower settings; the reagent I system was satisfactory at all volumes tested. Minimal daily maintenance and an external data-reduction system make the EPOS a practical alternative to other bench-top chemistry analyzers.

  5. Automated acousto-optic infrared analyzer system

    SciTech Connect

    Steinbruegge, K.B.; Gottlieb, M.S.

    1984-12-25

    An automated acousto-optic tunable filter infrared analyzer system useable in a variety of industrial and commercial control applications. The system relies upon a narrow band pass tunable acousto-optic filter which is selectively tuned by predetermined rf frequency signals to selectively transmit the narrow band pass of interest which corresponds to a specific molecular species for identification and analysis. The system includes a microcomputer and associated memory function to measure and compare detected signals from an infrared detector which converts the filtered infrared signal to an electrical signal. The memory provides control signals for the computer and for controlling the sequence and frequency of rf energy applied to tune the filter. In this way, the near to mid range infrared can be analyzed for absorption bands corresponding to predetermined molecular species such as combustion product gases, and a feedback signal generated to control the combustion process.

  6. Real-time airborne particle analyzer

    DOEpatents

    Reilly, Peter T.A.

    2012-10-16

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the gas-filled conduit. Preferably, the gas filled-conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  7. Modeling extreme ultraviolet suppression of electrostatic analyzers.

    PubMed

    Gershman, Daniel J; Zurbuchen, Thomas H

    2010-04-01

    In addition to analyzing energy-per-charge ratios of incident ions, electrostatic analyzers (ESAs) for spaceborne time-of-flight mass spectrometers must also protect detectors from extreme ultraviolet (EUV) photons from the Sun. The required suppression rate often exceeds 1:10^7 and is generally established in tests upon instrument design and integration. This paper describes a novel technique to model the EUV suppression of ESAs using photon ray tracing integrated into SIMION, the most commonly used ion optics design software for such instruments. The paper compares simulation results with measurements taken from the ESA of the Mass instrument flying onboard the Wind spacecraft. This novel technique enables an active inclusion of EUV suppression requirements in the ESA design process. Furthermore, the simulation results also motivate design rules for such instruments.

  8. Analyzing network reliability using structural motifs

    NASA Astrophysics Data System (ADS)

    Khorramzadeh, Yasamin; Youssef, Mina; Eubank, Stephen; Mowlaei, Shahir

    2015-04-01

    This paper uses the reliability polynomial, introduced by Moore and Shannon in 1956, to analyze the effect of network structure on diffusive dynamics such as the spread of infectious disease. We exhibit a representation for the reliability polynomial in terms of what we call structural motifs that is well suited for reasoning about the effect of a network's structural properties on diffusion across the network. We illustrate by deriving several general results relating graph structure to dynamical phenomena.
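
    The reliability polynomial R(p) of a small graph can be estimated directly by Monte Carlo: keep each edge with probability p and check whether the surviving network is still connected. The sketch below uses all-terminal connectivity as the reliability criterion for illustration; the paper's structural-motif representation is a more general and exact treatment.

```python
import random
from itertools import combinations

def connected(n_nodes, edges):
    """Union-find check that the surviving edge set leaves the graph connected."""
    parent = list(range(n_nodes))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for u, v in edges:
        parent[find(u)] = find(v)
    return len({find(i) for i in range(n_nodes)}) == 1

def reliability(n_nodes, edges, p, trials=20000, seed=1):
    """Monte Carlo estimate of R(p): probability the network stays connected
    when each edge independently functions with probability p."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        surviving = [e for e in edges if rng.random() < p]
        ok += connected(n_nodes, surviving)
    return ok / trials

if __name__ == "__main__":
    cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]          # a 4-node ring
    complete = list(combinations(range(4), 2))        # the complete graph K4
    for p in (0.5, 0.7, 0.9):
        print(f"p = {p:.1f}   R(ring) = {reliability(4, cycle, p):.3f}   "
              f"R(K4) = {reliability(4, complete, p):.3f}")
```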

  9. Direct memory access digital events analyzer

    NASA Astrophysics Data System (ADS)

    Basano, L.; Ottonello, P.

    1989-06-01

    We present a random-point-process multifunction analyzer in which a long sequence of interpulse intervals are recorded in the RAM bank of a personal computer, through a suitably designed front end attached to a commercial DMA interface. Laser light scattered by ground-glass disks and by aqueous suspensions of polystyrene latex spheres has been used to test the performance of the device that may be employed in a broad range of applications.

  10. A gas filter correlation analyzer for methane

    NASA Technical Reports Server (NTRS)

    Sebacher, D. I.

    1978-01-01

    A fast-response instrument for monitoring CH4 was designed and tested using a modified nondispersive infrared technique. An analysis of the single-beam rotating-cell system is presented along with the signal processing circuit. A calibration of the instrument shows that the technique can be used to measure CH4 concentrations as small as 5 ppm-m and the effects of interfering gases are analyzed.

  11. Coordinating, Scheduling, Processing and Analyzing IYA09

    NASA Technical Reports Server (NTRS)

    Gipson, John; Behrend, Dirk; Gordon, David; Himwich, Ed; MacMillan, Dan; Titus, Mike; Corey, Brian

    2010-01-01

    The IVS scheduled a special astrometric VLBI session for the International Year of Astronomy 2009 (IYA09) commemorating 400 years of optical astronomy and 40 years of VLBI. The IYA09 session is the most ambitious geodetic session to date in terms of network size, number of sources, and number of observations. We describe the process of designing, coordinating, scheduling, pre-session station checkout, correlating, and analyzing this session.

  12. Cometary particulate analyzer design definition study

    NASA Technical Reports Server (NTRS)

    Utterback, N. G.

    1981-01-01

    A concept study for remotely determining the relative abundance of elements contained in cometary particulates collected by a spacecraft was conducted, with very encouraging results. The technique utilizes a short high intensity burst of laser radiation to vaporize and ionize collected particulate material. Ions extracted from this laser-produced plasma are analyzed in a time of flight mass spectrometer to yield an atomic mass spectrum representative of the relative abundance of elements in the particulates. A prototype analyzer system was designed, constructed, and tested. Results show that: (1) energy-time focus performs as predicted in improving resolution; (2) power densities sufficient to produce usable ionization efficiencies can be obtained; (3) complex alloys such as stainless steel can be analyzed; and (4) a tiny, simple and reliable laser used in the demonstration easily meets spacecraft power and mass limitations. A mass resolution of 150 was experimentally demonstrated at mass 108, and an analytical extrapolation predicts a resolution sufficient to separate masses 250 and 251.
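
    The time-of-flight relation behind the quoted mass resolution is t = L*sqrt(m/(2qV)), so m/Δm = t/(2Δt) and the resolution is set by timing precision. The sketch below evaluates these expressions for illustrative voltage, drift length, and timing jitter; none of the numbers are taken from the prototype instrument.

```python
import math

E_CHARGE = 1.602176634e-19    # C
AMU      = 1.66053906660e-27  # kg

def flight_time(mass_amu, charge_e, accel_V, drift_m):
    """t = L * sqrt(m / (2*q*V)) for an ion accelerated through accel_V volts."""
    return drift_m * math.sqrt(mass_amu * AMU / (2.0 * charge_e * E_CHARGE * accel_V))

def mass_resolution(mass_amu, charge_e, accel_V, drift_m, timing_jitter_s):
    """Since t scales as sqrt(m), m/dm = t / (2*dt): resolution is set by timing."""
    return flight_time(mass_amu, charge_e, accel_V, drift_m) / (2.0 * timing_jitter_s)

if __name__ == "__main__":
    V, L, jitter = 1000.0, 0.25, 5e-9     # volts, metres, seconds (illustrative)
    for m in (108, 250, 251):
        t = flight_time(m, 1, V, L)
        r = mass_resolution(m, 1, V, L, jitter)
        print(f"m = {m:3d} AMU   t = {t * 1e6:7.3f} us   m/dm ~ {r:5.0f}")
    dt = flight_time(251, 1, V, L) - flight_time(250, 1, V, L)
    print(f"time separation between m = 250 and m = 251: {dt * 1e9:.1f} ns")
```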

  13. RGA-5 process gas analyzer test report

    SciTech Connect

    Weamer, J.L.

    1994-11-09

    The gas monitoring system, GMS-2, includes two gas monitors. GC-2 measures high hydrogen concentrations (0.2--10%) and GC-3 measures the lower concentration levels (10--100 ppm). Although redundant instruments are in place for accurately measuring the higher hydrogen concentrations, there are no redundant instruments to accurately measure the relatively low baseline hydrogen concentrations. The RGA-5 process gas analyzer is a two-column GC that will replace GC-2 and provide redundancy for GC-3. This upgrade will provide faster response time and reduce tank farm entries for routine operations because the RGA-5 is remotely operable. Tests were conducted according to WHC-SD-WM-TP-262, RGA-5 Process Gas Analyzer Test Plan. The first objective was to verify that the vendor-supplied RGA host data acquisition software allowed communication between the RGA-5 and an ISA bus personal computer. The second objective was to determine the capabilities of the RGA-5 process gas analyzer. The tests did the following: with a constant flow rate and pressure, determined the concentration range that each column can accurately and precisely measure; identified any uncorrected interferences from other tank gases such as ammonia, nitrous oxide, or methane; and determined the response and decay time.

  14. A Raman-Based Portable Fuel Analyzer

    NASA Astrophysics Data System (ADS)

    Farquharson, Stuart

    2010-08-01

    Fuel is the single most important supply during war. Consider that the US Military is employing over 25,000 vehicles in Iraq and Afghanistan. Most fuel is obtained locally, and must be characterized to ensure proper operation of these vehicles. Fuel properties are currently determined using a deployed chemical laboratory. Unfortunately, each sample requires in excess of 6 hours to characterize. To overcome this limitation, we have developed a portable fuel analyzer capable of determining 7 fuel properties that allow fuel usability to be determined. The analyzer uses Raman spectroscopy to measure the fuel samples without preparation in 2 minutes. The challenge, however, is that as distilled fractions of crude oil, all fuels are composed of hundreds of hydrocarbon components that boil at similar temperatures, and performance properties cannot be simply correlated to a single component, and certainly not to specific Raman peaks. To meet this challenge, we measured over 800 diesel and jet fuels from around the world and used chemometrics to correlate the Raman spectra to fuel properties. Critical to the success of this approach is laser excitation at 1064 nm to avoid fluorescence interference (many fuels fluoresce) and a rugged interferometer that provides 0.1 cm-1 wavenumber (x-axis) accuracy to guarantee accurate correlations. Here we describe the portable fuel analyzer, the chemometric models, and the successful determination of these 7 fuel properties for over 100 unknown samples provided by the US Marine Corps, US Navy, and US Army.
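
    The chemometric step (regressing a fuel property onto full spectra) is commonly done with partial least squares. The sketch below builds synthetic "spectra" as sums of Gaussian bands and fits a PLS model with scikit-learn; the band positions, property definition, and noise level are invented for illustration and do not represent the actual fuel data or models.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "Raman spectra": each spectrum is a sum of a few Gaussian bands,
# and the fuel "property" depends on the band weights.
n_samples, n_channels = 300, 600
shift = np.linspace(300.0, 3200.0, n_channels)               # Raman shift axis, 1/cm
centers = [800.0, 1450.0, 2900.0]
weights = rng.uniform(0.2, 1.0, size=(n_samples, len(centers)))
spectra = sum(w[:, None] * np.exp(-0.5 * ((shift - c) / 40.0) ** 2)
              for w, c in zip(weights.T, centers))
spectra += 0.01 * rng.standard_normal(spectra.shape)          # detector noise
fuel_property = 40.0 + 25.0 * weights[:, 0] - 10.0 * weights[:, 2]

X_train, X_test, y_train, y_test = train_test_split(
    spectra, fuel_property, test_size=0.25, random_state=1)

pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
pred = pls.predict(X_test).ravel()
rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
print(f"RMSE of the predicted property on held-out samples: {rmse:.2f}")
```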

  15. Analyzing complex networks evolution through Information Theory quantifiers

    NASA Astrophysics Data System (ADS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martín Gómez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
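
    The dissimilarity measure used here, the square root of the Jensen-Shannon divergence, is straightforward to compute from two probability distributions. The sketch below implements it directly and applies it to two invented degree distributions standing in for network states; the example values are illustrative only.

```python
import numpy as np

def js_distance(p, q, base=2.0):
    """Square root of the Jensen-Shannon divergence between two distributions."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0                     # 0 * log(0) is taken as 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask])) / np.log(base)
    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))

if __name__ == "__main__":
    # Invented degree distributions standing in for two network states,
    # e.g. a regular ring versus a partially rewired (Watts-Strogatz-like) ring.
    ring    = [0.00, 0.00, 1.00, 0.00, 0.00]
    rewired = [0.05, 0.20, 0.50, 0.20, 0.05]
    print(f"JS distance (ring vs rewired) = {js_distance(ring, rewired):.3f}")
    print(f"JS distance (ring vs ring)    = {js_distance(ring, ring):.3f}")
```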

  16. Analyzing the BBOB results by means of benchmarking concepts.

    PubMed

    Mersmann, O; Preuss, M; Trautmann, H; Bischl, B; Weihs, C

    2015-01-01

    We present methods to answer two basic questions that arise when benchmarking optimization algorithms. The first one is: which algorithm is the "best" one? and the second one is: which algorithm should I use for my real-world problem? Both are connected and neither is easy to answer. We present a theoretical framework for designing and analyzing the raw data of such benchmark experiments. This represents a first step in answering the aforementioned questions. The 2009 and 2010 BBOB benchmark results are analyzed by means of this framework and we derive insight regarding the answers to the two questions. Furthermore, we discuss how to properly aggregate rankings from algorithm evaluations on individual problems into a consensus, its theoretical background and which common pitfalls should be avoided. Finally, we address the grouping of test problems into sets with similar optimizer rankings and investigate whether these are reflected by already proposed test problem characteristics, finding that this is not always the case.

  17. Volatile Analyzer for Lunar Polar Missions

    NASA Technical Reports Server (NTRS)

    Gibons, Everett K.; Pillinger, Colin T.; McKay, David S.; Waugh, Lester J.

    2011-01-01

    One of the major questions remaining for the future exploration of the Moon by humans concerns the presence of volatiles on our nearest neighbor in space. Observational studies, and investigations involving returned lunar samples and robotic spacecraft, infer the existence of volatile compounds, particularly water [1]. It seems very likely that a volatile component will be concentrated at the poles in circumstances where low temperatures exist to provide cryogenic traps. However, the full inventory of species, their concentration and their origin and sources are unknown. Of particular importance is whether abundances are sufficient to act as a resource of consumables for future lunar expeditions, especially if a long-term base involving humans is to be established. To address some of these issues requires a lander designed specifically for operation at a high lunar latitude. A vital part of the payload needs to be a volatile analyzer such as the Gas Analysis Package, specifically designed for identification and quantification of volatile substances and for collecting information which will allow the origin of these volatiles to be identified [1]. The equipment included, particularly the gas analyzer, must be capable of operation in the extreme environmental conditions to be encountered. No accurate information yet exists regarding volatile concentration even for sites closer to the lunar equator (because of contamination). In this respect it will be important to understand (and thus limit) contamination of the lunar surface by extraneous material contributed from a variety of sources. The only data for the concentrations of volatiles at the poles comes from orbiting spacecraft and whilst the levels at high latitudes may be greater than at the equator, the volatile analyzer package under consideration will be designed to operate at the highest specifications possible and in a way that does not compromise the data.

  18. A computer program for analyzing channel geometry

    USGS Publications Warehouse

    Regan, R.S.; Schaffranek, R.W.

    1985-01-01

    The Channel Geometry Analysis Program (CGAP) provides the capability to process, analyze, and format cross-sectional data for input to flow/transport simulation models or other computational programs. CGAP allows for a variety of cross-sectional data input formats through use of variable format specification. The program accepts data from various computer media and provides for modification of machine-stored parameter values. CGAP has been devised to provide a rapid and efficient means of computing and analyzing the physical properties of an open-channel reach defined by a sequence of cross sections. CGAP's 16 options provide a wide range of methods by which to analyze and depict a channel reach and its individual cross-sectional properties. The primary function of the program is to compute the area, width, wetted perimeter, and hydraulic radius of cross sections at successive increments of water surface elevation (stage) from data that consist of coordinate pairs of cross-channel distances and land surface or channel bottom elevations. Longitudinal rates-of-change of cross-sectional properties are also computed, as are the mean properties of a channel reach. Output products include tabular lists of cross-sectional area, channel width, wetted perimeter, hydraulic radius, average depth, and cross-sectional symmetry computed as functions of stage; plots of cross sections; plots of cross-sectional area and (or) channel width as functions of stage; tabular lists of cross-sectional area and channel width computed as functions of stage for subdivisions of a cross section; plots of cross sections in isometric projection; and plots of cross-sectional area at a fixed stage as a function of longitudinal distance along an open-channel reach. A Command Procedure Language program and Job Control Language procedure exist to facilitate program execution on the U.S. Geological Survey Prime and Amdahl computer systems respectively. (Lantz-PTT)
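
    The core computation (area, top width, wetted perimeter, and hydraulic radius of a cross section at a given stage) can be sketched compactly. The Python below is an illustrative re-implementation of that geometry, not CGAP itself; the trapezoidal test section and stages are assumptions.

```python
import math

def section_properties(distances, elevations, stage):
    """Area, top width, wetted perimeter and hydraulic radius of a cross section
    (cross-channel distance / bed elevation pairs) at water-surface elevation
    'stage'. Sloping segments are clipped at the waterline."""
    area = width = perimeter = 0.0
    points = list(zip(distances, elevations))
    for (x1, z1), (x2, z2) in zip(points, points[1:]):
        d1, d2 = stage - z1, stage - z2            # water depths at the endpoints
        if d1 <= 0.0 and d2 <= 0.0:
            continue                               # segment entirely above water
        if d1 > 0.0 and d2 > 0.0:                  # fully submerged segment
            dx = x2 - x1
            area += 0.5 * (d1 + d2) * dx
            width += dx
            perimeter += math.hypot(dx, z2 - z1)
        else:                                      # partially submerged segment
            d_wet = d1 if d1 > 0.0 else d2
            frac = d_wet / (abs(d1) + abs(d2))     # submerged fraction of the segment
            dx = abs(x2 - x1) * frac
            area += 0.5 * d_wet * dx
            width += dx
            perimeter += math.hypot(dx, d_wet)
    radius = area / perimeter if perimeter > 0.0 else 0.0
    return area, width, perimeter, radius

if __name__ == "__main__":
    # Simple trapezoidal test section: banks at elevation 3, flat bed at 0.
    x = [0.0, 2.0, 6.0, 8.0]
    z = [3.0, 0.0, 0.0, 3.0]
    for stage in (1.0, 2.0, 3.0):
        a, w, p, r = section_properties(x, z, stage)
        print(f"stage {stage:.1f}:  area {a:6.2f}  width {w:5.2f}  perimeter {p:6.2f}  R {r:4.2f}")
```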

  19. Remote Laser Diffraction Particle Size Distribution Analyzer

    SciTech Connect

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC—formerly recognized as the Idaho Chemical Processing Plant) which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid can not be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable—making this technology far superior than the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  20. A computer program for analyzing unresolved Mossbauer hyperfine spectra

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Singh, J. J.

    1978-01-01

    The program for analyzing unresolved Mossbauer hyperfine spectra was written in FORTRAN 4 language for the Control Data CYBER 170 series digital computer system with network operating system 1.1. With the present dimensions, the program requires approximately 36,000 octal locations of core storage. A typical case involving two innermost coordination shells in which the amplitudes and the peak positions of all three components were estimated in 25 iterations requires 30 seconds on CYBER 173. The program was applied to determine the effects of various near neighbor impurity shells on hyperfine fields in dilute FeAl alloys.

  1. Light-weight analyzer for odor recognition

    DOEpatents

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a light weight analyzer, e.g., detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guess-work out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self contained, portable and built for field use. Both visual and auditory cues are provided to the operator.

  2. Analyzing Ever Growing Datasets in PHENIX

    SciTech Connect

    Pinkenburg, C.; PHENIX Collaboration

    2010-10-18

    After 10 years of running, the PHENIX experiment has by now accumulated more than 700 TB of reconstructed data which are directly used for analysis. Analyzing these amounts of data efficiently requires a coordinated approach. Beginning in 2005 we started to develop a system for the RHIC Atlas Computing Facility (RACF) which allows the efficient analysis of these large data sets. The Analysis Taxi is now the tool which allows any collaborator to process any data set taken since 2003 in weekly passes with turnaround times of typically three to four days.

  3. Spectrum Analyzers Incorporating Tunable WGM Resonators

    NASA Technical Reports Server (NTRS)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry; Maleki, Lute

    2009-01-01

    A photonic instrument is proposed to boost the resolution for ultraviolet/ optical/infrared spectral analysis and spectral imaging allowing the detection of narrow (0.00007-to-0.07-picometer wavelength resolution range) optical spectral signatures of chemical elements in space and planetary atmospheres. The idea underlying the proposal is to exploit the advantageous spectral characteristics of whispering-gallery-mode (WGM) resonators to obtain spectral resolutions at least three orders of magnitude greater than those of optical spectrum analyzers now in use. Such high resolutions would enable measurement of spectral features that could not be resolved by prior instruments.

  4. Nonlinear Single-Spin Spectrum Analyzer

    NASA Astrophysics Data System (ADS)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-01

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  5. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  6. Real-Time Occupancy Change Analyzer

    2005-03-30

    The Real-Time Occupancy Change Analyzer (ROCA) produces an occupancy grid map of an environment around the robot, scans the environment to generate a current obstacle map relative to a current robot position, and converts the current obstacle map to a current occupancy grid map. Changes in the occupancy grid can be reported in real time to support a number of tracking capabilities. The benefit of ROCA is that rather than only providing a vector to the detected change, it provides the actual x,y position of the change.
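
    Reporting the x,y position of each changed cell amounts to comparing a current obstacle grid against the reference occupancy grid and converting changed cell indices to world coordinates. The sketch below illustrates that idea with an invented grid resolution and origin; it is not the ROCA implementation.

```python
import numpy as np

def occupancy_changes(reference, current, resolution=0.1, origin=(0.0, 0.0)):
    """Compare a current obstacle grid with a reference occupancy grid and
    return world x,y coordinates of cells whose occupancy changed."""
    changed = np.argwhere(reference != current)             # (row, col) indices
    xs = origin[0] + (changed[:, 1] + 0.5) * resolution     # column -> x
    ys = origin[1] + (changed[:, 0] + 0.5) * resolution     # row    -> y
    appeared = current[changed[:, 0], changed[:, 1]] == 1
    return [(float(x), float(y), "appeared" if a else "disappeared")
            for x, y, a in zip(xs, ys, appeared)]

if __name__ == "__main__":
    ref = np.zeros((10, 10), dtype=int)
    ref[2, 3] = 1                        # obstacle known from the reference map
    cur = ref.copy()
    cur[2, 3] = 0                        # that obstacle is gone...
    cur[7, 7] = 1                        # ...and a new one has appeared
    for x, y, what in occupancy_changes(ref, cur):
        print(f"change at x = {x:.2f} m, y = {y:.2f} m: obstacle {what}")
```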

  7. FOMA: A Fast Optical Multichannel Analyzer

    NASA Astrophysics Data System (ADS)

    Haskovec, J. S.; Bramson, G.; Brooks, N. H.; Perry, M.

    1989-12-01

    A Fast Optical Multichannel Analyzer (FOMA) was built for spectroscopic measurements with fast time resolution on the DIII-D tokamak. The FOMA utilizes a linear photodiode array (RETICON RL 1024 SA) as the detector sensor. An external recharge switch and ultrafast operational amplifiers permit a readout time per pixel of 300 ns. In conjunction with standard CAMAC digitizer and timing modules, a readout time of 500 μs is achieved for the full 1024-element array. Data acquired in bench tests and in actual spectroscopic measurements on the DIII-D tokamak are presented to illustrate the camera's capability.

  8. Analyzing Ever Growing Datasets in PHENIX

    NASA Astrophysics Data System (ADS)

    Pinkenburg, Christopher; PHENIX Collaboration

    2011-12-01

    After 10 years of running, the PHENIX experiment has by now accumulated more than 700 TB of reconstructed data which are directly used for analysis. Analyzing these amounts of data efficiently requires a coordinated approach. Beginning in 2005 we started to develop a system for the RHIC Atlas Computing Facility (RACF) which allows the efficient analysis of these large data sets. The Analysis Taxi is now the tool which allows any collaborator to process any data set taken since 2003 in weekly passes with turnaround times of typically three to four days.

  9. Gas dynamics in residual gas analyzer calibration

    SciTech Connect

    Santeler, D.J.

    1987-01-01

    Residual gas analyzers are used for measuring partial flow rates as well as for measuring partial pressures. The required calibration may also be obtained with either known flow rates or known pressures. The calibration and application procedures are straightforward when both are of the same type; however, substantial errors may occur if the two types are mixed. This report develops the basic equations required to convert between partial pressure calibrations and partial flow rate calibrations. It also discusses the question of fractionating and nonfractionating gas flow in various gas inlet and pumping systems.

  10. A low power Multi-Channel Analyzer

    SciTech Connect

    Anderson, G.A.; Brackenbush, L.W.

    1993-06-01

    The instrumentation used in nuclear spectroscopy is generally large, is not portable, and requires a lot of power. Key components of these counting systems are the computer and the Multi-Channel Analyzer (MCA). To assist in performing measurements requiring portable systems, a small, very low power MCA has been developed at Pacific Northwest Laboratory (PNL). This MCA is interfaced with a Hewlett Packard palm top computer for portable applications. The MCA can also be connected to an IBM/PC for data storage and analysis. In addition, a real-time display mode allows the user to view the spectra as they are collected.

  11. MULTI-CHANNEL PULSE HEIGHT ANALYZER

    DOEpatents

    Boyer, K.; Johnstone, C.W.

    1958-11-25

    An improved multi-channel pulse height analyzer of the type where the device translates the amplitude of each pulse into a time duration electrical quantity which is utilized to control the length of a train of pulses forwarded to a scaler is described. The final state of the scaler for any one train of pulses selects the appropriate channel in a magnetic memory in which an additional count of one is placed. The improvement consists of a storage feature for storing a signal pulse so that in many instances when two signal pulses occur in rapid succession, the second pulse is preserved and processed at a later time.

  12. Using SCR methods to analyze requirements documentation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. Through this formal and systematic approach of the SCR requirements methods, insights are gained as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the OC (Operations Concept) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.

  13. Plutonium solution analyzer. Revised February 1995

    SciTech Connect

    Burns, D.A.

    1995-02-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%--0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40--240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4--4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 ml of each sample and standard, and generates waste at the rate of about 1.5 ml per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  14. The MAVEN Solar Wind Electron Analyzer

    NASA Astrophysics Data System (ADS)

    Mitchell, D. L.; Mazelle, C.; Sauvaud, J.-A.; Thocaven, J.-J.; Rouzaud, J.; Fedorov, A.; Rouger, P.; Toublanc, D.; Taylor, E.; Gordon, D.; Robinson, M.; Heavner, S.; Turin, P.; Diaz-Aguado, M.; Curtis, D. W.; Lin, R. P.; Jakosky, B. M.

    2016-04-01

    The MAVEN Solar Wind Electron Analyzer (SWEA) is a symmetric hemispheric electrostatic analyzer with deflectors that is designed to measure the energy and angular distributions of 3-4600-eV electrons in the Mars environment. This energy range is important for impact ionization of planetary atmospheric species, and encompasses the solar wind core and halo populations, shock-energized electrons, auroral electrons, and ionospheric primary photoelectrons. The instrument is mounted at the end of a 1.5-meter boom to provide a clear field of view that spans nearly 80% of the sky with ~20° resolution. With an energy resolution of 17% (ΔE/E), SWEA readily distinguishes electrons of solar wind and ionospheric origin. Combined with a 2-second measurement cadence and on-board real-time pitch angle mapping, SWEA determines magnetic topology with high (~8-km) spatial resolution, so that local measurements of the plasma and magnetic field can be placed into global context.

  15. CALIBRATION OF ONLINE ANALYZERS USING NEURAL NETWORKS

    SciTech Connect

    Rajive Ganguli; Daniel E. Walsh; Shaohai Yu

    2003-12-05

    Neural networks were used to calibrate an online ash analyzer at the Usibelli Coal Mine, Healy, Alaska, by relating the Americium and Cesium counts to the ash content. A total of 104 samples were collected from the mine, with 47 being from screened coal, and the rest being from unscreened coal. Each sample corresponded to 20 seconds of coal on the running conveyor belt. Neural network modeling used the quick stop training procedure. Therefore, the samples were split into training, calibration and prediction subsets. Special techniques, using genetic algorithms, were developed to representatively split the sample into the three subsets. Two separate approaches were tried. In one approach, the screened and unscreened coal was modeled separately. In another, a single model was developed for the entire dataset. No advantage was seen from modeling the two subsets separately. The neural network method performed very well on average but not individually, i.e. though each prediction was unreliable, the average of a few predictions was close to the true average. Thus, the method demonstrated that the analyzers were accurate at 2-3 minutes intervals (average of 6-9 samples), but not at 20 seconds (each prediction).
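
    The workflow described (a representative train/calibration/prediction split, quick-stop training monitored on the calibration subset, and averaging predictions over 2-3 minute blocks) can be sketched with synthetic data. Everything in the example below, including the two-count input, the target relation, and the network size, is invented for illustration and does not reflect the Usibelli data or models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the analyzer data: two count rates -> ash content.
n = 104
counts = rng.uniform(0.5, 2.0, size=(n, 2))                         # "Am" and "Cs" counts
ash = 5.0 + 8.0 * counts[:, 0] - 3.0 * counts[:, 1] + 1.5 * rng.standard_normal(n)

# Representative split into training / calibration / prediction subsets.
idx = rng.permutation(n)
train, calib, pred_idx = idx[:60], idx[60:80], idx[80:]

model = MLPRegressor(hidden_layer_sizes=(8,), learning_rate_init=0.01, random_state=1)
best_err, patience = np.inf, 0
for epoch in range(500):
    model.partial_fit(counts[train], ash[train])
    err = np.mean((model.predict(counts[calib]) - ash[calib]) ** 2)
    if err < best_err - 1e-4:
        best_err, patience = err, 0
    else:
        patience += 1
        if patience >= 20:             # "quick stop": calibration error stopped improving
            break

p = model.predict(counts[pred_idx])
rmse_single = np.sqrt(np.mean((p - ash[pred_idx]) ** 2))
print(f"RMSE of individual 20-s predictions : {rmse_single:.2f}")

# Averaging blocks of ~8 predictions mimics reporting 2-3 minute averages.
k = 8
m = len(p) // k * k
rmse_block = np.sqrt(np.mean((p[:m].reshape(-1, k).mean(axis=1)
                              - ash[pred_idx][:m].reshape(-1, k).mean(axis=1)) ** 2))
print(f"RMSE of block-averaged estimates    : {rmse_block:.2f}")
```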

  16. Atmospheric Aerosol Chemistry Analyzer: Demonstration of feasibility

    SciTech Connect

    Mroz, E.J.; Olivares, J.; Kok, G.

    1996-04-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The project objective was to demonstrate the technical feasibility of an Atmospheric Aerosol Chemistry Analyzer (AACA) that will provide a continuous, real-time analysis of the elemental (major, minor and trace) composition of atmospheric aerosols. The AACA concept is based on sampling the atmospheric aerosol through a wet cyclone scrubber that produces an aqueous suspension of the particles. This suspension can then be analyzed for elemental composition by ICP/MS or collected for subsequent analysis by other methods. The key technical challenge was to develop a wet cyclone aerosol sampler suitable for respirable particles found in ambient aerosols. We adapted an ultrasonic nebulizer to a conventional, commercially available, cyclone aerosol sampler and completed collection efficiency tests for the unit, which was shown to efficiently collect particles as small as 0.2 microns. We have completed the necessary basic research and have demonstrated the feasibility of the AACA concept.

  17. Analyzing endocrine system conservation and evolution.

    PubMed

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution.

  18. Modular thermal analyzer routine, volume 1

    NASA Technical Reports Server (NTRS)

    Oren, J. A.; Phillips, M. A.; Williams, D. R.

    1972-01-01

    The Modular Thermal Analyzer Routine (MOTAR) is a general thermal analysis routine with strong capabilities for performing thermal analysis of systems containing flowing fluids, fluid system controls (valves, heat exchangers, etc.), life support systems, and thermal radiation situations. Its modular organization permits the analysis of a very wide range of thermal problems, from simple problems containing a few conduction nodes to those requiring complicated flow and radiation analysis, with each problem type being analyzed with peak computational efficiency and maximum ease of use. The organization and programming methods applied to MOTAR achieved a high degree of computer utilization efficiency in terms of computer execution time and storage space required for a given problem. The computer time required to perform a given problem on MOTAR is approximately 40 to 50 percent of that required by the currently existing, widely used routines. The computer storage requirement for MOTAR is approximately 25 percent more than that of the most commonly used routines for the simplest problems, but the data storage techniques for the more complicated options should save a considerable amount of space.

  19. Optoacoustic 13C-breath test analyzer

    NASA Astrophysics Data System (ADS)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, breath analysis makes it possible to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after ingestion of an isotope-marked substrate in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of 1% in the 13CO2 concentration of the exhaled breath have to be detected at a concentration level of this isotope in the breath of about 500 ppm.
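
    For orientation, 13C breath tests are commonly reported as the change in the 13C/12C ratio of breath CO2 relative to the pre-substrate baseline (delta over baseline). The sketch below illustrates that arithmetic under assumed isotopologue concentrations; it is not the analyzer's own data-reduction algorithm, and the example numbers simply mirror the 1% change at a ~500 ppm 13CO2 level quoted in the abstract.

    ```python
    R_PDB = 0.0112372  # 13C/12C ratio of the VPDB reference standard

    def delta13C(c13, c12):
        """delta-13C in per mil relative to VPDB, from isotopologue concentrations."""
        return ((c13 / c12) / R_PDB - 1.0) * 1000.0

    def delta_over_baseline(pre, post):
        """DOB (per mil): change in delta-13C between pre- and post-substrate breath.
        Each argument is a (13CO2, 12CO2) concentration pair, e.g. in ppm."""
        return delta13C(*post) - delta13C(*pre)

    # a ~1% relative increase of 13CO2 at a ~500 ppm 13CO2 level
    pre, post = (500.0, 45000.0), (505.0, 45000.0)
    print(round(delta_over_baseline(pre, post), 1))  # about +9.9 per mil
    ```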

  20. Using virtual reality to analyze sports performance.

    PubMed

    Bideau, Benoit; Kulpa, Richard; Vignais, Nicolas; Brault, Sébastien; Multon, Franck; Craig, Cathy

    2010-01-01

    Improving performance in sports can be difficult because many biomechanical, physiological, and psychological factors come into play during competition. A better understanding of the perception-action loop employed by athletes is necessary. This requires isolating contributing factors to determine their role in player performance. Because of its inherent limitations, video playback doesn't permit such in-depth analysis. Interactive, immersive virtual reality (VR) can overcome these limitations and foster a better understanding of sports performance from a behavioral-neuroscience perspective. Two case studies using VR technology and a sophisticated animation engine demonstrate how to use information from visual displays to inform a player's future course of action. PMID:20650707

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - FIELD PORTABLE X-RAY FLUORESCENCE ANALYZER - SCITEC, MAP SPECTRUM ANALYZER

    EPA Science Inventory

    In April 1995, the U.S. Environmental Protection Agency (EPA) sponsored a demonstration of field portable X-ray fluorescence (FPXRF) analyzers. The primary objectives of this demonstration were (1) to determine how well FPXRF analyzers perform in comparison to standard reference...

  2. Automated Root Tracking with "Root System Analyzer"

    NASA Astrophysics Data System (ADS)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and involve user mouse-clicks on each root. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. Output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent
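
    The optimal-path selection step described above can be illustrated with a toy graph: among the candidate extensions of a root tip, pick the path whose length best matches the constant-growth increment. The sketch below (in Python with networkx, whereas the actual tool is a set of Matlab m-files) is a simplified illustration with hypothetical node names and edge lengths, not the published algorithm.

    ```python
    import networkx as nx

    def elongate_root(G, tip, expected_increment):
        """Among simple paths starting at the current root tip, pick the one whose
        length (sum of edge 'length' weights) best matches the expected growth
        increment, and return its nodes as the root's extension."""
        best_path, best_err = [tip], float("inf")
        for target in G.nodes:
            if target == tip:
                continue
            for path in nx.all_simple_paths(G, tip, target, cutoff=10):
                length = nx.path_weight(G, path, weight="length")
                err = abs(length - expected_increment)
                if err < best_err:
                    best_path, best_err = path, err
        return best_path

    # toy skeleton graph: edge "length" in pixels
    G = nx.Graph()
    G.add_edge("tip", "a", length=12.0)
    G.add_edge("a", "b", length=15.0)
    G.add_edge("a", "c", length=30.0)
    print(elongate_root(G, "tip", expected_increment=25.0))  # ['tip', 'a', 'b']
    ```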

  3. Transient One-dimensional Pipe Flow Analyzer

    1986-04-08

    TOPAZ-SNLL, the Transient One-dimensional Pipe flow AnalyZer code, is a user-friendly computer program for modeling the heat transfer, fluid mechanics, and thermodynamics of multi-species gas transfer in arbitrary arrangements of pipes, valves, vessels, and flow branches. Although the flow conservation equations are assumed to be one-dimensional and transient, multidimensional features of internal fluid flow and heat transfer may be accounted for using the available quasi-steady flow correlations (e.g., Moody friction factor correlation and various form loss and heat transfer correlations). Users may also model the effects of moving system boundaries such as pistons, diaphragms, and bladders. The features of fully compressible flow are modeled, including the propagation of shocks and rarefaction waves, as well as the establishment of multiple choke points along the flow path.

  4. LED-based NDIR natural gas analyzer

    NASA Astrophysics Data System (ADS)

    Fanchenko, Sergey; Baranov, Alexander; Savkin, Alexey; Sleptsov, Vladimir

    2016-03-01

    A new generation of light-emitting diodes (LEDs) and photodiodes (PDs) was used recently to develop an open-path non-dispersive infrared (NDIR) methane analyzer. The first open-path detector prototype was constructed using LEDs for the measurement and reference channels, respectively, and the first measurements of methane were performed using optical paths on the order of several meters [3]. Natural gas consists of several light alkanes, mainly methane, and it is important to be able to measure all of them. In the present work we report the results of NDIR measurements of a propane-butane mixture and new measurements of methane, using LEDs at 2300 and 1700 nm for the measurement and reference channels, respectively. The necessity of the double-beam scheme is demonstrated, and the results obtained for methane and the propane-butane mixture are compared.

  5. Coke from small-diameter tubes analyzed

    SciTech Connect

    Albright, L.F.

    1988-08-29

    The mechanism for coke deposit formation and the nature of the coke itself can vary with the design of the ethylene furnace tube bank. In this article, coke deposits from furnaces with small-diameter pyrolysis tubes are examined. The samples were taken from four furnaces of identical design (Plant B). As in both the first and second installments of the series, the coke deposits were examined using a scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX). The deposits from the small-diameter tubes are compared with the coke deposits from the furnace discussed in earlier articles. Analysis of the coke in both sets of samples is then used to offer recommendations for improved decoking procedures, operating procedures, better feed selection, and better selection of the metallurgy used in furnace tubes, in order to extend the operating time of the furnace tubes by reducing the amount and type of coke buildup.

  6. Composite blade structural analyzer (COBSTRAN) user's manual

    NASA Technical Reports Server (NTRS)

    Aiello, Robert A.

    1989-01-01

    The installation and use of a computer code, COBSTRAN (COmposite Blade STRuctural ANalyzer), developed for the design and analysis of composite turbofan and turboprop blades and also for composite wind turbine blades, are described. This code combines composite mechanics and laminate theory with an internal database of fiber and matrix properties. Inputs to the code are constituent fiber and matrix material properties, factors reflecting the fabrication process, composite geometry and blade geometry. COBSTRAN performs the micromechanics, macromechanics and laminate analyses of these fiber composites. COBSTRAN generates a NASTRAN model with equivalent anisotropic homogeneous material properties. Stress output from NASTRAN is used to calculate individual ply stresses, strains, interply stresses, thru-the-thickness stresses and failure margins. Curved panel structures may be modeled provided the curvature of a cross-section is defined by a single-valued function. COBSTRAN is written in FORTRAN 77.

  7. Composite Blade Structural Analyzer (COBSTRAN) demonstration manual

    NASA Technical Reports Server (NTRS)

    Aiello, Robert A.

    1989-01-01

    The input deck setup is described for a computer code, the composite blade structural analyzer (COBSTRAN), which was developed for the design and analysis of composite turbofan and turboprop blades and also for composite wind turbine blades. This manual is intended for use in conjunction with the COBSTRAN user's manual. Seven demonstration problems are described with pre- and postprocessing input decks. Modeling of blades which are solid thru-the-thickness, and also of aircraft wing airfoils with internal spars, is shown. Corresponding NASTRAN and databank input decks are also shown. Detailed descriptions of each line of the pre- and postprocessing decks are provided with reference to the Card Groups defined in the user's manual. A dictionary of all program variables and terms used in this manual may be found in Section 6 of the user's manual.

  8. Perturbation technique to analyze nonlinear oscillations

    SciTech Connect

    Tu, S.T.

    1986-01-01

    Using perturbation and asymptotic methods, the author analyzes the nonlinear oscillations of two dynamical systems: the Bonhoeffer-van der Pol equations and the forced Duffing equation. In the two-dimensional model of the former system, he studies the transition from stable steady-state to relaxation oscillation as a parameter is varied. The analysis also helps to clarify a phenomenon commonly known as the duck trajectory. In the three-dimensional model, bursting oscillation is explained. In the forced Duffing equation, the main interest is the trajectory near the homoclinic orbit and the saddle point. A map of that trajectory is analytically constructed. From that map, limit cycles and their linear stability are investigated.

  9. METCAN: The metal matrix composite analyzer

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.; Murthy, Pappu L. N.

    1988-01-01

    Metal matrix composites (MMC) are the subject of intensive study and are receiving serious consideration for critical structural applications in advanced aerospace systems. MMC structural analysis and design methodologies are studied. Predicting the mechanical and thermal behavior and the structural response of components fabricated from MMC requires the use of a variety of mathematical models. These models relate stresses to applied forces, stress intensities at the tips of cracks to nominal stresses, buckling resistance to applied force, or vibration response to excitation forces. The extensive research in computational mechanics methods for predicting the nonlinear behavior of MMC is described. This research has culminated in the development of the METCAN (METal Matrix Composite ANalyzer) computer code.

  10. Drug stability analyzer for long duration spaceflights

    NASA Astrophysics Data System (ADS)

    Shende, Chetan; Smith, Wayne; Brouillette, Carl; Farquharson, Stuart

    2014-06-01

    Crewmembers of current and future long duration spaceflights require drugs to overcome the deleterious effects of weightlessness, sickness and injuries. Unfortunately, recent studies have shown that some of the drugs currently used may degrade more rapidly in space, losing their potency well before their expiration dates. To complicate matters, the degradation products of some drugs can be toxic. Consequently there is a need for an analyzer that can determine if a drug is safe at the time of use, as well as to monitor and understand space-induced degradation, so that drug types, formulations, and packaging can be improved. Towards this goal we have been investigating the ability of Raman spectroscopy to monitor and quantify drug degradation. Here we present preliminary data by measuring acetaminophen, and its degradation product, p-aminophenol, as pure samples, and during forced degradation reactions.

  11. Analyzing Options for Airborne Emergency Wireless Communications

    SciTech Connect

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept; analyzes basic infrastructure requirements; identifies related infrastructure issues, concerns, and vulnerabilities; and offers recommended solutions.

  12. Method for network analyzation and apparatus

    DOEpatents

    Bracht, Roger B.; Pasquale, Regina V.

    2001-01-01

    A portable network analyzer and method having multiple channel transmit and receive capability for real-time monitoring of processes which maintains phase integrity, requires low power, is adapted to provide full vector analysis, provides output frequencies of up to 62.5 MHz and provides fine sensitivity frequency resolution. The present invention includes a multi-channel means for transmitting and a multi-channel means for receiving, both in electrical communication with a software means for controlling. The means for controlling is programmed to provide a signal to a system under investigation which steps consecutively over a range of predetermined frequencies. The resulting received signal from the system provides complete time domain response information by executing a frequency transform of the magnitude and phase information acquired at each frequency step.

  13. Accuracy considerations of portable electrochemical NOX analyzers

    SciTech Connect

    Capetanopoulos, C.; Hobbs, B.

    1996-12-31

    Two key components contributing to measurement errors of electrochemical analyzers are discussed. These are the sample conditioning system and the electrochemical nitric oxide and nitrogen dioxide sensors. The problems associated with various types of conditioning systems are discussed and some experimental results are presented using analyte spiking methods. Permeation-drier-based systems are shown to cause the smallest loss of the analyte. Two major problems of the NO and NO{sub 2} sensors are examined. The first problem deals with the significant effect of temperature on the sensor and its associated interference rejection filter. The requirement for maintaining sensor and filter temperature below 30{degree}C is demonstrated. The second deals with saturation and drift caused by overexposure to the gas. The significance of capillary size in minimizing drift for diffusion sensors is discussed. Experimental results are presented and discussed with a view to the recently published EPA CTM-022 Method. 2 refs., 7 figs.

  14. Digital avionics design and reliability analyzer

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The description and specifications for a digital avionics design and reliability analyzer are given. Its basic function is to provide for the simulation and emulation of the various fault-tolerant digital avionic computer designs that are developed. It has been established that hardware emulation at the gate-level will be utilized. The primary benefit of emulation to reliability analysis is the fact that it provides the capability to model a system at a very detailed level. Emulation allows the direct insertion of faults into the system, rather than waiting for actual hardware failures to occur. This allows for controlled and accelerated testing of system reaction to hardware failures. There is a trade study which leads to the decision to specify a two-machine system, including an emulation computer connected to a general-purpose computer. There is also an evaluation of potential computers to serve as the emulation computer.

  15. Analyzing antibody-Fc-receptor interactions.

    PubMed

    Nimmerjahn, Falk; Ravetch, Jeffrey V

    2008-01-01

    Cellular receptors for immunoglobulins (Fc-receptors; FcR) are central mediators of antibody-triggered effector functions. Immune complex (IC) binding to FcRs results in a variety of reactions such as the release of inflammatory mediators, antibody dependent cellular cytotoxicity (ADCC) and phagocytosis of ICs. Analyzing antibody-FcR (Ab-FcR) interactions in vitro is essential to determine the effector mechanisms, binding characteristics and affinity parameters that will impact and predict antibody activity in vivo. The methods described in this chapter include the generation of ICs and soluble FcR variants, as well as ELISA and FACS-based assays to study Ab-FcR interactions.

  16. Structural factoring approach for analyzing stochastic networks

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
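
    To make the problem concrete, the sketch below computes the exact distribution of the s-t shortest path length for a toy three-arc network by complete enumeration of arc-length realizations, i.e., the brute-force baseline that the conditional factoring algorithm is designed to improve upon. The network, its discrete arc-length distributions, and the hard-coded path structure are illustrative assumptions.

    ```python
    from itertools import product
    from collections import defaultdict

    # toy stochastic network: each arc has a discrete (length, probability) distribution
    arcs = {
        ("s", "a"): [(1, 0.5), (3, 0.5)],
        ("a", "t"): [(2, 0.7), (4, 0.3)],
        ("s", "t"): [(5, 1.0)],
    }

    def shortest_path_distribution(arcs):
        """Exact s-t shortest path length distribution by complete enumeration
        of all arc-length realizations (the slow baseline)."""
        names = list(arcs)
        dist = defaultdict(float)
        for realization in product(*(arcs[a] for a in names)):
            length = dict(zip(names, (r[0] for r in realization)))
            prob = 1.0
            for _, p in realization:
                prob *= p
            # only two s-t paths exist in this toy network: s-a-t and s-t
            sp = min(length[("s", "a")] + length[("a", "t")], length[("s", "t")])
            dist[sp] += prob
        return dict(dist)

    print(shortest_path_distribution(arcs))   # {3: 0.35, 5: 0.65}
    ```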

  17. Identifying and Analyzing Web Server Attacks

    SciTech Connect

    Seifert, Christian; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.; Komisarczuk, Peter; Muschevici, Radu; Welch, Ian D.

    2008-08-29

    Abstract: Client honeypots can be used to identify malicious web servers that attack web browsers and push malware to client machines. Merely recording network traffic is insufficient to perform comprehensive forensic analyses of such attacks. Custom tools are required to access and analyze network protocol data. Moreover, specialized methods are required to perform a behavioral analysis of an attack, which helps determine exactly what transpired on the attacked system. This paper proposes a record/replay mechanism that enables forensic investigators to extract application data from recorded network streams and allows applications to interact with this data in order to conduct behavioral analyses. Implementations for the HTTP and DNS protocols are presented and their utility in network forensic investigations is demonstrated.

  18. Transient One-dimensional Pipe Flow Analyzer

    SciTech Connect

    1986-04-08

    TOPAZ-SNLL, the Transient One-dimensional Pipe flow AnalyZer code, is a user-friendly computer program for modeling the heat transfer, fluid mechanics, and thermodynamics of multi-species gas transfer in arbitrary arrangements of pipes, valves, vessels, and flow branches. Although the flow conservation equations are assumed to be one-dimensional and transient, multidimensional features of internal fluid flow and heat transfer may be accounted for using the available quasi-steady flow correlations (e.g., Moody friction factor correlation and various form loss and heat transfer correlations). Users may also model the effects of moving system boundaries such as pistons, diaphragms, and bladders. The features of fully compressible flow are modeled, including the propagation of shocks and rarefaction waves, as well as the establishment of multiple choke points along the flow path.

  19. Color recognition system for urine analyzer

    NASA Astrophysics Data System (ADS)

    Zhu, Lianqing; Wang, Zicai; Lin, Qian; Dong, Mingli

    2010-08-01

    In order to increase the speed of photoelectric conversion, a linear CCD is used as the photoelectric converter instead of the traditional photodiode. A white LED is used as the light source of the system. The color information of the urine test strip is transferred into the CCD through a reflecting optical system. It is then converted to digital signals by an A/D converter. The test results of urine analysis are obtained by a data processing system. An ARM microprocessor is selected as the CPU of the system, and a CPLD is employed to provide the driving timing for the CCD drive and the A/D converter. Active-HDL 7.2 and Verilog HDL are used to simulate the driving timing of the CPLD. Experimental results show that the correctness rate of the test results is better than 90%. The system satisfies the requirements of the color information collection of the urine analyzer.

  20. Analyzing biphasic surface plasmon resonance data

    NASA Astrophysics Data System (ADS)

    Tiwari, Purushottam; Wang, Xuewen; He, Jin; Darici, Yesim

    Surface plasmon resonance (SPR) is a widely used label-free biophysical technique to quantitatively study biochemical processes. Analysis of monophasic SPR profiles by fitting with a single exponential function is straightforward. However, there is no simple procedure for SPR data fitting with double exponential functions. An existing approach is to fit the biphasic SPR profiles with numerical solutions of the rate equations. This procedure requires some prior knowledge of the underlying interaction mechanism, and the extracted rate constants often have large uncertainties. We propose a new method of analyzing the biphasic SPR data using the three commonly employed biphasic models. Our method is based on a general analytical solution of the biphasic rate equations. Our method can be used to determine the underlying biphasic interaction mechanism from the analysis of the SPR data, and to extract the rate constants with high confidence levels.
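
    As a point of contrast with the analytical approach proposed in the abstract, the sketch below shows the generic empirical route: fitting a synthetic biphasic dissociation curve with a double exponential via nonlinear least squares. The model form, parameter values and noise level are assumptions for illustration, not the authors' method or data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def biphasic(t, A1, k1, A2, k2, c):
        """Generic double-exponential (biphasic) decay model."""
        return A1 * np.exp(-k1 * t) + A2 * np.exp(-k2 * t) + c

    # synthetic sensorgram with additive noise
    t = np.linspace(0, 300, 301)
    rng = np.random.default_rng(1)
    data = biphasic(t, 60.0, 0.05, 30.0, 0.005, 2.0) + rng.normal(0, 0.5, t.size)

    p0 = [50.0, 0.1, 20.0, 0.01, 0.0]              # rough initial guesses
    popt, pcov = curve_fit(biphasic, t, data, p0=p0)
    perr = np.sqrt(np.diag(pcov))                  # 1-sigma parameter uncertainties
    print(np.round(popt, 4), np.round(perr, 4))
    ```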

  1. Stackable differential mobility analyzer for aerosol measurement

    DOEpatents

    Cheng, Meng-Dawn; Chen, Da-Ren

    2007-05-08

    A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid including at least one inlet or injection slit for receiving an aerosol including charged particles for analysis. A second electrode or grid is spaced apart from the first electrode. The second electrode has at least one sampling outlet disposed at a plurality of different distances along its length. A volume between the first and the second electrode or grid, between the inlet or injection slit and a distal one of the plurality of sampling outlets, forms a classifying region, the first and second electrodes being charged to suitable potentials to create an electric field within the classifying region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into an upstream end of the classifying region, wherein each sampling outlet functions as an independent DMA stage and classifies different size ranges of charged particles based on electrical mobility simultaneously.

  2. Gaseous trace impurity analyzer and method

    DOEpatents

    Edwards, Jr., David; Schneider, William

    1980-01-01

    Simple apparatus for analyzing trace impurities in a gas, such as helium or hydrogen, comprises means for drawing a measured volume of the gas as sample into a heated zone. A segregable portion of the zone is then chilled to condense trace impurities in the gas in the chilled portion. The gas sample is evacuated from the heated zone including the chilled portion. Finally, the chilled portion is warmed to vaporize the condensed impurities in the order of their boiling points. As the temperature of the chilled portion rises, pressure will develop in the evacuated, heated zone by the vaporization of an impurity. The temperature at which the pressure increase occurs identifies that impurity and the pressure increase attained until the vaporization of the next impurity causes a further pressure increase is a measure of the quantity of the preceding impurity.

  3. Numerical procedure for analyzing Langmuir probe data

    NASA Technical Reports Server (NTRS)

    Beattie, J. R.

    1975-01-01

    A numerical procedure is proposed for analyzing Langmuir probe data in the presence of a two-group plasma containing both primary and Maxwellian electrons. The procedure is known as a least-squares differential-correction technique for determining the unknown coefficients of the governing equation. It is shown that for a given set of input data the results of the analysis are unique and independent of the initial estimate of electron temperature, that convergence is fastest when electron temperature is overestimated, that the results are sensitive to the region of the curve used as input and also to the voltage increment, and that plasma properties determined by the proposed numerical procedure are either consistent with those determined graphically or closer to expected values. With a suitable data acquisition system, the Fortran IV program worked out for this procedure could be used to provide real-time plasma diagnostic information for an operating ion thruster.
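
    A hedged sketch of the same idea is shown below: fitting a two-group electron current model to a synthetic retarding-region probe trace by nonlinear least squares in log space, with scipy's Levenberg-Marquardt routine standing in for the paper's Fortran differential-correction code. The model treats the primaries as a second, hotter Maxwellian, which is a simplification of the governing equation used in the paper, and all numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_group_current(V, I_m, T_m, I_h, T_h):
        """Retarding-region electron current (V <= 0, volts) for two Maxwellian
        groups: a cold population plus a hotter group standing in for the
        primaries; temperatures are in eV."""
        return I_m * np.exp(V / T_m) + I_h * np.exp(V / T_h)

    # synthetic probe trace with 2% multiplicative noise
    V = np.linspace(-15.0, 0.0, 150)
    rng = np.random.default_rng(2)
    I = two_group_current(V, 1e-3, 2.0, 5e-5, 8.0) * (1 + rng.normal(0, 0.02, V.size))

    # fit in log space (semilog analysis), starting from deliberately rough estimates
    log_model = lambda V, *p: np.log(two_group_current(V, *p))
    p0 = [5e-4, 5.0, 1e-5, 20.0]
    popt, _ = curve_fit(log_model, V, np.log(I), p0=p0)
    print(np.round(popt, 6))   # expected to recover roughly [1e-3, 2.0, 5e-5, 8.0]
    ```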

  4. Analyzing epithelial and endothelial kisses in Merida

    PubMed Central

    Nusrat, Asma; Quiros, Miguel; González-Mariscal, Lorenza

    2013-01-01

    Last November a group of principal investigators, postdoctoral fellows and PhD students from around the world got together in the city of Merida in Southeastern Mexico for a State of the Art meeting on the "Molecular structure and function of the apical junctional complex in epithelial and endothelia." They analyzed diverse tissue barriers, including those in the gastrointestinal tract, the blood-brain barrier, and the blood-neural and blood-retinal barriers. The talks revealed exciting new findings in the field, novel technical approaches and unpublished data, and highlighted the importance of studying junctional complexes to better understand the pathogenesis of several diseases and to develop therapeutic approaches that can be utilized for drug delivery. This meeting report has the purpose of highlighting the results and advances discussed by the speakers at the Merida Meeting.

  5. Ultrasonic interface level analyzer shop test procedure

    SciTech Connect

    STAEHR, T.W.

    1999-05-24

    The Royce Instrument Corporation Model 2511 Interface Level Analyzer (URSILLA) system uses an ultrasonic ranging technique (SONAR) to measure sludge depths in holding tanks. Three URSILLA instrument assemblies provided by the W-151 project are planned to be used during mixer pump testing to provide data for determining sludge mobilization effectiveness of the mixer pumps and sludge settling rates. The purpose of this test is to provide a documented means of verifying that the functional components of the three URSILLA instruments operate properly. Successful completion of this Shop Test Procedure (STP) is a prerequisite for installation in the AZ-101 tank. The objective of the test is to verify the operation of the URSILLA instruments and to verify data collection using a stand alone software program.

  6. Analyzing Hydrological Sustainability Through Water Balance

    NASA Astrophysics Data System (ADS)

    Menció, Anna; Folch, Albert; Mas-Pla, Josep

    2010-05-01

    The objective of the Water Framework Directive (2000/60/EC) is to assist in the development of management plans that will lead to the sustainable use of water resources in all EU member states. However, defining the degree of sustainability aimed at is not a straightforward task. It requires detailed knowledge of the hydrogeological characteristics of the basin in question, its environmental needs, the amount of human water demand, and the opportunity to construct a proper water balance that describes the behavior of the hydrological system and estimates available water resources. An analysis of the water balance in the Selva basin (Girona, NE Spain) points to the importance of regional groundwater fluxes in satisfying current exploitation rates, and shows that regional scale approaches are often necessary to evaluate water availability. In addition, we discuss the pressures on water resources, and analyze potential actions, based on the water balance results, directed towards achieving sustainable water management in the basin.

  7. Analyzing Collisions in Terms of Newton's Laws

    NASA Astrophysics Data System (ADS)

    Roeder, John L.

    2003-02-01

    Although the principle of momentum conservation is a consequence of Newton's second and third laws of motion, as recognized by Newton himself, this principle is typically applied in analyzing collisions as if it were a separate concept of its own. This year I sought to integrate my treatment of collisions with my coverage of Newton's laws by asking students to calculate the effect on the motion of two particles of the forces they exert on each other for a specified time interval. For example, "A 50-kg crate slides across the ice at 3 m/s and collides with a 25-kg crate at rest. During the collision process the 50-kg crate exerts a 500 N time-averaged force on the 25-kg crate for 0.1 s. What are the accelerations of the crates during the collision, and what are their velocities after the collision? What are the momenta of the crates before and after the collision?"
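
    Working the quoted example through Newton's second and third laws gives accelerations of -10 and +20 m/s², final velocities of 2 m/s for both crates, and a total momentum of 150 kg·m/s both before and after the collision, as the short calculation below confirms.

    ```python
    m1, m2 = 50.0, 25.0          # crate masses (kg)
    v1, v2 = 3.0, 0.0            # velocities before the collision (m/s)
    F, dt = 500.0, 0.1           # time-averaged contact force (N) and duration (s)

    a1, a2 = -F / m1, F / m2     # Newton's 2nd and 3rd laws: -10 and +20 m/s^2
    v1f, v2f = v1 + a1 * dt, v2 + a2 * dt      # 2.0 and 2.0 m/s after the collision

    p_before = m1 * v1 + m2 * v2               # 150 kg*m/s
    p_after  = m1 * v1f + m2 * v2f             # 100 + 50 = 150 kg*m/s (conserved)
    print(a1, a2, v1f, v2f, p_before, p_after)
    ```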

  8. A calibration free vector network analyzer

    NASA Astrophysics Data System (ADS)

    Kothari, Arpit

    Recently, two novel single-port, phase-shifter based vector network analyzer (VNA) systems were developed and tested at X-band (8.2--12.4 GHz) and Ka-band (26.4--40 GHz), respectively. These systems operate by electronically moving the standing wave pattern, set up in a waveguide, over a Schottky detector and sampling the standing wave voltage for several phase shift values. Once this system is fully characterized, all parameters in the system become known and hence, theoretically, no other correction (or calibration) should be required to obtain the reflection coefficient, Gamma, of an unknown load. This makes this type of VNA "calibration free," which is a significant advantage over other types of VNAs. To this end, a VNA system based on this design methodology was developed at X-band using several design improvements (compared to the previous designs) with the aim of demonstrating this "calibration-free" feature. It was found that when a commercial VNA (HP8510C) is used as the source and the detector, the system works as expected. However, when a simple detector is used (Schottky diode, log detector, etc.), obtaining the correct Gamma still requires the customary three-load calibration. With the aim of exploring the cause, a detailed sensitivity analysis of prominent error sources was performed. Extensive measurements were made with different detection techniques, including the use of a spectrum analyzer as a power detector. The system was even tested for electromagnetic compatibility (EMC), which may have contributed to this issue. Although the desired results could not be obtained using the proposed standing-wave-power measuring devices such as the Schottky diode, the principle of the "calibration-free VNA" was shown to be valid.
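
    For reference, the customary three-load one-port calibration mentioned above amounts to solving for three complex error terms from three known standards and then inverting the bilinear error model. The sketch below illustrates that algebra with ideal short/open/match standards and made-up raw readings; it is not the authors' instrument code.

    ```python
    import numpy as np

    def one_port_cal(gamma_actual, gamma_measured):
        """Solve for the three one-port error terms (e00, e11, delta) from three
        known standards, using the bilinear error model
            Gm = e00 + Ga*(e11*Gm + delta),  with delta = e01*e10 - e00*e11."""
        Ga = np.asarray(gamma_actual, dtype=complex)
        Gm = np.asarray(gamma_measured, dtype=complex)
        A = np.column_stack([np.ones(3), Ga * Gm, Ga])
        e00, e11, delta = np.linalg.solve(A, Gm)
        return e00, e11, delta

    def correct(gamma_measured, e00, e11, delta):
        """Apply the error terms to recover the actual reflection coefficient."""
        return (gamma_measured - e00) / (e11 * gamma_measured + delta)

    # ideal short, open, match standards and some made-up raw readings
    Ga_std = [-1.0, 1.0, 0.0]
    Gm_std = [-0.82 + 0.05j, 0.91 - 0.02j, 0.04 + 0.01j]
    terms = one_port_cal(Ga_std, Gm_std)
    print(correct(-0.82 + 0.05j, *terms))   # recovers -1 for the short standard
    ```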

  9. Analyzing and Visualizing Whole Program Architectures

    SciTech Connect

    Panas, T; Quinlan, D; Vuduc, R

    2007-05-10

    This paper describes our work to develop new tool support for analyzing and visualizing the architecture of complete large-scale (millions or more lines of code) programs. Our approach consists of (i) creating a compact, accurate representation of a whole C or C++ program, (ii) analyzing the program in this representation, and (iii) visualizing the analysis results with respect to the program's architecture. We have implemented our approach by extending and combining a compiler infrastructure and a program visualization tool, and we believe our work will be of broad interest to those engaged in a variety of program understanding and transformation tasks. We have added new whole-program analysis support to ROSE [15, 14], a source-to-source C/C++ compiler infrastructure for creating customized analysis and transformation tools. Our whole-program work does not rely on procedure summaries; rather, we preserve all of the information present in the source while keeping our representation compact. In our representation, a million-line application fits in well less than 1 GB of memory. Because whole-program analyses can generate large amounts of data, we believe that abstracting and visualizing analysis results at the architecture level is critical to reducing the cognitive burden on the consumer of the analysis results. Therefore, we have extended Vizz3D [19], an interactive program visualization tool, with an appropriate metaphor and layout algorithm for representing a program's architecture. Our implementation provides developers with an intuitive, interactive way to view analysis results, such as those produced by ROSE, in the context of the program's architecture. The remainder of this paper summarizes our approach to whole-program analysis (Section 2) and provides an example of how we visualize the analysis results (Section 3).

  10. Analyzing large biological datasets with association networks

    SciTech Connect

    Karpinets, T. V.; Park, B. H.; Uberbacher, E. C.

    2012-05-25

    Due to advances in high-throughput biotechnologies, biological information is being collected in databases at an amazing rate, requiring novel computational approaches for timely processing of the collected data into new knowledge. In this study we address this problem by developing a new approach for discovering modular structure, relationships and regularities in complex data. These goals are achieved by converting records of biological annotations of an object, such as an organism, gene, chemical, or sequence, into networks (Anets) and rules (Arules) of the associated annotations. Anets are based on the similarity of annotation profiles of objects and can be further analyzed and visualized, providing a compact bird's-eye view of the most significant relationships in the collected data and a way of clustering and classifying them. Arules are generated by the Apriori algorithm, treating each record of annotations as a transaction and augmenting each annotation item with its type. Arules provide a way to validate relationships discovered by Anets, producing comprehensive statistics on frequently associated annotations and specific confident relationships among them. A combination of Anets and Arules represents condensed information on associations among the collected data, helping to discover new knowledge and generate hypotheses. As an example we have applied the approach to analyze bacterial metadata from the Genomes OnLine Database. The analysis allowed us to produce a map of sequenced bacterial and archaeal organisms based on their genomic, metabolic and physiological characteristics, with three major clusters of metadata representing bacterial pathogens, environmental isolates, and plant symbionts. A signature profile of clustered annotations of environmental bacteria, when compared with that of pathogens, linked aerobic respiration, high GC content and large genome size to the diversity of metabolic activities and physiological features of the organisms.
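
    The support/confidence bookkeeping behind the Arules can be illustrated on a toy set of type-prefixed annotation records, as sketched below. The sketch is restricted to single-item antecedents and consequents and uses made-up metadata rather than the Genomes OnLine Database; it is not the Apriori implementation used in the study.

    ```python
    from itertools import combinations
    from collections import Counter

    # each record: type-prefixed annotations of one organism (toy metadata)
    records = [
        {"phenotype:pathogen", "oxygen:anaerobe", "gc:low", "genome:small"},
        {"phenotype:pathogen", "oxygen:anaerobe", "gc:low", "genome:small"},
        {"phenotype:environmental", "oxygen:aerobe", "gc:high", "genome:large"},
        {"phenotype:symbiont", "oxygen:aerobe", "gc:high", "genome:large"},
    ]

    n = len(records)
    item_counts = Counter(i for r in records for i in r)
    pair_counts = Counter(frozenset(p) for r in records for p in combinations(sorted(r), 2))

    # Apriori-style association rules A -> B, restricted to single-item sides
    min_support, min_confidence = 0.5, 0.9
    for pair, count in pair_counts.items():
        support = count / n
        if support < min_support:
            continue
        a, b = tuple(pair)
        for ante, cons in ((a, b), (b, a)):
            confidence = count / item_counts[ante]
            if confidence >= min_confidence:
                print(f"{ante} -> {cons}  support={support:.2f} confidence={confidence:.2f}")
    ```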

  11. Diffractive interference optical analyzer (DiOPTER)

    NASA Astrophysics Data System (ADS)

    Sasikumar, Harish; Prasad, Vishnu; Pal, Parama; Varma, Manoj M.

    2016-03-01

    This report demonstrates a method for high-resolution refractometric measurements using what we have termed a Diffractive Interference Optical Analyzer (DiOpter). The setup consists of a laser, a polarizer, a transparent diffraction grating and Si photodetectors. The sensor is based on the differential response of diffracted orders to bulk refractive index changes. In these setups, the differential read-out of the diffracted orders suppresses signal drifts and enables time-resolved determination of refractive index changes in the sample cell. A remarkable feature of this device is that, under appropriate conditions, the measurement sensitivity of the sensor can be enhanced by more than two orders of magnitude due to interference between multiply reflected diffracted orders. A noise-equivalent limit of detection (LoD) of 6x10^-7 RIU was achieved in glass. This work focuses on devices with an integrated sample well, made on low-cost PDMS. As the detection methodology is experimentally straightforward, it can be used across a wide array of applications, ranging from detecting changes in surface adsorbates via binding reactions to estimating refractive index (and hence concentration) variations in bulk samples. An exciting prospect of this technique is the potential integration of this device with smartphones using a simple interface based on a transmission-mode configuration. In the transmission configuration, we were able to achieve an LoD of 4x10^-4 RIU, which is sufficient to explore several applications in food quality testing and related fields. We envision the future of this platform as a personal handheld optical analyzer for applications ranging from environmental sensing to healthcare and quality testing of food products.

  12. Electrode contamination effects of retarding potential analyzer

    NASA Astrophysics Data System (ADS)

    Fang, H. K.; Oyama, K.-I.; Cheng, C. Z.

    2014-01-01

    Electrode contamination in electrostatic analyzers such as Langmuir probes and retarding potential analyzers (RPA) is a serious problem for space measurements. The contamination layer acts as extra capacitance and resistance and distorts the measured I-V curve, which results in erroneous measurements. There are two main effects of the contamination layer: one is the impedance effect and the other is charge attachment and accumulation due to the capacitance. The impedance effect can be reduced or eliminated by choosing a proper sweeping frequency. However, for an RPA the charge accumulation effect becomes serious because the capacitance of the contamination layer is much larger than that of a Langmuir probe of similar dimensions. The charge accumulation on the retarding potential grid causes the effective potential that the ions experience to differ from the applied voltage. Then, the number of ions that can pass through the retarding potential grid to reach the collector and, thus, the measured ion current are changed. This effect causes the measured ion drift velocity and ion temperature to deviate from the actual values. The error caused by RPA electrode contamination is expected to be significant for sounding rocket measurements with low rocket velocity (1-2 km/s) and low ion temperature of 200-300 K in the height range of 100-300 km. In this paper we discuss the effects associated with contaminated RPA electrodes based on theoretical analysis and experiments performed in a space plasma operation chamber. Finally, the development of a contamination-free RPA for sounding rocket missions is presented.
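
    As a rough, hedged caricature of the charge-accumulation effect, the contamination layer can be modeled as a first-order RC lag between the applied sweep and the effective retarding potential, which shifts and distorts the I-V curve and hence the inferred drift velocity and temperature. The layer values, the sweep, and the smooth-step I-V characteristic below are arbitrary illustrative choices, not the paper's model.

    ```python
    import numpy as np

    def distorted_iv(t, V_applied, R_c, C_c, iv_clean):
        """Effective grid potential when the applied sweep is low-pass filtered by
        a contamination layer caricatured as a first-order RC lag, together with
        the resulting distorted I-V characteristic."""
        tau = R_c * C_c
        dt = t[1] - t[0]
        V_eff = np.empty_like(V_applied)
        V_eff[0] = V_applied[0]
        for i in range(1, t.size):                     # explicit first-order RC response
            V_eff[i] = V_eff[i - 1] + dt / tau * (V_applied[i] - V_eff[i - 1])
        return V_eff, iv_clean(V_eff)

    # clean RPA characteristic: a smooth step standing in for drifting Maxwellian ions
    iv_clean = lambda V: 0.5 * (1 - np.tanh((V - 5.0) / 0.8))   # arbitrary units

    t = np.linspace(0, 1.0, 1000)                      # 1 s sweep
    V_applied = 12.0 * t                               # 0-12 V linear ramp
    V_eff, I = distorted_iv(t, V_applied, R_c=1e6, C_c=1e-7, iv_clean=iv_clean)
    print(V_applied[-1] - V_eff[-1])                   # lag approaches tau * dV/dt = 1.2 V
    ```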

  13. Interactive word cloud for analyzing reviews

    NASA Astrophysics Data System (ADS)

    Jung, HyunRyong

    2013-12-01

    A five-star quality rating is one of the most widely used systems for evaluating items. However, it has two fundamental limitations: 1) the rating for one item cannot describe crucial information in detail; 2) the rating is not on an absolute scale that can be used to compare items. Because of these limitations, users cannot make an optimal decision. In this paper, we introduce our approach, which extracts useful information from user reviews using collapsed dependencies and sentiment analysis. We propose an interactive word cloud that can show grammatical relationships among words, support efficient exploration of reviews, and display the positivity or negativity of each sentence. In addition, we introduce a visualization for comparing multiple word clouds and illustrate its usage through test cases.

  14. Multi-Pass Quadrupole Mass Analyzer

    NASA Technical Reports Server (NTRS)

    Prestage, John D.

    2013-01-01

    Analysis of the composition of planetary atmospheres is one of the most important and fundamental measurements in planetary robotic exploration. Quadrupole mass analyzers (QMAs) are the primary tool used to execute these investigations, but reductions in the size of these instruments have sacrificed mass resolving power, so that the best present-day QMA devices are still large and expensive and do not deliver the performance of laboratory instruments. An ultra-high-resolution QMA was developed to resolve N2+/CO+ by trapping ions in a linear trap quadrupole filter. Because N2 and CO are resolved, gas chromatography columns used to separate species before analysis are eliminated, greatly simplifying gas analysis instrumentation. For highest performance, the ion trap mode is used. High-resolution (or narrow-band) mass selection is carried out in the central region, but near the DC electrodes at each end, RF/DC field settings are adjusted to allow broadband ion passage. This prevents ion loss during ion reflection at each end. Ions are created inside the trap so that low-energy particles are selected by low-voltage settings on the end electrodes. This is beneficial to good mass resolution since low-energy particles traverse many cycles of the RF filtering fields. Through Monte Carlo simulations, it is shown that ions are reflected at each end many tens of times, each time being sent back through the central section of the quadrupole where ultrahigh mass filtering is carried out. An analyzer was produced with an electrical length orders of magnitude longer than its physical length. Since the selector fields are sized as in conventional devices, the loss of sensitivity inherent in miniaturizing quadrupole instruments is avoided. The no-loss, multi-pass QMA architecture will improve the mass resolution of planetary QMA instruments while reducing demands on the RF electronics for high-voltage/high-frequency production, since the ion transit time is no longer limited to a single pass. The

  15. Comparison of two dry chemistry analyzers and a wet chemistry analyzer using canine serum.

    PubMed

    Lanevschi, Anne; Kramer, John W.

    1996-01-01

    Canine serum was used to compare seven chemistry analytes on two tabletop clinical dry chemistry analyzers, Boehringer's Reflotron and Kodak's Ektachem. Results were compared to those obtained on a wet chemistry reference analyzer, Roche Diagnostics' Cobas Mira. The analytes measured were urea nitrogen (BUN), creatinine, glucose, aspartate aminotransferase (AST), alanine aminotransferase (ALT), cholesterol and bilirubin. Nine to 12 canine sera with values in the low, normal, and high ranges were evaluated. The correlations were acceptable for all comparisons, with correlation coefficients greater than 0.98 for all analytes. Regression analysis showed significant differences for both tabletop analyzers when compared to the reference analyzer for cholesterol and bilirubin, and for glucose and AST on the Kodak Ektachem. The differences appeared to result from proportional systematic error occurring at high analyte concentrations.
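
    The statistical comparison described above (correlation plus regression of each tabletop analyzer against the reference) can be sketched as follows with made-up paired results for a single analyte; ordinary least squares is used here for simplicity, although method-comparison studies often prefer Deming or Passing-Bablok regression.

    ```python
    import numpy as np
    from scipy import stats

    # made-up paired results for one analyte (e.g., cholesterol, mmol/L)
    reference = np.array([2.1, 3.4, 4.0, 4.8, 5.5, 6.3, 7.1, 8.9, 10.2])  # wet chemistry
    tabletop  = np.array([2.0, 3.3, 4.1, 5.0, 5.9, 6.8, 7.8, 9.9, 11.4])  # dry chemistry

    r = np.corrcoef(reference, tabletop)[0, 1]
    fit = stats.linregress(reference, tabletop)
    print(f"r = {r:.3f}, slope = {fit.slope:.2f}, intercept = {fit.intercept:.2f}")
    # a slope clearly above 1 with a near-zero intercept suggests proportional
    # systematic error that grows at high analyte concentrations
    ```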

  16. Analyzing the outcomes of health promotion practices.

    PubMed

    Pereira Lima, Vera Lucia Góes; Arruda, José Maria; Barroso, Maria Auxiliadora Bessa; Lobato Tavares, Maria de Fátima; Ribeiro Campos, Nora Zamith; Zandonadil, Regina Celi Moreira Basílio; da Rocha, Rosa Maria; Parreira, Clélia Maria de Souza Ferreira; Cohen, Simone Cynamon; Kligerman, Débora Cynamon; Sperandio, Ana Maria Girotti; Correa, Carlos Roberto Silveira; Serrano, Miguel Malo

    2007-01-01

    This article focuses on health promotion (HP) outcomes, illustrated through the evaluation of case studies and the identification of strategies which have contributed to their success and sustainability. Evaluation research and practice in three distinct settings are discussed: (i) institutional and governmental agencies; (ii) communities in the "Manguinhos Complex" and Nova Iguaçu Municipality; and (iii) the building of potentially healthy municipality networks. The effectiveness of a social program from a health promotion perspective was assessed on the basis of the "School for Parents" program, undertaken by the First Court of Childhood and Youth of Rio de Janeiro between 2001 and 2004. The analysis was grounded in the monitoring of 48 parents in charge of children under 18 who were victims, above all, of abuse, violence or negligence, and social exclusion. The study's objectives were: illustrating the evidence of effectiveness of health promotion, discussing the concept of HP effectiveness under unfavorable macro conditions, and identifying strategies that foster sustainability of results. Institutional resources included a multi-professional staff, multidisciplinary approaches, participatory workshops, family case management, partnership with public and private institutions, and volunteer and civil-society sponsorship of the families. Evaluation was based on social impact indicators and on psychosocial and contextual determinants. Evaluation methods included program monitoring and quantitative-qualitative methods, through a longitudinal evaluation of 3 years, including one year post-program. The evaluation showed highly favorable results concerning "family integration", "quality of family relations" and "human rights mobilization". Unsatisfactory results such as "lack of access to formal employment" are likely related to structural factors and the need for new public policies in areas such as education, professional training, housing, and access to formal employment. The training process

  17. Signal processing and analyzing works of art

    NASA Astrophysics Data System (ADS)

    Johnson, Don H.; Johnson, C. Richard, Jr.; Hendriks, Ella

    2010-08-01

    In examining paintings, art historians use a wide variety of physico-chemical methods to determine, for example, the paints, the ground (canvas primer) and any underdrawing the artist used. However, the art world has been little touched by signal processing algorithms. Our work develops algorithms to examine x-ray images of paintings, not to analyze the artist's brushstrokes but to characterize the weave of the canvas that supports the painting. The physics of radiography indicates that linear processing of the x-rays is most appropriate. Our spectral analysis algorithms have an accuracy superior to human spot-measurements and have the advantage that, through "short-space" Fourier analysis, they can be readily applied to entire x-rays. We have found that variations in the manufacturing process create a unique pattern of horizontal and vertical thread density variations in the bolts of canvas produced. In addition, we measure the thread angles, providing a way to determine the presence of cusping and to infer the location of the tacks used to stretch the canvas on a frame during the priming process. We have developed weave matching software that employs a new correlation measure to find paintings that share canvas weave characteristics. Using a corpus of over 290 paintings attributed to Vincent van Gogh, we have found several weave match cliques that we believe will refine the art historical record and provide more insight into the artist's creative processes.
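
    A stripped-down, one-dimensional version of the spectral idea, estimating thread density in a single x-ray tile by locating the dominant spatial frequency, is sketched below on a synthetic weave pattern. The real algorithms operate on two-dimensional short-space Fourier transforms of entire x-rays; the tile, resolution, and function name here are illustrative assumptions.

    ```python
    import numpy as np

    def thread_density(tile, dpi):
        """Estimate vertical-thread density (threads/cm) in one x-ray tile by
        locating the dominant horizontal spatial frequency in its spectrum."""
        tile = tile - tile.mean()
        profile = tile.mean(axis=0)                          # average over rows
        spec = np.abs(np.fft.rfft(profile))
        freqs = np.fft.rfftfreq(tile.shape[1], d=1.0 / dpi)  # cycles per inch
        peak = freqs[1 + np.argmax(spec[1:])]                # skip the DC bin
        return peak / 2.54                                   # threads per cm

    # synthetic tile: a ~15 threads/cm weave scanned at 600 dpi
    dpi, ncols = 600, 512
    x_cm = np.arange(ncols) / dpi * 2.54
    tile = np.tile(np.sin(2 * np.pi * 15.0 * x_cm), (64, 1))
    print(round(thread_density(tile, dpi), 1))   # close to 15 threads/cm
    ```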

  18. On geometric factors for neutral particle analyzers.

    PubMed

    Stagner, L; Heidbrink, W W

    2014-11-01

    Neutral particle analyzers (NPA) detect neutralized energetic particles that escape from plasmas. Geometric factors relate the counting rate of the detectors to the intensity of the particle source. Accurate geometric factors enable quick simulation of geometric effects without the need to resort to slower Monte Carlo methods. Previously derived expressions [G. R. Thomas and D. M. Willis, "Analytical derivation of the geometric factor of a particle detector having circular or rectangular geometry," J. Phys. E: Sci. Instrum. 5(3), 260 (1972); J. D. Sullivan, "Geometric factor and directional response of single and multi-element particle telescopes," Nucl. Instrum. Methods 95(1), 5-11 (1971)] for the geometric factor implicitly assume that the particle source is very far away from the detector (far-field); this excludes applications close to the detector (near-field). The far-field assumption does not hold in most fusion applications of NPA detectors. We derive, from probability theory, a generalized framework for deriving geometric factors that are valid for both near and far-field applications as well as for non-isotropic sources and nonlinear particle trajectories.

  19. On geometric factors for neutral particle analyzers

    SciTech Connect

    Stagner, L.; Heidbrink, W. W.

    2014-11-15

    Neutral particle analyzers (NPA) detect neutralized energetic particles that escape from plasmas. Geometric factors relate the counting rate of the detectors to the intensity of the particle source. Accurate geometric factors enable quick simulation of geometric effects without the need to resort to slower Monte Carlo methods. Previously derived expressions [G. R. Thomas and D. M. Willis, “Analytical derivation of the geometric factor of a particle detector having circular or rectangular geometry,” J. Phys. E: Sci. Instrum. 5(3), 260 (1972); J. D. Sullivan, “Geometric factor and directional response of single and multi-element particle telescopes,” Nucl. Instrum. Methods 95(1), 5–11 (1971)] for the geometric factor implicitly assume that the particle source is very far away from the detector (far-field); this excludes applications close to the detector (near-field). The far-field assumption does not hold in most fusion applications of NPA detectors. We derive, from probability theory, a generalized framework for deriving geometric factors that are valid for both near and far-field applications as well as for non-isotropic sources and nonlinear particle trajectories.
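
    As a concrete check of the far-field case discussed in these two records, the sketch below estimates the geometric factor of a simple coaxial two-disk telescope by Monte Carlo (cosine-weighted rays launched from the aperture) and compares it against the standard closed-form far-field expression for that geometry. It is a toy cross-check under assumed dimensions, not the generalized near-field framework derived in the paper.

    ```python
    import numpy as np

    def geometric_factor_mc(r1, r2, sep, n=1_000_000, seed=0):
        """Monte Carlo geometric factor (area x solid angle) of a two-element
        telescope: coaxial circular aperture (radius r1) and detector (radius r2)
        separated by sep, for an isotropic far-field source."""
        rng = np.random.default_rng(seed)
        # uniform start points on the aperture disk
        rho = r1 * np.sqrt(rng.random(n))
        phi = 2 * np.pi * rng.random(n)
        x0, y0 = rho * np.cos(phi), rho * np.sin(phi)
        # cosine-weighted directions over the forward hemisphere
        u = rng.random(n)
        sin_t, cos_t = np.sqrt(u), np.sqrt(1.0 - u)
        psi = 2 * np.pi * rng.random(n)
        # propagate to the detector plane and test for a hit
        x1 = x0 + sep * sin_t / cos_t * np.cos(psi)
        y1 = y0 + sep * sin_t / cos_t * np.sin(psi)
        hit = x1**2 + y1**2 <= r2**2
        return np.pi * (np.pi * r1**2) * hit.mean()

    # standard closed-form far-field result for two coaxial disks
    r1, r2, sep = 1.0, 1.0, 5.0
    s = r1**2 + r2**2 + sep**2
    analytic = 0.5 * np.pi**2 * (s - np.sqrt(s**2 - 4 * r1**2 * r2**2))
    print(geometric_factor_mc(r1, r2, sep), analytic)
    ```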

  20. New system analyzes pumping well performance

    SciTech Connect

    McCoy, J.N. ); Podio, A.L. )

    1990-11-01

    A system has been developed that allows real-time analysis and visualization of the performance of a pumping well, including the pumping unit (beam or submersible), wellbore and reservoir. At a time when maximum efficiency is a prerequisite to profitable operations, this system has the potential to drastically improve the manner in which pumping wells are managed and operated. Microcomputers have already had a major impact on petroleum engineering, and today one is unlikely to see an engineer's desk without some sort of PC or terminal to a local network, even in remote district offices. The development of extremely powerful and portable laptop computers is causing the PC revolution to move to the field in the form of an intelligent data acquisition and diagnostic system. This one system combines all necessary elements to obtain data for annular liquid level surveys, dynamometer analysis, pressure transient analysis and other measurements required to analyze pumping well performance properly. Moreover, the system includes a database management component that allows accurate records from past analyses to be maintained and retrieved.