Science.gov

Sample records for image analyses monitoring

  1. Phase contrast image segmentation using a Laue analyser crystal

    NASA Astrophysics Data System (ADS)

    Kitchen, Marcus J.; Paganin, David M.; Uesugi, Kentaro; Allison, Beth J.; Lewis, Robert A.; Hooper, Stuart B.; Pavlov, Konstantin M.

    2011-02-01

    Dual-energy x-ray imaging is a powerful tool enabling two-component samples to be separated into their constituent objects from two-dimensional images. Phase contrast x-ray imaging can render the boundaries between media of differing refractive indices visible, despite them having similar attenuation properties; this is important for imaging biological soft tissues. We have used a Laue analyser crystal and a monochromatic x-ray source to combine the benefits of both techniques. The Laue analyser creates two distinct phase contrast images that can be simultaneously acquired on a high-resolution detector. These images can be combined to separate the effects of x-ray phase, absorption and scattering and, using the known complex refractive indices of the sample, to quantitatively segment its component materials. We have successfully validated this phase contrast image segmentation (PCIS) using a two-component phantom, containing an iodinated contrast agent, and have also separated the lungs and ribcage in images of a mouse thorax. Simultaneous image acquisition has enabled us to perform functional segmentation of the mouse thorax throughout the respiratory cycle during mechanical ventilation.
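
    The two-image material separation described above reduces, per pixel, to solving a 2x2 linear system relating the measured signals to the two unknown material thicknesses. A minimal sketch, assuming a linear forward model with known per-material response coefficients (the numbers below are illustrative, not from the paper):

```python
def decompose_two_materials(s1, s2, c11, c12, c21, c22):
    """Recover two material thicknesses (t1, t2) from two measured
    signals (s1, s2), assuming a linear forward model:
        s1 = c11*t1 + c12*t2
        s2 = c21*t1 + c22*t2
    The c coefficients are the (known) per-material responses in each
    of the two images; the system is solved by Cramer's rule."""
    det = c11 * c22 - c12 * c21
    if abs(det) < 1e-12:
        raise ValueError("material responses are degenerate")
    t1 = (s1 * c22 - s2 * c12) / det
    t2 = (c11 * s2 - c21 * s1) / det
    return t1, t2

# Forward-simulate one pixel with known thicknesses, then invert.
c = (0.8, 0.2, 0.3, 0.9)             # illustrative response matrix
t1_true, t2_true = 2.0, 5.0
s1 = c[0] * t1_true + c[1] * t2_true
s2 = c[2] * t1_true + c[3] * t2_true
t1, t2 = decompose_two_materials(s1, s2, *c)
```

    Applying this at every pixel yields the segmented component images; the paper's contribution is obtaining the two input images simultaneously via the Laue analyser.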

  2. Analyser-based phase contrast image reconstruction using geometrical optics.

    PubMed

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 µm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
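
    The symmetric Pearson VII profile can be written in several equivalent forms; one common parameterization (an illustration, not necessarily the authors' exact form) keeps the width parameter equal to the half-width at half-maximum for any shape parameter m:

```python
def pearson_vii(x, amplitude, centre, hwhm, m):
    """Symmetric Pearson VII profile. With this parameterization the
    value falls to amplitude/2 at x = centre +/- hwhm for any m.
    m = 1 gives a Lorentzian; m -> infinity approaches a Gaussian."""
    u = (x - centre) / hwhm
    return amplitude * (1.0 + u * u * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

peak = pearson_vii(0.0, 1.0, 0.0, 0.5, 1.7)   # value at the centre
half = pearson_vii(0.5, 1.0, 0.0, 0.5, 1.7)   # value at one HWHM
lorentzian_half = pearson_vii(1.0, 1.0, 0.0, 1.0, 1.0)   # m = 1 case
```

    Fitting a measured rocking curve then amounts to least-squares optimization over (amplitude, centre, hwhm, m), e.g. with a standard curve-fitting routine.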

  3. Colony image acquisition and genetic segmentation algorithm and colony analyses

    NASA Astrophysics Data System (ADS)

    Wang, W. X.

    2012-01-01

Colony analysis is used in many fields, such as food, dairy, beverages, hygiene, environmental monitoring, water, toxicology and sterility testing. To reduce labor and increase analysis accuracy, many researchers and developers have worked on image analysis systems. The main problems in such systems are image acquisition, image segmentation and image analysis. In this paper, to acquire colony images of good quality, an illumination box was constructed in which the distances between the lights and the dish, the camera lens and the lights, and the camera lens and the dish are adjusted optimally. Image segmentation is based on a genetic approach that treats segmentation as a global optimization problem. After image pre-processing and image segmentation, the colony analyses are performed. The colony image analysis consists of (1) basic colony parameter measurements; (2) colony size analysis; (3) colony shape analysis; and (4) colony surface measurements. All of these visual colony parameters can be selected and combined to form new engineering parameters, and the colony analysis can be applied to different applications.

  4. Analyser-based x-ray imaging for biomedical research

    NASA Astrophysics Data System (ADS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-12-01

Analyser-based imaging (ABI) is one of several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from laboratory sources to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with x-ray intensity comparable to that of the current synchrotron environment.

  5. Automated coregistration and statistical analyses of SPECT brain images

    SciTech Connect

    Gong, W.; Devous, M.D.

    1994-05-01

Statistical analyses of SPECT image data often require highly accurate image coregistration. Several image coregistration algorithms have been developed. The Pellizari algorithm (PA) uses the Powell technique to estimate transformation parameters between the "head" (model) and "hat" (images to be registered). Image normalization and good initial transformation parameters heavily affect the accuracy and speed of convergence of the PA. We have explored various normalization methods and found a simple technique that avoids most artificial edge effects and minimizes blurring of useful edges. We have tested the effects on accuracy and convergence speed of the PA caused by different initial transformation parameters. From these data, a modified PA was integrated into an automated coregistration system for SPECT brain images on the PRISM 3000S under X Windows. The system yields an accuracy of approximately 2 mm between model and registered images, and employs minimal user intervention through a simple graphic user interface. Data are automatically resliced, normalized and coregistered, with the user choosing only the slice range for inclusion and two initial transformation parameters (under computer-aided guidance). Coregistration is accomplished (converges) in approximately 8 min for a 128 x 128 x 128 set of 2 mm³ voxels. The complete process (editing, reslicing, normalization, coregistration) takes about 20 min. We have also developed automated 3-dimensional parametric images ("t", "z", and subtraction images) from coregistered data sets for statistical analyses. Data are compared against a coregistered normal control group (N = 50) distributed in age and gender for matching against subject samples.
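
    The Powell-based coregistration described above searches for the transformation parameters that best align two images. The idea can be illustrated in one dimension with an exhaustive search over integer shifts minimizing a squared-difference cost; this is a toy stand-in for the full 3-D Powell optimization, not the Pellizari algorithm itself:

```python
def best_shift(ref, mov, max_shift):
    """Return the integer shift s that best aligns mov to ref,
    i.e. minimizes the mean squared difference ref[i] vs mov[i + s]."""
    def cost(s):
        total, count = 0.0, 0
        for i in range(len(ref)):
            j = i + s
            if 0 <= j < len(mov):
                d = ref[i] - mov[j]
                total += d * d
                count += 1
        return total / count
    return min(range(-max_shift, max_shift + 1), key=cost)

# mov is ref translated 3 samples to the right
ref = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0]
mov = [0, 0, 0, 0, 0, 1, 4, 9, 4, 1, 0, 0]
shift = best_shift(ref, mov, 5)
```

    A real registration replaces the exhaustive search with Powell's direction-set method over six rigid-body parameters, and the SSD cost with a similarity measure suited to the modality.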

  6. Cartographic analyses of geographic information available on Google Earth Images

    NASA Astrophysics Data System (ADS)

    Oliveira, J. C.; Ramos, J. R.; Epiphanio, J. C.

    2011-12-01

The purpose was to evaluate the planimetric accuracy of the satellite images available in the Google Earth database. These images cover the vicinity of the Federal University of Viçosa, Minas Gerais, Brazil. The methodology evaluated the geographic information of three groups of images, defined according to the level of detail presented on the screen (zoom). These groups were labeled Zoom 1000 (a single image for the entire study area), Zoom 100 (a mosaic of 73 images) and Zoom 100 with geometric correction (the same mosaic after a geometric correction based on control points). For each group of images, cartographic accuracy was measured based on statistical analyses and the parameters of Brazilian law for planimetric mapping. For this evaluation, 22 points were identified in each group of images, and the coordinates of each point were compared with the field coordinates obtained by GPS (Global Positioning System). Table 1 shows the results for accuracy (based on a threshold equal to 0.5 mm x mapping scale) and tendency (abscissa and ordinate) between the image coordinates and the field coordinates. The geometric correction applied to the Zoom 100 group reduced the trends identified earlier, and the statistical tests indicated that the data are useful for mapping at a scale of 1/5000 with error smaller than 0.5 mm x scale. The analyses demonstrated the quality of the cartographic data provided by Google, as well as the possibility of reducing the positioning divergences present in the data. It can be concluded that it is possible to obtain geographic information from the database available on Google Earth; however, the level of detail (zoom) used at the time of viewing and capturing information on the screen influences the cartographic quality of the mapping.
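
    The 0.5 mm x scale accuracy criterion used above can be checked directly: compute the point errors between image and GPS coordinates, then compare their RMSE against the tolerance implied by the target map scale. A sketch with made-up coordinates (metres):

```python
import math

def planimetric_check(image_pts, field_pts, scale_denominator, tol_mm=0.5):
    """RMSE of planimetric point errors (same units as the coordinates,
    here metres) versus the tolerance tol_mm * map scale."""
    errors = [math.hypot(xi - xf, yi - yf)
              for (xi, yi), (xf, yf) in zip(image_pts, field_pts)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    tolerance_m = tol_mm / 1000.0 * scale_denominator
    return rmse, rmse <= tolerance_m

image = [(100.0, 200.0), (150.0, 250.0), (300.0, 120.0)]
field = [(101.0, 200.5), (149.0, 251.0), (300.5, 119.0)]
rmse, ok = planimetric_check(image, field, 5000)  # 1/5000 -> 2.5 m tolerance
```

    The Brazilian cartographic accuracy standard also tests for systematic tendency in each axis; that step is omitted here for brevity.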
Although cartographic and thematic potential present in the database, it is important to note that both the software

  7. Solid Hydrogen Experiments for Atomic Propellants: Image Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2002-01-01

This paper presents the results of detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium. Solid particles of hydrogen were frozen in liquid helium, and observed with a video camera. The solid hydrogen particle sizes, their agglomerates, and the total mass of hydrogen particles were estimated. Particle sizes of 1.9 to 8 mm (0.075 to 0.315 in.) were measured. The particle agglomerate sizes and areas were measured, and the total mass of solid hydrogen was computed. A total mass of from 0.22 to 7.9 grams of hydrogen was frozen. Compaction and expansion of the agglomerate implied that the particles remain independent particles, and can be separated and controlled. These experiment image analyses are among the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  8. A Guide to Analysing Tongue Motion from Ultrasound Images

    ERIC Educational Resources Information Center

    Stone, Maureen

    2005-01-01

    This paper is meant to be an introduction to and general reference for ultrasound imaging for new and moderately experienced users of the instrument. The paper consists of eight sections. The first explains how ultrasound works, including beam properties, scan types and machine features. The second section discusses image quality, including the…

  9. Surveying and benchmarking techniques to analyse DNA gel fingerprint images.

    PubMed

    Heras, Jónathan; Domínguez, César; Mata, Eloy; Pascual, Vico

    2016-11-01

DNA fingerprinting is a genetic typing technique that allows the analysis of the genomic relatedness between samples and the comparison of DNA patterns. The analysis of DNA gel fingerprint images usually consists of five consecutive steps: image pre-processing, lane segmentation, band detection, normalization and fingerprint comparison. In this article, we first survey the main methods that have been applied in the literature at each of these stages. Second, we focus on lane-segmentation and band-detection algorithms (the steps that usually require user intervention) and identify the seven core algorithms used for both tasks. Subsequently, we present a benchmark that includes a data set of images, the gold standards associated with those images and the tools to measure the performance of lane-segmentation and band-detection algorithms. Finally, we implement the core algorithms used both for lane segmentation and band detection, and evaluate their performance using our benchmark. From that study, we conclude that the average profile algorithm is the best starting point for lane segmentation and band detection.
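
    The average-profile idea that the survey found best for lane segmentation can be sketched simply: average the intensity down each column, then take local minima below a threshold as lane centres. This assumes lanes are darker than background; real gels additionally need smoothing and tilt correction:

```python
def lane_centres(image, threshold):
    """image: list of rows of grey values, with lanes darker than the
    background. Returns column indices that are local minima of the
    column-average profile and fall below threshold."""
    ncols = len(image[0])
    profile = [sum(row[c] for row in image) / len(image) for c in range(ncols)]
    centres = []
    for c in range(1, ncols - 1):
        if profile[c] < threshold and profile[c - 1] > profile[c] < profile[c + 1]:
            centres.append(c)
    return centres

# Synthetic 4-row gel with dark lanes centred on columns 3 and 7
row = [200, 200, 120, 50, 120, 200, 120, 50, 120, 200]
gel = [row, row, row, row]
centres = lane_centres(gel, 100)
```

    Band detection applies the same profile idea per lane, along the orthogonal axis.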

  10. The challenges of analysing blood stains with hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Kuula, J.; Puupponen, H.-H.; Rinta, H.; Pölönen, I.

    2014-06-01

Hyperspectral imaging is a potential noninvasive technology for detecting, separating and identifying various substances. In forensic and military medicine and other CBRNE-related uses it could be a method for analyzing blood and for scanning other human-derived fluids. For example, it would be valuable to easily detect whether traces of blood are from one or more persons, or whether there are irrelevant substances or anomalies in the blood. This article presents an experiment in separating four persons' blood stains on a white cotton fabric with a SWIR hyperspectral camera and an FT-NIR spectrometer. Each tested sample contains a standardized 75 µl of 100% blood. The results suggest that, on the basis of the amount of erythrocytes in the blood, different people's blood might be separable by hyperspectral analysis; and, given the indication provided by erythrocytes, it might also be possible to find other traces in the blood. However, these assumptions need to be verified with wider tests, as the number of samples in the study was small. The study also suggests that several biological, chemical and physical factors affect, alone and together, the hyperspectral analysis of blood on fabric textures; these factors need to be considered before drawing further conclusions on the analysis of blood on various materials.
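
    A standard way to compare reflectance spectra of the kind acquired here is the spectral angle, which is insensitive to overall brightness differences between stains. A minimal sketch (a common hyperspectral similarity measure, not necessarily the method used in this study):

```python
import math

def spectral_angle(a, b):
    """Angle in radians between two spectra; 0 means identical spectral
    shape regardless of overall intensity scaling."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

same_shape = spectral_angle([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])  # scaled copy
different = spectral_angle([1.0, 0.0], [0.0, 1.0])             # orthogonal
```

    Small angles between pixel spectra and a reference blood spectrum would flag pixels as blood; distinguishing donors then rests on finer spectral differences, as the abstract's erythrocyte observation suggests.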

  11. Integrating Medical Imaging Analyses through a High-throughput Bundled Resource Imaging System

    PubMed Central

    Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.

    2011-01-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data providence, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists. PMID:21841899

  12. Advances in automated 3-D image analyses of cell populations imaged by confocal microscopy.

    PubMed

    Ancin, H; Roysam, B; Dufresne, T E; Chestnut, M M; Ridder, G M; Szarowski, D H; Turner, J N

    1996-11-01

Automated three-dimensional (3-D) image analysis methods are presented for rapid and effective analysis of populations of fluorescently labeled cells or nuclei in thick tissue sections that have been imaged three dimensionally using a confocal microscope. The methods presented here greatly improve upon our earlier work (Roysam et al.: J Microsc 173:115-126, 1994). The principal advances reported are: algorithms for efficient data pre-processing and adaptive segmentation, effective handling of image anisotropy, and fast 3-D morphological algorithms for separating overlapping or connected clusters utilizing image gradient information whenever available. A particular feature of this method is its ability to separate densely packed and connected clusters of cell nuclei. Some of the challenges overcome in this work include the efficient and effective handling of imaging noise, anisotropy, and large variations in image parameters such as intensity, object size, and shape. The method is able to handle significant inter-cell, intra-cell, inter-image, and intra-image variations. Studies indicate that this method is rapid, robust, and adaptable. Examples are presented to illustrate the applicability of this approach to analyzing images of nuclei from densely packed regions in thick sections of rat liver and brain that were labeled with a fluorescent Schiff reagent.

  13. Analyses of S-Box in Image Encryption Applications Based on Fuzzy Decision Making Criterion

    NASA Astrophysics Data System (ADS)

    Rehman, Inayatur; Shah, Tariq; Hussain, Iqtadar

    2014-06-01

In this manuscript, we put forward a standard based on a fuzzy decision-making criterion to examine current substitution boxes and study their strengths and weaknesses in order to decide their appropriateness for image encryption applications. The proposed standard utilizes the results of correlation analysis, entropy analysis, contrast analysis, homogeneity analysis, energy analysis, and mean of absolute deviation analysis. These analyses are applied to well-known substitution boxes. The outcomes of these analyses are then examined, and a fuzzy soft set decision-making criterion is used to decide the suitability of an S-box for image encryption applications.
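
    Of the criteria listed, entropy analysis is the easiest to make concrete: the Shannon entropy of the cipher-image grey-level histogram, which for a well-performing S-box should approach 8 bits on 8-bit images. A sketch (the contrast, homogeneity and energy measures follow a similar histogram/co-occurrence pattern):

```python
import math

def shannon_entropy(pixels):
    """Shannon entropy (bits) of the grey-level histogram of a flat
    list of pixel values."""
    counts = {}
    for p in pixels:
        counts[p] = counts.get(p, 0) + 1
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

uniform = shannon_entropy(list(range(256)) * 4)  # every level equally likely
flat = shannon_entropy([7] * 1024)               # constant image
```

    An encrypted image whose entropy is close to the 8-bit maximum leaks little statistical information about the plaintext image.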

  14. Quantitative analysis of x-ray images with a television image analyser.

    PubMed

    Schleicher, A; Tillmann, B; Zilles, K

    1980-07-01

    A method for the quantitative evaluation of X-rays is described. The image is decomposed into individual image points by a mechanical scanning procedure, and at each image point the area fraction of a measuring field not covered by silver grains is determined with an image analyzer. This parameter is interpreted as representing a value corresponding to a specific degree of film blackness. The relationship between the measured value and the X-ray absorption is described by standard curves. With the aid of an aluminum scale, the measured value can be expressed directly by the thickness of an aluminum equivalent with a corresponding X-ray absorption. Details about the adjustment of the image analyzer for detecting the silver grains, the resolution of different degrees of X-ray absorption, as well as the computer-controlled scanning procedure are described. An example demonstrates its applicability to analyze the density distribution of bony tissue around the human humero-ulnar joint. The procedure is not limited to the evaluation of X-rays, but is applicable whenever silver grains can be detected in a film layer by an image analyzer.
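
    The aluminum-equivalent step described above amounts to interpolating a measured blackness value on the standard curve built from the aluminum scale. A sketch with an invented calibration table (the pairs below are illustrative, not measured values):

```python
def aluminium_equivalent(value, curve):
    """Linearly interpolate a measured value on a calibration curve,
    given as (measured_value, al_thickness) pairs sorted by measured
    value. Returns the equivalent aluminium thickness."""
    for (v0, t0), (v1, t1) in zip(curve, curve[1:]):
        if v0 <= value <= v1:
            return t0 + (t1 - t0) * (value - v0) / (v1 - v0)
    raise ValueError("measured value outside the calibration range")

calibration = [(0.0, 0.0), (10.0, 1.0), (20.0, 3.0)]  # invented pairs
thickness = aluminium_equivalent(15.0, calibration)   # halfway up last segment
```

    Repeating this per measuring field converts the scanned blackness map into a map of aluminum-equivalent thickness, i.e. relative X-ray absorption.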

  15. Analysing the Image Building Effects of TV Advertisements Using Internet Community Data

    NASA Astrophysics Data System (ADS)

    Uehara, Hiroshi; Sato, Tadahiko; Yoshida, Kenichi

This paper proposes a method to measure the effects of TV advertisements using Internet bulletin boards. It aims to clarify how viewers' interest in TV advertisements is reflected in their images of the promoted products. Two kinds of time series data are generated by the proposed method. The first represents the time series fluctuation of interest in the TV advertisements; the second represents the time series fluctuation of the images of the products. By analysing the correlations between these two time series, we try to clarify the implicit relationship between viewers' interest in a TV advertisement and their images of the promoted products. By applying the proposed method to an Internet bulletin board that deals with a certain cosmetic brand, we show that the images of the products vary depending on differences in the interest in each TV advertisement.
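
    The core of such a method is correlating the two time series, possibly at a lag if product image responds after advertisement interest. A minimal sketch of lagged Pearson correlation (illustrative data, not from the paper):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def lagged_correlation(x, y, lag):
    """Correlate x[t] with y[t + lag]; positive lag means y lags x."""
    if lag >= 0:
        return pearson(x[:len(x) - lag], y[lag:])
    return lagged_correlation(y, x, -lag)

interest = [1.0, 3.0, 7.0, 4.0, 2.0, 1.0]
image = [0.0, 1.0, 3.0, 7.0, 4.0, 2.0]  # same pattern, one step later
best = lagged_correlation(interest, image, 1)
```

    Scanning over lags and comparing the resulting correlations is one way to test whether interest in an advertisement precedes changes in product image.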

  16. Use of Image Analyses Techniques To Quantify Rock Morphological Characteristics of Lava Flows By Fms Logs.

    NASA Astrophysics Data System (ADS)

    Pechnig, R.; Ramani, S.; Bartetzko, A.; Clauser, C.

Borehole wall images obtained from downhole measurements are mostly used for structural analyses (picking of fractures, foliation, layering) and qualitative descriptions of geological features. Qualitative results are difficult to compare with petrophysical data sets, either from laboratory measurements or from logging. We report on an application of image analysis techniques to FMS (Formation Micro Scanner) data in order to select and quantify rock morphological characteristics. We selected image logs from subaerial basalts drilled during Leg 183 in the Kerguelen Large Igneous Province Plateau. The subaerial basalts penetrated in Hole 1137 show significant morphological features, such as vesicles and fractures of different size, shape, distribution, and orientation. The excellent core recovery in this hole and the high quality of the standard and FMS logs provide a good opportunity to test the usefulness of image analysis in such rock types. We used the Zeiss K4000 software system for image analysis. Selection was performed by color scale definitions, where darker colors are associated with electrically conductive rock elements, in this case fluid- or clay-mineral-filled voids, vesicles and fractures. Besides the selection of these morphological features, the technique also allows aspect ratios to be calculated for the selected elements and permits discrimination between vesicles and fractures. In this way, the qualitative information of the FMS logs was transferred to quantitative log curves, which may be used as input for statistical log processing. In our case, we used the information to relate rock morphological characteristics to the seismic properties of the drilled rocks.

  17. Solid Hydrogen Experiments for Atomic Propellants: Particle Formation Energy and Imaging Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2002-01-01

This paper presents particle formation energy balances and detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium during the Phase II testing in 2001. Solid particles of hydrogen were frozen in liquid helium and observed with a video camera. The solid hydrogen particle sizes and the total mass of hydrogen particles were estimated. The particle formation efficiency is also estimated. Particle sizes from the Phase I testing in 1999 and the Phase II testing in 2001 were similar. Though the 2001 testing created similar particle sizes, many new particle formation phenomena were observed. These experiment image analyses are among the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  18. Solid Hydrogen Experiments for Atomic Propellants: Particle Formation, Imaging, Observations, and Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2005-01-01

This report presents particle formation observations and detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium. Hydrogen was frozen into particles in liquid helium, and observed with a video camera. The solid hydrogen particle sizes and the total mass of hydrogen particles were estimated. These newly analyzed data are from the test series held on February 28, 2001. Particle sizes from previous testing in 1999 and the testing in 2001 were similar. Though the 2001 testing created similar particle sizes, many new particle formation phenomena were observed: microparticles and delayed particle formation. These experiment image analyses are some of the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  19. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data

    PubMed Central

    Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan

    2015-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393
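
    The kind of decoding analysis TDT automates boils down to: split pattern vectors into train/test folds, fit a classifier, and score predictions on held-out data. A self-contained toy (a nearest-class-mean classifier with leave-one-out cross-validation; TDT itself wraps far more capable classifiers):

```python
def loocv_accuracy(patterns, labels):
    """Leave-one-out cross-validated accuracy of a nearest-class-mean
    classifier on a list of feature vectors with class labels."""
    correct = 0
    for i in range(len(patterns)):
        train = [(p, l) for j, (p, l) in enumerate(zip(patterns, labels))
                 if j != i]
        means = {}
        for lab in set(l for _, l in train):
            pts = [p for p, l in train if l == lab]
            means[lab] = [sum(dim) / len(pts) for dim in zip(*pts)]
        # classify the held-out pattern by nearest class mean
        pred = min(means, key=lambda lab: sum(
            (a - b) ** 2 for a, b in zip(patterns[i], means[lab])))
        correct += (pred == labels[i])
    return correct / len(patterns)

# Two well-separated "conditions" in a 2-voxel pattern space
patterns = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
            (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
labels = ["A", "A", "A", "B", "B", "B"]
acc = loocv_accuracy(patterns, labels)
```

    Whole-brain, ROI, and searchlight analyses differ only in which voxels contribute to each pattern vector; the cross-validation loop is the same.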

  20. Computer-based image-analyses of laminated shales, carboniferous of the Midcontinent and surrounding areas

    SciTech Connect

    Archer, A.W. . Dept. of Geology)

    1993-02-01

Computerized image analyses of petrographic data can greatly facilitate the quantification of detailed descriptions and analyses of fine-scale fabric, or petrofabric. In thinly laminated rocks, manual measurement of successive lamina thicknesses is very time consuming, especially when applied to thick, cored sequences. In particular, images of core materials can be digitized and the resulting image processed as a large matrix. Using such techniques, it is relatively easy to automate continuous measurements of lamina thickness and lateral continuity. This type of analysis has been applied to a variety of Carboniferous strata, particularly those siliciclastics that occur within the 'outside shale' portions of Kansas cyclothems. Of the various sedimentological processes capable of producing such non-random thickness variations, a model invoking tidal processes appears to be particularly robust. Tidal sedimentation could not only have resulted in the deposition of individual laminae; in addition, tidal-height variations during various phases of the lunar orbit can serve to explain the systematic variations. Comparison of these Carboniferous shales with similar laminations formed in modern high tidal-range environments indicates many similarities. These modern analogs include the Bay of Fundy in Canada and the Bay of Mont-Saint-Michel in France. Lamina-thickness variations, in specific cases, can be correlated with known tidal periodicities. In addition, in some samples, details of the tidal regime can be interpreted, such as the nature of the tidal system (i.e., diurnal or semidiurnal), and some indicators of tidal range can be ascertained based upon modern analogs.
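
    Automating lamina-thickness measurement as described reduces to run-length extraction along a digitized intensity profile: threshold the profile into dark/light states and count consecutive samples in each state. A sketch (real core images also need despiking and calibration from samples to millimetres):

```python
def lamina_thicknesses(profile, threshold):
    """Run lengths (in samples) of alternating dark/light laminae along
    a 1-D intensity profile, split at crossings of threshold."""
    runs = [1]
    for prev, cur in zip(profile, profile[1:]):
        if (prev < threshold) == (cur < threshold):
            runs[-1] += 1      # still within the same lamina
        else:
            runs.append(1)     # crossed the threshold: new lamina
    return runs

# dark(3 samples) / light(2) / dark(4) / light(1)
profile = [10, 12, 11, 90, 95, 12, 10, 11, 13, 92]
runs = lamina_thicknesses(profile, 50)
```

    The resulting thickness series can then be examined for the periodic (e.g. tidal) components discussed in the abstract.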

  1. An accessible, scalable ecosystem for enabling and sharing diverse mass spectrometry imaging analyses.

    PubMed

    Fischer, Curt R; Ruebel, Oliver; Bowen, Benjamin P

    2016-01-01

Mass spectrometry imaging (MSI) is used in an increasing number of biological applications. Typical MSI datasets contain unique, high-resolution mass spectra from tens of thousands of spatial locations, resulting in raw data sizes of tens of gigabytes per sample. In this paper, we review technical progress that is enabling new biological applications and driving an increase in the complexity and size of MSI data. Handling such data often requires specialized computational infrastructure, software, and expertise. OpenMSI, our recently described platform, makes it easy to explore and share MSI datasets via the web, even when they are larger than 50 GB. Here we describe the integration of OpenMSI with IPython notebooks for transparent, sharable, and replicable MSI research. An advantage of this approach is that users do not have to share raw data along with analyses; instead, data are retrieved via OpenMSI's web API. The IPython notebook interface provides a low-barrier entry point for data manipulation that is accessible to scientists without extensive computational training. Via these notebooks, analyses can be easily shared without requiring any data movement. We provide example notebooks for several common MSI analysis types, including data normalization, plotting, clustering, classification, and image registration.
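
    Of the example analyses mentioned, normalization is the simplest to illustrate: total-ion-current (TIC) normalization rescales each spectrum so its summed intensity matches the dataset mean, removing per-pixel acquisition differences. A generic sketch (not OpenMSI's actual API):

```python
def tic_normalise(spectra):
    """Scale each spectrum so its total ion current equals the mean TIC
    across the dataset. spectra: list of equal-length intensity lists."""
    tics = [sum(s) for s in spectra]
    mean_tic = sum(tics) / len(tics)
    return [[x * mean_tic / t for x in s] for s, t in zip(spectra, tics)]

spectra = [[1.0, 2.0, 3.0], [10.0, 10.0, 10.0]]  # TICs 6 and 30, mean 18
normalised = tic_normalise(spectra)
```

    In the notebook workflow described above, the spectra would be fetched through OpenMSI's web API rather than loaded from local raw files.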

  2. Partial correlation analyses of global diffusion tensor imaging-derived metrics in glioblastoma multiforme: Pilot study

    PubMed Central

    Cortez-Conradis, David; Rios, Camilo; Moreno-Jimenez, Sergio; Roldan-Valadez, Ernesto

    2015-01-01

AIM: To determine existing correlates among diffusion tensor imaging (DTI)-derived metrics in healthy brains and brains with glioblastoma multiforme (GBM). METHODS: Case-control study using DTI data from brain magnetic resonance imaging of 34 controls (mean, 41.47; SD, ± 21.94 years; range, 21-80 years) and 27 patients with GBM (mean, 48.41; SD, ± 15.18 years; range, 18-78 years). Image postprocessing using FSL software calculated eleven tensor metrics: fractional (FA) and relative anisotropy; pure isotropic (p) and anisotropic diffusion (q); total magnitude of diffusion (L); linear (Cl), planar (Cp) and spherical tensors (Cs); mean (MD), axial (AD) and radial diffusivities (RD). Partial correlation analyses (controlling for the effects of age and gender) and multivariate MANCOVA were performed. RESULTS: All metrics were normally distributed. Comparing healthy brains vs brains with GBM, significant very strong bivariate correlations were depicted only in GBM: [FA↔Cl (+)], [FA↔q (+)], [p↔AD (+)], [AD↔MD (+)], and [MD↔RD (+)]. Among 56 pairs of bivariate correlations, only seven were significantly different. The diagnosis variable showed a main effect [F-value (11, 23) = 11.842, P ≤ 0.001] with partial eta squared = 0.850, a large effect size. Age also had a significant influence as a covariate [F (11, 23) = 10.523, P < 0.001], with a large effect size (partial eta squared = 0.834). CONCLUSION: DTI-derived metrics show significant differences between healthy brains and brains with GBM, with specific magnitudes and correlations. This study provides reference data and contributes to decreasing the underlying empiricism in the use of DTI parameters in brain imaging. PMID:26644826
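
    Partial correlation of two metrics controlling for a covariate such as age can be computed by regressing the covariate out of each metric and correlating the residuals. A sketch for a single covariate (the study controlled both age and gender; the data below are constructed, not from the paper):

```python
import math

def _pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def _residuals(v, z):
    """Residuals of v after simple linear regression on covariate z."""
    n = len(v)
    mz, mv = sum(z) / n, sum(v) / n
    beta = (sum((a - mz) * (b - mv) for a, b in zip(z, v))
            / sum((a - mz) ** 2 for a in z))
    return [b - (mv + beta * (a - mz)) for a, b in zip(z, v)]

def partial_correlation(x, y, z):
    """Correlation of x and y with the linear effect of z removed."""
    return _pearson(_residuals(x, z), _residuals(y, z))

# Both metrics track age (z); once age is removed they anti-correlate.
z = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
e = [1.0, -1.0, -1.0, 1.0, 1.0, -1.0, -1.0, 1.0]  # orthogonal to z
x = [zi + ei for zi, ei in zip(z, e)]
y = [zi - ei for zi, ei in zip(z, e)]
raw = _pearson(x, y)                     # positive: both driven by age
partial = partial_correlation(x, y, z)   # negative once age is removed
```

    This residual construction is why the study's partial correlations can differ in sign and strength from the raw bivariate correlations.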

  3. Mosquito Larval Habitats, Land Use, and Potential Malaria Risk in Northern Belize from Satellite Image Analyses

    NASA Technical Reports Server (NTRS)

    Pope, Kevin; Masuoka, Penny; Rejmankova, Eliska; Grieco, John; Johnson, Sarah; Roberts, Donald

    2004-01-01

    The distribution of Anopheles mosquito habitats and land use in northern Belize is examined with satellite data. A land cover classification based on multispectral SPOT and multitemporal Radarsat images identified eleven land cover classes, including agricultural, forest, and marsh types. Two of the land cover types, Typha domingensis marsh and flooded forest, are Anopheles vestitipennis larval habitats. Eleocharis spp. marsh is the larval habitat for Anopheles albimanus. Geographic Information Systems (GIS) analyses of land cover demonstrate that the amount of Typha domingensis in a marsh is positively correlated with the amount of agricultural land in the adjacent upland, and negatively correlated with the amount of adjacent forest. This finding is consistent with the hypothesis that nutrient (phosphorus) runoff from agricultural lands is causing an expansion of Typha domingensis in northern Belize. This expansion of Anopheles vestitipennis larval habitat may in turn cause an increase in malaria risk in the region.

  4. Development, Capabilities, and Impact on Wind Analyses of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Miller, T.; Amarin, R.; Atlas, R.; Bailey, M.; Black, P.; Buckley, C.; Chen, S.; El-Nimri, S.; Hood, R.; James, M.; Johnson, J.; Jones, W.; Ruf, C.; Simmons, D.; Uhlhorn, E.; Inglish, C.

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. The instrument is being test flown in January and is expected to participate in the tropical cyclone experiment GRIP (Genesis and Rapid Intensification Processes) in the 2010 season. HIRAD is being designed to study the wind field in some detail within strong hurricanes and to enhance the real-time airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track at a single point directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approximately 3 x the aircraft altitude) with approximately 2 km resolution. This paper describes the HIRAD instrument and the physical basis for its operations, including chamber test data from the instrument. The potential value of future HIRAD observations will be illustrated with a summary of Observing System Simulation Experiments (OSSEs) in which measurements from the new instrument as well as those from existing instruments (air, surface, and space-based) are simulated from the output of a detailed numerical model, and those results are used to construct simulated H*Wind analyses. 
Evaluations will be presented on the impact on H*Wind analyses of using the HIRAD instrument observations to replace those of the SFMR instrument, and also on the impact of a future satellite-based HIRAD in comparison to instruments with more limited capabilities for observing strong winds through heavy rain.

  5. Correlative Imaging and Analyses of Soil Organic Matter Stabilization in the Rhizosphere

    NASA Astrophysics Data System (ADS)

    Dohnalkova, Alice; Tfaily, Malak; Chu, Rosalie; Crump, Alex; Brislawn, Colin; Varga, Tamas; Chrisler, William

    2016-04-01

    Understanding the dynamics of carbon (C) pools in soil systems is a critical area for mitigating atmospheric carbon dioxide levels and maintaining healthy soils. Although microbial contributions to stable soil carbon pools have often been regarded as low to negligible, we present evidence that microbes may play a far greater role in the stabilization of soil organic matter (SOM), thus contributing to soil organic matter pools with longer residence times. The rhizosphere, a zone immediately surrounding the plant roots, represents a geochemical hotspot with high microbial activity and profuse SOM production. In particular, microbially secreted extracellular polymeric substances (EPS) are a remarkably dynamic entity that plays a critical role in numerous soil processes, including mineral weathering. We approach the interface of soil minerals and microbes with a focus on organic C stabilization mechanisms. We use a suite of high-resolution imaging and analytical methods (confocal, scanning and transmission electron microscopy, Fourier transform ion cyclotron resonance mass spectrometry, DNA sequencing and X-ray diffraction) to study the living and non-living rhizosphere components. Our goal is to elucidate a pathway for the formation, storage, transformation and protection of persistent microbially-produced carbon in soils. Based on our multimodal analytical approach, we propose that persistent microbial necromass in soils accounts for considerably more soil carbon than previously estimated.

  6. [Development of automatic analyses for star-shot images using computed radiography (CR)].

    PubMed

    Kawata, Hidemichi; Ohkura, Sunao; Ono, Hiroshi; Fukudome, Yoshifumi; Kawamura, Seiji; Hayabuchi, Naofumi

    2006-12-20

    Recent progress in radiation therapy has been greatly enhanced in many facilities by the development of new treatment machines, improved computer technology for radiotherapy treatment planning systems (RTPs), and the increased accuracy of radiation therapy techniques such as stereotactic irradiation and intensity-modulated radiation therapy (IMRT). Quality control (QC) of the isocenter, which is defined by the gantry rotation and the limits of the radiation field, is important for greater accuracy of these radiation therapy technologies. Star-shot analyses using computed radiography (CR) for evaluation of the isocenter were employed in this study. Devices to support CR were created, and a method of automatically analyzing images obtained by the star-shot technique, which calculates the error (distance) from the isocenter and the incident beam angle, was developed. In terms of accuracy, the average maximum error of our method was 0.33 mm (less than 2 pixels: 0.35 mm); the average absolute error and the incident beam angle error were at most 0.3 mm and 0.4 degrees at one standard deviation (SD), respectively. The processing times were 16 sec at minimum, 152 sec at maximum, 18 sec most frequently, and 23.6 sec on average. In conclusion, our newly developed method for analyzing star-shot images using CR enables immediate, quantitative evaluation of the isocenter.

  7. Core Formation in Planetesimals: Textural Analyses From 3D Synchrotron Imaging and Complex Systems Modeling

    NASA Astrophysics Data System (ADS)

    Rushmer, T. A.; Tordesillas, A.; Walker, D. M.; Parkinson, D. Y.; Clark, S. M.

    2012-12-01

    Recent scenarios of core formation in planetesimals, based on calculations from planetary dynamicists and on extinct radionuclides (e.g. 26Al, 60Fe), call for segregation of a metal liquid (core) from both solid silicate and a partially molten silicate matrix - a silicate mush. These scenarios require segregation of metallic melt along fracture networks, or the growth of molten core material into blebs large enough to overcome the strength of the mush matrix. Such segregation usually involves high strain rates so that separation can occur, in agreement with the accretion model of planetary growth. Experimental work has suggested that deformation and shear can help develop fracture networks and coalesce metallic blebs. Here, we have developed an innovative approach that currently combines 2D textures from deformation experiments on a partially molten natural meteorite with complex network analyses. 3D textural data from experimental samples, deformed at high strain rates with or without silicate melts present, have been obtained by synchrotron-based high-resolution hard x-ray microtomography imaging. A series of two-dimensional images is collected as the sample is rotated, and tomographic reconstruction yields the full 3D representation of the sample. Virtual slices through the 3D object in any arbitrary direction can be visualized, or the full data set can be visualized by volume rendering. More importantly, automated image filtering and segmentation allow the extraction of boundaries between the various phases. The volumes, shapes, and distributions of each phase, and the connectivity between them, can then be quantitatively analysed, and these results can be compared to models. We are currently using these new visual data sets to augment our 2D data, and the results will be included in our current complex-system analytical approach.
This integrated method can elucidate and quantify the growth of metallic blebs in regions where

  8. Image analyses in bauxitic ores: The case of the Apulian karst bauxites

    NASA Astrophysics Data System (ADS)

    Buccione, Roberto; Sinisi, Rosa; Mongelli, Giovanni

    2015-04-01

    This study concerns two different karst bauxite deposits of the Apulia region (southern Italy). These deposits outcrop in the Murge and Salento areas: the Murge bauxite (upper Cretaceous) is a typical canyon-like deposit formed in a karst depression, whereas the Salento bauxite (upper Eocene - Oligocene) is the result of the erosion, remobilization and transport of older bauxitic material from a relatively distant area. This particular arrangement gave its name to similar bauxite deposits, which are thus called Salento-type deposits. The bauxite's texture essentially consists of sub-circular concentric aggregates, called ooids, dispersed in a pelitic matrix. The textural properties of the two bauxitic ores, as assessed by SEM-EDX, are different. In the bauxite from the canyon-like deposit the ooids/matrix ratio is higher than in the Salento-type bauxite. Furthermore, the ooids in the Salento-type bauxite are usually made of a large core surrounded by a narrow, single accretion layer, whereas the ooids from the canyon-like deposit have a smaller core surrounded by several alternating layers of Al-hematite and boehmite (Mongelli et al., 2014). In order to explore in more detail the textural features of both bauxite deposits, particle shape analyses were performed. Image analyses and the fractal dimension have been widely used in geological studies, including economic geology (e.g. Turcotte, 1986; Meakin, 1991; Deng et al., 2011). The geometric properties evaluated are the number of ooids, average ooid size, ooid roundness and the fractal dimension D, which depends on the ooids/matrix ratio. D is the slope of the line obtained using a particular counting technique on each sample image. The fractal dimension is slightly lower for the Salento-type bauxites. Since the process which led to the formation of the ooids is related to an aggregation growth involving chemical fractionation (Mongelli, 2002), a correlation among these parameters and the contents of major
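
    The abstract does not specify which counting technique was used to obtain D, so the following is only an illustrative sketch of one standard estimator, box counting on a binary ooids/matrix mask: D is the slope of log N(s) versus log(1/s), where N(s) is the number of boxes of side s that contain foreground pixels.

```python
import numpy as np

def box_counting_dimension(binary, sizes=(2, 4, 8, 16, 32)):
    """Estimate a fractal dimension D as the slope of log N(s) vs log(1/s),
    where N(s) counts boxes of side s containing any foreground pixel."""
    counts = []
    h, w = binary.shape
    for s in sizes:
        n = 0
        for i in range(0, h, s):          # tile the image with s x s boxes
            for j in range(0, w, s):
                if binary[i:i+s, j:j+s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Hypothetical binary mask: ooids (True) in a pelitic matrix (False).
# A filled square is area-like, so its dimension is close to 2.
img = np.zeros((128, 128), dtype=bool)
img[32:96, 32:96] = True
print(round(box_counting_dimension(img), 2))   # → 2.0
```

    A sparser, more fragmented ooid field yields a lower slope, which matches the abstract's observation that D tracks the ooids/matrix ratio.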

  9. Short wave infrared chemical imaging as future tool for analysing gunshot residues patterns in targets.

    PubMed

    Ortega-Ojeda, F E; Torre-Roldán, M; García-Ruiz, C

    2017-05-15

    This work used chemical imaging in the short-wave infrared region for analysing gunshot residue (GSR) patterns in cotton fabric targets shot with conventional and non-toxic ammunition. It presents a non-destructive, non-toxic, highly visual and hyperspectral-based approach. The method was based on classical least squares regression and was tested with the ammunition propellants and their standard components' spectra. The propellants' spectra were satisfactorily used (R(2) > 0.966, and CorrCoef > 0.982) for identifying the GSR irrespective of the type of ammunition used for the shooting. In a more versatile approach, nitrocellulose, the main component of the ammunition propellants, proved an excellent standard for identifying GSR patterns (R(2) > 0.842, and CorrCoef > 0.908). In this case, the propellants' stabilizers (diphenylamine and centralite) and their nitrated derivatives, as well as dinitrotoluene, also showed high spectral activity. Therefore, they could be recommended as complementary standards for confirming the GSR identification. These findings establish the proof of concept for science-based evidence useful to support expert reports and final court rulings. This approach for obtaining GSR patterns can be an excellent alternative to the current traditional chemical methods, which are based on presumptive and invasive colour tests.
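
    Classical least squares (CLS) regression, as named above, models each pixel spectrum as a linear combination of reference spectra and scores the fit with R². A minimal sketch with synthetic spectra; the band positions, widths, and mixing coefficients below are invented for illustration, not taken from the study:

```python
import numpy as np

def cls_fit(pixel_spectrum, references):
    """Classical least squares: fit the pixel spectrum as a linear
    combination of reference spectra; return coefficients and R^2."""
    S = np.asarray(references).T                  # bands x components
    coeffs, *_ = np.linalg.lstsq(S, pixel_spectrum, rcond=None)
    fitted = S @ coeffs
    ss_res = np.sum((pixel_spectrum - fitted) ** 2)
    ss_tot = np.sum((pixel_spectrum - pixel_spectrum.mean()) ** 2)
    return coeffs, 1.0 - ss_res / ss_tot

# Hypothetical reference spectra standing in for nitrocellulose and
# diphenylamine: Gaussian absorption features on a SWIR wavelength grid.
bands = np.linspace(1000, 2500, 200)              # nm
nc  = np.exp(-((bands - 1660) / 80) ** 2)
dpa = np.exp(-((bands - 2140) / 60) ** 2)
pixel = 0.7 * nc + 0.2 * dpa + np.random.default_rng(1).normal(0, 0.005, 200)
coeffs, r2 = cls_fit(pixel, [nc, dpa])
print(coeffs.round(2), round(r2, 3))
```

    A high R² against the nitrocellulose reference is what flags a pixel as GSR-bearing in the approach described above.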

  10. Molecular cytogenetic analysis of human blastocysts and cytotrophoblasts by multi-color FISH and Spectral Imaging analyses

    SciTech Connect

    Weier, Jingly F.; Ferlatte, Christy; Baumgartner, Adolf; Jung, Christine J.; Nguyen, Ha-Nam; Chu, Lisa W.; Pedersen, Roger A.; Fisher, Susan J.; Weier, Heinz-Ulrich G.

    2006-02-08

    Numerical chromosome aberrations in gametes typically lead to failed fertilization, spontaneous abortion or a chromosomally abnormal fetus. By means of preimplantation genetic diagnosis (PGD), we now can screen human embryos in vitro for aneuploidy before transferring the embryos to the uterus. PGD allows us to select unaffected embryos for transfer and increases the implantation rate in in vitro fertilization programs. Molecular cytogenetic analyses using multi-color fluorescence in situ hybridization (FISH) of blastomeres have become the major tool for preimplantation genetic screening of aneuploidy. However, current FISH technology can test for only a small number of chromosome abnormalities and has hitherto failed to increase pregnancy rates as expected. We are in the process of developing technologies to score all 24 chromosomes in single cells within a 3 day time limit, which we believe is vital to the clinical setting. Also, human placental cytotrophoblasts (CTBs) at the fetal-maternal interface acquire aneuploidies as they differentiate to an invasive phenotype. About 20-50% of invasive CTB cells from uncomplicated pregnancies were found to be aneuploid, suggesting that the acquisition of aneuploidy is an important component of normal placentation, perhaps limiting the proliferative and invasive potential of CTBs. Since most invasive CTBs are interphase cells and possess extreme heterogeneity, we applied multi-color FISH and repeated hybridizations to investigate individual CTBs. In summary, this study demonstrates the strength of Spectral Imaging analysis and repeated hybridizations, which provides a basis for full karyotype analysis of single interphase cells.

  11. Systematic Comparison of Brain Imaging Meta-Analyses of ToM with vPT

    PubMed Central

    Schurz, Matthias; Perner, Josef

    2017-01-01

    In visual perspective taking (vPT) one has to concern oneself with what other people see and how they see it. Since seeing is a mental state, developmental studies have discussed vPT within the domain of “theory of mind (ToM)” but imaging studies have not treated it as such. Based on earlier results from several meta-analyses, we tested for the overlap of visual perspective taking studies with 6 different kinds of ToM studies: false belief, trait judgments, strategic games, social animations, mind in the eyes, and rational actions. Joint activation was observed between the vPT task and some kinds of ToM tasks in regions involving the left temporoparietal junction (TPJ), anterior precuneus, left middle occipital gyrus/extrastriate body area (EBA), and the left inferior frontal and precentral gyrus. Importantly, no overlap activation was found for the vPT tasks with the joint core of all six kinds of ToM tasks. This raises the important question of what the common denominator of all tasks that fall under the label of “theory of mind” is supposed to be if visual perspective taking is not one of them. PMID:28367446

  12. Systematic Comparison of Brain Imaging Meta-Analyses of ToM with vPT.

    PubMed

    Arora, Aditi; Schurz, Matthias; Perner, Josef

    2017-01-01

    In visual perspective taking (vPT) one has to concern oneself with what other people see and how they see it. Since seeing is a mental state, developmental studies have discussed vPT within the domain of "theory of mind (ToM)" but imaging studies have not treated it as such. Based on earlier results from several meta-analyses, we tested for the overlap of visual perspective taking studies with 6 different kinds of ToM studies: false belief, trait judgments, strategic games, social animations, mind in the eyes, and rational actions. Joint activation was observed between the vPT task and some kinds of ToM tasks in regions involving the left temporoparietal junction (TPJ), anterior precuneus, left middle occipital gyrus/extrastriate body area (EBA), and the left inferior frontal and precentral gyrus. Importantly, no overlap activation was found for the vPT tasks with the joint core of all six kinds of ToM tasks. This raises the important question of what the common denominator of all tasks that fall under the label of "theory of mind" is supposed to be if visual perspective taking is not one of them.

  13. Continuous Measurements of Eyeball Area and Their Spectrum Analyses -- Toward the Quantification of Rest Rhythm of Horses by Image Processing

    DTIC Science & Technology

    2007-11-02

    analyses of electroencephalogram at half-closed eye and fully closed eye. This study aimed at quantitatively estimating the rest rhythm of horses by the...analyses of eyeball movement. A mask fitted with a miniature CCD camera was newly developed. The continuous images of the horse eye for about 24...eyeball area were calculated. As for the results, the fluctuating status of the eyeball area was analyzed quantitatively, and the rest rhythm of horses was

  14. X-ray digital imaging petrography of lunar mare soils: modal analyses of minerals and glasses

    NASA Technical Reports Server (NTRS)

    Taylor, L. A.; Patchen, A.; Taylor, D. H.; Chambers, J. G.; McKay, D. S.

    1996-01-01

    It is essential that accurate modal (i.e., volume) percentages of the various mineral and glass phases in lunar soils be used for addressing and resolving the effects of space weathering upon reflectance spectra, as well as for their calibration. Such data are also required for evaluating the resource potential of lunar minerals for use at a lunar base. However, these data are largely lacking. Particle-counting information for lunar soils, originally obtained to study formational processes, does not provide these necessary data, including the percentages of minerals locked in multi-phase lithic fragments and fused-soil particles, such as agglutinates. We have developed a technique for modal analyses, sensu stricto, of lunar soils, using digital imaging of X-ray maps obtained with an energy-dispersive spectrometer mounted on an electron microprobe. A suite of nine soils (90 to 150 micrometers size fraction) from the Apollo 11, 12, 15, and 17 mare sites was used for this study. This is the first collection of such modal data on soils from all Apollo mare sites. The abundances of free-mineral fragments in the mare soils are greater for immature and submature soils than for mature soils, largely because of the formation of agglutinitic glass as maturity progresses. In considerations of resource utilization at a lunar base, the best lunar soils to use for mineral beneficiation (i.e., most free-mineral fragments) have maturities near the immature/submature boundary (Is/FeO ≈ 30), not the mature soils with their complications due to extensive agglutination. The particle data obtained from the nine mare soils confirm the generalizations for lunar soils predicted by L.A. Taylor and D.S. McKay (1992, Lunar Planet Sci. Conf. 23rd, pp. 1411-1412 [Abstract]).

  15. Spatiotemporal Analyses of Osteogenesis and Angiogenesis via Intravital Imaging in Cranial Bone Defect Repair

    PubMed Central

    Huang, Chunlan; Ness, Vincent P.; Yang, Xiaochuan; Chen, Hongli; Luo, Jiebo; Brown, Edward B; Zhang, Xinping

    2015-01-01

    Osteogenesis and angiogenesis are two integrated components in bone repair and regeneration. A deeper understanding of osteogenesis and angiogenesis has been hampered by the technical difficulties of analyzing bone and neovasculature simultaneously across spatiotemporal scales and in three-dimensional formats. To overcome these barriers, a cranial defect window chamber model was established that enabled high-resolution, longitudinal, and real-time tracking of angiogenesis and bone defect healing via Multiphoton Laser Scanning Microscopy (MPLSM). By simultaneously probing new bone matrix via second harmonic generation (SHG), neovascular networks via intravenous perfusion of fluorophore, and osteoblast differentiation via 2.3 kb collagen type I promoter-driven GFP (Col2.3GFP), we examined the morphogenetic sequence of cranial bone defect healing and further established the spatiotemporal analyses of osteogenesis and angiogenesis coupling in repair and regeneration. We demonstrated that bone defect closure was initiated in the residual bone around the edge of the defect. The expansion and migration of osteoprogenitors into the bone defect occurred during the first 3 weeks of healing, coupled with vigorous microvessel angiogenesis at the leading edge of the defect. Subsequent bone repair was marked by matrix deposition and active vascular network remodeling within new bone. Implantation of bone marrow stromal cells (BMSCs) isolated from Col2.3GFP mice further showed that donor-dependent bone formation occurred rapidly within the first 3 weeks of implantation, in concert with early angiogenesis. The subsequent bone wound closure was largely host-dependent, associated with localized modest induction of angiogenesis.
The establishment of a live imaging platform via cranial window provides a unique tool to understand osteogenesis and angiogenesis in repair and regeneration, enabling further elucidation of the spatiotemporal regulatory mechanisms of osteoprogenitor cell interactions

  16. Pallasite formation after a non-destructive impact. An experimental- and image analyses-based study

    NASA Astrophysics Data System (ADS)

    Solferino, Giulio; Golabek, Gregor J.; Nimmo, Francis; Schmidt, Max W.

    2015-04-01

    The formation conditions of pallasite meteorites in the interior of terrestrial planetesimals have been a matter of debate over the last 40 years. Among other characteristics, the simple mineralogical composition (i.e., olivine, FeNi, FeS +/- pyroxene) and the dualism between fragmental and rounded olivine-bearing pallasites must be successfully reproduced by a potential formation scenario. This study incorporates a series of annealing experiments with olivine plus Fe-S, and digital image analyses of slabs from the Brenham, Brahin, Seymchan, and Springwater pallasites. Additionally, a 1D finite-difference numerical model was employed to show that a non-destructive collision followed by mixing of the impactor's core with the target body's silicate mantle could lead to the formation of both fragmental and rounded pallasite types. Specifically, an impact occurring right after the completion of the target body's differentiation, and up to several million years afterwards, allows for (i) average grain sizes consistent with the observed rounded olivine-bearing pallasites, (ii) a remnant magnetization of Fe-Ni inclusions in olivine as measured in natural pallasites, and (iii) the metallographic cooling rates derived from Fe-Ni in pallasites. An important result of this investigation is the determination of the grain growth rate of olivine in molten Fe-S as follows: d^n - d0^n = k0 exp(-Ea/RT) t, where d0 is the starting grain size, d the grain size at time t, n = 2.42(46) the growth exponent, k0 = 9.43 × 10^6 μm^n s^-1 a characteristic constant, Ea = 289 kJ/mol the activation energy for the specific growth process, R the gas constant, and T the absolute temperature. The computed olivine coarsening rate is markedly faster than in the olivine-FeNi and olivine-Ni systems.
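
    The growth law above can be evaluated directly to explore annealing timescales: solve d^n - d0^n = k0 exp(-Ea/RT) t for d. The parameters n, k0 and Ea are the abstract's fitted values, while the starting grain size, temperature, and duration chosen below are purely hypothetical inputs:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def olivine_grain_size(d0_um, T_kelvin, t_seconds,
                       n=2.42, k0=9.43e6, Ea=289e3):
    """Grain size after time t from d^n - d0^n = k0 * exp(-Ea/RT) * t.

    d0_um in micrometers, k0 in um^n / s, Ea in J/mol (abstract's fit).
    """
    dn = d0_um ** n + k0 * math.exp(-Ea / (R * T_kelvin)) * t_seconds
    return dn ** (1.0 / n)

# Hypothetical case: 10 um starting grains annealed for 1 Myr at 1400 K
t = 1e6 * 365.25 * 24 * 3600   # seconds in one million years
print(round(olivine_grain_size(10.0, 1400.0, t), 1))
```

    With these (invented) conditions the law yields centimeter-scale olivine, the order of magnitude seen in pallasites; the point of the sketch is only the mechanics of applying the fitted rate law.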

  17. X-ray digital imaging petrography of lunar mare soils: modal analyses of minerals and glasses.

    PubMed

    Taylor, L A; Patchen, A; Taylor, D H; Chambers, J G; McKay, D S

    1996-12-01

    It is essential that accurate modal (i.e., volume) percentages of the various mineral and glass phases in lunar soils be used for addressing and resolving the effects of space weathering upon reflectance spectra, as well as for their calibration. Such data are also required for evaluating the resource potential of lunar minerals for use at a lunar base. However, these data are largely lacking. Particle-counting information for lunar soils, originally obtained to study formational processes, does not provide these necessary data, including the percentages of minerals locked in multi-phase lithic fragments and fused-soil particles, such as agglutinates. We have developed a technique for modal analyses, sensu stricto, of lunar soils, using digital imaging of X-ray maps obtained with an energy-dispersive spectrometer mounted on an electron microprobe. A suite of nine soils (90 to 150 micrometers size fraction) from the Apollo 11, 12, 15, and 17 mare sites was used for this study. This is the first collection of such modal data on soils from all Apollo mare sites. The abundances of free-mineral fragments in the mare soils are greater for immature and submature soils than for mature soils, largely because of the formation of agglutinitic glass as maturity progresses. In considerations of resource utilization at a lunar base, the best lunar soils to use for mineral beneficiation (i.e., most free-mineral fragments) have maturities near the immature/submature boundary (Is/FeO ≈ 30), not the mature soils with their complications due to extensive agglutination. The particle data obtained from the nine mare soils confirm the generalizations for lunar soils predicted by L.A. Taylor and D.S. McKay (1992, Lunar Planet Sci. Conf. 23rd, pp. 1411-1412 [Abstract]).

  18. Difference method for analysing infrared images in pigs with elevated body temperatures.

    PubMed

    Siewert, Carsten; Dänicke, Sven; Kersten, Susanne; Brosig, Bianca; Rohweder, Dirk; Beyerbach, Martin; Seifert, Hermann

    2014-03-01

    Infrared imaging proves to be a quick and simple method for measuring the temperature distribution on the pig's head. The study showed that infrared imaging and analysis with a difference ROI (region of interest) method may be used for early detection of elevated body temperature in pigs (> 39.5°C), with a high specificity of approx. 85% and a high sensitivity of 86%. The only prerequisite is that there are at least 2 anatomical regions which can be reproducibly recognised in the IR image. Noise suppression is achieved by averaging the temperature values within both of these ROIs. The subsequent difference imaging largely removes the offset error, which varies from one thermal IR image to the next.
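
    The difference-ROI idea (average the temperature within two reproducible regions, then subtract, so that a per-image offset error cancels) can be sketched in a few lines. The ROI coordinates and temperature values below are invented for illustration:

```python
import numpy as np

def roi_temperature_difference(ir_image, roi_a, roi_b):
    """Difference-ROI screening: mean temperature inside two anatomical
    ROIs (averaging suppresses noise), then their difference, so that a
    constant per-image offset error cancels out."""
    r0, r1, c0, c1 = roi_a
    s0, s1, e0, e1 = roi_b
    return ir_image[r0:r1, c0:c1].mean() - ir_image[s0:s1, e0:e1].mean()

# Hypothetical IR frame; the second frame has a constant +0.8 degC
# offset error, which does not change the ROI difference.
frame = np.full((60, 80), 36.0)
frame[10:20, 10:20] = 39.8                      # ROI A (e.g., eye region)
frame[40:50, 50:60] = 37.0                      # ROI B (e.g., ear base)
offset_frame = frame + 0.8
d1 = roi_temperature_difference(frame, (10, 20, 10, 20), (40, 50, 50, 60))
d2 = roi_temperature_difference(offset_frame, (10, 20, 10, 20), (40, 50, 50, 60))
print(round(d1, 2), round(d2, 2))   # → 2.8 2.8
```

    The screening decision then compares this difference, not an absolute temperature, against a threshold.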

  19. Effect of Harderian adenectomy on the statistical analyses of mouse brain imaging using positron emission tomography

    PubMed Central

    Kim, Minsoo; Woo, Sang-Keun; Yu, Jung Woo; Lee, Yong Jin; Kim, Kyeong Min; Kang, Joo Hyun; Eom, Kidong

    2014-01-01

    Positron emission tomography (PET) using 2-deoxy-2-[18F] fluoro-D-glucose (FDG) as a radioactive tracer is a useful technique for in vivo brain imaging. However, the anatomical and physiological features of the Harderian gland limit the use of FDG-PET imaging in the mouse brain. The gland shows strong FDG uptake, which in turn results in distorted PET images of the frontal brain region. The purpose of this study was to determine if a simple surgical procedure to remove the Harderian gland prior to PET imaging of mouse brains could reduce or eliminate FDG uptake. Measurement of FDG uptake in unilaterally adenectomized mice showed that the radioactive signal emitted from the intact Harderian gland distorts frontal brain region images. Spatial parametric measurement analysis demonstrated that the presence of the Harderian gland could prevent accurate assessment of brain PET imaging. Bilateral Harderian adenectomy efficiently eliminated unwanted radioactive signal spillover into the frontal brain region beginning on postoperative Day 10. Harderian adenectomy did not cause any post-operative complications during the experimental period. These findings demonstrate the benefits of performing a Harderian adenectomy prior to PET imaging of mouse brains. PMID:23820224

  20. Quantitative histological image analyses of reticulin fibers in a myelofibrotic mouse

    PubMed Central

    Lucero, Hector A.; Patterson, Shenia; Matsuura, Shinobu; Ravid, Katya

    2016-01-01

    Bone marrow (BM) reticulin fibrosis (RF), revealed by silver staining of tissue sections, is associated with myeloproliferative neoplasms, yet tools for the quantitative assessment of reticulin deposition throughout the femur BM are still needed. Here, we present such a method, which, via analysis of hundreds of composite images, identifies the patchy nature of RF throughout the BM during disease progression in a mouse model of myelofibrosis. Initial conversion of silver-stained BM color images into binary images identified two limitations: variable color, owing to the polychromatic staining of reticulin fibers, and variable background in different sections of the same batch, which limit the application of the color deconvolution method and of a constant threshold, respectively. By blind-coding image identities to allow for threshold input (still within a narrow range), and by using shape filtering to further eliminate background, we were able to quantitate RF in myelofibrotic Gata-1low (experimental) and wild type (control) mice as a function of animal age. Color images spanning the whole femur BM were batch-analyzed using ImageJ software, aided by our two newly added macros. The results show heterogeneous RF density in different areas of the marrow of Gata-1low mice, with the degree of heterogeneity reduced upon aging. This method can be applied uniformly across laboratories in studies assessing RF remodeling induced by aging or other conditions in animal models. PMID:28008415

  1. Applying I-FGM to image retrieval and an I-FGM system performance analyses

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Santos, Eunice E.; Nguyen, Hien; Pan, Long; Korah, John; Zhao, Qunhua; Xia, Huadong

    2007-04-01

    Intelligent Foraging, Gathering and Matching (I-FGM) combines a unique multi-agent architecture with a novel partial processing paradigm to provide a solution for real-time information retrieval in large and dynamic databases. I-FGM provides a unified framework for combining the results from various heterogeneous databases and seeks to provide easily verifiable performance guarantees. In our previous work, I-FGM was implemented and validated with experiments on dynamic text data. However, the heterogeneity of search spaces requires that our system be able to handle various types of data effectively. Besides text, images are the most significant and fundamental data for information retrieval. In this paper, we extend the I-FGM system to incorporate images in its search spaces using a region-based wavelet image retrieval algorithm called WALRUS. Similar to what we did for text retrieval, we modified the WALRUS algorithm to partially and incrementally extract the regions from an image and measure the similarity value of the image. Based on the obtained partial results, we refine the allocation of computational resources by updating the priority values of image documents. Experiments have been conducted on the I-FGM system with image retrieval; the results show that I-FGM outperforms its control systems. We also present a theoretical analysis of the systems with a focus on performance. Based on probability theory, we provide models and predictions of the average performance of the I-FGM system and its two control systems, as well as of the systems without partial processing.

  2. Formal Distinctiveness of High- and Low-Imageability Nouns: Analyses and Theoretical Implications

    ERIC Educational Resources Information Center

    Reilly, Jamie; Kean, Jacob

    2007-01-01

    Words associated with perceptually salient, highly imageable concepts are learned earlier in life, more accurately recalled, and more rapidly named than abstract words (R. W. Brown, 1976; Walker & Hulme, 1999). Theories accounting for this concreteness effect have focused exclusively on semantic properties of word referents. A novel possibility is…

  3. Three-dimensional imaging system for analyses of dynamic droplet impaction and deposition formation on leaves

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A system was developed to assess the dynamic processes of droplet impact, rebound and retention on leaf surfaces with three-dimensional (3-D) images. The system components consisted of a uniform-size droplet generator, two high speed digital video cameras, a constant speed track, a leaf holder, and ...

  4. Functional connectivity analyses in imaging genetics: considerations on methods and data interpretation.

    PubMed

    Bedenbender, Johannes; Paulus, Frieder M; Krach, Sören; Pyka, Martin; Sommer, Jens; Krug, Axel; Witt, Stephanie H; Rietschel, Marcella; Laneri, Davide; Kircher, Tilo; Jansen, Andreas

    2011-01-01

    Functional magnetic resonance imaging (fMRI) can be combined with genotype assessment to identify brain systems that mediate genetic vulnerability to mental disorders ("imaging genetics"). A widely applied data analysis approach is "functional connectivity", in which the temporal correlation between the fMRI signal from a pre-defined brain region (the so-called "seed point") and other brain voxels is determined. In this technical note, we show how the choice of freely selectable data analysis parameters strongly influences the assessment of the genetic modulation of connectivity features. Our data analysis focuses, by way of example, on three methodological parameters: (i) seed voxel selection, (ii) noise reduction algorithms, and (iii) use of additional second level covariates. Our results show that even small variations in the implementation of a functional connectivity analysis can have an impact on the connectivity pattern that is as strong as the potential modulation by genetic allele variants. Some effects of genetic variation can only be found for one specific implementation of the connectivity analysis. A recurring difficulty in the field of psychiatric genetics is the non-replication of initially promising findings, partly caused by the small effects of single genes. The replication of imaging genetics results is therefore crucial for the long-term assessment of genetic effects on neural connectivity parameters. For a meaningful comparison of imaging genetics studies, it is necessary to provide more details on specific methodological parameters (e.g., seed voxel distribution) and to report how robust effects are across the choice of methodological parameters.
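
    The core computation of seed-based functional connectivity, as described above, is a Pearson correlation between the seed's time course and every voxel's time course. A minimal numpy sketch (array layout and names are illustrative, not a specific neuroimaging package's API):

```python
import numpy as np

def seed_connectivity(voxel_timecourses, seed_index):
    """Seed-based functional connectivity: Pearson correlation between
    one seed voxel's time course and all voxels' time courses.
    `voxel_timecourses` is an (n_voxels, n_timepoints) array."""
    # mean-center each voxel's time course
    centered = voxel_timecourses - voxel_timecourses.mean(axis=1, keepdims=True)
    seed = centered[seed_index]
    numerator = centered @ seed
    denominator = np.sqrt((centered ** 2).sum(axis=1) * (seed ** 2).sum())
    return numerator / denominator
```

In practice the nuisance-regression and covariate choices discussed in the abstract would all change `voxel_timecourses` before this step, which is exactly why the resulting maps are so sensitive to those parameters.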

  5. Hyperspectral and Chlorophyll Fluorescence Imaging to Analyse the Impact of Fusarium culmorum on the Photosynthetic Integrity of Infected Wheat Ears

    PubMed Central

    Bauriegel, Elke; Giebel, Antje; Herppich, Werner B.

    2011-01-01

    Head blight on wheat, caused by Fusarium spp., is a serious problem for both farmers and food production due to the concomitant production of highly toxic mycotoxins in infected cereals. For selective mycotoxin analyses, information about the on-field status of infestation would be helpful. Early symptom detection directly on ears, together with the corresponding geographic position, would be important for selective harvesting. Hence, the capabilities of various digital imaging methods to detect head blight disease on winter wheat were tested. Time series of images of healthy and artificially Fusarium-infected ears were recorded with a laboratory hyperspectral imaging system (wavelength range: 400 nm to 1,000 nm). Disease-specific spectral signatures were evaluated with imaging software. Applying the ‘Spectral Angle Mapper’ method, healthy and infected ear tissue could be clearly classified. Simultaneously, chlorophyll fluorescence imaging of healthy and infected ears and visual rating of the severity of disease were performed. Between six and eleven days after artificial inoculation, the photosynthetic efficiency of infected ears decreased compared to that of healthy ears. The severity of disease correlated highly with photosynthetic efficiency. Above an infection limit of 5% disease severity, chlorophyll fluorescence imaging reliably recognised infected ears. With this technique, differentiation of the severity of disease was possible in steps of 10%. Depending on the quality of the chosen regions of interest, hyperspectral imaging readily detects head blight 7 d after inoculation up to a disease severity of 50%. After the beginning of ripening, healthy and diseased ears were hardly distinguishable with the evaluated methods. PMID:22163820
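
    The Spectral Angle Mapper classification mentioned above compares each pixel's spectrum to a reference signature by the angle between them, which makes it insensitive to overall brightness. A minimal sketch of that computation (function and parameter names are ours):

```python
import numpy as np

def spectral_angle(pixel_spectrum, reference_spectrum):
    """Spectral Angle Mapper (SAM): the angle, in radians, between a
    pixel's spectrum and a reference signature; a small angle means the
    pixel matches the reference regardless of illumination level."""
    cos_angle = np.dot(pixel_spectrum, reference_spectrum) / (
        np.linalg.norm(pixel_spectrum) * np.linalg.norm(reference_spectrum))
    # clip guards against floating-point values just outside [-1, 1]
    return np.arccos(np.clip(cos_angle, -1.0, 1.0))
```

Classification then assigns each pixel to the reference (healthy vs. infected tissue) with the smallest angle, typically subject to a maximum-angle threshold.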

  6. Use of Very High-Resolution Airborne Images to Analyse 3d Canopy Architecture of a Vineyard

    NASA Astrophysics Data System (ADS)

    Burgos, S.; Mota, M.; Noll, D.; Cannelle, B.

    2015-08-01

    Differencing between green cover and grape canopy is a challenge for vigour status evaluation in viticulture. This paper presents the acquisition methodology of very high-resolution images (4 cm), using a Sensefly Swinglet CAM unmanned aerial vehicle (UAV), and their processing to construct a 3D digital surface model (DSM) for the creation of precise digital terrain models (DTM). The DTM was obtained using python processing libraries. The DTM was then subtracted from the DSM in order to obtain a differential digital model (DDM) of a vineyard. In the DDM, the vine pixels were then obtained by selecting all pixels with an elevation higher than 50 [cm] above ground level. The results show that it was possible to separate pixels of the green cover from those of the vine rows. The DDM showed values between -0.1 and +1.5 [m]. A manual delineation of polygons on the RGB image, belonging to the green cover and to the vine rows, gave highly significant differences, with average values of 1.23 [m] and 0.08 [m] for the vine and the ground respectively. The vine-row elevation is in good accordance with the topping height of the vines, 1.35 [m], measured in the field. This mask could be used to analyse images of the same plot taken at different times. The extraction of only vine pixels will facilitate subsequent analyses, for example, a supervised classification of these pixels.
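
    The masking step described above is a simple raster operation: subtract the terrain model from the surface model and keep pixels above the 50 cm height cut-off. A minimal sketch on numpy arrays (names are illustrative; the paper's actual processing used its own Python libraries):

```python
import numpy as np

def vine_mask(dsm, dtm, height_threshold=0.5):
    """Differential digital model (DDM = DSM - DTM) and canopy mask:
    pixels more than `height_threshold` metres above the terrain are
    classified as vine rows rather than low green cover."""
    ddm = dsm - dtm
    return ddm, ddm > height_threshold
```

Applying the boolean mask to a co-registered RGB or multispectral image of the same plot then restricts any later classification to vine pixels only.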

  7. Contextualising and Analysing Planetary Rover Image Products through the Web-Based PRoGIS

    NASA Astrophysics Data System (ADS)

    Morley, Jeremy; Sprinks, James; Muller, Jan-Peter; Tao, Yu; Paar, Gerhard; Huber, Ben; Bauer, Arnold; Willner, Konrad; Traxler, Christoph; Garov, Andrey; Karachevtseva, Irina

    2014-05-01

    The international planetary science community has launched, landed and operated dozens of human and robotic missions to the planets and the Moon. These missions have collected extensive surface imagery that has only been partially utilized for further scientific purposes. The FP7 project PRoViDE (Planetary Robotics Vision Data Exploitation) is assembling a major portion of the imaging data gathered so far from planetary surface missions into a unique database, bringing them into a spatial context and providing access to a complete set of 3D vision products. Processing is complemented by a multi-resolution visualization engine that combines various levels of detail for a seamless and immersive real-time access to dynamically rendered 3D scenes. PRoViDE aims to (1) complete relevant 3D vision processing of planetary surface missions, such as Surveyor, Viking, Pathfinder, MER, MSL, Phoenix, Huygens, and Lunar ground-level imagery from Apollo, Russian Lunokhod and selected Luna missions, (2) provide highest resolution & accuracy remote sensing (orbital) vision data processing results for these sites to embed the robotic imagery and its products into spatial planetary context, (3) collect 3D vision processing and remote sensing products within a single coherent spatial data base, (4) realise seamless fusion between orbital and ground vision data, (5) demonstrate the potential of planetary surface vision data by maximising image quality visualisation in a 3D publishing platform, (6) collect and formulate use cases for novel scientific application scenarios exploiting the newly introduced spatial relationships and presentation, (7) demonstrate the concepts for MSL, (8) realize on-line dissemination of key data & its presentation by a web-based GIS and rendering tool named PRoGIS (Planetary Robotics GIS). PRoGIS is designed to give access to rover image archives in geographical context, using projected image view cones, obtained from existing meta-data and updated according to

  8. Capabilities and Impact on Wind Analyses of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Miller, Timothy L.; Amarin, Ruba; Atlas, Robert; Bailey, M. C.; Black, Peter; Buckley, Courtney; James, Mark; Johnson, James; Jones, Linwood; Ruf, Christopher; Simmons, David; Uhlhorn, Eric

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. The instrument is being test flown in January and is expected to participate in or collaborate with the tropical cyclone experiment GRIP (Genesis and Rapid Intensification Processes) in the 2010 season. HIRAD is designed to study the wind field in some detail within strong hurricanes and to enhance the real-time airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track at a single point directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approximately 3 times the aircraft altitude) with approximately 2 km resolution. See Figure 1, which depicts a simulated HIRAD swath versus the line of data obtained by SFMR.

  9. Voxel-based analyses of magnetization transfer imaging of the brain in hepatic encephalopathy

    PubMed Central

    Miese, Falk R; Wittsack, Hans-Jörg; Kircheis, Gerald; Holstein, Arne; Mathys, Christian; Mödder, Ulrich; Cohnen, Mathias

    2009-01-01

    AIM: To evaluate the spatial distribution of cerebral abnormalities in cirrhotic subjects with and without hepatic encephalopathy (HE) found with magnetization transfer imaging (MTI). METHODS: Nineteen cirrhotic patients graded from neurologically normal to HE grade 2 and 18 healthy control subjects underwent magnetic resonance imaging. They gave institutional-review-board-approved written consent. Magnetization transfer ratio (MTR) maps were generated from MTI. We tested for significant differences compared to the control group using statistical non-parametric mapping (SnPM) for a voxel-based evaluation. RESULTS: The MTR of grey and white matter was lower in subjects with more severe HE. Changes were found in patients with cirrhosis without neurological deficits in the basal ganglia and bilateral white matter. The loss in magnetization transfer increased in severity and spatial extent in patients with overt HE. Patients with HE grade 2 showed an MTR decrease in white and grey matter: the maximum loss of magnetization transfer effect was located in the basal ganglia [SnPM (pseudo-)t = 17.98, P = 0.0001]. CONCLUSION: The distribution of MTR changes in HE points to an early involvement of basal ganglia and white matter in HE. PMID:19891014

  10. Analysed cap mesenchyme track data from live imaging of mouse kidney development.

    PubMed

    Lefevre, James G; Combes, Alexander N; Little, Melissa H; Hamilton, Nicholas A

    2016-12-01

    This article provides detailed information on manually tracked cap mesenchyme cells from timelapse imaging of multiple ex vivo embryonic mouse kidneys. Cells were imaged for up to 18 h at 15 or 20 min intervals, and multiple cell divisions were tracked. Positional data is supplemented with a range of information including the relative location of the closest ureteric tip and a correction for drift due to bulk movement and tip growth. A subset of tracks were annotated to indicate the presence of processes attached to the ureteric epithelium. The calculations used for drift correction are described, as are the main methods used in the analysis of this data for the purpose of describing cap cell motility. The outcomes of this analysis are discussed in "Cap mesenchyme cell swarming during kidney development is influenced by attraction, repulsion, and adhesion to the ureteric tip" (A.N. Combes, J.G. Lefevre, S. Wilson, N.A. Hamilton, M.H. Little, 2016) [1].

  11. X-ray fluorescence and imaging analyses of paintings by the Brazilian artist Oscar Pereira Da Silva

    NASA Astrophysics Data System (ADS)

    Campos, P. H. O. V.; Kajiya, E. A. M.; Rizzutto, M. A.; Neiva, A. C.; Pinto, H. P. F.; Almeida, P. A. D.

    2014-02-01

    Non-destructive analyses, such as EDXRF (Energy-Dispersive X-Ray Fluorescence) spectroscopy and imaging, were used to characterize easel paintings. The analyzed objects are from the collection of the Pinacoteca do Estado de São Paulo. EDXRF results allowed us to identify the chemical elements present in the pigments, showing the use of many Fe-based pigments, modern pigments such as cobalt blue and cadmium yellow, as well as white pigments containing lead and zinc used by the artist in different layers. Imaging analysis was useful for identifying the state of conservation and the localization of old and new restorations, and also for detecting underlying drawings that reveal the artist's creative processes.

  12. ICPES analyses using full image spectra and astronomical data fitting algorithms to provide diagnostic and result information

    SciTech Connect

    Spencer, W.A.; Goode, S.R.

    1997-10-01

    ICP emission analyses are prone to errors due to changes in power level, nebulization rate, plasma temperature, and sample matrix. As a result, accurate analyses of complex samples often require frequent bracketing with matrix matched standards. Information needed to track and correct the matrix errors is contained in the emission spectrum. But most commercial software packages use only the analyte line emission to determine concentrations. Changes in plasma temperature and the nebulization rate are reflected by changes in the hydrogen line widths, the oxygen emission, and neutral ion line ratios. Argon and off-line emissions provide a measure to correct the power level and the background scattering occurring in the polychromator. The authors' studies indicated that changes in the intensity of the Ar 404.4 nm line readily flag most matrix and plasma condition modifications. Carbon lines can be used to monitor the impact of organics on the analyses and calcium and argon lines can be used to correct for spectral drift and alignment. Spectra of contaminated groundwater and simulated defense waste glasses were obtained using a Thermo Jarrell Ash ICP that has an echelle CID detector system covering the 190-850 nm range. The echelle images were translated to the FITS data format, which astronomers recommend for data storage. Data reduction packages such as those in the ESO-MIDAS/ECHELLE and DAOPHOT programs were tried with limited success. The radial point spread function was evaluated as a possible improved peak intensity measurement instead of the common pixel averaging approach used in the commercial ICP software. Several algorithms were evaluated to align and automatically scale the background and reference spectra. A new data reduction approach that utilizes standard reference images, successive subtractions, and residual analyses has been evaluated to correct for matrix effects.
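
    The "reference spectra with successive subtractions and residual analyses" idea can be illustrated with a deliberately simplified sketch: scale a standard reference spectrum to the measured one by least squares and inspect the residual. This is our own toy reduction, not the authors' actual algorithm.

```python
import numpy as np

def residual_after_reference(spectrum, reference):
    """Least-squares scale a reference spectrum to the sample, then
    subtract it; what remains (the residual) exposes drift, background
    changes, and matrix effects not present in the reference."""
    scale = np.dot(spectrum, reference) / np.dot(reference, reference)
    return spectrum - scale * reference, scale
```

Successive application with several references (background, argon lines, matrix standards) would peel away the known contributions one at a time, leaving analyte signal plus unmodeled effects.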

  13. Image analyser as a tool for the study of in vitro glomerular vasoreactivity.

    PubMed

    Lakhdar, B; Potier, M; L'Azou, B; Cambar, J

    1994-12-01

    Many drugs used clinically can dramatically reduce renal hemodynamics. Over recent years, our laboratory has developed two in vitro glomerular models, isolated glomeruli and mesangial cell cultures, to quantitate, by video image analyzer, the direct glomerular effects of vasoreactive agents. The present study shows the vasoconstrictive effects of angiotensin II and cyclosporin in both models and compares their glomerular vasoconstriction with or without vasodilating agents such as verapamil. This drug-induced glomerular vasoreactivity is time- and dose-dependent; moreover, it is reversible after perfusion under control conditions. The value of these in vitro glomerular models is supported by fair correlations between in vivo and in vitro data and between the responses of the two models. These models can be considered tools for assessing the glomerular vasoreactivity of nephrotoxic agents.

  14. Expansion analyses on low-excitation planetary nebulae with stellar images

    SciTech Connect

    Tamura, Shinichi; Shibata, K.M. (Nobeyama Radio Observatory, Minamimaki)

    1990-11-01

    The paper presents the results of analyses on the expansion characteristics of the low-excitation and unresolved planetary nebulae, M1-5, M1-9, K3-66, and K3-67. The sample nebulae are divided into two groups. The first includes the real compact planetary nebulae M1-5 and M1-9 based on their single-Gaussian profiles. The second one includes nebulae that are unresolved because of their large distances. The nebulae K3-66 and K3-67 should belong to the second group since they show the double-Gaussian components in the emission-line profiles. Relationships between expansion velocities and I(forbidden O III 5007 A)/I(H-beta) and between electron densities and expansion velocities give the basis for the above arguments and reveal that the nebulae IC 4997, Vy2-2, and M3-27 obviously are in different phases of evolution from those of other low-excitation planetary nebulae. 24 refs.

  15. Statistical improvements in functional magnetic resonance imaging analyses produced by censoring high-motion data points.

    PubMed

    Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E

    2014-05-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement.

  16. Improving the local solution accuracy of large-scale digital image-based finite element analyses.

    PubMed

    Charras, G T; Guldberg, R E

    2000-02-01

    Digital image-based finite element modeling (DIBFEM) has become a widely utilized approach for efficiently meshing complex biological structures such as trabecular bone. While DIBFEM can provide accurate predictions of apparent mechanical properties, its application to simulate local phenomena such as tissue failure or adaptation has been limited by high local solution errors at digital model boundaries. Furthermore, refinement of digital meshes does not necessarily reduce local maximum errors. The purpose of this study was to evaluate the potential to reduce local mean and maximum solution errors in digital meshes using a post-processing filtration method. The effectiveness of a three-dimensional, boundary-specific filtering algorithm was found to be mesh size dependent. Mean absolute and maximum errors were reduced for meshes with more than five elements through the diameter of a cantilever beam considered representative of a single trabecula. Furthermore, mesh refinement consistently decreased errors for filtered solutions but not necessarily for non-filtered solutions. Models with more than five elements through the beam diameter yielded absolute mean errors of less than 15% for both Von Mises stress and maximum principal strain. When applied to a high-resolution model of trabecular bone microstructure, boundary filtering produced a more continuous solution distribution and reduced the predicted maximum stress by 30%. Boundary-specific filtering provides a simple means of improving local solution accuracy while retaining the model generation and numerical storage efficiency of the DIBFEM technique.

  17. Airflow analyses using thermal imaging in Arizona's Meteor Crater as part of METCRAX II

    NASA Astrophysics Data System (ADS)

    Grudzielanek, A. Martina; Vogt, Roland; Cermak, Jan; Maric, Mateja; Feigenwinter, Iris; Whiteman, C. David; Lehner, Manuela; Hoch, Sebastian W.; Krauß, Matthias G.; Bernhofer, Christian; Pitacco, Andrea

    2016-04-01

    In October 2013 the second Meteor Crater Experiment (METCRAX II) took place at the Barringer Meteorite Crater (aka Meteor Crater) in north central Arizona, USA. Downslope-windstorm-type flows (DWF), the main research objective of METCRAX II, were measured by a comprehensive set of meteorological sensors deployed in and around the crater. During two weeks of METCRAX II five infrared (IR) time lapse cameras (VarioCAM® hr research & VarioCAM® High Definition, InfraTec) were installed at various locations on the crater rim to record high-resolution images of the surface temperatures within the crater from different viewpoints. Changes of surface temperature are indicative of air temperature changes induced by flow dynamics inside the crater, including the DWF. By correlating thermal IR surface temperature data with meteorological sensor data during intensive observational periods the applicability of the IR method of representing flow dynamics can be assessed. We present evaluation results and draw conclusions relative to the application of this method for observing air flow dynamics in the crater. In addition we show the potential of the IR method for METCRAX II in 1) visualizing airflow processes to improve understanding of these flows, and 2) analyzing cold-air flows and cold-air pooling.

  18. Floral diversity in desert ecosystems: Comparing field sampling to image analyses in assessing species cover

    PubMed Central

    2013-01-01

    Background Developing a quick and reliable technique to estimate floral cover in deserts will assist in monitoring and management. The present attempt was to estimate plant cover in the UAE desert using both digital photography and field sampling. Digital photographs were correlated with field data to estimate floral cover in moderately (Al-Maha) and heavily (DDCR) grazed areas. The Kruskal-Wallis test was also used to assess compatibility between the two techniques within and across grazing intensities and soil substrates. Results Results showed that photographs could be a reliable technique within the sand dune substrate under moderate grazing (r = 0.69). The results were very poorly correlated (r = −0.24) or even inversely proportional (r = −0.48) when the analysis was performed within DDCR. Overall, Chi-square values for Al-Maha and DDCR were not significant at P > 0.05, indicating similarities between the two methods. At the soil type level, the Kruskal-Wallis analysis was not significant (P > 0.05), except for gravel plains (P < 0.05). Across grazing intensities and soil substrates, the two techniques agreed in ranking most plant species, except for Lycium shawii. Conclusions Consequently, the present study indicates that digital photography cannot yet be used reliably to assess floral cover, although further testing is required to confirm this conclusion. An image-based sampling approach to plant cover at the species level, across different grazing and substrate variations in desert ecosystems, has its uses, but results are to be cautiously interpreted. PMID:23758667

  19. Application of terahertz pulsed imaging to analyse film coating characteristics of sustained-release coated pellets.

    PubMed

    Haaser, M; Karrout, Y; Velghe, C; Cuppok, Y; Gordon, K C; Pepper, M; Siepmann, J; Rades, T; Taday, P F; Strachan, C J

    2013-12-05

    Terahertz pulsed imaging (TPI) was employed to explore its suitability for detecting differences in the film coating thickness and drug layer uniformity of multilayered, sustained-release coated, standard-size pellets (approximately 1 mm in diameter). Pellets consisting of a sugar starter core and a metoprolol succinate layer were coated with a Kollicoat® SR:Kollicoat® IR polymer blend for different times, giving three groups of pellets (batches I, II and III), each with a different coating thickness according to weight gain. Ten pellets from each batch were mapped individually to evaluate the coating thickness and drug layer thickness between batches, between pellets within each batch, and across individual pellets (uniformity). From the terahertz waveform, the terahertz electric field peak strength (TEFPS) was used to define a circular area (approximately 0.13 mm²) in the TPI maps where no signal distortion was found due to pellet curvature in the measurement set-up used. The average coating thicknesses were 46 μm, 71 μm and 114 μm for batches I, II and III respectively, whilst no drug layer thickness difference between batches was observed. No statistically significant differences in average coating thickness or drug layer thickness were found within batches (between pellets), but high thickness variability was observed across individual pellets. These results were confirmed by scanning electron microscopy (SEM). The coating thickness results correlated with the subsequent drug release behaviour: the fastest drug release was obtained from batch I, with the lowest coating thickness, and the slowest from batch III, with the highest coating thickness. In conclusion, TPI is suitable for detailed, non-destructive evaluation of film coating and drug layer thicknesses in multilayered standard-size pellets.
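
    Layer thickness in TPI is derived from the delay between the pulse reflections at the top and bottom of a coating: the pulse traverses the layer twice, so d = c·Δt/(2n), where n is the layer's terahertz refractive index. A one-function sketch of that standard relation (variable names are ours):

```python
def coating_thickness_um(delta_t_ps, refractive_index):
    """Coating thickness from a terahertz time-of-flight measurement:
    the delay between the two interface echoes covers two passes
    through the layer, so d = c * dt / (2 * n)."""
    c_um_per_ps = 299.792458  # speed of light in micrometres per picosecond
    return c_um_per_ps * delta_t_ps / (2.0 * refractive_index)
```

For example, a 0.5 ps echo separation in a coating with n = 1.5 corresponds to roughly a 50 μm layer, the same order as the batch averages reported above.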

  20. Multiscale analysis of radar images: application to filtering, classification, and fusion of radar and optical images

    NASA Astrophysics Data System (ADS)

    Foucher, Samuel

    Radar images are corrupted by multiplicative noise (speckle), which appreciably reduces the radiometric resolution of extended homogeneous targets. The goal of this thesis is to study the contribution of multiscale analysis, in particular the wavelet transform, to the problems of speckle reduction and unsupervised classification of radar images. Within the framework of the stationary wavelet transform, which guarantees translation invariance of the representation, the usual adaptive filtering techniques are extended to the multiscale domain. We propose to take into account the statistical specificities of the radar image (multiplicative model, K distribution) in order to separate the wavelet coefficients generated by noise alone from those generated by the significant structures of the image. The Pearson distribution system is applied to model the probability distribution of the wavelet coefficients. When the observed intensity follows a K distribution, the Pearson system leads to a type IV law (complex Beta distribution). The Pearson type IV is used in a MAP (Maximum A Posteriori) weighting. The influence of speckle correlation on the higher-order moments is then evaluated quantitatively from an MA ("Moving Average") model of the correlated radar image. The results obtained on a set of artificial images show that the multiscale approach achieves a better compromise between detail preservation and smoothing of homogeneous regions than traditional filtering methods. In classification, the multiscale representation makes it possible to trade off spatial precision against radiometric uncertainty. The theory of belief functions provides a theoretical framework for handling the notions of uncertainty and imprecision.
    We propose to combine the multiscale decisions directly by Dempster's rule, integrating the

  1. White matter abnormalities associated with auditory hallucinations in schizophrenia: a combined study of voxel-based analyses of diffusion tensor imaging and structural magnetic resonance imaging.

    PubMed

    Seok, Jeong-Ho; Park, Hae-Jeong; Chun, Ji-Won; Lee, Seung-Koo; Cho, Hyun Sang; Kwon, Jun Soo; Kim, Jae-Jin

    2007-11-15

    White matter (WM) abnormalities in schizophrenia may offer important clues to a better understanding of the disconnectivity associated with the disorder. The aim of this study was to elucidate a WM basis of auditory hallucinations in schizophrenia through the simultaneous investigation of WM tract integrity and WM density. Diffusion tensor images (DTIs) and structural T1 magnetic resonance images (MRIs) were acquired from 15 hallucinating schizophrenic patients, 15 non-hallucinating schizophrenic patients and 22 normal controls. Voxel-based analyses and post-hoc region-of-interest analyses were performed to compare the three groups on fractional anisotropy (FA) derived from DTI as well as WM density derived from structural MRIs. In both the hallucinating and non-hallucinating groups, FA of the WM regions was significantly decreased in the left superior longitudinal fasciculus (SLF), whereas WM density was significantly increased in the left inferior longitudinal fasciculus (ILF). The mean FA value of the left frontal part of the SLF was positively correlated with the severity score of auditory hallucinations in the hallucinating patient group. Our findings show that WM changes were mainly observed in the frontal and temporal areas, suggesting that disconnectivity in the left fronto-temporal area may contribute to the pathophysiology of schizophrenia. In addition, pathologic WM changes in this region may be an important step in the development of auditory hallucinations in schizophrenia.

  2. Computerized multiple image analysis on mammograms: performance improvement of nipple identification for registration of multiple views using texture convergence analyses

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Sahiner, Berkman; Hadjiiski, Lubomir M.; Paramagul, Chintana

    2004-05-01

    Automated registration of multiple mammograms for CAD depends on accurate nipple identification. We developed two new image analysis techniques based on geometric and texture convergence analyses to improve the performance of our previously developed nipple identification method. A gradient-based algorithm is used to automatically track the breast boundary. The nipple search region along the boundary is then defined by geometric convergence analysis of the breast shape. Three nipple candidates are identified by detecting the changes along the gray level profiles inside and outside the boundary and the changes in the boundary direction. A texture orientation-field analysis method is developed to estimate the fourth nipple candidate based on the convergence of the tissue texture pattern towards the nipple. The final nipple location is determined from the four nipple candidates by a confidence analysis. Our training and test data sets consisted of 419 and 368 randomly selected mammograms, respectively. The nipple location identified on each image by an experienced radiologist was used as the ground truth. For 118 of the training and 70 of the test images, the radiologist could not positively identify the nipple, but provided an estimate of its location. These were referred to as invisible nipple images. In the training data set, 89.37% (269/301) of the visible nipples and 81.36% (96/118) of the invisible nipples could be detected within 1 cm of the truth. In the test data set, 92.28% (275/298) of the visible nipples and 67.14% (47/70) of the invisible nipples were identified within 1 cm of the truth. In comparison, our previous nipple identification method without using the two convergence analysis techniques detected 82.39% (248/301), 77.12% (91/118), 89.93% (268/298) and 54.29% (38/70) of the nipples within 1 cm of the truth for the visible and invisible nipples in the training and test sets, respectively. The results indicate that the nipple on mammograms can be

  3. Automated reference region extraction and population-based input function for brain [11C]TMSX PET image analyses

    PubMed Central

    Rissanen, Eero; Tuisku, Jouni; Luoto, Pauliina; Arponen, Eveliina; Johansson, Jarkko; Oikonen, Vesa; Parkkola, Riitta; Airas, Laura; Rinne, Juha O

    2015-01-01

    [11C]TMSX ([7-N-methyl-11C]-(E)-8-(3,4,5-trimethoxystyryl)-1,3,7-trimethylxanthine) is a selective adenosine A2A receptor (A2AR) radioligand. In the central nervous system (CNS), A2AR are linked to dopamine D2 receptor function in the striatum, but they are also important modulators of inflammation. The gold standard for kinetic modeling of brain [11C]TMSX positron emission tomography (PET) is to obtain the arterial input function via arterial blood sampling. However, this method is laborious, prone to errors and unpleasant for study subjects. The aim of this work was to evaluate alternative input function acquisition methods for brain [11C]TMSX PET imaging. First, a noninvasive, automated method for the extraction of a gray matter reference region using supervised clustering (SCgm) was developed. Second, a method for obtaining a population-based arterial input function (PBIF) was implemented. These methods were created using data from 28 study subjects (7 healthy controls, 12 multiple sclerosis patients, and 9 patients with Parkinson's disease). The results with PBIF correlated well with the original plasma input, and SCgm yielded results similar to those obtained with the cerebellum as a reference region. The clustering method for extracting the reference region and the population-based approach for acquiring input for dynamic [11C]TMSX brain PET image analyses appear to be feasible and robust methods that can be applied in patients with CNS pathology. PMID:25370856

  5. Reduced medial prefrontal-subcortical connectivity in dysphoria: Granger causality analyses of rapid functional magnetic resonance imaging.

    PubMed

    Sabatinelli, Dean; McTeague, Lisa M; Dhamala, Mukesh; Frank, David W; Wanger, Timothy J; Adhikari, Bhim M

    2015-02-01

    A cortico-limbic network consisting of the amygdala, medial prefrontal cortex (mPFC), and ventral striatum (vSTR) has been associated with altered function in emotional disorders. Here we used rapidly sampled functional magnetic resonance imaging and Granger causality analyses to assess the directional connectivity between these brain structures in a sample of healthy and age-matched participants endorsing moderate to severe depressive symptomatology as they viewed a series of natural scene stimuli varying systematically in pleasantness and arousal. Specifically during pleasant scene perception, dysphoric participants showed reduced activity in mPFC and vSTR, relative to healthy participants. In contrast, amygdala activity was enhanced to pleasant as well as unpleasant arousing scenes in both participant groups. Granger causality estimates of influence between mPFC and vSTR were significantly reduced in dysphoric relative to control participants during all picture contents. These findings provide direct evidence that during visual perception of evocative emotional stimuli, reduced reward-related activity in dysphoria is associated with dysfunctional causal connectivity between mPFC, amygdala, and vSTR.
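    The Granger-causality logic used here, asking whether past values of one region's signal improve prediction of another's, can be illustrated with a minimal least-squares sketch. This is illustrative only, not the study's fMRI pipeline; the function name `granger_ratio` and the AR order are our assumptions.

```python
import numpy as np

def granger_ratio(x, y, p=2):
    """Toy Granger-causality check: does past x help predict y?
    Fits AR(p) models of y with and without lagged x by least squares
    and returns RSS_restricted / RSS_full (well above 1 suggests x -> y)."""
    n = len(y)
    rows_r, rows_f, target = [], [], []
    for t in range(p, n):
        lag_y = y[t - p:t][::-1]
        lag_x = x[t - p:t][::-1]
        rows_r.append(lag_y)                       # y history only
        rows_f.append(np.concatenate([lag_y, lag_x]))  # y and x history
        target.append(y[t])
    b = np.array(target)

    def rss(rows):
        A = np.column_stack([np.ones(len(rows)), np.array(rows)])
        beta, *_ = np.linalg.lstsq(A, b, rcond=None)
        r = b - A @ beta
        return float(r @ r)

    return rss(rows_r) / rss(rows_f)
```

    On synthetic data where y is driven by lagged x, the ratio is large in the x-to-y direction and near 1 in the reverse direction.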

  6. In vitro stability analyses as a model for metabolism of ferromagnetic particles (Clariscan), a contrast agent for magnetic resonance imaging.

    PubMed

    Skotland, Tore; Sontum, Per Christian; Oulie, Inger

    2002-04-15

    Clariscan is a new contrast agent for magnetic resonance imaging. It is an aqueous suspension of ferromagnetic particles injected for blood pool and liver imaging. Previous experiments showed that particles made of 59Fe were taken up by the mononuclear phagocytic system and then solubilised. The present work aims at explaining a possible mechanism for the dissolution of these ferromagnetic particles in the body. The particles were diluted in 10-mM citrate or 10-mM acetate buffers at pH 4.5, 5.0 and 5.5 and incubated at 37 degrees C for up to 22 days, protected from light. The mixtures were analysed at different times during this incubation period using photon correlation spectroscopy, magnetic relaxation, visible spectroscopy and reactivity of the iron with the chelator, bathophenanthroline disulphonic acid. The data obtained with these techniques showed that the particles were almost completely solubilised within 4-7 days when incubated in 10 mM citrate, pH 4.5. Incubation in 10 mM citrate buffer, pH 5.0 revealed a slower solubilisation of the particles, as the changes observed after 72 h of incubation at pH 5.0 were 43-71% of the changes observed at pH 4.5. Incubation in 10 mM citrate, pH 5.5 revealed an even slower solubilisation of the particles, as the changes observed after 72 h of incubation at pH 5.5 were 12-34% of those observed at pH 4.5. Incubation of the particles in 10 mM acetate at pH 4.5, 5.0 and 5.5, as well as incubation of the particles in water pH adjusted to pH 5.1, resulted in only minor or no solubilisation of the particles. The results indicate that the low pH of endosomes and lysosomes, as well as endogenous iron-complexing substances, may be important for the solubilisation of these ferromagnetic particles following i.v. injection of Clariscan.

  7. Analyses of Magnetic Resonance Imaging of Cerebrospinal Fluid Dynamics Pre and Post Short and Long-Duration Space Flights

    NASA Technical Reports Server (NTRS)

    Alperin, Noam; Barr, Yael; Lee, Sang H.; Mason, Sara; Bagci, Ahmet M.

    2015-01-01

    Preliminary results are based on analyses of data from 17 crewmembers. The initial analysis compares pre- to post-flight changes in total cerebral blood flow (CBF) and craniospinal CSF flow volume. Total CBF is obtained by summation of the mean flow rates through the 4 blood vessels supplying the brain (right and left internal carotid and vertebral arteries). Volumetric flow rates were obtained using an automated lumen segmentation technique shown to have 3-4-fold improved reproducibility and accuracy over manual lumen segmentation (6). Two cohorts, 5 short-duration and 8 long-duration crewmembers, scanned within 3 to 8 days post landing, were included (4 short-duration crewmembers with MRI scans occurring beyond 10 days post flight were excluded). The VIIP Clinical Practice Guideline (CPG) classification is being used initially as a measure for VIIP syndrome severity. Median CPG scores of the short- and long-duration cohorts were the same, 2. Mean preflight total CBF values for the short- and long-duration cohorts were similar: 863+/-144 and 747+/-119 mL/min, respectively. Percentage CBF changes for all short-duration crewmembers were 11% or lower, within the range of normal physiological fluctuations in healthy individuals. In contrast, in 4 of the 8 long-duration crewmembers, the change in CBF exceeded the range of normal physiological fluctuation. In 3 of the 4 subjects an increase in CBF was measured. Large pre- to post-flight changes in the craniospinal CSF flow volume were found in 6 of the 8 long-duration crewmembers. Box-Whisker plots of the CPG and the percent CBF and CSF flow changes for the two cohorts are shown in Figure 4. Examples of CSF flow waveforms for a short-duration crewmember and two long-duration crewmembers (CPG 0 and 3) are shown in Figure 5. Changes in CBF and CSF flow dynamics larger than normal physiological fluctuations were observed in the long-duration crewmembers. Changes in CSF flow were more pronounced than changes in CBF.
Decreased CSF flow dynamics were observed

  8. Rapid specimen preparation to improve the throughput of electron microscopic volume imaging for three-dimensional analyses of subcellular ultrastructures with serial block-face scanning electron microscopy.

    PubMed

    Thai, Truc Quynh; Nguyen, Huy Bang; Saitoh, Sei; Wu, Bao; Saitoh, Yurika; Shimo, Satoshi; Elewa, Yaser Hosny Ali; Ichii, Osamu; Kon, Yasuhiro; Takaki, Takashi; Joh, Kensuke; Ohno, Nobuhiko

    2016-09-01

    Serial block-face imaging using scanning electron microscopy enables rapid observations of three-dimensional ultrastructures in a large volume of biological specimens. However, such imaging usually requires days for sample preparation to reduce charging and increase image contrast. In this study, we report a rapid procedure to acquire serial electron microscopic images within 1 day for three-dimensional analyses of subcellular ultrastructures. This procedure is based on serial block-face imaging with two major modifications, a new sample treatment device and direct polymerization on the rivets, to reduce the time and workload needed. The modified procedure without uranyl acetate can produce tens of embedded samples observable under serial block-face scanning electron microscopy within 1 day. The serial images obtained are similar to the block-face images acquired by common procedures, and are applicable to three-dimensional reconstructions at a subcellular resolution. Using this approach, regional immune deposits and the double contour or heterogeneous thinning of basement membranes were observed in the glomerular capillary loops of an autoimmune nephropathy model. These modifications provide options to improve the throughput of three-dimensional electron microscopic examinations, and will ultimately be beneficial for the wider application of volume imaging in life science and clinical medicine.

  9. A Fusion-Based Approach for Breast Ultrasound Image Classification Using Multiple-ROI Texture and Morphological Analyses

    PubMed Central

    Bdair, Tariq M.; Al-Najar, Mahasen; Alazrai, Rami

    2016-01-01

    Ultrasound imaging is commonly used for breast cancer diagnosis, but accurate interpretation of breast ultrasound (BUS) images is often challenging and operator-dependent. Computer-aided diagnosis (CAD) systems can be employed to provide the radiologists with a second opinion to improve the diagnosis accuracy. In this study, a new CAD system is developed to enable accurate BUS image classification. In particular, an improved texture analysis is introduced, in which the tumor is divided into a set of nonoverlapping regions of interest (ROIs). Each ROI is analyzed using gray-level cooccurrence matrix features and a support vector machine classifier to estimate its tumor class indicator. The tumor class indicators of all ROIs are combined using a voting mechanism to estimate the tumor class. In addition, morphological analysis is employed to classify the tumor. A probabilistic approach is used to fuse the classification results of the multiple-ROI texture analysis and morphological analysis. The proposed approach is applied to classify 110 BUS images that include 64 benign and 46 malignant tumors. The accuracy, specificity, and sensitivity obtained using the proposed approach are 98.2%, 98.4%, and 97.8%, respectively. These results demonstrate that the proposed approach can effectively be used to differentiate benign and malignant tumors. PMID:28127383
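    The multi-ROI texture voting described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the gray-level quantization, the two GLCM descriptors, and the function names are all our assumptions.

```python
import numpy as np

def glcm(patch, levels=8, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one pixel offset,
    after quantizing the 8-bit patch to `levels` gray levels."""
    q = (patch.astype(float) / 256 * levels).astype(int).clip(0, levels - 1)
    m = np.zeros((levels, levels))
    h, w = q.shape
    for yy in range(h - dy):
        for xx in range(w - dx):
            m[q[yy, xx], q[yy + dy, xx + dx]] += 1
    return m / m.sum()

def roi_features(patch):
    """Two classic GLCM descriptors for one ROI: contrast and homogeneity."""
    p = glcm(patch)
    i, j = np.indices(p.shape)
    contrast = float((p * (i - j) ** 2).sum())
    homogeneity = float((p / (1.0 + np.abs(i - j))).sum())
    return [contrast, homogeneity]

def fuse_votes(indicators):
    """Majority vote over per-ROI class indicators (1 = malignant, 0 = benign)."""
    return int(sum(indicators) > len(indicators) / 2)
```

    In the paper's pipeline an SVM produces each ROI's class indicator from its feature vector; here `fuse_votes` shows only the voting step that combines them.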

  10. Tract-Specific Analyses of Diffusion Tensor Imaging Show Widespread White Matter Compromise in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Shukla, Dinesh K.; Keehn, Brandon; Muller, Ralph-Axel

    2011-01-01

    Background: Previous diffusion tensor imaging (DTI) studies have shown white matter compromise in children and adults with autism spectrum disorder (ASD), which may relate to reduced connectivity and impaired function of distributed networks. However, tract-specific evidence remains limited in ASD. We applied tract-based spatial statistics (TBSS)…

  11. APT-Weighted and NOE-Weighted Image Contrasts in Glioma with Different RF Saturation Powers Based on Magnetization Transfer Ratio Asymmetry Analyses

    PubMed Central

    Zhou, Jinyuan; Hong, Xiaohua; Zhao, Xuna; Gao, Jia-Hong; Yuan, Jing

    2013-01-01

    Purpose To investigate the saturation-power dependence of amide proton transfer (APT)-weighted and nuclear Overhauser enhancement (NOE)-weighted image contrasts in a rat glioma model at 4.7 T. Methods 9L tumor-bearing rats (n = 8) and fresh eggs (n = 4) were scanned on a 4.7-T animal MRI scanner. Z-spectra over an offset range of ±6 ppm were acquired with different saturation powers, followed by magnetization transfer ratio (MTR) asymmetry analyses around the water resonance. Results The NOE signal upfield from the water resonance (−2.5 to −5 ppm) was clearly visible at lower saturation powers (e.g., 0.6 μT) and was larger in the contralateral normal brain tissue than in the tumor. Conversely, the APT effect downfield from the water resonance was observed at relatively higher saturation powers (e.g., 2.1 μT) and was larger in the tumor than in the contralateral normal brain tissue. The NOE decreased the APT-weighted image signal, based on the MTR asymmetry analysis, but increased the APT-weighted image contrast between the tumor and contralateral normal brain tissue. Conclusion The APT and NOE image signals in tumor are maximized at different saturation powers. The saturation power of roughly 2 μT is ideal for APT-weighted imaging at clinical B0 field strengths. PMID:23661598
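    The MTR asymmetry analysis around the water resonance reduces to a simple formula, MTRasym(Δω) = [S(−Δω) − S(+Δω)] / S0, evaluated at each positive offset of the Z-spectrum. A sketch (hypothetical sampling grid; the function name is ours):

```python
import numpy as np

def mtr_asym(offsets_ppm, z, s0):
    """MTRasym(w) = [S(-w) - S(+w)] / S0 for each positive offset w,
    from a Z-spectrum z sampled at symmetric offsets_ppm (in ppm)."""
    offsets_ppm = np.asarray(offsets_ppm, float)
    z = np.asarray(z, float)
    out = {}
    for w in offsets_ppm[offsets_ppm > 0]:
        s_pos = z[np.isclose(offsets_ppm, w)][0]   # downfield (e.g. APT at +3.5 ppm)
        s_neg = z[np.isclose(offsets_ppm, -w)][0]  # upfield (NOE range)
        out[round(float(w), 3)] = (s_neg - s_pos) / s0
    return out
```

    A positive value at +3.5 ppm reflects net APT contrast; a strong upfield NOE signal drives the asymmetry negative, which is how the NOE decreases the APT-weighted signal in this analysis.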

  12. Analysing the impact of far-out sidelobes on the imaging performance of the SKA-LOW telescope

    NASA Astrophysics Data System (ADS)

    Mort, Benjamin; Dulwich, Fred; Razavi-Ghods, Nima; de Lera Acedo, Eloy; Grainge, Keith

    2017-03-01

    The Square Kilometre Array's Low Frequency instrument (SKA-LOW) will operate in the undersampled regime for most of the frequency band where grating lobes pose particular challenges. To achieve the expected level of sensitivity for SKA-LOW, it is particularly important to understand how interfering sources in both near and far sidelobes of the station beam affect the imaging performance. In this study, we discuss options for station designs, and adopting a random element layout, we assess its effectiveness by investigating how sources far from the main lobe of the station beam degrade images of the target field. These sources have the effect of introducing a noise-like corruption to images, which is called the far sidelobe confusion noise (FSCN). Using OSKAR, a software simulator accelerated using graphics processing units, we carried out end-to-end simulations using an all-sky model and telescope configuration representative of the SKA-LOW instrument. The FSCN is a function of both the station beam and the interferometric point spread function, and decreases with increasing observation time until the coverage of the aperture plane no longer improves. Using apodization to reduce the level of near-in sidelobes of the station beam had a notable improvement on the level of the FSCN at low frequencies. Our results indicate that the effects of picking up sources in the sidelobes are worse at low frequencies, where the array is less sparse.

  13. Unsupervised clustering analyses of features extraction for a caries computer-assisted diagnosis using dental fluorescence images

    NASA Astrophysics Data System (ADS)

    Bessani, Michel; da Costa, Mardoqueu M.; Lins, Emery C. C. C.; Maciel, Carlos D.

    2014-02-01

    Computer-assisted diagnoses (CAD) are performed by systems with embedded knowledge. These systems work as a second opinion to the physician and use patient data to infer diagnoses for health problems. Caries is the most common oral disease and directly affects both individuals and society. Here we propose the use of dental fluorescence images as input to a caries computer-assisted diagnosis system. We use texture descriptors together with statistical pattern recognition techniques to measure the descriptors' performance for the caries classification task. The data set consists of 64 fluorescence images of in vitro healthy and carious teeth including different surfaces and lesions already diagnosed by an expert. The texture feature extraction was performed on fluorescence images using RGB and YCbCr color spaces, which generated 35 different descriptors for each sample. Principal components analysis was performed for data interpretation and dimensionality reduction. Finally, unsupervised clustering was employed to analyse the relation between the output labeling and the diagnosis of the expert. The PCA result showed a high correlation between the extracted features; seven components were sufficient to represent 91.9% of the information in the original feature vectors. The unsupervised clustering output was compared with the expert classification, resulting in an accuracy of 96.88%. The results show the high accuracy of the proposed approach in identifying carious and non-carious teeth. Therefore, the development of a CAD system for caries using such an approach appears to be promising.
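    The dimensionality-reduction step, keeping just enough principal components to cover a target share of variance (seven components covering 91.9% in the abstract), can be sketched with an SVD-based PCA. This is an illustration, not the authors' code; the function name and threshold are assumptions.

```python
import numpy as np

def pca_reduce(X, var_target=0.9):
    """Project rows of X (samples x features) onto the fewest principal
    components whose cumulative explained variance reaches var_target."""
    Xc = X - X.mean(axis=0)                    # center each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / (s**2).sum()            # variance share per component
    k = int(np.searchsorted(np.cumsum(explained), var_target) + 1)
    return Xc @ Vt[:k].T, k                    # scores and component count
```

    On highly correlated features, as the PCA result in the abstract reports, a small k suffices.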

  14. Combined magnetic resonance and diffusion tensor imaging analyses provide a powerful tool for in vivo assessment of deformation along human muscle fibers.

    PubMed

    Pamuk, Uluç; Karakuzu, Agah; Ozturk, Cengizhan; Acar, Burak; Yucesoy, Can A

    2016-10-01

    Muscle fiber direction strain provides invaluable information for characterizing muscle function. However, methods to study this for human muscles in vivo are lacking. Using magnetic resonance (MR) imaging based deformation analyses and diffusion tensor (DT) imaging based tractography combined, we aimed to assess muscle fiber direction local tissue deformations within the human medial gastrocnemius (GM) muscle. Healthy female subjects (n=5, age=27±1 years) were positioned prone within the MR scanner in a relaxed state with the ankle angle fixed at 90°. The knee was brought to flexion (140.8±3.0°) (undeformed state). Sets of 3D high resolution MR, and DT images were acquired. This protocol was repeated at an extended knee joint position (177.0±1.0°) (deformed state). Tractography and a Demons nonrigid registration algorithm were utilized to calculate local deformations along muscle fascicles. Undeformed-state images were also transformed by a synthetic rigid body motion to calculate strain errors. Mean strain errors were significantly smaller than mean fiber direction strains (lengthening: 0.2±0.1% vs. 8.7±8.5%; shortening: 3.3±0.9% vs. 7.5±4.6%). Shortening and lengthening (up to 23.3% and 116.7%, respectively) occur simultaneously along individual fascicles despite imposed GM lengthening. Along-fiber shear strains confirm the presence of substantial shearing between fascicles. Mean fiber direction strains of different tracts also show a non-uniform distribution. Inhomogeneity of fiber strain indicates epimuscular myofascial force transmission. We conclude that MR and DT imaging analyses combined provide a powerful tool for quantifying deformation along human muscle fibers in vivo. This can contribute substantially to a better understanding of normal and pathological muscle function and mechanisms of treatment techniques.
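    The per-segment strain along a reconstructed fiber tract can be sketched as a simple engineering strain between matched point sequences. This is a simplification for illustration: the study derives dense deformation fields via Demons nonrigid registration, whereas here the correspondence between states is assumed given.

```python
import numpy as np

def fiber_strain(tract0, tract1):
    """Per-segment engineering strain along a fiber tract, given matched
    3D point sequences (N x 3) in the undeformed (tract0) and deformed
    (tract1) states. Positive = lengthening, negative = shortening."""
    l0 = np.linalg.norm(np.diff(np.asarray(tract0, float), axis=0), axis=1)
    l1 = np.linalg.norm(np.diff(np.asarray(tract1, float), axis=0), axis=1)
    return (l1 - l0) / l0
```

    A tract in which some segments lengthen while others shorten, as below, mirrors the simultaneous shortening and lengthening the abstract reports along individual fascicles.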

  15. Functional assessment of glioma pathogenesis by in vivo multi-parametric magnetic resonance imaging and in vitro analyses

    PubMed Central

    Yao, Nai-Wei; Chang, Chen; Lin, Hsiu-Ting; Yen, Chen-Tung; Chen, Jeou-Yuan

    2016-01-01

    Gliomas are aggressive brain tumors with poor prognosis. In this study, we report a novel approach combining both in vivo multi-parametric MRI and in vitro cell culture assessments to evaluate the pathogenic development of gliomas. Osteopontin (OPN), a pleiotropic factor, has been implicated in the formation and progression of various human cancers, including gliomas, through its functions in regulating cell proliferation, survival, angiogenesis, and migration. Using the rat C6 glioma model, the combined approach successfully monitors the acquisition and decrease of cancer hallmarks. We show that knockdown of the expression of OPN reduces C6 cell proliferation, survival, viability and clonogenicity in vitro, and reduces tumor burden and prolongs animal survival in syngeneic rats. OPN depletion is associated with reduced tumor growth, decreased angiogenesis, and an increase of tumor-associated metabolites, as revealed by T2-weighted images, diffusion-weighted images, Ktrans maps, and 1H-MRS, respectively. These strategies allow us to define an important role of OPN in conferring cancer hallmarks, which can be further applied to assess the functional roles of other candidate genes in glioma. In particular, the non-invasive multi-parametric MRI measurement of cancer hallmarks related to proliferation, angiogenesis and altered metabolism may serve as a useful tool for diagnosis and for patient management. PMID:27198662

  16. Single-Cell Imaging and Spectroscopic Analyses of Cr(VI) Reduction on the Surface of Bacterial Cells

    SciTech Connect

    Wang, Yuanmin; Sevinc, Papatya C.; Belchik, Sara M.; Fredrickson, Jim K.; Shi, Liang; Lu, H. Peter

    2013-01-22

    We investigate single-cell reduction of toxic Cr(VI) by the dissimilatory metal-reducing bacterium Shewanella oneidensis MR-1 (MR-1), an important bioremediation process, using Raman spectroscopy and scanning electron microscopy (SEM) combined with energy-dispersive X-ray spectroscopy (EDX). Our experiments indicate that the toxic and highly soluble Cr(VI) can be efficiently reduced to the less toxic and non-soluble Cr2O3 nanoparticles by MR-1. Cr2O3 is observed to emerge as nanoparticles adsorbed on the cell surface and its chemical nature is identified by EDX imaging and Raman spectroscopy. Co-localization of Cr2O3 and cytochromes by EDX imaging and Raman spectroscopy suggests a terminal reductase role for MR-1 surface-exposed cytochromes MtrC and OmcA. Our experiments revealed that the cooperation of surface proteins OmcA and MtrC makes the reduction reaction most efficient, and the order of reducing reactivity of MR-1 is: wild type > single mutant ΔmtrC or ΔomcA > double mutant (ΔomcA-ΔmtrC). Moreover, our results also suggest that the direct microbial Cr(VI) reduction and Fe(II) (hematite)-mediated Cr(VI) reduction mechanisms may co-exist in the reduction processes.

  17. Calibration of remote mineralogy algorithms using modal analyses of Apollo soils by X-ray diffraction and microscopic spectral imaging

    NASA Astrophysics Data System (ADS)

    Crites, S. T.; Taylor, J.; Martel, L.; Lucey, P. G.; Blake, D. F.

    2012-12-01

    We have launched a project to determine the modal mineralogy of over 100 soils from all Apollo sites using quantitative X-ray diffraction (XRD) and microscopic hyperspectral imaging at visible, near-IR and thermal IR wavelengths. The two methods are complementary: XRD is optimal for obtaining the major mineral modes because its measurement is not limited to the surfaces of grains, whereas the hyperspectral imaging method allows us to identify minerals present even down to a single grain, well below the quantitative detection limit of XRD. Each soil is also sent to RELAB to obtain visible, near-IR, and thermal-IR reflectance spectra. The goal is to use quantitative mineralogy in comparison with spectra of the same soils and with remote sensing data of the sampling stations to improve our ability to extract quantitative mineralogy from remote sensing observations. Previous groups have demonstrated methods for using lab mineralogy to validate remote sensing. The LSCC pioneered the method of comparing mineralogy to laboratory spectra of the same soils (Pieters et al. 2002); Blewett et al. (1997) directly compared remote sensing results for sample sites with lab measurements of representative soils from those sites. We are building upon the work of both groups by expanding the number of soils measured to 128, with an emphasis on immature soils to support recent work studying fresh exposures like crater central peaks, and also by incorporating the recent high spatial and spectral resolution data sets over expanded wavelength ranges (e.g. Diviner TIR, M3 hyperspectral VNIR) not available at the time of the previous studies. We have thus far measured 32 Apollo 16 soils using quantitative XRD and are continuing with our collection of soils from the other landing sites. 
We have developed a microscopic spectral imaging system that includes TIR, VIS, and NIR capabilities and have completed proof-of-concept scans of mineral separates and preliminary lunar soil scans with plans

  18. Application of mid-infrared chemical imaging and multivariate chemometrics analyses to characterise a population of microalgae cells.

    PubMed

    Tan, Suat-Teng; Balasubramanian, Rajesh Kumar; Das, Probir; Obbard, Jeffrey Philip; Chew, Wee

    2013-04-01

    A suite of multivariate chemometrics methods was applied to a mid-infrared imaging dataset of a eustigmatophyte, marine Nannochloropsis sp. microalgae strain. The suite includes the improved leader-follower cluster analysis (iLFCA) to interrogate spectra in an unsupervised fashion, a resonant Mie optical scatter correction algorithm (RMieS-EMSC) that improves data linearity, the band-target entropy minimization (BTEM) self-modeling curve resolution for recovering component spectra, and a multi-linear regression (MLR) for estimating relative concentrations and plotting chemical maps of component spectra. A novel Alpha-Stable probability calculation for the microalgae cellular lipid-to-protein ratio Λi is introduced for estimating population characteristics.
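    The final regression step, estimating relative concentrations from recovered component spectra, is ordinary least-squares unmixing, sketched below with illustrative shapes and names (the paper's MLR details are not given in the abstract):

```python
import numpy as np

def unmix(components, observed):
    """Least-squares estimate of relative concentrations c such that
    observed ≈ components.T @ c, where components has shape
    (n_components, n_wavenumbers), e.g. spectra recovered by BTEM."""
    c, *_ = np.linalg.lstsq(components.T, observed, rcond=None)
    return c
```

    Applying `unmix` pixel by pixel over a hyperspectral image yields the chemical maps of each component's relative concentration.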

  19. Evaluating Climate Causation of Conflict in Darfur Using Multi-temporal, Multi-resolution Satellite Image Datasets With Novel Analyses

    NASA Astrophysics Data System (ADS)

    Brown, I.; Wennbom, M.

    2013-12-01

    Climate change, population growth and changes in traditional lifestyles have led to instabilities in traditional demarcations between neighboring ethnic and religious groups in the Sahel region. This has resulted in a number of conflicts as groups resort to arms to settle disputes. Such disputes often centre on or are justified by competition for resources. The conflict in Darfur has been controversially explained by resource scarcity resulting from climate change. Here we analyse established methods of using satellite imagery to assess vegetation health in Darfur. Multi-decadal time series of observations are available using low spatial resolution visible-near infrared imagery. Typically, normalized difference vegetation index (NDVI) analyses are produced to describe changes in vegetation 'greenness' or 'health'. Such approaches have been widely used to evaluate the long term development of vegetation in relation to climate variations across a wide range of environments from the Arctic to the Sahel. These datasets typically measure peak NDVI observed over a given interval and may introduce bias. It is furthermore unclear how the spatial organization of sparse vegetation may affect low-resolution NDVI products. We develop and assess alternative measures of vegetation including descriptors of the growing season, wetness and resource availability. Expanding the range of parameters used in the analysis reduces our dependence on peak NDVI. Furthermore, these descriptors provide a better characterization of the growing season than the single NDVI measure. Using multi-sensor data we combine high temporal/moderate spatial resolution data with low temporal/high spatial resolution data to improve the spatial representativity of the observations and to provide improved spatial analysis of vegetation patterns. The approach places the high resolution observations in the NDVI context space using a longer time series of lower resolution imagery. The vegetation descriptors
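    The peak-NDVI measure the abstract argues can bias such analyses reduces to a per-pixel band ratio followed by a temporal maximum, sketched here with hypothetical arrays:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel normalized difference vegetation index:
    NDVI = (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, float)
    red = np.asarray(red, float)
    return (nir - red) / (nir + red + eps)  # eps guards against divide-by-zero

def peak_ndvi(ndvi_stack):
    """Peak-NDVI compositing over a time series of shape (time, rows, cols):
    the single-measure product that discards the rest of the growing season."""
    return np.max(ndvi_stack, axis=0)
```

    Because `peak_ndvi` keeps only the seasonal maximum per pixel, descriptors of season length, wetness, or resource availability have to come from elsewhere, which is the motivation for the expanded parameter set the authors propose.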

  20. An Imaging Flow Cytometry-based approach to analyse the fission yeast cell cycle in fixed cells.

    PubMed

    Patterson, James O; Swaffer, Matthew; Filby, Andrew

    2015-07-01

    Fission yeast (Schizosaccharomyces pombe) is an excellent model organism for studying eukaryotic cell division because many of the underlying principles and key regulators of cell cycle biology are conserved from yeast to humans. As such it can be employed as a tool for understanding complex human diseases that arise from dysregulation of cell cycle controls, including cancers. Conventional Flow Cytometry (CFC) is a high-throughput, multi-parameter, fluorescence-based single cell analysis technology. It is widely used for studying the mammalian cell cycle, both in the normal and disease states, by measuring changes in DNA content during the transition through G1, S and G2/M using fluorescent DNA-binding dyes. Unfortunately, analysis of the fission yeast cell cycle by CFC is not straightforward because, unlike mammalian cells, cytokinesis occurs after S-phase, meaning that bi-nucleated G1 cells have the same DNA content as mono-nucleated G2 cells and cannot be distinguished using total integrated fluorescence (pulse area). It has been elegantly shown that the width of the DNA pulse can be used to distinguish G2 cells with a single 2C focus from G1 cells with two 1C foci; however, the accuracy of this measurement is dependent on the orientation of the cell as it traverses the laser beam. To this end we sought to improve the accuracy of the fission yeast cell cycle analysis and have developed an Imaging Flow Cytometry (IFC)-based method that is able to preserve the high throughput, objective analysis afforded by CFC in combination with the spatial and morphometric information provided by microscopy. We have been able to derive an analysis framework for subdividing the yeast cell cycle that is based on intensiometric and morphometric measurements and is thus robust against orientation-based misclassification. In addition, we can employ image-based metrics to define populations of septated/bi-nucleated cells and measure cellular dimensions. To our knowledge

  1. Advances in preclinical therapeutics development using small animal imaging and molecular analyses: the gastrointestinal stromal tumors model.

    PubMed

    Pantaleo, M A; Landuzzi, L; Nicoletti, G; Nanni, C; Boschi, S; Piazzi, G; Santini, D; Di Battista, M; Castellucci, P; Lodi, F; Fanti, S; Lollini, P-L; Biasco, G

    2009-09-01

    The widespread use of targeted therapies in the treatment of gastrointestinal stromal tumors (GISTs) has highlighted the urgent need to integrate new molecular imaging technologies, to develop new criteria for tumor response evaluation and to reach a more comprehensive definition of the molecular target. These aspects, which come from clinical experience, are not sufficiently considered in preclinical research studies aiming to evaluate the efficacy of new drugs or new drug combinations with a molecular target. We developed a GIST882 xenograft model in nude mice and performed both molecular and functional characterization of the tumor mass. Mutational analysis of the KIT receptor in the GIST882 cell line and tumor mass showed a mutation on exon 13 that was still present after in vivo cell growth. Glucose metabolism and cell proliferation were evaluated with small-animal PET using both FDG and FLT. The experimental development of new therapies for GIST treatment requires sophisticated animal models that represent the tumor molecular heterogeneity already demonstrated in the clinical setting, and that allow treatment efficacy to be evaluated in terms of inhibition of tumor metabolism rather than change in tumor size alone. This approach to cancer research on GISTs is crucial and essential for innovative perspectives that could cross over to other types of cancer.

  2. Utilizing magnetic resonance imaging logs, openhole logs, and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.A.; Morganti, J.K.; White, H.J.; Noblett, B.R.

    1996-01-01

    Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.

  4. Characterization of Influenza Vaccine Hemagglutinin Complexes by Cryo-Electron Microscopy and Image Analyses Reveals Structural Polymorphisms

    PubMed Central

    McCraw, Dustin M.; Gallagher, John R.

    2016-01-01

    Influenza virus afflicts millions of people worldwide on an annual basis. There is an ever-present risk that animal viruses will cross the species barrier to cause epidemics and pandemics resulting in great morbidity and mortality. Zoonotic outbreaks, such as the H7N9 outbreak, underscore the need to better understand the molecular organization of viral immunogens, such as the recombinant influenza virus hemagglutinin (HA) proteins used in influenza virus subunit vaccines, in order to optimize vaccine efficacy. Here, using cryo-electron microscopy and image analysis, we show that recombinant H7 HA in vaccines formed macromolecular complexes consisting of variable numbers of HA subunits (range, 6 to 8). In addition, HA complexes were distributed across at least four distinct structural classes (polymorphisms). Three-dimensional (3D) reconstruction and molecular modeling indicated that HA was in the prefusion state and suggested that the oligomerization and the structural polymorphisms observed were due to hydrophobic interactions involving the transmembrane regions. These experiments suggest that characterization of the molecular structures of influenza virus HA complexes used in subunit vaccines will lead to better understanding of the differences in vaccine efficacy and to the optimization of subunit vaccines to prevent influenza virus infection. PMID:27074939

  5. Nonintrusive Finger-Vein Recognition System Using NIR Image Sensor and Accuracy Analyses According to Various Factors.

    PubMed

    Pham, Tuyen Danh; Park, Young Ho; Nguyen, Dat Tien; Kwon, Seung Yong; Park, Kang Ryoung

    2015-07-13

    Biometrics is a technology that enables an individual person to be identified based on human physiological and behavioral characteristics. Among biometric technologies, face recognition has been widely used because of its advantages in terms of convenience and non-contact operation. However, its performance is affected by factors such as variation in illumination, facial expression, and head pose. Therefore, fingerprint and iris recognition are preferred alternatives. However, the performance of the former can be adversely affected by skin condition, including scarring and dryness, while the latter has the disadvantages of high cost, large system size, and inconvenience to the user, who must align their eyes with the iris camera. In an attempt to overcome these problems, finger-vein recognition has been vigorously researched, but an analysis of its accuracy according to various factors has not received much attention. Therefore, we propose a nonintrusive finger-vein recognition system using a near-infrared (NIR) image sensor and analyze its accuracy considering various factors. The experimental results obtained with three databases showed that our system can operate in real applications with high accuracy, and that the dissimilarity of the finger veins of different people is larger than that between finger types and hands.

  6. Textural analyses of carbon fiber materials by 2D-FFT of complex images obtained by high frequency eddy current imaging (HF-ECI)

    NASA Astrophysics Data System (ADS)

    Schulze, Martin H.; Heuer, Henning

    2012-04-01

    Carbon fiber based materials are used in many lightweight applications in aeronautical, automotive, machine and civil engineering. With increasing automation in the production process of CFRP laminates, manual optical inspection of each resin transfer molding (RTM) layer is not practicable. Because they are limited to surface inspection, optical systems cannot observe the quality parameters of multilayer, three-dimensional materials. Imaging eddy-current (EC) NDT is the only suitable inspection method for non-resin materials in the textile state that allows inspection of surface and hidden layers in parallel. The HF-ECI method has the capability to measure layer displacements (misaligned angle orientations) and gap sizes in a multilayer carbon fiber structure. The EC technique uses the variation of the electrical conductivity of carbon-based materials to obtain material properties. Besides determining textural parameters like layer orientation and gap sizes between rovings, the method can also detect foreign polymer particles and fuzzy balls, and visualize undulations. For all of these typical parameters, an imaging classification process chain based on a high-resolution directional EC-imaging device named EddyCus® MPECS and a 2D-FFT with adapted preprocessing algorithms was developed.
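The orientation measurement that a 2D-FFT enables can be illustrated on a synthetic striped texture (a generic sketch, not the EddyCus® MPECS processing chain): the off-centre peak of the shifted 2D spectrum gives the dominant layer orientation.

```python
import numpy as np

# Build a striped "fiber layer" image with a known orientation, then
# recover that orientation from the location of the 2D FFT peak.
n = 128
y, x = np.mgrid[0:n, 0:n]
angle_true = np.deg2rad(30)                 # assumed layer angle (illustrative)
freq = 8 / n                                # stripe frequency, cycles/pixel
img = np.cos(2 * np.pi * freq * (np.cos(angle_true) * x
                                 + np.sin(angle_true) * y))

spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
spec[n // 2, n // 2] = 0                    # suppress the DC component
ky, kx = np.unravel_index(np.argmax(spec), spec.shape)
angle_est = np.arctan2(ky - n // 2, kx - n // 2)

print(round(np.rad2deg(angle_est) % 180, 1))  # close to 30 degrees
```

A real preprocessing chain would window the image and average several spectral peaks; this sketch only shows why the FFT peak encodes misaligned angle orientation.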

  7. An automated image-based method of 3D subject-specific body segment parameter estimation for kinetic analyses of rapid movements.

    PubMed

    Sheets, Alison L; Corazza, Stefano; Andriacchi, Thomas P

    2010-01-01

    Accurate subject-specific body segment parameters (BSPs) are necessary to perform kinetic analyses of human movements with large accelerations, or with no external contact forces or moments. A new automated topographical image-based method of estimating segment mass, center of mass (CM) position, and moments of inertia is presented. Body geometry and volume were measured using a laser scanner; an automated pose and shape registration algorithm then segmented the scanned body surface and identified joint center (JC) positions. Assuming the constant segment densities of Dempster, thigh and shank masses, CM locations, and moments of inertia were estimated for four male subjects with body mass indexes (BMIs) of 19.7-38.2. The subject-specific BSPs were compared with those determined using the Dempster and Clauser regression equations. The influence of BSP and BMI differences on knee and hip net forces and moments during a running swing phase was quantified for the subjects with the smallest and largest BMIs. Subject-specific BSPs for 15 body segments were quickly calculated using the image-based method, and total subject masses were overestimated by 1.7-2.9%. When compared with the Dempster and Clauser methods, image-based and regression-estimated thigh BSPs varied more than the shank parameters. Thigh masses and hip JC to thigh CM distances were consistently larger, and each transverse moment of inertia was smaller, using the image-based method. Because the shank had larger linear and angular accelerations than the thigh during the running swing phase, shank BSP differences had a larger effect on calculated intersegmental forces and moments at the knee joint than thigh BSP differences did at the hip. The net knee kinetic differences caused by the shank BSP differences were the largest contributors to the hip variations. Finally, BSP differences produced larger kinetic differences for the subject with larger segment masses, suggesting that parameter accuracy is more
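The core of such an image-based BSP estimate can be sketched as follows, assuming a uniform Dempster-style segment density over a voxelised segment volume; the voxel size, density value and crude cylindrical "thigh" are all illustrative stand-ins for the laser-scan segmentation:

```python
import numpy as np

# Hedged sketch: segment mass, centre of mass, and a moment of inertia
# from a binary voxel occupancy grid under a uniform-density assumption.
voxel_mm = 2.0                              # isotropic voxel edge, mm (assumed)
density = 1.05e-3                           # g/mm^3, assumed segment density

# Hypothetical occupancy grid for one segment: a 400 mm long cylinder.
z, y, x = np.mgrid[0:200, 0:60, 0:60]
seg = ((y - 30) ** 2 + (x - 30) ** 2) <= 25 ** 2

vol = seg.sum() * voxel_mm ** 3             # volume, mm^3
mass = density * vol                        # mass, g
coords = np.argwhere(seg) * voxel_mm        # voxel centres, mm, (z, y, x)
cm = coords.mean(axis=0)                    # centre of mass

# Moment of inertia about the longitudinal (z) axis through the CM.
r2 = (coords[:, 1] - cm[1]) ** 2 + (coords[:, 2] - cm[2]) ** 2
Izz = (mass / len(coords)) * r2.sum()       # g*mm^2
print(round(mass / 1000, 2), "kg")
```

Summing per-voxel contributions this way is what makes the estimate subject-specific: the grid comes from the scan, and only the density is taken from cadaver tables.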

  8. Growing seasons of Nordic mountain birch in northernmost Europe as indicated by long-term field studies and analyses of satellite images.

    PubMed

    Shutova, E; Wielgolaski, F E; Karlsen, S R; Makarova, O; Berlina, N; Filimonova, T; Haraldsson, E; Aspholm, P E; Flø, L; Høgda, K A

    2006-11-01

    The phenophases first greening (bud burst) and yellowing of Nordic mountain birch (Betula pubescens ssp. tortuosa, also called B. p. ssp. czerepanovii) were observed at three sites on the Kola Peninsula in northernmost Europe during the period 1964-2003, and at two sites in the trans-boundary Pasvik-Enare region during 1994-2003. The field observations were compared with satellite images based on the GIMMS-NDVI dataset covering 1982-2002 at the start and end of the growing season. A trend towards delayed first greening was observed at only one of the sites (Kandalaksha) over the 40-year period. This fits well with the delayed onset of the growing season for that site based on satellite images. No significant changes in the time of greening at the other sites were found with either field observations or satellite analyses throughout the study period. These results differ from the earlier spring generally observed in other parts of Europe in recent decades. In the coldest regions of Europe, e.g. in northern high mountains and the northernmost continental areas, increased precipitation associated with the generally positive North Atlantic Oscillation in the last few decades has often fallen as snow. Increased snow may delay the onset of the growing season, although increased temperature generally causes earlier spring phenophases. Autumn yellowing of birch leaves tends towards an earlier date at all sites. Due to both later birch greening and earlier yellowing at the Kandalaksha site, the growing season there has also become significantly shorter during the years observed. The sites showing the most advanced yellowing in the field throughout the study period fit well with areas showing an earlier end of the growing season from satellite images covering 1982-2002. The earlier yellowing is highly correlated with a trend at the sites towards earlier decreasing air temperature in autumn over the study period, indicating that this environmental factor is important also for

  9. Surface Roughness and Critical Exponent Analyses of Boron-Doped Diamond Films Using Atomic Force Microscopy Imaging: Application of Autocorrelation and Power Spectral Density Functions

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Vierkant, G. P.

    2014-09-01

    The evolution of the surface roughness of growing metal or semiconductor thin films provides much-needed information about their growth kinetics and corresponding mechanism. While some systems show stages of nucleation, coalescence, and growth, others exhibit varying microstructures for different process conditions. In view of these classifications, we report herein detailed analyses based on atomic force microscopy (AFM) characterization to extract the surface roughness and growth kinetics exponents of relatively low boron-doped diamond (BDD) films, utilizing the analytical power spectral density (PSD) and autocorrelation function (ACF) as mathematical tools. The machining industry has applied PSD for a number of years for tool design and for analysis of wear and machined surface quality. Herein, we present similar analyses at the mesoscale to study the surface morphology as well as quality of BDD films grown using the microwave plasma-assisted chemical vapor deposition technique. PSD spectra as a function of boron concentration (in the gaseous phase) are compared with those for samples grown without boron. We find that relatively higher boron concentration yields higher amplitudes of the longer-wavelength power spectral lines, with amplitudes decreasing in an exponential or power-law fashion towards shorter wavelengths, determining the roughness exponent (α ≈ 0.16 ± 0.03) and growth exponent (β ≈ 0.54), albeit indirectly. A unique application of the ACF, which is widely used in signal processing, was also applied to one-dimensional or line analyses (i.e., along the x- and y-axes) of AFM images, revealing surface topology across datasets with varying boron concentration. Here, the ACF was used to cancel random surface "noise" and identify any spatial periodicity via repetitive ACF peaks or spatially correlated noise. Periodicity at shorter spatial wavelengths was observed for no doping and low doping levels, while smaller correlations were observed for relatively
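A minimal version of the line-profile analysis, applied to a synthetic AFM height profile rather than measured data, computes the one-sided PSD from the FFT and the ACF via the Wiener-Khinchin theorem:

```python
import numpy as np

# Synthetic AFM line scan: periodic roughness plus noise (parameters
# are illustrative, not fitted to the BDD films in the paper).
rng = np.random.default_rng(1)
n, dx = 1024, 1.0                        # samples, nm per sample
xs = np.arange(n) * dx
h = 2.0 * np.sin(2 * np.pi * xs / 64) + rng.normal(0, 0.5, n)
h -= h.mean()

# One-sided PSD from the FFT (Parseval-normalised).
H = np.fft.rfft(h)
psd = (np.abs(H) ** 2) * dx / n
freqs = np.fft.rfftfreq(n, dx)

# Circular autocorrelation via the Wiener-Khinchin theorem,
# normalised so acf[0] = 1; repetitive peaks reveal spatial periodicity.
acf = np.fft.irfft(np.abs(H) ** 2) / n
acf /= acf[0]

peak = freqs[1:][np.argmax(psd[1:])]     # skip the DC bin
print(round(1 / peak))                   # dominant wavelength in nm
```

The PSD locates the dominant spatial wavelength; the ACF peak at that same lag shows the periodicity surviving the random "noise", which is the cancellation property the abstract describes.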

  10. Computational replication of the patient-specific stenting procedure for coronary artery bifurcations: From OCT and CT imaging to structural and hemodynamics analyses.

    PubMed

    Chiastra, Claudio; Wu, Wei; Dickerhoff, Benjamin; Aleiou, Ali; Dubini, Gabriele; Otake, Hiromasa; Migliavacca, Francesco; LaDisa, John F

    2016-07-26

    The optimal stenting technique for coronary artery bifurcations is still debated. With additional advances, computational simulations may soon be used to compare stent designs or strategies based on verified structural and hemodynamics results in order to identify the optimal solution for each individual's anatomy. In this study, patient-specific simulations of stent deployment were performed for two cases to replicate the complete procedure conducted by interventional cardiologists. Subsequent computational fluid dynamics (CFD) analyses were conducted to quantify hemodynamic quantities linked to restenosis. Patient-specific pre-operative models of coronary bifurcations were reconstructed from CT angiography and optical coherence tomography (OCT). Plaque location and composition were estimated from OCT and assigned to the models, and structural simulations were performed in Abaqus. Artery geometries after virtual expansion of Xience Prime or Nobori stents created in SolidWorks were compared to post-operative geometry from OCT and CT before being extracted and used for CFD simulations in SimVascular. Inflow boundary conditions based on body surface area were prescribed, and downstream vascular resistances and capacitances were applied at branches to mimic physiology. Artery geometries obtained after virtual expansion were in good agreement with those reconstructed from patient images. Quantitative comparison of the distance between reconstructed and post-stent geometries revealed a maximum difference in area of 20.4%. Adverse indices of wall shear stress were more pronounced for the thicker Nobori stent in both patients. These findings verify structural analyses of stent expansion, introduce a workflow to combine software packages for solid and fluid mechanics analysis, and underscore important stent design features from prior idealized studies. The proposed approach may ultimately be useful in determining the optimal choice and position of a stent for each patient.

  11. A SPITZER IRAC IMAGING SURVEY FOR T DWARF COMPANIONS AROUND M, L, AND T DWARFS: OBSERVATIONS, RESULTS, AND MONTE CARLO POPULATION ANALYSES

    SciTech Connect

    Carson, J. C.; Marengo, M.; Patten, B. M.; Hora, J. L.; Schuster, M. T.; Fazio, G. G.; Luhman, K. L.; Sonnett, S. M.; Allen, P. R.; Stauffer, J. R.; Schnupp, C.

    2011-12-20

    We report observational techniques, results, and Monte Carlo population analyses from a Spitzer Infrared Array Camera imaging survey for substellar companions to 117 nearby M, L, and T dwarf systems (median distance of 10 pc, mass range of 0.6 to ~0.05 M_Sun). The two-epoch survey achieves typical detection sensitivities to substellar companions of [4.5 μm] ≤ 17.2 mag for angular separations between about 7'' and 165''. Based on common proper motion analysis, we find no evidence for new substellar companions. Using Monte Carlo orbital simulations (assuming random inclination, random eccentricity, and random longitude of pericenter), we conclude that the observational sensitivities translate to an ability to detect 600-1100 K brown dwarf companions at semimajor axes ≳35 AU and to detect 500-600 K companions at semimajor axes ≳60 AU. The simulations also estimate a 600-1100 K T dwarf companion fraction of <3.4% for 35-1200 AU separations and <12.4% for the 500-600 K companions for 60-1000 AU separations.
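The flavour of such a Monte Carlo population analysis can be sketched with a deliberately simplified model (circular orbits only, so the eccentricity and longitude-of-pericenter draws are omitted; the semimajor axis is illustrative): sample random inclinations and orbital phases, then ask how often a companion's projected separation falls inside the survey's inner angular limit.

```python
import numpy as np

# Simplified Monte Carlo sketch of projected separations for circular
# orbits with isotropic inclinations and uniform orbital phase.
rng = np.random.default_rng(2)
n = 100_000
a = 100.0                                   # AU, assumed semimajor axis
cos_i = rng.uniform(0, 1, n)                # isotropic inclinations
theta = rng.uniform(0, 2 * np.pi, n)        # orbital phase

# Projected separation of a circular orbit viewed at inclination i.
sep = a * np.sqrt(np.cos(theta) ** 2 + (np.sin(theta) * cos_i) ** 2)

d = 10.0                                    # pc, the survey's median distance
inner_au = 7.0 * d                          # 7'' inner angular limit -> AU
frac_hidden = np.mean(sep < inner_au)       # projected inside the limit
print(round(frac_hidden, 2))
```

Translating an angular sensitivity window into a completeness over semimajor axis is exactly this kind of projection average; the full analysis additionally marginalises over eccentricity and pericenter angle.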

  12. Sociopolitical Analyses.

    ERIC Educational Resources Information Center

    Van Galen, Jane, Ed.; And Others

    1992-01-01

    This theme issue of the serial "Educational Foundations" contains four articles devoted to the topic of "Sociopolitical Analyses." In "An Interview with Peter L. McLaren," Mary Leach presented the views of Peter L. McLaren on topics of local and national discourses, values, and the politics of difference. Landon E.…

  13. Histology-driven data mining of lipid signatures from multiple imaging mass spectrometry analyses: application to human colorectal cancer liver metastasis biopsies.

    PubMed

    Thomas, Aurélien; Patterson, Nathan Heath; Marcinkiewicz, Martin M; Lazaris, Anthoula; Metrakos, Peter; Chaurand, Pierre

    2013-03-05

    Imaging mass spectrometry (IMS) represents an innovative tool in the cancer research pipeline, which is increasingly being used in clinical and pharmaceutical applications. The unique properties of the technique, especially the amount of data generated, make the handling of data from multiple IMS acquisitions challenging. This work presents a histology-driven IMS approach aiming to identify discriminant lipid signatures from the simultaneous mining of IMS data sets from multiple samples. The feasibility of the developed workflow is evaluated on a set of three human colorectal cancer liver metastasis (CRCLM) tissue sections. Lipid IMS on tissue sections was performed using MALDI-TOF/TOF MS in both negative and positive ionization modes after 1,5-diaminonaphthalene matrix deposition by sublimation. Results from both positive and negative acquisitions were combined during data mining to simplify the process and interrogate a larger lipidome in a single analysis. To reduce the complexity of the IMS data sets, a sub data set was generated by randomly selecting a fixed number of spectra from a histologically defined region of interest, resulting in a 10-fold data reduction. Principal component analysis confirmed that the molecular selectivity of the regions of interest is maintained after data reduction. Partial least-squares and heat map analyses demonstrated a selective signature of the CRCLM, revealing lipids that are significantly up- and down-regulated in the tumor region. This comprehensive approach is thus of interest for defining disease signatures directly from IMS data sets by the use of combinatory data mining, opening novel routes of investigation for addressing the demands of the clinical setting.
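The random-subsampling step, paired with a PCA check that region selectivity survives the reduction, can be sketched as below; the "spectra" are synthetic stand-ins with a built-in tumor/normal difference, not MALDI data:

```python
import numpy as np

# Synthetic stand-ins for ROI spectra: two regions whose mean spectra
# differ along the m/z axis (illustrative, not real lipid signals).
rng = np.random.default_rng(3)
n_mz = 200
tumor = rng.normal(0, 1, (1000, n_mz)) + np.linspace(0, 2, n_mz)
normal = rng.normal(0, 1, (1000, n_mz)) - np.linspace(0, 2, n_mz)

def subsample(spectra, k, rng):
    """Keep k randomly chosen spectra from one ROI (here a 10-fold reduction)."""
    idx = rng.choice(len(spectra), size=k, replace=False)
    return spectra[idx]

sub = np.vstack([subsample(tumor, 100, rng), subsample(normal, 100, rng)])
centred = sub - sub.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
pc1 = centred @ vt[0]                     # scores on the first component

# The two ROIs should still separate along PC1 despite the reduction.
print(pc1[:100].mean() * pc1[100:].mean() < 0)
```

The opposite-signed PC1 score means are the toy analogue of the paper's check that molecular selectivity of the ROIs is preserved after data reduction.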

  14. Geologic analyses of LANDSAT-1 multispectral imagery of a possible power plant site employing digital and analog image processing. [in Pennsylvania

    NASA Technical Reports Server (NTRS)

    Lovegreen, J. R.; Prosser, W. J.; Millet, R. A.

    1975-01-01

    A site in the Great Valley subsection of the Valley and Ridge physiographic province in eastern Pennsylvania was studied to evaluate the use of digital and analog image processing for geologic investigations. Ground truth at the site was obtained by a field mapping program, a subsurface exploration investigation, and a review of available published and unpublished literature. Remote sensing data were analyzed using standard manual techniques. LANDSAT-1 imagery was analyzed using digital image processing employing the multispectral Image 100 system and analog color processing employing the VP-8 image analyzer. This study deals primarily with linears identified by image processing, the correlation of these linears with known structural features and with linears identified by manual interpretation, and the identification of rock outcrops in areas of extensive vegetative cover by image processing. The results of this study indicate that image processing can be a cost-effective tool for evaluating geologic and linear features in regional studies encompassing large areas, such as power plant siting. Digital image processing can be an effective tool for identifying rock outcrops in areas of heavy vegetative cover.

  15. Analyses of sexual dimorphism of contemporary Japanese using reconstructed three-dimensional CT images--curvature of the best-fit circle of the greater sciatic notch.

    PubMed

    Biwasaka, Hitoshi; Aoki, Yasuhiro; Tanijiri, Toyohisa; Sato, Kei; Fujita, Sachiko; Yoshioka, Kunihiro; Tomabechi, Makiko

    2009-04-01

    We examined various methods of expressing the sexual dimorphism of the greater sciatic notch (GSN) of the pelvis in contemporary Japanese residents by analyzing three-dimensional (3D) images reconstructed from multi-slice computed tomography (CT) data, using image-processing and measurement software. The mean error of anthropological measurement values between two skeletonized pelves and their reconstructed 3D-CT images was 1.4%. A spline curve was set along the edge of the GSN of the reconstructed pelvic 3D-CT images. A best-fit circle was then created for subsets of the spline curve, 5-60 mm in length and passing through the deepest point (inflection point) of the GSN, and the radius of the circle (curvature radius) and its ratio to the maximum pelvic height (curvature quotient) were computed. In analysis of images reconstructed from CT data of 180 individuals (male: 91, female: 89), sexes were correctly identified in 89.4% of specimens with a spline curve length of 60 mm. Because sexing was possible even in deeper regions of the GSN, which are relatively resistant to postmortem damage, the present method may be useful for practical forensic investigation.
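The curvature-radius computation reduces to a least-squares circle fit to points along the notch edge. A common algebraic (Kåsa) formulation, applied here to a synthetic 30 mm arc of known radius rather than a CT-derived spline, looks like this:

```python
import numpy as np

def fit_circle(pts):
    """Kasa fit: solve x^2 + y^2 = 2a*x + 2b*y + c in the least-squares
    sense; the centre is (a, b) and the radius is sqrt(c + a^2 + b^2)."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    (a, b, c), *_ = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)
    return (a, b), np.sqrt(c + a ** 2 + b ** 2)

# A 30 mm arc (r * dt = 25 * 1.2) sampled from a circle of radius 25 mm,
# standing in for a spline segment through the GSN inflection point.
t = np.linspace(-0.6, 0.6, 50)
arc = np.column_stack([25 * np.cos(t) + 3, 25 * np.sin(t) - 7])
centre, radius = fit_circle(arc)
print(round(radius, 1))                  # → 25.0
```

The recovered radius is the "curvature radius" of the abstract; dividing it by maximum pelvic height would give the curvature quotient.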

  16. PCaAnalyser: A 2D-Image Analysis Based Module for Effective Determination of Prostate Cancer Progression in 3D Culture

    PubMed Central

    Lovitt, Carrie J.; Avery, Vicky M.

    2013-01-01

    Three-dimensional (3D) in vitro cell-based assays for prostate cancer (PCa) research are rapidly becoming the preferred alternative to conventional 2D monolayer cultures. 3D assays more precisely mimic the microenvironment found in vivo, and thus are ideally suited to evaluate compounds and their suitability for progression in the drug discovery pipeline. To achieve the desired high throughput needed for most screening programs, automated quantification of 3D cultures is required. Towards this end, this paper reports on the development of a prototype analysis module for an automated high-content-analysis (HCA) system, which allows for accurate and fast investigation of in vitro 3D cell culture models for PCa. The Java-based program, which we have named PCaAnalyser, uses novel algorithms that allow accurate and rapid quantitation of protein expression in 3D cell culture. As currently configured, PCaAnalyser can quantify a range of biological parameters, including nuclei count, nuclei-spheroid membership prediction, and various function-based classifications of peripheral and non-peripheral areas to measure expression of biomarkers and protein constituents known to be associated with PCa progression, as well as effectively segmenting cellular objects across a range of signal-to-noise ratios. In addition, the PCaAnalyser architecture is highly flexible, operating as a single independent analysis as well as in batch mode, essential for high-throughput screening (HTS). Utilising PCaAnalyser, accurate and rapid analysis in an automated high-throughput manner is provided, and reproducible analysis of the distribution and intensity of well-established markers associated with PCa progression in a range of metastatic PCa cell lines (DU145 and PC3) in a 3D model is demonstrated. PMID:24278197
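One representative sub-task, nuclei counting, can be sketched as thresholding plus connected-component labelling; this pure-NumPy toy is only an analogue of the module's own (Java) algorithms:

```python
import numpy as np

def count_objects(mask):
    """Count 4-connected foreground components with a stack-based fill."""
    mask = mask.copy()
    count = 0
    for i, j in zip(*np.nonzero(mask)):
        if not mask[i, j]:
            continue                       # already absorbed by a fill
        count += 1
        stack = [(i, j)]
        while stack:
            y, x = stack.pop()
            if 0 <= y < mask.shape[0] and 0 <= x < mask.shape[1] and mask[y, x]:
                mask[y, x] = False         # clear this component's pixels
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Synthetic intensity image with three well-separated "nuclei".
yy, xx = np.mgrid[0:40, 0:40]
img = np.zeros((40, 40))
for cy, cx in [(10, 10), (10, 30), (30, 20)]:
    img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 8)

print(count_objects(img > 0.5))            # → 3
```

Real spheroid imagery needs adaptive thresholds and declumping of touching nuclei; the sketch only shows the labelling backbone such a count rests on.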

  17. Histochemical analyses and quantum dot imaging of microvascular blood flow with pulmonary edema in living mouse lungs by "in vivo cryotechnique".

    PubMed

    Saitoh, Yurika; Terada, Nobuo; Saitoh, Sei; Ohno, Nobuhiko; Jin, Takashi; Ohno, Shinichi

    2012-02-01

    Light microscopic imaging of blood vessels and of the distribution of serum proteins is essential to analyze hemodynamics in living animal lungs under normal respiration or in respiratory disease. In this study, to capture dynamically changing morphology and immunohistochemical images of their living states, the "in vivo cryotechnique" (IVCT) combined with freeze-substitution fixation was applied to anesthetized mouse lungs. With hematoxylin-eosin staining, morphological features such as the shape of the alveolar septum and the size of the alveolar lumen reflected respiratory conditions in vivo, and alveolar capillaries were filled with variously shaped erythrocytes. Albumin was usually immunolocalized in the capillaries, which was confirmed by double-immunostaining for aquaporin-1 of the endothelium. To capture accurate time courses of blood flow in peripheral pulmonary alveoli, glutathione-coated quantum dots (QDs) were injected into right ventricles, and IVCT was then performed at different time points after the QD injection. QDs were localized in most arterioles and some alveolar capillaries at 1 s, and later in venules at 2 s, reflecting the typical blood flow direction in vivo. Three-dimensional QD images of microvascular networks were reconstructed by confocal laser scanning microscopy. IVCT was also applied to the lungs of an acute pulmonary hypertension mouse model. Erythrocytes were crammed into blood vessels, and some serum components leaked into alveolar lumens, as confirmed by mouse albumin immunostaining. Some separated collagen fibers and connecting elastic fibers were still detected in the edematous tunica adventitia near terminal bronchioles. Thus, IVCT combined with histochemical approaches enabled us to capture native images of dynamically changing structures and microvascular hemodynamics of living mouse lungs.

  18. Micro-flow imaging analyses reflect mechanisms of aggregate formation: Comparing protein particle data sets using the Kullback-Leibler divergence.

    PubMed

    Maddux, Nathaniel R; Daniels, Austin L; Randolph, Theodore W

    2017-01-31

    Sub-visible particles in therapeutic protein formulations are an increasing manufacturing and regulatory concern due to their potential to cause adverse immune responses. Flow imaging microscopy is used extensively to detect sub-visible particles and investigate product deviations, typically by comparing imaging data using histograms of particle descriptors. Such an approach discards much information, and requires effort to interpret differences, which is problematic when comparing many data sets. We propose to compare imaging data by using the Kullback-Leibler divergence, an information theoretic measure of the difference of distributions [1]. We use the divergence to generate scatter plots representing the similarity between data sets, and to classify new data into previously determined categories. Our approach is multidimensional, automated and less biased than traditional techniques. We demonstrate the method with FlowCAM® imagery of protein aggregates acquired from monoclonal antibody samples subjected to different stresses. The method succeeds in classifying aggregated samples by stress condition, and, once trained, is able to identify the stress that caused aggregate formation in new samples. In addition to potentially detecting subtle incipient manufacturing faults, the method may have applications to verification of product uniformity after manufacturing changes, identification of counterfeit products, and development of closely matching bio-similar products.
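A minimal sketch of the metric itself, using histograms of a single hypothetical descriptor and a pseudo-count to avoid infinite divergences (the paper's multidimensional, automated pipeline is more elaborate):

```python
import numpy as np

def kl_divergence(sample_p, sample_q, bins):
    """Discrete D_KL(P || Q) over a shared binning, with a small
    pseudo-count so empty bins do not produce infinities."""
    p, _ = np.histogram(sample_p, bins=bins)
    q, _ = np.histogram(sample_q, bins=bins)
    p = (p + 1e-9) / (p + 1e-9).sum()
    q = (q + 1e-9) / (q + 1e-9).sum()
    return float(np.sum(p * np.log(p / q)))

# Synthetic particle-size distributions standing in for descriptor data
# from an unstressed control and two stress conditions (all illustrative).
rng = np.random.default_rng(4)
bins = np.linspace(0, 40, 41)               # um, shared descriptor bins
unstressed = rng.lognormal(1.5, 0.4, 5000)
heat = rng.lognormal(1.9, 0.5, 5000)
shaken = rng.lognormal(1.6, 0.4, 5000)

# The more different stress condition diverges more from the control.
print(kl_divergence(unstressed, heat, bins) >
      kl_divergence(unstressed, shaken, bins))
```

Pairwise divergences computed this way are what populate the similarity scatter plots; classification then amounts to assigning new data to the category with the smallest divergence.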

  19. Active brain changes after initiating fingolimod therapy in multiple sclerosis patients using individual voxel-based analyses for diffusion tensor imaging

    PubMed Central

    Senda, Joe; Watanabe, Hirohisa; Endo, Kuniyuki; Yasui, Keizo; Hawsegawa, Yasuhiro; Yoneyama, Noritaka; Tsuboi, Takashi; Hara, Kazuhiro; Ito, Mizuki; Atsuta, Naoki; Epifanio Jr, Bagarinao; Katsuno, Masahisa; Naganawa, Shinji; Sobue, Gen

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor images (DTI) and voxel-based morphometry (VBM) in patients with multiple sclerosis (MS) can sensitively detect occult tissue damage that underlies pathological changes in the brain. In the present study, we assessed four patients with MS, both at the start of fingolimod therapy and after four months of clinical remission, using VBA of DTI, VBM, and fluid-attenuated inversion recovery (FLAIR) imaging. DTI images for all four patients showed widespread areas of increased mean diffusivity (MD) and decreased fractional anisotropy (FA) extending beyond the high-intensity signal areas on the images. After four months of continuous fingolimod therapy, DTI abnormalities progressed; in particular, MD was significantly increased, while brain volume and high-intensity signals were unchanged. These findings suggest that VBA of DTI (e.g., MD) may help assess MS demyelination as a neuroinflammatory condition, even when clinical manifestations of MS appear to be in complete remission during fingolimod therapy. PMID:28008201

  20. Analyses of sexual dimorphism of reconstructed pelvic computed tomography images of contemporary Japanese using curvature of the greater sciatic notch, pubic arch and greater pelvis.

    PubMed

    Biwasaka, Hitoshi; Aoki, Yasuhiro; Sato, Kei; Tanijiri, Toyohisa; Fujita, Sachiko; Dewa, Koji; Yoshioka, Kunihiro; Tomabechi, Makiko

    2012-06-10

    Three-dimensional pelvic images were reconstructed from multi-slice CT data of contemporary Japanese (males: 124; females: 104; 25-92 years old), and curvature analysis to examine sexual dimorphism was carried out on the greater sciatic notch (GSN), the pubic arch and the greater pelvis in the images. Reconstructed pelvic CT images were visualized fairly well and anatomical landmarks were easily recognizable. When calculating the radii (curvature radii) of the best-fit circles for the spline curves set along the edges of the GSNs and of the pubic arches, sex was correctly identified from these regions in 89.1% (males: 93.8%; females: 83.7%) and 94.7% (males: 97.3%; females: 91.8%) of cases, respectively, by setting an appropriate cut-off value. Furthermore, sexing was possible even in deeper regions of the GSN, which are relatively resistant to postmortem damage. Curvature radii of the best-fit spheres of the greater pelves showed no significant difference between sexes. However, the curvature of the best-fit sphere for the left iliac fossa was significantly larger than that of the right one (p < 10^-24) in males, and the ratio was >1.0 in 88% of all male specimens analyzed. Meanwhile, no significant difference was observed among female samples. Although some left-sided dominancy has been reported in 2-dimensional measurements of the human pelvis, this 3-dimensional laterality in males was much more significant, and is a potential index of sex difference.
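    The curvature-radius step described above, fitting a best-fit circle to points sampled along a notch or arch edge, can be sketched with an algebraic least-squares (Kåsa) circle fit. The arc points below are synthetic, not measurement data:

    ```python
    import numpy as np

    def fit_circle_radius(x, y):
        """Kasa algebraic least-squares circle fit: solves
        x^2 + y^2 = 2*a*x + 2*b*y + c for the centre (a, b),
        then the radius is r = sqrt(c + a^2 + b^2)."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        rhs = x**2 + y**2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return float(np.sqrt(c + a**2 + b**2))

    # points sampled along an arc (as along a notch edge) of a
    # circle of radius 3 centred at (1, 2)
    t = np.linspace(0.2, 2.0, 30)
    r = fit_circle_radius(1 + 3 * np.cos(t), 2 + 3 * np.sin(t))
    ```

    A sex cut-off would then be applied to `r`; the larger female GSN curvature radius is what makes such a threshold discriminating.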

  1. Assimilating All-Sky GPM Microwave Imager(GMI) Radiance Data in NASA GEOS-5 System for Global Cloud and Precipitation Analyses

    NASA Astrophysics Data System (ADS)

    Kim, M. J.; Jin, J.; McCarty, W.; Todling, R.; Holdaway, D. R.; Gelaro, R.

    2014-12-01

    The NASA Global Modeling and Assimilation Office (GMAO) works to maximize the impact of satellite observations in the analysis and prediction of climate and weather through integrated Earth system modeling and data assimilation. To achieve this goal, the GMAO undertakes model and assimilation development and generates products to support NASA instrument teams and the NASA Earth science program. Currently, the Atmospheric Data Assimilation System (ADAS) in the Goddard Earth Observing System Model, Version 5 (GEOS-5) combines millions of observations and short-term forecasts to determine the best estimate, or analysis, of the instantaneous atmospheric state. However, ADAS has been geared towards utilization of observations in clear-sky conditions, and the majority of satellite channel data affected by clouds are discarded. Microwave imager data from satellites can be a significant source of information on clouds and precipitation, but the data are presently underutilized, as only surface rain rates from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) are assimilated, with small weight assigned in the analysis process. As clouds and precipitation often occur in regions with high forecast sensitivity, improvements in the temperature, moisture, wind and cloud analysis of these regions are likely to contribute to significant gains in numerical weather prediction accuracy. This presentation is intended to give an overview of GMAO's recent progress in assimilating all-sky GPM Microwave Imager (GMI) radiance data in the GEOS-5 system. This includes development of various new components to assimilate cloud- and precipitation-affected data in addition to data in clear-sky conditions. New observation operators, quality controls, moisture control variables, observation and background error models, and a methodology to incorporate the linearized moisture physics in the assimilation system are described. In addition preliminary results showing impacts of

  2. Analyses of requirements for computer control and data processing experiment subsystems: Image data processing system (IDAPS) software description (7094 version), volume 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A description of each of the software modules of the Image Data Processing System (IDAPS) is presented. The changes in the software modules are the result of additions to the application software of the system and an upgrade of the IBM 7094 Mod(1) computer to a 1301 disk storage configuration. Necessary information about IDAPS software is supplied to the computer programmer who desires to make changes in the software system or who desires to use portions of the software outside of the IDAPS system. Each software module is documented with: module name, purpose, usage, common block(s) description, method (algorithm of subroutine), flow diagram (if needed), subroutines called, and storage requirements.

  3. Characteristics and Origin of a Cratered Unit near the MSL Bradbury Landing Site (Gale Crater, Mars) Based on Analyses of Surface Data and Orbital Images

    NASA Astrophysics Data System (ADS)

    Jacob, S.; Rowland, S. K.; Edgett, K. S.; Kah, L. C.; Wiens, R. C.; Day, M. D.; Calef, F.; Palucis, M. C.; Anderson, R. B.

    2014-12-01

    Using orbiter images, the Curiosity landing ellipse was mapped as six distinct units based mainly on geomorphic characteristics. These units are the alluvial fan material (ALF), fractured light-toned surface (FLT), cratered plains/surfaces (CS), smooth hummocky plains (SH), rugged unit (RU) and striated light-toned outcrops (SLT) (Grotzinger et al., 2014; DOI: 10.1126/science.1242777). The goal of this project was to characterize and determine the origin of the CS. The CS is a thin, nearly horizontal, erosion resistant capping unit. HiRISE mosaics were utilized to subdivide the CS into four geomorphic sub-units. Crater densities were calculated for each sub-unit to provide a quantifiable parameter that could aid in understanding how the sub-units differ. Mastcam images from many locations along Curiosity's traverse show fields of dark, massive boulders, which are presumably erosional remnants of the CS. This indicates that the CS was likely more laterally extensive in the past. In situ CS outcrops, seen at Shaler and multiple locations near the Zabriskie Plateau, appear to have a rough, wind-sculpted surface and may consist of two distinct lithologies. The lower lithology displays hints of layering that have been enhanced by differential weathering, whereas the upper lithology consists of dark, massive rock. When present, the outcrops can extend laterally for several meters, but Mastcam images of outcrops do not always reveal both sections. ChemCam data show that CS targets have concentrations of Al, Na, and K that are indicative of an alkali feldspar phase. The physical and chemical characteristics of the CS suggest a massive deposit that has seen little to no chemical alteration. Physical characteristics of the CS do not allow us to unambiguously determine its geologic origin; possible emplacement mechanisms would require the ability to spread laterally over a nearly horizontal surface, and include inflating lava (e.g., pāhoehoe) or a distal delta deposit. The
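    The crater densities used above to compare sub-units are a simple count-per-area statistic with a Poisson (square-root-of-N) counting uncertainty. The counts and areas below are invented for illustration, not the study's values:

    ```python
    import math

    def crater_density(n_craters, area_km2):
        """Crater density per km^2 with the standard Poisson
        counting uncertainty sqrt(N)/area. Returns (density, sigma)."""
        density = n_craters / area_km2
        sigma = math.sqrt(n_craters) / area_km2
        return density, sigma

    # hypothetical counts for two geomorphic sub-units of equal area
    d1, e1 = crater_density(400, 100.0)   # 4.0 +/- 0.2 craters / km^2
    d2, e2 = crater_density(100, 100.0)   # 1.0 +/- 0.1 craters / km^2

    # the sub-units differ by far more than the combined uncertainty,
    # so the density contrast is a meaningful discriminator
    significant = (d1 - d2) > 3 * (e1 + e2)
    ```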

  4. Imaging analyses of coagulation-dependent initiation of fibrinolysis on activated platelets and its modification by thrombin-activatable fibrinolysis inhibitor.

    PubMed

    Brzoska, Tomasz; Suzuki, Yuko; Sano, Hideto; Suzuki, Seiichirou; Tomczyk, Martyna; Tanaka, Hiroki; Urano, Tetsumei

    2017-04-03

    Using intravital confocal microscopy, we observed previously that the process of platelet phosphatidylserine (PS) exposure, fibrin formation and lysine binding site-dependent plasminogen (plg) accumulation took place only in the centre of thrombi, not at their periphery. These findings prompted us to analyse the spatiotemporal regulatory mechanisms underlying coagulation and fibrinolysis. We analysed the fibrin network formation and the subsequent lysis in an in vitro experiment using diluted platelet-rich plasma supplemented with fluorescently labelled coagulation and fibrinolytic factors, using confocal laser scanning microscopy. The structure of the fibrin network formed by supplemented tissue factor was uneven and denser at the sites of coagulation initiation regions (CIRs) on PS-exposed platelets. When tissue-type plasminogen activator (tPA; 7.5 nM) was supplemented, labelled plg (50 nM) as well as tPA accumulated at CIRs, from where fibrinolysis started and gradually expanded to the peripheries. The lysis time at CIRs and their peripheries (50 µm from the CIR) were 27.9 ± 6.6 and 44.4 ± 9.7 minutes (mean ± SD, n=50 from five independent experiments) after the addition of tissue factor, respectively. Recombinant human soluble thrombomodulin (TMα; 2.0 nM) attenuated the CIR-dependent plg accumulation and strongly delayed fibrinolysis at CIRs. A carboxypeptidase inhibitor dose-dependently enhanced the CIR-dependent fibrinolysis initiation, and at 20 µM it completely abrogated the TMα-induced delay of fibrinolysis. Our findings are the first to directly present crosstalk between coagulation and fibrinolysis, which takes place on activated platelets' surface and is further controlled by thrombin-activatable fibrinolysis inhibitor (TAFI).

  5. Clinical, imaging, and immunohistochemical characteristics of focal cortical dysplasia Type II extratemporal epilepsies in children: analyses of an institutional case series.

    PubMed

    Knerlich-Lukoschus, Friederike; Connolly, Mary B; Hendson, Glenda; Steinbok, Paul; Dunham, Christopher

    2017-02-01

    OBJECTIVE Focal cortical dysplasia (FCD) Type II is divided into 2 subgroups based on the absence (IIA) or presence (IIB) of balloon cells. In particular, extratemporal FCD Types IIA and IIB are not completely understood in terms of clinical, imaging, biological, and neuropathological differences. The aim of the authors was to analyze distinctions between these 2 formal entities and address clinical, MRI, and immunohistochemical features of extratemporal epilepsies in children. METHODS Cases formerly classified as Palmini FCD Type II nontemporal epilepsies were identified through the prospectively maintained epilepsy database at the British Columbia Children's Hospital in Vancouver, Canada. Clinical data, including age of seizure onset, age at surgery, seizure type(s) and frequency, affected brain region(s), intraoperative electrocorticographic findings, and outcome defined by Engel's classification were obtained for each patient. Preoperative and postoperative MRI results were reevaluated. H & E-stained tissue sections were reevaluated by using the 2011 International League Against Epilepsy classification system and additional immunostaining for standard cellular markers (neuronal nuclei, neurofilament, glial fibrillary acidic protein, CD68). Two additional established markers of pathology in epilepsy resections, namely CD34 and α-B crystallin, were applied. RESULTS Seven nontemporal FCD Type IIA and seven Type IIB cases were included. Patients with FCD Type IIA presented with an earlier age of epilepsy onset and slightly better Engel outcomes. Radiology distinguished FCD Types IIA and IIB, in that Type IIB presented more frequently with characteristic cortical alterations. Nonphosphorylated neurofilament protein staining confirmed dysplastic cells in dyslaminated areas. The white-gray matter junction was focally blurred in patients with FCD Type IIB. α-B crystallin highlighted glial cells in the white matter and subpial layer with either of the 2 FCD Type II subtypes

  6. Utilizing magnetic resonance imaging logs, open hole logs and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.; Morganti, J.; White, H.

    1995-06-01

    NMR logging using the new C series Magnetic Resonance Imaging Logging (MRIL)™ is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeability and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these instances, conventional open hole logs may not define all of the pay intervals. MRIL can also minimize unnecessary completions in zones of potentially high water-cut. This case study will briefly discuss MRIL tool theory and log presentations used with the conventional logs and sidewall cores. SEM analysis will show a good correlation of varying grain size sands with the T2 distribution and bulk volume irreducible from MRIL. Discussions of each well in the study area will show how water-free production zones were defined. Because the MRIL data was not recorded on one of the wells, the advanced petrophysical program HORIZON was used to predict the MRIL bulk volume irreducible and effective porosity to estimate productive zones. Discussion of additional formation characteristics, completion procedures, actual production and predicted producibility of the shaly sands will be presented.

  7. IDATEN and G-SITENNO: GUI-assisted software for coherent X-ray diffraction imaging experiments and data analyses at SACLA.

    PubMed

    Sekiguchi, Yuki; Yamamoto, Masaki; Oroguchi, Tomotaka; Takayama, Yuki; Suzuki, Shigeyuki; Nakasako, Masayoshi

    2014-11-01

    Using our custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors, cryogenic coherent X-ray diffraction imaging experiments have been undertaken at the SPring-8 Angstrom Compact free electron LAser (SACLA) facility. To efficiently perform experiments and data processing, two software suites with user-friendly graphical user interfaces have been developed. The first is a program suite named IDATEN, which was developed to easily conduct four procedures during experiments: aligning KOTOBUKI-1, loading a flash-cooled sample into the cryogenic goniometer stage inside the vacuum chamber of KOTOBUKI-1, adjusting the sample position with respect to the X-ray beam using a pair of telescopes, and collecting diffraction data by raster scanning the sample with X-ray pulses. Named G-SITENNO, the other suite is an automated version of the original SITENNO suite, which was designed for processing diffraction data. These user-friendly software suites are now indispensable for collecting a large number of diffraction patterns and for processing the diffraction patterns immediately after collecting data within a limited beam time.

  8. Flow modification in canine intracranial aneurysm model by an asymmetric stent: studies using digital subtraction angiography (DSA) and image-based computational fluid dynamics (CFD) analyses

    NASA Astrophysics Data System (ADS)

    Hoi, Yiemeng; Ionita, Ciprian N.; Tranquebar, Rekha V.; Hoffmann, Kenneth R.; Woodward, Scott H.; Taulbee, Dale B.; Meng, Hui; Rudin, Stephen

    2006-03-01

    An asymmetric stent with low porosity patch across the intracranial aneurysm neck and high porosity elsewhere is designed to modify the flow to result in thrombogenesis and occlusion of the aneurysm and yet to reduce the possibility of also occluding adjacent perforator vessels. The purposes of this study are to evaluate the flow field induced by an asymmetric stent using both numerical and digital subtraction angiography (DSA) methods and to quantify the flow dynamics of an asymmetric stent in an in vivo aneurysm model. We created a vein-pouch aneurysm model on the canine carotid artery. An asymmetric stent was implanted at the aneurysm, with 25% porosity across the aneurysm neck and 80% porosity elsewhere. The aneurysm geometry, before and after stent implantation, was acquired using cone beam CT and reconstructed for computational fluid dynamics (CFD) analysis. Both steady-state and pulsatile flow conditions using the measured waveforms from the aneurysm model were studied. To reduce computational costs, we modeled the asymmetric stent effect by specifying a pressure drop over the layer across the aneurysm orifice where the low porosity patch was located. From the CFD results, we found the asymmetric stent reduced the inflow into the aneurysm by 51%, and appeared to create a stasis-like environment which favors thrombus formation. The DSA sequences also showed substantial flow reduction into the aneurysm. Asymmetric stents may be a viable image guided intervention for treating intracranial aneurysms with desired flow modification features.

  9. Use of INSAT-3D sounder and imager radiances in the 4D-VAR data assimilation system and its implications in the analyses and forecasts

    NASA Astrophysics Data System (ADS)

    Indira Rani, S.; Taylor, Ruth; George, John P.; Rajagopal, E. N.

    2016-05-01

    INSAT-3D, the first Indian geostationary satellite with sounding capability, provides valuable information over India and the surrounding oceanic regions which is pivotal to numerical weather prediction. In collaboration with the UK Met Office, NCMRWF developed the capability to assimilate INSAT-3D Clear Sky Brightness Temperature (CSBT), from both the sounder and the imager, in the 4D-Var assimilation system used at NCMRWF. Of the 18 sounder channels, radiances from 9 channels are selected for assimilation depending on the relevance of the information in each channel. The first three high-peaking channels, the CO2 absorption channels, and the three water vapor channels (channels 10, 11 and 12) are assimilated both over land and ocean, whereas the window channels (channels 6, 7 and 8) are assimilated only over the ocean. Measured satellite radiances are compared with those from short-range forecasts to monitor the data quality. This is based on the assumption that the observed satellite radiances are free from calibration errors and that the short-range forecast provided by the NWP model is free from systematic errors. Innovations (observation minus forecast) before and after the bias correction indicate how well the bias correction works. Since the biases vary with air mass, time and scan angle, and also due to instrument degradation, an accurate bias-correction algorithm for the assimilation of INSAT-3D sounder radiances is important. This paper discusses the bias-correction methods and other quality controls used for the selected INSAT-3D sounder channels and the impact of bias-corrected radiances in the data assimilation system, particularly over India and the surrounding oceanic regions.
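    The innovation-based monitoring described above can be illustrated with a toy bias fit: compute observation-minus-forecast residuals and regress them against a predictor such as scan angle. The linear predictor and all numbers below are assumptions for illustration, not the operational variational bias-correction scheme:

    ```python
    import numpy as np

    # Hypothetical innovations (observed minus forecast brightness
    # temperature, in K) for one channel versus scan angle (degrees).
    rng = np.random.default_rng(1)
    scan_angle = rng.uniform(-50, 50, size=2000)
    true_bias = 0.8 + 0.02 * scan_angle            # assumed offset + scan-angle bias
    innovation = true_bias + rng.normal(0, 0.5, size=2000)   # O - F

    # Fit a simple linear scan-angle bias predictor to the innovations
    # (a stand-in for the full set of air-mass and scan predictors).
    coeffs = np.polyfit(scan_angle, innovation, deg=1)
    corrected = innovation - np.polyval(coeffs, scan_angle)

    # After correction the innovations are near zero mean, which is
    # exactly what the before/after innovation statistics monitor.
    ```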

  10. Beta Adrenergic Receptor Stimulation Suppresses Cell Migration in Association with Cell Cycle Transition in Osteoblasts-Live Imaging Analyses Based on FUCCI System.

    PubMed

    Katsumura, Sakie; Ezura, Yoichi; Izu, Yayoi; Shirakawa, Jumpei; Miyawaki, Atsushi; Harada, Kiyoshi; Noda, Masaki

    2016-02-01

    Osteoporosis affects over 20 million patients in the United States. Among those, disuse osteoporosis is serious as it is induced by bed-ridden conditions in patients suffering from aging-associated diseases including cardiovascular, neurological, and malignant neoplastic diseases. Although the phenomenon that loss of mechanical stress, such as bed-ridden conditions, reduces bone mass is clear, the molecular bases for disuse osteoporosis are still incompletely understood. In disuse osteoporosis models, bone loss is interfered with by inhibitors of sympathetic tone and adrenergic receptors that suppress bone formation. However, how beta adrenergic stimulation affects osteoblastic migration and associated proliferation is not known. Here we introduced a live imaging system, the fluorescent ubiquitination-based cell cycle indicator (FUCCI), into osteoblast biology and examined isoproterenol regulation of cell cycle transition and cell migration in osteoblasts. Isoproterenol treatment suppresses the levels of the first entry peak of quiescent osteoblastic cells into cell cycle phase by shifting from G1/G0 to S/G2/M and also suppresses the levels of the second major peak population that enters into S/G2/M. The isoproterenol regulation of osteoblastic cell cycle transition is associated with isoproterenol suppression of the velocity of migration. This isoproterenol regulation of migration velocity is cell cycle phase specific, as it suppresses migration velocity of osteoblasts in G1 phase but not in G1/S nor in G2/M phase. Finally, these observations on isoproterenol regulation of osteoblastic migration and cell cycle transition are opposite to the PTH actions in osteoblasts. In summary, we discovered that sympathetic tone regulates osteoblastic migration in association with cell cycle transition by using the FUCCI system.

  11. Quantifying the Physical Composition of Urban Morphology throughout Wales by analysing a Time Series (1989-2011) of Landsat TM/ETM+ images and Supporting GIS data

    NASA Astrophysics Data System (ADS)

    Scott, Douglas; Petropoulos, George

    2014-05-01

    Knowledge of impervious surface areas (ISA) and of their changes in magnitude, location, geometry and morphology over time is significant for a range of practical applications and research alike, from local to global scales. ISA is a key indicator of global environmental change and is also an important parameter for urban planning and environmental resources management, especially within a European context due to the policy recommendations given to the European Commission by the Austrian Environment Agency in 2011. Despite this, use of Earth Observation (EO) technology in mapping ISAs within the European Union (EU), and in particular in the UK, is inadequate. In the present study, selected study sites across Wales were used to test the use of freely distributed EO data from the Landsat TM/ETM+ sensors in retrieving ISA, to improve the current European estimations of international urbanization and soil sealing. A traditional classifier and a linear spectral mixture analysis (LSMA) were both applied to a series of Landsat TM/ETM+ images acquired over a period spanning 22 years to extract ISA. Aerial photography with a spatial resolution of 0.4 m, acquired over the summer period in 2005, was used for validation purposes. The Welsh study areas provided a unique chance to detect largely dispersed urban morphology within an urban-rural frontier context. The study also presents an innovative method for detecting cloud and cloud shadow layers, detected with an overall accuracy of around 97%. The process tree built and presented in this study is important in terms of moving forward into a biennial program for the Welsh Government and is comparable to currently existing products. This EO-based product also offers a much less subjective, static and more objectively dynamic estimation of ISA cover. Our methodology not only establishes the local retrieval of ISA for Wales but also improves the existing EU international figures, and expands relatively stationary 'global' US
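    Linear spectral mixture analysis models each pixel's spectrum as a linear combination of endmember spectra and solves for the abundance fractions. A minimal unconstrained sketch follows; operational LSMA adds sum-to-one and nonnegativity constraints, and the six-band endmember values below are hypothetical:

    ```python
    import numpy as np

    def unmix(pixel, endmembers):
        """Unconstrained linear spectral unmixing: least-squares
        abundance estimate for one pixel, given an endmember matrix
        of shape (bands, endmembers)."""
        abundances, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
        return abundances

    # hypothetical six-band reflectance spectra for three endmembers:
    # impervious surface, vegetation, bare soil (columns)
    E = np.array([[0.30, 0.05, 0.20],
                  [0.28, 0.08, 0.25],
                  [0.25, 0.06, 0.30],
                  [0.27, 0.45, 0.35],
                  [0.35, 0.30, 0.45],
                  [0.40, 0.20, 0.50]])

    true_abund = np.array([0.6, 0.3, 0.1])   # e.g. 60% impervious cover
    pixel = E @ true_abund                   # noiseless mixed pixel
    est = unmix(pixel, E)
    ```

    The estimated impervious fraction per pixel, aggregated over a scene, is the ISA quantity the study maps through time.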

  12. Analysing the effect of crystal size and structure in highly efficient CH3NH3PbI3 perovskite solar cells by spatially resolved photo- and electroluminescence imaging.

    PubMed

    Mastroianni, S; Heinz, F D; Im, J-H; Veurman, W; Padilla, M; Schubert, M C; Würfel, U; Grätzel, M; Park, N-G; Hinsch, A

    2015-12-14

    CH3NH3PbI3 perovskite solar cells with a mesoporous TiO2 layer and spiro-MeOTAD as a hole transport layer (HTL) with three different CH3NH3I concentrations (0.032 M, 0.044 M and 0.063 M) were investigated. Strong variations in crystal size and morphology resulting in diversified cell efficiencies (9.2%, 16.9% and 12.3%, respectively) were observed. The physical origin of this behaviour was analysed by detailed characterization combining current-voltage curves with photo- and electroluminescence (PL and EL) imaging as well as light beam induced current measurements (LBIC). It was found that the most efficient cell shows the highest luminescence and the least efficient cell is most strongly limited by non-radiative recombination. Crystal size, morphology and distribution in the capping layer and in the porous scaffold strongly affect the non-radiative recombination. Moreover, the very non-uniform crystal structure with multiple facets, as evidenced by SEM images of the 0.032 M device, suggests the creation of a large number of grain boundaries and crystal dislocations. These defects give rise to increased trap-assisted non-radiative recombination as is confirmed by high-resolution μ-PL images. The different imaging techniques used in this study prove to be well-suited to spatially investigate and thus correlate the crystal morphology of the perovskite layer with the electrical and radiative properties of the solar cells and thus with their performance.

  13. EPOXI Trajectory and Maneuver Analyses

    NASA Technical Reports Server (NTRS)

    Chung, Min-Kun J.; Bhaskaran, Shyamkumar; Chesley, Steven R.; Halsell, C. Allen; Helfrich, Clifford E.; Jefferson, David C.; McElrath, Timothy P.; Rush, Brian P.; Wang, Tseng-Chan M.; Yen, Chen-wan L.

    2011-01-01

    The EPOXI mission is a NASA Discovery Mission of Opportunity combining two separate investigations: Extrasolar Planet Observation and Characterization (EPOCh) and Deep Impact eXtended Investigation (DIXI). Both investigations reused the DI instruments and spacecraft that successfully flew by the comet Tempel 1 (4 July 2005). For EPOCh, the goal was to find exoplanets with the high resolution imager, while for DIXI it was to fly by the comet Hartley 2 (4 Nov 2010). This paper documents the navigation experience of the earlier maneuver analyses critical for the EPOXI mission, including statistical ΔV analyses and other useful analyses in designing maneuvers. It also recounts the trajectory design leading up to the final reference trajectory to Hartley 2.

  14. Spacelab Charcoal Analyses

    NASA Technical Reports Server (NTRS)

    Slivon, L. E.; Hernon-Kenny, L. A.; Katona, V. R.; Dejarme, L. E.

    1995-01-01

    This report describes analytical methods and results obtained from chemical analysis of 31 charcoal samples in five sets. Each set was obtained from a single scrubber used to filter ambient air on board a Spacelab mission. Analysis of the charcoal samples was conducted by thermal desorption followed by gas chromatography/mass spectrometry (GC/MS). All samples were analyzed using identical methods. The method used for these analyses was able to detect compounds independent of their polarity or volatility. In addition to the charcoal samples, analyses of three Environmental Control and Life Support System (ECLSS) water samples were conducted specifically for trimethylamine.

  15. Wavelet Analyses and Applications

    ERIC Educational Resources Information Center

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
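    A continuous wavelet transform of a nonstationary signal can be sketched directly by convolving with a Ricker (Mexican-hat) wavelet at a range of scales. The two-tone test signal and scale range below are illustrative choices, not the article's examples:

    ```python
    import numpy as np

    def ricker(points, a):
        """Ricker (Mexican-hat) wavelet with width parameter a."""
        t = np.arange(points) - (points - 1) / 2.0
        amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
        return amp * (1 - (t / a) ** 2) * np.exp(-((t / a) ** 2) / 2)

    def cwt(signal, widths, wlen=200):
        """Continuous wavelet transform by direct convolution:
        one row of coefficients per width (scale)."""
        out = np.empty((len(widths), len(signal)))
        for i, w in enumerate(widths):
            n = min(wlen, len(signal))
            out[i] = np.convolve(signal, ricker(n, w), mode='same')
        return out

    # nonstationary signal: 5 Hz in the first half, 40 Hz in the second
    t = np.linspace(0, 1, 1000)
    sig = np.where(t < 0.5, np.sin(2 * np.pi * 5 * t),
                            np.sin(2 * np.pi * 40 * t))
    widths = np.arange(1, 40)
    coef = cwt(sig, widths)
    # large scales (e.g. width 30) respond to the low-frequency half,
    # small scales (e.g. width 3) to the high-frequency half
    ```

    Plotting `|coef|` as an image over (time, scale) gives the scalogram that reveals when each frequency component is present, which is exactly what a Fourier spectrum of the whole signal cannot show.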

  16. Apollo 14 microbial analyses

    NASA Technical Reports Server (NTRS)

    Taylor, G. R.

    1972-01-01

    Extensive microbiological analyses that were performed on the Apollo 14 prime and backup crewmembers and ancillary personnel are discussed. The crewmembers were subjected to four separate and quite different environments during the 137-day monitoring period. The relation between each of these environments and observed changes in the microflora of each astronaut are presented.

  17. Geomorphic analyses from space imagery

    NASA Technical Reports Server (NTRS)

    Morisawa, M.

    1985-01-01

    One of the most obvious applications of space imagery to geomorphological analyses is in the study of drainage patterns and channel networks. LANDSAT, high altitude photography and other types of remote sensing imagery are excellent for depicting stream networks on a regional scale because of their broad coverage in a single image. They offer a valuable tool for comparing and analyzing drainage patterns and channel networks all over the world. Three aspects considered in this geomorphological study are: (1) the origin, evolution and rates of development of drainage systems; (2) the topological studies of network and channel arrangements; and (3) the adjustment of streams to tectonic events and geologic structure (i.e., the mode and rate of adjustment).

  18. Systematic Processing of Clementine Data for Scientific Analyses

    NASA Technical Reports Server (NTRS)

    Mcewen, A. S.

    1993-01-01

    If fully successful, the Clementine mission will return about 3,000,000 lunar images and more than 5000 images of Geographos. Effective scientific analyses of such large datasets require systematic processing efforts. Concepts for two such efforts are described: global multispectral imaging of the Moon, and videos of Geographos.

  19. Development of a systematic computer vision-based method to analyse and compare images of false identity documents for forensic intelligence purposes-Part I: Acquisition, calibration and validation issues.

    PubMed

    Auberson, Marie; Baechler, Simon; Zasso, Michaël; Genessay, Thibault; Patiny, Luc; Esseiva, Pierre

    2016-03-01

    Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, highlight links between documents produced by a same modus operandi or same source, and thus support forensic intelligence efforts. Inspired by previous research work about digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise reproducibility and comparability of images. Different filters and comparison metrics have been evaluated and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters or their combination to extract profiles from images, and then the comparison of profiles with a Canberra distance-based metric provides the most accurate classification of documents. The method appears also to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be
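    The profile comparison step described above can be sketched with the Canberra distance, the metric the study found most accurate. The colour-channel profiles below are hypothetical stand-ins for the Hue/Edge profiles extracted from document images:

    ```python
    import numpy as np

    def canberra(p, q):
        """Canberra distance between two equal-length profiles:
        sum of |p_i - q_i| / (|p_i| + |q_i|), skipping terms where
        both entries are zero (the convention scipy also uses)."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        num = np.abs(p - q)
        den = np.abs(p) + np.abs(q)
        mask = den > 0
        return float(np.sum(num[mask] / den[mask]))

    # hypothetical intensity profiles extracted from two scanned documents
    profile_a = np.array([10.0, 8.0, 0.0, 4.0])
    profile_b = np.array([9.0, 8.0, 0.0, 6.0])
    d_ab = canberra(profile_a, profile_b)   # small distance -> similar documents
    ```

    Because each term is normalized by the magnitudes of the two entries, the Canberra distance is sensitive to proportional differences even in low-intensity regions of the profile, which suits comparing subtle print variations.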

  20. Atmospheric tether mission analyses

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA is considering the use of tethered satellites to explore regions of the atmosphere inaccessible to spacecraft or high altitude research balloons. This report summarizes the Lockheed Martin Astronautics (LMA) effort for the engineering study team assessment of an Orbiter-based atmospheric tether mission. Lockheed Martin responsibilities included design recommendations for the deployer and tether, as well as tether dynamic analyses for the mission. Three tether configurations were studied including single line, multistrand (Hoytether) and tape designs.

  1. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    Model calculations and analyses have been carried out to compare with several sets of data (dose, induced radioactivity in various experiment samples and spacecraft components, fission foil measurements, and LET spectra) from passive radiation dosimetry on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The calculations and data comparisons are used to estimate the accuracy of current models and methods for predicting the ionizing radiation environment in low earth orbit. The emphasis is on checking the accuracy of trapped proton flux and anisotropy models.

  2. Analysing immune cell migration.

    PubMed

    Beltman, Joost B; Marée, Athanasius F M; de Boer, Rob J

    2009-11-01

    The visualization of the dynamic behaviour of and interactions between immune cells using time-lapse video microscopy has an important role in modern immunology. To draw robust conclusions, quantification of such cell migration is required. However, imaging experiments are associated with various artefacts that can affect the estimated positions of the immune cells under analysis, which form the basis of any subsequent analysis. Here, we describe potential artefacts that could affect the interpretation of data sets on immune cell migration. We propose how these errors can be recognized and corrected, and suggest ways to prevent the data analysis itself leading to biased results.

  3. IMAGES, IMAGES, IMAGES

    SciTech Connect

    Marcus, A.

    1980-07-01

    The role of images of information (charts, diagrams, maps, and symbols) for effective presentation of facts and concepts is expanding dramatically because of advances in computer graphics technology, increasingly hetero-lingual, hetero-cultural world target populations of information providers, the urgent need to convey more efficiently vast amounts of information, the broadening population of (non-expert) computer users, the decrease of available time for reading texts and for decision making, and the general level of literacy. A coalition of visual performance experts, human engineering specialists, computer scientists, and graphic designers/artists is required to resolve human factors aspects of images of information. The need for, nature of, and benefits of interdisciplinary effort are discussed. The results of an interdisciplinary collaboration are demonstrated in a product for visualizing complex information about global energy interdependence. An invited panel will respond to the presentation.

  4. EEG analyses with SOBI.

    SciTech Connect

    Glickman, Matthew R.; Tang, Akaysha

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.

  5. Network Class Superposition Analyses

    PubMed Central

    Pearson, Carl A. B.; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process [1]), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141

  6. Network class superposition analyses.

    PubMed

    Pearson, Carl A B; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
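    The superposition matrix T can be sketched for a toy class of Boolean networks (the two update rules below are invented examples, not the Strong Inhibition networks of the study):

```python
import numpy as np
from itertools import product

def transition_matrix(update, n):
    """Deterministic transition matrix for an n-node Boolean network.

    `update` maps a state tuple to the next state tuple; states are
    indexed by their binary value.
    """
    N = 2 ** n
    M = np.zeros((N, N))
    for i, state in enumerate(product((0, 1), repeat=n)):
        j = int(''.join(map(str, update(state))), 2)
        M[i, j] = 1.0
    return M

def superposition(updates, n):
    """Stochastic matrix T: uniform superposition over the network class."""
    return sum(transition_matrix(u, n) for u in updates) / len(updates)

# Two toy 2-node networks that share the fixed point (1, 1):
nets = [
    lambda s: (s[1], s[1]),          # copy node 1 into both nodes
    lambda s: (s[0] and s[1],) * 2,  # AND of both nodes, duplicated
]
T = superposition(nets, 2)
# Each row of T still sums to 1, and T[s, s] = 1 flags states that are
# point attractors of every member of the class.
```

This tiny example only illustrates the construction; the paper's analyses (attractor counts, Derrida plots, entropy-based experiment selection) all operate on T built this way over a far larger class.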

  7. NOAA's National Snow Analyses

    NASA Astrophysics Data System (ADS)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS.
The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
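    The Newtonian nudging step can be sketched as a simple relaxation of the modeled state toward observations (the gain, values and variable names here are invented for illustration; the operational NOHRSC scheme is far more elaborate):

```python
import numpy as np

def nudge(model_swe, obs_swe, gain=0.5):
    """Relax modeled snow water equivalent (SWE) toward observations.

    Grid cells without an observation (NaN) are left untouched, so the
    model state is only corrected where data actually exist.
    """
    model_swe = np.asarray(model_swe, dtype=float)
    obs_swe = np.asarray(obs_swe, dtype=float)
    correction = np.where(np.isnan(obs_swe), 0.0,
                          gain * (obs_swe - model_swe))
    return model_swe + correction

model = [100.0, 80.0, 60.0]          # simulated SWE (mm) per grid cell
obs = [120.0, float('nan'), 50.0]    # gauges cover only two of the cells
updated = nudge(model, obs)
print(updated)  # observed cells pulled halfway toward the observations
```

Because the correction is a bounded fraction of the model-observation difference, the model stays close to its own dynamical balance, which is the first advantage cited in the abstract.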

  8. Broadband seismic illumination and resolution analyses based on staining algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Bo; Jia, Xiao-Feng; Xie, Xiao-Bi

    2016-09-01

    Seismic migration moves reflections to their true subsurface positions and yields seismic images of subsurface areas. However, due to limited acquisition aperture, complex overburden structure and target dipping angle, the migration often generates a distorted image of the actual subsurface structure. Seismic illumination and resolution analyses provide a quantitative description of how the above-mentioned factors distort the image. The point spread function (PSF) gives the resolution of the depth image and carries full information about the factors affecting the quality of the image. The staining algorithm establishes a correspondence between a certain structure and its relevant wavefield and reflected data. In this paper, we use the staining algorithm to calculate the PSFs, then use these PSFs for extracting the acquisition dip response and correcting the original depth image by deconvolution. We present relevant results of the SEG salt model. The staining algorithm provides an efficient tool for calculating the PSF and for conducting broadband seismic illumination and resolution analyses.
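    The PSF-based image correction can be illustrated with a generic Wiener-style deconvolution sketch (this stands in for, and is much simpler than, the staining-algorithm-based PSF computation and correction described above):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Deblur an image given its point spread function (PSF).

    Works in the Fourier domain; k regularises frequencies where the
    PSF response is weak. Assumes periodic boundary conditions.
    """
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F = np.conj(H) * G / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))

# Synthetic check: blur a test image with a known PSF, then invert.
rng = np.random.default_rng(0)
image = rng.random((32, 32))
psf = np.zeros((32, 32))
psf[0, 0], psf[0, 1], psf[1, 0] = 0.6, 0.2, 0.2  # simple asymmetric blur
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf)))
restored = wiener_deconvolve(blurred, psf, k=1e-9)
```

In the seismic setting the PSF varies with position, so the correction is applied locally rather than with a single global filter as in this sketch.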

  9. Effect of energy restriction and physical exercise intervention on phenotypic flexibility as examined by transcriptomics analyses of mRNA from adipose tissue and whole body magnetic resonance imaging.

    PubMed

    Lee, Sindre; Norheim, Frode; Langleite, Torgrim M; Noreng, Hans J; Storås, Trygve H; Afman, Lydia A; Frost, Gary; Bell, Jimmy D; Thomas, E Louise; Kolnes, Kristoffer J; Tangen, Daniel S; Stadheim, Hans K; Gilfillan, Gregor D; Gulseth, Hanne L; Birkeland, Kåre I; Jensen, Jørgen; Drevon, Christian A; Holen, Torgeir

    2016-11-01

    Overweight and obesity lead to changes in adipose tissue such as inflammation and reduced insulin sensitivity. The aim of this study was to assess how altered energy balance by reduced food intake or enhanced physical activity affect these processes. We studied sedentary subjects with overweight/obesity in two intervention studies, each lasting 12 weeks, affecting energy balance either by energy restriction (~20% reduced intake of energy from food) in one group, or by enhanced energy expenditure due to physical exercise (combined endurance- and strength-training) in the other group. We monitored mRNA expression by microarray and mRNA sequencing from adipose tissue biopsies. We also measured several plasma parameters as well as fat distribution with magnetic resonance imaging and spectroscopy. Comparison of microarray and mRNA sequencing showed strong correlations, which were also confirmed using RT-PCR. In the energy restricted subjects (body weight reduced by 5% during a 12-week intervention), there were clear signs of enhanced lipolysis as monitored by mRNA in adipose tissue as well as plasma concentration of free-fatty acids. This increase was strongly related to increased expression of markers for M1-like macrophages in adipose tissue. In the exercising subjects (glucose infusion rate increased by 29% during a 12-week intervention), there was a marked reduction in the expression of markers of M2-like macrophages and T cells, suggesting that physical exercise was especially important for reducing inflammation in adipose tissue with insignificant reduction in total body weight. Our data indicate that energy restriction and physical exercise affect energy-related pathways as well as inflammatory processes in different ways, probably related to macrophages in adipose tissue.

  10. On study design in neuroimaging heritability analyses

    NASA Astrophysics Data System (ADS)

    Koran, Mary Ellen; Li, Bo; Jahanshad, Neda; Thornton-Wells, Tricia A.; Glahn, David C.; Thompson, Paul M.; Blangero, John; Nichols, Thomas E.; Kochunov, Peter; Landman, Bennett A.

    2014-03-01

    Imaging genetics is an emerging methodology that combines genetic information with imaging-derived metrics to understand how genetic factors impact observable structural, functional, and quantitative phenotypes. Many of the most well-known genetic studies are based on Genome-Wide Association Studies (GWAS), which use large populations of related or unrelated individuals to associate traits and disorders with individual genetic factors. Merging imaging and genetics may potentially lead to improved power of association in GWAS because imaging traits may be more sensitive phenotypes, being closer to underlying genetic mechanisms, and their quantitative nature inherently increases power. We are developing SOLAR-ECLIPSE (SE) imaging genetics software which is capable of performing genetic analyses with both large-scale quantitative trait data and family structures of variable complexity. This program can estimate the contribution of genetic commonality among related subjects to a given phenotype, and essentially answer the question of whether or not the phenotype is heritable. This central factor of interest, heritability, offers bounds on the direct genetic influence over observed phenotypes. In order for a trait to be a good phenotype for GWAS, it must be heritable: at least some proportion of its variance must be due to genetic influences. A variety of family structures are commonly used for estimating heritability, yet the variability and biases for each as a function of the sample size are unknown. Herein, we investigate the ability of SOLAR to accurately estimate heritability models based on imaging data simulated using Monte Carlo methods implemented in R. We characterize the bias and the variability of heritability estimates from SOLAR as a function of sample size and pedigree structure (including twins, nuclear families, and nuclear families with grandparents).
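    As a much simpler stand-in for SOLAR's variance-components likelihood model, the classic Falconer twin estimate illustrates what a heritability estimate is (the correlation values below are hypothetical):

```python
def falconer_heritability(r_mz, r_dz):
    """Classic Falconer estimate: h^2 = 2 * (r_MZ - r_DZ).

    r_mz and r_dz are phenotype correlations for monozygotic and
    dizygotic twin pairs. This textbook approximation is NOT the
    variance-components approach used by SOLAR; it is shown only to
    make the notion of heritability concrete.
    """
    h2 = 2.0 * (r_mz - r_dz)
    return max(0.0, min(1.0, h2))   # clamp to the valid [0, 1] range

# Hypothetical imaging phenotype: MZ twins correlate 0.8, DZ twins 0.5.
h2_est = falconer_heritability(0.8, 0.5)
print(h2_est)  # ~0.6, i.e. a strongly heritable trait
```

The study's point is precisely that such estimates carry bias and variance that depend on sample size and pedigree structure, which is why the authors characterize them via Monte Carlo simulation.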

  11. EEO Implications of Job Analyses.

    ERIC Educational Resources Information Center

    Lacy, D. Patrick, Jr.

    1979-01-01

    Discusses job analyses as they relate to the requirements of Title VII of the Civil Rights Act of 1964, the Equal Pay Act of 1963, and the Rehabilitation Act of 1973. Argues that job analyses can establish the job-relatedness of entrance requirements and aid in defenses against charges of discrimination. Journal availability: see EA 511 615.

  12. Analysing the Metaphorical Images of Turkish Preschool Teachers

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2008-01-01

    The metaphorical basis of teacher reflection about teaching and learning has been a rich area of theory and research. This is a study of metaphor as a shared system of interpretation and classification, which teachers and student teachers and their supervising teachers can cooperatively explore. This study employs metaphor as a means of research…

  13. Analysing the ventricular fibrillation waveform.

    PubMed

    Reed, Matthew J; Clegg, Gareth R; Robertson, Colin E

    2003-04-01

    The surface electrocardiogram associated with ventricular fibrillation has been of interest to researchers for some time. Over the last few decades, techniques have been developed to analyse this signal in an attempt to obtain more information about the state of the myocardium and the chances of successful defibrillation. This review looks at the implications of analysing the VF waveform and discusses the various techniques that have been used, including fast Fourier transform analysis, wavelet transform analysis and mathematical techniques such as chaos theory.
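    As a minimal illustration of the Fourier-based approach, the dominant frequency of a waveform can be extracted as follows (the synthetic trace is not real VF data, and real analyses involve windowing and band-limiting beyond this sketch):

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the dominant frequency (Hz) of a waveform sampled at fs."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[0] = 0.0                # ignore the DC component
    return freqs[np.argmax(spectrum)]

# Synthetic 'VF-like' trace: a 5 Hz oscillation plus a weaker harmonic,
# sampled at 250 Hz for 4 seconds.
fs = 250
t = np.arange(0, 4, 1.0 / fs)
vf = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 11 * t)
f_dom = dominant_frequency(vf, fs)
print(f_dom)  # ~5 Hz
```

Quantities such as the dominant or median frequency derived this way have been studied as candidate predictors of defibrillation success.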

  14. Stereological analyses of the whole human pancreas

    PubMed Central

    Poudel, Ananta; Fowler, Jonas L.; Zielinski, Mark C.; Kilimnik, German; Hara, Manami

    2016-01-01

    The large size of human tissues requires a practical stereological approach to perform a comprehensive analysis of the whole organ. We have developed a method to quantitatively analyze the whole human pancreas, as one of the challenging organs to study, in which endocrine cells form various sizes of islets that are scattered unevenly throughout the exocrine pancreas. Furthermore, the human pancreas possesses intrinsic characteristics of intra-individual variability, i.e. regional differences in endocrine cell/islet distribution, and marked inter-individual heterogeneity regardless of age, sex and disease conditions including obesity and diabetes. The method is built based on large-scale image capture, computer-assisted unbiased image analysis and quantification, and further mathematical analyses, using widely-used software such as Fiji/ImageJ and MATLAB. The present study includes detailed protocols of every procedure as well as all the custom-written computer scripts, which can be modified according to specific experimental plans and specimens of interest. PMID:27658965

  15. Feed analyses and their interpretation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Compositional analysis is central to determining the nutritional value of feedstuffs. The utility of the values and how they should be used depends on how representative the feed subsample is, the nutritional relevance of the assays, analytical variability of the analyses, and whether a feed is suit...

  16. Analysing Children's Drawings: Applied Imagination

    ERIC Educational Resources Information Center

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  17. FORTRAN Algorithm for Image Processing

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Hull, David R.

    1987-01-01

    FORTRAN computer algorithm containing various image-processing analysis and enhancement functions developed. Algorithm developed specifically to process images of developmental heat-engine materials obtained with sophisticated nondestructive evaluation instruments. Applications of program include scientific, industrial, and biomedical imaging for studies of flaws in materials, analyses of steel and ores, and pathology.

  18. Workload analyse of assembling process

    NASA Astrophysics Data System (ADS)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes, no matter whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembly technology for a roller-bearing assembling process, carried out in a large company with integrated bearing manufacturing processes. In these analyses the delay sample technique was used to identify and categorise all the bearing assemblers' activities, and to determine how much of the 480-minute working day workers devote to each activity. The study shows some ways to increase process productivity without additional investment, and also indicates that process automation could be the solution for achieving maximum productivity.

  19. Nonlinear structural crash dynamics analyses

    NASA Technical Reports Server (NTRS)

    Hayduk, R. J.; Thomson, R. G.; Wittlin, G.; Kamat, M. P.

    1979-01-01

    Presented in this paper are the results of three nonlinear computer programs, KRASH, ACTION and DYCAST used to analyze the dynamic response of a twin-engine, low-wing airplane section subjected to a 8.38 m/s (27.5 ft/s) vertical impact velocity crash condition. This impact condition simulates the vertical sink rate in a shallow aircraft landing or takeoff accident. The three distinct analysis techniques for nonlinear dynamic response of aircraft structures are briefly examined and compared versus each other and the experimental data. The report contains brief descriptions of the three computer programs, the respective aircraft section mathematical models, pertinent data from the experimental test performed at NASA Langley, and a comparison of the analyses versus test results. Cost and accuracy comparisons between the three analyses are made to illustrate the possible uses of the different nonlinear programs and their future potential.

  20. Supplementary report on antilock analyses

    NASA Technical Reports Server (NTRS)

    Zellner, J. W.

    1985-01-01

    Generic modulator analysis was performed to quantify the effects of dump and reapply pressure rates on antilock stability and performance. The analysis included dump and reapply rates and lumped modulator delay. Based on the results of the generic modulator analysis and the earlier toggle optimization analysis (with the Mitsubishi modulator), a recommended preliminary antilock design was synthesized and its response and performance simulated. The results of these analyses are documented.

  1. Mars periglacial punctual features analyses

    NASA Astrophysics Data System (ADS)

    Machado, Adriane; Barata, Teresa; Ivo Alves, E.; Cunha, Pedro P.

    2012-11-01

    The presence of patterned grounds on Mars has been reported in several papers, especially the study of polygons distribution, size and formation processes. In the last years, the presence of basketball terrains has been noticed on Mars. Studies were made to recognize these terrains on Mars through the analysis of Mars Orbiter Camera (MOC) images. We have been developing an algorithm that recognizes automatically and extracts the hummocky patterns on Mars related to landforms generated by freeze-thaw cycles such as mud boils features. The algorithm is based on remote sensing data that establishes a comparison between the hummocks and mud boils morphology and size from Adventdalen at Longyearbyen (Svalbard - Norway) and hummocky patterns on Mars using High Resolution Imaging Science Experiment (HiRISE) imagery.

  2. Steganalysis of overlapping images

    NASA Astrophysics Data System (ADS)

    Whitaker, James M.; Ker, Andrew D.

    2015-03-01

    We examine whether steganographic images can be detected more reliably when there exist other images, taken with the same camera under the same conditions, of the same scene. We argue that such a circumstance is realistic and likely in practice. In 'laboratory conditions' mimicking circumstances favourable to the analyst, and with a custom set of digital images which capture the same scenes with controlled amounts of overlap, we use an overlapping reference image to calibrate steganographic features of the image under analysis. Experimental results show that the analysed image can be classified as cover or stego with much greater reliability than traditional steganalysis not exploiting overlapping content, and the improvement in reliability depends on the amount of overlap. These results are curious because two different photographs of exactly the same scene, taken only a few seconds apart with a fixed camera and settings, typically have steganographic features that differ by considerably more than a cover and stego image.

  3. Analysing photonic structures in plants

    PubMed Central

    Vignolini, Silvia; Moyroud, Edwige; Glover, Beverley J.; Steiner, Ullrich

    2013-01-01

    The outer layers of a range of plant tissues, including flower petals, leaves and fruits, exhibit an intriguing variation of microscopic structures. Some of these structures include ordered periodic multilayers and diffraction gratings that give rise to interesting optical appearances. The colour arising from such structures is generally brighter than pigment-based colour. Here, we describe the main types of photonic structures found in plants and discuss the experimental approaches that can be used to analyse them. These experimental approaches allow identification of the physical mechanisms producing structural colours with a high degree of confidence. PMID:23883949

  4. Laser power beaming system analyses

    NASA Technical Reports Server (NTRS)

    Zeiders, Glenn W., Jr.

    1993-01-01

    The successful demonstration of the PAMELA adaptive optics hardware and the fabrication of the BTOS truss structure were identified by the program office as the two most critical elements of the NASA power beaming program, so it was these that received attention during this program. Much of the effort was expended in direct program support at MSFC, but detailed technical analyses of the AMP deterministic control scheme and the BTOS truss structure (both the JPL design and a spherical one) were prepared and are attached, and recommendations are given.

  5. Summary of LDEF battery analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Thaller, Larry; Bittner, Harlin; Deligiannis, Frank; Tiller, Smith; Sullivan, David; Bene, James

    1992-01-01

    Tests and analyses of NiCd, LiSO2, and LiCf batteries flown on the Long Duration Exposure Facility (LDEF) include results from NASA, Aerospace, and commercial labs. The LiSO2 cells illustrate six-year degradation of internal components acceptable for space applications, with up to 85 percent of battery capacity remaining on discharge of some returned cells. LiCf batteries completed their mission, but lost any remaining capacity due to internal degradation. Returned NiCd batteries tested at GSFC showed slight case distortion due to pressure build-up, but were functioning as designed.

  6. Analysing photonic structures in plants.

    PubMed

    Vignolini, Silvia; Moyroud, Edwige; Glover, Beverley J; Steiner, Ullrich

    2013-10-06

    The outer layers of a range of plant tissues, including flower petals, leaves and fruits, exhibit an intriguing variation of microscopic structures. Some of these structures include ordered periodic multilayers and diffraction gratings that give rise to interesting optical appearances. The colour arising from such structures is generally brighter than pigment-based colour. Here, we describe the main types of photonic structures found in plants and discuss the experimental approaches that can be used to analyse them. These experimental approaches allow identification of the physical mechanisms producing structural colours with a high degree of confidence.

  7. THOR Turbulence Electron Analyser: TEA

    NASA Astrophysics Data System (ADS)

    Fazakerley, Andrew; Moore, Tom; Owen, Chris; Pollock, Craig; Wicks, Rob; Samara, Marilia; Rae, Jonny; Hancock, Barry; Kataria, Dhiren; Rust, Duncan

    2016-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The Turbulence Electron Analyser (TEA) will measure the plasma electron populations in the mission's Regions of Interest. It will collect a 3D electron velocity distribution with cadences as short as 5 ms. The instrument will be capable of measuring energies up to 30 keV. TEA consists of multiple electrostatic analyser heads arranged so as to measure electrons arriving from look directions covering the full sky, i.e. 4 pi solid angle. The baseline concept is similar to the successful FPI-DES instrument currently operating on the MMS mission. TEA is intended to have a similar angular resolution, but a larger geometric factor. In comparison to earlier missions, TEA improves on the measurement cadence. For example, MMS FPI-DES routinely operates at 30 ms cadence. The objective of measuring distributions at rates as fast as 5 ms is driven by the mission's scientific requirements to resolve electron gyroscale size structures, where plasma heating and fluctuation dissipation is predicted to occur. TEA will therefore be capable of making measurements of the evolution of distribution functions across thin (a few km) current sheets travelling past the spacecraft at up to 600 km/s, of the Power Spectral Density of fluctuations of electron moments and of distributions fast enough to match frequencies with waves expected to be dissipating turbulence (e.g. with 100 Hz whistler waves).

  8. Perturbation analyses of intermolecular interactions

    NASA Astrophysics Data System (ADS)

    Koyama, Yohei M.; Kobayashi, Tetsuya J.; Ueda, Hiroki R.

    2011-08-01

    Conformational fluctuations of a protein molecule are important to its function, and it is known that environmental molecules, such as water molecules, ions, and ligand molecules, significantly affect the function by changing the conformational fluctuations. However, it is difficult to systematically understand the role of environmental molecules because intermolecular interactions related to the conformational fluctuations are complicated. To identify important intermolecular interactions with regard to the conformational fluctuations, we develop herein (i) distance-independent and (ii) distance-dependent perturbation analyses of the intermolecular interactions. We show that these perturbation analyses can be realized by performing (i) a principal component analysis using conditional expectations of truncated and shifted intermolecular potential energy terms and (ii) a functional principal component analysis using products of intermolecular forces and conditional cumulative densities. We refer to these analyses as intermolecular perturbation analysis (IPA) and distance-dependent intermolecular perturbation analysis (DIPA), respectively. For comparison of the IPA and the DIPA, we apply them to the alanine dipeptide isomerization in explicit water. Although the first IPA principal components discriminate two states (the α state and PPII (polyproline II) + β states) for larger cutoff length, the separation between the PPII state and the β state is unclear in the second IPA principal components. On the other hand, at large cutoff values, the DIPA eigenvalues converge faster than those for the IPA, and the top two DIPA principal components clearly identify the three states. By using the DIPA biplot, the contributions of the dipeptide-water interactions to each state are analyzed systematically. Since the DIPA improves the state identification and the convergence rate with retaining distance information, we conclude that the DIPA is a more practical method compared with the
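    The first step of such a perturbation analysis, a principal component analysis of interaction-energy terms, can be sketched generically (the toy data below are invented and not from the dipeptide-water system):

```python
import numpy as np

def principal_components(X):
    """PCA via SVD: rows are simulation snapshots, columns are energy terms.

    Returns (components, explained_variance); components are rows of the
    returned matrix, ordered by decreasing variance.
    """
    Xc = X - X.mean(axis=0)                    # centre each energy term
    _, s, vt = np.linalg.svd(Xc, full_matrices=False)
    return vt, s ** 2 / (len(X) - 1)

# Toy data: two perfectly correlated 'interaction energy' terms, so a
# single component should capture essentially all of the variance.
t = np.linspace(0.0, 1.0, 50)
X = np.column_stack([t, 2.0 * t])
pcs, var = principal_components(X)
# The first PC is proportional to (1, 2); the second variance is ~0.
```

The IPA and DIPA of the paper build far more structured inputs (conditional expectations of truncated potentials, force-density products) before this decomposition step.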

  9. Analyses to improve operational flexibility

    SciTech Connect

    Trikouros, N.G.

    1986-01-01

    Operational flexibility is greatly enhanced if the technical bases for plant limits and design margins are fully understood, and the analyses necessary to evaluate the effect of plant modifications or changes in operating modes on these parameters can be performed as required. If a condition should arise that might jeopardize a plant limit or reduce operational flexibility, it would be necessary to understand the basis for the limit or the specific condition limiting operational flexibility and be capable of performing a reanalysis to either demonstrate that the limit will not be violated or to change the limit. This paper provides examples of GPU Nuclear efforts in this regard. Examples of Oyster Creek and Three Mile Island operating experiences are discussed.

  10. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
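    The generic shape of such an uncertainty analysis is Monte Carlo propagation: sample uncertain parameters from assumed distributions, run the model, and summarize the spread of predictions. The sketch below is purely illustrative of that approach, not of the HEDR/HEDRIC codes; the toy dose model and all parameter distributions are invented for the example:

```python
import random

def dose_model(release, dispersion, uptake):
    # Invented toy model: dose scales with the source term and two
    # multiplicative environmental factors. Not an HEDR equation.
    return release * dispersion * uptake

def monte_carlo_uncertainty(n=10000, seed=42):
    """Propagate assumed parameter distributions through the model and
    report percentiles of the predicted dose."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        release = rng.lognormvariate(0.0, 0.5)   # uncertain source term
        dispersion = rng.uniform(0.1, 0.3)       # uncertain transport factor
        uptake = rng.uniform(0.5, 1.5)           # uncertain uptake factor
        doses.append(dose_model(release, dispersion, uptake))
    doses.sort()
    return {"p05": doses[int(0.05 * n)],
            "median": doses[n // 2],
            "p95": doses[int(0.95 * n)]}

stats = monte_carlo_uncertainty()
```

A sensitivity analysis then asks which input distribution most influences the output spread, e.g. by varying one parameter at a time or by correlating inputs with outputs across the sample.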

  11. Chemical analyses of provided samples

    NASA Technical Reports Server (NTRS)

    Becker, Christopher H.

    1993-01-01

    Two batches of samples were received, and chemical analyses of the surface and near-surface regions of the samples were performed by the surface analysis by laser ionization (SALI) method. The samples included four one-inch optics and several paint samples. The analyses emphasized surface contamination or modification. In these studies, pulsed sputtering by 7 keV Ar+ and primarily single-photon ionization (SPI) by coherent 118 nm radiation (at approximately 5 x 10(exp 5) W/cm(sup 2)) were used. For two of the samples, multiphoton ionization (MPI) at 266 nm (approximately 5 x 10(exp 11) W/cm(sup 2)) was also used. Most notable among the results were the silicone contamination on Mg2 mirror 28-92, and the finding that the Long Duration Exposure Facility (LDEF) paint sample had been enriched in K and Na and depleted in Zn, Si, B, and organic compounds relative to the control paint.

  12. Genetic Analyses of Integrin Signaling

    PubMed Central

    Wickström, Sara A.; Radovanac, Korana; Fässler, Reinhard

    2011-01-01

    The development of multicellular organisms, as well as maintenance of organ architecture and function, requires robust regulation of cell fates. This is in part achieved by conserved signaling pathways through which cells process extracellular information and translate this information into changes in proliferation, differentiation, migration, and cell shape. Gene deletion studies in higher eukaryotes have assigned critical roles for components of the extracellular matrix (ECM) and their cellular receptors in a vast number of developmental processes, indicating that a large proportion of this signaling is regulated by cell-ECM interactions. In addition, genetic alterations in components of this signaling axis play causative roles in several human diseases. This review will discuss what genetic analyses in mice and lower organisms have taught us about adhesion signaling in development and disease. PMID:21421914

  13. Isotopic signatures by bulk analyses

    SciTech Connect

    Efurd, D.W.; Rokop, D.J.

    1997-12-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally.

  14. imageMCR

    SciTech Connect

    2011-09-27

    imageMCR is a user-friendly software package that provides a variety of inputs to preprocess and analyze hyperspectral image data using multivariate algorithms such as Multivariate Curve Resolution (MCR), Principal Component Analysis (PCA), Classical Least Squares (CLS) and Parallel Factor Analysis (PARAFAC). MCR provides a relative quantitative analysis of the hyperspectral image data without the need for standards, and it discovers all the emitting species (spectrally pure components) present in an image, even those for which there is no a priori information. Once the spectral components are discovered, they can be used for future MCR analyses or used with CLS algorithms to quickly extract concentration image maps for each component within spectral image data sets.
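    The CLS step mentioned above is, in essence, a per-pixel least-squares fit of known pure-component spectra to each measured spectrum. A minimal sketch of that idea (not imageMCR's actual implementation; NumPy and the toy Gaussian spectra are assumptions of the example):

```python
import numpy as np

def cls_concentration_maps(cube, pure_spectra):
    """Per-pixel classical least squares unmixing.

    cube: (rows, cols, bands) measured spectra.
    pure_spectra: (n_components, bands) known pure-component spectra.
    Returns (rows, cols, n_components) concentration maps."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    # Solve pure_spectra.T @ c = pixel for every pixel at once.
    coeffs, *_ = np.linalg.lstsq(pure_spectra.T, pixels.T, rcond=None)
    return coeffs.T.reshape(rows, cols, -1)

# Toy example: two synthetic Gaussian component spectra, a 2x2 image.
bands = np.linspace(0.0, 1.0, 50)
s1 = np.exp(-((bands - 0.3) ** 2) / 0.01)
s2 = np.exp(-((bands - 0.7) ** 2) / 0.01)
S = np.vstack([s1, s2])
true_maps = np.array([[[1.0, 0.0], [0.0, 1.0]],
                      [[0.5, 0.5], [0.2, 0.8]]])
cube = true_maps @ S            # synthesize noise-free measurements
maps = cls_concentration_maps(cube, S)
```

With noise-free data lying in the span of the pure spectra, the recovered maps equal the true abundances exactly; with real data the fit is least-squares.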

  15. Biomedical Imaging,

    DTIC Science & Technology

    precision required from the task. This report details the technologies in surface and subsurface imaging systems for research and commercial applications. Biomedical imaging, Anthropometry, Computer imaging.

  16. Genetic analyses of captive Alala (Corvus hawaiiensis) using AFLP analyses

    USGS Publications Warehouse

    Jarvi, Susan I.; Bianchi, Kiara R.

    2006-01-01

    affected by the mutation rate at microsatellite loci, thus introducing a bias. Also, the number of loci that can be studied is frequently limited to fewer than 10. This theoretically represents a maximum of one marker for each of 10 chromosomes. Dominant markers like AFLP allow a larger fraction of the genome to be screened. Large numbers of loci can be screened by AFLP to resolve very small individual differences that can be used for identification of individuals, estimates of pairwise relatedness and, in some cases, for parentage analyses. Since AFLP is a dominant marker (it cannot distinguish a +/+ homozygote from a +/- heterozygote), it has limitations for parentage analyses. Only when both parents are homozygous for the absence of alleles (-/-) and offspring show a presence (+/+ or +/-) can the parents be excluded. In this case, microsatellites become preferable as they have the potential to exclude individual parents when the other parent is unknown. Another limitation of AFLP is that the loci are generally less polymorphic (only two alleles/locus) than microsatellite loci (often >10 alleles/locus). While generally fewer than 10 highly polymorphic microsatellite loci are enough to exclude and assign parentage, up to 100 or more AFLP loci might be required. While there are pros and cons to different methodologies, the total number of loci evaluated by AFLP generally offsets the limitations imposed by the dominant nature of this approach, and end results between methods are generally comparable. Overall objectives of this study were to evaluate the level of genetic diversity in the captive population of Alala, to compare genetic data with currently available pedigree information, and to determine the extent of relatedness of mating pairs and among founding individuals.

  17. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  18. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  19. Photovoltaics: Life-cycle Analyses

    SciTech Connect

    Fthenakis V. M.; Kim, H.C.

    2009-10-02

    Life-cycle analysis is an invaluable tool for investigating the environmental profile of a product or technology from cradle to grave. Such life-cycle analyses of energy technologies are essential, especially as material and energy flows are often interwoven, and divergent emissions into the environment may occur at different life-cycle stages. This approach is well exemplified by our description of material and energy flows in four commercial PV technologies, i.e., mono-crystalline silicon, multi-crystalline silicon, ribbon-silicon, and cadmium telluride. The same life-cycle approach is applied to the balance of system that supports flat, fixed PV modules during operation. We also discuss the life-cycle environmental metrics for a concentration PV system with a tracker and lenses, which captures more sunlight per cell area than the flat, fixed system but requires large auxiliary components. Select life-cycle risk indicators for PV, i.e., fatalities, injuries, and maximum consequences, are evaluated in a comparative context with other electricity-generation pathways.

  20. Comparison between Inbreeding Analyses Methodologies.

    PubMed

    Esparza, Mireia; Martínez-Abadías, Neus; Sjøvold, Torstein; González-José, Rolando; Hernández, Miquel

    2015-12-01

    Surnames are widely used in inbreeding analysis, but the validity of results has often been questioned due to failure to comply with the prerequisites of the method. Here we analyze inbreeding in Hallstatt (Austria) between the 17th and the 19th centuries using both genealogies and surnames. The high and significant correlation of the results obtained by both methods demonstrates the validity of using surnames in studies of this kind. On the other hand, the inbreeding values obtained (0.24 x 10⁻³ in the genealogical analysis and 2.66 x 10⁻³ in the surname analysis) are lower than those observed in Europe for this period and for this kind of population, showing that the apparent isolation of Hallstatt's population is false. The temporal trend of inbreeding in both analyses does not follow the general European pattern, but shows a maximum in 1850 with a later decrease over the second half of the 19th century. This is probably due to the high migration rate implied by the construction of transport infrastructure around the 1870s.

  1. Helicopter tail rotor noise analyses

    NASA Technical Reports Server (NTRS)

    George, A. R.; Chou, S. T.

    1986-01-01

    A study was made of helicopter tail rotor noise, particularly that due to interactions with the main rotor tip vortices, and with the fuselage separation mean wake. The tail rotor blade-main rotor tip vortex interaction is modelled as an airfoil of infinite span cutting through a moving vortex. The vortex and the geometry information required by the analyses are obtained through a free wake geometry analysis of the main rotor. The acoustic pressure-time histories for the tail rotor blade-vortex interactions are then calculated. These acoustic results are compared to tail rotor loading and thickness noise, and are found to be significant to the overall tail rotor noise generation. Under most helicopter operating conditions, large acoustic pressure fluctuations can be generated due to a series of skewed main rotor tip vortices passing through the tail rotor disk. The noise generation depends strongly upon the helicopter operating conditions and the location of the tail rotor relative to the main rotor.

  2. Proteins analysed as virtual knots

    NASA Astrophysics Data System (ADS)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  3. Network analyses in systems pharmacology

    PubMed Central

    Berger, Seth I.; Iyengar, Ravi

    2009-01-01

    Systems pharmacology is an emerging area of pharmacology which utilizes network analysis of drug action as one of its approaches. By considering drug actions and side effects in the context of the regulatory networks within which the drug targets and disease gene products function, network analysis promises to greatly increase our knowledge of the mechanisms underlying the multiple actions of drugs. Systems pharmacology can provide new approaches for drug discovery for complex diseases. The integrated approach used in systems pharmacology can allow for drug action to be considered in the context of the whole genome. Network-based studies are becoming an increasingly important tool in understanding the relationships between drug action and disease susceptibility genes. This review discusses how analysis of biological networks has contributed to the genesis of systems pharmacology and how these studies have improved global understanding of drug targets, suggested new targets and approaches for therapeutics, and provided a deeper understanding of the effects of drugs. Taken together, these types of analyses can lead to new therapeutic options while improving the safety and efficacy of existing medications. Contact: ravi.iyengar@mssm.edu PMID:19648136

  4. Proteins analysed as virtual knots

    PubMed Central

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-01-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important. PMID:28205562

  5. Consumption patterns and perception analyses of hangwa.

    PubMed

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-03-01

    Hangwa is a traditional food which, in keeping with current consumption trends, needs marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009, and the data from 231 persons were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'for present' (39.8%) and the main reasons for buying it were 'traditional image' (33.3%) and 'taste' (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were 'a sanitary process', 'a rigorous quality mark' and 'taste', which were related to the quality of the products. In addition, those with high importance but low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of price'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price.

  6. APXS ANALYSES OF BOUNCE ROCK: THE FIRST SHERGOTTITE ON MARS

    NASA Technical Reports Server (NTRS)

    Ming, Douglas W.; Zipfel, J.; Anderson, R.; Brueckner, J.; Clark, B. C.; Dreibus, G.; Economou, T.; Gellert, R.; Lugmair, G. W.; Klingelhoefer, G.

    2005-01-01

    During the MER Mission, an isolated rock at Meridiani Planum was analyzed by the Athena instrument suite [1]. Remote sensing instruments noticed its distinct appearance. Two areas on the untreated rock surface and one area that was abraded with the Rock Abrasion Tool were analyzed by Microscopic Imager, Mossbauer Mimos II [2], and Alpha Particle X-ray Spectrometer (APXS). Results of all analyses revealed a close relationship of this rock with known basaltic shergottites.

  7. Digital imaging.

    PubMed

    Daniel, Gregory B

    2009-07-01

    Medical imaging is rapidly moving toward a digital-based image system. An understanding of the principles of digital imaging is necessary to evaluate features of imaging systems and can play an important role in purchasing decisions.

  8. The relationship among sea surface roughness variations, oceanographic analyses, and airborne remote sensing analyses

    NASA Technical Reports Server (NTRS)

    Oertel, G. F.; Wade, T. L.

    1981-01-01

    The synthetic aperture radar (SAR) was studied to determine whether it could image large-scale estuarine and oceanic features such as fronts, and to explain the electromagnetic interaction between SAR and the individual surface front features. Fronts were observed to occur at the entrance to the Chesapeake Bay. The airborne measurements consisted of data collection by SAR onboard an F-4 aircraft and by real-aperture side-looking radar (SLAR) in Mohawk aircraft. A total of 89 transects were flown. Surface roughness and color, as well as temperature and salinity, were evaluated. Cross-frontal surveys were made. Frontal shear and convergence flow were obtained. Surface-active organic materials, it was indicated, are present at the air-sea interface. In all, 2000 analyses were conducted to characterize the spatial and temporal variabilities associated with water mass boundaries.

  9. Analyses of Transistor Punchthrough Failures

    NASA Technical Reports Server (NTRS)

    Nicolas, David P.

    1999-01-01

    The failure of two transistors in the Altitude Switch Assembly for the Solid Rocket Booster, followed by two additional failures a year later, presented a challenge to failure analysts. These devices had successfully worked for many years on numerous missions. There was no history of failures with this type of device. Extensive checks of the test procedures gave no indication of a source of the cause. The devices were manufactured more than twenty years ago, and failure information on this lot date code was not readily available. External visual exam, radiography, PEID, and leak testing were performed with nominal results. Electrical testing indicated nearly identical base-emitter and base-collector characteristics (both forward and reverse) with a low-resistance short from emitter to collector. These characteristics are indicative of a classic failure mechanism called punchthrough. In failure analysis, punchthrough refers to a condition where a relatively low voltage pulse causes the device to conduct very hard, producing localized areas of thermal runaway or "hot spots". At one or more of these hot spots, the excessive currents melt the silicon. Heavily doped emitter material diffuses through the base region to the collector, forming a diffusion pipe shorting the emitter to base to collector. Upon cooling, an alloy junction forms between the pipe and the base region. Generally, the hot spot (punchthrough site) is under the bond and no surface artifact is visible. The devices were delidded and the internal structures were examined microscopically. The gold emitter lead was melted on one device, but others had anomalies in the metallization around the intact emitter bonds. The SEM examination confirmed some anomalies to be cosmetic defects while other anomalies were artifacts of the punchthrough site. Subsequent to these analyses, the contractor determined that some irregular testing procedures occurred at the time of the failures heretofore unreported. These testing

  10. Quantum Image Encryption Algorithm Based on Quantum Image XOR Operations

    NASA Astrophysics Data System (ADS)

    Gong, Li-Hua; He, Xiang-Tao; Cheng, Shan; Hua, Tian-Xiang; Zhou, Nan-Run

    2016-07-01

    A novel encryption algorithm for quantum images based on quantum image XOR operations is designed. The quantum image XOR operations are designed by using the hyper-chaotic sequences generated with Chen's hyper-chaotic system to control the controlled-NOT operation, which is used to encode gray-level information. The initial conditions of Chen's hyper-chaotic system are the keys, which guarantee the security of the proposed quantum image encryption algorithm. Numerical simulations and theoretical analyses demonstrate that the proposed quantum image encryption algorithm has a larger key space, higher key sensitivity, stronger resistance to statistical analysis, and lower computational complexity than its classical counterparts.
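    The classical skeleton of such a scheme is a keystream XOR, with a chaotic map seeded by the secret key supplying the keystream. The sketch below is a classical analogue only: a logistic map stands in for Chen's hyper-chaotic system, and the quantum encoding of gray levels is out of scope:

```python
def chaotic_keystream(length, x0, r=3.99):
    # Classical stand-in: a logistic map replaces Chen's hyper-chaotic
    # system; the initial condition x0 plays the role of the secret key.
    stream, x = [], x0
    for _ in range(length):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) & 0xFF)   # quantize orbit to a byte
    return stream

def xor_image(pixels, key):
    # XOR each gray-level pixel with a keystream byte; applying the
    # same key twice restores the original values.
    return [p ^ s for p, s in zip(pixels, chaotic_keystream(len(pixels), key))]

image = [12, 255, 0, 97, 180]            # toy gray-level pixels
cipher = xor_image(image, key=0.3567)
plain = xor_image(cipher, key=0.3567)    # decryption = re-encryption
```

Because XOR is an involution, applying the same keystream twice recovers the plaintext, which is why the initial condition alone can serve as the key.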

  11. Pawnee Nation Energy Option Analyses

    SciTech Connect

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  12. Image data processing of earth resources management. [technology transfer

    NASA Technical Reports Server (NTRS)

    Desio, A. W.

    1974-01-01

    Various image processing and information extraction systems are described along with the design and operation of an interactive multispectral information system, IMAGE 100. Analyses of ERTS data, using IMAGE 100, over a number of U.S. sites are presented. The following analyses are included: (1) investigations of crop inventory and management using remote sensing; and (2) land cover classification for environmental impact assessments. Results show that useful information is provided by IMAGE 100 analyses of ERTS data in digital form.

  13. Integrated Field Analyses of Thermal Springs

    NASA Astrophysics Data System (ADS)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers through the SURE internship offered by the Southern California Earthquake Center (SCEC) have examined thermal springs in southern Idaho and northern Utah, as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed and chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with the use of Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms, which were analyzed using methods of single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas. These will be collected using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor to keep mud from contaminating the equipment. The use of raster

  14. Imaging medical imaging

    NASA Astrophysics Data System (ADS)

    Journeau, P.

    2015-03-01

    This paper presents progress on imaging the research field of Imaging Informatics, mapped as the clustering of its communities together with their main results by applying a process to produce a dynamical image of the interactions between their results and their common object(s) of research. The basic side draws from a fundamental research on the concept of dimensions and projective space spanning several streams of research about three-dimensional perceptivity and re-cognition and on their relation and reduction to spatial dimensionality. The application results in an N-dimensional mapping in Bio-Medical Imaging, with dimensions such as inflammatory activity, MRI acquisition sequencing, spatial resolution (voxel size), spatiotemporal dimension inferred, toxicity, depth penetration, sensitivity, temporal resolution, wave length, imaging duration, etc. Each field is represented through the projection of papers' and projects' `discriminating' quantitative results onto the specific N-dimensional hypercube of relevant measurement axes, such as listed above and before reduction. Past published differentiating results are represented as red stars, achieved unpublished results as purple spots and projects at diverse progress advancement levels as blue pie slices. The goal of the mapping is to show the dynamics of the trajectories of the field in its own experimental frame and their direction, speed and other characteristics. We conclude with an invitation to participate and show a sample mapping of the dynamics of the community and a tentative predictive model from community contribution.

  15. Image Calibration

    NASA Technical Reports Server (NTRS)

    Peay, Christopher S.; Palacios, David M.

    2011-01-01

    Calibrate_Image calibrates images obtained from focal plane arrays so that the output image more accurately represents the observed scene. The function takes as input a degraded image along with a flat field image and a dark frame image produced by the focal plane array and outputs a corrected image. The three most prominent sources of image degradation are corrected for: dark current accumulation, gain non-uniformity across the focal plane array, and hot and/or dead pixels in the array. In the corrected output image the dark current is subtracted, the gain variation is equalized, and values for hot and dead pixels are estimated, using bicubic interpolation techniques.
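    The corrections described can be sketched as: subtract the dark frame, divide by a normalized gain map derived from the flat field, then patch flagged pixels. This is a hedged reconstruction from the abstract, not the Calibrate_Image source; for simplicity, neighbour averaging stands in for the bicubic interpolation it mentions, and NumPy and all toy inputs are assumptions of the example:

```python
import numpy as np

def calibrate_image(raw, dark, flat, bad_mask):
    """Hedged reconstruction of the corrections in the abstract:
    dark-frame subtraction, flat-field gain equalization, and bad-pixel
    replacement (neighbour averaging instead of bicubic interpolation)."""
    gain = flat - dark
    gain = gain / gain.mean()              # normalized gain map
    corrected = (raw - dark) / gain        # dark + flat-field correction
    fixed = corrected.copy()
    rows, cols = raw.shape
    for r, c in zip(*np.nonzero(bad_mask)):
        neighbours = [corrected[i, j]
                      for i in range(max(0, r - 1), min(rows, r + 2))
                      for j in range(max(0, c - 1), min(cols, c + 2))
                      if not bad_mask[i, j]]
        fixed[r, c] = np.mean(neighbours)  # patch hot/dead pixel
    return fixed

# Demo: flat 100-count scene, non-uniform gain, one hot pixel.
g = np.array([[0.9, 1.0, 1.1], [1.0, 1.0, 1.0], [1.1, 1.0, 0.9]])
dark = np.full((3, 3), 5.0)
raw = dark + g * 100.0
raw[1, 1] = 10000.0                        # simulated hot pixel
flat = dark + g * 50.0                     # flat-field exposure
bad = np.zeros((3, 3), dtype=bool)
bad[1, 1] = True
out = calibrate_image(raw, dark, flat, bad)
```

In the demo the corrected image recovers the uniform 100-count scene, including at the patched hot pixel.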

  16. Image Guidance

    EPA Pesticide Factsheets

    Guidance that explains the process for getting images approved in One EPA Web microsites and resource directories. Includes an appendix that shows examples of what makes some images better than others and how some images convey meaning more than others.

  17. Applications of Epsilon Radial Networks in Neuroimage Analyses

    PubMed Central

    Adluru, Nagesh; Chung, Moo K.; Lange, Nicholas T.; Lainhart, Janet E.; Alexander, Andrew L.

    2016-01-01

    “Is the brain ‘wiring’ different between groups of populations?” is an increasingly important question with advances in diffusion MRI and the abundance of network-analytic tools. Recently, an automatic, data-driven and computationally efficient framework for extracting brain networks using tractography and epsilon neighborhoods was proposed in the diffusion tensor imaging (DTI) literature [1]. In this paper we propose new extensions to that framework and show potential applications of such epsilon radial networks (ERNs) in performing various types of neuroimage analyses. These extensions allow us to use ERNs not only to mine for topo-physical properties of the structural brain networks but also to perform classical region-of-interest (ROI) analyses in a very efficient way. Thus we demonstrate the use of ERNs as a novel image-processing lens for statistical and machine-learning based analyses. We demonstrate this application in an autism study, identifying topological and quantitative group differences as well as performing classification. Finally, these views are not restricted to ERNs but can be effective for population studies using any computationally efficient network-extraction procedure. PMID:28251191
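The epsilon-neighbourhood idea behind such networks can be sketched generically: nodes are 3-D coordinates (e.g. tract end-points) and an edge joins any pair closer than epsilon. This is a minimal stand-in for illustration, not the authors' ERN pipeline; `epsilon_network` and `degree_stats` are hypothetical names.

```python
import numpy as np

def epsilon_network(points, eps):
    """Boolean adjacency matrix of a generic epsilon-neighbourhood
    graph: connect every pair of points closer than eps."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return (d < eps) & ~np.eye(len(points), dtype=bool)

def degree_stats(adj):
    """Mean and maximum node degree: basic topological properties one
    might compare between groups in a population study."""
    deg = adj.sum(axis=1)
    return deg.mean(), deg.max()
```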

  18. The ASSET intercomparison of ozone analyses: method and first results

    NASA Astrophysics Data System (ADS)

    Geer, A. J.; Lahoz, W. A.; Bekki, S.; Bormann, N.; Errera, Q.; Eskes, H. J.; Fonteyn, D.; Jackson, D. R.; Juckes, M. N.; Massart, S.; Peuch, V.-H.; Rharmili, S.; Segers, A.

    2006-06-01

    This paper examines 11 sets of ozone analyses from 7 different data assimilation systems. Two are numerical weather prediction (NWP) systems based on general circulation models (GCMs); the other five use chemistry transport models (CTMs). These systems contain either linearised or detailed ozone chemistry, or no chemistry at all. In most analyses, MIPAS (Michelson Interferometer for Passive Atmospheric Sounding) ozone data are assimilated. Two examples assimilate SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Chartography) observations. The analyses are compared to independent ozone observations covering the troposphere, stratosphere and lower mesosphere during the period July to November 2003. Through most of the stratosphere (50 hPa to 1 hPa), biases are usually within ±10% and standard deviations less than 10% compared to ozonesondes and HALOE (Halogen Occultation Experiment). Biases and standard deviations are larger in the upper-troposphere/lower-stratosphere, in the troposphere, the mesosphere, and the Antarctic ozone hole region. In these regions, some analyses do substantially better than others, and this is mostly due to differences in the models. At the tropical tropopause, many analyses show positive biases and excessive structure in the ozone fields, likely due to known deficiencies in assimilated tropical wind fields and a degradation in MIPAS data at these levels. In the southern hemisphere ozone hole, only the analyses which correctly model heterogeneous ozone depletion are able to reproduce the near-complete ozone destruction over the pole. In the upper-stratosphere and mesosphere (above 5 hPa), some ozone photochemistry schemes caused large but easily remedied biases. The diurnal cycle of ozone in the mesosphere is not captured, except by the one system that includes a detailed treatment of mesospheric chemistry. In general, similarly good results are obtained no matter what the assimilation method (Kalman filter, three or

  19. Digital Imaging

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Digital Imaging is the computer-processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications include CAT scanners, radiography, and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology and is accurate and cost-efficient.
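The "locate objects and measure them" step described above can be illustrated with a generic threshold-and-label sketch using SciPy; this is an assumed, textbook approach, not the PSICOM 327's actual algorithm.

```python
import numpy as np
from scipy import ndimage

def measure_objects(img, thresh):
    """Locate objects by thresholding plus connected-component
    labelling, then measure each one (area and centroid)."""
    mask = img > thresh
    labels, n = ndimage.label(mask)
    idx = np.arange(1, n + 1)
    areas = ndimage.sum(mask, labels, index=idx)          # pixel counts
    centroids = ndimage.center_of_mass(img, labels, index=idx)
    return n, areas, centroids
```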

  20. MELCOR analyses for accident progression issues

    SciTech Connect

    Dingman, S.E.; Shaffer, C.J.; Payne, A.C.; Carmel, M.K.

    1991-01-01

    Results of calculations performed with MELCOR and HECTR in support of the NUREG-1150 study are presented in this report. The analyses examined a wide range of issues. The analyses included integral calculations covering an entire accident sequence, as well as calculations that addressed specific issues that could affect several accident sequences. The results of the analyses for Grand Gulf, Peach Bottom, LaSalle, and Sequoyah are described, and the major conclusions are summarized. 23 refs., 69 figs., 8 tabs.

  1. Electron/proton spectrometer certification documentation analyses

    NASA Technical Reports Server (NTRS)

    Gleeson, P.

    1972-01-01

    A compilation of analyses generated during the development of the electron-proton spectrometer for the Skylab program is presented. The data documents the analyses required by the electron-proton spectrometer verification plan. The verification plan was generated to satisfy the ancillary hardware requirements of the Apollo Applications program. The certification of the spectrometer requires that various tests, inspections, and analyses be documented, approved, and accepted by reliability and quality control personnel of the spectrometer development program.

  2. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene,...

  3. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene,...

  4. The ASSET intercomparison of ozone analyses: method and first results

    NASA Astrophysics Data System (ADS)

    Geer, A. J.; Lahoz, W. A.; Bekki, S.; Bormann, N.; Errera, Q.; Eskes, H. J.; Fonteyn, D.; Jackson, D. R.; Juckes, M. N.; Massart, S.; Peuch, V.-H.; Rharmili, S.; Segers, A.

    2006-12-01

    This paper aims to summarise the current performance of ozone data assimilation (DA) systems, to show where they can be improved, and to quantify their errors. It examines 11 sets of ozone analyses from 7 different DA systems. Two are numerical weather prediction (NWP) systems based on general circulation models (GCMs); the other five use chemistry transport models (CTMs). The systems examined contain either linearised or detailed ozone chemistry, or no chemistry at all. In most analyses, MIPAS (Michelson Interferometer for Passive Atmospheric Sounding) ozone data are assimilated; two assimilate SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Chartography) observations instead. Analyses are compared to independent ozone observations covering the troposphere, stratosphere and lower mesosphere during the period July to November 2003. Biases and standard deviations are largest, and show the largest divergence between systems, in the troposphere, in the upper-troposphere/lower-stratosphere, in the upper-stratosphere and mesosphere, and the Antarctic ozone hole region. However, in any particular area, apart from the troposphere, at least one system can be found that agrees well with independent data. In general, none of the differences can be linked to the assimilation technique (Kalman filter, three or four dimensional variational methods, direct inversion) or the system (CTM or NWP system). Where results diverge, a main explanation is the way ozone is modelled. It is important to correctly model transport at the tropical tropopause, to avoid positive biases and excessive structure in the ozone field. In the southern hemisphere ozone hole, only the analyses which correctly model heterogeneous ozone depletion are able to reproduce the near-complete ozone destruction over the pole. In the upper-stratosphere and mesosphere (above 5 hPa), some ozone photochemistry schemes caused large but easily remedied biases. The diurnal cycle of ozone in the
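The comparison statistics quoted in these intercomparisons (percent bias and percent standard deviation of an analysis against independent observations such as ozonesondes) reduce to a short computation. A minimal sketch, with hypothetical function and argument names:

```python
import numpy as np

def bias_and_std(analysis, obs):
    """Percent bias and percent standard deviation of collocated
    analysis values against independent observations."""
    diff = analysis - obs
    bias = 100.0 * diff.mean() / obs.mean()
    std = 100.0 * diff.std() / obs.mean()
    return bias, std
```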

  5. Oncological image analysis: medical and molecular image analysis

    NASA Astrophysics Data System (ADS)

    Brady, Michael

    2007-03-01

    This paper summarises the work we have been doing on joint projects with GE Healthcare on colorectal and liver cancer, and with Siemens Molecular Imaging on dynamic PET. First, we recall the salient facts about cancer and oncological image analysis. Then we introduce some of the work that we have done on analysing clinical MRI images of colorectal and liver cancer, specifically the detection of lymph nodes and segmentation of the circumferential resection margin. In the second part of the paper, we shift attention to the complementary aspect of molecular image analysis, illustrating our approach with some recent work on: tumour acidosis, tumour hypoxia, and multiply drug resistant tumours.

  6. Micro-FE analyses of bone: state of the art.

    PubMed

    van Rietbergen, B

    2001-01-01

    The ability to provide a complete characterization of the elastic properties of bone has vastly improved our understanding of trabecular bone mechanical properties. Based on this information, it was possible to validate several mechanical concepts related to the elastic behavior of trabecular bone that could not be validated earlier. With recently developed micro-CT scanners and the availability of large parallel computer systems, this technique has also enabled the determination of physiological bone tissue loading conditions from very large microFE models that can represent whole human bones in detail. Such analyses can provide the data needed for a better understanding of bone failure processes or cell-mediated load-adaptive remodeling processes. Computational demands for whole bone analyses, however, are still excessive. Unlike linear stress and strain analyses, the application of microFE to study non-linear processes, in particular bone failure mechanisms, is still in an early phase. Results of recent studies, however, are promising and indicate that an accurate prediction of bone failure with these techniques is possible. Compelling features of such analyses are that they enable multi-axial failure criteria at the apparent level to be developed using primarily computational methods, and that they can provide a basis for detailed analysis of the micro-mechanics associated with trabecular failure at the apparent level. The application of microFE techniques to analyze bone in vivo is in an early stage as well. First results have indicated that, although the resolution of presently available in vivo imaging techniques (i.e. pQCT and MR) is much less than that of images used so far for microFE analyses, the technique can provide meaningful elastic properties of trabecular bone in vivo in most cases. It is expected that the remaining uncertainties in the microFE results can be eliminated as soon as the resolution of in vivo images is improved. 
With the fast developments in p

  7. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 8 2011-10-01 2011-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by...

  8. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by...

  9. Aviation System Analysis Capability Executive Assistant Analyses

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Kostiuk, Peter

    1999-01-01

    This document describes the analyses that may be incorporated into the Aviation System Analysis Capability Executive Assistant. The document will be used as a discussion tool to enable NASA and other integrated aviation system entities to evaluate, discuss, and prioritize analyses.

  10. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 8 2014-10-01 2014-10-01 false Market analyses. 1180.7 Section 1180.7 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a)...

  11. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Market analyses. 1180.7 Section 1180.7 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a)...

  12. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Market analyses. 1180.7 Section 1180.7 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a)...

  13. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  14. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  15. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  16. Photothermal imaging

    NASA Astrophysics Data System (ADS)

    Lapotko, Dmitry; Antonishina, Elena

    1995-02-01

    An automated image analysis system with two imaging regimes is described. The photothermal (PT) effect is used for imaging the temperature field or absorption structure of the sample (the cell) with high sensitivity and spatial resolution. In cell studies, the PT technique enables imaging of live non-stained cells and monitoring of cell shape/structure. The system includes a dual-laser illumination unit coupled to a conventional optical microscope. A sample chamber provides automated or manual loading of up to 3 samples and cell positioning. For image detection a 256 X 256 10-bit CCD camera is used. The lasers, scanning stage, and camera are controlled by a PC. The system provides optical (transmitted-light) image, probe-laser optical image, and PT-image acquisition. The operation rate is 1-1.5 s per cell for a cycle of cell positioning, acquisition of the 3 images, and calculation of image parameters. A dedicated database provides image/parameter storage and presentation, and cell diagnostics according to quantitative image parameters. The system has been tested in studies of live and stained blood cells, with PT-images used for cell differentiation. In experiments with red blood cells (RBC) originating from normal and anaemic blood, parameters for disease differentiation were found. For white blood cells, PT-images revealed details of cell structure that are absent from their optical images.

  17. Photoacoustic imaging.

    PubMed

    Zhang, Yin; Hong, Hao; Cai, Weibo

    2011-09-01

    Photoacoustic imaging, which is based on the photoacoustic effect, has developed extensively over the last decade. Possessing many attractive characteristics such as the use of nonionizing electromagnetic waves, good resolution and contrast, portable instrumentation, and the ability to partially quantitate the signal, photoacoustic techniques have been applied to the imaging of cancer, wound healing, disorders in the brain, and gene expression, among others. As a promising structural, functional, and molecular imaging modality for a wide range of biomedical applications, photoacoustic imaging can be categorized into two types of systems: photoacoustic tomography (PAT), which is the focus of this article, and photoacoustic microscopy (PAM). We first briefly describe the endogenous (e.g., hemoglobin and melanin) and the exogenous (e.g., indocyanine green [ICG], various gold nanoparticles, single-walled carbon nanotubes [SWNTs], quantum dots [QDs], and fluorescent proteins) contrast agents for photoacoustic imaging. Next, we discuss in detail the applications of nontargeted photoacoustic imaging. Recently, molecular photoacoustic (MPA) imaging has gained significant interest, and a few proof-of-principle studies have been reported. We summarize the current state of the art of MPA imaging, including the imaging of gene expression and the combination of photoacoustic imaging with other imaging modalities. Last, we point out obstacles facing photoacoustic imaging. Although photoacoustic imaging will likely continue to be a highly vibrant research field for years to come, the key question of whether MPA imaging could provide significant advantages over nontargeted photoacoustic imaging remains to be answered in the future.

  18. Operator-free flow injection analyser

    PubMed Central

    de Faria, Lourival C.

    1991-01-01

    A flow injection analyser has been constructed to allow operator-free determination of up to 40 samples. Besides the usual FIA apparatus, the analyser includes a home-made sample introduction device made with three electromechanical three-way valves, and an auto-sampler from Technicon which has been adapted to be commanded by an external digital signal. The analyser is controlled by a single-board SDK-8085 microcomputer. The necessary interface to couple the analyser components to the microcomputer is also described. The analyser was evaluated for a Cr(VI)-FIA determination, showing very good performance: the relative standard deviation for 15 signals from the injection of 100 μl of a 1.0 mg.ml-1 standard Cr(VI) solution was 0.5%. PMID:18924899

  19. Polarization imaging detection technology research

    NASA Astrophysics Data System (ADS)

    Xue, Mo-gen; Wang, Feng; Xu, Guo-ming; Yuan, Hong-wu

    2013-09-01

    In this paper we analyse polarization imaging theory and the general process of polarization imaging detection. On this basis, we summarize our many years of research on the mechanisms, technology and systems of polarization imaging detection. Combined with the latest developments at home and abroad, the paper discusses in detail the theoretical and technological problems of polarization imaging detection from the viewpoints of object polarization characteristics, the key problems and key technologies of polarization imaging detection, and polarization imaging detection systems and their applications. These problems include retrieval of an object's omnidirectional polarization characteristics, the integrated opto-electro-mechanical design of polarization imaging detection systems, high-precision polarization information analysis, and fast polarization image processing. Moreover, we point out possible application directions of polarization imaging detection technology in both military and civilian fields, and summarize its likely future development in the field of hyperspectral polarization imaging. This paper can provide a useful reference and guidance for promoting the research and development of polarization imaging detection technology.

  20. Image processing technology

    SciTech Connect

    Van Eeckhout, E.; Pope, P.; Balick, L.

    1996-07-01

    This is the final report of a two-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The primary objective of this project was to advance image processing and visualization technologies for environmental characterization. This was effected by developing and implementing analyses of remote sensing data from satellite and airborne platforms, and demonstrating their effectiveness in visualization of environmental problems. Many sources of information were integrated as appropriate using geographic information systems.

  1. Level II Ergonomic Analyses, Dover AFB, DE

    DTIC Science & Technology

    1999-02-01

    IERA-RS-BR-TR-1999-0002 UNITED STATES AIR FORCE IERA Level II Ergonomic Analyses, Dover AFB, DE Andrew Marcotte Marilyn Joyce The Joyce...1.0 INTRODUCTION 1-1 1.1 Purpose Of The Level II Ergonomic Analyses 1-1 1.2 Approach 1-1 1.2.1 Initial Shop Selection and Administration of the

  2. Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst

    NASA Astrophysics Data System (ADS)

    Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina

    2015-03-01

    In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.

  3. Neuroperformance Imaging

    DTIC Science & Technology

    2014-10-01

    Neuroperformance, Sleep cycle, metabolism, thalamic structures, PET Imaging, MR Imaging, functional MRI, electroencephalography (EEG)...emotional and psychological stress as proposed in the original application. These studies will use high-density (128 channel) electroencephalography

  4. Medical imaging

    SciTech Connect

    Schneider, R.H.; Dwyer, S.J.

    1987-01-01

    This book contains papers from 26 sessions. Some of the session titles are: Tomographic Reconstruction, Radiography, Fluoro/Angio, Imaging Performance Measures, Perception, Image Processing, 3-D Display, and Printers, Displays, and Digitizers.

  5. Interactive graphics for functional data analyses.

    PubMed

    Wrobel, Julia; Park, So Young; Staicu, Ana Maria; Goldsmith, Jeff

    Although there are established graphics that accompany the most common functional data analyses, generating these graphics for each dataset and analysis can be cumbersome and time-consuming. Often, the barriers to visualization inhibit useful exploratory data analyses and prevent the development of intuition for a method and its application to a particular dataset. The refund.shiny package was developed to address these issues for several of the most common functional data analyses. After conducting an analysis, the plot_shiny() function is used to generate an interactive visualization environment that contains several distinct graphics, many of which are updated in response to user input. These visualizations reduce the burden of exploratory analyses and can serve as a useful tool for the communication of results to non-statisticians.

  6. Quality control considerations in performing washability analyses

    SciTech Connect

    Graham, R.D.

    1984-10-01

    The author describes, in considerable detail, the procedures for carrying out washability analyses as laid down in ASTM Standard Test Method D4371. These include sampling, sample preparation, hydrometer standardisation, washability testing, and analysis of specific gravity fractions.

  7. SCM Forcing Data Derived from NWP Analyses

    DOE Data Explorer

    Jakob, Christian

    2008-01-15

    Forcing data, suitable for use with single column models (SCMs) and cloud resolving models (CRMs), have been derived from NWP analyses for the ARM (Atmospheric Radiation Measurement) Tropical Western Pacific (TWP) sites of Manus Island and Nauru.

  8. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  9. Comparison with Russian analyses of meteor impact

    SciTech Connect

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  10. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ASSESSMENTS Standards for Security Threat Assessments § 1572.107 Other analyses. (a) TSA may determine that an... the search conducted under this part reveals extensive foreign or domestic criminal convictions,...

  11. A History of Rotorcraft Comprehensive Analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  12. Analyses and forecasts with LAWS winds

    NASA Technical Reports Server (NTRS)

    Wang, Muyin; Paegle, Jan

    1994-01-01

    Horizontal fluxes of atmospheric water vapor are studied for summer months during 1989 and 1992 over North and South America based on analyses from European Center for Medium Range Weather Forecasts, US National Meteorological Center, and United Kingdom Meteorological Office. The calculations are performed over 20 deg by 20 deg box-shaped midlatitude domains located to the east of the Rocky Mountains in North America, and to the east of the Andes Mountains in South America. The fluxes are determined from operational center gridded analyses of wind and moisture. Differences in the monthly mean moisture flux divergence determined from these analyses are as large as 7 cm/month precipitable water equivalent over South America, and 3 cm/month over North America. Gridded analyses at higher spatial and temporal resolution exhibit better agreement in the moisture budget study. However, significant discrepancies of the moisture flux divergence computed from different gridded analyses still exist. The conclusion is more pessimistic than Rasmusson's estimate based on station data. Further analysis reveals that the most significant sources of error result from model surface elevation fields, gaps in the data archive, and uncertainties in the wind and specific humidity analyses. Uncertainties in the wind analyses are the most important problem. The low-level jets, in particular, are substantially different in the different data archives. Part of the reason for this may be due to the way the different analysis models parameterized physical processes affecting low-level jets. The results support the inference that the noise/signal ratio of the moisture budget may be improved more rapidly by providing better wind observations and analyses than by providing better moisture data.
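The moisture flux divergence computed in budget studies like the one above can be sketched with centred finite differences. This is an assumed flat-grid illustration (hypothetical `moisture_flux_divergence`) that ignores the map factors and vertical integration an operational analysis would apply.

```python
import numpy as np

def moisture_flux_divergence(q, u, v, dx, dy):
    """Horizontal divergence of the moisture flux q*V on a regular grid.
    q: specific humidity (kg/kg); u, v: wind components (m/s);
    dx, dy: grid spacings in metres. Uses centred differences via
    np.gradient (one-sided at the edges)."""
    fx, fy = q * u, q * v
    return np.gradient(fx, dx, axis=1) + np.gradient(fy, dy, axis=0)
```

For a zonal wind increasing linearly in x and uniform humidity, the divergence is the constant q * du/dx, which the sketch reproduces.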

  13. Electronic Imaging

    DTIC Science & Technology

    1997-10-01

    spread function, image processing, speckle, defocus blur, Fresnel zone, lensless imaging, image quality, particulate sizing, particulate distribution...feasibility of image retrieval using a lensless recording system and post digital processing. We found the interesting result that this intensity...topic: holographic optical switch Jun Ren, M.S. Fellow. Ms. Ren received her M.S. in April 1997. Master’s title: "Atomic force microscopy

  14. Neuroperformance Imaging

    DTIC Science & Technology

    2013-10-01

    Sleepiness Scale (ESS), and mood (Beck Depression Inventory, BDI ). Subjects participated in a two-day imaging protocol. Imaging on the first day was...laboratory. Two subjects exhibited BDI scores outside the normal range (>10), all others were within normal range (mean 4.16±6). The focus of these...and mood (Beck Depression Inventory, BDI ). Protocol. Subjects participated in a two-day imaging protocol. Imaging on the first day was conducted

  15. Albedo Pattern Recognition and Time-Series Analyses in Malaysia

    NASA Astrophysics Data System (ADS)

    Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.

    2012-07-01

    Pattern recognition and time-series analyses will enable one to evaluate and generate predictions of specific phenomena. The albedo pattern and time-series analyses are very much useful especially in relation to climate condition monitoring. This study is conducted to seek for Malaysia albedo pattern changes. The pattern recognition and changes will be useful for variety of environmental and climate monitoring researches such as carbon budgeting and aerosol mapping. The 10 years (2000-2009) MODIS satellite images were used for the analyses and interpretation. These images were being processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospherical calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). There are several methods for time-series analyses were explored, this paper demonstrates trends and seasonal time-series analyses using converted HDF format MODIS MCD43A3 albedo land product. The results revealed significance changes of albedo percentages over the past 10 years and the pattern with regards to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). There is noticeable trend can be identified with regards to its maximum and minimum value of the albedo. The rise and fall of the line graph show a similar trend with regards to its daily observation. The different can be identified in term of the value or percentage of rises and falls of albedo. Thus, it can be concludes that the temporal behavior of land surface albedo in Malaysia have a uniform behaviours and effects with regards to the local monsoons. 
However, although the average albedo shows a linear trend with the nebulosity index, the pattern of albedo changes with respect to the nebulosity index indicates that external factors also influence the albedo values, as the plotted sky conditions and diffusion do not show a uniform trend over the years, especially when 5-year intervals are examined; 2000 shows a high negative linear
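A trend-plus-seasonal decomposition of a monthly albedo series, of the kind described in this abstract, can be sketched with an ordinary least-squares fit. The series below is synthetic, and its trend and seasonal amplitude are illustrative assumptions, not values from the study.

```python
import numpy as np

# Synthetic 10-year (120-month) albedo series: baseline + weak linear trend
# + annual cycle + noise (all coefficients hypothetical).
rng = np.random.default_rng(0)
t = np.arange(120)
albedo = 0.14 + 1e-4 * t + 0.01 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.002, 120)

# Design matrix: intercept, linear trend, and an annual harmonic (sin + cos).
X = np.column_stack([
    np.ones(120),
    t,
    np.sin(2 * np.pi * t / 12),
    np.cos(2 * np.pi * t / 12),
])
coef, *_ = np.linalg.lstsq(X, albedo, rcond=None)
intercept, slope, s_amp, c_amp = coef
print(f"trend per month: {slope:.2e}, seasonal amplitude: {np.hypot(s_amp, c_amp):.4f}")
```

The fitted slope separates the long-term trend from the seasonal (monsoon-driven) cycle; the residuals can then be inspected against covariates such as NI or AOD.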

  16. Canonical Images

    ERIC Educational Resources Information Center

    Hewitt, Dave

    2007-01-01

    In this article, the author offers two well-known mathematical images--that of a dot moving around a circle; and that of the tens chart--and considers their power for developing mathematical thinking. In his opinion, these images each contain the essence of a particular topic of mathematics. They are contrasting images in the sense that they deal…

  17. Image alignment

    DOEpatents

    Dowell, Larry Jonathan

    2014-04-22

    Disclosed is a method and device for aligning at least two digital images. An embodiment may use frequency-domain transforms of small tiles created from each image to identify substantially similar, "distinguishing" features within each of the images, and then align the images together based on the location of the distinguishing features. To accomplish this, an embodiment may create equal-sized tile sub-images for each image. A "key" for each tile may be created by performing a frequency-domain transform calculation on each tile. An information-distance difference between each possible pair of tiles on each image may be calculated to identify distinguishing features. From analysis of the information-distance differences of the pairs of tiles, a subset of tiles with high discrimination metrics in relation to other tiles may be located for each image. The subset of distinguishing tiles for each image may then be compared to locate tiles with substantially similar keys and/or information-distance metrics to other tiles of other images. Once similar tiles are located for each image, the images may be aligned in relation to the identified similar tiles.
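A minimal sketch of the tiling-and-key idea, assuming grayscale images, FFT magnitudes as the frequency-domain keys, and plain Euclidean distance as a stand-in for the patent's information-distance metric:

```python
import numpy as np

def tile_keys(img, tile=8):
    """Split a grayscale image into non-overlapping tiles and compute a
    frequency-domain "key" (FFT magnitude spectrum) for each tile."""
    h, w = img.shape
    keys, positions = [], []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            t = img[y:y + tile, x:x + tile]
            keys.append(np.abs(np.fft.fft2(t)).ravel())
            positions.append((y, x))
    return np.array(keys), positions

def most_distinctive(keys):
    """Score each tile by its summed distance to all other tiles; Euclidean
    distance here is a stand-in for the information-distance metric."""
    d = np.linalg.norm(keys[:, None, :] - keys[None, :, :], axis=2)
    return int(d.sum(axis=1).argmax())

# Toy image: flat background with one bright square, which should stand out.
img = np.zeros((32, 32))
img[10:14, 18:22] = 1.0
keys, positions = tile_keys(img)
idx = most_distinctive(keys)
print("most distinctive tile at", positions[idx])
```

Matching the distinguishing tiles of one image against those of another, and solving for the transform that maps the matched positions onto each other, would complete the alignment step.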

  18. Proof Image

    ERIC Educational Resources Information Center

    Kidron, Ivy; Dreyfus, Tommy

    2014-01-01

    The emergence of a proof image is often an important stage in a learner's construction of a proof. In this paper, we introduce, characterize, and exemplify the notion of proof image. We also investigate how proof images emerge. Our approach starts from the learner's efforts to construct a justification without (or before) attempting any…

  19. Web-based cephalometric procedure for craniofacial and dentition analyses

    NASA Astrophysics Data System (ADS)

    Arun Kumar, N. S.; Kamath, Srijit R.; Ram, S.; Muthukumaran, B.; Venkatachalapathy, A.; Nandakumar, A.; Jayakumar, P.

    2000-05-01

    Craniofacial analysis is a very important and widely used procedure in orthodontic cephalometry, which plays a key role in diagnosis and treatment planning. It involves establishing reference standards and specifying landmarks and variables, and the manual approach takes up a tremendous amount of the orthodontist's time. In this paper, we develop a web-based approach for craniofacial and dentition analyses. A digital computed radiography (CR) system is used to obtain the craniofacial image, which is stored as a bitmap file. The system comprises two components: a server and a client. The server component is a program that runs on a remote machine. To use the system, the user connects to the website; the client component is then activated, which uploads the image from the PC and displays it on the canvas area. The landmarks are identified using a mouse interface and the reference lines are generated. The resulting image is then sent to the server, which performs all measurements and calculates the mean, standard deviation, etc. of the variables. The results are sent immediately back to the client, where they are displayed in a separate frame along with the standard values for comparison. This system eliminates the need for every user to load expensive programs on their own machine.

  20. Photoacoustic Imaging

    PubMed Central

    Zhang, Yin; Hong, Hao; Cai, Weibo

    2014-01-01

    Photoacoustic imaging, based on the photoacoustic effect, has come a long way over the last decade. Possessing many attractive characteristics such as the use of non-ionizing electromagnetic waves, good resolution/contrast, portable instrumentation, as well as the ability to quantitate the signal to a certain extent, photoacoustic techniques have been applied to the imaging of cancer, wound healing, disorders in the brain, and gene expression, among others. As a promising structural, functional and molecular imaging modality for a wide range of biomedical applications, photoacoustic imaging systems can be briefly categorized into two types: photoacoustic tomography (PAT, the focus of this chapter) and photoacoustic microscopy (PAM). We will first briefly describe the endogenous (e.g. hemoglobin and melanin) and exogenous contrast agents (e.g. indocyanine green, various gold nanoparticles, single-walled carbon nanotubes, quantum dots, and fluorescent proteins) for photoacoustic imaging. Next, we will discuss in detail the applications of non-targeted photoacoustic imaging. Recently, molecular photoacoustic (MPA) imaging has gained significant interest and a few proof-of-principle studies have been reported. We will summarize the current state-of-the-art of MPA imaging, including the imaging of gene expression and the combination of photoacoustic imaging with other imaging modalities. Lastly, we will point out the obstacles facing photoacoustic imaging. Although photoacoustic imaging will likely continue to be a highly vibrant research field for years to come, the key question of whether MPA imaging can provide significant advantages over non-targeted photoacoustic imaging remains to be demonstrated in the future. PMID:21880823

  1. Image Processing Language. Phase 2.

    DTIC Science & Technology

    1988-11-01

    knowledge engineering of coherent collections of methodological tools as they appear in the literature, and the implementation of expert knowledge in...knowledge representation becomes even more desirable. The role of morphology (Reference 30) as a knowledge formalization tool is another area which is...sets of image processing algorithms. These analyses are to be carried out in several modes including a complete translation to image algebra machine

  2. Image Querying by Image Professionals.

    ERIC Educational Resources Information Center

    Jorgensen, Corinne; Jorgensen, Peter

    2003-01-01

    Reports the analysis of search logs from a commercial image provider over a one-month period and discusses results in relation to previous findings. Analyzes image searches, image queries composing the search, user search modification strategies, results returned, and user browsing of results. (Author/AEF)

  3. Prismatic analyser concept for neutron spectrometers

    SciTech Connect

    Birk, Jonas O.; Jacobsen, Johan; Hansen, Rasmus L.; Lefmann, Kim; Markó, Márton; Niedermayer, Christof; Freeman, Paul G.; Christensen, Niels B.; Månsson, Martin; Rønnow, Henrik M.

    2014-11-15

    Developments in modern neutron spectroscopy have led to typical sample sizes decreasing from a few centimetres to several millimetres in diameter. We demonstrate how small samples, together with the right choice of analyser and detector components, make distance collimation an important concept in crystal analyser spectrometers. We further show that this opens new possibilities in which neutrons with different energies are reflected by the same analyser but counted in different detectors, improving both energy resolution and total count rate compared to conventional spectrometers. The technique can readily be combined with advanced focussing geometries and with multiplexing instrument designs. We present a combination of simulations and data showing three different energies simultaneously reflected from one analyser. Experiments were performed on a cold triple-axis instrument and on a prototype inverse-geometry time-of-flight spectrometer installed at PSI, Switzerland, and show excellent agreement with the predictions. Typical improvements will be 2.0 times finer resolution and a factor of 1.9 in flux gain compared to a focussing Rowland geometry, or 3.3 times finer resolution and a factor of 2.4 in flux gain compared to a single flat analyser slab.

  4. Geomagnetic local and regional harmonic analyses.

    USGS Publications Warehouse

    Alldredge, L.R.

    1982-01-01

    Procedures are developed for using rectangular and cylindrical harmonic analyses in local and regional areas. Both the linear least squares analysis, applicable when component data are available, and the nonlinear least squares analysis, applicable when only total field data are available, are treated. When component data are available, it is advantageous to work with residual fields obtained by subtracting components derived from a harmonic potential from the observed components. When only total field intensity data are available, they must be used directly; residual values cannot be used. Cylindrical harmonic analyses are indicated when fields tend toward cylindrical symmetry; otherwise, rectangular harmonic analyses will be more advantageous. Examples illustrating each type of analysis are given. -Author
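For the component-data case, the linear least-squares fit can be sketched in a simplified 2-D setting; the harmonic basis, geometry, and data below are hypothetical and only illustrate that the field components are linear in the expansion coefficients.

```python
import numpy as np

# Simplified 2-D analogue: a potential V = sum_n exp(-k_n z) (a_n cos k_n x
# + b_n sin k_n x), with k_n = n*pi/L, satisfies Laplace's equation, and the
# field components Bx = -dV/dx, Bz = -dV/dz are linear in a_n, b_n.
L = 100.0
N = 3
rng = np.random.default_rng(1)
x = rng.uniform(0, L, 50)   # hypothetical station coordinates
z = rng.uniform(0, 10, 50)

def design(x, z):
    """Stacked [Bx; Bz] design matrix, one column pair (a_n, b_n) per harmonic."""
    cols = []
    for n in range(1, N + 1):
        k = n * np.pi / L
        e = np.exp(-k * z)
        cols.append(np.concatenate([k * e * np.sin(k * x), k * e * np.cos(k * x)]))   # a_n
        cols.append(np.concatenate([-k * e * np.cos(k * x), k * e * np.sin(k * x)]))  # b_n
    return np.column_stack(cols)

true = np.array([2.0, -1.0, 0.5, 0.3, -0.2, 0.1])   # hypothetical coefficients
A = design(x, z)
obs = A @ true + rng.normal(0, 1e-3, A.shape[0])    # noisy component data
fit, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(np.round(fit, 2))
```

The total-field case is nonlinear because the observable is the magnitude of the field vector, which is why the paper treats it with a separate nonlinear least-squares procedure.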

  5. NEUTRONICS ANALYSES FOR SNS TARGETS DEPOSITIONS

    SciTech Connect

    Popova, Irina I; Remec, Igor; Gallmeier, Franz X

    2016-01-01

    Waste classification analyses are being performed in order to disposition Spallation Neutron Source (SNS) spent facility components, which are replaced because of end-of-life radiation-induced material damage or burn-up, mechanical failure, or design improvements. These analyses include an accurate estimate of the radionuclide inventory, on the basis of which components are classified and an appropriate container for transport and storage is determined. After the container is chosen, transport calculations are performed for the facility component placed inside the container, ensuring compliance with waste management regulations; when necessary, additional shielding is added. Most of the effort is concentrated on target disposition, which normally takes place once or twice per year. Additionally, the second target station (STS) is in the process of design, and waste management analyses for the STS target are being developed to support a disposition plan.

  6. Proteomic Analyses of the Vitreous Humour

    PubMed Central

    Angi, Martina; Kalirai, Helen; Coupland, Sarah E.; Damato, Bertil E.; Semeraro, Francesco; Romano, Mario R.

    2012-01-01

    The human vitreous humour (VH) is a transparent, highly hydrated gel, which occupies the posterior segment of the eye between the lens and the retina. Physiological and pathological conditions of the retina are reflected in the protein composition of the VH, which can be sampled as part of routine surgical procedures. Historically, many studies have investigated levels of individual proteins in VH from healthy and diseased eyes. In the last decade, proteomics analyses have been performed to characterise the proteome of the human VH and explore networks of functionally related proteins, providing insight into the aetiology of diabetic retinopathy and proliferative vitreoretinopathy. Recent proteomic studies on the VH from animal models of autoimmune uveitis have identified new signalling pathways associated with autoimmune triggers and intravitreal inflammation. This paper aims to guide biological scientists through the different proteomic techniques that have been used to analyse the VH and presents future perspectives for the study of intravitreal inflammation using proteomic analyses. PMID:22973072

  7. A qualitative method for analysing multivoicedness

    PubMed Central

    Aveling, Emma-Louise; Gillespie, Alex; Cornish, Flora

    2015-01-01

    ‘Multivoicedness’ and the ‘multivoiced Self’ have become important theoretical concepts guiding research. Drawing on the tradition of dialogism, the Self is conceptualised as being constituted by a multiplicity of dynamic, interacting voices. Despite the growth in literature and empirical research, there remains a paucity of established methodological tools for analysing the multivoiced Self using qualitative data. In this article, we set out a systematic, practical ‘how-to’ guide for analysing multivoicedness. Using theoretically derived tools, our three-step method comprises: identifying the voices of I-positions within the Self’s talk (or text), identifying the voices of ‘inner-Others’, and examining the dialogue and relationships between the different voices. We elaborate each step and illustrate our method using examples from a published paper in which data were analysed using this method. We conclude by offering more general principles for the use of the method and discussing potential applications. PMID:26664292

  8. Imaging Biomarkers or Biomarker Imaging?

    PubMed Central

    Mitterhauser, Markus; Wadsak, Wolfgang

    2014-01-01

    Since biomarker imaging is traditionally understood as imaging of molecular probes, we highly recommend to avoid any confusion with the previously defined term “imaging biomarkers” and, therefore, only use “molecular probe imaging (MPI)” in that context. Molecular probes (MPs) comprise all kinds of molecules administered to an organism which inherently carry a signalling moiety. This review highlights the basic concepts and differences of molecular probe imaging using specific biomarkers. In particular, PET radiopharmaceuticals are discussed in more detail. Specific radiochemical and radiopharmacological aspects as well as some legal issues are presented. PMID:24967536

  9. Advanced laser stratospheric monitoring systems analyses

    NASA Technical Reports Server (NTRS)

    Larsen, J. C.

    1984-01-01

    This report describes the software support supplied by Systems and Applied Sciences Corporation for the study of Advanced Laser Stratospheric Monitoring Systems Analyses under contract No. NAS1-15806. It discusses improvements to the Langley spectroscopic database, development of LHS instrument control software, and data analysis and validation software. The effect of diurnal variations on the retrieved concentrations of NO, NO2, and ClO from space- and balloon-borne measurement platforms is discussed, along with the selection of optimum IF channels for sensing stratospheric species from space.

  10. Analysing particulate deposition to plant canopies

    NASA Astrophysics Data System (ADS)

    Bache, D. H.

    Experimental measurements of the deposition of Lycopodium spores to a plant canopy were analysed to generate specific estimates of the relative significance of sedimentation, impaction, and the effective foliage density f_p. For the particular case analysed, impaction appeared to be the dominant trapping mechanism, and considerable aerodynamic shading was demonstrated. Using an estimate of f_p, a consistent picture emerged of the behaviour of the canopy, both wet and dry, when tested against independent data on the trapping characteristics of individual elements. These conclusions differed significantly from those derived using a model in which impaction was neglected, which led to an apparent overestimate of f_p.

  11. JPEG2000 Image Compression on Solar EUV Images

    NASA Astrophysics Data System (ADS)

    Fischer, Catherine E.; Müller, Daniel; De Moortel, Ineke

    2017-01-01

    For future solar missions as well as ground-based telescopes, efficient ways to return and process data have become increasingly important. Solar Orbiter, which is the next ESA/NASA mission to explore the Sun and the heliosphere, is a deep-space mission, which implies a limited telemetry rate that makes efficient onboard data compression a necessity to achieve the mission science goals. Missions like the Solar Dynamics Observatory (SDO) and future ground-based telescopes such as the Daniel K. Inouye Solar Telescope, on the other hand, face the challenge of making petabyte-sized solar data archives accessible to the solar community. New image compression standards address these challenges by implementing efficient and flexible compression algorithms that can be tailored to user requirements. We analyse solar images from the Atmospheric Imaging Assembly (AIA) instrument onboard SDO to study the effect of lossy JPEG2000 (from the Joint Photographic Experts Group 2000) image compression at different bitrates. To assess the quality of compressed images, we use the mean structural similarity (MSSIM) index as well as the widely used peak signal-to-noise ratio (PSNR) as metrics and compare the two in the context of solar EUV images. In addition, we perform tests to validate the scientific use of the lossily compressed images by analysing examples of an on-disc and off-limb coronal-loop oscillation time-series observed by AIA/SDO.
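As an illustration of the metrics named above, PSNR can be computed directly from the mean squared error (an MSSIM implementation is available, e.g., as `structural_similarity` in scikit-image); the images below are synthetic stand-ins, not AIA data.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between a reference image and a
    (e.g. lossily compressed) test image."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

# Hypothetical example: compression error modelled as additive noise.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(float)
test = np.clip(ref + rng.normal(0, 4.0, ref.shape), 0, 255)
print(f"PSNR: {psnr(ref, test):.1f} dB")
```

PSNR is a purely pixel-wise measure, which is why the paper complements it with the structure-sensitive MSSIM index when judging whether compressed EUV images remain scientifically usable.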

  12. Positioning the image of AIDS.

    PubMed

    Cooter, Roger; Stein, Claudia

    2010-03-01

    AIDS posters can be treated as material objects whose production, distribution and consumption varied across time and place. It is also possible to reconstruct and analyse the public health discourse at the time these powerful images appeared. More recently, however, these conventional historical approaches have been challenged by projects in literary and art criticism. Here, images of AIDS are considered in terms of their function in and for a new discursive regime of power centred on the human body and its visualization. How images of AIDS came to be understood in Western culture in relation to wider political and economic conditions redefines the historical task.

  13. Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.

    PubMed

    Koprowski, Robert

    2015-11-01

    The paper presents problems and solutions related to hyperspectral image pre-processing, and new methods of preliminary image analysis are proposed. It describes problems that occur in Matlab when trying to analyse this type of image. Moreover, new methods are discussed that provide source code in Matlab which can be used in practice without any licensing restrictions. The paper concludes with a proposed application and sample results of hyperspectral image analysis.

  14. Automated Quality Assurance of Online NIR Analysers

    PubMed Central

    Aaljoki, Kari

    2005-01-01

    Modern NIR analysers produce valuable data for closed-loop process control and optimisation practically in real time. Thus it is highly important to keep them in the best possible shape. Quality assurance (QA) of NIR analysers is an interesting and complex issue because it is not only the instrument and sample handling that has to be monitored. At the same time, validity of prediction models has to be assured. A system for fully automated QA of NIR analysers is described. The system takes care of collecting and organising spectra from various instruments, relevant laboratory, and process management system (PMS) data. Validation of spectra is based on simple diagnostics values derived from the spectra. Predictions are validated against laboratory (LIMS) or other online analyser results (collected from PMS). The system features automated alarming, reporting, trending, and charting functions for major key variables for easy visual inspection. Various textual and graphical reports are sent to maintenance people through email. The software was written with Borland Delphi 7 Enterprise. Oracle and PMS ODBC interfaces were used for accessing LIMS and PMS data using appropriate SQL queries. It will be shown that it is possible to take actions even before the quality of predictions is seriously affected, thus maximising the overall uptime of the instrument. PMID:18924628

  15. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types is generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be applied equally well to analyses for high-temperature gas-cooled reactors and liquid metal reactors, and to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selecting an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
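The 95%/95% criterion described above is commonly met with Wilks' nonparametric tolerance-limit formula (a standard statistical result, not something specific to this report): with n randomly sampled code runs, the sample maximum bounds the 95th percentile of the output with confidence 1 - 0.95**n.

```python
# Smallest number of code runs n such that the sample maximum is a
# coverage/confidence tolerance bound: require 1 - coverage**n >= confidence.
def wilks_n(coverage=0.95, confidence=0.95):
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_n())  # the classic first-order 95/95 result: 59 runs
```

This first-order formula is what makes the statistical approach attractive: the required number of runs depends only on the coverage and confidence levels, not on the number of uncertain input parameters.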

  16. Cosmetology: Task Analyses. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  17. Amino acid analyses of Apollo 14 samples.

    NASA Technical Reports Server (NTRS)

    Gehrke, C. W.; Zumwalt, R. W.; Kuo, K.; Aue, W. A.; Stalling, D. L.; Kvenvolden, K. A.; Ponnamperuma, C.

    1972-01-01

    Detection limits were between 300 pg and 1 ng for different amino acids in an analysis by gas-liquid chromatography of water extracts from Apollo 14 lunar fines, in which amino acids were converted to their N-trifluoroacetyl-n-butyl esters. Initial analyses of water and HCl extracts of samples 14240 and 14298 showed no amino acids above background levels.

  18. The Economic Cost of Homosexuality: Multilevel Analyses

    ERIC Educational Resources Information Center

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  19. Written Case Analyses and Critical Reflection.

    ERIC Educational Resources Information Center

    Harrington, Helen L.; And Others

    1996-01-01

    The study investigated the use of case-based pedagogy to develop critical reflection in prospective teachers. Analysis of students' written analyses of dilemma-based cases found patterns showing evidence of students' open-mindedness, sense of professional responsibility, and wholeheartedness in their approach to teaching. (DB)

  20. Chemical Analyses of Silicon Aerogel Samples

    SciTech Connect

    van der Werf, I.; Palmisano, F.; De Leo, Raffaele; Marrone, Stefano

    2008-04-01

    After five years of operation, two aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab suffered a loss of performance. In this note, possible causes of the degradation are studied. In particular, various chemical and physical analyses were carried out on several aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  1. Automated Quality Assurance of Online NIR Analysers.

    PubMed

    Aaljoki, Kari

    2005-01-01

    Modern NIR analysers produce valuable data for closed-loop process control and optimisation practically in real time. Thus it is highly important to keep them in the best possible shape. Quality assurance (QA) of NIR analysers is an interesting and complex issue because it is not only the instrument and sample handling that has to be monitored. At the same time, validity of prediction models has to be assured. A system for fully automated QA of NIR analysers is described. The system takes care of collecting and organising spectra from various instruments, relevant laboratory, and process management system (PMS) data. Validation of spectra is based on simple diagnostics values derived from the spectra. Predictions are validated against laboratory (LIMS) or other online analyser results (collected from PMS). The system features automated alarming, reporting, trending, and charting functions for major key variables for easy visual inspection. Various textual and graphical reports are sent to maintenance people through email. The software was written with Borland Delphi 7 Enterprise. Oracle and PMS ODBC interfaces were used for accessing LIMS and PMS data using appropriate SQL queries. It will be shown that it is possible to take actions even before the quality of predictions is seriously affected, thus maximising the overall uptime of the instrument.

  2. Analysing Simple Electric Motors in the Classroom

    ERIC Educational Resources Information Center

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simple Johnson electric motor. In this…

  3. Correlation Functions Aid Analyses Of Spectra

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Norton, Robert H., Jr.

    1989-01-01

    New uses have been found for correlation functions in the analysis of spectra. In an approach combining elements of both pattern-recognition and traditional spectral-analysis techniques, spectral lines that appear useless at first glance because they are dominated by noise can be identified in the data. The new approach is particularly useful for measuring the concentrations of rare molecular species in the atmosphere.
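The idea of pulling a noise-dominated line out of a spectrum by correlating it with a template of the expected line shape can be sketched as follows; the Gaussian line profile, amplitudes, and noise level are hypothetical.

```python
import numpy as np

# Synthetic spectrum: Gaussian noise plus a weak absorption/emission line
# whose wings are buried in the noise.
rng = np.random.default_rng(2)
n = 500
line = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)  # Gaussian line template
spectrum = rng.normal(0, 0.5, n)
spectrum[240:261] += 1.5 * line                        # line centred at index 250

# Cross-correlating with the template concentrates the line's energy into a
# single peak, lifting it above the per-channel noise.
corr = np.correlate(spectrum, line, mode="same")
print("line detected near index", int(corr.argmax()))
```

The correlation acts as a matched filter: the gain over per-channel inspection grows with the number of channels the line spans, which is what makes the method suited to rare (weak-line) species.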

  4. Functional Analyses and Treatment of Precursor Behavior

    ERIC Educational Resources Information Center

    Najdowski, Adel C.; Wallace, Michele D.; Ellsworth, Carrie L.; MacAleese, Alicia N.; Cleveland, Jackie

    2008-01-01

    Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe…

  5. A Call for Conducting Multivariate Mixed Analyses

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.

    2016-01-01

    Several authors have written methodological works that provide an introductory- and/or intermediate-level guide to conducting mixed analyses. Although these works have been useful for beginning and emergent mixed researchers, with very few exceptions, works are lacking that describe and illustrate advanced-level mixed analysis approaches. Thus,…

  6. Body Imaging

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Magnetic Resonance Imaging (MRI) and Computer-aided Tomography (CT) images are often complementary. In most cases, MRI is good for viewing soft tissue but not bone, while CT images are good for bone but not always good for soft tissue discrimination. Physicians and engineers in the Department of Radiology at the University of Michigan Hospitals are developing a technique for combining the best features of MRI and CT scans to increase the accuracy of discriminating one type of body tissue from another. One of their research tools is a computer program called HICAP. The program can be used to distinguish between healthy and diseased tissue in body images.

  7. Multispectral imaging and image processing

    NASA Astrophysics Data System (ADS)

    Klein, Julie

    2014-02-01

    The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview will be given of several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by the filters and camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of the multispectral images. Strong foundations in multispectral imaging were laid, and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics, like stereo multispectral imaging and goniometric multispectral measurements, that are further explored with his expertise will also be presented in this work.

  8. Airbags to Martian Landers: Analyses at Sandia National Laboratories

    SciTech Connect

    Gwinn, K.W.

    1994-03-01

    A new direction for the national laboratories is to assist US business with research and development, primarily through cooperative research and development agreements (CRADAs). Technology transfer to the private sector has been very successful, as over 200 CRADAs are in place at Sandia. Because of these cooperative efforts, technology has evolved into some new areas not commonly associated with the former mission of the national laboratories. An example of this is the analysis of fabric structures. Explicit analyses and expertise in constructing parachutes led to the development of a next-generation automobile airbag; which led to the construction, testing, and analysis of the Jet Propulsion Laboratory Mars Environmental Survey Lander; and finally to the development of CAD-based custom garment designs using 3D scanned images of the human body. The structural analysis of these fabric structures is described, along with a more traditional Sandia example: the test/analysis correlation of the impact of a weapon container.

  9. Passive adaptive imaging through turbulence

    NASA Astrophysics Data System (ADS)

    Tofsted, David

    2016-05-01

    Standard methods for improved imaging system performance under degrading optical turbulence conditions typically involve active adaptive techniques or post-capture image processing. Here, passive adaptive methods are considered, where active sources are disallowed a priori. Theoretical analyses of short-exposure turbulence effects indicate that different aperture sizes experience different degrees of degradation; smaller apertures often outperform larger-aperture systems as turbulence strength increases, which suggests that a controllable-aperture system is advantageous. In addition, sub-aperture sampling of a set of training images permits the system to sense tilts in different sub-aperture regions through image acquisition and image cross-correlation calculations. A four-sub-aperture pattern supports corrections involving five realizable operating modes (beyond tip and tilt) for removing aberrations over an annular pattern. Progress to date will be discussed regarding the development and field trials of a prototype system.
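Tilt sensing by image cross-correlation, as described above, can be sketched with phase correlation on synthetic sub-aperture images; the data and the integer-pixel shift model are illustrative assumptions, not the prototype's actual processing chain.

```python
import numpy as np

def shift_estimate(a, b):
    """Estimate the integer-pixel shift of image b relative to image a via
    phase correlation (normalized cross-power spectrum)."""
    F = np.fft.fft2(b) * np.conj(np.fft.fft2(a))
    r = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    peak = np.unravel_index(r.argmax(), r.shape)
    # Map the wrapped peak position to a signed shift.
    return [int(p) if p <= s // 2 else int(p - s) for p, s in zip(peak, a.shape)]

# A local wavefront tilt displaces the sub-aperture image; simulate it as a
# circular shift of a random scene.
rng = np.random.default_rng(3)
a = rng.normal(size=(64, 64))
b = np.roll(a, (3, -5), axis=(0, 1))   # simulated tilt-induced shift
print(shift_estimate(a, b))
```

Comparing the shifts recovered in each of the four sub-aperture regions against a reference would give the per-region tilt estimates used to drive the low-order corrections.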

  10. Image of the Singapore Child

    ERIC Educational Resources Information Center

    Ebbeck, Marjory; Warrier, Sheela

    2008-01-01

    The purpose of this study was to analyse the contents of one of the leading newspapers of Singapore in an effort to identify the public image of the children of the nation. Newspaper clippings of news/articles, pictures/photographs and advertisements featuring children below 15 years of age were collected over a one-week period and the content…

  11. Radiation effects on video imagers

    NASA Astrophysics Data System (ADS)

    Yates, G. J.; Bujnosek, J. J.; Jaramillo, S. A.; Walton, R. B.; Martinez, T. M.

    1986-02-01

    Radiation sensitivity of several photoconductive, photoemissive, and solid-state silicon-based video imagers was measured by analysing stored photo-charge induced by irradiation with continuous and pulsed sources of high-energy photons and neutrons. Transient effects as functions of absorbed dose, dose rate, fluence, and ionizing particle energy are presented.

  12. Body image and media use among adolescents.

    PubMed

    Borzekowski, Dina L G; Bayer, Angela M

    2005-06-01

    This article reviews the literature on body image and media use among adolescents. We begin by defining body image and how it is constructed, especially among young people. We then offer information on when one's body image perception is askew with one's perception of personal ideal, which can result in disordered eating, including obesity, anorexia, and bulimia. Next, we describe the research literature on media use and its relationship to adolescents' body image perceptions and discuss content analyses and correlational, experimental, and qualitative studies. Lastly, we recommend, beyond conducting further and improved research studies, interventions and policies that may have an impact on body image and media use.

  13. Advanced Land Imager Assessment System

    NASA Technical Reports Server (NTRS)

    Chander, Gyanesh; Choate, Mike; Christopherson, Jon; Hollaren, Doug; Morfitt, Ron; Nelson, Jim; Nelson, Shar; Storey, James; Helder, Dennis; Ruggles, Tim; Kaita, Ed; Levy, Raviv; Ong, Lawrence; Markham, Brian; Schweiss, Robert

    2008-01-01

    The Advanced Land Imager Assessment System (ALIAS) supports radiometric and geometric image processing for the Advanced Land Imager (ALI) instrument onboard NASA's Earth Observing-1 (EO-1) satellite. ALIAS consists of two processing subsystems for radiometric and geometric processing of the ALI's multispectral imagery. The radiometric processing subsystem characterizes and corrects, where possible, radiometric qualities including coherent, impulse, and random noise; signal-to-noise ratios (SNRs); detector operability; gain; bias; saturation levels; striping and banding; and the stability of detector performance. The geometric processing subsystem and analysis capabilities support sensor alignment calibrations, sensor chip assembly (SCA)-to-SCA alignments, and band-to-band alignment, and perform geodetic accuracy assessments, modulation transfer function (MTF) characterizations, and image-to-image characterizations. ALIAS also characterizes and corrects band-to-band registration, and performs systematic precision and terrain correction of ALI images. This system can geometrically correct, and automatically mosaic, the SCA image strips into a seamless, map-projected image. This system provides a large database, which enables bulk trending for all ALI image data and significant instrument telemetry. Bulk trending consists of two functions: Housekeeping Processing and Bulk Radiometric Processing. The Housekeeping function pulls telemetry and temperature information from the instrument housekeeping files and writes this information to a database for trending. The Bulk Radiometric Processing function writes statistical information from the dark data acquired before and after the Earth imagery and the lamp data to the database for trending. This allows for multi-scene statistical analyses.

  14. Image fusion

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.

  15. Imaging Atherosclerosis

    PubMed Central

    Tarkin, Jason M.; Dweck, Marc R.; Evans, Nicholas R.; Takx, Richard A.P.; Brown, Adam J.; Tawakol, Ahmed; Fayad, Zahi A.

    2016-01-01

    Advances in atherosclerosis imaging technology and research have provided a range of diagnostic tools to characterize high-risk plaque in vivo; however, these important vascular imaging methods additionally promise great scientific and translational applications beyond this quest. When combined with conventional anatomic- and hemodynamic-based assessments of disease severity, cross-sectional multimodal imaging incorporating molecular probes and other novel noninvasive techniques can add detailed interrogation of plaque composition, activity, and overall disease burden. In the catheterization laboratory, intravascular imaging provides unparalleled access to the world beneath the plaque surface, allowing tissue characterization and measurement of cap thickness with micrometer spatial resolution. Atherosclerosis imaging captures key data that reveal snapshots into underlying biology, which can test our understanding of fundamental research questions and shape our approach toward patient management. Imaging can also be used to quantify response to therapeutic interventions and ultimately help predict cardiovascular risk. Although there are undeniable barriers to clinical translation, many of these hold-ups might soon be surpassed by rapidly evolving innovations to improve image acquisition, coregistration, motion correction, and reduce radiation exposure. This article provides a comprehensive review of current and experimental atherosclerosis imaging methods and their uses in research and potential for translation to the clinic. PMID:26892971

  16. Photoacoustic Imaging.

    DTIC Science & Technology

    1983-12-01

    (Fragmentary table-of-contents text: Section 5, 'Imaging with a Diode Laser as the Optical Source'; Section 6, 'High Resolution Acousto-Optic Laser Probe'.) We have recently imaged photoacoustically, with micron resolution, using a 5 milliwatt diode laser as the optical source. This demonstration is an indication of the tremendous sensitivity that we...

  17. Imaging Genetics

    ERIC Educational Resources Information Center

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  18. Retinal Imaging and Image Analysis

    PubMed Central

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:21743764

  19. Quantitative analyses of maxillary sinus using computed tomography.

    PubMed

    Perella, Andréia; Rocha, Sara Dos Santos; Cavalcanti, Marcelo de Gusmão Paraiso

    2003-09-01

    The aim of this study was to evaluate the precision and accuracy of linear measurements of the maxillary sinus made on tomographic films, by comparison with 3D reconstructed images. Linear measurements of both maxillary sinuses were made on computed tomography (CT) images of 17 patients, with or without lesions, by two calibrated examiners independently, on two occasions, using a single manual caliper. A third examiner made the same measurements electronically on 3D-CT reconstructions. The statistical analysis was performed using ANOVA (analysis of variance). Intra-observer percentage error was small in cases both with and without lesions, ranging from 1.14% to 1.82%. The inter-observer error was slightly higher, reaching 2.08%. The percentage accuracy error was higher in samples with lesions than in those without. CT provided adequate precision and accuracy for maxillary sinus analyses. Precision in cases with lesions was inferior to that in cases without, but this did not affect the efficacy of the method.
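The abstract does not state the exact formula behind its 1.14% to 2.08% figures, so the following is a hypothetical sketch of one common definition of observer percentage error: the mean absolute difference between paired repeated measurements, expressed relative to their mean.

```python
def percentage_error(first_pass, second_pass):
    """Mean paired percentage error between two measurement passes.
    Hypothetical formula; the study's exact definition is not given."""
    errors = [abs(a - b) / ((a + b) / 2.0) * 100.0
              for a, b in zip(first_pass, second_pass)]
    return sum(errors) / len(errors)
```

For intra-observer error the two passes come from the same examiner on two occasions; for inter-observer error they come from different examiners.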

  20. HASE: Framework for efficient high-dimensional association analyses

    PubMed Central

    Roshchupkin, G. V.; Adams, H. H. H.; Vernooij, M. W.; Hofman, A.; Van Duijn, C. M.; Ikram, M. A.; Niessen, W. J.

    2016-01-01

    High-throughput technology can now provide rich information on a person's biological makeup and environmental surroundings. Important discoveries have been made by relating these data to various health outcomes in fields such as genomics, proteomics, and medical imaging. However, cross-investigations between several high-throughput technologies remain impractical due to demanding computational requirements (hundreds of years of computing resources) and unsuitability for collaborative settings (terabytes of data to share). Here we introduce the HASE framework, which overcomes both of these issues. Our approach dramatically reduces computational time from years to only hours and reduces the data exchanged between collaborators from terabytes to several gigabytes. We implemented a novel meta-analytical method that yields power identical to pooled analyses without the need to share individual participant data. The efficiency of the framework is illustrated by associating 9 million genetic variants with 1.5 million brain imaging voxels in three cohorts (total N = 4,034) followed by meta-analysis, on a standard computational infrastructure. These experiments indicate that HASE facilitates high-dimensional association studies, enabling large multicenter association studies for future discoveries. PMID:27782180
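HASE's own meta-analytical method operates on compressed summary matrices, but the principle it relies on, that cohort-level summaries can be combined without pooling individual participant data, is the same one behind classical fixed-effect inverse-variance meta-analysis. A minimal sketch of the classical form, for illustration only:

```python
import numpy as np

def inverse_variance_meta(betas, ses):
    """Combine per-cohort effect estimates (betas) and standard errors
    (ses) into a fixed-effect meta-analytic estimate and standard error."""
    betas = np.asarray(betas, dtype=float)
    weights = 1.0 / np.asarray(ses, dtype=float) ** 2   # precision weights
    beta_meta = np.sum(weights * betas) / np.sum(weights)
    se_meta = np.sqrt(1.0 / np.sum(weights))
    return beta_meta, se_meta
```

With homogeneous cohorts this estimate matches the pooled analysis asymptotically, which is the property the abstract highlights.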

  1. [Quantitative analysis and visualization of cardiac function]

    NASA Astrophysics Data System (ADS)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard practice in cardiology. Available products usually demand a high degree of user interaction and thus considerable time. This work presents an approach that allows the cardiologist a largely automatic analysis of cardiac function from MRI image data, thereby saving time. All relevant cardiophysiological parameters are computed and visualized by means of diagrams and graphs. These computations are evaluated by comparing the derived values with manually measured ones. The resulting mean error, 2.85 mm for wall thickness and 1.61 mm for wall thickening, still lies within the range of one pixel size of the images used.

  2. Reliability of chemical analyses of water samples

    SciTech Connect

    Beardon, R.

    1989-11-01

    Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of on-going monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.

  3. Sensitivity in risk analyses with uncertain numbers.

    SciTech Connect

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a 'pinching' strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
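As a toy illustration of the 'pinching' strategy (intervals only, not the report's Dempster-Shafer structures or probability boxes), one can represent each input as an interval, pinch one input to its midpoint, and measure the fractional reduction in the width of the output interval:

```python
def product_interval(x, y):
    """Output bounds of f(x, y) = x * y for non-negative interval inputs."""
    return x[0] * y[0], x[1] * y[1]

def pinch_sensitivity(x, y):
    """Fraction of output interval width removed by pinching x to its
    midpoint, a rough measure of how much of the output uncertainty
    is attributable to x."""
    base_lo, base_hi = product_interval(x, y)
    mid = (x[0] + x[1]) / 2.0
    pin_lo, pin_hi = product_interval((mid, mid), y)
    return 1.0 - (pin_hi - pin_lo) / (base_hi - base_lo)
```

For example, pinching x = [1, 3] in f = x * y with y = [2, 4] shrinks the output from [2, 12] to [4, 8], removing 60% of the width. The report's version applies the same idea to epistemic and aleatory components separately.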

  4. Center for Naval Analyses Annual Report 1982.

    DTIC Science & Technology

    1982-01-01

    recipients, alike, was due mainly to temporary rather than permanent layoffs; they were unemployed for about the same length of time, and their post-layoff ... equal to 70 percent of average weekly wages for 52 weeks in the two years following layoff) apparently encouraged workers to remain unemployed longer ... Institute for Defense Analyses. William A. Nierenberg, Director of the Scripps Institution of Oceanography. Member, NASA Advisory Council. Member

  5. [Resistance analyses for recirculated membrane bioreactor].

    PubMed

    Yang, Qi; Huang, Xia; Shang, Hai-Tao; Wen, Xiang-Hua; Qian, Yi

    2006-11-01

    The resistance analyses for a recirculated membrane bioreactor, using the resistance-in-series model and the modified gel-polarization model respectively, were extended to the turbulent ultrafiltration system. The experiments were carried out with dye wastewater in a tubular membrane module, and it was found that the permeate fluxes are predicted very well by these models for turbulent systems. The resistance caused by concentration polarization was also studied; the gel layer resistance is the most important of all the resistances.

  6. [Clinical research=design*measurements*statistical analyses].

    PubMed

    Furukawa, Toshiaki

    2012-06-01

    A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question. Formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study one must have a knowledge of study design, measurements and statistical analyses: The first is taught by epidemiology, the second by psychometrics and the third by biostatistics.

  7. Optimizing header strength utilizing finite element analyses

    NASA Astrophysics Data System (ADS)

    Burchett, S. N.

    Finite element techniques have been successfully applied as a design tool in the optimization of high strength headers for pyrotechnic-driven actuators. These techniques have been applied to three aspects of the design process of a high strength header. The design process was a joint effort of experts from several disciplines including design engineers, material scientists, test engineers, manufacturing engineers, and structural analysts. Following material selection, finite element techniques were applied to evaluate the residual stresses due to manufacturing which were developed in the high strength glass ceramic-to-metal seal headers. Results from these finite element analyses were used to identify header designs which were manufacturable and had a minimum residual stress state. Finite element techniques were then applied to obtain the response of the header due to pyrotechnic burn. The results provided realistic upper bounds on the pressure containment ability of various preliminary header designs and provided a quick and inexpensive method of strengthening and refining the designs. Since testing of the headers was difficult and sometimes destructive, results of the analyses were also used to interpret test results and identify failure modes. In this paper, details of the finite element techniques including the models used, material properties, material failure models, and loading will be presented. Results from the analyses showing the header failure process will also be presented. This paper will show that significant gains in capability and understanding can result when finite element techniques are included as an integral part of the design process of complicated high strength headers.

  8. Evaluation of the Technicon Axon analyser.

    PubMed

    Martínez, C; Márquez, M; Cortés, M; Mercé, J; Rodriguez, J; González, F

    1990-01-01

    An evaluation of the Technicon Axon analyser was carried out following the guidelines of the 'Sociedad Española de Química Clínica' and the European Committee for Clinical Laboratory Standards. A photometric study revealed acceptable results at both 340 nm and 404 nm. Inaccuracy and imprecision were lower at 404 nm than at 340 nm, although poor dispersion was found at both wavelengths, even at low absorbances. Drift was negligible, the imprecision of the sample pipette delivery system was greater for small sample volumes, the reagent pipette delivery system imprecision was acceptable and the sample diluting system study showed good precision and accuracy. Twelve analytes were studied for evaluation of the analyser under routine working conditions. Satisfactory results were obtained for within-run imprecision, while coefficients of variation for between-run imprecision were much greater than expected. Neither specimen-related nor specimen-independent contamination was found in the carry-over study. For all analytes assayed, when comparing patient sample results with those obtained in a Hitachi 737 analyser, acceptable relative inaccuracy was observed.

  9. MULTISPECTRAL THERMAL IMAGER - OVERVIEW

    SciTech Connect

    P. WEBER

    2001-03-01

    The Multispectral Thermal Imager satellite fills a new and important role in advancing the state of the art in remote sensing sciences. Initial results with the full calibration system operating indicate that the system was already close to achieving the very ambitious goals which we laid out in 1993, and we are confident of reaching all of these goals as we continue our research and improve our analyses. In addition to the DOE interests, the satellite is tasked about one-third of the time with requests from other users supporting research ranging from volcanology to atmospheric sciences.

  10. Malware Memory Analysis for Non specialists: Investigating Publicly Available Memory Images for Prolaco and SpyEye

    DTIC Science & Technology

    2013-10-01

    images, the author is of the opinion that these analyses are insufficient for use as a learning guide. Specifically, these analyses are either too... having noted analyses of these public memory images, the author believes that these analyses are not detailed enough to serve as a guide

  11. Optical eigenmodes for illumination & imaging

    NASA Astrophysics Data System (ADS)

    Kosmeier, Sebastian

    Gravitational Microlensing, as a technique for detecting Extrasolar Planets, is recognised for its potential in discovering small-mass planets similar to Earth, at a distance of a few Astronomical Units from their host stars. However, analysing the data from microlensing events (which statistically rarely reveal planets) is complex and requires continued and intensive use of various networks of telescopes working together in order to observe the phenomenon. As such the techniques are constantly being developed and refined; this project outlines some steps of the careful analysis required to model an event and ensure the best quality data is used in the fitting. A quantitative investigation into increasing the quality of the original photometric data available from any microlensing event demonstrates that 'lucky imaging' can lead to a marked improvement in the signal to noise ratio of images over standard imaging techniques, which could result in more accurate models and thus the calculation of more accurate planetary parameters. In addition, a simulation illustrating the effects of atmospheric turbulence on exposures was created, and expanded upon to give an approximation of the lucky imaging technique. This further demonstrated the advantages of lucky images which are shown to potentially approach the quality of those expected from diffraction limited photometry. The simulation may be further developed for potential future use as a 'theoretical lucky imager' in our research group, capable of producing and analysing synthetic exposures through customisable conditions.

  12. Bathymetric imaging

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Malin, M. C.

    1981-01-01

    Digital topography has, for some years, been formatted and processed into shaded relief images for specific studies involving land use and thermal properties. Application to bathymetry is a new and seemingly fruitful extension of these techniques. Digital terrain models of the earth - combining subaerial topography with an extensive collection of bathymetric soundings - have been processed to yield shaded relief images. These images provide new and exciting insights into submarine geomorphology and portray many aspects of plate tectonic physiography in a manner not previously possible.

  13. Department of Energy's team's analyses of Soviet designed VVERs

    SciTech Connect

    Not Available

    1989-09-01

    This document provides Appendices A through K of this report. The topics discussed are, respectively: radiation-induced embrittlement and annealing of reactor pressure vessel steels; loss-of-coolant accident blowdown analyses; LOCA blowdown response analyses; non-seismic structural response analyses; seismic analyses; "S" seal integrity; reactor transient analyses; fire protection; aircraft impacts; and boric acid induced corrosion. (FI).

  14. Stable isotopic analyses in paleoclimatic reconstruction

    SciTech Connect

    Wigand, P.E.

    1995-09-01

    Most traditional paleoclimatic proxy data have inherent time lags between climatic input and system response that constrain their use in accurate reconstruction of paleoclimate chronology, scaling of its variability, and the elucidation of the processes that determine its impact on the biotic and abiotic environment. With the exception of dendroclimatology, and studies of short-lived organisms and pollen recovered from annually varved lacustrine sediments, significant periods of time ranging from years, to centuries, to millennia may intervene between climate change and its first manifestation in paleoclimatic proxy data records. Reconstruction of past climate through changes in plant community composition derived from pollen sequences and plant remains from ancient woodrat middens, wet environments and dry caves all suffer from these lags. However, stable isotopic analyses can provide more immediate indication of biotic response to climate change. Evidence of past physiological response of organisms to changes in effective precipitation as climate varies can be provided by analyses of the stable isotopic content of plant macrofossils from various contexts. These analyses consider variation in the stable isotopic (hydrogen, oxygen and carbon) content of plant tissues as it reflects (1) past global or local temperature through changes in meteoric (rainfall) water chemistry in the case of the first two isotopes, and (2) plant stress through changes in plant respiration/transpiration processes under differing water availability, and varying atmospheric CO2 composition (which itself may actually be a net result of biotic response to climate change). Studies currently being conducted in the Intermountain West indicate both long- and short-term responses that, when calibrated with modern analogue studies, have the potential of revealing not only the timing of climate events, but their direction, magnitude and rapidity.

  15. Quality of reporting of dental survival analyses.

    PubMed

    Layton, D M; Clarke, M

    2014-12-01

    To explore the quality of reporting (writing and graphics) of articles that used time-to-event analyses to report dental treatment outcomes. A systematic search of the top 50 dental journals in 2008 produced the sample of articles for this analysis. Articles reporting treatment outcomes with (n = 95) and without (n = 91) time-to-event statistics were reviewed. Survival descriptive words used in the two groups were analysed (Pearson's chi-square). The quality of life tables, survival curves and time-to-event statistics were assessed (Kappa analysed agreement) and explored. Words describing dental outcomes 'over time' were more common in time-to-event compared with control articles (77%, 3%, P < 0.001). Non-specific use of 'rate' was common across both groups. Life tables and survival curves were used by 39% and 48% of the time-to-event articles, with at least one used by 82%. Construction quality was poor: 21% of life tables and 28% of survival curves achieved an acceptable standard. Time-to-event statistical reporting was poor: 3% achieved a high and 59% achieved an acceptable standard. The survival statistic, summary figure and standard error were reported in 76%, 95% and 20% of time-to-event articles. Individual statistical terms and graphic aids were common within and unique to time-to-event articles. Unfortunately, important details were regularly omitted from statistical descriptions and survival figures making the overall quality poor. It is likely this will mean such articles will be incorrectly indexed in databases, missed by searchers and unable to be understood completely if identified.
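For reference, the core time-to-event statistic whose reporting the review audits, a Kaplan-Meier survival estimate together with its (frequently omitted) standard error, can be computed as below. This is standard methodology sketched by the editor, not code from the article; Greenwood's formula supplies the standard error.

```python
import math

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates with Greenwood standard errors.
    `events[i]` is True for a failure at `times[i]`, False for censoring.
    Returns (time, survival, standard_error) tuples at each failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, greenwood, out, i = 1.0, 0.0, [], 0
    while i < len(data):
        t = data[i][0]
        block = [ev for tt, ev in data[i:] if tt == t]
        d = sum(block)            # failures at time t
        m = len(block)            # failures + censorings at time t
        if d > 0:
            surv *= (n_at_risk - d) / n_at_risk
            if n_at_risk > d:     # Greenwood term undefined when S(t) hits 0
                greenwood += d / (n_at_risk * (n_at_risk - d))
            out.append((t, surv, surv * math.sqrt(greenwood)))
        n_at_risk -= m
        i += m
    return out
```

Reporting all three columns of the output is precisely what only 20% of the audited articles did for the standard error.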

  16. Combustion Devices CFD Team Analyses Review

    NASA Technical Reports Server (NTRS)

    Rocker, Marvin

    2008-01-01

    A variety of CFD simulations performed by the Combustion Devices CFD Team at Marshall Space Flight Center will be presented. These analyses were performed to support Space Shuttle operations and Ares-1 Crew Launch Vehicle design. Results from the analyses will be shown along with pertinent information on the CFD codes and computational resources used to obtain the results. Six analyses will be presented - two related to the Space Shuttle and four related to the Ares I-1 launch vehicle now under development at NASA. First, a CFD analysis of the flow fields around the Space Shuttle during the first six seconds of flight and potential debris trajectories within those flow fields will be discussed. Second, the combusting flows within the Space Shuttle Main Engine's main combustion chamber will be shown. For the Ares I-1, an analysis of the performance of the roll control thrusters during flight will be described. Several studies are discussed related to the J2-X engine to be used on the upper stage of the Ares I-1 vehicle. A parametric study of the propellant flow sequences and mixture ratios within the GOX/GH2 spark igniters on the J2-X is discussed. Transient simulations will be described that predict the asymmetric pressure loads that occur on the rocket nozzle during the engine start as the nozzle fills with combusting gases. Simulations of issues that affect temperature uniformity within the gas generator used to drive the J-2X turbines will be described as well, both upstream of the chamber in the injector manifolds and within the combustion chamber itself.

  17. Method of performing computational aeroelastic analyses

    NASA Technical Reports Server (NTRS)

    Silva, Walter A. (Inventor)

    2011-01-01

    Computational aeroelastic analyses typically use a mathematical model for the structural modes of a flexible structure and a nonlinear aerodynamic model that can generate a plurality of unsteady aerodynamic responses based on the structural modes for conditions defining an aerodynamic condition of the flexible structure. In the present invention, a linear state-space model is generated using a single execution of the nonlinear aerodynamic model for all of the structural modes where a family of orthogonal functions is used as the inputs. Then, static and dynamic aeroelastic solutions are generated using computational interaction between the mathematical model and the linear state-space model for a plurality of periodic points in time.
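As a much-simplified illustration of the identification idea (a plain least-squares fit of a discrete-time linear state-space model to noiseless simulated data, not the patented orthogonal-function procedure applied to a nonlinear aerodynamic code):

```python
import numpy as np

def fit_state_space(X, U):
    """Least-squares fit of x[k+1] = A x[k] + B u[k].
    X: n x (T+1) state history; U: m x T input history."""
    Z = np.vstack([X[:, :-1], U])       # regressors [x[k]; u[k]]
    AB = X[:, 1:] @ np.linalg.pinv(Z)   # solve X[:, 1:] = [A B] Z
    n = X.shape[0]
    return AB[:, :n], AB[:, n:]
```

On exact data from a linear system with a persistently exciting input, this fit recovers A and B to numerical precision; the patent's contribution lies in obtaining an equivalent linear model from a single run of a nonlinear aerodynamic code.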

  18. Thermal structure analyses for CSM testbed (COMET)

    NASA Technical Reports Server (NTRS)

    Xue, David Y.; Mei, Chuh

    1994-01-01

    This document is the final report for the project entitled 'Thermal Structure Analyses for CSM Testbed (COMET),' for the period of May 16, 1992 - August 15, 1994. The project was focused on the investigation and development of finite element analysis capability of the computational structural mechanics (CSM) testbed (COMET) software system in the field of thermal structural responses. The stages of this project consisted of investigating present capabilities, developing new functions, analysis demonstrations, and research topics. The appendices of this report list the detailed documents of major accomplishments and demonstration runstreams for future references.

  19. Further analyses of Rio Cuarto impact glass

    NASA Technical Reports Server (NTRS)

    Schultz, Peter H.; Bunch, T. E.; Koeberl, C.; Collins, W.

    1993-01-01

    Initial analyses of the geologic setting, petrology, and geochemistry of glasses recovered from within and around the elongate Rio Cuarto (RC) craters in Argentina focused on selected samples in order to document the general similarity with impactites around other terrestrial impact craters and to establish their origin. Continued analysis has surveyed the diversity in compositions for a range of samples, examined further evidence for temperature and pressure history, and compared the results with experimentally fused loess from oblique hypervelocity impacts. These new results not only firmly establish their impact origin but provide new insight on the impact process.

  20. Laser power beaming system analyses. Final report

    SciTech Connect

    Zeiders, G.W. Jr.

    1993-08-01

    The successful demonstration of the PAMELA adaptive optics hardware and the fabrication of the BTOS truss structure were identified by the program office as the two most critical elements of the NASA power beaming program, so it was these that received attention during this program. Much of the effort was expended in direct program support at MSFC, but detailed technical analyses of the AMP deterministic control scheme and the BTOS truss structure (both the JPL design and a spherical one) were prepared and are attached, and recommendations are given.

  1. Environmental monitoring final report: groundwater chemical analyses

    SciTech Connect

    Not Available

    1984-02-01

This report presents the results of analyses of groundwater quality at the SRC-I Demonstration Plant site in Newman, Kentucky. Samples were obtained from a network of 23 groundwater observation wells installed during previous studies. The groundwater was well within US EPA Interim Primary Drinking Water Standards for trace metals, radioactivity, and pesticides, but exceeded the standard for coliform bacteria. Several US EPA Secondary Drinking Water Standards were exceeded, namely, manganese, color, iron, and total dissolved solids. Based on the results, Dames and Moore recommend that all wells be sterilized and that the wells built in 1980 be redeveloped. 1 figure, 6 tables.

  2. Fundamentals of fungal molecular population genetic analyses.

    PubMed

    Xu, Jianping

    2006-07-01

    The last two decades have seen tremendous growth in the development and application of molecular methods in the analyses of fungal species and populations. In this paper, I provide an overview of the molecular techniques and the basic analytical tools used to address various fundamental population and evolutionary genetic questions in fungi. With increasing availability and decreasing cost, DNA sequencing is becoming a mainstream data acquisition method in fungal evolutionary genetic studies. However, other methods, especially those based on the polymerase chain reaction, remain powerful in addressing specific questions for certain groups of taxa. These developments are bringing fungal population and evolutionary genetics into mainstream ecology and evolutionary biology.

  3. Analyses of containment structures with corrosion damage

    SciTech Connect

    Cherry, J.L.

    1996-12-31

Corrosion damage to a nuclear power plant containment structure can degrade the pressure capacity of the vessel. For the low-carbon, low-strength steels used in containments, the effect of corrosion on material properties is discussed. Strain-to-failure tests, in uniaxial tension, have been performed on corroded material samples. Results were used to select strain-based failure criteria for corroded steel. Using the ABAQUS finite element analysis code, the capacity of a typical PWR Ice Condenser containment with corrosion damage has been studied. Multiple analyses were performed with the location of the corrosion on the containment and the amount of corrosion varied in each analysis.

  4. Medical Imaging.

    ERIC Educational Resources Information Center

    Jaffe, C. Carl

    1982-01-01

    Describes principle imaging techniques, their applications, and their limitations in terms of diagnostic capability and possible adverse biological effects. Techniques include film radiography, computed tomography, nuclear medicine, positron emission tomography (PET), ultrasonography, nuclear magnetic resonance, and digital radiography. PET has…

  5. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images. In this photograph, a patient undergoes an open MRI.

  6. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images.

  7. Imaging System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The 1100C Virtual Window is based on technology developed under NASA Small Business Innovation (SBIR) contracts to Ames Research Center. For example, under one contract Dimension Technologies, Inc. developed a large autostereoscopic display for scientific visualization applications. The Virtual Window employs an innovative illumination system to deliver the depth and color of true 3D imaging. Its applications include surgery and Magnetic Resonance Imaging scans, viewing for teleoperated robots, training, and in aviation cockpit displays.

  8. Image Inpainting

    DTIC Science & Technology

    2005-01-01

Keywords: image restoration, inpainting, isophotes, anisotropic diffusion. The paper concerns the modification of images in a way that is not easily detected. Given regions to be restored, the algorithm automatically fills in these regions with information from their surroundings; the fill-in is done in such a way that isophote lines arriving at the region boundaries are completed inside.

  9. Image Security

    DTIC Science & Technology

    2007-11-02

Addresses the crucial need for protecting intellectual property rights on multimedia content such as images, video, and audio, and protection for still images, audio, video, and multimedia products. The networking environment of the future will require tools that provide secure and fast rights protection, such as the technique known as steganography. Steganography, or "covered writing," has a long history. (George Voyatzis and Ioannis Pitas, University of Thessaloniki)

  10. Magnesium alloy ingots: Chemical and metallographic analyses

    NASA Astrophysics Data System (ADS)

    Tartaglia, John M.; Swartz, Robert E.; Bentz, Rodney L.; Howard, Jane H.

    2001-11-01

The quality of a magnesium die casting is likely dependent on the quality of the feedstock ingot material. Therefore, both DaimlerChrysler and General Motors have established quality assurance measures that include analysis of magnesium ingots. These processes include chemical analysis, corrosion testing, fast neutron activation analysis, and metallography. Optical emission spectroscopy, inductively coupled plasma spectroscopy, and gravimetric analysis are several methods for determining the chemical composition of the material. Fast neutron activation analysis, image analysis and energy dispersive X-ray spectroscopy are used to quantify ingot cleanliness. These experimental techniques are described and discussed in this paper, and example case studies are presented for illustration.

  11. Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.

    PubMed

Noble, Daniel W A; Lagisz, Malgorzata; O'Dea, Rose E; Nakagawa, Shinichi

    2017-01-30

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions.

  12. Fractal and multifractal analyses of bipartite networks.

    PubMed

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-31

Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but these properties of bipartite networks have not. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m), and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of bipartite networks with two different types of nodes, we assign different weights to the nodes of the two classes and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.
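The box-covering step that underlies network fractal analysis can be sketched on a toy graph. The code below is a simplified greedy compact-box-burning variant, not the authors' exact algorithm: a box of size l_box may only contain nodes whose pairwise shortest-path distance is below l_box, and the fractal dimension is read off the scaling of the box count with l_box.

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS distances from src in an unweighted graph given as an adjacency dict."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def box_count(adj, l_box):
    """Greedy box covering: number of boxes of diameter < l_box covering the graph."""
    dist = {u: shortest_paths(adj, u) for u in adj}
    uncovered = set(adj)
    boxes = 0
    while uncovered:
        seed = min(uncovered)                       # deterministic seed choice
        box = {seed}
        for v in sorted(uncovered - {seed}):
            # add v only if it stays within distance < l_box of every box member
            if all(dist[b].get(v, float("inf")) < l_box for b in box):
                box.add(v)
        uncovered -= box
        boxes += 1
    return boxes

# Toy example: a path graph 0-1-2-3-4-5
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 5] for i in range(6)}
print([box_count(adj, l) for l in (1, 2, 3, 6)])    # [6, 3, 2, 1]
```

The all-pairs BFS makes this O(n^2) in memory, fine for toys; production analyses of networks like Netflix need the more careful algorithms the paper builds on.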

  13. Waste Stream Analyses for Nuclear Fuel Cycles

    SciTech Connect

    N. R. Soelberg

    2010-08-01

    A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of data available to date for the fuel cycle options is insufficient to perform quantitative radioactive waste analyses using recommended metrics. This study has been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options is generally not yet available.

  14. Used Fuel Management System Interface Analyses - 13578

    SciTech Connect

    Howard, Robert; Busch, Ingrid; Nutt, Mark; Morris, Edgar; Puig, Francesc; Carter, Joe; Delley, Alexcia; Rodwell, Phillip; Hardin, Ernest; Kalinina, Elena; Clark, Robert; Cotton, Thomas

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  15. Computational analyses of multilevel discourse comprehension.

    PubMed

    Graesser, Arthur C; McNamara, Danielle S

    2011-04-01

    The proposed multilevel framework of discourse comprehension includes the surface code, the textbase, the situation model, the genre and rhetorical structure, and the pragmatic communication level. We describe these five levels when comprehension succeeds and also when there are communication misalignments and comprehension breakdowns. A computer tool has been developed, called Coh-Metrix, that scales discourse (oral or print) on dozens of measures associated with the first four discourse levels. The measurement of these levels with an automated tool helps researchers track and better understand multilevel discourse comprehension. Two sets of analyses illustrate the utility of Coh-Metrix in discourse theory and educational practice. First, Coh-Metrix was used to measure the cohesion of the text base and situation model, as well as potential extraneous variables, in a sample of published studies that manipulated text cohesion. This analysis helped us better understand what was precisely manipulated in these studies and the implications for discourse comprehension mechanisms. Second, Coh-Metrix analyses are reported for samples of narrative and science texts in order to advance the argument that traditional text difficulty measures are limited because they fail to accommodate most of the levels of the multilevel discourse comprehension framework.
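Coh-Metrix itself computes dozens of measures; as a rough stand-in for one referential-cohesion index of the kind it reports, one can score word overlap between adjacent sentences. This mini-metric is hypothetical and far cruder than Coh-Metrix, but it shows the shape of a textbase-level cohesion measure.

```python
def content_overlap(sentences):
    """Mean Jaccard word overlap between adjacent sentences - a crude
    stand-in for a Coh-Metrix-style referential cohesion score."""
    sets = [set(s.lower().split()) for s in sentences]
    scores = [len(a & b) / len(a | b) for a, b in zip(sets, sets[1:])]
    return sum(scores) / len(scores)

high = ["the cell divides", "the cell then grows", "the cell divides again"]
low = ["the cell divides", "mitochondria produce energy", "proteins fold in the cytoplasm"]
print(content_overlap(high) > content_overlap(low))  # True
```

A real system would lemmatize, restrict to content words, and separate noun overlap from argument overlap; the point is only that cohesion is measurable from surface text.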

  16. Fractal and multifractal analyses of bipartite networks

    PubMed Central

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-01-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there is no work to study these properties of bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing the fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find the fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m), Delicious data sets and (u, v)-flower model. Meanwhile, we observe the shifted power-law or exponential behavior in other several networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that the multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of bipartite network with two types different nodes, we give the different weights for the nodes of different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions. PMID:28361962

  17. Analyses of containment structures with corrosion damage

    SciTech Connect

    Cherry, J.L.

    1997-01-01

Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound," "best estimate," and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  18. Integrated Genomic Analyses of Ovarian Carcinoma

    PubMed Central

    2011-01-01

    Summary The Cancer Genome Atlas (TCGA) project has analyzed mRNA expression, miRNA expression, promoter methylation, and DNA copy number in 489 high-grade serous ovarian adenocarcinomas (HGS-OvCa) and the DNA sequences of exons from coding genes in 316 of these tumors. These results show that HGS-OvCa is characterized by TP53 mutations in almost all tumors (96%); low prevalence but statistically recurrent somatic mutations in 9 additional genes including NF1, BRCA1, BRCA2, RB1, and CDK12; 113 significant focal DNA copy number aberrations; and promoter methylation events involving 168 genes. Analyses delineated four ovarian cancer transcriptional subtypes, three miRNA subtypes, four promoter methylation subtypes, a transcriptional signature associated with survival duration and shed new light on the impact on survival of tumors with BRCA1/2 and CCNE1 aberrations. Pathway analyses suggested that homologous recombination is defective in about half of tumors, and that Notch and FOXM1 signaling are involved in serous ovarian cancer pathophysiology. PMID:21720365

  19. Autism and pain: a review of the literature

    PubMed Central

    Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria

    2010-01-01

This literature review takes stock of the work published in the field of pain and autism. The article first covers published studies on the modes of pain expression observed in this population. Several hypotheses proposed to explain the expressive particularities of people with autism are then reviewed: excess endorphins, particularities in sensory processing, and socio-communicative deficits. The review closes with the question of assessing and accounting for pain in people with autism. The authors conclude that the results of published studies are not homogeneous and that further research is needed to reach consensual data in a field of study still little explored scientifically. On a clinical level, deeper knowledge in this area should make it possible to develop pain assessment tools and thus ensure better day-to-day pain management. PMID:20808970

  20. NEXT Ion Thruster Performance Dispersion Analyses

    NASA Technical Reports Server (NTRS)

    Soulas, George C.; Patterson, Michael J.

    2008-01-01

    The NEXT ion thruster is a low specific mass, high performance thruster with a nominal throttling range of 0.5 to 7 kW. Numerous engineering model and one prototype model thrusters have been manufactured and tested. Of significant importance to propulsion system performance is thruster-to-thruster performance dispersions. This type of information can provide a bandwidth of expected performance variations both on a thruster and a component level. Knowledge of these dispersions can be used to more conservatively predict thruster service life capability and thruster performance for mission planning, facilitate future thruster performance comparisons, and verify power processor capabilities are compatible with the thruster design. This study compiles the test results of five engineering model thrusters and one flight-like thruster to determine unit-to-unit dispersions in thruster performance. Component level performance dispersion analyses will include discharge chamber voltages, currents, and losses; accelerator currents, electron backstreaming limits, and perveance limits; and neutralizer keeper and coupling voltages and the spot-to-plume mode transition flow rates. Thruster level performance dispersion analyses will include thrust efficiency.

  1. Bioinformatics tools for analysing viral genomic data.

    PubMed

    Orton, R J; Gu, Q; Hughes, J; Maabar, M; Modha, S; Vattipally, S B; Wilkie, G S; Davison, A J

    2016-04-01

    The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing.
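One of the pipeline steps named above, consensus assembly, reduces a column-aligned pileup of reads to a single sequence by majority vote at each position. The sketch below is a toy illustration with hypothetical reads (no quality weighting, no indel handling), not any specific assembler's algorithm.

```python
from collections import Counter

def consensus(reads):
    """Majority-vote consensus over a column-aligned pileup of equal-length reads.
    '-' marks a position a read does not cover."""
    out = []
    for i in range(len(reads[0])):
        column = [r[i] for r in reads if r[i] != "-"]
        base, _ = Counter(column).most_common(1)[0]
        out.append(base)
    return "".join(out)

reads = [
    "ACGT-",
    "ACGTA",
    "A-GTA",
    "ACCTA",   # one read carries a sequencing error at position 3
]
print(consensus(reads))  # "ACGTA"
```

Real variant callers weigh base qualities and strand bias rather than raw counts, but the column-wise vote is the core idea.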

  2. Image Mission Attitude Support Experiences

    NASA Technical Reports Server (NTRS)

    Ottenstein, N.; Challa, M.; Home, A.; Harman, R.; Burley, R.

    2001-01-01

    The spin-stabilized Imager for Magnetopause to Aurora Global Exploration (IMAGE) is the National Aeronautics and Space Administration's (NASA's) first Medium-class Explorer Mission (MIDEX). IMAGE was launched into a highly elliptical polar orbit on March 25, 2000 from Vandenberg Air Force Base, California, aboard a Boeing Delta II 7326 launch vehicle. This paper presents some of the observations of the flight dynamics analyses during the launch and in-orbit checkout period through May 18, 2000. Three new algorithms - one algebraic and two differential correction - for computing the parameters of the coning motion of a spacecraft are described and evaluated using in-flight data from the autonomous star tracker (AST) on IMAGE. Other attitude aspects highlighted include support for active damping consequent upon the failure of the passive nutation damper, performance evaluation of the AST, evaluation of the Sun sensor and magnetometer using AST data, and magnetometer calibration.

  3. X-ray CT analyses, models and numerical simulations: a comparison with petrophysical analyses in an experimental CO2 study

    NASA Astrophysics Data System (ADS)

    Henkel, Steven; Pudlo, Dieter; Enzmann, Frieder; Reitenbach, Viktor; Albrecht, Daniel; Ganzer, Leonhard; Gaupp, Reinhard

    2016-06-01

An essential part of the collaborative research project H2STORE (hydrogen to store), which is funded by the German government, was a comparison of various analytical methods for characterizing reservoir sandstones from different stratigraphic units. In this context Permian, Triassic and Tertiary reservoir sandstones were analysed. Rock core materials, provided by RWE Gasspeicher GmbH (Dortmund, Germany), GDF Suez E&P Deutschland GmbH (Lingen, Germany), E.ON Gas Storage GmbH (Essen, Germany) and RAG Rohöl-Aufsuchungs Aktiengesellschaft (Vienna, Austria), were processed by different laboratory techniques; thin sections were prepared, rock fragments were crushed, and cubes of 1 cm edge length and plugs 3 to 5 cm in length with a diameter of about 2.5 cm were sawn from macroscopically homogeneous cores. With this prepared sample material, polarized light microscopy and scanning electron microscopy coupled with image analyses, specific surface area measurements (after Brunauer, Emmett and Teller, 1938; BET), He-porosity and N2-permeability measurements, and high-resolution microcomputer tomography (μ-CT), which was used for numerical simulations, were applied. All these methods were practised on most of the same sample material, before and, on selected Permian sandstones, also after static CO2 experiments under reservoir conditions. A major concern in comparing the results of these methods is an appraisal of the reliability of the given porosity, permeability and mineral-specific reactive (inner) surface area data. The CO2 experiments modified the petrophysical as well as the mineralogical/geochemical rock properties. These changes are detectable by all applied analytical methods. Nevertheless, a major outcome of the high-resolution μ-CT analyses and following numerical data simulations was that quite similar data sets and data interpretations were maintained by the different petrophysical standard methods. Moreover, the μ-CT analyses are not only time saving, but also
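One of the quantities cross-checked between the petrophysical and μ-CT methods, porosity, can be read directly off a segmented CT volume as the pore-voxel fraction. The sketch below uses a synthetic volume (a single spherical pore), not the study's data.

```python
import numpy as np

def porosity(volume):
    """Fraction of pore voxels (True) in a binary segmented CT volume."""
    return np.asarray(volume, dtype=bool).mean()

# Synthetic 20x20x20 volume: one spherical pore of radius 5 inside a solid matrix
n = 20
x, y, z = np.indices((n, n, n))
pore = (x - 10) ** 2 + (y - 10) ** 2 + (z - 10) ** 2 < 5 ** 2

print(round(porosity(pore), 4))  # close to (4/3)*pi*5**3 / 20**3 ≈ 0.065
```

Segmenting real μ-CT greyscale data into pore and matrix (thresholding, filtering) is the hard part the study's workflow addresses; the fraction itself is this one-liner.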

  4. Stellar Imager

    NASA Technical Reports Server (NTRS)

    Carpenter, Kenneth

    2007-01-01

The Stellar Imager (SI) is one of NASA's "Vision Missions" - concepts for future, space-based, strategic missions that could enormously increase our capabilities for observing the Cosmos. SI is designed as a UV/Optical Interferometer which will enable 0.1 milli-arcsecond (mas) spectral imaging of stellar surfaces and, via asteroseismology, stellar interiors and of the Universe in general. The ultra-sharp images of the Stellar Imager will revolutionize our view of many dynamic astrophysical processes by transforming point sources into extended sources, and snapshots into evolving views. SI, with a characteristic angular resolution of 0.1 milli-arcseconds at 2000 Angstroms, represents an advance in image detail of several hundred times over that provided by the Hubble Space Telescope. The Stellar Imager will zoom in on what today, with few exceptions, we only know as point sources, revealing processes never before seen, thus providing a tool as fundamental to astrophysics as the microscope is to the study of life on Earth. SI's science focuses on the role of magnetism in the Universe, particularly on magnetic activity on the surfaces of stars like the Sun. Its prime goal is to enable long-term forecasting of solar activity and the space weather that it drives, in support of the Living With a Star program in the Exploration Era. SI will also revolutionize our understanding of the formation of planetary systems, of the habitability and climatology of distant planets, and of many magneto-hydrodynamically controlled processes in the Universe. Stellar Imager is included as a "Flagship and Landmark Discovery Mission" in the 2005 Sun Solar System Connection (SSSC) Roadmap and as a candidate for a "Pathways to Life Observatory" in the Exploration of the Universe Division (EUD) Roadmap (May, 2005) and as such is a candidate mission for the 2025-2030 timeframe. An artist's drawing of the current "baseline" concept for SI is presented.

  5. Time series analyses of global change data.

    PubMed

    Lane, L J; Nichols, M H; Osborn, H B

    1994-01-01

    The hypothesis that statistical analyses of historical time series data can be used to separate the influences of natural variations from anthropogenic sources on global climate change is tested. Point, regional, national, and global temperature data are analyzed. Trend analyses for the period 1901-1987 suggest mean annual temperatures increased (in degrees C per century) globally at the rate of about 0.5, in the USA at about 0.3, in the south-western USA desert region at about 1.2, and at the Walnut Gulch Experimental Watershed in south-eastern Arizona at about 0.8. However, the rates of temperature change are not constant but vary within the 87-year period. Serial correlation and spectral density analysis of the temperature time series showed weak periodicities at various frequencies. The only common periodicity among the temperature series is an apparent cycle of about 43 years. The temperature time series were correlated with the Wolf sunspot index, atmospheric CO(2) concentrations interpolated from the Siple ice core data, and atmospheric CO(2) concentration data from Mauna Loa measurements. Correlation analysis of temperature data with concurrent data on atmospheric CO(2) concentrations and the Wolf sunspot index supports previously reported significant correlations over the 1901-1987 period. Correlation analysis between temperature, atmospheric CO(2) concentration, and the Wolf sunspot index for the shorter period, 1958-1987, when continuous Mauna Loa CO(2) data are available, suggests significant correlation between global warming and atmospheric CO(2) concentrations but no significant correlation between global warming and the Wolf sunspot index. This may be because the Wolf sunspot index apparently increased from 1901 until about 1960 and then decreased thereafter, while global warming apparently continued to increase through 1987. Correlation of sunspot activity with global warming may be spurious, but additional analyses are required to test this hypothesis.
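The trend and correlation computations described above reduce to an ordinary least-squares slope and a Pearson coefficient. A minimal sketch on a synthetic series (constructed only to illustrate the arithmetic; the 0.5 degrees C/century rate is taken from the abstract):

```python
def linear_trend(years, temps):
    """Ordinary least-squares slope (deg C per year) of an annual series."""
    n = len(years)
    my, mt = sum(years) / n, sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return num / den

def pearson_r(xs, ys):
    """Pearson correlation between two concurrent annual series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / (sxx * syy) ** 0.5

# Toy series: 0.5 deg C/century warming plus a small alternating wiggle
years = list(range(1901, 1988))
temps = [14.0 + 0.005 * (y - 1901) + 0.01 * (-1) ** y for y in years]
print(round(linear_trend(years, temps) * 100, 2))  # -> 0.5 deg C per century
```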

  6. Ultrasonic Evaluation and Imaging

    SciTech Connect

    Crawford, Susan L.; Anderson, Michael T.; Diaz, Aaron A.; Larche, Michael R.; Prowant, Matthew S.; Cinson, Anthony D.

    2015-10-01

    Ultrasonic evaluation of materials for material characterization and flaw detection is as simple as manually moving a single-element probe across a specimen and looking at an oscilloscope display in real time or as complex as automatically (under computer control) scanning a phased-array probe across a specimen and collecting encoded data for immediate or off-line data analyses. The reliability of the results in the second technique is greatly increased because of a higher density of measurements per scanned area and measurements that can be more precisely related to the specimen geometry. This chapter will briefly discuss applications of the collection of spatially encoded data and focus primarily on the off-line analyses in the form of data imaging. Pacific Northwest National Laboratory (PNNL) has been involved with assessing and advancing the reliability of inservice inspections of nuclear power plant components for over 35 years. Modern ultrasonic imaging techniques such as the synthetic aperture focusing technique (SAFT), phased-array (PA) technology and sound field mapping have undergone considerable improvements to effectively assess and better understand material constraints.
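As a sketch of what computer-controlled phased-array focusing involves, the transmit delays that steer energy to a focal point can be computed from element geometry alone. The 16-element layout, pitch, and sound speed below are assumed illustrative values, not PNNL's inspection parameters:

```python
import math

def focal_delays(element_x, focus_x, focus_z, c=5900.0):
    """Transmit delays (s) that focus a linear phased array at one point.

    element_x : element positions (m) along the array face
    focus_x, focus_z : focal point (m); z is depth into the part
    c : assumed longitudinal sound speed in steel, m/s
    """
    dists = [math.hypot(ex - focus_x, focus_z) for ex in element_x]
    far = max(dists)
    # the farthest element fires first (zero delay); nearer ones wait
    return [(far - d) / c for d in dists]

# 16-element array, 0.6 mm pitch, focused 25 mm deep under the array centre
pitch = 0.6e-3
elems = [i * pitch for i in range(16)]
centre = elems[-1] / 2
delays = focal_delays(elems, centre, 25e-3)
print(max(delays))  # a few tens of nanoseconds
```

Sweeping the focal point along a line, firing once per focus, is the basic electronic scanning that replaces mechanically moving a single-element probe.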

  7. Study of spin-scan imaging for outer planets missions. [imaging techniques for Jupiter orbiter missions

    NASA Technical Reports Server (NTRS)

    Russell, E. E.; Chandos, R. A.; Kodak, J. C.; Pellicori, S. F.; Tomasko, M. G.

    1974-01-01

    The constraints that are imposed on the Outer Planet Missions (OPM) imager design are of critical importance. Imager system modeling analyses define important parameters and systematic means for trade-offs applied to specific Jupiter orbiter missions. Possible image sequence plans for Jupiter missions are discussed in detail. Considered is a series of orbits that allow repeated near encounters with three of the Jovian satellites. The data handling involved in the image processing is discussed, and it is shown that only minimal processing is required for the majority of images for a Jupiter orbiter mission.

  8. Precise Chemical Analyses of Planetary Surfaces

    NASA Technical Reports Server (NTRS)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David; Lisse, Carey

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  9. FACS binding assay for analysing GDNF interactions.

    PubMed

    Quintino, Luís; Baudet, Aurélie; Larsson, Jonas; Lundberg, Cecilia

    2013-08-15

    Glial cell-line derived neurotrophic factor (GDNF) is a secreted protein with great therapeutic potential. However, in order to analyse the interactions between GDNF and its receptors, researchers have been mostly dependent on radioactive binding assays. We developed a FACS-based binding assay for GDNF as an alternative to current methods. We demonstrated that the FACS-based assay using TGW cells allowed ready detection of GDNF binding to, and displacement from, endogenous receptors. The dissociation constant and half-maximal inhibitory concentration obtained were comparable to those of other studies using standard binding assays. Overall, this FACS-based assay, simple to perform and adaptable to a high-throughput setup, provides a safer, reliable alternative to radioactive methods.
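The dissociation constant mentioned above is conventionally obtained by fitting a one-site binding model, B = Bmax * L / (Kd + L), to titration data. A minimal sketch on synthetic, noise-free data (the grid-search fit is an illustration, not the authors' method):

```python
def fit_one_site(conc, bound):
    """Fit B = Bmax * L / (Kd + L) by scanning Kd; at each candidate Kd
    the best Bmax has a closed-form linear least-squares solution."""
    best = None
    kd = 1e-3
    while kd < 1e3:
        x = [L / (kd + L) for L in conc]
        bmax = sum(b * xi for b, xi in zip(bound, x)) / sum(xi * xi for xi in x)
        sse = sum((b - bmax * xi) ** 2 for b, xi in zip(bound, x))
        if best is None or sse < best[0]:
            best = (sse, kd, bmax)
        kd *= 1.02
    return best[1], best[2]

# Synthetic saturation data: Kd = 2 nM, Bmax = 100 (arbitrary units)
conc = [0.25, 0.5, 1, 2, 4, 8, 16, 32]          # nM
bound = [100 * L / (2.0 + L) for L in conc]     # noise-free for clarity
kd, bmax = fit_one_site(conc, bound)
print(kd, bmax)  # close to 2.0 and 100.0
```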

  10. Phylogenomic Analyses Support Traditional Relationships within Cnidaria.

    PubMed

    Zapata, Felipe; Goetz, Freya E; Smith, Stephen A; Howison, Mark; Siebert, Stefan; Church, Samuel H; Sanders, Steven M; Ames, Cheryl Lewis; McFadden, Catherine S; France, Scott C; Daly, Marymegan; Collins, Allen G; Haddock, Steven H D; Dunn, Casey W; Cartwright, Paulyn

    2015-01-01

    Cnidaria, the sister group to Bilateria, is a highly diverse group of animals in terms of morphology, lifecycles, ecology, and development. How this diversity originated and evolved is not well understood because phylogenetic relationships among major cnidarian lineages are unclear, and recent studies present contrasting phylogenetic hypotheses. Here, we use transcriptome data from 15 newly-sequenced species in combination with 26 publicly available genomes and transcriptomes to assess phylogenetic relationships among major cnidarian lineages. Phylogenetic analyses using different partition schemes and models of molecular evolution, as well as topology tests for alternative phylogenetic relationships, support the monophyly of Medusozoa, Anthozoa, Octocorallia, Hydrozoa, and a clade consisting of Staurozoa, Cubozoa, and Scyphozoa. Support for the monophyly of Hexacorallia is weak due to the equivocal position of Ceriantharia. Taken together, these results further resolve deep cnidarian relationships, largely support traditional phylogenetic views on relationships, and provide a historical framework for studying the evolutionary processes involved in one of the most ancient animal radiations.

  11. Determining Significant Endpoints for Ecological risk Analyses

    SciTech Connect

    Hinton, Thimas G.; Bedford, Joel

    1999-06-01

    Our interest is in obtaining a scientifically defensible endpoint for measuring ecological risks to populations exposed to chronic, low-level radiation, and radiation with concomitant exposure to chemicals. To do so, we believe that we must understand the extent to which molecular damage is detrimental at the individual and population levels of biological organization. Ecological risk analyses based on molecular damage, without an understanding of the impacts to higher levels of biological organization, could cause cleanup strategies on DOE sites to be overly conservative and unnecessarily expensive. Our goal is to determine the relevancy of sublethal cellular damage to the performance of individuals and populations. We think that we can achieve this by using novel biological dosimeters in controlled, manipulative dose/effects experiments, and by coupling changes in metabolic rates and energy allocation patterns to meaningful population response variables such as age-specific survivorship, reproductive output, age at maturity and longevity.

  12. [Use of pharmacoeconomics analyses to health protection].

    PubMed

    Drozd, Mariola

    2002-01-01

    Pharmacoeconomics makes possible the most favourable use of the capital resources allocated to health protection. In economic analysis, health and the effects of disease and its treatment are expressed in absolute values having a common base--money. An economic analysis is usually carried out from a certain perspective: something that is an expense for one party can be a profit for another. This work is a review of the available Polish literature describing the main assumptions of pharmacoeconomics and its instruments--the pharmacoeconomic analyses. The review leads to the conclusion that modern medicine cannot do without economics. Capital resources are perpetually scarce, so the profitability of any given method of therapy or drug must be assessed continually.

  13. An introduction to modern missing data analyses.

    PubMed

    Baraldi, Amanda N; Enders, Craig K

    2010-02-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous over traditional techniques (e.g. deletion and mean imputation) because they require less stringent assumptions and mitigate the pitfalls of traditional techniques. This article explains the theoretical underpinnings of missing data analyses, gives an overview of traditional missing data techniques, and provides accessible descriptions of maximum likelihood and multiple imputation. In particular, this article focuses on maximum likelihood estimation and presents two analysis examples from the Longitudinal Study of American Youth data. One of these examples includes a description of the use of auxiliary variables. Finally, the paper illustrates ways that researchers can use intentional, or planned, missing data to enhance their research designs.
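A toy simulation of why deletion-based techniques mislead under a missing-at-random (MAR) mechanism, while model-based filling-in does better. Single regression imputation is used here as a simplified stand-in for the multiple-imputation and maximum-likelihood machinery the article describes; all numbers are invented:

```python
import random

random.seed(1)

# y depends on a fully observed x; y is missing mostly when x is large (MAR)
x = [random.gauss(0, 1) for _ in range(2000)]
y = [2 * xi + random.gauss(0, 1) for xi in x]
observed = [xi < 0.5 or random.random() < 0.3 for xi in x]

# Listwise deletion: average only the observed y values (biased low here)
y_obs = [yi for yi, o in zip(y, observed) if o]
deletion_mean = sum(y_obs) / len(y_obs)

# Regression imputation: fit y ~ x on complete cases, fill in the rest
x_obs = [xi for xi, o in zip(x, observed) if o]
mx, my = sum(x_obs) / len(x_obs), sum(y_obs) / len(y_obs)
slope = (sum((a - mx) * (b - my) for a, b in zip(x_obs, y_obs))
         / sum((a - mx) ** 2 for a in x_obs))
intercept = my - slope * mx
y_filled = [yi if o else intercept + slope * xi
            for xi, yi, o in zip(x, y, observed)]
imputed_mean = sum(y_filled) / len(y_filled)

print(deletion_mean, imputed_mean)  # imputation lands much closer to the true mean of 0
```

Multiple imputation would repeat the fill-in step with added residual noise across several datasets and pool the results, which also propagates the uncertainty that single imputation understates.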

  14. Neutronic Analyses of the Trade Demonstration Facility

    SciTech Connect

    Rubbia, C.

    2004-09-15

    The TRiga Accelerator-Driven Experiment (TRADE), to be performed in the TRIGA reactor of the ENEA-Casaccia Centre in Italy, consists of the coupling of an external proton accelerator to a target to be installed in the central channel of the reactor scrammed to subcriticality. This pilot experiment, aimed at a global demonstration of the accelerator-driven system concept, is based on an original idea of C. Rubbia. The present paper reports the results of some neutronic analyses focused on the feasibility of TRADE. Results show that all relevant experiments (at different power levels in a wide range of subcriticalities) can be carried out with relatively limited modifications to the present TRIGA reactor.

  15. Error analyses for a gravity gradiometer mission

    NASA Technical Reports Server (NTRS)

    Kahn, W. D.; Von Bun, F. O.

    1985-01-01

    This paper addresses the usefulness of an orbiting gravity gradiometer as a sensor for mapping the fine structure of the earth's gravity field. Exact knowledge of this field is essential for studies of the solid earth and the dynamics of the oceans. Although the earth's gravity tensor, measured by a gradiometer assembly, has nine components, only five components are independent; this is a consequence of the symmetry and conservative nature of the earth's gravity field. The most dominant component is the radial one. The error analyses considered here are therefore based only upon a single-axis gradiometer sensing this radial component. The expected global gravity and geoid errors for a 50 x 50-km (1/2 x 1/2 deg) area utilizing a spaceborne gradiometer with a precision of 0.001 EU in a 160-km circular polar orbit are about 3 mGal and 5 cm, respectively.
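The count of five independent components follows from symmetry (a symmetric 3x3 tensor has six distinct entries) plus the trace-free condition imposed by Laplace's equation in free space, which removes one more. A point-mass sketch, using Earth's GM, the abstract's 160-km orbit altitude, and an assumed round 6371-km Earth radius:

```python
def gravity_gradient(x, y, z, gm=3.986e14):
    """Gravity gradient tensor of a point mass (Earth's GM by default),
    V_ij = GM * (3 r_i r_j - r^2 delta_ij) / r^5, in 1/s^2."""
    r2 = x * x + y * y + z * z
    r = r2 ** 0.5
    pos = (x, y, z)
    return [[gm * (3 * pos[i] * pos[j] - (r2 if i == j else 0)) / r ** 5
             for j in range(3)] for i in range(3)]

# Satellite at 160 km altitude above an Earth-like point mass
alt_r = 6371e3 + 160e3
v = gravity_gradient(0.0, 0.0, alt_r)

trace = v[0][0] + v[1][1] + v[2][2]
print(v[2][2] / 1e-9)  # radial component, roughly 2900 Eotvos units (1 EU = 1e-9 /s^2)
print(trace)           # ~0: the trace-free condition
```

The dominance of the radial component v[2][2], twice the magnitude of each transverse diagonal term, is why the error analysis in the paper can restrict itself to a single-axis radial gradiometer.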

  16. TRACE ELEMENT ANALYSES OF URANIUM MATERIALS

    SciTech Connect

    Beals, D; Charles Shick, C

    2008-06-09

    The Savannah River National Laboratory (SRNL) has developed an analytical method to measure many trace elements in a variety of uranium materials at the high part-per-billion (ppb) to low part-per-million (ppm) levels using matrix removal and analysis by quadrupole ICP-MS. Over 35 elements were measured in uranium oxides, acetate, ore and metal. Replicate analyses of samples provided precise results; however, none of the materials was certified for trace-element content, so the accuracy of the method could not be assessed. The DOE New Brunswick Laboratory (NBL) does provide Certified Reference Materials (CRMs) with provisional values for a series of trace elements. The NBL CRMs were purchased and analyzed to determine the accuracy of the method for the analysis of trace elements in uranium oxide. These results are presented and discussed in the following paper.

  17. Project analysis and integration economic analyses summary

    NASA Technical Reports Server (NTRS)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet, growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  18. Spatially explicit analyses unveil density dependence.

    PubMed Central

    Veldtman, Ruan; McGeoch, Melodie A.

    2004-01-01

    Density-dependent processes are fundamental in the understanding of species population dynamics. Whereas the benefits of considering the spatial dimension in population biology are widely acknowledged, the implications of doing so for the statistical detection of spatial density dependence have not been examined. The outcome of traditional tests may therefore differ from those that include ecologically relevant locational information on both the prey species and natural enemy. Here, we explicitly incorporate spatial information on individual counts when testing for density dependence between an insect herbivore and its parasitoids. The spatially explicit approach used identified significant density dependence more frequently and in different instances than traditional methods. The form of density dependence detected also differed between methods. These results demonstrate that the explicit consideration of patch location in density-dependence analyses is likely to significantly alter current understanding of the prevalence and form of spatial density dependence in natural populations. PMID:15590593

  19. Ensemble decadal predictions from analysed initial conditions.

    PubMed

    Troccoli, Alberto; Palmer, T N

    2007-08-15

    Sensitivity experiments using a coupled model initialized from analysed atmospheric and oceanic observations are used to investigate the potential for interannual-to-decadal predictability. The potential for extending seasonal predictions to longer time scales is explored using the same coupled model configuration and initialization procedure as used for seasonal prediction. It is found that, despite model drift, climatic signals on interannual-to-decadal time scales appear to be detectable. Two climatic states have been chosen: one starting in 1965, i.e. ahead of a period of global cooling, and the other in 1994, ahead of a period of global warming. The impact of initial conditions and of the different levels of greenhouse gases are isolated in order to gain insights into the source of predictability.

  20. Life cycle analyses and resource assessments.

    PubMed

    Fredga, Karl; Mäler, Karl-Göran

    2010-01-01

    Prof. Ulgiati stresses that we should always use an ecosystem view when transforming energy from one form to another. Sustainable growth and development of both environmental and human-dominated systems require optimum use of available resources for maximum power output. We have to adapt to the laws of nature because nature has to take care of all the waste products we produce. The presentation addresses a much needed shift away from linear production and consumption patterns, toward reorganization of economies and lifestyle that takes complexity--of resources, of the environment and of the economy--into proper account. The best way to reach maximum yield from the different kinds of biomass is to use biorefineries. Biorefinery is defined as the sustainable processing of biomass into a spectrum of marketable products like heat, power, fuels, chemicals, food, feed, and materials. However, biomass from agricultural land must be used for the production of food and not fuel. Prof. Voss focuses on the sustainability of energy supply chains and energy systems. Life cycle analysis (LCA) provides the conceptual framework for a comprehensive comparative evaluation of energy supply options with regard to their resource requirements as well as the health and environmental impact. Full scope LCA considers not only the emissions from plant operation, construction, and decommissioning but also the environmental burdens and resource requirements associated with the entire lifetime of all relevant upstream and downstream processes within the energy chain. This article describes the results of LCA analyses for state-of-the-art heating and electricity systems as well as of advanced future systems. Total costs are used as a measure for the overall resource consumption.

  1. Topological Analyses of Symmetric Eruptive Prominences

    NASA Astrophysics Data System (ADS)

    Panasenco, O.; Martin, S. F.

    Erupting prominences (filaments) that we have analyzed from Hα Doppler data at Helio Research and from SOHO/EIT 304 Å show strong coherency between their chirality, the direction of the vertical and lateral motions of the top of the prominences, and the directions of twisting of their legs. These coherent properties in erupting prominences occur in two patterns of opposite helicity; they constitute a form of dynamic chirality called the ``roll effect." Viewed from the positive network side as they erupt, many symmetrically-erupting dextral prominences develop rolling motion toward the observer along with right-hand helicity in the left leg and left-hand helicity in the right leg. Many symmetrically-erupting sinistral prominences, also viewed from the positive network field side, have the opposite pattern: rolling motion at the top away from the observer, left-hand helical twist in the left leg, and right-hand twist in the right leg. We have analysed the motions seen in the famous movie of the ``Grand Daddy" erupting prominence and found that it has all the motions that define the roll effect. From our analyses of this and other symmetric erupting prominences, we show that the roll effect is an alternative to the popular hypothetical configuration of an eruptive prominence as a twisted flux rope or flux tube. Instead we find that a simple flat ribbon can be bent such that it reproduces nearly all of the observed forms. The flat ribbon is the most logical beginning topology because observed prominence spines already have this topology prior to eruption and an initial long magnetic ribbon with parallel, non-twisted threads, as a basic form, can be bent into many more and different geometrical forms than a flux rope.

  2. Reporting guidelines for population pharmacokinetic analyses.

    PubMed

    Dykstra, Kevin; Mehrotra, Nitin; Tornøe, Christoffer Wenzel; Kastrissios, Helen; Patel, Bela; Al-Huniti, Nidal; Jadhav, Pravin; Wang, Yaning; Byon, Wonkyung

    2015-06-01

    The purpose of this work was to develop a consolidated set of guiding principles for reporting of population pharmacokinetic (PK) analyses based on input from a survey of practitioners as well as discussions between industry, consulting and regulatory scientists. The survey found that identification of population covariate effects on drug exposure and support for dose selection (where population PK frequently serves as preparatory analysis to exposure-response modeling) are the main areas of influence for population PK analysis. The proposed guidelines consider two main purposes of population PK reports: (1) to present key analysis findings and their impact on drug development decisions, and (2) to document the analysis methods for the dual purpose of enabling review of the analysis and facilitating future use of the models. This work also identified two main audiences for the reports: (1) a technically competent group responsible for in-depth review of the data, methodology, and results, and (2) a scientifically literate, but not technically adept group, whose main interest is in the implications of the analysis for the broader drug development program. We recommend a generalized question-based approach with six questions that need to be addressed throughout the report. We recommend eight sections (Synopsis, Introduction, Data, Methods, Results, Discussion, Conclusions, Appendix) with suggestions for the target audience and level of detail for each section. A section providing general expectations regarding population PK reporting from a regulatory perspective is also included. We consider this an important step towards industrialization of the field of pharmacometrics such that a non-technical audience also understands the role of pharmacometrics analyses in decision making. Population PK reports were chosen as representative reports to derive these recommendations; however, the guiding principles presented here are applicable to all pharmacometric reports.

  3. Analyse de formes par moiré

    NASA Astrophysics Data System (ADS)

    Harthong, J.; Sahli, H.; Poinsignon, R.; Meyrueis, P.

    1991-01-01

    We present a mathematical analysis of moiré phenomena for shape recognition. The basic theoretical concept - and tool - will be the contour function. We show that the mathematical analysis is greatly simplified by the systematic recourse to this tool. The analysis presented permits a simultaneous treatment of two different modes of implementing the moiré technique: the direct mode (widely used and well-known), and the converse mode (scarcely used). The converse mode consists in computing and designing a grating especially for one model of object, in such a manner that if (and only if) the object is in conformity with the prescribed model, the resulting moiré fringes are parallel straight lines. We give explicit formulas and algorithms for such computations.

  4. ANALYSES OF WOUND EXUDATES FOR CLOSTRIDIAL TOXINS

    PubMed Central

    Noyes, Howard E.; Pritchard, William L.; Brinkley, Floyd B.; Mendelson, Janice A.

    1964-01-01

    Noyes, Howard E. (Walter Reed Army Institute of Research, Washington, D.C.), William L. Pritchard, Floyd B. Brinkley, and Janice A. Mendelson. Analyses of wound exudates for clostridial toxins. J. Bacteriol. 87:623–629. 1964.—Earlier studies indicated that death of goats with traumatic wounds of the hindquarter could be related to the number of clostridia in the wounds, and that toxicity of wound exudates for mice and guinea pigs could be partially neutralized by commercial trivalent gas gangrene antitoxin. This report describes in vitro and in vivo analyses of wound exudates for known clostridial toxins. Wounds were produced by detonation of high-explosive pellets. Wound exudates were obtained by cold saline extraction of both necrotic tissues and gauze sponges used to cover the wounds. Exudates were sterilized by Seitz filtration in the cold. In vitro tests were used to measure alpha-, theta-, and mu-toxins of Clostridium perfringens and the epsilon-toxin of C. novyi. Mouse protection tests, employing commercial typing antisera, were used to analyze exudates for other clostridial toxins. Lethality of wound exudates for mice could be related to (i) the numbers of clostridia present in the wound, (ii) survival time of the goats, and (iii) positive lecithovitellin (LV) tests of the exudates. However, the LV tests could not be neutralized by antitoxin specific for C. perfringens alpha-toxin. Mice were not protected by typing antisera specific for types A, C, or D C. perfringens or C. septicum but were protected by antisera specific for type B C. perfringens and types A and B C. novyi. PMID:14127581

  5. Energy adjustment methods applied to alcohol analyses.

    PubMed

    Johansen, Ditte; Andersen, Per K; Overvad, Kim; Jensen, Gorm; Schnohr, Peter; Sørensen, Thorkild I A; Grønbaek, Morten

    2003-01-01

    When alcohol consumption is related to outcome, associations between alcohol type and health outcomes may occur simply because of the ethanol in the beverage type. When one analyzes the consequences of consumption of beer, wine, and spirits, the total alcohol intake must therefore be taken into account. However, owing to the linear dependency between total alcohol intake and the alcohol content of each beverage type, the effects cannot be separated from each other or from the effect of ethanol. In nutritional epidemiology, similar problems regarding intake of macronutrients and total energy intake have been addressed, and four methods have been proposed to solve the problem: energy partition, standard, density, and residual. The aim of this study was to evaluate the usefulness of the energy adjustment methods in alcohol analyses by using coronary heart disease as an example. Data obtained from the Copenhagen City Heart Study were used. The standard and energy partition methods yielded similar results for continuous, and almost similar results for categorical, alcohol variables. The results from the density method differed, but nevertheless were concordant with these. Beer and wine drinkers, in comparison with findings for nondrinkers, had lower risk of coronary heart disease. Except for the case of men drinking beer, the effect seemed to be associated with drinking one drink per week. The standard method derives the influence of substituting alcohol types at constant total alcohol intake and complements the estimates of adding consumption of a particular alcohol type to the total intake. For most diseases, the effect of ethanol predominates over that of substances in the beverage type, which makes the density method less relevant in alcohol analyses.
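The linear dependency described above, and how the partition-style and standard-style models differ in what their coefficients mean, can be shown on simulated data. The coefficients, sample size, and intake distributions below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
beer = rng.poisson(5, n).astype(float)
wine = rng.poisson(3, n).astype(float)
spirits = rng.poisson(1, n).astype(float)
total = beer + wine + spirits   # exactly collinear with the beverage columns

# True model: each drink of ethanol adds 0.10; wine adds an extra -0.05
outcome = 0.10 * total - 0.05 * wine + rng.normal(0, 0.1, n)

# Partition-style: beverage terms only; each coefficient mixes the
# ethanol effect with the beverage-specific effect
Xp = np.column_stack([np.ones(n), beer, wine, spirits])
bp = np.linalg.lstsq(Xp, outcome, rcond=None)[0]

# Standard-style: one beverage term plus total intake; the wine
# coefficient now estimates substituting wine at constant total
Xs = np.column_stack([np.ones(n), wine, total])
bs = np.linalg.lstsq(Xs, outcome, rcond=None)[0]

print(np.round(bp[1:], 2), np.round(bs[1:], 2))
# partition close to [0.10, 0.05, 0.10]; standard close to [-0.05, 0.10]
```

Including all four columns at once would make the design matrix rank-deficient, which is the estimation face of the "cannot be separated" problem the abstract describes.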

  6. Analyses of moisture in polymers and composites

    NASA Technical Reports Server (NTRS)

    Ryan, L. E.; Vaughan, R. W.

    1980-01-01

    A suitable method for the direct measurement of moisture concentrations after humidity/thermal exposure on state-of-the-art epoxy and polyimide resins and their graphite and glass fiber reinforcements was investigated. Methods for the determination of moisture concentration profiles, moisture diffusion modeling and moisture induced chemical changes were examined. Carefully fabricated, precharacterized epoxy and polyimide neat resins and their AS graphite and S glass reinforced composites were exposed to humid conditions using heavy water (D2O), at ambient and elevated temperatures. These specimens were fixtured to theoretically limit the D2O permeation to a unidirectional penetration axis. The analytical techniques evaluated were: (1) laser pyrolysis gas chromatography mass spectrometry; (2) solids probe mass spectrometry; (3) laser pyrolysis conventional infrared spectroscopy; and (4) infrared imaging thermovision. The most reproducible and sensitive technique was solids probe mass spectrometry. The fabricated and exposed specimens were analyzed for D2O profiles after humidity/thermal conditioning at three exposure time durations.

  7. Hierarchical Segmentation Enhances Diagnostic Imaging

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Bartron Medical Imaging LLC (BMI), of New Haven, Connecticut, gained a nonexclusive license from Goddard Space Flight Center to use the RHSEG software in medical imaging. To manage image data, BMI then licensed two pattern-matching software programs from NASA's Jet Propulsion Laboratory that were used in image analysis and three data-mining and edge-detection programs from Kennedy Space Center. More recently, BMI made NASA history by being the first company to partner with the Space Agency through a Cooperative Research and Development Agreement to develop a 3-D version of RHSEG. With U.S. Food and Drug Administration clearance, BMI will sell its Med-Seg imaging system with the 2-D version of the RHSEG software to analyze medical imagery from CAT and PET scans, MRI, ultrasound, digitized X-rays, digitized mammographies, dental X-rays, soft tissue analyses, moving object analyses, and soft-tissue slides such as Pap smears for the diagnoses and management of diseases. Extending the software's capabilities to three dimensions will eventually enable production of pixel-level views of a tumor or lesion, early identification of plaque build-up in arteries, and identification of density levels of microcalcification in mammographies.

  8. Medical imaging

    NASA Astrophysics Data System (ADS)

    Elliott, Alex

    2005-07-01

    Diagnostic medical imaging is a fundamental part of the practice of modern medicine and is responsible for the expenditure of considerable amounts of capital and revenue monies in healthcare systems around the world. Much research and development work is carried out, both by commercial companies and the academic community. This paper reviews briefly each of the major diagnostic medical imaging techniques—X-ray (planar and CT), ultrasound, nuclear medicine (planar, SPECT and PET) and magnetic resonance. The technical challenges facing each are highlighted, with some of the most recent developments. In terms of the future, interventional/peri-operative imaging, the advancement of molecular medicine and gene therapy are identified as potential areas of expansion.

  9. Brain Imaging

    PubMed Central

    Racine, Eric; Bar-Ilan, Ofek; Illes, Judy

    2007-01-01

    Advances in neuroscience are increasingly intersecting with issues of ethical, legal, and social interest. This study is an analysis of press coverage of an advanced technology for brain imaging, functional magnetic resonance imaging, that has gained significant public visibility over the past ten years. Discussion of issues of scientific validity and interpretation dominated over ethical content in both the popular and specialized press. Coverage of research on higher order cognitive phenomena specifically attributed broad personal and societal meaning to neuroimages. The authors conclude that neuroscience provides an ideal model for exploring science communication and ethics in a multicultural context. PMID:17330151

  10. Phloem imaging.

    PubMed

    Truernit, Elisabeth

    2014-04-01

    The phloem is the long-distance solute-conducting tissue of plants. The observation of phloem cells is particularly challenging for several reasons and many recent advances in microscopy are, therefore, especially beneficial for the study of phloem anatomy and physiology. This review will give an overview of the imaging techniques that have been used for studying different aspects of phloem biology. It will also highlight some new imaging techniques that have emerged in recent years that will certainly advance our knowledge about phloem function.

  11. Database-Driven Analyses of Astronomical Spectra

    NASA Astrophysics Data System (ADS)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band. Because the transition frequencies depend on the nuclear masses, each isotopologue has its own distinct spectral signature. Molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic
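The isotope sensitivity described in this abstract follows from standard molecular physics (a textbook result, not specific to this article): rotational level energies scale inversely with the molecule's reduced mass, which changes when a neutron is added.

```latex
E_J = B\,J(J+1), \qquad
B = \frac{\hbar^2}{2\mu r^2}, \qquad
\mu = \frac{m_1 m_2}{m_1 + m_2}
```

Substituting a heavier isotope increases the reduced mass \(\mu\), lowering the rotational constant \(B\) and shifting every ro-vibrational line, which is why isotopologues can be distinguished spectroscopically.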

  12. Operational Satellite-based Surface Oil Analyses (Invited)

    NASA Astrophysics Data System (ADS)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-Skymed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), Aster (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, location of natural oil seeps and a variety of in situ oil observations. The analyses were made available as jpegs, pdfs, shapefiles and KML files, and were also posted on a variety of websites including Geoplatform and ERMA. The complete archive, from the very first analysis issued just 5 hours after the rig sank through the final analysis issued in August, is still publicly available on the NOAA/NESDIS website http://www.ssd.noaa.gov/PS/MPS/deepwater.html. SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager’s primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response, including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  13. Efficient ALL vs. ALL collision risk analyses

    NASA Astrophysics Data System (ADS)

    Escobar, D.; Paskowitz, M.; Agueda, A.; Garcia, G.; Molina, M.

    2011-09-01

    In recent years, space debris has gained a lot of attention due to the increasing amount of uncontrolled man-made objects orbiting the Earth. This population poses a significant and constantly growing threat to operational satellites. In order to face this threat in an independent manner, ESA has launched an initiative for the development of a European SSA System, in which GMV is participating via several activities. Apart from those activities financed by ESA, GMV has developed closeap, a tool for efficient conjunction assessment and collision probability prediction. ESA's NAPEOS has been selected as the computational engine and numerical propagator to be used in the tool, which can be considered an add-on to the standard NAPEOS package. closeap makes use of the same orbit computation, conjunction assessment and collision risk algorithms implemented in CRASS, but at the same time both systems are completely independent. Moreover, the implementation in closeap has been validated against CRASS with excellent results. This paper describes the performance improvements implemented in closeap at the algorithm level to ensure that the most time-demanding scenarios (e.g., all catalogued objects analysed against each other, i.e. all-vs-all scenarios) can be analysed in a reasonable amount of time with commercial off-the-shelf hardware. However, the amount of space debris increases steadily due to human activity. Thus, the number of objects involved in a full collision assessment is expected to increase notably and, consequently, the computational cost, which scales as the square of the number of objects, will increase as well. Additionally, orbit propagation algorithms that are computationally expensive might be needed to predict more accurately the trajectories of space debris. In order to cope with such computational needs, the next natural step in the development of collision assessment tools is the use of parallelization techniques. In this paper we investigate

  14. High performance liquid chromatography in pharmaceutical analyses.

    PubMed

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In the pre-marketing testing of drugs and in their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of modifying its polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substances being tested, is a great advantage in the separation process in comparison to other methods. The wide choice of stationary phases is a further factor enabling good separation. The separation column coupled to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector), as well as hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results, and monitor the progress of therapy of a disease (1). Figure 1 shows a chromatogram obtained from the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during pre-registration investigation of drugs. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  15. Imaging sciences workshop

    SciTech Connect

    Candy, J.V.

    1994-11-15

    This workshop on the Imaging Sciences sponsored by Lawrence Livermore National Laboratory contains short abstracts/articles submitted by speakers. The topic areas covered include the following: Astronomical Imaging; biomedical imaging; vision/image display; imaging hardware; imaging software; Acoustic/oceanic imaging; microwave/acoustic imaging; computed tomography; physical imaging; imaging algorithms. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  16. Image Processing Diagnostics: Emphysema

    NASA Astrophysics Data System (ADS)

    McKenzie, Alex

    2009-10-01

    Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure more accurately the amount of affected lung. CT scan images show clearly whether a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, as it appears merely as subtle, barely distinct dark spots on the lung. Our goal is to create a software plug-in that interfaces with existing open source medical imaging software to automate the process of accurately diagnosing and determining emphysema severity levels in patients. This will be accomplished by performing a number of statistical calculations using data taken from CT scan images of several patients representing a wide range of severity of the disease. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently utilized methods, which involve looking at percentages of radiodensities in air passages of the lung.
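The skewness measure mentioned in this abstract (the third standardized moment of the attenuation histogram) can be sketched in a few lines. This is an illustrative reconstruction with made-up Hounsfield-unit values, not the authors' actual plug-in code.

```python
# Hypothetical sketch: skewness of a CT attenuation histogram (Hounsfield
# units). Emphysematous lung shifts voxel densities toward very low values,
# making the distribution asymmetric. All data below are made up.

def skewness(values):
    """Sample skewness: third standardized moment of the data."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    sd = var ** 0.5
    return sum(((v - mean) / sd) ** 3 for v in values) / n

# Made-up voxel attenuation values (HU) for a symmetric "healthy-looking"
# region and a hypothetically emphysematous region:
healthy = [-850, -840, -830, -820, -810, -800, -790, -780]
emphysema = [-990, -980, -975, -970, -850, -840, -700, -650]

print(round(skewness(healthy), 3))    # 0.0 (symmetric distribution)
print(round(skewness(emphysema), 3))  # positive (right-skewed toy data)
```

In practice one would compute this over the full lung voxel histogram rather than a handful of values; a library routine such as `scipy.stats.skew` gives the same quantity.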

  17. BoneJ: Free and extensible bone image analysis in ImageJ.

    PubMed

    Doube, Michael; Kłosowski, Michał M; Arganda-Carreras, Ignacio; Cordelières, Fabrice P; Dougherty, Robert P; Jackson, Jonathan S; Schmid, Benjamin; Hutchinson, John R; Shefelbine, Sandra J

    2010-12-01

    Bone geometry is commonly measured on computed tomographic (CT) and X-ray microtomographic (μCT) images. We obtained hundreds of CT, μCT and synchrotron μCT images of bones from diverse species that needed to be analysed remote from scanning hardware, but found that available software solutions were expensive, inflexible or methodologically opaque. We implemented standard bone measurements in a novel ImageJ plugin, BoneJ, with which we analysed trabecular bone, whole bones and osteocyte lacunae. BoneJ is open source and free for anyone to download, use, modify and distribute.

  18. Biblical Images.

    ERIC Educational Resources Information Center

    Nir, Yeshayahu

    1987-01-01

    Responds to Marjorie Munsterberg's review of "The Bible and the Image: The History of Photography in the Holy Land 1839-1899." Claims that Munsterberg provided an incomplete and inaccurate knowledge of the book's content, and that she considered Western pictorial traditions as the only valid measure in the study of the history of…

  19. Image Processing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Images are prepared from data acquired by the multispectral scanner aboard Landsat, which views Earth in four ranges of the electromagnetic spectrum, two visible bands and two infrared. Scanner picks up radiation from ground objects and converts the radiation signatures to digital signals, which are relayed to Earth and recorded on tape. Each tape contains "pixels" or picture elements covering a ground area; computerized equipment processes the tapes and plots each pixel, line by line, to produce the basic image. Image can be further processed to correct sensor errors, to heighten contrast for feature emphasis or to enhance the end product in other ways. Key factor in conversion of digital data to visual form is precision of processing equipment. Jet Propulsion Laboratory prepared a digital mosaic that was plotted and enhanced by Optronics International, Inc. by use of the company's C-4300 Colorwrite, a high precision, high speed system which manipulates and analyzes digital data and presents it in visual form on film. Optronics manufactures a complete family of image enhancement processing systems to meet all users' needs. Enhanced imagery is useful to geologists, hydrologists, land use planners, agricultural specialists, geographers and others.

  20. Diagnostic Imaging

    MedlinePlus

    ... stay still for a long time inside a machine. This can be uncomfortable. Certain tests involve exposure to a small amount of radiation. For some imaging tests, doctors insert a tiny camera attached to a long, thin tube into your body. This tool is called a scope. The doctor moves it ...

  1. Inner Image

    ERIC Educational Resources Information Center

    Mollhagen, Nancy

    2004-01-01

    In this article, the author states that she has always loved self portraits but most teenagers do not enjoy looking too closely at their own faces in an effort to replicate them. Thanks to a new digital camera, she was able to use this new technology to inspire students to take a closer look at their inner image. Prior to the self-portrait…

  2. Forest Imaging

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA's Technology Applications Center, with other government and academic agencies, provided technology for improved resources management to the Cibola National Forest. Landsat satellite images enabled vegetation over a large area to be classified for purposes of timber analysis, wildlife habitat, range measurement and development of general vegetation maps.

  3. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    SciTech Connect

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  4. Genomic analyses of the CAM plant pineapple.

    PubMed

    Zhang, Jisen; Liu, Juan; Ming, Ray

    2014-07-01

    The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low CO2 conditions is a remarkable case of adaptation in flowering plants. As the most important crop that utilizes CAM photosynthesis, the genetic and genomic resources of pineapple have been developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotype sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expressed sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. This reference genome of pineapple provides the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars.

  5. Social Media Analyses for Social Measurement

    PubMed Central

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  6. Characterization of branch complexity by fractal analyses

    USGS Publications Warehouse

    Alados, C.L.; Escos, J.; Emlen, J.M.; Freeman, D.C.

    1999-01-01

    The comparison between complexity in the sense of space occupancy (box-counting fractal dimension D(c) and information dimension D1) and heterogeneity in the sense of space distribution (average evenness index f and evenness variation coefficient J(cv)) was investigated in mathematical fractal objects and natural branch structures. In general, increased fractal dimension was paired with low heterogeneity. Comparisons between branch architecture in Anthyllis cytisoides under different slope exposure and grazing impact revealed that branches were more complex and more homogeneously distributed for plants on northern exposures than southern, while grazing had no impact during a wet year. Developmental instability was also investigated by the statistical noise of the allometric relation between internode length and node order. In conclusion, our study demonstrated that fractal dimension of branch structure can be used to analyze the structural organization of plants, especially if we consider not only fractal dimension but also shoot distribution within the canopy (lacunarity). These indexes together with developmental instability analyses are good indicators of growth responses to the environment.
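The box-counting dimension D(c) named in this abstract is conventionally estimated as the slope of log N(s) versus log(1/s), where N(s) is the number of grid boxes of side s containing at least one point of the structure. The following is a minimal sketch on a synthetic point set, not the study's own code.

```python
# Illustrative box-counting dimension estimate for a 2-D point set.
# D_c is the least-squares slope of log N(s) against log(1/s).
import math

def box_count(points, s):
    """Number of grid boxes of side s occupied by at least one point."""
    return len({(math.floor(x / s), math.floor(y / s)) for x, y in points})

def box_counting_dimension(points, sizes):
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(points, s)) for s in sizes]
    n = len(sizes)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check: a straight line segment should have dimension close to 1.
line = [(i / 10000.0, 0.0) for i in range(10000)]
d = box_counting_dimension(line, [0.1, 0.05, 0.025, 0.0125])
print(round(d, 2))
```

Applying the same estimator to digitized branch silhouettes (or to a fractal such as the Koch curve, expected slope about 1.26) recovers the space-occupancy measure the authors compare against the evenness indices.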

  7. Reproducibility of neuroimaging analyses across operating systems.

    PubMed

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
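The Dice coefficients reported in this abstract compare pairs of binary segmentations produced on different operating systems. A minimal sketch of the metric, with toy 1-D masks standing in for voxel label maps, looks like this (illustrative only, not the study's pipeline):

```python
# Dice coefficient between two binary masks:
# Dice = 2 * |intersection| / (|A| + |B|); identical masks give 1.0.

def dice(mask_a, mask_b):
    a = {i for i, v in enumerate(mask_a) if v}
    b = {i for i, v in enumerate(mask_b) if v}
    return 2.0 * len(a & b) / (len(a) + len(b))

# Made-up "segmentations" of the same subject from two hypothetical runs:
run_linux = [1, 1, 1, 0, 0, 1, 1, 0]
run_macos = [1, 1, 0, 0, 0, 1, 1, 1]
print(round(dice(run_linux, run_macos), 3))  # 0.8
```

Values near 1.0 indicate near-identical labelings; the abstract's observation of coefficients as low as 0.59 between operating systems corresponds to substantial voxel-level disagreement.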

  8. Statistical analyses of a screen cylinder wake

    NASA Astrophysics Data System (ADS)

    Mohd Azmi, Azlin; Zhou, Tongming; Zhou, Yu; Cheng, Liang

    2017-02-01

    The evolution of a screen cylinder wake was studied by analysing its statistical properties over a streamwise range of x/d={10-60}. The screen cylinder was made of a stainless steel screen mesh of 67% porosity. The experiments were conducted in a wind tunnel at a Reynolds number of 7000 using an X-probe. The results were compared with those obtained in the wake generated by a solid cylinder. It was observed that the evolution of the statistics in the wake of the screen cylinder was different from that of a solid cylinder, reflecting the differences in the formation of the organized large-scale vortices in both wakes. The streamwise evolution of the Reynolds stresses, energy spectra and cross-correlation coefficients indicated that there exists a critical location that differentiates the screen cylinder wake into two regions over the measured streamwise range. The formation of the fully formed large-scale vortices was delayed until this critical location. Comparison with existing results for screen strips showed that although the near-wake characteristics and the vortex formation mechanism were similar between the two wake generators, variation in the Strouhal frequencies was observed and the self-preservation states were non-universal, reconfirming the dependence of a wake on its initial condition.

  9. Trend Analyses of Nitrate in Danish Groundwater

    NASA Astrophysics Data System (ADS)

    Hansen, B.; Thorling, L.; Dalgaard, T.; Erlandsen, M.

    2012-04-01

    This presentation assesses the long-term development in the oxic groundwater nitrate concentration and nitrogen (N) loss due to intensive farming in Denmark. Firstly, up to 20-year time-series from the national groundwater monitoring network enable a statistically systematic analysis of distribution, trends and trend reversals in the groundwater nitrate concentration. Secondly, knowledge about the N surplus in Danish agriculture since 1950 is used as an indicator of the potential loss of N. Thirdly, groundwater recharge CFC (Chlorofluorocarbon) age determination allows linking of the first two datasets. The development in the nitrate concentration of oxic groundwater clearly mirrors the development in the national agricultural N surplus, and a corresponding trend reversal is found in groundwater. Regulation and technical improvements in the intensive farming in Denmark have succeeded in decreasing the N surplus by 40% since the mid 1980s while at the same time maintaining crop yields and increasing the animal production of especially pigs. Trend analyses prove that the youngest (0-15 years old) oxic groundwater shows more pronounced significant downward nitrate trends (44%) than the oldest (25-50 years old) oxic groundwater (9%). This amounts to clear evidence of the effect of reduced nitrate leaching on groundwater nitrate concentrations in Denmark. Is the Danish groundwater monitoring strategy optimal for the detection of nitrate trends? Will the nitrate concentrations in Danish groundwater continue to decrease or are the Danish nitrate concentration levels now appropriate according to the Water Framework Directive?
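The abstract does not name the trend test used; the Mann-Kendall test is a common nonparametric choice for monitoring series of this kind, so the sketch below uses it as an assumption, with made-up concentration values.

```python
# Assumed method (not stated in the abstract): Mann-Kendall trend statistic
# for a time series. S sums the signs of all pairwise later-minus-earlier
# differences; S > 0 suggests an upward trend, S < 0 a downward trend.

def mann_kendall_s(series):
    s = 0
    for i in range(len(series) - 1):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of the difference
    return s

# Made-up annual nitrate concentrations (mg/l) in young oxic groundwater,
# loosely mimicking the downward trend the abstract describes:
nitrate = [58, 55, 56, 52, 50, 47, 48, 44, 41, 40]
print(mann_kendall_s(nitrate))  # -41 (strongly negative: downward trend)
```

A full analysis would also compute the variance of S and a significance level; libraries such as `pymannkendall` package those steps.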

  10. Social Media Analyses for Social Measurement.

    PubMed

    Schober, Michael F; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or "found" social media content. But just how trustworthy such measurement can be-say, to replace official statistics-is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys.

  11. Phylogenomic Analyses Support Traditional Relationships within Cnidaria

    PubMed Central

    Zapata, Felipe; Goetz, Freya E.; Smith, Stephen A.; Howison, Mark; Siebert, Stefan; Church, Samuel H.; Sanders, Steven M.; Ames, Cheryl Lewis; McFadden, Catherine S.; France, Scott C.; Daly, Marymegan; Collins, Allen G.; Haddock, Steven H. D.; Dunn, Casey W.; Cartwright, Paulyn

    2015-01-01

    Cnidaria, the sister group to Bilateria, is a highly diverse group of animals in terms of morphology, lifecycles, ecology, and development. How this diversity originated and evolved is not well understood because phylogenetic relationships among major cnidarian lineages are unclear, and recent studies present contrasting phylogenetic hypotheses. Here, we use transcriptome data from 15 newly-sequenced species in combination with 26 publicly available genomes and transcriptomes to assess phylogenetic relationships among major cnidarian lineages. Phylogenetic analyses using different partition schemes and models of molecular evolution, as well as topology tests for alternative phylogenetic relationships, support the monophyly of Medusozoa, Anthozoa, Octocorallia, Hydrozoa, and a clade consisting of Staurozoa, Cubozoa, and Scyphozoa. Support for the monophyly of Hexacorallia is weak due to the equivocal position of Ceriantharia. Taken together, these results further resolve deep cnidarian relationships, largely support traditional phylogenetic views on relationships, and provide a historical framework for studying the evolutionary processes involved in one of the most ancient animal radiations. PMID:26465609

  12. Comparative sequence analyses of sixteen reptilian paramyxoviruses

    USGS Publications Warehouse

    Ahne, W.; Batts, W.N.; Kurath, G.; Winton, J.R.

    1999-01-01

    Viral genomic RNA of Fer-de-Lance virus (FDLV), a paramyxovirus highly pathogenic for reptiles, was reverse transcribed and cloned. Plasmids with significant sequence similarities to the hemagglutinin-neuraminidase (HN) and polymerase (L) genes of mammalian paramyxoviruses were identified by BLAST search. Partial sequences of the FDLV genes were used to design primers for amplification by nested polymerase chain reaction (PCR) and sequencing of 518-bp L gene and 352-bp HN gene fragments from a collection of 15 previously uncharacterized reptilian paramyxoviruses. Phylogenetic analyses of the partial L and HN sequences produced similar trees in which there were two distinct subgroups of isolates that were supported with maximum bootstrap values, and several intermediate isolates. Within each subgroup the nucleotide divergence values were less than 2.5%, while the divergence between the two subgroups was 20-22%. This indicated that the two subgroups represent distinct virus species containing multiple virus strains. The five intermediate isolates had nucleotide divergence values of 11-20% and may represent additional distinct species. In addition to establishing diversity among reptilian paramyxoviruses, the phylogenetic groupings showed some correlation with geographic location, and clearly demonstrated a low level of host species-specificity within these viruses. Copyright (C) 1999 Elsevier Science B.V.
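The divergence percentages in this abstract come from pairwise comparison of aligned nucleotide sequences (under 2.5% within a subgroup, 20-22% between subgroups). A minimal sketch of the calculation, with made-up sequences, looks like this:

```python
# Pairwise nucleotide divergence: fraction of differing sites between two
# aligned sequences of equal length. Sequences below are invented.

def divergence(seq1, seq2):
    assert len(seq1) == len(seq2), "sequences must be aligned"
    diffs = sum(a != b for a, b in zip(seq1, seq2))
    return diffs / len(seq1)

strain_a = "ATGCATGCATGCATGCATGC"
strain_b = "ATGCATGCATGAATGCATGC"  # one substitution in 20 sites
print(divergence(strain_a, strain_b))  # 0.05, i.e. 5% divergence
```

Real analyses of the 518-bp L and 352-bp HN fragments would additionally handle gaps and apply a substitution model before tree building, which simple site counting does not.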

  13. Recent Advances in Cellular Glycomic Analyses

    PubMed Central

    Furukawa, Jun-ichi; Fujitani, Naoki; Shinohara, Yasuro

    2013-01-01

    A large variety of glycans is intricately located on the cell surface, and the overall profile (the glycome, given the entire repertoire of glycoconjugate-associated sugars in cells and tissues) is believed to be crucial for the diverse roles of glycans, which are mediated by specific interactions that control cell-cell adhesion, immune response, microbial pathogenesis and other cellular events. The glycomic profile also reflects cellular alterations, such as development, differentiation and cancerous change. A glycoconjugate-based approach would therefore be expected to streamline discovery of novel cellular biomarkers. Development of such an approach has proven challenging, due to the technical difficulties associated with the analysis of various types of cellular glycomes; however, recent progress in the development of analytical methodologies and strategies has begun to clarify the cellular glycomics of various classes of glycoconjugates. This review focuses on recent advances in the technical aspects of cellular glycomic analyses of major classes of glycoconjugates, including N- and O-linked glycans, derived from glycoproteins, proteoglycans and glycosphingolipids. Articles that unveil the glycomics of various biologically important cells, including embryonic and somatic stem cells, induced pluripotent stem (iPS) cells and cancer cells, are discussed. PMID:24970165

  14. Informative prior distributions for ELISA analyses.

    PubMed

    Klauenberg, Katy; Walzel, Monika; Ebert, Bernd; Elster, Clemens

    2015-07-01

    Immunoassays are capable of measuring very small concentrations of substances in solutions and have an immense range of application. Enzyme-linked immunosorbent assay (ELISA) tests in particular can detect the presence of an infection, of drugs, or hormones (as in the home pregnancy test). Inference of an unknown concentration via ELISA usually involves a non-linear heteroscedastic regression and subsequent prediction, which can be carried out in a Bayesian framework. For such a Bayesian inference, we are developing informative prior distributions based on extensive historical ELISA tests as well as theoretical considerations. One consideration regards the quality of the immunoassay leading to two practical requirements for the applicability of the priors. Simulations show that the additional prior information can lead to inferences which are robust to reasonable perturbations of the model and changes in the design of the data. On real data, the applicability is demonstrated across different laboratories, for different analytes and laboratory equipment as well as for previous and current ELISAs with sigmoid regression function. Consistency checks on real data (similar to cross-validation) underpin the adequacy of the suggested priors. Altogether, the new priors may improve concentration estimation for ELISAs that fulfill certain design conditions, by extending the range of the analyses, decreasing the uncertainty, or giving more robust estimates. Future use of these priors is straightforward because explicit, closed-form expressions are provided. This work encourages development and application of informative, yet general, prior distributions for other types of immunoassays.
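ELISA calibration typically uses a sigmoid regression function such as the four-parameter logistic (4PL) model. The abstract does not specify the exact model or data, so the following is an illustrative sketch of a 4PL fit to hypothetical calibration data with scipy; all parameter values are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic (sigmoid) response curve."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical calibration data: concentration vs. optical density
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
od = four_pl(conc, 0.05, 1.2, 5.0, 2.0) \
     + np.random.default_rng(0).normal(0, 0.01, conc.size)

popt, _ = curve_fit(four_pl, conc, od, p0=[0.1, 1.0, 5.0, 2.0], maxfev=10000)
print(popt)  # recovered (a, b, c, d) close to the true values
```

An informative prior, as proposed in the paper, would constrain these parameters before fitting; the sketch above shows only the underlying non-linear regression step.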

  15. Cyanide analyses for risk and treatability assessments

    SciTech Connect

    MacFarlane, I.D.; Elseroad, H.J.; Pergrin, D.E.; Logan, C.M.

    1994-12-31

Cyanide, an EPA priority pollutant and target analyte, is typically measured as total cyanide. However, cyanide complexation, information that is not acquired through total cyanide analysis, is often a driver of cyanide toxicity and treatability. A case study of a former manufactured gas plant (MGP) is used to demonstrate the usability of various cyanide analytical methods for risk and treatability assessments. Several analytical methods, including cyanide amenable to chlorination and weak acid dissociable cyanide, help test the degree of cyanide complexation. Generally, free or uncomplexed cyanide is more biologically available, toxic, and reactive than complexed cyanide. Extensive site testing has shown that free and weakly dissociable cyanide compose only a small fraction of total cyanide, as would be expected from the literature, and that risk assessment will be more realistic when cyanide form is considered. Likewise, aqueous treatment for cyanide can be properly tested if cyanide form is accounted for. Weak acid dissociable cyanide analyses proved to be the most reliable (and potentially acceptable) cyanide method, as well as representing the most toxic and reactive cyanide forms.

  16. Evaluation of the Hitachi 717 analyser.

    PubMed

    Biosca, C; Antoja, F; Sierra, C; Douezi, H; Macià, M; Alsina, M J; Galimany, R

    1989-01-01

The selective multitest Boehringer Mannheim Hitachi 717 analyser was evaluated according to the guidelines of the Comisión de Instrumentación de la Sociedad Española de Química Clínica and the European Committee for Clinical Laboratory Standards. The evaluation was performed in two steps: examination of the analytical units and evaluation in routine operation. The evaluation of the analytical units included a photometric study: the inaccuracy is acceptable at 340 and 405 nm; the imprecision ranges from 0.12 to 0.95% at 340 nm and from 0.30 to 0.73% at 405 nm; the linearity shows some dispersion at low absorbance for NADH at 340 nm; the drift is negligible; the imprecision of the pipette delivery system increases when the sample pipette operates with 3 µl; the reagent pipette imprecision is acceptable; and the temperature control system is good. Under routine working conditions, seven determinations were studied: glucose, creatinine, iron, total protein, AST, ALP and calcium. The within-run imprecision (CV) ranged from 0.6% for total protein and AST to 6.9% for iron. The between-run imprecision ranged from 2.4% for glucose to 9.7% for iron. Some contamination was found in the carry-over study. The relative inaccuracy is good for all the constituents assayed.
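The within-run and between-run imprecision figures quoted above are coefficients of variation, CV = 100 × SD / mean. A minimal computation on hypothetical replicate measurements (the values below are invented, not the study's data):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%) = 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical within-run replicates for a total-protein control (g/l)
replicates = [70.1, 70.4, 69.8, 70.2, 70.0, 70.3]
print(round(cv_percent(replicates), 2))  # a low CV, consistent with ~0.6%
```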

  17. Evaluation of the Olympus AU-510 analyser.

    PubMed

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comisión de Instrumentación de la Sociedad Española de Química Clínica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation in routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies between +0.5% and -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between 0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from at very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carry-over in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross contamination in the iron assay.

  18. CFD analyses of coolant channel flowfields

    NASA Technical Reports Server (NTRS)

    Yagley, Jennifer A.; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

The flowfield characteristics in rocket engine coolant channels are analyzed by means of a numerical model. The channels are characterized by large length to diameter ratios, high Reynolds numbers, and asymmetrical heating. At representative flow conditions, the channel length is approximately twice the hydraulic entrance length so that fully developed conditions would be reached for a constant property fluid. For the supercritical hydrogen that is used as the coolant, the strong property variations create significant secondary flows in the cross-plane which have a major influence on the flow and the resulting heat transfer. Comparison of constant and variable property solutions shows substantial differences. In addition, the property variations prevent fully developed flow. The density variation accelerates the fluid in the channels, increasing the pressure drop without an accompanying increase in heat flux. Analyses of the inlet configuration suggest that side entry from a manifold can affect the development of the velocity profile because of vortices generated as the flow enters the channel. Current work is focused on studying the effects of channel bifurcation on the flow field and the heat transfer characteristics.
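The claim that the channel length is roughly twice the hydraulic entrance length can be illustrated with standard textbook entrance-length correlations. These correlations are generic estimates, not the paper's numerical model, and the Reynolds number and hydraulic diameter below are hypothetical:

```python
def entrance_length(re, d):
    """Hydraulic entrance length estimate from textbook correlations:
    laminar:   L_e ~ 0.05 * Re * D
    turbulent: L_e ~ 4.4 * Re**(1/6) * D
    re: Reynolds number, d: hydraulic diameter (m)."""
    if re < 2300:
        return 0.05 * re * d
    return 4.4 * re ** (1.0 / 6.0) * d

# Hypothetical high-Reynolds-number coolant channel, 1 mm hydraulic diameter
print(entrance_length(1.0e5, 0.001))  # roughly 0.03 m
```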

  19. Seismic Soil-Structure Interaction Analyses of a Deeply Embedded Model Reactor – SASSI Analyses

    SciTech Connect

Nie, J.; Braverman, J.; Costantino, M.

    2013-10-31

This report summarizes the SASSI analyses of a deeply embedded reactor model performed by BNL and CJC and Associates, as part of the seismic soil-structure interaction (SSI) simulation capability project for the NEAMS (Nuclear Energy Advanced Modeling and Simulation) Program of the Department of Energy. The SASSI analyses included three cases: 0.2 g, 0.5 g, and 0.9 g, all of which refer to nominal peak accelerations at the top of the bedrock. The analyses utilized the modified subtraction method (MSM) for performing the seismic SSI evaluations. Each case consisted of two analyses: input motion in one horizontal direction (X) and input motion in the vertical direction (Z), both of which utilized the same in-column input motion. Besides providing SASSI results for use in comparison with the time domain SSI results obtained using the DIABLO computer code, this study also leads to the recognition that the frequency-domain method should be modernized so that it can better serve its mission-critical role for analysis and design of nuclear power plants.

  20. Evaluation of imaging performance of major image guidance systems

    PubMed Central

    Chan, MF; Yang, J; Song, Y; Burman, C; Chan, P; Li, S

    2011-01-01

Purpose: The imaging characteristics of two popular kV cone-beam CT (CBCT) and two MVCT systems utilised in image-guided radiation therapy (IGRT) were evaluated. Materials and methods: The study was performed on Varian Clinac iX, Elekta Synergy S, Siemens Oncor, and Tomotherapy. A CT phantom (Catphan-504, Phantom Laboratory, Salem, NY) was scanned for measurements of image quality including image noise, uniformity, density accuracy, spatial resolution, contrast linearity, and contrast resolution. The measurement results were analysed using in-house image analysis software. Reproducibility, position correction, and geometric accuracy were also evaluated with markers in a smaller alignment phantom. The performance evaluation compared volumetric image properties from these four systems with those from a conventional diagnostic CT (CCT). Results: It was shown that the linearity of the two kV CBCTs was fairly consistent with CCT. The Elekta CBCT with half-circle 27-cm FOV had higher CT numbers than the other three systems. The image noise of the Elekta kV CBCT, Siemens MV CBCT, and Tomotherapy fan-beam CT (FBCT) was about 2–4 times higher than that of the Varian CBCT. The spatial resolutions of the two kV systems and the two MV systems were 8-11 lp/cm and 3-5 lp/cm, respectively. Conclusion: Elekta CBCT provided a faster image reconstruction and low dose per scan for half-circle scanning. Varian CBCT had relatively lower image noise. Tomotherapy FBCT had the best uniformity. PMID:22287985
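Image noise and uniformity of the kind measured here are commonly taken as region-of-interest (ROI) statistics on a uniform phantom slice: noise as the standard deviation within a central ROI, uniformity as the centre-to-edge difference in mean value. A sketch on synthetic data; the ROI sizes, noise level, and phantom values are invented, not taken from the study:

```python
import numpy as np

def roi_stats(image, center, half=10):
    """Mean and SD of pixel values in a square ROI around `center`."""
    r, c = center
    roi = image[r - half:r + half, c - half:c + half]
    return float(roi.mean()), float(roi.std())

# Hypothetical uniform-phantom slice: values ~0 HU with Gaussian noise
rng = np.random.default_rng(1)
slice_ = rng.normal(0.0, 20.0, size=(256, 256))

centre_mean, noise = roi_stats(slice_, (128, 128))  # noise = SD in centre ROI
edge_mean, _ = roi_stats(slice_, (30, 128))
uniformity = centre_mean - edge_mean                # centre-to-edge difference
print(noise, uniformity)
```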

  1. Remote sensing as a tool to analyse lizards behaviour

    NASA Astrophysics Data System (ADS)

    Dos Santos, Remi; Teodoro, Ana C.; Carretero, Miguel; Sillero, Neftalí

    2016-10-01

Although the spatial context is expected to be a major influence on the interactions among organisms and their environment, it is commonly ignored in ecological studies. This study is part of an investigation on home ranges and their influence on the escape behaviour of Iberian lizards. Fieldwork was conducted inside a 400 m2 mesocosm, using three acclimatized adult male individuals. In order to perform analyses at this local scale, tools with high spatial accuracy are needed. A total of 3016 GPS points were recorded and processed into a Digital Elevation Model (DEM), with a pixel resolution of 2 cm. Then, 1156 aerial photos were taken and processed to create an orthophoto. A refuge map, containing possible locations for retreats, was generated with supervised image classification algorithms, obtaining four classes (refuges, vegetation, bare soil and organic soil). Furthermore, 50 data-loggers were randomly placed, recording temperature and humidity evenly across the area every 15 minutes. After a month of recording, all environmental variables were interpolated using Kriging. The study area presented an irregular elevation. The humidity varied according to the topography and the temperature presented a West-East pattern. Both variables are of paramount importance for lizard activity and performance. In a predation risk scenario, a lizard at a temperature close to its thermal optimum will be able to escape more efficiently. Integration of such ecologically relevant elements in a spatial context exemplifies how remote sensing tools can contribute to improved inference in behavioural ecology.
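The study interpolated the 50 logger readings onto a continuous surface with Kriging. As a lightweight stand-in for that step, the same workflow can be sketched with scipy's linear interpolation; the logger positions, the temperature gradient, and the grid resolution below are all hypothetical:

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical logger layout: 50 random (x, y) positions in a 20 m x 20 m plot
rng = np.random.default_rng(42)
xy = rng.uniform(0, 20, size=(50, 2))
# Invented temperature field with a West-East gradient plus sensor noise
temp = 25.0 + 0.3 * xy[:, 0] + rng.normal(0, 0.2, 50)

# Interpolate onto a regular grid (the study used Kriging; griddata is a
# simple deterministic stand-in)
gx, gy = np.meshgrid(np.linspace(0, 20, 101), np.linspace(0, 20, 101))
surface = griddata(xy, temp, (gx, gy), method="linear")
print(np.nanmin(surface), np.nanmax(surface))
```

A proper Kriging implementation (e.g. via a geostatistics library) would additionally fit a variogram and provide prediction variances, which griddata does not.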

  2. 2010 oil spill: trajectory projections based on ensemble drifter analyses

    NASA Astrophysics Data System (ADS)

    Chang, Yu-Lin; Oey, Leo; Xu, Fang-Hua; Lu, Hung-Fu; Fujisaki, Ayumi

    2011-06-01

An accurate method for long-term (weeks to months) projections of oil spill trajectories, based on multi-year ensemble analyses of simulated surface and subsurface (z = -800 m) drifters released at the northern Gulf of Mexico spill site, is demonstrated for the 2010 oil spill. The simulation compares well with satellite images of the actual oil spill, which show that the surface spread of oil was mainly confined to the northern shelf and slope of the Gulf of Mexico, with some (more limited) spreading over the north/northeastern face of the Loop Current, as well as northwestward toward the Louisiana-Texas shelf. At subsurface, the ensemble projection shows drifters spreading south/southwestward, and this tendency agrees well with ADCP current measurements near the spill site during the months of May-July, which also show southward mean currents. An additional model analysis during the spill period (Apr-Jul/2010) confirms the above ensemble projection. The 2010 analysis confirms that the surface oil spread was predominantly confined to the northern Gulf shelf and slope because the 2010 wind was more southerly compared to climatology and because a cyclone existed north of the Loop Current, positioned to the south of the spill site.

  3. Imaging analysis of LDEF craters

    NASA Technical Reports Server (NTRS)

    Radicatidibrozolo, F.; Harris, D. W.; Chakel, J. A.; Fleming, R. H.; Bunch, T. E.

    1991-01-01

Two small craters in Al from the Long Duration Exposure Facility (LDEF) experiment tray A11E00F (no. 74, 119 micron diameter and no. 31, 158 micron diameter) were analyzed using Auger electron spectroscopy (AES), time-of-flight secondary ion mass spectroscopy (TOF-SIMS), low voltage scanning electron microscopy (LVSEM), and SEM energy dispersive spectroscopy (EDS). High resolution images and sensitive elemental and molecular analyses were obtained with this combined approach. The results of these analyses are presented.

  4. Image and Data-analysis Tools For Paleoclimatic Reconstructions

    NASA Astrophysics Data System (ADS)

    Pozzi, M.

This contribution proposes a directory of instruments and computing resources chosen to address the problems that arise in paleoclimatic reconstructions. The following points are discussed in particular: 1) Numerical analysis of paleo-data (fossil abundances, species analyses, isotopic signals, chemical-physical parameters, biological data): a) statistical analyses (univariate, diversity, rarefaction, correlation, ANOVA, F and T tests, Chi^2); b) multidimensional analyses (principal components, correspondence, cluster analysis, seriation, discriminant, autocorrelation, spectral analysis); c) neural analyses (backpropagation nets, Kohonen feature maps, Hopfield nets, genetic algorithms). 2) Graphical analysis (visualization tools) of paleo-data (quantitative and qualitative fossil abundances, species analyses, isotopic signals, chemical-physical parameters): a) 2-D data analyses (graph, histogram, ternary, survivorship); b) 3-D data analyses (direct volume rendering, isosurfaces, segmentation, surface reconstruction, surface simplification, generation of tetrahedral grids). 3) Quantitative and qualitative digital image analysis (macro- and microfossil image analysis, Scanning Electron Microscope and Optical Polarized Microscope image capture and analysis, morphometric data analysis, 3-D reconstructions): a) 2-D image analysis (correction of image defects, enhancement of image detail, converting texture and directionality to grey scale or colour differences, visual enhancement using pseudo-colour, pseudo-3D, thresholding of image features, binary image processing, measurements, stereological measurements, measuring features on a white background); b) 3-D image analysis (basic stereological procedures, two-dimensional structures: area fraction from the point count, volume fraction from the point count; three-dimensional structures: surface area and the line intercept count, three-dimensional microstructures: line length and the

  5. Fracturing and brittleness index analyses of shales

    NASA Astrophysics Data System (ADS)

    Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje

    2016-04-01

The formation of a fracture network in rocks has a crucial control on the flow behaviour of fluids. In addition, an existing network of fractures influences the propagation of new fractures during e.g. hydraulic fracturing or during a seismic event. Understanding the type and characteristics of the fracture network that will be formed during e.g. hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job. For this, knowledge of the rock properties is crucial. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock, e.g. for hydraulic fracturing of shales. Various definitions of the brittleness index (BI1, BI2 and BI3) exist, based on mineralogy, elastic constants and stress-strain behaviour (Jin et al., 2014; Jarvie et al., 2007; Holt et al., 2011). A maximum brittleness index of 1 predicts very good and efficient fracturing behaviour, while a minimum brittleness index of 0 predicts much more ductile shale behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and have determined the three different brittleness indices on each sample. We show that the three brittleness indices are very different for the same sample, and as such it can be concluded that the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness indices based on the acoustic data (BI1) all lie around values of 0.5, the brittleness indices based on the stress-strain data (BI2) average around 0.75, whereas the mineralogy brittleness index (BI3) predicts values below 0.2. This shows that different estimates of the brittleness index can lead to different decisions for hydraulic fracturing. If we were to rely on the mineralogy (BI3), the Whitby mudstone is not a suitable
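For illustration, two commonly used brittleness-index definitions can be sketched: a mineralogical index (quartz fraction over quartz + carbonate + clay, in the spirit of Jarvie et al., 2007) and an elastic index built from normalised Young's modulus and inverted Poisson's ratio (a Rickman-type formulation, which the abstract does not cite). The mineral fractions, moduli, and normalisation bounds below are hypothetical:

```python
def bi_mineralogy(quartz, carbonate, clay):
    """Mineralogical brittleness index (after Jarvie et al., 2007):
    quartz fraction over total quartz + carbonate + clay."""
    return quartz / (quartz + carbonate + clay)

def bi_elastic(e, nu, e_min=1.0, e_max=80.0, nu_min=0.1, nu_max=0.4):
    """Elastic brittleness index (Rickman-type): average of normalised
    Young's modulus e (GPa) and inverted, normalised Poisson's ratio nu."""
    e_n = (e - e_min) / (e_max - e_min)
    nu_n = (nu_max - nu) / (nu_max - nu_min)
    return 0.5 * (e_n + nu_n)

# Hypothetical clay-rich mudstone: low mineralogical BI, moderate elastic BI
print(bi_mineralogy(quartz=0.15, carbonate=0.10, clay=0.75))  # 0.15
print(bi_elastic(e=20.0, nu=0.25))
```

The point made by the abstract is visible even in this toy form: the two definitions can disagree substantially for the same rock.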

  6. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative results from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons included incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, and the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned, and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
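The three summary statistics the authors tried to reproduce can all be read off a fitted discriminant analysis. A sketch with scikit-learn on simulated data; the dataset, group structure, and variable count are invented:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical morphometric dataset: 3 groups, 40 specimens each, 4 variables
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc, 1.0, size=(40, 4)) for loc in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 40)

dfa = LinearDiscriminantAnalysis().fit(X, y)
pct_correct = 100.0 * dfa.score(X, y)                     # % correctly assigned
pct_variance = 100.0 * dfa.explained_variance_ratio_[0]   # % variance on DF1
largest_coef = float(np.abs(dfa.scalings_).max())         # largest DF coefficient
print(pct_correct, pct_variance, largest_coef)
```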

  7. Analysing policy transfer: perspectives for operational research.

    PubMed

    Bissell, K; Lee, K; Freeman, R

    2011-09-01

    Policy transfer occurs regularly. In essence, a strategy developed elsewhere is taken up and applied in another policy context. Yet what precisely is policy transfer and, more importantly, under what conditions does it occur? This paper describes policy transfer and addresses three main questions, exploring what perspectives of policy transfer might contribute to operational research (OR) efforts. First, what facilitates the transfer of OR results into policy and practice? Second, what facilitates effective lesson-drawing about OR results and processes between and within countries? And third, what would increase the amount of OR being carried out by low- and middle-income countries and used to inform policy and practice at local and global levels? Mexico's adoption and adaptation of the DOTS strategy is used here as an example of policy transfer. Policy transfer is relevant to all countries, levels and arenas of people, institutions and organisations involved in health. With a more systematic analysis of learning and policy processes, OR policy and practice outcomes could be improved at all levels, from local to global. Policy transfer offers theory and concepts for analysing OR from a new perspective. The present paper proposes a model of the policy transfer process for qualitative research use. Comprehensive policy transfer research, given its length, complexity and need for qualitative researchers, should not be envisaged for all OR projects. All OR projects could, however, incorporate some concepts and practical tools inspired from this model. This should help to plan, evaluate and improve OR processes and the resulting changes in policy and practice.

  8. SEDS Tether M/OD Damage Analyses

    NASA Technical Reports Server (NTRS)

    Hayashida, K. B.; Robinson, J. H.; Hill, S. A.

    1997-01-01

The Small Expendable Deployer System (SEDS) was designed to deploy an endmass at the end of a 20-km-long tether which acts as an upper stage rocket, and the threats from the meteoroid and orbital debris (M/OD) particle environments on SEDS components are important issues for the safety and success of any SEDS mission. However, the possibility of severing the tether due to M/OD particle impacts is an even more serious concern, since the SEDS tether has a relatively large exposed area to the M/OD environments although its diameter is quite small. The threats from the M/OD environments became a very important issue for the third SEDS mission, since the project office proposed using the shuttle orbiter as a launch platform instead of the second stage of a Delta II expendable rocket, which was used for the first two SEDS missions. A series of hypervelocity impact tests were performed at the Johnson Space Center and Arnold Engineering Development Center to help determine the critical particle sizes required to sever the tether. The computer hydrodynamic code, or hydrocode, called CTH, developed by the Sandia National Laboratories, was also used to simulate the damage on the SEDS tether caused by both the orbital debris and test particle impacts. The CTH hydrocode simulation results provided much needed information to help determine the critical particle sizes required to sever the tether. The M/OD particle sizes required to sever the tether were estimated from these studies to be less than 0.1 cm in diameter, and particles of this size are more abundant in low-Earth orbit than larger particles. Finally, the authors performed the M/OD damage analyses for the three SEDS missions, i.e., SEDS-1, -2, and -3, by using the information obtained from the hypervelocity impact tests and hydrocode simulation results.

  9. Differentiating tremor patients using spiral analyses.

    PubMed

    Koirala, N; Muthuraman, M; Anjum, T; Chaitanya, C V; Helmolt, V F; Mideksa, K G; Lange, K; Schmidt, G; Schneider, S; Deuschl, G

    2015-01-01

Essential tremor follows an autosomal dominant type of inheritance in the majority of patients, yet its genetic basis has not been identified. The age of onset of this tremor is bimodal, with one peak in young age and another in old age. The old onset is referred to as senile tremor in this study. The precise pathology is still not completely understood for either of these tremors. We wanted to develop an easy diagnostic tool to differentiate these two tremors clinically. In this study, 30 patients, 15 from each group, were asked to draw spirals. The spirals were recorded digitally from each hand, with and without a spiral template, using a Wacom Intuos version 4 tablet. The aim of the study was to look for easy diagnostic measures from these spirals to distinguish the two cohorts of patients. The first measure was to use well-known clinical scores like the number of complete circles without the template, width, height, axis, and degree of severity. The second measure was to estimate the peak frequency and the peak amplitude for the position, velocity, and acceleration data, in the frequency domain. Most of the well-known clinical scores did not show any significant difference between the two patient cohorts; only the degree of severity showed a significant difference. The peak frequency and the peak amplitude in most of the data were not significantly different between the two cohorts of patients; only the peak amplitude from the acceleration data showed a significant difference. Thus, we could use these two parameters to differentiate between the two tremor patient groups, which would be an easy clinical diagnostic tool without the need for any complicated analyses.
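The second measure, peak frequency and peak amplitude in the frequency domain, can be sketched with a simple FFT analysis. The tremor frequency, sampling rate, and noise level below are hypothetical, not the study's recordings:

```python
import numpy as np

def peak_frequency_amplitude(signal, fs):
    """Peak frequency (Hz) and amplitude of a signal's one-sided spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    k = np.argmax(spectrum[1:]) + 1  # skip the DC component
    return freqs[k], 2.0 * spectrum[k]

# Hypothetical acceleration trace: 6 Hz tremor sampled at 100 Hz for 10 s
fs = 100.0
t = np.arange(0, 10, 1 / fs)
accel = 0.8 * np.sin(2 * np.pi * 6.0 * t) \
        + 0.05 * np.random.default_rng(3).normal(size=t.size)

f_peak, a_peak = peak_frequency_amplitude(accel, fs)
print(f_peak, a_peak)  # ~6.0 Hz, amplitude ~0.8
```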

  10. Comparative mutational analyses of influenza A viruses

    PubMed Central

    Cheung, Peter Pak-Hang; Rogozin, Igor B.; Choy, Ka-Tim; Ng, Hoi Yee

    2015-01-01

    The error-prone RNA-dependent RNA polymerase (RdRP) and external selective pressures are the driving forces for RNA viral diversity. When confounded by selective pressures, it is difficult to assess if influenza A viruses (IAV) that have a wide host range possess comparable or distinct spontaneous mutational frequency in their RdRPs. We used in-depth bioinformatics analyses to assess the spontaneous mutational frequencies of two RdRPs derived from human seasonal (A/Wuhan/359/95; Wuhan) and H5N1 (A/Vietnam/1203/04; VN1203) viruses using the mini-genome system with a common firefly luciferase reporter serving as the template. High-fidelity reverse transcriptase was applied to generate high-quality mutational spectra which allowed us to assess and compare the mutational frequencies and mutable motifs along a target sequence of the two RdRPs of two different subtypes. We observed correlated mutational spectra (τ correlation P < 0.0001), comparable mutational frequencies (H3N2:5.8 ± 0.9; H5N1:6.0 ± 0.5), and discovered a highly mutable motif “(A)AAG” for both Wuhan and VN1203 RdRPs. Results were then confirmed with two recombinant A/Puerto Rico/8/34 (PR8) viruses that possess RdRP derived from Wuhan or VN1203 (RG-PR8×WuhanPB2, PB1, PA, NP and RG-PR8×VN1203PB2, PB1, PA, NP). Applying novel bioinformatics analysis on influenza mutational spectra, we provide a platform for a comprehensive analysis of the spontaneous mutation spectra for an RNA virus. PMID:25404565
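The reported τ correlation between the two mutational spectra is a Kendall rank correlation. A minimal sketch with scipy on invented per-site mutation counts (the real spectra are not reproduced here):

```python
from scipy.stats import kendalltau

# Hypothetical per-site mutation counts for two RdRPs along the same
# luciferase reporter template (invented values)
wuhan_counts = [3, 0, 7, 2, 9, 1, 4, 0, 6, 2]
vn1203_counts = [2, 1, 8, 2, 10, 0, 5, 1, 5, 3]

tau, p_value = kendalltau(wuhan_counts, vn1203_counts)
print(tau, p_value)  # a strong, significant positive rank correlation
```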

  11. Soil grain analyses at Meridiani Planum, Mars

    USGS Publications Warehouse

    Weitz, C.M.; Anderson, R.C.; Bell, J.F.; Farrand, W. H.; Herkenhoff, K. E.; Johnson, J. R.; Jolliff, B.L.; Morris, R.V.; Squyres, S. W.; Sullivan, R.J.

    2006-01-01

Grain-size analyses of the soils at Meridiani Planum have been used to identify rock sources for the grains and provide information about depositional processes under past and current conditions. Basaltic sand, dust, millimeter-size hematite-rich spherules interpreted as concretions, spherule fragments, coated partially buried spherules, basalt fragments, sedimentary outcrop fragments, and centimeter-size cobbles are concentrated on the upper surfaces of the soils as a lag deposit, while finer basaltic sands and dust dominate the underlying soils. There is a bimodal distribution of soil grain sizes, with one population representing grains <125 μm and the other falling between 1-4.5 mm. Soils within craters like Eagle and Endurance show a much greater diversity of grain morphologies compared to the plains. The spherules found in the plains soils are approximately 1-2 mm smaller in size than those seen embedded in the outcrop rocks of Eagle and Endurance craters. The average major axis for all unfractured spherules measured in the soils and outcrop rocks is 2.87 ± 1.18 mm, with a trend toward decreasing spherule sizes in both the soils and outcrop rocks as the rover drove southward. Wind ripples seen across the plains of Meridiani are dominated by similar size (1.3-1.7 mm) hematite-rich grains, and they match in size the larger grains on plains ripples at Gusev Crater. Larger clasts and centimeter-size cobbles that are scattered on the soils have several spectral and compositional types, reflecting multiple origins. The cobbles tend to concentrate within ripple troughs along the plains and in association with outcrop exposures. Copyright 2006 by the American Geophysical Union.

  12. Finite Element analyses of soil bioengineered slopes

    NASA Astrophysics Data System (ADS)

    Tamagnini, Roberto; Switala, Barbara Maria; Sudan Acharya, Madhu; Wu, Wei; Graf, Frank; Auer, Michael; te Kamp, Lothar

    2014-05-01

Soil bioengineering methods are not only effective from an economical point of view, but they are also interesting as fully ecological solutions. The presented project aims to define a numerical model which includes the impact of vegetation on slope stability, considering both mechanical and hydrological effects. In this project, a constitutive model has been developed that accounts for the multi-phase nature of the soil, namely the partly saturated condition, and also includes the effects of a biological component. The constitutive equation is implemented in the Finite Element (FE) software Comes-Geo with an implicit integration scheme that accounts for the collapse of the soil structure due to wetting. The mathematical formulation of the constitutive equations is introduced by means of thermodynamics, and it simulates the growth of the biological system over time. The numerical code is then applied in the analysis of an idealized rainfall-induced landslide. The slope is analyzed for vegetated and non-vegetated conditions. The final results allow the impact of vegetation on slope stability to be quantitatively assessed. This allows drawing conclusions and choosing whether it is worthwhile to use soil bioengineering methods in slope stabilization instead of traditional approaches. The application of the FE method shows some advantages with respect to the commonly used limit equilibrium analyses, because it can account for the real coupled strain-diffusion nature of the problem. The mechanical strength of roots is in fact influenced by the stress evolution within the slope. Moreover, the FE method does not need a pre-definition of any failure surface. The FE method can also be used in monitoring the progressive failure of the soil bioengineered system, as it calculates the amount of displacements and strains of the model slope. The preliminary study results show that the formulated equations can be useful for analysis and evaluation of different soil bio

  13. Trend analyses with river sediment rating curves

    USGS Publications Warehouse

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â(Q/Q_GM)^b], where Q_GM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
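
    The discharge-normalized fit described above can be sketched in a few lines; the function name and synthetic data below are illustrative assumptions, not the paper's code:

```python
import numpy as np

def fit_normalized_rating_curve(Q, C):
    """Fit C = a_hat * (Q / Q_GM)**b by least squares in log-log space,
    where Q_GM is the geometric mean of the sampled discharges.
    Illustrative sketch of the discharge-normalized rating curve."""
    Q = np.asarray(Q, dtype=float)
    C = np.asarray(C, dtype=float)
    Q_gm = np.exp(np.log(Q).mean())              # geometric mean of Q
    b, log_a = np.polyfit(np.log(Q / Q_gm), np.log(C), 1)
    return np.exp(log_a), b, Q_gm

# Synthetic check: data generated with a_hat = 3, b = 2 are recovered exactly.
Q = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
C = 3.0 * (Q / np.exp(np.log(Q).mean())) ** 2
a_hat, b, _ = fit_normalized_rating_curve(Q, C)
```

    Because the regression is performed on Q/Q_GM rather than Q, the intercept â measures the curve's vertical offset at the central discharge, which is what makes it a usable trend metric.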

  14. Computational Analyses of Pressurization in Cryogenic Tanks

    NASA Technical Reports Server (NTRS)

    Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chun P.; Field, Robert E.; Ryan, Harry

    2010-01-01

    A comprehensive numerical framework utilizing multi-element unstructured CFD and rigorous real fluid property routines has been developed to carry out analyses of propellant tank and delivery systems at NASA SSC. Traditionally, CFD modeling of pressurization and mixing in cryogenic tanks has been difficult, primarily because the fluids in the tank co-exist in different sub-critical and supercritical states with largely varying properties that have to be accurately accounted for in order to predict the correct mixing and phase change between the ullage and the propellant. For example, during tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant, including heat transfer and phase change effects, and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. In our modeling framework, we incorporated two different approaches to real fluids modeling: (a) the first approach is based on the HBMS model developed by Hirschfelder, Beuler, McGee and Sutton and (b) the second approach is based on a cubic equation of state developed by Soave, Redlich and Kwong (SRK). Both approaches cover fluid properties and property variation spanning sub-critical gas and liquid states as well as the supercritical states. Both models were rigorously tested, and properties for common fluids such as oxygen, nitrogen, and hydrogen were compared against NIST data in both the sub-critical and supercritical regimes.
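
    The second property approach named above, the SRK cubic equation of state, can be illustrated with a textbook sketch; the critical constants for nitrogen are standard handbook values, and the routine is an illustration, not NASA's implementation:

```python
import math

R = 8.314462618  # universal gas constant, J/(mol K)

def srk_pressure(T, Vm, Tc, Pc, omega):
    """Pressure from the Soave-Redlich-Kwong cubic equation of state,
    P = R*T/(Vm - b) - a*alpha(T)/(Vm*(Vm + b)), given temperature T (K),
    molar volume Vm (m^3/mol), critical constants Tc (K) and Pc (Pa),
    and acentric factor omega."""
    a = 0.42748 * R ** 2 * Tc ** 2 / Pc
    b = 0.08664 * R * Tc / Pc
    m = 0.480 + 1.574 * omega - 0.176 * omega ** 2
    alpha = (1.0 + m * (1.0 - math.sqrt(T / Tc))) ** 2
    return R * T / (Vm - b) - a * alpha / (Vm * (Vm + b))

# Nitrogen (Tc = 126.2 K, Pc = 3.396 MPa, omega = 0.0372) at a dilute state,
# where the SRK pressure should lie very close to the ideal-gas value.
P = srk_pressure(300.0, 0.024, 126.2, 3.396e6, 0.0372)
```

    At dense cryogenic states the attractive term becomes large and the SRK prediction departs strongly from ideal-gas behavior, which is why such a cubic EOS (or a model like HBMS) is needed across the sub- and supercritical regimes.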

  15. Identifying Image Manipulation Software from Image Features

    DTIC Science & Technology

    2015-03-26

    2.1 Mathematical Image Definition, Image Formats, and Interpolation Algorithms: following Farid [9], this section gives the mathematical definition of an image and discusses two image formats and three interpolation algorithms.

  16. Image Understanding Research

    DTIC Science & Technology

    1980-09-30

    Key Words: Digital Image Processing, Image Restoration, Scene Analysis, Image Understanding, Edge Detection, Image Segmentation, Image Matching, Texture Analysis, VLSI Processors. Abstract: systems for understanding images, particularly for mapping applications. The research activity includes low-level image analysis and feature

  17. A method to enhance the sensitivity of DTI analyses to group differences: a validation study with comparison to voxelwise analyses.

    PubMed

    Cykowski, Matthew D; Lancaster, Jack L; Fox, Peter T

    2011-09-30

    Studies of white matter (WM) abnormalities in psychiatric and neurological disorders often use the analysis package Tract-Based Spatial Statistics (TBSS). However, with small samples and/or subtle effects, a study using the standard TBSS approach can be underpowered. For such cases, a new method is presented that summarizes global differences between TBSS-derived fractional anisotropy (FA) images with a single paired t-statistic, estimating the degrees of freedom using spatial autocorrelation. The sensitivity of the method is demonstrated by using well-known aging effects on FA as a proxy for disease effects. Sixty healthy subjects were divided equally into younger- (YA), middle- (MA), and older-aged (OA) groups and significant global differences were demonstrated in the YA versus OA (all N ≥ 4, FA difference≈0.023), MA versus OA (all N≥4, FA difference≈0.017), and YA versus MA (FA difference≈0.005 at N=20) comparisons. In contrast, no significant difference could be detected in the YA versus MA comparison using voxelwise TBSS analysis with the full sample (N=20 per group). This method should facilitate localizing analyses in the direction of a proven group difference while providing clinically relevant information about pathophysiologic processes globally affecting WM.
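
    The core idea, a single global paired statistic with autocorrelation-reduced degrees of freedom, might be sketched as follows; the AR(1)-style correction and all names here are assumptions for illustration, not the paper's estimator:

```python
import numpy as np

def global_paired_t(fa_a, fa_b):
    """Single paired t-statistic over the voxelwise difference of two FA
    maps, with the effective sample size shrunk by lag-1 autocorrelation.
    Illustrative AR(1)-style correction, not the published method."""
    d = (np.asarray(fa_a, float) - np.asarray(fa_b, float)).ravel()
    n = d.size
    r1 = np.corrcoef(d[:-1], d[1:])[0, 1]            # lag-1 autocorrelation
    n_eff = max(2.0, n * (1.0 - r1) / (1.0 + r1))    # effective sample size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n_eff))
    return t, n_eff - 1.0

# Synthetic skeleton: a global FA offset of 0.02 against measurement noise.
rng = np.random.default_rng(0)
base = rng.normal(0.45, 0.05, 10000)
t_stat, dof = global_paired_t(base + 0.02, base + rng.normal(0, 0.01, 10000))
```

    The point of the correction is that neighboring skeleton voxels are not independent observations, so the naive n would grossly overstate the degrees of freedom.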

  18. High-Resolution Views of Io's Emakong Patera: Latest Galileo Imaging Results

    NASA Technical Reports Server (NTRS)

    Williams, D. A.; Keszthelyi, L. P.; Davies, A. G.; Greeley, R.; Head, J. W., III

    2002-01-01

    This presentation will discuss analyses of the latest Galileo SSI (solid state imaging) high-resolution images of the Emakong lava channels and flow field on Jupiter's moon Io. Additional information is contained in the original extended abstract.

  19. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma

    PubMed Central

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

    Abstract The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and 1st, 10th, and 50th percentiles of ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P < 0.05), with areas under the ROC curve (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P < 0.001). MR histogram analyses, in particular the 1st percentile of PVP images, hold promise for prediction of MVI of HCC. PMID:27368028
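
    The histogram parameters listed above are straightforward to compute from an ROI's voxel values; a minimal NumPy sketch (illustrative only, not the authors' pipeline):

```python
import numpy as np

def histogram_features(roi_values):
    """Mean, variance, skewness, excess kurtosis, and the percentiles
    used in the study (1st, 10th, 50th, 90th, 99th) of an ROI's voxel
    values. Illustrative sketch of first-order histogram analysis."""
    v = np.asarray(roi_values, dtype=float).ravel()
    z = (v - v.mean()) / v.std()                 # standardized values
    feats = {"mean": v.mean(), "variance": v.var(),
             "skewness": (z ** 3).mean(),
             "kurtosis": (z ** 4).mean() - 3.0}  # excess kurtosis
    for p in (1, 10, 50, 90, 99):
        feats["p%02d" % p] = np.percentile(v, p)
    return feats

# Toy "ROI": a uniform ramp of grey values 0..100.
f = histogram_features(np.arange(101))
```

    In a real study these features would be extracted per lesion from the ADC map and each contrast-enhanced phase, then fed to the ROC comparison.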

  20. SIR-C/X-SAR: Imaging Radar Analyses for Forest Ecosystem Modeling

    NASA Technical Reports Server (NTRS)

    Ranson, K. Jon; Shugart, Herman; Smith, James A.; Sun, Guoqing

    1996-01-01

    Progress, significant results and future plans are discussed relating to the following objectives: (1) Ecosystem characterization using SIR-C/X-SAR and AirSAR data; (2) Improving radar backscatter models for forest canopies; and (3) Using SAR measurements and models with forest ecosystem models to improve inferences of ecosystem attributes and processes.

  1. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…

  2. Lambert albedo retrieval and analyses over Aram Chaos from OMEGA hyperspectral imaging data

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Arvidson, Raymond E.; Wolff, Michael J.; Mellon, Michael T.; Catalano, Jeffrey G.; Wang, Alian; Bishop, Janice L.

    2012-08-01

    The DISORT radiative transfer model was used to retrieve Lambert albedos from 0.4 to 4.0 μm over hydrated sulfate deposits in Aram Chaos for the Mars Express OMEGA instrument. Albedos were also retrieved for a relatively anhydrous area to the north to use as a control for comparison to the hydrated sulfate spectra. Atmospheric gases and aerosols were modeled, along with both solar and thermal radiance contributions and retrieved Lambert albedos are similar for multiple OMEGA observations over the same areas. The Lambert albedo spectra show that the control area is dominated by electronic transition bands due to nanophase iron oxides and low-calcium orthopyroxenes, together with the ubiquitous 2.98 μm band due in part to water adsorbed onto particle surfaces. The retrieved Lambert albedos for Aram Chaos show an enhanced 2.98 μm water band and bands located at 0.938, 1.46, 1.96, and 2.41 μm. We infer the presence of nanophase iron oxides, schwertmannite, and starkeyite based on consideration of these band locations, inferred electronic and vibrational absorptions, stability under Mars conditions, and pathways for formation. This mineral assemblage, together with gray, crystalline hematite previously detected from TES data (Glotch and Christensen, 2005), can be explained as a result of iron oxidation and evaporation of iron-, magnesium-, and sulfur-rich fluids during periods of rising groundwater.

  3. Dynamic and still microcirculatory image analysis for quantitative microcirculation research

    NASA Astrophysics Data System (ADS)

    Ying, Xiaoyou; Xiu, Rui-juan

    1994-05-01

    Based on analyses of various types of digital microcirculatory images (DMCI), we summarize the image features of DMCI, the digitizing requirements for digital microcirculatory imaging, and the basic characteristics of DMCI processing. A dynamic and still imaging separation processing (DSISP) mode was designed for developing a DMCI workstation and for DMCI processing. The original images in this study were clinical microcirculatory images of the human finger nail-bed and conjunctival microvasculature, and intravital microvascular network images from animal tissues or organs. A series of dynamic and still microcirculatory image analysis functions were developed in this study. The experimental results indicate that most of the established analog video image analysis methods for microcirculatory measurement can be realized more flexibly on DMCI. More information can be rapidly extracted from quality-improved DMCI by employing intelligent digital image analysis methods. The DSISP mode is well suited for building a DMCI workstation.

  4. Imaging bolometer

    DOEpatents

    Wurden, Glen A.

    1999-01-01

    Radiation-hard, steady-state imaging bolometer. A bolometer employing infrared (IR) imaging of a segmented-matrix absorber of plasma radiation in a cooled-pinhole camera geometry is described. The bolometer design parameters are determined by modeling the temperature of the foils from which the absorbing matrix is fabricated by using a two-dimensional time-dependent solution of the heat conduction equation. The resulting design will give a steady-state bolometry capability, with approximately 100 Hz time resolution, while simultaneously providing hundreds of channels of spatial information. No wiring harnesses will be required, as the temperature-rise data will be measured via an IR camera. The resulting spatial data may be used to tomographically investigate the profile of plasmas.
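
    The foil-temperature modeling described above rests on a two-dimensional time-dependent solution of the heat conduction equation; a minimal explicit finite-difference sketch, with all grid and material parameters assumed for illustration:

```python
import numpy as np

def step_heat_2d(T, alpha, dx, dt):
    """One forward-time, centred-space (FTCS) step of
    dT/dt = alpha * laplacian(T) on a uniform grid, holding the edges
    fixed (Dirichlet boundary, e.g. a cooled frame).
    Stable for dt < dx**2 / (4 * alpha)."""
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx ** 2
    Tn = T + alpha * dt * lap
    # Re-impose the fixed boundary (overwrites the wrap-around rows/cols).
    Tn[0, :], Tn[-1, :], Tn[:, 0], Tn[:, -1] = T[0, :], T[-1, :], T[:, 0], T[:, -1]
    return Tn

# Toy foil: a unit hot spot relaxing toward the cooled frame.
T = np.zeros((21, 21))
T[10, 10] = 1.0
for _ in range(100):
    T = step_heat_2d(T, alpha=1.0, dx=1.0, dt=0.2)
```

    Inverting such a forward model, i.e. inferring absorbed power from the IR-measured temperature rise, is what turns the camera images into bolometric channels.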

  5. Imaging bolometer

    DOEpatents

    Wurden, G.A.

    1999-01-19

    Radiation-hard, steady-state imaging bolometer is disclosed. A bolometer employing infrared (IR) imaging of a segmented-matrix absorber of plasma radiation in a cooled-pinhole camera geometry is described. The bolometer design parameters are determined by modeling the temperature of the foils from which the absorbing matrix is fabricated by using a two-dimensional time-dependent solution of the heat conduction equation. The resulting design will give a steady-state bolometry capability, with approximately 100 Hz time resolution, while simultaneously providing hundreds of channels of spatial information. No wiring harnesses will be required, as the temperature-rise data will be measured via an IR camera. The resulting spatial data may be used to tomographically investigate the profile of plasmas. 2 figs.

  6. Subwavelength Imaging

    DTIC Science & Technology

    2008-06-12

    ... The Kramers-Kronig relations, which describe the spectral relationship between the real and imaginary parts of, for example, the dielectric function, ... must be analytic in the upper half ω plane. Satisfaction of the Kramers-Kronig relations guarantees causality, as does an analytic ε(ω) in the upper half plane. ... in direction to the Poynting vector (the direction of power flow) for a negative index material. The Kramers-Kronig relations do not convey the causal

  7. Riverine Imaging

    DTIC Science & Technology

    2011-12-16

    Tab 1: Information-Theoretic Analysis & Performance Bounds for Super-Resolution (SR) Video Imagery Reconstruction. Tab 2: Appendix, Riverine Imaging. Figure captions include: range-Doppler map of a reduced 7-point model; frequency vs. Doppler and frequency vs. velocity maps of the reduced 7-point model; chamber-background-subtracted measurements; and predicted SNR vs. range performance for the Akela radar with a 500 mW power amp, based on noise power.

  8. Brain imaging

    SciTech Connect

    Bradshaw, J.R.

    1989-01-01

    This book presents a survey of the various imaging tools with examples of the different diseases shown best with each modality. It includes 100 case presentations covering the gamut of brain diseases. These examples are grouped according to the clinical presentation of the patient: headache, acute headache, sudden unilateral weakness, unilateral weakness of gradual onset, speech disorders, seizures, pituitary and parasellar lesions, sensory disorders, posterior fossa and cranial nerve disorders, dementia, and congenital lesions.

  9. Image structure restoration from satellites with multi-matrix scanners

    NASA Astrophysics Data System (ADS)

    Eremeev, V.; Kuznetcov, A.; Myatov, G.; Presnyakov, Oleg; Poshekhonov, V.; Svetelkin, P.

    2014-10-01

    The paper is devoted to the formation of Earth-surface images by multi-matrix scanning cameras. Forming continuous, spatially registered images requires consistent solutions for radiometric scan correction, stitching, and geo-referencing of multispectral images. A radiometric scan-correction algorithm based on statistical analysis of the input images is described, along with an algorithm for sub-pixel stitching of scans into the single continuous image that a virtual scanner would form. The paper also presents algorithms for the geometric combination of multispectral images acquired at different times, and examples illustrating the effectiveness of the suggested processing algorithms.

  10. Multispectral Imaging Broadens Cellular Analysis

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Amnis Corporation, a Seattle-based biotechnology company, developed ImageStream to produce sensitive fluorescence images of cells in flow. The company responded to an SBIR solicitation from Ames Research Center, and proposed to evaluate several methods of extending the depth of field for its ImageStream system and implement the best as an upgrade to its commercial products. This would allow users to view whole cells at the same time, rather than just one section of each cell. Through Phase I and II SBIR contracts, Ames provided Amnis the funding the company needed to develop this extended functionality. For NASA, the resulting high-speed image flow cytometry process made its way into Medusa, a life-detection instrument built to collect, store, and analyze sample organisms from erupting hydrothermal vents, and has the potential to benefit space flight health monitoring. On the commercial end, Amnis has implemented the process in ImageStream, combining high-resolution microscopy and flow cytometry in a single instrument, giving researchers the power to conduct quantitative analyses of individual cells and cell populations at the same time, in the same experiment. ImageStream is also built for many other applications, including cell signaling and pathway analysis; classification and characterization of peripheral blood mononuclear cell populations; quantitative morphology; apoptosis (cell death) assays; gene expression analysis; analysis of cell conjugates; molecular distribution; and receptor mapping and distribution.

  11. Imaging AMS

    SciTech Connect

    Freeman, S.P.H.T. |; Ramsey, C.B.; Hedges, R.E.M.

    1993-12-01

    The benefits of simultaneous high effective mass resolution and large spectrometer acceptance that accelerator mass spectrometry has afforded the bulk analysis of material samples by secondary ion mass spectrometry may also be applied to imaging SIMS. The authors are exploring imaging AMS with the addition to the Oxford ¹⁴C-AMS system of a scanning secondary ion source. It employs a sub-micron probe and a separate Cs flood to further increase the useful ion yield. The source has been accommodated on the system by directly injecting sputtered ions into the accelerator without mass analysis. They are detected with a range of devices including new high-bandwidth detectors. Qualitative mass spectra may be easily generated by varying only the post-accelerator analysis magnet. Selected ion signals may be used for imaging. In developing the instrument for bioscience research the authors are establishing its capability for measuring the lighter elements prevalent in biological tissue. Importantly, the machine can map the distributions of radiocarbon-labeled compounds with an efficiency of about 1‰. A background due to misidentification of non-¹⁴C ions as a result of the reduced ion mass filtering is too small to hinder high-magnification microscopy.

  12. Imaging stress.

    PubMed

    Brielle, Shlomi; Gura, Rotem; Kaganovich, Daniel

    2015-11-01

    Recent innovations in cell biology and imaging approaches are changing the way we study cellular stress, protein misfolding, and aggregation. Studies have begun to show that stress responses are even more variegated and dynamic than previously thought, encompassing nano-scale reorganization of cytosolic machinery that occurs almost instantaneously, much faster than transcriptional responses. Moreover, protein and mRNA quality control is often organized into highly dynamic macromolecular assemblies, or dynamic droplets, which could easily be mistaken for dysfunctional "aggregates," but which are, in fact, regulated functional compartments. The nano-scale architecture of the stress response ranges from diffraction-limited structures like stress granules, P-bodies, and stress foci to slightly larger quality control inclusions like the juxtanuclear quality control compartment (JUNQ) and the insoluble protein deposit compartment (IPOD), as well as others. Examining the biochemical and physical properties of these dynamic structures necessitates live cell imaging at high spatial and temporal resolution, and techniques to make quantitative measurements with respect to movement, localization, and mobility. Hence, it is important to note some of the most recent observations, while casting an eye towards new imaging approaches that offer the possibility of collecting entirely new kinds of data from living cells.

  13. Imaging Borrelly

    USGS Publications Warehouse

    Soderblom, L.A.; Boice, D.C.; Britt, D.T.; Brown, R.H.; Buratti, B.J.; Kirk, R.L.; Lee, M.; Nelson, R.M.; Oberst, J.; Sandel, B.R.; Stern, S.A.; Thomas, N.; Yelle, R.V.

    2004-01-01

    The nucleus, coma, and dust jets of short-period Comet 19P/Borrelly were imaged from the Deep Space 1 spacecraft during its close flyby in September 2001. A prominent jet dominated the near-nucleus coma and emanated roughly normal to the long axis of the nucleus from a broad central cavity. We show it to have remained fixed in position for more than 34 hr, much longer than the 26-hr rotation period. This confirms earlier suggestions that it is co-aligned with the rotation axis. From a combination of fitting the nucleus light curve from approach images and the nucleus' orientation from stereo images at encounter, we conclude that the sense of rotation is right-handed around the main jet vector. The inferred rotation pole is approximately perpendicular to the long axis of the nucleus, consistent with a simple rotational state. Lacking an existing IAU comet-specific convention but applying a convention provisionally adopted for asteroids, we label this the north pole. This places the sub-solar latitude at ~60° N at the time of perihelion, with the north pole in constant sunlight and thus receiving the maximum average insolation. © 2003 Elsevier Inc. All rights reserved.

  14. Dermatoglyphic analyses in children with cerebral palsy.

    PubMed

    Simsek, S; Taskiran, H; Karakaya, N; Fistik, T; Solak, M; Cakmak, E A

    1998-01-01

    This study was intended to elucidate the diagnostic value of dermatoglyphic features in 45 cerebral palsy (CP) patients (28 boys and 17 girls). The control group comprised 50 healthy children. Dermatoglyphic samples were obtained from both groups using the paper-and-ink method and then analysed. The fingertip dermal pattern types, total ridge counts, a-b ridge counts, atd angle values, presence or absence of dermal patterns in the hypothenar, thenar/I, II, III, and IV interdigital areas, and presence or absence of the palmar flexion lines were compared between the children with CP and the control group. In the boys investigated, arch, radial loop, and whorl prints were significantly increased and ulnar prints significantly decreased (p < 0.001). No difference was found between the investigation and control groups of girls (p > 0.05). The total ridge counts in both boys and girls of the investigation group were significantly decreased relative to the control group (p < 0.001). There was a notable decrease in the a-b ridge counts of the investigation group compared to controls, significant in boys (p < 0.01) but not in girls (p > 0.05). The atd angle values of the investigation group were increased relative to the control group (p < 0.001 in girls and p < 0.01 in boys). The dermal prints in the hypothenar, thenar/I, II, III, and IV interdigital areas showed important differences in the investigation group when compared with the control group (p < 0.01). No clear distinction was found between the two groups with respect to palmar flexion lines (p > 0.05). In conclusion, remarkable differences from controls were found in the dermatoglyphic features of CP cases. In our opinion, further studies examining a greater number of cases will make it possible to obtain etiologically useful data in CP cases.

  15. Genome-Facilitated Analyses of Geomicrobial Processes

    SciTech Connect

    Kenneth H. Nealson

    2012-05-02

    that makes up chitin, virtually all of the strains were in fact capable. This led to the discovery of a great many new genes involved with chitin and NAG metabolism (7). In a similar vein, a detailed study of the sugar utilization pathway revealed a major new insight into the regulation of sugar metabolism in this genus (19). Systems Biology and Comparative Genomics of the shewanellae: Several publications were put together describing the use of comparative genomics for analyses of the group Shewanella, and these were a logical culmination of our genomic-driven research (10,15,18). Eight graduate students received their Ph.D. degrees doing part of the work described here, and four postdoctoral fellows were supported. In addition, approximately 20 undergraduates took part in projects during the grant period.

  16. Molecular Biomarker Analyses Using Circulating Tumor Cells

    PubMed Central

    Punnoose, Elizabeth A.; Atwal, Siminder K.; Spoerke, Jill M.; Savage, Heidi; Pandita, Ajay; Yeh, Ru-Fang; Pirzkall, Andrea; Fine, Bernard M.; Amler, Lukas C.; Chen, Daniel S.; Lackner, Mark R.

    2010-01-01

    Background Evaluation of cancer biomarkers from blood could significantly enable biomarker assessment by providing a relatively non-invasive source of representative tumor material. Circulating Tumor Cells (CTCs) isolated from blood of metastatic cancer patients hold significant promise in this regard. Methodology/Principal Findings Using spiked tumor-cells we evaluated CTC capture on different CTC technology platforms, including CellSearch® and two biochip platforms, and used the isolated CTCs to develop and optimize assays for molecular characterization of CTCs. We report similar performance for the various platforms tested in capturing CTCs, and find that capture efficiency is dependent on the level of EpCAM expression. We demonstrate that captured CTCs are amenable to biomarker analyses such as HER2 status, qRT-PCR for breast cancer subtype markers, KRAS mutation detection, and EGFR staining by immunofluorescence (IF). We quantify cell surface expression of EGFR in metastatic lung cancer patient samples. In addition, we determined HER2 status by IF and FISH in CTCs from metastatic breast cancer patients. In the majority of patients (89%) we found concordance with HER2 status from patient tumor tissue, though in a subset of patients (11%), HER2 status in CTCs differed from that observed in the primary tumor. Surprisingly, we found CTC counts to be higher in ER+ patients in comparison to HER2+ and triple negative patients, which could be explained by low EpCAM expression and a more mesenchymal phenotype of tumors belonging to the basal-like molecular subtype of breast cancer. Conclusions/Significance Our data suggests that molecular characterization from captured CTCs is possible and can potentially provide real-time information on biomarker status. In this regard, CTCs hold significant promise as a source of tumor material to facilitate clinical biomarker evaluation. 
However, limitations exist from a purely EpCAM based capture system and addition of antibodies

  17. Static and dynamic analyses of tensegrity structures

    NASA Astrophysics Data System (ADS)

    Nishimura, Yoshitaka

    Tensegrity structures are a class of truss structures consisting of a continuous set of tension members (cables) and a discrete set of compression members (bars). Since tensegrity structures are lightweight and can be compactly stowed and deployed, cylindrical tensegrity modules have been proposed for space structures. From the viewpoint of structural dynamics, tensegrity structures pose a new set of problems, i.e., initial shape finding. Initial configurations of tensegrity structures must be computed by imposing a pre-stressability condition on the initial equilibrium equations. There are ample qualitative statements regarding the initial geometry of cylindrical and spherical tensegrity modules, but quantitative initial shape analyses have only been performed on one-stage and two-stage cylindrical modules, and analytical expressions for important geometrical parameters, such as twist angles and overlap ratios, that define the initial shape of both cylindrical and spherical tensegrity modules have been lacking. In response to these needs, a set of static and dynamic characterization procedures for tensegrity modules was first developed. The procedures were subsequently applied to Buckminster Fuller's spherical tensegrity modules. Both the initial shape and the corresponding pre-stress mode were obtained analytically using the graphs of the tetrahedral, octahedral (cubic), and icosahedral (dodecahedral) groups. For pre-stressed configurations, modal analyses were conducted to classify a large number of infinitesimal mechanism modes. The procedures were also applied to cyclic cylindrical tensegrity modules with an arbitrary number of stages. It was found that both the Maxwell number and the number of infinitesimal mechanism modes are independent of the number of stages in the axial direction. A reduced set of equilibrium equations was derived by incorporating cyclic symmetry and the flip, or quasi-flip, symmetry of the cylindrical modules.
For multi-stage modules with more than

  18. Structure similarity-guided image binarization for automatic segmentation of epidermis surface microstructure images.

    PubMed

    Zou, Y; Lei, B; Dong, F; Xu, G; Sun, S; Xia, P

    2017-01-24

    Partitioning epidermis surface microstructure (ESM) images into skin-ridge and skin-furrow regions is an important preprocessing step before quantitative analyses of ESM images. Binarization is a promising technique for partitioning ESM images because of its computational simplicity and ease of implementation. However, even for state-of-the-art binarization methods, automatically segmenting ESM images remains a challenge, because the grey-level histograms of ESM images have no obvious features to guide automatic selection of appropriate thresholds. Inspired by the human visual perceptual functions of structural feature extraction and comparison, we propose a structure similarity-guided image binarization method. The proposed method seeks the binary image that best approximates the input ESM image in terms of structural features. The proposed method is validated by comparing it with two recently developed automatic binarization techniques as well as a manual binarization method on 20 synthetic noisy images and 30 ESM images. The experimental results show that: (1) the proposed method can adapt to different images having the same grey-level histogram; (2) compared to the two automatic binarization techniques, the proposed method significantly improves the average accuracy of segmenting ESM images with an acceptable decrease in computational efficiency; and (3) the proposed method is applicable to segmenting practical ESM images. (Matlab code of the proposed method can be obtained by contacting the corresponding author.)
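
    The central idea, selecting the threshold whose binary image is most structurally similar to the input, can be sketched with a single global SSIM score; the published method uses a more elaborate similarity measure, so everything below is a simplified assumption:

```python
import numpy as np

def global_ssim(x, y, L=255.0):
    """Single global SSIM score between two equal-size grey images
    with dynamic range L (standard SSIM constants c1, c2)."""
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return (((2 * mx * my + c1) * (2 * cov + c2)) /
            ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))

def ssim_guided_threshold(img):
    """Exhaustively pick the grey-level threshold whose 0/255 binary
    map best matches the input under the global SSIM score above.
    A simplified sketch of structure similarity-guided binarization."""
    img = img.astype(float)
    best_t, best_s = 1, -np.inf
    for t in range(1, 256):
        binary = np.where(img >= t, 255.0, 0.0)
        s = global_ssim(img, binary)
        if s > best_s:
            best_s, best_t = s, t
    return best_t

# Bimodal toy image: "ridge" half at grey level 200, "furrow" half at 50.
toy = np.full((64, 64), 50.0)
toy[:, 32:] = 200.0
t = ssim_guided_threshold(toy)
```

    On the bimodal toy image any threshold between the two modes yields the structurally best binary map, which is exactly the behavior a histogram-based criterion cannot guarantee when the histogram lacks a clear valley.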

  19. Molecular cloning of chicken aggrecan. Structural analyses.

    PubMed Central

    Chandrasekaran, L; Tanzer, M L

    1992-01-01

    domain. Thus different variants of chondroitin sulphate and keratan sulphate domains may have evolved separately to fulfil specific biochemical and physiological functions. PMID:1339285

  20. Body Image Dissatisfaction and Distortion, Steroid Use, and Sex Differences in College Age Bodybuilders.

    ERIC Educational Resources Information Center

    Peters, Mark Anthony; Phelps, LeAddelle

    2001-01-01

    Compares college age bodybuilders by sex and steroid intake on two variables: body image dissatisfaction and body image distortion. Results reveal only a significant effect for gender on body distortion. No steroid-use differences were apparent for either body image dissatisfaction or body image distortion. Analyses indicate that female…

  1. First Super-Earth Atmosphere Analysed

    NASA Astrophysics Data System (ADS)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are

  2. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Rev 00

    SciTech Connect

    David Dobson

    2001-06-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate and submit

  3. Restoring Turbulence-Degraded Underwater Images

    DTIC Science & Technology

    2012-11-19

    scientist at Corning Inc., Polymath Research Inc., and the Center for Optoelectronics and Optical Communications. He has held his current position for... limited resolution in large telescopes by Fourier analysing speckle patterns in star images, Astron. Astrophys. 6, p. 85, 1970. 2. M. A. Vorontsov... Parallel image processing based on an evolution equation with anisotropic gain: integrated optoelectronic architectures, J. Opt. Soc. Am. A 16, p. 1623

  4. Image Editing Via Searching Source Image

    NASA Astrophysics Data System (ADS)

    Yu, Han; Deng, Liang-Jian

    Image editing has important applications, such as changing image texture, illumination, or target location. As an important application of the Poisson equation, Poisson image editing processes images in the gradient domain and has been applied to seamless cloning, selection editing, image denoising, etc. In this paper, we present a new application of Poisson image editing that is based on searching the source image. The main feature of the new application is that all modifying information comes from the source image. Experimental results show that the proposed application performs well.

  5. Residual Strength Analyses of Monolithic Structures

    NASA Technical Reports Server (NTRS)

    Forth, Scott (Technical Monitor); Ambur, Damodar R. (Technical Monitor); Seshadri, B. R.; Tiwari, S. N.

    2003-01-01

    Finite-element fracture simulation methodology predicts the residual strength of damaged aircraft structures. The methodology uses the critical crack-tip-opening-angle (CTOA) fracture criterion to characterize the fracture behavior of the material. The CTOA fracture criterion assumes that stable crack growth occurs when the crack-tip angle reaches a constant critical value. The use of the CTOA criterion requires an elastic-plastic, finite-element analysis. The critical CTOA value is determined by simulating fracture behavior in laboratory specimens, such as a compact specimen, to obtain the angle that best fits the observed test behavior. The critical CTOA value appears to be independent of loading, crack length, and in-plane dimensions. However, it is a function of material thickness and local crack-front constraint. Modeling the local constraint requires either a three-dimensional analysis or a two-dimensional analysis with an approximation to account for the constraint effects. As the aircraft industry moves toward monolithic structures to reduce part count and manufacturing cost, there has been a consistent effort at NASA Langley to extend the critical-CTOA-based numerical methodology to the analysis of integrally-stiffened panels. In this regard, a series of fracture tests was conducted on both flat and curved aluminum alloy integrally-stiffened panels. The flat panels were subjected to uniaxial tension, and during the tests, applied load versus crack extension, out-of-plane displacements, and local deformations around the crack-tip region were measured. Compact and middle-crack tension specimens were tested to determine the critical angle (ψc) using a three-dimensional code (ZIP3D) and the plane-strain core height (hc) using a two-dimensional code (STAGS). These values were then used in the STAGS analysis to predict the fracture behavior of the integrally-stiffened panels. The analyses modeled stable tearing, buckling, and crack

  6. Runtime and Pressurization Analyses of Propellant Tanks

    NASA Technical Reports Server (NTRS)

    Field, Robert E.; Ryan, Harry M.; Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chung P.

    2007-01-01

    Multi-element unstructured CFD has been utilized at NASA SSC to carry out analyses of propellant tank systems in different modes of operation. The three regimes of interest at SSC include (a) tank chill down (b) tank pressurization and (c) runtime propellant draw-down and purge. While tank chill down is an important event that is best addressed with long time-scale heat transfer calculations, CFD can play a critical role in the tank pressurization and runtime modes of operation. In these situations, problems with contamination of the propellant by inclusion of the pressurant gas from the ullage causes a deterioration of the quality of the propellant delivered to the test article. CFD can be used to help quantify the mixing and propellant degradation. During tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant including heat transfer and phase change effects and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. It should be noted that traditional CFD modeling is inadequate for such simulations because the fluids in the tank are in a range of different sub-critical and supercritical states and elaborate phase change and mixing rules have to be developed to accurately model the interaction between the ullage gas and the propellant. We show a typical run-time simulation of a spherical propellant tank, containing RP-1 in this case, being pressurized with room-temperature nitrogen at 540 R. Nitrogen

  7. Analysing regenerative potential in zebrafish models of congenital muscular dystrophy.

    PubMed

    Wood, A J; Currie, P D

    2014-11-01

    The congenital muscular dystrophies (CMDs) are a clinically and genetically heterogeneous group of muscle disorders. Clinically, hypotonia is present from birth, with progressive muscle weakness and wasting through development. For the most part, CMDs can mechanistically be attributed to failure of the basement membrane protein laminin-α2 to bind sufficiently to correctly glycosylated α-dystroglycan. The majority of CMDs therefore arise as the result of either a deficiency of laminin-α2 (MDC1A) or hypoglycosylation of α-dystroglycan (dystroglycanopathy). Here we consider whether, by filling a regenerative medicine niche, the zebrafish model can address the present challenge of delivering novel therapeutic solutions for CMD. In the first instance, the readiness and appropriateness of the zebrafish as a model organism for pioneering regenerative medicine therapies in CMD is analysed, in particular for MDC1A and the dystroglycanopathies. Despite the recent rapid progress made in gene editing technology, these approaches have yet to yield any novel zebrafish models of CMD. Currently the most genetically relevant zebrafish models in the field of CMD have all been created by N-ethyl-N-nitrosourea (ENU) mutagenesis. Once genetically relevant models have been established, the zebrafish offers several important advantages for investigating the mechanistic cause of CMD, including rapid ex utero development, optical transparency up to the larval stages of development, and relative ease in creating transgenic reporter lines. Together, these tools are well suited for use in live-imaging studies such as in vivo modelling of muscle fibre detachment. Secondly, the zebrafish's contribution to progress in effective treatment of CMD was analysed. Two approaches were identified in which zebrafish could potentially contribute to effective therapies. The first hinges on the augmentation of functional redundancy within the system, such as upregulating alternative laminin chains in the candyfloss

  8. Analyses of 1999 PM Data for the PM NAAQS Review

    EPA Pesticide Factsheets

    These files document all analyses conducted in association with the EPA memorandum from Terence Fitz-Simons, Scott Mathias, and Mike Rizzo titled Analyses of 1999 PM Data for the PM NAAQS Review, November 17, 2000.

  9. MRI (Magnetic Resonance Imaging)

    MedlinePlus

    ... usually given through an IV in the arm. MRI Research Programs at FDA: Magnetic Resonance Imaging (MRI) ...

  10. Image Processor

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Texas Instruments' Programmable Remapper is a research tool used to determine how best to utilize the part of a patient's visual field that is still usable, by mapping manipulated imagery onto it. It is an offshoot of a NASA program for speeding up and improving the accuracy of pattern recognition in video imagery. The Remapper enables an image to be "pushed around" so that more of it falls onto the functional portions of the retina of a low-vision person. It works at video rates, and researchers hope to significantly reduce its size and cost, creating a wearable prosthesis for visually impaired people.

  11. Digital image processing.

    PubMed

    Seeram, Euclid

    2004-01-01

    Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists.

  12. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    PubMed

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
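
    For the bivariate correlation case, the kind of computation these procedures perform can be approximated in a few lines with the Fisher z transform. This is a textbook approximation, not G*Power's internal routine, so the resulting n may differ from G*Power's exact output by a unit or so:

    ```python
    import math
    from statistics import NormalDist

    def power_correlation(r, n, alpha=0.05):
        # Power of a two-sided test of H0: rho = 0, via the Fisher z
        # approximation: atanh(r) is ~normal with SE = 1/sqrt(n - 3).
        nd = NormalDist()
        z = math.atanh(r) * math.sqrt(n - 3)
        z_crit = nd.inv_cdf(1 - alpha / 2)
        return (1 - nd.cdf(z_crit - z)) + nd.cdf(-z_crit - z)

    def sample_size_for_power(r, target=0.80, alpha=0.05):
        # Smallest n reaching the target power (linear scan for clarity).
        n = 4
        while power_correlation(r, n, alpha) < target:
            n += 1
        return n

    # Detecting rho = 0.3 with 80% power at two-sided alpha = .05:
    n_needed = sample_size_for_power(0.3)  # 85 under this approximation
    ```
    
    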

  13. Non-destructive infrared analyses: a method for provenance analyses of sandstones

    NASA Astrophysics Data System (ADS)

    Bowitz, Jörg; Ehling, Angela

    2008-12-01

    Infrared spectroscopy (IR spectroscopy) is commonly applied in the laboratory for mineral analyses in addition to XRD. Because such technical efforts are time and cost consuming, we present an infrared-based mobile method for non-destructive mineral and provenance analyses of sandstones. IR spectroscopy is based on activating chemical bonds. By irradiating a mineral mixture, special bonds are activated to vibrate depending on the bond energy (resonance vibration). Accordingly, the energy of the IR spectrum will be reduced thereby generating an absorption spectrum. The positions of the absorption maxima within the spectral region indicate the type of the bonds and in many cases identify minerals containing these bonds. The non-destructive reflection spectroscopy operates in the near infrared region (NIR) and can detect all common clay minerals as well as sulfates, hydroxides and carbonates. The spectra produced have been interpreted by computer using digital mineral libraries that have been especially collected for sandstones. The comparison of all results with XRD, RFA and interpretations of thin sections demonstrates impressively the accuracy and reliability of this method. Not only are different minerals detectable, but also differently ordered kaolinites and varieties of illites can be identified by the shape and size of the absorption bands. Especially clay minerals and their varieties in combination with their relative contents form the characteristic spectra of sandstones. Other components such as limonite, hematite and amorphous silica also influence the spectra. Sandstones, similar in colour and texture, often can be identified by their characteristic reflectance spectra. Reference libraries with more than 60 spectra of important German sandstones have been created to enable entirely computerized interpretations and identifications of these dimension stones. 
The analysis of infrared spectroscopy results is demonstrated with examples of different sandstones

  14. NASA Earth Exchange (NEX) Supporting Analyses for National Climate Assessments

    NASA Astrophysics Data System (ADS)

    Nemani, R. R.; Thrasher, B. L.; Wang, W.; Lee, T. J.; Melton, F. S.; Dungan, J. L.; Michaelis, A.

    2015-12-01

    The NASA Earth Exchange (NEX) is a collaborative computing platform that has been developed with the objective of bringing scientists together with the software tools, massive global datasets, and supercomputing resources necessary to accelerate research in Earth systems science and global change. NEX supports several research projects that are closely related with the National Climate Assessment including the generation of high-resolution climate projections, identification of trends and extremes in climate variables and the evaluation of their impacts on regional carbon/water cycles and biodiversity, the development of land-use management and adaptation strategies for climate-change scenarios, and even the exploration of climate mitigation through geo-engineering. Scientists also use the large collection of satellite data on NEX to conduct research on quantifying spatial and temporal changes in land surface processes in response to climate and land-cover-land-use changes. Researchers, leveraging NEX's massive compute/storage resources, have used statistical techniques to downscale the coarse-resolution CMIP5 projections to fulfill the demands of the community for a wide range of climate change impact analyses. The DCP-30 (Downscaled Climate Projections at 30 arcsecond) for the conterminous US at monthly, ~1km resolution and the GDDP (Global Daily Downscaled Projections) for the entire world at daily, 25km resolution are now widely used in climate research and applications, as well as for communicating climate change. In order to serve a broader community, the NEX team in collaboration with Amazon, Inc, created the OpenNEX platform. OpenNEX provides ready access to NEX data holdings, including the NEX-DCP30 and GDDP datasets along with a number of pertinent analysis tools and workflows on the AWS infrastructure in the form of publicly available, self contained, fully functional Amazon Machine Images (AMI's) for anyone interested in global climate change.

  15. Spectral Analyses of the Interactions of Giant Vortices on Jupiter

    NASA Astrophysics Data System (ADS)

    Yanamandra-Fisher, P. A.; Simon-Miller, A. A.; Orton, G. S.

    2010-12-01

    The merger of the three white ovals into Oval BA in 2000, and its subsequent color change from white to red in 2005, appear to be loosely correlated to periodic interactions with the Great Red Spot (GRS). The interactions of these two largest vortices in the solar system - the Great Red Spot (GRS) and Oval BA - on Jupiter occur once every 18 months. Our data was acquired primarily at NASA/Infrared Telescope Facility (IRTF), with 1- to 5-micron imager, NSFCAM and its successor, NSFCAM2. We chose four canonical wavelengths that characterize the vertical structure of Jupiter’s atmosphere. Spectral decomposition of the geometrically-registered data identifies several physical changes on the planet: variation of global cloudiness increases during the interaction; the albedo of discrete clouds at different altitudes vary and there appears to be either enhancement or depletion of ammonia vertically in the atmosphere, especially after the color change in Oval BA in 2005. Analyses of the post-color change interactions of GRS and Oval BA in 2006 and 2008 indicate changes in thermal and albedo fields, with enhancement of ammonia in the perturbed region (Otto, Yanamandra-Fisher and Simon-Miller, BAAS, 2009; Fletcher et al., Icarus, 208, Issue 1). We shall present results of the current 2010 interaction, and its comparison to the previous interactions of the GRS - Oval BA. Our goals are to establish common attributes of the interactions in terms of physical changes in the local meteorology for both the unperturbed and perturbed states of the atmosphere, while differences in the interactions may highlight the temporal changes in the global atmospheric state of Jupiter.

  16. Content standards for medical image metadata

    NASA Astrophysics Data System (ADS)

    d'Ornellas, Marcos C.; da Rocha, Rafael P.

    2003-12-01

    Medical images are at the heart of healthcare diagnostic procedures. They provide not only a noninvasive means to view anatomical cross-sections of internal organs but also a means for physicians to evaluate the patient's diagnosis and monitor the effects of treatment. For a medical center, the emphasis may shift from image generation to post-processing and data management, since the medical staff may generate even more processed images and other data from the original image after various analyses and post-processing. A medical image data repository for health care information systems is becoming a critical need. This data repository would contain comprehensive patient records, including information such as clinical data, related diagnostic images, and post-processed images. Due to the large volume and complexity of the data as well as the diversified user access requirements, the implementation of the medical image archive system will be a complex and challenging task. This paper discusses content standards for medical image metadata. In addition, it focuses on the evaluation of image metadata content and on metadata quality management.

  17. Using image analysis and ArcGIS® to improve automatic grain boundary detection and quantify geological images

    NASA Astrophysics Data System (ADS)

    DeVasto, Michael A.; Czeck, Dyanna M.; Bhattacharyya, Prajukti

    2012-12-01

    Geological images, such as photos and photomicrographs of rocks, are commonly used as supportive evidence to indicate geological processes. A limiting factor in quantifying images is the digitization process; therefore, image analysis has remained largely qualitative. ArcGIS®, the most widely used Geographic Information System (GIS) available, is capable of an array of functions, including building models capable of digitizing images. We expanded upon a previously designed model built using Arc ModelBuilder® to quantify photomicrographs and scanned images of thin sections. In order to enhance grain boundary detection while limiting computer processing and hard drive space, we utilized a preprocessing image analysis technique such that only a single image is used in the digitizing model. Preprocessing allows the model to accurately digitize grain boundaries with fewer images and requires less user intervention, by using batch processing in image analysis software and ArcCatalog®. We present case studies for five basic textural analyses of semi-automatically digitized images quantified in ArcMap®: grain size distributions, shape preferred orientations, weak-phase connections (networking), and nearest neighbor statistics are presented in a simplified fashion for further analyses directly obtainable from the automated digitizing method. Finally, we discuss the ramifications of incorporating this method into geological image analyses.

  18. Image processing in astronomy

    NASA Astrophysics Data System (ADS)

    Berry, Richard

    1994-04-01

    Today's personal computers are more powerful than the mainframes that processed images during the early days of space exploration. We have entered an age in which anyone can do image processing. Topics covering the following aspects of image processing are discussed: digital-imaging basics, image calibration, image analysis, scaling, spatial enhancements, and compositing.
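
    The image-calibration step listed above follows a standard recipe: subtract a dark frame, then divide by a flat field normalized to unit mean. A minimal numpy sketch (all names illustrative):

    ```python
    import numpy as np

    def calibrate(raw, dark, flat):
        # Standard CCD reduction: remove the dark signal, then divide by a
        # flat field normalized to unit mean to undo pixel-to-pixel gain.
        gain = flat - dark
        gain = gain / gain.mean()
        return (raw - dark) / gain

    # Synthetic check: forward-model a scene through gain + dark, recover it.
    rng = np.random.default_rng(1)
    gain_true = 1 + 0.1 * rng.standard_normal((8, 8))
    gain_true /= gain_true.mean()
    dark = np.full((8, 8), 100.0)
    scene = rng.uniform(500.0, 1000.0, (8, 8))
    raw = scene * gain_true + dark
    flat = 2000.0 * gain_true + dark
    recovered = calibrate(raw, dark, flat)
    ```
    
    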

  19. Eos visible imagers

    NASA Technical Reports Server (NTRS)

    Barnes, W. L.

    1990-01-01

    Some of the proposed Earth Observing System (Eos) optical imagers are examined. These imagers include: the moderate resolution imaging spectrometer (MODIS); the geoscience laser ranging system (GLRS); the high resolution imaging spectrometer (HIRIS); the intermediate thermal infrared spectrometer (ITIR); the multi-angle imaging spectrometer (MISR); the earth observing scanning polarimeter (EOSP); and the lightning imaging sensor (LIS).

  20. Image ambiguity and fluency.

    PubMed

    Jakesch, Martina; Leder, Helmut; Forster, Michael

    2013-01-01

    Ambiguity is often associated with negative affective responses, and enjoying ambiguity seems restricted to only a few situations, such as experiencing art. Nevertheless, theories of judgment formation, especially the "processing fluency account", suggest that easy-to-process (non-ambiguous) stimuli are processed faster and are therefore preferred to (ambiguous) stimuli, which are hard to process. In a series of six experiments, we investigated these contrasting approaches by manipulating fluency (presentation duration: 10 ms, 50 ms, 100 ms, 500 ms, 1000 ms) and testing effects of ambiguity (ambiguous versus non-ambiguous pictures of paintings) on classification performance (Part A; speed and accuracy) and aesthetic appreciation (Part B; liking and interest). As indicated by signal detection analyses, classification accuracy increased with presentation duration (Exp. 1a), but we found no effects of ambiguity on classification speed (Exp. 1b). Fifty percent of the participants were able to successfully classify ambiguous content at a presentation duration of 100 ms, and at 500 ms even 75% performed above chance level. Ambiguous artworks were found more interesting (in conditions 50 ms to 1000 ms) and were preferred over non-ambiguous stimuli at 500 ms and 1000 ms (Exp. 2a - 2c, 3). Importantly, ambiguous images were nonetheless rated significantly harder to process than non-ambiguous images. These results suggest that ambiguity is an essential ingredient in art appreciation even though or maybe because it is harder to process.

  2. Scrotal imaging

    PubMed Central

    Studniarek, Michał; Modzelewska, Elza

    2015-01-01

    Pathological lesions within the scrotum are relatively rare in imaging except for ultrasonography. The diseases presented in the paper are usually found in men at the age of 15–45, i.e. men of reproductive age, and therefore they are worth attention. Scrotal ultrasound in infertile individuals should be conducted on a routine basis owing to the fact that pathological scrotal lesions are frequently detected in this population. Malignant testicular cancers are the most common neoplasms in men at the age of 20–40. Ultrasound imaging is the method of choice characterized by the sensitivity of nearly 100% in the differentiation between intratesticular and extratesticular lesions. In the case of doubtful lesions that are not classified for intra-operative verification, nuclear magnetic resonance is applied. Computed tomography, however, is performed to monitor the progression of a neoplastic disease, in pelvic trauma with scrotal injury as well as in rare cases of scrotal hernias involving the ureters or a fragment of the urinary bladder. PMID:26674847

  3. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gas Filter (OGF) Content Analyses

    SciTech Connect

    Jantzen, Carol M.; Missimer, David M.; Guenther, Chris P.; Shekhawat, Dushyant; VanEssendelft, Dirk T.; Means, Nicholas C.

    2015-04-23

    in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important, as the ash adds to the DMR and other vessel products, which affects the final waste product mass and composition. The amount and composition of the ash also affect the reaction kinetics. Thus ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and possibly silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed, but the impact from the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug, and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within each sack in order to determine within-bag and between-bag variability of the coal. These analyses were also compared to the vendor's composite analyses and to the coal specification. These analyses were also compared to historic data on Bestac coal analyses performed at Hazen Research Inc. (HRI) between 2004 and 2011.

  4. Hybrid ultrasound imaging techniques (fusion imaging).

    PubMed

    Sandulescu, Daniela Larisa; Dumitrescu, Daniela; Rogoveanu, Ion; Saftoiu, Adrian

    2011-01-07

    Visualization of tumor angiogenesis can facilitate non-invasive evaluation of tumor vascular characteristics to supplement the conventional diagnostic imaging goals of depicting tumor location, size, and morphology. Hybrid imaging techniques combine anatomic [ultrasound, computed tomography (CT), and/or magnetic resonance imaging (MRI)] and molecular (single photon emission CT and positron emission tomography) imaging modalities. One example is real-time virtual sonography, which combines ultrasound (grayscale, colour Doppler, or dynamic contrast harmonic imaging) with contrast-enhanced CT/MRI. The benefits of fusion imaging include an increased diagnostic confidence, direct comparison of the lesions using different imaging modalities, more precise monitoring of interventional procedures, and reduced radiation exposure.

  5. Speckle imaging algorithms for planetary imaging

    SciTech Connect

    Johansson, E.

    1994-11-15

    I will discuss the speckle imaging algorithms used to process images of the impact sites of the collision of comet Shoemaker-Levy 9 with Jupiter. The algorithms use a phase retrieval process based on the average bispectrum of the speckle image data. High resolution images are produced by estimating the Fourier magnitude and Fourier phase of the image separately, then combining them and inverse transforming to achieve the final result. I will show raw speckle image data and high-resolution image reconstructions from our recent experiment at Lick Observatory.
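The reconstruction pipeline described above (estimate the Fourier magnitude and phase separately, combine them, then inverse transform) can be illustrated with a toy NumPy sketch. Here, purely for illustration, both pieces come from the same clean image rather than from the average power spectrum and bispectrum of many speckle frames:

```python
import numpy as np

rng = np.random.default_rng(7)
img = rng.random((32, 32))

F = np.fft.fft2(img)
magnitude = np.abs(F)    # in speckle imaging: estimated from the power spectrum
phase = np.angle(F)      # in speckle imaging: estimated from the bispectrum

# Combine the separately estimated pieces and inverse transform.
recombined = magnitude * np.exp(1j * phase)
recon = np.fft.ifft2(recombined).real
print(np.allclose(recon, img))   # True: the pieces recombine to the image
```

With noisy, independently estimated magnitude and phase the recombination is only approximate, which is why many speckle frames are averaged in practice.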

  6. Image processing techniques for acoustic images

    NASA Astrophysics Data System (ADS)

    Murphy, Brian P.

    1991-06-01

    The primary goal of this research is to test the effectiveness of various image processing techniques applied to acoustic images generated in MATLAB. The simulated acoustic images have the same characteristics as those generated by a computer model of a high resolution imaging sonar. Edge detection and segmentation are the two image processing techniques discussed in this study. The two methods tested are a modified version of Kalman filtering and median filtering.

  7. Medical Imaging System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The MD Image System, a true-color image processing system that serves as a diagnostic aid and tool for storage and distribution of images, was developed by Medical Image Management Systems, Huntsville, AL, as a "spinoff from a spinoff." The original spinoff, Geostar 8800, developed by Crystal Image Technologies, Huntsville, incorporates advanced UNIX versions of ELAS (developed by NASA's Earth Resources Laboratory for analysis of Landsat images) for general purpose image processing. The MD Image System is an application of this technology to a medical system that aids in the diagnosis of cancer, and can accept, store and analyze images from other sources such as Magnetic Resonance Imaging.

  8. Imaging genetics and psychiatric disorders.

    PubMed

    Hashimoto, R; Ohi, K; Yamamori, H; Yasuda, Y; Fujimoto, M; Umeda-Yano, S; Watanabe, Y; Fukunaga, M; Takeda, M

    2015-01-01

    Imaging genetics is an integrated research method that uses neuroimaging and genetics to assess the impact of genetic variation on brain function and structure. Imaging genetics is both a tool for the discovery of risk genes for psychiatric disorders and a strategy for characterizing the neural systems affected by risk gene variants to elucidate quantitative and mechanistic aspects of brain function implicated in psychiatric disease. Early studies of imaging genetics included association analyses between brain morphology and single nucleotide polymorphisms whose function is well known, such as catechol-O-methyltransferase (COMT) and brain-derived neurotrophic factor (BDNF). GWAS of psychiatric disorders have identified genes with unknown functions, such as ZNF804A, and imaging genetics has been used to investigate clues of the biological function of these genes. The difficulty in replicating the findings of studies with small sample sizes has motivated the creation of large-scale collaborative consortiums, such as ENIGMA, CHARGE and IMAGEN, to collect thousands of images. In a genome-wide association study, the ENIGMA consortium successfully identified common variants in the genome associated with hippocampal volume at 12q24, and the CHARGE consortium replicated this finding. The new era of imaging genetics has just begun, and the next challenge we face is the discovery of small effect size signals from large data sets obtained from genetics and neuroimaging. New methods and technologies for data reduction with appropriate statistical thresholds, such as polygenic analysis and parallel independent component analysis (ICA), are warranted. Future advances in imaging genetics will aid in the discovery of genes and provide mechanistic insight into psychiatric disorders.

  9. Imaging Genetics and Psychiatric Disorders

    PubMed Central

    Hashimoto, R; Ohi, K; Yamamori, H; Yasuda, Y; Fujimoto, M; Umeda-Yano, S; Watanabe, Y; Fukunaga, M; Takeda, M

    2015-01-01

    Imaging genetics is an integrated research method that uses neuroimaging and genetics to assess the impact of genetic variation on brain function and structure. Imaging genetics is both a tool for the discovery of risk genes for psychiatric disorders and a strategy for characterizing the neural systems affected by risk gene variants to elucidate quantitative and mechanistic aspects of brain function implicated in psychiatric disease. Early studies of imaging genetics included association analyses between brain morphology and single nucleotide polymorphisms whose function is well known, such as catechol-O-methyltransferase (COMT) and brain-derived neurotrophic factor (BDNF). GWAS of psychiatric disorders have identified genes with unknown functions, such as ZNF804A, and imaging genetics has been used to investigate clues of the biological function of these genes. The difficulty in replicating the findings of studies with small sample sizes has motivated the creation of large-scale collaborative consortiums, such as ENIGMA, CHARGE and IMAGEN, to collect thousands of images. In a genome-wide association study, the ENIGMA consortium successfully identified common variants in the genome associated with hippocampal volume at 12q24, and the CHARGE consortium replicated this finding. The new era of imaging genetics has just begun, and the next challenge we face is the discovery of small effect size signals from large data sets obtained from genetics and neuroimaging. New methods and technologies for data reduction with appropriate statistical thresholds, such as polygenic analysis and parallel independent component analysis (ICA), are warranted. Future advances in imaging genetics will aid in the discovery of genes and provide mechanistic insight into psychiatric disorders. PMID:25732148

  10. Single particle raster image analysis of diffusion.

    PubMed

    Longfils, M; Schuster, E; Lorén, N; Särkkä, A; Rudemo, M

    2017-04-01

    As a complement to the standard RICS method of analysing Raster Image Correlation Spectroscopy images with estimation of the image correlation function, we introduce the method SPRIA, Single Particle Raster Image Analysis. Here, we start by identifying individual particles and estimate the diffusion coefficient for each particle by a maximum likelihood method. Averaging over the particles gives a diffusion coefficient estimate for the whole image. In examples with both simulated and experimental data, we show that the new method gives accurate estimates. It also directly provides standard error estimates. It should be possible to extend the method to study heterogeneous materials and systems of particles with varying diffusion coefficients, as demonstrated in a simple simulation example. A requirement for applying the SPRIA method is that the particle concentration is low enough that the individual particles can be identified. We also describe a bootstrap method for estimating the standard error of standard RICS.
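The per-particle maximum likelihood step can be sketched as follows. This is a minimal illustration assuming pure 2D Brownian motion with a known frame interval, not the authors' code: for Brownian motion each displacement component is N(0, 2·D·dt), so the MLE of D is the mean squared displacement over 4·dt.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_D(track, dt):
    """MLE of the diffusion coefficient from a single 2-D particle track.

    Each frame-to-frame displacement component is N(0, 2*D*dt) for
    Brownian motion, so the MLE of D is sum(dx^2 + dy^2) / (4 * N * dt).
    """
    steps = np.diff(track, axis=0)            # (N-1, 2) displacements
    return np.sum(steps**2) / (4 * dt * len(steps))

# Simulate a track with known D and recover it.
D_true, dt, n_steps = 2.0, 0.01, 5000
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_steps, 2))
track = np.cumsum(steps, axis=0)

D_hat = estimate_D(track, dt)
print(D_hat)   # close to D_true = 2.0
```

Averaging such per-particle estimates over all identified particles gives the image-wide diffusion coefficient described in the abstract.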

  11. Metric Learning to Enhance Hyperspectral Image Segmentation

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Castano, Rebecca; Bue, Brian; Gilmore, Martha S.

    2013-01-01

    Unsupervised hyperspectral image segmentation can reveal spatial trends that show the physical structure of the scene to an analyst. Such segmentations highlight borders and reveal areas of homogeneity and change. Segmentations are independently helpful for object recognition, and assist with automated production of symbolic maps. Additionally, a good segmentation can dramatically reduce the number of effective spectra in an image, enabling analyses that would otherwise be computationally prohibitive. Specifically, using an over-segmentation of the image instead of individual pixels can reduce noise and potentially improve the results of statistical post-analysis. In this innovation, a metric learning approach is presented to improve the performance of unsupervised hyperspectral image segmentation. The prototype demonstrations attempt a superpixel segmentation in which the image is conservatively over-segmented; that is, the single surface features may be split into multiple segments, but each individual segment, or superpixel, is ensured to have homogeneous mineralogy.
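As a toy illustration of how over-segmentation reduces the number of effective spectra, here is a hypothetical fixed-block "superpixel" averaging in NumPy (not the metric learning method itself, which learns the segment boundaries from the data):

```python
import numpy as np

def block_superpixels(cube, block=4):
    """Crude over-segmentation: tile the image into fixed blocks and
    replace each block's pixels by their mean spectrum.

    cube: (rows, cols, bands) hyperspectral image, rows/cols divisible by block.
    Returns a (rows//block * cols//block, bands) array of superpixel spectra.
    """
    r, c, b = cube.shape
    tiles = cube.reshape(r // block, block, c // block, block, b)
    return tiles.mean(axis=(1, 3)).reshape(-1, b)

rng = np.random.default_rng(1)
cube = rng.random((32, 32, 10))            # 1024 pixel spectra
spectra = block_superpixels(cube, block=4)
print(spectra.shape)                        # (64, 10): 16x fewer spectra
```

Downstream statistical analyses then operate on the 64 mean spectra instead of all 1024 pixels, which is the computational saving the abstract describes.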

  12. Nonspectroscopic imaging for quantitative chlorophyll sensing

    NASA Astrophysics Data System (ADS)

    Kim, Taehoon; Kim, Jeong-Im; Visbal-Onufrak, Michelle A.; Chapple, Clint; Kim, Young L.

    2016-01-01

    Nondestructive imaging of physiological changes in plants has been intensively used as an invaluable tool for visualizing heterogeneous responses to various types of abiotic and biotic stress. However, conventional approaches often have intrinsic limitations for quantitative analyses, requiring bulky and expensive optical instruments for capturing full spectral information. We report a spectrometerless (or spectrometer-free) reflectance imaging method that allows for nondestructive and quantitative chlorophyll imaging in individual leaves in situ in a handheld device format. The combination of a handheld-type imaging system and a hyperspectral reconstruction algorithm from an RGB camera offers simple instrumentation and operation while avoiding the use of an imaging spectrograph or tunable color filter. This platform could potentially be integrated into a compact, inexpensive, and portable system, while being of great value in high-throughput phenotyping facilities and laboratory settings.

  13. Medical imaging.

    PubMed Central

    Kreel, L.

    1991-01-01

    There is now a wide choice of medical imaging to show both focal and diffuse pathologies in various organs. Conventional radiology with plain films, fluoroscopy and contrast medium have many advantages, being readily available with low-cost apparatus and a familiarity that almost leads to contempt. The use of plain films in chest disease and in trauma does not need emphasizing, yet there are still too many occasions when the answer obtainable from a plain radiograph has not been available. The film may have been mislaid, or the examination was not requested, or the radiograph had been misinterpreted. The converse is also quite common. Examinations are performed that add nothing to patient management, such as skull films when CT will in any case be requested, or views of the internal auditory meatus and heel pad thickness in acromegaly, to quote some examples. Other issues are more complicated. Should the patient who clinically has gall-bladder disease have more than a plain film that shows gall-stones? If the answer is yes, then why request a plain film if sonography will in any case be required to 'exclude' other pathologies, especially of the liver or pancreas? But then should cholecystography, CT or scintigraphy be added for confirmation? Quite clearly there will be individual circumstances to indicate further imaging after sonography, but in the vast majority of patients little or no extra information will be added. Statistics on accuracy and specificity will, in the case of gall-bladder pathology, vary widely if adenomyomatosis is considered by some to be a cause of symptoms or if sonographic examinations 'after fatty meals' are performed. The arguments for or against routine contrast urography rather than sonography are similar, but the possibility of contrast reactions and the need to limit ionizing radiation must be borne in mind. These diagnostic strategies are also being influenced by their cost and availability; purely pragmatic considerations are not

  14. Superresolution images reconstructed from aliased images

    NASA Astrophysics Data System (ADS)

    Vandewalle, Patrick; Susstrunk, Sabine E.; Vetterli, Martin

    2003-06-01

    In this paper, we present a simple method to almost quadruple the spatial resolution of aliased images. From a set of four low resolution, undersampled and shifted images, a new image is constructed with almost twice the resolution in each dimension. The resulting image is aliasing-free. A small aliasing-free part of the frequency domain of the images is used to compute the exact subpixel shifts. When the relative image positions are known, a higher resolution image can be constructed using the Papoulis-Gerchberg algorithm. The proposed method is tested in a simulation where all simulation parameters are well controlled, and where the resulting image can be compared with its original. The algorithm is also applied to real, noisy images from a digital camera. Both experiments show very good results.
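The shift-estimation step can be approximated, at integer precision, with standard phase correlation; the sketch below is a NumPy illustration of that idea, not the authors' aliasing-free-subband method (which achieves subpixel precision by restricting the computation to an unaliased region of the frequency domain):

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the (row, col) translation of a relative to b via the
    peak of the normalized cross-power spectrum (integer precision)."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    R = Fa * np.conj(Fb)
    R /= np.abs(R) + 1e-12            # keep only the phase difference
    corr = np.abs(np.fft.ifft2(R))    # delta peak at the translation
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks in the upper half of each axis back to negative shifts.
    return tuple(p if p < s // 2 else p - s for p, s in zip(peak, a.shape))

rng = np.random.default_rng(2)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(5, -3), axis=(0, 1))
print(phase_correlation_shift(shifted, img))   # (5, -3)
```

Once the relative positions are known, the low-resolution samples can be placed on a common high-resolution grid and the missing frequencies filled in iteratively, as with the Papoulis-Gerchberg algorithm the paper uses.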

  15. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. Graphical abstract: the main window of the program during dynamic analysis of the foot thermal image.

  16. Third-harmonic generation imaging of breast tissue biopsies.

    PubMed

    Lee, Woowon; Kabir, Mohammad M; Emmadi, Rajyasree; Toussaint, Kimani C

    2016-11-01

    We demonstrate for the first time the imaging of unstained breast tissue biopsies using third-harmonic generation (THG) microscopy. As a label-free imaging technique, THG microscopy is compared to phase contrast and polarized light microscopy, which are standard imaging methods for breast tissues. A simple feature detection algorithm is applied to detect tumour-associated lymphocyte-rich regions in unstained breast biopsy tissue, and the results are compared with corresponding regions identified by a pathologist from bright-field images of hematoxylin and eosin stained breast tissue. Our results suggest that THG imaging holds potential as a complementary technique for analysing breast tissue biopsies.

  17. Towards Efficiency of Oblique Images Orientation

    NASA Astrophysics Data System (ADS)

    Ostrowski, W.; Bakuła, K.

    2016-03-01

    Many papers on both theoretical aspects of bundle adjustment of oblique images and new operators for detecting tie points on oblique images have been written. However, only a few achievements presented in the literature have been practically implemented in commercial software. In consequence, aerial triangulation is often performed either for nadir images obtained simultaneously with oblique photos, or as bundle adjustment for separate images captured in different directions. The aim of this study was to investigate how the orientation of oblique images can be carried out effectively in commercial software based on structure-from-motion technology. The main objective of the research was to evaluate the impact of the orientation strategy on both the duration of the process and the accuracy of photogrammetric 3D products. Two very popular software packages, Pix4D and Agisoft Photoscan, were tested, and two approaches for image blocks were considered. The first approach was based only on oblique images collected in four directions; the second approach also included nadir images. In this study, blocks for three test areas were analysed. Oblique images were collected with medium-format cameras in a Maltese cross configuration with registration of GNSS and INS data. As a reference, both check points and digital surface models from airborne laser scanning were used.

  18. The Influence of University Image on Student Behaviour

    ERIC Educational Resources Information Center

    Alves, Helena; Raposo, Mario

    2010-01-01

    Purpose: The purpose of this paper is to analyse the influence of image on student satisfaction and loyalty. Design/methodology/approach: In order to accomplish the objectives proposed, a model reflecting the influence of image on student satisfaction and loyalty is applied. The model is tested through use of structural equations and the final…

  19. scikit-image: image processing in Python

    PubMed Central

    Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921

  20. scikit-image: image processing in Python.

    PubMed

    van der Walt, Stéfan; Schönberger, Johannes L; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org.

  1. Analysing harmonic motions with an iPhone’s magnetometer

    NASA Astrophysics Data System (ADS)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
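The curve-fitting step need not use Eureqa; a hedged sketch with an ordinary nonlinear least-squares fit on synthetic magnetometer-like data (the 2 Hz signal, amplitudes and noise level are made up, well below the 15-20 Hz limit noted in the abstract):

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic "magnetometer" trace: a 2 Hz harmonic oscillation plus noise.
rng = np.random.default_rng(4)
t = np.linspace(0, 2, 400)
signal = 30 + 12 * np.sin(2 * np.pi * 2.0 * t) + rng.normal(0, 0.5, t.size)

def harmonic(t, offset, amp, f, phase):
    return offset + amp * np.sin(2 * np.pi * f * t + phase)

# Seed the frequency guess from the FFT peak so the nonlinear fit converges.
spec = np.abs(np.fft.rfft(signal - signal.mean()))
f0 = np.fft.rfftfreq(t.size, d=t[1] - t[0])[np.argmax(spec)]
params, _ = curve_fit(harmonic, t, signal,
                      p0=[signal.mean(), signal.std() * np.sqrt(2), f0, 0.0])
print(params[2])   # fitted frequency, close to 2 Hz
```

The fitted parameters give the equation of the harmonic motion directly, which is what the symbolic-regression step in the paper recovers from the recorded magnetic field data.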

  2. SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES

    SciTech Connect

    Flach, G.

    2014-10-28

    PORFLOW-related analyses supporting a sensitivity analysis for Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded in a piecewise manner from the top and bottom simultaneously. The current analyses employ a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses which may be useful in determining the distribution of Tc-99 in the various SDUs throughout time and in determining flow balances for the SDUs.

  3. Split image optical display

    DOEpatents

    Veligdan, James T.

    2005-05-31

    A video image is displayed from an optical panel by splitting the image into a plurality of image components, and then projecting the image components through corresponding portions of the panel to collectively form the image. Depth of the display is correspondingly reduced.

  4. Split image optical display

    DOEpatents

    Veligdan, James T.

    2007-05-29

    A video image is displayed from an optical panel by splitting the image into a plurality of image components, and then projecting the image components through corresponding portions of the panel to collectively form the image. Depth of the display is correspondingly reduced.

  5. Mechanical evaluation by patient-specific finite element analyses demonstrates therapeutic effects for osteoporotic vertebrae.

    PubMed

    Tawara, Daisuke; Sakamoto, Jiro; Murakami, Hideki; Kawahara, Norio; Oda, Juhachi; Tomita, Katsuro

    2010-01-01

    Osteoporosis can lead to bone compressive fractures in the lower lumbar vertebrae. In order to assess the recovery of vertebral strength during drug treatment for osteoporosis, it is necessary not only to measure the bone mass but also to perform patient-specific mechanical analyses, since the strength of osteoporotic vertebrae is strongly dependent on patient-specific factors, such as bone shape and bone density distribution in cancellous bone, which are related to stress distribution in the vertebrae. In the present study, patient-specific general (not voxel) finite element analyses of osteoporotic vertebrae during drug treatment were performed over time. We compared changes in bone density and compressive principal strain distribution in a relative manner using models for the first lumbar vertebra based on computed tomography images of four patients at three time points (before therapy, and after 6 and 12 months of therapy). The patient-specific mechanical analyses indicated that increases in bone density and decreases in compressive principal strain were significant in some osteoporotic vertebrae. The data suggested that the vertebrae were strengthened structurally and the drug treatment was effective in preventing compression fractures. The effectiveness of patient-specific mechanical analyses for providing useful and important information for the prognosis of osteoporosis is demonstrated.

  6. Analyses and Measures of GPR Signal with Superimposed Noise

    NASA Astrophysics Data System (ADS)

    Chicarella, Simone; Ferrara, Vincenzo; D'Atanasio, Paolo; Frezza, Fabrizio; Pajewski, Lara; Pavoncello, Settimio; Prontera, Santo; Tedeschi, Nicola; Zambotti, Alessandro

    2014-05-01

    The influence of EM noise and harsh environmental conditions on GPR surveys has been examined analytically [1]. In the case of pulse radar GPR, many unwanted signals, such as stationary clutter, non-stationary clutter, random noise, and time jitter, influence the measurement signal. When the GPR is motionless, stationary clutter is the dominant signal component, due to reflections from static objects other than the investigated target and to direct antenna coupling. Moving objects, e.g. persons and vehicles, and the swaying of tree crowns produce non-stationary clutter. Device internal noise and narrowband jamming are two potential sources of random noise. Finally, trigger instabilities generate random jitter. In order to estimate the effective influence of these noise components, we organized several experimental measurement setups. At first, for the case of basic GPR detection, we evaluated simpler image processing of the radargram. In the future, we foresee experimental measurements for detection of the Doppler frequency changes induced by movements of targets (like physiological movements of survivors under debris). We obtain image processing of the radargram using the GSSI SIR® 2000 GPR system together with the UWB UHF GPR antenna (SUB-ECHO HBD 300, a model manufactured by the Radarteam company). Our work includes both characterization of the GPR signal without (or almost without) superimposed noise, and the effect of jamming originating from the coexistence of a different radio signal. For characterizing the GPR signal, we organized a measurement setup that includes the following instruments: a mod. FSP 30 spectrum analyser by Rohde & Schwarz, which operates in the frequency range 9 kHz - 30 GHz; a mod. Sucoflex 104 cable by Huber Suhner (10 MHz - 18 GHz); and an HL050 antenna by Rohde & Schwarz (bandwidth: from 850 MHz to 26.5 GHz). The next analysis of superimposed jamming will examine two different signal sources: by a cellular phone and by a

  7. Power analyses for negative binomial models with application to multiple sclerosis clinical trials.

    PubMed

    Rettiganti, Mallik; Nagaraja, H N

    2012-01-01

    We use negative binomial (NB) models for the magnetic resonance imaging (MRI)-based brain lesion count data from parallel group (PG) and baseline versus treatment (BVT) trials for relapsing remitting multiple sclerosis (RRMS) patients, and describe the associated likelihood ratio (LR), score, and Wald tests. We perform power analyses and sample size estimation using the simulated percentiles of the exact distribution of the test statistics for the PG and BVT trials. When compared to the corresponding nonparametric test, the LR test results in 30-45% reduction in sample sizes for the PG trials and 25-60% reduction for the BVT trials.
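A Monte Carlo power calculation of this kind can be sketched as follows. This is a simplified illustration using the nonparametric Mann-Whitney test as the comparator (not the exact LR-test distribution used in the paper), with hypothetical lesion-count means, treatment effect, and dispersion:

```python
import numpy as np
from scipy.stats import nbinom, mannwhitneyu

rng = np.random.default_rng(5)

def nb_params(mean, disp):
    """Convert (mean, dispersion) to scipy's (n, p) parameterization,
    so that variance = mean + disp * mean**2."""
    n = 1.0 / disp
    return n, n / (n + mean)

def power(n_per_arm, mean_ctrl=6.0, reduction=0.5, disp=1.0,
          alpha=0.05, n_sim=500):
    """Fraction of simulated two-arm trials rejecting at level alpha."""
    n0, p0 = nb_params(mean_ctrl, disp)
    n1, p1 = nb_params(mean_ctrl * (1 - reduction), disp)
    hits = 0
    for _ in range(n_sim):
        ctrl = nbinom.rvs(n0, p0, size=n_per_arm, random_state=rng)
        trt = nbinom.rvs(n1, p1, size=n_per_arm, random_state=rng)
        if mannwhitneyu(ctrl, trt).pvalue < alpha:
            hits += 1
    return hits / n_sim

pw = power(40)
print(pw)   # estimated power for 40 patients per arm
```

Repeating this over a grid of `n_per_arm` values and reading off where the curve crosses the target power (e.g. 0.8) gives the sample size estimate; substituting the LR test statistic for `mannwhitneyu` would reproduce the comparison the paper reports.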

  8. Coordinated in Situ Analyses of Organic Nanoglobules in the Sutter's Mill Meteorite

    NASA Technical Reports Server (NTRS)

    Nakamura-Messenger, K.; Messenger, S.; Keller, L. P.; Clemett, S. J.; Nguyen, A. N.; Gibson, E. K.

    2013-01-01

    The Sutter's Mill meteorite is a newly fallen carbonaceous chondrite that was collected and curated quickly after its fall. Preliminary petrographic and isotopic investigations suggest affinities to the CM2 carbonaceous chondrites. The primitive nature of this meteorite and its rapid recovery provide an opportunity to investigate primordial solar system organic matter in a unique new sample. Here we report in-situ analyses of organic nanoglobules in the Sutter's Mill meteorite using UV fluorescence imaging, Fourier-transform infrared spectroscopy (FTIR), scanning transmission electron microscopy (STEM), NanoSIMS, and ultrafast two-step laser mass spectrometry (ultra-L2MS).

  9. Display depth analyses with the wave aberration for the auto-stereoscopic 3D display

    NASA Astrophysics Data System (ADS)

    Gao, Xin; Sang, Xinzhu; Yu, Xunbo; Chen, Duo; Chen, Zhidong; Zhang, Wanlu; Yan, Binbin; Yuan, Jinhui; Wang, Kuiru; Yu, Chongxiu; Dou, Wenhua; Xiao, Liquan

    2016-07-01

    Because the aberration severely affects the display performances of the auto-stereoscopic 3D display, the diffraction theory is used to analyze the diffraction field distribution and the display depth through aberration analysis. Based on the proposed method, the display depth of central and marginal reconstructed images is discussed. The experimental results agree with the theoretical analyses. Increasing the viewing distance or decreasing the lens aperture can improve the display depth. Different viewing distances and the LCD with two lens-arrays are used to verify the conclusion.

  10. Identifying neural correlates of visual consciousness with ALE meta-analyses.

    PubMed

    Bisenius, Sandrine; Trapp, Sabrina; Neumann, Jane; Schroeter, Matthias L

    2015-11-15

    Neural correlates of consciousness (NCC) have been a topic of study for nearly two decades. In functional imaging studies, several regions have been proposed to constitute possible candidates for NCC, but as of yet, no quantitative summary of the literature on NCC has been done. The question whether single (striate or extrastriate) regions or a network consisting of extrastriate areas that project directly to fronto-parietal regions are necessary and sufficient neural correlates for visual consciousness is still highly debated [e.g., Rees et al., 2002, Nat Rev. Neurosci 3, 261-270; Tong, 2003, Nat Rev. Neurosci 4, 219-229]. The aim of this work was to elucidate this issue and give a synopsis of the present state of the art by conducting systematic and quantitative meta-analyses across functional magnetic resonance imaging (fMRI) studies using several standard paradigms for conscious visual perception. In these paradigms, consciousness is operationalized via perceptual changes, while the visual stimulus remains invariant. An activation likelihood estimation (ALE) meta-analysis was performed, representing the best approach for voxel-wise meta-analyses to date. In addition to computing a meta-analysis across all paradigms, separate meta-analyses on bistable perception and masking paradigms were conducted to assess whether these paradigms show common or different NCC. For the overall meta-analysis, we found significant clusters of activation in inferior and middle occipital gyrus; fusiform gyrus; inferior temporal gyrus; caudate nucleus; insula; inferior, middle, and superior frontal gyri; precuneus; as well as in inferior and superior parietal lobules. These results suggest a subcortical-extrastriate-fronto-parietal network rather than a single region that constitutes the necessary NCC. 
The results of our exploratory paradigm-specific meta-analyses suggest that this subcortical-extrastriate-fronto-parietal network might be differentially activated as a function of the

  11. Enhancing forensic science with spectroscopic imaging

    NASA Astrophysics Data System (ADS)

    Ricci, Camilla; Kazarian, Sergei G.

    2006-09-01

    This presentation outlines the research we are developing in the area of Fourier Transform Infrared (FTIR) spectroscopic imaging, with a focus on materials of forensic interest. FTIR spectroscopic imaging has recently emerged as a powerful tool for the characterisation of heterogeneous materials. FTIR imaging relies on the ability of the military-developed infrared array detector to simultaneously measure spectra from thousands of different locations in a sample. The recently developed application of FTIR imaging in ATR (Attenuated Total Reflection) mode has demonstrated the ability of this method to achieve spatial resolution beyond the diffraction limit of infrared light in air. Chemical visualisation with enhanced spatial resolution in micro-ATR mode broadens the range of materials studied with FTIR imaging, with applications to pharmaceutical formulations or biological samples. Macro-ATR imaging has also been developed for chemical imaging analysis of large-surface-area samples and was applied to analyse the surface of human skin (e.g. a finger), counterfeit tablets, textile materials (clothing), etc. This approach demonstrated the ability of the imaging method to detect trace materials attached to the surface of the skin. It may also prove to be a valuable tool in the detection of traces of explosives left or trapped on the surfaces of different materials. This FTIR imaging method is substantially superior to many other imaging methods due to the inherent chemical specificity of infrared spectroscopy and the fast acquisition times of the technique. Our preliminary data demonstrate that this methodology will provide the means for a non-destructive detection method that could relate evidence to its source. This will be important in a wider crime prevention programme. 
In summary, intrinsic chemical specificity and enhanced visualising capability of FTIR spectroscopic imaging open a window of opportunities for counter-terrorism and crime-fighting, with applications ranging

  12. Field-Based Land Cover Classification Aided with Texture Analyses Using Terrasar-X Data

    NASA Astrophysics Data System (ADS)

    Mahmoud, Ali; Pradhan, Biswajeet; Buchroithner, Manfred

    The present study evaluates a field-based approach to land cover classification using recently launched high-resolution SAR data. A TerraSAR-X1 (TSX-1) strip-mode image, coupled with digital orthophotos of 20 cm spatial resolution, was used for land cover classification and parcel mapping, respectively. Different filtering and texture analysis techniques were applied to extract textural information from the TSX-1 image in order to assess their effect on classification accuracy. Several parcel attributes were derived from the available TSX-1 image in order to identify the attributes that best discriminate between different land cover types. These attributes were then further analyzed by statistical and various image classification methods for land cover classification. The results showed that textural analysis achieved higher classification accuracy than the original image alone. The authors conclude that an integrated land cover classification using the textural information in TerraSAR-X1 imagery has high potential for land cover mapping. Key words: land cover classification, TerraSAR-X1, field based, texture analysis
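
    The abstract does not name the specific texture measures used; grey-level co-occurrence matrices (GLCMs) are one common choice for SAR texture analysis, so the following minimal Python sketch is purely illustrative rather than the authors' method. It computes a normalised GLCM and its contrast statistic for a tiny quantised image.

```python
# Illustrative sketch (not the authors' method): a grey-level co-occurrence
# matrix (GLCM) and its contrast statistic, one common way to quantify
# image texture for classification.

def glcm(image, dx=1, dy=0, levels=4):
    """Normalised co-occurrence matrix for pixel pairs at offset (dx, dy)."""
    h, w = len(image), len(image[0])
    counts = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                counts[image[y][x]][image[y2][x2]] += 1
                total += 1
    return [[c / total for c in row] for row in counts]

def contrast(p):
    """GLCM contrast: sum over i, j of p[i][j] * (i - j)**2."""
    n = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

# Tiny 4-level image: a sharp vertical edge gives a nonzero contrast.
img = [[0, 0, 3, 3]] * 4
p = glcm(img)
print(round(contrast(p), 3))  # → 3.0
```

    Per-parcel statistics of such measures can then serve as additional classification attributes alongside the raw backscatter.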

  13. Analysing land cover and land use change in the Matobo National Park and surroundings in Zimbabwe

    NASA Astrophysics Data System (ADS)

    Scharsich, Valeska; Mtata, Kupakwashe; Hauhs, Michael; Lange, Holger; Bogner, Christina

    2016-04-01

    Natural forests are threatened worldwide; their protection in national parks is therefore essential. Here, we investigate how this protection status affects land cover. To answer this question, we analyse the surface reflectance of three Landsat images of Matobo National Park and its surroundings in Zimbabwe from 1989, 1998 and 2014 to detect changes in land cover in this region. To account for the rolling countryside and the resulting prominent shadows, a topographic correction of the surface reflectance was required. Inferring land cover changes requires ground data not only for the current satellite images but also for the older ones; for the older images in particular, no recent field study could reconstruct such data reliably. In our study we follow the idea that land cover classes of pixels in current images can be transferred to the equivalent pixels of older ones if no changes occurred in the meantime. We therefore combine unsupervised clustering with supervised classification as follows. First, we produce a land cover map for 2014. Second, we cluster the images with CLARA, an algorithm similar to k-means but suitable for large data sets; the best number of classes was determined to be four. Third, we locate unchanged pixels in the images of 1989 and 1998 with change vector analysis, and for these pixels we transfer the corresponding cluster label from 2014 to 1989 and 1998. The classified pixels then serve as training data for supervised classification with random forest, carried out for each image separately. Finally, we derive land cover classes from the 2014 Landsat image, photographs and Google Earth, and transfer them to the other two images. The resulting classes are shrub land; forest/shallow waters; bare soils/fields with some trees/shrubs; and bare light soils/rocks, fields and settlements. The three classifications are then compared and land cover changes are mapped.
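
    The label-transfer step described above can be sketched in miniature. The following Python fragment is a hedged illustration, not the authors' code: change vector analysis flags pixels whose reflectance vectors barely moved between dates, and those pixels inherit their 2014 class label as training data for the older image (the threshold and toy reflectance values are assumptions).

```python
# Hedged sketch of label transfer via change vector analysis (illustrative;
# the study's bands, threshold and class set differ).

def change_magnitude(pix_a, pix_b):
    """Euclidean length of the change vector between two reflectance vectors."""
    return sum((a - b) ** 2 for a, b in zip(pix_a, pix_b)) ** 0.5

def transfer_labels(refl_old, refl_new, labels_new, threshold):
    """Return {pixel_index: class} for pixels deemed unchanged."""
    training = {}
    for i, (old, new) in enumerate(zip(refl_old, refl_new)):
        if change_magnitude(old, new) < threshold:
            training[i] = labels_new[i]
    return training

# Two-band toy scene: pixel 1 changed strongly, pixels 0 and 2 did not.
refl_1989 = [(0.10, 0.40), (0.50, 0.20), (0.30, 0.30)]
refl_2014 = [(0.11, 0.41), (0.10, 0.60), (0.30, 0.31)]
labels_2014 = ["shrubland", "forest", "bare soil"]

print(transfer_labels(refl_1989, refl_2014, labels_2014, threshold=0.05))
# → {0: 'shrubland', 2: 'bare soil'}
```

    The surviving labelled pixels would then train a supervised classifier (random forest in the study) applied to the older image as a whole.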
The main changes are

  14. Smart Image Enhancement Process

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J. (Inventor); Rahman, Zia-ur (Inventor); Woodell, Glenn A. (Inventor)

    2012-01-01

    Contrast and lightness measures are used to first classify the image as either turbid or non-turbid. If turbid, the original image is enhanced to generate a first enhanced image. If non-turbid, the original image is classified in terms of a merged contrast/lightness score based on the contrast and lightness measures. The non-turbid image is enhanced to generate a second enhanced image when a poor contrast/lightness score is associated therewith. When the second enhanced image has a poor contrast/lightness score associated therewith, this image is enhanced to generate a third enhanced image. A sharpness measure is computed for one image that is selected from (i) the non-turbid image, (ii) the first enhanced image, (iii) the second enhanced image when a good contrast/lightness score is associated therewith, and (iv) the third enhanced image. If the selected image is not sharp, it is sharpened to generate a sharpened image. The final image is selected from the selected image and the sharpened image.
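
    The routing logic above can be expressed as a small decision tree. In the Python sketch below, the measure functions are hypothetical stand-ins (the patent defines the actual contrast, lightness and sharpness measures); only the branching structure follows the abstract.

```python
# Hedged sketch of the patent's routing logic. All callables here are
# hypothetical stand-ins for the patent's contrast/lightness/sharpness
# measures and enhancement steps.

def enhancement_pipeline(image, turbid, score_ok, enhance, sharp, do_sharpen):
    """Route an image through the classify / enhance / sharpen stages."""
    if turbid(image):
        selected = enhance(image)                 # first enhanced image
    elif score_ok(image):
        selected = image                          # good score: keep original
    else:
        second = enhance(image)                   # second enhanced image
        # a still-poor score triggers one more enhancement pass (third image)
        selected = second if score_ok(second) else enhance(second)
    # final selection: sharpen only if the selected image is not sharp
    return selected if sharp(selected) else do_sharpen(selected)

# Toy image state and stand-in measures:
img = {"turbid": False, "score": 0.25, "sharp": False}
result = enhancement_pipeline(
    img,
    turbid=lambda im: im["turbid"],
    score_ok=lambda im: im["score"] >= 0.6,
    enhance=lambda im: {**im, "score": im["score"] + 0.5},
    sharp=lambda im: im["sharp"],
    do_sharpen=lambda im: {**im, "sharp": True},
)
```

    With these stand-in measures, a non-turbid, low-scoring, unsharp toy image is enhanced once and then sharpened.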

  15. Medical imaging 4

    SciTech Connect

    Loew, M. H.

    1990-01-01

    This book covers the following topics: human visual pattern recognition; fractals, rules, and segments; three-dimensional image processing; MRI; MRI and mammography; clinical applications 1; angiography; image processing systems; and an image processing poster session.

  16. What Is an Image?

    ERIC Educational Resources Information Center

    Gerber, Andrew J.; Peterson, Bradley S.

    2008-01-01

    The article aids the interpretation of an image by examining what constitutes an image. A common feature of all images is a basic physical structure that can be described with a common set of terms.

  17. Postprocessing classification images

    NASA Technical Reports Server (NTRS)

    Kan, E. P.

    1979-01-01

    Program cleans up remote-sensing maps. It can be used with existing image-processing software. Remapped images closely resemble familiar resource information maps and can replace or supplement classification images not postprocessed by this program.

  18. Imaging in interventional oncology.

    PubMed

    Solomon, Stephen B; Silverman, Stuart G

    2010-12-01

    Medical imaging in interventional oncology is used differently than in diagnostic radiology and prioritizes different imaging features. Whereas diagnostic imaging prioritizes the highest-quality imaging, interventional imaging prioritizes real-time imaging with lower radiation dose in addition to high-quality imaging. In general, medical imaging plays five key roles in image-guided therapy, and interventional oncology, in particular. These roles are (a) preprocedure planning, (b) intraprocedural targeting, (c) intraprocedural monitoring, (d) intraprocedural control, and (e) postprocedure assessment. Although many of these roles are still relatively basic in interventional oncology, as research and development in medical imaging focuses on interventional needs, it is likely that the role of medical imaging in intervention will become even more integral and more widely applied. In this review, the current status of medical imaging for intervention in oncology will be described and directions for future development will be examined.

  19. Analysis of imaging quality under the systematic parameters for thermal imaging system

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Jin, Weiqi

    2009-07-01

    Integrating a thermal imaging system with a radar system can increase target-identification range and strengthen the accuracy and reliability of detection, making such integrated systems a state-of-the-art, mainstream choice for searching for invasive targets and guarding homeland security. In operation, however, the thermal imaging system suffers from one defect: it can produce degraded images, with serious consequences for search and detection. In this paper, we study why and how these degraded images arise using wave optics, and establish a mathematical imaging model that describes the course of ray transmission. We then give special attention to the systematic parameters of the model, analysing in detail each parameter that can affect the imaging process and its respective role. Through this comprehensive analysis, we obtain detailed information about how these parameters govern the diffractive phenomena. The analytical results are confirmed by comparing experimental images with MATLAB-simulated images, and simulations based on the revised parameters show good agreement with images acquired in practice.

  20. Registration Of SAR Images With Multisensor Images

    NASA Technical Reports Server (NTRS)

    Evans, Diane L.; Burnette, Charles F.; Van Zyl, Jakob J.

    1993-01-01

    Semiautomated technique intended primarily to facilitate registration of polarimetric synthetic-aperture-radar (SAR) images with other images of same or partly overlapping terrain while preserving polarization information conveyed by SAR data. Technique generally applicable in the sense that either or both of the images to be registered may be generated by polarimetric or nonpolarimetric SAR, infrared radiometry, conventional photography, or any other applicable sensing method.

  1. Filter for biomedical imaging and image processing.

    PubMed

    Mondal, Partha P; Rajan, K; Ahmad, Imteyaz

    2006-07-01

    Image filtering techniques have numerous potential applications in biomedical imaging and image processing. The design of filters largely depends on the a priori knowledge about the type of noise corrupting the image, which makes standard filters application-specific. Widely used filters such as average, Gaussian, and Wiener reduce noisy artifacts by smoothing; however, this operation normally smooths the edges as well. On the other hand, sharpening filters enhance the high-frequency details, making the image nonsmooth. An integrated general approach to designing a finite impulse response (FIR) filter based on Hebbian learning is proposed for optimal image filtering. This algorithm exploits the interpixel correlation by updating the filter coefficients using Hebbian learning, and is made iterative for efficient learning from the neighborhood pixels. It performs optimal smoothing of the noisy image while preserving high-frequency as well as low-frequency features. Evaluation results show that the proposed FIR filter is robust under various noise distributions such as Gaussian, salt-and-pepper, and speckle noise. Furthermore, the proposed approach does not require any a priori knowledge about the type of noise. The number of unknown parameters is small, and most of these parameters are adaptively obtained from the processed image. The proposed filter is successfully applied to image reconstruction in a positron emission tomography (PET) imaging modality. The images reconstructed by the proposed algorithm are found to be superior in quality to those reconstructed by existing PET image reconstruction methodologies.
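
    The core idea of a Hebbian coefficient update can be sketched in one dimension. The fragment below is illustrative only: the paper's filter is 2-D and its learning rate, normalisation and stopping details differ, so everything here (window width, `eta`, epoch count) is an assumption.

```python
# Minimal 1-D sketch of Hebbian filter-coefficient learning (illustrative;
# not the paper's 2-D FIR design).

def hebbian_filter(signal, width=3, eta=0.01, epochs=5):
    """Adapt FIR coefficients with a Hebbian rule over a sliding window."""
    w = [1.0 / width] * width            # start as a moving-average filter
    half = width // 2
    for _ in range(epochs):
        for i in range(half, len(signal) - half):
            window = signal[i - half:i + half + 1]
            y = sum(wk * xk for wk, xk in zip(w, window))
            # Hebbian rule: strengthen each weight in proportion to
            # its input times the filter output.
            w = [wk + eta * xk * y for wk, xk in zip(w, window)]
            norm = sum(abs(wk) for wk in w)
            w = [wk / norm for wk in w]  # keep the filter's gain bounded
    return w

# Adapt the coefficients to an alternating toy signal.
w = hebbian_filter([0.0, 1.0] * 4)
```

    The interpixel (here, inter-sample) correlation enters through the window: weights grow where input and output co-vary, which is what lets the filter smooth noise while retaining correlated structure.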

  2. 36 CFR 228.102 - Leasing analyses and decisions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 2 2014-07-01 2014-07-01 false Leasing analyses and... AGRICULTURE MINERALS Oil and Gas Resources Leasing § 228.102 Leasing analyses and decisions. (a) Compliance with the National Environmental Policy Act of 1969. In analyzing lands for leasing, the...

  3. 36 CFR 228.102 - Leasing analyses and decisions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 2 2013-07-01 2013-07-01 false Leasing analyses and... AGRICULTURE MINERALS Oil and Gas Resources Leasing § 228.102 Leasing analyses and decisions. (a) Compliance with the National Environmental Policy Act of 1969. In analyzing lands for leasing, the...

  4. A computer graphics program for general finite element analyses

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.; Sawyer, L. M.

    1978-01-01

    Documentation for a computer graphics program for displays from general finite element analyses is presented. A general description of display options and detailed user instructions are given. Several plots made in structural, thermal and fluid finite element analyses are included to illustrate program options. Sample data files are given to illustrate use of the program.

  5. Tracing Success: Graphical Methods for Analysing Successful Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Joiner, Richard; Issroff, Kim

    2003-01-01

    The aim of this paper is to evaluate the use of trace diagrams for analysing collaborative problem solving. The paper describes a study where trace diagrams were used to analyse joint navigation in a virtual environment. Ten pairs of undergraduates worked together on a distributed virtual task to collect five flowers using two bees with each…

  6. Descriptive Analyses of Pediatric Food Refusal and Acceptance

    ERIC Educational Resources Information Center

    Borrero, Carrie S. W.; Woods, Julia N.; Borrero, John C.; Masler, Elizabeth A.; Lesser, Aaron D.

    2010-01-01

    Functional analyses of inappropriate mealtime behavior typically include conditions to determine if the contingent delivery of attention, tangible items, or escape reinforce food refusal. In the current investigation, descriptive analyses were conducted for 25 children who had been admitted to a program for the assessment and treatment of food…

  7. Analyses of Response-Stimulus Sequences in Descriptive Observations

    ERIC Educational Resources Information Center

    Samaha, Andrew L.; Vollmer, Timothy R.; Borrero, Carrie; Sloman, Kimberly; Pipkin, Claire St. Peter; Bourret, Jason

    2009-01-01

    Descriptive observations were conducted to record problem behavior displayed by participants and to record antecedents and consequences delivered by caregivers. Next, functional analyses were conducted to identify reinforcers for problem behavior. Then, using data from the descriptive observations, lag-sequential analyses were conducted to examine…

  8. Treatment of Pica through Multiple Analyses of Its Reinforcing Functions.

    ERIC Educational Resources Information Center

    Piazza, Cathleen C.; Fisher, Wayne W.; Hanley, Gregory P.; LeBlanc, Linda A.; Worsdell, April S.; And Others

    1998-01-01

    A study conducted functional analyses of the pica of three young children. The pica of one participant was maintained by automatic reinforcement; that of the other two was multiply-controlled by social and automatic reinforcement. Preference and treatment analyses were used to address the automatic function of the pica. (Author/CR)

  9. What can we do about exploratory analyses in clinical trials?

    PubMed

    Moyé, Lem

    2015-11-01

    The research community has alternately embraced and repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generation. Since the majority of evaluations conducted in clinical trials, with their rich data sets, are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is not based in statistical theory, but in pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we introduce an examination of the characteristics of exploratory analyses within a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from estimation-theory weaknesses. The problem appears to be a difficulty in what is actually reported as the p-value. The use of Bayes' theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that could lead to the readmission of exploratory analyses to a position of persuasive power in clinical trials.
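
    To give a flavour of the Bayesian point (this is a standard textbook calculation, not the paper's exact formulation): by Bayes' theorem, the probability that a nominally significant exploratory finding reflects a real effect depends heavily on the prior probability that the hypothesis was true, which is typically low for unplanned analyses. The prior, alpha and power values below are assumed for illustration.

```python
# Illustrative Bayes'-theorem calculation (not the paper's formulation):
# P(effect is real | p < alpha) given a prior, test size and power.

def posterior_true_positive(prior, alpha=0.05, power=0.8):
    """Posterior probability the effect is real given a significant result."""
    true_pos = power * prior            # real effects that test significant
    false_pos = alpha * (1.0 - prior)   # null effects that test significant
    return true_pos / (true_pos + false_pos)

# With only a 1-in-10 prior, typical of an unplanned exploratory look,
# a nominally significant result is far less convincing than p = 0.05 suggests:
print(round(posterior_true_positive(prior=0.10), 3))  # → 0.64
```

    A prospectively declared, well-motivated hypothesis (higher prior) yields a much larger posterior, which is one way to see why confirmatory and exploratory p-values carry different persuasive weight.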

  10. Integrated metagenomic and metaproteomic analyses of marine biofilm communities

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Metagenomic and metaproteomic analyses were utilized to begin to understand the role varying environments play on the composition and function of complex air-water interface biofilms sampled from the hulls of two ships that were deployed in different geographic waters. Prokaryotic community analyses...

  11. Restricted versus Unrestricted Learning: Synthesis of Recent Meta-Analyses

    ERIC Educational Resources Information Center

    Johnson, Genevieve

    2007-01-01

    Meta-analysis is a method of quantitatively summarizing the results of experimental research. This article summarizes four meta-analyses published since 2003 that compare the effect of DE and traditional education (TE) on student learning. Despite limitations, synthesis of these meta-analyses establish, at the very least, equivalent learning…

  12. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 5 2012-10-01 2012-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  13. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 5 2011-10-01 2011-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  14. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 5 2014-10-01 2014-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  15. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 5 2013-10-01 2013-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  16. ITEM RESPONSE ANALYSES OF THE EDUCATIONAL OPPORTUNITIES SURVEY PRINCIPAL QUESTIONNAIRE.

    ERIC Educational Resources Information Center

    MAYESKE, GEORGE W.; AND OTHERS

    THIS REPORT PRESENTS THE TABULATIONS AND ANALYSES OF RESPONSES TO EACH ITEM OF THE PRINCIPAL QUESTIONNAIRE THAT WAS ADMINISTERED AS PART OF THE EDUCATIONAL OPPORTUNITIES SURVEY. THE ITEM ANALYSES OF THESE DATA WERE CONDUCTED (1) TO PRESENT THE NUMBER AND PERCENTAGE OF ELEMENTARY AND SECONDARY SCHOOL PRINCIPALS RESPONDING TO EACH ITEM ALTERNATIVE,…

  17. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Inventory analyses. 101... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 27-INVENTORY MANAGEMENT 27.2-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall...

  18. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 41 Public Contracts and Property Management 2 2011-07-01 2007-07-01 true Inventory analyses. 101... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 27-INVENTORY MANAGEMENT 27.2-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall...

  19. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 41 Public Contracts and Property Management 2 2014-07-01 2012-07-01 true Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  20. 43 CFR 46.130 - Mitigation measures in analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 1 2012-10-01 2011-10-01 true Mitigation measures in analyses. 46.130... Mitigation measures in analyses. (a) Bureau proposed action. The analysis of the proposed action and any alternatives must include an analysis of the effects of the proposed action or alternative as well as...

  1. 43 CFR 46.130 - Mitigation measures in analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 1 2013-10-01 2013-10-01 false Mitigation measures in analyses. 46.130... Mitigation measures in analyses. (a) Bureau proposed action. The analysis of the proposed action and any alternatives must include an analysis of the effects of the proposed action or alternative as well as...

  2. 36 CFR 228.102 - Leasing analyses and decisions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 2 2012-07-01 2012-07-01 false Leasing analyses and... AGRICULTURE MINERALS Oil and Gas Resources Leasing § 228.102 Leasing analyses and decisions. (a) Compliance... Forest Service Manual chapter 1950 and Forest Service Handbook 1909.15. (b) Scheduling analysis...

  3. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 41 Public Contracts and Property Management 2 2012-07-01 2012-07-01 false Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  4. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 2 2013-07-01 2012-07-01 true Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  5. 44 CFR 1.9 - Regulatory impact analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Regulatory impact analyses. 1... HOMELAND SECURITY GENERAL RULEMAKING; POLICY AND PROCEDURES General § 1.9 Regulatory impact analyses. (a) FEMA shall, in connection with any major rule, prepare and consider a Regulatory Impact Analysis....

  6. What Can We Do About Exploratory Analyses in Clinical Trials?

    PubMed Central

    Moyé, Lem

    2015-01-01

    The research community has alternately embraced and repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generation. Since the majority of evaluations conducted in clinical trials, with their rich data sets, are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is not based in statistical theory, but in pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we introduce an examination of the characteristics of exploratory analyses within a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from estimation-theory weaknesses. The problem appears to be a difficulty in what is actually reported as the p-value. The use of Bayes' theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that could lead to the readmission of exploratory analyses to a position of persuasive power in clinical trials. PMID:26390962

  7. 36 CFR 228.102 - Leasing analyses and decisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 2 2011-07-01 2011-07-01 false Leasing analyses and... AGRICULTURE MINERALS Oil and Gas Resources Leasing § 228.102 Leasing analyses and decisions. (a) Compliance with the National Environmental Policy Act of 1969. In analyzing lands for leasing, the...

  8. Recent Trends in Conducting School-Based Experimental Functional Analyses

    ERIC Educational Resources Information Center

    Carter, Stacy L.

    2009-01-01

    Demonstrations of school-based experimental functional analyses have received limited attention within the literature. School settings present unique practical and ethical concerns related to the implementation of experimental analyses which were originally developed within clinical settings. Recent examples have made definite contributions toward…

  9. Rational Analyses of Information Foraging on the Web

    ERIC Educational Resources Information Center

    Pirolli, Peter

    2005-01-01

    This article describes rational analyses and cognitive models of Web users developed within information foraging theory. This is done by following the rational analysis methodology of (a) characterizing the problems posed by the environment, (b) developing rational analyses of behavioral solutions to those problems, and (c) developing cognitive…

  10. Training Residential Staff to Conduct Trial-Based Functional Analyses

    ERIC Educational Resources Information Center

    Lambert, Joseph M.; Bloom, Sarah E.; Kunnavatana, S. Shanun; Collins, Shawnee D.; Clay, Casey J.

    2013-01-01

    We taught 6 supervisors of a residential service provider for adults with developmental disabilities to train 9 house managers to conduct trial-based functional analyses. Effects of the training were evaluated with a nonconcurrent multiple baseline. Results suggest that house managers can be trained to conduct trial-based functional analyses with…

  11. Restoration Of MEX SRC Images For Improved Topography: A New Image Product

    NASA Astrophysics Data System (ADS)

    Duxbury, T. C.

    2012-12-01

    Surface topography is an important constraint when investigating the evolution of solar system bodies. Topography is typically obtained from stereo photogrammetric or photometric (shape-from-shading) analyses of overlapping / stereo images and from laser / radar altimetry data. The ESA Mars Express Mission [1] carries a Super Resolution Channel (SRC) as part of the High Resolution Stereo Camera (HRSC) [2]. The SRC can build up overlapping / stereo coverage of Mars, Phobos and Deimos by viewing the surfaces from different orbits. The derivation of high-precision topography data from the raw SRC images is degraded because the camera is out of focus: the point spread function (PSF) is multi-peaked, covering tens of pixels. After registering and co-adding hundreds of star images, an accurate SRC PSF was reconstructed and is being used to restore the SRC images to near blur-free quality. The restored images offer roughly a threefold improvement in geometric accuracy and reveal the smallest features, significantly improving the stereo photogrammetric accuracy of the resulting digital elevation models. The difference between blurred and restored images provides a new derived image product that improves feature recognition, increasing the spatial resolution and topographic accuracy of derived elevation models. Acknowledgements: This research was funded by the NASA Mars Express Participating Scientist Program. [1] Chicarro, et al., ESA SP 1291 (2009) [2] Neukum, et al., ESA SP 1291 (2009). A raw SRC image (h4235.003) of a Martian crater within Gale crater (the MSL landing site) is shown in the upper left and the restored image is shown in the lower left. A raw image (h0715.004) of Phobos is shown in the upper right and the difference between the raw and restored images, a new derived image data product, is shown in the lower right. The lower images, resulting from an image restoration process, significantly improve feature recognition for improved derived
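
    The PSF-estimation idea (registering and co-adding many star images so the noise averages out while the blur pattern reinforces) can be sketched in one dimension. The profiles and integer-centroid registration below are illustrative assumptions, not the mission's processing.

```python
# Hedged 1-D sketch of PSF estimation by registering star profiles on their
# centroids and averaging (illustrative; the SRC work is 2-D with sub-pixel
# registration over hundreds of stars).

def centroid(profile):
    """Intensity-weighted mean position of a 1-D profile."""
    total = sum(profile)
    return sum(i * v for i, v in enumerate(profile)) / total

def coadd(profiles, size=5):
    """Shift each profile so its (integer) centroid is centred, then average."""
    acc = [0.0] * size
    centre = size // 2
    for p in profiles:
        shift = centre - round(centroid(p))
        for i in range(size):
            j = i - shift
            if 0 <= j < len(p):
                acc[i] += p[j]
    n = len(profiles)
    return [v / n for v in acc]

stars = [[0, 1, 4, 1, 0],
         [1, 4, 1, 0, 0],   # same blur, offset by one pixel
         [0, 0, 1, 4, 1]]   # same blur, offset the other way
print(coadd(stars))  # → [0.0, 1.0, 4.0, 1.0, 0.0]
```

    Once estimated, the PSF is what a deconvolution step (Richardson–Lucy or similar, as an example of the class of method) would use to restore the blurred images.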

  12. Noninvasive Imaging of Experimental Lung Fibrosis

    PubMed Central

    Chen, Huaping; Ambalavanan, Namasivayam; Liu, Gang; Antony, Veena B.; Ding, Qiang; Nath, Hrudaya; Eary, Janet F.; Thannickal, Victor J.

    2015-01-01

    Small animal models of lung fibrosis are essential for unraveling the molecular mechanisms underlying human fibrotic lung diseases; additionally, they are useful for preclinical testing of candidate antifibrotic agents. The current end-point measures of experimental lung fibrosis involve labor-intensive histological and biochemical analyses. These measures fail to account for dynamic changes in the disease process in individual animals and are limited by the need for large numbers of animals for longitudinal studies. The emergence of noninvasive imaging technologies provides exciting opportunities to image lung fibrosis in live animals as often as needed and to longitudinally track the efficacy of novel antifibrotic compounds. Data obtained by noninvasive imaging provide complementary information to histological and biochemical measurements. In addition, the use of noninvasive imaging in animal studies reduces animal usage, thus satisfying animal welfare concerns. In this article, we review these new imaging modalities with the potential for evaluation of lung fibrosis in small animal models. Such techniques include micro-computed tomography (micro-CT), magnetic resonance imaging, positron emission tomography (PET), single photon emission computed tomography (SPECT), and multimodal imaging systems including PET/CT and SPECT/CT. It is anticipated that noninvasive imaging will be increasingly used in animal models of fibrosis to gain insights into disease pathogenesis and as preclinical tools to assess drug efficacy. PMID:25679265

  13. Quantum-secured imaging

    NASA Astrophysics Data System (ADS)

    Malik, Mehul; Magaña-Loaiza, Omar S.; Boyd, Robert W.

    2012-12-01

    We have built an imaging system that uses a photon's position or time-of-flight information to image an object, while using the photon's polarization for security. This ability allows us to obtain an image which is secure against an attack in which the object being imaged intercepts and resends the imaging photons with modified information. Popularly known as "jamming," this type of attack is commonly directed at active imaging systems such as radar. In order to jam our imaging system, the object must disturb the delicate quantum state of the imaging photons, thus introducing statistical errors that reveal its activity.

  14. Far Ultraviolet Imaging from the Image Spacecraft

    NASA Technical Reports Server (NTRS)

    Mende, S. B.; Heetderks, H.; Frey, H. U.; Lampton, M.; Geller, S. P.; Stock, J. M.; Abiad, R.; Siegmund, O. H. W.; Tremsin, A. S.; Habraken, S.

    2000-01-01

    Direct imaging of the magnetosphere by the IMAGE spacecraft will be supplemented by observation of the global aurora. The IMAGE satellite instrument complement includes three Far Ultraviolet (FUV) instruments. The Wideband Imaging Camera (WIC) will provide broadband ultraviolet images of the aurora at maximum spatial and temporal resolution by imaging the LBH N2 bands of the aurora. The Spectrographic Imager (SI), a novel form of monochromatic imager, will image the aurora filtered by wavelength. The proton-induced component of the aurora will be imaged separately by measuring the Doppler-shifted Lyman-α line. Finally, the GEO instrument will observe the distribution of the geocoronal emission to obtain the neutral background density source for charge exchange in the magnetosphere. The FUV instrument complement looks radially outward from the rotating IMAGE satellite and therefore spends only a short time observing the aurora and the Earth during each spin. To maximize photon collection efficiency and make efficient use of the short time available for exposures, the FUV auroral imagers WIC and SI both have wide fields of view and take data continuously as the auroral region proceeds through the field of view. To minimize data volume, the set of multiple images is electronically co-added by suitably shifting each image to compensate for the spacecraft rotation. To minimize resolution loss, the images have to be distortion-corrected in real time. The distortion correction is accomplished by the on-orbit processor using high-speed lookup tables pre-generated by least-squares fitting to polynomial functions. The instruments were calibrated individually while on stationary platforms, mostly in vacuum chambers. Extensive ground-based testing was performed with visible and near-UV simulators mounted on a rotating platform to emulate performance on a rotating spacecraft.
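The shift-and-co-add step can be sketched as follows. This is a simplified NumPy illustration with integer pixel shifts; the flight software's distortion-correcting lookup tables are omitted, and the function name is an assumption.

```python
import numpy as np

def coadd(frames, shifts):
    """Electronically co-add exposures by shifting each frame to undo
    the known per-exposure offset from spacecraft rotation, then summing.

    frames: list of 2-D arrays; shifts: per-frame (row, col) offsets
    of the scene relative to the first exposure.
    """
    acc = np.zeros(frames[0].shape, dtype=float)
    for frame, (dr, dc) in zip(frames, shifts):
        # roll by the negative offset so the scene lands back in place
        acc += np.roll(frame, shift=(-dr, -dc), axis=(0, 1))
    return acc
```

A point source that drifts across successive exposures stacks into a single bright pixel, which is the photon-collection benefit the abstract describes.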

  15. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.
    Program summary
    Title of program: HAWGC
    Catalogue identifier: ADXG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers: Mobile Intel Pentium III, AMD Duron
    Installations: No installation necessary; executable file together with necessary files for the LabVIEW Run-time engine
    Operating systems or monitors under which the program has been tested: Windows ME/2000/XP
    Programming language used: LabVIEW 7.0
    Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
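The reported descriptors follow standard definitions. The sketch below is an illustrative NumPy re-implementation, not the HAWGC LabVIEW code; the function name, parameter names, and default coefficients are assumptions. It converts an RGB array to greyscale with selectable coefficients, applies the lower/upper threshold window, and computes the histogram statistics.

```python
import numpy as np

def histogram_stats(rgb, weights=(0.299, 0.587, 0.114), lo=0, hi=255):
    """Greyscale conversion plus histogram descriptors of an RGB image."""
    # intensity-linear greyscale via selectable conversion coefficients
    grey = np.tensordot(rgb.astype(float), np.asarray(weights), axes=([-1], [0]))
    # keep only pixels inside the lower/upper threshold window
    grey = grey[(grey >= lo) & (grey <= hi)]
    mean = grey.mean()
    std = grey.std()
    centred = grey - mean
    return {
        "mean": mean,
        "std": std,
        "variance": grey.var(),
        "min": grey.min(),
        "max": grey.max(),
        "skewness": (centred ** 3).mean() / std ** 3,
        "kurtosis": (centred ** 4).mean() / std ** 4,  # Pearson (normal = 3)
        "median": np.median(grey),
    }
```

A symmetric two-level intensity distribution, for example, yields zero skewness and a kurtosis of exactly 2.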

  16. Analysis of a multisensor image data set of south San Rafael Swell, Utah

    NASA Technical Reports Server (NTRS)

    Evans, D. L.

    1982-01-01

    A Shuttle Imaging Radar (SIR-A) image of the southern portion of the San Rafael Swell in Utah has been digitized and coregistered with Landsat, Seasat, and HCMM thermal inertia images. The addition of the SIR-A image to the registered data set improves rock-type discrimination in both qualitative and quantitative analyses. Sedimentary units that cannot be distinguished in either image alone can be separated in a combined SIR-A/Seasat image. Discriminant analyses show that classification accuracy improves when the SIR-A image is added to the Landsat images, and improves further when texture information from the Seasat and SIR-A images is included.
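The benefit of stacking a coregistered band can be illustrated with a toy nearest-class-mean classifier on synthetic data (not the actual Utah scenes; this is a minimal stand-in for the discriminant analyses used in the study): two rock units with identical signatures in one band become separable once a second band is added to the feature vector.

```python
import numpy as np

def nearest_mean_classify(train, labels, pixels):
    """Assign each pixel feature vector to the class with the nearest mean."""
    classes = np.unique(labels)
    means = np.array([train[labels == c].mean(axis=0) for c in classes])
    dist = ((pixels[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    return classes[dist.argmin(axis=1)]
```

With an "optical" band alone the class means coincide and half the pixels are misassigned; stacking a "radar" band onto the same pixels makes the classes fully separable.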

  17. Image processing and recognition for biological images

    PubMed Central

    Uchida, Seiichi

    2013-01-01

    This paper reviews image processing and pattern recognition techniques that are useful for analyzing bioimages. Although it does not cover technical details, it conveys the main tasks in each area and the typical tools used to handle them. Image processing is a large research area concerned with improving the visibility of an input image and extracting valuable information from it; as its main tasks, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition, the technique of classifying an input image into one of a set of predefined classes, is likewise a large research area; this paper overviews its two main modules, namely the feature extraction module and the classification module. Throughout the paper, it is emphasized that bioimages are a difficult target even for state-of-the-art image processing and pattern recognition techniques, owing to noise, deformations, etc. The paper is intended as a tutorial guide to bridge biology and image processing researchers for further collaboration on this difficult target. PMID:23560739
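As a concrete instance of one task listed above, binarization is commonly done with Otsu's method, which scans all candidate thresholds and picks the one maximizing the between-class variance of the resulting foreground/background split. The sketch below is illustrative and not taken from the paper.

```python
import numpy as np

def otsu_threshold(grey):
    """Return the Otsu threshold for an 8-bit greyscale array."""
    hist, _ = np.histogram(grey, bins=256, range=(0, 256))
    p = hist / hist.sum()          # normalized histogram
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()   # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        m0 = (levels[:t] * p[:t]).sum() / w0  # class means
        m1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2        # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t
```

Binarization is then simply `grey >= otsu_threshold(grey)`; on a clean bimodal image the threshold lands between the two intensity peaks.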

  19. Imaging Sciences Workshop Proceedings

    SciTech Connect

    Candy, J.V.

    1996-11-21

    This report contains the proceedings of the Imaging Sciences Workshop sponsored by C.A.S.I.S., the Center for Advanced Signal & Image Sciences. The Center, established primarily to provide a forum where researchers can freely exchange ideas on the signal and image sciences in a comfortable intellectual environment, has grown over the last two years with the opening of a Reference Library (located in Building 272). The Technical Program for the 1996 Workshop included a variety of efforts in the imaging sciences, including applications in Microwave Imaging highlighted by the Micro-Impulse Radar (MIR) system invented at LLNL, as well as other applications in this area. Special sessions on Speech, Acoustic Ocean Imaging, Radar Ocean Imaging, Ultrasonic Imaging, and Optical Imaging discussed applications to real-world problems. For the more theoretical, sessions on Imaging Algorithms and Computed Tomography were organized, and for the more pragmatic, a session on Imaging Systems was featured.

  20. Image management research

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1988-01-01

    Two types of research issues are involved in image management systems with space station applications: image processing research and image perception research. The image processing issues are the traditional ones of digitizing, coding, compressing, storing, analyzing, and displaying, but with a new emphasis on the constraints imposed by the human perceiver. Two image coding algorithms have been developed that may increase the efficiency of image management systems (IMS). Image perception research involves a study of the theoretical and practical aspects of visual perception of electronically displayed images. Issues include how rapidly a user can search through a library of images, how to make this search more efficient, and how to present images in terms of resolution and split screens. Other issues include optimal interface to an IMS and how to code images in a way that is optimal for the human perceiver. A test-bed within which such issues can be addressed has been designed.