Science.gov

Sample records for image analyses monitoracao

  1. RADTI: regression analyses of diffusion tensor images

    NASA Astrophysics Data System (ADS)

    Li, Yimei; Zhu, Hongtu; Chen, Yasheng; Ibrahim, Joseph G.; An, Hongyu; Lin, Weili; Hall, Colin; Shen, Dinggang

    2009-02-01

    Diffusion tensor imaging (DTI) is a powerful tool for quantitatively assessing the integrity of anatomical connectivity in white matter in clinical populations. The prevalent methods for group-level analysis of DTI are statistical analyses of invariant measures (e.g., fractional anisotropy) and principal directions across groups. The invariant measures and principal directions, however, do not capture all of the information in the full diffusion tensor, which can decrease the statistical power of DTI in detecting subtle changes in white matter. It is therefore desirable to develop new statistical methods for analyzing full diffusion tensors. In this paper, we develop a toolbox, called RADTI, for the analysis of full diffusion tensors as responses and for establishing their association with a set of covariates. The key idea is to use the recently developed log-Euclidean metric to transform diffusion tensors, which lie in a nonlinear space, into their matrix logarithms in a Euclidean space. Our regression model is a semiparametric model, which avoids specific parametric assumptions. We develop an estimation procedure and a test procedure based on score statistics and a resampling method to simultaneously assess the statistical significance of linear hypotheses across a large region of interest. Monte Carlo simulations are used to examine the finite-sample performance of the test procedure in controlling the family-wise error rate. We apply our methods to the detection of statistically significant diagnostic and age effects on the integrity of white matter in a diffusion tensor study of human immunodeficiency virus.
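
    The central transformation (mapping each symmetric positive-definite tensor to its matrix logarithm so that ordinary regression applies in the resulting Euclidean space) can be sketched briefly. The snippet below is a minimal NumPy illustration of that log-Euclidean step with an element-wise least-squares fit, not the RADTI toolbox or its semiparametric score test; the covariates and simulated tensors are hypothetical.

```python
# Illustrative sketch (not the RADTI toolbox itself): regress the matrix
# logarithm of a diffusion tensor on covariates, following the log-Euclidean
# idea of mapping SPD tensors into a Euclidean space before regression.
import numpy as np

def tensor_log(D):
    """Matrix logarithm of a symmetric positive-definite 3x3 tensor."""
    w, V = np.linalg.eigh(D)
    return V @ np.diag(np.log(w)) @ V.T

def to_vector(L):
    """Six unique entries of a symmetric 3x3 matrix."""
    i, j = np.triu_indices(3)
    return L[i, j]

rng = np.random.default_rng(0)
n = 50
# Hypothetical covariates: intercept, age, diagnosis (0/1).
X = np.column_stack([np.ones(n), rng.normal(40, 10, n), rng.integers(0, 2, n)])

# Simulate one voxel's tensors: a baseline tensor perturbed per subject.
base = np.diag([1.5e-3, 0.5e-3, 0.5e-3])
Y = []
for k in range(n):
    noise = 1e-4 * rng.normal(size=(3, 3))
    D = base + 0.5 * (noise + noise.T) + 1e-3 * np.eye(3)  # keep it SPD
    Y.append(to_vector(tensor_log(D)))
Y = np.array(Y)                      # n x 6 log-tensor responses

# Element-wise least-squares fit of the 6 log-tensor components on X.
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("coefficient matrix (3 covariates x 6 tensor components):")
print(beta.round(4))
```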

  2. Analyses for Multistatic Geometric Image Correction.

    DTIC Science & Technology

    1980-02-01

    bistatic angle is very small). A means for achieving the appropriate data formatting has been devised by AIL. The method is referred to as Coherent...direction changes at twice the rate of the bistatic angle bisector direction change. By the use of this backward approach, only the desired set of...scene from below the printout, that is, at an angle of 90 degrees as defined in Figure 1A. With proper processing the image should be invariant with

  3. ["When the ad is good, the product is sold." The MonitorACAO Project and drug advertising in Brazil].

    PubMed

    Soares, Jussara Calmon Reis de Souza

    2008-04-01

    This paper presents an analysis of drug advertising in Brazil, based on the final report of the MonitorACAO Project by the group from the Universidade Federal Fluminense, Niterói, Rio de Janeiro. Through a partnership between the university and the National Agency for Health Surveillance (ANVISA), drug advertisements were monitored and analyzed for one year, according to the methodology defined by the Agency. The samples were collected in medical practices and hospitals, drugstores, pharmacies and scientific magazines; TV and radio programs were monitored in the case of OTC drugs. Of the 263 irregular advertisements analyzed between October 2004 and August 2005, 159 referring to pharmaceuticals were sent to ANVISA. The main problems found were the poor quality of drug information provided to health professionals and the promotion of misleading drug use to the lay population. Based on the results of this project and on other studies, the banning of drug advertising in Brazil is proposed.

  4. Phase contrast image segmentation using a Laue analyser crystal

    NASA Astrophysics Data System (ADS)

    Kitchen, Marcus J.; Paganin, David M.; Uesugi, Kentaro; Allison, Beth J.; Lewis, Robert A.; Hooper, Stuart B.; Pavlov, Konstantin M.

    2011-02-01

    Dual-energy x-ray imaging is a powerful tool enabling two-component samples to be separated into their constituent objects from two-dimensional images. Phase contrast x-ray imaging can render the boundaries between media of differing refractive indices visible, despite them having similar attenuation properties; this is important for imaging biological soft tissues. We have used a Laue analyser crystal and a monochromatic x-ray source to combine the benefits of both techniques. The Laue analyser creates two distinct phase contrast images that can be simultaneously acquired on a high-resolution detector. These images can be combined to separate the effects of x-ray phase, absorption and scattering and, using the known complex refractive indices of the sample, to quantitatively segment its component materials. We have successfully validated this phase contrast image segmentation (PCIS) using a two-component phantom, containing an iodinated contrast agent, and have also separated the lungs and ribcage in images of a mouse thorax. Simultaneous image acquisition has enabled us to perform functional segmentation of the mouse thorax throughout the respiratory cycle during mechanical ventilation.

  5. Analyser-based phase contrast image reconstruction using geometrical optics.

    PubMed

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
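
    For readers wanting a concrete picture of the curve-fitting step, the sketch below fits a symmetric Pearson type VII profile to a synthetic rocking curve with SciPy. The parameterization (with the 2^(1/m) - 1 factor so that w is the half-width at half-maximum) is one common form and may differ in detail from the paper's; the data are simulated, not the ELETTRA measurements.

```python
# A minimal sketch of fitting a symmetric Pearson type VII profile to an
# analyser rocking curve with SciPy; the sample data below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def pearson_vii(theta, amp, theta0, w, m):
    """Symmetric Pearson VII: peak amp at theta0, half-width w, shape exponent m."""
    return amp * (1.0 + (2.0**(1.0 / m) - 1.0) * ((theta - theta0) / w) ** 2) ** (-m)

# Synthetic rocking curve (angle in microradians) with a little noise.
theta = np.linspace(-30, 30, 121)
rng = np.random.default_rng(1)
truth = pearson_vii(theta, 1.0, 0.0, 8.0, 1.8)
data = truth + 0.01 * rng.normal(size=theta.size)

# Initial guesses: peak height, centre, half-width, shape.
p0 = [data.max(), theta[np.argmax(data)], 5.0, 2.0]
popt, pcov = curve_fit(pearson_vii, theta, data, p0=p0)
print("fitted (amp, theta0, w, m):", np.round(popt, 3))

# The fitted analytic curve and its derivative can then be evaluated on the
# rocking-curve slopes for two-image phase retrieval.
```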

  6. Analyser-based x-ray imaging for biomedical research

    NASA Astrophysics Data System (ADS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-12-01

    Analyser-based imaging (ABI) is one of several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory origins to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with x-ray intensity comparable to that of the current synchrotron environment.

  7. Colony image acquisition and genetic segmentation algorithm and colony analyses

    NASA Astrophysics Data System (ADS)

    Wang, W. X.

    2012-01-01

    Colony analysis is used in many fields, such as food, dairy, beverages, hygiene, environmental monitoring, water, toxicology and sterility testing. In order to reduce labor and increase analysis accuracy, many researchers and developers have worked on image analysis systems. The main problems in such systems are image acquisition, image segmentation and image analysis. In this paper, to acquire colony images of good quality, an illumination box was constructed in which the distances between the lights and the dish, between the camera lens and the lights, and between the camera lens and the dish are adjusted optimally. Image segmentation is based on a genetic approach that allows the segmentation problem to be treated as a global optimization. After image pre-processing and image segmentation, the colony analyses are performed. The colony image analysis consists of (1) basic colony parameter measurements; (2) colony size analysis; (3) colony shape analysis; and (4) colony surface measurements. All of the above visual colony parameters can be selected and combined to form new engineering parameters. The colony analysis can be applied to different applications.
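
    The phrase "segmentation as a global optimization" can be made concrete with a toy example. The sketch below runs a small genetic algorithm whose genome is a single grey-level threshold and whose fitness is the between-class variance of the resulting split; the paper's genetic formulation is richer than this, and the test image is synthetic.

```python
# Minimal sketch of treating segmentation as a global optimization with a
# genetic algorithm: the "genome" is one grey-level threshold and the fitness
# is Otsu-style between-class variance.
import numpy as np

def fitness(threshold, image):
    """Between-class variance of a foreground/background split."""
    fg, bg = image[image >= threshold], image[image < threshold]
    if fg.size == 0 or bg.size == 0:
        return 0.0
    w_fg, w_bg = fg.size / image.size, bg.size / image.size
    return w_fg * w_bg * (fg.mean() - bg.mean()) ** 2

def ga_threshold(image, pop_size=20, generations=40, rng=None):
    rng = rng or np.random.default_rng(0)
    pop = rng.uniform(image.min(), image.max(), pop_size)
    for _ in range(generations):
        scores = np.array([fitness(t, image) for t in pop])
        # Selection: keep the better half of the population.
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        # Crossover (averaging random pairs) and mutation (Gaussian perturbation).
        children = (rng.choice(parents, pop_size // 2) +
                    rng.choice(parents, pop_size // 2)) / 2
        children += rng.normal(0, 0.02 * np.ptp(image), children.size)
        pop = np.concatenate([parents, children])
    return pop[np.argmax([fitness(t, image) for t in pop])]

# Synthetic "colony" image: bright blobs on a darker background.
rng = np.random.default_rng(2)
img = rng.normal(0.3, 0.05, (128, 128))
img[40:60, 40:60] += 0.5
img[80:100, 20:35] += 0.5
t = ga_threshold(img)
print("GA threshold:", round(float(t), 3), "segmented pixels:", int((img >= t).sum()))
```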

  8. Automated coregistration and statistical analyses of SPECT brain images

    SciTech Connect

    Gong, W.; Devous, M.D.

    1994-05-01

    Statistical analyses of SPECT image data often require highly accurate image coregistration. Several image coregistration algorithms have been developed. The Pellizari algorithm (PA) uses the Powell technique to estimate transformation parameters between the "head" (model) and "hat" (images to be registered). Image normalization and good initial transformation parameters heavily affect the accuracy and speed of convergence of the PA. We have explored various normalization methods and found a simple technique that avoids most artificial edge effects and minimizes blurring of useful edges. We have tested the effects on accuracy and convergence speed of the PA caused by different initial transformation parameters. From these data, a modified PA was integrated into an automated coregistration system for SPECT brain images on the PRISM 3000S under X Windows. The system yields an accuracy of approximately 2 mm between model and registered images, and employs minimal user intervention through a simple graphic user interface. Data are automatically resliced, normalized and coregistered, with the user choosing only the slice range for inclusion and two initial transformation parameters (under computer-aided guidance). Coregistration is accomplished (converges) in approximately 8 min for a 128 x 128 x 128 set of 2 mm³ voxels. The complete process (editing, reslicing, normalization, coregistration) takes about 20 min. We have also developed automated 3-dimensional parametric images ("t", "z", and subtraction images) from coregistered data sets for statistical analyses. Data are compared against a coregistered normal control group (N = 50) distributed in age and gender for matching against subject samples.
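
    A rough feel for Powell-driven coregistration is given below: a 2-D translation-plus-rotation is recovered by minimizing the mean squared intensity difference with SciPy's Powell method. This is only an illustrative stand-in for the head-and-hat approach described above (which matches surfaces rather than intensities and works in 3-D); the images are synthetic.

```python
# Minimal sketch of rigid coregistration driven by Powell's method
# (2-D translation plus rotation for brevity; real SPECT data are 3-D).
import numpy as np
from scipy import ndimage, optimize

def transform(img, params):
    """Rotate by an angle (degrees) about the centre, then translate."""
    angle, dx, dy = params
    out = ndimage.rotate(img, angle, reshape=False, order=1)
    return ndimage.shift(out, (dy, dx), order=1)

def cost(params, fixed, moving):
    """Mean squared intensity difference between fixed and transformed moving image."""
    return np.mean((fixed - transform(moving, params)) ** 2)

# Synthetic example: the "moving" image is a rotated, shifted copy of "fixed".
fixed = np.zeros((64, 64))
fixed[20:40, 25:45] = 1.0
fixed = ndimage.gaussian_filter(fixed, 2)
moving = transform(fixed, (8.0, 3.0, -2.0))

# Powell search from a rough initial guess (as in the text, good initial
# transformation parameters matter for speed and accuracy).
res = optimize.minimize(cost, x0=[0.0, 0.0, 0.0], args=(fixed, moving),
                        method="Powell")
print("estimated (angle, dx, dy):", np.round(res.x, 2), " residual:", round(res.fun, 6))
```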

  9. Cartographic analyses of geographic information available on Google Earth Images

    NASA Astrophysics Data System (ADS)

    Oliveira, J. C.; Ramos, J. R.; Epiphanio, J. C.

    2011-12-01

    The purpose was to evaluate the planimetric accuracy of satellite images available in the Google Earth database. These images cover the vicinity of the Federal University of Viçosa, Minas Gerais, Brazil. The methodology evaluated the geographic information of three groups of images, defined according to the level of detail presented on screen (zoom). These groups were labeled Zoom 1000 (a single image for the entire study area), Zoom 100 (a mosaic of 73 images) and Zoom 100 with geometric correction (the same mosaic after a geometric correction applied through control points). For each group of images, cartographic accuracy was assessed with statistical analyses and the parameters of Brazilian law on planimetric mapping. For this evaluation, 22 points were identified in each group of images, and the coordinates of each point were compared with the field coordinates obtained by GPS (Global Positioning System). Table 1 shows the results for accuracy (based on a threshold equal to 0.5 mm x the mapping scale) and tendency (abscissa and ordinate) between the image coordinates and the field coordinates. The geometric correction applied to the Zoom 100 group reduced the trends identified earlier, and the statistical tests indicated that the data are usable for mapping at a scale of 1/5000 with an error smaller than 0.5 mm x scale. The analyses demonstrated the quality of the cartographic data provided by Google, as well as the possibility of reducing the positioning discrepancies present in the data. It can be concluded that it is possible to obtain geographic information from the database available on Google Earth; however, the level of detail (zoom) used at the time of viewing and capturing information on the screen influences the cartographic quality of the mapping. Despite the cartographic and thematic potential present in the database, it is important to note that both the software
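
    The acceptance test itself is simple arithmetic, sketched below: planimetric errors between image-derived and GPS coordinates are summarized and compared with a tolerance of 0.5 mm at map scale. The coordinate values and the 1:5000 scale below are illustrative only, and the full Brazilian standard also includes tendency (bias) tests that are omitted here.

```python
# Sketch of the planimetric accuracy check: compare image-derived coordinates
# with GPS field coordinates against a tolerance of 0.5 mm x map scale.
import numpy as np

scale_denominator = 5000                    # candidate mapping scale 1:5000
tolerance_m = 0.5e-3 * scale_denominator    # 0.5 mm at map scale = 2.5 m on the ground

# Hypothetical check points: (E, N) from the image and from GPS, in metres.
image_xy = np.array([[722010.4, 7702315.2], [722130.9, 7702290.7], [722095.1, 7702410.3]])
gps_xy   = np.array([[722011.1, 7702314.0], [722129.5, 7702291.9], [722096.0, 7702409.1]])

errors = np.linalg.norm(image_xy - gps_xy, axis=1)   # planimetric error per point
rmse = np.sqrt(np.mean(errors ** 2))
print(f"per-point errors (m): {np.round(errors, 2)}")
print(f"RMSE = {rmse:.2f} m; tolerance at 1:{scale_denominator} = {tolerance_m:.2f} m")
print("meets tolerance" if rmse <= tolerance_m else "fails tolerance")
```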

  10. Solid Hydrogen Experiments for Atomic Propellants: Image Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2002-01-01

    This paper presents the results of detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium. Solid particles of hydrogen were frozen in liquid helium and observed with a video camera. The solid hydrogen particle sizes, their agglomerates, and the total mass of hydrogen particles were estimated. Particle sizes of 1.9 to 8 mm (0.075 to 0.315 in.) were measured. The particle agglomerate sizes and areas were measured, and the total mass of solid hydrogen was computed. A total mass of 0.22 to 7.9 grams of hydrogen was frozen. Compaction and expansion of the agglomerate implied that the particles remain independent particles, and can be separated and controlled. These experiment image analyses are one of the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  11. A Guide to Analysing Tongue Motion from Ultrasound Images

    ERIC Educational Resources Information Center

    Stone, Maureen

    2005-01-01

    This paper is meant to be an introduction to and general reference for ultrasound imaging for new and moderately experienced users of the instrument. The paper consists of eight sections. The first explains how ultrasound works, including beam properties, scan types and machine features. The second section discusses image quality, including the…

  13. Imaging data analyses for hazardous waste applications. Final report

    SciTech Connect

    David, N.; Ginsberg, I.W.

    1995-12-01

    The paper presents some examples of the use of remote sensing products for the characterization of hazardous waste sites. The sites are located at the Los Alamos National Laboratory (LANL), where materials associated with past weapons testing are buried. Problems of interest include delineation of strata for soil sampling, detection and delineation of buried trenches containing contaminants, seepage from capped areas and old septic drain fields, and location of faults and fractures relative to hazardous waste areas. Merging site maps and other geographic information with imagery was found by site managers to produce useful products. Merging hydrographic and soil contaminant data aided soil sampling strategists. Overlays of suspected trench locations on multispectral and thermal images showed correlation between image signatures and trenches. Overlays of engineering drawings on recent and historical photos showed errors in trench location and extent. A thermal image showed warm anomalies suspected to be areas of water seepage through an asphalt cap. Overlays of engineering drawings on multispectral and thermal images showed correlation between image signatures and drain fields. Analysis of aerial photography and spectral signatures of faults/fractures improved geologic maps of mixed waste areas.

  14. Surveying and benchmarking techniques to analyse DNA gel fingerprint images.

    PubMed

    Heras, Jónathan; Domínguez, César; Mata, Eloy; Pascual, Vico

    2016-11-01

    DNA fingerprinting is a genetic typing technique that allows the analysis of the genomic relatedness between samples and the comparison of DNA patterns. The analysis of DNA gel fingerprint images usually consists of five consecutive steps: image pre-processing, lane segmentation, band detection, normalization and fingerprint comparison. In this article, we first survey the main methods that have been applied in the literature at each of these stages. Second, we focus on lane-segmentation and band-detection algorithms, as they are the steps that usually require user intervention, and identify the seven core algorithms used for both tasks. Subsequently, we present a benchmark that includes a data set of images, the gold standards associated with those images, and the tools to measure the performance of lane-segmentation and band-detection algorithms. Finally, we implement the core algorithms used for both lane segmentation and band detection and evaluate their performance using our benchmark. We conclude from that study that the average profile algorithm is the best starting point for lane segmentation and band detection.
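
    The "average profile" idea singled out in the conclusion is easy to illustrate: collapse the gel image onto its horizontal axis and treat local maxima of the averaged column profile as lane centres. The sketch below does exactly that on a synthetic gel; real pipelines add background correction, smoothing and per-lane band detection.

```python
# Minimal sketch of average-profile lane segmentation on a synthetic gel image.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(3)
height, width, n_lanes = 200, 300, 5
gel = rng.normal(0.05, 0.01, (height, width))
lane_centres_true = np.linspace(40, width - 40, n_lanes).astype(int)
for c in lane_centres_true:                 # paint bright vertical lanes
    gel[:, c - 8:c + 8] += 0.4

profile = gel.mean(axis=0)                  # average intensity per column
peaks, _ = find_peaks(profile, distance=20, prominence=0.1)
print("true lane centres:     ", lane_centres_true.tolist())
print("detected lane centres: ", peaks.tolist())
```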

  15. The challenges of analysing blood stains with hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Kuula, J.; Puupponen, H.-H.; Rinta, H.; Pölönen, I.

    2014-06-01

    Hyperspectral imaging is a potential noninvasive technology for detecting, separating and identifying various substances. In forensic and military medicine and other CBRNE-related use, it could be a potential method for analyzing blood and for scanning other human-based fluids. For example, it would be valuable to easily detect whether traces of blood are from one or more persons, or whether there are irrelevant substances or anomalies in the blood. This article presents an experiment on separating four persons' blood stains on a white cotton fabric with a SWIR hyperspectral camera and an FT-NIR spectrometer. Each tested sample consists of a standardized 75 μl of 100% blood. The results suggest that, on the basis of the amount of erythrocytes in the blood, different people's blood might be separable by hyperspectral analysis. And, referring to the indication given by the erythrocytes, it might also be possible to find other traces in the blood. However, these assumptions need to be verified with wider tests, as the number of samples in the study was small. According to the study, there also seem to be several biological, chemical and physical factors that affect, individually and together, the hyperspectral analysis results of blood on fabric textures, and these factors need to be considered before drawing any further conclusions on the analysis of blood on various materials.

  16. Integrating Medical Imaging Analyses through a High-throughput Bundled Resource Imaging System.

    PubMed

    Covington, Kelsie; Welch, E Brian; Jeong, Ha-Kyu; Landman, Bennett A

    2011-01-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data provenance, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists.

  17. Integrating medical imaging analyses through a high-throughput bundled resource imaging system

    NASA Astrophysics Data System (ADS)

    Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.

    2011-03-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data provenance, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists.

  18. Integrating Medical Imaging Analyses through a High-throughput Bundled Resource Imaging System

    PubMed Central

    Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.

    2011-01-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data provenance, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists. PMID:21841899

  19. Advances in automated 3-D image analyses of cell populations imaged by confocal microscopy.

    PubMed

    Ancin, H; Roysam, B; Dufresne, T E; Chestnut, M M; Ridder, G M; Szarowski, D H; Turner, J N

    1996-11-01

    Automated three-dimensional (3-D) image analysis methods are presented for rapid and effective analysis of populations of fluorescently labeled cells or nuclei in thick tissue sections that have been imaged three dimensionally using a confocal microscope. The methods presented here greatly improve upon our earlier work (Roysam et al.: J Microsc 173:115-126, 1994). The principal advances reported are: algorithms for efficient data pre-processing and adaptive segmentation, effective handling of image anisotropy, and fast 3-D morphological algorithms for separating overlapping or connected clusters utilizing image gradient information whenever available. A particular feature of this method is its ability to separate densely packed and connected clusters of cell nuclei. Some of the challenges overcome in this work include the efficient and effective handling of imaging noise, anisotropy, and large variations in image parameters such as intensity, object size, and shape. The method is able to handle significant inter-cell, intra-cell, inter-image, and intra-image variations. Studies indicate that this method is rapid, robust, and adaptable. Examples are presented to illustrate the applicability of this approach to analyzing images of nuclei from densely packed regions in thick sections of rat liver and brain that were labeled with a fluorescent Schiff reagent.
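
    As an illustration of the general idea of splitting connected clusters with shape and gradient information, the sketch below separates two touching synthetic "nuclei" in a 3-D binary volume using a distance transform and marker-controlled watershed (assuming scikit-image is available). This is a generic modern recipe, not the authors' 1996 algorithms.

```python
# Generic sketch: separate touching 3-D objects with a distance transform
# and marker-controlled watershed.
import numpy as np
from scipy import ndimage
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

# Synthetic volume: two overlapping spherical "nuclei".
zz, yy, xx = np.mgrid[0:40, 0:60, 0:60]
sphere1 = (zz - 20) ** 2 + (yy - 25) ** 2 + (xx - 25) ** 2 < 12 ** 2
sphere2 = (zz - 20) ** 2 + (yy - 35) ** 2 + (xx - 38) ** 2 < 12 ** 2
binary = sphere1 | sphere2

distance = ndimage.distance_transform_edt(binary)
# One marker per nucleus, taken from well-separated distance-map maxima.
peaks = peak_local_max(distance, min_distance=10, labels=binary.astype(int))
markers = np.zeros_like(binary, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

labels = watershed(-distance, markers, mask=binary)
sizes = ndimage.sum(binary, labels, index=np.arange(1, labels.max() + 1))
print("objects found:", labels.max(), "voxel counts:", sizes.astype(int).tolist())
```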

  20. Analyses of S-Box in Image Encryption Applications Based on Fuzzy Decision Making Criterion

    NASA Astrophysics Data System (ADS)

    Rehman, Inayatur; Shah, Tariq; Hussain, Iqtadar

    2014-06-01

    In this manuscript, we put forward a standard based on a fuzzy decision-making criterion to examine current substitution boxes and study their strengths and weaknesses in order to decide their appropriateness for image encryption applications. The proposed standard utilizes the results of correlation analysis, entropy analysis, contrast analysis, homogeneity analysis, energy analysis, and mean of absolute deviation analysis. These analyses are applied to well-known substitution boxes. The outcomes of these analyses are then examined further, and a fuzzy soft set decision-making criterion is used to decide the suitability of an S-box for image encryption applications.
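
    Several of the measures listed above are texture statistics of the encrypted image. The sketch below computes contrast, homogeneity, energy, entropy and correlation from a grey-level co-occurrence matrix of a stand-in "cipher image"; a strong S-box should drive these towards noise-like values. The exact definitions used in the paper may differ, and the image here is random data, purely for illustration.

```python
# Sketch of GLCM-based texture measures often used to assess encrypted images.
import numpy as np

def glcm(img, levels=256):
    """Co-occurrence frequencies for horizontally adjacent pixel pairs."""
    pairs = np.stack([img[:, :-1].ravel(), img[:, 1:].ravel()], axis=1)
    m = np.zeros((levels, levels))
    np.add.at(m, (pairs[:, 0], pairs[:, 1]), 1)
    return m / m.sum()

rng = np.random.default_rng(4)
cipher = rng.integers(0, 256, (128, 128))        # stand-in for an encrypted image

p = glcm(cipher)
i, j = np.indices(p.shape)
contrast    = np.sum(p * (i - j) ** 2)
homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
energy      = np.sum(p ** 2)
entropy     = -np.sum(p[p > 0] * np.log2(p[p > 0]))
mu_i, mu_j = np.sum(i * p), np.sum(j * p)
sd_i = np.sqrt(np.sum(p * (i - mu_i) ** 2))
sd_j = np.sqrt(np.sum(p * (j - mu_j) ** 2))
correlation = np.sum(p * (i - mu_i) * (j - mu_j)) / (sd_i * sd_j)

for name, val in [("contrast", contrast), ("homogeneity", homogeneity),
                  ("energy", energy), ("entropy", entropy), ("correlation", correlation)]:
    print(f"{name:12s} {val:.4f}")
```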

  1. Quantitative analysis of x-ray images with a television image analyser.

    PubMed

    Schleicher, A; Tillmann, B; Zilles, K

    1980-07-01

    A method for the quantitative evaluation of X-rays is described. The image is decomposed into individual image points by a mechanical scanning procedure, and at each image point the area fraction of a measuring field not covered by silver grains is determined with an image analyzer. This parameter is interpreted as representing a value corresponding to a specific degree of film blackness. The relationship between the measured value and the X-ray absorption is described by standard curves. With the aid of an aluminum scale, the measured value can be expressed directly by the thickness of an aluminum equivalent with a corresponding X-ray absorption. Details about the adjustment of the image analyzer for detecting the silver grains, the resolution of different degrees of X-ray absorption, as well as the computer-controlled scanning procedure are described. An example demonstrates its applicability to analyze the density distribution of bony tissue around the human humero-ulnar joint. The procedure is not limited to the evaluation of X-rays, but is applicable whenever silver grains can be detected in a film layer by an image analyzer.
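
    The standard-curve step (converting the measured area fraction into an aluminum-equivalent thickness) amounts to interpolating a calibration table, as in the short sketch below. The calibration values are invented for illustration; a real curve would be measured from an aluminum step wedge on the same film and imaging settings.

```python
# Sketch of the standard-curve conversion: measured area fraction not covered
# by silver grains -> aluminum-equivalent thickness, via interpolation.
import numpy as np

# Hypothetical calibration: aluminum step thickness (mm) vs measured area fraction.
al_thickness_mm = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
area_fraction   = np.array([0.05, 0.15, 0.28, 0.48, 0.72, 0.90])

def aluminum_equivalent(measured_fraction):
    """Interpolate the standard curve (fraction must lie within the calibrated range)."""
    return np.interp(measured_fraction, area_fraction, al_thickness_mm)

print(aluminum_equivalent(0.40))   # ~1.6 mm aluminum equivalent
```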

  2. Geologist's Field Assistant: Developing Image and Spectral Analyses Algorithms for Remote Science Exploration

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Bishop, J.; Gazis, P.; Alena, R.; Sierhuis, M.

    2002-01-01

    We are developing science analysis algorithms to interface with a Geologist's Field Assistant device to allow robotic or human remote explorers to better sense their surroundings during limited surface excursions. Our algorithms will interpret spectral and imaging data obtained by various sensors. Additional information is contained in the original extended abstract.

  3. Coronary CT angiography: IVUS image fusion for quantitative plaque and stenosis analyses

    NASA Astrophysics Data System (ADS)

    Marquering, Henk A.; Dijkstra, Jouke; Besnehard, Quentin J. A.; Duthé, Julien P. M.; Schuijf, Joanne D.; Bax, Jeroen J.; Reiber, Johan H. C.

    2008-03-01

    Rationale and Objective: Due to limited temporal and spatial resolution, coronary CT angiographic image quality is not optimal for robust and accurate stenosis quantification or for plaque differentiation and quantification. By combining high-resolution IVUS images with CT images, a detailed representation of the coronary arteries can be provided in the CT images. Methods: The two vessel data sets are matched in three steps. First, vessel segments are matched using anatomical landmarks. Second, the landmarks are aligned in cross-sectional vessel images. Third, the semi-automatically detected IVUS lumen contours are matched to the CTA data using manual interaction and automatic registration methods. Results: The IVUS-CTA fusion tool provides a unique combined view of the high-resolution IVUS segmentation of the outer vessel wall and lumen-intima transitions on the CT images. The cylindrical projection of the CMPR image decreases the analysis time by 50 percent. The automatic registration of the cross-vessel views decreases the analysis time by 85 percent. Conclusions: The fusion of IVUS images and their segmentation results with coronary CT angiographic images provides a detailed view of the lumen and vessel wall of the coronary arteries. The automatic fusion tool makes such a registration feasible for the development and validation of analysis tools.

  4. Analysing the Image Building Effects of TV Advertisements Using Internet Community Data

    NASA Astrophysics Data System (ADS)

    Uehara, Hiroshi; Sato, Tadahiko; Yoshida, Kenichi

    This paper proposes a method to measure the effects of TV advertisements on Internet bulletin boards. It aims to clarify how viewers' interest in TV advertisements is reflected in their images of the promoted products. Two kinds of time series data are generated by the proposed method. The first represents the time series fluctuation of interest in the TV advertisements; the second represents the time series fluctuation of the images of the products. By analysing the correlations between these two time series, we try to clarify the implicit relationship between viewers' interest in a TV advertisement and their images of the promoted products. By applying the proposed method to an Internet bulletin board that deals with a certain cosmetic brand, we show that the images of the products vary depending on differences in the interest in each TV advertisement.

  5. Use of Image Analyses Techniques To Quantify Rock Morphological Characteristics of Lava Flows By Fms Logs.

    NASA Astrophysics Data System (ADS)

    Pechnig, R.; Ramani, S.; Bartetzko, A.; Clauser, C.

    Borehole wall images obtained from downhole measurements are mostly used for structural analyses (picking of fractures, foliation, layering) and qualitative descriptions of geological features. Qualitative results are difficult to compare with petrophysical data sets, either from laboratory measurements or from logging. We report on an application of image analysis techniques to FMS (Formation Micro Scanner) data in order to select and quantify rock morphological characteristics. We selected image logs from subaerial basalts drilled during Leg 183 on the Kerguelen Large Igneous Province Plateau. The subaerial basalts penetrated in Hole 1137 show significant morphological features, such as vesicles and fractures of different size, shape, distribution, and orientation. The excellent core recovery in this hole and the high quality of the standard and FMS logs provide a good opportunity to test the usefulness of image analysis in such rock types. We used the Zeiss K4000 software system for image analysis. The selection was performed by color-scale definitions, where darker colors are associated with electrically conductive rock elements, which in this case are fluid- or clay-mineral-filled voids, vesicles and fractures. Besides the selection of these morphological features, the application of this technique also allows aspect ratios to be calculated for the selected elements and vesicles to be discriminated from fractures. In this way, the qualitative information of the FMS logs was transferred into quantitative log curves, which may be used as input for statistical log processing. In our case, we used the information to relate rock morphological characteristics to the seismic properties of the drilled rocks.
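
    A toy version of the vesicle/fracture discrimination step is sketched below: dark (conductive) features are labelled in a binary image, and the aspect ratio of each feature, taken from its second central moments, separates roughly equant vesicles from elongated fractures. This is a generic illustration, not the Zeiss K4000 workflow, and the image and the aspect-ratio cutoff of 3 are made up.

```python
# Sketch: discriminate equant vesicles from elongated fractures by aspect ratio.
import numpy as np
from scipy import ndimage

def aspect_ratio(mask):
    """Major/minor axis ratio from the second central moments of a binary blob."""
    ys, xs = np.nonzero(mask)
    coords = np.stack([ys - ys.mean(), xs - xs.mean()])
    evals = np.linalg.eigvalsh(np.cov(coords))
    return np.sqrt(max(evals) / max(min(evals), 1e-9))

# Synthetic "image": one round vesicle and one long thin fracture.
img = np.zeros((100, 100), dtype=bool)
yy, xx = np.mgrid[0:100, 0:100]
img |= (yy - 30) ** 2 + (xx - 30) ** 2 < 8 ** 2      # vesicle
img[70:73, 10:90] = True                              # fracture

labels, n = ndimage.label(img)
for k in range(1, n + 1):
    ar = aspect_ratio(labels == k)
    kind = "fracture" if ar > 3 else "vesicle"
    print(f"feature {k}: aspect ratio {ar:.1f} -> {kind}")
```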

  6. Solid Hydrogen Experiments for Atomic Propellants: Particle Formation, Imaging, Observations, and Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2005-01-01

    This report presents particle formation observations and detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium. Hydrogen was frozen into particles in liquid helium and observed with a video camera. The solid hydrogen particle sizes and the total mass of hydrogen particles were estimated. These newly analyzed data are from the test series held on February 28, 2001. Particle sizes from the previous testing in 1999 and the testing in 2001 were similar. Though the 2001 testing created similar particle sizes, many new particle formation phenomena were observed: microparticles and delayed particle formation. These experiment image analyses are some of the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  7. Solid Hydrogen Experiments for Atomic Propellants: Particle Formation Energy and Imaging Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2002-01-01

    This paper presents particle formation energy balances and detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium during the Phase II testing in 2001. Solid particles of hydrogen were frozen in liquid helium and observed with a video camera. The solid hydrogen particle sizes and the total mass of hydrogen particles were estimated. The particle formation efficiency is also estimated. Particle sizes from the Phase I testing in 1999 and the Phase II testing in 2001 were similar. Though the 2001 testing created similar particle sizes, many new particle formation phenomena were observed. These experiment image analyses are one of the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  8. Comparison of linear measurements and analyses taken from plaster models and three-dimensional images.

    PubMed

    Porto, Betina Grehs; Porto, Thiago Soares; Silva, Monica Barros; Grehs, Renésio Armindo; Pinto, Ary dos Santos; Bhandi, Shilpa H; Tonetto, Mateus Rodrigues; Bandéca, Matheus Coelho; dos Santos-Pinto, Lourdes Aparecida Martins

    2014-11-01

    Digital models are an alternative for carrying out analyses and devising treatment plans in orthodontics. The objective of this study was to evaluate the accuracy and reproducibility of measurements of tooth sizes, interdental distances and occlusion analyses using plaster models and their digital images. Thirty pairs of plaster models were chosen at random, and the digital images of each plaster model were obtained using a laser scanner (3Shape R-700, 3Shape A/S). On the plaster models, the measurements were taken using a caliper (Mitutoyo Digimatic®, Mitutoyo (UK) Ltd) and the MicroScribe (MS) 3DX (Immersion, San Jose, Calif). For the digital images, the measurement tools of the O3d software (Widialabs, Brazil) were used. The data obtained were compared statistically using the Dahlberg formula, analysis of variance and the Tukey test (p < 0.05). The majority of the measurements obtained using the caliper and O3d were identical, and both were significantly different from those obtained using the MS. Intra-examiner agreement was lowest when using the MS. The results demonstrated that the accuracy and reproducibility of the tooth measurements and analyses from the plaster models using the caliper and from the digital models using the O3d software were identical.
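
    The Dahlberg formula mentioned in the statistics is the standard double-determination error, d = sqrt(Σ(x1 - x2)² / 2n), sketched below on invented caliper/digital measurement pairs.

```python
# Sketch of Dahlberg's error formula for paired duplicate measurements.
import numpy as np

def dahlberg(first, second):
    first, second = np.asarray(first, float), np.asarray(second, float)
    return np.sqrt(np.sum((first - second) ** 2) / (2 * first.size))

caliper_mm = [8.41, 7.02, 10.15, 6.58, 9.30]   # invented example values
digital_mm = [8.38, 7.10, 10.09, 6.61, 9.35]
print(f"Dahlberg error = {dahlberg(caliper_mm, digital_mm):.3f} mm")
```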

  9. Fractal analyses of osseous healing using Tuned Aperture Computed Tomography images

    PubMed Central

    Seyedain, Ali; Webber, Richard L.; Nair, Umadevi P.; Piesco, Nicholas P.; Agarwal, Sudha; Mooney, Mark P.; Gröndahl, Hans-Göran

    2016-01-01

    The aim of this study was to evaluate osseous healing in mandibular defects using fractal analyses of conventional radiographs and tuned aperture computed tomography (TACT; OrthoTACT, Instrumentarium Imaging, Helsinki, Finland) images. Eighty test sites on the inferior margins of rabbit mandibles were subjected to lesion induction and treated with one of the following: no treatment (controls); osteoblasts only; polymer matrix only; or an osteoblast-polymer matrix (OPM) combination. Images were acquired using conventional radiography and TACT, including unprocessed TACT (TACT-U) and iteratively restored TACT (TACT-IR). Healing was followed over time, with images acquired at 3, 6, 9, and 12 weeks post-surgery. Fractal dimension (FD) was computed within regions of interest in the defects using the TACT workbench. Results were analyzed for effects produced by imaging modality, treatment modality, time after surgery and lesion location. Histomorphometric data were available to assess ground truth. Significant differences (p < 0.0001) were noted based on imaging modality, with TACT-IR recording the highest mean fractal dimension (MFD), followed by TACT-U and conventional images, in that order. Sites treated with OPM recorded the highest MFDs among all treatment modalities (p < 0.0001). The highest MFD based on time was recorded at 3 weeks and differed significantly from that at 12 weeks (p < 0.035). Correlation of FD with the histomorphometric data was high (r = 0.79; p < 0.001). The FD computed on TACT-IR showed the highest correlation with the histomorphometric data, establishing that TACT is a more efficient and accurate imaging modality for quantification of osseous changes within healing bony defects. PMID:11519567
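
    As background for the FD measure, the sketch below estimates a box-counting fractal dimension of a binary pattern: cover the image with boxes of decreasing size and take the slope of log(count) against log(1/size). The study computed FD with the TACT workbench, which may use a different estimator; the test pattern here is random.

```python
# Generic box-counting fractal dimension estimator for a binary 2-D pattern.
import numpy as np

def box_counting_dimension(binary, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = (binary.shape[0] // s) * s, (binary.shape[1] // s) * s
        blocks = binary[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    # Slope of log(count) vs log(1/size) estimates the fractal dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(5)
pattern = rng.random((256, 256)) < 0.15        # sparse random "bone" pixels
print(f"estimated fractal dimension: {box_counting_dimension(pattern):.2f}")
```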

  10. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data

    PubMed Central

    Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan

    2015-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393

  11. Computer-based image-analyses of laminated shales, carboniferous of the Midcontinent and surrounding areas

    SciTech Connect

    Archer, A.W. . Dept. of Geology)

    1993-02-01

    Computerized image analyses of petrographic data can greatly facilitate the quantification of detailed descriptions and analyses of fine-scale fabric, or petrofabric. In thinly laminated rocks, manual measurement of successive lamina thicknesses is very time consuming, especially when applied to thick, cored sequences. In particular, images of core materials can be digitized and the resulting image processed as a large matrix. Using such techniques, it is relatively easy to automate continuous measurements of lamina thickness and lateral continuity. This type of analysis has been applied to a variety of Carboniferous strata, particularly those siliciclastics that occur within the 'outside shale' portions of Kansas cyclothems. Of the various sedimentological processes capable of producing such non-random thickness variations, a model invoking tidal processes appears to be particularly robust. Tidal sedimentation could not only have resulted in the deposition of individual laminae, but tidal-height variations during various phases of the lunar orbit can also explain the systematic variations. Comparison of these Carboniferous shales with similar laminations formed in modern high tidal-range environments indicates many similarities. These modern analogs include the Bay of Fundy in Canada and the Bay of Mont-Saint-Michel in France. Lamina-thickness variations, in specific cases, can be correlated with known tidal periodicities. In addition, in some samples, details of the tidal regime can be interpolated, such as the nature of the tidal system (i.e., diurnal or semidiurnal), and some indicators of tidal range can be ascertained based upon modern analogs.
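
    The automated lamina-thickness measurement described above can be boiled down to a run-length count along a digitized column of the image, as in the sketch below. The thresholded column is synthetic; real analyses would work on calibrated scans and average over many columns.

```python
# Sketch: measure successive dark-lamina thicknesses along one image column.
import numpy as np

def lamina_thicknesses(column, threshold):
    dark = column < threshold
    # Indices where the dark/light state changes.
    edges = np.flatnonzero(np.diff(dark.astype(int)) != 0) + 1
    runs = np.split(dark, edges)
    return [len(r) for r in runs if r[0]]   # lengths of dark runs only

# Synthetic column: alternating dark laminae of varying thickness.
rng = np.random.default_rng(6)
column = np.full(200, 200.0)
pos = 0
true_thicknesses = [3, 7, 4, 10, 6]
for t in true_thicknesses:
    pos += rng.integers(10, 25)           # light gap
    column[pos:pos + t] = 60.0            # dark lamina
    pos += t
column += rng.normal(0, 5, column.size)

print("true  :", true_thicknesses)
print("found :", lamina_thicknesses(column, threshold=130))
```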

  12. Automatic image-based analyses using a coupled quadtree-SBFEM/SCM approach

    NASA Astrophysics Data System (ADS)

    Gravenkamp, Hauke; Duczek, Sascha

    2017-06-01

    Quadtree-based domain decomposition algorithms offer an efficient option to create meshes for automatic image-based analyses. Without introducing hanging nodes the scaled boundary finite element method (SBFEM) can directly operate on such meshes by only discretizing the edges of each subdomain. However, the convergence of a numerical method that relies on a quadtree-based geometry approximation is often suboptimal due to the inaccurate representation of the boundary. To overcome this problem a combination of the SBFEM with the spectral cell method (SCM) is proposed. The basic idea is to treat each uncut quadtree cell as an SBFEM polygon, while all cut quadtree cells are computed employing the SCM. This methodology not only reduces the required number of degrees of freedom but also avoids a two-dimensional quadrature in all uncut quadtree cells. Numerical examples including static, harmonic, modal and transient analyses of complex geometries are studied, highlighting the performance of this novel approach.
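
    The quadtree idea that both methods build on is easy to picture: recursively split the image until each cell holds a single material label or hits a minimum size, as in the sketch below. Homogeneous (uncut) leaves would become SBFEM polygons and the remaining cut cells would be handled by the SCM; the decomposition itself is all this sketch shows.

```python
# Minimal quadtree decomposition of a two-material image.
import numpy as np

def quadtree(image, x=0, y=0, size=None, min_size=4):
    """Return a list of (x, y, size, homogeneous) leaf cells."""
    size = image.shape[0] if size is None else size
    cell = image[y:y + size, x:x + size]
    homogeneous = bool(np.all(cell == cell[0, 0]))
    if homogeneous or size <= min_size:
        return [(x, y, size, homogeneous)]
    half = size // 2
    cells = []
    for dx, dy in [(0, 0), (half, 0), (0, half), (half, half)]:
        cells += quadtree(image, x + dx, y + dy, half, min_size)
    return cells

# Synthetic two-material image (power-of-two size keeps the splits even).
img = np.zeros((64, 64), dtype=int)
yy, xx = np.mgrid[0:64, 0:64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2] = 1      # circular inclusion

cells = quadtree(img)
uncut = sum(1 for c in cells if c[3])
print(f"{len(cells)} leaf cells: {uncut} uncut (SBFEM), {len(cells) - uncut} cut (SCM)")
```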

  13. Study of SGD along the French Mediterranean coastline using airborne TIR images and in situ analyses

    NASA Astrophysics Data System (ADS)

    van Beek, Pieter; Stieglitz, Thomas; Souhaut, Marc

    2015-04-01

    Although submarine groundwater discharge (SGD) has been investigated in many places of the world, very few studies were conducted along the French coastline of the Mediterranean Sea. Almost no information is available on the fluxes of water and chemical elements associated with these SGD and on their potential impact on the geochemical cycling and ecosystems of the coastal zones. In this work, we combined the use of airborne thermal infrared (TIR) images with in situ analyses of salinity, temperature, radon and radium isotopes to study SGD at various sites along the French Mediterranean coastline and in coastal lagoons. These analyses allowed us to detect SGD sites and to quantify SGD fluxes (that include both the fluxes of fresh groundwater and recirculated seawater). In particular, we will show how the Ra isotopes determined in the La Palme lagoon were used to estimate i) the residence time of waters in the lagoon and ii) SGD fluxes.

  14. Immunochemical Micro Imaging Analyses for the Detection of Proteins in Artworks.

    PubMed

    Sciutto, Giorgia; Zangheri, Martina; Prati, Silvia; Guardigli, Massimo; Mirasoli, Mara; Mazzeo, Rocco; Roda, Aldo

    2016-06-01

    The present review is aimed at reporting on the most advanced and recent applications of immunochemical imaging techniques for the localization of proteins within complex and multilayered paint stratigraphies. Indeed, a paint sample is usually constituted by the superimposition of different layers whose characterization is fundamental in the evaluation of the state of conservation and for addressing proper restoration interventions. Immunochemical methods, which are based on the high selectivity of antigen-antibody reactions, were proposed some years ago in the field of cultural heritage. In addition to enzyme-linked immunosorbent assays for protein identification, immunochemical imaging methods have also been explored in the last decades, thanks to the possibility to localize the target analytes, thus increasing the amount of information obtained and thereby reducing the number of samples and/or analyses needed for a comprehensive characterization of the sample. In this review, chemiluminescent, spectroscopic and electrochemical imaging detection methods are discussed to illustrate potentialities and limits of advanced immunochemical imaging systems for the analysis of paint cross-sections.

  15. An accessible, scalable ecosystem for enabling and sharing diverse mass spectrometry imaging analyses.

    SciTech Connect

    Fischer, CR; Ruebel, O; Bowen, BP

    2016-01-01

    Mass spectrometry imaging (MSI) is used in an increasing number of biological applications. Typical MSI datasets contain unique, high-resolution mass spectra from tens of thousands of spatial locations, resulting in raw data sizes of tens of gigabytes per sample. In this paper, we review technical progress that is enabling new biological applications and that is driving an increase in the complexity and size of MSI data. Handling such data often requires specialized computational infrastructure, software, and expertise. OpenMSI, our recently described platform, makes it easy to explore and share MSI datasets via the web - even when larger than 50 GB. Here we describe the integration of OpenMSI with IPython notebooks for transparent, sharable, and replicable MSI research. An advantage of this approach is that users do not have to share raw data along with analyses; instead, data is retrieved via OpenMSI's web API. The IPython notebook interface provides a low-barrier entry point for data manipulation that is accessible for scientists without extensive computational training. Via these notebooks, analyses can be easily shared without requiring any data movement. We provide example notebooks for several common MSI analysis types including data normalization, plotting, clustering, and classification, and image registration.

  16. An accessible, scalable ecosystem for enabling and sharing diverse mass spectrometry imaging analyses.

    PubMed

    Fischer, Curt R; Ruebel, Oliver; Bowen, Benjamin P

    2016-01-01

    Mass spectrometry imaging (MSI) is used in an increasing number of biological applications. Typical MSI datasets contain unique, high-resolution mass spectra from tens of thousands of spatial locations, resulting in raw data sizes of tens of gigabytes per sample. In this paper, we review technical progress that is enabling new biological applications and that is driving an increase in the complexity and size of MSI data. Handling such data often requires specialized computational infrastructure, software, and expertise. OpenMSI, our recently described platform, makes it easy to explore and share MSI datasets via the web - even when larger than 50 GB. Here we describe the integration of OpenMSI with IPython notebooks for transparent, sharable, and replicable MSI research. An advantage of this approach is that users do not have to share raw data along with analyses; instead, data is retrieved via OpenMSI's web API. The IPython notebook interface provides a low-barrier entry point for data manipulation that is accessible for scientists without extensive computational training. Via these notebooks, analyses can be easily shared without requiring any data movement. We provide example notebooks for several common MSI analysis types including data normalization, plotting, clustering, and classification, and image registration.

  17. Mosquito Larval Habitats, Land Use, and Potential Malaria Risk in Northern Belize from Satellite Image Analyses

    NASA Technical Reports Server (NTRS)

    Pope, Kevin; Masuoka, Penny; Rejmankova, Eliska; Grieco, John; Johnson, Sarah; Roberts, Donald

    2004-01-01

    The distribution of Anopheles mosquito habitats and land use in northern Belize is examined with satellite data. A land cover classification based on multispectral SPOT and multitemporal Radarsat images identified eleven land cover classes, including agricultural, forest, and marsh types. Two of the land cover types, Typha domingensis marsh and flooded forest, are Anopheles vestitipennis larval habitats. Eleocharis spp. marsh is the larval habitat for Anopheles albimanus. Geographic Information Systems (GIS) analyses of land cover demonstrate that the amount of Typha domingensis in a marsh is positively correlated with the amount of agricultural land in the adjacent upland, and negatively correlated with the amount of adjacent forest. This finding is consistent with the hypothesis that nutrient (phosphorus) runoff from agricultural lands is causing an expansion of Typha domingensis in northern Belize. This expansion of Anopheles vestitipennis larval habitat may in turn cause an increase in malaria risk in the region.

  19. Development, Capabilities, and Impact on Wind Analyses of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Miller, T.; Amarin, R.; Atlas, R.; Bailey, M.; Black, P.; Buckley, C.; Chen, S.; El-Nimri, S.; Hood, R.; James, M.; Johnson, J.; Jones, W.; Ruf, C.; Simmons, D.; Uhlhorn, E.; Inglish, C.

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. The instrument is being test flown in January and is expected to participate in the tropical cyclone experiment GRIP (Genesis and Rapid Intensification Processes) in the 2010 season. HIRAD is being designed to study the wind field in some detail within strong hurricanes and to enhance the real-time airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track at a single point directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approximately 3 x the aircraft altitude) with approximately 2 km resolution. This paper describes the HIRAD instrument and the physical basis for its operations, including chamber test data from the instrument. The potential value of future HIRAD observations will be illustrated with a summary of Observing System Simulation Experiments (OSSEs) in which measurements from the new instrument as well as those from existing instruments (air, surface, and space-based) are simulated from the output of a detailed numerical model, and those results are used to construct simulated H*Wind analyses. Evaluations will be presented on the impact on H*Wind analyses of using the HIRAD instrument observations to replace those of the SFMR instrument, and also on the impact of a future satellite-based HIRAD in comparison to instruments with more limited capabilities for observing strong winds through heavy

  20. Correlative Imaging and Analyses of Soil Organic Matter Stabilization in the Rhizosphere

    NASA Astrophysics Data System (ADS)

    Dohnalkova, Alice; Tfaily, Malak; Chu, Rosalie; Crump, Alex; Brislawn, Colin; Varga, Tamas; Chrisler, William

    2016-04-01

    Understanding the dynamics of carbon (C) pools in soil systems is a critical area for mitigating atmospheric carbon dioxide levels and maintaining healthy soils. Although microbial contributions to stable soil carbon pools have often been regarded as low to negligible, we present evidence that microbes may play a far greater role in the stabilization of soil organic matter (SOM), thus contributing to soil organic matter pools with longer residence times. The rhizosphere, the zone immediately surrounding the plant roots, represents a geochemical hotspot with high microbial activity and profuse SOM production. In particular, microbially secreted extracellular polymeric substances (EPS) are a remarkably dynamic entity that plays a critical role in numerous soil processes, including mineral weathering. We approach the interface of soil minerals and microbes with a focus on organic C stabilization mechanisms. We use a suite of high-resolution imaging and analytical methods (confocal, scanning and transmission electron microscopy, Fourier transform ion cyclotron resonance mass spectrometry, DNA sequencing and X-ray diffraction) to study the living and non-living rhizosphere components. Our goal is to elucidate a pathway for the formation, storage, transformation and protection of persistent microbially-produced carbon in soils. Based on our multimodal analytical approach, we propose that persistent microbial necromass in soils accounts for considerably more soil carbon than previously estimated.

  1. [Development of automatic analyses for star-shot images using computed radiography (CR)].

    PubMed

    Kawata, Hidemichi; Ohkura, Sunao; Ono, Hiroshi; Fukudome, Yoshifumi; Kawamura, Seiji; Hayabuchi, Naofumi

    2006-12-20

    Radiation therapy has advanced greatly in many facilities through the development of new treatment machines, improved computer technology for radiotherapy treatment planning systems (RTPs), and increased accuracy of radiation therapy techniques such as stereotactic irradiation and intensity-modulated radiation therapy (IMRT). Quality control (QC) of the isocenter, which is defined by the rotation of the gantry and of the field-limiting devices, is important for ensuring the accuracy of these radiation therapy technologies. Star-shot analyses using computed radiography (CR) were employed in this study to evaluate the isocenter. Devices to support CR were created, and a method for automatically analyzing images obtained by the star-shot technique, which calculates the error (distance) from the isocenter and the incident beam angle, was developed. Regarding the accuracy of our method, the average maximum error was 0.33 mm (less than 2 pixels, or 0.35 mm), and the average absolute error and the incident beam angle error were 0.3 mm and 0.4 degrees at maximum and at one standard deviation (SD), respectively. The processing times were 16 sec at minimum, 152 sec at maximum, 18 sec most frequently, and 23.6 sec on average. In conclusion, our newly developed method for analyzing star-shot images using CR enables immediate, quantitative evaluation of the isocenter.
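
    A minimal sketch of the geometric step behind such an analysis, assuming each star-shot spoke has already been fitted as a straight line (an angle plus a point on the line) from the CR image. The function names, the use of SciPy, and the 0.175 mm pixel size (inferred from "less than 2 pixels: 0.35 mm") are illustrative assumptions, not the authors' implementation. The isocenter estimate is the point minimizing the maximum perpendicular distance to all spoke lines; that maximum distance is the reported error.

      import numpy as np
      from scipy.optimize import minimize

      def point_line_distance(p, angle_deg, point_on_line):
          """Perpendicular distance from point p to a line given by an angle and a point."""
          d = np.array([np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))])
          v = np.asarray(p) - np.asarray(point_on_line)
          # distance = component of v perpendicular to the line direction
          return abs(v[0] * d[1] - v[1] * d[0])

      def isocenter_error(spokes, pixel_mm=0.175):
          """spokes: list of (angle_deg, (x, y)) pairs, one per fitted beam axis.
          Returns the point minimizing the maximum distance to all spokes, and that
          distance converted to mm (hypothetical pixel size)."""
          def max_dist(p):
              return max(point_line_distance(p, a, q) for a, q in spokes)
          start = np.mean([q for _, q in spokes], axis=0)
          res = minimize(max_dist, start, method="Nelder-Mead")
          return res.x, max_dist(res.x) * pixel_mm

      # Example: three gantry angles fitted from a star-shot image (pixel coordinates)
      spokes = [(0.3, (512, 510)), (60.2, (511, 512)), (120.1, (513, 511))]
      center, err_mm = isocenter_error(spokes)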

  2. Core Formation in Planetesimals: Textural Analyses From 3D Synchrotron Imaging and Complex Systems Modeling

    NASA Astrophysics Data System (ADS)

    Rushmer, T. A.; Tordesillas, A.; Walker, D. M.; Parkinson, D. Y.; Clark, S. M.

    2012-12-01

    Recent scenarios of core formation in planetesimals, based on calculations from planetary dynamicists and on extinct radionuclides (e.g., 26Al, 60Fe), call for segregation of a metal liquid (core) from both solid silicate and a partially molten silicate - a silicate mush - matrix. These scenarios require segregation of metallic melt along fracture networks or by the growth of molten core material into blebs large enough to overcome the strength of the mush matrix. Such segregation usually involves high strain rates so that separation can occur, which is in agreement with the accretion model of planetary growth. Experimental work has suggested that deformation and shear can help develop fracture networks and coalesce metallic blebs. Here, we have developed an innovative approach that currently combines 2D textures from deformation experiments on a partially molten natural meteorite with complex network analyses. 3D textural data from experimental samples, deformed at high strain rates with or without silicate melt present, have been obtained by synchrotron-based high-resolution hard X-ray microtomography imaging. A series of two-dimensional images is collected as the sample is rotated, and tomographic reconstruction yields the full 3D representation of the sample. Virtual slices through the 3D object in any arbitrary direction can be visualized, or the full data set can be visualized by volume rendering. More importantly, automated image filtering and segmentation allow the extraction of boundaries between the various phases. The volumes, shapes, and distributions of each phase, and the connectivity between them, can then be quantitatively analysed, and these results can be compared to models. We are currently using these new visual data sets to augment our 2D data. These results will be included in our current complex system analytical approach. This integrated method can elucidate and quantify the growth of metallic blebs in regions where

  3. Image analyses in bauxitic ores: The case of the Apulian karst bauxites

    NASA Astrophysics Data System (ADS)

    Buccione, Roberto; Sinisi, Rosa; Mongelli, Giovanni

    2015-04-01

    This study concerns two different karst bauxite deposits of the Apulia region (southern Italy). These deposits outcrop in the Murge and Salento areas: the Murge bauxite (upper Cretaceous) is a typical canyon-like deposit formed in a karst depression, whereas the Salento bauxite (upper Eocene - Oligocene) is the result of the erosion, remobilization and transport of older bauxitic material from a relatively distant area. This particular arrangement gave its name to all similar bauxite deposits, which are thus called Salento-type deposits. The bauxite texture is essentially made of sub-circular concentric aggregates, called ooids, dispersed in a pelitic matrix. The textural properties of the two bauxitic ores, as assessed by SEM-EDX, are different. In the bauxite from the canyon-like deposit the ooids/matrix ratio is higher than in the Salento-type bauxite. Furthermore, the ooids in the Salento-type bauxite are usually made of a large core surrounded by a narrow, single accretion layer, whereas the ooids from the canyon-like deposit have a smaller core surrounded by several alternating layers of Al-hematite and boehmite (Mongelli et al., 2014). In order to explore in more detail the textural features of both bauxite deposits, particle shape analyses were performed. Image analyses and the fractal dimension have been widely used in geological studies including economic geology (e.g. Turcotte, 1986; Meakin, 1991; Deng et al., 2011). The geometric properties evaluated are the number of ooids, average ooid size, ooid roundness and the fractal dimension D, which depends on the ooids/matrix ratio. D is the slope of the line obtained by applying a counting technique to each sample image. The fractal dimension is slightly lower for the Salento-type bauxites. Since the process which led to the formation of the ooids is related to an aggregation growth involving chemical fractionation (Mongelli, 2002), a correlation among these parameters and the contents of major
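
    The abstract does not spell out the counting technique used to obtain D; box counting is one common choice for estimating a fractal dimension from a binarized ooid image, and the NumPy-only sketch below (function and variable names are ours) illustrates that choice. D is taken as the slope of log(box count) versus log(1/box size).

      import numpy as np

      def box_counting_dimension(binary_img, sizes=(2, 4, 8, 16, 32, 64)):
          """Estimate the fractal dimension D of a binary (ooid = 1) image."""
          counts = []
          for s in sizes:
              h, w = binary_img.shape
              # trim so the image tiles exactly into s x s boxes
              trimmed = binary_img[:h - h % s, :w - w % s]
              boxes = trimmed.reshape(h // s, s, w // s, s)
              # count boxes containing at least one ooid pixel
              counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
          # D is the slope of log(count) versus log(1/box size)
          slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
          return slope

      # Example with a random binary image standing in for a segmented ooid map
      img = np.random.rand(256, 256) > 0.7
      D = box_counting_dimension(img)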

  4. Estimation of reactive surface area using a combined method of laboratory analyses and digital image processing

    NASA Astrophysics Data System (ADS)

    Ma, Jin; Kong, Xiang-Zhao; Saar, Martin O.

    2017-04-01

    Fluid-rock interactions play an important role in engineering processes such as chemical stimulation of enhanced geothermal systems and carbon capture, utilization, and storage. However, these interactions depend strongly on the accessible reactive surface area of the minerals, which is generally poorly constrained for natural geologic samples. In particular, quantifying the surface area of each reacting mineral within whole rock samples is challenging due to the heterogeneous distribution of minerals and pore space. In this study, detailed laboratory analyses were performed on sandstone samples from deep geothermal sites in Lithuania. We measured the specific surface area of whole rock samples using a gas adsorption (B.E.T.) method with N2 at a temperature of 77.3 K. We also quantified porosity and pore size distribution by helium gas pycnometry and Hg porosimetry, respectively. Rock compositions were determined by a combination of X-ray fluorescence (XRF) and quantitative scanning electron microscopy (SEM) - energy-dispersive X-ray spectroscopy (EDS), which were then geometrically mapped onto two-dimensional SEM backscattered electron (BSE) images with a resolution of 1.2 μm and onto three-dimensional micro-CT images with a resolution of 10.3 μm to produce a digital mineral map for further constraining the accessibility of reactive minerals. Moreover, we attempt to link the whole rock porosity, pore size distribution, and B.E.T. specific surface area with the digital mineral maps. We anticipate that these analyses will provide an in-depth understanding of fluid sample chemistry from later hydrothermal reactive flow-through experiments on whole rock samples at elevated pressure and temperature.

  5. Autonomous Science Analyses of Digital Images for Mars Sample Return and Beyond

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Ruzon, M.; Roush, T. L.

    1999-01-01

    To adequately explore high priority landing sites, scientists require rovers with greater mobility. Therefore, future Mars missions will involve rovers capable of traversing tens of kilometers (vs. tens of meters traversed by Mars Pathfinder's Sojourner). However, the current process by which scientists interact with a rover does not scale to such distances. A single science objective is achieved through many iterations of a basic command cycle: (1) all data must be transmitted to Earth and analyzed; (2) from these data, new targets are selected and the necessary information from the appropriate instruments is requested; (3) new commands are then uplinked and executed by the spacecraft; and (4) the resulting data are returned to Earth, starting the process again. Experience with rover tests on Earth shows that this time-intensive process cannot be substantially shortened given the limited data downlink bandwidth and command cycle opportunities of real missions. Sending complete multicolor panoramas at several waypoints, for example, is out of the question for a single downlink opportunity. As a result, long traverses requiring many science command cycles would likely require many weeks, months or even years, perhaps exceeding rover design life or other constraints. Autonomous onboard science analyses can address these problems in two ways. First, they will allow the rover to transmit only "interesting" images, defined as those likely to have higher science content. Second, the rover will be able to anticipate future commands, for example acquiring and returning spectra of "interesting" rocks along with the images in which they were detected. Such approaches, coupled with appropriate navigational software, address both the data volume and command cycle bottlenecks that limit both rover mobility and science yield. We are developing algorithms to enable such intelligent decision making by autonomous spacecraft. Reflecting the ultimate level of ability we aim for, this

  6. Molecular cytogenetic analysis of human blastocysts and cytotrophoblasts by multi-color FISH and Spectral Imaging analyses

    SciTech Connect

    Weier, Jingly F.; Ferlatte, Christy; Baumgartner, Adolf; Jung, Christine J.; Nguyen, Ha-Nam; Chu, Lisa W.; Pedersen, Roger A.; Fisher, Susan J.; Weier, Heinz-Ulrich G.

    2006-02-08

    Numerical chromosome aberrations in gametes typically lead to failed fertilization, spontaneous abortion or a chromosomally abnormal fetus. By means of preimplantation genetic diagnosis (PGD), we now can screen human embryos in vitro for aneuploidy before transferring the embryos to the uterus. PGD allows us to select unaffected embryos for transfer and increases the implantation rate in in vitro fertilization programs. Molecular cytogenetic analyses using multi-color fluorescence in situ hybridization (FISH) of blastomeres have become the major tool for preimplantation genetic screening of aneuploidy. However, current FISH technology can test for only a small number of chromosome abnormalities and has hitherto failed to increase pregnancy rates as expected. We are in the process of developing technologies to score all 24 chromosomes in single cells within a 3 day time limit, which we believe is vital in the clinical setting. Also, human placental cytotrophoblasts (CTBs) at the fetal-maternal interface acquire aneuploidies as they differentiate to an invasive phenotype. About 20-50% of invasive CTB cells from uncomplicated pregnancies were found to be aneuploid, suggesting that the acquisition of aneuploidy is an important component of normal placentation, perhaps limiting the proliferative and invasive potential of CTBs. Since most invasive CTBs are interphase cells and possess extreme heterogeneity, we applied multi-color FISH and repeated hybridizations to investigate individual CTBs. In summary, this study demonstrates the strength of Spectral Imaging analysis and repeated hybridizations, which provides a basis for full karyotype analysis of single interphase cells.

  7. The Elaboration of Spiral Galaxies: Morpho-Kinematics Analyses of their Progenitors with IMAGES

    NASA Astrophysics Data System (ADS)

    Hammer, F.; Images Collaboration

    2009-12-01

    The IMAGES (Intermediate MAss Galaxy Evolution Sequence) project aims at measuring the velocity fields of a representative sample of 100 massive galaxies at z=0.4-0.75, selected in the CDFS, the CFRS and the HDFS fields. It uses the world-unique mode of multiple integral field units of FLAMES/GIRAFFE at the VLT. The resolved-kinematics data allow us to sample the large scale motions at a scale of a few kpc for each galaxy. They have been combined with the deepest HST/ACS, Spitzer (MIPS and IRAC) and VLT/FORS2 observations ever achieved. Most intermediate redshift galaxies show anomalous velocity fields: 6 Gyr ago, half of the present day spirals were out of equilibrium and had peculiar morphologies. The wealth of the data in these fields allows us to model the physical processes in each galaxy with an accuracy almost similar to what is done in the local Universe. These detailed analyses reveal the importance of merger processes, including their remnant phases. Together with the large evolution of spiral properties, this points out the importance of disk survival and strengthens the disk rebuilding scenario. This suggests that the hierarchical scenario may apply to the elaboration of disk galaxies as it does for ellipticals.

  8. Short wave infrared chemical imaging as future tool for analysing gunshot residues patterns in targets.

    PubMed

    Ortega-Ojeda, F E; Torre-Roldán, M; García-Ruiz, C

    2017-05-15

    This work used chemical imaging in the short-wave infrared region for analysing gunshot residue (GSR) patterns in cotton fabric targets shot with conventional and non-toxic ammunition. It presents a non-destructive, non-toxic, highly visual and hyperspectral-based approach. The method was based on classical least squares regression and was tested with the ammunition propellants and their standard components' spectra. The propellants' spectra were satisfactorily used (R(2) >0.966, and CorrCoef >0.982) for identifying the GSR irrespective of the type of ammunition used for the shooting. In a more versatile approach, nitrocellulose, the main component of the ammunition propellants, proved to be an excellent standard for identifying GSR patterns (R(2)>0.842, and CorrCoef >0.908). In this case, the propellants' stabilizers (diphenylamine and centralite), their nitrated derivatives, and dinitrotoluene also showed high spectral activity. Therefore, they could be recommended as complementary standards for confirming the GSR identification. These findings establish the proof of concept for science-based evidence useful to support expert reports and final court rulings. This approach for obtaining GSR patterns can be an excellent alternative to the current and traditional chemical methods, which are based on presumptive and invasive colour tests.
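
    A bare-bones sketch of pixel-wise classical least squares against reference spectra, which is the type of regression the abstract names; the array shapes, the absence of preprocessing, and the synthetic data below are illustrative assumptions, not the authors' pipeline. High coefficients for the nitrocellulose reference flag GSR-like pixels.

      import numpy as np

      def cls_abundance_maps(cube, references):
          """cube: (rows, cols, bands) hyperspectral image.
          references: (n_components, bands) reference spectra.
          Returns (rows, cols, n_components) least-squares coefficients."""
          rows, cols, bands = cube.shape
          pixels = cube.reshape(-1, bands).T              # (bands, n_pixels)
          coeffs, *_ = np.linalg.lstsq(references.T, pixels, rcond=None)
          return coeffs.T.reshape(rows, cols, -1)

      # Example: a synthetic cube scored against a single nitrocellulose spectrum
      cube = np.random.rand(64, 64, 256)
      nitrocellulose = np.random.rand(1, 256)
      maps = cls_abundance_maps(cube, nitrocellulose)     # high values flag GSR-like pixels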

  9. Systematic Comparison of Brain Imaging Meta-Analyses of ToM with vPT

    PubMed Central

    Schurz, Matthias; Perner, Josef

    2017-01-01

    In visual perspective taking (vPT) one has to concern oneself with what other people see and how they see it. Since seeing is a mental state, developmental studies have discussed vPT within the domain of “theory of mind (ToM)” but imaging studies have not treated it as such. Based on earlier results from several meta-analyses, we tested for the overlap of visual perspective taking studies with 6 different kinds of ToM studies: false belief, trait judgments, strategic games, social animations, mind in the eyes, and rational actions. Joint activation was observed between the vPT task and some kinds of ToM tasks in regions involving the left temporoparietal junction (TPJ), anterior precuneus, left middle occipital gyrus/extrastriate body area (EBA), and the left inferior frontal and precentral gyrus. Importantly, no overlap activation was found for the vPT tasks with the joint core of all six kinds of ToM tasks. This raises the important question of what the common denominator of all tasks that fall under the label of “theory of mind” is supposed to be if visual perspective taking is not one of them. PMID:28367446

  10. Systematic Comparison of Brain Imaging Meta-Analyses of ToM with vPT.

    PubMed

    Arora, Aditi; Schurz, Matthias; Perner, Josef

    2017-01-01

    In visual perspective taking (vPT) one has to concern oneself with what other people see and how they see it. Since seeing is a mental state, developmental studies have discussed vPT within the domain of "theory of mind (ToM)" but imaging studies have not treated it as such. Based on earlier results from several meta-analyses, we tested for the overlap of visual perspective taking studies with 6 different kinds of ToM studies: false belief, trait judgments, strategic games, social animations, mind in the eyes, and rational actions. Joint activation was observed between the vPT task and some kinds of ToM tasks in regions involving the left temporoparietal junction (TPJ), anterior precuneus, left middle occipital gyrus/extrastriate body area (EBA), and the left inferior frontal and precentral gyrus. Importantly, no overlap activation was found for the vPT tasks with the joint core of all six kinds of ToM tasks. This raises the important question of what the common denominator of all tasks that fall under the label of "theory of mind" is supposed to be if visual perspective taking is not one of them.

  11. Continuous Measurements of Eyeball Area and Their Spectrum Analyses -- Toward the Quantification of Rest Rhythm of Horses by Image Processing

    DTIC Science & Technology

    2001-10-25

    analyses of electroencephalogram at half-closed eye and fully closed eye. This study aimed at quantitatively estimating the rest rhythm of horses by the ... analyses of eyeball movement. A mask fitted with a miniature CCD camera was newly developed. The continuous images of the horse eye for about 24 ... eyeball area were calculated. As for the results, the fluctuating status of the eyeball area was analyzed quantitatively, and the rest rhythm of horses was

  12. Pallasite formation after a non-destructive impact. An experimental- and image analyses-based study

    NASA Astrophysics Data System (ADS)

    Solferino, Giulio; Golabek, Gregor J.; Nimmo, Francis; Schmidt, Max W.

    2015-04-01

    The formation conditions of pallasite meteorites in the interior of terrestrial planetesimals have been a matter of debate over the last 40 years. Among other characteristics, the simple mineralogical composition (i.e., olivine, FeNi, FeS +/- pyroxene) and the dualism between fragmental and rounded olivine-bearing pallasites must be successfully reproduced by a potential formation scenario. This study combines a series of annealing experiments with olivine plus Fe-S and digital image analyses of slabs from the Brenham, Brahin, Seymchan, and Springwater pallasites. Additionally, a 1D finite-difference numerical model was employed to show that a non-destructive collision followed by mixing of the impactor's core with the target body's silicate mantle could lead to the formation of both fragmental and rounded pallasite types. Specifically, an impact occurring right after the completion of target body differentiation and up to several million years afterwards allows for (i) average grain sizes consistent with the observed rounded olivine-bearing pallasites, (ii) a remanent magnetization of FeNi inclusions in olivine as measured in natural pallasites, and (iii) the metallographic cooling rates derived from FeNi in pallasites. An important result of this investigation is the determination of the grain growth rate of olivine in molten Fe-S as follows: d^n - d_0^n = k_0 exp(-E_a/RT) t, where d_0 is the starting grain size, d the grain size at time t, n = 2.42(46) the growth exponent, k_0 = 9.43 x 10^6 μm^n s^-1 a characteristic constant, E_a = 289 kJ/mol the activation energy for the specific growth process, R the gas constant, and T the absolute temperature. The computed olivine coarsening rate is markedly faster than in the olivine-FeNi and olivine-Ni systems.
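
    As a worked illustration of the growth law quoted above, the short sketch below evaluates the olivine grain size reached after a given annealing time and temperature. The constants are taken directly from the abstract; the function name and the example conditions are ours.

      import numpy as np

      # Constants from the abstract: d^n - d0^n = k0 * exp(-Ea / (R*T)) * t
      N = 2.42           # growth exponent
      K0 = 9.43e6        # characteristic constant, um^n / s
      EA = 289e3         # activation energy, J/mol
      R = 8.314          # gas constant, J/(mol K)

      def olivine_grain_size(d0_um, t_s, T_K):
          """Grain size (um) after annealing for t_s seconds at T_K kelvin,
          starting from an initial grain size d0_um (um)."""
          rate = K0 * np.exp(-EA / (R * T_K))
          return (d0_um**N + rate * t_s) ** (1.0 / N)

      # Example: 10 um starting grains annealed for 1 Myr (about 3.156e13 s) at 1400 K
      d = olivine_grain_size(10.0, 3.156e13, 1400.0)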

  13. X-ray digital imaging petrography of lunar mare soils: modal analyses of minerals and glasses

    NASA Technical Reports Server (NTRS)

    Taylor, L. A.; Patchen, A.; Taylor, D. H.; Chambers, J. G.; McKay, D. S.

    1996-01-01

    It is essential that accurate modal (i.e., volume) percentages of the various mineral and glass phases in lunar soils be used for addressing and resolving the effects of space weathering upon reflectance spectra, as well as for their calibration; such data are also required for evaluating the resource potential of lunar minerals for use at a lunar base. However, these data are largely lacking. Particle-counting information for lunar soils, originally obtained to study formational processes, does not provide these necessary data, including the percentages of minerals locked in multi-phase lithic fragments and fused-soil particles, such as agglutinates. We have developed a technique for modal analyses, sensu stricto, of lunar soils, using digital imaging of X-ray maps obtained with an energy-dispersive spectrometer mounted on an electron microprobe. A suite of nine soils (90 to 150 micrometers size fraction) from the Apollo 11, 12, 15, and 17 mare sites was used for this study. This is the first collection of such modal data on soils from all Apollo mare sites. The abundances of free-mineral fragments in the mare soils are greater for immature and submature soils than for mature soils, largely because of the formation of agglutinitic glass as maturity progresses. In consideration of resource utilization at a lunar base, the best lunar soils to use for mineral beneficiation (i.e., most free-mineral fragments) have maturities near the immature/submature boundary (Is/FeO approximately equal to 30), not the mature soils with their complications due to extensive agglutination. The particle data obtained from the nine mare soils confirm the generalizations for lunar soils predicted by L.A. Taylor and D.S. McKay (1992, Lunar Planet Sci. Conf. 23rd, pp. 1411-1412 [Abstract]).
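
    A schematic sketch of the modal-analysis idea described above: classify each X-ray map pixel into a phase from its element intensities and report volume percentages over all pixels. The classification rules, thresholds, and element maps below are placeholders for illustration only; the actual phase assignments rely on calibrated microprobe data rather than these toy rules.

      import numpy as np

      def modal_percentages(element_maps, classify):
          """element_maps: dict of 2D arrays (one per element, same shape).
          classify: function mapping per-pixel element intensities to a phase name.
          Returns {phase: volume %} over all pixels."""
          shape = next(iter(element_maps.values())).shape
          counts = {}
          for idx in np.ndindex(shape):
              pixel = {el: m[idx] for el, m in element_maps.items()}
              phase = classify(pixel)
              counts[phase] = counts.get(phase, 0) + 1
          total = sum(counts.values())
          return {p: 100.0 * c / total for p, c in counts.items()}

      # Placeholder rules: real work uses calibrated element ratios for each phase
      def classify(p):
          if p["Fe"] > 0.5 and p["Ti"] > 0.5:
              return "ilmenite"
          if p["Ca"] > 0.5 and p["Si"] > 0.5:
              return "pyroxene/plagioclase"
          return "glass/other"

      maps = {el: np.random.rand(64, 64) for el in ("Fe", "Ti", "Ca", "Si")}
      modes = modal_percentages(maps, classify)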

  15. Spatiotemporal Analyses of Osteogenesis and Angiogenesis via Intravital Imaging in Cranial Bone Defect Repair

    PubMed Central

    Huang, Chunlan; Ness, Vincent P.; Yang, Xiaochuan; Chen, Hongli; Luo, Jiebo; Brown, Edward B; Zhang, Xinping

    2015-01-01

    Osteogenesis and angiogenesis are two integrated components in bone repair and regeneration. A deeper understanding of osteogenesis and angiogenesis has been hampered by technical difficulties of analyzing bone and neovasculature simultaneously in spatiotemporal scales and in three-dimensional formats. To overcome these barriers, a cranial defect window chamber model was established that enabled high-resolution, longitudinal, and real-time tracking of angiogenesis and bone defect healing via Multiphoton Laser Scanning Microscopy (MPLSM). By simultaneously probing new bone matrix via second harmonic generation (SHG), neovascular networks via intravenous perfusion of fluorophore, and osteoblast differentiation via 2.3kb collagen type I promoter driven GFP (Col2.3GFP), we examined the morphogenetic sequence of cranial bone defect healing and further established the spatiotemporal analyses of osteogenesis and angiogenesis coupling in repair and regeneration. We demonstrated that bone defect closure was initiated in the residual bone around the edge of the defect. The expansion and migration of osteoprogenitors into the bone defect occurred during the first 3 weeks of healing, coupled with vigorous microvessel angiogenesis at the leading edge of the defect. Subsequent bone repair was marked by matrix deposition and active vascular network remodeling within new bone. Implantation of bone marrow stromal cells (BMSCs) isolated from Col2.3GFP mice further showed that donor-dependent bone formation occurred rapidly within the first 3 weeks of implantation, in concert with early angiogenesis. The subsequent bone wound closure was largely host-dependent, associated with localized modest induction of angiogenesis. The establishment of a live imaging platform via cranial window provides a unique tool to understand osteogenesis and angiogenesis in repair and regeneration, enabling further elucidation of the spatiotemporal regulatory mechanisms of osteoprogenitor cell interactions

  16. X-ray digital imaging petrography of lunar mare soils: modal analyses of minerals and glasses.

    PubMed

    Taylor, L A; Patchen, A; Taylor, D H; Chambers, J G; McKay, D S

    1996-12-01

    It is essential that accurate modal (i.e., volume) percentages of the various mineral and glass phases in lunar soils be used for addressing and resolving the effects of space weathering upon reflectance spectra, as well as for their calibration; such data are also required for evaluating the resource potential of lunar minerals for use at a lunar base. However, these data are largely lacking. Particle-counting information for lunar soils, originally obtained to study formational processes, does not provide these necessary data, including the percentages of minerals locked in multi-phase lithic fragments and fused-soil particles, such as agglutinates. We have developed a technique for modal analyses, sensu stricto, of lunar soils, using digital imaging of X-ray maps obtained with an energy-dispersive spectrometer mounted on an electron microprobe. A suite of nine soils (90 to 150 micrometers size fraction) from the Apollo 11, 12, 15, and 17 mare sites was used for this study. This is the first collection of such modal data on soils from all Apollo mare sites. The abundances of free-mineral fragments in the mare soils are greater for immature and submature soils than for mature soils, largely because of the formation of agglutinitic glass as maturity progresses. In consideration of resource utilization at a lunar base, the best lunar soils to use for mineral beneficiation (i.e., most free-mineral fragments) have maturities near the immature/submature boundary (Is/FeO approximately equal to 30), not the mature soils with their complications due to extensive agglutination. The particle data obtained from the nine mare soils confirm the generalizations for lunar soils predicted by L.A. Taylor and D.S. McKay (1992, Lunar Planet Sci. Conf. 23rd, pp. 1411-1412 [Abstract]).

  17. Maximizing Science Return from Future Mars Missions with Onboard Image Analyses

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Bandari, E. B.; Roush, T. L.

    2000-01-01

    We have developed two new techniques to enhance science return and to decrease returned data volume for near-term Mars missions: 1) multi-spectral image compression and 2) autonomous identification and fusion of in-focus regions in an image series.
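
    The abstract does not describe the focus-detection algorithm itself; one common way to identify and fuse in-focus regions across an image series is a local focus measure such as the variance of the Laplacian, evaluated tile by tile, as in the NumPy/SciPy sketch below. The approach, names, and parameters are illustrative, not the authors' onboard implementation.

      import numpy as np
      from scipy.ndimage import laplace

      def fuse_in_focus(frames, tile=32):
          """frames: list of 2D grayscale images of the same scene at different focus.
          For each tile, keep the frame with the highest variance of the Laplacian."""
          stack = np.stack(frames)                            # (n, h, w)
          lap = np.stack([laplace(f) for f in frames])        # edge response per frame
          fused = np.empty_like(frames[0])
          h, w = fused.shape
          for y in range(0, h, tile):
              for x in range(0, w, tile):
                  patch = lap[:, y:y + tile, x:x + tile].reshape(len(frames), -1)
                  best = int(np.argmax(patch.var(axis=1)))    # sharpest frame for this tile
                  fused[y:y + tile, x:x + tile] = stack[best, y:y + tile, x:x + tile]
          return fused

      frames = [np.random.rand(128, 128) for _ in range(5)]
      composite = fuse_in_focus(frames)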

  18. Detection of melamine in milk powders based on NIR hyperspectral imaging and spectral similarity analyses

    USDA-ARS?s Scientific Manuscript database

    Melamine (2,4,6-triamino-1,3,5-triazine) contamination of food has become an urgent and broadly recognized topic as a result of several food safety scares in the past five years. Hyperspectral imaging techniques that combine the advantages of spectroscopy and imaging have been widely applied for a v...

  19. Segmentation of very high spatial resolution images based on multifractal analysis

    NASA Astrophysics Data System (ADS)

    Voorons, Matthieu

    The recent availability, in remote sensing, of very high spatial resolution images brings the texture classification of images to a higher level of complexity. The singularity content of very high spatial resolution images, such as those from IKONOS, is very important due to the high local variability of their gray levels. Such images have so many details that classical classification algorithms fail to achieve good results. In the case of IKONOS images of forest areas, a texture can be so different within the same class that it becomes very difficult, even for a human, to classify or interpret them. The study of the high frequency content of the data seems to be a good way to study those images. Multifractal analysis provides us with global and local information on the singularities which represent the high frequency content of the image. We propose a new approach which uses the singularities of the image to achieve the classification of very high spatial resolution optical forestry images. It is based on the computation of the Hölder regularity exponent at each point in the image. From this parameter we can compute the local lacunarity spectrum or the large deviation multifractal spectrum, which give information about the geometric distribution of the singularities in the image. We thus use global and local descriptors of the regularity of the signal as input parameters to a k-means algorithm. The two algorithms are described and applied to an IKONOS image of a forest area as well as to two artificial images, one made of Brodatz textures and the other of fractional Brownian motions. The classification results are compared to those obtained with Gabor filters, Laws filters, the fractal dimension, Gaussian Markov random fields and the Haralick co-occurrence parameters. The proposed methods give good results and are even able to segment the image into tree density classes. We also devised tests to see the resistance of the discrimination power of the proposed

  20. Difference method for analysing infrared images in pigs with elevated body temperatures.

    PubMed

    Siewert, Carsten; Dänicke, Sven; Kersten, Susanne; Brosig, Bianca; Rohweder, Dirk; Beyerbach, Martin; Seifert, Hermann

    2014-03-01

    Infrared imaging proves to be a quick and simple method for measuring the temperature distribution on the pig's head. The study showed that infrared imaging and analysis with a difference ROI (region of interest) method may be used for early detection of elevated body temperature in pigs (> 39.5°C). Specificity was high at approx. 85%, as was sensitivity at 86%. The only prerequisite is that at least 2 anatomical regions can be recognised reproducibly in the IR image. Noise suppression is achieved by averaging the temperature values within both of these ROIs. The subsequent difference imaging largely removes the offset error, which varies in every thermal IR image.
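
    A minimal sketch of the difference-ROI idea described above: average the temperatures inside two reproducible anatomical ROIs in the IR image and use their difference, which cancels the frame-wide offset. The threshold in the example is a hypothetical cutoff, not the paper's calibrated value.

      import numpy as np

      def roi_difference(ir_image, roi_a, roi_b):
          """ir_image: 2D array of temperatures (deg C).
          roi_a, roi_b: boolean masks of two reproducible anatomical regions.
          Averaging within each ROI suppresses noise; the difference removes the
          per-image offset error."""
          return ir_image[roi_a].mean() - ir_image[roi_b].mean()

      # Illustrative use: flag a pig as possibly febrile if the ROI difference
      # exceeds a (hypothetical) threshold learned from reference animals.
      img = 35.0 + np.random.rand(240, 320)
      roi_a = np.zeros_like(img, dtype=bool); roi_a[50:80, 100:140] = True
      roi_b = np.zeros_like(img, dtype=bool); roi_b[150:180, 100:140] = True
      elevated = roi_difference(img, roi_a, roi_b) > 1.5    # hypothetical cutoff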

  1. Effect of Harderian adenectomy on the statistical analyses of mouse brain imaging using positron emission tomography

    PubMed Central

    Kim, Minsoo; Woo, Sang-Keun; Yu, Jung Woo; Lee, Yong Jin; Kim, Kyeong Min; Kang, Joo Hyun; Eom, Kidong

    2014-01-01

    Positron emission tomography (PET) using 2-deoxy-2-[18F] fluoro-D-glucose (FDG) as a radioactive tracer is a useful technique for in vivo brain imaging. However, the anatomical and physiological features of the Harderian gland limit the use of FDG-PET imaging in the mouse brain. The gland shows strong FDG uptake, which in turn results in distorted PET images of the frontal brain region. The purpose of this study was to determine if a simple surgical procedure to remove the Harderian gland prior to PET imaging of mouse brains could reduce or eliminate FDG uptake. Measurement of FDG uptake in unilaterally adenectomized mice showed that the radioactive signal emitted from the intact Harderian gland distorts frontal brain region images. Spatial parametric measurement analysis demonstrated that the presence of the Harderian gland could prevent accurate assessment of brain PET imaging. Bilateral Harderian adenectomy efficiently eliminated unwanted radioactive signal spillover into the frontal brain region beginning on postoperative Day 10. Harderian adenectomy did not cause any post-operative complications during the experimental period. These findings demonstrate the benefits of performing a Harderian adenectomy prior to PET imaging of mouse brains. PMID:23820224

  2. Focal plane array infrared imaging: a new way to analyse leaf tissue.

    PubMed

    Heraud, Philip; Caine, Sally; Sanson, Gordon; Gleadow, Ros; Wood, Bayden R; McNaughton, Don

    2007-01-01

    * Here, a new approach to macromolecular imaging of leaf tissue using a multichannel focal plane array (FPA) infrared detector was compared with the proven method of infrared mapping with a synchrotron source, using transverse sections of leaves from a species of Eucalyptus. * A new histological method was developed, ideally suited to infrared spectroscopic analysis of leaf tissue. Spatial resolution and the signal-to-noise ratio of the FPA imaging and synchrotron mapping methods were compared. * An area of tissue 350 microm(2) required approx. 8 h to map using the synchrotron technique and approx. 2 min to image using the FPA. The two methods produced similar infrared images, which differentiated all tissue types in the leaves according to their macromolecular chemistry. * The synchrotron and FPA methods produced similar results, with the synchrotron method having superior signal-to-noise ratio and potentially better spatial resolution, whereas the FPA method had the advantage in terms of data acquisition time, expense and ease of use. FPA imaging offers a convenient, laboratory-based approach to microscopic chemical imaging of leaves.

  3. Quantitative histological image analyses of reticulin fibers in a myelofibrotic mouse

    PubMed Central

    Lucero, Hector A.; Patterson, Shenia; Matsuura, Shinobu; Ravid, Katya

    2016-01-01

    Bone marrow (BM) reticulin fibrosis (RF), revealed by silver staining of tissue sections, is associated with myeloproliferative neoplasms, yet tools for quantitative assessment of reticulin deposition throughout a femur BM are still needed. Here, we present such a method, allowing, via analysis of hundreds of composite images, identification of the patchy nature of RF throughout the BM during disease progression in a mouse model of myelofibrosis. To this end, initial conversion of silver-stained BM color images into binary images identified two limitations: variable color, owing to polychromatic staining of reticulin fibers, and variable background in different sections of the same batch, limiting application of the color deconvolution method and of a constant threshold, respectively. By blind-coding image identities to allow for threshold input (still within a narrow range), and by using shape filtering to further eliminate background, we were able to quantitate RF in myelofibrotic Gata-1low (experimental) and wild type (control) mice as a function of animal age. Color images spanning the whole femur BM were batch-analyzed using ImageJ software, aided by our two newly added macros. The results show heterogeneous RF density in different areas of the marrow of Gata-1low mice, with the degree of heterogeneity reduced upon aging. This method can be applied uniformly across laboratories in studies assessing RF remodeling induced by aging or other conditions in animal models. PMID:28008415
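
    A simplified sketch of the pipeline described above (grayscale conversion, operator-entered threshold, shape filtering, fiber area fraction), written with scikit-image; it stands in for the authors' ImageJ macros, and the shape-filter parameters are placeholders.

      import numpy as np
      from skimage.color import rgb2gray
      from skimage.measure import label, regionprops

      def reticulin_area_fraction(rgb_image, threshold, min_area=20, min_ecc=0.8):
          """Fraction of the image area covered by fiber-like objects.
          threshold is entered per (blind-coded) image, as in the study;
          shape filtering keeps elongated objects and drops rounded background blobs."""
          gray = rgb2gray(rgb_image)
          binary = gray < threshold                   # silver-stained fibers are dark
          labeled = label(binary)
          fiber_pixels = 0
          for region in regionprops(labeled):
              if region.area >= min_area and region.eccentricity >= min_ecc:
                  fiber_pixels += region.area
          return fiber_pixels / binary.size

      img = np.random.rand(256, 256, 3)               # stand-in for a stained section
      fraction = reticulin_area_fraction(img, threshold=0.35)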

  4. Applying I-FGM to image retrieval and an I-FGM system performance analyses

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Santos, Eunice E.; Nguyen, Hien; Pan, Long; Korah, John; Zhao, Qunhua; Xia, Huadong

    2007-04-01

    Intelligent Foraging, Gathering and Matching (I-FGM) combines a unique multi-agent architecture with a novel partial processing paradigm to provide a solution for real-time information retrieval in large and dynamic databases. I-FGM provides a unified framework for combining the results from various heterogeneous databases and seeks to provide easily verifiable performance guarantees. In our previous work, I-FGM had been implemented and validated with experiments on dynamic text data. However, the heterogeneity of search spaces requires our system having the ability to effectively handle various types of data. Besides texts, images are the most significant and fundamental data for information retrieval. In this paper, we extend the I-FGM system to incorporate images in its search spaces using a region-based Wavelet Image Retrieval algorithm called WALRUS. Similar to what we did for text retrieval, we modified the WALRUS algorithm to partially and incrementally extract the regions from an image and measure the similarity value of this image. Based on the obtained partial results, we refine our computational resources by updating the priority values of image documents. Experiments have been conducted on I-FGM system with image retrieval. The results show that I-FGM outperforms its control systems. Also, in this paper we present theoretical analysis of the systems with a focus on performance. Based on probability theory, we provide models and predictions of the average performance of the I-FGM system and its two control systems, as well as the systems without partial processing.

  5. Atmospheric moisture and cloud structure determined from SSM/I and global gridpoint analyses. [Special Sensor Microwave Imager

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Huang, Huo-Jin

    1989-01-01

    Data from the Special Sensor Microwave Imager/I on the DMSP satellite are used to study atmospheric moisture and cloud structure. Column-integrated water vapor and total liquid water retrievals are obtained using an algorithm based on a radiative model for brightness temperature (Wentz, 1983). The results from analyzing microwave and IR measurements are combined with independent global gridpoint analyses to study the distribution and structure of atmospheric moisture over oceanic regions.

  6. ALISSA: an automated live-cell imaging system for signal transduction analyses.

    PubMed

    Wenus, Jakub; Düssmann, Heiko; Paul, Perrine; Kalamatianos, Dimitrios; Rehm, Markus; Wellstead, Peter; Prehn, Jochen; Huber, Heinrich

    2009-12-01

    Probe photobleaching and a specimen's sensitivity to phototoxicity severely limit the number of possible excitation cycles in time-lapse fluorescent microscopy experiments. Consequently, when a study of cellular processes requires measurements over hours or days, temporal resolution is limited, and spontaneous or rapid events may be missed, thus limiting conclusions about transduction events. We have developed ALISSA, a design framework and reference implementation for an automated live-cell imaging system for signal transduction analysis. It allows an adaptation of image modalities and laser resources tailored to the biological process, and thereby extends temporal resolution from minutes to seconds. The system employs online image analysis to detect cellular events that are then used to exercise microscope control. It consists of a reusable image analysis software for cell segmentation, tracking, and time series extraction, and a measurement-specific process control software that can be easily adapted to various biological settings. We have applied the ALISSA framework to the analysis of apoptosis as a demonstration case for slow onset and rapid execution signaling. The demonstration provides a clear proof-of-concept for ALISSA, and offers guidelines for its application in a broad spectrum of signal transduction studies.

  7. Formal Distinctiveness of High- and Low-Imageability Nouns: Analyses and Theoretical Implications

    ERIC Educational Resources Information Center

    Reilly, Jamie; Kean, Jacob

    2007-01-01

    Words associated with perceptually salient, highly imageable concepts are learned earlier in life, more accurately recalled, and more rapidly named than abstract words (R. W. Brown, 1976; Walker & Hulme, 1999). Theories accounting for this concreteness effect have focused exclusively on semantic properties of word referents. A novel possibility is…

  8. Three-dimensional imaging system for analyses of dynamic droplet impaction and deposition formation on leaves

    USDA-ARS?s Scientific Manuscript database

    A system was developed to assess the dynamic processes of droplet impact, rebound and retention on leaf surfaces with three-dimensional (3-D) images. The system components consisted of a uniform-size droplet generator, two high speed digital video cameras, a constant speed track, a leaf holder, and ...

  9. MALDI Imaging-Guided Microproteomic Analyses of Heterogeneous Breast Tumors - A Pilot Study.

    PubMed

    Alberts, Deborah; Pottier, Charles; Smargiasso, Nicolas; Baiwir, Dominique; Mazzucchelli, Gabriel; Delvenne, Philippe; Kriegsmann, Mark; Kazdal, Daniel; Warth, Arne; De Pauw, Edwin; Longuespée, Rémi

    2017-08-11

    Matrix-assisted laser desorption/ionization (MALDI) imaging is an ideal tool to study intratumor heterogeneity (ITH) and its implication in prognostic stratification of patients. However, there are some drawbacks concerning protein identification. On the other hand, laser microdissection (LMD)-based microproteomics allows retrieving thousands of protein identifications from small tissue pieces. As a proof of concept, we combined these two complementary approaches to analyze heterogeneous regions in breast tumors. Invasive ductal breast cancer FFPE tissue sections from five patients were analyzed by MALDI imaging, and the datasets were processed by segmentation. Heterogeneous regions within tumors were processed by LMD-based microproteomics, in duplicate. Liquid chromatography-tandem mass spectrometry data were classified by hierarchical clustering. Heterogeneous tissue regions were discriminated on the basis of their actual molecular heterogeneity. The dataset was correlated with MALDI imaging to identify m/z values discriminating heterogeneous regions. The molecular characterization of cell clones in tumors related to poor patient outcome could have a great impact on pathology. We present a combined application of LMD-based microproteomics and MALDI imaging for ITH studies. This article is protected by copyright. All rights reserved.

  10. Functional connectivity analyses in imaging genetics: considerations on methods and data interpretation.

    PubMed

    Bedenbender, Johannes; Paulus, Frieder M; Krach, Sören; Pyka, Martin; Sommer, Jens; Krug, Axel; Witt, Stephanie H; Rietschel, Marcella; Laneri, Davide; Kircher, Tilo; Jansen, Andreas

    2011-01-01

    Functional magnetic resonance imaging (fMRI) can be combined with genotype assessment to identify brain systems that mediate genetic vulnerability to mental disorders ("imaging genetics"). A data analysis approach that is widely applied is "functional connectivity". In this approach, the temporal correlation between the fMRI signal from a pre-defined brain region (the so-called "seed point") and other brain voxels is determined. In this technical note, we show how the choice of freely selectable data analysis parameters strongly influences the assessment of the genetic modulation of connectivity features. In our data analysis we focus, as examples, on three methodological parameters: (i) seed voxel selection, (ii) noise reduction algorithms, and (iii) use of additional second level covariates. Our results show that even small variations in the implementation of a functional connectivity analysis can have an impact on the connectivity pattern that is as strong as the potential modulation by genetic allele variants. Some effects of genetic variation can only be found for one specific implementation of the connectivity analysis. A recurring difficulty in the field of psychiatric genetics is the non-replication of initially promising findings, partly caused by the small effects of single genes. The replication of imaging genetic results is therefore crucial for the long-term assessment of genetic effects on neural connectivity parameters. For a meaningful comparison of imaging genetics studies, however, it is necessary to provide more details on specific methodological parameters (e.g., seed voxel distribution) and to give information on how robust the effects are across the choice of methodological parameters.
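
    A bare-bones sketch of the seed-based correlation step described above (NumPy only): the mean time course from a seed mask is correlated with every voxel's time course. Preprocessing, noise regression, and second-level covariates, the very parameters the note shows to matter, are deliberately omitted, and all names are ours.

      import numpy as np

      def seed_connectivity(bold, seed_mask):
          """bold: 4D array (x, y, z, t) of preprocessed fMRI data.
          seed_mask: 3D boolean array selecting the seed voxels.
          Returns a 3D map of Pearson correlations with the mean seed time course."""
          t = bold.shape[-1]
          seed_ts = bold[seed_mask].mean(axis=0)                 # (t,)
          vox = bold.reshape(-1, t)
          vox_c = vox - vox.mean(axis=1, keepdims=True)
          seed_c = seed_ts - seed_ts.mean()
          denom = np.linalg.norm(vox_c, axis=1) * np.linalg.norm(seed_c)
          r = (vox_c @ seed_c) / np.where(denom == 0, np.inf, denom)
          return r.reshape(bold.shape[:3])

      bold = np.random.rand(16, 16, 10, 120)
      seed = np.zeros(bold.shape[:3], dtype=bool); seed[8, 8, 5] = True
      conn_map = seed_connectivity(bold, seed)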

  11. Three-Dimensional Acoustic Tissue Model: A Computational Tissue Phantom for Image Analyses

    NASA Astrophysics Data System (ADS)

    Mamou, J.; Oelze, M. L.; O'Brien, W. D.; Zachary, J. F.

    A novel methodology to obtain three-dimensional (3D) acoustic tissue models (3DATMs) is introduced. 3DATMs can be used as computational tools for ultrasonic imaging algorithm development and analysis. In particular, 3D models of biological structures can provide great benefit to better understand fundamentally how ultrasonic waves interact with biological materials. As an example, such models were used to generate ultrasonic images that characterize tumor tissue microstructures. 3DATMs can be used to evaluate a variety of tissue types. Typically, excised tissue is fixed, embedded, serially sectioned, and stained. The stained sections are digitally imaged (24-bit bitmap) with light microscopy. The contrast of each stained section is equalized, and an automated registration algorithm aligns consecutive sections. The normalized mutual information is used as a similarity measure, and simplex optimization is conducted to find the best alignment. Both rigid and non-rigid registrations are performed. During tissue preparation, some sections are generally lost; thus, interpolation prior to 3D reconstruction is performed. Interpolation is conducted after registration using cubic Hermite polynomials. The registered (and interpolated) sections yield a 3D histologic volume (3DHV). Acoustic properties are then assigned to each tissue constituent of the 3DHV to obtain the 3DATMs. As an example, a 3D acoustic impedance tissue model (3DZM) was obtained for a solid breast tumor (EHS mouse sarcoma) and used to estimate ultrasonic scatterer size. The 3DZM results yielded an effective scatterer size of 32.9 (±6.1) μm. Ultrasonic backscatter measurements conducted on the same tumor tissue in vivo yielded an effective scatterer size of 33 (±8) μm. This good agreement shows that 3DATMs may be a powerful modeling tool for acoustic imaging applications.
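
    A compact sketch of the normalized mutual information similarity measure named above for section-to-section registration, NMI = (H(A) + H(B)) / H(A, B); the joint-histogram binning and the shifted-copy example are illustrative, and the simplex (Nelder-Mead) search over alignment parameters that would maximize this score is not shown.

      import numpy as np

      def normalized_mutual_information(a, b, bins=64):
          """NMI = (H(A) + H(B)) / H(A, B) for two images of equal shape."""
          joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1)
          py = pxy.sum(axis=0)

          def entropy(p):
              p = p[p > 0]
              return -np.sum(p * np.log(p))

          return (entropy(px) + entropy(py)) / entropy(pxy.ravel())

      a = np.random.rand(128, 128)
      b = np.roll(a, 3, axis=0)        # a slightly shifted copy of the same section
      score = normalized_mutual_information(a, b)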

  12. Contextualising and Analysing Planetary Rover Image Products through the Web-Based PRoGIS

    NASA Astrophysics Data System (ADS)

    Morley, Jeremy; Sprinks, James; Muller, Jan-Peter; Tao, Yu; Paar, Gerhard; Huber, Ben; Bauer, Arnold; Willner, Konrad; Traxler, Christoph; Garov, Andrey; Karachevtseva, Irina

    2014-05-01

    The international planetary science community has launched, landed and operated dozens of human and robotic missions to the planets and the Moon. They have collected various surface imagery that has only been partially utilized for further scientific purposes. The FP7 project PRoViDE (Planetary Robotics Vision Data Exploitation) is assembling a major portion of the imaging data gathered so far from planetary surface missions into a unique database, bringing them into a spatial context and providing access to a complete set of 3D vision products. Processing is complemented by a multi-resolution visualization engine that combines various levels of detail for a seamless and immersive real-time access to dynamically rendered 3D scenes. PRoViDE aims to (1) complete relevant 3D vision processing of planetary surface missions, such as Surveyor, Viking, Pathfinder, MER, MSL, Phoenix, Huygens, and Lunar ground-level imagery from Apollo, Russian Lunokhod and selected Luna missions, (2) provide highest resolution & accuracy remote sensing (orbital) vision data processing results for these sites to embed the robotic imagery and its products into spatial planetary context, (3) collect 3D Vision processing and remote sensing products within a single coherent spatial data base, (4) realise seamless fusion between orbital and ground vision data, (5) demonstrate the potential of planetary surface vision data by maximising image quality visualisation in 3D publishing platform, (6) collect and formulate use cases for novel scientific application scenarios exploiting the newly introduced spatial relationships and presentation, (7) demonstrate the concepts for MSL, (9) realize on-line dissemination of key data & its presentation by a web-based GIS and rendering tool named PRoGIS (Planetary Robotics GIS). PRoGIS is designed to give access to rover image archives in geographical context, using projected image view cones, obtained from existing meta-data and updated according to

  13. Hyperspectral and Chlorophyll Fluorescence Imaging to Analyse the Impact of Fusarium culmorum on the Photosynthetic Integrity of Infected Wheat Ears

    PubMed Central

    Bauriegel, Elke; Giebel, Antje; Herppich, Werner B.

    2011-01-01

    Head blight on wheat, caused by Fusarium spp., is a serious problem for both farmers and food production due to the concomitant production of highly toxic mycotoxins in infected cereals. For selective mycotoxin analyses, information about the on-field status of infestation would be helpful. Early symptom detection directly on ears, together with the corresponding geographic position, would be important for selective harvesting. Hence, the capabilities of various digital imaging methods to detect head blight disease on winter wheat were tested. Time series of images of healthy and artificially Fusarium-infected ears were recorded with a laboratory hyperspectral imaging system (wavelength range: 400 nm to 1,000 nm). Disease-specific spectral signatures were evaluated with imaging software. Applying the ‘Spectral Angle Mapper’ method, healthy and infected ear tissue could be clearly classified. Simultaneously, chlorophyll fluorescence imaging of healthy and infected ears, and visual rating of the severity of disease, were performed. Between six and eleven days after artificial inoculation, the photosynthetic efficiency of infected ears decreased compared to that of healthy ears. The severity of disease highly correlated with photosynthetic efficiency. Above an infection limit of 5% severity of disease, chlorophyll fluorescence imaging reliably recognised infected ears. With this technique, differentiation of the severity of disease was successful in steps of 10%. Depending on the quality of the chosen regions of interest, hyperspectral imaging readily detects head blight 7 d after inoculation up to a severity of disease of 50%. After the beginning of ripening, healthy and diseased ears were hardly distinguishable with the evaluated methods. PMID:22163820
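
    A minimal Spectral Angle Mapper sketch: the angle between each pixel spectrum and a set of reference spectra (e.g., healthy versus infected ear tissue) assigns the pixel to the closest reference. The array shapes, the synthetic spectra, and the argmin decision rule are generic illustrations, not the calibrated classification of the study.

      import numpy as np

      def spectral_angles(cube, references):
          """cube: (rows, cols, bands); references: (n_classes, bands).
          Returns (rows, cols, n_classes) spectral angles in radians."""
          pix = cube.reshape(-1, cube.shape[-1])
          pix_n = pix / np.linalg.norm(pix, axis=1, keepdims=True)
          ref_n = references / np.linalg.norm(references, axis=1, keepdims=True)
          cosines = np.clip(pix_n @ ref_n.T, -1.0, 1.0)
          return np.arccos(cosines).reshape(*cube.shape[:2], -1)

      cube = np.random.rand(100, 100, 160)              # stand-in 400-1,000 nm cube
      refs = np.random.rand(2, 160)                     # healthy vs. infected references
      labels = np.argmin(spectral_angles(cube, refs), axis=-1)   # per-pixel class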

  14. Respiratory motion of adrenal gland metastases: Analyses using four-dimensional computed tomography images.

    PubMed

    Chen, Bing; Hu, Yong; Liu, Jin; Cao, An-Ning; Ye, Lu-Xi; Zeng, Zhao-Chong

    2017-06-01

    To evaluate the respiratory motion of adrenal gland metastases in three-dimensional directions using four-dimensional computed tomography (4DCT) images. From January 2013 to May 2016, 12 patients with adrenal gland metastases were included in this study. They all underwent 4DCT scans to assess respiratory motion of adrenal gland metastases in the free-breathing state. The 4DCT images were sorted into 10 image series according to the respiratory phase from end inspiration to end expiration, and then transferred to a FocalSim workstation. All gross tumor volumes (GTVs) of adrenal gland metastases were drawn by a single physician and confirmed by a second. Relative coordinates of the adrenal gland metastases were automatically generated to calculate metastasis motion along the different axes. The average respiratory motion of adrenal gland metastases in the left-right (LR), cranial-caudal (CC), anterior-posterior (AP) and 3-dimensional (3D) vector directions was 3.4±2.2 mm, 9.5±5.5 mm, 3.8±2.0 mm and 11.3±5.3 mm, respectively. The ratios were 58.6%±11.4% and 63.2%±12.5% when the volumes of GTVIn0% and GTVIn100% were compared with the volume of IGTV10phase. The volume ratio of IGTV10phase to GTV3D was 1.73±0.48. Adrenal gland metastasis is a respiration-induced moving target, and an internal target volume margin should be provided when designing the treatment plan. The CC motion of adrenal gland metastases is predominant and >5 mm; thus, motion management strategies are recommended for patients undergoing external radiotherapy for adrenal gland metastasis. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
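
    One straightforward way to turn per-phase GTV centroid coordinates into amplitudes of the kind reported above is the peak-to-peak excursion along each axis plus the length of that excursion vector; the sketch below is a generic illustration with hypothetical coordinates, not necessarily the exact definition used in the paper.

      import numpy as np

      def respiratory_motion(centroids_mm):
          """centroids_mm: (10, 3) array of GTV centroid (LR, CC, AP) coordinates,
          one row per 4DCT respiratory phase. Returns the peak-to-peak amplitude
          per axis and the corresponding 3D vector amplitude."""
          amplitude = centroids_mm.max(axis=0) - centroids_mm.min(axis=0)
          vector_3d = np.linalg.norm(amplitude)
          return amplitude, vector_3d

      phases = np.cumsum(np.random.randn(10, 3), axis=0)   # hypothetical drift over phases
      (lr, cc, ap), vec = respiratory_motion(phases)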

  15. Use of Very High-Resolution Airborne Images to Analyse 3d Canopy Architecture of a Vineyard

    NASA Astrophysics Data System (ADS)

    Burgos, S.; Mota, M.; Noll, D.; Cannelle, B.

    2015-08-01

    Distinguishing between green cover and grape canopy is a challenge for vigour status evaluation in viticulture. This paper presents the acquisition methodology of very high-resolution images (4 cm), using a Sensefly Swinglet CAM unmanned aerial vehicle (UAV), and their processing to construct a 3D digital surface model (DSM) for the creation of precise digital terrain models (DTM). The DTM was obtained using Python processing libraries. The DTM was then subtracted from the DSM in order to obtain a differential digital model (DDM) of the vineyard. In the DDM, the vine pixels were then obtained by selecting all pixels with an elevation higher than 50 [cm] above the ground level. The results show that it was possible to separate pixels of the green cover from those of the vine rows. The DDM showed values between -0.1 and +1.5 [m]. A manual delineation of polygons based on the RGB image, belonging to the green cover and to the vine rows, gave highly significant differences, with average values of 1.23 [m] and 0.08 [m] for the vine and the ground respectively. The vine row elevation is in good agreement with the topping height of the vines (1.35 [m]) measured in the field. This mask could be used to analyse images of the same plot taken at different times. The extraction of only vine pixels will facilitate subsequent analyses, for example, a supervised classification of these pixels.
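
    A minimal sketch of the DDM step described above: subtract the DTM from the DSM and keep pixels more than 0.5 m above ground as vine-row pixels. The array names and the synthetic rasters are illustrative; the threshold is the 50 cm value stated in the abstract.

      import numpy as np

      def vine_mask(dsm, dtm, height_threshold_m=0.5):
          """dsm, dtm: 2D elevation grids (m) on the same georeferenced raster.
          Returns the differential digital model and a boolean vine-row mask."""
          ddm = dsm - dtm                        # canopy height above ground
          return ddm, ddm > height_threshold_m   # vine pixels: > 0.5 m above ground

      dsm = np.random.rand(500, 500) * 1.6       # stand-in for the photogrammetric DSM
      dtm = np.zeros_like(dsm)                   # stand-in for the interpolated DTM
      ddm, mask = vine_mask(dsm, dtm)
      vine_fraction = mask.mean()                # share of the plot covered by vine rows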

  16. Capabilities and Impact on Wind Analyses of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Miller, Timothy L.; Amarin, Ruba; Atlas, Robert; Bailey, M. C.; Black, Peter; Buckley, Courtney; James, Mark; Johnson, James; Jones, Linwood; Ruf, Christopher; Simmons, David; Uhlhorn, Eric

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. The instrument is being test flown in January and is expected to participate in or collaborate with the tropical cyclone experiment GRIP (Genesis and Rapid Intensification Processes) in the 2010 season. HIRAD is designed to study the wind field in some detail within strong hurricanes and to enhance the real-time airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track at a single point directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approx.3 x the aircraft altitude) with approx.2 km resolution. See Figure 1, which depicts a simulated HIRAD swath versus the line of data obtained by SFMR.

  17. Voxel-based analyses of magnetization transfer imaging of the brain in hepatic encephalopathy

    PubMed Central

    Miese, Falk R; Wittsack, Hans-Jörg; Kircheis, Gerald; Holstein, Arne; Mathys, Christian; Mödder, Ulrich; Cohnen, Mathias

    2009-01-01

    AIM: To evaluate the spatial distribution of cerebral abnormalities in cirrhotic subjects with and without hepatic encephalopathy (HE) found with magnetization transfer imaging (MTI). METHODS: Nineteen cirrhotic patients graded from neurologically normal to HE grade 2 and 18 healthy control subjects underwent magnetic resonance imaging. They gave institutional-review-board-approved written consent. Magnetization transfer ratio (MTR) maps were generated from MTI. We tested for significant differences compared to the control group using statistical non-parametric mapping (SnPM) for a voxel-based evaluation. RESULTS: The MTR of grey and white matter was lower in subjects with more severe HE. Changes were found in patients with cirrhosis without neurological deficits in the basal ganglia and bilateral white matter. The loss in magnetization transfer increased in severity and spatial extent in patients with overt HE. Patients with HE grade 2 showed an MTR decrease in white and grey matter: the maximum loss of magnetization transfer effect was located in the basal ganglia [SnPM (pseudo-)t = 17.98, P = 0.0001]. CONCLUSION: The distribution of MTR changes in HE points to an early involvement of basal ganglia and white matter in HE. PMID:19891014
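    The abstract does not spell out how the MTR maps were computed; the sketch below uses the standard voxel-wise definition MTR = 100 x (M0 - Msat) / M0 (in percent units), with hypothetical array names for the unsaturated and saturated acquisitions:

    ```python
    import numpy as np

    def mtr_map(m0: np.ndarray, msat: np.ndarray) -> np.ndarray:
        """Magnetization transfer ratio map in percent units: 100 * (M0 - Msat) / M0."""
        m0 = np.asarray(m0, dtype=float)
        msat = np.asarray(msat, dtype=float)
        mtr = np.zeros_like(m0)
        np.divide(100.0 * (m0 - msat), m0, out=mtr, where=m0 > 0)  # background stays 0
        return mtr
    ```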

  18. Analysed cap mesenchyme track data from live imaging of mouse kidney development.

    PubMed

    Lefevre, James G; Combes, Alexander N; Little, Melissa H; Hamilton, Nicholas A

    2016-12-01

    This article provides detailed information on manually tracked cap mesenchyme cells from timelapse imaging of multiple ex vivo embryonic mouse kidneys. Cells were imaged for up to 18 h at 15 or 20 min intervals, and multiple cell divisions were tracked. Positional data is supplemented with a range of information including the relative location of the closest ureteric tip and a correction for drift due to bulk movement and tip growth. A subset of tracks were annotated to indicate the presence of processes attached to the ureteric epithelium. The calculations used for drift correction are described, as are the main methods used in the analysis of this data for the purpose of describing cap cell motility. The outcomes of this analysis are discussed in "Cap mesenchyme cell swarming during kidney development is influenced by attraction, repulsion, and adhesion to the ureteric tip" (A.N. Combes, J.G. Lefevre, S. Wilson, N.A. Hamilton, M.H. Little, 2016) [1].
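    The exact drift-correction formula is given in the data article itself; purely as a generic illustration, one common approach is to subtract the cumulative mean frame-to-frame displacement of all tracked cells, sketched below (the array layout is an assumption):

    ```python
    import numpy as np

    def remove_bulk_drift(tracks: np.ndarray) -> np.ndarray:
        """Subtract the mean per-frame displacement (bulk drift) from every track.

        tracks: (n_cells, n_frames, 3) array of x, y, z positions;
                NaNs mark frames in which a cell was not tracked.
        """
        step = np.diff(tracks, axis=1)                   # per-frame displacements
        drift = np.nanmean(step, axis=0)                 # mean displacement per frame
        cumulative = np.vstack([np.zeros(3), np.cumsum(drift, axis=0)])
        return tracks - cumulative[None, :, :]           # drift-corrected positions
    ```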

  19. Image analyser as a tool for the study of in vitro glomerular vasoreactivity.

    PubMed

    Lakhdar, B; Potier, M; L'Azou, B; Cambar, J

    1994-12-01

    Many drugs used in the clinic can dramatically impair renal hemodynamics. Over the past several years, two in vitro glomerular models, isolated glomeruli and mesangial cell cultures, have been developed in our laboratory to quantify, by video image analysis, the direct glomerular effect of vasoreactive agents. The present study shows the vasoconstrictive effects of angiotensin II and cyclosporin in both models and compares their glomerular vasoconstriction with or without vasodilating agents such as verapamil. This drug-induced glomerular vasoreactivity is time- and dose-dependent; moreover, it can be reversible after perfusion under control conditions. The value of these in vitro glomerular models is supported by fair correlations between in vivo and in vitro data and between the responses of the two models. These models can be considered as tools for assessing the glomerular vasoreactivity of nephrotoxic agents.

  20. Diffusion tensor imaging atlas-based analyses in major depression after mild traumatic brain injury.

    PubMed

    Rao, Vani; Mielke, Michelle; Xu, Xin; Smith, Gwenn S; McCann, Una D; Bergey, Alyssa; Doshi, Vishal; Pham, Dzung L; Yousem, David; Mori, Susumi

    2012-01-01

    There are currently no known early neuroanatomical markers predictive of the development of major depression or depressive symptoms after mild traumatic brain injury (mTBI). The authors conducted a 1-year longitudinal pilot study to determine whether diffusion tensor imaging (DTI) measures collected within 1 month of mTBI could predict incident depression. Of the 14 subjects who met study inclusion criteria, 4 (28.6%) developed major depression over the follow-up period. Compared with the nondepressed group, those who developed depression had white-matter abnormalities in the fronto-temporal regions measured by DTI. These preliminary results highlight the need for additional studies, including studies using a larger sample and appropriate controls.

  1. Dugong: a Docker image, based on Ubuntu Linux, focused on reproducibility and replicability for bioinformatics analyses.

    PubMed

    Menegidio, Fabiano B; Jabes, Daniela L; Costa de Oliveira, Regina; Nunes, Luiz R

    2017-09-04

    This manuscript introduces and describes Dugong, a Docker image based on Ubuntu 16.04, which automates the installation of more than 3500 bioinformatics tools (along with their respective libraries and dependencies) in alternative computational environments. The software operates through a user-friendly XFCE4 graphic interface that allows software management and installation by users not fully familiarized with the Linux command line, and provides the Jupyter Notebook to assist in the delivery and exchange of consistent and reproducible protocols and results across laboratories, assisting in the development of open science projects. Source code and instructions for local installation are available at https://github.com/DugongBioinformatics, under the MIT open source license. Contact: Luiz.Nunes@ufabc.edu.br. Supplementary data are available at https://dugongbioinformatics.github.io/.

  2. Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems.

    PubMed

    Teodoro, George; Kurc, Tahsin M; Pan, Tony; Cooper, Lee A D; Kong, Jun; Widener, Patrick; Saltz, Joel H

    2012-05-01

    The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches.
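    The paper's scheduler is not reproduced here; the sketch below only illustrates the general idea of performance-aware CPU-GPU co-scheduling: each operation is dispatched to whichever device is predicted to finish it earliest, given per-device cost estimates. Task names, costs and device counts are hypothetical.

    ```python
    import heapq

    # Hypothetical per-task costs in seconds: (name, cpu_time, gpu_time).
    TASKS = [("color_deconvolution", 6.0, 1.0),
             ("morphological_open", 2.0, 1.0),
             ("feature_stats", 1.5, 1.2)] * 4

    def schedule(tasks, n_cpus=8, n_gpus=2):
        """Greedy list scheduling: high GPU-speedup tasks first, each task to the
        device (CPU core or GPU) that would complete it earliest."""
        cpus = [(0.0, f"cpu{i}") for i in range(n_cpus)]
        gpus = [(0.0, f"gpu{i}") for i in range(n_gpus)]
        heapq.heapify(cpus)
        heapq.heapify(gpus)
        plan = []
        for name, cpu_t, gpu_t in sorted(tasks, key=lambda t: t[1] / t[2], reverse=True):
            free_c, c = cpus[0]
            free_g, g = gpus[0]
            if free_g + gpu_t <= free_c + cpu_t:        # GPU finishes this task sooner
                heapq.heapreplace(gpus, (free_g + gpu_t, g))
                plan.append((name, g))
            else:
                heapq.heapreplace(cpus, (free_c + cpu_t, c))
                plan.append((name, c))
        return plan

    print(schedule(TASKS))
    ```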

  3. Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems

    PubMed Central

    Teodoro, George; Kurc, Tahsin M.; Pan, Tony; Cooper, Lee A.D.; Kong, Jun; Widener, Patrick; Saltz, Joel H.

    2014-01-01

    The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches. PMID:25419545

  4. X-ray fluorescence and imaging analyses of paintings by the Brazilian artist Oscar Pereira Da Silva

    NASA Astrophysics Data System (ADS)

    Campos, P. H. O. V.; Kajiya, E. A. M.; Rizzutto, M. A.; Neiva, A. C.; Pinto, H. P. F.; Almeida, P. A. D.

    2014-02-01

    Non-destructive analyses, such as EDXRF (Energy-Dispersive X-Ray Fluorescence) spectroscopy, and imaging were used to characterize easel paintings. The analyzed objects are from the collection of the Pinacoteca do Estado de São Paulo. EDXRF results allowed us to identify the chemical elements present in the pigments, showing the use of many Fe-based pigments, modern pigments, such as cobalt blue and cadmium yellow, as well as white pigments containing lead and zinc used by the artist in different layers. Imaging analysis was useful to identify the state of conservation, the localization of old and new restorations and also to detect and unveil the underlying drawings revealing the artist's creative processes.

  5. MSiReader v1.0: Evolving Open-Source Mass Spectrometry Imaging Software for Targeted and Untargeted Analyses.

    PubMed

    Bokhart, Mark T; Nazari, Milad; Garrard, Kenneth P; Muddiman, David C

    2017-09-20

    A major update to the mass spectrometry imaging (MSI) software MSiReader is presented, offering a multitude of newly added features critical to MSI analyses. MSiReader is a free, open-source, and vendor-neutral software written in the MATLAB platform and is capable of analyzing most common MSI data formats. A standalone version of the software, which does not require a MATLAB license, is also distributed. The newly incorporated data analysis features expand the utility of MSiReader beyond simple visualization of molecular distributions. The MSiQuantification tool allows researchers to calculate absolute concentrations from quantification MSI experiments exclusively through MSiReader software, significantly reducing data analysis time. An image overlay feature allows the incorporation of complementary imaging modalities to be displayed with the MSI data. A polarity filter has also been incorporated into the data loading step, allowing the facile analysis of polarity switching experiments without the need for data parsing prior to loading the data file into MSiReader. A quality assurance feature to generate a mass measurement accuracy (MMA) heatmap for an analyte of interest has also been added to allow for the investigation of MMA across the imaging experiment. Most importantly, as new features have been added performance has not degraded, in fact it has been dramatically improved. These new tools and the improvements to the performance in MSiReader v1.0 enable the MSI community to evaluate their data in greater depth and in less time. Graphical Abstract ᅟ.

  6. ICPES analyses using full image spectra and astronomical data fitting algorithms to provide diagnostic and result information

    SciTech Connect

    Spencer, W.A.; Goode, S.R.

    1997-10-01

    ICP emission analyses are prone to errors due to changes in power level, nebulization rate, plasma temperature, and sample matrix. As a result, accurate analyses of complex samples often require frequent bracketing with matrix matched standards. Information needed to track and correct the matrix errors is contained in the emission spectrum. But most commercial software packages use only the analyte line emission to determine concentrations. Changes in plasma temperature and the nebulization rate are reflected by changes in the hydrogen line widths, the oxygen emission, and neutral ion line ratios. Argon and off-line emissions provide a measure to correct the power level and the background scattering occurring in the polychromator. The authors' studies indicated that changes in the intensity of the Ar 404.4 nm line readily flag most matrix and plasma condition modifications. Carbon lines can be used to monitor the impact of organics on the analyses and calcium and argon lines can be used to correct for spectral drift and alignment. Spectra of contaminated groundwater and simulated defense waste glasses were obtained using a Thermo Jarrell Ash ICP that has an echelle CID detector system covering the 190-850 nm range. The echelle images were translated to the FITS data format, which astronomers recommend for data storage. Data reduction packages such as those in the ESO-MIDAS/ECHELLE and DAOPHOT programs were tried with limited success. The radial point spread function was evaluated as a possible improved peak intensity measurement instead of the common pixel averaging approach used in the commercial ICP software. Several algorithms were evaluated to align and automatically scale the background and reference spectra. A new data reduction approach that utilizes standard reference images, successive subtractions, and residual analyses has been evaluated to correct for matrix effects.
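    Once the echelle frames are stored as FITS, they can be read back with a standard astronomy library; a minimal sketch using astropy (the file name, extension index and off-line background region are assumptions, not details from the study):

    ```python
    import numpy as np
    from astropy.io import fits

    # Hypothetical file name; the primary HDU is assumed to hold the 2-D echelle image.
    with fits.open("echelle_frame.fits") as hdul:
        image = hdul[0].data.astype(float)
        header = hdul[0].header

    # Crude background level from an assumed off-line corner of the detector.
    background = np.median(image[:50, :50])
    print(header.get("DATE-OBS"), image.shape, background)
    ```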

  7. Expansion analyses on low-excitation planetary nebulae with stellar images

    SciTech Connect

    Tamura, Shinichi; Shibata, K. M. (Nobeyama Radio Observatory, Minamimaki)

    1990-11-01

    The paper presents the results of analyses on the expansion characteristics of the low-excitation and unresolved planetary nebulae, M1-5, M1-9, K3-66, and K3-67. The sample nebulae are divided into two groups. The first includes the real compact planetary nebulae M1-5 and M1-9 based on their single-Gaussian profiles. The second one includes nebulae that are unresolved because of their large distances. The nebulae K3-66 and K3-67 should belong to the second group since they show the double-Gaussian components in the emission-line profiles. Relationships between expansion velocities and I([O III] λ5007)/I(Hβ) and between electron densities and expansion velocities give the basis for the above arguments and reveal that the nebulae IC 4997, Vy2-2, and M3-27 obviously are in different phases of evolution from those of other low-excitation planetary nebulae. 24 refs.

  8. Improving the local solution accuracy of large-scale digital image-based finite element analyses.

    PubMed

    Charras, G T; Guldberg, R E

    2000-02-01

    Digital image-based finite element modeling (DIBFEM) has become a widely utilized approach for efficiently meshing complex biological structures such as trabecular bone. While DIBFEM can provide accurate predictions of apparent mechanical properties, its application to simulate local phenomena such as tissue failure or adaptation has been limited by high local solution errors at digital model boundaries. Furthermore, refinement of digital meshes does not necessarily reduce local maximum errors. The purpose of this study was to evaluate the potential to reduce local mean and maximum solution errors in digital meshes using a post-processing filtration method. The effectiveness of a three-dimensional, boundary-specific filtering algorithm was found to be mesh size dependent. Mean absolute and maximum errors were reduced for meshes with more than five elements through the diameter of a cantilever beam considered representative of a single trabecula. Furthermore, mesh refinement consistently decreased errors for filtered solutions but not necessarily for non-filtered solutions. Models with more than five elements through the beam diameter yielded absolute mean errors of less than 15% for both Von Mises stress and maximum principal strain. When applied to a high-resolution model of trabecular bone microstructure, boundary filtering produced a more continuous solution distribution and reduced the predicted maximum stress by 30%. Boundary-specific filtering provides a simple means of improving local solution accuracy while retaining the model generation and numerical storage efficiency of the DIBFEM technique.

  9. Airflow analyses using thermal imaging in Arizona's Meteor Crater as part of METCRAX II

    NASA Astrophysics Data System (ADS)

    Grudzielanek, A. Martina; Vogt, Roland; Cermak, Jan; Maric, Mateja; Feigenwinter, Iris; Whiteman, C. David; Lehner, Manuela; Hoch, Sebastian W.; Krauß, Matthias G.; Bernhofer, Christian; Pitacco, Andrea

    2016-04-01

    In October 2013 the second Meteor Crater Experiment (METCRAX II) took place at the Barringer Meteorite Crater (aka Meteor Crater) in north central Arizona, USA. Downslope-windstorm-type flows (DWF), the main research objective of METCRAX II, were measured by a comprehensive set of meteorological sensors deployed in and around the crater. During two weeks of METCRAX II five infrared (IR) time lapse cameras (VarioCAM® hr research & VarioCAM® High Definition, InfraTec) were installed at various locations on the crater rim to record high-resolution images of the surface temperatures within the crater from different viewpoints. Changes of surface temperature are indicative of air temperature changes induced by flow dynamics inside the crater, including the DWF. By correlating thermal IR surface temperature data with meteorological sensor data during intensive observational periods the applicability of the IR method of representing flow dynamics can be assessed. We present evaluation results and draw conclusions relative to the application of this method for observing air flow dynamics in the crater. In addition we show the potential of the IR method for METCRAX II in 1) visualizing airflow processes to improve understanding of these flows, and 2) analyzing cold-air flows and cold-air pooling.
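    The correlation step mentioned above can be as simple as comparing co-located time series; a minimal sketch with hypothetical numbers (not METCRAX II data), assuming one IR brightness-temperature pixel and one in-situ air-temperature record at the same site:

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    # Hypothetical co-located series (degrees C) during a cold-air-pooling episode.
    t_ir_surface = np.array([12.4, 11.8, 10.9, 10.1, 9.6, 9.8, 10.5])
    t_air_2m = np.array([11.9, 11.2, 10.4, 9.7, 9.3, 9.5, 10.1])

    r, p = pearsonr(t_ir_surface, t_air_2m)
    print(f"r = {r:.2f}, p = {p:.3f}")
    ```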

  10. Statistical improvements in functional magnetic resonance imaging analyses produced by censoring high-motion data points.

    PubMed

    Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E

    2014-05-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement.
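    Motion censoring is typically implemented by flagging volumes whose framewise displacement (FD) exceeds a threshold and withholding them from the GLM; the sketch below builds such a keep/censor mask. The FD convention used (rotations converted to arc length on a 50 mm sphere) is a common one, and the threshold shown is illustrative rather than this study's exact criterion.

    ```python
    import numpy as np

    def keep_mask(realignment: np.ndarray, fd_threshold_mm: float = 0.9) -> np.ndarray:
        """Boolean mask of volumes to keep, based on framewise displacement (FD).

        realignment: (n_volumes, 6) motion parameters
                     [x, y, z translations in mm; three rotations in radians].
        """
        params = realignment.astype(float).copy()
        params[:, 3:] *= 50.0                        # radians -> mm of arc on a 50 mm sphere
        fd = np.concatenate([[0.0], np.abs(np.diff(params, axis=0)).sum(axis=1)])
        return fd <= fd_threshold_mm                 # False marks volumes to censor
    ```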

  11. Floral diversity in desert ecosystems: comparing field sampling to image analyses in assessing species cover.

    PubMed

    Ksiksi, Taoufik S; El-Keblawy, Ali A

    2013-06-10

    Developing a quick and reliable technique to estimate floral cover in deserts will assist in monitoring and management. The present attempt was to estimate plant cover in the UAE desert using both digital photography and field sampling. Digital photographs were correlated with field data to estimate floral cover in moderately (Al-Maha) and heavily (DDCR) grazed areas. The Kruskal-Wallis test was also used to assess compatibility between the two techniques within and across grazing intensities and soil substrates. Results showed that photographs could be a reliable technique within the sand dune substrate under moderate grazing (r = 0.69). The results were very poorly correlated (r = -0.24) or even inversely proportional (r = -0.48) when performed within DDCR. Overall, Chi-square values for Al-Maha and DDCR were not significant at P > 0.05, indicating similarities between the two methods. At the soil type level, the Kruskal-Wallis analysis was not significant (P > 0.05), except for gravel plains (P < 0.05). Across grazing intensities and soil substrates, the two techniques were in agreement in ranking most plant species, except for Lycium shawii. Consequently, the present study has shown that digital photography could not be used reliably to assess floral cover, although further testing is required to support such a claim. An image-based sampling approach of plant cover at the species level, across different grazing and substrate variations in desert ecosystems, has its uses, but results are to be cautiously interpreted.
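    A minimal sketch of the Kruskal-Wallis comparison between the two cover-estimation methods, using scipy and hypothetical cover values (not the study's data):

    ```python
    from scipy.stats import kruskal

    # Hypothetical percent-cover estimates for one species on sand dunes:
    # field quadrats vs. the corresponding digital photographs.
    field_cover = [12.0, 8.5, 15.2, 10.1, 9.8, 14.0]
    photo_cover = [11.5, 9.0, 13.8, 10.6, 9.1, 12.9]

    h, p = kruskal(field_cover, photo_cover)
    print(f"H = {h:.2f}, p = {p:.3f}")  # p > 0.05 would suggest no detectable method difference
    ```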

  12. Imaging Erg and Jun transcription factor interaction in living cells using fluorescence resonance energy transfer analyses

    SciTech Connect

    Camuzeaux, Barbara; Heliot, Laurent; Coll, Jean (E-mail: martine.duterque@ibl.fr)

    2005-07-15

    Physical interactions between transcription factors play important roles in modulating gene expression. Previous in vitro studies have shown a transcriptional synergy between Erg protein, an Ets family member, and Jun/Fos heterodimer, members of the bZip family, which requires direct Erg-Jun protein interactions. Visualization of protein interactions in living cells is a new challenge in biology. For this purpose, we generated fusion proteins of Erg, Fos, and Jun with yellow and cyan fluorescent proteins, YFP and CFP, respectively. After transient expression in HeLa cells, interactions of the resulting fusion proteins were explored by fluorescence resonance energy transfer microscopy (FRET) in fixed and living cells. FRET between YFP-Erg and CFP-Jun was monitored by using photobleaching FRET and fluorescence lifetime imaging microscopy. Both techniques revealed the occurrence of intermolecular FRET between YFP-Erg and CFP-Jun. This is stressed by loss of FRET with a YFP-Erg version carrying a point mutation in its ETS domain. These results provide evidence for the interaction of Erg and Jun proteins in living cells as a critical prerequisite of their transcriptional synergy, but also for the essential role of the Y371 residue, conserved in most Ets proteins, in this interaction.
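    For the two read-outs used above (acceptor photobleaching and fluorescence lifetime imaging), FRET efficiency is conventionally estimated as sketched below; these are the textbook definitions with illustrative numbers, not values from this study:

    ```python
    def fret_efficiency_photobleach(f_donor_pre: float, f_donor_post: float) -> float:
        """Acceptor-photobleaching FRET: E = 1 - F_pre / F_post
        (donor fluorescence rises after the acceptor is bleached)."""
        return 1.0 - f_donor_pre / f_donor_post

    def fret_efficiency_lifetime(tau_da: float, tau_d: float) -> float:
        """FLIM-based FRET: E = 1 - tau_DA / tau_D
        (donor lifetime with vs. without the acceptor)."""
        return 1.0 - tau_da / tau_d

    print(fret_efficiency_photobleach(80.0, 100.0))  # 0.20, illustrative intensities
    print(fret_efficiency_lifetime(2.0, 2.5))        # 0.20, illustrative lifetimes in ns
    ```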

  13. Statistical Improvements in Functional Magnetic Resonance Imaging Analyses Produced by Censoring High-Motion Data Points

    PubMed Central

    Siegel, Joshua S.; Power, Jonathan D.; Dubis, Joseph W.; Vogel, Alecia C.; Church, Jessica A.; Schlaggar, Bradley L.; Petersen, Steven E.

    2013-01-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring (“motion scrubbing”). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. PMID:23861343

  14. Quantitative analyses of variability in normal vaginal shape and dimension on MR images

    PubMed Central

    Luo, Jiajia; Betschart, Cornelia; Ashton-Miller, James A.; DeLancey, John O. L.

    2016-01-01

    Introduction and hypothesis We present a technique for quantifying inter-individual variability in normal vaginal shape, axis, and dimension, and report findings in healthy women. Methods Eighty women (age: 28-70 years) with normal pelvic organ support underwent supine, multi-planar proton-density MRI. Vaginal width was assessed at five evenly-spaced locations, and vaginal axis, length, and surface area were quantified via ImageJ and MATLAB. Results The mid-sagittal plane angles, relative to the horizontal, of three vaginal axes were 90 ± 11, 72 ± 21, and 41 ± 22° (caudal to cranial, p < 0.001). The mean (± SD) vaginal widths were 17 ± 5, 24 ± 4, 30 ± 7, 41 ± 9, and 45 ± 12 mm at the five locations (caudal to cranial, p < 0.001). Mid-sagittal lengths for the anterior and posterior vaginal walls were 63 ± 9 and 98 ± 18 mm, respectively. The vaginal surface area was 72 ± 21 cm² (range: 34-164 cm²). The coefficient of determination between any demographic variable and any vaginal dimension did not exceed 0.16. Conclusions Large variations in normal vaginal shape, axis, and dimensions were not explained by body size or other demographic variables. This variation has implications for reconstructive surgery, intravaginal and surgical product design, and vaginal drug delivery. PMID:26811115

  15. Electro-viscoelastic response of an acrylic elastomer analysed by digital image correlation

    NASA Astrophysics Data System (ADS)

    Thylander, S.; Menzel, A.; Ristinmaa, M.; Hall, S.; Engqvist, J.

    2017-08-01

    Experimental investigations are presented with respect to the electromechanically coupled and time-dependent behaviour of an acrylic elastomer, VHB 4910. For the electromechanically coupled experiments different biaxial pre-stretches have been considered and full-field measurements were made using three-dimensional surface digital image correlation. Both equi-biaxial and non equi-biaxial pre-stretches as well as both circular and non-circular electrodes are investigated. The experimental data provide new insights into the complex material behaviour of VHB 4910 and will enable enhanced calibration and development of constitutive models. Special emphasis lies on the quantification of out-of-plane deformation, i.e. the thickness change, of biaxially pre-stretched specimens based on an assumption of material incompressibility. The level of pre-stretch and thickness of the membrane plays a critical role in view of electromechanical instabilities. For completeness, cyclic uniaxial tests of the purely mechanical response are also presented and compared to similar experiments found in the literature.
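    Under the incompressibility assumption mentioned above, the out-of-plane (thickness) stretch follows directly from the two measured in-plane stretches, lambda_3 = 1 / (lambda_1 * lambda_2); a minimal sketch follows (the 1 mm nominal membrane thickness is the usual figure for VHB 4910, used here only as an example):

    ```python
    def thickness_stretch(lambda_1: float, lambda_2: float) -> float:
        """Out-of-plane stretch of an incompressible membrane: 1 / (lambda_1 * lambda_2)."""
        return 1.0 / (lambda_1 * lambda_2)

    # Example: a 3x3 equi-biaxial pre-stretch thins the membrane to 1/9 of its thickness.
    initial_thickness_mm = 1.0
    print(initial_thickness_mm * thickness_stretch(3.0, 3.0))  # ~0.111 mm
    ```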

  16. Gray and white matter volumetric and diffusion tensor imaging (DTI) analyses in the early stage of first-episode schizophrenia.

    PubMed

    Moriya, Junji; Kakeda, Shingo; Abe, Osamu; Goto, Naoki; Yoshimura, Reiji; Hori, Hikaru; Ohnari, Norihiro; Sato, Toru; Aoki, Shigeki; Ohtomo, Kuni; Nakamura, Jun; Korogi, Yukunori

    2010-02-01

    To determine whether statistical analyses of quantitative MR imaging data, including morphological changes, mean diffusivity (MD), and fractional anisotropy (FA), could provide useful biomarkers in early stage of first-episode schizophrenia. Twenty-three patients, who met all the criteria in the DSM-IV-TR category for schizophrenia excluding the duration of the disease (less than 6 months of follow-up), were examined by MR imaging during the initial consultation. Nineteen of the 23 patients were finally diagnosed to have schizophrenia after a 6-month follow-up, and they were included in this study as having been in the early stage of first-episode schizophrenia. Nineteen healthy volunteers also underwent MR imaging as age-matched controls. Three-dimensional spoiled gradient recalled acquisition with steady state (3D-SPGR) and diffusion tensor imaging (DTI) were performed at 3T. Image processing for voxel-based morphometry, a fully automatic technique for a computational analysis of differences in regional brain volume throughout the entire brain, was conducted using the Statistical Parametric Mapping 5 software package (SPM5). The 3D-SPGR images in the native space were bias-corrected; spatially normalized; segmented into gray matter, white matter, and cerebrospinal fluid images; and intensity-modulated using SPM5. A voxel-based analysis was conducted using both the MD and FA maps computed from DTI. The customized MD and FA template specific to this study was created from all participants. Thereafter, all the MD and FA maps in the native space were transformed onto the stereotactic space by registering each of the images to the customized MD and FA template. The two groups were compared using SPM5. Age and sex were treated as confounding covariates. The patients demonstrated a significant increase in the MD of the left parahippocampal gyrus, left insula, and right anterior cingulate gyrus in comparison to the control subjects (FDR corrected p<0.05). No significant

  17. Floral diversity in desert ecosystems: Comparing field sampling to image analyses in assessing species cover

    PubMed Central

    2013-01-01

    Background Developing a quick and reliable technique to estimate floral cover in deserts will assist in monitoring and management. The present attempt was to estimate plant cover in the UAE desert using both digital photography and field sampling. Digital photographs were correlated with field data to estimate floral cover in moderately (Al-Maha) and heavily (DDCR) grazed areas. The Kruskal-Wallis test was also used to assess compatibility between the two techniques within and across grazing intensities and soil substrates. Results Results showed that photographs could be a reliable technique within the sand dune substrate under moderate grazing (r = 0.69). The results were very poorly correlated (r = −0.24) or even inversely proportional (r = −0.48) when performed within DDCR. Overall, Chi-square values for Al-Maha and DDCR were not significant at P > 0.05, indicating similarities between the two methods. At the soil type level, the Kruskal-Wallis analysis was not significant (P > 0.05), except for gravel plains (P < 0.05). Across grazing intensities and soil substrates, the two techniques were in agreement in ranking most plant species, except for Lycium shawii. Conclusions Consequently, the present study has shown that digital photography could not be used reliably to assess floral cover, although further testing is required to support such a claim. An image-based sampling approach of plant cover at the species level, across different grazing and substrate variations in desert ecosystems, has its uses, but results are to be cautiously interpreted. PMID:23758667

  18. Application of terahertz pulsed imaging to analyse film coating characteristics of sustained-release coated pellets.

    PubMed

    Haaser, M; Karrout, Y; Velghe, C; Cuppok, Y; Gordon, K C; Pepper, M; Siepmann, J; Rades, T; Taday, P F; Strachan, C J

    2013-12-05

    Terahertz pulsed imaging (TPI) was employed to explore its suitability for detecting differences in the film coating thickness and drug layer uniformity of multilayered, sustained-release coated, standard size pellets (approximately 1 mm in diameter). Pellets consisting of a sugar starter core and a metoprolol succinate layer were coated with a Kollicoat® SR:Kollicoat® IR polymer blend for different times, giving three groups of pellets (batches I, II and III), each with a different coating thickness according to weight gain. Ten pellets from each batch were mapped individually to evaluate the coating thickness and drug layer thickness between batches, between pellets within each batch, and across individual pellets (uniformity). From the terahertz waveform the terahertz electric field peak strength (TEFPS) was used to define a circular area (approximately 0.13 mm²) in the TPI maps, where no signal distortion was found due to pellet curvature in the measurement set-up used. The average coating thicknesses were 46 μm, 71 μm and 114 μm for batches I, II and III, respectively, whilst no drug layer thickness difference between batches was observed. No statistically significant differences in the average coating thickness and drug layer thickness within batches (between pellets) were found, but high thickness variability across individual pellets was observed. These results were confirmed by scanning electron microscopy (SEM). The coating thickness results correlated with the subsequent drug release behaviour. The fastest drug release was obtained from batch I with the lowest coating thickness and the slowest from batch III with the highest coating thickness. In conclusion, TPI is suitable for detailed, non-destructive evaluation of film coating and drug layer thicknesses in multilayered standard size pellets.
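    The abstract reports the resulting thicknesses rather than the underlying relation; in reflection-mode TPI, layer thickness is conventionally obtained from the optical delay between echoes at successive interfaces as d = c * delta_t / (2 n). The sketch below uses that standard relation with an assumed coating refractive index, purely for illustration:

    ```python
    C_UM_PER_PS = 299.792458  # speed of light in micrometres per picosecond

    def layer_thickness_um(delta_t_ps: float, refractive_index: float) -> float:
        """Layer thickness from the terahertz echo delay in reflection geometry:
        d = c * delta_t / (2 * n)."""
        return C_UM_PER_PS * delta_t_ps / (2.0 * refractive_index)

    # Illustrative only: a 0.55 ps delay with an assumed refractive index of 1.8
    # corresponds to a coating roughly 46 micrometres thick.
    print(round(layer_thickness_um(0.55, 1.8)))
    ```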

  19. Should processed or raw image data be used in mammographic image quality analyses? A comparative study of three full-field digital mammography systems.

    PubMed

    Borg, Mark; Badr, Ishmail; Royle, Gary

    2015-01-01

    The purpose of this study is to compare a number of measured image quality parameters using processed and unprocessed or raw images in two full-field direct digital units and one computed radiography mammography system. This study shows that the difference between raw and processed image data is system specific. The results have shown that there are no significant differences between raw and processed data in the mean threshold contrast values using the contrast-detail mammography phantom in all the systems investigated; however, these results cannot be generalised to all available systems. Notable differences were noted in contrast-to-noise ratios and in other tests including: response function, modulation transfer function, noise equivalent quanta, normalised noise power spectra and detective quantum efficiency as specified in IEC 62220-1-2. Consequently, the authors strongly recommend the use of raw data for all image quality analyses in digital mammography. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. A framework for computational fluid dynamic analyses of patient-specific stented coronary arteries from optical coherence tomography images.

    PubMed

    Migliori, Susanna; Chiastra, Claudio; Bologna, Marco; Montin, Eros; Dubini, Gabriele; Aurigemma, Cristina; Fedele, Roberto; Burzotta, Francesco; Mainardi, Luca; Migliavacca, Francesco

    2017-09-01

    The clinical challenge of percutaneous coronary interventions (PCI) is highly dependent on the recognition of the coronary anatomy of each individual. The classic imaging modality used for PCI is angiography, but advanced imaging techniques that are routinely performed during PCI, like optical coherence tomography (OCT), may provide detailed knowledge of the pre-intervention vessel anatomy as well as the post-procedural assessment of the specific stent-to-vessel interactions. Computational fluid dynamics (CFD) is an emerging investigational tool in the setting of optimization of PCI results. In this study, an OCT-based reconstruction method was developed for the execution of CFD simulations of patient-specific coronary artery models which include the actual geometry of the implanted stent. The method was applied to a rigid phantom resembling a stented segment of the left anterior descending coronary artery. The segmentation algorithm was validated against manual segmentation. A strong correlation was found between automatic and manual segmentation of the lumen in terms of area values. Similarity indices were >96% for the lumen segmentation and >77% for the stent strut segmentation. The 3D reconstruction achieved for the stented phantom was also assessed against the geometry provided by an X-ray computed micro-tomography scan, used as ground truth, and showed the incidence of distortion from catheter-based imaging techniques. The 3D reconstruction was successfully used to perform CFD analyses, demonstrating a great potential for patient-specific investigations. In conclusion, OCT may represent a reliable source for patient-specific CFD analyses which may be optimized using dedicated automatic segmentation algorithms. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
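    The similarity index used for the lumen and strut comparisons is not defined in the abstract; a common choice for comparing automatic against manual segmentation masks is the Dice coefficient, sketched below with hypothetical masks:

    ```python
    import numpy as np

    def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
        """Dice similarity coefficient: 2 * |A intersect B| / (|A| + |B|)."""
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    # Hypothetical lumen masks from automatic vs. manual OCT frame segmentation.
    auto = np.zeros((64, 64), dtype=bool)
    auto[16:48, 16:48] = True
    manual = np.zeros((64, 64), dtype=bool)
    manual[18:48, 16:46] = True
    print(round(dice(auto, manual), 3))
    ```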

  1. Analyse multiechelle d'images radar: Application au filtrage, a la classification et a la fusion d'images radar et optique

    NASA Astrophysics Data System (ADS)

    Foucher, Samuel

    Radar images are corrupted by a multiplicative noise (speckle) that substantially reduces the radiometric resolution of extended homogeneous targets. The aim of this thesis is to study the contribution of multiscale analysis, and more particularly of the wavelet transform, to the problems of speckle reduction and unsupervised classification of radar images. Within the framework of the stationary wavelet transform, which guarantees a translation-invariant representation, the usual adaptive filtering techniques are extended to the multiscale domain. We propose to take the statistical specificities of the radar image into account (multiplicative model, K distribution) in order to separate the wavelet coefficients generated by the noise alone from those generated by the significant structures of the image. The Pearson system of distributions is applied to model the probability distribution of the wavelet coefficients. When the observed intensity follows a K distribution, the Pearson system leads to a type IV distribution (complex Beta distribution). The Pearson type IV is implemented in a MAP (maximum a posteriori) weighting. The influence of speckle correlation on the higher-order moments is then evaluated quantitatively from an MA ("moving average") model of the correlated radar image. The results obtained on a set of artificial images show that the multiscale approach achieves a better compromise between detail preservation and smoothing of homogeneous regions than traditional filtering methods. For classification, the multiscale representation makes it possible to vary the trade-off between spatial precision and radiometric uncertainty. The theory of belief functions provides a theoretical framework for handling the notions of uncertainty and imprecision. We propose to combine the multiscale decisions directly using Dempster's rule, integrating the

  2. White matter abnormalities associated with auditory hallucinations in schizophrenia: a combined study of voxel-based analyses of diffusion tensor imaging and structural magnetic resonance imaging.

    PubMed

    Seok, Jeong-Ho; Park, Hae-Jeong; Chun, Ji-Won; Lee, Seung-Koo; Cho, Hyun Sang; Kwon, Jun Soo; Kim, Jae-Jin

    2007-11-15

    White matter (WM) abnormalities in schizophrenia may offer important clues to a better understanding of the disconnectivity associated with the disorder. The aim of this study was to elucidate a WM basis of auditory hallucinations in schizophrenia through the simultaneous investigation of WM tract integrity and WM density. Diffusion tensor images (DTIs) and structural T1 magnetic resonance images (MRIs) were taken from 15 hallucinating schizophrenic patients, 15 non-hallucinating schizophrenic patients and 22 normal controls. Voxel-based analyses and post-hoc region of interest analyses were obtained to compare the three groups on fractional anisotropy (FA) derived from DTI as well as WM density derived from structural MRIs. In both the hallucinating and non-hallucinating groups, FA of the WM regions was significantly decreased in the left superior longitudinal fasciculus (SLF), whereas WM density was significantly increased in the left inferior longitudinal fasciculus (ILF). The mean FA value of the left frontal part of the SLF was positively correlated with the severity score of auditory hallucinations in the hallucinating patient group. Our findings show that WM changes were mainly observed in the frontal and temporal areas, suggesting that disconnectivity in the left fronto-temporal area may contribute to the pathophysiology of schizophrenia. In addition, pathologic WM changes in this region may be an important step in the development of auditory hallucinations in schizophrenia.

  3. Analysing radio-frequency coil arrays in high-field magnetic resonance imaging by the combined field integral equation method.

    PubMed

    Wang, Shumin; Duyn, Jeff H

    2006-06-21

    We present the combined field integral equation (CFIE) method for analysing radio-frequency coil arrays in high-field magnetic resonance imaging (MRI). Three-dimensional models of coils and the human body were used to take into account the electromagnetic coupling. In the method of moments formulation, we applied triangular patches and the Rao-Wilton-Glisson basis functions to model arbitrarily shaped geometries. We first examined a rectangular loop coil to verify the CFIE method and also demonstrate its efficiency and accuracy. We then studied several eight-channel receive-only head coil arrays for 7.0 T SENSE functional MRI. Numerical results show that the signal dropout and the average SNR are two major concerns in SENSE coil array design. A good design should be a balance of these two factors.

  4. Diffusion tensor imaging studies of attention-deficit/hyperactivity disorder: meta-analyses and reflections on head motion.

    PubMed

    Aoki, Yuta; Cortese, Samuele; Castellanos, Francisco Xavier

    2017-07-03

    Diffusion tensor imaging studies have shown atypical fractional anisotropy (FA) in individuals with attention-deficit/hyperactivity disorder (ADHD), albeit with conflicting results. We performed meta-analyses of whole-brain voxel-based analyses (WBVBA) and tract-based spatial statistics (TBSS) studies in ADHD, along with a qualitative review of TBSS studies addressing the issue of head motion, which may bias results. We conducted a systematic literature search (last search on April 1st, 2016) to identify studies comparing FA values between individuals with ADHD and typically developing (TD) participants. Signed differential mapping was used to compute effect sizes and integrate WBVBA and TBSS studies, respectively. TBSS datasets reporting no between-group motion differences were identified. We identified 14 WBVBA (ADHD n = 314, TD n = 278) and 13 TBSS datasets (ADHD n = 557, TD n = 568). WBVBA meta-analysis showed both significantly lower and higher FA values in individuals with ADHD; TBSS meta-analysis showed significantly lower FA in ADHD compared with TD in four clusters: two in the corpus callosum (isthmus and posterior midbody), one in the right inferior fronto-occipital fasciculus, and one in the left inferior longitudinal fasciculus. However, four of six datasets confirming no group differences in motion showed no significant between-group FA differences. A growing diffusion tensor imaging (DTI) literature (total N = 1,717) and a plethora of apparent findings suggest atypical interhemispheric connection in ADHD. However, FA results in ADHD should be considered with caution, since many studies did not examine potential group differences in head motion, and most of the studies reporting no difference in motion showed no significant results. Future studies should address head motion as a priority and assure that groups do not differ in head motion. © 2017 Association for Child and Adolescent Mental Health.

  5. Combining satellite and seismic images to analyse the shallow structure of the Dead Sea Transform near the DESERT transect

    NASA Astrophysics Data System (ADS)

    Kesten, D.; Weber, M.; Haberland, Ch.; Janssen, Ch.; Agnon, A.; Bartov, Y.; Rabba, I.

    2008-02-01

    The left-lateral Dead Sea Transform (DST) in the Middle East is one of the largest continental strike-slip faults of the world. The southern segment of the DST in the Arava/Araba Valley between the Dead Sea and the Red Sea, called the Arava/Araba Fault (AF), has been studied in detail in the multidisciplinary DESERT (DEad SEa Rift Transect) project. Building on these results, the interpretations of multi-spectral (ASTER) satellite images and seismic reflection studies have been combined here to analyse geologic structures. Whereas satellite images reveal neotectonic activity in shallow young sediments, reflection seismics image deep faults that are possibly inactive at present. The combination of the two methods allows some age constraints to be placed on the activity of individual fault strands. Although the AF is clearly the main active fault segment of the southern DST, we propose that it has accommodated only a limited (up to 60 km) part of the overall 105 km of sinistral plate motion since Miocene times. There is evidence for sinistral displacement along other faults, based on geological studies, including satellite image interpretation. Furthermore, a subsurface fault is revealed ≈4 km west of the AF on two ≈E-W running seismic reflection profiles. Whereas these seismic data show a flower structure typical for strike-slip faults, on the satellite image this fault is not expressed in the post-Miocene sediments, implying that it has been inactive for the last few million years. About 1 km to the east of the AF another, now buried, fault was detected in the seismic, magnetotelluric and gravity studies of DESERT. Taking various lines of evidence together, we suggest that at the beginning of transform motion deformation occurred in a rather wide belt, possibly with the reactivation of older ≈N-S striking structures. Later, deformation became concentrated in the region of today’s Arava Valley. Until ≈5 Ma ago there might have been other, now inactive fault traces in the vicinity

  6. Computerized multiple image analysis on mammograms: performance improvement of nipple identification for registration of multiple views using texture convergence analyses

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Sahiner, Berkman; Hadjiiski, Lubomir M.; Paramagul, Chintana

    2004-05-01

    Automated registration of multiple mammograms for CAD depends on accurate nipple identification. We developed two new image analysis techniques based on geometric and texture convergence analyses to improve the performance of our previously developed nipple identification method. A gradient-based algorithm is used to automatically track the breast boundary. The nipple search region along the boundary is then defined by geometric convergence analysis of the breast shape. Three nipple candidates are identified by detecting the changes along the gray level profiles inside and outside the boundary and the changes in the boundary direction. A texture orientation-field analysis method is developed to estimate the fourth nipple candidate based on the convergence of the tissue texture pattern towards the nipple. The final nipple location is determined from the four nipple candidates by a confidence analysis. Our training and test data sets consisted of 419 and 368 randomly selected mammograms, respectively. The nipple location identified on each image by an experienced radiologist was used as the ground truth. For 118 of the training and 70 of the test images, the radiologist could not positively identify the nipple, but provided an estimate of its location. These were referred to as invisible nipple images. In the training data set, 89.37% (269/301) of the visible nipples and 81.36% (96/118) of the invisible nipples could be detected within 1 cm of the truth. In the test data set, 92.28% (275/298) of the visible nipples and 67.14% (47/70) of the invisible nipples were identified within 1 cm of the truth. In comparison, our previous nipple identification method without using the two convergence analysis techniques detected 82.39% (248/301), 77.12% (91/118), 89.93% (268/298) and 54.29% (38/70) of the nipples within 1 cm of the truth for the visible and invisible nipples in the training and test sets, respectively. The results indicate that the nipple on mammograms can be

  7. Automated reference region extraction and population-based input function for brain [11C]TMSX PET image analyses

    PubMed Central

    Rissanen, Eero; Tuisku, Jouni; Luoto, Pauliina; Arponen, Eveliina; Johansson, Jarkko; Oikonen, Vesa; Parkkola, Riitta; Airas, Laura; Rinne, Juha O

    2015-01-01

    [11C]TMSX ([7-N-methyl-11C]-(E)-8-(3,4,5-trimethoxystyryl)-1,3,7-trimethylxanthine) is a selective adenosine A2A receptor (A2AR) radioligand. In the central nervous system (CNS), A2AR are linked to dopamine D2 receptor function in striatum, but they are also important modulators of inflammation. The gold standard for kinetic modeling of brain [11C]TMSX positron emission tomography (PET) is to obtain arterial input function via arterial blood sampling. However, this method is laborious, prone to errors and unpleasant for study subjects. The aim of this work was to evaluate alternative input function acquisition methods for brain [11C]TMSX PET imaging. First, a noninvasive, automated method for the extraction of gray matter reference region using supervised clustering (SCgm) was developed. Second, a method for obtaining a population-based arterial input function (PBIF) was implemented. These methods were created using data from 28 study subjects (7 healthy controls, 12 multiple sclerosis patients, and 9 patients with Parkinson's disease). The results with PBIF correlated well with original plasma input, and the SCgm yielded similar results compared with cerebellum as a reference region. The clustering method for extracting reference region and the population-based approach for acquiring input for dynamic [11C]TMSX brain PET image analyses appear to be feasible and robust methods, that can be applied in patients with CNS pathology. PMID:25370856

  8. Reduced medial prefrontal-subcortical connectivity in dysphoria: Granger causality analyses of rapid functional magnetic resonance imaging.

    PubMed

    Sabatinelli, Dean; McTeague, Lisa M; Dhamala, Mukesh; Frank, David W; Wanger, Timothy J; Adhikari, Bhim M

    2015-02-01

    A cortico-limbic network consisting of the amygdala, medial prefrontal cortex (mPFC), and ventral striatum (vSTR) has been associated with altered function in emotional disorders. Here we used rapidly sampled functional magnetic resonance imaging and Granger causality analyses to assess the directional connectivity between these brain structures in a sample of healthy and age-matched participants endorsing moderate to severe depressive symptomatology as they viewed a series of natural scene stimuli varying systematically in pleasantness and arousal. Specifically during pleasant scene perception, dysphoric participants showed reduced activity in mPFC and vSTR, relative to healthy participants. In contrast, amygdala activity was enhanced to pleasant as well as unpleasant arousing scenes in both participant groups. Granger causality estimates of influence between mPFC and vSTR were significantly reduced in dysphoric relative to control participants during all picture contents. These findings provide direct evidence that during visual perception of evocative emotional stimuli, reduced reward-related activity in dysphoria is associated with dysfunctional causal connectivity between mPFC, amygdala, and vSTR.
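    Pairwise Granger causality of the kind described above can be estimated with standard time-series tooling; the sketch below uses statsmodels on synthetic ROI time courses (the signals, lag and region names are illustrative, not the study's data or its exact estimator):

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    n = 300

    # Synthetic "mPFC" signal and a "vSTR" signal built to lag it by one sample,
    # so mPFC should Granger-cause vSTR in this toy example.
    mpfc = rng.standard_normal(n)
    vstr = 0.6 * np.roll(mpfc, 1) + 0.2 * rng.standard_normal(n)

    # Column order matters: the test asks whether the 2nd column helps predict the 1st.
    data = np.column_stack([vstr, mpfc])
    results = grangercausalitytests(data, maxlag=2)
    print(results[1][0]["ssr_ftest"])  # (F statistic, p-value, df_denom, df_num)
    ```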

  9. Relationship between necrotic patterns in glioblastoma and patient survival: fractal dimension and lacunarity analyses using magnetic resonance imaging.

    PubMed

    Liu, Shuai; Wang, Yinyan; Xu, Kaibin; Wang, Zheng; Fan, Xing; Zhang, Chuanbao; Li, Shaowu; Qiu, Xiaoguang; Jiang, Tao

    2017-08-16

    Necrosis is a hallmark feature of glioblastoma (GBM). This study investigated the prognostic role of necrotic patterns in GBM using fractal dimension (FD) and lacunarity analyses of magnetic resonance imaging (MRI) data and evaluated the role of lacunarity in the biological processes leading to necrosis. We retrospectively reviewed clinical and MRI data of 95 patients with GBM. FD and lacunarity of the necrosis on MRI were calculated by fractal analysis and subjected to survival analysis. We also performed gene ontology analysis in 32 patients with available RNA-seq data. Univariate analysis revealed that FD < 1.56 and lacunarity > 0.46 significantly correlated with poor progression-free survival (p = 0.006 and p = 0.012, respectively) and overall survival (p = 0.008 and p = 0.005, respectively). Multivariate analysis revealed that both parameters were independent factors for unfavorable progression-free survival (p = 0.001 and p = 0.015, respectively) and overall survival (p = 0.002 and p = 0.007, respectively). Gene ontology analysis revealed that genes positively correlated with lacunarity were involved in the suppression of apoptosis and necrosis-associated biological processes. We demonstrate that the fractal parameters of necrosis in GBM can predict patient survival and are associated with the biological processes of tumor necrosis.
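    The fractal dimension of a binary (e.g. necrosis) mask is often estimated by box counting; the generic 2-D sketch below illustrates the idea and is not the authors' exact pipeline, which is applied to MRI-derived necrosis regions:

    ```python
    import numpy as np

    def box_counting_dimension(mask: np.ndarray) -> float:
        """Estimate the fractal dimension of a 2-D binary mask by box counting:
        fit log N(s) against log(1/s) over a range of box sizes s."""
        sizes = [s for s in (2, 4, 8, 16, 32) if s < max(mask.shape)]
        counts = []
        for s in sizes:
            n = 0
            for i in range(0, mask.shape[0], s):
                for j in range(0, mask.shape[1], s):
                    if mask[i:i + s, j:j + s].any():
                        n += 1
            counts.append(n)
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
        return slope

    # Illustrative check: a filled disc (smooth region) yields a dimension close to 2.
    yy, xx = np.mgrid[:128, :128]
    disc = (xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2
    print(round(box_counting_dimension(disc), 2))
    ```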

  10. In vitro stability analyses as a model for metabolism of ferromagnetic particles (Clariscan), a contrast agent for magnetic resonance imaging.

    PubMed

    Skotland, Tore; Sontum, Per Christian; Oulie, Inger

    2002-04-15

    Clariscan is a new contrast agent for magnetic resonance imaging. It is an aqueous suspension of ferromagnetic particles injected for blood pool and liver imaging. Previous experiments showed that particles made of 59Fe were taken up by the mononuclear phagocytic system and then solubilised. The present work aims at explaining a possible mechanism for the dissolution of these ferromagnetic particles in the body. The particles were diluted in 10 mM citrate or 10 mM acetate buffers at pH 4.5, 5.0 and 5.5 and incubated at 37 degrees C for up to 22 days, protected from light. The mixtures were analysed at different times during this incubation period using photon correlation spectroscopy, magnetic relaxation, visible spectroscopy and reactivity of the iron with the chelator, bathophenanthroline disulphonic acid. The data obtained with these techniques showed that the particles were almost completely solubilised within 4-7 days when incubated in 10 mM citrate, pH 4.5. Incubation in 10 mM citrate buffer, pH 5.0 revealed a slower solubilisation of the particles, as the changes observed after 72 h of incubation at pH 5.0 were 43-71% of the changes observed at pH 4.5. Incubation in 10 mM citrate, pH 5.5 revealed an even slower solubilisation of the particles, as the changes observed after 72 h of incubation at pH 5.5 were 12-34% of those observed at pH 4.5. Incubation of the particles in 10 mM acetate at pH 4.5, 5.0 and 5.5, as well as incubation of the particles in water pH adjusted to pH 5.1, resulted in only minor or no solubilisation of the particles. The results indicate that the low pH of endosomes and lysosomes, as well as endogenous iron-complexing substances, may be important for the solubilisation of these ferromagnetic particles following i.v. injection of Clariscan.

  11. Analyses of Magnetic Resonance Imaging of Cerebrospinal Fluid Dynamics Pre and Post Short and Long-Duration Space Flights

    NASA Technical Reports Server (NTRS)

    Alperin, Noam; Barr, Yael; Lee, Sang H.; Mason, Sara; Bagci, Ahmet M.

    2015-01-01

    Preliminary results are based on analyses of data from 17 crewmembers. The initial analysis compares pre to post-flight changes in total cerebral blood flow (CBF) and craniospinal CSF flow volume. Total CBF is obtained by summation of the mean flow rates through the 4 blood vessels supplying the brain (right and left internal carotid and vertebral arteries). Volumetric flow rates were obtained using an automated lumen segmentation technique shown to have 3-4-fold improved reproducibility and accuracy over manual lumen segmentation (6). Two cohorts, 5 short-duration and 8 long-duration crewmembers, who were scanned within 3 to 8 days post landing were included (4 short-duration crewmembers with MRI scans occurring beyond 10 days post flight were excluded). The VIIP Clinical Practice Guideline (CPG) classification is being used initially as a measure for VIIP syndrome severity. Median CPG scores of the short and long-duration cohorts were similar, 2. Mean preflight total CBF for the short and long-duration cohorts were similar, 863+/-144 and 747+/-119 mL/min, respectively. Percentage CBF changes for all short duration crewmembers were 11% or lower, within the range of normal physiological fluctuations in healthy individuals. In contrast, in 4 of the 8 long-duration crewmembers, the change in CBF exceeded the range of normal physiological fluctuation. In 3 of the 4 subjects an increase in CBF was measured. Large pre to post-flight changes in the craniospinal CSF flow volume were found in 6 of the 8 long-duration crewmembers. Box-Whisker plots of the CPG and the percent CBF and CSF flow changes for the two cohorts are shown in Figure 4. Examples of CSF flow waveforms for a short and two long-duration (CPG 0 and 3) are shown in Figure 5. Changes in CBF and CSF flow dynamics larger than normal physiological fluctuations were observed in the long-duration crewmembers. Changes in CSF flow were more pronounced than changes in CBF. Decreased CSF flow dynamics were observed

  12. Coupling MODIS images and agrometeorological data for agricultural water productivity analyses in the Mato Grosso State, Brazil

    NASA Astrophysics Data System (ADS)

    de C. Teixeira, Antônio H.; Victoria, Daniel C.; Andrade, Ricardo G.; Leivas, Janice F.; Bolfe, Edson L.; Cruz, Caroline R.

    2014-10-01

    Mato Grosso state, Central West Brazil, stands out for its grain production, mainly soybean and corn, grown as first (November-March) and second (April-August) harvest crops, respectively. For water productivity (WP) analyses, MODIS products together with a network of weather stations were used. Evapotranspiration (ET) and biomass production (BIO) were acquired during the year 2012, and WP was taken as the ratio of BIO to ET. The SAFER (Simple Algorithm For Evapotranspiration Retrieving) model for ET and Monteith's radiation model for BIO were applied together, using a mask that separated the crops from other surface types. For the first harvest crop, ET, BIO and WP values above those of other surface types occurred only from November to January, with incremental values reaching 1.2 mm day-1, 67 kg ha-1 day-1 and 0.7 kg m-3, respectively; for the second harvest crop they occurred between March and May, with incremental values of 0.5 mm day-1, 27 kg ha-1 day-1 and 0.3 kg m-3, respectively. In both cases, during the growing seasons, the highest WP values in cropped areas corresponded, in general, to the transition from blooming to grain filling. For the corn crop, whose cultivated area is currently expanding in the Brazilian Central West region, crop water productivity (CWP), the ratio of yield to the amount of water consumed, was analyzed for the main growing regions: North, Southeast and Northeast. The Southeast presented the highest annual pixel averages for ET, BIO and CWP (1.7 mm day-1, 78 kg ha-1 day-1 and 2.2 kg m-3, respectively), while the Northeast presented the lowest (1.2 mm day-1, 52 kg ha-1 day-1 and 1.9 kg m-3). Through a soil moisture indicator, the ratio of precipitation (P) to ET, it was noted that rainfall was sufficient for a good grain yield, with P/ET lower than 1.00 only outside the crop growing seasons. The combination of MODIS images and weather stations proved to be useful for monitoring
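
    As a minimal illustration of the two ratios used in this record (WP as the ratio of BIO to ET, and P/ET as a soil-moisture indicator), the sketch below assumes per-pixel NumPy arrays in the units quoted above. The function names and the conversion of ET from mm day-1 to m3 ha-1 day-1 (factor 10) are assumptions for illustration, not details taken from the study.

    ```python
    import numpy as np

    def water_productivity(bio, et, eps=1e-6):
        """WP as BIO / ET, with BIO in kg ha-1 day-1 and ET in mm day-1.
        ET is converted to m3 ha-1 day-1 (1 mm over 1 ha = 10 m3), so the
        result is kg of biomass per m3 of water (assumed convention)."""
        return bio / (np.maximum(et, eps) * 10.0)

    def soil_moisture_indicator(precip, et, eps=1e-6):
        """P/ET ratio; values below 1.0 flag periods in which
        evapotranspiration exceeds rainfall."""
        return precip / np.maximum(et, eps)

    # Hypothetical per-pixel values for a cropped area
    et = np.array([1.2, 1.7, 0.5])      # mm day-1
    bio = np.array([67.0, 78.0, 27.0])  # kg ha-1 day-1
    p = np.array([3.0, 1.5, 0.2])       # mm day-1
    print(water_productivity(bio, et))
    print(soil_moisture_indicator(p, et) < 1.0)
    ```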

  13. Comparison of visual grading and free-response ROC analyses for assessment of image-processing algorithms in digital mammography.

    PubMed

    Zanca, F; Van Ongeval, C; Claus, F; Jacobs, J; Oyen, R; Bosmans, H

    2012-12-01

    To compare two methods for assessment of image-processing algorithms in digital mammography: free-response receiver operating characteristic (FROC) analysis for the specific task of microcalcification detection and visual grading analysis (VGA). The FROC study was conducted prior to the VGA study reported here. 200 raw data files of low breast density (Breast Imaging-Reporting and Data System I-II) mammograms (Novation DR, Siemens, Germany), 100 of which were abnormal, were processed by four image-processing algorithms: Raffaello (IMS, Bologna, Italy), Sigmoid (Sectra, Linköping, Sweden), and OpView v. 2 and v. 1 (Siemens, Erlangen, Germany). Four radiologists assessed the mammograms for the detection of microcalcifications. 8 months after the FROC study, a subset (200) of the 800 images was reinterpreted by the same radiologists, using the VGA methodology in a side-by-side approach. The VGA grading was based on noise, saturation, contrast, sharpness and confidence with the image in terms of normal structures. Ordinal logistic regression was applied; OpView v. 1 was the reference processing algorithm. In the FROC study all algorithms performed better than OpView v. 1. In the current VGA study and for confidence with the image, Sigmoid and Raffaello were significantly worse (p<0.001) than OpView v. 1; OpView v. 2 was significantly better (p=0.01). For the image quality criteria, results were mixed; Raffaello and Sigmoid, for example, were better than OpView v. 1 for sharpness and contrast (although not always significantly). The discordant VGA and FROC results should be attributed to the different clinical tasks addressed. The method to use for image-processing assessment depends on the clinical task tested.

  14. Rapid specimen preparation to improve the throughput of electron microscopic volume imaging for three-dimensional analyses of subcellular ultrastructures with serial block-face scanning electron microscopy.

    PubMed

    Thai, Truc Quynh; Nguyen, Huy Bang; Saitoh, Sei; Wu, Bao; Saitoh, Yurika; Shimo, Satoshi; Elewa, Yaser Hosny Ali; Ichii, Osamu; Kon, Yasuhiro; Takaki, Takashi; Joh, Kensuke; Ohno, Nobuhiko

    2016-09-01

    Serial block-face imaging using scanning electron microscopy enables rapid observations of three-dimensional ultrastructures in a large volume of biological specimens. However, such imaging usually requires days for sample preparation to reduce charging and increase image contrast. In this study, we report a rapid procedure to acquire serial electron microscopic images within 1 day for three-dimensional analyses of subcellular ultrastructures. This procedure is based on serial block-face imaging, with two major modifications, namely a new sample treatment device and direct polymerization on the rivets, to reduce the time and workload needed. The modified procedure without uranyl acetate can produce tens of embedded samples observable under serial block-face scanning electron microscopy within 1 day. The serial images obtained are similar to the block-face images acquired by common procedures, and are applicable to three-dimensional reconstructions at a subcellular resolution. Using this approach, regional immune deposits and the double contour or heterogeneous thinning of basement membranes were observed in the glomerular capillary loops of an autoimmune nephropathy model. These modifications provide options to improve the throughput of three-dimensional electron microscopic examinations, and will ultimately be beneficial for the wider application of volume imaging in life science and clinical medicine.

  15. Comparison of visual grading and free-response ROC analyses for assessment of image-processing algorithms in digital mammography

    PubMed Central

    Zanca, F; Van Ongeval, C; Claus, F; Jacobs, J; Oyen, R; Bosmans, H

    2012-01-01

    Objective To compare two methods for assessment of image-processing algorithms in digital mammography: free-response receiver operating characteristic (FROC) analysis for the specific task of microcalcification detection and visual grading analysis (VGA). Methods The FROC study was conducted prior to the VGA study reported here. 200 raw data files of low breast density (Breast Imaging-Reporting and Data System I-II) mammograms (Novation DR, Siemens, Germany), 100 of which were abnormal, were processed by four image-processing algorithms: Raffaello (IMS, Bologna, Italy), Sigmoid (Sectra, Linköping, Sweden), and OpView v. 2 and v. 1 (Siemens, Erlangen, Germany). Four radiologists assessed the mammograms for the detection of microcalcifications. 8 months after the FROC study, a subset (200) of the 800 images was reinterpreted by the same radiologists, using the VGA methodology in a side-by-side approach. The VGA grading was based on noise, saturation, contrast, sharpness and confidence with the image in terms of normal structures. Ordinal logistic regression was applied; OpView v. 1 was the reference processing algorithm. Results In the FROC study all algorithms performed better than OpView v. 1. In the current VGA study and for confidence with the image, Sigmoid and Raffaello were significantly worse (p<0.001) than OpView v. 1; OpView v. 2 was significantly better (p=0.01). For the image quality criteria, results were mixed; Raffaello and Sigmoid, for example, were better than OpView v. 1 for sharpness and contrast (although not always significantly). Conclusion The discordant VGA and FROC results should be attributed to the different clinical tasks addressed. Advances in knowledge The method to use for image-processing assessment depends on the clinical task tested. PMID:22844032

  16. A proposal of Fourier-Bessel expansion with optimized ensembles of bases to analyse two dimensional image

    NASA Astrophysics Data System (ADS)

    Yamasaki, K.; Fujisawa, A.; Nagashima, Y.

    2017-09-01

    It is a critical issue to find the best set of fitting function bases in mode structural analysis of two-dimensional images such as plasma emission profiles. The paper proposes a method to optimize a set of bases in the case of Fourier-Bessel function series, using their orthonormal property, for more efficient and precise analysis. The method is applied to a tomography image of plasma emission obtained with the Maximum-likelihood expectation maximization method in a linear cylindrical device. The result demonstrates the strength of the method, which achieves a smaller residual error and a minimum Akaike information criterion with a smaller number of fitting function bases.
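
    For readers unfamiliar with the expansion itself, the sketch below projects a two-dimensional image defined on the unit disk onto Fourier-Bessel bases using their orthogonality. The discretisation, the choice of m_max and n_max and the function name are illustrative assumptions; it does not reproduce the base-ensemble optimisation proposed in the paper.

    ```python
    import numpy as np
    from scipy.special import jv, jn_zeros

    def fourier_bessel_coeffs(image, m_max=3, n_max=5):
        """Expansion coefficients c_mn of an image on the unit disk in the
        bases J_m(alpha_mn * r) * exp(i m theta), via a discrete inner
        product and the analytic norm pi * J_{m+1}(alpha_mn)**2."""
        ny, nx = image.shape
        y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
        r, theta = np.hypot(x, y), np.arctan2(y, x)
        inside = r <= 1.0
        dA = (2.0 / nx) * (2.0 / ny)          # pixel area in disk units
        coeffs = {}
        for m in range(m_max + 1):
            for n, alpha in enumerate(jn_zeros(m, n_max), start=1):
                basis = jv(m, alpha * r) * np.exp(1j * m * theta)
                norm = np.pi * jv(m + 1, alpha) ** 2
                coeffs[(m, n)] = np.sum(image[inside] * np.conj(basis[inside])) * dA / norm
        return coeffs

    # Example: coefficients of a centred Gaussian spot on a 128 x 128 grid
    g = np.linspace(-1, 1, 128)
    img = np.exp(-(g[:, None] ** 2 + g[None, :] ** 2) / 0.1)
    c = fourier_bessel_coeffs(img)
    ```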

  17. A Fusion-Based Approach for Breast Ultrasound Image Classification Using Multiple-ROI Texture and Morphological Analyses

    PubMed Central

    Bdair, Tariq M.; Al-Najar, Mahasen; Alazrai, Rami

    2016-01-01

    Ultrasound imaging is commonly used for breast cancer diagnosis, but accurate interpretation of breast ultrasound (BUS) images is often challenging and operator-dependent. Computer-aided diagnosis (CAD) systems can be employed to provide the radiologists with a second opinion to improve the diagnosis accuracy. In this study, a new CAD system is developed to enable accurate BUS image classification. In particular, an improved texture analysis is introduced, in which the tumor is divided into a set of nonoverlapping regions of interest (ROIs). Each ROI is analyzed using gray-level cooccurrence matrix features and a support vector machine classifier to estimate its tumor class indicator. The tumor class indicators of all ROIs are combined using a voting mechanism to estimate the tumor class. In addition, morphological analysis is employed to classify the tumor. A probabilistic approach is used to fuse the classification results of the multiple-ROI texture analysis and morphological analysis. The proposed approach is applied to classify 110 BUS images that include 64 benign and 46 malignant tumors. The accuracy, specificity, and sensitivity obtained using the proposed approach are 98.2%, 98.4%, and 97.8%, respectively. These results demonstrate that the proposed approach can effectively be used to differentiate benign and malignant tumors. PMID:28127383
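
    A compact sketch of the multiple-ROI texture pipeline described above (GLCM features per ROI, a per-ROI classifier decision and a majority vote) is given below, using scikit-image and scikit-learn. The reduced feature set, the voting threshold and the function names are assumptions rather than the paper's exact configuration.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.svm import SVC

    def roi_glcm_features(roi):
        """Four common GLCM descriptors of an 8-bit grey-level ROI."""
        glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        props = ("contrast", "homogeneity", "energy", "correlation")
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    def classify_tumor(rois, clf):
        """Majority vote over per-ROI predictions (0 = benign, 1 = malignant)."""
        votes = [int(clf.predict(roi_glcm_features(r)[None, :])[0]) for r in rois]
        return int(np.mean(votes) >= 0.5)

    # Training would precede classification, e.g.:
    # clf = SVC(kernel="rbf").fit(
    #     np.vstack([roi_glcm_features(r) for r in train_rois]), train_labels)
    ```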

  18. Tract-Specific Analyses of Diffusion Tensor Imaging Show Widespread White Matter Compromise in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Shukla, Dinesh K.; Keehn, Brandon; Muller, Ralph-Axel

    2011-01-01

    Background: Previous diffusion tensor imaging (DTI) studies have shown white matter compromise in children and adults with autism spectrum disorder (ASD), which may relate to reduced connectivity and impaired function of distributed networks. However, tract-specific evidence remains limited in ASD. We applied tract-based spatial statistics (TBSS)…

  20. Unsupervised clustering analyses of features extraction for a caries computer-assisted diagnosis using dental fluorescence images

    NASA Astrophysics Data System (ADS)

    Bessani, Michel; da Costa, Mardoqueu M.; Lins, Emery C. C. C.; Maciel, Carlos D.

    2014-02-01

    Computer-assisted diagnosis (CAD) is performed by systems with embedded knowledge. These systems work as a second opinion to the physician and use patient data to infer diagnoses for health problems. Caries is the most common oral disease and directly affects both individuals and society. Here we propose the use of dental fluorescence images as input to a caries computer-assisted diagnosis system. We use texture descriptors together with statistical pattern recognition techniques to measure the descriptors' performance for the caries classification task. The data set consists of 64 fluorescence images of in vitro healthy and carious teeth, including different surfaces and lesions already diagnosed by an expert. Texture feature extraction was performed on the fluorescence images using the RGB and YCbCr color spaces, which generated 35 different descriptors for each sample. Principal component analysis (PCA) was performed for data interpretation and dimensionality reduction. Finally, unsupervised clustering was employed to analyze the relation between the output labeling and the diagnosis of the expert. The PCA result showed a high correlation between the extracted features; seven components were sufficient to represent 91.9% of the information in the original feature vectors. The unsupervised clustering output was compared with the expert classification, resulting in an accuracy of 96.88%. The results show the high accuracy of the proposed approach in identifying carious and non-carious teeth. Therefore, the development of a CAD system for caries using such an approach appears to be promising.
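
    The PCA-plus-clustering step can be sketched as follows; k-means is used here as a stand-in because the abstract does not name the clustering algorithm, and the function name and the label-permutation scoring are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans
    from sklearn.metrics import accuracy_score

    def cluster_vs_expert(features, expert_labels, n_components=7):
        """Project the 35 texture descriptors onto their first principal
        components, cluster the scores into two groups, and score the
        cluster labels against the expert diagnosis (both assignments of
        cluster-to-class are tried)."""
        scores = PCA(n_components=n_components).fit_transform(features)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
        return max(accuracy_score(expert_labels, labels),
                   accuracy_score(expert_labels, 1 - labels))

    # features: (64, 35) array of descriptors; expert_labels: 0 = healthy, 1 = carious
    ```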

  1. Analysing the impact of far-out sidelobes on the imaging performance of the SKA-LOW telescope

    NASA Astrophysics Data System (ADS)

    Mort, Benjamin; Dulwich, Fred; Razavi-Ghods, Nima; de Lera Acedo, Eloy; Grainge, Keith

    2017-03-01

    The Square Kilometre Array's Low Frequency instrument (SKA-LOW) will operate in the undersampled regime for most of the frequency band where grating lobes pose particular challenges. To achieve the expected level of sensitivity for SKA-LOW, it is particularly important to understand how interfering sources in both near and far side-lobes of the station beam affect the imaging performance. In this study, we discuss options for station designs, and adopting a random element layout, we assess its effectiveness by investigating how sources far from the main lobe of the station beam degrade images of the target field. These sources have the effect of introducing a noise-like corruption to images, which is called the far sidelobe confusion noise (FSCN). Using OSKAR, a software simulator accelerated using graphics processing units, we carried out end-to-end simulations using an all-sky model and telescope configuration representative of the SKA-LOW instrument. The FSCN is a function of both the station beam and the interferometric point spread function, and decreases with increasing observation time until the coverage of the aperture plane no longer improves. Using apodization to reduce the level of near-in sidelobes of the station beam had a notable improvement on the level of the FSCN at low frequencies. Our results indicate that the effects of picking up sources in the sidelobes are worse at low frequencies, where the array is less sparse.

  2. Proposed classification of posterior staphylomas based on analyses of eye shape by three-dimensional magnetic resonance imaging and wide-field fundus imaging.

    PubMed

    Ohno-Matsui, Kyoko

    2014-09-01

    To determine the incidence and types of posterior staphylomas in eyes with pathologic myopia by analyzing the entire eye shape by 3-dimensional (3D) magnetic resonance imaging (MRI). Observational, case series. A total of 105 patients with pathologic myopia (spherical equivalent <-8.0 diopters or axial length ≥26.5 mm) were examined by 3D MRI and Optos (Optos, PLC, Dunfermline, Scotland). Staphyloma was defined as an outpouching of the wall of the eye that had a radius of curvature less than the surrounding curvature of the wall of the eye. The presence and types of staphylomas were determined by the entire eye shape in 3D MRI scans. Fundus abnormalities suggesting the staphyloma border were analyzed in the fundus images, fundus autofluorescence images, and infrared images by Optos. Incidence and types of posterior staphylomas, and the correlation between the type of staphyloma by MRI and the Optos images. A total of 198 eyes (105 patients) met the inclusion criteria of pathologic myopia (mean age, 64.3±11.5 years; mean axial length, 30.0±2.3 mm). Among 198 eyes, 98 (49.5%) had no staphylomas in 3D MRI scans and showed a barrel-shaped globe. The other 100 eyes (50.5%) had posterior staphyloma by 3D MRI. The most predominant type was wide, macular staphyloma (74% of eyes with staphyloma), followed by narrow, macular staphyloma (14% of eyes with staphyloma). In eyes with peripapillary and nasal staphylomas, the change of the curvature was slight and the eye had a nasally distorted shape. Optos images showed pigmentary abnormalities or abnormal reflectance along the staphyloma border. The patients with staphylomas were significantly older and had significantly worse visual function and more frequent chorioretinal changes than patients without staphyloma. Three-dimensional MRI was useful in analyzing the shape of eyes with and without staphyloma. Even in elderly individuals with severe myopia, approximately one half of the patients did not show

  3. Combined magnetic resonance and diffusion tensor imaging analyses provide a powerful tool for in vivo assessment of deformation along human muscle fibers.

    PubMed

    Pamuk, Uluç; Karakuzu, Agah; Ozturk, Cengizhan; Acar, Burak; Yucesoy, Can A

    2016-10-01

    Muscle fiber direction strain provides invaluable information for characterizing muscle function. However, methods to study this for human muscles in vivo are lacking. Using magnetic resonance (MR) imaging based deformation analyses combined with diffusion tensor (DT) imaging based tractography, we aimed to assess muscle fiber direction local tissue deformations within the human medial gastrocnemius (GM) muscle. Healthy female subjects (n=5, age=27±1 years) were positioned prone within the MR scanner in a relaxed state with the ankle angle fixed at 90°. The knee was brought to flexion (140.8±3.0°) (undeformed state). Sets of 3D high resolution MR and DT images were acquired. This protocol was repeated at an extended knee joint position (177.0±1.0°) (deformed state). Tractography and the Demons nonrigid registration algorithm were utilized to calculate local deformations along muscle fascicles. Undeformed state images were also transformed by a synthetic rigid body motion to calculate strain errors. Mean strain errors were significantly smaller than mean fiber direction strains (lengthening: 0.2±0.1% vs. 8.7±8.5%; shortening: 3.3±0.9% vs. 7.5±4.6%). Shortening and lengthening (up to 23.3% and 116.7%, respectively) occur simultaneously along individual fascicles despite the imposed GM lengthening. Along-fiber shear strains confirm the presence of substantial shearing between fascicles. Mean fiber direction strains of different tracts also show a non-uniform distribution. Inhomogeneity of fiber strain indicates epimuscular myofascial force transmission. We conclude that MR and DT imaging analyses combined provide a powerful tool for quantifying deformation along human muscle fibers in vivo. This can contribute substantially to a better understanding of normal and pathological muscle function and of the mechanisms of treatment techniques.

  4. Characteristics of mesospheric gravity waves over Antarctica observed by Antarctic Gravity Wave Instrument Network imagers using 3-D spectral analyses

    NASA Astrophysics Data System (ADS)

    Matsuda, Takashi S.; Nakamura, Takuji; Ejiri, Mitsumu K.; Tsutsumi, Masaki; Tomikawa, Yoshihiro; Taylor, Michael J.; Zhao, Yucheng; Pautet, P.-Dominique; Murphy, Damian J.; Moffat-Griffin, Tracy

    2017-09-01

    We have obtained horizontal phase velocity distributions of the gravity waves around 90 km from four Antarctic airglow imagers, which belong to an international airglow imager/instrument network known as ANGWIN (Antarctic Gravity Wave Instrument Network). Results from the airglow imagers at Syowa (69°S, 40°E), Halley (76°S, 27°W), Davis (69°S, 78°E), and McMurdo (78°S, 167°E) were compared, using a new statistical analysis method based on 3-D Fourier transform (Matsuda et al., 2014) for the observation period between 7 April and 21 May 2013. Significant day-to-day and site-to-site differences were found. The averaged phase velocity spectrum during the observation period showed preferential westward direction at Syowa, McMurdo, and Halley, but no preferential direction at Davis. Gravity wave energy estimated by I'/I was 5 times larger at Davis and Syowa than at McMurdo and Halley. We also compared the phase velocity spectrum at Syowa and Davis with the background wind field and found that the directionality only over Syowa could be explained by critical level filtering of the waves. This suggests that the eastward propagating gravity waves over Davis could have been generated above the polar night jet. Comparison of nighttime variations of the phase velocity spectra with background wind measurements suggested that the effect of critical level filtering could not explain the temporal variation of gravity wave directionality well, and other reasons such as variation of wave sources should be taken into account. Directionality was determined to be dependent on the gravity wave periods.

  5. Functional assessment of glioma pathogenesis by in vivo multi-parametric magnetic resonance imaging and in vitro analyses

    PubMed Central

    Yao, Nai-Wei; Chang, Chen; Lin, Hsiu-Ting; Yen, Chen-Tung; Chen, Jeou-Yuan

    2016-01-01

    Gliomas are aggressive brain tumors with poor prognosis. In this study, we report a novel approach combining both in vivo multi-parametric MRI and in vitro cell culture assessments to evaluate the pathogenic development of gliomas. Osteopontin (OPN), a pleiotropic factor, has been implicated in the formation and progression of various human cancers, including gliomas, through its functions in regulating cell proliferation, survival, angiogenesis, and migration. Using a rat C6 glioma model, the combined approach successfully monitors the acquisition and decrease of cancer hallmarks. We show that knockdown of OPN expression reduces C6 cell proliferation, survival, viability and clonogenicity in vitro, and reduces tumor burden and prolongs animal survival in syngeneic rats. OPN depletion is associated with reduced tumor growth, decreased angiogenesis, and an increase of tumor-associated metabolites, as revealed by T2-weighted images, diffusion-weighted images, Ktrans maps, and 1H-MRS, respectively. These strategies allow us to define an important role of OPN in conferring cancer hallmarks, which can be further applied to assess the functional roles of other candidate genes in glioma. In particular, the non-invasive multi-parametric MRI measurement of cancer hallmarks related to proliferation, angiogenesis and altered metabolism may serve as a useful tool for diagnosis and for patient management. PMID:27198662

  6. Single-Cell Imaging and Spectroscopic Analyses of Cr(VI) Reduction on the Surface of Bacterial Cells

    SciTech Connect

    Wang, Yuanmin; Sevinc, Papatya C.; Belchik, Sara M.; Fredrickson, Jim K.; Shi, Liang; Lu, H. Peter

    2013-01-22

    We investigate single-cell reduction of toxic Cr(VI) by the dissimilatory metal-reducing bacterium Shewanella oneidensis MR-1 (MR-1), an important bioremediation process, using Raman spectroscopy and scanning electron microscopy (SEM) combined with energy-dispersive X-ray spectroscopy (EDX). Our experiments indicate that the toxic and highly soluble Cr(VI) can be efficiently reduced to the less toxic and non-soluble Cr2O3 nanoparticles by MR-1. Cr2O3 is observed to emerge as nanoparticles adsorbed on the cell surface and its chemical nature is identified by EDX imaging and Raman spectroscopy. Co-localization of Cr2O3 and cytochromes by EDX imaging and Raman spectroscopy suggests a terminal reductase role for MR-1 surface-exposed cytochromes MtrC and OmcA. Our experiments revealed that the cooperation of surface proteins OmcA and MtrC makes the reduction reaction most efficient, and the sequence of the reducing reactivity of the MR-1 is: wild type > single mutant ΔmtrC or mutant ΔomcA > double mutant (ΔomcA-ΔmtrC). Moreover, our results also suggest that the direct microbial Cr(VI) reduction and Fe(II) (hematite)-mediated Cr(VI) reduction mechanisms may co-exist in the reduction processes.

  7. Calibration of remote mineralogy algorithms using modal analyses of Apollo soils by X-ray diffraction and microscopic spectral imaging

    NASA Astrophysics Data System (ADS)

    Crites, S. T.; Taylor, J.; Martel, L.; Lucey, P. G.; Blake, D. F.

    2012-12-01

    We have launched a project to determine the modal mineralogy of over 100 soils from all Apollo sites using quantitative X-ray diffraction (XRD) and microscopic hyperspectral imaging at visible, near-IR and thermal IR wavelengths. The two methods are complementary: XRD is optimal for obtaining the major mineral modes because its measurement is not limited to the surfaces of grains, whereas the hyperspectral imaging method allows us to identify minerals present even down to a single grain, well below the quantitative detection limit of XRD. Each soil is also sent to RELAB to obtain visible, near-IR, and thermal-IR reflectance spectra. The goal is to use quantitative mineralogy in comparison with spectra of the same soils and with remote sensing data of the sampling stations to improve our ability to extract quantitative mineralogy from remote sensing observations. Previous groups have demonstrated methods for using lab mineralogy to validate remote sensing. The LSCC pioneered the method of comparing mineralogy to laboratory spectra of the same soils (Pieters et al. 2002); Blewett et al. (1997) directly compared remote sensing results for sample sites with lab measurements of representative soils from those sites. We are building upon the work of both groups by expanding the number of soils measured to 128, with an emphasis on immature soils to support recent work studying fresh exposures like crater central peaks, and also by incorporating the recent high spatial and spectral resolution data sets over expanded wavelength ranges (e.g. Diviner TIR, M3 hyperspectral VNIR) not available at the time of the previous studies. We have thus far measured 32 Apollo 16 soils using quantitative XRD and are continuing with our collection of soils from the other landing sites. We have developed a microscopic spectral imaging system that includes TIR, VIS, and NIR capabilities and have completed proof-of-concept scans of mineral separates and preliminary lunar soil scans with plans

  8. A CAD system to analyse mammogram images using fully complex-valued relaxation neural network ensembled classifier.

    PubMed

    Saraswathi, D; Srinivasan, E

    2014-10-01

    This paper presents a new improved classification technique using a Fully Complex-Valued Relaxation Neural Network (FCRN) based ensemble for classifying mammogram images. The system is developed for the three classes defined by the MIAS database, namely Normal, Benign and Malignant. Features such as binary object features, RST invariant features, histogram features, texture features and spectral features are extracted from the MIAS database. The extracted features are then given to the proposed FCRN-based ensemble classifier. FCRN networks are ensembled together to improve the classification rate. Receiver Operating Characteristic (ROC) analysis is used for evaluating the system. The results illustrate the superior classification performance of the ensembled FCRN. A performance comparison of various sets of training and testing vectors is provided for the FCRN classifier. The resultant ensembled FCRN approximates the desired output more accurately with a lower computational effort.

  9. Application of mid-infrared chemical imaging and multivariate chemometrics analyses to characterise a population of microalgae cells.

    PubMed

    Tan, Suat-Teng; Balasubramanian, Rajesh Kumar; Das, Probir; Obbard, Jeffrey Philip; Chew, Wee

    2013-04-01

    A suite of multivariate chemometrics methods was applied to a mid-infrared imaging dataset of a eustigmatophyte, marine Nannochloropsis sp. microalgae strain. This includes the improved leader-follower cluster analysis (iLFCA) to interrogate spectra in an unsupervised fashion, a resonant Mie optical scatter correction algorithm (RMieS-EMSC) that improves data linearity, the band-target entropy minimization (BTEM) self-modeling curve resolution for recovering component spectra, and a multi-linear regression (MLR) for estimating relative concentrations and plotting chemical maps of component spectra. A novel Alpha-Stable probability calculation for microalgae cellular lipid-to-protein ratio Λi is introduced for estimating population characteristics.

  10. Evaluating Climate Causation of Conflict in Darfur Using Multi-temporal, Multi-resolution Satellite Image Datasets With Novel Analyses

    NASA Astrophysics Data System (ADS)

    Brown, I.; Wennbom, M.

    2013-12-01

    Climate change, population growth and changes in traditional lifestyles have led to instabilities in traditional demarcations between neighboring ethnic and religious groups in the Sahel region. This has resulted in a number of conflicts as groups resort to arms to settle disputes. Such disputes often centre on, or are justified by, competition for resources. The conflict in Darfur has been controversially explained by resource scarcity resulting from climate change. Here we analyse established methods of using satellite imagery to assess vegetation health in Darfur. Multi-decadal time series of observations are available using low spatial resolution visible-near infrared imagery. Typically, normalized difference vegetation index (NDVI) analyses are produced to describe changes in vegetation 'greenness' or 'health'. Such approaches have been widely used to evaluate the long term development of vegetation in relation to climate variations across a wide range of environments from the Arctic to the Sahel. These datasets typically measure peak NDVI observed over a given interval and may introduce bias. It is furthermore unclear how the spatial organization of sparse vegetation may affect low resolution NDVI products. We develop and assess alternative measures of vegetation including descriptors of the growing season, wetness and resource availability. Expanding the range of parameters used in the analysis reduces our dependence on peak NDVI. Furthermore, these descriptors provide a better characterization of the growing season than the single NDVI measure. Using multi-sensor data we combine high temporal/moderate spatial resolution data with low temporal/high spatial resolution data to improve the spatial representativity of the observations and to provide improved spatial analysis of vegetation patterns. The approach places the high resolution observations in the NDVI context space using a longer time series of lower resolution imagery. The vegetation descriptors
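
    Because the argument turns on peak-NDVI products, a minimal NDVI computation and peak composite are sketched below; the band variables and the compositing step are generic assumptions, not a description of the specific datasets used in the study.

    ```python
    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        """Normalized difference vegetation index from near-infrared and
        red reflectance arrays; values approach 1 over dense vegetation."""
        nir = nir.astype(float)
        red = red.astype(float)
        return (nir - red) / (nir + red + eps)

    def peak_ndvi(ndvi_stack):
        """Peak-NDVI composite over a (time, y, x) stack -- the single
        metric that the study argues should be complemented by additional
        growing-season descriptors."""
        return np.nanmax(ndvi_stack, axis=0)
    ```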

  11. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system.

    PubMed

    Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of laboratory information system with image and statistical analysis tools. Consecutive sections of TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were
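
    The two derived intensity variables introduced near the end of this record can be computed per tissue core as in the sketch below; the table layout and column names of the image-analysis export are assumptions.

    ```python
    import pandas as pd

    def add_intensity_contrasts(cores: pd.DataFrame) -> pd.DataFrame:
        """Add MeanBrownBlue = (Brown + Blue) / 2 (absolute staining
        intensity) and DiffBrownBlue = Brown - Blue (colour balance)."""
        out = cores.copy()
        out["MeanBrownBlue"] = (out["BrownIntensity"] + out["BlueIntensity"]) / 2
        out["DiffBrownBlue"] = out["BrownIntensity"] - out["BlueIntensity"]
        return out

    # "cores" would hold one row per TMA core per slide, e.g. columns
    # ["SlideID", "CoreID", "BrownIntensity", "BlueIntensity"].
    ```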

  12. In Vivo Imaging of Human 11C-Metformin in Peripheral Organs: Dosimetry, Biodistribution, and Kinetic Analyses.

    PubMed

    Gormsen, Lars C; Sundelin, Elias Immanuel; Jensen, Jonas Brorson; Vendelbo, Mikkel Holm; Jakobsen, Steen; Munk, Ole Lajord; Hougaard Christensen, Mette Marie; Brøsen, Kim; Frøkiær, Jørgen; Jessen, Niels

    2016-12-01

    administration. Only slow accumulation of (11)C-metformin was observed in muscle. There was no elimination of (11)C-metformin through the bile both during the intravenous and during the oral part of the study. (11)C-metformin is suitable for imaging metformin uptake in target tissues and may prove a valuable tool to assess the impact of metformin treatment in patients with varying metformin transport capacity. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  13. An Imaging Flow Cytometry-based approach to analyse the fission yeast cell cycle in fixed cells.

    PubMed

    Patterson, James O; Swaffer, Matthew; Filby, Andrew

    2015-07-01

    Fission yeast (Schizosaccharomyces pombe) is an excellent model organism for studying eukaryotic cell division because many of the underlying principles and key regulators of cell cycle biology are conserved from yeast to humans. As such it can be employed as a tool for understanding complex human diseases that arise from dysregulation of cell cycle controls, including cancers. Conventional Flow Cytometry (CFC) is a high-throughput, multi-parameter, fluorescence-based single cell analysis technology. It is widely used for studying the mammalian cell cycle in both normal and disease states by measuring changes in DNA content during the transition through G1, S and G2/M using fluorescent DNA-binding dyes. Unfortunately, analysis of the fission yeast cell cycle by CFC is not straightforward because, unlike mammalian cells, cytokinesis occurs after S phase, meaning that bi-nucleated G1 cells have the same DNA content as mono-nucleated G2 cells and cannot be distinguished using total integrated fluorescence (pulse area). It has been elegantly shown that the width of the DNA pulse can be used to distinguish G2 cells with a single 2C focus from G1 cells with two 1C foci; however, the accuracy of this measurement depends on the orientation of the cell as it traverses the laser beam. To this end we sought to improve the accuracy of fission yeast cell cycle analysis and have developed an Imaging Flow Cytometry (IFC)-based method that preserves the high-throughput, objective analysis afforded by CFC in combination with the spatial and morphometric information provided by microscopy. We have been able to derive an analysis framework for subdividing the yeast cell cycle that is based on intensiometric and morphometric measurements and is thus robust against orientation-based misclassification. In addition we can employ image-based metrics to define populations of septated/bi-nucleated cells and measure cellular dimensions. To our knowledge

  14. Utilizing magnetic resonance imaging logs, openhole logs, and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.A.; Morganti, J.K.; White, H.J.; Noblett, B.R.

    1996-01-01

    Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.

  16. Characterization of Influenza Vaccine Hemagglutinin Complexes by Cryo-Electron Microscopy and Image Analyses Reveals Structural Polymorphisms

    PubMed Central

    McCraw, Dustin M.; Gallagher, John R.

    2016-01-01

    Influenza virus afflicts millions of people worldwide on an annual basis. There is an ever-present risk that animal viruses will cross the species barrier to cause epidemics and pandemics resulting in great morbidity and mortality. Zoonosis outbreaks, such as the H7N9 outbreak, underscore the need to better understand the molecular organization of viral immunogens, such as recombinant influenza virus hemagglutinin (HA) proteins, used in influenza virus subunit vaccines in order to optimize vaccine efficacy. Here, using cryo-electron microscopy and image analysis, we show that recombinant H7 HA in vaccines formed macromolecular complexes consisting of variable numbers of HA subunits (range, 6 to 8). In addition, HA complexes were distributed across at least four distinct structural classes (polymorphisms). Three-dimensional (3D) reconstruction and molecular modeling indicated that HA was in the prefusion state and suggested that the oligomerization and the structural polymorphisms observed were due to hydrophobic interactions involving the transmembrane regions. These experiments suggest that characterization of the molecular structures of influenza virus HA complexes used in subunit vaccines will lead to better understanding of the differences in vaccine efficacy and to the optimization of subunit vaccines to prevent influenza virus infection. PMID:27074939

  17. Advances in preclinical therapeutics development using small animal imaging and molecular analyses: the gastrointestinal stromal tumors model.

    PubMed

    Pantaleo, M A; Landuzzi, L; Nicoletti, G; Nanni, C; Boschi, S; Piazzi, G; Santini, D; Di Battista, M; Castellucci, P; Lodi, F; Fanti, S; Lollini, P-L; Biasco, G

    2009-09-01

    The widespread use of targeted therapies in the treatment of gastrointestinal stromal tumors (GISTs) has highlighted the urgency of integrating new molecular imaging technologies, developing new criteria for tumor response evaluation and reaching a more comprehensive definition of the molecular target. These aspects, which arise from clinical experience, are not sufficiently considered in preclinical research studies aiming to evaluate the efficacy of new drugs, or new combinations of drugs, directed at a molecular target. We developed a GIST882 xenograft model in nude mice and carried out both molecular and functional characterization of the tumor mass. Mutational analysis of the KIT receptor in the GIST882 cell line and tumor mass showed a mutation in exon 13 that was still present after in vivo cell growth. Glucose metabolism and cell proliferation were evaluated with small-animal PET using both FDG and FLT. The experimental development of new therapies for GIST treatment requires sophisticated animal models in order to represent the tumor molecular heterogeneity already demonstrated in the clinical setting, and in order to evaluate treatment efficacy in terms of inhibition of tumor metabolism rather than change in tumor size alone. This approach to cancer research on GISTs is crucial for innovative perspectives that could cross over to other types of cancer.

  18. Seasonal forcing of image-analysed mesozooplankton community composition along the salinity gradient of the Guadalquivir estuary

    NASA Astrophysics Data System (ADS)

    Taglialatela, Simone; Ruiz, Javier; Prieto, Laura; Navarro, Gabriel

    2014-08-01

    The composition and distribution of the mesozooplankton was studied monthly from April 2008 to June 2009 in the Guadalquivir estuary using a fast image analysis technique as well as with traditional microscope counting. The mesozooplankton showed a very clear temporal and spatial pattern with peaks of abundance in late-Spring/early-Summer 2008 and Spring 2009 in the inner estuary. The abundances peaked at 135 × 10^3 ind. m-3. Calanipeda aquaedulcis was the most abundant species in the fresh and brackish waters (salinity between 0.5 and 7), accounting in many cases for up to 100% of the individuals. Acartia clausi instead was identified as the most abundant species in the middle part of the estuary (salinity between 10 and 30). Cyclopoida of the family Cyclopidae (possibly Acanthocyclops spp.) were occasionally abundant there, as well as some species of freshwater Cladocera. At the mouth, the mesozooplanktonic community included appendicularians, chaetognaths, copepods and Cladocera. Canonical Correspondence Analysis (CCA) indicates that the changes observed in the taxonomic composition along the estuary were strictly correlated with the salinity gradient. Furthermore, no evidence of seasonal species substitution was observed in the Guadalquivir estuary, whereas a clear spatial displacement of C. aquaedulcis and A. clausi populations was observed after large discharges from the dam in Alcala del Rio.

  19. Nonintrusive Finger-Vein Recognition System Using NIR Image Sensor and Accuracy Analyses According to Various Factors.

    PubMed

    Pham, Tuyen Danh; Park, Young Ho; Nguyen, Dat Tien; Kwon, Seung Yong; Park, Kang Ryoung

    2015-07-13

    Biometrics is a technology that enables an individual person to be identified based on human physiological and behavioral characteristics. Among biometrics technologies, face recognition has been widely used because of its advantages in terms of convenience and non-contact operation. However, its performance is affected by factors such as variation in the illumination, facial expression, and head pose. Therefore, fingerprint and iris recognitions are preferred alternatives. However, the performance of the former can be adversely affected by the skin condition, including scarring and dryness. In addition, the latter has the disadvantages of high cost, large system size, and inconvenience to the user, who has to align their eyes with the iris camera. In an attempt to overcome these problems, finger-vein recognition has been vigorously researched, but an analysis of its accuracies according to various factors has not received much attention. Therefore, we propose a nonintrusive finger-vein recognition system using a near infrared (NIR) image sensor and analyze its accuracies considering various factors. The experimental results obtained with three databases showed that our system can be operated in real applications with high accuracy; and the dissimilarity of the finger-veins of different people is larger than that of the finger types and hands.

  20. Textural analyses of carbon fiber materials by 2D-FFT of complex images obtained by high frequency eddy current imaging (HF-ECI)

    NASA Astrophysics Data System (ADS)

    Schulze, Martin H.; Heuer, Henning

    2012-04-01

    Carbon fiber based materials are used in many lightweight applications in aeronautical, automotive, machine and civil engineering. With the increasing automation of the production process of CFRP laminates, a manual optical inspection of each resin transfer molding (RTM) layer is not practicable. Because they are limited to surface inspection, optical systems cannot observe the quality parameters of multilayer three-dimensional materials. Imaging eddy-current (EC) NDT is the only suitable inspection method for non-resin materials in the textile state that allows inspection of surface and hidden layers in parallel. The HF-ECI method has the capability to measure layer displacements (misaligned angle orientations) and gap sizes in a multilayer carbon fiber structure. The EC technique uses the variation of the electrical conductivity of carbon-based materials to obtain material properties. Besides the determination of textural parameters like layer orientation and gap sizes between rovings, the method can detect foreign polymer particles and fuzzy balls, and visualize undulations. For all of these typical parameters, an imaging classification process chain based on a high-resolution directional EC imaging device named EddyCus® MPECS and a 2D-FFT with adapted preprocessing algorithms is developed.
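
    As an illustration of how a 2D-FFT can expose layer orientation in such images, the sketch below estimates the dominant angle from the angular distribution of spectral energy. It is a generic approach under assumed inputs, not the EddyCus® MPECS processing chain or its adapted preprocessing.

    ```python
    import numpy as np

    def dominant_fiber_angle(image):
        """Dominant fibre/layer orientation (degrees, 0-180) of a
        grey-level image, estimated from where the 2D-FFT magnitude
        concentrates; energy at spectral angle t corresponds to stripes
        oriented at t + 90 degrees in the image."""
        img = image.astype(float)
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean())))
        ny, nx = img.shape
        y, x = np.mgrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
        angles = np.degrees(np.arctan2(y, x)) % 180.0
        hist, edges = np.histogram(angles, bins=np.arange(0, 181),
                                   weights=spectrum)
        return (edges[np.argmax(hist)] + 90.0) % 180.0
    ```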

  1. The influence of respiratory motion on the cumulative SUV-volume histogram and fractal analyses of intratumoral heterogeneity in PET/CT imaging.

    PubMed

    Takeshita, Toshiki; Morita, Keishin; Tsutsui, Yuji; Kidera, Daisuke; Mikasa, Shohei; Maebatake, Akira; Akamatsu, Go; Miwa, Kenta; Baba, Shingo; Sasaki, Masayuki

    2016-07-01

    The purpose of this study was to investigate the influence of respiratory motion on the evaluation of the intratumoral heterogeneity of FDG uptake using cumulative SUV-volume histogram (CSH) and fractal analyses. We used an NEMA IEC body phantom with a homogeneous hot sphere phantom (HO) and two heterogeneous hot sphere phantoms (HE1 and HE2). The background radioactivity of (18)F in the NEMA phantom was 5.3 kBq/mL. The ratio of radioactivity was 4:2:1 for the HO phantom and the outer rims of the HE1 and HE2 phantoms, the inner cores of the HE1 and HE2 phantoms, and the background, respectively. Respiratory motion was simulated using a motion table with an amplitude of 2 cm. PET/CT data were acquired using a Biograph mCT in motionless and moving conditions. The PET images were analyzed by both CSH and fractal analyses. The area under the CSH (AUC-CSH) and the fractal dimension (FD) were used as quantitative metrics. In motionless conditions, the AUC-CSHs of the HO (0.80), HE1 (0.75) and HE2 (0.65) phantoms were different. They did not differ in moving conditions (HO, 0.63; HE1, 0.65; HE2, 0.60). The FD of the HO phantom (0.77) was smaller than the FDs of the HE1 (1.71) and HE2 (1.98) phantoms in motionless conditions; however, the FDs of the HO (1.99) and HE1 (2.19) phantoms were not different from each other and were smaller than that of the HE2 (3.73) phantom in moving conditions. Respiratory motion affected the results of the CSH and fractal analyses for the evaluation of the heterogeneity of the PET/CT images. The influence of respiratory motion was considered to vary depending on the object size.
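
    For reference, the AUC-CSH metric used above can be computed from the voxel SUVs of a delineated volume as in the sketch below; this is a common formulation, and the threshold sampling is an assumption rather than the exact implementation of the study.

    ```python
    import numpy as np

    def auc_csh(suv_voxels, n_steps=100):
        """Area under the cumulative SUV-volume histogram: the fraction of
        voxels above each threshold (as a fraction of SUVmax), integrated
        over the threshold axis. Homogeneous uptake gives values near 1;
        heterogeneity lowers the value."""
        suv = np.asarray(suv_voxels, dtype=float).ravel()
        thresholds = np.linspace(0.0, 1.0, n_steps + 1) * suv.max()
        fractions = np.array([(suv >= t).mean() for t in thresholds])
        return np.trapz(fractions, dx=1.0 / n_steps)

    # A homogeneous volume scores ~1.0; adding a cold core lowers the AUC-CSH.
    print(auc_csh(np.full(1000, 4.0)))
    print(auc_csh(np.r_[np.full(800, 4.0), np.full(200, 2.0)]))
    ```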

  2. The Spatial Distribution of Alkaloids in Psychotria prunifolia (Kunth) Steyerm and Palicourea coriacea (Cham.) K. Schum Leaves Analysed by Desorption Electrospray Ionisation Mass Spectrometry Imaging.

    PubMed

    Kato, Lucilia; Moraes, Aline Pereira; de Oliveira, Cecília Maria Alves; Vaz, Boniek Gontijo; de Almeida Gonçalves, Letícia; E Silva, Elienai Cândida; Janfelt, Christian

    2017-09-06

    Species of the genera Psychotria and Palicourea are sources of indole alkaloids; however, the distribution of alkaloids within the plants is not known. Analysing the spatial distribution using desorption electrospray ionisation mass spectrometry imaging (DESI-MSI) has become attractive due to its simplicity and high selectivity compared with traditional histochemical techniques. To apply DESI-MSI to visualise the alkaloid distribution on the leaf surface of Psychotria prunifolia and Palicourea coriacea and to compare the distributions with HPLC-MS and histochemical analyses. Based upon previous structure elucidation studies, four alkaloids targeted in this study were identified using high resolution mass spectrometry by direct infusion of plant extracts, and their distributions were imaged by DESI-MSI via tissue imprints on a porous Teflon surface. Relative quantitation of the four alkaloids was obtained by HPLC-MS/MS analysis performed using multiple-reaction monitoring (MRM) mode on a triple quadrupole mass spectrometer. Alkaloids showed distinct distributions on the leaf surfaces. Prunifoleine was mainly present in the midrib, while 10-hydroxyisodeppeaninol was concentrated close to the petiole; a uniform distribution of 10-hydroxyantirhine was observed in the whole leaf of Psychotria prunifolia. The imprinted image from the Palicourea coriacea leaf also showed a homogeneous distribution of calycanthine throughout the leaf surface. Different distributions were found for three alkaloids in Psychotria prunifolia, and the distributions found by MSI were in complete accordance with the HPLC-MS analysis and histochemical results. The DESI-MSI technique was therefore demonstrated to provide reliable information about the spatial distribution of metabolites in plants. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Liquid Chromatography-Tandem and MALDI imaging mass spectrometry analyses of RCL2/CS100-fixed paraffin embedded tissues: proteomics evaluation of an alternate fixative for biomarker discovery

    PubMed Central

    Mangé, A; Chaurand, P; Perrochia, H; Roger, P; Caprioli, RM; Solassol, J

    2010-01-01

    Human tissues are an important source of biological material for the discovery of novel biomarkers. Fresh-frozen tissue could represent an ideal supply of archival material for molecular investigations. However, immediate flash freezing is usually not possible, especially for rare or valuable tissue samples such as biopsies. Here, we investigated the compatibility of RCL2/CS100, a non-crosslinking, non-toxic, and non-volatile organic fixative, with shotgun proteomic analyses. Several protein extraction protocols compatible with mass spectrometry were investigated from RCL2/CS100-fixed and fresh-frozen colonic mucosa, breast, and prostate tissues. The peptides and proteins identified from RCL2/CS100 tissue were then comprehensively compared with those identified from matched fresh-frozen tissues using a bottom-up strategy based on nano-reversed phase liquid chromatography coupled with tandem mass spectrometry (nanoRPLC-MS/MS). Results showed that similar peptides could be identified in both archival conditions and the proteome coverage was not obviously compromised by the RCL2/CS100 fixation process. NanoRPLC-MS/MS of laser capture microdissected RCL2/CS100-fixed tissues gave the same amount of biological information as that recovered from whole RCL2/CS100-fixed or frozen tissues. We next performed MALDI tissue profiling and imaging mass spectrometry and observed a high level of agreement in protein expression as well as excellent agreement between the images obtained from RCL2/CS100-fixed and fresh-frozen tissue samples. These results suggest that RCL2/CS100-fixed tissues are suitable for shotgun proteomic analyses and tissue imaging. More importantly, this alternate fixative opens the door to the analysis of small, valuable, and rare target lesions that are usually inaccessible to complementary biomarker-driven genomic and proteomic research. PMID:19856998

  4. An automated image-based method of 3D subject-specific body segment parameter estimation for kinetic analyses of rapid movements.

    PubMed

    Sheets, Alison L; Corazza, Stefano; Andriacchi, Thomas P

    2010-01-01

    Accurate subject-specific body segment parameters (BSPs) are necessary to perform kinetic analyses of human movements with large accelerations, or no external contact forces or moments. A new automated topographical image-based method of estimating segment mass, center of mass (CM) position, and moments of inertia is presented. Body geometry and volume were measured using a laser scanner, then an automated pose and shape registration algorithm segmented the scanned body surface and identified joint center (JC) positions. Assuming the constant segment densities of Dempster, thigh and shank masses, CM locations, and moments of inertia were estimated for four male subjects with body mass indexes (BMIs) of 19.7-38.2. The subject-specific BSPs were compared with those determined using the Dempster and Clauser regression equations. The influence of BSP and BMI differences on knee and hip net forces and moments during a running swing phase was quantified for the subjects with the smallest and largest BMIs. Subject-specific BSPs for 15 body segments were quickly calculated using the image-based method, and total subject masses were overestimated by 1.7-2.9%. When compared with the Dempster and Clauser methods, image-based and regression-estimated thigh BSPs varied more than the shank parameters. Thigh masses and hip JC to thigh CM distances were consistently larger, and each transverse moment of inertia was smaller, using the image-based method. Because the shank had larger linear and angular accelerations than the thigh during the running swing phase, shank BSP differences had a larger effect on calculated intersegmental forces and moments at the knee joint than thigh BSP differences did at the hip. It was the net knee kinetic differences caused by the shank BSP differences that were the largest contributors to the hip variations. Finally, BSP differences produced larger kinetic differences for the subject with larger segment masses, suggesting that parameter accuracy is more
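
    A bare-bones version of the constant-density segment-parameter computation reads as follows; the voxel representation, the single density value and the function name are illustrative assumptions (Dempster's model assigns segment-specific densities rather than one global value).

    ```python
    import numpy as np

    def segment_parameters(voxel_xyz, voxel_volume, density=1.05e3):
        """Mass (kg), centre of mass (m) and inertia tensor (kg m^2) of a
        segment represented by the centres of the voxels filling its
        scanned volume, assuming one constant density in kg m^-3."""
        pts = np.asarray(voxel_xyz, dtype=float)     # (N, 3) voxel centres
        m_vox = density * voxel_volume               # mass of one voxel
        mass = m_vox * len(pts)
        com = pts.mean(axis=0)
        r = pts - com                                # positions about the CM
        inertia = m_vox * (np.eye(3) * (r ** 2).sum() - r.T @ r)  # sum m[(r.r)E - r r^T]
        return mass, com, inertia
    ```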

  5. Grain-size and grain-shape analyses using digital imaging technology: Application to the fluvial formation of the Ngandong paleoanthropological site in Central Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Sipola, Maija

    2013-04-01

    This study implements grain-size and grain-shape analyses to better understand the fluvial processes responsible for forming the Ngandong paleoanthropological site along the Solo River in Central Java. The site was first discovered and excavated by the Dutch Geological Survey in the early 1930's, during which fourteen Homo erectus fossils and thousands of other macrofaunal remains were uncovered. The Homo erectus fossils discovered at Ngandong are particularly interesting to paleoanthropologists because the morphology of the excavated crania suggests they are from a recently-living variety of the species. The primary scientific focus for many years has been to determine the absolute age of the Ngandong fossils, while the question of exactly how the Ngandong site itself formed has been frequently overlooked. In this study I use Retsch CAMSIZER digital imaging technology to conduct grain-size and grain-shape analyses of sediments from the terrace stratigraphy at the Ngandong site to understand if there are significant differences between sedimentary layers in grain-size and/or grain-shape, and what these differences mean in terms of local paleoflow dynamics over time. Preliminary analyses indicate there are four distinct sedimentary layers present at Ngandong with regard to size sorting, with the fossil-bearing layers proving to be the most poorly-sorted and most similar to debris-flow deposits. These results support hypotheses by geoarchaeologists that the fossil-bearing layers present at Ngandong were deposited during special flow events rather than under normal stream flow conditions.
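    As a point of reference for the sorting comparison described in this record, a minimal sketch of a moment-based grain-size computation follows; the phi-scale conversion and the reading of the standard deviation as sorting are standard sedimentology, but the function name and return values are illustrative and not the CAMSIZER workflow.

```python
import numpy as np

def grain_size_moments(diameters_mm):
    """Mean grain size, sorting, and skewness by the method of moments on the
    Krumbein phi scale (phi = -log2(d / 1 mm)); a larger sorting value means a
    more poorly sorted deposit."""
    phi = -np.log2(np.asarray(diameters_mm, dtype=float))
    mean_phi = phi.mean()
    sorting = phi.std(ddof=1)
    skewness = ((phi - mean_phi) ** 3).mean() / sorting ** 3
    return mean_phi, sorting, skewness
```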

  6. Identification of factors contributing to phenotypic divergence via quantitative image analyses of autosomal recessive woolly hair/hypotrichosis with homozygous c.736T>A LIPH mutation.

    PubMed

    Kinoshita-Ise, M; Kubo, A; Sasaki, T; Umegaki-Arao, N; Amagai, M; Ohyama, M

    2017-01-01

    Autosomal recessive woolly hair/hypotrichosis (ARWH/H) is caused by mutations in LIPH. Homozygotes for the LIPH c.736T>A (p.C246S) mutation, the most prevalent genotype in Japanese patients, present varying degrees of hair loss; however, determinants of this phenotypic diversity remain elusive. The aim was to establish methodologies for quantitative assessment of clinical severity and to provide a detailed characterization elucidating the factors contributing to phenotypic divergence. Digital image analyses were conducted to convert clinical severities into numerical values. Eight patients with ARWH/H were classified into three groups (mild, severe, very severe), based on severity scores. Dermoscopic images were collected and assessed for total hair numbers and hair thickness for intergroup comparisons. The image analysis detected a difference in hair thickness, but not in total hair numbers, between mild and severe cases. A marked decrease in total hair number was noted in an atypical very severe case. Histopathologically, a patient with a mild case demonstrated hair miniaturization and a high telogen/anagen ratio without a decrease in total hair count, endorsing dermoscopic observations. Two children demonstrated spontaneous improvement without an increase in total hair numbers, and two adults responded well to topical minoxidil with increased total hair numbers and hair thickness. The difference in the frequency of underdeveloped hairs may be a major factor contributing to the clinical diversity of hair sparseness in LIPH c.736T>A homozygotes with ARWH/H. Hence, pharmacological modification to thicken existing fine hairs may provide a therapeutic strategy. © 2016 British Association of Dermatologists.

  7. Growing seasons of Nordic mountain birch in northernmost Europe as indicated by long-term field studies and analyses of satellite images.

    PubMed

    Shutova, E; Wielgolaski, F E; Karlsen, S R; Makarova, O; Berlina, N; Filimonova, T; Haraldsson, E; Aspholm, P E; Flø, L; Høgda, K A

    2006-11-01

    The phenophases first greening (bud burst) and yellowing of Nordic mountain birch (Betula pubescens ssp. tortuosa, also called B. p. ssp. czerepanovii) were observed at three sites on the Kola Peninsula in northernmost Europe during the period 1964-2003, and at two sites in the trans-boundary Pasvik-Enare region during 1994-2003. The field observations were compared with satellite images based on the GIMMS-NDVI dataset covering 1982-2002 at the start and end of the growing season. A trend for a delay of first greening was observed at only one of the sites (Kandalaksha) over the 40-year period. This fits well with the delayed onset of the growing season for that site based on satellite images. No significant changes in time of greening at the other sites were found with either field observations or satellite analyses throughout the study period. These results differ from the earlier spring generally observed in other parts of Europe in recent decades. In the coldest regions of Europe, e.g. in northern high mountains and the northernmost continental areas, increased precipitation associated with the generally positive North Atlantic Oscillation in the last few decades has often fallen as snow. Increased snow may delay the time of onset of the growing season, although increased temperature generally causes earlier spring phenophases. Autumn yellowing of birch leaves tends towards an earlier date at all sites. Due to both later birch greening and earlier yellowing at the Kandalaksha site, the growing season there has also become significantly shorter during the years observed. The sites showing the most advanced yellowing in the field throughout the study period fit well with areas showing an earlier end of the growing season from satellite images covering 1982-2002. The earlier yellowing is highly correlated with a trend at the sites in autumn for earlier decreasing air temperature over the study period, indicating that this environmental factor is important also for

  8. JOB ANALYSES.

    ERIC Educational Resources Information Center

    JONES, HAROLD E.

    THE JOB ANALYSES WERE COMPOSED FROM ACTIVITY RECORDS KEPT BY EACH PROFESSIONAL EXTENSION WORKER IN KANSAS. JOB ANALYSES ARE GIVEN FOR THE ADMINISTRATION (DIRECTOR, ASSOCIATE DIRECTOR, ADMINISTRATIVE ASSISTANT, ASSISTANT DIRECTOR, STATE LEADERS AND DEPARTMENT HEADS), EXTENSION SPECIALISTS, DISTRICT AGENTS, AND COUNTY EXTENSION AGENTS. DISCUSSION OF…

  9. Surface Roughness and Critical Exponent Analyses of Boron-Doped Diamond Films Using Atomic Force Microscopy Imaging: Application of Autocorrelation and Power Spectral Density Functions

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Vierkant, G. P.

    2014-09-01

    The evolution of the surface roughness of growing metal or semiconductor thin films provides much needed information about their growth kinetics and corresponding mechanism. While some systems show stages of nucleation, coalescence, and growth, others exhibit varying microstructures for different process conditions. In view of these classifications, we report herein detailed analyses based on atomic force microscopy (AFM) characterization to extract the surface roughness and growth kinetics exponents of relatively low boron-doped diamond (BDD) films by utilizing the analytical power spectral density (PSD) and autocorrelation function (ACF) as mathematical tools. The machining industry has applied PSD for a number of years for tool design and analysis of wear and machined surface quality. Herein, we present similar analyses at the mesoscale to study the surface morphology as well as quality of BDD films grown using the microwave plasma-assisted chemical vapor deposition technique. PSD spectra as a function of boron concentration (in gaseous phase) are compared with those for samples grown without boron. We find that relatively higher boron concentration yields higher amplitudes of the longer-wavelength power spectral lines, with amplitudes decreasing in an exponential or power-law fashion towards shorter wavelengths, determining the roughness exponent ( α ≈ 0.16 ± 0.03) and growth exponent ( β ≈ 0.54), albeit indirectly. A unique application of the ACF, which is widely used in signal processing, was also applied to one-dimensional or line analyses (i.e., along the x- and y-axes) of AFM images, revealing surface topology datasets with varying boron concentration. Here, the ACF was used to cancel random surface "noise" and identify any spatial periodicity via repetitive ACF peaks or spatially correlated noise. Periodicity at shorter spatial wavelengths was observed for no doping and low doping levels, while smaller correlations were observed for relatively
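    The two mathematical tools named in this record, the one-dimensional power spectral density and the autocorrelation function of an AFM line profile, can be sketched as follows; the periodogram normalisation and the function names are assumptions for illustration rather than the authors' exact implementation.

```python
import numpy as np

def line_psd(heights, dx):
    """One-sided power spectral density of a 1-D AFM height profile.
    heights: surface heights along a scan line; dx: pixel spacing."""
    h = np.asarray(heights, dtype=float)
    h = h - h.mean()
    n = len(h)
    spectrum = np.fft.rfft(h)
    freqs = np.fft.rfftfreq(n, d=dx)        # spatial frequency (1/length)
    psd = (np.abs(spectrum) ** 2) * dx / n  # periodogram normalisation
    return freqs, psd

def line_acf(heights):
    """Normalised autocorrelation function of a 1-D height profile; repetitive
    peaks indicate spatial periodicity, rapid decay indicates random roughness."""
    h = np.asarray(heights, dtype=float)
    h = h - h.mean()
    acf = np.correlate(h, h, mode="full")[len(h) - 1:]
    return acf / acf[0]
```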

  10. Computational replication of the patient-specific stenting procedure for coronary artery bifurcations: From OCT and CT imaging to structural and hemodynamics analyses.

    PubMed

    Chiastra, Claudio; Wu, Wei; Dickerhoff, Benjamin; Aleiou, Ali; Dubini, Gabriele; Otake, Hiromasa; Migliavacca, Francesco; LaDisa, John F

    2016-07-26

    The optimal stenting technique for coronary artery bifurcations is still debated. With additional advances, computational simulations can soon be used to compare stent designs or strategies based on verified structural and hemodynamic results in order to identify the optimal solution for each individual's anatomy. In this study, patient-specific simulations of stent deployment were performed for 2 cases to replicate the complete procedure conducted by interventional cardiologists. Subsequent computational fluid dynamics (CFD) analyses were conducted to quantify hemodynamic quantities linked to restenosis. Patient-specific pre-operative models of coronary bifurcations were reconstructed from CT angiography and optical coherence tomography (OCT). Plaque location and composition were estimated from OCT and assigned to models, and structural simulations were performed in Abaqus. Artery geometries after virtual stent expansion of Xience Prime or Nobori stents created in SolidWorks were compared to post-operative geometry from OCT and CT before being extracted and used for CFD simulations in SimVascular. Inflow boundary conditions were based on body surface area, and downstream vascular resistances and capacitances were applied at branches to mimic physiology. Artery geometries obtained after virtual expansion were in good agreement with those reconstructed from patient images. Quantitative comparison of the distance between reconstructed and post-stent geometries revealed a maximum difference in area of 20.4%. Adverse indices of wall shear stress were more pronounced for thicker Nobori stents in both patients. These findings verify structural analyses of stent expansion, introduce a workflow to combine software packages for solid and fluid mechanics analysis, and underscore important stent design features from prior idealized studies. The proposed approach may ultimately be useful in determining an optimal choice of stent and position for each patient.

  11. Estimating the sample size required to detect an arterial spin labelling magnetic resonance imaging perfusion abnormality in voxel-wise group analyses.

    PubMed

    Mersov, Anna M; Crane, David E; Chappell, Michael A; Black, Sandra E; MacIntosh, Bradley J

    2015-04-30

    Voxel-based analyses are pervasive across the range of neuroimaging techniques. In the case of perfusion imaging using arterial spin labelling (ASL), a low signal-to-noise technique, there is a tradeoff between the contrast-to-noise required to detect a perfusion abnormality and its spatial localisation. In exploratory studies, the use of an a priori region of interest (ROI), which has the benefit of averaging multiple voxels, may not be justified. Thus the question considered in this study pertains to the sample size that is required to detect a voxel-level perfusion difference between groups and two algorithms are considered. Empirical 3T ASL data were acquired from 25 older adults and simulations were performed based on the group template cerebral blood flow (CBF) images. General linear model (GLM) and permutation-based algorithms were tested for their ability to detect a predefined hypoperfused ROI. Simulation parameters included: inter and intra-subject variability, degree of hypoperfusion and sample size. The true positive rate was used as a measure of sensitivity. For a modest group perfusion difference, i.e., 10%, 37 participants per group were required when using the permutation-based algorithm, whereas 20 participants were required for the GLM-based algorithm. This study advances the perfusion power calculation literature by considering a voxel-wise analysis with correction for multiple comparison. The sample size requirement to detect group differences decreased exponentially in proportion to increased degree of hypoperfusion. In addition, sensitivity to detect a perfusion abnormality was influenced by the choice of algorithm. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
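    A heavily simplified, single-voxel version of the kind of power simulation described in this record is sketched below: it estimates the true positive rate of an uncorrected two-sample t-test for a given fractional perfusion drop and between-subject variability. All parameter values and names are placeholders; the study itself used GLM and permutation analyses with correction for multiple comparisons over a template CBF image.

```python
import numpy as np
from scipy import stats

def single_voxel_power(n_per_group, baseline_cbf=50.0, drop_fraction=0.10,
                       between_subject_sd=7.5, alpha=0.001, n_sims=5000, seed=0):
    """Monte Carlo estimate of the probability that an uncorrected two-sample
    t-test detects a fractional CBF reduction at one voxel (true positive rate)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        controls = rng.normal(baseline_cbf, between_subject_sd, n_per_group)
        patients = rng.normal(baseline_cbf * (1.0 - drop_fraction),
                              between_subject_sd, n_per_group)
        _, p_value = stats.ttest_ind(controls, patients)
        hits += p_value < alpha
    return hits / n_sims

# e.g. sweep group sizes to find the smallest n reaching 80% power
# powers = {n: single_voxel_power(n) for n in range(10, 45, 5)}
```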

  12. Geochemical Features of Shallow Subduction Thrusts: Non-Destructive XRF Core-Imaging Scanner Analyses of NanTroSEIZE C0004 and C0007 Fault Zone Slabs

    NASA Astrophysics Data System (ADS)

    Yamaguchi, A.; Sakaguchi, A.; Sakamoto, T.; Iijima, K.; Kimura, G.; Ujiie, K.; Chester, F. M.; Fabbri, O.; Goldsby, D. L.; Tsutsumi, A.; Li, C.; Curewitz, D.

    2009-12-01

    images show that the slip zones are enriched in Al, K, Fe and depleted in Ca in spite of the difference of their settings. This chemical feature of slip zones probably reflects the increase of the amount of illite in slip zones. Concentration of illite in slip zones could be explained by 1) smectite-illite transition caused by frictional heating, 2) smectite-illite transition caused by hydrothermal fluid and water-rock interactions, or 3) mechanical adhesion of illite during fault movement. Positive anomaly of vitrinite reflectance around the C0004-28R-2 slip zone (Sakaguchi et al., this meeting) implies the possibility of frictional heating, yet further detailed mineralogical and microtextural analyses are needed to clarify the nature of the slip zones of shallow subduction thrusts.

  13. A SPITZER IRAC IMAGING SURVEY FOR T DWARF COMPANIONS AROUND M, L, AND T DWARFS: OBSERVATIONS, RESULTS, AND MONTE CARLO POPULATION ANALYSES

    SciTech Connect

    Carson, J. C.; Marengo, M.; Patten, B. M.; Hora, J. L.; Schuster, M. T.; Fazio, G. G.; Luhman, K. L.; Sonnett, S. M.; Allen, P. R.; Stauffer, J. R.; Schnupp, C.

    2011-12-20

    We report observational techniques, results, and Monte Carlo population analyses from a Spitzer Infrared Array Camera imaging survey for substellar companions to 117 nearby M, L, and T dwarf systems (median distance of 10 pc, mass range of 0.6 to ≈0.05 M☉). The two-epoch survey achieves typical detection sensitivities to substellar companions of [4.5 μm] ≤ 17.2 mag for angular separations between about 7'' and 165''. Based on common proper motion analysis, we find no evidence for new substellar companions. Using Monte Carlo orbital simulations (assuming random inclination, random eccentricity, and random longitude of pericenter), we conclude that the observational sensitivities translate to an ability to detect 600-1100 K brown dwarf companions at semimajor axes ≳35 AU and to detect 500-600 K companions at semimajor axes ≳60 AU. The simulations also estimate a 600-1100 K T dwarf companion fraction of <3.4% for 35-1200 AU separations and <12.4% for the 500-600 K companions for 60-1000 AU separations.
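    The flavour of these Monte Carlo orbital simulations can be conveyed with a reduced sketch that assumes circular orbits (the survey's simulations also randomised eccentricity and longitude of pericenter): it estimates how often a companion at a given semimajor axis would project into the survey's sensitive annulus. The function name and defaults are illustrative.

```python
import numpy as np

def fraction_in_annulus(a_au, distance_pc, inner_arcsec=7.0, outer_arcsec=165.0,
                        n_draws=100_000, seed=0):
    """Fraction of randomly oriented, randomly phased circular orbits whose
    instantaneous projected separation falls inside the sensitive annulus.
    Uses the small-angle relation: 1 AU at 1 pc subtends 1 arcsec."""
    rng = np.random.default_rng(seed)
    cos_i = rng.uniform(0.0, 1.0, n_draws)          # isotropic inclinations
    theta = rng.uniform(0.0, 2.0 * np.pi, n_draws)  # random orbital phase
    rho_au = a_au * np.sqrt(np.cos(theta) ** 2 + (cos_i * np.sin(theta)) ** 2)
    rho_arcsec = rho_au / distance_pc
    inside = (rho_arcsec >= inner_arcsec) & (rho_arcsec <= outer_arcsec)
    return inside.mean()
```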

  14. Sociopolitical Analyses.

    ERIC Educational Resources Information Center

    Van Galen, Jane, Ed.; And Others

    1992-01-01

    This theme issue of the serial "Educational Foundations" contains four articles devoted to the topic of "Sociopolitical Analyses." In "An Interview with Peter L. McLaren," Mary Leach presented the views of Peter L. McLaren on topics of local and national discourses, values, and the politics of difference. Landon E.…

  15. Histology-driven data mining of lipid signatures from multiple imaging mass spectrometry analyses: application to human colorectal cancer liver metastasis biopsies.

    PubMed

    Thomas, Aurélien; Patterson, Nathan Heath; Marcinkiewicz, Martin M; Lazaris, Anthoula; Metrakos, Peter; Chaurand, Pierre

    2013-03-05

    Imaging mass spectrometry (IMS) represents an innovative tool in the cancer research pipeline, which is increasingly being used in clinical and pharmaceutical applications. The unique properties of the technique, especially the amount of data generated, make the handling of data from multiple IMS acquisitions challenging. This work presents a histology-driven IMS approach aiming to identify discriminant lipid signatures from the simultaneous mining of IMS data sets from multiple samples. The feasibility of the developed workflow is evaluated on a set of three human colorectal cancer liver metastasis (CRCLM) tissue sections. Lipid IMS on tissue sections was performed using MALDI-TOF/TOF MS in both negative and positive ionization modes after 1,5-diaminonaphthalene matrix deposition by sublimation. The combination of both positive and negative acquisition results was performed during data mining to simplify the process and interrogate a larger lipidome into a single analysis. To reduce the complexity of the IMS data sets, a sub data set was generated by randomly selecting a fixed number of spectra from a histologically defined region of interest, resulting in a 10-fold data reduction. Principal component analysis confirmed that the molecular selectivity of the regions of interest is maintained after data reduction. Partial least-squares and heat map analyses demonstrated a selective signature of the CRCLM, revealing lipids that are significantly up- and down-regulated in the tumor region. This comprehensive approach is thus of interest for defining disease signatures directly from IMS data sets by the use of combinatory data mining, opening novel routes of investigation for addressing the demands of the clinical setting.
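    The data-reduction step described in this record, randomly keeping a fixed number of spectra per histology-defined region of interest before multivariate analysis, can be sketched as below; the array layout, function names, and the use of scikit-learn's PCA are assumptions for illustration, not the authors' software.

```python
import numpy as np
from sklearn.decomposition import PCA

def subsample_roi(spectra, roi_mask, n_keep=500, seed=0):
    """Randomly retain n_keep spectra (rows of `spectra`, pixels x m/z bins)
    belonging to a histology-defined ROI given by a boolean per-pixel mask."""
    rng = np.random.default_rng(seed)
    roi_indices = np.flatnonzero(roi_mask)
    keep = rng.choice(roi_indices, size=min(n_keep, roi_indices.size), replace=False)
    return spectra[keep]

def roi_pca_scores(reduced_spectra, n_components=2):
    """PCA scores used to check that the molecular selectivity of the ROI
    survives the ~10-fold data reduction."""
    return PCA(n_components=n_components).fit_transform(reduced_spectra)
```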

  16. Geologic analyses of LANDSAT-1 multispectral imagery of a possible power plant site employing digital and analog image processing. [in Pennsylvania

    NASA Technical Reports Server (NTRS)

    Lovegreen, J. R.; Prosser, W. J.; Millet, R. A.

    1975-01-01

    A site in the Great Valley subsection of the Valley and Ridge physiographic province in eastern Pennsylvania was studied to evaluate the use of digital and analog image processing for geologic investigations. Ground truth at the site was obtained by a field mapping program, a subsurface exploration investigation and a review of available published and unpublished literature. Remote sensing data were analyzed using standard manual techniques. LANDSAT-1 imagery was analyzed using digital image processing employing the multispectral Image 100 system and using analog color processing employing the VP-8 image analyzer. This study deals primarily with linears identified by image processing, their correlation with known structural features and with linears identified by manual interpretation, and the identification of rock outcrops in areas of extensive vegetative cover using image processing. The results of this study indicate that image processing can be a cost-effective tool for evaluating geologic and linear features for regional studies encompassing large areas such as for power plant siting. Digital image processing can be an effective tool for identifying rock outcrops in areas of heavy vegetative cover.

  17. A new semi-quantitative approach for analysing 3T diffusion tensor imaging of optic fibres and its clinical evaluation in glaucoma.

    PubMed

    Engelhorn, Tobias; Haider, Sultan; Michelson, Georg; Doerfler, Arnd

    2010-10-01

    Diffusion tensor imaging can depict rarefaction of the optical fibers. Manual segmentation is time consuming. The purposes of the study were (1) to present a new semiquantitative segmentation approach for analyzing 3-T diffusion tensor imaging of optical fibers and (2) to clinically test the new approach by comparing optic fiber rarefaction in patients with glaucoma to that in age-matched, healthy controls. To perform semiautomated and quantitative segmentation of the optical radiation, a Mathcad-based software program was developed. The results were compared to the manual evaluation of the images performed by two experienced neuroradiologists. The eyes of 42 subjects (22 patients with glaucoma and 20 controls) aged 37 to 86 years were assessed in full ophthalmologic examinations. Magnetic resonance imaging was performed using a 3-T high-field scanner. The evaluation using the new approach matched 94% with manually acquired rarefaction of the optical radiation; Cronbach's α was >0.81 for calculation of the manually and semiautomatically derived volumes. The new approach seems to be robust and is clearly faster compared to the more tedious manual segmentation. Using diffusion tensor imaging at 3 T, it could be shown that there was increasing atrophy of the optical radiation (fourth neuron) with increasing age in patients with glaucoma. Compared to age-matched, healthy patients, more pronounced atrophy of the fourth neuron was found in patients with glaucoma. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.

  18. Analyses of sexual dimorphism of contemporary Japanese using reconstructed three-dimensional CT images--curvature of the best-fit circle of the greater sciatic notch.

    PubMed

    Biwasaka, Hitoshi; Aoki, Yasuhiro; Tanijiri, Toyohisa; Sato, Kei; Fujita, Sachiko; Yoshioka, Kunihiro; Tomabechi, Makiko

    2009-04-01

    We examined various expression methods of sexual dimorphism of the greater sciatic notch (GSN) of the pelvis in contemporary Japanese residents by analyzing the three-dimensional (3D) images reconstructed from multi-slice computed tomography (CT) data, using image-processing and measurement software. Mean error of anthropological measurement values between two skeletonized pelves and their reconstructed 3D-CT images was 1.4%. A spline curve was set along the edge of the GSN of reconstructed pelvic 3D-CT images. Then a best-fit circle for subsets of the spline curve, 5-60 mm in length and passing through the deepest point (inflection point) of the GSN, was created, and the radius of the circle (curvature radius) and its ratio to the maximum pelvic height (curvature quotient) were computed. In analysis of images reconstructed from CT data of 180 individuals (male: 91, female: 89), sexes were correctly identified in 89.4% of specimens, with a spline curve length of 60 mm. Because sexing was possible even in deeper regions of the GSN, which are relatively resistant to postmortem damage, the present method may be useful for practical forensic investigation.
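    The curvature-radius measurement underlying this method, fitting a best-fit circle to points sampled along the notch edge, can be sketched with an algebraic (Kåsa) least-squares fit; the assumption that the edge points have been projected into a plane, and the function name, are illustrative rather than taken from the study's software.

```python
import numpy as np

def best_fit_circle(points_2d):
    """Algebraic (Kasa) least-squares circle fit to 2-D points along the GSN edge.
    Returns the centre and the radius (the 'curvature radius')."""
    pts = np.asarray(points_2d, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    # (x - cx)^2 + (y - cy)^2 = r^2  rewritten as a linear system in (cx, cy, c)
    A = np.column_stack([2.0 * x, 2.0 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), radius

# curvature quotient = radius / maximum pelvic height, compared against a
# cut-off value to assign sex
```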

  19. Active brain changes after initiating fingolimod therapy in multiple sclerosis patients using individual voxel-based analyses for diffusion tensor imaging.

    PubMed

    Senda, Joe; Watanabe, Hirohisa; Endo, Kuniyuki; Yasui, Keizo; Hawsegawa, Yasuhiro; Yoneyama, Noritaka; Tsuboi, Takashi; Hara, Kazuhiro; Ito, Mizuki; Atsuta, Naoki; Epifanio, Bagarinao; Katsuno, Masahisa; Naganawa, Shinji; Sobue, Gen

    2016-12-01

    Voxel-based analysis (VBA) of diffusion tensor images (DTI) and voxel-based morphometry (VBM) in patients with multiple sclerosis (MS) can sensitively detect occult tissue damage that underlies pathological changes in the brain. In the present study, both at the start of fingolimod and post-four months clinical remission, we assessed four patients with MS who were evaluated with VBA of DTI, VBM, and fluid-attenuated inversion recovery (FLAIR). DTI images for all four patients showed widespread areas of increased mean diffusivity (MD) and decreased fractional anisotropy (FA) that were beyond the high-intensity signal areas across images. After four months of continuous fingolimod therapy, DTI abnormalities progressed; in particular, MD was significantly increased, while brain volume and high-intensity signals were unchanged. These findings suggest that VBA of DTI (e.g., MD) may help assess MS demyelination as neuroinflammatory conditions, even though clinical manifestations of MS appear to be in complete remission during fingolimod.

  20. Effects of MK-801 treatment across several pre-clinical analyses including a novel assessment of brain metabolic function utilizing PET and CT fused imaging in live rats.

    PubMed

    Daya, R P; Bhandari, J K; Hui, P A; Tian, Y; Farncombe, T; Mishra, R K

    2014-02-01

    Functional imaging studies in schizophrenic patients have demonstrated metabolic brain abnormalities during cognitive tasks. This study aimed to 1) introduce a novel analysis of brain metabolic function in live animals to characterize the hypo- and hyperfrontality phenomena observed in schizophrenia and following NMDA antagonist exposure, and 2) identify a robust and representative MK-801 treatment regimen that effectively models brain metabolic abnormalities as well as a range of established behavioural abnormalities representative of schizophrenia. The validity of the MK-801 animal model was examined across several established pre-clinical tests, and a novel assessment of brain metabolic function using PET/CT fused imaging. In the present study, MK-801 was administered acutely at 0.1 mg/kg and 0.5 mg/kg, and sub-chronically at 0.5 mg/kg daily for 7 days. Acute treatment at 0.5 mg/kg-disrupted facets of memory measured through performance in the 8-arm radial maze task and generated abnormalities in sensorimotor gating, social interaction and locomotor activity. Furthermore, this treatment regimen induced hyperfrontality (increased brain metabolic function in the prefrontal area) observed via PET/CT fused imaging in the live rat. While PET and CT fused imaging in the live rat offers a functional representation of metabolic function, more advanced PET/CT integration is required to analyze more discrete brain regions. These findings provide insight on the effectiveness of the MK-801 pre-clinical model of schizophrenia and provide an optimal regimen to model schizophrenia. PET/CT fused imaging offers a highly translatable tool to assess hypo- and hyperfrontality in live animals. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. PCaAnalyser: A 2D-Image Analysis Based Module for Effective Determination of Prostate Cancer Progression in 3D Culture

    PubMed Central

    Lovitt, Carrie J.; Avery, Vicky M.

    2013-01-01

    Three-dimensional (3D) in vitro cell based assays for Prostate Cancer (PCa) research are rapidly becoming the preferred alternative to that of conventional 2D monolayer cultures. 3D assays more precisely mimic the microenvironment found in vivo, and thus are ideally suited to evaluate compounds and their suitability for progression in the drug discovery pipeline. To achieve the desired high throughput needed for most screening programs, automated quantification of 3D cultures is required. Towards this end, this paper reports on the development of a prototype analysis module for an automated high-content-analysis (HCA) system, which allows for accurate and fast investigation of in vitro 3D cell culture models for PCa. The Java based program, which we have named PCaAnalyser, uses novel algorithms that allow accurate and rapid quantitation of protein expression in 3D cell culture. As currently configured, the PCaAnalyser can quantify a range of biological parameters including: nuclei-count, nuclei-spheroid membership prediction, various function based classification of peripheral and non-peripheral areas to measure expression of biomarkers and protein constituents known to be associated with PCa progression, as well as defining segregate cellular-objects effectively for a range of signal-to-noise ratios. In addition, PCaAnalyser architecture is highly flexible, operating as a single independent analysis, as well as in batch mode; essential for High-Throughput-Screening (HTS). Utilising the PCaAnalyser, accurate and rapid analysis in an automated high throughput manner is provided, and reproducible analysis of the distribution and intensity of well-established markers associated with PCa progression in a range of metastatic PCa cell-lines (DU145 and PC3) in a 3D model demonstrated. PMID:24278197

  2. PCaAnalyser: a 2D-image analysis based module for effective determination of prostate cancer progression in 3D culture.

    PubMed

    Hoque, Md Tamjidul; Windus, Louisa C E; Lovitt, Carrie J; Avery, Vicky M

    2013-01-01

    Three-dimensional (3D) in vitro cell based assays for Prostate Cancer (PCa) research are rapidly becoming the preferred alternative to that of conventional 2D monolayer cultures. 3D assays more precisely mimic the microenvironment found in vivo, and thus are ideally suited to evaluate compounds and their suitability for progression in the drug discovery pipeline. To achieve the desired high throughput needed for most screening programs, automated quantification of 3D cultures is required. Towards this end, this paper reports on the development of a prototype analysis module for an automated high-content-analysis (HCA) system, which allows for accurate and fast investigation of in vitro 3D cell culture models for PCa. The Java based program, which we have named PCaAnalyser, uses novel algorithms that allow accurate and rapid quantitation of protein expression in 3D cell culture. As currently configured, the PCaAnalyser can quantify a range of biological parameters including: nuclei-count, nuclei-spheroid membership prediction, various function based classification of peripheral and non-peripheral areas to measure expression of biomarkers and protein constituents known to be associated with PCa progression, as well as defining segregate cellular-objects effectively for a range of signal-to-noise ratios. In addition, PCaAnalyser architecture is highly flexible, operating as a single independent analysis, as well as in batch mode; essential for High-Throughput-Screening (HTS). Utilising the PCaAnalyser, accurate and rapid analysis in an automated high throughput manner is provided, and reproducible analysis of the distribution and intensity of well-established markers associated with PCa progression in a range of metastatic PCa cell-lines (DU145 and PC3) in a 3D model demonstrated.

  3. Histochemical analyses and quantum dot imaging of microvascular blood flow with pulmonary edema in living mouse lungs by "in vivo cryotechnique".

    PubMed

    Saitoh, Yurika; Terada, Nobuo; Saitoh, Sei; Ohno, Nobuhiko; Jin, Takashi; Ohno, Shinichi

    2012-02-01

    Light microscopic imaging of blood vessels and distribution of serum proteins is essential to analyze hemodynamics in living animal lungs under normal respiration or respiratory diseases. In this study, to demonstrate dynamically changing morphology and immunohistochemical images of their living states, "in vivo cryotechnique" (IVCT) combined with freeze-substitution fixation was applied to anesthetized mouse lungs. By hematoxylin-eosin staining, morphological features, such as shapes of alveolar septum and sizes of alveolar lumen, reflected their respiratory conditions in vivo, and alveolar capillaries were filled with variously shaped erythrocytes. Albumin was usually immunolocalized in the capillaries, which was confirmed by double-immunostaining for aquaporin-1 of endothelium. To capture accurate time-courses of blood flow in peripheral pulmonary alveoli, glutathione-coated quantum dots (QDs) were injected into right ventricles, and then IVCT was performed at different time-points after the QD injection. QDs were localized in most arterioles and some alveolar capillaries at 1 s, and later in venules at 2 s, reflecting a typical blood flow direction in vivo. Three-dimensional QD images of microvascular networks were reconstructed by confocal laser scanning microscopy. It was also applied to lungs of acute pulmonary hypertension mouse model. Erythrocytes were crammed in blood vessels, and some serum components leaked into alveolar lumens, as confirmed by mouse albumin immunostaining. Some separated collagen fibers and connecting elastic fibers were still detected in edematous tunica adventitia near terminal bronchioles. Thus, IVCT combined with histochemical approaches enabled us to capture native images of dynamically changing structures and microvascular hemodynamics of living mouse lungs.

  4. Micro-flow imaging analyses reflect mechanisms of aggregate formation: Comparing protein particle data sets using the Kullback-Leibler divergence.

    PubMed

    Maddux, Nathaniel R; Daniels, Austin L; Randolph, Theodore W

    2017-01-31

    Sub-visible particles in therapeutic protein formulations are an increasing manufacturing and regulatory concern due to their potential to cause adverse immune responses. Flow imaging microscopy is used extensively to detect sub-visible particles and investigate product deviations, typically by comparing imaging data using histograms of particle descriptors. Such an approach discards much information, and requires effort to interpret differences, which is problematic when comparing many data sets. We propose to compare imaging data by using the Kullback-Leibler divergence, an information theoretic measure of the difference of distributions.(1) We use the divergence to generate scatter plots representing the similarity between data sets, and to classify new data into previously determined categories. Our approach is multidimensional, automated and less biased than traditional techniques. We demonstrate the method with FlowCAM® imagery of protein aggregates acquired from monoclonal antibody samples subjected to different stresses. The method succeeds in classifying aggregated samples by stress condition, and, once trained, is able to identify the stress that caused aggregate formation in new samples. In addition to potentially detecting subtle incipient manufacturing faults, the method may have applications to verification of product uniformity after manufacturing changes, identification of counterfeit products, and development of closely matching bio-similar products.
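    A one-dimensional illustration of the proposed comparison metric is given below; the method itself is multidimensional over several particle descriptors, and the binning, smoothing constant, and function name here are assumptions for illustration only.

```python
import numpy as np

def kl_divergence_1d(sample_p, sample_q, bins=32, eps=1e-9):
    """Kullback-Leibler divergence D(P||Q) between two samples of a particle
    descriptor (e.g. equivalent circular diameter), estimated from histograms
    on a shared binning. Note that D(P||Q) != D(Q||P) in general."""
    sample_p = np.asarray(sample_p, dtype=float)
    sample_q = np.asarray(sample_q, dtype=float)
    lo = min(sample_p.min(), sample_q.min())
    hi = max(sample_p.max(), sample_q.max())
    p, edges = np.histogram(sample_p, bins=bins, range=(lo, hi))
    q, _ = np.histogram(sample_q, bins=edges)
    p = p.astype(float) + eps        # smoothing avoids log(0) and division by 0
    q = q.astype(float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))
```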

  5. Active brain changes after initiating fingolimod therapy in multiple sclerosis patients using individual voxel-based analyses for diffusion tensor imaging

    PubMed Central

    Senda, Joe; Watanabe, Hirohisa; Endo, Kuniyuki; Yasui, Keizo; Hawsegawa, Yasuhiro; Yoneyama, Noritaka; Tsuboi, Takashi; Hara, Kazuhiro; Ito, Mizuki; Atsuta, Naoki; Epifanio Jr, Bagarinao; Katsuno, Masahisa; Naganawa, Shinji; Sobue, Gen

    2016-01-01

    ABSTRACT Voxel-based analysis (VBA) of diffusion tensor images (DTI) and voxel-based morphometry (VBM) in patients with multiple sclerosis (MS) can sensitively detect occult tissue damage that underlies pathological changes in the brain. In the present study, both at the start of fingolimod and post-four months clinical remission, we assessed four patients with MS who were evaluated with VBA of DTI, VBM, and fluid-attenuated inversion recovery (FLAIR). DTI images for all four patients showed widespread areas of increased mean diffusivity (MD) and decreased fractional anisotropy (FA) that were beyond the high-intensity signal areas across images. After four months of continuous fingolimod therapy, DTI abnormalities progressed; in particular, MD was significantly increased, while brain volume and high-intensity signals were unchanged. These findings suggest that VBA of DTI (e.g., MD) may help assess MS demyelination as neuroinflammatory conditions, even though clinical manifestations of MS appear to be in complete remission during fingolimod. PMID:28008201

  6. Assimilating All-Sky GPM Microwave Imager(GMI) Radiance Data in NASA GEOS-5 System for Global Cloud and Precipitation Analyses

    NASA Astrophysics Data System (ADS)

    Kim, M. J.; Jin, J.; McCarty, W.; Todling, R.; Holdaway, D. R.; Gelaro, R.

    2014-12-01

    The NASA Global Modeling and Assimilation Office (GMAO) works to maximize the impact of satellite observations in the analysis and prediction of climate and weather through integrated Earth system modeling and data assimilation. To achieve this goal, the GMAO undertakes model and assimilation development, generates products to support NASA instrument teams and the NASA Earth science program. Currently Atmospheric Data Assimilation System (ADAS) in the Goddard Earth Observing System Model, Version 5(GEOS-5) system combines millions of observations and short-term forecasts to determine the best estimate, or analysis, of the instantaneous atmospheric state. However, ADAS has been geared towards utilization of observations in clear sky conditions and the majority of satellite channel data affected by clouds are discarded. Microwave imager data from satellites can be a significant source of information for clouds and precipitation but the data are presently underutilized, as only surface rain rates from the Tropical Rainfall Measurement Mission (TRMM) Microwave Imager (TMI) are assimilated with small weight assigned in the analysis process. As clouds and precipitation often occur in regions with high forecast sensitivity, improvements in the temperature, moisture, wind and cloud analysis of these regions are likely to contribute to significant gains in numerical weather prediction accuracy. This presentation is intended to give an overview of GMAO's recent progress in assimilating the all-sky GPM Microwave Imager (GMI) radiance data in GEOS-5 system. This includes development of various new components to assimilate cloud and precipitation affected data in addition to data in clear sky condition. New observation operators, quality controls, moisture control variables, observation and background error models, and a methodology to incorporate the linearlized moisture physics in the assimilation system are described. In addition preliminary results showing impacts of

  7. Analyses of sexual dimorphism of reconstructed pelvic computed tomography images of contemporary Japanese using curvature of the greater sciatic notch, pubic arch and greater pelvis.

    PubMed

    Biwasaka, Hitoshi; Aoki, Yasuhiro; Sato, Kei; Tanijiri, Toyohisa; Fujita, Sachiko; Dewa, Koji; Yoshioka, Kunihiro; Tomabechi, Makiko

    2012-06-10

    Three-dimensional pelvic images were reconstructed from multi-slice CT data of contemporary Japanese (males: 124; females: 104, 25-92 years old), and curvature analysis to examine sexual dimorphism was carried out in the greater sciatic notch (GSN), the pubic arch and the greater pelvis in the images. Reconstructed pelvic CT images were visualized fairly well and anatomical landmarks were easily recognizable. When calculating the radii (curvature radii) of the best-fit circles for the spline curve lines set along the edges of the GSNs and of the pubic arches, sexes from these regions were correctly identified in 89.1% (males: 93.8%; females: 83.7%) and 94.7% (males: 97.3%; females: 91.8%) of cases, respectively, by setting an appropriate cut-off value. Furthermore, sexing was possible even in deeper regions of the GSN which are relatively resistant to postmortem damage. Curvature radii of the best-fit spheres of greater pelves showed no significant difference between sexes. However, curvature of the best-fit sphere for the left iliac fossa was significantly larger than that of the right one (p < 10^-24) in males, and the ratios were >1.0 in 88% of all male specimens analyzed. Meanwhile, no significant difference was observed among female samples. Although some left-sided dominancy has been reported in 2-dimensional measurements of the human pelvis, this 3-dimensional laterality in males was much more significant, and is a potential index of sex difference.

  8. Analyses of requirements for computer control and data processing experiment subsystems: Image data processing system (IDAPS) software description (7094 version), volume 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A description of each of the software modules of the Image Data Processing System (IDAPS) is presented. The changes in the software modules are the result of additions to the application software of the system and an upgrade of the IBM 7094 Mod(1) computer to a 1301 disk storage configuration. Necessary information about IDAPS software is supplied to the computer programmer who desires to make changes in the software system or who desires to use portions of the software outside of the IDAPS system. Each software module is documented with: module name, purpose, usage, common block(s) description, method (algorithm of subroutine), flow diagram (if needed), subroutines called, and storage requirements.

  9. Image

    SciTech Connect

    Marsh, Amber; Harsch, Tim; Pitt, Julie; Firpo, Mike; Lekin, April; Pardes, Elizabeth

    2007-08-31

    The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks; scripts are available to insert, update and delete data from the underlying Oracle database, download data from NCBI's Genbank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface, and various tools available on the project web-site (image.llnl.gov) that provide a search interface to the database.

  10. In vivo cardiac nano-imaging: A new technology for high-precision analyses of sarcomere dynamics in the heart.

    PubMed

    Shimozawa, Togo; Hirokawa, Erisa; Kobirumaki-Shimozawa, Fuyu; Oyama, Kotaro; Shintani, Seine A; Terui, Takako; Kushida, Yasuharu; Tsukamoto, Seiichi; Fujii, Teruyuki; Ishiwata, Shin'ichi; Fukuda, Norio

    2017-03-01

    The cardiac pump function is a result of a rise in intracellular Ca(2+) and the ensuing sarcomeric contractions [i.e., excitation-contraction (EC) coupling] in myocytes in various locations of the heart. In order to elucidate the heart's mechanical properties under various settings, cardiac imaging is widely performed in today's clinical as well as experimental cardiology by using echocardiogram, magnetic resonance imaging and computed tomography. However, because these common techniques detect local myocardial movements at a spatial resolution of ∼100 μm, our knowledge on the sub-cellular mechanisms of the physiology and pathophysiology of the heart in vivo is limited. This is because (1) EC coupling occurs in the μm partition in a myocyte and (2) cardiac sarcomeres generate active force upon a length change of ∼100 nm on a beat-to-beat basis. Recent advances in optical technologies have enabled measurements of intracellular Ca(2+) dynamics and sarcomere length displacements at high spatial and temporal resolution in the beating heart of living rodents. Future studies with these technologies are warranted to open a new era in cardiac research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Characteristics and Origin of a Cratered Unit near the MSL Bradbury Landing Site (Gale Crater, Mars) Based on Analyses of Surface Data and Orbital Images

    NASA Astrophysics Data System (ADS)

    Jacob, S.; Rowland, S. K.; Edgett, K. S.; Kah, L. C.; Wiens, R. C.; Day, M. D.; Calef, F.; Palucis, M. C.; Anderson, R. B.

    2014-12-01

    Using orbiter images, the Curiosity landing ellipse was mapped as six distinct units based mainly on geomorphic characteristics. These units are the alluvial fan material (ALF), fractured light-toned surface (FLT), cratered plains/surfaces (CS), smooth hummocky plains (SH), rugged unit (RU) and striated light-toned outcrops (SLT) (Grotzinger et al., 2014; DOI: 10.1126/science.1242777). The goal of this project was to characterize and determine the origin of the CS. The CS is a thin, nearly horizontal, erosion resistant capping unit. HiRISE mosaics were utilized to subdivide the CS into four geomorphic sub-units. Crater densities were calculated for each sub-unit to provide a quantifiable parameter that could aid in understanding how the sub-units differ. Mastcam images from many locations along Curiosity's traverse show fields of dark, massive boulders, which are presumably erosional remnants of the CS. This indicates that the CS was likely more laterally extensive in the past. In situ CS outcrops, seen at Shaler and multiple locations near the Zabriskie Plateau, appear to have a rough, wind-sculpted surface and may consist of two distinct lithologies. The lower lithology displays hints of layering that have been enhanced by differential weathering, whereas the upper lithology consists of dark, massive rock. When present, the outcrops can extend laterally for several meters, but Mastcam images of outcrops do not always reveal both sections. ChemCam data show that CS targets have concentrations of Al, Na, and K that are indicative of an alkali feldspar phase. The physical and chemical characteristics of the CS suggest a massive deposit that has seen little to no chemical alteration. Physical characteristics of the CS do not allow us to unambiguously determine its geologic origin; possible emplacement mechanisms would require the ability to spread laterally over a nearly horizontal surface, and include inflating lava (e.g., pāhoehoe) or a distal delta deposit. The

  12. Imaging analyses of coagulation-dependent initiation of fibrinolysis on activated platelets and its modification by thrombin-activatable fibrinolysis inhibitor.

    PubMed

    Brzoska, Tomasz; Suzuki, Yuko; Sano, Hideto; Suzuki, Seiichirou; Tomczyk, Martyna; Tanaka, Hiroki; Urano, Tetsumei

    2017-04-03

    Using intravital confocal microscopy, we observed previously that the process of platelet phosphatidylserine (PS) exposure, fibrin formation and lysine binding site-dependent plasminogen (plg) accumulation took place only in the centre of thrombi, not at their periphery. These findings prompted us to analyse the spatiotemporal regulatory mechanisms underlying coagulation and fibrinolysis. We analysed the fibrin network formation and the subsequent lysis in an in vitro experiment using diluted platelet-rich plasma supplemented with fluorescently labelled coagulation and fibrinolytic factors, using confocal laser scanning microscopy. The structure of the fibrin network formed by supplemented tissue factor was uneven and denser at the sites of coagulation initiation regions (CIRs) on PS-exposed platelets. When tissue-type plasminogen activator (tPA; 7.5 nM) was supplemented, labelled plg (50 nM) as well as tPA accumulated at CIRs, from where fibrinolysis started and gradually expanded to the peripheries. The lysis time at CIRs and their peripheries (50 µm from the CIR) were 27.9 ± 6.6 and 44.4 ± 9.7 minutes (mean ± SD, n=50 from five independent experiments) after the addition of tissue factor, respectively. Recombinant human soluble thrombomodulin (TMα; 2.0 nM) attenuated the CIR-dependent plg accumulation and strongly delayed fibrinolysis at CIRs. A carboxypeptidase inhibitor dose-dependently enhanced the CIR-dependent fibrinolysis initiation, and at 20 µM it completely abrogated the TMα-induced delay of fibrinolysis. Our findings are the first to directly present crosstalk between coagulation and fibrinolysis, which takes place on activated platelets' surface and is further controlled by thrombin-activatable fibrinolysis inhibitor (TAFI).

  13. SU-D-207A-02: Possible Characterization of the Brain Tumor Vascular Environment by a Novel Strategy of Quantitative Analysis in Dynamic Contrast Enhanced MR Imaging: A Combination of Both Patlak and Logan Analyses

    SciTech Connect

    Yee, S; Chinnaiyan, P; Wloch, J; Pirkola, M; Yan, D

    2016-06-15

    Purpose: The majority of quantitative analyses involving dynamic contrast enhanced (DCE) MRI have been performed to obtain kinetic parameters such as Ktrans and ve. Such analyses are generally performed assuming a “reversible” tissue compartment, where the tracer is assumed to be rapidly equilibrated between the plasma and tissue compartments. However, some tumor vascular environments may be more suited for a “non-reversible” tissue compartment, where, as with FDG PET imaging, the tracer is continuously deposited into the tissue compartment (or the return back to the plasma compartment is very slow in the imaging time scale). Therefore, Patlak and Logan analyses, which represent tools for the “non-reversible” and “reversible” modeling, respectively, were performed to better characterize the brain tumor vascular environment. Methods: A voxel-by-voxel analysis was performed to generate both Patlak and Logan plots in two brain tumor patients, one with grade III astrocytoma and the other with grade IV astrocytoma or glioblastoma. The slopes of plots and the r-square were then obtained by linear fitting and compared for each voxel. Results: The 2-dimensional scatter plots of Logan (Y-axis) vs. Patlak slopes (X-axis) clearly showed increased Logan slopes for glioblastoma (Figure 3A). The scatter plots of goodness-of-fit (Figure 3B) also suggested glioblastoma, relative to grade III astrocytoma, might consist of more voxels that are kinetically Logan-like (i.e. rapidly equilibrated extravascular space and active vascular environment). Therefore, the enhanced Logan-like behavior (and the Logan slope) in glioblastoma may imply an increased fraction of active vascular environment, while the enhanced Patlak-like behavior implies the vascular environment permitting a relatively slower washout of the tracer. Conclusion: Although further verification is required, the combination of Patlak and Logan analyses in DCE MRI may be useful in characterizing the tumor
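    Both graphical analyses described here amount to a linear fit over the late, linear portion of a transformed tissue and plasma curve; a minimal sketch follows, in which the frame times, concentration curves, and the late-time threshold are placeholder inputs rather than values from the study.

```python
import numpy as np

def graphical_slope(t, c_tissue, c_plasma, t_star=10.0, model="patlak"):
    """Slope of a Patlak ('irreversible', ~Ki) or Logan ('reversible', ~VT)
    plot, fitted over frames with t >= t_star where the plot is linear.
    t, c_tissue, c_plasma: 1-D arrays of frame mid-times and concentrations."""
    t, ct, cp = (np.asarray(a, dtype=float) for a in (t, c_tissue, c_plasma))
    # Cumulative trapezoidal integral of a curve over the frame times
    cumtrapz = lambda c: np.concatenate(
        ([0.0], np.cumsum(0.5 * np.diff(t) * (c[1:] + c[:-1]))))
    int_cp, int_ct = cumtrapz(cp), cumtrapz(ct)
    late = t >= t_star
    if model == "patlak":
        x, y = int_cp[late] / cp[late], ct[late] / cp[late]
    else:  # "logan"
        x, y = int_cp[late] / ct[late], int_ct[late] / ct[late]
    slope, _intercept = np.polyfit(x, y, 1)
    return slope
```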

  14. [Imaging].

    PubMed

    Chevrot, A; Drapé, J L; Godefroy, D; Dupont, A M; Pessis, E; Sarazin, L; Minoui, A

    1997-01-01

    The panoply of imaging techniques useful in podology is essentially limited to X-rays. Standard "standing" and "lying" X-rays furnish most of the required information. Arthrography is sometimes performed, in particular for trauma or tumour of the ankle. CT scan and MRI make a decisive contribution in difficult cases, notably in fractures and in small fractures without displacement. The two latter techniques are useful in tendon, ligament and muscular disorders, where echography is also informative. Rigorous analysis of radiographies and a good knowledge of foot disorders make these imaging techniques efficacious.

  15. Clinical, imaging, and immunohistochemical characteristics of focal cortical dysplasia Type II extratemporal epilepsies in children: analyses of an institutional case series.

    PubMed

    Knerlich-Lukoschus, Friederike; Connolly, Mary B; Hendson, Glenda; Steinbok, Paul; Dunham, Christopher

    2017-02-01

    OBJECTIVE Focal cortical dysplasia (FCD) Type II is divided into 2 subgroups based on the absence (IIA) or presence (IIB) of balloon cells. In particular, extratemporal FCD Type IIA and IIB is not completely understood in terms of clinical, imaging, biological, and neuropathological differences. The aim of the authors was to analyze distinctions between these 2 formal entities and address clinical, MRI, and immunohistochemical features of extratemporal epilepsies in children. METHODS Cases formerly classified as Palmini FCD Type II nontemporal epilepsies were identified through the prospectively maintained epilepsy database at the British Columbia Children's Hospital in Vancouver, Canada. Clinical data, including age of seizure onset, age at surgery, seizure type(s) and frequency, affected brain region(s), intraoperative electrocorticographic findings, and outcome defined by Engel's classification were obtained for each patient. Preoperative and postoperative MRI results were reevaluated. H & E-stained tissue sections were reevaluated by using the 2011 International League Against Epilepsy classification system and additional immunostaining for standard cellular markers (neuronal nuclei, neurofilament, glial fibrillary acidic protein, CD68). Two additional established markers of pathology in epilepsy resection, namely, CD34 and α-B crystallin, were applied. RESULTS Seven nontemporal FCD Type IIA and 7 Type B cases were included. Patients with FCD Type IIA presented with an earlier age of epilepsy onset and slightly better Engel outcome. Radiology distinguished FCD Types IIA and IIB, in that Type IIB presented more frequently with characteristic cortical alterations. Nonphosphorylated neurofilament protein staining confirmed dysplastic cells in dyslaminated areas. The white-gray matter junction was focally blurred in patients with FCD Type IIB. α-B crystallin highlighted glial cells in the white matter and subpial layer with either of the 2 FCD Type II subtypes

  16. Images.

    ERIC Educational Resources Information Center

    Barr, Catherine, Ed.

    1997-01-01

    The theme of this month's issue is "Images"--from early paintings and statuary to computer-generated design. Resources on the theme include Web sites, CD-ROMs and software, videos, books, and others. A page of reproducible activities is also provided. Features include photojournalism, inspirational Web sites, art history, pop art, and myths. (AEF)

  17. Flow modification in canine intracranial aneurysm model by an asymmetric stent: studies using digital subtraction angiography (DSA) and image-based computational fluid dynamics (CFD) analyses

    NASA Astrophysics Data System (ADS)

    Hoi, Yiemeng; Ionita, Ciprian N.; Tranquebar, Rekha V.; Hoffmann, Kenneth R.; Woodward, Scott H.; Taulbee, Dale B.; Meng, Hui; Rudin, Stephen

    2006-03-01

    An asymmetric stent with low porosity patch across the intracranial aneurysm neck and high porosity elsewhere is designed to modify the flow to result in thrombogenesis and occlusion of the aneurysm and yet to reduce the possibility of also occluding adjacent perforator vessels. The purposes of this study are to evaluate the flow field induced by an asymmetric stent using both numerical and digital subtraction angiography (DSA) methods and to quantify the flow dynamics of an asymmetric stent in an in vivo aneurysm model. We created a vein-pouch aneurysm model on the canine carotid artery. An asymmetric stent was implanted at the aneurysm, with 25% porosity across the aneurysm neck and 80% porosity elsewhere. The aneurysm geometry, before and after stent implantation, was acquired using cone beam CT and reconstructed for computational fluid dynamics (CFD) analysis. Both steady-state and pulsatile flow conditions using the measured waveforms from the aneurysm model were studied. To reduce computational costs, we modeled the asymmetric stent effect by specifying a pressure drop over the layer across the aneurysm orifice where the low porosity patch was located. From the CFD results, we found the asymmetric stent reduced the inflow into the aneurysm by 51%, and appeared to create a stasis-like environment which favors thrombus formation. The DSA sequences also showed substantial flow reduction into the aneurysm. Asymmetric stents may be a viable image guided intervention for treating intracranial aneurysms with desired flow modification features.

  18. Utilizing magnetic resonance imaging logs, open hole logs and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.; Morganti, J.; White, H.

    1995-06-01

    NMR logging using the new C series Magnetic Resonance Imaging Logging (MRIL™) is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeability and effective porosities, MRIL data can help petrophysicists evaluate low resistivity pays. In these instances, conventional open hole logs may not define all of the pay intervals. MRIL can also minimize unnecessary completions in zones of potentially high water-cut. This case study will briefly discuss MRIL tool theory and log presentations used with the conventional logs and sidewall cores. SEM analysis will show a good correlation of varying grain size sands with the T2 distribution and bulk volume irreducible from MRIL. Discussions of each well in the study area will show how water-free production zones were defined. Because the MRIL data was not recorded on one of the wells, the advanced petrophysical program HORIZON was used to predict the MRIL bulk volume irreducible and effective porosity to estimate productive zones. Discussion of additional formation characteristics, completion procedures, actual production and predicted producibility of the shaly sands will be presented.

  19. IDATEN and G-SITENNO: GUI-assisted software for coherent X-ray diffraction imaging experiments and data analyses at SACLA.

    PubMed

    Sekiguchi, Yuki; Yamamoto, Masaki; Oroguchi, Tomotaka; Takayama, Yuki; Suzuki, Shigeyuki; Nakasako, Masayoshi

    2014-11-01

    Using our custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors, cryogenic coherent X-ray diffraction imaging experiments have been undertaken at the SPring-8 Angstrom Compact free electron LAser (SACLA) facility. To efficiently perform experiments and data processing, two software suites with user-friendly graphical user interfaces have been developed. The first is a program suite named IDATEN, which was developed to easily conduct four procedures during experiments: aligning KOTOBUKI-1, loading a flash-cooled sample into the cryogenic goniometer stage inside the vacuum chamber of KOTOBUKI-1, adjusting the sample position with respect to the X-ray beam using a pair of telescopes, and collecting diffraction data by raster scanning the sample with X-ray pulses. Named G-SITENNO, the other suite is an automated version of the original SITENNO suite, which was designed for processing diffraction data. These user-friendly software suites are now indispensable for collecting a large number of diffraction patterns and for processing the diffraction patterns immediately after collecting data within a limited beam time.

  20. Comparative Live-Cell Imaging Analyses of SPA-2, BUD-6 and BNI-1 in Neurospora crassa Reveal Novel Features of the Filamentous Fungal Polarisome

    PubMed Central

    Read, Nick D.; Castro-Longoria, Ernestina

    2012-01-01

    The polarisome is a key multiprotein complex involved in regulating the actin cytoskeleton and secretory machinery required for polarized growth in fungi. Recognized core constituents in budding yeast are the proteins Spa2, Pea2, Aip3/Bud6, and the key effector Bni1. Multicellular fungi display a more complex polarized morphogenesis than yeasts, suggesting that the filamentous fungal polarisome might fulfill additional functions. In this study, we compared the subcellular organization and dynamics of the putative polarisome components BUD-6 and BNI-1 with those of the bona fide polarisome marker SPA-2 at various developmental stages of Neurospora crassa. All three proteins exhibited a yeast-like polarisome configuration during polarized germ tube growth, cell fusion, septal pore plugging and tip repolarization. However, the localization patterns of all three proteins showed spatiotemporally distinct characteristics during the establishment of new polar axes, septum formation and cytokinesis, and maintained hyphal tip growth. Most notably, in vegetative hyphal tips BUD-6 accumulated as a subapical cloud excluded from the Spitzenkörper (Spk), whereas BNI-1 and SPA-2 partially colocalized with the Spk and the tip apex. Novel roles during septal plugging and cytokinesis, connected to the reinitiation of tip growth upon physical injury and conidial maturation, were identified for BUD-6 and BNI-1, respectively. Phenotypic analyses of gene deletion mutants revealed additional functions for BUD-6 and BNI-1 in cell fusion regulation, and the maintenance of Spk integrity. Considered together, our findings reveal novel polarisome-independent functions of BUD-6 and BNI-1 in Neurospora, but also suggest that all three proteins cooperate at plugged septal pores, and their complex arrangement within the apical dome of mature hyphae might represent a novel aspect of filamentous fungal polarisome architecture. PMID:22291944

  1. Beta Adrenergic Receptor Stimulation Suppresses Cell Migration in Association with Cell Cycle Transition in Osteoblasts-Live Imaging Analyses Based on FUCCI System.

    PubMed

    Katsumura, Sakie; Ezura, Yoichi; Izu, Yayoi; Shirakawa, Jumpei; Miyawaki, Atsushi; Harada, Kiyoshi; Noda, Masaki

    2016-02-01

    Osteoporosis affects over 20 million patients in the United States. Among those, disuse osteoporosis is serious as it is induced by bed-ridden conditions in patients suffering from aging-associated diseases including cardiovascular, neurological, and malignant neoplastic diseases. Although the phenomenon that loss of mechanical stress such as bed-ridden condition reduces bone mass is clear, molecular bases for the disuse osteoporosis are still incompletely understood. In disuse osteoporosis model, bone loss is interfered by inhibitors of sympathetic tone and adrenergic receptors that suppress bone formation. However, how beta adrenergic stimulation affects osteoblastic migration and associated proliferation is not known. Here we introduced a live imaging system, fluorescent ubiquitination-based cell cycle indicator (FUCCI), in osteoblast biology and examined isoproterenol regulation of cell cycle transition and cell migration in osteoblasts. Isoproterenol treatment suppresses the levels of first entry peak of quiescent osteoblastic cells into cell cycle phase by shifting from G1/G0 to S/G2/M and also suppresses the levels of second major peak population that enters into S/G2/M. The isoproterenol regulation of osteoblastic cell cycle transition is associated with isoproterenol suppression on the velocity of migration. This isoproterenol regulation of migration velocity is cell cycle phase specific as it suppresses migration velocity of osteoblasts in G1 phase but not in G1/S nor in G2/M phase. Finally, these observations on isoproterenol regulation of osteoblastic migration and cell cycle transition are opposite to the PTH actions in osteoblasts. In summary, we discovered that sympathetic tone regulates osteoblastic migration in association with cell cycle transition by using FUCCI system.

  2. Quantitative analyses of pore-scale multi-phase flow processes: An application of synchrotron-based micro-imaging in the environmental sciences

    NASA Astrophysics Data System (ADS)

    Wildenschild, D.; Christensen, B. S.; Culligan, K. A.; Rivers, M. L.; Hopmans, J. W.; Kent, A. J.; Gray, W. G.

    2002-12-01

    Our current understanding of groundwater flow and contaminant transport in the subsurface is, to a large degree, limited by existing measurement techniques. To correctly describe transport of contaminant species, it is essential to understand the interplay of advection, mechanical dispersion, and diffusion and their dependency on soil water distribution, degree of saturation, as well as gas-liquid phase contact characteristics. However, these pore-scale mechanisms cannot be measured with traditional experimental techniques. X-ray computerized microtomography provides non-invasive pore-scale observation of variables such as changing fluid phase content and distribution, as well as interfacial area and curvatures. We present results obtained at the microtomography facility at GSECARS (sector 13) at the Advanced Photon Source, Argonne National Laboratory. Samples of 6-7 mm diameter sand or glass bead packs were scanned at different stages of drainage and imbibition and with varying boundary conditions. We observed significant differences in fluid saturation and phase distribution for different boundary conditions, clearly showing preferential flow and a dependence on the applied flow rate. Individual pores, water/air interfaces and their curvatures as a function of pore-water pressure were resolved and the interfacial areas quantified using image analysis techniques. We plan to use this detailed information to verify existing pore-scale numerical models and to aid development of new modeling approaches dealing with contaminant flow and transport in the subsurface. Use of the Advanced Photon Source was supported by the U.S. Department of Energy, Basic Energy Sciences, Office of Science, under Contract No. W-31-109-Eng-38.
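
    As a rough illustration of the kind of quantification mentioned above, the sketch below estimates phase saturation and air-water interfacial area from a segmented tomography volume by triangulating the phase boundary; the synthetic volume, voxel size and threshold are assumptions, and the original GSECARS processing pipeline is not reproduced.

```python
import numpy as np
from skimage import measure

# Hypothetical segmented micro-CT volume: 1 = wetting phase (water), 0 = air.
# A real dataset would come from reconstructed synchrotron tomography slices.
rng = np.random.default_rng(1)
water = (rng.random((64, 64, 64)) > 0.6).astype(float)

# Saturation: fraction of the imaged volume occupied by water (for simplicity
# the whole sub-volume is treated as pore space here).
saturation = water.mean()

# Air-water interfacial area estimated by triangulating the 0.5 iso-surface,
# assuming a (hypothetical) 10-micrometre voxel spacing.
verts, faces, _, _ = measure.marching_cubes(water, level=0.5,
                                            spacing=(1e-5, 1e-5, 1e-5))
area_m2 = measure.mesh_surface_area(verts, faces)

print(f"water saturation = {saturation:.2f}")
print(f"interfacial area = {area_m2:.3e} m^2")
```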

  3. Use of INSAT-3D sounder and imager radiances in the 4D-VAR data assimilation system and its implications in the analyses and forecasts

    NASA Astrophysics Data System (ADS)

    Indira Rani, S.; Taylor, Ruth; George, John P.; Rajagopal, E. N.

    2016-05-01

    INSAT-3D, the first Indian geostationary satellite with sounding capability, provides valuable information over India and the surrounding oceanic regions which are pivotal to Numerical Weather Prediction. In collaboration with UK Met Office, NCMRWF developed the assimilation capability of INSAT-3D Clear Sky Brightness Temperature (CSBT), both from the sounder and imager, in the 4D-Var assimilation system being used at NCMRWF. Out of the 18 sounder channels, radiances from 9 channels are selected for assimilation depending on relevance of the information in each channel. The first three high peaking channels, the CO2 absorption channels and the three water vapor channels (channel no. 10, 11, and 12) are assimilated both over land and Ocean, whereas the window channels (channel no. 6, 7, and 8) are assimilated only over the Ocean. Measured satellite radiances are compared with those from short-range forecasts to monitor the data quality. This is based on the assumption that the observed satellite radiances are free from calibration errors and the short-range forecast provided by the NWP model is free from systematic errors. Innovations (Observation - Forecast) before and after the bias correction are indicative of how well the bias correction works. Since the biases vary with air-masses, time, scan angle and also due to instrument degradation, an accurate bias correction algorithm for the assimilation of INSAT-3D sounder radiance is important. This paper discusses the bias correction methods and other quality controls used for the selected INSAT-3D sounder channels and the impact of bias corrected radiance in the data assimilation system particularly over India and surrounding oceanic regions.
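
    The scan-angle and air-mass dependence of the bias can be illustrated with a simple predictor-based correction fitted to the innovations, as sketched below; the predictors, channel and synthetic data are assumptions for illustration, and NCMRWF's operational scheme (a full variational bias correction) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Hypothetical predictors for one sounder channel: a constant, the satellite
# scan angle (radians) and a layer-thickness proxy for air-mass dependence.
scan_angle = rng.uniform(-0.6, 0.6, n)
thickness = rng.normal(0.0, 1.0, n)
predictors = np.column_stack([np.ones(n), scan_angle, scan_angle**2, thickness])

# Innovations: observed minus model-simulated brightness temperature (K),
# synthesised here with a known bias structure plus random error.
true_coeffs = np.array([0.8, -1.5, 2.0, 0.4])
innovations = predictors @ true_coeffs + rng.normal(0.0, 0.5, n)

# Least-squares fit of the bias model to the innovations, then correction.
coeffs, *_ = np.linalg.lstsq(predictors, innovations, rcond=None)
corrected = innovations - predictors @ coeffs

print("fitted bias coefficients:", np.round(coeffs, 2))
print(f"mean innovation before/after correction: "
      f"{innovations.mean():+.2f} K / {corrected.mean():+.2f} K")
```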

  4. High-speed digital processing of electro-optic holography images for a quantitative analysis

    NASA Astrophysics Data System (ADS)

    Schirripa Spagnolo, G.; Ambrosini, D.; Paoletti, D.; Borghi, R.

    1997-06-01

    A quasi-automatic quantitative analysis of electro-optic holography images is proposed. The phase information is extracted by means of the Fourier transform method. The phase map is unwrapped by an algorithm based on a no-path-following scheme and the fast cosine transform.
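
    A minimal one-dimensional sketch of the Fourier-transform phase-extraction step is given below: isolate the carrier sideband of a fringe signal in the frequency domain, shift it to baseband and take the angle of the inverse transform to obtain the wrapped phase. The fringe model and carrier frequency are invented, and the paper's no-path-following, cosine-transform-based unwrapping is replaced here by a simple 1-D unwrap used only to check the recovery.

```python
import numpy as np

n, f0 = 1024, 60                          # samples and carrier frequency (cycles per record)
x = np.arange(n) / n
phi = 4.0 * np.sin(2 * np.pi * x)         # hypothetical slowly varying phase to recover
fringes = 1.0 + 0.8 * np.cos(2 * np.pi * f0 * x + phi)

# Fourier-transform method: keep the +f0 sideband, shift it to baseband and
# read the phase of the resulting complex signal.
spectrum = np.fft.fft(fringes)
freqs = np.fft.fftfreq(n, d=1.0 / n)      # frequencies in cycles per record
sideband = np.where((freqs > f0 / 2) & (freqs < 3 * f0 / 2), spectrum, 0.0)
analytic = np.fft.ifft(np.roll(sideband, -f0))   # demodulate the carrier
wrapped = np.angle(analytic)

# Simple path-following 1-D unwrap, only to compare against the known phase.
unwrapped = np.unwrap(wrapped)
error = (unwrapped - unwrapped.mean()) - (phi - phi.mean())
print(f"max phase error: {np.abs(error).max():.3f} rad")
```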

  5. Quantifying the Physical Composition of Urban Morphology throughout Wales by analysing a Time Series (1989-2011) of Landsat TM/ETM+ images and Supporting GIS data

    NASA Astrophysics Data System (ADS)

    Scott, Douglas; Petropoulos, George

    2014-05-01

    Knowledge of impervious surface areas (ISA) and of their changes in magnitude, location, geometry and morphology over time is significant for a range of practical applications and research alike from local to global scale. It is a key indicator of global environmental change and is also an important parameter for urban planning and environmental resources management, especially within a European context due to the policy recommendations given to the European Commission by the Austrian Environment Agency in 2011. Despite this, use of Earth Observation (EO) technology in mapping ISAs within the European Union (EU) and in particular in the UK is inadequate. In the present study, selected study sites across Wales have been used to test the use of freely distributed EO data from Landsat TM/ETM+ sensors in retrieving ISA for improving the current European estimations of international urbanization and soil sealing. A traditional classifier and a linear spectral mixture analysis (LSMA) were both applied to a series of Landsat TM/ETM+ images acquired over a period spanning 22 years to extract ISA. Aerial photography with a spatial resolution of 0.4m, acquired over the summer period in 2005, was used for validation purposes. The Welsh study areas provided a unique chance to detect largely dispersed urban morphology within an urban-rural frontier context. The study also presents an innovative method for detecting cloud and cloud shadow layers, which were detected with an overall accuracy of around 97%. The process tree built and presented in this study is important in terms of moving forward into a biennial program for the Welsh Government and is comparable to currently existing products. This EO-based product also offers a much less subjectively static and more objectively dynamic estimation of ISA cover. Our methodology not only inaugurates the local retrieval of ISA for Wales but also meliorates the existing EU international figures, and expands relatively stationary 'global' US
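
    The LSMA step can be pictured as solving, for every pixel, a small non-negative least-squares problem against a set of endmember spectra and reading the impervious fraction from the abundances; the sketch below uses invented endmember values and band reflectances purely for illustration, whereas the study derived its endmembers from the Landsat imagery itself.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember spectra (rows) for 6 Landsat reflective bands:
# impervious surface, vegetation, bare soil. Values are illustrative only.
endmembers = np.array([
    [0.18, 0.21, 0.25, 0.27, 0.30, 0.32],   # impervious
    [0.03, 0.05, 0.04, 0.45, 0.25, 0.12],   # vegetation
    [0.10, 0.14, 0.18, 0.25, 0.33, 0.35],   # soil
])

def unmix(pixel_spectrum):
    """Non-negative least-squares unmixing; abundances rescaled to sum to 1."""
    abundances, _ = nnls(endmembers.T, pixel_spectrum)
    total = abundances.sum()
    return abundances / total if total > 0 else abundances

# A mixed pixel: 40% impervious, 50% vegetation, 10% soil, plus noise.
true_frac = np.array([0.4, 0.5, 0.1])
pixel = true_frac @ endmembers + np.random.default_rng(3).normal(0, 0.005, 6)

print("estimated fractions (ISA, veg, soil):", np.round(unmix(pixel), 2))
```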

  6. Analysing the effect of crystal size and structure in highly efficient CH3NH3PbI3 perovskite solar cells by spatially resolved photo- and electroluminescence imaging.

    PubMed

    Mastroianni, S; Heinz, F D; Im, J-H; Veurman, W; Padilla, M; Schubert, M C; Würfel, U; Grätzel, M; Park, N-G; Hinsch, A

    2015-12-14

    CH3NH3PbI3 perovskite solar cells with a mesoporous TiO2 layer and spiro-MeOTAD as a hole transport layer (HTL) with three different CH3NH3I concentrations (0.032 M, 0.044 M and 0.063 M) were investigated. Strong variations in crystal size and morphology resulting in diversified cell efficiencies (9.2%, 16.9% and 12.3%, respectively) were observed. The physical origin of this behaviour was analysed by detailed characterization combining current-voltage curves with photo- and electroluminescence (PL and EL) imaging as well as light beam induced current measurements (LBIC). It was found that the most efficient cell shows the highest luminescence and the least efficient cell is most strongly limited by non-radiative recombination. Crystal size, morphology and distribution in the capping layer and in the porous scaffold strongly affect the non-radiative recombination. Moreover, the very non-uniform crystal structure with multiple facets, as evidenced by SEM images of the 0.032 M device, suggests the creation of a large number of grain boundaries and crystal dislocations. These defects give rise to increased trap-assisted non-radiative recombination as is confirmed by high-resolution μ-PL images. The different imaging techniques used in this study prove to be well-suited to spatially investigate and thus correlate the crystal morphology of the perovskite layer with the electrical and radiative properties of the solar cells and thus with their performance.

  7. Small oxygen analysers.

    PubMed

    Cole, A G

    1983-05-01

    Analysers using a polarographic electrode had a tendency to react to nitrous oxide, which was considered dangerous with one analyser. However, they had cheaper running costs and a faster response time than the galvanic-cell analysers. These latter analysers were slightly cheaper initially but their sensors were expensive and had a reduced life in the presence of nitrous oxide. Details of accuracy tests have been presented and opinions expressed with regard to the most satisfactory analysers for clinical use.

  8. EPOXI Trajectory and Maneuver Analyses

    NASA Technical Reports Server (NTRS)

    Chung, Min-Kun J.; Bhaskaran, Shyamkumar; Chesley, Steven R.; Halsell, C. Allen; Helfrich, Clifford E.; Jefferson, David C.; McElrath, Timothy P.; Rush, Brian P.; Wang, Tseng-Chan M.; Yen, Chen-wan L.

    2011-01-01

    The EPOXI mission is a NASA Discovery Mission of Opportunity combining two separate investigations: Extrasolar Planet Observation and Characterization (EPOCh) and Deep Impact eXtended Investigation (DIXI). Both investigations reused the DI instruments and spacecraft that successfully flew by the comet Tempel-1 (4 July 2005). For EPOCh, the goal was to find exoplanets with the high resolution imager, while for DIXI it was to fly by the comet Hartley 2 (4 Nov 2010). This paper documents the navigation experience of the earlier maneuver analyses critical for the EPOXI mission including statistical ΔV analyses and other useful analyses in designing maneuvers. It also recounts the trajectory design leading up to the final reference trajectory to Hartley 2.

  9. Spacelab Charcoal Analyses

    NASA Technical Reports Server (NTRS)

    Slivon, L. E.; Hernon-Kenny, L. A.; Katona, V. R.; Dejarme, L. E.

    1995-01-01

    This report describes analytical methods and results obtained from chemical analysis of 31 charcoal samples in five sets. Each set was obtained from a single scrubber used to filter ambient air on board a Spacelab mission. Analysis of the charcoal samples was conducted by thermal desorption followed by gas chromatography/mass spectrometry (GC/MS). All samples were analyzed using identical methods. The method used for these analyses was able to detect compounds independent of their polarity or volatility. In addition to the charcoal samples, analyses of three Environmental Control and Life Support System (ECLSS) water samples were conducted specifically for trimethylamine.

  10. Wavelet Analyses and Applications

    ERIC Educational Resources Information Center

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
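
    In the same spirit, the sketch below applies a Morlet continuous wavelet transform to a synthetic nonstationary (chirp-like) signal and reads off the scale of maximum power as it drifts over time; the wavelet parameters, scales and test signal are illustrative choices, not taken from the article.

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Continuous wavelet transform with a complex Morlet mother wavelet,
    evaluated by direct convolution at each scale."""
    n = len(signal)
    t = np.arange(n)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        tau = (t - n // 2) / s
        wavelet = np.exp(1j * w0 * tau) * np.exp(-0.5 * tau**2) / np.sqrt(s)
        out[i] = np.convolve(signal, np.conj(wavelet[::-1]), mode="same")
    return out

# Nonstationary test signal: instantaneous frequency sweeps from 5 to 25 Hz.
fs, dur = 256, 2.0
t = np.arange(int(fs * dur)) / fs
chirp = np.sin(2 * np.pi * (5 * t + 5 * t**2))

scales = np.arange(2, 64)
power = np.abs(morlet_cwt(chirp, scales)) ** 2

# The ridge of maximum power moves to smaller scales (higher frequencies)
# as the signal's instantaneous frequency increases.
ridge = scales[np.argmax(power, axis=0)]
print("scale of maximum power at t = 0.2 s and t = 1.8 s:",
      ridge[int(0.2 * fs)], ridge[int(1.8 * fs)])
```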

  11. Apollo 14 microbial analyses

    NASA Technical Reports Server (NTRS)

    Taylor, G. R.

    1972-01-01

    Extensive microbiological analyses that were performed on the Apollo 14 prime and backup crewmembers and ancillary personnel are discussed. The crewmembers were subjected to four separate and quite different environments during the 137-day monitoring period. The relation between each of these environments and observed changes in the microflora of each astronaut are presented.

  13. Geomorphic analyses from space imagery

    NASA Technical Reports Server (NTRS)

    Morisawa, M.

    1985-01-01

    One of the most obvious applications of space imagery to geomorphological analyses is in the study of drainage patterns and channel networks. LANDSAT, high altitude photography and other types of remote sensing imagery are excellent for depicting stream networks on a regional scale because of their broad coverage in a single image. They offer a valuable tool for comparing and analyzing drainage patterns and channel networks all over the world. Three aspects considered in this geomorphological study are: (1) the origin, evolution and rates of development of drainage systems; (2) the topological studies of network and channel arrangements; and (3) the adjustment of streams to tectonic events and geologic structure (i.e., the mode and rate of adjustment).

  14. Development of a systematic computer vision-based method to analyse and compare images of false identity documents for forensic intelligence purposes-Part I: Acquisition, calibration and validation issues.

    PubMed

    Auberson, Marie; Baechler, Simon; Zasso, Michaël; Genessay, Thibault; Patiny, Luc; Esseiva, Pierre

    2016-03-01

    Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, highlight links between documents produced by a same modus operandi or same source, and thus support forensic intelligence efforts. Inspired by previous research work about digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise reproducibility and comparability of images. Different filters and comparison metrics have been evaluated and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters or their combination to extract profiles from images, and then the comparison of profiles with a Canberra distance-based metric provides the most accurate classification of documents. The method appears also to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be
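
    The profile-comparison step can be sketched as below: extract a normalised hue histogram from a region of interest of each scanned document and score pairs of documents with the Canberra distance, where low distances suggest a possible common source. The region choice, synthetic pixel data and histogram size are assumptions; the published prototype's exact filters and pre-processing are not reproduced.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv
from scipy.spatial.distance import canberra

def hue_profile(rgb_roi, bins=64):
    """Normalised hue histogram over a region of interest of a scanned document."""
    hsv = rgb_to_hsv(rgb_roi.astype(float) / 255.0)
    hist, _ = np.histogram(hsv[..., 0], bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

# Hypothetical ROIs cropped from two scans (e.g., the same security-background area).
rng = np.random.default_rng(4)
doc_a = rng.integers(0, 256, (120, 200, 3), dtype=np.uint8)
doc_b = np.clip(doc_a.astype(int) + rng.integers(-10, 10, doc_a.shape), 0, 255).astype(np.uint8)

# Low Canberra distances between profiles flag documents that may share a source.
print(f"Canberra distance: {canberra(hue_profile(doc_a), hue_profile(doc_b)):.3f}")
```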

  15. Atmospheric tether mission analyses

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA is considering the use of tethered satellites to explore regions of the atmosphere inaccessible to spacecraft or high altitude research balloons. This report summarizes the Lockheed Martin Astronautics (LMA) effort for the engineering study team assessment of an Orbiter-based atmospheric tether mission. Lockheed Martin responsibilities included design recommendations for the deployer and tether, as well as tether dynamic analyses for the mission. Three tether configurations were studied including single line, multistrand (Hoytether) and tape designs.

  16. Systematic Processing of Clementine Data for Scientific Analyses

    NASA Technical Reports Server (NTRS)

    Mcewen, A. S.

    1993-01-01

    If fully successful, the Clementine mission will return about 3,000,000 lunar images and more than 5000 images of Geographos. Effective scientific analyses of such large datasets require systematic processing efforts. Concepts for two such efforts are described: global multispectral imaging of the moon, and videos of Geographos.

  17. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    Model calculations and analyses have been carried out to compare with several sets of data (dose, induced radioactivity in various experiment samples and spacecraft components, fission foil measurements, and LET spectra) from passive radiation dosimetry on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The calculations and data comparisons are used to estimate the accuracy of current models and methods for predicting the ionizing radiation environment in low earth orbit. The emphasis is on checking the accuracy of trapped proton flux and anisotropy models.

  18. Analysing immune cell migration.

    PubMed

    Beltman, Joost B; Marée, Athanasius F M; de Boer, Rob J

    2009-11-01

    The visualization of the dynamic behaviour of and interactions between immune cells using time-lapse video microscopy has an important role in modern immunology. To draw robust conclusions, quantification of such cell migration is required. However, imaging experiments are associated with various artefacts that can affect the estimated positions of the immune cells under analysis, which form the basis of any subsequent analysis. Here, we describe potential artefacts that could affect the interpretation of data sets on immune cell migration. We propose how these errors can be recognized and corrected, and suggest ways to prevent the data analysis itself leading to biased results.

  19. Broadband rotor noise analyses

    NASA Technical Reports Server (NTRS)

    George, A. R.; Chou, S. T.

    1984-01-01

    The various mechanisms which generate broadband noise on a range of rotors studied include load fluctuations due to inflow turbulence, due to turbulent boundary layers passing the blades' trailing edges, and due to tip vortex formation. Existing analyses are used and extensions to them are developed to make more accurate predictions of rotor noise spectra and to determine which mechanisms are important in which circumstances. Calculations based on the various prediction methods were compared with existing experiments. The present analyses are adequate to predict the spectra from a wide variety of experiments on fans, full scale and model scale helicopter rotors, wind turbines, and propellers to within about 5 to 10 dB. Better knowledge of the inflow turbulence improves the accuracy of the predictions. Results indicate that inflow turbulence noise depends strongly on ambient conditions and dominates at low frequencies. Trailing edge noise and tip vortex noise are important at higher frequencies if inflow turbulence is weak. Boundary layer trailing edge noise, important for large-sized rotors, increases slowly with angle of attack but not as rapidly as tip vortex noise.

  20. IMAGES, IMAGES, IMAGES

    SciTech Connect

    Marcus, A.

    1980-07-01

    The role of images of information (charts, diagrams, maps, and symbols) for effective presentation of facts and concepts is expanding dramatically because of advances in computer graphics technology, increasingly hetero-lingual, hetero-cultural world target populations of information providers, the urgent need to convey more efficiently vast amounts of information, the broadening population of (non-expert) computer users, the decrease of available time for reading texts and for decision making, and the general level of literacy. A coalition of visual performance experts, human engineering specialists, computer scientists, and graphic designers/artists is required to resolve human factors aspects of images of information. The need for, nature of, and benefits of interdisciplinary effort are discussed. The results of an interdisciplinary collaboration are demonstrated in a product for visualizing complex information about global energy interdependence. An invited panel will respond to the presentation.

  1. EEG analyses with SOBI.

    SciTech Connect

    Glickman, Matthew R.; Tang, Akaysha

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
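
    SOBI jointly diagonalises a whole set of time-lagged covariance matrices; the sketch below instead implements the simpler single-lag AMUSE variant of the same second-order idea on synthetic two-channel data, purely to show the mechanics of separating sources by their temporal structure. It is not the analysis pipeline used in the project, and the signals and mixing matrix are invented.

```python
import numpy as np

def amuse(x, lag=1):
    """Second-order blind separation (AMUSE): whiten the data, then
    eigendecompose a symmetrised time-lagged covariance of the whitened data."""
    x = x - x.mean(axis=1, keepdims=True)
    c0 = x @ x.T / x.shape[1]                       # zero-lag covariance
    d, e = np.linalg.eigh(c0)
    whitener = e @ np.diag(1.0 / np.sqrt(d)) @ e.T
    z = whitener @ x
    c_lag = z[:, lag:] @ z[:, :-lag].T / (z.shape[1] - lag)
    c_lag = 0.5 * (c_lag + c_lag.T)                 # symmetrise for a clean eigenbasis
    _, rotation = np.linalg.eigh(c_lag)
    unmixing = rotation.T @ whitener
    return unmixing @ x, unmixing

# Synthetic "EEG": two latent sources with different temporal structure,
# linearly mixed into two channels.
rng = np.random.default_rng(5)
t = np.arange(5000) / 250.0
sources = np.vstack([np.sin(2 * np.pi * 10 * t),             # 10 Hz rhythm
                     np.sign(np.sin(2 * np.pi * 1.3 * t))])  # slow square wave
mixing = np.array([[0.8, 0.3], [0.4, 0.9]])
channels = mixing @ sources + 0.05 * rng.standard_normal(sources.shape)

recovered, _ = amuse(channels, lag=2)
# Correlation with the true sources (recovered up to sign and permutation).
corr = np.corrcoef(np.vstack([sources, recovered]))[:2, 2:]
print(np.round(np.abs(corr), 2))
```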

  2. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    This report covers work performed by Science Applications International Corporation (SAIC) under contract NAS8-39386 from the NASA Marshall Space Flight Center entitled LDEF Satellite Radiation Analyses. The basic objective of the study was to evaluate the accuracy of present models and computational methods for defining the ionizing radiation environment for spacecraft in Low Earth Orbit (LEO) by making comparisons with radiation measurements made on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The emphasis of the work here is on predictions and comparisons with LDEF measurements of induced radioactivity and Linear Energy Transfer (LET) measurements. These model/data comparisons have been used to evaluate the accuracy of current models for predicting the flux and directionality of trapped protons for LEO missions.

  3. Network Class Superposition Analyses

    PubMed Central

    Pearson, Carl A. B.; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈10^30 for the yeast cell cycle process [1]), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141

  4. Network class superposition analyses.

    PubMed

    Pearson, Carl A B; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
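
    A toy version of building the superposition matrix T is sketched below for an invented class of three-node threshold networks: each member contributes its deterministic state-to-state transitions, the diagonal of T then gives the fraction of members fixing each state (candidate point attractors), and row entropies summarise how much the class disagrees about the dynamics. The networks and update rule are illustrative only, not the yeast cell cycle model or the paper's Strong Inhibition class.

```python
import numpy as np
from itertools import product

N = 3  # toy network with 3 Boolean nodes -> 8 states

def step(state, weights):
    """Deterministic threshold update: node i turns on iff its weighted input > 0."""
    s = np.array(state)
    return tuple(int(weights[i] @ s > 0) for i in range(N))

# A small hypothetical "class" of networks sharing qualitative wiring but
# differing in weight matrices.
network_class = [
    np.array([[0, 1, -1], [1, 0, 1], [-1, 1, 0]]),
    np.array([[0, 1, -1], [1, 0, 0], [-1, 1, 0]]),
    np.array([[0, 2, -1], [1, 0, 1], [-1, 2, 0]]),
]

states = list(product([0, 1], repeat=N))
index = {s: i for i, s in enumerate(states)}

# T: transition-by-transition superposition of each member's dynamics.
T = np.zeros((len(states), len(states)))
for weights in network_class:
    for s in states:
        T[index[s], index[step(s, weights)]] += 1.0
T /= len(network_class)

# Diagonal entries give the fraction of class members fixing each state;
# row entropies summarise dynamical uncertainty across the class.
row_entropy = -np.sum(T * np.log2(np.where(T > 0, T, 1.0)), axis=1)
for s, d, h in zip(states, np.diag(T), row_entropy):
    print(s, f"fixed-point fraction={d:.2f}", f"row entropy={h:.2f} bits")
```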

  5. Normative 3D acetabular orientation measurements by the low-dose EOS imaging system in 102 asymptomatic subjects in standing position: Analyses by side, gender, pelvic incidence and reproducibility.

    PubMed

    Thelen, T; Thelen, P; Demezon, H; Aunoble, S; Le Huec, J-C

    2017-04-01

    Three-dimensional (3D) acetabular orientation is a fundamental topic in orthopedic surgery. Computed tomography (CT) allows 3D measurement of native acetabular orientation, but with a substantial radiation dose. The EOS imaging system was developed to perform this kind of evaluation, but has not been validated in this indication with specific attention to the acetabulum. We therefore performed a prospective study using EOS to assess: (1) the reproducibility of the 3D acetabulum orientation measures; (2) normative asymptomatic acetabular morphology in standing position, according to side and gender; and (3) the relationship between acetabular anteversion and pelvic incidence. The low-dose EOS imaging system is a reproducible method for measuring 3D acetabular orientation in standing position. In a previous prospective study of spine sagittal balance, 165 asymptomatic volunteers were examined on whole-body EOS biplanar X-ray; 102 had appropriate images for pelvic and acetabular analysis, with an equal sex-ratio (53 female, 49 male). These EOS images were reviewed using sterEOS 3D software, allowing automatic measurement of acetabular parameters (anteversion and inclination) and pelvic parameters (pelvic incidence, pelvic tilt and sacral slope) in an anatomical (anterior pelvic plane: APP) and a functional reference plane (patient vertical plane: PVP). Both intra- and inter-observer analysis showed good agreement (ICC>0.90); Bland-Altman plot distributions were good. Acetabular anatomical anteversion and inclination relative to APP (AAAPP and AIAPP, respectively) were significantly greater in women than in men, with no effect of side (right AAAPP: women 21.3°±3.4° vs. men 16.1°±3.3° (P<0.001); right AIAPP: women 55.3°±3.7° vs. men 52.5°±3.0° (P<0.001); left AAAPP: women 20.9°±3.5° vs. men 15.6°±4.0° (P<0.001); left AIAPP: women 54.6°±3.5° vs. men 52.7°±2.8° (P=0.003)). The same differences between men and women were observed when measurements were

  6. NOAA's National Snow Analyses

    NASA Astrophysics Data System (ADS)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational, products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
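
    The nudging step itself is simple: at times when observations are available, the modelled state is relaxed toward them with a gain factor. The sketch below applies this to a single-point snow water equivalent (SWE) series with an invented accumulation bias, a hypothetical daily ground observation and an arbitrary gain; the operational NOHRSC system is a full distributed energy-and-mass-balance model and its assimilation is considerably more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(6)
n_steps = 240                       # hourly steps over 10 days
truth = np.zeros(n_steps)
model = np.zeros(n_steps)
analysis = np.zeros(n_steps)

gain = 0.15                         # nudging strength per update (0 = ignore obs)
for k in range(1, n_steps):
    forcing = max(0.0, rng.normal(0.2, 0.5))          # "true" snowfall (mm SWE/h)
    truth[k] = truth[k - 1] + forcing
    model_forcing = forcing * 0.8                     # model underestimates snowfall
    model[k] = model[k - 1] + model_forcing

    analysis[k] = analysis[k - 1] + model_forcing
    if k % 24 == 0:                                   # a ground observation once per day
        obs = truth[k] + rng.normal(0.0, 1.0)
        analysis[k] += gain * (obs - analysis[k])     # Newtonian nudging update

print(f"final SWE  truth={truth[-1]:.1f}  free model={model[-1]:.1f}  "
      f"nudged analysis={analysis[-1]:.1f} (mm)")
```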

  7. Broadband seismic illumination and resolution analyses based on staining algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Bo; Jia, Xiao-Feng; Xie, Xiao-Bi

    2016-09-01

    Seismic migration moves reflections to their true subsurface positions and yields seismic images of subsurface areas. However, due to limited acquisition aperture, complex overburden structure and target dipping angle, the migration often generates a distorted image of the actual subsurface structure. Seismic illumination and resolution analyses provide a quantitative description of how the above-mentioned factors distort the image. The point spread function (PSF) gives the resolution of the depth image and carries full information about the factors affecting the quality of the image. The staining algorithm establishes a correspondence between a certain structure and its relevant wavefield and reflected data. In this paper, we use the staining algorithm to calculate the PSFs, then use these PSFs for extracting the acquisition dip response and correcting the original depth image by deconvolution. We present relevant results of the SEG salt model. The staining algorithm provides an efficient tool for calculating the PSF and for conducting broadband seismic illumination and resolution analyses.
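
    Correcting an image by deconvolution with a known PSF can be illustrated with a generic frequency-domain Wiener filter, as sketched below on a synthetic 2-D image blurred by a Gaussian PSF; the PSF model and regularisation constant are assumptions, and this is not the staining-algorithm PSF computation used in the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def wiener_deconvolve(image, psf, noise_to_signal=1e-3):
    """Frequency-domain Wiener deconvolution with a known point spread function."""
    padded = np.zeros_like(image)
    padded[:psf.shape[0], :psf.shape[1]] = psf
    # Centre the PSF at the origin so the restored image is not shifted.
    padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    H = np.fft.fft2(padded)
    G = np.fft.fft2(image)
    F = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal) * G
    return np.real(np.fft.ifft2(F))

# Synthetic "true" image with two point scatterers, blurred by a Gaussian PSF.
true_image = np.zeros((128, 128))
true_image[32, 40] = 1.0
true_image[80, 90] = -0.7

y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 3.0**2))
psf /= psf.sum()

blurred = fftconvolve(true_image, psf, mode="same")
restored = wiener_deconvolve(blurred, psf)
print("brightest restored pixel:", np.unravel_index(np.argmax(restored), restored.shape))
```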

  8. Effect of energy restriction and physical exercise intervention on phenotypic flexibility as examined by transcriptomics analyses of mRNA from adipose tissue and whole body magnetic resonance imaging.

    PubMed

    Lee, Sindre; Norheim, Frode; Langleite, Torgrim M; Noreng, Hans J; Storås, Trygve H; Afman, Lydia A; Frost, Gary; Bell, Jimmy D; Thomas, E Louise; Kolnes, Kristoffer J; Tangen, Daniel S; Stadheim, Hans K; Gilfillan, Gregor D; Gulseth, Hanne L; Birkeland, Kåre I; Jensen, Jørgen; Drevon, Christian A; Holen, Torgeir

    2016-11-01

    Overweight and obesity lead to changes in adipose tissue such as inflammation and reduced insulin sensitivity. The aim of this study was to assess how altered energy balance by reduced food intake or enhanced physical activity affects these processes. We studied sedentary subjects with overweight/obesity in two intervention studies, each lasting 12 weeks affecting energy balance either by energy restriction (~20% reduced intake of energy from food) in one group, or by enhanced energy expenditure due to physical exercise (combined endurance- and strength-training) in the other group. We monitored mRNA expression by microarray and mRNA sequencing from adipose tissue biopsies. We also measured several plasma parameters as well as fat distribution with magnetic resonance imaging and spectroscopy. Comparison of microarray and mRNA sequencing showed strong correlations, which were also confirmed using RT-PCR. In the energy-restricted subjects (body weight reduced by 5% during a 12-week intervention), there were clear signs of enhanced lipolysis as monitored by mRNA in adipose tissue as well as plasma concentration of free-fatty acids. This increase was strongly related to increased expression of markers for M1-like macrophages in adipose tissue. In the exercising subjects (glucose infusion rate increased by 29% during a 12-week intervention), there was a marked reduction in the expression of markers of M2-like macrophages and T cells, suggesting that physical exercise was especially important for reducing inflammation in adipose tissue with insignificant reduction in total body weight. Our data indicate that energy restriction and physical exercise affect energy-related pathways as well as inflammatory processes in different ways, probably related to macrophages in adipose tissue.

  9. On study design in neuroimaging heritability analyses

    NASA Astrophysics Data System (ADS)

    Koran, Mary Ellen; Li, Bo; Jahanshad, Neda; Thornton-Wells, Tricia A.; Glahn, David C.; Thompson, Paul M.; Blangero, John; Nichols, Thomas E.; Kochunov, Peter; Landman, Bennett A.

    2014-03-01

    Imaging genetics is an emerging methodology that combines genetic information with imaging-derived metrics to understand how genetic factors impact observable structural, functional, and quantitative phenotypes. Many of the most well-known genetic studies are based on Genome-Wide Association Studies (GWAS), which use large populations of related or unrelated individuals to associate traits and disorders with individual genetic factors. Merging imaging and genetics may potentially lead to improved power of association in GWAS because imaging traits may be more sensitive phenotypes, being closer to underlying genetic mechanisms, and their quantitative nature inherently increases power. We are developing SOLAR-ECLIPSE (SE) imaging genetics software which is capable of performing genetic analyses with both large-scale quantitative trait data and family structures of variable complexity. This program can estimate the contribution of genetic commonality among related subjects to a given phenotype, and essentially answer the question of whether or not the phenotype is heritable. This central factor of interest, heritability, offers bounds on the direct genetic influence over observed phenotypes. In order for a trait to be a good phenotype for GWAS, it must be heritable: at least some proportion of its variance must be due to genetic influences. A variety of family structures are commonly used for estimating heritability, yet the variability and biases for each as a function of the sample size are unknown. Herein, we investigate the ability of SOLAR to accurately estimate heritability models based on imaging data simulated using Monte Carlo methods implemented in R. We characterize the bias and the variability of heritability estimates from SOLAR as a function of sample size and pedigree structure (including twins, nuclear families, and nuclear families with grandparents).

  10. EEO Implications of Job Analyses.

    ERIC Educational Resources Information Center

    Lacy, D. Patrick, Jr.

    1979-01-01

    Discusses job analyses as they relate to the requirements of Title VII of the Civil Rights Act of 1964, the Equal Pay Act of 1963, and the Rehabilitation Act of 1973. Argues that job analyses can establish the job-relatedness of entrance requirements and aid in defenses against charges of discrimination. Journal availability: see EA 511 615.

  12. Analysing the Metaphorical Images of Turkish Preschool Teachers

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2008-01-01

    The metaphorical basis of teacher reflection about teaching and learning has been a rich area of theory and research. This is a study of metaphor as a shared system of interpretation and classification, which teachers and student teachers and their supervising teachers can cooperatively explore. This study employs metaphor as a means of research…

  13. The Nullness Analyser of julia

    NASA Astrophysics Data System (ADS)

    Spoto, Fausto

    This experimental paper describes the implementation and evaluation of a static nullness analyser for single-threaded Java and Java bytecode programs, built inside the julia tool. Nullness analysis determines, at compile-time, those program points where the null value might be dereferenced, leading to a run-time exception. In order to improve the quality of software, it is important to prove that such situation does not occur. Our analyser is based on a denotational abstract interpretation of Java bytecode through Boolean logical formulas, strengthened with a set of denotational and constraint-based supporting analyses for locally non-null fields and full arrays and collections. The complete integration of all such analyses results in a correct system of very high precision whose time of analysis remains in the order of minutes, as we show with some examples of analysis of large software.

  14. Analysing the ventricular fibrillation waveform.

    PubMed

    Reed, Matthew J; Clegg, Gareth R; Robertson, Colin E

    2003-04-01

    The surface electrocardiogram associated with ventricular fibrillation has been of interest to researchers for some time. Over the last few decades, techniques have been developed to analyse this signal in an attempt to obtain more information about the state of the myocardium and the chances of successful defibrillation. This review looks at the implications of analysing the VF waveform and discusses the various techniques that have been used, including fast Fourier transform analysis, wavelet transform analysis and mathematical techniques such as chaos theory.
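
    The simplest of these quantitative descriptors, the dominant (peak) frequency of the VF waveform taken from its FFT power spectrum, can be sketched as below on a surrogate signal; the sampling rate, analysis band and synthetic waveform are illustrative assumptions, not a validated clinical algorithm.

```python
import numpy as np

fs = 250                                    # ECG sampling rate (Hz)
t = np.arange(0, 8.0, 1.0 / fs)             # 8-second analysis window

# Surrogate VF-like signal: a few drifting oscillations around 5 Hz plus noise.
rng = np.random.default_rng(8)
signal = (np.sin(2 * np.pi * 4.8 * t) +
          0.6 * np.sin(2 * np.pi * 5.6 * t + 1.0) +
          0.3 * rng.standard_normal(t.size))

# FFT power spectrum restricted to a plausible VF analysis band.
window = np.hanning(signal.size)
spectrum = np.abs(np.fft.rfft(signal * window)) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
band = (freqs >= 2.0) & (freqs <= 12.0)

dominant = freqs[band][np.argmax(spectrum[band])]
print(f"dominant frequency = {dominant:.2f} Hz")
```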

  15. Stereological analyses of the whole human pancreas

    PubMed Central

    Poudel, Ananta; Fowler, Jonas L.; Zielinski, Mark C.; Kilimnik, German; Hara, Manami

    2016-01-01

    The large size of human tissues requires a practical stereological approach to perform a comprehensive analysis of the whole organ. We have developed a method to quantitatively analyze the whole human pancreas, as one of the challenging organs to study, in which endocrine cells form various sizes of islets that are scattered unevenly throughout the exocrine pancreas. Furthermore, the human pancreas possesses intrinsic characteristics of intra-individual variability, i.e. regional differences in endocrine cell/islet distribution, and marked inter-individual heterogeneity regardless of age, sex and disease conditions including obesity and diabetes. The method is built based on large-scale image capture, computer-assisted unbiased image analysis and quantification, and further mathematical analyses, using widely-used software such as Fiji/ImageJ and MATLAB. The present study includes detailed protocols of every procedure as well as all the custom-written computer scripts, which can be modified according to specific experimental plans and specimens of interest. PMID:27658965
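
    A minimal sketch of the computer-assisted quantification step is shown below: label endocrine-positive regions in a (here synthetic) binary mask of a single section and tabulate islet profile areas with scikit-image. The mask, pixel size and thresholding are assumptions; the published protocol works on stitched whole-slide images in Fiji/ImageJ and MATLAB.

```python
import numpy as np
from skimage import measure

# Synthetic binary mask of endocrine-positive tissue in one section:
# a few disc-shaped "islets" of different sizes on an empty background.
mask = np.zeros((512, 512), dtype=bool)
yy, xx = np.mgrid[:512, :512]
for cy, cx, r in [(100, 120, 8), (300, 200, 20), (400, 420, 35)]:
    mask |= (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2

labels = measure.label(mask)
props = measure.regionprops(labels)

pixel_area_um2 = 0.5 ** 2            # hypothetical pixel size of 0.5 micrometres
areas = sorted(p.area * pixel_area_um2 for p in props)
print(f"{len(props)} islet profiles, areas (um^2): {[round(a, 1) for a in areas]}")
print(f"total endocrine area fraction: {mask.mean():.4f}")
```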

  16. Feed analyses and their interpretation

    USDA-ARS?s Scientific Manuscript database

    Compositional analysis is central to determining the nutritional value of feedstuffs. The utility of the values and how they should be used depends on how representative the feed subsample is, the nutritional relevance of the assays, analytical variability of the analyses, and whether a feed is suit...

  17. Mitogenomic analyses of caniform relationships.

    PubMed

    Arnason, Ulfur; Gullberg, Anette; Janke, Axel; Kullberg, Morgan

    2007-12-01

    Extant members of the order Carnivora split into two basal groups, Caniformia (dog-like carnivorans) and Feliformia (cat-like carnivorans). In this study we address phylogenetic relationships within Caniformia applying various methodological approaches to analyses of complete mitochondrial genomes. Pinnipeds are currently well represented with respect to mitogenomic data and here we add seven mt genomes to the non-pinniped caniform collection. The analyses identified a basal caniform divergence between Cynoidea and Arctoidea. Arctoidea split into three primary groups, Ursidae (including the giant panda), Pinnipedia, and a branch, Musteloidea, which encompassed Ailuridae (red panda), Mephitidae (skunks), Procyonidae (raccoons) and Mustelidae (mustelids). The analyses favored a basal arctoid split between Ursidae and a branch containing Pinnipedia and Musteloidea. Within the Musteloidea there was a preference for a basal divergence between Ailuridae and remaining families. Among the latter, the analyses identified a sister group relationship between Mephitidae and a branch that contained Procyonidae and Mustelidae. The mitogenomic distance between the wolf and the dog was shown to be at the same level as that of basal human divergences. The wolf and the dog are commonly considered as separate species in the popular literature. The mitogenomic result is inconsistent with that understanding at the same time as it provides insight into the time of the domestication of the dog relative to basal human mitogenomic divergences.

  18. Analysing Children's Drawings: Applied Imagination

    ERIC Educational Resources Information Center

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  19. Nonlinear structural crash dynamics analyses

    NASA Technical Reports Server (NTRS)

    Hayduk, R. J.; Thomson, R. G.; Wittlin, G.; Kamat, M. P.

    1979-01-01

    Presented in this paper are the results of three nonlinear computer programs, KRASH, ACTION and DYCAST used to analyze the dynamic response of a twin-engine, low-wing airplane section subjected to a 8.38 m/s (27.5 ft/s) vertical impact velocity crash condition. This impact condition simulates the vertical sink rate in a shallow aircraft landing or takeoff accident. The three distinct analysis techniques for nonlinear dynamic response of aircraft structures are briefly examined and compared versus each other and the experimental data. The report contains brief descriptions of the three computer programs, the respective aircraft section mathematical models, pertinent data from the experimental test performed at NASA Langley, and a comparison of the analyses versus test results. Cost and accuracy comparisons between the three analyses are made to illustrate the possible uses of the different nonlinear programs and their future potential.

  20. Workload analyse of assembling process

    NASA Astrophysics Data System (ADS)

    Ghenghea, L. D.

    2015-11-01

    Workload is the most important indicator for managers responsible for industrial technological processes, whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembly technology for a roller bearing assembly process, carried out in a large company with integrated bearing manufacturing. In these analyses the delay (work-sampling) technique was used to identify and classify all of the bearing assemblers' activities and to determine how much of the 480-minute working day the workers devote to each activity. The study shows some ways to increase process productivity without supplementary investment and also indicates that process automation could be the solution for reaching maximum productivity.

  1. Mars periglacial punctual features analyses

    NASA Astrophysics Data System (ADS)

    Machado, Adriane; Barata, Teresa; Ivo Alves, E.; Cunha, Pedro P.

    2012-11-01

    The presence of patterned ground on Mars has been reported in several papers, especially studies of polygon distribution, size and formation processes. In recent years, the presence of basketball terrain has also been noted on Mars. Studies have been made to recognize these terrains on Mars through the analysis of Mars Orbiter Camera (MOC) images. We have been developing an algorithm that automatically recognizes and extracts hummocky patterns on Mars related to landforms generated by freeze-thaw cycles, such as mud boil features. The algorithm is based on remote sensing data and compares the morphology and size of hummocks and mud boils from Adventdalen at Longyearbyen (Svalbard, Norway) with hummocky patterns on Mars, using High Resolution Imaging Science Experiment (HiRISE) imagery.

  2. Supplementary report on antilock analyses

    NASA Technical Reports Server (NTRS)

    Zellner, J. W.

    1985-01-01

    Generic modulator analysis was performed to quantify the effects of dump and reapply pressure rates on antilock stability and performance. The analysis included dump and reapply rates as well as lumped modulator delay. Based on the results of the generic modulator analysis and earlier toggle optimization analysis (with the Mitsubishi modulator), a recommended preliminary antilock design was synthesized and its response and performance simulated. The results of these analyses are documented.

  3. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    NASA Astrophysics Data System (ADS)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three times) in most of the Southern hemisphere, the North Pacific ocean, and under-developed nations of Africa and South America where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.
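
    The uncertainty proxy described above reduces, at each grid point, to the bias and spread of the difference between two centres' analyses of the same field; the sketch below shows that computation on synthetic gridded temperatures, with invented grid dimensions and error magnitudes.

```python
import numpy as np

rng = np.random.default_rng(9)
nlat, nlon = 73, 144                       # 2.5-degree global grid

# Hypothetical "true" 500 hPa temperature field and two centres' analyses,
# each with its own bias and random analysis error.
truth = 250.0 + 15.0 * np.cos(np.linspace(-np.pi / 2, np.pi / 2, nlat))[:, None] * np.ones(nlon)
analysis_a = truth + 0.3 + rng.normal(0.0, 0.8, (nlat, nlon))
analysis_b = truth - 0.2 + rng.normal(0.0, 1.1, (nlat, nlon))

diff = analysis_a - analysis_b
print(f"inter-analysis bias:    {diff.mean():+.2f} K")
print(f"inter-analysis std dev: {diff.std():.2f} K")
# Larger spread in data-sparse regions would show up as larger local
# standard deviations when the same statistics are computed over subdomains.
```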

  4. FORTRAN Algorithm for Image Processing

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Hull, David R.

    1987-01-01

    FORTRAN computer algorithm containing various image-processing analysis and enhancement functions developed. Algorithm developed specifically to process images of developmental heat-engine materials obtained with sophisticated nondestructive evaluation instruments. Applications of program include scientific, industrial, and biomedical imaging for studies of flaws in materials, analyses of steel and ores, and pathology.

  5. Steganalysis of overlapping images

    NASA Astrophysics Data System (ADS)

    Whitaker, James M.; Ker, Andrew D.

    2015-03-01

    We examine whether steganographic images can be detected more reliably when there exist other images, taken with the same camera under the same conditions, of the same scene. We argue that such a circumstance is realistic and likely in practice. In 'laboratory conditions' mimicking circumstances favourable to the analyst, and with a custom set of digital images which capture the same scenes with controlled amounts of overlap, we use an overlapping reference image to calibrate steganographic features of the image under analysis. Experimental results show that the analysed image can be classified as cover or stego with much greater reliability than traditional steganalysis not exploiting overlapping content, and the improvement in reliability depends on the amount of overlap. These results are curious because two different photographs of exactly the same scene, taken only a few seconds apart with a fixed camera and settings, typically have steganographic features that differ by considerably more than a cover and stego image.

  6. Analysing photonic structures in plants.

    PubMed

    Vignolini, Silvia; Moyroud, Edwige; Glover, Beverley J; Steiner, Ullrich

    2013-10-06

    The outer layers of a range of plant tissues, including flower petals, leaves and fruits, exhibit an intriguing variation of microscopic structures. Some of these structures include ordered periodic multilayers and diffraction gratings that give rise to interesting optical appearances. The colour arising from such structures is generally brighter than pigment-based colour. Here, we describe the main types of photonic structures found in plants and discuss the experimental approaches that can be used to analyse them. These experimental approaches allow identification of the physical mechanisms producing structural colours with a high degree of confidence.

  7. Summary of LDEF battery analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Thaller, Larry; Bittner, Harlin; Deligiannis, Frank; Tiller, Smith; Sullivan, David; Bene, James

    1992-01-01

    Tests and analyses of NiCd, LiSO2, and LiCf batteries flown on the Long Duration Exposure Facility (LDEF) include results from NASA, Aerospace, and commercial labs. The LiSO2 cells illustrate six-year degradation of internal components acceptable for space applications, with up to 85 percent battery capacity remaining on discharge of some returned cells. LiCf batteries completed their mission, but lost any remaining capacity due to internal degradation. Returned NiCd batteries tested at GSFC showed slight case distortion due to pressure buildup, but were functioning as designed.

  8. Analysing photonic structures in plants

    PubMed Central

    Vignolini, Silvia; Moyroud, Edwige; Glover, Beverley J.; Steiner, Ullrich

    2013-01-01

    The outer layers of a range of plant tissues, including flower petals, leaves and fruits, exhibit an intriguing variation of microscopic structures. Some of these structures include ordered periodic multilayers and diffraction gratings that give rise to interesting optical appearances. The colour arising from such structures is generally brighter than pigment-based colour. Here, we describe the main types of photonic structures found in plants and discuss the experimental approaches that can be used to analyse them. These experimental approaches allow identification of the physical mechanisms producing structural colours with a high degree of confidence. PMID:23883949

  9. Laser power beaming system analyses

    NASA Technical Reports Server (NTRS)

    Zeiders, Glenn W., Jr.

    1993-01-01

    The successful demonstration of the PAMELA adaptive optics hardware and the fabrication of the BTOS truss structure were identified by the program office as the two most critical elements of the NASA power beaming program, so it was these that received attention during this program. Much of the effort was expended in direct program support at MSFC, but detailed technical analyses of the AMP deterministic control scheme and the BTOS truss structure (both the JPL design and a spherical one) were prepared and are attached, and recommendations are given.

  10. [Laboratory analyses in sports medicine].

    PubMed

    Clénin, German E; Cordes, Mareike

    2015-05-01

    Laboratory analyses in sports medicine are relevant for three reasons: 1. In actively exercising individuals, laboratory analyses are among the central elements in the diagnosis of diseases and overreaching. 2. Regularly performed laboratory analyses in competitive athletes with a high load of training and competition may help to detect certain deficiencies early on. 3. Physical activity in general, and competitive exercise training specifically, significantly change certain routine laboratory parameters, although these changes do not reflect pathological processes. These so-called preanalytic variations should be taken into consideration while interpreting laboratory data in medical emergency and routine diagnostics. This article intends to help the physician to interpret laboratory data of actively exercising sportsmen.

  11. THOR Turbulence Electron Analyser: TEA

    NASA Astrophysics Data System (ADS)

    Fazakerley, Andrew; Moore, Tom; Owen, Chris; Pollock, Craig; Wicks, Rob; Samara, Marilia; Rae, Jonny; Hancock, Barry; Kataria, Dhiren; Rust, Duncan

    2016-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The Turbulence Electron Analyser (TEA) will measure the plasma electron populations in the mission's Regions of Interest. It will collect a 3D electron velocity distribution with cadences as short as 5 ms. The instrument will be capable of measuring energies up to 30 keV. TEA consists of multiple electrostatic analyser heads arranged so as to measure electrons arriving from look directions covering the full sky, i.e. 4 pi solid angle. The baseline concept is similar to the successful FPI-DES instrument currently operating on the MMS mission. TEA is intended to have a similar angular resolution, but a larger geometric factor. In comparison to earlier missions, TEA improves on the measurement cadence. For example, MMS FPI-DES routinely operates at 30 ms cadence. The objective of measuring distributions at rates as fast as 5 ms is driven by the mission's scientific requirements to resolve electron gyroscale size structures, where plasma heating and fluctuation dissipation is predicted to occur. TEA will therefore be capable of making measurements of the evolution of distribution functions across thin (a few km) current sheets travelling past the spacecraft at up to 600 km/s, of the Power Spectral Density of fluctuations of electron moments and of distributions fast enough to match frequencies with waves expected to be dissipating turbulence (e.g. with 100 Hz whistler waves).

  12. Analyses of the temporomandibular disc.

    PubMed

    Jirman, R; Fricová, M; Horák, Z; Krystůfek, J; Konvicková, S; Mazánek, J

    2007-01-01

    This project is the beginning of a larger research effort with the goal of developing a new total replacement of the temporomandibular (TM) joint. The first aim of this work was to determine the relative displacement of the TM disc and the mandible during mouth opening. The movement of the TM disc was studied using magnetic resonance imaging. Sagittal static images in revolved sections of the TM joint were obtained at various positions of jaw opening from 0 to 50 mm. The results provided a description of the TM disc displacements as a function of jaw opening. The displacements of the mandible and TM disc were about 16 mm and 10 mm, respectively, at a mouth opening of 50 mm, and the maximum rotation of the mandible was 34 degrees. The results of these measurements can be used for clinical diagnostics, and they were also used as inputs for the subsequent finite element analysis (FEA). The second aim of this work was to perform a stress and strain analysis of the TM joint using non-linear FEA. The modelled TM joint complex consists of the mandibular disc, half of the skull and half of the mandible during normal jaw opening. The results illustrate the stress distributions in the TMJ during normal jaw opening.

  13. Perturbation analyses of intermolecular interactions

    NASA Astrophysics Data System (ADS)

    Koyama, Yohei M.; Kobayashi, Tetsuya J.; Ueda, Hiroki R.

    2011-08-01

    Conformational fluctuations of a protein molecule are important to its function, and it is known that environmental molecules, such as water molecules, ions, and ligand molecules, significantly affect the function by changing the conformational fluctuations. However, it is difficult to systematically understand the role of environmental molecules because intermolecular interactions related to the conformational fluctuations are complicated. To identify important intermolecular interactions with regard to the conformational fluctuations, we develop herein (i) distance-independent and (ii) distance-dependent perturbation analyses of the intermolecular interactions. We show that these perturbation analyses can be realized by performing (i) a principal component analysis using conditional expectations of truncated and shifted intermolecular potential energy terms and (ii) a functional principal component analysis using products of intermolecular forces and conditional cumulative densities. We refer to these analyses as intermolecular perturbation analysis (IPA) and distance-dependent intermolecular perturbation analysis (DIPA), respectively. For comparison of the IPA and the DIPA, we apply them to the alanine dipeptide isomerization in explicit water. Although the first IPA principal components discriminate two states (the α state and PPII (polyproline II) + β states) for larger cutoff lengths, the separation between the PPII state and the β state is unclear in the second IPA principal components. On the other hand, for large cutoff values, the DIPA eigenvalues converge faster than those of the IPA, and the top two DIPA principal components clearly identify the three states. By using the DIPA biplot, the contributions of the dipeptide-water interactions to each state are analyzed systematically. Since the DIPA improves the state identification and the convergence rate while retaining distance information, we conclude that the DIPA is a more practical method than the IPA.
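
    As a rough, simplified sketch of the IPA flavour only (the energy matrix, distances, cutoff and system below are invented, not the alanine-dipeptide data): rows are simulation snapshots, columns are per-pair interaction energy terms truncated at a cutoff and mean-shifted, and PCA then ranks which interactions dominate the conformational variation.

        import numpy as np

        def truncated_shifted_energies(pair_energies, pair_distances, cutoff):
            """Zero out terms beyond the cutoff (truncation) and remove the mean (shift)."""
            kept = np.where(pair_distances < cutoff, pair_energies, 0.0)
            return kept - kept.mean(axis=0)

        def pca(matrix, n_components=2):
            """Principal components via SVD of the snapshot-by-term matrix."""
            _, s, vt = np.linalg.svd(matrix, full_matrices=False)
            return vt[:n_components], s[:n_components] ** 2 / (matrix.shape[0] - 1)

        rng = np.random.default_rng(2)
        energies = rng.standard_normal((1000, 30))      # hypothetical snapshots x pair terms
        distances = 3.0 + 2.0 * rng.random((1000, 30))  # hypothetical pair distances (angstrom)
        components, variances = pca(truncated_shifted_energies(energies, distances, cutoff=4.5))
        print(components.shape, variances)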

  14. Genetic Analyses of Integrin Signaling

    PubMed Central

    Wickström, Sara A.; Radovanac, Korana; Fässler, Reinhard

    2011-01-01

    The development of multicellular organisms, as well as maintenance of organ architecture and function, requires robust regulation of cell fates. This is in part achieved by conserved signaling pathways through which cells process extracellular information and translate this information into changes in proliferation, differentiation, migration, and cell shape. Gene deletion studies in higher eukaryotes have assigned critical roles for components of the extracellular matrix (ECM) and their cellular receptors in a vast number of developmental processes, indicating that a large proportion of this signaling is regulated by cell-ECM interactions. In addition, genetic alterations in components of this signaling axis play causative roles in several human diseases. This review will discuss what genetic analyses in mice and lower organisms have taught us about adhesion signaling in development and disease. PMID:21421914

  15. Chemical analyses of provided samples

    NASA Technical Reports Server (NTRS)

    Becker, Christopher H.

    1993-01-01

    Two batches of samples were received and chemical analysis of the surface and near-surface regions of the samples was performed by the surface analysis by laser ionization (SALI) method. The samples included four one-inch optics and several paint samples. The analyses emphasized surface contamination or modification. In these studies, pulsed sputtering by 7 keV Ar+ and primarily single-photon ionization (SPI) by coherent 118 nm radiation (at approximately 5 x 10^5 W/cm^2) were used. For two of the samples, multiphoton ionization (MPI) at 266 nm (at approximately 5 x 10^11 W/cm^2) was also used. Most notable among the results was the silicone contamination on Mg2 mirror 28-92, and that the Long Duration Exposure Facility (LDEF) paint sample had been enriched in K and Na and depleted in Zn, Si, B, and organic compounds relative to the control paint.

  16. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
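
    The HEDR models are not reproduced here; the sketch below only shows the generic Monte Carlo pattern such a plan describes, with a made-up dose model and made-up parameter distributions: sample the uncertain inputs, propagate them, summarize the output spread, and rank inputs by rank correlation with the output.

        import numpy as np

        def toy_dose(release, dispersion, intake):
            """Placeholder dose model, standing in for the actual HEDRIC calculations."""
            return release * dispersion * intake

        def spearman(x, y):
            """Spearman rank correlation via double argsort (no ties in continuous samples)."""
            rx = np.argsort(np.argsort(x))
            ry = np.argsort(np.argsort(y))
            return np.corrcoef(rx, ry)[0, 1]

        rng = np.random.default_rng(3)
        n = 10_000
        params = {
            "release": rng.lognormal(mean=0.0, sigma=0.5, size=n),
            "dispersion": rng.uniform(1e-6, 5e-6, size=n),
            "intake": np.clip(rng.normal(1.0, 0.2, size=n), 0.0, None),
        }
        dose = toy_dose(**params)
        print("5th-95th percentile of dose:", np.percentile(dose, [5, 95]))
        for name, values in params.items():
            print(f"rank correlation of dose with {name}: {spearman(values, dose):.2f}")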

  17. Analyses to improve operational flexibility

    SciTech Connect

    Trikouros, N.G.

    1986-01-01

    Operational flexibility is greatly enhanced if the technical bases for plant limits and design margins are fully understood, and the analyses necessary to evaluate the effect of plant modifications or changes in operating modes on these parameters can be performed as required. If a condition should arise that might jeopardize a plant limit or reduce operational flexibility, it would be necessary to understand the basis for the limit or the specific condition limiting operational flexibility and be capable of performing a reanalysis to either demonstrate that the limit will not be violated or to change the limit. This paper provides examples of GPU Nuclear efforts in this regard. Examples of Oyster Creek and Three Mile Island operating experiences are discussed.

  18. Multivariate analyses in microbial ecology

    PubMed Central

    Ramette, Alban

    2007-01-01

    Environmental microbiology is undergoing a dramatic revolution due to the increasing accumulation of biological information and contextual environmental parameters. This will not only enable a better identification of diversity patterns, but will also shed more light on the associated environmental conditions, spatial locations, and seasonal fluctuations, which could explain such patterns. Complex ecological questions may now be addressed using multivariate statistical analyses, which represent a vast potential of techniques that are still underexploited. Here, well-established exploratory and hypothesis-driven approaches are reviewed, so as to foster their addition to the microbial ecologist's toolbox. Because such tools aim at reducing data set complexity and at identifying major patterns and putative causal factors, they will certainly find many applications in microbial ecology. PMID:17892477

  19. Isotopic signatures by bulk analyses

    SciTech Connect

    Efurd, D.W.; Rokop, D.J.

    1997-12-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally.

  20. 3-D Cavern Enlargement Analyses

    SciTech Connect

    EHGARTNER, BRIAN L.; SOBOLIK, STEVEN R.

    2002-03-01

    Three-dimensional finite element analyses simulate the mechanical response of enlarging existing caverns at the Strategic Petroleum Reserve (SPR). The caverns are located in Gulf Coast salt domes and are enlarged by leaching during oil drawdowns as fresh water is injected to displace the crude oil from the caverns. The current criteria adopted by the SPR limit cavern usage to 5 drawdowns (leaches). As a base case, 5 leaches were modeled over a 25 year period to roughly double the volume of a 19 cavern field. Thirteen additional leaches were then simulated until caverns approached coalescence. The cavern field approximated the geometries and geologic properties found at the West Hackberry site. This enabled comparison of data collected over nearly 20 years with analysis predictions. The analyses closely predicted the measured surface subsidence and cavern closure rates as inferred from historic well head pressures. This provided the necessary assurance that the model displacements, strains, and stresses are accurate. However, the cavern field has not yet experienced the large scale drawdowns being simulated. Should they occur in the future, code predictions should be validated with actual field behavior at that time. The simulations were performed using JAS3D, a three dimensional finite element analysis code for nonlinear quasi-static solids. The results examine the impacts of leaching and cavern workovers, where internal cavern pressures are reduced, on surface subsidence, well integrity, and cavern stability. The results suggest that the current limit of 5 oil drawdowns may be extended, with some mitigative action required on the wells and, later on, to surface structures due to subsidence strains. The predicted stress state in the salt shows damage starting to occur after 15 drawdowns, with significant failure occurring at the 16th drawdown, well beyond the current limit of 5 drawdowns.

  1. THOR Turbulence Electron Analyser: TEA

    NASA Astrophysics Data System (ADS)

    Fazakerley, Andrew; Samara, Marilia; Hancock, Barry; Wicks, Robert; Moore, Tom; Rust, Duncan; Jones, Jonathan; Saito, Yoshifumi; Pollock, Craig; Owen, Chris; Rae, Jonny

    2017-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The Turbulence Electron Analyser (TEA) will measure the plasma electron populations in the mission's Regions of Interest. It will collect a 3D electron velocity distribution with cadences as short as 5 ms. The instrument will be capable of measuring energies up to 30 keV. TEA consists of multiple electrostatic analyser heads arranged so as to measure electrons arriving from look directions covering the full sky, i.e. 4 pi solid angle. The baseline concept is similar to the successful FPI-DES instrument currently operating on the MMS mission. TEA is intended to have a similar angular resolution, but a larger geometric factor. In comparison to earlier missions, TEA improves on the measurement cadence. For example, MMS FPI-DES routinely operates at 30 ms cadence. The objective of measuring distributions at rates as fast as 5 ms is driven by the mission's scientific requirements to resolve electron gyroscale size structures, where plasma heating and fluctuation dissipation is predicted to occur. TEA will therefore be capable of making measurements of the evolution of distribution functions across thin (a few km) current sheets travelling past the spacecraft at up to 600 km/s, of the Power Spectral Density of fluctuations of electron moments and of distributions fast enough to match frequencies with waves expected to be dissipating turbulence (e.g. with 100 Hz whistler waves). A novel capability to time tag individual electron events during short intervals for the purposes of ground analysis of wave-particle interactions is also planned.

  2. Genetic analyses of captive Alala (Corvus hawaiiensis) using AFLP analyses

    USGS Publications Warehouse

    Jarvi, Susan I.; Bianchi, Kiara R.

    2006-01-01

    affected by the mutation rate at microsatellite loci, thus introducing a bias. Also, the number of loci that can be studied is frequently limited to fewer than 10. This theoretically represents a maximum of one marker for each of 10 chromosomes. Dominant markers like AFLP allow a larger fraction of the genome to be screened. Large numbers of loci can be screened by AFLP to resolve very small individual differences that can be used for identification of individuals, estimates of pairwise relatedness and, in some cases, for parentage analyses. Since AFLP is a dominant marker (can not distinguish between +/+ homozygote versus +/- heterozygote), it has limitations for parentage analyses. Only when both parents are homozygous for the absence of alleles (-/-) and offspring show a presence (+/+ or +/-) can the parents be excluded. In this case, microsatellites become preferable as they have the potential to exclude individual parents when the other parent is unknown. Another limitation of AFLP is that the loci are generally less polymorphic (only two alleles/locus) than microsatellite loci (often >10 alleles/locus). While generally fewer than 10 highly polymorphic microsatellite loci are enough to exclude and assign parentage, it might require up to 100 or more AFLP loci. While there are pros and cons to different methodologies, the total number of loci evaluated by AFLP generally offsets the limitations imposed due to the dominant nature of this approach and end results between methods are generally comparable. Overall objectives of this study were to evaluate the level of genetic diversity in the captive population of Alala, to compare genetic data with currently available pedigree information, and to determine the extent of relatedness of mating pairs and among founding individuals.
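
    A toy illustration of the dominant-marker exclusion rule stated above (the presence/absence calls are invented, not Alala data): a putative parental pair is excluded at a locus only when both parents lack the band while the offspring shows it.

        def exclusion_loci(parent1, parent2, offspring):
            """Indices of AFLP loci at which the putative parental pair is excluded."""
            return [i for i, (p1, p2, o) in enumerate(zip(parent1, parent2, offspring))
                    if o and not p1 and not p2]     # offspring band present, both parents -/-

        # band present = True, band absent = False, across five hypothetical loci
        parent1 = [True, False, False, True, False]
        parent2 = [False, False, True, True, False]
        offspring = [True, True, False, True, False]
        print(exclusion_loci(parent1, parent2, offspring))   # -> [1]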

  3. Biomedical Imaging,

    DTIC Science & Technology

    precision required from the task. This report details the technologies in surface and subsurface imaging systems for research and commercial applications. Biomedical imaging, Anthropometry, Computer imaging.

  4. imageMCR

    SciTech Connect

    2011-09-27

    imageMCR is a user-friendly software package that consists of a variety of inputs to preprocess and analyze hyperspectral image data using multivariate algorithms such as Multivariate Curve Resolution (MCR), Principal Component Analysis (PCA), Classical Least Squares (CLS) and Parallel Factor Analysis (PARAFAC). MCR provides a relative quantitative analysis of the hyperspectral image data without the need for standards, and it discovers all the emitting species (spectrally pure components) present in an image, even those for which there is no a priori information. Once the spectral components are discovered, these spectral components can be used for future MCR analyses or used with CLS algorithms to quickly extract concentration image maps for each component within spectral image data sets.
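
    Only the classical-least-squares step is sketched here, not the imageMCR package or its interface: assuming the pure-component spectra have already been resolved (for example by MCR), per-pixel concentrations follow from a linear least-squares fit and reshape into image maps. All arrays below are synthetic.

        import numpy as np

        def cls_concentration_maps(hypercube, pure_spectra):
            """hypercube: (rows, cols, wavelengths); pure_spectra: (components, wavelengths)."""
            rows, cols, nwl = hypercube.shape
            pixels = hypercube.reshape(-1, nwl)                               # pixels x wavelengths
            conc, *_ = np.linalg.lstsq(pure_spectra.T, pixels.T, rcond=None)  # solve S^T c = d^T
            return conc.T.reshape(rows, cols, pure_spectra.shape[0])          # concentration maps

        rng = np.random.default_rng(4)
        spectra = np.abs(rng.standard_normal((3, 64)))         # three synthetic emitters
        true_maps = np.abs(rng.standard_normal((32, 32, 3)))   # synthetic concentrations
        cube = (true_maps.reshape(-1, 3) @ spectra).reshape(32, 32, 64)
        maps = cls_concentration_maps(cube, spectra)
        print(np.allclose(maps, true_maps))                    # noiseless data: exact recovery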

  5. Helicopter tail rotor noise analyses

    NASA Technical Reports Server (NTRS)

    George, A. R.; Chou, S. T.

    1986-01-01

    A study was made of helicopter tail rotor noise, particularly that due to interactions with the main rotor tip vortices, and with the fuselage separation mean wake. The tail rotor blade-main rotor tip vortex interaction is modelled as an airfoil of infinite span cutting through a moving vortex. The vortex and the geometry information required by the analyses are obtained through a free wake geometry analysis of the main rotor. The acoustic pressure-time histories for the tail rotor blade-vortex interactions are then calculated. These acoustic results are compared to tail rotor loading and thickness noise, and are found to be significant to the overall tail rotor noise generation. Under most helicopter operating conditions, large acoustic pressure fluctuations can be generated due to a series of skewed main rotor tip vortices passing through the tail rotor disk. The noise generation depends strongly upon the helicopter operating conditions and the location of the tail rotor relative to the main rotor.

  6. Proteins analysed as virtual knots

    PubMed Central

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-01-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important. PMID:28205562

  7. Proteins analysed as virtual knots

    NASA Astrophysics Data System (ADS)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  8. Photovoltaics: Life-cycle Analyses

    SciTech Connect

    Fthenakis V. M.; Kim, H.C.

    2009-10-02

    Life-cycle analysis is an invaluable tool for investigating the environmental profile of a product or technology from cradle to grave. Such life-cycle analyses of energy technologies are essential, especially as material and energy flows are often interwoven, and divergent emissions into the environment may occur at different life-cycle stages. This approach is well exemplified by our description of material and energy flows in four commercial PV technologies, i.e., mono-crystalline silicon, multi-crystalline silicon, ribbon-silicon, and cadmium telluride. The same life-cycle approach is applied to the balance of system that supports flat, fixed PV modules during operation. We also discuss the life-cycle environmental metrics for a concentration PV system with a tracker and lenses to capture more sunlight per cell area than the flat, fixed system, but which requires large auxiliary components. Select life-cycle risk indicators for PV, i.e., fatalities, injuries, and maximum consequences, are evaluated in a comparative context with other electricity-generation pathways.

  9. Dendrochronological analyses of art objects

    NASA Astrophysics Data System (ADS)

    Klein, Peter

    1998-05-01

    Dendrochronology is a discipline of the biological sciences which makes it possible to determine the age of wooden objects. Dendrochronological analyses are used in art history as an important means of dating wooden panels, sculptures and musical instruments. This method of dating allows us to ascertain at least a 'terminus post quem' for an art-object by determining the felling date of the tree from which the object was cut, in other words the date after which the wood for the object could have been sawn. The method involves measuring the width of the annual rings on the panels and comparing the growth ring curve resulting from this measurement with dated master chronologies. Since the characteristics of the growth ring curve over several centuries are unique and specific to wood of differing geographical origins, it is possible to obtain a relatively precise dating of art-objects. Since dendrochronology is year specific it is more accurate than other scientific methods. But like other methods it has limitations. The method is limited to trees from temperate zones. And even among these, some woods are better than others. A dating is possible for oak, beech, fir, pine and spruce. Linden and poplar are not datable.
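
    An illustrative sketch of the cross-dating step only, under the simplifying assumption that both series are already detrended ring-width indices (the master chronology and panel series below are synthetic): slide the undated series along the dated master and take the end year with the highest correlation as the candidate terminus date.

        import numpy as np

        def crossdate(sample, master, master_end_year):
            """Return (best_end_year, best_correlation) for an undated ring-width series."""
            n, m = len(sample), len(master)
            best_year, best_r = None, -np.inf
            for start in range(m - n + 1):
                r = np.corrcoef(sample, master[start:start + n])[0, 1]
                if r > best_r:
                    best_year, best_r = master_end_year - (m - start - n), r
            return best_year, best_r

        rng = np.random.default_rng(5)
        master = rng.standard_normal(300)                          # dated master chronology
        sample = master[180:260] + 0.3 * rng.standard_normal(80)   # 80-ring panel, date unknown
        print(crossdate(sample, master, master_end_year=2000))     # expect end year 1960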

  10. Network analyses in systems pharmacology

    PubMed Central

    Berger, Seth I.; Iyengar, Ravi

    2009-01-01

    Systems pharmacology is an emerging area of pharmacology which utilizes network analysis of drug action as one of its approaches. By considering drug actions and side effects in the context of the regulatory networks within which the drug targets and disease gene products function, network analysis promises to greatly increase our knowledge of the mechanisms underlying the multiple actions of drugs. Systems pharmacology can provide new approaches for drug discovery for complex diseases. The integrated approach used in systems pharmacology can allow for drug action to be considered in the context of the whole genome. Network-based studies are becoming an increasingly important tool in understanding the relationships between drug action and disease susceptibility genes. This review discusses how analysis of biological networks has contributed to the genesis of systems pharmacology and how these studies have improved global understanding of drug targets, suggested new targets and approaches for therapeutics, and provided a deeper understanding of the effects of drugs. Taken together, these types of analyses can lead to new therapeutic options while improving the safety and efficacy of existing medications. Contact: ravi.iyengar@mssm.edu PMID:19648136

  11. Chemical analyses of provided samples

    NASA Technical Reports Server (NTRS)

    Becker, Christopher H.

    1993-01-01

    A batch of four samples was received and chemical analysis of the surface and near-surface regions of the samples was performed by the surface analysis by laser ionization (SALI) method. The samples included four one-inch diameter optics labeled windows no. PR14 and PR17 and MgF2 mirrors 9-93 PPPC exp. and control DMES 26-92. The analyses emphasized surface contamination or modification. In these studies, pulsed desorption by 355 nm laser light and single-photon ionization (SPI) above the sample by coherent 118 nm radiation (at approximately 5 x 10^5 W/cm^2) were used, emphasizing organic analysis. For the two windows with an apparent yellowish contaminant film, higher desorption laser power was needed to provide substantial signals, indicating a less volatile contamination than for the two mirrors. Window PR14 and the 9-93 mirror showed more hydrocarbon components than the other two samples. The mass spectra, which show considerable complexity, are discussed in terms of various potential chemical assignments.

  12. Comparison between Inbreeding Analyses Methodologies.

    PubMed

    Esparza, Mireia; Martínez-Abadías, Neus; Sjøvold, Torstein; González-José, Rolando; Hernández, Miquel

    2015-12-01

    Surnames are widely used in inbreeding analysis, but the validity of results has often been questioned due to the failure to comply with the prerequisites of the method. Here we analyze inbreeding in Hallstatt (Austria) between the 17th and the 19th centuries both using genealogies and surnames. The high and significant correlation of the results obtained by both methods demonstrates the validity of the use of surnames in this kind of studies. On the other hand, the inbreeding values obtained (0.24 x 10⁻³ in the genealogies analysis and 2.66 x 10⁻³ in the surnames analysis) are lower than those observed in Europe for this period and for this kind of population, demonstrating the falseness of the apparent isolation of Hallstatt's population. The temporal trend of inbreeding in both analyses does not follow the European general pattern, but shows a maximum in 1850 with a later decrease along the second half of the 19th century. This is probably due to the high migration rate that is implied by the construction of transport infrastructures around the 1870's.

  13. Phylogenomic analyses unravel annelid evolution.

    PubMed

    Struck, Torsten H; Paul, Christiane; Hill, Natascha; Hartmann, Stefanie; Hösel, Christoph; Kube, Michael; Lieb, Bernhard; Meyer, Achim; Tiedemann, Ralph; Purschke, Günter; Bleidorn, Christoph

    2011-03-03

    Annelida, the ringed worms, is a highly diverse animal phylum that includes more than 15,000 described species and constitutes the dominant benthic macrofauna from the intertidal zone down to the deep sea. A robust annelid phylogeny would shape our understanding of animal body-plan evolution and shed light on the bilaterian ground pattern. Traditionally, Annelida has been split into two major groups: Clitellata (earthworms and leeches) and polychaetes (bristle worms), but recent evidence suggests that other taxa that were once considered to be separate phyla (Sipuncula, Echiura and Siboglinidae (also known as Pogonophora)) should be included in Annelida. However, the deep-level evolutionary relationships of Annelida are still poorly understood, and a robust reconstruction of annelid evolutionary history is needed. Here we show that phylogenomic analyses of 34 annelid taxa, using 47,953 amino acid positions, recovered a well-supported phylogeny with strong support for major splits. Our results recover chaetopterids, myzostomids and sipunculids in the basal part of the tree, although the position of Myzostomida remains uncertain owing to its long branch. The remaining taxa are split into two clades: Errantia (which includes the model annelid Platynereis), and Sedentaria (which includes Clitellata). Ancestral character trait reconstructions indicate that these clades show adaptation to either an errant or a sedentary lifestyle, with alteration of accompanying morphological traits such as peristaltic movement, parapodia and sensory perception. Finally, life history characters in Annelida seem to be phylogenetically informative.

  14. TOPAZ II Temperature Coefficient Analyses

    NASA Astrophysics Data System (ADS)

    Loaiza, David; Haskin, F. Eric; Marshall, Albert C.

    1994-07-01

    A two-dimensional model of the Topaz II reactor core suitable for neutronic analyses of temperature coefficients of reactivity is presented. The model is based on a 30° r-theta segment of the core. Results of TWODANT calculations are used to estimate temperature coefficients associated with fuel, electrodes, moderator, reflector, and tube plates over the range of temperatures anticipated during startup and operation. Results are presented to assess the reactivity effects associated with Doppler broadening, spectral effects and thermal expansion. Comparisons are made between the TWODANT results and empirical Russian curves used for simulating Topaz II system transients. TWODANT results indicate that the prompt temperature coefficients associated with temperature changes in fuel and emitters are negative. This is primarily because of Doppler broadening of the absorption resonances of uranium and molybdenum. The delayed effect of tube plate heating is also negative because fuel is moved radially outward in the core where it is less important. Temperature coefficients associated with delayed heating of the zirconium hydride moderator and the Beryllium reflector are positive, as the change in the neutron spectrum with moderator or reflector temperature decreases the rate of absorption in these components. The TWODANT results agree with the results obtained from the empirical Russian correlations.

  15. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  16. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  17. Digital imaging.

    PubMed

    Daniel, Gregory B

    2009-07-01

    Medical imaging is rapidly moving toward a digital-based image system. An understanding of the principles of digital imaging is necessary to evaluate features of imaging systems and can play an important role in purchasing decisions.

  18. Consumption patterns and perception analyses of hangwa.

    PubMed

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-03-01

    Hangwa is a traditional food, corresponding to the current consumption trend, in need of marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa to increase consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009 and the data from 231 persons were analyzed in this study. Statistical, descriptive, paired samples t-test, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'for present' (39.8%) and the main reasons for buying it were 'traditional image' (33.3%) and 'taste' (22.5%). When importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant with a high importance and a high performance were 'a sanitary process', 'a rigorous quality mark' and 'taste', which were related with quality of the products. In addition, those with a high importance but a low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of price'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote its consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price.
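
    A small sketch of the quadrant step in an importance-performance analysis, with invented attribute scores rather than the survey data above: each attribute is placed by comparing its importance and performance scores with the grand means.

        import numpy as np

        def ipa_quadrants(names, importance, performance):
            """Assign each attribute to a standard importance-performance quadrant."""
            imp, perf = np.asarray(importance, float), np.asarray(performance, float)
            imp_mid, perf_mid = imp.mean(), perf.mean()
            quadrant = {}
            for name, i, p in zip(names, imp, perf):
                if i >= imp_mid and p >= perf_mid:
                    quadrant[name] = "keep up the good work"
                elif i >= imp_mid:
                    quadrant[name] = "concentrate here"
                elif p >= perf_mid:
                    quadrant[name] = "possible overkill"
                else:
                    quadrant[name] = "low priority"
            return quadrant

        names = ["sanitary process", "taste", "reasonable price", "advertisement"]
        print(ipa_quadrants(names, importance=[4.6, 4.5, 4.4, 3.9], performance=[4.1, 4.0, 3.2, 3.0]))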

  19. The relationship among sea surface roughness variations, oceanographic analyses, and airborne remote sensing analyses

    NASA Technical Reports Server (NTRS)

    Oertel, G. F.; Wade, T. L.

    1981-01-01

    The synthetic aperture radar (SAR) was studied to determine whether it could image large scale estuaries and oceanic features such as fronts and to explain the electromagnetic interaction between SAR and the individual surface front features. Fronts were observed to occur at the entrance to the Chesapeake Bay. The airborne measurements consisted of data collection by SAR onboard an F-4 aircraft and real aperture side looking radar (SLAR) in Mohawk aircraft. A total of 89 transects were flown. Surface roughness and color as well as temperature and salinity were evaluated. Cross-frontal surveys were made. Frontal shear and convergence flow were obtained. Surface active organic materials, it was indicated, are present at the air-sea interface. In all, 2000 analyses were conducted to characterize the spatial and temporal variabilities associated with water mass boundaries.

  20. APXS ANALYSES OF BOUNCE ROCK: THE FIRST SHERGOTTITE ON MARS

    NASA Technical Reports Server (NTRS)

    Ming, Douglas W.; Zipfel, J.; Anderson, R.; Brueckner, J.; Clark, B. C.; Dreibus, G.; Economou, T.; Gellert, R.; Lugmair, G. W.; Klingelhoefer, G.

    2005-01-01

    During the MER Mission, an isolated rock at Meridiani Planum was analyzed by the Athena instrument suite [1]. Remote sensing instruments noticed its distinct appearance. Two areas on the untreated rock surface and one area that was abraded with the Rock Abrasion Tool were analyzed by Microscopic Imager, Mossbauer Mimos II [2], and Alpha Particle X-ray Spectrometer (APXS). Results of all analyses revealed a close relationship of this rock with known basaltic shergottites.

  1. Analyses of Transistor Punchthrough Failures

    NASA Technical Reports Server (NTRS)

    Nicolas, David P.

    1999-01-01

    The failure of two transistors in the Altitude Switch Assembly for the Solid Rocket Booster, followed by two additional failures a year later, presented a challenge to failure analysts. These devices had successfully worked for many years on numerous missions. There was no history of failures with this type of device. Extensive checks of the test procedures gave no indication for a source of the cause. The devices were manufactured more than twenty years ago and failure information on this lot date code was not readily available. External visual exam, radiography, PEID, and leak testing were performed with nominal results. Electrical testing indicated nearly identical base-emitter and base-collector characteristics (both forward and reverse) with a low resistance short emitter to collector. These characteristics are indicative of a classic failure mechanism called punchthrough. In failure analysis, punchthrough refers to a condition where a relatively low voltage pulse causes the device to conduct very hard, producing localized areas of thermal runaway or "hot spots". At one or more of these hot spots, the excessive currents melt the silicon. Heavily doped emitter material diffuses through the base region to the collector, forming a diffusion pipe shorting the emitter to base to collector. Upon cooling, an alloy junction forms between the pipe and the base region. Generally, the hot spot (punchthrough site) is under the bond and no surface artifact is visible. The devices were delidded and the internal structures were examined microscopically. The gold emitter lead was melted on one device, but others had anomalies in the metallization around the intact emitter bonds. The SEM examination confirmed some anomalies to be cosmetic defects while other anomalies were artifacts of the punchthrough site. Subsequent to these analyses, the contractor determined that some irregular testing procedures occurred at the time of the failures heretofore unreported. These testing

  3. Pawnee Nation Energy Option Analyses

    SciTech Connect

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses. In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop, a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe's energy vision. The overarching goals of the "first steps" project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the "best fit" energy options. Description of Activities Performed: The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations: Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe's main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  4. Quantum Image Encryption Algorithm Based on Quantum Image XOR Operations

    NASA Astrophysics Data System (ADS)

    Gong, Li-Hua; He, Xiang-Tao; Cheng, Shan; Hua, Tian-Xiang; Zhou, Nan-Run

    2016-07-01

    A novel encryption algorithm for quantum images based on quantum image XOR operations is designed. The quantum image XOR operations are designed by using the hyper-chaotic sequences generated with Chen's hyper-chaotic system to control the controlled-NOT operation, which is used to encode gray-level information. The initial conditions of Chen's hyper-chaotic system are the keys, which guarantee the security of the proposed quantum image encryption algorithm. Numerical simulations and theoretical analyses demonstrate that the proposed quantum image encryption algorithm has larger key space, higher key sensitivity, stronger resistance to statistical analysis and lower computational complexity than its classical counterparts.
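
    The scheme itself is quantum; the sketch below is only a loose classical analogue under stated assumptions: a simple logistic map (not the Chen hyper-chaotic system of the paper) generates a keystream from a secret initial condition, and the gray levels are XORed with it, which also makes decryption the same operation.

        import numpy as np

        def chaotic_keystream(x0, r, n):
            """Logistic-map keystream of n bytes; x0 in (0, 1) acts as the secret key."""
            x, stream = x0, np.empty(n, dtype=np.uint8)
            for i in range(n):
                x = r * x * (1.0 - x)
                stream[i] = int(x * 256) % 256
            return stream

        def xor_cipher(image, x0=0.3141592653, r=3.99):
            """Encrypts (and, applied again, decrypts) an 8-bit gray-level image."""
            flat = np.asarray(image, dtype=np.uint8).ravel()
            return np.bitwise_xor(flat, chaotic_keystream(x0, r, flat.size)).reshape(image.shape)

        image = np.arange(64, dtype=np.uint8).reshape(8, 8)   # toy 8x8 gray-level image
        encrypted = xor_cipher(image)
        print(np.array_equal(xor_cipher(encrypted), image))   # True: same keystream undoes the XOR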

  5. Multiple-image encryption algorithm based on mixed image element and permutation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoqiang; Wang, Xuesong

    2017-05-01

    To improve encryption efficiency and facilitate the secure transmission of multiple digital images, by defining the pure image element and mixed image element, this paper presents a new multiple-image encryption (MIE) algorithm based on the mixed image element and permutation, which can simultaneously encrypt any number of images. Firstly, segment the original images into pure image elements; secondly, scramble all the pure image elements with the permutation generated by the piecewise linear chaotic map (PWLCM) system; thirdly, combine mixed image elements into scrambled images; finally, diffuse the content of mixed image elements by performing the exclusive OR (XOR) operation among scrambled images and the chaotic image generated by another PWLCM system. The comparison with two similar algorithms is made. Experimental results and algorithm analyses show that the proposed MIE algorithm is very simple and efficient, which is suitable for practical image encryption.
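
    A simplified sketch of the pipeline under stated assumptions: the "image elements" here are single pixels rather than the paper's block elements, the diffusion step XORs only with a chaotic image rather than among scrambled images, and a piecewise linear chaotic map (PWLCM) seeded by the key values drives both the joint scrambling permutation and the diffusion image. All images below are random toys.

        import numpy as np

        def pwlcm_sequence(x0, p, n):
            """Iterate the piecewise linear chaotic map n times from x0 with parameter p."""
            x, seq = x0, np.empty(n)
            for i in range(n):
                if x < p:
                    x = x / p
                elif x < 0.5:
                    x = (x - p) / (0.5 - p)
                elif x < 1.0 - p:
                    x = (1.0 - p - x) / (0.5 - p)
                else:
                    x = (1.0 - x) / p
                seq[i] = x
            return seq

        def encrypt_images(images, x0=0.271828, p=0.3, x1=0.577215):
            """Jointly scramble the pixels of all images, then XOR-diffuse with a chaotic image."""
            shapes = [img.shape for img in images]
            pool = np.concatenate([img.ravel() for img in images]).astype(np.uint8)
            permutation = np.argsort(pwlcm_sequence(x0, p, pool.size))          # scrambling step
            chaotic_image = (pwlcm_sequence(x1, p, pool.size) * 255).astype(np.uint8)
            cipher = np.bitwise_xor(pool[permutation], chaotic_image)           # diffusion step
            encrypted, start = [], 0
            for shape in shapes:                                                # split back per image
                size = int(np.prod(shape))
                encrypted.append(cipher[start:start + size].reshape(shape))
                start += size
            return encrypted

        images = [np.random.randint(0, 256, (4, 4), dtype=np.uint8) for _ in range(3)]
        print([img.shape for img in encrypt_images(images)])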

  6. Imaging medical imaging

    NASA Astrophysics Data System (ADS)

    Journeau, P.

    2015-03-01

    This paper presents progress on imaging the research field of Imaging Informatics, mapped as the clustering of its communities together with their main results by applying a process to produce a dynamical image of the interactions between their results and their common object(s) of research. The basic side draws from a fundamental research on the concept of dimensions and projective space spanning several streams of research about three-dimensional perceptivity and re-cognition and on their relation and reduction to spatial dimensionality. The application results in an N-dimensional mapping in Bio-Medical Imaging, with dimensions such as inflammatory activity, MRI acquisition sequencing, spatial resolution (voxel size), spatiotemporal dimension inferred, toxicity, depth penetration, sensitivity, temporal resolution, wave length, imaging duration, etc. Each field is represented through the projection of papers' and projects' `discriminating' quantitative results onto the specific N-dimensional hypercube of relevant measurement axes, such as listed above and before reduction. Past published differentiating results are represented as red stars, achieved unpublished results as purple spots and projects at diverse progress advancement levels as blue pie slices. The goal of the mapping is to show the dynamics of the trajectories of the field in its own experimental frame and their direction, speed and other characteristics. We conclude with an invitation to participate and show a sample mapping of the dynamics of the community and a tentative predictive model from community contribution.

  7. Integrated Field Analyses of Thermal Springs

    NASA Astrophysics Data System (ADS)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers, through the SURE internship offered by the Southern California Earthquake Center (SCEC), has examined thermal springs in southern Idaho and northern Utah as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data on the results while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed and chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with the use of Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms which were analyzed using methods of single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas. These will be taken using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing the NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor to keep mud from contaminating the equipment. The use of raster

  8. Image Calibration

    NASA Technical Reports Server (NTRS)

    Peay, Christopher S.; Palacios, David M.

    2011-01-01

    Calibrate_Image calibrates images obtained from focal plane arrays so that the output image more accurately represents the observed scene. The function takes as input a degraded image along with a flat field image and a dark frame image produced by the focal plane array and outputs a corrected image. The three most prominent sources of image degradation are corrected for: dark current accumulation, gain non-uniformity across the focal plane array, and hot and/or dead pixels in the array. In the corrected output image the dark current is subtracted, the gain variation is equalized, and values for hot and dead pixels are estimated, using bicubic interpolation techniques.
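
    A plausible sketch of the three corrections described, not the NTRS function itself: subtract the dark frame, divide by the normalized gain derived from the flat field, and patch hot or dead pixels; simple 3x3 neighbour averaging stands in here for the bicubic interpolation, and all frames below are synthetic.

        import numpy as np

        def calibrate_image(raw, dark, flat, bad_pixel_mask):
            """Dark subtraction, flat-field gain equalization, and bad-pixel replacement."""
            gain = flat - dark
            gain = gain / gain.mean()                              # pixel-to-pixel gain map
            corrected = (raw - dark) / np.where(gain == 0, 1.0, gain)
            padded = np.pad(corrected, 1, mode="edge")             # for 3x3 neighbourhoods
            for r, c in zip(*np.nonzero(bad_pixel_mask)):          # estimate hot/dead pixels
                block = padded[r:r + 3, c:c + 3]
                corrected[r, c] = (block.sum() - padded[r + 1, c + 1]) / 8.0
            return corrected

        rng = np.random.default_rng(6)
        scene = rng.uniform(100.0, 200.0, (16, 16))
        dark = rng.uniform(5.0, 6.0, scene.shape)                  # dark-current frame
        flat = dark + rng.uniform(0.9, 1.1, scene.shape)           # flat-field frame
        raw = scene * (flat - dark) / (flat - dark).mean() + dark  # simulated degraded frame
        mask = np.zeros(scene.shape, dtype=bool)
        mask[3, 7] = True
        raw[3, 7] = 4095.0                                         # one hot pixel
        calibrated = calibrate_image(raw, dark, flat, mask)
        print(np.allclose(calibrated[~mask], scene[~mask]))        # good pixels recovered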

  9. Indexing Images.

    ERIC Educational Resources Information Center

    Rasmussen, Edie M.

    1997-01-01

    Focuses on access to digital image collections by means of manual and automatic indexing. Contains six sections: (1) Studies of Image Systems and their Use; (2) Approaches to Indexing Images; (3) Image Attributes; (4) Concept-Based Indexing; (5) Content-Based Indexing; and (6) Browsing in Image Retrieval. Contains 105 references. (AEF)

  10. Image Guidance

    EPA Pesticide Factsheets

    Guidance that explains the process for getting images approved in One EPA Web microsites and resource directories. Includes an appendix that shows examples of what makes some images better than others and how some images convey meaning more effectively than others.

  11. Image data processing of earth resources management. [technology transfer

    NASA Technical Reports Server (NTRS)

    Desio, A. W.

    1974-01-01

    Various image processing and information extraction systems are described along with the design and operation of an interactive multispectral information system, IMAGE 100. Analyses of ERTS data, using IMAGE 100, over a number of U.S. sites are presented. The following analyses are included: (1) investigations of crop inventory and management using remote sensing; and (2) land cover classification for environmental impact assessments. Results show that useful information is provided by IMAGE 100 analyses of ERTS data in digital form.

  12. Digital Imaging

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Digital imaging is the computer-processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications include CAT scanners, radiography, and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology and is accurate and cost efficient.

  13. Applications of Epsilon Radial Networks in Neuroimage Analyses

    PubMed Central

    Adluru, Nagesh; Chung, Moo K.; Lange, Nicholas T.; Lainhart, Janet E.; Alexander, Andrew L.

    2016-01-01

    “Is the brain ‘wiring’ different between groups of populations?” is an increasingly important question with advances in diffusion MRI and the abundance of network analytic tools. Recently, an automatic, data-driven, and computationally efficient framework for extracting brain networks using tractography and epsilon neighborhoods was proposed in the diffusion tensor imaging (DTI) literature [1]. In this paper we propose new extensions to that framework and show potential applications of such epsilon radial networks (ERNs) in performing various types of neuroimage analyses. These extensions allow us to use ERNs not only to mine for topo-physical properties of the structural brain networks but also to perform classical region-of-interest (ROI) analyses in a very efficient way. Thus we demonstrate the use of ERNs as a novel image processing lens for statistical and machine learning based analyses. We demonstrate its application in an autism study for identifying topological and quantitative group differences, as well as performing classification. Finally, these views are not restricted to ERNs but can be effective for population studies using any computationally efficient network-extraction procedures. PMID:28251191

  14. The ASSET intercomparison of ozone analyses: method and first results

    NASA Astrophysics Data System (ADS)

    Geer, A. J.; Lahoz, W. A.; Bekki, S.; Bormann, N.; Errera, Q.; Eskes, H. J.; Fonteyn, D.; Jackson, D. R.; Juckes, M. N.; Massart, S.; Peuch, V.-H.; Rharmili, S.; Segers, A.

    2006-06-01

    This paper examines 11 sets of ozone analyses from 7 different data assimilation systems. Two are numerical weather prediction (NWP) systems based on general circulation models (GCMs); the other five use chemistry transport models (CTMs). These systems contain either linearised or detailed ozone chemistry, or no chemistry at all. In most analyses, MIPAS (Michelson Interferometer for Passive Atmospheric Sounding) ozone data are assimilated. Two examples assimilate SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Chartography) observations. The analyses are compared to independent ozone observations covering the troposphere, stratosphere and lower mesosphere during the period July to November 2003. Through most of the stratosphere (50 hPa to 1 hPa), biases are usually within ±10% and standard deviations less than 10% compared to ozonesondes and HALOE (Halogen Occultation Experiment). Biases and standard deviations are larger in the upper-troposphere/lower-stratosphere, in the troposphere, the mesosphere, and the Antarctic ozone hole region. In these regions, some analyses do substantially better than others, and this is mostly due to differences in the models. At the tropical tropopause, many analyses show positive biases and excessive structure in the ozone fields, likely due to known deficiencies in assimilated tropical wind fields and a degradation in MIPAS data at these levels. In the southern hemisphere ozone hole, only the analyses which correctly model heterogeneous ozone depletion are able to reproduce the near-complete ozone destruction over the pole. In the upper-stratosphere and mesosphere (above 5 hPa), some ozone photochemistry schemes caused large but easily remedied biases. The diurnal cycle of ozone in the mesosphere is not captured, except by the one system that includes a detailed treatment of mesospheric chemistry. In general, similarly good results are obtained no matter what the assimilation method (Kalman filter, three or

  15. MELCOR analyses for accident progression issues

    SciTech Connect

    Dingman, S.E.; Shaffer, C.J.; Payne, A.C.; Carmel, M.K. )

    1991-01-01

    Results of calculations performed with MELCOR and HECTR in support of the NUREG-1150 study are presented in this report. The analyses examined a wide range of issues. The analyses included integral calculations covering an entire accident sequence, as well as calculations that addressed specific issues that could affect several accident sequences. The results of the analyses for Grand Gulf, Peach Bottom, LaSalle, and Sequoyah are described, and the major conclusions are summarized. 23 refs., 69 figs., 8 tabs.

  16. Electron/proton spectrometer certification documentation analyses

    NASA Technical Reports Server (NTRS)

    Gleeson, P.

    1972-01-01

    A compilation of analyses generated during the development of the electron-proton spectrometer for the Skylab program is presented. The data documents the analyses required by the electron-proton spectrometer verification plan. The verification plan was generated to satisfy the ancillary hardware requirements of the Apollo Applications program. The certification of the spectrometer requires that various tests, inspections, and analyses be documented, approved, and accepted by reliability and quality control personnel of the spectrometer development program.

  17. [Introduction to the indirect meta-analyses].

    PubMed

    Bolaños Díaz, Rafael; Calderón Cahua, María

    2014-04-01

    Meta-analyses are studies that aim to compile all available information on a specific theme, grouping it and evaluating it with methodological quality tools. When direct comparisons of two treatments based on randomized clinical trials exist, standard meta-analyses are the best option, but there are scenarios in which no literature is available for those direct comparisons. In these cases, an alternative method to consider is indirect comparison, or indirect meta-analysis. The aim of this review is to explain the conceptual foundations, need, applications, and limitations of indirect comparisons as a basis for further understanding of network meta-analyses.

  18. The ASSET intercomparison of ozone analyses: method and first results

    NASA Astrophysics Data System (ADS)

    Geer, A. J.; Lahoz, W. A.; Bekki, S.; Bormann, N.; Errera, Q.; Eskes, H. J.; Fonteyn, D.; Jackson, D. R.; Juckes, M. N.; Massart, S.; Peuch, V.-H.; Rharmili, S.; Segers, A.

    2006-12-01

    This paper aims to summarise the current performance of ozone data assimilation (DA) systems, to show where they can be improved, and to quantify their errors. It examines 11 sets of ozone analyses from 7 different DA systems. Two are numerical weather prediction (NWP) systems based on general circulation models (GCMs); the other five use chemistry transport models (CTMs). The systems examined contain either linearised or detailed ozone chemistry, or no chemistry at all. In most analyses, MIPAS (Michelson Interferometer for Passive Atmospheric Sounding) ozone data are assimilated; two assimilate SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Chartography) observations instead. Analyses are compared to independent ozone observations covering the troposphere, stratosphere and lower mesosphere during the period July to November 2003. Biases and standard deviations are largest, and show the largest divergence between systems, in the troposphere, in the upper-troposphere/lower-stratosphere, in the upper-stratosphere and mesosphere, and the Antarctic ozone hole region. However, in any particular area, apart from the troposphere, at least one system can be found that agrees well with independent data. In general, none of the differences can be linked to the assimilation technique (Kalman filter, three or four dimensional variational methods, direct inversion) or the system (CTM or NWP system). Where results diverge, a main explanation is the way ozone is modelled. It is important to correctly model transport at the tropical tropopause, to avoid positive biases and excessive structure in the ozone field. In the southern hemisphere ozone hole, only the analyses which correctly model heterogeneous ozone depletion are able to reproduce the near-complete ozone destruction over the pole. In the upper-stratosphere and mesosphere (above 5 hPa), some ozone photochemistry schemes caused large but easily remedied biases. The diurnal cycle of ozone in the

  19. Analysing tissue and gene function in intestinal organ culture.

    PubMed

    Abud, Helen E; Young, Heather M; Newgreen, Donald F

    2008-01-01

    The study of growth, differentiation, and migration of different cell types within the developing intestine has been enhanced by the development of methods to grow intestinal tissue in organ culture. Here, we describe the innovative method of catenary culture where the tubular architecture of the intestine is maintained and normal cell differentiation occurs. Rapid analysis of gene function can be achieved using low voltage, square wave electroporation to introduce expression constructs into the epithelial cell layer of cultured explants. This whole-organ culture system allows cells, signalling pathways, and gene function to be analysed in intact explants of embryonic gut that are accessible for experimental manipulation and live cell imaging.

  20. Photoacoustic imaging.

    PubMed

    Zhang, Yin; Hong, Hao; Cai, Weibo

    2011-09-01

    Photoacoustic imaging, which is based on the photoacoustic effect, has developed extensively over the last decade. Possessing many attractive characteristics such as the use of nonionizing electromagnetic waves, good resolution and contrast, portable instrumentation, and the ability to partially quantitate the signal, photoacoustic techniques have been applied to the imaging of cancer, wound healing, disorders in the brain, and gene expression, among others. As a promising structural, functional, and molecular imaging modality for a wide range of biomedical applications, photoacoustic imaging can be categorized into two types of systems: photoacoustic tomography (PAT), which is the focus of this article, and photoacoustic microscopy (PAM). We first briefly describe the endogenous (e.g., hemoglobin and melanin) and the exogenous (e.g., indocyanine green [ICG], various gold nanoparticles, single-walled carbon nanotubes [SWNTs], quantum dots [QDs], and fluorescent proteins) contrast agents for photoacoustic imaging. Next, we discuss in detail the applications of nontargeted photoacoustic imaging. Recently, molecular photoacoustic (MPA) imaging has gained significant interest, and a few proof-of-principle studies have been reported. We summarize the current state of the art of MPA imaging, including the imaging of gene expression and the combination of photoacoustic imaging with other imaging modalities. Last, we point out obstacles facing photoacoustic imaging. Although photoacoustic imaging will likely continue to be a highly vibrant research field for years to come, the key question of whether MPA imaging could provide significant advantages over nontargeted photoacoustic imaging remains to be answered in the future.

  1. Photothermal imaging

    NASA Astrophysics Data System (ADS)

    Lapotko, Dmitry; Antonishina, Elena

    1995-02-01

    An automated image analysis system with two imaging regimes is described. The photothermal (PT) effect is used to image the temperature field or absorption structure of the sample (the cell) with high sensitivity and spatial resolution. In cell studies, the PT technique enables imaging of live, non-stained cells and monitoring of cell shape and structure. The system includes a dual-laser illumination unit coupled to a conventional optical microscope. A sample chamber provides automated or manual loading of up to three samples and cell positioning. For image detection a 256 x 256 10-bit CCD camera is used. The lasers, scanning stage, and camera are controlled by a PC. The system acquires an optical (transmitted light) image, a probe-laser optical image, and a PT image. The operation rate is 1-1.5 s per cell for a cycle of cell positioning, acquisition of the three images, and calculation of image parameters. A dedicated database provides storage and presentation of images and parameters and supports cell diagnostics based on quantitative image parameters. The system has been tested in studies of live and stained blood cells, and PT images of the cells have been used for cell differentiation. In experiments with red blood cells (RBC) originating from normal and anaemic blood, parameters for disease differentiation have been found. For white blood cells, PT images reveal details of cell structure that are absent from their optical images.

  2. Micro-FE analyses of bone: state of the art.

    PubMed

    van Rietbergen, B

    2001-01-01

    The ability to provide a complete characterization of the elastic properties of bone has vastly improved our understanding of trabecular bone mechanical properties. Based on this information, it was possible to validate several mechanical concepts related to the elastic behavior of trabecular bone that could not be validated earlier. With recently developed micro-CT scanners and the availability of large parallel computer systems, this technique has also enabled the determination of physiological bone tissue loading conditions from very large microFE models that can represent whole human bones in detail. Such analyses can provide the data needed for a better understanding of bone failure processes or cell-mediated load-adaptive remodeling processes. Computational demands for whole bone analyses, however, are still excessive. Unlike linear stress and strain analyses, the application of microFE to study non-linear processes, in particular bone failure mechanisms, is still in an early phase. Results of recent studies, however, are promising and indicate that an accurate prediction of bone failure with these techniques is possible. Compelling features of such analyses are that they enable multi-axial failure criteria at the apparent level to be developed using primarily computational methods, and that they can provide a basis for detailed analysis of the micro-mechanics associated with trabecular failure at the apparent level. The application of microFE techniques to analyze bone in vivo is in an early stage as well. First results have indicated that, although the resolution of presently available in vivo imaging techniques (i.e. pQCT and MR) is much less than that of the images used so far for microFE analyses, the technique can provide meaningful elastic properties of trabecular bone in vivo in most cases. It is expected that the remaining uncertainties in the microFE results can be eliminated as soon as the resolution of in vivo images is improved. With the fast developments in p

  3. Oncological image analysis: medical and molecular image analysis

    NASA Astrophysics Data System (ADS)

    Brady, Michael

    2007-03-01

    This paper summarises the work we have been doing on joint projects with GE Healthcare on colorectal and liver cancer, and with Siemens Molecular Imaging on dynamic PET. First, we recall the salient facts about cancer and oncological image analysis. Then we introduce some of the work that we have done on analysing clinical MRI images of colorectal and liver cancer, specifically the detection of lymph nodes and segmentation of the circumferential resection margin. In the second part of the paper, we shift attention to the complementary aspect of molecular image analysis, illustrating our approach with some recent work on: tumour acidosis, tumour hypoxia, and multiply drug resistant tumours.

  4. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 8 2011-10-01 2011-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by the...

  5. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by the...

  6. Aviation System Analysis Capability Executive Assistant Analyses

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Kostiuk, Peter

    1999-01-01

    This document describes the analyses that may be incorporated into the Aviation System Analysis Capability Executive Assistant. The document will be used as a discussion tool to enable NASA and other integrated aviation system entities to evaluate, discuss, and prioritize analyses.

  7. Meta-Analyses in Mental Retardation.

    ERIC Educational Resources Information Center

    Mostert, Mark P.

    2003-01-01

    This study reviews 26 meta-analyses in mental retardation in terms of selected hypotheses, sampling information, representative characteristics of the review, analysis of primary studies, interpretation of results, and reporting of the integrative view. Results indicate a wide variation in the amount of reported data similar to other analyses of…

  8. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  9. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  10. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  11. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 8 2014-10-01 2014-10-01 false Market analyses. 1180.7 Section 1180.7 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a)...

  12. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Market analyses. 1180.7 Section 1180.7 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a)...

  13. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Market analyses. 1180.7 Section 1180.7 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a)...

  14. Operator-free flow injection analyser

    PubMed Central

    de Faria, Lourival C.

    1991-01-01

    A flow injection analyser has been constructed to allow an operator-free determination of up to 40 samples. Besides the usual FIA apparatus, the analyser includes a home-made sample introduction device made with three electromechanical three-way valves and an auto-sampler from Technicon which has been adapted to be commanded by an external digital signal. The analyser is controlled by a single board SDK-8085 microcomputer. The necessary interface to couple the analyser components to the microcomputer is also described. The analyser was evaluated for a Cr(VI)-FIA determination showing a very good performance with a relative standard deviation for 15 signals from the injection of 100 μl of a 1.0 mg.ml-1 standard Cr(VI) solution being equal to 0.5%. PMID:18924899

  15. Image compression for functional imaging

    NASA Astrophysics Data System (ADS)

    Feng, Dagan D.; Li, Xianjin; Siu, Wan-Chi

    1997-04-01

    Functional imaging has been playing an important role in modern biomedical research and clinical diagnosis, providing internal biochemical information that was previously not available. However, for a routine dynamic study with a typical medical functional imaging system, such as positron emission tomography (PET), it is easy to acquire nearly 1000 images for just one patient in one study. Such a large number of images places a considerable burden on computer image storage space, data processing, and transmission time. In this paper, we present the theory and principles for the minimization of image frames in dynamic biomedical functional imaging. We show that the minimum number of image frames required is equal to the number of identifiable model parameters, and that the quality of the physiological parameter estimation based on this minimum number of image frames can be controlled at a comparable level. As a result of our study, the image storage space required can be reduced by more than 80 percent.

  16. Polarization imaging detection technology research

    NASA Astrophysics Data System (ADS)

    Xue, Mo-gen; Wang, Feng; Xu, Guo-ming; Yuan, Hong-wu

    2013-09-01

    In this paper we analyse polarization imaging theory and the common process of polarization imaging detection. Based on this, we summarize our many years of research work, especially on the mechanisms, technology, and systems of polarization imaging detection. Drawing on up-to-date developments at home and abroad, this paper discusses theoretical and technological problems of polarization imaging detection in detail, from the viewpoints of object polarization characteristics, the key problems and key technologies of polarization imaging detection, and polarization imaging detection systems and applications. These problems include the retrieval of omnidirectional object polarization characteristics, the integrated opto-electro-mechanical design of the polarization imaging detection system, high-precision polarization information analysis, and fast polarization image processing. Moreover, we point out possible application directions for polarization imaging detection technology in both military and civilian fields. We also summarize the likely future development of polarization imaging detection technology in the field of hyperspectral polarization imaging. This paper can provide a useful reference and guidance for promoting the research and development of polarization imaging detection technology.

  17. Level II Ergonomic Analyses, Dover AFB, DE

    DTIC Science & Technology

    1999-02-01

    IERA-RS-BR-TR-1999-0002, United States Air Force IERA: Level II Ergonomic Analyses, Dover AFB, DE, by Andrew Marcotte and Marilyn Joyce (The Joyce...). Contents include: 1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the

  18. Criteria for the assessment of analyser practicability

    PubMed Central

    Biosca, C.; Galimany, R.

    1993-01-01

    This article lists the theoretical criteria that need to be considered to assess the practicability of an automatic analyser. Two essential sets of criteria should be taken into account when selecting an automatic analyser: ‘reliability’ and ‘practicability’. Practicability covers the features that provide information about the suitability of an analyser for specific working conditions. These practicability criteria are classified in this article and include the environment; work organization; versatility and flexibility; safety controls; staff training; and maintenance and operational costs. PMID:18924972

  19. Medical imaging

    SciTech Connect

    Schneider, R.H.; Dwyer, S.J.

    1987-01-01

    This book contains papers from 26 sessions. Some of the session titles are: Tomographic Reconstruction, Radiography, Fluoro/Angio, Imaging Performance Measures, Perception, Image Processing, 3-D Display, and Printers, Displays, and Digitizers.

  20. Medical Imaging.

    ERIC Educational Resources Information Center

    Barker, M. C. J.

    1996-01-01

    Discusses four main types of medical imaging (x-ray, radionuclide, ultrasound, and magnetic resonance) and considers their relative merits. Describes important recent and possible future developments in image processing. (Author/MKR)

  1. Microhistological Techniques for Food Habits Analyses

    Treesearch

    Mark K. Johnson; Helen Wofford; Henry A. Pearson

    1983-01-01

    Techniques used to prepare and quantify herbivore diet samples for microhistological analyses are described. Plant fragments are illustrated for more than 50 selected plants common on longleaf-slash pine-bluestem range in the southeastern United States.

  2. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... addition, egg products can be analyzed for high sucrose content, pH, heavy metals and minerals, monosodium...

  3. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... addition, egg products can be analyzed for high sucrose content, pH, heavy metals and minerals, monosodium...

  4. Quality control considerations in performing washability analyses

    SciTech Connect

    Graham, R.D.

    1984-10-01

    The author describes, in considerable detail, the procedures for carrying out washability analyses as laid down in ASTM Standard Test Method D4371. These include sampling, sample preparation, hydrometer standardisation, washability testing, and analysis of specific gravity fractions.

  5. Anthocyanin analyses of Vaccinium fruit dietary supplements

    USDA-ARS?s Scientific Manuscript database

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  6. Interactive graphics for functional data analyses.

    PubMed

    Wrobel, Julia; Park, So Young; Staicu, Ana Maria; Goldsmith, Jeff

    Although there are established graphics that accompany the most common functional data analyses, generating these graphics for each dataset and analysis can be cumbersome and time consuming. Often, the barriers to visualization inhibit useful exploratory data analyses and prevent the development of intuition for a method and its application to a particular dataset. The refund.shiny package was developed to address these issues for several of the most common functional data analyses. After conducting an analysis, the plot_shiny() function is used to generate an interactive visualization environment that contains several distinct graphics, many of which are updated in response to user input. These visualizations reduce the burden of exploratory analyses and can serve as a useful tool for the communication of results to non-statisticians.

  7. SCM Forcing Data Derived from NWP Analyses

    DOE Data Explorer

    Jakob, Christian

    2008-01-15

    Forcing data, suitable for use with single column models (SCMs) and cloud resolving models (CRMs), have been derived from NWP analyses for the ARM (Atmospheric Radiation Measurement) Tropical Western Pacific (TWP) sites of Manus Island and Nauru.

  8. Comparison with Russian analyses of meteor impact

    SciTech Connect

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  9. Image processing technology

    SciTech Connect

    Van Eeckhout, E.; Pope, P.; Balick, L.

    1996-07-01

    This is the final report of a two-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The primary objective of this project was to advance image processing and visualization technologies for environmental characterization. This was effected by developing and implementing analyses of remote sensing data from satellite and airborne platforms, and demonstrating their effectiveness in visualization of environmental problems. Many sources of information were integrated as appropriate using geographic information systems.

  10. Measurement of image contrast using diffraction enhanced imaging

    NASA Astrophysics Data System (ADS)

    Kiss, Miklos Z.; Sayers, Dale E.; Zhong, Zhong

    2003-02-01

    Refraction contrast of simple objects obtained using diffraction enhanced imaging (DEI) was studied and compared to conventional radiographic contrast. Lucite cylinders and nylon wires were imaged using monochromatic synchrotron radiation at the National Synchrotron Light Source (nslsweb.nsls.bnl.gov/nsls/Default.htm) at the Brookhaven National Laboratory. The DEI images were obtained by placing a silicon analyser crystal tuned to the [333] diffraction plane in the beam path between the sample and the detector. To compare the DEI images with conventional radiographic images requires a consistent definition of refraction and absorption contrast. Conventional definitions of contrast favour conventional radiography and DEI contrast is defined to emphasize the specific characteristics of DEI. The proposed definitions were then used to find the DEI gain (the ratio of the DEI contrast with respect to the conventional image contrast). The results presented here show that the DEI gain is consistently greater than 1, indicating that DEI provides more contrast information than conventional radiography.
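
    The paper's exact contrast definitions are not reproduced in the abstract. As an illustrative sketch only, the code below shows how a DEI gain, defined above as the ratio of DEI contrast to conventional image contrast, could be computed once region-of-interest intensities and a contrast measure are chosen; a generic feature-versus-background contrast is assumed here.

      # Illustrative only: generic contrast measure and DEI gain ratio
      import numpy as np

      def contrast(feature: np.ndarray, background: np.ndarray) -> float:
          """Relative contrast between a feature region and its background."""
          f, b = float(feature.mean()), float(background.mean())
          return abs(f - b) / b

      def dei_gain(dei_feat, dei_bg, rad_feat, rad_bg) -> float:
          """Ratio of DEI contrast to conventional radiographic contrast (>1 favours DEI)."""
          return contrast(dei_feat, dei_bg) / contrast(rad_feat, rad_bg)

      if __name__ == "__main__":
          bg = np.random.default_rng(1).normal(100.0, 1.0, (32, 32))
          print(dei_gain(bg + 25, bg, bg + 5, bg))  # synthetic example, gain > 1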

  11. Analyses and forecasts with LAWS winds

    NASA Technical Reports Server (NTRS)

    Wang, Muyin; Paegle, Jan

    1994-01-01

    Horizontal fluxes of atmospheric water vapor are studied for summer months during 1989 and 1992 over North and South America based on analyses from European Center for Medium Range Weather Forecasts, US National Meteorological Center, and United Kingdom Meteorological Office. The calculations are performed over 20 deg by 20 deg box-shaped midlatitude domains located to the east of the Rocky Mountains in North America, and to the east of the Andes Mountains in South America. The fluxes are determined from operational center gridded analyses of wind and moisture. Differences in the monthly mean moisture flux divergence determined from these analyses are as large as 7 cm/month precipitable water equivalent over South America, and 3 cm/month over North America. Gridded analyses at higher spatial and temporal resolution exhibit better agreement in the moisture budget study. However, significant discrepancies of the moisture flux divergence computed from different gridded analyses still exist. The conclusion is more pessimistic than Rasmusson's estimate based on station data. Further analysis reveals that the most significant sources of error result from model surface elevation fields, gaps in the data archive, and uncertainties in the wind and specific humidity analyses. Uncertainties in the wind analyses are the most important problem. The low-level jets, in particular, are substantially different in the different data archives. Part of the reason for this may be due to the way the different analysis models parameterized physical processes affecting low-level jets. The results support the inference that the noise/signal ratio of the moisture budget may be improved more rapidly by providing better wind observations and analyses than by providing better moisture data.

  12. A History of Rotorcraft Comprehensive Analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  13. Recent advances in imaging Alzheimer's disease.

    PubMed

    Braskie, Meredith N; Toga, Arthur W; Thompson, Paul M

    2013-01-01

    Advances in brain imaging technology in the past five years have contributed greatly to the understanding of Alzheimer's disease (AD). Here, we review recent research related to amyloid imaging, new methods for magnetic resonance imaging analyses, and statistical methods. We also review research that evaluates AD risk factors and brain imaging, in the context of AD prediction and progression. We selected a variety of illustrative studies, describing how they advanced the field and are leading AD research in promising new directions.

  14. Proof Image

    ERIC Educational Resources Information Center

    Kidron, Ivy; Dreyfus, Tommy

    2014-01-01

    The emergence of a proof image is often an important stage in a learner's construction of a proof. In this paper, we introduce, characterize, and exemplify the notion of proof image. We also investigate how proof images emerge. Our approach starts from the learner's efforts to construct a justification without (or before) attempting any…

  15. Image alignment

    DOEpatents

    Dowell, Larry Jonathan

    2014-04-22

    Disclosed is a method and device for aligning at least two digital images. An embodiment may use frequency-domain transforms of small tiles created from each image to identify substantially similar, "distinguishing" features within each of the images, and then align the images together based on the location of the distinguishing features. To accomplish this, an embodiment may create equal sized tile sub-images for each image. A "key" for each tile may be created by performing a frequency-domain transform calculation on each tile. An information-distance difference between each possible pair of tiles on each image may be calculated to identify distinguishing features. From analysis of the information-distance differences of the pairs of tiles, a subset of tiles with high discrimination metrics in relation to other tiles may be located for each image. The subset of distinguishing tiles for each image may then be compared to locate tiles with substantially similar keys and/or information-distance metrics to other tiles of other images. Once similar tiles are located for each image, the images may be aligned in relation to the identified similar tiles.
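
    As a rough, hypothetical sketch of the tile-and-key idea described above (not the patented implementation), the code below splits each image into equal tiles, builds a frequency-domain key per tile, ranks tiles by a crude information-distance score, and estimates a translation from the best-matching tile pairs.

      # Hypothetical tile/key alignment sketch
      import numpy as np

      def tile_keys(image: np.ndarray, tile: int = 32):
          """Yield (row, col, key), where key is the normalized magnitude spectrum of a tile."""
          h, w = image.shape
          for r in range(0, h - tile + 1, tile):
              for c in range(0, w - tile + 1, tile):
                  key = np.abs(np.fft.fft2(image[r:r + tile, c:c + tile]))
                  yield r, c, key / (np.linalg.norm(key) + 1e-12)

      def most_distinctive(keys, top: int = 5):
          """Rank tiles by distance of their key from the mean key (a crude discrimination metric)."""
          keys = list(keys)
          mean_key = np.mean([k for _, _, k in keys], axis=0)
          return sorted(keys, key=lambda t: -np.linalg.norm(t[2] - mean_key))[:top]

      def match_offset(img_a: np.ndarray, img_b: np.ndarray, tile: int = 32):
          """Estimate a translation by pairing each distinctive tile in A with its best match in B."""
          a_tiles = most_distinctive(tile_keys(img_a, tile))
          b_tiles = list(tile_keys(img_b, tile))
          offsets = []
          for ra, ca, ka in a_tiles:
              rb, cb, _ = min(b_tiles, key=lambda t: np.linalg.norm(t[2] - ka))
              offsets.append((rb - ra, cb - ca))
          return tuple(np.median(offsets, axis=0))  # robust combined shift estimate

      # usage sketch: shift = match_offset(image_a, image_b)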

  16. Canonical Images

    ERIC Educational Resources Information Center

    Hewitt, Dave

    2007-01-01

    In this article, the author offers two well-known mathematical images--that of a dot moving around a circle; and that of the tens chart--and considers their power for developing mathematical thinking. In his opinion, these images each contain the essence of a particular topic of mathematics. They are contrasting images in the sense that they deal…

  18. Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst

    NASA Astrophysics Data System (ADS)

    Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina

    2015-03-01

    In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.

  19. Photoacoustic Imaging

    PubMed Central

    Zhang, Yin; Hong, Hao; Cai, Weibo

    2014-01-01

    Photoacoustic imaging, based on the photoacoustic effect, has come a long way over the last decade. Possessing many attractive characteristics such as the use of non-ionizing electromagnetic waves, good resolution/contrast, portable instrumentation, as well as the ability to quantitate the signal to a certain extent, photoacoustic techniques have been applied for the imaging of cancer, wound healing, disorders in the brain, and gene expression, among others. As a promising structural, functional and molecular imaging modality for a wide range of biomedical applications, photoacoustic imaging systems can be briefly categorized into two types: photoacoustic tomography (PAT, the focus of this chapter) and photoacoustic microscopy (PAM). We will first briefly describe the endogenous (e.g. hemoglobin and melanin) and exogenous contrast agents (e.g. indocyanine green, various gold nanoparticles, single-walled carbon nanotubes, quantum dots, and fluorescent proteins) for photoacoustic imaging. Next, we will discuss in detail the applications of non-targeted photoacoustic imaging. Recently, molecular photoacoustic (MPA) imaging has gained significant interest and a few proof-of-principle studies have been reported. We will summarize the current state-of-the-art of MPA imaging, including the imaging of gene expression and combination of photoacoustic imaging with other imaging modalities. Lastly, we will point out the obstacles facing photoacoustic imaging. Although photoacoustic imaging will likely continue to be a highly vibrant research field for the years to come, the key question of whether MPA imaging could provide significant advantages over non-targeted photoacoustic imaging remains to be demonstrated in the future. PMID:21880823

  20. Albedo Pattern Recognition and Time-Series Analyses in Malaysia

    NASA Astrophysics Data System (ADS)

    Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.

    2012-07-01

    Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate condition monitoring. This study is conducted to identify albedo pattern changes in Malaysia. The recognized patterns and changes will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration, and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product, as illustrated in the sketch following this abstract. The results revealed significant changes in albedo percentages over the past 10 years and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified in the maximum and minimum albedo values. The rises and falls of the line graph show a similar trend in the daily observations; the differences can be identified in terms of the value or percentage of the rises and falls of albedo. Thus, it can be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that external factors influence the albedo values, as the plotted sky conditions and diffusion do not show a uniform trend over the years, especially when the trend at a 5-year interval is examined; 2000 shows a high negative linear
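
    As a small illustration of the trend component of such a time-series analysis, the sketch below fits a linear trend to a synthetic monthly albedo series; the numbers are invented for demonstration and are not the MODIS MCD43A3 values used in the study.

      # Synthetic example: linear trend in a 10-year monthly albedo series
      import numpy as np

      rng = np.random.default_rng(0)
      months = np.arange(120)                                    # 2000-2009, monthly composites
      albedo = (0.14 + 0.01 * np.sin(2 * np.pi * months / 12)    # seasonal cycle
                - 0.00002 * months                               # weak long-term trend
                + 0.003 * rng.standard_normal(120))              # observational noise
      slope, intercept = np.polyfit(months, albedo, 1)
      print(f"fitted trend: {slope * 12:+.5f} albedo units per year")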

  1. Web-based cephalometric procedure for craniofacial and dentition analyses

    NASA Astrophysics Data System (ADS)

    Arun Kumar, N. S.; Kamath, Srijit R.; Ram, S.; Muthukumaran, B.; Venkatachalapathy, A.; Nandakumar, A.; Jayakumar, P.

    2000-05-01

    Craniofacial analysis is a very important and widely used procedure in orthodontic cephalometry, and it plays a key role in diagnosis and treatment planning. It involves establishing reference standards and specifying landmarks and variables. The manual approach takes up a tremendous amount of the orthodontist's time. In this paper, we developed a web-based approach for craniofacial and dentition analyses. A digital computed radiography (CR) system is used to obtain the craniofacial image, which is stored as a bitmap file. The system comprises two components: a server and a client. The server component is a program that runs on a remote machine. To use the system, the user connects to the website. The client component is then activated; it uploads the image from the PC and displays it on the canvas area. The landmarks are identified using a mouse interface, and the reference lines are generated. The resulting image is then sent to the server, which performs all measurements and calculates the mean, standard deviation, etc., of the variables. The results are sent immediately back to the client, where they are displayed in a separate frame along with the standard values for comparison. This system eliminates the need for every user to load other expensive programs on their own machine.

  2. Fractal and Lacunarity Analyses: Quantitative Characterization of Hierarchical Surface Topographies.

    PubMed

    Ling, Edwin J Y; Servio, Phillip; Kietzig, Anne-Marie

    2016-02-01

    Biomimetic hierarchical surface structures that exhibit features having multiple length scales have been used in many technological and engineering applications. Their surface topographies are most commonly analyzed using scanning electron microscopy (SEM), which only allows for qualitative visual assessments. Here we introduce fractal and lacunarity analyses as a method of characterizing the SEM images of hierarchical surface structures in a quantitative manner. Taking femtosecond laser-irradiated metals as an example, our results illustrate that, while the fractal dimension is a poor descriptor of surface complexity, lacunarity analysis can successfully quantify the spatial texture of an SEM image; this, in turn, provides a convenient means of reporting changes in surface topography with respect to changes in processing parameters. Furthermore, lacunarity plots are shown to be sensitive to the different length scales present within a hierarchical structure due to the reversal of lacunarity trends at specific magnifications where new features become resolvable. Finally, we have established a consistent method of detecting pattern sizes in an image from the oscillation of lacunarity plots. Therefore, we promote the adoption of lacunarity analysis as a powerful tool for quantitative characterization of, but not limited to, multi-scale hierarchical surface topographies.
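
    To make the gliding-box idea concrete, here is a minimal sketch (assuming the SEM image is available as a 2-D numeric array) in which lacunarity at box size r is the ratio of the second moment to the squared first moment of the box mass over all r x r gliding windows; this is a generic formulation, not the authors' exact pipeline.

      # Gliding-box lacunarity sketch for a 2-D image array
      import numpy as np

      def gliding_box_lacunarity(image, box_sizes=(2, 4, 8, 16, 32)):
          """Lacunarity at each box size r: <M^2> / <M>^2 over all r x r gliding boxes."""
          img = image.astype(float)
          # integral image lets each box "mass" (sum of pixel values) be read in O(1)
          c = np.pad(np.cumsum(np.cumsum(img, axis=0), axis=1), ((1, 0), (1, 0)))
          result = {}
          for r in box_sizes:
              masses = (c[r:, r:] - c[:-r, r:] - c[r:, :-r] + c[:-r, :-r]).ravel()
              result[r] = float(np.mean(masses ** 2) / np.mean(masses) ** 2)
          return result

      if __name__ == "__main__":
          demo = (np.random.default_rng(2).random((256, 256)) > 0.7).astype(float)
          print(gliding_box_lacunarity(demo))  # synthetic binary texture, not SEM data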

  3. Image Querying by Image Professionals.

    ERIC Educational Resources Information Center

    Jorgensen, Corinne; Jorgensen, Peter

    2003-01-01

    Reports the analysis of search logs from a commercial image provider over a one-month period and discusses results in relation to previous findings. Analyzes image searches, image queries composing the search, user search modification strategies, results returned, and user browsing of results. (Author/AEF)

  4. Prismatic analyser concept for neutron spectrometers

    SciTech Connect

    Birk, Jonas O.; Jacobsen, Johan; Hansen, Rasmus L.; Lefmann, Kim; Markó, Márton; Niedermayer, Christof; Freeman, Paul G.; Christensen, Niels B.; Månsson, Martin; Rønnow, Henrik M.

    2014-11-15

    Developments in modern neutron spectroscopy have led to typical sample sizes decreasing from a few cm to several mm in diameter. We demonstrate how small samples, together with the right choice of analyser and detector components, make distance collimation an important concept in crystal analyser spectrometers. We further show that this opens new possibilities where neutrons with different energies are reflected by the same analyser but counted in different detectors, thus improving both energy resolution and total count rate compared to conventional spectrometers. The technique can readily be combined with advanced focussing geometries and with multiplexing instrument designs. We present a combination of simulations and data showing three different energies simultaneously reflected from one analyser. Experiments were performed on a cold triple-axis instrument and on a prototype inverse-geometry time-of-flight spectrometer installed at PSI, Switzerland, and show excellent agreement with the predictions. Typical improvements will be 2.0 times finer resolution and a factor of 1.9 in flux gain compared to a focussing Rowland geometry, or 3.3 times finer resolution and a factor of 2.4 in flux gain compared to a single flat analyser slab.

  5. Prismatic analyser concept for neutron spectrometers.

    PubMed

    Birk, Jonas O; Markó, Márton; Freeman, Paul G; Jacobsen, Johan; Hansen, Rasmus L; Christensen, Niels B; Niedermayer, Christof; Månsson, Martin; Rønnow, Henrik M; Lefmann, Kim

    2014-11-01

    Developments in modern neutron spectroscopy have led to typical sample sizes decreasing from a few cm to several mm in diameter. We demonstrate how small samples, together with the right choice of analyser and detector components, make distance collimation an important concept in crystal analyser spectrometers. We further show that this opens new possibilities where neutrons with different energies are reflected by the same analyser but counted in different detectors, thus improving both energy resolution and total count rate compared to conventional spectrometers. The technique can readily be combined with advanced focussing geometries and with multiplexing instrument designs. We present a combination of simulations and data showing three different energies simultaneously reflected from one analyser. Experiments were performed on a cold triple-axis instrument and on a prototype inverse-geometry time-of-flight spectrometer installed at PSI, Switzerland, and show excellent agreement with the predictions. Typical improvements will be 2.0 times finer resolution and a factor of 1.9 in flux gain compared to a focussing Rowland geometry, or 3.3 times finer resolution and a factor of 2.4 in flux gain compared to a single flat analyser slab.

  6. Computer techniques used for some enhancements of ERTS images

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Goetz, A. F. H.

    1973-01-01

    The JPL VICAR image processing system has been used for the enhancement of images received from the ERTS for the Arizona geology mapping experiment. This system contains flexible capabilities for reading and repairing MSS digital tape images, for geometric corrections and interpicture registration, for various enhancements and analyses of the data, and for display of the images in black and white and color.

  7. Imaging Biomarkers or Biomarker Imaging?

    PubMed Central

    Mitterhauser, Markus; Wadsak, Wolfgang

    2014-01-01

    Since biomarker imaging is traditionally understood as imaging of molecular probes, we highly recommend to avoid any confusion with the previously defined term “imaging biomarkers” and, therefore, only use “molecular probe imaging (MPI)” in that context. Molecular probes (MPs) comprise all kinds of molecules administered to an organism which inherently carry a signalling moiety. This review highlights the basic concepts and differences of molecular probe imaging using specific biomarkers. In particular, PET radiopharmaceuticals are discussed in more detail. Specific radiochemical and radiopharmacological aspects as well as some legal issues are presented. PMID:24967536

  8. Geomagnetic local and regional harmonic analyses.

    USGS Publications Warehouse

    Alldredge, L.R.

    1982-01-01

    Procedures are developed for using rectangular and cylindrical harmonic analyses in local and regional areas. Both the linear least squares analysis, applicable when component data are available, and the nonlinear least squares analysis, applicable when only total field data are available, are treated. When component data are available, it is advantageous to work with residual fields obtained by subtracting components derived from a harmonic potential from the observed components. When only total field intensity data are available, they must be used directly. Residual values cannot be used. Cylindrical harmonic analyses are indicated when fields tend toward cylindrical symmetry; otherwise, rectangular harmonic analyses will be more advantageous. Examples illustrating each type of analysis are given.-Author

  9. Impact of ontology evolution on functional analyses.

    PubMed

    Groß, Anika; Hartung, Michael; Prüfer, Kay; Kelso, Janet; Rahm, Erhard

    2012-10-15

    Ontologies are used in the annotation and analysis of biological data. As knowledge accumulates, ontologies and annotation undergo constant modifications to reflect this new knowledge. These modifications may influence the results of statistical applications such as functional enrichment analyses that describe experimental data in terms of ontological groupings. Here, we investigate to what degree modifications of the Gene Ontology (GO) impact these statistical analyses for both experimental and simulated data. The analysis is based on new measures for the stability of result sets and considers different ontology and annotation changes. Our results show that past changes in the GO are non-uniformly distributed over different branches of the ontology. Considering the semantic relatedness of significant categories in analysis results allows a more realistic stability assessment for functional enrichment studies. We observe that the results of term-enrichment analyses tend to be surprisingly stable despite changes in ontology and annotation.
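
    The paper introduces its own stability measures; as a crude illustrative stand-in, the sketch below scores the overlap of the significant GO term sets obtained with two ontology/annotation versions using a Jaccard index (the GO identifiers are arbitrary examples, not results from the study).

      # Toy stability score for enrichment results across two ontology versions
      def jaccard_stability(terms_old: set, terms_new: set) -> float:
          """1.0 = identical significant-term sets, 0.0 = no overlap."""
          if not terms_old and not terms_new:
              return 1.0
          return len(terms_old & terms_new) / len(terms_old | terms_new)

      old_version = {"GO:0006955", "GO:0008283", "GO:0006915"}  # example term sets
      new_version = {"GO:0006955", "GO:0006915", "GO:0045087"}
      print(f"stability = {jaccard_stability(old_version, new_version):.2f}")  # -> 0.50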

  10. Proteomic Analyses of the Vitreous Humour

    PubMed Central

    Angi, Martina; Kalirai, Helen; Coupland, Sarah E.; Damato, Bertil E.; Semeraro, Francesco; Romano, Mario R.

    2012-01-01

    The human vitreous humour (VH) is a transparent, highly hydrated gel, which occupies the posterior segment of the eye between the lens and the retina. Physiological and pathological conditions of the retina are reflected in the protein composition of the VH, which can be sampled as part of routine surgical procedures. Historically, many studies have investigated levels of individual proteins in VH from healthy and diseased eyes. In the last decade, proteomics analyses have been performed to characterise the proteome of the human VH and explore networks of functionally related proteins, providing insight into the aetiology of diabetic retinopathy and proliferative vitreoretinopathy. Recent proteomic studies on the VH from animal models of autoimmune uveitis have identified new signalling pathways associated to autoimmune triggers and intravitreal inflammation. This paper aims to guide biological scientists through the different proteomic techniques that have been used to analyse the VH and present future perspectives for the study of intravitreal inflammation using proteomic analyses. PMID:22973072

  11. A qualitative method for analysing multivoicedness

    PubMed Central

    Aveling, Emma-Louise; Gillespie, Alex; Cornish, Flora

    2015-01-01

    ‘Multivoicedness’ and the ‘multivoiced Self’ have become important theoretical concepts guiding research. Drawing on the tradition of dialogism, the Self is conceptualised as being constituted by a multiplicity of dynamic, interacting voices. Despite the growth in literature and empirical research, there remains a paucity of established methodological tools for analysing the multivoiced Self using qualitative data. In this article, we set out a systematic, practical ‘how-to’ guide for analysing multivoicedness. Using theoretically derived tools, our three-step method comprises: identifying the voices of I-positions within the Self’s talk (or text), identifying the voices of ‘inner-Others’, and examining the dialogue and relationships between the different voices. We elaborate each step and illustrate our method using examples from a published paper in which data were analysed using this method. We conclude by offering more general principles for the use of the method and discussing potential applications. PMID:26664292

  12. NEUTRONICS ANALYSES FOR SNS TARGETS DEPOSITIONS

    SciTech Connect

    Popova, Irina I; Remec, Igor; Gallmeier, Franz X

    2016-01-01

    In order to deposit Spallation Neutron Source (SNS) spent facility components, which are replaced due to end-of-life radiation-induced material damage or burn-up, or because of mechanical failure or design improvements, waste classification analyses are being performed. These analyses include an accurate estimate of the radionuclide inventory, on the basis of which the components are classified and an appropriate container for transport and storage is determined. After the choice of container is made, transport calculations are performed for the facility component to be placed inside the container, ensuring compliance with waste management regulations. When necessary, additional shielding is added. Most of the effort is concentrated on target deposition, which normally takes place once or twice per year. Additionally, the second target station (STS) is in the process of being designed, and waste management analyses for the STS target are being developed to support a deposition plan.

  13. Analyses Reveal Record-Shattering Global Warm Temperatures in 2015

    NASA Image and Video Library

    2017-09-28

    2015 was the warmest year since modern record-keeping began in 1880, according to a new analysis by NASA’s Goddard Institute for Space Studies. The record-breaking year continues a long-term warming trend — 15 of the 16 warmest years on record have now occurred since 2001. Credits: Scientific Visualization Studio/Goddard Space Flight Center Details: Earth’s 2015 surface temperatures were the warmest since modern record keeping began in 1880, according to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA). Globally-averaged temperatures in 2015 shattered the previous mark set in 2014 by 0.23 degrees Fahrenheit (0.13 Celsius). Only once before, in 1998, has the new record been greater than the old record by this much. The 2015 temperatures continue a long-term warming trend, according to analyses by scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York (GISTEMP). NOAA scientists agreed with the finding that 2015 was the warmest year on record based on separate, independent analyses of the data. Because weather station locations and measurements change over time, there is some uncertainty in the individual values in the GISTEMP index. Taking this into account, NASA analysis estimates 2015 was the warmest year with 94 percent certainty. Read more: www.nasa.gov/press-release/nasa-noaa-analyses-reveal-reco...

  14. Advanced laser stratospheric monitoring systems analyses

    NASA Technical Reports Server (NTRS)

    Larsen, J. C.

    1984-01-01

    This report describes the software support supplied by Systems and Applied Sciences Corporation for the study of Advanced Laser Stratospheric Monitoring Systems Analyses under contract No. NAS1-15806. This report discusses improvements to the Langley spectroscopic data base, development of LHS instrument control software, and data analysis and validation software. The effect of diurnal variations on the retrieved concentrations of NO, NO2 and ClO from a space and balloon borne measurement platform is discussed, along with the selection of optimum IF channels for sensing stratospheric species from space.

  15. Analysing particulate deposition to plant canopies

    NASA Astrophysics Data System (ADS)

    Bache, D. H.

    Experimental measurements of the deposition of Lycopodium spores to a plant canopy were analysed to generate specific estimates of the relative significance of sedimentation, impaction and the effective foliage density fp. For the particular case analysed, impaction appeared to be the dominant trapping mechanism, and it was demonstrated that considerable aerodynamic shading was present. Using an estimate of fp, a consistent picture emerged in the behaviour of the canopy, both wet and dry, when tested against independent data on the trapping characteristics of individual elements. These conclusions differed significantly from those derived using a model in which impaction was neglected, which led to an apparent overestimate of fp.

  16. Image Processing

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Center for use on the Space Shuttle orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them and downlink images to ground-based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, the company could not be located, therefore contact/product information is no longer valid.

  17. Image Inpainting

    DTIC Science & Technology

    2005-01-01

    Image Inpainting. Marcelo Bertalmio and Guillermo Sapiro, Electrical and Computer Engineering, University of Minnesota; Vicent Caselles and Coloma Ballester, Escola Superior Politecnica, Universitat Pompeu Fabra. Abstract: Inpainting, the technique of modifying an image in an undetectable form, is as... removal/replacement of selected objects. In this paper, we introduce a novel algorithm for digital inpainting of still images that attempts to replicate the
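    The record above concerns digital inpainting of still images. As a hedged illustration of the general idea, and not the authors' specific algorithm, the sketch below uses OpenCV's built-in Navier-Stokes inpainting routine; the file names are placeholders.

```python
import cv2

# Load an image and a binary mask marking the region to reconstruct.
# "photo.png" and "mask.png" are placeholder file names.
image = cv2.imread("photo.png")
mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)

# Pixels where the mask is non-zero are treated as missing and filled in
# from the surrounding content (inpainting radius of 3 pixels).
restored = cv2.inpaint(image, mask, 3, cv2.INPAINT_NS)

cv2.imwrite("restored.png", restored)
```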

  18. Image barcodes

    NASA Astrophysics Data System (ADS)

    Damera-Venkata, Niranjan; Yen, Jonathan

    2003-01-01

    The visually significant two-dimensional barcode (VSB) developed by Shaked et al. is a method for designing an information-carrying two-dimensional barcode that has the appearance of a given graphical entity, such as a company logo. The encoding and decoding of information using the VSB uses a base image with very few graylevels (typically only two). This typically requires the image histogram to be bi-modal. For continuous-tone images such as digital photographs of individuals, the representation of tone or "shades of gray" is important to obtain a pleasing rendition of the face; in most cases, the VSB renders these images unrecognizable due to its inability to represent true gray-tone variations. This paper extends the concept of a VSB to an image bar code (IBC). We enable the encoding and subsequent decoding of information embedded in the hardcopy version of continuous-tone base images such as those acquired with a digital camera. The encoding-decoding process is modeled as robust data transmission through a noisy print-scan channel that is explicitly modeled. The IBC supports a high information capacity that differentiates it from common hardcopy watermarks. The reason for the improved image quality over the VSB is a joint encoding/halftoning strategy based on a modified version of block error diffusion. Encoder stability, image quality vs. information capacity tradeoffs, and decoding issues with and without explicit knowledge of the base image are discussed.
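    The IBC's halftoning component builds on error diffusion. As a hedged, generic illustration (standard Floyd-Steinberg error diffusion rather than the modified block error diffusion used in the paper), the following converts a grayscale image into a bilevel halftone while diffusing quantization error onto neighbouring pixels.

```python
import numpy as np

def floyd_steinberg_halftone(gray):
    """Binarize a grayscale image (floats in [0, 1]) with standard
    Floyd-Steinberg error diffusion."""
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new
            # Push the quantization error onto not-yet-processed neighbours.
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out

# Example: halftone a smooth synthetic gradient.
gradient = np.tile(np.linspace(0, 1, 64), (64, 1))
print(floyd_steinberg_halftone(gradient).mean())  # roughly 0.5, as expected
```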

  19. Body Imaging

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Magnetic Resonance Imaging (MRI) and Computer-aided Tomography (CT) images are often complementary. In most cases, MRI is good for viewing soft tissue but not bone, while CT images are good for bone but not always good for soft tissue discrimination. Physicians and engineers in the Department of Radiology at the University of Michigan Hospitals are developing a technique for combining the best features of MRI and CT scans to increase the accuracy of discriminating one type of body tissue from another. One of their research tools is a computer program called HICAP. The program can be used to distinguish between healthy and diseased tissue in body images.

  20. Ticks (image)

    MedlinePlus

    ... Lyme disease, Erlichiosis, Rocky Mountain Spotted Fever, Colorado Tick Fever, tularemia, typhus, hemorrhagic fever, and viral encephalitis. (Image courtesy of the Centers for Disease Control and Prevention.)

  1. Automated Quality Assurance of Online NIR Analysers

    PubMed Central

    Aaljoki, Kari

    2005-01-01

    Modern NIR analysers produce valuable data for closed-loop process control and optimisation practically in real time. Thus it is highly important to keep them in the best possible shape. Quality assurance (QA) of NIR analysers is an interesting and complex issue because it is not only the instrument and sample handling that has to be monitored. At the same time, validity of prediction models has to be assured. A system for fully automated QA of NIR analysers is described. The system takes care of collecting and organising spectra from various instruments, relevant laboratory, and process management system (PMS) data. Validation of spectra is based on simple diagnostics values derived from the spectra. Predictions are validated against laboratory (LIMS) or other online analyser results (collected from PMS). The system features automated alarming, reporting, trending, and charting functions for major key variables for easy visual inspection. Various textual and graphical reports are sent to maintenance people through email. The software was written with Borland Delphi 7 Enterprise. Oracle and PMS ODBC interfaces were used for accessing LIMS and PMS data using appropriate SQL queries. It will be shown that it is possible to take actions even before the quality of predictions is seriously affected, thus maximising the overall uptime of the instrument. PMID:18924628
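    The validation step described above compares online predictions with laboratory results. The sketch below is a hedged, simplified illustration of that idea in Python (the system described in the record was written in Delphi against Oracle/PMS interfaces); the data, thresholds and function name are placeholders.

```python
import numpy as np

def validate_predictions(nir_pred, lims_ref, bias_limit=0.5, rmsep_limit=1.0):
    """Compare online NIR predictions with laboratory (LIMS) reference values
    and flag the prediction model when bias or RMSEP exceeds its limit.
    The thresholds here are arbitrary placeholders."""
    nir_pred = np.asarray(nir_pred, dtype=float)
    lims_ref = np.asarray(lims_ref, dtype=float)
    residuals = nir_pred - lims_ref
    bias = residuals.mean()
    rmsep = np.sqrt((residuals ** 2).mean())
    alarms = []
    if abs(bias) > bias_limit:
        alarms.append(f"bias {bias:.3f} exceeds {bias_limit}")
    if rmsep > rmsep_limit:
        alarms.append(f"RMSEP {rmsep:.3f} exceeds {rmsep_limit}")
    return bias, rmsep, alarms

# Example with made-up paired results for one predicted property.
bias, rmsep, alarms = validate_predictions(
    [78.2, 77.9, 78.5, 79.1], [78.0, 78.1, 78.2, 78.6])
print(bias, rmsep, alarms)
```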

  2. Amino acid analyses of Apollo 14 samples.

    NASA Technical Reports Server (NTRS)

    Gehrke, C. W.; Zumwalt, R. W.; Kuo, K.; Aue, W. A.; Stalling, D. L.; Kvenvolden, K. A.; Ponnamperuma, C.

    1972-01-01

    Detection limits were between 300 pg and 1 ng for different amino acids, in an analysis by gas-liquid chromatography of water extracts from Apollo 14 lunar fines in which amino acids were converted to their N-trifluoroacetyl-n-butyl esters. Initial analyses of water and HCl extracts of samples 14240 and 14298 showed no amino acids above background levels.

  3. A Call for Conducting Multivariate Mixed Analyses

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.

    2016-01-01

    Several authors have written methodological works that provide an introductory- and/or intermediate-level guide to conducting mixed analyses. Although these works have been useful for beginning and emergent mixed researchers, with very few exceptions, works are lacking that describe and illustrate advanced-level mixed analysis approaches. Thus,…

  4. Multiphase Method for Analysing Online Discussions

    ERIC Educational Resources Information Center

    Häkkinen, P.

    2013-01-01

    Several studies have analysed and assessed online performance and discourse using quantitative and qualitative methods. Quantitative measures have typically included the analysis of participation rates and learning outcomes in terms of grades. Qualitative measures of postings, discussions and context features aim to give insights into the nature…

  5. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
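    A standard way to meet the 95%/95% criterion mentioned above is non-parametric order statistics (Wilks' formula). The sketch below, a minimal illustration rather than anything prescribed in the report, computes the smallest number of best-estimate code runs needed so that the largest sampled result bounds the 95th percentile with 95% confidence.

```python
import math

def wilks_first_order(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n random runs bounds the `coverage`
    quantile with probability `confidence` (one-sided, first-order Wilks
    formula): the least n with 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_first_order())  # 59 runs for the classical 95%/95% criterion
```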

  6. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses § 61... characteristics and design features in isolating and segregating the wastes. The analyses must clearly demonstrate... inadvertent intrusion must include demonstration that there is reasonable assurance the waste...

  7. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses § 61... characteristics and design features in isolating and segregating the wastes. The analyses must clearly demonstrate... inadvertent intrusion must include demonstration that there is reasonable assurance the waste...

  8. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses § 61... characteristics and design features in isolating and segregating the wastes. The analyses must clearly demonstrate... inadvertent intrusion must include demonstration that there is reasonable assurance the waste...

  9. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses § 61... characteristics and design features in isolating and segregating the wastes. The analyses must clearly demonstrate... inadvertent intrusion must include demonstration that there is reasonable assurance the waste...

  10. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses § 61... characteristics and design features in isolating and segregating the wastes. The analyses must clearly demonstrate... inadvertent intrusion must include demonstration that there is reasonable assurance the waste...

  11. Multivariate And Phylogenetic Analyses Of Galaxies

    NASA Astrophysics Data System (ADS)

    Fraix-Burnet, Didier; Chattopadhyay, Tanuka; D'Onofrio, Mauro; Marziani, Paula; Mondal, Saptarshi

    2017-06-01

    Investigating the formation and evolution of galaxies is becoming a complicated process with the increased availability of huge databases as a result of instrumental improvements. In this poster we present preliminary results on two statistical studies using multivariate partitioning and cladistic analyses to find homogeneous groups and their evolutionary relationships.

  12. Challenges and Opportunities in Analysing Students Modelling

    ERIC Educational Resources Information Center

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-01-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…

  13. Correlation Functions Aid Analyses Of Spectra

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Norton, Robert H., Jr.

    1989-01-01

    New uses have been found for correlation functions in analyses of spectra. In an approach combining elements of both pattern-recognition and traditional spectral-analysis techniques, spectral lines are identified in data that at first glance appear useless because they are dominated by noise. The new approach is particularly useful for measuring concentrations of rare molecular species in the atmosphere.
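    As a hedged illustration of the underlying idea (not the specific NASA technique), the sketch below cross-correlates a noisy synthetic spectrum with a known line template to locate a weak line whose amplitude is comparable to the noise.

```python
import numpy as np

# Synthetic spectrum: a weak Gaussian line at x = 3 buried in noise of
# comparable amplitude. All numbers are illustrative.
rng = np.random.default_rng(0)
x = np.linspace(-10, 10, 2001)
template = np.exp(-0.5 * (x / 0.2) ** 2)            # line template centred at x = 0
true_position = 3.0
spectrum = 0.05 * np.exp(-0.5 * ((x - true_position) / 0.2) ** 2)
spectrum += rng.normal(scale=0.05, size=x.size)

# Cross-correlate and take the lag with the strongest response. Because the
# template is centred at x = 0, the peak index maps directly to the line position.
corr = np.correlate(spectrum - spectrum.mean(),
                    template - template.mean(), mode="same")
estimated_position = x[np.argmax(corr)]
print(f"estimated line position: {estimated_position:.2f}")  # close to 3.00
```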

  14. Analysing Simple Electric Motors in the Classroom

    ERIC Educational Resources Information Center

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  15. Cosmetology: Task Analyses. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  16. Functional Analyses and Treatment of Precursor Behavior

    ERIC Educational Resources Information Center

    Najdowski, Adel C.; Wallace, Michele D.; Ellsworth, Carrie L.; MacAleese, Alicia N.; Cleveland, Jackie

    2008-01-01

    Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe…

  17. Analysing Simple Electric Motors in the Classroom

    ERIC Educational Resources Information Center

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  18. Amino acid analyses of Apollo 14 samples.

    NASA Technical Reports Server (NTRS)

    Gehrke, C. W.; Zumwalt, R. W.; Kuo, K.; Aue, W. A.; Stalling, D. L.; Kvenvolden, K. A.; Ponnamperuma, C.

    1972-01-01

    Detection limits were between 300 pg and 1 ng for different amino acids, in an analysis by gas-liquid chromatography of water extracts from Apollo 14 lunar fines in which amino acids were converted to their N-trifluoroacetyl-n-butyl esters. Initial analyses of water and HCl extracts of samples 14240 and 14298 showed no amino acids above background levels.

  19. The Economic Cost of Homosexuality: Multilevel Analyses

    ERIC Educational Resources Information Center

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  20. The Economic Cost of Homosexuality: Multilevel Analyses

    ERIC Educational Resources Information Center

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  1. Chemical Analyses of Silicon Aerogel Samples

    SciTech Connect

    van der Werf, I.; Palmisano, F.; De Leo, Raffaele; Marrone, Stefano

    2008-04-01

    After five years of operation, two aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab suffered a loss of performance. In this note, possible causes of the degradation have been studied. In particular, various chemical and physical analyses have been carried out on several aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  2. Written Case Analyses and Critical Reflection.

    ERIC Educational Resources Information Center

    Harrington, Helen L.; And Others

    1996-01-01

    The study investigated the use of case-based pedagogy to develop critical reflection in prospective teachers. Analysis of students' written analyses of dilemma-based cases found patterns showing evidence of students' open-mindedness, sense of professional responsibility, and wholeheartedness in their approach to teaching. (DB)

  3. Cosmetology: Task Analyses. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  4. Automated Quality Assurance of Online NIR Analysers.

    PubMed

    Aaljoki, Kari

    2005-01-01

    Modern NIR analysers produce valuable data for closed-loop process control and optimisation practically in real time. Thus it is highly important to keep them in the best possible shape. Quality assurance (QA) of NIR analysers is an interesting and complex issue because it is not only the instrument and sample handling that has to be monitored. At the same time, validity of prediction models has to be assured. A system for fully automated QA of NIR analysers is described. The system takes care of collecting and organising spectra from various instruments, relevant laboratory, and process management system (PMS) data. Validation of spectra is based on simple diagnostics values derived from the spectra. Predictions are validated against laboratory (LIMS) or other online analyser results (collected from PMS). The system features automated alarming, reporting, trending, and charting functions for major key variables for easy visual inspection. Various textual and graphical reports are sent to maintenance people through email. The software was written with Borland Delphi 7 Enterprise. Oracle and PMS ODBC interfaces were used for accessing LIMS and PMS data using appropriate SQL queries. It will be shown that it is possible to take actions even before the quality of predictions is seriously affected, thus maximising the overall uptime of the instrument.

  5. Multispectral imaging and image processing

    NASA Astrophysics Data System (ADS)

    Klein, Julie

    2014-02-01

    The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview will be given over several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by filters and camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of multispectral images. Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics like stereo multispectral imaging and goniometric multispectral measurements that are further explored with his expertise will also be presented in this work.
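    One core step mentioned above is reconstructing a spectrum at each pixel from a handful of filtered measurements. The following is a hedged sketch of that kind of reconstruction via regularized least squares; the filter shapes, sampling grid and regularization weight are invented for illustration and are not the Aachen camera's actual characteristics.

```python
import numpy as np

# Invented sensitivity matrix: 7 Gaussian bandpass filters sampling 400-700 nm.
wavelengths = np.arange(400, 701, 10)              # 31 spectral samples
centers = np.linspace(420, 680, 7)                 # 7 filter centre wavelengths
S = np.exp(-0.5 * ((wavelengths[None, :] - centers[:, None]) / 25.0) ** 2)

true_spectrum = 0.4 + 0.3 * np.sin(wavelengths / 60.0)   # made-up reflectance
responses = S @ true_spectrum                             # simulated camera readings

# Tikhonov-regularized least squares: 7 measurements under-determine the
# 31 unknowns, so an energy penalty stabilizes the spectral estimate.
lam = 1e-2
A = S.T @ S + lam * np.eye(S.shape[1])
estimate = np.linalg.solve(A, S.T @ responses)
print(np.round(estimate[:5], 3))
```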

  6. JPEG2000 Image Compression on Solar EUV Images

    NASA Astrophysics Data System (ADS)

    Fischer, Catherine E.; Müller, Daniel; De Moortel, Ineke

    2017-01-01

    For future solar missions as well as ground-based telescopes, efficient ways to return and process data have become increasingly important. Solar Orbiter, which is the next ESA/NASA mission to explore the Sun and the heliosphere, is a deep-space mission, which implies a limited telemetry rate that makes efficient onboard data compression a necessity to achieve the mission science goals. Missions like the Solar Dynamics Observatory (SDO) and future ground-based telescopes such as the Daniel K. Inouye Solar Telescope, on the other hand, face the challenge of making petabyte-sized solar data archives accessible to the solar community. New image compression standards address these challenges by implementing efficient and flexible compression algorithms that can be tailored to user requirements. We analyse solar images from the Atmospheric Imaging Assembly (AIA) instrument onboard SDO to study the effect of lossy JPEG2000 (from the Joint Photographic Experts Group 2000) image compression at different bitrates. To assess the quality of compressed images, we use the mean structural similarity (MSSIM) index as well as the widely used peak signal-to-noise ratio (PSNR) as metrics and compare the two in the context of solar EUV images. In addition, we perform tests to validate the scientific use of the lossily compressed images by analysing examples of an on-disc and off-limb coronal-loop oscillation time-series observed by AIA/SDO.
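    A minimal sketch (not the authors' pipeline) of how the two quality metrics named above can be computed for an original and a lossily compressed image, using scikit-image; the file names are placeholders and the images are assumed to be single-channel.

```python
from skimage import io
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Placeholder file names for an original image and its compressed version,
# read as grayscale floats in [0, 1].
original = io.imread("aia_original.png", as_gray=True)
compressed = io.imread("aia_compressed.png", as_gray=True)

data_range = original.max() - original.min()
psnr = peak_signal_noise_ratio(original, compressed, data_range=data_range)
# structural_similarity returns the mean SSIM (MSSIM) over local windows.
mssim = structural_similarity(original, compressed, data_range=data_range)

print(f"PSNR  = {psnr:.2f} dB")
print(f"MSSIM = {mssim:.4f}")
```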

  7. Positioning the image of AIDS.

    PubMed

    Cooter, Roger; Stein, Claudia

    2010-03-01

    AIDS posters can be treated as material objects whose production, distribution and consumption varied across time and place. It is also possible to reconstruct and analyse the public health discourse at the time these powerful images appeared. More recently, however, these conventional historical approaches have been challenged by projects in literary and art criticism. Here, images of AIDS are considered in terms of their function in and for a new discursive regime of power centred on the human body and its visualization. How images of AIDS came to be understood in Western culture in relation to wider political and economic conditions redefines the historical task.

  8. Airbags to Martian Landers: Analyses at Sandia National Laboratories

    SciTech Connect

    Gwinn, K.W.

    1994-03-01

    A new direction for the national laboratories is to assist US business with research and development, primarily through cooperative research and development agreements (CRADAs). Technology transfer to the private sector has been very successful, as over 200 CRADAs are in place at Sandia. Because of these cooperative efforts, technology has evolved into some new areas not commonly associated with the former mission of the national laboratories. An example of this is the analysis of fabric structures. Explicit analyses and expertise in constructing parachutes led to the development of a next-generation automobile airbag; this led to the construction, testing, and analysis of the Jet Propulsion Laboratory Mars Environmental Survey Lander; and finally to the development of CAD-based custom garment designs using 3D scanned images of the human body. The structural analysis of these fabric structures is described, as well as a more traditional Sandia example: the test/analysis correlation of the impact of a weapon container.

  9. Implementation of an analyser crystal method for x-ray diffraction tomography

    NASA Astrophysics Data System (ADS)

    Kewish, C. M.; Davis, J. R.; Nikulin, A. Y.; Benci, A.; Pavlov, K. M.; Morgan, M. J.; Hester, J.

    2001-04-01

    In this paper we present an experimental method which is well suited to synchrotron x-ray diffraction tomography. The experimental technique utilizes an analyser crystal placed in the diffracted x-ray beam to selectively detect diffracted photons at the desired scattering angle. This arrangement selects the prominent diffraction feature of a specimen component, allowing the spatial distribution of that component to be specifically and quantitatively imaged. Images of the spatial variation of the differential x-ray scattering cross section (reflectivity) in a cross sectional plane of interest are reconstructed using an iterative algebraic technique and an error functional minimization. Reconstructed images of a well characterized phantom show that material features can be clearly delineated. Images based on the spatial variation of x-ray linear attenuation coefficients and differential coherent scattering cross sections have been reconstructed with approximately 125 µm spatial resolution in the imaging plane.
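    The reconstruction step above uses an iterative algebraic technique. The following is a hedged, generic sketch of such a scheme (a Kaczmarz-style ART update for a linear projection model), not the authors' exact algorithm or error functional; the projection matrix and ray geometry are toy placeholders.

```python
import numpy as np

def art_reconstruct(A, b, n_iter=200, relax=0.5):
    """Kaczmarz-style algebraic reconstruction: A is the projection matrix
    (rays x pixels), b the measured projections; the image estimate x is
    updated one ray equation at a time."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            a = A[i]
            denom = a @ a
            if denom == 0:
                continue
            x += relax * (b[i] - a @ x) / denom * a
    return x

# Tiny made-up example: a 2x2 image probed by 4 rays (two rows, two columns).
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1]], dtype=float)
true_image = np.array([1.0, 2.0, 3.0, 4.0])
b = A @ true_image
# The system is consistent, so the iteration recovers the original values.
print(np.round(art_reconstruct(A, b), 2))
```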

  10. Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.

    PubMed

    Koprowski, Robert

    2015-11-01

    The paper presents problems and solutions related to hyperspectral image pre-processing. New methods of preliminary image analysis are proposed. The paper shows problems occurring in Matlab when trying to analyse this type of image. Moreover, new methods are discussed which provide the source code in Matlab that can be used in practice without any licensing restrictions. The proposed application and a sample result of hyperspectral image analysis are also presented.

  11. Imaging Genetics

    ERIC Educational Resources Information Center

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  12. Photoacoustic Imaging.

    DTIC Science & Technology

    1983-12-01

    Contents (excerpt): Section 5, Imaging with a Diode Laser as the Optical Source; Section 6, High Resolution Acousto-Optic Laser Probe (6-1, Introduction). From Section 5: We have recently imaged photoacoustically, with micron resolution, using a 5 milliwatt diode laser as the optical source. This demonstration is an indication of the tremendous sensitivity that we...

  13. Imaging Genetics

    ERIC Educational Resources Information Center

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  14. Image fusion

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.

  15. Imaging Atherosclerosis

    PubMed Central

    Tarkin, Jason M.; Dweck, Marc R.; Evans, Nicholas R.; Takx, Richard A.P.; Brown, Adam J.; Tawakol, Ahmed; Fayad, Zahi A.

    2016-01-01

    Advances in atherosclerosis imaging technology and research have provided a range of diagnostic tools to characterize high-risk plaque in vivo; however, these important vascular imaging methods additionally promise great scientific and translational applications beyond this quest. When combined with conventional anatomic- and hemodynamic-based assessments of disease severity, cross-sectional multimodal imaging incorporating molecular probes and other novel noninvasive techniques can add detailed interrogation of plaque composition, activity, and overall disease burden. In the catheterization laboratory, intravascular imaging provides unparalleled access to the world beneath the plaque surface, allowing tissue characterization and measurement of cap thickness with micrometer spatial resolution. Atherosclerosis imaging captures key data that reveal snapshots into underlying biology, which can test our understanding of fundamental research questions and shape our approach toward patient management. Imaging can also be used to quantify response to therapeutic interventions and ultimately help predict cardiovascular risk. Although there are undeniable barriers to clinical translation, many of these hold-ups might soon be surpassed by rapidly evolving innovations to improve image acquisition, coregistration, motion correction, and reduce radiation exposure. This article provides a comprehensive review of current and experimental atherosclerosis imaging methods and their uses in research and potential for translation to the clinic. PMID:26892971

  16. Diagnostic Imaging

    MedlinePlus

    Diagnostic imaging lets doctors look inside your body for clues about a medical condition. A variety of machines and techniques can create pictures of the structures and activities inside your body. The type of imaging your doctor uses depends on your symptoms and ...

  17. Retinal Imaging and Image Analysis

    PubMed Central

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:21743764

  18. Passive adaptive imaging through turbulence

    NASA Astrophysics Data System (ADS)

    Tofsted, David

    2016-05-01

    Standard methods for improved imaging system performance under degrading optical turbulence conditions typically involve active adaptive techniques or post-capture image processing. Here, passive adaptive methods are considered where active sources are disallowed, a priori. Theoretical analyses of short-exposure turbulence impacts indicate that varying aperture sizes experience different degrees of turbulence impacts. Smaller apertures often outperform larger aperture systems as turbulence strength increases. This suggests a controllable aperture system is advantageous. In addition, sub-aperture sampling of a set of training images permits the system to sense tilts in different sub-aperture regions through image acquisition and image cross-correlation calculations. A four sub-aperture pattern supports corrections involving five realizable operating modes (beyond tip and tilt) for removing aberrations over an annular pattern. Progress to date will be discussed regarding development and field trials of a prototype system.
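    The tilt-sensing step described above relies on image cross-correlation between sub-aperture images. Below is a hedged illustration using FFT-based phase correlation to estimate the relative shift between two frames; this is the generic technique, not the prototype's actual processing chain, and the synthetic "scene" is a placeholder.

```python
import numpy as np

def phase_correlation_shift(ref, img):
    """Estimate the integer-pixel translation d such that img is approximately
    ref rolled by d, using FFT-based phase correlation."""
    cross_power = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past half the frame size correspond to negative shifts (wrap-around).
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic check: shift a random "sub-aperture frame" by (3, -5) pixels and recover it.
rng = np.random.default_rng(1)
scene = rng.random((128, 128))
shifted = np.roll(scene, shift=(3, -5), axis=(0, 1))
print(phase_correlation_shift(scene, shifted))  # expected: (3, -5)
```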

  19. Quantum Multi-Image Encryption Based on Iteration Arnold Transform with Parameters and Image Correlation Decomposition

    NASA Astrophysics Data System (ADS)

    Hu, Yiqun; Xie, Xinwen; Liu, Xingbin; Zhou, Nanrun

    2017-07-01

    A novel quantum multi-image encryption algorithm based on iteration Arnold transform with parameters and image correlation decomposition is proposed, and a quantum realization of the iteration Arnold transform with parameters is designed. The corresponding low frequency images are obtained by performing 2-D discrete wavelet transform on each image respectively, and then the corresponding low frequency images are spliced randomly to one image. The new image is scrambled by the iteration Arnold transform with parameters, and the gray-level information of the scrambled image is encoded by quantum image correlation decomposition. For the encryption algorithm, the keys are iterative times, added parameters, classical binary and orthonormal basis states. The key space, the security and the computational complexity are analyzed, and all of the analyses show that the proposed encryption algorithm could encrypt multiple images simultaneously with lower computational complexity compared with its classical counterparts.
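    For orientation, the classical scrambling step named above can be illustrated with one common parameterization of the Arnold transform; the sketch below is a plain classical implementation for square images, not the quantum circuit realization constructed in the paper, and the parameter names a and b are generic.

```python
import numpy as np

def arnold_scramble(img, a=1, b=1, iterations=5):
    """Scramble a square image with an iterated, parameterized Arnold map:
    (x, y) -> (x + a*y, b*x + (a*b + 1)*y) mod N. The matrix has determinant 1,
    so the map is a bijection on the pixel grid and is invertible for decryption."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "the Arnold map needs a square image"
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nx = (x + a * y) % n
                ny = (b * x + (a * b + 1) * y) % n
                nxt[nx, ny] = out[x, y]
        out = nxt
    return out

# Example: scramble a small test pattern with parameters a=2, b=3.
pattern = np.arange(64, dtype=np.uint8).reshape(8, 8)
print(arnold_scramble(pattern, a=2, b=3, iterations=4))
```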

  20. Sensitivity in risk analyses with uncertain numbers.

    SciTech Connect

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a ''pinching'' strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
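    As a hedged, generic illustration of the "pinching" idea described above (not the report's probability-bounds or Dempster-Shafer machinery), the sketch below propagates a toy model with one epistemic interval and one aleatory variable, then repeats the propagation with the epistemic input pinched to a single value and reports how much the spread of the output percentile shrinks.

```python
import numpy as np

rng = np.random.default_rng(42)

def output_range(epistemic_interval, n_mc=10_000, n_epistemic=50):
    """Crude two-loop sketch: for each candidate value of the epistemic
    parameter, propagate aleatory uncertainty by Monte Carlo and record
    the spread of the output's 95th percentile. Model and numbers are toys."""
    lo, hi = epistemic_interval
    percentiles = []
    for theta in np.linspace(lo, hi, n_epistemic):
        aleatory = rng.normal(loc=1.0, scale=0.2, size=n_mc)   # made-up variable
        response = theta * aleatory ** 2                        # made-up model
        percentiles.append(np.percentile(response, 95))
    return min(percentiles), max(percentiles)

baseline = output_range((0.8, 1.2))
pinched = output_range((1.0, 1.0))   # "pinch" the epistemic input to a point value
spread_base = baseline[1] - baseline[0]
spread_pinched = pinched[1] - pinched[0]
print("baseline spread:", round(spread_base, 4))
print("pinched spread: ", round(spread_pinched, 4))
print(f"uncertainty reduction from pinching: {1 - spread_pinched / spread_base:.0%}")
```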

  1. Reliability of chemical analyses of water samples

    SciTech Connect

    Beardon, R.

    1989-11-01

    Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of on-going monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.

  2. Neuronal network analyses: premises, promises and uncertainties

    PubMed Central

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the difficulties of understanding network function. Nevertheless, in more complex systems (including human), claims are made that the cellular bases of behaviour are, or will shortly be, understood. While the discussion is necessarily limited, this issue will examine these claims and highlight some traditional and novel aspects of network analyses and their difficulties. This introduction discusses the criteria that need to be satisfied for network understanding, and how they relate to traditional and novel approaches being applied to addressing network function. PMID:20603354

  3. Quantitative Analyse und Visualisierung der Herzfunktionen

    NASA Astrophysics Data System (ADS)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard in cardiology. The available products usually require a high degree of user interaction and therefore considerable time. This work presents an approach that provides the cardiologist with a largely automatic analysis of cardiac function from MRI image data, thereby saving time. All relevant cardio-physiological parameters are computed and visualized by means of diagrams and graphs. These computations are evaluated by comparing the derived values with manually measured ones. The mean error computed in this comparison, 2.85 mm for wall thickness and 1.61 mm for wall thickening, still lies within one pixel size of the images used.

  4. Using ENSO to analyse Cloud Radiative Feedback

    NASA Astrophysics Data System (ADS)

    Kolly, Allison; Huang, Yi

    2017-04-01

    When attempting to diagnose the climate sensitivity, clouds are the cause of much uncertainty as they are highly variable. There exists a discrepancy between climate models and observations on the sign and magnitude of cloud radiative feedback. For example, Dessler (2013) shows that models predict a very strong, positive feedback response to ENSO sea surface temperature anomalies in the central Pacific which is not present in observations. To better understand these discrepancies we are using radiation data from the CERES satellite and ERAi reanalysis data to look at the most recent El Nino events. By looking at temperature and humidity anomalies in the central Pacific which are associated with these events, and using radiative kernels, we can calculate their radiative effects. We extend previous work by not only performing an analysis of TOA but also analysing the surface and atmospheric radiation budgets. Additionally we analyse the latest GCMs (e.g. CMIP5 models) and compare them to observations.

  5. Center for Naval Analyses Annual Report 1982.

    DTIC Science & Technology

    1982-01-01

    recipients, alike, was due mainly to temporary rather than permanent layoffs; they were unemployed for about the same length of time, and their post-layoff... equal to 70 percent of average weekly wages for 52 weeks in the two years following layoff) apparently encouraged workers to remain unemployed longer... Institute for Defense Analyses. William A. Nierenberg, Director of the Scripps Institution of Oceanography. Member, NASA Advisory Council. Member

  6. Inelastic and Dynamic Fracture and Stress Analyses

    NASA Technical Reports Server (NTRS)

    Atluri, S. N.

    1984-01-01

    Large deformation inelastic stress analysis and inelastic and dynamic crack propagation research work is summarized. The salient topics of interest in engine structure analysis that are discussed herein include: (1) a path-independent integral (T) in inelastic fracture mechanics, (2) analysis of dynamic crack propagation, (3) generalization of constitutive relations of inelasticity for finite deformations, (4) complementary energy approaches in inelastic analyses, and (5) objectivity of time integration schemes in inelastic stress analysis.

  7. [Clinical research=design*measurements*statistical analyses].

    PubMed

    Furukawa, Toshiaki

    2012-06-01

    A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question. Formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study one must have a knowledge of study design, measurements and statistical analyses: The first is taught by epidemiology, the second by psychometrics and the third by biostatistics.

  8. [Resistance analyses for recirculated membrane bioreactor].

    PubMed

    Yang, Qi; Huang, Xia; Shang, Hai-Tao; Wen, Xiang-Hua; Qian, Yi

    2006-11-01

    Resistance analyses for a recirculated membrane bioreactor, using the resistance-in-series model and the modified gel-polarization model respectively, were extended to a turbulent ultrafiltration system. Experiments were carried out with dye wastewater in a tubular membrane module, and it was found that the permeate fluxes are predicted well by these models for turbulent systems. The resistance caused by concentration polarization was also studied; the gel layer resistance was found to be the most important of all the resistances.
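    For reference, the resistance-in-series model named above is commonly written in the following textbook form (generic notation, not necessarily the symbols used in this paper), where J is the permeate flux, ΔP the transmembrane pressure, μ the permeate viscosity, and R_m, R_g and R_f the membrane, gel/polarization-layer and fouling resistances:

```latex
J = \frac{\Delta P}{\mu \, R_t}, \qquad R_t = R_m + R_g + R_f
```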

  9. Image of the Singapore Child

    ERIC Educational Resources Information Center

    Ebbeck, Marjory; Warrier, Sheela

    2008-01-01

    The purpose of this study was to analyse the contents of one of the leading newspapers of Singapore in an effort to identify the public image of the children of the nation. Newspaper clippings of news/articles, pictures/photographs and advertisements featuring children below 15 years of age were collected over a one-week period and the content…

  10. Radiation effects on video imagers

    NASA Astrophysics Data System (ADS)

    Yates, G. J.; Bujnosek, J. J.; Jaramillo, S. A.; Walton, R. B.; Martinez, T. M.

    1986-02-01

    Radiation sensitivity of several photoconductive, photoemissive, and solid-state silicon-based video imagers was measured by analysing stored photo-charge induced by irradiation with continuous and pulsed sources of high-energy photons and neutrons. Transient effects as functions of absorbed dose, dose rate, fluences, and ionizing particle energy are presented.

  11. Studying Distant Galaxies: a Handbook of Methods and Analyses

    NASA Astrophysics Data System (ADS)

    Hammer, François; Puech, Mathieu; Flores, Hector; Rodrigues, Myriam

    Distant galaxies encapsulate the various stages of galaxy evolution and formation from over 95% of the development of the universe. As recently as twenty-five years ago, little was known about them; however, since the first systematic survey was completed in the 1990s, increasing amounts of resources have been devoted to their discovery and research. This book summarises for the first time the numerous techniques used for observing, analysing, and understanding the evolution and formation of these distant galaxies. In this rapidly expanding research field, this text is an everyday companion handbook for graduate students and active researchers. It provides guidelines in sample selection, imaging, integrated spectroscopy and 3D spectroscopy, which help to avoid the numerous pitfalls of observational and analysis techniques in use in extragalactic astronomy. It also paves the way for establishing relations between fundamental properties of distant galaxies. At each step, the reader is assisted with numerous practical examples and ready-to-use methodology to help understand and analyse research.

  12. Quantitative analyses of maxillary sinus using computed tomography.

    PubMed

    Perella, Andréia; Rocha, Sara Dos Santos; Cavalcanti, Marcelo de Gusmão Paraiso

    2003-09-01

    The aim of this study was to evaluate the precision and accuracy of linear measurements of the maxillary sinus made on tomographic films, by comparison with 3D reconstructed images. Linear measurements of both maxillary sinuses were made on computed tomography (CT) scans of 17 patients, with or without lesions, by two calibrated examiners independently, on two occasions, with a single manual caliper. A third examiner made the same measurements electronically on the 3D-CT reconstructions. The statistical analysis was performed using ANOVA (analysis of variance). The intra-observer percentage error was small in cases both with and without lesions, ranging from 1.14% to 1.82%. The inter-observer error was slightly higher, reaching 2.08%. The accuracy error was higher still, and the percentage accuracy error was greater in samples with lesions than in those without. CT provided adequate precision and accuracy for maxillary sinus analyses. The precision in cases with lesions was inferior to that in cases without lesions, but this does not affect the efficacy of the method.

  13. Optimizing header strength utilizing finite element analyses

    NASA Astrophysics Data System (ADS)

    Burchett, S. N.

    Finite element techniques have been successfully applied as a design tool in the optimization of high strength headers for pyrotechnic-driven actuators. These techniques have been applied to three aspects of the design process of a high strength header. The design process was a joint effort of experts from several disciplines including design engineers, material scientists, test engineers, manufacturing engineers, and structural analysts. Following material selection, finite element techniques were applied to evaluate the residual stresses due to manufacturing which were developed in the high strength glass ceramic-to-metal seal headers. Results from these finite element analyses were used to identify header designs which were manufacturable and had a minimum residual stress state. Finite element techniques were then applied to obtain the response of the header due to pyrotechnic burn. The results provided realistic upper bounds on the pressure containment ability of various preliminary header designs and provided a quick and inexpensive method of strengthening and refining the designs. Since testing of the headers was difficult and sometimes destructive, results of the analyses were also used to interpret test results and identify failure modes. In this paper, details of the finite element techniques including the models used, material properties, material failure models, and loading will be presented. Results from the analyses showing the header failure process will also be presented. This paper will show that significant gains in capability and understanding can result when finite element techniques are included as an integral part of the design process of complicated high strength headers.

  14. TOGGLE: toolbox for generic NGS analyses.

    PubMed

    Monat, Cécile; Tranchant-Dubreuil, Christine; Kougbeadjo, Ayité; Farcy, Cédric; Ortega-Abboud, Enrique; Amanzougarene, Souhila; Ravel, Sébastien; Agbessi, Mawussé; Orjuela-Bouniol, Julie; Summo, Maryline; Sabot, François

    2015-11-09

    The explosion of NGS (Next Generation Sequencing) sequence data requires a huge effort in bioinformatics methods and analyses. The creation of dedicated, robust and reliable pipelines able to handle dozens of samples from raw FASTQ data to relevant biological data is a time-consuming task in all projects relying on NGS. To address this, we created a generic and modular toolbox for developing such pipelines. TOGGLE (TOolbox for Generic nGs anaLysEs) is a suite of tools able to design pipelines that manage large sets of NGS software tools and utilities. Moreover, TOGGLE offers an easy way to manipulate the various options of the different tools through the pipelines by using a single basic configuration file, which can be changed for each assay without having to change the code itself. We also describe one implementation of TOGGLE in a complete analysis pipeline designed for SNP discovery for large sets of genomic data, ready to use in different environments (from a single machine to HPC clusters). TOGGLE speeds up the creation of robust pipelines with reliable log tracking and data flow, for a large range of analyses. Moreover, it enables biologists to concentrate on the biological relevance of results and to change the experimental conditions easily. The whole code and test data are available at https://github.com/SouthGreenPlatform/TOGGLE.

  15. Evaluation of the Technicon Axon analyser.

    PubMed

    Martínez, C; Márquez, M; Cortés, M; Mercé, J; Rodriguez, J; González, F

    1990-01-01

    An evaluation of the Technicon Axon analyser was carried out following the guidelines of the 'Sociedad Española de Química Clínica' and the European Committee for Clinical Laboratory Standards. A photometric study revealed acceptable results at both 340 nm and 404 nm. Inaccuracy and imprecision were lower at 404 nm than at 340 nm, although poor dispersion was found at both wavelengths, even at low absorbances. Drift was negligible, the imprecision of the sample pipette delivery system was greater for small sample volumes, the reagent pipette delivery system imprecision was acceptable and the sample diluting system study showed good precision and accuracy. Twelve analytes were studied for evaluation of the analyser under routine working conditions. Satisfactory results were obtained for within-run imprecision, while coefficients of variation for between-run imprecision were much greater than expected. Neither specimen-related nor specimen-independent contamination was found in the carry-over study. For all analytes assayed, when comparing patient sample results with those obtained on a Hitachi 737 analyser, acceptable relative inaccuracy was observed.

  16. HASE: Framework for efficient high-dimensional association analyses

    PubMed Central

    Roshchupkin, G. V.; Adams, H. H. H.; Vernooij, M. W.; Hofman, A.; Van Duijn, C. M.; Ikram, M. A.; Niessen, W. J.

    2016-01-01

    High-throughput technology can now provide rich information on a person’s biological makeup and environmental surroundings. Important discoveries have been made by relating these data to various health outcomes in fields such as genomics, proteomics, and medical imaging. However, cross-investigations between several high-throughput technologies remain impractical due to demanding computational requirements (hundreds of years of computing resources) and unsuitability for collaborative settings (terabytes of data to share). Here we introduce the HASE framework that overcomes both of these issues. Our approach dramatically reduces computational time from years to only hours and also requires several gigabytes to be exchanged between collaborators. We implemented a novel meta-analytical method that yields identical power as pooled analyses without the need of sharing individual participant data. The efficiency of the framework is illustrated by associating 9 million genetic variants with 1.5 million brain imaging voxels in three cohorts (total N = 4,034) followed by meta-analysis, on a standard computational infrastructure. These experiments indicate that HASE facilitates high-dimensional association studies enabling large multicenter association studies for future discoveries. PMID:27782180
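    The core computational trick behind association studies of this scale is computing all variant-by-voxel regressions in bulk with matrix algebra rather than looping over pairs. The sketch below is a hedged illustration of that idea on standardized data, not HASE's actual implementation or meta-analytical method; the dimensions and data are tiny random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_snps, n_voxels = 500, 200, 300

G = rng.binomial(2, 0.3, size=(n_subjects, n_snps)).astype(float)  # genotypes 0/1/2
Y = rng.normal(size=(n_subjects, n_voxels))                        # voxel phenotypes

# Standardize columns so each simple regression reduces to a correlation.
G = (G - G.mean(0)) / G.std(0)
Y = (Y - Y.mean(0)) / Y.std(0)

# All SNP-by-voxel correlations in a single matrix product.
R = G.T @ Y / n_subjects                       # shape (n_snps, n_voxels)
# Corresponding t-statistics for each simple regression (df = n - 2).
t = R * np.sqrt((n_subjects - 2) / (1 - R ** 2))
print(t.shape, float(np.abs(t).max()))
```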

  17. Advanced Land Imager Assessment System

    NASA Technical Reports Server (NTRS)

    Chander, Gyanesh; Choate, Mike; Christopherson, Jon; Hollaren, Doug; Morfitt, Ron; Nelson, Jim; Nelson, Shar; Storey, James; Helder, Dennis; Ruggles, Tim; Kaita, Ed; Levy, Raviv; Ong, Lawrence; Markham, Brian; Schweiss, Robert

    2008-01-01

    The Advanced Land Imager Assessment System (ALIAS) supports radiometric and geometric image processing for the Advanced Land Imager (ALI) instrument onboard NASA's Earth Observing-1 (EO-1) satellite. ALIAS consists of two processing subsystems for radiometric and geometric processing of the ALI's multispectral imagery. The radiometric processing subsystem characterizes and corrects, where possible, radiometric qualities including: coherent, impulse, and random noise; signal-to-noise ratios (SNRs); detector operability; gain; bias; saturation levels; striping and banding; and the stability of detector performance. The geometric processing subsystem and analysis capabilities support sensor alignment calibrations, sensor chip assembly (SCA)-to-SCA alignments and band-to-band alignment; and perform geodetic accuracy assessments, modulation transfer function (MTF) characterizations, and image-to-image characterizations. ALIAS also characterizes and corrects band-to-band registration, and performs systematic precision and terrain correction of ALI images. This system can geometrically correct, and automatically mosaic, the SCA image strips into a seamless, map-projected image. This system provides a large database, which enables bulk trending for all ALI image data and significant instrument telemetry. Bulk trending consists of two functions: Housekeeping Processing and Bulk Radiometric Processing. The Housekeeping function pulls telemetry and temperature information from the instrument housekeeping files and writes this information to a database for trending. The Bulk Radiometric Processing function writes statistical information from the dark data acquired before and after the Earth imagery and the lamp data to the database for trending. This allows for multi-scene statistical analyses.

  18. Creation of DICOM--aware applications using ImageJ.

    PubMed

    Barboriak, Daniel P; Padua, Anthony O; York, Gerald E; Macfall, James R

    2005-06-01

    The demand for image-processing software for radiology applications has been increasing, fueled by advancements in both image-acquisition and image-analysis techniques. The utility of existing image-processing software is often limited by cost, lack of flexibility, and/or specific hardware requirements. In particular, many existing packages cannot directly utilize images formatted using the specifications in part 10 of the DICOM standard ("DICOM images"). We show how image analyses can be performed directly on DICOM images by using ImageJ, a free, Java-based image-processing package (http://rsb.info.nih.gov/ij/). We demonstrate how plug-ins written in our laboratory can be used along with the ImageJ macro script language to create flexible, low-cost, multiplatform image-processing applications that can be directed by information contained in the DICOM image header.

  19. Body image and media use among adolescents.

    PubMed

    Borzekowski, Dina L G; Bayer, Angela M

    2005-06-01

    This article reviews the literature on body image and media use among adolescents. We begin by defining body image and how it is constructed, especially among young people. We then offer information on what happens when one's body image perception is at odds with one's personal ideal, which can result in disordered eating, including obesity, anorexia, and bulimia. Next, we describe the research literature on media use and its relationship to adolescents' body image perceptions and discuss content analyses and correlational, experimental, and qualitative studies. Lastly, we recommend, beyond conducting further and improved research studies, interventions and policies that may have an impact on body image and media use.

  20. Bathymetric imaging

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Malin, M. C.

    1981-01-01

    Digital topography has, for some years, been formatted and processed into shaded relief images for specific studies involving land use and thermal properties. Application to bathymetry is a new and seemingly fruitful extension of these techniques. Digital terrain models of the earth - combining subaerial topography with an extensive collection of bathymetric soundings - have been processed to yield shaded relief images. These images provide new and exciting insights into submarine geomorphology and portray many aspects of plate tectonic physiography in a manner not previously possible.
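
    Shaded relief of the kind described here is commonly produced with a Lambertian hillshade: each grid cell's brightness is the cosine of the angle between the local surface normal and the illumination direction. The sketch below is a generic illustration rather than the original processing chain; it uses a synthetic seamount in place of real soundings and an assumed 400 m grid spacing.

```python
import numpy as np

def hillshade(elevation, cell_size=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    """Shaded-relief image from a gridded terrain or bathymetry model."""
    az, alt = np.radians(azimuth_deg), np.radians(altitude_deg)
    dz_dy, dz_dx = np.gradient(elevation, cell_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# Synthetic seamount rising from an abyssal plain, standing in for real soundings.
x, y = np.meshgrid(np.linspace(-50, 50, 256), np.linspace(-50, 50, 256))
depth = -4000.0 + 2500.0 * np.exp(-(x**2 + y**2) / 300.0)
relief = hillshade(depth, cell_size=400.0)     # assumed 400 m grid spacing
print(relief.shape, float(relief.min()), float(relief.max()))
```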

  1. Stable isotopic analyses in paleoclimatic reconstruction

    SciTech Connect

    Wigand, P.E.

    1995-09-01

    Most traditional paleoclimatic proxy data have inherent time lags between climatic input and system response that constrain their use in accurate reconstruction of paleoclimate chronology, scaling of its variability, and the elucidation of the processes that determine its impact on the biotic and abiotic environment. With the exception of dendroclimatology, and studies of short-lived organisms and pollen recovered from annually varved lacustrine sediments, significant periods of time ranging from years, to centuries, to millennia may intervene between climate change and its first manifestation in paleoclimatic proxy data records. Reconstruction of past climate through changes in plant community composition derived from pollen sequences and plant remains from ancient woodrat middens, wet environments and dry caves all suffer from these lags. However, stable isotopic analyses can provide more immediate indication of biotic response to climate change. Evidence of past physiological response of organisms to changes in effective precipitation as climate varies can be provided by analyses of the stable isotopic content of plant macrofossils from various contexts. These analyses consider variation in the stable isotopic (hydrogen, oxygen and carbon) content of plant tissues as it reflects (1) past global or local temperature through changes in meteoric (rainfall) water chemistry in the case of the first two isotopes, and (2) plant stress through changes in plant respiration/transpiration processes under differing water availability, and varying atmospheric CO2 composition (which itself may actually be a net result of biotic response to climate change). Studies currently being conducted in the Intermountain West indicate both long- and short-term responses that when calibrated with modern analogue studies have the potential of revealing not only the timing of climate events, but their direction, magnitude and rapidity.

  2. Quality of reporting of dental survival analyses.

    PubMed

    Layton, D M; Clarke, M

    2014-12-01

    To explore the quality of reporting (writing and graphics) of articles that used time-to-event analyses to report dental treatment outcomes. A systematic search of the top 50 dental journals in 2008 produced the sample of articles for this analysis. Articles reporting treatment outcomes with (n = 95) and without (n = 91) time-to-event statistics were reviewed. Survival descriptive words used in the two groups were analysed (Pearson's chi-square). The quality of life tables, survival curves and time-to-event statistics were assessed (agreement assessed with Kappa) and explored. Words describing dental outcomes 'over time' were more common in time-to-event compared with control articles (77% vs. 3%, P < 0.001). Non-specific use of 'rate' was common across both groups. Life tables and survival curves were used by 39% and 48% of the time-to-event articles, with at least one used by 82%. Construction quality was poor: 21% of life tables and 28% of survival curves achieved an acceptable standard. Time-to-event statistical reporting was poor: 3% achieved a high and 59% achieved an acceptable standard. The survival statistic, summary figure and standard error were reported in 76%, 95% and 20% of time-to-event articles. Individual statistical terms and graphic aids were common within and unique to time-to-event articles. Unfortunately, important details were regularly omitted from statistical descriptions and survival figures, making the overall quality poor. It is likely that such articles will be incorrectly indexed in databases, missed by searchers, and not fully understood if identified. © 2014 John Wiley & Sons Ltd.
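
    For readers less familiar with the time-to-event statistics whose reporting is assessed above, the Kaplan-Meier estimator that underlies most dental survival curves can be written in a few lines. The follow-up times and censoring indicators below are hypothetical, not data from the reviewed articles.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : follow-up time for each restoration or implant (e.g., years)
    events : 1 if a failure was observed at that time, 0 if censored
    Returns the distinct event times and the survival probability after each.
    """
    times, events = np.asarray(times, float), np.asarray(events, int)
    curve_t, curve_s, survival = [], [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)
        failures = np.sum((times == t) & (events == 1))
        survival *= 1.0 - failures / at_risk
        curve_t.append(float(t))
        curve_s.append(survival)
    return curve_t, curve_s

# Hypothetical follow-up data: 10 restorations, some censored (event = 0).
t = [1.0, 2.5, 2.5, 3.0, 4.2, 5.0, 5.0, 6.1, 7.0, 8.0]
e = [1,   1,   0,   1,   0,   1,   0,   1,   0,   0]
for ti, si in zip(*kaplan_meier(t, e)):
    print(f"S({ti:.1f} y) = {si:.3f}")
```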

  3. Combustion Devices CFD Team Analyses Review

    NASA Technical Reports Server (NTRS)

    Rocker, Marvin

    2008-01-01

    A variety of CFD simulations performed by the Combustion Devices CFD Team at Marshall Space Flight Center will be presented. These analyses were performed to support Space Shuttle operations and Ares-1 Crew Launch Vehicle design. Results from the analyses will be shown along with pertinent information on the CFD codes and computational resources used to obtain the results. Six analyses will be presented - two related to the Space Shuttle and four related to the Ares I-1 launch vehicle now under development at NASA. First, a CFD analysis of the flow fields around the Space Shuttle during the first six seconds of flight and potential debris trajectories within those flow fields will be discussed. Second, the combusting flows within the Space Shuttle Main Engine's main combustion chamber will be shown. For the Ares I-1, an analysis of the performance of the roll control thrusters during flight will be described. Several studies are discussed related to the J-2X engine to be used on the upper stage of the Ares I-1 vehicle. A parametric study of the propellant flow sequences and mixture ratios within the GOX/GH2 spark igniters on the J-2X is discussed. Transient simulations will be described that predict the asymmetric pressure loads that occur on the rocket nozzle during the engine start as the nozzle fills with combusting gases. Simulations of issues that affect temperature uniformity within the gas generator used to drive the J-2X turbines will be described as well, both upstream of the chamber in the injector manifolds and within the combustion chamber itself.

  4. Medical Imaging.

    ERIC Educational Resources Information Center

    Jaffe, C. Carl

    1982-01-01

    Describes principal imaging techniques, their applications, and their limitations in terms of diagnostic capability and possible adverse biological effects. Techniques include film radiography, computed tomography, nuclear medicine, positron emission tomography (PET), ultrasonography, nuclear magnetic resonance, and digital radiography. PET has…

  5. Medical Imaging.

    ERIC Educational Resources Information Center

    Jaffe, C. Carl

    1982-01-01

    Describes principal imaging techniques, their applications, and their limitations in terms of diagnostic capability and possible adverse biological effects. Techniques include film radiography, computed tomography, nuclear medicine, positron emission tomography (PET), ultrasonography, nuclear magnetic resonance, and digital radiography. PET has…

  6. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images. In this photograph, a patient undergoes an open MRI.

  7. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images.

  8. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images.

  9. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images. In this photograph, a patient undergoes an open MRI.

  10. Thermal structure analyses for CSM testbed (COMET)

    NASA Technical Reports Server (NTRS)

    Xue, David Y.; Mei, Chuh

    1994-01-01

    This document is the final report for the project entitled 'Thermal Structure Analyses for CSM Testbed (COMET),' for the period of May 16, 1992 - August 15, 1994. The project focused on investigating and developing the finite element analysis capability of the computational structural mechanics (CSM) testbed (COMET) software system in the field of thermal structural response. The stages of this project consisted of investigating present capabilities, developing new functions, demonstrating analyses, and pursuing research topics. The appendices of this report list the detailed documentation of major accomplishments and demonstration runstreams for future reference.

  11. Alternative polyadenylation: New insights from global analyses

    PubMed Central

    Shi, Yongsheng

    2012-01-01

    Recent studies have revealed widespread mRNA alternative polyadenylation (APA) in eukaryotes and its dynamic spatial and temporal regulation. APA not only generates proteomic and functional diversity, but also plays important roles in regulating gene expression. Global deregulation of APA has been demonstrated in a variety of human diseases. Recent exciting advances in the field have been made possible in large part by high-throughput analyses using newly developed experimental tools. Here I review the recent progress in global studies of APA and the insights that have emerged from these and other studies that use more conventional methods. PMID:23097429

  12. Method of performing computational aeroelastic analyses

    NASA Technical Reports Server (NTRS)

    Silva, Walter A. (Inventor)

    2011-01-01

    Computational aeroelastic analyses typically use a mathematical model for the structural modes of a flexible structure and a nonlinear aerodynamic model that can generate a plurality of unsteady aerodynamic responses based on the structural modes for conditions defining an aerodynamic condition of the flexible structure. In the present invention, a linear state-space model is generated using a single execution of the nonlinear aerodynamic model for all of the structural modes where a family of orthogonal functions is used as the inputs. Then, static and dynamic aeroelastic solutions are generated using computational interaction between the mathematical model and the linear state-space model for a plurality of periodic points in time.
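
    The invention is only summarized above; as a generic, hedged illustration of how a structural modal model and a linear aerodynamic state-space model can be combined computationally, the sketch below assembles a coupled first-order system and inspects its eigenvalues for aeroelastic stability. All matrices are synthetic stand-ins chosen for the example, not outputs of the patented procedure or of any CFD code.

```python
import numpy as np

# Structural modal model (2 modes): M q'' + C q' + K q = F_aero
M = np.diag([1.0, 1.0])
C = np.diag([0.02, 0.03])
K = np.diag([(2 * np.pi * 5.0) ** 2, (2 * np.pi * 9.0) ** 2])

# Hypothetical linear aerodynamic state-space model:
#   xa' = Aa xa + Ba q,   F_aero = Ca xa + Da q
Aa = np.diag([-8.0, -12.0])
Ba = np.diag([40.0, 55.0])
Ca = np.eye(2)
Da = np.zeros((2, 2))

Minv = np.linalg.inv(M)
n = M.shape[0]

# Coupled first-order system with state vector [q, q', xa]
A = np.block([
    [np.zeros((n, n)), np.eye(n),         np.zeros((n, Aa.shape[0]))],
    [Minv @ (Da - K),  -Minv @ C,         Minv @ Ca],
    [Ba,               np.zeros_like(Ba), Aa],
])

eigvals = np.linalg.eigvals(A)
print("coupled eigenvalues:", np.round(eigvals, 3))
print("aeroelastically stable:", bool(np.all(eigvals.real < 0)))
```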

  13. Fundamentals of fungal molecular population genetic analyses.

    PubMed

    Xu, Jianping

    2006-07-01

    The last two decades have seen tremendous growth in the development and application of molecular methods in the analyses of fungal species and populations. In this paper, I provide an overview of the molecular techniques and the basic analytical tools used to address various fundamental population and evolutionary genetic questions in fungi. With increasing availability and decreasing cost, DNA sequencing is becoming a mainstream data acquisition method in fungal evolutionary genetic studies. However, other methods, especially those based on the polymerase chain reaction, remain powerful in addressing specific questions for certain groups of taxa. These developments are bringing fungal population and evolutionary genetics into mainstream ecology and evolutionary biology.

  14. Environmental monitoring final report: groundwater chemical analyses

    SciTech Connect

    Not Available

    1984-02-01

    This report presents the results of analyses of groundwater quality at the SRC-I Demonstration Plant site in Newman, Kentucky. Samples were obtained from a network of 23 groundwater observation wells installed during previous studies. The groundwater was well within US EPA Interim Primary Drinking Water Standards for trace metals, radioactivity, and pesticides, but exceeded the standard for coliform bacteria. Several US EPA Secondary Drinking Water Standards were exceeded, namely, manganese, color, iron, and total dissolved solids. Based on the results, Dames and Moore recommend that all wells should be sterilized and those wells built in 1980 should be redeveloped. 1 figure, 6 tables.

  15. Laser power beaming system analyses. Final report

    SciTech Connect

    Zeiders, G.W. Jr.

    1993-08-01

    The successful demonstration of the PAMELA adaptive optics hardware and the fabrication of the BTOS truss structure were identified by the program office as the two most critical elements of the NASA power beaming program, so it was these that received attention during this program. Much of the effort was expended in direct program support at MSFC, but detailed technical analyses of the AMP deterministic control scheme and the BTOS truss structure (both the JPL design and a spherical one) were prepared and are attached, and recommendations are given.

  16. Alternative splicing: new insights from global analyses.

    PubMed

    Blencowe, Benjamin J

    2006-07-14

    Recent analyses of sequence and microarray data have suggested that alternative splicing plays a major role in the generation of proteomic and functional diversity in metazoan organisms. Efforts are now being directed at establishing the full repertoire of functionally relevant transcript variants generated by alternative splicing, the specific roles of such variants in normal and disease physiology, and how alternative splicing is coordinated on a global level to achieve cell- and tissue-specific functions. Recent progress in these areas is summarized in this review.

  17. Further analyses of Rio Cuarto impact glass

    NASA Technical Reports Server (NTRS)

    Schultz, Peter H.; Bunch, T. E.; Koeberl, C.; Collins, W.

    1993-01-01

    Initial analyses of the geologic setting, petrology, and geochemistry of glasses recovered from within and around the elongate Rio Cuarto (RC) craters in Argentina focused on selected samples in order to document the general similarity with impactites around other terrestrial impact craters and to establish their origin. Continued analysis has surveyed the diversity in compositions for a range of samples, examined further evidence for temperature and pressure history, and compared the results with experimentally fused loess from oblique hypervelocity impacts. These new results not only firmly establish their impact origin but provide new insight on the impact process.

  18. Analyses of containment structures with corrosion damage

    SciTech Connect

    Cherry, J.L.

    1996-12-31

    Corrosion damage to a nuclear power plant containment structure can degrade the pressure capacity of the vessel. For the low-carbon, low-strength steels used in containments, the effect of corrosion on material properties is discussed. Strain-to-failure tests, in uniaxial tension, have been performed on corroded material samples. Results were used to select strain-based failure criteria for corroded steel. Using the ABAQUS finite element analysis code, the capacity of a typical PWR Ice Condenser containment with corrosion damage has been studied. Multiple analyses were performed, with the location of the corrosion on the containment and the amount of corrosion varied in each analysis.

  19. Further analyses of Rio Cuarto impact glass

    NASA Technical Reports Server (NTRS)

    Schultz, Peter H.; Bunch, T. E.; Koeberl, C.; Collins, W.

    1993-01-01

    Initial analyses of the geologic setting, petrology, and geochemistry of glasses recovered from within and around the elongate Rio Cuarto (RC) craters in Argentina focused on selected samples in order to document the general similarity with impactites around other terrestrial impact craters and to establish their origin. Continued analysis has surveyed the diversity in compositions for a range of samples, examined further evidence for temperature and pressure history, and compared the results with experimentally fused loess from oblique hypervelocity impacts. These new results not only firmly establish their impact origin but provide new insight on the impact process.

  20. MULTISPECTRAL THERMAL IMAGER - OVERVIEW

    SciTech Connect

    P. WEBER

    2001-03-01

    The Multispectral Thermal Imager satellite fills a new and important role in advancing the state of the art in remote sensing sciences. Initial results with the full calibration system operating indicate that the system was already close to achieving the very ambitious goals which we laid out in 1993, and we are confident of reaching all of these goals as we continue our research and improve our analyses. In addition to the DOE interests, the satellite is tasked about one-third of the time with requests from other users supporting research ranging from volcanology to atmospheric sciences.

  1. Magnesium alloy ingots: Chemical and metallographic analyses

    NASA Astrophysics Data System (ADS)

    Tartaglia, John M.; Swartz, Robert E.; Bentz, Rodney L.; Howard, Jane H.

    2001-11-01

    The quality of a magnesium die casting is likely dependent on the quality of the feedstock ingot material. Therefore, both Daimler-Chrysler and General Motors have established quality assurance measures that include analysis of magnesium ingots. These processes include chemical analysis, corrosion testing, fast neutron activation analysis, and metallography. Optical emission spectroscopy, inductively coupled plasma spectroscopy, and gravimetric analysis are several methods for determining the chemical composition of the material. Fast neutron activation analysis, image analysis and energy dispersive X-ray spectroscopy are used to quantify ingot cleanliness. These experimental techniques are described and discussed in this paper, and example case studies are presented for illustration.

  2. Imaging Magnetometer

    NASA Technical Reports Server (NTRS)

    Moynihan, Philip I.; Soll, Stanley L.

    1995-01-01

    Imaging magnetometer proposed for scientific, industrial, or military use in detecting underground structures containing magnetic materials and underground machines generating and/or altering magnetic fields. Includes electron-beam tube integrated with phosphor-coated charge-coupled device (CCD). Images formed by magnetic deflection of electron beam. Locations, magnitudes, and directions of identifiable features of magnetic field examined to extract data on locations, sizes, and types of magnetically detected objects.

  3. Image Security

    DTIC Science & Technology

    2007-11-02

    popularity, contemplates the crucial needs for protecting intellectual property rights on multimedia content like images, video, audio, and others ... protection for still images, audio, video, and multimedia products. The networking environment of the future will require tools that provide secure and fast ... technique known as steganography? Steganography, or "covered writing," has a long ... (George Voyatzis and Ioannis Pitas, University of Thessaloniki)

  4. Imaging System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The 1100C Virtual Window is based on technology developed under NASA Small Business Innovation (SBIR) contracts to Ames Research Center. For example, under one contract Dimension Technologies, Inc. developed a large autostereoscopic display for scientific visualization applications. The Virtual Window employs an innovative illumination system to deliver the depth and color of true 3D imaging. Its applications include surgery and Magnetic Resonance Imaging scans, viewing for teleoperated robots, training, and in aviation cockpit displays.

  5. Optical eigenmodes for illumination & imaging

    NASA Astrophysics Data System (ADS)

    Kosmeier, Sebastian

    Gravitational Microlensing, as a technique for detecting Extrasolar Planets, is recognised for its potential in discovering small-mass planets similar to Earth, at a distance of a few Astronomical Units from their host stars. However, analysing the data from microlensing events (which statistically rarely reveal planets) is complex and requires continued and intensive use of various networks of telescopes working together in order to observe the phenomenon. As such the techniques are constantly being developed and refined; this project outlines some steps of the careful analysis required to model an event and ensure the best quality data is used in the fitting. A quantitative investigation into increasing the quality of the original photometric data available from any microlensing event demonstrates that 'lucky imaging' can lead to a marked improvement in the signal-to-noise ratio of images over standard imaging techniques, which could result in more accurate models and thus the calculation of more accurate planetary parameters. In addition, a simulation illustrating the effects of atmospheric turbulence on exposures was created, and expanded upon to give an approximation of the lucky imaging technique. This further demonstrated the advantages of lucky images which are shown to potentially approach the quality of those expected from diffraction limited photometry. The simulation may be further developed for potential future use as a 'theoretical lucky imager' in our research group, capable of producing and analysing synthetic exposures through customisable conditions.
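
    The 'lucky imaging' idea mentioned above, keeping only the sharpest short exposures and co-adding them, can be sketched simply. The code below is an illustrative approximation (peak pixel value as a sharpness proxy, integer-pixel recentring, synthetic frames), not the simulation developed in the project.

```python
import numpy as np

def lucky_stack(frames, keep_fraction=0.1):
    """Select and co-add the sharpest short exposures of the same target.

    frames : (n_frames, ny, nx) stack of short exposures.
    The brightest pixel is used as a simple sharpness score; the best frames
    are recentred on their peak and averaged (shift-and-add).
    """
    scores = frames.reshape(len(frames), -1).max(axis=1)
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-n_keep:]
    ny, nx = frames.shape[1:]
    stack = np.zeros((ny, nx))
    for i in best:
        py, px = np.unravel_index(np.argmax(frames[i]), (ny, nx))
        stack += np.roll(np.roll(frames[i], ny // 2 - py, axis=0),
                         nx // 2 - px, axis=1)
    return stack / n_keep

# Synthetic star jittered and blurred by seeing; ~10% of frames are 'lucky'.
rng = np.random.default_rng(2)
yy, xx = np.mgrid[:64, :64]
frames = []
for _ in range(200):
    cy, cx = 32 + rng.normal(0, 3), 32 + rng.normal(0, 3)
    width = rng.choice([1.5, 4.0], p=[0.1, 0.9])
    frames.append(np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * width ** 2)))
frames = np.array(frames) + rng.normal(0, 0.02, size=(200, 64, 64))

print("peak of naive average:", frames.mean(axis=0).max())
print("peak of lucky stack  :", lucky_stack(frames).max())
```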

  6. Analyses of containment structures with corrosion damage

    SciTech Connect

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  7. Transportation systems analyses: Volume 1: Executive Summary

    NASA Astrophysics Data System (ADS)

    1993-05-01

    The principal objective of this study is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform man delivery and return, cargo transfer, cargo delivery, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed, but not limited to, include the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationships between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. This executive summary of the transportation systems analyses (TSM) semi-annual report addresses the SSF logistics resupply. Our analysis parallels the ongoing NASA SSF redesign effort. Therefore, there could be no SSF design to drive our logistics analysis. Consequently, the analysis attempted to bound the reasonable SSF design possibilities (and the subsequent transportation implications). No other strategy really exists until after a final decision is rendered on the SSF configuration.

  8. Fractal and multifractal analyses of bipartite networks

    PubMed Central

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-01-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but these properties have not previously been studied for bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m), and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of a bipartite network with two different types of nodes, we assign different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions. PMID:28361962
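
    The fractal (box-counting) analysis these records refer to can be approximated with a simple burning-type covering: grow balls of a given hop radius around successively chosen seed nodes, count how many boxes are needed, and fit the power law N_B ~ l_B^(-d_B). The sketch below is a simplified stand-in for the algorithms in the paper; it uses networkx, a greedy seed choice, and a toy tree in place of the CiteULike or MovieLens data sets.

```python
import numpy as np
import networkx as nx

def ball_covering(G, radius, lengths):
    """Cover the graph with 'boxes': balls of the given hop radius around
    successively chosen seed nodes (a simple burning-type covering)."""
    uncovered = set(G.nodes)
    n_boxes = 0
    while uncovered:
        seed = next(iter(uncovered))
        ball = {v for v in uncovered if lengths[seed].get(v, np.inf) <= radius}
        uncovered -= ball
        n_boxes += 1
    return n_boxes

def box_counting_dimension(G, radii=(1, 2, 3, 4)):
    """Estimate d_B from N_B ~ l_B^(-d_B), taking l_B = 2*radius + 1."""
    lengths = dict(nx.all_pairs_shortest_path_length(G))
    sizes = np.array([2 * r + 1 for r in radii], dtype=float)
    counts = np.array([ball_covering(G, r, lengths) for r in radii], dtype=float)
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Toy graph standing in for a real bipartite data set.
G = nx.balanced_tree(r=2, h=7)
print("estimated box-counting dimension:", round(box_counting_dimension(G), 2))
```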

  9. Fractal and multifractal analyses of bipartite networks

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but these properties have not previously been studied for bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m), and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of a bipartite network with two different types of nodes, we assign different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  10. Phylogenetic uncertainty revisited: Implications for ecological analyses.

    PubMed

    Rangel, Thiago F; Colwell, Robert K; Graves, Gary R; Fučíková, Karolina; Rahbek, Carsten; Diniz-Filho, José Alexandre F

    2015-05-01

    Ecologists and biogeographers usually rely on a single phylogenetic tree to study evolutionary processes that affect macroecological patterns. This approach ignores the fact that each phylogenetic tree is a hypothesis about the evolutionary history of a clade, and cannot be directly observed in nature. Also, trees often leave out many extant species, or include missing species as polytomies because of a lack of information on the relationship among taxa. Still, researchers usually do not quantify the effects of phylogenetic uncertainty in ecological analyses. We propose here a novel analytical strategy to maximize the use of incomplete phylogenetic information, while simultaneously accounting for several sources of phylogenetic uncertainty that may distort statistical inferences about evolutionary processes. We illustrate the approach using a clade-wide analysis of the hummingbirds, evaluating how different sources of uncertainty affect several phylogenetic comparative analyses of trait evolution and biogeographic patterns. Although no statistical approximation can fully substitute for a complete and robust phylogeny, the method we describe and illustrate enables researchers to broaden the number of clades for which studies informed by evolutionary relationships are possible, while allowing the estimation and control of statistical error that arises from phylogenetic uncertainty. Software tools to carry out the necessary computations are offered. © 2015 The Author(s).

  11. Bioinformatics tools for analysing viral genomic data.

    PubMed

    Orton, R J; Gu, Q; Hughes, J; Maabar, M; Modha, S; Vattipally, S B; Wilkie, G S; Davison, A J

    2016-04-01

    The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing.
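
    As a concrete example of the first step the authors list, quality control of raw reads, the sketch below filters a FASTQ file by read length and mean Phred score using only the Python standard library. The thresholds and file name are assumptions made for illustration; production pipelines would normally rely on dedicated QC tools.

```python
def read_fastq(path):
    """Yield (read_id, sequence, quality_string) records from a FASTQ file."""
    with open(path) as fh:
        while True:
            header = fh.readline().strip()
            if not header:
                return
            seq = fh.readline().strip()
            fh.readline()                      # '+' separator line
            qual = fh.readline().strip()
            yield header[1:], seq, qual

def mean_phred(quality_string, offset=33):
    """Mean Phred score of a read (Sanger / Illumina 1.8+ ASCII offset 33)."""
    return sum(ord(c) - offset for c in quality_string) / len(quality_string)

def quality_filter(path, min_mean_q=20, min_length=50):
    """Keep reads that are long enough and have a high enough mean quality."""
    for read_id, seq, qual in read_fastq(path):
        if len(seq) >= min_length and mean_phred(qual) >= min_mean_q:
            yield read_id, seq, qual

# Usage with a hypothetical file name:
# kept = list(quality_filter("sample_reads.fastq"))
# print(len(kept), "reads passed QC")
```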

  12. Used Fuel Management System Interface Analyses - 13578

    SciTech Connect

    Howard, Robert; Busch, Ingrid; Nutt, Mark; Morris, Edgar; Puig, Francesc; Carter, Joe; Delley, Alexcia; Rodwell, Phillip; Hardin, Ernest; Kalinina, Elena; Clark, Robert; Cotton, Thomas

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  13. Waste Stream Analyses for Nuclear Fuel Cycles

    SciTech Connect

    N. R. Soelberg

    2010-08-01

    A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of data available to date for the fuel cycle options is insufficient to perform quantitative radioactive waste analyses using recommended metrics. This study has been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options is generally not yet available.

  14. Fractal and multifractal analyses of bipartite networks.

    PubMed

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-31

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but these properties have not previously been studied for bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m), and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of a bipartite network with two different types of nodes, we assign different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  15. Analyses of broadband noise mechanisms of rotors

    NASA Technical Reports Server (NTRS)

    George, A. R.

    1986-01-01

    The various source mechanisms which generate broadband noise on a range of rotors are reviewed. Analyses of these mechanisms are presented and compared to existing experimental data. The sources considered are load fluctuations due to inflow turbulence, due to turbulent blade boundary layers passing the trailing edge, and due to tip vortex formation turbulence. Vortex shedding noise due to laminar boundary layers and blunt trailing edges is not considered in detail as it can be avoided in most cases. Present analyses are adequate to predict the spectra from a wide variety of experiments on fans, helicopter rotors, and wind turbines to within about 5 to 10 dB. Better knowledge of the inflow turbulence improves the accuracy of the predictions. Inflow turbulence noise depends strongly on ambient conditions and dominates at low frequencies. Trailing edge and tip vortex noise are important at higher frequencies if inflow turbulence is weak. Boundary layer trailing edge noise increases slowly with angle of attack but not as rapidly as tip vortex formation noise. Tip noise can be important at high angles of attack for wide chord, square edge tips.

  16. Evaluating heterogeneity in cumulative meta-analyses

    PubMed Central

    Villanueva, Elmer V; Zavarsek, Silva

    2004-01-01

    Background: Recently developed measures such as I2 and H allow the evaluation of the impact of heterogeneity in conventional meta-analyses. There has been no examination of the development of heterogeneity in the context of a cumulative meta-analysis. Methods: Cumulative meta-analyses of five smoking cessation interventions (clonidine, nicotine replacement therapy using gum and patch, physician advice and acupuncture) were used to calculate I2 and H. These values were plotted by year of publication, control event rate and sample size to trace the development of heterogeneity over these covariates. Results: The cumulative evaluation of heterogeneity varied according to the measure of heterogeneity used and the basis of cumulation. Plots produced from the calculations revealed areas of heterogeneity useful in the consideration of potential sources for further study. Conclusion: The examination of heterogeneity in conjunction with summary effect estimates in a cumulative meta-analysis offered valuable insight into the evolution of variation. Such information is not available in the context of conventional meta-analysis and has the potential to lead to the development of a richer picture of the effectiveness of interventions. PMID:15251035
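
    The heterogeneity measures discussed here have simple closed forms: Cochran's Q is the weighted sum of squared deviations from the fixed-effect pooled estimate, H = sqrt(Q / df), and I2 = max(0, (Q - df) / Q). The sketch below recomputes them cumulatively as studies are added; the effect sizes and variances are illustrative values, not the smoking-cessation data analysed in the paper.

```python
import numpy as np

def heterogeneity(effects, variances):
    """Cochran's Q, H and I^2 for a set of study effect estimates.

    effects   : per-study effect sizes (e.g., log odds ratios)
    variances : their squared standard errors
    """
    effects = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)      # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    Q = np.sum(w * (effects - pooled) ** 2)
    df = len(effects) - 1
    H = np.sqrt(Q / df) if df > 0 else np.nan
    I2 = max(0.0, (Q - df) / Q) if Q > 0 else 0.0
    return Q, H, I2

# Cumulative meta-analysis: re-evaluate heterogeneity as each study is added.
effects = [0.42, 0.35, 0.60, 0.10, 0.55, 0.48]        # illustrative only
variances = [0.04, 0.05, 0.03, 0.06, 0.02, 0.05]
for k in range(2, len(effects) + 1):
    Q, H, I2 = heterogeneity(effects[:k], variances[:k])
    print(f"first {k} studies: Q={Q:.2f}  H={H:.2f}  I2={100 * I2:.0f}%")
```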

  17. Evaluating heterogeneity in cumulative meta-analyses.

    PubMed

    Villanueva, Elmer V; Zavarsek, Silva

    2004-07-13

    Recently developed measures such as I2 and H allow the evaluation of the impact of heterogeneity in conventional meta-analyses. There has been no examination of the development of heterogeneity in the context of a cumulative meta-analysis. Cumulative meta-analyses of five smoking cessation interventions (clonidine, nicotine replacement therapy using gum and patch, physician advice and acupuncture) were used to calculate I2 and H. These values were plotted by year of publication, control event rate and sample size to trace the development of heterogeneity over these covariates. The cumulative evaluation of heterogeneity varied according to the measure of heterogeneity used and the basis of cumulation. Plots produced from the calculations revealed areas of heterogeneity useful in the consideration of potential sources for further study. The examination of heterogeneity in conjunction with summary effect estimates in a cumulative meta-analysis offered valuable insight into the evolution of variation. Such information is not available in the context of conventional meta-analysis and has the potential to lead to the development of a richer picture of the effectiveness of interventions.

  18. NEXT Ion Thruster Performance Dispersion Analyses

    NASA Technical Reports Server (NTRS)

    Soulas, George C.; Patterson, Michael J.

    2008-01-01

    The NEXT ion thruster is a low specific mass, high performance thruster with a nominal throttling range of 0.5 to 7 kW. Numerous engineering model and one prototype model thrusters have been manufactured and tested. Of significant importance to propulsion system performance is thruster-to-thruster performance dispersions. This type of information can provide a bandwidth of expected performance variations both on a thruster and a component level. Knowledge of these dispersions can be used to more conservatively predict thruster service life capability and thruster performance for mission planning, facilitate future thruster performance comparisons, and verify power processor capabilities are compatible with the thruster design. This study compiles the test results of five engineering model thrusters and one flight-like thruster to determine unit-to-unit dispersions in thruster performance. Component level performance dispersion analyses will include discharge chamber voltages, currents, and losses; accelerator currents, electron backstreaming limits, and perveance limits; and neutralizer keeper and coupling voltages and the spot-to-plume mode transition flow rates. Thruster level performance dispersion analyses will include thrust efficiency.

  19. Autisme et douleur – analyse bibliographique

    PubMed Central

    Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria

    2010-01-01

    This literature review aims to take stock of the work published in the field of pain and autism. The article first addresses the published studies concerning the modes of pain expression observed in this population. Various hypotheses that may explain the expressive particularities of people with autism are then reviewed: an excess of endorphins, particularities in sensory processing, and socio-communicative deficits. Finally, this review addresses the question of assessing and taking into account pain in people with autism. The authors conclude that the results of the published studies lack homogeneity and that further research is needed to arrive at consensual data in a field of study that is still little explored scientifically. At the clinical level, deepening knowledge in this area should make it possible to develop pain assessment tools and thus ensure better day-to-day pain management. PMID:20808970

  20. Integrated Genomic Analyses of Ovarian Carcinoma

    PubMed Central

    2011-01-01

    Summary The Cancer Genome Atlas (TCGA) project has analyzed mRNA expression, miRNA expression, promoter methylation, and DNA copy number in 489 high-grade serous ovarian adenocarcinomas (HGS-OvCa) and the DNA sequences of exons from coding genes in 316 of these tumors. These results show that HGS-OvCa is characterized by TP53 mutations in almost all tumors (96%); low prevalence but statistically recurrent somatic mutations in 9 additional genes including NF1, BRCA1, BRCA2, RB1, and CDK12; 113 significant focal DNA copy number aberrations; and promoter methylation events involving 168 genes. Analyses delineated four ovarian cancer transcriptional subtypes, three miRNA subtypes, four promoter methylation subtypes, a transcriptional signature associated with survival duration and shed new light on the impact on survival of tumors with BRCA1/2 and CCNE1 aberrations. Pathway analyses suggested that homologous recombination is defective in about half of tumors, and that Notch and FOXM1 signaling are involved in serous ovarian cancer pathophysiology. PMID:21720365

  1. Computational analyses of multilevel discourse comprehension.

    PubMed

    Graesser, Arthur C; McNamara, Danielle S

    2011-04-01

    The proposed multilevel framework of discourse comprehension includes the surface code, the textbase, the situation model, the genre and rhetorical structure, and the pragmatic communication level. We describe these five levels when comprehension succeeds and also when there are communication misalignments and comprehension breakdowns. A computer tool has been developed, called Coh-Metrix, that scales discourse (oral or print) on dozens of measures associated with the first four discourse levels. The measurement of these levels with an automated tool helps researchers track and better understand multilevel discourse comprehension. Two sets of analyses illustrate the utility of Coh-Metrix in discourse theory and educational practice. First, Coh-Metrix was used to measure the cohesion of the text base and situation model, as well as potential extraneous variables, in a sample of published studies that manipulated text cohesion. This analysis helped us better understand what was precisely manipulated in these studies and the implications for discourse comprehension mechanisms. Second, Coh-Metrix analyses are reported for samples of narrative and science texts in order to advance the argument that traditional text difficulty measures are limited because they fail to accommodate most of the levels of the multilevel discourse comprehension framework.

  2. MCNP analyses of criticality calculation results

    SciTech Connect

    Forster, R.A.; Booth, T.E.

    1995-05-01

    Careful assessment of the results of a calculation by the code itself can reduce mistakes in the problem setup and execution. MCNP has over four hundred error messages that inform the user of FATAL or WARNING errors that have been discovered during the processing of just the input file. The latest version, MCNP4A, now performs a self assessment of the calculated results to aid the user in determining the quality of the Monte Carlo results. MCNP4A, which was released to RSIC in October 1993, contains new analyses of the MCNP Monte Carlo calculation that provide simple user WARNINGs for both criticality and fixed source calculations. The goal of the new analyses is to provide the MCNP criticality practitioner with enough information in the output to assess the validity of the k-eff calculation and any associated tallies. The results of these checks are presented in the k-eff results summary page, several k-eff tables and graphs, and tally tables and graphs. Plots of k-eff at the workstation are also available as the problem is running or in a postprocessing mode to assess problem performance and results.

  3. [Clinical analyses of sarcoidosis with ocular involvement].

    PubMed

    Liu, Xiaofang; Sun, Yongchang; Dai, Honglei; Jin, Jianmin; Liu, Yong

    2014-11-04

    To explore the clinical characteristics and systemic manifestations of sarcoidosis with ocular involvement. The clinical data of 19 cases of sarcoidosis with ocular involvement confirmed by pathology at Beijing Tongren Hospital from March 2004 to February 2014 were analyzed retrospectively. The ocular manifestations, chest imaging findings, laboratory tests and pathological diagnoses were reviewed, and the clinical features of sarcoidosis with ocular involvement were summarized. Among them, there were 6 males and 13 females, with an average age of (47 ± 14) years (range 16-76). The ocular symptoms were the initial presenting manifestations of sarcoidosis in 12 cases, while 2 cases presented ocular symptoms during the course of disease, and another 5 cases without ocular symptoms were confirmed to have ocular involvement by eye examination. The main manifestations of ocular sarcoidosis were uveitis (n = 16), chorioretinitis (n = 3), retinal vasculitis (n = 2), optic neuritis (n = 1) and orbital mass (n = 3). The key feature of sarcoidosis was bilateral hilar lymphadenopathy (BHL) (14/16) on chest film. The diagnosis in 17 cases was confirmed by biopsy of extra-ocular organs. The positive diagnostic rate of bronchial biopsy was 8/9. Ocular involvement in sarcoidosis is relatively common due to a variety of ocular manifestations and serious vision impairment in some patients. Ophthalmologic examination is essential in the clinical management of sarcoidosis. Chest imaging and bronchial biopsy are important for the diagnosis of sarcoidosis with initial ocular manifestations.

  4. Department of Energy's team's analyses of Soviet designed VVERs

    SciTech Connect

    Not Available

    1989-09-01

    This document provides Appendices A through K of this report. The topics discussed respectively are: radiation induced embrittlement and annealing of reactor pressure vessel steels; loss of coolant accident blowdown analyses; LOCA blowdown response analyses; non-seismic structural response analyses; seismic analyses; S'' seal integrity; reactor transient analyses; fire protection; aircraft impacts; and boric acid induced corrosion. (FI).

  5. Stellar Imager

    NASA Technical Reports Server (NTRS)

    Carpenter, Kenneth

    2007-01-01

    The Stellar Imager (SI) is one of NASA's "Vision Missions" - concepts for future, space-based, strategic missions that could enormously increase our capabilities for observing the Cosmos. SI is designed as a UV/Optical Interferometer which will enable 0.1 milli-arcsecond (mas) spectral imaging of stellar surfaces and, via asteroseismology, stellar interiors and of the Universe in general. The ultra-sharp images of the Stellar Imager will revolutionize our view of many dynamic astrophysical processes by transforming point sources into extended sources, and snapshots into evolving views. SI, with a characteristic angular resolution of 0.1 milli-arcseconds at 2000 Angstroms, represents an advance in image detail of several hundred times over that provided by the Hubble Space Telescope. The Stellar Imager will zoom in on what today - with few exceptions - we only know as point sources, revealing processes never before seen, thus providing a tool as fundamental to astrophysics as the microscope is to the study of life on Earth. SI's science focuses on the role of magnetism in the Universe, particularly on magnetic activity on the surfaces of stars like the Sun. Its prime goal is to enable long-term forecasting of solar activity and the space weather that it drives, in support of the Living With a Star program in the Exploration Era. SI will also revolutionize our understanding of the formation of planetary systems, of the habitability and climatology of distant planets, and of many magneto-hydrodynamically controlled processes in the Universe. Stellar Imager is included as a "Flagship and Landmark Discovery Mission" in the 2005 Sun Solar System Connection (SSSC) Roadmap and as a candidate for a "Pathways to Life Observatory" in the Exploration of the Universe Division (EUD) Roadmap (May, 2005) and as such is a candidate mission for the 2025-2030 timeframe. An artist's drawing of the current "baseline" concept for SI is presented.

  6. Time series analyses of global change data.

    PubMed

    Lane, L J; Nichols, M H; Osborn, H B

    1994-01-01

    The hypothesis that statistical analyses of historical time series data can be used to separate the influences of natural variations from anthropogenic sources on global climate change is tested. Point, regional, national, and global temperature data are analyzed. Trend analyses for the period 1901-1987 suggest mean annual temperatures increased (in degrees C per century) globally at the rate of about 0.5, in the USA at about 0.3, in the south-western USA desert region at about 1.2, and at the Walnut Gulch Experimental Watershed in south-eastern Arizona at about 0.8. However, the rates of temperature change are not constant but vary within the 87-year period. Serial correlation and spectral density analysis of the temperature time series showed weak periodicities at various frequencies. The only common periodicity among the temperature series is an apparent cycle of about 43 years. The temperature time series were correlated with the Wolf sunspot index, atmospheric CO(2) concentrations interpolated from the Siple ice core data, and atmospheric CO(2) concentration data from Mauna Loa measurements. Correlation analysis of temperature data with concurrent data on atmospheric CO(2) concentrations and the Wolf sunspot index supports previously reported significant correlation over the 1901-1987 period. Correlation analysis between temperature, atmospheric CO(2) concentration, and the Wolf sunspot index for the shorter period, 1958-1987, when continuous Mauna Loa CO(2) data are available, suggests significant correlation between global warming and atmospheric CO(2) concentrations but no significant correlation between global warming and the Wolf sunspot index. This may be because the Wolf sunspot index apparently increased from 1901 until about 1960 and then decreased thereafter, while global warming apparently continued to increase through 1987. Correlation of sunspot activity with global warming may be spurious but additional analyses are required to test this hypothesis.
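
    The basic operations described above (a least-squares trend, serial correlation, and cross-correlation with CO2) are straightforward to reproduce. The sketch below uses synthetic annual series in place of the historical records, so the printed numbers are illustrative only.

```python
import numpy as np

def trend_per_century(years, values):
    """Least-squares linear trend expressed per 100 years."""
    slope, _ = np.polyfit(years, values, 1)
    return slope * 100.0

# Synthetic series standing in for the 1901-1987 records discussed above.
rng = np.random.default_rng(3)
years = np.arange(1901, 1988)
co2 = 295.0 + 0.25 * (years - 1901) + rng.normal(0, 0.5, years.size)     # ppm
temp = (13.8 + 0.005 * (years - 1901)
        + 0.002 * (co2 - co2.mean()) + rng.normal(0, 0.15, years.size))  # deg C

print("trend (deg C per century):", round(trend_per_century(years, temp), 2))
print("lag-1 serial correlation :", round(np.corrcoef(temp[:-1], temp[1:])[0, 1], 2))
print("corr(temperature, CO2)   :", round(np.corrcoef(temp, co2)[0, 1], 2))
```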

  7. Genetic analyses of a seasonal interval timer.

    PubMed

    Prendergast, Brian J; Renstrom, Randall A; Nelson, Randy J

    2004-08-01

    Seasonal clocks (e.g., circannual clocks, seasonal interval timers) permit anticipation of regularly occurring environmental events by timing the onset of seasonal transitions in reproduction, metabolism, and behavior. Implicit in the concept that seasonal clocks reflect adaptations to the local environment is the unexamined assumption that heritable genetic variance exists in the critical features of such clocks, namely, their temporal properties. These experiments quantified the intraspecific variance in, and heritability of, the photorefractoriness interval timer in Siberian hamsters (Phodopus sungorus), a seasonal clock that provides temporal information to mechanisms that regulate seasonal transitions in body weight. Twenty-seven families consisting of 54 parents and 109 offspring were raised in a long-day photoperiod and transferred as adults to an inhibitory photoperiod (continuous darkness; DD). Weekly body weight measurements permitted specification of the interval of responsiveness to DD, a reflection of the duration of the interval timer, in each individual. Body weights of males and females decreased after exposure to DD, but 3 to 5 months later, somatic recrudescence occurred, indicative of photorefractoriness to DD. The interval timer was approximately 5 weeks longer and twice as variable in females relative to males. Analyses of variance of full siblings revealed an overall intraclass correlation of 0.71 +/- 0.04 (0.51 +/- 0.10 for male offspring and 0.80 +/- 0.06 for female offspring), suggesting a significant family resemblance in the duration of interval timers. Parent-offspring regression analyses yielded an overall heritability estimate of 0.61 +/- 0.2; h(2) estimates from parent-offspring regression analyses were significant for female offspring (0.91 +/- 0.4) but not for male offspring (0.35 +/- 0.2), indicating strong additive genetic components for this trait, primarily in females. In nature, individual differences, both within and between
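
    A minimal sketch of the parent-offspring regression step, on synthetic data: with midparent values, the regression slope approximates the narrow-sense heritability h(2). The family size, trait scale and true h(2) below are invented for illustration.

```python
# Illustrative sketch (synthetic data) of a parent-offspring regression estimate of
# heritability: with midparent values, the regression slope approximates h(2).
import numpy as np

rng = np.random.default_rng(1)
n_families = 27
midparent = rng.normal(100.0, 10.0, n_families)      # e.g. timer duration, days
h2_true = 0.6                                        # assumed for the simulation
offspring = 100.0 + h2_true * (midparent - 100.0) + rng.normal(0.0, 8.0, n_families)

slope, intercept = np.polyfit(midparent, offspring, 1)
print(f"estimated heritability ~ {slope:.2f}")
```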

  8. X-ray CT analyses, models and numerical simulations: a comparison with petrophysical analyses in an experimental CO2 study

    NASA Astrophysics Data System (ADS)

    Henkel, Steven; Pudlo, Dieter; Enzmann, Frieder; Reitenbach, Viktor; Albrecht, Daniel; Ganzer, Leonhard; Gaupp, Reinhard

    2016-06-01

    An essential part of the collaborative research project H2STORE (hydrogen to store), which is funded by the German government, was a comparison of various analytical methods for characterizing reservoir sandstones from different stratigraphic units. In this context Permian, Triassic and Tertiary reservoir sandstones were analysed. Rock core materials, provided by RWE Gasspeicher GmbH (Dortmund, Germany), GDF Suez E&P Deutschland GmbH (Lingen, Germany), E.ON Gas Storage GmbH (Essen, Germany) and RAG Rohöl-Aufsuchungs Aktiengesellschaft (Vienna, Austria), were processed by different laboratory techniques; thin sections were prepared, rock fragments were crushed, and cubes of 1 cm edge length and plugs 3 to 5 cm in length with a diameter of about 2.5 cm were sawn from macroscopically homogeneous cores. With this prepared sample material, polarized light microscopy and scanning electron microscopy coupled with image analyses, specific surface area measurements (after Brunauer, Emmett and Teller, 1938; BET), He-porosity and N2-permeability measurements, and high-resolution microcomputer tomography (μ-CT), which was used for numerical simulations, were applied. All these methods were applied to mostly the same sample material, before and, on selected Permian sandstones, also after static CO2 experiments under reservoir conditions. A major concern in comparing the results of these methods is an appraisal of the reliability of the resulting porosity, permeability and mineral-specific reactive (inner) surface area data. The CO2 experiments modified the petrophysical as well as the mineralogical/geochemical rock properties. These changes are detectable by all applied analytical methods. Nevertheless, a major outcome of the high-resolution μ-CT analyses and the subsequent numerical simulations was that data sets and data interpretations quite similar to those of the different standard petrophysical methods were obtained. Moreover, the μ-CT analyses are not only time saving, but also

  9. TRACE ELEMENT ANALYSES OF URANIUM MATERIALS

    SciTech Connect

    Beals, D; Charles Shick, C

    2008-06-09

    The Savannah River National Laboratory (SRNL) has developed an analytical method to measure many trace elements in a variety of uranium materials at the high part-per-billion (ppb) to low part-per-million (ppm) levels using matrix removal and analysis by quadrupole ICP-MS. Over 35 elements were measured in uranium oxides, acetate, ore and metal. Replicate analyses of samples provided precise results; however, none of the materials was certified for trace element content, so the accuracy of the method could not be assessed. The DOE New Brunswick Laboratory (NBL) does provide a Certified Reference Material (CRM) that has provisional values for a series of trace elements. The NBL CRM was purchased and analyzed to determine the accuracy of the method for the analysis of trace elements in uranium oxide. These results are presented and discussed in the following paper.

  10. Ensemble decadal predictions from analysed initial conditions.

    PubMed

    Troccoli, Alberto; Palmer, T N

    2007-08-15

    Sensitivity experiments using a coupled model initialized from analysed atmospheric and oceanic observations are used to investigate the potential for interannual-to-decadal predictability. The potential for extending seasonal predictions to longer time scales is explored using the same coupled model configuration and initialization procedure as used for seasonal prediction. It is found that, despite model drift, climatic signals on interannual-to-decadal time scales appear to be detectable. Two climatic states have been chosen: one starting in 1965, i.e. ahead of a period of global cooling, and the other in 1994, ahead of a period of global warming. The impact of initial conditions and of the different levels of greenhouse gases are isolated in order to gain insights into the source of predictability.

  11. Analysing avian eggshell pigments with Raman spectroscopy.

    PubMed

    Thomas, Daniel B; Hauber, Mark E; Hanley, Daniel; Waterhouse, Geoffrey I N; Fraser, Sara; Gordon, Keith C

    2015-09-01

    Avian eggshells are variable in appearance, including coloration. Here, we demonstrate that Raman spectroscopy can provide accurate diagnostic information about major eggshell constituents, including the pigments biliverdin and protoporphyrin IX. Eggshells pigmented with biliverdin showed a series of pigment-diagnostic Raman peaks under 785 nm excitation. Eggshells pigmented with protoporphyrin IX showed strong emission under 1064 nm and 785 nm excitation, whereas resonance Raman spectra (351 nm excitation) showed a set of informative peaks characteristic of protoporphyrin IX. As representative examples, we identified biliverdin in the olive green eggshells of elegant crested tinamous (Eudromia elegans) and in the blue eggshells of extinct upland moa (Megalapteryx didinus). This study encourages the wider use of Raman spectroscopy in pigment and coloration research and highlights the value of this technique for non-destructive analyses of museum eggshell specimens. © 2015. Published by The Company of Biologists Ltd.

  12. FACS binding assay for analysing GDNF interactions.

    PubMed

    Quintino, Luís; Baudet, Aurélie; Larsson, Jonas; Lundberg, Cecilia

    2013-08-15

    Glial cell line-derived neurotrophic factor (GDNF) is a secreted protein with great therapeutic potential. However, in order to analyse the interactions between GDNF and its receptors, researchers have been mostly dependent on radioactive binding assays. We developed a FACS-based binding assay for GDNF as an alternative to current methods. We demonstrated that the FACS-based assay using TGW cells allowed ready detection of GDNF binding to, and displacement from, endogenous receptors. The dissociation constant and half maximal inhibitory concentration obtained were comparable to those of other studies using standard binding assays. Overall, this FACS-based assay, which is simple to perform and adaptable to a high-throughput setup, provides a safer and reliable alternative to radioactive methods. Copyright © 2013 Elsevier B.V. All rights reserved.
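
    The dissociation constant mentioned above is typically obtained by fitting a one-site binding model to the titration data; a minimal sketch with invented numbers (not data from this study) is:

```python
# Minimal sketch (invented numbers, not data from this study) of estimating a
# dissociation constant by fitting a one-site binding model B = Bmax * L / (Kd + L),
# e.g. with median fluorescence from a FACS binding assay as the readout.
import numpy as np
from scipy.optimize import curve_fit

def one_site(L, bmax, kd):
    return bmax * L / (kd + L)

ligand_nM = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
signal = np.array([8.0, 21.0, 48.0, 75.0, 92.0, 99.0, 102.0])   # arbitrary units

(bmax, kd), _ = curve_fit(one_site, ligand_nM, signal, p0=(100.0, 1.0))
print(f"Bmax ~ {bmax:.0f} a.u., Kd ~ {kd:.1f} nM")
```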

  13. Project analysis and integration economic analyses summary

    NASA Technical Reports Server (NTRS)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  14. Phylogenomic Analyses Support Traditional Relationships within Cnidaria.

    PubMed

    Zapata, Felipe; Goetz, Freya E; Smith, Stephen A; Howison, Mark; Siebert, Stefan; Church, Samuel H; Sanders, Steven M; Ames, Cheryl Lewis; McFadden, Catherine S; France, Scott C; Daly, Marymegan; Collins, Allen G; Haddock, Steven H D; Dunn, Casey W; Cartwright, Paulyn

    2015-01-01

    Cnidaria, the sister group to Bilateria, is a highly diverse group of animals in terms of morphology, lifecycles, ecology, and development. How this diversity originated and evolved is not well understood because phylogenetic relationships among major cnidarian lineages are unclear, and recent studies present contrasting phylogenetic hypotheses. Here, we use transcriptome data from 15 newly-sequenced species in combination with 26 publicly available genomes and transcriptomes to assess phylogenetic relationships among major cnidarian lineages. Phylogenetic analyses using different partition schemes and models of molecular evolution, as well as topology tests for alternative phylogenetic relationships, support the monophyly of Medusozoa, Anthozoa, Octocorallia, Hydrozoa, and a clade consisting of Staurozoa, Cubozoa, and Scyphozoa. Support for the monophyly of Hexacorallia is weak due to the equivocal position of Ceriantharia. Taken together, these results further resolve deep cnidarian relationships, largely support traditional phylogenetic views on relationships, and provide a historical framework for studying the evolutionary processes involved in one of the most ancient animal radiations.

  15. Error analyses for a gravity gradiometer mission

    NASA Technical Reports Server (NTRS)

    Kahn, W. D.; Von Bun, F. O.

    1985-01-01

    This paper addresses the usefulness of an orbiting gravity gradiometer as a sensor for mapping the fine structure of the earth's gravity field. Exact knowledge of this field is essential for studies of the solid earth and the dynamics of the oceans. Although the earth's gravity gradient tensor, measured by a gradiometer assembly, has nine components, only five components are independent. This is a consequence of the symmetry and conservative nature of the earth's gravity field. The most dominant component is the radial one. The error analyses considered here are therefore based only upon a single-axis gradiometer sensing this radial component. The expected global gravity and geoid errors for a 50 x 50-km (1/2 x 1/2 deg) area utilizing a spaceborne gradiometer with a precision of 0.001 EU in a 160-km circular polar orbit are about 3 mGal and 5 cm, respectively.
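
    The counting behind the statement that only five of the nine components are independent follows from standard potential theory (symmetry of second derivatives plus Laplace's equation outside the attracting masses); it is made explicit below, independently of this paper.

```latex
% Counting the independent components of the gravity gradient tensor
% (standard potential theory; included only to make the statement explicit).
\[
  \Gamma_{ij} = \frac{\partial^2 V}{\partial x_i \,\partial x_j},
  \qquad
  \Gamma_{ij} = \Gamma_{ji}
  \;\Rightarrow\; \text{6 independent entries},
\]
\[
  \nabla^2 V = \Gamma_{xx} + \Gamma_{yy} + \Gamma_{zz} = 0
  \;\;\text{(outside the attracting masses)}
  \;\Rightarrow\; 6 - 1 = 5 \text{ independent components.}
\]
```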

  16. Precise Chemical Analyses of Planetary Surfaces

    NASA Technical Reports Server (NTRS)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  17. Determining Significant Endpoints for Ecological risk Analyses

    SciTech Connect

    Hinton, Thomas G.; Bedford, Joel

    1999-06-01

    Our interest is in obtaining a scientifically defensible endpoint for measuring ecological risks to populations exposed to chronic, low-level radiation, and radiation with concomitant exposure to chemicals. To do so, we believe that we must understand the extent to which molecular damage is detrimental at the individual and population levels of biological organization. Ecological risk analyses based on molecular damage, without an understanding of the impacts to higher levels of biological organization, could cause cleanup strategies on DOE sites to be overly conservative and unnecessarily expensive. Our goal is to determine the relevancy of sublethal cellular damage to the performance of individuals and populations. We think that we can achieve this by using novel biological dosimeters in controlled, manipulative dose/effects experiments, and by coupling changes in metabolic rates and energy allocation patterns to meaningful population response variables such as age-specific survivorship, reproductive output, age at maturity and longevity.

  18. An introduction to modern missing data analyses.

    PubMed

    Baraldi, Amanda N; Enders, Craig K

    2010-02-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous relative to traditional techniques (e.g., deletion and mean imputation) because they require less stringent assumptions and mitigate the pitfalls of those techniques. This article explains the theoretical underpinnings of missing data analyses, gives an overview of traditional missing data techniques, and provides accessible descriptions of maximum likelihood and multiple imputation. In particular, this article focuses on maximum likelihood estimation and presents two analysis examples from the Longitudinal Study of American Youth data. One of these examples includes a description of the use of auxiliary variables. Finally, the paper illustrates ways that researchers can use intentional, or planned, missing data to enhance their research designs.
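
    A minimal sketch of model-based multiple imputation is given below, using scikit-learn's IterativeImputer as one of several possible tools (this is not the software used in the article); the data and missingness pattern are synthetic.

```python
# Minimal sketch of model-based multiple imputation with scikit-learn's IterativeImputer
# (one possible tool; not the software used in the article). Data are synthetic.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))
X[:, 2] += 0.8 * X[:, 0]                      # variables are related
X[rng.random(200) < 0.2, 2] = np.nan          # ~20% missing on one variable

completed = []
for m in range(5):                            # m imputed datasets
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    completed.append(imp.fit_transform(X))

# Analyse each completed dataset and combine the estimates;
# full Rubin's rules would also pool the within- and between-imputation variances.
means = [Xc[:, 2].mean() for Xc in completed]
print(f"pooled mean of the incomplete variable: {np.mean(means):.3f}")
```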

  19. [Use of pharmacoeconomics analyses to health protection].

    PubMed

    Drozd, Mariola

    2002-01-01

    Pharmacoeconomics makes possible the most favourable use of the capital resources allocated to health protection. In economic analysis, the health effects of a disease and of its treatment are expressed in absolute values with a common base: money. An economic analysis is usually carried out from a particular perspective; something that is a cost for one party can be a benefit for another. This work is a review of the available Polish literature describing the main assumptions of pharmacoeconomics and its instruments, the pharmacoeconomic analyses. As a result of the review it has been ascertained that modern medicine cannot do without economics. At present, capital resources are constantly too small, so the profitability of a given method of therapy or of a drug must be assessed all the time.

  20. Neutronic Analyses of the Trade Demonstration Facility

    SciTech Connect

    Rubbia, C.

    2004-09-15

    The TRiga Accelerator-Driven Experiment (TRADE), to be performed in the TRIGA reactor of the ENEA-Casaccia Centre in Italy, consists of the coupling of an external proton accelerator to a target to be installed in the central channel of the reactor scrammed to subcriticality. This pilot experiment, aimed at a global demonstration of the accelerator-driven system concept, is based on an original idea of C. Rubbia. The present paper reports the results of some neutronic analyses focused on the feasibility of TRADE. Results show that all relevant experiments (at different power levels in a wide range of subcriticalities) can be carried out with relatively limited modifications to the present TRIGA reactor.

  1. Analyses of moisture in polymers and composites

    NASA Technical Reports Server (NTRS)

    Ryan, L. E.; Vaughan, R. W.

    1980-01-01

    A suitable method for the direct measurement of moisture concentrations after humidity/thermal exposure on state-of-the-art epoxy and polyimide resins and their graphite and glass fiber reinforcements was investigated. Methods for the determination of moisture concentration profiles, moisture diffusion modeling and moisture-induced chemical changes were examined. Carefully fabricated, precharacterized epoxy and polyimide neat resins and their AS graphite and S glass reinforced composites were exposed to humid conditions using heavy water (D2O), at ambient and elevated temperatures. These specimens were fixtured to theoretically limit the D2O permeation to a unidirectional penetration axis. The analytical techniques evaluated were: (1) laser pyrolysis gas chromatography mass spectrometry; (2) solids probe mass spectrometry; (3) laser pyrolysis conventional infrared spectroscopy; and (4) infrared imaging thermovision. The most reproducible and sensitive technique was solids probe mass spectrometry. The fabricated, exposed specimens were analyzed for D2O profiling after humidity/thermal conditioning at three exposure time durations.

  2. ANALYSES OF WOUND EXUDATES FOR CLOSTRIDIAL TOXINS

    PubMed Central

    Noyes, Howard E.; Pritchard, William L.; Brinkley, Floyd B.; Mendelson, Janice A.

    1964-01-01

    Noyes, Howard E. (Walter Reed Army Institute of Research, Washington, D.C.), William L. Pritchard, Floyd B. Brinkley, and Janice A. Mendelson. Analyses of wound exudates for clostridial toxins. J. Bacteriol. 87:623–629. 1964.—Earlier studies indicated that death of goats with traumatic wounds of the hindquarter could be related to the number of clostridia in the wounds, and that toxicity of wound exudates for mice and guinea pigs could be partially neutralized by commercial trivalent gas gangrene antitoxin. This report describes in vitro and in vivo analyses of wound exudates for known clostridial toxins. Wounds were produced by detonation of high-explosive pellets. Wound exudates were obtained by cold saline extraction of both necrotic tissues and gauze sponges used to cover the wounds. Exudates were sterilized by Seitz filtration in the cold. In vitro tests were used to measure alpha-, theta-, and mu-toxins of Clostridium perfringens and the epsilon-toxin of C. novyi. Mouse protection tests, employing commercial typing antisera, were used to analyze exudates for other clostridial toxins. Lethality of wound exudates for mice could be related to (i) the numbers of clostridia present in the wound, (ii) survival time of the goats, and (iii) positive lecithovitellin (LV) tests of the exudates. However, the LV tests could not be neutralized by antitoxin specific for C. perfringens alpha-toxin. Mice were not protected by typing antisera specific for types A, C, or D C. perfringens or C. septicum but were protected by antisera specific for type B C. perfringens and types A and B C. novyi. PMID:14127581

  3. GPU based framework for geospatial analyses

    NASA Astrophysics Data System (ADS)

    Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus

    2017-04-01

    Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just at the beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The need for high-speed computation in the geosciences has increased in the last 10 years, mostly due to the growth of the available datasets. These datasets are becoming more and more detailed, and hence they require more space to store and more time to process. Distributed computation on multicore CPUs and GPUs plays an important role by processing these big datasets in small parts, one by one. This way of computing speeds up the process because, instead of using just one process for each dataset, the user can use all the cores of a CPU or up to hundreds of cores of a GPU. The framework provides the end user with standalone tools for morphometric analyses at multiple scales. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. The uncertainty may come from the data collection, may be induced by the model, or may have many other sources. These uncertainties play an important role when a spatial delineation of a phenomenon is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with the standalone tools proved to be a reliable tool for modelling complex natural phenomena. The framework is based on NVidia Cuda technology and is written in the C++ programming language. The source code will be available on github at https://github.com/sandricionut/GeoRsGPU. Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
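
    A CPU-side sketch of the Monte Carlo uncertainty-propagation idea is given below; the toy DEM, the assumed 1.5 m elevation error and the slope operator are illustrative only, and in the framework described above the equivalent per-cell work would run in CUDA kernels rather than NumPy.

```python
# CPU-side sketch (NumPy) of Monte Carlo uncertainty propagation for a terrain
# derivative; the toy DEM, the 1.5 m elevation error and the slope operator are
# illustrative assumptions. In the framework above, the per-cell work would run
# as CUDA kernels instead.
import numpy as np

rng = np.random.default_rng(3)
dem = rng.normal(500.0, 5.0, (100, 100))      # toy DEM, metres
cell = 30.0                                   # cell size, metres
sigma_z = 1.5                                 # assumed elevation error (1 sigma), metres

def slope_deg(z):
    gy, gx = np.gradient(z, cell)
    return np.degrees(np.arctan(np.hypot(gx, gy)))

runs = np.stack([slope_deg(dem + rng.normal(0.0, sigma_z, dem.shape))
                 for _ in range(200)])        # 200 Monte Carlo realizations

mean_slope = runs.mean(axis=0)
slope_sd = runs.std(axis=0)
print(f"cell (50, 50): mean slope {mean_slope[50, 50]:.1f} deg, sd {slope_sd[50, 50]:.1f} deg")
```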

  4. Interpreting cost analyses of clinical interventions.

    PubMed

    Balas, E A; Kretschmer, R A; Gnann, W; West, D A; Boren, S A; Centor, R M; Nerlich, M; Gupta, M; West, T D; Soderstrom, N S

    1998-01-07

    In the present era of cost containment, physicians need reliable data about specific interventions. The objectives of this study were to assist practitioners in interpretation of economic analyses and estimation of their own costs of implementing recommended interventions. MEDLINE search from 1966 through 1995 using the text words cost or expense and medical subject heading (MeSH) terms costs and cost analysis, cost control, cost of illness, cost savings, or cost-benefit analysis. The 4 eligibility criteria were clinical trial with random assignment; health care quality improvement intervention tested; effects measured on the process or outcome of care; and cost calculation mentioned in the report. After independent abstraction and after consensus development, financial data were entered into a costing protocol to determine which costs related to the intervention were provided. Of 181 articles, 97 (53.6%) included actual numbers on the costs of the intervention. Of 97 articles analyzed, the most frequently reported cost figures were in the category of operating expenses (direct cost, 61.9%; labor, 42.3%; and supplies, 32.0%). General overhead was not presented in 91 (93.8%) of the 97 studies. Only 14 (14.4%) of the 97 studies mentioned start-up costs. The text word $ in the abstract and the most useful MeSH index term of cost-benefit analysis appeared with nearly equal frequency in the articles that included actual cost data (37.1 % vs 35.1%). Two thirds of articles indexed with the MeSH term cost control did not include cost figures. Statements regarding cost without substantiating data are made habitually in reports of clinical trials. In clinical trial reports presenting data on expenditures, start-up costs and general overhead are frequently disregarded. Practitioners can detect missing information by placing cost data in a standardized protocol. The costing protocol of this study can help bridge care delivery and economic analyses.

  5. Topological Analyses of Symmetric Eruptive Prominences

    NASA Astrophysics Data System (ADS)

    Panasenco, O.; Martin, S. F.

    Erupting prominences (filaments) that we have analyzed from Hα Doppler data at Helio Research and from SOHO/EIT 304 Å show strong coherency between their chirality, the direction of the vertical and lateral motions of the top of the prominences, and the directions of twisting of their legs. These coherent properties in erupting prominences occur in two patterns of opposite helicity; they constitute a form of dynamic chirality called the "roll effect." Viewed from the positive network side as they erupt, many symmetrically-erupting dextral prominences develop rolling motion toward the observer along with right-hand helicity in the left leg and left-hand helicity in the right leg. Many symmetrically-erupting sinistral prominences, also viewed from the positive network field side, have the opposite pattern: rolling motion at the top away from the observer, left-hand helical twist in the left leg, and right-hand twist in the right leg. We have analysed the motions seen in the famous movie of the "Grand Daddy" erupting prominence and found that it has all the motions that define the roll effect. From our analyses of this and other symmetric erupting prominences, we show that the roll effect is an alternative to the popular hypothetical configuration of an eruptive prominence as a twisted flux rope or flux tube. Instead we find that a simple flat ribbon can be bent such that it reproduces nearly all of the observed forms. The flat ribbon is the most logical beginning topology because observed prominence spines already have this topology prior to eruption, and an initial long magnetic ribbon with parallel, non-twisted threads, as a basic form, can be bent into many more and different geometrical forms than a flux rope.

  6. Indirect Comparisons and Network Meta-Analyses.

    PubMed

    Kiefer, Corinna; Sturtz, Sibylle; Bender, Ralf

    2015-11-20

    Systematic reviews provide a structured summary of the results of trials that have been carried out on any particular subject. If the data from multiple trials are sufficiently homogeneous, a meta-analysis can be performed to calculate pooled effect estimates. Traditional meta-analysis involves groups of trials that compare the same two interventions directly (head to head). Lately, however, indirect comparisons and network meta-analyses have become increasingly common. Various methods of indirect comparison and network meta-analysis are presented and discussed on the basis of a selective review of the literature. The main assumptions and requirements of these methods are described, and a checklist is provided as an aid to the evaluation of published indirect comparisons and network meta-analyses. When no head-to-head trials of two interventions are available, indirect comparisons and network meta-analyses enable the estimation of effects as well as the simultaneous analysis of networks involving more than two interventions. Network meta-analyses and indirect comparisons can only be useful if the trial or patient characteristics are similar and the observed effects are sufficiently homogeneous. Moreover, there should be no major discrepancy between the direct and indirect evidence. If trials are available that compare each of two treatments against a third one, but not against each other, then the third intervention can be used as a common comparator to enable a comparison of the other two. Indirect comparisons and network meta-analyses are an important further development of traditional meta-analysis. Clear and detailed documentation is needed so that findings obtained by these new methods can be reliably judged.
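
    A minimal sketch of an adjusted indirect comparison (the Bucher approach, one common method): when trials compare A vs. C and B vs. C, the A vs. B effect is estimated as the difference of the two direct effects, with their variances added. The effect sizes below are invented for illustration.

```python
# Minimal sketch of an adjusted indirect comparison (Bucher approach): with trials of
# A vs. C and B vs. C, the indirect A vs. B effect is d_AC - d_BC and the variances add.
# The effect sizes and standard errors below are invented for illustration.
import math

d_ac, se_ac = -0.30, 0.10   # e.g. log odds ratio, A vs. common comparator C
d_bc, se_bc = -0.10, 0.12   # log odds ratio, B vs. C

d_ab = d_ac - d_bc
se_ab = math.sqrt(se_ac**2 + se_bc**2)
ci_low, ci_high = d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab
print(f"indirect A vs. B: {d_ab:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```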

  7. Analyse de formes par moiré

    NASA Astrophysics Data System (ADS)

    Harthong, J.; Sahli, H.; Poinsignon, R.; Meyrueis, P.

    1991-01-01

    We present a mathematical analysis of moiré phenomena for shape recognition. The basic theoretical concept - and tool - will be the contour function. We show that the mathematical analysis is greatly simplified by the systematic recourse to this tool. The analysis presented permits a simultaneous treatment of two different modes of implementing the moiré technique: the direct mode (widely used and well-known), and the converse mode (scarcely used). The converse mode consists in computing and designing a grating especially for one model of object, in such a manner that if (and only if) the object is in conformity with the prescribed model, the resulting moiré fringes are parallel straight lines. We give explicit formulas and algorithms for such computations.

  8. Life cycle analyses and resource assessments.

    PubMed

    Fredga, Karl; Mäler, Karl-Göran

    2010-01-01

    Prof. Ulgiati stresses that we should always use an ecosystem view when transforming energy from one form to another. Sustainable growth and development of both environmental and human-dominated systems require optimum use of available resources for maximum power output. We have to adapt to the laws of nature because nature has to take care of all the waste products we produce. The presentation addresses a much-needed shift away from linear production and consumption patterns, toward a reorganization of economies and lifestyles that takes complexity--of resources, of the environment and of the economy--into proper account. The best way to reach maximum yield from the different kinds of biomass is to use biorefineries. Biorefinery is defined as the sustainable processing of biomass into a spectrum of marketable products like heat, power, fuels, chemicals, food, feed, and materials. However, biomass from agricultural land must be used for the production of food and not fuel. Prof. Voss focuses on the sustainability of energy supply chains and energy systems. Life cycle analysis (LCA) provides the conceptual framework for a comprehensive comparative evaluation of energy supply options with regard to their resource requirements as well as their health and environmental impact. Full-scope LCA considers not only the emissions from plant operation, construction, and decommissioning but also the environmental burdens and resource requirements associated with the entire lifetime of all relevant upstream and downstream processes within the energy chain. This article describes the results of LCA for state-of-the-art heating and electricity systems as well as for advanced future systems. Total costs are used as a measure for the overall resource consumption.

  9. Reporting guidelines for population pharmacokinetic analyses.

    PubMed

    Dykstra, Kevin; Mehrotra, Nitin; Tornøe, Christoffer Wenzel; Kastrissios, Helen; Patel, Bela; Al-Huniti, Nidal; Jadhav, Pravin; Wang, Yaning; Byon, Wonkyung

    2015-06-01

    The purpose of this work was to develop a consolidated set of guiding principles for reporting of population pharmacokinetic (PK) analyses based on input from a survey of practitioners as well as discussions between industry, consulting and regulatory scientists. The survey found that identification of population covariate effects on drug exposure and support for dose selection (where population PK frequently serves as preparatory analysis to exposure-response modeling) are the main areas of influence for population PK analysis. The proposed guidelines consider two main purposes of population PK reports: (1) to present key analysis findings and their impact on drug development decisions, and (2) to document the analysis methods, both to enable review of the analysis and to facilitate future use of the models. This work also identified two main audiences for the reports: (1) a technically competent group responsible for in-depth review of the data, methodology, and results, and (2) a scientifically literate, but not technically adept group, whose main interest is in the implications of the analysis for the broader drug development program. We recommend a generalized question-based approach with six questions that need to be addressed throughout the report. We recommend eight sections (Synopsis, Introduction, Data, Methods, Results, Discussion, Conclusions, Appendix) with suggestions for the target audience and level of detail for each section. A section providing general expectations regarding population PK reporting from a regulatory perspective is also included. We consider this an important step towards industrialization of the field of pharmacometrics, such that a non-technical audience also understands the role of pharmacometric analyses in decision making. Population PK reports were chosen as representative reports to derive these recommendations; however, the guiding principles presented here are applicable to all pharmacometric reports.

  10. Energy adjustment methods applied to alcohol analyses.

    PubMed

    Johansen, Ditte; Andersen, Per K; Overvad, Kim; Jensen, Gorm; Schnohr, Peter; Sørensen, Thorkild I A; Grønbaek, Morten

    2003-01-01

    When alcohol consumption is related to outcome, associations between alcohol type and health outcomes may occur simply because of the ethanol in the beverage type. When one analyzes the consequences of consumption of beer, wine, and spirits, the total alcohol intake must therefore be taken into account. However, owing to the linear dependency between total alcohol intake and the alcohol content of each beverage type, the effects cannot be separated from each other or from the effect of ethanol. In nutritional epidemiology, similar problems regarding intake of macronutrients and total energy intake have been addressed, and four methods have been proposed to solve the problem: energy partition, standard, density, and residual. The aim of this study was to evaluate the usefulness of the energy adjustment methods in alcohol analyses by using coronary heart disease as an example. Data obtained from the Copenhagen City Heart Study were used. The standard and energy partition methods yielded similar results for continuous, and almost similar results for categorical, alcohol variables. The results from the density method differed, but were nevertheless concordant with these. Beer and wine drinkers, in comparison with nondrinkers, had a lower risk of coronary heart disease. Except in the case of men drinking beer, the effect seemed to be associated with drinking one drink per week. The standard method estimates the effect of substituting one alcohol type for another at constant total alcohol intake and complements the estimate of adding consumption of a particular alcohol type to the total intake. For most diseases, the effect of ethanol predominates over that of the other substances in the beverage type, which makes the density method less relevant in alcohol analyses.
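
    A minimal sketch of the residual adjustment idea on synthetic data: the beverage-specific intake is regressed on total alcohol intake, and the residuals serve as the adjusted exposure in subsequent disease models. Variable names and distributions are assumptions, not the Copenhagen City Heart Study data.

```python
# Minimal sketch (synthetic data, not the Copenhagen City Heart Study) of the residual
# adjustment method: regress beverage-specific intake on total ethanol intake and use
# the residuals as the total-alcohol-adjusted exposure in subsequent disease models.
import numpy as np

rng = np.random.default_rng(4)
total = rng.gamma(shape=2.0, scale=6.0, size=500)                  # drinks/week, all beverages
wine = np.clip(0.4 * total + rng.normal(0.0, 2.0, 500), 0, None)   # drinks/week, wine

slope, intercept = np.polyfit(total, wine, 1)
wine_residual = wine - (intercept + slope * total)                 # adjusted wine exposure

# By construction the residual is uncorrelated with total intake.
print(f"corr(residual, total) = {np.corrcoef(wine_residual, total)[0, 1]:.3f}")
```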

  11. Image Mission Attitude Support Experiences

    NASA Technical Reports Server (NTRS)

    Ottenstein, N.; Challa, M.; Home, A.; Harman, R.; Burley, R.

    2001-01-01

    The spin-stabilized Imager for Magnetopause to Aurora Global Exploration (IMAGE) is the National Aeronautics and Space Administration's (NASA's) first Medium-class Explorer Mission (MIDEX). IMAGE was launched into a highly elliptical polar orbit on March 25, 2000 from Vandenberg Air Force Base, California, aboard a Boeing Delta II 7326 launch vehicle. This paper presents some of the observations of the flight dynamics analyses during the launch and in-orbit checkout period through May 18, 2000. Three new algorithms - one algebraic and two differential correction - for computing the parameters of the coning motion of a spacecraft are described and evaluated using in-flight data from the autonomous star tracker (AST) on IMAGE. Other attitude aspects highlighted include support for active damping consequent upon the failure of the passive nutation damper, performance evaluation of the AST, evaluation of the Sun sensor and magnetometer using AST data, and magnetometer calibration.

  12. Noninvasive imaging of bone microarchitecture

    PubMed Central

    Patsch, Janina M.; Burghardt, Andrew J.; Kazakia, Galateia; Majumdar, Sharmila

    2015-01-01

    The noninvasive quantification of peripheral compartment-specific bone microarchitecture is feasible with high-resolution peripheral quantitative computed tomography (HR-pQCT) and high-resolution magnetic resonance imaging (HR-MRI). In addition to classic morphometric indices, both techniques provide a suitable basis for virtual biomechanical testing using finite element (FE) analyses. Methodical limitations, morphometric parameter definition, and motion artifacts have to be considered to achieve optimal data interpretation from imaging studies. With increasing availability of in vivo high-resolution bone imaging techniques, special emphasis should be put on quality control including multicenter, cross-site validations. Importantly, conclusions from interventional studies investigating the effects of antiosteoporotic drugs on bone microarchitecture should be drawn with care, ideally involving imaging scientists, translational researchers, and clinicians. PMID:22172043

  13. Medical imaging

    NASA Astrophysics Data System (ADS)

    Elliott, Alex

    2005-07-01

    Diagnostic medical imaging is a fundamental part of the practice of modern medicine and is responsible for the expenditure of considerable amounts of capital and revenue monies in healthcare systems around the world. Much research and development work is carried out, both by commercial companies and the academic community. This paper reviews briefly each of the major diagnostic medical imaging techniques—X-ray (planar and CT), ultrasound, nuclear medicine (planar, SPECT and PET) and magnetic resonance. The technical challenges facing each are highlighted, with some of the most recent developments. In terms of the future, interventional/peri-operative imaging, the advancement of molecular medicine and gene therapy are identified as potential areas of expansion.

  14. Ultrasonic Evaluation and Imaging

    SciTech Connect

    Crawford, Susan L.; Anderson, Michael T.; Diaz, Aaron A.; Larche, Michael R.; Prowant, Matthew S.; Cinson, Anthony D.

    2015-10-01

    Ultrasonic evaluation of materials for material characterization and flaw detection is as simple as manually moving a single-element probe across a specimen and looking at an oscilloscope display in real time or as complex as automatically (under computer control) scanning a phased-array probe across a specimen and collecting encoded data for immediate or off-line data analyses. The reliability of the results in the second technique is greatly increased because of a higher density of measurements per scanned area and measurements that can be more precisely related to the specimen geometry. This chapter will briefly discuss applications of the collection of spatially encoded data and focus primarily on the off-line analyses in the form of data imaging. Pacific Northwest National Laboratory (PNNL) has been involved with assessing and advancing the reliability of inservice inspections of nuclear power plant components for over 35 years. Modern ultrasonic imaging techniques such as the synthetic aperture focusing technique (SAFT), phased-array (PA) technology and sound field mapping have undergone considerable improvements to effectively assess and better understand material constraints.

  15. Phloem imaging.

    PubMed

    Truernit, Elisabeth

    2014-04-01

    The phloem is the long-distance solute-conducting tissue of plants. The observation of phloem cells is particularly challenging for several reasons and many recent advances in microscopy are, therefore, especially beneficial for the study of phloem anatomy and physiology. This review will give an overview of the imaging techniques that have been used for studying different aspects of phloem biology. It will also highlight some new imaging techniques that have emerged in recent years that will certainly advance our knowledge about phloem function.

  16. Lung imaging.

    PubMed

    Ley, Sebastian

    2015-06-01

    Imaging of the lung is a mainstay of respiratory medicine. It provides local information about morphology and function of the lung parenchyma that is unchallenged by other noninvasive techniques. During the 2014 European Respiratory Society International Congress in Munich, Germany, a Clinical Year in Review session was held focusing on the latest developments in pulmonary imaging. This review summarises some of the main findings of peer-reviewed articles that were published in the 12-month period prior to the 2014 International Congress. Copyright ©ERS 2015.

  17. Brain Imaging

    PubMed Central

    Racine, Eric; Bar-Ilan, Ofek; Illes, Judy

    2007-01-01

    Advances in neuroscience are increasingly intersecting with issues of ethical, legal, and social interest. This study is an analysis of press coverage of an advanced technology for brain imaging, functional magnetic resonance imaging, that has gained significant public visibility over the past ten years. Discussion of issues of scientific validity and interpretation dominated over ethical content in both the popular and specialized press. Coverage of research on higher order cognitive phenomena specifically attributed broad personal and societal meaning to neuroimages. The authors conclude that neuroscience provides an ideal model for exploring science communication and ethics in a multicultural context. PMID:17330151

  18. Database-Driven Analyses of Astronomical Spectra

    NASA Astrophysics Data System (ADS)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band. The precise transition frequencies depend on the masses of the atoms involved, and hence on the isotope. Molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic
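
    A small illustration of why one vibrational transition appears as a band of many ro-vibrational lines: P- and R-branch line positions in the rigid-rotor approximation, nu = nu0 + B'J'(J'+1) - B''J''(J''+1). The band origin and rotational constants below are generic CO-like values assumed for illustration.

```python
# Illustrative ro-vibrational band structure in the rigid-rotor approximation:
# line positions nu = nu0 + B_u*J'(J'+1) - B_l*J"(J"+1) for the P and R branches.
# Band origin and rotational constants are generic CO-like values, assumed here.
import numpy as np

nu0 = 2143.0                    # band origin, cm^-1 (assumed)
B_l, B_u = 1.92, 1.90           # lower/upper-state rotational constants, cm^-1 (assumed)

J = np.arange(0, 20)
R_branch = nu0 + B_u * (J + 1) * (J + 2) - B_l * J * (J + 1)        # J -> J + 1
Jp = np.arange(1, 20)
P_branch = nu0 + B_u * (Jp - 1) * Jp - B_l * Jp * (Jp + 1)          # J -> J - 1

print("first R-branch lines (cm^-1):", np.round(R_branch[:3], 2))
print("first P-branch lines (cm^-1):", np.round(P_branch[:3], 2))
```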

  19. Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.

    PubMed

    Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi

    2017-05-01

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.

  20. Study of spin-scan imaging for outer planets missions. [imaging techniques for Jupiter orbiter missions

    NASA Technical Reports Server (NTRS)

    Russell, E. E.; Chandos, R. A.; Kodak, J. C.; Pellicori, S. F.; Tomasko, M. G.

    1974-01-01

    The constraints that are imposed on the Outer Planet Missions (OPM) imager design are of critical importance. Imager system modeling analyses define important parameters and systematic means for trade-offs applied to specific Jupiter orbiter missions. Possible image sequence plans for Jupiter missions are discussed in detail. Considered is a series of orbits that allow repeated near encounters with three of the Jovian satellites. The data handling involved in the image processing is discussed, and it is shown that only minimal processing is required for the majority of images for a Jupiter orbiter mission.

  1. Efficient ALL vs. ALL collision risk analyses

    NASA Astrophysics Data System (ADS)

    Escobar, D.; Paskowitz, M.; Agueda, A.; Garcia, G.; Molina, M.

    2011-09-01

    In recent years, space debris has gained a lot of attention due to the increasing amount of uncontrolled man-made objects orbiting the Earth. This population poses a significant and constantly growing threat to operational satellites. In order to face this threat in an independent manner, ESA has launched an initiative for the development of a European SSA System, where GMV is participating via several activities. Apart from those activities financed by ESA, GMV has developed closeap, a tool for efficient conjunction assessment and collision probability prediction. ESA's NAPEOS has been selected as the computational engine and numerical propagator to be used in the tool, which can be considered as an add-on to the standard NAPEOS package. closeap makes use of the same orbit computation, conjunction assessment and collision risk algorithms implemented in CRASS, but at the same time both systems are completely independent. Moreover, the implementation in closeap has been validated against CRASS with excellent results. This paper describes the performance improvements implemented in closeap at algorithm level to ensure that the most time-demanding scenarios (e.g., all catalogued objects are analysed against each other - all vs. all scenarios) can be analysed in a reasonable amount of time with commercial-off-the-shelf hardware. However, the amount of space debris increases steadily due to human activities. Thus, the number of objects involved in a full collision assessment is expected to increase notably and, consequently, the computational cost, which scales as the square of the number of objects, will increase as well. Additionally, orbit propagation algorithms that are computationally expensive might be needed to predict more accurately the trajectories of the space debris. In order to cope with such computational needs, the next natural step in the development of collision assessment tools is the use of parallelization techniques. In this paper we investigate
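
    The combinatorial core of an all vs. all screening can be sketched as follows; real tools such as the ones described above screen full time-histories of propagated ephemerides with smart filters, whereas this toy example only shows the O(N^2) pair loop on a random snapshot of positions.

```python
# Toy sketch of the combinatorial core of an "all vs. all" screening: N*(N-1)/2 pairs,
# each checked against a miss-distance threshold. Positions are random stand-ins for
# propagated ephemerides; real tools screen full time histories with smart filters.
import numpy as np

rng = np.random.default_rng(5)
n_objects = 2000
pos_km = rng.uniform(-7000.0, 7000.0, (n_objects, 3))   # toy snapshot positions

threshold_km = 10.0
close_pairs = []
for i in range(n_objects):                               # O(N^2) pair loop
    d = np.linalg.norm(pos_km[i + 1:] - pos_km[i], axis=1)
    close_pairs.extend((i, i + 1 + j) for j in np.flatnonzero(d < threshold_km))

print(f"pairs screened: {n_objects * (n_objects - 1) // 2}, close pairs: {len(close_pairs)}")
```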

  2. MRI (Magnetic Resonance Imaging)

    MedlinePlus

    Magnetic Resonance Imaging (MRI) is a medical imaging procedure for making ...

  3. High performance liquid chromatography in pharmaceutical analyses.

    PubMed

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    Over the last ten years, in pre-marketing testing of drugs and in their subsequent control, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a method complementary to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of changing its polarity during chromatography and of making other modifications to the mobile phase depending on the characteristics of the substance being tested, is a great advantage of this separation process in comparison with other methods. The wider choice of stationary phases is a further factor that enables good separation. The separation column connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector), as well as hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, to provide quantitative results, and to monitor the progress of therapy of a disease. Figure 1 shows a chromatogram obtained from the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  4. Hierarchical Segmentation Enhances Diagnostic Imaging

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Bartron Medical Imaging LLC (BMI), of New Haven, Connecticut, gained a nonexclusive license from Goddard Space Flight Center to use the RHSEG software in medical imaging. To manage image data, BMI then licensed two pattern-matching software programs from NASA's Jet Propulsion Laboratory that were used in image analysis and three data-mining and edge-detection programs from Kennedy Space Center. More recently, BMI made NASA history by being the first company to partner with the Space Agency through a Cooperative Research and Development Agreement to develop a 3-D version of RHSEG. With U.S. Food and Drug Administration clearance, BMI will sell its Med-Seg imaging system with the 2-D version of the RHSEG software to analyze medical imagery from CAT and PET scans, MRI, ultrasound, digitized X-rays, digitized mammograms, dental X-rays, soft tissue analyses, moving object analyses, and soft-tissue slides such as Pap smears for the diagnosis and management of diseases. Extending the software's capabilities to three dimensions will eventually enable production of pixel-level views of a tumor or lesion, early identification of plaque build-up in arteries, and identification of density levels of microcalcifications in mammograms.

  5. Imaging sciences workshop

    SciTech Connect

    Candy, J.V.

    1994-11-15

    This workshop on the Imaging Sciences sponsored by Lawrence Livermore National Laboratory contains short abstracts/articles submitted by speakers. The topic areas covered include the following: Astronomical Imaging; biomedical imaging; vision/image display; imaging hardware; imaging software; Acoustic/oceanic imaging; microwave/acoustic imaging; computed tomography; physical imaging; imaging algorithms. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  6. Operational Satellite-based Surface Oil Analyses (Invited)

    NASA Astrophysics Data System (ADS)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-Skymed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), Aster (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, location of natural oil seeps and a variety of in situ oil observations. The analyses were available as jpegs, pdfs, shapefiles and through Google, KML files and also available on a variety of websites including Geoplatform and ERMA. From the very first analysis issued just 5 hours after the rig sank through the final analysis issued in August, the complete archive is still publicly available on the NOAA/NESDIS website http://www.ssd.noaa.gov/PS/MPS/deepwater.html SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager’s primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  7. Image Processing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Images are prepared from data acquired by the multispectral scanner aboard Landsat, which views Earth in four ranges of the electromagnetic spectrum: two visible bands and two infrared. The scanner picks up radiation from ground objects and converts the radiation signatures to digital signals, which are relayed to Earth and recorded on tape. Each tape contains "pixels," or picture elements, covering a ground area; computerized equipment processes the tapes and plots each pixel, line by line, to produce the basic image. The image can be further processed to correct sensor errors, to heighten contrast for feature emphasis, or to enhance the end product in other ways. A key factor in the conversion of digital data to visual form is the precision of the processing equipment. Jet Propulsion Laboratory prepared a digital mosaic that was plotted and enhanced by Optronics International, Inc. using the company's C-4300 Colorwrite, a high-precision, high-speed system which manipulates and analyzes digital data and presents it in visual form on film. Optronics manufactures a complete family of image enhancement processing systems to meet all users' needs. Enhanced imagery is useful to geologists, hydrologists, land use planners, agricultural specialists, geographers, and others.
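
    The contrast-enhancement step described above is often a simple per-band stretch of the raw digital numbers onto the full display range. The sketch below is a generic illustration only, not the Optronics or JPL processing chain; the percentile-based stretch and the NumPy implementation are assumptions.

      import numpy as np

      def linear_stretch(band: np.ndarray, low_pct: float = 2.0, high_pct: float = 98.0) -> np.ndarray:
          """Stretch raw scanner digital numbers to the 0-255 display range.

          Values below the low percentile clip to 0 and values above the high
          percentile clip to 255, which heightens contrast for feature emphasis.
          """
          lo, hi = np.percentile(band, [low_pct, high_pct])
          stretched = (band.astype(float) - lo) / max(hi - lo, 1e-9)
          return np.clip(stretched * 255.0, 0, 255).astype(np.uint8)

      # Example: a synthetic 4-band scene (bands x rows x cols) of raw counts.
      rng = np.random.default_rng(0)
      scene = rng.integers(10, 120, size=(4, 512, 512))
      enhanced = np.stack([linear_stretch(b) for b in scene])
      print(enhanced.shape, enhanced.dtype)  # (4, 512, 512) uint8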

  8. Inner Image

    ERIC Educational Resources Information Center

    Mollhagen, Nancy

    2004-01-01

    In this article, the author states that she has always loved self portraits but most teenagers do not enjoy looking too closely at their own faces in an effort to replicate them. Thanks to a new digital camera, she was able to use this new technology to inspire students to take a closer look at their inner image. Prior to the self-portrait…

  9. Biblical Images.

    ERIC Educational Resources Information Center

    Nir, Yeshayahu

    1987-01-01

    Responds to Marjorie Munsterberg's review of "The Bible and the Image: The History of Photography in the Holy Land 1839-1899." Claims that Munsterberg gave an incomplete and inaccurate account of the book's content, and that she treated Western pictorial traditions as the only valid measure in the study of the history of…

  10. Image Processing

    DTIC Science & Technology

    1999-03-01

    blurs the processed image. Blurring is the primary limitation of low-pass filtering. Figure (10) shows a photo of the famous Taj Mahal, one of the… [Figure (10): photo of the Taj Mahal with arbitrarily added noise, shown alongside its original histogram.]
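
    As a concrete illustration of why low-pass filtering blurs: each output pixel is a local average, so noise is suppressed but sharp edges are smeared across the neighborhood. The mean-filter sketch below is a generic example and is not taken from the DTIC report; the kernel size and NumPy-based implementation are assumptions.

      import numpy as np

      def mean_filter(image: np.ndarray, size: int = 5) -> np.ndarray:
          """Simple low-pass (mean) filter: replace each pixel by the average of a
          size x size neighborhood. Noise is reduced, but edges are blurred."""
          pad = size // 2
          padded = np.pad(image.astype(float), pad, mode="edge")
          out = np.zeros_like(image, dtype=float)
          rows, cols = image.shape
          for dr in range(size):
              for dc in range(size):
                  out += padded[dr:dr + rows, dc:dc + cols]
          return out / (size * size)

      # A noisy step edge: filtering reduces the noise but smears the edge.
      rng = np.random.default_rng(1)
      img = np.zeros((64, 64)); img[:, 32:] = 255.0
      noisy = img + rng.normal(0, 20, img.shape)
      smoothed = mean_filter(noisy, size=5)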

  11. Inner Image

    ERIC Educational Resources Information Center

    Mollhagen, Nancy

    2004-01-01

    In this article, the author states that she has always loved self portraits but most teenagers do not enjoy looking too closely at their own faces in an effort to replicate them. Thanks to a new digital camera, she was able to use this new technology to inspire students to take a closer look at their inner image. Prior to the self-portrait…

  12. Forest Imaging

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA's Technology Applications Center, with other government and academic agencies, provided technology for improved resources management to the Cibola National Forest. Landsat satellite images enabled vegetation over a large area to be classified for purposes of timber analysis, wildlife habitat, range measurement and development of general vegetation maps.

  13. Maxillofacial imaging

    SciTech Connect

    Delbalso, A.M.

    1988-01-01

    This book covers a discussion of maxillofacial imaging demonstrating correlations between the clinical, pathological and radiographic aspects of a particular anatomic structure or problem. Sections cover: evaluation of facial trauma, radiographic evaluation of specific structures, evaluation and treatment of maxillofacial neoplastic processes, and radiographic evaluation of facial development.

  14. Comparative sequence analyses of sixteen reptilian paramyxoviruses

    USGS Publications Warehouse

    Ahne, W.; Batts, W.N.; Kurath, G.; Winton, J.R.

    1999-01-01

    Viral genomic RNA of Fer-de-Lance virus (FDLV), a paramyxovirus highly pathogenic for reptiles, was reverse transcribed and cloned. Plasmids with significant sequence similarities to the hemagglutinin-neuraminidase (HN) and polymerase (L) genes of mammalian paramyxoviruses were identified by BLAST search. Partial sequences of the FDLV genes were used to design primers for amplification by nested polymerase chain reaction (PCR) and sequencing of 518-bp L gene and 352-bp HN gene fragments from a collection of 15 previously uncharacterized reptilian paramyxoviruses. Phylogenetic analyses of the partial L and HN sequences produced similar trees in which there were two distinct subgroups of isolates that were supported with maximum bootstrap values, and several intermediate isolates. Within each subgroup the nucleotide divergence values were less than 2.5%, while the divergence between the two subgroups was 20-22%. This indicated that the two subgroups represent distinct virus species containing multiple virus strains. The five intermediate isolates had nucleotide divergence values of 11-20% and may represent additional distinct species. In addition to establishing diversity among reptilian paramyxoviruses, the phylogenetic groupings showed some correlation with geographic location, and clearly demonstrated a low level of host species-specificity within these viruses. Copyright (C) 1999 Elsevier Science B.V.
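
    For orientation, the divergence figures quoted above (below 2.5% within subgroups, 20-22% between them) are uncorrected pairwise nucleotide distances of the kind sketched below. The code is a generic illustration, not the authors' analysis pipeline; the treatment of gaps and ambiguous bases is an assumption.

      def p_distance(seq_a: str, seq_b: str) -> float:
          """Uncorrected pairwise nucleotide divergence between two aligned sequences:
          the fraction of compared sites at which the bases differ. Sites with a gap
          or ambiguity code in either sequence are excluded from the comparison."""
          if len(seq_a) != len(seq_b):
              raise ValueError("sequences must be aligned to the same length")
          valid = "ACGT"
          compared = differing = 0
          for a, b in zip(seq_a.upper(), seq_b.upper()):
              if a in valid and b in valid:
                  compared += 1
                  if a != b:
                      differing += 1
          return differing / compared if compared else float("nan")

      # Two toy 20-bp fragments differing at 2 of 20 sites -> 10% divergence.
      print(p_distance("ACGTACGTACGTACGTACGT",
                       "ACGTACGAACGTACGTACGG"))  # 0.1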

  15. Informative prior distributions for ELISA analyses.

    PubMed

    Klauenberg, Katy; Walzel, Monika; Ebert, Bernd; Elster, Clemens

    2015-07-01

    Immunoassays are capable of measuring very small concentrations of substances in solutions and have an immense range of application. Enzyme-linked immunosorbent assay (ELISA) tests in particular can detect the presence of an infection, of drugs, or of hormones (as in the home pregnancy test). Inference of an unknown concentration via ELISA usually involves a non-linear heteroscedastic regression and subsequent prediction, which can be carried out in a Bayesian framework. For such a Bayesian inference, we develop informative prior distributions based on extensive historical ELISA tests as well as theoretical considerations. One consideration regards the quality of the immunoassay, leading to two practical requirements for the applicability of the priors. Simulations show that the additional prior information can lead to inferences which are robust to reasonable perturbations of the model and changes in the design of the data. On real data, the applicability is demonstrated across different laboratories, for different analytes and laboratory equipment, as well as for previous and current ELISAs with a sigmoid regression function. Consistency checks on real data (similar to cross-validation) underpin the adequacy of the suggested priors. Altogether, the new priors may improve concentration estimation for ELISAs that fulfill certain design conditions, by extending the range of the analyses, decreasing the uncertainty, or giving more robust estimates. Future use of these priors is straightforward because explicit, closed-form expressions are provided. This work encourages the development and application of informative, yet general, prior distributions for other types of immunoassays.
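
    To make the inference problem concrete: ELISA calibration relates a measured response (e.g., optical density) to concentration through a sigmoid curve, and the unknown concentration is recovered by inverting the fitted curve. The sketch below uses a four-parameter logistic (4PL) form, which is a common choice but an assumption here, since the abstract only specifies a "sigmoid regression function"; it shows a plain least-squares fit, not the Bayesian treatment with informative priors developed in the paper, and the calibration data are made up.

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(conc, a, b, c, d):
          """Four-parameter logistic: response at zero dose (a), slope (b),
          inflection concentration (c), response at infinite dose (d)."""
          return d + (a - d) / (1.0 + (conc / c) ** b)

      # Hypothetical calibration data: known standards and measured optical densities.
      standards = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
      od = np.array([0.10, 0.22, 0.55, 1.10, 1.60, 1.85])

      params, _ = curve_fit(four_pl, standards, od, p0=[0.05, 1.0, 2.0, 2.0], maxfev=10000)

      def invert(od_value, a, b, c, d):
          """Back-calculate concentration from a measured optical density."""
          return c * ((a - d) / (od_value - d) - 1.0) ** (1.0 / b)

      print(invert(0.8, *params))  # estimated concentration for a new sample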

  16. Belgian guidelines for budget impact analyses.

    PubMed

    Neyt, M; Cleemput, I; Sande, S Van De; Thiry, N

    2015-06-01

    The objective was to develop methodological guidelines for budget impact analyses submitted to the Belgian health authorities as part of a reimbursement request. A review of the literature was performed and provided the basis for preliminary budget impact guidelines. These guidelines were improved after discussion with health economists from the Belgian Health Care Knowledge Centre (KCE) and different Belgian stakeholders from both government and industry. The preliminary guidelines were also discussed in a workshop with health economists from the German Institute for Quality and Efficiency in Healthcare. Finally, the guidelines were validated by three external experts. The guidelines give explicit guidance on the following components of a budget impact analysis: perspective of the evaluation, target population, comparator, costs, time horizon, modeling, handling of uncertainty, and discount rate. Special attention is given to handling varying target population sizes over time, applying a time horizon up to the steady state instead of short-term predictions, and similarities and differences between budget impact analyses and economic evaluations. The guidelines provide a framework for both researchers and assessors to set up budget impact analyses that are transparent, relevant, of high quality, and based on a consistent methodology. This may improve the extent to which such evaluations can reliably and consistently be used in the reimbursement decision-making process.

  17. Applications of Parallel Processing in Configuration Analyses

    NASA Technical Reports Server (NTRS)

    Sundaram, Ppchuraman; Hager, James O.; Biedron, Robert T.

    1999-01-01

    The paper presents the recent progress made towards developing an efficient and user-friendly parallel environment for routine analysis of large CFD problems. The coarse-grain parallel version of the CFL3D Euler/Navier-Stokes analysis code, CFL3Dhp, has been ported onto most available parallel platforms. The CFL3Dhp solution accuracy on these parallel platforms has been verified against the CFL3D sequential analyses. User-friendly pre- and post-processing tools that enable a seamless transition from sequential to parallel processing have been written. A static load-balancing tool for CFL3Dhp analysis has also been implemented to achieve good parallel efficiency. For large problems, load-balancing efficiency as high as 95% can be achieved even when a large number of processors is used. Linear scalability of the CFL3Dhp code with increasing numbers of processors has also been demonstrated using a large installed transonic nozzle boattail analysis. To highlight the fast turn-around time of parallel processing, a Navier-Stokes drag polar for the TCA full configuration in sideslip at supersonic cruise was obtained in a day. CFL3Dhp is currently being used as a production analysis tool.
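
    Static load balancing for a multi-block CFD solver of this kind typically amounts to distributing grid blocks across processors so that each processor owns roughly the same number of cells. The greedy largest-block-first sketch below illustrates that idea generically; it is not the actual CFL3Dhp tool, and the block sizes, heap-based assignment, and efficiency metric are assumptions.

      import heapq

      def balance_blocks(block_sizes: list[int], num_procs: int) -> list[list[int]]:
          """Assign grid blocks to processors, largest block first, always placing
          the next block on the currently least-loaded processor."""
          # Heap of (current load, processor index); assignment[p] lists block indices.
          heap = [(0, p) for p in range(num_procs)]
          heapq.heapify(heap)
          assignment = [[] for _ in range(num_procs)]
          for block in sorted(range(len(block_sizes)), key=lambda i: -block_sizes[i]):
              load, proc = heapq.heappop(heap)
              assignment[proc].append(block)
              heapq.heappush(heap, (load + block_sizes[block], proc))
          return assignment

      # Toy example: 10 blocks of uneven size distributed over 4 processors.
      sizes = [120_000, 80_000, 75_000, 60_000, 55_000, 40_000, 30_000, 25_000, 20_000, 15_000]
      parts = balance_blocks(sizes, 4)
      loads = [sum(sizes[b] for b in blocks) for blocks in parts]
      efficiency = sum(loads) / (len(loads) * max(loads))  # 1.0 means perfectly balanced
      print(parts, loads, f"{efficiency:.2%}")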

  18. Reproducibility of neuroimaging analyses across operating systems

    PubMed Central

    Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
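
    The Dice coefficients cited above measure the overlap between a subcortical classification produced on one operating system and the same pipeline's output on another. The sketch below shows the standard computation for a single label; it is a generic illustration, not part of FSL, Freesurfer, or CIVET, and the NumPy-based form and toy data are assumptions.

      import numpy as np

      def dice(label_a: np.ndarray, label_b: np.ndarray, label: int) -> float:
          """Dice coefficient for one label between two segmentations of the same
          image: 2|A & B| / (|A| + |B|), where A and B are the voxel sets carrying
          the label in each segmentation."""
          a = (label_a == label)
          b = (label_b == label)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      # Two toy segmentations of the same volume, differing in a few voxels
      # (e.g., because of platform-dependent floating-point rounding upstream).
      rng = np.random.default_rng(2)
      seg_linux = rng.integers(0, 3, size=(32, 32, 32))
      seg_other = seg_linux.copy()
      flip = rng.random(seg_other.shape) < 0.01          # ~1% of voxels change label
      seg_other[flip] = rng.integers(0, 3, size=flip.sum())
      print(dice(seg_linux, seg_other, label=1))          # close to, but below, 1.0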

  19. Trend Analyses of Nitrate in Danish Groundwater

    NASA Astrophysics Data System (ADS)

    Hansen, B.; Thorling, L.; Dalgaard, T.; Erlandsen, M.

    2012-04-01

    This presentation assesses the long-term development in the oxic groundwater nitrate concentration and nitrogen (N) loss due to intensive farming in Denmark. Firstly, up to 20-year time series from the national groundwater monitoring network enable a statistically systematic analysis of distribution, trends and trend reversals in the groundwater nitrate concentration. Secondly, knowledge about the N surplus in Danish agriculture since 1950 is used as an indicator of the potential loss of N. Thirdly, groundwater recharge CFC (chlorofluorocarbon) age determination allows linking of the first two datasets. The development in the nitrate concentration of oxic groundwater clearly mirrors the development in the national agricultural N surplus, and a corresponding trend reversal is found in groundwater. Regulation and technical improvements in intensive farming in Denmark have succeeded in decreasing the N surplus by 40% since the mid 1980s while at the same time maintaining crop yields and increasing the animal production, especially of pigs. Trend analyses prove that the youngest (0-15 years old) oxic groundwater shows more pronounced significant downward nitrate trends (44%) than the oldest (25-50 years old) oxic groundwater (9%). This amounts to clear evidence of the effect of reduced nitrate leaching on groundwater nitrate concentrations in Denmark. Is the Danish groundwater monitoring strategy optimal for the detection of nitrate trends? Will nitrate concentrations in Danish groundwater continue to decrease, or are the Danish nitrate concentration levels now appropriate according to the Water Framework Directive?
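
    Trend detection in monitoring time series of this kind is commonly done with a non-parametric monotonic trend test such as Mann-Kendall. The abstract does not name the method used, so the choice of test in the sketch below, the no-ties variance formula, and the toy nitrate series are assumptions made purely for illustration.

      import math

      def mann_kendall(values: list[float]) -> tuple[int, float]:
          """Mann-Kendall test for a monotonic trend in a time series.

          Returns (S, Z): S > 0 suggests an upward trend, S < 0 a downward trend;
          |Z| > 1.96 is significant at roughly the 5% level. Ties are ignored,
          so the variance formula assumes no tied values."""
          n = len(values)
          s = sum(
              (values[j] > values[i]) - (values[j] < values[i])
              for i in range(n - 1)
              for j in range(i + 1, n)
          )
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          if s > 0:
              z = (s - 1) / math.sqrt(var_s)
          elif s < 0:
              z = (s + 1) / math.sqrt(var_s)
          else:
              z = 0.0
          return s, z

      # Toy annual nitrate series (mg/L) with a downward tendency.
      nitrate = [62, 60, 61, 58, 55, 56, 52, 50, 51, 47, 45, 44]
      print(mann_kendall(nitrate))  # negative S and Z indicate a downward trend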

  20. Phylogenomic Analyses Support Traditional Relationships within Cnidaria

    PubMed Central

    Zapata, Felipe; Goetz, Freya E.; Smith, Stephen A.; Howison, Mark; Siebert, Stefan; Church, Samuel H.; Sanders, Steven M.; Ames, Cheryl Lewis; McFadden, Catherine S.; France, Scott C.; Daly, Marymegan; Collins, Allen G.; Haddock, Steven H. D.; Dunn, Casey W.; Cartwright, Paulyn

    2015-01-01

    Cnidaria, the sister group to Bilateria, is a highly diverse group of animals in terms of morphology, lifecycles, ecology, and development. How this diversity originated and evolved is not well understood because phylogenetic relationships among major cnidarian lineages are unclear, and recent studies present contrasting phylogenetic hypotheses. Here, we use transcriptome data from 15 newly sequenced species in combination with 26 publicly available genomes and transcriptomes to assess phylogenetic relationships among major cnidarian lineages. Phylogenetic analyses using different partition schemes and models of molecular evolution, as well as topology tests for alternative phylogenetic relationships, support the monophyly of Medusozoa, Anthozoa, Octocorallia, Hydrozoa, and a clade consisting of Staurozoa, Cubozoa, and Scyphozoa. Support for the monophyly of Hexacorallia is weak due to the equivocal position of Ceriantharia. Taken together, these results further resolve deep cnidarian relationships, largely support traditional phylogenetic views on relationships, and provide a historical framework for studying the evolutionary processes involved in one of the most ancient animal radiations. PMID:26465609