Science.gov

Sample records for image analyses monitoracao

  1. Phase contrast image segmentation using a Laue analyser crystal

    NASA Astrophysics Data System (ADS)

    Kitchen, Marcus J.; Paganin, David M.; Uesugi, Kentaro; Allison, Beth J.; Lewis, Robert A.; Hooper, Stuart B.; Pavlov, Konstantin M.

    2011-02-01

    Dual-energy x-ray imaging is a powerful tool enabling two-component samples to be separated into their constituent objects from two-dimensional images. Phase contrast x-ray imaging can render the boundaries between media of differing refractive indices visible, despite them having similar attenuation properties; this is important for imaging biological soft tissues. We have used a Laue analyser crystal and a monochromatic x-ray source to combine the benefits of both techniques. The Laue analyser creates two distinct phase contrast images that can be simultaneously acquired on a high-resolution detector. These images can be combined to separate the effects of x-ray phase, absorption and scattering and, using the known complex refractive indices of the sample, to quantitatively segment its component materials. We have successfully validated this phase contrast image segmentation (PCIS) using a two-component phantom, containing an iodinated contrast agent, and have also separated the lungs and ribcage in images of a mouse thorax. Simultaneous image acquisition has enabled us to perform functional segmentation of the mouse thorax throughout the respiratory cycle during mechanical ventilation.
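
    The core of the segmentation step described above is solving a two-by-two linear system per pixel once the refractive/attenuation properties of the two materials are known. The sketch below is a minimal, hypothetical illustration of that principle; the attenuation coefficients are invented for illustration, not the paper's measured values:

```python
# Dual-energy decomposition sketch: at two energies E1 and E2 the measured
# attenuation is a linear combination of two material thicknesses.
# Hypothetical mass-attenuation coefficients (NOT real material data).
MU = {"E1": {"iodine": 30.0, "tissue": 0.35},
      "E2": {"iodine": 8.0,  "tissue": 0.25}}

def decompose(a1, a2):
    """Solve the 2x2 system
         a1 = mu1_i * t_i + mu1_t * t_t
         a2 = mu2_i * t_i + mu2_t * t_t
       for the thicknesses (t_i, t_t) by Cramer's rule."""
    m11, m12 = MU["E1"]["iodine"], MU["E1"]["tissue"]
    m21, m22 = MU["E2"]["iodine"], MU["E2"]["tissue"]
    det = m11 * m22 - m12 * m21
    t_i = (a1 * m22 - m12 * a2) / det
    t_t = (m11 * a2 - a1 * m21) / det
    return t_i, t_t

# Forward-simulate a pixel containing 0.1 cm iodine and 2.0 cm tissue,
# then recover the two thicknesses from the two attenuation values.
a1 = MU["E1"]["iodine"] * 0.1 + MU["E1"]["tissue"] * 2.0
a2 = MU["E2"]["iodine"] * 0.1 + MU["E2"]["tissue"] * 2.0
t_i, t_t = decompose(a1, a2)
print(round(t_i, 6), round(t_t, 6))  # approximately 0.1 and 2.0
```

    In the paper this decomposition is done image-wide with the two simultaneously acquired phase-contrast images; the sketch only shows the per-pixel algebra.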

  2. Analyser-based x-ray imaging for biomedical research

    NASA Astrophysics Data System (ADS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-12-01

    Analyser-based imaging (ABI) is one of the several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory source to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with sufficient x-ray intensity comparable with that of the current synchrotron environment.

  3. Colony image acquisition and genetic segmentation algorithm and colony analyses

    NASA Astrophysics Data System (ADS)

    Wang, W. X.

    2012-01-01

    Colony analysis is used in many fields, including food, dairy, beverages, hygiene, environmental monitoring, water, toxicology and sterility testing. To reduce labor and increase analysis accuracy, many researchers and developers have built image analysis systems. The main problems in such systems are image acquisition, image segmentation and image analysis. In this paper, to acquire colony images of good quality, an illumination box was constructed; inside the box, the distances between the lights and the dish, between the camera lens and the lights, and between the camera lens and the dish are adjusted optimally. Image segmentation is based on a genetic approach that allows one to treat the segmentation problem as a global optimization. After image pre-processing and image segmentation, the colony analyses are performed. The colony image analysis consists of (1) basic colony parameter measurements; (2) colony size analysis; (3) colony shape analysis; and (4) colony surface measurements. All of the above visual colony parameters can be selected and combined to form new engineering parameters, and the colony analysis can be applied to different applications.
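
    Once a colony image has been segmented into a binary mask, the basic counting step amounts to labeling connected regions. The sketch below is a toy 4-connected flood fill in pure Python, not the paper's genetic segmentation algorithm:

```python
from collections import deque

def count_colonies(grid):
    """Count 4-connected regions of 1s in a binary image (list of lists),
    a toy stand-in for colony counting after thresholding."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    colonies = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                colonies += 1                 # new, unvisited colony
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                  # flood-fill its pixels
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return colonies

# Tiny synthetic binary mask with three separate "colonies".
image = [[1, 1, 0, 0],
         [0, 1, 0, 1],
         [0, 0, 0, 1],
         [1, 0, 0, 0]]
print(count_colonies(image))  # 3
```

    Size and shape measurements would follow the same traversal, accumulating each region's pixel count and boundary instead of just its presence.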

  4. A review of multivariate analyses in imaging genetics.

    PubMed

    Liu, Jingyu; Calhoun, Vince D

    2014-01-01

    Recent advances in neuroimaging technology and molecular genetics provide the unique opportunity to investigate genetic influence on the variation of brain attributes. Since the year 2000, when the initial publication on brain imaging and genetics was released, imaging genetics has been a rapidly growing research approach with increasing publications every year. Several reviews have been offered to the research community focusing on various study designs. In addition to study design, analytic tools and their proper implementation are also critical to the success of a study. In this review, we survey recent publications using data from neuroimaging and genetics, focusing on methods capturing multivariate effects accommodating the large number of variables from both imaging data and genetic data. We group the analyses of genetic or genomic data into either a priori driven or data driven approach, including gene-set enrichment analysis, multifactor dimensionality reduction, principal component analysis, independent component analysis (ICA), and clustering. For the analyses of imaging data, ICA and extensions of ICA are the most widely used multivariate methods. Given detailed reviews of multivariate analyses of imaging data available elsewhere, we provide a brief summary here that includes a recently proposed method known as independent vector analysis. Finally, we review methods focused on bridging the imaging and genetic data by establishing multivariate and multiple genotype-phenotype-associations, including sparse partial least squares, sparse canonical correlation analysis, sparse reduced rank regression and parallel ICA. These methods are designed to extract latent variables from both genetic and imaging data, which become new genotypes and phenotypes, and the links between the new genotype-phenotype pairs are maximized using different cost functions. 
The relationships among these methods, along with their assumptions, advantages, and limitations, are discussed.

  5. Simplified Model for Analysing Ion/Photoelectron Images

    NASA Astrophysics Data System (ADS)

    Zhu, Jing-Yi; Wang, Bing-Xing; Guo, Wei; Wang, Yan-Qiu; Wang, Li

    2007-07-01

    Based on the onion-peeling algorithm (OPA) principle, we present a simplified model for analysing photoion and photoelectron images, which allows the analysis of experimental raw images. A three-dimensional distribution of the nascent charged particles, from which the radial and angular distributions are deduced, can be obtained more easily by this model than by the commonly used procedures. The analysis results of Xe photoelectron images by this model are compared with those from the standard Hankel-Abel inversion. The results imply that this model can be used for complicated (many peaks) and 'difficult' (low signal-to-noise) images with cylindrical symmetries, and can provide a reliable reconstruction in some cases where the commonly used Hankel-Abel transform method fails.
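
    The onion-peeling principle can be sketched directly: for a cylindrically symmetric distribution, each line of sight through the image is a weighted sum over annular rings, and the radial profile is recovered by peeling rings from the outside in. A minimal illustration with uniform ring widths and no noise handling, not the authors' exact model:

```python
import math

def ring_weights(n):
    """W[i][j]: chord length of the line of sight at lateral offset i
    through the annular ring [j, j+1] (cylindrical symmetry)."""
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            outer = math.sqrt((j + 1) ** 2 - i ** 2)
            inner = math.sqrt(max(j * j - i * i, 0.0))
            W[i][j] = 2.0 * (outer - inner)
    return W

def onion_peel(projection):
    """Recover the radial profile from a half-projection by peeling
    rings from the outside in (back-substitution of the upper-
    triangular system projection = W @ profile)."""
    n = len(projection)
    W = ring_weights(n)
    f = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(W[i][j] * f[j] for j in range(i + 1, n))
        f[i] = (projection[i] - s) / W[i][i]
    return f

# Forward-project a known radial profile, then invert it.
true_profile = [1.0, 0.8, 0.5, 0.2, 0.0]
W = ring_weights(len(true_profile))
proj = [sum(W[i][j] * true_profile[j] for j in range(len(true_profile)))
        for i in range(len(true_profile))]
recovered = onion_peel(proj)
print([round(v, 6) for v in recovered])  # matches true_profile
```

    Real photoelectron images add angular structure and noise, which is where methods like the authors' simplified model differ from this bare back-substitution.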

  6. Solid Hydrogen Experiments for Atomic Propellants: Image Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2002-01-01

    This paper presents the results of detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium. Solid particles of hydrogen were frozen in liquid helium and observed with a video camera. The solid hydrogen particle sizes, their agglomerates, and the total mass of hydrogen particles were estimated. Particle sizes of 1.9 to 8 mm (0.075 to 0.315 in.) were measured. The particle agglomerate sizes and areas were measured, and the total mass of solid hydrogen was computed. A total mass of 0.22 to 7.9 grams of hydrogen was frozen. Compaction and expansion of the agglomerates implied that the particles remain independent particles, and can be separated and controlled. These image analyses are among the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  7. ["When the ad is good, the product is sold." The MonitorACAO Project and drug advertising in Brazil].

    PubMed

    Soares, Jussara Calmon Reis de Souza

    2008-04-01

    This paper presents an analysis of drug advertising in Brazil, based on the final report of the MonitorACAO Project by the group from the Universidade Federal Fluminense, Niterói, Rio de Janeiro. Through a partnership between the university and the National Agency for Health Surveillance (ANVISA), drug advertisements were monitored and analyzed for one year according to the methodology defined by the Agency. The samples were collected in medical practices and hospitals, drugstores, pharmacies and scientific magazines; TV and radio programs were monitored in the case of OTC drugs. Of a total of 263 irregular ads analyzed between October 2004 and August 2005, 159 advertisements referring to pharmaceuticals were sent to ANVISA. The main problems found were the poor quality of drug information provided to health professionals, as well as misleading promotion of drug use to the lay population. Based on the results of this project and on other studies, a ban on drug advertising in Brazil is proposed. PMID:21936168

  8. A Guide to Analysing Tongue Motion from Ultrasound Images

    ERIC Educational Resources Information Center

    Stone, Maureen

    2005-01-01

    This paper is meant to be an introduction to and general reference for ultrasound imaging for new and moderately experienced users of the instrument. The paper consists of eight sections. The first explains how ultrasound works, including beam properties, scan types and machine features. The second section discusses image quality, including the…

  9. Imaging data analyses for hazardous waste applications. Final report

    SciTech Connect

    David, N.; Ginsberg, I.W.

    1995-12-01

    The paper presents some examples of the use of remote sensing products for characterization of hazardous waste sites. The sites are located at the Los Alamos National Laboratory (LANL), where materials associated with past weapons testing are buried. Problems of interest include delineation of strata for soil sampling, detection and delineation of buried trenches containing contaminants, seepage from capped areas and old septic drain fields, and location of faults and fractures relative to hazardous waste areas. Merging of site maps and other geographic information with imagery was found by site managers to produce useful products, and merging of hydrographic and soil contaminant data aided soil sampling strategists. Overlays of suspected trenches on multispectral and thermal images showed correlation between image signatures and trenches. Overlays of engineering drawings on recent and historical photos showed errors in trench location and extent. A thermal image showed warm anomalies suspected to be areas of water seepage through an asphalt cap. Overlays of engineering drawings on multispectral and thermal images showed correlation between image signatures and drain fields. Analysis of aerial photography and spectral signatures of faults/fractures improved geologic maps of mixed waste areas.

  10. The challenges of analysing blood stains with hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Kuula, J.; Puupponen, H.-H.; Rinta, H.; Pölönen, I.

    2014-06-01

    Hyperspectral imaging is a potential noninvasive technology for detecting, separating and identifying various substances. In forensic and military medicine and other CBRNE-related uses it could be a potential method for analyzing blood and for scanning other human-derived fluids. For example, it would be valuable to easily detect whether traces of blood are from one or more persons, or whether there are irrelevant substances or anomalies in the blood. This article presents an experiment on separating four persons' blood stains on a white cotton fabric with a SWIR hyperspectral camera and an FT-NIR spectrometer. Each tested sample includes a standardized 75 µl of 100% blood. The results suggest that, on the basis of the amount of erythrocytes in the blood, different people's blood might be separable by hyperspectral analysis; and, given the indication provided by erythrocytes, it might also be possible to find other traces in the blood. However, these assumptions need to be verified with wider tests, as the number of samples in the study was small. The study also suggests that several biological, chemical and physical factors, alone and in combination, affect the results of hyperspectral analysis of blood on fabric textures, and these factors need to be considered before drawing any further conclusions on the analysis of blood on various materials.

  11. Integrating medical imaging analyses through a high-throughput bundled resource imaging system

    NASA Astrophysics Data System (ADS)

    Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.

    2011-03-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data provenance, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists.

  12. [The effect of image analyser noises in studies of cell structure].

    PubMed

    Shteĭn, G I

    2002-01-01

    Using image analysers, the influence of noises on the quality of images obtained from three types of digital CCD videocameras was studied. Algorithms for calculating the heterogeneity coefficient of cell structures have been proposed, which take into account the noises on the images. Application of procedures of image smoothing or averaging from a few shots, calculation of differences in heterogeneity coefficients of the object and a free field, and a combination of these methods have significantly reduced the influence of noises and increased the informativeness of texture features. PMID:11868456
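
    One of the noise-reduction procedures mentioned, averaging over several shots, can be illustrated with a simple simulation: averaging N frames of independent noise shrinks the noise standard deviation by roughly sqrt(N). A sketch with synthetic data (invented signal level and noise, not the paper's camera measurements):

```python
import random
import statistics

random.seed(42)  # deterministic synthetic data

def noisy_frame(signal, sigma, n):
    """One simulated camera frame: constant signal plus Gaussian noise."""
    return [signal + random.gauss(0.0, sigma) for _ in range(n)]

signal, sigma, pixels, frames = 100.0, 5.0, 2000, 16

single = noisy_frame(signal, sigma, pixels)
stack = [noisy_frame(signal, sigma, pixels) for _ in range(frames)]
averaged = [sum(f[i] for f in stack) / frames for i in range(pixels)]

# Averaging 16 frames should shrink the noise roughly by sqrt(16) = 4.
print(round(statistics.stdev(single), 2))    # close to 5
print(round(statistics.stdev(averaged), 2))  # close to 5 / 4
```

    Smoothing within a single image trades spatial resolution for the same effect, which is why the paper combines both approaches with a free-field correction when computing heterogeneity coefficients.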

  13. Integrating Medical Imaging Analyses through a High-throughput Bundled Resource Imaging System.

    PubMed

    Covington, Kelsie; Welch, E Brian; Jeong, Ha-Kyu; Landman, Bennett A

    2011-01-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data provenance, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists. PMID:21841899

  14. Analyses of S-Box in Image Encryption Applications Based on Fuzzy Decision Making Criterion

    NASA Astrophysics Data System (ADS)

    Rehman, Inayatur; Shah, Tariq; Hussain, Iqtadar

    2014-06-01

    In this manuscript, we put forward a standard based on fuzzy decision making criterion to examine the current substitution boxes and study their strengths and weaknesses in order to decide their appropriateness in image encryption applications. The proposed standard utilizes the results of correlation analysis, entropy analysis, contrast analysis, homogeneity analysis, energy analysis, and mean of absolute deviation analysis. These analyses are applied to well-known substitution boxes. The outcome of these analyses are additional observed and a fuzzy soft set decision making criterion is used to decide the suitability of an S-box to image encryption applications.
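
    Two of the listed analyses, entropy and adjacent-pixel correlation, are easy to illustrate. The sketch below uses a random byte permutation as a toy S-box (not one of the substitution boxes examined in the paper) applied to a synthetic, smooth image row:

```python
import math
import random

random.seed(1)

# Toy S-box: a random byte permutation, purely illustrative.
sbox = list(range(256))
random.shuffle(sbox)

def entropy(values):
    """Shannon entropy (bits/symbol) of a byte sequence."""
    counts = {}
    for v in values:
        counts[v] = counts.get(v, 0) + 1
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def adjacent_correlation(values):
    """Pearson correlation between each byte and its right neighbour."""
    x, y = values[:-1], values[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A smooth "image" row: neighbouring pixel values are strongly correlated.
plain = [i % 256 for i in range(4096)]
cipher = [sbox[p] for p in plain]

print(round(adjacent_correlation(plain), 3))   # near 1
print(round(adjacent_correlation(cipher), 3))  # much closer to 0
print(round(entropy(cipher), 3))               # 8 bits for a full histogram
```

    A fuzzy decision criterion, as the paper proposes, would then aggregate scores like these (plus contrast, homogeneity, energy and mean-of-absolute-deviation analyses) across candidate S-boxes.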

  15. Geologist's Field Assistant: Developing Image and Spectral Analyses Algorithms for Remote Science Exploration

    NASA Astrophysics Data System (ADS)

    Gulick, V. C.; Morris, R. L.; Bishop, J.; Gazis, P.; Alena, R.; Sierhuis, M.

    2002-03-01

    We are developing science analyses algorithms to interface with a Geologist's Field Assistant device to allow robotic or human remote explorers to better sense their surroundings during limited surface excursions. Our algorithms will interpret spectral and imaging data obtained by various sensors.

  16. Biodistribution Analyses of a Near-Infrared, Fluorescently Labeled, Bispecific Monoclonal Antibody Using Optical Imaging.

    PubMed

    Peterson, Norman C; Wilson, George G; Huang, Qihui; Dimasi, Nazzareno; Sachsenmeier, Kris F

    2016-04-01

    In recent years, biodistribution analyses of pharmaceutical compounds in preclinical animal models have become an integral part of drug development. Here we report on the use of optical imaging biodistribution analyses in a mouse xenograft model to identify tissues that nonspecifically retained a bispecific antibody under development. Although our bispecific antibody bound both the epidermal growth factor receptor and the insulin-like growth factor 1 receptor, which are expressed on H358 non-small-cell lung carcinoma cells, the fluorescence from the labeled bispecific antibody was less intense than expected in xenografted tumors. Imaging analyses of live mice and major organs revealed that the majority of the Alexa Fluor 750-labeled bispecific antibody was sequestered in the liver within 2 h of injection. However, results varied depending on which near-infrared fluorophore was used, and fluorescence from the livers of mice injected with bispecific antibody labeled with Alexa Fluor 680 was less pronounced than from those labeled with Alexa Fluor 750. The tissue distribution of control antibodies remained unaffected by the label, suggesting that hepatic retention differs between fluorophores. With these caveats in mind, the results support the incorporation of optical imaging biodistribution analyses into biotherapeutic development strategies. PMID:27053562

  17. Biodistribution Analyses of a Near-Infrared, Fluorescently Labeled, Bispecific Monoclonal Antibody Using Optical Imaging

    PubMed Central

    Peterson, Norman C; Wilson, George G; Huang, Qihui; Dimasi, Nazzareno; Sachsenmeier, Kris F

    2016-01-01

    In recent years, biodistribution analyses of pharmaceutical compounds in preclinical animal models have become an integral part of drug development. Here we report on the use of optical imaging biodistribution analyses in a mouse xenograft model to identify tissues that nonspecifically retained a bispecific antibody under development. Although our bispecific antibody bound both the epidermal growth factor receptor and the insulin-like growth factor 1 receptor, which are expressed on H358 non-small-cell lung carcinoma cells, the fluorescence from the labeled bispecific antibody was less intense than expected in xenografted tumors. Imaging analyses of live mice and major organs revealed that the majority of the Alexa Fluor 750-labeled bispecific antibody was sequestered in the liver within 2 h of injection. However, results varied depending on which near-infrared fluorophore was used, and fluorescence from the livers of mice injected with bispecific antibody labeled with Alexa Fluor 680 was less pronounced than from those labeled with Alexa Fluor 750. The tissue distribution of control antibodies remained unaffected by the label, suggesting that hepatic retention differs between fluorophores. With these caveats in mind, the results support the incorporation of optical imaging biodistribution analyses into biotherapeutic development strategies. PMID:27053562

  19. Analysing the Image Building Effects of TV Advertisements Using Internet Community Data

    NASA Astrophysics Data System (ADS)

    Uehara, Hiroshi; Sato, Tadahiko; Yoshida, Kenichi

    This paper proposes a method for measuring the effects of TV advertisements using Internet bulletin boards. It aims to clarify how viewers' interest in TV advertisements is reflected in their images of the promoted products. Two kinds of time series data are generated by the proposed method: the first represents the time series fluctuation of interest in the TV advertisements, and the second represents the time series fluctuation of the images of the products. By analysing the correlations between these two time series, we try to clarify the implicit relationship between viewers' interest in a TV advertisement and their images of the promoted products. By applying the proposed method to an Internet bulletin board that deals with a certain cosmetic brand, we show that the images of the products vary depending on differences in interest in each TV advertisement.
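
    The correlation analysis between the two time series can be sketched with a plain lagged Pearson correlation. The data below are invented daily counts, constructed so the product-image series echoes the ad-interest series two days later; the paper's actual bulletin-board extraction is not shown:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equally long series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def lagged_correlation(ad_interest, product_image, max_lag):
    """Correlate ad-interest counts with product-image counts at each
    lag, so a delayed effect of an ad burst shows up at a positive lag."""
    out = {}
    for lag in range(max_lag + 1):
        if lag == 0:
            out[lag] = pearson(ad_interest, product_image)
        else:
            out[lag] = pearson(ad_interest[:-lag], product_image[lag:])
    return out

# Synthetic daily posting counts (hypothetical numbers).
interest = [3, 8, 20, 12, 5, 2, 1, 0, 0, 0, 4, 15]
image    = [1, 1, 3, 8, 20, 12, 5, 2, 1, 0, 0, 0]

scores = lagged_correlation(interest, image, 4)
best = max(scores, key=scores.get)
print(best)  # 2  (the image series lags interest by two days)
```

    Real bulletin-board counts would need detrending and significance testing before reading anything into the peak lag.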

  20. Solid Hydrogen Experiments for Atomic Propellants: Particle Formation Energy and Imaging Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2002-01-01

    This paper presents particle formation energy balances and detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium during the Phase II testing in 2001. Solid particles of hydrogen were frozen in liquid helium and observed with a video camera. The solid hydrogen particle sizes and the total mass of hydrogen particles were estimated, and the particle formation efficiency is also estimated. Particle sizes from the Phase I testing in 1999 and the Phase II testing in 2001 were similar. Though the 2001 testing created similar particle sizes, many new particle formation phenomena were observed. These image analyses are among the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  1. Solid Hydrogen Experiments for Atomic Propellants: Particle Formation, Imaging, Observations, and Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2005-01-01

    This report presents particle formation observations and detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium. Hydrogen was frozen into particles in liquid helium, and observed with a video camera. The solid hydrogen particle sizes and the total mass of hydrogen particles were estimated. These newly analyzed data are from the test series held on February 28, 2001. Particle sizes from previous testing in 1999 and the testing in 2001 were similar. Though the 2001 testing created similar particle sizes, many new particle formation phenomena were observed: microparticles and delayed particle formation. These experiment image analyses are some of the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  2. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data

    PubMed Central

    Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan

    2015-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393

  3. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data.

    PubMed

    Hebart, Martin N; Görgen, Kai; Haynes, John-Dylan

    2014-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393
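
    The kind of cross-validated decoding analysis such a toolbox automates can be illustrated in miniature. The sketch below is pure Python (not TDT's Matlab code): leave-one-out cross-validation of a nearest-centroid classifier on synthetic two-condition "voxel patterns":

```python
import random

random.seed(0)

def nearest_centroid_loo(samples, labels):
    """Leave-one-out cross-validated nearest-centroid decoding accuracy,
    a minimal stand-in for the classifier-based analyses a decoding
    toolbox automates."""
    n = len(samples)
    dim = len(samples[0])
    correct = 0
    for held in range(n):
        # Fit: per-class mean pattern, excluding the held-out sample.
        centroids = {}
        for lab in set(labels):
            members = [samples[i] for i in range(n)
                       if labels[i] == lab and i != held]
            centroids[lab] = [sum(m[d] for m in members) / len(members)
                              for d in range(dim)]
        # Predict: nearest centroid by squared Euclidean distance.
        def dist(a, b):
            return sum((u - v) ** 2 for u, v in zip(a, b))
        pred = min(centroids, key=lambda lab: dist(samples[held], centroids[lab]))
        correct += pred == labels[held]
    return correct / n

# Synthetic "voxel patterns": two conditions with slightly shifted means.
samples, labels = [], []
for cond, shift in (("A", 0.0), ("B", 1.5)):
    for _ in range(20):
        samples.append([random.gauss(shift, 1.0) for _ in range(10)])
        labels.append(cond)

print(nearest_centroid_loo(samples, labels))  # well above chance (0.5)
```

    TDT layers searchlights, feature selection, scaling and nested parameter selection on top of exactly this train/test separation.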

  4. Computer-based image-analyses of laminated shales, carboniferous of the Midcontinent and surrounding areas

    SciTech Connect

    Archer, A.W. . Dept. of Geology)

    1993-02-01

    Computerized image analyses of petrographic data can greatly facilitate the quantification of detailed descriptions and analyses of fine-scale fabric, or petrofabric. In thinly laminated rocks, manual measurement of successive lamina thicknesses is very time consuming, especially when applied to thick, cored sequences. In particular, images of core materials can be digitized and the resulting image then processed as a large matrix. Using such techniques, it is relatively easy to automate continuous measurements of lamina thickness and lateral continuity. This type of analysis has been applied to a variety of Carboniferous strata, particularly those siliciclastics that occur within the 'outside shale' portions of Kansas cyclothems. Of the various sedimentological processes capable of producing such non-random thickness variations, a model invoking tidal processes appears to be particularly robust. Tidal sedimentation could not only have resulted in the deposition of individual laminae; in addition, tidal-height variations during various phases of the lunar orbit can explain the systematic variations. Comparison of these Carboniferous shales with similar laminations formed in modern high tidal-range environments indicates many similarities. These modern analogs include the Bay of Fundy in Canada and the Bay of Mont-Saint-Michel in France. Lamina-thickness variations, in specific cases, can be correlated with known tidal periodicities. In addition, in some samples, details of the tidal regime can be inferred, such as the nature of the tidal system (i.e., diurnal or semidiurnal), and some indicators of tidal range can be ascertained based upon modern analogs.

  5. Study of SGD along the French Mediterranean coastline using airborne TIR images and in situ analyses

    NASA Astrophysics Data System (ADS)

    van Beek, Pieter; Stieglitz, Thomas; Souhaut, Marc

    2015-04-01

    Although submarine groundwater discharge (SGD) has been investigated in many parts of the world, very few studies have been conducted along the French coastline of the Mediterranean Sea. Almost no information is available on the fluxes of water and chemical elements associated with these SGD, or on their potential impact on the geochemical cycling and ecosystems of the coastal zones. In this work, we combined the use of airborne thermal infrared (TIR) images with in situ analyses of salinity, temperature, radon and radium isotopes to study SGD at various sites along the French Mediterranean coastline and in coastal lagoons. These analyses allowed us to detect SGD sites and to quantify SGD fluxes (which include both fresh groundwater and recirculated seawater). In particular, we will show how the Ra isotopes determined in the La Palme lagoon were used to estimate i) the residence time of waters in the lagoon and ii) SGD fluxes.
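One standard way Ra isotopes yield water residence times (stated here as general background, since the abstract does not detail the exact method used) is the decay of the short-lived 224Ra/223Ra activity ratio (AR) between the groundwater end-member and the lagoon sample:

```latex
% Apparent water age t from decay of the 224Ra/223Ra activity ratio (AR)
% between the groundwater source and the lagoon sample:
\[
  t \;=\; \frac{1}{\lambda_{224} - \lambda_{223}}
          \,\ln\!\left(\frac{AR_{\mathrm{source}}}{AR_{\mathrm{sample}}}\right),
  \qquad \lambda_{224} = \frac{\ln 2}{3.66\ \mathrm{d}},\quad
         \lambda_{223} = \frac{\ln 2}{11.4\ \mathrm{d}}
\]
```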

  6. A new ImageJ plug-in "ActogramJ" for chronobiological analyses.

    PubMed

    Schmid, Benjamin; Helfrich-Förster, Charlotte; Yoshii, Taishi

    2011-10-01

    While the rapid development of personal computers and high-throughput recording systems for circadian rhythms allow chronobiologists to produce huge amounts of data, the software to analyze them often lags behind. Here, we announce newly developed chronobiology software that is easy to use, compatible with many different systems, and freely available. Our system can perform the most frequently used analyses: actogram drawing, periodogram analysis, and waveform analysis. The software is distributed as a pure Java plug-in for ImageJ and so works on the 3 main operating systems: Linux, Macintosh, and Windows. We believe that this free software speeds up data analysis and makes studying chronobiology accessible to newcomers.
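The periodogram analysis mentioned can be illustrated with a Lomb-Scargle periodogram, which also handles unevenly sampled records. This is a sketch with synthetic data, not ActogramJ's own Java implementation:

```python
# Hedged sketch: a periodogram analysis of the kind ActogramJ performs,
# here via scipy's Lomb-Scargle on a synthetic activity trace.
import numpy as np
from scipy.signal import lombscargle

t = np.arange(0, 10 * 24, 0.5)                    # 10 days, 30-min bins (hours)
activity = 1 + np.cos(2 * np.pi * t / 24.0)       # synthetic 24-h rhythm
periods = np.linspace(20, 28, 400)                # candidate periods (hours)
freqs = 2 * np.pi / periods                       # angular frequencies
power = lombscargle(t, activity - activity.mean(), freqs)
print(periods[np.argmax(power)])                  # peak near 24 h
```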

  7. [Quantitative analyses of coronary artery calcification by using clinical cardiovascular imaging].

    PubMed

    Ehara, Shoichi; Yoshiyama, Minoru

    2010-11-01

    Coronary artery calcification (CAC) is a common phenomenon, but its clinical relevance, for instance as a risk factor for plaque vulnerability, is still controversial. After the introduction of electron-beam computed tomography (EBCT), multislice computed tomography (MSCT), and intravascular ultrasound (IVUS), which enable quantitative assessment of CAC, the number of clinical studies concerning CAC has rapidly increased. In this review, we focus on the quantitative analyses of CAC using clinical cardiovascular imaging and the clinical significance of CAC. PMID:21037389
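The most widely used quantitative CAC metric on EBCT/MSCT is the Agatston score, named here as general background (the abstract itself does not specify a scoring method). A minimal sketch of the scoring rule:

```python
# Hedged sketch of Agatston-style CAC scoring (not from the paper itself):
# each calcified lesion contributes area (mm^2) x a density weight set by
# its peak attenuation in Hounsfield units (HU).
def density_weight(peak_hu):
    if peak_hu >= 400: return 4
    if peak_hu >= 300: return 3
    if peak_hu >= 200: return 2
    if peak_hu >= 130: return 1
    return 0  # below the 130-HU calcium threshold

def agatston_score(lesions):
    """lesions: iterable of (area_mm2, peak_hu) per slice lesion."""
    return sum(area * density_weight(hu) for area, hu in lesions)

print(agatston_score([(5.0, 150), (3.0, 420)]))  # 5*1 + 3*4 = 17.0
```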

  8. Immunochemical Micro Imaging Analyses for the Detection of Proteins in Artworks.

    PubMed

    Sciutto, Giorgia; Zangheri, Martina; Prati, Silvia; Guardigli, Massimo; Mirasoli, Mara; Mazzeo, Rocco; Roda, Aldo

    2016-06-01

    The present review is aimed at reporting on the most advanced and recent applications of immunochemical imaging techniques for the localization of proteins within complex and multilayered paint stratigraphies. Indeed, a paint sample usually consists of superimposed layers whose characterization is fundamental for evaluating the state of conservation and for planning proper restoration interventions. Immunochemical methods, which are based on the high selectivity of antigen-antibody reactions, were proposed some years ago in the field of cultural heritage. In addition to enzyme-linked immunosorbent assays for protein identification, immunochemical imaging methods have also been explored in recent decades, thanks to their ability to localize the target analytes, thus increasing the amount of information obtained and reducing the number of samples and/or analyses needed for a comprehensive characterization of the sample. In this review, chemiluminescent, spectroscopic and electrochemical imaging detection methods are discussed to illustrate the potential and limits of advanced immunochemical imaging systems for the analysis of paint cross-sections. PMID:27573272

  9. Image analysis and data normalization procedures are crucial for microarray analyses.

    PubMed

    Kadanga, Ali Kpatcha; Leroux, Christine; Bonnet, Muriel; Chauvet, Stéphanie; Meunier, Bruno; Cassar-Malek, Isabelle; Hocquette, Jean-François

    2008-03-17

    This study was conducted with the aim of optimizing the experimental design of array experiments. We compared two image analysis and normalization procedures prior to data analysis using two experimental designs. For this, RNA samples from Charolais steer Longissimus thoracis muscle and subcutaneous adipose tissue were labeled and hybridized to a bovine 8,400-oligonucleotide chip either in triplicate or in a dye-swap design. Image analysis and normalization were performed with either GenePix/MadScan or ImaGene/GeneSight. Statistical data analysis was then run using either the SAM method or a Student's t-test with a multiple-test correction in R 2.1 software. Our results show that the image analysis and normalization procedure had an impact on the outcome, whereas the statistical method had much less influence on which genes were identified as differentially expressed. Image analysis and data normalization are thus an important aspect of microarray experiments, with a potentially significant impact on downstream analyses such as the identification of differentially expressed genes. This study provides guidance on the choice of raw-data preprocessing in microarray technology.
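As background to why normalization matters, a minimal sketch of one common preprocessing step, median-centering log2 intensities per array, is shown below. This is illustrative only; GenePix/MadScan and ImaGene/GeneSight implement more elaborate procedures:

```python
# Hedged sketch: a minimal global normalization step of the kind these
# pipelines apply - median-centering log2 intensities so arrays are
# comparable before testing for differential expression.
import numpy as np

def median_center_log2(raw):
    """raw: genes x arrays matrix of intensities; returns centered log2 values."""
    log2 = np.log2(raw)
    return log2 - np.median(log2, axis=0)   # subtract each array's median

raw = np.array([[100., 400.],
                [200., 800.],
                [400., 1600.]])
norm = median_center_log2(raw)
print(np.median(norm, axis=0))  # -> [0. 0.]
```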

  10. Image Analysis and Data Normalization Procedures are Crucial for Microarray Analyses

    PubMed Central

    Kadanga, Ali Kpatcha; Leroux, Christine; Bonnet, Muriel; Chauvet, Stéphanie; Meunier, Bruno; Cassar-Malek, Isabelle; Hocquette, Jean-François

    2008-01-01

    This study was conducted with the aim of optimizing the experimental design of array experiments. We compared two image analysis and normalization procedures prior to data analysis using two experimental designs. For this, RNA samples from Charolais steer Longissimus thoracis muscle and subcutaneous adipose tissue were labeled and hybridized to a bovine 8,400-oligonucleotide chip either in triplicate or in a dye-swap design. Image analysis and normalization were performed with either GenePix/MadScan or ImaGene/GeneSight. Statistical data analysis was then run using either the SAM method or a Student's t-test with a multiple-test correction in R 2.1 software. Our results show that the image analysis and normalization procedure had an impact on the outcome, whereas the statistical method had much less influence on which genes were identified as differentially expressed. Image analysis and data normalization are thus an important aspect of microarray experiments, with a potentially significant impact on downstream analyses such as the identification of differentially expressed genes. This study provides guidance on the choice of raw-data preprocessing in microarray technology. PMID:19787079

  11. An accessible, scalable ecosystem for enabling and sharing diverse mass spectrometry imaging analyses.

    PubMed

    Fischer, Curt R; Ruebel, Oliver; Bowen, Benjamin P

    2016-01-01

    Mass spectrometry imaging (MSI) is used in an increasing number of biological applications. Typical MSI datasets contain unique, high-resolution mass spectra from tens of thousands of spatial locations, resulting in raw data sizes of tens of gigabytes per sample. In this paper, we review technical progress that is enabling new biological applications and that is driving an increase in the complexity and size of MSI data. Handling such data often requires specialized computational infrastructure, software, and expertise. OpenMSI, our recently described platform, makes it easy to explore and share MSI datasets via the web - even when larger than 50 GB. Here we describe the integration of OpenMSI with IPython notebooks for transparent, sharable, and replicable MSI research. An advantage of this approach is that users do not have to share raw data along with analyses; instead, data is retrieved via OpenMSI's web API. The IPython notebook interface provides a low-barrier entry point for data manipulation that is accessible for scientists without extensive computational training. Via these notebooks, analyses can be easily shared without requiring any data movement. We provide example notebooks for several common MSI analysis types including data normalization, plotting, clustering, and classification, and image registration.

  12. Multiple Local Coils in Magnetic Resonance Imaging: Design Analyses and Recommended Improvements.

    NASA Astrophysics Data System (ADS)

    Jones, Randall Wayne

    The use of local coils in Magnetic Resonance Imaging (MR) is becoming increasingly popular. Local coils offer improved image quality within their inherently smaller region-of-sensitivity (ROS) compared to that of the body coil. As the MR experiment matures, an increased demand for improvements in local anatomical imaging is placed upon MR equipment manufacturers. Developing anatomically specific quadrature detection coils is one method for increasing image quality. Another is to switch the coil's ROS to a smaller size during the scanning process, thereby improving the coil sensitivity. Also, optimizing the quality factor, or Q, of the basic coil element is important if it is to offer improvements over existing designs. Q is significantly affected by factors such as system cable coupling, coil geometry, and the decoupling mechanism--whether via varactor detuning, PIN diode switching, passive diode decoupling or inherent positioning. Analyses of these variations in coil design are presented and recommendations are given to minimize Q degradation. Computer modeling is used for analyzing Q effects of the cable and the tuning and decoupling networks. Also, a convenient program was developed for entering three-dimensional coil geometries and plotting their corresponding sensitivity profiles. Images, taken on the MR system, are provided to verify the model's plots and predictions, as well as to demonstrate the feasibility of new coil designs created as a result of the above studies. The culmination of the research effort is demonstrated by several new coil designs: a tunable, fused, pseudo-volume shoulder coil; a pseudo-volume pelvic coil; a planar, quadrature detection spine coil; a switchable-ROS, pseudo-volume neck coil; and a switchable-ROS, planar coil.
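The quality factor discussed above has the standard circuit definition, stated here as general background rather than from the thesis itself:

```latex
% Coil quality factor: stored vs dissipated energy per cycle; equivalently
% the resonance bandwidth. Added series resistance from cable coupling or
% decoupling networks raises R and therefore degrades Q.
\[
  Q \;=\; \frac{\omega_0 L}{R} \;=\; \frac{f_0}{\Delta f_{-3\,\mathrm{dB}}}
\]
```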

  13. Partial correlation analyses of global diffusion tensor imaging-derived metrics in glioblastoma multiforme: Pilot study

    PubMed Central

    Cortez-Conradis, David; Rios, Camilo; Moreno-Jimenez, Sergio; Roldan-Valadez, Ernesto

    2015-01-01

    AIM: To determine existing correlates among diffusion tensor imaging (DTI)-derived metrics in healthy brains and brains with glioblastoma multiforme (GBM). METHODS: Case-control study using DTI data from brain magnetic resonance imaging of 34 controls (mean ± SD, 41.47 ± 21.94 years; range, 21-80 years) and 27 patients with GBM (mean ± SD, 48.41 ± 15.18 years; range, 18-78 years). Image postprocessing using FSL software calculated eleven tensor metrics: fractional (FA) and relative anisotropy; pure isotropic (p) and anisotropic diffusion (q); total magnitude of diffusion (L); linear (Cl), planar (Cp) and spherical tensors (Cs); and mean (MD), axial (AD) and radial diffusivities (RD). Partial correlation analyses (controlling for the effects of age and gender) and a multivariate MANCOVA were performed. RESULTS: There was a normal distribution for all metrics. Comparing healthy brains vs brains with GBM, significant very strong bivariate correlations were depicted only in GBM: [FA↔Cl (+)], [FA↔q (+)], [p↔AD (+)], [AD↔MD (+)], and [MD↔RD (+)]. Among 56 pairs of bivariate correlations, only seven were significantly different. The diagnosis variable showed a main effect [F(11, 23) = 11.842, P ≤ 0.001], with partial eta squared = 0.850, a large effect size. Age also had a significant influence as a covariate [F(11, 23) = 10.523, P < 0.001], with a large effect size (partial eta squared = 0.834). CONCLUSION: DTI-derived metrics depict significant differences between healthy brains and brains with GBM, with specific magnitudes and correlations. This study provides reference data and contributes to reducing the underlying empiricism in the use of DTI parameters in brain imaging. PMID:26644826
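Several of the listed metrics are simple functions of the diffusion-tensor eigenvalues; a sketch using their standard definitions follows (the eigenvalues below are illustrative white-matter-like numbers, not data from the study):

```python
# Hedged sketch: how several of the listed tensor metrics follow from the
# diffusion-tensor eigenvalues (l1 >= l2 >= l3), per their standard definitions.
import numpy as np

def dti_metrics(l1, l2, l3):
    md = (l1 + l2 + l3) / 3.0                      # mean diffusivity
    ad = l1                                        # axial diffusivity
    rd = (l2 + l3) / 2.0                           # radial diffusivity
    fa = np.sqrt(1.5 * ((l1 - md)**2 + (l2 - md)**2 + (l3 - md)**2)
                 / (l1**2 + l2**2 + l3**2))        # fractional anisotropy
    return fa, md, ad, rd

fa, md, ad, rd = dti_metrics(1.7e-3, 0.3e-3, 0.2e-3)  # illustrative values
print(round(fa, 2))  # -> 0.84
```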

  14. Mosquito Larval Habitats, Land Use, and Potential Malaria Risk in Northern Belize from Satellite Image Analyses

    NASA Technical Reports Server (NTRS)

    Pope, Kevin; Masuoka, Penny; Rejmankova, Eliska; Grieco, John; Johnson, Sarah; Roberts, Donald

    2004-01-01

    The distribution of Anopheles mosquito habitats and land use in northern Belize is examined with satellite data. A land cover classification based on multispectral SPOT and multitemporal Radarsat images identified eleven land cover classes, including agricultural, forest, and marsh types. Two of the land cover types, Typha domingensis marsh and flooded forest, are Anopheles vestitipennis larval habitats. Eleocharis spp. marsh is the larval habitat for Anopheles albimanus. Geographic Information Systems (GIS) analyses of land cover demonstrate that the amount of Typha domingensis in a marsh is positively correlated with the amount of agricultural land in the adjacent upland, and negatively correlated with the amount of adjacent forest. This finding is consistent with the hypothesis that nutrient (phosphorus) runoff from agricultural lands is causing an expansion of Typha domingensis in northern Belize. This expansion of Anopheles vestitipennis larval habitat may in turn cause an increase in malaria risk in the region.

  15. Development, Capabilities, and Impact on Wind Analyses of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Miller, T.; Uhlhorn, E.; Amarin, R.; Atlas, R.; Black, P. G.; Jones, W. L.; Ruf, C. S.

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. The instrument is being test flown in January and is expected to participate in the tropical cyclone experiment GRIP (Genesis and Rapid Intensification Processes) in the 2010 season. HIRAD is being designed to study the wind field in some detail within strong hurricanes and to enhance the real-time airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track at a single point directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approximately 3 x the aircraft altitude) with approximately 2 km resolution. This paper describes the HIRAD instrument and the physical basis for its operations, including chamber test data from the instrument. The potential value of future HIRAD observations will be illustrated with a summary of Observing System Simulation Experiments (OSSEs) in which measurements from the new instrument as well as those from existing instruments (air, surface, and space-based) are simulated from the output of a detailed numerical model, and those results are used to construct simulated H*Wind analyses. Evaluations will be presented on the impact on H*Wind analyses of using the HIRAD instrument observations to replace those of the SFMR instrument, and also on the impact of a future satellite-based HIRAD in comparison to instruments with more limited capabilities for observing strong winds through heavy

  16. Validation of the automatic image analyser to assess retinal vessel calibre (ALTAIR): a prospective study protocol

    PubMed Central

    Garcia-Ortiz, Luis; Gómez-Marcos, Manuel A; Recio-Rodríguez, Jose I; Maderuelo-Fernández, Jose A; Chamoso-Santos, Pablo; Rodríguez-González, Sara; de Paz-Santana, Juan F; Merchan-Cifuentes, Miguel A; Corchado-Rodríguez, Juan M

    2014-01-01

    Introduction The fundus examination is a non-invasive evaluation of the microcirculation of the retina. The aim of the present study is to develop and validate (reliability and validity) the ALTAIR software platform (Automatic image analyser to assess retinal vessel calibre) in order to analyse its utility in different clinical environments. Methods and analysis A cross-sectional study in the first phase and a prospective observational study in the second with 4 years of follow-up. The study will be performed in a primary care centre and will include 386 participants. The main measurements will include carotid intima-media thickness, pulse wave velocity by Sphygmocor, cardio-ankle vascular index through the VASERA VS-1500, cardiac evaluation by a digital ECG and renal injury by microalbuminuria and glomerular filtration. The retinal vascular evaluation will be performed using a TOPCON TRCNW200 non-mydriatic retinal camera to obtain digital images of the retina, and the developed software (ALTAIR) will be used to automatically calculate the calibre of the retinal vessels, the vascularised area and the branching pattern. For software validation, the intraobserver and interobserver reliability, the concurrent validity of the vascular structure and function, as well as the association between the estimated retinal parameters and the evolution or onset of new lesions in the target organs or cardiovascular diseases will be examined. Ethics and dissemination The study has been approved by the clinical research ethics committee of the healthcare area of Salamanca. All study participants will sign an informed consent to agree to participate in the study in compliance with the Declaration of Helsinki and the WHO standards for observational studies. Validation of this tool will provide greater reliability to the analysis of retinal vessels by decreasing the intervention of the observer and will result in increased validity through the use of additional information, especially

  17. Correlative Imaging and Analyses of Soil Organic Matter Stabilization in the Rhizosphere

    NASA Astrophysics Data System (ADS)

    Dohnalkova, Alice; Tfaily, Malak; Chu, Rosalie; Crump, Alex; Brislawn, Colin; Varga, Tamas; Chrisler, William

    2016-04-01

    Understanding the dynamics of carbon (C) pools in soil systems is a critical area for mitigating atmospheric carbon dioxide levels and maintaining healthy soils. Although microbial contributions to stable soil carbon pools have often been regarded as low to negligible, we present evidence that microbes may play a far greater role in the stabilization of soil organic matter (SOM), thus in contributing to soil organic matter pools with longer residence time. The rhizosphere, a zone immediately surrounding the plant roots, represents a geochemical hotspot with high microbial activity and profuse SOM production. Particularly, microbially secreted extracellular polymeric substances (EPS) present a remarkable dynamic entity that plays a critical role in numerous soil processes including mineral weathering. We approach the interface of soil minerals and microbes with a focus on the organic C stabilization mechanisms. We use a suite of high-resolution imaging and analytical methods (confocal, scanning and transmission electron microscopy, Fourier transform ion cyclotron resonance mass spectrometry, DNA sequencing and X-ray diffraction), to study the living and non-living rhizosphere components. Our goal is to elucidate a pathway for the formation, storage, transformation and protection of persistent microbially-produced carbon in soils. Based on our multimodal analytical approach, we propose that persistent microbial necromass in soils accounts for considerably higher soil carbon than previously estimated.

  18. Image analyses in bauxitic ores: The case of the Apulian karst bauxites

    NASA Astrophysics Data System (ADS)

    Buccione, Roberto; Sinisi, Rosa; Mongelli, Giovanni

    2015-04-01

    This study concerns two different karst bauxite deposits of the Apulia region (southern Italy). These deposits outcrop in the Murge and Salento areas: the Murge bauxite (upper Cretaceous) is a typical canyon-like deposit formed in a karst depression, whereas the Salento bauxite (upper Eocene - Oligocene) is the result of the erosion, remobilization and transport of older bauxitic material from a relatively distant area. This particular arrangement lends its name to all similar bauxite deposits, which are thus called Salento-type deposits. The bauxite texture essentially consists of sub-circular concentric aggregates, called ooids, dispersed in a pelitic matrix. The textural properties of the two bauxitic ores, as assessed by SEM-EDX, are different. In the bauxite from the canyon-like deposit the ooids/matrix ratio is higher than in the Salento-type bauxite. Furthermore, the ooids in the Salento-type bauxite are usually made of a large core surrounded by a narrow, single accretion layer, whereas the ooids from the canyon-like deposit have a smaller core surrounded by several alternating layers of Al-hematite and boehmite (Mongelli et al., 2014). In order to explore in more detail the textural features of both bauxite deposits, particle shape analyses were performed. Image analyses and the fractal dimension have been widely used in geological studies, including economic geology (e.g., Turcotte, 1986; Meakin, 1991; Deng et al., 2011). The geometric properties evaluated are the number of ooids, average ooid size, ooid roundness and the fractal dimension D, which depends on the ooids/matrix ratio. D is the slope of a line fitted using a particular counting technique on each sample image. The fractal dimension is slightly lower for the Salento-type bauxites. Since the process which led to the formation of the ooids is related to an aggregation growth involving chemical fractionation (Mongelli, 2002), a correlation among these parameters and the contents of major
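The fractal dimension D described, the slope obtained from a counting technique on a binary (ooid vs matrix) image, can be illustrated with a basic box-counting estimator. This is a generic sketch; the authors' exact counting procedure may differ:

```python
# Hedged sketch of the counting technique: a box-counting estimate of the
# fractal dimension D as the slope of log N(s) vs log(1/s) for a binary image.
import numpy as np

def box_count(img, size):
    """Number of size x size boxes containing at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i+size, j:j+size].any():
                count += 1
    return count

def fractal_dimension(img, sizes=(1, 2, 4, 8, 16)):
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

filled = np.ones((64, 64), dtype=bool)      # a filled plane should give D ~ 2
print(round(fractal_dimension(filled), 2))  # -> 2.0
```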

  19. Autonomous Science Analyses of Digital Images for Mars Sample Return and Beyond

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Ruzon, M.; Roush, T. L.

    1999-01-01

    To adequately explore high priority landing sites, scientists require rovers with greater mobility. Therefore, future Mars missions will involve rovers capable of traversing tens of kilometers (vs. tens of meters traversed by Mars Pathfinder's Sojourner). However, the current process by which scientists interact with a rover does not scale to such distances. A single science objective is achieved through many iterations of a basic command cycle: (1) all data must be transmitted to Earth and analyzed; (2) from this data, new targets are selected and the necessary information from the appropriate instruments is requested; (3) new commands are then uplinked and executed by the spacecraft and (4) the resulting data are returned to Earth, starting the process again. Experience with rover tests on Earth shows that this time-intensive process cannot be substantially shortened given the limited data downlink bandwidth and command cycle opportunities of real missions. Sending complete multicolor panoramas at several waypoints, for example, is out of the question for a single downlink opportunity. As a result, long traverses requiring many science command cycles would likely require many weeks, months or even years, perhaps exceeding rover design life or other constraints. Autonomous onboard science analyses can address these problems in two ways. First, they will allow the rover to transmit only "interesting" images, defined as those likely to have higher science content. Second, the rover will be able to anticipate future commands, for example acquiring and returning spectra of "interesting" rocks along with the images in which they were detected. Such approaches, coupled with appropriate navigational software, address both the data volume and command cycle bottlenecks that limit both rover mobility and science yield. We are developing algorithms to enable such intelligent decision making by autonomous spacecraft. Reflecting the ultimate level of ability we aim for, this
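As a toy illustration of ranking images by likely science content (a hypothetical proxy, not the authors' algorithms), one can score frames by grayscale entropy and downlink only the top-ranked ones:

```python
# Hedged sketch: one simple onboard proxy for "interesting" images - rank
# frames by grayscale entropy and return only the top fraction. (The actual
# mission algorithms are more sophisticated; this only illustrates the idea.)
import numpy as np

def entropy(img, bins=32):
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def select_interesting(images, keep=1):
    scores = [entropy(im) for im in images]
    order = np.argsort(scores)[::-1]          # most varied scenes first
    return [images[i] for i in order[:keep]]

rng = np.random.default_rng(1)
flat = np.full((32, 32), 128.0)               # featureless scene
varied = rng.uniform(0, 256, size=(32, 32))   # texture-rich scene
chosen = select_interesting([flat, varied], keep=1)
print(np.array_equal(chosen[0], varied))  # -> True
```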

  20. Reversed cell imprinting, AFM imaging and adhesion analyses of cells on patterned surfaces.

    PubMed

    Zhou, Xiongtu; Shi, Jian; Zhang, Fan; Hu, Jie; Li, Xin; Wang, Li; Ma, Xueming; Chen, Yong

    2010-05-01

    Cell adhesion and motility depend strongly on the interactions between cells and the cell culture substratum. To observe cell morphology at the interface between cells and an artificial substratum or patterned surfaces, we have developed a technique named reversed cell imprinting. After culture and chemical fixation of the cells on a patterned hole array, a liquid polymer was poured on and UV cured, allowing the cell-polymer assembly to be peeled off for direct observation of the underside cell surface using atomic force microscopy. As expected, we observed local deformation of the cell membrane in the hole area, with a penetration depth strongly dependent on the size and depth of the hole as well as the culture time. Quantitative analyses of HeLa cells on patterned surfaces of polydimethylsiloxane (PDMS) revealed that the penetration was also position dependent over the cell attachment area due to the non-homogeneous distribution of membrane stress. With increasing culture time, the penetration depth was reduced, in close correlation with the increase of the cell spreading area. Nevertheless, both cell seeding and adhesion efficiency on high-density hole arrays could be significantly increased compared to that on a smooth surface. Patterned substrates are increasingly required to produce and interrogate new biomaterials for therapeutic benefit. Overall, this work suggests a strategy to endow conventional imaging methods with added functionality to enable easy observation of the underside cell morphology on topographic patterns. PMID:20390138

  1. Molecular cytogenetic analysis of human blastocysts and cytotrophoblasts by multi-color FISH and Spectral Imaging analyses

    SciTech Connect

    Weier, Jingly F.; Ferlatte, Christy; Baumgartner, Adolf; Jung, Christine J.; Nguyen, Ha-Nam; Chu, Lisa W.; Pedersen, Roger A.; Fisher, Susan J.; Weier, Heinz-Ulrich G.

    2006-02-08

    Numerical chromosome aberrations in gametes typically lead to failed fertilization, spontaneous abortion or a chromosomally abnormal fetus. By means of preimplantation genetic diagnosis (PGD), we now can screen human embryos in vitro for aneuploidy before transferring the embryos to the uterus. PGD allows us to select unaffected embryos for transfer and increases the implantation rate in in vitro fertilization programs. Molecular cytogenetic analyses using multi-color fluorescence in situ hybridization (FISH) of blastomeres have become the major tool for preimplantation genetic screening of aneuploidy. However, current FISH technology can test for only a small number of chromosome abnormalities and has hitherto failed to increase pregnancy rates as expected. We are in the process of developing technologies to score all 24 chromosomes in single cells within a 3 day time limit, which we believe is vital in the clinical setting. Also, human placental cytotrophoblasts (CTBs) at the fetal-maternal interface acquire aneuploidies as they differentiate to an invasive phenotype. About 20-50% of invasive CTB cells from uncomplicated pregnancies were found to be aneuploid, suggesting that the acquisition of aneuploidy is an important component of normal placentation, perhaps limiting the proliferative and invasive potential of CTBs. Since most invasive CTBs are interphase cells and possess extreme heterogeneity, we applied multi-color FISH and repeated hybridizations to investigate individual CTBs. In summary, this study demonstrates the strength of Spectral Imaging analysis and repeated hybridizations, which provides a basis for full karyotype analysis of single interphase cells.

  2. Functional magnetic resonance imaging connectivity analyses reveal efference-copy to primary somatosensory area, BA2.

    PubMed

    Cui, Fang; Arnstein, Dan; Thomas, Rajat Mani; Maurits, Natasha M; Keysers, Christian; Gazzola, Valeria

    2014-01-01

    Some theories of motor control suggest efference-copies of motor commands reach somatosensory cortices. Here we used functional magnetic resonance imaging to test these models. We varied the amount of efference-copy signal by making participants squeeze a soft material either actively or passively. We found electromyographical recordings, an efference-copy proxy, to predict activity in primary somatosensory regions, in particular Brodmann Area (BA) 2. Partial correlation analyses confirmed that brain activity in cortical structures associated with motor control (premotor and supplementary motor cortices, the parietal area PF and the cerebellum) predicts brain activity in BA2 without being entirely mediated by activity in early somatosensory (BA3b) cortex. Our study therefore provides valuable empirical evidence for efference-copy models of motor control, and shows that signals in BA2 can indeed reflect an input from motor cortices and suggests that we should interpret activations in BA2 as evidence for somatosensory-motor rather than somatosensory coding alone.
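The partial correlation used here, correlating two signals after regressing out a third, can be sketched as follows (synthetic data; the study's actual pipeline controls for additional covariates):

```python
# Hedged sketch of a partial correlation: correlate x and y after regressing
# out a confound z from both (akin to controlling for BA3b activity).
import numpy as np

def partial_corr(x, y, z):
    Z = np.column_stack([np.ones_like(z), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # residualize x on z
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]   # residualize y on z
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
z = rng.normal(size=500)
x = z + rng.normal(size=500)          # x and y are related only through z
y = z + rng.normal(size=500)
print(round(np.corrcoef(x, y)[0, 1], 2))  # raw correlation is sizeable
print(round(partial_corr(x, y, z), 2))    # near zero once z is controlled
```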

  3. Functional Magnetic Resonance Imaging Connectivity Analyses Reveal Efference-Copy to Primary Somatosensory Area, BA2

    PubMed Central

    Cui, Fang; Arnstein, Dan; Thomas, Rajat Mani; Maurits, Natasha M.; Keysers, Christian; Gazzola, Valeria

    2014-01-01

    Some theories of motor control suggest efference-copies of motor commands reach somatosensory cortices. Here we used functional magnetic resonance imaging to test these models. We varied the amount of efference-copy signal by making participants squeeze a soft material either actively or passively. We found electromyographical recordings, an efference-copy proxy, to predict activity in primary somatosensory regions, in particular Brodmann Area (BA) 2. Partial correlation analyses confirmed that brain activity in cortical structures associated with motor control (premotor and supplementary motor cortices, the parietal area PF and the cerebellum) predicts brain activity in BA2 without being entirely mediated by activity in early somatosensory (BA3b) cortex. Our study therefore provides valuable empirical evidence for efference-copy models of motor control, and shows that signals in BA2 can indeed reflect an input from motor cortices and suggests that we should interpret activations in BA2 as evidence for somatosensory-motor rather than somatosensory coding alone. PMID:24416222

  4. X-ray digital imaging petrography of lunar mare soils: modal analyses of minerals and glasses

    NASA Technical Reports Server (NTRS)

    Taylor, L. A.; Patchen, A.; Taylor, D. H.; Chambers, J. G.; McKay, D. S.

    1996-01-01

    It is essential that accurate modal (i.e., volume) percentages of the various mineral and glass phases in lunar soils be used for addressing and resolving the effects of space weathering upon reflectance spectra, as well as for their calibration. Such data are also required for evaluating the resource potential of lunar minerals for use at a lunar base. However, these data are largely lacking. Particle-counting information for lunar soils, originally obtained to study formational processes, does not provide these necessary data, including the percentages of minerals locked in multi-phase lithic fragments and fused-soil particles, such as agglutinates. We have developed a technique for modal analyses, sensu stricto, of lunar soils, using digital imaging of X-ray maps obtained with an energy-dispersive spectrometer mounted on an electron microprobe. A suite of nine soils (90 to 150 micrometers size fraction) from the Apollo 11, 12, 15, and 17 mare sites was used for this study. This is the first collection of such modal data on soils from all Apollo mare sites. The abundances of free-mineral fragments in the mare soils are greater for immature and submature soils than for mature soils, largely because of the formation of agglutinitic glass as maturity progresses. In considerations of resource utilization at a lunar base, the best lunar soils to use for mineral beneficiation (i.e., most free-mineral fragments) have maturities near the immature/submature boundary (Is/FeO ≈ 30), not the mature soils with their complications due to extensive agglutination. The particle data obtained from the nine mare soils confirm the generalizations for lunar soils predicted by L.A. Taylor and D.S. McKay (1992, Lunar Planet Sci. Conf. 23rd, pp. 1411-1412 [Abstract]).
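The digital-imaging approach, assigning each X-ray-map pixel to a phase and tallying areas, can be sketched as a nearest-reference classification. All compositions below are hypothetical placeholders, not the authors' calibration:

```python
# Hedged sketch: modal analysis from element maps - classify each pixel to the
# nearest reference composition and report volume (area) percentages.
import numpy as np

# Hypothetical reference compositions in (Si, Fe, Ca) X-ray count space
REFS = {"plagioclase": (40.0, 2.0, 15.0),
        "pyroxene":    (35.0, 15.0, 8.0),
        "ilmenite":    (2.0, 45.0, 1.0)}

def modal_percent(pixels):
    """pixels: N x 3 array of per-pixel element counts -> phase percentages."""
    names = list(REFS)
    refs = np.array([REFS[n] for n in names])
    d = np.linalg.norm(pixels[:, None, :] - refs[None, :, :], axis=2)
    labels = d.argmin(axis=1)                   # nearest reference wins
    counts = np.bincount(labels, minlength=len(names))
    return {n: 100.0 * c / len(pixels) for n, c in zip(names, counts)}

pixels = np.array([(40, 2, 15), (39, 3, 14), (2, 44, 1), (34, 16, 9)], float)
print(modal_percent(pixels))  # plagioclase 50%, pyroxene 25%, ilmenite 25%
```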

  5. Spatiotemporal Analyses of Osteogenesis and Angiogenesis via Intravital Imaging in Cranial Bone Defect Repair

    PubMed Central

    Huang, Chunlan; Ness, Vincent P.; Yang, Xiaochuan; Chen, Hongli; Luo, Jiebo; Brown, Edward B; Zhang, Xinping

    2015-01-01

    Osteogenesis and angiogenesis are two integrated components in bone repair and regeneration. A deeper understanding of osteogenesis and angiogenesis has been hampered by technical difficulties of analyzing bone and neovasculature simultaneously at spatiotemporal scales and in three-dimensional formats. To overcome these barriers, a cranial defect window chamber model was established that enabled high-resolution, longitudinal, and real-time tracking of angiogenesis and bone defect healing via Multiphoton Laser Scanning Microscopy (MPLSM). By simultaneously probing new bone matrix via second harmonic generation (SHG), neovascular networks via intravenous perfusion of fluorophore, and osteoblast differentiation via 2.3kb collagen type I promoter driven GFP (Col2.3GFP), we examined the morphogenetic sequence of cranial bone defect healing and further established the spatiotemporal analyses of osteogenesis and angiogenesis coupling in repair and regeneration. We demonstrated that bone defect closure was initiated in the residual bone around the edge of the defect. The expansion and migration of osteoprogenitors into the bone defect occurred during the first 3 weeks of healing, coupled with vigorous microvessel angiogenesis at the leading edge of the defect. Subsequent bone repair was marked by matrix deposition and active vascular network remodeling within new bone. Implantation of bone marrow stromal cells (BMSCs) isolated from Col2.3GFP mice further showed that donor-dependent bone formation occurred rapidly within the first 3 weeks of implantation, in concert with early angiogenesis. The subsequent bone wound closure was largely host-dependent, associated with localized modest induction of angiogenesis. The establishment of a live imaging platform via cranial window provides a unique tool to understand osteogenesis and angiogenesis in repair and regeneration, enabling further elucidation of the spatiotemporal regulatory mechanisms of osteoprogenitor cell interactions.

  6. Low-Rank Atlas Image Analyses in the Presence of Pathologies.

    PubMed

    Liu, Xiaoxiao; Niethammer, Marc; Kwitt, Roland; Singh, Nikhil; McCormick, Matt; Aylward, Stephen

    2015-12-01

    We present a common framework for registering images to an atlas and for forming an unbiased atlas that tolerates the presence of pathologies such as tumors and traumatic brain injury lesions. This common framework is particularly useful when a sufficient number of protocol-matched scans from healthy subjects cannot be easily acquired for atlas formation and when the pathologies in a patient cause large appearance changes. Our framework combines a low-rank-plus-sparse image decomposition technique with an iterative, diffeomorphic, group-wise image registration method. At each iteration of image registration, the decomposition technique estimates a "healthy" version of each image as its low-rank component and estimates the pathologies in each image as its sparse component. The healthy version of each image is used for the next iteration of image registration. The low-rank and sparse estimates are refined as the image registrations iteratively improve. For unbiased atlas formation, at each iteration, the average of the low-rank images from the patients is used as the atlas image for the next iteration, until convergence. Since each iteration's atlas comprises low-rank components, it provides a population-consistent, pathology-free appearance. Evaluations of the proposed methodology are presented using synthetic data as well as simulated and clinical tumor MRI images from the brain tumor segmentation (BRATS) challenge from MICCAI 2012.
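    The low-rank-plus-sparse step can be illustrated with a simplified alternating-thresholding scheme. This is a generic RPCA-style sketch, not the solver used in the paper; the threshold heuristics and iteration count are assumptions:

```python
import numpy as np

def low_rank_plus_sparse(D, lam=None, n_iter=200):
    """Split D into L (low-rank, "healthy" appearance) + S (sparse,
    pathology) by alternating singular-value thresholding on L and
    entrywise soft thresholding on S."""
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = 0.25 * np.abs(D).mean()   # heuristic threshold scale
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    for _ in range(n_iter):
        # low-rank update: shrink singular values of D - S
        U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = U @ np.diag(np.maximum(s - mu, 0.0)) @ Vt
        # sparse update: shrink entries of the residual
        R = D - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam * mu, 0.0)
    return L, S
```

    In the registration framework, `L` would feed the next registration iteration while `S` flags the pathology.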

  7. Detection of melamine in milk powders based on NIR hyperspectral imaging and spectral similarity analyses

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Melamine (2,4,6-triamino-1,3,5-triazine) contamination of food has become an urgent and broadly recognized topic as a result of several food safety scares in the past five years. Hyperspectral imaging techniques that combine the advantages of spectroscopy and imaging have been widely applied for a v...

  8. Maximizing Science Return from Future Mars Missions with Onboard Image Analyses

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Bandari, E. B.; Roush, T. L.

    2000-01-01

    We have developed two new techniques to enhance science return and to decrease returned data volume for near-term Mars missions: 1) multi-spectral image compression and 2) autonomous identification and fusion of in-focus regions in an image series.

  9. Edge detection and image segmentation of space scenes using fractal analyses

    NASA Technical Reports Server (NTRS)

    Cleghorn, Timothy F.; Fuller, J. J.

    1992-01-01

    A method was developed for segmenting images of space scenes into manmade and natural components, using fractal dimensions and lacunarities. Calculations of these parameters are presented. Results are presented for a variety of aerospace images, showing that it is possible to perform edge detection of manmade objects against natural backgrounds such as those seen in an aerospace environment.
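    Fractal dimension is the core measurement behind this kind of segmentation. A minimal box-counting sketch for a binary image (the box sizes and the `box_counting_dimension` helper are illustrative, not the authors' exact procedure):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary image by box
    counting: count occupied boxes N(s) at each box size s and fit
    log N(s) = -D log s + c."""
    counts = []
    for s in sizes:
        h, w = mask.shape
        # trim so the image tiles exactly into s x s boxes
        trimmed = mask[: h - h % s, : w - w % s]
        boxes = trimmed.reshape(trimmed.shape[0] // s, s,
                                trimmed.shape[1] // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

    A filled region gives a dimension near 2 and a thin curve near 1; manmade objects with regular edges tend to separate from natural textures in this measure.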

  10. JChainsAnalyser: an ImageJ-based stand-alone application for the study of magneto-rheological fluids

    NASA Astrophysics Data System (ADS)

    Domínguez-García, P.; Rubio, M. A.

    2009-10-01

    JChainsAnalyser is a Java-based program for the analysis of two-dimensional images of magneto-rheological fluids (MRF) at low particle concentration obtained using the video-microscopy technique. MRF are colloidal dispersions of micron-sized polarizable particles in a carrier fluid with medium to low viscosity. When a magnetic field is applied to the suspension, the particles aggregate, forming chains or clusters. Aggregation dynamics [P. Domínguez-García, S. Melle, J.M. Pastor, M.A. Rubio, Phys. Rev. E 76 (2007) 051403] and the morphology of the aggregates [P. Domínguez-García, S. Melle, M.A. Rubio, J. Colloid Interface Sci. 333 (2009) 221-229] have been studied by capturing images of the fluid and analyzing them with this software. The program analyzes the MRF images automatically by means of an adequate combination of different imaging methods, while magnitudes and statistics are calculated and saved in data files. It can be run on a desktop computer, using the GUI (graphical user interface), or on a cluster of processors or a remote computer by means of command-line instructions.
    Program summary
    Program title: JChainsAnalyser
    Catalogue identifier: AEDT_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDT_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 79 071
    No. of bytes in distributed program, including test data, etc.: 4 367 909
    Distribution format: tar.gz
    Programming language: Java 2
    Computer: Any computer with Java Runtime Environment (JRE) installed
    Operating system: Any OS with Java Runtime Environment (JRE) installed
    RAM: Typically, 3.3 MB
    Classification: 23
    External routines: ImageJ, ij-imageIO, jdom, L2FProd
    Nature of problem: The video-microscopy technique usually produces quite a big quantity of images to analyze

  11. Effect of Harderian adenectomy on the statistical analyses of mouse brain imaging using positron emission tomography.

    PubMed

    Kim, Minsoo; Woo, Sang-Keun; Yu, Jung Woo; Lee, Yong Jin; Kim, Kyeong Min; Kang, Joo Hyun; Eom, Kidong; Nahm, Sang-Soep

    2014-01-01

    Positron emission tomography (PET) using 2-deoxy-2-[(18)F] fluoro-D-glucose (FDG) as a radioactive tracer is a useful technique for in vivo brain imaging. However, the anatomical and physiological features of the Harderian gland limit the use of FDG-PET imaging in the mouse brain. The gland shows strong FDG uptake, which in turn results in distorted PET images of the frontal brain region. The purpose of this study was to determine if a simple surgical procedure to remove the Harderian gland prior to PET imaging of mouse brains could reduce or eliminate FDG uptake. Measurement of FDG uptake in unilaterally adenectomized mice showed that the radioactive signal emitted from the intact Harderian gland distorts frontal brain region images. Spatial parametric measurement analysis demonstrated that the presence of the Harderian gland could prevent accurate assessment of brain PET imaging. Bilateral Harderian adenectomy efficiently eliminated unwanted radioactive signal spillover into the frontal brain region beginning on postoperative Day 10. Harderian adenectomy did not cause any post-operative complications during the experimental period. These findings demonstrate the benefits of performing a Harderian adenectomy prior to PET imaging of mouse brains.

  12. Should processed or raw image data be used in mammographic image quality analyses? A comparative study of three full-field digital mammography systems.

    PubMed

    Borg, Mark; Badr, Ishmail; Royle, Gary

    2015-01-01

    The purpose of this study is to compare a number of measured image quality parameters using processed and unprocessed or raw images in two full-field direct digital units and one computed radiography mammography system. This study shows that the difference between raw and processed image data is system specific. The results have shown that there are no significant differences between raw and processed data in the mean threshold contrast values using the contrast-detail mammography phantom in all the systems investigated; however, these results cannot be generalised to all available systems. Notable differences were noted in contrast-to-noise ratios and in other tests including: response function, modulation transfer function, noise equivalent quanta, normalised noise power spectra and detective quantum efficiency as specified in IEC 62220-1-2. Consequently, the authors strongly recommend the use of raw data for all image quality analyses in digital mammography.
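    The contrast-to-noise comparison between raw and processed data rests on a simple region-of-interest statistic. A hedged sketch of one common CNR definition (protocols differ in the noise term, and the ROI choice here is illustrative):

```python
import numpy as np

def contrast_to_noise_ratio(img, roi_signal, roi_background):
    """CNR between a detail ROI and a background ROI:
    (mean_signal - mean_background) / std_background.
    ROIs are (row_slice, col_slice) tuples.  This is one common
    definition; measurement protocols differ on the noise term."""
    s = img[roi_signal]
    b = img[roi_background]
    return (s.mean() - b.mean()) / b.std()
```

    On processed images the vendor's nonlinear lookup tables alter both the contrast and the noise term, which is why the same phantom can yield system-specific CNR differences between raw and processed data.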

  13. Applying I-FGM to image retrieval and an I-FGM system performance analyses

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Santos, Eunice E.; Nguyen, Hien; Pan, Long; Korah, John; Zhao, Qunhua; Xia, Huadong

    2007-04-01

    Intelligent Foraging, Gathering and Matching (I-FGM) combines a unique multi-agent architecture with a novel partial processing paradigm to provide a solution for real-time information retrieval in large and dynamic databases. I-FGM provides a unified framework for combining the results from various heterogeneous databases and seeks to provide easily verifiable performance guarantees. In our previous work, I-FGM was implemented and validated with experiments on dynamic text data. However, the heterogeneity of search spaces requires that our system be able to handle various types of data effectively. Besides text, images are the most significant and fundamental data for information retrieval. In this paper, we extend the I-FGM system to incorporate images in its search spaces using a region-based wavelet image retrieval algorithm called WALRUS. As we did for text retrieval, we modified the WALRUS algorithm to partially and incrementally extract the regions from an image and measure the similarity value of the image. Based on the partial results obtained, we reallocate computational resources by updating the priority values of image documents. Experiments have been conducted on the I-FGM system with image retrieval. The results show that I-FGM outperforms its control systems. In this paper we also present a theoretical analysis of the systems with a focus on performance. Based on probability theory, we provide models and predictions of the average performance of the I-FGM system and its two control systems, as well as of the systems without partial processing.

  14. Developing a dermatological photodiagnosis system by optical image analyses and in vivo study.

    PubMed

    Chang, Chung-Hsing; Lin, Yu-Hsuan; Li, Cheng-Ru; Chang, Chun-Ming; Hung, Chih-Wei; Chang, Han-Chao

    2016-09-01

    Dermatological photodynamic therapy (DPDT) involves using systematic photosensitizers in combination with light irradiation treatment to eliminate cancer cells. Therefore, a noninvasive fluorescence photodiagnosis system is critical in DPDT for diagnosing target tissues and demarcating the margin of normal tissues. This study proposes a 375-nm ring LED light module in fluorescence imaging for DPDT applications. Image reproducibility (I.R.) and image interference (I.I.) analyses were performed. The results showed that the I.R. value of this fluorescence diagnostic system was higher than 99.0%, and the I.I. from external light sources was lower than 3.0%. In addition, the result of an in vivo study showed that the Chlorin e6 red fluorescence and the scope of distribution of B16-F10 melanoma cells in a mouse ear's vein could be measured clearly using our device; however, the comparison study using 395-nm LED lights could not focus on or capture the red fluorescence effectively.

  15. THE NEGLECTED SIDE OF THE COIN: QUANTITATIVE BENEFIT-RISK ANALYSES IN MEDICAL IMAGING

    PubMed Central

    Zanzonico, Pat B.

    2016-01-01

    While it is implicitly recognized that the benefits of diagnostic imaging far outweigh any theoretical radiogenic risks, quantitative estimates of the benefits are rarely, if ever, juxtaposed with quantitative estimates of risk. This alone - expression of benefit in purely qualitative terms versus expression of risk in quantitative, and therefore seemingly more certain, terms - may well contribute to a skewed sense of the relative benefits and risks of diagnostic imaging among healthcare providers as well as patients. The current paper, therefore, briefly compares the benefits of diagnostic imaging in several cases, based on actual mortality or morbidity data if ionizing radiation were not employed, with theoretical estimates of radiogenic cancer mortality based on the “linear no-threshold” (LNT) dose-response model. PMID:26808890

  16. The Neglected Side of the Coin: Quantitative Benefit-risk Analyses in Medical Imaging.

    PubMed

    Zanzonico, Pat B

    2016-03-01

    While it is implicitly recognized that the benefits of diagnostic imaging far outweigh any theoretical radiogenic risks, quantitative estimates of the benefits are rarely, if ever, juxtaposed with quantitative estimates of risk. This alone - expression of benefit in purely qualitative terms versus expression of risk in quantitative, and therefore seemingly more certain, terms - may well contribute to a skewed sense of the relative benefits and risks of diagnostic imaging among healthcare providers as well as patients. The current paper, therefore, briefly compares the benefits of diagnostic imaging in several cases, based on actual mortality or morbidity data if ionizing radiation were not employed, with theoretical estimates of radiogenic cancer mortality based on the "linear no-threshold" (LNT) dose-response model. PMID:26808890

  17. Formal Distinctiveness of High- and Low-Imageability Nouns: Analyses and Theoretical Implications

    ERIC Educational Resources Information Center

    Reilly, Jamie; Kean, Jacob

    2007-01-01

    Words associated with perceptually salient, highly imageable concepts are learned earlier in life, more accurately recalled, and more rapidly named than abstract words (R. W. Brown, 1976; Walker & Hulme, 1999). Theories accounting for this concreteness effect have focused exclusively on semantic properties of word referents. A novel possibility is…

  18. Three-dimensional imaging system for analyses of dynamic droplet impaction and deposition formation on leaves

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A system was developed to assess the dynamic processes of droplet impact, rebound and retention on leaf surfaces with three-dimensional (3-D) images. The system components consisted of a uniform-size droplet generator, two high speed digital video cameras, a constant speed track, a leaf holder, and ...

  19. Optimizing Laguerre expansion based deconvolution methods for analysing bi-exponential fluorescence lifetime images.

    PubMed

    Zhang, Yongliang; Chen, Yu; Li, David Day-Uei

    2016-06-27

    Fast deconvolution is an essential step to calibrate instrument responses in big fluorescence lifetime imaging microscopy (FLIM) image analysis. This paper examined a computationally effective least squares deconvolution method based on Laguerre expansion (LSD-LE), recently developed for clinical diagnosis applications, and proposed new criteria for selecting Laguerre basis functions (LBFs) without considering the mutual orthonormalities between LBFs. Compared with the previously reported LSD-LE, the improved LSD-LE allows a higher laser repetition rate to be used, reducing the acquisition time per measurement. Moreover, we extended it, for the first time, to analyze bi-exponential fluorescence decays for more general FLIM-FRET applications. The proposed method was tested on both synthesized bi-exponential and realistic FLIM data for studying the endocytosis of gold nanorods in Hek293 cells. Compared with the previously reported constrained LSD-LE, it shows promising results. PMID:27410552
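    The LSD-LE idea is to expand the decay on Laguerre basis functions, convolve each basis function with the instrument response, and solve a single linear least-squares problem for the expansion weights. A sketch under stated assumptions: the basis is built with a standard all-pass-cascade discrete Laguerre construction (pole √alpha), and the order and alpha values are illustrative, not the paper's selection criteria:

```python
import numpy as np

def laguerre_basis(n_points, order, alpha=0.94):
    """Discrete-time Laguerre functions built by an all-pass cascade
    with pole sqrt(alpha); rows are basis functions b_j[n]."""
    B = np.zeros((order, n_points))
    sa = np.sqrt(alpha)
    B[0] = sa ** np.arange(n_points) * np.sqrt(1.0 - alpha)
    for j in range(1, order):
        for n in range(n_points):
            prev = B[j, n - 1] if n > 0 else 0.0
            prev2 = B[j - 1, n - 1] if n > 0 else 0.0
            B[j, n] = sa * prev + sa * B[j - 1, n] - prev2
    return B

def fit_decay(measured, irf, order=12, alpha=0.94):
    """Least-squares Laguerre-expansion deconvolution sketch:
    model the decay as a Laguerre expansion, convolve each basis
    function with the instrument response, and solve for weights."""
    n = len(measured)
    B = laguerre_basis(n, order, alpha)
    # convolve every basis function with the IRF (causal, truncated)
    conv = np.array([np.convolve(b, irf)[:n] for b in B])
    coeffs, *_ = np.linalg.lstsq(conv.T, measured, rcond=None)
    return coeffs @ B          # reconstructed (deconvolved) decay
```

    Because the fit is linear in the weights, a bi-exponential decay is handled the same way as a mono-exponential one; the lifetimes are then extracted from the reconstructed decay in a separate step.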

  20. Measurements and simulations analysing the noise behaviour of grating-based X-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Weber, T.; Bartl, P.; Durst, J.; Haas, W.; Michel, T.; Ritter, A.; Anton, G.

    2011-08-01

    In recent decades, phase-contrast imaging using a Talbot-Lau grating interferometer has become possible even with a low-brilliance X-ray source. With its potential to increase soft-tissue contrast, this method is on its way into medical imaging. For this purpose, knowledge of the underlying physics of the technique is necessary. With this paper, we would like to contribute to the understanding of grating-based phase-contrast imaging by presenting results of measurements and simulations regarding the noise behaviour of the differential phases. The measurements were done using a microfocus X-ray tube with a hybrid, photon-counting, semiconductor Medipix2 detector. The additional simulations were performed by our in-house developed phase-contrast simulation tool "SPHINX", which combines both wave and particle contributions of the simulated photons. The results obtained by both methods show the same behaviour: increasing the number of photons leads to a linear decrease of the standard deviation of the phase, while the number of phase steps used has no influence on the standard deviation if the total number of photons is held constant. Furthermore, the probability density function (pdf) of the reconstructed differential phases was analysed. It turned out that the so-called von Mises distribution is the physically correct pdf, which was also confirmed by measurements. This information advances the understanding of grating-based phase-contrast imaging and can be used to improve image quality.
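    Phase stepping itself is easy to sketch: each pixel's stepping curve is a sampled sinusoid, and the differential phase and visibility fall out of its first Fourier coefficient. A minimal single-pixel illustration (not the SPHINX simulation):

```python
import numpy as np

def stepping_curve_params(intensities):
    """Retrieve phase and visibility from one pixel's phase-stepping
    curve I_k = a0 + a1*cos(2*pi*k/K + phi), sampled at K equidistant
    grating positions: the first Fourier coefficient carries phi and
    a1, the zeroth carries a0."""
    X = np.fft.fft(intensities)
    phase = np.angle(X[1])                  # differential phase phi
    visibility = 2.0 * np.abs(X[1]) / np.abs(X[0])  # a1 / a0
    return phase, visibility
```

    The noise results in the abstract concern the spread of `phase` over repeated measurements; with photon noise added, its distribution follows the von Mises form the authors identify.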

  1. Use of Very High-Resolution Airborne Images to Analyse 3d Canopy Architecture of a Vineyard

    NASA Astrophysics Data System (ADS)

    Burgos, S.; Mota, M.; Noll, D.; Cannelle, B.

    2015-08-01

    Differentiating between green cover and grape canopy is a challenge for vigour status evaluation in viticulture. This paper presents the acquisition methodology of very high-resolution images (4 cm), using a Sensefly Swinglet CAM unmanned aerial vehicle (UAV) and their processing to construct a 3D digital surface model (DSM) for the creation of precise digital terrain models (DTM). The DTM was obtained using python processing libraries. The DTM was then subtracted from the DSM in order to obtain a differential digital model (DDM) of a vineyard. In the DDM, the vine pixels were then obtained by selecting all pixels with an elevation higher than 50 [cm] above the ground level. The results show that it was possible to separate pixels from the green cover and the vine rows. The DDM showed values between -0.1 and +1.5 [m]. Manual delineation of polygons on the RGB image belonging to the green cover and to the vine rows gave highly significant differences, with average values of 1.23 [m] and 0.08 [m] for the vine and the ground respectively. The vine-row elevation is in good accordance with the topping height of the vines, 1.35 [m], measured in the field. This mask could be used to analyse images of the same plot taken at different times. The extraction of only vine pixels will facilitate subsequent analyses, for example, a supervised classification of these pixels.
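    The DSM-to-mask pipeline reduces to one subtraction and one threshold. A minimal sketch (the array inputs and the 0.5 m threshold mirror the text; the `vine_mask` helper name is ours):

```python
import numpy as np

def vine_mask(dsm, dtm, height_threshold=0.5):
    """Differential digital model (DDM = DSM - DTM) and a canopy mask
    keeping pixels more than `height_threshold` metres above ground
    level, separating vine rows from low green cover."""
    ddm = dsm - dtm
    return ddm, ddm > height_threshold
```

    Applied to co-registered rasters, the returned mask can then be reused on images of the same plot acquired at different dates.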

  2. Contextualising and Analysing Planetary Rover Image Products through the Web-Based PRoGIS

    NASA Astrophysics Data System (ADS)

    Morley, Jeremy; Sprinks, James; Muller, Jan-Peter; Tao, Yu; Paar, Gerhard; Huber, Ben; Bauer, Arnold; Willner, Konrad; Traxler, Christoph; Garov, Andrey; Karachevtseva, Irina

    2014-05-01

    The international planetary science community has launched, landed and operated dozens of human and robotic missions to the planets and the Moon. They have collected various surface imagery that has only been partially utilized for further scientific purposes. The FP7 project PRoViDE (Planetary Robotics Vision Data Exploitation) is assembling a major portion of the imaging data gathered so far from planetary surface missions into a unique database, bringing them into a spatial context and providing access to a complete set of 3D vision products. Processing is complemented by a multi-resolution visualization engine that combines various levels of detail for a seamless and immersive real-time access to dynamically rendered 3D scenes. PRoViDE aims to (1) complete relevant 3D vision processing of planetary surface missions, such as Surveyor, Viking, Pathfinder, MER, MSL, Phoenix, Huygens, and Lunar ground-level imagery from Apollo, Russian Lunokhod and selected Luna missions, (2) provide highest resolution & accuracy remote sensing (orbital) vision data processing results for these sites to embed the robotic imagery and its products into spatial planetary context, (3) collect 3D vision processing and remote sensing products within a single coherent spatial data base, (4) realise seamless fusion between orbital and ground vision data, (5) demonstrate the potential of planetary surface vision data by maximising image quality visualisation in a 3D publishing platform, (6) collect and formulate use cases for novel scientific application scenarios exploiting the newly introduced spatial relationships and presentation, (7) demonstrate the concepts for MSL, (8) realize on-line dissemination of key data & its presentation by a web-based GIS and rendering tool named PRoGIS (Planetary Robotics GIS). PRoGIS is designed to give access to rover image archives in geographical context, using projected image view cones, obtained from existing meta-data and updated according to

  3. Analysed cap mesenchyme track data from live imaging of mouse kidney development.

    PubMed

    Lefevre, James G; Combes, Alexander N; Little, Melissa H; Hamilton, Nicholas A

    2016-12-01

    This article provides detailed information on manually tracked cap mesenchyme cells from timelapse imaging of multiple ex vivo embryonic mouse kidneys. Cells were imaged for up to 18 h at 15 or 20 min intervals, and multiple cell divisions were tracked. Positional data is supplemented with a range of information including the relative location of the closest ureteric tip and a correction for drift due to bulk movement and tip growth. A subset of tracks were annotated to indicate the presence of processes attached to the ureteric epithelium. The calculations used for drift correction are described, as are the main methods used in the analysis of this data for the purpose of describing cap cell motility. The outcomes of this analysis are discussed in "Cap mesenchyme cell swarming during kidney development is influenced by attraction, repulsion, and adhesion to the ureteric tip" (A.N. Combes, J.G. Lefevre, S. Wilson, N.A. Hamilton, M.H. Little, 2016) [1]. PMID:27642621
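    The drift correction described can be sketched as subtracting the cumulative mean frame-to-frame displacement of all cells. This simplified version assumes gap-free tracks stored as one array, unlike the per-track handling in the published dataset:

```python
import numpy as np

def correct_drift(tracks):
    """Remove bulk drift from cell tracks: subtract, frame by frame,
    the mean displacement of all cells across consecutive frames.

    tracks: (n_cells, n_frames, 2) array of x, y positions with no
    gaps (a simplification of the per-track data in the article).
    """
    disp = np.diff(tracks, axis=1)        # per-cell displacements
    drift = disp.mean(axis=0)             # bulk drift per frame step
    # cumulative drift, with zero offset for the first frame
    cum = np.concatenate([np.zeros((1, 2)), np.cumsum(drift, axis=0)])
    return tracks - cum[None, :, :]
```

    After this correction, residual cell motion reflects motility relative to the tissue rather than bulk movement or tip growth.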

  4. Capabilities and Impact on Wind Analyses of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Miller, Timothy L.; Amarin, Ruba; Atlas, Robert; Bailey, M. C.; Black, Peter; Buckley, Courtney; James, Mark; Johnson, James; Jones, Linwood; Ruf, Christopher; Simmons, David; Uhlhorn, Eric

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. The instrument is being test flown in January and is expected to participate in or collaborate with the tropical cyclone experiment GRIP (Genesis and Rapid Intensification Processes) in the 2010 season. HIRAD is designed to study the wind field in some detail within strong hurricanes and to enhance the real-time airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track at a single point directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approx. 3× the aircraft altitude) with approx. 2 km resolution. See Figure 1, which depicts a simulated HIRAD swath versus the line of data obtained by SFMR.

  5. X-ray fluorescence and imaging analyses of paintings by the Brazilian artist Oscar Pereira Da Silva

    NASA Astrophysics Data System (ADS)

    Campos, P. H. O. V.; Kajiya, E. A. M.; Rizzutto, M. A.; Neiva, A. C.; Pinto, H. P. F.; Almeida, P. A. D.

    2014-02-01

    Non-destructive analyses, such as EDXRF (Energy-Dispersive X-Ray Fluorescence) spectroscopy, and imaging were used to characterize easel paintings. The analyzed objects are from the collection of the Pinacoteca do Estado de São Paulo. EDXRF results allowed us to identify the chemical elements present in the pigments, showing the use of many Fe-based pigments, modern pigments, such as cobalt blue and cadmium yellow, as well as white pigments containing lead and zinc used by the artist in different layers. Imaging analysis was useful to identify the state of conservation, the localization of old and new restorations and also to detect and unveil the underlying drawings revealing the artist's creative processes.

  6. ICPES analyses using full image spectra and astronomical data fitting algorithms to provide diagnostic and result information

    SciTech Connect

    Spencer, W.A.; Goode, S.R.

    1997-10-01

    ICP emission analyses are prone to errors due to changes in power level, nebulization rate, plasma temperature, and sample matrix. As a result, accurate analyses of complex samples often require frequent bracketing with matrix matched standards. Information needed to track and correct the matrix errors is contained in the emission spectrum. But most commercial software packages use only the analyte line emission to determine concentrations. Changes in plasma temperature and the nebulization rate are reflected by changes in the hydrogen line widths, the oxygen emission, and neutral ion line ratios. Argon and off-line emissions provide a measure to correct the power level and the background scattering occurring in the polychromator. The authors' studies indicated that changes in the intensity of the Ar 404.4 nm line readily flag most matrix and plasma condition modifications. Carbon lines can be used to monitor the impact of organics on the analyses and calcium and argon lines can be used to correct for spectral drift and alignment. Spectra of contaminated groundwater and simulated defense waste glasses were obtained using a Thermo Jarrell Ash ICP that has an echelle CID detector system covering the 190-850 nm range. The echelle images were translated to the FITS data format, which astronomers recommend for data storage. Data reduction packages such as those in the ESO-MIDAS/ECHELLE and DAOPHOT programs were tried with limited success. The radial point spread function was evaluated as a possible improved peak intensity measurement instead of the common pixel averaging approach used in the commercial ICP software. Several algorithms were evaluated to align and automatically scale the background and reference spectra. A new data reduction approach that utilizes standard reference images, successive subtractions, and residual analyses has been evaluated to correct for matrix effects.

  7. Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems.

    PubMed

    Teodoro, George; Kurc, Tahsin M; Pan, Tony; Cooper, Lee A D; Kong, Jun; Widener, Patrick; Saltz, Joel H

    2012-05-01

    The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches.
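    The scheduling idea can be illustrated with a greedy earliest-finish-time rule: each task is placed on whichever processor would complete it first given current queue lengths. This is a toy sketch of performance-aware co-scheduling, not the paper's scheduler, and the task model (one CPU, one GPU, known per-device times) is an assumption:

```python
def schedule(tasks):
    """Greedy performance-aware assignment: each task goes to
    whichever device, CPU or GPU, would finish it earliest given the
    work already queued there.

    tasks: list of (cpu_time, gpu_time) estimates.
    Returns the per-task device list and the overall makespan.
    """
    busy = {"cpu": 0.0, "gpu": 0.0}
    assignment = []
    for cpu_t, gpu_t in tasks:
        finish = {"cpu": busy["cpu"] + cpu_t,
                  "gpu": busy["gpu"] + gpu_t}
        dev = min(finish, key=finish.get)   # earliest finish time
        busy[dev] = finish[dev]
        assignment.append(dev)
    return assignment, max(busy.values())
```

    Even this naive rule beats a GPU-only policy whenever the CPU can absorb overflow work, which is the under-utilization the paper targets.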

  9. Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems

    PubMed Central

    Teodoro, George; Kurc, Tahsin M.; Pan, Tony; Cooper, Lee A.D.; Kong, Jun; Widener, Patrick; Saltz, Joel H.

    2014-01-01

    The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches. PMID:25419545

  10. Expansion analyses on low-excitation planetary nebulae with stellar images

    SciTech Connect

    Tamura, Shinichi; Shibata, K. M. (Nobeyama Radio Observatory, Minamimaki)

    1990-11-01

    The paper presents analyses of the expansion characteristics of the low-excitation, spatially unresolved planetary nebulae M1-5, M1-9, K3-66, and K3-67. The sample nebulae are divided into two groups. The first includes the genuinely compact planetary nebulae M1-5 and M1-9, based on their single-Gaussian profiles. The second includes nebulae that are unresolved because of their large distances; K3-66 and K3-67 belong to this group since they show double-Gaussian components in their emission-line profiles. Relationships between expansion velocities and I(forbidden O III 5007 A)/I(H-beta), and between electron densities and expansion velocities, give the basis for these arguments and reveal that the nebulae IC 4997, Vy2-2, and M3-27 are clearly in different phases of evolution from other low-excitation planetary nebulae. 24 refs.

  11. Imaging Erg and Jun transcription factor interaction in living cells using fluorescence resonance energy transfer analyses

    SciTech Connect

    Camuzeaux, Barbara; Heliot, Laurent; Coll, Jean. E-mail: martine.duterque@ibl.fr

    2005-07-15

    Physical interactions between transcription factors play important roles in modulating gene expression. Previous in vitro studies have shown a transcriptional synergy between Erg protein, an Ets family member, and Jun/Fos heterodimer, members of the bZip family, which requires direct Erg-Jun protein interactions. Visualization of protein interactions in living cells is a new challenge in biology. For this purpose, we generated fusion proteins of Erg, Fos, and Jun with yellow and cyan fluorescent proteins, YFP and CFP, respectively. After transient expression in HeLa cells, interactions of the resulting fusion proteins were explored by fluorescence resonance energy transfer microscopy (FRET) in fixed and living cells. FRET between YFP-Erg and CFP-Jun was monitored by using photobleaching FRET and fluorescence lifetime imaging microscopy. Both techniques revealed the occurrence of intermolecular FRET between YFP-Erg and CFP-Jun. This is stressed by loss of FRET with an YFP-Erg version carrying a point mutation in its ETS domain. These results provide evidence for the interaction of Erg and Jun proteins in living cells as a critical prerequisite of their transcriptional synergy, but also for the essential role of the Y371 residue, conserved in most Ets proteins, in this interaction.

  12. Statistical improvements in functional magnetic resonance imaging analyses produced by censoring high-motion data points.

    PubMed

    Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E

    2014-05-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement.
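    The censoring ("scrubbing") step described above can be sketched as a boolean mask over volumes, assuming a framewise displacement (FD) trace has already been computed from the realignment estimates; the 0.5 mm threshold and the FD values below are illustrative, not the study's.

```python
import numpy as np

def censor_mask(fd, threshold=0.5):
    """True = keep this volume; False = withhold it from GLM estimation."""
    return np.asarray(fd, dtype=float) <= threshold

def censor_design(X, fd, threshold=0.5):
    """Drop high-motion rows from a GLM design matrix X; return both
    the censored matrix and the keep-mask."""
    keep = censor_mask(fd, threshold)
    return X[keep], keep

# Illustrative FD trace (mm) for 6 volumes and a dummy design matrix.
fd = [0.05, 0.10, 0.80, 0.12, 0.60, 0.07]
X = np.arange(12).reshape(6, 2)
X_clean, keep = censor_design(X, fd)
```

    The two high-motion volumes are simply withheld, shrinking the design matrix from 6 rows to 4 before estimation.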

  13. Airflow analyses using thermal imaging in Arizona's Meteor Crater as part of METCRAX II

    NASA Astrophysics Data System (ADS)

    Grudzielanek, A. Martina; Vogt, Roland; Cermak, Jan; Maric, Mateja; Feigenwinter, Iris; Whiteman, C. David; Lehner, Manuela; Hoch, Sebastian W.; Krauß, Matthias G.; Bernhofer, Christian; Pitacco, Andrea

    2016-04-01

    In October 2013 the second Meteor Crater Experiment (METCRAX II) took place at the Barringer Meteorite Crater (aka Meteor Crater) in north-central Arizona, USA. Downslope-windstorm-type flows (DWF), the main research objective of METCRAX II, were measured by a comprehensive set of meteorological sensors deployed in and around the crater. During two weeks of METCRAX II, five infrared (IR) time-lapse cameras (VarioCAM® hr research & VarioCAM® High Definition, InfraTec) were installed at various locations on the crater rim to record high-resolution images of the surface temperatures within the crater from different viewpoints. Changes of surface temperature are indicative of air temperature changes induced by flow dynamics inside the crater, including the DWF. By correlating thermal IR surface temperature data with meteorological sensor data during intensive observational periods, the applicability of the IR method for representing flow dynamics can be assessed. We present evaluation results and draw conclusions regarding the application of this method for observing airflow dynamics in the crater. In addition, we show the potential of the IR method for METCRAX II in (1) visualizing airflow processes to improve understanding of these flows, and (2) analyzing cold-air flows and cold-air pooling.

  14. Statistical Improvements in Functional Magnetic Resonance Imaging Analyses Produced by Censoring High-Motion Data Points

    PubMed Central

    Siegel, Joshua S.; Power, Jonathan D.; Dubis, Joseph W.; Vogel, Alecia C.; Church, Jessica A.; Schlaggar, Bradley L.; Petersen, Steven E.

    2013-01-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring (“motion scrubbing”). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. PMID:23861343

  15. Advanced X ray Astrophysics Facility-Imaging (AXAF-I) thermal analyses using Integrated Thermal Analysis System (ITAS) program

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Benny; Cummings, Ramona

    1993-01-01

    The complex geometry and stringent thermal requirements associated with the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I) necessitate a detailed and accurate thermal analysis of the proposed system. A brief description of this geometry and the thermal requirements is included. Among the tools considered for the analysis is a PC-compatible version of the Integrated Thermal Analysis System (ITAS). Several benchmark studies were performed to evaluate the capabilities of ITAS and to compare the corresponding results with those obtained using TRASYS and SINDA. Comparative studies were conducted for a typical Space Station module. Four models were developed using various combinations of the available software packages (i.e., ITAS, SINDA, and TRASYS). Orbital heating and heat transfer calculations were performed to determine the temperature distributions along the surfaces of this module. A comparison of the temperature distributions obtained for each of the four cases is presented. Results of this investigation were used to verify the different ITAS modules, including those used for model generation, steady-state and transient orbital heating analyses, radiative and convective heat flow analyses, and SINDA/TRASYS model translation. The results suggest that ITAS is well suited to subsequent analyses of the AXAF-I.

  16. Combining satellite and seismic images to analyse the shallow structure of the Dead Sea Transform near the DESERT transect

    NASA Astrophysics Data System (ADS)

    Kesten, D.; Weber, M.; Haberland, Ch.; Janssen, Ch.; Agnon, A.; Bartov, Y.; Rabba, I.

    2008-02-01

    The left-lateral Dead Sea Transform (DST) in the Middle East is one of the largest continental strike-slip faults in the world. The southern segment of the DST in the Arava/Araba Valley between the Dead Sea and the Red Sea, called the Arava/Araba Fault (AF), has been studied in detail in the multidisciplinary DESERT (DEad SEa Rift Transect) project. Building on these results, interpretations of multi-spectral (ASTER) satellite images and seismic reflection studies have been combined here to analyse geologic structures. Whereas satellite images reveal neotectonic activity in shallow young sediments, seismic reflection data image deep faults that are possibly inactive at present. Combining the two methods allows some age constraints to be placed on the activity of individual fault strands. Although the AF is clearly the main active fault segment of the southern DST, we propose that it has accommodated only a limited part (up to 60 km) of the overall 105 km of sinistral plate motion since Miocene times. There is evidence for sinistral displacement along other faults, based on geological studies including satellite image interpretation. Furthermore, a subsurface fault is revealed ≈4 km west of the AF on two ≈E-W running seismic reflection profiles. Whereas these seismic data show a flower structure typical of strike-slip faults, this fault is not expressed in the post-Miocene sediments on the satellite image, implying that it has been inactive for the last few million years. About 1 km east of the AF another, now buried, fault was detected in the seismic, magnetotelluric and gravity studies of DESERT. Taking various lines of evidence together, we suggest that at the beginning of transform motion deformation occurred in a rather wide belt, possibly with the reactivation of older ≈N-S striking structures. Later, deformation became concentrated in the region of today's Arava Valley. Till ≈5 Ma ago there might have been other, now inactive fault traces in the vicinity

  17. Computerized multiple image analysis on mammograms: performance improvement of nipple identification for registration of multiple views using texture convergence analyses

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Sahiner, Berkman; Hadjiiski, Lubomir M.; Paramagul, Chintana

    2004-05-01

    Automated registration of multiple mammograms for CAD depends on accurate nipple identification. We developed two new image analysis techniques based on geometric and texture convergence analyses to improve the performance of our previously developed nipple identification method. A gradient-based algorithm is used to automatically track the breast boundary. The nipple search region along the boundary is then defined by geometric convergence analysis of the breast shape. Three nipple candidates are identified by detecting the changes along the gray level profiles inside and outside the boundary and the changes in the boundary direction. A texture orientation-field analysis method is developed to estimate the fourth nipple candidate based on the convergence of the tissue texture pattern towards the nipple. The final nipple location is determined from the four nipple candidates by a confidence analysis. Our training and test data sets consisted of 419 and 368 randomly selected mammograms, respectively. The nipple location identified on each image by an experienced radiologist was used as the ground truth. For 118 of the training and 70 of the test images, the radiologist could not positively identify the nipple, but provided an estimate of its location. These were referred to as invisible nipple images. In the training data set, 89.37% (269/301) of the visible nipples and 81.36% (96/118) of the invisible nipples could be detected within 1 cm of the truth. In the test data set, 92.28% (275/298) of the visible nipples and 67.14% (47/70) of the invisible nipples were identified within 1 cm of the truth. In comparison, our previous nipple identification method without using the two convergence analysis techniques detected 82.39% (248/301), 77.12% (91/118), 89.93% (268/298) and 54.29% (38/70) of the nipples within 1 cm of the truth for the visible and invisible nipples in the training and test sets, respectively. The results indicate that the nipple on mammograms can be

  18. Coupling MODIS images and agrometeorological data for agricultural water productivity analyses in the Mato Grosso State, Brazil

    NASA Astrophysics Data System (ADS)

    de C. Teixeira, Antônio H.; Victoria, Daniel C.; Andrade, Ricardo G.; Leivas, Janice F.; Bolfe, Edson L.; Cruz, Caroline R.

    2014-10-01

    Mato Grosso state, in Central West Brazil, stands out for its grain production, mainly soybean and corn, grown as first (November-March) and second (April-August) harvest crops, respectively. For water productivity (WP) analyses, MODIS products were used together with a network of weather stations. Evapotranspiration (ET) and biomass production (BIO) were acquired during the year 2012, and WP was taken as the ratio of BIO to ET. The SAFER (Simple Algorithm For Evapotranspiration Retrieving) model for ET and Monteith's radiation model for BIO were applied together, using a mask that separated the crops from other surface types. For the first harvest crop, ET, BIO and WP values above those of other surface types occurred only from November to January, with incremental values reaching 1.2 mm day-1, 67 kg ha-1 day-1 and 0.7 kg m-3, respectively; for the second harvest crops this occurred between March and May, with incremental values reaching 0.5 mm day-1, 27 kg ha-1 day-1 and 0.3 kg m-3, respectively. In both cases, the highest WP values in cropped areas during the growing seasons corresponded, in general, to the blooming to grain-filling transition. For corn, whose cultivated area is currently increasing in the Brazilian Central West region, crop water productivity (CWP), the ratio of yield to the amount of water consumed, was analyzed in the main growing regions: North, Southeast and Northeast. The Southeast presented the highest annual pixel averages for ET, BIO and CWP (1.7 mm day-1, 78 kg ha-1 day-1 and 2.2 kg m-3, respectively), while the Northeast presented the lowest (1.2 mm day-1, 52 kg ha-1 day-1 and 1.9 kg m-3). Through a soil moisture indicator, the ratio of precipitation (P) to ET, it was noted that rainfall was sufficient for a good grain yield, with P/ET lower than 1.00 only outside the crop growing seasons. The combination of MODIS images and weather stations proved to be useful for monitoring
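    As a worked example of the WP ratio defined above (WP = BIO/ET): with BIO in kg ha-1 day-1 and ET in mm day-1, one common convention converts ET to a water volume using the fact that 1 mm of water over 1 ha equals 10 m3, giving WP in kg m-3. The conversion choice and the numbers below are illustrative, not taken from the study.

```python
def water_productivity(bio_kg_ha_day, et_mm_day):
    """WP in kg of biomass per m^3 of water: 1 mm of ET over 1 ha = 10 m^3."""
    return bio_kg_ha_day / (et_mm_day * 10.0)

# Illustrative values, not from the study's pixels.
wp = water_productivity(bio_kg_ha_day=30.0, et_mm_day=2.0)  # 1.5 kg m^-3
```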

  19. Analyses of Magnetic Resonance Imaging of Cerebrospinal Fluid Dynamics Pre and Post Short and Long-Duration Space Flights

    NASA Technical Reports Server (NTRS)

    Alperin, Noam; Barr, Yael; Lee, Sang H.; Mason, Sara; Bagci, Ahmet M.

    2015-01-01

    Preliminary results are based on analyses of data from 17 crewmembers. The initial analysis compares pre- to post-flight changes in total cerebral blood flow (CBF) and craniospinal CSF flow volume. Total CBF is obtained by summation of the mean flow rates through the 4 blood vessels supplying the brain (right and left internal carotid and vertebral arteries). Volumetric flow rates were obtained using an automated lumen segmentation technique shown to have 3-4-fold improved reproducibility and accuracy over manual lumen segmentation (6). Two cohorts, 5 short-duration and 8 long-duration crewmembers, who were scanned within 3 to 8 days post landing, were included (4 short-duration crewmembers with MRI scans occurring beyond 10 days post flight were excluded). The VIIP Clinical Practice Guideline (CPG) classification is being used initially as a measure of VIIP syndrome severity. Median CPG scores of the short- and long-duration cohorts were similar, 2. Mean preflight total CBF values for the short- and long-duration cohorts were similar, 863+/-144 and 747+/-119 mL/min, respectively. Percentage CBF changes for all short-duration crewmembers were 11% or lower, within the range of normal physiological fluctuations in healthy individuals. In contrast, in 4 of the 8 long-duration crewmembers the change in CBF exceeded the range of normal physiological fluctuation; in 3 of the 4 subjects an increase in CBF was measured. Large pre- to post-flight changes in the craniospinal CSF flow volume were found in 6 of the 8 long-duration crewmembers. Box-whisker plots of the CPG scores and the percent CBF and CSF flow changes for the two cohorts are shown in Figure 4. Examples of CSF flow waveforms for a short-duration and two long-duration crewmembers (CPG 0 and 3) are shown in Figure 5. Changes in CBF and CSF flow dynamics larger than normal physiological fluctuations were observed in the long-duration crewmembers. Changes in CSF flow were more pronounced than changes in CBF. Decreased CSF flow dynamics were observed

  20. Video-image analyses of the cross-stream distribution of smoke in the near wake of a building

    SciTech Connect

    Huber, A.H.; Arya, S.P.S.

    1988-04-01

    In a wind-tunnel study, recorded video images of the top view of smoke dispersion in the wake of a building were analyzed. A continuous source of smoke was emitted at floor level, midway along the leeward side of the building. The technique and usefulness of analyzing video images of smoke is demonstrated in a study of building effects on smoke dispersion. The presentation discusses how the video-image intensity is corrected for background intensity and then normalized to a scale of 0 to 100%. Profiles of the normalized image mean intensity are compared with similar profiles for the mean concentration of hydrocarbon tracer. The distributions of intensity of cross-stream profiles are discussed. These distributions were analyzed in time and space. Also, time-averaged cross-stream profiles of mean, standard deviation, and other statistics of image intensity are compared with traditional concentration measurements for symmetric wake flow.
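    The intensity processing described above (background correction followed by normalization to a 0-100% scale) can be sketched as follows. Normalizing to the image maximum is one plausible reading of the abstract, and the pixel values are made up.

```python
import numpy as np

def normalize_intensity(image, background):
    """Subtract background intensity, then scale the image to 0-100%."""
    corrected = np.asarray(image, float) - np.asarray(background, float)
    corrected = np.clip(corrected, 0.0, None)  # no negative smoke signal
    peak = corrected.max()
    return corrected if peak == 0 else 100.0 * corrected / peak

# Illustrative 2x2 frame and a uniform background level.
img = [[10.0, 60.0], [35.0, 10.0]]
bg = [[10.0, 10.0], [10.0, 10.0]]
norm = normalize_intensity(img, bg)
```

    Cross-stream mean and standard-deviation profiles, as compared with tracer concentrations in the study, would then be statistics of such normalized frames over time.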

  1. Analysing the Progression Rates of Macular Lesions with Autofluorescence Imaging Modes in Dry Age-Related Macular Degeneration

    PubMed Central

    Olcay, Kenan; Çakır, Akın; Sönmez, Murat; Düzgün, Eyüp; Yıldırım, Yıldıray

    2015-01-01

    Objectives: In this study we aimed to compare the sensitivity of blue-light fundus autofluorescence (FAF) and near-infrared autofluorescence (NI-AF) imaging for determining the progression rates of macular lesions in dry age-related macular degeneration (AMD). Materials and Methods: The study was designed retrospectively and included patients diagnosed with intermediate and advanced stage dry AMD. Best corrected visual acuities (BCVA) and FAF and NI-AF images were recorded in 46 eyes of 33 patients. Lesion borders were drawn manually on the images using Heidelberg Eye Explorer software and lesion areas were calculated using Microsoft Excel software. BCVA and lesion areas were compared with each other. Results: Patients’ mean follow-up time was 30.98±13.30 months. The lesion area progression rates were 0.85±0.93 mm2/y in FAF and 0.93±1.01 mm2/y in NI-AF, showing statistically significant correlation with each other (r=0.883; p<0.01). Both imaging methods were moderately correlated with visual acuity impairment (r=0.362; p<0.05 and r=0.311; p<0.05, respectively). In addition, larger lesions showed higher progression rates than smaller ones with both imaging methods. Conclusion: NI-AF imaging is as important and effective as FAF imaging for follow-up of dry AMD patients. PMID:27800240
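    The progression rates reported above (mm2 per year) follow directly from two lesion-area measurements and the elapsed follow-up time; a minimal sketch with illustrative numbers, not patient data:

```python
def progression_rate(area_start_mm2, area_end_mm2, follow_up_months):
    """Lesion-area progression rate in mm^2 per year."""
    return (area_end_mm2 - area_start_mm2) / (follow_up_months / 12.0)

# Illustrative: a lesion growing from 4.0 to 6.0 mm^2 over 24 months.
rate = progression_rate(4.0, 6.0, 24.0)  # 1.0 mm^2/y
```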

  2. A method to analyse observer disagreement in visual grading studies: example of assessed image quality in paediatric cerebral multidetector CT images.

    PubMed

    Ledenius, K; Svensson, E; Stålhammar, F; Wiklund, L-M; Thilander-Klang, A

    2010-07-01

    The purpose was to demonstrate a non-parametric statistical method that can identify and explain the components of observer disagreement in terms of systematic disagreement as well as additional individual variability, in visual grading studies. As an example, the method was applied to a study where the effect of reduced tube current on diagnostic image quality in paediatric cerebral multidetector CT (MDCT) images was investigated. Quantum noise, representing dose reductions equivalent to steps of 20 mA, was artificially added to the raw data of 25 retrospectively selected paediatric cerebral MDCT examinations. Three radiologists, blindly and randomly, assessed the resulting images from two different levels of the brain with regard to the reproduction of high- and low-contrast structures and overall image quality. Images from three patients were assessed twice for the analysis of intra-observer disagreement. The intra-observer disagreement in test-retest assessments could mainly be explained by a systematic change towards lower image quality the second time the image was reviewed. The inter-observer comparisons showed that the paediatric radiologist was more critical of the overall image quality, while the neuroradiologists were more critical of the reproduction of the basal ganglia. Differences between the radiologists regarding the extent to which they used the whole classification scale were also found. The statistical method used was able to identify and separately measure a presence of bias apart from additional individual variability within and between the radiologists which is, at the time of writing, not attainable by any other statistical approach suitable for paired, ordinal data.
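    One commonly used component of this kind of non-parametric analysis of paired ordinal ratings is the relative position (RP), a measure of systematic disagreement: RP = P(B > A) - P(A > B), computed over the two raters' marginal rating distributions. The sketch below illustrates only that single measure, not the full method used in the study, and the ratings are made up.

```python
def relative_position(a, b):
    """RP = P(b > a) - P(a > b), comparing all pairs across the two
    raters' marginal distributions of ordinal ratings."""
    n = len(a) * len(b)
    gt = sum(y > x for x in a for y in b)  # pairs where rater B is higher
    lt = sum(y < x for x in a for y in b)  # pairs where rater A is higher
    return (gt - lt) / n

# Illustrative ratings: rater B systematically grades one step higher.
rp = relative_position([1, 1, 2], [2, 2, 3])
```

    A positive RP indicates that rater B systematically uses higher scale categories than rater A; RP near zero with substantial scatter would instead point to individual variability.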

  3. Let there be bioluminescence: development of a biophotonic imaging platform for in situ analyses of oral biofilms in animal models.

    PubMed

    Merritt, Justin; Senpuku, Hidenobu; Kreth, Jens

    2016-01-01

    In the current study, we describe a novel biophotonic imaging-based reporter system that is particularly useful for the study of virulence in polymicrobial infections and interspecies interactions within animal models. A suite of luciferase enzymes was compared using three early colonizing species of the human oral flora (Streptococcus mutans, Streptococcus gordonii and Streptococcus sanguinis) to determine the utility of the different reporters for multiplexed imaging studies in vivo. Using the multiplex approach, we were able to track individual species within a dual-species oral infection model in mice with both temporal and spatial resolution. We also demonstrate how biophotonic imaging of multiplexed luciferase reporters could be adapted for real-time quantification of bacterial gene expression in situ. By creating an inducible dual-luciferase expressing reporter strain of S. mutans, we were able to exogenously control and measure expression of nlmAB (encoding the bacteriocin mutacin IV) within mice to assess its importance for the persistence ability of S. mutans in the oral cavity. The imaging system described in the current study circumvents many of the inherent limitations of current animal model systems, which should now make it feasible to test hypotheses that were previously impractical to model.

  4. Prediction of neural differentiation fate of rat mesenchymal stem cells by quantitative morphological analyses using image processing techniques.

    PubMed

    Kazemimoghadam, Mahdieh; Janmaleki, Mohsen; Fouani, Mohamad Hassan; Abbasi, Sara

    2015-02-01

    Differentiation of bone marrow mesenchymal stem cells (BMSCs) into neural cells has received significant attention in recent years. However, there is still no practical method to evaluate the differentiation process non-invasively. Cell quality evaluation is still limited to conventional techniques based on extracting genes or proteins from the cells. These techniques are invasive, costly, time consuming, and must be performed by experts in suitably equipped laboratories. Moreover, they cannot anticipate the future status of cells. Recently, cell morphology has been introduced as a feasible way of monitoring cell behavior because of its relationship with cell proliferation, function and differentiation. In this study, rat BMSCs were induced to differentiate into neurons. Phase contrast images of cells taken at certain intervals were then subjected to a series of image processing steps, and cell morphology features were calculated. To validate the viability of applying image-based approaches for estimating the quality of the differentiation process, neural-specific markers were measured experimentally throughout the induction. The strong correlation between the quantitative imaging metrics and the experimental outcomes revealed the capability of the proposed approach as an auxiliary method of assessing cell behavior during differentiation.

  5. Tract-Specific Analyses of Diffusion Tensor Imaging Show Widespread White Matter Compromise in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Shukla, Dinesh K.; Keehn, Brandon; Muller, Ralph-Axel

    2011-01-01

    Background: Previous diffusion tensor imaging (DTI) studies have shown white matter compromise in children and adults with autism spectrum disorder (ASD), which may relate to reduced connectivity and impaired function of distributed networks. However, tract-specific evidence remains limited in ASD. We applied tract-based spatial statistics (TBSS)…

  6. Emerging and vector-borne diseases: Role of high spatial resolution and hyperspectral images in analyses and forecasts

    NASA Astrophysics Data System (ADS)

    Wilson, Mark L.

    Many infectious diseases that are emerging or transmitted by arthropod vectors have a strong link to landscape features. Depending on the source of infection or the ecology of the transmitting vector, micro-habitat characteristics at the spatial scale of square meters or less may be important. Recently, satellite images have been used to classify habitats in an attempt to understand associations with infectious diseases. This article addresses whether high-spatial-resolution and hyperspectral (HSRH) images can be useful in studies of such infectious diseases. The nature of the questions that such studies address, and the desired accuracy and precision of the answers, will determine the utility of HSRH data. The need for such data should be based on the goals of the effort. Examples of kinds of questions and applications are discussed. The research implications and public health applications may depend on available analytic tools as well as epidemiological observations.

  7. Use of Comet assay to assess DNA damage in patients infected by Helicobacter pylori: comparisons between visual and image analyses.

    PubMed

    Ladeira, Marcelo S P; Rodrigues, Maria A M; Freire-Maia, Dértia V; Salvadori, Daisy M F

    2005-09-01

    Studies of DNA damage in gastric epithelial cells of Helicobacter pylori (H. pylori)-infected patients are conflicting, possibly due to different methods used for scoring DNA damage by Comet assay. Therefore, we compared the sensitivity of visual microscopic analysis (arbitrary units-scores and comets%) and image analysis system (tail moment), in the gastric epithelial cells from the antrum and corpus of 122 H. pylori-infected and 32 non-infected patients. The feasibility of cryopreserved peripheral blood lymphocytes and whole-blood cells for DNA damage biomonitoring was also investigated. In the antrum, the levels of DNA damage were significantly higher in H. pylori-infected patients with gastritis than in non-infected patients with normal mucosa, when evaluated by image analysis system, arbitrary units and comets%. In the corpus, the comets% was not sufficiently sensitive to detect the difference between H. pylori-infected patients with gastritis and non-infected patients with normal mucosa. The image analysis system was sensitive enough to detect differences between non-infected patients and H. pylori-infected patients with mild gastritis and between infected patients with moderate and severe gastritis, in both antrum and corpus, while arbitrary units and comets% were unable to detect these differences. In cryopreserved peripheral blood lymphocytes, the levels of DNA damage (tail moment) were significantly higher in H. pylori-infected patients with moderate and severe gastritis than in non-infected patients. Overall, our results indicate that the image analysis system is more sensitive and adequate to measure the levels of DNA damage in gastric epithelial cells than the other methods assayed. PMID:16084756
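    The image-analysis endpoint used above, the tail moment, is commonly computed as the comet tail length multiplied by the fraction of DNA in the tail. Exact definitions vary between software packages, so the sketch below is one standard convention with made-up numbers, not the study's software.

```python
def tail_moment(tail_length_um, tail_dna_percent):
    """One common comet-assay tail moment: tail length x tail DNA fraction."""
    return tail_length_um * (tail_dna_percent / 100.0)

# Illustrative: a 40 um tail containing 25% of the cell's DNA.
tm = tail_moment(tail_length_um=40.0, tail_dna_percent=25.0)  # 10.0
```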

  8. Unsupervised clustering analyses of features extraction for a caries computer-assisted diagnosis using dental fluorescence images

    NASA Astrophysics Data System (ADS)

    Bessani, Michel; da Costa, Mardoqueu M.; Lins, Emery C. C. C.; Maciel, Carlos D.

    2014-02-01

    Computer-assisted diagnosis (CAD) is performed by systems with embedded knowledge. These systems work as a second opinion to the physician and use patient data to infer diagnoses for health problems. Caries is the most common oral disease and directly affects both individuals and society. Here we propose the use of dental fluorescence images as input to a caries computer-assisted diagnosis. We use texture descriptors together with statistical pattern recognition techniques to measure the descriptors' performance on the caries classification task. The data set consists of 64 fluorescence images of in vitro healthy and carious teeth, including different surfaces and lesions already diagnosed by an expert. Texture feature extraction was performed on the fluorescence images using the RGB and YCbCr color spaces, generating 35 descriptors for each sample. Principal components analysis (PCA) was performed for data interpretation and dimensionality reduction. Finally, unsupervised clustering was employed to analyze the relation between the output labeling and the diagnosis of the expert. The PCA result showed a high correlation between the extracted features; seven components were sufficient to represent 91.9% of the information in the original feature vectors. The unsupervised clustering output was compared with the expert classification, resulting in an accuracy of 96.88%. The results show the high accuracy of the proposed approach in identifying carious and non-carious teeth. Therefore, the development of a CAD system for caries using such an approach appears to be promising.
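    The pipeline described above (texture descriptors, PCA retaining ~90% of the variance, unsupervised clustering, then agreement with the expert labels) can be sketched end to end on synthetic data. The synthetic features, the variance cutoff and the tiny 2-means loop below are illustrative stand-ins, not the study's 64 images or its software.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "texture descriptors": 32 healthy and 32 carious samples,
# 35 features each, standing in for the study's real feature vectors.
healthy = rng.normal(0.0, 0.3, size=(32, 35))
carious = rng.normal(2.0, 0.3, size=(32, 35))
X = np.vstack([healthy, carious])
labels = np.array([0] * 32 + [1] * 32)

# PCA via SVD: keep enough components to explain ~90% of the variance.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(explained, 0.9)) + 1
scores = Xc @ Vt[:k].T

# Tiny 2-means on the PCA scores (stand-in for the clustering step).
centers = scores[[0, -1]].copy()
for _ in range(20):
    dist = np.linalg.norm(scores[:, None, :] - centers[None, :, :], axis=2)
    pred = dist.argmin(axis=1)
    centers = np.array([scores[pred == j].mean(axis=0) for j in (0, 1)])

# Cluster ids are arbitrary, so take the better of the two matchings
# against the expert labels.
accuracy = max((pred == labels).mean(), (pred != labels).mean())
```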

  9. Combined magnetic resonance and diffusion tensor imaging analyses provide a powerful tool for in vivo assessment of deformation along human muscle fibers.

    PubMed

    Pamuk, Uluç; Karakuzu, Agah; Ozturk, Cengizhan; Acar, Burak; Yucesoy, Can A

    2016-10-01

Muscle fiber direction strain provides invaluable information for characterizing muscle function. However, methods to study this for human muscles in vivo are lacking. Using magnetic resonance (MR) imaging based deformation analyses and diffusion tensor (DT) imaging based tractography combined, we aimed to assess muscle fiber direction local tissue deformations within the human medial gastrocnemius (GM) muscle. Healthy female subjects (n=5, age=27±1 years) were positioned prone within the MR scanner in a relaxed state with the ankle angle fixed at 90°. The knee was brought to flexion (140.8±3.0°) (undeformed state). Sets of 3D high-resolution MR and DT images were acquired. This protocol was repeated at an extended knee joint position (177.0±1.0°) (deformed state). Tractography and the Demons nonrigid registration algorithm were utilized to calculate local deformations along muscle fascicles. Undeformed state images were also transformed by a synthetic rigid body motion to calculate strain errors. Mean strain errors were significantly smaller than mean fiber direction strains (lengthening: 0.2±0.1% vs. 8.7±8.5%; shortening: 3.3±0.9% vs. 7.5±4.6%). Shortening and lengthening (up to 23.3% and 116.7%, respectively) occur simultaneously along individual fascicles despite imposed GM lengthening. Along-fiber shear strains confirm the presence of substantial shearing between fascicles. Mean fiber direction strains of different tracts also show a non-uniform distribution. Inhomogeneity of fiber strain indicates epimuscular myofascial force transmission. We conclude that MR and DT imaging analyses combined provide a powerful tool for quantifying deformation along human muscle fibers in vivo. This can contribute substantially to a better understanding of normal and pathological muscle function and of the mechanisms of treatment techniques. PMID:27429070
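Once corresponding points along a fascicle are known in the undeformed and deformed states (from tractography plus nonrigid registration), fiber direction strain reduces to per-segment length changes. A minimal sketch, with a toy fascicle rather than real tract data:

```python
import numpy as np

def fiber_direction_strains(undeformed, deformed):
    """Percent strain of each successive segment along a tract (positive =
    lengthening), from corresponding (N, 3) point sets in the two states."""
    l0 = np.linalg.norm(np.diff(undeformed, axis=0), axis=1)
    l1 = np.linalg.norm(np.diff(deformed, axis=0), axis=1)
    return 100.0 * (l1 - l0) / l0

# Toy fascicle along z, uniformly stretched 10% in the deformed state
u = np.column_stack([np.zeros(5), np.zeros(5), np.linspace(0.0, 40.0, 5)])
d = u * np.array([1.0, 1.0, 1.1])
strains = fiber_direction_strains(u, d)
```

With real tracts the per-segment values are non-uniform, which is exactly the inhomogeneity the study interprets as myofascial force transmission.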

  11. Androgen Receptor Functional Analyses by High Throughput Imaging: Determination of Ligand, Cell Cycle, and Mutation-Specific Effects

    PubMed Central

    Szafran, Adam T.; Szwarc, Maria; Marcelli, Marco; Mancini, Michael A.

    2008-01-01

    Background Understanding how androgen receptor (AR) function is modulated by exposure to steroids, growth factors or small molecules can have important mechanistic implications for AR-related disease therapies (e.g., prostate cancer, androgen insensitivity syndrome, AIS), and in the analysis of environmental endocrine disruptors. Methodology/Principal Findings We report the development of a high throughput (HT) image-based assay that quantifies AR subcellular and subnuclear distribution, and transcriptional reporter gene activity on a cell-by-cell basis. Furthermore, simultaneous analysis of DNA content allowed determination of cell cycle position and permitted the analysis of cell cycle dependent changes in AR function in unsynchronized cell populations. Assay quality for EC50 coefficients of variation were 5–24%, with Z' values reaching 0.91. This was achieved by the selective analysis of cells expressing physiological levels of AR, important because minor over-expression resulted in elevated nuclear speckling and decreased transcriptional reporter gene activity. A small screen of AR-binding ligands, including known agonists, antagonists, and endocrine disruptors, demonstrated that nuclear translocation and nuclear “speckling” were linked with transcriptional output, and specific ligands were noted to differentially affect measurements for wild type versus mutant AR, suggesting differing mechanisms of action. HT imaging of patient-derived AIS mutations demonstrated a proof-of-principle personalized medicine approach to rapidly identify ligands capable of restoring multiple AR functions. Conclusions/Significance HT imaging-based multiplex screening will provide a rapid, systems-level analysis of compounds/RNAi that may differentially affect wild type AR or clinically relevant AR mutations. PMID:18978937
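The Z' statistic quoted above is a standard screening-quality measure, computed from positive and negative control distributions. An illustrative implementation with hypothetical per-well readouts (the formula is standard; the numbers are not from the paper):

```python
import numpy as np

def z_prime(pos, neg):
    """Z' assay-quality statistic: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Z' > 0.5 is conventionally taken to indicate an excellent assay window."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Hypothetical reporter readouts for positive and negative controls
pos_ctrl = [100.0, 102.0, 98.0, 101.0, 99.0]
neg_ctrl = [10.0, 11.0, 9.0, 10.0, 10.0]
z = z_prime(pos_ctrl, neg_ctrl)
```

Tight control distributions relative to the window between their means are what push Z' toward 1, as in the assay described above.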

  12. Section method for projected structures of icosahedral quasicrystals and its application to electron-microscopy-image and surface analyses.

    PubMed

    Yamamoto, Akiji

    2004-11-01

    A section method for projected structures of icosahedral quasicrystals (IQCs) is given. A structure projected along a specified direction can be calculated directly from a six-dimensional periodic structure by this method. The method concludes that all peaks in high-resolution transmission electron-microscopy images of an IQC have different projected atom densities in general and leads to different chemical compositions and densities for all atom layers, suggesting that all surfaces of an IQC are different. Its application to icosahedral Al-Pd-Mn quasicrystals is shown.

  13. Functional assessment of glioma pathogenesis by in vivo multi-parametric magnetic resonance imaging and in vitro analyses

    PubMed Central

    Yao, Nai-Wei; Chang, Chen; Lin, Hsiu-Ting; Yen, Chen-Tung; Chen, Jeou-Yuan

    2016-01-01

    Gliomas are aggressive brain tumors with poor prognosis. In this study, we report a novel approach combining both in vivo multi-parametric MRI and in vitro cell culture assessments to evaluate the pathogenic development of gliomas. Osteopontin (OPN), a pleiotropic factor, has been implicated in the formation and progression of various human cancers, including gliomas, through its functions in regulating cell proliferation, survival, angiogenesis, and migration. Using rat C6 glioma model, the combined approach successfully monitors the acquisition and decrease of cancer hallmarks. We show that knockdown of the expression of OPN reduces C6 cell proliferation, survival, viability and clonogenicity in vitro, and reduces tumor burden and prolongs animal survival in syngeneic rats. OPN depletion is associated with reduced tumor growth, decreased angiogenesis, and an increase of tumor-associated metabolites, as revealed by T2-weighted images, diffusion-weighted images, Ktrans maps, and 1H-MRS, respectively. These strategies allow us to define an important role of OPN in conferring cancer hallmarks, which can be further applied to assess the functional roles of other candidate genes in glioma. In particular, the non-invasive multi-parametric MRI measurement of cancer hallmarks related to proliferation, angiogenesis and altered metabolism may serve as a useful tool for diagnosis and for patient management. PMID:27198662

  14. Single-Cell Imaging and Spectroscopic Analyses of Cr(VI) Reduction on the Surface of Bacterial Cells

    SciTech Connect

    Wang, Yuanmin; Sevinc, Papatya C.; Belchik, Sara M.; Fredrickson, Jim K.; Shi, Liang; Lu, H. Peter

    2013-01-22

We investigate single-cell reduction of toxic Cr(VI) by the dissimilatory metal-reducing bacterium Shewanella oneidensis MR-1 (MR-1), an important bioremediation process, using Raman spectroscopy and scanning electron microscopy (SEM) combined with energy-dispersive X-ray spectroscopy (EDX). Our experiments indicate that the toxic and highly soluble Cr(VI) can be efficiently reduced to the less toxic and non-soluble Cr2O3 nanoparticles by MR-1. Cr2O3 is observed to emerge as nanoparticles adsorbed on the cell surface and its chemical nature is identified by EDX imaging and Raman spectroscopy. Co-localization of Cr2O3 and cytochromes by EDX imaging and Raman spectroscopy suggests a terminal reductase role for MR-1 surface-exposed cytochromes MtrC and OmcA. Our experiments revealed that the cooperation of surface proteins OmcA and MtrC makes the reduction reaction most efficient, and the sequence of the reducing reactivity of the MR-1 is: wild type > single mutant ΔmtrC or mutant ΔomcA > double mutant (ΔomcA-ΔmtrC). Moreover, our results also suggest that the direct microbial Cr(VI) reduction and Fe(II) (hematite)-mediated Cr(VI) reduction mechanisms may co-exist in the reduction processes.

  16. Calibration of remote mineralogy algorithms using modal analyses of Apollo soils by X-ray diffraction and microscopic spectral imaging

    NASA Astrophysics Data System (ADS)

    Crites, S. T.; Taylor, J.; Martel, L.; Lucey, P. G.; Blake, D. F.

    2012-12-01

    We have launched a project to determine the modal mineralogy of over 100 soils from all Apollo sites using quantitative X-ray diffraction (XRD) and microscopic hyperspectral imaging at visible, near-IR and thermal IR wavelengths. The two methods are complementary: XRD is optimal for obtaining the major mineral modes because its measurement is not limited to the surfaces of grains, whereas the hyperspectral imaging method allows us to identify minerals present even down to a single grain, well below the quantitative detection limit of XRD. Each soil is also sent to RELAB to obtain visible, near-IR, and thermal-IR reflectance spectra. The goal is to use quantitative mineralogy in comparison with spectra of the same soils and with remote sensing data of the sampling stations to improve our ability to extract quantitative mineralogy from remote sensing observations. Previous groups have demonstrated methods for using lab mineralogy to validate remote sensing. The LSCC pioneered the method of comparing mineralogy to laboratory spectra of the same soils (Pieters et al. 2002); Blewett et al. (1997) directly compared remote sensing results for sample sites with lab measurements of representative soils from those sites. We are building upon the work of both groups by expanding the number of soils measured to 128, with an emphasis on immature soils to support recent work studying fresh exposures like crater central peaks, and also by incorporating the recent high spatial and spectral resolution data sets over expanded wavelength ranges (e.g. Diviner TIR, M3 hyperspectral VNIR) not available at the time of the previous studies. We have thus far measured 32 Apollo 16 soils using quantitative XRD and are continuing with our collection of soils from the other landing sites. 
We have developed a microscopic spectral imaging system that includes TIR, VIS, and NIR capabilities and have completed proof-of-concept scans of mineral separates and preliminary lunar soil scans with plans

  17. Evaluating Climate Causation of Conflict in Darfur Using Multi-temporal, Multi-resolution Satellite Image Datasets With Novel Analyses

    NASA Astrophysics Data System (ADS)

    Brown, I.; Wennbom, M.

    2013-12-01

Climate change, population growth and changes in traditional lifestyles have led to instabilities in traditional demarcations between neighboring ethnic and religious groups in the Sahel region. This has resulted in a number of conflicts as groups resort to arms to settle disputes. Such disputes often centre on, or are justified by, competition for resources. The conflict in Darfur has been controversially explained by resource scarcity resulting from climate change. Here we analyse established methods of using satellite imagery to assess vegetation health in Darfur. Multi-decadal time series of observations are available using low spatial resolution visible-near infrared imagery. Typically, normalized difference vegetation index (NDVI) analyses are produced to describe changes in vegetation 'greenness' or 'health'. Such approaches have been widely used to evaluate the long-term development of vegetation in relation to climate variations across a wide range of environments from the Arctic to the Sahel. These datasets typically measure peak NDVI observed over a given interval and may introduce bias. It is furthermore unclear how the spatial organization of sparse vegetation may affect low resolution NDVI products. We develop and assess alternative measures of vegetation including descriptors of the growing season, wetness and resource availability. Expanding the range of parameters used in the analysis reduces our dependence on peak NDVI. Furthermore, these descriptors provide a better characterization of the growing season than the single NDVI measure. Using multi-sensor data we combine high temporal/moderate spatial resolution data with low temporal/high spatial resolution data to improve the spatial representativity of the observations and to provide improved spatial analysis of vegetation patterns. The approach places the high resolution observations in the NDVI context space using a longer time series of lower resolution imagery. The vegetation descriptors
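The NDVI measure the abstract builds on is a simple band ratio. A minimal sketch; the reflectance values are illustrative, not taken from the Darfur imagery:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red),
    in [-1, 1]; healthy vegetation reflects strongly in NIR and absorbs red."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Illustrative reflectances: dense vegetation vs. sparse/bare Sahelian soil
vegetated = ndvi(0.50, 0.05)
bare = ndvi(0.30, 0.25)
```

Peak-NDVI compositing then takes the maximum of such values over a season per pixel, which is the step the authors argue may introduce bias.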

  18. Nonintrusive Finger-Vein Recognition System Using NIR Image Sensor and Accuracy Analyses According to Various Factors.

    PubMed

    Pham, Tuyen Danh; Park, Young Ho; Nguyen, Dat Tien; Kwon, Seung Yong; Park, Kang Ryoung

    2015-01-01

    Biometrics is a technology that enables an individual person to be identified based on human physiological and behavioral characteristics. Among biometrics technologies, face recognition has been widely used because of its advantages in terms of convenience and non-contact operation. However, its performance is affected by factors such as variation in the illumination, facial expression, and head pose. Therefore, fingerprint and iris recognitions are preferred alternatives. However, the performance of the former can be adversely affected by the skin condition, including scarring and dryness. In addition, the latter has the disadvantages of high cost, large system size, and inconvenience to the user, who has to align their eyes with the iris camera. In an attempt to overcome these problems, finger-vein recognition has been vigorously researched, but an analysis of its accuracies according to various factors has not received much attention. Therefore, we propose a nonintrusive finger-vein recognition system using a near infrared (NIR) image sensor and analyze its accuracies considering various factors. The experimental results obtained with three databases showed that our system can be operated in real applications with high accuracy; and the dissimilarity of the finger-veins of different people is larger than that of the finger types and hands. PMID:26184214

  20. Overall image of nuclear tests and their human effects at Semipalatinsk: an attempt at analyses based on verbal data.

    PubMed

    Matsuo, Masatsugu; Kawano, Noriyuki; Satoh, Kenichi; Apsalikov, Kazbek N; Moldagaliev, Targat

    2006-02-01

The present paper is part of an attempt at reconstructing the realities of the nuclear tests and their human effects near Semipalatinsk, Kazakhstan. As a first step, it tries to reconstruct the overall image of the nuclear tests and their human effects. Our data are 199 written testimonies of those affected by radiation, which were collected in 2002 and 2003. We statistically processed them, categorized the words and expressions that occurred most frequently in the testimonies, and obtained some forty categories representing the experiences, feelings, and desires of those affected by radiation. Next, we conducted a principal component analysis of the categories. The result shows: (1) The experiences of the nuclear tests are arranged along the time axis, with direct experiences of the nuclear tests forming one coherent part of the perception and memory, and with other subsequent experiences forming another. (2) Of the latter, we can discern a core of the experiences on human effects such as "disease," "death," "family," "radiation," and so on. (3) And around this core, we see two different trends: one pointing to the current distress and plight, and the other pointing to future fear and hope.

  1. Utilizing magnetic resonance imaging logs, openhole logs, and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

Taylor, D.A.; Morganti, J.K.; White, H.J.; Noblett, B.R.

    1996-01-01

    Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.

  3. An automatic generation of non-uniform mesh for CFD analyses of image-based multiscale human airway models

    NASA Astrophysics Data System (ADS)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2014-11-01

The authors have developed a method to automatically generate non-uniform CFD meshes for image-based human airway models. The sizes of the generated tetrahedral elements vary in both radial and longitudinal directions to account for the boundary layer and the multiscale nature of pulmonary airflow. The proposed method takes advantage of our previously developed centerline-based geometry reconstruction method. In order to generate the mesh branch by branch in parallel, we used the open-source programs Gmsh and TetGen for surface and volume meshes, respectively. Both programs can specify element sizes by means of a background mesh. The size of an arbitrary element in the domain is a function of wall distance, element size on the wall, and element size at the center of the airway lumen. The element sizes on the wall are computed based on local flow rate and airway diameter. The total number of elements in the non-uniform mesh (10 M) was about half of that in the uniform mesh, although the computational time for the non-uniform mesh was about twice as long (170 min). The proposed method generates CFD meshes with fine elements near the wall and smooth variation of element size in the longitudinal direction, which are required, e.g., for simulations with high flow rate. NIH Grants R01-HL094315, U01-HL114494, and S10-RR022421. Computer time provided by XSEDE.
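A sizing field of the kind described (element size as a function of wall distance, wall size, and centerline size) can be sketched as below. The abstract does not give the exact functional form, so the linear blend here is an assumption for illustration:

```python
import numpy as np

def element_size(wall_dist, s_wall, s_center, radius):
    """Target edge length at a point in the lumen: blends linearly from
    s_wall at the airway wall (fine, boundary-layer resolution) to
    s_center on the centerline (coarser), clipped at the lumen radius."""
    t = np.clip(np.asarray(wall_dist, float) / radius, 0.0, 1.0)
    return s_wall + (s_center - s_wall) * t

# Sizes (arbitrary units) at the wall, mid-lumen, centerline, and beyond
sizes = element_size(np.array([0.0, 0.5, 1.0, 2.0]),
                     s_wall=0.05, s_center=0.4, radius=1.0)
```

A background mesh carrying such a field is what Gmsh and TetGen consume to grade the tetrahedra radially.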

  4. Textural analyses of carbon fiber materials by 2D-FFT of complex images obtained by high frequency eddy current imaging (HF-ECI)

    NASA Astrophysics Data System (ADS)

    Schulze, Martin H.; Heuer, Henning

    2012-04-01

Carbon fiber based materials are used in many lightweight structures in aeronautical, automotive, machine, and civil engineering applications. With the increasing automation of the production process for CFRP laminates, a manual optical inspection of each resin transfer molding (RTM) layer is not practicable. Due to the limitation to surface inspection, the quality parameters of multilayer 3-dimensional materials cannot be observed by optical systems. Imaging Eddy-Current (EC) NDT is the only suitable inspection method for non-resin materials in the textile state that allows an inspection of surface and hidden layers in parallel. The HF-ECI method has the capability to measure layer displacements (misaligned angle orientations) and gap sizes in a multilayer carbon fiber structure. The EC technique uses the variation of the electrical conductivity of carbon based materials to obtain material properties. Beside the determination of textural parameters like layer orientation and gap sizes between rovings, the method can also detect foreign polymer particles and fuzzy balls, and visualize undulations. For all of these typical parameters, an imaging classification process chain based on a high-resolution directional EC-imaging device named EddyCus® MPECS and a 2D-FFT with adapted preprocessing algorithms is developed.
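The core of the 2D-FFT orientation analysis is locating the dominant spectral peak of a periodic texture. A minimal sketch on a synthetic stripe image (the preprocessing and classification chain of the paper are not reproduced here):

```python
import numpy as np

def dominant_wavevector_angle_deg(img):
    """Angle (degrees, mod 180) of the strongest non-DC peak in the 2-D FFT
    magnitude spectrum, i.e. the direction of the dominant spatial frequency."""
    F = np.fft.fftshift(np.abs(np.fft.fft2(img)))
    cy, cx = F.shape[0] // 2, F.shape[1] // 2
    F[cy, cx] = 0.0                      # suppress the DC term
    ky, kx = np.unravel_index(np.argmax(F), F.shape)
    return np.degrees(np.arctan2(ky - cy, kx - cx)) % 180.0

# Synthetic "roving" texture: sinusoid varying along x (wave vector at 0 degrees)
y, x = np.mgrid[0:128, 0:128]
angle = dominant_wavevector_angle_deg(np.sin(2 * np.pi * x / 16.0))
```

Comparing such peak angles between layers would reveal the misaligned-angle orientations mentioned above; peak spacing likewise encodes roving pitch and gap periodicity.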

  5. Genome architecture studied by nanoscale imaging: analyses among bacterial phyla and their implication to eukaryotic genome folding.

    PubMed

    Takeyasu, K; Kim, J; Ohniwa, R L; Kobori, T; Inose, Y; Morikawa, K; Ohta, T; Ishihama, A; Yoshimura, S H

    2004-01-01

The proper function of the genome largely depends on the higher order architecture of the chromosome. Our previous application of nanotechnology to the questions regarding the structural basis for such macromolecular dynamics has shown that the higher order architecture of the Escherichia coli genome (nucleoid) is achieved via several steps of DNA folding (Kim et al., 2004). In this study, the hierarchy of genome organization was compared among E. coli, Staphylococcus aureus and Clostridium perfringens. A one-molecule-imaging technique, atomic force microscopy (AFM), was applied to E. coli cells on a cover glass that were successively treated with a detergent, and demonstrated that the nucleoids consist of a fundamental fibrous structure with a diameter of 80 nm that was further dissected into a 40-nm fiber. An application of this on-substrate procedure to the S. aureus and the C. perfringens nucleoids revealed that they also possessed the 40- and 80-nm fibers that were sustainable in the mild detergent solution. The E. coli nucleoid dynamically changed its structure during cell growth; the 80-nm fibers releasable from the cell could be transformed into a tightly packed state depending upon the expression of Dps. However, the S. aureus and the C. perfringens nucleoids never underwent such tight compaction when they reached stationary phase. Bioinformatic analysis suggested that this was possibly due to the lack of a nucleoid protein, Dps, in both species. AFM analysis revealed that both the mitotic chromosome and the interphase chromatin of human cells were also composed of 80-nm fibers. Taking all of this together, we propose a structural model of the bacterial nucleoid in which a fundamental mechanism of chromosome packing is common in both prokaryotes and eukaryotes.

  6. Comparative quantitative study of Ki-67 antibody staining in 78 B and T cell malignant lymphoma (ML) using two image analyser systems.

    PubMed

    Caulet, S; Lesty, C; Raphael, M; Schoevaert, D; Brousset, P; Binet, J L; Diebold, J; Delsol, G

    1992-06-01

    Total Ki-67 stained area percentage was studied in 32 B and 46 T malignant lymphomas (ML) using two different image analyser systems (TAS, Leitz; SAMBA TM 2005, TITN) respectively. The total Ki-67 area percentage was highly correlated to the number of Ki-67 positive cellular profiles (B-ML, r = 0.93; T-ML, r = 0.88), indicating that area percentage is a reliable alternative method to the manual cell counting. Image analysis allows quicker measurements, appropriate to large and strictly lymphomatous regions. The cell image processor (SAMBA TM 2005, TITN) linked to a color video camera was more suitable for immunohistochemical sections and allowed more automated and faster measurements than the texture analyser (TAS, Leitz) linked with a black and white camera. Alkaline phosphatase technique with fast red as chromogen was more suitable for the detection of Ki-67 stained area by thresholding than peroxidase technique with aminoethylcarbazol or with diaminobenzidine as chromogens. Significant differences were found between low and high grade in B and T ML according to the Kiel classification (mean values +/- SD of 7.7 +/- 3.8% and 16.6 +/- 6.2% in B-ML and of 10.2 +/- 7.9% and 25.6 +/- 16.3% in T-ML respectively). In follicular B-ML, considering follicular areas only, values were comparable to high grade ML; angioimmunoblastic-lymphadenopathy-like (AILD-type) T-ML belonging to low grade ML showed similar values to pleomorphic T-ML with medium and/or large cells belonging to high grade ML.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:1409077
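The stained-area-percentage measure that the study validates against manual counting is an intensity-threshold area fraction. An illustrative sketch on a toy intensity image (real systems threshold on the chromogen color channel):

```python
import numpy as np

def stained_area_percent(img, threshold):
    """Percent of image area whose staining intensity exceeds the threshold,
    i.e. a stained-area-percentage measure of the kind used above in place
    of manual cell counting."""
    return 100.0 * (np.asarray(img, float) > threshold).mean()

# Toy 4x4 intensity image: 4 of 16 pixels exceed the threshold -> 25%
toy = np.array([[0, 0, 9, 0],
                [0, 9, 0, 0],
                [9, 0, 0, 0],
                [0, 0, 0, 9]])
pct = stained_area_percent(toy, threshold=5)
```

The high correlation the authors report (r = 0.88-0.93) is between this area fraction and the count of positive cell profiles.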

  7. Grain-size and grain-shape analyses using digital imaging technology: Application to the fluvial formation of the Ngandong paleoanthropological site in Central Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Sipola, Maija

    2013-04-01

This study implements grain-size and grain-shape analyses to better understand the fluvial processes responsible for forming the Ngandong paleoanthropological site along the Solo River in Central Java. The site was first discovered and excavated by the Dutch Geological Survey in the early 1930s, during which fourteen Homo erectus fossils and thousands of other macrofaunal remains were uncovered. The Homo erectus fossils discovered at Ngandong are particularly interesting to paleoanthropologists because the morphology of the excavated crania suggests they are from a recently-living variety of the species. The primary scientific focus for many years has been to determine the absolute age of the Ngandong fossils, while the question of exactly how the Ngandong site itself formed has been frequently overlooked. In this study I use Retsch CAMSIZER digital imaging technology to conduct grain-size and grain-shape analyses of sediments from the terrace stratigraphy at the Ngandong site to understand if there are significant differences between sedimentary layers in grain-size and/or grain-shape, and what these differences mean in terms of local paleoflow dynamics over time. Preliminary analyses indicate there are four distinct sedimentary layers present at Ngandong with regard to size sorting, with the fossil-bearing layers proving to be the most poorly-sorted and most similar to debris-flow deposits. These results support hypotheses by geoarchaeologists that the fossil-bearing layers present at Ngandong were deposited during special flow events rather than under normal stream flow conditions.
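Size sorting of the kind used to distinguish the Ngandong layers is conventionally quantified with the Folk and Ward graphic standard deviation on phi-scale grain sizes. A sketch with synthetic grain-size distributions (the abstract does not state which sorting statistic was used, so this standard measure is an assumption):

```python
import numpy as np

def folk_ward_sorting(grain_sizes_mm):
    """Folk & Ward graphic standard deviation (sorting) in phi units:
    (phi84 - phi16)/4 + (phi95 - phi5)/6.6, with phi = -log2(diameter_mm).
    Larger values indicate more poorly sorted sediment."""
    phi = -np.log2(np.asarray(grain_sizes_mm, float))
    p5, p16, p84, p95 = np.percentile(phi, [5, 16, 84, 95])
    return (p84 - p16) / 4.0 + (p95 - p5) / 6.6

rng = np.random.default_rng(1)
# Synthetic samples: narrow vs. wide phi spread around the same median size
well_sorted = folk_ward_sorting(2.0 ** -rng.normal(2.0, 0.3, 1000))
poorly_sorted = folk_ward_sorting(2.0 ** -rng.normal(2.0, 1.5, 1000))
```

A debris-flow-like deposit would score high on this statistic, consistent with the poorly-sorted fossil-bearing layers described above.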

  8. An automated image-based method of 3D subject-specific body segment parameter estimation for kinetic analyses of rapid movements.

    PubMed

    Sheets, Alison L; Corazza, Stefano; Andriacchi, Thomas P

    2010-01-01

    Accurate subject-specific body segment parameters (BSPs) are necessary to perform kinetic analyses of human movements with large accelerations, or no external contact forces or moments. A new automated topographical image-based method of estimating segment mass, center of mass (CM) position, and moments of inertia is presented. Body geometry and volume were measured using a laser scanner, then an automated pose and shape registration algorithm segmented the scanned body surface, and identified joint center (JC) positions. Assuming the constant segment densities of Dempster, thigh and shank masses, CM locations, and moments of inertia were estimated for four male subjects with body mass indexes (BMIs) of 19.7-38.2. The subject-specific BSP were compared with those determined using Dempster and Clauser regression equations. The influence of BSP and BMI differences on knee and hip net forces and moments during a running swing phase were quantified for the subjects with the smallest and largest BMIs. Subject-specific BSP for 15 body segments were quickly calculated using the image-based method, and total subject masses were overestimated by 1.7-2.9%. When compared with the Dempster and Clauser methods, image-based and regression estimated thigh BSP varied more than the shank parameters. Thigh masses and hip JC to thigh CM distances were consistently larger, and each transverse moment of inertia was smaller using the image-based method. Because the shank had larger linear and angular accelerations than the thigh during the running swing phase, shank BSP differences had a larger effect on calculated intersegmental forces and moments at the knee joint than thigh BSP differences did at the hip. It was the net knee kinetic differences caused by the shank BSP differences that were the largest contributors to the hip variations. Finally, BSP differences produced larger kinetic differences for the subject with larger segment masses, suggesting that parameter accuracy is more
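    In its simplest voxel form, the constant-density BSP estimation described here reduces to sums over the segmented volume. This sketch assumes a uniform density and invented voxel data; it is a schematic of the principle, not the authors' registration pipeline.

```python
# Sketch: segment mass, centre of mass, and a moment of inertia from a
# voxelized segment volume under a constant-density assumption
# (Dempster-style). Density, voxel size, and the voxel set are illustrative.

DENSITY = 1.05e3   # kg/m^3, assumed constant over the segment
VOXEL = 0.01       # voxel edge length in metres (1 cm)

# Hypothetical segment: occupied voxel centre coordinates (x, y, z), metres.
voxels = [(x * VOXEL, y * VOXEL, z * VOXEL)
          for x in range(4) for y in range(4) for z in range(30)]

v_vol = VOXEL ** 3
mass = DENSITY * v_vol * len(voxels)

# Centre of mass: uniform density -> arithmetic mean of voxel centres.
cm = tuple(sum(c[i] for c in voxels) / len(voxels) for i in range(3))

# Moment of inertia about the z-axis through the CM, treating each voxel
# as a point mass (adequate when voxels are small relative to the segment).
m_vox = DENSITY * v_vol
Izz = sum(m_vox * ((x - cm[0]) ** 2 + (y - cm[1]) ** 2) for x, y, z in voxels)
```

    The real method's contribution is the automated segmentation and joint-centre identification; once the voxel set per segment is known, the parameter sums are this direct.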

  10. Growing seasons of Nordic mountain birch in northernmost Europe as indicated by long-term field studies and analyses of satellite images.

    PubMed

    Shutova, E; Wielgolaski, F E; Karlsen, S R; Makarova, O; Berlina, N; Filimonova, T; Haraldsson, E; Aspholm, P E; Flø, L; Høgda, K A

    2006-11-01

    The phenophases first greening (bud burst) and yellowing of Nordic mountain birch (Betula pubescens ssp. tortuosa, also called B. p. ssp. czerepanovii) were observed at three sites on the Kola Peninsula in northernmost Europe during the period 1964-2003, and at two sites in the trans-boundary Pasvik-Enare region during 1994-2003. The field observations were compared with satellite images based on the GIMMS-NDVI dataset covering 1982-2002 at the start and end of the growing season. A trend for a delay of first greening was observed at only one of the sites (Kandalaksha) over the 40-year period. This fits well with the delayed onset of the growing season for that site based on satellite images. No significant changes in time of greening at the other sites were found with either field observations or satellite analyses throughout the study period. These results differ from the earlier spring generally observed in other parts of Europe in recent decades. In the coldest regions of Europe, e.g. in northern high mountains and the northernmost continental areas, increased precipitation associated with the generally positive North Atlantic Oscillation in the last few decades has often fallen as snow. Increased snow may delay the time of onset of the growing season, although increased temperature generally causes earlier spring phenophases. Autumn yellowing of birch leaves tends towards an earlier date at all sites. Due to both later birch greening and earlier yellowing at the Kandalaksha site, the growing season there has also become significantly shorter during the years observed. The sites showing the most advanced yellowing in the field throughout the study period fit well with areas showing an earlier end of the growing season from satellite images covering 1982-2002. 
The earlier yellowing is highly correlated with a trend at the sites in autumn for earlier decreasing air temperature over the study period, indicating that this environmental factor is important also for
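    One common convention for the satellite-derived season onset discussed above is the first date the NDVI series crosses a fixed threshold. The threshold value and the two composite series below are invented for illustration.

```python
# Sketch: growing-season onset as the first date a smoothed NDVI series
# reaches a fixed threshold -- one of several conventions used with
# GIMMS-NDVI data. Threshold and series are illustrative.

def season_onset(days, ndvi, threshold=0.4):
    """Return the first day-of-year at which NDVI reaches the threshold."""
    for d, v in zip(days, ndvi):
        if v >= threshold:
            return d
    return None

# Hypothetical 10-day composites for one pixel (day-of-year, NDVI).
days = [120, 130, 140, 150, 160, 170]
ndvi_1990 = [0.15, 0.22, 0.35, 0.47, 0.58, 0.63]
ndvi_2000 = [0.12, 0.18, 0.28, 0.38, 0.52, 0.61]

onset_1990 = season_onset(days, ndvi_1990)  # day 150
onset_2000 = season_onset(days, ndvi_2000)  # day 160
delay = onset_2000 - onset_1990             # positive -> later onset
```

    A positive trend in this onset day over the satellite record is what the abstract reports for the Kandalaksha site.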

  11. Surface Roughness and Critical Exponent Analyses of Boron-Doped Diamond Films Using Atomic Force Microscopy Imaging: Application of Autocorrelation and Power Spectral Density Functions

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Vierkant, G. P.

    2014-09-01

    The evolution of the surface roughness of growing metal or semiconductor thin films provides much needed information about their growth kinetics and corresponding mechanism. While some systems show stages of nucleation, coalescence, and growth, others exhibit varying microstructures for different process conditions. In view of these classifications, we report herein detailed analyses based on atomic force microscopy (AFM) characterization to extract the surface roughness and growth kinetics exponents of relatively low boron-doped diamond (BDD) films by utilizing the analytical power spectral density (PSD) and autocorrelation function (ACF) as mathematical tools. The machining industry has applied PSD for a number of years for tool design and analysis of wear and machined surface quality. Herein, we present similar analyses at the mesoscale to study the surface morphology as well as quality of BDD films grown using the microwave plasma-assisted chemical vapor deposition technique. PSD spectra as a function of boron concentration (in gaseous phase) are compared with those for samples grown without boron. We find that relatively higher boron concentration yields higher amplitudes of the longer-wavelength power spectral lines, with amplitudes decreasing in an exponential or power-law fashion towards shorter wavelengths, determining the roughness exponent (α ≈ 0.16 ± 0.03) and growth exponent (β ≈ 0.54), albeit indirectly. A unique application of the ACF, which is widely used in signal processing, was also applied to one-dimensional or line analyses (i.e., along the x- and y-axes) of AFM images, revealing surface topology datasets with varying boron concentration. Here, the ACF was used to cancel random surface "noise" and identify any spatial periodicity via repetitive ACF peaks or spatially correlated noise. Periodicity at shorter spatial wavelengths was observed for no doping and low doping levels, while smaller correlations were observed for relatively
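    The PSD and ACF tools used here can be illustrated on a one-dimensional profile: the power spectrum localizes a dominant spatial wavelength, and the circular autocorrelation recovers the same periodicity. The profile below is synthetic (one sine component plus an offset), not AFM data.

```python
import math

# Sketch: 1D power spectral density (direct DFT) and circular
# autocorrelation of a height profile.

def dft_power(profile):
    n = len(profile)
    power = []
    for k in range(n // 2 + 1):
        re = sum(h * math.cos(-2 * math.pi * k * i / n) for i, h in enumerate(profile))
        im = sum(h * math.sin(-2 * math.pi * k * i / n) for i, h in enumerate(profile))
        power.append((re * re + im * im) / n)
    return power

def autocorrelation(profile, lag):
    """Normalized circular ACF at a given lag (mean removed)."""
    n = len(profile)
    mean = sum(profile) / n
    dev = [h - mean for h in profile]
    var = sum(d * d for d in dev) / n
    cov = sum(dev[i] * dev[(i + lag) % n] for i in range(n)) / n
    return cov / var

# Synthetic profile: period-8 roughness over 32 samples, mean height 1.0.
n = 32
profile = [1.0 + 0.5 * math.sin(2 * math.pi * i / 8) for i in range(n)]

power = dft_power(profile)
peak_k = max(range(1, len(power)), key=lambda k: power[k])  # expect k = 32/8 = 4
acf_at_period = autocorrelation(profile, 8)                 # expect ~ +1
```

    In the study, a power-law fit to such a spectrum is what yields the roughness exponent, and repeated ACF peaks are the periodicity signature.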

  12. Computational replication of the patient-specific stenting procedure for coronary artery bifurcations: From OCT and CT imaging to structural and hemodynamics analyses.

    PubMed

    Chiastra, Claudio; Wu, Wei; Dickerhoff, Benjamin; Aleiou, Ali; Dubini, Gabriele; Otake, Hiromasa; Migliavacca, Francesco; LaDisa, John F

    2016-07-26

    The optimal stenting technique for coronary artery bifurcations is still debated. With additional advances, computational simulations can soon be used to compare stent designs or strategies based on verified structural and hemodynamics results in order to identify the optimal solution for each individual's anatomy. In this study, patient-specific simulations of stent deployment were performed for 2 cases to replicate the complete procedure conducted by interventional cardiologists. Subsequent computational fluid dynamics (CFD) analyses were conducted to quantify hemodynamic quantities linked to restenosis. Patient-specific pre-operative models of coronary bifurcations were reconstructed from CT angiography and optical coherence tomography (OCT). Plaque location and composition were estimated from OCT and assigned to models, and structural simulations were performed in Abaqus. Artery geometries after virtual stent expansion of Xience Prime or Nobori stents created in SolidWorks were compared to post-operative geometry from OCT and CT before being extracted and used for CFD simulations in SimVascular. Inflow boundary conditions based on body surface area, and downstream vascular resistances and capacitances were applied at branches to mimic physiology. Artery geometries obtained after virtual expansion were in good agreement with those reconstructed from patient images. Quantitative comparison of the distance between reconstructed and post-stent geometries revealed a maximum difference in area of 20.4%. Adverse indices of wall shear stress were more pronounced for thicker Nobori stents in both patients. These findings verify structural analyses of stent expansion, introduce a workflow to combine software packages for solid and fluid mechanics analysis, and underscore important stent design features from prior idealized studies. The proposed approach may ultimately be useful in determining an optimal choice of stent and position for each patient. PMID:26655589

  14. Coregistration of quantitative proton magnetic resonance spectroscopic imaging with neuropathological and neurophysiological analyses defines the extent of neuronal impairments in murine human immunodeficiency virus type-1 encephalitis.

    PubMed

    Nelson, J A; Dou, H; Ellison, B; Uberti, M; Xiong, H; Anderson, E; Mellon, M; Gelbard, H A; Boska, M; Gendelman, H E

    2005-05-15

    Relatively few immune-activated and virus-infected mononuclear phagocytes (MP; perivascular macrophages and microglia) may affect widespread neuronal dysfunction during human immunodeficiency virus type 1 (HIV-1)-associated dementia (HAD). Indeed, histopathological evidence of neuronal dropout often belies the extent of cognitive impairment. To define relationships between neuronal function and histopathology, proton magnetic resonance spectroscopic imaging (1H MRSI) and hippocampal long-term potentiation (LTP) were compared with neuronal and glial immunohistology in a murine model of HIV-1 encephalitis (HIVE). HIV-1(ADA)-infected human monocyte-derived macrophages (MDM) were stereotactically injected into the subcortex of severe combined immunodeficient (SCID) mice. Sham-operated and unmanipulated mice served as controls. Seven days after cell injection, brain histological analyses revealed a focal giant cell encephalitis, with reactive astrocytes, microgliosis, and neuronal dropout. Strikingly, significant reductions in N-acetyl aspartate concentration ([NAA]) and LTP levels in HIVE mice were in both injected and contralateral hemispheres and in brain subregions, including the hippocampus, where neuropathology was limited or absent. The data support the importance of 1H MRSI as a tool for assessing neuronal function for HAD. The data also demonstrate that a highly focal encephalitis can produce global deficits for neuronal function and metabolism. PMID:15825192

  15. Sociopolitical Analyses.

    ERIC Educational Resources Information Center

    Van Galen, Jane, Ed.; And Others

    1992-01-01

    This theme issue of the serial "Educational Foundations" contains four articles devoted to the topic of "Sociopolitical Analyses." In "An Interview with Peter L. McLaren," Mary Leach presented the views of Peter L. McLaren on topics of local and national discourses, values, and the politics of difference. Landon E. Beyer's "Educational Studies and…

  16. Let there be bioluminescence – Development of a biophotonic imaging platform for in situ analyses of oral biofilms in animal models

    PubMed Central

    Merritt, Justin; Senpuku, Hidenobu; Kreth, Jens

    2016-01-01

    In the current study, we describe a novel biophotonic imaging-based reporter system that is particularly useful for the study of virulence in polymicrobial infections and interspecies interactions within animal models. A suite of luciferase enzymes was compared using three early colonizing species of the human oral flora (Streptococcus mutans, Streptococcus gordonii, and Streptococcus sanguinis) to determine the utility of the different reporters for multiplexed imaging studies in vivo. Using the multiplex approach, we were able to track individual species within a dual species oral infection model in mice with both temporal and spatial resolution. We also demonstrate how biophotonic imaging of multiplexed luciferase reporters could be adapted for real-time quantification of bacterial gene expression in situ. By creating an inducible dual-luciferase expressing reporter strain of S. mutans, we were able to exogenously control and measure expression of nlmAB (encoding the bacteriocin mutacin IV) within mice to assess its importance for the persistence ability of S. mutans in the oral cavity. The imaging system described in the current study circumvents many of the inherent limitations of current animal model systems, which should now make it feasible to test hypotheses that were previously impractical to model. PMID:26119252

  17. A study on quantitative analyses before and after injection of contrast medium in spine examinations performed by using diffusion weighted image

    NASA Astrophysics Data System (ADS)

    Cho, Jae-Hwan; Lee, Hae-Kag; Kim, Yong-Kyun; Dong, Kyung-Rae; Chung, Woon-Kwan; Joo, Kyu-Ji

    2013-02-01

    This study examined the changes in the signal-to-noise ratio (SNR), the contrast-to-noise ratio (CNR) and the apparent diffusion coefficient (ADC) of metastatic cancer in the lumbar region by using diffusion weighted images taken with a 1.5 T (Tesla) magnetic resonance (MR) scanner before and after injecting a contrast medium. The study enrolled 30 healthy people and 30 patients with metastatic spine cancer from patients who underwent a lumbar MRI scan from January 2011 to October 2012. A 1.5 T MR scanner was used to obtain the diffusion weighted images (DWIs) before and after injecting the contrast medium. In the group with metastatic spine cancer, the SNR and the CNR were measured in three parts among the L1-L5 lumbar vertebrae: the part with metastatic spine cancer, the area of the spine with spine cancer, and the area of the spine under the region with cancer. In the acquired ADC map images, the SNRs and the ADCs of the same three parts were measured. Among the healthy subjects, the measurements were conducted for the lumbar regions of L3-L5. According to the results, in the group with metastatic spine cancer, the SNR in the DWI before the contrast medium had been injected was lowest in the part with spine cancer. In the DWI after the contrast medium had been injected, the SNR and the CNR were increased in all three parts. In the ADC map image after the contrast medium had been injected, the SNR was decreased in all three parts compared to the SNR before the contrast had been injected. The ADC after the contrast medium had been injected was likewise decreased in all three parts compared to that before injection. In the healthy group, the SNR was increased in the L3-L5 lumbar regions in the DWI, whereas in the ADC map image the SNR in all three parts was decreased after the contrast medium had been injected. The ADC in the ADC map image was also decreased in all three parts.
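    The three reported quantities have simple region-of-interest (ROI) definitions, sketched below. SNR and CNR conventions vary between scanners and papers; the mean/noise forms and all numeric values here are assumptions, not the study's data.

```python
import math

# Sketch: SNR, CNR, and a two-point ADC from illustrative ROI statistics.

def snr(signal_mean, noise_sd):
    return signal_mean / noise_sd

def cnr(signal_a, signal_b, noise_sd):
    return abs(signal_a - signal_b) / noise_sd

def adc(s_low, s_high, b_low, b_high):
    """Apparent diffusion coefficient (mm^2/s) from signals at two b-values."""
    return math.log(s_low / s_high) / (b_high - b_low)

# Hypothetical ROI values: lesion vs. normal marrow, background noise SD.
lesion_mean, marrow_mean, noise_sd = 420.0, 260.0, 12.0

snr_lesion = snr(lesion_mean, noise_sd)               # 420/12 = 35
cnr_lesion = cnr(lesion_mean, marrow_mean, noise_sd)  # 160/12 ~ 13.3

# Two-point ADC: signal 420 at b = 0, 180 at b = 1000 s/mm^2.
adc_value = adc(420.0, 180.0, 0.0, 1000.0)            # ~8.5e-4 mm^2/s
```

    The contrast-medium comparison in the study amounts to recomputing these quantities from pre- and post-injection ROI statistics.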

  18. PCaAnalyser: a 2D-image analysis based module for effective determination of prostate cancer progression in 3D culture.

    PubMed

    Hoque, Md Tamjidul; Windus, Louisa C E; Lovitt, Carrie J; Avery, Vicky M

    2013-01-01

    Three-dimensional (3D) in vitro cell based assays for Prostate Cancer (PCa) research are rapidly becoming the preferred alternative to that of conventional 2D monolayer cultures. 3D assays more precisely mimic the microenvironment found in vivo, and thus are ideally suited to evaluate compounds and their suitability for progression in the drug discovery pipeline. To achieve the desired high throughput needed for most screening programs, automated quantification of 3D cultures is required. Towards this end, this paper reports on the development of a prototype analysis module for an automated high-content-analysis (HCA) system, which allows for accurate and fast investigation of in vitro 3D cell culture models for PCa. The Java based program, which we have named PCaAnalyser, uses novel algorithms that allow accurate and rapid quantitation of protein expression in 3D cell culture. As currently configured, the PCaAnalyser can quantify a range of biological parameters including: nuclei-count, nuclei-spheroid membership prediction, various function based classification of peripheral and non-peripheral areas to measure expression of biomarkers and protein constituents known to be associated with PCa progression, as well as defining segregate cellular-objects effectively for a range of signal-to-noise ratios. In addition, PCaAnalyser architecture is highly flexible, operating as a single independent analysis, as well as in batch mode; essential for High-Throughput-Screening (HTS). Utilising the PCaAnalyser, accurate and rapid analysis in an automated high throughput manner is provided, and reproducible analysis of the distribution and intensity of well-established markers associated with PCa progression is demonstrated in a range of metastatic PCa cell-lines (DU145 and PC3) in a 3D model.
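    A minimal version of the nuclei-count primitive a module like this builds on is intensity thresholding followed by connected-component labelling; real pipelines add filtering, watershed splitting of touching nuclei, and size gating. The image and threshold below are toy values.

```python
# Sketch: count nuclei as 4-connected components of above-threshold pixels,
# via an iterative flood fill.

def count_nuclei(image, threshold):
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                count += 1
                stack = [(r, c)]  # flood-fill one connected component
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and image[y][x] >= threshold and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

# Two bright blobs separated by background: expect a count of 2.
img = [[0, 0, 0, 0, 0, 0],
       [0, 9, 9, 0, 0, 0],
       [0, 9, 9, 0, 8, 8],
       [0, 0, 0, 0, 8, 8],
       [0, 0, 0, 0, 0, 0]]
n_nuclei = count_nuclei(img, threshold=5)
```

    Spheroid-membership prediction then reduces to assigning each labelled component to its nearest spheroid object, a step this sketch omits.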

  19. Image

    SciTech Connect

    Marsh, Amber; Harsch, Tim; Pitt, Julie; Firpo, Mike; Lekin, April; Pardes, Elizabeth

    2007-08-31

    The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks; scripts are available to insert, update and delete data from the underlying Oracle database, download data from NCBI's Genbank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface, and various tools available on the project web-site (image.llnl.gov) provide a search interface to the database.

  20. Assimilating All-Sky GPM Microwave Imager(GMI) Radiance Data in NASA GEOS-5 System for Global Cloud and Precipitation Analyses

    NASA Astrophysics Data System (ADS)

    Kim, M. J.; Jin, J.; McCarty, W.; Todling, R.; Holdaway, D. R.; Gelaro, R.

    2014-12-01

    The NASA Global Modeling and Assimilation Office (GMAO) works to maximize the impact of satellite observations in the analysis and prediction of climate and weather through integrated Earth system modeling and data assimilation. To achieve this goal, the GMAO undertakes model and assimilation development, generates products to support NASA instrument teams and the NASA Earth science program. Currently, the Atmospheric Data Assimilation System (ADAS) in the Goddard Earth Observing System Model, Version 5 (GEOS-5) system combines millions of observations and short-term forecasts to determine the best estimate, or analysis, of the instantaneous atmospheric state. However, ADAS has been geared towards utilization of observations in clear sky conditions and the majority of satellite channel data affected by clouds are discarded. Microwave imager data from satellites can be a significant source of information for clouds and precipitation but the data are presently underutilized, as only surface rain rates from the Tropical Rainfall Measurement Mission (TRMM) Microwave Imager (TMI) are assimilated with small weight assigned in the analysis process. As clouds and precipitation often occur in regions with high forecast sensitivity, improvements in the temperature, moisture, wind and cloud analysis of these regions are likely to contribute to significant gains in numerical weather prediction accuracy. This presentation is intended to give an overview of GMAO's recent progress in assimilating the all-sky GPM Microwave Imager (GMI) radiance data in the GEOS-5 system. This includes development of various new components to assimilate cloud and precipitation affected data in addition to data in clear sky condition. New observation operators, quality controls, moisture control variables, observation and background error models, and a methodology to incorporate the linearized moisture physics in the assimilation system are described. In addition, preliminary results showing impacts of

  1. Analyses of requirements for computer control and data processing experiment subsystems: Image data processing system (IDAPS) software description (7094 version), volume 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A description of each of the software modules of the Image Data Processing System (IDAPS) is presented. The changes in the software modules are the result of additions to the application software of the system and an upgrade of the IBM 7094 Mod(1) computer to a 1301 disk storage configuration. Necessary information about IDAPS software is supplied to the computer programmer who desires to make changes in the software system or who desires to use portions of the software outside of the IDAPS system. Each software module is documented with: module name, purpose, usage, common block(s) description, method (algorithm of subroutine), flow diagram (if needed), subroutines called, and storage requirements.

  2. Characteristics and Origin of a Cratered Unit near the MSL Bradbury Landing Site (Gale Crater, Mars) Based on Analyses of Surface Data and Orbital Images

    NASA Astrophysics Data System (ADS)

    Jacob, S.; Rowland, S. K.; Edgett, K. S.; Kah, L. C.; Wiens, R. C.; Day, M. D.; Calef, F.; Palucis, M. C.; Anderson, R. B.

    2014-12-01

    Using orbiter images, the Curiosity landing ellipse was mapped as six distinct units based mainly on geomorphic characteristics. These units are the alluvial fan material (ALF), fractured light-toned surface (FLT), cratered plains/surfaces (CS), smooth hummocky plains (SH), rugged unit (RU) and striated light-toned outcrops (SLT) (Grotzinger et al., 2014; DOI: 10.1126/science.1242777). The goal of this project was to characterize and determine the origin of the CS. The CS is a thin, nearly horizontal, erosion resistant capping unit. HiRISE mosaics were utilized to subdivide the CS into four geomorphic sub-units. Crater densities were calculated for each sub-unit to provide a quantifiable parameter that could aid in understanding how the sub-units differ. Mastcam images from many locations along Curiosity's traverse show fields of dark, massive boulders, which are presumably erosional remnants of the CS. This indicates that the CS was likely more laterally extensive in the past. In situ CS outcrops, seen at Shaler and multiple locations near the Zabriskie Plateau, appear to have a rough, wind-sculpted surface and may consist of two distinct lithologies. The lower lithology displays hints of layering that have been enhanced by differential weathering, whereas the upper lithology consists of dark, massive rock. When present, the outcrops can extend laterally for several meters, but Mastcam images of outcrops do not always reveal both sections. ChemCam data show that CS targets have concentrations of Al, Na, and K that are indicative of an alkali feldspar phase. The physical and chemical characteristics of the CS suggest a massive deposit that has seen little to no chemical alteration. Physical characteristics of the CS do not allow us to unambiguously determine its geologic origin; possible emplacement mechanisms would require the ability to spread laterally over a nearly horizontal surface, and include inflating lava (e.g., pāhoehoe) or a distal delta deposit. The
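    Crater density, the quantifiable parameter used above to compare sub-units, is simply count per mapped area. The sub-unit names, counts, and areas below are invented for illustration, not the study's measurements.

```python
# Sketch: crater density per mapped geomorphic sub-unit.

def crater_density(n_craters, area_km2):
    """Craters per square kilometre."""
    return n_craters / area_km2

# Hypothetical sub-units: (crater count, mapped area in km^2).
sub_units = {
    "CS-1": (120, 4.0),
    "CS-2": (45, 2.5),
    "CS-3": (200, 5.0),
    "CS-4": (30, 3.0),
}

densities = {name: crater_density(n, a) for name, (n, a) in sub_units.items()}
densest = max(densities, key=densities.get)  # "CS-3" at 40 craters/km^2
```

    For surfaces of similar age, differences in such densities point to differing resistance to erosion or burial among the sub-units.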

  3. Utilizing magnetic resonance imaging logs, open hole logs and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.; Morganti, J.; White, H.

    1995-06-01

    NMR logging using the new C series Magnetic Resonance Imaging Logging (MRIL)™ is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeability and effective porosities, MRIL data can help petrophysicists evaluate low resistivity pays. In these instances, conventional open hole logs may not define all of the pay intervals. MRIL can also minimize unnecessary completions in zones of potentially high water-cut. This case study will briefly discuss MRIL tool theory and log presentations used with the conventional logs and sidewall cores. SEM analysis will show a good correlation of varying grain size sands with the T₂ distribution and bulk volume irreducible from MRIL. Discussions of each well in the study area will show how water-free production zones were defined. Because the MRIL data was not recorded on one of the wells, the advanced petrophysical program HORIZON was used to predict the MRIL bulk volume irreducible and effective porosity to estimate productive zones. Discussion of additional formation characteristics, completion procedures, actual production and predicted producibility of the shaly sands will be presented.
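    NMR-log quantities like those mentioned (effective porosity, free-fluid index FFI, bulk volume irreducible BVI) are commonly combined in the Timur-Coates permeability estimate. Whether this study used that form is not stated, and the coefficient C is field-calibrated; treat the values below as illustrative assumptions.

```python
# Sketch: Timur-Coates permeability estimate from NMR-log quantities.
# k = ((phi / C)^2 * (FFI / BVI))^2, a commonly used empirical form;
# C is a calibration constant (often near 10 with porosity in p.u.).

def timur_coates_perm(phi, ffi, bvi, c=10.0):
    """Permeability in mD; phi in porosity units (%), FFI/BVI as fractions."""
    return ((phi / c) ** 2 * (ffi / bvi)) ** 2

# Hypothetical shaly-sand interval: 24 p.u. porosity, FFI = 0.18, BVI = 0.06.
k_md = timur_coates_perm(phi=24.0, ffi=0.18, bvi=0.06)
```

    A high FFI/BVI ratio in such an estimate is the kind of evidence used to flag intervals for water-free production despite low resistivity.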

  4. Quality and compatibility analyses of global aerosol products derived from the advanced very high resolution radiometer and Moderate Resolution Imaging Spectroradiometer

    NASA Astrophysics Data System (ADS)

    Jeong, Myeong-Jae; Li, Zhanqing; Chu, D. Allen; Tsay, Si-Chee

    2005-05-01

    There exist numerous global aerosol products derived from various satellite sensors, but little insight has been gained about their compatibility and quality. This study presents a comparison of two prominent global aerosol products derived over oceans from the advanced very high resolution radiometer (AVHRR) under the Global Aerosol Climatology Project (GACP) (Mishchenko et al., 1999) and the Moderate Resolution Imaging Spectroradiometer (MODIS) (Tanré et al., 1997). The comparisons are for monthly mean aerosol optical thickness (AOT) and Ångström exponent (α) at a spatial resolution of 1 × 1 degree. The two monthly AOT products showed substantial discrepancies, with a tendency of higher values from MODIS than from GACP/AVHRR, especially near the coasts of major aerosol outbreak regions. Individual monthly AOT values have poor correlation, but their regional means are moderately correlated (correlation coefficient 0.5 < R < 1.0). While cloud screening has often been argued to be a major factor explaining large discrepancies, this study shows that differences in aerosol models in the two retrieval algorithms can lead to large discrepancies. Contributions of the size distribution are more significant than the refractive index. The noisiness of the GACP/AVHRR aerosol retrievals seems to be partially influenced by radiometric uncertainties in the AVHRR system, but this is unlikely to be a major factor explaining the observed systematic discrepancies between the MODIS and GACP/AVHRR AOTs. For α, correlations between MODIS and GACP/AVHRR are lower (0.2 < R < 0.7) than for AOT. The MODIS α shows a well-behaved dependence on the AOT contingent upon the aerosol type, while the GACP/AVHRR α has little correlation with the AOT. The high sensitivity in the selection of aerosol models to radiometric errors may be a primary reason for the worse comparison of α. Part of the discrepancies in α is attributed to different aerosol size distributions.
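    The Ångström exponent compared above has the standard two-wavelength definition sketched below; the wavelengths and AOT values are illustrative, not retrieved data.

```python
import math

# Sketch: Angstrom exponent from AOT at two wavelengths,
# alpha = -ln(tau1/tau2) / ln(lambda1/lambda2).
# Larger alpha indicates smaller (fine-mode) particles.

def angstrom_exponent(tau1, tau2, lam1, lam2):
    return -math.log(tau1 / tau2) / math.log(lam1 / lam2)

# Fine-mode-dominated case: AOT falls quickly with wavelength (0.55 -> 0.87 um).
alpha_fine = angstrom_exponent(tau1=0.30, tau2=0.15, lam1=0.55, lam2=0.87)

# Coarse-mode case (e.g. dust): AOT nearly flat with wavelength.
alpha_coarse = angstrom_exponent(tau1=0.30, tau2=0.28, lam1=0.55, lam2=0.87)
```

    Because α is a ratio of logarithms of retrieved AOTs, small radiometric errors in either channel propagate strongly into it, consistent with the weaker α correlations the study reports.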

  5. Use of INSAT-3D sounder and imager radiances in the 4D-VAR data assimilation system and its implications in the analyses and forecasts

    NASA Astrophysics Data System (ADS)

    Indira Rani, S.; Taylor, Ruth; George, John P.; Rajagopal, E. N.

    2016-05-01

    INSAT-3D, the first Indian geostationary satellite with sounding capability, provides valuable information over India and the surrounding oceanic regions that is pivotal to Numerical Weather Prediction. In collaboration with the UK Met Office, NCMRWF developed the capability to assimilate INSAT-3D Clear Sky Brightness Temperature (CSBT), from both the sounder and the imager, in the 4D-Var assimilation system used at NCMRWF. Of the 18 sounder channels, radiances from 9 channels are selected for assimilation depending on the relevance of the information in each channel. The first three high-peaking channels (the CO2 absorption channels) and the three water vapor channels (channels 10, 11, and 12) are assimilated over both land and ocean, whereas the window channels (channels 6, 7, and 8) are assimilated only over the ocean. Measured satellite radiances are compared with those from short-range forecasts to monitor the data quality. This is based on the assumption that the observed satellite radiances are free from calibration errors and that the short-range forecast provided by the NWP model is free from systematic errors. Innovations (Observation - Forecast) before and after the bias correction indicate how well the bias correction works. Since the biases vary with air mass, time, and scan angle, and also with instrument degradation, an accurate bias correction algorithm is important for the assimilation of INSAT-3D sounder radiances. This paper discusses the bias correction methods and other quality controls used for the selected INSAT-3D sounder channels and the impact of bias-corrected radiances in the data assimilation system, particularly over India and the surrounding oceanic regions.
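    The static part of a radiance bias correction is commonly modelled as a linear function of predictors such as scan angle or an air-mass proxy, fitted to the innovations (O - F) and then removed. A hedged sketch of that general idea follows; it is not NCMRWF's operational scheme, and all data are synthetic:

```python
import numpy as np

def correct_bias(obs, fg, predictors):
    """Fit a linear bias model to innovations (O - F) and subtract it from obs.

    predictors: list of 1-D arrays (e.g. scan angle, air-mass proxy)."""
    innov = obs - fg                                       # O - F before correction
    X = np.column_stack([np.ones_like(obs)] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, innov, rcond=None)       # least-squares bias fit
    return obs - X @ beta                                  # bias-corrected radiances

# Synthetic check: constant offset plus a scan-angle-dependent bias
scan = np.linspace(-30.0, 30.0, 200)                       # scan angle (degrees)
fg = 250.0 + 0.1 * scan                                    # "forecast" brightness temps (K)
obs = fg + 1.5 + 0.05 * scan                               # biased "observations"
corrected = correct_bias(obs, fg, [scan])
```

    After correction the mean innovation is driven toward zero, which is exactly the before/after diagnostic the abstract describes.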

  6. Quantifying the Physical Composition of Urban Morphology throughout Wales by analysing a Time Series (1989-2011) of Landsat TM/ETM+ images and Supporting GIS data

    NASA Astrophysics Data System (ADS)

    Scott, Douglas; Petropoulos, George

    2014-05-01

    Knowledge of impervious surface areas (ISA) and of their changes in magnitude, location, geometry and morphology over time is significant for a range of practical applications and research alike, from local to global scales. ISA is a key indicator of global environmental change and an important parameter for urban planning and environmental resources management, especially within a European context, given the policy recommendations made to the European Commission by the Austrian Environment Agency in 2011. Despite this, the use of Earth Observation (EO) technology in mapping ISAs within the European Union (EU), and in particular in the UK, is inadequate. In the present study, selected study sites across Wales were used to test the use of freely distributed EO data from the Landsat TM/ETM+ sensors in retrieving ISA, with a view to improving the current European estimations of urbanization and soil sealing. A traditional classifier and a linear spectral mixture analysis (LSMA) were both applied to a series of Landsat TM/ETM+ images acquired over a period spanning 22 years to extract ISA. Aerial photography with a spatial resolution of 0.4 m, acquired over the summer period in 2005, was used for validation purposes. The Welsh study areas provided a unique chance to detect largely dispersed urban morphology within an urban-rural frontier context. The study also presents an innovative method for detecting clouds and cloud shadow layers, detected with an overall accuracy of around 97%. The process tree built and presented in this study is important in terms of moving toward a biennial program for the Welsh Government and is comparable to currently existing products. This EO-based product also offers a less subjective and more dynamic estimation of ISA cover. Our methodology not only establishes the local retrieval of ISA for Wales but also improves the existing EU international figures, and expands relatively stationary 'global' US
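    Linear spectral mixture analysis, as applied above, models each pixel's spectrum as a linear combination of endmember spectra and solves for the fractional cover of each. A minimal unconstrained least-squares sketch (the endmember spectra and band count below are invented for illustration, not taken from the study):

```python
import numpy as np

def unmix(pixel, endmembers):
    """Unconstrained least-squares endmember fractions for one pixel.

    endmembers: list of per-band reflectance spectra, one per cover class."""
    E = np.asarray(endmembers, dtype=float).T          # shape: bands x endmembers
    fractions, *_ = np.linalg.lstsq(E, np.asarray(pixel, dtype=float), rcond=None)
    return fractions

# Two-endmember toy example over six bands: impervious surface vs. vegetation
impervious = [0.10, 0.12, 0.15, 0.18, 0.22, 0.25]
vegetation = [0.04, 0.05, 0.03, 0.45, 0.25, 0.12]
pixel = [0.7 * a + 0.3 * b for a, b in zip(impervious, vegetation)]
f = unmix(pixel, [impervious, vegetation])             # recovers ~[0.7, 0.3]
```

    Operational LSMA typically adds sum-to-one and non-negativity constraints on the fractions; the unconstrained form above is only the core idea.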

  7. SNS shielding analyses overview

    SciTech Connect

    Popova, Irina; Gallmeier, Franz; Iverson, Erik B; Lu, Wei; Remec, Igor

    2015-01-01

    This paper gives an overview of on-going shielding analyses for the Spallation Neutron Source. Presently, most of the shielding work is concentrated on the beam lines and instrument enclosures to prepare for commissioning, safe operation and an adequate radiation background in the future. There is also on-going work for the accelerator facility. This includes radiation-protection analyses for the placement of radiation monitors, designing shielding for additional facilities for testing accelerator structures, redesigning some parts of the facility, and designing test facilities adjacent to the main accelerator structure for component testing. Neutronics analyses are required as well to support spent-structure management, including waste characterisation analyses, choice of a proper transport/storage package, and shielding enhancement for the package if required.

  8. Spacelab Charcoal Analyses

    NASA Technical Reports Server (NTRS)

    Slivon, L. E.; Hernon-Kenny, L. A.; Katona, V. R.; Dejarme, L. E.

    1995-01-01

    This report describes analytical methods and results obtained from chemical analysis of 31 charcoal samples in five sets. Each set was obtained from a single scrubber used to filter ambient air on board a Spacelab mission. Analysis of the charcoal samples was conducted by thermal desorption followed by gas chromatography/mass spectrometry (GC/MS). All samples were analyzed using identical methods. The method used for these analyses was able to detect compounds independent of their polarity or volatility. In addition to the charcoal samples, analyses of three Environmental Control and Life Support System (ECLSS) water samples were conducted specifically for trimethylamine.

  9. Wavelet Analyses and Applications

    ERIC Educational Resources Information Center

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…

  10. Apollo 14 microbial analyses

    NASA Technical Reports Server (NTRS)

    Taylor, G. R.

    1972-01-01

    Extensive microbiological analyses that were performed on the Apollo 14 prime and backup crewmembers and ancillary personnel are discussed. The crewmembers were subjected to four separate and quite different environments during the 137-day monitoring period. The relation between each of these environments and observed changes in the microflora of each astronaut are presented.

  11. Information Omitted From Analyses.

    PubMed

    2015-08-01

    In the Original Article titled “Higher-Order Genetic and Environmental Structure of Prevalent Forms of Child and Adolescent Psychopathology” published in the February 2011 issue of JAMA Psychiatry (then Archives of General Psychiatry) (2011;68[2]:181-189), there were 2 errors. Although the article stated that the dimensions of psychopathology were measured using parent informants for inattention, hyperactivity-impulsivity, and oppositional defiant disorder, and a combination of parent and youth informants for conduct disorder, major depression, generalized anxiety disorder, separation anxiety disorder, social phobia, specific phobia, agoraphobia, and obsessive-compulsive disorder, all dimensional scores used in the reported analyses were actually based on parent reports of symptoms; youth reports were not used. In addition, whereas the article stated that each symptom dimension was residualized on age, sex, age-squared, and age by sex, the dimensions actually were only residualized on age, sex, and age-squared. All analyses were repeated using parent informants for inattention, hyperactivity-impulsivity, and oppositional defiant disorder, and a combination of parent and youth informants for conduct disorder, major depression, generalized anxiety disorder, separation anxiety disorder, social phobia, specific phobia, agoraphobia, and obsessive-compulsive disorder; these dimensional scores were residualized on age, age-squared, sex, sex by age, and sex by age-squared. The results of the new analyses were qualitatively the same as those reported in the article, with no substantial changes in conclusions. The only notable small difference was that major depression and generalized anxiety disorder dimensions had small but significant loadings on the internalizing factor in addition to their substantial loadings on the general factor in the analyses of both genetic and non-shared covariances in the selected models in the new analyses. Corrections were made to the

  12. Systematic Processing of Clementine Data for Scientific Analyses

    NASA Technical Reports Server (NTRS)

    Mcewen, A. S.

    1993-01-01

    If fully successful, the Clementine mission will return about 3,000,000 lunar images and more than 5000 images of Geographos. Effective scientific analyses of such large datasets require systematic processing efforts. Concepts for two such efforts are described: global multispectral imaging of the moon; and videos of Geographos.

  13. Development of a systematic computer vision-based method to analyse and compare images of false identity documents for forensic intelligence purposes-Part I: Acquisition, calibration and validation issues.

    PubMed

    Auberson, Marie; Baechler, Simon; Zasso, Michaël; Genessay, Thibault; Patiny, Luc; Esseiva, Pierre

    2016-03-01

    Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, highlight links between documents produced by the same modus operandi or by the same source, and thus support forensic intelligence efforts. Inspired by previous research work on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise reproducibility and comparability of images. Different filters and comparison metrics have been evaluated and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters or their combination to extract profiles from images, and then the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be
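    The Canberra distance used for the profile comparison has a simple closed form: a sum of per-element absolute differences, each normalised by the sum of the element magnitudes. A minimal sketch, with purely illustrative profiles:

```python
def canberra_distance(p, q):
    """Canberra distance between two equal-length numeric profiles.

    Terms where both elements are zero are skipped (contribute nothing)."""
    assert len(p) == len(q)
    return sum(abs(a - b) / (abs(a) + abs(b))
               for a, b in zip(p, q) if abs(a) + abs(b) > 0)

d_same = canberra_distance([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])   # identical profiles
d_diff = canberra_distance([0.0, 1.0], [1.0, 0.0])             # maximally different
```

    Because each term is normalised, the metric is sensitive to proportional differences in small-valued elements, which suits intensity profiles extracted from filtered images.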

  14. [Network analyses in neuroimaging studies].

    PubMed

    Hirano, Shigeki; Yamada, Makiko

    2013-06-01

    Neurons are anatomically and physiologically connected to each other, and these connections are involved in various neuronal functions. Multiple important neural networks involved in neurodegenerative diseases can be detected using network analyses in functional neuroimaging. First, the basic methods and theories of voxel-based network analyses, such as principal component analysis, independent component analysis, and seed-based analysis, are described. Disease- and symptom-specific brain networks have been identified using glucose metabolism images in patients with Parkinson's disease. These networks enable us to objectively evaluate individual patients and serve as diagnostic tools as well as biomarkers for therapeutic interventions. Many functional MRI studies have shown that "hub" brain regions, such as the posterior cingulate cortex and medial prefrontal cortex, are deactivated by externally driven cognitive tasks; such brain regions form the "default mode network." Recent studies have shown that this default mode network is disrupted from the preclinical phase of Alzheimer's disease and is associated with amyloid deposition in the brain. Some recent studies have shown that the default mode network is also impaired in Parkinson's disease, whereas other studies have shown inconsistent results. These incongruent results could be due to the heterogeneous pharmacological status, differences in mesocortical dopaminergic impairment status, and concomitant amyloid deposition. Future neuroimaging network analysis studies will reveal novel and interesting findings that will uncover the pathomechanisms of neurological and psychiatric disorders. PMID:23735528

  15. IMAGES, IMAGES, IMAGES

    SciTech Connect

    Marcus, A.

    1980-07-01

    The role of images of information (charts, diagrams, maps, and symbols) for effective presentation of facts and concepts is expanding dramatically because of advances in computer graphics technology, increasingly hetero-lingual, hetero-cultural world target populations of information providers, the urgent need to convey more efficiently vast amounts of information, the broadening population of (non-expert) computer users, the decrease of available time for reading texts and for decision making, and the general level of literacy. A coalition of visual performance experts, human engineering specialists, computer scientists, and graphic designers/artists is required to resolve human factors aspects of images of information. The need for, nature of, and benefits of interdisciplinary effort are discussed. The results of an interdisciplinary collaboration are demonstrated in a product for visualizing complex information about global energy interdependence. An invited panel will respond to the presentation.

  16. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    Model calculations and analyses have been carried out to compare with several sets of data (dose, induced radioactivity in various experiment samples and spacecraft components, fission foil measurements, and LET spectra) from passive radiation dosimetry on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The calculations and data comparisons are used to estimate the accuracy of current models and methods for predicting the ionizing radiation environment in low earth orbit. The emphasis is on checking the accuracy of trapped proton flux and anisotropy models.

  17. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    This report covers work performed by Science Applications International Corporation (SAIC) under contract NAS8-39386 from the NASA Marshall Space Flight Center entitled LDEF Satellite Radiation Analyses. The basic objective of the study was to evaluate the accuracy of present models and computational methods for defining the ionizing radiation environment for spacecraft in Low Earth Orbit (LEO) by making comparisons with radiation measurements made on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The emphasis of the work here is on predictions and comparisons with LDEF measurements of induced radioactivity and Linear Energy Transfer (LET) measurements. These model/data comparisons have been used to evaluate the accuracy of current models for predicting the flux and directionality of trapped protons for LEO missions.

  18. EEG analyses with SOBI.

    SciTech Connect

    Glickman, Matthew R.; Tang, Akaysha

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.

  19. Network Class Superposition Analyses

    PubMed Central

    Pearson, Carl A. B.; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., for the yeast cell cycle process [1]), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141

  20. Network class superposition analyses.

    PubMed

    Pearson, Carl A B; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141
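    One of the quantities mentioned, a Shannon entropy derived from the stochastic matrix T, can be illustrated directly: each row of a row-stochastic T is a probability distribution over successor states, and its entropy measures how uncertain the ensemble's next transition is. A toy sketch (the matrix below is invented, not derived from the yeast model, and this is only the entropy computation, not the paper's full experiment-selection procedure):

```python
import math

def transition_entropy(T):
    """Shannon entropy (in bits) of each row of a row-stochastic matrix T."""
    return [-sum(p * math.log2(p) for p in row if p > 0.0) for row in T]

# Toy 3-state ensemble: deterministic rows have zero entropy,
# evenly split rows have maximal entropy.
T = [[0.5,  0.5,  0.0],
     [1.0,  0.0,  0.0],
     [0.25, 0.25, 0.5]]
H = transition_entropy(T)
```

    States whose rows have high entropy are the ones where an experiment would discriminate most between candidate networks, which is the intuition behind entropy-guided experiment selection.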

  1. NOAA's National Snow Analyses

    NASA Astrophysics Data System (ADS)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational, products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products.
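    Newtonian nudging, in its simplest scalar form, adds a relaxation term that pulls a model state toward an observation on a chosen time scale. The toy sketch below shows that form only; it is not the NOHRSC operational implementation, and the weight, time scale, and snow-water-equivalent numbers are all illustrative assumptions:

```python
def nudge_step(state, obs, weight, dt, tau):
    """One Newtonian-nudging update: relax the model state toward an observation.

    weight: observation quality/confidence factor in [0, 1]
    dt:     model time step; tau: nudging (relaxation) time scale, same units."""
    return state + weight * (dt / tau) * (obs - state)

# Relax a modeled snow water equivalent toward an airborne observation
swe, obs = 120.0, 100.0            # mm (illustrative)
for _ in range(48):                # 48 hourly steps, 12 h relaxation time scale
    swe = nudge_step(swe, obs, weight=1.0, dt=1.0, tau=12.0)
```

    Because the correction is applied gradually over many time steps rather than as a single replacement, the model stays approximately in balance, which is advantage (1) listed above.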

  2. On study design in neuroimaging heritability analyses

    NASA Astrophysics Data System (ADS)

    Koran, Mary Ellen; Li, Bo; Jahanshad, Neda; Thornton-Wells, Tricia A.; Glahn, David C.; Thompson, Paul M.; Blangero, John; Nichols, Thomas E.; Kochunov, Peter; Landman, Bennett A.

    2014-03-01

    Imaging genetics is an emerging methodology that combines genetic information with imaging-derived metrics to understand how genetic factors impact observable structural, functional, and quantitative phenotypes. Many of the best-known genetic studies are based on Genome-Wide Association Studies (GWAS), which use large populations of related or unrelated individuals to associate traits and disorders with individual genetic factors. Merging imaging and genetics may potentially lead to improved power of association in GWAS because imaging traits may be more sensitive phenotypes, being closer to underlying genetic mechanisms, and because their quantitative nature inherently increases power. We are developing SOLAR-ECLIPSE (SE) imaging genetics software which is capable of performing genetic analyses with both large-scale quantitative trait data and family structures of variable complexity. This program can estimate the contribution of genetic commonality among related subjects to a given phenotype, and essentially answer the question of whether or not the phenotype is heritable. This central factor of interest, heritability, offers bounds on the direct genetic influence over observed phenotypes. In order for a trait to be a good phenotype for GWAS, it must be heritable: at least some proportion of its variance must be due to genetic influences. A variety of family structures are commonly used for estimating heritability, yet the variability and biases for each as a function of the sample size are unknown. Herein, we investigate the ability of SOLAR to accurately estimate heritability models based on imaging data simulated using Monte Carlo methods implemented in R. We characterize the bias and the variability of heritability estimates from SOLAR as a function of sample size and pedigree structure (including twins, nuclear families, and nuclear families with grandparents).
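    The Monte Carlo logic described (simulate phenotypes with a known heritability, then check what the estimator recovers) can be sketched with a far simpler estimator than SOLAR's variance-component machinery: Falconer's twin formula h² ≈ 2(r_MZ − r_DZ). The model, sample sizes, and h² value below are assumptions for illustration only:

```python
import numpy as np

def simulate_twins(n_pairs, h2, monozygotic, rng):
    """Twin-pair phenotypes under an additive-genetic + unique-environment model."""
    g_shared = rng.standard_normal(n_pairs)
    if monozygotic:
        g1 = g2 = g_shared                      # MZ twins share all genetic variance
    else:
        # DZ twins share, on average, half of their additive genetic variance
        g1 = np.sqrt(0.5) * g_shared + np.sqrt(0.5) * rng.standard_normal(n_pairs)
        g2 = np.sqrt(0.5) * g_shared + np.sqrt(0.5) * rng.standard_normal(n_pairs)
    e1, e2 = rng.standard_normal(n_pairs), rng.standard_normal(n_pairs)
    s_g, s_e = np.sqrt(h2), np.sqrt(1.0 - h2)
    return s_g * g1 + s_e * e1, s_g * g2 + s_e * e2

def falconer_h2(n_pairs, h2, rng):
    """Falconer's estimate: h2 ~ 2 * (r_MZ - r_DZ)."""
    r_mz = np.corrcoef(*simulate_twins(n_pairs, h2, True, rng))[0, 1]
    r_dz = np.corrcoef(*simulate_twins(n_pairs, h2, False, rng))[0, 1]
    return 2.0 * (r_mz - r_dz)

rng = np.random.default_rng(42)
estimate = falconer_h2(200_000, 0.6, rng)       # should recover ~0.6
```

    Repeating such runs at different sample sizes is exactly how the bias and variability of an estimator can be characterized empirically.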

  3. EEO Implications of Job Analyses.

    ERIC Educational Resources Information Center

    Lacy, D. Patrick, Jr.

    1979-01-01

    Discusses job analyses as they relate to the requirements of Title VII of the Civil Rights Act of 1964, the Equal Pay Act of 1963, and the Rehabilitation Act of 1973. Argues that job analyses can establish the job-relatedness of entrance requirements and aid in defenses against charges of discrimination. Journal availability: see EA 511 615.

  4. Feed analyses and their interpretation.

    PubMed

    Hall, Mary Beth

    2014-11-01

    Compositional analysis is central to determining the nutritional value of feedstuffs for use in ration formulation. The utility of the values and how they should be used depends on how representative the feed subsample is, the nutritional relevance and analytical variability of the assays, and whether an analysis is suitable to be applied to a particular feedstuff. Commercial analyses presently available for carbohydrates, protein, and fats have improved nutritionally pertinent description of feed fractions. Factors affecting interpretation of feed analyses and the nutritional relevance and application of currently available analyses are discussed.

  5. Analysing the Metaphorical Images of Turkish Preschool Teachers

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2008-01-01

    The metaphorical basis of teacher reflection about teaching and learning has been a rich area of theory and research. This is a study of metaphor as a shared system of interpretation and classification, which teachers and student teachers and their supervising teachers can cooperatively explore. This study employs metaphor as a means of research…

  6. Emerging in vivo analyses of cell function using fluorescence imaging (*).

    PubMed

    Lippincott-Schwartz, Jennifer

    2011-01-01

    Understanding how cells of all types sense external and internal signals and how these signals are processed to yield particular responses is a major goal of biology. Genetically encoded fluorescent proteins (FPs) and fluorescent sensors are playing an important role in achieving this comprehensive knowledge base of cell function. Providing high sensitivity and immense versatility while being minimally perturbing to a biological specimen, the probes can be used in different microscopy techniques to visualize cellular processes on many spatial scales. Three review articles in this volume discuss recent advances in probe design and applications. These developments help expand the range of biochemical processes in living systems suitable for study. They provide researchers with exciting new tools to explore how cellular processes are organized and their activity regulated in vivo.

  7. Image forensic analyses that elude the human visual system

    NASA Astrophysics Data System (ADS)

    Farid, Hany; Bravo, Mary J.

    2010-01-01

    While historically we may have been overly trusting of photographs, in recent years there has been a backlash of sorts and the authenticity of photographs is now routinely questioned. Because these judgments are often made by eye, we wondered how reliable the human visual system is in detecting discrepancies that might arise from photo tampering. We show that the visual system is remarkably inept at detecting simple geometric inconsistencies in shadows, reflections, and perspective distortions. We also describe computational methods that can be applied to detect the inconsistencies that seem to elude the human visual system.

  8. Stereological analyses of the whole human pancreas

    PubMed Central

    Poudel, Ananta; Fowler, Jonas L.; Zielinski, Mark C.; Kilimnik, German; Hara, Manami

    2016-01-01

    The large size of human tissues requires a practical stereological approach to perform a comprehensive analysis of the whole organ. We have developed a method to quantitatively analyze the whole human pancreas, as one of the challenging organs to study, in which endocrine cells form various sizes of islets that are scattered unevenly throughout the exocrine pancreas. Furthermore, the human pancreas possesses intrinsic characteristics of intra-individual variability, i.e. regional differences in endocrine cell/islet distribution, and marked inter-individual heterogeneity regardless of age, sex and disease conditions including obesity and diabetes. The method is built based on large-scale image capture, computer-assisted unbiased image analysis and quantification, and further mathematical analyses, using widely-used software such as Fiji/ImageJ and MATLAB. The present study includes detailed protocols of every procedure as well as all the custom-written computer scripts, which can be modified according to specific experimental plans and specimens of interest. PMID:27658965

  9. Feed analyses and their interpretation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Compositional analysis is central to determining the nutritional value of feedstuffs. The utility of the values and how they should be used depends on how representative the feed subsample is, the nutritional relevance of the assays, analytical variability of the analyses, and whether a feed is suit...

  10. Mitogenomic analyses of caniform relationships.

    PubMed

    Arnason, Ulfur; Gullberg, Anette; Janke, Axel; Kullberg, Morgan

    2007-12-01

    Extant members of the order Carnivora split into two basal groups, Caniformia (dog-like carnivorans) and Feliformia (cat-like carnivorans). In this study we address phylogenetic relationships within Caniformia, applying various methodological approaches to analyses of complete mitochondrial genomes. Pinnipeds are currently well represented with respect to mitogenomic data, and here we add seven mt genomes to the non-pinniped caniform collection. The analyses identified a basal caniform divergence between Cynoidea and Arctoidea. Arctoidea split into three primary groups, Ursidae (including the giant panda), Pinnipedia, and a branch, Musteloidea, which encompassed Ailuridae (red panda), Mephitidae (skunks), Procyonidae (raccoons) and Mustelidae (mustelids). The analyses favored a basal arctoid split between Ursidae and a branch containing Pinnipedia and Musteloidea. Within the Musteloidea there was a preference for a basal divergence between Ailuridae and the remaining families. Among the latter, the analyses identified a sister group relationship between Mephitidae and a branch that contained Procyonidae and Mustelidae. The mitogenomic distance between the wolf and the dog was shown to be at the same level as that of basal human divergences. The wolf and the dog are commonly considered as separate species in the popular literature. The mitogenomic result is inconsistent with that understanding, while also providing insight into the time of the domestication of the dog relative to basal human mitogenomic divergences.

  11. Introduction to Project Materials Analyses

    ERIC Educational Resources Information Center

    Haley, Frances

    1972-01-01

    The author introduces twenty-six analyses, describes the method of analysis, includes a selection policy for this issue, and lists ten analysts. Each project, analyzed by the combined criteria of the CMAS and the NCSS Guidelines, is examined for background information, product characteristics, rationale and objectives, content, methodology,…

  12. Analysing Children's Drawings: Applied Imagination

    ERIC Educational Resources Information Center

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  13. FORTRAN Algorithm for Image Processing

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Hull, David R.

    1987-01-01

    FORTRAN computer algorithm containing various image-processing analysis and enhancement functions developed. Algorithm developed specifically to process images of developmental heat-engine materials obtained with sophisticated nondestructive evaluation instruments. Applications of program include scientific, industrial, and biomedical imaging for studies of flaws in materials, analyses of steel and ores, and pathology.
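    The program itself is FORTRAN and not reproduced in this record; as an illustration of the kind of enhancement function it describes, here is a minimal contrast-stretching sketch in Python (function name, percentile choices and the synthetic "flaw" image are illustrative, not from the original):

```python
import numpy as np

def contrast_stretch(img, lo_pct=2, hi_pct=98):
    """Linearly rescale intensities so the lo/hi percentiles map to 0..255."""
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    out = (img.astype(float) - lo) / max(hi - lo, 1e-12)
    return np.clip(out * 255, 0, 255).astype(np.uint8)

# toy image: a faint square "flaw" on a noisy background
rng = np.random.default_rng(0)
img = rng.normal(100, 5, (64, 64))
img[24:40, 24:40] += 20
stretched = contrast_stretch(img)  # flaw now spans the full 0..255 range
```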

  14. Workload analyse of assembling process

    NASA Astrophysics Data System (ADS)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes, whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper presents workload analyses of a largely manual assembly technology for a roller bearing assembly process, carried out in a large company with integrated bearing manufacturing processes. In these analyses the delay sampling technique was used to identify and categorize all the bearing assemblers' activities and to establish how much of the 480-minute working day the workers devote to each activity. The study shows some ways to increase process productivity without supplementary investment and also indicates that process automation could be the way to reach maximum productivity.
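    The delay (work) sampling technique mentioned above can be sketched as follows: random-instant observations of a worker are tallied and each activity's share is scaled to the 480-minute day. Activity names and counts here are invented for illustration:

```python
from collections import Counter

def work_sampling_minutes(observations, shift_minutes=480):
    """Estimate minutes/day per activity from random-instant observations."""
    counts = Counter(observations)
    total = len(observations)
    return {act: shift_minutes * n / total for act, n in counts.items()}

# 100 hypothetical snapshots of one assembler's day
obs = ["assemble"] * 70 + ["fetch parts"] * 20 + ["idle"] * 10
est = work_sampling_minutes(obs)  # e.g. 70% of 480 min = 336 min assembling
```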

  15. Mitogenomic analyses of eutherian relationships.

    PubMed

    Arnason, U; Janke, A

    2002-01-01

    Reasonably correct phylogenies are fundamental to the testing of evolutionary hypotheses. Here, we present phylogenetic findings based on analyses of 67 complete mammalian mitochondrial (mt) genomes. The analyses, irrespective of whether they were performed at the amino acid (aa) level or on nucleotides (nt) of first and second codon positions, placed Erinaceomorpha (hedgehogs and their kin) as the sister group of remaining eutherians. Thus, the analyses separated Erinaceomorpha from other traditional lipotyphlans (e.g., tenrecs, moles, and shrews), making traditional Lipotyphla polyphyletic. Both the aa and nt data sets identified the two order-rich eutherian clades, the Cetferungulata (comprising Pholidota, Carnivora, Perissodactyla, Artiodactyla, and Cetacea) and the African clade (Tenrecomorpha, Macroscelidea, Tubulidentata, Hyracoidea, Proboscidea, and Sirenia). The study corroborated recent findings that have identified a sister-group relationship between Anthropoidea and Dermoptera (flying lemurs), thereby making our own order, Primates, a paraphyletic assembly. Molecular estimates using paleontologically well-established calibration points, placed the origin of most eutherian orders in Cretaceous times, 70-100 million years before present (MYBP). The same estimates place all primate divergences much earlier than traditionally believed. For example, the divergence between Homo and Pan is estimated to have taken place approximately 10 MYBP, a dating consistent with recent findings in primate paleontology.
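    The molecular dating described above rests on scaling genetic distances by a rate fixed at a paleontologically well-established calibration point; a minimal sketch, with purely illustrative numbers (not the paper's data):

```python
def divergence_time(distance, calib_distance, calib_time_mybp):
    """Scale a pairwise genetic distance by a calibration point.

    Assumes rate constancy; both distances are pairwise (both lineages
    folded in), so the factor of two cancels.
    """
    rate = calib_distance / calib_time_mybp  # distance units per Myr
    return distance / rate

# hypothetical: calibration pair at distance 0.30 diverged 60 MYBP;
# a query pair at distance 0.05 then dates to 10 MYBP
t = divergence_time(0.05, 0.30, 60)
```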

  16. Biological aerosol warner and analyser

    NASA Astrophysics Data System (ADS)

    Schlemmer, Harry; Kürbitz, Gunther; Miethe, Peter; Spieweck, Michael

    2006-05-01

    The development of an integrated sensor device, BiSAM (Biological Sampling and Analysing Module), is presented; it is designed for rapid detection of aerosol or dust particles potentially loaded with biological warfare agents. All functional steps, from aerosol collection via immunoassay to display of results, are fully automated. The core component of the sensor device is an ultra-sensitive rapid analyser, the PBA (Portable Benchtop Analyser), based on a three-dimensional immunofiltration column of large internal area, Poly-HRP marker technology and kinetic optical detection. High sensitivity despite the short measuring time, high chemical stability of the micro column and robustness against interferents make the PBA an ideal tool for fielded sensor devices. It is especially favourable to combine the PBA with a bio-collector because virtually no sample preparation is necessary. Overall, the BiSAM device is capable of detecting and identifying living microorganisms (bacteria, spores, viruses) as well as toxins in a measuring cycle of typically half an hour. In each batch up to 12 different tests can be run in parallel together with positive and negative controls to keep the false alarm rate low.

  17. Mars periglacial punctual features analyses

    NASA Astrophysics Data System (ADS)

    Machado, Adriane; Barata, Teresa; Ivo Alves, E.; Cunha, Pedro P.

    2012-11-01

    The presence of patterned ground on Mars has been reported in several papers, especially studies of polygon distribution, size and formation processes. In recent years, the presence of basketball terrain has also been noticed on Mars, and studies have been made to recognize it through the analysis of Mars Orbiter Camera (MOC) images. We have been developing an algorithm that automatically recognizes and extracts hummocky patterns on Mars related to landforms generated by freeze-thaw cycles, such as mud boil features. The algorithm is based on remote sensing data and compares the morphology and size of hummocks and mud boils from Adventdalen at Longyearbyen (Svalbard, Norway) with hummocky patterns on Mars using High Resolution Imaging Science Experiment (HiRISE) imagery.
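    The authors' recognition algorithm itself is not given in this record; as a generic sketch of the threshold-and-label approach such feature extraction often starts from (synthetic image and threshold are illustrative, not the published method):

```python
import numpy as np
from scipy import ndimage

def extract_blobs(img, thresh):
    """Threshold, then label connected bright regions (candidate hummocks)."""
    mask = img > thresh
    labels, n = ndimage.label(mask)           # 4-connectivity by default
    sizes = ndimage.sum(mask, labels, range(1, n + 1))  # pixels per blob
    return n, sizes

img = np.zeros((32, 32))
img[4:8, 4:8] = 1.0       # two synthetic "hummocks"
img[20:26, 20:26] = 1.0
n, sizes = extract_blobs(img, 0.5)  # 2 blobs, 16 and 36 pixels
```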

  18. Analysing photonic structures in plants

    PubMed Central

    Vignolini, Silvia; Moyroud, Edwige; Glover, Beverley J.; Steiner, Ullrich

    2013-01-01

    The outer layers of a range of plant tissues, including flower petals, leaves and fruits, exhibit an intriguing variation of microscopic structures. Some of these structures include ordered periodic multilayers and diffraction gratings that give rise to interesting optical appearances. The colour arising from such structures is generally brighter than pigment-based colour. Here, we describe the main types of photonic structures found in plants and discuss the experimental approaches that can be used to analyse them. These experimental approaches allow identification of the physical mechanisms producing structural colours with a high degree of confidence. PMID:23883949
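    For the ordered periodic multilayers mentioned above, the wavelength of peak reflectance at normal incidence follows from the optical thickness of one period (the first-order Bragg condition). A small sketch; the refractive indices and thicknesses are illustrative values, not measured plant data:

```python
def bragg_peak_wavelength(n1, d1, n2, d2, order=1):
    """Reflectance peak of a periodic two-layer stack at normal incidence.

    lambda = 2 * (n1*d1 + n2*d2) / order, with thicknesses in nm.
    """
    return 2 * (n1 * d1 + n2 * d2) / order

# hypothetical cellulose-like / air-like layers (nm): peak in the blue
lam = bragg_peak_wavelength(1.5, 80, 1.0, 120)  # 480 nm
```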

  19. THOR Turbulence Electron Analyser: TEA

    NASA Astrophysics Data System (ADS)

    Fazakerley, Andrew; Moore, Tom; Owen, Chris; Pollock, Craig; Wicks, Rob; Samara, Marilia; Rae, Jonny; Hancock, Barry; Kataria, Dhiren; Rust, Duncan

    2016-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The Turbulence Electron Analyser (TEA) will measure the plasma electron populations in the mission's Regions of Interest. It will collect a 3D electron velocity distribution with cadences as short as 5 ms. The instrument will be capable of measuring energies up to 30 keV. TEA consists of multiple electrostatic analyser heads arranged so as to measure electrons arriving from look directions covering the full sky, i.e. 4 pi solid angle. The baseline concept is similar to the successful FPI-DES instrument currently operating on the MMS mission. TEA is intended to have a similar angular resolution, but a larger geometric factor. In comparison to earlier missions, TEA improves on the measurement cadence. For example, MMS FPI-DES routinely operates at 30 ms cadence. The objective of measuring distributions at rates as fast as 5 ms is driven by the mission's scientific requirements to resolve electron gyroscale size structures, where plasma heating and fluctuation dissipation is predicted to occur. TEA will therefore be capable of making measurements of the evolution of distribution functions across thin (a few km) current sheets travelling past the spacecraft at up to 600 km/s, of the Power Spectral Density of fluctuations of electron moments and of distributions fast enough to match frequencies with waves expected to be dissipating turbulence (e.g. with 100 Hz whistler waves).

  20. imageMCR

    2011-09-27

    imageMCR is a user-friendly software package that provides a variety of inputs to preprocess and analyze hyperspectral image data using multivariate algorithms such as Multivariate Curve Resolution (MCR), Principal Component Analysis (PCA), Classical Least Squares (CLS) and Parallel Factor Analysis (PARAFAC). MCR provides a relative quantitative analysis of the hyperspectral image data without the need for standards, and it discovers all the emitting species (spectrally pure components) present in an image, even those for which there is no a priori information. Once the spectral components are discovered, they can be used for future MCR analyses or used with CLS algorithms to quickly extract concentration image maps for each component within spectral image data sets.
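    The CLS step can be sketched as an ordinary least-squares fit of known pure spectra to each pixel; this toy example (spectra and concentrations invented) illustrates the idea, not the imageMCR implementation itself:

```python
import numpy as np

def cls_concentration_maps(data, spectra):
    """Classical least squares.

    data:    (pixels x channels), spectra: (components x channels);
    solves data ~= conc @ spectra for conc (pixels x components).
    """
    conc, *_ = np.linalg.lstsq(spectra.T, data.T, rcond=None)
    return conc.T

# two made-up pure spectra over 4 channels
S = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
true_C = np.array([[0.3, 0.7], [0.9, 0.1]])  # two "pixels"
D = true_C @ S
C = cls_concentration_maps(D, S)  # recovers true_C
```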

  1. Perturbation analyses of intermolecular interactions.

    PubMed

    Koyama, Yohei M; Kobayashi, Tetsuya J; Ueda, Hiroki R

    2011-08-01

    Conformational fluctuations of a protein molecule are important to its function, and it is known that environmental molecules, such as water molecules, ions, and ligand molecules, significantly affect the function by changing the conformational fluctuations. However, it is difficult to systematically understand the role of environmental molecules because intermolecular interactions related to the conformational fluctuations are complicated. To identify important intermolecular interactions with regard to the conformational fluctuations, we develop herein (i) distance-independent and (ii) distance-dependent perturbation analyses of the intermolecular interactions. We show that these perturbation analyses can be realized by performing (i) a principal component analysis using conditional expectations of truncated and shifted intermolecular potential energy terms and (ii) a functional principal component analysis using products of intermolecular forces and conditional cumulative densities. We refer to these analyses as intermolecular perturbation analysis (IPA) and distance-dependent intermolecular perturbation analysis (DIPA), respectively. For comparison of the IPA and the DIPA, we apply them to the alanine dipeptide isomerization in explicit water. Although the first IPA principal components discriminate two states (the α state and the PPII (polyproline II) + β states) at larger cutoff lengths, the separation between the PPII state and the β state is unclear in the second IPA principal components. On the other hand, at large cutoff values the DIPA eigenvalues converge faster than those of the IPA, and the top two DIPA principal components clearly identify the three states. By using the DIPA biplot, the contributions of the dipeptide-water interactions to each state are analyzed systematically. Since the DIPA improves the state identification and the convergence rate while retaining distance information, we conclude that the DIPA is a more practical method compared with the

  2. Perturbation analyses of intermolecular interactions

    NASA Astrophysics Data System (ADS)

    Koyama, Yohei M.; Kobayashi, Tetsuya J.; Ueda, Hiroki R.

    2011-08-01

    Conformational fluctuations of a protein molecule are important to its function, and it is known that environmental molecules, such as water molecules, ions, and ligand molecules, significantly affect the function by changing the conformational fluctuations. However, it is difficult to systematically understand the role of environmental molecules because intermolecular interactions related to the conformational fluctuations are complicated. To identify important intermolecular interactions with regard to the conformational fluctuations, we develop herein (i) distance-independent and (ii) distance-dependent perturbation analyses of the intermolecular interactions. We show that these perturbation analyses can be realized by performing (i) a principal component analysis using conditional expectations of truncated and shifted intermolecular potential energy terms and (ii) a functional principal component analysis using products of intermolecular forces and conditional cumulative densities. We refer to these analyses as intermolecular perturbation analysis (IPA) and distance-dependent intermolecular perturbation analysis (DIPA), respectively. For comparison of the IPA and the DIPA, we apply them to the alanine dipeptide isomerization in explicit water. Although the first IPA principal components discriminate two states (the α state and the PPII (polyproline II) + β states) at larger cutoff lengths, the separation between the PPII state and the β state is unclear in the second IPA principal components. On the other hand, at large cutoff values the DIPA eigenvalues converge faster than those of the IPA, and the top two DIPA principal components clearly identify the three states. By using the DIPA biplot, the contributions of the dipeptide-water interactions to each state are analyzed systematically. Since the DIPA improves the state identification and the convergence rate while retaining distance information, we conclude that the DIPA is a more practical method compared with the
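    The distance-independent analysis described above builds on principal component analysis of per-frame interaction-energy terms; a generic PCA sketch on toy data (not the authors' IPA pipeline, which uses conditional expectations of truncated and shifted potential terms):

```python
import numpy as np

def pca(X):
    """PCA via SVD of the mean-centered data matrix (frames x energy terms)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt.T, s**2 / (len(X) - 1)  # scores, per-component variances

# toy "trajectory": 200 frames of 3 interaction-energy terms, one dominant mode
rng = np.random.default_rng(1)
t = rng.normal(size=(200, 1))
X = t @ np.array([[2.0, -1.0, 0.5]]) + 0.01 * rng.normal(size=(200, 3))
scores, var = pca(X)  # first component captures nearly all variance
```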

  3. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
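    A Monte Carlo uncertainty propagation with a simple correlation-based sensitivity ranking, in the spirit of (but not reproducing) the HEDR plan; the toy dose model and the distributions are invented for illustration:

```python
import numpy as np

def mc_uncertainty(model, param_samplers, n=10_000, seed=0):
    """Propagate parameter uncertainty; rank inputs by |corr| with output."""
    rng = np.random.default_rng(seed)
    samples = {k: f(rng, n) for k, f in param_samplers.items()}
    out = model(**samples)
    sens = {k: abs(np.corrcoef(v, out)[0, 1]) for k, v in samples.items()}
    return out, sens

# hypothetical dose model: dose = release * dispersion, both uncertain
out, sens = mc_uncertainty(
    lambda release, dispersion: release * dispersion,
    {"release": lambda rng, n: rng.lognormal(0, 1.0, n),
     "dispersion": lambda rng, n: rng.lognormal(0, 0.1, n)},
)
# the wider-uncertainty input (release) dominates the output
```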

  4. Analyses to improve operational flexibility

    SciTech Connect

    Trikouros, N.G.

    1986-01-01

    Operational flexibility is greatly enhanced if the technical bases for plant limits and design margins are fully understood, and the analyses necessary to evaluate the effect of plant modifications or changes in operating modes on these parameters can be performed as required. If a condition should arise that might jeopardize a plant limit or reduce operational flexibility, it would be necessary to understand the basis for the limit or the specific condition limiting operational flexibility and be capable of performing a reanalysis to either demonstrate that the limit will not be violated or to change the limit. This paper provides examples of GPU Nuclear efforts in this regard. Examples of Oyster Creek and Three Mile Island operating experiences are discussed.

  5. Chemical analyses of provided samples

    NASA Technical Reports Server (NTRS)

    Becker, Christopher H.

    1993-01-01

    Two batches of samples were received and chemical analysis was performed of the surface and near surface regions of the samples by the surface analysis by laser ionization (SALI) method. The samples included four one-inch optics and several paint samples. The analyses emphasized surface contamination or modification. In these studies, pulsed sputtering by 7 keV Ar+ and primarily single-photon ionization (SPI) by coherent 118 nm radiation (at approximately 5 x 10(exp 5) W/cm(sup 2) were used. For two of the samples, also multiphoton ionization (MPI) at 266 nm (approximately 5 x 10(exp 11) W/cm(sup 2) was used. Most notable among the results was the silicone contamination on Mg2 mirror 28-92, and that the Long Duration Exposure Facility (LDEF) paint sample had been enriched in K and Na and depleted in Zn, Si, B, and organic compounds relative to the control paint.

  6. Isotopic signatures by bulk analyses

    SciTech Connect

    Efurd, D.W.; Rokop, D.J.

    1997-12-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally.

  7. Genetic analyses of captive Alala (Corvus hawaiiensis) using AFLP analyses

    USGS Publications Warehouse

    Jarvi, Susan I.; Bianchi, Kiara R.

    2006-01-01

    affected by the mutation rate at microsatellite loci, thus introducing a bias. Also, the number of loci that can be studied is frequently limited to fewer than 10. This theoretically represents a maximum of one marker for each of 10 chromosomes. Dominant markers like AFLP allow a larger fraction of the genome to be screened. Large numbers of loci can be screened by AFLP to resolve very small individual differences that can be used for identification of individuals, estimates of pairwise relatedness and, in some cases, for parentage analyses. Since AFLP is a dominant marker (can not distinguish between +/+ homozygote versus +/- heterozygote), it has limitations for parentage analyses. Only when both parents are homozygous for the absence of alleles (-/-) and offspring show a presence (+/+ or +/-) can the parents be excluded. In this case, microsatellites become preferable as they have the potential to exclude individual parents when the other parent is unknown. Another limitation of AFLP is that the loci are generally less polymorphic (only two alleles/locus) than microsatellite loci (often >10 alleles/locus). While generally fewer than 10 highly polymorphic microsatellite loci are enough to exclude and assign parentage, it might require up to 100 or more AFLP loci. While there are pros and cons to different methodologies, the total number of loci evaluated by AFLP generally offsets the limitations imposed due to the dominant nature of this approach and end results between methods are generally comparable. Overall objectives of this study were to evaluate the level of genetic diversity in the captive population of Alala, to compare genetic data with currently available pedigree information, and to determine the extent of relatedness of mating pairs and among founding individuals.

  8. APXS ANALYSES OF BOUNCE ROCK: THE FIRST SHERGOTTITE ON MARS

    NASA Technical Reports Server (NTRS)

    Ming, Douglas W.; Zipfel, J.; Anderson, R.; Brueckner, J.; Clark, B. C.; Dreibus, G.; Economou, T.; Gellert, R.; Lugmair, G. W.; Klingelhoefer, G.

    2005-01-01

    During the MER Mission, an isolated rock at Meridiani Planum was analyzed by the Athena instrument suite [1]. Remote sensing instruments noticed its distinct appearance. Two areas on the untreated rock surface and one area that was abraded with the Rock Abrasion Tool were analyzed by Microscopic Imager, Mossbauer Mimos II [2], and Alpha Particle X-ray Spectrometer (APXS). Results of all analyses revealed a close relationship of this rock with known basaltic shergottites.

  9. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But the modeling and quantification is complicated by the characteristics of the available information, which involves, for example, sparse data, poor measurements and subjective information. This raises the question whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview on developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.
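    A common feature noted above is the consideration of an entire set of probabilistic models in one analysis; a minimal sketch of the resulting CDF envelope (a p-box), here over an invented family of normal models with an imprecise mean:

```python
import numpy as np
from math import erf, sqrt

def pbox_envelope(cdfs, x):
    """Lower/upper CDF bounds over a set of candidate models (a p-box)."""
    vals = np.array([cdf(x) for cdf in cdfs])
    return vals.min(axis=0), vals.max(axis=0)

def normal_cdf(mu, sigma=1.0):
    """CDF of N(mu, sigma) evaluated pointwise via erf."""
    return lambda x: np.array(
        [0.5 * (1 + erf((xi - mu) / (sigma * sqrt(2)))) for xi in np.atleast_1d(x)])

# all we assume is that the mean lies somewhere in {0, 0.5, 1}
lo, hi = pbox_envelope([normal_cdf(m) for m in (0.0, 0.5, 1.0)], [0.5])
# at x = 0.5 the probability is only bounded, not pinned down
```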

  10. Wide area microprobe analyser (WAMPA)

    NASA Astrophysics Data System (ADS)

    Rogoyski, A.; Skidmore, B.; Maheswaran, V.; Wright, I.; Zarnecki, J.; Pillinger, C.

    2006-10-01

    Wide area microprobe analyser (WAMPA) represents a new scientific instrument concept for planetary exploration. WAMPA builds on recently published research such as sensor webs and distributed microsensors [The sensor web: a new instrument concept, SPIE Symposium on Integrated Optics, 20-26 January 2001, San Jose, CA; Design considerations for distributed microsensor systems, Proceedings of the IEEE 1999 Custom Integrated Circuits Conference (CICC ’99), May 1999, pp. 279-286] but adds new sensor and localisation concepts. WAMPA is driven by the recurrent theme in spacecraft and sensor design to achieve smaller, lighter and lower-cost systems. The essential characteristics of the WAMPA design that differentiate it from other space science instruments are that WAMPA is both a wide-area instrument, consisting of a distributed set of sensors, and that each probe is designed to use little, if any, power. It achieves the former by being utilised in large numbers (>10), requiring that the individual probes be low mass (<100 g) and low volume (<10 cm). It is envisaged that the probes would be dispersed by landers or rovers as mission support instruments rather than primary science instruments and would be used in hostile environments and rugged terrains where the lander/rover could not be risked (see Fig. 1).

  11. Network analyses in systems pharmacology

    PubMed Central

    Berger, Seth I.; Iyengar, Ravi

    2009-01-01

    Systems pharmacology is an emerging area of pharmacology which utilizes network analysis of drug action as one of its approaches. By considering drug actions and side effects in the context of the regulatory networks within which the drug targets and disease gene products function, network analysis promises to greatly increase our knowledge of the mechanisms underlying the multiple actions of drugs. Systems pharmacology can provide new approaches for drug discovery for complex diseases. The integrated approach used in systems pharmacology can allow for drug action to be considered in the context of the whole genome. Network-based studies are becoming an increasingly important tool in understanding the relationships between drug action and disease susceptibility genes. This review discusses how analysis of biological networks has contributed to the genesis of systems pharmacology and how these studies have improved global understanding of drug targets, suggested new targets and approaches for therapeutics, and provided a deeper understanding of the effects of drugs. Taken together, these types of analyses can lead to new therapeutic options while improving the safety and efficacy of existing medications. Contact: ravi.iyengar@mssm.edu PMID:19648136

  12. Comparison between Inbreeding Analyses Methodologies.

    PubMed

    Esparza, Mireia; Martínez-Abadías, Neus; Sjøvold, Torstein; González-José, Rolando; Hernández, Miquel

    2015-12-01

    Surnames are widely used in inbreeding analysis, but the validity of results has often been questioned due to failure to comply with the prerequisites of the method. Here we analyze inbreeding in Hallstatt (Austria) between the 17th and the 19th centuries using both genealogies and surnames. The high and significant correlation between the results obtained by the two methods demonstrates the validity of using surnames in this kind of study. On the other hand, the inbreeding values obtained (0.24 x 10⁻³ in the genealogical analysis and 2.66 x 10⁻³ in the surname analysis) are lower than those observed in Europe for this period and for this kind of population, showing that the apparent isolation of Hallstatt's population is false. The temporal trend of inbreeding in both analyses does not follow the general European pattern, but shows a maximum in 1850 with a later decrease over the second half of the 19th century. This is probably due to the high migration rate implied by the construction of transport infrastructure around the 1870s. PMID:26987150
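    One standard surname-based estimator is Crow and Mange's F ≈ P/4, where P is the proportion of isonymous (same-surname) marriages; whether this paper uses exactly that estimator is an assumption on my part. A minimal sketch with invented data:

```python
def isonymy_inbreeding(couples):
    """Crow-Mange surname estimator: F ~= P/4, with P the share of
    marriages in which husband and wife carry the same surname."""
    p = sum(1 for h, w in couples if h == w) / len(couples)
    return p / 4

# hypothetical register: 1 isonymous marriage out of 100
couples = [("A", "A")] + [("A", "B")] * 99
F = isonymy_inbreeding(couples)  # 2.5 x 10^-3, same order as reported above
```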

  13. The relationship among sea surface roughness variations, oceanographic analyses, and airborne remote sensing analyses

    NASA Technical Reports Server (NTRS)

    Oertel, G. F.; Wade, T. L.

    1981-01-01

    The synthetic aperture radar (SAR) was studied to determine whether it could image large scale estuaries and oceanic features such as fronts and to explain the electromagnetic interaction between SAR and the individual surface front features. Fronts were observed to occur at the entrance to the Chesapeake Bay. The airborne measurements consisted of data collection by SAR onboard an F-4 aircraft and real aperture side looking radar (SLAR) in Mohawk aircraft. A total of 89 transects were flown. Surface roughness and color as well as temperature and salinity were evaluated. Cross-frontal surveys were made. Frontal shear and convergence flow were obtained. Surface active organic materials, it was indicated, are present at the air-sea interface. In all, 2000 analyses were conducted to characterize the spatial and temporal variabilities associated with water mass boundaries.

  14. NOx analyser interference from alkenes

    NASA Astrophysics Data System (ADS)

    Bloss, W. J.; Alam, M. S.; Lee, J. D.; Vazquez, M.; Munoz, A.; Rodenas, M.

    2012-04-01

    Nitrogen oxides (NO and NO2, collectively NOx) are critical intermediates in atmospheric chemistry. NOx abundance controls the levels of the primary atmospheric oxidants OH, NO3 and O3, and regulates the ozone production which results from the degradation of volatile organic compounds. NOx are also atmospheric pollutants in their own right, and NO2 is commonly included in air quality objectives and regulations. In addition to their role in controlling ozone formation, NOx levels affect the production of other pollutants such as the lachrymator PAN, and the nitrate component of secondary aerosol particles. Consequently, accurate measurement of nitrogen oxides in the atmosphere is of major importance for understanding our atmosphere. The most widely employed approach for the measurement of NOx is chemiluminescent detection of NO2* from the NO + O3 reaction, combined with NO2 reduction by either a heated catalyst or photoconvertor. The reaction between alkenes and ozone is also chemiluminescent; therefore alkenes may contribute to the measured NOx signal, depending upon the instrumental background subtraction cycle employed. This interference has been noted previously, and indeed the effect has been used to measure both alkenes and ozone in the atmosphere. Here we report the results of a systematic investigation of the response of a selection of NOx analysers, ranging from systems used for routine air quality monitoring to atmospheric research instrumentation, to a series of alkenes ranging from ethene to the biogenic monoterpenes, as a function of conditions (co-reactants, humidity). Experiments were performed in the European Photoreactor (EUPHORE) to ensure common calibration, a common sample for the monitors, and to unequivocally confirm the alkene (via FTIR) and NO2 (via DOAS) levels present. The instrument responses ranged from negligible levels up to 10 % depending upon the alkene present and conditions used. Such interferences may be of substantial importance
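    A linear correction for such an interference, assuming the analyser's fractional response to a given alkene has been characterized beforehand (all numbers below are hypothetical, not from the EUPHORE experiments):

```python
def correct_no2(measured_no2, alkene_ppb, sensitivity):
    """Subtract an empirically characterized alkene artefact from a
    chemiluminescence NO2 reading (all quantities in ppb)."""
    return measured_no2 - sensitivity * alkene_ppb

# hypothetical: 2% response to a monoterpene at 10 ppb on a 5 ppb reading
no2 = correct_no2(5.0, 10.0, 0.02)  # corrected value: 4.8 ppb
```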

  15. Ergonomic analyses of downhill skiing.

    PubMed

    Clarys, J P; Publie, J; Zinzen, E

    1994-06-01

    The purpose of this study was to provide electromyographic feedback for (1) pedagogical advice in motor learning, (2) the ergonomics of materials choice and (3) competition. For these purposes: (1) EMG data were collected for the Stem Christie, the Stem Turn and the Parallel Christie (three basic ski initiation drills) and verified for the complexity of patterns; (2) integrated EMG (iEMG) and linear envelopes (LEs) were analysed from standardized positions, motions and slopes using compact, soft and competition skis; (3) in a simulated 'parallel special slalom', the muscular activity pattern and intensity of excavated and flat snow conditions were compared. The EMG data from the three studies were collected on location in the French Alps (Tignes). The analog raw EMG was recorded on the slopes with a portable seven-channel FM recorder (TEAC MR30) and with pre-amplified bipolar surface electrodes supplied with a precision instrumentation amplifier (AD 524, Analog Devices, Norwood, USA). The raw signal was full-wave rectified and enveloped using a moving average principle. This linear envelope was normalized according to the highest peak amplitude procedure per subject and was integrated in order to obtain a reference of muscular intensity. In the three studies and for all subjects (elite skiers: n = 25 in studies 1 and 2, n = 6 in study 3), we found a high level of co-contractions in the lower limb extensors and flexors, especially during the extension phase of the ski movement. The Stem Christie and the Parallel Christie showed higher levels of rhythmic movement (92 and 84%, respectively).(ABSTRACT TRUNCATED AT 250 WORDS) PMID:8064970
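    The linear-envelope processing described above (full-wave rectification, moving-average smoothing, peak normalization, then integration for a muscular-intensity reference) can be sketched as follows; the window length and synthetic signal are illustrative:

```python
import numpy as np

def linear_envelope(emg, win=50):
    """Full-wave rectify, smooth with a moving average, normalize to peak."""
    rect = np.abs(emg)
    env = np.convolve(rect, np.ones(win) / win, mode="same")
    return env / env.max()

# synthetic burst of zero-mean "EMG" noise in the middle of the record
rng = np.random.default_rng(2)
sig = np.zeros(1000)
sig[300:700] = rng.normal(0, 1, 400)
env = linear_envelope(sig)
iemg = env.sum()  # integrated EMG, a reference of muscular intensity
```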

  16. ITER Safety Analyses with ISAS

    NASA Astrophysics Data System (ADS)

    Gulden, W.; Nisan, S.; Porfiri, M.-T.; Toumi, I.; de Gramont, T. Boubée

    1997-06-01

    Detailed analyses of accident sequences for the International Thermonuclear Experimental Reactor (ITER), from an initiating event to the environmental release of activity, have in the past involved the use of different types of computer codes in a sequential manner. Since these codes were developed at different time scales in different countries, there is no common computing structure to enable automatic data transfer from one code to the other, and no possibility exists to model or to quantify the effect of coupled physical phenomena. To solve this problem, the Integrated Safety Analysis System of codes (ISAS) is being developed, which allows users to integrate existing computer codes in a coherent manner. This approach is based on the utilization of a command language (GIBIANE) acting as a “glue” to integrate the various codes as modules of a common environment. The present version of ISAS allows comprehensive (coupled) calculations of a chain of codes such as ATHENA (thermal-hydraulic analysis of transients and accidents), INTRA (analysis of in-vessel chemical reactions, pressure build-up, and distribution of reaction products inside the vacuum vessel and adjacent rooms), and NAUA (transport of radiological species within buildings and to the environment). In the near future, the integration of SAFALY (simultaneous analysis of plasma dynamics and thermal behavior of in-vessel components) is also foreseen. The paper briefly describes the essential features of ISAS development and the associated software architecture. It gives first results of a typical ITER accident sequence, a loss of coolant accident (LOCA) in the divertor cooling loop inside the vacuum vessel, amply demonstrating ISAS capabilities.

  17. Quantum Image Encryption Algorithm Based on Quantum Image XOR Operations

    NASA Astrophysics Data System (ADS)

    Gong, Li-Hua; He, Xiang-Tao; Cheng, Shan; Hua, Tian-Xiang; Zhou, Nan-Run

    2016-07-01

    A novel encryption algorithm for quantum images based on quantum image XOR operations is designed. The quantum image XOR operations are designed by using hyper-chaotic sequences generated with Chen's hyper-chaotic system to control the controlled-NOT operation, which is used to encode gray-level information. The initial conditions of Chen's hyper-chaotic system are the keys, which guarantee the security of the proposed quantum image encryption algorithm. Numerical simulations and theoretical analyses demonstrate that the proposed quantum image encryption algorithm has a larger key space, higher key sensitivity, stronger resistance to statistical analysis and lower computational complexity than its classical counterparts.
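The core idea pairs a chaotic keystream with a per-pixel XOR, so encryption and decryption are the same operation. A hedged classical sketch of that idea follows; a logistic map stands in here for Chen's hyper-chaotic system, and `x0`/`r` play the role of the secret keys (all names and parameters are illustrative, not the paper's):

```python
import numpy as np

def chaotic_keystream(x0, r, n):
    """Generate a pseudo-random byte stream from a chaotic map.
    (A logistic map stands in for the hyper-chaotic system; the
    initial condition x0 and parameter r act as the secret keys.)"""
    stream = np.empty(n, dtype=np.uint8)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)              # chaotic map iteration
        stream[i] = int(x * 256) % 256     # quantize to a gray level
    return stream

def xor_encrypt(image, x0=0.3141, r=3.9999):
    """XOR each gray-level pixel with the chaotic keystream.
    Applying the function twice with the same keys recovers the image."""
    flat = image.ravel()
    key = chaotic_keystream(x0, r, flat.size)
    return (flat ^ key).reshape(image.shape)
```

Because XOR is an involution, `xor_encrypt(xor_encrypt(img))` returns the original image, mirroring the symmetric encrypt/decrypt structure of the quantum scheme.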

  18. Analyses of Transistor Punchthrough Failures

    NASA Technical Reports Server (NTRS)

    Nicolas, David P.

    1999-01-01

    The failure of two transistors in the Altitude Switch Assembly for the Solid Rocket Booster followed by two additional failures a year later presented a challenge to failure analysts. These devices had successfully worked for many years on numerous missions. There was no history of failures with this type of device. Extensive checks of the test procedures gave no indication for a source of the cause. The devices were manufactured more than twenty years ago and failure information on this lot date code was not readily available. External visual exam, radiography, PEID, and leak testing were performed with nominal results. Electrical testing indicated nearly identical base-emitter and base-collector characteristics (both forward and reverse) with a low-resistance short from emitter to collector. These characteristics are indicative of a classic failure mechanism called punchthrough. In failure analysis, punchthrough refers to a condition where a relatively low voltage pulse causes the device to conduct very hard, producing localized areas of thermal runaway or "hot spots". At one or more of these hot spots, the excessive currents melt the silicon. Heavily doped emitter material diffuses through the base region to the collector forming a diffusion pipe shorting the emitter to base to collector. Upon cooling, an alloy junction forms between the pipe and the base region. Generally, the hot spot (punchthrough site) is under the bond and no surface artifact is visible. The devices were delidded and the internal structures were examined microscopically. The gold emitter lead was melted on one device, but others had anomalies in the metallization around the intact emitter bonds. The SEM examination confirmed some anomalies to be cosmetic defects while other anomalies were artifacts of the punchthrough site. Subsequent to these analyses, the contractor determined that some irregular testing procedures occurred at the time of the failures heretofore unreported. These testing

  19. Computer analysis of mammography phantom images (CAMPI)

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.

    1997-05-01

    Computer analysis of mammography phantom images (CAMPI) is a method for objective and precise measurements of phantom image quality in mammography. This investigation applied CAMPI methodology to the Fischer Mammotest Stereotactic Digital Biopsy machine. Images of an American College of Radiology phantom centered on the largest two microcalcification groups were obtained on this machine under a variety of x-ray conditions. Analyses of the images revealed that the precise behavior of the CAMPI measures could be understood from basic imaging physics principles. We conclude that CAMPI is sensitive to subtle image quality changes and can perform accurate evaluations of images, especially of directly acquired digital images.

  20. Image data processing of earth resources management. [technology transfer

    NASA Technical Reports Server (NTRS)

    Desio, A. W.

    1974-01-01

    Various image processing and information extraction systems are described along with the design and operation of an interactive multispectral information system, IMAGE 100. Analyses of ERTS data, using IMAGE 100, over a number of U.S. sites are presented. The following analyses are included: (1) investigations of crop inventory and management using remote sensing; and (2) land cover classification for environmental impact assessments. Results show that useful information is provided by IMAGE 100 analyses of ERTS data in digital form.

  1. Image Calibration

    NASA Technical Reports Server (NTRS)

    Peay, Christopher S.; Palacios, David M.

    2011-01-01

    Calibrate_Image calibrates images obtained from focal plane arrays so that the output image more accurately represents the observed scene. The function takes as input a degraded image along with a flat field image and a dark frame image produced by the focal plane array and outputs a corrected image. The three most prominent sources of image degradation are corrected for: dark current accumulation, gain non-uniformity across the focal plane array, and hot and/or dead pixels in the array. In the corrected output image the dark current is subtracted, the gain variation is equalized, and values for hot and dead pixels are estimated, using bicubic interpolation techniques.
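A minimal sketch of the three corrections described (dark-frame subtraction, gain equalization from the flat field, and estimation of hot/dead pixels) might look like the following; the function name, the gain convention, and the use of SciPy's `cubic` interpolant in place of the routine's bicubic scheme are assumptions, not the NASA implementation:

```python
import numpy as np
from scipy.interpolate import griddata

def calibrate_image(degraded, dark, flat, bad_mask):
    """Correct a focal-plane-array image: subtract the dark frame,
    equalize gain using the flat field, and estimate bad pixels."""
    dark_sub = degraded.astype(float) - dark      # remove dark current
    gain = flat.astype(float) - dark              # per-pixel gain map
    gain /= gain.mean()                           # normalize mean gain to 1
    corrected = dark_sub / gain                   # flat-field correction
    # Estimate hot/dead pixels from their good neighbours; SciPy's
    # 'cubic' interpolant approximates the bicubic scheme mentioned above.
    yy, xx = np.indices(corrected.shape)
    good = ~bad_mask
    corrected[bad_mask] = griddata(
        (yy[good], xx[good]), corrected[good],
        (yy[bad_mask], xx[bad_mask]), method="cubic")
    return corrected
```

The hot/dead pixel map (`bad_mask`) would normally come from thresholding the dark frame and flat field themselves; it is passed in here to keep the sketch short.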

  2. Pawnee Nation Energy Option Analyses

    SciTech Connect

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  3. Indexing Images.

    ERIC Educational Resources Information Center

    Rasmussen, Edie M.

    1997-01-01

    Focuses on access to digital image collections by means of manual and automatic indexing. Contains six sections: (1) Studies of Image Systems and their Use; (2) Approaches to Indexing Images; (3) Image Attributes; (4) Concept-Based Indexing; (5) Content-Based Indexing; and (6) Browsing in Image Retrieval. Contains 105 references. (AEF)

  4. Integrated Field Analyses of Thermal Springs

    NASA Astrophysics Data System (ADS)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers, through the SURE internship offered by the Southern California Earthquake Center (SCEC), have examined thermal springs in southern Idaho and northern Utah, as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (Cogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data on the results while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed, chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms which were analyzed using single-colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas, using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing the NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor to keep mud from contaminating the equipment. The use of raster

  5. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For major and significant transactions, applicants shall submit impact analyses (exhibit 12) describing...

  6. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For major and significant transactions, applicants shall submit impact analyses (exhibit 12) describing...

  7. Oncological image analysis: medical and molecular image analysis

    NASA Astrophysics Data System (ADS)

    Brady, Michael

    2007-03-01

    This paper summarises the work we have been doing on joint projects with GE Healthcare on colorectal and liver cancer, and with Siemens Molecular Imaging on dynamic PET. First, we recall the salient facts about cancer and oncological image analysis. Then we introduce some of the work that we have done on analysing clinical MRI images of colorectal and liver cancer, specifically the detection of lymph nodes and segmentation of the circumferential resection margin. In the second part of the paper, we shift attention to the complementary aspect of molecular image analysis, illustrating our approach with some recent work on: tumour acidosis, tumour hypoxia, and multiply drug resistant tumours.

  8. Electron/proton spectrometer certification documentation analyses

    NASA Technical Reports Server (NTRS)

    Gleeson, P.

    1972-01-01

    A compilation of analyses generated during the development of the electron-proton spectrometer for the Skylab program is presented. The data documents the analyses required by the electron-proton spectrometer verification plan. The verification plan was generated to satisfy the ancillary hardware requirements of the Apollo Applications program. The certification of the spectrometer requires that various tests, inspections, and analyses be documented, approved, and accepted by reliability and quality control personnel of the spectrometer development program.

  9. MELCOR analyses for accident progression issues

    SciTech Connect

    Dingman, S.E.; Shaffer, C.J.; Payne, A.C.; Carmel, M.K. )

    1991-01-01

    Results of calculations performed with MELCOR and HECTR in support of the NUREG-1150 study are presented in this report. The analyses examined a wide range of issues. The analyses included integral calculations covering an entire accident sequence, as well as calculations that addressed specific issues that could affect several accident sequences. The results of the analyses for Grand Gulf, Peach Bottom, LaSalle, and Sequoyah are described, and the major conclusions are summarized. 23 refs., 69 figs., 8 tabs.

  10. Photothermal imaging

    NASA Astrophysics Data System (ADS)

    Lapotko, Dmitry; Antonishina, Elena

    1995-02-01

    An automated image analysis system with two imaging regimes is described. The photothermal (PT) effect is used for imaging the temperature field or absorption structure of a sample (the cell) with high sensitivity and spatial resolution. In cell studies the PT technique enables imaging of live, non-stained cells and monitoring of cell shape/structure. The system includes a dual-laser illumination unit coupled to a conventional optical microscope. A sample chamber provides automated or manual loading of up to 3 samples and cell positioning. For image detection a 256 × 256 10-bit CCD camera is used. The lasers, scanning stage, and camera are controlled by a PC. The system provides optical (transmitted light) image, probe laser optical image, and PT-image acquisition. The operation rate is 1 - 1.5 sec per cell for a cycle of cell positioning, acquisition of the 3 images, and calculation of image parameters. A special database provides image/parameter storage, presentation, and cell diagnostics according to quantitative image parameters. The described system has been tested in studies of live and stained blood cells. PT-images of the cells have been used for cell differentiation. In experiments with red blood cells (RBC) originating from normal and anaemic blood, parameters for disease differentiation were found. For white blood cells, the PT-images revealed details of cell structure that are absent in their optical images.

  11. Infrared imaging of comets

    NASA Technical Reports Server (NTRS)

    Telesco, Charles M.

    1988-01-01

    Thermal infrared imaging of comets provides fundamental information about the distribution of dust in their comae and tails. The imaging program at NASA Marshall Space Flight Center (MSFC) uses a unique 20-pixel bolometer array that was developed to image comets at 8 to 30 micrometers. These images provide the basis for: (1) characterizing the composition and size distribution of particles, (2) determining the mass-loss rates from cometary nuclei, and (3) describing the dynamics of the interaction between the dust and the solar radiation. Since the array became operational in 1985, researchers have produced a unique series of IR images of comets Giacobini-Zinner (GZ), Halley, and Wilson. That of GZ was the first groundbased thermal image ever made of a comet and was used to construct, with visible observations, an albedo map. Those data and dynamical analyses showed that GZ contained a population of large (approximately 300 micrometer), fluffy dust grains that formed a distinct inner tail. The accumulating body of images of various comets has also provided a basis for fruitfully intercomparing comet properties. Researchers also took advantage of the unique capabilities of the camera to resolve the inner, possibly protoplanetary, disk of the star Beta Pictoris; while not a comet research program, that study is a fruitful additional application of the array to solar system astronomy.

  12. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., an analysis of traffic flows indicating patterns of geographic competition or product competition... 49 Transportation 8 2011-10-01 2011-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a)...

  13. Aviation System Analysis Capability Executive Assistant Analyses

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Kostiuk, Peter

    1999-01-01

    This document describes the analyses that may be incorporated into the Aviation System Analysis Capability Executive Assistant. The document will be used as a discussion tool to enable NASA and other integrated aviation system entities to evaluate, discuss, and prioritize analyses.

  14. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  15. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  16. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  17. Operator-free flow injection analyser

    PubMed Central

    de Faria, Lourival C.

    1991-01-01

    A flow injection analyser has been constructed to allow an operator-free determination of up to 40 samples. Besides the usual FIA apparatus, the analyser includes a home-made sample introduction device made with three electromechanical three-way valves and an auto-sampler from Technicon which has been adapted to be commanded by an external digital signal. The analyser is controlled by a single-board SDK-8085 microcomputer. The necessary interface to couple the analyser components to the microcomputer is also described. The analyser was evaluated for a Cr(VI)-FIA determination, showing very good performance: the relative standard deviation for 15 signals from the injection of 100 μl of a 1.0 mg ml-1 standard Cr(VI) solution was 0.5%. PMID:18924899

  18. Image processing technology

    SciTech Connect

    Van Eeckhout, E.; Pope, P.; Balick, L.

    1996-07-01

    This is the final report of a two-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The primary objective of this project was to advance image processing and visualization technologies for environmental characterization. This was effected by developing and implementing analyses of remote sensing data from satellite and airborne platforms, and demonstrating their effectiveness in visualization of environmental problems. Many sources of information were integrated as appropriate using geographic information systems.

  19. Medical Imaging.

    ERIC Educational Resources Information Center

    Barker, M. C. J.

    1996-01-01

    Discusses four main types of medical imaging (x-ray, radionuclide, ultrasound, and magnetic resonance) and considers their relative merits. Describes important recent and possible future developments in image processing. (Author/MKR)

  20. Functional analyses and treatment of precursor behavior.

    PubMed

    Najdowski, Adel C; Wallace, Michele D; Ellsworth, Carrie L; MacAleese, Alicia N; Cleveland, Jackie M

    2008-01-01

    Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe problem behavior (precursor behavior) and evaluated treatments based on the outcomes of the functional analyses of precursor behavior. Responding for all participants was differentiated during the functional analyses, and individualized treatments eliminated precursor behavior. These results suggest that functional analysis of precursor behavior may offer an alternative, indirect method to assess the operant function of severe problem behavior. PMID:18468282

  2. Proof Image

    ERIC Educational Resources Information Center

    Kidron, Ivy; Dreyfus, Tommy

    2014-01-01

    The emergence of a proof image is often an important stage in a learner's construction of a proof. In this paper, we introduce, characterize, and exemplify the notion of proof image. We also investigate how proof images emerge. Our approach starts from the learner's efforts to construct a justification without (or before) attempting any…

  3. Image alignment

    SciTech Connect

    Dowell, Larry Jonathan

    2014-04-22

    Disclosed is a method and device for aligning at least two digital images. An embodiment may use frequency-domain transforms of small tiles created from each image to identify substantially similar, "distinguishing" features within each of the images, and then align the images together based on the location of the distinguishing features. To accomplish this, an embodiment may create equal-sized tile sub-images for each image. A "key" for each tile may be created by performing a frequency-domain transform calculation on each tile. An information-distance difference between each possible pair of tiles on each image may be calculated to identify distinguishing features. From analysis of the information-distance differences of the pairs of tiles, a subset of tiles with high discrimination metrics in relation to other tiles may be located for each image. The subset of distinguishing tiles for each image may then be compared to locate tiles with substantially similar keys and/or information-distance metrics to other tiles of other images. Once similar tiles are located for each image, the images may be aligned in relation to the identified similar tiles.
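Under stated assumptions (FFT magnitudes as the tile keys, and plain Euclidean distance standing in for the patent's information-distance metric), the tile-matching idea can be sketched as:

```python
import numpy as np

def tile_keys(image, tile=16):
    """Cut an image into equal-sized tiles and compute a frequency-domain
    'key' (FFT magnitude) for each tile."""
    h, w = image.shape
    keys, coords = [], []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            patch = image[y:y + tile, x:x + tile].astype(float)
            keys.append(np.abs(np.fft.fft2(patch)).ravel())
            coords.append((y, x))
    return np.array(keys), coords

def align_offset(img_a, img_b, tile=16):
    """Estimate the (dy, dx) shift of img_b relative to img_a by matching
    the most 'distinguishing' tile of img_a (the one farthest, on average,
    from its peers) to the best-matching tile of img_b."""
    keys_a, coords_a = tile_keys(img_a, tile)
    keys_b, coords_b = tile_keys(img_b, tile)
    # Distinguishing tile: largest mean distance to the other tiles of img_a.
    d = np.linalg.norm(keys_a[:, None, :] - keys_a[None, :, :], axis=2)
    i = d.mean(axis=1).argmax()
    # Best match for that tile's key among the tiles of img_b.
    j = np.linalg.norm(keys_b - keys_a[i], axis=1).argmin()
    (ya, xa), (yb, xb) = coords_a[i], coords_b[j]
    return ya - yb, xa - xb
```

A real implementation would match several distinguishing tiles and solve for the alignment robustly; one tile suffices to illustrate the key/distance mechanics.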

  4. Intracranial imaging.

    PubMed Central

    Gibson, M.; Cook, G.; Al-Kutoubi, A.

    1996-01-01

    This article concentrates on the imaging of intracranial structures and outlines some basic imaging strategies for common clinical presentations. PMID:8935596

  5. Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst

    NASA Astrophysics Data System (ADS)

    Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina

    2015-03-01

    In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.

  6. SCM Forcing Data Derived from NWP Analyses

    DOE Data Explorer

    Jakob, Christian

    2008-01-15

    Forcing data, suitable for use with single column models (SCMs) and cloud resolving models (CRMs), have been derived from NWP analyses for the ARM (Atmospheric Radiation Measurement) Tropical Western Pacific (TWP) sites of Manus Island and Nauru.

  7. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF TRANSPORTATION RULES OF PRACTICE RAILROAD ACQUISITION, CONTROL, MERGER, CONSOLIDATION PROJECT, TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... company's marketing plan and existing and potential competitive alternatives (inter- as well as...

  8. Quality control considerations in performing washability analyses

    SciTech Connect

    Graham, R.D.

    1984-10-01

    The author describes, in considerable detail, the procedures for carrying out washability analyses as laid down in ASTM Standard Test Method D4371. These include sampling, sample preparation, hydrometer standardisation, washability testing, and analysis of specific gravity fractions.

  9. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and... by conducting additional analyses using any standard engineering economics method such as sensitivity... energy or water system alternative....

  10. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and... by conducting additional analyses using any standard engineering economics method such as sensitivity... energy or water system alternative....

  11. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  12. Comparison with Russian analyses of meteor impact

    SciTech Connect

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  13. Analyses and forecasts with LAWS winds

    NASA Technical Reports Server (NTRS)

    Wang, Muyin; Paegle, Jan

    1994-01-01

    Horizontal fluxes of atmospheric water vapor are studied for summer months during 1989 and 1992 over North and South America based on analyses from the European Centre for Medium-Range Weather Forecasts, the US National Meteorological Center, and the United Kingdom Meteorological Office. The calculations are performed over 20 deg by 20 deg box-shaped midlatitude domains located to the east of the Rocky Mountains in North America, and to the east of the Andes Mountains in South America. The fluxes are determined from operational center gridded analyses of wind and moisture. Differences in the monthly mean moisture flux divergence determined from these analyses are as large as 7 cm/month precipitable water equivalent over South America, and 3 cm/month over North America. Gridded analyses at higher spatial and temporal resolution exhibit better agreement in the moisture budget study. However, significant discrepancies of the moisture flux divergence computed from different gridded analyses still exist. The conclusion is more pessimistic than Rasmusson's estimate based on station data. Further analysis reveals that the most significant sources of error result from model surface elevation fields, gaps in the data archive, and uncertainties in the wind and specific humidity analyses. Uncertainties in the wind analyses are the most important problem. The low-level jets, in particular, are substantially different in the different data archives. Part of the reason for this may be due to the way the different analysis models parameterized physical processes affecting low-level jets. The results support the inference that the noise/signal ratio of the moisture budget may be improved more rapidly by providing better wind observations and analyses than by providing better moisture data.
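The quantity compared across the gridded analyses, the horizontal moisture flux divergence, can be computed from gridded wind and humidity with simple finite differences. This sketch assumes a regular Cartesian grid and is not any center's operational method:

```python
import numpy as np

def moisture_flux_divergence(q, u, v, dx, dy):
    """Finite-difference divergence of the horizontal moisture flux
    (q*u, q*v) on a regular grid with spacings dx (x, axis=1) and
    dy (y, axis=0)."""
    fx, fy = q * u, q * v                     # flux components
    dfx_dx = np.gradient(fx, dx, axis=1)      # d(qu)/dx
    dfy_dy = np.gradient(fy, dy, axis=0)      # d(qv)/dy
    return dfx_dx + dfy_dy
```

Integrated over a budget box, this divergence (times the box area) is the net moisture export whose inter-analysis disagreement the study quantifies.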

  14. A History of Rotorcraft Comprehensive Analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  15. Albedo Pattern Recognition and Time-Series Analyses in Malaysia

    NASA Astrophysics Data System (ADS)

    Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.

    2012-07-01

    Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate condition monitoring. This study was conducted to identify albedo pattern changes in Malaysia. The recognized patterns and changes will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. The images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration, and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and relate the pattern to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified in the maximum and minimum values of the albedo, and the rises and falls of the line graph follow a similar trend in the daily observations; the differences lie in the magnitude of those rises and falls. Thus, it can be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern of albedo changes with respect to the nebulosity index indicates that external factors also affect the albedo values, as the plotted sky conditions and diffusion do not have a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows a high negative linear

  16. Web-based cephalometric procedure for craniofacial and dentition analyses

    NASA Astrophysics Data System (ADS)

    Arun Kumar, N. S.; Kamath, Srijit R.; Ram, S.; Muthukumaran, B.; Venkatachalapathy, A.; Nandakumar, A.; Jayakumar, P.

    2000-05-01

    Craniofacial analysis is an important and widely used procedure in orthodontic cephalometry, playing a key role in diagnosis and treatment planning. It involves establishing reference standards and specifying landmarks and variables, and the manual approach takes up a tremendous amount of the orthodontist's time. In this paper, we develop a web-based approach to craniofacial and dentition analyses. A digital computed radiography (CR) system is used to obtain the craniofacial image, which is stored as a bitmap file. The system comprises two components: a server and a client. The server component is a program that runs on a remote machine. To use the system, the user connects to the website; the client component is then activated, uploads the image from the PC and displays it on the canvas area. The landmarks are identified using a mouse interface and the reference lines are generated. The resulting image is then sent to the server, which performs all measurements and calculates the mean, standard deviation, etc. of the variables. The results are sent immediately back to the client, where they are displayed in a separate frame alongside the standard values for comparison. This system eliminates the need for every user to load other expensive programs on his or her machine.

  17. Dynamic and static error analyses of neutron radiography testing

    SciTech Connect

    Joo, H.; Glickstein, S.S.

    1999-03-01

    Neutron radiography systems are being used for real-time visualization of dynamic behavior as well as time-averaged measurements of spatial vapor fraction distributions in two-phase fluids. The data, in the form of video images, are typically recorded on videotape at 30 frames per second, and image analysis of the video pictures is used to extract time-dependent or time-averaged data. Determining the average vapor fraction requires averaging the logarithm of the time-dependent intensity measurements of the neutron beam (the gray-scale distribution of the image) that passes through the fluid. This can be significantly different from averaging the intensity of the transmitted beam and then taking the logarithm of that average. The difference is termed the dynamic error (the error in the time-averaged vapor fractions due to the inherent time dependence of the measured data) and is separate from the static error (statistical sampling uncertainty). Detailed analyses of both sources of error are discussed.
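
    The dynamic error described above is a consequence of Jensen's inequality: when the intensity fluctuates, the logarithm of the time-averaged intensity always exceeds the time average of the logarithms. A minimal numerical illustration (the intensity values are invented for the example):

```python
import math

# Normalised transmitted intensities I/I0 fluctuating in time as
# vapor passes through the beam (illustrative values only).
intensities = [0.2, 0.8, 0.5, 0.3, 0.9]

# Correct: average the logarithm of each instantaneous measurement.
mean_of_log = sum(math.log(i) for i in intensities) / len(intensities)

# Incorrect: take the logarithm of the time-averaged intensity.
log_of_mean = math.log(sum(intensities) / len(intensities))

# The gap between the two is the dynamic error; by Jensen's
# inequality it is strictly positive for any fluctuating signal.
dynamic_error = log_of_mean - mean_of_log
```

    For a steady beam the two quantities coincide, so the dynamic error grows with the amplitude of the time dependence of the measured data.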

  18. Fractal and Lacunarity Analyses: Quantitative Characterization of Hierarchical Surface Topographies.

    PubMed

    Ling, Edwin J Y; Servio, Phillip; Kietzig, Anne-Marie

    2016-02-01

    Biomimetic hierarchical surface structures that exhibit features having multiple length scales have been used in many technological and engineering applications. Their surface topographies are most commonly analyzed using scanning electron microscopy (SEM), which only allows for qualitative visual assessments. Here we introduce fractal and lacunarity analyses as a method of characterizing the SEM images of hierarchical surface structures in a quantitative manner. Taking femtosecond laser-irradiated metals as an example, our results illustrate that, while the fractal dimension is a poor descriptor of surface complexity, lacunarity analysis can successfully quantify the spatial texture of an SEM image; this, in turn, provides a convenient means of reporting changes in surface topography with respect to changes in processing parameters. Furthermore, lacunarity plots are shown to be sensitive to the different length scales present within a hierarchical structure due to the reversal of lacunarity trends at specific magnifications where new features become resolvable. Finally, we have established a consistent method of detecting pattern sizes in an image from the oscillation of lacunarity plots. Therefore, we promote the adoption of lacunarity analysis as a powerful tool for quantitative characterization of, but not limited to, multi-scale hierarchical surface topographies. PMID:26758776
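
    Lacunarity is most often computed with the gliding-box algorithm, sketched below as a generic illustration of the method; this is not the authors' implementation, and the function name and test images are invented:

```python
import numpy as np

def lacunarity(image, box_size):
    """Gliding-box lacunarity of a binary image.

    Slides a box of side `box_size` over every position, records the
    'mass' (count of foreground pixels) in each box, and returns the
    ratio of the second moment to the squared first moment of the
    mass distribution: Lambda = <S^2>/<S>^2 = var/mean^2 + 1.
    """
    h, w = image.shape
    masses = []
    for i in range(h - box_size + 1):
        for j in range(w - box_size + 1):
            masses.append(image[i:i + box_size, j:j + box_size].sum())
    masses = np.asarray(masses, dtype=float)
    return masses.var() / masses.mean() ** 2 + 1.0

# A perfectly uniform texture has lacunarity 1 at every scale,
# while a clustered ("gappy") texture has lacunarity > 1.
uniform = np.ones((16, 16), dtype=int)
clumpy = np.zeros((16, 16), dtype=int)
clumpy[:4, :4] = 1
```

    The dependence of Lambda on `box_size` is what produces the lacunarity plots discussed above: trends reverse at the magnification where a new hierarchical feature becomes resolvable by the gliding box.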

  19. Finite element analyses of CCAT preliminary design

    NASA Astrophysics Data System (ADS)

    Sarawit, Andrew T.; Kan, Frank W.

    2014-07-01

    This paper describes the development of the CCAT telescope finite element model (FEM) and the analyses performed to support the preliminary design work. CCAT will be a 25 m diameter telescope operating in the 0.2 to 2 mm wavelength range. It will be located at an elevation of 5600 m on Cerro Chajnantor in Northern Chile, near ALMA. The telescope will be equipped with wide-field cameras and spectrometers mounted at the two Nasmyth foci, and will sit inside an enclosure to protect it from wind buffeting, direct solar heating, and bad weather. The main structures of the telescope include a steel Mount and a carbon-fiber-reinforced-plastic (CFRP) primary truss. The finite element model developed in this study was used to perform modal, frequency response, seismic response spectrum, stress, and deflection analyses of the telescope. Modal analyses were performed to compute the structure's natural frequencies and mode shapes and to obtain reduced-order modal output at selected locations in the telescope structure to support the design of the Mount control system. Modal frequency response analyses were also performed to compute transfer functions at these selected locations. Seismic response spectrum analyses of the telescope subject to the Maximum Likely Earthquake were performed to compute peak accelerations and seismic demand stresses. Stress analyses were performed for gravity load to obtain gravity demand stresses. Deflection analyses for gravity load, thermal load, and differential elevation drive torque were performed so that the CCAT Observatory can verify that the structures meet the stringent telescope surface and pointing error requirements.

  20. Prismatic analyser concept for neutron spectrometers

    SciTech Connect

    Birk, Jonas O.; Jacobsen, Johan; Hansen, Rasmus L.; Lefmann, Kim; Markó, Márton; Niedermayer, Christof; Freeman, Paul G.; Christensen, Niels B.; Månsson, Martin; Rønnow, Henrik M.

    2014-11-15

    Developments in modern neutron spectroscopy have led to typical sample sizes decreasing from a few centimetres to several millimetres in diameter. We demonstrate how small samples, together with the right choice of analyser and detector components, make distance collimation an important concept in crystal analyser spectrometers. We further show that this opens new possibilities in which neutrons with different energies are reflected by the same analyser but counted in different detectors, improving both energy resolution and total count rate compared with conventional spectrometers. The technique can readily be combined with advanced focussing geometries and with multiplexing instrument designs. We present a combination of simulations and data showing three different energies simultaneously reflected from one analyser. Experiments were performed on a cold triple-axis instrument and on a prototype inverse-geometry time-of-flight spectrometer installed at PSI, Switzerland, and show excellent agreement with the predictions. Typical improvements will be 2.0 times finer resolution and a factor of 1.9 in flux gain compared with a focussing Rowland geometry, or 3.3 times finer resolution and a factor of 2.4 in flux gain compared with a single flat analyser slab.

  1. Positioning the image of AIDS.

    PubMed

    Cooter, Roger; Stein, Claudia

    2010-03-01

    AIDS posters can be treated as material objects whose production, distribution and consumption varied across time and place. It is also possible to reconstruct and analyse the public health discourse at the time these powerful images appeared. More recently, however, these conventional historical approaches have been challenged by projects in literary and art criticism. Here, images of AIDS are considered in terms of their function in and for a new discursive regime of power centred on the human body and its visualization. How images of AIDS came to be understood in Western culture in relation to wider political and economic conditions redefines the historical task.

  2. Image Processing

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Center for use on the space shuttle Orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them and downlink images to ground-based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, the company could not be located, therefore contact/product information is no longer valid.

  3. Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.

    PubMed

    Koprowski, Robert

    2015-11-01

    The paper presents problems and solutions related to hyperspectral image pre-processing. New methods of preliminary image analysis are proposed. The paper shows problems occurring in Matlab when trying to analyse this type of image. Moreover, new methods are discussed which provide source code in Matlab that can be used in practice without any licensing restrictions. The proposed application and a sample result of hyperspectral image analysis are presented.

  5. Meta-analyses of randomized controlled trials.

    PubMed

    Sacks, H S; Berrier, J; Reitman, D; Ancona-Berk, V A; Chalmers, T C

    1987-02-19

    A new type of research, termed meta-analysis, attempts to analyze and combine the results of previous reports. We found 86 meta-analyses of reports of randomized controlled trials in the English-language literature. We evaluated the quality of these meta-analyses, using a scoring method that considered 23 items in six major areas--study design, combinability, control of bias, statistical analysis, sensitivity analysis, and application of results. Only 24 meta-analyses (28 percent) addressed all six areas, 31 (36 percent) addressed five, 25 (29 percent) addressed four, 5 (6 percent) addressed three, and 1 (1 percent) addressed two. Of the 23 individual items, between 1 and 14 were addressed satisfactorily (mean +/- SD, 7.7 +/- 2.7). We conclude that an urgent need exists for improved methods in literature searching, quality evaluation of trials, and synthesizing of the results.

  6. Geomagnetic local and regional harmonic analyses.

    USGS Publications Warehouse

    Alldredge, L.R.

    1982-01-01

    Procedures are developed for using rectangular and cylindrical harmonic analyses in local and regional areas. Both the linear least squares analysis, applicable when component data are available, and the nonlinear least squares analysis, applicable when only total field data are available, are treated. When component data are available, it is advantageous to work with residual fields obtained by subtracting components derived from a harmonic potential from the observed components. When only total field intensity data are available, they must be used directly. Residual values cannot be used. Cylindrical harmonic analyses are indicated when fields tend toward cylindrical symmetry; otherwise, rectangular harmonic analyses will be more advantageous. Examples illustrating each type of analysis are given.-Author
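
    When component data are available, a harmonic analysis of this kind reduces to a linear least-squares problem for the expansion coefficients, and residual fields are obtained by subtracting the fitted components from the observations. A minimal one-dimensional sketch under that assumption (the function name and synthetic data are illustrative, not from the paper):

```python
import numpy as np

def fit_harmonics(x, values, n_terms):
    """Linear least-squares fit of a truncated harmonic expansion
    f(x) ~ c0 + sum_k [a_k cos(kx) + b_k sin(kx)].  Because the model
    is linear in the unknown coefficients, an ordinary least-squares
    solve suffices (the total-field-only case would be nonlinear).
    """
    cols = [np.ones_like(x)]
    for k in range(1, n_terms + 1):
        cols.append(np.cos(k * x))
        cols.append(np.sin(k * x))
    A = np.column_stack(cols)                      # design matrix
    coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
    return A, coeffs

# Synthetic "component" data with known coefficients.
x = np.linspace(0.0, 2.0 * np.pi, 50, endpoint=False)
truth = 1.5 + 2.0 * np.cos(x) - 0.5 * np.sin(3.0 * x)
A, c = fit_harmonics(x, truth, n_terms=3)
residual = truth - A @ c        # residual field after subtracting fit
```

    With noise-free synthetic data the fit recovers the coefficients exactly and the residual field vanishes; with real observations the residuals carry the local anomaly signal of interest.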

  7. A qualitative method for analysing multivoicedness

    PubMed Central

    Aveling, Emma-Louise; Gillespie, Alex; Cornish, Flora

    2015-01-01

    ‘Multivoicedness’ and the ‘multivoiced Self’ have become important theoretical concepts guiding research. Drawing on the tradition of dialogism, the Self is conceptualised as being constituted by a multiplicity of dynamic, interacting voices. Despite the growth in literature and empirical research, there remains a paucity of established methodological tools for analysing the multivoiced Self using qualitative data. In this article, we set out a systematic, practical ‘how-to’ guide for analysing multivoicedness. Using theoretically derived tools, our three-step method comprises: identifying the voices of I-positions within the Self’s talk (or text), identifying the voices of ‘inner-Others’, and examining the dialogue and relationships between the different voices. We elaborate each step and illustrate our method using examples from a published paper in which data were analysed using this method. We conclude by offering more general principles for the use of the method and discussing potential applications. PMID:26664292

  8. Advanced toroidal facility vacuum vessel stress analyses

    SciTech Connect

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted mathematically analyzing the entire vessel with only 1/12 of the vessel geometry, allowing the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs.

  9. Imaging genomics

    PubMed Central

    Thompson, Paul M.; Martin, Nicholas G.; Wright, Margaret J.

    2010-01-01

    Purpose of review Imaging genomics is an emerging field that is rapidly identifying genes that influence the brain, cognition, and risk for disease. Worldwide, thousands of individuals are being scanned with high-throughput genotyping (genome-wide scans), and new imaging techniques [high angular resolution diffusion imaging and resting state functional magnetic resonance imaging (MRI)] that provide fine-grained measures of the brain’s structural and functional connectivity. Along with clinical diagnosis and cognitive testing, brain imaging offers highly reproducible measures that can be subjected to genetic analysis. Recent findings Recent studies of twin, pedigree, and population-based datasets have discovered several candidate genes that consistently show small to moderate effects on brain measures. Many studies measure single phenotypes from the images, such as hippocampal volume, but voxel-wise genomic methods can plot the profile of genetic association at each 3D point in the brain. This exploits the full arsenal of imaging statistics to discover and replicate gene effects. Summary Imaging genomics efforts worldwide are now working together to discover and replicate many promising leads. By studying brain phenotypes closer to causative gene action, larger gene effects are detectable with realistic sample sizes obtainable from meta-analysis of smaller studies. Imaging genomics has broad applications to dementia, mental illness, and public health. PMID:20581684

  10. Body Imaging

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Magnetic Resonance Imaging (MRI) and Computer-aided Tomography (CT) images are often complementary. In most cases, MRI is good for viewing soft tissue but not bone, while CT images are good for bone but not always good for soft tissue discrimination. Physicians and engineers in the Department of Radiology at the University of Michigan Hospitals are developing a technique for combining the best features of MRI and CT scans to increase the accuracy of discriminating one type of body tissue from another. One of their research tools is a computer program called HICAP. The program can be used to distinguish between healthy and diseased tissue in body images.

  11. Advanced laser stratospheric monitoring systems analyses

    NASA Technical Reports Server (NTRS)

    Larsen, J. C.

    1984-01-01

    This report describes the software support supplied by Systems and Applied Sciences Corporation for the study of Advanced Laser Stratospheric Monitoring Systems Analyses under contract No. NAS1-15806. The report discusses improvements to the Langley spectroscopic data base and the development of LHS instrument control software and data analysis and validation software. The effect of diurnal variations on the retrieved concentrations of NO, NO2 and ClO from space- and balloon-borne measurement platforms is discussed, along with the selection of optimum IF channels for sensing stratospheric species from space.

  12. Identifying, analysing and solving problems in practice.

    PubMed

    Hewitt-Taylor, Jaqui

    When a problem is identified in practice, it is important to clarify exactly what it is and establish the cause before seeking a solution. This solution-seeking process should include input from those directly involved in the problematic situation, to enable individuals to contribute their perspective, appreciate why any change in practice is necessary and what will be achieved by the change. This article describes some approaches to identifying and analysing problems in practice so that effective solutions can be devised. It includes a case study and examples of how the Five Whys analysis, fishbone diagram, problem tree analysis, and Seven-S Model can be used to analyse a problem.

  13. Multispectral imaging and image processing

    NASA Astrophysics Data System (ADS)

    Klein, Julie

    2014-02-01

    The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview will be given over several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by filters and camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of multispectral images. Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics like stereo multispectral imaging and goniometric multispectral measurements that are further explored with his expertise will also be presented in this work.

  14. Passive adaptive imaging through turbulence

    NASA Astrophysics Data System (ADS)

    Tofsted, David

    2016-05-01

    Standard methods for improved imaging system performance under degrading optical turbulence conditions typically involve active adaptive techniques or post-capture image processing. Here, passive adaptive methods are considered where active sources are disallowed, a priori. Theoretical analyses of short-exposure turbulence impacts indicate that varying aperture sizes experience different degrees of turbulence impacts. Smaller apertures often outperform larger aperture systems as turbulence strength increases. This suggests a controllable aperture system is advantageous. In addition, sub-aperture sampling of a set of training images permits the system to sense tilts in different sub-aperture regions through image acquisition and image cross-correlation calculations. A four sub-aperture pattern supports corrections involving five realizable operating modes (beyond tip and tilt) for removing aberrations over an annular pattern. Progress to date will be discussed regarding development and field trials of a prototype system.
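
    The sub-aperture tilt sensing described above rests on estimating image shifts via cross-correlation of acquired images against a reference. A minimal sketch of FFT-based cross-correlation shift estimation, limited to integer-pixel shifts (the function name and test scene are illustrative assumptions):

```python
import numpy as np

def image_shift(ref, img):
    """Estimate the integer-pixel shift of `img` relative to `ref` by
    locating the peak of their FFT-based circular cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak indices to signed shifts.
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)

# A scene circularly shifted by (3, -2) pixels is recovered exactly.
rng = np.random.default_rng(0)
scene = rng.random((32, 32))
moved = np.roll(scene, shift=(3, -2), axis=(0, 1))
```

    In a real system the per-sub-aperture shifts measured this way stand in for local wavefront tilts, which the controller then maps onto its realizable correction modes.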

  15. Imaging in anatomy: a comparison of imaging techniques in embalmed human cadavers

    PubMed Central

    2013-01-01

    Background A large variety of imaging techniques is an integral part of modern medicine. Introducing radiological imaging techniques into the dissection course serves as a basis for improved learning of anatomy and multidisciplinary learning in pre-clinical medical education. Methods Four different imaging techniques (ultrasound, radiography, computed tomography, and magnetic resonance imaging) were performed in embalmed human body donors to analyse possibilities and limitations of the respective techniques in this peculiar setting. Results The quality of ultrasound and radiography images was poor, images of computed tomography and magnetic resonance imaging were of good quality. Conclusion Computed tomography and magnetic resonance imaging have a superior image quality in comparison to ultrasound and radiography and offer suitable methods for imaging embalmed human cadavers as a valuable addition to the dissection course. PMID:24156510

  16. Image of the Singapore Child

    ERIC Educational Resources Information Center

    Ebbeck, Marjory; Warrier, Sheela

    2008-01-01

    The purpose of this study was to analyse the contents of one of the leading newspapers of Singapore in an effort to identify the public image of the children of the nation. Newspaper clippings of news/articles, pictures/photographs and advertisements featuring children below 15 years of age were collected over a one-week period and the content…

  17. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, nothing in the uncertainty evaluation methodologies is limited to a specific type of reactor or to specific types of plant scenarios: the same methodologies can be applied equally well to analyses for high-temperature gas-cooled reactors and liquid metal reactors, and to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selecting an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
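
    The 95%/95% criterion quoted above is commonly satisfied nonparametrically via Wilks' order-statistics result: with n independent random code runs, the largest observed output bounds at least 95% of the population with probability 1 - 0.95^n, so the smallest acceptable n satisfies 1 - 0.95^n >= 0.95. A minimal sketch of that standard formula (the function name is illustrative, and this is the one-sided first-order case, not the specific methodology of the report):

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of independent random code runs n such that the
    largest observed output bounds at least `coverage` of the output
    population with probability `confidence` (one-sided, first-order
    Wilks formula): minimum n with 1 - coverage**n >= confidence.
    """
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

# The classic 95%/95% criterion requires 59 runs; tightening the
# confidence to 99% raises the requirement to 90 runs.
```

    The appeal of the formula is that the run count depends only on the coverage/confidence targets, not on the number or distribution of the uncertain input parameters.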

  18. FAME: Software for analysing rock microstructures

    NASA Astrophysics Data System (ADS)

    Hammes, Daniel M.; Peternell, Mark

    2016-05-01

    Determination of rock microstructures leads to a better understanding of the formation and deformation of polycrystalline solids. Here, we present FAME (Fabric Analyser based Microstructure Evaluation), an easy-to-use MATLAB®-based software for processing datasets recorded by an automated fabric analyser microscope. FAME is provided as a MATLAB®-independent Windows® executable with an intuitive graphical user interface. Raw data from the fabric analyser microscope can be automatically loaded, filtered and cropped before analysis. Accurate and efficient rock microstructure analysis is based on an advanced user-controlled grain labelling algorithm. The preview and testing environments simplify the determination of appropriate analysis parameters. Various statistic and plotting tools allow a graphical visualisation of the results such as grain size, shape, c-axis orientation and misorientation. The FAME2elle algorithm exports fabric analyser data to an elle (modelling software)-supported format. FAME supports batch processing for multiple thin section analysis or large datasets that are generated for example during 2D in-situ deformation experiments. The use and versatility of FAME is demonstrated on quartz and deuterium ice samples.

  19. Chemical Analyses of Silicon Aerogel Samples

    SciTech Connect

    van der Werf, I.; Palmisano, F.; De Leo, Raffaele; Marrone, Stefano

    2008-04-01

    After five years of operation, two aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab suffered a loss of performance. In this note, possible causes of the degradation are studied. In particular, various chemical and physical analyses have been carried out on several aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  20. Amino acid analyses of Apollo 14 samples.

    NASA Technical Reports Server (NTRS)

    Gehrke, C. W.; Zumwalt, R. W.; Kuo, K.; Aue, W. A.; Stalling, D. L.; Kvenvolden, K. A.; Ponnamperuma, C.

    1972-01-01

    Detection limits were between 300 pg and 1 ng for the different amino acids in an analysis by gas-liquid chromatography of water extracts from Apollo 14 lunar fines, in which the amino acids were converted to their N-trifluoroacetyl n-butyl esters. Initial analyses of water and HCl extracts of samples 14240 and 14298 showed no amino acids above background levels.

  1. Multiphase Method for Analysing Online Discussions

    ERIC Educational Resources Information Center

    Häkkinen, P.

    2013-01-01

    Several studies have analysed and assessed online performance and discourse using quantitative and qualitative methods. Quantitative measures have typically included the analysis of participation rates and learning outcomes in terms of grades. Qualitative measures of postings, discussions and context features aim to give insights into the nature…

  2. Correlation Functions Aid Analyses Of Spectra

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Norton, Robert H., Jr.

    1989-01-01

    New uses have been found for correlation functions in the analysis of spectra. In an approach combining elements of both pattern-recognition and traditional spectral-analysis techniques, spectral lines are identified in data that at first glance appear useless because they are dominated by noise. The new approach is particularly useful for measuring the concentrations of rare molecular species in the atmosphere.

  3. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... identify and address relevant markets and issues, and provide additional information as requested by the...). (b) For major transactions, applicants shall submit “full system” impact analyses (incorporating any... (including inter- and intramodal competition, product competition, and geographic competition) and...

  4. Cosmetology: Task Analyses. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary courses…

  5. What's missing from avian global diversification analyses?

    PubMed

    Reddy, Sushma

    2014-08-01

    The accumulation of vast numbers of molecular phylogenetic studies has contributed to huge knowledge gains in the evolutionary history of birds. This permits subsequent analyses of avian diversity, such as how and why diversification varies across the globe and among taxonomic groups. However, available genetic data for these meta-analyses are unevenly distributed across different geographic regions and taxonomic groups. To comprehend the impact of this variation on the interpretation of global diversity patterns, I examined the availability of genetic data for possible biases in geographic and taxonomic sampling of birds. I identified three main disparities of sampling that are geographically associated with latitude (temperate, tropical), hemispheres (East, West), and range size. Tropical regions, which host the vast majority of species, are substantially less studied. Moreover, Eastern regions, such as the Old World Tropics and Australasia, stand out as being disproportionately undersampled, with up to half of communities not being represented in recent studies. In terms of taxonomic discrepancies, a majority of genetically undersampled clades are exclusively found in tropical regions. My analysis identifies several disparities in the key regions of interest of global diversity analyses. Differential sampling can have considerable impacts on these global comparisons and call into question recent interpretations of latitudinal or hemispheric differences of diversification rates. Moreover, this review pinpoints understudied regions whose biota are in critical need of modern systematic analyses.

  6. The Economic Cost of Homosexuality: Multilevel Analyses

    ERIC Educational Resources Information Center

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  7. Functional Analyses and Treatment of Precursor Behavior

    ERIC Educational Resources Information Center

    Najdowski, Adel C.; Wallace, Michele D.; Ellsworth, Carrie L.; MacAleese, Alicia N.; Cleveland, Jackie

    2008-01-01

    Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe…

  8. Using Solo to Analyse Group Responses

    ERIC Educational Resources Information Center

    Reading, Chris; Lawrie, Christine

    2004-01-01

    The increased use of group work in teaching and learning has seen an increased need for knowledge about assessment of group work. This report considers exploratory research where the SOLO Taxonomy, previously used to analyse the quality of individual responses, is applied to group responses. The responses were created as part of an activity…

  9. Analysing Simple Electric Motors in the Classroom

    ERIC Educational Resources Information Center

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  10. Impact analyses after pipe rupture. [PWR; BWR

    SciTech Connect

    Chun, R.C.; Chuang, T.Y.

    1983-12-13

    Two of the French pipe whip experiments are reproduced with the computer code WIPS. The WIPS results are in good agreement with the experimental data and with the French computer code TEDEL. This justifies the use of the WIPS pipe element, in conjunction with its U-bar element, in a simplified method of impact analysis.

  11. Airbags to Martian Landers: Analyses at Sandia National Laboratories

    SciTech Connect

    Gwinn, K.W.

    1994-03-01

    A new direction for the national laboratories is to assist US business with research and development, primarily through cooperative research and development agreements (CRADAs). Technology transfer to the private sector has been very successful: over 200 CRADAs are in place at Sandia. Because of these cooperative efforts, technology has evolved into areas not commonly associated with the former mission of the national laboratories. An example of this is the analysis of fabric structures. Explicit analyses and expertise in constructing parachutes led to the development of a next-generation automobile airbag, which led to the construction, testing, and analysis of the Jet Propulsion Laboratory Mars Environmental Survey lander, and finally to the development of CAD-based custom garment designs using 3D scanned images of the human body. The structural analysis of these fabric structures is described, along with a more traditional Sandia example: the test/analysis correlation of the impact of a weapon container.

  12. Advanced Land Imager Assessment System

    NASA Technical Reports Server (NTRS)

    Chander, Gyanesh; Choate, Mike; Christopherson, Jon; Hollaren, Doug; Morfitt, Ron; Nelson, Jim; Nelson, Shar; Storey, James; Helder, Dennis; Ruggles, Tim; Kaita, Ed; Levy, Raviv; Ong, Lawrence; Markham, Brian; Schweiss, Robert

    2008-01-01

    The Advanced Land Imager Assessment System (ALIAS) supports radiometric and geometric image processing for the Advanced Land Imager (ALI) instrument onboard NASA's Earth Observing-1 (EO-1) satellite. ALIAS consists of two processing subsystems for radiometric and geometric processing of the ALI's multispectral imagery. The radiometric processing subsystem characterizes and corrects, where possible, radiometric qualities including: coherent, impulse, and random noise; signal-to-noise ratios (SNRs); detector operability; gain; bias; saturation levels; striping and banding; and the stability of detector performance. The geometric processing subsystem and analysis capabilities support sensor alignment calibrations, sensor chip assembly (SCA)-to-SCA alignments and band-to-band alignment, and perform geodetic accuracy assessments, modulation transfer function (MTF) characterizations, and image-to-image characterizations. ALIAS also characterizes and corrects band-to-band registration, and performs systematic precision and terrain correction of ALI images. This system can geometrically correct, and automatically mosaic, the SCA image strips into a seamless, map-projected image. This system provides a large database, which enables bulk trending for all ALI image data and significant instrument telemetry. Bulk trending consists of two functions: Housekeeping Processing and Bulk Radiometric Processing. The Housekeeping function pulls telemetry and temperature information from the instrument housekeeping files and writes this information to a database for trending. The Bulk Radiometric Processing function writes statistical information from the dark data acquired before and after the Earth imagery, and from the lamp data, to the database for trending. This allows for multi-scene statistical analyses.

  13. Body image and media use among adolescents.

    PubMed

    Borzekowski, Dina L G; Bayer, Angela M

    2005-06-01

    This article reviews the literature on body image and media use among adolescents. We begin by defining body image and how it is constructed, especially among young people. We then discuss what happens when one's body image perception diverges from one's personal ideal, which can result in disordered eating, including obesity, anorexia, and bulimia. Next, we describe the research literature on media use and its relationship to adolescents' body image perceptions and discuss content analyses and correlational, experimental, and qualitative studies. Lastly, we recommend, beyond conducting further and improved research studies, interventions and policies that may have an impact on body image and media use.

  14. Blurred Image

    ERIC Educational Resources Information Center

    Conde, Maryse

    1975-01-01

    The growing influence of Western culture has greatly affected African women's status and image in the traditional society. Working women are confronted with the dilemma of preserving family traditions while changing their behavior and image to become members of the labor force. (MR)

  15. Diagnostic Imaging

    MedlinePlus

    Diagnostic imaging lets doctors look inside your body for clues about a medical condition. A variety of machines and techniques can create pictures of the structures and activities inside your body. The type of imaging your doctor uses depends on your symptoms and ...

  16. Cerenkov imaging.

    PubMed

    Das, Sudeep; Thorek, Daniel L J; Grimm, Jan

    2014-01-01

    Cerenkov luminescence (CL) has recently been used in a plethora of medical applications, such as imaging and therapy with clinically relevant medical isotopes. The range of medical isotopes used is fairly large and expanding. The generation of light in vivo is useful because it circumvents depth limitations for excitation light. Cerenkov luminescence imaging (CLI) is much cheaper in terms of infrastructure than positron emission tomography (PET) and is particularly useful for imaging superficial structures. Imaging can essentially be done using a sensitive camera optimized for low-light conditions, and it has a better resolution than any other nuclear imaging modality. CLI has been shown to effectively diagnose disease with a regularly used PET isotope ((18)F-FDG) in a clinical setting. Cerenkov luminescence tomography, Cerenkov luminescence endoscopy, and intraoperative Cerenkov imaging have also been explored with positive conclusions, expanding the current range of applications. Cerenkov has also been used to improve PET imaging resolution, since the source of both is the radioisotope being used. Smart imaging agents have been designed based on modulation of the Cerenkov signal using small molecules and nanoparticles, giving better insight into tumor biology. PMID:25287690

  17. Imaging Genetics

    ERIC Educational Resources Information Center

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  18. Imaging Atherosclerosis

    PubMed Central

    Tarkin, Jason M.; Dweck, Marc R.; Evans, Nicholas R.; Takx, Richard A.P.; Brown, Adam J.; Tawakol, Ahmed; Fayad, Zahi A.

    2016-01-01

    Advances in atherosclerosis imaging technology and research have provided a range of diagnostic tools to characterize high-risk plaque in vivo; however, these important vascular imaging methods additionally promise great scientific and translational applications beyond this quest. When combined with conventional anatomic- and hemodynamic-based assessments of disease severity, cross-sectional multimodal imaging incorporating molecular probes and other novel noninvasive techniques can add detailed interrogation of plaque composition, activity, and overall disease burden. In the catheterization laboratory, intravascular imaging provides unparalleled access to the world beneath the plaque surface, allowing tissue characterization and measurement of cap thickness with micrometer spatial resolution. Atherosclerosis imaging captures key data that reveal snapshots into underlying biology, which can test our understanding of fundamental research questions and shape our approach toward patient management. Imaging can also be used to quantify response to therapeutic interventions and ultimately help predict cardiovascular risk. Although there are undeniable barriers to clinical translation, many of these hold-ups might soon be surpassed by rapidly evolving innovations to improve image acquisition, coregistration, motion correction, and reduce radiation exposure. This article provides a comprehensive review of current and experimental atherosclerosis imaging methods and their uses in research and potential for translation to the clinic. PMID:26892971

  19. Image fusion

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.

  1. Retinal Imaging and Image Analysis

    PubMed Central

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:21743764

  2. Factor Analysis of the Image Correlation Matrix.

    ERIC Educational Resources Information Center

    Kaiser, Henry F.; Cerny, Barbara A.

    1979-01-01

    Whether to factor the image correlation matrix or to use a new model with an alpha factor analysis of it is mentioned, with particular reference to the determinacy problem. It is pointed out that the distribution of the images is sensibly multivariate normal, making for "better" factor analyses. (Author/CTM)

  3. Evaluation method based on the image correlation for laser jamming image

    NASA Astrophysics Data System (ADS)

    Che, Jinxi; Li, Zhongmin; Gao, Bo

    2013-09-01

    The evaluation of jamming effectiveness against infrared imaging systems is an important part of electro-optical countermeasures. Military infrared imaging devices are widely used in searching, tracking, guidance, and many other fields. At the same time, with the continuous development of laser technology, research on laser interference and damage effects has advanced, and lasers have been used to disturb infrared imaging devices. The evaluation of laser jamming effects on infrared imaging systems has therefore become a meaningful problem to be solved. The information that an infrared imaging system ultimately presents to the user is an image, so jamming effects can be evaluated by assessing image quality. An image contains two kinds of information, light amplitude and light phase, so image correlation can accurately capture the difference between the original image and the disturbed image. In this paper, four approaches are analysed: digital image correlation, image-quality assessment based on the Fourier transform, image-quality estimation based on error statistics, and evaluation based on peak signal-to-noise ratio. The advantages and disadvantages of these methods are discussed, and the methods are applied to infrared jamming images from an experiment in which a thermal infrared imager was interfered with by laser. The results show that the methods reflect the jamming effects on the infrared imaging system well and agree with subjective visual evaluation, while offering good repeatability and convenient quantitative analysis. The feasibility of the methods for evaluating jamming effect was proved, providing a reference for the study and development of electro-optical countermeasure equipment.
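
    The correlation- and PSNR-based measures discussed in this abstract can be sketched numerically. The paper's exact formulas are not given here, so the definitions below are the standard ones, and all function names are our own:

```python
import numpy as np

def correlation_coefficient(ref, test):
    """Normalized cross-correlation between a reference image and a
    (possibly jammed) test image; 1.0 means identical structure."""
    ref = ref.astype(float) - ref.mean()
    test = test.astype(float) - test.mean()
    denom = np.sqrt((ref ** 2).sum() * (test ** 2).sum())
    return float((ref * test).sum() / denom)

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB; lower values indicate
    stronger degradation of the disturbed image."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float(10.0 * np.log10(peak ** 2 / mse))
```

    In practice one would compute both metrics between the undisturbed frame and each laser-disturbed frame and compare their rankings against subjective visual scores.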

  4. Group-level component analyses of EEG: validation and evaluation.

    PubMed

    Huster, Rene J; Plis, Sergey M; Calhoun, Vince D

    2015-01-01

    Multi-subject or group-level component analysis provides a data-driven approach to study properties of brain networks. Algorithms for group-level decomposition of functional magnetic resonance imaging data were brought forward more than a decade ago and have significantly matured since. Similar applications for electroencephalographic data are at a comparatively early stage of development, though, and their sensitivity to topographic variability of the electroencephalogram or loose time-locking of neuronal responses has not yet been assessed. This study investigates the performance of independent component analysis (ICA) and second-order blind source identification (SOBI) for data decomposition, and their combination with either temporal or spatial concatenation of data sets, for multi-subject analyses of electroencephalographic data. Analyses of simulated sources with different spatial, frequency, and time-locking profiles revealed that temporal concatenation of data sets with either ICA or SOBI served well to reconstruct sources with both strict and loose time-locking, whereas performance decreased in the presence of topographical variability. The opposite pattern was found with spatial concatenation of subject-specific data sets. This study proves that procedures for group-level decomposition of electroencephalographic data can be considered valid and promising approaches to infer the latent structure of multi-subject data sets. Yet, specific implementations need further adaptations to optimally address the sources of inter-subject and inter-trial variance commonly found in EEG recordings. PMID:26283897
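
    The temporal-concatenation scheme evaluated in this study can be sketched as follows. This is a generic illustration using scikit-learn's FastICA rather than the authors' code, and the function name is ours:

```python
import numpy as np
from sklearn.decomposition import FastICA

def group_ica_temporal(datasets, n_components):
    """Temporal concatenation for group-level decomposition: stack each
    subject's (channels x time) EEG along the time axis, then run ICA
    once to obtain a common mixing matrix (shared topographies) and
    subject-specific source time courses."""
    concat = np.hstack(datasets)                 # channels x (sum of times)
    ica = FastICA(n_components=n_components, random_state=0, max_iter=500)
    sources = ica.fit_transform(concat.T).T      # components x total time
    # split the source time courses back into per-subject segments
    lengths = [d.shape[1] for d in datasets]
    splits = np.split(sources, np.cumsum(lengths)[:-1], axis=1)
    return ica.mixing_, splits                   # shared maps, per-subject sources
```

    Spatial concatenation, the alternative compared in the study, would instead stack subjects along the channel axis, yielding subject-specific maps and a shared time course.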

  6. HASE: Framework for efficient high-dimensional association analyses

    PubMed Central

    Roshchupkin, G. V.; Adams, H. H. H.; Vernooij, M. W.; Hofman, A.; Van Duijn, C. M.; Ikram, M. A.; Niessen, W. J.

    2016-01-01

    High-throughput technology can now provide rich information on a person's biological makeup and environmental surroundings. Important discoveries have been made by relating these data to various health outcomes in fields such as genomics, proteomics, and medical imaging. However, cross-investigations between several high-throughput technologies remain impractical due to demanding computational requirements (hundreds of years of computing resources) and unsuitability for collaborative settings (terabytes of data to share). Here we introduce the HASE framework, which overcomes both of these issues. Our approach dramatically reduces computational time from years to only hours and requires only several gigabytes of data to be exchanged between collaborators. We implemented a novel meta-analytical method that yields power identical to that of pooled analyses without the need to share individual participant data. The efficiency of the framework is illustrated by associating 9 million genetic variants with 1.5 million brain imaging voxels in three cohorts (total N = 4,034), followed by meta-analysis, on a standard computational infrastructure. These experiments indicate that HASE facilitates high-dimensional association studies, enabling large multicenter association studies for future discoveries. PMID:27782180
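
    The principle of matching pooled power without sharing individual-level data can be illustrated with ordinary least squares, where each cohort contributes only its sufficient statistics X'X and X'y. This is a generic sketch of the idea, not the HASE implementation, and the function names are ours:

```python
import numpy as np

def cohort_summary(X, y):
    """Per-cohort sufficient statistics for OLS: only X'X and X'y are
    shared (a handful of numbers per variant), never individual rows."""
    return X.T @ X, X.T @ y

def pooled_ols(summaries):
    """Combine cohort summaries; the result is identical to running OLS
    on the stacked individual-level data."""
    xtx = sum(s[0] for s in summaries)
    xty = sum(s[1] for s in summaries)
    return np.linalg.solve(xtx, xty)
```

    Because the summaries add linearly across cohorts, the meta-analytic estimate equals the pooled estimate exactly, which is the sense in which such a scheme loses no statistical power.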

  7. Analysing organic transistors based on interface approximation

    SciTech Connect

    Akiyama, Yuto; Mori, Takehiko

    2014-01-15

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region.

  8. Neuronal network analyses: premises, promises and uncertainties

    PubMed Central

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the difficulties of understanding network function. Nevertheless, in more complex systems (including human), claims are made that the cellular bases of behaviour are, or will shortly be, understood. While the discussion is necessarily limited, this issue will examine these claims and highlight some traditional and novel aspects of network analyses and their difficulties. This introduction discusses the criteria that need to be satisfied for network understanding, and how they relate to traditional and novel approaches being applied to addressing network function. PMID:20603354

  9. Reliability of chemical analyses of water samples

    SciTech Connect

    Beardon, R.

    1989-11-01

    Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of ongoing monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.

  10. Identifying, analysing and solving problems in practice.

    PubMed

    Hewitt-Taylor, Jaqui

    When a problem is identified in practice, it is important to clarify exactly what it is and establish the cause before seeking a solution. This solution-seeking process should include input from those directly involved in the problematic situation, to enable individuals to contribute their perspective, appreciate why any change in practice is necessary and what will be achieved by the change. This article describes some approaches to identifying and analysing problems in practice so that effective solutions can be devised. It includes a case study and examples of how the Five Whys analysis, fishbone diagram, problem tree analysis, and Seven-S Model can be used to analyse a problem. PMID:22848969

  11. Sensitivity in risk analyses with uncertain numbers.

    SciTech Connect

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a 'pinching' strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
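
    The 'pinching' strategy can be sketched with a toy Monte Carlo model: fix one uncertain input at a point value and measure how much the output spread shrinks. The model and function names below are illustrative, not taken from the report:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(a, b):
    # toy response: the output depends more strongly on input `a`
    return 3.0 * a + b ** 2

def pinch_effect(n=100_000):
    """Compare output variance before and after 'pinching' each uncertain
    input to a point value; the fractional variance reduction ranks the
    inputs by importance."""
    a = rng.uniform(0.0, 1.0, n)
    b = rng.uniform(0.0, 1.0, n)
    base = np.var(model(a, b))
    pinched_a = np.var(model(np.full(n, a.mean()), b))  # a fixed at its mean
    pinched_b = np.var(model(a, np.full(n, b.mean())))  # b fixed at its mean
    return 1 - pinched_a / base, 1 - pinched_b / base
```

    In the report's setting the pinched quantity may be the epistemic or aleatory part of an uncertain number rather than the whole input, but the variance-reduction logic is the same.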

  12. Analyses and characterization of double shell tank

    SciTech Connect

    Not Available

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  13. [Clinical research=design*measurements*statistical analyses].

    PubMed

    Furukawa, Toshiaki

    2012-06-01

    A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question. Formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study one must have a knowledge of study design, measurements and statistical analyses: The first is taught by epidemiology, the second by psychometrics and the third by biostatistics.

  14. Inelastic and Dynamic Fracture and Stress Analyses

    NASA Technical Reports Server (NTRS)

    Atluri, S. N.

    1984-01-01

    Large deformation inelastic stress analysis and inelastic and dynamic crack propagation research work is summarized. The salient topics of interest in engine structure analysis that are discussed herein include: (1) a path-independent integral (T) in inelastic fracture mechanics, (2) analysis of dynamic crack propagation, (3) generalization of constitutive relations of inelasticity for finite deformations, (4) complementary energy approaches in inelastic analyses, and (5) objectivity of time integration schemes in inelastic stress analysis.

  15. MULTISPECTRAL THERMAL IMAGER - OVERVIEW

    SciTech Connect

    P. WEBER

    2001-03-01

    The Multispectral Thermal Imager satellite fills a new and important role in advancing the state of the art in remote sensing sciences. Initial results with the full calibration system operating indicate that the system was already close to achieving the very ambitious goals which we laid out in 1993, and we are confident of reaching all of these goals as we continue our research and improve our analyses. In addition to the DOE interests, the satellite is tasked about one-third of the time with requests from other users supporting research ranging from volcanology to atmospheric sciences.

  16. Analyses of Impedance Microstructure and Wave Propagation Characteristics in Rocks

    NASA Astrophysics Data System (ADS)

    Prasad, M.; Mukerji, T.

    2002-12-01

    Seismic methods are our primary tools for imaging subsurface structures and deriving information about microstructural properties in the subsurface that are pertinent to exploration. However, velocity-physical property transforms are mostly empirical or qualitative in nature, mainly because microstructural descriptions are qualitative. Although sedimentary systems produce distinctive textures that influence physical properties and seismic signatures, these textures are not quantified in terms comparable to seismic data. We present a method to quantify microstructure in terms of acoustic impedance and show how these microstructural impedance maps can be used to analyze wave propagation characteristics in rocks. Using image analysis techniques, the texture of the calibrated scanned images is quantified by spatial autocorrelation functions and binary morphological operations. Parametric modeling of the empirical autocorrelation functions is used to estimate the textural anisotropy. We quantify microstructural impedance anisotropy and compare these textural maps to ultrasonic velocity anisotropy measurements. Inclusion-based effective medium theory is used to upscale the impedances from the microstructural scale to the core-plug scale. In the example of optically opaque kerogen-rich shales, we find that (1) acoustic impedance in kerogen shales increases with shale maturity; (2) impedances measured on the micrometer scale and the centimeter scale match well, indicating that seismic wave propagation is controlled by the microtexture; and (3) with increasing maturity, there is a transition from a kerogen-supported to a grain-supported framework. We thank the Fraunhofer Institute for Nondestructive Testing (IZfP) for use of AM facilities, Walter Arnold (IZfP) for discussions about acoustic microscopy, and ARCO and the SRB Project for support. This work was performed under the auspices of the National Science Foundation (Grant No. EAR 0074330) and the Department of Energy (Award No. DE-FC26-01BC15354).
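    As a rough illustration of the texture-quantification step described in this abstract, the sketch below computes a 2D spatial autocorrelation of a synthetic image via the FFT and estimates a textural anisotropy ratio from directional correlation lengths. The synthetic image, smoothing kernel, and 1/e decay threshold are illustrative assumptions, not the authors' actual data or parameters.

```python
import numpy as np

def autocorrelation_2d(image):
    """Normalized 2D spatial autocorrelation via the power spectrum (Wiener-Khinchin)."""
    f = image - image.mean()
    spec = np.abs(np.fft.fft2(f)) ** 2        # power spectrum
    acf = np.fft.ifft2(spec).real             # autocovariance
    return np.fft.fftshift(acf) / acf.max()   # center zero lag, normalize to 1

def correlation_length(profile):
    """Lag at which a 1D autocorrelation profile first drops below 1/e."""
    below = np.where(profile < 1.0 / np.e)[0]
    return int(below[0]) if below.size else len(profile)

# Synthetic anisotropic "impedance map": features elongated along x only
rng = np.random.default_rng(0)
noise = rng.normal(size=(128, 128))
kernel_x = np.ones(9) / 9.0                   # boxcar smoothing along x
smoothed = np.apply_along_axis(lambda r: np.convolve(r, kernel_x, "same"), 1, noise)

acf = autocorrelation_2d(smoothed)
cy, cx = acf.shape[0] // 2, acf.shape[1] // 2
len_x = correlation_length(acf[cy, cx:])      # decay along x from zero lag
len_y = correlation_length(acf[cy:, cx])      # decay along y from zero lag
anisotropy = len_x / len_y                    # > 1 means x-elongated texture
print(len_x, len_y, anisotropy)
```

    In the paper's workflow the input would be a calibrated acoustic-microscopy impedance map rather than random noise, and the anisotropy estimate would come from parametric fits to the empirical autocorrelation rather than a simple 1/e crossing.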

  17. Raman Imaging

    NASA Astrophysics Data System (ADS)

    Stewart, Shona; Priore, Ryan J.; Nelson, Matthew P.; Treado, Patrick J.

    2012-07-01

    The past decade has seen an enormous increase in the number and breadth of imaging techniques developed for analysis in many industries, including pharmaceuticals, food, and especially biomedicine. Rather than accept single-dimensional forms of information, users now demand multidimensional assessment of samples. High specificity and the need for little or no sample preparation make Raman imaging a highly attractive analytical technique and provide motivation for continuing advances in its supporting technology and utilization. This review discusses the current tools employed in Raman imaging, the recent advances, and the major applications in this ever-growing analytical field.

  18. Evaluation of the Technicon Axon analyser.

    PubMed

    Martínez, C; Márquez, M; Cortés, M; Mercé, J; Rodriguez, J; González, F

    1990-01-01

    An evaluation of the Technicon Axon analyser was carried out following the guidelines of the 'Sociedad Española de Química Clínica' and the European Committee for Clinical Laboratory Standards. A photometric study revealed acceptable results at both 340 nm and 404 nm. Inaccuracy and imprecision were lower at 404 nm than at 340 nm, although poor dispersion was found at both wavelengths, even at low absorbances. Drift was negligible, the imprecision of the sample pipette delivery system was greater for small sample volumes, the reagent pipette delivery system imprecision was acceptable, and the sample diluting system study showed good precision and accuracy. Twelve analytes were studied for evaluation of the analyser under routine working conditions. Satisfactory results were obtained for within-run imprecision, while coefficients of variation for between-run imprecision were much greater than expected. Neither specimen-related nor specimen-independent contamination was found in the carry-over study. For all analytes assayed, acceptable relative inaccuracy was observed when comparing patient sample results with those obtained on a Hitachi 737 analyser.

  19. Hybrid Diffusion Imaging

    PubMed Central

    Wu, Yu-Chien; Alexander, Andrew L.

    2007-01-01

    Diffusion measurements in the human central nervous system are complex to characterize, and a broad spectrum of methods has been proposed. In this study, a comprehensive diffusion encoding and analysis approach, Hybrid Diffusion Imaging (HYDI), is described. The HYDI encoding scheme is composed of multiple concentric “shells” of constant diffusion-weighting, which may be used to characterize the signal behavior with low, moderate and high diffusion-weighting. HYDI facilitates the application of multiple data-analysis strategies including diffusion tensor imaging (DTI), multi-exponential diffusion measurements, diffusion spectrum imaging (DSI) and q-ball imaging (QBI). These different analysis strategies may provide complementary information. DTI measures (mean diffusivity and fractional anisotropy) may be estimated from either data in the inner shells or the entire HYDI data. Fast and slow diffusivities were estimated using a nonlinear least-squares bi-exponential fit on geometric means of the HYDI shells. DSI measurements from the entire HYDI data yield empirical model-independent diffusion information and are well-suited for characterizing tissue regions with complex diffusion behavior. DSI measurements were characterized using the zero displacement probability and the mean squared displacement. The outermost HYDI shell was analyzed using QBI analysis to estimate the orientation distribution function (ODF), which is useful for characterizing the directions of multiple fiber groups within a voxel. In this study, a HYDI encoding scheme with 102 diffusion-weighted measurements was obtained over most of the human cerebrum in under 30 minutes. PMID:17481920
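    The bi-exponential estimation of fast and slow diffusivities mentioned above can be sketched as a nonlinear least-squares problem. The b-values, compartment fraction, and diffusivities below are synthetic assumptions for illustration, not the study's actual acquisition parameters or results.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(b, f_fast, d_fast, d_slow):
    """Two-compartment decay: S(b)/S0 = f*exp(-b*Dfast) + (1-f)*exp(-b*Dslow)."""
    return f_fast * np.exp(-b * d_fast) + (1.0 - f_fast) * np.exp(-b * d_slow)

# Synthetic shell-averaged signals (b in s/mm^2, diffusivities in mm^2/s)
b_vals = np.array([0, 300, 800, 1500, 2500, 3500, 4500], dtype=float)
signal = biexp(b_vals, f_fast=0.7, d_fast=2.0e-3, d_slow=0.3e-3)

# Nonlinear least-squares fit with physically plausible bounds
popt, _ = curve_fit(
    biexp, b_vals, signal,
    p0=(0.5, 1.0e-3, 0.1e-3),
    bounds=([0.0, 0.0, 0.0], [1.0, 5.0e-3, 1.0e-3]),
)
f_fit, d_fast_fit, d_slow_fit = popt
print(f_fit, d_fast_fit, d_slow_fit)
```

    In practice the fit would be applied to geometric means of the measured HYDI shells, with noise handling and possibly a constraint keeping the fast diffusivity above the slow one.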

  20. Computational analyses of arteriovenous malformations in neuroimaging.

    PubMed

    Di Ieva, Antonio; Boukadoum, Mounir; Lahmiri, Salim; Cusimano, Michael D

    2015-01-01

    Computational models have been investigated for the analysis of the physiopathology and morphology of arteriovenous malformations (AVMs) in recent years. Special emphasis has been given to image fusion in multimodal imaging and 3-dimensional rendering of the AVM, with the aim of improving the visualization of the lesion (for diagnostic purposes) and the selection of the nidus (for therapeutic purposes, such as selecting the region of interest for a gamma knife radiosurgery plan). In the search for new diagnostic and prognostic neuroimaging biomarkers, fractal-based computational models have been proposed for describing and quantifying the angioarchitecture of the nidus. Computational modeling in the AVM field offers promising analytical tools and requires close collaboration among neurosurgeons, neuroradiologists, clinicians, computer scientists, and engineers. We present here some updated state-of-the-art exemplary cases in the field, focusing on recent neuroimaging computational modeling with clinical relevance, which might offer useful clinical tools for the management of AVMs in the future.

  1. Medical Imaging.

    ERIC Educational Resources Information Center

    Jaffe, C. Carl

    1982-01-01

    Describes principle imaging techniques, their applications, and their limitations in terms of diagnostic capability and possible adverse biological effects. Techniques include film radiography, computed tomography, nuclear medicine, positron emission tomography (PET), ultrasonography, nuclear magnetic resonance, and digital radiography. PET has…

  2. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images.

  3. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images. In this photograph, a patient undergoes an open MRI.

  4. Imaging System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The 1100C Virtual Window is based on technology developed under NASA Small Business Innovation (SBIR) contracts to Ames Research Center. For example, under one contract Dimension Technologies, Inc. developed a large autostereoscopic display for scientific visualization applications. The Virtual Window employs an innovative illumination system to deliver the depth and color of true 3D imaging. Its applications include surgery and Magnetic Resonance Imaging scans, viewing for teleoperated robots, training, and in aviation cockpit displays.

  5. Diagnostic imaging.

    PubMed

    Morris, Peter; Perkins, Alan

    2012-04-21

    Physical techniques have always had a key role in medicine, and the second half of the 20th century in particular saw a revolution in medical diagnostic techniques with the development of key imaging instruments: x-ray imaging and emission tomography (nuclear imaging and PET), MRI, and ultrasound. These techniques use the full width of the electromagnetic spectrum, from gamma rays to radio waves, and sound. In most cases, the development of a medical imaging device was opportunistic; many scientists in physics laboratories were experimenting with simple x-ray images within the first year of the discovery of such rays, the development of the cyclotron and later nuclear reactors created the opportunity for nuclear medicine, and one of the co-inventors of MRI was initially attempting to develop an alternative to x-ray diffraction for the analysis of crystal structures. What all these techniques have in common is the brilliant insight of a few pioneering physical scientists and engineers who had the tenacity to develop their inventions, followed by a series of technical innovations that enabled the full diagnostic potential of these instruments to be realised. In this report, we focus on the key part played by these scientists and engineers and the new imaging instruments and diagnostic procedures that they developed. By bringing the key developments and applications together we hope to show the true legacy of physics and engineering in diagnostic medicine. PMID:22516558

  6. Department of Energy's team's analyses of Soviet designed VVERs

    SciTech Connect

    Not Available

    1989-09-01

    This document provides Appendices A through K of this report. The topics discussed are, respectively: radiation-induced embrittlement and annealing of reactor pressure vessel steels; loss-of-coolant-accident blowdown analyses; LOCA blowdown response analyses; non-seismic structural response analyses; seismic analyses; "S" seal integrity; reactor transient analyses; fire protection; aircraft impacts; and boric acid induced corrosion.

  7. Evaluation of Model Operational Analyses during DYNAMO

    NASA Astrophysics Data System (ADS)

    Ciesielski, Paul; Johnson, Richard

    2013-04-01

    A primary component of the observing system in the DYNAMO-CINDY2011-AMIE field campaign was an atmospheric sounding network comprised of two sounding quadrilaterals, one north and one south of the equator over the central Indian Ocean. During the experiment a major effort was undertaken to ensure the real-time transmission of these data onto the GTS (Global Telecommunication System) for dissemination to the operational centers (ECMWF, NCEP, JMA, etc.). Preliminary estimates indicate that ~95% of the soundings from the enhanced sounding network were successfully transmitted and potentially used in their data assimilation systems. Because of the wide use of operational and reanalysis products (e.g., in process studies, initializing numerical simulations, construction of large-scale forcing datasets for CRMs, etc.), their validity will be examined by comparing a variety of basic and diagnosed fields from two operational analyses (ECMWF and NCEP) to similar analyses based solely on sounding observations. Particular attention will be given to the vertical structures of apparent heating (Q1) and drying (Q2) from the operational analyses (OA), which are strongly influenced by cumulus parameterizations, a source of model infidelity. Preliminary results indicate that the OA products did a reasonable job at capturing the mean and temporal characteristics of convection during the DYNAMO enhanced observing period, which included the passage of two significant MJO events during the October-November 2011 period. For example, temporal correlations between Q2-budget derived rainfall from the OA products and that estimated from the TRMM satellite (i.e., the 3B42V7 product) were greater than 0.9 over the Northern Sounding Array of DYNAMO. However, closer inspection of the budget profiles shows notable differences between the OA products and the sounding-derived results in low-level (surface to 700 hPa) heating and drying structures. This presentation will examine these differences and

  8. Combustion Devices CFD Team Analyses Review

    NASA Technical Reports Server (NTRS)

    Rocker, Marvin

    2008-01-01

    A variety of CFD simulations performed by the Combustion Devices CFD Team at Marshall Space Flight Center will be presented. These analyses were performed to support Space Shuttle operations and Ares-1 Crew Launch Vehicle design. Results from the analyses will be shown along with pertinent information on the CFD codes and computational resources used to obtain the results. Six analyses will be presented - two related to the Space Shuttle and four related to the Ares I-1 launch vehicle now under development at NASA. First, a CFD analysis of the flow fields around the Space Shuttle during the first six seconds of flight and potential debris trajectories within those flow fields will be discussed. Second, the combusting flows within the Space Shuttle Main Engine's main combustion chamber will be shown. For the Ares I-1, an analysis of the performance of the roll control thrusters during flight will be described. Several studies are discussed related to the J-2X engine to be used on the upper stage of the Ares I-1 vehicle. A parametric study of the propellant flow sequences and mixture ratios within the GOX/GH2 spark igniters on the J-2X is discussed. Transient simulations will be described that predict the asymmetric pressure loads that occur on the rocket nozzle during the engine start as the nozzle fills with combusting gases. Simulations of issues that affect temperature uniformity within the gas generator used to drive the J-2X turbines will be described as well, both upstream of the chamber in the injector manifolds and within the combustion chamber itself.

  9. Stable isotopic analyses in paleoclimatic reconstruction

    SciTech Connect

    Wigand, P.E.

    1995-09-01

    Most traditional paleoclimatic proxy data have inherent time lags between climatic input and system response that constrain their use in accurate reconstruction of paleoclimate chronology, scaling of its variability, and the elucidation of the processes that determine its impact on the biotic and abiotic environment. With the exception of dendroclimatology, and studies of short-lived organisms and pollen recovered from annually varved lacustrine sediments, significant periods of time ranging from years, to centuries, to millennia may intervene between climate change and its first manifestation in paleoclimatic proxy data records. Reconstruction of past climate through changes in plant community composition derived from pollen sequences and plant remains from ancient woodrat middens, wet environments and dry caves all suffer from these lags. However, stable isotopic analyses can provide more immediate indication of biotic response to climate change. Evidence of past physiological response of organisms to changes in effective precipitation as climate varies can be provided by analyses of the stable isotopic content of plant macrofossils from various contexts. These analyses consider variation in the stable isotopic (hydrogen, oxygen and carbon) content of plant tissues as it reflects (1) past global or local temperature through changes in meteoric (rainfall) water chemistry in the case of the first two isotopes, and (2) plant stress through changes in plant respiration/transpiration processes under differing water availability, and varying atmospheric CO2 composition (which itself may actually be a net result of biotic response to climate change). Studies currently being conducted in the Intermountain West indicate both long- and short-term responses that, when calibrated with modern analogue studies, have the potential of revealing not only the timing of climate events, but their direction, magnitude and rapidity.

  10. Method of performing computational aeroelastic analyses

    NASA Technical Reports Server (NTRS)

    Silva, Walter A. (Inventor)

    2011-01-01

    Computational aeroelastic analyses typically use a mathematical model for the structural modes of a flexible structure and a nonlinear aerodynamic model that can generate a plurality of unsteady aerodynamic responses based on the structural modes for conditions defining an aerodynamic condition of the flexible structure. In the present invention, a linear state-space model is generated using a single execution of the nonlinear aerodynamic model for all of the structural modes where a family of orthogonal functions is used as the inputs. Then, static and dynamic aeroelastic solutions are generated using computational interaction between the mathematical model and the linear state-space model for a plurality of periodic points in time.
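    A minimal sketch of why a linear state-space surrogate can be identified efficiently: by superposition, a linear system's response to any combination of modal inputs equals the sum of its responses to the individual inputs, so one execution with a family of orthogonal inputs (here, unit impulses on each input channel) characterizes all modes at once. The matrices below are random placeholders, not an actual aeroelastic model from the patent.

```python
import numpy as np

def simulate(A, B, C, D, U):
    """Response of a discrete linear state-space system x[k+1]=Ax+Bu, y=Cx+Du."""
    x = np.zeros(A.shape[0])
    Y = []
    for u in U:
        Y.append(C @ x + D @ u)
        x = A @ x + B @ u
    return np.array(Y)

rng = np.random.default_rng(1)
A = 0.9 * np.eye(3) + 0.05 * rng.normal(size=(3, 3))  # placeholder dynamics
B = rng.normal(size=(3, 2))                            # two "structural mode" inputs
C = rng.normal(size=(1, 3))                            # one output channel
D = np.zeros((1, 2))

steps = 50
U1 = np.zeros((steps, 2)); U1[0, 0] = 1.0              # impulse on mode 1
U2 = np.zeros((steps, 2)); U2[0, 1] = 1.0              # impulse on mode 2

# Superposition: summed individual responses match the combined-input response
y_sum = simulate(A, B, C, D, U1) + simulate(A, B, C, D, U2)
y_combined = simulate(A, B, C, D, U1 + U2)
print(np.max(np.abs(y_sum - y_combined)))
```

    In the invention's setting the impulse responses would come from one run of the nonlinear aerodynamic model, and the identified linear model would then be coupled with the structural modes for static and dynamic aeroelastic solutions.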

  11. Analyses of Shuttle Orbiter approach and landing

    NASA Technical Reports Server (NTRS)

    Ashkenas, I. L.; Hoh, R. H.; Teper, G. L.

    1982-01-01

    A study of the Shuttle Orbiter approach and landing conditions is summarized. The causes of observed PIO-like flight deficiencies are listed, and possible corrective measures are examined. Closed-loop pilot/vehicle analyses are described, and a description is given of path-attitude stability boundaries. The latter novel approach is found to be of great value in delineating and illustrating the basic causes of this multiloop pilot control problem. It is shown that the analytical results are consistent with flight test and fixed-base simulation. Conclusions are drawn concerning possible improvements in the Shuttle Orbiter/Digital Flight Control System.

  12. Further analyses of Rio Cuarto impact glass

    NASA Technical Reports Server (NTRS)

    Schultz, Peter H.; Bunch, T. E.; Koeberl, C.; Collins, W.

    1993-01-01

    Initial analyses of the geologic setting, petrology, and geochemistry of glasses recovered from within and around the elongate Rio Cuarto (RC) craters in Argentina focused on selected samples in order to document the general similarity with impactites around other terrestrial impact craters and to establish their origin. Continued analysis has surveyed the diversity in compositions for a range of samples, examined further evidence for temperature and pressure history, and compared the results with experimentally fused loess from oblique hypervelocity impacts. These new results not only firmly establish their impact origin but provide new insight on the impact process.

  13. Environmental monitoring final report: groundwater chemical analyses

    SciTech Connect

    Not Available

    1984-02-01

    This report presents the results of analyses of groundwater quality at the SRC-I Demonstration Plant site in Newman, Kentucky. Samples were obtained from a network of 23 groundwater observation wells installed during previous studies. The groundwater was well within US EPA Interim Primary Drinking Water Standards for trace metals, radioactivity, and pesticides, but exceeded the standard for coliform bacteria. Several US EPA Secondary Drinking Water Standards were exceeded, namely manganese, color, iron, and total dissolved solids. Based on the results, Dames and Moore recommend that all wells be sterilized and that the wells built in 1980 be redeveloped. 1 figure, 6 tables.
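    The screening of measured groundwater quality against secondary drinking water standards amounts to a simple limits check, as in the sketch below. The measured values are hypothetical, not taken from the report; the limits are the familiar US EPA secondary standards for these parameters (mg/L).

```python
# US EPA Secondary Drinking Water Standards for the parameters named in the report
secondary_limits = {"manganese": 0.05, "iron": 0.3, "total dissolved solids": 500.0}

# Hypothetical measured concentrations (mg/L), for illustration only
measured = {"manganese": 0.12, "iron": 0.8, "total dissolved solids": 640.0}

# Flag every parameter whose measured value exceeds its secondary standard
exceedances = {k: measured[k] for k in secondary_limits if measured[k] > secondary_limits[k]}
print(sorted(exceedances))
```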

  14. Analyses of containment structures with corrosion damage

    SciTech Connect

    Cherry, J.L.

    1996-12-31

    Corrosion damage to a nuclear power plant containment structure can degrade the pressure capacity of the vessel. For the low-carbon, low-strength steels used in containments, the effect of corrosion on material properties is discussed. Strain-to-failure tests, in uniaxial tension, have been performed on corroded material samples. Results were used to select strain-based failure criteria for corroded steel. Using the ABAQUS finite element analysis code, the capacity of a typical PWR Ice Condenser containment with corrosion damage has been studied. Multiple analyses were performed with the location of the corrosion within the containment and the amount of corrosion varied in each analysis.

  15. Noninvasive imaging of bone microarchitecture

    PubMed Central

    Patsch, Janina M.; Burghardt, Andrew J.; Kazakia, Galateia; Majumdar, Sharmila

    2015-01-01

    The noninvasive quantification of peripheral compartment-specific bone microarchitecture is feasible with high-resolution peripheral quantitative computed tomography (HR-pQCT) and high-resolution magnetic resonance imaging (HR-MRI). In addition to classic morphometric indices, both techniques provide a suitable basis for virtual biomechanical testing using finite element (FE) analyses. Methodical limitations, morphometric parameter definition, and motion artifacts have to be considered to achieve optimal data interpretation from imaging studies. With increasing availability of in vivo high-resolution bone imaging techniques, special emphasis should be put on quality control including multicenter, cross-site validations. Importantly, conclusions from interventional studies investigating the effects of antiosteoporotic drugs on bone microarchitecture should be drawn with care, ideally involving imaging scientists, translational researchers, and clinicians. PMID:22172043

  16. Image Mission Attitude Support Experiences

    NASA Technical Reports Server (NTRS)

    Ottenstein, N.; Challa, M.; Home, A.; Harman, R.; Burley, R.

    2001-01-01

    The spin-stabilized Imager for Magnetopause to Aurora Global Exploration (IMAGE) is the National Aeronautics and Space Administration's (NASA's) first Medium-class Explorer Mission (MIDEX). IMAGE was launched into a highly elliptical polar orbit on March 25, 2000 from Vandenberg Air Force Base, California, aboard a Boeing Delta II 7326 launch vehicle. This paper presents some of the observations of the flight dynamics analyses during the launch and in-orbit checkout period through May 18, 2000. Three new algorithms - one algebraic and two differential correction - for computing the parameters of the coning motion of a spacecraft are described and evaluated using in-flight data from the autonomous star tracker (AST) on IMAGE. Other attitude aspects highlighted include support for active damping consequent upon the failure of the passive nutation damper, performance evaluation of the AST, evaluation of the Sun sensor and magnetometer using AST data, and magnetometer calibration.

  17. Stellar Imager

    NASA Technical Reports Server (NTRS)

    Carpenter, Kenneth

    2007-01-01

    The Stellar Imager (SI) is one of NASA's "Vision Missions" - concepts for future, space-based, strategic missions that could enormously increase our capabilities for observing the Cosmos. SI is designed as a UV/optical interferometer that will enable 0.1 milli-arcsecond (mas) spectral imaging of stellar surfaces and, via asteroseismology, of stellar interiors and of the Universe in general. The ultra-sharp images of the Stellar Imager will revolutionize our view of many dynamic astrophysical processes by transforming point sources into extended sources, and snapshots into evolving views. SI, with a characteristic angular resolution of 0.1 milli-arcseconds at 2000 Angstroms, represents an advance in image detail of several hundred times over that provided by the Hubble Space Telescope. The Stellar Imager will zoom in on what today, with few exceptions, we know only as point sources, revealing processes never before seen, thus providing a tool as fundamental to astrophysics as the microscope is to the study of life on Earth. SI's science focuses on the role of magnetism in the Universe, particularly on magnetic activity on the surfaces of stars like the Sun. Its prime goal is to enable long-term forecasting of solar activity and the space weather that it drives, in support of the Living With a Star program in the Exploration Era. SI will also revolutionize our understanding of the formation of planetary systems, of the habitability and climatology of distant planets, and of many magneto-hydrodynamically controlled processes in the Universe. Stellar Imager is included as a "Flagship and Landmark Discovery Mission" in the 2005 Sun Solar System Connection (SSSC) Roadmap and as a candidate for a "Pathways to Life Observatory" in the Exploration of the Universe Division (EUD) Roadmap (May 2005), and as such is a candidate mission for the 2025-2030 timeframe. An artist's drawing of the current "baseline" concept for SI is presented.

  18. Special analyses reveal coke-deposit structure

    SciTech Connect

    Albright, L.F.

    1988-08-01

    A scanning electron microscope (SEM) and an energy-dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits are discussed for a section of tube from an actual ethylene furnace (Furnace A) at a plant on the Texas Gulf Coast. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes and also improve overall ethylene plant operations.

  19. Autism and pain: a literature review

    PubMed Central

    Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria

    2010-01-01

    This literature review takes stock of published work in the field of pain and autism. The article first addresses the published studies on the modes of pain expression observed in this population. Several hypotheses that may explain the expressive particularities of people with autism are then reviewed: an excess of endorphins, particularities in sensory processing, and socio-communicative deficits. The review concludes with the question of assessing and taking into account pain in people with autism. The authors conclude that the results of published studies lack homogeneity and that further research is needed to reach consensual data in a field of study still little explored scientifically. Clinically, deepening knowledge in this area should make it possible to develop pain assessment tools and thereby ensure better day-to-day pain management. PMID:20808970

  20. Repeatability of published microarray gene expression analyses.

    PubMed

    Ioannidis, John P A; Allison, David B; Ball, Catherine A; Coulibaly, Issa; Cui, Xiangqin; Culhane, Aedín C; Falchi, Mario; Furlanello, Cesare; Game, Laurence; Jurman, Giuseppe; Mangion, Jon; Mehta, Tapan; Nitzberg, Michael; Page, Grier P; Petretto, Enrico; van Noort, Vera

    2009-02-01

    Given the complexity of microarray-based gene expression studies, guidelines encourage transparent design and public data availability. Several journals require public data deposition and several public databases exist. However, not all data are publicly available, and even when available, it is unknown whether the published results are reproducible by independent scientists. Here we evaluated the replication of data analyses in 18 articles on microarray-based gene expression profiling published in Nature Genetics in 2005-2006. One table or figure from each article was independently evaluated by two teams of analysts. We reproduced two analyses in principle and six partially or with some discrepancies; ten could not be reproduced. The main reason for failure to reproduce was data unavailability, and discrepancies were mostly due to incomplete data annotation or specification of data processing and analysis. Repeatability of published microarray studies is apparently limited. More strict publication rules enforcing public data availability and explicit description of data processing and analysis should be considered.

  1. Analyses of containment structures with corrosion damage

    SciTech Connect

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  2. Used Fuel Management System Interface Analyses - 13578

    SciTech Connect

    Howard, Robert; Busch, Ingrid; Nutt, Mark; Morris, Edgar; Puig, Francesc; Carter, Joe; Delley, Alexcia; Rodwell, Phillip; Hardin, Ernest; Kalinina, Elena; Clark, Robert; Cotton, Thomas

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  3. Bioinformatics tools for analysing viral genomic data.

    PubMed

    Orton, R J; Gu, Q; Hughes, J; Maabar, M; Modha, S; Vattipally, S B; Wilkie, G S; Davison, A J

    2016-04-01

    The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing.

  4. Transportation systems analyses: Volume 1: Executive Summary

    NASA Astrophysics Data System (ADS)

    1993-05-01

    The principal objective of this study is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform man delivery and return, cargo transfer, cargo delivery, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed, but not limited to, include the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationships between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. This executive summary of the transportation systems analyses (TSM) semi-annual report addresses the SSF logistics resupply. Our analysis parallels the ongoing NASA SSF redesign effort. Therefore, there could be no SSF design to drive our logistics analysis. Consequently, the analysis attempted to bound the reasonable SSF design possibilities (and the subsequent transportation implications). No other strategy really exists until after a final decision is rendered on the SSF configuration.

  5. Hierarchical regression for analyses of multiple outcomes.

    PubMed

    Richardson, David B; Hamra, Ghassan B; MacLehose, Richard F; Cole, Stephen R; Chu, Haitao

    2015-09-01

    In cohort mortality studies, there often is interest in associations between an exposure of primary interest and mortality due to a range of different causes. A standard approach to such analyses involves fitting a separate regression model for each type of outcome. However, the statistical precision of some estimated associations may be poor because of sparse data. In this paper, we describe a hierarchical regression model for estimation of parameters describing outcome-specific relative rate functions and associated credible intervals. The proposed model uses background stratification to provide flexible control for the outcome-specific associations of potential confounders, and it employs a hierarchical "shrinkage" approach to stabilize estimates of an exposure's associations with mortality due to different causes of death. The approach is illustrated in analyses of cancer mortality in 2 cohorts: a cohort of dioxin-exposed US chemical workers and a cohort of radiation-exposed Japanese atomic bomb survivors. Compared with standard regression estimates of associations, hierarchical regression yielded estimates with improved precision that tended to have less extreme values. The hierarchical regression approach also allowed the fitting of models with effect-measure modification. The proposed hierarchical approach can yield estimates of association that are more precise than conventional estimates when one wishes to estimate associations with multiple outcomes. PMID:26232395
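The shrinkage idea behind the hierarchical model can be illustrated with a toy empirical-Bayes sketch (all numbers are invented, and this is not the authors' fully Bayesian hierarchical regression with background stratification): each outcome-specific estimate is pulled toward the common mean, with noisier estimates pulled harder.

```python
# Minimal empirical-Bayes sketch of hierarchical "shrinkage": outcome-specific
# log-rate-ratio estimates (hypothetical numbers) are pulled toward their
# common mean, with the amount of shrinkage set by each estimate's sampling
# variance relative to the between-outcome variance.
beta = [0.80, 0.10, -0.30, 1.50, 0.25]   # per-cause estimates (invented)
var  = [0.40, 0.05,  0.20, 0.90, 0.10]   # their sampling variances (invented)

mu = sum(beta) / len(beta)               # common mean across outcomes

# Method-of-moments between-outcome variance: sample variance of the
# estimates minus the average sampling variance, truncated at zero.
tau2 = max(sum((b - mu) ** 2 for b in beta) / (len(beta) - 1)
           - sum(var) / len(var), 0.0)

# Posterior mean under a normal-normal model: a precision-weighted
# compromise between each raw estimate and the common mean; estimates
# with larger sampling variance move further toward mu.
shrunk = [(tau2 * b + v * mu) / (tau2 + v) for b, v in zip(beta, var)]
print([round(s, 3) for s in shrunk])
```

Each shrunk value lies between the raw estimate and the common mean, which is the "less extreme values with improved precision" behavior the abstract describes.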

  6. Waste Stream Analyses for Nuclear Fuel Cycles

    SciTech Connect

    N. R. Soelberg

    2010-08-01

    A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of data available to date for the fuel cycle options is insufficient to perform quantitative radioactive waste analyses using recommended metrics. This study has been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options is generally not yet available.

  7. Computational analyses of multilevel discourse comprehension.

    PubMed

    Graesser, Arthur C; McNamara, Danielle S

    2011-04-01

    The proposed multilevel framework of discourse comprehension includes the surface code, the textbase, the situation model, the genre and rhetorical structure, and the pragmatic communication level. We describe these five levels when comprehension succeeds and also when there are communication misalignments and comprehension breakdowns. A computer tool has been developed, called Coh-Metrix, that scales discourse (oral or print) on dozens of measures associated with the first four discourse levels. The measurement of these levels with an automated tool helps researchers track and better understand multilevel discourse comprehension. Two sets of analyses illustrate the utility of Coh-Metrix in discourse theory and educational practice. First, Coh-Metrix was used to measure the cohesion of the text base and situation model, as well as potential extraneous variables, in a sample of published studies that manipulated text cohesion. This analysis helped us better understand what was precisely manipulated in these studies and the implications for discourse comprehension mechanisms. Second, Coh-Metrix analyses are reported for samples of narrative and science texts in order to advance the argument that traditional text difficulty measures are limited because they fail to accommodate most of the levels of the multilevel discourse comprehension framework.

  8. ISFSI site boundary radiation dose rate analyses.

    PubMed

    Hagler, R J; Fero, A H

    2005-01-01

    Across the globe nuclear utilities are in the process of designing and analysing Independent Spent Fuel Storage Installations (ISFSI) for the purpose of above-ground spent-fuel storage, primarily to mitigate the filling of spent-fuel pools. Using a conjoining of discrete ordinates transport theory (DORT) and Monte Carlo (MCNP) techniques, an ISFSI was analysed to determine neutron and photon dose rates for a generic overpack and ISFSI pad configuration and design, at distances ranging from 1 to 1700 m from the ISFSI array. The calculated dose rates are used to address the requirements of 10CFR72.104, which provides limits to be enforced for the protection of the public by the NRC in regard to ISFSI facilities. For this overpack, dose rates decrease by three orders of magnitude through the first 200 m moving away from the ISFSI. In addition, the contributions from different source terms change over distance. It can be observed that although side photons provide the majority of the dose rate in this calculation, scattered photons and side neutrons take on more importance as the distance from the ISFSI is increased. PMID:16604670

  9. Ultrasonic Evaluation and Imaging

    SciTech Connect

    Crawford, Susan L.; Anderson, Michael T.; Diaz, Aaron A.; Larche, Michael R.; Prowant, Matthew S.; Cinson, Anthony D.

    2015-10-01

    Ultrasonic evaluation of materials for material characterization and flaw detection is as simple as manually moving a single-element probe across a specimen and looking at an oscilloscope display in real time, or as complex as automatically (under computer control) scanning a phased-array probe across a specimen and collecting encoded data for immediate or off-line data analyses. The reliability of the results in the second technique is greatly increased because of a higher density of measurements per scanned area and measurements that can be more precisely related to the specimen geometry. This chapter will briefly discuss applications of the collection of spatially encoded data and focus primarily on the off-line analyses in the form of data imaging. Pacific Northwest National Laboratory (PNNL) has been involved with assessing and advancing the reliability of inservice inspections of nuclear power plant components for over 35 years. Modern ultrasonic imaging techniques such as the synthetic aperture focusing technique (SAFT), phased-array (PA) technology and sound field mapping have undergone considerable improvements to effectively assess and better understand material constraints.

  10. Study of spin-scan imaging for outer planets missions. [imaging techniques for Jupiter orbiter missions

    NASA Technical Reports Server (NTRS)

    Russell, E. E.; Chandos, R. A.; Kodak, J. C.; Pellicori, S. F.; Tomasko, M. G.

    1974-01-01

    The constraints that are imposed on the Outer Planet Missions (OPM) imager design are of critical importance. Imager system modeling analyses define important parameters and systematic means for trade-offs applied to specific Jupiter orbiter missions. Possible image sequence plans for Jupiter missions are discussed in detail. Considered is a series of orbits that allow repeated near encounters with three of the Jovian satellites. The data handling involved in the image processing is discussed, and it is shown that only minimal processing is required for the majority of images for a Jupiter orbiter mission.

  11. Cartographic quality of ERTS-1 images

    NASA Technical Reports Server (NTRS)

    Welch, R. I.

    1973-01-01

    Analyses of simulated and operational ERTS images have provided initial estimates of resolution, ground resolution, detectability thresholds and other measures of image quality of interest to earth scientists and cartographers. Based on these values, including an approximate ground resolution of 250 meters for both RBV and MSS systems, the ERTS-1 images appear suited to the production and/or revision of planimetric and photo maps of 1:500,000 scale and smaller for which map accuracy standards are compatible with the imaged detail. Thematic mapping, although less constrained by map accuracy standards, will be influenced by measurement thresholds and errors which have yet to be accurately determined for ERTS images. This study also indicates the desirability of establishing a quantitative relationship between image quality values and map products which will permit both engineers and cartographers/earth scientists to contribute to the design requirements of future satellite imaging systems.

  12. X-ray CT analyses, models and numerical simulations: a comparison with petrophysical analyses in an experimental CO2 study

    NASA Astrophysics Data System (ADS)

    Henkel, Steven; Pudlo, Dieter; Enzmann, Frieder; Reitenbach, Viktor; Albrecht, Daniel; Ganzer, Leonhard; Gaupp, Reinhard

    2016-06-01

    An essential part of the collaborative research project H2STORE (hydrogen to store), which is funded by the German government, was a comparison of various analytical methods for characterizing reservoir sandstones from different stratigraphic units. In this context Permian, Triassic and Tertiary reservoir sandstones were analysed. Rock core materials, provided by RWE Gasspeicher GmbH (Dortmund, Germany), GDF Suez E&P Deutschland GmbH (Lingen, Germany), E.ON Gas Storage GmbH (Essen, Germany) and RAG Rohöl-Aufsuchungs Aktiengesellschaft (Vienna, Austria), were processed by different laboratory techniques; thin sections were prepared, rock fragments were crushed, and cubes of 1 cm edge length and plugs 3 to 5 cm in length with a diameter of about 2.5 cm were sawn from macroscopically homogeneous cores. To this prepared sample material, polarized light microscopy and scanning electron microscopy coupled with image analyses, specific surface area measurements (after Brunauer, Emmett and Teller, 1938; BET), He-porosity and N2-permeability measurements, and high-resolution microcomputer tomography (μ-CT), which was used for numerical simulations, were applied. All these methods were applied to largely the same sample material before static CO2 experiments under reservoir conditions and, for selected Permian sandstones, also after them. A major concern in comparing the results of these methods is an appraisal of the reliability of the given porosity, permeability and mineral-specific reactive (inner) surface area data. The CO2 experiments modified the petrophysical as well as the mineralogical/geochemical rock properties. These changes are detectable by all applied analytical methods. Nevertheless, a major outcome of the high-resolution μ-CT analyses and subsequent numerical data simulations was that quite similar data sets and data interpretations were obtained by the different petrophysical standard methods. Moreover, the μ-CT analyses are not only time saving, but also

  13. Medical imaging

    NASA Astrophysics Data System (ADS)

    Elliott, Alex

    2005-07-01

    Diagnostic medical imaging is a fundamental part of the practice of modern medicine and is responsible for the expenditure of considerable amounts of capital and revenue monies in healthcare systems around the world. Much research and development work is carried out, both by commercial companies and the academic community. This paper reviews briefly each of the major diagnostic medical imaging techniques—X-ray (planar and CT), ultrasound, nuclear medicine (planar, SPECT and PET) and magnetic resonance. The technical challenges facing each are highlighted, with some of the most recent developments. In terms of the future, interventional/peri-operative imaging, the advancement of molecular medicine and gene therapy are identified as potential areas of expansion.

  14. Imaging Hemodynamics

    PubMed Central

    Jennings, Dominique; Raghunand, Natarajan; Gillies, Robert J.

    2014-01-01

    Microvascular permeability is a pharmacologic indicator of tumor response to therapy, and it is expected that this biomarker will evolve into a clinical surrogate endpoint and be integrated into protocols for determining patient response to antiangiogenic or antivascular therapies. This review discusses the physiological context of vessel permeability in an imaging setting, how it is affected by active and passive transport mechanisms, and how it is described mathematically for both theoretical and complex dynamic microvessel membranes. Many research groups have established dynamic-enhanced imaging protocols for estimating this important parameter. This review discusses those imaging modalities, the advantages and disadvantages of each, and how they compare in terms of their ability to deliver information about therapy-associated changes in microvessel permeability in humans. Finally, this review discusses future directions and improvements needed in these areas. PMID:18506397

  15. Hierarchical Segmentation Enhances Diagnostic Imaging

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Bartron Medical Imaging LLC (BMI), of New Haven, Connecticut, gained a nonexclusive license from Goddard Space Flight Center to use the RHSEG software in medical imaging. To manage image data, BMI then licensed two pattern-matching software programs from NASA's Jet Propulsion Laboratory that were used in image analysis and three data-mining and edge-detection programs from Kennedy Space Center. More recently, BMI made NASA history by being the first company to partner with the Space Agency through a Cooperative Research and Development Agreement to develop a 3-D version of RHSEG. With U.S. Food and Drug Administration clearance, BMI will sell its Med-Seg imaging system with the 2-D version of the RHSEG software to analyze medical imagery from CAT and PET scans, MRI, ultrasound, digitized X-rays, digitized mammographies, dental X-rays, soft tissue analyses, moving object analyses, and soft-tissue slides such as Pap smears for the diagnoses and management of diseases. Extending the software's capabilities to three dimensions will eventually enable production of pixel-level views of a tumor or lesion, early identification of plaque build-up in arteries, and identification of density levels of microcalcification in mammographies.

  16. Chapter 9: Analyses Using Disease Ontologies

    PubMed Central

    Shah, Nigam H.; Cole, Tyler; Musen, Mark A.

    2012-01-01

    Advanced statistical methods used to analyze high-throughput data such as gene-expression assays result in long lists of “significant genes.” One way to gain insight into the significance of altered expression levels is to determine whether Gene Ontology (GO) terms associated with a particular biological process, molecular function, or cellular component are over- or under-represented in the set of genes deemed significant. This process, referred to as enrichment analysis, profiles a gene-set, and is widely used to make sense of the results of high-throughput experiments. The canonical example of enrichment analysis is when the output dataset is a list of genes differentially expressed in some condition. To determine the biological relevance of a lengthy gene list, the usual solution is to perform enrichment analysis with the GO. We can aggregate the annotating GO concepts for each gene in this list, and arrive at a profile of the biological processes or mechanisms affected by the condition under study. While GO has been the principal target for enrichment analysis, the methods of enrichment analysis are generalizable. We can conduct the same sort of profiling along other ontologies of interest. Just as scientists can ask “Which biological process is over-represented in my set of interesting genes or proteins?” we can also ask “Which disease (or class of diseases) is over-represented in my set of interesting genes or proteins?”. For example, by annotating known protein mutations with disease terms from the ontologies in BioPortal, Mort et al. recently identified a class of diseases—blood coagulation disorders—that were associated with a 14-fold depletion in substitutions at O-linked glycosylation sites. With the availability of tools for automatic annotation of datasets with terms from disease ontologies, there is no reason to restrict enrichment analyses to the GO. In this chapter, we will discuss methods to perform enrichment analysis using any
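The over-representation test at the heart of enrichment analysis is conventionally a hypergeometric (one-sided Fisher) tail probability. A minimal stdlib sketch with invented counts, assuming N genes on the platform, K annotated with the term of interest, and n significant genes of which k carry the term:

```python
from math import comb

def hypergeom_sf(k, N, K, n):
    """P(X >= k): probability of seeing at least k annotated genes when
    n genes are drawn without replacement from N, of which K are annotated."""
    numer = sum(comb(K, x) * comb(N - K, n - x)
                for x in range(k, min(K, n) + 1))
    return numer / comb(N, n)

# Invented counts for illustration.
N, K, n, k = 20000, 400, 250, 15

p_over = hypergeom_sf(k, N, K, n)   # over-representation p-value
fold = (k / n) / (K / N)            # fold enrichment vs. chance expectation
print(f"p = {p_over:.3g}, fold enrichment = {fold:.1f}x")
```

Here the chance expectation is n*K/N = 5 annotated genes in the significant list, so observing 15 is a threefold enrichment with a small tail probability; the same machinery applies whether the annotation source is the GO or a disease ontology.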

  17. Angiographic Imaging

    PubMed Central

    Morris, D. Christopher

    1986-01-01

    Angiographic imaging in 1986 employs not only conventional film arteriography and venography, but also digital subtraction angiography (DSA). Arteriography is still the best method of demonstrating pathology in patients with peripheral vascular disease. Transluminal angioplasty, its indications and results are discussed. Patients with suspected renovascular hypertension should be given intravenous DSA and, if pathology is demonstrated, renin sampling as well. Patients with severe, acute, life-threatening hemorrhage may have angiography not only to localize bleeding sites, but also to treat them by transcatheter embolization techniques. Various other angiographic techniques including venous sampling are discussed briefly. PMID:21267204

  18. Musculoskeletal Imaging

    PubMed Central

    Connell, Douglas G.

    1986-01-01

    Musculoskeletal problems account for a significant portion of primary care medicine. Increase in the public awareness of physical fitness has led to an increase in both the incidence and appreciation of musculoskeletal disorders. This discussion considers the investigation of disorders involving the shoulder, wrist, foot, knee and pelvis. Emphasis is placed on new imaging techniques and their place in the investigation of these problems, as well as on their relationship to the more traditional modalities. PMID:21267198

  19. Brain Imaging

    PubMed Central

    Racine, Eric; Bar-Ilan, Ofek; Illes, Judy

    2007-01-01

    Advances in neuroscience are increasingly intersecting with issues of ethical, legal, and social interest. This study is an analysis of press coverage of an advanced technology for brain imaging, functional magnetic resonance imaging, that has gained significant public visibility over the past ten years. Discussion of issues of scientific validity and interpretation dominated over ethical content in both the popular and specialized press. Coverage of research on higher order cognitive phenomena specifically attributed broad personal and societal meaning to neuroimages. The authors conclude that neuroscience provides an ideal model for exploring science communication and ethics in a multicultural context. PMID:17330151

  20. Phylogenomic Analyses Support Traditional Relationships within Cnidaria.

    PubMed

    Zapata, Felipe; Goetz, Freya E; Smith, Stephen A; Howison, Mark; Siebert, Stefan; Church, Samuel H; Sanders, Steven M; Ames, Cheryl Lewis; McFadden, Catherine S; France, Scott C; Daly, Marymegan; Collins, Allen G; Haddock, Steven H D; Dunn, Casey W; Cartwright, Paulyn

    2015-01-01

    Cnidaria, the sister group to Bilateria, is a highly diverse group of animals in terms of morphology, lifecycles, ecology, and development. How this diversity originated and evolved is not well understood because phylogenetic relationships among major cnidarian lineages are unclear, and recent studies present contrasting phylogenetic hypotheses. Here, we use transcriptome data from 15 newly-sequenced species in combination with 26 publicly available genomes and transcriptomes to assess phylogenetic relationships among major cnidarian lineages. Phylogenetic analyses using different partition schemes and models of molecular evolution, as well as topology tests for alternative phylogenetic relationships, support the monophyly of Medusozoa, Anthozoa, Octocorallia, Hydrozoa, and a clade consisting of Staurozoa, Cubozoa, and Scyphozoa. Support for the monophyly of Hexacorallia is weak due to the equivocal position of Ceriantharia. Taken together, these results further resolve deep cnidarian relationships, largely support traditional phylogenetic views on relationships, and provide a historical framework for studying the evolutionary processes involved in one of the most ancient animal radiations.

  1. Project analysis and integration economic analyses summary

    NASA Technical Reports Server (NTRS)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet, growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  2. Precise Chemical Analyses of Planetary Surfaces

    NASA Technical Reports Server (NTRS)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David; Lisse, Carey

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  3. Comparative Analyses of Plant Transcription Factor Databases

    PubMed Central

    Ramirez, Silvia R; Basu, Chhandak

    2009-01-01

    Transcription factors (TFs) are proteinaceous complexes that bind to promoter regions in DNA and affect transcription initiation. Plant TFs control gene expression, and genes control many physiological processes, which in turn trigger cascades of biochemical reactions in plant cells. The databases available for plant TFs are fairly abundant, but they all convey different information and in different formats. Some of the publicly available plant TF databases may be narrow, while others are broad in scope. For example, some of the best TF databases are very specific to just one plant species, while other databases cover up to 20 different plant species. In this review, plant TF databases ranging from a single species to many are assessed and described. The comparative analyses of all the databases and their advantages and disadvantages are also discussed. PMID:19721806

  4. Anthocyanin analyses of Vaccinium fruit dietary supplements.

    PubMed

    Lee, Jungmin

    2016-09-01

    Vaccinium fruit ingredients within dietary supplements were identified by comparison with anthocyanin analyses of known Vaccinium profiles (a demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on high-performance liquid chromatography [HPLC] separation) indicated whether products' listed fruit origins were authentic. Over 30% of the Vaccinium fruit (cranberry, lingonberry, bilberry, and blueberry; 14 of 45) products available as dietary supplements did not contain the fruit listed as ingredients. Six supplements contained no anthocyanins. Five others had contents differing from the labeled fruit (e.g., bilberry capsules containing Andean blueberry fruit). Of the samples that did contain the specified fruit (n = 27), anthocyanin content ranged from 0.04 to 14.37 mg per capsule, tablet, or teaspoon (5 g). Approaches to utilizing anthocyanins in assessment of sample authenticity and a discussion of the challenges with anthocyanin profiles in quality control are both presented. PMID:27625778

  5. Analyse de formes par moiré

    NASA Astrophysics Data System (ADS)

    Harthong, J.; Sahli, H.; Poinsignon, R.; Meyrueis, P.

    1991-01-01

    We present a mathematical analysis of moiré phenomena for shape recognition. The basic theoretical concept - and tool - will be the contour function. We show that the mathematical analysis is greatly simplified by the systematic recourse to this tool. The analysis presented permits a simultaneous treatment of two different modes of implementing the moiré technique: the direct mode (widely used and well-known), and the converse mode (scarcely used). The converse mode consists in computing and designing a grating especially for one model of object, in such a manner that if (and only if) the object is in conformity with the prescribed model, the resulting moiré fringes are parallel straight lines. We give explicit formulas and algorithms for such computations.

  6. Statistical analyses of the relative risk.

    PubMed Central

    Gart, J J

    1979-01-01

    Let P1 be the probability of a disease in one population and P2 be the probability of a disease in a second population. The ratio of these quantities, R = P1/P2, is termed the relative risk. We consider first the analyses of the relative risk from retrospective studies. The relation between the relative risk and the odds ratio (or cross-product ratio) is developed. The odds ratio can be considered a parameter of an exponential model possessing sufficient statistics. This permits the development of exact significance tests and confidence intervals in the conditional space. Unconditional tests and intervals are also considered briefly. The consequences of misclassification errors and of ignoring matching or stratifying are also considered. The various methods are extended to combination of results over the strata. Examples of case-control studies testing the association between HL-A frequencies and cancer illustrate the techniques. The parallel analyses of prospective studies are given. If P1 and P2 are small with large sample sizes, the appropriate model is a Poisson distribution. This yields an exponential model with sufficient statistics. Exact conditional tests and confidence intervals can then be developed. Here we consider the case where two populations are compared adjusting for sex differences as well as for the strata (or covariate) differences such as age. The methods are applied to two examples: (1) testing in the two sexes the ratio of relative risks of skin cancer in people living in different latitudes, and (2) testing over time the ratio of the relative risks of cancer in two cities, one of which fluoridated its drinking water and one of which did not. PMID:540589
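The odds ratio from a 2x2 case-control table can be computed directly. The sketch below uses the simple unconditional estimate with a Wald (log-scale normal) confidence interval, not the exact conditional methods the paper develops; the cell counts in the usage example are invented for illustration.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.959963984540054):
    """Odds ratio (ad/bc) for a 2x2 table
        exposed cases = a, unexposed cases = b,
        exposed controls = c, unexposed controls = d,
    with a Wald confidence interval built on the log scale. This is
    the large-sample unconditional approximation, not the exact
    conditional test discussed in the abstract. z defaults to the
    97.5th percentile of the standard normal (95% interval)."""
    or_hat = (a * d) / (b * c)
    se_log = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lo = math.exp(math.log(or_hat) - z * se_log)
    hi = math.exp(math.log(or_hat) + z * se_log)
    return or_hat, (lo, hi)

# Hypothetical counts: 30 exposed / 70 unexposed cases,
# 10 exposed / 90 unexposed controls.
est, (lo, hi) = odds_ratio_ci(30, 70, 10, 90)
```

When the disease is rare in both populations, this odds ratio approximates the relative risk R = P1/P2, which is why retrospective studies can estimate R at all.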

  7. Reporting guidelines for population pharmacokinetic analyses.

    PubMed

    Dykstra, Kevin; Mehrotra, Nitin; Tornøe, Christoffer Wenzel; Kastrissios, Helen; Patel, Bela; Al-Huniti, Nidal; Jadhav, Pravin; Wang, Yaning; Byon, Wonkyung

    2015-06-01

    The purpose of this work was to develop a consolidated set of guiding principles for reporting of population pharmacokinetic (PK) analyses, based on input from a survey of practitioners as well as discussions among industry, consulting, and regulatory scientists. The survey found that identification of population covariate effects on drug exposure and support for dose selection (where population PK frequently serves as preparatory analysis to exposure-response modeling) are the main areas of influence for population PK analysis. The proposed guidelines consider two main purposes of population PK reports: (1) to present key analysis findings and their impact on drug development decisions, and (2) to document the analysis methods for the dual purpose of enabling review of the analysis and facilitating future use of the models. This work also identified two main audiences for the reports: (1) a technically competent group responsible for in-depth review of the data, methodology, and results; and (2) a scientifically literate, but not technically adept, group whose main interest is in the implications of the analysis for the broader drug development program. We recommend a generalized question-based approach with six questions that need to be addressed throughout the report. We recommend eight sections (Synopsis, Introduction, Data, Methods, Results, Discussion, Conclusions, Appendix) with suggestions for the target audience and level of detail for each section. A section providing general expectations regarding population PK reporting from a regulatory perspective is also included. We consider this an important step towards industrialization of the field of pharmacometrics such that a non-technical audience also understands the role of pharmacometric analyses in decision making. Population PK reports were chosen as representative reports to derive these recommendations; however, the guiding principles presented here are applicable to all pharmacometric reports.

  8. Reporting guidelines for population pharmacokinetic analyses.

    PubMed

    Dykstra, Kevin; Mehrotra, Nitin; Tornøe, Christoffer Wenzel; Kastrissios, Helen; Patel, Bela; Al-Huniti, Nidal; Jadhav, Pravin; Wang, Yaning; Byon, Wonkyung

    2015-08-01

    The purpose of this work was to develop a consolidated set of guiding principles for the reporting of population pharmacokinetic (PK) analyses based on input from a survey of practitioners as well as discussions between industry, consulting, and regulatory scientists. The survey found that identification of population covariate effects on drug exposure and support for dose selection (in which population PK frequently serves as preparatory analysis for exposure-response modeling) are the main areas of influence for population PK analysis. The proposed guidelines consider 2 main purposes of population PK reports: (1) to present key analysis findings and their impact on drug development decisions, and (2) as documentation of the analysis methods for the dual purpose of enabling review of the analysis and facilitating future use of the models. This work also identified 2 main audiences for the reports: (1) a technically competent group responsible for in-depth review of the data, methodology, and results; and (2) a scientifically literate but not technically adept group, whose main interest is in the implications of the analysis for the broader drug development program. We recommend a generalized question-based approach with 6 questions that need to be addressed throughout the report. We recommend 8 sections (Synopsis, Introduction, Data, Methods, Results, Discussion, Conclusions, Appendix) with suggestions for the target audience and level of detail for each section. A section providing general expectations regarding population PK reporting from a regulatory perspective is also included. We consider this an important step toward industrialization of the field of pharmacometrics such that a nontechnical audience also understands the role of pharmacometric analyses in decision making. Population PK reports were chosen as representative reports to derive these recommendations; however, the guiding principles presented here are applicable to all pharmacometric reports.

  10. High-Resolution Force Balance Analyses of Tidewater Glacier Dynamics

    NASA Astrophysics Data System (ADS)

    Enderlin, E. M.; Hamilton, G. S.; O'Neel, S.

    2015-12-01

    Changes in glacier velocity, thickness, and terminus position have been used to infer the dynamic response of tidewater glaciers to environmental perturbations, yet few analyses have attempted to quantify the associated variations in the glacier force balance. Where repeat high-resolution ice thickness and velocity estimates are available, force balance time series can be constructed to investigate the redistribution of driving and resistive forces associated with changes in terminus position. Comparative force balance analyses may, therefore, help us understand the variable dynamic response observed for glaciers in close proximity to each other. Here we construct force balance time series for Helheim Glacier, SE Greenland, and Columbia Glacier, SE Alaska, to investigate differences in dynamic sensitivity to terminus position change. The analysis relies on in situ and remotely sensed observations of ice thickness, velocity, and terminus position. Ice thickness time series are obtained from stereo satellite image-derived surface elevations and continuity-derived bed elevations that are constrained by airborne radar observations. Surface velocity time series are obtained from interferometric synthetic aperture radar (InSAR) observations. Approximately daily terminus positions are from a combination of satellite images and terrestrial time-lapse photographs. Helheim and Columbia glaciers are two of the best-studied Arctic tidewater glaciers with comprehensive high-resolution observational time series, yet we find that bed elevation uncertainties and poorly constrained stress-coupling length estimates still hinder the analysis of spatial and temporal force balance variations. Here we use a new observationally based method to estimate the stress-coupling length, which successfully reduces noise in the derived force balance but preserves spatial variations that can be over-smoothed when estimating the stress-coupling length as a scalar function of the ice thickness.

  11. Molecular analyses of an acidic transthyretin Asn 90 variant.

    PubMed Central

    Saraiva, M J; Almeida, M R; Alves, I L; Moreira, P; Gawinowicz, M; Costa, P P; Rauh, S; Banhzoff, A; Altland, K

    1991-01-01

    A mutation in transthyretin (TTR Asn 90) has been identified in the Portuguese and German populations. This variant has a lower pI and was found by screening analyses in 2/4,000 German subjects and in 4/1,200 Portuguese by using either double one-dimensional (D1-D) electrophoresis with isoelectric focusing (IEF) or hybrid isoelectric focusing in immobilized pH gradient (HIEF) as the final separation step. The Portuguese population sample was from the area where TTR Met 30-associated familial amyloidotic polyneuropathy (FAP) prevails, and it was divided into (a) a group of 500 individuals belonging to FAP kindreds and (b) a group of 700 collected at random. HIEF showed two particular situations: (1) one case, from an FAP kindred, was simultaneously carrier of the Met 30 substitution and the acidic variant, and (2) one individual, from the randomly selected Portuguese sample, had only the acidic monomer. Comparative peptide mapping, by HPLC, of the acidic variant carriers and of normal TTR showed the presence of an abnormal tryptic peptide, not present in the normal TTR digests, with an asparagine-for-histidine substitution at position 90 explained by a single base change of adenine for cytosine in the histidine codon. This was confirmed at the DNA level by RFLP analyses of PCR-amplified material after digestion with SphI and BsmI. In all carriers of the Asn 90 substitution, no indicators were found for an association with traits characteristic for FAP. PMID:1850190

  12. Analyses of moisture in polymers and composites

    NASA Technical Reports Server (NTRS)

    Ryan, L. E.; Vaughan, R. W.

    1980-01-01

    A suitable method for the direct measurement of moisture concentrations after humidity/thermal exposure on state-of-the-art epoxy and polyimide resins and their graphite and glass fiber reinforcements was investigated. Methods for the determination of moisture concentration profiles, moisture diffusion modeling and moisture induced chemical changes were examined. Carefully fabricated, precharacterized epoxy and polyimide neat resins and their AS graphite and S glass reinforced composites were exposed to humid conditions using heavy water (D2O), at ambient and elevated temperatures. These specimens were fixtured to theoretically limit the D2O permeation to a unidirectional penetration axis. The analytical techniques evaluated were: (1) laser pyrolysis gas chromatography mass spectrometry; (2) solids probe mass spectrometry; (3) laser pyrolysis conventional infrared spectroscopy; and (4) infrared imaging thermovision. The most reproducible and sensitive technique was solids probe mass spectrometry. The fabricated exposed specimens were analyzed for D2O profiling after humidity/thermal conditioning at three exposure time durations.

  13. Imaging sciences workshop

    SciTech Connect

    Candy, J.V.

    1994-11-15

    This workshop on the Imaging Sciences sponsored by Lawrence Livermore National Laboratory contains short abstracts/articles submitted by speakers. The topic areas covered include the following: Astronomical Imaging; biomedical imaging; vision/image display; imaging hardware; imaging software; Acoustic/oceanic imaging; microwave/acoustic imaging; computed tomography; physical imaging; imaging algorithms. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  14. Image Processing Diagnostics: Emphysema

    NASA Astrophysics Data System (ADS)

    McKenzie, Alex

    2009-10-01

    Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure more accurately the amount of affected lung. CT scan images show clearly whether a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, which appears merely as subtle, barely distinct dark spots on the lung. Our goal is to create a software plug-in that interfaces with existing open source medical imaging software, to automate the process of accurately diagnosing and determining emphysema severity levels in patients. This will be accomplished by performing a number of statistical calculations using data taken from CT scan images of several patients representing a wide range of severity of the disease. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently utilized methods, which involve looking at percentages of radiodensities in air passages of the lung.
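The skewness statistic mentioned above is the third standardized moment of the attenuation-value histogram; emphysematous tissue shifts the histogram toward low densities and changes its skew. A minimal sketch, assuming plain lists of voxel values; the exact estimator and any bias correction used in the study are not specified in the abstract.

```python
def skewness(values):
    """Population skewness (third standardized moment) of a sequence
    of CT attenuation values. Positive skew means a longer tail toward
    high values; a symmetric histogram has skewness 0."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    sd = var ** 0.5
    return sum(((v - mean) / sd) ** 3 for v in values) / n

# A symmetric toy histogram has zero skew; a long right tail is positive.
sym = skewness([1, 2, 3, 4, 5])
tailed = skewness([0, 0, 0, 0, 10])
```

In practice one would compute this over the segmented lung voxels of each CT slice and compare against thresholds calibrated on patients of known severity.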

  15. BoneJ: Free and extensible bone image analysis in ImageJ.

    PubMed

    Doube, Michael; Kłosowski, Michał M; Arganda-Carreras, Ignacio; Cordelières, Fabrice P; Dougherty, Robert P; Jackson, Jonathan S; Schmid, Benjamin; Hutchinson, John R; Shefelbine, Sandra J

    2010-12-01

    Bone geometry is commonly measured on computed tomographic (CT) and X-ray microtomographic (μCT) images. We obtained hundreds of CT, μCT and synchrotron μCT images of bones from diverse species that needed to be analysed remote from scanning hardware, but found that available software solutions were expensive, inflexible or methodologically opaque. We implemented standard bone measurements in a novel ImageJ plugin, BoneJ, with which we analysed trabecular bone, whole bones and osteocyte lacunae. BoneJ is open source and free for anyone to download, use, modify and distribute.

  16. Biblical Images.

    ERIC Educational Resources Information Center

    Nir, Yeshayahu

    1987-01-01

    Responds to Marjorie Munsterberg's review of "The Bible and the Image: The History of Photography in the Holy Land 1839-1899." Claims that Munsterberg provided an incomplete and inaccurate knowledge of the book's content, and that she considered Western pictorial traditions as the only valid measure in the study of the history of photography.…

  17. [Endometrial imaging].

    PubMed

    Lemercier, E; Genevois, A; Dacher, J N; Benozio, M; Descargues, G; Marpeau, L

    2000-12-01

    The diagnostic value of endovaginal sonography in benign or malignant endometrial pathology is high, and is further increased by sonohysterography. Sonohysterography is useful in the assessment of endometrial thickness and in determining further investigations. MRI is accurate in the diagnosis of uterine adenomyosis and is the imaging modality of choice for preoperative endometrial cancer staging. PMID:11173754

  18. Narrowband Imaging

    NASA Astrophysics Data System (ADS)

    Goldman, Don S.

    The Hubble Space Telescope (HST) captured the attention of the world when it released its astounding image in 1995 of the Eagle Nebula (Messier 16) often called "The Pillars of Creation" (Fig. 1). It contained dark, billowing towers of gas and dust rising majestically into a background of glowing radiation. It told a story of new star formation.

  19. Inner Image

    ERIC Educational Resources Information Center

    Mollhagen, Nancy

    2004-01-01

    In this article, the author states that she has always loved self portraits but most teenagers do not enjoy looking too closely at their own faces in an effort to replicate them. Thanks to a new digital camera, she was able to use this new technology to inspire students to take a closer look at their inner image. Prior to the self-portrait…

  20. Forest Imaging

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA's Technology Applications Center, with other government and academic agencies, provided technology for improved resources management to the Cibola National Forest. Landsat satellite images enabled vegetation over a large area to be classified for purposes of timber analysis, wildlife habitat, range measurement and development of general vegetation maps.

  1. Photoacoustic imaging platforms for multimodal imaging

    PubMed Central

    2015-01-01

    Photoacoustic (PA) imaging is a hybrid biomedical imaging method that exploits both acoustical and optical properties and can provide both functional and structural information. Therefore, PA imaging can complement other imaging methods, such as ultrasound imaging, fluorescence imaging, optical coherence tomography, and multi-photon microscopy. This article reviews techniques that integrate PA with the above imaging methods and describes their applications. PMID:25754364

  2. Database-Driven Analyses of Astronomical Spectra

    NASA Astrophysics Data System (ADS)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band whose position depends on the masses of the vibrating nuclei, and thus on the isotope. Molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic
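The isotope sensitivity described above follows from the harmonic-oscillator relation: the vibrational frequency scales as the inverse square root of the reduced mass, so one extra neutron measurably shifts a band. A minimal sketch, using the CO fundamental near 2143 cm^-1 as an illustrative input value:

```python
def vibrational_wavenumber(omega_ref, mu_ref, mu_iso):
    """Harmonic-oscillator isotope shift: omega scales as
    1/sqrt(reduced mass), so the band of the heavier isotopologue is
    shifted to lower wavenumber. Anharmonicity is neglected in this
    sketch, so the result is approximate."""
    return omega_ref * (mu_ref / mu_iso) ** 0.5

# Reduced masses (in atomic mass units) of 12C16O and 13C16O.
mu_12co = 12.0 * 16.0 / (12.0 + 16.0)
mu_13co = 13.0 * 16.0 / (13.0 + 16.0)

# Predicted 13C16O fundamental from the ~2143 cm^-1 12C16O band:
# a shift of roughly 50 cm^-1, easily resolved spectroscopically.
omega_13co = vibrational_wavenumber(2143.0, mu_12co, mu_13co)
```

This is why isotopologue bands such as 13CO are routinely used to probe column densities where the main isotopologue's lines are saturated.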

  3. Operational Satellite-based Surface Oil Analyses (Invited)

    NASA Astrophysics Data System (ADS)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near-IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-Skymed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), Aster (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, location of natural oil seeps and a variety of in situ oil observations. The analyses were made available as jpegs, pdfs, shapefiles and, through Google Earth, KML files, and were also posted on a variety of websites including Geoplatform and ERMA. The complete archive, from the very first analysis issued just 5 hours after the rig sank through the final analysis issued in August, is still publicly available on the NOAA/NESDIS website http://www.ssd.noaa.gov/PS/MPS/deepwater.html. SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager's primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response, including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  4. Efficient ALL vs. ALL collision risk analyses

    NASA Astrophysics Data System (ADS)

    Escobar, D.; Paskowitz, M.; Agueda, A.; Garcia, G.; Molina, M.

    2011-09-01

    In recent years, space debris has gained a lot of attention due to the increasing amount of uncontrolled man-made objects orbiting the Earth. This population poses a significant and constantly growing threat to operational satellites. In order to face this threat in an independent manner, ESA has launched an initiative for the development of a European SSA System where GMV is participating via several activities. Apart from those activities financed by ESA, GMV has developed closeap, a tool for efficient conjunction assessment and collision probability prediction. ESA's NAPEOS has been selected as computational engine and numerical propagator to be used in the tool, which can be considered as an add-on to the standard NAPEOS package. closeap makes use of the same orbit computation, conjunction assessment and collision risk algorithms implemented in CRASS, but at the same time both systems are completely independent. Moreover, the implementation in closeap has been validated against CRASS with excellent results. This paper describes the performance improvements implemented in closeap at algorithm level to ensure that the most time demanding scenarios (e.g., all catalogued objects are analysed against each other - all vs. all scenarios -) can be analysed in a reasonable amount of time with commercial-off-the-shelf hardware. However, the amount of space debris increases steadily due to human activities. Thus, the number of objects involved in a full collision assessment is expected to increase notably and, consequently, the computational cost, which scales as the square of the number of objects, will increase as well. Additionally, orbit propagation algorithms that are computationally expensive might be needed to predict more accurately the trajectories of the space debris. In order to cope with such computational needs, the next natural step in the development of collision assessment tools is the use of parallelization techniques. In this paper we investigate
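The quadratic cost the abstract refers to comes from comparing every object against every other. The sketch below is not closeap's algorithm (which propagates orbits over time windows and applies filtering stages); it is only a naive single-epoch distance screen that makes the O(n^2) scaling concrete. All names and the threshold are illustrative.

```python
def close_pairs(positions, threshold):
    """Naive all-vs-all screening: flag every pair of objects whose
    separation at one epoch is below a threshold. With n objects this
    performs n*(n-1)/2 distance checks, which is why a full catalogue
    assessment scales as the square of the number of objects.
    'positions' is a list of (x, y, z) tuples in consistent units."""
    flagged = []
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            dz = positions[i][2] - positions[j][2]
            if (dx * dx + dy * dy + dz * dz) ** 0.5 < threshold:
                flagged.append((i, j))
    return flagged

# Three hypothetical objects; only the first two are within 1 unit.
pairs = close_pairs([(0, 0, 0), (0.5, 0, 0), (100, 0, 0)], 1.0)
```

Because the pairwise loop is embarrassingly parallel (each (i, j) check is independent), it is a natural target for the parallelization techniques the paper investigates.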

  5. High performance liquid chromatography in pharmaceutical analyses.

    PubMed

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In pre-marketing testing of drugs and in their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of modifying its polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The greater choice of stationary phases is the next factor which enables realization of good separation. The separation column connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector) and to hyphenated systems such as HPLC-MS and HPLC-NMR are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, to provide quantitative results, and also to monitor the progress of the therapy of a disease. Figure 1 shows a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  6. Growth curve analyses in selected duck lines.

    PubMed

    Maruyama, K; Vinyard, B; Akbar, M K; Shafer, D J; Turk, C M

    2001-12-01

    1. Growth patterns of male ducks from 4 lines (lines A, B, C and D) selected for market weight were analysed and compared to growth patterns of ducks in the respective line 7 generations earlier. Growth curves were analysed using procedures derived from the Weibull sigmoidal function and the linear-linear relative growth rate model and simple allometry. 2. The ducks were fed ad libitum under 24-h lighting throughout the experiment. At weekly intervals from the time of hatch through 70 d of age, 16 ducks from each line were killed to determine body, carcase, breast-muscle, leg and thigh-muscle, and abdominal fat weights. 3. Line A was the heaviest line, followed by line B, line C and line D. However, body weight, carcase weight and breast-muscle weight at 49 d of age were not significantly different between lines A and B. After 7 generations of selection, the breast-muscle yield was increased to >19% and the abdominal fat percent was reduced to <1.4% in all lines. 4. The Weibull growth curve analysis of body weight showed an increase in the asymptotes during selection, while the age of the inflection point remained constant in all lines (21.3 to 26.0 d). For breast-muscle growth, ducks reached the inflection point 12.8 to 14.3 d later than for body weight. Between line A and line B, asymptotes for body weight, asymptotes for breast-muscle weight and allometric growth coefficients of breast muscle and leg and thigh muscles from 14 to 49 d were not significantly different. 5. The relative growth rate model discriminated body and breast-muscle growth patterns of line A and line B. The initial decline in the relative body growth rate was less and the time to reach the transition was longer in line A than line B. On the other hand, the initial decline in the relative breast-muscle growth rate was greater in line A than line B. PMID:11811908

  7. Correlation-Based Image Reconstruction Methods for Magnetic Particle Imaging

    NASA Astrophysics Data System (ADS)

    Ishihara, Yasutoshi; Kuwabara, Tsuyoshi; Honma, Takumi; Nakagawa, Yohei

    Magnetic particle imaging (MPI), in which the nonlinear interaction between internally administered magnetic nanoparticles (MNPs) and electromagnetic waves irradiated from outside of the body is utilized, has attracted attention for its potential to achieve early diagnosis of diseases such as cancer. In MPI, the local magnetic field distribution is scanned, and the magnetization signal from MNPs within a selected region is detected. However, the signal sensitivity and image resolution are degraded by interference from magnetization signals generated by MNPs outside of the selected region, mainly because of imperfections (limited gradients) in the local magnetic field distribution. Here, we propose new methods based on correlation information between the observed signal and the system function—defined as the interaction between the magnetic field distribution and the magnetizing properties of MNPs. We performed numerical analyses and found that, although the images were somewhat blurred, image artifacts could be significantly reduced and accurate images could be reconstructed without the inverse-matrix operation used in conventional image reconstruction methods.
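The correlation idea above can be sketched as a matched filter: score each voxel by the normalized inner product of the observed signal with that voxel's system function, instead of inverting the system matrix. This is only an illustrative reduction of the method, with invented variable names; the paper's actual formulation may differ.

```python
def correlation_reconstruction(signal, system_functions):
    """Correlation-based (matched-filter) reconstruction sketch for
    MPI: the image value at voxel k is the inner product of the
    observed time signal with voxel k's system function, normalized
    by the system function's energy. No inverse-matrix operation is
    required, at the cost of some blurring from non-orthogonal
    system functions."""
    image = []
    for sf in system_functions:
        num = sum(s * f for s, f in zip(signal, sf))
        den = sum(f * f for f in sf) ** 0.5 or 1.0  # guard all-zero rows
        image.append(num / den)
    return image

# Toy system: two voxels with orthogonal responses; the signal comes
# entirely from voxel 1, and the reconstruction peaks there.
img = correlation_reconstruction([0.0, 2.0], [[1.0, 0.0], [0.0, 2.0]])
```

With overlapping (non-orthogonal) system functions the off-peak voxels receive nonzero scores, which is the blurring the authors report and then suppress.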

  8. Applications of Parallel Processing in Configuration Analyses

    NASA Technical Reports Server (NTRS)

    Sundaram, Ppchuraman; Hager, James O.; Biedron, Robert T.

    1999-01-01

    The paper presents the recent progress made towards developing an efficient and user-friendly parallel environment for routine analysis of large CFD problems. The coarse-grain parallel version of the CFL3D Euler/Navier-Stokes analysis code, CFL3Dhp, has been ported onto most available parallel platforms. The CFL3Dhp solution accuracy on these parallel platforms has been verified against the CFL3D sequential analyses. User-friendly pre- and post-processing tools that enable a seamless transition from sequential to parallel processing have been written. A static load-balancing tool for CFL3Dhp analysis has also been implemented to achieve good parallel efficiency. For large problems, load-balancing efficiency as high as 95% can be achieved even when a large number of processors is used. Linear scalability of the CFL3Dhp code with an increasing number of processors has also been demonstrated using a large installed transonic nozzle boattail analysis. To highlight the fast turn-around time of parallel processing, the Navier-Stokes drag polar of the TCA full configuration in sideslip at supersonic cruise has been obtained in a day. CFL3Dhp is currently being used as a production analysis tool.

  9. DNA microarray analyses in higher plants.

    PubMed

    Galbraith, David W

    2006-01-01

    DNA microarrays were originally devised and described as a convenient technology for the global analysis of plant gene expression. Over the past decade, their use has expanded enormously to cover all kingdoms of living organisms. At the same time, the scope of applications of microarrays has increased beyond expression analyses, with plant genomics playing a leadership role in the on-going development of this technology. As the field has matured, the rate-limiting step has moved from that of the technical process of data generation to that of data analysis. We currently face major problems in dealing with the accumulating datasets, not simply with respect to how to archive, access, and process the huge amounts of data that have been and are being produced, but also in determining the relative quality of the different datasets. A major recognized concern is the appropriate use of statistical design in microarray experiments, without which the datasets are rendered useless. A vigorous area of current research involves the development of novel statistical tools specifically for microarray experiments. This article describes, in a necessarily selective manner, the types of platforms currently employed in microarray research and provides an overview of recent activities using these platforms in plant biology.

  10. Phylogenomic Analyses Support Traditional Relationships within Cnidaria

    PubMed Central

    Zapata, Felipe; Goetz, Freya E.; Smith, Stephen A.; Howison, Mark; Siebert, Stefan; Church, Samuel H.; Sanders, Steven M.; Ames, Cheryl Lewis; McFadden, Catherine S.; France, Scott C.; Daly, Marymegan; Collins, Allen G.; Haddock, Steven H. D.; Dunn, Casey W.; Cartwright, Paulyn

    2015-01-01

    Cnidaria, the sister group to Bilateria, is a highly diverse group of animals in terms of morphology, lifecycles, ecology, and development. How this diversity originated and evolved is not well understood because phylogenetic relationships among major cnidarian lineages are unclear, and recent studies present contrasting phylogenetic hypotheses. Here, we use transcriptome data from 15 newly-sequenced species in combination with 26 publicly available genomes and transcriptomes to assess phylogenetic relationships among major cnidarian lineages. Phylogenetic analyses using different partition schemes and models of molecular evolution, as well as topology tests for alternative phylogenetic relationships, support the monophyly of Medusozoa, Anthozoa, Octocorallia, Hydrozoa, and a clade consisting of Staurozoa, Cubozoa, and Scyphozoa. Support for the monophyly of Hexacorallia is weak due to the equivocal position of Ceriantharia. Taken together, these results further resolve deep cnidarian relationships, largely support traditional phylogenetic views on relationships, and provide a historical framework for studying the evolutionary processes involved in one of the most ancient animal radiations. PMID:26465609

  11. Analyse de plomb dans les peintures

    NASA Astrophysics Data System (ADS)

    Broll, N.; Frezouls, J.-M.

    2002-07-01

The analysis of lead in paints has traditionally been applied to characterize pigments during their manufacture and to identify works of art; in this way, the analysis can establish the century in which a painting was made. More recently, the technique has also been used to determine the toxicity of lead-based paints in buildings. This paper compares the performance of several X-ray fluorescence spectrometers: laboratory instruments using either wavelength-dispersive or energy-dispersive detection (with an X-ray tube), and portable equipment using either a radioactive source or an X-ray microtube.

  12. Comparative sequence analyses of sixteen reptilian paramyxoviruses

    USGS Publications Warehouse

    Ahne, W.; Batts, W.N.; Kurath, G.; Winton, J.R.

    1999-01-01

    Viral genomic RNA of Fer-de-Lance virus (FDLV), a paramyxovirus highly pathogenic for reptiles, was reverse transcribed and cloned. Plasmids with significant sequence similarities to the hemagglutinin-neuraminidase (HN) and polymerase (L) genes of mammalian paramyxoviruses were identified by BLAST search. Partial sequences of the FDLV genes were used to design primers for amplification by nested polymerase chain reaction (PCR) and sequencing of 518-bp L gene and 352-bp HN gene fragments from a collection of 15 previously uncharacterized reptilian paramyxoviruses. Phylogenetic analyses of the partial L and HN sequences produced similar trees in which there were two distinct subgroups of isolates that were supported with maximum bootstrap values, and several intermediate isolates. Within each subgroup the nucleotide divergence values were less than 2.5%, while the divergence between the two subgroups was 20-22%. This indicated that the two subgroups represent distinct virus species containing multiple virus strains. The five intermediate isolates had nucleotide divergence values of 11-20% and may represent additional distinct species. In addition to establishing diversity among reptilian paramyxoviruses, the phylogenetic groupings showed some correlation with geographic location, and clearly demonstrated a low level of host species-specificity within these viruses. Copyright (C) 1999 Elsevier Science B.V.
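The within-subgroup (<2.5%) and between-subgroup (20-22%) divergence values quoted above are pairwise nucleotide distances; a minimal sketch of the uncorrected p-distance (with invented sequence fragments, not the actual L-gene data) is:

```python
# Uncorrected pairwise nucleotide divergence (p-distance): the fraction of
# aligned sites at which two sequences differ.
def p_distance(a, b):
    assert len(a) == len(b), "sequences must be aligned to equal length"
    diffs = sum(1 for x, y in zip(a, b) if x != y)
    return diffs / len(a)

seq1 = "ATGCGATACGCTTAGGCTAA"            # invented 20-bp fragment
seq2 = "ATGCGATACGCTTAGGCTTG"            # 2 substitutions  -> 10% divergence
seq3 = "ATGAGTTACGCTTTGGCAAA"            # 4 substitutions  -> 20% divergence

d_within = p_distance(seq1, seq2)
d_between = p_distance(seq1, seq3)
```

In practice distances on this scale would be computed over the 518-bp and 352-bp fragments and fed into the phylogenetic analyses; values near 20% mark distinct species-level subgroups in the study.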

  13. Isolation and Analyses of Axonal Ribonucleoprotein Complexes

    PubMed Central

    Doron-Mandel, Ella; Alber, Stefanie; Oses, Juan A.; Medzihradszky, Katalin F.; Burlingame, Alma L.; Fainzilber, Mike; Twiss, Jeffery L.; Lee, Seung Joon

    2016-01-01

Cytoskeleton-dependent RNA transport and local translation in axons are gaining increased attention as key processes in the maintenance and functioning of neurons. Specific axonal transcripts have been found to play roles in many aspects of axonal physiology including axon guidance, axon survival, axon to soma communication, injury response and regeneration. This axonal transcriptome requires long-range transport that is achieved by motor proteins carrying transcripts as messenger ribonucleoprotein (mRNP) complexes along microtubules. Other than transport, the mRNP complex plays a major role in the generation, maintenance and regulation of the axonal transcriptome. Identification of axonal RNA binding proteins (RBPs) and analyses of the dynamics of their mRNPs are of high interest to the field. Here we describe methods for the study of interactions between RNA and proteins in axons. First, we describe a protocol for identifying binding proteins for an RNA of interest by using RNA affinity chromatography. Subsequently, we discuss immunoprecipitation (IP) methods allowing the dissection of protein-RNA and protein-protein interactions in mRNPs under various physiological conditions. PMID:26794529

  14. CFD analyses of coolant channel flowfields

    NASA Technical Reports Server (NTRS)

    Yagley, Jennifer A.; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    The flowfield characteristics in rocket engine coolant channels are analyzed by means of a numerical model. The channels are characterized by large length to diameter ratios, high Reynolds numbers, and asymmetrical heating. At representative flow conditions, the channel length is approximately twice the hydraulic entrance length so that fully developed conditions would be reached for a constant property fluid. For the supercritical hydrogen that is used as the coolant, the strong property variations create significant secondary flows in the cross-plane which have a major influence on the flow and the resulting heat transfer. Comparison of constant and variable property solutions show substantial differences. In addition, the property variations prevent fully developed flow. The density variation accelerates the fluid in the channels increasing the pressure drop without an accompanying increase in heat flux. Analyses of the inlet configuration suggest that side entry from a manifold can affect the development of the velocity profile because of vortices generated as the flow enters the channel. Current work is focused on studying the effects of channel bifurcation on the flow field and the heat transfer characteristics.

  15. Trend Analyses of Nitrate in Danish Groundwater

    NASA Astrophysics Data System (ADS)

    Hansen, B.; Thorling, L.; Dalgaard, T.; Erlandsen, M.

    2012-04-01

This presentation assesses the long-term development in the oxic groundwater nitrate concentration and nitrogen (N) loss due to intensive farming in Denmark. Firstly, up to 20-year time-series from the national groundwater monitoring network enable a statistically systematic analysis of distribution, trends and trend reversals in the groundwater nitrate concentration. Secondly, knowledge about the N surplus in Danish agriculture since 1950 is used as an indicator of the potential loss of N. Thirdly, groundwater recharge CFC (chlorofluorocarbon) age determination allows linking of the first two datasets. The development in the nitrate concentration of oxic groundwater clearly mirrors the development in the national agricultural N surplus, and a corresponding trend reversal is found in groundwater. Regulation and technical improvements in intensive farming in Denmark have succeeded in decreasing the N surplus by 40% since the mid-1980s, while at the same time maintaining crop yields and increasing animal production, especially of pigs. Trend analyses prove that the youngest (0-15 years old) oxic groundwater shows more pronounced significant downward nitrate trends (44%) than the oldest (25-50 years old) oxic groundwater (9%). This amounts to clear evidence of the effect of reduced nitrate leaching on groundwater nitrate concentrations in Denmark. Is the Danish groundwater monitoring strategy optimal for the detection of nitrate trends? Will the nitrate concentrations in Danish groundwater continue to decrease, or are the present Danish nitrate concentration levels now acceptable under the Water Framework Directive?

  16. Evaluation of the Hitachi 717 analyser.

    PubMed

    Biosca, C; Antoja, F; Sierra, C; Douezi, H; Macià, M; Alsina, M J; Galimany, R

    1989-01-01

The selective multitest Boehringer Mannheim Hitachi 717 analyser was evaluated according to the guidelines of the Comisión de Instrumentación de la Sociedad Española de Química Clínica and the European Committee for Clinical Laboratory Standards. The evaluation was performed in two steps: examination of the analytical units and evaluation in routine operation. The evaluation of the analytical units included a photometric study: the inaccuracy is acceptable for 340 and 405 nm; the imprecision ranges from 0.12 to 0.95% at 340 nm and from 0.30 to 0.73% at 405 nm; the linearity shows some dispersion at low absorbance for NADH at 340 nm; the drift is negligible; the imprecision of the pipette delivery system increases when the sample pipette operates with 3 µl; the reagent pipette imprecision is acceptable; and the temperature control system is good. Under routine working conditions, seven determinations were studied: glucose, creatinine, iron, total protein, AST, ALP and calcium. The within-run imprecision (CV) ranged from 0.6% for total protein and AST to 6.9% for iron. The between-run imprecision ranged from 2.4% for glucose to 9.7% for iron. Some contamination was found in the carry-over study. The relative inaccuracy is good for all the constituents assayed.
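The within-run imprecision figures quoted in these evaluations are coefficients of variation over replicate measurements; a minimal sketch of the computation (replicate values invented for illustration) is:

```python
import statistics

# Within-run imprecision expressed as a coefficient of variation (CV%):
# the sample standard deviation of replicates divided by their mean.
def cv_percent(replicates):
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical glucose replicates from a single run, in mmol/l
glucose_replicates = [5.02, 5.05, 4.98, 5.01, 5.03, 4.99]
cv = cv_percent(glucose_replicates)
```

These invented replicates give a CV of roughly 0.5%, on the order of the best within-run figures reported (0.6% for total protein and AST); a poorly reproducible assay such as the iron determination here would show replicate scatter an order of magnitude larger.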

  17. Evaluation of the Olympus AU-510 analyser.

    PubMed

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comisión de Instrumentación de la Sociedad Española de Química Clínica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation under routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies between +0.5% and -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between 0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from very low absorbances for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carry-over in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.

  18. Biomechanical analyses of rising from a chair.

    PubMed

    Schultz, A B; Alexander, N B; Ashton-Miller, J A

    1992-12-01

Quantification of the biomechanical factors that underlie the inability to rise from a chair can help explain why this disability occurs and can aid in the design of chairs and of therapeutic intervention programs. Experimental data collected earlier from 17 young adult subjects and from two groups of elderly subjects, 23 healthy and 11 impaired, rising from a standard chair under controlled conditions were analyzed using a planar biomechanical model. The joint torque strength requirements and the location of the floor reaction force at liftoff from the seat in the different groups and under several conditions were calculated. Analyses were also made of how body configurations and the use of hand force affect these joint torques and reaction locations. In all three groups, the required torques at liftoff were modest compared to literature data on voluntary strengths. Among the three groups rising with the use of hands, at the time of liftoff from the seat, the impaired old subjects on average placed the reaction force the most anterior within the foot support area, the healthy old subjects placed it intermediately, and the young subjects placed it the least anterior. Moreover, the results suggest that, at liftoff, all subjects placed more importance on locating the floor reaction force to achieve acceptable postural stability than on diminishing the magnitudes of the needed joint muscle strengths.

  19. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    SciTech Connect

    S. Tsai

    2005-01-12

Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling Facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  20. Informative prior distributions for ELISA analyses.

    PubMed

    Klauenberg, Katy; Walzel, Monika; Ebert, Bernd; Elster, Clemens

    2015-07-01

    Immunoassays are capable of measuring very small concentrations of substances in solutions and have an immense range of application. Enzyme-linked immunosorbent assay (ELISA) tests in particular can detect the presence of an infection, of drugs, or hormones (as in the home pregnancy test). Inference of an unknown concentration via ELISA usually involves a non-linear heteroscedastic regression and subsequent prediction, which can be carried out in a Bayesian framework. For such a Bayesian inference, we are developing informative prior distributions based on extensive historical ELISA tests as well as theoretical considerations. One consideration regards the quality of the immunoassay leading to two practical requirements for the applicability of the priors. Simulations show that the additional prior information can lead to inferences which are robust to reasonable perturbations of the model and changes in the design of the data. On real data, the applicability is demonstrated across different laboratories, for different analytes and laboratory equipment as well as for previous and current ELISAs with sigmoid regression function. Consistency checks on real data (similar to cross-validation) underpin the adequacy of the suggested priors. Altogether, the new priors may improve concentration estimation for ELISAs that fulfill certain design conditions, by extending the range of the analyses, decreasing the uncertainty, or giving more robust estimates. Future use of these priors is straightforward because explicit, closed-form expressions are provided. This work encourages development and application of informative, yet general, prior distributions for other types of immunoassays.

  1. Verification against perturbed analyses and observations

    NASA Astrophysics Data System (ADS)

    Bowler, N. E.; Cullen, M. J. P.; Piccolo, C.

    2015-07-01

It has long been known that verification of a forecast against the sequence of analyses used to produce those forecasts can under-estimate the magnitude of forecast errors. Here we show that under certain conditions the verification of a short-range forecast against a perturbed analysis coming from an ensemble data assimilation scheme can give the same root-mean-square error as verification against the truth. This means that a perturbed analysis can be used as a reliable proxy for the truth. However, the conditions required for this result to hold are rather restrictive: the analysis must be optimal, the ensemble spread must be equal to the error in the mean, the ensemble size must be large and the forecast being verified must be the background forecast used in the data assimilation. Although these criteria are unlikely to be met exactly, it becomes clear that for most cases verification against a perturbed analysis gives better results than verification against an unperturbed analysis. We demonstrate the application of these results in an idealised model framework and a numerical weather prediction context. In deriving this result we recall that an optimal (Kalman) analysis is one for which the analysis increments are uncorrelated with the analysis errors.
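The central result can be illustrated with a scalar toy model (all variances invented): when increments are uncorrelated with analysis errors and the perturbation spread equals the analysis error, verification of the background against the perturbed analysis recovers the RMSE against truth, while the unperturbed analysis under-estimates it.

```python
import numpy as np

# Scalar synthetic demonstration. Truth is 0; the analysis has error e_a;
# the background forecast differs from the analysis by the increment, which
# optimality makes independent of e_a.
rng = np.random.default_rng(0)
n = 200_000
truth = 0.0
e_a = rng.normal(0.0, 1.0, n)            # analysis error, variance 1
incr = rng.normal(0.0, np.sqrt(2), n)    # analysis increment, independent of e_a
analysis = truth + e_a
forecast = analysis - incr               # background: analysis = forecast + increment
pert = rng.normal(0.0, 1.0, n)           # ensemble perturbation, spread = analysis error
perturbed_analysis = analysis + pert

def rmse(x, y):
    return float(np.sqrt(np.mean((x - y) ** 2)))

rmse_truth = rmse(forecast, truth)                    # ~ sqrt(3)
rmse_analysis = rmse(forecast, analysis)              # ~ sqrt(2): under-estimate
rmse_perturbed = rmse(forecast, perturbed_analysis)   # ~ sqrt(3): matches truth
```

The independent perturbation adds back exactly the analysis-error variance that verification against the unperturbed analysis misses.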

  2. Genomic analyses of the CAM plant pineapple.

    PubMed

    Zhang, Jisen; Liu, Juan; Ming, Ray

    2014-07-01

The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low-CO2 conditions is a remarkable case of adaptation in flowering plants. As the most important crop that utilizes CAM photosynthesis, pineapple has had its genetic and genomic resources developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes, with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotyping by sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expressed sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants, but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. This reference genome of pineapple provides the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars.

  3. Reproducibility of neuroimaging analyses across operating systems

    PubMed Central

    Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
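The Dice coefficients quoted above measure the overlap between the same segmentation produced on two operating systems; a minimal sketch (toy label arrays, not real FSL output) is:

```python
import numpy as np

# Dice coefficient for one label between two segmentations:
# 2|A ∩ B| / (|A| + |B|), where A and B are the voxel sets with that label.
def dice(seg_a, seg_b, label):
    a = seg_a == label
    b = seg_b == label
    return 2.0 * np.count_nonzero(a & b) / (np.count_nonzero(a) + np.count_nonzero(b))

# Hypothetical flattened label maps of one region from two OS builds
os1 = np.array([0, 1, 1, 1, 2, 2, 0, 1])
os2 = np.array([0, 1, 1, 2, 2, 2, 0, 1])
d = dice(os1, os2, label=1)
```

A value of 1.0 means the two builds produced identical masks for that structure; the study's observation that some subcortical classifications dropped to 0.59 corresponds to barely half of the voxels agreeing between operating systems.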

  4. Recent Advances in Cellular Glycomic Analyses

    PubMed Central

    Furukawa, Jun-ichi; Fujitani, Naoki; Shinohara, Yasuro

    2013-01-01

    A large variety of glycans is intricately located on the cell surface, and the overall profile (the glycome, given the entire repertoire of glycoconjugate-associated sugars in cells and tissues) is believed to be crucial for the diverse roles of glycans, which are mediated by specific interactions that control cell-cell adhesion, immune response, microbial pathogenesis and other cellular events. The glycomic profile also reflects cellular alterations, such as development, differentiation and cancerous change. A glycoconjugate-based approach would therefore be expected to streamline discovery of novel cellular biomarkers. Development of such an approach has proven challenging, due to the technical difficulties associated with the analysis of various types of cellular glycomes; however, recent progress in the development of analytical methodologies and strategies has begun to clarify the cellular glycomics of various classes of glycoconjugates. This review focuses on recent advances in the technical aspects of cellular glycomic analyses of major classes of glycoconjugates, including N- and O-linked glycans, derived from glycoproteins, proteoglycans and glycosphingolipids. Articles that unveil the glycomics of various biologically important cells, including embryonic and somatic stem cells, induced pluripotent stem (iPS) cells and cancer cells, are discussed. PMID:24970165

  5. Cyanide analyses for risk and treatability assessments

    SciTech Connect

    MacFarlane, I.D.; Elseroad, H.J.; Pergrin, D.E.; Logan, C.M.

    1994-12-31

Cyanide, an EPA priority pollutant and target analyte, is typically measured as total cyanide. However, cyanide complexation, information that is not acquired through total cyanide analysis, is often a driver of cyanide toxicity and treatability. A case study of a former manufactured gas plant (MGP) is used to demonstrate the usability of various cyanide analytical methods for risk and treatability assessments. Several analytical methods, including cyanide amenable to chlorination and weak acid dissociable cyanide, help test the degree of cyanide complexation. Generally, free or uncomplexed cyanide is more biologically available, toxic, and reactive than complexed cyanide. Extensive site testing has shown that free and weakly dissociable cyanide composes only a small fraction of total cyanide, as would be expected from the literature, and that risk assessment will be more realistic when cyanide form is considered. Likewise, aqueous treatment for cyanide can be properly tested if cyanide form is accounted for. Weak acid dissociable cyanide analyses proved to be the most reliable (and potentially acceptable) cyanide method, as well as representing the most toxic and reactive cyanide forms.

  6. Transportation systems analyses. Volume 1: Executive summary

    NASA Astrophysics Data System (ADS)

    1992-11-01

    The principal objective is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform crew delivery and return, cargo transfer, cargo delivery and return, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed, but not limited to, include: the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationship between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. Conceptual studies of transportation elements contribute to the systems approach by identifying elements (such as ETO node and transfer/excursion vehicles) needed in current and planned transportation systems. These studies are also a mechanism to integrate the results of relevant parallel studies.

  7. Bayesian network learning for natural hazard analyses

    NASA Astrophysics Data System (ADS)

    Vogel, K.; Riggelsen, C.; Korup, O.; Scherbaum, F.

    2014-09-01

    Modern natural hazards research requires dealing with several uncertainties that arise from limited process knowledge, measurement errors, censored and incomplete observations, and the intrinsic randomness of the governing processes. Nevertheless, deterministic analyses are still widely used in quantitative hazard assessments despite the pitfall of misestimating the hazard and any ensuing risks. In this paper we show that Bayesian networks offer a flexible framework for capturing and expressing a broad range of uncertainties encountered in natural hazard assessments. Although Bayesian networks are well studied in theory, their application to real-world data is far from straightforward, and requires specific tailoring and adaptation of existing algorithms. We offer suggestions as how to tackle frequently arising problems in this context and mainly concentrate on the handling of continuous variables, incomplete data sets, and the interaction of both. By way of three case studies from earthquake, flood, and landslide research, we demonstrate the method of data-driven Bayesian network learning, and showcase the flexibility, applicability, and benefits of this approach. Our results offer fresh and partly counterintuitive insights into well-studied multivariate problems of earthquake-induced ground motion prediction, accurate flood damage quantification, and spatially explicit landslide prediction at the regional scale. In particular, we highlight how Bayesian networks help to express information flow and independence assumptions between candidate predictors. Such knowledge is pivotal in providing scientists and decision makers with well-informed strategies for selecting adequate predictor variables for quantitative natural hazard assessments.
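The kind of probabilistic reasoning a Bayesian network supports can be sketched with a toy discrete network (structure and probabilities invented, far simpler than the learned networks in the case studies): rain → flood → damage, queried by marginalizing over the hidden state.

```python
# Toy three-node Bayesian network: rain -> flood -> damage.
# All probability tables are invented for illustration.
p_flood = {True: {True: 0.4, False: 0.6},     # p_flood[rain][flood]
           False: {True: 0.05, False: 0.95}}
p_damage = {True: {True: 0.8, False: 0.2},    # p_damage[flood][damage]
            False: {True: 0.1, False: 0.9}}

def p_damage_given_rain(rain):
    # Exact inference by enumeration: marginalize over the hidden flood state
    return sum(p_flood[rain][f] * p_damage[f][True] for f in (True, False))

risk_wet = p_damage_given_rain(True)
risk_dry = p_damage_given_rain(False)
```

Data-driven structure learning, as in the paper, would instead infer which arrows and tables best explain observations, including with continuous variables and incomplete records; the independence assumptions encoded by the arrows are exactly what the abstract highlights for predictor selection.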

  8. Characterization of branch complexity by fractal analyses

    USGS Publications Warehouse

    Alados, C.L.; Escos, J.; Emlen, J.M.; Freeman, D.C.

    1999-01-01

    The comparison between complexity in the sense of space occupancy (box-counting fractal dimension D(c) and information dimension D1) and heterogeneity in the sense of space distribution (average evenness index f and evenness variation coefficient J(cv)) were investigated in mathematical fractal objects and natural branch structures. In general, increased fractal dimension was paired with low heterogeneity. Comparisons between branch architecture in Anthyllis cytisoides under different slope exposure and grazing impact revealed that branches were more complex and more homogeneously distributed for plants on northern exposures than southern, while grazing had no impact during a wet year. Developmental instability was also investigated by the statistical noise of the allometric relation between internode length and node order. In conclusion, our study demonstrated that fractal dimension of branch structure can be used to analyze the structural organization of plants, especially if we consider not only fractal dimension but also shoot distribution within the canopy (lacunarity). These indexes together with developmental instability analyses are good indicators of growth responses to the environment.
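The box-counting fractal dimension D(c) used above can be sketched on a synthetic binary image (a Sierpinski carpet, not the paper's branch data): count occupied boxes N(s) at several box sizes s and fit log N(s) against log s.

```python
import numpy as np

# Build a Sierpinski carpet as a binary occupancy grid (exact fractal with
# known dimension log 8 / log 3, handy for checking the estimator).
def sierpinski_carpet(level):
    base = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=np.uint8)
    grid = np.array([[1]], dtype=np.uint8)
    for _ in range(level):
        grid = np.kron(grid, base)
    return grid.astype(bool)

def box_counting_dimension(img, sizes):
    counts = []
    for s in sizes:
        h, w = img.shape
        # Tile the image into s-by-s boxes and count boxes that contain
        # at least one occupied pixel
        boxes = img[: h - h % s, : w - w % s].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
    # N(s) ~ s^(-D): the fitted log-log slope is -D
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

img = sierpinski_carpet(4)                     # 81x81 grid
d_c = box_counting_dimension(img, sizes=[1, 3, 9, 27])
```

For a branch silhouette the same estimator is applied to a binarized photograph; the paper pairs it with lacunarity-style measures of how the occupied space is distributed.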

  9. Local spin analyses using density functional theory

    NASA Astrophysics Data System (ADS)

    Abate, Bayileyegn; Peralta, Juan

Local spin analysis is a valuable technique in computational investigations of magnetic interactions in mono- and polynuclear transition metal complexes, which play vital roles in catalysis, molecular magnetism, artificial photosynthesis, and several other commercially important materials. The relative size and complex electronic structure of transition metal complexes often prohibit the use of multi-determinant approaches, and hence practical calculations are often limited to single-determinant methods. Density functional theory (DFT) has become one of the most successful and widely used computational tools for the electronic structure study of complex chemical systems, transition metal complexes in particular. Within the DFT formalism, a more flexible and complete theoretical modeling of transition metal complexes can be achieved by considering noncollinear spins, in which the spin density is allowed to adopt noncollinear structures instead of being constrained to align parallel or antiparallel to a universal axis of magnetization. In this meeting, I will present local spin analysis results obtained using different DFT functionals. Local projection operators, first introduced by Clark and Davidson, are used to decompose the expectation value of the total spin operator.

  10. Electrophoretic and immunoblot analyses of Rhizopus arrhizus antigens.

    PubMed Central

    Wysong, D R; Waldorf, A R

    1987-01-01

    Four antigen preparations from Rhizopus arrhizus were made and analyzed by sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) and column chromatography. Electrophoretic analyses of these antigens indicated that there are 18 to 28 component bands with a molecular mass range of approximately 10,500 to 83,000 daltons. Seven of these bands appear to be components common to three antigen preparations. Several of the bands identified by SDS-PAGE were composed of glycoproteins or carbohydrates as determined by their affinity for concanavalin A. Western blots, using sera from five patients with mucormycosis, consistently identified five different determinants in the R. arrhizus antigens separated by SDS-PAGE. This suggests that several of the Rhizopus antigens are present during mucormycosis. Four of the antigenic determinants recognized by patient sera reacted with the concanavalin A-peroxidase stain, indicating that they are composed of glycoproteins or carbohydrate. Enzyme-linked immunosorbent assays of sera from five patients with mucormycosis and with rabbit antisera resulted in antibody titers ranging from 1:64 to 1:32,000 for the R. arrhizus antigens. PMID:3546367

  11. Seismic Soil-Structure Interaction Analyses of a Deeply Embedded Model Reactor – SASSI Analyses

    SciTech Connect

    Nie J.; Braverman J.; Costantino, M.

    2013-10-31

    This report summarizes the SASSI analyses of a deeply embedded reactor model performed by BNL and CJC and Associates, as part of the seismic soil-structure interaction (SSI) simulation capability project for the NEAMS (Nuclear Energy Advanced Modeling and Simulation) Program of the Department of Energy. The SASSI analyses included three cases: 0.2 g, 0.5 g, and 0.9 g, all of which refer to nominal peak accelerations at the top of the bedrock. The analyses utilized the modified subtraction method (MSM) for performing the seismic SSI evaluations. Each case consisted of two analyses: input motion in one horizontal direction (X) and input motion in the vertical direction (Z), both of which utilized the same in-column input motion. Besides providing SASSI results for use in comparison with the time domain SSI results obtained using the DIABLO computer code, this study also leads to the recognition that the frequency-domain method should be modernized so that it can better serve its mission-critical role for analysis and design of nuclear power plants.

  12. High-Resolution Views of Io's Emakong Patera: Latest Galileo Imaging Results

    NASA Technical Reports Server (NTRS)

    Williams, D. A.; Keszthelyi, L. P.; Davies, A. G.; Greeley, R.; Head, J. W., III

    2002-01-01

    This presentation will discuss analyses of the latest Galileo SSI (solid state imaging) high-resolution images of the Emakong lava channels and flow field on Jupiter's moon Io. Additional information is contained in the original extended abstract.

  13. Imaging bolometer

    DOEpatents

    Wurden, G.A.

    1999-01-19

    Radiation-hard, steady-state imaging bolometer is disclosed. A bolometer employing infrared (IR) imaging of a segmented-matrix absorber of plasma radiation in a cooled-pinhole camera geometry is described. The bolometer design parameters are determined by modeling the temperature of the foils from which the absorbing matrix is fabricated by using a two-dimensional time-dependent solution of the heat conduction equation. The resulting design will give a steady-state bolometry capability, with approximately 100 Hz time resolution, while simultaneously providing hundreds of channels of spatial information. No wiring harnesses will be required, as the temperature-rise data will be measured via an IR camera. The resulting spatial data may be used to tomographically investigate the profile of plasmas. 2 figs.
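The design procedure above rests on a two-dimensional time-dependent solution of the heat conduction equation for the absorber foils. A minimal explicit finite-difference (FTCS) sketch of that kind of calculation, with illustrative grid, material, and heating parameters rather than the patent's actual design values:

```python
import numpy as np

def foil_temperature(nx=41, ny=41, dx=1e-4, alpha=1e-5, q=1e3, steps=200):
    """Explicit FTCS integration of dT/dt = alpha * laplacian(T) + q on a
    square foil. The edges are held at T = 0 (the cooled frame); q is a
    uniform volumetric heating term standing in for absorbed radiation.
    All parameter values here are illustrative assumptions."""
    dt = 0.2 * dx**2 / alpha  # safely inside the FTCS stability limit dx^2/(4*alpha)
    T = np.zeros((ny, nx))
    for _ in range(steps):
        # Five-point Laplacian; the wrapped edge rows/columns from np.roll
        # are overwritten by the Dirichlet boundary reset below.
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0)
               + np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx**2
        T = T + dt * (alpha * lap + q)
        T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 0.0  # cooled boundary
    return T

T = foil_temperature()
# By symmetry the hottest point sits at the foil centre, away from the cooled edges.
```

In the actual design one would iterate this model over foil thickness and segment size until the steady-state temperature rise matches the IR camera's sensitivity and the ~100 Hz time resolution target.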

  14. Imaging bolometer

    SciTech Connect

    Wurden, Glen A.

    1999-01-01

    Radiation-hard, steady-state imaging bolometer. A bolometer employing infrared (IR) imaging of a segmented-matrix absorber of plasma radiation in a cooled-pinhole camera geometry is described. The bolometer design parameters are determined by modeling the temperature of the foils from which the absorbing matrix is fabricated by using a two-dimensional time-dependent solution of the heat conduction equation. The resulting design will give a steady-state bolometry capability, with approximately 100 Hz time resolution, while simultaneously providing hundreds of channels of spatial information. No wiring harnesses will be required, as the temperature-rise data will be measured via an IR camera. The resulting spatial data may be used to tomographically investigate the profile of plasmas.

  15. Attosecond imaging.

    PubMed

    Vrakking, Marc J J

    2014-02-21

    The natural timescale for electron dynamics reaches down to the attosecond domain. Following the discovery of attosecond laser pulses, about a decade ago, attosecond science has developed into a vibrant, new research field, where the motion of single or multiple electrons and, in molecules, the coupling of electronic and nuclear motion, can be investigated, on attosecond to few-femtosecond timescales. Attosecond experiments require suitable observables. This review describes how "attosecond imaging", basing itself on kinetic energy and angle-resolved detection of photoelectrons and fragment ions using a velocity map imaging (VMI) spectrometer, has been exploited in a number of pump-probe experiments. The use of a VMI spectrometer in attosecond experiments has allowed the characterization of attosecond pulse trains and isolated attosecond pulses, the elucidation of continuum electron dynamics and wave packet interferometry in atomic photoionization and the observation of electron localization in dissociative molecular photoionization. PMID:24398785

  16. Brain imaging

    SciTech Connect

    Bradshaw, J.R.

    1989-01-01

    This book presents a survey of the various imaging tools with examples of the different diseases shown best with each modality. It includes 100 case presentations covering the gamut of brain diseases. These examples are grouped according to the clinical presentation of the patient: headache, acute headache, sudden unilateral weakness, unilateral weakness of gradual onset, speech disorders, seizures, pituitary and parasellar lesions, sensory disorders, posterior fossa and cranial nerve disorders, dementia, and congenital lesions.

  17. Cosmic microwave background images

    NASA Astrophysics Data System (ADS)

    Herranz, D.; Vielva, P.

    2010-01-01

    Cosmology concerns itself with the fundamental questions about the formation, structure, and evolution of the Universe as a whole. Cosmic microwave background (CMB) radiation is one of the foremost pillars of physical cosmology. Joint analyses of CMB and other astronomical observations are able to determine with ever increasing precision the values of the fundamental cosmological parameters and to provide us with valuable insight about the dynamics of the Universe in evolution. The CMB radiation is a relic of the hot and dense first moments of the Universe: an extraordinarily homogeneous and isotropic blackbody radiation, which shows small temperature anisotropies that are the key for understanding the conditions of the primitive Universe, testing cosmological models and probing fundamental physics at the very dawn of time. CMB observations are obtained by imaging of the sky at microwave wavelengths. However, the CMB signal is mixed with other astrophysical signals of both Galactic and extragalactic origin. To properly exploit the cosmological information contained in CMB images, they must first be cleansed of these other astrophysical emissions. Blind source separation (BSS) has been a very active field in the last few years. Conversely, the term "compact sources" is often used in the CMB literature to refer to spatially bounded, small features in the images, such as galaxies and galaxy clusters. Compact sources and diffuse sources are usually treated separately in CMB image processing. We devote this tutorial to the case of compact sources. Many of the compact source-detection techniques that are widespread in most fields of astronomy are not easily applicable to CMB images. In this tutorial, we present an overview of the fundamentals of compact object detection theory keeping in mind at every moment these particularities. Throughout the article, we briefly consider Bayesian object detection, model selection, optimal linear filtering, nonlinear filtering, and
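The optimal linear filtering mentioned in the tutorial reduces, in the special case of white noise, to a matched filter: correlating the map with the (normalized) source profile maximizes the signal-to-noise at the source position. A one-dimensional toy sketch, with beam width, amplitude, and positions chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D toy "sky": unit white noise plus one Gaussian-profile compact source.
n, sigma_beam, amp, pos = 512, 3.0, 5.0, 200
x = np.arange(n)
sky = rng.normal(0.0, 1.0, n) + amp * np.exp(-0.5 * ((x - pos) / sigma_beam) ** 2)

# Matched filter for white noise: correlate the map with the beam profile.
half = 16
kx = np.arange(-half, half + 1)
template = np.exp(-0.5 * (kx / sigma_beam) ** 2)
template /= np.sqrt(np.sum(template ** 2))      # unit-norm template
filtered = np.correlate(sky, template, mode="same")
est = int(np.argmax(filtered))                   # recovered source position
```

For CMB maps the noise is correlated (CMB plus foregrounds), so the practical filters discussed in the article divide by the noise power spectrum in Fourier space rather than using the raw profile.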

  18. Multispectral Imaging Broadens Cellular Analysis

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Amnis Corporation, a Seattle-based biotechnology company, developed ImageStream to produce sensitive fluorescence images of cells in flow. The company responded to an SBIR solicitation from Ames Research Center, and proposed to evaluate several methods of extending the depth of field for its ImageStream system and implement the best as an upgrade to its commercial products. This would allow users to view whole cells at the same time, rather than just one section of each cell. Through Phase I and II SBIR contracts, Ames provided Amnis the funding the company needed to develop this extended functionality. For NASA, the resulting high-speed image flow cytometry process made its way into Medusa, a life-detection instrument built to collect, store, and analyze sample organisms from erupting hydrothermal vents, and has the potential to benefit space flight health monitoring. On the commercial end, Amnis has implemented the process in ImageStream, combining high-resolution microscopy and flow cytometry in a single instrument, giving researchers the power to conduct quantitative analyses of individual cells and cell populations at the same time, in the same experiment. ImageStream is also built for many other applications, including cell signaling and pathway analysis; classification and characterization of peripheral blood mononuclear cell populations; quantitative morphology; apoptosis (cell death) assays; gene expression analysis; analysis of cell conjugates; molecular distribution; and receptor mapping and distribution.

  19. Imaging stress.

    PubMed

    Brielle, Shlomi; Gura, Rotem; Kaganovich, Daniel

    2015-11-01

    Recent innovations in cell biology and imaging approaches are changing the way we study cellular stress, protein misfolding, and aggregation. Studies have begun to show that stress responses are even more variegated and dynamic than previously thought, encompassing nano-scale reorganization of cytosolic machinery that occurs almost instantaneously, much faster than transcriptional responses. Moreover, protein and mRNA quality control is often organized into highly dynamic macromolecular assemblies, or dynamic droplets, which could easily be mistaken for dysfunctional "aggregates," but which are, in fact, regulated functional compartments. The nano-scale architecture of stress responses ranges from diffraction-limited structures like stress granules, P-bodies, and stress foci to slightly larger quality control inclusions like the juxtanuclear quality control compartment (JUNQ) and the insoluble protein deposit compartment (IPOD), as well as others. Examining the biochemical and physical properties of these dynamic structures necessitates live cell imaging at high spatial and temporal resolution, and techniques to make quantitative measurements with respect to movement, localization, and mobility. Hence, it is important to note some of the most recent observations, while casting an eye towards new imaging approaches that offer the possibility of collecting entirely new kinds of data from living cells.

  20. Imaging Borrelly

    USGS Publications Warehouse

    Soderblom, L.A.; Boice, D.C.; Britt, D.T.; Brown, R.H.; Buratti, B.J.; Kirk, R.L.; Lee, M.; Nelson, R.M.; Oberst, J.; Sandel, B.R.; Stern, S.A.; Thomas, N.; Yelle, R.V.

    2004-01-01

    The nucleus, coma, and dust jets of short-period Comet 19P/Borrelly were imaged from the Deep Space 1 spacecraft during its close flyby in September 2001. A prominent jet dominated the near-nucleus coma and emanated roughly normal to the long axis of the nucleus from a broad central cavity. We show it to have remained fixed in position for more than 34 hr, much longer than the 26-hr rotation period. This confirms earlier suggestions that it is co-aligned with the rotation axis. From a combination of fitting the nucleus light curve from approach images and the nucleus' orientation from stereo images at encounter, we conclude that the sense of rotation is right-handed around the main jet vector. The inferred rotation pole is approximately perpendicular to the long axis of the nucleus, consistent with a simple rotational state. Lacking an existing IAU comet-specific convention but applying a convention provisionally adopted for asteroids, we label this the north pole. This places the sub-solar latitude at ~60° N at the time of perihelion, with the north pole in constant sunlight and thus receiving maximum average insolation. © 2003 Elsevier Inc. All rights reserved.

  1. Image structure restoration from sputnik with multi-matrix scanners

    NASA Astrophysics Data System (ADS)

    Eremeev, V.; Kuznetcov, A.; Myatov, G.; Presnyakov, Oleg; Poshekhonov, V.; Svetelkin, P.

    2014-10-01

    The paper is devoted to the formation of Earth-surface images by multi-matrix scanning cameras. Forming continuous, spatially registered imagery requires consistent solutions for radiometric scan correction, stitching, and geo-referencing of multispectral images. A radiometric scan correction algorithm based on statistical analysis of the input images is described, along with an algorithm for sub-pixel stitching of the scans into one continuous image, as if it had been formed by a single virtual scanner. The paper also presents algorithms for geometrically combining multispectral images acquired at different times, and examples illustrating the effectiveness of the suggested processing algorithms.
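One common statistical approach to radiometric scan correction is to equalize the per-detector (here, per-column) statistics of the raw scan, on the assumption that every detector sees, on average, similar scene statistics. The sketch below is a generic illustration of that idea, not the paper's algorithm; the function name and the synthetic striping model are invented for the example:

```python
import numpy as np

def equalize_detector_gains(img):
    """Generic statistical destriping sketch: rescale each detector column so
    its mean and standard deviation match the global image statistics.
    Assumes columns image statistically similar scene content on average."""
    img = img.astype(float)
    col_mean = img.mean(axis=0)
    col_std = img.std(axis=0)
    col_std[col_std == 0] = 1.0          # guard against constant columns
    return (img - col_mean) / col_std * img.std() + img.mean()

# Synthetic scan with per-column gain/offset striping.
rng = np.random.default_rng(1)
scene = rng.normal(100.0, 10.0, (200, 64))      # "true" radiance
gains = rng.normal(1.0, 0.2, 64)                # per-detector gain errors
offsets = rng.normal(0.0, 5.0, 64)              # per-detector offsets
striped = scene * gains + offsets
fixed = equalize_detector_gains(striped)        # columns now share statistics
```

After correction every column shares the same mean and spread, removing the visible stripe pattern; a production algorithm would additionally preserve absolute calibration rather than normalizing to the image's own statistics.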

  2. Fracturing and brittleness index analyses of shales

    NASA Astrophysics Data System (ADS)

    Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje

    2016-04-01

    The formation of a fracture network in rocks has a crucial control on the flow behaviour of fluids. In addition, an existing network of fractures influences the propagation of new fractures during e.g. hydraulic fracturing or during a seismic event. Understanding the type and characteristics of the fracture network that will be formed during e.g. hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job. For this, knowledge of the rock properties is crucial. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock, e.g. for hydraulic fracturing of shales. Various definitions of the brittleness index (BI1, BI2 and BI3) exist, based on mineralogy, elastic constants and stress-strain behaviour (Jin et al., 2014; Jarvie et al., 2007; Holt et al., 2011). A maximum brittleness index of 1 predicts very efficient fracturing behaviour, while a minimum brittleness index of 0 predicts much more ductile shale behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and have determined the three different brittleness indices for each sample. We show that the three brittleness indices differ substantially for the same sample, and as such the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness indices based on the acoustic data (BI1) all lie around 0.5, the indices based on the stress-strain data (BI2) average around 0.75, whereas the mineralogy-based indices (BI3) predict values below 0.2. This shows that different estimates of the brittleness index can lead to different decisions for hydraulic fracturing. If we were to rely on the mineralogy (BI3), the Whitby mudstone is not a suitable
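Of the three definitions, the mineralogy-based index is the simplest to illustrate. In the spirit of Jarvie et al. (2007) it is the weight fraction of brittle quartz relative to quartz + carbonate + clay; the mineral fractions below are illustrative of a clay-rich mudstone, not the measured Whitby composition:

```python
def brittleness_mineralogy(quartz, carbonate, clay):
    """Mineralogy-based brittleness index in the spirit of Jarvie et al. (2007):
    fraction of brittle quartz over total quartz + carbonate + clay.
    Returns a value between 0 (ductile) and 1 (brittle)."""
    return quartz / (quartz + carbonate + clay)

# Illustrative clay-rich mudstone: low quartz fraction gives a low (ductile) index.
bi3 = brittleness_mineralogy(quartz=0.15, carbonate=0.10, clay=0.75)
```

A clay-dominated composition like this yields an index below 0.2, consistent with the low BI3 values reported in the abstract, whereas the acoustic and stress-strain definitions weight the same rock very differently.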

  3. Imaging and radiology

    MedlinePlus

    ... imaging such as a PET scan; ultrasound. INTERVENTIONAL RADIOLOGY: Interventional radiologists are doctors who use imaging such as CT, ultrasound, MRI and fluoroscopy to help guide procedures. The imaging ...

  4. Phylogenomic analyses and improved resolution of Cetartiodactyla.

    PubMed

    Zhou, Xuming; Xu, Shixia; Yang, Yunxia; Zhou, Kaiya; Yang, Guang

    2011-11-01

    The remarkable antiquity, diversity, and ecological and evolutionary significance of Cetartiodactyla have inspired numerous attempts to resolve their phylogenetic relationships. However, previous analyses based on limited samples of nuclear genes or mitochondrial DNA sequences have generated results that were either inconsistent with one another, weakly supported, or highly sensitive to analytical conditions. Here, we present strongly supported results based upon over 1.4 Mb of an aligned DNA sequence matrix from 110 single-copy nuclear protein-coding genes of 21 Cetartiodactyla species, which represent the major cetartiodactylan lineages, and three species of Perissodactyla and Carnivora as outgroups. Phylogenetic analysis of this newly developed genomic sequence data using a codon-based model and recently developed models of rate autocorrelation resolved the relationships among and within the major cetartiodactylan lineages with a high degree of confidence. Cetacea was found to nest within Artiodactyla as the sister group of Hippopotamidae, and Tylopoda was corroborated as the basal clade of Cetartiodactyla. Within Cetacea, the monophyletic status of Odontoceti relative to Mysticeti, the basal position of Physeteroidea within Odontoceti, the non-monophyly of the river dolphins, and the sister relationship between Delphinidae and Monodontidae+Phocoenidae were strongly supported. In particular, the genera Tursiops (bottlenose dolphins) and Stenella (spotted dolphins) were shown to be unnatural groups. Additionally, a very narrow time frame of ∼3 My (million years) was found for the rapid diversification of delphinids in the late Miocene, which made it difficult to resolve the phylogenetic relationships within the Delphinidae, especially for previous studies with limited data sets. The present study provides a statistically well-supported phylogenetic framework of Cetartiodactyla, which represents an important step toward ending some of

  5. Finite Element analyses of soil bioengineered slopes

    NASA Astrophysics Data System (ADS)

    Tamagnini, Roberto; Switala, Barbara Maria; Sudan Acharya, Madhu; Wu, Wei; Graf, Frank; Auer, Michael; te Kamp, Lothar

    2014-05-01

    Soil bioengineering methods are not only effective from an economic point of view, but are also interesting as fully ecological solutions. The presented project aims to define a numerical model that includes the impact of vegetation on slope stability, considering both mechanical and hydrological effects. In this project, a constitutive model has been developed that accounts for the multi-phase nature of the soil, namely the partly saturated condition, and also includes the effects of a biological component. The constitutive equation is implemented in the Finite Element (FE) software Comes-Geo with an implicit integration scheme that accounts for the collapse of the soil structure due to wetting. The mathematical formulation of the constitutive equations is introduced by means of thermodynamics and simulates the growth of the biological system over time. The numerical code is then applied to the analysis of an idealized rainfall-induced landslide. The slope is analyzed for vegetated and non-vegetated conditions. The final results allow the impact of vegetation on slope stability to be assessed quantitatively. This supports drawing conclusions and choosing whether it is worthwhile to use soil bioengineering methods for slope stabilization instead of traditional approaches. The application of FE methods shows some advantages with respect to the commonly used limit equilibrium analyses, because it can account for the real coupled strain-diffusion nature of the problem. The mechanical strength of roots is in fact influenced by the stress evolution in the slope. Moreover, the FE method does not need a pre-definition of any failure surface. The FE method can also be used to monitor the progressive failure of the soil bioengineered system, as it calculates the displacements and strains of the model slope. The preliminary results show that the formulated equations can be useful for analysis and evaluation of different soil bio

  6. Computational Analyses of Pressurization in Cryogenic Tanks

    NASA Technical Reports Server (NTRS)

    Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chun P.; Field, Robert E.; Ryan, Harry

    2010-01-01

    A comprehensive numerical framework utilizing multi-element unstructured CFD and rigorous real fluid property routines has been developed to carry out analyses of propellant tank and delivery systems at NASA SSC. Traditionally, CFD modeling of pressurization and mixing in cryogenic tanks has been difficult, primarily because the fluids in the tank co-exist in different sub-critical and supercritical states with largely varying properties that have to be accurately accounted for in order to predict the correct mixing and phase change between the ullage and the propellant. For example, during tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant, including heat transfer and phase change effects, and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. In our modeling framework, we incorporated two different approaches to real fluids modeling: (a) the first approach is based on the HBMS model developed by Hirschfelder, Beuler, McGee and Sutton, and (b) the second approach is based on a cubic equation of state developed by Soave, Redlich and Kwong (SRK). Both approaches cover fluid properties and property variation spanning sub-critical gas and liquid states as well as supercritical states. Both models were rigorously tested, and properties for common fluids such as oxygen, nitrogen, and hydrogen were compared against NIST data in both the sub-critical and supercritical regimes.
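The SRK cubic equation of state referenced above can be sketched in a few lines: form the attraction and co-volume parameters a and b from the critical properties and acentric factor, then solve the resulting cubic in the compressibility factor Z. This is a generic textbook implementation for a pure fluid, not the code used in the study:

```python
import numpy as np

R = 8.314462618  # universal gas constant, J/(mol K)

def srk_Z(T, P, Tc, Pc, omega):
    """Compressibility factor from the Soave-Redlich-Kwong equation of state.
    Returns the largest real root of the cubic (the vapour branch)."""
    m = 0.480 + 1.574 * omega - 0.176 * omega**2
    alpha = (1.0 + m * (1.0 - np.sqrt(T / Tc))) ** 2
    a = 0.42748 * R**2 * Tc**2 / Pc * alpha     # attraction parameter
    b = 0.08664 * R * Tc / Pc                   # co-volume parameter
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)
    # SRK in Z form: Z^3 - Z^2 + (A - B - B^2) Z - A B = 0
    roots = np.roots([1.0, -1.0, A - B - B**2, -A * B])
    real = roots[np.abs(roots.imag) < 1e-9].real
    return float(real.max())

# Nitrogen near ambient conditions behaves almost ideally, so Z should be close to 1.
Z = srk_Z(T=300.0, P=1.0e5, Tc=126.2, Pc=3.396e6, omega=0.037)
```

In a tank-pressurization code the same cubic is evaluated at every cell and time step, with the liquid branch (smallest real root above B) selected on the propellant side of the interface.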

  7. Soil grain analyses at Meridiani Planum, Mars

    USGS Publications Warehouse

    Weitz, C.M.; Anderson, R.C.; Bell, J.F.; Farrand, W. H.; Herkenhoff, K. E.; Johnson, J. R.; Jolliff, B.L.; Morris, R.V.; Squyres, S. W.; Sullivan, R.J.

    2006-01-01

    Grain-size analyses of the soils at Meridiani Planum have been used to identify rock sources for the grains and provide information about depositional processes under past and current conditions. Basaltic sand, dust, millimeter-size hematite-rich spherules interpreted as concretions, spherule fragments, coated partially buried spherules, basalt fragments, sedimentary outcrop fragments, and centimeter-size cobbles are concentrated on the upper surfaces of the soils as a lag deposit, while finer basaltic sands and dust dominate the underlying soils. There is a bimodal distribution of soil grain sizes, with one population representing grains <125 μm and the other falling between 1-4.5 mm. Soils within craters like Eagle and Endurance show a much greater diversity of grain morphologies compared to the plains. The spherules found in the plains soils are approximately 1-2 mm smaller than those seen embedded in the outcrop rocks of Eagle and Endurance craters. The average major axis for all unfractured spherules measured in the soils and outcrop rocks is 2.87 ± 1.18 mm, with a trend toward decreasing spherule sizes in both the soils and outcrop rocks as the rover drove southward. Wind ripples seen across the plains of Meridiani are dominated by similar-size (1.3-1.7 mm) hematite-rich grains, and they match in size the larger grains on plains ripples at Gusev Crater. Larger clasts and centimeter-size cobbles that are scattered on the soils have several spectral and compositional types, reflecting multiple origins. The cobbles tend to concentrate within ripple troughs along the plains and in association with outcrop exposures. Copyright 2006 by the American Geophysical Union.

  8. Integrated Genomic Analyses in Bronchopulmonary Dysplasia

    PubMed Central

    Ambalavanan, Namasivayam; Cotten, C. Michael; Page, Grier P.; Carlo, Waldemar A.; Murray, Jeffrey C.; Bhattacharya, Soumyaroop; Mariani, Thomas J.; Cuna, Alain C.; Faye-Petersen, Ona M.; Kelly, David; Higgins, Rosemary D.

    2014-01-01

    Objective To identify single nucleotide polymorphisms (SNPs) and pathways associated with bronchopulmonary dysplasia (BPD), because the risk of an O2 requirement at 36 weeks’ post-menstrual age is strongly influenced by heritable factors. Study design A genome-wide scan was conducted on 1.2 million genotyped SNPs, and an additional 7 million imputed SNPs, using a DNA repository of extremely low birth weight infants. Genome-wide association and gene set analyses were performed for BPD or death, severe BPD or death, and severe BPD in survivors. Specific targets were validated using gene expression in BPD lung tissue and in mouse models. Results Of 751 infants analyzed, 428 developed BPD or died. No SNPs achieved genome-wide significance (p<10−8), although multiple SNPs in adenosine deaminase (ADARB2), CD44, and other genes were just below p<10−6. Of approximately 8000 pathways, 75 were significant at False Discovery Rate (FDR) <0.1 and p<0.001 for BPD/death, 95 for severe BPD/death, and 90 for severe BPD in survivors. The pathway with the lowest FDR was miR-219 targets (p=1.41E-08, FDR 9.5E-05) for BPD/death and Phosphorus Oxygen Lyase Activity (includes adenylate and guanylate cyclases) for both severe BPD/death (p=5.68E-08, FDR 0.00019) and severe BPD in survivors (p=3.91E-08, FDR 0.00013). Gene expression analysis confirmed significantly increased miR-219 and CD44 in BPD. Conclusions Pathway analyses confirmed the involvement of known pathways of lung development and repair (CD44, Phosphorus Oxygen Lyase Activity) and indicated novel molecules and pathways (ADARB2, targets of miR-219) involved in genetic predisposition to BPD. PMID:25449221

  9. Assessing the reproducibility of discriminant function analyses.

    PubMed

    Andrew, Rose L; Albert, Arianne Y K; Renaut, Sebastien; Rennison, Diana J; Bock, Dan G; Vines, Tim

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
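For the two-group case, the summary statistics the authors tried to reproduce come from Fisher's linear discriminant. A self-contained sketch of deriving the discriminant function and computing the percentage of cases correctly assigned, on synthetic data; this is a generic DFA illustration, not the study's analysis pipeline:

```python
import numpy as np

def dfa_percent_correct(X, y):
    """Two-group discriminant function analysis sketch: Fisher's linear
    discriminant w = Sw^-1 (m1 - m0), with each case assigned to the group
    whose projected mean is nearer. Returns the percentage of cases the
    derived function assigns to their true group."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-group scatter matrix.
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw, m1 - m0)          # discriminant function coefficients
    scores = X @ w
    threshold = 0.5 * (m0 @ w + m1 @ w)       # midpoint of projected group means
    pred = (scores > threshold).astype(int)
    return 100.0 * np.mean(pred == y)

# Two well-separated synthetic groups: nearly all cases should be assigned correctly.
rng = np.random.default_rng(42)
A = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
B = rng.normal([3.0, 3.0], 1.0, size=(100, 2))
X, y = np.vstack([A, B]), np.repeat([0, 1], 100)
pct = dfa_percent_correct(X, y)
```

Reproducing a published "percentage correctly assigned" then amounts to rerunning exactly this kind of computation on the archived dataset, which is why ambiguous variable labels or unspecified data subsets make reproduction fail.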

  10. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the

  11. Consumption Patterns and Perception Analyses of Hangwa

    PubMed Central

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-01-01

    Hangwa is a traditional food that, in light of current consumption trends, needs marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers’ consumption patterns and perception of Hangwa in order to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009, and the data from 231 respondents were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly ‘for present’ (39.8%), and the main reasons for buying it were ‘traditional image’ (33.3%) and ‘taste’ (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were ‘a sanitary process’, ‘a rigorous quality mark’ and ‘taste’, which were related to product quality. In addition, those with high importance but low performance were ‘popularization through advertisement’, ‘promotion through mass media’, ‘conversion of thought on traditional foods’, ‘a reasonable price’ and ‘a wide range of price’. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote its consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price. PMID:24471065

  12. Viscoelastic analyses of launch vehicle components

    SciTech Connect

    Chi, J.K.; Lin, S.R.

    1995-12-31

    Current analysis techniques for solid rocket propellant and insulation used in space launch vehicles have several shortcomings. The simplest linear elastic analysis method ignores the inherent viscoelastic behavior of these materials entirely. The relaxation modulus method commonly used to simulate time-dependent effects ignores the past loading history, while the rigorous viscoelastic finite-element analysis is often expensive and impractical. The response of viscoelastic materials is often characterized by the time-dependent relaxation moduli obtained from uniaxial relaxation tests. Since the relaxation moduli are functions of elapsed time, the viscoelastic analysis is dependent not only on the current stress or strain state but also on the full loading history. As a preliminary step towards developing a procedure which will yield reasonably conservative results for analyzing the structural response of solid rocket motors, an equivalent-modulus approach was developed. To demonstrate its application, a viscoelastic thick-walled cylindrical material, confined by a stiff steel case and under an internal pressure condition, was analyzed using (1) the equivalent-modulus elastic quasi-static method, (2) an exact viscoelastic closed-form solution, and (3) the viscoelastic finite-element program. A combination of two springs and one viscous damper is used to represent the viscoelastic material, with parameters obtained from stress-relaxation tests. The equivalent modulus is derived based on an accumulated quasi-static stress/strain state. The exact closed-form solution is obtained by the Laplace Transform method. The ABAQUS program is then used for the viscoelastic finite-element solution, where the loading-rate-dependent modulus is represented by a Prony series expansion of the relaxation modulus. Additional analyses were performed for two space launch solid rocket motors for the purpose of comparing results from the equivalent-modulus approach and the ABAQUS program.
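
The Prony-series relaxation modulus mentioned above, and the hereditary integral that makes viscoelastic stress depend on the full loading history, can be sketched as follows (the parameter values would come from stress-relaxation tests; this is a generic illustration, not the referenced ABAQUS implementation):

```python
import math

def relaxation_modulus(t, e_inf, terms):
    """Prony-series relaxation modulus E(t) = E_inf + sum_i E_i * exp(-t / tau_i).
    terms: list of (E_i, tau_i) pairs fitted to stress-relaxation test data."""
    return e_inf + sum(e_i * math.exp(-t / tau_i) for e_i, tau_i in terms)

def viscoelastic_stress(times, strains, e_inf, terms):
    """Hereditary integral sigma(t) = int_0^t E(t - s) d(eps)/ds ds, discretized
    with midpoint increments; shows why the full loading history matters,
    unlike a single 'equivalent modulus'."""
    sigma = []
    for k, t in enumerate(times):
        total = 0.0
        for j in range(1, k + 1):
            d_eps = strains[j] - strains[j - 1]
            t_mid = t - 0.5 * (times[j] + times[j - 1])
            total += relaxation_modulus(t_mid, e_inf, terms) * d_eps
        sigma.append(total)
    return sigma
```

With an empty Prony series the model degenerates to a linear elastic spring, which is a convenient sanity check.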

  13. Trend analyses with river sediment rating curves

    USGS Publications Warehouse

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQb, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â (Q/QGM)b], where QGM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
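
The two fitted forms above differ only in where the offset is evaluated: â is the offset a referred to the geometric-mean discharge, â = a·QGM^b. A minimal NumPy sketch of fitting both C = aQ^b and the discharge-normalized C = â(Q/QGM)^b by least squares in log-log space:

```python
import numpy as np

def rating_curve_fit(Q, C):
    """Fit C = a * Q**b and the discharge-normalized form C = a_hat * (Q/Q_gm)**b
    by least squares in log-log space (Q: discharge, C: suspended-sediment conc.)."""
    logQ, logC = np.log(Q), np.log(C)
    b, log_a = np.polyfit(logQ, logC, 1)       # slope b, intercept log(a)
    Q_gm = np.exp(logQ.mean())                 # geometric mean of sampled discharges
    a_hat = np.exp(log_a + b * np.log(Q_gm))   # offset evaluated at Q_gm
    return np.exp(log_a), b, a_hat, Q_gm
```

Because â is the predicted concentration at the geometric-mean discharge, a trend in b alone leaves â nearly unchanged, which is why it is the better offset metric.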

  14. Comparative mutational analyses of influenza A viruses

    PubMed Central

    Cheung, Peter Pak-Hang; Rogozin, Igor B.; Choy, Ka-Tim; Ng, Hoi Yee

    2015-01-01

    The error-prone RNA-dependent RNA polymerase (RdRP) and external selective pressures are the driving forces for RNA viral diversity. When confounded by selective pressures, it is difficult to assess if influenza A viruses (IAV) that have a wide host range possess comparable or distinct spontaneous mutational frequency in their RdRPs. We used in-depth bioinformatics analyses to assess the spontaneous mutational frequencies of two RdRPs derived from human seasonal (A/Wuhan/359/95; Wuhan) and H5N1 (A/Vietnam/1203/04; VN1203) viruses using the mini-genome system with a common firefly luciferase reporter serving as the template. High-fidelity reverse transcriptase was applied to generate high-quality mutational spectra which allowed us to assess and compare the mutational frequencies and mutable motifs along a target sequence of the two RdRPs of two different subtypes. We observed correlated mutational spectra (τ correlation P < 0.0001), comparable mutational frequencies (H3N2:5.8 ± 0.9; H5N1:6.0 ± 0.5), and discovered a highly mutable motif “(A)AAG” for both Wuhan and VN1203 RdRPs. Results were then confirmed with two recombinant A/Puerto Rico/8/34 (PR8) viruses that possess RdRP derived from Wuhan or VN1203 (RG-PR8×WuhanPB2, PB1, PA, NP and RG-PR8×VN1203PB2, PB1, PA, NP). Applying novel bioinformatics analysis on influenza mutational spectra, we provide a platform for a comprehensive analysis of the spontaneous mutation spectra for an RNA virus. PMID:25404565

  15. El Paso Electric photovoltaic-system analyses

    SciTech Connect

    Not Available

    1982-05-01

    Four analyses were performed on the Newman Power Station PV system. Two were performed using the Photovoltaic Transient Analysis Program (PV-TAP) and two with the SOLCEL II code. The first was to determine the optimum tilt angle for the array and the sensitivity of the annual energy production to variation in tilt angle. The optimum tilt angle was found to be 28°, and variations of 2° produce losses of only 0.06% in the annual energy production. The second analysis assesses the power loss due to cell-to-cell variations in short-circuit current and the degree of improvement attainable by sorting cells and matching modules. Typical distributions of short-circuit current can cause losses of about 9.5 to 11 percent in peak array power, and sorting cells into 4 bins prior to module assembly can reduce the losses to about 6 to 8 percent. Using modules from the same cell bins in building series strings can reduce the losses to about 4.5 to 6 percent. Results are nearly the same if the array is operated at a fixed voltage. The third study quantifies the magnitude and frequency of occurrence of high cell temperatures due to reverse bias caused by shadowing, and it demonstrates that cell temperatures achieved in reverse bias are higher for cells with larger shunt resistance. The last study assesses the adequacy of transient protection devices on the dc power lines against transients produced by array switching and lightning. Large surge capacitors on the dc power line effectively limit voltage excursions at the array and at the control room due to lightning. Without insertion of series resistors, the current may be limited only by cable and switch impedances, and all elements could be severely stressed. (LEW)

  16. SEDS Tether M/OD Damage Analyses

    NASA Technical Reports Server (NTRS)

    Hayashida, K. B.; Robinson, J. H.; Hill, S. A.

    1997-01-01

    The Small Expendable Deployer System (SEDS) was designed to deploy an endmass at the end of a 20-km-long tether which acts as an upper stage rocket, and the threats from the meteoroid and orbital debris (M/OD) particle environments on SEDS components are important issues for the safety and success of any SEDS mission. However, the possibility of severing the tether due to M/OD particle impacts is an even more serious concern, since the SEDS tether has a relatively large exposed area to the M/OD environments although its diameter is quite small. The threats from the M/OD environments became a very important issue for the third SEDS mission, since the project office proposed using the shuttle orbiter as a launch platform instead of the second stage of a Delta II expendable rocket, which was used for the first two SEDS missions. A series of hyper-velocity impact tests were performed at the Johnson Space Center and Arnold Engineering Development Center to help determine the critical particle sizes required to sever the tether. The computer hydrodynamic code or hydrocode called CTH, developed by the Sandia National Laboratories, was also used to simulate the damage on the SEDS tether caused by both the orbital debris and test particle impacts. The CTH hydrocode simulation results provided the much needed information to help determine the critical particle sizes required to sever the tether. The M/OD particle sizes required to sever the tether were estimated to be less than 0.1 cm in diameter from these studies, and these size particles are more abundant in low-Earth orbit than larger size particles. Finally, the authors performed the M/OD damage analyses for the three SEDS missions; i.e., SEDS-1, -2, and -3 missions, by using the information obtained from the hypervelocity impact test and hydrocode simulations results.

  17. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…

  18. [Critical reading of systematic reviews and meta-analyses about diagnostic imaging].

    PubMed

    Plana, M N; Zamora, J; Abraira, V

    2015-11-01

    Systematic reviews of diagnostic validity have been proposed as the best methodological tool to integrate all the available evidence and to help physicians decide whether to use a given diagnostic test. These studies aim to synthesize the results obtained in different primary studies into a couple of indices, generally sensitivity and specificity, or into a summary receiver operating characteristic (ROC) curve. Although there is a certain parallelism with reviews about the efficacy of therapeutic interventions, reviews of diagnostic validity have certain peculiarities that add complexity to the analysis and interpretation of the results. This article emphasizes the methodological aspects that make it possible to critically assess the extent to which the results of a review of the validity of diagnostic tests are valid and provides rudimentary knowledge of the statistics necessary to understand the results.
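
As a toy illustration of the kind of synthesis described above, the sketch below computes per-study sensitivity and specificity from 2×2 tables and their unweighted logit-scale means. This is deliberately naive: real diagnostic reviews use hierarchical bivariate or HSROC models that account for the sensitivity/specificity correlation and study weights, which this sketch omits.

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def naive_summary(tables):
    """Per-study sensitivity and specificity from 2x2 tables (tp, fn, fp, tn),
    pooled as unweighted means on the logit scale. Illustration only."""
    sens = [tp / (tp + fn) for tp, fn, fp, tn in tables]
    spec = [tn / (tn + fp) for tp, fn, fp, tn in tables]
    pooled_sens = inv_logit(sum(map(logit, sens)) / len(sens))
    pooled_spec = inv_logit(sum(map(logit, spec)) / len(spec))
    return pooled_sens, pooled_spec
```

Averaging on the logit scale rather than the raw proportion scale keeps pooled values inside (0, 1) and is the scale on which the bivariate models operate.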

  19. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma

    PubMed Central

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

    Abstract The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and 1st, 10th, and 50th percentiles of ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P <0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P <0.001). MR histogram analyses—in particular the 1st percentile of the PVP images—held promise for prediction of MVI of HCC. PMID:27368028
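
First-order histogram features of the kind used in this study are straightforward to compute from the voxel values inside a tumor ROI. A minimal NumPy sketch (the feature definitions follow the usual first-order conventions; this is not the authors' code):

```python
import numpy as np

def histogram_features(roi, percentiles=(1, 10, 50, 90, 99)):
    """First-order histogram features of the voxel values inside a tumor ROI
    (e.g. an ADC map or a portal-venous-phase image)."""
    v = np.asarray(roi, dtype=float).ravel()
    mean, var = v.mean(), v.var()
    z = (v - mean) / v.std()
    feats = {"mean": mean, "variance": var,
             "skewness": (z ** 3).mean(),          # third standardized moment
             "kurtosis": (z ** 4).mean() - 3.0}    # excess kurtosis
    for p in percentiles:
        feats[f"p{p}"] = np.percentile(v, p)
    return feats
```

Low percentiles (such as the 1st) summarize the darkest voxels of the lesion, which is why they can carry information that the mean alone misses.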

  20. EURIDICE Project: The Evaluation of Image Database Use in Online Learning

    ERIC Educational Resources Information Center

    Eklund, Pieta; Lindh, Maria; Maceviciute, Elena; Wilson, Thomas D.

    2006-01-01

    Digital images and image databases can support a wide range of learning objectives. The EURIDICE project aims to assess the requirements for image databases, on the one hand, and to incorporate images into higher education (HEI) teaching, on the other. The aim of this paper is to analyse the tasks and process of evaluation of the EURIDICE…

  1. Body Image Dissatisfaction and Distortion, Steroid Use, and Sex Differences in College Age Bodybuilders.

    ERIC Educational Resources Information Center

    Peters, Mark Anthony; Phelps, LeAddelle

    2001-01-01

    Compares college age bodybuilders by sex and steroid intake on two variables: body image dissatisfaction and body image distortion. Results reveal only a significant effect for gender on body distortion. No steroid-use differences were apparent for either body image dissatisfaction or body image distortion. Analyses indicate that female…

  2. X-Eye: A reference format for eye tracking data to facilitate analyses across databases

    NASA Astrophysics Data System (ADS)

    Winkler, Stefan; Savoy, Florian M.; Subramanian, Ramanathan

    2014-02-01

    Datasets of images annotated with eye tracking data constitute important ground truth for the development of saliency models, which have applications in many areas of electronic imaging. While comparisons and reviews of saliency models abound, similar comparisons among the eye tracking databases themselves are rare. In an earlier paper, we reviewed the content and purpose of over two dozen databases available in the public domain and discussed their commonalities and differences. A major issue is that the formats of the various datasets vary a lot owing to the nature of tools used for eye movement recordings, and often specialized code is required to use the data for further analysis. In this paper, we therefore propose a common reference format for eye tracking data, together with conversion routines for 16 existing image eye tracking databases to that format. Furthermore, we conduct a few analyses on these datasets as examples of what X-Eye facilitates.
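
The conversion problem described above amounts to mapping each vendor-specific record onto one shared schema. The sketch below illustrates the idea with a hypothetical common record and one hypothetical vendor export; the field names are illustrative and are not the actual X-Eye specification.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    """One fixation in a hypothetical common reference format
    (field names are illustrative, not the X-Eye specification)."""
    image_id: str
    observer_id: str
    x: float           # pixel coordinates in the stimulus image
    y: float
    duration_ms: float

def convert_vendor_row(row, image_id, observer_id):
    """Sketch: map one vendor-specific record (here a dict parsed from a
    hypothetical CSV export) onto the common format."""
    return Fixation(image_id, observer_id,
                    x=float(row["FixationPointX"]),
                    y=float(row["FixationPointY"]),
                    duration_ms=float(row["FixationDuration"]))
```

With one such converter per database, downstream saliency analyses only ever have to read a single format.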

  3. Image Editing Via Searching Source Image

    NASA Astrophysics Data System (ADS)

    Yu, Han; Deng, Liang-Jian

    Image editing has important applications in changing image texture, illumination, target location, etc. As an important application of the Poisson equation, Poisson image editing processes images in the gradient domain and has been applied to seamless cloning, selection editing, image denoising, etc. In this paper, we present a new application of Poisson image editing, which is based on searching the source image. The main feature of the new application is that all modifying information comes from the source image. Experimental results show that the proposed application performs well.
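
The gradient-domain idea behind Poisson editing can be shown in one dimension: inside the edited region the result keeps the source's Laplacian (its gradients), while the boundary values come from the target, so the paste is seamless. A small sketch using Gauss-Seidel iteration (a real implementation would assemble a 2-D sparse linear system instead):

```python
import numpy as np

def poisson_blend_1d(target, source, lo, hi, iters=5000):
    """1-D Poisson (gradient-domain) editing: solve f'' = source'' on [lo, hi]
    with f fixed to the target values just outside the region."""
    f = target.astype(float).copy()
    lap = np.zeros_like(f)
    lap[1:-1] = source[:-2] - 2 * source[1:-1] + source[2:]   # source Laplacian
    for _ in range(iters):
        for i in range(lo, hi + 1):                            # interior unknowns
            f[i] = 0.5 * (f[i - 1] + f[i + 1] - lap[i])        # Gauss-Seidel update
    return f
```

Outside [lo, hi] the target is returned untouched; inside, the solution blends the source's curvature smoothly into the target's boundary values.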

  4. Genome-Facilitated Analyses of Geomicrobial Processes

    SciTech Connect

    Kenneth H. Nealson

    2012-05-02

    that makes up chitin, virtually all of the strains were in fact capable. This led to the discovery of a great many new genes involved with chitin and NAG metabolism (7). In a similar vein, a detailed study of the sugar utilization pathway revealed a major new insight into the regulation of sugar metabolism in this genus (19). Systems Biology and Comparative Genomics of the shewanellae: Several publications were put together describing the use of comparative genomics for analyses of the group Shewanella, and these were a logical culmination of our genomic-driven research (10,15,18). Eight graduate students received their Ph.D. degrees doing part of the work described here, and four postdoctoral fellows were supported. In addition, approximately 20 undergraduates took part in projects during the grant period.

  5. Static and dynamic analyses of tensegrity structures

    NASA Astrophysics Data System (ADS)

    Nishimura, Yoshitaka

    Tensegrity structures are a class of truss structures consisting of a continuous set of tension members (cables) and a discrete set of compression members (bars). Since tensegrity structures are lightweight and can be compactly stowed and deployed, cylindrical tensegrity modules have been proposed for space structures. From the viewpoint of structural dynamics, tensegrity structures pose a new set of problems, i.e., initial shape finding. Initial configurations of tensegrity structures must be computed by imposing a pre-stressability condition on the initial equilibrium equations. There are ample qualitative statements regarding the initial geometry of cylindrical and spherical tensegrity modules. Quantitative initial shape analyses have only been performed on one-stage and two-stage cylindrical modules. However, analytical expressions for important geometrical parameters such as twist angles and overlap ratios are lacking in the definition of the initial shape of both cylindrical and spherical tensegrity modules. In response to the above needs, a set of static and dynamic characterization procedures for tensegrity modules was first developed. The procedures were subsequently applied to Buckminster Fuller's spherical tensegrity modules. Both the initial shape and the corresponding pre-stress mode were analytically obtained by using the graphs of the tetrahedral, octahedral (cubic), and icosahedral (dodecahedral) groups. For pre-stressed configurations, modal analyses were conducted to classify a large number of infinitesimal mechanism modes. The procedures were also applied to cyclic cylindrical tensegrity modules with an arbitrary number of stages. It was found that both the Maxwell number and the number of infinitesimal mechanism modes are independent of the number of stages in the axial direction. A reduced set of equilibrium equations was derived by incorporating cyclic symmetry and the flip, or quasi-flip, symmetry of the cylindrical modules.
For multi-stage modules with more than

  6. Molecular cloning of chicken aggrecan. Structural analyses.

    PubMed Central

    Chandrasekaran, L; Tanzer, M L

    1992-01-01

    domain. Thus different variants of chondroitin sulphate and keratan sulphate domains may have evolved separately to fulfil specific biochemical and physiological functions. Images Fig. 1. Fig. 3. Fig. 4. Fig. 5. Fig. 6. Fig. 7. Fig. 8. PMID:1339285

  7. First Super-Earth Atmosphere Analysed

    NASA Astrophysics Data System (ADS)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are

  8. Chest Imaging.

    PubMed

    Keijsers, Ruth G; Veltkamp, Marcel; Grutters, Jan C

    2015-12-01

    Chest imaging has a central role in the diagnosis and monitoring of sarcoidosis. For staging of pulmonary disease on chest radiographs, Scadding stages are still widely used. High-resolution CT (HRCT), however, is more accurate in visualizing the various manifestations of pulmonary sarcoidosis as well as its complications. A generally accepted HRCT scoring system is lacking. Fluorodeoxyglucose F 18 positron emission tomography can visualize disease activity better than conventional markers in a significant proportion of patients. In patients with extensive changes on HRCT but no parenchymal fluorodeoxyglucose F 18 uptake, prudence with regard to initiation or intensification of immunosuppressive treatment is warranted. PMID:26593136

  9. Image Processor

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Texas Instruments' Programmable Remapper is a research tool used to determine how best to utilize the still-usable part of a patient's visual field by mapping manipulated imagery onto it. It is an offshoot of a NASA program for speeding up, and improving the accuracy of, pattern recognition in video imagery. The Remapper enables an image to be "pushed around" so that more of it falls onto the functional portions of the retina of a low-vision person. It works at video rates, and researchers hope to significantly reduce its size and cost, creating a wearable prosthesis for visually impaired people.

  10. Electron Microprobe Techniques for Use in Tephrochronological Analyses

    NASA Astrophysics Data System (ADS)

    Fournelle, J.; Severin, K.; Wallace, K.; Beget, J.; Larsen, J.

    2006-12-01

    Tephrochronology generally assumes that a layer of volcanic ash represents a snapshot of eruption/deposition and of a region within the subvolcanic magma chamber. Correlation of tephra deposits over long distances helps establish age control for other deposits (volcanic and nonvolcanic). Reliable correlations depend on establishing similarity among tephra deposits. Although multi-parameter characterization of a tephra enhances long-distance correlations, identification and correlation of unknown tephras is often done using only geochemical analyses. Techniques vary but generally deal with chemically characterizing all (bulk) or portions (glass, crystals) of the tephra layer, with various geochemical techniques at various spatial scales. Electron probe microanalysis (EPMA) is the most commonly used analytical tool for geochemical analysis and imaging of micron-size volumes of glass and crystals, yet, despite warnings from numerous EPMA analysts dating back to at least 1992, a standard method for collecting, reducing, and reporting tephra data among and within laboratories is not common practice, making comparison of data sets problematic. We review the complexities in volcanic glass analysis, which include: 1) selection of standards (natural and synthetic, minerals and glasses, simple and complex chemistry, primary and secondary); 2) beam diameter, current level and count times; 3) time dependent element migration (volatiles Na, K, Al, Si); and 4) possible hydration of the glass. For example, there are multiple methods available for treating the volatile elements (minimizing the effect vs. not minimizing but correcting for it), and Morgan and London (1996) examined some of these for hydrous silicate glasses; we suggest continued comparisons are warranted, particularly on commonly used standards. Some published data sets are normalized to 100 wt% without an explanation of the extent of the deficiency in raw total. We review the 10 recommendations made by Froggatt

  11. Residual Strength Analyses of Monolithic Structures

    NASA Technical Reports Server (NTRS)

    Forth, Scott (Technical Monitor); Ambur, Damodar R. (Technical Monitor); Seshadri, B. R.; Tiwari, S. N.

    2003-01-01

    Finite-element fracture simulation methodology predicts the residual strength of damaged aircraft structures. The methodology uses the critical crack-tip-opening-angle (CTOA) fracture criterion to characterize the fracture behavior of the material. The CTOA fracture criterion assumes that stable crack growth occurs when the crack-tip angle reaches a constant critical value. The use of the CTOA criterion requires an elastic-plastic, finite-element analysis. The critical CTOA value is determined by simulating fracture behavior in laboratory specimens, such as a compact specimen, to obtain the angle that best fits the observed test behavior. The critical CTOA value appears to be independent of loading, crack length, and in-plane dimensions. However, it is a function of material thickness and local crack-front constraint. Modeling the local constraint requires either a three-dimensional analysis or a two-dimensional analysis with an approximation to account for the constraint effects. In recent times, as the aircraft industry leans towards monolithic structures with the intention of reducing part count and manufacturing cost, there has been a consistent effort at NASA Langley to extend the critical-CTOA-based numerical methodology to the analysis of integrally-stiffened panels. In this regard, a series of fracture tests were conducted on both flat and curved aluminum alloy integrally-stiffened panels. These flat panels were subjected to uniaxial tension and, during the tests, applied load-crack extension, out-of-plane displacements and local deformations around the crack-tip region were measured. Compact and middle-crack tension specimens were tested to determine the critical angle (ψc) using a three-dimensional code (ZIP3D) and the plane-strain core height (hc) using a two-dimensional code (STAGS). These values were then used in the STAGS analysis to predict the fracture behavior of the integrally-stiffened panels. The analyses modeled stable tearing, buckling, and crack

  12. Runtime and Pressurization Analyses of Propellant Tanks

    NASA Technical Reports Server (NTRS)

    Field, Robert E.; Ryan, Harry M.; Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chung P.

    2007-01-01

    Multi-element unstructured CFD has been utilized at NASA SSC to carry out analyses of propellant tank systems in different modes of operation. The three regimes of interest at SSC include (a) tank chill down (b) tank pressurization and (c) runtime propellant draw-down and purge. While tank chill down is an important event that is best addressed with long time-scale heat transfer calculations, CFD can play a critical role in the tank pressurization and runtime modes of operation. In these situations, problems with contamination of the propellant by inclusion of the pressurant gas from the ullage causes a deterioration of the quality of the propellant delivered to the test article. CFD can be used to help quantify the mixing and propellant degradation. During tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant including heat transfer and phase change effects and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. It should be noted that traditional CFD modeling is inadequate for such simulations because the fluids in the tank are in a range of different sub-critical and supercritical states and elaborate phase change and mixing rules have to be developed to accurately model the interaction between the ullage gas and the propellant. We show a typical run-time simulation of a spherical propellant tank, containing RP-1 in this case, being pressurized with room-temperature nitrogen at 540 R. Nitrogen

  13. Radiative transfer analyses of Titan's tropical atmosphere

    NASA Astrophysics Data System (ADS)

    Griffith, Caitlin A.; Doose, Lyn; Tomasko, Martin G.; Penteado, Paulo F.; See, Charles

    2012-04-01

    Titan's optical and near-IR spectra result primarily from the scattering of sunlight by haze and its absorption by methane. With a column abundance of 92 km amagat (11 times that of Earth), Titan's atmosphere is optically thick and only ˜10% of the incident solar radiation reaches the surface, compared to 57% on Earth. Such a formidable atmosphere obstructs investigations of the moon's lower troposphere and surface, which are highly sensitive to the radiative transfer treatment of methane absorption and haze scattering. The absorption and scattering characteristics of Titan's atmosphere have been constrained by the Huygens Probe Descent Imager/Spectral Radiometer (DISR) experiment for conditions at the probe landing site (Tomasko, M.G., Bézard, B., Doose, L., Engel, S., Karkoschka, E. [2008a]. Planet. Space Sci. 56, 624-247; Tomasko, M.G. et al. [2008b]. Planet. Space Sci. 56, 669-707). Cassini's Visual and Infrared Mapping Spectrometer (VIMS) data indicate that the rest of the atmosphere (except for the polar regions) can be understood with small perturbations in the high haze structure determined at the landing site (Penteado, P.F., Griffith, C.A., Tomasko, M.G., Engel, S., See, C., Doose, L., Baines, K.H., Brown, R.H., Buratti, B.J., Clark, R., Nicholson, P., Sotin, C. [2010]. Icarus 206, 352-365). However the in situ measurements were analyzed with a doubling and adding radiative transfer calculation that differs considerably from the discrete ordinates codes used to interpret remote data from Cassini and ground-based measurements. In addition, the calibration of the VIMS data with respect to the DISR data has not yet been tested. Here, VIMS data of the probe landing site are analyzed with the DISR radiative transfer method and the faster discrete ordinates radiative transfer calculation; both models are consistent (to within 0.3%) and reproduce the scattering and absorption characteristics derived from in situ measurements. Constraints on the atmospheric

  14. Content standards for medical image metadata

    NASA Astrophysics Data System (ADS)

    d'Ornellas, Marcos C.; da Rocha, Rafael P.

    2003-12-01

    Medical images are at the heart of healthcare diagnostic procedures. They have provided not only a noninvasive means to view anatomical cross-sections of internal organs but also a means for physicians to evaluate the patient's diagnosis and monitor the effects of the treatment. For a medical center, the emphasis may shift from the generation of images to post-processing and data management, since the medical staff may generate even more processed images and other data from the original image after various analyses and post-processing. A medical image data repository for health care information systems is becoming a critical need. This data repository would contain comprehensive patient records, including information such as clinical data, related diagnostic images, and post-processed images. Due to the large volume and complexity of the data as well as the diversified user access requirements, the implementation of the medical image archive system will be a complex and challenging task. This paper discusses content standards for medical image metadata. In addition, it also focuses on the evaluation of image metadata content and metadata quality management.

  15. Scrotal imaging

    PubMed Central

    Studniarek, Michał; Modzelewska, Elza

    2015-01-01

    Pathological lesions within the scrotum are relatively rare in imaging except for ultrasonography. The diseases presented in the paper are usually found in men at the age of 15–45, i.e. men of reproductive age, and therefore they are worth attention. Scrotal ultrasound in infertile individuals should be conducted on a routine basis owing to the fact that pathological scrotal lesions are frequently detected in this population. Malignant testicular cancers are the most common neoplasms in men at the age of 20–40. Ultrasound imaging is the method of choice characterized by the sensitivity of nearly 100% in the differentiation between intratesticular and extratesticular lesions. In the case of doubtful lesions that are not classified for intra-operative verification, nuclear magnetic resonance is applied. Computed tomography, however, is performed to monitor the progression of a neoplastic disease, in pelvic trauma with scrotal injury as well as in rare cases of scrotal hernias involving the ureters or a fragment of the urinary bladder. PMID:26674847

  16. Image Ambiguity and Fluency

    PubMed Central

    Jakesch, Martina; Leder, Helmut; Forster, Michael

    2013-01-01

    Ambiguity is often associated with negative affective responses, and enjoying ambiguity seems restricted to only a few situations, such as experiencing art. Nevertheless, theories of judgment formation, especially the "processing fluency account", suggest that easy-to-process (non-ambiguous) stimuli are processed faster and are therefore preferred to (ambiguous) stimuli that are hard to process. In a series of six experiments, we investigated these contrasting approaches by manipulating fluency (presentation duration: 10 ms, 50 ms, 100 ms, 500 ms, 1000 ms) and testing effects of ambiguity (ambiguous versus non-ambiguous pictures of paintings) on classification performance (Part A; speed and accuracy) and aesthetic appreciation (Part B; liking and interest). As indicated by signal detection analyses, classification accuracy increased with presentation duration (Exp. 1a), but we found no effects of ambiguity on classification speed (Exp. 1b). Fifty percent of the participants were able to successfully classify ambiguous content at a presentation duration of 100 ms, and at 500 ms even 75% performed above chance level. Ambiguous artworks were found more interesting (in the 50 ms to 1000 ms conditions) and were preferred over non-ambiguous stimuli at 500 ms and 1000 ms (Exp. 2a-2c, 3). Importantly, ambiguous images were nonetheless rated significantly harder to process than non-ambiguous images. These results suggest that ambiguity is an essential ingredient in art appreciation, even though, or maybe because, it is harder to process. PMID:24040172

  17. Indexing Images: Testing an Image Description Template.

    ERIC Educational Resources Information Center

    Jorgensen, Corinne

    1996-01-01

    A template for pictorial image description to be used by novice image searchers in recording their descriptions of images was tested; image attribute classes derived in previous research were used to model the template. Results indicated that users may need training and/or more guidance to correctly assign descriptors to higher-level classes.…

  18. Descriptive and Experimental Analyses of Potential Precursors to Problem Behavior

    ERIC Educational Resources Information Center

    Borrero, Carrie S. W.; Borrero, John C.

    2008-01-01

    We conducted descriptive observations of severe problem behavior for 2 individuals with autism to identify precursors to problem behavior. Several comparative probability analyses were conducted in addition to lag-sequential analyses using the descriptive data. Results of the descriptive analyses showed that the probability of the potential…

  19. Speckle imaging algorithms for planetary imaging

    SciTech Connect

    Johansson, E.

    1994-11-15

    I will discuss the speckle imaging algorithms used to process images of the impact sites of the collision of comet Shoemaker-Levy 9 with Jupiter. The algorithms use a phase retrieval process based on the average bispectrum of the speckle image data. High resolution images are produced by estimating the Fourier magnitude and Fourier phase of the image separately, then combining them and inverse transforming to achieve the final result. I will show raw speckle image data and high-resolution image reconstructions from our recent experiment at Lick Observatory.
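
    As a toy illustration of the recombination step described above, the sketch below (hypothetical numpy code, not the author's pipeline) takes a separately estimated Fourier magnitude and phase and inverse-transforms them into an image; in real speckle imaging the magnitude would come from the averaged power spectrum of many short exposures and the phase from the average bispectrum.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.random((32, 32))  # stand-in for the true object

# In speckle imaging, magnitude and phase are estimated separately
# (power spectrum and bispectrum, respectively); here both are taken
# directly from the truth image purely to show the recombination step.
F = np.fft.fft2(truth)
magnitude = np.abs(F)
phase = np.angle(F)

# Combine the two estimates and inverse transform to get the image.
recovered = np.fft.ifft2(magnitude * np.exp(1j * phase)).real
```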

  20. NASA Earth Exchange (NEX) Supporting Analyses for National Climate Assessments

    NASA Astrophysics Data System (ADS)

    Nemani, R. R.; Thrasher, B. L.; Wang, W.; Lee, T. J.; Melton, F. S.; Dungan, J. L.; Michaelis, A.

    2015-12-01

    The NASA Earth Exchange (NEX) is a collaborative computing platform that has been developed with the objective of bringing scientists together with the software tools, massive global datasets, and supercomputing resources necessary to accelerate research in Earth systems science and global change. NEX supports several research projects that are closely related to the National Climate Assessment, including the generation of high-resolution climate projections, identification of trends and extremes in climate variables and the evaluation of their impacts on regional carbon/water cycles and biodiversity, the development of land-use management and adaptation strategies for climate-change scenarios, and even the exploration of climate mitigation through geo-engineering. Scientists also use the large collection of satellite data on NEX to conduct research on quantifying spatial and temporal changes in land surface processes in response to climate and land-cover/land-use changes. Researchers, leveraging NEX's massive compute/storage resources, have used statistical techniques to downscale the coarse-resolution CMIP5 projections to fulfill the demands of the community for a wide range of climate change impact analyses. The DCP-30 (Downscaled Climate Projections at 30 arcsecond) for the conterminous US at monthly, ~1 km resolution and the GDDP (Global Daily Downscaled Projections) for the entire world at daily, 25 km resolution are now widely used in climate research and applications, as well as for communicating climate change. In order to serve a broader community, the NEX team, in collaboration with Amazon, Inc., created the OpenNEX platform. OpenNEX provides ready access to NEX data holdings, including the NEX-DCP30 and GDDP datasets, along with a number of pertinent analysis tools and workflows on the AWS infrastructure in the form of publicly available, self-contained, fully functional Amazon Machine Images (AMIs) for anyone interested in global climate change.

  1. Image processing for the Arcetri Solar Archive

    NASA Astrophysics Data System (ADS)

    Centrone, M.; Ermolli, I.; Giorgi, F.

    The modelling recently developed to "reconstruct" with high accuracy the measured Total Solar Irradiance (TSI) variations, based on semi-empirical atmosphere models and the observed distribution of solar magnetic regions, can be applied to "construct" the TSI variations back in time by making use of observations stored in several historic photographic archives. However, the analysis of images obtained from these archives is not a straightforward task, because these images suffer from several defects originating from the acquisition techniques and data storage. In this paper we summarize the processing applied to identify solar features in the images obtained by the digitization of the Arcetri solar archive.

  2. Image resampling effects in mammographic image simulation.

    PubMed

    Yip, M; Mackenzie, A; Lewis, E; Dance, D R; Young, K C; Christmas, W; Wells, K

    2011-11-21

    This work describes the theory of resampling effects within the context of image simulation for mammographic images. The process of digitization associated with using digital imaging technology needs to be correctly addressed in any image simulation process. Failure to do so can lead to overblurring in the final synthetic image. A method for weighted neighbourhood averaging is described for non-integer scaling factors in resampling images. The use of the method is demonstrated by comparing simulated and real images of an edge test object acquired on two clinical mammography systems. Images were simulated using two setups: from idealized images and from images obtained with clinical systems. A Gaussian interpolation method is proposed as a single-step solution to modelling blurring filters for the simulation process.
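
    The idea of weighted neighbourhood averaging for a non-integer scaling factor can be sketched in one dimension as follows (a hypothetical illustration of the general idea, not the authors' implementation): each output sample averages the input pixels it covers, with the partial pixels at either end weighted by their fractional overlap.

```python
import numpy as np

def area_resample(signal, factor):
    """Downsample a 1-D signal by a non-integer factor using weighted
    neighbourhood averaging: each output sample is the mean of the
    input interval it covers, with fractional pixels at the interval
    boundaries weighted by their overlap."""
    n_out = int(len(signal) / factor)
    out = np.empty(n_out)
    for i in range(n_out):
        start, end = i * factor, (i + 1) * factor
        j0, j1 = int(np.floor(start)), int(np.ceil(end))
        w = np.ones(j1 - j0)
        w[0] -= start - j0   # clip the leading partial pixel
        w[-1] -= j1 - end    # clip the trailing partial pixel
        out[i] = np.dot(signal[j0:j1], w) / factor
    return out
```

A constant signal stays constant under this scheme for any factor, which is a quick sanity check that the weights sum to the scaling factor.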

  3. Non-destructive infrared analyses: a method for provenance analyses of sandstones

    NASA Astrophysics Data System (ADS)

    Bowitz, Jörg; Ehling, Angela

    2008-12-01

    Infrared spectroscopy (IR spectroscopy) is commonly applied in the laboratory for mineral analyses in addition to XRD. Because such technical efforts are time- and cost-consuming, we present an infrared-based mobile method for non-destructive mineral and provenance analyses of sandstones. IR spectroscopy is based on activating chemical bonds: by irradiating a mineral mixture, particular bonds are excited to vibrate depending on the bond energy (resonance vibration). Accordingly, the energy of the IR spectrum is reduced, generating an absorption spectrum. The positions of the absorption maxima within the spectral region indicate the type of bond and in many cases identify the minerals containing these bonds. The non-destructive reflection spectroscopy operates in the near-infrared region (NIR) and can detect all common clay minerals as well as sulfates, hydroxides and carbonates. The spectra produced have been interpreted by computer using digital mineral libraries especially compiled for sandstones. The comparison of all results with XRD, XRF and interpretations of thin sections impressively demonstrates the accuracy and reliability of this method. Not only are different minerals detectable, but differently ordered kaolinites and varieties of illites can also be identified by the shape and size of the absorption bands. Clay minerals in particular, and their varieties in combination with their relative contents, form the characteristic spectra of sandstones. Other components such as limonite, hematite and amorphous silica also influence the spectra. Sandstones similar in colour and texture can often be identified by their characteristic reflectance spectra. Reference libraries with more than 60 spectra of important German sandstones have been created to enable entirely computerized interpretation and identification of these dimension stones. The analysis of infrared spectroscopy results is demonstrated with examples of different sandstones.
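
    Matching a measured spectrum against a digital mineral library, as described above, can be reduced to a nearest-reference search. The sketch below is hypothetical (the reference names and Gaussian-dip spectra are made up for illustration); it ranks library entries by normalized correlation with the measurement.

```python
import numpy as np

def best_match(spectrum, library):
    """Return the name of the library entry most correlated with the
    measured spectrum. `library` maps mineral names to reference
    spectra sampled on the same wavelength grid."""
    def normalize(v):
        v = np.asarray(v, float)
        v = v - v.mean()
        return v / np.linalg.norm(v)

    s = normalize(spectrum)
    scores = {name: float(np.dot(s, normalize(ref)))
              for name, ref in library.items()}
    return max(scores, key=scores.get)

# Toy references: absorption bands modelled as Gaussian dips.
x = np.linspace(0.0, 1.0, 200)
library = {
    "kaolinite-like": 1 - np.exp(-((x - 0.3) ** 2) / 0.002),
    "illite-like": 1 - np.exp(-((x - 0.7) ** 2) / 0.002),
}
# A measurement with a slightly shifted, weaker band near 0.3.
measured = 1 - 0.9 * np.exp(-((x - 0.31) ** 2) / 0.002)
```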

  4. Large area CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Turchetta, R.; Guerrini, N.; Sedgwick, I.

    2011-01-01

    CMOS image sensors, also known as CMOS Active Pixel Sensors (APS) or Monolithic Active Pixel Sensors (MAPS), are today the dominant imaging devices. They are omnipresent in our daily life, as image sensors in cellular phones, web cams, digital cameras, ... In these applications, the pixels can be very small, in the micron range, and the sensors themselves tend to be limited in size. However, many scientific applications, like particle or X-ray detection, require large format, often with large pixels, as well as other specific performance, like low noise, radiation hardness or very fast readout. The sensors are also required to be sensitive to a broad spectrum of radiation: photons from the silicon cut-off in the IR down to UV and X- and gamma-rays through the visible spectrum as well as charged particles. This requirement calls for modifications to the substrate to be introduced to provide optimized sensitivity. This paper will review existing CMOS image sensors, whose size can be as large as a single CMOS wafer, and analyse the technical requirements and specific challenges of large format CMOS image sensors.

  5. Using a Log Analyser to Assist Research into Haptic Technology

    NASA Astrophysics Data System (ADS)

    Jónsson, Fannar Freyr; Hvannberg, Ebba Þóra

    Usability evaluations collect subjective and objective measures; an example of the latter is the time to complete a task. The paper describes use cases of a log analyser for haptic feedback. The log analyser reads a log file and extracts information such as the time of each practice and assessment session, analyses whether the user goes off curve, and measures the force applied. A case study using the analyser is performed with a PHANToM haptic learning environment application that is used to teach young visually impaired students the subject of polynomials. The paper answers six questions to illustrate further use cases of the log analyser.

  6. Imaging Genetics and Psychiatric Disorders

    PubMed Central

    Hashimoto, R; Ohi, K; Yamamori, H; Yasuda, Y; Fujimoto, M; Umeda-Yano, S; Watanabe, Y; Fukunaga, M; Takeda, M

    2015-01-01

    Imaging genetics is an integrated research method that uses neuroimaging and genetics to assess the impact of genetic variation on brain function and structure. Imaging genetics is both a tool for the discovery of risk genes for psychiatric disorders and a strategy for characterizing the neural systems affected by risk gene variants to elucidate quantitative and mechanistic aspects of brain function implicated in psychiatric disease. Early studies of imaging genetics included association analyses between brain morphology and single nucleotide polymorphisms whose function is well known, such as catechol-O-methyltransferase (COMT) and brain-derived neurotrophic factor (BDNF). GWAS of psychiatric disorders have identified genes with unknown functions, such as ZNF804A, and imaging genetics has been used to investigate clues of the biological function of these genes. The difficulty in replicating the findings of studies with small sample sizes has motivated the creation of large-scale collaborative consortiums, such as ENIGMA, CHARGE and IMAGEN, to collect thousands of images. In a genome-wide association study, the ENIGMA consortium successfully identified common variants in the genome associated with hippocampal volume at 12q24, and the CHARGE consortium replicated this finding. The new era of imaging genetics has just begun, and the next challenge we face is the discovery of small effect size signals from large data sets obtained from genetics and neuroimaging. New methods and technologies for data reduction with appropriate statistical thresholds, such as polygenic analysis and parallel independent component analysis (ICA), are warranted. Future advances in imaging genetics will aid in the discovery of genes and provide mechanistic insight into psychiatric disorders. PMID:25732148

  8. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gass Filter (OGF) Content Analyses

    SciTech Connect

    Jantzen, Carol M.; Missimer, David M.; Guenther, Chris P.; Shekhawat, Dushyant; VanEssendelft, Dirk T.; Means, Nicholas C.

    2015-04-23

    in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important, as the ash adds to the DMR and other vessel products, which affect the final waste product mass and composition. The amount and composition of the ash also affect the reaction kinetics; thus ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and possibly silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed, but the impact of the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug, and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within each sack in order to determine within-bag and between-bag variability of the coal. These analyses were also compared to the vendor's composite analysis and to the coal specification, as well as to historic data on Bestac coal analyses performed at Hazen Research Inc. (HRI) between 2004 and 2011.

  9. Reversible digital images

    NASA Astrophysics Data System (ADS)

    Knox, Keith T.

    1999-04-01

    A method has been developed to hide one image inside another with little loss in image quality. If the second image is a logo or watermark, then this method may be used to protect the ownership rights of the first image and to guarantee the authenticity of the image. The two images to be combined may be either black & white or color continuous tone images. A reversible image is created by incorporating the first image in the upper 4 bits and the second image in the lower 4 bits. When viewed normally, the reversible image appears to be the first image. To view the hidden image, the bits of the combined image are reversed, exchanging all of the lower and higher order bits. When viewed in the reversed mode, the image appears to be the second or hidden image. To maintain a high level of image quality for both images, two simultaneous error diffusion calculations are run to ensure that both views of the reversible image have the same visual appearance as the originals. Any alteration of one of the images locally destroys the other image at the site of the alterations. This provides a method to detect alterations of the original image.
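
    The nibble exchange at the heart of the method can be sketched as follows (a minimal illustration that omits the paper's paired error-diffusion step, which is what preserves visual quality in both views): the first image keeps its upper 4 bits, the second image's upper 4 bits are packed into the lower nibble, and swapping nibbles reveals the hidden image.

```python
import numpy as np

def combine(visible, hidden):
    # Upper nibble: the visible image. Lower nibble: the top 4 bits
    # of the hidden image. Both inputs are 8-bit arrays.
    return (visible & 0xF0) | (hidden >> 4)

def reverse(combined):
    # Swap the two nibbles: the hidden image moves to the top bits.
    return ((combined & 0x0F) << 4) | (combined >> 4)

a = np.array([[0xAB]], dtype=np.uint8)  # a visible-image pixel
b = np.array([[0xCD]], dtype=np.uint8)  # a hidden-image pixel
c = combine(a, b)
```

Any local alteration of `c` corrupts both nibbles at that site, which is why tampering with one view destroys the other, as the abstract notes.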

  10. Chemical imaging of latent fingerprint residues.

    PubMed

    Ricci, Camilla; Phiriyavityopas, Phiraporn; Curum, Nicholas; Chan, K L Andrew; Jickells, Sue; Kazarian, Sergei G

    2007-05-01

    In situ attenuated total reflection Fourier transform infrared (ATR-FT-IR) spectroscopic imaging has been used to obtain chemical images of fingerprints under controlled humidity and temperature. The distribution of lipid and amino acid components in the fingerprints from different donors left on the surface of the ZnSe crystal has been studied using an in situ FT-IR spectroscopic imaging approach under a controlled environment and studied as a function of time. Univariate and multivariate analyses were employed to analyze the spectroscopic dataset. Changes in the spectra of lipids with temperature and time have been detected. This information is needed to understand aging of the fingerprints. The ATR-FT-IR spectroscopic imaging offers a new and complementary means for studying the chemistry of fingerprints that are left pristine for further analysis. This study demonstrates the potential for visualizing the chemical changes of fingerprints for forensic applications by spectroscopic imaging.

  11. Metric Learning to Enhance Hyperspectral Image Segmentation

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Castano, Rebecca; Bue, Brian; Gilmore, Martha S.

    2013-01-01

    Unsupervised hyperspectral image segmentation can reveal spatial trends that show the physical structure of the scene to an analyst. Segmentations highlight borders and reveal areas of homogeneity and change; they are independently helpful for object recognition and assist with automated production of symbolic maps. Additionally, a good segmentation can dramatically reduce the number of effective spectra in an image, enabling analyses that would otherwise be computationally prohibitive. Specifically, using an over-segmentation of the image instead of individual pixels can reduce noise and potentially improve the results of statistical post-analysis. In this innovation, a metric learning approach is presented to improve the performance of unsupervised hyperspectral image segmentation. The prototype demonstrations attempt a superpixel segmentation in which the image is conservatively over-segmented; that is, single surface features may be split into multiple segments, but each individual segment, or superpixel, is ensured to have homogeneous mineralogy.
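
    The data-reduction step mentioned above, replacing each superpixel by a single spectrum, can be sketched as follows (hypothetical code; the segmentation itself, and the paper's metric learning, are assumed to have already produced the label map):

```python
import numpy as np

def superpixel_means(cube, labels):
    """Reduce an (H, W, B) hyperspectral cube to one mean spectrum
    per superpixel, given an (H, W) integer label map."""
    H, W, B = cube.shape
    flat = cube.reshape(-1, B)
    lab = labels.ravel()
    n = lab.max() + 1
    means = np.zeros((n, B))
    for k in range(n):
        means[k] = flat[lab == k].mean(axis=0)
    return means
```

For an image over-segmented into a few thousand superpixels, downstream statistics run on the `means` array instead of millions of pixels.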

  12. Nonspectroscopic imaging for quantitative chlorophyll sensing

    NASA Astrophysics Data System (ADS)

    Kim, Taehoon; Kim, Jeong-Im; Visbal-Onufrak, Michelle A.; Chapple, Clint; Kim, Young L.

    2016-01-01

    Nondestructive imaging of physiological changes in plants has been intensively used as an invaluable tool for visualizing heterogeneous responses to various types of abiotic and biotic stress. However, conventional approaches often have intrinsic limitations for quantitative analyses, requiring bulky and expensive optical instruments for capturing full spectral information. We report a spectrometerless (or spectrometer-free) reflectance imaging method that allows for nondestructive and quantitative chlorophyll imaging in individual leaves in situ in a handheld device format. The combination of a handheld-type imaging system and a hyperspectral reconstruction algorithm from an RGB camera offers simple instrumentation and operation while avoiding the use of an imaging spectrograph or tunable color filter. This platform could potentially be integrated into a compact, inexpensive, and portable system, while being of great value in high-throughput phenotyping facilities and laboratory settings.
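
    A common baseline for this kind of spectral reconstruction from an RGB camera, not necessarily the authors' algorithm, is a least-squares linear map fitted on paired RGB/spectrum training data:

```python
import numpy as np

def fit_rgb_to_spectrum(rgb_train, spectra_train):
    # Least-squares map M such that spectra ~= rgb @ M, where
    # rgb_train is (N, 3) and spectra_train is (N, n_bands).
    M, *_ = np.linalg.lstsq(rgb_train, spectra_train, rcond=None)
    return M

def reconstruct(rgb, M):
    return rgb @ M

# Synthetic training data generated from a known linear model.
rng = np.random.default_rng(1)
M_true = rng.random((3, 31))   # 31 hypothetical spectral bands
rgb = rng.random((100, 3))
spectra = rgb @ M_true
M_fit = fit_rgb_to_spectrum(rgb, spectra)
```

In practice the map is trained on measured leaf spectra and regularized; this sketch only shows the shape of the computation.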

  13. Infrared scanning images: An archeological application

    USGS Publications Warehouse

    Schaber, G.G.; Gumerman, G.J.

    1969-01-01

    Aerial infrared scanner images of an area near the Little Colorado River in north-central Arizona disclosed the existence of scattered clusters of parallel linear features in the ashfall area of Sunset Crater. The features are not obvious in conventional aerial photographs, and only one cluster could be recognized on the ground. Soil and pollen analyses reveal that they are prehistoric agricultural plots.

  14. Probing the mysterious underpinnings of multi-voxel fMRI analyses.

    PubMed

    Op de Beeck, Hans P

    2010-04-01

    Various arguments have been proposed for or against sub-voxel sensitivity, or hyperacuity, in functional magnetic resonance imaging (fMRI) at standard resolution. Sub-voxel sensitivity might exist, but the performance of multi-voxel fMRI analyses is nevertheless very likely to be dominated by a larger-scale organization, even if this organization is very weak. Up to now, most arguments are indirect in nature: they do not in themselves prove or contradict sub-voxel sensitivity, but they are suggestive, seem consistent or inconsistent with sub-voxel sensitivity, or show that the principle might or might not work. Here the previously proposed smoothing argument against hyperacuity is extended with simulations that include more realistic signal, noise, and analysis properties than any of the simulations presented before. These simulations confirm the relevance of the smoothing approach for finding out the scale of the functional maps that underlie the outcome of multi-voxel analyses, at least in relative terms (differences in the scale of different maps). However, image smoothing, like most other arguments in the literature, is an indirect argument, and at the end of the day such arguments are not sufficient to decide whether, and how much, sub-voxel maps contribute. A few suggestions are made about the type of evidence needed to help us understand the as yet mysterious underpinnings of multi-voxel fMRI analyses.

  15. Towards Efficiency of Oblique Images Orientation

    NASA Astrophysics Data System (ADS)

    Ostrowski, W.; Bakuła, K.

    2016-03-01

    Many papers have been written on both the theoretical aspects of bundle adjustment of oblique images and new operators for detecting tie points in oblique images. However, only a few of the achievements presented in the literature have been practically implemented in commercial software. In consequence, aerial triangulation is often performed either for nadir images obtained simultaneously with the oblique photos, or as separate bundle adjustments for images captured in different directions. The aim of this study was to investigate how the orientation of oblique images can be carried out effectively in commercial software based on structure-from-motion technology. The main objective of the research was to evaluate the impact of the orientation strategy on both the duration of the process and the accuracy of photogrammetric 3D products. Two very popular software packages, Pix4D and Agisoft Photoscan, were tested, and two approaches for image blocks were considered: the first based only on oblique images collected in four directions, and the second also including nadir images. In this study, blocks for three test areas were analysed. Oblique images were collected with medium-format cameras in a Maltese cross configuration with registration of GNSS and INS data. Both check points and digital surface models from airborne laser scanning were used as references.

  16. Modeling countermeasures to imaging infrared seekers

    NASA Astrophysics Data System (ADS)

    Cox, Laurence J.; Batten, Michael A.; Carpenter, Stephen R.; Saddleton, Philip A. B.

    2004-12-01

    The threat to aircraft from missiles with imaging infrared seekers has developed more rapidly, and in more countries independently, than the original infrared missile threat. This is, in part, a consequence of the civil sector's demand for high-resolution infrared imagers and the development of computer processors capable of implementing complex image-processing algorithms in real time. Dstl has developed the Fly-In model to analyse the potential effectiveness of existing countermeasures (CM) against imaging infrared seekers and to test new CM approaches before trialling them against surrogate imaging seekers. Validation of the Fly-In model is extremely important, particularly as the newness of the imaging infrared threat means that actual examples of the threat are not available for study. Extensive measurements have been carried out on the appearance of flare CM in different infrared wavebands, and on the effects of lasers on the optics and detector of a surrogate imaging seeker. Other parts of the model are derived from other Dstl models, including the NATO Infrared Airborne Target Model (NIRATAM) and HADES (missile dynamics), which are validated against trials data. Initial studies have shown that existing CM, and those under development, can be very effective against imaging infrared seekers by defeating the seeker's image-processing algorithms. It is already clear that laser CM will play an increasing role in the defence of aircraft, thereby enhancing aircraft survivability. Moreover, this model will aid the military planner in determining the best mix of CM and the tactics for using them.

  17. GRAPHIE: graph based histology image explorer

    PubMed Central

    2015-01-01

    Background: Histology images comprise one of the important sources of knowledge for phenotyping studies in systems biology. However, the annotation and analysis of histological data have remained a manual, subjective and relatively low-throughput process. Results: We introduce the Graph based Histology Image Explorer (GRAPHIE), a visual analytics tool to explore, annotate and discover potential relationships in histology image collections within a biologically relevant context. The design of GRAPHIE is guided by domain experts' requirements and well-known InfoVis mantras. By representing each image with informative features and then visualizing the image collection with a graph, GRAPHIE allows users to effectively explore the image collection. The features were designed to capture localized morphological properties in the given tissue specimen. More importantly, users can perform feature selection in an interactive way to improve the visualization of the image collection and the overall annotation process. Finally, the annotation allows for a better prospective examination of datasets, as demonstrated in the user study. Thus, our design of GRAPHIE allows users to navigate and explore large collections of histology image datasets. Conclusions: We demonstrated the usefulness of our visual analytics approach through two case studies, both of which showed efficient annotation and analysis of a histology image collection. PMID:26330277

  18. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. (Graphical abstract: the main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680

  20. The Influence of University Image on Student Behaviour

    ERIC Educational Resources Information Center

    Alves, Helena; Raposo, Mario

    2010-01-01

    Purpose: The purpose of this paper is to analyse the influence of image on student satisfaction and loyalty. Design/methodology/approach: In order to accomplish the objectives proposed, a model reflecting the influence of image on student satisfaction and loyalty is applied. The model is tested through use of structural equations and the final…

  1. scikit-image: image processing in Python

    PubMed Central

    Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921

  2. scikit-image: image processing in Python.

    PubMed

    van der Walt, Stéfan; Schönberger, Johannes L; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org.
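
    A minimal example of the library's documented API (the sample image, smoothing and thresholding choices here are illustrative, not drawn from the paper):

```python
# A typical scikit-image workflow: load a built-in sample image,
# smooth it, and segment it with Otsu's global threshold.
from skimage import data, filters

image = data.camera()                        # built-in 512x512 test image
smoothed = filters.gaussian(image, sigma=2)  # Gaussian smoothing (float output)
thresh = filters.threshold_otsu(smoothed)    # automatic global threshold
binary = smoothed > thresh                   # boolean segmentation mask

print(binary.shape, binary.dtype)
```

The functional, NumPy-array-in/array-out style shown here is the convention throughout the library's API.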

  3. Non-parametric partitioning of SAR images

    NASA Astrophysics Data System (ADS)

    Delyon, G.; Galland, F.; Réfrégier, Ph.

    2006-09-01

    We describe and analyse a generalization of a parametric segmentation technique, adapted to Gamma-distributed SAR images, to a simple non-parametric noise model. The partition is obtained by minimizing the stochastic complexity of a version of the SAR image quantized to Q levels, which leads to a criterion with no parameters to be tuned by the user. We analyse the reliability of the proposed approach on synthetic images and study the quality of the obtained partition for different possible strategies; in particular, we discuss the reliability of the proposed optimization procedure. Finally, we precisely study the performance of the proposed approach in comparison with the statistical parametric technique adapted to Gamma noise. These studies are conducted by analysing the number of misclassified pixels, the standard Hausdorff distance and the number of estimated regions.
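
    The quantization step on which the criterion operates can be sketched as follows (a hedged illustration; the stochastic-complexity minimization itself is not reproduced, and the equally-populated quantile binning shown here is one plausible choice, not necessarily the authors'):

```python
# Sketch: quantize a speckle-like (Gamma-distributed) image to Q levels.
import numpy as np

def quantize(image, Q):
    """Map pixel values to Q equally populated levels (quantile bins)."""
    edges = np.quantile(image, np.linspace(0, 1, Q + 1)[1:-1])
    return np.digitize(image, edges)  # integer labels in 0..Q-1

rng = np.random.default_rng(1)
sar = rng.gamma(shape=1.0, scale=1.0, size=(32, 32))  # synthetic speckle
labels = quantize(sar, Q=4)
print(np.unique(labels))
```

The partitioning criterion would then be evaluated on the label image rather than on the raw intensities, which is what removes the dependence on a parametric noise model.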

  4. X-Ray Imaging

    MedlinePlus

    X-ray imaging is perhaps the most familiar type of imaging. Images produced by X-rays are due to the different absorption rates of ...

  5. Split image optical display

    DOEpatents

    Veligdan, James T.

    2007-05-29

    A video image is displayed from an optical panel by splitting the image into a plurality of image components, and then projecting the image components through corresponding portions of the panel to collectively form the image. Depth of the display is correspondingly reduced.

  6. Split image optical display

    DOEpatents

    Veligdan, James T.

    2005-05-31

    A video image is displayed from an optical panel by splitting the image into a plurality of image components, and then projecting the image components through corresponding portions of the panel to collectively form the image. Depth of the display is correspondingly reduced.

  7. Developmental Meta-Analyses of the Functional Neural Correlates of Bipolar Disorder

    PubMed Central

    Wegbreit, Ezra; Cushman, Grace K.; Puzia, Megan E.; Weissman, Alexandra B.; Kim, Kerri L.; Laird, Angela R.; Dickstein, Daniel P.

    2015-01-01

    Context Bipolar disorder (BD) is a debilitating mental illness associated with high costs to diagnosed individuals and society. Within the past two decades, increasing numbers of children and adolescents have been diagnosed with BD. While functional magnetic resonance imaging (fMRI) studies have begun to investigate the neural mechanisms underlying BD, few have directly compared BD-youths and BD-adults. Objective To address this gap, we conducted activation likelihood estimation (ALE) meta-analyses directly comparing the voxel-wise convergence of fMRI findings in BD-youths versus BD-adults, both relative to healthy control (HC) participants. We hypothesized that BD-youths (<18 years old) would show greater convergence of amygdala hyper-activation and prefrontal cortical hypo-activation versus BD-adults. Data Sources PubMed and PsycINFO databases were searched through July 2013 for original, task-related coordinate-based fMRI articles. Study Selection 21 pediatric studies, 73 adult studies, and 2 studies containing distinct pediatric and adult groups within the same study met inclusion criteria for our ALE analyses. Data Extraction and Synthesis Coordinates of significant between-group differences were extracted from each published study. Recent improvements in GingerALE software were employed to perform direct comparisons of pediatric and adult fMRI findings. Results Analyses of emotional face recognition fMRI studies showed significantly greater convergence of amygdala hyper-activation among BD-youths than BD-adults. More broadly, analyses of fMRI studies employing emotional stimuli showed significantly greater convergence of hyper-activation among BD-youths than BD-adults in the inferior frontal gyrus and precuneus. 
In contrast, analyses of fMRI studies employing non-emotional cognitive tasks and also analyses aggregating emotional and non-emotional tasks showed significantly greater convergence of hypo-activation among BD-youths than BD-adults in

  8. Enhancing forensic science with spectroscopic imaging

    NASA Astrophysics Data System (ADS)

    Ricci, Camilla; Kazarian, Sergei G.

    2006-09-01

    This presentation outlines the research we are developing in the area of Fourier Transform Infrared (FTIR) spectroscopic imaging, with a focus on materials of forensic interest. FTIR spectroscopic imaging has recently emerged as a powerful tool for the characterisation of heterogeneous materials. FTIR imaging relies on the ability of the military-developed infrared array detector to simultaneously measure spectra from thousands of different locations in a sample. A recently developed application of FTIR imaging in ATR (Attenuated Total Reflection) mode has demonstrated the ability of this method to achieve spatial resolution beyond the diffraction limit of infrared light in air. Chemical visualisation with enhanced spatial resolution in micro-ATR mode broadens the range of materials studied with FTIR imaging, with applications to pharmaceutical formulations and biological samples. Macro-ATR imaging has also been developed for chemical imaging analysis of large-surface-area samples and was applied to analyse the surface of human skin (e.g. a finger), counterfeit tablets, textile materials (clothing), etc. This approach demonstrated the ability of the imaging method to detect trace materials attached to the surface of the skin, and may also prove a valuable tool in the detection of traces of explosives left or trapped on the surfaces of different materials. The FTIR imaging method is substantially superior to many other imaging methods due to the inherent chemical specificity of infrared spectroscopy and the fast acquisition times of the technique. Our preliminary data demonstrate that this methodology will provide a non-destructive detection method that could relate evidence to its source. This will be important in a wider crime prevention programme. 
In summary, intrinsic chemical specificity and enhanced visualising capability of FTIR spectroscopic imaging open a window of opportunities for counter-terrorism and crime-fighting, with applications ranging

  9. Methods and Procedures for Shielding Analyses for the SNS

    SciTech Connect

    Gallmeier, Franz X.; Iverson, Erik B.; Remec, Igor; Lu, Wei; Popova, Irina

    2014-01-01

    In order to provide radiologically safe Spallation Neutron Source operation, shielding analyses are performed according to Oak Ridge National Laboratory internal regulations and to comply with the Code of Federal Regulations. An overview is presented of on-going shielding work for the accelerator facility and neutron beam lines, of the methods used for the analyses, and of the associated procedures and regulations.

  10. Smart Image Enhancement Process

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J. (Inventor); Rahman, Zia-ur (Inventor); Woodell, Glenn A. (Inventor)

    2012-01-01

    Contrast and lightness measures are used to first classify the image as either non-turbid or turbid. If turbid, the original image is enhanced to generate a first enhanced image. If non-turbid, the original image is classified in terms of a merged contrast/lightness score based on the contrast and lightness measures. The non-turbid image is enhanced to generate a second enhanced image when a poor contrast/lightness score is associated with it. When the second enhanced image also has a poor contrast/lightness score, it is enhanced to generate a third enhanced image. A sharpness measure is then computed for one image selected from (i) the non-turbid image, (ii) the first enhanced image, (iii) the second enhanced image when a good contrast/lightness score is associated with it, and (iv) the third enhanced image. If the selected image is not sharp, it is sharpened to generate a sharpened image. The final image is selected from the selected image and the sharpened image.
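
    The selection cascade can be sketched schematically as follows (all measures, thresholds and the enhancement step are illustrative placeholders, not the values of the patented method):

```python
# Schematic sketch of the enhancement cascade: classify by simple
# contrast/lightness measures, enhance up to twice more while the
# merged score stays poor. The enhancement here is a plain linear
# stretch, standing in for the patented processing.
import numpy as np

def lightness(img):
    return float(img.mean())

def contrast(img):
    return float(img.std())

def enhance(img):
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo + 1e-9)   # linear stretch to [0, 1]

def smart_enhance(img, turbid_thresh=0.15, good_score=0.5):
    if contrast(img) < turbid_thresh:      # 'turbid' branch: enhance once
        img = enhance(img)
    for _ in range(2):                     # up to two further passes
        score = 0.5 * contrast(img) + 0.5 * (1 - abs(lightness(img) - 0.5))
        if score >= good_score:            # good merged score: stop
            break
        img = enhance(img)
    return img

rng = np.random.default_rng(2)
murky = 0.4 + 0.02 * rng.random((16, 16))  # low-contrast input image
out = smart_enhance(murky)
print(out.min(), out.max())
```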

  11. SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES

    SciTech Connect

    Flach, G.

    2014-10-28

    PORFLOW-related analyses supporting a sensitivity analysis for Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded in a piecewise manner from the top and bottom simultaneously. The current analyses employ a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses which may be useful in determining the distribution of Tc-99 in the various SDUs throughout time and in determining flow balances for the SDUs.

  12. Analysing harmonic motions with an iPhone’s magnetometer

    NASA Astrophysics Data System (ADS)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone's (or iPad's) magnetometer. The experiment consists of detecting magnetic field variations with the iPhone's magnetometer sensor. A graph of the harmonic motion is displayed directly on the iPhone's screen using the Sensor Kinetics application. Data from this application were analysed with Eureqa software to establish the equation of the harmonic motion. The analyses show that using an iPhone's magnetometer to analyse harmonic motion is a practical and effective method for small oscillations at frequencies below 15-20 Hz.
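
    The curve-fitting step can be sketched as follows (a hedged stand-in for the Eureqa analysis: synthetic data replace the magnetometer record, and a linear least-squares fit on a sin/cos basis at a known frequency replaces symbolic regression):

```python
# Recover amplitude and phase of a harmonic signal by linear least
# squares: model A*sin(wt) + B*cos(wt), then amplitude = sqrt(A^2+B^2).
import numpy as np

f = 2.0                                    # oscillation frequency, Hz
t = np.linspace(0, 5, 500)                 # 100 Hz sampling, 5 s record
signal = 3.0 * np.sin(2 * np.pi * f * t + 0.7) + \
         0.1 * np.random.default_rng(3).normal(size=t.size)  # noisy data

w = 2 * np.pi * f
basis = np.column_stack([np.sin(w * t), np.cos(w * t)])
(A, B), *_ = np.linalg.lstsq(basis, signal, rcond=None)
amplitude = np.hypot(A, B)                 # true value: 3.0
phase = np.arctan2(B, A)                   # true value: 0.7 rad

print(f"amplitude ~ {amplitude:.2f}, phase ~ {phase:.2f} rad")
```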

  13. What Is an Image?

    ERIC Educational Resources Information Center

    Gerber, Andrew J.; Peterson, Bradley S.

    2008-01-01

    The article helps to understand the interpretation of an image by presenting as to what constitutes an image. A common feature in all images is the basic physical structure that can be described with a common set of terms.

  14. Inertial imaging with nanomechanical systems.

    PubMed

    Hanay, M Selim; Kelber, Scott I; O'Connell, Cathal D; Mulvaney, Paul; Sader, John E; Roukes, Michael L

    2015-04-01

    Mass sensing with nanoelectromechanical systems has advanced significantly during the last decade. With nanoelectromechanical systems sensors it is now possible to carry out ultrasensitive detection of gaseous analytes, to achieve atomic-scale mass resolution and to perform mass spectrometry on single proteins. Here, we demonstrate that the spatial distribution of mass within an individual analyte can be imaged--in real time and at the molecular scale--when it adsorbs onto a nanomechanical resonator. Each single-molecule adsorption event induces discrete, time-correlated perturbations to all modal frequencies of the device. We show that by continuously monitoring a multiplicity of vibrational modes, the spatial moments of mass distribution can be deduced for individual analytes, one-by-one, as they adsorb. We validate this method for inertial imaging, using both experimental measurements of multimode frequency shifts and numerical simulations, to analyse the inertial mass, position of adsorption and the size and shape of individual analytes. Unlike conventional imaging, the minimum analyte size detectable through nanomechanical inertial imaging is not limited by wavelength-dependent diffraction phenomena. Instead, frequency fluctuation processes determine the ultimate attainable resolution. Advanced nanoelectromechanical devices appear capable of resolving molecular-scale analytes.

  15. Analyses and Measures of GPR Signal with Superimposed Noise

    NASA Astrophysics Data System (ADS)

    Chicarella, Simone; Ferrara, Vincenzo; D'Atanasio, Paolo; Frezza, Fabrizio; Pajewski, Lara; Pavoncello, Settimio; Prontera, Santo; Tedeschi, Nicola; Zambotti, Alessandro

    2014-05-01

    The influence of EM noise and harsh environmental conditions on GPR surveys has been examined analytically [1]. In the case of a pulse-radar GPR, many unwanted signals, such as stationary clutter, non-stationary clutter, random noise and time jitter, affect the measured signal. When the GPR is motionless, stationary clutter is the dominant signal component, due to reflections from static objects other than the investigated target and to direct antenna coupling. Moving objects, e.g. persons and vehicles, and the swaying of tree crowns produce non-stationary clutter. Device-internal noise and narrowband jamming are two potential sources of random noise, while trigger instabilities generate random jitter. In order to estimate the effective influence of these noise components, we organized experimental measurement setups. At first, for the case of basic GPR detection, we evaluated simple image processing of the radargram. In the future, we foresee experimental measurements for detection of the Doppler frequency changes induced by movements of targets (such as physiological movements of survivors under debris). We obtain radargram image processing using the GSSI SIR® 2000 GPR system together with a UWB UHF GPR antenna (SUB-ECHO HBD 300, a model manufactured by the Radarteam company). Our work includes both characterization of the GPR signal without (or almost without) superimposed noise and the effect of jamming originating from the coexistence of a different radio signal. For characterizing the GPR signal, we organized a measurement setup that includes the following instruments: a Rohde & Schwarz FSP 30 spectrum analyser, operating in the frequency range 9 kHz - 30 GHz; a Huber Suhner Sucoflex 104 cable (10 MHz - 18 GHz); and a Rohde & Schwarz HL050 antenna (bandwidth: 850 MHz to 26.5 GHz). The next analysis of superimposed jamming will examine two different signal sources: by a cellular phone and by a

  16. Coordinated in Situ Analyses of Organic Nanoglobules in the Sutter's Mill Meteorite

    NASA Technical Reports Server (NTRS)

    Nakamura-Messenger, K.; Messenger, S.; Keller, L. P.; Clemett, S. J.; Nguyen, A. N.; Gibson, E. K.

    2013-01-01

    The Sutter's Mill meteorite is a newly fallen carbonaceous chondrite that was collected and curated quickly after its fall. Preliminary petrographic and isotopic investigations suggest affinities to the CM2 carbonaceous chondrites. The primitive nature of this meteorite and its rapid recovery provide an opportunity to investigate primordial solar system organic matter in a unique new sample. Here we report in-situ analyses of organic nanoglobules in the Sutter's Mill meteorite using UV fluorescence imaging, Fourier-transform infrared spectroscopy (FTIR), scanning transmission electron microscopy (STEM), NanoSIMS, and ultrafast two-step laser mass spectrometry (ultra-L2MS).

  17. Display depth analyses with the wave aberration for the auto-stereoscopic 3D display

    NASA Astrophysics Data System (ADS)

    Gao, Xin; Sang, Xinzhu; Yu, Xunbo; Chen, Duo; Chen, Zhidong; Zhang, Wanlu; Yan, Binbin; Yuan, Jinhui; Wang, Kuiru; Yu, Chongxiu; Dou, Wenhua; Xiao, Liquan

    2016-07-01

    Because aberration severely affects the display performance of the auto-stereoscopic 3D display, diffraction theory is used to analyse the diffraction field distribution and the display depth through aberration analysis. Based on the proposed method, the display depth of central and marginal reconstructed images is discussed. The experimental results agree with the theoretical analyses: increasing the viewing distance or decreasing the lens aperture improves the display depth. Different viewing distances and an LCD with two lens arrays are used to verify the conclusion.

  18. To Image...or Not to Image?

    ERIC Educational Resources Information Center

    Bruley, Karina

    1996-01-01

    Provides a checklist of considerations for installing document image processing with an electronic document management system. Other topics include scanning; indexing; the image file life cycle; benefits of imaging; document-driven workflow; and planning for workplace changes like postsorting, creating a scanning room, redeveloping job tasks and…

  19. Filter for biomedical imaging and image processing

    NASA Astrophysics Data System (ADS)

    Mondal, Partha P.; Rajan, K.; Ahmad, Imteyaz

    2006-07-01

    Image filtering techniques have numerous potential applications in biomedical imaging and image processing. The design of filters largely depends on a priori knowledge about the type of noise corrupting the image, which makes standard filters application specific. Widely used filters such as average, Gaussian and Wiener reduce noisy artifacts by smoothing; however, this operation normally smooths the edges as well. On the other hand, sharpening filters enhance the high-frequency details, making the image non-smooth. An integrated general approach to designing a finite impulse response filter based on Hebbian learning is proposed for optimal image filtering. This algorithm exploits the inter-pixel correlation by updating the filter coefficients using Hebbian learning, and is made iterative to achieve efficient learning from the neighborhood pixels. The algorithm performs optimal smoothing of the noisy image by preserving high-frequency as well as low-frequency features. Evaluation results show that the proposed finite impulse response filter is robust under various noise distributions such as Gaussian, salt-and-pepper and speckle noise. Furthermore, the proposed approach does not require any a priori knowledge about the type of noise. The number of unknown parameters is small, and most of them are adaptively obtained from the processed image. The proposed filter is successfully applied to image reconstruction in a positron emission tomography (PET) imaging modality. The images reconstructed by the proposed algorithm are found to be superior in quality to those reconstructed by existing PET image reconstruction methodologies.
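
    The coefficient-update idea can be sketched as follows (an illustrative simplification, not the paper's exact rule: a 3x3 kernel nudged by the correlation between centre and neighbour pixels, renormalized to unity DC gain):

```python
# Sketch of an iterative, Hebbian-style FIR filter: the kernel starts as
# a mean filter and each coefficient is increased in proportion to the
# correlation between the image and its shifted copy (the Hebbian term).
import numpy as np

def hebbian_filter(img, iters=5, eta=0.01):
    kernel = np.full((3, 3), 1.0 / 9.0)       # start from a mean filter
    for _ in range(iters):
        out = np.zeros_like(img)
        corr = np.zeros((3, 3))
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                out += kernel[dy + 1, dx + 1] * shifted
                corr[dy + 1, dx + 1] = np.mean(img * shifted)  # Hebbian term
        kernel += eta * corr                  # reinforce correlated taps
        kernel /= kernel.sum()                # keep unity DC gain
        img = out
    return img, kernel

rng = np.random.default_rng(4)
noisy = 0.5 + 0.1 * rng.normal(size=(32, 32))
smoothed, kernel = hebbian_filter(noisy)
print(kernel.sum(), smoothed.std() < noisy.std())
```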

  20. Analysis of imaging quality under the systematic parameters for thermal imaging system

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Jin, Weiqi

    2009-07-01

    The integration of thermal imaging system and radar system could increase the range of target identification as well as strengthen the accuracy and reliability of detection, which is a state-of-the-art and mainstream integrated system to search any invasive target and guard homeland security. When it works, there is, however, one defect existing of what the thermal imaging system would produce affected images which could cause serious consequences when searching and detecting. In this paper, we study and reveal the reason why and how the affected images would occur utilizing the principle of lightwave before establishing mathematical imaging model which could meet the course of ray transmitting. In the further analysis, we give special attentions to the systematic parameters of the model, and analyse in detail all parameters which could possibly affect the imaging process and the function how it does respectively. With comprehensive research, we obtain detailed information about the regulation of diffractive phenomena shaped by these parameters. Analytical results have been convinced through the comparison between experimental images and MATLAB simulated images, while simulated images based on the parameters we revised to judge our expectation have good comparability with images acquired in reality.

  1. Underwater image quality enhancement through composition of dual-intensity images and Rayleigh-stretching.

    PubMed

    Abdul Ghani, Ahmad Shahrizan; Mat Isa, Nor Ashidi

    2014-01-01

    The quality of underwater images is poor due to the properties of water and its impurities, which cause attenuation of light travelling through the water medium, resulting in low contrast, blur, inhomogeneous lighting and color diminishing of the underwater images. This paper proposes a method of enhancing the quality of underwater images. The proposed method consists of two stages. In the first stage, a contrast correction technique is applied to the image: the modified Von Kries hypothesis is applied and the image is stretched into two different intensity images at the average value with respect to the Rayleigh distribution. In the second stage, a color correction technique is applied: the image is first converted into the hue-saturation-value (HSV) color model, and modification of the color component increases the image color performance. Qualitative and quantitative analyses indicate that the proposed method outperforms other state-of-the-art methods in terms of contrast, details and noise reduction. PMID:25674483
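
    The dual-intensity idea can be sketched in simplified form as follows (assumptions: a linear stretch stands in for the Rayleigh-shaped mapping, and the HSV color-correction stage is omitted):

```python
# Sketch of dual-intensity stretching: stretch the histogram twice,
# split at the image mean (towards dark and towards bright), then
# average the two stretched images.
import numpy as np

def dual_intensity_stretch(img):
    m = img.mean()
    # lower stretch: expand values below the mean across [0, 1]
    lower = np.clip((img - img.min()) / (m - img.min() + 1e-9), 0, 1)
    # upper stretch: expand values above the mean across [0, 1]
    upper = np.clip((img - m) / (img.max() - m + 1e-9), 0, 1)
    return 0.5 * (lower + upper)

rng = np.random.default_rng(5)
hazy = 0.45 + 0.05 * rng.random((32, 32))   # low-contrast, underwater-like
out = dual_intensity_stretch(hazy)
print(out.min(), out.max())
```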

  3. 'Big Data' can make a big difference: Applying Big Data to National Scale Change Analyses

    NASA Astrophysics Data System (ADS)

    Mueller, N. R.; Curnow, S.; Melrose, R.; Purss, M. B.; Lewis, A.

    2013-12-01

    The traditional method of change detection in remote sensing is based on acquiring a pair of images and conducting a set of analyses to determine what is different between them. The end result is a single change analysis for a single time period; while this may be repeated several times, it is generally a time-consuming, often manual process providing a series of snapshots of change. As datasets become larger and time series analyses become more sophisticated, these traditional methods of analysis are unviable. The Geoscience Australia 'Data Cube' provides a 25-year time series of all Landsat-5 and Landsat-7 data for the entire Australian continent. Each image is orthorectified to a standard set of pixel locations and is fully calibrated to a measure of surface reflectance (the 25 m Australian Reflectance Grid [ARG25]). These surface reflectance measurements are directly comparable between different scenes, regardless of whether they are sourced from the Landsat-5 TM instrument or the Landsat-7 ETM+. The advantage of the Data Cube environment lies in the ability to apply an algorithm to every pixel across Australia (some 10^13 pixels) in a consistent way, enabling change analysis for every acquired observation. This provides a framework to analyse change through time on a scene-to-scene basis, and across national-scale areas, for the entire duration of the archive. Two example applications of the Data Cube are described here: surface water extent mapping across Australia, and vegetation condition mapping across the Murray-Darling Basin, Australia's largest river system. Ongoing water mapping and vegetation condition mapping are required by the Australian government to produce information products for a range of requirements, including ecological monitoring and emergency management risk planning. With a 25-year archive of Landsat-5 and Landsat-7 imagery hosted in an efficient High Performance Computing (HPC) environment, high speed analyses of long time
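
    Applying one algorithm consistently to every pixel of a calibrated time-series stack can be sketched as follows (a toy NDWI-style water test on synthetic reflectance; band values and the threshold are illustrative, not the Data Cube's production algorithm):

```python
# Sketch: evaluate a per-pixel water test for every acquisition in a
# (time, y, x) reflectance stack, then summarize as a per-pixel
# inundation frequency over the whole record.
import numpy as np

rng = np.random.default_rng(6)
T, H, W = 12, 8, 8                              # 12 acquisitions, 8x8 tile
green = rng.uniform(0.05, 0.25, (T, H, W))      # green-band reflectance
nir = rng.uniform(0.2, 0.4, (T, H, W))          # near-infrared reflectance
nir[:, 2:5, 2:5] = 0.02                         # persistently wet patch

ndwi = (green - nir) / (green + nir + 1e-9)     # NDWI per pixel per scene
water = ndwi > 0.0                              # simple water threshold
frequency = water.mean(axis=0)                  # fraction of scenes flagged wet

print(frequency[3, 3], frequency[0, 0])
```

The same vectorized expression scales from this toy tile to a continental stack; only the array storage and scheduling change.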

  4. Identifying neural correlates of visual consciousness with ALE meta-analyses.

    PubMed

    Bisenius, Sandrine; Trapp, Sabrina; Neumann, Jane; Schroeter, Matthias L

    2015-11-15

    Neural correlates of consciousness (NCC) have been a topic of study for nearly two decades. In functional imaging studies, several regions have been proposed to constitute possible candidates for NCC, but as of yet, no quantitative summary of the literature on NCC has been done. The question whether single (striate or extrastriate) regions or a network consisting of extrastriate areas that project directly to fronto-parietal regions are necessary and sufficient neural correlates for visual consciousness is still highly debated [e.g., Rees et al., 2002, Nat Rev. Neurosci 3, 261-270; Tong, 2003, Nat Rev. Neurosci 4, 219-229]. The aim of this work was to elucidate this issue and give a synopsis of the present state of the art by conducting systematic and quantitative meta-analyses across functional magnetic resonance imaging (fMRI) studies using several standard paradigms for conscious visual perception. In these paradigms, consciousness is operationalized via perceptual changes, while the visual stimulus remains invariant. An activation likelihood estimation (ALE) meta-analysis was performed, representing the best approach for voxel-wise meta-analyses to date. In addition to computing a meta-analysis across all paradigms, separate meta-analyses on bistable perception and masking paradigms were conducted to assess whether these paradigms show common or different NCC. For the overall meta-analysis, we found significant clusters of activation in inferior and middle occipital gyrus; fusiform gyrus; inferior temporal gyrus; caudate nucleus; insula; inferior, middle, and superior frontal gyri; precuneus; as well as in inferior and superior parietal lobules. These results suggest a subcortical-extrastriate-fronto-parietal network rather than a single region that constitutes the necessary NCC. 
The results of our exploratory paradigm-specific meta-analyses suggest that this subcortical-extrastriate-fronto-parietal network might be differentially activated as a function of the

  6. Analysing soil moisture reactions to precipitation for soil moisture regionalization

    NASA Astrophysics Data System (ADS)

    Engels, S.; Marschner, B.; Zepp, H.

    2012-04-01

    Storage and turnover of water in soils have an important impact on runoff generation. To represent soil moisture in precipitation-runoff models, data with high spatial and temporal resolution are required. In a mesoscale catchment (about 300 km2) in the hilly landscape of the Sauerland (western Germany), an online monitoring network collects data from 48 pF-meters and four precipitation collectors. Because these data are discrete in time and space, covering only a few sites, an upscaling from local point measurements to the mesoscale is necessary for every point in time. Our approach to regionalizing the actual soil moisture does not merely interpolate the measurements of observed random variables, as classic geostatistical methods such as kriging do, but uses locally variable properties of the study area to support the estimation. These properties are, on the one hand, temporally constant parameters such as land use, soil properties and topography from satellite images, soil maps and a digital elevation model and, on the other hand, temporally variable parameters derived from solar radiation data and precipitation time series. The regionalization model thus incorporates results of these time series, such as the time between a precipitation event and the depth-dependent soil moisture reaction. To achieve this, precipitation time series are separated into events and soil moisture time series are divided into intervals of increasing, decreasing and constant soil moisture. Intervals with decreasing soil moisture are matched to preceding precipitation events. Then characteristic attributes, such as the time between a precipitation event and the depth-dependent decrease in soil moisture, are calculated. The results are used to develop a soil moisture regionalization model based on temporally constant and dynamic parameters. The nonlinear relation between these parameters and soil moisture is learned from the given data, e.g. by an artificial neural network.
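    The event-matching step described in this abstract can be sketched roughly as follows (illustrative Python, not the authors' model; the timestamps and function names are invented):

    ```python
    # Hypothetical sketch: match each interval of decreasing soil moisture to
    # the most recent preceding precipitation event and compute the reaction
    # lag. Timestamps are hours; all data here are illustrative.

    def match_events(precip_events, drying_intervals):
        """precip_events: list of (start, end) of precipitation events.
        drying_intervals: list of (start, end) of decreasing soil moisture.
        Returns a list of (interval_start, lag_hours) for matched intervals."""
        matches = []
        for d_start, d_end in drying_intervals:
            # candidate events that ended before the drying interval began
            prior = [(s, e) for s, e in precip_events if e <= d_start]
            if prior:
                s, e = max(prior, key=lambda ev: ev[1])  # most recent event
                matches.append((d_start, d_start - e))   # lag to soil response
        return matches

    precip = [(0, 3), (24, 26)]
    drying = [(6, 12), (30, 40)]
    print(match_events(precip, drying))  # [(6, 3), (30, 4)]
    ```

    Such lags, computed per depth, would then become dynamic predictors for the regionalization model.
    
    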

  7. Analysing land cover and land use change in the Matobo National Park and surroundings in Zimbabwe

    NASA Astrophysics Data System (ADS)

    Scharsich, Valeska; Mtata, Kupakwashe; Hauhs, Michael; Lange, Holger; Bogner, Christina

    2016-04-01

    Natural forests are threatened worldwide, and their protection in national parks is therefore essential. Here, we investigate how this protection status affects land cover. To answer this question, we analyse the surface reflectance of three Landsat images of Matobo National Park and its surroundings in Zimbabwe, from 1989, 1998 and 2014, to detect changes in land cover in this region. To account for the rolling countryside and the resulting prominent shadows, a topographic correction of the surface reflectance was required. Inferring land cover change requires ground data not only for the current satellite image but also for the older ones; in particular, for the older images no recent field study could reconstruct these data reliably. In our study we follow the idea that land cover classes of pixels in current images can be transferred to the equivalent pixels of older ones if no changes occurred in the meantime. We therefore combine unsupervised clustering with supervised classification as follows. First, we produce a land cover map for 2014. Second, we cluster the images with clara, an algorithm similar to k-means but suitable for large data sets; the best number of classes was determined to be four. Third, we locate unchanged pixels in the images of 1989 and 1998 with change vector analysis, and for these pixels we transfer the corresponding cluster label from 2014 to 1989 and 1998. The classified pixels then serve as training data for supervised classification with random forest, carried out for each image separately. Finally, we derive land cover classes from the 2014 Landsat image, photographs and Google Earth and transfer them to the other two images. The resulting classes are shrub land; forest/shallow waters; bare soils/fields with some trees/shrubs; and bare light soils/rocks, fields and settlements. The three classifications are then compared and land changes are mapped.
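    The label-transfer step built on change vector analysis can be illustrated with a minimal sketch (hypothetical Python; band values, class names and the threshold are invented, not taken from the study):

    ```python
    import math

    # Illustrative sketch of the change-vector step: pixels whose spectral
    # change magnitude between two dates falls below a threshold are treated
    # as unchanged, and their 2014 class label is transferred to the older
    # image; changed pixels are left unlabeled for later classification.

    def transfer_labels(bands_old, bands_new, labels_new, threshold):
        labels_old = []
        for v_old, v_new, lab in zip(bands_old, bands_new, labels_new):
            magnitude = math.dist(v_old, v_new)  # length of the change vector
            labels_old.append(lab if magnitude < threshold else None)
        return labels_old

    old = [(0.10, 0.30), (0.40, 0.20)]
    new = [(0.11, 0.31), (0.10, 0.60)]
    labels = ["shrubland", "forest"]
    print(transfer_labels(old, new, labels, threshold=0.1))
    # ['shrubland', None]  -> second pixel changed, needs classification
    ```

    The `None` pixels would then be filled in by the supervised random-forest classifier trained on the transferred labels.
    
    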

  8. Imaging of testicular tumours.

    PubMed

    Owens, E J; Kabala, J; Goddard, P

    2004-01-01

    This article reviews the diagnosis, pathology and imaging of testicular tumours, predominantly germ cell tumours, and discusses the imaging techniques used in their diagnosis, staging and surveillance.

  9. Noninvasive Imaging of Experimental Lung Fibrosis

    PubMed Central

    Chen, Huaping; Ambalavanan, Namasivayam; Liu, Gang; Antony, Veena B.; Ding, Qiang; Nath, Hrudaya; Eary, Janet F.; Thannickal, Victor J.

    2015-01-01

    Small animal models of lung fibrosis are essential for unraveling the molecular mechanisms underlying human fibrotic lung diseases; additionally, they are useful for preclinical testing of candidate antifibrotic agents. The current end-point measures of experimental lung fibrosis involve labor-intensive histological and biochemical analyses. These measures fail to account for dynamic changes in the disease process in individual animals and are limited by the need for large numbers of animals for longitudinal studies. The emergence of noninvasive imaging technologies provides exciting opportunities to image lung fibrosis in live animals as often as needed and to longitudinally track the efficacy of novel antifibrotic compounds. Data obtained by noninvasive imaging provide complementary information to histological and biochemical measurements. In addition, the use of noninvasive imaging in animal studies reduces animal usage, thus satisfying animal welfare concerns. In this article, we review these new imaging modalities with the potential for evaluation of lung fibrosis in small animal models. Such techniques include micro-computed tomography (micro-CT), magnetic resonance imaging, positron emission tomography (PET), single photon emission computed tomography (SPECT), and multimodal imaging systems including PET/CT and SPECT/CT. It is anticipated that noninvasive imaging will be increasingly used in animal models of fibrosis to gain insights into disease pathogenesis and as preclinical tools to assess drug efficacy. PMID:25679265

  10. Imaging approaches to optimize molecular therapies.

    PubMed

    Weissleder, Ralph; Schwaiger, Markus C; Gambhir, Sanjiv Sam; Hricak, Hedvig

    2016-09-01

    Imaging, including its use for innovative tissue sampling, is slowly being recognized as playing a pivotal role in drug development, clinical trial design, and more effective delivery and monitoring of molecular therapies. The challenge is that, while a considerable number of new imaging technologies and new targeted tracers have been developed for cancer imaging in recent years, the technologies are neither evenly distributed nor evenly implemented. Furthermore, many imaging innovations are not validated and are not ready for widespread use in drug development or in clinical trial designs. Inconsistent and often erroneous use of terminology related to quantitative imaging biomarkers has also played a role in slowing their development and implementation. We examine opportunities for, and challenges of, the use of imaging biomarkers to facilitate development of molecular therapies and to accelerate progress in clinical trial design. In the future, in vivo molecular imaging, image-guided tissue sampling for mutational analyses ("high-content biopsies"), and noninvasive in vitro tests ("liquid biopsies") will likely be used in various combinations to provide the best possible monitoring and individualized treatment plans for cancer patients. PMID:27605550

  11. Far Ultraviolet Imaging from the Image Spacecraft

    NASA Technical Reports Server (NTRS)

    Mende, S. B.; Heetderks, H.; Frey, H. U.; Lampton, M.; Geller, S. P.; Stock, J. M.; Abiad, R.; Siegmund, O. H. W.; Tremsin, A. S.; Habraken, S.

    2000-01-01

    Direct imaging of the magnetosphere by the IMAGE spacecraft will be supplemented by observation of the global aurora. The IMAGE satellite instrument complement includes three far-ultraviolet (FUV) instruments. The Wideband Imaging Camera (WIC) will provide broadband ultraviolet images of the aurora at maximum spatial and temporal resolution by imaging the LBH N2 bands. The Spectrographic Imager (SI), a novel form of monochromatic imager, will image the aurora filtered by wavelength; the proton-induced component of the aurora will be imaged separately by measuring Doppler-shifted Lyman-alpha. Finally, the GEO instrument will observe the distribution of geocoronal emission to obtain the neutral background density source for charge exchange in the magnetosphere. The FUV instruments look radially outward from the rotating IMAGE satellite and therefore spend only a short time observing the aurora and the Earth during each spin. To maximize photon collection efficiency and make efficient use of the short time available for exposures, the FUV auroral imagers WIC and SI both have wide fields of view and take data continuously as the auroral region passes through the field of view. To minimize data volume, the multiple images are electronically co-added after each is suitably shifted to compensate for the spacecraft rotation. To minimize resolution loss, the images have to be distortion-corrected in real time. The distortion correction is accomplished using high-speed lookup tables that are pre-generated by the on-orbit processor through least-squares fitting to polynomial functions. The instruments were calibrated individually on stationary platforms, mostly in vacuum chambers. Extensive ground-based testing was performed with visible and near-UV simulators mounted on a rotating platform to emulate performance on a rotating spacecraft.
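    The shift-and-add co-addition described in this abstract can be sketched in miniature (hedged Python; the one-dimensional frames, offsets and function name are toy values, not instrument data):

    ```python
    # Sketch of shift-and-add co-addition: each exposure is shifted (here by
    # whole pixels, via a precomputed per-frame offset playing the role of a
    # lookup table) to compensate for spacecraft rotation, then accumulated
    # into a fixed output grid.

    def coadd(frames, offsets, width):
        """frames: list of 1-D pixel rows; offsets: per-frame pixel shift."""
        out = [0] * width
        for frame, off in zip(frames, offsets):
            for i, value in enumerate(frame):
                j = i + off                 # lookup-table style remapping
                if 0 <= j < width:
                    out[j] += value         # accumulate counts
        return out

    frames = [[1, 2, 3, 0], [0, 1, 2, 3]]
    offsets = [0, -1]                       # second exposure rotated by one pixel
    print(coadd(frames, offsets, 4))        # [2, 4, 6, 0]
    ```

    The real instruments perform a two-dimensional, distortion-corrected version of this remapping in onboard hardware.
    
    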

  12. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and the probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for the LabVIEW run-time engine. Operating systems under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB at start-up and ~160 MB in use.
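    The histogram parameters listed in this abstract are standard moments and can be computed in a few lines (illustrative Python, not the LabVIEW routine; the pixel values are arbitrary):

    ```python
    import statistics

    # Minimal sketch of the histogram statistics named above: average
    # brightness, standard deviation, skewness, excess kurtosis and median,
    # computed directly from the grey values.

    def histogram_stats(pixels):
        n = len(pixels)
        mean = sum(pixels) / n
        var = sum((p - mean) ** 2 for p in pixels) / n
        std = var ** 0.5
        skew = sum((p - mean) ** 3 for p in pixels) / (n * std ** 3)
        kurt = sum((p - mean) ** 4 for p in pixels) / (n * std ** 4) - 3
        return {"mean": mean, "std": std, "skewness": skew,
                "kurtosis": kurt, "median": statistics.median(pixels)}

    # A bright outlier pulls the mean up and makes the histogram right-skewed.
    stats = histogram_stats([10, 20, 20, 30, 120])
    print(round(stats["mean"], 1), round(stats["skewness"], 2))
    ```

    A positive skewness, as here, indicates a long tail of bright pixels; thresholding such a tail is exactly the background-removal use case the program supports.
    
    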

  13. Analysis of a multisensor image data set of south San Rafael Swell, Utah

    NASA Technical Reports Server (NTRS)

    Evans, D. L.

    1982-01-01

    A Shuttle Imaging Radar (SIR-A) image of the southern portion of the San Rafael Swell in Utah has been digitized and registered to coregistered Landsat, Seasat, and HCMM thermal-inertia images. The addition of the SIR-A image to the registered data set improves rock-type discrimination in both qualitative and quantitative analyses. Sedimentary units that cannot be distinguished in either image alone can be separated in a combined SIR-A/Seasat image. Discriminant analyses show that classification accuracy improves when the SIR-A image is added to the Landsat images, and improves further when texture information from the Seasat and SIR-A images is included.
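    Why stacking a coregistered radar band helps discrimination can be shown with a toy nearest-centroid rule (hypothetical Python; the class names, band values and centroids are invented for illustration, not from the study):

    ```python
    import math

    # Two rock classes with identical optical signatures but distinct radar
    # backscatter: a nearest-centroid classifier cannot separate them from
    # the optical bands alone, but can once the radar band is stacked on.

    centroids = {
        "sandstone": (0.35, 0.40, 0.10),   # (optical1, optical2, radar)
        "shale":     (0.35, 0.40, 0.60),   # same optically, distinct in radar
    }

    def classify(pixel, bands):
        """bands: indices of the stacked feature vector actually used."""
        def dist(c):
            return math.dist([pixel[i] for i in bands],
                             [centroids[c][i] for i in bands])
        return min(centroids, key=dist)

    pixel = (0.35, 0.40, 0.55)
    print(classify(pixel, bands=[0, 1]))      # optical only: a tie, ambiguous
    print(classify(pixel, bands=[0, 1, 2]))   # with radar band: 'shale'
    ```

    The same logic underlies the reported accuracy gains from adding SIR-A and texture features: each new registered band adds a dimension along which confused classes may separate.
    
    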

  14. Tracing Success: Graphical Methods for Analysing Successful Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Joiner, Richard; Issroff, Kim

    2003-01-01

    The aim of this paper is to evaluate the use of trace diagrams for analysing collaborative problem solving. The paper describes a study where trace diagrams were used to analyse joint navigation in a virtual environment. Ten pairs of undergraduates worked together on a distributed virtual task to collect five flowers using two bees with each…

  15. Training Residential Staff to Conduct Trial-Based Functional Analyses

    ERIC Educational Resources Information Center

    Lambert, Joseph M.; Bloom, Sarah E.; Kunnavatana, S. Shanun; Collins, Shawnee D.; Clay, Casey J.

    2013-01-01

    We taught 6 supervisors of a residential service provider for adults with developmental disabilities to train 9 house managers to conduct trial-based functional analyses. Effects of the training were evaluated with a nonconcurrent multiple baseline. Results suggest that house managers can be trained to conduct trial-based functional analyses with…

  16. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  17. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 2 2013-07-01 2012-07-01 true Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  18. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 41 Public Contracts and Property Management 2 2012-07-01 2012-07-01 false Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  19. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 41 Public Contracts and Property Management 2 2014-07-01 2012-07-01 true Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  20. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 41 Public Contracts and Property Management 2 2011-07-01 2007-07-01 true Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  1. 44 CFR 1.9 - Regulatory impact analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Regulatory impact analyses. 1... HOMELAND SECURITY GENERAL RULEMAKING; POLICY AND PROCEDURES General § 1.9 Regulatory impact analyses. (a) FEMA shall, in connection with any major rule, prepare and consider a Regulatory Impact Analysis....

  2. 9 CFR 590.580 - Laboratory tests and analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Laboratory tests and analyses. 590.580... EGG PRODUCTS INSPECTION INSPECTION OF EGGS AND EGG PRODUCTS (EGG PRODUCTS INSPECTION ACT) Laboratory § 590.580 Laboratory tests and analyses. The official plant, at their expense, shall make tests...

  3. 9 CFR 590.580 - Laboratory tests and analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Laboratory tests and analyses. 590.580... EGG PRODUCTS INSPECTION INSPECTION OF EGGS AND EGG PRODUCTS (EGG PRODUCTS INSPECTION ACT) Laboratory § 590.580 Laboratory tests and analyses. The official plant, at their expense, shall make tests...

  4. 9 CFR 590.580 - Laboratory tests and analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Laboratory tests and analyses. 590.580... EGG PRODUCTS INSPECTION INSPECTION OF EGGS AND EGG PRODUCTS (EGG PRODUCTS INSPECTION ACT) Laboratory § 590.580 Laboratory tests and analyses. The official plant, at their expense, shall make tests...

  5. 9 CFR 590.580 - Laboratory tests and analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Laboratory tests and analyses. 590.580... EGG PRODUCTS INSPECTION INSPECTION OF EGGS AND EGG PRODUCTS (EGG PRODUCTS INSPECTION ACT) Laboratory § 590.580 Laboratory tests and analyses. The official plant, at their expense, shall make tests...

  6. 43 CFR 46.130 - Mitigation measures in analyses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 1 2014-10-01 2014-10-01 false Mitigation measures in analyses. 46.130... Mitigation measures in analyses. (a) Bureau proposed action. The analysis of the proposed action and any... of the effects of any appropriate mitigation measures or best management practices that...

  7. 43 CFR 46.130 - Mitigation measures in analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Mitigation measures in analyses. 46.130... Mitigation measures in analyses. (a) Bureau proposed action. The analysis of the proposed action and any... of the effects of any appropriate mitigation measures or best management practices that...

  8. 43 CFR 46.130 - Mitigation measures in analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 1 2011-10-01 2011-10-01 false Mitigation measures in analyses. 46.130... Mitigation measures in analyses. (a) Bureau proposed action. The analysis of the proposed action and any... of the effects of any appropriate mitigation measures or best management practices that...

  9. 43 CFR 46.130 - Mitigation measures in analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 1 2012-10-01 2011-10-01 true Mitigation measures in analyses. 46.130... Mitigation measures in analyses. (a) Bureau proposed action. The analysis of the proposed action and any... of the effects of any appropriate mitigation measures or best management practices that...

  10. 43 CFR 46.130 - Mitigation measures in analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 1 2013-10-01 2013-10-01 false Mitigation measures in analyses. 46.130... Mitigation measures in analyses. (a) Bureau proposed action. The analysis of the proposed action and any... of the effects of any appropriate mitigation measures or best management practices that...

  11. Rational Analyses of Information Foraging on the Web

    ERIC Educational Resources Information Center

    Pirolli, Peter

    2005-01-01

    This article describes rational analyses and cognitive models of Web users developed within information foraging theory. This is done by following the rational analysis methodology of (a) characterizing the problems posed by the environment, (b) developing rational analyses of behavioral solutions to those problems, and (c) developing cognitive…

  12. What can we do about exploratory analyses in clinical trials?

    PubMed

    Moyé, Lem

    2015-11-01

    The research community has alternately embraced and repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generation. Since the majority of evaluations conducted in clinical trials, with their rich data sets, are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is based not in statistical theory but in pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we examine the characteristics of exploratory analyses from a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from weaknesses in estimation theory. The problem appears to lie in what is actually reported as the p-value. The use of Bayes' theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that leads to the readmission of exploratory analyses to a position of persuasive power in clinical trials.
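    The Bayes-theorem adjustment alluded to here can be made concrete with a standard worked example (not the article's own calculation; the prior, alpha and power are assumed illustrative values):

    ```python
    # Posterior probability that the null hypothesis is true given a
    # "significant" finding, via Bayes' theorem. For a long-shot exploratory
    # hypothesis with a high prior on the null, a nominally significant
    # p < 0.05 result still leaves substantial probability on the null.

    def posterior_null(prior_null, alpha, power):
        """P(H0 | significant) = P(sig|H0)P(H0) / P(sig)."""
        p_sig = alpha * prior_null + power * (1 - prior_null)
        return alpha * prior_null / p_sig

    # 90% prior that the null is true, alpha = 0.05, power = 0.8:
    print(round(posterior_null(prior_null=0.9, alpha=0.05, power=0.8), 3))
    # 0.36 -- far larger than the nominal 0.05 suggests
    ```

    This gap between the nominal p-value and the posterior is one way to quantify why unplanned exploratory "hits" so often fail to reproduce.
    
    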

  13. Analyses of Response-Stimulus Sequences in Descriptive Observations

    ERIC Educational Resources Information Center

    Samaha, Andrew L.; Vollmer, Timothy R.; Borrero, Carrie; Sloman, Kimberly; Pipkin, Claire St. Peter; Bourret, Jason

    2009-01-01

    Descriptive observations were conducted to record problem behavior displayed by participants and to record antecedents and consequences delivered by caregivers. Next, functional analyses were conducted to identify reinforcers for problem behavior. Then, using data from the descriptive observations, lag-sequential analyses were conducted to examine…
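    The core of a lag-sequential analysis is a conditional probability compared against a base rate, which can be sketched briefly (hypothetical Python; the coded event stream and event codes are invented):

    ```python
    # Lag-1 sequential analysis sketch: the probability that a consequence
    # (e.g. caregiver attention, "A") immediately follows problem behavior
    # ("B"), compared with attention's overall base rate in the stream.

    def lag1_probability(events, behavior, consequence):
        follows = sum(1 for a, b in zip(events, events[1:])
                      if a == behavior and b == consequence)
        occurrences = events[:-1].count(behavior)
        return follows / occurrences if occurrences else 0.0

    stream = ["B", "A", "N", "B", "A", "B", "N", "A"]  # N = neutral event
    p = lag1_probability(stream, "B", "A")
    base = stream.count("A") / len(stream)
    print(round(p, 2), round(base, 2))  # conditional probability vs. base rate
    ```

    A conditional probability well above the base rate, as in this toy stream, is the descriptive signature that the functional analysis then tests experimentally.
    
    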

  14. Descriptive Analyses of Pediatric Food Refusal and Acceptance

    ERIC Educational Resources Information Center

    Borrero, Carrie S. W.; Woods, Julia N.; Borrero, John C.; Masler, Elizabeth A.; Lesser, Aaron D.

    2010-01-01

    Functional analyses of inappropriate mealtime behavior typically include conditions to determine if the contingent delivery of attention, tangible items, or escape reinforce food refusal. In the current investigation, descriptive analyses were conducted for 25 children who had been admitted to a program for the assessment and treatment of food…

  15. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 5 2014-10-01 2014-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  16. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 5 2013-10-01 2013-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  17. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 5 2011-10-01 2011-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  18. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  19. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 5 2012-10-01 2012-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  20. Image processing and recognition for biological images

    PubMed Central

    Uchida, Seiichi

    2013-01-01

    This paper reviews image processing and pattern recognition techniques that are useful for analysing bioimages. Although it does not provide technical details, it allows the reader to grasp the main tasks and the typical tools for handling them. Image processing is a large research area aimed at improving the visibility of an input image and extracting valuable information from it. As its main tasks, this paper introduces grey-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition, the technique of classifying an input image into one of a set of predefined classes, is also a large research area; this paper overviews its two main modules, the feature extraction module and the classification module. Throughout the paper, it is emphasized that the bioimage is a very difficult target even for state-of-the-art image processing and pattern recognition techniques, owing to noise, deformation, etc. This paper is intended as a tutorial guide to bridge biology and image processing researchers for further collaboration on such a difficult target. PMID:23560739
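    Of the tasks named in this abstract, binarization has a classic self-contained algorithm, Otsu's method, which can be sketched as follows (illustrative Python on a made-up one-dimensional "image"):

    ```python
    # Otsu's method: choose the grey-level threshold that maximizes the
    # between-class variance of the histogram, i.e. best separates the
    # foreground and background populations.

    def otsu_threshold(pixels, levels=256):
        hist = [0] * levels
        for p in pixels:
            hist[p] += 1
        total = len(pixels)
        sum_all = sum(i * h for i, h in enumerate(hist))
        best_t, best_var = 0, -1.0
        w0, sum0 = 0, 0.0
        for t in range(levels):
            w0 += hist[t]                     # pixels at or below t
            if w0 == 0 or w0 == total:
                continue
            sum0 += t * hist[t]
            mu0 = sum0 / w0                   # mean of the lower class
            mu1 = (sum_all - sum0) / (total - w0)  # mean of the upper class
            var_between = w0 * (total - w0) * (mu0 - mu1) ** 2
            if var_between > best_var:
                best_t, best_var = t, var_between
        return best_t

    image = [12, 14, 13, 200, 210, 205, 15, 198]  # dark background, bright cells
    t = otsu_threshold(image)
    binary = [1 if p > t else 0 for p in image]
    print(t, binary)
    ```

    For bioimages with uneven illumination, a single global threshold like this often fails, which is one concrete instance of the difficulty the paper emphasizes.
    
    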