Science.gov

Sample records for image analyses monitoracao

  1. Digital image analyser for autoradiography

    SciTech Connect

    Muth, R.A.; Plotnick, J.

    1985-05-01

The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high-precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems, and are slow. More moderately priced and reliable video-camera-based systems are available, but their outputs generally do not have the uniformity and stability necessary for high-resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single- and multiple-tracer autoradiography, which they refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCDs which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels, and the data are stored in buffer video memory in less than two seconds. Images can then be transferred to RAM by direct memory-mapping for further processing. Arterial blood curve data and optical-density-calibrated standards data can be entered, and the optical density images can be converted automatically to tracer concentration or functional images. In double-tracer studies, images produced from both exposures can be stored and processed in RAM to yield "pure" individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region-of-interest analysis.
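The conversion from optical density to tracer concentration via co-exposed calibrated standards can be sketched as a simple linear fit. The standard values below are invented for illustration and are not from the paper:

```python
import numpy as np

# Hypothetical calibration standards: optical densities with known
# tracer concentrations (illustrative values only).
std_od = np.array([0.10, 0.25, 0.50, 0.80, 1.20])
std_conc = np.array([1.0, 2.5, 5.0, 8.0, 12.0])  # e.g. in nCi/mg

# Fit a linear optical-density -> concentration mapping from the standards.
slope, intercept = np.polyfit(std_od, std_conc, 1)

def od_to_concentration(od_image):
    """Convert a 512x512 optical-density image to a tracer-concentration image."""
    return slope * od_image + intercept

# Apply to a (synthetic) uniform optical-density image.
od_image = np.full((512, 512), 0.5)
conc_image = od_to_concentration(od_image)
```

In practice the calibration may be nonlinear over the film's dynamic range, in which case a higher-order fit or spline would replace the straight line.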

  2. Analyses of radar images of small craters

    NASA Astrophysics Data System (ADS)

    Greeley, R.; Christensen, P. R.; McHone, J. F.

    1985-04-01

Clouds hide the surface of Venus from all but radar imaging systems, supplemented by limited views from landed spacecraft. Among the surface features likely to be observed by radar are craters that have formed by a variety of processes. In order to assess the radar characteristics of craters, volcanic craters and impact structures on Earth are described as imaged by the Shuttle Imaging Radar (SIR-A) experiment. Although most of the craters are small, this analysis provides insight into the ability to discriminate craters of various origins and provides some basis for interpreting radar images returned from Venus.

  3. Analysing multitemporal SAR images for forest mapping

    NASA Astrophysics Data System (ADS)

    Maghsoudi, Yasser; Collins, Michael J.; Leckie, Donald G.

    2010-10-01

The objective of this paper is twofold: first, to present a generic approach for the analysis of Radarsat-1 multitemporal data and, second, to present a multi-classifier scheme for the classification of multitemporal images. The general approach consists of a preprocessing step and a classification step. In the preprocessing stage, the images are calibrated, registered, and then temporally filtered. The resulting multitemporally filtered images are subsequently used as the input images in the classification step. The first step in classifier design is to select the most informative features from a series of multitemporal SAR images. Most feature selection algorithms seek a single set of features that distinguishes among all the classes simultaneously, and hence achieve limited classification accuracy. In this paper, a class-based feature selection (CBFS) scheme is proposed. In this scheme, instead of selecting features for all classes jointly, the features are selected for each class separately. The selection is based on the Jeffries-Matusita (JM) distance of each class from the rest of the classes. Afterwards, a maximum likelihood classifier is trained on each of the selected feature subsets. Finally, the outputs of the classifiers are combined through a combination mechanism. Experiments are performed on a set of 34 Radarsat-1 images acquired from August 1996 to February 2007. A set of 9 classes in a forest area is used in this study. Classification results confirm the effectiveness of the proposed approach compared with the case of single feature selection. Moreover, the proposed process is generic and hence applicable to different mapping purposes for which a multitemporal set of SAR images is available.
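The class-based selection step, ranking features by the Jeffries-Matusita (JM) distance of one class against the rest, can be sketched as follows. This assumes Gaussian class-conditional statistics per feature, a common simplification; the paper's exact estimator is not given in the abstract:

```python
import numpy as np

def jm_distance(x1, x2):
    """Jeffries-Matusita distance between two 1-D feature samples,
    assuming Gaussian class-conditional distributions."""
    m1, m2 = x1.mean(), x2.mean()
    v1, v2 = x1.var(), x2.var()
    # Bhattacharyya distance for univariate Gaussians
    b = 0.125 * (m1 - m2) ** 2 * 2.0 / (v1 + v2) \
        + 0.5 * np.log((v1 + v2) / (2.0 * np.sqrt(v1 * v2)))
    return 2.0 * (1.0 - np.exp(-b))  # JM saturates at 2 for full separation

def select_features_for_class(X, y, cls, k):
    """Rank features by JM distance of class `cls` vs. the rest; keep top k."""
    scores = [jm_distance(X[y == cls, j], X[y != cls, j])
              for j in range(X.shape[1])]
    return np.argsort(scores)[::-1][:k]
```

A maximum likelihood classifier would then be trained on each class's selected subset, with the per-class outputs merged by the combination mechanism described above.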

  4. Phase contrast image segmentation using a Laue analyser crystal

    NASA Astrophysics Data System (ADS)

    Kitchen, Marcus J.; Paganin, David M.; Uesugi, Kentaro; Allison, Beth J.; Lewis, Robert A.; Hooper, Stuart B.; Pavlov, Konstantin M.

    2011-02-01

    Dual-energy x-ray imaging is a powerful tool enabling two-component samples to be separated into their constituent objects from two-dimensional images. Phase contrast x-ray imaging can render the boundaries between media of differing refractive indices visible, despite them having similar attenuation properties; this is important for imaging biological soft tissues. We have used a Laue analyser crystal and a monochromatic x-ray source to combine the benefits of both techniques. The Laue analyser creates two distinct phase contrast images that can be simultaneously acquired on a high-resolution detector. These images can be combined to separate the effects of x-ray phase, absorption and scattering and, using the known complex refractive indices of the sample, to quantitatively segment its component materials. We have successfully validated this phase contrast image segmentation (PCIS) using a two-component phantom, containing an iodinated contrast agent, and have also separated the lungs and ribcage in images of a mouse thorax. Simultaneous image acquisition has enabled us to perform functional segmentation of the mouse thorax throughout the respiratory cycle during mechanical ventilation.
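The two-image material separation can be illustrated as a per-pixel 2x2 linear inversion. The coefficient matrix below is made up, standing in for values that would be derived from the known complex refractive indices of the two component materials:

```python
import numpy as np

# Made-up 2x2 coefficient matrix standing in for values derived from the
# known complex refractive indices of the two component materials.
A = np.array([[0.8, 0.3],
              [0.2, 0.9]])

def segment_two_materials(img1, img2, coeffs=A):
    """Recover per-pixel thickness maps of two materials from the two
    simultaneously acquired analyser images (linear-mixture sketch)."""
    inv = np.linalg.inv(coeffs)
    t = inv @ np.stack([img1.ravel(), img2.ravel()])  # shape (2, n_pixels)
    return t[0].reshape(img1.shape), t[1].reshape(img1.shape)
```

The simultaneous acquisition matters here: both images must sample the same instant of the respiratory cycle for the per-pixel inversion to be meaningful.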

  5. Analyser-based x-ray imaging for biomedical research

    NASA Astrophysics Data System (ADS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-12-01

Analyser-based imaging (ABI) is one of several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory origins to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with x-ray intensity comparable with that of the current synchrotron environment.

  6. Colony image acquisition and genetic segmentation algorithm and colony analyses

    NASA Astrophysics Data System (ADS)

    Wang, W. X.

    2012-01-01

Colony analysis is used in a large number of fields such as food, dairy, beverages, hygiene, environmental monitoring, water, toxicology, and sterility testing. In order to reduce labor and increase analysis accuracy, many researchers and developers have made efforts to build image analysis systems. The main problems in such systems are image acquisition, image segmentation, and image analysis. In this paper, to acquire colony images with good quality, an illumination box was constructed. In the box, the distances between the lights and the dish, the camera lens and the lights, and the camera lens and the dish are adjusted optimally. Image segmentation is based on a genetic approach that allows the segmentation problem to be treated as a global optimization. After image pre-processing and image segmentation, the colony analyses are performed. The colony image analysis consists of (1) basic colony parameter measurements; (2) colony size analysis; (3) colony shape analysis; and (4) colony surface measurements. All the above visual colony parameters can be selected and combined to form new engineering parameters. The colony analysis can be applied to different applications.
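A minimal sketch of treating threshold segmentation as a global optimization with a genetic search follows. The fitness function (between-class variance) and the GA operators here are stand-ins, since the abstract does not specify them:

```python
import numpy as np

def between_class_variance(img, t):
    """Fitness of a threshold t: Otsu-style between-class variance."""
    fg, bg = img[img > t], img[img <= t]
    if fg.size == 0 or bg.size == 0:
        return 0.0
    w1, w2 = fg.size / img.size, bg.size / img.size
    return w1 * w2 * (fg.mean() - bg.mean()) ** 2

def ga_threshold(img, pop_size=20, gens=40, seed=0):
    """Toy genetic search for a segmentation threshold. The paper's actual
    GA encoding and fitness are not given in the abstract."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(img.min(), img.max(), pop_size)
    for _ in range(gens):
        fit = np.array([between_class_variance(img, t) for t in pop])
        parents = pop[np.argsort(fit)[-pop_size // 2:]]        # selection
        children = rng.choice(parents, pop_size // 2) \
                   + rng.normal(0, 2.0, pop_size // 2)         # mutation
        pop = np.concatenate([parents, children])
    fit = np.array([between_class_variance(img, t) for t in pop])
    return pop[np.argmax(fit)]
```

For a single scalar threshold an exhaustive search would suffice; the genetic formulation pays off when the segmentation has many coupled parameters, as in the colony images described above.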

  7. A review of multivariate analyses in imaging genetics

    PubMed Central

    Liu, Jingyu; Calhoun, Vince D.

    2014-01-01

    Recent advances in neuroimaging technology and molecular genetics provide the unique opportunity to investigate genetic influence on the variation of brain attributes. Since the year 2000, when the initial publication on brain imaging and genetics was released, imaging genetics has been a rapidly growing research approach with increasing publications every year. Several reviews have been offered to the research community focusing on various study designs. In addition to study design, analytic tools and their proper implementation are also critical to the success of a study. In this review, we survey recent publications using data from neuroimaging and genetics, focusing on methods capturing multivariate effects accommodating the large number of variables from both imaging data and genetic data. We group the analyses of genetic or genomic data into either a priori driven or data driven approach, including gene-set enrichment analysis, multifactor dimensionality reduction, principal component analysis, independent component analysis (ICA), and clustering. For the analyses of imaging data, ICA and extensions of ICA are the most widely used multivariate methods. Given detailed reviews of multivariate analyses of imaging data available elsewhere, we provide a brief summary here that includes a recently proposed method known as independent vector analysis. Finally, we review methods focused on bridging the imaging and genetic data by establishing multivariate and multiple genotype-phenotype-associations, including sparse partial least squares, sparse canonical correlation analysis, sparse reduced rank regression and parallel ICA. These methods are designed to extract latent variables from both genetic and imaging data, which become new genotypes and phenotypes, and the links between the new genotype-phenotype pairs are maximized using different cost functions. 
The relationships among these methods, along with their assumptions, advantages, and limitations, are discussed.
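The ICA step that recurs throughout this literature can be sketched in plain NumPy. This is a textbook deflationary FastICA with a tanh nonlinearity, offered as illustration rather than the implementation used in any of the reviewed studies:

```python
import numpy as np

def fast_ica(X, n_iter=200, seed=0):
    """Minimal deflationary FastICA (tanh nonlinearity) on an (n_samples,
    n_features) matrix. Returns estimated independent components."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)
    # Whiten: rotate/scale so the data have identity covariance.
    d, E = np.linalg.eigh(np.cov(X, rowvar=False))
    Z = X @ E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    n_comp = Z.shape[1]
    W = np.zeros((n_comp, n_comp))
    for i in range(n_comp):
        w = rng.standard_normal(n_comp)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            g = np.tanh(Z @ w)
            w_new = (Z.T @ g) / len(Z) - (1.0 - g ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)  # decorrelate from earlier rows
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < 1e-8
            w = w_new
            if converged:
                break
        W[i] = w
    return Z @ W.T  # one estimated component per column
```

Methods such as parallel ICA extend this core by jointly estimating components in imaging and genetic data while maximizing the correlation between linked component pairs.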

  8. Labeling of virus components for advanced, quantitative imaging analyses.

    PubMed

    Sakin, Volkan; Paci, Giulia; Lemke, Edward A; Müller, Barbara

    2016-07-01

    In recent years, investigation of virus-cell interactions has moved from ensemble measurements to imaging analyses at the single-particle level. Advanced fluorescence microscopy techniques provide single-molecule sensitivity and subdiffraction spatial resolution, allowing observation of subviral details and individual replication events to obtain detailed quantitative information. To exploit the full potential of these techniques, virologists need to employ novel labeling strategies, taking into account specific constraints imposed by viruses, as well as unique requirements of microscopic methods. Here, we compare strengths and limitations of various labeling methods, exemplify virological questions that were successfully addressed, and discuss challenges and future potential of novel approaches in virus imaging. PMID:26987299

  9. Cartographic analyses of geographic information available on Google Earth Images

    NASA Astrophysics Data System (ADS)

    Oliveira, J. C.; Ramos, J. R.; Epiphanio, J. C.

    2011-12-01

The purpose was to evaluate the planimetric accuracy of satellite images available in the Google Earth database. These images cover the vicinity of the Federal University of Viçosa, Minas Gerais, Brazil. The methodology evaluated the geographic information of three groups of images, defined according to the level of detail presented on screen (zoom). These groups were labeled Zoom 1000 (a single image for the entire study area), Zoom 100 (a mosaic of 73 images), and Zoom 100 with geometric correction (the same mosaic after a geometric correction applied through control points). Cartographic accuracy was measured for each group based on statistical analyses and the parameters of Brazilian law for planimetric mapping. For this evaluation, 22 points were identified in each group of images, and the coordinates of each point were compared with field coordinates obtained by GPS (Global Positioning System). Table 1 shows results related to accuracy (based on a threshold equal to 0.5 mm * mapping scale) and tendency (abscissa and ordinate) between the image coordinates and the field coordinates. The geometric correction applied to the Zoom 100 group reduced the trends identified earlier, and the statistical tests indicated that the data are useful for mapping at a scale of 1/5000 with error smaller than 0.5 mm * scale. The analyses confirmed the quality of the cartographic data provided by Google, as well as the possibility of reducing the positioning divergences present in the data. It can be concluded that it is possible to obtain geographic information from the database available on Google Earth; however, the level of detail (zoom) used at the time of viewing and capturing information on the screen influences the cartographic quality of the mapping. 
Despite the cartographic and thematic potential present in the database, it is important to note that both the software
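The accuracy test described, comparing image coordinates against GPS field coordinates under a 0.5 mm * map-scale threshold, can be sketched as:

```python
import numpy as np

def planimetric_check(img_xy, gps_xy, scale_denominator):
    """Compare image coordinates with GPS reference coordinates (both in
    metres) against the 0.5 mm * map-scale accuracy threshold. Sketch only;
    the Brazilian standard also involves tendency (bias) tests per axis."""
    diffs = np.linalg.norm(img_xy - gps_xy, axis=1)   # per-point error (m)
    rmse = np.sqrt((diffs ** 2).mean())
    threshold = 0.5e-3 * scale_denominator           # 0.5 mm at map scale, in m
    return rmse, rmse <= threshold
```

For example, a 1 m RMSE passes at 1/5000 (threshold 2.5 m) but fails at 1/1000 (threshold 0.5 m), which mirrors how the usable mapping scale is derived in the study.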

  10. Analyses of particles in beryllium by ion imaging

    SciTech Connect

Price, C.W.; Norberg, J.C. (Evans and Associates, Redwood City, CA)

    1989-10-06

Ion microanalysis using a ¹³³Cs⁺ primary ion beam and SIMS has sufficiently high sensitivity that it can be used to analyze Be for trace amounts of most elements. High sensitivity is important because O, C, and other elements have low solubilities in Be, and reliable analyses of these elements become difficult as they approach their solid solubility limits (about 6 appm for O; C is also suspected to be within this range). Because of the low solubilities of these elements, major portions of their total concentrations can be contained in particles. Quantitative depth-profile analyses using ion-implanted standards are ideal for analyzing the Be matrix, but if particles exist, supplementary techniques such as stereology are required to determine the amounts of the elements that are associated with the particles. This paper demonstrates the use of ion imaging to identify various types of particles and determine their spatial distributions. 4 refs., 3 figs.

  11. Solid Hydrogen Experiments for Atomic Propellants: Image Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2002-01-01

    This paper presents the results of detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium. Solid particles of hydrogen were frozen in liquid helium, and observed with a video camera. The solid hydrogen particle sizes, their agglomerates, and the total mass of hydrogen particles were estimated. Particle sizes of 1.9 to 8 mm (0.075 to 0.315 in.) were measured. The particle agglomerate sizes and areas were measured, and the total mass of solid hydrogen was computed. A total mass of from 0.22 to 7.9 grams of hydrogen was frozen. Compaction and expansion of the agglomerate implied that the particles remain independent particles, and can be separated and controlled. These experiment image analyses are one of the first steps toward visually characterizing these particles, and allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.
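Estimating particle sizes from thresholded video frames of this kind can be sketched with connected-component labeling. The mm-per-pixel calibration is a hypothetical input, not a value from the experiments:

```python
import numpy as np
from scipy import ndimage

def particle_sizes_mm(binary_img, mm_per_pixel):
    """Equivalent-circle diameters (mm) of particles in a thresholded frame.
    The mm-per-pixel calibration is an assumed input."""
    labels, n = ndimage.label(binary_img)      # connected components
    if n == 0:
        return np.array([])
    areas_px = ndimage.sum(binary_img, labels, index=range(1, n + 1))
    areas_mm2 = np.asarray(areas_px) * mm_per_pixel ** 2
    return 2.0 * np.sqrt(areas_mm2 / np.pi)    # equivalent-circle diameter
```

Summing the component areas and multiplying by a solid-hydrogen density and an assumed particle thickness would give the kind of total-mass estimate reported above.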

  12. ["When the ad is good, the product is sold." The MonitorACAO Project and drug advertising in Brazil].

    PubMed

    Soares, Jussara Calmon Reis de Souza

    2008-04-01

This paper presents an analysis of drug advertising in Brazil, based on the final report of the MonitorACAO Project by the group from the Universidade Federal Fluminense, Niterói, Rio de Janeiro. Through a partnership between the university and the National Agency for Health Surveillance (ANVISA), drug advertisements were monitored and analyzed for one year, according to the methodology defined by the Agency. The samples were collected in medical practices and hospitals, drugstores, pharmacies, and scientific magazines. TV and radio programs were monitored in the case of OTC drugs. Of a total of 263 irregular ads analyzed between October 2004 and August 2005, 159 advertisements referring to pharmaceuticals were sent to ANVISA. The main problems found were the poor quality of drug information provided to health professionals, as well as misleading drug information directed at the lay population. Based on the results of this project and on other studies, a ban on drug advertising in Brazil is proposed. PMID:21936168

  13. A Guide to Analysing Tongue Motion from Ultrasound Images

    ERIC Educational Resources Information Center

    Stone, Maureen

    2005-01-01

    This paper is meant to be an introduction to and general reference for ultrasound imaging for new and moderately experienced users of the instrument. The paper consists of eight sections. The first explains how ultrasound works, including beam properties, scan types and machine features. The second section discusses image quality, including the…

  14. Imaging data analyses for hazardous waste applications. Final report

    SciTech Connect

    David, N.; Ginsberg, I.W.

    1995-12-01

The paper presents some examples of the use of remote sensing products for characterization of hazardous waste sites. The sites are located at the Los Alamos National Laboratory (LANL) where materials associated with past weapons testing are buried. Problems of interest include delineation of strata for soil sampling, detection and delineation of buried trenches containing contaminants, seepage from capped areas and old septic drain fields, and location of faults and fractures relative to hazardous waste areas. Merging of site maps and other geographic information with imagery was found by site managers to produce useful products. Merging of hydrographic and soil contaminant data aided soil sampling strategists. Overlays of suspected trenches on multispectral and thermal images showed correlation between image signatures and trenches. Overlays of engineering drawings on recent and historical photos showed error in trench location and extent. A thermal image showed warm anomalies suspected to be areas of water seepage through an asphalt cap. Overlays of engineering drawings on multispectral and thermal images showed correlation between image signatures and drain fields. Analysis of aerial photography and spectral signatures of faults/fractures improved geologic maps of mixed waste areas.

  15. Biomarkers and imaging: physics and chemistry for noninvasive analyses.

    PubMed

    Moyer, Brian R; Barrett, John A

    2009-05-01

    The era of 'modern medicine' has changed its name to 'molecular medicine', and reflects a new age based on personalized medicine utilizing molecular biomarkers in the diagnosis, staging and monitoring of therapy. Alzheimer's disease has a classical biomarker determined at autopsy with the histologic staining of amyloid accumulation in the brain. Today we can diagnose Alzheimer's disease using the same classical pathologic biomarker, but now using a noninvasive imaging probe to image the amyloid deposition in a patient and potentially provide treatment strategies and measure their effectiveness. Molecular medicine is the exploitation of biomarkers to detect disease before overt expression of pathology. Physicians can now find, fight and follow disease using imaging, and the need for other disease biomarkers is in high demand. This review will discuss the innovative physical and molecular biomarker probes now being developed for imaging systems and we will introduce the concepts needed for validation and regulatory acceptance of surrogate biomarkers in the detection and treatment of disease. PMID:21083171

  16. Quantifying inter-subject agreement in brain-imaging analyses.

    PubMed

    Wong, Dik Kin; Grosenick, Logan; Uy, E Timothy; Perreau Guimaraes, Marcos; Carvalhaes, Claudio G; Desain, Peter; Suppes, Patrick

    2008-02-01

    In brain-imaging research, we are often interested in making quantitative claims about effects across subjects. Given that most imaging data consist of tens to thousands of spatially correlated time series, inter-subject comparisons are typically accomplished with simple combinations of inter-subject data, for example methods relying on group means. Further, these data are frequently taken from reduced channel subsets defined either a priori using anatomical considerations, or functionally using p-value thresholding to choose cluster boundaries. While such methods are effective for data reduction, means are sensitive to outliers, and current methods for subset selection can be somewhat arbitrary. Here, we introduce a novel "partial-ranking" approach to test for inter-subject agreement at the channel level. This non-parametric method effectively tests whether channel concordance is present across subjects, how many channels are necessary for maximum concordance, and which channels are responsible for this agreement. We validate the method on two previously published and two simulated EEG data sets. PMID:18023210
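As a simplified, full-ranking analogue of the partial-ranking idea, channel-level agreement across subjects can be quantified with Kendall's coefficient of concordance. The paper's method extends this to partial rankings; the sketch below is the classical full-ranking statistic:

```python
import numpy as np
from scipy.stats import rankdata

def kendalls_w(scores):
    """Kendall's coefficient of concordance, W, over channels.
    `scores`: (n_subjects, n_channels) per-channel effect scores.
    W = 1 means every subject ranks the channels identically."""
    m, n = scores.shape
    ranks = np.vstack([rankdata(row) for row in scores])  # rank within subject
    R = ranks.sum(axis=0)                                 # channel rank sums
    S = ((R - R.mean()) ** 2).sum()
    return 12.0 * S / (m ** 2 * (n ** 3 - n))
```

Unlike a group mean, this statistic is driven by rank order rather than magnitude, which makes it robust to the outlier subjects the abstract warns about.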

  17. The challenges of analysing blood stains with hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Kuula, J.; Puupponen, H.-H.; Rinta, H.; Pölönen, I.

    2014-06-01

Hyperspectral imaging is a potential noninvasive technology for detecting, separating, and identifying various substances. In forensic and military medicine and other CBRNE-related uses it could be a potential method for analyzing blood and for scanning other human-based fluids. For example, it would be valuable to easily detect whether traces of blood are from one or more persons, or whether there are irrelevant substances or anomalies in the blood. This article presents an experiment on separating four persons' blood stains on a white cotton fabric with a SWIR hyperspectral camera and an FT-NIR spectrometer. Each tested sample includes a standardized 75 µl of 100% blood. The results suggest that, on the basis of the amount of erythrocytes in the blood, different people's blood might be separable by hyperspectral analysis. And, referring to the indication given by erythrocytes, there might be a possibility to find some other traces in the blood as well. However, these assumptions need to be verified with wider tests, as the number of samples in the study was small. The study also suggests that several biological, chemical, and physical factors affect, alone and together, the hyperspectral analysis results of blood on fabric textures, and these factors need to be considered before drawing any further conclusions on the analysis of blood on various materials.
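One common way to compare reflectance spectra of the kind acquired here is the spectral angle. This measure is offered as illustration only and is not claimed to be the analysis used in the study:

```python
import numpy as np

def spectral_angle(s1, s2):
    """Spectral angle (radians) between two reflectance spectra.
    Zero for identical shapes; insensitive to overall brightness scaling."""
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```

Brightness invariance is the appeal: two stains of the same blood at different concentrations or thicknesses produce scaled spectra, which this measure treats as identical.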

  18. Validating new diagnostic imaging criteria for primary progressive aphasia via anatomical likelihood estimation meta-analyses.

    PubMed

    Bisenius, S; Neumann, J; Schroeter, M L

    2016-04-01

    Recently, diagnostic clinical and imaging criteria for primary progressive aphasia (PPA) have been revised by an international consortium (Gorno-Tempini et al. Neurology 2011;76:1006-14). The aim of this study was to validate the specificity of the new imaging criteria and investigate whether different imaging modalities [magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET)] require different diagnostic subtype-specific imaging criteria. Anatomical likelihood estimation meta-analyses were conducted for PPA subtypes across a large cohort of 396 patients: firstly, across MRI studies for each of the three PPA subtypes followed by conjunction and subtraction analyses to investigate the specificity, and, secondly, by comparing results across MRI vs. FDG-PET studies in semantic dementia and progressive nonfluent aphasia. Semantic dementia showed atrophy in temporal, fusiform, parahippocampal gyri, hippocampus, and amygdala, progressive nonfluent aphasia in left putamen, insula, middle/superior temporal, precentral, and frontal gyri, logopenic progressive aphasia in middle/superior temporal, supramarginal, and dorsal posterior cingulate gyri. Results of the disease-specific meta-analyses across MRI studies were disjunct. Similarly, atrophic and hypometabolic brain networks were regionally dissociated in both semantic dementia and progressive nonfluent aphasia. In conclusion, meta-analyses support the specificity of new diagnostic imaging criteria for PPA and suggest that they should be specified for each imaging modality separately. PMID:26901360

  19. Integrating medical imaging analyses through a high-throughput bundled resource imaging system

    NASA Astrophysics Data System (ADS)

    Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.

    2011-03-01

Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data provenance, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists.


  2. Analyses of S-Box in Image Encryption Applications Based on Fuzzy Decision Making Criterion

    NASA Astrophysics Data System (ADS)

    Rehman, Inayatur; Shah, Tariq; Hussain, Iqtadar

    2014-06-01

In this manuscript, we put forward a standard based on a fuzzy decision-making criterion to examine current substitution boxes and study their strengths and weaknesses in order to decide their appropriateness for image encryption applications. The proposed standard utilizes the results of correlation analysis, entropy analysis, contrast analysis, homogeneity analysis, energy analysis, and mean-of-absolute-deviation analysis. These analyses are applied to well-known substitution boxes. The outcomes of these analyses are further examined, and a fuzzy soft set decision-making criterion is used to decide the suitability of an S-box for image encryption applications.
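One of the listed measures, entropy analysis, can be sketched for an 8-bit S-box as follows. The fuzzy aggregation weights are not given in the abstract, so only this single measure is shown:

```python
import numpy as np

def byte_entropy(data):
    """Shannon entropy (bits per byte) of S-box output. An ideal bijective
    8-bit S-box fed a uniform input preserves the maximum of 8 bits."""
    counts = np.bincount(np.asarray(data, dtype=np.uint8).ravel(), minlength=256)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

The fuzzy criterion would aggregate this score with the correlation, contrast, homogeneity, energy, and mean-of-absolute-deviation scores into a single suitability ranking.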

  3. Geologist's Field Assistant: Developing Image and Spectral Analyses Algorithms for Remote Science Exploration

    NASA Astrophysics Data System (ADS)

    Gulick, V. C.; Morris, R. L.; Bishop, J.; Gazis, P.; Alena, R.; Sierhuis, M.

    2002-03-01

    We are developing science analyses algorithms to interface with a Geologist's Field Assistant device to allow robotic or human remote explorers to better sense their surroundings during limited surface excursions. Our algorithms will interpret spectral and imaging data obtained by various sensors.

  4. Biodistribution Analyses of a Near-Infrared, Fluorescently Labeled, Bispecific Monoclonal Antibody Using Optical Imaging.

    PubMed

    Peterson, Norman C; Wilson, George G; Huang, Qihui; Dimasi, Nazzareno; Sachsenmeier, Kris F

    2016-04-01

    In recent years, biodistribution analyses of pharmaceutical compounds in preclinical animal models have become an integral part of drug development. Here we report on the use of optical imaging biodistribution analyses in a mouse xenograft model to identify tissues that nonspecifically retained a bispecific antibody under development. Although our bispecific antibody bound both the epidermal growth factor receptor and the insulin-like growth factor 1 receptor, which are expressed on H358 non-small-cell lung carcinoma cells, the fluorescence from the labeled bispecific antibody was less intense than expected in xenografted tumors. Imaging analyses of live mice and major organs revealed that the majority of the Alexa Fluor 750-labeled bispecific antibody was sequestered in the liver within 2 h of injection. However, results varied depending on which near-infrared fluorophore was used, and fluorescence from the livers of mice injected with bispecific antibody labeled with Alexa Fluor 680 was less pronounced than from those labeled with Alexa Fluor 750. The tissue distribution of control antibodies was unaffected by the label, suggesting that hepatic retention of the two fluorophores may differ. With these precautions in mind, the results support the incorporation of optical imaging biodistribution analyses into biotherapeutic development strategies. PMID:27053562

  5. Computer assisted photo-anthropometric analyses of full-face and profile facial images.

    PubMed

    Davis, Josh P; Valentine, Tim; Davis, Robert E

    2010-07-15

    Expert witnesses using facial comparison techniques are regularly required to disambiguate cases of disputed identification in CCTV images and other photographic evidence in court. This paper describes a novel software-assisted photo-anthropometric facial landmark identification system, DigitalFace, tested against a database of 70 full-face and profile images of young males meeting a similar description. The system produces 37 linear and 25 angular measurements across the two viewpoints. A series of 64 analyses was conducted to examine whether separate novel probe facial images of target individuals, whose face dimensions were already stored within the database, would be correctly identified as the same person. Identity verification was found to be unreliable unless multiple distance and angular measurements from both profile and full-face images were included in an analysis. PMID:20570069
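
    An angular measurement between facial landmarks of the kind described here reduces to the angle at one landmark between rays toward two others. A small sketch assuming 2-D landmark coordinates (the landmark choices are hypothetical, not DigitalFace's actual definitions):

```python
import numpy as np

def angle_at(p, a, b):
    """Angle in degrees at landmark p between the rays toward a and b."""
    p = np.asarray(p, float)
    u = np.asarray(a, float) - p
    v = np.asarray(b, float) - p
    cosang = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    # clip guards against floating-point drift outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
```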

  6. Analysing the Image Building Effects of TV Advertisements Using Internet Community Data

    NASA Astrophysics Data System (ADS)

    Uehara, Hiroshi; Sato, Tadahiko; Yoshida, Kenichi

    This paper proposes a method to measure the effects of TV advertisements using Internet bulletin boards. It aims to clarify how viewers' interest in TV advertisements is reflected in their image of the promoted products. Two kinds of time-series data are generated by the proposed method. The first represents the fluctuation over time of interest in the TV advertisements; the second represents the fluctuation over time of the image of the products. By analysing the correlations between these two time series, we try to clarify the implicit relationship between viewers' interest in a TV advertisement and their image of the promoted products. By applying the proposed method to an Internet bulletin board devoted to a certain cosmetics brand, we show that the image of the products varies depending on the interest attracted by each TV advertisement.
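
    Correlating two such time series is commonly done by scanning the Pearson correlation across a range of lags, so that a delayed image-building effect shows up as a peak at a nonzero lag. A sketch, assuming equally spaced (e.g. daily) series; this is a generic technique, not necessarily the authors' exact procedure:

```python
import numpy as np

def lagged_corr(a, b, max_lag):
    """Pearson correlation of series a against series b at each lag.

    A negative lag k means b trails a by |k| steps (b[t] ~ a[t - |k|])."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            x, y = a[:lag], b[-lag:]
        elif lag > 0:
            x, y = a[lag:], b[:-lag]
        else:
            x, y = a, b
        out[lag] = float(np.corrcoef(x, y)[0, 1])
    return out
```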

  7. Solid Hydrogen Experiments for Atomic Propellants: Particle Formation, Imaging, Observations, and Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2005-01-01

    This report presents particle formation observations and detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium. Hydrogen was frozen into particles in liquid helium, and observed with a video camera. The solid hydrogen particle sizes and the total mass of hydrogen particles were estimated. These newly analyzed data are from the test series held on February 28, 2001. Particle sizes from previous testing in 1999 and the testing in 2001 were similar. Though the 2001 testing created similar particle sizes, many new particle formation phenomena were observed: microparticles and delayed particle formation. These experiment image analyses are some of the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.
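
    Estimating particle size and total mass from video frames can be illustrated with a simple sketch: treat each imaged particle as a sphere whose radius matches its projected area. The density value (~86 kg/m³ for solid hydrogen) and the pixel scale below are assumptions for illustration, not the report's actual calibration:

```python
import numpy as np

RHO_SOLID_H2 = 86.0  # kg/m^3, approximate density of solid hydrogen (assumed)

def particle_mass(pixel_areas, m_per_px):
    """Total mass estimate from projected particle areas (in pixels),
    assuming each particle is a sphere of equal projected area."""
    areas = np.asarray(pixel_areas, float) * m_per_px ** 2
    radii = np.sqrt(areas / np.pi)  # radius of the equal-area circle
    return float((4.0 / 3.0) * np.pi * np.sum(radii ** 3) * RHO_SOLID_H2)
```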

  8. Solid Hydrogen Experiments for Atomic Propellants: Particle Formation Energy and Imaging Analyses

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2002-01-01

    This paper presents particle formation energy balances and detailed analyses of the images from experiments that were conducted on the formation of solid hydrogen particles in liquid helium during the Phase II testing in 2001. Solid particles of hydrogen were frozen in liquid helium and observed with a video camera. The solid hydrogen particle sizes and the total mass of hydrogen particles were estimated. The particle formation efficiency is also estimated. Particle sizes from the Phase I testing in 1999 and the Phase II testing in 2001 were similar. Though the 2001 testing created similar particle sizes, many new particle formation phenomena were observed. These experiment image analyses are among the first steps toward visually characterizing these particles, and they allow designers to understand what issues must be addressed in atomic propellant feed system designs for future aerospace vehicles.

  9. Fractal analyses of osseous healing using Tuned Aperture Computed Tomography images

    PubMed Central

    Seyedain, Ali; Webber, Richard L.; Nair, Umadevi P.; Piesco, Nicholas P.; Agarwal, Sudha; Mooney, Mark P.; Gröndahl, Hans-Göran

    2016-01-01

    The aim of this study was to evaluate osseous healing in mandibular defects using fractal analyses on conventional radiographs and tuned aperture computed tomography (TACT; OrthoTACT, Instrumentarium Imaging, Helsinki, Finland) images. Eighty test sites on the inferior margins of rabbit mandibles were subject to lesion induction and treated with one of the following: no treatment (controls); osteoblasts only; polymer matrix only; or osteoblast-polymer matrix (OPM) combination. Images were acquired using conventional radiography and TACT, including unprocessed TACT (TACT-U) and iteratively restored TACT (TACT-IR). Healing was followed up over time and images acquired at 3, 6, 9, and 12 weeks post-surgery. Fractal dimension (FD) was computed within regions of interest in the defects using the TACT workbench. Results were analyzed for effects produced by imaging modality, treatment modality, time after surgery and lesion location. Histomorphometric data were available to assess ground truth. Significant differences (p < 0.0001) were noted based on imaging modality, with TACT-IR recording the highest mean fractal dimension (MFD), followed by TACT-U and conventional images, in that order. Sites treated with OPM recorded the highest MFDs among all treatment modalities (p < 0.0001). The highest MFD based on time was recorded at 3 weeks and differed significantly from that at 12 weeks (p < 0.035). Correlation of FD with the histomorphometric results was high (r = 0.79; p < 0.001). The FD computed on TACT-IR showed the highest correlation with histomorphometric data, indicating that TACT is a more efficient and accurate imaging modality for quantifying osseous changes within healing bony defects. PMID:11519567
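
    The fractal dimension used in analyses like this is commonly estimated by box counting: cover the binarized region of interest with boxes of decreasing size and fit the slope of log(count) against log(1/size). A numpy sketch, assuming a square binary mask with power-of-two side length (the TACT workbench's actual algorithm may differ):

```python
import numpy as np

def box_count_dimension(mask):
    """Box-counting fractal dimension of a square binary image
    whose side length is a power of two."""
    n = mask.shape[0]
    sizes, counts = [], []
    s = n
    while s >= 1:
        # count boxes of side s that contain at least one foreground pixel
        blocks = mask.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        sizes.append(s)
        counts.append(blocks.sum())
        s //= 2
    # FD is the slope of log(count) versus log(1/size)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return float(slope)
```

    As a sanity check, a completely filled mask has dimension 2, while sparser, more irregular structures fall between 1 and 2.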

  10. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data

    PubMed Central

    Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan

    2015-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393
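
    The core of a decoding analysis like those TDT runs is a classifier evaluated with cross-validation over runs. A language-agnostic numpy sketch using a nearest-centroid classifier and leave-one-run-out cross-validation (TDT itself is Matlab-based and typically wraps SVMs; this is an illustration of the scheme, not TDT's API):

```python
import numpy as np

def decode_loro(X, y, runs):
    """Leave-one-run-out cross-validated nearest-centroid decoding.

    X: (n_samples, n_features) activity patterns
    y: (n_samples,) class labels
    runs: (n_samples,) run labels used as cross-validation folds."""
    accs = []
    for r in np.unique(runs):
        train, test = runs != r, runs == r
        classes = np.unique(y)
        # class centroids estimated from the training runs only
        cents = np.array([X[train & (y == c)].mean(axis=0) for c in classes])
        # assign each held-out pattern to the nearest centroid
        d = np.linalg.norm(X[test][:, None, :] - cents[None], axis=2)
        pred = classes[d.argmin(axis=1)]
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))
```

    Above-chance mean accuracy indicates that the patterns carry class information; the same scheme generalizes to searchlight analyses by repeating it over local spheres of features.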

  12. Study of SGD along the French Mediterranean coastline using airborne TIR images and in situ analyses

    NASA Astrophysics Data System (ADS)

    van Beek, Pieter; Stieglitz, Thomas; Souhaut, Marc

    2015-04-01

    Although submarine groundwater discharge (SGD) has been investigated in many places of the world, very few studies were conducted along the French coastline of the Mediterranean Sea. Almost no information is available on the fluxes of water and chemical elements associated with these SGD and on their potential impact on the geochemical cycling and ecosystems of the coastal zones. In this work, we combined the use of airborne thermal infrared (TIR) images with in situ analyses of salinity, temperature, radon and radium isotopes to study SGD at various sites along the French Mediterranean coastline and in coastal lagoons. These analyses allowed us to detect SGD sites and to quantify SGD fluxes (that include both the fluxes of fresh groundwater and recirculated seawater). In particular, we will show how the Ra isotopes determined in the La Palme lagoon were used to estimate i) the residence time of waters in the lagoon and ii) SGD fluxes.
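
    A common way to estimate water residence time from short-lived radium isotopes (not necessarily this study's exact procedure) uses the decay of the 224Ra/223Ra activity ratio between the groundwater end-member and the lagoon water:

```python
import numpy as np

# decay constants (1/day) from the isotopes' half-lives
LAM_224 = np.log(2) / 3.66   # 224Ra half-life ~ 3.66 days
LAM_223 = np.log(2) / 11.4   # 223Ra half-life ~ 11.4 days

def apparent_age(ar_source, ar_sample):
    """Apparent water age (days) from the decline of the
    224Ra/223Ra activity ratio since the water left its source."""
    return float(np.log(ar_source / ar_sample) / (LAM_224 - LAM_223))
```

    Because 224Ra decays faster than 223Ra, the ratio drops as water ages, and the logarithm of the decline divided by the difference in decay constants recovers the elapsed time.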

  13. Computer-based image-analyses of laminated shales, carboniferous of the Midcontinent and surrounding areas

    SciTech Connect

    Archer, A.W. (Dept. of Geology)

    1993-02-01

    Computerized image analyses of petrographic data can greatly facilitate the quantification of detailed descriptions and analyses of fine-scale fabric, or petrofabric. In thinly laminated rocks, manual measurement of successive lamina thicknesses is very time consuming, especially when applied to thick, cored sequences. In particular, images of core materials can be digitized and the resulting image processed as a large matrix. Using such techniques, it is relatively easy to automate continuous measurements of lamina thickness and lateral continuity. This type of analysis has been applied to a variety of Carboniferous strata, particularly the siliciclastics that occur within the 'outside shale' portions of Kansas cyclothems. Of the various sedimentological processes capable of producing such non-random thickness variations, a model invoking tidal processes appears to be particularly robust. Tidal sedimentation could not only have resulted in the deposition of individual laminae; in addition, tidal-height variations during phases of the lunar orbit can explain the systematic variations. Comparison of these Carboniferous shales with similar laminations formed in modern high tidal-range environments indicates many similarities. These modern analogs include the Bay of Fundy in Canada and the Bay of Mont-Saint-Michel in France. Lamina-thickness variations can, in specific cases, be correlated with known tidal periodicities. In addition, in some samples, details of the tidal regime, such as whether the tidal system was diurnal or semidiurnal, can be inferred, and some indicators of tidal range can be ascertained from modern analogs.
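
    Automated lamina-thickness measurement of this kind can be sketched as run-length analysis along one column of a digitized grayscale image: threshold the intensities, then measure the length of each dark run. The threshold value below is hypothetical:

```python
import numpy as np

def lamina_thicknesses(column, threshold):
    """Thicknesses (in pixels) of successive dark laminae along one
    image column, identified as runs of intensity below threshold."""
    dark = column < threshold
    edges = np.diff(dark.astype(int))
    starts = np.nonzero(edges == 1)[0] + 1
    ends = np.nonzero(edges == -1)[0] + 1
    # handle runs touching either end of the column
    if dark[0]:
        starts = np.r_[0, starts]
    if dark[-1]:
        ends = np.r_[ends, dark.size]
    return ends - starts
```

    Applied column by column across a core image, this yields the thickness series whose periodic variations the abstract relates to tidal cycles.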

  14. Une Technique D'Analyse D'Image Par Cellule Sur Ecran Video

    NASA Astrophysics Data System (ADS)

    Lefebvre, G.; Thorax, L.; Ducom, J.

    1985-02-01

    In packaging, observing the converting process is difficult because of the high speed of the machines. The setting of box-erecting equipment is tedious, the origin of filling incidents is often unknown, and some on-line performances are not reproducible. To visualize, decompose, and analyse the fast-moving phenomena that cause packaging troubles, the French Paper and Board Research Institute has acquired NAC video equipment (200 pictures/sec). An articulated stand mounted on a carriage was designed to facilitate shooting on the machine. An elaborate image analysis was undertaken to find the board characteristics needed to optimize folding-box erection on packaging lines. Existing computer-based image analysis systems are large and expensive and require a specific program for each problem; such major equipment is currently used exclusively in specialized fields. For the packaging field, where machines and products are diversified, we have developed a simple electronic technique for picture analysis. This technique is suitable for all kinds of processes or defects visualized by high-speed video shooting on the machine. The processes and manufacturing incidents are analysed and monitored with a photocell placed on the video screen. Variations in light intensity are detected and recorded on a self-recording apparatus. The movements of materials, their forming modifications, and the motions of machine elements are expressed as a "signature". Any change in the signature reveals random or reproducible variations associated with defects in the manufacturing process. A short-lived event, visible on only a few images, is located by playing the magnetic tape at normal or slower speed according to the importance of the variation. The technique is also used to measure packaging deformations on line in real time. Ease of use, speed of setup, and the quantity of data obtained make this an efficient image analysis method.

  15. Immunochemical Micro Imaging Analyses for the Detection of Proteins in Artworks.

    PubMed

    Sciutto, Giorgia; Zangheri, Martina; Prati, Silvia; Guardigli, Massimo; Mirasoli, Mara; Mazzeo, Rocco; Roda, Aldo

    2016-06-01

    The present review is aimed at reporting on the most advanced and recent applications of immunochemical imaging techniques for the localization of proteins within complex and multilayered paint stratigraphies. Indeed, a paint sample is usually constituted by the superimposition of different layers whose characterization is fundamental in evaluating the state of conservation and in planning proper restoration interventions. Immunochemical methods, which are based on the high selectivity of antigen-antibody reactions, were proposed some years ago in the field of cultural heritage. In addition to enzyme-linked immunosorbent assays for protein identification, immunochemical imaging methods have also been explored in recent decades, thanks to the possibility of localizing the target analytes, which increases the amount of information obtained and thereby reduces the number of samples and/or analyses needed for a comprehensive characterization of the sample. In this review, chemiluminescent, spectroscopic and electrochemical imaging detection methods are discussed to illustrate the potential and limits of advanced immunochemical imaging systems for the analysis of paint cross-sections. PMID:27573272

  16. An accessible, scalable ecosystem for enabling and sharing diverse mass spectrometry imaging analyses.

    PubMed

    Fischer, Curt R; Ruebel, Oliver; Bowen, Benjamin P

    2016-01-01

    Mass spectrometry imaging (MSI) is used in an increasing number of biological applications. Typical MSI datasets contain unique, high-resolution mass spectra from tens of thousands of spatial locations, resulting in raw data sizes of tens of gigabytes per sample. In this paper, we review technical progress that is enabling new biological applications and that is driving an increase in the complexity and size of MSI data. Handling such data often requires specialized computational infrastructure, software, and expertise. OpenMSI, our recently described platform, makes it easy to explore and share MSI datasets via the web, even when they are larger than 50 GB. Here we describe the integration of OpenMSI with IPython notebooks for transparent, sharable, and replicable MSI research. An advantage of this approach is that users do not have to share raw data along with analyses; instead, data is retrieved via OpenMSI's web API. The IPython notebook interface provides a low-barrier entry point for data manipulation that is accessible for scientists without extensive computational training. Via these notebooks, analyses can be easily shared without requiring any data movement. We provide example notebooks for several common MSI analysis types, including data normalization, plotting, clustering, classification, and image registration. PMID:26365033
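
    One of the normalization steps such notebooks demonstrate, total-ion-current (TIC) normalization, divides each pixel's spectrum by its summed intensity so that pixels become comparable. A generic numpy sketch (not OpenMSI's API):

```python
import numpy as np

def tic_normalize(spectra):
    """Normalize each pixel's spectrum by its total ion current.

    spectra: array of shape (..., n_mz); all-zero spectra are left as zeros."""
    tic = spectra.sum(axis=-1, keepdims=True)
    return spectra / np.where(tic == 0, 1.0, tic)
```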

  17. A novel high-throughput imaging system for automated analyses of avoidance behavior in zebrafish larvae

    PubMed Central

    Pelkowski, Sean D.; Kapoor, Mrinal; Richendrfer, Holly A.; Wang, Xingyue; Colwill, Ruth M.; Creton, Robbert

    2011-01-01

    Early brain development can be influenced by numerous genetic and environmental factors, with long-lasting effects on brain function and behavior. The identification of these factors is facilitated by recent innovations in high-throughput screening. However, large-scale screening in whole organisms remains challenging, in particular when studying changes in brain function or behavior in vertebrate model systems. In this study, we present a novel imaging system for high-throughput analyses of behavior in zebrafish larvae. The three-camera system can image twelve multiwell plates simultaneously and is unique in its ability to provide local visual stimuli in the wells of a multiwell plate. The acquired images are converted into a series of coordinates, which characterize the location and orientation of the larvae. The developed imaging techniques were tested by measuring avoidance behaviors in seven-day-old zebrafish larvae. The system effectively quantified larval avoidance and revealed an increased edge preference in response to a blue or red ‘bouncing ball’ stimulus. Larvae also avoid a bouncing ball stimulus when it is counter-balanced with a stationary ball, but do not avoid blinking balls counter-balanced with a stationary ball. These results indicate that the seven-day-old larvae respond specifically to movement, rather than color, size, or local changes in light intensity. The imaging system and assays for measuring avoidance behavior may be used to screen for genetic and environmental factors that cause developmental brain disorders and for novel drugs that could prevent or treat these disorders. PMID:21549762
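
    Converting each frame into location and orientation coordinates, as described above, can be sketched from a thresholded larva silhouette using image moments; the orientation follows from the principal axis of the second central moments. This is a generic sketch, not the authors' implementation:

```python
import numpy as np

def larva_pose(mask):
    """Centroid (row, col) and body-axis orientation in radians,
    computed from a binary larva silhouette."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    # second central moments give the principal (body) axis
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return (cy, cx), theta
```

    Tracking the centroid over frames yields swim trajectories, while the axis angle distinguishes turns toward or away from a visual stimulus.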

  18. Partial correlation analyses of global diffusion tensor imaging-derived metrics in glioblastoma multiforme: Pilot study

    PubMed Central

    Cortez-Conradis, David; Rios, Camilo; Moreno-Jimenez, Sergio; Roldan-Valadez, Ernesto

    2015-01-01

    AIM: To determine existing correlates among diffusion tensor imaging (DTI)-derived metrics in healthy brains and brains with glioblastoma multiforme (GBM). METHODS: Case-control study using DTI data from brain magnetic resonance imaging of 34 controls (mean ± SD, 41.47 ± 21.94 years; range, 21-80 years) and 27 patients with GBM (mean ± SD, 48.41 ± 15.18 years; range, 18-78 years). Image postprocessing using FSL software calculated eleven tensor metrics: fractional (FA) and relative anisotropy; pure isotropic (p) and anisotropic diffusions (q), total magnitude of diffusion (L); linear (Cl), planar (Cp) and spherical tensors (Cs); mean (MD), axial (AD) and radial diffusivities (RD). Partial correlation analyses (controlling for the effects of age and gender) and a multivariate MANCOVA were performed. RESULTS: There was a normal distribution for all metrics. Comparing healthy brains vs brains with GBM, there were significant very strong bivariate correlations only depicted in GBM: [FA↔Cl (+)], [FA↔q (+)], [p↔AD (+)], [AD↔MD (+)], and [MD↔RD (+)]. Among 56 pairs of bivariate correlations, only seven were significantly different. The diagnosis variable depicted a main effect [F-value (11, 23) = 11.842, P ≤ 0.001], with partial eta squared = 0.850, meaning a large effect size. The age also had a significant influence as a covariate [F (11, 23) = 10.523, P < 0.001], with a large effect size (partial eta squared = 0.834). CONCLUSION: DTI-derived metrics depict significant differences between healthy brains and brains with GBM, with specific magnitudes and correlations. This study provides reference data and makes a contribution to decrease the underlying empiricism in the use of DTI parameters in brain imaging. PMID:26644826
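
    A partial correlation controlling for covariates such as age and gender amounts to correlating the residuals of each variable after regressing it on the covariates. A numpy sketch of that standard computation (not the authors' exact statistical pipeline):

```python
import numpy as np

def partial_corr(x, y, covars):
    """Correlation of x and y after regressing out covariates.

    covars: (n_samples, n_covariates) design matrix (without intercept)."""
    Z = np.column_stack([np.ones(len(x)), covars])
    # residuals of x and y after a least-squares fit on the covariates
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])
```

    When the apparent association between two metrics is driven entirely by a shared covariate, the raw correlation is nonzero while the partial correlation vanishes.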

  19. Mosquito Larval Habitats, Land Use, and Potential Malaria Risk in Northern Belize from Satellite Image Analyses

    NASA Technical Reports Server (NTRS)

    Pope, Kevin; Masuoka, Penny; Rejmankova, Eliska; Grieco, John; Johnson, Sarah; Roberts, Donald

    2004-01-01

    The distribution of Anopheles mosquito habitats and land use in northern Belize is examined with satellite data. A land cover classification based on multispectral SPOT and multitemporal Radarsat images identified eleven land cover classes, including agricultural, forest, and marsh types. Two of the land cover types, Typha domingensis marsh and flooded forest, are Anopheles vestitipennis larval habitats. Eleocharis spp. marsh is the larval habitat of Anopheles albimanus. Geographic Information Systems (GIS) analyses of land cover demonstrate that the amount of Typha domingensis in a marsh is positively correlated with the amount of agricultural land in the adjacent upland, and negatively correlated with the amount of adjacent forest. This finding is consistent with the hypothesis that nutrient (phosphorus) runoff from agricultural lands is causing an expansion of Typha domingensis in northern Belize. This expansion of Anopheles vestitipennis larval habitat may in turn cause an increase in malaria risk in the region.

  20. Validation of the automatic image analyser to assess retinal vessel calibre (ALTAIR): a prospective study protocol

    PubMed Central

    Garcia-Ortiz, Luis; Gómez-Marcos, Manuel A; Recio-Rodríguez, Jose I; Maderuelo-Fernández, Jose A; Chamoso-Santos, Pablo; Rodríguez-González, Sara; de Paz-Santana, Juan F; Merchan-Cifuentes, Miguel A; Corchado-Rodríguez, Juan M

    2014-01-01

    Introduction The fundus examination is a non-invasive evaluation of the microcirculation of the retina. The aim of the present study is to develop and validate (reliability and validity) the ALTAIR software platform (Automatic image analyser to assess retinal vessel calibre) in order to analyse its utility in different clinical environments. Methods and analysis A cross-sectional study in the first phase and a prospective observational study in the second, with 4 years of follow-up. The study will be performed in a primary care centre and will include 386 participants. The main measurements will include carotid intima-media thickness, pulse wave velocity by Sphygmocor, cardio-ankle vascular index through the VASERA VS-1500, cardiac evaluation by a digital ECG and renal injury by microalbuminuria and glomerular filtration. The retinal vascular evaluation will be performed using a TOPCON TRC-NW200 non-mydriatic retinal camera to obtain digital images of the retina, and the developed software (ALTAIR) will be used to automatically calculate the calibre of the retinal vessels, the vascularised area and the branching pattern. For software validation, the intraobserver and interobserver reliability, the concurrent validity of the vascular structure and function, as well as the association between the estimated retinal parameters and the evolution or onset of new lesions in the target organs or cardiovascular diseases will be examined. Ethics and dissemination The study has been approved by the clinical research ethics committee of the healthcare area of Salamanca. All study participants will sign an informed consent form agreeing to participate in the study, in compliance with the Declaration of Helsinki and the WHO standards for observational studies. 
Validation of this tool will provide greater reliability to the analysis of retinal vessels by decreasing the intervention of the observer and will result in increased validity through the use of additional information, especially

  1. Development, Capabilities, and Impact on Wind Analyses of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Miller, T.; Amarin, R.; Atlas, R.; Bailey, M.; Black, P.; Buckley, C.; Chen, S.; El-Nimri, S.; Hood, R.; James, M.; Johnson, J.; Jones, W.; Ruf, C.; Simmons, D.; Uhlhorn, E.; Inglish, C.

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. The instrument is being test flown in January and is expected to participate in the tropical cyclone experiment GRIP (Genesis and Rapid Intensification Processes) in the 2010 season. HIRAD is being designed to study the wind field in some detail within strong hurricanes and to enhance the real-time airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track at a single point directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approximately 3 x the aircraft altitude) with approximately 2 km resolution. This paper describes the HIRAD instrument and the physical basis for its operations, including chamber test data from the instrument. The potential value of future HIRAD observations will be illustrated with a summary of Observing System Simulation Experiments (OSSEs) in which measurements from the new instrument as well as those from existing instruments (air, surface, and space-based) are simulated from the output of a detailed numerical model, and those results are used to construct simulated H*Wind analyses. 
Evaluations will be presented on the impact on H*Wind analyses of using the HIRAD instrument observations to replace those of the SFMR instrument, and also on the impact of a future satellite-based HIRAD in comparison to instruments with more limited capabilities for observing strong winds through heavy

  2. Correlative Imaging and Analyses of Soil Organic Matter Stabilization in the Rhizosphere

    NASA Astrophysics Data System (ADS)

    Dohnalkova, Alice; Tfaily, Malak; Chu, Rosalie; Crump, Alex; Brislawn, Colin; Varga, Tamas; Chrisler, William

    2016-04-01

    Understanding the dynamics of carbon (C) pools in soil systems is critical for mitigating atmospheric carbon dioxide levels and maintaining healthy soils. Although microbial contributions to stable soil carbon pools have often been regarded as low to negligible, we present evidence that microbes may play a far greater role in the stabilization of soil organic matter (SOM), and thus contribute to SOM pools with longer residence times. The rhizosphere, the zone immediately surrounding plant roots, represents a geochemical hotspot with high microbial activity and profuse SOM production. In particular, microbially secreted extracellular polymeric substances (EPS) are a remarkably dynamic entity that plays a critical role in numerous soil processes, including mineral weathering. We approach the interface of soil minerals and microbes with a focus on organic C stabilization mechanisms. We use a suite of high-resolution imaging and analytical methods (confocal, scanning and transmission electron microscopy, Fourier transform ion cyclotron resonance mass spectrometry, DNA sequencing and X-ray diffraction) to study the living and non-living rhizosphere components. Our goal is to elucidate a pathway for the formation, storage, transformation and protection of persistent microbially produced carbon in soils. Based on our multimodal analytical approach, we propose that persistent microbial necromass accounts for considerably more soil carbon than previously estimated.

  3. Image analyses in bauxitic ores: The case of the Apulian karst bauxites

    NASA Astrophysics Data System (ADS)

    Buccione, Roberto; Sinisi, Rosa; Mongelli, Giovanni

    2015-04-01

    This study concerns two different karst bauxite deposits of the Apulia region (southern Italy). These deposits crop out in the Murge and Salento areas: the Murge bauxite (upper Cretaceous) is a typical canyon-like deposit formed in a karst depression, whereas the Salento bauxite (upper Eocene - Oligocene) is the result of the erosion, remobilization and transport of older bauxitic material from a relatively distant area. This mode of emplacement lends its name to all similar deposits, which are thus called Salento-type deposits. The bauxite texture essentially consists of sub-circular concentric aggregates, called ooids, dispersed in a pelitic matrix. The textural properties of the two bauxitic ores, as assessed by SEM-EDX, are different. In the bauxite from the canyon-like deposit the ooids/matrix ratio is higher than in the Salento-type bauxite. Furthermore, the ooids in the Salento-type bauxite are usually made of a large core surrounded by a narrow, single accretion layer, whereas the ooids from the canyon-like deposit have a smaller core surrounded by several alternating layers of Al-hematite and boehmite (Mongelli et al., 2014). In order to explore the textural features of both bauxite deposits in more detail, particle shape analyses were performed. Image analyses and the fractal dimension have been widely used in geological studies, including economic geology (e.g. Turcotte, 1986; Meakin, 1991; Deng et al., 2011). The geometric properties evaluated are the number of ooids, average ooid size, ooid roundness and the fractal dimension D, which depends on the ooids/matrix ratio. D is the slope of the line obtained by applying a counting technique to each sample image. The fractal dimension is slightly lower for the Salento-type bauxites. Since the process which led to the formation of the ooids is related to an aggregation growth involving chemical fractionation (Mongelli, 2002), a correlation among these parameters and the contents of major
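
    The "counting technique" behind the fractal dimension D is not specified in the abstract; a common choice in image analysis is box counting, sketched below in Python. This is an illustrative sketch only: the function name, box sizes, and filled-square test image are ours, not the paper's.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes):
    """Estimate a fractal dimension D of a binary image by box counting.

    `mask` is a 2-D boolean array (True = foreground pixel, e.g. an ooid).
    D is the slope of log(count) against log(1/box size).
    """
    counts = []
    for s in box_sizes:
        # Trim the image so it tiles exactly into s x s boxes.
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        tiles = mask[:h, :w].reshape(h // s, s, w // s, s)
        # Count boxes containing at least one foreground pixel.
        counts.append(np.count_nonzero(tiles.any(axis=(1, 3))))
    # Least-squares slope of log(count) vs log(1/s).
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled square is 2-dimensional under box counting.
square = np.zeros((256, 256), dtype=bool)
square[64:192, 64:192] = True
d = box_counting_dimension(square, [2, 4, 8, 16, 32])
```

Applied to a binarized ooid mask, a denser ooid fabric (higher ooids/matrix ratio) yields a higher D, consistent with the trend reported above.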

  3. Molecular cytogenetic analysis of human blastocysts and cytotrophoblasts by multi-color FISH and Spectral Imaging analyses

    SciTech Connect

    Weier, Jingly F.; Ferlatte, Christy; Baumgartner, Adolf; Jung, Christine J.; Nguyen, Ha-Nam; Chu, Lisa W.; Pedersen, Roger A.; Fisher, Susan J.; Weier, Heinz-Ulrich G.

    2006-02-08

    Numerical chromosome aberrations in gametes typically lead to failed fertilization, spontaneous abortion or a chromosomally abnormal fetus. By means of preimplantation genetic diagnosis (PGD), we now can screen human embryos in vitro for aneuploidy before transferring the embryos to the uterus. PGD allows us to select unaffected embryos for transfer and increases the implantation rate in in vitro fertilization programs. Molecular cytogenetic analyses using multi-color fluorescence in situ hybridization (FISH) of blastomeres have become the major tool for preimplantation genetic screening of aneuploidy. However, current FISH technology can test for only a small number of chromosome abnormalities and has hitherto failed to increase pregnancy rates as expected. We are in the process of developing technologies to score all 24 chromosomes in single cells within a 3 day time limit, which we believe is vital to the clinical setting. Also, human placental cytotrophoblasts (CTBs) at the fetal-maternal interface acquire aneuploidies as they differentiate to an invasive phenotype. About 20-50% of invasive CTB cells from uncomplicated pregnancies were found to be aneuploid, suggesting that the acquisition of aneuploidy is an important component of normal placentation, perhaps limiting the proliferative and invasive potential of CTBs. Since most invasive CTBs are interphase cells and possess extreme heterogeneity, we applied multi-color FISH and repeated hybridizations to investigate individual CTBs. In summary, this study demonstrates the strength of Spectral Imaging analysis and repeated hybridizations, which provides a basis for full karyotype analysis of single interphase cells.

  5. Functional Magnetic Resonance Imaging Connectivity Analyses Reveal Efference-Copy to Primary Somatosensory Area, BA2

    PubMed Central

    Cui, Fang; Arnstein, Dan; Thomas, Rajat Mani; Maurits, Natasha M.; Keysers, Christian; Gazzola, Valeria

    2014-01-01

    Some theories of motor control suggest efference-copies of motor commands reach somatosensory cortices. Here we used functional magnetic resonance imaging to test these models. We varied the amount of efference-copy signal by making participants squeeze a soft material either actively or passively. We found electromyographical recordings, an efference-copy proxy, to predict activity in primary somatosensory regions, in particular Brodmann Area (BA) 2. Partial correlation analyses confirmed that brain activity in cortical structures associated with motor control (premotor and supplementary motor cortices, the parietal area PF and the cerebellum) predicts brain activity in BA2 without being entirely mediated by activity in early somatosensory (BA3b) cortex. Our study therefore provides valuable empirical evidence for efference-copy models of motor control, and shows that signals in BA2 can indeed reflect an input from motor cortices and suggests that we should interpret activations in BA2 as evidence for somatosensory-motor rather than somatosensory coding alone. PMID:24416222

  6. Autonomous Science Analyses of Digital Images for Mars Sample Return and Beyond

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Ruzon, M.; Roush, T. L.

    1999-01-01

    To adequately explore high priority landing sites, scientists require rovers with greater mobility. Therefore, future Mars missions will involve rovers capable of traversing tens of kilometers (vs. tens of meters traversed by Mars Pathfinder's Sojourner). However, the current process by which scientists interact with a rover does not scale to such distances. A single science objective is achieved through many iterations of a basic command cycle: (1) all data must be transmitted to Earth and analyzed; (2) from these data, new targets are selected and the necessary information from the appropriate instruments is requested; (3) new commands are then uplinked and executed by the spacecraft and (4) the resulting data are returned to Earth, starting the process again. Experience with rover tests on Earth shows that this time-intensive process cannot be substantially shortened given the limited data downlink bandwidth and command cycle opportunities of real missions. Sending complete multicolor panoramas at several waypoints, for example, is out of the question for a single downlink opportunity. As a result, long traverses requiring many science command cycles would likely require many weeks, months or even years, perhaps exceeding rover design life or other constraints. Autonomous onboard science analyses can address these problems in two ways. First, it will allow the rover to transmit only "interesting" images, defined as those likely to have higher science content. Second, the rover will be able to anticipate future commands, for example acquiring and returning spectra of "interesting" rocks along with the images in which they were detected. Such approaches, coupled with appropriate navigational software, address both the data volume and command cycle bottlenecks that limit both rover mobility and science yield. We are developing algorithms to enable such intelligent decision making by autonomous spacecraft. Reflecting the ultimate level of ability we aim for, this

  7. Pallasite formation after a non-destructive impact. An experimental- and image analyses-based study

    NASA Astrophysics Data System (ADS)

    Solferino, Giulio; Golabek, Gregor J.; Nimmo, Francis; Schmidt, Max W.

    2015-04-01

    The formation conditions of pallasite meteorites in the interior of terrestrial planetesimals have been a matter of debate over the last 40 years. Among other characteristics, the simple mineralogical composition (i.e., olivine, FeNi, FeS +/- pyroxene) and the dualism between fragmental and rounded olivine-bearing pallasites must be successfully reproduced by a potential formation scenario. This study incorporates a series of annealing experiments with olivine plus Fe-S, and digital image analyses of slabs from the Brenham, Brahin, Seymchan, and Springwater pallasites. Additionally, a 1D finite-difference numerical model was employed to show that a non-destructive collision followed by mixing of the impactor's core with the target body's silicate mantle could lead to the formation of both fragmental and rounded pallasite types. Specifically, an impact occurring right after the completion of target body differentiation, and up to several million years afterwards, allows for (i) average grain sizes consistent with the observed rounded olivine-bearing pallasites, (ii) a remnant magnetization of Fe-Ni olivine inclusions as measured in natural pallasites and (iii) the metallographic cooling rates derived from Fe-Ni in pallasites. An important result of this investigation is the definition of the grain growth rate of olivine in molten Fe-S as follows: d^n - d0^n = k0 exp(-Ea/(RT)) t, where d0 is the starting grain size, d the grain size at time t, n = 2.42(46) the growth exponent, k0 = 9.43×10^6 μm^n s^-1 a characteristic constant, Ea = 289 kJ/mol the activation energy for the specific growth process, R the gas constant, and T the absolute temperature. The computed olivine coarsening rate is markedly faster than in the olivine-FeNi and olivine-Ni systems.
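
    The grain-growth law quoted in the abstract can be evaluated directly. The sketch below plugs in the reported constants (n, k0, Ea) to show how grain size scales with annealing time and temperature; the helper name and the example inputs (10 μm starting size, ~1 Myr, 1500-1600 K) are illustrative choices of ours, not values from the paper.

```python
import math

# Constants as reported in the abstract (units: micrometres, seconds, J/mol, K).
N_EXP = 2.42   # growth exponent n
K0 = 9.43e6    # pre-exponential constant k0, in um^n / s
EA = 289e3     # activation energy Ea, in J/mol
R = 8.314      # gas constant, in J/(mol K)

def grain_size(d0_um, t_s, temp_k):
    """Olivine grain size after annealing, from d^n - d0^n = k0 exp(-Ea/(RT)) t."""
    rate = K0 * math.exp(-EA / (R * temp_k))
    return (d0_um ** N_EXP + rate * t_s) ** (1.0 / N_EXP)

# Growth is monotonic in both time and temperature.
d1 = grain_size(10.0, 3.15e13, 1500.0)  # ~1 Myr at 1500 K
d2 = grain_size(10.0, 3.15e13, 1600.0)  # ~1 Myr at 1600 K
```

Under these illustrative inputs the law predicts coarsening from micrometre to millimetre-centimetre scale over millions of years, the order of magnitude of olivine in natural pallasites.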

  8. X-Ray Digital Imaging Petrography of Lunar Mare Soils: Modal Analyses of Minerals and Glasses

    NASA Astrophysics Data System (ADS)

    Taylor, Lawrence A.; Patchen, Allan; Taylor, Dong-Hwa S.; Chambers, John G.; McKay, David S.

    1996-12-01

    It is essential that accurate modal (i.e., volume) percentages of the various mineral and glass phases in lunar soils be used for addressing and resolving the effects of space weathering upon reflectance spectra, as well as for their calibration. Such data are also required for evaluating the resource potential of lunar minerals for use at a lunar base. However, these data are largely lacking. Particle-counting information for lunar soils, originally obtained to study formational processes, does not provide these necessary data, including the percentages of minerals locked in multi-phase lithic fragments and fused-soil particles, such as agglutinates. We have developed a technique for modal analyses, sensu stricto, of lunar soils, using digital imaging of X-ray maps obtained with an energy-dispersive spectrometer mounted on an electron microprobe. A suite of nine soils (90 to 150 μm size fraction) from the Apollo 11, 12, 15, and 17 mare sites was used for this study. This is the first collection of such modal data on soils from all Apollo mare sites. The abundances of free-mineral fragments in the mare soils are greater for immature and submature soils than for mature soils, largely because of the formation of agglutinitic glass as maturity progresses. In considerations of resource utilization at a lunar base, the best lunar soils to use for mineral beneficiation (i.e., most free-mineral fragments) have maturities near the immature/submature boundary (Is/FeO ≅ 30), not the mature soils with their complications due to extensive agglutination. The particle data obtained from the nine mare soils confirm the generalizations for lunar soils predicted by L. A. Taylor and D. S. McKay (1992, Lunar Planet Sci. Conf. 23rd, pp. 1411-1412 [Abstract]).

  9. Spatiotemporal Analyses of Osteogenesis and Angiogenesis via Intravital Imaging in Cranial Bone Defect Repair

    PubMed Central

    Huang, Chunlan; Ness, Vincent P.; Yang, Xiaochuan; Chen, Hongli; Luo, Jiebo; Brown, Edward B; Zhang, Xinping

    2015-01-01

    Osteogenesis and angiogenesis are two integrated components in bone repair and regeneration. A deeper understanding of osteogenesis and angiogenesis has been hampered by technical difficulties of analyzing bone and neovasculature simultaneously in spatiotemporal scales and in three-dimensional formats. To overcome these barriers, a cranial defect window chamber model was established that enabled high-resolution, longitudinal, and real-time tracking of angiogenesis and bone defect healing via Multiphoton Laser Scanning Microscopy (MPLSM). By simultaneously probing new bone matrix via second harmonic generation (SHG), neovascular networks via intravenous perfusion of fluorophore, and osteoblast differentiation via 2.3kb collagen type I promoter driven GFP (Col2.3GFP), we examined the morphogenetic sequence of cranial bone defect healing and further established the spatiotemporal analyses of osteogenesis and angiogenesis coupling in repair and regeneration. We demonstrated that bone defect closure was initiated in the residual bone around the edge of the defect. The expansion and migration of osteoprogenitors into the bone defect occurred during the first 3 weeks of healing, coupled with vigorous microvessel angiogenesis at the leading edge of the defect. Subsequent bone repair was marked by matrix deposition and active vascular network remodeling within new bone. Implantation of bone marrow stromal cells (BMSCs) isolated from Col2.3GFP mice further showed that donor-dependent bone formation occurred rapidly within the first 3 weeks of implantation, in concert with early angiogenesis. The subsequent bone wound closure was largely host-dependent, associated with localized modest induction of angiogenesis. 
The establishment of a live imaging platform via cranial window provides a unique tool to understand osteogenesis and angiogenesis in repair and regeneration, enabling further elucidation of the spatiotemporal regulatory mechanisms of osteoprogenitor cell interactions

  10. Spatiotemporal Analyses of Osteogenesis and Angiogenesis via Intravital Imaging in Cranial Bone Defect Repair.

    PubMed

    Huang, Chunlan; Ness, Vincent P; Yang, Xiaochuan; Chen, Hongli; Luo, Jiebo; Brown, Edward B; Zhang, Xinping

    2015-07-01

    Osteogenesis and angiogenesis are two integrated components in bone repair and regeneration. A deeper understanding of osteogenesis and angiogenesis has been hampered by technical difficulties of analyzing bone and neovasculature simultaneously in spatiotemporal scales and in 3D formats. To overcome these barriers, a cranial defect window chamber model was established that enabled high-resolution, longitudinal, and real-time tracking of angiogenesis and bone defect healing via multiphoton laser scanning microscopy (MPLSM). By simultaneously probing new bone matrix via second harmonic generation (SHG), neovascular networks via intravenous perfusion of fluorophore, and osteoblast differentiation via 2.3-kb collagen type I promoter-driven GFP (Col2.3GFP), we examined the morphogenetic sequence of cranial bone defect healing and further established the spatiotemporal analyses of osteogenesis and angiogenesis coupling in repair and regeneration. We showed that bone defect closure was initiated in the residual bone around the edge of the defect. The expansion and migration of osteoprogenitors into the bone defect occurred during the first 3 weeks of healing, coupled with vigorous microvessel angiogenesis at the leading edge of the defect. Subsequent bone repair was marked by matrix deposition and active vascular network remodeling within new bone. Implantation of bone marrow stromal cells (BMSCs) isolated from Col2.3GFP mice further showed that donor-dependent bone formation occurred rapidly within the first 3 weeks of implantation, in concert with early angiogenesis. The subsequent bone wound closure was largely host-dependent, associated with localized modest induction of angiogenesis. The establishment of a live imaging platform via cranial window provides a unique tool to understand osteogenesis and angiogenesis in repair and regeneration, enabling further elucidation of the spatiotemporal regulatory mechanisms of osteoprogenitor cell interactions with host bone

  11. X-ray digital imaging petrography of lunar mare soils: modal analyses of minerals and glasses

    NASA Technical Reports Server (NTRS)

    Taylor, L. A.; Patchen, A.; Taylor, D. H.; Chambers, J. G.; McKay, D. S.

    1996-01-01

    It is essential that accurate modal (i.e., volume) percentages of the various mineral and glass phases in lunar soils be used for addressing and resolving the effects of space weathering upon reflectance spectra, as well as for their calibration. Such data are also required for evaluating the resource potential of lunar minerals for use at a lunar base. However, these data are largely lacking. Particle-counting information for lunar soils, originally obtained to study formational processes, does not provide these necessary data, including the percentages of minerals locked in multi-phase lithic fragments and fused-soil particles, such as agglutinates. We have developed a technique for modal analyses, sensu stricto, of lunar soils, using digital imaging of X-ray maps obtained with an energy-dispersive spectrometer mounted on an electron microprobe. A suite of nine soils (90 to 150 micrometers size fraction) from the Apollo 11, 12, 15, and 17 mare sites was used for this study. This is the first collection of such modal data on soils from all Apollo mare sites. The abundances of free-mineral fragments in the mare soils are greater for immature and submature soils than for mature soils, largely because of the formation of agglutinitic glass as maturity progresses. In considerations of resource utilization at a lunar base, the best lunar soils to use for mineral beneficiation (i.e., most free-mineral fragments) have maturities near the immature/submature boundary (Is/FeO ≈ 30), not the mature soils with their complications due to extensive agglutination. The particle data obtained from the nine mare soils confirm the generalizations for lunar soils predicted by L. A. Taylor and D. S. McKay (1992, Lunar Planet Sci. Conf. 23rd, pp. 1411-1412 [Abstract]).

  12. Maximizing Science Return from Future Mars Missions with Onboard Image Analyses

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Bandari, E. B.; Roush, T. L.

    2000-01-01

    We have developed two new techniques to enhance science return and to decrease returned data volume for near-term Mars missions: 1) multi-spectral image compression and 2) autonomous identification and fusion of in-focus regions in an image series.
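
    The second technique, identifying in-focus regions in an image series, is commonly implemented with a local focus metric such as the variance of a Laplacian response. The sketch below is a generic illustration of that idea, not necessarily the algorithm the authors used; the function name and test images are ours.

```python
import numpy as np

def focus_measure(img):
    """Sharpness of an image as the variance of a 5-point Laplacian response.

    Sharper (in-focus) content has more high-frequency energy, hence a
    larger Laplacian variance. A standard focus metric, used here as an
    assumption, not as the paper's method.
    """
    lap = (-4.0 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return lap.var()

# Two frames of the same scene: one sharp, one defocused (simulated by
# a 5-point local average).
rng = np.random.default_rng(1)
sharp = rng.standard_normal((64, 64))
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, -1, 0)
           + np.roll(sharp, 1, 1) + np.roll(sharp, -1, 1)) / 5.0
```

Fusing a focal series then amounts to keeping, per region, the frame with the highest focus measure and downlinking only the fused result.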

  13. Detection of melamine in milk powders based on NIR hyperspectral imaging and spectral similarity analyses

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Melamine (2,4,6-triamino-1,3,5-triazine) contamination of food has become an urgent and broadly recognized topic as a result of several food safety scares in the past five years. Hyperspectral imaging techniques that combine the advantages of spectroscopy and imaging have been widely applied for a v...

  14. Edge detection and image segmentation of space scenes using fractal analyses

    NASA Technical Reports Server (NTRS)

    Cleghorn, Timothy F.; Fuller, J. J.

    1992-01-01

    A method was developed for segmenting images of space scenes into manmade and natural components, using fractal dimensions and lacunarities. Calculations of these parameters are presented. Results are presented for a variety of aerospace images, showing that it is possible to perform edge detections of manmade objects against natural background such as those seen in an aerospace environment.

  15. Applying I-FGM to image retrieval and an I-FGM system performance analyses

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Santos, Eunice E.; Nguyen, Hien; Pan, Long; Korah, John; Zhao, Qunhua; Xia, Huadong

    2007-04-01

    Intelligent Foraging, Gathering and Matching (I-FGM) combines a unique multi-agent architecture with a novel partial processing paradigm to provide a solution for real-time information retrieval in large and dynamic databases. I-FGM provides a unified framework for combining the results from various heterogeneous databases and seeks to provide easily verifiable performance guarantees. In our previous work, I-FGM had been implemented and validated with experiments on dynamic text data. However, the heterogeneity of search spaces requires that our system be able to handle various types of data effectively. Besides text, images are the most significant and fundamental data for information retrieval. In this paper, we extend the I-FGM system to incorporate images in its search spaces using a region-based Wavelet Image Retrieval algorithm called WALRUS. As we did for text retrieval, we modified the WALRUS algorithm to partially and incrementally extract the regions from an image and measure the similarity value of that image. Based on the partial results obtained, we refine the allocation of computational resources by updating the priority values of image documents. Experiments on image retrieval have been conducted with the I-FGM system, and the results show that I-FGM outperforms its control systems. We also present a theoretical analysis of the systems, with a focus on performance. Based on probability theory, we provide models and predictions of the average performance of the I-FGM system and its two control systems, as well as of the systems without partial processing.

  16. The Neglected Side of the Coin: Quantitative Benefit-risk Analyses in Medical Imaging.

    PubMed

    Zanzonico, Pat B

    2016-03-01

    While it is implicitly recognized that the benefits of diagnostic imaging far outweigh any theoretical radiogenic risks, quantitative estimates of the benefits are rarely, if ever, juxtaposed with quantitative estimates of risk. This alone - expression of benefit in purely qualitative terms versus expression of risk in quantitative, and therefore seemingly more certain, terms - may well contribute to a skewed sense of the relative benefits and risks of diagnostic imaging among healthcare providers as well as patients. The current paper, therefore, briefly compares the benefits of diagnostic imaging in several cases, based on the actual mortality or morbidity that would result if ionizing radiation were not employed, with theoretical estimates of radiogenic cancer mortality based on the "linear no-threshold" (LNT) dose-response model. PMID:26808890
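
    To illustrate the kind of quantitative juxtaposition this record advocates, here is a minimal sketch of LNT-style arithmetic. The 5%/Sv nominal risk coefficient, the 10 mSv effective dose, and the 1-in-1000 benefit figure are illustrative assumptions for the sketch, not values taken from the paper.

```python
# Illustrative LNT arithmetic only; all numbers below are assumptions.
RISK_PER_SV = 0.05   # assumed nominal lifetime cancer-mortality risk per sievert (LNT)
ct_dose_sv = 0.010   # assumed effective dose of a body CT, ~10 mSv

# Theoretical radiogenic cancer-mortality risk under LNT.
lnt_risk = RISK_PER_SV * ct_dose_sv  # 5 in 10,000

# Juxtapose with a hypothetical diagnostic benefit: suppose the scan
# reduces mortality for the presenting condition by 1 in 1,000.
benefit = 1.0 / 1000.0
benefit_to_risk = benefit / lnt_risk
```

Even with these deliberately conservative toy numbers, the benefit exceeds the theoretical risk, which is the point of expressing both sides quantitatively.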

  17. Formal Distinctiveness of High- and Low-Imageability Nouns: Analyses and Theoretical Implications

    ERIC Educational Resources Information Center

    Reilly, Jamie; Kean, Jacob

    2007-01-01

    Words associated with perceptually salient, highly imageable concepts are learned earlier in life, more accurately recalled, and more rapidly named than abstract words (R. W. Brown, 1976; Walker & Hulme, 1999). Theories accounting for this concreteness effect have focused exclusively on semantic properties of word referents. A novel possibility is…

  18. Three-dimensional imaging system for analyses of dynamic droplet impaction and deposition formation on leaves

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A system was developed to assess the dynamic processes of droplet impact, rebound and retention on leaf surfaces with three-dimensional (3-D) images. The system components consisted of a uniform-size droplet generator, two high speed digital video cameras, a constant speed track, a leaf holder, and ...

  19. Measurements and simulations analysing the noise behaviour of grating-based X-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Weber, T.; Bartl, P.; Durst, J.; Haas, W.; Michel, T.; Ritter, A.; Anton, G.

    2011-08-01

    In the last decades, phase-contrast imaging using a Talbot-Lau grating interferometer has become possible even with a low-brilliance X-ray source. With the potential of increasing soft-tissue contrast, this method is on its way into medical imaging. For this purpose, knowledge of the underlying physics of this technique is necessary. With this paper, we would like to contribute to the understanding of grating-based phase-contrast imaging by presenting results of measurements and simulations regarding the noise behaviour of the differential phases. The measurements were done using a microfocus X-ray tube with a hybrid, photon-counting, semiconductor Medipix2 detector. The additional simulations were performed by our in-house developed phase-contrast simulation tool “SPHINX”, combining both wave and particle contributions of the simulated photons. The results obtained by both of these methods show the same behaviour: increasing the number of photons leads to a linear decrease of the standard deviation of the phase, while the number of phase steps used has no influence on the standard deviation if the total number of photons is held constant. Furthermore, the probability density function (pdf) of the reconstructed differential phases was analysed. It turned out that the so-called von Mises distribution is the physically correct pdf, which was also confirmed by measurements. This information advances the understanding of grating-based phase-contrast imaging and can be used to improve image quality.
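
    For context, the differential phase analysed in this record is typically retrieved from a phase-stepping curve by first-order Fourier analysis. The sketch below is our own illustration of that standard retrieval, not the authors' code; the synthetic stepping curve and its parameters are invented.

```python
import math

def retrieve_phase(intensities):
    """Recover the differential phase from one phase-stepping curve.

    With M equidistant grating steps over one period, the detected
    intensity is I_k = a0 + a1*cos(2*pi*k/M + phi); phi is the argument
    of the first Fourier coefficient of the curve.
    """
    m = len(intensities)
    re = sum(i * math.cos(2 * math.pi * k / m) for k, i in enumerate(intensities))
    im = sum(i * math.sin(2 * math.pi * k / m) for k, i in enumerate(intensities))
    return math.atan2(-im, re)

# Synthesize an 8-step curve with a known phase and recover it.
true_phi = 0.7
steps = [100 + 40 * math.cos(2 * math.pi * k / 8 + true_phi) for k in range(8)]
phi = retrieve_phase(steps)
```

Repeating this retrieval over noisy realizations of `steps` is one way to study how the standard deviation of `phi` scales with photon count and number of steps, the quantities examined in the paper.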

  20. Optimizing Laguerre expansion based deconvolution methods for analysing bi-exponential fluorescence lifetime images.

    PubMed

    Zhang, Yongliang; Chen, Yu; Li, David Day-Uei

    2016-06-27

    Fast deconvolution is an essential step to calibrate instrument responses in large-scale fluorescence lifetime imaging microscopy (FLIM) image analysis. This paper examined a computationally efficient least squares deconvolution method based on Laguerre expansion (LSD-LE), recently developed for clinical diagnosis applications, and proposed new criteria for selecting Laguerre basis functions (LBFs) without considering the mutual orthonormalities between LBFs. Compared with the previously reported LSD-LE, the improved LSD-LE allows the use of a higher laser repetition rate, reducing the acquisition time per measurement. Moreover, we extended it, for the first time, to analyze bi-exponential fluorescence decays for more general FLIM-FRET applications. The proposed method was tested on both synthesized bi-exponential and realistic FLIM data for studying the endocytosis of gold nanorods in Hek293 cells. Compared with the previously reported constrained LSD-LE, it shows promising results. PMID:27410552

  1. Three-Dimensional Acoustic Tissue Model: A Computational Tissue Phantom for Image Analyses

    NASA Astrophysics Data System (ADS)

    Mamou, J.; Oelze, M. L.; O'Brien, W. D.; Zachary, J. F.

    A novel methodology to obtain three-dimensional (3D) acoustic tissue models (3DATMs) is introduced. 3DATMs can be used as computational tools for ultrasonic imaging algorithm development and analysis. In particular, 3D models of biological structures can provide great benefit to better understand fundamentally how ultrasonic waves interact with biological materials. As an example, such models were used to generate ultrasonic images that characterize tumor tissue microstructures. 3DATMs can be used to evaluate a variety of tissue types. Typically, excised tissue is fixed, embedded, serially sectioned, and stained. The stained sections are digitally imaged (24-bit bitmap) with light microscopy. Contrast of each stained section is equalized and an automated registration algorithm aligns consecutive sections. The normalized mutual information is used as a similarity measure, and simplex optimization is conducted to find the best alignment. Both rigid and non-rigid registrations are performed. During tissue preparation, some sections are generally lost; thus, interpolation prior to 3D reconstruction is performed. Interpolation is conducted after registration using cubic Hermite polynomials. The registered (and interpolated) sections yield a 3D histologic volume (3DHV). Acoustic properties are then assigned to each tissue constituent of the 3DHV to obtain the 3DATMs. As an example, a 3D acoustic impedance tissue model (3DZM) was obtained for a solid breast tumor (EHS mouse sarcoma) and used to estimate ultrasonic scatterer size. The 3DZM results yielded an effective scatterer size of 32.9 (±6.1) μm. Ultrasonic backscatter measurements conducted on the same tumor tissue in vivo yielded an effective scatterer size of 33 (±8) μm. This good agreement shows that 3DATMs may be a powerful modeling tool for acoustic imaging applications.

  2. Functional connectivity analyses in imaging genetics: considerations on methods and data interpretation.

    PubMed

    Bedenbender, Johannes; Paulus, Frieder M; Krach, Sören; Pyka, Martin; Sommer, Jens; Krug, Axel; Witt, Stephanie H; Rietschel, Marcella; Laneri, Davide; Kircher, Tilo; Jansen, Andreas

    2011-01-01

    Functional magnetic resonance imaging (fMRI) can be combined with genotype assessment to identify brain systems that mediate genetic vulnerability to mental disorders ("imaging genetics"). A data analysis approach that is widely applied is "functional connectivity". In this approach, the temporal correlation between the fMRI signal from a pre-defined brain region (the so-called "seed point") and other brain voxels is determined. In this technical note, we show how the choice of freely selectable data analysis parameters strongly influences the assessment of the genetic modulation of connectivity features. As examples, we focus on three methodological parameters: (i) seed voxel selection, (ii) noise reduction algorithms, and (iii) use of additional second level covariates. Our results show that even small variations in the implementation of a functional connectivity analysis can have an impact on the connectivity pattern that is as strong as the potential modulation by genetic allele variants. Some effects of genetic variation can only be found for one specific implementation of the connectivity analysis. A recurring difficulty in the field of psychiatric genetics is the non-replication of initially promising findings, partly caused by the small effects of single genes. The replication of imaging genetics results is therefore crucial for the long-term assessment of genetic effects on neural connectivity parameters. For a meaningful comparison of imaging genetics studies, however, it is necessary to provide more details on specific methodological parameters (e.g., seed voxel distribution) and to report how robust effects are across the choice of methodological parameters. PMID:22220190

  3. Hyperspectral and Chlorophyll Fluorescence Imaging to Analyse the Impact of Fusarium culmorum on the Photosynthetic Integrity of Infected Wheat Ears

    PubMed Central

    Bauriegel, Elke; Giebel, Antje; Herppich, Werner B.

    2011-01-01

    Head blight on wheat, caused by Fusarium spp., is a serious problem for both farmers and food production due to the concomitant production of highly toxic mycotoxins in infected cereals. For selective mycotoxin analyses, information about the on-field status of infestation would be helpful. Early symptom detection directly on ears, together with the corresponding geographic position, would be important for selective harvesting. Hence, the capabilities of various digital imaging methods to detect head blight disease on winter wheat were tested. Time series of images of healthy and artificially Fusarium-infected ears were recorded with a laboratory hyperspectral imaging system (wavelength range: 400 nm to 1,000 nm). Disease-specific spectral signatures were evaluated with imaging software. Applying the ‘Spectral Angle Mapper’ method, healthy and infected ear tissue could be clearly classified. Simultaneously, chlorophyll fluorescence imaging of healthy and infected ears, and visual rating of the severity of disease, was performed. Between six and eleven days after artificial inoculation, photosynthetic efficiency of infected ears decreased compared to healthy ones. The severity of disease correlated highly with photosynthetic efficiency. Above an infection limit of 5% severity of disease, chlorophyll fluorescence imaging reliably recognised infected ears. With this technique, differentiation of the severity of disease was successful in steps of 10%. Depending on the quality of the chosen regions of interest, hyperspectral imaging readily detects head blight 7 d after inoculation up to a severity of disease of 50%. After the beginning of ripening, healthy and diseased ears were hardly distinguishable with the evaluated methods. PMID:22163820

  4. Use of Very High-Resolution Airborne Images to Analyse 3d Canopy Architecture of a Vineyard

    NASA Astrophysics Data System (ADS)

    Burgos, S.; Mota, M.; Noll, D.; Cannelle, B.

    2015-08-01

    Differentiating between green cover and grape canopy is a challenge for vigour status evaluation in viticulture. This paper presents the acquisition methodology for very high-resolution images (4 cm), using a Sensefly Swinglet CAM unmanned aerial vehicle (UAV), and their processing to construct a 3D digital surface model (DSM) for the creation of precise digital terrain models (DTM). The DTM was obtained using python processing libraries. The DTM was then subtracted from the DSM in order to obtain a differential digital model (DDM) of the vineyard. In the DDM, vine pixels were obtained by selecting all pixels with an elevation higher than 50 [cm] above ground level. The results show that it was possible to separate pixels of the green cover from those of the vine rows. The DDM showed values between -0.1 and +1.5 [m]. A manual delineation of polygons in the RGB image belonging to the green cover and to the vine rows gave highly significant differences, with average values of 1.23 [m] and 0.08 [m] for the vine and the ground respectively. The vine row elevation is in good accordance with the topping height of the vines, 1.35 [m], measured in the field. This mask could be used to analyse images of the same plot taken at different times. The extraction of only vine pixels will facilitate subsequent analyses, for example, a supervised classification of these pixels.
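The height-masking step described above (DDM = DSM − DTM, then keeping pixels more than 50 cm above ground) reduces to a subtraction and a threshold. A sketch under the assumption that both models are co-registered numpy arrays in metres (function name is illustrative):

```python
import numpy as np

def vine_mask(dsm, dtm, height_threshold=0.5):
    """Return the differential digital model (DDM = DSM - DTM) and a
    boolean mask of pixels more than height_threshold metres above
    ground, i.e. candidate vine-canopy pixels."""
    ddm = np.asarray(dsm, dtype=float) - np.asarray(dtm, dtype=float)
    return ddm, ddm > height_threshold
```

The mask can then be applied to any later acquisition of the same plot to restrict analyses to vine pixels.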

  5. Contextualising and Analysing Planetary Rover Image Products through the Web-Based PRoGIS

    NASA Astrophysics Data System (ADS)

    Morley, Jeremy; Sprinks, James; Muller, Jan-Peter; Tao, Yu; Paar, Gerhard; Huber, Ben; Bauer, Arnold; Willner, Konrad; Traxler, Christoph; Garov, Andrey; Karachevtseva, Irina

    2014-05-01

    The international planetary science community has launched, landed and operated dozens of human and robotic missions to the planets and the Moon. They have collected various surface imagery that has only been partially utilized for further scientific purposes. The FP7 project PRoViDE (Planetary Robotics Vision Data Exploitation) is assembling a major portion of the imaging data gathered so far from planetary surface missions into a unique database, bringing them into a spatial context and providing access to a complete set of 3D vision products. Processing is complemented by a multi-resolution visualization engine that combines various levels of detail for seamless and immersive real-time access to dynamically rendered 3D scenes. PRoViDE aims to (1) complete relevant 3D vision processing of planetary surface missions, such as Surveyor, Viking, Pathfinder, MER, MSL, Phoenix, Huygens, and Lunar ground-level imagery from Apollo, Russian Lunokhod and selected Luna missions, (2) provide highest-resolution and highest-accuracy remote sensing (orbital) vision data processing results for these sites to embed the robotic imagery and its products into spatial planetary context, (3) collect 3D vision processing and remote sensing products within a single coherent spatial database, (4) realise seamless fusion between orbital and ground vision data, (5) demonstrate the potential of planetary surface vision data by maximising image quality visualisation in a 3D publishing platform, (6) collect and formulate use cases for novel scientific application scenarios exploiting the newly introduced spatial relationships and presentation, (7) demonstrate the concepts for MSL, and (8) realize on-line dissemination of key data and its presentation by a web-based GIS and rendering tool named PRoGIS (Planetary Robotics GIS). PRoGIS is designed to give access to rover image archives in geographical context, using projected image view cones, obtained from existing meta-data and updated according to

  6. Normal development of the tomato clownfish Amphiprion frenatus: live imaging and in situ hybridization analyses of mesodermal and neurectodermal development.

    PubMed

    Ghosh, J; Wilson, R W; Kudoh, T

    2009-12-01

    The normal embryonic development of the tomato clownfish Amphiprion frenatus was analysed using live imaging and by in situ hybridization for detection of mesodermal and neurectodermal development. Both morphology of live embryos and tissue-specific staining revealed significant differences in the gross developmental programme of A. frenatus compared with better-known teleost fish models, in particular, initiation of somitogenesis before complete epiboly, initiation of narrowing of the neurectoderm (neurulation) before somitogenesis, relatively early pigmentation of melanophores at the 10-15 somite stage and a distinctive pattern of melanophore distribution. These results suggest evolutionary adaptability of the teleost developmental programme. The ease of obtaining eggs, in vitro culture of the embryo, in situ staining analyses and these reported characteristics make A. frenatus a potentially important model marine fish species for studying embryonic development, physiology, ecology and evolution. PMID:20738687

  7. Capabilities and Impact on Wind Analyses of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Miller, Timothy L.; Amarin, Ruba; Atlas, Robert; Bailey, M. C.; Black, Peter; Buckley, Courtney; James, Mark; Johnson, James; Jones, Linwood; Ruf, Christopher; Simmons, David; Uhlhorn, Eric

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. The instrument is being test flown in January and is expected to participate in or collaborate with the tropical cyclone experiment GRIP (Genesis and Rapid Intensification Processes) in the 2010 season. HIRAD is designed to study the wind field in some detail within strong hurricanes and to enhance the real-time airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track at a single point directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approx. 3x the aircraft altitude) with approx. 2 km resolution. See Figure 1, which depicts a simulated HIRAD swath versus the line of data obtained by SFMR.

  8. Color Mosaics and Multispectral Analyses of Mars Reconnaissance Orbit Mars Color Imager (MARCI) Observations

    NASA Astrophysics Data System (ADS)

    Bell, J. F.; Anderson, R. B.; Kressler, K.; Wolff, M. J.; Cantor, B.; Science; Operations Teams, M.

    2008-12-01

    The Mars Color Imager (MARCI) on the Mars Reconnaissance Orbiter (MRO) spacecraft is a wide-angle, multispectral Charge-Coupled Device (CCD) "push-frame" imaging camera designed to provide frequent, synoptic-scale imaging of Martian atmospheric and surface features and phenomena. MARCI uses a 1024x1024 pixel interline transfer CCD detector that has seven narrowband interference filters bonded directly to the CCD. Five of the filters are in the visible to short-wave near-IR wavelength range (MARCI-VIS: 437, 546, 604, 653, and 718 nm) and two are in the UV (MARCI-UV: 258 and 320 nm). During the MRO primary mission (November 2006 through November 2008), the instrument has acquired data swaths on the dayside of the planet, at an equator-crossing local solar time of about 3:00 p.m. We are analyzing the MARCI-VIS multispectral imaging data from the MRO primary mission in order to investigate (a) color variations in the surface and their potential relationship to variations in iron mineralogy; and (b) the time variability of surface albedo features at the approx. 1 km/pixel scale typical of MARCI nadir-pointed observations. Raw MARCI images were calibrated to radiance factor (I/F) using pre-flight and in-flight calibration files and a pipeline calibration process developed by the science team. We are using these calibrated MARCI files to generate map-projected mosaics of each of the 30 USGS standard quadrangles on Mars in each of the five MARCI-VIS bands. Our mosaicking software searches the MARCI data set to identify files that match a user-defined set of limits such as latitude, longitude, Ls, incidence angle, emission angle, and year. Each of the files matching the desired criteria is then map-projected and inserted in series into an output mosaic covering the desired lat/lon range. 
In cases of redundant coverage of the same pixels by different files, the user can set the program to use the pixel with the lowest I/F value for each individual MARCI-VIS band, thus
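The redundant-coverage rule described above, keeping the lowest I/F value per pixel and band, can be sketched with numpy's `fmin` ufunc, which ignores a NaN in either operand; the function name and NaN-as-no-coverage convention are illustrative assumptions, not the team's pipeline:

```python
import numpy as np

def min_if_composite(swaths):
    """Composite overlapping map-projected swaths by keeping, per pixel,
    the lowest I/F value. np.fmin ignores NaN (no-coverage) values, so
    NaN survives only where no swath covers the pixel."""
    stack = np.stack([np.asarray(s, dtype=float) for s in swaths])
    return np.fmin.reduce(stack, axis=0)
```

Selecting the lowest I/F tends to suppress transient bright features such as clouds in the composite.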

  9. X-ray fluorescence and imaging analyses of paintings by the Brazilian artist Oscar Pereira Da Silva

    NASA Astrophysics Data System (ADS)

    Campos, P. H. O. V.; Kajiya, E. A. M.; Rizzutto, M. A.; Neiva, A. C.; Pinto, H. P. F.; Almeida, P. A. D.

    2014-02-01

    Non-destructive analyses, such as EDXRF (Energy-Dispersive X-Ray Fluorescence) spectroscopy, and imaging were used to characterize easel paintings. The analyzed objects are from the collection of the Pinacoteca do Estado de São Paulo. EDXRF results allowed us to identify the chemical elements present in the pigments, showing the use of many Fe-based pigments, modern pigments, such as cobalt blue and cadmium yellow, as well as white pigments containing lead and zinc used by the artist in different layers. Imaging analysis was useful to identify the state of conservation, the localization of old and new restorations and also to detect and unveil the underlying drawings revealing the artist's creative processes.

  10. Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems

    PubMed Central

    Teodoro, George; Kurc, Tahsin M.; Pan, Tony; Cooper, Lee A.D.; Kong, Jun; Widener, Patrick; Saltz, Joel H.

    2014-01-01

    The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches. PMID:25419545

  11. Emotion Estimation Algorithm from Facial Image Analyses of e-Learning Users

    NASA Astrophysics Data System (ADS)

    Shigeta, Ayuko; Koike, Takeshi; Kurokawa, Tomoya; Nosu, Kiyoshi

    This paper proposes an emotion estimation algorithm based on e-Learning users' facial images. The algorithm's characteristics are as follows. The criteria used to relate an e-Learning user's emotion to a representative emotion were obtained from time-sequential analysis of users' facial expressions. By examining the emotions of the e-Learning users and the positional changes of the facial feature points in the experimental results, the following procedures are introduced to improve estimation reliability: (1) effective feature points are chosen for the emotion estimation; (2) subjects are divided into two groups by the change rates of the facial feature points; (3) eigenvectors of the variance-covariance matrices are selected (cumulative contribution rate >= 95%); (4) the emotion is calculated using the Mahalanobis distance.
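Step (4), classification by Mahalanobis distance, weights each feature direction by the class covariance estimated in step (3), so elongated feature clouds are handled correctly. A minimal numpy sketch with hypothetical emotion labels and per-class statistics (not the authors' trained model):

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of feature vector x from a class described
    by its mean vector and variance-covariance matrix."""
    diff = np.asarray(x, dtype=float) - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def nearest_emotion(x, class_stats):
    """Pick the emotion whose (mean, cov) pair gives the smallest
    Mahalanobis distance to the observed feature vector."""
    return min(class_stats, key=lambda e: mahalanobis(x, *class_stats[e]))
```

In practice the (mean, cov) pairs would be estimated per emotion class from the PCA-reduced feature-point displacements.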

  12. Crustal diversity of the Moon: Compositional analyses of Galileo solid state imaging data

    NASA Astrophysics Data System (ADS)

    Pieters, C. M.; Sunshine, J. M.; Fischer, E. M.; Murchie, S. L.; Belton, M.; McEwen, A.; Gaddis, L.; Greeley, R.; Neukum, G.; Jaumann, R.; Hoffmann, H.

    1993-09-01

    The multispectral images of the lunar limb and farside obtained by the solid state imaging (SSI) system on board the Galileo spacecraft provide the first new pulse of compositional data of the Moon by a spacecraft in well over a decade. The wavelength range covered by SSI filters (0.4-1.0 μm) is particularly sensitive to the composition of mare basalts, the abundance of mafic (ferrous) minerals, and the maturity of the regolith. To a first order, the limb and farside material is consistent with previous characterization of nearside lunar spectral types for mare and highland soils and craters. Most basalts are of an intermediate TiO2 composition and most of the highland crust is feldspathic with local variations in mafic content identified principally at impact craters. Dark mantling material on the farside can be interpreted in terms of known properties of lunar pyroclastic glass. Regions of cryptomare are shown to have spectral properties intermediate between those of highland and mare soils, as would be expected from mixture of the two. There are several important exceptions and surprises, however. Unlike the basalt types identified on the nearside, limb and farside basalts exhibit an exceptionally weak 1 μm ferrous absorption band. This may indicate a compositionally distinct lunar basalt group that, for example, is more Mg-rich than most basalts of the nearside. Some of the most notable compositional anomalies are associated with South Pole-Aitken Basin. This large region has a much lower albedo than surrounding highlands. The inner, darkest, portion of the basin exhibits optical properties indistinguishable from low-Ti basalts. Deposits to the south exhibit unique properties with a strong and broad ferrous 1 μm absorption, most consistent with abundant olivine. The unusual compositions associated with South Pole-Aitken and their spatial extent suggest the impact creating this huge lunar basin excavated mafic-rich lower crust or perhaps mantle material.

  13. Statistical Improvements in Functional Magnetic Resonance Imaging Analyses Produced by Censoring High-Motion Data Points

    PubMed Central

    Siegel, Joshua S.; Power, Jonathan D.; Dubis, Joseph W.; Vogel, Alecia C.; Church, Jessica A.; Schlaggar, Bradley L.; Petersen, Steven E.

    2013-01-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring (“motion scrubbing”). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. PMID:23861343
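Motion censoring as described amounts to flagging volumes whose frame-to-frame realignment change exceeds a threshold and withholding them from GLM estimation. A minimal sketch (the 0.9 threshold and the simple sum-of-absolute-differences displacement are illustrative assumptions, not the authors' exact criterion):

```python
import numpy as np

def framewise_displacement(realign):
    """Per-volume head motion: sum of absolute backward differences of
    the six realignment parameters (rotations assumed already converted
    to mm); the first volume gets 0 by convention."""
    fd = np.abs(np.diff(np.asarray(realign, dtype=float), axis=0)).sum(axis=1)
    return np.r_[0.0, fd]

def censor_mask(realign, threshold=0.9):
    """True for volumes kept in the GLM, False for censored volumes."""
    return framewise_displacement(realign) <= threshold
```

The boolean mask is then used to drop the corresponding rows of the design matrix and data before fitting the GLM.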

  14. A 10 year intercomparison between collocated Special Sensor Microwave Imager oceanic surface wind speed retrievals and global analyses

    NASA Astrophysics Data System (ADS)

    Meissner, T.; Smith, D.; Wentz, F.

    2001-06-01

    To evaluate the scalar ocean surface wind speeds obtained from the Special Sensor Microwave Imager (SSM/I), we compare them over the time period from July 1987 through December 1997 with those from two global analyses: the National Center for Environmental Prediction (NCEP)/National Center for Atmospheric Research (NCAR) Annual Reanalysis and the European Center for Medium-Range Weather Forecasts (ECMWF)/Tropical Ocean-Global Atmosphere Global Surface Analysis. We perform a statistical analysis for the whole globe and present time series analyses for selected geographical regions in connection with collocated wind speed difference maps. In order to further evaluate geographical biases observed in the SSM/I versus analyses comparisons, we use wind speeds from the NASA scatterometer (NSCAT) for the 10 month period from September 1996 through June 1997 as a third data source. The standard deviation of all collocated SSM/I - ECMWF wind speed differences is 2.1 m s-1, and that of all collocated SSM/I - NCEP/NCAR reanalysis wind speed differences is 2.4 m s-1. When taking monthly or yearly averages in each pixel, which has the effect of cancelling out small timescale wind speed fluctuations, the values are between 0.8 and 1.2 m s-1. Global biases range between -0.05 and +0.55 m s-1 for the various SSM/I satellites. Our analysis allows us to identify regional biases for both the SSM/I and analyses winds. The NCEP/NCAR reanalysis wind speeds appear underestimated in the tropical Pacific and tropical Atlantic. ECMWF wind speeds appear underestimated near the southern Pacific islands NE of Australia. The analyses wind speeds are higher than the SSM/I wind speeds near the Argentinean coast. The SSM/I wind speeds appear high in the extratropical central and eastern Pacific and low in certain coastal regions with eastern boundary currents and in the Arabian Sea. The sizes of some of these biases are seasonally dependent.
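The collocation statistics quoted above, the global bias and standard deviation of SSM/I-minus-analysis wind speed differences, reduce to simple moments of the difference field; a sketch assuming the two data sets are already collocated arrays in m s-1 (the function name is illustrative):

```python
import numpy as np

def collocation_stats(sat, analysis):
    """Bias (satellite minus analysis) and sample standard deviation of
    collocated wind speed differences, in the input units (m/s)."""
    d = np.asarray(sat, dtype=float) - np.asarray(analysis, dtype=float)
    return float(d.mean()), float(d.std(ddof=1))
```

Applying the same statistic to monthly or yearly pixel averages, rather than individual collocations, suppresses short-timescale fluctuations, which is why those values are smaller.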

  15. Structure and clay mineralogy: borehole images, log interpretation and sample analyses at Site C0002 Nankai Trough accretionary prism

    NASA Astrophysics Data System (ADS)

    Jurado, Maria Jose; Schleicher, Anja

    2015-04-01

    Our research focused on the characterization of fracture and fault structures in the deep Nankai Trough accretionary prism in Japan. Logging data and cuttings samples from the two most recent International Ocean Discovery Program (IODP) Expeditions 338 and 348 of the NanTroSEIZE project were analyzed using Logging While Drilling (LWD) oriented images, geophysical logs and clay mineralogy. Both expeditions took place at Site C0002, but whereas Hole C0002F (Expedition 338) was drilled down to 2004.5 mbsf, Holes C0002N and C0002P (Expedition 348) reached depths of 2325.5 mbsf and 3058.8 mbsf respectively. The structural interpretation of borehole imaging data illustrates the deformation within the fractured and faulted sections of the accretionary prism. All drill holes show distinct areas of intense fracturing and faulting within a very clay-dominated lithology. Here, smectite and illite are the most common clay minerals, but their properties and the role they may play in influencing the fractures, faults and folds in the accretionary prism are still not well understood. When comparing clay mineralogy and fracture/fault areas in Hole C0002F (Expedition 338), a trend in the abundance of illite and smectite, and in particular in the swelling behavior of smectite, is recognizable. In general, the log data correlated well with the actual mineralogy and the relative abundance of clay. Ongoing post-cruise preliminary research on Holes C0002N and C0002P (Expedition 348) should confirm these results. The relationship between fracture and fault structures and changes in clay mineralogy could be explained by deformation of specific areas with different compaction features and fluid-rock interaction processes, but could also be related to incipient diagenetic processes associated with depth. 
Our results show that the integration of logging data and cuttings sample analyses is a valuable tool for characterization of petrophysical and mineralogical changes of the structures of the

  16. Analyses of Magnetic Resonance Imaging of Cerebrospinal Fluid Dynamics Pre and Post Short and Long-Duration Space Flights

    NASA Technical Reports Server (NTRS)

    Alperin, Noam; Barr, Yael; Lee, Sang H.; Mason, Sara; Bagci, Ahmet M.

    2015-01-01

    Preliminary results are based on analyses of data from 17 crewmembers. The initial analysis compares pre- to post-flight changes in total cerebral blood flow (CBF) and craniospinal CSF flow volume. Total CBF is obtained by summation of the mean flow rates through the 4 blood vessels supplying the brain (right and left internal carotid and vertebral arteries). Volumetric flow rates were obtained using an automated lumen segmentation technique shown to have 3- to 4-fold improved reproducibility and accuracy over manual lumen segmentation (6). Two cohorts, 5 short-duration and 8 long-duration crewmembers, who were scanned within 3 to 8 days post landing, were included (4 short-duration crewmembers with MRI scans occurring beyond 10 days post flight were excluded). The VIIP Clinical Practice Guideline (CPG) classification is being used initially as a measure of VIIP syndrome severity. Median CPG scores of the short- and long-duration cohorts were similar (both 2). Mean preflight total CBF values for the short- and long-duration cohorts were similar, 863+/-144 and 747+/-119 mL/min, respectively. Percentage CBF changes for all short-duration crewmembers were 11% or lower, within the range of normal physiological fluctuations in healthy individuals. In contrast, in 4 of the 8 long-duration crewmembers, the change in CBF exceeded the range of normal physiological fluctuation. In 3 of the 4 subjects an increase in CBF was measured. Large pre- to post-flight changes in the craniospinal CSF flow volume were found in 6 of the 8 long-duration crewmembers. Box-Whisker plots of the CPG and the percent CBF and CSF flow changes for the two cohorts are shown in Figure 4. Examples of CSF flow waveforms for a short- and two long-duration crewmembers (CPG 0 and 3) are shown in Figure 5. Changes in CBF and CSF flow dynamics larger than normal physiological fluctuations were observed in the long-duration crewmembers. Changes in CSF flow were more pronounced than changes in CBF. 
Decreased CSF flow dynamics were observed

  17. Coupling MODIS images and agrometeorological data for agricultural water productivity analyses in the Mato Grosso State, Brazil

    NASA Astrophysics Data System (ADS)

    de C. Teixeira, Antônio H.; Victoria, Daniel C.; Andrade, Ricardo G.; Leivas, Janice F.; Bolfe, Edson L.; Cruz, Caroline R.

    2014-10-01

    Mato Grosso state, Central West Brazil, stands out for its grain production, mainly soybean and corn, as first (November-March) and second (April-August) harvest crops, respectively. For water productivity (WP) analyses, MODIS products were used together with a net of weather stations. Evapotranspiration (ET) and biomass production (BIO) were acquired during the year 2012, and WP was taken as the ratio of BIO to ET. The SAFER (Simple Algorithm For Evapotranspiration Retrieving) model for ET and the Monteith radiation model for BIO were applied together, using a mask which separated the crops from other surface types. For the first harvest crop, ET, BIO and WP values above those for other surface types occurred only from November to January, with incremental values reaching 1.2 mm day-1, 67 kg ha-1 day-1 and 0.7 kg m-3, respectively; for the second harvest crops they occurred between March and May, with incremental values reaching 0.5 mm day-1, 27 kg ha-1 day-1 and 0.3 kg m-3, respectively. In both cases, during the growing seasons, the highest WP parameters in cropped areas corresponded, in general, to the blooming to grain-filling transition. For the corn crop, whose cultivated area is increasing in the Brazilian Central West region, crop water productivity (CWP), the ratio of yield to the amount of water consumed, was analyzed in the main growing regions: North, Southeast and Northeast. Southeast presented the highest annual pixel averages for ET, BIO and CWP (1.7 mm day-1, 78 kg ha-1 day-1 and 2.2 kg m-3, respectively), while Northeast presented the lowest ones (1.2 mm day-1, 52 kg ha-1 day-1 and 1.9 kg m-3). Through a soil moisture indicator, the ratio of precipitation (P) to ET, it was noted that rainfall was sufficient for a good grain yield, with P/ET lower than 1.00 only outside the crop growing seasons. The combination of MODIS images and weather stations proved to be useful for monitoring
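Both WP = BIO/ET and the soil-moisture indicator P/ET are pixel-wise ratios of gridded products. A minimal sketch that masks zero-ET pixels with NaN (the helper and function names are illustrative assumptions; output units follow the inputs, e.g. kg ha-1 day-1 over mm day-1):

```python
import numpy as np

def pixel_ratio(num, den):
    """Pixel-wise ratio of two gridded products, NaN where the
    denominator is zero."""
    num = np.asarray(num, dtype=float)
    den = np.asarray(den, dtype=float)
    out = np.full(num.shape, np.nan)
    np.divide(num, den, out=out, where=den != 0)
    return out

def water_productivity(bio, et):
    """WP = BIO / ET per pixel."""
    return pixel_ratio(bio, et)

def moisture_indicator(precip, et):
    """P / ET; values below 1 flag periods when rainfall does not meet
    the evapotranspiration demand."""
    return pixel_ratio(precip, et)
```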

  18. Comparison of in vitro breast cancer visibility in analyser-based computed tomography with histopathology, mammography, computed tomography and magnetic resonance imaging.

    PubMed

    Keyriläinen, Jani; Fernández, Manuel; Bravin, Alberto; Karjalainen-Lindsberg, Marja Liisa; Leidenius, Marjut; von Smitten, Karl; Tenhunen, Mikko; Kangasmäki, Aki; Sipilä, Petri; Nemoz, Christian; Virkkunen, Pekka; Suortti, Pekka

    2011-09-01

    High-resolution analyser-based X-ray imaging computed tomography (HR ABI-CT) findings on in vitro human breast cancer are compared with histopathology, mammography, computed tomography (CT) and magnetic resonance imaging. The HR ABI-CT images provided significantly better low-contrast visibility compared with the standard radiological images. Fine cancer structures indistinguishable and superimposed in mammograms were seen, and could be matched with the histopathological results. The mean glandular dose was less than 1 mGy in mammography and 12-13 mGy in CT and ABI-CT. The excellent visibility of in vitro breast cancer suggests that HR ABI-CT may have a valuable role in the future as an adjunct or even alternative to current breast diagnostics, when radiation dose is further decreased, and compact synchrotron radiation sources become available. PMID:21862846

  19. VIDEO IMAGE ANALYSES OF THE CROSS-STREAM DISTRIBUTION OF SMOKE IN THE NEAR WAKE OF A BUILDING

    EPA Science Inventory

    In a wind-tunnel study, recorded video images of the top view of smoke dispersion in the wake of a building were analyzed. A continuous source of smoke was emitted at floor level, midway along the leeward side of the building. The technique and usefulness of analyzing video image...

  20. Comparison of genetic-algorithm and emissivity-ratio analyses of image data from OMEGA implosion cores.

    PubMed

    Nagayama, T; Mancini, R C; Florido, R; Tommasini, R; Koch, J A; Delettrez, J A; Regan, S P; Smalyuk, V A; Welser-Sherrill, L A; Golovkin, I E

    2008-10-01

    Detailed analysis of x-ray narrow-band images from argon-doped deuterium-filled inertial confinement fusion implosion experiments yields information about the temperature spatial structure in the core at the collapse of the implosion. We discuss the analysis of direct-drive implosion experiments at OMEGA, in which multiple narrow-band images were recorded with a multimonochromatic x-ray imaging instrument. The temperature spatial structure is investigated by using the sensitivity of the Ly beta/He beta line emissivity ratio to the temperature. Three analysis methods that consider the argon He beta and Ly beta image data are discussed and the results compared. The methods are based on a ratio of image intensities, ratio of Abel-inverted emissivities, and a search and reconstruction technique driven by a Pareto genetic algorithm. PMID:19044576
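The simplest of the three methods described above, the ratio of image intensities, divides the Ly-beta image by the He-beta image pixel by pixel to map the temperature-sensitive line ratio. A hedged sketch with an illustrative noise floor for masking weak He-beta signal (not the authors' processing chain):

```python
import numpy as np

def line_ratio_map(lyb, heb, floor=1e-3):
    """Per-pixel Ly-beta / He-beta intensity ratio; pixels with He-beta
    signal at or below `floor` are masked with NaN to avoid amplifying
    noise in the ratio."""
    lyb = np.asarray(lyb, dtype=float)
    heb = np.asarray(heb, dtype=float)
    out = np.full(lyb.shape, np.nan)
    np.divide(lyb, heb, out=out, where=heb > floor)
    return out
```

The resulting ratio map would then be converted to temperature through the emissivity-ratio-versus-temperature relation from the atomic kinetics model.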

  2. Rapid specimen preparation to improve the throughput of electron microscopic volume imaging for three-dimensional analyses of subcellular ultrastructures with serial block-face scanning electron microscopy.

    PubMed

    Thai, Truc Quynh; Nguyen, Huy Bang; Saitoh, Sei; Wu, Bao; Saitoh, Yurika; Shimo, Satoshi; Elewa, Yaser Hosny Ali; Ichii, Osamu; Kon, Yasuhiro; Takaki, Takashi; Joh, Kensuke; Ohno, Nobuhiko

    2016-09-01

    Serial block-face imaging using scanning electron microscopy enables rapid observations of three-dimensional ultrastructures in a large volume of biological specimens. However, such imaging usually requires days for sample preparation to reduce charging and increase image contrast. In this study, we report a rapid procedure to acquire serial electron microscopic images within 1 day for three-dimensional analyses of subcellular ultrastructures. This procedure is based on serial block-face with two major modifications, including a new sample treatment device and direct polymerization on the rivets, to reduce the time and workload needed. The modified procedure without uranyl acetate can produce tens of embedded samples observable under serial block-face scanning electron microscopy within 1 day. The serial images obtained are similar to the block-face images acquired by common procedures, and are applicable to three-dimensional reconstructions at a subcellular resolution. Using this approach, regional immune deposits and the double contour or heterogeneous thinning of basement membranes were observed in the glomerular capillary loops of an autoimmune nephropathy model. These modifications provide options to improve the throughput of three-dimensional electron microscopic examinations, and will ultimately be beneficial for the wider application of volume imaging in life science and clinical medicine. PMID:26867664

  3. Tract-Specific Analyses of Diffusion Tensor Imaging Show Widespread White Matter Compromise in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Shukla, Dinesh K.; Keehn, Brandon; Muller, Ralph-Axel

    2011-01-01

    Background: Previous diffusion tensor imaging (DTI) studies have shown white matter compromise in children and adults with autism spectrum disorder (ASD), which may relate to reduced connectivity and impaired function of distributed networks. However, tract-specific evidence remains limited in ASD. We applied tract-based spatial statistics (TBSS)…

  4. Quantifying the complexity of excised larynx vibrations from high-speed imaging using spatiotemporal and nonlinear dynamic analyses

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Jiang, Jack J.; Tao, Chao; Bieging, Erik; MacCallum, Julia K.

    2007-12-01

    In this paper, we investigate the biomechanical applications of spatiotemporal analysis and nonlinear dynamic analysis to quantitatively describe regular and irregular vibrations of twelve excised larynges from high-speed image recordings. Regular vibrations show simple spatial symmetry, temporal periodicity, and discrete frequency spectra, while irregular vibrations show complex spatiotemporal plots, aperiodic time series, and broadband spectra. Furthermore, the global entropy and correlation length from spatiotemporal analysis and the correlation dimension from nonlinear dynamic analysis reveal a statistical difference between regular and irregular vibrations. In comparison with regular vibrations, the global entropy and correlation dimension of irregular vibrations are statistically higher, while the correlation length is significantly lower. These findings show that spatiotemporal analysis and nonlinear dynamic analysis are capable of describing the complex dynamics of vocal fold vibrations from high-speed imaging and may potentially be helpful for understanding disordered behaviors in biomedical laryngeal systems.
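For context, the correlation dimension used above is commonly estimated with the Grassberger-Procaccia algorithm: delay-embed the scalar signal, compute the correlation sum C(r), and fit the slope of log C(r) versus log r. A minimal NumPy sketch under illustrative parameter choices (not the authors' implementation):

```python
import numpy as np

def correlation_dimension(series, dim=3, tau=1, n_radii=10):
    """Grassberger-Procaccia estimate of the correlation dimension.

    series: scalar time series (e.g. a glottal-area waveform extracted
    from high-speed images); dim/tau: embedding dimension and delay.
    """
    # Time-delay embedding of the scalar series into `dim` dimensions.
    n = len(series) - (dim - 1) * tau
    X = np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
    # All pairwise distances between embedded points.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]
    radii = np.logspace(np.log10(d[d > 0].min()), np.log10(d.max()), n_radii)
    # Correlation sum C(r): fraction of point pairs closer than r.
    C = np.array([np.mean(d < r) for r in radii])
    mask = C > 0
    # Slope of log C(r) vs log r approximates the correlation dimension.
    slope, _ = np.polyfit(np.log(radii[mask]), np.log(C[mask]), 1)
    return slope
```

A periodic (regular) vibration traces a closed curve in the embedding and yields a dimension near 1, while broadband irregular signals yield higher estimates, consistent with the statistical separation the authors report.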

  5. Unsupervised clustering analyses of features extraction for a caries computer-assisted diagnosis using dental fluorescence images

    NASA Astrophysics Data System (ADS)

    Bessani, Michel; da Costa, Mardoqueu M.; Lins, Emery C. C. C.; Maciel, Carlos D.

    2014-02-01

Computer-assisted diagnoses (CAD) are performed by systems with embedded knowledge. These systems work as a second opinion to the physician and use patient data to infer diagnoses for health problems. Caries is the most common oral disease and directly affects both individuals and society. Here we propose the use of dental fluorescence images as input to a caries computer-assisted diagnosis. We use texture descriptors together with statistical pattern recognition techniques to measure the descriptors' performance for the caries classification task. The data set consists of 64 fluorescence images of in vitro healthy and carious teeth including different surfaces and lesions already diagnosed by an expert. The texture feature extraction was performed on fluorescence images using RGB and YCbCr color spaces, which generated 35 different descriptors for each sample. Principal components analysis was performed for data interpretation and dimensionality reduction. Finally, unsupervised clustering was employed to analyze the relation between the output labeling and the diagnosis of the expert. The PCA result showed a high correlation between the extracted features; seven components were sufficient to represent 91.9% of the information in the original feature vectors. The unsupervised clustering output was compared with the expert classification, resulting in an accuracy of 96.88%. The results show the high accuracy of the proposed approach in identifying carious and non-carious teeth. Therefore, the development of a CAD system for caries using such an approach appears to be promising.
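The PCA-plus-clustering pipeline described above can be sketched with NumPy alone; the matrix below is a synthetic stand-in for the 64×35 descriptor matrix, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic stand-in for the 64x35 texture-descriptor matrix: strongly
# correlated features driven by a few latent factors, as the paper reports.
latent = rng.normal(size=(64, 5))
mixing = rng.normal(size=(5, 35))
X = latent @ mixing + 0.1 * rng.normal(size=(64, 35))

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Smallest number of components retaining >= 90% of the variance.
k = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1
scores = Xc @ Vt[:k].T  # reduced representation fed to the clustering step
print(k, scores.shape)
```

With strongly correlated features, k comes out far below 35, mirroring the paper's finding that seven components carried 91.9% of the information.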

  6. Combined magnetic resonance and diffusion tensor imaging analyses provide a powerful tool for in vivo assessment of deformation along human muscle fibers.

    PubMed

    Pamuk, Uluç; Karakuzu, Agah; Ozturk, Cengizhan; Acar, Burak; Yucesoy, Can A

    2016-10-01

Muscle fiber direction strain provides invaluable information for characterizing muscle function. However, methods to study this for human muscles in vivo are lacking. Using magnetic resonance (MR) imaging based deformation analyses combined with diffusion tensor (DT) imaging based tractography, we aimed to assess muscle fiber direction local tissue deformations within the human medial gastrocnemius (GM) muscle. Healthy female subjects (n=5, age=27±1 years) were positioned prone within the MR scanner in a relaxed state with the ankle angle fixed at 90°. The knee was brought to flexion (140.8±3.0°) (undeformed state). Sets of 3D high resolution MR and DT images were acquired. This protocol was repeated at an extended knee joint position (177.0±1.0°) (deformed state). Tractography and the Demons nonrigid registration algorithm were utilized to calculate local deformations along muscle fascicles. Undeformed state images were also transformed by a synthetic rigid body motion to calculate strain errors. Mean strain errors were significantly smaller than mean fiber direction strains (lengthening: 0.2±0.1% vs. 8.7±8.5%; shortening: 3.3±0.9% vs. 7.5±4.6%). Shortening and lengthening (up to 23.3% and 116.7%, respectively) occur simultaneously along individual fascicles despite imposed GM lengthening. Along-fiber shear strains confirm the presence of substantial shearing between fascicles. Mean fiber direction strains of different tracts also show a non-uniform distribution. Inhomogeneity of fiber strain indicates epimuscular myofascial force transmission. We conclude that MR and DT imaging analyses combined provide a powerful tool for quantifying deformation along human muscle fibers in vivo. This can contribute substantially to a better understanding of normal and pathological muscle function and of the mechanisms of treatment techniques. PMID:27429070

  7. Single-Cell Imaging and Spectroscopic Analyses of Cr(VI) Reduction on the Surface of Bacterial Cells

    PubMed Central

Wang, Yuanmin; Sevinc, Papatya C.; Belchik, Sara M.; Fredrickson, Jim K.; Shi, Liang; Lu, H. Peter

    2013-01-01

We investigate single-cell reduction of toxic Cr(VI) by the dissimilatory metal-reducing bacterium Shewanella oneidensis MR-1 (MR-1), an important bioremediation process, using Raman spectroscopy and scanning electron microscopy (SEM) combined with energy-dispersive X-ray spectroscopy (EDX). Our experiments indicate that the toxic and highly soluble Cr(VI) can be efficiently reduced to the less toxic and non-soluble Cr2O3 nanoparticles by MR-1. Cr2O3 is observed to emerge as nanoparticles adsorbed on the cell surface and its chemical nature is identified by EDX imaging and Raman spectroscopy. Co-localization of Cr2O3 and cytochromes by EDX imaging and Raman spectroscopy suggests a terminal reductase role for MR-1 surface-exposed cytochromes MtrC and OmcA. Our experiments revealed that the cooperation of surface proteins OmcA and MtrC makes the reduction reaction most efficient, and the order of reducing reactivity of MR-1 is: wild type > single mutant ΔmtrC or mutant ΔomcA > double mutant (ΔomcA-ΔmtrC). Moreover, our results also suggest that the direct microbial Cr(VI) reduction and Fe(II) (hematite)-mediated Cr(VI) reduction mechanisms may co-exist in the reduction processes. PMID:23249294

  8. Functional assessment of glioma pathogenesis by in vivo multi-parametric magnetic resonance imaging and in vitro analyses

    PubMed Central

    Yao, Nai-Wei; Chang, Chen; Lin, Hsiu-Ting; Yen, Chen-Tung; Chen, Jeou-Yuan

    2016-01-01

Gliomas are aggressive brain tumors with poor prognosis. In this study, we report a novel approach combining both in vivo multi-parametric MRI and in vitro cell culture assessments to evaluate the pathogenic development of gliomas. Osteopontin (OPN), a pleiotropic factor, has been implicated in the formation and progression of various human cancers, including gliomas, through its functions in regulating cell proliferation, survival, angiogenesis, and migration. Using a rat C6 glioma model, the combined approach successfully monitors the acquisition and decrease of cancer hallmarks. We show that knockdown of the expression of OPN reduces C6 cell proliferation, survival, viability and clonogenicity in vitro, and reduces tumor burden and prolongs animal survival in syngeneic rats. OPN depletion is associated with reduced tumor growth, decreased angiogenesis, and an increase of tumor-associated metabolites, as revealed by T2-weighted images, diffusion-weighted images, Ktrans maps, and 1H-MRS, respectively. These strategies allow us to define an important role of OPN in conferring cancer hallmarks, which can be further applied to assess the functional roles of other candidate genes in glioma. In particular, the non-invasive multi-parametric MRI measurement of cancer hallmarks related to proliferation, angiogenesis and altered metabolism may serve as a useful tool for diagnosis and for patient management. PMID:27198662

  9. Single-Cell Imaging and Spectroscopic Analyses of Cr(VI) Reduction on the Surface of Bacterial Cells

    SciTech Connect

    Wang, Yuanmin; Sevinc, Papatya C.; Belchik, Sara M.; Fredrickson, Jim K.; Shi, Liang; Lu, H. Peter

    2013-01-22

We investigate single-cell reduction of toxic Cr(VI) by the dissimilatory metal-reducing bacterium Shewanella oneidensis MR-1 (MR-1), an important bioremediation process, using Raman spectroscopy and scanning electron microscopy (SEM) combined with energy-dispersive X-ray spectroscopy (EDX). Our experiments indicate that the toxic and highly soluble Cr(VI) can be efficiently reduced to the less toxic and non-soluble Cr2O3 nanoparticles by MR-1. Cr2O3 is observed to emerge as nanoparticles adsorbed on the cell surface and its chemical nature is identified by EDX imaging and Raman spectroscopy. Co-localization of Cr2O3 and cytochromes by EDX imaging and Raman spectroscopy suggests a terminal reductase role for MR-1 surface-exposed cytochromes MtrC and OmcA. Our experiments revealed that the cooperation of surface proteins OmcA and MtrC makes the reduction reaction most efficient, and the order of reducing reactivity of MR-1 is: wild type > single mutant ΔmtrC or mutant ΔomcA > double mutant (ΔomcA-ΔmtrC). Moreover, our results also suggest that the direct microbial Cr(VI) reduction and Fe(II) (hematite)-mediated Cr(VI) reduction mechanisms may co-exist in the reduction processes.

  10. Functional assessment of glioma pathogenesis by in vivo multi-parametric magnetic resonance imaging and in vitro analyses.

    PubMed

    Yao, Nai-Wei; Chang, Chen; Lin, Hsiu-Ting; Yen, Chen-Tung; Chen, Jeou-Yuan

    2016-01-01

Gliomas are aggressive brain tumors with poor prognosis. In this study, we report a novel approach combining both in vivo multi-parametric MRI and in vitro cell culture assessments to evaluate the pathogenic development of gliomas. Osteopontin (OPN), a pleiotropic factor, has been implicated in the formation and progression of various human cancers, including gliomas, through its functions in regulating cell proliferation, survival, angiogenesis, and migration. Using a rat C6 glioma model, the combined approach successfully monitors the acquisition and decrease of cancer hallmarks. We show that knockdown of the expression of OPN reduces C6 cell proliferation, survival, viability and clonogenicity in vitro, and reduces tumor burden and prolongs animal survival in syngeneic rats. OPN depletion is associated with reduced tumor growth, decreased angiogenesis, and an increase of tumor-associated metabolites, as revealed by T2-weighted images, diffusion-weighted images, K(trans) maps, and 1H-MRS, respectively. These strategies allow us to define an important role of OPN in conferring cancer hallmarks, which can be further applied to assess the functional roles of other candidate genes in glioma. In particular, the non-invasive multi-parametric MRI measurement of cancer hallmarks related to proliferation, angiogenesis and altered metabolism may serve as a useful tool for diagnosis and for patient management. PMID:27198662

  11. Calibration of remote mineralogy algorithms using modal analyses of Apollo soils by X-ray diffraction and microscopic spectral imaging

    NASA Astrophysics Data System (ADS)

    Crites, S. T.; Taylor, J.; Martel, L.; Lucey, P. G.; Blake, D. F.

    2012-12-01

    We have launched a project to determine the modal mineralogy of over 100 soils from all Apollo sites using quantitative X-ray diffraction (XRD) and microscopic hyperspectral imaging at visible, near-IR and thermal IR wavelengths. The two methods are complementary: XRD is optimal for obtaining the major mineral modes because its measurement is not limited to the surfaces of grains, whereas the hyperspectral imaging method allows us to identify minerals present even down to a single grain, well below the quantitative detection limit of XRD. Each soil is also sent to RELAB to obtain visible, near-IR, and thermal-IR reflectance spectra. The goal is to use quantitative mineralogy in comparison with spectra of the same soils and with remote sensing data of the sampling stations to improve our ability to extract quantitative mineralogy from remote sensing observations. Previous groups have demonstrated methods for using lab mineralogy to validate remote sensing. The LSCC pioneered the method of comparing mineralogy to laboratory spectra of the same soils (Pieters et al. 2002); Blewett et al. (1997) directly compared remote sensing results for sample sites with lab measurements of representative soils from those sites. We are building upon the work of both groups by expanding the number of soils measured to 128, with an emphasis on immature soils to support recent work studying fresh exposures like crater central peaks, and also by incorporating the recent high spatial and spectral resolution data sets over expanded wavelength ranges (e.g. Diviner TIR, M3 hyperspectral VNIR) not available at the time of the previous studies. We have thus far measured 32 Apollo 16 soils using quantitative XRD and are continuing with our collection of soils from the other landing sites. 
We have developed a microscopic spectral imaging system that includes TIR, VIS, and NIR capabilities and have completed proof-of-concept scans of mineral separates and preliminary lunar soil scans with plans

  12. Evaluating Climate Causation of Conflict in Darfur Using Multi-temporal, Multi-resolution Satellite Image Datasets With Novel Analyses

    NASA Astrophysics Data System (ADS)

    Brown, I.; Wennbom, M.

    2013-12-01

Climate change, population growth and changes in traditional lifestyles have led to instabilities in traditional demarcations between neighboring ethnic and religious groups in the Sahel region. This has resulted in a number of conflicts as groups resort to arms to settle disputes. Such disputes often centre on or are justified by competition for resources. The conflict in Darfur has been controversially explained by resource scarcity resulting from climate change. Here we analyse established methods of using satellite imagery to assess vegetation health in Darfur. Multi-decadal time series of observations are available using low spatial resolution visible-near infrared imagery. Typically, normalized difference vegetation index (NDVI) analyses are produced to describe changes in vegetation 'greenness' or 'health'. Such approaches have been widely used to evaluate the long term development of vegetation in relation to climate variations across a wide range of environments from the Arctic to the Sahel. These datasets typically measure peak NDVI observed over a given interval and may introduce bias. It is furthermore unclear how the spatial organization of sparse vegetation may affect low resolution NDVI products. We develop and assess alternative measures of vegetation including descriptors of the growing season, wetness and resource availability. Expanding the range of parameters used in the analysis reduces our dependence on peak NDVI. Furthermore, these descriptors provide a better characterization of the growing season than the single NDVI measure. Using multi-sensor data we combine high temporal/moderate spatial resolution data with low temporal/high spatial resolution data to improve the spatial representativity of the observations and to provide improved spatial analysis of vegetation patterns. The approach places the high resolution observations in the NDVI context space using a longer time series of lower resolution imagery. The vegetation descriptors
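NDVI itself is a simple per-pixel band ratio of near-infrared and red reflectance. A minimal NumPy sketch (band values below are illustrative, not Darfur data):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index, computed per pixel.

    NDVI = (NIR - Red) / (NIR + Red); eps guards against division by zero.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR and absorbs red light,
# so its NDVI approaches +1; bare soil sits near 0.
reflectance_nir = np.array([0.50, 0.08])  # [vegetated pixel, bare-soil pixel]
reflectance_red = np.array([0.08, 0.07])
print(ndvi(reflectance_nir, reflectance_red))  # ~[0.72, 0.07]
```

The "peak NDVI" products the abstract criticizes are simply the per-pixel maximum of this quantity over a compositing interval.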

  13. Characterization of structures of the Nankai Trough accretionary prism from integrated analyses of LWD log response, resistivity images and clay mineralogy of cuttings: Expedition 338 Site C0002

    NASA Astrophysics Data System (ADS)

    Jurado, Maria Jose; Schleicher, Anja

    2014-05-01

The objective of our research is a detailed characterization of structures on the basis of LWD oriented images and logs, and clay mineralogy of cuttings from Hole C0002F of the Nankai Trough accretionary prism. Our results show an integrated interpretation of structures derived from borehole images, petrophysical characterization on LWD logs and cuttings mineralogy. The geometry of the structure intersected at Hole C0002F has been characterized by the interpretation of oriented borehole resistivity images acquired during IODP Expedition 338. The characterization of structural features, faults and fracture zones is based on a detailed post-cruise interpretation of bedding and fractures on borehole images and also on the analysis of Logging While Drilling (LWD) log response (gamma radioactivity, resistivity and sonic logs). The interpretation and complete characterization of structures (fractures, fracture zones, fault zones, folds) was achieved after detailed shorebased reprocessing of resistivity images, which allowed us to enhance bedding and fracture imaging for geometry and orientation interpretation. Distinctive petrophysical properties based on the LWD log response could then be compared with compositional changes derived from cuttings analyses. Cuttings analyses were used to calibrate and to characterize log response and to verify interpretations in terms of changes in composition and texture at fractures and fault zones defined on borehole images. Cuttings, taken routinely every 5 m during Expedition 338, indicate a clay-dominated lithology of silty claystone with interbeds of weakly consolidated, fine sandstones. The main mineralogical components are clay minerals, quartz, feldspar and calcite. Selected cuttings were taken from areas of interest as defined on LWD logs and images. The clay mineralogy was investigated on the <2 micron clay-size fraction, with special focus on smectite and illite minerals. Based on X-ray diffraction

  14. Comparison of meta-analyses among elastosonography (ES) and positron emission tomography/computed tomography (PET/CT) imaging techniques in the application of prostate cancer diagnosis.

    PubMed

    Ouyang, Qiaohong; Duan, Zhongxiang; Lei, Jixiao; Jiao, Guangli

    2016-03-01

The early diagnosis of prostate cancer (PCa) appears to be of vital significance for the provision of appropriate treatment programs. Even though several sophisticated imaging techniques such as positron emission tomography/computed tomography (PET/CT) and elastosonography (ES) have already been developed for PCa diagnosis, the diagnostic accuracy of these imaging techniques is still controversial to some extent. Therefore, a comprehensive meta-analysis in this study was performed to compare the accuracy of various diagnostic imaging methods for PCa, including 11C-choline PET/CT, 11C-acetate PET/CT, 18F-fluorocholine PET/CT, 18F-fluoroglucose PET/CT, transrectal real-time elastosonography (TRTE), and shear-wave elastosonography (SWE). The eligible studies were identified through systematically searching the literature in electronic databases including PubMed, Cochrane, and Web of Science. On the basis of the fixed-effects model, the pooled sensitivity (SEN), specificity (SPE), and area under the receiver operating characteristics curve (AUC) were calculated to estimate the diagnostic accuracy of 11C-choline PET/CT, 11C-acetate PET/CT, 18F-fluorocholine (FCH) PET/CT, 18F-fluoroglucose (FDG) PET/CT, TRTE, and SWE. All the statistical analyses were conducted with R software. The present meta-analysis incorporating a total of 82 studies demonstrated that the pooled sensitivities of the six imaging techniques were ranked as follows: SWE > 18F-FCH PET/CT > 11C-choline PET/CT > TRTE > 11C-acetate PET/CT > 18F-FDG PET/CT; the pooled specificities were also compared: SWE > 18F-FCH PET/CT > 11C-choline PET/CT > TRTE > 18F-FDG PET/CT > 11C-acetate PET/CT; finally, the pooled diagnostic accuracy of the six imaging techniques based on AUC was ranked as below: SWE > 18F-FCH PET/CT > 11C-choline PET/CT > TRTE > 11C-acetate PET/CT > 18F-FDG PET/CT. SWE and 18F-FCH PET/CT imaging could offer more assistance in the
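Fixed-effects pooling of per-study sensitivities or specificities, as used above, is conventionally done by inverse-variance weighting on the logit scale. A hedged sketch with hypothetical study values (not data from the 82 included studies):

```python
import numpy as np

def pool_fixed_effect(props, sizes):
    """Fixed-effect inverse-variance pooling of proportions (e.g. per-study
    sensitivities) on the logit scale, back-transformed to a proportion."""
    p = np.asarray(props, dtype=float)
    n = np.asarray(sizes, dtype=float)
    logit = np.log(p / (1.0 - p))
    weights = n * p * (1.0 - p)  # inverse of the delta-method logit variance
    pooled_logit = np.sum(weights * logit) / np.sum(weights)
    return 1.0 / (1.0 + np.exp(-pooled_logit))  # back-transform

# Hypothetical per-study sensitivities and sample sizes:
print(pool_fixed_effect([0.85, 0.90, 0.78], [40, 60, 50]))  # ~0.84
```

Larger studies receive proportionally more weight, so the pooled estimate is pulled toward the best-powered studies rather than the simple average.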

  15. Utilizing magnetic resonance imaging logs, openhole logs, and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.A.; Morganti, J.K.; White, H.J. ); Noblett, B.R. )

    1996-01-01

    Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.

  16. Utilizing magnetic resonance imaging logs, openhole logs, and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.A.; Morganti, J.K.; White, H.J.; Noblett, B.R.

    1996-12-31

    Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.

  17. Characterization of Influenza Vaccine Hemagglutinin Complexes by Cryo-Electron Microscopy and Image Analyses Reveals Structural Polymorphisms.

    PubMed

    McCraw, Dustin M; Gallagher, John R; Harris, Audray K

    2016-06-01

    Influenza virus afflicts millions of people worldwide on an annual basis. There is an ever-present risk that animal viruses will cross the species barrier to cause epidemics and pandemics resulting in great morbidity and mortality. Zoonosis outbreaks, such as the H7N9 outbreak, underscore the need to better understand the molecular organization of viral immunogens, such as recombinant influenza virus hemagglutinin (HA) proteins, used in influenza virus subunit vaccines in order to optimize vaccine efficacy. Here, using cryo-electron microscopy and image analysis, we show that recombinant H7 HA in vaccines formed macromolecular complexes consisting of variable numbers of HA subunits (range, 6 to 8). In addition, HA complexes were distributed across at least four distinct structural classes (polymorphisms). Three-dimensional (3D) reconstruction and molecular modeling indicated that HA was in the prefusion state and suggested that the oligomerization and the structural polymorphisms observed were due to hydrophobic interactions involving the transmembrane regions. These experiments suggest that characterization of the molecular structures of influenza virus HA complexes used in subunit vaccines will lead to better understanding of the differences in vaccine efficacy and to the optimization of subunit vaccines to prevent influenza virus infection. PMID:27074939

  18. An automatic generation of non-uniform mesh for CFD analyses of image-based multiscale human airway models

    NASA Astrophysics Data System (ADS)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2014-11-01

The authors have developed a method to automatically generate a non-uniform CFD mesh for image-based human airway models. The sizes of generated tetrahedral elements vary in both radial and longitudinal directions to account for the boundary layer and the multiscale nature of pulmonary airflow. The proposed method takes advantage of our previously developed centerline-based geometry reconstruction method. In order to generate the mesh branch by branch in parallel, we used the open-source programs Gmsh and TetGen for surface and volume meshes, respectively. Both programs can specify element sizes by means of a background mesh. The size of an arbitrary element in the domain is a function of wall distance, element size on the wall, and element size at the center of the airway lumen. The element sizes on the wall are computed based on local flow rate and airway diameter. The total number of elements in the non-uniform mesh (10 M) was about half of that in the uniform mesh, although the computational time for the non-uniform mesh was about twice as long (170 min). The proposed method generates CFD meshes with fine elements near the wall and smooth variation of element size in the longitudinal direction, which are required, e.g., for simulations with high flow rate. NIH Grants R01-HL094315, U01-HL114494, and S10-RR022421. Computer time provided by XSEDE.
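The sizing rule described above (element size as a function of wall distance and of target sizes at the wall and lumen center) can be illustrated with a simple linear grading; the actual grading law used by the authors may differ:

```python
import numpy as np

def element_size(wall_dist, radius, h_wall, h_center):
    """Target edge length graded from h_wall at the airway wall
    (wall_dist = 0) to h_center on the lumen centerline (wall_dist = radius).

    A linear blend for illustration; Gmsh/TetGen read such values from a
    background mesh to control local refinement.
    """
    t = np.clip(np.asarray(wall_dist, dtype=float) / radius, 0.0, 1.0)
    return h_wall + (h_center - h_wall) * t

# Fine near-wall resolution for the boundary layer, coarser in the core:
print(element_size([0.0, 0.5, 1.0], radius=1.0, h_wall=0.01, h_center=0.1))
```

Tying `h_wall` to the local flow rate and diameter, as the abstract describes, would make the near-wall refinement track the expected boundary-layer thickness branch by branch.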

  19. Characterization of Influenza Vaccine Hemagglutinin Complexes by Cryo-Electron Microscopy and Image Analyses Reveals Structural Polymorphisms

    PubMed Central

    McCraw, Dustin M.; Gallagher, John R.

    2016-01-01

    Influenza virus afflicts millions of people worldwide on an annual basis. There is an ever-present risk that animal viruses will cross the species barrier to cause epidemics and pandemics resulting in great morbidity and mortality. Zoonosis outbreaks, such as the H7N9 outbreak, underscore the need to better understand the molecular organization of viral immunogens, such as recombinant influenza virus hemagglutinin (HA) proteins, used in influenza virus subunit vaccines in order to optimize vaccine efficacy. Here, using cryo-electron microscopy and image analysis, we show that recombinant H7 HA in vaccines formed macromolecular complexes consisting of variable numbers of HA subunits (range, 6 to 8). In addition, HA complexes were distributed across at least four distinct structural classes (polymorphisms). Three-dimensional (3D) reconstruction and molecular modeling indicated that HA was in the prefusion state and suggested that the oligomerization and the structural polymorphisms observed were due to hydrophobic interactions involving the transmembrane regions. These experiments suggest that characterization of the molecular structures of influenza virus HA complexes used in subunit vaccines will lead to better understanding of the differences in vaccine efficacy and to the optimization of subunit vaccines to prevent influenza virus infection. PMID:27074939

  20. Nonintrusive Finger-Vein Recognition System Using NIR Image Sensor and Accuracy Analyses According to Various Factors

    PubMed Central

    Pham, Tuyen Danh; Park, Young Ho; Nguyen, Dat Tien; Kwon, Seung Yong; Park, Kang Ryoung

    2015-01-01

Biometrics is a technology that enables an individual person to be identified based on human physiological and behavioral characteristics. Among biometrics technologies, face recognition has been widely used because of its advantages in terms of convenience and non-contact operation. However, its performance is affected by factors such as variation in the illumination, facial expression, and head pose. Therefore, fingerprint and iris recognition are preferred alternatives. However, the performance of the former can be adversely affected by the skin condition, including scarring and dryness. In addition, the latter has the disadvantages of high cost, large system size, and inconvenience to the user, who has to align their eyes with the iris camera. In an attempt to overcome these problems, finger-vein recognition has been vigorously researched, but an analysis of its accuracies according to various factors has not received much attention. Therefore, we propose a nonintrusive finger-vein recognition system using a near infrared (NIR) image sensor and analyze its accuracies considering various factors. The experimental results obtained with three databases showed that our system can be operated in real applications with high accuracy; and the dissimilarity of the finger-veins of different people is larger than that of the finger types and hands. PMID:26184214

  1. Seasonal forcing of image-analysed mesozooplankton community composition along the salinity gradient of the Guadalquivir estuary

    NASA Astrophysics Data System (ADS)

    Taglialatela, Simone; Ruiz, Javier; Prieto, Laura; Navarro, Gabriel

    2014-08-01

The composition and distribution of the mesozooplankton was studied monthly from April 2008 to June 2009 in the Guadalquivir estuary using a fast image analysis technique as well as with traditional microscope counting. The mesozooplankton showed a very clear temporal and spatial pattern with peaks of abundance in late-Spring/early-Summer 2008 and Spring 2009 in the inner estuary. The abundances peaked at 135 × 10³ ind. m⁻³. Calanipeda aquaedulcis was the most abundant species in the fresh and brackish waters (salinity between 0.5 and 7), accounting in many cases for up to 100% of the individuals. Acartia clausi instead was identified as the most abundant species in the middle part of the estuary (salinity between 10 and 30). Cyclopoida of the family Cyclopidae (possibly Acanthocyclops spp.) were occasionally abundant there as well as some species of freshwater Cladocera. At the mouth, the mesozooplanktonic community included appendicularians, chaetognaths, copepods and Cladocera. Canonical Correspondence Analysis (CCA) indicates that the changes observed in the taxonomic composition along the estuary were strictly correlated with the salinity gradient. Furthermore, no evidence of seasonal species substitution was observed in the Guadalquivir estuary, whereas a clear spatial displacement of C. aquaedulcis and A. clausi populations was observed after large discharges from the dam in Alcala del Rio.

  2. Nonintrusive Finger-Vein Recognition System Using NIR Image Sensor and Accuracy Analyses According to Various Factors.

    PubMed

    Pham, Tuyen Danh; Park, Young Ho; Nguyen, Dat Tien; Kwon, Seung Yong; Park, Kang Ryoung

    2015-01-01

    Biometrics is a technology that enables individuals to be identified based on their physiological and behavioral characteristics. Among biometric technologies, face recognition has been widely used because of its convenience and non-contact operation. However, its performance is affected by factors such as variation in illumination, facial expression, and head pose. Fingerprint and iris recognition are therefore preferred alternatives. However, the performance of the former can be adversely affected by the skin condition, including scarring and dryness, while the latter has the disadvantages of high cost, large system size, and inconvenience to the user, who has to align their eyes with the iris camera. In an attempt to overcome these problems, finger-vein recognition has been vigorously researched, but an analysis of its accuracy according to various factors has not received much attention. Therefore, we propose a nonintrusive finger-vein recognition system using a near-infrared (NIR) image sensor and analyze its accuracy considering various factors. The experimental results obtained with three databases showed that our system can be operated in real applications with high accuracy, and that the dissimilarity between the finger-veins of different people is larger than that between finger types and hands. PMID:26184214

  3. Textural analyses of carbon fiber materials by 2D-FFT of complex images obtained by high frequency eddy current imaging (HF-ECI)

    NASA Astrophysics Data System (ADS)

    Schulze, Martin H.; Heuer, Henning

    2012-04-01

    Carbon fiber based materials are used in many lightweight applications in aeronautical, automotive, machine, and civil engineering. With the increasing automation of the production process for CFRP laminates, manual optical inspection of each resin transfer molding (RTM) layer is not practicable. Because they are limited to surface inspection, optical systems cannot observe the quality parameters of multilayer, three-dimensional materials. Imaging eddy-current (EC) NDT is the only suitable inspection method for non-resin materials in the textile state that allows inspection of surface and hidden layers in parallel. The HF-ECI method has the capability to measure layer displacements (misaligned angle orientations) and gap sizes in a multilayer carbon fiber structure. The EC technique uses the variation of the electrical conductivity of carbon-based materials to obtain material properties. Besides the determination of textural parameters such as layer orientation and gap sizes between rovings, the method can also detect foreign polymer particles and fuzzy balls, and visualize undulations. For all of these typical parameters, an imaging classification process chain based on a high-resolving directional EC-imaging device named EddyCus® MPECS and a 2D-FFT with adapted preprocessing algorithms was developed.
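The layer-orientation measurement that a 2D-FFT step performs can be illustrated on a synthetic stripe pattern: the dominant spatial frequency peak in the magnitude spectrum points along the stripe normal. The image size, spatial frequency, and angle below are assumptions for this sketch, not the EddyCus® processing chain:

```python
import numpy as np

def dominant_orientation(img):
    """Return the dominant texture orientation (radians, in [0, pi))
    from the peak of the centred 2D FFT magnitude spectrum."""
    F = np.fft.fftshift(np.abs(np.fft.fft2(img)))
    cy, cx = F.shape[0] // 2, F.shape[1] // 2
    F[cy, cx] = 0.0                               # suppress the DC term
    py, px = np.unravel_index(np.argmax(F), F.shape)
    return np.arctan2(py - cy, px - cx) % np.pi

# Synthetic "roving" texture: parallel stripes at 30 degrees, 0.1 cycles/pixel
theta = np.deg2rad(30.0)
y, x = np.mgrid[0:256, 0:256]
img = np.cos(2 * np.pi * 0.1 * (x * np.cos(theta) + y * np.sin(theta)))
est_deg = np.rad2deg(dominant_orientation(img))
```

A misaligned layer would show up as a rotation of this spectral peak away from the nominal fiber direction.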

  4. Grain-size and grain-shape analyses using digital imaging technology: Application to the fluvial formation of the Ngandong paleoanthropological site in Central Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Sipola, Maija

    2013-04-01

    This study implements grain-size and grain-shape analyses to better understand the fluvial processes responsible for forming the Ngandong paleoanthropological site along the Solo River in Central Java. The site was first discovered and excavated by the Dutch Geological Survey in the early 1930s, during which fourteen Homo erectus fossils and thousands of other macrofaunal remains were uncovered. The Homo erectus fossils discovered at Ngandong are particularly interesting to paleoanthropologists because the morphology of the excavated crania suggests they are from a recently living variety of the species. The primary scientific focus for many years has been to determine the absolute age of the Ngandong fossils, while the question of exactly how the Ngandong site itself formed has been frequently overlooked. In this study I use Retsch CAMSIZER digital imaging technology to conduct grain-size and grain-shape analyses of sediments from the terrace stratigraphy at the Ngandong site to understand whether there are significant differences between sedimentary layers in grain size and/or grain shape, and what these differences mean in terms of local paleoflow dynamics over time. Preliminary analyses indicate there are four distinct sedimentary layers present at Ngandong with regard to size sorting, with the fossil-bearing layers proving to be the most poorly sorted and most similar to debris-flow deposits. These results support hypotheses by geoarchaeologists that the fossil-bearing layers present at Ngandong were deposited during special flow events rather than under normal stream flow conditions.

  5. Surface Roughness and Critical Exponent Analyses of Boron-Doped Diamond Films Using Atomic Force Microscopy Imaging: Application of Autocorrelation and Power Spectral Density Functions

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Vierkant, G. P.

    2014-09-01

    The evolution of the surface roughness of growing metal or semiconductor thin films provides much needed information about their growth kinetics and corresponding mechanism. While some systems show stages of nucleation, coalescence, and growth, others exhibit varying microstructures for different process conditions. In view of these classifications, we report herein detailed analyses based on atomic force microscopy (AFM) characterization to extract the surface roughness and growth kinetics exponents of relatively low boron-doped diamond (BDD) films by utilizing the analytical power spectral density (PSD) and autocorrelation function (ACF) as mathematical tools. The machining industry has applied PSD for a number of years for tool design and analysis of wear and machined surface quality. Herein, we present similar analyses at the mesoscale to study the surface morphology as well as quality of BDD films grown using the microwave plasma-assisted chemical vapor deposition technique. PSD spectra as a function of boron concentration (in gaseous phase) are compared with those for samples grown without boron. We find that relatively higher boron concentration yields higher amplitudes of the longer-wavelength power spectral lines, with amplitudes decreasing in an exponential or power-law fashion towards shorter wavelengths, determining the roughness exponent (α ≈ 0.16 ± 0.03) and growth exponent (β ≈ 0.54), albeit indirectly. A unique application of the ACF, which is widely used in signal processing, was also applied to one-dimensional or line analyses (i.e., along the x- and y-axes) of AFM images, revealing surface topology datasets with varying boron concentration. Here, the ACF was used to cancel random surface "noise" and identify any spatial periodicity via repetitive ACF peaks or spatially correlated noise. Periodicity at shorter spatial wavelengths was observed for no doping and low doping levels, while smaller correlations were observed for relatively
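A minimal sketch of the 1D PSD and ACF line analyses described above, applied to a synthetic height profile (the ripple wavelength and noise level are invented for illustration; this is not the authors' AFM pipeline):

```python
import numpy as np

def line_psd(profile, dx=1.0):
    """One-sided power spectral density of a 1D height profile."""
    h = profile - profile.mean()
    spec = np.abs(np.fft.rfft(h)) ** 2 * dx / len(h)
    freqs = np.fft.rfftfreq(len(h), d=dx)
    return freqs, spec

def line_acf(profile):
    """Autocorrelation of a 1D height profile, normalised so ACF(0) = 1."""
    h = profile - profile.mean()
    full = np.correlate(h, h, mode="full")[len(h) - 1:]
    return full / full[0]

# Synthetic line scan: a 64-pixel ripple plus random roughness
rng = np.random.default_rng(0)
x = np.arange(1024)
profile = np.sin(2 * np.pi * x / 64) + 0.3 * rng.standard_normal(1024)

freqs, psd = line_psd(profile)   # peak marks the ripple's spatial frequency
acf = line_acf(profile)          # repetitive peaks reveal the same periodicity
```

The PSD peak sits at the ripple frequency (1/64 cycles per pixel here), while the ACF shows a strong recurrence at lag 64 despite the added noise, which is exactly the noise-cancelling periodicity detection the abstract describes.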

  6. Computational replication of the patient-specific stenting procedure for coronary artery bifurcations: From OCT and CT imaging to structural and hemodynamics analyses.

    PubMed

    Chiastra, Claudio; Wu, Wei; Dickerhoff, Benjamin; Aleiou, Ali; Dubini, Gabriele; Otake, Hiromasa; Migliavacca, Francesco; LaDisa, John F

    2016-07-26

    The optimal stenting technique for coronary artery bifurcations is still debated. With additional advances, computational simulations may soon be used to compare stent designs or strategies based on verified structural and hemodynamics results in order to identify the optimal solution for each individual's anatomy. In this study, patient-specific simulations of stent deployment were performed for two cases to replicate the complete procedure conducted by interventional cardiologists. Subsequent computational fluid dynamics (CFD) analyses were conducted to quantify hemodynamic quantities linked to restenosis. Patient-specific pre-operative models of coronary bifurcations were reconstructed from CT angiography and optical coherence tomography (OCT). Plaque location and composition were estimated from OCT and assigned to the models, and structural simulations were performed in Abaqus. Artery geometries after virtual stent expansion of Xience Prime or Nobori stents created in SolidWorks were compared to post-operative geometry from OCT and CT before being extracted and used for CFD simulations in SimVascular. Inflow boundary conditions based on body surface area, and downstream vascular resistances and capacitances, were applied at branches to mimic physiology. Artery geometries obtained after virtual expansion were in good agreement with those reconstructed from patient images. Quantitative comparison of the distance between reconstructed and post-stent geometries revealed a maximum difference in area of 20.4%. Adverse indices of wall shear stress were more pronounced for the thicker Nobori stents in both patients. These findings verify structural analyses of stent expansion, introduce a workflow to combine software packages for solid and fluid mechanics analysis, and underscore important stent design features from prior idealized studies. The proposed approach may ultimately be useful in determining an optimal choice of stent and position for each patient. PMID:26655589

  7. Histological analyses by matrix-assisted laser desorption/ionization-imaging mass spectrometry reveal differential localization of sphingomyelin molecular species regulated by particular ceramide synthase in mouse brains.

    PubMed

    Sugimoto, Masayuki; Shimizu, Yoichi; Yoshioka, Takeshi; Wakabayashi, Masato; Tanaka, Yukari; Higashino, Kenichi; Numata, Yoshito; Sakai, Shota; Kihara, Akio; Igarashi, Yasuyuki; Kuge, Yuji

    2015-12-01

    Sphingomyelin (SM) is synthesized by SM synthase (SMS) from ceramide (Cer). SM regulates signaling pathways and maintains organ structure. SM comprises a sphingoid base and differing lengths of acyl-chains, but the importance of its various forms and regulatory synthases is not known. It has been reported that Cer synthase (CerS) has restricted substrate specificity, whereas SMS has no specificity for different lengths of acyl-chains. We hypothesized that the distribution of each SM molecular species was regulated by expression of the CerS family. Thus, we compared the distribution of SM species and CerS mRNA expression using molecular imaging. Spatial distribution of each SM molecular species was investigated using ultra-high-resolution imaging mass spectrometry (IMS). IMS revealed that distribution of SM molecular species varied according to the lengths of acyl-chains found in each brain section. Furthermore, a combination study using in situ hybridization and IMS revealed the spatial expression of CerS1 to be associated with the localization of SM (d18:1/18:0) in cell body-rich gray matter, and CerS2 to be associated with SM (d18:1/24:1) in myelin-rich white matter. Our study is the first comparison of spatial distribution between SM molecular species and CerS isoforms, and revealed their distinct association in the brain. These observations were demonstrated by suppression of CerS2 using siRNA in HepG2 cells; that is, siRNA for CerS2 specifically decreased C22 very long-chain fatty acid (VLCFA)- and C24 VLCFA-containing SMs. Thus, histological analyses of SM species by IMS could be a useful approach to consider their molecular function and regulative mechanism. PMID:26398595

  8. A SPITZER IRAC IMAGING SURVEY FOR T DWARF COMPANIONS AROUND M, L, AND T DWARFS: OBSERVATIONS, RESULTS, AND MONTE CARLO POPULATION ANALYSES

    SciTech Connect

    Carson, J. C.; Marengo, M.; Patten, B. M.; Hora, J. L.; Schuster, M. T.; Fazio, G. G.; Luhman, K. L.; Sonnett, S. M.; Allen, P. R.; Stauffer, J. R.; Schnupp, C.

    2011-12-20

    We report observational techniques, results, and Monte Carlo population analyses from a Spitzer Infrared Array Camera imaging survey for substellar companions to 117 nearby M, L, and T dwarf systems (median distance of 10 pc, mass range of 0.6 to ≈0.05 M☉). The two-epoch survey achieves typical detection sensitivities to substellar companions of [4.5 μm] ≤ 17.2 mag for angular separations between about 7'' and 165''. Based on common proper motion analysis, we find no evidence for new substellar companions. Using Monte Carlo orbital simulations (assuming random inclination, random eccentricity, and random longitude of pericenter), we conclude that the observational sensitivities translate to an ability to detect 600-1100 K brown dwarf companions at semimajor axes ≳35 AU and to detect 500-600 K companions at semimajor axes ≳60 AU. The simulations also estimate a 600-1100 K T dwarf companion fraction of <3.4% for 35-1200 AU separations and <12.4% for the 500-600 K companions for 60-1000 AU separations.
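The Monte Carlo orbital projection step can be sketched as follows. This simplified version assumes circular orbits (the survey's simulations also randomize eccentricity and longitude of pericenter), and the semimajor axis, distance, and inner angular limit are illustrative values, not a reproduction of the paper's sensitivity analysis:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
a_au = 100.0                                   # assumed companion semimajor axis
incl = np.arccos(rng.uniform(-1.0, 1.0, n))    # isotropic inclinations
phase = rng.uniform(0.0, 2.0 * np.pi, n)       # uniform orbital phase

# Projected separation of a circular orbit viewed at inclination i
rho_au = a_au * np.sqrt(np.cos(phase) ** 2 + (np.sin(phase) * np.cos(incl)) ** 2)

d_pc = 10.0                 # median survey distance
inner_arcsec = 7.0          # survey's inner angular limit
rho_arcsec = rho_au / d_pc  # 1 AU at 1 pc subtends 1 arcsec
frac_detectable = float(np.mean(rho_arcsec > inner_arcsec))
```

Averaging this detectable fraction over the survey's actual distances and sensitivity limits is what converts angular coverage into the quoted semimajor-axis completeness.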

  9. Sociopolitical Analyses.

    ERIC Educational Resources Information Center

    Van Galen, Jane, Ed.; And Others

    1992-01-01

    This theme issue of the serial "Educational Foundations" contains four articles devoted to the topic of "Sociopolitical Analyses." In "An Interview with Peter L. McLaren," Mary Leach presented the views of Peter L. McLaren on topics of local and national discourses, values, and the politics of difference. Landon E. Beyer's "Educational Studies and…

  10. Geologic analyses of LANDSAT-1 multispectral imagery of a possible power plant site employing digital and analog image processing. [in Pennsylvania

    NASA Technical Reports Server (NTRS)

    Lovegreen, J. R.; Prosser, W. J.; Millet, R. A.

    1975-01-01

    A site in the Great Valley subsection of the Valley and Ridge physiographic province in eastern Pennsylvania was studied to evaluate the use of digital and analog image processing for geologic investigations. Ground truth at the site was obtained by a field mapping program, a subsurface exploration investigation, and a review of available published and unpublished literature. Remote sensing data were analyzed using standard manual techniques. LANDSAT-1 imagery was analyzed using digital image processing employing the multispectral Image 100 system and analog color processing employing the VP-8 image analyzer. This study deals primarily with linears identified by image processing, the correlation of these linears with known structural features and with linears identified by manual interpretation, and the identification of rock outcrops in areas of extensive vegetative cover employing image processing. The results of this study indicate that image processing can be a cost-effective tool for evaluating geologic and linear features in regional studies encompassing large areas, such as for power plant siting. Digital image processing can be an effective tool for identifying rock outcrops in areas of heavy vegetative cover.

  11. A study on quantitative analyses before and after injection of contrast medium in spine examinations performed by using diffusion weighted image

    NASA Astrophysics Data System (ADS)

    Cho, Jae-Hwan; Lee, Hae-Kag; Kim, Yong-Kyun; Dong, Kyung-Rae; Chung, Woon-Kwan; Joo, Kyu-Ji

    2013-02-01

    This study examined the changes in the signal-to-noise ratio (SNR), the contrast-to-noise ratio (CNR), and the apparent diffusion coefficient (ADC) of metastatic cancer in the lumbar region by using diffusion weighted images taken with a 1.5 T (Tesla) magnetic resonance (MR) scanner before and after injecting a contrast medium. The study enrolled 30 healthy people and 30 patients with metastatic spine cancer from among patients who underwent a lumbar MRI scan from January 2011 to October 2012. A 1.5 T MR scanner was used to obtain the diffusion weighted images (DWIs) before and after injecting the contrast medium. In the group with metastatic spine cancer, the SNR and the CNR were measured in three parts among the L1-L5 lumbar vertebrae: the part with metastatic spine cancer, the area of the spine above the region with cancer, and the area of the spine under the region with cancer. The SNRs and the ADCs of the same three parts were measured in the acquired ADC map images. Among the healthy subjects, the measurements were conducted for the lumbar regions of L3-L5. According to the results, in the group with metastatic spine cancer, the SNR in the DWI before the contrast medium had been injected was lowest in the part with spine cancer. In the DWI after the contrast medium had been injected, the SNR and the CNR were increased in all three parts. In the ADC map image after the contrast medium had been injected, the SNR decreased in all three parts compared to the SNR before the contrast had been injected. The ADC after the contrast medium had been injected was decreased in all three parts compared to that before the contrast medium had been injected. In the healthy group, the SNR was increased in the L3-L5 lumbar regions in the DWI. In the ADC map image, the SNR in all three parts was decreased after the contrast medium had been injected. The ADC in the ADC map image was also decreased in all three parts.
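The three quantities compared above have standard definitions that can be sketched directly. The two-b-value ADC formula and the ROI values below are textbook forms chosen for illustration, not taken from the paper:

```python
import numpy as np

def snr(roi, noise_sd):
    """Signal-to-noise ratio: mean ROI signal over background noise SD."""
    return float(np.mean(roi)) / noise_sd

def cnr(roi_a, roi_b, noise_sd):
    """Contrast-to-noise ratio between two regions of interest."""
    return abs(float(np.mean(roi_a)) - float(np.mean(roi_b))) / noise_sd

def adc(s_low_b, s_high_b, b_low, b_high):
    """Two-point apparent diffusion coefficient (mm^2/s) from DWI signals
    acquired at two b-values (s/mm^2): ADC = ln(S_low / S_high) / (b_high - b_low)."""
    return float(np.log(s_low_b / s_high_b)) / (b_high - b_low)
```

For example, a signal that halves between b = 0 and b = 1000 s/mm² gives ADC = ln(2)/1000 ≈ 0.69 × 10⁻³ mm²/s, in the typical range for soft tissue.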

  12. Analyses of requirements for computer control and data processing experiment subsystems: Image data processing system (IDAPS) software description (7094 version), volume 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A description of each of the software modules of the Image Data Processing System (IDAPS) is presented. The changes in the software modules are the result of additions to the application software of the system and an upgrade of the IBM 7094 Mod(1) computer to a 1301 disk storage configuration. Necessary information about the IDAPS software is supplied to the computer programmer who desires to make changes in the software system or to use portions of the software outside of the IDAPS system. Each software module is documented with: module name, purpose, usage, common block(s) description, method (algorithm of subroutine), flow diagram (if needed), subroutines called, and storage requirements.

  13. Image

    Energy Science and Technology Software Center (ESTSC)

    2007-08-31

    The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks; scripts are available to insert, update, and delete data from the underlying Oracle database, download data from NCBI's GenBank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface and the various tools available on the project website (image.llnl.gov) that provide a search interface to the database.

  14. Characteristics and Origin of a Cratered Unit near the MSL Bradbury Landing Site (Gale Crater, Mars) Based on Analyses of Surface Data and Orbital Images

    NASA Astrophysics Data System (ADS)

    Jacob, S.; Rowland, S. K.; Edgett, K. S.; Kah, L. C.; Wiens, R. C.; Day, M. D.; Calef, F.; Palucis, M. C.; Anderson, R. B.

    2014-12-01

    Using orbiter images, the Curiosity landing ellipse was mapped as six distinct units based mainly on geomorphic characteristics. These units are the alluvial fan material (ALF), fractured light-toned surface (FLT), cratered plains/surfaces (CS), smooth hummocky plains (SH), rugged unit (RU) and striated light-toned outcrops (SLT) (Grotzinger et al., 2014; DOI: 10.1126/science.1242777). The goal of this project was to characterize and determine the origin of the CS. The CS is a thin, nearly horizontal, erosion resistant capping unit. HiRISE mosaics were utilized to subdivide the CS into four geomorphic sub-units. Crater densities were calculated for each sub-unit to provide a quantifiable parameter that could aid in understanding how the sub-units differ. Mastcam images from many locations along Curiosity's traverse show fields of dark, massive boulders, which are presumably erosional remnants of the CS. This indicates that the CS was likely more laterally extensive in the past. In situ CS outcrops, seen at Shaler and multiple locations near the Zabriskie Plateau, appear to have a rough, wind-sculpted surface and may consist of two distinct lithologies. The lower lithology displays hints of layering that have been enhanced by differential weathering, whereas the upper lithology consists of dark, massive rock. When present, the outcrops can extend laterally for several meters, but Mastcam images of outcrops do not always reveal both sections. ChemCam data show that CS targets have concentrations of Al, Na, and K that are indicative of an alkali feldspar phase. The physical and chemical characteristics of the CS suggest a massive deposit that has seen little to no chemical alteration. Physical characteristics of the CS do not allow us to unambiguously determine its geologic origin; possible emplacement mechanisms would require the ability to spread laterally over a nearly horizontal surface, and include inflating lava (e.g., pāhoehoe) or a distal delta deposit. The

  15. Expression of the G2-M checkpoint regulators cyclin B1 and cdc2 in nonmalignant and malignant human breast lesions: immunocytochemical and quantitative image analyses.

    PubMed Central

    Kawamoto, H.; Koizumi, H.; Uchikoshi, T.

    1997-01-01

    We investigated the in vivo expression of cyclin B1 and Cdc2 (key molecules for G2-M transition during the cell cycle) in nonmalignant and cancerous human breast lesions using immunohistochemistry and quantitative proliferative index (PI) analysis. Breast epithelial cells co-expressed cyclin B1 and Cdc2 in their cytoplasm in the G2 phase and in their nuclei in the M phase. Cyclin B1, but not Cdc2, immunostaining rapidly disappeared from the nuclei during the mitotic metaphase to anaphase transition. Static image analysis revealed the mean proliferative index for cyclin B1/cdc2 for each type of lesion to be as follows: normal glands (n = 20), 2.0/2.5%; benign lesions, including typical ductal hyperplasia (n = 76), 2.5/5.8%; atypical ductal hyperplasia (n = 21), 3.0/6.6%; carcinomas in situ (n = 70), 7.4/14.0%; and invasive carcinomas (n = 58), 10.0/22.9%. Proliferative index data for atypical hyperplasia were virtually identical to those for benign lesions and were significantly lower than those for breast cancer, suggesting that expression levels of cyclin B1 and Cdc2 may be used to distinguish premalignant human breast lesions from advanced disease. Furthermore, the proliferative index for cyclin B1 for comedo-type ductal carcinomas in situ agreed with that for invasive ductal carcinomas (mean, 10.1% versus 9.5%), apparently explaining the clinicopathological aggressiveness of this tumor at the molecular level. PMID:9006317
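The proliferative index reported above is simply the percentage of immunopositive cells among all cells counted in a lesion. A minimal sketch with hypothetical counts (the counts below are invented, not the study's data):

```python
def proliferative_index(positive_cells, total_cells):
    """Percentage of immunopositive cells among all cells counted in a lesion."""
    if total_cells <= 0:
        raise ValueError("total_cells must be positive")
    return 100.0 * positive_cells / total_cells

# Hypothetical counts for two lesions -- illustrative only
pi_benign = proliferative_index(13, 520)     # ~2.5%, benign-like
pi_invasive = proliferative_index(115, 500)  # ~23%, invasive-like
```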

  16. Images.

    ERIC Educational Resources Information Center

    Barr, Catherine, Ed.

    1997-01-01

    The theme of this month's issue is "Images"--from early paintings and statuary to computer-generated design. Resources on the theme include Web sites, CD-ROMs and software, videos, books, and others. A page of reproducible activities is also provided. Features include photojournalism, inspirational Web sites, art history, pop art, and myths. (AEF)

  17. IDATEN and G-SITENNO: GUI-assisted software for coherent X-ray diffraction imaging experiments and data analyses at SACLA.

    PubMed

    Sekiguchi, Yuki; Yamamoto, Masaki; Oroguchi, Tomotaka; Takayama, Yuki; Suzuki, Shigeyuki; Nakasako, Masayoshi

    2014-11-01

    Using our custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors, cryogenic coherent X-ray diffraction imaging experiments have been undertaken at the SPring-8 Angstrom Compact free electron LAser (SACLA) facility. To efficiently perform experiments and data processing, two software suites with user-friendly graphical user interfaces have been developed. The first is a program suite named IDATEN, which was developed to easily conduct four procedures during experiments: aligning KOTOBUKI-1, loading a flash-cooled sample into the cryogenic goniometer stage inside the vacuum chamber of KOTOBUKI-1, adjusting the sample position with respect to the X-ray beam using a pair of telescopes, and collecting diffraction data by raster scanning the sample with X-ray pulses. Named G-SITENNO, the other suite is an automated version of the original SITENNO suite, which was designed for processing diffraction data. These user-friendly software suites are now indispensable for collecting a large number of diffraction patterns and for processing the diffraction patterns immediately after collecting data within a limited beam time. PMID:25343809

  18. Utilizing magnetic resonance imaging logs, open hole logs and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.; Morganti, J.; White, H.

    1995-06-01

    NMR logging using the new C series Magnetic Resonance Imaging Logging (MRIL)™ is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeability and effective porosities, MRIL data can help petrophysicists evaluate low resistivity pays. In these instances, conventional open hole logs may not define all of the pay intervals. MRIL can also minimize unnecessary completions in zones of potentially high water-cut. This case study will briefly discuss MRIL tool theory and log presentations used with the conventional logs and sidewall cores. SEM analysis will show a good correlation of varying grain size sands with the T₂ distribution and bulk volume irreducible from MRIL. Discussions of each well in the study area will show how water-free production zones were defined. Because the MRIL data was not recorded on one of the wells, the advanced petrophysical program HORIZON was used to predict the MRIL bulk volume irreducible and effective porosity to estimate productive zones. Discussion of additional formation characteristics, completion procedures, actual production and predicted producibility of the shaly sands will be presented.

  19. 3D object-oriented image analysis in 3D geophysical modelling: Analysing the central part of the East African Rift System

    NASA Astrophysics Data System (ADS)

    Fadel, I.; van der Meijde, M.; Kerle, N.; Lauritsen, N.

    2015-03-01

    Non-uniqueness of satellite gravity interpretation has traditionally been reduced by using a priori information from seismic tomography models. This reduction in the non-uniqueness has been based on velocity-density conversion formulas or user interpretation of the 3D subsurface structures (objects) based on the seismic tomography models and then forward modelling these objects. However, this form of object-based approach has been done without a standardized methodology on how to extract the subsurface structures from the 3D models. In this research, a 3D object-oriented image analysis (3D OOA) approach was implemented to extract the 3D subsurface structures from geophysical data. The approach was applied on a 3D shear wave seismic tomography model of the central part of the East African Rift System. Subsequently, the extracted 3D objects from the tomography model were reconstructed in the 3D interactive modelling environment IGMAS+, and their density contrast values were calculated using an object-based inversion technique to calculate the forward signal of the objects and compare it with the measured satellite gravity. Thus, a new object-based approach was implemented to interpret and extract the 3D subsurface objects from 3D geophysical data. We also introduce a new approach to constrain the interpretation of the satellite gravity measurements that can be applied using any 3D geophysical model.
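The object-extraction step (isolating connected subsurface anomalies from a 3D model before forward modelling them) can be sketched as threshold-plus-connected-component labelling. The volume, threshold, and 6-connectivity choice below are synthetic assumptions for illustration, not the 3D OOA ruleset used in the study:

```python
import numpy as np
from collections import deque

def label_3d(mask):
    """6-connected component labelling of a boolean 3D array.
    Returns (label volume, number of objects)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue                      # already part of an earlier object
        current += 1
        labels[seed] = current
        queue = deque([seed])
        while queue:                      # breadth-first flood fill
            z, y, x = queue.popleft()
            for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                n = (z + dz, y + dy, x + dx)
                if (all(0 <= c < s for c, s in zip(n, mask.shape))
                        and mask[n] and not labels[n]):
                    labels[n] = current
                    queue.append(n)
    return labels, current

# Synthetic "tomography" volume with two high-anomaly bodies above a threshold
vol = np.zeros((20, 20, 20))
vol[2:5, 2:5, 2:5] = 1.5
vol[10:15, 10:15, 10:15] = 2.0
labels, n_objects = label_3d(vol > 1.0)
```

Each labelled body would then be handed to the forward-modelling stage (IGMAS+ in the study) as a discrete object whose density contrast can be inverted.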

  20. Flow modification in canine intracranial aneurysm model by an asymmetric stent: studies using digital subtraction angiography (DSA) and image-based computational fluid dynamics (CFD) analyses

    PubMed Central

    Hoi, Yiemeng; Ionita, Ciprian N.; Tranquebar, Rekha V.; Hoffmann, Kenneth R.; Woodward, Scott H.; Taulbee, Dale B.; Meng, Hui; Rudin, Stephen

    2011-01-01

    An asymmetric stent with low porosity patch across the intracranial aneurysm neck and high porosity elsewhere is designed to modify the flow to result in thrombogenesis and occlusion of the aneurysm and yet to reduce the possibility of also occluding adjacent perforator vessels. The purposes of this study are to evaluate the flow field induced by an asymmetric stent using both numerical and digital subtraction angiography (DSA) methods and to quantify the flow dynamics of an asymmetric stent in an in vivo aneurysm model. We created a vein-pouch aneurysm model on the canine carotid artery. An asymmetric stent was implanted at the aneurysm, with 25% porosity across the aneurysm neck and 80% porosity elsewhere. The aneurysm geometry, before and after stent implantation, was acquired using cone beam CT and reconstructed for computational fluid dynamics (CFD) analysis. Both steady-state and pulsatile flow conditions using the measured waveforms from the aneurysm model were studied. To reduce computational costs, we modeled the asymmetric stent effect by specifying a pressure drop over the layer across the aneurysm orifice where the low porosity patch was located. From the CFD results, we found the asymmetric stent reduced the inflow into the aneurysm by 51%, and appeared to create a stasis-like environment which favors thrombus formation. The DSA sequences also showed substantial flow reduction into the aneurysm. Asymmetric stents may be a viable image guided intervention for treating intracranial aneurysms with desired flow modification features. PMID:21666881

  1. Beta Adrenergic Receptor Stimulation Suppresses Cell Migration in Association with Cell Cycle Transition in Osteoblasts-Live Imaging Analyses Based on FUCCI System.

    PubMed

    Katsumura, Sakie; Ezura, Yoichi; Izu, Yayoi; Shirakawa, Jumpei; Miyawaki, Atsushi; Harada, Kiyoshi; Noda, Masaki

    2016-02-01

    Osteoporosis affects over 20 million patients in the United States. Among those, disuse osteoporosis is serious as it is induced by bed-ridden conditions in patients suffering from aging-associated diseases including cardiovascular, neurological, and malignant neoplastic diseases. Although the phenomenon that loss of mechanical stress such as bed-ridden condition reduces bone mass is clear, molecular bases for the disuse osteoporosis are still incompletely understood. In disuse osteoporosis model, bone loss is interfered by inhibitors of sympathetic tone and adrenergic receptors that suppress bone formation. However, how beta adrenergic stimulation affects osteoblastic migration and associated proliferation is not known. Here we introduced a live imaging system, fluorescent ubiquitination-based cell cycle indicator (FUCCI), in osteoblast biology and examined isoproterenol regulation of cell cycle transition and cell migration in osteoblasts. Isoproterenol treatment suppresses the levels of first entry peak of quiescent osteoblastic cells into cell cycle phase by shifting from G1/G0 to S/G2/M and also suppresses the levels of second major peak population that enters into S/G2/M. The isoproterenol regulation of osteoblastic cell cycle transition is associated with isoproterenol suppression on the velocity of migration. This isoproterenol regulation of migration velocity is cell cycle phase specific as it suppresses migration velocity of osteoblasts in G1 phase but not in G1/S nor in G2/M phase. Finally, these observations on isoproterenol regulation of osteoblastic migration and cell cycle transition are opposite to the PTH actions in osteoblasts. In summary, we discovered that sympathetic tone regulates osteoblastic migration in association with cell cycle transition by using FUCCI system. PMID:26192605
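A FUCCI readout assigns cell cycle phase from two fluorescent reporters (red for G0/G1, green for S/G2/M). A deliberately simplified, thresholded sketch of such a phase call; the threshold value and labels are assumptions for illustration, not the authors' image-analysis pipeline:

```python
def fucci_phase(red, green, threshold=0.2):
    """Rough FUCCI phase call from normalised red/green reporter intensities
    (red: G0/G1 reporter; green: S/G2/M reporter)."""
    if red > threshold and green > threshold:
        return "G1/S transition"   # both reporters present
    if red > threshold:
        return "G0/G1"
    if green > threshold:
        return "S/G2/M"
    return "unlabelled"
```

Tracking per-cell phase calls like this over time-lapse frames, alongside cell positions, is what links migration velocity to cell cycle phase in the study.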

  2. Use of INSAT-3D sounder and imager radiances in the 4D-VAR data assimilation system and its implications in the analyses and forecasts

    NASA Astrophysics Data System (ADS)

    Indira Rani, S.; Taylor, Ruth; George, John P.; Rajagopal, E. N.

    2016-05-01

    INSAT-3D, the first Indian geostationary satellite with sounding capability, provides valuable information over India and the surrounding oceanic regions which is pivotal to Numerical Weather Prediction. In collaboration with the UK Met Office, NCMRWF developed the capability to assimilate INSAT-3D Clear Sky Brightness Temperature (CSBT), from both the sounder and the imager, in the 4D-Var assimilation system used at NCMRWF. Out of the 18 sounder channels, radiances from 9 channels are selected for assimilation depending on the relevance of the information in each channel. The first three high-peaking channels, the CO2 absorption channels, and the three water vapor channels (channels 10, 11, and 12) are assimilated over both land and ocean, whereas the window channels (channels 6, 7, and 8) are assimilated only over the ocean. Measured satellite radiances are compared with those from short-range forecasts to monitor the data quality. This is based on the assumption that the observed satellite radiances are free from calibration errors and that the short-range forecast provided by the NWP model is free from systematic errors. Innovations (Observation - Forecast) before and after the bias correction indicate how well the bias correction works. Since the biases vary with air mass, time, scan angle, and instrument degradation, an accurate bias correction algorithm is important for the assimilation of INSAT-3D sounder radiances. This paper discusses the bias correction methods and other quality controls used for the selected INSAT-3D sounder channels, and the impact of the bias-corrected radiances in the data assimilation system, particularly over India and the surrounding oceanic regions.
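
The predictor-based bias correction described above can be illustrated with a toy static scheme: regress the innovations (Observation - Forecast) on a set of bias predictors, then subtract the fitted bias. The data, predictor names, and least-squares formulation below are illustrative assumptions, not the operational NCMRWF/Met Office scheme.

```python
import numpy as np

# Toy static bias correction: fit innovations (O - B) against bias predictors
# (here a scan-angle term and an air-mass proxy), then remove the fitted bias.
rng = np.random.default_rng(0)
n = 500
scan_angle = rng.uniform(-50, 50, n)            # degrees from nadir (invented)
airmass = rng.uniform(200, 280, n)              # layer-thickness proxy (invented)
true_bias = 0.3 + 0.01 * scan_angle + 0.005 * (airmass - 240)
innovations = true_bias + rng.normal(0, 0.5, n)  # O - B, in K

# Design matrix: constant offset plus the two predictors
X = np.column_stack([np.ones(n), scan_angle, airmass - 240.0])
coef, *_ = np.linalg.lstsq(X, innovations, rcond=None)

corrected = innovations - X @ coef
print("mean innovation before:", innovations.mean())
print("mean innovation after :", corrected.mean())
```

After the correction the innovations are centred on zero, which is the monitoring signature of a well-functioning bias correction.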

  3. Analysing the effect of crystal size and structure in highly efficient CH3NH3PbI3 perovskite solar cells by spatially resolved photo- and electroluminescence imaging

    NASA Astrophysics Data System (ADS)

    Mastroianni, S.; Heinz, F. D.; Im, J.-H.; Veurman, W.; Padilla, M.; Schubert, M. C.; Würfel, U.; Grätzel, M.; Park, N.-G.; Hinsch, A.

    2015-11-01

    CH3NH3PbI3 perovskite solar cells with a mesoporous TiO2 layer and spiro-MeOTAD as a hole transport layer (HTL) with three different CH3NH3I concentrations (0.032 M, 0.044 M and 0.063 M) were investigated. Strong variations in crystal size and morphology resulting in diversified cell efficiencies (9.2%, 16.9% and 12.3%, respectively) were observed. The physical origin of this behaviour was analysed by detailed characterization combining current-voltage curves with photo- and electroluminescence (PL and EL) imaging as well as light beam induced current measurements (LBIC). It was found that the most efficient cell shows the highest luminescence and the least efficient cell is most strongly limited by non-radiative recombination. Crystal size, morphology and distribution in the capping layer and in the porous scaffold strongly affect the non-radiative recombination. Moreover, the very non-uniform crystal structure with multiple facets, as evidenced by SEM images of the 0.032 M device, suggests the creation of a large number of grain boundaries and crystal dislocations. These defects give rise to increased trap-assisted non-radiative recombination as is confirmed by high-resolution μ-PL images. The different imaging techniques used in this study prove to be well-suited to spatially investigate and thus correlate the crystal morphology of the perovskite layer with the electrical and radiative properties of the solar cells and thus with their performance.

  4. Evolution of 3-D subduction-induced mantle flow around lateral slab edges in analogue models of free subduction analysed by stereoscopic particle image velocimetry technique

    NASA Astrophysics Data System (ADS)

    Strak, Vincent; Schellart, Wouter P.

    2014-10-01

    We present analogue models of free subduction in which we investigate the three-dimensional (3-D) subduction-induced mantle flow focusing around the slab edges. We use a stereoscopic Particle Image Velocimetry (sPIV) technique to map the 3-D mantle flow on 4 vertical cross-sections for one experiment and on 3 horizontal depth-sections for another experiment. On each section the in-plane components are mapped as well as the out-of-plane component for several experimental times. The results indicate that four types of maximum upwelling are produced by the subduction-induced mantle flow. The first two are associated with the poloidal circulation occurring in the mantle wedge and in the sub-slab domain. A third type is produced by horizontal motion and deformation of the frontal part of the slab lying on the 660 km discontinuity. The fourth type results from quasi-toroidal return flow around the lateral slab edges, which produces a maximum upwelling located slightly laterally away from the sub-slab domain and can have another maximum upwelling located laterally away from the mantle wedge. These upwellings occur during the whole subduction process. In contrast, the poloidal circulation in the mantle wedge produces a zone of upwelling that is vigorous during the free falling phase of the slab sinking but that decreases in intensity when reaching the steady-state phase. The position of the maximum upward component and horizontal components of the mantle flow velocity field has been tracked through time. Their time-evolving magnitude is well correlated to the trench retreat rate. The maximum upwelling velocity located laterally away from the subducting plate is ∼18-24% of the trench retreat rate during the steady-state subduction phase. It is observed in the mid upper mantle but upwellings are produced throughout the whole upper mantle thickness, potentially promoting decompression melting. 
It could thereby provide a source for intraplate volcanism, such as Mount Etna in
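
The core step of the particle image velocimetry technique used above can be sketched in a few lines: the displacement of a particle pattern between two frames is found at the peak of their cross-correlation. A real stereoscopic PIV system does this per interrogation window on two camera views and reconstructs the out-of-plane component; this toy example is 2-D only, with an exact circular shift standing in for real particle motion.

```python
import numpy as np

rng = np.random.default_rng(1)
frame1 = rng.random((64, 64))                 # synthetic particle pattern
dy, dx = 3, -2                                # true displacement in pixels
frame2 = np.roll(np.roll(frame1, dy, axis=0), dx, axis=1)

# Circular cross-correlation via FFT; its peak marks the displacement
corr = np.fft.ifft2(np.fft.fft2(frame2) * np.conj(np.fft.fft2(frame1))).real
peak = np.unravel_index(np.argmax(corr), corr.shape)

# Map the peak location to a signed displacement
shift = [int(p) if p <= s // 2 else int(p) - s for p, s in zip(peak, corr.shape)]
print("estimated displacement (dy, dx):", shift)   # → [3, -2]
```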

  5. DATA AND ANALYSES

    EPA Science Inventory

    In order to promote transparency and clarity of the analyses performed in support of EPA's Supplemental Guidance for Assessing Susceptibility from Early-Life Exposure to Carcinogens, the data and the analyses are now available on this web site. The data is presented in two diffe...

  6. SNS shielding analyses overview

    SciTech Connect

    Popova, Irina; Gallmeier, Franz; Iverson, Erik B; Lu, Wei; Remec, Igor

    2015-01-01

    This paper gives an overview of on-going shielding analyses for the Spallation Neutron Source. Presently, most of the shielding work is concentrated on the beam lines and instrument enclosures to prepare for commissioning, safe operation, and adequate radiation background in the future. There is also on-going work for the accelerator facility. This includes radiation-protection analyses for radiation monitor placement, designing shielding for additional facilities to test accelerator structures, redesigning some parts of the facility, and designing test facilities for the main accelerator structure for component testing. Neutronics analyses are required as well to support spent-structure management, including waste characterisation analyses, choice of a proper transport/storage package, and shielding enhancement for the package if required.

  7. Spacelab Charcoal Analyses

    NASA Technical Reports Server (NTRS)

    Slivon, L. E.; Hernon-Kenny, L. A.; Katona, V. R.; Dejarme, L. E.

    1995-01-01

    This report describes analytical methods and results obtained from chemical analysis of 31 charcoal samples in five sets. Each set was obtained from a single scrubber used to filter ambient air on board a Spacelab mission. Analysis of the charcoal samples was conducted by thermal desorption followed by gas chromatography/mass spectrometry (GC/MS). All samples were analyzed using identical methods. The method used for these analyses was able to detect compounds independent of their polarity or volatility. In addition to the charcoal samples, analyses of three Environmental Control and Life Support System (ECLSS) water samples were conducted specifically for trimethylamine.

  8. Wavelet Analyses and Applications

    ERIC Educational Resources Information Center

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
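
The continuous wavelet transform described above can be sketched directly in NumPy: a Morlet wavelet is correlated with a nonstationary two-tone signal, and the scale of maximum power at each instant tracks the local frequency. The sampling rate, wavelet parameter w0, and scale range below are illustrative choices, not values from the article.

```python
import numpy as np

def morlet(t, scale, w0=6.0):
    """Complex Morlet wavelet at the given scale (L2-style normalisation)."""
    x = t / scale
    return np.exp(1j * w0 * x) * np.exp(-0.5 * x**2) / np.sqrt(scale)

fs = 200.0
t = np.arange(0, 4, 1 / fs)
# Nonstationary signal: 5 Hz for t < 2 s, then 20 Hz
signal = np.where(t < 2, np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 20 * t))

scales = np.linspace(0.03, 0.3, 40)           # larger scale <-> lower frequency
tau = np.arange(-2, 2, 1 / fs)                # wavelet support
cwt = np.array([np.convolve(signal, np.conj(morlet(tau, s))[::-1], mode="same")
                for s in scales])
power = np.abs(cwt) ** 2

# The best-matching scale differs between the two halves of the signal
early = scales[np.argmax(power[:, 100])]      # t = 0.5 s (5 Hz portion)
late = scales[np.argmax(power[:, 500])]       # t = 2.5 s (20 Hz portion)
print(early, late)
```

For a Morlet wavelet the matched scale is roughly w0/(2·pi·f), so the 5 Hz portion peaks at a larger scale than the 20 Hz portion, which is exactly how the CWT localises spectral content in time.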

  9. Apollo 14 microbial analyses

    NASA Technical Reports Server (NTRS)

    Taylor, G. R.

    1972-01-01

    Extensive microbiological analyses that were performed on the Apollo 14 prime and backup crewmembers and ancillary personnel are discussed. The crewmembers were subjected to four separate and quite different environments during the 137-day monitoring period. The relation between each of these environments and observed changes in the microflora of each astronaut are presented.

  10. Geomorphic analyses from space imagery

    NASA Technical Reports Server (NTRS)

    Morisawa, M.

    1985-01-01

    One of the most obvious applications of space imagery to geomorphological analyses is in the study of drainage patterns and channel networks. LANDSAT, high altitude photography and other types of remote sensing imagery are excellent for depicting stream networks on a regional scale because of their broad coverage in a single image. They offer a valuable tool for comparing and analyzing drainage patterns and channel networks all over the world. Three aspects considered in this geomorphological study are: (1) the origin, evolution and rates of development of drainage systems; (2) the topological studies of network and channel arrangements; and (3) the adjustment of streams to tectonic events and geologic structure (i.e., the mode and rate of adjustment).
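
One classic topological measure used in such drainage-network studies is the Strahler stream order. A minimal sketch, on an invented channel network encoded as downstream links: order 1 for headwater streams, incrementing when two streams of equal order meet.

```python
from collections import defaultdict

downstream = {            # each link drains into its downstream neighbour
    "A": "D", "B": "D",   # two headwater streams join at D
    "C": "E", "D": "E",   # D then joins headwater C at E
    "E": None,            # outlet
}
upstream = defaultdict(list)
for child, parent in downstream.items():
    if parent is not None:
        upstream[parent].append(child)

def strahler(node):
    """Order 1 for headwaters; increases when two equal-order streams meet."""
    if not upstream[node]:
        return 1
    orders = sorted((strahler(c) for c in upstream[node]), reverse=True)
    if len(orders) >= 2 and orders[0] == orders[1]:
        return orders[0] + 1
    return orders[0]

print({n: strahler(n) for n in downstream})
# → {'A': 1, 'B': 1, 'C': 1, 'D': 2, 'E': 2}
```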

  11. Information Omitted From Analyses.

    PubMed

    2015-08-01

    In the Original Article titled “Higher-Order Genetic and Environmental Structure of Prevalent Forms of Child and Adolescent Psychopathology” published in the February 2011 issue of JAMA Psychiatry (then Archives of General Psychiatry) (2011;68[2]:181-189), there were 2 errors. Although the article stated that the dimensions of psychopathology were measured using parent informants for inattention, hyperactivity-impulsivity, and oppositional defiant disorder, and a combination of parent and youth informants for conduct disorder, major depression, generalized anxiety disorder, separation anxiety disorder, social phobia, specific phobia, agoraphobia, and obsessive-compulsive disorder, all dimensional scores used in the reported analyses were actually based on parent reports of symptoms; youth reports were not used. In addition, whereas the article stated that each symptom dimension was residualized on age, sex, age-squared, and age by sex, the dimensions actually were only residualized on age, sex, and age-squared. All analyses were repeated using parent informants for inattention, hyperactivity-impulsivity, and oppositional defiant disorder, and a combination of parent and youth informants for conduct disorder, major depression, generalized anxiety disorder, separation anxiety disorder, social phobia, specific phobia, agoraphobia, and obsessive-compulsive disorder; these dimensional scores were residualized on age, age-squared, sex, sex by age, and sex by age-squared. The results of the new analyses were qualitatively the same as those reported in the article, with no substantial changes in conclusions. The only notable small difference was that major depression and generalized anxiety disorder dimensions had small but significant loadings on the internalizing factor in addition to their substantial loadings on the general factor in the analyses of both genetic and non-shared covariances in the selected models in the new analyses. Corrections were made to the

  12. Systematic Processing of Clementine Data for Scientific Analyses

    NASA Technical Reports Server (NTRS)

    Mcewen, A. S.

    1993-01-01

    If fully successful, the Clementine mission will return about 3,000,000 lunar images and more than 5000 images of Geographos. Effective scientific analyses of such large datasets require systematic processing efforts. Concepts for two such efforts are described: global multispectral imaging of the Moon, and videos of Geographos.

  13. [Network analyses in neuroimaging studies].

    PubMed

    Hirano, Shigeki; Yamada, Makiko

    2013-06-01

    Neurons are anatomically and physiologically connected to each other, and these connections are involved in various neuronal functions. Multiple important neural networks involved in neurodegenerative diseases can be detected using network analyses in functional neuroimaging. First, the basic methods and theories of voxel-based network analyses, such as principal component analysis, independent component analysis, and seed-based analysis, are described. Disease- and symptom-specific brain networks have been identified using glucose metabolism images in patients with Parkinson's disease. These networks enable us to objectively evaluate individual patients and serve as diagnostic tools as well as biomarkers for therapeutic interventions. Many functional MRI studies have shown that "hub" brain regions, such as the posterior cingulate cortex and medial prefrontal cortex, are deactivated by externally driven cognitive tasks; such brain regions form the "default mode network." Recent studies have shown that this default mode network is disrupted from the preclinical phase of Alzheimer's disease and is associated with amyloid deposition in the brain. Some recent studies have shown that the default mode network is also impaired in Parkinson's disease, whereas other studies have shown inconsistent results. These incongruent results could be due to the heterogeneous pharmacological status, differences in mesocortical dopaminergic impairment status, and concomitant amyloid deposition. Future neuroimaging network analysis studies will reveal novel and interesting findings that will uncover the pathomechanisms of neurological and psychiatric disorders. PMID:23735528
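
Of the voxel-based network analyses listed above, the seed-based approach is the simplest to sketch: the time series of a chosen seed region (e.g. the posterior cingulate cortex, a default-mode hub) is correlated with every other region's time series. The simulated data and region names below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)
n_time = 200
seed = rng.normal(size=n_time)                 # seed-region time series

regions = {
    # coupled to the seed (plus noise), as a default-mode partner might be
    "medial_prefrontal": seed * 0.8 + rng.normal(scale=0.6, size=n_time),
    # unrelated control region
    "visual_cortex": rng.normal(size=n_time),
}

# Seed-based connectivity: Pearson correlation of each region with the seed
connectivity = {name: np.corrcoef(seed, ts)[0, 1] for name, ts in regions.items()}
for name, r in connectivity.items():
    print(f"{name}: r = {r:+.2f}")
```

Regions coupled to the seed show high correlation and would be mapped as part of its network; unrelated regions do not.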

  14. Development of a systematic computer vision-based method to analyse and compare images of false identity documents for forensic intelligence purposes-Part I: Acquisition, calibration and validation issues.

    PubMed

    Auberson, Marie; Baechler, Simon; Zasso, Michaël; Genessay, Thibault; Patiny, Luc; Esseiva, Pierre

    2016-03-01

    Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, highlight links between documents produced by a same modus operandi or same source, and thus support forensic intelligence efforts. Inspired by previous research work about digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise reproducibility and comparability of images. Different filters and comparison metrics have been evaluated and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters or their combination to extract profiles from images, and then the comparison of profiles with a Canberra distance-based metric provides the most accurate classification of documents. The method appears also to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be
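
The comparison the abstract reports as best performing, a hue-based profile compared with a Canberra distance, can be sketched as follows. The hue-like filter, histogram binning, and toy "documents" are assumptions for illustration; the authors' actual acquisition, filters, and preprocessing are not reproduced here.

```python
import numpy as np

def hue_profile(rgb, bins=32):
    """Histogram of a simplified hue-like quantity over an image region."""
    g, b = rgb[..., 1], rgb[..., 2]
    mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
    hue = (((g - b) / (mx - mn + 1e-9)) % 6.0) / 6.0   # simplified, not full HSV hue
    hist, _ = np.histogram(hue, bins=bins, range=(0, 1), density=True)
    return hist

def canberra(p, q):
    """Canberra distance between two profiles (zero-zero bins contribute 0)."""
    num, den = np.abs(p - q), np.abs(p) + np.abs(q)
    mask = den > 0
    return float(np.sum(num[mask] / den[mask]))

rng = np.random.default_rng(3)
doc_a = rng.random((50, 50, 3))                # region of interest, document A
doc_a_rescan = np.clip(doc_a + rng.normal(scale=0.005, size=doc_a.shape), 0, 1)
doc_b = rng.random((50, 50, 3))                # unrelated document

same = canberra(hue_profile(doc_a), hue_profile(doc_a_rescan))
diff = canberra(hue_profile(doc_a), hue_profile(doc_b))
print(same < diff)   # near-duplicate scans are closer than unrelated documents
```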

  15. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    Model calculations and analyses have been carried out to compare with several sets of data (dose, induced radioactivity in various experiment samples and spacecraft components, fission foil measurements, and LET spectra) from passive radiation dosimetry on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The calculations and data comparisons are used to estimate the accuracy of current models and methods for predicting the ionizing radiation environment in low earth orbit. The emphasis is on checking the accuracy of trapped proton flux and anisotropy models.

  16. Broadband rotor noise analyses

    NASA Technical Reports Server (NTRS)

    George, A. R.; Chou, S. T.

    1984-01-01

    The various mechanisms which generate broadband noise on a range of rotors studied include load fluctuations due to inflow turbulence, due to turbulent boundary layers passing the blades' trailing edges, and due to tip vortex formation. Existing analyses are used and extensions to them are developed to make more accurate predictions of rotor noise spectra and to determine which mechanisms are important in which circumstances. Calculations based on the various prediction methods in existing experiments were compared. The present analyses are adequate to predict the spectra from a wide variety of experiments on fans, full scale and model scale helicopter rotors, wind turbines, and propellers to within about 5 to 10 dB. Better knowledge of the inflow turbulence improves the accuracy of the predictions. Results indicate that inflow turbulence noise depends strongly on ambient conditions and dominates at low frequencies. Trailing edge noise and tip vortex noise are important at higher frequencies if inflow turbulence is weak. Boundary layer trailing edge noise, important for large-sized rotors, increases slowly with angle of attack but not as rapidly as tip vortex noise.

  17. Broadband rotor noise analyses

    NASA Astrophysics Data System (ADS)

    George, A. R.; Chou, S. T.

    1984-04-01

    The various mechanisms which generate broadband noise on a range of rotors studied include load fluctuations due to inflow turbulence, due to turbulent boundary layers passing the blades' trailing edges, and due to tip vortex formation. Existing analyses are used and extensions to them are developed to make more accurate predictions of rotor noise spectra and to determine which mechanisms are important in which circumstances. Calculations based on the various prediction methods in existing experiments were compared. The present analyses are adequate to predict the spectra from a wide variety of experiments on fans, full scale and model scale helicopter rotors, wind turbines, and propellers to within about 5 to 10 dB. Better knowledge of the inflow turbulence improves the accuracy of the predictions. Results indicate that inflow turbulence noise depends strongly on ambient conditions and dominates at low frequencies. Trailing edge noise and tip vortex noise are important at higher frequencies if inflow turbulence is weak. Boundary layer trailing edge noise, important for large-sized rotors, increases slowly with angle of attack but not as rapidly as tip vortex noise.

  18. IMAGES, IMAGES, IMAGES

    SciTech Connect

    Marcus, A.

    1980-07-01

    The role of images of information (charts, diagrams, maps, and symbols) for effective presentation of facts and concepts is expanding dramatically because of advances in computer graphics technology, increasingly hetero-lingual, hetero-cultural world target populations of information providers, the urgent need to convey more efficiently vast amounts of information, the broadening population of (non-expert) computer users, the decrease of available time for reading texts and for decision making, and the general level of literacy. A coalition of visual performance experts, human engineering specialists, computer scientists, and graphic designers/artists is required to resolve human factors aspects of images of information. The need for, nature of, and benefits of interdisciplinary effort are discussed. The results of an interdisciplinary collaboration are demonstrated in a product for visualizing complex information about global energy interdependence. An invited panel will respond to the presentation.

  19. EEG analyses with SOBI.

    SciTech Connect

    Glickman, Matthew R.; Tang, Akaysha

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.

  20. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    This report covers work performed by Science Applications International Corporation (SAIC) under contract NAS8-39386 from the NASA Marshall Space Flight Center entitled LDEF Satellite Radiation Analyses. The basic objective of the study was to evaluate the accuracy of present models and computational methods for defining the ionizing radiation environment for spacecraft in Low Earth Orbit (LEO) by making comparisons with radiation measurements made on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The emphasis of the work here is on predictions and comparisons with LDEF measurements of induced radioactivity and Linear Energy Transfer (LET) measurements. These model/data comparisons have been used to evaluate the accuracy of current models for predicting the flux and directionality of trapped protons for LEO missions.

  1. Network Class Superposition Analyses

    PubMed Central

    Pearson, Carl A. B.; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., for the yeast cell cycle process [1]), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for this matrix derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying it to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with this matrix. We show how to generate Derrida plots based on it. We show that the matrix-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on the matrix. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141
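
The building block of the ensemble matrix, a state-transition matrix for a single Boolean network, and the point-attractor count read off from it can be sketched for a tiny invented network. The update rules below are hypothetical; the paper's Strong Inhibition networks and the ensemble-averaged matrix are not reproduced.

```python
import numpy as np
from itertools import product

def step(state):
    """Hypothetical synchronous update rules for a 3-node Boolean network."""
    a, b, c = state
    return (a and not c, a or b, b)

states = list(product([0, 1], repeat=3))
index = {s: i for i, s in enumerate(states)}

# Deterministic transition matrix: one unit entry per row (per state)
T = np.zeros((8, 8))
for s in states:
    T[index[s], index[tuple(int(x) for x in step(s))]] = 1.0

# Point attractors are exactly the states that map to themselves
attractors = [states[i] for i in range(8) if T[i, i] == 1.0]
print("point attractors:", attractors)
```

Averaging such matrices over every network in a class yields a stochastic superposition of the kind the abstract describes, from which attractor statistics can be estimated without enumerating dynamics network by network.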

  2. NOAA's National Snow Analyses

    NASA Astrophysics Data System (ADS)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational, products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS.
The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
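
The Newtonian nudging technique mentioned above can be sketched in its simplest form: the modeled state is relaxed toward an observation by adding a correction proportional to the innovation at each step. The relaxation coefficient G and the toy snow water equivalent (SWE) numbers are illustrative, not NOHRSC's operational values.

```python
def nudge(state, obs, G=0.3):
    """One nudging step: move the model state a fraction G toward the observation."""
    return state + G * (obs - state)

swe_model = 120.0          # mm, model-simulated SWE at a grid cell (invented)
swe_obs = 150.0            # mm, e.g. an airborne SWE observation (invented)

for _ in range(5):         # repeated nudging pulls the state toward the obs
    swe_model = nudge(swe_model, swe_obs)
print(round(swe_model, 1))   # → 145.0
```

Because the correction is gradual rather than an instantaneous replacement, the model stays close to balance while asynoptic observations are absorbed at the times they occur.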

  3. On study design in neuroimaging heritability analyses

    NASA Astrophysics Data System (ADS)

    Koran, Mary Ellen; Li, Bo; Jahanshad, Neda; Thornton-Wells, Tricia A.; Glahn, David C.; Thompson, Paul M.; Blangero, John; Nichols, Thomas E.; Kochunov, Peter; Landman, Bennett A.

    2014-03-01

    Imaging genetics is an emerging methodology that combines genetic information with imaging-derived metrics to understand how genetic factors impact observable structural, functional, and quantitative phenotypes. Many of the most well-known genetic studies are based on Genome-Wide Association Studies (GWAS), which use large populations of related or unrelated individuals to associate traits and disorders with individual genetic factors. Merging imaging and genetics may potentially lead to improved power of association in GWAS because imaging traits may be more sensitive phenotypes, being closer to underlying genetic mechanisms, and their quantitative nature inherently increases power. We are developing SOLAR-ECLIPSE (SE) imaging genetics software which is capable of performing genetic analyses with both large-scale quantitative trait data and family structures of variable complexity. This program can estimate the contribution of genetic commonality among related subjects to a given phenotype, and essentially answer the question of whether or not the phenotype is heritable. This central factor of interest, heritability, offers bounds on the direct genetic influence over observed phenotypes. In order for a trait to be a good phenotype for GWAS, it must be heritable: at least some proportion of its variance must be due to genetic influences. A variety of family structures are commonly used for estimating heritability, yet the variability and biases for each as a function of the sample size are unknown. Herein, we investigate the ability of SOLAR to accurately estimate heritability models based on imaging data simulated using Monte Carlo methods implemented in R. We characterize the bias and the variability of heritability estimates from SOLAR as a function of sample size and pedigree structure (including twins, nuclear families, and nuclear families with grandparents).
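
The central quantity here, heritability as the genetic share of phenotypic variance, can be illustrated with a Monte Carlo simulation in the spirit of the study's R-based evaluation. This is a hedged toy decomposition with known simulated components, not the SOLAR-ECLIPSE variance-components machinery.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000
Vg, Ve = 0.6, 0.4                      # true genetic / environmental variances
g = rng.normal(0, np.sqrt(Vg), n)      # simulated genetic contribution
e = rng.normal(0, np.sqrt(Ve), n)      # simulated environmental contribution
phenotype = g + e

h2_true = Vg / (Vg + Ve)
# Recoverable here only because the components are known (simulation only);
# real analyses must infer this from the covariance structure of relatives.
h2_est = np.var(g) / np.var(phenotype)
print(f"true h2 = {h2_true:.2f}, estimated h2 = {h2_est:.2f}")
```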

  4. Analysing the Metaphorical Images of Turkish Preschool Teachers

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2008-01-01

    The metaphorical basis of teacher reflection about teaching and learning has been a rich area of theory and research. This is a study of metaphor as a shared system of interpretation and classification, which teachers and student teachers and their supervising teachers can cooperatively explore. This study employs metaphor as a means of research…

  5. Characterization of high-speed video systems: tests and analyses

    NASA Astrophysics Data System (ADS)

    Carlton, Patrick N.; Chenette, Eugene R.; Rowe, W. J.; Snyder, Donald R.

    1992-01-01

    The current method of munitions systems testing uses film cameras to record airborne events such as store separation. After film exposure, much time is spent developing the film and analyzing the images. If the analysis uses digital methods, additional time is required to digitize the images preparatory to the analysis phase. Because airborne equipment parameters such as exposure time cannot be adjusted in flight, images often suffer as a result of changing lighting conditions. Image degradation from other sources may occur in the film development process and during digitizing. Advances in the design of charge-coupled device (CCD) cameras and mass storage devices, coupled with sophisticated data compression and transmission systems, provide the means to overcome these shortcomings. A system can be developed where the image sensor provides an analog electronic signal and, consequently, images can be digitized and stored using digital mass storage devices or transmitted to a ground station for immediate viewing and analysis. All-electronic imaging and processing offers the potential for improved data quality, rapid response time, and closed-loop operation. This paper examines high-speed, high-resolution imaging system design issues assuming an electronic image sensor will be used. Experimental data and analyses are presented on the resolution capability of current film and digital image processing technology. Electrical power dissipation in a high-speed, high-resolution CCD array is also analyzed.

  6. Analysing Children's Drawings: Applied Imagination

    ERIC Educational Resources Information Center

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  7. Feed analyses and their interpretation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Compositional analysis is central to determining the nutritional value of feedstuffs. The utility of the values and how they should be used depends on how representative the feed subsample is, the nutritional relevance of the assays, analytical variability of the analyses, and whether a feed is suit...

  8. Workload analyse of assembling process

    NASA Astrophysics Data System (ADS)

    Ghenghea, L. D.

    2015-11-01

Workload is the most important indicator for managers responsible for industrial technological processes; whether these are automated, mechanized or simply manual, machines or workers will be the focus of workload measurements. The paper presents workload analyses of a largely manual roller-bearing assembly process carried out at a large company with integrated bearing manufacturing. In these analyses the delay (work) sampling technique was used to identify and classify all the assemblers' activities and to determine how much of the 480-minute working day the workers devote to each activity. The study shows some ways to increase process productivity without supplementary investment, and also indicates that automation of the process could be the way to achieve maximum productivity.
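The delay (work) sampling technique used in the study estimates how a fixed working day is split across activities from random-instant observations: the share of observations in which a worker is seen doing an activity estimates the share of the day spent on it. A minimal sketch (the activity names and observation counts are hypothetical, not the study's data):

```python
from collections import Counter

WORKDAY_MIN = 480  # observed shift length in minutes

# Hypothetical random-instant observations of one assembler over a shift
observations = (
    ["assemble"] * 220 + ["fetch parts"] * 90 +
    ["inspect"] * 60 + ["idle/delay"] * 110
)

counts = Counter(observations)
total = sum(counts.values())

# Work sampling estimate: time on activity ≈ (share of observations) x workday
for activity, n in counts.most_common():
    minutes = WORKDAY_MIN * n / total
    print(f"{activity:12s} {n:4d} obs  ≈ {minutes:5.1f} min/day")
```

In practice each estimate would carry a binomial standard error, shrinking as the number of random observations grows.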

  9. FORTRAN Algorithm for Image Processing

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Hull, David R.

    1987-01-01

    FORTRAN computer algorithm containing various image-processing analysis and enhancement functions developed. Algorithm developed specifically to process images of developmental heat-engine materials obtained with sophisticated nondestructive evaluation instruments. Applications of program include scientific, industrial, and biomedical imaging for studies of flaws in materials, analyses of steel and ores, and pathology.

  10. Supplementary report on antilock analyses

    NASA Technical Reports Server (NTRS)

    Zellner, J. W.

    1985-01-01

    Generic modulator analysis was performed to quantify the effects of dump and reapply pressure rates on antilock stability and performance. Analysis will include dump and reapply rates, and lumped modulator delay. Based on the results of the generic modulator analysis and earlier toggle optimization analysis (with Mitsubishi modulator), a recommended preliminary antilock design was synthesized and its response and performance simulated. The results of these analyses are documented.

  11. Biological aerosol warner and analyser

    NASA Astrophysics Data System (ADS)

    Schlemmer, Harry; Kürbitz, Gunther; Miethe, Peter; Spieweck, Michael

    2006-05-01

The development of an integrated sensor device, BiSAM (Biological Sampling and Analysing Module), is presented, designed for rapid detection of aerosol or dust particles potentially loaded with biological warfare agents. All functional steps, from aerosol collection via immunoanalysis to display of results, are fully automated. The core component of the sensor device is an ultra-sensitive rapid analyser, PBA (Portable Benchtop Analyser), based on a three-dimensional immunofiltration column of large internal area, Poly-HRP marker technology and kinetic optical detection. High sensitivity despite the short measuring time, high chemical stability of the micro column and robustness against interferents make the PBA an ideal tool for fielded sensor devices. Combining the PBA with a bio-collector is especially favourable because virtually no sample preparation is necessary. Overall, the BiSAM device is capable of detecting and identifying living microorganisms (bacteria, spores, viruses) as well as toxins in a measuring cycle of typically half an hour. In each batch, up to 12 different tests can be run in parallel together with positive and negative controls to keep the false-alarm rate low.

  12. Mitogenomic analyses of eutherian relationships.

    PubMed

    Arnason, U; Janke, A

    2002-01-01

    Reasonably correct phylogenies are fundamental to the testing of evolutionary hypotheses. Here, we present phylogenetic findings based on analyses of 67 complete mammalian mitochondrial (mt) genomes. The analyses, irrespective of whether they were performed at the amino acid (aa) level or on nucleotides (nt) of first and second codon positions, placed Erinaceomorpha (hedgehogs and their kin) as the sister group of remaining eutherians. Thus, the analyses separated Erinaceomorpha from other traditional lipotyphlans (e.g., tenrecs, moles, and shrews), making traditional Lipotyphla polyphyletic. Both the aa and nt data sets identified the two order-rich eutherian clades, the Cetferungulata (comprising Pholidota, Carnivora, Perissodactyla, Artiodactyla, and Cetacea) and the African clade (Tenrecomorpha, Macroscelidea, Tubulidentata, Hyracoidea, Proboscidea, and Sirenia). The study corroborated recent findings that have identified a sister-group relationship between Anthropoidea and Dermoptera (flying lemurs), thereby making our own order, Primates, a paraphyletic assembly. Molecular estimates using paleontologically well-established calibration points, placed the origin of most eutherian orders in Cretaceous times, 70-100 million years before present (MYBP). The same estimates place all primate divergences much earlier than traditionally believed. For example, the divergence between Homo and Pan is estimated to have taken place approximately 10 MYBP, a dating consistent with recent findings in primate paleontology. PMID:12438776
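The calibration-based divergence estimates described above follow simple molecular-clock arithmetic: a paleontologically dated split fixes the substitution rate, and other divergences are scaled by it. A minimal sketch with hypothetical numbers (not the study's actual distances or calibration points):

```python
# Linearized molecular-clock estimate using one calibration point.
# All distances and dates below are hypothetical, for illustration only.
d_cal = 0.16   # pairwise sequence divergence of the calibration pair
t_cal = 60.0   # fossil-calibrated split time of that pair (MYBP)

# Divergence accumulates along both lineages, hence the factor of 2
rate = d_cal / (2 * t_cal)   # substitutions per site per lineage per Myr

d_query = 0.027              # divergence of the pair of interest
t_query = d_query / (2 * rate)
print(f"estimated divergence: {t_query:.1f} MYBP")
```

With these illustrative values the query pair dates to about 10 MYBP, the same order as the Homo-Pan estimate quoted in the abstract.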

  13. Mars periglacial punctual features analyses

    NASA Astrophysics Data System (ADS)

    Machado, Adriane; Barata, Teresa; Ivo Alves, E.; Cunha, Pedro P.

    2012-11-01

The presence of patterned ground on Mars has been reported in several papers, especially studies of polygon distribution, size and formation processes. In recent years, the presence of basketball terrains has also been noticed on Mars, and studies have been made to recognize these terrains through the analysis of Mars Orbiter Camera (MOC) images. We have been developing an algorithm that automatically recognizes and extracts the hummocky patterns on Mars related to landforms generated by freeze-thaw cycles, such as mud-boil features. The algorithm is based on remote sensing data and establishes a comparison between the morphology and size of hummocks and mud boils at Adventdalen, Longyearbyen (Svalbard, Norway) and hummocky patterns on Mars using High Resolution Imaging Science Experiment (HiRISE) imagery.

  14. Analysing photonic structures in plants

    PubMed Central

    Vignolini, Silvia; Moyroud, Edwige; Glover, Beverley J.; Steiner, Ullrich

    2013-01-01

    The outer layers of a range of plant tissues, including flower petals, leaves and fruits, exhibit an intriguing variation of microscopic structures. Some of these structures include ordered periodic multilayers and diffraction gratings that give rise to interesting optical appearances. The colour arising from such structures is generally brighter than pigment-based colour. Here, we describe the main types of photonic structures found in plants and discuss the experimental approaches that can be used to analyse them. These experimental approaches allow identification of the physical mechanisms producing structural colours with a high degree of confidence. PMID:23883949

  15. Summary of LDEF battery analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Thaller, Larry; Bittner, Harlin; Deligiannis, Frank; Tiller, Smith; Sullivan, David; Bene, James

    1992-01-01

Tests and analyses of NiCd, LiSO2, and LiCf batteries flown on the Long Duration Exposure Facility (LDEF) include results from NASA, Aerospace, and commercial labs. The LiSO2 cells illustrate six-year degradation of internal components acceptable for space applications, with up to 85 percent of battery capacity remaining on discharge of some returned cells. LiCf batteries completed their mission, but lost any remaining capacity due to internal degradation. Returned NiCd batteries tested at GSFC showed slight case distortion due to pressure buildup, but were functioning as designed.

  16. Magnetic Imaging

    NASA Astrophysics Data System (ADS)

    Petford-Long, A. K.

    Spin-transport effects, such as giant magnetoresistance, rely on the fact that there is a difference in scattering between the spin-up and spin-down electrons in a ferromagnetic material. The degree to which each electron channel is scattered depends on the magnetisation direction within the material, and thus on the local magnetic domain structure. It is therefore of importance when analysing spin-transport devices to understand their magnetic domain structure, both as a bulk property and locally. The aim of this chapter is to review a number of the techniques currently used to image magnetic domain structure in materials. Although a considerable amount of information about the magnetic properties and behaviour of a piece of material, for example a thin ferromagnetic film, can be obtained from bulk magnetometry measurements, it is often extremely useful to image the magnetic domain structure of the film and thus gain information about its magnetic properties at a local level. The various magnetic imaging techniques yet to be described can be extended, by the application of in-situ magnetic fields which allow not only the magnetic domains but also the magnetisation reversal process to be followed in real-time.

  17. Analyses to improve operational flexibility

    SciTech Connect

    Trikouros, N.G.

    1986-01-01

    Operational flexibility is greatly enhanced if the technical bases for plant limits and design margins are fully understood, and the analyses necessary to evaluate the effect of plant modifications or changes in operating modes on these parameters can be performed as required. If a condition should arise that might jeopardize a plant limit or reduce operational flexibility, it would be necessary to understand the basis for the limit or the specific condition limiting operational flexibility and be capable of performing a reanalysis to either demonstrate that the limit will not be violated or to change the limit. This paper provides examples of GPU Nuclear efforts in this regard. Examples of Oyster Creek and Three Mile Island operating experiences are discussed.

  18. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
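The kind of uncertainty and sensitivity analysis the plan documents can be sketched as a simple Monte Carlo propagation: sample the uncertain inputs, run the model, summarize the output spread, and rank inputs by rank correlation with the output. A toy sketch (the input names, distributions, and the multiplicative dose model are hypothetical, not actual HEDR parameters):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical uncertain model inputs (not actual HEDR parameters)
release_rate = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # source term
dispersion   = rng.uniform(0.5, 1.5, size=n)                # transport factor
uptake       = rng.normal(1.0, 0.1, size=n)                 # dose conversion

# Toy dose model: multiplicative chain of the three factors
dose = release_rate * dispersion * uptake

# Uncertainty analysis: spread of the output distribution
print(f"dose 5th-95th percentile: {np.percentile(dose, 5):.2f}"
      f" .. {np.percentile(dose, 95):.2f}")

# Simple sensitivity ranking: rank correlation of each input with the output
for name, x in [("release_rate", release_rate),
                ("dispersion", dispersion), ("uptake", uptake)]:
    rx = np.argsort(np.argsort(x))      # ranks of the input samples
    rd = np.argsort(np.argsort(dose))   # ranks of the output samples
    rho = np.corrcoef(rx, rd)[0, 1]     # Spearman rank correlation
    print(f"{name:12s} rank correlation with dose: {rho:+.2f}")
```

With these illustrative distributions the widest input (the lognormal source term) dominates the output ranking, which is exactly the kind of hierarchy a sensitivity plan is meant to expose.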

  19. Chemical analyses of provided samples

    NASA Technical Reports Server (NTRS)

    Becker, Christopher H.

    1993-01-01

Two batches of samples were received, and chemical analysis of the surface and near-surface regions of the samples was performed by the surface analysis by laser ionization (SALI) method. The samples included four one-inch optics and several paint samples. The analyses emphasized surface contamination or modification. In these studies, pulsed sputtering by 7 keV Ar+ and primarily single-photon ionization (SPI) by coherent 118 nm radiation (at approximately 5 x 10^5 W/cm^2) were used. For two of the samples, multiphoton ionization (MPI) at 266 nm (approximately 5 x 10^11 W/cm^2) was also used. Most notable among the results was the silicone contamination on Mg2 mirror 28-92, and the finding that the Long Duration Exposure Facility (LDEF) paint sample had been enriched in K and Na and depleted in Zn, Si, B, and organic compounds relative to the control paint.

  20. 3-D Cavern Enlargement Analyses

    SciTech Connect

    EHGARTNER, BRIAN L.; SOBOLIK, STEVEN R.

    2002-03-01

Three-dimensional finite element analyses simulate the mechanical response of enlarging existing caverns at the Strategic Petroleum Reserve (SPR). The caverns are located in Gulf Coast salt domes and are enlarged by leaching during oil drawdowns as fresh water is injected to displace the crude oil from the caverns. The current criteria adopted by the SPR limit cavern usage to 5 drawdowns (leaches). As a base case, 5 leaches were modeled over a 25-year period to roughly double the volume of a 19-cavern field. Thirteen additional leaches were then simulated until the caverns approached coalescence. The cavern field approximated the geometries and geologic properties found at the West Hackberry site. This enabled comparison of data collected over nearly 20 years with analysis predictions. The analyses closely predicted the measured surface subsidence and cavern closure rates as inferred from historic wellhead pressures. This provided the necessary assurance that the model displacements, strains, and stresses are accurate. However, the cavern field has not yet experienced the large-scale drawdowns being simulated. Should they occur in the future, code predictions should be validated against actual field behavior at that time. The simulations were performed using JAS3D, a three-dimensional finite element analysis code for nonlinear quasi-static solids. The results examine the impacts of leaching and of cavern workovers, where internal cavern pressures are reduced, on surface subsidence, well integrity, and cavern stability. The results suggest that the current limit of 5 oil drawdowns may be extended, with some mitigative action required on the wells and, later, on surface structures due to subsidence strains. The predicted stress state in the salt shows damage starting to occur after 15 drawdowns, with significant failure occurring at the 16th drawdown, well beyond the current limit of 5 drawdowns.

  1. Isotopic signatures by bulk analyses

    SciTech Connect

    Efurd, D.W.; Rokop, D.J.

    1997-12-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally.

  2. imageMCR

    SciTech Connect

    2011-09-27

imageMCR is a user-friendly software package that provides a variety of inputs to preprocess and analyze hyperspectral image data using multivariate algorithms such as Multivariate Curve Resolution (MCR), Principal Component Analysis (PCA), Classical Least Squares (CLS) and Parallel Factor Analysis (PARAFAC). MCR provides a relative quantitative analysis of the hyperspectral image data without the need for standards, and it discovers all the emitting species (spectrally pure components) present in an image, even those for which there is no a priori information. Once the spectral components are discovered, they can be used for future MCR analyses or used with CLS algorithms to quickly extract concentration image maps for each component within spectral image data sets.
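The CLS step mentioned in the abstract, extracting per-component concentration maps once pure-component spectra are known, reduces to one linear least-squares solve over all pixels. A minimal sketch with synthetic data (this is not the imageMCR API; all shapes, names, and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hyperspectral image: 32x32 pixels, 100 spectral channels, 2 species
n_rows, n_cols, n_chan, n_comp = 32, 32, 100, 2

# Known pure-component spectra S (n_comp x n_chan), e.g. from a prior MCR run
S = np.abs(rng.normal(size=(n_comp, n_chan)))

# True concentration maps C_true (n_pixels x n_comp), used to build the data
C_true = np.abs(rng.normal(size=(n_rows * n_cols, n_comp)))
D = C_true @ S + 0.01 * rng.normal(size=(n_rows * n_cols, n_chan))  # D ≈ C S

# Classical least squares: solve D ≈ C S for C, one solve covering every pixel
C_hat, *_ = np.linalg.lstsq(S.T, D.T, rcond=None)
conc_maps = C_hat.T.reshape(n_rows, n_cols, n_comp)  # per-component image maps

print(conc_maps.shape)  # (32, 32, 2)
```

Because the pure spectra are fixed, every pixel shares the same normal equations, which is what makes CLS extraction fast once MCR has supplied the components.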

  3. imageMCR

    Energy Science and Technology Software Center (ESTSC)

    2011-09-27

imageMCR is a user-friendly software package that provides a variety of inputs to preprocess and analyze hyperspectral image data using multivariate algorithms such as Multivariate Curve Resolution (MCR), Principal Component Analysis (PCA), Classical Least Squares (CLS) and Parallel Factor Analysis (PARAFAC). MCR provides a relative quantitative analysis of the hyperspectral image data without the need for standards, and it discovers all the emitting species (spectrally pure components) present in an image, even those for which there is no a priori information. Once the spectral components are discovered, they can be used for future MCR analyses or used with CLS algorithms to quickly extract concentration image maps for each component within spectral image data sets.

  4. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analyses available. 94.102 Section 94.102 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.102 Analyses available. A wide array of analyses for voluntary egg product samples is available. Voluntary egg product samples include...

  5. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  6. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analyses available. 94.102 Section 94.102 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.102 Analyses available. A wide array of analyses for voluntary egg product samples is available. Voluntary egg product samples include...

  7. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  8. APXS ANALYSES OF BOUNCE ROCK: THE FIRST SHERGOTTITE ON MARS

    NASA Technical Reports Server (NTRS)

    Ming, Douglas W.; Zipfel, J.; Anderson, R.; Brueckner, J.; Clark, B. C.; Dreibus, G.; Economou, T.; Gellert, R.; Lugmair, G. W.; Klingelhoefer, G.

    2005-01-01

    During the MER Mission, an isolated rock at Meridiani Planum was analyzed by the Athena instrument suite [1]. Remote sensing instruments noticed its distinct appearance. Two areas on the untreated rock surface and one area that was abraded with the Rock Abrasion Tool were analyzed by Microscopic Imager, Mossbauer Mimos II [2], and Alpha Particle X-ray Spectrometer (APXS). Results of all analyses revealed a close relationship of this rock with known basaltic shergottites.

  9. Speed analyses of stimulus equivalence.

    PubMed Central

    Spencer, T J; Chase, P N

    1996-01-01

The functional substitutability of stimuli in equivalence classes was examined through analyses of the speed of college students' accurate responding. After training subjects to respond to 18 conditional relations, subjects' accuracy and speed of accurate responding were compared across trial types (baseline, symmetry, transitivity, and combined transitivity and symmetry) and nodal distance (one- through five-node transitive and combined transitive and symmetric relations). Differences in accuracy across nodal distance and trial type were significant only on the first tests of equivalence, whereas differences in speed were significant even after extended testing. Response speed was inversely related to the number of nodes on which the tested relations were based. Significant differences in response speed were also found across trial types, except between transitivity and combined trials. To determine the generality of these comparisons, three groups of subjects were included: an instructed group was given an instruction that specified the interchangeability of stimuli related through training; a queried group was queried about the basis for test-trial responding; and a standard group was neither instructed nor queried. There were no significant differences among groups. These results suggest the use of response speed and response accuracy to measure the strength of matching relations. PMID:8636663

  10. Helicopter tail rotor noise analyses

    NASA Technical Reports Server (NTRS)

    George, A. R.; Chou, S. T.

    1986-01-01

    A study was made of helicopter tail rotor noise, particularly that due to interactions with the main rotor tip vortices, and with the fuselage separation mean wake. The tail rotor blade-main rotor tip vortex interaction is modelled as an airfoil of infinite span cutting through a moving vortex. The vortex and the geometry information required by the analyses are obtained through a free wake geometry analysis of the main rotor. The acoustic pressure-time histories for the tail rotor blade-vortex interactions are then calculated. These acoustic results are compared to tail rotor loading and thickness noise, and are found to be significant to the overall tail rotor noise generation. Under most helicopter operating conditions, large acoustic pressure fluctuations can be generated due to a series of skewed main rotor tip vortices passing through the tail rotor disk. The noise generation depends strongly upon the helicopter operating conditions and the location of the tail rotor relative to the main rotor.

  11. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But the modeling and quantification is complicated by the characteristics of the available information, which involves, for example, sparse data, poor measurements and subjective information. This raises the question whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview on developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.
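One of the imprecise-probability concepts surveyed above, the probability box (p-box), can be made concrete: when a distribution parameter is only known to lie in an interval, the envelope of all resulting CDFs gives lower and upper bounds on every probability. A minimal sketch for a normal distribution with interval-valued mean (all values are illustrative):

```python
import math

def normal_cdf(x, mu, sigma):
    """Standard closed form for the normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pbox_normal_interval_mean(x, mu_lo, mu_hi, sigma):
    """Bounds on P(X <= x) when only mu in [mu_lo, mu_hi] is known.
    The normal CDF decreases as mu grows, so the envelope at each x
    comes from evaluating the CDF at the two interval endpoints."""
    lower = normal_cdf(x, mu_hi, sigma)  # pessimistic bound: largest mean
    upper = normal_cdf(x, mu_lo, sigma)  # optimistic bound: smallest mean
    return lower, upper

lo, hi = pbox_normal_interval_mean(x=1.0, mu_lo=0.0, mu_hi=0.5, sigma=1.0)
print(f"P(X <= 1.0) lies in [{lo:.3f}, {hi:.3f}]")
```

Sweeping x over a grid traces the full pair of bounding CDFs; the gap between them is the epistemic (non-probabilistic) part of the uncertainty that a single distribution cannot represent.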

  12. Comparison between Inbreeding Analyses Methodologies.

    PubMed

    Esparza, Mireia; Martínez-Abadías, Neus; Sjøvold, Torstein; González-José, Rolando; Hernández, Miquel

    2015-12-01

Surnames are widely used in inbreeding analysis, but the validity of results has often been questioned due to the failure to comply with the prerequisites of the method. Here we analyze inbreeding in Hallstatt (Austria) between the 17th and the 19th centuries using both genealogies and surnames. The high and significant correlation of the results obtained by both methods demonstrates the validity of the use of surnames in this kind of study. On the other hand, the inbreeding values obtained (0.24 x 10⁻³ in the genealogical analysis and 2.66 x 10⁻³ in the surname analysis) are lower than those observed in Europe for this period and this kind of population, demonstrating that Hallstatt's population was not as isolated as it appeared. The temporal trend of inbreeding in both analyses does not follow the general European pattern, but shows a maximum in 1850 with a decrease over the second half of the 19th century. This is probably due to the high migration rate implied by the construction of transport infrastructure around the 1870s. PMID:26987150

  13. Network analyses in systems pharmacology

    PubMed Central

    Berger, Seth I.; Iyengar, Ravi

    2009-01-01

    Systems pharmacology is an emerging area of pharmacology which utilizes network analysis of drug action as one of its approaches. By considering drug actions and side effects in the context of the regulatory networks within which the drug targets and disease gene products function, network analysis promises to greatly increase our knowledge of the mechanisms underlying the multiple actions of drugs. Systems pharmacology can provide new approaches for drug discovery for complex diseases. The integrated approach used in systems pharmacology can allow for drug action to be considered in the context of the whole genome. Network-based studies are becoming an increasingly important tool in understanding the relationships between drug action and disease susceptibility genes. This review discusses how analysis of biological networks has contributed to the genesis of systems pharmacology and how these studies have improved global understanding of drug targets, suggested new targets and approaches for therapeutics, and provided a deeper understanding of the effects of drugs. Taken together, these types of analyses can lead to new therapeutic options while improving the safety and efficacy of existing medications. Contact: ravi.iyengar@mssm.edu PMID:19648136

  14. Consumption patterns and perception analyses of hangwa.

    PubMed

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-03-01

Hangwa is a traditional food which, in line with current consumption trends, needs marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa in order to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009, and the data from 231 respondents were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'for present' (39.8%) and the main reasons for buying it were 'traditional image' (33.3%) and 'taste' (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were 'a sanitary process', 'a rigorous quality mark' and 'taste', all related to product quality. Those with high importance but low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of price'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price. PMID:24471065

  15. The relationship among sea surface roughness variations, oceanographic analyses, and airborne remote sensing analyses

    NASA Technical Reports Server (NTRS)

    Oertel, G. F.; Wade, T. L.

    1981-01-01

    The synthetic aperture radar (SAR) was studied to determine whether it could image large scale estuaries and oceanic features such as fronts and to explain the electromagnetic interaction between SAR and the individual surface front features. Fronts were observed to occur at the entrance to the Chesapeake Bay. The airborne measurements consisted of data collection by SAR onboard an F-4 aircraft and real aperture side looking radar (SLAR) in Mohawk aircraft. A total of 89 transects were flown. Surface roughness and color as well as temperature and salinity were evaluated. Cross-frontal surveys were made. Frontal shear and convergence flow were obtained. Surface active organic materials, it was indicated, are present at the air-sea interface. In all, 2000 analyses were conducted to characterize the spatial and temporal variabilities associated with water mass boundaries.

  16. NOx analyser interference from alkenes

    NASA Astrophysics Data System (ADS)

    Bloss, W. J.; Alam, M. S.; Lee, J. D.; Vazquez, M.; Munoz, A.; Rodenas, M.

    2012-04-01

Nitrogen oxides (NO and NO2, collectively NOx) are critical intermediates in atmospheric chemistry. NOx abundance controls the levels of the primary atmospheric oxidants OH, NO3 and O3, and regulates the ozone production which results from the degradation of volatile organic compounds. NOx are also atmospheric pollutants in their own right, and NO2 is commonly included in air quality objectives and regulations. In addition to their role in controlling ozone formation, NOx levels affect the production of other pollutants such as the lachrymator PAN, and the nitrate component of secondary aerosol particles. Consequently, accurate measurement of nitrogen oxides in the atmosphere is of major importance for understanding our atmosphere. The most widely employed approach for the measurement of NOx is chemiluminescent detection of NO2* from the NO + O3 reaction, combined with NO2 reduction by either a heated catalyst or photoconvertor. The reaction between alkenes and ozone is also chemiluminescent; therefore alkenes may contribute to the measured NOx signal, depending upon the instrumental background subtraction cycle employed. This interference has been noted previously, and indeed the effect has been used to measure both alkenes and ozone in the atmosphere. Here we report the results of a systematic investigation of the response of a selection of NOx analysers, ranging from systems used for routine air quality monitoring to atmospheric research instrumentation, to a series of alkenes ranging from ethene to the biogenic monoterpenes, as a function of conditions (co-reactants, humidity). Experiments were performed in the European Photoreactor (EUPHORE) to ensure common calibration, a common sample for the monitors, and to unequivocally confirm the alkene (via FTIR) and NO2 (via DOAS) levels present. The instrument responses ranged from negligible levels up to 10 % depending upon the alkene present and conditions used. Such interferences may be of substantial importance…

  17. Image Gallery

    MedlinePlus

    ... R S T U V W X Y Z Image Gallery Share: The Image Gallery contains high-quality digital photographs available from ... Select a category below to view additional thumbnail images. Images are available for direct download in 2 ...

  18. Cancer Imaging

    MedlinePlus

    ... I/II Trials CIP ARRA-Funded Clinical Trials Informatics The Cancer Imaging Archive TCGA Imaging Genomics Quantitative Imaging Network LIDC-IDRI Imaging Informatics Resources News & Events News and Announcements Events – Meetings ...

  19. Analyses of Transistor Punchthrough Failures

    NASA Technical Reports Server (NTRS)

    Nicolas, David P.

    1999-01-01

The failure of two transistors in the Altitude Switch Assembly for the Solid Rocket Booster, followed by two additional failures a year later, presented a challenge to failure analysts. These devices had successfully worked for many years on numerous missions. There was no history of failures with this type of device. Extensive checks of the test procedures gave no indication of the cause. The devices were manufactured more than twenty years ago and failure information on this lot date code was not readily available. External visual exam, radiography, PEID, and leak testing were performed with nominal results. Electrical testing indicated nearly identical base-emitter and base-collector characteristics (both forward and reverse) with a low resistance short emitter to collector. These characteristics are indicative of a classic failure mechanism called punchthrough. In failure analysis, punchthrough refers to a condition where a relatively low voltage pulse causes the device to conduct very hard, producing localized areas of thermal runaway or "hot spots". At one or more of these hot spots, the excessive currents melt the silicon. Heavily doped emitter material diffuses through the base region to the collector, forming a diffusion pipe shorting the emitter to the collector through the base. Upon cooling, an alloy junction forms between the pipe and the base region. Generally, the hot spot (punchthrough site) is under the bond and no surface artifact is visible. The devices were delidded and the internal structures were examined microscopically. The gold emitter lead was melted on one device, but others had anomalies in the metallization around the intact emitter bonds. The SEM examination confirmed some anomalies to be cosmetic defects while other anomalies were artifacts of the punchthrough site. Subsequent to these analyses, the contractor determined that some irregular testing procedures, heretofore unreported, occurred at the time of the failures. These testing

  20. Analyses of the LMC Novae

    NASA Astrophysics Data System (ADS)

    Vanlandingham, K. M.; Schwarz, G. J.; Starrfield, S.; Hauschildt, P. H.; Shore, S. N.; Sonneborn, G.

    In the past 10 years, 6 classical novae have been observed in the Large Magellanic Cloud (LMC). We have begun a study of these objects using ultraviolet spectra obtained by IUE and optical spectra from nova surveys. We are using the results of this study to further our understanding of novae and stellar evolution. Our study includes analysis of both the early, optically thick spectra using model atmospheres, and the later nebular spectra using optimization of photoionization codes. By analysing all the LMC novae in a consistent manner, we can compare their individual results and use their combined properties to calibrate Galactic novae. In addition, our studies can be used to determine the elemental abundances of the nova ejecta, the amount of mass ejected, and the contribution of novae to the ISM abundances. To date we have analyzed Nova LMC 1988#1 and Nova LMC 1990#1, and have obtained preliminary results for Nova LMC 1991. The results of this work are presented in this poster. The metal content of the LMC is known to be sub-solar and varies as a function of location within the cloud. A detailed abundance analysis of the ejecta of the LMC novae provides important information concerning the effect of initial metal abundances on the energetics of the nova outburst. Since the distance to the LMC is well known, many important parameters of the outburst, such as the luminosity, can be absolutely determined. Both galactic and extragalactic novae have been proposed as potential standard candles. Recent work by Della Valle & Livio (1995) has improved on the standard relations (e.g., Schmidt 1957; Pfau 1976; Cohen 1985; Livio 1992) by including novae from the LMC and M31. Unfortunately, the dependence of the nova outburst on metallicity has not been well-studied. Recent theoretical work by Starrfield et al. (1998) indicates that the luminosity of the outburst increases with decreasing metal abundances.
If there is a dependence of luminosity on metallicity, it will have to

  1. Quantum Image Encryption Algorithm Based on Quantum Image XOR Operations

    NASA Astrophysics Data System (ADS)

    Gong, Li-Hua; He, Xiang-Tao; Cheng, Shan; Hua, Tian-Xiang; Zhou, Nan-Run

    2016-03-01

    A novel encryption algorithm for quantum images based on quantum image XOR operations is designed. The quantum image XOR operations are designed by using the hyper-chaotic sequences generated with the Chen's hyper-chaotic system to control the control-NOT operation, which is used to encode gray-level information. The initial conditions of the Chen's hyper-chaotic system are the keys, which guarantee the security of the proposed quantum image encryption algorithm. Numerical simulations and theoretical analyses demonstrate that the proposed quantum image encryption algorithm has larger key space, higher key sensitivity, stronger resistance of statistical analysis and lower computational complexity than its classical counterparts.
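The XOR-based design can be illustrated with a classical sketch. The code below substitutes a simple logistic map for Chen's hyper-chaotic system, and the seed, map parameters, and test image are illustrative rather than taken from the paper; the point is only that a chaos-driven keystream XORed with the gray levels is involutive, so applying the cipher twice with the same key (initial condition) recovers the image.

```python
import numpy as np

def chaotic_keystream(n, x0=0.63, r=3.99):
    """Keystream from a logistic map (a stand-in for Chen's hyper-chaotic
    system; the map, seed, and parameters here are illustrative)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)   # chaotic iteration, values stay in (0, 1)
        xs[i] = x
    return (xs * 256).astype(np.uint8)  # quantize to byte values

def xor_encrypt(image, x0=0.63):
    """XOR the gray levels with the chaotic keystream; the seed x0 is the key."""
    ks = chaotic_keystream(image.size, x0).reshape(image.shape)
    return image ^ ks

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
enc = xor_encrypt(img)
dec = xor_encrypt(enc)   # XOR is involutive: the same key decrypts
```

A wrong seed produces a different keystream, which is the key-sensitivity property the abstract refers to.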

  3. Statistical image analysis of longitudinal RAVENS images

    PubMed Central

    Lee, Seonjoo; Zipunnikov, Vadim; Reich, Daniel S.; Pham, Dzung L.

    2015-01-01

    Regional analysis of volumes examined in normalized space (RAVENS) images are transformation images used in the study of brain morphometry. In this paper, RAVENS images are analyzed using a longitudinal variant of voxel-based morphometry (VBM) and longitudinal functional principal component analysis (LFPCA) for high-dimensional images. We demonstrate that the latter overcomes the limitations of standard longitudinal VBM analyses, which do not separate registration errors from other longitudinal changes and baseline patterns. This is especially important in contexts where longitudinal changes are only a small fraction of the overall observed variability, which is typical in normal aging and many chronic diseases. Our simulation study shows that LFPCA effectively separates registration error from baseline and longitudinal signals of interest by decomposing RAVENS images measured at multiple visits into three components: a subject-specific imaging random intercept that quantifies the cross-sectional variability, a subject-specific imaging slope that quantifies the irreversible changes over multiple visits, and a subject-visit specific imaging deviation. We describe strategies to identify baseline/longitudinal variation and registration errors combined with covariates of interest. Our analysis suggests that specific regional brain atrophy and ventricular enlargement are associated with multiple sclerosis (MS) disease progression. PMID:26539071
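The three-component decomposition can be sketched with a toy model. The code below is a simplified stand-in for LFPCA (per-subject least squares rather than the full functional PCA machinery), and all array sizes, noise levels, and variable names are illustrative: each subject's voxel trajectory is a random intercept plus a random slope times visit time plus a deviation term.

```python
import numpy as np

# Toy version of the decomposition: trajectory = intercept + slope * t + deviation.
rng = np.random.default_rng(0)
n_subj, n_visit, n_vox = 5, 4, 10
t = np.arange(n_visit, dtype=float)
b0 = rng.normal(size=(n_subj, n_vox))             # subject-specific intercepts
b1 = rng.normal(scale=0.1, size=(n_subj, n_vox))  # subject-specific slopes
Y = (b0[:, None, :] + b1[:, None, :] * t[None, :, None]
     + rng.normal(scale=0.01, size=(n_subj, n_visit, n_vox)))  # deviations

# Recover the components with voxelwise per-subject least squares.
X = np.column_stack([np.ones_like(t), t])         # design: intercept + time
b0_hat = np.empty_like(b0)
b1_hat = np.empty_like(b1)
for i in range(n_subj):
    coef, *_ = np.linalg.lstsq(X, Y[i], rcond=None)  # shape (2, n_vox)
    b0_hat[i], b1_hat[i] = coef
resid = Y - (b0_hat[:, None, :] + b1_hat[:, None, :] * t[None, :, None])
```

In the actual method the intercept, slope, and deviation surfaces are each expanded in principal components, which is what lets registration error be separated from signal.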

  4. Pawnee Nation Energy Option Analyses

    SciTech Connect

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  5. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For major and significant transactions, applicants shall submit impact analyses (exhibit 12) describing...

  6. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Technical analyses. 61.13 Section 61.13 Energy NUCLEAR....13 Technical analyses. The specific technical information must also include the following analyses... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The...

  7. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 8 2011-10-01 2011-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For major and significant transactions, applicants shall submit impact analyses (exhibit 12) describing...

  8. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Technical analyses. 61.13 Section 61.13 Energy NUCLEAR....13 Technical analyses. The specific technical information must also include the following analyses... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The...

  9. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or timing of cash flows are uncertain and...

  10. Integrated Field Analyses of Thermal Springs

    NASA Astrophysics Data System (ADS)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers through the SURE internship offered by the Southern California Earthquake Center (SCEC) have examined thermal springs in southern Idaho and northern Utah, as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data on the results while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed, chilled, and delivered to a water lab within 12 hours. Temperatures are continuously monitored with the use of Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms, which were analyzed using methods of single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas, using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing the NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor to keep mud from contaminating the equipment. The use of raster

  11. On categorizations in analyses of alcohol teratogenesis.

    PubMed Central

    Sampson, P D; Streissguth, A P; Bookstein, F L; Barr, H M

    2000-01-01

    In biomedical scientific investigations, expositions of findings are conceptually simplest when they comprise comparisons of discrete groups of individuals or involve discrete features or characteristics of individuals. But the descriptive benefits of categorization become outweighed by their limitations in studies involving dose-response relationships, as in many teratogenic and environmental exposure studies. This article addresses a pair of categorization issues concerning the effects of prenatal alcohol exposure that have important public health consequences: the labeling of individuals as fetal alcohol syndrome (FAS) versus fetal alcohol effects (FAE) or alcohol-related neurodevelopmental disorder (ARND), and the categorization of prenatal exposure dose by thresholds. We present data showing that patients with FAS and others with FAE do not have meaningfully different behavioral performance, standardized scores of IQ, arithmetic and adaptive behavior, or secondary disabilities. Similarly overlapping distributions on measures of executive functioning offer a basis for identifying alcohol-affected individuals in a manner that does not simply reflect IQ deficits. At the other end of the teratological continuum, we turn to the reporting of threshold effects in dose-response relationships. Here we illustrate the importance of multivariate analyses using data from the Seattle, Washington, longitudinal prospective study on alcohol and pregnancy. Relationships between many neurobehavioral outcomes and measures of prenatal alcohol exposure are monotone without threshold down to the lowest nonzero levels of exposure, a finding consistent with reports from animal studies. In sum, alcohol effects on the developing human brain appear to be a continuum without threshold when dose and behavioral effects are quantified appropriately. PMID:10852839

  12. Pipeline for macro- and microarray analyses.

    PubMed

    Vicentini, R; Menossi, M

    2007-05-01

    The pipeline for macro- and microarray analyses (PMmA) is a set of scripts with a web interface developed to analyze DNA array data generated by array image quantification software. PMmA is designed for use with single- or double-color array data and to work as a pipeline in five classes (data format, normalization, data analysis, clustering, and array maps). It can also be used as a plugin in the BioArray Software Environment, an open-source database for array analysis, or used in a local version of the web service. All scripts in PMmA were developed in the PERL programming language and statistical analysis functions were implemented in the R statistical language. Consequently, our package is a platform-independent software. Our algorithms can correctly select almost 90% of the differentially expressed genes, showing a superior performance compared to other methods of analysis. The pipeline software has been applied to public macroarray data of 1536 sugarcane expressed sequence tags from plants exposed to cold for 3 to 48 h. PMmA identified thirty cold-responsive genes previously unidentified in this public dataset. Fourteen genes were up-regulated, two had a variable expression and the other fourteen were down-regulated in the treatments. These new findings were certainly a consequence of using a superior statistical analysis approach, since the original study did not take into account the dependence of data variability on the average signal intensity of each gene. The web interface, supplementary information, and the package source code are available, free, to non-commercial users at http://ipe.cbmeg.unicamp.br/pub/PMmA. PMID:17464422

  13. TRMM-Based Merged Precipitation Analyses

    NASA Technical Reports Server (NTRS)

    Adler, Robert; Huffman, George; Bolvin, David; Nelkin, Eric; Curtis, Scott

    1999-01-01

    This paper describes results of using Tropical Rainfall Measuring Mission (TRMM) information as the key calibration tool in a merged analysis on a 1x1 degree latitude/longitude monthly scale based on multiple satellite sources and raingauge analyses. The TRMM-based product is compared with surface-based validation data sets and the community-based 20-year Global Precipitation Climatology Project (GPCP) monthly analyses. The TRMM-based merged analysis uses the TRMM information to calibrate the estimates from SSM/I and geosynchronous IR observations and merges those estimates together with the TRMM and gauge information to produce accurate rainfall estimates with the increased sampling provided by the combined satellite information. This TRMM merged analysis uses the combined instrument (Precipitation Radar [PR] and TRMM Microwave Imager [TMI]) retrieval of Haddad as the TRMM estimate with which to calibrate the other satellite estimates. This TRMM Combined Instrument (TCI) estimate is shown to produce very similar absolute values to the other main TRMM products. The TRMM and other satellites merged analysis compares favorably to the atoll data set of Morrissey for the months of 1998, with a very small positive bias of 2%. However, comparison with the preliminary results from the TRMM ground validation radar information at Kwajalein atoll in the western Pacific Ocean shows a 26% positive bias. Therefore, absolute magnitudes from TRMM and/or the ground validation need to be treated with care at this point. A month by month comparison of the TRMM merged analysis and the GPCP analysis indicates very similar patterns, but with subtle differences in magnitude. Focusing on the Pacific Ocean ITCZ, one can see the TRMM-based estimates having higher peak values and lower values in the ITCZ periphery. These attributes also show up in the statistics, where GPCP>TRMM at low values (below 10 mm/d) and TRMM>GPCP at high values (greater than 15 mm/d).
Integrated over the 37N-37S belt for all
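The calibration-and-merge idea can be sketched in a few lines. This is not the Haddad TCI retrieval or the actual GPCP merge procedure; it is a generic ratio-bias calibration followed by a weighted average, with all arrays and weights invented for illustration.

```python
import numpy as np

def calibrate_and_merge(reference, estimates, weights):
    """Scale each satellite estimate so its mean matches the reference
    (TRMM-style ratio calibration), then take a weighted merge."""
    calibrated = [e * (reference.mean() / e.mean()) for e in estimates]
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return sum(wi * c for wi, c in zip(w, calibrated))

trmm = np.array([2.0, 4.0, 6.0])   # hypothetical rain rates, mm/d
ssmi = 1.5 * trmm                  # estimate biased high by 50%
ir   = 0.5 * trmm                  # estimate biased low by 50%
merged = calibrate_and_merge(trmm, [ssmi, ir], weights=[2, 1])
```

Ratio calibration removes each sensor's mean bias while the weighted merge keeps the denser sampling of the calibrated sources.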

  14. Imaging medical imaging

    NASA Astrophysics Data System (ADS)

    Journeau, P.

    2015-03-01

    This paper presents progress on imaging the research field of Imaging Informatics, mapped as the clustering of its communities together with their main results by applying a process to produce a dynamical image of the interactions between their results and their common object(s) of research. The basic side draws from fundamental research on the concept of dimensions and projective space spanning several streams of research about three-dimensional perceptivity and re-cognition and on their relation and reduction to spatial dimensionality. The application results in an N-dimensional mapping in Bio-Medical Imaging, with dimensions such as inflammatory activity, MRI acquisition sequencing, spatial resolution (voxel size), spatiotemporal dimension inferred, toxicity, depth penetration, sensitivity, temporal resolution, wave length, imaging duration, etc. Each field is represented through the projection of papers' and projects' `discriminating' quantitative results onto the specific N-dimensional hypercube of relevant measurement axes, such as listed above and before reduction. Past published differentiating results are represented as red stars, achieved unpublished results as purple spots and projects at diverse progress advancement levels as blue pie slices. The goal of the mapping is to show the dynamics of the trajectories of the field in its own experimental frame and their direction, speed and other characteristics. We conclude with an invitation to participate and show a sample mapping of the dynamics of the community and a tentative predictive model from community contribution.

  15. Image Calibration

    NASA Technical Reports Server (NTRS)

    Peay, Christopher S.; Palacios, David M.

    2011-01-01

    Calibrate_Image calibrates images obtained from focal plane arrays so that the output image more accurately represents the observed scene. The function takes as input a degraded image along with a flat field image and a dark frame image produced by the focal plane array and outputs a corrected image. The three most prominent sources of image degradation are corrected for: dark current accumulation, gain non-uniformity across the focal plane array, and hot and/or dead pixels in the array. In the corrected output image the dark current is subtracted, the gain variation is equalized, and values for hot and dead pixels are estimated, using bicubic interpolation techniques.
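The three corrections described above can be sketched as follows. This is a simplified stand-in for Calibrate_Image (the function name and array layout are illustrative, and a nearest-neighbour average replaces the bicubic interpolation the abstract describes for bad pixels): dark current is subtracted, the gain map derived from the flat field is divided out, and hot/dead pixels are patched from their valid neighbours.

```python
import numpy as np

def calibrate(raw, dark, flat, bad_mask):
    """Dark-subtract, flat-field, and patch bad pixels (simplified sketch)."""
    corrected = (raw - dark).astype(float)       # remove dark current
    gain = (flat - dark).astype(float)
    gain /= gain.mean()                          # gain map, normalized to unit mean
    corrected /= gain                            # equalize pixel-to-pixel gain
    # Replace hot/dead pixels with the mean of their valid neighbours
    # (stands in for the bicubic interpolation described in the abstract).
    out = corrected.copy()
    ys, xs = np.nonzero(bad_mask)
    for y, x in zip(ys, xs):
        y0, y1 = max(y - 1, 0), min(y + 2, raw.shape[0])
        x0, x1 = max(x - 1, 0), min(x + 2, raw.shape[1])
        nb = corrected[y0:y1, x0:x1]
        valid = ~bad_mask[y0:y1, x0:x1]
        out[y, x] = nb[valid].mean()
    return out

# Synthetic check: uniform scene of 10, gain variation, dark level of 2.
gain_true = np.array([[0.8, 1.2], [1.0, 1.0]])   # unit-mean gain map
dark = np.full((2, 2), 2.0)
flat = gain_true * 50.0 + dark                   # response to flat illumination
raw = gain_true * 10.0 + dark                    # degraded image of the scene
out = calibrate(raw, dark, flat, np.zeros((2, 2), bool))
```

With no bad pixels, the calibration exactly recovers the uniform scene in this toy case.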

  16. Indexing Images.

    ERIC Educational Resources Information Center

    Rasmussen, Edie M.

    1997-01-01

    Focuses on access to digital image collections by means of manual and automatic indexing. Contains six sections: (1) Studies of Image Systems and their Use; (2) Approaches to Indexing Images; (3) Image Attributes; (4) Concept-Based Indexing; (5) Content-Based Indexing; and (6) Browsing in Image Retrieval. Contains 105 references. (AEF)

  17. Image intensification

    SciTech Connect

    Csorba, I.P.

    1989-01-01

    These proceedings discuss the papers on image intensification. The topics discussed are: high-speed optical detector tube technology; image tube camera technology; microchannel plate technology; high-resolution x-ray imaging devices; and process and evaluation techniques.

  18. MELCOR analyses for accident progression issues

    SciTech Connect

    Dingman, S.E.; Shaffer, C.J.; Payne, A.C.; Carmel, M.K.

    1991-01-01

    Results of calculations performed with MELCOR and HECTR in support of the NUREG-1150 study are presented in this report. The analyses examined a wide range of issues. The analyses included integral calculations covering an entire accident sequence, as well as calculations that addressed specific issues that could affect several accident sequences. The results of the analyses for Grand Gulf, Peach Bottom, LaSalle, and Sequoyah are described, and the major conclusions are summarized. 23 refs., 69 figs., 8 tabs.

  19. Electron/proton spectrometer certification documentation analyses

    NASA Technical Reports Server (NTRS)

    Gleeson, P.

    1972-01-01

    A compilation of analyses generated during the development of the electron-proton spectrometer for the Skylab program is presented. The data documents the analyses required by the electron-proton spectrometer verification plan. The verification plan was generated to satisfy the ancillary hardware requirements of the Apollo Applications program. The certification of the spectrometer requires that various tests, inspections, and analyses be documented, approved, and accepted by reliability and quality control personnel of the spectrometer development program.

  20. The ASSET intercomparison of ozone analyses: method and first results

    NASA Astrophysics Data System (ADS)

    Geer, A. J.; Lahoz, W. A.; Bekki, S.; Bormann, N.; Errera, Q.; Eskes, H. J.; Fonteyn, D.; Jackson, D. R.; Juckes, M. N.; Massart, S.; Peuch, V.-H.; Rharmili, S.; Segers, A.

    2006-12-01

    This paper aims to summarise the current performance of ozone data assimilation (DA) systems, to show where they can be improved, and to quantify their errors. It examines 11 sets of ozone analyses from 7 different DA systems. Two are numerical weather prediction (NWP) systems based on general circulation models (GCMs); the other five use chemistry transport models (CTMs). The systems examined contain either linearised or detailed ozone chemistry, or no chemistry at all. In most analyses, MIPAS (Michelson Interferometer for Passive Atmospheric Sounding) ozone data are assimilated; two assimilate SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Chartography) observations instead. Analyses are compared to independent ozone observations covering the troposphere, stratosphere and lower mesosphere during the period July to November 2003. Biases and standard deviations are largest, and show the largest divergence between systems, in the troposphere, in the upper-troposphere/lower-stratosphere, in the upper-stratosphere and mesosphere, and the Antarctic ozone hole region. However, in any particular area, apart from the troposphere, at least one system can be found that agrees well with independent data. In general, none of the differences can be linked to the assimilation technique (Kalman filter, three or four dimensional variational methods, direct inversion) or the system (CTM or NWP system). Where results diverge, a main explanation is the way ozone is modelled. It is important to correctly model transport at the tropical tropopause, to avoid positive biases and excessive structure in the ozone field. In the southern hemisphere ozone hole, only the analyses which correctly model heterogeneous ozone depletion are able to reproduce the near-complete ozone destruction over the pole. In the upper-stratosphere and mesosphere (above 5 hPa), some ozone photochemistry schemes caused large but easily remedied biases. The diurnal cycle of ozone in the
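Whatever the minimization strategy, the assimilation techniques named above (Kalman filter, 3D-Var, 4D-Var) share the analysis update x_a = x_b + K (y - H x_b). The sketch below is a generic textbook form of that update, not code from any of the seven systems compared; the state, covariances, and observations are illustrative two-element examples.

```python
import numpy as np

def analysis_update(xb, B, y, R, H):
    """x_a = x_b + K (y - H x_b) with Kalman gain K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

xb = np.zeros(2)            # background ozone state (illustrative)
B = np.eye(2)               # background error covariance
R = np.eye(2)               # observation error covariance
H = np.eye(2)               # identity observation operator
y = np.array([2.0, 2.0])    # observations
xa = analysis_update(xb, B, y, R, H)
```

With equal background and observation errors, the analysis lands halfway between background and observation, which is the intuition behind the bias comparisons in the paper.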

  1. Comparative Analyses Of Multi-Frequency PSI Ground Deformation Measurements

    NASA Astrophysics Data System (ADS)

    Duro, Javier; Sabater, Jose R.; Albiol, David; Koudogbo, Fifame N.; Arnaud, Alain

    2012-01-01

    In recent years many new developments have been made in the field of SAR image analysis. The wider diversity of available SAR imagery gives the possibility of covering wide ranges of applications in the domain of ground motion monitoring for risk management and damage assessment. The work proposed is based on the evaluation of differences in ground deformation measurements derived from multi-frequency PSI analyses. The objectives of the project are the derivation of rules and the definition of criteria for the selection of the appropriate SAR sensor for a particular type of region of interest. Key selection factors are the satellite characteristics (operating frequency, spatial resolution, and revisit time), the geographic localization of the AOI, the land cover type and the extension of the monitoring period. All presented InSAR analyses have been performed using the Stable Point Network (SPN) PSI software developed by Altamira Information [1].

  2. Oncological image analysis: medical and molecular image analysis

    NASA Astrophysics Data System (ADS)

    Brady, Michael

    2007-03-01

    This paper summarises the work we have been doing on joint projects with GE Healthcare on colorectal and liver cancer, and with Siemens Molecular Imaging on dynamic PET. First, we recall the salient facts about cancer and oncological image analysis. Then we introduce some of the work that we have done on analysing clinical MRI images of colorectal and liver cancer, specifically the detection of lymph nodes and segmentation of the circumferential resection margin. In the second part of the paper, we shift attention to the complementary aspect of molecular image analysis, illustrating our approach with some recent work on: tumour acidosis, tumour hypoxia, and multiply drug resistant tumours.

  3. Aviation System Analysis Capability Executive Assistant Analyses

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Kostiuk, Peter

    1999-01-01

    This document describes the analyses that may be incorporated into the Aviation System Analysis Capability Executive Assistant. The document will be used as a discussion tool to enable NASA and other integrated aviation system entities to evaluate, discuss, and prioritize analyses.

  4. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... conviction for a serious crime not listed in 49 CFR 1572.103, or a period of foreign or domestic imprisonment... 49 Transportation 9 2011-10-01 2011-10-01 false Other analyses. 1572.107 Section 1572.107... ASSESSMENTS Standards for Security Threat Assessments § 1572.107 Other analyses. (a) TSA may determine that...

  5. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... conviction for a serious crime not listed in 49 CFR 1572.103, or a period of foreign or domestic imprisonment... 49 Transportation 9 2010-10-01 2010-10-01 false Other analyses. 1572.107 Section 1572.107... ASSESSMENTS Standards for Security Threat Assessments § 1572.107 Other analyses. (a) TSA may determine that...

  6. Amplitude analyses of charmless B decays

    NASA Astrophysics Data System (ADS)

    Latham, Thomas

    2016-05-01

    We present recent results from the LHCb experiment on amplitude analyses of charmless decays of B0 and BS0 mesons to two vector mesons. Measurements obtained include the branching fractions and polarization fractions, as well as CP asymmetries. The analyses use the data recorded by the LHCb experiment during Run 1 of the LHC.

  7. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by...

  8. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 8 2014-10-01 2014-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by...

  9. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by...

  10. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The analyses... expected exposures due to routine operations and likely accidents during handling, storage, and disposal of... 10 Energy 2 2013-01-01 2013-01-01 false Technical analyses. 61.13 Section 61.13 Energy...

  11. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  12. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  13. Infrared imaging of comets

    NASA Technical Reports Server (NTRS)

    Telesco, Charles M.

    1988-01-01

Thermal infrared imaging of comets provides fundamental information about the distribution of dust in their comae and tails. The imaging program at NASA Marshall Space Flight Center (MSFC) uses a unique 20-pixel bolometer array that was developed to image comets at 8 to 30 micrometers. These images provide the basis for: (1) characterizing the composition and size distribution of particles, (2) determining the mass-loss rates from cometary nuclei, and (3) describing the dynamics of the interaction between the dust and the solar radiation. Since the array became operational in 1985, researchers have produced a unique series of IR images of comets Giacobini-Zinner (GZ), Halley, and Wilson. That of GZ was the first groundbased thermal image ever made of a comet and was used to construct, with visible observations, an albedo map. Those data and dynamical analyses showed that GZ contained a population of large (approximately 300 micrometer), fluffy dust grains that formed a distinct inner tail. The accumulating body of images of various comets has also provided a basis for fruitfully intercomparing comet properties. Researchers also took advantage of the unique capabilities of the camera to resolve the inner, possibly protoplanetary, disk of the star Beta Pictoris; while not a comet research program, that study is a fruitful additional application of the array to solar system astronomy.

  14. [Brain metastases imaging].

    PubMed

    Delmaire, C; Savatovsky, J; Boulanger, T; Dhermain, F; Le Rhun, E; Météllus, P; Gerber, S; Carsin-Nicole, B; Petyt, G

    2015-02-01

The therapeutic management of brain metastases depends upon their diagnosis and characteristics. It is therefore imperative that imaging provides accurate diagnosis, identification, size and localization information for intracranial lesions in patients with presumed cerebral metastatic disease. MRI exhibits superior sensitivity to CT for the identification of small lesions and for evaluating their precise anatomical location. A CT scan will be performed only if MRI is contraindicated or cannot be obtained within an acceptable delay for the management of the patient. In clinical practice, the radiologic evaluation of metastases is based on visual image analyses. Thus, particular attention is paid to the imaging protocol, with the aim of optimizing the diagnosis of small lesions and evaluating their evolution. The MRI protocol must include: 1) non-contrast T1, 2) diffusion, 3) T2* or susceptibility-weighted imaging, 4) dynamic susceptibility contrast perfusion, 5) FLAIR with contrast injection, 6) T1 with contrast injection, preferentially using 3D spin echo images. The role of nuclear medicine imaging is still limited in the diagnosis of brain metastasis. Tc-sestamibi brain imaging or PET with amino acid tracers can differentiate local brain metastasis recurrence from radionecrosis, but these techniques remain to be evaluated. PMID:25649387

  15. Functional analyses and treatment of precursor behavior.

    PubMed

    Najdowski, Adel C; Wallace, Michele D; Ellsworth, Carrie L; MacAleese, Alicia N; Cleveland, Jackie M

    2008-01-01

    Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe problem behavior (precursor behavior) and evaluated treatments based on the outcomes of the functional analyses of precursor behavior. Responding for all participants was differentiated during the functional analyses, and individualized treatments eliminated precursor behavior. These results suggest that functional analysis of precursor behavior may offer an alternative, indirect method to assess the operant function of severe problem behavior. PMID:18468282

  16. Image processing technology

    SciTech Connect

    Van Eeckhout, E.; Pope, P.; Balick, L.

    1996-07-01

    This is the final report of a two-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The primary objective of this project was to advance image processing and visualization technologies for environmental characterization. This was effected by developing and implementing analyses of remote sensing data from satellite and airborne platforms, and demonstrating their effectiveness in visualization of environmental problems. Many sources of information were integrated as appropriate using geographic information systems.

  17. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey,...

  18. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey,...

  19. Quality control considerations in performing washability analyses

    SciTech Connect

    Graham, R.D.

    1984-10-01

    The author describes, in considerable detail, the procedures for carrying out washability analyses as laid down in ASTM Standard Test Method D4371. These include sampling, sample preparation, hydrometer standardisation, washability testing, and analysis of specific gravity fractions.

  20. SCM Forcing Data Derived from NWP Analyses

    DOE Data Explorer

    Jakob, Christian

    2008-01-15

    Forcing data, suitable for use with single column models (SCMs) and cloud resolving models (CRMs), have been derived from NWP analyses for the ARM (Atmospheric Radiation Measurement) Tropical Western Pacific (TWP) sites of Manus Island and Nauru.

  1. Comparison with Russian analyses of meteor impact

    SciTech Connect

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  2. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... requirements of part 20 of this chapter. (d) Analyses of the long-term stability of the disposal site and the... processes such as erosion, mass wasting, slope failure, settlement of wastes and backfill,...

  3. 10 CFR 61.13 - Technical analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... requirements of part 20 of this chapter. (d) Analyses of the long-term stability of the disposal site and the... processes such as erosion, mass wasting, slope failure, settlement of wastes and backfill,...

  4. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  5. A History of Rotorcraft Comprehensive Analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  6. Analyses and forecasts with LAWS winds

    NASA Technical Reports Server (NTRS)

    Wang, Muyin; Paegle, Jan

    1994-01-01

Horizontal fluxes of atmospheric water vapor are studied for summer months during 1989 and 1992 over North and South America based on analyses from European Center for Medium Range Weather Forecasts, US National Meteorological Center, and United Kingdom Meteorological Office. The calculations are performed over 20 deg by 20 deg box-shaped midlatitude domains located to the east of the Rocky Mountains in North America, and to the east of the Andes Mountains in South America. The fluxes are determined from operational center gridded analyses of wind and moisture. Differences in the monthly mean moisture flux divergence determined from these analyses are as large as 7 cm/month precipitable water equivalent over South America, and 3 cm/month over North America. Gridded analyses at higher spatial and temporal resolution exhibit better agreement in the moisture budget study. However, significant discrepancies of the moisture flux divergence computed from different gridded analyses still exist. The conclusion is more pessimistic than Rasmusson's estimate based on station data. Further analysis reveals that the most significant sources of error result from model surface elevation fields, gaps in the data archive, and uncertainties in the wind and specific humidity analyses. Uncertainties in the wind analyses are the most important problem. The low-level jets, in particular, are substantially different in the different data archives. Part of the reason may be the way the different analysis models parameterized physical processes affecting low-level jets. The results support the inference that the noise/signal ratio of the moisture budget may be improved more rapidly by providing better wind observations and analyses than by providing better moisture data.
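The moisture budget term at the heart of the study above is the divergence of the moisture flux (qu, qv) evaluated from gridded wind and humidity fields. A minimal sketch with centered finite differences on hypothetical synthetic fields (not the operational analyses used in the paper):

```python
# Centered-difference divergence of the moisture flux (q*u, q*v) on a
# regular grid; a toy illustration of the budget term discussed above.

def flux_divergence(q, u, v, dx, dy):
    """Return d(qu)/dx + d(qv)/dy at interior grid points."""
    ny, nx = len(q), len(q[0])
    div = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            dqu_dx = (q[j][i+1]*u[j][i+1] - q[j][i-1]*u[j][i-1]) / (2*dx)
            dqv_dy = (q[j+1][i]*v[j+1][i] - q[j-1][i]*v[j-1][i]) / (2*dy)
            div[j][i] = dqu_dx + dqv_dy
    return div

# Uniform moisture advected by a uniform wind is non-divergent.
n = 5
q = [[0.01] * n for _ in range(n)]      # specific humidity, kg/kg
u = [[10.0] * n for _ in range(n)]      # zonal wind, m/s
v = [[-5.0] * n for _ in range(n)]      # meridional wind, m/s
div = flux_divergence(q, u, v, dx=1e5, dy=1e5)
```

Discrepancies between data archives enter through q, u, and v; the sketch shows why errors in the winds propagate directly into the budget.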

  7. Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst

    NASA Astrophysics Data System (ADS)

    Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina

    2015-03-01

    In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.
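The abstract above does not publish the segmentation pipeline itself; purely as an illustration of the kind of core step involved (the authors' tool is semi-automated, 3D, and far more sophisticated), nuclear segmentation can be reduced to thresholding followed by connected-component labelling:

```python
# Toy nuclear segmentation: global threshold plus 4-connected
# component labelling via flood fill. Illustrative only.

def label_nuclei(image, threshold):
    """Return (labels, count) for connected regions above threshold."""
    ny, nx = len(image), len(image[0])
    labels = [[0] * nx for _ in range(ny)]
    count = 0
    for j in range(ny):
        for i in range(nx):
            if image[j][i] > threshold and labels[j][i] == 0:
                count += 1
                stack = [(j, i)]          # flood-fill one nucleus
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < ny and 0 <= x < nx
                            and image[y][x] > threshold
                            and labels[y][x] == 0):
                        labels[y][x] = count
                        stack += [(y+1, x), (y-1, x), (y, x+1), (y, x-1)]
    return labels, count

# Two bright blobs on a dark background -> two "nuclei".
img = [[0]*8 for _ in range(8)]
img[1][1] = img[1][2] = img[2][1] = 200
img[5][5] = img[5][6] = img[6][6] = 180
labels, n = label_nuclei(img, threshold=100)
```

Per-nucleus labels are what make single-cell-level quantification (intensity, position, lineage marker ratios) possible downstream.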

  8. Albedo Pattern Recognition and Time-Series Analyses in Malaysia

    NASA Astrophysics Data System (ADS)

    Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.

    2012-07-01

Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate condition monitoring. This study was conducted to identify albedo pattern changes over Malaysia. The patterns and changes identified will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration, and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analyses were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and patterns with respect to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified in the maximum and minimum values of the albedo. The rises and falls of the line graph show a trend similar to the daily observations; the difference lies in the magnitude of the rises and falls of albedo. Thus, it can be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons.
However, although the average albedo shows a linear trend with the nebulosity index, the pattern of albedo changes with respect to the nebulosity index indicates that there are external factors influencing the albedo values, as the plotted sky conditions and diffusion do not have a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows a high negative linear

  9. Diagnostic Imaging

    MedlinePlus

    Diagnostic imaging lets doctors look inside your body for clues about a medical condition. A variety of machines and ... and activities inside your body. The type of imaging your doctor uses depends on your symptoms and ...

  10. Medical Imaging.

    ERIC Educational Resources Information Center

    Barker, M. C. J.

    1996-01-01

    Discusses four main types of medical imaging (x-ray, radionuclide, ultrasound, and magnetic resonance) and considers their relative merits. Describes important recent and possible future developments in image processing. (Author/MKR)

  11. Computer techniques used for some enhancements of ERTS images

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Goetz, A. F. H.

    1973-01-01

    The JPL VICAR image processing system has been used for the enhancement of images received from the ERTS for the Arizona geology mapping experiment. This system contains flexible capabilities for reading and repairing MSS digital tape images, for geometric corrections and interpicture registration, for various enhancements and analyses of the data, and for display of the images in black and white and color.

  12. Measuring image quality in overlapping areas of panoramic composed images

    NASA Astrophysics Data System (ADS)

    Mitjà, Carles; Bover, Toni; Escofet, Jaume

    2012-06-01

Several professional photographic applications use the merging of consecutive overlapping images in order to obtain bigger files by means of stitching techniques, or an extended field of view (FOV) for panoramic images. All of these applications share the fact that the final composed image is obtained by overlapping the neighboring areas of consecutive individual images taken as a mosaic, or as a series of tiles over the scene, from the same point of view. Any individual image taken with a given lens can carry residual aberrations, several of which will most probably affect the borders of the image frame. Furthermore, the amount of distortion aberration present in the images of a given lens will be reversed in position for the two overlapping areas of a pair of consecutive takings. Finally, the different images used in composing the final one have corresponding overlapping areas taken with different perspective. It follows that the software employed must remap all the pixel information in order to resize and match image features in those overlapping areas, providing a final composed image with the desired perspective projection. The work presented analyses two panoramic-format images taken with a pair of lenses and composed by means of state-of-the-art stitching software. A series of images is taken to cover an FOV three times the original lens FOV; the images are merged by means of software in common use in professional panoramic photography, and the final image quality is evaluated through a series of targets positioned in strategic locations over the whole field of view. This allows measuring the resulting resolution and modulation transfer function (MTF). The results are compared with previous measurements on the original individual images.

  13. Fractal and Lacunarity Analyses: Quantitative Characterization of Hierarchical Surface Topographies.

    PubMed

    Ling, Edwin J Y; Servio, Phillip; Kietzig, Anne-Marie

    2016-02-01

    Biomimetic hierarchical surface structures that exhibit features having multiple length scales have been used in many technological and engineering applications. Their surface topographies are most commonly analyzed using scanning electron microscopy (SEM), which only allows for qualitative visual assessments. Here we introduce fractal and lacunarity analyses as a method of characterizing the SEM images of hierarchical surface structures in a quantitative manner. Taking femtosecond laser-irradiated metals as an example, our results illustrate that, while the fractal dimension is a poor descriptor of surface complexity, lacunarity analysis can successfully quantify the spatial texture of an SEM image; this, in turn, provides a convenient means of reporting changes in surface topography with respect to changes in processing parameters. Furthermore, lacunarity plots are shown to be sensitive to the different length scales present within a hierarchical structure due to the reversal of lacunarity trends at specific magnifications where new features become resolvable. Finally, we have established a consistent method of detecting pattern sizes in an image from the oscillation of lacunarity plots. Therefore, we promote the adoption of lacunarity analysis as a powerful tool for quantitative characterization of, but not limited to, multi-scale hierarchical surface topographies. PMID:26758776
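The abstract does not define lacunarity; a standard estimator (and a plausible reading of the analysis described) is the gliding-box statistic Λ(r) = ⟨M²⟩/⟨M⟩², where M is the pixel mass inside an r×r box slid over the binarized image. A minimal sketch:

```python
# Gliding-box lacunarity of a binary image: slide an r x r box over
# every position, record the box mass M, and return <M^2>/<M>^2.
# A spatially uniform image has lacunarity exactly 1; gappier,
# more heterogeneous textures give larger values.

def lacunarity(img, r):
    ny, nx = len(img), len(img[0])
    masses = []
    for j in range(ny - r + 1):
        for i in range(nx - r + 1):
            masses.append(sum(img[j+dj][i+di]
                              for dj in range(r) for di in range(r)))
    mean = sum(masses) / len(masses)
    mean_sq = sum(m * m for m in masses) / len(masses)
    return mean_sq / (mean * mean)

uniform = [[1] * 6 for _ in range(6)]
gappy = [[1 if (i + j) % 3 == 0 else 0 for i in range(6)] for j in range(6)]
```

Repeating the computation across box sizes r gives the lacunarity plot whose oscillations, as the authors note, reveal the length scales of a hierarchical texture.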

  14. Web-based cephalometric procedure for craniofacial and dentition analyses

    NASA Astrophysics Data System (ADS)

    Arun Kumar, N. S.; Kamath, Srijit R.; Ram, S.; Muthukumaran, B.; Venkatachalapathy, A.; Nandakumar, A.; Jayakumar, P.

    2000-05-01

Craniofacial analysis is a very important and widely used procedure in orthodontic cephalometry, which plays a key role in diagnosis and treatment planning. It involves establishing reference standards and specifying landmarks and variables. The manual approach takes up a tremendous amount of the orthodontist's time. In this paper, we developed a web-based approach for craniofacial and dentition analyses. A digital computed radiography (CR) system is utilized for obtaining the craniofacial image, which is stored as a bitmap file. The system comprises two components: a server and a client. The server component is a program that runs on a remote machine. To use the system, the user connects to the website. The client component is then activated, which uploads the image from the PC and displays it on the canvas area. The landmarks are identified using a mouse interface. The reference lines are generated. The resulting image is then sent to the server, which performs all measurements and calculates the mean, standard deviation, etc. of the variables. The results are sent immediately to the client, where they are displayed in a separate frame alongside the standard values for comparison. This system eliminates the need for every user to load other expensive programs on his machine.

  15. Dynamic and static error analyses of neutron radiography testing

    SciTech Connect

    Joo, H.; Glickstein, S.S.

    1999-03-01

Neutron radiography systems are being used for real-time visualization of dynamic behavior as well as time-averaged measurements of spatial vapor fraction distributions for two-phase fluids. The data, in the form of video images, are typically recorded on videotape at 30 frames per second. Image analysis of the video pictures is used to extract time-dependent or time-averaged data. The determination of the average vapor fraction requires averaging the logarithm of the time-dependent intensity measurements of the neutron beam (the gray scale distribution of the image) that passes through the fluid. This can be significantly different from averaging the intensity of the transmitted beam and then taking the logarithm of that term. This difference is termed the dynamic error (error in the time-averaged vapor fractions due to the inherent time dependence of the measured data) and is separate from the static error (statistical sampling uncertainty). Detailed analyses of both sources of error are discussed.
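The dynamic error arises because, for attenuation-based measurements, the time-averaged vapor fraction depends on ⟨ln I⟩, which by Jensen's inequality differs from ln⟨I⟩ whenever the intensity fluctuates. A small numeric sketch of the discrepancy, using illustrative values rather than the paper's data:

```python
import math

# Fluctuating transmitted-beam intensities (arbitrary units).
# Averaging the logarithm is not the same as taking the logarithm
# of the average -- the gap is the "dynamic error" discussed above.
intensities = [0.2, 0.9, 0.5, 0.8, 0.1]

mean_of_log = sum(math.log(i) for i in intensities) / len(intensities)
log_of_mean = math.log(sum(intensities) / len(intensities))

bias = log_of_mean - mean_of_log   # strictly positive for varying I
```

The bias vanishes only when the intensity is constant in time, which is why averaging must be done on the logarithms of the individual frames.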

  16. Proof Image

    ERIC Educational Resources Information Center

    Kidron, Ivy; Dreyfus, Tommy

    2014-01-01

    The emergence of a proof image is often an important stage in a learner's construction of a proof. In this paper, we introduce, characterize, and exemplify the notion of proof image. We also investigate how proof images emerge. Our approach starts from the learner's efforts to construct a justification without (or before) attempting any…

  17. Image alignment

    DOEpatents

    Dowell, Larry Jonathan

    2014-04-22

Disclosed is a method and device for aligning at least two digital images. An embodiment may use frequency-domain transforms of small tiles created from each image to identify substantially similar, "distinguishing" features within each of the images, and then align the images together based on the location of the distinguishing features. To accomplish this, an embodiment may create equal sized tile sub-images for each image. A "key" for each tile may be created by performing a frequency-domain transform calculation on each tile. An information-distance difference between each possible pair of tiles on each image may be calculated to identify distinguishing features. From analysis of the information-distance differences of the pairs of tiles, a subset of tiles with high discrimination metrics in relation to other tiles may be located for each image. The subset of distinguishing tiles for each image may then be compared to locate tiles with substantially similar keys and/or information-distance metrics to other tiles of other images. Once similar tiles are located for each image, the images may be aligned in relation to the identified similar tiles.
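The patent abstract outlines the pipeline without code. As a rough illustration only, with a naive 2D DFT standing in for whatever frequency-domain transform an embodiment actually uses, and a simple L1 distance standing in for its unspecified information-distance metric, per-tile keys might be computed and compared like this:

```python
import cmath

# Illustrative only: a frequency-domain "key" for each small tile
# (naive 2D DFT magnitudes) and a simple key-to-key distance.

def tile_key(tile):
    """Magnitudes of the 2D DFT of a square tile, flattened."""
    n = len(tile)
    key = []
    for u in range(n):
        for v in range(n):
            acc = 0j
            for y in range(n):
                for x in range(n):
                    acc += tile[y][x] * cmath.exp(-2j * cmath.pi
                                                  * (u * y + v * x) / n)
            key.append(abs(acc))
    return key

def key_distance(a, b):
    """L1 distance between two tile keys."""
    return sum(abs(p - q) for p, q in zip(a, b))

t1 = [[1, 2], [3, 4]]
t2 = [[1, 2], [3, 5]]   # slightly different content
```

Tiles whose keys are far from all other tiles in the same image are "distinguishing"; matching such tiles across images then yields the alignment.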

  18. Canonical Images

    ERIC Educational Resources Information Center

    Hewitt, Dave

    2007-01-01

    In this article, the author offers two well-known mathematical images--that of a dot moving around a circle; and that of the tens chart--and considers their power for developing mathematical thinking. In his opinion, these images each contain the essence of a particular topic of mathematics. They are contrasting images in the sense that they deal…

  19. Image tubes

    SciTech Connect

    Csorba, I.P.

    1985-01-01

    This text provides a wealth of valuable, hard-to-find data on electron optics, imaging, and image intensification systems. The author explains details of image tube theory, design, construction, and components. He includes material on the design and operation of camera tubes, power components, and secondary electron emitters, as well as data on photomultiplier tubes and electron guns.

  20. Finite element analyses of CCAT preliminary design

    NASA Astrophysics Data System (ADS)

    Sarawit, Andrew T.; Kan, Frank W.

    2014-07-01

This paper describes the development of the CCAT telescope finite element model (FEM) and the analyses performed to support the preliminary design work. CCAT will be a 25 m diameter telescope operating in the 0.2 to 2 mm wavelength range. It will be located at an elevation of 5600 m on Cerro Chajnantor in Northern Chile, near ALMA. The telescope will be equipped with wide-field cameras and spectrometers mounted at the two Nasmyth foci. The telescope will be inside an enclosure to protect it from wind buffeting, direct solar heating, and bad weather. The main structures of the telescope include a steel Mount and a carbon-fiber-reinforced-plastic (CFRP) primary truss. The finite element model developed in this study was used to perform modal, frequency response, seismic response spectrum, stress, and deflection analyses of the telescope. Modal analyses of the telescope were performed to compute the structure's natural frequencies and mode shapes and to obtain reduced-order modal output at selected locations in the telescope structure to support the design of the Mount control system. Modal frequency response analyses were also performed to compute transfer functions at these selected locations. Seismic response spectrum analyses of the telescope subject to the Maximum Likely Earthquake were performed to compute peak accelerations and seismic demand stresses. Stress analyses were performed for gravity load to obtain gravity demand stresses. Deflection analyses for gravity load, thermal load, and differential elevation drive torque were performed so that the CCAT Observatory can verify that the structures meet the stringent telescope surface and pointing error requirements.

  1. Prismatic analyser concept for neutron spectrometers

    SciTech Connect

    Birk, Jonas O.; Jacobsen, Johan; Hansen, Rasmus L.; Lefmann, Kim; Markó, Márton; Niedermayer, Christof; Freeman, Paul G.; Christensen, Niels B.; Månsson, Martin; Rønnow, Henrik M.

    2014-11-15

Developments in modern neutron spectroscopy have led to typical sample sizes decreasing from a few cm to several mm in diameter. We demonstrate how small samples, together with the right choice of analyser and detector components, make distance collimation an important concept in crystal analyser spectrometers. We further show that this opens new possibilities where neutrons with different energies are reflected by the same analyser but counted in different detectors, thus improving both energy resolution and total count rate compared to conventional spectrometers. The technique can readily be combined with advanced focussing geometries and with multiplexing instrument designs. We present a combination of simulations and data showing three different energies simultaneously reflected from one analyser. Experiments were performed on a cold triple-axis instrument and on a prototype inverse-geometry time-of-flight spectrometer installed at PSI, Switzerland, and show excellent agreement with the predictions. Typical improvements will be 2.0 times finer resolution and a factor of 1.9 in flux gain compared to a focussing Rowland geometry, or 3.3 times finer resolution and a factor of 2.4 in flux gain compared to a single flat analyser slab.

  2. Prismatic analyser concept for neutron spectrometers.

    PubMed

    Birk, Jonas O; Markó, Márton; Freeman, Paul G; Jacobsen, Johan; Hansen, Rasmus L; Christensen, Niels B; Niedermayer, Christof; Månsson, Martin; Rønnow, Henrik M; Lefmann, Kim

    2014-11-01

Developments in modern neutron spectroscopy have led to typical sample sizes decreasing from a few cm to several mm in diameter. We demonstrate how small samples, together with the right choice of analyser and detector components, make distance collimation an important concept in crystal analyser spectrometers. We further show that this opens new possibilities where neutrons with different energies are reflected by the same analyser but counted in different detectors, thus improving both energy resolution and total count rate compared to conventional spectrometers. The technique can readily be combined with advanced focussing geometries and with multiplexing instrument designs. We present a combination of simulations and data showing three different energies simultaneously reflected from one analyser. Experiments were performed on a cold triple-axis instrument and on a prototype inverse-geometry time-of-flight spectrometer installed at PSI, Switzerland, and show excellent agreement with the predictions. Typical improvements will be 2.0 times finer resolution and a factor of 1.9 in flux gain compared to a focussing Rowland geometry, or 3.3 times finer resolution and a factor of 2.4 in flux gain compared to a single flat analyser slab. PMID:25430125

  3. Quality assurance of ultrasound imaging instruments by monitoring the monitor.

    PubMed

    Walker, J B; Thorne, G C; Halliwell, M

    1993-11-01

    Ultrasound quality assurance (QA) is a means of assuring the constant performance of an ultrasound instrument. A novel 'ultrasound image analyser' has been developed to allow objective, accurate and repeatable measurement of the image displayed on the ultrasound screen, i.e. as seen by the operator. The analyser uses a television camera/framestore combination to digitize and analyse this image. A QA scheme is described along with the procedures necessary to obtain a repeatable measurement of the image so that comparisons with earlier good images can be made. These include repositioning the camera and resetting the video display characteristics. The advantages of using the analyser over other methods are discussed. It is concluded that the analyser has distinct advantages over subjective image assessment methods and will be a valuable addition to current ultrasound QA programmes. PMID:8272435

  4. Ultrasonic colour flow imaging.

    PubMed

    Wells, P N

    1994-12-01

    Real-time ultrasonic colour flow imaging, which was first demonstrated to be feasible only about a decade ago, has come into widespread clinical use. Ultrasound is scattered by ensembles of red blood cells. The ultrasonic frequency that gives the best signal-to-noise ratio for backscattering from blood depends on the required penetration. The frequency of ultrasound backscattered from flowing blood is shifted by the Doppler effect. The direction of flow can be determined by phase quadrature detection, and range selectivity can be provided by pulse-echo time-delay measurements. The Doppler frequency spectrum can be determined by Fourier analysis. Early two- and three-dimensional flow-imaging systems used slow manual scanning; velocity colour coding was introduced. Real-time colour flow imaging first became feasible when autocorrelation detection was used to extract the Doppler signal. Time-domain processing, which is a broad-band technique, was also soon shown to be practicable, for analysing both radio-frequency pulse-echo wavetrains and two-dimensional image speckle. Frequency- and time-domain processing both require effective cancellation of stationary echoes. The time-domain approach seems to have advantages in relation to both aliasing and the effects of attenuation in overlying tissues. Colour-coding schemes that can be interpreted without the need to refer to keys have been adopted, for both velocity and flow disturbance. Colour coding according to signal power has also been reintroduced. Three-dimensional display has been demonstrated. In interpreting colour flow images, it is important to understand the functions of critical system controls and the origins of artifacts. Various strategies can be adopted to increase the image frame rate. The problems of performance measurement and safety need to be kept under review. 
There are numerous opportunities for further development of ultrasonic colour flow imaging, including improvements in system design, methods of
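The Doppler relationship at the heart of colour flow imaging can be illustrated numerically. The sketch below is an illustration with assumed example values (5 MHz transducer, 1540 m/s tissue sound speed, 0.5 m/s flow); the numbers are not taken from the record itself.

```python
# Doppler shift of ultrasound backscattered from moving blood:
#   f_d = 2 * f0 * v * cos(theta) / c
# where f0 is the transmit frequency, v the blood speed,
# theta the beam-to-flow angle and c the speed of sound in tissue.
import math

def doppler_shift(f0_hz, v_m_s, theta_deg, c_m_s=1540.0):
    """Return the Doppler frequency shift in Hz."""
    return 2.0 * f0_hz * v_m_s * math.cos(math.radians(theta_deg)) / c_m_s

# Example: 5 MHz probe, 0.5 m/s flow, 60 degree insonation angle.
fd = doppler_shift(5e6, 0.5, 60.0)
```

Note that the shift falls in the audible kilohertz range, which is why Doppler flow signals can also be monitored by ear.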

  5. Geomagnetic local and regional harmonic analyses.

    USGS Publications Warehouse

    Alldredge, L.R.

    1982-01-01

Procedures are developed for using rectangular and cylindrical harmonic analyses in local and regional areas. Both the linear least squares analysis, applicable when component data are available, and the nonlinear least squares analysis, applicable when only total field data are available, are treated. When component data are available, it is advantageous to work with residual fields obtained by subtracting components derived from a harmonic potential from the observed components. When only total field intensity data are available, they must be used directly; residual values cannot be used. Cylindrical harmonic analyses are indicated when fields tend toward cylindrical symmetry; otherwise, rectangular harmonic analyses will be more advantageous. Examples illustrating each type of analysis are given.
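The linear least squares step for component data reduces to an ordinary matrix problem. The one-dimensional harmonic basis below is a hypothetical stand-in for the rectangular harmonics, chosen only to show the mechanics of fitting residual field values.

```python
import numpy as np

# Fit observed field residuals with a small harmonic basis by linear
# least squares (the situation when vector component data are available).
rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 50)           # profile coordinate (arbitrary units)
k = 2.0 * np.pi                         # one spatial cycle over the profile

# "True" residual field: constant plus one harmonic, plus measurement noise.
true_coeffs = np.array([10.0, 3.0, -2.0])
A = np.column_stack([np.ones_like(x), np.cos(k * x), np.sin(k * x)])
obs = A @ true_coeffs + rng.normal(0.0, 0.1, size=x.size)

# Linear least squares solution for the harmonic coefficients.
fit, *_ = np.linalg.lstsq(A, obs, rcond=None)
```

The nonlinear total-field case differs only in that the design matrix depends on the unknown coefficients and must be iterated.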

  6. A qualitative method for analysing multivoicedness

    PubMed Central

    Aveling, Emma-Louise; Gillespie, Alex; Cornish, Flora

    2015-01-01

    ‘Multivoicedness’ and the ‘multivoiced Self’ have become important theoretical concepts guiding research. Drawing on the tradition of dialogism, the Self is conceptualised as being constituted by a multiplicity of dynamic, interacting voices. Despite the growth in literature and empirical research, there remains a paucity of established methodological tools for analysing the multivoiced Self using qualitative data. In this article, we set out a systematic, practical ‘how-to’ guide for analysing multivoicedness. Using theoretically derived tools, our three-step method comprises: identifying the voices of I-positions within the Self’s talk (or text), identifying the voices of ‘inner-Others’, and examining the dialogue and relationships between the different voices. We elaborate each step and illustrate our method using examples from a published paper in which data were analysed using this method. We conclude by offering more general principles for the use of the method and discussing potential applications. PMID:26664292

  7. Advanced laser stratospheric monitoring systems analyses

    NASA Technical Reports Server (NTRS)

    Larsen, J. C.

    1984-01-01

This report describes the software support supplied by Systems and Applied Sciences Corporation for the study of Advanced Laser Stratospheric Monitoring Systems Analyses under contract No. NAS1-15806. This report discusses improvements to the Langley spectroscopic data base, development of LHS instrument control software, and data analysis and validation software. The effect of diurnal variations on the retrieved concentrations of NO, NO2 and ClO from a space and balloon borne measurement platform is discussed, along with the selection of optimum IF channels for sensing stratospheric species from space.

  8. Imaging Biomarkers or Biomarker Imaging?

    PubMed Central

    Mitterhauser, Markus; Wadsak, Wolfgang

    2014-01-01

Since biomarker imaging is traditionally understood as imaging of molecular probes, we strongly recommend avoiding any confusion with the previously defined term “imaging biomarkers” and, therefore, only using “molecular probe imaging (MPI)” in that context. Molecular probes (MPs) comprise all kinds of molecules administered to an organism which inherently carry a signalling moiety. This review highlights the basic concepts and differences of molecular probe imaging using specific biomarkers. In particular, PET radiopharmaceuticals are discussed in more detail. Specific radiochemical and radiopharmacological aspects as well as some legal issues are presented. PMID:24967536

  9. Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.

    PubMed

    Koprowski, Robert

    2015-11-01

The paper presents problems and solutions related to hyperspectral image pre-processing. New methods of preliminary image analysis are proposed. The paper shows problems that occur in Matlab when analysing this type of image. Moreover, new methods are discussed which provide the source code in Matlab that can be used in practice without any licensing restrictions. The proposed application and a sample result of hyperspectral image analysis are also presented. PMID:25676816

  10. Masonry: Task Analyses. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the masonry program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary courses Masonry…

  11. FAME: Software for analysing rock microstructures

    NASA Astrophysics Data System (ADS)

    Hammes, Daniel M.; Peternell, Mark

    2016-05-01

Determination of rock microstructures leads to a better understanding of the formation and deformation of polycrystalline solids. Here, we present FAME (Fabric Analyser based Microstructure Evaluation), an easy-to-use MATLAB®-based software for processing datasets recorded by an automated fabric analyser microscope. FAME is provided as a MATLAB®-independent Windows® executable with an intuitive graphical user interface. Raw data from the fabric analyser microscope can be automatically loaded, filtered and cropped before analysis. Accurate and efficient rock microstructure analysis is based on an advanced user-controlled grain labelling algorithm. The preview and testing environments simplify the determination of appropriate analysis parameters. Various statistical and plotting tools allow a graphical visualisation of the results, such as grain size, shape, c-axis orientation and misorientation. The FAME2elle algorithm exports fabric analyser data to an elle (modelling software)-supported format. FAME supports batch processing for multiple thin section analyses or large datasets that are generated, for example, during 2D in-situ deformation experiments. The use and versatility of FAME are demonstrated on quartz and deuterium ice samples.
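At its core, grain labelling of the kind FAME performs is connected-component labelling of pixels grouped by similar orientation. The sketch below is a minimal stand-alone illustration on a binary mask with assumed 4-connectivity; it is not FAME's actual, more advanced user-controlled algorithm.

```python
from collections import deque

def label_grains(mask):
    """Label 4-connected regions of truthy pixels; return (labels, count)."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and labels[r][c] == 0:
                current += 1                      # start a new grain
                queue = deque([(r, c)])
                labels[r][c] = current
                while queue:                      # flood-fill the grain
                    i, j = queue.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and mask[ni][nj] and labels[ni][nj] == 0):
                            labels[ni][nj] = current
                            queue.append((ni, nj))
    return labels, current

# Two separate "grains" in a 4x5 mask.
mask = [[1, 1, 0, 0, 0],
        [1, 0, 0, 1, 1],
        [0, 0, 0, 1, 1],
        [0, 0, 0, 0, 1]]
labels, n_grains = label_grains(mask)
```

Per-grain statistics such as size and shape then follow from the pixels sharing each label.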

  12. An electrochemical calibration unit for hydrogen analysers.

    PubMed

    Merzlikin, Sergiy V; Mingers, Andrea M; Kurz, Daniel; Hassel, Achim Walter

    2014-07-01

Determination of hydrogen in solids such as high strength steels or other metals in the ppb or ppm range requires hot-extraction or melt-extraction. Calibration of commercially available hydrogen analysers is performed either by certified reference materials (CRMs), which often have limited availability and reliability, or by gas dosing, for which the determined value depends significantly on atmospheric pressure and the construction of the gas dosing valve. The sharp and sudden appearance of very high gas concentrations from gas dosing is very different from real effusion transients and is therefore another source of errors. To overcome these limitations, an electrochemical calibration method for hydrogen analysers was developed and employed in this work. Exactly quantifiable, faradaic amounts of hydrogen can be produced in an electrochemical reaction and detected by the hydrogen analyser. The amount of hydrogen is known exactly from the charge transferred in the reaction, following Faraday's law, and the current-time program determines the apparent hydrogen effusion transient. Arbitrary shaping of the effusion transient thus becomes possible, so that the calibration fully complies with real samples. Evolution time and current were varied to determine a quantitative relationship. The device was used to produce either diprotium (H2) or dideuterium (D2) from the corresponding electrolytes. The functional principle is electrochemical in nature, so automation is straightforward and can easily be implemented at an affordable price of 1-5% of the price of the hydrogen analyser. PMID:24840442
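The quantitative relationship the calibration relies on is Faraday's law: the amount of H2 evolved is n = Q/(zF) with z = 2 electrons per molecule. A minimal sketch with assumed example values (the current and duration are illustrative, not from the record):

```python
# Faraday's law calibration: moles of H2 evolved from transferred charge.
F = 96485.332   # Faraday constant, C/mol
M_H2 = 2.016    # molar mass of H2, g/mol

def hydrogen_evolved(current_a, time_s):
    """Return (moles, mass in micrograms) of H2 for a constant-current step."""
    charge = current_a * time_s          # Q = I * t, coulombs
    moles = charge / (2.0 * F)           # 2 electrons per H2 molecule
    return moles, moles * M_H2 * 1e6     # micrograms

# Example: 1 mA evolution current applied for 100 s.
n_mol, mass_ug = hydrogen_evolved(1e-3, 100.0)
```

Programming a current-time profile rather than a single step is what lets the device mimic realistic effusion transients.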

  13. Amino acid analyses of Apollo 14 samples.

    NASA Technical Reports Server (NTRS)

    Gehrke, C. W.; Zumwalt, R. W.; Kuo, K.; Aue, W. A.; Stalling, D. L.; Kvenvolden, K. A.; Ponnamperuma, C.

    1972-01-01

Detection limits were between 300 pg and 1 ng for different amino acids in an analysis by gas-liquid chromatography of water extracts from Apollo 14 lunar fines, in which amino acids were converted to their N-trifluoroacetyl-n-butyl esters. Initial analyses of water and HCl extracts of samples 14240 and 14298 showed no amino acids above background levels.

  14. Preliminary analyses of the Heliothis virescens transcriptome

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Generation and analysis of genomic resources for the tobacco budworm, Heliothis virescens, are necessary for detailed functional genomic studies of the physiology and biochemistry of this highly destructive pest. In this study we present preliminary analyses of the ~45,000 publicly available H. vir...

  15. Conducting ANOVA Trend Analyses Using Polynomial Contrasts.

    ERIC Educational Resources Information Center

    Laija, Wilda

    When analysis of variance (ANOVA) or linear regression is used, results may only indicate statistical significance. This statistical significance tells the researcher very little about the data being analyzed. Additional analyses need to be used to extract all the possible information obtained from a study. While a priori and post hoc comparisons…
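For equally spaced, equally sized groups, the trend contrasts this abstract refers to are the classical orthogonal polynomial coefficients. The sketch below derives them for four levels by orthogonalizing powers of the level scores; it is an illustration of the technique, not code from the document.

```python
import numpy as np

def polynomial_contrasts(k):
    """Orthogonal polynomial contrast vectors (linear, quadratic, ...)
    for k equally spaced factor levels."""
    x = np.arange(1, k + 1, dtype=float)
    V = np.vander(x, k, increasing=True)   # columns: 1, x, x^2, ...
    Q, _ = np.linalg.qr(V)                 # orthonormalize the columns
    return Q[:, 1:].T                      # drop the constant column

contrasts = polynomial_contrasts(4)
linear = contrasts[0]                      # proportional to (-3, -1, 1, 3)

# Applying the linear contrast to group means tests for a linear trend;
# with normalized contrasts the trend sum of squares is n * L**2.
group_means = np.array([2.0, 4.1, 5.9, 8.0])
L = linear @ group_means
```

Because the contrasts are mutually orthogonal, the between-groups sum of squares partitions cleanly into linear, quadratic, and cubic trend components.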

  16. Analysing Simple Electric Motors in the Classroom

    ERIC Educational Resources Information Center

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  17. Cosmetology: Task Analyses. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary courses…

  18. Integrated genomic analyses of ovarian carcinoma.

    PubMed

    2011-06-30

    A catalogue of molecular aberrations that cause ovarian cancer is critical for developing and deploying therapies that will improve patients' lives. The Cancer Genome Atlas project has analysed messenger RNA expression, microRNA expression, promoter methylation and DNA copy number in 489 high-grade serous ovarian adenocarcinomas and the DNA sequences of exons from coding genes in 316 of these tumours. Here we report that high-grade serous ovarian cancer is characterized by TP53 mutations in almost all tumours (96%); low prevalence but statistically recurrent somatic mutations in nine further genes including NF1, BRCA1, BRCA2, RB1 and CDK12; 113 significant focal DNA copy number aberrations; and promoter methylation events involving 168 genes. Analyses delineated four ovarian cancer transcriptional subtypes, three microRNA subtypes, four promoter methylation subtypes and a transcriptional signature associated with survival duration, and shed new light on the impact that tumours with BRCA1/2 (BRCA1 or BRCA2) and CCNE1 aberrations have on survival. Pathway analyses suggested that homologous recombination is defective in about half of the tumours analysed, and that NOTCH and FOXM1 signalling are involved in serous ovarian cancer pathophysiology. PMID:21720365

  19. Chemical Analyses of Silicon Aerogel Samples

    SciTech Connect

    van der Werf, I.; Palmisano, F.; De Leo, Raffaele; Marrone, Stefano

    2008-04-01

After five years of operation, two aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab suffered a loss of performance. In this note possible causes of the degradation have been studied. In particular, various chemical and physical analyses have been carried out on several aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  20. Correlation Functions Aid Analyses Of Spectra

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Norton, Robert H., Jr.

    1989-01-01

New uses found for correlation functions in analyses of spectra. In approach combining elements of both pattern-recognition and traditional spectral-analysis techniques, spectral lines are identified in data that appear useless at first glance because they are dominated by noise. New approach particularly useful in measurement of concentrations of rare molecular species in atmosphere.
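The pattern-recognition idea, correlating noisy data against the expected line shape so that a line buried in noise stands out, can be sketched on synthetic data. The line position, width, and noise level below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectrum: one weak Gaussian line buried in noise.
n = 500
x = np.arange(n)
true_center = 230
line = np.exp(-0.5 * ((x - true_center) / 5.0) ** 2)   # unit-height line
spectrum = line + rng.normal(0.0, 0.3, size=n)          # noisy measurement

# Correlate against the known line shape (a matched filter); the
# correlation peak marks the line even when it is invisible by eye.
t = np.arange(-25, 26)
template = np.exp(-0.5 * (t / 5.0) ** 2)
corr = np.correlate(spectrum - spectrum.mean(), template, mode="same")
detected = int(np.argmax(corr))
```

Matched filtering gains roughly the square root of the number of samples across the line, which is what makes weak atmospheric species measurable.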

  1. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
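The 95/95 criterion described above has a well-known consequence: with first-order, one-sided Wilks tolerance limits, the number of best-estimate runs needed is the smallest n satisfying 1 − 0.95ⁿ ≥ 0.95. A quick sketch of that calculation:

```python
# Smallest number of random analysis cases n such that the largest observed
# output bounds the 95th-percentile result with 95% confidence
# (first-order, one-sided Wilks formula): 1 - coverage**n >= confidence.

def wilks_first_order(coverage=0.95, confidence=0.95):
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

runs_needed = wilks_first_order()
```

This is the origin of the familiar 59-run figure in best-estimate-plus-uncertainty analyses; higher-order variants, which bound the percentile with an order statistic below the maximum, require more runs.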

  2. The Economic Cost of Homosexuality: Multilevel Analyses

    ERIC Educational Resources Information Center

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  3. 7 CFR 94.102 - Analyses available.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analyses available. 94.102 Section 94.102 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS POULTRY AND EGG PRODUCTS...

  4. Airbags to Martian Landers: Analyses at Sandia National Laboratories

    SciTech Connect

    Gwinn, K.W.

    1994-03-01

A new direction for the national laboratories is to assist US business with research and development, primarily through cooperative research and development agreements (CRADAs). Technology transfer to the private sector has been very successful, as over 200 CRADAs are in place at Sandia. Because of these cooperative efforts, technology has evolved into some new areas not commonly associated with the former mission of the national laboratories. An example of this is the analysis of fabric structures. Explicit analyses and expertise in constructing parachutes led to the development of a next-generation automobile airbag; this led to the construction, testing, and analysis of the Jet Propulsion Laboratory Mars Environmental Survey Lander; and finally to the development of CAD-based custom garment designs using 3D scanned images of the human body. The structural analysis of these fabric structures is described, along with a more traditional Sandia example: the test/analysis correlation of the impact of a weapon container.

  5. Indirect Imaging

    NASA Astrophysics Data System (ADS)

    Kundu, Mukul R.

This book is the Proceedings of an International Symposium held in Sydney, Australia, August 30-September 2, 1983. The meeting was sponsored by the International Union of Radio Science and the International Astronomical Union. Indirect imaging is based upon the principle of determining the actual form of brightness distribution in a complex case by Fourier synthesis, using information derived from a large number of Fourier components. The main topic of the symposium was how to get the best images from data obtained from telescopes and other similar imaging instruments. Although the meeting was dominated by radio astronomers, with the consequent dominance of discussion of indirect imaging in the radio domain, there were quite a few participants from other disciplines. Thus there were some excellent discussions on optical imaging and medical imaging.

  6. Image Processing

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Flight Center for use on space shuttle Orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them and downlink images to ground based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, company could not be located, therefore contact/product information is no longer valid.

  7. Passive adaptive imaging through turbulence

    NASA Astrophysics Data System (ADS)

    Tofsted, David

    2016-05-01

Standard methods for improving imaging system performance under degrading optical turbulence conditions typically involve active adaptive techniques or post-capture image processing. Here, passive adaptive methods are considered where active sources are disallowed a priori. Theoretical analyses of short-exposure turbulence impacts indicate that different aperture sizes experience different degrees of degradation. Smaller apertures often outperform larger-aperture systems as turbulence strength increases. This suggests a controllable-aperture system is advantageous. In addition, sub-aperture sampling of a set of training images permits the system to sense tilts in different sub-aperture regions through image acquisition and image cross-correlation calculations. A four sub-aperture pattern supports corrections involving five realizable operating modes (beyond tip and tilt) for removing aberrations over an annular pattern. Progress to date will be discussed regarding development and field trials of a prototype system.
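The sub-aperture tilt sensing described above amounts to estimating image shift by cross-correlation. A minimal one-dimensional sketch on a synthetic profile (the signal model, noise level, and shift are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# Reference profile and a copy shifted by 3 samples (a "tilted"
# sub-aperture view), each with independent detector noise.
n = 256
base = np.convolve(rng.normal(size=n + 40), np.ones(9) / 9.0, mode="same")
true_shift = 3
ref = base[20:20 + n] + rng.normal(0.0, 0.02, n)
img = base[20 - true_shift:20 - true_shift + n] + rng.normal(0.0, 0.02, n)

# Cross-correlate over a small search window; the peak lag is the shift.
lags = range(-10, 11)
scores = [np.dot(ref - ref.mean(), np.roll(img - img.mean(), -lag))
          for lag in lags]
estimated = lags[int(np.argmax(scores))]
```

In the prototype described here, such per-sub-aperture tilt estimates would drive the low-order aberration correction modes.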

  8. Medical imaging

    SciTech Connect

    Chapman, D.

    1996-09-01

There are a number of medically related imaging programs at synchrotron facilities around the world. The most advanced of these are the dual energy transvenous coronary angiography imaging programs, which have progressed to human imaging for some years. The NSLS facility will be discussed and patient images from recent sessions from the NSLS and HASYLAB will be presented. The effort at the Photon Factory and Accumulator Ring will also be briefly covered, as well as future plans for the new facilities. Emphasis will be on the new aspects of these imaging programs; this includes imaging with a peripheral venous injection of the iodine contrast agent, imaging at three photon energies, and the potential of a hospital-based compact source. Other medical programs to be discussed are the multiple energy computed tomography (MECT) project at the NSLS and plans for a MECT program at the ESRF. Recently, experiments performed at the NSLS to image mammography phantoms using monochromatic beam have produced very promising results. This program will be discussed as well as some new results from imaging a phantom using a thin Laue crystal analyzer after the object to eliminate scatter onto the detector. © 1996 American Institute of Physics.

  9. Image barcodes

    NASA Astrophysics Data System (ADS)

    Damera-Venkata, Niranjan; Yen, Jonathan

    2003-01-01

A visually significant two-dimensional barcode (VSB), developed by Shaked et al., is a method for designing an information-carrying two-dimensional barcode that has the appearance of a given graphical entity, such as a company logo. The encoding and decoding of information using the VSB relies on a base image with very few gray levels (typically only two), which generally requires the image histogram to be bi-modal. For continuous-tone images such as digital photographs of individuals, the representation of tone or "shades of gray" is not only important to obtain a pleasing rendition of the face; in most cases, the VSB renders these images unrecognizable due to its inability to represent true gray-tone variations. This paper extends the concept of a VSB to an image barcode (IBC). We enable the encoding and subsequent decoding of information embedded in the hardcopy version of continuous-tone base images such as those acquired with a digital camera. The encoding-decoding process is modeled by robust data transmission through a noisy print-scan channel that is explicitly modeled. The IBC supports a high information capacity that differentiates it from common hardcopy watermarks. The reason for the improved image quality over the VSB is a joint encoding/halftoning strategy based on a modified version of block error diffusion. Encoder stability, image quality vs. information capacity tradeoffs, and decoding issues with and without explicit knowledge of the base image are discussed.
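The halftoning half of the joint encoding/halftoning strategy builds on error diffusion. The sketch below shows classical Floyd-Steinberg error diffusion, the technique the paper's block variant modifies; it is an illustration of the underlying idea, not the authors' modified algorithm.

```python
import numpy as np

def floyd_steinberg(img):
    """Binarize a grayscale image in [0, 1], diffusing quantization error
    to unprocessed neighbours (Floyd-Steinberg weights 7, 3, 5, 1 over 16)."""
    f = img.astype(float).copy()
    h, w = f.shape
    out = np.zeros_like(f)
    for y in range(h):
        for x in range(w):
            out[y, x] = 1.0 if f[y, x] >= 0.5 else 0.0
            err = f[y, x] - out[y, x]
            if x + 1 < w:
                f[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    f[y + 1, x - 1] += err * 3 / 16
                f[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    f[y + 1, x + 1] += err * 1 / 16
    return out

# Halftone a horizontal gray ramp; the mean tone is approximately preserved.
ramp = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
halftone = floyd_steinberg(ramp)
```

Because the diffused error keeps local average tone intact, a data-carrying dot pattern can be embedded while the continuous-tone appearance of the base image survives.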

  10. Scientific Images

    MedlinePlus

Alzheimer's Disease Mechanisms and Processes: The medical illustration images available below may be downloaded in ...

  11. Body Imaging

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Magnetic Resonance Imaging (MRI) and Computer-aided Tomography (CT) images are often complementary. In most cases, MRI is good for viewing soft tissue but not bone, while CT images are good for bone but not always good for soft tissue discrimination. Physicians and engineers in the Department of Radiology at the University of Michigan Hospitals are developing a technique for combining the best features of MRI and CT scans to increase the accuracy of discriminating one type of body tissue from another. One of their research tools is a computer program called HICAP. The program can be used to distinguish between healthy and diseased tissue in body images.

  12. Multispectral imaging and image processing

    NASA Astrophysics Data System (ADS)

    Klein, Julie

    2014-02-01

The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview will be given of several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by filters and the camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of multispectral images. Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics like stereo multispectral imaging and goniometric multispectral measurements that are further explored with his expertise will also be presented in this work.

  13. Factor Analysis of the Image Correlation Matrix.

    ERIC Educational Resources Information Center

    Kaiser, Henry F.; Cerny, Barbara A.

    1979-01-01

    Whether to factor the image correlation matrix or to use a new model with an alpha factor analysis of it is mentioned, with particular reference to the determinacy problem. It is pointed out that the distribution of the images is sensibly multivariate normal, making for "better" factor analyses. (Author/CTM)

  14. Accuracy in Quantitative 3D Image Analysis

    PubMed Central

    Bassel, George W.

    2015-01-01

    Quantitative 3D imaging is becoming an increasingly popular and powerful approach to investigate plant growth and development. With the increased use of 3D image analysis, standards to ensure the accuracy and reproducibility of these data are required. This commentary highlights how image acquisition and postprocessing can introduce artifacts into 3D image data and proposes steps to increase both the accuracy and reproducibility of these analyses. It is intended to aid researchers entering the field of 3D image processing of plant cells and tissues and to help general readers in understanding and evaluating such data. PMID:25804539

  15. Image Tool

    SciTech Connect

    Baker, S.A.; Gardner, S.D.; Rogers, M.L.; Sanders, F.; Tunnell, T.W.

    2001-01-01

    ImageTool is a software package developed at Bechtel Nevada, Los Alamos Operations. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data. Performance measures are used to identify capabilities and limitations of a camera system, while establishing a means for comparing systems. The camera evaluations are designed to provide system performance, camera comparison and system modeling information. This program is used to evaluate digital camera images. ImageTool provides basic image restoration and analysis features along with a special set of camera evaluation tools which are used to standardize camera system characterizations. This process is started with the acquisition of a well-defined set of calibration images. Image processing algorithms provide a consistent means of evaluating the camera calibration data. Performance measures in the areas of sensitivity, noise, and resolution are used as a basis for comparing camera systems and evaluating experimental system performance. Camera systems begin with a charge-coupled device (CCD) camera and optical relay system and may incorporate image intensifiers, electro-static image tubes, or electron bombarded charge-coupled devices (EBCCDs). Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera types evaluated include gated intensified cameras and multi-frame cameras used in applications ranging from X-ray radiography to visible and infrared imaging. It is valuable to evaluate the performance of a camera system in order to determine if a particular system meets experimental requirements. In this paper we highlight the processing features of ImageTool.

  16. Advanced Land Imager Assessment System

    NASA Technical Reports Server (NTRS)

    Chander, Gyanesh; Choate, Mike; Christopherson, Jon; Hollaren, Doug; Morfitt, Ron; Nelson, Jim; Nelson, Shar; Storey, James; Helder, Dennis; Ruggles, Tim; Kaita, Ed; Levy, Raviv; Ong, Lawrence; Markham, Brian; Schweiss, Robert

    2008-01-01

The Advanced Land Imager Assessment System (ALIAS) supports radiometric and geometric image processing for the Advanced Land Imager (ALI) instrument onboard NASA's Earth Observing-1 (EO-1) satellite. ALIAS consists of two processing subsystems for radiometric and geometric processing of the ALI's multispectral imagery. The radiometric processing subsystem characterizes and corrects, where possible, radiometric qualities including coherent, impulse, and random noise; signal-to-noise ratios (SNRs); detector operability; gain; bias; saturation levels; striping and banding; and the stability of detector performance. The geometric processing subsystem and analysis capabilities support sensor alignment calibrations, sensor chip assembly (SCA)-to-SCA alignments and band-to-band alignment, and perform geodetic accuracy assessments, modulation transfer function (MTF) characterizations, and image-to-image characterizations. ALIAS also characterizes and corrects band-to-band registration, and performs systematic precision and terrain correction of ALI images. This system can geometrically correct, and automatically mosaic, the SCA image strips into a seamless, map-projected image. This system provides a large database, which enables bulk trending for all ALI image data and significant instrument telemetry. Bulk trending consists of two functions: Housekeeping Processing and Bulk Radiometric Processing. The Housekeeping function pulls telemetry and temperature information from the instrument housekeeping files and writes this information to a database for trending. The Bulk Radiometric Processing function writes statistical information from the dark data acquired before and after the Earth imagery, and from the lamp data, to the database for trending. This allows for multi-scene statistical analyses.

  17. The life sciences Global Image Database (GID)

    PubMed Central

    Gonzalez-Couto, Eduardo; Hayes, Brian; Danckaert, Anne

    2001-01-01

Although a vast amount of life sciences data is generated in the form of images, most scientists still store images on extremely diverse and often incompatible storage media, without any type of metadata structure, and thus with no standard facility with which to conduct searches or analyses. Here we present a solution to unlock the value of scientific images. The Global Image Database (GID) is a web-based (http://www.gwer.ch/qv/gid/gid.htm) structured central repository for scientific annotated images. The GID was designed to manage images from a wide spectrum of imaging domains ranging from microscopy to automated screening. The annotations in the GID define the source experiment of the images by describing who the authors of the experiment are, when the images were created, the biological origin of the experimental sample and how the sample was processed for visualization. A collection of experimental imaging protocols provides details of the sample preparation, and labeling, or visualization procedures. In addition, the entries in the GID reference these imaging protocols with the probe sequences or antibody names used in labeling experiments. The GID annotations are searchable by field or globally. The query results are first shown as image thumbnail previews, enabling quick browsing prior to original-sized annotated image retrieval. The development of the GID continues, aiming at facilitating the management and exchange of image data in the scientific community, and at creating new query tools for mining image data. PMID:11125130

  18. Causal Mediation Analyses for Randomized Trials.

    PubMed

    Lynch, Kevin G; Cary, Mark; Gallop, Robert; Ten Have, Thomas R

    2008-01-01

    In the context of randomized intervention trials, we describe causal methods for analyzing how post-randomization factors constitute the process through which randomized baseline interventions act on outcomes. Traditionally, such mediation analyses have been undertaken with great caution, because they assume that the mediating factor is also randomly assigned to individuals in addition to the randomized baseline intervention (i.e., sequential ignorability). Because the mediating factors are typically not randomized, such analyses are unprotected from unmeasured confounders that may lead to biased inference. We review several causal approaches that attempt to reduce such bias without assuming that the mediating factor is randomized. However, these causal approaches require certain interaction assumptions that may be assessed if there is enough treatment heterogeneity with respect to the mediator. We describe available estimation procedures in the context of several examples from the literature and provide resources for software code. PMID:19484136

  19. Sensitivity in risk analyses with uncertain numbers.

    SciTech Connect

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a ''pinching'' strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
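The "pinching" strategy is easy to illustrate for a monotone model with interval (epistemic) inputs: pinch one input to a point value and measure how much the output interval shrinks. The dike freeboard model and the interval bounds below are hypothetical stand-ins, not values from the cited assessment:

```python
import numpy as np

def dike_margin(h_crest, h_water, settle):
    # Toy monotone model: freeboard margin of a dike (hypothetical).
    return h_crest - h_water - settle

def interval_output(model, intervals):
    # For a monotone model, the output interval is spanned by the
    # corner combinations of the input intervals.
    corners = np.array(np.meshgrid(*intervals)).T.reshape(-1, len(intervals))
    vals = np.array([model(*c) for c in corners])
    return vals.min(), vals.max()

def pinch_sensitivity(model, intervals, i):
    # "Pinch" input i to its midpoint and report the fractional
    # reduction in output interval width.
    lo, hi = interval_output(model, intervals)
    base_width = hi - lo
    pinched = list(intervals)
    mid = sum(intervals[i]) / 2.0
    pinched[i] = (mid, mid)
    plo, phi = interval_output(model, pinched)
    return 1.0 - (phi - plo) / base_width

intervals = [(9.5, 10.5),   # crest height [m]
             (6.0, 8.0),    # water level [m]
             (0.0, 0.5)]    # settlement [m]
for i, name in enumerate(["crest", "water", "settle"]):
    print(name, round(pinch_sensitivity(dike_margin, intervals, i), 3))
```

The input whose pinching removes the largest fraction of output width is the one whose uncertainty reduction would be most valuable, which is the decision-oriented question this form of sensitivity analysis answers.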

  20. Analyses and characterization of double shell tank

    SciTech Connect

    Not Available

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine not only whether constituent concentrations are within safe operating limits, but also whether they are within functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data that enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  1. Overview of SNS accelerator shielding analyses

    SciTech Connect

    Popova, I.; Gallmeier, F. X.; Ferguson, P.; Iverson, E.; Lu, W.

    2012-07-01

    The Spallation Neutron Source is an accelerator-driven neutron scattering facility for materials research. During all phases of SNS development, including design, construction, commissioning and operation, extensive neutronics work was performed in order to provide adequate shielding, to assure safe facility operation from a radiation protection point of view, and to optimize performance of the accelerator and target facility. Presently, most of the shielding work is concentrated on the beam lines and instrument enclosures to prepare for commissioning, safe operation, and adequate radiation backgrounds in the future. Although the accelerator is built and in operation, there is extensive demand for shielding and activation analyses. This includes redesigning some parts of the facility, facility upgrades, designing additional structures and storage and transport containers for accelerator structures taken out of service, and performing radiation protection analyses and studies on residual dose rates inside the accelerator. (authors)

  2. Neuronal network analyses: premises, promises and uncertainties

    PubMed Central

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the difficulties of understanding network function. Nevertheless, in more complex systems (including human), claims are made that the cellular bases of behaviour are, or will shortly be, understood. While the discussion is necessarily limited, this issue will examine these claims and highlight some traditional and novel aspects of network analyses and their difficulties. This introduction discusses the criteria that need to be satisfied for network understanding, and how they relate to traditional and novel approaches being applied to addressing network function. PMID:20603354

  3. Reliability of chemical analyses of water samples

    SciTech Connect

    Beardon, R.

    1989-11-01

    Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of on-going monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.

  4. Analysing organic transistors based on interface approximation

    SciTech Connect

    Akiyama, Yuto; Mori, Takehiko

    2014-01-15

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region.

  5. Inelastic and Dynamic Fracture and Stress Analyses

    NASA Technical Reports Server (NTRS)

    Atluri, S. N.

    1984-01-01

    Large deformation inelastic stress analysis and inelastic and dynamic crack propagation research work is summarized. The salient topics of interest in engine structure analysis that are discussed herein include: (1) a path-independent integral (T) in inelastic fracture mechanics, (2) analysis of dynamic crack propagation, (3) generalization of constitutive relations of inelasticity for finite deformations, (4) complementary energy approaches in inelastic analyses, and (5) objectivity of time integration schemes in inelastic stress analysis.

  6. Seven New Bulk Chemical Analyses of Aubrites

    NASA Astrophysics Data System (ADS)

    Easton, A. J.

    1985-09-01

    New bulk chemical analyses are given of Aubres, Bishopville, Bustee, Khor Temiki, Norton County, Peña Blanca Spring and Shallowater. Selective attack by dry chlorine (350°C) on magnetic and non-magnetic fractions was used to determine the distribution of some normally lithophile elements (Al, Ca, Cr, K, Mg, Mn, Na, P and Ti) between silicate and sulphide groups of minerals.

  7. BWR core melt progression phenomena: Experimental analyses

    SciTech Connect

    Ott, L.J.

    1992-06-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA exreactor fuel-damage test facility. Additional integral data will be obtained from new CORA BWR tests, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of exreactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than has previously been available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models with materials interaction, relocation and blockage models are currently being implemented in SCDAP/RELAP5 as an optional structural component.

  8. BWR core melt progression phenomena: Experimental analyses

    SciTech Connect

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA exreactor fuel-damage test facility. Additional integral data will be obtained from new CORA BWR tests, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of exreactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than has previously been available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models with materials interaction, relocation and blockage models are currently being implemented in SCDAP/RELAP5 as an optional structural component.

  9. Unsteady aerodynamic analyses for turbomachinery aeroelastic predictions

    NASA Technical Reports Server (NTRS)

    Verdon, Joseph M.; Barnett, M.; Ayer, T. C.

    1994-01-01

    Applications for unsteady aerodynamics analysis in this report are: (1) aeroelastic: blade flutter and forced vibration; (2) aeroacoustic: noise generation; (3) vibration and noise control; and (4) effects of unsteadiness on performance. This requires that the numerical simulations and analytical modeling be accurate and efficient and contain realistic operating conditions and arbitrary modes of unsteady excitation. The assumptions of this application contend that: (1) turbulence and transition can be modeled with the Reynolds-averaged Navier-Stokes equations; (2) 'attached' flow at high Reynolds number will require thin-layer Navier-Stokes equations or inviscid/viscid interaction analyses; (3) small-amplitude unsteady excitations will need nonlinear steady and linearized unsteady analyses; and (4) the limit of Re approaching infinity corresponds to inviscid flow. Several computer programs (LINFLO, CLT, UNSVIS, and SFLOW-IVI) are utilized for these analyses. Results and computerized grid examples are shown. This report was given during the NASA LeRC Workshop on Forced Response in Turbomachinery in August of 1993.

  10. Nuclear analyses for the ITER ECRH launcher

    NASA Astrophysics Data System (ADS)

    Serikov, A.; Fischer, U.; Heidinger, R.; Spaeh, P.; Stickel, S.; Tsige-Tamirat, H.

    2008-05-01

    Computational results of the nuclear analyses for the ECRH launcher integrated into the ITER upper port are presented. The purpose of the analyses was to provide the proof for the launcher design that the nuclear requirements specified in the ITER project can be met. The aim was achieved on the basis of 3D neutronics radiation transport calculations using the Monte Carlo code MCNP. In the course of the analyses an adequate shielding configuration against neutron and gamma radiation was developed keeping the necessary empty space for mm-waves propagation in accordance with the ECRH physics guidelines. Different variants of the shielding configuration for the extended performance front steering launcher (EPL) were compared in terms of nuclear response functions in the critical positions. Neutron damage (dpa), nuclear heating, helium production rate, neutron and gamma fluxes have been calculated under the conditions of ITER operation. It has been shown that the radiation shielding criteria are satisfied and the supposed shutdown dose rates are below the ITER nuclear design limits.

  11. Phylogenetic analyses of Andromedeae (Ericaceae subfam. Vaccinioideae).

    PubMed

    Kron, K A; Judd, W S; Crayn, D M

    1999-09-01

    Phylogenetic relationships within the Andromedeae and closely related taxa were investigated by means of cladistic analyses based on phenotypic (morphology, anatomy, chromosome number, and secondary chemistry) and molecular (rbcL and matK nucleotide sequences) characters. An analysis based on combined molecular and phenotypic characters indicates that the tribe is composed of two major clades: the Gaultheria group (incl. Andromeda, Chamaedaphne, Diplycosia, Gaultheria, Leucothoë, Pernettya, Tepuia, and Zenobia) and the Lyonia group (incl. Agarista, Craibiodendron, Lyonia, and Pieris). Andromedeae are shown to be paraphyletic in all analyses because the Vaccinieae link with some or all of the genera of the Gaultheria group. Oxydendrum is sister to the clade containing the Vaccinieae, Gaultheria group, and Lyonia group. The monophyly of Agarista, Lyonia, Pieris, and Gaultheria (incl. Pernettya) is supported, while that of Leucothoë is problematic. The close relationship of Andromeda and Zenobia is novel and was strongly supported in the molecular (but not morphological) analyses. Diplycosia, Tepuia, Gaultheria, and Pernettya form a well-supported clade, which can be diagnosed by the presence of fleshy calyx lobes and methyl salicylate. Recognition of Andromedeae is not reflective of our understanding of genealogical relationships and should be abandoned; the Lyonia group is formally recognized at the tribal level. PMID:10487817

  12. Blurred Image

    ERIC Educational Resources Information Center

    Conde, Maryse

    1975-01-01

    The growing influence of Western culture has greatly affected African women's status and image in the traditional society. Working women are confronted with the dilemma of preserving family traditions while changing their behavior and image to become members of the labor force. (MR)

  13. Image fusion

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.

  14. Cerenkov imaging.

    PubMed

    Das, Sudeep; Thorek, Daniel L J; Grimm, Jan

    2014-01-01

    Cerenkov luminescence (CL) has recently been used in a plethora of medical applications, such as imaging and therapy with clinically relevant medical isotopes. The range of medical isotopes used is fairly large and expanding. The generation of in vivo light is useful since it circumvents depth limitations for excitation light. Cerenkov luminescence imaging (CLI) is much cheaper in terms of infrastructure than positron emission tomography (PET) and is particularly useful for imaging of superficial structures. Imaging can be done using a sensitive camera optimized for low-light conditions, and it has better resolution than any other nuclear imaging modality. CLI has been shown to effectively diagnose disease with a regularly used PET isotope ((18)F-FDG) in a clinical setting. Cerenkov luminescence tomography, Cerenkov luminescence endoscopy, and intraoperative Cerenkov imaging have also been explored with positive conclusions, expanding the current range of applications. Cerenkov has also been used to improve PET imaging resolution, since the source of both is the radioisotope being used. Smart imaging agents have been designed based on modulation of the Cerenkov signal using small molecules and nanoparticles, giving better insight into the tumor biology. PMID:25287690

  15. Imaging Atherosclerosis.

    PubMed

    Tarkin, Jason M; Dweck, Marc R; Evans, Nicholas R; Takx, Richard A P; Brown, Adam J; Tawakol, Ahmed; Fayad, Zahi A; Rudd, James H F

    2016-02-19

    Advances in atherosclerosis imaging technology and research have provided a range of diagnostic tools to characterize high-risk plaque in vivo; however, these important vascular imaging methods additionally promise great scientific and translational applications beyond this quest. When combined with conventional anatomic- and hemodynamic-based assessments of disease severity, cross-sectional multimodal imaging incorporating molecular probes and other novel noninvasive techniques can add detailed interrogation of plaque composition, activity, and overall disease burden. In the catheterization laboratory, intravascular imaging provides unparalleled access to the world beneath the plaque surface, allowing tissue characterization and measurement of cap thickness with micrometer spatial resolution. Atherosclerosis imaging captures key data that reveal snapshots into underlying biology, which can test our understanding of fundamental research questions and shape our approach toward patient management. Imaging can also be used to quantify response to therapeutic interventions and ultimately help predict cardiovascular risk. Although there are undeniable barriers to clinical translation, many of these hold-ups might soon be surpassed by rapidly evolving innovations to improve image acquisition, coregistration, motion correction, and reduce radiation exposure. This article provides a comprehensive review of current and experimental atherosclerosis imaging methods and their uses in research and potential for translation to the clinic. PMID:26892971

  16. Image reconstruction

    SciTech Connect

    Defrise, Michel; Gullberg, Grant T.

    2006-04-05

    We give an overview of the role of Physics in Medicine and Biology in the development of tomographic reconstruction algorithms. We focus on imaging modalities involving ionizing radiation, CT, PET and SPECT, and cover a wide spectrum of reconstruction problems, starting with classical 2D tomography in the 1970s up to 4D and 5D problems involving dynamic imaging of moving organs.

  17. Cerenkov Imaging

    PubMed Central

    Das, Sudeep; Thorek, Daniel L.J.; Grimm, Jan

    2014-01-01

    Cerenkov luminescence (CL) has recently been used in a plethora of medical applications, such as imaging and therapy with clinically relevant medical isotopes. The range of medical isotopes used is fairly large and expanding. The generation of in vivo light is useful since it circumvents depth limitations for excitation light. Cerenkov luminescence imaging (CLI) is much cheaper in terms of infrastructure than positron emission tomography (PET) and is particularly useful for imaging of superficial structures. Imaging can be done using a sensitive camera optimized for low-light conditions, and it has better resolution than any other nuclear imaging modality. CLI has been shown to effectively diagnose disease with a regularly used PET isotope (18F-FDG) in a clinical setting. Cerenkov luminescence tomography, Cerenkov luminescence endoscopy, and intraoperative Cerenkov imaging have also been explored with positive conclusions, expanding the current range of applications. Cerenkov has also been used to improve PET imaging resolution, since the source of both is the radioisotope being used. Smart imaging agents have been designed based on modulation of the Cerenkov signal using small molecules and nanoparticles, giving better insight into the tumor biology. PMID:25287690

  18. Piramal Imaging.

    PubMed

    Dinkelborg, Ludger

    2015-08-01

    Piramal Imaging, a division of Piramal Enterprises Ltd, is a global radiopharmaceutical company that is actively developing novel PET radiotracers for use in molecular imaging. The company focuses on developing innovative products that improve early detection and characterization of chronic and life-threatening diseases, leading to better therapeutic outcomes and improved quality of life. PMID:26295720

  19. Imaging Genetics

    ERIC Educational Resources Information Center

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  20. Imaging Atherosclerosis

    PubMed Central

    Tarkin, Jason M.; Dweck, Marc R.; Evans, Nicholas R.; Takx, Richard A.P.; Brown, Adam J.; Tawakol, Ahmed; Fayad, Zahi A.

    2016-01-01

    Advances in atherosclerosis imaging technology and research have provided a range of diagnostic tools to characterize high-risk plaque in vivo; however, these important vascular imaging methods additionally promise great scientific and translational applications beyond this quest. When combined with conventional anatomic- and hemodynamic-based assessments of disease severity, cross-sectional multimodal imaging incorporating molecular probes and other novel noninvasive techniques can add detailed interrogation of plaque composition, activity, and overall disease burden. In the catheterization laboratory, intravascular imaging provides unparalleled access to the world beneath the plaque surface, allowing tissue characterization and measurement of cap thickness with micrometer spatial resolution. Atherosclerosis imaging captures key data that reveal snapshots into underlying biology, which can test our understanding of fundamental research questions and shape our approach toward patient management. Imaging can also be used to quantify response to therapeutic interventions and ultimately help predict cardiovascular risk. Although there are undeniable barriers to clinical translation, many of these hold-ups might soon be surpassed by rapidly evolving innovations to improve image acquisition, coregistration, motion correction, and reduce radiation exposure. This article provides a comprehensive review of current and experimental atherosclerosis imaging methods and their uses in research and potential for translation to the clinic. PMID:26892971

  1. Retinal Imaging and Image Analysis

    PubMed Central

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:21743764

  2. Bivariate flood frequency analyses using Copula function

    NASA Astrophysics Data System (ADS)

    Sraj, Mojca; Bezak, Nejc; Brilly, Mitja

    2013-04-01

    The objective of the study was (1) to perform all steps in flood frequency analyses using the Copula approach, (2) to select the most appropriate Copula function and (3) to evaluate the conditional bivariate return periods for the following pairs of variables: peak-volume, volume-duration and peak-duration. Flood frequency analyses are usually performed with univariate distribution functions, and in most cases only peaks are considered in the analyses. However, hydrological processes are multidimensional, so it is reasonable to consider more than one variable in the analyses. Different marginal distributions can be used for Copula modelling. A Copula function successfully models dependence between two or more dependent variables, and the determination of the marginal distributions and the Copula selection are two separate processes. The hydrological station Litija on the Sava river is one of the oldest stations in Slovenia, and it lies in the eastern part of the country. 58 years of annual maxima were used for the analyses, and a three-point graphical method was used for base flow separation. The log-Pearson type 3 distribution was selected as the marginal distribution of peaks and durations, and the Pearson type 3 distribution was chosen as the marginal distribution of volumes. Some frequently used Copula functions from the Archimedean (Gumbel-Hougaard, Frank, Joe, Clayton, BB1 and Ali-Mikhail-Haq), Elliptical (Student-t and Normal) and Extreme value (Galambos, Hüsler-Reiss and Tawn) families were applied to the data. Copula parameters were estimated with the method of moments based on the inversion of Kendall's tau and with the maximum likelihood method. Graphical and statistical tests were applied for the comparison of the different Copula functions. For the pair peak-duration the Kendall correlation coefficient was negative and only Copulas able to model negative dependence were used. 
The Gumbel-Hougaard, Frank and Ali-Mikhail-Haq Copulas were selected as optimal based on tests results for the pairs: peak-volume, volume
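The moment-based fitting step can be sketched for the Gumbel-Hougaard copula, whose parameter follows from Kendall's tau as theta = 1/(1 - tau). The peak/volume data below are synthetic with positive dependence, not the Litija record:

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
# Hypothetical positively dependent annual-maximum peaks and volumes.
z = rng.normal(size=(58, 2))
peaks = np.exp(z[:, 0])
volumes = np.exp(0.7 * z[:, 0] + 0.3 * z[:, 1])

tau, _ = kendalltau(peaks, volumes)
theta = 1.0 / (1.0 - tau)   # Gumbel-Hougaard: tau = 1 - 1/theta

def gumbel_copula(u, v, theta):
    # C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Joint "AND" exceedance: P(U > u, V > v) = 1 - u - v + C(u, v),
# evaluated here at the marginal 100-year non-exceedance probabilities.
u = v = 1 - 1.0 / 100
p_and = 1 - u - v + gumbel_copula(u, v, theta)
print("theta:", round(theta, 2), "joint AND return period:",
      round(1 / p_and, 1), "years")
```

The reciprocal of the joint exceedance probability gives the bivariate return period, which always exceeds the marginal 100-year value for positively dependent pairs.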

  3. Site-specific seismic hazard analyses

    NASA Astrophysics Data System (ADS)

    Montalva, Gonzalo Andres

    Current seismic hazard analyses are generally performed using probabilistic methods. When dealing with a specific site, the typical methodology involves using a ground motion prediction equation (GMPE) to estimate the rock outcrop ground motion and associated variability; the ground motion is then propagated to the ground surface by site response analysis. The site response process is inherently variable. Including this uncertainty in site response analyses without modifying the input ground motion uncertainty produces double counting of the uncertainty associated with site response. This dissertation partitions the total uncertainty into its several contributing components, quantifies these components, and proposes methods to perform site-specific seismic hazard analyses without double counting uncertainties. Four random field models were developed, and an existing one was fitted to a different database. These models can be used to generate shear-wave velocity profiles for site response analyses. Two types of models are presented, using Gaussian random fields and using Markov Chains. The first type showed better performance, and among those a stationary Gaussian model (stationary on rho) showed the best performance; it is also the simplest among the five models. Three GMPEs were developed: one only from surface records, one from "at-depth" records, and a third combining surface and "at-depth" records. The results show the same magnitude and distance scaling for the three equations. For stations that recorded a large number of records, total uncertainty was measured by the standard deviation of the observed minus predicted residuals, and similarly for intra-event residuals. These statistics serve as lower bounds for site-specific seismic hazard analyses; note that these standard deviations are non-ergodic. 
The use of a GMPE capable of predicting bedrock and surface median ground motions, allows the partition of the components of the total uncertainty at the
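A stationary Gaussian random field for shear-wave velocity profiles, of the kind described above, can be sketched by correlating ln(Vs) residuals over depth. The exponential autocorrelation model, correlation length, and depth trend below are illustrative assumptions, not fitted values from the dissertation:

```python
import numpy as np

def vs_profile(depths, median_vs, sigma_ln=0.3, corr_len=5.0, seed=0):
    """Sample a shear-wave velocity profile whose ln(Vs) residuals form
    a stationary Gaussian field with an exponential autocorrelation
    model (illustrative parameter values only)."""
    rng = np.random.default_rng(seed)
    d = np.asarray(depths, dtype=float)
    # Exponential autocorrelation between layers i and j.
    rho = np.exp(-np.abs(d[:, None] - d[None, :]) / corr_len)
    # Cholesky factor turns independent normals into correlated ones;
    # the small jitter guards against numerical rank deficiency.
    L = np.linalg.cholesky(rho + 1e-10 * np.eye(len(d)))
    eps = L @ rng.standard_normal(len(d))
    return median_vs * np.exp(sigma_ln * eps)

depths = np.arange(0.0, 30.0, 2.0)      # layer midpoints [m]
median = 180.0 + 12.0 * depths          # hypothetical depth trend [m/s]
vs = vs_profile(depths, median)
print(np.round(vs, 1))
```

Running the sampler many times yields an ensemble of realistic velocity profiles, which is how such random field models feed Monte Carlo site response analyses.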

  4. Penalized likelihood phenotyping: unifying voxelwise analyses and multi-voxel pattern analyses in neuroimaging: penalized likelihood phenotyping.

    PubMed

    Adluru, Nagesh; Hanlon, Bret M; Lutz, Antoine; Lainhart, Janet E; Alexander, Andrew L; Davidson, Richard J

    2013-04-01

    Neuroimage phenotyping for psychiatric and neurological disorders is performed using voxelwise analyses, also known as voxel-based analyses or morphometry (VBM). A typical voxelwise analysis treats measurements at each voxel (e.g., fractional anisotropy, gray matter probability) as outcome measures to study the effects of possible explanatory variables (e.g., age, group) in a linear regression setting. Furthermore, each voxel is treated independently until the stage of correction for multiple comparisons. Recently, multi-voxel pattern analyses (MVPA), such as classification, have arisen as an alternative to VBM. The main advantage of MVPA over VBM is that the former employs multivariate methods which can account for interactions among voxels in identifying significant patterns. They also provide ways for computer-aided diagnosis and prognosis at the individual-subject level. However, compared to VBM, the results of MVPA are often more difficult to interpret and prone to arbitrary conclusions. In this paper, first we use penalized likelihood modeling to provide a unified framework for understanding both VBM and MVPA. We then utilize statistical learning theory to provide practical methods for interpreting the results of MVPA beyond commonly used performance metrics, such as leave-one-out cross-validation accuracy and area under the receiver operating characteristic (ROC) curve. Additionally, we demonstrate that there are challenges in MVPA when trying to obtain image phenotyping information in the form of statistical parametric maps (SPMs), which are commonly obtained from VBM, and provide a bootstrap strategy as a potential solution for generating SPMs using MVPA. This technique also allows us to maximize the use of available training data. We illustrate the empirical performance of the proposed framework using two different neuroimaging studies that pose different levels of challenge for classification using MVPA. PMID:23397550
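The general idea of bootstrapping a multivariate classifier to obtain an SPM-like map can be sketched as below. A ridge classifier stands in for the paper's penalized-likelihood models, and the data, dimensions, and effect sizes are fully synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_vox = 40, 200
X = rng.standard_normal((n_subj, n_vox))    # synthetic voxel features
y = rng.integers(0, 2, n_subj) * 2 - 1      # group labels in {-1, +1}
X[y == 1, :5] += 0.8                        # 5 "informative" voxels (synthetic)

def ridge_weights(X, y, lam=10.0):
    # Closed-form ridge solution w = (X'X + lam I)^-1 X'y -- a simple
    # penalized linear stand-in for the classifiers discussed above.
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

B = 200
W = np.empty((B, n_vox))
for b in range(B):
    idx = rng.integers(0, n_subj, n_subj)   # resample subjects with replacement
    W[b] = ridge_weights(X[idx], y[idx])

# Bootstrap "SPM": per-voxel weight stability expressed as a z-like score.
spm_z = W.mean(axis=0) / W.std(axis=0)
print("top voxels:", np.argsort(-np.abs(spm_z))[:5])
```

Thresholding such a map recovers a voxelwise, interpretable summary from an otherwise multivariate decision rule, which is the gap between VBM and MVPA that the bootstrap strategy addresses.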

  5. Photointerpretation of LANDSAT images

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Learning objectives include: (1) developing a facility for applying conventional techniques of photointerpretation to small scale (satellite) imagery; (2) promoting the ability to locate, identify, and interpret small natural and man-made surface features in a LANDSAT image; (3) using supporting imagery, such as aerial and space photography, to conduct specific applications analyses; (4) learning to apply change detection techniques to recognize and explain transient and temporal events in individual or seasonal imagery; (5) producing photointerpretation maps that define major surface units, themes, or classes; (6) classifying or analyzing a scene for specific discipline applications in geology, agriculture, forestry, hydrology, coastal wetlands, and environmental pollution; and (7) evaluating both advantages and shortcomings in relying on the photointerpretive approach (rather than a computer-based analytical approach) for extracting information from LANDSAT data.

  6. MULTISPECTRAL THERMAL IMAGER - OVERVIEW

    SciTech Connect

    P. WEBER

    2001-03-01

    The Multispectral Thermal Imager satellite fills a new and important role in advancing the state of the art in remote sensing sciences. Initial results with the full calibration system operating indicate that the system was already close to achieving the very ambitious goals which we laid out in 1993, and we are confident of reaching all of these goals as we continue our research and improve our analyses. In addition to the DOE interests, the satellite is tasked about one-third of the time with requests from other users supporting research ranging from volcanology to atmospheric sciences.

  7. Department of Energy's team's analyses of Soviet designed VVERs

    SciTech Connect

    Not Available

    1989-09-01

    This document provides Appendices A through K of this report. The topics discussed respectively are: radiation induced embrittlement and annealing of reactor pressure vessel steels; loss of coolant accident blowdown analyses; LOCA blowdown response analyses; non-seismic structural response analyses; seismic analyses; "S" seal integrity; reactor transient analyses; fire protection; aircraft impacts; and boric acid induced corrosion. (FI).

  8. Optical eigenmodes for illumination & imaging

    NASA Astrophysics Data System (ADS)

    Kosmeier, Sebastian

    Gravitational Microlensing, as a technique for detecting Extrasolar Planets, is recognised for its potential in discovering small-mass planets similar to Earth, at a distance of a few Astronomical Units from their host stars. However, analysing the data from microlensing events (which statistically rarely reveal planets) is complex and requires continued and intensive use of various networks of telescopes working together in order to observe the phenomenon. As such the techniques are constantly being developed and refined; this project outlines some steps of the careful analysis required to model an event and ensure the best quality data is used in the fitting. A quantitative investigation into increasing the quality of the original photometric data available from any microlensing event demonstrates that 'lucky imaging' can lead to a marked improvement in the signal to noise ratio of images over standard imaging techniques, which could result in more accurate models and thus the calculation of more accurate planetary parameters. In addition, a simulation illustrating the effects of atmospheric turbulence on exposures was created, and expanded upon to give an approximation of the lucky imaging technique. This further demonstrated the advantages of lucky images which are shown to potentially approach the quality of those expected from diffraction limited photometry. The simulation may be further developed for potential future use as a 'theoretical lucky imager' in our research group, capable of producing and analysing synthetic exposures through customisable conditions.

  9. Stable isotopic analyses in paleoclimatic reconstruction

    SciTech Connect

    Wigand, P.E.

    1995-09-01

    Most traditional paleoclimatic proxy data have inherent time lags between climatic input and system response that constrain their use in accurate reconstruction of paleoclimate chronology, scaling of its variability, and the elucidation of the processes that determine its impact on the biotic and abiotic environment. With the exception of dendroclimatology, and studies of short-lived organisms and pollen recovered from annually varved lacustrine sediments, significant periods of time ranging from years, to centuries, to millennia may intervene between climate change and its first manifestation in paleoclimatic proxy data records. Reconstruction of past climate through changes in plant community composition derived from pollen sequences and plant remains from ancient woodrat middens, wet environments and dry caves all suffer from these lags. However, stable isotopic analyses can provide more immediate indication of biotic response to climate change. Evidence of past physiological response of organisms to changes in effective precipitation as climate varies can be provided by analyses of the stable isotopic content of plant macrofossils from various contexts. These analyses consider variation in the stable isotopic (hydrogen, oxygen and carbon) content of plant tissues as it reflects (1) past global or local temperature through changes in meteoric (rainfall) water chemistry in the case of the first two isotopes, and (2) plant stress through changes in plant respiration/transpiration processes under differing water availability, and varying atmospheric CO2 composition (which itself may actually be a net result of biotic response to climate change). Studies currently being conducted in the Intermountain West indicate both long- and short-term responses that, when calibrated with modern analogue studies, have the potential of revealing not only the timing of climate events, but their direction, magnitude and rapidity.

  10. Evaluation of Model Operational Analyses during DYNAMO

    NASA Astrophysics Data System (ADS)

    Ciesielski, Paul; Johnson, Richard

    2013-04-01

    A primary component of the observing system in the DYNAMO-CINDY2011-AMIE field campaign was an atmospheric sounding network comprised of two sounding quadrilaterals, one north and one south of the equator over the central Indian Ocean. During the experiment a major effort was undertaken to ensure the real-time transmission of these data onto the GTS (Global Telecommunication System) for dissemination to the operational centers (ECMWF, NCEP, JMA, etc.). Preliminary estimates indicate that ~95% of the soundings from the enhanced sounding network were successfully transmitted and potentially used in their data assimilation systems. Because of the wide use of operational and reanalysis products (e.g., in process studies, initializing numerical simulations, construction of large-scale forcing datasets for CRMs, etc.), their validity will be examined by comparing a variety of basic and diagnosed fields from two operational analyses (ECMWF and NCEP) to similar analyses based solely on sounding observations. Particular attention will be given to the vertical structures of apparent heating (Q1) and drying (Q2) from the operational analyses (OA), which are strongly influenced by cumulus parameterizations, a source of model infidelity. Preliminary results indicate that the OA products did a reasonable job at capturing the mean and temporal characteristics of convection during the DYNAMO enhanced observing period, which included the passage of two significant MJO events during the October-November 2011 period. For example, temporal correlations between Q2-budget derived rainfall from the OA products and that estimated from the TRMM satellite (i.e., the 3B42V7 product) were greater than 0.9 over the Northern Sounding Array of DYNAMO. However, closer inspection of the budget profiles shows notable differences between the OA products and the sounding-derived results in low-level (surface to 700 hPa) heating and drying structures. This presentation will examine these differences and

  11. Combustion Devices CFD Team Analyses Review

    NASA Technical Reports Server (NTRS)

    Rocker, Marvin

    2008-01-01

    A variety of CFD simulations performed by the Combustion Devices CFD Team at Marshall Space Flight Center will be presented. These analyses were performed to support Space Shuttle operations and Ares-1 Crew Launch Vehicle design. Results from the analyses will be shown along with pertinent information on the CFD codes and computational resources used to obtain the results. Six analyses will be presented - two related to the Space Shuttle and four related to the Ares I-1 launch vehicle now under development at NASA. First, a CFD analysis of the flow fields around the Space Shuttle during the first six seconds of flight and potential debris trajectories within those flow fields will be discussed. Second, the combusting flows within the Space Shuttle Main Engine's main combustion chamber will be shown. For the Ares I-1, an analysis of the performance of the roll control thrusters during flight will be described. Several studies are discussed related to the J-2X engine to be used on the upper stage of the Ares I-1 vehicle. A parametric study of the propellant flow sequences and mixture ratios within the GOX/GH2 spark igniters on the J-2X is discussed. Transient simulations will be described that predict the asymmetric pressure loads that occur on the rocket nozzle during the engine start as the nozzle fills with combusting gases. Simulations of issues that affect temperature uniformity within the gas generator used to drive the J-2X turbines will be described as well, both upstream of the chamber in the injector manifolds and within the combustion chamber itself.

  12. Raman Imaging

    NASA Astrophysics Data System (ADS)

    Stewart, Shona; Priore, Ryan J.; Nelson, Matthew P.; Treado, Patrick J.

    2012-07-01

    The past decade has seen an enormous increase in the number and breadth of imaging techniques developed for analysis in many industries, including pharmaceuticals, food, and especially biomedicine. Rather than accept single-dimensional forms of information, users now demand multidimensional assessment of samples. High specificity and the need for little or no sample preparation make Raman imaging a highly attractive analytical technique and provide motivation for continuing advances in its supporting technology and utilization. This review discusses the current tools employed in Raman imaging, the recent advances, and the major applications in this ever-growing analytical field.

  13. Analyses of Shuttle Orbiter approach and landing

    NASA Technical Reports Server (NTRS)

    Ashkenas, I. L.; Hoh, R. H.; Teper, G. L.

    1982-01-01

    A study of the Shuttle Orbiter approach and landing conditions is summarized. The causes of observed PIO-like flight deficiencies are listed, and possible corrective measures are examined. Closed-loop pilot/vehicle analyses are described, and a description is given of path-attitude stability boundaries. The latter novel approach is found to be of great value in delineating and illustrating the basic causes of this multiloop pilot control problem. It is shown that the analytical results are consistent with flight test and fixed-base simulation. Conclusions are drawn concerning possible improvements in the Shuttle Orbiter/Digital Flight Control System.

  14. Environmental monitoring final report: groundwater chemical analyses

    SciTech Connect

    Not Available

    1984-02-01

    This report presents the results of analyses of groundwater quality at the SRC-I Demonstration Plant site in Newman, Kentucky. Samples were obtained from a network of 23 groundwater observation wells installed during previous studies. The groundwater was well within US EPA Interim Primary Drinking Water Standards for trace metals, radioactivity, and pesticides, but exceeded the standard for coliform bacteria. Several US EPA Secondary Drinking Water Standards were exceeded, namely, manganese, color, iron, and total dissolved solids. Based on the results, Dames and Moore recommend that all wells should be sterilized and those wells built in 1980 should be redeveloped. 1 figure, 6 tables.

  15. ORNL analyses of AVR performance and safety

    SciTech Connect

    Cleveland, J.C.

    1985-01-01

    Because of the high interest in modular High Temperature Reactor performance and safety, a cooperative project has been established involving the Oak Ridge National Laboratory (ORNL), Arbeitsgemeinschaft Versuchs Reaktor GmbH (AVR), and Kernforschungsanlage Juelich GmbH (KFA) in reactor physics, performance and safety. This paper presents initial results of ORNL's examination of a hypothetical depressurized core heatup accident and consideration of how a depressurized core heatup test might be conducted by AVR staff. Also presented are initial analyses of a test involving a reduction in core flow and of a test involving reactivity insertion via control rod withdrawal.

  16. Further analyses of Rio Cuarto impact glass

    NASA Technical Reports Server (NTRS)

    Schultz, Peter H.; Bunch, T. E.; Koeberl, C.; Collins, W.

    1993-01-01

    Initial analyses of the geologic setting, petrology, and geochemistry of glasses recovered from within and around the elongate Rio Cuarto (RC) craters in Argentina focused on selected samples in order to document the general similarity with impactites around other terrestrial impact craters and to establish their origin. Continued analysis has surveyed the diversity in compositions for a range of samples, examined further evidence for temperature and pressure history, and compared the results with experimentally fused loess from oblique hypervelocity impacts. These new results not only firmly establish their impact origin but provide new insight on the impact process.

  17. Method of performing computational aeroelastic analyses

    NASA Technical Reports Server (NTRS)

    Silva, Walter A. (Inventor)

    2011-01-01

    Computational aeroelastic analyses typically use a mathematical model for the structural modes of a flexible structure and a nonlinear aerodynamic model that can generate a plurality of unsteady aerodynamic responses based on the structural modes for conditions defining an aerodynamic condition of the flexible structure. In the present invention, a linear state-space model is generated using a single execution of the nonlinear aerodynamic model for all of the structural modes where a family of orthogonal functions is used as the inputs. Then, static and dynamic aeroelastic solutions are generated using computational interaction between the mathematical model and the linear state-space model for a plurality of periodic points in time.
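The record above describes generating a linear state-space model from a single execution of the nonlinear aerodynamic model, using a family of orthogonal functions as inputs. The toy sketch below (all matrices and signals are invented for illustration, not taken from the patent) shows the key enabling property: mutually orthogonal input signals let several input channels be excited simultaneously in one run of a simulated linear system, so their responses can later be separated.

```python
import numpy as np

n_steps, n_modes = 256, 2

# Orthogonal square-wave (Walsh-like) inputs: channel m flips sign with
# bit m of the time index, so any two channels are exactly orthogonal
# over the full record
u = np.array([[1.0 if (t >> m) & 1 == 0 else -1.0 for t in range(n_steps)]
              for m in range(n_modes)])
assert abs(u[0] @ u[1]) < 1e-9          # inputs are orthogonal

# A stable discrete-time state-space system x' = Ax + Bu, y = Cx,
# standing in for the identified linear aerodynamic model
A = np.array([[0.9, 0.05], [-0.05, 0.9]])
B = np.eye(2)
C = np.eye(2)

x = np.zeros(2)
y = np.empty((n_steps, 2))
for t in range(n_steps):
    y[t] = C @ x                        # record the response
    x = A @ x + B @ u[:, t]             # advance the state one step
print(y.shape)
```

Because both channels are driven in the same run, a single execution produces the response data needed for all inputs, which is the efficiency gain the abstract emphasizes.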

  18. Analyses of containment structures with corrosion damage

    SciTech Connect

    Cherry, J.L.

    1996-12-31

    Corrosion damage to a nuclear power plant containment structure can degrade the pressure capacity of the vessel. For the low-carbon, low-strength steels used in containments, the effect of corrosion on material properties is discussed. Strain-to-failure tests, in uniaxial tension, have been performed on corroded material samples. Results were used to select strain-based failure criteria for corroded steel. Using the ABAQUS finite element analysis code, the capacity of a typical PWR Ice Condenser containment with corrosion damage has been studied. Multiple analyses were performed with the locations of the corrosion in the containment and the amount of corrosion varied in each analysis.

  19. Autisme et douleur – analyse bibliographique

    PubMed Central

    Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria

    2010-01-01

    This literature review aims to survey the work published in the field of pain and autism. The article first addresses published studies on the modes of pain expression observed in this population. Various hypotheses that might explain the expressive particularities of people with autism are then reviewed: an excess of endorphins, particularities in sensory processing, and socio-communicative deficits. The review closes with the question of assessing and accounting for pain in people with autism. The authors conclude that the results of published studies lack homogeneity and that further research is needed to reach consensus in a field of study still little explored scientifically. Clinically, deeper knowledge in this area should make it possible to develop pain-assessment tools and thereby ensure better day-to-day pain management. PMID:20808970

  20. Special analyses reveal coke-deposit structure

    SciTech Connect

    Albright, L.F.

    1988-08-01

    A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) from a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at a shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes, and also improve overall ethylene plant operations.

  1. Hierarchical regression for analyses of multiple outcomes.

    PubMed

    Richardson, David B; Hamra, Ghassan B; MacLehose, Richard F; Cole, Stephen R; Chu, Haitao

    2015-09-01

    In cohort mortality studies, there often is interest in associations between an exposure of primary interest and mortality due to a range of different causes. A standard approach to such analyses involves fitting a separate regression model for each type of outcome. However, the statistical precision of some estimated associations may be poor because of sparse data. In this paper, we describe a hierarchical regression model for estimation of parameters describing outcome-specific relative rate functions and associated credible intervals. The proposed model uses background stratification to provide flexible control for the outcome-specific associations of potential confounders, and it employs a hierarchical "shrinkage" approach to stabilize estimates of an exposure's associations with mortality due to different causes of death. The approach is illustrated in analyses of cancer mortality in 2 cohorts: a cohort of dioxin-exposed US chemical workers and a cohort of radiation-exposed Japanese atomic bomb survivors. Compared with standard regression estimates of associations, hierarchical regression yielded estimates with improved precision that tended to have less extreme values. The hierarchical regression approach also allowed the fitting of models with effect-measure modification. The proposed hierarchical approach can yield estimates of association that are more precise than conventional estimates when one wishes to estimate associations with multiple outcomes. PMID:26232395
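The "shrinkage" idea described above can be sketched numerically. The snippet below applies a simple empirical-Bayes shrinkage of several outcome-specific log relative rates toward their precision-weighted mean; all values, including the fixed between-outcome variance tau^2, are invented for illustration (the paper's hierarchical model is richer and estimates such quantities from the data).

```python
import numpy as np

# Hypothetical per-outcome estimates: log relative rates for an exposure
# versus five causes of death, with their standard errors (sparse
# outcomes have large SEs). Values are invented for illustration.
log_rr = np.array([0.10, 0.45, -0.20, 0.90, 0.05])
se = np.array([0.05, 0.30, 0.25, 0.60, 0.08])

# Precision-weighted overall mean of the outcome-specific estimates
w = 1.0 / se**2
mu = np.sum(w * log_rr) / np.sum(w)

# Assumed between-outcome variance; a full hierarchical model would
# estimate this from the data rather than fix it
tau2 = 0.04

# Each estimate is pulled toward mu in proportion to its imprecision
shrink = tau2 / (tau2 + se**2)
post = shrink * log_rr + (1 - shrink) * mu
print(np.round(post, 3))
```

Imprecisely estimated associations (large SE) move the most toward the common mean, while precise ones barely change, which is how shrinkage yields the "less extreme" and more stable estimates the abstract reports.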

  2. Analyses of containment structures with corrosion damage

    SciTech Connect

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  3. Used Fuel Management System Interface Analyses - 13578

    SciTech Connect

    Howard, Robert; Busch, Ingrid; Nutt, Mark; Morris, Edgar; Puig, Francesc; Carter, Joe; Delley, Alexcia; Rodwell, Phillip; Hardin, Ernest; Kalinina, Elena; Clark, Robert; Cotton, Thomas

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  4. ISFSI site boundary radiation dose rate analyses.

    PubMed

    Hagler, R J; Fero, A H

    2005-01-01

    Across the globe nuclear utilities are in the process of designing and analysing Independent Spent Fuel Storage Installations (ISFSI) for the purpose of above-ground spent-fuel storage, primarily to mitigate the filling of spent-fuel pools. Using a conjoining of discrete ordinates transport theory (DORT) and Monte Carlo (MCNP) techniques, an ISFSI was analysed to determine neutron and photon dose rates for a generic overpack and ISFSI pad configuration and design at distances ranging from 1 to ~1700 m from the ISFSI array. The calculated dose rates are used to address the requirements of 10CFR72.104, which provides limits to be enforced for the protection of the public by the NRC in regard to ISFSI facilities. For this overpack, dose rates decrease by three orders of magnitude through the first 200 m moving away from the ISFSI. In addition, the contributions from different source terms change over distance. It can be observed that although side photons provide the majority of the dose rate in this calculation, scattered photons and side neutrons take on more importance as the distance from the ISFSI is increased. PMID:16604670

  5. Waste Stream Analyses for Nuclear Fuel Cycles

    SciTech Connect

    N. R. Soelberg

    2010-08-01

    A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of data available to date for the fuel cycle options is insufficient to perform quantitative radioactive waste analyses using recommended metrics. This study has been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options is generally not yet available.

  6. Integrated Genomic Analyses of Ovarian Carcinoma

    PubMed Central

    2011-01-01

    Summary The Cancer Genome Atlas (TCGA) project has analyzed mRNA expression, miRNA expression, promoter methylation, and DNA copy number in 489 high-grade serous ovarian adenocarcinomas (HGS-OvCa) and the DNA sequences of exons from coding genes in 316 of these tumors. These results show that HGS-OvCa is characterized by TP53 mutations in almost all tumors (96%); low prevalence but statistically recurrent somatic mutations in 9 additional genes including NF1, BRCA1, BRCA2, RB1, and CDK12; 113 significant focal DNA copy number aberrations; and promoter methylation events involving 168 genes. Analyses delineated four ovarian cancer transcriptional subtypes, three miRNA subtypes, four promoter methylation subtypes, a transcriptional signature associated with survival duration and shed new light on the impact on survival of tumors with BRCA1/2 and CCNE1 aberrations. Pathway analyses suggested that homologous recombination is defective in about half of tumors, and that Notch and FOXM1 signaling are involved in serous ovarian cancer pathophysiology. PMID:21720365

  7. NEXT Ion Thruster Performance Dispersion Analyses

    NASA Technical Reports Server (NTRS)

    Soulas, George C.; Patterson, Michael J.

    2008-01-01

    The NEXT ion thruster is a low specific mass, high performance thruster with a nominal throttling range of 0.5 to 7 kW. Numerous engineering model and one prototype model thrusters have been manufactured and tested. Of significant importance to propulsion system performance is thruster-to-thruster performance dispersions. This type of information can provide a bandwidth of expected performance variations both on a thruster and a component level. Knowledge of these dispersions can be used to more conservatively predict thruster service life capability and thruster performance for mission planning, facilitate future thruster performance comparisons, and verify power processor capabilities are compatible with the thruster design. This study compiles the test results of five engineering model thrusters and one flight-like thruster to determine unit-to-unit dispersions in thruster performance. Component level performance dispersion analyses will include discharge chamber voltages, currents, and losses; accelerator currents, electron backstreaming limits, and perveance limits; and neutralizer keeper and coupling voltages and the spot-to-plume mode transition flow rates. Thruster level performance dispersion analyses will include thrust efficiency.

  8. Transportation systems analyses: Volume 1: Executive Summary

    NASA Astrophysics Data System (ADS)

    1993-05-01

    The principal objective of this study is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform man delivery and return, cargo transfer, cargo delivery, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed, but not limited to, include the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationships between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. This executive summary of the transportation systems analyses (TSM) semi-annual report addresses the SSF logistics resupply. Our analysis parallels the ongoing NASA SSF redesign effort. Therefore, there could be no SSF design to drive our logistics analysis. Consequently, the analysis attempted to bound the reasonable SSF design possibilities (and the subsequent transportation implications). No other strategy really exists until after a final decision is rendered on the SSF configuration.

  9. Bioinformatics tools for analysing viral genomic data.

    PubMed

    Orton, R J; Gu, Q; Hughes, J; Maabar, M; Modha, S; Vattipally, S B; Wilkie, G S; Davison, A J

    2016-04-01

    The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing. PMID:27217183
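As a minimal sketch of the first step listed above, quality control of raw reads, the function below trims low-quality bases from the 3' end of a read using FASTQ Phred+33 quality scores. The reads and threshold are illustrative, and real pipelines use dedicated QC tools rather than hand-rolled code like this.

```python
# Phred scores are encoded in FASTQ as ASCII characters offset by 33,
# so quality Q maps to the character chr(Q + 33).
def phred_scores(qual_string, offset=33):
    return [ord(c) - offset for c in qual_string]

def trim_3prime(seq, qual_string, min_q=20):
    """Trim bases below min_q from the 3' end of a read."""
    scores = phred_scores(qual_string)
    end = len(seq)
    while end > 0 and scores[end - 1] < min_q:
        end -= 1
    return seq[:end], qual_string[:end]

seq = "ACGTACGTGG"
qual = "IIIIIIII#!"   # 'I' = Q40 (high); '#' = Q2 and '!' = Q0 (low)
trimmed_seq, trimmed_qual = trim_3prime(seq, qual)
print(trimmed_seq)    # low-quality tail removed -> "ACGTACGT"
```

Steps further down the pipeline (assembly, variant calling, metagenomic classification) all assume reads have passed this kind of filtering first.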

  10. Analyses of broadband noise mechanisms of rotors

    NASA Astrophysics Data System (ADS)

    George, A. R.

    The various source mechanisms which generate broadband noise on a range of rotors are reviewed. Analyses of these mechanisms are presented and compared to existing experimental data. The sources considered are load fluctuations due to inflow turbulence, due to turbulent blade boundary layers passing the trailing edge, and due to tip vortex formation turbulence. Vortex shedding noise due to laminar boundary layers and blunt trailing edges is not considered in detail as it can be avoided in most cases. Present analyses are adequate to predict the spectra from a wide variety of experiments on fans, helicopter rotors, and wind turbines to within about 5 to 10 dB. Better knowledge of the inflow turbulence improves the accuracy of the predictions. Inflow turbulence noise depends strongly on ambient conditions and dominates at low frequencies. Trailing edge and tip vortex noise are important at higher frequencies if inflow turbulence is weak. Boundary layer trailing edge noise increases slowly with angle of attack but not as rapidly as tip vortex formation noise. Tip noise can be important at high angles of attack for wide chord, square edge tips.

  11. Analyses of broadband noise mechanisms of rotors

    NASA Technical Reports Server (NTRS)

    George, A. R.

    1986-01-01

    The various source mechanisms which generate broadband noise on a range of rotors are reviewed. Analyses of these mechanisms are presented and compared to existing experimental data. The sources considered are load fluctuations due to inflow turbulence, due to turbulent blade boundary layers passing the trailing edge, and due to tip vortex formation turbulence. Vortex shedding noise due to laminar boundary layers and blunt trailing edges is not considered in detail as it can be avoided in most cases. Present analyses are adequate to predict the spectra from a wide variety of experiments on fans, helicopter rotors, and wind turbines to within about 5 to 10 dB. Better knowledge of the inflow turbulence improves the accuracy of the predictions. Inflow turbulence noise depends strongly on ambient conditions and dominates at low frequencies. Trailing edge and tip vortex noise are important at higher frequencies if inflow turbulence is weak. Boundary layer trailing edge noise increases slowly with angle of attack but not as rapidly as tip vortex formation noise. Tip noise can be important at high angles of attack for wide chord, square edge tips.

  12. MCNP analyses of criticality calculation results

    SciTech Connect

    Forster, R.A.; Booth, T.E.

    1995-05-01

    Careful assessment of the results of a calculation by the code itself can reduce mistakes in the problem setup and execution. MCNP has over four hundred error messages that inform the user of FATAL or WARNING errors discovered during the processing of the input file alone. The latest version, MCNP4A, now performs a self-assessment of the calculated results to aid the user in determining the quality of the Monte Carlo results. MCNP4A, which was released to RSIC in October 1993, contains new analyses of the MCNP Monte Carlo calculation that provide simple user WARNINGs for both criticality and fixed source calculations. The goal of the new analyses is to provide the MCNP criticality practitioner with enough information in the output to assess the validity of the k_eff calculation and any associated tallies. The results of these checks are presented in the k_eff results summary page, several k_eff tables and graphs, and tally tables and graphs. Plots of k_eff at the workstation are also available as the problem is running or in a postprocessing mode to assess problem performance and results.
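MCNP's actual convergence diagnostics are internal to the code; as a rough illustration of the kind of k_eff self-assessment the abstract describes, the sketch below (hypothetical cycle values, deliberately simplified drift test) combines active-cycle estimates and flags a first-half/second-half drift, which in practice would suggest an unconverged fission source.

```python
# Simplified illustration (NOT MCNP's actual checks): given k_eff
# estimates from active Monte Carlo cycles, report the combined mean,
# its standard error, and a crude drift check comparing the first and
# second halves of the cycles.
import statistics

def keff_summary(cycle_keffs):
    n = len(cycle_keffs)
    mean = statistics.fmean(cycle_keffs)
    stderr = statistics.stdev(cycle_keffs) / n ** 0.5
    first = statistics.fmean(cycle_keffs[: n // 2])
    second = statistics.fmean(cycle_keffs[n // 2 :])
    drift_warning = abs(first - second) > 3 * stderr  # crude trend test
    return mean, stderr, drift_warning

cycles = [0.995, 1.002, 0.998, 1.001, 0.997, 1.003, 0.999, 1.000]
mean, se, warn = keff_summary(cycles)
print(f"k_eff = {mean:.4f} +/- {se:.4f}, drift warning: {warn}")
```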

  13. Medical Imaging.

    ERIC Educational Resources Information Center

    Jaffe, C. Carl

    1982-01-01

    Describes principal imaging techniques, their applications, and their limitations in terms of diagnostic capability and possible adverse biological effects. Techniques include film radiography, computed tomography, nuclear medicine, positron emission tomography (PET), ultrasonography, nuclear magnetic resonance, and digital radiography. PET has…

  14. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images. In this photograph, a patient undergoes an open MRI.

  15. Body Imaging

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images.

  16. EPR imaging

    NASA Astrophysics Data System (ADS)

    Eaton, Sandra S.; Eaton, Gareth R.

    Useful EPR imaging has been achieved using simple gradient coils on a standard spectrometer. Resolution of less than 1 mm is possible without deconvolution of the resulting spectra. Examples are presented using DPPH and nitroxyl radicals.

  17. Imaging Immunosenescence

    PubMed Central

    Qian, Feng; Montgomery, Ruth R.

    2016-01-01

    To demonstrate effects of aging visually requires a robust technique that can reproducibly detect small differences in efficiency or kinetics between groups. Investigators of aging will greatly appreciate the benefits of Amnis ImageStream technology (www.amnis.com/), which combines quantitative flow cytometry with simultaneous high-resolution digital imaging. ImageStream is quantitative, reproducible, feasible with limited samples, and it facilitates in-depth examination of cellular mechanisms between cohorts of samples. PMID:26420711

  18. Imaging System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The 1100C Virtual Window is based on technology developed under NASA Small Business Innovation Research (SBIR) contracts to Ames Research Center. For example, under one contract Dimension Technologies, Inc. developed a large autostereoscopic display for scientific visualization applications. The Virtual Window employs an innovative illumination system to deliver the depth and color of true 3D imaging. Its applications include surgery and Magnetic Resonance Imaging scans, viewing for teleoperated robots, training, and aviation cockpit displays.

  19. X-ray CT analyses, models and numerical simulations: a comparison with petrophysical analyses in an experimental CO2 study

    NASA Astrophysics Data System (ADS)

    Henkel, Steven; Pudlo, Dieter; Enzmann, Frieder; Reitenbach, Viktor; Albrecht, Daniel; Ganzer, Leonhard; Gaupp, Reinhard

    2016-06-01

    An essential part of the collaborative research project H2STORE (hydrogen to store), which is funded by the German government, was a comparison of various analytical methods for characterizing reservoir sandstones from different stratigraphic units. In this context Permian, Triassic and Tertiary reservoir sandstones were analysed. Rock core materials, provided by RWE Gasspeicher GmbH (Dortmund, Germany), GDF Suez E&P Deutschland GmbH (Lingen, Germany), E.ON Gas Storage GmbH (Essen, Germany) and RAG Rohöl-Aufsuchungs Aktiengesellschaft (Vienna, Austria), were processed by different laboratory techniques; thin sections were prepared, rock fragments were crushed, and cubes of 1 cm edge length and plugs 3 to 5 cm in length with a diameter of about 2.5 cm were sawn from macroscopically homogeneous cores. With this prepared sample material, polarized light microscopy and scanning electron microscopy coupled with image analyses, specific surface area measurements (after Brunauer, Emmett and Teller, 1938; BET), He-porosity and N2-permeability measurements, and high-resolution microcomputer tomography (μ-CT), which was used for numerical simulations, were applied. All these methods were practised on most of the same sample material, before and, on selected Permian sandstones, also after static CO2 experiments under reservoir conditions. A major concern in comparing the results of these methods is an appraisal of the reliability of the given porosity, permeability and mineral-specific reactive (inner) surface area data. The CO2 experiments modified the petrophysical as well as the mineralogical/geochemical rock properties. These changes are detectable by all applied analytical methods. Nevertheless, a major outcome of the high-resolution μ-CT analyses and subsequent numerical simulations was that data sets and interpretations quite similar to those from the standard petrophysical methods were obtained. Moreover, the μ-CT analyses are not only time saving, but also
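As a side note on how porosity falls out of segmented μ-CT data, the sketch below (a hypothetical toy volume, not the H2STORE measurements) computes the pore-voxel fraction of a binarised volume; this is the quantity compared against He-porosity in studies of this kind.

```python
# Sketch of porosity from a segmented micro-CT volume: after
# thresholding, each voxel is pore (1) or grain (0), and porosity is
# simply the pore-voxel fraction. The 2x2x2 volume here is invented.

def porosity(volume):
    """Pore-voxel fraction of a nested-list binary volume (1 = pore)."""
    voxels = [v for plane in volume for row in plane for v in row]
    return sum(voxels) / len(voxels)

volume = [
    [[0, 1], [0, 0]],
    [[1, 0], [0, 0]],
]
print(f"porosity = {porosity(volume):.3f}")  # 2 pore voxels of 8
```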

  20. Diagnostic imaging.

    PubMed

    Morris, Peter; Perkins, Alan

    2012-04-21

    Physical techniques have always had a key role in medicine, and the second half of the 20th century in particular saw a revolution in medical diagnostic techniques with the development of key imaging instruments: x-ray imaging and emission tomography (nuclear imaging and PET), MRI, and ultrasound. These techniques use the full width of the electromagnetic spectrum, from gamma rays to radio waves, and sound. In most cases, the development of a medical imaging device was opportunistic; many scientists in physics laboratories were experimenting with simple x-ray images within the first year of the discovery of such rays, the development of the cyclotron and later nuclear reactors created the opportunity for nuclear medicine, and one of the co-inventors of MRI was initially attempting to develop an alternative to x-ray diffraction for the analysis of crystal structures. What all these techniques have in common is the brilliant insight of a few pioneering physical scientists and engineers who had the tenacity to develop their inventions, followed by a series of technical innovations that enabled the full diagnostic potential of these instruments to be realised. In this report, we focus on the key part played by these scientists and engineers and the new imaging instruments and diagnostic procedures that they developed. By bringing the key developments and applications together we hope to show the true legacy of physics and engineering in diagnostic medicine. PMID:22516558

  1. Image Mission Attitude Support Experiences

    NASA Technical Reports Server (NTRS)

    Ottenstein, N.; Challa, M.; Home, A.; Harman, R.; Burley, R.

    2001-01-01

    The spin-stabilized Imager for Magnetopause to Aurora Global Exploration (IMAGE) is the National Aeronautics and Space Administration's (NASA's) first Medium-class Explorer Mission (MIDEX). IMAGE was launched into a highly elliptical polar orbit on March 25, 2000 from Vandenberg Air Force Base, California, aboard a Boeing Delta II 7326 launch vehicle. This paper presents some of the observations of the flight dynamics analyses during the launch and in-orbit checkout period through May 18, 2000. Three new algorithms - one algebraic and two differential correction - for computing the parameters of the coning motion of a spacecraft are described and evaluated using in-flight data from the autonomous star tracker (AST) on IMAGE. Other attitude aspects highlighted include support for active damping consequent upon the failure of the passive nutation damper, performance evaluation of the AST, evaluation of the Sun sensor and magnetometer using AST data, and magnetometer calibration.

  2. Time series analyses of global change data.

    PubMed

    Lane, L J; Nichols, M H; Osborn, H B

    1994-01-01

    The hypothesis that statistical analyses of historical time series data can be used to separate the influences of natural variations from anthropogenic sources on global climate change is tested. Point, regional, national, and global temperature data are analyzed. Trend analyses for the period 1901-1987 suggest mean annual temperatures increased (in degrees C per century) globally at the rate of about 0.5, in the USA at about 0.3, in the south-western USA desert region at about 1.2, and at the Walnut Gulch Experimental Watershed in south-eastern Arizona at about 0.8. However, the rates of temperature change are not constant but vary within the 87-year period. Serial correlation and spectral density analysis of the temperature time series showed weak periodicities at various frequencies. The only common periodicity among the temperature series is an apparent cycle of about 43 years. The temperature time series were correlated with the Wolf sunspot index, atmospheric CO(2) concentrations interpolated from the Siple ice core data, and atmospheric CO(2) concentration data from Mauna Loa measurements. Correlation analysis of temperature data with concurrent data on atmospheric CO(2) concentrations and the Wolf sunspot index support previously reported significant correlation over the 1901-1987 period. Correlation analysis between temperature, atmospheric CO(2) concentration, and the Wolf sunspot index for the shorter period, 1958-1987, when continuous Mauna Loa CO(2) data are available, suggest significant correlation between global warming and atmospheric CO(2) concentrations but no significant correlation between global warming and the Wolf sunspot index. This may be because the Wolf sunspot index apparently increased from 1901 until about 1960 and then decreased thereafter, while global warming apparently continued to increase through 1987. 
Correlation of sunspot activity with global warming may be spurious, but additional analyses are required to test this hypothesis.

  3. Pathway Analyses Implicate Glial Cells in Schizophrenia

    PubMed Central

    Duncan, Laramie E.; Holmans, Peter A.; Lee, Phil H.; O'Dushlaine, Colm T.; Kirby, Andrew W.; Smoller, Jordan W.; Öngür, Dost; Cohen, Bruce M.

    2014-01-01

    Background: The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. Method: Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). Results: The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with strongest association to the Glia-Astrocyte pathway (p = 0.002). Conclusions: Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not
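MAGENTA, INRICH, ALIGATOR, and Set Screen are specialised tools with their own statistics; the generic idea of permutation-based gene-set testing that underlies such analyses can be sketched as follows. The gene names, scores, and pathway here are invented and do not reflect the PGC data or any tool's actual algorithm.

```python
# Generic permutation-based gene-set enrichment sketch: compare the mean
# association score of a candidate pathway against the means of randomly
# drawn gene sets of the same size. All scores below are invented.
import random
import statistics

def enrichment_p(gene_scores, pathway, n_perm=10000, seed=0):
    """Permutation p-value for a pathway's mean score being high by chance."""
    rng = random.Random(seed)
    observed = statistics.fmean(gene_scores[g] for g in pathway)
    genes = list(gene_scores)
    hits = sum(
        statistics.fmean(gene_scores[g] for g in rng.sample(genes, len(pathway)))
        >= observed
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)   # add-one correction

gene_scores = {f"gene{i}": 0.1 for i in range(50)}          # background genes
gene_scores.update({"OLIG1": 2.0, "OLIG2": 1.8, "MBP": 2.2})  # strongly associated set
p = enrichment_p(gene_scores, ["OLIG1", "OLIG2", "MBP"])
print(f"p = {p:.4f}")  # small p: pathway mean rarely matched by random sets
```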

  4. Precise Chemical Analyses of Planetary Surfaces

    NASA Technical Reports Server (NTRS)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David; Lisse, Carey

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  5. Project analysis and integration economic analyses summary

    NASA Technical Reports Server (NTRS)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet, growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  6. Neutronic Analyses of the Trade Demonstration Facility

    SciTech Connect

    Rubbia, C.

    2004-09-15

    The TRiga Accelerator-Driven Experiment (TRADE), to be performed in the TRIGA reactor of the ENEA-Casaccia Centre in Italy, consists of the coupling of an external proton accelerator to a target to be installed in the central channel of the reactor scrammed to subcriticality. This pilot experiment, aimed at a global demonstration of the accelerator-driven system concept, is based on an original idea of C. Rubbia. The present paper reports the results of some neutronic analyses focused on the feasibility of TRADE. Results show that all relevant experiments (at different power levels in a wide range of subcriticalities) can be carried out with relatively limited modifications to the present TRIGA reactor.

  7. Attitude stability analyses for small artificial satellites

    NASA Astrophysics Data System (ADS)

    Silva, W. R.; Zanardi, M. C.; Formiga, J. K. S.; Cabette, R. E. S.; Stuchi, T. J.

    2013-10-01

    The objective of this paper is to analyze the stability of the rotational motion of a symmetrical spacecraft in a circular orbit. The equilibrium points and regions of stability are established when components of the gravity gradient torque acting on the spacecraft are included in the equations of rotational motion, which are described by Andoyer's variables. The nonlinear stability of the equilibrium points is analysed here using the Kovalev-Savchenko theorem, which makes it possible to verify whether they remain stable under the influence of the higher-order terms of the normal Hamiltonian. In this paper, numerical simulations are made for a small hypothetical artificial satellite. Several stable equilibrium points were determined, and regions around these points have been established by variations in the orbital inclination and in the spacecraft's principal moment of inertia. The present analysis can contribute directly to the maintenance of the spacecraft's attitude.

  8. TRACE ELEMENT ANALYSES OF URANIUM MATERIALS

    SciTech Connect

    Beals, D; Charles Shick, C

    2008-06-09

    The Savannah River National Laboratory (SRNL) has developed an analytical method to measure many trace elements in a variety of uranium materials at the high part-per-billion (ppb) to low part-per-million (ppm) levels using matrix removal and analysis by quadrupole ICP-MS. Over 35 elements were measured in uranium oxides, acetate, ore and metal. Replicate analyses of samples provided precise results; however, none of the materials was certified for trace element content, so no measure of accuracy could be made. The DOE New Brunswick Laboratory (NBL) does provide a Certified Reference Material (CRM) that has provisional values for a series of trace elements. The NBL CRM was purchased and analyzed to determine the accuracy of the method for the analysis of trace elements in uranium oxide. These results are presented and discussed in the following paper.

  9. Anthocyanin analyses of Vaccinium fruit dietary supplements.

    PubMed

    Lee, Jungmin

    2016-09-01

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on high-performance liquid chromatography [HPLC] separation) indicated whether products' fruit origin listings were authentic. Over 30% of the Vaccinium fruit (cranberry, lingonberry, bilberry, and blueberry; 14 of 45) products available as dietary supplements did not contain the fruit listed as ingredients. Six supplements contained no anthocyanins. Five others had contents differing from labeled fruit (e.g., bilberry capsules containing Andean blueberry fruit). Of the samples that did contain the specified fruit (n = 27), anthocyanin content ranged from 0.04 to 14.37 mg per capsule, tablet, or teaspoon (5 g). Approaches to utilizing anthocyanins in assessment of sample authenticity and a discussion of the challenges with anthocyanin profiles in quality control are both presented. PMID:27625778
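Anthocyanin fingerprinting amounts to matching a sample's HPLC peak profile against reference profiles. A minimal sketch of that matching step follows; the peak areas are invented and cosine similarity is used only as an illustrative metric, not the comparison method of the study.

```python
# Illustrative profile matching: represent each fruit's HPLC profile as
# a vector of relative peak areas and match a supplement against
# reference profiles by cosine similarity. All numbers are invented.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length peak-area vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

references = {                      # relative areas of 4 hypothetical peaks
    "bilberry":  [0.40, 0.30, 0.20, 0.10],
    "cranberry": [0.05, 0.10, 0.25, 0.60],
}
sample = [0.38, 0.32, 0.19, 0.11]   # profile measured from a supplement

best = max(references, key=lambda fruit: cosine(sample, references[fruit]))
print(best)  # reference profile closest to the sample
```

A mismatch between the best-matching reference and the labeled ingredient is the kind of signal the study used to flag inauthentic products.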

  10. Study of spin-scan imaging for outer planets missions. [imaging techniques for Jupiter orbiter missions

    NASA Technical Reports Server (NTRS)

    Russell, E. E.; Chandos, R. A.; Kodak, J. C.; Pellicori, S. F.; Tomasko, M. G.

    1974-01-01

    The constraints that are imposed on the Outer Planet Missions (OPM) imager design are of critical importance. Imager system modeling analyses define important parameters and systematic means for trade-offs applied to specific Jupiter orbiter missions. Possible image sequence plans for Jupiter missions are discussed in detail. Considered is a series of orbits that allow repeated near encounters with three of the Jovian satellites. The data handling involved in the image processing is discussed, and it is shown that only minimal processing is required for the majority of images for a Jupiter orbiter mission.

  11. Stellar Imager

    NASA Technical Reports Server (NTRS)

    Carpenter, Kenneth

    2007-01-01

    The Stellar Imager (SI) is one of NASA's "Vision Missions": concepts for future, space-based, strategic missions that could enormously increase our capabilities for observing the Cosmos. SI is designed as a UV/Optical interferometer that will enable 0.1 milli-arcsecond (mas) spectral imaging of stellar surfaces and, via asteroseismology, stellar interiors, and of the Universe in general. The ultra-sharp images of the Stellar Imager will revolutionize our view of many dynamic astrophysical processes by transforming point sources into extended sources, and snapshots into evolving views. SI, with a characteristic angular resolution of 0.1 milli-arcseconds at 2000 Angstroms, represents an advance in image detail of several hundred times over that provided by the Hubble Space Telescope. The Stellar Imager will zoom in on what today, with few exceptions, we only know as point sources, revealing processes never before seen, thus providing a tool as fundamental to astrophysics as the microscope is to the study of life on Earth. SI's science focuses on the role of magnetism in the Universe, particularly on magnetic activity on the surfaces of stars like the Sun. Its prime goal is to enable long-term forecasting of solar activity and the space weather that it drives, in support of the Living With a Star program in the Exploration Era. SI will also revolutionize our understanding of the formation of planetary systems, of the habitability and climatology of distant planets, and of many magneto-hydrodynamically controlled processes in the Universe. Stellar Imager is included as a "Flagship and Landmark Discovery Mission" in the 2005 Sun Solar System Connection (SSSC) Roadmap and as a candidate for a "Pathways to Life Observatory" in the Exploration of the Universe Division (EUD) Roadmap (May 2005), and as such is a candidate mission for the 2025-2030 timeframe. An artist's drawing of the current "baseline" concept for SI is presented.

  12. Topological Analyses of Symmetric Eruptive Prominences

    NASA Astrophysics Data System (ADS)

    Panasenco, O.; Martin, S. F.

    Erupting prominences (filaments) that we have analyzed from Hα Doppler data at Helio Research and from SOHO/EIT 304 Å show strong coherency between their chirality, the direction of the vertical and lateral motions of the top of the prominences, and the directions of twisting of their legs. These coherent properties in erupting prominences occur in two patterns of opposite helicity; they constitute a form of dynamic chirality called the "roll effect." Viewed from the positive network side as they erupt, many symmetrically-erupting dextral prominences develop rolling motion toward the observer along with right-hand helicity in the left leg and left-hand helicity in the right leg. Many symmetrically-erupting sinistral prominences, also viewed from the positive network field side, have the opposite pattern: rolling motion at the top away from the observer, left-hand helical twist in the left leg, and right-hand twist in the right leg. We have analysed the motions seen in the famous movie of the "Grand Daddy" erupting prominence and found that it has all the motions that define the roll effect. From our analyses of this and other symmetric erupting prominences, we show that the roll effect is an alternative to the popular hypothetical configuration of an eruptive prominence as a twisted flux rope or flux tube. Instead we find that a simple flat ribbon can be bent such that it reproduces nearly all of the observed forms. The flat ribbon is the most logical beginning topology because observed prominence spines already have this topology prior to eruption, and an initial long magnetic ribbon with parallel, non-twisted threads, as a basic form, can be bent into many more and different geometrical forms than a flux rope.

  13. ANALYSES OF WOUND EXUDATES FOR CLOSTRIDIAL TOXINS.

    PubMed

    NOYES, H E; PRITCHARD, W L; BRINKLEY, F B; MENDELSON, J A

    1964-03-01

    Noyes, Howard E. (Walter Reed Army Institute of Research, Washington, D.C.), William L. Pritchard, Floyd B. Brinkley, and Janice A. Mendelson. Analyses of wound exudates for clostridial toxins. J. Bacteriol. 87:623-629. 1964.-Earlier studies indicated that death of goats with traumatic wounds of the hindquarter could be related to the number of clostridia in the wounds, and that toxicity of wound exudates for mice and guinea pigs could be partially neutralized by commercial trivalent gas gangrene antitoxin. This report describes in vitro and in vivo analyses of wound exudates for known clostridial toxins. Wounds were produced by detonation of high-explosive pellets. Wound exudates were obtained by cold saline extraction of both necrotic tissues and gauze sponges used to cover the wounds. Exudates were sterilized by Seitz filtration in the cold. In vitro tests were used to measure alpha-, theta-, and mu-toxins of Clostridium perfringens and the epsilon-toxin of C. novyi. Mouse protection tests, employing commercial typing antisera, were used to analyze exudates for other clostridial toxins. Lethality of wound exudates for mice could be related to (i) the numbers of clostridia present in the wound, (ii) survival time of the goats, and (iii) positive lecithovitellin (LV) tests of the exudates. However, the LV tests could not be neutralized by antitoxin specific for C. perfringens alpha-toxin. Mice were not protected by typing antisera specific for types A, C, or D C. perfringens or C. septicum but were protected by antisera specific for type B C. perfringens and types A and B C. novyi. PMID:14127581

  14. Reporting guidelines for population pharmacokinetic analyses.

    PubMed

    Dykstra, Kevin; Mehrotra, Nitin; Tornøe, Christoffer Wenzel; Kastrissios, Helen; Patel, Bela; Al-Huniti, Nidal; Jadhav, Pravin; Wang, Yaning; Byon, Wonkyung

    2015-08-01

    The purpose of this work was to develop a consolidated set of guiding principles for the reporting of population pharmacokinetic (PK) analyses based on input from a survey of practitioners as well as discussions between industry, consulting, and regulatory scientists. The survey found that identification of population covariate effects on drug exposure and support for dose selection (in which population PK frequently serves as preparatory analysis for exposure-response modeling) are the main areas of influence for population PK analysis. The proposed guidelines consider 2 main purposes of population PK reports: (1) to present key analysis findings and their impact on drug development decisions, and (2) as documentation of the analysis methods for the dual purpose of enabling review of the analysis and facilitating future use of the models. This work also identified 2 main audiences for the reports: (1) a technically competent group responsible for in-depth review of the data, methodology, and results; and (2) a scientifically literate but not technically adept group, whose main interest is in the implications of the analysis for the broader drug development program. We recommend a generalized question-based approach with 6 questions that need to be addressed throughout the report. We recommend 8 sections (Synopsis, Introduction, Data, Methods, Results, Discussion, Conclusions, Appendix) with suggestions for the target audience and level of detail for each section. A section providing general expectations regarding population PK reporting from a regulatory perspective is also included. We consider this an important step toward industrialization of the field of pharmacometrics such that a nontechnical audience also understands the role of pharmacometric analyses in decision making. Population PK reports were chosen as representative reports to derive these recommendations; however, the guiding principles presented here are applicable for all pharmacometric reports

  15. Reporting guidelines for population pharmacokinetic analyses.

    PubMed

    Dykstra, Kevin; Mehrotra, Nitin; Tornøe, Christoffer Wenzel; Kastrissios, Helen; Patel, Bela; Al-Huniti, Nidal; Jadhav, Pravin; Wang, Yaning; Byon, Wonkyung

    2015-06-01

    The purpose of this work was to develop a consolidated set of guiding principles for reporting of population pharmacokinetic (PK) analyses based on input from a survey of practitioners as well as discussions between industry, consulting and regulatory scientists. The survey found that identification of population covariate effects on drug exposure and support for dose selection (where population PK frequently serves as preparatory analysis to exposure-response modeling) are the main areas of influence for population PK analysis. The proposed guidelines consider two main purposes of population PK reports: (1) to present key analysis findings and their impact on drug development decisions, and (2) as documentation of the analysis methods for the dual purpose of enabling review of the analysis and facilitating future use of the models. This work also identified two main audiences for the reports: (1) a technically competent group responsible for in-depth review of the data, methodology, and results, and (2) a scientifically literate, but not technically adept group, whose main interest is in the implications of the analysis for the broader drug development program. We recommend a generalized question-based approach with six questions that need to be addressed throughout the report. We recommend eight sections (Synopsis, Introduction, Data, Methods, Results, Discussion, Conclusions, Appendix) with suggestions for the target audience and level of detail for each section. A section providing general expectations regarding population PK reporting from a regulatory perspective is also included. We consider this an important step towards industrialization of the field of pharmacometrics such that a non-technical audience also understands the role of pharmacometric analyses in decision making.
Population PK reports were chosen as representative reports to derive these recommendations; however, the guiding principles presented here are applicable for all pharmacometric reports.

  16. Interphase cytogenetic and AgNOR analyses of hydatidiform moles.

    PubMed Central

    Watanabe, M; Ghazizadeh, M; Konishi, H; Araki, T

    1998-01-01

    AIM: To determine the potential value of interphase cytogenetic and argyrophilic nucleolar organiser region (AgNOR) analyses in the diagnosis and classification of hydatidiform moles. METHODS: Serial tissue sections from 37 hydatidiform moles, histologically classified as 11 complete and 15 partial, and from 11 hydropic abortuses were examined by in situ hybridisation using digoxigenin labelled probes specific for chromosomes 1, X, and Y, and a one step silver staining method. The percentages of diploid and triploid nuclei, and the mean number of AgNORs for each tissue were determined. RESULTS: Interphase cytogenetics showed that eight of the 11 cases (73%) each of complete mole and hydropic abortus had diploid pattern and the three remaining cases (27%) of each group were triploid. Two of the triploid complete moles and one of the triploid hydropic abortuses were revised to partial moles and one remaining triploid complete mole was revised to hydropic abortus. Of the 15 partial moles, nine (60%) were triploid, and six (40%) were diploid. These diploid cases were revised to three complete moles and three hydropic abortuses. There was a significant difference (p < 0.0001) between the mean (SD) AgNOR count in partial mole (5.11 (0.91)) versus hydropic abortus (3.79 (0.90)) and complete mole (3.39 (0.97)). The total of 15 triploid cases showed a high mean AgNOR count of 5.24 (0.73). Also, after reclassification, eight of the nine partial moles (89%) had a mean AgNOR count of > or = 5. The results of analyses by the two methods were closely correlated. CONCLUSIONS: Interphase cytogenetic analysis using chromosome specific probes and AgNOR count provides a valuable approach for ploidy analysis in histological sections of hydatidiform moles and helps to resolve difficult cases. PMID:9771442

  17. Human temporal lobe epilepsy analyses by tissue proteomics.

    PubMed

    Mériaux, Céline; Franck, Julien; Park, Dan Bi; Quanico, Jusal; Kim, Young Hye; Chung, Chun Kee; Park, Young Mok; Steinbusch, Harry; Salzet, Michel; Fournier, Isabelle

    2014-06-01

    Although there are many types of epilepsy, temporal lobe epilepsy (TLE) is probably the most common and most often studied in humans. TLE represents 40% of all epilepsy cases and is difficult to treat. Despite a wealth of descriptive data obtained from patients' disease histories, EEG recordings, imaging techniques, and histological studies, the epileptogenic process remains poorly understood. It is unlikely, however, that a single factor or a single mechanism can account for the many changes associated with this neuropathological phenomenon. MALDI mass spectrometry imaging (MSI) coupled to protein identification, because of its ability to study a wide range of molecules, appears well suited to establishing molecular profiles in TLE. Seven neuropeptides have been identified in the dentate gyrus region of the hippocampus in relation to TLE pathology. Shotgun studies taking gender influence into account have also been performed. Tissue microextracts from 10 control and 10 TLE patients were analyzed after trypsin digestion, followed by separation on nanoLC coupled to an LTQ Orbitrap. The shotgun analyses confirmed the presence of specific neuropeptide precursors and receptors in TLE patients, as well as proteins involved in axon regeneration, including neurotrophins, ECM proteins, cell surface proteins, membrane proteins, G-proteins, cytoskeletal proteins and tumor suppressors. Among the tumor suppressors identified was the leucine-rich glioma inactivated 1 (LGI1) protein; the LGI1 gene has recently been shown to be implicated in the heritability of TLE. We also demonstrated a complete profile of tumor suppressors in TLE patients, of which 7 were identified. Refining this analysis to take gender into account, in both controls and TLE patients, revealed proteins specific to males and females, suggesting that the mechanisms of pathology development could differ completely between the sexes.
PMID:24449190

  18. Hierarchical Segmentation Enhances Diagnostic Imaging

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Bartron Medical Imaging LLC (BMI), of New Haven, Connecticut, gained a nonexclusive license from Goddard Space Flight Center to use the RHSEG software in medical imaging. To manage image data, BMI then licensed two pattern-matching software programs from NASA's Jet Propulsion Laboratory that were used in image analysis and three data-mining and edge-detection programs from Kennedy Space Center. More recently, BMI made NASA history by being the first company to partner with the Space Agency through a Cooperative Research and Development Agreement to develop a 3-D version of RHSEG. With U.S. Food and Drug Administration clearance, BMI will sell its Med-Seg imaging system with the 2-D version of the RHSEG software to analyze medical imagery from CAT and PET scans, MRI, ultrasound, digitized X-rays, digitized mammographies, dental X-rays, soft tissue analyses, moving object analyses, and soft-tissue slides such as Pap smears for the diagnoses and management of diseases. Extending the software's capabilities to three dimensions will eventually enable production of pixel-level views of a tumor or lesion, early identification of plaque build-up in arteries, and identification of density levels of microcalcification in mammographies.

  19. Analyses of moisture in polymers and composites

    NASA Technical Reports Server (NTRS)

    Ryan, L. E.; Vaughan, R. W.

    1980-01-01

    A suitable method for the direct measurement of moisture concentrations after humidity/thermal exposure on state of the art epoxy and polyimide resins and their graphite and glass fiber reinforcements was investigated. Methods for the determination of moisture concentration profiles, moisture diffusion modeling and moisture induced chemical changes were examined. Carefully fabricated, precharacterized epoxy and polyimide neat resins and their AS graphite and S glass reinforced composites were exposed to humid conditions using heavy water (D20), at ambient and elevated temperatures. These specimens were fixtured to theoretically limit the D20 permeation to a unidirectional penetration axis. The analytical techniques evaluated were: (1) laser pyrolysis gas chromatography mass spectrometry; (2) solids probe mass spectrometry; (3) laser pyrolysis conventional infrared spectroscopy; and (4) infrared imaging thermovision. The most reproducible and sensitive technique was solids probe mass spectrometry. The fabricated exposed specimens were analyzed for D20 profiling after humidity/thermal conditioning at three exposure time durations.
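
    The moisture diffusion modeling examined above is commonly based on the series solution for uptake in a plane sheet; the sketch below is illustrative only (the function name and numeric values are assumptions, not taken from the report):

```python
import math

def fickian_uptake(t, D, h, n_terms=200):
    """Fractional moisture uptake M(t)/M_inf for a plane sheet of thickness h
    (both faces exposed) under 1-D Fickian diffusion with diffusivity D."""
    s = sum(
        math.exp(-((2 * n + 1) ** 2) * math.pi ** 2 * D * t / h ** 2) / (2 * n + 1) ** 2
        for n in range(n_terms)
    )
    return 1.0 - (8.0 / math.pi ** 2) * s

# Illustrative values: D in mm^2/day, h in mm, t in days.
uptake_curve = [fickian_uptake(t, D=5e-4, h=2.0) for t in (0, 10, 50, 200, 1000)]
```

For unidirectional penetration, as in the fixtured specimens described, the same solution applies with the single exposed face folded into an effective thickness.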

  20. Operational Satellite-based Surface Oil Analyses (Invited)

    NASA Astrophysics Data System (ADS)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-Skymed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), Aster (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, location of natural oil seeps and a variety of in situ oil observations. The analyses were available as jpegs, pdfs, shapefiles and KML files, through Google, and on a variety of websites including Geoplatform and ERMA. From the very first analysis, issued just 5 hours after the rig sank, through the final analysis issued in August, the complete archive remains publicly available on the NOAA/NESDIS website (http://www.ssd.noaa.gov/PS/MPS/deepwater.html). SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager’s primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response, including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  1. Database-Driven Analyses of Astronomical Spectra

    NASA Astrophysics Data System (ADS)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band. Because these transition frequencies depend on the nuclear masses, they differ between isotopes: molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic
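
    The ro-vibrational band structure described here can be illustrated with the P- and R-branch line positions of a diatomic molecule in the rigid-rotor/harmonic-oscillator approximation; the constants below are rough illustrative values for the CO fundamental, included only as an example:

```python
def rovib_lines(nu0, B, j_max=10):
    """P- and R-branch line positions (cm^-1) for a diatomic molecule in the
    rigid-rotor/harmonic-oscillator approximation (no distortion terms)."""
    r_branch = [nu0 + 2 * B * (j + 1) for j in range(j_max)]   # J -> J+1
    p_branch = [nu0 - 2 * B * j for j in range(1, j_max + 1)]  # J -> J-1
    return p_branch, r_branch

# Approximate constants for the CO fundamental near 4.7 micron.
p, r = rovib_lines(nu0=2143.3, B=1.93)
```

Because both nu0 and B scale with the reduced mass, an isotopic substitution shifts the whole band, which is what makes isotopologues distinguishable in astronomical spectra.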

  2. An extended diffraction-enhanced imaging method for implementing multiple-image radiography

    NASA Astrophysics Data System (ADS)

    Chou, Cheng-Ying; Anastasio, Mark A.; Brankov, Jovan G.; Wernick, Miles N.; Brey, Eric M.; Connor, Dean M., Jr.; Zhong, Zhong

    2007-04-01

    Diffraction-enhanced imaging (DEI) is an analyser-based x-ray imaging method that produces separate images depicting the projected x-ray absorption and refractive properties of an object. Because the imaging model of DEI does not account for ultra-small-angle x-ray scattering (USAXS), the images produced in DEI can contain artefacts and inaccuracies in medical imaging applications. In this work, we investigate an extended DEI method for concurrent reconstruction of three images that depict an object's projected x-ray absorption, refraction and USAXS properties. The extended DEI method can be viewed as an implementation of the recently proposed multiple-image radiography paradigm. Validation studies are conducted by use of computer-simulated and synchrotron measurement data.
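
    For context, the conventional two-image DEI estimate that the extended method generalizes (by adding a USAXS width parameter) solves a 2x2 linear system per pixel, using the analyser rocking curve R and its slope dR/dtheta evaluated at the low- and high-angle measurement points. A minimal per-pixel sketch under those standard assumptions (variable names are generic):

```python
def dei_two_image(I_L, I_H, R_L, R_H, dR_L, dR_H):
    """Per-pixel two-image DEI: solve the linearized rocking-curve model
        I_L = I_R * (R_L + dR_L * dtheta)
        I_H = I_R * (R_H + dR_H * dtheta)
    for the apparent absorption I_R and refraction angle dtheta."""
    dtheta = (I_H * R_L - I_L * R_H) / (I_L * dR_H - I_H * dR_L)
    I_R = (I_L * dR_H - I_H * dR_L) / (R_L * dR_H - R_H * dR_L)
    return I_R, dtheta
```

Applying this at every pixel yields the separate absorption and refraction images; the extended method adds a third unknown (and measurement) for the USAXS property.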

  3. High performance liquid chromatography in pharmaceutical analyses.

    PubMed

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In the pre-sale testing of drugs, their marketing and their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of modifying its polarity during chromatography and all other modifications of the mobile phase depending on the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The greater choice of stationary phases is a further factor enabling good separation. The separation column is connected to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector) and other hyphenated systems such as HPLC-MS and HPLC-NMR; these are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results, and monitor the progress of therapy of a disease.1) Fig. 1 shows a chromatogram obtained from the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  4. Growth curve analyses in selected duck lines.

    PubMed

    Maruyama, K; Vinyard, B; Akbar, M K; Shafer, D J; Turk, C M

    2001-12-01

    1. Growth patterns of male ducks from 4 lines (lines A, B, C and D) selected for market weight were analysed and compared to growth patterns of ducks in the respective line 7 generations earlier. Growth curves were analysed using procedures derived from the Weibull sigmoidal function and the linear-linear relative growth rate model and simple allometry. 2. The ducks were fed ad libitum under 24-h lighting throughout the experiment. At weekly intervals from the time of hatch through 70 d of age, 16 ducks from each line were killed to determine body, carcase, breast-muscle, leg and thigh-muscle, and abdominal fat weights. 3. Line A was the heaviest line, followed by line B, line C and line D. However, body weight, carcase weight and breast-muscle weight at 49 d of age were not significantly different between lines A and B. After 7 generations of selection, the breast-muscle yield was increased to >19% and the abdominal fat percent was reduced to <1.4% in all lines. 4. The Weibull growth curve analysis of body weight showed an increase in the asymptotes during selection, while the age of the inflection point remained constant in all lines (21.3 to 26.0 d). For breast-muscle growth, ducks reached the inflection point 12.8 to 14.3 d later than for body weight. Between line A and line B, asymptotes for body weight, asymptotes for breast-muscle weight and allometric growth coefficients of breast muscle and leg and thigh muscles from 14 to 49 d were not significantly different. 5. The relative growth rate model discriminated body and breast-muscle growth patterns of line A and line B. The initial decline in the relative body growth rate was less and the time to reach the transition was longer in line A than line B. On the other hand, the initial decline in the relative breast-muscle growth rate was greater in line A than line B. PMID:11811908
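
    The Weibull sigmoidal growth function used in the analysis, together with the age of its inflection point, can be sketched as follows (parameter names are generic; the fitted values from the study are not reproduced here):

```python
import math

def weibull_growth(t, A, b, c):
    """Weibull sigmoidal growth curve: asymptotic weight A, scale b, shape c."""
    return A * (1.0 - math.exp(-((t / b) ** c)))

def weibull_inflection_age(b, c):
    """Age at the inflection point (maximum growth rate), valid for c > 1;
    obtained by setting the second derivative of the curve to zero."""
    return b * ((c - 1.0) / c) ** (1.0 / c)
```

Selection raising the asymptote A while leaving the inflection age unchanged, as reported for body weight, corresponds to scaling A with b and c roughly fixed.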

  5. Imaging infection.

    PubMed

    Ketai, Loren; Jordan, Kirk; Busby, Katrina H

    2015-06-01

    Thoracic imaging is widely used to detect lower respiratory tract infections, identify their complications, and aid in differentiating infectious from noninfectious thoracic disease. Less commonly, the combination of imaging findings and a clinical setting can favor infection with a specific organism. This confluence can occur in cases of bronchiectatic nontuberculous mycobacterial infections in immune-competent hosts, invasive fungal disease among neutropenic patients, Pneumocystis jiroveci pneumonia in patients with AIDS, and in cytomegalovirus infections in patients with recent hematopoietic cell transplantation. These specific diagnoses often depend on computed tomography scanning rather than chest radiography alone. PMID:26024600

  6. Brain Imaging

    PubMed Central

    Racine, Eric; Bar-Ilan, Ofek; Illes, Judy

    2007-01-01

    Advances in neuroscience are increasingly intersecting with issues of ethical, legal, and social interest. This study is an analysis of press coverage of an advanced technology for brain imaging, functional magnetic resonance imaging, that has gained significant public visibility over the past ten years. Discussion of issues of scientific validity and interpretation dominated over ethical content in both the popular and specialized press. Coverage of research on higher order cognitive phenomena specifically attributed broad personal and societal meaning to neuroimages. The authors conclude that neuroscience provides an ideal model for exploring science communication and ethics in a multicultural context. PMID:17330151

  7. Image Processing Diagnostics: Emphysema

    NASA Astrophysics Data System (ADS)

    McKenzie, Alex

    2009-10-01

    Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure more accurately the amount of affected lung. CT scan images show clearly if a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, as it appears merely as subtle, barely distinct dark spots on the lung. Our goal is to create a software plug-in to interface with existing open source medical imaging software, to automate the process of accurately diagnosing and determining emphysema severity levels in patients. This will be accomplished by performing a number of statistical calculations using data taken from CT scan images of several patients representing a wide range of severity of the disease. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently utilized methods, which involve looking at percentages of radiodensities in air passages of the lung.
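
    The skewness statistic described above can be computed directly from the histogram of lung CT attenuation values; a minimal sketch (the -950 HU cutoff is a commonly cited low-attenuation threshold, used here only as an illustration and not necessarily this project's choice):

```python
def skewness(values):
    """Sample skewness of a distribution of CT attenuation values (HU):
    third central moment normalized by the 3/2 power of the variance."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n
    m3 = sum((v - mean) ** 3 for v in values) / n
    return m3 / m2 ** 1.5

def low_attenuation_fraction(values, threshold=-950):
    """Fraction of lung voxels below a low-attenuation threshold (HU)."""
    return sum(v < threshold for v in values) / len(values)
```

A healthy-lung histogram is roughly unimodal; emphysematous destruction shifts mass toward very low attenuations, which both statistics pick up.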

  8. Seismic Soil-Structure Interaction Analyses of a Deeply Embedded Model Reactor – SASSI Analyses

    SciTech Connect

    Nie, J.; Braverman, J.; Costantino, M.

    2013-10-31

    This report summarizes the SASSI analyses of a deeply embedded reactor model performed by BNL and CJC and Associates, as part of the seismic soil-structure interaction (SSI) simulation capability project for the NEAMS (Nuclear Energy Advanced Modeling and Simulation) Program of the Department of Energy. The SASSI analyses included three cases: 0.2 g, 0.5 g, and 0.9 g, all of which refer to nominal peak accelerations at the top of the bedrock. The analyses utilized the modified subtraction method (MSM) for performing the seismic SSI evaluations. Each case consisted of two analyses: input motion in one horizontal direction (X) and input motion in the vertical direction (Z), both of which utilized the same in-column input motion. Besides providing SASSI results for use in comparison with the time domain SSI results obtained using the DIABLO computer code, this study also leads to the recognition that the frequency-domain method should be modernized so that it can better serve its mission-critical role for analysis and design of nuclear power plants.

  9. Trend Analyses of Nitrate in Danish Groundwater

    NASA Astrophysics Data System (ADS)

    Hansen, B.; Thorling, L.; Dalgaard, T.; Erlandsen, M.

    2012-04-01

    This presentation assesses the long-term development in the oxic groundwater nitrate concentration and nitrogen (N) loss due to intensive farming in Denmark. Firstly, up to 20-year time-series from the national groundwater monitoring network enable a statistically systematic analysis of distribution, trends and trend reversals in the groundwater nitrate concentration. Secondly, knowledge about the N surplus in Danish agriculture since 1950 is used as an indicator of the potential loss of N. Thirdly, groundwater recharge CFC (Chlorofluorocarbon) age determination allows linking of the first two datasets. The development in the nitrate concentration of oxic groundwater clearly mirrors the development in the national agricultural N surplus, and a corresponding trend reversal is found in groundwater. Regulation and technical improvements in intensive farming in Denmark have succeeded in decreasing the N surplus by 40% since the mid 1980s while at the same time maintaining crop yields and increasing animal production, especially of pigs. Trend analyses prove that the youngest (0-15 years old) oxic groundwater shows more pronounced significant downward nitrate trends (44%) than the oldest (25-50 years old) oxic groundwater (9%). This amounts to clear evidence of the effect of reduced nitrate leaching on groundwater nitrate concentrations in Denmark. Is the Danish groundwater monitoring strategy optimal for the detection of nitrate trends? Will the nitrate concentrations in Danish groundwater continue to decrease, or are the Danish nitrate concentration levels now appropriate according to the Water Framework Directive?
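
    Trend and trend-reversal analyses of monitoring time-series such as these are often performed with the nonparametric Mann-Kendall test; the abstract does not state which test was used, so the sketch below is illustrative only:

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test (no tie correction): returns the S statistic
    and the normal-approximation z-score; z < 0 indicates a downward trend."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

Applied separately to young and old groundwater, such a test would quantify the contrast between the 44% and 9% downward-trend figures quoted above.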

  10. Local spin analyses using density functional theory

    NASA Astrophysics Data System (ADS)

    Abate, Bayileyegn; Peralta, Juan

    Local spin analysis is a valuable technique in computational investigations of magnetic interactions in mono- and polynuclear transition metal complexes, which play vital roles in catalysis, molecular magnetism, artificial photosynthesis, and several other commercially important materials. The relative size and complex electronic structure of transition metal complexes often prohibit the use of multi-determinant approaches, and hence practical calculations are often limited to single-determinant methods. Density functional theory (DFT) has become one of the most successful and widely used computational tools for the electronic structure study of complex chemical systems, transition metal complexes in particular. Within the DFT formalism, a more flexible and complete theoretical modeling of transition metal complexes can be achieved by considering noncollinear spins, in which the spin density is allowed to adopt noncollinear structures instead of being constrained to align parallel or antiparallel to a universal axis of magnetization. In this meeting, I will present local spin analysis results obtained using different DFT functionals. Local projection operators, first introduced by Clark and Davidson, are used to decompose the expectation value of the total spin operator.
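
    Schematically, the Clark-Davidson decomposition referred to above partitions the total spin expectation value into one- and two-center contributions via atomic projection operators \(P_A\) (notation here follows common usage and is a sketch, not a transcription from the talk):

```latex
\langle \hat{S}^2 \rangle = \sum_{A,B} \langle \hat{S}_A \cdot \hat{S}_B \rangle ,
\qquad
\hat{S}_A = \tfrac{1}{2} \sum_i \left[ P_A(i)\,\hat{s}(i) + \hat{s}(i)\,P_A(i) \right]
```

The diagonal terms \(\langle \hat{S}_A \cdot \hat{S}_A \rangle\) give local spins on each center, while the off-diagonal terms measure intersite spin correlations, from which magnetic exchange couplings can be inferred.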

  11. Interim Basis for PCB Sampling and Analyses

    SciTech Connect

    BANNING, D.L.

    2001-01-18

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).

  12. Interim Basis for PCB Sampling and Analyses

    SciTech Connect

    BANNING, D.L.

    2001-03-20

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).

  13. Reproducibility of neuroimaging analyses across operating systems

    PubMed Central

    Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
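
    The Dice coefficients quoted above measure volume overlap between two segmentations of the same structure obtained on different operating systems; a minimal sketch over flat label arrays:

```python
def dice(a, b, label):
    """Dice coefficient between two segmentations (flat label arrays)
    for a given structure label: 2*|A & B| / (|A| + |B|)."""
    a_mask = [x == label for x in a]
    b_mask = [x == label for x in b]
    inter = sum(p and q for p, q in zip(a_mask, b_mask))
    return 2.0 * inter / (sum(a_mask) + sum(b_mask))
```

A value of 1.0 means the two platforms produced identical labelings of that structure; the 0.59 figure cited above corresponds to substantial disagreement.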

  14. Causal mediation analyses with rank preserving models.

    PubMed

    Have, Thomas R Ten; Joffe, Marshall M; Lynch, Kevin G; Brown, Gregory K; Maisto, Stephen A; Beck, Aaron T

    2007-09-01

    We present a linear rank preserving model (RPM) approach for analyzing mediation of a randomized baseline intervention's effect on a univariate follow-up outcome. Unlike standard mediation analyses, our approach does not assume that the mediating factor is also randomly assigned to individuals in addition to the randomized baseline intervention (i.e., sequential ignorability), but does make several structural interaction assumptions that currently are untestable. The G-estimation procedure for the proposed RPM represents an extension of the work on direct effects of randomized intervention effects for survival outcomes by Robins and Greenland (1994, Journal of the American Statistical Association 89, 737-749) and on intervention non-adherence by Ten Have et al. (2004, Journal of the American Statistical Association 99, 8-16). Simulations show good estimation and confidence interval performance by the proposed RPM approach under unmeasured confounding relative to the standard mediation approach, but poor performance under departures from the structural interaction assumptions. The trade-off between these assumptions is evaluated in the context of two suicide/depression intervention studies. PMID:17825022
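
    The G-estimation idea behind the proposed RPM can be illustrated schematically: choose the structural parameter so that the outcome, with the mediator's putative effect removed, is unassociated with the randomized arm. This toy version uses a mean-difference criterion over a candidate grid and omits the structural interaction assumptions discussed in the abstract; all names are illustrative:

```python
def g_estimate(y, m, z, betas):
    """Schematic G-estimation for a rank-preserving structural model:
    pick beta so that H = Y - beta*M is mean-independent of the
    randomized arm z (0/1), exploiting only the randomization of z."""
    def arm_diff(beta):
        h = [yi - beta * mi for yi, mi in zip(y, m)]
        h1 = [hi for hi, zi in zip(h, z) if zi == 1]
        h0 = [hi for hi, zi in zip(h, z) if zi == 0]
        return abs(sum(h1) / len(h1) - sum(h0) / len(h0))
    return min(betas, key=arm_diff)
```

Unlike the standard mediation approach, this criterion does not require the mediator itself to be ignorable, which is the point of the RPM construction.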

  15. Comparative sequence analyses of sixteen reptilian paramyxoviruses

    USGS Publications Warehouse

    Ahne, W.; Batts, W.N.; Kurath, G.; Winton, J.R.

    1999-01-01

    Viral genomic RNA of Fer-de-Lance virus (FDLV), a paramyxovirus highly pathogenic for reptiles, was reverse transcribed and cloned. Plasmids with significant sequence similarities to the hemagglutinin-neuraminidase (HN) and polymerase (L) genes of mammalian paramyxoviruses were identified by BLAST search. Partial sequences of the FDLV genes were used to design primers for amplification by nested polymerase chain reaction (PCR) and sequencing of 518-bp L gene and 352-bp HN gene fragments from a collection of 15 previously uncharacterized reptilian paramyxoviruses. Phylogenetic analyses of the partial L and HN sequences produced similar trees in which there were two distinct subgroups of isolates that were supported with maximum bootstrap values, and several intermediate isolates. Within each subgroup the nucleotide divergence values were less than 2.5%, while the divergence between the two subgroups was 20-22%. This indicated that the two subgroups represent distinct virus species containing multiple virus strains. The five intermediate isolates had nucleotide divergence values of 11-20% and may represent additional distinct species. In addition to establishing diversity among reptilian paramyxoviruses, the phylogenetic groupings showed some correlation with geographic location, and clearly demonstrated a low level of host species-specificity within these viruses. Copyright (C) 1999 Elsevier Science B.V.
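
    The divergence values quoted above are pairwise nucleotide distances over the aligned L and HN gene fragments; a minimal sketch of the simple p-distance (fraction of differing aligned sites, ignoring gaps):

```python
def p_distance(seq1, seq2):
    """Pairwise nucleotide divergence (p-distance) between two aligned
    sequences: fraction of differing sites among gap-free positions."""
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != '-' and b != '-']
    diffs = sum(a != b for a, b in pairs)
    return diffs / len(pairs)
```

Under this measure, the <2.5% within-subgroup and 20-22% between-subgroup values cited above translate directly into counts of mismatched sites per aligned fragment.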

  16. Recent Advances in Cellular Glycomic Analyses

    PubMed Central

    Furukawa, Jun-ichi; Fujitani, Naoki; Shinohara, Yasuro

    2013-01-01

    A large variety of glycans is intricately located on the cell surface, and the overall profile (the glycome, given the entire repertoire of glycoconjugate-associated sugars in cells and tissues) is believed to be crucial for the diverse roles of glycans, which are mediated by specific interactions that control cell-cell adhesion, immune response, microbial pathogenesis and other cellular events. The glycomic profile also reflects cellular alterations, such as development, differentiation and cancerous change. A glycoconjugate-based approach would therefore be expected to streamline discovery of novel cellular biomarkers. Development of such an approach has proven challenging, due to the technical difficulties associated with the analysis of various types of cellular glycomes; however, recent progress in the development of analytical methodologies and strategies has begun to clarify the cellular glycomics of various classes of glycoconjugates. This review focuses on recent advances in the technical aspects of cellular glycomic analyses of major classes of glycoconjugates, including N- and O-linked glycans, derived from glycoproteins, proteoglycans and glycosphingolipids. Articles that unveil the glycomics of various biologically important cells, including embryonic and somatic stem cells, induced pluripotent stem (iPS) cells and cancer cells, are discussed. PMID:24970165

  17. Cyanide analyses for risk and treatability assessments

    SciTech Connect

    MacFarlane, I.D.; Elseroad, H.J.; Pergrin, D.E.; Logan, C.M.

    1994-12-31

    Cyanide, an EPA priority pollutant and target analyte, is typically measured as total cyanide. However, cyanide complexation, information that is not acquired through total cyanide analysis, often drives cyanide toxicity and treatability. A case study of a former manufactured gas plant (MGP) is used to demonstrate the usability of various cyanide analytical methods for risk and treatability assessments. Several analytical methods, including cyanide amenable to chlorination and weak acid dissociable cyanide, help establish the degree of cyanide complexation. Generally, free or uncomplexed cyanide is more biologically available, toxic, and reactive than complexed cyanide. Extensive site testing has shown that free and weakly dissociable cyanide composes only a small fraction of total cyanide, as would be expected from the literature, and that risk assessment will be more realistic when cyanide form is considered. Likewise, aqueous treatment for cyanide can be properly tested if cyanide form is accounted for. Weak acid dissociable cyanide analyses proved to be the most reliable (and potentially acceptable) cyanide method, as well as representing the most toxic and reactive cyanide forms.

  18. Analysing specificity of a bipolar EEG measurement.

    PubMed

    Vaisanen, Juho; Ryynanen, Outi; Hyttinen, Jari; Malmivuo, Jaakko

    2006-01-01

    The objective in bioelectric measurements such as ECG and EEG is to register the signal arising from sources in the region of interest. It is also desired that the signal-to-noise ratio (SNR) of a measurement be high. The sensitivity of an ideal measurement should focus on, and be greater over, the target areas in comparison to other areas of the volume conductor. Previously the half-sensitivity volume (HSV) has been applied to describe how focused a measurement is. In this paper we introduce the concept of the half-sensitivity ratio (HSR), which describes how well the sensitivity is concentrated in the HSV compared to other source regions, i.e. how specific the measurement is to the sources in the HSV. Further, we may have different regions of interest (ROI) to which the measurements should be specific; the corresponding concept is then called the region of interest sensitivity ratio (ROISR). We present here an application of the HSR in analysing sensitivity distributions of bioelectric measurements. We studied the effects of interelectrode distance and the scalp/skull/brain resistivity ratio on the HSR of a bipolar EEG measurement with a three-layer spherical head model. The results indicate that when the focus of interest is on cortical activity, more specific and concentrated sensitivity distributions are achieved with smaller interelectrode distances. Further, a preliminary measurement with visual evoked potentials provides evidence of the relationship between the HSR and the SNR of a measurement. PMID:17945619
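    As a rough sketch, not the authors' implementation, a ratio in the spirit of HSR/ROISR can be computed by comparing the sensitivity summed inside a region of interest against the sensitivity summed over the rest of the volume conductor. The discretized sensitivity field and ROI mask below are invented toy values:

```python
def region_sensitivity_ratio(sensitivity, roi_mask):
    """Ratio of measurement sensitivity concentrated in a region of interest
    versus the rest of the volume conductor (a simplified reading of HSR/ROISR)."""
    inside = sum(s for s, m in zip(sensitivity, roi_mask) if m)
    outside = sum(s for s, m in zip(sensitivity, roi_mask) if not m)
    return inside / outside

# Toy discretized sensitivity field: 4 source elements inside the ROI, 4 outside
sens = [0.9, 0.8, 0.7, 0.6, 0.2, 0.1, 0.1, 0.1]
mask = [True, True, True, True, False, False, False, False]
print(round(region_sensitivity_ratio(sens, mask), 2))  # 6.0
```

    A higher ratio means the measurement is more specific to the sources inside the ROI, which is the sense in which the abstract relates HSR to SNR.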

  19. Phylogenomic Analyses Support Traditional Relationships within Cnidaria

    PubMed Central

    Zapata, Felipe; Goetz, Freya E.; Smith, Stephen A.; Howison, Mark; Siebert, Stefan; Church, Samuel H.; Sanders, Steven M.; Ames, Cheryl Lewis; McFadden, Catherine S.; France, Scott C.; Daly, Marymegan; Collins, Allen G.; Haddock, Steven H. D.; Dunn, Casey W.; Cartwright, Paulyn

    2015-01-01

    Cnidaria, the sister group to Bilateria, is a highly diverse group of animals in terms of morphology, lifecycles, ecology, and development. How this diversity originated and evolved is not well understood because phylogenetic relationships among major cnidarian lineages are unclear, and recent studies present contrasting phylogenetic hypotheses. Here, we use transcriptome data from 15 newly-sequenced species in combination with 26 publicly available genomes and transcriptomes to assess phylogenetic relationships among major cnidarian lineages. Phylogenetic analyses using different partition schemes and models of molecular evolution, as well as topology tests for alternative phylogenetic relationships, support the monophyly of Medusozoa, Anthozoa, Octocorallia, Hydrozoa, and a clade consisting of Staurozoa, Cubozoa, and Scyphozoa. Support for the monophyly of Hexacorallia is weak due to the equivocal position of Ceriantharia. Taken together, these results further resolve deep cnidarian relationships, largely support traditional phylogenetic views on relationships, and provide a historical framework for studying the evolutionary processes involved in one of the most ancient animal radiations. PMID:26465609

  20. Transportation systems analyses. Volume 1: Executive summary

    NASA Astrophysics Data System (ADS)

    1992-11-01

    The principal objective is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform crew delivery and return, cargo transfer, cargo delivery and return, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed include, but are not limited to: the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationship between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. Conceptual studies of transportation elements contribute to the systems approach by identifying elements (such as ETO node and transfer/excursion vehicles) needed in current and planned transportation systems. These studies are also a mechanism to integrate the results of relevant parallel studies.

  1. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    SciTech Connect

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling Facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  2. Characterization of branch complexity by fractal analyses

    USGS Publications Warehouse

    Alados, C.L.; Escos, J.; Emlen, J.M.; Freeman, D.C.

    1999-01-01

    The comparison between complexity in the sense of space occupancy (box-counting fractal dimension D(c) and information dimension D1) and heterogeneity in the sense of space distribution (average evenness index f and evenness variation coefficient J(cv)) was investigated in mathematical fractal objects and natural branch structures. In general, increased fractal dimension was paired with low heterogeneity. Comparisons of branch architecture in Anthyllis cytisoides under different slope exposures and grazing impacts revealed that branches were more complex and more homogeneously distributed for plants on northern exposures than on southern ones, while grazing had no impact during a wet year. Developmental instability was also investigated via the statistical noise of the allometric relation between internode length and node order. In conclusion, our study demonstrated that the fractal dimension of branch structure can be used to analyze the structural organization of plants, especially if we consider not only fractal dimension but also shoot distribution within the canopy (lacunarity). These indexes, together with developmental instability analyses, are good indicators of growth responses to the environment.
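    A minimal sketch of the box-counting estimate of D(c) mentioned above, assuming the branch structure is represented as a 2-D point set (the point set and box sizes here are illustrative, not the study's data): count occupied boxes N(s) at several box sizes s and fit the slope of log N(s) against log(1/s).

```python
import math

def box_count_dimension(points, sizes):
    """Estimate the box-counting dimension D_c of a 2-D point set by
    least-squares fitting log N(s) against log(1/s)."""
    logs, logN = [], []
    for s in sizes:
        # Set of grid cells of side s occupied by at least one point
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append(math.log(1.0 / s))
        logN.append(math.log(len(boxes)))
    n = len(sizes)
    mx, my = sum(logs) / n, sum(logN) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(logs, logN)) / \
            sum((a - mx) ** 2 for a in logs)
    return slope

# Sanity check: a straight line of points should give D_c close to 1
line = [(i / 1000.0, i / 1000.0) for i in range(1000)]
print(round(box_count_dimension(line, [0.5, 0.25, 0.125, 0.0625]), 2))  # 1.0
```

    A space-filling structure would approach D_c = 2 in this 2-D setting, which is the sense in which higher fractal dimension indicates greater space occupancy.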

  3. Comparison of manual, digital and lateral CBCT cephalometric analyses

    PubMed Central

    NAVARRO, Ricardo de Lima; OLTRAMARI-NAVARRO, Paula Vanessa Pedron; FERNANDES, Thais Maria Freire; de OLIVEIRA, Giovani Fidelis; CONTI, Ana Cláudia de Castro Ferreira; de ALMEIDA, Marcio Rodrigues; de ALMEIDA, Renato Rodrigues

    2013-01-01

    Objective: The aim of this study was to compare the reliability of three different methods of cephalometric analysis. Material and Methods: Conventional pretreatment lateral cephalograms and cone beam computed tomography (CBCT) scans from 50 subjects from a radiological clinic were selected in order to test the three methods: manual tracings (MT), digitized lateral cephalograms (DLC), and lateral cephalograms from CBCT (LC-CBCT). The lateral cephalograms were analyzed with the Dolphin Imaging 11.0™ software. Twenty measurements were performed under the same conditions and retraced after a 30-day period. Paired t tests and the Dahlberg formula were used to evaluate intra-examiner errors. Pearson's correlation coefficient and one-way analysis of variance (ANOVA) tests were used to compare the differences between the methods. Results: Intra-examiner reliability was observed for all methods for most of the measurements. Only six measurements differed between the methods, and agreement was observed among the three methods. Conclusions: The results demonstrated that all evaluated methodologies are reliable and valid for scientific research; however, the LC-CBCT method proved the most reliable. PMID:23739848
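    The Dahlberg formula used here for intra-examiner error is sqrt(Σd_i²/2n), where d_i is the difference between the first and second tracing of the same measurement and n is the number of paired measurements. A small sketch with invented angle values (not the study's data):

```python
import math

def dahlberg_error(first, second):
    """Dahlberg's method error for repeated measurements:
    sqrt(sum(d_i^2) / (2n)), with d_i = first[i] - second[i]."""
    if len(first) != len(second):
        raise ValueError("need paired measurements")
    sq = sum((a - b) ** 2 for a, b in zip(first, second))
    return math.sqrt(sq / (2 * len(first)))

# Hypothetical cephalometric angles (degrees) traced twice, 30 days apart
t1 = [82.0, 80.5, 84.0, 79.0]
t2 = [82.5, 80.0, 84.5, 79.5]
print(round(dahlberg_error(t1, t2), 3))  # 0.354
```

    A small Dahlberg error relative to the measurement scale is what supports the intra-examiner reliability claim.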

  4. Comparative analyses of multifrequency PSI ground deformation measurements

    NASA Astrophysics Data System (ADS)

    Sabater, José R.; Duro, Javier; Arnaud, Alain; Albiol, David; Koudogbo, Fifamè N.

    2011-11-01

    In recent years many new developments have been made in the field of SAR image analysis. The diversity of available SAR imagery allows a wider range of applications to be covered in the domain of risk management and hazard mapping. The work that we propose is based on the analysis of differences in ground deformation measurements extracted from the processing of data stacks acquired at different frequencies. The aim of the project is the definition of criteria that could assist in the selection of the most appropriate SAR mission according to the type of regions of interest. Key factors are geographic localization and land cover. The study is organized in two main parts. First, the impact of sensitivity to motion, land cover characteristics, spatial resolution and atmospheric artifacts is investigated at different wavelengths. Second, the PS density achieved and the capacity to detect and monitor fast and slow motions over urban and rural areas with different frequencies is analyzed. The presented InSAR analyses have been performed using the Stable Point Network (SPN) PSI software developed by Altamira Information.

  5. Consistency of methods for analysing location-specific data

    PubMed Central

    Zanca, F.; Chakraborty, D. P.; Marchal, G.; Bosmans, H.

    2010-01-01

    Although the receiver operating characteristic (ROC) method is the acknowledged gold-standard for imaging system assessment, it ignores localisation information and differentiation between multiple abnormalities per case. As the free-response ROC (FROC) method uses localisation information and more closely resembles the clinical reporting process, it is being increasingly used. A number of methods have been proposed to analyse the data that result from an FROC study: jackknife alternative FROC (JAFROC) and a variant termed JAFROC1, initial detection and candidate analysis (IDCA) and ROC analysis via the reduction of the multiple ratings on a case to a single rating. The focus of this paper was to compare JAFROC1, IDCA and the ROC analysis methods using a clinical FROC human data set. All methods agreed on the ordering of the modalities and all yielded statistically significant differences of the figures-of-merit, i.e. p < 0.05. Both IDCA and JAFROC1 yielded much smaller p-values than ROC. The results are consistent with a recent simulation-based validation study comparing these and other methods. In conclusion, IDCA or JAFROC1 analysis of FROC human data may be superior at detecting modality differences than ROC analysis. PMID:20159917
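    For context, the figure-of-merit in plain ROC analysis is the area under the curve, which for empirical ratings equals the Mann-Whitney statistic: the probability that a diseased case is rated higher than a non-diseased case, with ties counting one half. The sketch below uses invented confidence ratings and illustrates ordinary ROC only, not the JAFROC or IDCA procedures:

```python
def roc_auc(scores_neg, scores_pos):
    """Empirical ROC area via the Mann-Whitney statistic: probability that a
    positive case is rated higher than a negative one (ties count 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy confidence ratings for non-diseased vs. diseased cases
print(roc_auc([1, 2, 2, 3], [2, 3, 4, 5]))  # 0.84375
```

    FROC figures-of-merit generalize this idea to mark-rating pairs with localisation, which is why they can be more sensitive to modality differences.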

  6. Image sets for satellite image processing systems

    NASA Astrophysics Data System (ADS)

    Peterson, Michael R.; Horner, Toby; Temple, Asael

    2011-06-01

    The development of novel image processing algorithms requires a diverse and relevant set of training images to ensure the general applicability of such algorithms for their required tasks. Images must be appropriately chosen for the algorithm's intended applications. Image processing algorithms often employ the discrete wavelet transform (DWT) algorithm to provide efficient compression and near-perfect reconstruction of image data. Defense applications often require the transmission of images and video across noisy or low-bandwidth channels. Unfortunately, the DWT algorithm's performance deteriorates in the presence of noise. Evolutionary algorithms are often able to train image filters that outperform DWT filters in noisy environments. Here, we present and evaluate two image sets suitable for the training of such filters for satellite and unmanned aerial vehicle imagery applications. We demonstrate the use of the first image set as a training platform for evolutionary algorithms that optimize discrete wavelet transform (DWT)-based image transform filters for satellite image compression. We evaluate the suitability of each image as a training image during optimization. Each image is ranked according to its suitability as a training image and its difficulty as a test image. The second image set provides a test-bed for holdout validation of trained image filters. These images are used to independently verify that trained filters will provide strong performance on unseen satellite images. Collectively, these image sets are suitable for the development of image processing algorithms for satellite and reconnaissance imagery applications.
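    As a minimal illustration of the DWT building block mentioned above, here is a single-level Haar transform in pure Python; the evolved filters such studies optimize differ from Haar, so this is only a sketch of the decomposition/reconstruction idea:

```python
def haar_dwt_1d(signal):
    """One level of the Haar discrete wavelet transform: pairwise averages
    (approximation) and differences (detail), each scaled by 1/2."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt_1d(approx, detail):
    """Perfect reconstruction from Haar approximation/detail coefficients."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

row = [9, 7, 3, 5, 6, 10, 2, 6]  # one image row, for illustration
a, d = haar_dwt_1d(row)
print(a, d)  # [8.0, 4.0, 8.0, 4.0] [1.0, -1.0, -2.0, -2.0]
assert haar_idwt_1d(a, d) == row  # perfect reconstruction in the noiseless case
```

    Compression comes from quantizing or discarding small detail coefficients; channel noise corrupts those coefficients, which is why evolved filters can outperform standard DWT filters in noisy environments.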

  7. Imaging sciences workshop

    SciTech Connect

    Candy, J.V.

    1994-11-15

    This workshop on the imaging sciences, sponsored by Lawrence Livermore National Laboratory, contains short abstracts/articles submitted by speakers. The topic areas covered include the following: astronomical imaging; biomedical imaging; vision/image display; imaging hardware; imaging software; acoustic/oceanic imaging; microwave/acoustic imaging; computed tomography; physical imaging; and imaging algorithms. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  8. Image and Data-analysis Tools For Paleoclimatic Reconstructions

    NASA Astrophysics Data System (ADS)

    Pozzi, M.

    Proposed here is a directory of instruments and computing resources chosen to address problems in paleoclimatic reconstructions. The following points are discussed in particular: 1) Numerical analysis of paleo-data (fossil abundances, species analyses, isotopic signals, chemical-physical parameters, biological data): a) statistical analyses (univariate, diversity, rarefaction, correlation, ANOVA, F and T tests, Chi^2); b) multidimensional analyses (principal components, correspondence, cluster analysis, seriation, discriminant, autocorrelation, spectral analysis); c) neural analyses (backpropagation net, Kohonen feature map, Hopfield net, genetic algorithms). 2) Graphical analysis (visualization tools) of paleo-data (quantitative and qualitative fossil abundances, species analyses, isotopic signals, chemical-physical parameters): a) 2-D data analyses (graph, histogram, ternary, survivorship); b) 3-D data analyses (direct volume rendering, isosurfaces, segmentation, surface reconstruction, surface simplification, generation of tetrahedral grids). 3) Quantitative and qualitative digital image analysis (macro- and microfossil image analysis, Scanning Electron Microscope and Optical Polarized Microscope image capture and analysis, morphometric data analysis, 3-D reconstructions): a) 2-D image analysis (correction of image defects, enhancement of image detail, converting texture and directionality to grey scale or colour differences, visual enhancement using pseudo-colour, pseudo-3D, thresholding of image features, binary image processing, measurements, stereological measurements, measuring features on a white background); b) 3-D image analysis (basic stereological procedures; two-dimensional structures: area fraction from the point count, volume fraction from the point count; three-dimensional structures: surface area and the line intercept count; three-dimensional microstructures: line length and the

  9. [Endometrial imaging].

    PubMed

    Lemercier, E; Genevois, A; Dacher, J N; Benozio, M; Descargues, G; Marpeau, L

    2000-12-01

    The diagnostic value of endovaginal sonography in benign or malignant endometrial pathology is high, and is increased further by sonohysterography. Sonohysterography is useful in evaluating endometrial thickening and in determining further investigations. MRI is accurate in diagnosing uterine adenomyosis and is the imaging modality of choice for preoperative endometrial cancer staging. PMID:11173754

  10. Inner Image

    ERIC Educational Resources Information Center

    Mollhagen, Nancy

    2004-01-01

    In this article, the author states that she has always loved self portraits but most teenagers do not enjoy looking too closely at their own faces in an effort to replicate them. Thanks to a new digital camera, she was able to use this new technology to inspire students to take a closer look at their inner image. Prior to the self-portrait…

  11. Forest Imaging

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA's Technology Applications Center, with other government and academic agencies, provided technology for improved resources management to the Cibola National Forest. Landsat satellite images enabled vegetation over a large area to be classified for purposes of timber analysis, wildlife habitat, range measurement and development of general vegetation maps.

  12. Photoacoustic imaging platforms for multimodal imaging

    PubMed Central

    2015-01-01

    Photoacoustic (PA) imaging is a hybrid biomedical imaging method that exploits both acoustical and optical properties and can provide both functional and structural information. Therefore, PA imaging can complement other imaging methods, such as ultrasound imaging, fluorescence imaging, optical coherence tomography, and multi-photon microscopy. This article reviews techniques that integrate PA with the above imaging methods and describes their applications. PMID:25754364

  13. Fracturing and brittleness index analyses of shales

    NASA Astrophysics Data System (ADS)

    Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje

    2016-04-01

    The formation of a fracture network in rocks has a crucial control on the flow behaviour of fluids. In addition, an existing network of fractures influences the propagation of new fractures during e.g. hydraulic fracturing or a seismic event. Understanding the type and characteristics of the fracture network that will form during e.g. hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job. For this, knowledge of the rock properties is essential. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock, e.g. for hydraulic fracturing of shales. Various formulations of the brittleness index (BI1, BI2 and BI3) exist, based on mineralogy, elastic constants and stress-strain behaviour (Jin et al., 2014; Jarvie et al., 2007; Holt et al., 2011). A maximum brittleness index of 1 predicts very efficient fracturing behaviour, while a minimum brittleness index of 0 predicts much more ductile shale behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and determined the three different brittleness indices for each sample. We show that the three brittleness indices differ considerably for the same sample, and as such it can be concluded that the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness indices based on the acoustic data (BI1) all lie around 0.5, the indices based on the stress-strain data (BI2) average around 0.75, whereas the mineralogy-based index (BI3) predicts values below 0.2. This shows that different estimates of the brittleness index can lead to different decisions for hydraulic fracturing. If we relied on the mineralogy (BI3), the Whitby mudstone is not a suitable
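    As a hedged sketch of one of the three formulations, a mineralogy-based brittleness index in the spirit of Jarvie et al. (2007) takes the brittle (quartz) fraction over the total mineral content. The composition below is illustrative only, not a measured Whitby value:

```python
def brittleness_mineralogy(quartz, carbonate, clay):
    """Mineralogy-based brittleness index in the spirit of Jarvie et al. (2007):
    brittle (quartz) fraction over total mineral content (fractions sum to 1)."""
    return quartz / (quartz + carbonate + clay)

# Hypothetical clay-rich mudstone composition (illustrative values only)
print(round(brittleness_mineralogy(quartz=0.15, carbonate=0.10, clay=0.75), 2))  # 0.15
```

    For a clay-rich mudstone this formulation yields a low index, consistent with the abstract's observation that BI3 falls below 0.2 while the acoustic and stress-strain formulations give much higher values for the same samples.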

  14. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma

    PubMed Central

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

    Abstract The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and the 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P < 0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P < 0.001). MR histogram analyses, in particular the 1st percentile of the PVP images, held promise for prediction of MVI of HCC. PMID:27368028
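    A minimal sketch of the first-order histogram features named above, computed over a toy list of voxel intensities in pure Python (a real study would operate on the ADC or PVP map voxels; the percentile indexing here is a simple nearest-rank assumption rather than any specific software's convention):

```python
def histogram_features(values, percentiles=(1, 10, 50, 90, 99)):
    """First-order histogram features: mean, variance, skewness, kurtosis,
    and selected percentiles of a list of voxel intensities."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = var ** 0.5
    skew = sum((v - mean) ** 3 for v in values) / (n * std ** 3)
    kurt = sum((v - mean) ** 4 for v in values) / (n * var ** 2)
    srt = sorted(values)
    # Simple nearest-rank percentile (an assumption, not a fixed standard)
    pct = {p: srt[min(n - 1, int(round(p / 100 * (n - 1))))] for p in percentiles}
    return {"mean": mean, "variance": var, "skewness": skew,
            "kurtosis": kurt, "percentiles": pct}

# Toy ADC-map voxel values (arbitrary units, invented)
feats = histogram_features([1.1, 0.9, 1.0, 1.2, 0.8, 1.0, 1.3, 0.7])
print(feats["mean"], feats["percentiles"][50])
```

    Comparing such features between MVI-positive and MVI-negative groups, and then ROC-analyzing each feature, is the workflow the abstract describes.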

  15. Spectral analyses of solar-like stars

    NASA Astrophysics Data System (ADS)

    Doyle, Amanda P.

    2015-03-01

    Accurate stellar parameters are important not just to understand the stars themselves, but also for understanding the planets that orbit them. Despite the availability of high quality spectra, there are still many uncertainties in stellar spectroscopy. In this thesis, the finer details of spectroscopic analyses are discussed and critically evaluated, with a focus on improving the stellar parameters. Using high resolution, high signal-to-noise HARPS spectra, accurate parameters were determined for 22 WASP stars. It is shown that there is a limit to the accuracy of stellar parameters that can be achieved, despite using high S/N spectra. It is also found that the selection of spectral lines used and the accuracy of atomic data are crucial, and different line lists can result in different values of parameters. Different spectral analysis methods often give vastly different results even for the same spectrum of the same star. Here it is shown that many of these discrepancies can be explained by the choice of lines used and by the various assumptions made. This will enable a more reliable homogeneous study of solar-like stars in the future. The Rossiter-McLaughlin effect observed for transiting exoplanets often requires prior knowledge of the projected rotational velocity (vsini). This is usually provided via spectroscopy; however, this method has uncertainties, as spectral lines are also broadened by photospheric velocity fields known as "macroturbulence". Using rotational splitting frequencies for 28 Kepler stars that were provided via asteroseismology, accurate vsini values have been determined. By inferring the macroturbulence for 28 Kepler stars, it was possible to obtain a new calibration between macroturbulence, effective temperature and surface gravity. Therefore macroturbulence, and thus vsini, can now be determined with confidence for stars that do not have asteroseismic data available. New spectroscopic vsini values were then determined for the WASP planet host

  16. Consumption Patterns and Perception Analyses of Hangwa

    PubMed Central

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-01-01

    Hangwa is a traditional food which, to match current consumption trends, needs marketing strategies that extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa in order to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009, and the data from 231 persons were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'for presents' (39.8%), and the main reasons for buying it were its 'traditional image' (33.3%) and 'taste' (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were 'a sanitary process', 'a rigorous quality mark' and 'taste', which were related to product quality. In addition, those with high importance but low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of price'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote its consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price. PMID:24471065

  17. Finite Element analyses of soil bioengineered slopes

    NASA Astrophysics Data System (ADS)

    Tamagnini, Roberto; Switala, Barbara Maria; Sudan Acharya, Madhu; Wu, Wei; Graf, Frank; Auer, Michael; te Kamp, Lothar

    2014-05-01

    Soil bioengineering methods are not only effective from an economical point of view, but are also interesting as fully ecological solutions. The presented project aims to define a numerical model that includes the impact of vegetation on slope stability, considering both mechanical and hydrological effects. In this project, a constitutive model has been developed that accounts for the multi-phase nature of the soil, namely the partly saturated condition, and also includes the effects of a biological component. The constitutive equation is implemented in the Finite Element (FE) software Comes-Geo with an implicit integration scheme that accounts for the collapse of the soil structure due to wetting. The mathematical formulation of the constitutive equations is introduced by means of thermodynamics, and it simulates the growth of the biological system over time. The numerical code is then applied in the analysis of an idealized rainfall-induced landslide. The slope is analyzed for vegetated and non-vegetated conditions. The final results allow the impact of vegetation on slope stability to be assessed quantitatively. This allows conclusions to be drawn and a choice to be made as to whether it is worthwhile to use soil bioengineering methods in slope stabilization instead of traditional approaches. The application of the FE method shows some advantages with respect to the commonly used limit equilibrium analyses, because it can account for the real coupled strain-diffusion nature of the problem. The mechanical strength of roots is in fact influenced by the stress evolution within the slope. Moreover, the FE method does not need a pre-definition of any failure surface. The FE method can also be used in monitoring the progressive failure of the soil bioengineered system, as it calculates the displacements and strains of the model slope. The preliminary study results show that the formulated equations can be useful for the analysis and evaluation of different soil bio

  18. Viscoelastic analyses of launch vehicle components

    SciTech Connect

    Chi, J.K.; Lin, S.R.

    1995-12-31

    Current analysis techniques for solid rocket propellant and insulation used in space launch vehicles have several shortcomings. The simplest linear elastic analysis method ignores the inherent viscoelastic behavior of these materials entirely. The relaxation modulus method commonly used to simulate time-dependent effects ignores the past loading history, while a rigorous viscoelastic finite-element analysis is often expensive and impractical. The response of viscoelastic materials is often characterized by the time-dependent relaxation moduli obtained from uniaxial relaxation tests. Since the relaxation moduli are functions of elapsed time, the viscoelastic analysis depends not only on the current stress or strain state but also on the full loading history. As a preliminary step towards developing a procedure that will yield reasonably conservative results for analyzing the structural response of solid rocket motors, an equivalent-modulus approach was developed. To demonstrate its application, a viscoelastic thick-walled cylindrical material, confined by a stiff steel case and under an internal pressure condition, was analyzed using (1) the equivalent-modulus elastic quasi-static method, (2) an exact viscoelastic closed-form solution, and (3) the viscoelastic finite-element program. A combination of two springs and one viscous damper is used to represent the viscoelastic material, with parameters obtained from stress-relaxation tests. The equivalent modulus is derived based on an accumulated quasi-static stress/strain state. The exact closed-form solution is obtained by the Laplace transform method. The ABAQUS program is then used for the viscoelastic finite-element solution, where the loading-rate-dependent modulus is represented by a Prony series expansion of the relaxation modulus. Additional analyses were performed for two space launch solid rocket motors for the purpose of comparing results from the equivalent-modulus approach and the ABAQUS program.
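    The Prony series mentioned above represents the relaxation modulus as a constant long-term value plus a sum of decaying exponentials, E(t) = E_inf + Σ E_i exp(-t/τ_i). A small sketch with invented two-term parameters (the units and values below are illustrative, not from any test data):

```python
import math

def relaxation_modulus(t, e_inf, prony_terms):
    """Prony-series relaxation modulus:
    E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
    return e_inf + sum(e_i * math.exp(-t / tau_i) for e_i, tau_i in prony_terms)

# Hypothetical two-term fit to a stress-relaxation test (MPa, seconds)
terms = [(4.0, 10.0), (2.0, 100.0)]
print(relaxation_modulus(0.0, e_inf=1.0, prony_terms=terms))  # 7.0 (instantaneous modulus)
print(round(relaxation_modulus(1e6, 1.0, terms), 3))          # 1.0 (long-term limit E_inf)
```

    At t = 0 all exponentials equal 1, giving the instantaneous modulus; as t grows the exponentials decay and E(t) relaxes toward E_inf, which is the time dependence the equivalent-modulus approach approximates.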

  19. Trend analyses with river sediment rating curves

    USGS Publications Warehouse

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQb, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â (Q/QGM)b], where QGM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
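
    The discharge-normalized fit described above can be sketched as a log-log regression. The data here are synthetic, generated only to illustrate recovering â and b; the parameter values are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic discharge (Q) and concentration (C) obeying C = a_hat * (Q/Q_gm)**b.
Q = rng.lognormal(mean=3.0, sigma=1.0, size=200)
Q_gm = np.exp(np.mean(np.log(Q)))            # geometric mean of sampled discharges
b_true, a_hat_true = 1.5, 120.0
C = a_hat_true * (Q / Q_gm) ** b_true * rng.lognormal(0.0, 0.1, size=200)

# Least-squares fit in log space: log C = log(a_hat) + b * log(Q / Q_gm).
x = np.log(Q / Q_gm)
b_fit, log_a_fit = np.polyfit(x, np.log(C), 1)
a_hat_fit = np.exp(log_a_fit)
```

    Because Q is normalized by its geometric mean before fitting, the intercept â is evaluated at a representative discharge rather than at the physically unrealistic Q = 1, which is why it tracks vertical offset better than a.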

  20. SEDS Tether M/OD Damage Analyses

    NASA Technical Reports Server (NTRS)

    Hayashida, K. B.; Robinson, J. H.; Hill, S. A.

    1997-01-01

    The Small Expendable Deployer System (SEDS) was designed to deploy an endmass at the end of a 20-km-long tether which acts as an upper stage rocket, and the threats from the meteoroid and orbital debris (M/OD) particle environments on SEDS components are important issues for the safety and success of any SEDS mission. However, the possibility of severing the tether due to M/OD particle impacts is an even more serious concern, since the SEDS tether presents a relatively large exposed area to the M/OD environments although its diameter is quite small. The threats from the M/OD environments became a very important issue for the third SEDS mission, since the project office proposed using the shuttle orbiter as a launch platform instead of the second stage of a Delta II expendable rocket, which was used for the first two SEDS missions. A series of hypervelocity impact tests were performed at the Johnson Space Center and Arnold Engineering Development Center to help determine the critical particle sizes required to sever the tether. The CTH hydrocode, developed by Sandia National Laboratories, was also used to simulate the damage to the SEDS tether caused by both orbital debris and test particle impacts. The CTH simulation results provided much-needed information for determining the critical particle sizes required to sever the tether. From these studies, the M/OD particle sizes required to sever the tether were estimated to be less than 0.1 cm in diameter, and particles of this size are more abundant in low-Earth orbit than larger particles. Finally, the authors performed M/OD damage analyses for the three SEDS missions (SEDS-1, -2, and -3) using the information obtained from the hypervelocity impact tests and hydrocode simulations.

  1. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons included incomprehensible or absent variable labels, DFAs performed on an unspecified subset of the data, and incomplete datasets. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
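
    Two of the summary statistics the authors tried to reproduce can be computed with a minimal two-group Fisher discriminant. This is an illustrative numpy sketch on synthetic data, not the authors' pipeline; the group means and sample sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic groups with shifted means (stand-ins for morphometric variables).
n, p = 100, 3
X1 = rng.normal(0.0, 1.0, size=(n, p))
X2 = rng.normal(1.5, 1.0, size=(n, p))

def fisher_lda_direction(X1, X2):
    """Fisher discriminant direction w = Sw^-1 (m2 - m1) for two groups."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = (np.cov(X1, rowvar=False) * (len(X1) - 1)
          + np.cov(X2, rowvar=False) * (len(X2) - 1))   # pooled within-group scatter
    return np.linalg.solve(Sw, m2 - m1)

w = fisher_lda_direction(X1, X2)

# Percentage correctly assigned: project and split at the midpoint of group means.
s1, s2 = X1 @ w, X2 @ w
cut = 0.5 * (s1.mean() + s2.mean())
pct_correct = 100.0 * (np.sum(s1 < cut) + np.sum(s2 > cut)) / (2 * n)

# Largest (absolute) discriminant function coefficient.
largest_coef = np.max(np.abs(w))
```

    Reproducing such numbers from a published dataset only works when variable labels and the analyzed subset are documented, which is precisely where the surveyed studies failed.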

  2. High-Resolution Views of Io's Emakong Patera: Latest Galileo Imaging Results

    NASA Technical Reports Server (NTRS)

    Williams, D. A.; Keszthelyi, L. P.; Davies, A. G.; Greeley, R.; Head, J. W., III

    2002-01-01

    This presentation will discuss analyses of the latest Galileo SSI (solid state imaging) high-resolution images of the Emakong lava channels and flow field on Jupiter's moon Io. Additional information is contained in the original extended abstract.

  3. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be achieved experimentally or via pre-treatment of the digitized data. In many analytical curricula, instrumental techniques are given preference…
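
    A common data pre-treatment of the kind this article teaches is boxcar smoothing. The sketch below applies a moving average to a synthetic noisy spectrum and checks the noise reduction on a flat baseline region; the band shape and noise level are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy synthetic spectrum: one Gaussian band plus white noise.
x = np.linspace(0.0, 1.0, 500)
signal = np.exp(-0.5 * ((x - 0.5) / 0.05) ** 2)
noisy = signal + rng.normal(0.0, 0.2, size=x.size)

def moving_average(y, width):
    """Boxcar smoothing: each point becomes the mean of `width` neighbours."""
    kernel = np.ones(width) / width
    return np.convolve(y, kernel, mode="same")

smoothed = moving_average(noisy, 11)

# Estimate noise on a flat baseline region far from the band (x < 0.2).
noise_before = noisy[:100].std()
noise_after = smoothed[:100].std()
# White-noise std drops roughly as 1/sqrt(width), at the cost of band broadening.
```

    The trade-off the code exposes, noise suppression versus distortion of narrow features, is the central lesson of most SNR-enhancement exercises.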

  4. Image structure restoration from satellites with multi-matrix scanners

    NASA Astrophysics Data System (ADS)

    Eremeev, V.; Kuznetcov, A.; Myatov, G.; Presnyakov, Oleg; Poshekhonov, V.; Svetelkin, P.

    2014-10-01

    The paper addresses Earth-surface image formation by multi-matrix scanning cameras. Forming continuous, spatially combined images requires consistent solutions for radiometric scan correction, stitching, and geo-referencing of multispectral images. A radiometric scan-correction algorithm based on statistical analysis of the input images is described. An algorithm is also given for sub-pixel stitching of the scans into the single continuous image that a virtual scanner would form. The paper further presents algorithms for geometrically combining multispectral images acquired at different times, with examples illustrating the effectiveness of the suggested processing algorithms.
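
    A statistical radiometric correction of the kind the abstract describes can be sketched as per-detector destriping: each detector column's mean and standard deviation are matched to the global image statistics. The gain/offset model and values below are illustrative, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated scan: each detector column has its own gain and offset (striping).
truth = rng.uniform(50.0, 200.0, size=(64, 32))
gains = rng.normal(1.0, 0.05, size=32)
offsets = rng.normal(0.0, 5.0, size=32)
striped = truth * gains[None, :] + offsets[None, :]

def destripe(img):
    """Match every column's mean/std to the global image statistics."""
    col_mean = img.mean(axis=0)
    col_std = img.std(axis=0)
    return (img - col_mean[None, :]) / col_std[None, :] * img.std() + img.mean()

clean = destripe(striped)
```

    This moment-matching assumes the scene statistics are homogeneous across detectors, which is why statistical methods like it are applied to large input-image ensembles rather than single frames.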

  5. Static and dynamic analyses of tensegrity structures

    NASA Astrophysics Data System (ADS)

    Nishimura, Yoshitaka

    Tensegrity structures are a class of truss structures consisting of a continuous set of tension members (cables) and a discrete set of compression members (bars). Since tensegrity structures are lightweight and can be compactly stowed and deployed, cylindrical tensegrity modules have been proposed for space structures. From the viewpoint of structural dynamics, tensegrity structures pose a new set of problems, i.e., initial shape finding: initial configurations of tensegrity structures must be computed by imposing a pre-stressability condition on the initial equilibrium equations. There are ample qualitative statements regarding the initial geometry of cylindrical and spherical tensegrity modules, but quantitative initial shape analyses have only been performed on one-stage and two-stage cylindrical modules, and analytical expressions for the important geometrical parameters, such as twist angles and overlap ratios, that define the initial shape of both cylindrical and spherical tensegrity modules have been lacking. In response to these needs, a set of static and dynamic characterization procedures for tensegrity modules was first developed. The procedures were subsequently applied to Buckminster Fuller's spherical tensegrity modules. Both the initial shape and the corresponding pre-stress mode were obtained analytically by using the graphs of the tetrahedral, octahedral (cubic), and icosahedral (dodecahedral) groups. For pre-stressed configurations, modal analyses were conducted to classify a large number of infinitesimal mechanism modes. The procedures were also applied to cyclic cylindrical tensegrity modules with an arbitrary number of stages. It was found that both the Maxwell number and the number of infinitesimal mechanism modes are independent of the number of stages in the axial direction. A reduced set of equilibrium equations was derived by incorporating the cyclic symmetry and the flip, or quasi-flip, symmetry of the cylindrical modules.
For multi-stage modules with more than
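
    The Maxwell number mentioned above comes from Maxwell's rule for 3-D pin-jointed frameworks; applied to the classic one-stage, three-bar tensegrity prism it can be sketched as follows. The member count below is the standard prism topology, given here only as an illustration.

```python
def maxwell_count(nodes, members):
    """Maxwell number for a 3-D pin-jointed framework:
    m - s = 3*n - 6 - b, i.e. infinitesimal mechanisms minus self-stress states."""
    return 3 * nodes - 6 - members

# One-stage, three-bar tensegrity prism: 6 nodes;
# 3 bars + 9 cables (3 top, 3 bottom, 3 diagonal) = 12 members.
mx = maxwell_count(6, 12)
# mx == 0: mechanisms and self-stress states occur in equal numbers,
# consistent with the prism's known single self-stress state and single mechanism.
```

    A zero Maxwell number does not by itself prove pre-stressability; that is exactly why the quantitative shape-finding analyses described in the abstract are needed.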

  6. Genome-Facilitated Analyses of Geomicrobial Processes

    SciTech Connect

    Kenneth H. Nealson

    2012-05-02

    that makes up chitin, virtually all of the strains were in fact capable. This led to the discovery of a great many new genes involved with chitin and NAG metabolism (7). In a similar vein, a detailed study of the sugar utilization pathway revealed a major new insight into the regulation of sugar metabolism in this genus (19). Systems Biology and Comparative Genomics of the shewanellae: Several publications were put together describing the use of comparative genomics for analyses of the group Shewanella, and these were a logical culmination of our genomic-driven research (10,15,18). Eight graduate students received their Ph.D. degrees doing part of the work described here, and four postdoctoral fellows were supported. In addition, approximately 20 undergraduates took part in projects during the grant period.

  7. Multispectral Imaging Broadens Cellular Analysis

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Amnis Corporation, a Seattle-based biotechnology company, developed ImageStream to produce sensitive fluorescence images of cells in flow. The company responded to an SBIR solicitation from Ames Research Center, and proposed to evaluate several methods of extending the depth of field for its ImageStream system and implement the best as an upgrade to its commercial products. This would allow users to view whole cells at the same time, rather than just one section of each cell. Through Phase I and II SBIR contracts, Ames provided Amnis the funding the company needed to develop this extended functionality. For NASA, the resulting high-speed image flow cytometry process made its way into Medusa, a life-detection instrument built to collect, store, and analyze sample organisms from erupting hydrothermal vents, and has the potential to benefit space flight health monitoring. On the commercial end, Amnis has implemented the process in ImageStream, combining high-resolution microscopy and flow cytometry in a single instrument, giving researchers the power to conduct quantitative analyses of individual cells and cell populations at the same time, in the same experiment. ImageStream is also built for many other applications, including cell signaling and pathway analysis; classification and characterization of peripheral blood mononuclear cell populations; quantitative morphology; apoptosis (cell death) assays; gene expression analysis; analysis of cell conjugates; molecular distribution; and receptor mapping and distribution.

  8. Synthetic aperture sonar image statistics

    NASA Astrophysics Data System (ADS)

    Johnson, Shawn F.

    Synthetic Aperture Sonar (SAS) systems are capable of producing photograph quality seafloor imagery using a lower frequency than other systems of comparable resolution. However, as with other high-resolution sonar systems, SAS imagery is often characterized by heavy-tailed amplitude distributions which may adversely affect target detection systems. The constant cross-range resolution with respect to range that results from the synthetic aperture formation process provides a unique opportunity to improve our understanding of system and environment interactions, which is essential for accurate performance prediction. This research focused on the impact of multipath contamination and the impact of resolution on image statistics, accomplished through analyses of data collected during at-sea experiments, analytical modeling, and development of numerical simulations. Multipath contamination was shown to have an appreciable impact on image statistics at ranges greater than the water depth and when the levels of the contributing multipath are within 10 dB of the direct path, reducing the image amplitude distribution tails while also degrading image clarity. Image statistics were shown to depend strongly upon both system resolution and orientation to seafloor features such as sand ripples. This work contributes to improving detection systems by aiding understanding of the influences of background (i.e. non-target) image statistics.
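
    The heavy-tailed amplitude statistics discussed above are often quantified by the normalized second intensity moment, which equals 2 for Rayleigh (fully developed speckle) amplitude and grows with tail heaviness. The sketch below contrasts Rayleigh samples with a heavier-tailed, K-distribution-like model; the texture shape parameter is an illustrative choice, not a value from this work.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Fully developed speckle: Rayleigh-distributed amplitude.
rayleigh = rng.rayleigh(scale=1.0, size=n)

# Heavy-tailed "K-like" amplitude: Rayleigh speckle with gamma-modulated
# local power, a common model for textured seafloor clutter.
texture = rng.gamma(shape=1.5, scale=1.0 / 1.5, size=n)   # unit-mean texture
k_like = rng.rayleigh(scale=1.0, size=n) * np.sqrt(texture)

def normalized_intensity_moment(amp):
    """E[I^2] / E[I]^2 for intensity I = amplitude^2; equals 2 for Rayleigh
    amplitude and exceeds 2 for heavy-tailed distributions."""
    i = amp ** 2
    return np.mean((i / i.mean()) ** 2)

m_rayleigh = normalized_intensity_moment(rayleigh)
m_k = normalized_intensity_moment(k_like)
```

    Detection thresholds set assuming Rayleigh statistics produce excess false alarms when the true moment exceeds 2, which is why characterizing the background distribution matters for performance prediction.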

  9. Body Image Dissatisfaction and Distortion, Steroid Use, and Sex Differences in College Age Bodybuilders.

    ERIC Educational Resources Information Center

    Peters, Mark Anthony; Phelps, LeAddelle

    2001-01-01

    Compares college age bodybuilders by sex and steroid intake on two variables: body image dissatisfaction and body image distortion. Results reveal only a significant effect for gender on body distortion. No steroid-use differences were apparent for either body image dissatisfaction or body image distortion. Analyses indicate that female…

  10. Imaging bolometer

    DOEpatents

    Wurden, G.A.

    1999-01-19

    Radiation-hard, steady-state imaging bolometer is disclosed. A bolometer employing infrared (IR) imaging of a segmented-matrix absorber of plasma radiation in a cooled-pinhole camera geometry is described. The bolometer design parameters are determined by modeling the temperature of the foils from which the absorbing matrix is fabricated by using a two-dimensional time-dependent solution of the heat conduction equation. The resulting design will give a steady-state bolometry capability, with approximately 100 Hz time resolution, while simultaneously providing hundreds of channels of spatial information. No wiring harnesses will be required, as the temperature-rise data will be measured via an IR camera. The resulting spatial data may be used to tomographically investigate the profile of plasmas. 2 figs.
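
    The foil-temperature modeling described in the patent rests on a two-dimensional time-dependent heat conduction equation. A minimal explicit finite-difference sketch is shown below; the foil size, diffusivity, and hot-spot loading are illustrative assumptions, not the patent's design values.

```python
import numpy as np

def step(T, alpha, dt, dx):
    """One explicit step of dT/dt = alpha * (d2T/dx2 + d2T/dy2)
    with a fixed-temperature (cooled) frame."""
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx ** 2
    Tn = T + alpha * dt * lap
    Tn[0, :] = Tn[-1, :] = Tn[:, 0] = Tn[:, -1] = 0.0   # cooled pinhole-camera frame
    return Tn

n, dx, alpha = 33, 1e-3, 1e-5          # illustrative foil grid and diffusivity (SI)
dt = 0.2 * dx ** 2 / alpha             # below the explicit stability limit dx^2/(4*alpha)
T = np.zeros((n, n))
T[n // 2, n // 2] = 1.0                # localized absorbed-radiation hot spot

for _ in range(200):
    T = step(T, alpha, dt, dx)
# The hot spot spreads and decays; the IR camera would read this temperature map.
```

    Inverting such a forward model, from the measured temperature rise back to absorbed power per segment, is what turns the IR images into bolometric channels.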

  11. Imaging bolometer

    DOEpatents

    Wurden, Glen A.

    1999-01-01

    Radiation-hard, steady-state imaging bolometer. A bolometer employing infrared (IR) imaging of a segmented-matrix absorber of plasma radiation in a cooled-pinhole camera geometry is described. The bolometer design parameters are determined by modeling the temperature of the foils from which the absorbing matrix is fabricated by using a two-dimensional time-dependent solution of the heat conduction equation. The resulting design will give a steady-state bolometry capability, with approximately 100 Hz time resolution, while simultaneously providing hundreds of channels of spatial information. No wiring harnesses will be required, as the temperature-rise data will be measured via an IR camera. The resulting spatial data may be used to tomographically investigate the profile of plasmas.

  12. Anmap: Image and data analysis

    NASA Astrophysics Data System (ADS)

    Alexander, Paul; Waldram, Elizabeth; Titterington, David; Rees, Nick

    2014-11-01

    Anmap analyses and processes images and spectral data. Originally written for use in radio astronomy, much of its functionality is applicable to other disciplines; additional algorithms and analysis procedures allow direct use in, for example, NMR imaging and spectroscopy. Anmap emphasizes the analysis of data to extract quantitative results for comparison with theoretical models and/or other experimental data. To achieve this, Anmap provides a wide range of tools for analysis, fitting and modelling (including standard image and data processing algorithms). It also provides a powerful environment for users to develop their own analysis/processing tools either by combining existing algorithms and facilities with the very powerful command (scripting) language or by writing new routines in FORTRAN that integrate seamlessly with the rest of Anmap.

  13. Brain imaging

    SciTech Connect

    Bradshaw, J.R.

    1989-01-01

    This book presents a survey of the various imaging tools with examples of the different diseases shown best with each modality. It includes 100 case presentations covering the gamut of brain diseases. These examples are grouped according to the clinical presentation of the patient: headache, acute headache, sudden unilateral weakness, unilateral weakness of gradual onset, speech disorders, seizures, pituitary and parasellar lesions, sensory disorders, posterior fossa and cranial nerve disorders, dementia, and congenital lesions.

  14. First Super-Earth Atmosphere Analysed

    NASA Astrophysics Data System (ADS)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are
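
    The geometry that makes GJ 1214b easy to study follows from the transit depth, roughly (Rp/Rs)^2. The stellar radius below is an assumed illustrative figure for a small M dwarf (about 0.21 solar radii), not a value quoted in this release.

```python
R_EARTH_KM = 6371.0
R_SUN_KM = 695_700.0

def transit_depth(r_planet_km, r_star_km):
    """Fraction of starlight blocked when the planet crosses the stellar disc."""
    return (r_planet_km / r_star_km) ** 2

r_planet = 2.6 * R_EARTH_KM        # planet radius from the article
r_star = 0.21 * R_SUN_KM           # assumed M-dwarf radius (illustrative)

depth = transit_depth(r_planet, r_star)
# depth comes out near 0.013, i.e. a dip of roughly 1%: unusually deep for a
# small planet, because the host star's disc is itself small.
```

    The same small stellar disc also boosts the relative size of the atmospheric annulus, which is what makes transmission spectroscopy of a super-Earth feasible.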

  15. Molecular cloning of chicken aggrecan. Structural analyses.

    PubMed Central

    Chandrasekaran, L; Tanzer, M L

    1992-01-01

    domain. Thus different variants of chondroitin sulphate and keratan sulphate domains may have evolved separately to fulfil specific biochemical and physiological functions. PMID:1339285

  16. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Rev 00

    SciTech Connect

    David Dobson

    2001-06-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate and submit

  17. RADIOLOGICAL AND HAZARDOUS MATERIALS ANALYSES FOR CONTAMINATED SUPERFUND SITES

    EPA Science Inventory

    Perform laboratory analyses on environmental samples. The analyses are to measure radioactive and hazardous contaminants to support regional, state, and federal activities that are part of site assessment and cleanup.

  18. Electron Microprobe Techniques for Use in Tephrochronological Analyses

    NASA Astrophysics Data System (ADS)

    Fournelle, J.; Severin, K.; Wallace, K.; Beget, J.; Larsen, J.

    2006-12-01

    Tephrochronology generally assumes that a layer of volcanic ash represents a snapshot of eruption/deposition and of a region within the subvolcanic magma chamber. Correlation of tephra deposits over long distances helps establish age control for other deposits (volcanic and nonvolcanic). Reliable correlations depend on establishing similarity among tephra deposits. Although multi-parameter characterization of a tephra enhances long-distance correlations, identification and correlation of unknown tephras is often done using only geochemical analyses. Techniques vary but generally deal with chemically characterizing all (bulk) or portions (glass, crystals) of the tephra layer, with various geochemical techniques at various spatial scales. Electron probe microanalysis (EPMA) is the most commonly used analytical tool for geochemical analysis and imaging of micron-size volumes of glass and crystals, yet, despite warnings from numerous EPMA analysts dating back to at least 1992, a standard method for collecting, reducing, and reporting tephra data among and within laboratories is not common practice, making comparison of data sets problematic. We review the complexities in volcanic glass analysis, which include: 1) selection of standards (natural and synthetic, minerals and glasses, simple and complex chemistry, primary and secondary); 2) beam diameter, current level and count times; 3) time dependent element migration (volatiles Na, K, Al, Si); and 4) possible hydration of the glass. For example, there are multiple methods available for treating the volatile elements (minimizing the effect vs. not minimizing but correcting for it), and Morgan and London (1996) examined some of these for hydrous silicate glasses; we suggest continued comparisons are warranted, particularly on commonly used standards. Some published data sets are normalized to 100 wt% without an explanation of the extent of the deficiency in raw total. We review the 10 recommendations made by Froggatt
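
    The normalization practice the abstract flags can be sketched directly: normalize the oxide totals to 100 wt% while still reporting the raw total and its deficiency. The oxide values below are hypothetical, chosen only to illustrate a hydrated-glass analysis.

```python
# Hypothetical raw EPMA oxide totals (wt%) for a hydrated glass analysis.
raw = {"SiO2": 70.1, "Al2O3": 12.3, "Na2O": 3.2, "K2O": 2.9, "CaO": 1.4, "FeO": 1.8}

raw_total = sum(raw.values())
normalized = {oxide: 100.0 * wt / raw_total for oxide, wt in raw.items()}

# Good practice per the abstract: report the raw total alongside the normalized
# values, since a low total may reflect hydration or volatile loss rather than
# a poor analysis, and silently normalizing hides that information.
deficiency = 100.0 - raw_total
```

    Publishing only the normalized numbers, without the raw total, is exactly the reporting gap that makes inter-laboratory tephra correlations problematic.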

  19. Analysing regenerative potential in zebrafish models of congenital muscular dystrophy.

    PubMed

    Wood, A J; Currie, P D

    2014-11-01

    The congenital muscular dystrophies (CMDs) are a clinically and genetically heterogeneous group of muscle disorders. Clinically hypotonia is present from birth, with progressive muscle weakness and wasting through development. For the most part, CMDs can mechanistically be attributed to failure of basement membrane protein laminin-α2 sufficiently binding with correctly glycosylated α-dystroglycan. The majority of CMDs therefore arise as the result of either a deficiency of laminin-α2 (MDC1A) or hypoglycosylation of α-dystroglycan (dystroglycanopathy). Here we consider whether by filling a regenerative medicine niche, the zebrafish model can address the present challenge of delivering novel therapeutic solutions for CMD. In the first instance the readiness and appropriateness of the zebrafish as a model organism for pioneering regenerative medicine therapies in CMD is analysed, in particular for MDC1A and the dystroglycanopathies. Despite the recent rapid progress made in gene editing technology, these approaches have yet to yield any novel zebrafish models of CMD. Currently the most genetically relevant zebrafish models to the field of CMD, have all been created by N-ethyl-N-nitrosourea (ENU) mutagenesis. Once genetically relevant models have been established the zebrafish has several important facets for investigating the mechanistic cause of CMD, including rapid ex vivo development, optical transparency up to the larval stages of development and relative ease in creating transgenic reporter lines. Together, these tools are well suited for use in live-imaging studies such as in vivo modelling of muscle fibre detachment. Secondly, the zebrafish's contribution to progress in effective treatment of CMD was analysed. Two approaches were identified in which zebrafish could potentially contribute to effective therapies. The first hinges on the augmentation of functional redundancy within the system, such as upregulating alternative laminin chains in the candyfloss

  20. Runtime and Pressurization Analyses of Propellant Tanks

    NASA Technical Reports Server (NTRS)

    Field, Robert E.; Ryan, Harry M.; Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chung P.

    2007-01-01

    Multi-element unstructured CFD has been utilized at NASA SSC to carry out analyses of propellant tank systems in different modes of operation. The three regimes of interest at SSC include (a) tank chill down (b) tank pressurization and (c) runtime propellant draw-down and purge. While tank chill down is an important event that is best addressed with long time-scale heat transfer calculations, CFD can play a critical role in the tank pressurization and runtime modes of operation. In these situations, problems with contamination of the propellant by inclusion of the pressurant gas from the ullage causes a deterioration of the quality of the propellant delivered to the test article. CFD can be used to help quantify the mixing and propellant degradation. During tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant including heat transfer and phase change effects and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. It should be noted that traditional CFD modeling is inadequate for such simulations because the fluids in the tank are in a range of different sub-critical and supercritical states and elaborate phase change and mixing rules have to be developed to accurately model the interaction between the ullage gas and the propellant. We show a typical run-time simulation of a spherical propellant tank, containing RP-1 in this case, being pressurized with room-temperature nitrogen at 540 R. Nitrogen

  1. Residual Strength Analyses of Monolithic Structures

    NASA Technical Reports Server (NTRS)

    Forth, Scott (Technical Monitor); Ambur, Damodar R. (Technical Monitor); Seshadri, B. R.; Tiwari, S. N.

    2003-01-01

    Finite-element fracture simulation methodology predicts the residual strength of damaged aircraft structures. The methodology uses the critical crack-tip-opening-angle (CTOA) fracture criterion to characterize the fracture behavior of the material. The CTOA fracture criterion assumes that stable crack growth occurs when the crack-tip angle reaches a constant critical value. The use of the CTOA criterion requires an elastic-plastic finite-element analysis. The critical CTOA value is determined by simulating fracture behavior in laboratory specimens, such as a compact specimen, to obtain the angle that best fits the observed test behavior. The critical CTOA value appears to be independent of loading, crack length, and in-plane dimensions. However, it is a function of material thickness and local crack-front constraint. Modeling the local constraint requires either a three-dimensional analysis or a two-dimensional analysis with an approximation to account for the constraint effects. As the aircraft industry moves toward monolithic structures to reduce part count and manufacturing cost, there has been a sustained effort at NASA Langley to extend the critical-CTOA-based numerical methodology to the analysis of integrally-stiffened panels. In this regard, a series of fracture tests were conducted on both flat and curved aluminum alloy integrally-stiffened panels. The flat panels were subjected to uniaxial tension, and during the tests the applied load, crack extension, out-of-plane displacements, and local deformations around the crack-tip region were measured. Compact and middle-crack tension specimens were tested to determine the critical angle (ψc) using a three-dimensional code (ZIP3D) and the plane-strain core height (hc) using a two-dimensional code (STAGS). These values were then used in the STAGS analysis to predict the fracture behavior of the integrally-stiffened panels. The analyses modeled stable tearing, buckling, and crack
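
    The CTOA check at the heart of the criterion can be sketched from a crack-face opening displacement measured a fixed distance behind the tip, CTOA = 2·atan(δ/(2d)). The displacement, distance, and critical angle below are illustrative numbers, not values from the NASA tests.

```python
import math

def ctoa_deg(opening, distance):
    """Crack-tip opening angle (degrees) from the total crack-face opening
    displacement `opening` measured `distance` behind the tip:
    CTOA = 2 * atan(opening / (2 * distance))."""
    return math.degrees(2.0 * math.atan(opening / (2.0 * distance)))

# Illustrative measurement: 0.1 mm total opening, 1 mm behind the tip.
angle = ctoa_deg(0.1, 1.0)

# In a stable-tearing simulation the crack front advances one increment whenever
# the computed angle reaches the critical value (material/thickness dependent;
# ~5 degrees is a typical order of magnitude for thin aluminum sheet).
critical = 5.0
grows = angle >= critical
```

    Repeating this test at every crack-front node after each load step is how the STAGS/ZIP3D analyses march the crack through the stiffened panel.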

  2. Laboratory Spectral Analyses of Microcrystalline Silica

    NASA Astrophysics Data System (ADS)

    Hardgrove, C. J.; Rogers, D.

    2011-12-01

    Sedimentary rocks have been identified on Mars in increasing numbers and at scales ranging from cobbles to regional outcrops. For this reason, it is important to assess the potential of using thermal infrared (TIR) spectra to obtain quantitative mineralogical information of sedimentary samples. A single sedimentary sample can be a complex mixture of clasts and chemical precipitates of varying crystal size; thus the assumption that the spectral contribution from each component combines linearly in the bulk rock spectrum may not hold true. The spectral properties of some microcrystalline (<20 um) phases also differ slightly from their macrocrystalline counterparts; within the microcrystalline silica suite ("cherts"), wide spectral variability is observed between samples [1]. Thus our first step is to understand the spectral variability observed within microcrystalline chemical precipitates that are found in terrestrial sedimentary rocks. Here we describe several causes of thermal infrared spectral variability in terrestrial "chert" samples, which were identified using emission, micro-FTIR, and micro-Raman spectroscopy and SEM analyses. In this work, "chert" refers to microcrystalline silica with fibrous (a.k.a. chalcedony) or non-fibrous fabric, and may consist of pure alpha-quartz or a mixture of alpha-quartz with the low-temperature silica polymorph, moganite (which typically occurs as intergrowths within chert). Moganite is so prevalent within terrestrial cherts that it has been suggested that its absence indicates high water-to-rock ratios in the formation environment. Using microspectroscopy, we have isolated the first TIR reflectance spectrum for moganite. Increasing proportions of moganite within chert samples primarily have the effect of decreasing the 8.47-um reflectance peak within the quartz reststrahlen ~9 um "doublet", thus accounting for some of the spectral variability observed between chert samples. 
A second, widely observed, spectral characteristic of some
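
    The linear-combination assumption discussed above can be sketched numerically: a bulk spectrum is modeled as an areal-fraction-weighted sum of endmember spectra, and the fractions are recovered by least squares. The endmember values below are made up for illustration; real work would use laboratory emissivity spectra and constrained unmixing.

```python
import numpy as np

# Hypothetical endmember spectra (rows) on a common wavelength grid; in
# practice these would be laboratory spectra of e.g. alpha-quartz and
# moganite.
E = np.array([
    [0.95, 0.80, 0.60, 0.85],   # endmember 1
    [0.90, 0.70, 0.75, 0.88],   # endmember 2
])

true_f = np.array([0.7, 0.3])   # areal fractions, summing to 1
mixed = true_f @ E              # linear mixing assumption

# Recover the fractions by unconstrained least squares
# (sum-to-one / nonnegativity constraints are used in practice).
f, *_ = np.linalg.lstsq(E.T, mixed, rcond=None)
```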

  3. 32 CFR 651.27 - Programmatic NEPA analyses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 4 2011-07-01 2011-07-01 false Programmatic NEPA analyses. 651.27 Section 651...) ENVIRONMENTAL QUALITY ENVIRONMENTAL ANALYSIS OF ARMY ACTIONS (AR 200-2) Records and Documents § 651.27 Programmatic NEPA analyses. These analyses, in the form of an EA or EIS, are useful to examine impacts...

  4. 76 FR 63913 - Commercial Building Workforce Job/Task Analyses

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-14

    ... of Energy Efficiency and Renewable Energy Commercial Building Workforce Job/Task Analyses AGENCY... and Renewable Energy (EERE) announces the availability of a draft set of six Job/Task Analyses... this notice, DOE also requests comments on these documents. Job/Task Analyses were developed for...

  5. 9 CFR 590.580 - Laboratory tests and analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Laboratory tests and analyses. 590.580... § 590.580 Laboratory tests and analyses. The official plant, at their expense, shall make tests and analyses to determine compliance with the Act and the regulations. (a) Samples shall be drawn from...

  6. 24 CFR 81.65 - Other information and analyses.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Other information and analyses. 81... information and analyses. When deemed appropriate and requested in writing, on a case by-case basis, by the... conduct additional analyses concerning any such report. A GSE shall submit additional reports or...

  7. Image Editing Via Searching Source Image

    NASA Astrophysics Data System (ADS)

    Yu, Han; Deng, Liang-Jian

    Image editing has important applications by changing the image texture, illumination, target location, etc. As an important application of the Poisson equation, Poisson image editing processes images on the gradient domain and has been applied to seamless cloning, selection editing, image denoising, etc. In this paper, we present a new application of Poisson image editing, which is based on searching the source image. The main feature of the new application is that all modifying information comes from the source image. Experimental results show that the proposed application performs well.
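
    As a rough illustration of gradient-domain editing (not the authors' searching-based method), the sketch below solves the discrete Poisson equation by Jacobi iteration: inside the masked region the result takes its Laplacian from the source image, with boundary values supplied by the destination image.

```python
import numpy as np

def poisson_clone(src, dst, mask, iters=2000):
    """Gradient-domain (Poisson) cloning sketch: inside `mask`, solve
    lap(u) = lap(src) with u = dst on the boundary, by Jacobi iteration.
    `src` and `dst` are 2-D float images of equal shape; `mask` is a
    boolean array that must not touch the image border."""
    u = dst.astype(float).copy()
    # discrete Laplacian of the source (guidance field divergence)
    lap = (np.roll(src, 1, 0) + np.roll(src, -1, 0) +
           np.roll(src, 1, 1) + np.roll(src, -1, 1) - 4.0 * src)
    for _ in range(iters):
        nbr = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u[mask] = (nbr[mask] - lap[mask]) / 4.0
    return u
```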

  8. Non-destructive infrared analyses: a method for provenance analyses of sandstones

    NASA Astrophysics Data System (ADS)

    Bowitz, Jörg; Ehling, Angela

    2008-12-01

    Infrared spectroscopy (IR spectroscopy) is commonly applied in the laboratory for mineral analyses in addition to XRD. Because such technical efforts are time- and cost-consuming, we present an infrared-based mobile method for non-destructive mineral and provenance analyses of sandstones. IR spectroscopy is based on activating chemical bonds. When a mineral mixture is irradiated, specific bonds are excited to vibrate depending on the bond energy (resonance vibration). Accordingly, the energy of the IR spectrum is reduced, thereby generating an absorption spectrum. The positions of the absorption maxima within the spectral region indicate the type of the bonds and in many cases identify minerals containing these bonds. The non-destructive reflection spectroscopy operates in the near infrared region (NIR) and can detect all common clay minerals as well as sulfates, hydroxides and carbonates. The spectra produced have been interpreted by computer using digital mineral libraries that have been especially compiled for sandstones. The comparison of all results with XRD, RFA and interpretations of thin sections impressively demonstrates the accuracy and reliability of this method. Not only are different minerals detectable, but differently ordered kaolinites and varieties of illites can also be identified by the shape and size of the absorption bands. Clay minerals in particular, together with their varieties and relative contents, form the characteristic spectra of sandstones. Other components such as limonite, hematite and amorphous silica also influence the spectra. Sandstones similar in colour and texture can often be identified by their characteristic reflectance spectra. Reference libraries with more than 60 spectra of important German sandstones have been created to enable entirely computerized interpretations and identifications of these dimension stones. 
The analysis of infrared spectroscopy results is demonstrated with examples of different sandstones

  9. NASA Earth Exchange (NEX) Supporting Analyses for National Climate Assessments

    NASA Astrophysics Data System (ADS)

    Nemani, R. R.; Thrasher, B. L.; Wang, W.; Lee, T. J.; Melton, F. S.; Dungan, J. L.; Michaelis, A.

    2015-12-01

    The NASA Earth Exchange (NEX) is a collaborative computing platform that has been developed with the objective of bringing scientists together with the software tools, massive global datasets, and supercomputing resources necessary to accelerate research in Earth systems science and global change. NEX supports several research projects that are closely related with the National Climate Assessment including the generation of high-resolution climate projections, identification of trends and extremes in climate variables and the evaluation of their impacts on regional carbon/water cycles and biodiversity, the development of land-use management and adaptation strategies for climate-change scenarios, and even the exploration of climate mitigation through geo-engineering. Scientists also use the large collection of satellite data on NEX to conduct research on quantifying spatial and temporal changes in land surface processes in response to climate and land-cover-land-use changes. Researchers, leveraging NEX's massive compute/storage resources, have used statistical techniques to downscale the coarse-resolution CMIP5 projections to fulfill the demands of the community for a wide range of climate change impact analyses. The DCP-30 (Downscaled Climate Projections at 30 arcsecond) for the conterminous US at monthly, ~1km resolution and the GDDP (Global Daily Downscaled Projections) for the entire world at daily, 25km resolution are now widely used in climate research and applications, as well as for communicating climate change. In order to serve a broader community, the NEX team, in collaboration with Amazon, Inc., created the OpenNEX platform. OpenNEX provides ready access to NEX data holdings, including the NEX-DCP30 and GDDP datasets along with a number of pertinent analysis tools and workflows on the AWS infrastructure in the form of publicly available, self-contained, fully functional Amazon Machine Images (AMIs) for anyone interested in global climate change.
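
    One common building block of the statistical downscaling mentioned above is empirical quantile mapping, which removes systematic model bias relative to observations. The sketch below is a simplified illustration, not the actual NEX-DCP30/GDDP pipeline; all data are synthetic.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping (bias-correction sketch): map each
    future model value to the observed value at the same quantile of
    the historical model distribution."""
    qs = np.linspace(0.0, 1.0, 101)
    mq = np.quantile(model_hist, qs)   # model climatology quantiles
    oq = np.quantile(obs_hist, qs)     # observed climatology quantiles
    # locate each future value in the model distribution, then read
    # off the corresponding observed value
    return np.interp(model_fut, mq, oq)
```

With a constant model bias, the mapping simply subtracts that bias; with distribution-dependent biases it corrects each quantile separately.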

  10. Spectral Analyses of the Interactions of Giant Vortices on Jupiter

    NASA Astrophysics Data System (ADS)

    Yanamandra-Fisher, P. A.; Simon-Miller, A. A.; Orton, G. S.

    2010-12-01

    The merger of the three white ovals into Oval BA in 2000, and its subsequent color change from white to red in 2005, appear to be loosely correlated to periodic interactions with the Great Red Spot (GRS). The interactions of these two largest vortices in the solar system - the Great Red Spot (GRS) and Oval BA - on Jupiter occur once every 18 months. Our data were acquired primarily at the NASA Infrared Telescope Facility (IRTF) with the 1- to 5-micron imager NSFCAM and its successor, NSFCAM2. We chose four canonical wavelengths that characterize the vertical structure of Jupiter’s atmosphere. Spectral decomposition of the geometrically-registered data identifies several physical changes on the planet: global cloudiness increases during the interaction; the albedo of discrete clouds at different altitudes varies; and there appears to be either enhancement or depletion of ammonia vertically in the atmosphere, especially after the color change in Oval BA in 2005. Analyses of the post-color change interactions of GRS and Oval BA in 2006 and 2008 indicate changes in thermal and albedo fields, with enhancement of ammonia in the perturbed region (Otto, Yanamandra-Fisher and Simon-Miller, BAAS, 2009; Fletcher et al., Icarus, 208, Issue 1). We shall present results of the current 2010 interaction, and its comparison to the previous interactions of the GRS - Oval BA. Our goals are to establish common attributes of the interactions in terms of physical changes in the local meteorology for both the unperturbed and perturbed states of the atmosphere, while differences in the interactions may highlight the temporal changes in the global atmospheric state of Jupiter.

  11. Image Processor

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Texas Instruments' Programmable Remapper is a research tool used to determine how best to utilize the still-usable part of a patient's visual field by mapping manipulated imagery onto it. It is an offshoot of a NASA program for speeding up and improving the accuracy of pattern recognition in video imagery. The Remapper enables an image to be "pushed around" so more of it falls into the functional portions of the retina of a low-vision person. It works at video rates, and researchers hope to significantly reduce its size and cost, creating a wearable prosthesis for visually impaired people.

  12. Multiscale image restoration for photon imaging systems

    NASA Astrophysics Data System (ADS)

    Jammal, Ghada; Bijaoui, Albert

    1999-05-01

    Nuclear medicine imaging is a widely used commercial imaging modality which relies on photon detection as the basis of image formation. As a diagnosis tool, it is unique in that it documents organ function and structure. It is a way to gather information that may be otherwise unavailable or require surgery. Practical limitations on imaging time and the amount of activity that can be administered safely to patients are serious impediments to substantial further improvements in nuclear medicine imaging. Hence, improvements of image quality via optimized image processing represent a significant opportunity to advance the state of the art in this field. We present in this paper a new multiscale image restoration method that is concerned with eliminating one of the major sources of error in nuclear medicine imaging, namely Poisson noise, which degrades images in both quantitative and qualitative senses and hinders image analysis and interpretation. The paper then quantitatively evaluates the performances of the proposed method.
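
    A standard first step when treating Poisson noise of this kind (related to, but not identical with, the multiscale method the abstract describes) is a variance-stabilizing transform, after which Gaussian-noise denoisers apply:

```python
import numpy as np

def anscombe(x):
    """Anscombe variance-stabilizing transform for Poisson counts: the
    output has approximately unit variance at moderate intensities."""
    return 2.0 * np.sqrt(np.asarray(x, float) + 3.0 / 8.0)

def inverse_anscombe(y):
    """Simple algebraic inverse (unbiased inverses exist but are longer)."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

rng = np.random.default_rng(0)
counts = rng.poisson(50.0, size=100_000)   # synthetic photon counts
stabilized = anscombe(counts)
# the variance after stabilization is close to 1, independent of intensity
```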

  13. Using a Log Analyser to Assist Research into Haptic Technology

    NASA Astrophysics Data System (ADS)

    Jónsson, Fannar Freyr; Hvannberg, Ebba Þóra

    Usability evaluations collect subjective and objective measures. Examples of the latter are time to complete a task. The paper describes use cases of a log analyser for haptic feedback. The log analyser reads a log file and extracts information such as the time of each practice and assessment session, analyses whether the user goes off curve, and measures the force applied. A case study using the analyser is performed with a PHANToM haptic learning environment application that is used to teach young visually impaired students the subject of polynomials. The paper answers six questions to illustrate further use cases of the log analyser.

  14. Digital image processing.

    PubMed

    Seeram, Euclid

    2004-01-01

    Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists. PMID:15352557

  15. Image Ambiguity and Fluency

    PubMed Central

    Jakesch, Martina; Leder, Helmut; Forster, Michael

    2013-01-01

    Ambiguity is often associated with negative affective responses, and enjoying ambiguity seems restricted to only a few situations, such as experiencing art. Nevertheless, theories of judgment formation, especially the “processing fluency account”, suggest that easy-to-process (non-ambiguous) stimuli are processed faster and are therefore preferred to (ambiguous) stimuli, which are hard to process. In a series of six experiments, we investigated these contrasting approaches by manipulating fluency (presentation duration: 10 ms, 50 ms, 100 ms, 500 ms, 1000 ms) and testing effects of ambiguity (ambiguous versus non-ambiguous pictures of paintings) on classification performance (Part A; speed and accuracy) and aesthetic appreciation (Part B; liking and interest). As indicated by signal detection analyses, classification accuracy increased with presentation duration (Exp. 1a), but we found no effects of ambiguity on classification speed (Exp. 1b). Fifty percent of the participants were able to successfully classify ambiguous content at a presentation duration of 100 ms, and at 500 ms even 75% performed above chance level. Ambiguous artworks were found more interesting (in the 50 ms to 1000 ms conditions) and were preferred over non-ambiguous stimuli at 500 ms and 1000 ms (Exp. 2a - 2c, 3). Importantly, ambiguous images were nonetheless rated significantly harder to process than non-ambiguous images. These results suggest that ambiguity is an essential ingredient in art appreciation even though or maybe because it is harder to process. PMID:24040172
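
    The signal detection analyses mentioned above rest on the sensitivity index d′, the difference between the z-transformed hit and false-alarm rates. A minimal sketch (the rates below are hypothetical and assumed already corrected away from 0 and 1):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf   # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# chance performance (hit rate equals false-alarm rate) gives d' = 0
assert d_prime(0.5, 0.5) == 0.0
```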

  16. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gas Filter (OGF) Content Analyses

    SciTech Connect

    Jantzen, Carol M.; Missimer, David M.; Guenther, Chris P.; Shekhawat, Dushyant; VanEssendelft, Dirk T.; Means, Nicholas C.

    2015-04-23

    in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important as the ash adds to the DMR and other vessel products which affect the final waste product mass and composition. The amount and composition of the ash also affects the reaction kinetics. Thus ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and maybe silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed but the impact from the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within the sack in order to determine within-bag variability and between-bag variability of the coal. These analyses were also compared to the vendor’s composite analyses and to the coal specification. These analyses were also compared to historic data on Bestac coal analyses that had been performed at Hazen Research Inc. (HRI) between 2004 and 2011.

  17. Scrotal imaging

    PubMed Central

    Studniarek, Michał; Modzelewska, Elza

    2015-01-01

    Pathological lesions within the scrotum are relatively rare in imaging except for ultrasonography. The diseases presented in the paper are usually found in men at the age of 15–45, i.e. men of reproductive age, and therefore they are worth attention. Scrotal ultrasound in infertile individuals should be conducted on a routine basis owing to the fact that pathological scrotal lesions are frequently detected in this population. Malignant testicular cancers are the most common neoplasms in men at the age of 20–40. Ultrasound imaging is the method of choice characterized by the sensitivity of nearly 100% in the differentiation between intratesticular and extratesticular lesions. In the case of doubtful lesions that are not classified for intra-operative verification, nuclear magnetic resonance is applied. Computed tomography, however, is performed to monitor the progression of a neoplastic disease, in pelvic trauma with scrotal injury as well as in rare cases of scrotal hernias involving the ureters or a fragment of the urinary bladder. PMID:26674847

  18. Eos visible imagers

    NASA Technical Reports Server (NTRS)

    Barnes, W. L.

    1990-01-01

    Some of the proposed Earth Observing System (Eos) optical imagers are examined. These imagers include: moderate resolution imaging spectrometer (MODIS); geoscience laser ranging system (GLRS); high resolution imaging spectrometer (HIRIS); the intermediate thermal infrared spectrometer (ITIR); multi-angle imaging spectrometer (MISR); earth observing scanning polarimeter (EOSP); and the lightning imaging sensor (LIS).

  19. Large area CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Turchetta, R.; Guerrini, N.; Sedgwick, I.

    2011-01-01

    CMOS image sensors, also known as CMOS Active Pixel Sensors (APS) or Monolithic Active Pixel Sensors (MAPS), are today the dominant imaging devices. They are omnipresent in our daily life, as image sensors in cellular phones, webcams, digital cameras, ... In these applications, the pixels can be very small, in the micron range, and the sensors themselves tend to be limited in size. However, many scientific applications, like particle or X-ray detection, require large formats, often with large pixels, as well as other specific performance requirements, like low noise, radiation hardness or very fast readout. The sensors are also required to be sensitive to a broad spectrum of radiation: photons from the silicon cut-off in the IR down to UV and X- and gamma-rays through the visible spectrum, as well as charged particles. Meeting this requirement calls for modifications to the substrate to provide optimized sensitivity. This paper will review existing CMOS image sensors, whose size can be as large as a single CMOS wafer, and analyse the technical requirements and specific challenges of large format CMOS image sensors.

  20. Motivation and synthesis of the FIAC experiment: Reproducibility of fMRI results across expert analyses.

    PubMed

    Poline, Jean-Baptiste; Strother, Stephen C; Dehaene-Lambertz, Ghislaine; Egan, Gary F; Lancaster, Jack L

    2006-05-01

    The Functional Imaging Analysis Contest (FIAC) culminated in the FIAC Workshop held at the 11th Annual Meeting of the Organization for Human Brain Mapping in Toronto in 2005. This special issue summarizes various analyses used by contestants with a single functional magnetic resonance imaging (fMRI) study, a cortical-language study using sentence repetition. The results from the cognitive neuroscientists who developed the test-base language study, and report their data analysis, are complemented by expert analyses of the same test-base data by most of the major groups actively developing fMRI software packages. Analyses include many variants of the general linear model (GLM), cutting-edge spatial- and temporal-wavelets, permutation-based, and ICA approaches. A number of authors also include surface-based approaches. Several articles describe the important emerging areas of diagnostics for GLM analysis, multivariate predictive modeling, and functional connectivity analysis. While the FIAC did not achieve all of its goals, it helped identify new activation regions in the test-base data, and more important, through this special issue it illustrates the significant methods-driven variability that potentially exists in the literature. Variable results from different methods reported here should provide a cautionary note and motivate the Human Brain Mapping community to explore more thoroughly the methodologies they use for analyzing fMRI data. PMID:16583364

  1. Survey of the methods and reporting practices in published meta-analyses of test performance: 1987 to 2009.

    PubMed

    Dahabreh, Issa J; Chung, Mei; Kitsios, Georgios D; Terasawa, Teruhiko; Raman, Gowri; Tatsioni, Athina; Tobar, Annette; Lau, Joseph; Trikalinos, Thomas A; Schmid, Christopher H

    2013-09-01

    We performed a survey of meta-analyses of test performance to describe the evolution in their methods and reporting. Studies were identified through MEDLINE (1966-2009), reference lists, and relevant reviews. We extracted information on clinical topics, literature review methods, quality assessment, and statistical analyses. We reviewed 760 publications reporting meta-analyses of test performance, published between 1987 and 2009. Eligible reviews included a median of 18 primary studies that were used in quantitative analyses. The most common clinical areas were cardiovascular disease (21%) and oncology (25%); the most common test categories were imaging (44%) and biomarker tests (28%). Assessment of verification and spectrum bias, blinding, prospective study design, and consecutive patient recruitment became more common over time (p < 0.001 comparing reviews published through 2004 vs 2005 onwards). These changes coincided with the increasing use of checklists to guide assessment of methodological quality. Heterogeneity tests were used in 58% of meta-analyses; subgroup or regression analyses were used in 57%. Random effects models were employed in 57% of meta-analyses (38% through 2004 vs 72% from 2005 onwards; p < 0.001). Use of bivariate models of sensitivity and specificity increased in recent years (21% in 2008-2009 vs 7% in earlier years; p < 0.001). Methods employed in meta-analyses of test performance have improved with the introduction of quality assessment checklists and the development of more sophisticated statistical methods. PMID:26053844
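
    The random-effects models reported in 57% of the surveyed meta-analyses are commonly fitted with the DerSimonian-Laird estimator. A minimal sketch (study effects and within-study variances below are hypothetical):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling: estimate the
    between-study variance tau^2 from Cochran's Q, then pool with
    inverse-variance weights that include tau^2."""
    e, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    fixed = np.sum(w * e) / np.sum(w)
    q = np.sum(w * (e - fixed) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(e) - 1)) / c)       # between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    return np.sum(w_star * e) / np.sum(w_star), tau2
```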

  2. Speckle imaging algorithms for planetary imaging

    SciTech Connect

    Johansson, E.

    1994-11-15

    I will discuss the speckle imaging algorithms used to process images of the impact sites of the collision of comet Shoemaker-Levy 9 with Jupiter. The algorithms use a phase retrieval process based on the average bispectrum of the speckle image data. High resolution images are produced by estimating the Fourier magnitude and Fourier phase of the image separately, then combining them and inverse transforming to achieve the final result. I will show raw speckle image data and high-resolution image reconstructions from our recent experiment at Lick Observatory.
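
    The final recombination step described above (separately estimated Fourier magnitude and phase merged and inverse-transformed) can be sketched directly; the bispectrum-based estimation of the two quantities themselves is omitted here.

```python
import numpy as np

def combine_mag_phase(mag, phase):
    """Merge a Fourier magnitude and phase estimate and inverse-transform
    to obtain the reconstructed image (real part; the imaginary part is
    numerical residue for consistent estimates)."""
    return np.real(np.fft.ifft2(mag * np.exp(1j * phase)))

# Round trip: an image's own magnitude and phase reproduce the image.
img = np.outer(np.hanning(8), np.hanning(8))
F = np.fft.fft2(img)
recon = combine_mag_phase(np.abs(F), np.angle(F))
```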

  3. Infrared scanning images: An archeological application

    USGS Publications Warehouse

    Schaber, G.G.; Gumerman, G.J.

    1969-01-01

    Aerial infrared scanner images of an area near the Little Colorado River in north-central Arizona disclosed the existence of scattered clusters of parallel linear features in the ashfall area of Sunset Crater. The features are not obvious in conventional aerial photographs, and only one cluster could be recognized on the ground. Soil and pollen analyses reveal that they are prehistoric agricultural plots.

  4. Metric Learning to Enhance Hyperspectral Image Segmentation

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Castano, Rebecca; Bue, Brian; Gilmore, Martha S.

    2013-01-01

    Unsupervised hyperspectral image segmentation can reveal spatial trends that show the physical structure of the scene to an analyst. They highlight borders and reveal areas of homogeneity and change. Segmentations are independently helpful for object recognition, and assist with automated production of symbolic maps. Additionally, a good segmentation can dramatically reduce the number of effective spectra in an image, enabling analyses that would otherwise be computationally prohibitive. Specifically, using an over-segmentation of the image instead of individual pixels can reduce noise and potentially improve the results of statistical post-analysis. In this innovation, a metric learning approach is presented to improve the performance of unsupervised hyperspectral image segmentation. The prototype demonstrations attempt a superpixel segmentation in which the image is conservatively over-segmented; that is, the single surface features may be split into multiple segments, but each individual segment, or superpixel, is ensured to have homogeneous mineralogy.
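
    The data-reduction step the abstract describes (replacing individual pixel spectra with one mean spectrum per superpixel) can be sketched as follows; the cube and label map here are hypothetical stand-ins for a real segmentation.

```python
import numpy as np

def superpixel_spectra(cube, labels):
    """Reduce a hyperspectral cube (H x W x B) to one mean spectrum per
    superpixel, given an integer label map (H x W). Averaging within a
    homogeneous segment reduces per-pixel noise."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    lab = labels.ravel()
    counts = np.bincount(lab)
    out = np.zeros((lab.max() + 1, b))
    for band in range(b):
        out[:, band] = np.bincount(lab, weights=flat[:, band]) / counts
    return out
```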

  5. Nonspectroscopic imaging for quantitative chlorophyll sensing

    NASA Astrophysics Data System (ADS)

    Kim, Taehoon; Kim, Jeong-Im; Visbal-Onufrak, Michelle A.; Chapple, Clint; Kim, Young L.

    2016-01-01

    Nondestructive imaging of physiological changes in plants has been intensively used as an invaluable tool for visualizing heterogeneous responses to various types of abiotic and biotic stress. However, conventional approaches often have intrinsic limitations for quantitative analyses, requiring bulky and expensive optical instruments for capturing full spectral information. We report a spectrometerless (or spectrometer-free) reflectance imaging method that allows for nondestructive and quantitative chlorophyll imaging in individual leaves in situ in a handheld device format. The combination of a handheld-type imaging system and a hyperspectral reconstruction algorithm from an RGB camera offers simple instrumentation and operation while avoiding the use of an imaging spectrograph or tunable color filter. This platform could potentially be integrated into a compact, inexpensive, and portable system, while being of great value in high-throughput phenotyping facilities and laboratory settings.
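
    The simplest form of the hyperspectral-reconstruction idea above is a linear map fitted from RGB triples to full spectra; real systems use regularized or nonlinear models and careful calibration, and the training data below are synthetic.

```python
import numpy as np

def fit_rgb_to_spectra(rgb_train, spectra_train):
    """Least-squares affine map from RGB (N x 3) to spectra (N x B)."""
    A = np.hstack([rgb_train, np.ones((len(rgb_train), 1))])  # affine term
    W, *_ = np.linalg.lstsq(A, spectra_train, rcond=None)
    return W

def predict_spectra(W, rgb):
    """Apply the fitted map to new RGB measurements."""
    A = np.hstack([rgb, np.ones((len(rgb), 1))])
    return A @ W
```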

  6. Imaging genetics and psychiatric disorders.

    PubMed

    Hashimoto, R; Ohi, K; Yamamori, H; Yasuda, Y; Fujimoto, M; Umeda-Yano, S; Watanabe, Y; Fukunaga, M; Takeda, M

    2015-01-01

    Imaging genetics is an integrated research method that uses neuroimaging and genetics to assess the impact of genetic variation on brain function and structure. Imaging genetics is both a tool for the discovery of risk genes for psychiatric disorders and a strategy for characterizing the neural systems affected by risk gene variants to elucidate quantitative and mechanistic aspects of brain function implicated in psychiatric disease. Early studies of imaging genetics included association analyses between brain morphology and single nucleotide polymorphisms whose function is well known, such as catechol-O-methyltransferase (COMT) and brain-derived neurotrophic factor (BDNF). GWAS of psychiatric disorders have identified genes with unknown functions, such as ZNF804A, and imaging genetics has been used to investigate clues of the biological function of these genes. The difficulty in replicating the findings of studies with small sample sizes has motivated the creation of large-scale collaborative consortiums, such as ENIGMA, CHARGE and IMAGEN, to collect thousands of images. In a genome-wide association study, the ENIGMA consortium successfully identified common variants in the genome associated with hippocampal volume at 12q24, and the CHARGE consortium replicated this finding. The new era of imaging genetics has just begun, and the next challenge we face is the discovery of small effect size signals from large data sets obtained from genetics and neuroimaging. New methods and technologies for data reduction with appropriate statistical thresholds, such as polygenic analysis and parallel independent component analysis (ICA), are warranted. Future advances in imaging genetics will aid in the discovery of genes and provide mechanistic insight into psychiatric disorders. PMID:25732148

  7. Imaging Genetics and Psychiatric Disorders

    PubMed Central

    Hashimoto, R; Ohi, K; Yamamori, H; Yasuda, Y; Fujimoto, M; Umeda-Yano, S; Watanabe, Y; Fukunaga, M; Takeda, M

    2015-01-01

    Imaging genetics is an integrated research method that uses neuroimaging and genetics to assess the impact of genetic variation on brain function and structure. Imaging genetics is both a tool for the discovery of risk genes for psychiatric disorders and a strategy for characterizing the neural systems affected by risk gene variants to elucidate quantitative and mechanistic aspects of brain function implicated in psychiatric disease. Early studies of imaging genetics included association analyses between brain morphology and single nucleotide polymorphisms whose function is well known, such as catechol-O-methyltransferase (COMT) and brain-derived neurotrophic factor (BDNF). GWAS of psychiatric disorders have identified genes with unknown functions, such as ZNF804A, and imaging genetics has been used to investigate clues of the biological function of these genes. The difficulty in replicating the findings of studies with small sample sizes has motivated the creation of large-scale collaborative consortiums, such as ENIGMA, CHARGE and IMAGEN, to collect thousands of images. In a genome-wide association study, the ENIGMA consortium successfully identified common variants in the genome associated with hippocampal volume at 12q24, and the CHARGE consortium replicated this finding. The new era of imaging genetics has just begun, and the next challenge we face is the discovery of small effect size signals from large data sets obtained from genetics and neuroimaging. New methods and technologies for data reduction with appropriate statistical thresholds, such as polygenic analysis and parallel independent component analysis (ICA), are warranted. Future advances in imaging genetics will aid in the discovery of genes and provide mechanistic insight into psychiatric disorders. PMID:25732148

  8. Image catalogs.

    PubMed

    Gomoll, Andreas H; Thornhill, Thomas S

    2004-04-01

    The advent of digital photography and radiography allows documentation of interesting clinical findings with unprecedented ease, and many orthopaedic surgeons have taken extensive advantage of this opportunity to create large digital libraries of clinical results. However, this leaves surgeons with a rapidly increasing volume of data to store and organize; therefore, a system for archiving, locating, and managing images, radiographs, and digital slide presentations has become a crucial need in most orthopaedic groups and practices. However, many surgical groups and practices are not familiar with the computer technology available to initiate such systems. In this review, we discuss several software solutions currently on the market to address the specific needs of orthopaedic surgeons, and as a practical example, discuss a system that is in place in the Department of Orthopaedic Surgery at our institution. Overall, depending on the individual circumstances of each institution, there are various options that meet different technologic and financial requirements. PMID:15123922

  9. Medical Imaging System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The MD Image System, a true-color image processing system that serves as a diagnostic aid and tool for storage and distribution of images, was developed by Medical Image Management Systems, Huntsville, AL, as a "spinoff from a spinoff." The original spinoff, Geostar 8800, developed by Crystal Image Technologies, Huntsville, incorporates advanced UNIX versions of ELAS (developed by NASA's Earth Resources Laboratory for analysis of Landsat images) for general purpose image processing. The MD Image System is an application of this technology to a medical system that aids in the diagnosis of cancer, and can accept, store and analyze images from other sources such as Magnetic Resonance Imaging.

  10. PRELIMINARY STUDIES OF VIDEO IMAGES OF SMOKE DISPERSION IN THE NEAR WAKE OF A MODEL BUILDING

    EPA Science Inventory

A series of analyses of video images of smoke in a wind tunnel study of dispersion in the near wake of a model building is presented. The analyses provide information on both the instantaneous and the time-averaged patterns of dispersion. Since the images represent vertically-integ...

  11. The Influence of University Image on Student Behaviour

    ERIC Educational Resources Information Center

    Alves, Helena; Raposo, Mario

    2010-01-01

    Purpose: The purpose of this paper is to analyse the influence of image on student satisfaction and loyalty. Design/methodology/approach: In order to accomplish the objectives proposed, a model reflecting the influence of image on student satisfaction and loyalty is applied. The model is tested through use of structural equations and the final…

  12. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab and enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. The main window of the program during dynamic analysis of the foot thermal image. PMID:26556680

  13. An economic critique of several EPA regulatory analyses

    SciTech Connect

VanderHart, P.G.

    1993-01-01

    The economic analyses of regulations performed for the Environmental Protection Agency (EPA) suffer from a number of deficiencies. They rely on obsolete data, assume very simple responses from those regulated, and violate some of the basic tenets of benefit-cost analysis. This paper focuses on several analyses performed on the regulation of solvent wastes. In most cases the deficiencies lead the analyses to overstate the costs of the regulations, and therefore indicate that the regulations are less worthy than they truly are.

  14. Methods and Procedures for Shielding Analyses for the SNS

    SciTech Connect

    Gallmeier, Franz X.; Iverson, Erik B.; Remec, Igor; Lu, Wei; Popova, Irina

    2014-01-01

In order to provide radiologically safe Spallation Neutron Source operation, shielding analyses are performed according to Oak Ridge National Laboratory internal regulations and to comply with the Code of Federal Regulations. An overview of on-going shielding work for the accelerator facility and neutron beam lines is presented, along with the methods used for the analyses and the associated procedures and regulations.

  15. Thermal analyses for quality control of plastics, ceramics, and explosives

    SciTech Connect

    Brown, C.R.; Garrod, M.J.; Whitaker, R.B.

    1990-01-01

    Thermal analyses are performed for production quality control (q.c.) and for surveillance at Mound on plastic, ceramic, explosive and pyrotechnic materials. For the weapons surveillance program, weapon components are disassembled after varying times in the field; thermal and other analyses are then performed on the component materials. The types of thermal analyses done include: differential scanning calorimetry (DSC), differential thermal analysis (DTA), thermogravimetry (TG), thermomechanical analysis (TMA), and high temperature TG/DTA. 5 refs., 4 figs.

  16. Towards Efficiency of Oblique Images Orientation

    NASA Astrophysics Data System (ADS)

    Ostrowski, W.; Bakuła, K.

    2016-03-01

Many papers on both theoretical aspects of bundle adjustment of oblique images and new operators for detecting tie points on oblique images have been written. However, only a few of the achievements presented in the literature have been practically implemented in commercial software. In consequence, aerial triangulation is often performed either for nadir images obtained simultaneously with oblique photos, or bundle adjustment is carried out separately for images captured in different directions. The aim of this study was to investigate how the orientation of oblique images can be carried out effectively in commercial software based on structure-from-motion technology. The main objective of the research was to evaluate the impact of the orientation strategy on both the duration of the process and the accuracy of photogrammetric 3D products. Two very popular software packages, Pix4D and Agisoft Photoscan, were tested, and two approaches for image blocks were considered: the first based only on oblique images collected in four directions, and the second including nadir images. In this study, blocks for three test areas were analysed. Oblique images were collected with medium-format cameras in a Maltese cross configuration with registration of GNSS and INS data. Both check points and digital surface models from airborne laser scanning were used as reference.

  17. Medical imaging.

    PubMed Central

    Kreel, L.

    1991-01-01

There is now a wide choice of medical imaging to show both focal and diffuse pathologies in various organs. Conventional radiology with plain films, fluoroscopy and contrast medium have many advantages, being readily available with low-cost apparatus and a familiarity that almost leads to contempt. The use of plain films in chest disease and in trauma does not need emphasizing, yet there are still too many occasions when the answer obtainable from a plain radiograph has not been available. The film may have been mislaid, or the examination was not requested, or the radiograph had been misinterpreted. The converse is also quite common. Examinations are performed that add nothing to patient management, such as skull films when CT will in any case be requested or views of the internal auditory meatus and heel pad thickness in acromegaly, to quote some examples. Other issues are more complicated. Should the patient who clinically has gall-bladder disease have more than a plain film that shows gall-stones? If the answer is yes, then why request a plain film if sonography will in any case be required to 'exclude' other pathologies especially of the liver or pancreas? But then should cholecystography, CT or scintigraphy be added for confirmation? Quite clearly there will be individual circumstances to indicate further imaging after sonography but in the vast majority of patients little or no extra information will be added. Statistics on accuracy and specificity will, in the case of gall-bladder pathology, vary widely if adenomyomatosis is considered by some to be a cause of symptoms or if sonographic examinations 'after fatty meals' are performed. The arguments for or against routine contrast urography rather than sonography are similar but the possibility of contrast reactions and the need to limit ionizing radiation must be borne in mind. These diagnostic strategies are also being influenced by their cost and availability; purely pragmatic considerations are not

  18. SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES

    SciTech Connect

    Flach, G.

    2014-10-28

PORFLOW related analyses supporting a Sensitivity Analysis for Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded in a piecewise manner from the top and bottom simultaneously. The current analyses employ a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses which may be useful in determining the distribution of Tc-99 in the various SDUs throughout time and in determining flow balances for the SDUs.

  19. Analysing harmonic motions with an iPhone’s magnetometer

    NASA Astrophysics Data System (ADS)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15–20 Hz.
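The workflow described above (record the magnetometer trace, then establish the equation of the harmonic motion) can be sketched in Python. The synthetic trace and all parameter values below are illustrative stand-ins for data exported from the Sensor Kinetics app, not the authors' actual measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def harmonic(t, amplitude, frequency, phase, offset):
    """Harmonic model B(t) = A*sin(2*pi*f*t + phi) + B0."""
    return amplitude * np.sin(2 * np.pi * frequency * t + phase) + offset

# Synthetic magnetometer trace: a 2 Hz oscillation around a constant
# background field, plus sensor noise (stand-in for exported app data).
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 500)                       # ~100 samples/s for 5 s
b = harmonic(t, 30.0, 2.0, 0.4, 45.0) + rng.normal(0.0, 0.5, t.size)

# Seed the frequency from the FFT peak so the nonlinear fit converges.
spectrum = np.abs(np.fft.rfft(b - b.mean()))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
f_seed = freqs[spectrum.argmax()]

p0 = [np.sqrt(2) * b.std(), f_seed, 0.0, b.mean()]
popt, _ = curve_fit(harmonic, t, b, p0=p0)
amplitude, frequency, phase, offset = popt
print(f"fitted frequency: {abs(frequency):.2f} Hz")
```

Seeding the frequency from the FFT peak keeps the fit out of local minima, which matters for the short, noisy traces a phone typically records.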

  20. Optical Image Contrast Reversal Using Bacteriorhodopsin Films

    NASA Astrophysics Data System (ADS)

    Wang, Ying-Li; Yao, Bao-Li; Menke, Neimule; Zheng, Yuan; Lei, Ming; Chen, Guo-Fu

    2005-05-01

    The implementation of image contrast reversal by using a photochromic material of Bacteriorhodopsin (BR) films is demonstrated with two methods based on the optical properties of BR. One is based on the absorption difference between the B and M states. Images recorded by green light can be contrast reversed readout by violet light. The other is based on the photoinduced anisotropy of BR when it is excited by linear polarization light. By placing the BR film between two crossed polarizers (i.e. a polarizer and an analyser), the difference of polarization states of the recorded area and the unrecorded area can be detected, and thus different contrast images can be obtained by rotating the polarization axis of the analyser.

  1. Selective image encryption using a spatiotemporal chaotic system.

    PubMed

    Xiang, Tao; Wong, Kwok-wo; Liao, Xiaofeng

    2007-06-01

    A universal selective image encryption algorithm, in which the spatiotemporal chaotic system is utilized, is proposed to encrypt gray-level images. In order to resolve the tradeoff between security and performance, the effectiveness of selective encryption is discussed based on simulation results. The scheme is then extended to encrypt RGB color images. Security analyses for both scenarios show that the proposed schemes achieve high security and efficiency. PMID:17614669

  2. Remote Compositional Analyses of Lunar Olivine-Bearing Lithologies

    NASA Astrophysics Data System (ADS)

    Isaacson, P.; Clark, R. N.; Head, J. W.; Klima, R.; Petro, N. E.; Pieters, C. M.; Staid, M.; Sunshine, J. M.; Taylor, L. A.; Thaisen, K. G.; Tompkins, S.

    2009-12-01

The Moon Mineralogy Mapper (M3) is a guest instrument on Chandrayaan-1, India’s first mission to the Moon. M3 is an imaging spectrometer covering the wavelength range of 430 nm - 3000 nm, and was designed to map the mineralogy of the lunar surface. The high spectral resolution of M3 enables the diagnostic absorption features of lunar minerals to be identified clearly, while the high spatial resolution of M3 allows the identification and mapping of distinct lithologic units. Olivine is an important mineral with which to interpret the petrologic evolution of igneous rocks. The composition of olivine (Mg#) is used to indicate the degree of evolution of the source magma from which a sample crystallized. Visible to near-infrared reflectance spectroscopy is sensitive to the Mg# of olivine, as the diagnostic olivine absorption features shift in response to changing major element (Mg and Fe) abundances. These changes in diagnostic absorption features can be detected by modeling the individual absorption bands with the Modified Gaussian Model (MGM). Spectra of lunar olivines differ from spectra of their terrestrial and synthetic counterparts due to the inclusions of Cr-spinel common to lunar olivines; however, analysis of lunar olivine mineral separates in terrestrial laboratories and modeling of the resulting reflectance spectra have been able to unravel the chromite effects on the olivine spectrum. Previous efforts at remote compositional analysis of lunar olivine have been limited by spectral resolution and coverage or by spatial resolution. However, the spatial and spectral resolution provided by M3 enable olivine composition to be determined remotely in a spatial context. We are in the process of identifying olivine-bearing lithologies on the lunar farside and analyzing the olivine composition with the modified MGM approach. 
Initial compositional analyses have been completed for a crater on the rim of the Moscoviense basin that appears to be largely dominated

  3. Identifying neural correlates of visual consciousness with ALE meta-analyses.

    PubMed

    Bisenius, Sandrine; Trapp, Sabrina; Neumann, Jane; Schroeter, Matthias L

    2015-11-15

    Neural correlates of consciousness (NCC) have been a topic of study for nearly two decades. In functional imaging studies, several regions have been proposed to constitute possible candidates for NCC, but as of yet, no quantitative summary of the literature on NCC has been done. The question whether single (striate or extrastriate) regions or a network consisting of extrastriate areas that project directly to fronto-parietal regions are necessary and sufficient neural correlates for visual consciousness is still highly debated [e.g., Rees et al., 2002, Nat Rev. Neurosci 3, 261-270; Tong, 2003, Nat Rev. Neurosci 4, 219-229]. The aim of this work was to elucidate this issue and give a synopsis of the present state of the art by conducting systematic and quantitative meta-analyses across functional magnetic resonance imaging (fMRI) studies using several standard paradigms for conscious visual perception. In these paradigms, consciousness is operationalized via perceptual changes, while the visual stimulus remains invariant. An activation likelihood estimation (ALE) meta-analysis was performed, representing the best approach for voxel-wise meta-analyses to date. In addition to computing a meta-analysis across all paradigms, separate meta-analyses on bistable perception and masking paradigms were conducted to assess whether these paradigms show common or different NCC. For the overall meta-analysis, we found significant clusters of activation in inferior and middle occipital gyrus; fusiform gyrus; inferior temporal gyrus; caudate nucleus; insula; inferior, middle, and superior frontal gyri; precuneus; as well as in inferior and superior parietal lobules. These results suggest a subcortical-extrastriate-fronto-parietal network rather than a single region that constitutes the necessary NCC. 
The results of our exploratory paradigm-specific meta-analyses suggest that this subcortical-extrastriate-fronto-parietal network might be differentially activated as a function of the

4. 'Big Data' can make a big difference: Applying Big Data to National Scale Change Analyses

    NASA Astrophysics Data System (ADS)

    Mueller, N. R.; Curnow, S.; Melrose, R.; Purss, M. B.; Lewis, A.

    2013-12-01

The traditional method of change detection in remote sensing is based on acquiring a pair of images and conducting a set of analyses to determine what is different between them. The end result is a single change analysis for a single time period. While this may be repeated several times, it is generally a time consuming, often manual process providing a series of snapshots of change. As datasets become larger, and time series analyses become more sophisticated, these traditional methods of analysis are unviable. The Geoscience Australia 'Data Cube' provides a 25-year time series of all Landsat-5 and Landsat-7 data for the entire Australian continent. Each image is orthorectified to a standard set of pixel locations and is fully calibrated to a measure of surface reflectance (the 25m Australian Reflectance Grid [ARG25]). These surface reflectance measurements are directly comparable, between different scenes, and regardless of whether they are sourced from the Landsat-5 TM instrument or the Landsat-7 ETM+. The advantage of the Data Cube environment lies in the ability to apply an algorithm to every pixel across Australia (some 10^13 pixels) in a consistent way, enabling change analysis for every acquired observation. This provides a framework to analyse change through time on a scene to scene basis, and across national-scale areas for the entire duration of the archive. Two examples of applications of the Data Cube are described here: surface water extent mapping across Australia; and vegetation condition mapping across the Murray-Darling Basin, Australia's largest river system. Ongoing water mapping and vegetation condition mapping is required by the Australian government to produce information products for a range of requirements including ecological monitoring and emergency management risk planning. With a 25 year archive of Landsat-5 and Landsat-7 imagery hosted on an efficient High Performance Computing (HPC) environment, high speed analyses of long time
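A per-pixel time-series analysis of the kind described (surface water extent from calibrated reflectance stacks) can be sketched with NumPy. The index choice (McFeeters NDWI), the threshold, and the array layout are illustrative assumptions, not the actual Geoscience Australia algorithm:

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters): (G - NIR)/(G + NIR)."""
    return (green - nir) / (green + nir + 1e-9)

def water_observation_frequency(green_stack, nir_stack, threshold=0.0):
    """Fraction of observations in which each pixel is classed as water.

    green_stack, nir_stack: arrays of shape (time, rows, cols) holding
    surface reflectance, analogous to an ARG25-style product.
    """
    water = ndwi(green_stack, nir_stack) > threshold
    return water.mean(axis=0)

# Toy 3-date, 2x2-pixel stack: pixel (0, 0) is always wet, the rest dry.
green = np.array([[[0.30, 0.10], [0.12, 0.25]]] * 3)
nir   = np.array([[[0.05, 0.30], [0.30, 0.40]]] * 3)
freq = water_observation_frequency(green, nir)
print(freq)  # pixel (0, 0) -> 1.0, all other pixels -> 0.0
```

Because every scene in the cube shares the same pixel grid, the same vectorised operation scales from this toy array to a continental stack without changing the code.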

  5. Coordinated in Situ Analyses of Organic Nanoglobules in the Sutter's Mill Meteorite

    NASA Technical Reports Server (NTRS)

    Nakamura-Messenger, K.; Messenger, S.; Keller, L. P.; Clemett, S. J.; Nguyen, A. N.; Gibson, E. K.

    2013-01-01

    The Sutter's Mill meteorite is a newly fallen carbonaceous chondrite that was collected and curated quickly after its fall. Preliminary petrographic and isotopic investigations suggest affinities to the CM2 carbonaceous chondrites. The primitive nature of this meteorite and its rapid recovery provide an opportunity to investigate primordial solar system organic matter in a unique new sample. Here we report in-situ analyses of organic nanoglobules in the Sutter's Mill meteorite using UV fluorescence imaging, Fourier-transform infrared spectroscopy (FTIR), scanning transmission electron microscopy (STEM), NanoSIMS, and ultrafast two-step laser mass spectrometry (ultra-L2MS).

  6. Display depth analyses with the wave aberration for the auto-stereoscopic 3D display

    NASA Astrophysics Data System (ADS)

    Gao, Xin; Sang, Xinzhu; Yu, Xunbo; Chen, Duo; Chen, Zhidong; Zhang, Wanlu; Yan, Binbin; Yuan, Jinhui; Wang, Kuiru; Yu, Chongxiu; Dou, Wenhua; Xiao, Liquan

    2016-07-01

    Because the aberration severely affects the display performances of the auto-stereoscopic 3D display, the diffraction theory is used to analyze the diffraction field distribution and the display depth through aberration analysis. Based on the proposed method, the display depth of central and marginal reconstructed images is discussed. The experimental results agree with the theoretical analyses. Increasing the viewing distance or decreasing the lens aperture can improve the display depth. Different viewing distances and the LCD with two lens-arrays are used to verify the conclusion.

  7. The 10 MWe solar thermal central receiver pilot plant: Beam safety tests and analyses

    NASA Astrophysics Data System (ADS)

    Brumleve, T. D.

    1984-07-01

Potential eye hazards of reflected heliostat beams were evaluated and the adequacy of the beam safety control strategy at the 10 MWe solar thermal central receiver pilot plant was verified. Special video techniques were used during helicopter flyovers and at ground level to determine retinal irradiance and image size relative to a reference Sun. Receiver brightness was also measured. Measured values were consistent with analyses, and safety provisions at the plant were found to be adequate. Other beam control strategies for heliostats designed to stow face-up in high winds are studied, and one strategy is checked experimentally during the helicopter flyover tests.

  8. scikit-image: image processing in Python.

    PubMed

    van der Walt, Stéfan; Schönberger, Johannes L; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921
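As a minimal illustration of the library's style (the specific pipeline below is our example, not one from the paper), here is a segmentation sketch on one of scikit-image's bundled sample images:

```python
# A minimal scikit-image pipeline: global Otsu thresholding followed by
# connected-component labelling on the bundled "coins" sample image.
from skimage import data, filters, measure

image = data.coins()                      # 8-bit greyscale sample image
thresh = filters.threshold_otsu(image)    # global Otsu threshold
binary = image > thresh                   # foreground mask
labels = measure.label(binary)            # connected components
regions = measure.regionprops(labels)
large = [r for r in regions if r.area > 100]   # ignore tiny specks
print(f"threshold={thresh}, objects={len(large)}")
```

Each step is a documented function with a consistent array-in/array-out API, which is the design point the abstract highlights.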

  9. scikit-image: image processing in Python

    PubMed Central

    Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921

  10. Analyses and Measures of GPR Signal with Superimposed Noise

    NASA Astrophysics Data System (ADS)

    Chicarella, Simone; Ferrara, Vincenzo; D'Atanasio, Paolo; Frezza, Fabrizio; Pajewski, Lara; Pavoncello, Settimio; Prontera, Santo; Tedeschi, Nicola; Zambotti, Alessandro

    2014-05-01

The influence of EM noises and environmental hard conditions on the GPR surveys has been examined analytically [1]. In the case of pulse radar GPR, many unwanted signals, such as stationary clutter, non-stationary clutter, random noise, and time jitter, influence the measurement signal. When the GPR is motionless, stationary clutter is the most dominant signal component, due to the reflections of static objects different from the investigated target and to the direct antenna coupling. Moving objects, e.g. persons and vehicles, and the swaying of tree crowns produce non-stationary clutter. Device internal noise and narrowband jamming are two potential sources of random noise. Finally, trigger instabilities generate random jitter. In order to estimate the effective influence of these noise signal components, we organized an experimental measurement setup. At first, we evaluated, for the case of basic GPR detection, simple image processing of the radargram. In the future, we foresee experimental measurements for detection of the Doppler frequency changes induced by movements of targets (like physiological movements of survivors under debris). We obtain radargrams using the GSSI SIR® 2000 GPR system together with the UWB UHF GPR-antenna (SUB-ECHO HBD 300, a model manufactured by the Radarteam company). Our work includes both characterization of the GPR signal without (or almost without) superimposed noise, and the effect of jamming originating from the coexistence of a different radio signal. For characterizing the GPR signal, we organized a measurement setup that includes the following instruments: a mod. FSP 30 spectrum analyser by Rohde & Schwarz, which operates in the frequency range 9 kHz - 30 GHz, a mod. Sucoflex 104 cable by Huber Suhner (10 MHz - 18 GHz), and an HL050 antenna by Rohde & Schwarz (bandwidth: from 850 MHz to 26.5 GHz). The next analysis of superimposed jamming will examine two different signal sources: by a cellular phone and by a

  11. [Medical image enhancement: Sharpening].

    PubMed

    Kats, L; Vered, M

    2015-04-01

Most digital imaging systems provide opportunities for image enhancement operations. These are applied to improve the original image and to make it more appealing visually. One possible means of enhancing a digital radiographic image is sharpening. The purpose of sharpening filters is to improve image quality through noise removal or edge enhancement. Sharpening filters may make radiographic images subjectively more appealing, but during this process important radiographic features may disappear, while artifacts that simulate a pathological process might be generated. Therefore, it is of utmost importance for dentists to be familiar with and aware of the use of image enhancement operations provided by medical digital imaging programs. PMID:26255429
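A common sharpening filter of the kind discussed is unsharp masking. The sketch below (our illustration, not the filter of any particular dental imaging program) also shows why aggressive sharpening can create artifacts:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=1.0):
    """Classic unsharp masking: add back the high-pass residual.

    sharpened = image + amount * (image - blurred)
    A large `amount` exaggerates edges but also amplifies noise --
    the artifact risk the abstract warns about.
    """
    blurred = gaussian_filter(image.astype(float), sigma)
    return image + amount * (image - blurred)

# A flat field with one bright line: sharpening overshoots at the edge.
img = np.zeros((9, 9))
img[4, :] = 100.0
sharp = unsharp_mask(img, sigma=1.0, amount=0.8)
print(sharp[4, 4] > img[4, 4])
```

The overshoot at edges is what makes sharpened radiographs look crisper, and also what can fabricate features that were never in the exposure.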

  12. Removal of subsurface fluorescence in cryo-imaging using deconvolution.

    PubMed

    Krishnamurthi, Ganapathy; Wang, Charlie Y; Steyer, Grant; Wilson, David L

    2010-10-11

We compared image restoration methods [Richardson-Lucy (RL), Wiener, and Next-image] with measured "scatter" point-spread-functions, for removing subsurface fluorescence from section-and-image cryo-image volumes. All methods removed haze, delineated single cells from clusters, and improved visualization, but RL best represented structures. Contrast-to-noise and contrast-to-background improvement from RL and Wiener were comparable and 35% better than Next-image. Concerning detection of labeled cells, ROC analyses showed RL ≈ Wiener > Next-image > no processing. Next-image was faster than other methods and less prone to image processing artifacts. RL is recommended for the best restoration of the shape and size of fluorescent structures. PMID:20941133

  13. X-Ray Imaging

    MedlinePlus

X-ray imaging is perhaps the most familiar type of imaging. Images produced by X-rays are due to the different absorption rates of ...

  14. Split image optical display

    DOEpatents

    Veligdan, James T.

    2007-05-29

    A video image is displayed from an optical panel by splitting the image into a plurality of image components, and then projecting the image components through corresponding portions of the panel to collectively form the image. Depth of the display is correspondingly reduced.

  15. Split image optical display

    DOEpatents

    Veligdan, James T.

    2005-05-31

    A video image is displayed from an optical panel by splitting the image into a plurality of image components, and then projecting the image components through corresponding portions of the panel to collectively form the image. Depth of the display is correspondingly reduced.

  16. Terahertz wave reciprocal imaging

    NASA Astrophysics Data System (ADS)

    Xu, Jingzhou; Zhang, X.-C.

    2006-04-01

A reciprocal imaging technology with an encoding/decoding image readout method allows a single detector (such as a heterodyne detector) to produce a two-dimensional (2D) image simultaneously. Applying it in a pulsed terahertz imaging system could create a 2D terahertz image with 100 pixels per frame that achieves the same signal-to-noise ratio as a single-spot measurement.
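The abstract does not detail the encoding/decoding scheme. As a generic sketch of how a single detector can recover a 2D image through a sequence of orthogonal masks, here is a Hadamard-transform example (the mask choice is our assumption, not necessarily the authors' method):

```python
import numpy as np
from scipy.linalg import hadamard

# Encode: measure the scene through a sequence of Hadamard masks with a
# single detector; decode: invert the orthogonal mask matrix.
n = 16                                   # 16 pixels -> a 4x4 image
H = hadamard(n)                          # +/-1 mask patterns, H @ H.T = n*I
scene = np.arange(n, dtype=float)        # flattened 4x4 "image"

measurements = H @ scene                 # one detector reading per mask
decoded = (H.T @ measurements) / n       # exact reconstruction
print(np.allclose(decoded, scene))       # True
```

Because each mask mixes every pixel into one reading, the detector's noise is spread across the whole frame rather than concentrated in one pixel, which is how a single-detector scheme can match the SNR of a spot measurement.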

  17. RIM-13: A high-resolution imaging tool for aerial image monitoring of patterned and blank EUV reticles

    NASA Astrophysics Data System (ADS)

    Booth, M.; Brunton, A.; Cashmore, J.; Elbourn, P.; Elliner, G.; Gower, M.; Greuters, J.; Hirsch, J.; Kling, L.; McEntee, N.; Richards, P.; Truffert, V.; Wallhead, I.; Whitfield, M.

    2006-03-01

    Key features of the RIM-13 EUV actinic reticle imaging microscope are summarised. This is a tool which generates aerial images from blank or patterned EUV masks, emulating the illumination and projection optics of an exposure tool. Such images of mask defects, acquired by a CCD camera, are analysed using the tool software to predict their effect on resist exposure. Optical, mechanical and software performance of the tool are reported.

  18. Field-Based Land Cover Classification Aided with Texture Analyses Using Terrasar-X Data

    NASA Astrophysics Data System (ADS)

    Mahmoud, Ali; Pradhan, Biswajeet; Buchroithner, Manfred

The present study aims to evaluate the field-based approach for the classification of land cover using the recently launched high resolution SAR data. A TerraSAR-X1 (TSX-1) strip mode image, coupled with digital orthophotos with 20 cm spatial resolution, was used for land cover classification and parcel mapping respectively. Different filtering and texture analysis techniques were applied to extract textural information from the TSX-1 image in order to assess the enhancement of the classification accuracy. Several attributes of parcels were derived from the available TSX-1 image in order to define the most suitable attributes discriminating between different land cover types. These attributes were then further analyzed by statistical and various image classification methods for land cover classification. The results showed that textural analysis achieved higher classification accuracy than classification without textural information. The authors conclude that an integrated land cover classification using the textural information in TerraSAR-X1 has high potential for land cover mapping. Key words: land cover classification, TerraSAR-X1, field based, texture analysis

  19. Enhancing forensic science with spectroscopic imaging

    NASA Astrophysics Data System (ADS)

    Ricci, Camilla; Kazarian, Sergei G.

    2006-09-01

This presentation outlines the research we are developing in the area of Fourier Transform Infrared (FTIR) spectroscopic imaging with the focus on materials of forensic interest. FTIR spectroscopic imaging has recently emerged as a powerful tool for characterisation of heterogeneous materials. FTIR imaging relies on the ability of the military-developed infrared array detector to simultaneously measure spectra from thousands of different locations in a sample. A recently developed application of FTIR imaging using an ATR (Attenuated Total Reflection) mode has demonstrated the ability of this method to achieve spatial resolution beyond the diffraction limit of infrared light in air. Chemical visualisation with enhanced spatial resolution in micro-ATR mode broadens the range of materials studied with FTIR imaging, with applications to pharmaceutical formulations or biological samples. Macro-ATR imaging has also been developed for chemical imaging analysis of large surface area samples and was applied to analyse the surface of human skin (e.g. a finger), counterfeit tablets, textile materials (clothing), etc. This approach demonstrated the ability of this imaging method to detect trace materials attached to the surface of the skin. It may also prove a valuable tool in the detection of traces of explosives left or trapped on the surfaces of different materials. This FTIR imaging method is substantially superior to many other imaging methods due to the inherent chemical specificity of infrared spectroscopy and the fast acquisition times of this technique. Our preliminary data demonstrated that this methodology will provide a non-destructive detection method that could relate evidence to its source. This will be important in a wider crime prevention programme. 
In summary, intrinsic chemical specificity and enhanced visualising capability of FTIR spectroscopic imaging open a window of opportunities for counter-terrorism and crime-fighting, with applications ranging

  20. Analysing land cover and land use change in the Matobo National Park and surroundings in Zimbabwe

    NASA Astrophysics Data System (ADS)

    Scharsich, Valeska; Mtata, Kupakwashe; Hauhs, Michael; Lange, Holger; Bogner, Christina

    2016-04-01

    Natural forests are threatened worldwide, therefore their protection in National Parks is essential. Here, we investigate how this protection status affects the land cover. To answer this question, we analyse the surface reflectance of three Landsat images of Matobo National Park and its surroundings in Zimbabwe from 1989, 1998 and 2014 to detect changes in land cover in this region. To account for the rolling countryside and the resulting prominent shadows, a topographical correction of the surface reflectance was required. To infer land cover changes, ground data are needed not only for the current satellite image but also for the older ones. For the older images in particular, no recent field study can reconstruct these data reliably. In our study we follow the idea that land cover classes of pixels in current images can be transferred to the equivalent pixels of older ones if no changes occurred in the meantime. Therefore we combine unsupervised clustering with supervised classification as follows. First, we produce a land cover map for 2014. Second, we cluster the images with clara (a k-medoids method similar to k-means but suitable for large data sets); the best number of clusters was determined to be four. Third, we locate unchanged pixels with change vector analysis in the images of 1989 and 1998. For these pixels we transfer the corresponding cluster label from 2014 to 1989 and 1998. The classified pixels then serve as training data for supervised classification with random forest, which is carried out for each image separately. Finally, we derive land cover classes from the Landsat image in 2014, photographs and Google Earth and transfer them to the other two images. The resulting classes are shrub land; forest/shallow waters; bare soils/fields with some trees/shrubs; and bare light soils/rocks, fields and settlements. Subsequently, the three classifications are compared and land cover changes are mapped.
The main changes are
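The label-transfer workflow described above can be sketched on synthetic per-pixel data. In this sketch, scikit-learn's KMeans stands in for clara, RandomForestClassifier plays the role of the paper's random forest, and the change-vector threshold, band values and array names are all illustrative assumptions rather than the authors' actual pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Synthetic stand-ins for per-pixel reflectance vectors (n_pixels x n_bands);
# the real inputs would be topographically corrected Landsat bands.
bands_2014 = rng.normal(size=(500, 4)) + np.repeat(np.arange(4) * 5, 125)[:, None]
# Older scene: same surfaces plus noise, with one block of genuine change.
bands_1998 = bands_2014 + rng.normal(scale=0.1, size=bands_2014.shape)
bands_1998[:50] += 5.0  # simulated land cover change

# 1. Cluster the recent image (the study uses clara with k = 4;
#    k-means is a stand-in here).
labels_2014 = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(bands_2014)

# 2. Change vector analysis: magnitude of the per-pixel spectral difference.
magnitude = np.linalg.norm(bands_1998 - bands_2014, axis=1)
unchanged = magnitude < 1.0  # threshold is illustrative

# 3. Transfer labels of unchanged pixels to the older scene and use them
#    as training data for a supervised classifier of that scene.
rf = RandomForestClassifier(n_estimators=50, random_state=0)
rf.fit(bands_1998[unchanged], labels_2014[unchanged])
labels_1998 = rf.predict(bands_1998)
```

The key design point is step 2: only pixels the change vector analysis deems stable contribute training labels, so genuinely changed areas are classified by the model rather than inheriting stale 2014 labels.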

  1. 43 CFR 46.130 - Mitigation measures in analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 1 2011-10-01 2011-10-01 false Mitigation measures in analyses. 46.130... Mitigation measures in analyses. (a) Bureau proposed action. The analysis of the proposed action and any alternatives must include an analysis of the effects of the proposed action or alternative as well as...

  2. 44 CFR 1.9 - Regulatory impact analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Regulatory impact analyses. 1... HOMELAND SECURITY GENERAL RULEMAKING; POLICY AND PROCEDURES General § 1.9 Regulatory impact analyses. (a) FEMA shall, in connection with any major rule, prepare and consider a Regulatory Impact Analysis....

  3. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 41 Public Contracts and Property Management 2 2011-07-01 2007-07-01 true Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  4. 41 CFR 101-27.208 - Inventory analyses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Inventory analyses. 101...-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall be... the established shelf-life period. If the analysis indicates there are quantities which will not...

  5. A computer graphics program for general finite element analyses

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.; Sawyer, L. M.

    1978-01-01

    Documentation for a computer graphics program for displays from general finite element analyses is presented. A general description of display options and detailed user instructions are given. Several plots made in structural, thermal and fluid finite element analyses are included to illustrate program options. Sample data files are given to illustrate use of the program.

  6. What can we do about exploratory analyses in clinical trials?

    PubMed

    Moyé, Lem

    2015-11-01

    The research community has alternately embraced and repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generating. Since the majority of evaluations conducted in clinical trials with their rich data sets are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is not based in statistical theory, but in pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we introduce an examination of the characteristics of exploratory analyses from a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from estimation theory weaknesses. The problem appears to be a difficulty in what is actually reported as the p-value. The use of Bayes' theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that would lead to the readmission of exploratory analyses to a position of persuasive power in clinical trials. PMID:26390962
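Moyé's specific Bayesian correction is not spelled out in the abstract. One standard way Bayes' theorem reweights a nominal p-value is the false-positive-risk calculation sketched below: an unplanned exploratory comparison gets a sceptical prior, so the same "p < 0.05" result carries much less evidential weight. The prior and power values are illustrative assumptions.

```python
def posterior_null(alpha, prior_true, power):
    """P(null is true | test rejected), via Bayes' theorem. A sceptical
    prior (low prior_true) models an unplanned, exploratory comparison."""
    prior_null = 1.0 - prior_true
    reject = alpha * prior_null + power * prior_true  # total rejection probability
    return alpha * prior_null / reject

# Both findings are "significant at 0.05", but the exploratory one is one of
# many unplanned looks, so its prior probability of being real is low.
confirmatory = posterior_null(0.05, prior_true=0.5, power=0.8)   # ~0.06
exploratory = posterior_null(0.05, prior_true=0.05, power=0.8)   # ~0.54
```

Under these assumptions, the exploratory "discovery" is more likely than not to be a false positive even though its nominal p-value looks identical to the confirmatory one.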

  7. Descriptive Analyses of Pediatric Food Refusal and Acceptance

    ERIC Educational Resources Information Center

    Borrero, Carrie S. W.; Woods, Julia N.; Borrero, John C.; Masler, Elizabeth A.; Lesser, Aaron D.

    2010-01-01

    Functional analyses of inappropriate mealtime behavior typically include conditions to determine if the contingent delivery of attention, tangible items, or escape reinforce food refusal. In the current investigation, descriptive analyses were conducted for 25 children who had been admitted to a program for the assessment and treatment of food…

  8. Training Residential Staff to Conduct Trial-Based Functional Analyses

    ERIC Educational Resources Information Center

    Lambert, Joseph M.; Bloom, Sarah E.; Kunnavatana, S. Shanun; Collins, Shawnee D.; Clay, Casey J.

    2013-01-01

    We taught 6 supervisors of a residential service provider for adults with developmental disabilities to train 9 house managers to conduct trial-based functional analyses. Effects of the training were evaluated with a nonconcurrent multiple baseline. Results suggest that house managers can be trained to conduct trial-based functional analyses with…

  9. Rational Analyses of Information Foraging on the Web

    ERIC Educational Resources Information Center

    Pirolli, Peter

    2005-01-01

    This article describes rational analyses and cognitive models of Web users developed within information foraging theory. This is done by following the rational analysis methodology of (a) characterizing the problems posed by the environment, (b) developing rational analyses of behavioral solutions to those problems, and (c) developing cognitive…

  10. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 5 2013-10-01 2013-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  11. Integrated metagenomic and metaproteomic analyses of marine biofilm communities

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Metagenomic and metaproteomic analyses were utilized to begin to understand the role varying environments play on the composition and function of complex air-water interface biofilms sampled from the hulls of two ships that were deployed in different geographic waters. Prokaryotic community analyses...

  12. Tracing Success: Graphical Methods for Analysing Successful Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Joiner, Richard; Issroff, Kim

    2003-01-01

    The aim of this paper is to evaluate the use of trace diagrams for analysing collaborative problem solving. The paper describes a study where trace diagrams were used to analyse joint navigation in a virtual environment. Ten pairs of undergraduates worked together on a distributed virtual task to collect five flowers using two bees with each…

  13. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Stress Analyses Definitions B Appendix B to Part 154...—Stress Analyses Definitions The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  14. Treatment of Pica through Multiple Analyses of Its Reinforcing Functions.

    ERIC Educational Resources Information Center

    Piazza, Cathleen C.; Fisher, Wayne W.; Hanley, Gregory P.; LeBlanc, Linda A.; Worsdell, April S.; And Others

    1998-01-01

    A study conducted functional analyses of the pica of three young children. The pica of one participant was maintained by automatic reinforcement; that of the other two was multiply-controlled by social and automatic reinforcement. Preference and treatment analyses were used to address the automatic function of the pica. (Author/CR)

  15. Recent Trends in Conducting School-Based Experimental Functional Analyses

    ERIC Educational Resources Information Center

    Carter, Stacy L.

    2009-01-01

    Demonstrations of school-based experimental functional analyses have received limited attention within the literature. School settings present unique practical and ethical concerns related to the implementation of experimental analyses which were originally developed within clinical settings. Recent examples have made definite contributions toward…

  16. Analyses of response-stimulus sequences in descriptive observations.

    PubMed

    Samaha, Andrew L; Vollmer, Timothy R; Borrero, Carrie; Sloman, Kimberly; Pipkin, Claire St Peter; Bourret, Jason

    2009-01-01

    Descriptive observations were conducted to record problem behavior displayed by participants and to record antecedents and consequences delivered by caregivers. Next, functional analyses were conducted to identify reinforcers for problem behavior. Then, using data from the descriptive observations, lag-sequential analyses were conducted to examine changes in the probability of environmental events across time in relation to occurrences of problem behavior. The results of the lag-sequential analyses were interpreted in light of the results of functional analyses. Results suggested that events identified as reinforcers in a functional analysis followed behavior in idiosyncratic ways: after a range of delays and frequencies. Thus, it is possible that naturally occurring reinforcement contingencies are arranged in ways different from those typically evaluated in applied research. Further, these complex response-stimulus relations can be represented by lag-sequential analyses. However, limitations to the lag-sequential analysis are evident. PMID:19949537
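A lag-sequential analysis of this kind asks how the probability of a caregiver-delivered event changes at successive lags after the target behavior, relative to the event's base rate. A minimal sketch on synthetic 0/1 time series follows; the function name, binning into discrete observation intervals, and the toy data are assumptions, not the authors' code.

```python
import numpy as np

def lag_sequential(behavior, event, max_lag=5):
    """P(event at t + lag | behavior at t) for each lag, plus the event's
    unconditional base rate. Inputs are 0/1 series over observation bins."""
    behavior = np.asarray(behavior)
    event = np.asarray(event)
    base_rate = event.mean()
    idx = np.flatnonzero(behavior)            # bins containing the behavior
    probs = {}
    for lag in range(1, max_lag + 1):
        valid = idx[idx + lag < len(event)]   # drop bins that run off the record
        probs[lag] = event[valid + lag].mean() if len(valid) else float("nan")
    return probs, base_rate

# Toy record: attention (event) reliably follows problem behavior at lag 2.
b = np.zeros(200, int); b[::10] = 1
e = np.zeros(200, int); e[2::10] = 1
probs, base = lag_sequential(b, e)
```

A conditional probability well above the base rate at some lag (here, lag 2) is the signature of a naturally occurring contingency, even when the delay is longer than those typically programmed in applied research.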

  17. Treatment of pica through multiple analyses of its reinforcing functions.

    PubMed Central

    Piazza, C C; Fisher, W W; Hanley, G P; LeBlanc, L A; Worsdell, A S; Lindauer, S E; Keeney, K M

    1998-01-01

    We conducted functional analyses of the pica of 3 participants. The pica of 1 participant appeared to be maintained by automatic reinforcement; that of the other 2 participants appeared to be multiply controlled by social and automatic reinforcement. Subsequent preference and treatment analyses were used to identify stimuli that would compete with the automatic function of pica for the 3 participants. These analyses also identified the specific aspect of oral stimulation that served as automatic reinforcement for 2 of the participants. In addition, functional analysis-based treatments were used to address the socially motivated components of 2 of the participants' pica. Results are discussed in terms of (a) the importance of using the results of functional analyses to develop treatments for pica and (b) the advantages of developing indirect analyses to identify specific sources of reinforcement for automatically reinforced behavior. PMID:9652098

  18. Smart Image Enhancement Process

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J. (Inventor); Rahman, Zia-ur (Inventor); Woodell, Glenn A. (Inventor)

    2012-01-01

    Contrast and lightness measures are used to first classify the image as being one of non-turbid and turbid. If turbid, the original image is enhanced to generate a first enhanced image. If non-turbid, the original image is classified in terms of a merged contrast/lightness score based on the contrast and lightness measures. The non-turbid image is enhanced to generate a second enhanced image when a poor contrast/lightness score is associated therewith. When the second enhanced image has a poor contrast/lightness score associated therewith, this image is enhanced to generate a third enhanced image. A sharpness measure is computed for one image that is selected from (i) the non-turbid image, (ii) the first enhanced image, (iii) the second enhanced image when a good contrast/lightness score is associated therewith, and (iv) the third enhanced image. If the selected image is not sharp, it is sharpened to generate a sharpened image. The final image is selected from the selected image and the sharpened image.
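The patent's decision cascade can be sketched as control flow. The contrast and lightness measures, the thresholds, and the enhancement operator below are all illustrative assumptions (the invention's actual measures and operators are not given in this abstract), and the final sharpness check is only noted in a comment.

```python
import numpy as np

def contrast_lightness(img):
    """Crude stand-ins for the patent's measures: std for contrast,
    mean for lightness (image values in [0, 1])."""
    return float(img.std()), float(img.mean())

def stretch(img):
    """Placeholder enhancement: full-range histogram stretch."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else img

def smart_enhance(img, turbid_thresh=0.05, good_contrast=0.15):
    """Sketch of the cascade; thresholds and operators are illustrative."""
    contrast, _ = contrast_lightness(img)
    if contrast < turbid_thresh:         # turbid: enhance unconditionally
        out = stretch(img)               # "first enhanced image"
    else:                                # non-turbid: enhance only on a poor score
        out = img
        for _ in range(2):               # at most two further passes (second, third)
            if contrast_lightness(out)[0] >= good_contrast:
                break
            out = stretch(out)
    # The patent then computes a sharpness measure on the selected image and
    # sharpens it if needed; that step is omitted from this sketch.
    return out

hazy = 0.4 + 0.05 * np.random.default_rng(2).random((64, 64))  # low-contrast frame
clear = smart_enhance(hazy)
```

The point of the cascade is that enhancement is applied only as long as the quality score stays poor, so already-good images pass through untouched.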

  19. What Is an Image?

    ERIC Educational Resources Information Center

    Gerber, Andrew J.; Peterson, Bradley S.

    2008-01-01

    The article aids the interpretation of images by examining what constitutes an image. A common feature of all images is a basic physical structure that can be described with a common set of terms.

  20. Restoration Of MEX SRC Images For Improved Topography: A New Image Product

    NASA Astrophysics Data System (ADS)

    Duxbury, T. C.

    2012-12-01

    Surface topography is an important constraint when investigating the evolution of solar system bodies. Topography is typically obtained from stereo photogrammetric or photometric (shape from shading) analyses of overlapping / stereo images and from laser / radar altimetry data. The ESA Mars Express Mission [1] carries a Super Resolution Channel (SRC) as part of the High Resolution Stereo Camera (HRSC) [2]. The SRC can build up overlapping / stereo coverage of Mars, Phobos and Deimos by viewing the surfaces from different orbits. The derivation of high precision topography data from the SRC raw images is degraded because the camera is out of focus. The point spread function (PSF) is multi-peaked, covering tens of pixels. After registering and co-adding hundreds of star images, an accurate SRC PSF was reconstructed and is being used to restore the SRC images to near blur free quality. The restored images offer a factor of about 3 in improved geometric accuracy as well as identifying the smallest of features to significantly improve the stereo photogrammetric accuracy in producing digital elevation models. The difference between blurred and restored images provides a new derived image product that can provide improved feature recognition to increase spatial resolution and topographic accuracy of derived elevation models. Acknowledgements: This research was funded by the NASA Mars Express Participating Scientist Program. [1] Chicarro, et al., ESA SP 1291(2009) [2] Neukum, et al., ESA SP 1291 (2009). A raw SRC image (h4235.003) of a Martian crater within Gale crater (the MSL landing site) is shown in the upper left and the restored image is shown in the lower left. A raw image (h0715.004) of Phobos is shown in the upper right and the difference between the raw and restored images, a new derived image data product, is shown in the lower right. The lower images, resulting from an image restoration process, significantly improve feature recognition for improved derived
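The abstract does not state which restoration algorithm was used on the SRC images. Given a reconstructed PSF, a standard choice for this kind of deblurring is Richardson-Lucy deconvolution, sketched here in 1-D on synthetic data; the iteration count and PSF parameters are illustrative.

```python
import numpy as np

def richardson_lucy(blurred, psf, n_iter=50):
    """Minimal 1-D Richardson-Lucy deconvolution. Iteratively updates a
    non-negative estimate so that, blurred by the PSF, it matches the data."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1]                                  # adjoint kernel
    estimate = np.full_like(blurred, blurred.mean())      # flat positive start
    for _ in range(n_iter):
        conv = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(conv, 1e-12)         # data / model
        estimate *= np.convolve(ratio, psf_flip, mode="same")
    return estimate

# Sharp scene: two point sources; blur with a broad Gaussian PSF, then restore.
truth = np.zeros(64)
truth[20], truth[40] = 1.0, 0.5
psf = np.exp(-0.5 * (np.arange(-7, 8) / 2.5) ** 2)
blurred = np.convolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf)
```

The restored estimate re-concentrates the blurred energy back toward the point sources, which mirrors how an accurate star-derived PSF lets small surface features be recovered for stereo photogrammetry.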